In the field of robotics and automation, determining the position and orientation of objects is essential for many applications. Traditionally, this has been accomplished using vision-based tactile sensors. From the images these sensors produce, the local geometry of a target object can be recovered using convolutional neural networks. The orientation of the target object can then be determined by comparing the sensed local geometry against the object's ground-truth model at many candidate positions and selecting the position where the scanned part is most likely located [1].
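The matching step described above can be illustrated with a minimal sketch. The function below is hypothetical and not the actual method of [1]: it performs an exhaustive scan, sliding a sensed local height patch over a ground-truth height map and returning the offset with the lowest sum of squared differences.

```python
import numpy as np

def estimate_position(patch, ground_truth):
    """Slide a sensed local patch over a ground-truth height map and
    return the (row, col) offset where it matches best (lowest SSD)."""
    ph, pw = patch.shape
    gh, gw = ground_truth.shape
    best, best_pos = np.inf, (0, 0)
    for r in range(gh - ph + 1):
        for c in range(gw - pw + 1):
            # sum of squared differences between the patch and this window
            ssd = np.sum((ground_truth[r:r+ph, c:c+pw] - patch) ** 2)
            if ssd < best:
                best, best_pos = ssd, (r, c)
    return best_pos

# toy example: a 2x2 patch cut from a known map is localized exactly
gt = np.arange(25, dtype=float).reshape(5, 5)
patch = gt[2:4, 1:3]
print(estimate_position(patch, gt))  # → (2, 1)
```

In practice, systems such as MidasTouch use learned embeddings and probabilistic filtering rather than a brute-force scan, but the underlying idea of scoring candidate positions against a ground-truth model is the same.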
Figure: Demonstration of the position estimation performed in MidasTouch [1].
A new flexible tactile sensor based on a grid of piezoresistive elements has been developed [2]. This skin-like, touch-based sensor offers an alternative to vision-based sensors. The goal of this project is to explore the capabilities of this sensor: mapping sensor input to physical positions in space, and reconstructing as much information as possible about the geometry of a target object touching the sensor. The eventual aim is to produce output similar to that of a vision-based tactile sensor, so that it can be used for the pose-estimation task described above.
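The first task, mapping sensor input to physical positions, can be sketched as follows. The grid dimensions and taxel pitch below are placeholder assumptions, not the specifications of the sensor in [2]; the sketch converts grid indices to millimetre coordinates and estimates a contact location as a pressure-weighted centroid.

```python
import numpy as np

# Hypothetical sensor geometry: a ROWS x COLS piezoresistive grid with
# a fixed taxel pitch in mm (actual values depend on the sensor in [2]).
ROWS, COLS = 16, 16
PITCH_MM = 2.5

def taxel_to_position(row, col):
    """Map a taxel (row, col) index to its (x, y) position in mm,
    measured from the center of the top-left taxel."""
    return (col * PITCH_MM, row * PITCH_MM)

def contact_centroid(readings):
    """Estimate the contact location as the pressure-weighted centroid
    of a ROWS x COLS array of taxel readings; None if no contact."""
    total = readings.sum()
    if total == 0:
        return None
    rows, cols = np.indices(readings.shape)
    cx = (cols * readings).sum() / total * PITCH_MM
    cy = (rows * readings).sum() / total * PITCH_MM
    return (float(cx), float(cy))

r = np.zeros((ROWS, COLS))
r[4, 6] = 1.0  # a single pressed taxel
print(contact_centroid(r))  # → (15.0, 10.0)
```

Reconstructing object geometry would build on this mapping, e.g. by treating the full grid of readings as a coarse pressure image of the contact patch.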
References