Researchers have developed a new framework for designing vision-based tactile sensors, aiming to enhance their ability to perceive and reconstruct the 3D shape of objects through touch. This work addresses the challenges of creating effective tactile sensors, especially those with curved shapes designed for robotic applications. The framework begins with illumination design, carefully selecting light sources and their placement to optimize tactile image generation. Each candidate design is then evaluated using optical simulation and an automated assessment function called RGB2Normal, which quantifies the design's effectiveness. The most promising illumination setups proceed to the next stage, which focuses on sensor shape.
In the shape design stage, the researchers use a procedural method to generate various sensor shapes, guided by low-dimensional shape parameters. These shapes also undergo optical simulation and automatic assessment to identify the optimal designs. This optimization process is crucial because the sensor’s shape significantly influences how light interacts within the sensor and ultimately affects the quality of tactile data.
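As a concrete illustration of how a few low-dimensional parameters can procedurally generate a sensor shape, the sketch below samples a curved cross-section from two semi-axis parameters. The function name and the half-ellipse parameterization are illustrative assumptions, not the paper's exact procedure:

```python
import math

def ellipse_profile(a, b, n=50):
    """Generate a 2D cross-section of a curved sensor shell as points on a
    half-ellipse. The pair (a, b) plays the role of the low-dimensional
    shape parameters (semi-axes); a hypothetical parameterization."""
    return [(a * math.cos(t), b * math.sin(t))
            for t in (math.pi * i / (n - 1) for i in range(n))]

profile = ellipse_profile(a=12.0, b=8.0)  # illustrative dimensions
print(len(profile))             # 50 sample points along the curve
print(profile[0], profile[-1])  # endpoints lie on the major axis
```

Varying (a, b) sweeps out a family of flatter or more bulbous shells, each of which could then be passed to the simulation-and-scoring stage.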
The core objective of this framework is to ensure accurate recovery of surface normals, a key element in reconstructing 3D shapes from tactile readings. By optimizing both illumination and sensor shape, the framework aims to produce tactile sensors that provide precise 3D shape information. This information is vital for robots to effectively interact with objects, enabling tasks like object recognition, manipulation, and force estimation.
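To see why surface normals are the key intermediate quantity, the sketch below integrates a normal map into a depth map. It uses a naive cumulative-sum scheme under assumed axis conventions; practical pipelines use more robust least-squares or Fourier-domain (Frankot–Chellappa) integration:

```python
def normals_to_depth(normals):
    """Naively integrate a surface-normal map into a depth map.
    normals[y][x] = (nx, ny, nz); the surface gradients p = -nx/nz and
    q = -ny/nz are cumulatively summed along the top row and then down
    each column. A minimal sketch, not the paper's reconstruction method."""
    h, w = len(normals), len(normals[0])
    depth = [[0.0] * w for _ in range(h)]
    for x in range(1, w):               # integrate along the top row
        nx, ny, nz = normals[0][x]
        depth[0][x] = depth[0][x - 1] + (-nx / nz)
    for y in range(1, h):               # integrate down each column
        for x in range(w):
            nx, ny, nz = normals[y][x]
            depth[y][x] = depth[y - 1][x] + (-ny / nz)
    return depth

# A plane tilted along x: every normal is (-0.1, 0, 1), so depth
# rises by 0.1 per pixel along each row
flat = [[(-0.1, 0.0, 1.0)] * 4 for _ in range(3)]
print(normals_to_depth(flat)[0])
```

Accurate normals therefore translate directly into accurate depth, which is why the framework's objective targets normal recovery rather than depth itself.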
The framework employs a novel objective function, the RGB2Normal score, to evaluate sensor designs. This function assesses how well the sensor captures surface details by analyzing the relationship between color changes in tactile images and the actual surface normals of the touched object. This approach allows rapid design evaluation without requiring complex calibration procedures.
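One simple way such a score could be realized is as the agreement between normals predicted from tactile RGB images and ground-truth normals from simulation. The sketch below uses mean cosine similarity; the function name and metric are assumptions, and the paper's exact score may be defined differently:

```python
import math

def rgb2normal_score(pred_normals, true_normals):
    """Hedged sketch of an RGB2Normal-style score: mean cosine similarity
    between surface normals predicted from tactile RGB images and the
    ground-truth normals of the touched object. A higher score means the
    illumination makes the RGB-to-normal mapping more recoverable."""
    total = 0.0
    for (px, py, pz), (tx, ty, tz) in zip(pred_normals, true_normals):
        dot = px * tx + py * ty + pz * tz
        mag = math.hypot(px, py, pz) * math.hypot(tx, ty, tz)
        total += dot / mag
    return total / len(true_normals)

# A perfect predictor scores 1.0; orthogonal predictions score 0.0
print(rgb2normal_score([(0, 0, 1), (1, 0, 0)], [(0, 0, 1), (1, 0, 0)]))  # 1.0
```

Because the simulator supplies both the rendered images and the true normals, a score like this can rank candidate designs automatically, with no physical calibration step.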
To accurately simulate the sensor’s optical behavior, the researchers developed a sophisticated optical simulator. This simulator utilizes physically based rendering techniques to account for realistic light behavior and material properties, ensuring that the simulation results closely match real-world sensor performance. They validated the simulator by comparing simulated images with images captured from a physical prototype, demonstrating a high degree of accuracy.
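Sim-to-real agreement of this kind is typically quantified with pixel-wise image metrics. The sketch below uses PSNR as one common choice; it is an illustrative assumption, not necessarily the paper's validation measure:

```python
import math

def psnr(img_a, img_b, peak=255.0):
    """Peak signal-to-noise ratio between a simulated and a captured
    tactile image, given as flat lists of pixel intensities. Higher
    values indicate closer sim-to-real agreement."""
    mse = sum((a - b) ** 2 for a, b in zip(img_a, img_b)) / len(img_a)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * math.log10(peak * peak / mse)

# A small intensity discrepancy still yields a high PSNR (> 30 dB)
print(psnr([100.0, 100.0], [100.0, 110.0]))
```

In practice such a metric would be computed over full images of the same indenter pressed into both the simulated and the physical sensor.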
The researchers explored various design parameters, including illumination types, sensor surface shapes, sensor thickness, and coating materials. They found that carefully designing the illumination profile and optimizing the sensor shape, particularly the curved surfaces, significantly improves sensor performance. They tested different illumination profiles like calibrated IES lights, spotlights, and area light sources to determine the most effective lighting strategy for uniform illumination and high RGB2Normal scores.
For sensor shape optimization, they employed parameterized curves, such as ellipses and B-splines, to generate a range of curved sensor designs. These designs were then evaluated to find the shapes that best enhance tactile perception, especially in curved vision-based tactile sensors (VBTS). The optimization process utilized evolutionary algorithms to efficiently search for the best performing sensor shapes.
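The search loop described above can be sketched as a simple evolution strategy over the low-dimensional shape parameters. The toy fitness below stands in for the full simulate-and-score pipeline, and the specific hill-climbing variant is an assumption; the paper's evolutionary algorithm may differ:

```python
import random

def evolve_shape(fitness, init, sigma=0.5, lam=8, generations=30, seed=0):
    """Minimal stochastic hill-climbing evolution strategy: each
    generation samples `lam` Gaussian perturbations of the best-known
    shape parameters and keeps any child that scores higher."""
    rng = random.Random(seed)
    best, best_fit = list(init), fitness(init)
    for _ in range(generations):
        for _ in range(lam):
            child = [p + rng.gauss(0, sigma) for p in best]
            f = fitness(child)
            if f > best_fit:
                best, best_fit = child, f
    return best, best_fit

# Toy stand-in for the simulator + score, peaked at (a, b) = (12, 8)
toy_score = lambda p: -((p[0] - 12.0) ** 2 + (p[1] - 8.0) ** 2)
params, score = evolve_shape(toy_score, init=[5.0, 5.0])
print(params)  # converges near [12, 8]
```

With a real pipeline, each fitness evaluation would render the candidate sensor in the optical simulator and return its RGB2Normal score, making population-based search attractive since no gradients are available.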
The fabrication of these curved tactile sensors involves a three-step process: 3D printing, elastomer casting, and assembly with illumination and camera components. The hard shell of the sensor and the molds are 3D printed in resin and polished for optical clarity. A soft elastomer is then cast into the mold to form the deformable sensing element. Finally, the sensor is assembled with LEDs for illumination and a camera to capture tactile images.
This research presents a significant advancement in the design of vision-based tactile sensors. The developed framework, incorporating simulation and optimization tools, offers a systematic approach to creating high-performance tactile sensors tailored for complex robotic tasks and environments requiring detailed 3D perception through touch.