Summary:
UCLA researchers have developed a tactile sensing technology that combines optical and microfluidic sensing to provide high spatio-temporal resolution and multi-modal feedback for applications like robotics, prosthetics, and human-machine interaction.
Background:
Human touch depends on a unique combination of high spatial resolution (to distinguish positions and shapes) and high temporal resolution (to detect rapid changes). Together, these capabilities enable precise grasping, manipulation, and texture discrimination. Conventional tactile sensors struggle to capture both at once, typically sacrificing one capability for the other. For example, some sensors use cameras to visually track the deformation of a surface, which provides high spatial resolution for detecting deformation and shapes; however, their sampling rates are typically much slower than human perception. Other sensors, such as those based on liquid metal strain gauges or fluid pressure, can detect rapid changes with very high temporal resolution, but their lower spatial resolution means they cannot "see" fine details as well.
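To make the sampling-rate gap concrete, the sketch below simulates a brief contact vibration and samples it at a camera-like rate and at a strain-gauge-like rate. The 250 Hz vibration, 30 Hz frame rate, and 1 kHz analog rate are generic illustrative assumptions, not specifications of this technology; the Nyquist criterion itself (a signal needs more than twice its frequency in samples per second) explains why the slow stream misses the event.

```python
import numpy as np

# Illustrative only: a decaying 250 Hz "contact vibration" transient,
# sampled at a camera-like rate (~30 Hz) and a strain-gauge-like rate
# (~1 kHz). These numbers are assumptions for illustration.
F_VIB = 250.0      # vibration frequency in Hz
DURATION = 0.05    # a 50 ms contact event

def sample(rate_hz):
    """Sample the decaying vibration at the given rate."""
    t = np.arange(0.0, DURATION, 1.0 / rate_hz)
    return t, np.sin(2 * np.pi * F_VIB * t) * np.exp(-t / 0.02)

t_cam, x_cam = sample(30.0)      # camera stream: only ~2 frames in 50 ms
t_sg, x_sg = sample(1000.0)      # analog stream: 50 samples

# Nyquist: representing a 250 Hz signal requires > 500 samples/s, so the
# camera stream cannot capture the vibration while the analog stream can.
print(f"camera samples in event:       {len(t_cam)}")
print(f"strain-gauge samples in event: {len(t_sg)}")
```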
This creates a critical gap in the state of the art for applications that require a comprehensive understanding of touch, such as advanced robotics or realistic virtual reality haptics. There remains an unmet need for a single device that provides both the detailed spatial information and the fast, subtle feedback necessary to truly mimic the human sense of touch.
Innovation:
UCLA researchers have addressed this challenge by developing the OptiStrain sensor, which integrates both vision-based and microfluidics-based sensing mechanisms into a single elastomeric fingertip to achieve high spatial and temporal resolution capabilities.
High spatial resolution is achieved through vision-based sensing: a camera inside the sensor's housing points at the underside of a clear, flexible fingerpad. The fingerpad's outer surface carries a visual texture, and when it deforms on contact, the camera tracks the changes, providing high-resolution data about the local shape and texture of the object. High temporal resolution is achieved through a liquid metal strain gauge embedded within the flexible fingerpad. When the fingerpad deforms, the strain gauge's microfluidic channels deform with it, changing their electrical resistance. This change is measured as an analog signal that can be captured at a very high sampling rate (over 1,000 Hz), allowing the sensor to detect small, rapidly changing contact events.

This combination allows the OptiStrain not only to map force distributions across a surface but also to detect subtle deformations with enhanced precision. In one experiment, a model using both data streams reduced force estimation errors by 12% compared to image-only inputs and by 14% compared to strain-gauge-only inputs.
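A minimal sketch of the fusion idea described above follows. It is written under stated assumptions: `read_strain_window` and `read_camera_features` are hypothetical placeholders for the two data streams, and a plain least-squares regressor stands in for whatever learned model produced the reported error reductions; none of these names or choices come from the original work. The key point it illustrates is multi-rate pairing, where each prediction combines a short window of high-rate strain samples with features from the most recent (slower) camera frame.

```python
import numpy as np

RNG = np.random.default_rng(0)

def read_strain_window(n=32):
    """Hypothetical: last n samples from the liquid metal strain gauge,
    captured at >1000 Hz as resistance-change values (placeholder data)."""
    return RNG.normal(size=n)

def read_camera_features(k=16):
    """Hypothetical: feature vector from the most recent fingerpad image,
    e.g. tracked texture displacements (placeholder data)."""
    return RNG.normal(size=k)

def fused_feature():
    """Late fusion: concatenate the fast strain window with the slow
    image features so one regressor sees both modalities."""
    return np.concatenate([read_strain_window(), read_camera_features()])

# Fit a linear force estimator on synthetic (feature, force) pairs.
# The learned model behind the reported 12-14% error reduction is not
# specified; ordinary least squares here is only a stand-in.
X = np.stack([fused_feature() for _ in range(200)])
y = RNG.normal(size=200)                       # placeholder force labels
w, *_ = np.linalg.lstsq(X, y, rcond=None)

force_estimate = fused_feature() @ w
print(f"estimated contact force: {force_estimate:.3f} (arbitrary units)")
```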
By integrating both high-resolution visual sensing and high-speed microfluidic sensing, this innovation has the potential to elevate the capabilities of robotics and prosthetics by providing a sense of touch closer to human perception. This paves the way for a new generation of robotic systems capable of more delicate and adaptive manipulation, as well as more realistic and intuitive haptic feedback.
Potential Applications:
- Robotic manipulation & grasping with improved dexterity and precision
- Prosthetics with more lifelike sensory function in artificial limbs
- Medical devices with enhanced feedback, such as surgical instruments
- Virtual reality (VR) & gaming devices with more immersive and realistic tactile feedback
- Human-machine interaction (HMI)-based smart devices with more intuitive controls
- General consumer products such as touch screens, touch-sensitive interfaces, and wearables
Advantages:
- Dual-mode sensing with both high spatial and high temporal resolution
- Fusion sensing improves force estimation accuracy
- Liquid metal gauge enables >1000 Hz sampling
- Vision system maps fine textures and shapes
- Compact fingertip design, scalable and modular
- Easily integrates into robotics, prosthetics, VR/AR
- Mimics human touch for natural interaction
State of Development:
Initial schematic design and demonstration have been completed.
Related Papers:
- W. Yuan et al., “GelSight: High-resolution robot tactile sensors for estimating geometry and force,” Sensors, 2017.
- B. Ward-Cherrier et al., “The TacTip family: Soft optical tactile sensors with 3D-printed biomimetic morphologies,” Soft Robotics, pp. 216–227, 2018.
- A. Alspach et al., “Soft-bubble grippers for robust and perceptive manipulation,” in IROS, 2020, pp. 9917–9924.
- N. Wettels et al., “Biomimetic tactile sensor array,” Adv. Robot., pp. 829–849, 2008.
- K. Dai et al., “Design of a biomimetic tactile sensor for material classification,” in ICRA, 2022, pp. 10774–10780.
- J. Yin et al., “Measuring dynamic shear force and vibration with a bioinspired tactile sensor skin,” IEEE Sensors Journal, pp. 3544–3553, 2018.
- W. Yuan et al., “Shape-independent hardness estimation using deep learning and a GelSight tactile sensor,” CoRR, 2017.
Reference:
UCLA Case No. 2025-99P
Lead Inventor:
Veronica Santos, Department of Mechanical and Aerospace Engineering, UCLA Samueli School of Engineering