
Listening skills bring human-like touch to robots

by Pieter Werner

Researchers at Duke University have developed a new system called SonicSense, which allows robots to interpret objects through vibrations, mimicking a human-like sense of touch. The system, to be presented at the Conference on Robot Learning (CoRL 2024), enables robots to identify materials, understand shapes, and recognize objects based on vibrations detected through contact microphones embedded in their fingertips.

SonicSense works by detecting and recording the vibrations produced when the robotic hand interacts with an object. An AI model then analyzes these signals to infer the object's material and shape. In tests, SonicSense counted dice inside a box, gauged the amount of liquid in a bottle, and reconstructed an object's 3D shape by tapping on its surface. This acoustic approach complements vision-based systems, giving robots an additional layer of sensory information that cameras alone can miss.
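To make the idea concrete, here is a minimal, hypothetical Python sketch of acoustic material recognition: record a tap through a contact microphone, reduce it to a coarse frequency-domain signature, and match it against signatures of known materials. The sample rate, the synthetic tap signals, and the nearest-prototype matcher are illustrative assumptions, not the learned model the Duke team actually uses.

```python
# Illustrative sketch only; NOT the SonicSense pipeline. It shows the
# general flow: vibration recording -> spectral signature -> material match.
import numpy as np

SAMPLE_RATE = 16_000  # Hz; assumed value for illustration


def spectral_signature(signal: np.ndarray, n_bins: int = 32) -> np.ndarray:
    """Reduce a vibration recording to a coarse log-magnitude spectrum."""
    spectrum = np.abs(np.fft.rfft(signal))
    # Pool the spectrum into n_bins bands so signatures are comparable.
    bands = np.array_split(spectrum, n_bins)
    return np.log1p(np.array([band.mean() for band in bands]))


def classify(signature: np.ndarray, prototypes: dict) -> str:
    """Nearest-prototype match against per-material reference signatures."""
    return min(prototypes, key=lambda m: np.linalg.norm(signature - prototypes[m]))


# Synthetic stand-ins: a tap on metal rings at higher frequencies and decays
# more slowly than a tap on wood (rough physical intuition, not real data).
t = np.arange(0, 0.25, 1 / SAMPLE_RATE)
metal_tap = np.sin(2 * np.pi * 3000 * t) * np.exp(-t / 0.08)
wood_tap = np.sin(2 * np.pi * 400 * t) * np.exp(-t / 0.02)

prototypes = {
    "metal": spectral_signature(metal_tap),
    "wood": spectral_signature(wood_tap),
}

# A new, noisy tap is matched to the closest known material signature.
rng = np.random.default_rng(0)
unknown = metal_tap + 0.05 * rng.standard_normal(metal_tap.size)
print(classify(spectral_signature(unknown), prototypes))  # -> "metal"
```

In the real system, a trained model would replace the hand-built signatures and nearest-prototype matcher above, but the overall flow from raw vibration to a material label is analogous.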

The system uses a four-fingered robotic hand with a contact microphone embedded in each fingertip; because contact microphones pick up vibrations through touch rather than through the air, they help reduce interference from ambient noise. Its ability to function in open lab settings distinguishes it from earlier efforts, which were often limited to controlled environments. SonicSense is also inexpensive to build: its components, including off-the-shelf contact microphones of the kind used for musical instruments, keep the construction cost at just over $200.

Looking ahead, the research team aims to extend SonicSense to dynamic, cluttered environments and to refine the robotic hand for more complex manipulation tasks that combine multiple sensory inputs. The goal is to further expand the system's sensory abilities, enabling robots to perform nuanced tasks with human-like adaptability in real-world situations.
