Researchers at Tsinghua University have developed an advanced robotic system capable of sorting garbage with high accuracy by mimicking the complex human sense of touch. Detailed in Applied Physics Reviews, the system utilizes tactile sensing and logical reasoning strategies to improve object recognition and classification.
Traditional robotic systems often struggle to recognize objects that are similar in size and shape, or that the robot has never encountered before, particularly when background noise is present or items within the same category vary widely. To address these challenges, the Tsinghua University team focused on enhancing the robot's tactile sensing abilities. They drew inspiration from human touch, which includes thermal sensation: the perception of temperature differences that helps people distinguish materials such as wood and metal.
The researchers developed a multi-layered sensor system for the robot. The top layer detects material types, the middle porous layer is sensitive to thermal changes, and the bottom layer measures pressure. This sensor setup was combined with a cascade classification algorithm designed to classify objects from easy to hard. For example, the system begins by identifying simple items such as empty cartons and progresses to more complex items like orange peels and cloth scraps.
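The cascade idea can be loosely illustrated in code. The sketch below is a hypothetical simplification, not the authors' implementation: the sensor fields, thresholds, and category labels are all assumptions standing in for the fused signals of the three sensor layers, and the final stage is where a learned classifier would handle the harder items.

```python
from dataclasses import dataclass


@dataclass
class TactileReading:
    """Hypothetical fused reading from the three sensor layers."""
    material_signal: float   # top layer: material-dependent response
    thermal_delta: float     # middle porous layer: temperature change on contact
    pressure: float          # bottom layer: contact pressure


def cascade_classify(reading: TactileReading) -> str:
    """Illustrative easy-to-hard cascade; thresholds are placeholders, not values from the paper."""
    # Stage 1: "easy" items, e.g. an empty carton gives low pressure and little thermal response.
    if reading.pressure < 0.2 and abs(reading.thermal_delta) < 0.1:
        return "empty carton (recyclable)"
    # Stage 2: rigid, thermally conductive items such as metal or glass containers.
    if abs(reading.thermal_delta) > 0.5 and reading.pressure > 0.6:
        return "rigid recyclable"
    # Stage 3: remaining soft or irregular items (orange peels, cloth scraps)
    # would be deferred to a classifier trained on the harder categories.
    return "hard case: defer to learned classifier"
```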
In tests, the intelligent robot tactile system successfully sorted various types of garbage, including empty cartons, bread scraps, plastic bags, plastic bottles, napkins, sponges, orange peels, and expired drugs. The robot categorized these items into containers for recyclables, food scraps, hazardous waste, and other waste types. The system achieved a classification accuracy of 98.85% on previously unencountered objects, demonstrating its potential to significantly reduce the need for human labor in waste management and other smart-technology applications.
Future research will focus on further enhancing robotic intelligence and autonomous operation. Additionally, combining this tactile sensor with brain-computer interface technology could convert sensory data into neural signals, potentially restoring tactile perception for individuals with hand disabilities.
Image credit: Qian Mao and Rong Zhu