Engineers at the University of California San Diego have trained a humanoid robot to perform a variety of expressive movements, such as dancing, waving, high-fiving, and hugging, while maintaining balance on diverse terrains. The advance is part of an ongoing effort to develop robots capable of taking on more complex tasks.
By making the humanoid more expressive and agile, the researchers aim to improve human-robot interaction in settings such as factory assembly lines, hospitals, and homes. These capabilities could also let the robot work safely alongside humans or take on roles in hazardous environments such as laboratories or disaster sites.
Xiaolong Wang, a professor in the Department of Electrical and Computer Engineering at the UC San Diego Jacobs School of Engineering, emphasized the importance of building trust through expressive and human-like body motions. He noted that the goal is to reshape public perceptions of robots, promoting them as friendly and collaborative rather than intimidating.
Wang and his team are scheduled to present their research at the 2024 Robotics: Science and Systems Conference in Delft, Netherlands, from July 15 to 19.
The robot owes its expressiveness to training on a wide range of human body motions, which allows it to generalize and mimic new movements efficiently. Training drew on extensive motion capture data and dance videos, and treated the upper and lower body separately: the upper body learned to track a variety of reference motions while the legs learned to maintain a steady gait on uneven surfaces.
Despite the separate training, a single unified policy controls the whole robot, so it can execute complex upper-body gestures while walking steadily over surfaces such as gravel, dirt, wood chips, grass, and inclined concrete paths. The policy was first trained and tested on a simulated humanoid before being transferred to a real robot, which demonstrated both learned and new movements in real-world conditions.
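The idea of a single policy coordinating separately trained objectives can be illustrated with a toy sketch. The class below is not the team's actual controller; the dimensions, the linear "network", and all names are illustrative assumptions. It only shows the structural point: the upper-body reference motion and the locomotion command enter one observation vector, so one policy produces actions for the whole body at once.

```python
import numpy as np

class UnifiedWholeBodyPolicy:
    """Toy sketch of a unified whole-body policy (illustrative only).

    One network consumes proprioception, an upper-body reference
    motion, and a locomotion command, and emits joint targets for
    the entire robot, arms and legs together.
    """

    def __init__(self, obs_dim, act_dim, seed=0):
        # Stand-in for a trained network: a fixed random linear map.
        rng = np.random.default_rng(seed)
        self.W = rng.normal(0.0, 0.1, size=(act_dim, obs_dim))

    def act(self, proprio, upper_body_ref, locomotion_cmd):
        # The reference motion and velocity command are simply extra
        # observation inputs, so a single policy integrates both
        # gesture tracking and steady walking.
        obs = np.concatenate([proprio, upper_body_ref, locomotion_cmd])
        return np.tanh(self.W @ obs)  # joint targets squashed to [-1, 1]

# Hypothetical sizes: 12 proprioceptive dims, 8 upper-body reference
# dims, 3 command dims (forward speed, lateral speed, turn rate),
# 19 actuated joints.
policy = UnifiedWholeBodyPolicy(obs_dim=12 + 8 + 3, act_dim=19)
action = policy.act(np.zeros(12), np.full(8, 0.5), np.array([0.5, 0.0, 0.1]))
print(action.shape)  # one action vector for the whole body
```

In a real pipeline such a policy would be trained with reinforcement learning in simulation and then transferred to hardware, as the article describes; the linear map here merely stands in for that learned network.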
Currently, the robot is controlled by a human operator using a game controller to manage its speed, direction, and specific motions. The team aims to develop a future version with an integrated camera to enable autonomous task performance and navigation.
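A teleoperation layer like the one described can be sketched as a simple mapping from controller inputs to robot commands. Everything below is assumed for illustration: the axis conventions, the speed caps, and the gesture names are not from the source.

```python
def controller_to_command(left_stick, right_stick, button):
    """Toy mapping from game-controller state to robot commands.

    left_stick / right_stick are (x, y) pairs in [-1, 1];
    the button index selects a named gesture. All ranges and
    gesture names are illustrative assumptions.
    """
    clamp = lambda v: max(-1.0, min(1.0, v))
    vx = clamp(left_stick[1]) * 0.8        # forward speed, capped (m/s)
    yaw_rate = clamp(right_stick[0]) * 1.0 # turn rate, capped (rad/s)
    gestures = {0: None, 1: "wave", 2: "high_five", 3: "dance"}
    return {"vx": vx, "yaw_rate": yaw_rate, "gesture": gestures.get(button)}

# Push the left stick halfway forward, nudge the right stick, press button 1.
cmd = controller_to_command((0.0, 0.5), (0.3, 0.0), 1)
print(cmd)  # {'vx': 0.4, 'yaw_rate': 0.3, 'gesture': 'wave'}
```

Clamping and capping the stick axes keeps operator input within safe speed limits, which is a common pattern in teleoperation interfaces.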
The next phase of the project will focus on refining the robot’s design to handle more intricate and precise tasks, thereby expanding the range of motions and gestures it can perform.
Image: UC San Diego Jacobs School of Engineering