Engineers at the University of California San Diego have trained a humanoid robot to effortlessly learn and perform a variety of expressive movements, including simple dance routines and gestures like waving, high-fiving and hugging, all while maintaining a steady gait on diverse terrains.

The enhanced expressiveness and agility of this humanoid robot pave the way for improving human-robot interactions in settings such as factory assembly lines, hospitals and homes, where robots could safely operate alongside humans or even replace them in hazardous environments like laboratories or disaster sites.

“Through expressive and more human-like body motions, we aim to build trust and showcase the potential for robots to co-exist in harmony with humans,” said Xiaolong Wang, a professor in the Department of Electrical and Computer Engineering at the UC San Diego Jacobs School of Engineering. “We are working to help reshape public perceptions of robots as friendly and collaborative rather than terrifying like The Terminator.”

Wang and his team will present their work at the 2024 Robotics: Science and Systems Conference, which will take place from July 15 to 19 in Delft, Netherlands.

What makes this humanoid robot so expressive is that it is trained on a diverse array of human body motions, enabling it to generalize to new motions and mimic them with ease. Much like a quick-learning dance student, the robot can swiftly learn new routines and gestures.

To train their robot, the team used an extensive collection of motion capture data and dance videos. Their technique involved training the upper and lower body separately. This approach allowed the robot’s upper body to replicate various reference motions, such as dancing and high-fiving, while its legs focused on a steady stepping motion to maintain balance and traverse different terrains.
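The split described above can be sketched as two separate reward terms that together train one whole-body controller: the upper body is scored on how closely it imitates the reference motion, while the legs are scored on stable, velocity-tracking stepping. This is a minimal illustration only; the function names, weights, and reward shapes are assumptions, not the authors' actual implementation.

```python
import numpy as np

def upper_body_imitation_reward(q_upper, q_ref_upper, sigma=0.5):
    """Gaussian-shaped reward for tracking reference upper-body joint angles."""
    error = np.sum((q_upper - q_ref_upper) ** 2)
    return float(np.exp(-error / sigma**2))

def lower_body_locomotion_reward(base_lin_vel, cmd_vel, base_height,
                                 target_height=0.9):
    """Reward steady walking: track the commanded planar velocity and
    keep the base near a nominal height, ignoring the reference legs."""
    vel_err = np.sum((base_lin_vel[:2] - cmd_vel) ** 2)
    height_err = (base_height - target_height) ** 2
    return float(np.exp(-vel_err) - height_err)

def total_reward(q_upper, q_ref_upper, base_lin_vel, cmd_vel, base_height,
                 w_imitate=1.0, w_locomote=1.0):
    """One scalar reward, so a single policy is trained for the whole body."""
    return (w_imitate * upper_body_imitation_reward(q_upper, q_ref_upper)
            + w_locomote * lower_body_locomotion_reward(
                base_lin_vel, cmd_vel, base_height))

# Ideal case: perfect imitation, perfect velocity tracking, nominal height.
q = np.zeros(8)
r = total_reward(q, q, np.array([0.5, 0.0, 0.0]), np.array([0.5, 0.0]), 0.9)
# r == 2.0 here: each term contributes its maximum of 1.0
```

The point of the split is visible in the two functions: only the first compares joint angles against the reference motion, so dance clips with physically infeasible leg motion can still supervise the arms and torso.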

“The main goal here is to show the ability of the robot to do different things while it’s walking from place to place without falling,” said Wang.

Despite the separate training of the upper and lower body, the robot operates under a unified policy that governs its entire structure. This coordinated policy ensures that the robot can perform complex upper body gestures while walking steadily on surfaces like gravel, dirt, wood chips, grass and inclined concrete paths.

The control policy was first trained in simulation on a virtual humanoid robot and then transferred to the real robot. The robot demonstrated the ability to execute both learned and new movements in real-world conditions.
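A common way to make this kind of simulation-to-real transfer work is domain randomization: physical parameters are perturbed each training episode so the policy tolerates the mismatch between simulation and hardware. The sketch below is illustrative of the general technique; the parameter names and ranges are assumptions, not values from the paper.

```python
import random

def randomized_sim_params(rng=random):
    """Sample a fresh set of simulator parameters for one training episode.
    Ranges are hypothetical examples of the kind of variation used."""
    return {
        "ground_friction": rng.uniform(0.4, 1.2),    # loose gravel vs. concrete
        "payload_mass_kg": rng.uniform(-1.0, 1.0),   # extra/missing torso mass
        "motor_strength_scale": rng.uniform(0.8, 1.2),
        "control_latency_s": rng.uniform(0.0, 0.04), # actuation delay
    }

# Each episode sees a slightly different "world":
params = randomized_sim_params()
```

Because the policy never sees the exact same physics twice, it cannot overfit to the simulator, which is what lets a single learned controller walk on gravel, grass, and inclined concrete.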

Currently, the robot’s movements are directed by a human operator using a game controller, which dictates its speed, direction and specific motions. The team envisions a future version equipped with a camera, enabling the robot to perform tasks and navigate varied terrain fully autonomously.
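A teleoperation setup like the one described might map controller input to a compact command vector that the policy consumes: stick axes set speed and heading, and buttons select a gesture. This is a hypothetical sketch; the field names, speed limits, and motion catalog are illustrative assumptions.

```python
def controller_to_command(left_stick_y, right_stick_x, motion_button):
    """Map raw game-controller state (sticks in [-1, 1], a button index)
    to the command the walking policy would track. All limits are assumed."""
    MAX_SPEED = 1.0      # m/s, hypothetical forward-speed limit
    MAX_YAW_RATE = 1.5   # rad/s, hypothetical turning-rate limit
    motions = {0: "walk_only", 1: "wave", 2: "high_five", 3: "dance"}

    clamp = lambda x, lim: max(-lim, min(lim, x))
    return {
        "forward_speed": clamp(left_stick_y * MAX_SPEED, MAX_SPEED),
        "yaw_rate": clamp(right_stick_x * MAX_YAW_RATE, MAX_YAW_RATE),
        "motion": motions.get(motion_button, "walk_only"),
    }

cmd = controller_to_command(0.5, -0.2, 2)
# half-speed forward, gentle left turn, "high_five" gesture selected
```

Keeping the interface this narrow is also what makes the team's envisioned upgrade straightforward: a camera-driven planner could emit the same command vector in place of the human operator.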

The team is now focused on refining the robot’s design to tackle more intricate and fine-grained tasks. “By extending the capabilities of the upper body, we can expand the range of motions and gestures the robot can perform,” said Wang.

Paper title: “Expressive Whole-Body Control for Humanoid Robots.” Co-authors include Xuxin Cheng*, Yandong Ji*, Junming Chen and Ruihan Yang, UC San Diego; and Ge Yang, Massachusetts Institute of Technology.

*These authors contributed equally to this work.
