Humanoid robot masters the waltz by mimicking human movements

Humanoid robot waltzes with the help of AI trained on human motion capture recordings

Xuxin Cheng and Mazeyu Ji

AI that helps humanoid robots mirror human movements could allow robots to walk, dance, and fight in more human-like ways.

The most agile and fluid robot movements, such as Boston Dynamics’ impressive demonstrations of robotic acrobatics, are typically narrow, pre-programmed sequences. Teaching robots a wide repertoire of convincing human movements remains difficult.

To overcome this hurdle, Xuanbin Peng at the University of California, San Diego, and his colleagues have developed an artificial intelligence system called ExBody2, which lets a robot imitate a wide variety of human movements and execute them smoothly and realistically.

Peng and his team began by building a database of movements that a humanoid robot could plausibly perform, from simple actions such as standing and walking to more complex ones such as tricky dance moves. The database contains motion capture recordings of hundreds of human volunteers, collected in previous research projects.

“Humanoid robots share a similar physical structure with us, so it makes sense to leverage the vast amount of human movement data that is already available,” says Peng. “By learning to imitate this kind of motion, robots can quickly acquire a wide variety of human-like behaviors. It means that anything humans can do, robots have the potential to learn.”

To teach a simulated humanoid robot how to move, Peng and his team used reinforcement learning, in which an AI is given an example of what counts as a successful movement and then left to work out, through trial and error, how to achieve it. They started by training ExBody2 with full access to all the data about the virtual robot, including the coordinates of each joint, so it could mimic human movements as closely as possible. The system then learned to reproduce those movements using only data that would be accessible in the real world, such as inertia and velocity measurements from sensors on the physical robot’s body.
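The description above maps onto a common two-stage recipe in robot learning: a privileged “teacher” policy is first trained by trial and error with full access to the simulator’s state, then distilled into a “student” policy that only sees sensor-style observations. The toy Python sketch below illustrates that recipe under heavy simplification; the stand-in simulator, the linear policies, the reward, and every name in it are illustrative assumptions, not the ExBody2 implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

N_JOINTS = 12            # toy humanoid with 12 controllable joints
PRIV_DIM = 2 * N_JOINTS  # teacher sees exact tracking error + joint velocities
REAL_DIM = N_JOINTS      # student sees only a noisy, sensor-like tracking error


def imitation_reward(pose, ref_pose):
    """Higher reward the closer the joint angles are to the mocap reference."""
    return -float(np.sum((pose - ref_pose) ** 2))


def privileged_obs(pose, vel, ref_pose):
    """Stage 1 observation: exact simulator state, unavailable on real hardware."""
    return np.concatenate([ref_pose - pose, vel])


def real_obs(pose, vel, ref_pose):
    """Stage 2 observation: the same tracking error, corrupted with noise to
    stand in for onboard inertial and velocity sensors."""
    return (ref_pose - pose) + rng.normal(scale=0.05, size=N_JOINTS)


def rollout(weights, obs_fn, clip):
    """Run one episode in a stand-in 'simulator': the pose is nudged toward the
    policy's commanded action each step, and we score reference tracking."""
    pose, vel, total = np.zeros(N_JOINTS), np.zeros(N_JOINTS), 0.0
    for ref_pose in clip:
        action = weights @ obs_fn(pose, vel, ref_pose)
        vel = 0.5 * vel + 0.5 * (action - pose)
        pose = pose + vel
        total += imitation_reward(pose, ref_pose)
    return total


# A tiny stand-in "motion-capture database": sinusoidal joint trajectories.
clips = [np.sin(np.linspace(0, 2 * np.pi, 50))[:, None]
         * rng.uniform(0.2, 1.0, size=N_JOINTS) for _ in range(5)]

# Stage 1: trial-and-error search (a stand-in for reinforcement learning)
# over a privileged teacher policy that sees the full simulator state.
teacher = np.zeros((N_JOINTS, PRIV_DIM))
best = np.mean([rollout(teacher, privileged_obs, c) for c in clips])
for _ in range(300):
    candidate = teacher + rng.normal(scale=0.05, size=teacher.shape)
    score = np.mean([rollout(candidate, privileged_obs, c) for c in clips])
    if score > best:
        teacher, best = candidate, score
print(f"teacher imitation score: {best:.2f}")

# Stage 2: distil the teacher into a student policy that only uses
# real-world-style observations, by regressing onto the teacher's actions.
X, Y = [], []
for clip in clips:
    pose, vel = np.zeros(N_JOINTS), np.zeros(N_JOINTS)
    for ref_pose in clip:
        X.append(real_obs(pose, vel, ref_pose))
        Y.append(teacher @ privileged_obs(pose, vel, ref_pose))
        vel = 0.5 * vel + 0.5 * (Y[-1] - pose)
        pose = pose + vel
X, Y = np.array(X), np.array(Y)
student = np.linalg.lstsq(X, Y, rcond=None)[0].T  # least-squares "training"
print(f"student imitation score: "
      f"{np.mean([rollout(student, real_obs, c) for c in clips]):.2f}")
```

Here the “distillation” is just a least-squares regression onto the teacher’s actions; real systems use neural-network policies and far richer observations, but the split between a privileged training stage and a sensor-only stage is the point being illustrated.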

After training on the database, ExBody2 was able to control two different commercially available humanoid robots. It could smoothly combine simple movements such as walking in a straight line and crouching, as well as perform trickier feats such as following a 40-second dance routine, throwing punches, and waltzing with a human partner.

“Humanoid robots work best when all the limbs and joints work together,” says Peng. “Many tasks and movements require coordination between the arms, legs, and torso, and whole-body coordination greatly expands the range of a robot’s capabilities.”


Source: www.newscientist.com

Robot dog masters opening doors with its paw

A machine learning model figured out how to keep the robot stable on three legs while opening a door with one leg.

Philip Arm, Mayank Mittal, Hendrik Kolvenbach, Marco Hutter/Robot Systems Laboratory

The robot dog can open doors, press buttons, and pick up backpacks with one leg while balancing on its other three legs.

Quadruped robots like Spot, the star of Boston Dynamics' viral videos, typically need an arm attached to their body to open doors or lift objects, which adds significantly to their weight and can make it harder for the robot to maneuver through tight spaces.

Philip Arm and his colleagues at ETH Zurich in Switzerland used a machine learning model to teach an off-the-shelf robot dog to perform tasks with one of its legs while remaining stationary or moving on the other three.

“We can't do everything with a leg that we can do with an arm; we are much more dexterous with our hands at the moment. But the idea is to make this work in applications where there are mass constraints, or where you don't want the added complexity, such as space exploration, where every kilogram counts,” says Arm.

To train the dog, an ANYmal robot made by ANYbotics, Arm and his team gave the machine learning model the goal of reaching a specific point in space with one of the robot's legs. The model then took control of the remaining three legs and worked out on its own how to keep the robot balanced while standing and walking.
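One way to read that setup is as a goal-conditioned reward: the learning system is scored on how close the chosen foot gets to a commanded point in space, with extra terms that penalize tipping or sagging so the other three legs keep the body stable. The short Python sketch below is an illustrative guess at what such a reward could look like; the term weights, the nominal body height, and the function name are assumptions, not the ETH Zurich team's implementation.

```python
import numpy as np

def manipulation_reward(foot_pos, target_pos, base_roll_pitch, base_height,
                        nominal_height=0.5):
    """Toy reward in the spirit of the training goal described above:
    move the free foot to a commanded point in space while the other
    three legs keep the body level and at a sensible height.
    All weights and terms here are illustrative assumptions."""
    reach = -np.linalg.norm(foot_pos - target_pos)   # bring the foot to the target
    level = -np.sum(np.abs(base_roll_pitch))         # keep the torso level
    height = -abs(base_height - nominal_height)      # don't crouch or topple
    return 2.0 * reach + 0.5 * level + 0.5 * height


# Example: the foot is about 20 cm from the commanded target, body slightly tilted.
print(manipulation_reward(
    foot_pos=np.array([0.4, 0.1, 0.3]),
    target_pos=np.array([0.5, 0.0, 0.45]),
    base_roll_pitch=np.array([0.02, -0.05]),
    base_height=0.48,
))
```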

Arm and his team can now remotely control the robot to perform actions such as picking up backpacks and putting them in boxes, or collecting rocks. Currently, the robot can only perform these tasks when controlled by a human, but Arm hopes future improvements will allow the dog to autonomously manipulate objects with its paws.


Source: www.newscientist.com