
Communications of the ACM

ACM Careers

A Robotic Leg Learns to Walk Without Explicit Programming


robotic limb motors and animal-like tendons

Three motors drive the animal-like tendons of the USC team's robotic limb.

Credit: Matthew Lin

For a newborn giraffe or wildebeest, being born can be a perilous introduction to the world—predators lie in wait for an opportunity to make a meal of the herd's weakest member. This is why many species have evolved ways for their juveniles to find their footing within minutes of birth.

It's an astonishing evolutionary feat that has long inspired biologists and roboticists. Now a team of researchers at the USC Viterbi School of Engineering believes it has become the first to create an AI-controlled robotic limb, driven by animal-like tendons, that can be tripped up and then recover before the next footfall, a task the robot was never explicitly programmed to do.

Francisco J. Valero-Cuevas, a professor of Biomedical Engineering and of Biokinesiology & Physical Therapy at USC, working with USC Viterbi School of Engineering doctoral students Ali Marjaninejad, Darío Urbina-Meléndez, and Brian Cohn, has developed a bio-inspired algorithm that can learn a new walking task by itself after only five minutes of unstructured play, and then adapt to other tasks without any additional programming.

They describe their work in "Autonomous Functional Movements In a Tendon-Driven Limb via Limited Experience," published in the journal Nature Machine Intelligence. Their article opens exciting possibilities for understanding human movement and disability, creating responsive prosthetics, and building robots that can interact with complex and changing environments, such as those encountered in space exploration and search-and-rescue.

"Nowadays, it takes the equivalent of months or years of training for a robot to be ready to interact with the world, but we want to achieve the quick learning and adaptations seen in nature," says senior author Valero-Cuevas, who also has appointments in computer science, electrical and computer engineering, aerospace and mechanical engineering, and neuroscience at USC.

Marjaninejad, a doctoral candidate in the Department of Biomedical Engineering at USC and the paper's lead author, says this breakthrough is akin to the natural learning that happens in babies. The robot, he explains, was first allowed to understand its environment through a process of free play, known as 'motor babbling.'

"These random movements of the leg allow the robot to build an internal map of its limb and its interactions with the environment," Marjaninejad says.
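The idea of motor babbling can be illustrated with a toy example: issue random motor commands, record what the limb does, and later reuse that experience to reach a goal. The sketch below is a deliberately simplified one-joint stand-in, not the authors' tendon-driven system or algorithm; the limb dynamics, sample counts, and tolerance are all invented for illustration.

```python
import random

def limb_response(command):
    """Hypothetical limb dynamics (unknown to the learner):
    maps a motor command to a resulting joint angle."""
    return 1.5 * command + 0.1

def babble(n_samples=200, seed=0):
    """Free play: try random commands and remember what happened.
    The (command, outcome) pairs form a crude internal map."""
    rng = random.Random(seed)
    return [(cmd, limb_response(cmd))
            for cmd in (rng.uniform(-1.0, 1.0) for _ in range(n_samples))]

def choose_command(experience, target_angle):
    """Exploit the map: reuse the babbled command whose recorded
    outcome came closest to the target. 'Good enough', not optimal."""
    return min(experience, key=lambda e: abs(e[1] - target_angle))[0]

experience = babble()
cmd = choose_command(experience, target_angle=0.5)
print("command:", cmd, "error:", abs(limb_response(cmd) - 0.5))
```

Note the design choice mirrored from the article: the learner never sees the dynamics equation or a simulator; it only accumulates its own experience, and the solution it settles on is merely "good enough for the situation."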

The paper's authors say that, unlike most current work, their robots learn by doing, without any prior or parallel computer simulations to guide learning.

This is particularly important, Marjaninejad says, because programmers can predict and code for multiple scenarios, but not for every possible scenario—thus pre-programmed robots are inevitably prone to failure.

"However, if you let these [new] robots learn from relevant experience, then they will eventually find a solution that, once found, will be put to use and adapted as needed," he says. "The solution may not be perfect, but will be adopted if it is good enough for the situation. Not every one of us needs or wants—or is able to spend the time and effort—to win an Olympic medal."

Through this process of discovering their body and environment, the robot limbs designed at Valero-Cuevas' lab at USC use their unique experience to develop the gait pattern that works well enough for them, producing robots with personalized movements. "You can recognize someone coming down the hall because they have a particular footfall," Valero-Cuevas says. "Our robot uses its limited experience to find a solution to a problem that then becomes its personalized habit, or 'personality.' We get the dainty walker, the lazy walker, the champ . . . you name it."

The potential applications for the technology are many, particularly in assistive technology, where robotic limbs and exoskeletons that are intuitive and responsive to a user's personal needs would be invaluable to those who have lost the use of their limbs. "Exoskeletons or assistive devices will need to naturally interpret your movements to accommodate what you need," Valero-Cuevas says.

"Because our robots can learn habits, they can learn your habits, and mimic your movement style for the tasks you need in everyday life—even as you learn a new task, or grow stronger or weaker," he says.

According to the authors, the research will also have strong applications in space exploration and rescue missions, allowing robots to do what needs to be done without being escorted or supervised as they venture onto a new planet, or into uncertain and dangerous terrain in the wake of natural disasters. These robots could adapt, for example, to low or high gravity, or to loose rocks one day and mud after it rains.

The paper's additional authors describe the possible implications of their work. "The ability for a species to learn and adapt their movements as their bodies and environments change has been a powerful driver of evolution from the start," says Cohn, a doctoral candidate in computer science at the USC Viterbi School of Engineering. "Our work constitutes a step towards empowering robots to learn and adapt from each experience, just as animals do."

"I envision muscle-driven robots, capable of mastering what an animal takes months to learn, in just a few minutes," says Urbina-Meléndez, a doctoral candidate in biomedical engineering who believes in the capacity for robotics to take bold inspiration from life. "Our work combining engineering, AI, anatomy, and neuroscience is a strong indication that this is possible."

This research was funded in part by the National Institutes of Health, the Department of Defense's CDMRP program, and DARPA's Lifelong Learning Machines program. Discover more about the project.


 
