
Communications of the ACM

ACM TechNews

Robots Now Can Understand What You Are Saying to Follow Commands


[Image: A robot seeking verbal feedback.]

Telling a robot where to go is now a lot easier, thanks to a new model based on how people actually speak when giving directions.

Credit: Getty Images

University of Michigan (UM) researchers have developed a model that simplifies robots' ability to follow commands by allowing them to understand what people are saying.

Many robots utilize simultaneous localization and mapping (SLAM) to know their whereabouts, concurrently tracking their location on a map and updating their environmental knowledge.
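The two halves of SLAM described above can be sketched in a deliberately simplified toy loop. This is an illustrative sketch only, not the UM system or any real SLAM library: the robot advances its pose estimate (localization) and records what it senses at the new position (mapping) in a single step.

```python
# Toy sketch of the SLAM idea: the robot simultaneously tracks its
# own position and updates a map of its environment. Real SLAM uses
# probabilistic sensor fusion; this illustration is purely didactic.

def slam_step(pose, move, observation, grid):
    """Apply one motion, then record the observation at the new pose."""
    x, y = pose
    dx, dy = move
    new_pose = (x + dx, y + dy)   # localization: update position estimate
    grid[new_pose] = observation  # mapping: record what the robot senses here
    return new_pose, grid

pose, grid = (0, 0), {}
for move, seen in [((1, 0), "open"), ((0, 1), "wall")]:
    pose, grid = slam_step(pose, move, seen, grid)
# pose is now (1, 1); grid records "open" at (1, 0) and "wall" at (1, 1)
```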

UM's Jason Corso and colleagues remote-controlled a robot around a tabletop maze arranged in 116 configurations without being able to see the labyrinth, while a natural language processing model associated the navigator's commands with the driver's actions.

Once the language dataset was compiled, the models that parse those commands were trained in simulation and learned to follow plain-text instructions.
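To make the idea of mapping plain-text commands to robot actions concrete, here is a hypothetical rule-based sketch. The real UM model is learned from the collected language dataset, not hand-written rules; the command strings and move vectors below are assumptions for illustration only.

```python
# Hypothetical lookup from plain-text directions to discrete moves.
# The actual UM system learns this mapping from data; this table is
# a stand-in to show the input/output relationship.

COMMANDS = {
    "go forward": (0, 1),
    "go left": (-1, 0),
    "go right": (1, 0),
    "go back": (0, -1),
}

def parse_command(text):
    """Return the move vector for a recognized command, else None."""
    return COMMANDS.get(text.strip().lower())
```

A learned model replaces the dictionary lookup with a parser that generalizes to phrasings it has never seen, which is what lets the robot "adapt to the human language" rather than the reverse.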

Corso said, "The challenge for humans to interact with SLAM-based machines is we need to think on their terms. It's really rigid and we have to adapt to the robots. The goal is to flip that and have the robot adapt to the human language."

From New Scientist

 

Abstracts Copyright © 2020 SmithBucklin, Washington, DC, USA


 
