A Brown University-led team of robotics researchers has demonstrated how a robot can detect and respond to nonverbal commands in various environments without having to adjust for variations in lighting. "We have created a novel system where the robot will follow you at a precise distance, where you don't need to wear special clothing, you don't need to be in a special environment and you don't need to look backward to track it," says Brown professor Chad Jenkins.
A demonstration of the robot shows Brown graduate students using a variety of hand and arm signals to instruct the robot. The user can walk with his or her back turned to the robot and naturally move around corners, down narrow hallways or in an outdoor parking lot. The robot reliably follows approximately three feet behind and will back up when a student turns around and approaches the robot.
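The follow-at-a-fixed-distance behavior described above can be sketched as a simple proportional controller driven by the depth camera's estimate of the person's range. This is a hypothetical illustration, not the Brown team's actual code; the target distance, gain, and speed cap are assumed values.

```python
# Hypothetical sketch (not the researchers' implementation): a proportional
# "follow at a fixed distance" controller fed by a depth-camera range reading.
TARGET_DISTANCE_M = 0.9   # roughly three feet (assumed)
GAIN = 0.8                # proportional gain (assumed)
MAX_SPEED_M_S = 0.5       # safety cap on forward/backward speed (assumed)

def follow_speed(person_distance_m: float) -> float:
    """Return a forward speed (negative means back up) that closes
    the gap between the robot and the tracked person."""
    error = person_distance_m - TARGET_DISTANCE_M
    speed = GAIN * error
    # Clamp so the robot neither lunges forward nor reverses too fast.
    return max(-MAX_SPEED_M_S, min(MAX_SPEED_M_S, speed))
```

With this scheme, a person walking ahead produces a positive error and the robot advances; a person turning around and closing in produces a negative error, so the robot backs away, matching the backing-up behavior shown in the demonstration.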
The researchers augmented a PackBot, developed by iRobot, with a commercial depth-imaging camera and a laptop running software that enables the machine to recognize human gestures and respond to them. Jenkins says the combination of advances in visual recognition with depth imaging makes possible a robot that requires neither remote control nor constant vigilance. "Advances in enabling intuitive human-robot interaction, such as through speech or gestures, go a long way into making the robot more of a valuable sidekick and less of a machine you have to constantly command," says iRobot's Chris Jones.
From Brown University