Communications of the ACM

ACM TechNews

Robot Can Understand What a Hug Is


Researcher Yuhan Hu says the approach effectively gave the robot a new sense that lies somewhere between human touch and sight.

Cornell University researchers have created a low-cost method for soft, deformable robots to detect a range of physical interactions, using a USB camera inside the robot to capture the shadows cast by hand gestures on the robot's skin and machine learning software to classify them.

Credit: techxplore.com

A prototype soft robot with nylon skin developed by researchers at Cornell University is imbued with sensory perception that Cornell's Yuhan Hu said lies somewhere between human touch and sight.

The team stretched the skin over a 1.2-meter (3.9-foot) cylindrical scaffold atop a wheeled platform, with a commercial USB camera for interpreting different types of touch on the nylon.

The Cornell researchers then compiled a database of camera images of humans performing one of six types of interaction with the robot's skin, and trained a neural network to detect and classify those interactions with up to 92% accuracy.
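The pipeline described above — camera frames of shadows in, one of six gesture labels out — can be sketched as follows. This is a minimal illustration only: a nearest-centroid classifier stands in for the researchers' neural network, and synthetic brightness patterns stand in for real shadow images; the frame size and all data here are assumptions, not details from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

N_CLASSES = 6            # the study used six interaction types
FRAME_SHAPE = (32, 32)   # hypothetical downsampled camera frame

def fit_centroids(frames, labels):
    """Average the flattened frames belonging to each gesture class."""
    X = frames.reshape(len(frames), -1)
    return np.stack([X[labels == c].mean(axis=0) for c in range(N_CLASSES)])

def predict(frames, centroids):
    """Label each frame with the class of its nearest centroid."""
    X = frames.reshape(len(frames), -1)
    dists = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    return dists.argmin(axis=1)

# Synthetic stand-in data: each gesture class casts a shadow with a
# distinct mean brightness (real frames would come from the USB camera).
labels = rng.integers(0, N_CLASSES, size=300)
frames = rng.normal(loc=labels[:, None, None] * 0.5, scale=0.1,
                    size=(300, *FRAME_SHAPE))

centroids = fit_centroids(frames, labels)
accuracy = (predict(frames, centroids) == labels).mean()
```

On real data, the classifier and the hand-labeled image database would be the substantive pieces; the key design point from the article is that a single cheap interior camera, rather than a mesh of touch sensors, supplies the input signal.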

The team matched simple commands to the gestures, and also showed that with a projector added, the robot could display a user interface on its skin for use as a touchscreen.

From New Scientist


Abstracts Copyright © 2021 SmithBucklin, Washington, DC, USA

