
Communications of the ACM

ACM TechNews

Muscle Signals Can Pilot a Robot


The Massachusetts Institute of Technology's Joseph DelPreto directs a drone through obstacles with muscle movements.


Credit: Joseph DelPreto et al

Researchers at the Massachusetts Institute of Technology Computer Science and Artificial Intelligence Laboratory (CSAIL) have invented a system that taps human muscle signals from wearable sensors to pilot a robot.

Conduct-A-Bot uses electromyography and motion sensors worn on the user's biceps, triceps, and forearms to quantify muscle signals and movement, and processes that data with algorithms to identify gestures in real time.
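The article does not describe Conduct-A-Bot's actual algorithms, but the pipeline it outlines (muscle and motion readings in, gesture label out) can be illustrated with a minimal sketch. Everything here is hypothetical: the function name, the thresholds, and the gesture labels are invented for illustration, and real EMG classification would involve filtering, calibration, and learned models rather than fixed cutoffs.

```python
# Hypothetical sketch of gesture classification from smoothed sensor
# readings. All names and thresholds are illustrative; the researchers'
# real-time algorithms are not detailed in the article.

def classify_gesture(emg_biceps, emg_forearm, rotation_rate):
    """Return a gesture label from normalized (0-1) EMG envelopes and
    an angular velocity (rad/s) from a motion sensor."""
    if emg_forearm > 0.8:            # strong forearm activation ~ clenched fist
        return "clench"
    if emg_biceps > 0.8:             # tensed upper arm
        return "tense"
    if abs(rotation_rate) > 1.0:     # large angular velocity ~ rotational gesture
        return "rotate_left" if rotation_rate > 0 else "rotate_right"
    return "none"                    # no gesture detected
```

In practice each reading would be a sliding-window feature computed from the raw signal stream, with per-user calibration of the thresholds.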

The researchers used Conduct-A-Bot with a Parrot Bebop 2 drone, translating user actions like rotational gestures, clenched fists, and tensed arms into drone movement.
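The article names the gestures but not the exact drone commands they trigger, so the translation step might look like the following sketch. The gesture labels, command names, and the mapping itself are all assumptions made for illustration, not the researchers' actual command set.

```python
# Hypothetical mapping from recognized gestures to drone commands.
# Both the labels and the commands are invented for illustration.
GESTURE_TO_COMMAND = {
    "clench": "stop",
    "tense": "move_forward",
    "rotate_left": "turn_left",
    "rotate_right": "turn_right",
}

def command_for(gesture):
    """Look up the drone command for a gesture; default to hovering."""
    return GESTURE_TO_COMMAND.get(gesture, "hover")
```

Defaulting to a safe "hover" command for unrecognized input is a common design choice in teleoperation interfaces, since a misread gesture should not produce an arbitrary motion.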

The drone responded correctly to 82% of roughly 1,500 human gestures while being remotely piloted through hoops, and the system correctly identified about 94% of cued gestures when the drone was not flying.

From MIT News


Abstracts Copyright © 2020 SmithBucklin, Washington, DC, USA

