Researchers at Microsoft, the University of Washington, and the University of Toronto have developed a human-computer interface that uses muscle movement for hands-free, gestural interaction. A band of electrodes attached to the user's forearm reads electrical activity from different arm muscles. Signals are correlated to specific hand gestures, such as touching a finger and thumb together or gripping an object with a certain degree of tightness. The researchers say the technology could be used to scroll through and select songs on an MP3 player or to play a game without a controller.
The project is focusing on bringing muscle interfaces to healthy individuals looking for richer input modalities, says Microsoft researcher Desney Tan. The researchers' most recent interface uses six electromyography sensors and two ground electrodes positioned in a ring around a user's right forearm to sense finger movement, and two sensors on the left forearm to sense hand squeezes. The system's software needs to be trained to associate the electrical signals with different gestures.
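The training step described above can be sketched in code. The following is an illustrative example only, not Microsoft's actual pipeline: it assumes each gesture sample is reduced to one root-mean-square (RMS) amplitude per EMG channel, and uses a simple nearest-centroid classifier to map new samples to the gesture whose training average they most resemble. The channel count, feature choice, and classifier are all assumptions made for illustration.

```python
import math

def rms(window):
    """Root-mean-square amplitude of one channel's signal window."""
    return math.sqrt(sum(x * x for x in window) / len(window))

def features(channels):
    """One RMS feature per EMG channel (the prototype used six)."""
    return [rms(ch) for ch in channels]

def train(labeled_samples):
    """Average the feature vectors for each gesture label (centroids).

    labeled_samples: iterable of (gesture_label, feature_vector) pairs.
    """
    sums, counts = {}, {}
    for label, feats in labeled_samples:
        acc = sums.setdefault(label, [0.0] * len(feats))
        for i, f in enumerate(feats):
            acc[i] += f
        counts[label] = counts.get(label, 0) + 1
    return {label: [v / counts[label] for v in acc]
            for label, acc in sums.items()}

def classify(centroids, feats):
    """Return the gesture whose centroid is closest to the features."""
    def dist(label):
        return sum((a - b) ** 2 for a, b in zip(centroids[label], feats))
    return min(centroids, key=dist)

# Hypothetical training data: a pinch activates the first two channels,
# a squeeze activates the last two.
centroids = train([
    ("pinch",   [0.9, 0.8, 0.1, 0.1, 0.1, 0.1]),
    ("squeeze", [0.1, 0.1, 0.1, 0.1, 0.9, 0.8]),
])
print(classify(centroids, [0.85, 0.75, 0.2, 0.1, 0.1, 0.1]))  # pinch
```

In practice, per-user calibration like this is why the software "needs to be trained": electrode placement and muscle physiology vary, so the centroids (or a richer model) must be fit to each wearer's own signals.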
"Most of today's computer interfaces require the user's complete attention," says Massachusetts Institute of Technology professor Pattie Maes. "We desperately need novel interfaces such as the one developed by the Microsoft team to enable a more seamless integration of digital information and applications into our busy daily lives." The interface was demonstrated at the recent ACM Symposium on User Interface Software and Technology.
From Technology Review
Abstracts Copyright © 2009 Information Inc., Bethesda, Maryland, USA