Future U.S. Air Force drone operators could talk to a drone and receive a verbal response, similar to a Siri-style two-way voice exchange. Moreover, next-generation controls could include smarter, easier-to-interpret computer displays and tactile feedback, similar to vibrating controls such as the Xbox controller's, that shake the drone operator's virtual cockpit if the robot detects incoming enemy fire.
The current interface consists of computer screens, keyboards, and joysticks for steering robots, while input is limited to keystrokes and mouse and joystick movements transmitted via satellite. The Air Force Research Laboratory's (AFRL's) Mike Patzek says new man-machine interfaces could replace this desktop-type environment in the next decade or so.
The progress of the Air Force's research and its funding will determine how the interfaces evolve, but there is no dispute that flying robots will have a key role in U.S. air power in the years to come.
"The fundamental issue is that the [robotic] systems are going to be more capable and have more automation," says AFRL's Mark Draper. "The trick is, how do you keep the human, who is located in a different place, understanding what that system is doing, monitoring it, and intervening when he or she needs to?"
From Wired News
Abstracts Copyright © 2012 Information Inc., Bethesda, Maryland, USA