Researchers at the Munich-based Cognition for Technical Systems excellence cluster have developed a flight simulator for flies that records the insects' neural activity during flight, with the goal of informing the development of robots that can independently perceive and learn from their environment.
Flies have long been known to take in many more images per second than humans, and the small size of the fly's brain demands a simpler, more efficient way of processing images from the eyes into visual perception than the human brain uses. The flight simulator features a hemispherical display on which researchers present various patterns, movements, and sensory stimuli to blowflies held in place by a halter. Restraining the insect allows electrodes to register the reactions of its brain cells.
Researchers at the Technische Universität München are developing intelligent machines that can observe their environment via cameras, learn from what they see, and respond appropriately to the present circumstances. Their long-range goal is to facilitate the creation of machines capable of direct, effective, and safe interaction with people. One aspect of this project is the development of small flying robots whose position and movement in flight will be controlled by a computer performing visual analysis modeled on the fly's brain.
Efficient image analysis is essential if intelligent machines and humans are to interact naturally, and insights drawn from the flight simulator for flies could offer a simple strategy for bridging the technical gap between insects and robots.
From TU München
Abstracts Copyright © 2009 Information Inc., Bethesda, Maryland, USA