The European Union-funded CASBLiP project has developed a portable system to help visually impaired users navigate outdoors. The system analyzes video from portable cameras to calculate the distance of objects and predict the movement of people and cars, building a three-dimensional (3D) sound map of the user's surroundings.
Researchers from the University of Bristol have developed real-time image-processing software and algorithms to identify objects and obstacles. The Bristol system uses stereo images to create a "depth map" for calculating distances, and can analyze moving objects and predict where they will go. The University of Laguna has developed technology that presents spatial information as a 3D acoustic map. By combining the two, the CASBLiP project places sounds so the brain can interpret them as points in space, with each sound growing louder as the corresponding object gets closer.
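The two ideas described above, stereo depth mapping and distance-based loudness, can be sketched briefly. The snippet below is an illustrative simplification, not the project's actual code: it applies the standard pinhole stereo relation Z = f·B/d to turn a disparity map into depths, and a simple inverse-distance rule for loudness; the focal length and baseline values are assumed for the example.

```python
import numpy as np

def disparity_to_depth(disparity_px, focal_length_px, baseline_m):
    """Convert a stereo disparity map to a depth map (meters).

    Standard pinhole stereo relation Z = f * B / d: the larger the
    disparity between the left and right views, the closer the object.
    Zero disparity (points at infinity) maps to +inf.
    """
    d = np.asarray(disparity_px, dtype=float)
    return np.where(d > 0,
                    focal_length_px * baseline_m / np.maximum(d, 1e-9),
                    np.inf)

def gain_for_distance(distance_m, ref_m=1.0):
    """Simple 1/r loudness rule: a sound grows louder as its object
    approaches, clamped to full volume at the reference distance."""
    return ref_m / max(distance_m, ref_m)

# Example with assumed camera parameters: 700 px focal length, 12 cm baseline.
depth = disparity_to_depth([[70.0, 35.0]], 700, 0.12)
# 70 px disparity -> 1.2 m away; 35 px -> 2.4 m away.
```

The depth map then drives the acoustic map: each detected object's depth sets the gain of the sound representing it.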
The CASBLiP system also uses a gyroscopic sensor, developed by the University of Marche, to detect how the user moves his or her head, so the sounds being played stay anchored to the correct positions in space when the user changes direction. The project has produced two prototype devices mounted on a helmet, and has successfully tested the systems in real-world environments, including busy streets. The first design uses an infrared-light laser sensor, mounted inside glasses, to calculate the distance to objects. The second version adds two digital video cameras and can detect moving objects and predict their path.
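The head-tracking compensation described above amounts to subtracting the current head yaw from each sound's world-frame azimuth before rendering it. The sketch below is an assumed illustration of that idea, not the project's implementation: turning the head right makes a sound that was straight ahead appear to shift left, exactly as a real sound source would.

```python
def world_to_head_azimuth(object_azimuth_deg, head_yaw_deg):
    """Compensate for head rotation so a sound stays fixed in space.

    The object's world-frame azimuth is constant; the azimuth rendered
    to the listener is the world azimuth minus the current head yaw
    (e.g. from a gyroscopic sensor), wrapped to the range (-180, 180].
    """
    relative = (object_azimuth_deg - head_yaw_deg) % 360.0
    if relative > 180.0:
        relative -= 360.0
    return relative

# An object straight ahead (0 deg) while the head is turned 30 deg right
# should be rendered 30 deg to the listener's left.
print(world_to_head_azimuth(0.0, 30.0))  # -30.0
```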
The University of Marche and the Francesco Cavazza Institute have also developed a complementary GPS location system that could be used to provide the wearer with verbal directions to a destination.
From ICT Results
Abstracts Copyright © 2009 Information Inc., Bethesda, Maryland, USA