Although the iPhone can superimpose navigation routes or reviews on top of real-time images, augmented reality (AR) still has a long way to go. Such applications are easily confused and prone to error.
AR expert Blair MacIntyre at the Georgia Institute of Technology, while emphasizing AR's strong potential, says that for now "these sensors are astonishingly bad at what people are trying to do with them." The problem, he says, is that mobile-phone AR applications cannot determine their location precisely enough. MacIntyre is unsure whether a global positioning system (GPS) receiver can be built that is both small and accurate enough to solve this problem. Graz University of Technology professor Dieter Schmalstieg says the only solution is "to include computer vision on the phones."
Occipital has created a three-dimensional (3D) map of San Francisco that can be stored on an iPhone. The application uses the phone's internal compass and GPS to position itself on the map, then refines that position by matching real-time camera images against the map. Google and Microsoft are building 3D maps as well, but such maps can never be fully comprehensive and accurate because city landscapes constantly change. Some researchers think that crowdsourcing could be the most accurate option. Online images of a location could be pulled from Web sites such as Flickr and stitched together into a 3D map. The mobile phone could use this composite map, along with a real-time photo of an area, to pinpoint its location. At the same time, the phone could record more photos, so the map becomes more accurate the more frequently it is used.
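The two-stage approach described above can be sketched in code: a coarse GPS fix narrows the search to nearby map keyframes, and matching against features seen in the live camera image then refines the estimate. This is a minimal illustrative sketch, not any vendor's actual implementation; the keyframe data, feature names, and 50-meter search radius are all assumptions.

```python
import math

def coarse_candidates(gps_fix, keyframes, radius_m=50.0):
    """Stage 1: keep only map keyframes near the (noisy) GPS fix."""
    def dist_m(a, b):
        # Equirectangular approximation; adequate at city-block scales.
        lat = math.radians((a[0] + b[0]) / 2)
        dx = (b[1] - a[1]) * 111_320 * math.cos(lat)
        dy = (b[0] - a[0]) * 110_540
        return math.hypot(dx, dy)
    return [k for k in keyframes if dist_m(gps_fix, k["pos"]) <= radius_m]

def refine(live_features, candidates):
    """Stage 2: pick the keyframe whose stored image features best
    overlap what the camera currently sees."""
    if not candidates:
        return None
    return max(candidates, key=lambda k: len(live_features & k["features"]))

# Hypothetical crowdsourced map: keyframes with a location and the
# visual features extracted from contributed photos of that spot.
keyframes = [
    {"pos": (37.7955, -122.3937), "features": {"ferry_tower", "clock"}},
    {"pos": (37.7952, -122.3940), "features": {"palm_row", "archway"}},
    {"pos": (37.8080, -122.4177), "features": {"pier39_sign"}},  # too far away
]

fix = (37.7954, -122.3938)                 # noisy GPS reading
seen = {"clock", "ferry_tower", "crowd"}   # features in the live camera frame

near = coarse_candidates(fix, keyframes)   # GPS prunes to the two close keyframes
best = refine(seen, near)                  # image matching picks the right one
print(best["pos"])
```

A real system would match thousands of local image descriptors (and solve for full camera pose) rather than comparing named feature sets, but the division of labor is the same: cheap sensors bound the problem, computer vision finishes it.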
University of California, Santa Barbara researcher Tobias Hollerer dubs the system "social augmented reality."
From New Scientist
Abstracts Copyright © 2009 Information Inc., Bethesda, Maryland, USA