Communications of the ACM

ACM TechNews

Google Lookout Uses AI to Describe Surroundings for Visually Impaired


At left, a screenshot image of Lookout's modes; at right, a screenshot of Lookout detecting a dog.

A new Google app uses artificial intelligence to help visually impaired users better perceive their surroundings.

Credit: Google

Google has launched an app that uses artificial intelligence (AI) to help visually impaired users better perceive their surroundings by aiming their phone at objects and receiving verbal feedback.

The Lookout app functions much like Google Lens, interpreting what is captured by the device's rear camera and supplying spoken feedback.

Said Google's Patrick Clary, "Lookout detects items in the scene and takes a best guess at what they are, reporting this to you."

Google said Lookout can help users in situations such as learning about a new environment for the first time, reading text and documents, and completing daily routines like cooking, cleaning, and shopping.

The company recommended the phone be positioned on a lanyard around the user's neck, or in the front pocket of a shirt, although it acknowledged Lookout "will not always be 100% perfect."

From ZDNet

Abstracts Copyright © 2019 SmithBucklin, Washington, DC, USA


 

No entries found