Last year, Google announced a new app to help the visually impaired named Lookout. The app uses AI to identify objects through your phone’s camera. It can also read text in signs and labels, scan barcodes, and identify currencies. This week, Google announced that Lookout will finally be available to download — though only for Pixel devices in the US.
Since announcing the app last year, Google says it’s been “testing and improving the quality” of its results. The company cautions that, as with all new technology, Lookout’s results will not always be “100 percent perfect,” but it’s soliciting feedback from early users.
To use Lookout, Google recommends that users wear their Pixel device on a lanyard around their neck or placed in the front pocket of a shirt or coat. That way, the phone’s camera gets an unobstructed view of the world and can identify objects and text “in situations where people might typically have to ask for help.”
It’s not clear when Lookout will be available on hardware other than Google’s own, but the company says it’s hoping to bring the app “to more devices, countries, and platforms soon.”
This isn’t the first time we’ve seen a big tech company apply AI to the task of helping the visually impaired. Microsoft launched an app with very similar functionality named Seeing AI in 2017. And this week, the Redmond company announced an update for Seeing AI that lets users feel the shape of objects on their phone screens using haptic feedback.
Originally published in The Verge by James Vincent on March 13, 2019