Lookout · Google (2020-2022)
As a Senior Interaction Designer on Google's Central Accessibility team and the sole designer on the Lookout app, I led the global launch of Lookout 2.0, an app that uses artificial intelligence and the phone's camera to help blind and low-vision users complete everyday tasks like identifying groceries, documents, and currency.
The redesign drove a 50x increase in weekly active users, 11% higher engagement, and 25% better retention, with 100K+ new installs. It was covered by 70+ media outlets (BBC, TechCrunch, The Verge), praised as “more accessible,” and earned the Google Design Best of 2020 award.
Lookout uses computer vision to assist people with low vision or blindness with daily tasks. Using their phone’s camera, Lookout makes it easier to sort mail, put away groceries, and more.
Designing for the visually impaired in the physical world
People who are blind can find visual concepts like framing challenging, especially those with less visual experience. Instead, they rely on hearing and on a screen reader like TalkBack.
Lookout uses AI and a smartphone camera to identify items such as food by sight, speaking the results aloud so that blind users can navigate grocery stores independently. Built on the same technology as Google Lens, the feature turns visual recognition into actionable audio cues.
Scan document mode captures a snapshot of a document to be read aloud by a screen reader. Lookout will verbally announce what it has scanned. Users can also review previously scanned text and share or delete it.
Using sound design to guide blind users to frame their camera
Screen reader users already live with a verbose audio experience, so I opted to progressively shorten guidance as it repeated (repetition is likely while framing a document). For example, “Too close, move device away” shortens to “Too close, move away”, then simply “Move away”.
I gave users an encouragement tone when they move in the right direction. A success tone and haptic informs the user that the document is in frame.
Lookout guides users on how to move their device to frame their document (“Move device left”, “Too close, move device away”).
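The progressive shortening described above can be sketched as a simple lookup: each guidance cue carries long, medium, and short variants, and repeats select the shorter ones. This is an illustrative sketch only; the names and structure are my own, not Lookout's actual code.

```python
# Illustrative sketch of progressive prompt shortening (not Lookout's real code).
# Each guidance cue maps to variants ordered from most to least verbose.
PROMPTS = {
    "too_close": ["Too close, move device away", "Too close, move away", "Move away"],
}

def guidance(cue: str, repeat_count: int) -> str:
    """Return the spoken prompt, shortening it each time the cue repeats."""
    variants = PROMPTS[cue]
    # Clamp so further repeats keep using the shortest variant.
    return variants[min(repeat_count, len(variants) - 1)]
```

The clamp matters for the design intent: after the first two repetitions, the user keeps hearing the tersest possible cue rather than cycling back to the verbose one.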
“I hope all other blind & partially sighted users find this app as fantastic as I did. In my opinion this is the best innovation since the smartphone was introduced.”
- Play Store review
Helping visually impaired users at the grocery store and in the kitchen
Food labels are designed to be attention-grabbing and distinctive, but not necessarily highly readable or informative. If a sighted person can accidentally buy the wrong kind of peanut butter, what chance does someone who can’t read the label themselves have?
The new Food Label mode focuses on identifying products—not just reading text. If the item isn’t visible enough, the app prompts the user to adjust it. It matches the image to a product database and reads out key details like brand and flavor, with barcode scanning as a backup.
The app guides the user to position the item where the camera can identify it, and then a screen reader can speak the description aloud.
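The Food Label flow can be summarized as a decision cascade: try a visual match against the product database first, fall back to the barcode if one is visible, and otherwise prompt the user to reposition the item. The sketch below is hypothetical; the database and recognizers are stand-ins, not Lookout's real implementation.

```python
# Hypothetical sketch of the Food Label mode's decision flow.
# The product database and "fingerprint"/barcode inputs are illustrative.

class ProductDatabase:
    def __init__(self, by_image, by_barcode):
        self.by_image = by_image      # image fingerprint -> product details
        self.by_barcode = by_barcode  # barcode string -> product details

    def match_image(self, fingerprint):
        return self.by_image.get(fingerprint)

    def lookup_barcode(self, code):
        return self.by_barcode.get(code)

def identify_product(fingerprint, barcode, db):
    """Prefer visual matching; fall back to the barcode; else ask the user to adjust."""
    product = db.match_image(fingerprint)
    if product is None and barcode is not None:
        product = db.lookup_barcode(barcode)
    if product is None:
        # Mirrors the guidance behavior: prompt the user to reposition the item.
        return "Rotate the item so the label faces the camera"
    return f"{product['brand']} {product['name']}, {product['flavor']}"
```

Ordering visual matching before the barcode reflects the design described above: the label face is usually what the user presents first, while the barcode may require hunting around the package.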
“This app has sort of changed my life in terms of how quick and simple it is to use while in the kitchen looking at spices and other ingredients when cooking food for the kids and family.”
- User research