Today at Google I/O 2019, Google announced a number of new additions to Google Lens, its image-recognition software, which is available as an app and is built into the camera of Google Pixel devices. Google is making AR considerably more useful on phones as it brings AR and Lens closer to the future of search. Google Lens is also being integrated into Google Search: point your camera at a wall of text, and Lens will automatically start reading the text out loud.
Google Search is getting a number of new features, but two stand out. First is the ability to use augmented reality (AR) to present search results as information in virtual 3D models. Second is the ability to search for podcasts. Lens, for its part, is gaining translation playback: you can tap a "listen" button and Lens will read the text aloud in the translated language, highlighting each word as it's read so you can follow along.
“We think, with the technologies coming together in augmented reality in particular, there’s this opportunity for Google to be vastly more helpful,” says Clay Bavor, vice president of virtual and augmented reality, about the variety of AR updates Google has coming this year.
Google Lens is now capable of reading text out loud in addition to capturing and translating it. A short video showed how this functionality could help people who can't read or understand signs and computer interfaces. The feature is coming first to Google Go, Google's lightweight search app for entry-level devices, where it will take up just over 100KB of space.
The Lens team has been working with early testers in India and is working to make the technology lightweight enough that it can run on less-robust phones.