It’s been almost half a year since Google unveiled Google Lens, a feature that lets you identify items using your phone’s camera and then get more information about them. Now Google Lens is starting to roll out… to a handful of phones in a handful of markets.
Lens actually made its debut with the launch of the Google Pixel 2 and Pixel 2 XL smartphones, but at the time it was built into Google Photos. The only way to use Google Lens was to first snap a picture, then open it in Google Photos and tap the Lens button.
Now you can use Lens directly from Google Assistant, which means you don’t need to save and open a picture.
Among other things, Google Lens can identify all sorts of objects: landmarks like historic buildings, famous artworks, movie posters, and books. For example, when I used the Google Photos version with a picture of my cat, it brought up a Google image search for other cats. When I snapped a picture of Philadelphia City Hall, it knew what it was looking at. And you can get movie showtimes or reviews by pointing the camera at a poster.
You can also use Google Lens as a barcode or QR code scanner, or you can scan text to save the contact info from a business card or open the URL on a poster, flyer, or menu.
Google Lens in Assistant is rolling out first to Pixel and Pixel 2 phones set to English, and it’ll be available initially in the US, UK, Australia, Canada, India, and Singapore.