INTERNET

Google Lens Is A Super Enhanced Google Goggles — Remember That?

The company's new feature will help you inspect your surroundings through your camera

Google
May 17, 2017 at 4:51 PM ET

Google is bringing back Google Goggles, except this time it uses advanced photo recognition technology and it’s called Google Lens.

CEO Sundar Pichai announced Google Lens at the I/O developer conference on Wednesday and demoed the feature, which will live inside Google Assistant, Google’s answer to Siri and Alexa. Google Lens uses the company’s computer vision and AI technology to help users learn more about their surroundings in real time.

Google Lens is reminiscent of Google Goggles, an app that let you run a Google search by taking photos of objects. You could take a photo of a hot dog and the app would tell you whether it was a hot dog, making it the original “Not Hotdog” from HBO’s “Silicon Valley.” The feature is also similar to the company’s Word Lens, which translates signs in foreign languages as if by magic.

In the demo of Google Lens, a user points their phone camera at a plant and Google identifies its species. Google Lens will also pull up the name, rating, and listing information for restaurants and shops when users point their camera at a storefront. The coolest moment in the demo, though, came when the user snapped a photo of a sticker on a router and instantly connected to the Wi-Fi network.

https://twitter.com/Google/status/864891667723300864

Google Lens will arrive first as a Google Assistant feature, and because the Assistant app is now available on iOS as well as Android, anyone will be able to use the smart camera regardless of their phone.

The feature functions much like other augmented reality apps such as Snapchat, Instagram, and Pokémon Go: you just point, tap, and go. Pichai didn’t say exactly when it will roll out, but the feature will eventually make its way into Google Photos and other Google products.