At its I/O 2017 developer conference, Google announced several major new products and services, including the latest technology it is bringing to Android smartphones, known as “Google Lens”.
Google Lens is an AI-powered technology that uses your smartphone’s camera and deep machine learning to not merely identify an object but also comprehend what it sees and suggest actions based on that identification.
Here are some things you need to know about the new Google Lens.
Purpose Of Google Lens
Google Lens is a super-powered successor to Google Goggles, and is akin to Samsung’s Bixby Vision, which launched on the Samsung Galaxy S8.
It lets users point their smartphone’s camera at something, for example, a particular flower, and ask Google Assistant what it is. They will not only get an answer but also suggestions for acting on the object, such as nearby florists and flower shops in the case of a flower. It will do the same for shoes, dresses, and many other things you point at.
With Google Lens, your smartphone camera won’t just see what you see, but will also understand what you see to help you take action.
— Google (@Google) May 17, 2017
You can also take a picture of the SSID sticker on the back of your router, and your smartphone will connect to the Wi-Fi network automatically, with no further steps required. No more crawling under the cupboard to read the password, typing it into your phone, and then realizing you forgot to turn CAPS on.
Google Lens will also identify clubs, restaurants, bars, and cafes, showing a pop-up window with the address, opening and closing times, and reviews.
Apps That Google Lens Works With
Google Lens will first come to Google Assistant and Google Photos in the coming months, with other apps to follow eventually.
To use Google Lens with Google Assistant, users tap the Google Lens icon and point the camera at an object, for example, showtimes at a cinema or the information board at a gig venue. The viewfinder will then present a multitude of suggestions, like hearing songs by the artist named on the board, getting tickets for the event through Ticketmaster, or adding the event to your calendar.
When Google Lens works with Google Photos, it will be able to recognize buildings and landmarks, showing users directions and opening hours. It will also offer information on popular works of art. It may even put an end to the debate over whether the Mona Lisa is smiling.
When Will Google Lens Arrive?
Google didn’t announce a specific date for its arrival on Android smartphones, but it is reportedly a few months away.
You can watch the entire Google I/O 2017 keynote here: