How to use Google Lens on your iPhone or iPad

Google Lens’s artificial intelligence is reaching more and more Android devices every day, but there is good news for iOS users as well. This article explains how to use the technology on iPhone and iPad.

What is Google Lens?

Google Lens is an image recognition technology that lets you “search for what you see”: point your phone at anything that catches your eye and Google it.

Google’s object recognition app first came out back in March, but initially it ran only on Android. Now Google Lens is available inside the Google iOS app. Previously, iOS users could reach Google Lens functionality only through Google Photos: they had to photograph an object first and then open the shot from the photo library.

Now users can turn on the camera and identify objects directly in the Google app; a dedicated Lens icon appears in the search bar at the top.

Features of the service

Google Lens has the following features:

  • Search for similar products – point the camera at a picture or at any object; the app will process what it sees and offer several possible matches.
  • Translation mode supports offline translations, as do the full Google Translate app and its “Instant” camera. You will be prompted to download language packs so that Google Lens can work without a network connection (an on-device translation sketch follows this list).
  • Google Lens will also translate words into other languages online, pronounce them, or show a word’s meaning. These are minor changes, but certainly enough to make Lens a more useful tool – and to give users another way to kill time.
  • Google Lens already lets iPhone users run image searches and identify a variety of objects – architectural monuments, sculptures, household items – as well as scan receipts and documents and search for products. Now the feature has been made even more useful (see the image-recognition sketch after this list).
  • In the new version of Google Lens, the object-recognition viewfinder takes up only a third of the screen; the rest of the space is reserved for photos and screenshots from the user’s gallery. You can expand the camera window to full screen by swiping down or tapping the “Search” button, but even then a small icon for opening photos remains in the lower-left corner of the interface.
  • Another innovation is that photos taken in Live View mode are now saved to the gallery. Previously, images captured in the app were irretrievably lost – the emphasis was on “live” analysis of whatever was in the frame.
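
Google Lens itself exposes no public API on iOS, but if you want to reproduce this kind of object identification in your own code, the Google Cloud Vision API offers comparable label and web detection. The following is a minimal sketch, not an official Lens interface: the API key and image path are placeholders you must supply yourself, and error handling is kept to the bare minimum.

    import Foundation

    // Minimal sketch: Lens is app-only, so this uses the Google Cloud
    // Vision API, which offers similar label and web detection.
    // The API key and image path below are placeholders.
    let apiKey = "YOUR_API_KEY"
    let imagePath = "photo.jpg"

    do {
        let imageData = try Data(contentsOf: URL(fileURLWithPath: imagePath))

        // One annotate request asking for labels ("what is this object?")
        // and web matches (visually similar images and pages).
        let body: [String: Any] = [
            "requests": [[
                "image": ["content": imageData.base64EncodedString()],
                "features": [
                    ["type": "LABEL_DETECTION", "maxResults": 5],
                    ["type": "WEB_DETECTION", "maxResults": 5]
                ]
            ]]
        ]

        var request = URLRequest(url: URL(string:
            "https://vision.googleapis.com/v1/images:annotate?key=\(apiKey)")!)
        request.httpMethod = "POST"
        request.setValue("application/json", forHTTPHeaderField: "Content-Type")
        request.httpBody = try JSONSerialization.data(withJSONObject: body)

        let done = DispatchSemaphore(value: 0)
        URLSession.shared.dataTask(with: request) { data, _, error in
            defer { done.signal() }
            guard let data = data, error == nil else { return }
            // The JSON reply contains "labelAnnotations" and "webDetection";
            // real code would decode these instead of printing raw JSON.
            print(String(data: data, encoding: .utf8) ?? "")
        }.resume()
        done.wait()
    } catch {
        print("Vision request failed: \(error)")
    }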
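
The offline translation described above can likewise be approximated in your own iOS app with Google’s ML Kit, whose on-device translator downloads language packs much like Lens does. This sketch assumes the GoogleMLKit/Translate CocoaPod is installed and runs inside an app; the language pair and download conditions are illustrative, not a description of how Lens itself works.

    import MLKitTranslate

    // Sketch of on-device translation with ML Kit
    // (assumes the GoogleMLKit/Translate pod). Language pair is illustrative.
    let options = TranslatorOptions(sourceLanguage: .english,
                                    targetLanguage: .spanish)
    let translator = Translator.translator(options: options)

    // Download the language pack once, Wi-Fi only, so later translations
    // run with no network connection - the same idea as Lens's packs.
    let conditions = ModelDownloadConditions(allowsCellularAccess: false,
                                             allowsBackgroundDownloading: true)
    translator.downloadModelIfNeeded(with: conditions) { error in
        guard error == nil else {
            print("Model download failed: \(error!)")
            return
        }
        // Translation now happens entirely on-device.
        translator.translate("Where is the museum?") { translatedText, error in
            if let translatedText = translatedText {
                print(translatedText) // e.g. "¿Dónde está el museo?"
            }
        }
    }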

How to use it on Apple devices

To take advantage of Google Lens, download the Google app for iOS. On first launch, you will be prompted to grant the app access to the camera and agree to its other terms, and you can step through a short tutorial that introduces Google Lens’s functionality.

Admittedly, the app does not yet have a text recognition function, but Google Lens is already useful when searching for similar products.

The app has a simple, intuitive interface: just point the camera at an object to start the analysis. The controls also let you trigger the flash for better illumination or open the photo gallery to analyze existing images.