Google Lens, once a Pixel-only feature, is now part of the Google Photos app (or a standalone Android download). During Google I/O this year, Google announced a number of new features for Google Lens, and you can play with them on both iOS and Android right now - assuming your device supports Lens in its camera app (or, if it doesn't, in the standalone Lens app).
To try out Lens in Google Photos, open a photo and tap the Lens icon - the one that looks like a square with a magnifying glass inside it. Google will scan the image and pop up a card with relevant information based on what it found, which you can then interact with in various ways.
Google Lens is a visual analysis app that adds a layer of interactivity to your phone's camera and camera roll. For example, you can use it to instantly create a new contact in your address book by taking a photo of a business card, or pull up online reviews by snapping a product in a shop.
Originally only available for Pixel 2 (and later added to the original Pixel), the tool is now being rolled out to all Android phones with Google Assistant and Google Photos. Hurrah!
When Lens first launched, it allowed you to do a few fun things: take a picture of a business card to save a person as a new contact in your phone, for example. It could also identify landmarks in your photographs, and even suggest where you might be able to buy that awesome shirt you saw someone wearing. Now, Lens can do even more:
Copy text from the real world
With this feature - one of my favourites - you can snap a picture of something and Google will let you copy the words it discovers and paste them elsewhere on your phone. For instance, if you have a gift card with a crazy-long number that you want to use to buy something online, you can snap a picture of it and then copy and paste the number when checking out at an online retailer.
Just press and hold on the highlighted text in any photo. As elsewhere on your phone, a "Copy" button will appear, which then lets you paste the text anywhere else in Android or iOS.
Get real-time analysis with Anchor Points
With Anchor Points, Google Lens works in real-time: you can hold the camera up to something and have it identified on the spot, rather than snapping a picture and having Google analyse it after the fact.
With the camera open, a dot (AKA an "Anchor Point") will appear on anything Lens is able to identify. Tap a dot, and a card will appear with more information.
Let Google play dress-up
Sometimes, you want Gucci, but you can only afford Payless. Lens' "similar style" feature can show you other items that look like the article of clothing you just shot with your camera. Better yet, it helps you figure out where to buy them.
The feature works for clothing and other objects, such as furniture. No, it (probably) won't tell you the funky IKEA name for that bookshelf or desk.
To find a similar style, just snap a picture of something you're interested in and tap the dot that appears on the item - assuming Lens recognises what it is. A card will then appear showing you where you can buy similar items.