Google Lens has been rolling out for the past few months to both the Photos app and the Assistant, but device availability and feature set have differed between the two. It looks like we're moving toward some semblance of parity though, as the Assistant version of Google Lens is now capable of selecting text, a capability the Photos version has had since last December.
Previously, when you tapped on text in the Lens viewfinder, it would recognize some of the text and give you relevant answers, but it wouldn't let you choose which part of the text to search for. This was something Google Now on Tap (remember that?) and then Screen Search could do years ago, so its absence in Lens was annoying. If you wanted text selection, you had to take a photo first, then open Lens in the Photos app. Now that's no longer the case.
When you tap anywhere in the Assistant's Lens, bits of text get a lighter background behind them. Tap again and you can precisely select the text you want to search for or copy. Here's a .gif of the feature in action.
We first spotted Assistant Lens' text selection when Google announced ARCore 1.0 at the end of February, but it wasn't working when we tested it back then. We've tried it again every few days since and weren't able to replicate it until today, so it seems to have just begun rolling out. Give it a go and let us know if it's finally working for you.