Google has taken the latest step in ongoing efforts to make visual search a thing. As we examined recently – and as validated in last week’s Project Iris news – Google continues to retreat from AR hardware while doubling down on software. Google Lens is propelled by the latter.

Specifically, Google will now integrate Lens deeper into iOS via the Chrome browser. This could have a real impact, considering the meaty center of the Venn diagram that sits between iOS and Chrome users. Incubating Lens in well-traveled places has also been a bit of a theme for Google.

Backing up for context, visual search is when you use your camera as a search input. Rather than type/tap queries into a search box, you point your phone at items to identify or contextualize them. It’s well-suited to subjects like fashion & food, and can carry high shopping intent.

Google’s flavor of visual search is Google Lens, while others include Snap Scan and Pinterest Lens. Google is keen to develop it because, like voice search, it can boost query volume by expanding the surface area of search. It can also help future-proof its core search business.

These motivating factors have pushed Google to offer training wheels for Google Lens to ease users into it. Past moves to expose Google Lens include planting it next to the voice search icon on Google’s homepage on desktop and mobile – not a bad place to incubate any product.

Moments of Discovery

That brings us back to this week’s move to push Google Lens more front & center in well-trafficked places. As noted, it will be integrated deeper into iOS, with Chrome as its vessel. Specifically, you can jump from the browser to the camera to launch visual searches with fewer steps.

To unpack that a bit, a Google Lens icon will be planted in the Chrome address bar. Pressing it opens the camera, where you can take a picture or query an image from your camera roll. In either case, Lens goes to work quickly to identify objects or launch searches for similar items.

Before this move, users could activate Lens within Chrome in a few ways. Its icon sits next to the voice search button on the Google homepage (as opposed to address-bar positioning). Users can also long-press any image on the web to search for related images and topics.

The difference now is that address-bar positioning means Lens goes along with you to every page you visit. This could help it break free from the web to bounce to the camera – and thus the physical world – for moments of discovery in educational, travel, or local commerce contexts.

To be fair, these use cases already existed. But one challenge is getting users from point A to visual searches in more seamless and integrated ways. Because visual search is a new technology that isn’t yet ingrained or widespread, it needs to shed all possible friction.

Revenue Precursor

Going deeper into potential Google Lens use cases, the physical world (as opposed to the web) is where the technology was meant to live. As noted, this includes fashion, food, and local discovery. The latter is all about identifying storefronts and diving deeper into menus & reviews.

That brings in AR’s “space race” and Google’s position in it. The company is primed given visual databases like Google Images and Street View. These are basically training sets to help Lens identify things. That’s right, Google Lens is a form of AI, though it’s rarely identified as such.
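For illustration only – and not a description of how Google Lens is actually built – here is a minimal sketch of what “identifying things” from an image can reduce to: compute an embedding of the query photo, then rank a catalog of reference images by similarity. The color-histogram embedding, the nearest() helper, and the randomly generated “catalog” below are all hypothetical stand-ins; production systems rely on learned features trained on large image corpora like the ones mentioned above.

```python
# Illustrative sketch (not Google's implementation): visual search framed as
# "embed the query image, then find the nearest catalog items".
# Embeddings here are toy color histograms; real systems use learned features.
import numpy as np

def embed(image: np.ndarray, bins: int = 8) -> np.ndarray:
    """Map an HxWx3 RGB image to a normalized per-channel color histogram."""
    hists = [np.histogram(image[..., c], bins=bins, range=(0, 255))[0]
             for c in range(3)]
    vec = np.concatenate(hists).astype(float)
    return vec / (np.linalg.norm(vec) + 1e-9)

def nearest(query_vec: np.ndarray, catalog: dict, k: int = 3):
    """Rank catalog items by cosine similarity to the query embedding."""
    scores = {name: float(query_vec @ vec) for name, vec in catalog.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:k]

# Fake "catalog" of product images and a query photo (random pixels stand in
# for real photos purely so the sketch runs end to end).
rng = np.random.default_rng(0)
catalog_images = {f"item_{i}": rng.integers(0, 256, (64, 64, 3)) for i in range(5)}
catalog = {name: embed(img) for name, img in catalog_images.items()}
query = rng.integers(0, 256, (64, 64, 3))

print(nearest(embed(query), catalog))
```

The point of the toy example is the shape of the pipeline, not the specifics: the richer the reference imagery a provider can match against, the better such a system performs, which is why Google’s visual databases matter.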

Google is also motivated. Visual search, along with voice search, is one way to boost query volume, which is a key revenue precursor for Google. We’re also talking high-intent queries, given that proximity – in this case, being within view of a subject – correlates with intent.

Of course, that all starts with user demand and traction. Though visual search checks all the boxes for killer apps (utility, frequency, etc.), it hasn’t caught on as quickly as we expected. One beacon of hope for Google is the camera-native and increasingly buying-empowered Gen Z.

Meanwhile, Google Lens is no slouch, with 10 billion monthly searches. But if it gradually moves towards becoming a more ubiquitous technology on the back of Gen-Z affinities, it could be a strong revenue generator that fits right into the broader search machine.