As you may have heard, Apple recently launched 3D urban navigation, its answer to Google's Live View. Not to be outdone, Google fired back with an update of its own: better landmark detection in Google Lens. This is just the latest move in its ongoing “Internet of Places” ambitions.

But buried in that announcement was a notable nugget: Google Lens is used 3 billion times per month. Naturally, our ears perked up in the eternal quest for AR usage data. Though Google has revealed the volume of objects that Lens recognizes, usage figures have been scarcer.

“Google Lens is now used over three billion times per month by people around the world,” wrote Google Lens product management director Lou Wang, “and with many ready to explore this summer and rediscover their cities, we’re officially launching the new Places Filter in Lens.” 


Arms Race

Going deeper on the Places Filter, it recognizes and identifies buildings you point your phone at. Like Live View navigation, it uses Google Earth and Street View imagery as an object-recognition database; Lens then matches what the camera sees against that imagery using computer vision.
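To make that concrete, here is a deliberately toy sketch (in Python) of the general idea: matching a camera frame against a small, geo-indexed imagery database. Everything in it, from the place names to the embedding function, is hypothetical and greatly simplified; it is not Google's actual pipeline.

```python
import numpy as np

# Toy sketch: match a camera frame's embedding against a small database of
# known places (stand-ins for Street View-derived imagery). All names and
# vectors here are made up for illustration.
PLACE_DB = {
    "Blue Bottle Coffee": np.array([0.9, 0.1, 0.3]),
    "City Hall":          np.array([0.2, 0.8, 0.5]),
    "Corner Bookstore":   np.array([0.4, 0.4, 0.9]),
}

def embed_frame(frame_pixels: np.ndarray) -> np.ndarray:
    """Stand-in for a real vision model that turns pixels into an embedding."""
    # A production system would run a neural network; we just average channels.
    return frame_pixels.mean(axis=(0, 1))

def recognize_place(frame_pixels: np.ndarray) -> str:
    """Return the database entry most similar (cosine) to the camera frame."""
    query = embed_frame(frame_pixels)

    def cosine(a: np.ndarray, b: np.ndarray) -> float:
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

    return max(PLACE_DB, key=lambda name: cosine(query, PLACE_DB[name]))

# Example: a fake 2x2 "camera frame" with 3 channels.
frame = np.array([[[0.9, 0.1, 0.3], [0.8, 0.2, 0.4]],
                  [[1.0, 0.0, 0.2], [0.9, 0.1, 0.3]]])
print(recognize_place(frame))  # -> "Blue Bottle Coffee"
```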

Back to the arms race with Apple: it will continue to ratchet up as both players invest in geo-local AR product development. In addition to navigation, Apple unveiled new visual search functionality at WWDC, including the ability to recognize flowers, animals, and text.

Apple's visual search moves are early, though still potentially threatening to Google (it is Apple, after all). Meanwhile, Google's competition on this front has so far come from two places: Pinterest and Snap, each of which is building more focused visual search use cases.

“Focused” is the key word, as the competitive field for visual search (much like web search) will be defined by focal range. Google will work towards “all the world’s information,” while Snap and Pinterest (and potentially Apple) will zero in on things like shopping and “outfit inspiration.”

The latter is where the opportunity lies. As noted, most of the above players are starting with pets and flowers, but the ultimate destination is monetizable visual searches, including fashion and food. Indeed, visual search will be a key format in the broader field of “camera commerce.”


Buy Local

As the new Places Filter signals, Google will also seek monetizable visual search in local discovery, such as finding out about a restaurant by pointing your phone at it. This is an area Snap is also angling towards, given Local Lenses and its recent Pixel8.earth acquisition.

But Google could be the king of visually oriented local discovery, just as it's the king of desktop and mobile local search and mapping. Given years of developing those products, it has valuable data assets such as Street View imagery to localize devices and recognize storefronts.

Back to Apple: it should be noted that it's likewise working towards building those very assets. That starts with its Apple Maps revamp, which heavily involves capturing 3D spatial maps. As we predicted, that leads it right to 3D local navigation and other geospatial AR use cases.

But of all these players, our money is on Google, given the depth of its first-party geo-location data. It also has the greatest financial motivation: it built tremendous value indexing the web, and now it sees the opportunity to do the same for the physical world.

That’s where “Internet of Places” comes from, and Google is highly motivated to make it happen. It won't be a winner-take-all market, as the above efforts suggest. But visual search's inclusion of the word “search” in some ways signals Google's positioning and potential dominance.
