Visual search continues to be a wild card as a potential AR killer app. For those unfamiliar, it uses computer vision and machine learning to let users point their phones at physical-world objects (everything from furniture to footwear) to identify and contextualize them.

Google leads the field with its Lens feature. It continues to place Lens in high-traffic spots like the search bar in its mobile app, seeing it as a way to boost query volume through additional input modes. It also future-proofs Google's core search business for the camera-native Gen Z.

But Google isn’t alone. Visual search challengers include Snap Scan, Pinterest Lens, and even Etsy. Each of these players sees an intersection between visual search and their core products. For Snap, it’s socially fueled AR and brand marketing. For Pinterest, it’s product discovery.

Now another player has joined the mix: Klarna. The buy-now-pay-later (BNPL) company's new feature lets users snap a picture of a physical-world object to identify it and find where to buy it. This advances both Klarna's business and the broader visual search sector.

Shopping Flow

Starting with the former, Klarna’s new Shopping Lens feature identifies 10 million+ products across eCommerce categories like clothing and electronics. It can also match those items to listings across 50 million eCommerce destinations, letting users compare prices and reviews.

A related use case in Klarna’s Shopping Lens is scanning barcodes in physical stores to prompt searches that follow a path similar to the shopping flow above: price comparisons across Klarna’s network of online retailers and the ability to complete a purchase.
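
To make that flow concrete, here’s a minimal sketch of a barcode-driven lookup. It assumes the open-source pyzbar and requests libraries, and the price-comparison endpoint is hypothetical; Klarna hasn’t published its Shopping Lens internals.

```python
# Minimal sketch of a barcode-to-price-comparison flow.
# Assumes `pip install pyzbar pillow requests`. The endpoint below is
# hypothetical and stands in for a retailer-network lookup service.
import requests
from PIL import Image
from pyzbar.pyzbar import decode

PRICE_API = "https://example.com/price-comparison"  # hypothetical endpoint


def lookup_product(photo_path: str) -> list[dict]:
    """Decode any barcodes in a shelf photo and fetch competing offers."""
    barcodes = decode(Image.open(photo_path))
    offers = []
    for barcode in barcodes:
        gtin = barcode.data.decode("utf-8")  # e.g. an EAN/UPC code
        resp = requests.get(PRICE_API, params={"gtin": gtin}, timeout=10)
        resp.raise_for_status()
        # Assume the service returns a list of
        # {"retailer": ..., "price": ..., "url": ...} offers.
        offers.extend(resp.json())
    return offers


if __name__ == "__main__":
    for offer in lookup_product("shelf_photo.jpg"):
        print(offer["retailer"], offer["price"], offer["url"])
```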

This shopping-oriented use case makes Klarna’s visual search play decidedly actionable and focused. That contrasts with Google’s approach, which, true to form, is more broadly about “all the world’s info.” And Klarna is well positioned to pull this off, given its place in the eCommerce stack.

Importantly, that positioning endows Klarna with product metadata for AI training and visual recognition. Its checkout integrations with eCommerce storefronts have also enabled it to develop its own shopping front end to drive consumer purchases.

Follow the Money

Moving on to the broader visual search sector, Klarna’s entry raises a key question: given visual search’s technical complexity, how will new players get in? As noted, Klarna has the ingredients for AI training to power visual search’s core object recognition. But what about everyone else?

The answer could be APIs. And that’s where Google reenters the picture. Just as it powers site search for websites (creating ad inventory in the process), a prospective Google Lens API could empower anyone looking to integrate visual search into their sites or apps.
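
For a sense of what API-mediated visual search already looks like, here’s a hedged sketch using Google’s existing Cloud Vision web detection feature as a stand-in; a dedicated Lens API remains speculative. It assumes the google-cloud-vision Python client and configured application credentials.

```python
# Sketch of API-based visual search using Google's existing Cloud Vision
# web detection feature -- a stand-in for the speculative Lens API discussed
# above. Assumes `pip install google-cloud-vision` and valid credentials.
from google.cloud import vision


def identify_object(photo_path: str) -> None:
    """Send a photo to Cloud Vision and print best-guess web entities."""
    client = vision.ImageAnnotatorClient()
    with open(photo_path, "rb") as f:
        image = vision.Image(content=f.read())

    response = client.web_detection(image=image)
    detection = response.web_detection

    # Web entities are Google's guesses at what the pictured object is.
    for entity in detection.web_entities:
        print(f"{entity.description} (score: {entity.score:.2f})")

    # Pages where visually similar images appear -- the raw material a
    # shopping-oriented integration could turn into product listings.
    for page in detection.pages_with_matching_images:
        print(page.url)


if __name__ == "__main__":
    identify_object("sneaker.jpg")
```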

To that end, we predict Google will release a Lens API at some point. There’s precedent in Google’s interest in APIs for AR functionality, such as its geospatial API for visual navigation. But ultimately, its incentive (follow the money) is to create a sort of AdSense for visual search.

This would lower visual-search barriers for consumer apps while giving Google what it wants: more surface area for monetizable search. Another beneficiary would be visual search itself: fueled by an API, the technology could see accelerated consumer exposure and, ultimately, adoption.
