
Google’s New Multisearch Tool Is for Window-Shopping the Real World

You can try out the latest beta feature in the Google app on iOS and Android.

The new multisearch in Google Lens lets you combine images and text to search for things.
Photo: Florence Ion / Gizmodo

Google’s latest mobile search feature is aimed at window shoppers and friends who browse each other’s closets. The new multisearch ability in Lens lets you perform a visual search with a bit of textual aid to help steer the search engine. It’s not perfect every time, but it can help determine where your friend got her patterned skirt or whether that Pikachu pillow you saw in the window is for sale anywhere online.

Google says you can also use the multisearch feature to search for matching furniture in your home, which could prove helpful if you’re the kind of person who decorates piece-by-piece.

The new multisearch feature works with furniture, too.
Gif: Florence Ion / Gizmodo

The new Lens ability is a beta feature available to anyone with the Google search app for iOS or Android. Here’s how it works: open the app, then tap the camera search icon. If you’re on Android and you use a camera app with an integrated Google Lens shortcut, you can access the feature from there, too.


Snap a photo of what you’re looking at with Google Lens, make sure the option is set to Search, then swipe up on the results page. You’ll see all of Google’s visual matches, along with an option to Add to your search at the very top. Tap that, then enter a color, brand name, or anything else distinctive enough to narrow down the results.

If you’re shopping for something you see and know the manufacturer or brand, you can enter its name to surface an official storefront. Or, if you’re hoping to find it on Amazon, add that as your text, and Google will try to return links to sellers.


I tried the new feature with some of the collector’s items and toys I have strewn around my home office. I took a photo of my Aggretsuko Squishme, then typed in “squishies” as the text aid, and Google returned related listings on Mercari and eBay. Of course, there were plenty of useless links thrown in there, too.

Google said multisearch works with both screenshots and live photos, and that it works best when the refining text is a color, a brand, or a visual attribute, such as “modern” or “bohemian,” or, in the case of a pair of shoes, “boots” or “flats.” Google also said it’s exploring ways the new multisearch feature might be enhanced by MUM, the AI model it introduced at its developer conference last year.