You can now augment Google Lens photo searches with text


Google is looking to improve its search results by combining the power of photos with additional text for context. The new experience, called multisearch, will be available on phones and tablets as part of Google Lens inside the Google app.

Google says the feature combines visual and word searches together to deliver the best results possible, even when you can’t describe exactly what it is that you’re trying to search for.

[Image: Five screenshots that show how to search in Google using multiple elements.]

“At Google, we’re always dreaming up new ways to help you uncover the information you’re looking for — no matter how tricky it might be to express what you need,” Google explained of the new multisearch feature. “That’s why today, we’re introducing an entirely new way to search: Using text and images at the same time. With multisearch in Lens, you can go beyond the search box and ask questions about what you see.”


A practical example of where multisearch will be useful is online shopping. Fashionistas may like a particular style of dress but not know what that style is called. And rather than being limited to the specific colors a catalog happens to stock, you can snap a picture of the dress and search for it in green or orange. Google will even suggest similar alternatives in the colors you want.


In this sense, multisearch extends the Google Lens experience: it not only identifies what you see, but lets you add search text, like the color green, to make your search more meaningful.

“All this is made possible by our latest advancements in artificial intelligence, which is making it easier to understand the world around you in more natural and intuitive ways,” Google said of the technology powering multisearch. “We’re also exploring ways in which this feature might be enhanced by MUM — our latest AI model in Search — to improve results for all the questions you could imagine asking.”

To begin your multisearch experience, you’ll need the Google app, a free download on iOS and Android devices. After you install the app, launch it, tap the Lens icon, which resembles a camera, and snap a picture or upload one from your camera roll to begin your search. Next, swipe up and tap the plus (+) icon to add text to your search.

Some ways to use the new multisearch tool include snapping a picture of your dining set and adding the term “coffee table” to your search to find matching tables online, or capturing an image of your rosemary plant and adding the term “care instructions” to find out how to plant and care for rosemary, Google said.

Multisearch is available now as a beta experience within the Google app. Be sure to keep your Google app updated for the best results.

Chuong Nguyen
Silicon Valley-based technology reporter and Giants baseball fan who splits his time between Northern California and Southern…