
Here’s what Google Lens’ Style Match, Smart Text Selection features look like


Like a pair of sneakers someone’s wearing? Or maybe a dress? There are quite a few apps and services, like Amazon’s Firefly or Samsung’s Bixby Vision, that let you point your smartphone camera at an object and search for it or for similar styles. Google is adding a similar feature to Google Lens, and it has the potential to reach far more people.


Google Lens is currently built into Google Assistant on Android phones, as well as Google Photos. It lets you point the smartphone camera at objects to identify them, learn more about landmarks, recognize QR codes, pull contact information from business cards, and more. At its annual Google I/O developer conference, the search giant announced four new improvements to Lens, and we got to try them out.

Built into camera apps


Google Lens is now built into the camera app on phones from 10 manufacturers: LG, Motorola, Xiaomi, Sony, Nokia, Transsion, TCL, OnePlus, BQ, and Asus. That’s on top of Google’s own Pixel 2. You can still access it through Google Assistant on all Android phones.

We got a chance to try it out on the recently announced LG G7 ThinQ, and the new option sits right next to the phone’s Portrait Mode.

Style Match


The biggest addition to Lens in this I/O announcement is Style Match. As with Bixby Vision or Amazon Firefly, you point the smartphone camera at an object to find similar items. We pointed it at a few dresses and shoes and were able to find similar-looking items, if not the exact same ones. Once you find what you’re looking for, you can purchase it directly through Google Shopping, if it’s available there.

It’s relatively quick, and it’s an easy way to find things you can’t quite put into words in the Google Search bar.

Smart Text Selection

Perhaps even more useful is Smart Text Selection. Point Google Lens at text, say from a book or a menu, and it can single out the text from everything else. You can then tap the text to copy it or translate it. When we tried it, Lens managed to grab three entire paragraphs of text, though we’d have to do more testing to see how well it picks up handwritten text.

Real time

Google Lens now works in real time, so you don’t need to pause and take a photo for it to understand the subject. That means you can point it at several things and watch it place colored dots on the objects it’s pulling information from. Google said it can identify billions of words, phrases, and things in a split second, all thanks to “state-of-the-art machine learning, on-device intelligence, and cloud TPUs.”

Google said it will be rolling out all of these features toward the end of May.
