
Google Lens adds unprecedented intelligence to your smartphone camera


Want to know the name of that flower or bird you encounter during your stroll through a park? Soon, Google Assistant will be able to tell you, using the camera and artificial intelligence.

Google kicked off its 2017 I/O conference with a focus on AI and machine learning, and one computer vision technology it highlighted is Google Lens, which lets the camera do more than just capture an image: it gives greater context about what you're seeing.


Coming to Google Assistant and Google Photos, the Google Lens technology can “understand what you’re looking at and help you take action,” Google CEO Sundar Pichai said during the keynote. For example, if you point the camera at a concert venue marquee, Google Assistant can tell you more about the performer, as well as play music, help you buy tickets to the show, and add it to your calendar, all within a single app.

When the camera is pointed at an unfamiliar object, Google Assistant can use image recognition to tell you what it is. Point it at a shop sign, and, using location info, it can give you meaningful information about the business. All of this happens through the "conversational" voice interaction the user has with Assistant.

“You can point your phone at it and we can automatically do the hard work for you,” Pichai said.

With Google Lens, your smartphone camera won’t just see what you see, but will also understand what you see to help you take action. #io17 pic.twitter.com/viOmWFjqk1

— Google (@Google) May 17, 2017

If you use Google's Translate app, you have already seen how the technology works: point the camera at some text and the app translates it into a language you understand. In Google Assistant, Google Lens will take this further. In a demonstration, Google showed that Google Assistant will not only translate foreign text, but also display images of what the text describes, to give more information.

Scott Huffman, Google VP of engineering for Assistant, demonstrated how Google Lens within Google Assistant can translate the Japanese text in an image and also give further context about what the word means.

Image recognition technology isn't new, but Google Lens shows how advanced machine learning is becoming. Pichai said that as with its work on speech, Google is seeing great improvements in vision. The computer vision technology not only helps recognize what something is, but can even help repair or enhance an image. Took a blurry photo of the Eiffel Tower? Because the computer recognizes the object and knows what it's supposed to look like, it can automatically enhance that image based on what it already knows.

“We can understand the attributes behind a photo,” Pichai said. “Our computer vision systems now are even better than humans at image recognition.”

No longer will you need to write down what's in your vacation photos. Anil Sabharwal, Google VP for Google Photos, showed how Google Lens can recognize objects in a photo and bring up relevant information about them.

To make Lens effective at its job, Google is employing a sophisticated computational architecture built on its Cloud Tensor Processing Unit (TPU) chips to handle training and inference for its machine learning. A second-generation TPU board can handle 180 trillion floating-point operations per second, and 64 TPU boards in one supercomputer can handle 11.5 petaflops. With this computing power, the new TPU can handle both training and inference simultaneously, which wasn't possible in the past (the previous TPU could only handle inference work, not the more complex training). Machine learning takes time, but this hardware will help accelerate the effort.
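The pod-level figure follows directly from the per-board number Google quoted. A quick back-of-the-envelope check, using only the values cited in the keynote (these are Google's stated figures, not independent measurements):

```python
# Sanity check of the TPU figures quoted in the keynote.
TFLOPS_PER_TPU_BOARD = 180   # second-generation TPU board, in teraflops
BOARDS_PER_SUPERCOMPUTER = 64

total_tflops = TFLOPS_PER_TPU_BOARD * BOARDS_PER_SUPERCOMPUTER
total_pflops = total_tflops / 1000   # 1 petaflop = 1,000 teraflops

print(f"{total_pflops:.2f} petaflops")  # prints "11.52 petaflops"
```

So 64 boards at 180 teraflops each works out to 11.52 petaflops, matching the roughly 11.5 petaflops Google cited.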

Google Lens will also power the next update of Google Photos. Image recognition is already used in Photos to recognize faces, places, and things to help with organization and search. With Google Lens, Google Photos can give you greater information about the things in your photos, like the name and description of a building. Tapping on a phone number in a photo will place a call, tapping on an artwork you saw in a museum will pull up more info about it, and Lens can even enter a Wi-Fi password automatically from a photo you took of the back of a Wi-Fi router.

Hate entering Wi-Fi network passwords? Snap a photo of the wireless settings, and Google Lens, working through Google Photos, can automatically enter the password for you.

Assistant and Photos will be the first apps to use Google Lens, but the technology will roll out to other apps as well. And with the announcement of Google Assistant for iOS, iPhone users will be able to use Google Lens, too.

Les Shu
Former Senior Editor, Photography