Google Lens’ new dining and translation features are now rolling out

Google Lens is getting an upgrade. At Google I/O 2019, Google announced a number of new features for Lens, including the ability to use it within Search. Now, some of those additions are finally rolling out.

Not all of the new features are rolling out right now, but some of the more substantial ones are. For starters, Google Lens will be able to automatically highlight which dishes are popular on a restaurant menu. All you have to do is point the Google Lens camera at the menu itself, and popular dishes will be highlighted on your phone’s screen. You’ll also be able to see reviews of dishes, photos of them, and more. Not only that, but Lens can help split the bill — point your camera at the receipt, and Lens will calculate the amounts each person has to pay.

Google Translate is being built into Lens, too. You’ll be able to point your camera at text in a foreign language, and a translated version will appear overlaid on the sign, menu, or whatever else you’re looking at.

Other features are also available. Notably, Lens now has a text-recognition feature: point it at printed text, and you can then copy and paste that text into other apps and services.

Last but not least, Google Lens is making it a little easier to buy products you see in the real world. Point the camera at clothing or furniture, and Lens will show you similar items available online. If you can find the product’s barcode, you’ll be able to see that exact product and where it might be available for purchase, which is a nice touch.

The new features are rolling out to Lens users now and are expected to be available to all Lens users on both Android and iOS later this week. Lens can be found in Google Assistant and Google Photos on Android and in the Google and Google Photos apps on iOS. The feature is also available in the camera app on many Pixel phones.

Christian de Looper
Christian de Looper is a long-time freelance writer who has covered every facet of the consumer tech and electric vehicle…