Seeing food in VR games? This sensor will put the real taste in your mouth

A person’s lips holding the e-Taste sensor kit.
Ohio State University / Science Advances

Over the past few years, researchers have tried to make virtual reality (VR) experiences even more immersive and personal. Back in 2022, for example, experts at Stockholm University created a machine called an olfactometer, which let users smell what they were seeing in a game while wearing a VR headset.

But smell is only half the picture. What if you could actually deliver a real taste to the tongue? A team from Ohio State University has created a sensing system called e-Taste that can replicate the taste of real food items and drinks, and deliver it straight to the tongue of another person hundreds of miles away.


The e-Taste system consists of two components — a taster and a receiver. The taster contains a special sensor patch that can detect the fundamental molecules responsible for the five basic tastes: bitter, salty, sour, sweet, and umami. As part of their research, the team focused on detecting glucose and glutamate with the sensor patch.
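To make the pairing concrete, here's a minimal Python sketch of how the five basic tastes map to representative molecules. Glucose (sweet) and glutamate (umami) are the targets named in the study; the other pairings are common textbook examples added here for context, not details from the paper.

```python
# Illustrative mapping of the five basic tastes to representative molecules.
# Only glucose and glutamate are named as sensor targets in the study.
TASTE_MOLECULES = {
    "sweet": "glucose",
    "umami": "glutamate",
    "salty": "sodium chloride",  # textbook example, not from the paper
    "sour": "citric acid",       # textbook example, not from the paper
    "bitter": "quinine",         # textbook example, not from the paper
}

# The chemicals the sensor patch was tuned to detect in this research.
SENSED = {"glucose", "glutamate"}

targets = sorted(t for t, m in TASTE_MOLECULES.items() if m in SENSED)
print(targets)  # ['sweet', 'umami']
```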

How it works

Person wearing the e-Taste sensor kit.
Ohio State University / Science Advances

During the tests, a participant in California dipped the taster patch into a glass of lemonade. The sensor patch detected the concentration of the target chemicals in the beverage and transmitted that electrochemical data to the receiver kit sitting in an Ohio lab.

The receiver part includes a patch that rests on the tongue and a pump connected to a liquid channel of solutions. When stimulated electrically, the liquid passes through a gel-based system onto a person’s tongue, providing a realistic sense of tasting food.

The specific taste and its intensity can be adjusted by changing the volume of fluid discharged by the pump. During the human trials, participants wearing the receiver kit were able to identify different levels of sourness with 70% accuracy.
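The sense-transmit-actuate loop described above can be sketched in a few lines of Python. All function names, units, and scaling constants below are hypothetical; the article doesn't detail how the actual system converts electrochemical readings into pump output.

```python
# Hypothetical sketch of the e-Taste pipeline: sense a chemical concentration,
# transmit it, and convert it to a pump discharge volume on the receiver side.
# Names, units, and constants are illustrative assumptions, not the real spec.

def encode_reading(chemical: str, concentration_mm: float) -> dict:
    """Package a sensed concentration (millimolar) for transmission."""
    return {"chemical": chemical, "concentration_mm": concentration_mm}

def pump_volume_ul(reading: dict, max_volume_ul: float = 50.0,
                   max_concentration_mm: float = 100.0) -> float:
    """Map concentration to discharge volume: a stronger reading at the
    source means more taste solution released onto the tongue."""
    fraction = min(reading["concentration_mm"] / max_concentration_mm, 1.0)
    return round(fraction * max_volume_ul, 1)

# A lemonade sample read at 40 mM glucose yields a proportional 20 uL dose.
reading = encode_reading("glucose", 40.0)
print(pump_volume_ul(reading))  # 20.0
```

Varying the discharged volume is how the receiver modulates intensity, which is why participants could distinguish different levels of sourness.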

Beyond helping to build a better and more dynamic gaming experience, the study notes that the work could promote accessibility and inclusivity in virtual spaces for individuals with disabilities, such as those with traumatic brain injuries or long Covid, which brought gustatory loss to mainstream attention.

Schematic representation of the e-Taste sensor kit.
Ohio State University / Science Advances

It could also help with identifying and tasting potential food sources in harsh conditions, online shopping, remote education, quality monitoring for freshness and consistency by robotic machines, and remote analysis of taste perception by doctors.

The team tested e-Taste over two network protocols. The short-range implementation covered a distance of 200 meters, while the long-range design relied on an internet connection with no limit on the distance involved.

The possibilities for VR, and beyond

“This concept is here and it is a good first step to becoming a small part of the metaverse,” says Jinghua Li, co-author of the paper and faculty member at the institution.

This isn’t the first attempt of its kind to augment the VR experience with a sense of smell or taste. Two years ago, OVR revealed the ION 3 wearable kit for XR hardware that can produce hundreds of scents using a system of cartridges.

The folks over at City University of Hong Kong and Beihang University in China developed a patch-based wearable sensor that relies on miniaturized odor generators using perfumed wax.

A person testing the multi-taste sensor kit.
Ohio State University / Science Advances

The VR kit can recognize visuals and produce the corresponding scent in roughly two seconds. But when it comes to the perception of taste, the sense of smell (or olfaction) plays an equally important role.

The industry has already figured out how to deliver smell as part of AR and VR experiences. The e-Taste system demonstrates that remotely triggering realistic taste on the tongue is also possible.

In addition to lemonade, the team also tested the human participants with food-grade chemicals that represent the taste of cake, fried egg, coffee, and fish soup. This mixed-taste analysis was conducted using a multi-channel version of the e-Taste system called the digital cup.

As far as latency goes, the short-range format measured 0.3 seconds, while the figure for long-range information transfer stood at 1.4 seconds. The sensor response time, on the other hand, was roughly ten seconds.
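Putting the reported figures together shows that the sensor's chemistry, not the network, dominates the end-to-end delay. A quick back-of-the-envelope calculation (assuming the two stages simply add up, which the article doesn't state explicitly):

```python
# Rough end-to-end timing from the figures reported above.
# Treating the stages as purely additive is an assumption for illustration.
SENSOR_RESPONSE_S = 10.0  # approximate sensor patch response time
TRANSFER_S = {"short_range": 0.3, "long_range": 1.4}

def end_to_end_latency(mode: str) -> float:
    """Total delay from dipping the patch to taste delivery."""
    return SENSOR_RESPONSE_S + TRANSFER_S[mode]

print(end_to_end_latency("short_range"))  # 10.3
print(end_to_end_latency("long_range"))   # 11.4
```

Either way, shaving seconds off the sensor's response would do far more for real-time use than a faster network link.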

The road to a realistic metaverse

“The gustatory interface will pave the way for a new era of AR/VR systems with chemical components by allowing users not only to visualize and hear virtual environments but also to taste them,” says the research paper.

The team is now focused on miniaturizing the e-Taste sensing kit. They are also experimenting with a non-gel solution for delivering the taste chemicals. One of those ideas involves using separate pouches of water and taste fluid, and varying the concentration accordingly.

The e-Taste sensor kit inside the mouth cavity.
Ohio State University / Science Advances

Water solves another crucial problem: residual chemicals in the channel connected to the tongue. After each session, a water flow would internally clean the pipe and reduce chances of any taste contamination for future sessions.

Lessons from the e-Taste system could be used to develop VR gaming systems that give users an immersive sensation combining the real taste and smell of what they’re seeing in a virtual world.

For now, what we have is an experimentally validated system showing that it’s possible to fold taste into virtual experiences. What remains is miniaturizing the whole setup and standardizing the electrochemical data representing various food items and beverages.

Beyond the domain of VR, and moving on to your humble computing station, there’s another solution about to hit the market. Asus recently introduced a mouse that comes with a refillable pouch for aromatic oils. It can diffuse pleasant scents into the air, adding some zen to the work environment.

Nadeem Sarwar
Nadeem is the Managing Editor at Digital Trends.