
Tongue-tracking in VR has arrived

Meta Quest Pro face-tracking demo shows a green-skinned flower person with surprised expression.
Meta

A new face-tracking extension lets Meta Quest Pro users poke their tongues out at you in a wordless statement of their feelings or even blow a raspberry.

What better way to express a sweltering day or an exhausting workout? This could be the technology that revolutionizes VR in 2024. Am I being too tongue-in-cheek?


Honestly, it’s a nice option to add to the Quest Pro’s current hand-, eye-, and face-tracking. Meta’s most expensive VR headset allows quite a range of expression already. I can gesture with my hands, blink, wink, move my eyes, and open and close my mouth in various positions. You can’t do that with the new Quest 3.

Tongues were wagging when @0GUzzuqPjUVpY57 broke the news on X (formerly Twitter), providing a video demonstration that shows two animated avatars alongside a Quest Pro user controlling the action.

Released new #VRCFT modules for new #QuestPro v2 facial tracking with support for tongue.

Supports (Air)Link and VD via VDX (when VD adds support) using the ALXR Local module and ALXR clients with the ALXR Remote module

You can download from here https://t.co/F37U2zA8is #VRChat pic.twitter.com/sUmfLaXVpN

— コレヂャン (@0GUzzuqPjUVpY57) December 14, 2023

Since the Quest Pro already includes face-tracking, it seems like Meta could have included tongue-tracking as well, but perhaps the company felt it would be controversial. HTC makes a Face Tracker accessory that synchronizes mouth and tongue movements with an avatar, and it’s compatible with many VR headsets.

Of course, you shouldn’t add more weight and extra expense if you can avoid it. With the new ALXR module from korejan on GitHub, you can emote with your tongue in compatible PC VR apps and games like VRChat.

A developer shows off the Quest Pro tongue-tracking extension. korejan

The extension reportedly works with Meta’s AirLink and Virtual Desktop via an app called VDX. You can install the tongue-tracking extension as a local module for ALXR or as a remote module if you use ALVR streaming.

If you’re unfamiliar with VRChat and ALXR, you’ll need to do a bit of reading to understand how to use the tongue-tracking extension. In the tweet thread, there’s a link to Reddit for more details about how to get started.

Note that installing this extension won’t add tongues to your Meta avatar and can’t help with the Quest version of VRChat. Also, few games and apps support face-tracking. It’s probably only worth the effort if you are already enjoying face- or body-tracking in VRChat or another social VR app on your PC.

Alan Truly