
Google developing smart contact lens that monitors blood sugar levels


In a post on the official Google Blog, a team of developers within the Google X unit detailed a next-generation contact lens that can measure glucose levels. The group is working on a prototype that can provide a reading once per second, technology that could be incredibly useful for anyone managing type 1 or type 2 diabetes. Up to this point, people with diabetes have had to prick a finger to test blood sugar throughout the day or wear a glucose monitor embedded under the skin to continuously track spikes in blood sugar.

To create the smart lens, the development team embedded a glucose sensor between two layers of contact lens material, along with a tiny wireless chip that transmits data about blood sugar levels. In addition, a minuscule pinhole in the lens allows tear fluid to seep through to the sensor. The wireless chip needs just one microwatt of power to operate continuously and draws that power from a static electricity charge.
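The post describes the prototype as taking a reading roughly once per second and pushing it out through the embedded wireless chip. Purely to illustrate that cadence, here is a minimal Python sketch of such a sample-and-transmit loop; the function names and driver details are hypothetical placeholders, not anything from Google's post.

```python
# Minimal sketch of a once-per-second sample-and-transmit loop. The driver
# functions below are hypothetical stand-ins for the lens hardware.
import time

SAMPLE_INTERVAL_S = 1.0  # the prototype is described as reading once per second

def read_tear_glucose() -> float:
    """Hypothetical: read the glucose sensor sitting between the lens layers."""
    raise NotImplementedError

def transmit_reading(value: float) -> None:
    """Hypothetical: send the reading out via the embedded wireless chip."""
    raise NotImplementedError

def sampling_loop() -> None:
    while True:
        transmit_reading(read_tear_glucose())
        time.sleep(SAMPLE_INTERVAL_S)
```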


Speaking about the new technology in the post, Google X project leads Brian Otis and Babak Parviz write, “At Google, we wondered if miniaturized electronics—think: chips and sensors so small they look like bits of glitter, and an antenna thinner than a human hair—might be a way to crack the mystery of tear glucose and measure it with greater accuracy.” The development team had to make the glucose sensor far more sensitive than a typical sensor because the concentration of glucose in tear fluid is much lower than in blood.
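Because tear glucose sits well below blood glucose, a raw tear-fluid reading would also have to be mapped back to a blood-glucose estimate through some per-wearer calibration. The sketch below shows the simplest possible linear mapping; the slope and offset are made-up placeholders rather than published values, and the real relationship is unlikely to be this simple.

```python
# Purely illustrative calibration: the constants are made-up placeholders that
# only capture the idea that tear glucose is far lower than blood glucose.
TEAR_TO_BLOOD_SLOPE = 50.0   # hypothetical scaling factor
TEAR_TO_BLOOD_OFFSET = 0.0   # hypothetical per-wearer offset

def estimate_blood_glucose_mg_dl(tear_glucose_mg_dl: float) -> float:
    """Map a tear-fluid reading to an estimated blood glucose value (sketch)."""
    return TEAR_TO_BLOOD_SLOPE * tear_glucose_mg_dl + TEAR_TO_BLOOD_OFFSET

# Example: with these made-up constants, a 2 mg/dL tear reading maps to 100 mg/dL.
print(estimate_blood_glucose_mg_dl(2.0))
```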

[Image: Google smart contact lens glucose testing. Image used with permission by copyright holder.]

The team is also researching the possibility of including tiny LED lights that could act as an early warning system when blood sugar reaches a dangerous level. The data could also be linked to a smartphone or tablet, with an app providing blood sugar notifications. As readings are collected throughout the day, anyone with diabetes would have a wealth of data to compare against nutritional information, assuming they logged their food intake. The data could also be shared with a family doctor, providing a more complete picture of the patient’s health.
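If readings were pushed to a companion app, the alerting and logging the article speculates about could be as simple as threshold checks plus a shareable log. The sketch below is one way that might look; the thresholds, names, and CSV format are assumptions for illustration, not an actual Google design.

```python
# Sketch of speculative app-side alerting and logging for lens readings.
# Thresholds, names, and file format are assumptions for illustration.
from dataclasses import dataclass
from typing import Optional

LOW_THRESHOLD_MG_DL = 70.0    # hypothetical "too low" alert level
HIGH_THRESHOLD_MG_DL = 180.0  # hypothetical "too high" alert level

@dataclass
class Reading:
    timestamp: float      # seconds since epoch
    glucose_mg_dl: float  # estimated blood glucose derived from the lens

def check_reading(r: Reading) -> Optional[str]:
    """Return a notification message if the reading crosses an alert threshold."""
    if r.glucose_mg_dl < LOW_THRESHOLD_MG_DL:
        return f"Low glucose: {r.glucose_mg_dl:.0f} mg/dL"
    if r.glucose_mg_dl > HIGH_THRESHOLD_MG_DL:
        return f"High glucose: {r.glucose_mg_dl:.0f} mg/dL"
    return None  # in range, no alert

def export_for_doctor(readings: list[Reading], path: str) -> None:
    """Append the day's readings to a simple CSV that could be shared with a doctor."""
    with open(path, "a") as f:
        for r in readings:
            f.write(f"{r.timestamp},{r.glucose_mg_dl}\n")
```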

At this time, Google doesn’t have a prospective release date for the new technology, but it has completed several clinical research studies testing the product. Google is also in talks with the U.S. Food and Drug Administration about the progress of that testing. The company plans to seek partners to bring the technology to market, likely a contact lens manufacturer as well as companies to develop the miniature electronics for the product. The Google X team is also responsible for projects that include the self-driving car, Google Glass, and Project Loon.

A pain-free alternative to checking blood sugar levels could be particularly useful for pediatricians, who often run into resistance when teaching children with type 1 diabetes how to check glucose levels. It’s likely that Google would also work with contact lens manufacturers to offer the lens in a range of prescription strengths so it could serve people who have both diabetes and vision problems.

Mike Flacy
By day, I'm the content and social media manager for High-Def Digest, Steve's Digicams and The CheckOut on Ben's Bargains…
Rice grain-sized sensor could give robots a delicate touch and keep them from breaking stuff

Robots are incredibly precise, but being gentle is not always their strong suit. A machine that can build a car with near-perfect accuracy can still apply too much pressure when working in places where even the smallest mistake matters, like inside a human eye or during delicate surgery. That is why researchers at Shanghai Jiao Tong University are developing a new type of force sensor that could help robots “feel” what they are touching more accurately.

The sensor is tiny, about the size of a grain of rice at just 1.7 millimeters wide, making it small enough to fit inside advanced surgical tools. What makes it especially interesting is that it does not rely on traditional electronics. Instead, it uses light to measure force from every direction, including pressure, sliding movements, and twisting.

Here is how it works. At the tip of an optical fiber sits a soft material that slightly changes shape when it comes into contact with something. That tiny deformation alters how light travels through the sensor. The altered light pattern is then sent through optical fibers to a camera, which captures it like an image. Researchers then use a machine learning model to study those light patterns and translate them into precise force readings. In simple terms, the system learns how to “read” touch through light alone, without needing a bunch of wires or multiple separate sensors packed into such a tiny space.
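To make that pipeline concrete, here is a rough Python sketch of the image-to-force idea: flatten each captured light pattern into a feature vector and fit a regression model that outputs force components. The model choice, image resolution, and four-component output are assumptions for illustration, not details from the Shanghai Jiao Tong University work.

```python
# Illustrative only: maps camera images of the light pattern to force estimates.
# Architecture, image size, and output components are assumptions, not details
# from the published sensor.
import numpy as np
from sklearn.neural_network import MLPRegressor

IMAGE_SHAPE = (64, 64)  # assumed resolution of the captured light pattern

def features(image: np.ndarray) -> np.ndarray:
    """Flatten and normalize one light-pattern image into a feature vector."""
    return image.astype(np.float32).reshape(-1) / 255.0

# Stand-in training data: pairs of (light-pattern image, measured force/torque).
# In practice these would come from pressing the fiber tip against a reference
# force gauge in many directions.
rng = np.random.default_rng(0)
images = rng.integers(0, 256, size=(200, *IMAGE_SHAPE))
targets = rng.normal(size=(200, 4))  # assumed: normal force, two shear axes, twist

model = MLPRegressor(hidden_layer_sizes=(128, 64), max_iter=300)
model.fit(np.stack([features(im) for im in images]), targets)

# At run time, each new camera frame is converted the same way and the model
# returns an estimated force vector for the robot controller to act on.
frame = rng.integers(0, 256, size=IMAGE_SHAPE)
force_estimate = model.predict(features(frame).reshape(1, -1))
```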

Read more
Meta’s own employees are having a hard time digesting AI. Who would’ve thought?

If you wanted a snapshot of what it looks like when a tech giant tries to force-feed its workforce an AI future, look no further than Meta right now. The company that built its empire on knowing everything about its users has turned that same appetite inward, and its employees are not happy about it. Last month, Meta quietly informed tens of thousands of its U.S. workers that their corporate laptops would begin tracking their keystrokes, mouse movements, clicks, and screen activity. The purpose was to feed that behavioral data into Meta's AI models so they could learn how people actually use computers. The reaction was immediate — within hours, internal comment threads were flooded with anger, confusion, and more than a hundred emoji reactions that left little to the imagination about how employees felt.

When an engineering manager asked how to opt out, Meta's chief technology officer, Andrew Bosworth, had a blunt answer: there was no opt-out, at least not on a company laptop. This is the same company that is also tying AI tool usage to performance reviews, running mandatory "AI Transformation Weeks" to retrain its workforce, and building internal dashboards that gamify how many AI tokens employees consume in a day — a metric so aggressively tracked that some workers started building AI agents to manage their other AI agents. The whole thing started to resemble a feedback loop eating itself.

Read more
Sci-fi got the gadgets right, but the vibes wrong
Sci-fi got plenty of consumer tech right, but reality keeps delivering the useful, compromised version of the dream
[Image: Officer K looking up at a neon-colored hologram in Blade Runner 2049.]

I was recently waiting for an Uber when the GPS decided to lie for sport. The car was somewhere nearby, I was somewhere nearby, and somehow both of us were trapped in that modern ritual of wrong pins, slow turns, vague waving, and "I'm here" messages that help absolutely no one.

That was when I had a very reasonable thought: this is exactly where a hologram of a giant arrow pointing at me would be useful.

Read more