
Microsoft wants to turn your walls into a battlefield with IllumiRoom, a modified Kinect and projector


Today at CES, Microsoft Research joined Samsung on stage to show off an early look at a new piece of technology it dubbed IllumiRoom. The odd name is literal: the system combines a Kinect and a projector to turn your room into a display.

IllumiRoom was introduced by Microsoft’s chief technical strategy officer, Eric Rudder, who emphasized that this is just the first step in a much larger scheme: Microsoft wants to turn potentially any surface in your house into a computer display. Imagine having the Kinect’s motion-based controls work on a projected surface, say, a random countertop, and you have an idea of where it wants to go.
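Microsoft hasn’t said how IllumiRoom decides where to project, but the Kinect’s depth camera is the obvious ingredient. Purely as an illustration of the idea, here is a minimal sketch of how a depth frame could be used to flag flat, projectable surfaces; the function, threshold, and synthetic scene below are assumptions for demonstration, not details from the demo.

```python
# A minimal sketch (not Microsoft's actual pipeline) of one idea behind a
# Kinect + projector rig: use the depth camera to find roughly flat regions
# so projected effects land on usable surfaces. The threshold and the
# synthetic depth frame are illustrative assumptions.
import numpy as np

def flat_surface_mask(depth: np.ndarray, max_slope: float = 0.01) -> np.ndarray:
    """Return a boolean mask of pixels whose depth changes slowly between
    neighbors, i.e. candidates for flat projection surfaces like walls."""
    dz_y, dz_x = np.gradient(depth)   # per-pixel depth change (meters)
    slope = np.hypot(dz_x, dz_y)
    return slope < max_slope

# Synthetic 480x640 depth frame: a flat wall 2 m away, with the right-hand
# side receding steeply toward a corner.
depth = np.full((480, 640), 2.0)
depth[:, 400:] += np.linspace(0.0, 3.0, 240)  # steeply angled side wall

mask = flat_surface_mask(depth)
print(f"{mask.mean():.0%} of pixels flagged as projectable")  # ~62%
```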


The goals are lofty, and the reality of turning a blank wall into an interactive display capable of replacing a far more sensitive and responsive screen or keyboard is likely still years away, but the potential is undeniable.

Microsoft was quick to point out that this technology is still in the very early stages – “raw” was the word it used to describe it – but that didn’t stop the company from having a little fun with it.

To demonstrate a few of the simpler things IllumiRoom can do, Microsoft released a video that merely hints at the potential of the tech, but also shows off what might be the next iteration of the Kinect – or at least one possible feature of it.

The video below highlights the idea that explosions on screen won’t be limited to the TV, but could instead spill over into the room itself. This aspect of IllumiRoom may end up on the floor of a lab somewhere, or it could be incorporated into technology for the next Xbox, which hasn’t been officially announced yet (but which we think will be out before the end of the year). Either way, it’s an interesting look at what the technology could do.

Microsoft is keeping quiet about the technical aspects and its detailed plans, but it expects to share more in April at CHI in Paris.

Ryan Fleming
Former Gaming/Movies Editor
Ryan Fleming is the Gaming and Cinema Editor for Digital Trends. He joined the DT staff in 2009 after spending time covering…
Rice grain-sized sensor could give robots a delicate touch and keep them from breaking stuff

Robots are incredibly precise, but being gentle is not always their strong suit. A machine that can build a car with near-perfect accuracy can still apply too much pressure when working in places where even the smallest mistake matters, like inside a human eye or during delicate surgery. That is why researchers at Shanghai Jiao Tong University are developing a new type of force sensor that could help robots “feel” what they are touching more accurately.

The sensor is tiny, about the size of a grain of rice at just 1.7 millimeters wide, making it small enough to fit inside advanced surgical tools. What makes it especially interesting is that it does not rely on traditional electronics. Instead, it uses light to measure force from every direction, including pressure, sliding movements, and twisting.

Here is how it works. At the tip of an optical fiber sits a soft material that slightly changes shape when it comes into contact with something. That tiny deformation alters how light travels through the sensor. The altered light pattern is then sent through optical fibers to a camera, which captures it like an image. Researchers then use a machine learning model to study those light patterns and translate them into precise force readings. In simple terms, the system learns how to “read” touch through light alone, without needing a bunch of wires or multiple separate sensors packed into such a tiny space.
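To make the learned mapping concrete, here is a rough sketch of the kind of model the description above implies: a small neural network that regresses a camera image of the light pattern to a multi-axis force estimate. The architecture, input size, and six-axis output are assumptions chosen for illustration, not the researchers’ actual model.

```python
# Hedged sketch of a "light pattern -> force" regressor. The layer sizes,
# 64x64 input, and 6-axis output are illustrative assumptions, not details
# from the published sensor work.
import torch
import torch.nn as nn

class LightPatternToForce(nn.Module):
    """Maps a grayscale camera image of the fiber's light pattern to a
    6-component force/torque estimate (Fx, Fy, Fz, Tx, Ty, Tz)."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),        # pool to a fixed 4x4 grid
        )
        self.head = nn.Linear(32 * 4 * 4, 6)  # regress to 6 force axes

    def forward(self, x):
        # x: (batch, 1, H, W) normalized light-pattern images
        return self.head(self.features(x).flatten(1))

model = LightPatternToForce()
pattern = torch.rand(1, 1, 64, 64)   # stand-in for one camera frame
force_estimate = model(pattern)      # tensor of shape (1, 6)
print(force_estimate.shape)
```

In practice such a model would be trained on pairs of captured light patterns and ground-truth readings from a reference force sensor, so the network learns to invert the optical deformation.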

Meta’s own employees are having a hard time digesting AI. Who would’ve thought?

If you wanted a snapshot of what it looks like when a tech giant tries to force-feed its workforce an AI future, look no further than Meta right now. The company that built its empire on knowing everything about its users has turned that same appetite inward, and its employees are not happy about it. Last month, Meta quietly informed tens of thousands of its U.S. workers that their corporate laptops would begin tracking their keystrokes, mouse movements, clicks, and screen activity. The purpose was to feed that behavioral data into Meta's AI models so they could learn how people actually use computers. The reaction was immediate — within hours, internal comment threads were flooded with anger, confusion, and more than a hundred emoji reactions that left little to the imagination about how employees felt.

When an engineering manager asked how to opt out, Meta's chief technology officer, Andrew Bosworth, had a blunt answer: there was no opt-out, at least not on a company laptop. This is the same company that is also tying AI tool usage to performance reviews, running mandatory "AI Transformation Weeks" to retrain its workforce, and building internal dashboards that gamify how many AI tokens employees consume in a day — a metric so aggressively tracked that some workers started building AI agents to manage their other AI agents. The whole thing started to resemble a feedback loop eating itself.

Sci-fi got the gadgets right, but the vibes wrong
Sci-fi got plenty of consumer tech right, but reality keeps delivering the useful, compromised version of the dream
Officer K looking up at a neon-colored hologram in Blade Runner 2049.

I was recently waiting for an Uber when the GPS decided to lie for sport. The car was somewhere nearby, I was somewhere nearby, and somehow both of us were trapped in that modern ritual of wrong pins, slow turns, vague waving, and "I'm here" messages that help absolutely no one.

That was when I had a very reasonable thought: this is exactly where a hologram of a giant arrow pointing at me would be useful.
