Ikea’s new augmented reality app superimposes furniture into your empty room

The typical Ikea experience usually goes like this: You get to the showroom, everything looks wonderful, you pretend to live in the overly organized fake bedroom, realize you should make a list of things to buy and… crap, now you’re not quite sure how any of it will look in your apartment. So you go home with some Swedish meatballs in your belly and a copy of this year’s catalog.

Despite what might feel like a failed trip, grabbing a catalog was actually a start. In addition to last year’s unveiling of an augmented reality-capable catalog, Ikea now offers a new app feature that can turn that little book into a virtual piece of furniture. The AR feature helps shoppers envision what a product might look like in their apartment by overlaying an illusion of it on the live view through a smartphone camera. The catalog acts as an anchor point that helps the camera detect angle, light, position, and size. The end result is not always at the right scale, but it’s helpful enough for users who just want to know whether that armchair would look good on top of their favorite rug. It’s like a virtual test drive for Ikea furniture, sans assembly and transportation.

Of course, before buying, you’ll still want to measure the dimensions to ensure everything will actually fit. And no, the AR app does not tell you what your mangled furniture will end up looking like should you get frustrated with the vague assembly instructions. Perfection is only a dream… or in this case, one augmented reality app away. For the most part, AR has been a kitschy piece of technology, but this Ikea integration is one of the few uses that seems genuinely practical.

Watch the video below to see a family have way too much fun with an Ikea catalog and several mobile devices.

Natt Garun
An avid gadgets and Internet culture enthusiast, Natt Garun spends her days bringing you the funniest, coolest, and strangest…
6 things Gemini Intelligence is about to do across your Android devices

Google is bringing Gemini Intelligence to Android, putting the best of Gemini on its most capable devices. The company wants Gemini to handle tasks for you throughout the day, all while you stay in control and keep your data private. Google is rolling out these features starting with Samsung Galaxy and Google Pixel devices this summer, and we’ll see them on other Android devices, including watches, cars, glasses, and laptops, later this year.

Your assistant is about to get a lot more hands-on, without you having to ask twice

Read more
Google’s next Chrome update is a big deal for Android users

Gemini is clearly becoming the centerpiece of Google’s AI strategy, and that focus is now extending deep into Chrome on Android. Starting in June, Chrome is getting a fresh wave of AI-powered features built around Gemini, and the goal is pretty simple: turn your browser into something that actually helps you think, plan, and act, instead of just showing you pages.

Chrome is about to get a little too helpful in the best way

Read more
Rice grain-sized sensor could give robots a delicate touch and keep them from breaking stuff

Robots are incredibly precise, but being gentle is not always their strong suit. A machine that can build a car with near-perfect accuracy can still apply too much pressure in places where even the smallest mistake matters, such as inside a human eye during delicate surgery. That is why researchers at Shanghai Jiao Tong University are developing a new type of force sensor that could help robots “feel” what they are touching more accurately.

The sensor is tiny, about the size of a grain of rice at just 1.7 millimeters wide, making it small enough to fit inside advanced surgical tools. What makes it especially interesting is that it does not rely on traditional electronics. Instead, it uses light to measure force from every direction, including pressure, sliding movements, and twisting.

Here is how it works. At the tip of an optical fiber sits a soft material that slightly changes shape when it comes into contact with something. That tiny deformation alters how light travels through the sensor. The altered light pattern is then sent through optical fibers to a camera, which captures it like an image. Researchers then use a machine learning model to study those light patterns and translate them into precise force readings. In simple terms, the system learns how to “read” touch through light alone, without needing a bunch of wires or multiple separate sensors packed into such a tiny space.
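The pipeline described above, where a deformation-modulated light pattern is captured as an image and a learned model translates it into a force reading, can be sketched in a few lines. Everything below is an illustrative assumption, not the researchers’ actual code: the pattern size, the three force axes, and the use of a simple linear least-squares model as a stand-in for their machine learning step.

```python
import numpy as np

rng = np.random.default_rng(0)

N_PIXELS = 64      # flattened light-pattern "image" from the camera (assumed size)
N_FORCE_AXES = 3   # e.g. normal pressure plus two shear/twist components (assumed)

# Simulate training data: an unknown optical response matrix maps applied
# forces to light patterns; the model must learn the inverse mapping.
true_response = rng.normal(size=(N_PIXELS, N_FORCE_AXES))
forces_train = rng.uniform(-1, 1, size=(500, N_FORCE_AXES))
patterns_train = forces_train @ true_response.T
patterns_train += 0.01 * rng.normal(size=patterns_train.shape)  # sensor noise

# Stand-in "machine learning model": linear least squares that translates
# light patterns back into force estimates.
weights, *_ = np.linalg.lstsq(patterns_train, forces_train, rcond=None)

def read_force(pattern: np.ndarray) -> np.ndarray:
    """Translate one camera-captured light pattern into a force estimate."""
    return pattern @ weights

# Check on a held-out touch event.
true_force = np.array([0.3, -0.1, 0.05])
pattern = true_response @ true_force
estimate = read_force(pattern)
print(np.round(estimate, 3))
```

The real system would replace the linear model with a trained neural network and feed it actual camera frames, but the shape of the problem is the same: learn a mapping from light patterns to force vectors.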

Read more