Kinect-powered virtual therapists can fully decode a patient’s body language


At a glance, the name SimSensei sounds like something that could virtually teach you to be the next ninja master. Alas, that dream is too far-fetched; SimSensei is actually a Kinect-powered virtual therapist that uses motion recognition technology to read a patient’s body language, picking up on underlying anxiety, nervousness, happiness, or contemplation.

Developed by a team at the University of Southern California’s Institute for Creative Technologies, the SimSensei program is still in its early stages. The beta preview uses a virtual avatar that acts as your shrink, while the second half of the screen uses the Microsoft Kinect to detect changes in the patient’s face and body movements. SimSensei records the patient only while he or she is verbally responding, so the program can see how the patient physically reacts when answering questions. These cues include leaning forward or backward in the chair, smiling, prolonged or absent eye contact, and the direction in which the eyes move.
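To make that workflow concrete, here is a minimal sketch of how a system like this might tag body-language cues from Kinect-style tracking frames, recording only while the patient speaks. Every field name and threshold here is an illustrative assumption, not SimSensei’s actual implementation:

```python
# Hypothetical sketch: tagging body-language cues from Kinect-style
# tracking frames, considering only frames where the patient is
# verbally responding. Thresholds and field names are illustrative.

from dataclasses import dataclass

@dataclass
class Frame:
    speaking: bool        # is the patient verbally responding?
    torso_z: float        # torso depth vs. a calibrated rest pose, in meters
    smiling: bool         # face-tracking smile flag
    gaze_on_avatar: bool  # is the patient looking at the virtual therapist?

def tag_cues(frames, lean_threshold=0.05):
    """Collect simple cues, skipping frames where the patient is silent."""
    cues = []
    for f in frames:
        if not f.speaking:  # record only while the patient responds
            continue
        if f.torso_z < -lean_threshold:
            cues.append("lean_forward")
        elif f.torso_z > lean_threshold:
            cues.append("lean_back")
        if f.smiling:
            cues.append("smile")
        cues.append("eye_contact" if f.gaze_on_avatar else "gaze_away")
    return cues
```

Feeding in a few frames — say, a forward-leaning smile while speaking, then a silent frame, then a spoken answer while leaning back and looking away — yields a cue list like `["lean_forward", "smile", "eye_contact", "lean_back", "gaze_away"]`, which a downstream model could then interpret.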


That’s not to say SimSensei is entirely robotic, either. The virtual therapist is even programmed to “Hmm” at appropriate times, as if pondering a response, and to guide the conversation depending on what the patient answers. The point is to make the program look and feel as natural as possible, but that’s also not to say SimSensei doesn’t sound kind of creepy. I may not have been to many shrinks, but the sample session USC provided in the video below shows a conversation that lacks white noise, feels awkward, and sounds a bit intimidating. We’re not sure what’s worse: trusting a human stranger or a friendly…ish robot with your deepest anxieties.

SimSensei is one of many programs being developed for the Association for Computing Machinery’s contest. The competition will compare the programs to see which can most accurately diagnose a patient based on body language, correctly separating depressed patients from non-depressed ones. Hey, at least if you reveal all your deep, dark secrets to SimSensei or other virtual psychiatrists, you will never run the risk of bumping into them in the street or something equally embarrassing. Meanwhile, students studying psychology have another reason to go back to their dorm rooms and weep themselves to sleep tonight for having chosen a doomed career path. Good thing there’s a robot they can talk to that’s currently in development.
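A contest that ranks programs by diagnostic accuracy could be scored along these lines — a hedged sketch only, since the article doesn’t specify the actual metric; the program names and labels below are made up:

```python
# Hypothetical sketch of scoring contest entries: each program labels
# patients as depressed (1) or not (0), and the most accurate entry wins.
# Entry names, labels, and the accuracy metric are illustrative assumptions.

def accuracy(predictions, truth):
    """Fraction of patients a program labels correctly."""
    correct = sum(p == t for p, t in zip(predictions, truth))
    return correct / len(truth)

def rank_programs(entries, truth):
    """Return (name, accuracy) pairs, best program first."""
    scores = {name: accuracy(preds, truth) for name, preds in entries.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
```

With ground-truth labels `[1, 0, 1, 1, 0]` and two entries, the one agreeing with four of the five labels ranks above the one agreeing with three.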

Natt Garun
An avid gadgets and Internet culture enthusiast, Natt Garun spends her days bringing you the funniest, coolest, and strangest…