
Sony steps up development of the SmartEyeGlass, ready to compete with Google


Sony is stepping up development of the SmartEyeGlass, a pair of smart glasses to compete with Google Glass. The company has released a software development kit for the eyewear, helping developers get started with the tech. Sony briefly discussed its plans for the SmartEyeGlass at both CES and IFA tech shows this year, and the announcement shows it’s ready to move past the concept stage.

The glasses seen in these early pictures are a developmental prototype, and almost certainly not the final design. At least we hope it’s not, because it makes Google Glass look like an Armani suit. Like the looks, the tech inside the SmartEyeGlass goes in a slightly different direction from Glass.


Instead of using a single prism, both lenses in the SmartEyeGlass act as screens, displaying a monochrome, holographic-style image that’s transparent enough not to block your sight, while still remaining functional. Sony promises good readability even in sunlight, a notorious problem on Glass. By forgoing a color display, Sony can also make the battery last longer, though at the moment it’s stored in an external pack rather than built into the glasses.

The specs will connect to an Android phone, and Sony provides a few examples of what wearers will be able to see. Like the rest of Sony’s SmartWear range, the glasses will link up with the Lifelog app. A navigational view while using GPS is to be expected, along with hands-free viewing of websites — handy for recipes, or following a how-to guide — and contextual information such as overlaying real-time player stats while at a sports game.

The SmartEyeGlass also includes the usual array of sensors, including a microphone, and a relatively basic 3-megapixel camera. Take a look at the video to get an idea of how Sony wants the interface to look, along with a demo of a sure-to-be-controversial Face Recognition app. Now that the SDK has been released, Sony can concentrate on perfecting the hardware side, and plans to have a demo device ready for developers early in 2015.

Andy Boxall
Andy has written about mobile technology for almost a decade. From 2G to 5G and smartphone to smartwatch, Andy knows tech.