
The PowerEgg X is a 4K handheld camera that’s also a waterproof drone


PowerVision is known for its innovative commercial drones, but it has produced consumer drones for only three years, starting with the original PowerEgg. On Monday night, the company returned to CES with a much smaller (and much smarter) personal drone dubbed the PowerEgg X.

The PowerEgg X is impressive, starting with its much smaller size. Unlike the original PowerEgg, you can easily hold this one in the palm of your hand. And that’s precisely what PowerVision wants you to do: The PowerEgg X can be used both as a handheld camera and a drone.


One other thing — it’s waterproof. On the CES show floor, we saw the drone withstand a man-made waterfall without losing any elevation.

This is so dope! @PowerVisionme is showcasing its new waterproof drone at #CES2020 by flying it through a man-made waterfall that cascades down from the ceiling of the convention center. Notice how it maintains elevation! @DigitalTrends pic.twitter.com/KjIrB12LHx

— Drew Prindle (@GonzoTorpedo) January 9, 2020

Enterprising filmmakers have long known you can take advantage of the automatic stabilization and gimbal of drones by using them as a handheld camera. The PowerEgg X can also be used like this, with a setting that allows for filming when it’s not flying.

But the PowerEgg X’s camera has something that most other drones do not: artificial intelligence. “Three years in development, PowerEgg X pulls together the technology consumers are seeking and puts it in a small, elegant egg shape,” PowerVision founder and CEO Wally Zheng said.

Algorithms help the camera detect faces and learn over time to keep the subject, which doesn’t necessarily need to be a person, in frame at all times. Tracking continues even if the subject moves in and out of the field of view. PowerVision is also using the A.I. engine to power what it calls a “massive” gesture database.
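PowerVision hasn’t published how its tracking works, but the framing half of the problem can be sketched in a few lines: given a detected bounding box, compute the gimbal nudges that re-center the subject. The detector itself is assumed here, and the frame size, gain, and function names are all illustrative, not PowerVision’s.

```python
FRAME_W, FRAME_H = 3840, 2160  # 4K frame, for illustration


def framing_correction(box, gain=0.1):
    """Return (pan, tilt) nudges, in normalized units, that move the
    center of `box` (x, y, w, h in pixels) toward the frame center."""
    x, y, w, h = box
    cx, cy = x + w / 2, y + h / 2
    # Normalized offset from the frame center, in [-0.5, 0.5] per axis.
    dx = cx / FRAME_W - 0.5
    dy = cy / FRAME_H - 0.5
    return gain * dx, gain * dy


# A subject up and to the left of center yields a leftward, upward nudge;
# a centered subject yields no correction at all.
pan, tilt = framing_correction((400, 200, 600, 800))
```

Run on every frame, this acts as a simple proportional controller: the farther the subject drifts from center, the harder the camera corrects, which is one reason gimbal footage from tracking drones looks so smooth.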

Digital Trends had a chance to catch the PowerEgg X at a press event on Monday evening and walked away impressed. The hyped A.I. functionality works as advertised, and the recorded 4K video was of good quality, with great image stabilization, which is vital to good drone video. While the confines of CES didn’t allow us to put the PowerEgg X through its paces, we hope to have a full review of the device in the weeks after the show.

The PowerEgg X is available starting today and retails for $899.

Follow our live blog for more CES news and announcements.

Ed Oswald
For fifteen years, Ed has written about the latest and greatest in gadgets and technology trends. At Digital Trends, he's…
Rice grain-sized sensor could give robots a delicate touch and keep them from breaking stuff
Sprout Robot

Robots are incredibly precise, but being gentle is not always their strong suit. A machine that can build a car with near-perfect accuracy can still apply too much pressure when working in places where even the smallest mistake matters, like inside a human eye or during delicate surgery. That is why researchers at Shanghai Jiao Tong University are developing a new type of force sensor that could help robots “feel” what they are touching more accurately.

The sensor is tiny, about the size of a grain of rice at just 1.7 millimeters wide, making it small enough to fit inside advanced surgical tools. What makes it especially interesting is that it does not rely on traditional electronics. Instead, it uses light to measure force from every direction, including pressure, sliding movements, and twisting.

Here is how it works. At the tip of an optical fiber sits a soft material that slightly changes shape when it comes into contact with something. That tiny deformation alters how light travels through the sensor. The altered light pattern is then sent through optical fibers to a camera, which captures it like an image. Researchers then use a machine learning model to study those light patterns and translate them into precise force readings. In simple terms, the system learns how to “read” touch through light alone, without needing a bunch of wires or multiple separate sensors packed into such a tiny space.
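The calibration idea behind that last step can be sketched in a toy form: treat the camera’s light pattern as a feature vector and learn a map from patterns to a three-axis force reading. The real system uses a far richer model and real optical data; everything below (the feature size, the synthetic data, the linear model) is an illustrative assumption, not the researchers’ actual method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "ground truth": pretend each 16-pixel light pattern relates
# to force (Fx, Fy, Fz) through an unknown linear mixing matrix.
true_map = rng.normal(size=(16, 3))
patterns = rng.normal(size=(200, 16))   # calibration "images"
forces = patterns @ true_map            # the known applied forces

# Calibration: fit the pattern -> force map by least squares.
learned_map, *_ = np.linalg.lstsq(patterns, forces, rcond=None)

# Inference: read the force off a new light pattern with no electronics
# at the sensor tip, only the learned mapping.
new_pattern = rng.normal(size=16)
predicted = new_pattern @ learned_map
actual = new_pattern @ true_map
```

With noiseless synthetic data the fit recovers the mapping almost exactly; the engineering challenge in the real sensor is that the pattern-to-force relationship is nonlinear and noisy, which is why a machine learning model is used instead of a simple linear fit.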

Read more
Meta’s own employees are having a hard time digesting AI. Who would’ve thought?

If you wanted a snapshot of what it looks like when a tech giant tries to force-feed its workforce an AI future, look no further than Meta right now. The company that built its empire on knowing everything about its users has turned that same appetite inward, and its employees are not happy about it. Last month, Meta quietly informed tens of thousands of its U.S. workers that their corporate laptops would begin tracking their keystrokes, mouse movements, clicks, and screen activity. The purpose was to feed that behavioral data into Meta's AI models so they could learn how people actually use computers. The reaction was immediate — within hours, internal comment threads were flooded with anger, confusion, and more than a hundred emoji reactions that left little to the imagination about how employees felt.

When an engineering manager asked how to opt out, Meta's chief technology officer, Andrew Bosworth, had a blunt answer: there was no opt-out, at least not on a company laptop. This is the same company that is also tying AI tool usage to performance reviews, running mandatory "AI Transformation Weeks" to retrain its workforce, and building internal dashboards that gamify how many AI tokens employees consume in a day — a metric so aggressively tracked that some workers started building AI agents to manage their other AI agents. The whole thing started to resemble a feedback loop eating itself.

Read more
Sci-fi got the gadgets right, but the vibes wrong
Sci-fi got plenty of consumer tech right, but reality keeps delivering the useful, compromised version of the dream
Officer K looking up at a neon-colored hologram in Blade Runner 2049.

I was recently waiting for an Uber when the GPS decided to lie for sport. The car was somewhere nearby, I was somewhere nearby, and somehow both of us were trapped in that modern ritual of wrong pins, slow turns, vague waving, and "I'm here" messages that help absolutely no one.

That was when I had a very reasonable thought: this is exactly where a hologram of a giant arrow pointing at me would be useful.

Read more