
NASA’s planet-hunting satellite sends back its first image — and it’s amazing


NASA’s new planet-hunting mission, the Transiting Exoplanet Survey Satellite (TESS), launched on April 18. After settling into its new surroundings and making a quick pass by the moon, it has already produced a stunning image, just released by NASA. As part of the calibration sequence for one of its four on-board cameras, TESS captured a swath of the sky that includes more than 200,000 stars.

A test image from the TESS satellite captures a swath of the southern sky along the plane of our galaxy. NASA/MIT/TESS

The two-second exposure is centered on the southern constellation Centaurus. The Coalsack Nebula is featured in the upper right quadrant, and the star Beta Centauri can be seen at the lower left edge.


The stunning display has captivated scientists and space enthusiasts around the globe. “We are truly excited about how well the TESS cameras are working,” MIT planetary scientist George Ricker told Forbes. “This beautiful image just popped up on the MIT payload operations display screens right after initial turn-on of the TESS instrument.”

Keep in mind, this image was produced using only one of TESS’s four cameras. Once the mission becomes fully operational, NASA expects future images to cover more than 400 times as much sky. A “first light” image suitable for detailed scientific analysis will be released in June. Meanwhile, here’s everything you need to know about the mission.

During its two-year mission, the four cameras will scan 26 sectors of the sky, covering both the northern and southern hemispheres. The observations will focus on possible “transits” of exoplanets, in which a faraway planet passes in front of its star and causes a measurable drop in brightness. (And there are quite a few way-off worlds; from the Dracula planet to Earth’s bigger, older cousin, here are the 10 best exoplanets discovered so far.)
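To get a feel for why that brightness drop is so small, here is a quick sketch of the transit geometry in Python. The fractional dip is roughly the planet’s disk area divided by the star’s, (R_planet / R_star)^2. The radii, sample cadence, and transit window below are illustrative example values, not TESS data or TESS software.

```python
# Hypothetical illustration of the transit method: a planet crossing the
# face of its star dims the measured flux by roughly (R_planet / R_star)^2.
# All numbers here are made-up example values, not TESS data.

def transit_depth(r_planet_km: float, r_star_km: float) -> float:
    """Fractional drop in a star's brightness while a planet transits it."""
    return (r_planet_km / r_star_km) ** 2

# An Earth-sized planet (~6,371 km) crossing a Sun-sized star (~696,000 km)
depth = transit_depth(6_371, 696_000)
print(f"fractional dip: {depth:.6f}")  # roughly 0.00008, a 0.008% dip

# Toy light curve sampled every 30 minutes: flat, except during the transit
flux = [1.0] * 48                # one day of samples
for i in range(20, 28):          # a 4-hour transit window
    flux[i] -= depth

# Flag samples that sit measurably below the out-of-transit level
flagged = [i for i, f in enumerate(flux) if f < 1.0 - depth / 2]
print(f"samples in transit: {flagged}")
```

An Earth-sized planet dims a Sun-like star by less than one hundredth of a percent, which is why TESS needs such sensitive cameras and repeated observations of each sector to pick these signals out.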

“We learned from Kepler that there are more planets than stars in our sky, and now TESS will open our eyes to the variety of planets around some of the closest stars,” said Paul Hertz of NASA. “TESS will cast a wider net than ever before for enigmatic worlds whose properties can be probed by NASA’s upcoming James Webb Space Telescope and other missions.”

TESS is on its way to an unusual but highly stable elliptical orbit that will take it around the Earth every 13.7 days. After a gravity assist that brought it within 5,000 miles of the moon, a final thruster burn on May 30 will finalize the orbit, and the satellite will begin its detailed observation mission using all four cameras in mid-June.

The search for exoplanets and, ultimately, extraterrestrial life will kick into high gear in the coming years. The aging Kepler satellite may be on its last legs (and about to run out of fuel), but the launch of the James Webb Space Telescope will help scientists build on the discoveries made by TESS and expand our knowledge of the universe even further. If NASA can ever launch the darn thing.

Mark Austin
Former Digital Trends Contributor