European planet-hunting satellite CHEOPS captures its first image


Illustration of CHEOPS, ESA’s first exoplanet mission. ESA / ATG medialab

The European Space Agency (ESA)’s new exoplanet-hunting satellite, CHEOPS, has taken its first image of a target star. The satellite launched in December last year and opened its telescope cover last week; it has now begun imaging distant stars to search for the telltale signs of planets orbiting them.


Waiting for the first image was somewhat fraught for the scientists working on the project, as they could not be sure that the camera and other instruments were working perfectly until they saw the data come in.

“The first images that were about to appear on the screen were crucial for us to be able to determine if the telescope’s optics had survived the rocket launch in good shape,” explains Willy Benz, Professor of Astrophysics at the University of Bern and Principal Investigator of the CHEOPS mission. “When the first images of a field of stars appeared on the screen, it was immediately clear to everyone that we did indeed have a working telescope.”

First image of the star chosen as a target for CHEOPS after cover opening. The star, at the center of the image, is located at a distance of 150 light-years from us, in the constellation of Cancer.

The captured image, shown above, may look rather blurry. But this isn’t a mistake or a problem — the telescope is deliberately defocused. Counterintuitively, this makes the brightness measurements more precise: the light from a source such as a star is spread over more pixels, which lessens the variations caused by the movements of the spacecraft.
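Why spreading light over more pixels helps can be illustrated with a toy detector model. Everything below (grid size, 1% pixel-to-pixel gain scatter, jitter amplitude, PSF widths) is invented for illustration and is not CHEOPS's actual instrument model; the point is only that a wide point-spread function averages over pixel-gain variations, so spacecraft jitter moves the measured flux far less:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy detector: a pixel grid whose per-pixel sensitivity (gain)
# scatters by ~1%, as real CCDs do.
N = 64
gains = 1.0 + 0.01 * rng.standard_normal((N, N))
yy, xx = np.mgrid[0:N, 0:N]

def measured_flux(cx, cy, sigma):
    """Counts recorded for a unit-flux Gaussian PSF centred at (cx, cy)."""
    psf = np.exp(-((xx - cx) ** 2 + (yy - cy) ** 2) / (2 * sigma ** 2))
    psf /= psf.sum()            # normalise to unit total flux
    return (gains * psf).sum()  # detector response = gain-weighted flux

# Spacecraft jitter: the star's centroid wanders by ~0.3 px between frames.
jitter = 0.3 * rng.standard_normal((200, 2))

sharp   = [measured_flux(N/2 + dx, N/2 + dy, sigma=0.7) for dx, dy in jitter]
defocus = [measured_flux(N/2 + dx, N/2 + dy, sigma=6.0) for dx, dy in jitter]

std_sharp = float(np.std(sharp))
std_defocus = float(np.std(defocus))
print(std_sharp, std_defocus)  # defocused scatter is much smaller
```

With the sharp PSF, jitter shifts the light among a handful of pixels whose gains differ, so the summed flux wobbles; with the defocused PSF, each measurement averages over hundreds of pixels and the wobble largely cancels.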

“The good news is that the actual blurred images received are smoother and more symmetrical than what we expected from measurements performed in the laboratory,” Benz explained. “These initial promising analyses are a great relief and also a boost for the team.”

CHEOPS detects exoplanets through the transit method, in which it monitors the brightness levels of stars to look for periodic dips. If there are regular dips in brightness, that suggests that something — like an exoplanet — is orbiting around the star and periodically blocking its light. This is the same method used by NASA’s planet-hunter, TESS.
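The transit method described above can be sketched with a toy light curve. The numbers here (a planet blocking 1% of the starlight for 3 hours every 4.2 days, with 0.1% photometric noise) are made up for illustration, and real pipelines are far more sophisticated, but the idea is the same: flag the periodic dips and read the orbital period off their spacing:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic light curve: hourly brightness samples over 100 days.
t = np.arange(0, 100 * 24, 1.0)                 # time in hours
period, duration, depth = 4.2 * 24, 3.0, 0.01   # true (hidden) parameters

flux = np.ones_like(t) + 0.001 * rng.standard_normal(t.size)
flux[(t % period) < duration] -= depth          # periodic 1% dips

# Detection: flag samples well below baseline, group them into transits,
# and estimate the period as the typical spacing between transit starts.
dips = t[flux < 1 - depth / 2]
starts = dips[np.r_[True, np.diff(dips) > duration]]
est_period = float(np.median(np.diff(starts)))

print(est_period / 24)  # close to the true 4.2-day period
```

Regular, equally spaced dips of consistent depth are the signature a transit search looks for; a single dip could be noise, but a strict period repeated over many orbits points to an orbiting body.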

With CHEOPS capturing images, there will be more tests of its systems over the next two months to ensure everything is working smoothly. “We will analyze many more images in detail to determine the exact level of accuracy that can be achieved by CHEOPS in the different aspects of the science program,” said David Ehrenreich, CHEOPS project scientist at the University of Geneva. “The results so far bode well.”

Georgina Torbet
Georgina has been the space writer at Digital Trends for six years, covering human space exploration, planetary…