Brighter sun spots: Researchers can now view solar storms in high definition

You’re not supposed to look directly at the sun, but if you’re going to insist on it, at least make sure you’re getting the highest-quality, widest-angle images possible.

That is what scientists have achieved thanks to an upgrade of the world’s highest-resolution solar telescope at Big Bear Solar Observatory in California. The pictures rely on a technique called “multi-conjugate adaptive optics” (MCAO), in which three deformable mirrors correct light passing through different altitudes of Earth’s atmosphere, undoing the temperature-related distortions that previously got in the way of attempts to record such images.
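
To get a feel for the principle, here is a minimal Python sketch of the MCAO idea, with made-up layer heights and numbers; it illustrates the concept and is not BBSO’s actual control software. Turbulence is modeled as phase screens at different altitudes, and each deformable mirror cancels the distortion from the layer it is conjugated to.

```python
import numpy as np

# Toy model of multi-conjugate adaptive optics (MCAO).
# Turbulence is represented as random "phase screens" at three
# altitudes; each deformable mirror (DM) is conjugated to one layer
# and applies the opposite phase to cancel that layer's distortion.
# Illustrative sketch only -- not BBSO's software.

rng = np.random.default_rng(42)
altitudes_km = [0, 3, 8]  # assumed layer heights, for illustration
screens = [rng.normal(0.0, 1.0, (64, 64)) for _ in altitudes_km]

# The total wavefront error is the sum of all layers (toy phase units).
wavefront = sum(screens)

# Each DM applies the negative of its conjugate layer's phase.
# A gain below 1 models an imperfect real-world correction loop.
gain = 0.9
correction = sum(-gain * s for s in screens)

residual = wavefront + correction
print(f"RMS wavefront error before: {np.std(wavefront):.2f}")
print(f"RMS wavefront error after:  {np.std(residual):.2f}")  # ~10x smaller
```

A single deformable mirror can only correct well along one narrow line of sight; correcting several atmospheric layers independently is what widens the usable field of view.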

To put it simply, think of the hazy view you get when peering through hot exhaust fumes, or the way stars appear to twinkle in the night sky. Using multi-conjugate adaptive optics, the new system works around those distortions and achieves a sharper view of solar activity across a field roughly three times wider than was possible before.

Thanks to the upgraded system, which is guided by cameras recording at an astonishing 2,000 frames per second, it’s possible to scope out sunspots up to 32,000 kilometers in width.
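
Some quick back-of-envelope arithmetic (mine, not the researchers’) puts those figures in perspective: at 2,000 frames per second the correction loop gets half a millisecond per frame, and a 32,000-kilometer sunspot viewed from Earth spans roughly 44 arcseconds of sky.

```python
# Back-of-envelope numbers behind the upgraded system (illustrative).
FPS = 2_000                        # wavefront-sensor camera frame rate
print(f"Time per frame: {1_000 / FPS} ms")          # 0.5 ms

# Angular width of a 32,000 km sunspot viewed from ~1 AU away.
AU_KM = 149_597_870.7              # Earth-Sun distance in km
RAD_TO_ARCSEC = 206_265
print(f"Angular width: {32_000 / AU_KM * RAD_TO_ARCSEC:.0f} arcsec")  # ~44
```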

This opens up a slew of new research possibilities.

A comparison of the old and new image quality researchers could record.

“In large [solar] flares, the explosive events seem to occur simultaneously in many places in the field of view because with post-facto image reconstruction [note: referring to the practice of combining hundreds of images to get a wider corrected field of view] the time cadence is several seconds,” Philip Goode, a research professor of physics at the New Jersey Institute of Technology, told Digital Trends. “For the first time, we can study such events with corrected images over a wide field with a sub-second time cadence, and finally observe the fundamental processes as they occur.”
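
The cadence limit Goode describes is straightforward to see in numbers. Post-facto reconstruction must accumulate a whole burst of frames before it can output a single corrected image, whereas MCAO corrects the light optically before it reaches the camera, so every exposure comes out corrected. A hypothetical sketch with made-up frame rates:

```python
# Why post-facto image reconstruction has a cadence of several
# seconds: it needs a whole burst of short exposures to produce
# one corrected frame. Numbers below are assumptions, not BBSO's.

camera_fps = 100      # assumed science-camera frame rate
burst_size = 300      # "hundreds of images" combined per output frame

post_facto_cadence = burst_size / camera_fps
print(f"Post-facto cadence: {post_facto_cadence:.1f} s per image")  # 3.0 s

# With MCAO the correction happens in hardware, so each exposure is
# already corrected and the cadence drops well below a second.
mcao_cadence = 1 / camera_fps
print(f"MCAO cadence: {mcao_cadence * 1000:.0f} ms per image")      # 10 ms
```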

At present, the system is still classed as a “demonstrator,” and Goode said the team has many questions to answer about how to optimize the system and bring it into regular operation. This means testing out many permutations in the order of the deformable mirrors and wavefront sensors.

“The next thing in line for us is to build and implement a real-time profilometer to measure the atmospheric turbulence as a function of altitude to help to optimize the conjugation altitudes of the deformable mirrors,” he said.
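
Such a profilometer would yield a turbulence profile, conventionally written Cn², giving turbulence strength as a function of altitude. One plausible (and deliberately oversimplified) way to use it is to conjugate the three mirrors to the strongest layers; the team’s actual optimization will certainly be more sophisticated.

```python
import numpy as np

# Sketch: choose DM conjugation altitudes from a measured turbulence
# profile (Cn^2 versus height). All values here are hypothetical, and
# the real optimization is harder than "pick the three peaks".

altitude_km = np.array([0, 1, 2, 4, 6, 8, 12, 16])
cn2 = np.array([8.0, 2.0, 1.0, 0.5, 1.5, 3.0, 0.8, 0.3])  # relative

n_mirrors = 3
strongest = np.sort(altitude_km[np.argsort(cn2)[::-1][:n_mirrors]])
print(f"Conjugate the {n_mirrors} DMs near altitudes {strongest} km")
# -> [0 1 8] km: the ground layer plus the next two strongest layers
```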

Luke Dormehl