A lunar time capsule: 50-year-old moon rock samples to be opened for study


11 December 1972 — Scientist-astronaut Harrison H. Schmitt, the lunar module pilot, collects lunar rake samples at Station 1 during the first Apollo 17 extravehicular activity (EVA) at the Taurus-Littrow landing site. The Lunar Rake, an Apollo Lunar Geology Hand Tool, is used to collect discrete samples of rocks and rock chips ranging in size from one-half inch (1.3 cm) to one inch (2.5 cm). Photo: Eugene A. Cernan, Apollo 17 Commander

Nearly 50 years after the Apollo missions to the Moon, NASA is breaking open samples of Moon rock for the first time. Samples collected from Apollo missions 15, 16, and 17 (launched in July 1971, April 1972, and December 1972 respectively) have been preserved and never before exposed to Earth’s atmosphere, making them invaluable resources for understanding the geology of the Moon.


Nine teams have been selected to study the samples, with the hope that advances in analysis techniques and a deeper understanding of the lunar environment will enable discoveries that were not possible at the time of the original missions.

“By studying these precious lunar samples for the first time, a new generation of scientists will help advance our understanding of our lunar neighbor and prepare for the next era of exploration of the Moon and beyond,” Thomas Zurbuchen, Associate Administrator for NASA’s Science Mission Directorate in Washington, DC, said in a statement. “This exploration will bring with it new and unique samples into the best labs right here on Earth.”

Some of the samples were brought back to Earth in vacuum-sealed packages to protect them from degradation in Earth’s atmosphere. One such sample is 800 grams (1.8 pounds) of rock collected from beneath the Moon’s surface on the Apollo 17 mission, still enclosed in the “drive tube” that was pushed into the lunar soil to collect a core sample. This means scientists can see the layers of rock beneath the surface, exactly as they were in place on the Moon.

Other samples were preserved by freezing them or storing them in helium to prevent chemical reactions. The teams won’t open the samples right away, as it will take time to figure out how to open them without causing any kind of contamination.

NASA scientists today applaud the foresight of preserving these samples: “Returned samples are an investment in the future,” Lori Glaze, acting director of NASA’s Planetary Science Division in Washington, DC, said in the same statement. “These samples were deliberately saved so we can take advantage of today’s more advanced and sophisticated technology to answer questions we didn’t know we needed to ask.”

Georgina Torbet
Georgina has been the space writer at Digital Trends for six years, covering human space exploration, planetary…