
Mars 2020 will capture high-definition color images from the Jezero Crater

In this picture taken on May 23, 2019, in the Spacecraft Assembly Facility’s High Bay 1 clean room at the Jet Propulsion Laboratory in Pasadena, California, engineers re-install the cover to the remote sensing mast (RSM) head after integration of two Mastcam-Z high-definition cameras that will go on the Mars 2020 rover. NASA/JPL-Caltech

The Mars 2020 rover is getting ready to join its sibling Curiosity on the surface of the planet, with scientists fitting vital instruments for the mission.

The rover will be heading to an area of Mars called the Jezero Crater, at the edge of an impact basin called the Isidis Basin. The Jezero Crater has been a target of scientific interest for some time, and was also in the running to be the exploration site of the Curiosity rover before the Gale Crater was chosen instead.

Now scientists will finally get the chance to gather data from the Jezero Crater. It is believed to be the site of an ancient lake, and it is even remotely possible that life could once have developed in this area. Even if, as is probable, there is no sign of life in the region, there likely will be minerals that form in the presence of water, which scientists can investigate for clues about the development of the lake over time.

But in order to navigate and document the crater, the rover will need to collect visual information. For this purpose, it is being equipped with two high-definition cameras called Mastcam-Z, which will be installed this week. Mastcam-Z will be able to collect images in color, as well as having the ability to zoom. This will help scientists to observe the mineralogy and structure of the rocks and sediment on Mars.

“Mastcam-Z will be the first Mars color camera that can zoom, enabling 3D images at unprecedented resolution,” Mastcam-Z Principal Investigator Jim Bell of Arizona State University said in a statement. “With a resolution of three-hundredths of an inch (0.8 millimeters) in front of the rover and less than one-and-a-half inches (38 millimeters) from over 330 feet (100 meters) away — Mastcam-Z images will play a key role in selecting the best possible samples to return from Jezero Crater.”

For now, the camera is covered with a lens cover (the round red object in the photo above) while it is fitted to a mast called the remote sensing mast (RSM). The RSM will be raised once the rover touches down on Mars to get a view of the environment.

Georgina Torbet
Georgina has been the space writer at Digital Trends for six years, covering human space exploration, planetary…