
This is what London looks like through a self-driving car’s eyes


If cars could talk, they’d say, “Wow.” At least, if those cars were of the self-driving variety and were as impressed as we are by stunning cityscapes. Thanks to an incredible visualization completed by ScanLAB Projects on behalf of New York Times Magazine, we now have an idea of what autonomous cars “see” when they navigate the roads of our global cities, and the view ain’t shabby at all. Providing a brand-new perspective on the world we inhabit, the 3D laser scans of London are truly phenomenal in their degree of detail and, moreover, their almost dreamlike quality.

“One of the most significant uses of 3D scanning in the years to come will not be by humans at all but by autonomous vehicles,” notes Geoff Manaugh of New York Times Magazine. And to give us a sense of just what sort of scanning these vehicles deal with on their drives, two architectural designers, Matthew Shaw and William Trossell, strapped a laser scanner to a Honda CR-V, and went exploring.


Today, self-driving cars find their way around streets, other vehicles, and roadblocks by way of a technology known as “lidar,” a portmanteau of “light” and “radar.” By sending out nearly a million pulses of light per second, all imperceptible to the human eye, cars are able to sense their surroundings. And they don’t merely sense: these autonomous vehicles actually “capture” their environments, effectively recreating a three-dimensional model of the scene, one that continues to evolve and adapt as the location changes around them.
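To make that idea concrete, here is a minimal sketch of the geometry involved. Each lidar pulse comes back as a distance plus the angles it was fired at; converting those spherical measurements to Cartesian coordinates and accumulating them is what builds the 3D point cloud. This is an illustrative simplification, not ScanLAB's or any vendor's actual pipeline; the function name and axis conventions are assumptions for the example.

```python
import math

def lidar_return_to_point(distance_m, azimuth_deg, elevation_deg):
    """Convert one lidar return (range + firing angles) into an (x, y, z)
    point, with x forward, y left, and z up relative to the sensor."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = distance_m * math.cos(el) * math.cos(az)
    y = distance_m * math.cos(el) * math.sin(az)
    z = distance_m * math.sin(el)
    return (x, y, z)

# Accumulating many returns per second yields the point cloud — the
# evolving 3D model of the scene described above.
cloud = [lidar_return_to_point(10.0, az, 0.0) for az in range(0, 360, 90)]
```

A real sensor repeats this roughly a million times per second while the car moves, which is also why ScanLAB's scans of moving buses smear into the "time-stretched mega-structures" described below.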

ScanLAB’s recent project, to some extent, attempts to recreate the car’s lidar point of view for human appreciation, with pretty incredible results. “The London that their work reveals is a landscape of aging monuments and ornate buildings, but also one haunted by duplications and digital ghosts,” Manaugh writes. “The city’s double-decker buses, scanned over and over again, become time-stretched into featureless mega-structures blocking whole streets at a time. Other buildings seem to repeat and stutter, a riot of Houses of Parliament jostling shoulder to shoulder with themselves in the distance. Workers setting out for a lunchtime stroll become spectral silhouettes popping up as aberrations on the edge of the image. Glass towers unravel into the sky like smoke.”

But it’s more than aesthetics. As the architects note, the laser scans reveal not only how far the technology has come, but also how far it has yet to go in accurately representing a car’s surroundings. Until these autonomous vehicles can see just as well as we can, we should enjoy their view; it’s pretty amazing.

Lulu Chang