Drive.ai’s self-driving cars use dashboard displays so passengers won’t stress out


Passengers may not be skittish when self-driving cars are commonplace, but in the early years, most of us will be on high alert. Self-driving system developer Drive.ai employs multiple visualization technologies to reassure passengers, as well as to help company engineers understand what the system is “seeing” and how it performs.

Drive.ai recently outlined, in a Medium post, the four primary visualization tools it uses for internal study and passenger reassurance. The company described how it uses dashboard displays, 3D data visualization, annotated datasets, and interactive simulations in product development.

Onboard displays

Passenger reassurance and comfort motivate Drive.ai’s dashboard displays. The onboard display combines data from lidar sensors and full-surround cameras to create 3D images as the car drives. By enhancing the image with data from radar, GPS, and an inertial measurement unit (IMU), the system helps passengers understand what the vehicle is about to do, as well as what it picks up with the various sensors.
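The fusion the article describes — lidar points, camera detections, and the vehicle’s planned maneuver rolled into one passenger-facing view — can be sketched as follows. This is an illustrative stand-in, not Drive.ai’s actual code; all names and data structures here are hypothetical.

```python
from dataclasses import dataclass

# Hypothetical sketch: combine point-cloud, camera, and planning data into
# a single frame for a passenger-facing display.

@dataclass
class DisplayFrame:
    points: list        # 3D lidar points as (x, y, z) tuples
    detections: list    # objects identified in camera imagery
    intent: str         # what the vehicle is about to do

def build_display_frame(lidar_points, camera_detections, planned_maneuver):
    """Fuse raw sensor outputs and planner intent into one display frame."""
    return DisplayFrame(points=lidar_points,
                        detections=camera_detections,
                        intent=planned_maneuver)

frame = build_display_frame(
    lidar_points=[(1.0, 2.0, 0.5), (3.2, 0.1, 0.4)],
    camera_detections=["pedestrian", "cyclist"],
    planned_maneuver="yielding to pedestrian",
)
print(frame.intent)  # yielding to pedestrian
```

The point of such a structure is that the display renders one coherent frame rather than separate per-sensor views, which is what lets a passenger see both what the car perceives and what it intends to do.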


Off-board analysis

Drive.ai’s engineers use real-time data from cars to create 3D visualizations that include mapping, motion planning, perception, and localization and state estimation, plus a host of additional robotics elements. The full assemblage enables the company to dive deeper into self-driving performance.

Synchronizing the timing of the various sensor data signals is a crucial element of successful autonomous vehicle performance. By incorporating a wide range of vehicle sensor, mapping, and traffic network data into a single visualization, the engineers can tweak the various algorithms to enhance the timing coordination. This toolset also facilitates testing and variable analysis.
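The synchronization step above — matching readings from different sensors to the same instant — can be illustrated with a simple nearest-timestamp pairing. This is a toy sketch under the assumption of sorted per-sensor timestamp streams; the names and data are invented for illustration.

```python
import bisect

# Hypothetical sketch of sensor-stream synchronization: for each lidar
# timestamp, find the nearest radar timestamp so fused data refer to
# (approximately) the same instant. Timestamps are in seconds.

def nearest_reading(timestamps, t):
    """Return the timestamp in a sorted list that is closest to t."""
    i = bisect.bisect_left(timestamps, t)
    candidates = timestamps[max(0, i - 1):i + 1]
    return min(candidates, key=lambda ts: abs(ts - t))

lidar_ts = [0.00, 0.10, 0.20]
radar_ts = [0.01, 0.06, 0.11, 0.19]

pairs = [(t, nearest_reading(radar_ts, t)) for t in lidar_ts]
print(pairs)  # [(0.0, 0.01), (0.1, 0.11), (0.2, 0.19)]
```

Real pipelines do considerably more (interpolation, motion compensation, clock-drift correction), but the core idea of aligning streams on a common timeline is the same.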

Annotated datasets

According to Drive.ai, it takes about 800 human hours to correctly label all the data collected during one hour of driving. Human annotators label the initial datasets; deep-learning AI then applies what it “learns” from the human-annotated data to label additional data quickly and reliably. The human annotators move on to new types of data and quality-check the machine-labeled output.
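The loop described above — humans seed the labels, a model pre-labels new data, and anything the model can’t handle goes back to humans — can be sketched minimally. The model here is a trivial lookup stand-in, and all names and labels are hypothetical, not Drive.ai’s pipeline.

```python
# Hypothetical sketch of the human/machine labeling loop: a trained model
# (here, a trivial lookup standing in for one) pre-labels incoming data,
# and unrecognized items are routed back to human annotators.

def machine_label(item, learned_labels):
    """Stand-in for a trained model: reuse labels seen in human data."""
    return learned_labels.get(item, "unknown")

# Labels produced by human annotators on the initial dataset.
human_labeled = {"car_silhouette": "vehicle", "crosswalk_stripes": "crosswalk"}

incoming = ["car_silhouette", "crosswalk_stripes", "new_sensor_pattern"]
machine_labels = {item: machine_label(item, human_labeled) for item in incoming}

# Items the model cannot label go back to the human annotation queue.
needs_human = [item for item, lbl in machine_labels.items() if lbl == "unknown"]
print(needs_human)  # ['new_sensor_pattern']
```

This division of labor is what makes the 800-to-1 ratio tractable: humans spend their hours on novel data and spot-checks rather than labeling everything from scratch.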

Simulation

Working from what the company calls “massive libraries of scenarios,” Drive.ai engineers test and evaluate the company’s autonomous systems using driving simulators in 3D visualized worlds. With the autonomous system running in the background, the team can change elements such as traffic-light patterns and pedestrian behaviors to observe how the self-driving program responds.
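The scenario sweep described above can be sketched as a combinatorial test harness: vary the scenario parameters, run the driving policy on each combination, and check the outcomes. The policy here is a deliberately toy stand-in, and every name is hypothetical rather than Drive.ai’s system.

```python
import itertools

# Hypothetical sketch of scenario-library testing: sweep combinations of
# traffic-light states and pedestrian behaviors and record the planner's
# response for each scenario. The decision rule is a toy stand-in.

def planner(light, pedestrian):
    """Toy policy: stop on a red light or when a pedestrian is crossing."""
    if light == "red" or pedestrian == "crossing":
        return "stop"
    return "proceed"

lights = ["green", "red"]
pedestrians = ["waiting", "crossing"]

results = {(l, p): planner(l, p)
           for l, p in itertools.product(lights, pedestrians)}

# Safety invariant: every red-light scenario must end in a stop.
assert all(v == "stop" for (l, _), v in results.items() if l == "red")
print(results[("green", "waiting")])  # proceed
```

Exhaustively enumerating scenario combinations like this is what lets a simulator expose edge cases (a pedestrian stepping out on a green light, say) that would be rare and dangerous to reproduce on real roads.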

Bruce Brown, Contributing Editor to the Auto teams at Digital Trends and TheManual.com