While Tesla’s new hands-free driving feature is drawing a lot of interest this week, it’s the technology behind the scenes of the company’s newly enabled autopilot service that should be getting more attention.
At an event on Wednesday, Tesla CEO Elon Musk explained that the company’s new autopilot service is constantly learning and improving thanks to machine learning algorithms, the car’s wireless connection, and the detailed mapping and sensor data that Tesla collects.
Tesla’s cars have long used data and over-the-air software updates to improve the way they operate.
Machine learning algorithms, a recent focus in computer science, let computers take a large data set, analyze it, and use it to make increasingly accurate predictions. In short, the computers are learning. Companies like Google (GOOG), Facebook (FB), and now Tesla (TSLA) are using machine learning to train software that helps customers or sells them new services.
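The core idea, predictions that improve as more data arrives, can be sketched with a toy example. Everything here is illustrative, not anything Tesla or Google actually runs:

```python
# Toy illustration: a model's predictions improve as it sees more data.
# This is a deliberately simple "learner" (a running mean), not a real
# machine learning system -- the point is the learn-from-data loop.

class MeanPredictor:
    """Predicts the mean of all values seen so far."""

    def __init__(self):
        self.total = 0.0
        self.count = 0

    def learn(self, value):
        self.total += value
        self.count += 1

    def predict(self):
        # Before seeing any data, fall back to a fixed guess.
        return self.total / self.count if self.count else 0.0

# The "true" quantity the model is estimating, e.g. a typical braking
# distance in meters (hypothetical numbers centered on 50).
observations = [48, 52, 50, 49, 51, 50]

model = MeanPredictor()
errors = []
for obs in observations:
    errors.append(abs(model.predict() - 50))  # error before learning
    model.learn(obs)

print(errors[0], errors[-1])  # the error shrinks as data accumulates
```

Real systems replace the running mean with far more capable models, but the loop is the same: observe, update, predict better next time.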
Machine learning is a form of artificial intelligence: it is the way computers learn to act intelligently. While Musk has taken a somewhat alarmist stance on the dangers of AI, he clarified during the event on Wednesday that he’s only concerned with artificial intelligence meant for nefarious purposes.
When a reporter asked Musk during the media Q&A what made his company’s autopilot service different from the computer-based driving assistance features that competing automakers are working on, Musk emphasized learning.
“The whole Tesla fleet operates as a network. When one car learns something, they all learn it. That is beyond what other car companies are doing,” said Musk. When it comes to the autopilot software, Musk explained that each driver using the autopilot system essentially becomes an “expert trainer for how the autopilot should work.”
While most car companies might not be building learning systems, Google’s self-driving cars operate in a similar manner.
In that way, Tesla’s cars are more similar to smart connected gadgets like Nest’s learning thermostat (Nest is now owned by Google’s parent, Alphabet) than they are to traditional cars. Using sensors and algorithms, Nest’s thermostat learns its owner’s behavior over time and, through software updates, offers increasingly useful services; the data even informs Nest’s decisions about its next generation of hardware.
So, how does Tesla’s autopilot system, and its cars in general, learn? It all starts with data.
Companies building these types of driver-assistance services, as well as full-blown self-driving cars like Google’s, need to teach a computer how to take over key parts (or all) of driving using digital sensor systems instead of a human’s senses. To do that, companies generally start by training algorithms on large amounts of data.
You can think of it as the way a child learns through constant experience and repetition, explained Nvidia’s senior director of automotive, Danny Shapiro, in an interview with Fortune. Nvidia (NVDA) sells high-performance chips that enable computers to process large amounts of data, and more recently started selling a computing system, called Drive PX, for self-driving cars and driver-assist applications.
To create a self-driving car, companies feed hundreds of thousands, or even millions, of miles of driving videos and data into a computer’s data model to basically create a massive vocabulary around driving. The algorithms use visual techniques to break down the videos and to understand them. The goal is that when something unexpected happens — a ball rolls into the street — the car can recognize the pattern and react accordingly (slow down because a child could be running into the street after it).
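The recognize-and-react idea can be sketched with a hand-rolled nearest-neighbor lookup over labeled scenes. The feature encoding, labels, and numbers below are invented for illustration; real systems learn from millions of miles of video with deep neural networks:

```python
# Sketch of "recognize the pattern, react accordingly": classify a new
# scene by its closest match among labeled training scenes, then map
# the label to an action. Features and labels here are invented.

import math

# Hypothetical training data: (object_speed, distance_to_road) -> label
training_scenes = [
    ((0.0, 9.0), "parked_car"),
    ((0.1, 8.5), "parked_car"),
    ((3.0, 0.5), "ball_in_street"),
    ((2.5, 0.2), "ball_in_street"),
    ((1.2, 1.0), "pedestrian"),
]

actions = {
    "parked_car": "continue",
    "ball_in_street": "slow_down",  # a child may be chasing the ball
    "pedestrian": "slow_down",
}

def classify(scene):
    """Return the label of the nearest training scene (1-NN)."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    _, best_label = min(training_scenes, key=lambda t: dist(t[0], scene))
    return best_label

def react(scene):
    return actions[classify(scene)]

print(react((2.8, 0.4)))  # fast object close to the road -> "slow_down"
```

A production system faces far messier inputs, but the shape is the same: a large labeled vocabulary of driving situations, a classifier over it, and a policy that maps the recognized situation to an action.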
For Nvidia, the company loads this “driving dictionary,” as Shapiro calls it, onto powerful but compact computing hardware that can be used on the car. After that, companies like Google and Tesla add various types of data from different sources to continue to inform the model over time.
Companies try to gather as much data as possible to help a car’s computer make smarter and better decisions on the roads. This includes data from customers driving, data from GPS and maps, and data from company employees driving research cars.
The data from Tesla drivers is made possible by the hardware choices Tesla has made. All Tesla cars built in the past year have 12 sensors around the bottom of the vehicle, a front-facing camera next to the rear-view mirror, and a radar system under the nose. These sensing systems constantly collect data, both to help the autopilot work on the road today and to amass data that can make Tesla’s cars operate better in the future.
Because all of Tesla’s cars have an always-on wireless connection, data from driving and using autopilot is collected, sent to the cloud, and analyzed with software. For autopilot, Tesla takes the data from cars using the new automated steering or lane change system, and uses it to train its algorithms. Tesla then takes these algorithms, tests them out and incorporates them into upcoming software.
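The collect-train-deploy loop described above, and Musk’s “when one car learns something, they all learn it,” can be sketched as follows. All class and method names are hypothetical:

```python
# Sketch of the fleet-learning loop: every car logs driving data, a
# central service aggregates it and retrains a shared model, and the
# updated model is pushed back to the whole fleet over the air.
# All names here are hypothetical.

class Car:
    def __init__(self, car_id):
        self.car_id = car_id
        self.model_version = 0
        self.log = []

    def drive(self, sensor_reading):
        self.log.append(sensor_reading)  # local sensor/driving data

    def receive_update(self, version):
        self.model_version = version     # over-the-air software update

class FleetServer:
    def __init__(self):
        self.model_version = 0
        self.training_data = []

    def collect(self, cars):
        for car in cars:
            self.training_data.extend(car.log)  # data sent to the cloud
            car.log = []

    def retrain_and_deploy(self, cars):
        # "Training" here just bumps a version; a real system would
        # refit its algorithms on the aggregated data.
        self.model_version += 1
        for car in cars:
            car.receive_update(self.model_version)

fleet = [Car(i) for i in range(3)]
server = FleetServer()

fleet[0].drive("unusual off-ramp geometry")  # one car sees something new
server.collect(fleet)
server.retrain_and_deploy(fleet)

# When one car learns something, they all learn it:
print({car.model_version for car in fleet})  # -> {1}
```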
Companies rely on different types of data depending on what they’re trying to do with the cars. For example, Google has used large and expensive LIDAR (light detection and ranging) sensors on its self-driving cars. But Tesla’s Musk said that LIDAR was basically overkill for what Tesla’s autopilot cars need.
But Musk said that Tesla wanted much more detailed, high-precision mapping data for its automated steering and lane-change features than was available through standard navigation technology. To meet its needs, Tesla has started to build high-precision maps, with 100 times the granularity of standard navigation systems, using mostly data from Tesla cars driving on roads, along with some data from Tesla employees driving research cars.
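One way fleet data can yield precision beyond any single car’s fix is simple aggregation: many slightly noisy reports of the same road feature average out to a much tighter estimate. This is a toy sketch with invented numbers, not Tesla’s actual mapping pipeline:

```python
# Sketch of refining map precision from fleet data: many cars report
# slightly noisy lateral positions as they drive the same lane, and
# averaging the reports converges on a far more precise lane-center
# estimate than any single navigation-grade fix. Numbers are invented.

def lane_center_estimate(reports):
    """Average many noisy lateral-offset reports (in meters)."""
    return sum(reports) / len(reports)

# Hypothetical lateral offsets from 40 drives past the same road
# segment, each a few decimeters off the true center (2.0 m). The
# deterministic formula just spreads values around 2.0.
reports = [2.0 + 0.3 * ((i * 7) % 5 - 2) / 2 for i in range(40)]

print(round(lane_center_estimate(reports), 3))
```

Real map building also has to align traces, reject outliers, and stitch segments together, but the statistical intuition is the same: more drives, more precision.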
These new services could provide unexpected business models for companies. Musk said that Tesla might be interested in selling the mapping data to other car companies down the road.
Tesla isn’t the only car maker working on driver-assist and self-driving car tech. Google is blazing ahead on its futuristic tech, while Audi has traffic jam assist software. Nvidia’s Shapiro says that most automakers are investigating these technologies.
Nvidia started shipping Drive PX this summer, and Shapiro says the company has engaged with over 50 companies and researchers. Tesla uses Nvidia chips in the 17-inch screen and the instrument cluster of its Model S, and there has been speculation about whether Tesla might use the Drive PX system in future versions of the Model X SUV. Shapiro wouldn’t discuss the specifics of Nvidia’s relationships with Tesla or Audi, which uses Nvidia’s tech in its traffic jam system.
Shapiro cautioned that despite some companies already deploying these technologies, it’s still early days for self-driving car tech. “A huge amount of work will be done on this over the next decade,” he said.