
Tesla Drivers Claim Autopilot Caused Crashes

May 14, 2016, 6:42 PM UTC
A member of the media test drives a Tesla Model S equipped with Autopilot in Palo Alto, California, on Oct. 14, 2015.
Photograph by David Paul Morris — Bloomberg via Getty Images

In a preview of a type of dispute that is certain to become much more common, two drivers of Tesla’s Model S this week blamed the car’s self-driving features for accidents. In both cases, the company disputed those claims, pointing to onboard data that conflicted with the drivers’ accounts and citing clear warnings about how self-driving features should be used.

On Tuesday, Tesla owner Jared Overton told Utah’s KSL that he parked his car, stood near it for around 30 seconds, then entered a store, only to return to find it had pulled itself forward and smashed its windshield against an overhanging trailer (KSL has detailed photos of the accident).


But Tesla, citing data from the car’s “black box,” says that Overton activated the car’s Summon feature, which engaged just seconds later, presumably causing the crash. Summon is intended for parking in tight spaces with the car under close supervision, and the company specifically warns that the car cannot reliably detect overhanging objects.

Then on Friday, Ars Technica reported on claims by Arianna Simpson that her Model S’s autopilot system failed to engage as she expected it to, resulting in her rear-ending the vehicle in front of her. But again, Tesla responded with data showing Simpson had tapped her brake pedal before the crash, deactivating the autopilot features.

Assuming Tesla is citing accurate data, there are several takeaways here. Clearly, there will be a learning curve as drivers adapt to the different capabilities and limitations of self-driving features—and as manufacturers learn how to make them more intuitive. Robust data will be crucial to tracking just where breakdowns in that interaction happen, both to help make the features easier to use, and to help establish liability when crashes occur.


Tesla (TSLA) is likely to lead the way in navigating those challenges. The Model 3, which has seen huge preorder demand, is widely expected to include autopilot features. Tesla announced earlier this month that it will dramatically accelerate its production schedule, meaning there could be hundreds of thousands of cars with robust self-driving features on the road within as little as three years.

That means a lot of drivers learning how to use features that they’ve never seen before—and which, when they’re not used correctly, can clearly cause accidents.