Another day, another Tesla crash.
This time, a Tesla (TSLA) Model S driver claims that his car’s Autopilot failed to react to a van parked on a busy road somewhere in Europe, slamming into its rear. The driver, Chris Thomann, blamed the crash on Tesla, said the entire front of his car must be replaced, and posted a video of the incident online.
“I was in contact with Tesla Europe, but they could not provide me with any useful information,” he wrote in the description of the video he posted to YouTube. “They just stated that ‘all systems worked as expected.’ Well, certainly not how I expected them to work.”
The video, which was posted to YouTube on Wednesday and earlier reported on by Electrek, shows the Model S driving along a busy street and using Autopilot to track a black car in front of it. After the car in front swerved around the stopped van, the Model S neither reacted to avoid the obstacle nor stopped, leading to the collision.
According to Thomann, he was using Tesla’s Autopilot active cruise control at the time and said that it “did not brake as it normally does.” He added that the car’s “collision avoidance system did not make an emergency brake,” and that the “forward collision warning turned on way too late,” despite being turned on to its normal setting. What’s worse, he says, the car actually sped up before he hit the brakes.
Tesla’s Autopilot is a semi-autonomous, but not fully autonomous, feature in Tesla’s sedans. It automatically takes over steering within a lane, and drivers who want to change lanes can do so by tapping the turn signal. Autopilot also controls speed and uses sensors and other technology to perform functions like steering to avoid collisions and parallel parking.
Autopilot, while handy, has been the subject of some debate of late after two Model S drivers earlier this month blamed the car’s self-driving features for causing crashes. One of those drivers accused the technology of driving the car into a trailer and smashing his car’s windshield. Another driver claimed that the car’s Autopilot didn’t engage when she assumed it would prevent her from colliding with the rear of another car.
Tesla has countered those claims, saying that the windshield-smashing incident could have been prevented if the driver had properly used an auto-parking feature. In the second incident involving a rear-end collision, Tesla said that if the driver hadn’t tapped the brake pedal, the car would have likely prevented a crash.
Tesla based its findings on an analysis of each car’s “black box,” which includes important data about how the vehicle is operating and what caused an accident.
In both cases, however, drivers made assumptions that led to crashes, illustrating the work that automakers must still do before fully autonomous vehicles are ready for widespread use. Indeed, most analysts don’t expect fully autonomous cars to hit the road until 2020, and don’t expect them to become widely used until long after that.
A Tesla spokesperson was quick to note to Fortune that customer education about what Autopilot can do is critical.
“Tesla Autopilot is designed to provide a hands-on experience to give drivers more confidence behind the wheel, increase their safety on the road, and make highway driving more enjoyable,” the spokesperson says. “Autopilot is by far the most advanced such system on the road, but it does not turn a Tesla into an autonomous vehicle and does not allow the driver to abdicate responsibility. Since the release of Autopilot, we’ve continuously educated customers on the use of the feature, reminding them that they’re responsible for remaining alert and present when using Autopilot and must be prepared to take control at all times.”
Thomann, the latest driver to be involved in an Autopilot-related crash, doesn’t appear willing to back down. While he acknowledged that he could have been more careful, he said that he trusted his car to do the right thing. This time, it didn’t.
“Yes, I could have reacted sooner, but when the car slows down correctly 1,000 times, you trust it to do it the next time, too,” he wrote. “My bad.”