Google’s self-driving cars have to navigate through a potpourri of legal, regulatory, and technical problems before they’ll be ready for public use. Last month’s incident with a city bus—Google’s first at-fault accident—is a reminder of those technical challenges.
In short, Google’s self-driving cars have a lot more to learn. Chris Urmson, project director for the self-driving car team, detailed what his team is doing to solve those challenges during a keynote at South by Southwest, the annual interactive, film, and music conference in Austin.
The underlying message of his talk is that this technology is going to come out incrementally. And when it does, the software will have to show that it’s better at driving than people.
While Urmson rehashed a lot of information that’s been shared publicly, he did make a few surprising comments and shared some interesting details about what the self-driving car team is developing. Here are a few of them:
On the accident involving a city bus and Google’s self-driving car
“This was a tough day for us,” Urmson said, describing the February 14 incident in which a Google self-driving car hit a city bus at about two miles per hour. In the aftermath, the company ran about 3,500 hours of tests to ensure it wouldn’t happen again.
But make no mistake, there will be more accidents
“I’m confident that we’re going to have another day like our Valentine’s Day accident,” Urmson said. “And we’ll most certainly have worse days like that over time. This technology is so important, if we can band together and think of the net benefit to society then we can actually make it happen.”
There’s a team dedicated to making the self-driving car’s life difficult
The autonomous vehicle software is put through about three million miles of simulation testing daily, and a team is dedicated to coming up with crazy scenarios.
While this team has been alluded to before, there hasn’t been much detail about the kinds of odd scenarios being run through the software. For example, the team has run a scenario in the simulator where people “snail” across the road. Snailing, similar to the planking phenomenon, is when a person drops to his or her belly and moves like a snail. If that were ever to happen, the Google car will now know what to expect.
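Google hasn’t described its simulator’s internals, but the snailing example suggests the general shape of scenario-based testing: script an edge case, replay it against the driving software, and check that the car stays safe the whole way through. Here’s a minimal sketch of that pattern in Python; the `Scenario` class, the `drive` policy callback, and the 2-meter pass criterion are all invented for illustration, not Google’s tooling.

```python
from dataclasses import dataclass
from typing import Callable

# A generic scenario-testing pattern, invented for illustration;
# Google's simulator and its APIs are not public.

SAFE_GAP_M = 2.0  # assumed pass criterion: never get closer than this

@dataclass
class Scenario:
    name: str
    obstacle_position_m: float  # where the lane is blocked
    duration_s: float           # how long the obstacle stays there

def run_scenario(scenario: Scenario, drive: Callable[[float], float]) -> bool:
    """Step a driving policy through the scenario in 100 ms ticks,
    checking that the car always keeps a safe gap to the obstacle."""
    car_position, t = 0.0, 0.0
    while t < scenario.duration_s:
        gap = scenario.obstacle_position_m - car_position
        if gap < SAFE_GAP_M:
            print(f"{scenario.name}: FAILED at t={t:.1f}s")
            return False
        car_position += drive(gap) * 0.1  # policy returns speed in m/s
        t += 0.1
    print(f"{scenario.name}: passed")
    return True

# A person "snailing" across the road blocks the lane for a long time,
# so a passing policy must stop and wait rather than try to time a gap.
snailing = Scenario("snailing pedestrian", obstacle_position_m=20.0,
                    duration_s=60.0)
run_scenario(snailing, drive=lambda gap: min(10.0, max(0.0, gap - 5.0)))
```

The point of the pattern is that a scripted oddity only has to be written once; after that, every new build of the software gets checked against it automatically.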
Google’s self-driving car has encountered some crazy stuff—most of which the team didn’t consider
Urmson listed a few, including:
- Human Frogger: A group of people crossed the road in front of the self-driving car, reenacting the 1980s video game. “Please don’t do this,” Urmson pleaded.
- “Some dude” who came out of nowhere, rolled onto the car’s hood, and ran away.
- A guy who came running out, naked, to check out the car.
- A woman in an electric wheelchair using a broom to chase a duck out of the middle of the street.
The Google self-driving car doesn’t have to be able to recognize every odd situation. It just has to know when something is out of the ordinary. The cars have “anomaly detection,” which flags moments when the vehicles don’t know what is happening. The cars respond by slowing down and letting the situation play out before continuing on, Urmson explained.
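Urmson didn’t say how the detection works under the hood, but the behavior he describes can be sketched in a few lines of Python. Everything below, including the `TrackedObject` type, the confidence threshold, and the speed values, is a hypothetical illustration of the slow-down-and-wait response, not Google’s actual software.

```python
from dataclasses import dataclass

# Hypothetical types and thresholds, for illustration only; Google has
# not published how its anomaly detection actually works.

@dataclass
class TrackedObject:
    label: str          # best-guess classification, e.g. "pedestrian"
    confidence: float   # classifier confidence, 0.0 to 1.0
    is_moving: bool

CONFIDENCE_THRESHOLD = 0.8   # assumed cutoff for "we know what this is"
CRUISE_SPEED_MPH = 25.0
CREEP_SPEED_MPH = 2.0

def is_anomaly(obj: TrackedObject) -> bool:
    """The car doesn't have to recognize the situation, only notice
    that it can't recognize it."""
    return obj.confidence < CONFIDENCE_THRESHOLD

def choose_speed(objects: list[TrackedObject]) -> float:
    """Slow down for anything unrecognized and let the situation play
    out before continuing on."""
    anomalies = [o for o in objects if is_anomaly(o)]
    if not anomalies:
        return CRUISE_SPEED_MPH
    if any(o.is_moving for o in anomalies):
        return 0.0               # stop and wait while things play out
    return CREEP_SPEED_MPH       # creep past a static unknown
```

In this sketch, a duck-chasing wheelchair would never classify cleanly, so `choose_speed` would hold the car at a stop until the scene cleared.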
Google cars are paranoid
The goal, Urmson said, is to make the self-driving cars drive a little more smoothly while still maintaining that paranoia. Instead of jumping at shadows for fear of missing something, the cars should be able to recognize that those are just shadows and not worry about them.