What’s in a name? For Tesla’s Full Self Driving, it may be danger

November 8, 2020, 8:00 PM UTC

Federal regulators perked up when Tesla last month started letting some customers test new autonomous driving software that it had added to their cars. The regulators said they would “monitor the new technology closely” and “take action to protect the public against unreasonable risks to safety.”

One of those safety risks may be the name of Tesla’s software: “Full Self Driving.” In its current form, the software can take over from drivers to park, stay in a lane, cruise on highways and stop at red lights, but it does not make vehicles “self-driving.”

By industry standards, Tesla’s system is considered an advanced driver assistance system, or ADAS, not autonomous driving technology. There is hope that ADAS will increase road safety overall. But there is also concern that if drivers place too much faith in assistive technology, they may become inattentive, undermining those safety benefits.

Names and marketing that imply higher levels of autonomy for an ADAS system increase the risk of distracted or irresponsible driving, according to a study published in September by the AAA Foundation for Traffic Safety. In the study, 90 drivers were told they were about to test drive a vehicle with driver assistance features.

Half of the drivers were told the system was called “AutonoDrive” and given introductory information that focused on how it would reduce their workload. The other half were told they would be using a system called “DriveAssist,” and that they had to monitor it.

The system’s actual capabilities were described accurately to each group.

When surveyed after this introduction, 42% of drivers in the “AutonoDrive” group overestimated the capabilities of the system, while only 11% of drivers in the “DriveAssist” group made the same mistake. “AutonoDrive” users also reported greater “willingness to engage in potentially distracting or risky behaviors” while using the system, and greater belief that the system would prevent a collision. In fact, the system was unlikely to prevent crashes in the scenarios presented, according to researchers.

Both groups of drivers then test-drove vehicles equipped with Cadillac’s Super Cruise driver assistance system (though the system’s real name was not disclosed). Drivers who had been told they were using “AutonoDrive” spent 30% more time with their hands away from the wheel than those told they were using “DriveAssist.” Subjects in the “AutonoDrive” group were also much more likely to respond slowly (more than five seconds) to directions from the car’s automated system to take control of the vehicle.

“The results shouldn’t be very surprising,” says Greg Brannon, AAA’s director of automotive engineering and industry relations. “If you use a name that overestimates the capabilities of the system … it sets a context for a driver that the system perhaps can do more than it can.”

Brannon was not directly involved in the September study, but he leads an initiative by AAA and other auto industry groups to set standardized naming conventions for driver-assistance and automated driving systems.

The fact that systems like Tesla’s are designed to improve over time could add to user confusion. Tesla and its CEO, Elon Musk, have frequently predicted that the Full Self-Driving software will eventually enable a Tesla to drive itself without any human oversight, and even operate as a robotic taxi. That would be considered at least Level 4 automation, according to classifications by the Society of Automotive Engineers. But today, Tesla still emphasizes that Full Self-Driving requires “active driver supervision,” meaning it qualifies as Level 2 automation on the SAE scale.

The resulting confusion is widespread: even some automotive publications refer to the software as “fully autonomous” when Tesla explicitly says it’s not. Some regulators have cried foul at the disconnect – a German court, for instance, ruled in July that “Autopilot” branding was misleading.

Tesla did not reply to requests for comment from Fortune about its naming practices.

Brannon says Tesla isn’t AAA’s sole concern, citing names like Volvo’s Pilot Assist and Daimler’s Drive Pilot as also potentially confusing. Confusing names could become a matter for the Federal Trade Commission, which enforces truth-in-advertising rules. But U.S. auto safety regulators haven’t created any naming standards, and Brannon doesn’t expect them to.

“We understand that automakers are likely to continue the individualized marketing of these systems under different names,” Brannon said.

Instead, he hopes manufacturers adopt standard names for specific system features, such as automatic lane keeping or automatic emergency braking, under the broader umbrella of their system brand names. That could help reduce driver confusion, while also making it easier to compare the performance of different vehicles’ features using uniform tests.

Some within the autonomous vehicle industry, though, would prefer even more cautious terminology.

“As an industry, we are at the point where having the right public perception of the technology is important,” says Gautam Narang, CEO of Gatik, a startup that’s working with companies like Walmart to develop autonomous delivery systems for vans or trucks that travel fixed routes.

Narang says those fixed routes, chosen for their predictability, are a better fit for what truly autonomous vehicles are capable of right now. He’s concerned that by promising too much, manufacturers like Tesla could create confusion – and perhaps, when things go wrong, long-term disillusionment.

“The branding ‘Full Self Driving’ is a terrifying contradiction,” says Narang, “because it can’t actually be driven without close human supervision.”
