Allowing Tesla Owners to Beta Test Autonomous Functionality on Public Roads Is a Safety Hazard

Keith Barry, Consumer Reports:

FSD beta 9 is a prototype of what the automaker calls its “Full Self-Driving” feature, which, despite its name, does not yet make a Tesla fully self-driving. Although Tesla has been sending out software updates to its vehicles for years—adding new features with every release—the beta 9 upgrade has offered some of the most sweeping changes to how the vehicle operates. The software update now automates more driving tasks. For example, Tesla vehicles equipped with the software can now navigate intersections and city streets under the driver’s supervision.

“Videos of FSD beta 9 in action don’t show a system that makes driving safer or even less stressful,” says Jake Fisher, senior director of CR’s Auto Test Center. “Consumers are simply paying to be test engineers for developing technology without adequate safety protection.”

One owner’s car repeatedly drove over a double yellow line while creeping into an intersection. Another’s confused the moon with a yellow traffic light. Tesla excuses this by pointing to the cautionary statement on the in-car display that the system is “not a substitute for an attentive driver”, and by noting that the software is being rolled out only to owners who signed up to participate in pre-release testing.

But the fact of the matter is that these features are being marketed as “Full Self-Driving” and “Autopilot”. Unlike other automakers whose cars offer automatic lane keeping and radar-assisted cruise control, Tesla pitches these features not as part of a safety enhancement package but as autonomous vehicle technology. The company cannot be unaware of how owners are using these features, and of how that use subjects other drivers, pedestrians, and cyclists to a beta test at great risk to public safety.

It is also true that human drivers make mistakes. Not every driver on the road is equally competent, and Tesla’s system may well be better than some human drivers. But partially autonomous systems can lull the human operator into a false sense of security, with sometimes deadly consequences.