Ken Klippenstein, The Intercept:
Highway surveillance footage from Thanksgiving Day shows a Tesla Model S vehicle changing lanes and then abruptly braking in the far-left lane of the San Francisco Bay Bridge, resulting in an eight-vehicle crash. The crash injured nine people, including a 2-year-old child, and blocked traffic on the bridge for over an hour.
I have seen an awful lot of people blaming this crash on the cars behind the Tesla following too closely. But watch the video carefully: the Tesla moves from the second lane into the first at low speed, cutting off the first car involved in the crash with only about four or five car lengths of space. It appears the Tesla driver was using one of the car’s autonomous systems at the time; there are conflicting reports about which erroneously named option — “Autopilot” or “Full Self-Driving” — was engaged.
Jason Torchinsky, The Autopian:
This isn’t news to people who pay attention. It’s been proven since 1948, when N.H. Mackworth published his study “The Breakdown of Vigilance During Prolonged Visual Search”, which defined what has come to be known as the “vigilance problem”. Essentially, the problem is that people are just not great at paying close attention to monitoring tasks, and if a semi-automated driving system is doing most of the steering, speed control, and other aspects of the driving task, the job of the human in the driver’s seat changes from one of active control to one of monitoring for when the system may make an error. The results of the human not performing this task well are evidenced by the crash we’re talking about.
I think it’s not unreasonable to think of Level 2 driving as potentially impaired driving, because the mental focus of the driver when engaging with the driving task from a monitoring approach is impaired when compared to an active driver.
I think this argument is worth considering. These semi-autonomous systems are playing the same sort of trick as ChatGPT: they offer a convincing but shallow impression of a competent driverless car without any broader context to fall back on.