On March 23, 2018, just after 9:30 am, a Tesla SUV plowed straight into a road divider near the exit ramp from Route 101 to Route 85 in Mountain View, California. The impact caused the battery to burst into flames, and the driver was trapped in his seat by his seat belt. He later died at the hospital.
Last week, the National Transportation Safety Board (NTSB) issued a preliminary report providing further details. Here are the key findings:
- The "AutoPilot" feature was switched on at the time of the crash. It was on continuously for the last 19 minutes of the drive (out of 32 minutes total).
- The driver's hands were not on the wheel in the last 6 seconds before the crash. In the last 60 seconds, hands were detected on the wheel on three separate occasions, totaling 34 seconds.
- In the final 3 seconds, still with no hands on the wheel, the car accelerated from 62 mph to 71 mph, with "no pre-crash braking or evasive steering movement detected."
- The Autopilot system was set to a cruise speed of 75 mph, which is above the speed limit of 65 mph.
- During the 19-minute stretch before the crash, the Autopilot system warned the driver three times to put hands on the wheel; the last of these warnings came more than 15 minutes prior to the crash.
- The car, with only the driver aboard, was in the HOV lane, whose restrictions should have been in effect at that time of the morning.
These are all telling observations. What did the reporter covering the Tesla beat at the Wall Street Journal think was the most salient finding? Well, the title of the article says it all: "Tesla Autopilot System Warned Driver to Put Hands on Wheel, U.S. Investigators Say." (link)
It is hard to imagine how that particular finding is the most important... unless you work for Tesla, or for someone who wants to pin the blame on a driver who could not defend himself.
Problem #1
The Autopilot was continuously turned on for almost 20 minutes prior to the crash. The last warning to the driver happened "more than 15 minutes" before the crash. When driving at 60 mph and above, 15 minutes puts the car 15 miles or more from the crash site. For example, the last warning might have been issued when passing Palo Alto, but the crash happened 15 minutes later in Mountain View. Rather than supporting the view that the driver recklessly ignored repeated warnings, this finding raises the question of why the car did not detect the imminent collision and take evasive action.
Problem #2
"Hands not on wheel" is taken to be a finding of major importance, but here is the problem: every accident investigation will discover hands not on the wheel. Why? Because our sample consists only of car-casses (pun intended).
If the driver had taken evasive steering action, it would have compensated for computer error, and the accident would have been avoided!
This is the reverse of the classic Wald paradox that I covered some years ago. In that example, statistician Abraham Wald warned that you couldn't inspect the damage on warplanes that returned home to determine which parts of the planes were most subject to damage - because your sample is missing the planes that were shot down!
In our example, we get to see the dead but not the survivors. In order to understand whether the computer made errors, we'd need to also include cases in which the driver prevented the Autopilot from crashing the car.
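To make this selection effect concrete, here is a minimal simulation sketch in Python. The probabilities are made up purely for illustration - they are not estimates from the NTSB report or from any real Tesla data. The point is only that if hands-on drivers usually catch a computer error in time, the crashes we get to observe will be dominated by hands-off cases, no matter how common hands-off driving actually is.

```python
import random

random.seed(1)

# All numbers below are made up for illustration only.
P_HANDS_OFF = 0.30      # fraction of risky moments with hands off the wheel
P_SYSTEM_ERROR = 0.01   # chance the driving system makes a steering error
P_DRIVER_SAVES = 0.90   # chance a hands-on driver corrects the error in time

N = 1_000_000
crashes_hands_off = 0
crashes_hands_on = 0

for _ in range(N):
    hands_off = random.random() < P_HANDS_OFF
    if random.random() >= P_SYSTEM_ERROR:
        continue  # no computer error, hence no crash in this toy model
    # A crash happens only when nobody corrects the computer's error.
    if hands_off:
        crashes_hands_off += 1
    elif random.random() >= P_DRIVER_SAVES:
        crashes_hands_on += 1

total = crashes_hands_off + crashes_hands_on
print(f"Hands off the wheel, all moments:   {P_HANDS_OFF:.0%}")
print(f"Hands off the wheel, crashes only:  {crashes_hands_off / total:.0%}")
```

With these made-up numbers, roughly 80 percent of simulated crashes involve hands off the wheel even though hands are off only 30 percent of the time overall. The corrected errors never show up in any accident file - which is exactly the survivorship problem.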
Problem #3
Given that the driver did not turn the wheel in the final seconds, why did the Tesla accelerate? Why was it allowed to exceed the speed limit? These questions point to technical challenges with the system itself.
So, "every accident investigation will discover hands not on the wheel"? Logically, then, putting your hands on the wheel - exactly as Tesla instructs - will guarantee that you don't get in an accident. Great! If that were true, then you'd be right that this finding isn't very interesting.
But of course you're ignoring all the possible alternative explanations for the accident. What if the driver had tried to turn, but the autopilot overruled him? What if the turning mechanism had malfunctioned? What if the battery had caught fire and the resulting smoke blocked the driver's view? And so on. The fact that the NTSB report didn't offer any of these alternative explanations, but instead offered an explanation implying that the accident could have been avoided if the driver had operated the car safely, is actually quite relevant.
Of course, it's also very important to understand why the car made the steering error in the first place, and why it didn't detect the obstruction. But the preliminary NTSB report offers no information on those topics.
As for why the car accelerated, that's no mystery. The car in front moved out of the way, and the Tesla didn't detect any other obstructions. Since it was moving at a speed slower than the target speed, it accelerated.
Finally, your question as to why the car was allowed to go above the speed limit is a strange one. I mean, why does your car, and every other car, allow you to go above the speed limit? There are social and technical reasons why we don't require cars to enforce speed limits, but they don't have anything to do with Tesla or this accident in particular.
Posted by: Kevin Henry | 06/13/2018 at 07:04 AM
KH: Thanks for participating in the conversation. You're missing my point about the nature of the evidence. The Wald problem addresses the issue of selection bias. I'm not drawing causal conclusions about what caused the accident; however, many people, including yourself, are proclaiming that the driver should have had his hands on the wheel, with the presumption that this would have avoided the accident. In fact, this line of thinking is perfectly aligned with my point that for the cars that crashed, you will very likely find no hands on the wheel.
As a data scientist, I am concerned with how to improve the underlying algorithms to prevent such tragedies. It's clear that the algorithm failed to detect the impending danger and, in fact, made the wrong decision to accelerate.
Re: speed limits. In the world of pure self-driving cars, should speed limits be enforced?
The NTSB report at this stage is descriptive. They have not offered any causal conclusions at all.
Posted by: Kaiser | 06/13/2018 at 10:12 AM
This point from the NTSB is strange (wrong?): "The car, with only the driver aboard, was in the HOV lane, whose restrictions should have been in effect at that time of the morning."
I live in Sunnyvale, and electric cars are allowed in the HOV lanes here. Are there exceptions anywhere? Every HOV lane I've seen allows motorcycles and electric/hybrid cars.
Posted by: Glen DePalma | 06/13/2018 at 12:24 PM
GdP: Thanks for the correction. I was not aware that hybrids are allowed in all HOV lanes. The NTSB only points out the facts; I interpreted the HOV restrictions to be in effect based on the 9:30 am time frame, so it's not their mistake. This webpage is useful: https://arb.ca.gov/msprog/carpool/carpool.htm
Posted by: Kaiser | 06/13/2018 at 02:01 PM
I understand the point about Wald. The problem is that, in order to make it, you've had to assume something obviously untrue: that you can't get in an accident if your hands are on the wheel. You've simply assumed away a huge range of possible problems that people are concerned about when it comes to autonomous driving. What if the driver had tried to turn the wheel but the car hadn't let him? And so forth.
I agree that the report doesn't prove that the driver could have avoided the accident. It would be impossible to prove that. I agree that you would need data on avoided accidents to fully evaluate the performance of the car. That's obvious. The point is that the report is entirely consistent with what Tesla says about the car, and how to safely operate it. The fact that the report is consistent with that, and not inconsistent, is relevant and newsworthy.
Thinking about how to structure society - and algorithms - to make autonomous driving safe is laudable, but I think your attempts to tie it to this accident report have fallen well wide of the mark. "Should society require the self-driving cars of the future to enforce speed limits?" is a great question. "Should the car in this particular accident have been allowed to exceed the speed limit?" doesn't make much sense.
Posted by: Kevin Henry | 06/14/2018 at 09:16 AM
If the driver has his hands on the wheel while Autopilot is on, could the car steer against his will? I ask in the hope that some expert in the field can reply.
Posted by: Antonio Rinaldi | 06/24/2018 at 02:39 PM
KH: Let's see what you're saying: if the driver had his hands on the wheel, then the accident is the driver's fault, and if the driver did not have his hands on the wheel, it's also the driver's fault. So we conclude that all accidents are the driver's fault.
Posted by: Kaiser | 06/26/2018 at 10:13 AM