
Comments


Kevin Henry

So, "every accident investigation will discover hands not on the wheel"? Logically, then, putting your hands on the wheel - exactly as Tesla instructs - will guarantee that you don't get in an accident. Great! If that were true, then you'd be right that this finding isn't very interesting.

But of course you're ignoring all the possible alternative explanations for the accident. What if the driver had tried to turn, but the autopilot overruled him? What if the turning mechanism had malfunctioned? What if the battery had caught fire and the resulting smoke blocked the driver's view? And so on. The fact that the NTSB report didn't offer any of these alternative explanations, but instead offered an explanation implying that the accident could have been avoided if the driver had operated the car safely, is actually quite relevant.

Of course, it's also very important to understand why the car made the steering error in the first place, and why it didn't detect the obstruction. But the preliminary NTSB report offers no information on those topics.

As for why the car accelerated, that's no mystery. The car in front moved out of the way, and the Tesla didn't detect any other obstructions. Since it was moving at a speed slower than the target speed, it accelerated.
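As a rough illustration of that logic, here is a toy sketch in Python of the kind of rule being described; the function name, numbers, and acceleration step are all made up for illustration and are not Tesla's actual control code.

    def cruise_control_step(current_speed, target_speed, lead_vehicle_detected,
                            accel_step=1.0):
        """Toy traffic-aware cruise control rule (illustrative only).

        If no lead vehicle is detected and the car is below its set target
        speed, it speeds up; otherwise it holds speed. All names and the
        accel_step value are assumptions, not Tesla's actual logic.
        """
        if lead_vehicle_detected:
            return current_speed          # follow the lead car (details omitted)
        if current_speed < target_speed:
            return min(current_speed + accel_step, target_speed)  # speed up
        return current_speed              # already at the target speed

    # The scenario described above: the lead car moves out of the way,
    # no other obstruction is detected, and the car is below its target speed.
    speed = 62.0   # mph, hypothetical
    target = 75.0  # mph, hypothetical driver-set target
    speed = cruise_control_step(speed, target, lead_vehicle_detected=False)
    print(speed)   # 63.0 -- the car accelerates toward the target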

Finally, your question as to why the car was allowed to go above the speed limit is a strange one. I mean, why does your car, and every other car, allow you to go above the speed limit? There are social and technical reasons why we don't require cars to enforce speed limits, but they don't have anything to do with Tesla or this accident in particular.

Kaiser

KH: Thanks for participating in the conversation. You're missing my point about the nature of the evidence. The Wald problem addresses the issue of selection bias. I'm not drawing causal conclusions about what caused the accident; however, many people, including yourself, are proclaiming that the driver should have had his hands on the wheel, with the presumption that this would have avoided the accident. In fact, this line of thinking is perfectly aligned with my point that for the cars that crashed, you'll very likely find no hands on the wheel.
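To make the selection-bias point concrete, here is a small simulation sketch; every rate in it is invented for illustration, and none comes from the NTSB report or from Tesla.

    import numpy as np

    rng = np.random.default_rng(42)

    # Invented rates, purely for illustration.
    n_trips = 1_000_000
    p_hands_off = 0.6   # assume hands-off driving is common under autopilot
    p_crash = 1e-4      # assume the crash rate does NOT depend on hand position

    hands_off = rng.random(n_trips) < p_hands_off
    crashed = rng.random(n_trips) < p_crash

    # Investigators only examine the trips that ended in a crash.
    print(f"hands-off share among crashes:   {hands_off[crashed].mean():.2f}")
    print(f"hands-off share among all trips: {hands_off.mean():.2f}")
    # Both shares come out around 0.6: finding "hands off the wheel" in a
    # crash report, by itself, says nothing about whether hands-off caused
    # the crash, because we never see the hands-off trips that ended safely.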

As a data scientist, my concern is how to improve the underlying algorithms to prevent such a tragedy. It's clear that the algorithm failed to detect an impending danger and, in fact, made the wrong decision to accelerate.

Re: speed limits. In the world of pure self-driving cars, should speed limits be enforced?

The NTSB report at this stage is descriptive. They have not offered any causal conclusions at all.

Glen DePalma

This point from NTSB is strange (wrong?): the car with the single driver was in the HOV lane, whose restrictions should have been in effect at that time in the morning.

Living in Sunnyvale, I know that electric cars are allowed in HOV lanes. Are there exceptions anywhere? Every HOV lane I've seen allows motorcycles and electric/hybrid cars.

Kaiser

GdP: Thanks for the correction. I was not aware that hybrids are allowed on all HOV lanes. The NTSB only points out the facts. I interpreted the HOV restrictions to be in place based on the 9:30 am time frame, so it's not their mistake. This webpage is useful: https://arb.ca.gov/msprog/carpool/carpool.htm

Kevin Henry

I understand the point about Wald. The problem is that, in order to make it, you've had to assume something obviously untrue: that you can't get in an accident if your hands are on the wheel. You've simply assumed away a huge range of possible problems that people are concerned about when it comes to autonomous driving. What if the driver had tried to turn the wheel but the car hadn't let him? And so forth.

I agree that the report doesn't prove that the driver could have avoided the accident. It would be impossible to prove that. I agree that you would need data on avoided accidents to fully evaluate the performance of the car. That's obvious. The point is that the report is entirely consistent with what Tesla says about the car, and how to safely operate it. The fact that the report is consistent with that, and not inconsistent, is relevant and newsworthy.

Thinking about how to structure society (and algorithms) to make autonomous driving safe is laudable, but I think your attempts to tie it to this accident report have fallen well wide of the mark. "Should society require the self-driving cars of the future to enforce speed limits?" is a great question. "Should the car in this particular accident have been allowed to exceed the speed limit?" doesn't make much sense.

Antonio Rinaldi

If the driver puts his hands on the wheel while autopilot is on, could the car steer against his will? I ask in the hope that some expert in the field can reply.

Kaiser

KH: Let's see what you're saying: if the driver had his hands on the wheel, the accident is the driver's fault, and if the driver did not have his hands on the wheel, it's also the driver's fault. So we conclude that all accidents are the driver's fault.
