Andrew Gelman pointed readers of his blog to a press release / report that is a great read for anyone interested in how statisticians look at real-world data. The work was done by the Highway Loss Data Institute (HLDI), which made the Freakonomics-style claim: "Texting bans don't reduce crashes; effects are slight crash increases."
This is a long post, as I'm going to walk through the analysis in detail. In short: great methodology, bad conclusion. (Bear with me, I'll get to the spade part.)
***
What's good about the piece is that they found a creative yet credible way to approach a seemingly impossible problem. Ideally, to answer the question "Do texting bans reduce crashes?", you would ban texting for one random group of drivers, allow texting for another random group, and then compare the accident rates of the two groups. Of course, we can't do that. As with most public policies, once the policy is enacted, you don't have test and control groups for direct comparison.
What helped is that not all states have banned texting (yet). So the researchers hit on the idea of comparing a text-ban state with surrounding states that have not banned texting. The chart on the right shows the accident rates in California (a text-ban state) compared with those of Arizona, Nevada and Oregon, before and after the ban went into effect in 2009.
The two lines are essentially the same, implying that the texting ban did not change the accident rates.
What's impressive about this analysis is that they did not make the classic mistake of looking only at the red line. (If the red line shifted a lot after the text ban but so did the blue line, then something other than the text ban was probably causing the accident rates to change.)
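For readers who want to see the mechanics, here is a minimal sketch of the difference-in-differences logic behind this design, in Python. All the numbers are invented for illustration; none of HLDI's actual data appears here.

```python
import numpy as np

# Invented monthly claim frequencies (collision claims per 100 insured
# vehicles), six months before and after the ban.
ca_pre  = np.array([3.1, 3.0, 3.2, 3.1, 2.9, 3.0])  # ban state, before
ca_post = np.array([3.0, 3.1, 3.0, 2.9, 3.1, 3.0])  # ban state, after
nb_pre  = np.array([3.3, 3.2, 3.4, 3.3, 3.1, 3.2])  # neighbors, before
nb_post = np.array([3.2, 3.3, 3.2, 3.1, 3.3, 3.2])  # neighbors, after

# Difference-in-differences: the ban's estimated effect is the change
# in the ban state minus the change in the comparison states. Looking
# only at the "red line" (ca_pre vs ca_post) would credit the ban with
# whatever moved both lines at once.
effect = (ca_post.mean() - ca_pre.mean()) - (nb_post.mean() - nb_pre.mean())
print(f"Estimated effect of the ban: {effect:+.2f} claims per 100 vehicles")
```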
***
Now we get to the spade part. If you persisted and read to the last 25% of the press release, you learned something super-important. There, they tell us:
Noncompliance is a likely reason texting bans aren't reducing crashes. ... Many respondents who knew it was illegal to text said they *didn't think police were strongly enforcing the bans*. (my italics)
So, here are the three super-important words: nonawareness, noncompliance, nonenforcement. It sounds like the "texting ban" has not stopped anyone from texting while driving, and that alone can explain the unchanged accident rates. Yet when they state the claim "texting bans don't reduce crashes", it sends the message that drivers stopped texting and accidents remained as frequent as before. That's absolutely not what the research is saying.
In other words, the test group is called "texting ban" but in reality, the behavior of these drivers was no different before and after the ban. (According to surveys, of the 48% of drivers who admit to texting while driving, 45% said they ignored the ban.) But naming something is very powerful: we think text-ban states have drivers who don't text, yet with compliance close to zero, text-ban states are essentially the same as non-text-ban states.
Imagine a clinical trial: drug A is a pill; drug B is administered by a painful injection. Drug B is in fact extremely effective. However, in this trial, drug A beats drug B handily. Looking more deeply, you discover that nobody in group B actually took drug B because the pain was too much. So in reality, the trial compared drug A to no drug at all. A good conclusion for this study is: taking drug A is better than nothing, and no one wants to take drug B. A poor conclusion is: taking drug A is better than taking drug B.
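The same logic can be put in code. The sketch below uses my own assumptions, not HLDI's (texting doubles crash risk; the ban affects crashes only through compliance), to show that with zero compliance the "ban" group is indistinguishable from the no-ban group, no matter how dangerous texting is.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000  # drivers per group

# Assumptions for illustration only:
base_rate = 0.02          # annual crash probability for a non-texter
texting_multiplier = 2.0  # texting doubles that probability
share_who_text = 0.48     # from the survey cited above
compliance = 0.0          # the post's point: essentially no one complied

def crash_rate(ban):
    texting = rng.random(n) < share_who_text
    if ban:
        # Compliant texters stop texting; with compliance = 0, nobody does.
        texting &= rng.random(n) >= compliance
    p = np.where(texting, base_rate * texting_multiplier, base_rate)
    return (rng.random(n) < p).mean()

print(f"ban state:    {crash_rate(ban=True):.4f}")
print(f"no-ban state: {crash_rate(ban=False):.4f}")
# With compliance = 0 the two rates agree (up to noise): the comparison
# measures the law on paper, not the effect of drivers not texting.
```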
***
Beyond this, savvy readers should be asking these questions:
- How is the accident rate measured? The metric is collision claims per 100 insured vehicles; how well does that correlate with the rate of accidents caused by texting? (See the back-of-the-envelope sketch after this list.)
- Why are vehicles more than 9 years old excluded? What impact might this exclusion have?
- What do they mean by nearby states where texting laws "weren't substantially changed"? Substantially and substantively are very different; "substantially" suggests to me that the laws were changed, just not by much.
- Notice the implicit assumption that neighboring states are comparable.
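On the first bullet, a back-of-the-envelope sketch, with invented numbers: if texting-related crashes are a small slice of all crashes, even a perfectly obeyed ban barely moves the overall claim frequency.

```python
# Invented numbers for illustration only.
overall_rate = 3.0    # collision claims per 100 insured vehicles
texting_share = 0.05  # assume 5% of claims involve texting
after_perfect_ban = overall_rate * (1 - texting_share)
print(f"before ban: {overall_rate:.2f}  "
      f"after a perfectly obeyed ban: {after_perfect_ban:.2f}")
# A 5% drop is easy to lose in month-to-month noise, and a metric that
# lumps all crashes together cannot isolate the texting-related ones.
```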
***
Finally, what is the evidence for "effects [of texting bans] are slight crash increases"? The HLDI President elaborated: "In a perverse twist, crashes increased in 3 of the 4 states we studied after bans were enacted."
The problem: that is simply not true. It turns out California is counted as one of the 3 states. Look back at the graph shown above, and try to convince yourself that accident rates increased after the text ban went into effect. I can't. You don't even need statistical software for this.
Still not convinced? This is the chart for Minnesota compared with Iowa and Wisconsin, where the accident rate for Minnesota after the ban went up by 9%, the largest increase among the 4 text-ban states. The first thing you notice is that there is virtually no difference between the states after the law went into effect. All of that 9% difference had to do with something that happened about 5 to 10 months prior to the ban, and if anything, that period should be investigated. I can't see how one can conclude that the text ban caused accident rates to go up.
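Here's a toy illustration, with invented numbers, of how a shift that happens months before a ban can masquerade as a "post-ban increase" in a naive pre/post comparison:

```python
import numpy as np

# Invented numbers: the monthly gap between Minnesota and its neighbors
# (claims per 100 insured vehicles) jumps six months BEFORE the ban
# and stays flat afterward.
months = np.arange(-12, 12)            # month 0 = ban takes effect
gap = np.where(months < -6, 0.0, 0.3)  # the jump predates the ban

pre, post = gap[months < 0], gap[months >= 0]
print(f"average pre-ban gap:  {pre.mean():.2f}")
print(f"average post-ban gap: {post.mean():.2f}")
# A naive pre/post comparison attributes the entire jump to the ban,
# even though the series moved months before the law existed.
```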
***
But the researchers didn't stop there. They now declare story time! First, they managed to convince themselves that noncompliance was not really an issue:
But [noncompliance] doesn't explain why crashes increased after texting bans. If drivers were disregarding the bans, then the crash patterns should have remained steady. So clearly drivers did respond to the bans somehow...
Absolute shocker. They say that because the texting bans increased accident rates (not even true, as shown above), drivers must have been responding to the bans, and therefore they could conclude that texting bans did not reduce accidents! My advice: go find statistics on how many drivers have been booked for texting while driving in those states.
Then, they spin more fiction, following on from that last sentence:
... what they might have been doing was moving their phones down and out of sight when they texted, in recognition that what they were doing was illegal. This could exacerbate the risk of texting by taking drivers' eyes further from the road and for a longer time.
A great-sounding story, but here's the problem: it sounds smart if the accident rate went up, and silly if it didn't. Suppose I buy this story; then the onus is on the researchers to explain why drivers moving their phones down and out of sight did *not* in fact lead to increased accident rates.
This is the trouble with story time. These stories are made up to explain one particular interpretation of the observed data, and they do not hold up if the data were interpreted differently. Either the stories have to be reversed, or ever more elaborate extensions have to be invented.