


Floormaster Squeeze

In the pet study, I was not impressed with the statistical "significance" of the results to begin with, but even less so when I realized their unit of analysis was not families but "family-weeks". That is, they analyzed the impact of current pet presence on current family health each week. That seems all kinds of mixed up, and any large-N effects are overstated (aren't the effects on immunity cumulative? aren't immunity effects lagged? aren't a single family's levels of sickness and health from week to week highly related to each other? doesn't excluding urban families bias the results?).

I agree that pet owners have many potential confounding factors, but surely excluding urban families is even worse. They really seemed to be mixed up about the relationships among pets, pets being outdoors, people being with pets, people being with people, and people being outdoors.

Budding Scientist

It's important to note that none of the scientists involved in either of these studies actually said ANY of those cause-effect statements. Those come from Susan E. Matthews and "MyHealthNewsDailyStaff", respectively. The researchers KNOW they aren't doing Freakonomics. Somebody needs to tell the idiots who skim these papers and write articles about them. This has been a constant problem ever since science was first in the news. So, blame where blame is due, I guess.


Budding Scientist: One of the study researchers was cited in the article speculating on all kinds of reasons why animals could have caused disease, so they were definitely providing cover for the reporter. This is only one example that I picked up on the day I wanted to write about causation creep; I come across examples like this every week. In Freakonomics, when they told us some economist wanted to change her surname to start with A, or that parents should consider timing the births of their kids in order to turn them into star athletes, they fell into the same trap. It's very easy to make careless statements like these.

Michael L.

I'm not a statistician but a quantitative social scientist, so I'm familiar with the issue you raise in the post. I have two comments.

One is that just as correlation does not imply causation, it also does not imply a lack of causation. A correlation may or may not reflect a causal relationship. So as I see it, the problem isn't so much suggesting that a correlation might indicate that causality is involved; it is suggesting this without appropriate qualification, and/or without making a convincing case against plausible alternative explanations.

The other is that at the end of your post you refer to how hard it is to prove causation with data that is "conveniently collected." But in science we never prove causation, if "prove" is being used the way that term is used in mathematics and philosophy. Even if one has conducted a randomized trial and found a causal effect, the case for such an effect hasn't been proven. Random assignment is the "gold standard" way of increasing the likelihood of balance between the treatment and control groups, which is what allows us to conclude that causality is involved, but it does not guarantee such balance. And a finding of statistical significance is not a proof.

Of course, I'm not saying anything that you don't already know. I was just worried that your comment about proving causality might reinforce another misconception about science and statistics that I often see in media discussions: that scientific studies are about proving things.


Do you have a link to where the abortion hypothesis was debunked?


Tom: You can start with the generally neutral list of references from Wikipedia (link).

Michael: Thanks for the note, which I totally agree with. What I'm against is the sloppy language used to make the limited conclusions sound exciting. If the researcher admits that he/she doesn't know why A is correlated with B, and there is nothing in the study to back A as a cause of B, then don't suggest that people engage in behavior A in the hope of attaining outcome B.


Business analytics and data visualization expert. Author and Speaker. Currently at Columbia. See my full bio.
