
Comments


Floormaster Squeeze

In the pet study, I was not impressed with the statistical "significance" of the results to begin with, and even less so when I realized their unit of analysis was not families but "family-weeks". That is, they analyzed the association between current pet presence and current family health each week. That seems all kinds of mixed up, and any large-N effects are overstated (aren't the effects on immunity cumulative? aren't immunity effects lagged? aren't a single family's levels of sickness and health from week to week highly correlated with each other? doesn't excluding urban families bias the results?).

I agree that pet owners have many potential confounding factors, but surely excluding urban families is even worse. They really seemed to be mixed up about the relationships among pets, pets being outdoors, people being with pets, people being with people, and people being outdoors.
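To make the "family-weeks" objection concrete, here is a minimal simulation sketch (purely hypothetical numbers, not taken from the study) showing how treating correlated weekly observations from the same family as independent data points inflates the false-positive rate even when pets have no effect at all:

```python
# A minimal simulation (all numbers invented) of why "family-weeks" overstate significance:
# weekly outcomes within a family are highly correlated, so 200 families x 52 weeks
# are nowhere near 10,400 independent observations.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_families, n_weeks, n_sims = 200, 52, 500
false_pos_weeks, false_pos_families = 0, 0

for _ in range(n_sims):
    pet = rng.random(n_families) < 0.5             # pet ownership, no true effect on health
    family_effect = rng.normal(0, 1, n_families)   # stable family-level differences
    noise = rng.normal(0, 0.5, (n_families, n_weeks))
    health = family_effect[:, None] + noise        # weekly "health" scores, correlated within family

    # Naive analysis: every family-week treated as an independent observation
    t, p = stats.ttest_ind(health[pet].ravel(), health[~pet].ravel())
    false_pos_weeks += p < 0.05

    # Family-level analysis: one averaged observation per family
    means = health.mean(axis=1)
    t, p = stats.ttest_ind(means[pet], means[~pet])
    false_pos_families += p < 0.05

print("False-positive rate, family-weeks as unit:", false_pos_weeks / n_sims)   # well above 0.05
print("False-positive rate, families as unit:   ", false_pos_families / n_sims) # about 0.05
```

With a strong family-level component, the naive family-week analysis flags a "significant" pet effect far more often than the nominal 5%, while the per-family analysis stays close to 5%.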

Budding Scientist

It's important to note that none of the scientists involved in either of these studies actually said ANY of those cause-effect statements. Those come from Susan E. Matthews and "MyHealthNewsDailyStaff", respectively. The researchers KNOW they aren't doing Freakonomics. Somebody needs to tell the idiots who skim these papers and write articles about them. This has been a constant problem ever since science was first in the news. So, blame where blame is due, I guess.

Kaiser

Budding Scientist: One of the study researchers was cited in the article speculating on all kinds of reasons why animals could have caused disease, so they were definitely providing cover for the reporter. This is only one example I picked up on the day I wanted to write about causation creep; I come across examples like this every week. In Freakonomics, when they told us some economist wanted to change her surname to start with A, or that parents should consider timing the births of their kids in order to turn them into star athletes, they fell into the same trap. It's very easy to make careless statements like these.

Michael L.

I'm not a statistician but a quantitative social scientist, so I'm familiar with the issue you raise in the post. I have two comments.

One is that just as correlation does not imply causation, it also doesn't imply the lack of causation. A correlation may or may not also reflect a causal relationship. So as I see it, the problem isn't so much suggesting that a correlation might indicate that causality is involved. It is suggesting this without appropriate qualification, and/or without making a convincing case against plausible alternative explanations.

My second comment: at the end of your post you refer to how hard it is to prove causation with data that is "conveniently collected." But in science we never prove causation, if "prove" is being used the way that term is used in mathematics and philosophy. Even if one has conducted a randomized trial and found a causal effect, the case for such an effect hasn't been proven. Random assignment is the "gold standard" way of increasing the likelihood of balance between the treatment and control groups, which is what allows us to conclude that causality is involved, but it doesn't guarantee such balance. And a finding of statistical significance is not a proof.

Of course, I'm not saying anything that you don't already know. I was just worried that your comment about proving causality might reinforce another misconception about science and statistics that I often see in discussions in the media: that scientific studies are about proving things.
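To illustrate the point that random assignment makes balance likely but does not guarantee it, here is a small simulation sketch (hypothetical setup and thresholds, not drawn from any particular study) counting how often a pre-treatment covariate still ends up noticeably imbalanced across randomized groups:

```python
# A small simulation (hypothetical setup) of how often pure randomization still leaves
# a pre-treatment covariate, e.g. age, visibly imbalanced between two groups.
import numpy as np

rng = np.random.default_rng(1)
n_subjects, n_trials = 50, 10_000
big_gaps = 0

for _ in range(n_trials):
    age = rng.normal(40, 10, n_subjects)                     # covariate measured before assignment
    treated = rng.permutation(n_subjects) < n_subjects // 2  # random assignment, 25 vs 25
    gap = abs(age[treated].mean() - age[~treated].mean())
    big_gaps += gap > 5                                      # a 5-year mean difference counts as "imbalanced"

print("Share of randomized trials with a >5-year age gap:", big_gaps / n_trials)
```

Even with pure randomization, a modest share of trials shows a sizable chance imbalance in age, which is part of why a single randomized study with a significant p-value falls short of a proof.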

Tom

Do you have a link to where the abortion hypothesis was debunked?

Kaiser

Tom: You can start with the generally neutral list of references from Wikipedia (link).

Michael: Thanks for the note, which I totally agree with. What I'm against is the sloppy language used to make the limited conclusions sound exciting. If the researcher admits that he/she doesn't know why A is correlated with B, and there is nothing in the study to back A as a cause of B, then don't suggest that people engage in behavior A in the hope of attaining outcome B.
