



What about all the data on circadian rhythms showing that this is not a chance observation (e.g. PMID 19273720)? This is a fundamental problem with the myopic views of all the "data analysis" scientists, who have no capacity to integrate information beyond the two graphs and two p-values they see in one paper.


Cblitzes: Just read the last sentence you wrote. You're making a strong charge based on no evidence at all. I did not even mention p-values in this post. Further, I'll comment when you respond to the substance of my post.


Cblitzes: A further problem with this type of time-series data is that someone has gone looking for something interesting in the data, or possibly in a lot of data. This is especially common with time series, as there are many possible patterns. Just as one swallow doesn't make a summer, finding one suggestive bit of data doesn't support a hypothesis. One of the problems with medical research is that basic research often doesn't extend to real-world situations, so what people have found about circadian rhythms is fairly much irrelevant here.

What would be nice to see is, say, the last 20 years of data, though of course that could be selectively chosen as well. Try it in various countries with DST; if you find an effect in most of them, then you can start believing it, as long as you believe there isn't a problem with the data.

My experience with data is that either the systems are designed to guarantee high data reliability, or people have very strong incentives to get the data correct, or the data are rubbish.

One last point: hospital admissions are generally higher on Mondays. That, plus random variation, probably explains most of what is in the graphs.


I'd ask the question, "Given the daily AMI data for the last 10 years, how confident are you that you can identify the Sundays on which the clocks are changed? How confident are you that you can identify any specific day of the week?"

If there is clearly a signal after the time change, then it should be easy to identify when the time change occurs even if the day of the week is not supplied along with the AMI counts.
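Richard's challenge can be turned into a quick simulation. This is a minimal sketch with made-up numbers (a flat baseline of 100 AMIs per day and a hypothetical 5% bump on one "DST Monday"); it asks how often the week with the largest Sunday-to-Monday jump is actually the DST week.

```python
import random

random.seed(1)

def simulate_year(dst_week, base=100.0, bump=1.05):
    """One year of daily AMI counts: 52 weeks x 7 days, Monday first.
    The Monday of `dst_week` gets a 5% higher mean (the assumed DST effect)."""
    counts = []
    for week in range(52):
        for day in range(7):
            mean = base * (bump if (week == dst_week and day == 0) else 1.0)
            # normal approximation to Poisson-like daily noise
            counts.append(max(0.0, random.gauss(mean, mean ** 0.5)))
    return counts

def guess_dst_week(counts):
    """Guess the week whose Monday jumps most over the preceding Sunday."""
    jumps = {w: counts[7 * w] - counts[7 * w - 1] for w in range(1, 52)}
    return max(jumps, key=jumps.get)

hits = sum(guess_dst_week(simulate_year(dst_week=10)) == 10 for _ in range(200))
print(f"DST week recovered in {hits} of 200 simulated years")
```

Under these assumptions, a 5% bump rarely stands out from ordinary day-to-day noise, which is the force of the challenge: if you cannot locate the change date blind, the apparent signal is carried by the labels rather than the counts.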


Richard: not sure I'm interpreting your question correctly... the challenge is not about figuring out which day the time change was scheduled to occur. My worry about this type of analysis is that I have no confidence that every human being who entered data into the system remembered to set the clock. To check this, you'd need hospital-level data and look for odd trends around the hour of the shift.


Kaiser: I realize I am asking a different question, but I react according to my training and experience in quality engineering. Regardless of the p-value, I always have to ask whether this is a real signal or just the occasional time when a random occurrence happened. A rise after the DST Sunday at a low p-value doesn't mean much to me until I see that, compared to other two-day pairs, it is unlikely to have happened by chance. Under the model that DST leads to something that increases AMI counts, the authors ask: is it the time change? You ask: is it inconsistency in data tracking? I ask: is there a rise at all?

An AMI count for one day is not likely to be repeated the next day; I expect the next day's count to be higher or lower. The fact that the data show it higher is not surprising, just as a lower count would not be surprising. Is the pattern significantly different from a heads-or-tails pattern, where heads = up and tails = down?

Your point about the raw data is definitely worth raising. The quality of the raw data must be known in order to draw the best conclusions. As Stamp said, the statistics all start with a watchman who writes down whatever he damn pleases.
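The heads-or-tails framing above can be checked directly. A minimal sketch with invented counts, assuming independent noise around a flat mean of 100 per day: with no real trend at all, roughly half of all day-over-day moves come out as rises, so a rise after one particular Sunday is unremarkable on its own.

```python
import random

random.seed(7)

# Hypothetical: ten years of daily AMI counts, pure noise around a flat mean.
base = 100.0
counts = [random.gauss(base, base ** 0.5) for _ in range(3650)]

# How often is one day's count higher than the previous day's?
ups = sum(b > a for a, b in zip(counts, counts[1:]))
rate = ups / (len(counts) - 1)
print(f"{ups} of {len(counts) - 1} day-over-day moves are rises ({rate:.0%})")
```

The rate hovers around 50%, like a fair coin; any claimed DST effect has to beat that baseline, not merely show an uptick on the day in question.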
