


Bryan F

Hi Kaiser,
I've noticed a few things the vaccine trial reports have in common:
No geographic or occupational data provided.
Infections appear to grow in a very smooth manner.
Testing data unclear.
Only one control for "guessing" the placebo.
Low seroprevalence and positives in screenings.

Any insights?


BF: The list of issues is very long, as you pointed out. This shines a light on an industry that has never been all that transparent. Time pressure during the pandemic has further lowered the bar. (For example, I have pages and pages of questions about the Remdesivir study, and I never ended up putting a post up on that because each time I tried, the list grew longer!)
For vaccine studies, I think the most significant issues are:
(a) infrequent testing to proactively identify cases
(b) lack of transparency around "adjudications" of cases
(c) lack of information on ITT (intent-to-treat) analyses
(d) failure to realize that an interim analysis coupled with sequential enrollment by subgroups presents a different analysis context from a full analysis
(e) lack of updates after the initial analysis
(f) science by press release
(g) a glaring lack of discussion among scientists of major hiccups, such as the low-dose "accident" in the AstraZeneca-Oxford trial and the Moderna dataset falling well short of the required two-month follow-up period
(h) the definition of a case, in terms of the symptoms list and the testing regime
(i) the definition of the case-counting window, which I've written a lot about
(j) ignoring all confidence intervals (most of which are laughably wide)
(k) the silence from the investigators surrounding post-hoc analyses published by associates or foes ...
All of these must be seen in the context of the culture of this science. The situation is similar to the ongoing battle Andrew Gelman and associates are waging over the psychology research establishment: many if not all of the above practices appear to be accepted, while outsiders like myself are raising concerns about what we perceive to be non-ignorable issues.
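To see why small case counts make confidence intervals so wide (point (j) above), here is a minimal sketch using the standard Wald interval on the log relative risk, where vaccine efficacy is VE = 1 - RR. The case counts below are hypothetical round numbers for illustration, not figures from any actual trial, and the trials' own statistical analyses used more elaborate models than this textbook approximation.

```python
import math

def ve_confidence_interval(cases_vax, n_vax, cases_placebo, n_placebo, z=1.96):
    """Approximate 95% CI for vaccine efficacy VE = 1 - RR,
    via a Wald interval on log(RR). Illustrative sketch only."""
    rr = (cases_vax / n_vax) / (cases_placebo / n_placebo)
    # standard error of log(RR) for two binomial proportions
    se = math.sqrt(1 / cases_vax - 1 / n_vax + 1 / cases_placebo - 1 / n_placebo)
    lower = 1 - math.exp(math.log(rr) + z * se)
    upper = 1 - math.exp(math.log(rr) - z * se)
    return lower, upper

# Hypothetical interim look: 5 vs 50 cases among 15,000 per arm.
# Point estimate VE = 90%, but the interval is wide.
print(ve_confidence_interval(5, 15000, 50, 15000))

# Ten times the cases at the same VE: the interval tightens considerably.
print(ve_confidence_interval(50, 150000, 500, 150000))
```

With only a handful of cases in the vaccine arm, the 95% interval spans roughly 75% to 96% efficacy despite a 90% point estimate, which is why reporting only the point estimate is misleading.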

S Kefteridis

I am tempted to speculate that some of this disjointedness is due to a rushed shift in focus from safety to efficacy. Aside from the issues about doing science that you have raised, the question then is: what are the consequences, direct and indirect, of potentially inaccurate or massaged outputs?

Kudos BTW for your excellent posts!

The comments to this entry are closed.

Kaiser Fung. Business analytics and data visualization expert. Author and Speaker.