Tips To Find Study Pitfalls #AHCJ16

Flaws, limits, and conflicts: Tips to find study pitfalls

10 April 2016
Hilda Bastian and Andrew M. Seaman (Reuters Health)

These are my slides from this session, with links for further reading, mostly to my own blog posts. (Apologies that you'll need to zoom in on some of them to read the text.)

(The views expressed are personal, and do not necessarily reflect the views of the National Center for Biotechnology Information (NCBI), National Library of Medicine (NLM), National Institutes of Health (NIH), or the US Department of Health and Human Services.)


Beware of the too simple answer (Statistically Funny)




A problem for us all!

The next series of slides is based on a 2016 study by JE Blümel and colleagues, its press release, and the media coverage. (There was very little coverage of this study.)

Unfortunately, this study is not open access, so most people will only be able to see its abstract. And that can be very misleading. (More on this in my post at Absolutely Maybe.)



Exercise and hot flashes that keep you up at night: could this be a chicken-and-egg question rather than cause and effect?


Watch out for "statistical significance" traps:


More on the risks of multiple testing, p-hacking, and data dredging in Data Bingo (Statistically Funny).
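
To make that concrete, here's a minimal simulation sketch of my own (not from the slides, with made-up numbers): if you compare two groups on 20 outcomes where nothing real is going on, getting at least one "statistically significant" result by chance alone is more likely than not.

```python
# Minimal sketch (not from the slides): a simulated "data bingo" card.
# Two groups with NO real difference, compared on 20 independent outcomes.
# With a 0.05 threshold, at least one spurious "significant" result is common.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_per_group, n_outcomes, n_simulations = 50, 20, 2000

runs_with_false_alarm = 0
for _ in range(n_simulations):
    # Every outcome is pure noise: both groups drawn from the same distribution.
    group_a = rng.normal(size=(n_outcomes, n_per_group))
    group_b = rng.normal(size=(n_outcomes, n_per_group))
    p_values = stats.ttest_ind(group_a, group_b, axis=1).pvalue
    if (p_values < 0.05).any():   # any outcome "wins" on this bingo card
        runs_with_false_alarm += 1

print(f"Runs with at least one spurious 'significant' result: "
      f"{runs_with_false_alarm / n_simulations:.0%}")  # roughly 1 - 0.95**20, about 64%
```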


(Here I briefly switched to coverage of another study, this time from the New York Times.)



The quote is from the American Statistical Association's 2016 statement on p-values. (A great story on this from Christie Aschwanden.)

More from me on statistical significance at Statistically Funny and Absolutely Maybe (and another post).

More on this from Christie Aschwanden, as she gets scientists to try to explain p-values.

Still want to read more on this? Here's a classic article by Steven Goodman on 12 common misconceptions about p-values: [PDF]. And here's a post from the Cochrane Collaboration's editorial unit on why results should not be reported simply as "statistically significant" or "not statistically significant".
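
If you want a hands-on feel for the core misconception, here's a small sketch of my own (not from the talk, with hypothetical numbers): a p-value answers "how often would a difference at least this big turn up if there were no real difference?" - it is not the probability that there is no real difference. A brute-force shuffle under the null gives roughly the same number as the t-test, because that null-world frequency is all a p-value is.

```python
# Sketch (mine, not from the talk): what a p-value is computed from.
# It asks how often a difference at least this large would appear if the
# null were true - NOT how likely the null is to be true.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
treatment = rng.normal(loc=0.3, scale=1.0, size=40)   # hypothetical trial arm
control   = rng.normal(loc=0.0, scale=1.0, size=40)   # hypothetical control arm

observed_diff = treatment.mean() - control.mean()
t_stat, p_value = stats.ttest_ind(treatment, control)

# Approximate the same p-value by shuffling labels (the null says labels don't matter).
pooled = np.concatenate([treatment, control])
as_extreme = 0
n_shuffles = 10_000
for _ in range(n_shuffles):
    shuffled = rng.permutation(pooled)
    diff = shuffled[:40].mean() - shuffled[40:].mean()
    if abs(diff) >= abs(observed_diff):
        as_extreme += 1

print(f"t-test p-value:      {p_value:.3f}")
print(f"permutation p-value: {as_extreme / n_shuffles:.3f}")
```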


An introduction to confidence intervals (Statistically Funny).
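
In the same spirit, a bare-bones sketch (mine, with invented numbers) of how a 95% confidence interval for a difference between two group means is put together. The interval shows the range of differences compatible with the data, which tells you far more than a bare "significant or not" verdict.

```python
# Sketch (mine, invented numbers): a 95% confidence interval for a
# difference in means, using the standard two-sample t interval.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2016)
n = 60
exercise = rng.normal(loc=4.2, scale=2.0, size=n)   # hypothetical symptom scores
control  = rng.normal(loc=5.0, scale=2.0, size=n)

diff = exercise.mean() - control.mean()
# With equal group sizes this standard error equals the usual pooled estimate.
se = np.sqrt(exercise.var(ddof=1) / n + control.var(ddof=1) / n)
dof = 2 * n - 2
t_crit = stats.t.ppf(0.975, dof)

print(f"difference: {diff:.2f}, "
      f"95% CI: ({diff - t_crit * se:.2f}, {diff + t_crit * se:.2f})")
```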

But back to that question about menopause and exercise. A quick trip to PubMed Health to look for a systematic review. PubMed Health is a clinical effectiveness resource for finding systematic reviews, as well as information for consumers and clinicians based on those reviews. (Disclosure: PubMed Health is one of the projects I work on at NCBI.)

What's a systematic review? A post on why they matter (Statistically Funny).

Want more on searching for studies? Check out my post on 9 Ninja PubMed Skills (Absolutely Maybe).
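
And if you like scripting your searches, here's a rough sketch (mine, not part of the talk) using NCBI's public E-utilities esearch endpoint. Note that this queries PubMed itself rather than PubMed Health, and the search terms and the systematic-review filter are just illustrative.

```python
# Rough sketch (not from the talk): searching PubMed for systematic reviews
# via NCBI's E-utilities esearch endpoint. The query and filter below are
# only illustrative - adapt them to your own question.
import requests

ESEARCH_URL = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

params = {
    "db": "pubmed",
    "term": 'exercise AND "hot flashes" AND systematic[sb]',  # systematic review subset filter
    "retmode": "json",
    "retmax": 20,
}

response = requests.get(ESEARCH_URL, params=params, timeout=30)
response.raise_for_status()
result = response.json()["esearchresult"]

print(f"{result['count']} records found; first PMIDs: {result['idlist'][:5]}")
```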


Here's the Cochrane systematic review on exercise and hot flashes by Daley et al. that we looked at.



There are 5 studies in this systematic review - but only 3 in the meta-analysis for this question. This is a key pitfall - and not just in systematic reviews. The quality of data usually varies within a study: a methodologically strong study can have weak data on some outcomes but not others.
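
To see what it means mechanically when only 3 studies make it into a meta-analysis, here's a bare-bones sketch (mine, with made-up effect sizes) of fixed-effect inverse-variance pooling, the usual way study results are combined. Studies without usable data for an outcome simply drop out of that particular analysis.

```python
# Bare-bones sketch (made-up numbers): fixed-effect inverse-variance pooling.
# Each study is weighted by 1/variance, so large, precise studies dominate,
# and studies with no usable data for this outcome contribute nothing.
import math

# (effect estimate, standard error) for the 3 hypothetical studies with usable data
studies = [(-0.10, 0.20), (-0.25, 0.15), (0.05, 0.30)]

weights = [1 / se ** 2 for _, se in studies]
pooled = sum(w * est for (est, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

low, high = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
print(f"pooled effect: {pooled:.2f}, 95% CI: ({low:.2f}, {high:.2f})")
```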


Understanding the outcome is critical. If you can't understand it, be very careful with it. It might not mean what it seems to mean. Here are some relevant posts:

Surrogate outcomes and biomarkers (Statistically Funny).

Subgroup and other post-hoc analyses (Statistically Funny).

Composite endpoints (Statistically Funny).



Here's an example of what can go wrong when relying on surrogate outcomes instead of actual health outcomes:


But here's what happened 5 years ago when the first trial was published:


Many reported the caveats, though:



More tips on personal de-biasing and bias in science communication in this post at Absolutely Maybe.

Thanks for your attention! And thanks to Andrew M. Seaman - I'll add his slides to this post when they go online.



