John Ioannidis has famously blown the whistle on how much of what is published in even the leading medical journals is simply wrong.
And in June 2017, the Friends of the National Library of Medicine hosted a 2-day program at the National Institutes of Health (NIH) titled “Consequential and Reproducible Clinical Research: Charting the Course for Continuous Improvement.” Nice positive spin on a very disturbing past and present.
Who knows how much published preclinical and clinical work has not been or could not be reproduced? We do know that vast sums have been expended trying to develop useful clinical drugs, devices, and procedures on the basis of seriously flawed preclinical research.
How to fix this mess?
- All involved in the scientific enterprise should recognize that irreproducible (or unreproduced) published reports are common and a serious problem.
- The scientific community should cultivate a culture of increased scrutiny and criticism about research reports.
- Real (not fake) journals should improve their prepublication peer review process and procedures.
- Medical and science journals should welcome, and give higher priority to, studies that attempt to reproduce previously published work (even work published in the same journal), rather than insisting that every result they publish be novel.
- The NIH and other funding agencies, and their review committees, should give higher priority to funding projects intended to confirm or refute research that has already been published.
- The academic personnel processes of the best medical schools should reward faculty whose research tests the reproducibility of prior published studies. Such work is needed and worthy.
- Academic personnel processes should cease to rely on misleading measures, such as journal impact factors, as a primary way to evaluate the quality of a faculty member’s work. Giving primacy to such measures removes any incentive to reproduce studies.
- Postpublication peer review should be enhanced by widespread use of more formal methods of assessment.