
Poor transparency and reporting jeopardize the reproducibility of science


Reported research across the biomedical sciences rarely provides the full protocols, data, and level of transparency necessary to verify or replicate a study, according to two articles published in PLOS Biology on January 4th, 2016, as part of a new Meta-Research Section. The authors argue that the information publicly available on reported research is in dire need of improvement.

Authors of one study, Shareen Iqbal from Emory University, John Ioannidis from the Meta-Research Innovation Center at Stanford (METRICS), and colleagues, analyzed a corpus of papers published between 2000 and 2014 to determine the extent to which researchers report key information necessary for properly evaluating and replicating published research, including the availability of protocols and data and the frequency of novel versus replication studies. The authors were surprised by the results: out of 441 articles drawn from across the biomedical literature, only one paper provided a full protocol and no paper made all of its data available. The majority of studies didn't state funding or conflicts of interest, and replication studies were very rare.

"We hope our survey will further sensitize scientists, funders, journals and other science-related stakeholders about the need to improve these indicators," the authors stated.

A related study, led by Ulrich Dirnagl and team at Charité Universitätsmedizin in Berlin, Germany, examined hundreds of published stroke and cancer research experiments and found that the vast majority don't contain sufficient information about how many animals were used. What's more, in many papers animals "vanished" over the course of the study. Using a computer model, the team simulated the effects of such animal loss on the validity of the experiments. They found that the more animals were lost or removed, the shakier or more biased the experimental conclusions became.
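The intuition behind such a simulation can be illustrated with a short sketch. This is a hypothetical toy model, not the authors' actual code: it assumes two equal groups of animals, a true treatment effect of zero, and compares random attrition (animals dropped by chance) with biased attrition (the lowest-scoring treated animals dropped, as might happen if the sickest animals die or are excluded).

```python
import random
import statistics

def simulate_attrition(n=20, drop=5, biased=True, seed=0):
    """Simulate one two-group experiment with a TRUE effect of zero.

    If attrition is biased, the lowest-scoring treated animals are
    dropped; otherwise, randomly chosen treated animals are dropped.
    Returns the estimated treatment effect (difference in group means).
    """
    rng = random.Random(seed)
    control = [rng.gauss(0, 1) for _ in range(n)]
    treatment = [rng.gauss(0, 1) for _ in range(n)]  # no real effect
    if biased:
        # Drop the worst-scoring treated animals (e.g. the sickest ones)
        treatment = sorted(treatment)[drop:]
    else:
        # Drop animals at random, independent of outcome
        rng.shuffle(treatment)
        treatment = treatment[drop:]
    return statistics.mean(treatment) - statistics.mean(control)

# Average the estimated effect over many simulated experiments
biased_effects = [simulate_attrition(seed=s, biased=True) for s in range(1000)]
random_effects = [simulate_attrition(seed=s, biased=False) for s in range(1000)]
print(statistics.mean(biased_effects))  # clearly positive: a spurious "effect"
print(statistics.mean(random_effects))  # near zero: noise only
```

Even though the treatment does nothing in this toy model, selectively losing the worst-performing treated animals manufactures an apparent benefit, which is why unreported attrition undermines the conclusions of an experiment.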

"The study began with an attempt to look at the robustness of findings in a handful of preclinical papers" explains first author Constance Holman, "but the sheer number of missing animals stopped us in our tracks". In human medicine, publishing a clinical trial without information about the number of patients, or how many dropped out or died over the course of a study would be unthinkable. But nobody had looked carefully at whether animal numbers are properly reported in basic research.

Billions of dollars are wasted every year on research that cannot be reproduced. The findings of these two studies join a long list of concerns about bias and reporting in basic research. However, they also establish ways in which research can become more transparent and potentially more reproducible.


All works published in PLOS Biology are open access, which means that everything is immediately and freely available. Please mention PLOS Biology as the source for these articles and use the URLs below in your coverage to take readers to the online, open-access papers upon publication:

PLOS Biology's Meta-Research Editorial:

Holman, Dirnagl and colleagues:

Iqbal, Ioannidis and colleagues:

Holman, Dirnagl and colleagues:

Citation: Holman C, Piper SK, Grittner U, Diamantaras AA, Kimmelman J, Siegerink B, et al. (2016) Where Have All the Rodents Gone? The Effects of Attrition in Experimental Research on Cancer and Stroke. PLoS Biol 14(1): e1002331. doi:10.1371/journal.pbio.1002331

Funding: UG was supported by Charité and the Steinbeis Foundation. UD and BS would like to acknowledge the financial support of the German Federal Ministry of Education and Research (BMBF 01 EO 08 01). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Competing Interests: The authors have declared that no competing interests exist.

Iqbal, Ioannidis and colleagues:

Citation: Iqbal SA, Wallach JD, Khoury MJ, Schully SD, Ioannidis JPA (2016) Reproducible Research Practices and Transparency across the Biomedical Literature. PLoS Biol 14(1): e1002333. doi:10.1371/journal.pbio.1002333

Funding: The authors received no specific funding for this work. The Meta-Research Innovation Center at Stanford (METRICS) is supported by a grant from the Laura and John Arnold Foundation

Competing Interests: The authors have declared that no competing interests exist.

Media Contact

Gavin Morrison
[email protected]
