Munafò, M. R., Nosek, B. A., Bishop, D. V. M., Button, K. S., Chambers, C. D., Percie du Sert, N., Simonsohn, U., Wagenmakers, E.-J., Ware, J. J., & Ioannidis, J. P. A. (2017). A Manifesto for Reproducible Science. Nature Human Behaviour, 1(1), 0021.
@article{munafoManifestoReproducibleScience2017,
  title = {A Manifesto for Reproducible Science},
  author = {Munafò, Marcus R. and Nosek, Brian A. and Bishop, Dorothy V. M. and Button, Katherine S. and Chambers, Christopher D. and Percie du Sert, Nathalie and Simonsohn, Uri and Wagenmakers, Eric-Jan and Ware, Jennifer J. and Ioannidis, John P. A.},
  date = {2017-01},
  journaltitle = {Nature Human Behaviour},
  volume = {1},
  pages = {0021},
  issn = {2397-3374},
  doi = {10.1038/s41562-016-0021},
  url = {https://doi.org/10.1038/s41562-016-0021},
  abstract = {Improving the reliability and efficiency of scientific research will increase the credibility of the published scientific literature and accelerate discovery. Here we argue for the adoption of measures to optimize key elements of the scientific process: methods, reporting and dissemination, reproducibility, evaluation and incentives. There is some evidence from both simulations and empirical studies supporting the likely effectiveness of these measures, but their broad adoption by researchers, institutions, funders and journals will require iterative evaluation and improvement. We discuss the goals of these measures, and how they can be implemented, in the hope that this will facilitate action toward improving the transparency, reproducibility and efficiency of scientific research.

What proportion of published research is likely to be false? Low sample size, small effect sizes, data dredging (also known as P-hacking), conflicts of interest, large numbers of scientists working competitively in silos without combining their efforts, and so on, may conspire to dramatically increase the probability that a published finding is incorrect. The field of metascience -- the scientific study of science itself -- is flourishing and has generated substantial empirical evidence for the existence and prevalence of threats to efficiency in knowledge accumulation [...].

Data from many fields suggest that reproducibility is lower than is desirable [...]; one analysis estimates that 85\,\% of biomedical research efforts are wasted, while 90\,\% of respondents to a recent survey in Nature agreed that there is a 'reproducibility crisis'. Whether 'crisis' is the appropriate term to describe the current state or trajectory of science is debatable, but accumulated evidence indicates that there is substantial room for improvement with regard to research practices to maximize the efficiency of the research community's use of the public's financial investment in research.

Here we propose a series of measures that we believe will improve research efficiency and the robustness of scientific findings by directly targeting specific threats to reproducible science. We argue for the adoption, evaluation and ongoing improvement of these measures to optimize the pace and efficiency of knowledge accumulation. The measures are organized into the following categories: methods, reporting and dissemination, reproducibility, evaluation and incentives. They are not intended to be exhaustive, but provide a broad, practical and evidence-based set of actions that can be implemented by researchers, institutions, journals and funders. The measures and their current implementation are summarized in Table 1.},
  keywords = {check-list,data-sharing,free-scientific-knowledge,free-software,harking,hypothesizing-after-the-results-are-known,knowledge-freedom,open-data,open-science,reproducibility,reproducible-research,research-management,science-ethics,scientific-knowledge-sharing},
  number = {1}
}
