Reality Check on Reproducibility. Nature 533(7604):437.
@article{natureRealityCheckReproducibility2016,
  title = {Reality Check on Reproducibility},
  author = {{Nature}},
  date = {2016-05},
  journaltitle = {Nature},
  volume = {533},
  pages = {437},
  issn = {0028-0836},
  doi = {10.1038/533437a},
  url = {https://doi.org/10.1038/533437a},
  abstract = {A survey of Nature readers revealed a high level of concern about the problem of irreproducible results. Researchers, funders and journals need to work together to make research more reliable.

[Excerpt] 

Is there a reproducibility crisis in science? Yes, according to the readers of Nature. Two-thirds of researchers who responded to a survey by this journal said that current levels of reproducibility are a major problem.

[...]

What does 'reproducibility' mean? Those who study the science of science joke that the definition of reproducibility itself is not reproducible. Reproducibility can occur across different realms: empirical, computational and statistical. Replication can be analytical, direct, systematic or conceptual. Different people use reproducibility to mean repeatability, robustness, reliability and generalizability.

[...]

Even with a fixed definition, the criteria for reproducibility can vary dramatically between scientists. Senior scientists will not expect each tumour sample they examine under a microscope to look exactly like the images presented in a scientific publication; less experienced scientists might worry that such a result shows lack of reproducibility.

[...]

Pressure to publish, selective reporting, poor use of statistics and finicky protocols can all contribute to wobbly work. Researchers can also be hampered from building on basically solid work by difficult techniques, poorly described methods and incompletely reported data. Funding agencies and publishers are helping to reduce these problems. Funders have changed their grant requirements and awarded grants for the design of courses to improve statistical literacy; journals are supporting technologies and policies that help to address inadequate documentation. For example, Nature's Protocol Exchange website is available to host a protocol for any experiment, pre- or post-publication.

One-third of survey respondents report that they have taken the initiative to improve reproducibility. The simple presence of another person ready to question whether a data point or a sample should really be excluded from analysis can help to cut down on cherry-picking, conscious or not. [...]

[...]

[...] More steps are needed -- starting with a discussion in the research community on how to properly credit, and talk to each other about, attempted replications.},
  keywords = {data-sharing,open-data,open-science,publish-or-perish,reproducibility,reproducible-research,science-ethics,scientific-knowledge-sharing,survey,uncertainty,uncertainty-propagation},
  number = {7604}
}