Analysis of Open Data and Computational Reproducibility in Registered Reports in Psychology. Obels, P., Lakens, D., Coles, N. A., Gottfried, J., & Green, S. A. Advances in Methods and Practices in Psychological Science, 2020.
Ongoing technological developments have made it easier than ever before for scientists to share their data, materials, and analysis code. Sharing data and analysis code makes it easier for other researchers to reuse or check published research. These benefits will emerge only if researchers can reproduce the analyses reported in published articles and if data are annotated well enough that it is clear what all variables mean. Because most researchers have not been trained in computational reproducibility, it is important to evaluate current practices to identify those that can be improved. We examined data and code sharing, as well as computational reproducibility of the main results, without contacting the original authors, for Registered Reports published in the psychological literature between 2014 and 2018. Of the 62 articles that met our inclusion criteria, data were available for 40 articles and analysis scripts for 37 articles. For the 35 articles that shared both data and code and performed analyses in SPSS, R, Python, MATLAB, or JASP, we could run the scripts for 31 articles and reproduce the main results for 20 articles. Although the proportions of articles that shared both data and code (35 out of 62, or 56%) and of articles that could be computationally reproduced (20 out of 35, or 57%) were relatively high compared with other studies, there is clear room for improvement. We provide practical recommendations based on our observations and link to examples of good research practices in the papers we reproduced.
@article{obels_analysis_2020,
	title = {Analysis of {Open} {Data} and {Computational} {Reproducibility} in {Registered} {Reports} in {Psychology}},
	url = {https://osf.io/fk8vh},
	doi = {10.31234/osf.io/fk8vh},
	abstract = {Ongoing technological developments have made it easier than ever before for scientists to share their data, materials, and analysis code. Sharing data and analysis code makes it easier for other researchers to reuse or check published research. These benefits will emerge only if researchers can reproduce the analyses reported in published articles and if data are annotated well enough that it is clear what all variables mean. Because most researchers have not been trained in computational reproducibility, it is important to evaluate current practices to identify those that can be improved. We examined data and code sharing, as well as computational reproducibility of the main results, without contacting the original authors, for Registered Reports published in the psychological literature between 2014 and 2018. Of the 62 articles that met our inclusion criteria, data were available for 40 articles and analysis scripts for 37 articles. For the 35 articles that shared both data and code and performed analyses in SPSS, R, Python, MATLAB, or JASP, we could run the scripts for 31 articles and reproduce the main results for 20 articles. Although the proportions of articles that shared both data and code (35 out of 62, or 56\%) and of articles that could be computationally reproduced (20 out of 35, or 57\%) were relatively high compared with other studies, there is clear room for improvement. We provide practical recommendations based on our observations and link to examples of good research practices in the papers we reproduced.},
	urldate = {2020-02-14},
	journal = {Advances in Methods and Practices in Psychological Science},
	author = {Obels, Pepijn and Lakens, Daniel and Coles, Nicholas Alvaro and Gottfried, Jaroslav and Green, Seth Ariel},
	year = {2020}
}
