Search Resources

Badges for sharing data and code at Biostatistics: an observational study
Unrestricted Use

Background: The reproducibility policy at the journal Biostatistics rewards articles with badges for data and code sharing. This study investigates the effect of those badges on reproducible research.

Methods: The setting of this observational study is the online research archives of Biostatistics and of Statistics in Medicine (the control journal). The data consisted of 240 randomly sampled articles per journal from 2006 to 2013 (30 articles per year). Data analyses included plotting the probability of data and code sharing by article submission date, and Bayesian logistic regression modelling.

Results: The probability of data sharing was higher at Biostatistics than at the control journal, but the probability of code sharing was comparable for both journals. The probability of data sharing increased by 3.9 times (95% credible interval: 1.5 to 8.44 times; probability that sharing increased: 0.998) after badges were introduced at Biostatistics. On an absolute scale, this difference was only a 7.6% increase in data sharing (95% CI: 2% to 15%; probability: 0.998). Badges had no impact on code sharing at the journal (mean increase: 1 time; 95% credible interval: 0.03 to 3.58 times; probability that sharing increased: 0.378). Of the articles that provided data/code, 64% at Biostatistics and 40% at Statistics in Medicine had broken links; assuming these links worked only slightly changed the effect of badges on data sharing (mean increase: 6.7%; 95% CI: 0.0% to 17.0%; probability: 0.974) and on code sharing (mean increase: -2%; 95% CI: -10.0% to 7.0%; probability: 0.286).

Conclusions: The effect of badges at Biostatistics was a 7.6% increase in the data-sharing rate, five times smaller than the effect of badges at Psychological Science. Though badges at Biostatistics did not affect code sharing and had only a moderate effect on data sharing, they are an interesting step that journals are taking to incentivise and promote reproducible research.
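The abstract reports the badge effect both as a roughly 3.9-fold increase and as only a 7.6% absolute rise, which can look contradictory at first. A minimal Python sketch (assuming the fold-change is on the odds scale and using a hypothetical 3% baseline sharing rate; neither assumption is stated in the abstract) shows how a large relative change in odds can translate into a modest absolute change in probability:

```python
def prob_to_odds(p):
    """Convert a probability to odds."""
    return p / (1 - p)

def odds_to_prob(o):
    """Convert odds back to a probability."""
    return o / (1 + o)

# Hypothetical baseline data-sharing probability (not taken from the study).
p_before = 0.03
odds_ratio = 3.9  # relative increase reported in the abstract

odds_after = prob_to_odds(p_before) * odds_ratio
p_after = odds_to_prob(odds_after)
absolute_increase = p_after - p_before

print(f"before: {p_before:.1%}, after: {p_after:.1%}, "
      f"absolute increase: {absolute_increase:.1%}")
```

Under these assumed numbers the absolute increase comes out near 8%, the same order as the 7.6% the abstract reports, illustrating why relative and absolute effect sizes differ so much when the baseline rate is low.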

Authors: Adrian G. Barnett, Anisa Rowhani-Farid

Four simple recommendations to encourage best practices in research software
Unrestricted Use

Scientific research relies on computer software, yet software is not always developed following practices that ensure its quality and sustainability. This manuscript does not aim to propose new software development best practices, but rather to provide simple recommendations that encourage the adoption of existing best practices. Software development best practices promote better quality software, and better quality software improves the reproducibility and reusability of research. These recommendations are designed around Open Source values, and provide practical suggestions that contribute to making research software and its source code more discoverable, reusable and transparent. This manuscript is aimed at developers, but also at organisations, projects, journals and funders that can increase the quality and sustainability of research software by encouraging the adoption of these recommendations.

Computer Science
Information Science
Authors:
Alejandra Gonzalez-Beltran
Allegra Via
Andrew Treloar
Bérénice Batut
Bernard Pope
Björn Grüning
Jonas Hagberg
Brane Leskošek
Carole Goble
Daniel S. Katz
Daniel Vaughan
David Mellor
Federico López Gómez
Ferran Sanz
Harry-Anton Talvik
Horst Pichler
Ilian Todorov
Jon Ison
Josep Ll. Gelpí
Leyla Garcia
Luis J. Oliveira
Maarten van Gompel
Madison Flannery
Manuel Corpas
Maria V. Schneider
Martin Cook
Mateusz Kuzak
Michelle Barker
Mikael Borg
Monther Alhamdoosh
Montserrat González Ferreiro
Nathan S. Watson-Haigh
Neil Chue Hong
Nicola Mulder
Petr Holub
Philippa C. Griffin
Radka Svobodová Vařeková
Radosław Suchecki
Rafael C. Jiménez
Robert Pergl
Rob Hooft
Rowland Mosbergen
Salvador Capella-Gutierrez
Simon Gladman
Sonika Tyagi
Steve Crouch
Victoria Stodden
Xiaochuan Wang
Yasset Perez-Riverol