Search Resources

24 Results

Selected filters:
  • PLOS ONE
A checklist is associated with increased quality of reporting preclinical biomedical research: A systematic review
License:
CC BY (Unrestricted Use)

Irreproducibility of preclinical biomedical research has gained recent attention. It has been suggested that requiring authors to complete a checklist at the time of manuscript submission would improve the quality and transparency of scientific reporting, and ultimately enhance reproducibility. Whether a checklist enhances quality and transparency in reporting preclinical animal studies, however, has not been empirically studied. Here we searched two highly cited life science journals, one that requires a checklist at submission (Nature) and one that does not (Cell), to identify in vivo animal studies. After screening 943 articles, a total of 80 articles published in 2013 (pre-checklist) and 2015 (post-checklist) were identified and included for detailed evaluation of reported methodological and analytical information. We compared the quality of reporting of preclinical animal studies between the two journals, accounting for baseline differences between journals and for changes in reporting over time. We find that reporting of randomization, blinding, and sample-size estimation improved significantly more in Nature than in Cell from 2013 to 2015, likely due to the implementation of a checklist. Specifically, the improvement in reporting of these three methodological items was at least three times greater when a mandatory checklist was implemented than when it was not. Reporting of the sex of animals and of the number of independent experiments performed also improved from 2013 to 2015, likely owing to factors unrelated to a checklist. Our study demonstrates that completing a checklist at manuscript submission is associated with improved reporting of key methodological information in preclinical animal studies.
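
The comparison described above amounts to a difference-in-differences design: the change in the checklist journal (Nature) is contrasted with the change over the same period in the non-checklist journal (Cell). A minimal Python sketch of that calculation follows; the reporting shares are illustrative placeholders, not figures from the paper.

```python
# Difference-in-differences: the change in the checklist journal minus
# the change in the non-checklist journal over the same period.
def did(treat_before, treat_after, ctrl_before, ctrl_after):
    return (treat_after - treat_before) - (ctrl_after - ctrl_before)

# Hypothetical shares of articles reporting randomization
# (placeholders, not values from the study).
nature_2013, nature_2015 = 0.20, 0.65   # checklist journal
cell_2013, cell_2015 = 0.18, 0.30       # non-checklist journal

effect = did(nature_2013, nature_2015, cell_2013, cell_2015)
print(f"difference-in-differences estimate: {effect:+.2f}")
```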

Subject:
Applied Science
Biology
Health, Medicine and Nursing
Life Science
Material Type:
Reading
Provider:
PLOS ONE
Author:
Doris M. Rubio
Janet S. Lee
Jill Zupetic
John P. Pribis
Joo Heung Yoon
Kwonho Jeong
Kyle M. Holleran
Nader Shaikh
SeungHye Han
Tolani F. Olonisakin
Date Added:
08/07/2020
A funder-imposed data publication requirement seldom inspired data sharing
License:
CC BY (Unrestricted Use)

Growth of the open science movement has drawn significant attention to data sharing and availability across the scientific community. In this study, we tested the ability to recover data collected under a particular funder-imposed requirement of public availability. We assessed overall data recovery success, tested whether characteristics of the data or the data creator were indicators of recovery success, and identified hurdles to data recovery. Overall, the majority of data were not recovered (26% recovery of 315 data projects), a result similar to that of journal-driven efforts to recover data. Field of research was the most important indicator of recovery success, but neither home agency sector nor age of data was a determinant of recovery. While we did not find a relationship between recovery of data and age of data, age did predict whether we could find contact information for the grantee. The main hurdles to data recovery involved communication with the researcher: loss of contact with the data creator accounted for half (50%) of unrecoverable datasets, and unavailability of contact information accounted for a further 35%. Overall, our results suggest that funding agencies and journals face similar challenges in enforcing data requirements. We advocate that funding agencies could improve the availability of the data they fund by dedicating more resources to enforcing compliance with data requirements, providing data-sharing tools and technical support to awardees, and administering stricter consequences for those who ignore data-sharing preconditions.
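
As a rough consistency check on the percentages quoted above, the implied project counts can be tallied directly. A short sketch follows; the rounding is ours, not the authors'.

```python
# Counts implied by the abstract's figures: 26% of 315 projects
# recovered; 50% and 35% of failures attributed to lost contact and
# missing contact information, respectively. Rounding is ours.
total = 315
recovered = round(0.26 * total)              # about 82 projects
unrecovered = total - recovered              # about 233 projects

lost_contact = round(0.50 * unrecovered)     # contact with creator lapsed
no_contact_info = round(0.35 * unrecovered)  # no contact details found
other = unrecovered - lost_contact - no_contact_info

print(f"recovered: {recovered}, unrecovered: {unrecovered}")
print(f"lost contact: {lost_contact}, no contact info: {no_contact_info}, "
      f"other: {other}")
```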

Subject:
Applied Science
Biology
Health, Medicine and Nursing
Life Science
Social Science
Material Type:
Reading
Provider:
PLOS ONE
Author:
Colette L. Ward
Gavin McDonald
Jessica L. Couture
Rachael E. Blake
Date Added:
08/07/2020
The influence of journal submission guidelines on authors' reporting of statistics and use of open research practices
License:
CC BY (Unrestricted Use)

From January 2014, Psychological Science introduced new submission guidelines that encouraged the use of effect sizes, estimation, and meta-analysis (the “new statistics”), required extra detail about methods, and offered badges for the use of open science practices. We investigated the use of these practices in empirical articles published by Psychological Science and, for comparison, by the Journal of Experimental Psychology: General, during the period January 2013 to December 2015. The use of null hypothesis significance testing (NHST) was extremely high at all times and in both journals. In Psychological Science, the use of confidence intervals increased markedly overall, from 28% of articles in 2013 to 70% in 2015, as did the availability of open data (3% to 39%) and open materials (7% to 31%). The comparison journal showed smaller or much smaller changes. Our findings suggest that journal-specific submission guidelines may encourage desirable changes in authors’ practices.
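
Because the “new statistics” centre on interval estimation rather than NHST, a brief sketch of a confidence interval for a reported proportion may clarify what the guidelines encourage. The Wilson score interval below is one standard choice; the sample size is a hypothetical placeholder, not the journals' actual article count.

```python
import math

def wilson_ci(successes, n, z=1.96):
    """95% Wilson score interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z ** 2 / n
    centre = (p + z ** 2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2))
    return centre - half, centre + half

# Hypothetical: 70 of 100 sampled 2015 articles use confidence intervals.
low, high = wilson_ci(70, 100)
print(f"95% CI for the proportion: ({low:.2f}, {high:.2f})")
```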

Subject:
Psychology
Social Science
Material Type:
Reading
Provider:
PLOS ONE
Author:
David Giofrè
Geoff Cumming
Ingrid Boedker
Luca Fresc
Patrizio Tressoldi
Date Added:
08/07/2020
A study of the impact of data sharing on article citations using journal policies as a natural experiment
License:
CC BY (Unrestricted Use)

This study estimates the effect of data sharing on the citations of academic articles, using journal policies as a natural experiment. We begin by examining 17 high-impact journals that have adopted the requirement that data from published articles be publicly posted. We match these 17 journals to 13 journals without policy changes and find that empirical articles published just before their change in editorial policy have citation rates with no statistically significant difference from those published shortly after the shift. We then ask whether this null result stems from poor compliance with data-sharing policies, and use the policy changes as instrumental variables to examine more closely two leading journals in economics and political science with relatively strong enforcement of their new data policies. We find that articles that make their data available receive 97 additional citations (estimated standard error of 34). We conclude that (a) authors who share data may eventually be rewarded with additional scholarly citations, and (b) data-posting policies alone do not increase the impact of articles published in a journal unless those policies are enforced.
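
The instrumental-variables step described above can be pictured as two-stage least squares: the policy change predicts whether an article shares data, and predicted sharing then predicts citations, sidestepping confounders (such as article quality) that affect both sharing and citations. The simulation below is purely illustrative; all coefficients and data are invented, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000

policy = rng.integers(0, 2, n).astype(float)  # instrument: article falls under a data policy
quality = rng.normal(size=n)                  # unobserved confounder

# Sharing depends on both the policy and quality, so regressing
# citations directly on `shares` would be biased.
shares = (0.5 * policy + 0.3 * quality
          + rng.normal(0.0, 0.3, n) > 0.4).astype(float)
citations = 10.0 + 5.0 * shares + 4.0 * quality + rng.normal(0.0, 2.0, n)

def ols(y, X):
    """Least-squares coefficients of y on the columns of X."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

# First stage: predict sharing from the policy instrument alone.
X1 = np.column_stack([np.ones(n), policy])
shares_hat = X1 @ ols(shares, X1)

# Second stage: regress citations on the predicted sharing.
X2 = np.column_stack([np.ones(n), shares_hat])
beta = ols(citations, X2)
print(f"2SLS effect of sharing on citations: {beta[1]:.2f} "
      f"(simulated truth: 5.0)")
```

Because the instrument (the policy) is independent of the confounder, the second-stage coefficient recovers the simulated sharing effect, which is the logic behind using policy changes rather than observed sharing alone.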

Subject:
Economics
Social Science
Material Type:
Reading
Provider:
PLOS ONE
Author:
Allan Dafoe
Andrew K. Rose
Don A. Moore
Edward Miguel
Garret Christensen
Date Added:
08/07/2020