All resources in Funders

Preregistration Overview page

What is Preregistration? When you preregister your research, you're simply specifying your research plan in advance of your study and submitting it to a registry. Preregistration separates hypothesis-generating (exploratory) from hypothesis-testing (confirmatory) research. Both are important, but the same data cannot be used both to generate and to test a hypothesis; this can happen unintentionally and reduces the credibility of your results. Addressing this problem through planning improves the quality and transparency of your research, helping you report your study clearly and helping others who may wish to build on it.

Material Type: Reading

Author: Center for Open Science

TOP Guidelines

The Transparency and Openness Promotion (TOP) Guidelines include eight modular standards, each with three levels of increasing stringency. Journals select which of the eight transparency standards they wish to implement and choose a level of implementation for each. This modular design provides flexibility to accommodate disciplinary variation while simultaneously establishing community standards.
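
As a rough illustration of that modular structure, one journal's TOP adoption can be thought of as a mapping from each standard to a chosen level. The sketch below is hypothetical: the standard names follow the published guidelines, but the level values are invented for illustration.

    # Hypothetical sketch: one journal's TOP adoption, standard -> level.
    # Levels: 0 = not implemented, 1 = disclosure, 2 = requirement, 3 = verification.
    # The level values below are invented, not any real journal's policy.
    top_policy = {
        "citation": 2,
        "data_transparency": 1,
        "analytic_code_transparency": 1,
        "materials_transparency": 1,
        "design_analysis_transparency": 2,
        "study_preregistration": 1,
        "analysis_plan_preregistration": 1,
        "replication": 0,
    }
    assert len(top_policy) == 8 and all(0 <= lvl <= 3 for lvl in top_policy.values())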

Material Type: Lesson

Author: Open Science Collaboration

OpenAccess.net

The open-access.net platform provides comprehensive information on the subject of Open Access (OA) and offers practical advice on its implementation. Developed collaboratively by the Freie Universität Berlin and the Universities of Goettingen, Konstanz, and Bielefeld, open-access.net first went online at the beginning of May 2007. The platform's target groups include all relevant stakeholders in the science sector, especially the scientists and scholars themselves, university and research institution managers, infrastructure service providers such as libraries and data centres, and funding agencies and policy makers.

open-access.net provides easy, one-stop access to comprehensive information on OA. Aspects covered include OA concepts; legal, organisational and technical frameworks; concrete implementation experiences; initiatives; services; service providers; and position papers. The target-group-oriented and discipline-specific presentation of the content enables users to access relevant themes quickly and efficiently. Moreover, the platform offers practical implementation advice and answers to fundamental questions regarding OA. In collaboration with cooperation partners in Austria (the University of Vienna) and Switzerland (the University of Zurich), country-specific web pages for these two countries have been integrated into the platform, especially in the Legal Issues section.

Each year since 2007, the information platform has organised the "Open Access Days" at alternating venues in collaboration with local partners. This event is the key conference on OA and Open Science in the German-speaking area. With funding from the Ministry of Science, Research and the Arts (MWK) of the State of Baden-Württemberg, the platform underwent a complete technical and substantive overhaul in 2015.

Material Type: Reading

Author: OpenAccess Germany

SPARC Popular Resources

SPARC is a global coalition committed to making Open the default for research and education. SPARC empowers people to solve big problems and make new discoveries through the adoption of policies and practices that advance Open Access, Open Data, and Open Education.

Material Type: Reading

Author: Nick Shockey

Curate Science

Curate Science is a unified curation system and platform to verify that research is transparent and credible. It will allow researchers, journals, universities, funders, teachers, journalists, and the general public to ensure:

- Transparency: Ensure research meets minimum transparency standards appropriate to the article type and employed methodologies.
- Credibility: Ensure follow-up scrutiny is linked to its parent paper, including critical commentaries, reproducibility/robustness re-analyses, and new sample replications.

Material Type: Data Set

Data policies of highly-ranked social science journals

By encouraging and requiring that authors share their data in order to publish articles, scholarly journals have become an important actor in the movement to improve the openness of data and the reproducibility of research. But how many social science journals encourage or mandate that authors share the data supporting their research findings? How does the share of journal data policies vary by discipline? What influences these journals’ decisions to adopt such policies and instructions? And what do those policies and instructions look like? We discuss the results of our analysis of the instructions and policies of 291 highly-ranked journals publishing social science research, where we studied the contents of journal data policies and instructions across 14 variables, such as when and how authors are asked to share their data, and what role journal ranking and age play in the existence and quality of data policies and instructions. We also compare our results to the results of other studies that have analyzed the policies of social science journals, although differences in the journals chosen and how each study defines what constitutes a data policy limit this comparison.

We conclude that a little more than half of the journals in our study have data policies. A greater share of the economics journals have data policies and mandate sharing, followed by political science/international relations and psychology journals. Finally, we use our findings to make several recommendations: Policies should include the terms “data,” “dataset” or more specific terms that make it clear what to make available; policies should include the benefits of data sharing; journals, publishers, and associations need to collaborate more to clarify data policies; and policies should explicitly ask for qualitative data.

Material Type: Reading

Authors: Abigail Schwartz, Dessi Kirilova, Gerard Otalora, Julian Gautier, Mercè Crosas, Sebastian Karcher

Assessing data availability and research reproducibility in hydrology and water resources

There is broad interest in improving the reproducibility of published research. We developed a survey tool to assess the availability of digital research artifacts published alongside peer-reviewed journal articles (e.g. data, models, code, directions for use) and the reproducibility of article results. We used the tool to assess 360 of the 1,989 articles published by six hydrology and water resources journals in 2017. As in studies from other fields, we reproduced results for only a small fraction of articles (1.6% of tested articles) using their available artifacts. We estimated, with 95% confidence, that results might be reproduced for only 0.6% to 6.8% of all 1,989 articles. Unlike prior studies, the survey tool identified key bottlenecks to making work more reproducible. Bottlenecks include: only some digital artifacts available (44% of articles), no directions (89%), or all artifacts available but results not reproducible (5%). The tool (or extensions) can help authors, journals, funders, and institutions to self-assess manuscripts, provide feedback to improve reproducibility, and recognize and reward reproducible articles as examples for others.
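
For readers curious how an interval like "0.6% to 6.8%" can arise from a small sample, the sketch below computes an exact (Clopper-Pearson) binomial confidence interval in Python. This is an illustration only: the counts are placeholders, and the study's own estimation method may differ.

    # Hypothetical sketch: exact (Clopper-Pearson) 95% CI for a reproducibility rate.
    # k and n are illustrative placeholders, not the study's actual counts.
    from scipy.stats import beta

    k, n, alpha = 2, 124, 0.05   # articles reproduced, articles tested (assumed)
    lower = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
    upper = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
    print(f"observed rate: {k / n:.1%}")          # about 1.6% with these counts
    print(f"95% CI: {lower:.1%} to {upper:.1%}")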

Material Type: Reading

Authors: Adel M. Abdallah, David E. Rosenberg, Hadia Akbar, James H. Stagge, Nour A. Attallah, Ryan James

A manifesto for reproducible science

Improving the reliability and efficiency of scientific research will increase the credibility of the published scientific literature and accelerate discovery. Here we argue for the adoption of measures to optimize key elements of the scientific process: methods, reporting and dissemination, reproducibility, evaluation and incentives. There is some evidence from both simulations and empirical studies supporting the likely effectiveness of these measures, but their broad adoption by researchers, institutions, funders and journals will require iterative evaluation and improvement. We discuss the goals of these measures, and how they can be implemented, in the hope that this will facilitate action toward improving the transparency, reproducibility and efficiency of scientific research.

Material Type: Reading

Authors: Brian A. Nosek, Christopher D. Chambers, Dorothy V. M. Bishop, Eric-Jan Wagenmakers, Jennifer J. Ware, John P. A. Ioannidis, Katherine S. Button, Marcus R. Munafò, Nathalie Percie du Sert, Uri Simonsohn

Four simple recommendations to encourage best practices in research software

Scientific research relies on computer software, yet software is not always developed following practices that ensure its quality and sustainability. This manuscript does not aim to propose new software development best practices, but rather to provide simple recommendations that encourage the adoption of existing best practices. Software development best practices promote better quality software, and better quality software improves the reproducibility and reusability of research. These recommendations are designed around Open Source values, and provide practical suggestions that contribute to making research software and its source code more discoverable, reusable and transparent. This manuscript is aimed at developers, but also at organisations, projects, journals and funders that can increase the quality and sustainability of research software by encouraging the adoption of these recommendations.

Material Type: Reading

Authors: Alejandra Gonzalez-Beltran, Allegra Via, Andrew Treloar, Bérénice Batut, Bernard Pope, Björn Grüning, Jonas Hagberg, Brane Leskošek, Carole Goble, Daniel S. Katz, Daniel Vaughan, David Mellor, Federico López Gómez, Ferran Sanz, Harry-Anton Talvik, Horst Pichler, Ilian Todorov, Jon Ison, Josep Ll. Gelpí, Leyla Garcia, Luis J. Oliveira, Maarten van Gompel, Madison Flannery, Manuel Corpas, Maria V. Schneider, Martin Cook, Mateusz Kuzak, Michelle Barker, Mikael Borg, Monther Alhamdoosh, Montserrat González Ferreiro, Nathan S. Watson-Haigh, Neil Chue Hong, Nicola Mulder, Petr Holub, Philippa C. Griffin, Radka Svobodová Vařeková, Radosław Suchecki, Rafael C. Jiménez, Robert Pergl, Rob Hooft, Rowland Mosbergen, Salvador Capella-Gutierrez, Simon Gladman, Sonika Tyagi, Steve Crouch, Victoria Stodden, Xiaochuan Wang, Yasset Perez-Riverol

Data sharing in PLOS ONE: An analysis of Data Availability Statements

A number of publishers and funders, including PLOS, have recently adopted policies requiring researchers to share the data underlying their results and publications. Such policies help increase the reproducibility of the published literature, as well as make a larger body of data available for reuse and re-analysis. In this study, we evaluate the extent to which authors have complied with this policy by analyzing Data Availability Statements from 47,593 papers published in PLOS ONE between March 2014 (when the policy went into effect) and May 2016. Our analysis shows that compliance with the policy has increased, with a significant decline over time in papers that did not include a Data Availability Statement. However, only about 20% of statements indicate that data are deposited in a repository, which the PLOS policy states is the preferred method. More commonly, authors state that their data are in the paper itself or in the supplemental information, though it is unclear whether these data meet the level of sharing required in the PLOS policy. These findings suggest that additional review of Data Availability Statements or more stringent policies may be needed to increase data sharing.
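
As a sketch of how statements like these can be categorized at scale, the hypothetical keyword heuristic below sorts a Data Availability Statement into coarse bins. The categories and keywords are invented for illustration and are not the coding scheme used in the study.

    # Hypothetical sketch: keyword-based triage of Data Availability Statements.
    # Categories and keyword lists are illustrative, not the study's scheme.
    def classify_das(statement: str) -> str:
        s = statement.lower()
        if any(k in s for k in ("dryad", "figshare", "zenodo", "genbank", "repository")):
            return "repository"            # the method PLOS states it prefers
        if "supporting information" in s or "within the paper" in s:
            return "paper_or_si"
        if "upon request" in s:
            return "on_request"
        return "other"

    print(classify_das("All data are within the paper and its Supporting Information files."))
    # -> paper_or_si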

Material Type: Reading

Authors: Alicia Livinski, Christopher W. Belter, Douglas J. Joubert, Holly Thompson, Lisa M. Federer, Lissa N. Snyders, Ya-Ling Lu

A funder-imposed data publication requirement seldom inspired data sharing

Growth of the open science movement has drawn significant attention to data sharing and availability across the scientific community. In this study, we tested the ability to recover data collected under a particular funder-imposed requirement of public availability. We assessed overall data recovery success, tested whether characteristics of the data or data creator were indicators of recovery success, and identified hurdles to data recovery. Overall the majority of data were not recovered (26% recovery of 315 data projects), a similar result to journal-driven efforts to recover data. Field of research was the most important indicator of recovery success, but neither home agency sector nor age of data were determinants of recovery. While we did not find a relationship between recovery of data and age of data, age did predict whether we could find contact information for the grantee. The main hurdles to data recovery included those associated with communication with the researcher; loss of contact with the data creator accounted for half (50%) of unrecoverable datasets, and unavailability of contact information accounted for 35% of unrecoverable datasets. Overall, our results suggest that funding agencies and journals face similar challenges to enforcement of data requirements. We advocate that funding agencies could improve the availability of the data they fund by dedicating more resources to enforcing compliance with data requirements, providing data-sharing tools and technical support to awardees, and administering stricter consequences for those who ignore data sharing preconditions.

Material Type: Reading

Authors: Colette L. Ward, Gavin McDonald, Jessica L. Couture, Rachael E. Blake

Peer Review: Decisions, decisions

Journals are exploring new approaches to peer review in order to reduce bias, increase transparency and respond to author preferences. Funders are also getting involved. If you start reading about the subject of peer review, it won't be long before you encounter articles with titles like "Can we trust peer review?", "Is peer review just a crapshoot?" and "It's time to overhaul the secretive peer review process". Read some more and you will learn that despite its many shortcomings – it is slow, it is biased, and it lets flawed papers get published while rejecting work that goes on to win Nobel Prizes – the practice of having your work reviewed by your peers before it is published is still regarded as the 'gold standard' of scientific research. Carry on reading and you will discover that peer review as currently practiced is a relatively new phenomenon and that, ironically, there have been remarkably few peer-reviewed studies of peer review.

Material Type: Reading

Author: Peter Rodgers

Attitudes towards animal study registries and their characteristics: An online survey of three cohorts of animal researchers

Objectives: Prospective registration of animal studies has been suggested as a new measure to increase value and reduce waste in biomedical research. We sought to further explore and quantify animal researchers’ attitudes and preferences regarding animal study registries (ASRs).

Design: Cross-sectional online survey.

Setting and participants: We conducted a survey with three different samples representing animal researchers: i) corresponding authors from journals with high Eigenfactor, ii) a random PubMed sample and iii) members of the CAMARADES network.

Main outcome measures: Perceived level of importance of different aspects of publication bias, the effect of ASRs on different aspects of research, and the importance of different research types for being registered.

Results: The survey yielded responses from 413 animal researchers (response rate 7%). The respondents indicated that some aspects of ASRs can increase administrative burden, but that this could be outweighed by other aspects that decrease it. Animal researchers found it more important to register studies that involved animal species with higher levels of cognitive capabilities. The time frame for making registry entries publicly available revealed strong heterogeneity among respondents, with the largest proportion voting for “access only after consent by the principal investigator” and the second largest voting for “access immediately after registration”.

Conclusions: The fact that the more senior and experienced animal researchers participating in this survey clearly indicated the practical importance of publication bias and of ASRs underscores the problem awareness across animal researchers and the willingness to actively engage in study registration if effective safeguards for the potential weaknesses of ASRs are put into place. To overcome the first-mover dilemma, international consensus statements on how to deal with prospective registration of animal studies might be necessary for all relevant stakeholder groups, including animal researchers, academic institutions, private companies, funders, regulatory agencies, and journals.

Material Type: Reading

Authors: André Bleich, Daniel Strech, Emily S. Sena, Hans Laser, René Tolba, Susanne Wieschowski

An Introduction to Registered Reports for the Research Funder Community

In this webinar, Drs. David Mellor (Center for Open Science) and Stavroula Kousta (Nature Human Behaviour) discuss the Registered Reports publishing workflow and the benefits it may bring to funders of research. Dr. Mellor details the workflow and what it is intended to do, and Dr. Kousta discusses the lessons learned at Nature Human Behaviour from the journal's efforts to implement Registered Reports.

Material Type: Lecture

Author: Center for Open Science

COS Registered Reports Portal

Registered Reports: peer review before results are known, to align scientific values and practices. Registered Reports is a publishing format, used by over 250 journals, that emphasizes the importance of the research question and the quality of methodology by conducting peer review prior to data collection. High-quality protocols are then provisionally accepted for publication if the authors follow through with the registered methodology. This format is designed to reward best practices in adhering to the hypothetico-deductive model of the scientific method. It eliminates a variety of questionable research practices, including low statistical power, selective reporting of results, and publication bias, while allowing complete flexibility to report serendipitous findings. This page includes information on Registered Reports, including readings, Participating Journals, Details & Workflow, Resources for Editors, Resources for Funders, FAQs, and Allied Initiatives.

Material Type: Student Guide

Authors: Center for Open Science, David Mellor

Open Access Directory

The Open Access Directory is an online compendium of factual lists about open access to science and scholarship, maintained by the community at large. It exists as a wiki hosted by the School of Library and Information Science at Simmons University in Boston, USA. The goal is for the open access community itself to enlarge and correct the lists with little intervention from the editors or editorial board. For quality control, editing privileges are granted to registered users. As far as possible, lists are limited to brief factual statements without narrative or opinion.

Material Type: Reading

Author: OAD Simmons