Search Resources

8 Results

COS Registered Reports Portal
Unrestricted Use
CC BY

Registered Reports: Peer review before results are known to align scientific values and practices.

Registered Reports is a publishing format, used by over 250 journals, that emphasizes the importance of the research question and the quality of the methodology by conducting peer review prior to data collection. High-quality protocols are then provisionally accepted for publication, contingent on the authors following through with the registered methodology.

This format is designed to reward best practices in adhering to the hypothetico-deductive model of the scientific method. It eliminates a variety of questionable research practices, including low statistical power, selective reporting of results, and publication bias, while allowing complete flexibility to report serendipitous findings.

This page includes information on Registered Reports, including readings, participating journals, details and workflow, resources for editors, resources for funders, FAQs, and allied initiatives.

Subject:
Applied Science
Life Science
Physical Science
Social Science
Material Type:
Student Guide
Provider:
Center for Open Science
Author:
Center for Open Science
David Mellor
Date Added:
08/07/2020
Four simple recommendations to encourage best practices in research software
Unrestricted Use
CC BY

Scientific research relies on computer software, yet software is not always developed following practices that ensure its quality and sustainability. This manuscript does not aim to propose new software development best practices, but rather to provide simple recommendations that encourage the adoption of existing best practices. Software development best practices promote better quality software, and better quality software improves the reproducibility and reusability of research. These recommendations are designed around Open Source values, and provide practical suggestions that contribute to making research software and its source code more discoverable, reusable and transparent. This manuscript is aimed at developers, but also at organisations, projects, journals and funders that can increase the quality and sustainability of research software by encouraging the adoption of these recommendations.

Subject:
Applied Science
Computer Science
Information Science
Material Type:
Reading
Provider:
F1000Research
Author:
Alejandra Gonzalez-Beltran
Allegra Via
Andrew Treloar
Bernard Pope
Björn Grüning
Jonas Hagberg
Brane Leskošek
Bérénice Batut
Carole Goble
Daniel S. Katz
Daniel Vaughan
David Mellor
Federico López Gómez
Ferran Sanz
Harry-Anton Talvik
Horst Pichler
Ilian Todorov
Jon Ison
Josep Ll. Gelpí
Leyla Garcia
Luis J. Oliveira
Maarten van Gompel
Madison Flannery
Manuel Corpas
Maria V. Schneider
Martin Cook
Mateusz Kuzak
Michelle Barker
Mikael Borg
Monther Alhamdoosh
Montserrat González Ferreiro
Nathan S. Watson-Haigh
Neil Chue Hong
Nicola Mulder
Petr Holub
Philippa C. Griffin
Radka Svobodová Vařeková
Radosław Suchecki
Rafael C. Jiménez
Rob Hooft
Robert Pergl
Rowland Mosbergen
Salvador Capella-Gutierrez
Simon Gladman
Sonika Tyagi
Steve Crouch
Victoria Stodden
Xiaochuan Wang
Yasset Perez-Riverol
Date Added:
08/07/2020
Improving the credibility of empirical legal research: practical suggestions for researchers, journals, and law schools
Only Sharing Permitted
CC BY-NC-ND

Fields closely related to empirical legal research are enhancing their methods to improve the credibility of their findings. This includes making data, analysis code, and other materials openly available, and preregistering studies. Empirical legal research appears to be lagging behind other fields. This may be due, in part, to a lack of meta-research and guidance on empirical legal studies. The authors seek to fill that gap by evaluating some indicators of credibility in empirical legal research, including a review of guidelines at legal journals. They then provide both general recommendations for researchers, and more specific recommendations aimed at three commonly used empirical legal methods: case law analysis, surveys, and qualitative studies. They end with suggestions for policies and incentive systems that may be implemented by journals and law schools.

Subject:
Law
Material Type:
Reading
Author:
Alex Holcombe
Alexander DeHaven
Crystal N. Steltenpohl
David Mellor
Justin Pickett
Kathryn Zeiler
Simine Vazire
Tobias Heycke
Jason Chin
Date Added:
11/13/2020
Registered Reports Q&A
Unrestricted Use
CC BY

This webinar addresses questions related to writing, reviewing, editing, or funding a study using the Registered Report format, featuring Chris Chambers and ...

Subject:
Education
Material Type:
Lesson
Provider:
Center for Open Science
Author:
Chris Chambers
David Mellor
Date Added:
03/31/2021
Secondary Data Preregistration
Unrestricted Use
Public Domain

Preregistration is the process of specifying project details, such as hypotheses, data collection procedures, and analytical decisions, prior to conducting a study. It is designed to make a clearer distinction between data-driven, exploratory work and a priori, confirmatory work. Both modes of research are valuable, but they are easy to conflate unintentionally. See the Preregistration Revolution for more background and recommendations.

For research that uses existing datasets, there is an increased risk of analysts being biased by preliminary trends in the data. That risk can be reduced by proper blinding to any summary statistics in the dataset and by the use of hold-out datasets, where the "training" and "validation" subsets are kept separate from each other. See this page for specific recommendations about "split samples" or "hold-out" datasets. Finally, if those procedures are not followed, disclosure of possible biases can inform the researcher and her audience about the proper role any results should have (i.e., the results should be deemed mostly exploratory and ideal for additional confirmation).

This project contains a template for creating your preregistration, designed specifically for research using existing data. In the future, this template will be integrated into the OSF.
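The split-sample workflow described above can be sketched in a few lines of Python. This is an illustrative sketch only; `split_holdout`, its parameters, and the 70/30 split are assumptions for the example, not part of the preregistration template itself.

```python
import random

def split_holdout(records, holdout_fraction=0.3, seed=42):
    """Partition an existing dataset into an exploratory ("training")
    subset and a confirmatory hold-out subset before any analysis.

    A fixed seed makes the split reproducible, so the exact partition
    can be documented in a preregistration. (The function name and
    parameters here are illustrative assumptions.)"""
    rng = random.Random(seed)
    shuffled = records[:]          # copy, so the original order is untouched
    rng.shuffle(shuffled)
    n_holdout = int(len(shuffled) * holdout_fraction)
    holdout = shuffled[:n_holdout]       # set aside until confirmation
    exploratory = shuffled[n_holdout:]   # free to explore without penalty
    return exploratory, holdout

# Example: 100 observation IDs standing in for rows of a real dataset
data = list(range(100))
exploratory, holdout = split_holdout(data)
print(len(exploratory), len(holdout))  # prints: 70 30
```

The key design point is that the hold-out subset is created (and its seed recorded) before anyone inspects the data, so the confirmatory analysis is specified blind to any trends in that subset.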

Subject:
Life Science
Social Science
Material Type:
Reading
Author:
Alexander C. DeHaven
Andrew Hall
Brian Brown
Charles R. Ebersole
Courtney K. Soderberg
David Thomas Mellor
Elliott Kruse
Jerome Olsen
Jessica Kosie
K. D. Valentine
Lorne Campbell
Marjan Bakker
Olmo van den Akker
Pamela Davis-Kean
Rodica I. Damian
Stuart J. Ritchie
Thuy-vy Nguyen
William J. Chopik
Sara J. Weston
Date Added:
08/03/2021
Secondary Data Preregistration
Unrestricted Use
Public Domain


Subject:
Applied Science
Material Type:
Reading
Author:
Alexander C. DeHaven
Andrew Hall
Brian Brown
Charles R. Ebersole
Courtney K. Soderberg
David Thomas Mellor
Elliott Kruse
Jerome Olsen
Jessica Kosie
K. D. Valentine
Lorne Campbell
Marjan Bakker
Olmo van den Akker
Pamela Davis-Kean
Rodica I. Damian
Stuart J. Ritchie
Thuy-vy Nguyen
William J. Chopik
Sara J. Weston
Date Added:
08/12/2021