
Search Resources

9 Results

Badges to Acknowledge Open Practices: A Simple, Low-Cost, Effective Method for Increasing Transparency
Unrestricted Use
CC BY

Beginning January 2014, Psychological Science gave authors the opportunity to signal open data and materials if they qualified for badges that accompanied published articles. Before badges, less than 3% of Psychological Science articles reported open data. After badges, 23% reported open data, with an accelerating trend; 39% reported open data in the first half of 2015, an increase of more than an order of magnitude from baseline. There was no change over time in the low rates of data sharing among comparison journals. Moreover, reporting openness does not guarantee openness. When badges were earned, reportedly available data were more likely to be actually available, correct, usable, and complete than when badges were not earned. Open materials also increased to a weaker degree, and there was more variability among comparison journals. Badges are simple, effective signals to promote open practices and improve preservation of data and materials by using independent repositories.

Subject:
Biology
Life Science
Psychology
Social Science
Material Type:
Reading
Provider:
PLOS Biology
Author:
Agnieszka Slowik
Brian A. Nosek
Carina Sonnleitner
Chelsey Hess-Holden
Curtis Kennett
Erica Baranski
Lina-Sophia Falkenberg
Ljiljana B. Lazarević
Mallory C. Kidwell
Sarah Piechowski
Susann Fiedler
Timothy M. Errington
Tom E. Hardwicke
Date Added:
08/07/2020
Evaluating Registered Reports: A Naturalistic Comparative Study of Article Impact
Unrestricted Use
CC BY

Registered Reports (RRs) is a publishing model in which initial peer review is conducted prior to knowing the outcomes of the research. In-principle acceptance of papers at this review stage combats publication bias, and provides a clear distinction between confirmatory and exploratory research. Some editors raise a practical concern about adopting RRs. By reducing publication bias, RRs may produce more negative or mixed results and, if such results are not valued by the research community, receive fewer citations as a consequence. If so, by adopting RRs, a journal’s impact factor may decline. Despite known flaws with impact factor, it is still used as a heuristic for judging journal prestige and quality. Whatever the merits of considering impact factor as a decision-rule for adopting RRs, it is worthwhile to know whether RRs are cited less than other articles. We will conduct a naturalistic comparison of citation and altmetric impact between published RRs and comparable empirical articles from the same journals.

Subject:
Life Science
Social Science
Material Type:
Reading
Author:
Brian A. Nosek
Felix Singleton Thorn
Lilian T. Hummer
Timothy M. Errington
Date Added:
08/07/2020
Local Grassroots Networks Engaging Open Science in Their Communities
Unrestricted Use
CC BY

This recorded webinar features insights from international panelists currently nurturing culture change in research among their local communities. Representat...

Subject:
Education
Material Type:
Lesson
Provider:
Center for Open Science
Author:
Brian Nosek
Date Added:
03/31/2021
Metascience Forum 2020
Unrestricted Use
CC BY

In his talk, Professor Nosek defines replication as gathering evidence that tests an empirical claim made in an original paper. This intent influences the design and interpretation of a replication study and addresses confusion between conceptual and direct replications.

Subject:
Education
Material Type:
Lecture
Provider:
Center for Open Science
Author:
Brian Nosek
Date Added:
03/21/2021
Recommendations for Increasing Replicability in Psychology
Unrestricted Use
CC BY

Replicability of findings is at the heart of any empirical science. The aim of this article is to move the current replicability debate in psychology towards concrete recommendations for improvement. We focus on research practices but also offer guidelines for reviewers, editors, journal management, teachers, granting institutions, and university promotion committees, highlighting some of the emerging and existing practical solutions that can facilitate implementation of these recommendations. The challenges for improving replicability in psychological science are systemic. Improvement can occur only if changes are made at many levels of practice, evaluation, and reward.

Subject:
Psychology
Social Science
Material Type:
Reading
Provider:
European Journal of Personality
Author:
Brent W. Roberts
Brian A. Nosek
David C. Funder
Filip De Fruyt
Hannelore Weber
Jaap J. A. Denissen
Jan De Houwer
Jelte M. Wicherts
Jens B. Asendorpf
Klaus Fiedler
Manfred Schmitt
Marcel A. G. van Aken
Marco Perugini
Mark Conner
Reinhold Kliegl
Susann Fiedler
Date Added:
08/07/2020
Scientific Utopia: II. Restructuring Incentives and Practices to Promote Truth Over Publishability
Unrestricted Use
CC BY

An academic scientist’s professional success depends on publishing. Publishing norms emphasize novel, positive results. As such, disciplinary incentives encourage design, analysis, and reporting decisions that elicit positive results and ignore negative results. Prior reports demonstrate how these incentives inflate the rate of false effects in published science. When incentives favor novelty over replication, false results persist in the literature unchallenged, reducing efficiency in knowledge accumulation. Previous suggestions to address this problem are unlikely to be effective. For example, a journal of negative results publishes otherwise unpublishable reports. This enshrines the low status of the journal and its content. The persistence of false findings can be meliorated with strategies that make the fundamental but abstract accuracy motive—getting it right—competitive with the more tangible and concrete incentive—getting it published. This article develops strategies for improving scientific practices and knowledge accumulation that account for ordinary human motivations and biases.

Subject:
Psychology
Social Science
Material Type:
Reading
Provider:
Perspectives on Psychological Science
Author:
Brian A. Nosek
Jeffrey R. Spies
Matt Motyl
Date Added:
08/07/2020
What is replication?
Read the Fine Print

Replications are inevitably different from the original studies. How do we decide whether something is a replication? The answer shifts the conception of replication from a boring, uncreative, housekeeping activity to an exciting, generative, vital contributor to research progress.

Subject:
Applied Science
Life Science
Physical Science
Social Science
Material Type:
Reading
Provider:
Center for Open Science
Author:
Brian A. Nosek
Timothy M. Errington
Date Added:
09/10/2019
A manifesto for reproducible science
Unrestricted Use
CC BY

Improving the reliability and efficiency of scientific research will increase the credibility of the published scientific literature and accelerate discovery. Here we argue for the adoption of measures to optimize key elements of the scientific process: methods, reporting and dissemination, reproducibility, evaluation and incentives. There is some evidence from both simulations and empirical studies supporting the likely effectiveness of these measures, but their broad adoption by researchers, institutions, funders and journals will require iterative evaluation and improvement. We discuss the goals of these measures, and how they can be implemented, in the hope that this will facilitate action toward improving the transparency, reproducibility and efficiency of scientific research.

Subject:
Social Science
Material Type:
Reading
Provider:
Nature Human Behaviour
Author:
Brian A. Nosek
Christopher D. Chambers
Dorothy V. M. Bishop
Eric-Jan Wagenmakers
Jennifer J. Ware
John P. A. Ioannidis
Katherine S. Button
Marcus R. Munafò
Nathalie Percie du Sert
Uri Simonsohn
Date Added:
08/07/2020
An open investigation of the reproducibility of cancer biology research
Unrestricted Use
CC BY

It is widely believed that research that builds upon previously published findings has reproduced the original work. However, it is rare for researchers to perform or publish direct replications of existing results. The Reproducibility Project: Cancer Biology is an open investigation of reproducibility in preclinical cancer biology research. We have identified 50 high-impact cancer biology articles published in the period 2010–2012, and plan to replicate a subset of experimental results from each article. A Registered Report detailing the proposed experimental designs and protocols for each subset of experiments will be peer reviewed and published prior to data collection. The results of these experiments will then be published in a Replication Study. The resulting open methodology and dataset will provide evidence about the reproducibility of high-impact results, and an opportunity to identify predictors of reproducibility.

Subject:
Applied Science
Biology
Health, Medicine and Nursing
Life Science
Material Type:
Reading
Provider:
eLife
Author:
Brian A Nosek
Elizabeth Iorns
Fraser Elisabeth Tan
Joelle Lomax
Timothy M Errington
William Gunn
Date Added:
08/07/2020