The Open Science movement is rapidly changing the scientific landscape. Because exact definitions are often lacking and reforms are constantly evolving, accessible guides to open science are needed. This paper provides an introduction to open science and related reforms in the form of an annotated reading list of seven peer-reviewed articles, following the format of Etz et al. (2018). Written for researchers and students, particularly in psychological science, it highlights and introduces seven topics: understanding open science; open access; open data, materials, and code; reproducible analyses; preregistration and registered reports; replication research; and teaching open science. For each topic, we provide a detailed summary of one particularly informative and actionable article and suggest several further resources. Supporting a broader understanding of open science issues, this overview should enable researchers to engage with, improve, and implement current open, transparent, reproducible, replicable, and cumulative scientific practices.
Documents, protocols, consent forms, reagents, and all materials necessary for research.
OSF Guides are self-help introductions to using the Open Science Framework (OSF). OSF is a free and open source project management tool that supports researchers throughout their entire project lifecycle. This OSF Guide covers the topic of accessing your OSF account, including: Create an OSF Account; Sign in to OSF; Claim an Unregistered Account; and Reset Your Password.
OSF Guides are self-help introductions to using the Open Science Framework (OSF). OSF is a free and open source project management tool that supports researchers throughout their entire project lifecycle. This OSF Guide covers the topic of using add-on storage services in the OSF, including how to connect each of the following to a project: Amazon S3, Bitbucket, Box, Dataverse, Dropbox, figshare, GitHub, GitLab, Google Drive, OneDrive, and ownCloud.
Ongoing technological developments have made it easier than ever before for scientists to share their data, materials, and analysis code. Sharing data and analysis code makes it easier for other researchers to re-use or check published research. These benefits will only emerge if researchers can reproduce the analysis reported in published articles, and if data is annotated well enough that it is clear what all variables mean. Because most researchers have not been trained in computational reproducibility, it is important to evaluate current practices to identify practices that can be improved. We examined data and code sharing, as well as computational reproducibility of the main results, without contacting the original authors, for Registered Reports published in the psychological literature between 2014 and 2018. Of the 62 articles that met our inclusion criteria, data was available for 40 articles, and analysis scripts for 37 articles. For the 35 articles that shared both data and code and performed analyses in SPSS, R, Python, MATLAB, or JASP, we could run the scripts for 31 articles, and reproduce the main results for 20 articles. Although the proportion of articles that shared both data and code (35 out of 62, or 56%) and the proportion that could be computationally reproduced (20 out of 35, or 57%) were relatively high compared to other studies, there is clear room for improvement. We provide practical recommendations based on our observations, and link to examples of good research practices in the papers we reproduced.
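As a quick sanity check, the headline percentages follow directly from the counts given in the summary (a minimal sketch, using only the numbers stated above):

```python
# Counts reported in the study summary above.
total_articles = 62
shared_data_and_code = 35      # articles sharing both data and analysis code
reproduced = 20                # of those, main results computationally reproduced

sharing_rate = shared_data_and_code / total_articles      # ~0.56
reproducibility_rate = reproduced / shared_data_and_code  # ~0.57

print(f"shared both: {sharing_rate:.0%}, reproduced: {reproducibility_rate:.0%}")
```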
Scientific data and tools should, as much as possible, be free as in beer and free as in freedom. The vast majority of science today is paid for by taxpayer-funded grants; at the same time, the incredible successes of science are strong evidence for the benefit of collaboration in the pursuit of knowledge. Within the scientific academy, sharing of expertise, data, tools, etc. is prolific, but only recently with the rise of the Open Access movement has this sharing come to embrace the public. Even though most research data is never shared, both the public and even scientists in their own fields are often unaware of just how much data, tools, and other resources are made freely available for analysis! This list is a small attempt at bringing to light data repositories and computational science tools that are often siloed according to each scientific discipline, in the hopes of spurring along both public and professional contributions to science.
Experienced Registered Reports editors and reviewers come together to discuss the format and best practices for handling submissions. The panelists also share insights into what editors are looking for from reviewers as well as practical guidelines for writing a Registered Report. ABOUT THE PANELISTS: Chris Chambers | Chris is a professor of cognitive neuroscience at Cardiff University, Chair of the Registered Reports Committee supported by the Center for Open Science, and one of the founders of Registered Reports. He has helped establish the Registered Reports format at over a dozen journals. Anastasia Kiyonaga | Anastasia is a cognitive neuroscientist who uses converging behavioral, brain stimulation, and neuroimaging methods to probe memory and attention processes. She is currently a postdoctoral researcher with Mark D'Esposito in the Helen Wills Neuroscience Institute at the University of California, Berkeley. Before coming to Berkeley, she received her Ph.D. with Tobias Egner in the Duke Center for Cognitive Neuroscience. She will be an Assistant Professor in the Department of Cognitive Science at UC San Diego starting in January 2020. Jason Scimeca | Jason is a cognitive neuroscientist at UC Berkeley. His research investigates the neural systems that support high-level cognitive processes such as executive function, working memory, and the flexible control of behavior. He completed his Ph.D. at Brown University with David Badre and is currently a postdoctoral researcher in Mark D'Esposito's Cognitive Neuroscience Lab. Moderated by David Mellor, Director of Policy Initiatives for the Center for Open Science.
OSF Guides are self-help introductions to using the Open Science Framework (OSF). OSF is a free and open source project management tool that supports researchers throughout their entire project lifecycle. This OSF Guide covers the topic of best practices in open science, including: File Management and Licensing (File naming; Organizing files; Licensing; Version Control); Research Design (Preregistration; Creating a data management plan (DMP) document); Handling Data (How to Make a Data Dictionary); Sharing Research Outputs (Sharing data); and Publishing Research Outputs (Preprints).
A presentation introducing citation practices: how to cite data and code, and how to get citations for your own data and code.
OSF Guides are self-help introductions to using the Open Science Framework (OSF). OSF is a free and open source project management tool that supports researchers throughout their entire project lifecycle. This OSF Guide covers the topics of collaborating on the OSF, including: requesting access (Request Access to a Private Project; Request Access to a Public Project; Grant Access to a Project); commenting (Comment on a Project); and the wiki (Enable Wiki Contributions; Edit the Wiki; Add and Delete Wiki Pages; Rename Wiki Pages; View Versions of the Wiki; Disable the Wiki).
This webinar (recorded Sept. 27, 2017) introduces how to connect other services as add-ons to projects on the Open Science Framework (OSF; https://osf.io). Connecting services to your OSF projects via add-ons enables you to pull together the different parts of your research efforts without having to switch away from tools and workflows you wish to continue using. The OSF is a free, open source web application built to help researchers manage their workflows. The OSF is part collaboration tool, part version control software, and part data archive. The OSF connects to popular tools researchers already use, like Dropbox, Box, GitHub, and Mendeley, to streamline workflows and increase efficiency.
This video will go over three issues that can arise when scientific studies have low statistical power. All materials shown in the video, as well as the content from our other videos, can be found here: https://osf.io/7gqsi/
OSF Guides are self-help introductions to using the Open Science Framework (OSF). OSF is a free and open source project management tool that supports researchers throughout their entire project lifecycle. This OSF Guide covers the topics of creating and managing OSF projects, including: Projects and Components (Create a Project; Create Components; Create a Project from a Template; Delete a Project; Delete a Project with Components; Delete a Component); Contributors and Permissions (Understand Contributor Permissions; Add Contributors to Projects and Components; Edit Contributor Permissions; Remove Contributors from a Project; Import Contributors from a Parent Project into a Component; Add Admins from the Parent Project to a Component); and Management (Control Your Privacy Settings; View Recent Activity; Rename a Project; License Your Project; Configure Notifications; View Project Analytics).
In this deep dive session, Dr. Willa van Dijk discusses how transparency with data, materials, and code is beneficial for educational research and education researchers. She illustrates these points by sharing experiences with transparency that were crucial to her success. She then shifts gears to provide tips and tricks for planning a new research project with transparency in mind, including attention to potential pitfalls, and also discusses adapting materials from previous projects to share.
In this deep dive session, Amanda Montoya (UCLA) and Karen Rambo-Hernandez (Texas A&M University) introduce the basics of preregistration and Registered Reports: two methods for creating a permanent record of a research plan prior to conducting data collection. They discuss the conceptual similarities and practical differences between preregistration and Registered Reports. They provide practical advice from their own experiences using these practices in research labs and resources available for researchers interested in using these approaches. The session concludes with questions and discussion about adopting these practices and unique considerations for implementing these practices in education research.
Sharing data and code are important components of reproducible research. Data sharing in research is widely discussed in the literature; however, there are no well-established evidence-based incentives that reward data sharing, nor randomized studies that demonstrate the effectiveness of data sharing policies at increasing data sharing. A simple incentive, such as an Open Data Badge, might provide the change needed to increase data sharing in health and medical research. This study was a parallel group randomized controlled trial (protocol registration: doi:10.17605/OSF.IO/PXWZQ) with two groups, control and intervention, with 80 research articles published in BMJ Open per group, for a total of 160 research articles. The intervention group received an email offer for an Open Data Badge if they shared their data along with their final publication and the control group received an email with no offer of a badge if they shared their data with their final publication. The primary outcome was the data sharing rate. Badges did not noticeably motivate researchers who published in BMJ Open to share their data; the odds of awarding badges were nearly equal in the intervention and control groups (odds ratio = 0.9, 95% CI [0.1, 9.0]). Data sharing rates were low in both groups, with just two datasets shared in each of the intervention and control groups. The global movement towards open science has made significant gains with the development of numerous data sharing policies and tools. What remains to be established is an effective incentive that motivates researchers to take up such tools to share their data.
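The reported effect can be illustrated with the standard 2x2-table odds ratio and a Wald confidence interval. The counts below are an illustrative assumption drawn from the summary (two shared datasets out of 80 articles per group), not the study's exact analysis:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI for a 2x2 table:
    a = intervention, shared; b = intervention, not shared;
    c = control, shared;      d = control, not shared."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Illustrative counts: 2 of 80 articles shared data in each group.
or_, lo, hi = odds_ratio_ci(2, 78, 2, 78)
print(f"OR = {or_:.1f}, 95% CI [{lo:.1f}, {hi:.1f}]")
```

With counts this sparse, a Wald interval is very wide and only approximate, which is consistent with the wide interval the study reports.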
OSF Guides are self-help introductions to using the Open Science Framework (OSF). OSF is a free and open source project management tool that supports researchers throughout their entire project lifecycle. How can it be free? How will OSF be useful to my research? What is a registration? Get your questions about OSF answered here.
Best Practices Guides are a part of OSF Guides by the Center for Open Science. This Best Practices Guide covers File Management and Licensing, including: File naming; Organizing files; Licensing; and Version Control.
The FOSTER portal is an e-learning platform that brings together the best training resources addressed to those who need to know more about Open Science, or need to develop strategies and skills for implementing Open Science practices in their daily workflows. Here you will find a growing collection of training materials. Many different users - from early-career researchers, to data managers, librarians, research administrators, and graduate schools - can benefit from the portal. In order to meet their needs, the existing materials will be extended from basic to more advanced-level resources. In addition, discipline-specific resources will be created.
Best Practices Guides are a part of OSF Guides by the Center for Open Science. This Best Practices Guide covers Handling Data (How to Make a Data Dictionary) and Sharing Research Outputs (Sharing data).
This webinar outlines how to use the free Open Science Framework (OSF) as an Electronic Lab Notebook for personal work or private collaborations. Fundamental features we cover include how to record daily activity, how to store images or arbitrary data files, how to invite collaborators, how to view old versions of files, and how to connect all this usage to more complex structures that support the full work of a lab across multiple projects and experiments.