
Publishing

Reporting along the research lifecycle. Registrations, preprints, publication models, copyright, peer review, metrics, open access publishing, and more.
 

84 affiliated resources

Open Science Toolbox
Unrestricted Use
CC BY
Rating
0.0 stars

A vast body of helpful tools exists to foster Open Science practices. For clarity, this toolbox provides only a selection of links to these resources and tools. Our goal is to give a brief overview of ways to enhance your Open Science practices without taking up too much of your time.

Subject:
Applied Science
Life Science
Physical Science
Social Science
Material Type:
Reading
Provider:
Uni Muenchen
Author:
Lutz Heil
Date Added:
07/10/2019
The Open Science Training Handbook
Read the Fine Print
Some Rights Reserved
Rating
0.0 stars

Open Science, the movement to make scientific products and processes accessible to and reusable by all, is about culture and knowledge as much as it is about technologies and services. Convincing researchers of the benefits of changing their practices, and equipping them with the skills and knowledge needed to do so, is hence an important task. This book offers guidance and resources for Open Science instructors and trainers, as well as anyone interested in improving levels of transparency and participation in research practices. Supporting and connecting an emerging Open Science community that wishes to pass on its knowledge, the handbook suggests training activities that can be adapted to various settings and target audiences. The book equips trainers with methods, instructions, exemplary training outlines, and inspiration for their own Open Science trainings. It provides Open Science advocates across the globe with practical know-how to deliver Open Science principles to researchers and support staff. What works, what doesn’t? How can you make the most of limited resources? Here you will find a wealth of resources to help you build your own training events.

Subject:
Applied Science
Life Science
Physical Science
Social Science
Material Type:
Reading
Provider:
FOSTER Open Science
Author:
FOSTER Open Science
Date Added:
06/18/2020
Open Science: What, Why, and How
Unrestricted Use
CC BY
Rating
0.0 stars

Open Science is a collection of actions designed to make scientific processes more transparent and results more accessible. Its goal is to build a more replicable and robust science; it does so using new technologies, altering incentives, and changing attitudes. The current movement towards open science was spurred, in part, by a recent “series of unfortunate events” within psychology and other sciences. These events include the large number of studies that have failed to replicate and the prevalence of common research and publication procedures that could explain why. Many journals and funding agencies now encourage, require, or reward some open science practices, including pre-registration, providing full materials, posting data, distinguishing between exploratory and confirmatory analyses, and running replication studies. Individuals can practice and encourage open science in their many roles as researchers, authors, reviewers, editors, teachers, and members of hiring, tenure, promotion, and awards committees. A plethora of resources are available to help scientists, and science, achieve these goals.

Subject:
Applied Science
Life Science
Physical Science
Social Science
Material Type:
Reading
Author:
Bobbie Spellman
Elizabeth Gilbert
Katherine Corker
Date Added:
07/02/2018
Open Science in Latin America
Unrestricted Use
CC BY
Rating
0.0 stars

Note: This webinar was presented in Spanish. The slides presented during this webinar can be found here: https://osf.io/6qnse/ This webinar focuses on the state of open science in Latin America, from the efforts of individual researchers to open up their workflows, to tools that help researchers work openly, to promising new networks and initiatives in open science. Ricardo Hartley (@ametodico) is a professor of research methodology at the Universidad Central de Chile, a researcher in reproductive biology and in knowledge communication and assessment, organizer of OpenCon Santiago 2016 and 2017, and a COS ambassador. Erin McKiernan is a professor in the Department of Physics, Biomedical Physics Program, at the Universidad Nacional Autónoma de México. She is also the founder of the Why Open Research? project, an educational site where researchers can learn how to share their work, funded in part by the Shuttleworth Foundation. Fernan Federici Noe is an assistant professor and researcher at the Universidad Católica de Chile and an international fellow of the OpenPlant Synthetic Biology Center, University of Cambridge. Fernan is a member of the Gathering for Open Science Hardware (GOSH) and TECNOx (www.tecnox.org).

Subject:
Applied Science
Computer Science
Information Science
Material Type:
Lecture
Provider:
Center for Open Science
Author:
Center for Open Science
Date Added:
08/07/2020
Optimizing Research Collaboration
Unrestricted Use
CC BY
Rating
0.0 stars

In this webinar, we demonstrate the OSF tools available for contributors, labs, centers, and institutions that support stronger collaborations. The demo includes useful practices like: contributor management, the OSF wiki as an electronic lab notebook, using OSF to manage online courses and syllabi, and more. Finally, we look at how OSF Institutions can provide discovery and intelligence gathering infrastructure so that you can focus on conducting and supporting exceptional research. The Center for Open Science’s ongoing mission is to provide community and technical resources to support your commitments to rigorous, transparent research practices. Visit cos.io/institutions to learn more.

Subject:
Applied Science
Computer Science
Information Science
Material Type:
Lecture
Provider:
Center for Open Science
Author:
Center for Open Science
Date Added:
08/07/2020
Outcome reporting bias in randomized-controlled trials investigating antipsychotic drugs
Unrestricted Use
CC BY
Rating
0.0 stars

Recent literature hints that outcomes of clinical trials in medicine are selectively reported. If applicable to psychotic disorders, such bias would jeopardize the reliability of randomized clinical trials (RCTs) investigating antipsychotics and thus their extrapolation to clinical practice. We therefore comprehensively examined outcome reporting bias in RCTs of antipsychotic drugs through a systematic review of prespecified outcomes in ClinicalTrials.gov records of RCTs investigating antipsychotic drugs in schizophrenia and schizoaffective disorder between 1 January 2006 and 31 December 2013. These outcomes were compared with outcomes published in scientific journals. Our primary outcome measure was concordance between prespecified and published outcomes; secondary outcome measures included outcome modifications on ClinicalTrials.gov after trial inception and the effects of funding source and directionality of results on record adherence. Of the 48 RCTs, 85% did not fully adhere to the prespecified outcomes. Discrepancies between prespecified and published outcomes were found in 23% of RCTs for primary outcomes, whereas 81% of RCTs had at least one secondary outcome non-reported, newly introduced, or changed to a primary outcome in the respective publication. In total, 14% of primary and 44% of secondary prespecified outcomes were modified after trial initiation. Neither funding source (P=0.60) nor directionality of the RCT results (P=0.10) impacted ClinicalTrials.gov record adherence. Finally, the number of published safety endpoints (N=335) exceeded the number of prespecified safety outcomes 5.5-fold. We conclude that RCTs investigating antipsychotic drugs suffer from substantial outcome reporting bias and offer suggestions to both monitor and limit such bias in the future.

Subject:
Applied Science
Health, Medicine and Nursing
Material Type:
Reading
Provider:
Translational Psychiatry
Author:
C. H. Vinkers
C. M. C. Lemmens
J. J. Luykx
M. Lancee
R. S. Kahn
Date Added:
08/07/2020
Peer Review: Decisions, decisions
Unrestricted Use
CC BY
Rating
0.0 stars

Journals are exploring new approaches to peer review in order to reduce bias, increase transparency and respond to author preferences. Funders are also getting involved. If you start reading about the subject of peer review, it won't be long before you encounter articles with titles like Can we trust peer review?, Is peer review just a crapshoot? and It's time to overhaul the secretive peer review process. Read some more and you will learn that despite its many shortcomings – it is slow, it is biased, and it lets flawed papers get published while rejecting work that goes on to win Nobel Prizes – the practice of having your work reviewed by your peers before it is published is still regarded as the 'gold standard' of scientific research. Carry on reading and you will discover that peer review as currently practiced is a relatively new phenomenon and that, ironically, there have been remarkably few peer-reviewed studies of peer review.

Subject:
Applied Science
Information Science
Material Type:
Reading
Provider:
eLife
Author:
Peter Rodgers
Date Added:
08/07/2020
Poor replication validity of biomedical association studies reported by newspapers
Unrestricted Use
CC BY
Rating
0.0 stars

Objective: To investigate the replication validity of biomedical association studies covered by newspapers. Methods: We used a database of 4723 primary studies included in 306 meta-analysis articles. These studies associated a risk factor with a disease in three biomedical domains: psychiatry, neurology, and four somatic diseases. They were classified into a lifestyle category (e.g. smoking) and a non-lifestyle category (e.g. genetic risk). Using the database Dow Jones Factiva, we investigated the newspaper coverage of each study. Their replication validity was assessed by comparison with their corresponding meta-analyses. Results: Among the 5029 articles in our database, 156 primary studies (of which 63 were lifestyle studies) and 5 meta-analysis articles were reported in 1561 newspaper articles. The percentage of covered studies and the number of newspaper articles per study strongly increased with the impact factor of the journal that published each scientific study. Newspapers almost equally covered initial (5/39, 12.8%) and subsequent (58/600, 9.7%) lifestyle studies. In contrast, initial non-lifestyle studies were covered more often (48/366, 13.1%) than subsequent ones (45/3718, 1.2%). Newspapers never covered initial studies reporting null findings and rarely reported subsequent null observations. Only 48.7% of the 156 studies reported by newspapers were confirmed by the corresponding meta-analyses. Initial non-lifestyle studies were less often confirmed (16/48) than subsequent ones (29/45) and than lifestyle studies (31/63). Psychiatric studies covered by newspapers were less often confirmed (10/38) than neurological (26/41) or somatic (40/77) ones, which correlates with an even greater coverage of initial studies in psychiatry. Whereas 234 newspaper articles covered the 35 initial studies that were later disconfirmed, only four press articles covered a subsequent null finding and mentioned the refutation of an initial claim. Conclusion: Journalists preferentially cover initial findings, although these are often contradicted by meta-analyses, and they rarely inform the public when the findings are disconfirmed.

Subject:
Applied Science
Health, Medicine and Nursing
Material Type:
Reading
Provider:
PLOS ONE
Author:
Andy Smith
Estelle Dumas-Mallet
François Gonon
Thomas Boraud
Date Added:
08/07/2020
Preregistration: Improve Research Rigor, Reduce Bias
Unrestricted Use
CC BY
Rating
0.0 stars

In this webinar Professor Brian Nosek, Executive Director of the Center for Open Science (https://cos.io), outlines the practice of Preregistration and how it can aid in increasing the rigor and reproducibility of research. The webinar is co-hosted by the Health Research Alliance, a collaborative member organization of nonprofit research funders. Slides available at: https://osf.io/9m6tx/

Subject:
Applied Science
Computer Science
Information Science
Material Type:
Lecture
Provider:
Center for Open Science
Author:
Center for Open Science
Date Added:
08/07/2020
Preregistration in Complex Contexts: A Preregistration Template for the Application of Cognitive Models
Unrestricted Use
CC BY
Rating
0.0 stars

In recent years, open science practices have become increasingly popular in psychology and related sciences. These practices aim to increase rigour and transparency in science as a potential response to the challenges posed by the replication crisis. Many of these reforms -- including the highly influential preregistration -- have been designed for experimental work that tests simple hypotheses with standard statistical analyses, such as assessing whether an experimental manipulation has an effect on a variable of interest. However, psychology is a diverse field of research, and the somewhat narrow focus of the prevalent discussions surrounding and templates for preregistration has led to debates on how appropriate these reforms are for areas of research with more diverse hypotheses and more complex methods of analysis, such as cognitive modelling research within mathematical psychology. Our article attempts to bridge the gap between open science and mathematical psychology, focusing on the type of cognitive modelling that Crüwell, Stefan, & Evans (2019) labelled model application, where researchers apply a cognitive model as a measurement tool to test hypotheses about parameters of the cognitive model. Specifically, we (1) discuss several potential researcher degrees of freedom within model application, (2) provide the first preregistration template for model application, and (3) provide an example of a preregistered model application using our preregistration template. More broadly, we hope that our discussions and proposals constructively advance the debate surrounding preregistration in cognitive modelling, and provide a guide for how preregistration templates may be developed in other diverse or complex research contexts.

Subject:
Applied Science
Life Science
Physical Science
Social Science
Material Type:
Reading
Author:
Nathan Evans
Sophia Crüwell
Date Added:
12/07/2019
Public Availability of Published Research Data in High-Impact Journals
Unrestricted Use
CC BY
Rating
0.0 stars

Background: There is increasing interest in making primary data from published research publicly available. We aimed to assess the current status of making research data available in highly cited journals across the scientific literature. Methods and Results: We reviewed the first 10 original research papers of 2009 published in the 50 original research journals with the highest impact factor. For each journal we documented the policies related to public availability and sharing of data. Of the 50 journals, 44 (88%) had a statement in their instructions to authors related to public availability and sharing of data. However, there was wide variation in journal requirements, ranging from requiring the sharing of all primary data related to the research to merely including a statement in the published manuscript that data can be made available on request. Of the 500 assessed papers, 149 (30%) were not subject to any data availability policy. Of the remaining 351 papers that were covered by some data availability policy, 208 papers (59%) did not fully adhere to the data availability instructions of the journals they were published in, most commonly (73%) by not publicly depositing microarray data. The other 143 papers that adhered to the data availability instructions did so by publicly depositing only the specific data type as required, making a statement of willingness to share, or actually sharing all the primary data. Overall, only 47 papers (9%) deposited full primary raw data online. None of the 149 papers not subject to data availability policies made their full primary data publicly available. Conclusion: A substantial proportion of original research papers published in high-impact journals are either not subject to any data availability policies, or do not adhere to the data availability instructions in their respective journals. This empirical evaluation highlights opportunities for improvement.

Subject:
Applied Science
Health, Medicine and Nursing
Material Type:
Reading
Provider:
PLOS ONE
Author:
Alawi A. Alsheikh-Ali
John P. A. Ioannidis
Mouaz H. Al-Mallah
Waqas Qureshi
Date Added:
08/07/2020
Publication Bias in Psychology: A Diagnosis Based on the Correlation between Effect Size and Sample Size
Unrestricted Use
CC BY
Rating
0.0 stars

Background: The p value obtained from a significance test provides no information about the magnitude or importance of the underlying phenomenon. Therefore, additional reporting of effect size is often recommended. Effect sizes are theoretically independent of sample size. Yet this may not hold true empirically: non-independence could indicate publication bias. Methods: We investigated whether effect size is independent of sample size in psychological research. We randomly sampled 1,000 psychological articles from all areas of psychological research. We extracted p values, effect sizes, and sample sizes of all empirical papers, calculated the correlation between effect size and sample size, and investigated the distribution of p values. Results: We found a negative correlation of r = −.45 [95% CI: −.53; −.35] between effect size and sample size. In addition, we found an inordinately high number of p values just passing the boundary of significance. Additional data showed that neither implicit nor explicit power analysis could account for this pattern of findings. Conclusion: The negative correlation between effect size and sample size, and the biased distribution of p values, indicate pervasive publication bias in the entire field of psychology.
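
The diagnostic the authors describe is easy to picture in code: correlate per-study effect sizes with sample sizes and look at how p values cluster around the significance threshold. Below is a minimal sketch in Python using hypothetical placeholder data; the array names and the choice of Spearman correlation are assumptions for illustration, not the authors' analysis pipeline.

```python
# Minimal sketch of the effect-size vs. sample-size diagnostic for publication bias.
# All data below are hypothetical placeholders, not the study's coded articles.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical extracted statistics for 1,000 studies.
sample_sizes = rng.integers(20, 500, size=1000)
effect_sizes = rng.uniform(0.05, 0.6, size=1000)   # |r| reported per study
p_values = rng.uniform(0.0, 0.25, size=1000)       # p values reported per study

# 1) Correlation between effect size and sample size.
#    Without selective publication these should be roughly independent;
#    a clearly negative correlation is consistent with publication bias.
rho, p = stats.spearmanr(effect_sizes, sample_sizes)
print(f"effect size vs. sample size: rho = {rho:.2f} (p = {p:.3g})")

# 2) Distribution of p values around the .05 threshold.
#    An excess of values just below .05 relative to just above it is another warning sign.
just_below = np.sum((p_values > 0.04) & (p_values <= 0.05))
just_above = np.sum((p_values > 0.05) & (p_values <= 0.06))
print(f"p values in (.04, .05]: {just_below}; in (.05, .06]: {just_above}")
```

With real coded data in place of the random placeholders, a markedly negative correlation together with a pile-up of p values just under .05 would point in the same direction as the paper's conclusion.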

Subject:
Psychology
Social Science
Material Type:
Reading
Provider:
PLOS ONE
Author:
Anton Kühberger
Astrid Fritz
Thomas Scherndl
Date Added:
08/07/2020
P values in display items are ubiquitous and almost invariably significant: A survey of top science journals
Unrestricted Use
CC BY
Rating
0.0 stars

P values represent a widely used but pervasively misunderstood and fiercely contested method of scientific inference. Display items, such as figures and tables, which often contain the main results, are an important source of P values. We conducted a survey comparing the overall use of P values and the occurrence of significant P values in display items of a sample of articles in the three top multidisciplinary journals (Nature, Science, PNAS) in 2017 and, respectively, in 1997. We also examined the reporting of multiplicity corrections and its potential influence on the proportion of statistically significant P values. Our findings demonstrated substantial and growing reliance on P values in display items, with increases of 2.5 to 14.5 times in 2017 compared to 1997. The overwhelming majority of P values (94%, 95% confidence interval [CI] 92% to 96%) were statistically significant. Methods to adjust for multiplicity were almost non-existent in 1997, but were reported in many articles relying on P values in 2017 (Nature 68%, Science 48%, PNAS 38%). In their absence, almost all reported P values were statistically significant (98%, 95% CI 96% to 99%). Conversely, when any multiplicity corrections were described, 88% (95% CI 82% to 93%) of reported P values were statistically significant. Use of Bayesian methods was scant (2.5%), and articles rarely (0.7%) relied exclusively on Bayesian statistics. Overall, wider appreciation of the need for multiplicity corrections is a welcome evolution, but the rapid growth of reliance on P values and implausibly high rates of reported statistical significance are worrisome.
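
The abstract's contrast between corrected and uncorrected P values is easiest to see with a concrete multiplicity adjustment. The sketch below applies a plain Bonferroni correction to a handful of hypothetical P values from a single display item; it illustrates the idea of multiplicity correction, not the survey's own methodology.

```python
# Minimal sketch of a Bonferroni multiplicity correction (illustration only).
# The p values below are hypothetical, not data from the survey.

def bonferroni(p_values):
    """Return Bonferroni-adjusted p values: each p times the number of tests, capped at 1."""
    m = len(p_values)
    return [min(p * m, 1.0) for p in p_values]

# Hypothetical p values from one figure reporting several comparisons.
raw = [0.002, 0.011, 0.03, 0.049, 0.20]
adjusted = bonferroni(raw)

alpha = 0.05
print("significant before correction:", sum(p < alpha for p in raw))        # 4 of 5
print("significant after correction: ", sum(p < alpha for p in adjusted))   # 1 of 5
```

The same logic explains the pattern in the survey: once a display item's P values are adjusted for the number of tests performed, fewer of them remain below the conventional .05 threshold.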

Subject:
Mathematics
Statistics and Probability
Material Type:
Reading
Provider:
PLOS ONE
Author:
Ioana Alina Cristea
John P. A. Ioannidis
Date Added:
08/07/2020
Registered Reports Q&A
Unrestricted Use
CC BY
Rating
0.0 stars

This webinar addresses questions related to writing, reviewing, editing, or funding a study using the Registered Report format, featuring Chris Chambers and ...

Subject:
Education
Material Type:
Lesson
Provider:
Center for Open Science
Author:
Chris Chambers
David Mellor
Date Added:
03/31/2021
Registered reports: an early example and analysis
Unrestricted Use
CC BY
Rating
0.0 stars

The recent ‘replication crisis’ in psychology has focused attention on ways of increasing methodological rigor within the behavioral sciences. Part of this work has involved promoting ‘Registered Reports’, wherein journals peer review papers prior to data collection and publication. Although this approach is usually seen as a relatively recent development, we note that a prototype of this publishing model was initiated in the mid-1970s by parapsychologist Martin Johnson in the European Journal of Parapsychology (EJP). A retrospective and observational comparison of Registered and non-Registered Reports published in the EJP during a seventeen-year period provides circumstantial evidence to suggest that the approach helped to reduce questionable research practices. This paper aims both to bring Johnson’s pioneering work to a wider audience, and to investigate the positive role that Registered Reports may play in helping to promote higher methodological and statistical standards.

Subject:
Applied Science
Information Science
Psychology
Social Science
Material Type:
Reading
Provider:
PeerJ
Author:
Caroline Watt
Diana Kornbrot
Richard Wiseman
Date Added:
08/07/2020
Releasing a preprint is associated with more attention and citations for the peer-reviewed article
Unrestricted Use
CC BY
Rating
0.0 stars

Preprints in biology are becoming more popular, but only a small fraction of the articles published in peer-reviewed journals have previously been released as preprints. To examine whether releasing a preprint on bioRxiv was associated with the attention and citations received by the corresponding peer-reviewed article, we assembled a dataset of 74,239 articles, 5,405 of which had a preprint, published in 39 journals. Using log-linear regression and random-effects meta-analysis, we found that articles with a preprint had, on average, a 49% higher Altmetric Attention Score and 36% more citations than articles without a preprint. These associations were independent of several other article- and author-level variables (such as scientific subfield and number of authors), and were unrelated to journal-level variables such as access model and Impact Factor. This observational study can help researchers and publishers make informed decisions about how to incorporate preprints into their work.
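
The percentage differences quoted above come from a log-linear model, in which the exponentiated coefficient on the preprint indicator reads as a multiplicative difference in attention or citations. Below is a minimal sketch of that kind of regression on a simulated table of articles; the column names (has_preprint, n_authors, citations) are assumptions for illustration, and the sketch omits the paper's per-journal fits and random-effects meta-analysis.

```python
# Minimal sketch of a log-linear regression of citations on preprint status.
# Simulated, hypothetical data; a simplified stand-in for the paper's analysis.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 2000

# Hypothetical article-level table.
df = pd.DataFrame({
    "has_preprint": rng.integers(0, 2, size=n),
    "n_authors": rng.integers(1, 15, size=n),
})
# Simulate citation counts with a built-in ~36% boost for preprinted articles.
mu = np.exp(1.5 + np.log(1.36) * df["has_preprint"] + 0.05 * df["n_authors"])
df["citations"] = rng.poisson(mu)

# Log-linear model: log(citations + 1) ~ preprint status + covariates.
fit = smf.ols("np.log1p(citations) ~ has_preprint + n_authors", data=df).fit()
coef = fit.params["has_preprint"]
print(f"estimated citation difference for preprinted articles: {np.expm1(coef):+.0%}")
```

On real data, the reported 36% citation difference corresponds to this kind of exponentiated coefficient; the simulation simply makes the interpretation of the log-linear fit concrete.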

Subject:
Biology
Life Science
Material Type:
Reading
Provider:
eLife
Author:
Darwin Y Fu
Jacob J Hughey
Date Added:
08/07/2020
Risk of Bias in Reports of In Vivo Research: A Focus for Improvement
Unrestricted Use
CC BY
Rating
0.0 stars

The reliability of experimental findings depends on the rigour of experimental design. Here we show limited reporting of measures to reduce the risk of bias in a random sample of life sciences publications, significantly lower reporting of randomisation in work published in journals of high impact, and very limited reporting of measures to reduce the risk of bias in publications from leading United Kingdom institutions. Ascertainment of differences between institutions might serve both as a measure of research quality and as a tool for institutional efforts to improve research quality.

Subject:
Biology
Life Science
Material Type:
Reading
Provider:
PLOS Biology
Author:
Aaron Lawson McLean
Aikaterini Kyriakopoulou
Andrew Thomson
Aparna Potluru
Arno de Wilde
Cristina Nunes-Fonseca
David W. Howells
Emily S. Sena
Gillian L. Currie
Hanna Vesterinen
Julija Baginskitae
Kieren Egan
Leonid Churilov
Malcolm R. Macleod
Nicki Sherratt
Rachel Hemblade
Stylianos Serghiou
Theo Hirst
Zsanett Bahor
Date Added:
08/07/2020
SPARC Popular Resources
Unrestricted Use
CC BY
Rating
0.0 stars

SPARC is a global coalition committed to making Open the default for research and education. SPARC empowers people to solve big problems and make new discoveries through the adoption of policies and practices that advance Open Access, Open Data, and Open Education.

Subject:
Applied Science
Life Science
Physical Science
Social Science
Material Type:
Reading
Provider:
SPARC
Author:
Nick Shockey
Date Added:
01/31/2020