
Search Resources

100 Results

Selected filters:
  • publishing
Dissemination and publication of research findings: an updated review of related biases
Read the Fine Print

Objectives
To identify and appraise empirical studies on publication and related biases published since 1998; to assess methods for dealing with publication and related biases; and to examine, in a random sample of published systematic reviews, measures taken to prevent, reduce and detect dissemination bias.

Data sources
The main literature search, in August 2008, covered the Cochrane Methodology Register Database, MEDLINE, EMBASE, AMED and CINAHL. In May 2009, PubMed, PsycINFO and OpenSIGLE were also searched. Reference lists of retrieved studies were also examined.

Review methods
In Part I, studies were classified as evidence or method studies and data were extracted according to types of dissemination bias or methods for dealing with it. Evidence from empirical studies was summarised narratively. In Part II, 300 systematic reviews were randomly selected from MEDLINE and the methods used to deal with publication and related biases were assessed.

Results
Studies with significant or positive results were more likely to be published than those with non-significant or negative results, confirming findings from a previous HTA report. There was convincing evidence that outcome reporting bias exists and has an impact on the pooled summary in systematic reviews. Studies with significant results tended to be published earlier than studies with non-significant results, and empirical evidence suggests that published studies tended to report a greater treatment effect than those from the grey literature. Exclusion of non-English-language studies appeared to result in a high risk of bias in some areas of research, such as complementary and alternative medicine. In a few cases, publication and related biases had a potentially detrimental impact on patients or resource use. Publication bias can be prevented before a literature review (e.g. by prospective registration of trials), detected during a literature review (e.g. by locating unpublished studies, funnel plots and related tests, or sensitivity analysis modelling), or its impact can be minimised after a literature review (e.g. by confirmatory large-scale trials or updating the systematic review). The funnel plot and related statistical tests used to assess publication bias were often interpreted too simplistically, in ways likely to mislead. More sophisticated modelling methods have not been widely used. Compared with systematic reviews published in 1996, recent reviews of health-care interventions were more likely to locate and include non-English-language studies, grey literature or unpublished studies, and to test for publication bias.

Conclusions
Dissemination of research findings is likely to be a biased process, although the actual impact of such bias depends on specific circumstances. Prospective registration of clinical trials and endorsement of reporting guidelines may reduce research dissemination bias in clinical research. In systematic reviews, measures can be taken to minimise the impact of dissemination bias by systematically searching for and including relevant studies that are difficult to access. Statistical methods can be useful for sensitivity analyses. Further research is needed to develop methods for qualitatively assessing the risk of publication bias in systematic reviews, and to evaluate the effects of prospective registration of studies, open access policies and improved publication guidelines.

Subject:
Applied Science
Health, Medicine and Nursing
Material Type:
Reading
Provider:
Health Technology Assessment
Author:
Aj Sutton
C Hing
C Pang
Cs Kwok
F Song
I Harvey
J Ryder
L Hooper
S Parekh
Yk Loke
Date Added:
08/07/2020
Does use of the CONSORT Statement impact the completeness of reporting of randomised controlled trials published in medical journals? A Cochrane review
Unrestricted Use
CC BY

Background
The Consolidated Standards of Reporting Trials (CONSORT) Statement is intended to facilitate better reporting of randomised clinical trials (RCTs). A systematic review recently published in the Cochrane Library assesses whether journal endorsement of CONSORT impacts the completeness of reporting of RCTs; those findings are summarised here.

Methods
Evaluations assessing the completeness of reporting of RCTs against any of 27 outcomes derived from the 1996 or 2001 CONSORT checklists were included; two primary comparisons were evaluated. The 27 outcomes were: the 22 items of the 2001 CONSORT checklist, four sub-items describing blinding, and a ‘total summary score’ of aggregate items, as reported. Relative risks (RR) and 99% confidence intervals were calculated to determine effect estimates for each outcome across evaluations.
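The per-outcome effect estimate described above, a relative risk with a 99% confidence interval, can be formed with the standard log-RR normal approximation. The sketch below is illustrative only: the counts and the helper name are hypothetical, not taken from the review.

```python
import math
from statistics import NormalDist

def relative_risk_ci(a, n1, b, n2, alpha=0.01):
    """Relative risk of an outcome being reported, with a (1 - alpha) CI.

    a / n1: trials reporting the item / total trials in endorsing journals
    b / n2: the same counts in non-endorsing journals
    (Hypothetical helper; uses the log-RR normal approximation.)
    """
    rr = (a / n1) / (b / n2)
    # Standard error of log(RR) for two independent proportions
    se = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)
    z = NormalDist().inv_cdf(1 - alpha / 2)  # ~2.576 for a 99% CI
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical data: 80/100 endorsing-journal trials report an item vs 60/100
rr, lo, hi = relative_risk_ci(80, 100, 60, 100)
```

Using α = 0.01 here mirrors the 99% intervals and significance threshold stated in the abstract.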

Results
Fifty-three reports describing 50 evaluations of 16,604 RCTs were assessed for adherence to at least one of 27 outcomes. Sixty-nine of 81 meta-analyses showed a relative benefit from CONSORT endorsement on completeness of reporting. Between endorsing and non-endorsing journals, 25 outcomes were improved with CONSORT endorsement, five of these significantly (α = 0.01). The number of evaluations per meta-analysis was often low, with substantial heterogeneity; validity was assessed as low or unclear for many evaluations.

Conclusions
The results of this review suggest that journals' endorsement of CONSORT may benefit the completeness of reporting of the RCTs they publish. No evidence suggests that endorsement hinders the completeness of RCT reporting. However, despite relative improvements when CONSORT is endorsed by journals, the completeness of reporting of trials remains sub-optimal. Journals are not sending a clear message about endorsement to authors submitting manuscripts for publication. As such, the fidelity of endorsement as an ‘intervention’ has been weak to date. Journals need to take further action regarding their endorsement and implementation of CONSORT to facilitate accurate, transparent and complete reporting of trials.

Subject:
Applied Science
Health, Medicine and Nursing
Material Type:
Reading
Provider:
Systematic Reviews
Author:
David Moher
Douglas G Altman
Kenneth F Schulz
Larissa Shamseer
Lucy Turner
Date Added:
08/07/2020
Empirical Study of Data Sharing by Authors Publishing in PLoS Journals
Unrestricted Use
CC BY

Background
Many journals now require that authors share their data with other investigators, either by depositing the data in a public repository or by making it freely available upon request. These policies are explicit, but remain largely untested. We sought to determine how well authors comply with such policies by requesting data from authors who had published in one of two journals with clear data sharing policies.

Methods and Findings
We requested data from ten investigators who had published in either PLoS Medicine or PLoS Clinical Trials. All responses were carefully documented. In the event that we were refused data, we reminded authors of the journal's data sharing guidelines. If we did not receive a response to our initial request, a second request was made. Following the ten requests for raw data, three investigators did not respond, four authors responded and refused to share their data, two email addresses were no longer valid, and one author requested further details. A reminder of PLoS's explicit requirement that authors share data did not change the reply from the four authors who initially refused. Only one author sent an original data set.

Conclusions
We received only one of ten raw data sets requested. This suggests that journal policies requiring data sharing do not lead to authors making their data sets available to independent investigators.

Subject:
Applied Science
Health, Medicine and Nursing
Material Type:
Reading
Provider:
PLOS ONE
Author:
Andrew J. Vickers
Caroline J. Savage
Date Added:
08/07/2020
Empirical assessment of published effect sizes and power in the recent cognitive neuroscience and psychology literature
Unrestricted Use
CC BY

We have empirically assessed the distribution of published effect sizes and estimated power by analyzing 26,841 statistical records from 3,801 cognitive neuroscience and psychology papers published recently. The reported median effect size was D = 0.93 (interquartile range: 0.64–1.46) for nominally statistically significant results and D = 0.24 (0.11–0.42) for nonsignificant results. Median power to detect small, medium, and large effects was 0.12, 0.44, and 0.73, reflecting no improvement through the past half-century. This is so because sample sizes have remained small. Assuming similar true effect sizes in both disciplines, power was lower in cognitive neuroscience than in psychology. Journal impact factors negatively correlated with power. Assuming a realistic range of prior probabilities for null hypotheses, false report probability is likely to exceed 50% for the whole literature. In light of our findings, the recently reported low replication success in psychology is realistic, and worse performance may be expected for cognitive neuroscience.
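The power figures quoted above can be approximated without specialised software. The sketch below is a rough illustration using the common normal approximation to the two-sample t-test; it is not the authors' method, which analysed actual published statistical records, and the sample size of 20 per group is a hypothetical choice for illustration.

```python
from math import sqrt
from statistics import NormalDist

def approx_power_two_sample(d, n_per_group, alpha=0.05):
    """Approximate power of a two-sided, two-sample t-test.

    Normal approximation: power ~ Phi(d * sqrt(n/2) - z_{1 - alpha/2}).
    Good enough for intuition; exact power uses the noncentral t.
    """
    nd = NormalDist()
    z_crit = nd.inv_cdf(1 - alpha / 2)
    noncentrality = d * sqrt(n_per_group / 2)
    return nd.cdf(noncentrality - z_crit)

# With a hypothetical n = 20 per group, a small effect (d = 0.2) is badly
# underpowered, while a large effect (d = 0.8) approaches the 0.73 median
# power for large effects reported in the abstract.
p_small = approx_power_two_sample(0.2, 20)
p_large = approx_power_two_sample(0.8, 20)
```

The small-effect result illustrates the abstract's central point: with typical sample sizes, power to detect small effects stays near 0.1.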

Subject:
Psychology
Social Science
Material Type:
Reading
Provider:
PLOS Biology
Author:
Denes Szucs
John P. A. Ioannidis
Date Added:
08/07/2020
Evaluating Registered Reports: A Naturalistic Comparative Study of Article Impact
Unrestricted Use
CC BY

Registered Reports (RRs) are a publishing model in which initial peer review is conducted before the outcomes of the research are known. In-principle acceptance of papers at this review stage combats publication bias and provides a clear distinction between confirmatory and exploratory research. Some editors raise a practical concern about adopting RRs: by reducing publication bias, RRs may produce more negative or mixed results and, if such results are not valued by the research community, may receive fewer citations as a consequence. If so, by adopting RRs, a journal’s impact factor may decline. Despite known flaws with the impact factor, it is still used as a heuristic for judging journal prestige and quality. Whatever the merits of considering impact factor as a decision rule for adopting RRs, it is worthwhile to know whether RRs are cited less than other articles. We will conduct a naturalistic comparison of citation and altmetric impact between published RRs and comparable empirical articles from the same journals.

Subject:
Life Science
Social Science
Material Type:
Reading
Author:
Brian A. Nosek
Felix Singleton Thorn
Lilian T. Hummer
Timothy M. Errington
Date Added:
08/07/2020
Foster Open Science
Unrestricted Use
CC BY

The FOSTER portal is an e-learning platform that brings together the best training resources addressed to those who need to know more about Open Science, or need to develop strategies and skills for implementing Open Science practices in their daily workflows. Here you will find a growing collection of training materials. Many different users - from early-career researchers, to data managers, librarians, research administrators, and graduate schools - can benefit from the portal. In order to meet their needs, the existing materials will be extended from basic to more advanced-level resources. In addition, discipline-specific resources will be created.

Subject:
Applied Science
Life Science
Physical Science
Social Science
Material Type:
Full Course
Provider:
FOSTER Open Science
Author:
FOSTER Open Science
Date Added:
08/07/2020
Histories of Information, Communication, and Computing Technologies
Conditional Remix & Share Permitted
CC BY-NC-SA

The histories of information, communication, and computing technologies have attracted attention from scholars across a variety of disciplines. This course introduces students to prominent voices in these topics across fields. Alongside readings introducing students to this broad scholarly terrain, the course offers guidance in research and writing for publication based on the reality that PhD candidates on the job market need to be published authors, and that every term paper has the potential to be a journal article. We work towards publication by reading widely-cited scholarly histories both for their content and for what they can tell us about scholarly craft.

Subject:
Arts and Humanities
Business and Communication
Communication
History
Social Science
Material Type:
Full Course
Provider:
MIT
Provider Set:
MIT OpenCourseWare
Author:
Light, Jennifer
Date Added:
02/01/2015
How significant are the public dimensions of faculty work in review, promotion and tenure documents?
Unrestricted Use
CC BY

Much of the work done by faculty at both public and private universities has significant public dimensions: it is often paid for by public funds; it is often aimed at serving the public good; and it is often subject to public evaluation. To understand how the public dimensions of faculty work are valued, we analyzed review, promotion, and tenure documents from a representative sample of 129 universities in the US and Canada. Terms and concepts related to public and community are mentioned in a large portion of documents, but mostly in ways that relate to service, which is an undervalued aspect of academic careers. Moreover, the documents make significant mention of traditional research outputs and citation-based metrics: however, such outputs and metrics reward faculty work targeted to academics, and often disregard the public dimensions. Institutions that seek to embody their public mission could therefore work towards changing how faculty work is assessed and incentivized.

Subject:
Applied Science
Information Science
Material Type:
Reading
Provider:
eLife
Author:
Carol Muñoz Nieves
Erin C McKiernan
Gustavo E Fischman
Juan P Alperin
Lesley A Schimanski
Meredith T Niles
Date Added:
08/07/2020
How to Use OSF as an Electronic Lab Notebook
Unrestricted Use
CC BY

This webinar outlines how to use the free Open Science Framework (OSF) as an Electronic Lab Notebook for personal work or private collaborations. Fundamental features we cover include how to record daily activity, how to store images or arbitrary data files, how to invite collaborators, how to view old versions of files, and how to connect all this usage to more complex structures that support the full work of a lab across multiple projects and experiments.

Subject:
Applied Science
Computer Science
Information Science
Material Type:
Lecture
Provider:
Center for Open Science
Author:
Center for Open Science
Date Added:
08/07/2020
Information Literacy for Master's and PhD students
Conditional Remix & Share Permitted
CC BY-NC-SA

Welcome to this information literacy course for Master’s and PhD students. You probably already have some knowledge of information literacy, but if some of it has slipped your mind or if terms sound unfamiliar, this course includes links to information from the instructions for Bachelor’s students.

Writing your Master’s thesis involves a number of different phases. You cannot simply start writing! You will first need extensive knowledge of the general field of research, in order to see where your subject fits in.

Subject:
Career and Technical Education
Material Type:
Full Course
Provider:
Delft University of Technology
Provider Set:
Delft University OpenCourseWare
Author:
Education Support team
Date Added:
08/16/2019
Introduction to Media Studies, Fall 2003
Conditional Remix & Share Permitted
CC BY-NC-SA

Offers an overview of the social, cultural, political, and economic impact of mediated communication on modern culture. Combines critical discussions with hands-on "experiments" working with different media. Media covered include radio, television, film, the printed word, and digital technologies. Topics include the nature and function of media, core media institutions, and media in transition.

Subject:
Economics
Social Science
Material Type:
Full Course
Homework/Assignment
Syllabus
Provider:
MIT
Provider Set:
MIT OpenCourseWare
Author:
Walsh, Andrea S.
Date Added:
01/01/2003
Introduction to Preprints
Unrestricted Use
CC BY

This is a recording of a 45-minute introductory webinar on preprints. With our guest speaker Philip Cohen, we cover what preprints/postprints are and the benefits of preprints, and address some common concerns researchers may have. We show how to determine whether you can post preprints/postprints, and also demonstrate how to use OSF Preprints (https://osf.io/preprints/) to share preprints. The OSF is the flagship product of the Center for Open Science, a non-profit technology start-up dedicated to improving the alignment between scientific values and scientific practices. Learn more at cos.io and osf.io, or email contact@cos.io.

Subject:
Applied Science
Computer Science
Information Science
Material Type:
Lecture
Provider:
Center for Open Science
Author:
Center for Open Science
Date Added:
08/07/2020
An Introduction to Registered Reports for the Research Funder Community
Unrestricted Use
CC BY

In this webinar, Dr. David Mellor (Center for Open Science) and Dr. Stavroula Kousta (Nature Human Behaviour) discuss the Registered Reports publishing workflow and the benefits it may bring to funders of research. Dr. Mellor details the workflow and what it is intended to do, and Dr. Kousta discusses the lessons learned at Nature Human Behaviour from its efforts to implement Registered Reports as a journal.

Subject:
Applied Science
Computer Science
Information Science
Material Type:
Lecture
Provider:
Center for Open Science
Author:
Center for Open Science
Date Added:
08/07/2020
Intro to Calculating Confidence Intervals
Unrestricted Use
CC BY

This video will introduce how to calculate confidence intervals around effect sizes using the MBESS package in R. All materials shown in the video, as well as content from our other videos, can be found here: https://osf.io/7gqsi/
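The video uses R's MBESS package, which computes exact confidence intervals based on the noncentral t-distribution. As a rough cross-language illustration of the same idea, the hypothetical sketch below builds an approximate interval around Cohen's d using the common large-sample standard error; it is an approximation for intuition, not a substitute for MBESS.

```python
from math import sqrt
from statistics import NormalDist

def cohens_d_ci(d, n1, n2, confidence=0.95):
    """Approximate CI around Cohen's d (hypothetical helper).

    Uses the large-sample SE of d for two independent groups:
    SE ~ sqrt((n1 + n2) / (n1 * n2) + d^2 / (2 * (n1 + n2))).
    Exact intervals (as in MBESS) invert the noncentral t instead.
    """
    se = sqrt((n1 + n2) / (n1 * n2) + d ** 2 / (2 * (n1 + n2)))
    z = NormalDist().inv_cdf((1 + confidence) / 2)
    return d - z * se, d + z * se

# Example: d = 0.5 estimated from two groups of 50 yields a wide interval,
# illustrating why CIs around effect sizes matter alongside point estimates.
lo, hi = cohens_d_ci(0.5, 50, 50)
```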

Subject:
Applied Science
Computer Science
Information Science
Material Type:
Lecture
Provider:
Center for Open Science
Author:
Center for Open Science
Date Added:
08/07/2020
Licensing your research
Unrestricted Use
CC BY

Join us for a 30-minute guest webinar by Brandon Butler, Director of Information Policy at the University of Virginia. This webinar will introduce questions to think about when picking a license for your research. You can signal which license you pick using the License Picker on the Open Science Framework (OSF; https://osf.io). The OSF is a free, open source web application built to help researchers manage their workflows. The OSF is part collaboration tool, part version control software, and part data archive. The OSF connects to popular tools researchers already use, like Dropbox, Box, GitHub, and Mendeley, and is now integrated with JASP, to streamline workflows and increase efficiency.

Subject:
Applied Science
Computer Science
Information Science
Material Type:
Lecture
Provider:
Center for Open Science
Author:
Center for Open Science
Date Added:
08/07/2020
Mapping the universe of registered reports
Read the Fine Print

Registered reports present a substantial departure from traditional publishing models with the goal of enhancing the transparency and credibility of the scientific literature. We map the evolving universe of registered reports to assess their growth, implementation and shortcomings at journals across scientific disciplines.

Subject:
Psychology
Social Science
Material Type:
Reading
Provider:
Nature Human Behaviour
Author:
John P. A. Ioannidis
Tom E. Hardwicke
Date Added:
08/07/2020
Meeting the Requirements of Funders Around Open Science: Open Resources and Processes for Education
Unrestricted Use
CC BY

Expectations by funders for transparent and reproducible methods are on the rise. This session covers expectations for preregistration, data sharing, and open access results of three key funders of education research including the Institute of Education Sciences, the National Science Foundation, and Arnold Ventures. Presenters cover practical resources for meeting these requirements such as the Registry for Efficacy and Effectiveness Studies (REES), the Open Science Framework (OSF), and EdArXiv. Presenters: Jessaca Spybrook, Western Michigan University Bryan Cook, University of Virginia David Mellor, Center for Open Science

Subject:
Applied Science
Computer Science
Information Science
Material Type:
Lecture
Provider:
Center for Open Science
Author:
Center for Open Science
Date Added:
08/07/2020
Meta-assessment of bias in science
Unrestricted Use
CC BY

Numerous biases are believed to affect the scientific literature, but their actual prevalence across disciplines is unknown. To gain a comprehensive picture of the potential imprint of bias in science, we probed for the most commonly postulated bias-related patterns and risk factors in a large random sample of meta-analyses taken from all disciplines. The magnitude of these biases varied widely across fields and was overall relatively small. However, we consistently observed a significant risk that small, early, and highly cited studies overestimate effects and that studies not published in peer-reviewed journals underestimate them. We also found at least partial confirmation of previous evidence suggesting that US studies and early studies might report more extreme effects, although these effects were smaller and more heterogeneously distributed across meta-analyses and disciplines. Authors publishing at high rates and receiving many citations were, overall, not at greater risk of bias. However, effect sizes were likely to be overestimated by early-career researchers, those working in small or long-distance collaborations, and those responsible for scientific misconduct, supporting hypotheses that connect bias to situational factors, lack of mutual control, and individual integrity. Some of these patterns and risk factors might have modestly increased in intensity over time, particularly in the social sciences. Our findings suggest that, beyond routine caution that small, highly cited, and earlier studies may yield inflated results, the feasibility and costs of interventions to attenuate biases in the literature might need to be discussed on a discipline-specific and topic-specific basis.

Subject:
Applied Science
Biology
Health, Medicine and Nursing
Life Science
Physical Science
Social Science
Material Type:
Reading
Provider:
Proceedings of the National Academy of Sciences
Author:
Daniele Fanelli
John P. A. Ioannidis
Rodrigo Costas
Date Added:
08/07/2020