Search Resources

8 Results

Selected filters:
  • Jelte M. Wicherts
Degrees of Freedom in Planning, Running, Analyzing, and Reporting Psychological Studies: A Checklist to Avoid p-Hacking
Unrestricted Use
CC BY

The designing, collecting, analyzing, and reporting of psychological studies entail many choices that are often arbitrary. The opportunistic use of these so-called researcher degrees of freedom aimed at obtaining statistically significant results is problematic because it enhances the chances of false positive results and may inflate effect size estimates. In this review article, we present an extensive list of 34 degrees of freedom that researchers have in formulating hypotheses, and in designing, running, analyzing, and reporting of psychological research. The list can be used in research methods education, and as a checklist to assess the quality of preregistrations and to determine the potential for bias due to (arbitrary) choices in unregistered studies.

Subject:
Psychology
Material Type:
Reading
Provider:
Frontiers in Psychology
Author:
Coosje L. S. Veldkamp
Hilde E. M. Augusteijn
Jelte M. Wicherts
Marcel A. L. M. van Assen
Marjan Bakker
Robbie C. M. van Aert
Date Added:
08/07/2020
Journal Data Sharing Policies and Statistical Reporting Inconsistencies in Psychology
Unrestricted Use
CC BY

In this paper, we present three retrospective observational studies that investigate the relation between data sharing and statistical reporting inconsistencies. Previous research found that reluctance to share data was related to a higher prevalence of statistical errors, often in the direction of statistical significance (Wicherts, Bakker, & Molenaar, 2011). We therefore hypothesized that journal policies about data sharing and data sharing itself would reduce these inconsistencies. In Study 1, we compared the prevalence of reporting inconsistencies in two similar journals on decision making with different data sharing policies. In Study 2, we compared reporting inconsistencies in psychology articles published in PLOS journals (with a data sharing policy) and Frontiers in Psychology (without a stipulated data sharing policy). In Study 3, we looked at papers published in the journal Psychological Science to check whether papers with or without an Open Practice Badge differed in the prevalence of reporting errors. Overall, we found no relationship between data sharing and reporting inconsistencies. We did find that journal policies on data sharing seem extremely effective in promoting data sharing. We argue that open data is essential in improving the quality of psychological science, and we discuss ways to detect and reduce reporting inconsistencies in the literature.

Subject:
Psychology
Material Type:
Reading
Provider:
Collabra: Psychology
Author:
Coosje L. S. Veldkamp
Jelte M. Wicherts
Jeroen Borghuis
Linda Dominguez-Alvarez
Marcel A. L. M. van Assen
Michèle B. Nuijten
Date Added:
08/07/2020
Questionable research practices among Italian research psychologists
Unrestricted Use
CC BY

A survey in the United States revealed that an alarmingly large percentage of university psychologists admitted having used questionable research practices that can contaminate the research literature with false positive and biased findings. We conducted a replication of this study among Italian research psychologists to investigate whether these findings generalize to other countries. All the original materials were translated into Italian, and members of the Italian Association of Psychology were invited to participate via an online survey. The percentages of Italian psychologists who admitted to having used ten questionable research practices were similar to the results obtained in the United States, although there were small but significant differences in self-admission rates for some QRPs. Nearly all researchers (88%) admitted using at least one of the practices, and researchers generally considered a practice possibly defensible if they admitted using it, but Italian researchers were much less likely than US researchers to consider a practice defensible. Participants’ estimates of the percentage of researchers who have used these practices were greater than the self-admission rates, and participants estimated that researchers would be unlikely to admit it. In written responses, participants argued that some of these practices are not questionable and that they have used some practices because reviewers and journals demand it. The similarity of results obtained in the United States, this study, and a related study conducted in Germany suggests that adoption of these practices is an international phenomenon and is likely due to systemic features of the international research and publication processes.

Subject:
Psychology
Material Type:
Reading
Provider:
PLOS ONE
Author:
Coosje L. S. Veldkamp
Franca Agnoli
Jelte M. Wicherts
Paolo Albiero
Roberto Cubelli
Date Added:
08/07/2020
Recommendations for Increasing Replicability in Psychology
Unrestricted Use
CC BY

Replicability of findings is at the heart of any empirical science. The aim of this article is to move the current replicability debate in psychology towards concrete recommendations for improvement. We focus on research practices but also offer guidelines for reviewers, editors, journal management, teachers, granting institutions, and university promotion committees, highlighting some of the emerging and existing practical solutions that can facilitate implementation of these recommendations. The challenges for improving replicability in psychological science are systemic. Improvement can occur only if changes are made at many levels of practice, evaluation, and reward.

Subject:
Psychology
Material Type:
Reading
Provider:
European Journal of Personality
Author:
Brent W. Roberts
Brian A. Nosek
David C. Funder
Filip De Fruyt
Hannelore Weber
Jaap J. A. Denissen
Jan De Houwer
Jelte M. Wicherts
Jens B. Asendorpf
Klaus Fiedler
Manfred Schmitt
Marcel A. G. van Aken
Marco Perugini
Mark Conner
Reinhold Kliegl
Susann Fiedler
Date Added:
08/07/2020
The Weak Spots in Contemporary Science (and How to Fix Them)
Unrestricted Use
CC BY

In this review, the author discusses several of the weak spots in contemporary science, including scientific misconduct, the problems of post hoc hypothesizing (HARKing), outcome switching, theoretical bloopers in formulating research questions and hypotheses, selective reading of the literature, selective citing of previous results, improper blinding and other design failures, p-hacking or researchers’ tendency to analyze data in many different ways to find positive (typically significant) results, errors and biases in the reporting of results, and publication bias. The author presents some empirical results highlighting problems that lower the trustworthiness of reported results in scientific literatures, including that of animal welfare studies. Some of the underlying causes of these biases are discussed based on the notion that researchers are only human and hence are not immune to confirmation bias, hindsight bias, and minor ethical transgressions. The author discusses solutions in the form of enhanced transparency, sharing of data and materials, (post-publication) peer review, pre-registration, registered reports, improved training, reporting guidelines, replication, dealing with publication bias, alternative inferential techniques, power, and other statistical tools.

Subject:
Biology
Material Type:
Reading
Provider:
Animals
Author:
Jelte M. Wicherts
Date Added:
08/07/2020
Willingness to Share Research Data Is Related to the Strength of the Evidence and the Quality of Reporting of Statistical Results
Unrestricted Use
CC BY

Background: The widespread reluctance to share published research data is often hypothesized to be due to the authors' fear that reanalysis may expose errors in their work or may produce conclusions that contradict their own. However, these hypotheses have not previously been studied systematically.

Methods and Findings: We related the reluctance to share research data for reanalysis to 1148 statistically significant results reported in 49 papers published in two major psychology journals. We found the reluctance to share data to be associated with weaker evidence (against the null hypothesis of no effect) and a higher prevalence of apparent errors in the reporting of statistical results. The unwillingness to share data was particularly clear when reporting errors had a bearing on statistical significance.

Conclusions: Our findings on the basis of psychological papers suggest that statistical results are particularly hard to verify when reanalysis is more likely to lead to contrasting conclusions. This highlights the importance of establishing mandatory data archiving policies.

Subject:
Psychology
Material Type:
Reading
Provider:
PLOS ONE
Author:
Dylan Molenaar
Jelte M. Wicherts
Marjan Bakker
Date Added:
08/07/2020
A consensus-based transparency checklist
Unrestricted Use
CC BY

We present a consensus-based checklist to improve and document the transparency of research reports in social and behavioural research. An accompanying online application allows users to complete the form and generate a report that they can submit with their manuscript or post to a public repository.

Subject:
Psychology
Material Type:
Reading
Provider:
Nature Human Behaviour
Author:
Agneta Fisher
Alexandra M. Freund
Alexandra Sarafoglou
Alice S. Carter
Andrew A. Bennett
Andrew Gelman
Balazs Aczel
Barnabas Szaszi
Benjamin R. Newell
Brendan Nyhan
Candice C. Morey
Charles Clifton
Christopher Beevers
Christopher D. Chambers
Christopher Sullivan
Cristina Cacciari
Daniel Benjamin
Daniel J. Simons
David R. Shanks
Debra Lieberman
Derek Isaacowitz
Dolores Albarracin
Don P. Green
D. Stephen Lindsay
Eric-Jan Wagenmakers
Eric Johnson
Eveline A. Crone
Fernando Hoces de la Guardia
Fiammetta Cosci
George C. Banks
Gordon D. Logan
Hal R. Arkes
Harold Pashler
Janet Kolodner
Jarret Crawford
Jeffrey Pollack
Jelte M. Wicherts
John Antonakis
John Curtin
John P. Ioannidis
Joseph Cesario
Kai Jonas
Lea Moersdorf
Lisa L. Harlow
Marcus Munafò
Mark Fichman
M. Gareth Gaskell
Mike Cortese
Mitja D. Back
Morton A. Gernsbacher
Nelson Cowan
Nicole D. Anderson
Pasco Fearon
Randall Engle
Robert L. Greene
Roger Giner-Sorolla
Ronán M. Conroy
Scott O. Lilienfeld
Simine Vazire
Simon Farrell
Šimon Kucharský
Stavroula Kousta
Ty W. Boyer
Wendy B. Mendes
Wiebke Bleidorn
Willem Frankenhuis
Zoltan Kekecs
Date Added:
08/07/2020