
Search Resources

6 Results

Degrees of Freedom in Planning, Running, Analyzing, and Reporting Psychological Studies: A Checklist to Avoid p-Hacking
License: CC BY (Unrestricted Use)

The design, data collection, analysis, and reporting of psychological studies entail many choices that are often arbitrary. The opportunistic use of these so-called researcher degrees of freedom to obtain statistically significant results is problematic because it increases the chance of false positive results and may inflate effect size estimates. In this review article, we present an extensive list of 34 degrees of freedom that researchers have in formulating hypotheses and in designing, running, analyzing, and reporting psychological research. The list can be used in research methods education, as a checklist to assess the quality of preregistrations, and to determine the potential for bias due to (arbitrary) choices in unregistered studies.

Subject:
Psychology
Social Science
Material Type:
Reading
Provider:
Frontiers in Psychology
Author:
Coosje L. S. Veldkamp
Hilde E. M. Augusteijn
Jelte M. Wicherts
Marcel A. L. M. van Assen
Marjan Bakker
Robbie C. M. van Aert
Date Added:
08/07/2020
Questionable research practices among Italian research psychologists
License: CC BY (Unrestricted Use)

A survey in the United States revealed that an alarmingly large percentage of university psychologists admitted having used questionable research practices (QRPs) that can contaminate the research literature with false positive and biased findings. We conducted a replication of this study among Italian research psychologists to investigate whether these findings generalize to other countries. All the original materials were translated into Italian, and members of the Italian Association of Psychology were invited to participate via an online survey. The percentages of Italian psychologists who admitted to having used ten QRPs were similar to the results obtained in the United States, although there were small but significant differences in self-admission rates for some practices. Nearly all researchers (88%) admitted using at least one of the practices, and researchers generally considered a practice possibly defensible if they admitted using it, but Italian researchers were much less likely than US researchers to consider a practice defensible. Participants’ estimates of the percentage of researchers who have used these practices were greater than the self-admission rates, and participants estimated that researchers would be unlikely to admit having used them. In written responses, participants argued that some of these practices are not questionable and that they have used some practices because reviewers and journals demand them. The similarity of the results obtained in the United States, in this study, and in a related study conducted in Germany suggests that the adoption of these practices is an international phenomenon, likely due to systemic features of the international research and publication process.

Subject:
Psychology
Social Science
Material Type:
Reading
Provider:
PLOS ONE
Author:
Coosje L. S. Veldkamp
Franca Agnoli
Jelte M. Wicherts
Paolo Albiero
Roberto Cubelli
Date Added:
08/07/2020
Recommendations for Increasing Replicability in Psychology
License: CC BY (Unrestricted Use)

Replicability of findings is at the heart of any empirical science. The aim of this article is to move the current replicability debate in psychology towards concrete recommendations for improvement. We focus on research practices but also offer guidelines for reviewers, editors, journal management, teachers, granting institutions, and university promotion committees, highlighting some of the emerging and existing practical solutions that can facilitate implementation of these recommendations. The challenges for improving replicability in psychological science are systemic. Improvement can occur only if changes are made at many levels of practice, evaluation, and reward.

Subject:
Psychology
Social Science
Material Type:
Reading
Provider:
European Journal of Personality
Author:
Brent W. Roberts
Brian A. Nosek
David C. Funder
Filip De Fruyt
Hannelore Weber
Jaap J. A. Denissen
Jan De Houwer
Jelte M. Wicherts
Jens B. Asendorpf
Klaus Fiedler
Manfred Schmitt
Marcel A. G. van Aken
Marco Perugini
Mark Conner
Reinhold Kliegl
Susann Fiedler
Date Added:
08/07/2020
The Weak Spots in Contemporary Science (and How to Fix Them)
License: CC BY (Unrestricted Use)

In this review, the author discusses several of the weak spots in contemporary science, including scientific misconduct, post hoc hypothesizing (HARKing), outcome switching, theoretical bloopers in formulating research questions and hypotheses, selective reading of the literature, selective citing of previous results, improper blinding and other design failures, p-hacking (researchers’ tendency to analyze data in many different ways to find positive, typically significant, results), errors and biases in the reporting of results, and publication bias. The author presents empirical results highlighting problems that lower the trustworthiness of reported results in scientific literatures, including that of animal welfare studies. Some of the underlying causes of these biases are discussed, based on the notion that researchers are only human and hence not immune to confirmation bias, hindsight bias, and minor ethical transgressions. The author discusses solutions in the form of enhanced transparency, sharing of data and materials, (post-publication) peer review, preregistration, registered reports, improved training, reporting guidelines, replication, dealing with publication bias, alternative inferential techniques, statistical power, and other statistical tools.

Subject:
Biology
Life Science
Material Type:
Reading
Provider:
Animals
Author:
Jelte M. Wicherts
Date Added:
08/07/2020
Willingness to Share Research Data Is Related to the Strength of the Evidence and the Quality of Reporting of Statistical Results
License: CC BY (Unrestricted Use)

Background: The widespread reluctance to share published research data is often hypothesized to be due to the authors' fear that reanalysis may expose errors in their work or may produce conclusions that contradict their own. However, these hypotheses have not previously been studied systematically.

Methods and Findings: We related the reluctance to share research data for reanalysis to 1148 statistically significant results reported in 49 papers published in two major psychology journals. We found the reluctance to share data to be associated with weaker evidence (against the null hypothesis of no effect) and a higher prevalence of apparent errors in the reporting of statistical results. The unwillingness to share data was particularly clear when reporting errors had a bearing on statistical significance.

Conclusions: Our findings on the basis of psychological papers suggest that statistical results are particularly hard to verify when reanalysis is more likely to lead to contrasting conclusions. This highlights the importance of establishing mandatory data archiving policies.

Subject:
Psychology
Social Science
Material Type:
Reading
Provider:
PLOS ONE
Author:
Dylan Molenaar
Jelte M. Wicherts
Marjan Bakker
Date Added:
08/07/2020
A consensus-based transparency checklist
License: CC BY (Unrestricted Use)

We present a consensus-based checklist to improve and document the transparency of research reports in social and behavioural research. An accompanying online application allows users to complete the form and generate a report that they can submit with their manuscript or post to a public repository.

Subject:
Psychology
Social Science
Material Type:
Reading
Provider:
Nature Human Behaviour
Author:
Agneta Fischer
Alexandra M. Freund
Alexandra Sarafoglou
Alice S. Carter
Andrew A. Bennett
Andrew Gelman
Balazs Aczel
Barnabas Szaszi
Benjamin R. Newell
Brendan Nyhan
Candice C. Morey
Charles Clifton
Christopher Beevers
Christopher D. Chambers
Christopher Sullivan
Cristina Cacciari
D. Stephen Lindsay
Daniel Benjamin
Daniel J. Simons
David R. Shanks
Debra Lieberman
Derek Isaacowitz
Dolores Albarracin
Don P. Green
Eric Johnson
Eric-Jan Wagenmakers
Eveline A. Crone
Fernando Hoces de la Guardia
Fiammetta Cosci
George C. Banks
Gordon D. Logan
Hal R. Arkes
Harold Pashler
Janet Kolodner
Jarret Crawford
Jeffrey Pollack
Jelte M. Wicherts
John Antonakis
John Curtin
John P. Ioannidis
Joseph Cesario
Kai Jonas
Lea Moersdorf
Lisa L. Harlow
M. Gareth Gaskell
Marcus Munafò
Mark Fichman
Mike Cortese
Mitja D. Back
Morton A. Gernsbacher
Nelson Cowan
Nicole D. Anderson
Pasco Fearon
Randall Engle
Robert L. Greene
Roger Giner-Sorolla
Ronán M. Conroy
Scott O. Lilienfeld
Simine Vazire
Simon Farrell
Stavroula Kousta
Ty W. Boyer
Wendy B. Mendes
Wiebke Bleidorn
Willem Frankenhuis
Zoltan Kekecs
Šimon Kucharský
Date Added:
08/07/2020