What has happened down here is the winds have changed – Gelman - 2016 - Article

What is the article about?

Andrew Gelman, the author, responds to Susan Fiske's article in the APS Observer ("Mob rule or wisdom of crowds?") and to the ongoing replication crisis. He argues that her attitude can be understood in light of the recent history of psychology and its replication crisis. He suggests that the crisis has redrawn the topography of science, especially social psychology, and that some people, like Fiske, experience these changes as catastrophic.

What is research incumbency?

Fiske does not like it when people use social media to post negative comments about published research. She follows what Gelman calls the research incumbency rule: once an article is published in some approved venue, it should be taken as truth.

Problems with research incumbency: (a) many published papers turn out to be in error (e.g. they fail to replicate), and (b) the line between published and unpublished work is drawn largely by noisy statistical criteria, so publication itself is weak evidence of truth.

How did we get here? (a timeline of important events)

To understand Fiske’s attitude, it helps to realize how fast things have changed. In 2011, the replication crisis was barely a cloud on the horizon.

1960s-1970s: Paul Meehl argues that the standard paradigm of experimental psychology does not work. A clever investigator can slowly work his way through a tenuous nomological network, performing many related experiments that appear to the uncritical reader as a fine example of an integrated research program, without ever once refuting or corroborating a single strand of the network.

1960s: Jacob Cohen studies statistical power, arguing that design and data collection are central to good research in psychology → his book Statistical Power Analysis for the Behavioral Sciences.
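Cohen's point can be made concrete with a quick back-of-the-envelope calculation. The sketch below uses the standard normal-approximation formula for a two-sample comparison; the function name and numbers are illustrative, not from the article:

```python
from statistics import NormalDist

def n_per_group(effect_size, alpha=0.05, power=0.80):
    """Approximate sample size per group needed by a two-sample test
    to detect a standardized effect (Cohen's d) at the given
    two-sided alpha and power, via the normal approximation."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)   # critical value for the test
    z_power = z.inv_cdf(power)           # quantile for the desired power
    return 2 * ((z_alpha + z_power) / effect_size) ** 2

# A "small" effect (d = 0.2) needs roughly 400 subjects per group;
# a "large" one (d = 0.8) needs only about 25.
print(round(n_per_group(0.2)))
print(round(n_per_group(0.8)))
```

The asymmetry is Cohen's warning in miniature: studies of small effects with a few dozen subjects per group are underpowered by design, long before any analysis choices come into play.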

1971: Tversky and Kahneman write ‘Belief in the law of small numbers’ focusing on persistent biases in human cognition and researchers’ misunderstanding of uncertainty and variation.

1980s-1990s: Null hypothesis significance testing becomes more and more controversial in psychology.

2006: Satoshi Kanazawa, a sociologist, publishes a series of papers with provocative claims (e.g. engineers have more sons, nurses have more daughters, etc.) that all turn out to rest on statistical errors. This leads to the realization that similar research programs are dead on arrival because the signal-to-noise ratio is too low.

2008: Edward Vul, Christine Harris, Piotr Winkielman, and Harold Pashler write a controversial article ("Voodoo correlations in social neuroscience") arguing that statistical problems are distorting the research field and that many prominent published claims can't be trusted.

  • Also in 2008: the blog Neuroskeptic starts by criticizing soft targets, then science hype, and then moves on to broader criticism of the field.

2011:

  • Joseph Simmons, Leif Nelson, and Uri Simonsohn publish the paper "False-positive psychology," introducing the terms 'researcher degrees of freedom' and 'p-hacking'.
  • Daryl Bem publishes his article "Feeling the future: experimental evidence for anomalous retroactive influences on cognition and affect," which has an obvious multiple-comparisons problem. Earlier questionable work seemed to fit the same larger pattern: these methodological flaws in standard statistical practice were not isolated mistakes. Bem prompted the realization that bad work could be the rule, not the exception.
  • Various cases of scientific misconduct hit the news. Diederik Stapel is kicked out of Tilburg University and Marc Hauser leaves Harvard. Brings attention to the Retraction Watch blog.
    • Researchers who are true believers in their hypotheses, which in turn are vague enough to be supported by any evidence thrown at them → Clarke's Law.
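The mechanism behind p-hacking can be illustrated with a small, hypothetical simulation (not from the article): when a null experiment measures several outcomes and only the best p-value is reported, the false-positive rate climbs well above the nominal 5%.

```python
import random
from statistics import NormalDist, mean, stdev

def p_value(sample_a, sample_b):
    """Two-sided p-value for a difference in means, using the
    normal approximation (adequate for n >= 30 per group)."""
    n = len(sample_a)
    se = (stdev(sample_a) ** 2 / n + stdev(sample_b) ** 2 / n) ** 0.5
    z = abs(mean(sample_a) - mean(sample_b)) / se
    return 2 * (1 - NormalDist().cdf(z))

def run_study(n_outcomes, n=30):
    """One null experiment: both groups come from the same distribution,
    but n_outcomes dependent variables are measured and only the
    smallest p-value is reported (the 'researcher degree of freedom')."""
    ps = []
    for _ in range(n_outcomes):
        a = [random.gauss(0, 1) for _ in range(n)]
        b = [random.gauss(0, 1) for _ in range(n)]
        ps.append(p_value(a, b))
    return min(ps)

random.seed(1)
trials = 2000
rates = {}
for k in (1, 5):
    hits = sum(run_study(k) < 0.05 for _ in range(trials))
    rates[k] = hits / trials
    print(f"{k} outcome(s): false-positive rate = {rates[k]:.3f}")
```

With one outcome the rate stays near 5%, but cherry-picking the best of five independent outcomes pushes it past 20%, even though no single analysis step looks dishonest.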

2012: Gregory Francis publishes "Too good to be true," arguing that a long run of statistically significant results can itself be a sign of selection bias.

2014: Katherine Button, John Ioannidis, Claire Mokrysz, Brian Nosek, Jonathan Flint, Emma Robinson, and Marcus Munafò publish "Power failure: why small sample size undermines the reliability of neuroscience." This closes the loop from Cohen's power analysis to Meehl's general despair, connecting selection on statistical significance to overestimated effect sizes.
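That connection between low power, selection, and inflated effect sizes can also be shown with a small simulation (a hypothetical sketch, not from the article): give an underpowered study a real but small effect, and look at the estimates only among "significant" results.

```python
import random
from statistics import NormalDist, mean, stdev

def estimate_effect(true_d=0.2, n=20):
    """One underpowered study of a real but small effect.
    Returns (estimated effect, two-sided p-value), using the
    normal approximation for the test."""
    a = [random.gauss(true_d, 1) for _ in range(n)]
    b = [random.gauss(0, 1) for _ in range(n)]
    se = (stdev(a) ** 2 / n + stdev(b) ** 2 / n) ** 0.5
    diff = mean(a) - mean(b)
    p = 2 * (1 - NormalDist().cdf(abs(diff) / se))
    return diff, p

random.seed(2)
results = [estimate_effect() for _ in range(5000)]
# Selection on significance: only "significant" estimates get published.
published = [d for d, p in results if p < 0.05]
print("true effect: 0.20")
print(f"mean estimate among significant results: {mean(published):.2f}")
```

In this simulation the published estimates average roughly three times the true effect: crossing the significance threshold with a small sample requires a large observed difference, so the significant studies are exactly the overestimates.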

2015: The "power pose" research by Dana Carney, Amy Cuddy, and Andy Yap receives adoring media coverage but suffers from the now-familiar problem of uncontrolled researcher degrees of freedom, and fails to replicate. The prestigious PPNAS (Proceedings of the National Academy of Sciences) also publishes flawed papers.

2016: Brian Nosek and others organize a large collaborative replication project; many prominent studies fail to replicate.

For a long time, little seemed to happen, and even after the first revelations people could still ignore the crisis. Then, all of a sudden, everything changed. For anyone deeply invested in the old system, such change is hard to accept.

Who is Susan Fiske?

She is the editor of the flawed PPNAS articles, and she has had problems with her own published work. When numerous data errors were pointed out, Fiske and her colleagues refused to reconsider anything. Their theory was so open-ended that it could explain almost any result, and the authors claimed that fixing the errors would not "change the conclusion of the paper."

The problem is that Fiske is working within a dead paradigm: the paradigm of open-ended theory, of publication in top journals and promotion in the popular and business press, based on 'p less than 0.05' results obtained using abundant researcher degrees of freedom.

What is the goal?

The goal is to do good science. That is hard when mistakes don't get flagged and you are supposed to act as if you have been right all along and as if any data pattern you see is consistent with theory. It is a problem for the authors of the original work, for researchers following up on incorrect work, and for researchers who want to do careful work but find it hard to compete in a busy publishing environment with the authors of flashy, sloppy exercises in noise mining.

When a statistical design analysis shows that a study could never have detected the effect it claims, or when replication failures show that published conclusions were wrong, you are expected to move forward, not to keep doing the same thing while insisting you were always right. It is an inefficient way to do science for individual researchers to devote their careers to dead ends because they refuse to admit error.
