Summary of Thinking, Fast and Slow by Kahneman - 1st edition
Economist Kunreuther found that availability effects help explain the pattern of insurance purchases and protective measures after disasters. In the aftermath of a disaster, victims are worried, which makes them more eager to buy insurance and adopt preventive measures. The effect is temporary: once the memories start to fade, so does the worry. The recurrent cycle of disaster, concern and growing complacency can be explained by the dynamics of memory.
A classic example of the availability bias is a survey carried out to analyse public perceptions of risk. Participants were asked to consider pairs of causes of death, such as accidents and strokes, or asthma and diabetes. They had to indicate the more frequent cause in each pair and estimate the ratio of the two frequencies. Their judgments were then compared to the actual statistics. Some of the findings were:
80% of participants judged accidental death more likely than death by stroke, although strokes cause nearly twice as many deaths.
Tornadoes were considered more deadly than asthma, although asthma kills 20 times more people.
Death by accident and death by disease were judged about equally likely, although death by disease is 18 times more frequent.
It was clear that media coverage influenced the estimates of causes of death. Media coverage is biased towards sensationalism and novelty: the media shape public interest and are shaped by it. Unusual causes of death receive disproportionate attention and are therefore perceived as less unusual than they actually are. The world in our minds is not an accurate replica of the real world; our expectations about the frequency of events are warped by the emotional intensity and the prevalence of the information we are exposed to.
The estimates of causes of death reflect which ideas are activated in associative memory and are an example of substitution. Research also shows that the ease with which ideas of various risks come to mind and the emotional responses to those risks are connected: terrifying images and thoughts come to mind easily, and vivid thoughts of danger induce fear. Psychologist Slovic introduced the affect heuristic: people rely on their emotions when making decisions and judgments. Do I love it or hate it? In many aspects of life, our choices and opinions express our feelings. The affect heuristic is an example of substitution: the difficult question (What do I think about this?) is replaced by the easier question (How do I feel about this?). Slovic relates his findings to those of neuroscientist Damasio: when we make decisions, our emotional evaluations of outcomes, the bodily states and the approach and avoidance tendencies associated with them all play a key role. People who do not show the appropriate emotions before making a decision also have an impaired ability to make reasonable decisions.
Slovic asked participants for their opinions about several technologies and had them list the risks and benefits of each. He found an extremely strong negative correlation between the estimated level of risk and the estimated level of benefit: when participants liked a technology, they listed great benefits and minimal risks. After the first task, they read a number of arguments in favour of the technologies; some read arguments about the benefits, others read arguments emphasising the low risks. These messages changed the emotional appeal of the technologies. Participants who were given the arguments about benefits liked the technology more and, without being shown any evidence, also judged it less risky. Those who read about the mild risks likewise developed a more positive view of the benefits.
According to Slovic, people are guided by emotion rather than by reason. Experts show many of the same biases as laypeople, but their preferences and judgments about risks differ from those of the public. These differences reflect a conflict of values: experts usually measure risks by the number of lives or life-years lost, whereas the public distinguishes between 'good deaths' and 'bad deaths'. The public therefore has a richer conception of risk than the experts, who only count cases. Slovic argues that the assessment of any risk depends on the chosen measure, so both the measurement and the risk itself are subjective.
Legal scholar Sunstein disagrees with Slovic. He argues that objectivity can be achieved through expertise, careful deliberation and science, and he believes that biased responses to risks are a source of misplaced priorities in United States policy. The system of regulation should reflect objective analysis, not irrational public concerns. Citizens are prone to cognitive biases, which in turn influence regulators. Sunstein and the economist Kuran coined the term 'availability cascade' for this process by which biases flow into policy. Such a cascade can start with media coverage of a relatively minor incident, which leads to public worry and ultimately to government action. The Alar scare illustrates this: a huge public overreaction to a chemical sprayed on apples, which turned out to pose minimal health risks, ultimately led to the product being withdrawn from the market.
The mind has a basic limitation in dealing with small risks: we either ignore them completely or give them far too much weight. The amount of concern is not properly sensitive to the probability of harm: you imagine the dramatic story in the paper (the numerator) and do not think about all the safe cases (the denominator). A parent anxiously waiting for a child who is late from school cannot keep horrible visions of disaster from coming to mind, even though there is almost nothing to worry about. Sunstein calls this 'probability neglect'. The combination of availability cascades and probability neglect leads to gross exaggeration of minor threats.
Nowadays, terrorism is a significant source of availability cascades. Terror attacks cause relatively few deaths, for instance compared with the number of traffic deaths, but they differ in the availability of the risk: the frequency and ease with which the images are retrieved from memory. Extensive media coverage and horrifying images fuel public concern. Terrorism speaks directly to System 1, and it is hard to reason yourself into calm.
Kahneman shares Sunstein's discomfort with the influence of availability cascades and irrational concerns on public risk policy, but he also agrees with Slovic's view that policy makers should not ignore public concerns, whether they are reasonable or not. The public must be protected from fear, not merely from real dangers. Risk policies should therefore combine the emotions of the public with the knowledge of experts.