Summary of Thinking, Fast and Slow by Kahneman - 1st edition
One of the characteristics of System 1 is jumping to conclusions. Jumping to a conclusion is efficient if the conclusion is likely to be true, the costs of a potential mistake are acceptable, and it saves a fair amount of time and effort. It is risky when the stakes are high, the situation is unfamiliar, and there is no time to collect further information. In such cases, System 1 is likely to make an intuitive error, unless System 2 intervenes.
If you read a list of letters that includes the number ‘13’ drawn in the same shape as the letters, you tend to read it as the letter ‘B’. If the same shape appeared in a list of numbers, you would read it as the number ‘13’. This is explained by the fact that the context affects the interpretation of each character. You jump to a conclusion and fail to detect the ambiguity. When there is no explicit context, System 1 produces a plausible context of its own. When the situation is uncertain, System 1 makes a bet, guided by experience: the current context and recent events strongly influence the interpretation. When you do not remember recent events, you rely on older memories (like singing the alphabet). The B/13 example shows that a definite choice was made without your being aware of it. System 1 did not consider alternatives: it does not know conscious doubt. Doubt and uncertainty are typical of System 2.
Psychologist Gilbert came up with the theory of believing and unbelieving. He argued that understanding an idea starts with attempting to believe it. What would it mean if it were true? The first attempt to believe is an automatic process of System 1, which constructs the most plausible interpretation of the situation. Even a foolish idea (“birds drink wine”) will initially be believed due to the automatic process of associative memory searching for connections between both ideas that would make sense of it.
Unbelieving, according to Gilbert, is a process of System 2. When System 2 is otherwise engaged (busy or depleted), we tend to believe almost anything. This is why we are more likely to be persuaded by commercials when we are depleted and fatigued.
The operations of associative memory are linked to ‘confirmation bias’. The question ‘Is Naomi nice?’ evokes different memories than the question ‘Is Naomi rude?’. System 2 tests a hypothesis by consciously searching for confirming facts. It is a rule of science to test a hypothesis by trying to refute it, but people (even scientists) tend to search for evidence that supports their beliefs. The confirmation bias of System 1 leads us to uncritically accept suggestions and to exaggerate the probability of unlikely events.
If you like someone’s views and opinions, you are likely to also like his/her appearance and voice. The tendency to like or dislike everything about someone, including the unobserved things, is called the ‘halo effect’. This common bias plays a significant role in the way we shape our view of situations and people. It represents the world as more coherent than it is in reality.
When you meet a man who is approachable and pleasant to speak to, you tend to believe that he is also kind to cats. You know nothing about his love for animals, but you like him and you like animal lovers, so by association you believe he likes cats, which makes you like him even more. Evidence was missing, so the gap was filled by a guess that matched your feelings. In other cases, your interpretation is influenced by the emotion linked to your first impression. A classic experiment is the following:
Mark: smart – passionate – impulsive – envious – stubborn
Eric: stubborn – envious – impulsive – passionate – smart
Who do you like more? Most people pick Mark, because the first qualities alter the meaning of the qualities mentioned later. The sequence in which we observe things is important, because the halo effect assigns more weight to first impressions.
The halo effect can be tamed by the principle of decorrelating errors. Ask a large group of people to estimate the number of marbles in a jar. Individuals will perform poorly, but the group as a whole usually does well: some people underestimate the number, others overestimate it, but the average tends to be fairly accurate, because the errors the observers make are independent of one another. Error reduction only works well with independent observations and uncorrelated errors. It fails if the individuals share a bias or influence each other; correlated errors effectively reduce the sample size and make the group estimate less precise. The most useful information is therefore obtained from multiple sources that are kept independent of each other. An example is the rule that multiple witnesses are not allowed to discuss the incident prior to their testimony.
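The marble-jar idea can be sketched in a short simulation. All numbers here (the true count, group size, and error spreads) are made-up parameters for illustration; the point is only that independent errors cancel out under averaging, while a shared bias survives it.

```python
import random
import statistics

random.seed(42)

TRUE_COUNT = 850   # hypothetical number of marbles in the jar
N_PEOPLE = 200

# Independent observers: each person's error is drawn separately,
# so overestimates and underestimates cancel in the average.
independent = [TRUE_COUNT + random.gauss(0, 200) for _ in range(N_PEOPLE)]

# Biased observers: everyone shares one common error (e.g. after
# discussing the jar with each other), so errors are correlated.
shared_bias = random.gauss(0, 200)
biased = [TRUE_COUNT + shared_bias + random.gauss(0, 50) for _ in range(N_PEOPLE)]

print("independent group mean:", round(statistics.mean(independent)))
print("biased group mean:     ", round(statistics.mean(biased)))
```

Averaging shrinks the independent group's error roughly with the square root of the group size, but no amount of averaging can remove the shared bias; this is why the witnesses-must-not-confer rule matters.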
Our mind treats currently available information completely differently from information that is not retrieved from memory. System 1 constructs the most plausible story from currently activated ideas, without considering missing information. The coherence of the created story is its measure of success, not the quality or amount of the information it is based on. When there is very little information, which occurs regularly, System 1 jumps to conclusions.
Consider this statement: “Will Carlo be a good boss? He is smart and ambitious.” The answer ‘yes’ immediately pops into your head, based on the limited information available. But what if the next words were ‘rude’ and ‘irresponsible’? System 1 only concluded that being smart and ambitious is positive, a conclusion that would be revised if new information became available. System 1 does not wait for that information, however, and the first impression carries disproportionate weight.
System 1 seeks coherence and System 2 is lazy, which means that System 2 will endorse a great many intuitive beliefs. Although System 2 is capable of checking the evidence and seeking the information needed before making a decision, System 1 still influences these decisions by providing non-stop input.
In order to understand intuitive thinking, you must realize that jumping to conclusions on the basis of very little information is an important part of it. Keep the abbreviation ‘WYSIATI’ in mind (What You See Is All There Is). System 1 is extremely insensitive to the quantity and quality of the information that leads to intuitions and impressions.
An experiment involving jury members who were given one-sided evidence demonstrated its striking effect on their judgments: they also felt more confident in those judgments. The consistency of the evidence mattered more than its completeness. The less you know, the easier it is to create a coherent story. WYSIATI promotes the coherence and cognitive ease that make us believe a statement is true. It is why we are fast thinkers and can make sense of incomplete stories. In most cases, the created stories come close to reality and result in appropriate actions. WYSIATI can, however, lead to biases of choice and judgement. Examples are overconfidence (we believe what we see and neglect the possibility that crucial evidence is missing), base-rate neglect, and framing effects (presenting the same information in different ways evokes different emotions).
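To make base-rate neglect concrete, here is a minimal worked example using Bayes' rule. The disease prevalence and test accuracy are hypothetical numbers chosen only for illustration: intuition anchors on the 90% test accuracy and neglects the 1% base rate.

```python
# Hypothetical figures: a condition with a 1% base rate and a test
# that is 90% sensitive with a 10% false-positive rate.
base_rate = 0.01
sensitivity = 0.90       # P(positive | condition)
false_positive = 0.10    # P(positive | no condition)

# Total probability of a positive test result.
p_positive = sensitivity * base_rate + false_positive * (1 - base_rate)

# Bayes' rule: probability of the condition given a positive test.
p_condition_given_positive = sensitivity * base_rate / p_positive

print(round(p_condition_given_positive, 3))  # 0.083
```

Despite the “90% accurate” test, a positive result implies only about an 8% chance of having the condition, because positives from the large healthy majority swamp the true positives.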