Friday, November 6, 2020

Review of Thinking, Fast and Slow by Daniel Kahneman

This review was written by Eugene Kernes  

Book can be found in: 
Book Club Event = Book List (11/12/2022)
Intriguing Connections = 1) Why Do People Think Differently?


Excerpts
“We proposed that they used resemblance as a simplifying heuristic (roughly, a rule of thumb) to make a difficult judgment.  The reliance on the heuristic caused predictable biases (systematic errors) in their predictions” – Daniel Kahneman, Introduction, Page 7

“You cannot help dealing with the limited information you have as if it were all there is to know.  You build the best possible story from the information available to you, and if it is a good story, you believe it.  Paradoxically, it is easier to construct a coherent story when you know little, when there are fewer pieces to fit into the puzzle.” – Daniel Kahneman, Chapter 19: The Illusion Of Understanding, Page 201

“Everything makes sense in hindsight, a fact that financial pundits exploit every evening as they offer convincing accounts of the day’s events.  And we cannot suppress the powerful intuition that what makes sense in hindsight today was predictable yesterday.  The illusion that we understand the past fosters overconfidence in our ability to predict the future.” – Daniel Kahneman, Chapter 20: The Illusion Of Validity, Page 218

Full Review

Overview:

Thinking requires a lot of energy, with the effort usually going unnoticed.  To use less energy, the mind creates heuristics: shortcut responses to difficult questions, simple procedures for finding adequate, though not necessarily best, responses.  These are automatic responses, which are part of System 1, the automatic system.  System 1 quickly responds to various stimuli and makes suggestions to System 2, the effortful system.  System 2 is engaged to deal with mentally intensive activities, and is associated with agency and self-control.  

There is an efficient division of labor between System 1 and System 2, as they minimize effort while optimizing performance.  The problem is that System 2 cannot always be engaged, and that the heuristics of System 1 create many biases.  The interpretation of information becomes predictably biased.  This is a book about the biases of System 1, of intuitions, of heuristics.  By knowing when System 1 produces suboptimal performance through biases, System 2 can be activated to correct those biases.  


Getting to Know System 1, and System 2:

Judgement is split into two systems: an automatic system named System 1, and an effortful system named System 2.  The book is mostly about System 1, and the interactions between System 1 and System 2.  System 1 and System 2 are fictions, but useful ones.  The names themselves act like heuristics, for they are shorthand descriptions for long explanations.  Using the fictions takes up less working memory, less mental bandwidth.  That bandwidth can then be directed to understanding their implications. 

The automatic system, System 1, responds to stimuli quickly and with little to no effort, and is associated with a lack of voluntary control.  System 1 is very intuitive, and a large influence on how decisions get made.  It is very good at applying short-term predictions to familiar situations with appropriate reactions.  System 1 detects simple relations, and can integrate information about a single thing.  

The effortful system, System 2, is how attention gets focused on mental activities that need more care.  It is System 2 that is associated with agency, choice, and concentration.  Attention is directed using System 2, which is meant to overcome the impulses of System 1.  Self-control is the domain of System 2.  What System 2 does is follow rules, make comparisons across several attributes, and make deliberate choices between options.  

It is the involuntary responses that direct what the voluntary attention considers.  System 1 cannot be turned off, and is continuously making suggestions to System 2.  If a suggestion is endorsed by System 2, then voluntary action is taken.  Normally, System 2 adopts the suggestions of System 1 with little or no modification. 

If System 1 cannot readily find a response, the brain enables System 2 to apply more effort to look for one.  The source of the cognitive strain does not matter, as any strain will mobilize System 2.  Mobilizing System 2 means that intuitive responses suggested by System 1 are likely to be rejected.

System 1 does reject alternatives, but does not keep track of those rejections.  System 1 does not have conscious doubt.  Mental effort is needed to consider incompatible interpretations at the same time.  Uncertainty and doubt require mental effort, which is the domain of System 2.

Attention is not only limited, it cannot go beyond its capacity.  Rather than overloading, System 2 reduces attention elsewhere.  The response to being strained to capacity is to become selective and precise.  Activities which require effort can interfere with each other, making it difficult or impossible to undertake multiple effortful activities simultaneously.  

Even seemingly obvious stimuli can be missed when attention is taken up elsewhere.  When attention is being used up, the individual not only becomes blind to obvious stimuli, but also cannot recognize that they are attentionally blind.  

Memory is part of System 1, but System 2 deliberately checks memory when needed.  As time passes after a disaster, memory of that disaster fades.  With less memory of disasters, there is also less diligence in preventing them, creating a cycle of disaster and complacency.

An idea triggers thoughts about other ideas in a process called associative activation.  Associations are made only by activated ideas.  Information that is not activated, not retrieved from memory, might as well not exist.  System 1 incorporates activated ideas, without allowing for consideration of the lack of information.  System 1 jumps to conclusions with little information, but does not recognize the size of the jumps.  WYSIATI, which stands for What You See Is All There Is, is a term that can help one remember that there is more information than what is seen, and to prevent adding information that is not there.  

The real world is represented as a model in the mind, representing what is normal in the world.  System 1’s main function is to maintain and update this model.  The model is constructed from experiences with the real world, and informed by the regularity of outcomes given events and actions, which themselves depend on circumstances.  What System 1 does is interpret the present and expected potentials.  The internal world is not a replica of the real world: expectations about the frequency of events are distorted by their prevalence and emotional intensity.

Changing the internal model of the world changes what the individual remembers having thought about the world before the change.  The memory of ideas held before changing one’s mind is lost, leading to a lack of recognition that beliefs have changed, and of prior states of knowledge.  Individuals hold opposing, contradictory views because they have access to different information, or select which information to believe.

Continuous vigilance to prevent biases is not necessarily beneficial or practical.  Constantly questioning everything would be extremely tedious, and System 2 is too slow and inefficient to handle System 1’s routine decisions.  A compromise is to recognize situations that are familiar, and for System 2 to handle the situations where mistakes would carry great risk and consequence.


Heuristics, And Energy Use:

Heuristics intuitively substitute an easier question for a difficult one.  The easier question is then answered, without the substitution being noticed.  System 2 endorses the heuristic answer without much scrutiny, without noticing that the actual question has not been answered.  Under these heuristics, predictable biases occur, creating systematic errors.  That decisions contain systematic biases and errors does not denigrate human intelligence. 

Law of least effort applies to cognitive and physical tasks.  When alternative options are available that have the same outcome, individuals tend to gravitate towards the least demanding of the options.  

As effort is a cost, the acquisition of skills depends on the forthcoming benefits of the skills and the effort the skills take to develop.   Switching tasks takes more effort.  

Thinking becomes homogenized because of heuristics.  Using a heuristic for prediction can often be accurate.  But with heterogeneous judgments, the representativeness heuristic is misleading, because individuals neglect the base-rate information that points to alternative responses.  The representativeness heuristic has validity, but cannot be relied on exclusively, because that goes against statistical logic.  While neglecting valid base rates leads to suboptimal judgements, it takes a lot of energy to have heterogeneous thoughts.  Those costs are worth paying when better outcomes in society are wanted, but they are costs nonetheless.  

Self-control and cognitive effort require mental work which uses up a lot of energy.  When already challenged by demanding cognitive tasks, temptation is easy to fall into.  Ego depletion occurs after being challenged: having used up a lot of energy, individuals become less willing to exert themselves and have less self-control afterwards.  Ego depletion leads to loss of motivation.  More default options are taken when mental bandwidth is low.  

The mind needs coherence.  Useless information, such as two contradictory explanations of an event, still satisfies that need for coherence.  Events have consequences, and consequences need causes to explain them.  System 1 is adept at finding a coherent causal story linking the fragments of knowledge at its disposal.  More than causes, the mind eagerly identifies agents, and attributes personality traits to them.  It is eager to assign specific intentions, with outcomes coming from individual propensities rather than randomness.   

Even without much information, the mind can jump to conclusions.  Jumping to conclusions is efficient when time and energy are saved, the outcomes are likely to be correct, and the costs of a mistake are acceptable.  The problem is when the mind jumps to conclusions in unfamiliar situations, especially when the consequences of a mistake are high and there is no time for gathering information. 

Consistency makes for good stories, not completeness.  Having less information makes it easier to fit everything into a coherent pattern.  In experiments, those given one side of a story were far more confident of their judgments than those given both sides.  They used the information available to them, and did not consider the information not provided.  Persuasive stories are simple, and focus on the few events that occurred rather than the myriad of events that did not: the nonevents, the counterfactuals.  Sometimes adding details to scenarios makes them more persuasive, but each added detail lowers the scenario’s probability, since a conjunction of conditions can never be more probable than any one of them alone.  


Types of Heuristics, and Biases:

Repetition of information leads to familiarity.  Familiarity is not easily distinguished from truth, making repeated falsehoods believable.  Caution is needed for survival, but exposure and repetition signal which decisions are not so dangerous and can be accepted.

A limitation of informational judgement is overconfidence in what is known, alongside an apparent inability to acknowledge ignorance and the uncertainty of the world.  Understanding is overestimated, while chance is underestimated.  The overconfidence is fed by the illusory certainty of hindsight.  The illusion of understanding arises when assumed knowledge of the past leads to confidence in the ability to predict the future.

The availability heuristic is when ideas are seen as more prominent due to the ease of retrieving them from memory.  What makes public policies salient or neglected within the mind of the public is their coverage in the media.  

Seeking disconfirming evidence is a tool of science.  Confirmation bias runs counter to science, as people generally seek confirming evidence for already held beliefs.  

The affect heuristic is when likes and dislikes determine which arguments are compelling.  It is easier to dismiss all of the benefits or all of the costs of an argument, simplifying a complex real world and making it more coherent.  The real world is complicated, with painful tradeoffs between benefits and costs.  The halo effect is a tendency to like or dislike everything about a person.  To prevent the halo effect, one needs to decorrelate errors.  Theory-induced blindness is when accepted theories are used as tools for thinking, without noticing their flaws.

The anchoring effect is when an individual anchors to a particular value when estimating an unknown quantity.  The anchoring value can be arbitrary or random, with the individual not straying far from that value when making the estimate.  

The law of small numbers is the bias of assuming that the law of large numbers applies to small samples as well.  The problem with too few samples is that the results are subject to sampling luck.
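
To see why too few samples mislead, here is a minimal simulation sketch in Python (my construction, not from the book; the 40-60% band, the sample sizes, and the name extreme_rate are illustrative assumptions).  A fair coin looks “unusual” far more often in small samples than in large ones.

import random

def extreme_rate(sample_size, trials=10_000):
    """Fraction of samples whose share of heads falls outside 40-60%."""
    extremes = 0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(sample_size))
        if not 0.4 <= heads / sample_size <= 0.6:
            extremes += 1
    return extremes / trials

for n in (10, 100, 1000):
    # Small samples routinely stray far from the true 50% rate.
    print(f"sample size {n:>4}: {extreme_rate(n):.1%} look 'unusual'")

With 10 flips, roughly a third of samples fall outside the band; with 1,000 flips, almost none do, which is the law of large numbers at work.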

The planning fallacy occurs when only best-case scenarios are turned into plans and forecasts, while the statistics of similar cases are ignored.  Many projects exploit the planning fallacy, because once a project is started, it is rarely abandoned over cost overruns or delayed completion times.  Irrational perseverance is displayed in the failure to abandon such projects.  Optimism can make an individual persevere through obstacles, but it can be costly. 


Additional Random Observations:

As the mind is meant to detect patterns, the mind will find them whether or not they actually exist.  Random processes lack patterns, but they might not appear random.  Assumptions about what is or is not a random result influence whether an individual considers a result random.

There are inevitable fluctuations in random processes, to which individuals tend to attach causal interpretations, even if the interpretations are wrong.  Individuals can have high and low outcomes, but tend to regress to the mean.  
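
As a rough illustration, consider a hypothetical simulation in Python (again my construction, with arbitrary parameters): performance is modeled as stable skill plus random luck, and the top scorers of one round score closer to the average in the next.

import random

random.seed(42)
skill = [random.gauss(0, 1) for _ in range(10_000)]   # stable ability
round1 = [s + random.gauss(0, 1) for s in skill]      # ability + luck
round2 = [s + random.gauss(0, 1) for s in skill]      # fresh luck

# Select the top 10% of round-1 performers.
cutoff = sorted(round1)[int(0.9 * len(round1))]
top = [i for i, score in enumerate(round1) if score >= cutoff]

avg1 = sum(round1[i] for i in top) / len(top)
avg2 = sum(round2[i] for i in top) / len(top)
print(f"top group, round 1 average: {avg1:.2f}")
print(f"same group, round 2 average: {avg2:.2f}")

The second-round average is noticeably lower, not because the group got worse, but because their first-round luck was unusually good and luck does not repeat.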

The illusion of skill can be created by a culture that attributes results to skill rather than random chance.  Acquiring more knowledge does not make for better predictions, but develops an enhanced illusion of skill and unrealistic overconfidence.  Personal experience is more limited than base-rate information, but individuals favor their personal experience.  The validity of expertise depends on a sufficiently regular environment, and on opportunities to learn from practice.  Without stable regularities in the environment, intuition cannot be trusted. 

Evaluating risks depends on how risk is measured, as what is measured gets more attention than what is not measured.  Defining risks is an act of power.

Acknowledging a lack of knowledge can be penalized.  An unbiased appreciation of uncertainty is needed for rationality, but is not wanted.  Extreme uncertainty is paralyzing under dangerous circumstances, and an admission of guessing is unacceptable when the stakes are high.  Pretending to knowledge becomes the preferred solution.  

Human judgement can be supplemented by formulas.  The problem is when humans override the formula.  

Asking for a proposal premortem can cause individuals to consider alternative views.  A premortem imagines that the proposal was adopted and led to disaster, and then asks participants to write the history of that disaster.  

Prospect theory indicates that sensitivity to gains and losses is not the same.  Individuals are loss averse, which means that losses carry heavier emotional responses than gains.  What an outcome is relative to expectations determines whether it is perceived as a gain or a loss.  It is not absolute wealth that determines happiness, but recent changes in wealth.

Part of prospect theory is the difference between probabilities and decision weights.  The possibility effect is when highly unlikely outcomes are given disproportionate weights. 
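
A small sketch of prospect theory’s value function may help, using the functional form and parameter estimates published by Kahneman and Tversky (alpha = beta = 0.88, lambda = 2.25); values are measured relative to a reference point, not as absolute wealth.

def value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Subjective value of a gain or loss of size x."""
    if x >= 0:
        return x ** alpha            # diminishing sensitivity to gains
    return -lam * (-x) ** beta       # losses loom larger than gains

print(value(100))    # about 57.5
print(value(-100))   # about -129.4

The lambda parameter greater than 1 is the loss aversion described above: a loss of a given size carries more than twice the emotional weight of an equal gain.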

The endowment effect is using present conditions as the reference point for forthcoming decisions, and negotiating relative to that reference point.  Giving up something feels like a loss, while getting something brings pleasure.  The endowment effect is counteracted when owners of a product see the product as holding value for future exchange.  Loss aversion indicates that individuals are motivated more to avoid losses than to achieve gains.

The illusion of validity is when believers create a social construction that is false, but sustained by their belief.  


Caveats?

The systematic biases and the reasons they occur are a central theme found throughout the book, but the various applications are all minor.  The book takes a few core arguments, puts them in different settings, and gives them different names.  A large range of biases is presented, but not enough on each.  Understanding each bias would require more research, and some explanations need editing.  

Some biases contradict other biases in similar contexts.  Many of the experiments depend on context, and would provide different results given a different context.  Much like how System 1 does not consider alternatives and nonevents, the author favors the results given rather than providing potential alternatives.  

The information is provided in a sporadic and disorganized manner.  This makes the book easier to read, but harder to put all the different pieces of information together, and more difficult to understand the core arguments. 

Renaming the automatic system and the effortful system to System 1 and System 2 is claimed to use up less energy, but it can do the exact opposite.  For those who have read the book, the renaming may be less energy intensive, but it increases the energy needed to share that understanding with others.  The author also notes that System 1 and System 2 are fictional characters, but that does not mean everyone understands this. 


Questions to Consider while Reading the Book

•What is the raison d’etre of the book?  For what purpose did the author write the book?  Why do people read this book?
•What are some limitations of the book?
•Why does the brain need to manage energy use?
•How does the brain reduce strain on its energy?
•What are heuristics?
•What are System 1 and System 2?  What does each do?  
•How do System 1 and System 2 interact?
•Why do decisions have systematic biases?
•How does attention get distributed?
•What is WYSIATI?
•What happens to counterfactuals? 
•How is the model of the world constructed?  How is it different from the real world?
•How can systematic biases be avoided? 
•What is the law of least effort?
•What is ego depletion? 
•What is the purpose of coherence?
•Is something that is familiar also true? 
•What is the availability heuristic?
•What is the confirmation bias?
•What is the halo effect?
•What is the affect heuristic? 
•What is the anchoring effect?
•What is the planning fallacy?
•What is the law of small numbers?
•How can randomness be identified? 
•What is the regression to the mean?
•What is the illusion of skill?
•What is prospect theory?  
•What is tyranny of the remembering self?

Book Details
Publisher:         Farrar, Straus and Giroux
Edition ISBN:  9780374533557
Pages to read:   444
Publication:     2013
1st Edition:      2011
Format:            Paperback

Ratings out of 5:
Readability    5
Content          5
Overall           5