Thinking, Fast and Slow by Daniel Kahneman

Thinking, Fast and Slow (original 2011; edition 2011)

by Daniel Kahneman

Members: 11,899 | Reviews: 228 | Popularity: 545 | Average rating: 4.11 | Mentions: 181
In this work the author, a recipient of the Nobel Prize in Economic Sciences for his seminal work in psychology that challenged the rational model of judgment and decision making, has brought together his many years of research and thinking in one book. He explains the two systems that drive the way we think. System 1 is fast, intuitive, and emotional; System 2 is slower, more deliberative, and more logical. He exposes the extraordinary capabilities, and also the faults and biases, of fast thinking, and reveals the pervasive influence of intuitive impressions on our thoughts and behavior. He reveals where we can and cannot trust our intuitions and how we can tap into the benefits of slow thinking. He offers practical and enlightening insights into how choices are made in both our business and our personal lives, and how we can use different techniques to guard against the mental glitches that often get us into trouble. This author's work has transformed cognitive psychology and launched the new fields of behavioral economics and happiness studies. In this book, he takes us on a tour of the mind and explains the two systems that drive the way we think and the way we make choices.
Member:EduLeslie
Title:Thinking, Fast and Slow
Authors:Daniel Kahneman
Info:Farrar, Straus and Giroux (2011), Edition: 1, Hardcover, 512 pages
Collections:Your library
Rating:
Tags:Kindle, psychology

Work Information

Thinking, Fast and Slow by Daniel Kahneman (Author) (2011)


English (211)  Dutch (6)  German (2)  French (2)  Portuguese (Brazil) (1)  Norwegian (1)  Spanish (1)  Catalan (1)  Italian (1)  All languages (226)
Overview:
Thinking requires a lot of energy, though the effort usually goes unnoticed. To use less energy, the mind creates heuristics: shortcut responses to difficult questions, simple procedures for finding adequate rather than optimal answers. These automatic responses are part of System 1, the automatic system. System 1 quickly responds to stimuli and makes suggestions to System 2, the effortful system, which is engaged for mentally intensive activities. System 2 is associated with agency and self-control.

There is an efficient division of labor between System 1 and System 2: together they minimize effort while optimizing performance. The problem is that System 2 cannot always be engaged, and the heuristics of System 1 create many biases, so the interpretation of information becomes predictably skewed. This is a book about the biases of System 1, of intuitions, of heuristics. Knowing when System 1 produces suboptimal performance through its biases lets System 2 be activated to correct them.

Getting to Know System 1 and System 2:
Judgment is split into two systems: an automatic system named System 1 and an effortful system named System 2. The book is mostly about System 1 and about the interactions between System 1 and System 2. System 1 and System 2 are fictions, but useful ones. The names themselves act like heuristics, shorthand for long explanations. Using the fictions takes up less working memory, less mental bandwidth, and that bandwidth can then be directed to understanding their implications.

The automatic system, System 1, responds to stimuli with speed and little to no effort, and is associated with a lack of voluntary control. System 1 is highly intuitive and a large influence on how decisions get made. It is very good at making short-term predictions and reacting appropriately to familiar situations. System 1 detects simple relations and can integrate information about a single thing.

The effortful system, System 2, is how attention gets focused on mental activities that need more care. It is System 2 that is associated with agency, choice, and concentration. Attention is directed through System 2, which is meant to overcome the impulses of System 1; self-control is its domain. What System 2 does is follow rules, compare options across several attributes, and make deliberate choices.

It is the involuntary responses that direct what voluntary attention considers. System 1 cannot be turned off and continuously makes suggestions to System 2. If a suggestion is endorsed by System 2, voluntary action is taken. Normally, System 2 adopts the suggestions of System 1 with little or no modification.

If System 1 cannot readily find a response, the brain mobilizes System 2 to apply more effort to the search. The source of the cognitive strain does not matter; any strain will mobilize System 2. Once System 2 is mobilized, the intuitive responses suggested by System 1 are more likely to be rejected.

System 1 does reject alternatives, but it does not keep track of those rejections, and it has no conscious doubt. Mental effort is needed to hold incompatible interpretations in mind at the same time; uncertainty and doubt require mental effort, which is the domain of System 2.

Attention is limited and cannot be stretched beyond its capacity. Rather than overloading, System 2 reduces attention elsewhere; its response to being strained to capacity is to become selective and precise. Activities that require effort interfere with one another, which is why it is difficult or impossible to undertake multiple effortful activities simultaneously.

Even seemingly obvious stimuli can be missed when attention is taken up elsewhere. When attention is used up, the individual not only becomes blind to obvious stimuli but also fails to recognize being attentionally blind.

Memory is part of System 1, though System 2 deliberately checks memory when needed. As time passes after a disaster, memory of it fades, and with less memory of disasters comes less diligence in preventing them, creating a cycle of disaster followed by complacency.

An idea triggers thoughts of other ideas in a process called associative activation. Associations are made only among activated ideas; information that is not activated, not retrieved from memory, might as well not exist. System 1 incorporates activated ideas without allowing for consideration of the information that is missing. It jumps to conclusions from little information and does not recognize the size of the jumps. WYSIATI, What You See Is All There Is, is a term meant as a reminder that there is always information not being seen, and a guard against inventing information that is not there.

The real world is represented as a model in the mind, a representation of what is normal in the world. System 1's main function is to maintain and update this model. The model is constructed from experiences of the real world and informed by the regularity of outcomes given events and actions, which themselves depend on circumstances. What System 1 does is interpret the present and project what to expect. The internal world is not a replica of the real one: its expectations of frequencies are distorted by prevalence and emotional intensity.

Changing the internal model of the world changes what the individual remembers having thought before the change. The memory of ideas held before changing one's mind is lost, which prevents recognition that beliefs have changed and obscures prior states of knowledge. Individuals hold opposing, contradictory views because they have access to different information, or because they select which information to believe.

Continuous vigilance against biases is not necessarily beneficial or practical. Constantly questioning everything would be extremely tedious, and System 2 is too slow and inefficient to handle System 1's routine decisions. A compromise is to recognize familiar situations and to reserve System 2 for situations where a mistake would carry great risk and consequence.

Heuristics and Energy Use:
Heuristics intuitively substitute an easier question for a difficult one, then answer the easier question without noticing the substitution. System 2 endorses the heuristic answer without much scrutiny, never noticing that the actual question has not been answered. Under these heuristics, predictable biases occur, creating systematic errors. That decisions contain systematic biases and errors does not denigrate human intelligence.

Law of least effort applies to cognitive and physical tasks. When alternative options are available that have the same outcome, individuals tend to gravitate towards the least demanding of the options.

As effort is a cost, the acquisition of a skill depends on weighing the benefits the skill will bring against the effort it takes to develop. Switching between tasks takes extra effort.

Thinking becomes homogenized because of heuristics. Using a heuristic for prediction can often be accurate, but for heterogeneous judgments the representativeness heuristic is misleading, because individuals neglect the base-rate information that points to alternative answers. The representativeness heuristic has some validity, but it cannot be relied on exclusively, for it goes against statistical logic. Neglecting valid base rates leads to suboptimal judgments, yet it takes a lot of energy to think heterogeneously; those costs are worth paying when better outcomes for society are wanted, but the costs themselves cannot be scientifically denied.
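
To make the base-rate point concrete, here is a minimal Python sketch in the spirit of Kahneman's librarian-versus-farmer example; all the numbers below are invented for illustration, not taken from the book.

def posterior(prior, hit_rate, false_alarm_rate):
    # Bayes' rule: P(hypothesis | evidence)
    evidence = prior * hit_rate + (1 - prior) * false_alarm_rate
    return prior * hit_rate / evidence

# Assumed numbers: 1 librarian per 20 farmers; 90% of librarians fit a
# "meek and tidy" description, versus 20% of farmers.
prior_librarian = 1 / 21
p = posterior(prior_librarian, hit_rate=0.9, false_alarm_rate=0.2)
print(f"P(librarian | meek and tidy) = {p:.2f}")  # ~0.18

Even with a description far more typical of librarians, the base rate keeps the farmer hypothesis more likely; the representativeness heuristic ignores exactly this.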

Self-control and cognitive effort are mental work that uses up a lot of energy. When already challenged by a demanding cognitive task, temptation is easy to fall into. Ego depletion occurs after being challenged: having used up a lot of energy, individuals become less willing to exert themselves and have less self-control afterward. Ego depletion leads to loss of motivation, and more default options are taken when mental bandwidth is low.

The mind needs coherence. Useless information, such as two contradictory explanations of the same event, still satisfies that need for coherence. Events have consequences, and consequences need causes to explain them. System 1 is adept at finding a coherent causal story linking the fragments of knowledge at its disposal. More than causes, the mind eagerly identifies agents and attributes personality traits to them, assigning specific intentions and tracing outcomes to individual propensities rather than randomness.

Even without much information, the mind jumps to conclusions. Jumping to conclusions is efficient when it saves time and energy, the conclusion is likely to be correct, and the cost of a mistake is acceptable. The problem is when the mind jumps to conclusions in unfamiliar situations, especially when the consequences of a mistake are high and there is no time to gather information.

Consistency makes for good stories, not completeness. Having less information makes it easier to fit everything into a coherent pattern. In experiments, those given one side of a story were far more confident in their judgments than those given both sides: they used the information available and did not consider the information withheld. Persuasive stories are simple, focusing on the few events that occurred rather than the myriad that did not, the nonevents, the counterfactuals. Adding details to a scenario can make it more persuasive even as it makes the scenario less probable.

Types of Heuristics and Biases:
Repetition of information breeds familiarity, and familiarity is not easily distinguished from truth, which makes repeated falsehoods believable. Caution is needed for survival, but exposure and repetition signal which decisions are not so dangerous and can be accepted.

A limitation of judgment is overconfidence in what is known, paired with an apparent inability to acknowledge ignorance and the uncertainty of the world: an overestimation of understanding and an underestimation of chance, fed by the illusory certainty of hindsight. Assumed knowledge of the past creates an illusion of understanding and confidence in the ability to predict the future.

The availability heuristic is at work when ideas seem more prominent because they are easier to retrieve from memory. What makes public policies salient or neglected in the mind of the public is their coverage in the media.

Seeking disconfirming evidence is a tool of science. Confirmation bias works against it, as people generally seek evidence confirming the beliefs they already hold.

The affect heuristic is at work when likes and dislikes determine which arguments are compelling. It is easier to dismiss all the benefits or all the costs of a position than to weigh them, and doing so simplifies a complex real world of painful tradeoffs into something more coherent. The halo effect is the tendency to like or dislike everything about a person; preventing it requires decorrelating errors. Theory-induced blindness sets in when accepted theories are used as tools for thinking without their flaws being noticed.

The anchoring effect occurs when an individual anchors on a particular value while estimating an unknown quantity. The anchor can be arbitrary or even random, yet the individual will not stray far from it in making the estimate.

The law of small numbers is the bias of assuming that the law of large numbers applies to small samples as well. The problem with too few samples is that the results are subject to sampling luck.
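
The point is easy to verify by simulation. A sketch in Python, with sample sizes chosen arbitrarily for illustration: a fair 50% process produces "extreme" observed rates far more often in small samples.

import random

def extreme_share(sample_size, trials=10_000, p=0.5):
    # Fraction of samples whose observed rate falls outside 40%-60%.
    extreme = 0
    for _ in range(trials):
        hits = sum(random.random() < p for _ in range(sample_size))
        rate = hits / sample_size
        if rate < 0.4 or rate > 0.6:
            extreme += 1
    return extreme / trials

for n in (10, 100, 1000):
    print(f"n={n}: {extreme_share(n):.1%} of samples look extreme")

Small samples routinely stray far from the true rate; large ones almost never do, which is the whole content of the bias.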

The planning fallacy occurs when only best-case scenarios are turned into plans and forecasts, while the statistics of similar cases are ignored. Many projects suffer from the planning fallacy partly because planners know that once a project is started, it is rarely abandoned over cost overruns or slipping completion dates; irrational perseverance shows in the failure to abandon failing projects. Optimism can help an individual persevere through obstacles, but it can be costly.

Additional Random Observations:
As the mind is built to detect patterns, it will find them whether or not they exist. Random processes lack patterns, yet their outcomes do not always look random, and assumptions about what a random result should look like influence whether an individual considers a result random.

Random processes inevitably fluctuate, and individuals tend to attach causal interpretations to those fluctuations even when the interpretations are wrong. An individual can have high and low outcomes, but outcomes tend to regress to the mean.
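
Regression to the mean also falls out of a few lines of simulation. A sketch under invented assumptions: each score is a fixed "skill" plus fresh random "luck", so the top scorers of round one owe part of their standing to luck and fall back in round two.

import random

random.seed(0)
skill = [random.gauss(0, 1) for _ in range(100_000)]
round1 = [s + random.gauss(0, 1) for s in skill]   # skill + luck
round2 = [s + random.gauss(0, 1) for s in skill]   # same skill, fresh luck

# Compare the top 1% of round-1 performers across both rounds.
cutoff = sorted(round1)[-1000]
top = [i for i, r in enumerate(round1) if r >= cutoff]
avg1 = sum(round1[i] for i in top) / len(top)
avg2 = sum(round2[i] for i in top) / len(top)
print(f"round 1 avg: {avg1:.2f}, round 2 avg: {avg2:.2f}")  # round 2 is lower

No causal story is needed: nothing happened to the top performers between rounds except that their luck was re-drawn.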

An illusion of skill can be created by a culture that attributes results to skill rather than chance. Acquiring more knowledge does not make for better predictions; it develops an enhanced illusion of skill and unrealistic overconfidence. Personal experience is more limited than base-rate information, yet individuals favor their personal experience. The validity of expertise depends on a sufficiently regular environment and on opportunities to learn from practice; without stable regularities in the environment, intuition cannot be trusted.

Evaluating a risk depends on how that risk is measured, as what is measured gets more attention than what is not. Defining risks is an act of power.

Acknowledging a lack of knowledge can be penalized. An unbiased appreciation of uncertainty is needed for rationality but is not wanted: extreme uncertainty is paralyzing under dangerous circumstances, and admitting to guessing is unacceptable when the stakes are high. Pretending to knowledge becomes the preferred solution.

Human judgment can be supplemented by formulas; the problem is when humans override the formula.
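
Kahneman's discussion here draws on Meehl and on Dawes's "improper linear models": standardize a few relevant predictors and add them up with equal weights. A minimal sketch, with invented candidate ratings as the data:

def standardize(xs):
    # Convert raw scores to z-scores (mean 0, standard deviation 1).
    mean = sum(xs) / len(xs)
    sd = (sum((x - mean) ** 2 for x in xs) / len(xs)) ** 0.5
    return [(x - mean) / sd for x in xs]

# Invented example: three candidates rated 1-10 on three traits.
candidates = {"A": (7, 9, 4), "B": (6, 6, 8), "C": (9, 3, 5)}
cols = [standardize(col) for col in zip(*candidates.values())]
scores = {name: sum(col[i] for col in cols)
          for i, name in enumerate(candidates)}
for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(name, round(score, 2))

The model is "improper" because the weights are not fitted to outcome data, yet such equal-weight sums often match or beat expert judges; overriding the score reintroduces the judge's noise.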

Asking for a premortem on a proposal can get individuals to consider alternative views: assume the proposal has been adopted and has ended in disaster, then write the history of that disaster.

Prospect theory indicates that sensitivity to gains and losses is not the same. Individuals are loss averse, meaning losses carry heavier emotional responses than gains. What is perceived as a gain or a loss is determined relative to expectations; it is not absolute wealth that determines happiness, but recent changes in wealth.

Part of prospect theory is the difference between probabilities and decision weights. The possibility effect is the disproportionate weight given to highly unlikely outcomes.
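
The asymmetry can be written down directly. A sketch of the prospect-theory value function in Python, using the parameter estimates Tversky and Kahneman published in 1992 (alpha = beta = 0.88, lambda = 2.25); the book presents the shape of the curve rather than a formula, so treat this as a reconstruction.

def value(x, alpha=0.88, beta=0.88, lam=2.25):
    # Subjective value of a change of size x relative to the reference point.
    if x >= 0:
        return x ** alpha             # diminishing sensitivity to gains
    return -lam * ((-x) ** beta)      # losses loom larger: loss aversion

print(value(100))    # ~57.5
print(value(-100))   # ~-129.4: a same-sized loss hurts about 2.25 times more

The kink at zero is the reference point; the steeper slope on the loss side is loss aversion in formula form.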

The endowment effect is the use of present conditions as the reference point for a forthcoming decision, negotiating relative to that point: giving something up feels like a loss, while getting something brings pleasure. The endowment effect is counteracted when owners see the product as a value held for future exchange. Loss aversion means individuals are motivated more to avoid losses than to achieve gains.

The illusion of validity occurs when believers create a social construction that is false but is sustained by their belief.

Caveats?
The systemic biases and the reasons they occur are a central theme throughout the book, but the various applications are all minor. The book takes a few core arguments, puts them in different settings, and gives them different names. A large range of biases is presented, but not enough on each; understanding any one bias would require more research, and some of the explanations need editing.

Some biases contradict other biases in similar contexts. Many of the experiments depend on context and would produce different results in a different context. Much as System 1 does not consider alternatives and nonevents, the author favors the results obtained over discussing potential alternatives.

The information is presented in a sporadic and disorganized manner. That makes the book easier to read, but harder to put the different pieces together, and more difficult to grasp the core arguments given how the information is spread out.

Renaming the automatic and effortful systems System 1 and System 2 is claimed to use less energy, but it can do the exact opposite. The renaming may make the book itself less energy-intensive to read, but it increases the energy needed to share the understanding with others. The author also notes that System 1 and System 2 are fictional characters, but that does not mean every reader understands that.
  Eugene_Kernes | Jun 4, 2024 |
This book is amazing and I recommend it to everyone. It is about how we think and make decisions. It explains the contributions of the conscious and unconscious parts of our minds in terms that are understandable. And thus, it explains how it is that although we think we are, in fact, we are not and cannot be truly rational. Brilliant!
  dvoratreis | May 22, 2024 |
A good & important book, but not a quick read. 3-1/2 stars
  Abcdarian | May 18, 2024 |
Dad gave me this book and I read it right away in the summer of 2020 while we were living in Escondido with the Zags, waiting to move into our CPEN house. Need to reread this one soon. It has been referenced dozens of times in various classes and PME since I read it, and I am embarrassed by my lack of recollection despite having read it cover to cover.
  SDWets | Feb 19, 2024 |
Very tedious reading. This book should have been half as long - maybe less.
  donwon | Jan 22, 2024 |
The replication crisis in psychology does not extend to every line of inquiry, and just a portion of the work described in Thinking, Fast and Slow has been cast in shadows. Kahneman and Tversky’s own research, for example, turns out to be resilient. Large-scale efforts to recreate their classic findings have so far been successful. One bias they discovered—people’s tendency to overvalue the first piece of information that they get, in what is known as the “anchoring effect”—not only passed a replication test, but turned out to be much stronger than Kahneman and Tversky thought.

Still, entire chapters of Kahneman’s book may need to be rewritten.
added by elenchus | Slate.com, Daniel Engber (Dec 1, 2016)
 
"It is an astonishingly rich book: lucid, profound, full of intellectual surprises and self-help value. It is consistently entertaining and frequently touching..."
added by melmore | New York Times, Jim Holt (Nov 25, 2011)
 
Thinking, Fast and Slow is nonetheless rife with lessons on how to overcome bias in daily life.
 


Author name | Role | Type of author | Work? | Status
Kahneman, Daniel | Author | primary author | all editions | confirmed
Chamorro Mielke, Joaquín | Translator | secondary author | some editions | confirmed
De Vries, Jonas | Translator | secondary author | some editions | confirmed
Egan, Patrick | Reader | secondary author | some editions | confirmed
Eivind Lilleskjæret | Translator | secondary author | some editions | confirmed
Gunnar Nyquist | Translator | secondary author | some editions | confirmed
Van Huizen, Peter | Translator | secondary author | some editions | confirmed
Dedication
In memory of Amos Tversky
First words
Every author, I suppose, has in mind a setting in which readers of his or her work could benefit from having read it.
Quotations
extreme outcomes (both high and low) are more likely to be found in small than in large samples. This explanation is not causal. The small population of a county neither causes nor prevents cancer; it merely allows the incidence of cancer to be much higher (or much lower) than it is in the larger population. The deeper truth is that there is nothing to explain. The incidence of cancer is not truly lower or higher than normal in a county with a small population, it just appears to be so in a particular year because of an accident of sampling. If we repeat the analysis next year, we will observe the same general pattern of extreme results in the small samples, but the counties where cancer was common last year will not necessarily have a high incidence this year. If this is the case, the differences between dense and rural counties do not really count as facts: they are what scientists call artifacts, observations that are produced entirely by some aspect of the method of research - in this case, by differences in sample size. p 111
Even now, you must exert some mental effort to see that the following two statements mean exactly the same thing: Large samples are more precise than small samples. Small samples yield extreme results more often than large samples do. p 111
When experts and the public disagree on their priorities, [Paul Slovic] says, 'Each side must respect the insights and intelligence of the other.' p 140
You can also take precautions that will inoculate you against regret. Perhaps the most useful is to be explicit about the anticipation of regret. If you can remember when things go badly that you considered the possibility of regret carefully before deciding, you are likely to experience less of it. You should also know that regret and hindsight bias will come together, so anything you can do to preclude hindsight is likely to be helpful. My personal hindsight-avoiding policy is to be either very thorough or completely casual when making a decision with long-term consequences. Hindsight is worse when you think a little, just enough to tell yourself later, 'I almost made a better choice.'     Daniel Gilbert and his colleagues provocatively claim that people generally anticipate more regret than they will actually experience, because they underestimate the efficacy of the psychological defenses they will deploy - which they label the 'psychological immune system.' Their recommendation is that you should not put too much weight on regret; even if you have some, it will hurt less than you now think. p 352
Unless there is an obvious reason to do otherwise, most of us passively accept decision problems as they are framed and therefore rarely have an opportunity to discover the extent to which our preferences are frame-bound rather than reality-bound. p 367

Book description
System 1 is fast, intuitive, and emotional; System 2 is slower, more deliberative, more controlled, and more logical. The fruit of a lifetime of research, "Système 1/Système 2" lays out a brilliant theory that offers immediate practical applications in daily and professional life.

Rating

Average: (4.11)
0.5 stars:
1 star: 23
1.5 stars: 2
2 stars: 61
2.5 stars: 9
3 stars: 238
3.5 stars: 52
4 stars: 555
4.5 stars: 80
5 stars: 622
