Self Help

Thinking 101 - Woo-kyoung Ahn

Matheus Puppe · 35 min read

The introduction describes the author teaching a course on thinking and decision-making to Yale undergraduates. In one lecture, she plans to have students dance to a K-pop song so they experience the bias of overconfidence firsthand.

The author starts the lecture by describing the above-average effect - the tendency for most people to rate themselves as better than others on various skills and attributes. She gives examples of this from surveys of high school students, college professors, and drivers.

To make students experience biases rather than just hear about them, the author shows a clip of a simple K-pop dance and says she will give prizes to whoever dances it best in front of the class. She expects overconfidence to lead many students to volunteer, only to struggle with the dance.

This sets up the first chapter, which will likely explain the cognitive basis of overconfidence and why we tend to think things will be easier than they turn out to be. The anecdote illustrates how the author aims to actively demonstrate psychological concepts to make them more impactful.

  • The fluency effect causes us to have overconfidence in things that are easy for our minds to process. This manifests in several illusions:

  • Illusion of skill acquisition - Watching someone perform a skill repeatedly can make us erroneously think we could do it ourselves. In a study, people who watched a moonwalking video 20 times were no better at moonwalking than those who watched it once, but were more confident.

  • Illusion of knowledge - When we understand the mechanism behind something, we become more convinced of it. Providing an explanation for how duct tape removes warts makes that claim more believable.

  • Illusion from something irrelevant - Judgments can be swayed by the fluency of irrelevant factors. Stocks with easy-to-pronounce names were expected to do better than those with hard-to-pronounce names, and they did.

  • The fluency effect leads us to underestimate difficulty when things seem smooth and easy. Providing causal explanations increases willingness to accept claims by making them seem more fluent. And completely irrelevant factors that are fluent can distort our judgments.

  • The fluency effect is an adaptive cognitive bias where we tend to favor information that is easy to process. This can lead to overconfidence and poor decisions.

  • The author demonstrates how even experts fall prey to the fluency effect, like trying grooming techniques from a YouTube video or ordering too many seeds after seeing beautiful garden photos.

  • Cognitive biases persist even when we know about them because they evolved as useful mental shortcuts. Fluency helps with metacognition - judging what we know.

  • The benefits of heuristics like fluency outweigh their costs. The analogy with the Ponzo visual illusion shows that illusions persist even when we understand them.

  • We shouldn’t fully discount fluency feelings, just like we shouldn’t discard visual perspective. But overconfidence from fluency can have serious real-life consequences.

  • The key is to be aware of when fluency might lead us astray, and consciously adjust our confidence levels accordingly. We can’t eliminate the bias, but we can mitigate its effects.

  • Fluency effects (when something feels easy or familiar, we think we’ll be good at it) can lead to harmful overconfidence. Just knowing about fluency effects isn’t enough - we need strategies to counteract them.

  • A simple solution is to physically try out skills before performing them. This breaks the illusion of fluency. Examples include rehearsing a presentation, baking a soufflé before cooking for guests, practicing singing in the mirror.

  • Overconfidence about knowledge can also be reduced by spelling out what we think we know step-by-step. Studies show this makes people realize they know less than they thought.

  • Overconfidence about political issues can be reduced by asking people to explain the policies in detail, which makes their views more moderate. Conversations with those who disagree also help puncture the illusion of knowledge.

  • The planning fallacy leads us to underestimate the time and effort needed for tasks. Trying things out doesn’t work here, but strategies like breaking projects into parts and budgeting for errors can help.

  • Overconfidence from fluency is hard to avoid completely, but strategies like trying things out, explaining our knowledge, and planning for errors can reduce harmful effects.

  • The planning fallacy is when people underestimate how long tasks will take because when planning, everything seems to run smoothly in our minds. One study found people thought they’d finish Christmas shopping earlier than they actually did.

  • Making detailed step-by-step plans can exacerbate the planning fallacy by creating an illusion of fluency where the task seems easier than it really is.

  • To avoid the planning fallacy, consider potential obstacles - both task-relevant ones and unexpected events unrelated to the task itself. Also, build in extra time as a buffer.

  • Optimism fuels the fluency effect and planning fallacy by making everything seem like it will go smoothly. Realistic optimism acknowledges challenges, while blind optimism ignores them.

  • To combat blind optimism, think carefully about similar past cases and apply their lessons, rather than dismissing them as being different. Assume the current situation will follow past patterns.

  • When remodeling a house, anticipate obstacles, build in extra time, learn from others’ experiences, and don’t let optimism lead to underestimating potential challenges. Careful planning and realistic timelines are key.

  • Bisma, a former student of the author’s, called the author, upset after visiting a new doctor. She had suffered for years from mysterious health issues like nausea and vomiting.

  • The doctor suspected Bisma had anorexia after she said she didn’t enjoy food. He kept asking leading questions to confirm his diagnosis, despite her denials.

  • The doctor was wrong - Bisma’s symptoms disappeared while abroad, likely due to allergies. But the doctor’s line of questioning illustrates confirmation bias.

  • Confirmation bias is seeking out evidence that confirms your existing beliefs or hypotheses, while ignoring evidence that disconfirms them.

  • Psychologist Peter Wason demonstrated this through his famous 2-4-6 task, where participants try to figure out a rule for number sequences. Most people only test examples that fit their hypothesized rule.

  • To find the right rule, you have to actively test sequences that would disconfirm your hypothesis, not just confirm it. This allows you to consider alternative explanations.
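
To make the 2-4-6 logic concrete, here is a minimal Python sketch (not from the book) assuming the classic setup: the hidden rule is simply “any ascending sequence,” while the tempting hypothesis is “numbers increasing by 2.”

```python
# A minimal sketch of Wason's 2-4-6 task (illustrative; the rule and the
# hypothesis below are the classic ones, not quoted from the book).

def hidden_rule(seq):
    """The experimenter's actual rule: any strictly ascending sequence."""
    return all(a < b for a, b in zip(seq, seq[1:]))

def my_hypothesis(seq):
    """A tempting but wrong guess: each number is 2 more than the last."""
    return all(b - a == 2 for a, b in zip(seq, seq[1:]))

confirming_tests = [(4, 6, 8), (10, 12, 14), (100, 102, 104)]   # fit the guess
disconfirming_tests = [(1, 2, 3), (2, 4, 9), (5, 10, 20)]       # violate the guess

for seq in confirming_tests + disconfirming_tests:
    print(seq, "rule:", hidden_rule(seq), "| my hypothesis:", my_hypothesis(seq))

# Every confirming test gets a "yes" from both functions, so the wrong guess
# looks right. Only the disconfirming tests reveal that sequences like
# (1, 2, 3) also satisfy the real rule, falsifying "increments of 2".
```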

The author illustrates confirmation bias with an example about self-esteem and leadership. Analyzing it step by step:

  • A researcher looked at 1,000 people identified as having high leadership qualities.

  • Of these 1,000 people, 990 had high self-esteem and only 10 had low self-esteem.

  • However, the researcher did not look at people with poor leadership qualities.

  • We don’t know if people with poor leadership also tend to have high self-esteem.

  • So we can’t conclude there is an association between self-esteem and leadership based only on data about people with high leadership qualities.

  • The correct answer is that no conclusion can be drawn from this limited data.

It’s tempting to see the 990 out of 1,000 with high self-esteem and conclude there is a connection. But without data on people with poor leadership, we fall victim to confirmation bias, looking only for evidence that fits our hypothesis. We need the full data to draw a solid conclusion.
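
A tiny sketch makes the missing-data point concrete. Only the good-leaders row below comes from the example; the poor-leaders row is invented purely for illustration.

```python
# Hypothetical sketch of why the missing cells matter. Only the first row
# (good leaders) comes from the example; the second row is invented here.

good_leaders = {"high_self_esteem": 990, "low_self_esteem": 10}   # from the example
poor_leaders = {"high_self_esteem": 990, "low_self_esteem": 10}   # assumed, not observed

def high_self_esteem_rate(group):
    return group["high_self_esteem"] / sum(group.values())

print("P(high self-esteem | good leader):", high_self_esteem_rate(good_leaders))  # 0.99
print("P(high self-esteem | poor leader):", high_self_esteem_rate(poor_leaders))  # 0.99

# If high self-esteem turned out to be just as common among poor leaders, the
# 990-out-of-1,000 figure would imply no association at all. Without the
# second row we cannot tell either way, so no conclusion can be drawn.
```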

Confirmation bias can hurt individuals in two main ways:

  1. It can lead to inaccurate self-perceptions. People seek out information that confirms their beliefs about themselves, while ignoring disconfirming evidence. This was demonstrated in a study where participants were randomly told they had a gene variant associated with depression. Those who received this feedback reported more depressive symptoms compared to the control group, presumably because they selectively recalled confirming instances of feeling depressed.

  2. It can lead people to misinterpret objective information in a self-fulfilling way. An example is the woman who brought a suspicious package (part of a psychology study) to the police station. She and her family assumed their itchiness was caused by the contents, confirming their belief that the package was harmful.

In both cases, confirmation bias led people to develop false or exaggerated views about themselves that matched their internal beliefs or theories. So this cognitive bias can directly hurt those who commit it by producing inaccurate self-concepts.

  • The study on genetic tests and the earlier example of personality tests illustrate the dangers of confirmation bias leading to vicious cycles. This happens when an initial tentative hypothesis becomes more certain and extreme as only confirming evidence is accumulated, which then causes people to seek out more confirming evidence.

  • Tests like genetic and personality tests provide probabilistic, not definitive, results. There are many factors that influence outcomes beyond what the tests measure. However, confirmation bias can lead people to an exaggerated and invalid view of themselves based on these tests.

  • Confirmation bias can also operate at a societal level, such as the assumption that men are better at science than women. This prevents women from being given fair opportunities, leading to fewer eminent female scientists, which then reinforces the notion. This is irrational and unfair.

  • Societal confirmation bias also leads to lost opportunities and costs, such as in the case of discrimination against Black Americans. Equal opportunity could have led to trillions more in GDP.

  • Confirmation bias persists because it allows us to conserve mental energy by not thoroughly questioning our assumptions. It was adaptive for our ancestors to rely on what worked before rather than extensively testing alternatives.

  • Confirmation bias leads us to seek out evidence that confirms our existing beliefs and hypotheses, while ignoring disconfirming evidence. It is a deeply rooted tendency that helps us be efficient, but can also lead us astray.

  • One way to counteract confirmation bias is to actively try to falsify our hypotheses by looking for disconfirming evidence. But this can be risky or unethical in many real-world situations.

  • A more practical method is to consider two mutually exclusive hypotheses and look for evidence that could confirm or disconfirm both. This forces us to consider alternatives.

  • Framing questions in opposite ways (e.g. “Am I happy?” vs “Am I unhappy?”) can also counteract confirmation bias by prompting us to consider different perspectives.

  • However, confirmation bias is deeply ingrained, so overcoming it completely is very difficult. The best we can do is be aware of this tendency and try to actively consider alternatives when possible. But some residual bias will likely remain.

Here are the key points about the cues we use to infer causality:

  • We rely on similar cues or strategies when inferring the causes of events, which allows us to agree on more reasonable causes even if we don’t always agree completely.

  • Some important cues are:

  • Similarity - We treat causes and effects as similar to each other. Wilson’s flu seems very different in scale and severity from the Holocaust, making us reluctant to link them.

  • Sufficiency/Necessity - Causes are seen as sufficient and/or necessary for the effect. To the extent Wilson’s flu is seen as sufficient/necessary for the harsh Versailles Treaty and the treaty as sufficient/necessary for Hitler’s rise, the flu may be blamed for the Holocaust.

  • Recency - More recent events in a causal sequence get more blame or credit. Wilson’s flu was too far removed in time from the Holocaust compared with Hitler’s rise, so it gets less blame.

  • There are infinite possible causes for any event, but we narrow them down to more reasonable options using these cues. Which cues we weigh more heavily affects which causes we pick. Cues like recency and similarity make us less likely to blame Wilson’s flu for the Holocaust.

  • The similarity heuristic is the tendency to assume causes and effects will be similar in magnitude or characteristics. This can lead us astray when small causes produce large effects or vice versa.

  • The sufficiency heuristic leads us to identify one cause as fully responsible for an outcome, discounting other potential contributing causes. This can lead to unfairly discrediting people or ideas.

  • Overreliance on similarity makes us reluctant to believe dissimilar causes and effects are related, like germ theory explaining disease.

  • Relying on sufficiency makes us ignore other possible causes once we’ve identified one sufficient cause. This is shown in examples like attributing someone’s success only to connections, not skill.

  • These heuristics are useful but can also lead to faulty conclusions if we are not aware of their limitations. We need to consider multiple potential causes for an outcome, not just the most obvious or similar one.

Here is a summary of the key points about wrong conclusions:

  • People often draw different causal conclusions about the same event based on their perspectives and what they consider “normal.” Something that seems abnormal from one viewpoint may seem normal from another.

  • Perspective influences whether we blame actions versus inactions. People tend to feel more responsible for outcomes resulting from their actions rather than their inactions.

  • Jumping to conclusions about causes without considering alternatives can lead to wrong attributions. For example, attributing gender gaps solely to biological differences discounts potential societal and environmental factors.

  • Necessity and sufficiency are important criteria for determining causality, but they should be used carefully. Just because something is necessary or sufficient for an outcome does not automatically make it the cause.

  • Overall, making unwarranted assumptions about causality based on limited information or perspective can lead to incorrect conclusions. Being aware of our biases and considering multiple perspectives allows us to make more accurate causal attributions.

  • When an action directly causes harm, we view it as worse than inaction that allows harm to occur, even if the end result is the same. For example, we see murder as worse than negligent homicide, even though someone dies in both cases.

  • It’s easier for us to imagine undoing an action than undoing inaction. We tend to think “if only they hadn’t done that” rather than trying to imagine all the things they could have done instead.

  • Inaction is invisible, so we don’t always recognize when failing to act causes harm. But inaction can have serious consequences, like failing to combat climate change or neglecting to vote.

  • We tend to give causal credit to the most recent event in a sequence, even when earlier events were equally important. This is known as the recency effect.

  • We assign more blame for outcomes we believe were controllable. If something bad happens due to uncontrollable factors, we feel grief but not necessarily guilt.

  • When facing very complex or unexplainable outcomes, we can become stuck ruminating on unanswerable “why me?” questions. This recursive causal thinking provides no solutions.

  • Susan Nolen-Hoeksema’s research showed how rumination (fixating on negative thoughts) can cause depression. In one study, dysphoric participants who ruminated became significantly more depressed.

  • We tend to ruminate more when negative events occur or when we’re in a negative mood, obsessively asking “why” questions.

  • Unfortunately, rumination doesn’t actually help us find solutions or causes. It can lead to more uncertainty, anxiety, and hopelessness.

  • One constructive approach for very difficult causal questions is to distance yourself from the situation. Taking a step back and looking at it from a different perspective can reduce negative emotions.

  • We can never definitively answer “why” questions or be 100% certain of causes. But it may be worth trying to answer “why” questions that could provide insights to guide future actions.

  • If we will not encounter a similar situation again, trying to figure out why something happened is pointless.

  • Stopping obsession over why unfortunate things happened can allow us to take a more distant view, potentially freeing us from negative emotions like remorse and regret. It may also enable more constructive problem-solving for the future.

The chapter discusses how vivid examples and anecdotes can be too powerful, leading us to violate important rational principles. For instance, people may be more persuaded by one or two anecdotes from people they know than by scientific evidence based on much larger samples. The main reason we are swayed by anecdotes is that most people do not fully understand statistics. The author highlights three key statistical concepts we need to understand to avoid irrational judgments:

  • The law of large numbers - more data is better for making inferences. But we often ignore this and trust anecdotes over statistics.

  • Regression to the mean - extreme outcomes tend to gravitate back toward average over time. But we wrongly attribute extreme outcomes to underlying causes.

  • Bayes’ theorem - how to update beliefs based on new evidence. But we tend to ignore prior probabilities.

The author provides examples of how learning about these principles through controlled experiments helps people make more accurate assessments. However, just teaching the concepts is not always enough. Vivid examples and anecdotes can still override statistics in people’s minds, even when they know it’s irrational. The key is to truly understand the statistical principles intuitively.

  • People tend to donate more to a single identifiable victim than to statistical victims, a phenomenon known as the identifiable victim effect. However, teaching people about the law of large numbers can help them give more credence to statistics.

  • The author uses a personal example: her son tried only two sports (skating and soccer) and disliked both, leading her to wrongly conclude he didn’t like sports in general. But there were many sports he hadn’t tried yet; with a larger sample, she might have reached a different conclusion.
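
A short simulation illustrates the sample-size point, under the made-up assumption that the child would actually enjoy 40% of all sports.

```python
import random

# Illustrative simulation; the 40% "true" enjoyment rate is an assumption,
# not a figure from the book.
random.seed(0)
TRUE_LIKE_RATE = 0.4

def prob_no_liked_sports(sample_size, trials=100_000):
    """Estimate how often a sample of this size shows zero liked sports."""
    misses = sum(
        all(random.random() >= TRUE_LIKE_RATE for _ in range(sample_size))
        for _ in range(trials)
    )
    return misses / trials

print("tried 2 sports, liked none: ", prob_no_liked_sports(2))   # ~0.36
print("tried 10 sports, liked none:", prob_no_liked_sports(10))  # ~0.006

# With only two sports sampled, "disliked both" happens over a third of the
# time by chance alone; with ten sports it is rare. Small samples invite
# conclusions the data cannot support.
```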

  • Regression toward the mean is a statistical phenomenon where extreme scores tend to move back towards the average over time. It is often seen in examples like the Sports Illustrated cover jinx, where athletes experience declining performance after appearing on the magazine cover.

  • Regression toward the mean happens because performance is affected by random factors. When those factors align, performance can exceed or fall below true talent level. But those runs of good or bad luck rarely persist indefinitely.

  • Failing to account for regression toward the mean can lead to the regression fallacy, where we inaccurately attribute causes like arrogance or a change in motivation for performance swings that are simply statistical regression.
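
Regression toward the mean is easy to reproduce in a few lines; the sketch below uses arbitrary parameters and models observed performance as stable talent plus random luck.

```python
import random

# Minimal simulation of regression toward the mean (parameters are arbitrary).
# Observed performance = stable talent + random luck.
random.seed(1)
talent  = [random.gauss(0, 1) for _ in range(10_000)]
season1 = [t + random.gauss(0, 1) for t in talent]   # talent + this year's luck
season2 = [t + random.gauss(0, 1) for t in talent]   # same talent, fresh luck

# Select the top 1% of season-1 performers (the "magazine cover" group).
cutoff = sorted(season1)[-100]
cover_group = [i for i, s in enumerate(season1) if s >= cutoff]

mean = lambda xs: sum(xs) / len(xs)
print("cover group, season 1:", round(mean([season1[i] for i in cover_group]), 2))
print("cover group, season 2:", round(mean([season2[i] for i in cover_group]), 2))

# The same athletes score noticeably lower in season 2 even though nothing
# about them changed: they were selected partly for good luck, and luck does
# not repeat. That is the whole "jinx".
```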

  • Bayes’ theorem is an important statistical principle that allows us to rationally update our beliefs when presented with new evidence.

  • It helps us calculate the probability of A given B, based on the probability of B given A. People often erroneously assume these two conditional probabilities are the same.

  • Bayes’ theorem was discovered by Thomas Bayes, likely as a way to argue against Hume’s skepticism about miracles.

  • The theorem has become pivotal in data science and machine learning as a rational way to update beliefs and models based on new data.

  • Despite its intimidating formula, Bayes’ theorem is conceptually about learning rationally from evidence.

  • A key example is updating beliefs about the probability someone has breast cancer after receiving a positive mammogram result. The probability of cancer given a positive test is not the same as the probability of a positive test given cancer.

  • Bayes’ theorem specifies how to make the correct probabilistic calculation, avoiding the common error.

  • The theorem demonstrates why ethnic profiling practices are not statistically justified, despite common misperceptions.

  • Bayes’ theorem allows us to calculate conditional probabilities, like the probability that someone has breast cancer given they tested positive on a mammogram.

  • People often confuse conditional probabilities, believing P(A|B) is the same as P(B|A). This is incorrect.

  • After 9/11, some people wrongly believed that if there is terrorism, Muslims are likely responsible. But the probability that a given Muslim is a terrorist is not the same as the probability that a given terrorist is Muslim.

  • Using government data, only 16 Muslim terrorists caused fatal attacks in the U.S. from 2001-2016. Out of 2.2 million Muslim adults in the U.S., the probability of any given Muslim being a terrorist is essentially zero.

  • Ethnic profiling and discrimination against Muslims due to terrorism fears is not statistically justified and relies on misunderstanding conditional probabilities.
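
For readers who want the mammogram arithmetic spelled out, here is a worked Bayes calculation using commonly cited illustrative numbers rather than figures from the book.

```python
# Worked mammogram example with commonly cited illustrative numbers (assumed
# here, not taken from the book): 1% prevalence, 90% sensitivity, 9% false
# positives.

prevalence     = 0.01   # P(cancer)
sensitivity    = 0.90   # P(positive | cancer)
false_positive = 0.09   # P(positive | no cancer)

p_positive = sensitivity * prevalence + false_positive * (1 - prevalence)
p_cancer_given_positive = sensitivity * prevalence / p_positive

print("P(positive | cancer):", sensitivity)                        # 0.9
print("P(cancer | positive):", round(p_cancer_given_positive, 3))  # ~0.092

# The two conditional probabilities differ by an order of magnitude because
# the base rate of cancer is low. The same logic applies to profiling: even
# if most attackers in some dataset share a trait, the probability that a
# random person with that trait is an attacker stays vanishingly small.
```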

  • Vivid examples stick in our minds, but learning through examples has limits. Solving new problems requires looking past the surface details of the original example.

  • People tend to weigh negative information more heavily than positive information when making judgments about products, people, and events. Negative reviews of a product have a greater impact on sales than positive reviews.

  • One negative childhood event can have lifelong consequences that are not easily offset by positive events.

  • The negativity bias can lead us to make irrational decisions - we avoid options framed negatively even if they are the same as positively framed options. People prefer “on time 88%” flights to “late 12%” flights.

  • In a study, people rated beef labeled “75% lean” as better than beef labeled “25% fat” even though it was the same beef.

  • In a college admissions study, adding a single C to a transcript of Bs lowered ratings more than adding a single A raised them - the single low grade had more impact than the single high grade.

  • The negativity bias causes us to focus on and remember bad events more than good events. We need to consciously counteract this bias to make rational judgments.

  • The author conducted an experiment to test whether college admissions officers would favor students with passionate interests and heterogeneous grades (e.g. some As and some Cs) or students with uniform grades (e.g. all Bs).

  • Though colleges claim to want students who are passionate about certain subjects, the admissions officers overwhelmingly favored the students with uniform grades over those with mixed grades, even when both students had the same high GPAs.

  • This demonstrates the power of the negativity bias - negative information (like a C grade) looms larger than positive information (like an A grade).

  • The negativity bias also affects financial decisions through a phenomenon called loss aversion. Losing money hurts more than gaining the same amount feels good.

  • Loss aversion was demonstrated by Kahneman and Tversky - people won’t take a 50/50 gamble to win or lose $100, but require much higher potential winnings to accept the risk of loss.

  • In investing, the negativity bias means people tend to avoid risks where they could lose money, even if the potential gains outweigh the losses. Overcoming this bias can lead to better financial outcomes.
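
One way to see the asymmetry is to plug the 50/50 gamble into a prospect-theory-style value function; the parameters below are Kahneman and Tversky’s commonly cited estimates, used here only for illustration.

```python
# Sketch of loss aversion with a prospect-theory-style value function. The
# parameters (alpha = beta = 0.88, lambda = 2.25) are commonly cited
# estimates, used here purely for illustration.

ALPHA, BETA, LAMBDA = 0.88, 0.88, 2.25

def subjective_value(x):
    """Felt value of a gain (x >= 0) or loss (x < 0) relative to the status quo."""
    return x ** ALPHA if x >= 0 else -LAMBDA * ((-x) ** BETA)

def gamble_value(win, lose, p_win=0.5):
    return p_win * subjective_value(win) + (1 - p_win) * subjective_value(-lose)

print("50/50 win $100 / lose $100:", round(gamble_value(100, 100), 1))  # about -36, feels bad
print("50/50 win $250 / lose $100:", round(gamble_value(250, 100), 1))  # roughly break-even

# Because losses are weighted about 2.25 times as heavily as gains, the
# potential win has to be far larger than the potential loss before the
# gamble feels acceptable.
```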

  • Loss aversion causes people to strongly prefer avoiding losses over acquiring equivalent gains. This can lead to irrational decisions.

  • In a study, teachers performed better when given a bonus upfront that they might have to pay back if their students didn’t improve, compared to the possibility of an end-of-year bonus. The potential loss motivated them more than the potential gain.

  • The endowment effect is a manifestation of loss aversion - people ascribe more value to things they own than to identical items they don’t yet own. In a study, participants valued a mug or chocolate more once they possessed it, even briefly, and were reluctant to trade it.

  • Losses register in the brain much like physical pain. The desire to avoid losses is an instinctive reaction, even if the loss is small or symbolic.

  • Loss aversion should be considered in negotiations and when structuring incentives. Framing options as potential gains vs potential losses can significantly impact decisions and motivation.

  • While loss aversion is irrational in strictly economic terms, it is a real factor in human psychology that should not be ignored. Accounting for it can lead to better outcomes.

  • The negativity bias is the tendency to pay more attention to negative information and experiences than positive ones. It is an automatic, unconscious cognitive bias.

  • The negativity bias likely evolved because focusing on potential threats and losses was critical to our ancestors’ survival. It draws our attention to problems that need fixing.

  • However, the negativity bias can be harmful when taken to an extreme. It can cause us to overweight negative reviews, risks, or experiences, leading to worse decisions.

  • To counteract it, we can reframe questions positively (e.g., asking which parent should be awarded custody rather than which should be denied it). We can also be aware of how the endowment effect exacerbates the negativity bias, as with free trials of products and services.

  • The negativity bias makes it hard to part with possessions we haven’t used in years. An effective strategy is to focus on the benefits and joy that decluttering will bring rather than the pain of losing items.

  • Overall, awareness of our negativity bias can help us make more balanced, rational choices. Though hardwired, we have some capacity to counteract its harmful effects.

  • The author discusses the phenomenon of “causal imprinting” - when someone forms an initial belief that A causes B, this belief persists even in the face of new evidence that A does not actually cause B. She gives an example from her own experience related to using nightlights for babies.

  • Causal imprinting is a type of confirmation bias, where we interpret new data to fit with our preexisting beliefs.

  • Biased interpretations that contradict reality are very common. The author gives examples like blaming others for problems or blaming oneself excessively.

  • Biased negative interpretations are especially common in depression. The author gives an example of interpreting a lack of prompt response to a text as meaning someone doesn’t care about you.

  • These biased interpretations persist despite contradictory evidence because we tend to interpret new data as confirming our initial beliefs. Letting go of long-held beliefs is psychologically difficult.

  • Such biased interpretations can be harmful to ourselves and relationships. Being able to revise beliefs in light of new evidence is important for growth and learning.

The passage discusses how people can hurt others by making inaccurate assumptions and perpetuating stereotypes. It provides examples of studies showing gender and racial bias, including one where science professors rated an applicant named John as more competent than an identical applicant named Jennifer. Another study found participants were more likely to mistakenly “shoot” an unarmed Black target compared to an unarmed white target in a video game. The passage explains that even intelligent people can be susceptible to biased interpretations, as they are adept at finding ways to dismiss evidence that contradicts their existing beliefs. It describes a study where both supporters and opponents of the death penalty adjusted their views after reading short study summaries, but became even more entrenched in their original positions after reading the studies’ full methodological details, which allowed them to critique the contradictory evidence. Overall, the passage demonstrates how people may cling to and even strengthen their biases by finding reasons to reject countervailing facts.

  • Biased interpretation of facts and data is common. Even people with strong reasoning and analytical skills use those skills to confirm their existing biases rather than draw objective conclusions.

  • Motivated reasoning plays a role, as people are motivated to protect their existing beliefs and worldviews. But biases also occur even without clear motivations.

  • Biases stem from normal cognitive processes. We unconsciously and automatically interpret new information through the lens of our existing knowledge and beliefs, a process called top-down processing.

  • Examples are provided of biases in interpreting visual stimuli, audio stimuli, and data/statistics. Even neutral contexts like traffic lights and voicemail transcriptions get interpreted in a top-down biased fashion based on our existing knowledge and expectations.

  • Quantitative skills don’t eliminate biases and can even make people better at rationalizing data to fit their beliefs. Smart people are still vulnerable to irrational biases.

  • Understanding the automatic cognitive roots of biases helps explain why they are so pervasive and difficult to recognize in ourselves.

  • Top-down processing relies on our knowledge, beliefs, and expectations to interpret sensory information. It allows us to make sense of the world, but can also lead to biased interpretations.

  • Biased interpretations happen inevitably as part of normal top-down processing. Realizing this can help us be more open-minded.

  • Deeply entrenched negative thinking is hard to overcome on our own. Cognitive behavioral therapy can help change habitual thought patterns, but takes consistent practice over weeks or months.

  • There are no quick fixes for biased thinking. We need ongoing effort and help from others to recognize our biases and adjust our interpretations.

  • Being aware of our natural tendency for bias is an important first step. Then we can make a deliberate effort to consider alternate perspectives and question our assumptions.

  • We can’t eliminate top-down processing, since we need it to function. But we can work to counteract its downsides like confirmation bias and prejudice.

In summary, the key is developing insight into our own thinking patterns, being intentional about questioning our interpretations, and getting help to change engrained habits of thought. It’s an ongoing process requiring self-awareness, practice, and an openness to other viewpoints.

  • We frequently miscommunicate with others, even close friends and family members, both in writing and in speech. Studies show we are poor at detecting sarcasm in written messages from friends and at interpreting the intent behind ambiguous spoken statements.

  • We tend to assume others understand us and that we understand others, but research shows our abilities are only at chance levels. We aren’t as good at perspective-taking as we think.

  • There are multiple reasons for these communication failures: we don’t share enough contextual knowledge, we use very different vocabularies, we make unwarranted assumptions, and we lack awareness of our own ambiguous phrasing.

  • Perspective gaps also stem from differences in values, personalities, experiences, and priorities, which shape how we interpret situations. What’s obvious and important to us may not be to someone else.

  • We need to put more effort into considering others’ perspectives, asking clarifying questions, explicitly stating our intended meanings, and not making assumptions. Communication requires work from both sides.

  • Understanding the cognitive roots of miscommunication can help us be more patient. We aren’t necessarily ignoring or dismissing others’ views intentionally. But communication still requires persistence, empathy, and an open mind.

  • In an experiment, speakers tried to convey certain meanings to listeners by saying ambiguous sentences out loud. Listeners had to guess the intended meaning.

  • Some listeners were strangers, while others were the speakers’ spouses, married 14.4 years on average. Speakers were more confident that their spouses understood them than that strangers did.

  • However, spouses were actually no better than strangers at figuring out the intended message. On average, listeners guessed the meaning correctly less than half the time.

  • This happens due to the “curse of knowledge” - when you know something, you have trouble fully taking the perspective of someone who doesn’t know it.

  • Studies show this egocentric bias even in adults. The “tapping game” demonstrated that tappers overestimated how often listeners would guess the song they were tapping.

  • The curse of knowledge makes us overconfident that our messages are clear. We assume others hear the “music playing in our heads.”

  • Smart people who know a lot can fall victim to the curse of knowledge and be poor communicators or teachers.

  • We also fail to consider others’ perspectives in cases where we already know what they know or like. The “status-signal paradox” showed people chose designer items to make friends, wrongly assuming others would be attracted to status signals.

  • Humans want to signal high status and impress others by displaying luxury items like Prada bags or Rolex watches. However, when choosing what to wear to attract friends, we can get trapped in an egocentric perspective. Potential new friends may actually be turned off by flashy luxury items.

  • A study showed that people have difficulty taking others’ perspectives, even when they should be able to. English-speaking participants, asked by a “director” to move a block, often reached for one that was hidden from the director’s view and so could not have been the one meant. In contrast, Chinese participants had no such trouble, likely due to their collectivist cultural background, which emphasizes understanding others’ perspectives.

  • Collectivist cultures like China foster strong group belonging and awareness of others’ thoughts/feelings from an early age. This constant training makes taking others’ perspectives reflexive. Individualistic cultures like the U.S. focus more on individual rights and privacy.

  • We can teach perspective-taking skills without fully adopting collectivist norms. Young kids can learn to understand false beliefs through games where they have to deceive others. As adults, consciously considering others’ mindsets, asking clarifying questions, and seeking feedback help bolster perspective-taking abilities. These skills are essential for normal social interactions.

  • A study trained children to understand false beliefs and mental states. After training, the children were much better at tricking an experimenter in a “hide the candy” game, showing they had learned theory of mind.

  • Theory of mind has two components: cognitive (understanding others’ beliefs) and emotional (understanding others’ feelings). Psychopaths have cognitive but lack emotional theory of mind.

  • Perspective-taking and imagining others’ situations can increase emotional theory of mind and compassion, as shown in a study about Syrian refugees.

  • However, perspective-taking has limits. A series of 24 experiments showed trying to imagine others’ perspectives does not actually improve understanding of their thoughts and feelings.

  • Even the author and her husband, as psychology professors, wrongly believed they could accurately predict each other’s preferences by taking the other’s perspective.

  • Accurately understanding others requires going beyond simplistic perspective-taking. It involves picking up subtle cues and having a nuanced model of how others think.

  • The author received her PhD early, at age 25, helped by an externally imposed deadline - demonstrating that she is capable of delaying gratification for long-term goals.

  • However, in daily life, the author is very impatient, needing immediate answers and results. She contrasts her ability to wait years to complete her PhD with her inability to wait even days or weeks for small rewards.

  • This demonstrates the phenomenon of delay discounting - people often choose smaller immediate rewards over larger delayed rewards, even when it is not logically optimal.

  • The author explains research showing people will take $340 now over $390 in 6 months, despite the fact that this is not a smart financial choice given interest rates.

  • Reasons for delay discounting include misunderstanding future utility, hyperbolic time discounting, and differential activation of brain systems for immediate versus delayed rewards.

  • We mistakenly project our current preferences onto our future selves, failing to realize our future selves will value things differently.

  • To combat delay discounting, we can use commitment devices that restrict our ability to choose immediate rewards, as well as actively imagine our future selves.
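
A minimal sketch of hyperbolic discounting applied to the $340-now versus $390-in-six-months choice; the discount rate is invented for illustration and is not a figure from the book.

```python
# Illustrative hyperbolic-discounting sketch for the "$340 now vs. $390 in
# 6 months" choice. The discount rate k is invented for illustration; the
# book reports the choice pattern, not this formula or these parameters.

def felt_value(amount, delay_months, k=0.05):
    """Subjective present value under hyperbolic discounting: A / (1 + k * D)."""
    return amount / (1 + k * delay_months)

print(round(felt_value(340, 0), 1))    # 340.0 -> immediate reward keeps its face value
print(round(felt_value(390, 6), 1))    # 300.0 -> delayed reward feels smaller, so $340 now wins

# The same curve produces preference reversals: push both options 24 months
# further into the future and the larger, later reward comes out ahead.
print(round(felt_value(340, 24), 1))   # 154.5
print(round(felt_value(390, 30), 1))   # 156.0 -> now the $390 option wins
```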

Here are some key points on why we struggle to delay gratification and how we can learn to wait:

  • Lack of self-control - We often choose immediate rewards due to poor impulse control. Distractions and thinking happy thoughts can help increase willpower.

  • Hyperbolic discounting - Rewards lose value rapidly as delay increases, but the rate of discounting drops over time. Making concrete plans for the future reward can make it feel more real.

  • Present bias - We are biased to prefer payoffs closer to the present. Visualizing your future self enjoying the rewards can reduce this bias.

  • Uncertainty about the future - Doubts about whether you will get the future reward makes you favor the sure thing now. Reducing uncertainty by securing commitments can help.

  • Willpower is limited - Self-control draws from a finite reserve. Take breaks and replenish willpower through rest, meditation, or positive social interactions.

  • Habit formation - Making behaviors habitual reduces the need for willpower. Create routines and triggers to make desired behaviors more automatic.

  • Precommitment - Voluntarily restricting your choices, like deleting games from your phone, removes temptations requiring willpower.

  • Reward substitution - Replace immediate rewards with small, sustainable ones aligned with your goals. A healthy snack can substitute for overeating.

  • Future visualization - Vividly imagine yourself achieving the future goal. Make it detailed and experiential. This makes it seem more attainable.

  • Resisting immediate temptation is difficult, even for children and animals. Adults may be able to resist by distracting themselves or substituting a pleasant alternative.

  • Uncertainty about future outcomes can paralyze decision-making. In one study, students were willing to pay to delay a vacation purchase until after learning their exam results, even though the outcome wouldn’t affect their decision.

  • We dislike uncertainty and are willing to pay to reduce it. This “certainty effect” contributes to the preference for smaller, immediate rewards over larger, delayed ones.

  • The Allais paradox demonstrates how people make inconsistent choices between gambles due to the certainty effect. A 1% difference in probability feels more significant when it separates a certain outcome from an uncertain one than when it separates two uncertain outcomes.

  • In real life, people don’t quibble over small differences in uncertainty, like between 94% and 95% vaccine efficacy. But comparing complete certainty to even a small uncertainty provokes a stronger reaction.

  • The certainty effect makes delayed gratification challenging, as immediate rewards are perceived as certain while delayed rewards feel uncertain. Reframing decisions by canceling out shared components can help make more rational choices.
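
The “canceling out shared components” idea is easiest to see with the classic Allais gambles written out explicitly; the payoffs below are the standard textbook version of the paradox, used here for illustration.

```python
# The classic Allais choices, written as (probability, payoff) pairs. These
# are the standard textbook payoffs, not necessarily the book's numbers.

def expected_value(gamble):
    return sum(p * x for p, x in gamble)

choice_1A = [(1.00, 1_000_000)]
choice_1B = [(0.89, 1_000_000), (0.10, 5_000_000), (0.01, 0)]
choice_2A = [(0.11, 1_000_000), (0.89, 0)]
choice_2B = [(0.10, 5_000_000), (0.90, 0)]

print(expected_value(choice_1A), expected_value(choice_1B))  # ~1.00M vs ~1.39M
print(expected_value(choice_2A), expected_value(choice_2B))  # ~0.11M vs ~0.50M

# Most people pick 1A (the sure thing) but also 2B, which is inconsistent:
# replace the shared "89% chance of $1,000,000" in 1A and 1B with an 89%
# chance of nothing and you get exactly 2A and 2B. Canceling out that shared
# component shows the two pairs pose the same choice; only the feeling of
# certainty in 1A flips the preference.
```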

  • People tend to prefer immediate rewards over larger rewards in the future due to temporal discounting. This bias towards the present can be irrational.

  • There are several factors that contribute to temporal discounting:

  1. Uncertainty about the future makes us anxious, so we prefer certainty now. Boosting confidence about the future can help overcome this.

  2. The future feels psychologically distant, so we discount future rewards. Visualizing future events in detail can make them feel more real and immediate.

  • While persistence and self-control are often praised, an extreme focus on always sacrificing short-term pleasures for long-term goals can backfire and cause anxiety.

  • Balance is important - some give and take between present and future is healthy. Completely fixating on future goals removes joy from daily life.

  • The key is to reduce excessive temporal discounting when it is clearly irrational, while still appreciating the present moment. Moderation is wise.

  • During a trip to Paris, the author realized that future generations may view our overemphasis on work as absurd, just as we now view outdated customs from the past as strange.

  • Many European countries, like those in Nordic regions, understand the importance of work-life balance and rank high in happiness. Excessive self-control can actually impair mental and physical health.

  • Studies found disadvantaged teenagers with high self-control had poorer health outcomes as young adults, likely due to the constant stress of trying to overcome their circumstances. Other studies showed students with a strong desire for self-control performed worse on very difficult tasks, feeling discouraged when perfection was unattainable.

  • The author suggests this may explain elevated anxiety levels in youth today - the gap between aspirations and reality causes stress. We should enjoy the process, not just the end goal. Goals worth pursuing should feel good during the journey.

  • The author hopes the book will promote unbiased thinking to make the world fairer, not manipulation of others. We should be fair to ourselves - neither under- nor over-confident. Decision-making should be impartial, based on statistical principles. Understanding our thinking errors helps prevent exploitation by those who would manipulate them.

Here is a summary of the key points from the acknowledgments section:

  • The author thanks the cognitive psychologists whose work provided the foundation for the book, especially Daniel Kahneman and Amos Tversky.

  • She is grateful to the students who took her “Thinking” course, whose enthusiasm inspired her to spend extensive time preparing lectures and finding engaging examples.

  • Editor Will Schwalbe patiently guided the author through multiple drafts. Agent Jim Levine helped conceptualize the book and insisted on its positive focus. Arthur Goldwag improved the prose while maintaining the author’s voice.

  • Research presented in the book was supported by grants from the National Institute of Mental Health, the National Human Genome Research Institute, and the Reboot Foundation.

  • The author thanks her husband, Marvin Chun, for supporting her career, offering suggestions on drafts, and living with her ups and downs during the writing process. She feels fortunate to have found the right life partner.

Overall, the author expresses gratitude to the researchers, students, editors, funders, and family members who made this book possible through their knowledge, enthusiasm, guidance, support, and patience. She sees the book as a collaborative effort representing the hard work and contributions of many.

Here is a summary of the key points from the cited sources:

  • Self-focused rumination (dwelling on one’s negative thoughts and feelings) can prolong and intensify depression. It prevents people from effectively solving problems. (Nolen-Hoeksema et al., 2008)

  • A study showed that taking a self-distanced perspective when analyzing negative experiences (thinking about the experience as an observer would) was more effective than ruminating or distracting oneself. Self-distancing reduced negative emotions and helped people gain insight into their experiences. (Kross et al., 2005)

  • The self-distancing technique also had long-term benefits, leading to less recurrence of depressive thoughts over time compared to rumination. (Kross & Ayduk, 2008)

  • Negative reviews have a greater impact on sales than positive reviews due to negativity bias. Negative information is given more weight in impressions and decisions. (Cui et al., 2012; Fiske, 1980)

  • Negative life events affect people more than positive ones. Losses loom larger than gains due to loss aversion, a key principle of prospect theory. (Baumeister et al., 2001; Kahneman & Tversky, 1979)

  • Framing information positively (as gains) rather than negatively (as losses) can reduce the impact of negativity bias and loss aversion. (Shafir, 1993)

  • People tend to interpret new information as confirming their existing beliefs, even if the information is contradictory, due to biased assimilation. (Lord et al., 1979)

  • Perspective-taking has limitations. People often project their own knowledge onto others and fail to account for different perspectives, leading to communication issues. (Keysar et al., 1998; Birch & Bloom, 2007)

Here is a summary of some key points about heuristics and biases:

  • Heuristics are mental shortcuts or rules of thumb that allow us to make decisions quickly. They are useful in many situations, but can also lead to systematic biases and errors.

  • Some common heuristics discussed in the book include:

  • The availability heuristic - judging the likelihood of an event based on how easy it is to recall examples. This can lead to biases when certain vivid examples are more “available” in memory.

  • The fluency heuristic - when information feels easy to process, we often intuit it is more valid or true. But fluency can be influenced by superficial factors.

  • The familiarity heuristic - more familiar things feel more true and valid. But familiarity does not necessarily mean something is more likely or more valid.

  • The representativeness heuristic - evaluating how similar something is to a category prototype. But this can ignore base rates and lead to errors.

  • These heuristics rely on mental shortcuts that are often useful but can also lead to predictable errors and biases. Being aware of them can help us recognize places our judgments may be distorted.

  • The book also discusses how biases like confirmation bias and optimism bias are related to our tendency to satisfice with easy, intuitive heuristics rather than think deeply. Recognizing these tendencies can help us identify situations that call for more careful, analytical thinking.

To recap the key concepts from the book on cognitive biases and heuristics, here are the main points:

  • Availability heuristic - judging frequency or likelihood based on how easily examples come to mind

  • Representativeness heuristic - judging probabilities based on similarity to stereotypes

  • Anchoring and adjustment - starting with an initial anchor value and adjusting insufficiently from it

  • Confirmation bias - seeking and interpreting information to confirm existing beliefs

  • Negativity bias - giving more weight to negative than positive information

  • Framing effects - how presenting options in different ways alters decisions

  • Overconfidence - overestimating one’s abilities and knowledge

  • Planning fallacy - underestimating task completion times

  • False consensus - overestimating how much others share your views

  • Sunk cost fallacy - continuing an endeavor based on previously invested resources

  • Hindsight bias - believing past events were more predictable than they were

  • Dunning-Kruger effect - those least competent overestimate their abilities

It covers common heuristics, biases, and effects that influence how we think, judge, and make decisions. The book provides evidence and examples of these systematic thinking errors.

#book-summary