
The Undoing Project - Michael Lewis


Matheus Puppe · 56 min read


Here is a summary of the introduction and first chapter of The Undoing Project by Michael Lewis:

The introduction discusses how the concept of using analytics and data-driven decision making (as seen in Lewis’ previous book Moneyball) has been applied to many different fields, but with mixed results. While it has led to success in some cases like the Oakland A’s and Boston Red Sox, it has also received criticism when outcomes don’t meet expectations.

The review that had the most enduring impact on Lewis argued that the inefficiencies exploited in Moneyball traced back to the work of Daniel Kahneman and Amos Tversky, two Israeli psychologists who had anticipated many of the book's insights about cognitive biases and decision making. This prompted Lewis to explore their story and the origins of their work.

The first chapter provides some colorful examples of memorable yet problematic responses from NBA draft prospects during interviews that would impact how evaluators judged them. It sets up how human judgment and decision making is fallible, leaving open the potential for inefficiencies like those exploited by Billy Beane in Moneyball. The work of Kahneman and Tversky aimed to understand these types of cognitive biases at the root of inconsistent human evaluation.

In summary, the introduction and chapter one lay out Lewis’ motivation for the book - to tell the story of Kahneman and Tversky’s pioneering work on judgment and decision making, which helped explain many phenomena later seen in fields like analytics-driven baseball evaluation.

The passage discusses Daryl Morey’s data-driven approach to evaluating NBA prospects and making personnel decisions for the Houston Rockets. It describes how Morey resisted being swayed by the charm and stories of tall prospects during job interviews. While millions were at stake and the team’s future success depended on these choices, Morey believed in taking a statistically-based approach rather than being influenced by in-person interactions.

The passage also provides background on Morey, how he became interested in using statistics and data analysis to predict outcomes, inspired by Bill James’ work in baseball analytics. As a teenager, Morey was skeptical of so-called experts after his favorite baseball team, the Cleveland Indians, had a far worse season than predicted. This led him to think numbers and data could provide better insights than conventional wisdom. He went on to help revolutionize basketball management by applying this analytical approach.

  • Daryl Morey has always wanted to build winning sports teams. In college, his letters to pro sports franchises seeking a job went unanswered.

  • He decided to get rich so he could afford to buy a sports team. He went to business school thinking that’s how you get rich.

  • He worked as a consultant but the industry felt dishonest, pretending to know uncertain things.

  • He got a job with the Boston Celtics using early versions of a statistical model he built for fun to evaluate basketball players.

  • In 2003, he used the model to pick an obscure player with the Celtics’ late draft pick.

  • In 2007, the Houston Rockets hired him as GM hoping his analytical approach could give them an edge in acquiring talent.

  • Morey installed his statistical player evaluation model with the Rockets. The model helped identify the attributes and metrics that best predict pro success, going against conventional views.

  • While successful with the Rockets, Morey’s approach faced criticism and skepticism from those who felt it intellectualized the game and ignored the role of talent.

  • Daryl Morey was an early pioneer in using analytics and a statistical model to evaluate NBA draft prospects, gathering data that had not been collected before.

  • His model’s first test in 2007 was successful, picking two players who became starters with late draft picks. But 2008 highlighted limitations.

  • Joey Dorsey, highly rated by the model, was a bust. DeAndre Jordan, dismissed by the model, turned into an All-Star.

  • Morey realized the model gave age too little weight - Dorsey's impressive stats had come against much younger competition. Weighting age more heavily improved the model's predictions.

  • The model couldn’t account for “basketball bullies” who dominated weaker opponents but not strong ones.

  • It also missed Jordan since he intentionally underperformed in college but had potential seen by a Rockets scout.

  • Morey concluded the model needed to incorporate more subjective scout evaluation to complement statistical analysis, as neither approach alone was sufficient. This helped address types of players the model struggled with.

  • Morey wanted to analyze physical traits of players more rigorously than ever before, going beyond stats like jump height to measures of explosiveness.

  • Models are limited and human judgment is still needed to assess how physical tools might translate to the NBA. But humans are biased - they misjudge based on appearances, get wedded to early impressions, and seek confirmation of preexisting views.

  • Morey learned this firsthand in drafting Marc Gasol, whom scouts mocked based on his physique but is now an All-Star. He tried reducing biases like overvaluing private workouts.

  • Confirmation bias causes scouts to reshape evidence to fit initial opinions. They also prefer players resembling themselves and overgeneralize from small samples, like wrongly dismissing Jeremy Lin as unathletic due to his race.

  • The mind convinces itself of uncertainties and resists seeing what doesn’t fit expectations. This hampers evaluation of unique talents like Lin until they succeed against the odds. Morey aims to balance models with an awareness of human cognitive limitations.

  • Daryl Morey, GM of the Houston Rockets, attended a Harvard Business School class on behavioral economics that changed his perspective. He realized how cognitive biases can distort judgment, like overvaluing one’s own players in trade negotiations due to the “endowment effect.”

  • Morey started accounting for biases like this in the Rockets’ statistical player evaluation model. They established draft pick values for their own players going into trades.

  • The story then shifts to the Rockets interviewing Satnam Singh, a 7-foot-2 center from India trying to enter the NBA. He had a difficult adjustment living in the U.S. and speaking English.

  • During the interview, Morey and others were most interested in Singh’s physical measurements and growth pattern, which was extraordinary. They struggled to communicate with him and determine his skills and personality due to the language barrier.

  • Morey remained unsure if they could truly evaluate a prospect’s character and future behavior, as psychologists had not provided useful insights. The interview with Singh highlighted the challenges of scouting international players.

  • Daryl Morey is the general manager of the Houston Rockets who revolutionized NBA analytics and player evaluation through a data-driven approach.

  • He interviews Satnam Singh, a 7’2” Indian basketball prospect, but has doubts because they lack meaningful data on his skills and experience playing organized basketball.

  • Morey’s statistical model outperformed the rest of the NBA at drafting, judged by how players panned out relative to where they were selected. Even so, with only a few draft picks each year, outcomes remained highly uncertain.

  • Traditional NBA scouting relied more on physical traits and biases rather than measurable on-court stats. Morey’s analytical approach was initially seen as disruptive but is now widely adopted.

  • The growing availability of data and computing power, as well as new owners open to information advantages, helped enable Morey’s non-traditional path to an NBA front office role.

  • New cognitive insights into human decision-making errors and biases also contributed to a shift toward fact-based evaluation in the NBA and other industries previously dominated by “conventional wisdom.”

  • Danny’s father was arrested and detained in the Drancy internment camp outside Paris in 1941 due to his Jewish identity. Through connections, he was released after 6 weeks but had become very thin.

  • The family fled Paris in 1942 as the Nazis invaded more of France. They hid in barns and farms, using fake identity papers. Danny’s father continued working as a chemist.

  • In late 1942, the family hid in a chicken coop outside Limoges as the threat increased. They received some food packages from L’Oreal where Danny’s father had worked.

  • Danny attended a new rural school but avoided social contact due to the risk. In early 1944, his father’s health declined from untreated diabetes. On April 27th, his father took Danny for a walk and told him to prepare to be responsible for the family, as he sensed he was dying. Danny’s father passed away that night.

  • Danny Kahneman and his mother emigrated to British-ruled Palestine in 1946 after surviving the Holocaust in France. They moved in with Danny’s uncle in Jerusalem.

  • In late 1947, the UN formally divided Palestine into Jewish and Arab states, with Jerusalem belonging to neither. Fighting broke out between Jewish and Arab militias. Danny witnessed violence near his home.

  • In early 1948, 35 young Jewish fighters were killed by Arabs after sparing a shepherd’s life. This event impacted Danny. That same year, a Red Cross convoy was bombed, killing 78 people including a psychologist who had plans to start a psychology department.

  • As fighting intensified, Danny’s mother took him from Jerusalem to safer Tel Aviv. In May 1948, Israel declared independence and neighboring Arab states attacked. Tel Aviv faced sniper fire and danger.

  • During this time, Danny befriended Shimon Shamir, who would later become an Israeli diplomat. This was Danny’s first real friendship after having to keep isolated as a Jew hiding during the Holocaust.

  • Danny arrived in Tel Aviv speaking fluent English, better than anyone else in his class. He was considered brilliant and intellectually gifted.

  • Danny stood out from the other boys in how he tried to develop a proper English accent and was interested in intellectual pursuits rather than typical activities. He impressed his friend Shamir with an essay he wrote on his own about Greek and English culture.

  • After the war, Danny’s mother moved them to Jerusalem where Danny befriended Ariel Ginsburg. Danny kept to himself and didn’t fully assimilate into Israeli culture despite opportunities, maintaining French at home and intellectual interests.

  • Danny decided early on that he would not take on responsibilities or roots in one place. He was identified as suited for psychology on a vocational test and taught himself much of the field from books due to gaps in professors’ knowledge in the new university.

  • Danny was inspired by his charismatic professor Yeshayahu Leibowitz, known for his critiques of Ben-Gurion and theoretical insights. Danny showed a drift towards behaviorism and studying observable behaviors rather than introspection.

  • B.F. Skinner was an influential American psychologist who pioneered the theory of operant conditioning. He trained pigeons using rewards and found they could guide bombs and play games.

  • Skinner believed all animal and human behavior was driven by external rewards and punishments, not inner thoughts or feelings. His work inspired the behaviorist school of psychology.

  • Behaviorism viewed psychology as an objective science that studied observable behavior. It was dominated by white Anglo-Saxon Protestants who experimented on rats but were cautious about directly studying humans.

  • Gestalt psychology, led by German Jews, took a different approach focused on understanding the human mind and experience. They explored how perception and meaning are constructed.

  • Danny Kahneman was interested in these questions and attracted to the Gestalt approach, but still sought objectivity. He struggled to find his niche within psychology’s theoretical divisions and lack of cohesion.

  • Doing military service in Israel, Kahneman avoided direct combat but had vivid memories of almost facing orders that could have required harming civilians, which troubled him deeply.

  • Danny was a platoon commander in the Israeli army. On one mission, his unit was withdrawn before engaging in combat, which likely saved them from being ambushed and “butchered.”

  • On another mission, Danny led a patrol into Jordanian territory by accident after missing a sign. They narrowly avoided Jordanian snipers and retrieved a lost backpack, even though it was very dangerous. His superiors scolded him not for the near international incident but for not opening fire.

  • Danny found that serving in the army removed his feelings of vulnerability but that he was not well-suited for it. He was assigned to the psychology unit despite having no formal psychology training.

  • The new Israeli army struggled to integrate diverse immigrants and coordinate poorly trained units. Danny was tasked with designing personality tests to sort recruits into specializations, but found his predictions did not match outcomes - yet his confidence in them persisted anyway, an error he compared to a visual illusion.

  • Danny sought an alternative to short individual interviews that were prone to halo effects, where positive impressions of one trait influenced judgments of others. He wanted a more objective way to evaluate recruits.

  • Danny Kahneman overhauled the Israeli army’s assessment of recruits in the 1950s after reading Paul Meehl’s Clinical versus Statistical Prediction, which showed that clinical psychologists’ judgments were often worse than simple algorithms.

  • He taught interviewers to ask very specific behavior-based questions and rate recruits on scales from 1-5 for characteristics like sociability. This was designed to minimize bias and rely more on objective facts.

  • Danny then had officers rate their own soldiers to identify traits of successful personnel in different roles. However, he found there were no meaningful differences - the same personality traits tended to lead to success across roles.

  • His assessment process proved highly successful at predicting who would succeed or fail in the army. It reduced reliance on intelligence and education alone. The Israeli military still uses a similar process today with minor changes.

  • Later as a professor, Danny realized his inclination was not to tear down others’ statements but to make sense of them and find applicable broader truths, as he had done with the army assessment problem.

  • He briefly pursued personality research inspired by Walter Mischel’s marshmallow experiments but struggled to replicate his own findings, doubting his ability as a scientist in that field. He abandoned personality studies as a result.

  • Amnon Rapoport was an 18-year-old Israeli who was selected for tank commander duties in the Israeli army in 1956. He had experiences in skirmishes with Jordan and Egypt that gave him perspective on the dichotomy between being an efficient killing machine in battle versus showing compassion.

  • After his military service, he took a job as a bookkeeper in a remote copper mine to get away from it all. One day while in the desert, he saw an ad about a new psychology department opening at Hebrew University and was intrigued, though he knew little about psychology.

  • Amos Tversky was a small, pale, baby-faced soldier who stood in line next to Amnon when applying to the highly competitive psychology program. Amnon was impressed by Tversky’s intelligence.

  • Tversky came from a pioneering Zionist family in Israel. He had a gift for math and science but also storytelling. He was physically fearless and agile. He went on to excel in the psychology program and become Amnon’s collaborator.

  • Amos Tversky pursued humanities in high school against expectations, forming an intense friendship with another gifted student, the future poet Dahlia Ravikovitch.

  • He joined Nahal, an army program combining military service with farming, but volunteered to become a paratrooper when Moshe Dayan recruited elite soldiers.

  • As a paratrooper, Amos proved fearless, making over 50 jumps including behind enemy lines. He was decorated for bravery after saving a soldier’s life during an explosion.

  • Amos rose to platoon commander but his letters home were censored. He endured difficult combat experiences and testified against a sadistic officer.

  • By his late military service, Amos had changed drastically from the boy his sister knew. He rarely discussed his experiences but told funny stories instead.

  • Amos compelled himself to be brave despite initial weakness, and bravery became habit. After the army, he sensed being a very different person than when he entered.

  • Amos Tversky was known for his clever and sometimes cutting remarks directed at people he saw as full of themselves. He would playfully challenge eminent figures in economics, physics, and other fields.

  • Despite his unremarkable physical appearance, everyone who met Amos thought he was the smartest person they had ever encountered. He had an extraordinary first reaction to any intellectual problem.

  • Amos kept unconventional hours and prioritized only what interested him. He minimized social obligations and did exactly what he wanted to do. This allowed him to focus purely on his intellectual pursuits.

  • Amos had a close friendship with Amnon Rapoport from their university days. Amnon admired Amos’ intelligence but wondered why Amos liked him in return.

  • Amos strategically chose his academic focuses based on where he thought he could have the most impact, rather than out of traditional interest areas. He dropped philosophy after determining the major problems had already been solved.

So in summary, the passage describes Amos Tversky’s brilliant yet unconventional approach to intellectual pursuits and social interaction, as well as his powerful impact on those who knew him.

  • The passage discusses Amos Tversky’s interest in psychology and how it developed. As a child growing up in Israel, he was fascinated by understanding human behavior and why people think and act the way they do.

  • He found philosophy too subjective and preferred psychology’s attempt to be a more empirical science through testing theories on representative samples. However, he found many areas of psychology lacking in rigor.

  • One area that intrigued him was decision theory, which aimed to understand and predict how people make choices. An influential paper by Ward Edwards highlighted gaps in testing economic assumptions about rational decision making against actual human behavior.

  • Amos grew excited about the potential to experimentally test predictions from fields like economics about things like whether people’s preferences are transitive (if they prefer A to B and B to C, do they prefer A to C). He saw an opportunity to help build psychology into a more validated science.

  • This led Amos to pursue his graduate studies under Ward Edwards at the University of Michigan, where they would go on to pioneer the field of behavioral decision theory through rigorous empirical studies.
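The transitivity question Amos wanted to test can be made concrete. A minimal sketch (the gamble labels and choice data here are invented for illustration) that scans a table of pairwise choices for an intransitive cycle:

```python
# Hypothetical sketch: look for an intransitivity (prefers A to B and
# B to C, yet prefers C to A) in a table of pairwise choices.
def find_intransitive_triple(prefers, items):
    for a in items:
        for b in items:
            for c in items:
                if (len({a, b, c}) == 3
                        and prefers.get((a, b))
                        and prefers.get((b, c))
                        and prefers.get((c, a))):
                    return (a, b, c)
    return None

# Illustrative data: three gambles forming the kind of cycle Tversky
# later induced experimentally ("Intransitivity of Preferences", 1969).
choices = {
    ("A", "B"): True, ("B", "A"): False,
    ("B", "C"): True, ("C", "B"): False,
    ("C", "A"): True, ("A", "C"): False,
}
cycle = find_intransitive_triple(choices, "ABC")  # ("A", "B", "C")
```

If people were the rational actors economics assumed, this function would always return None on real choice data; finding cycles is exactly the kind of empirical test Amos had in mind.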

  • Amnon and Amos are Israeli students who come to the US on Fulbright scholarships to study decision making, as there was no one teaching that field in Israel. They view the move as temporary.

  • Amos Tversky, upon arriving at the University of Michigan, is quite quiet and speaks little English. Fellow students initially view him with pity.

  • By his second year, Amos’ English has improved and stories about him start circulating, like ordering food items a restaurant doesn’t have.

  • Amos passes the PhD foreign language requirement by translating French math equations, despite not speaking much French.

  • Amos experiments on inmates at Jackson State Prison to understand decision making, offering incentives like candy and cigarettes. He finds people can be induced into irrational preferences similar to student experiments.

  • Amos is drawn to professor Clyde Coombs’ work on measuring preferences and comparing choices to “ideals” to predict decisions, like choosing how much sugar to put in tea.

  • Amos is wary of his advisor Ward Edwards due to his eccentric behavior like charging for beer at parties and claiming credit for students’ work.

  • Psychologist Clyde Coombs argued that people make choices by comparing an “ideal” in their head to real-world options and choosing the option most similar to the ideal. But how do people judge similarity?

  • Amos Tversky studied how people judge similarity between objects and ideas. He found asymmetries - people would say North Korea is similar to China, for example, but not that China is similar to North Korea. This contradicted existing theories that assumed similarity was symmetric.

  • Tversky proposed people judge similarity by comparing features, not distances on a mental map. The more shared features, the more similar. Fewer shared/more unique features means less similar.

  • His “features of similarity” theory explained asymmetries and could account for apparently irrational preferences violating transitivity. Context influences which features are noticeable.

  • By changing contexts, you can manipulate which features are salient and thus influence perceived similarity between options/ideas. Similarity has both causal and derivative aspects - it depends on context and classification.

So in summary, Tversky revolutionized understanding of similarity judgments through his observation of asymmetries and “features of similarity” theory of comparing observable traits, not distances on a mental map. Context is also important in determining perceived similarity.
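Tversky later formalized this as the contrast model in “Features of Similarity” (1977): similarity rewards shared features and penalizes each side’s distinctive features, with the subject’s distinctive features weighted more heavily than the referent’s. A sketch with hypothetical feature sets and illustrative weights:

```python
def contrast_similarity(a, b, theta=1.0, alpha=0.7, beta=0.3):
    """Similarity of subject a to referent b under Tversky's contrast
    model: reward common features, penalize distinctive ones, with the
    subject's distinctive features (alpha) weighted above the
    referent's (beta). The weights here are illustrative."""
    return (theta * len(a & b)
            - alpha * len(a - b)
            - beta * len(b - a))

# Hypothetical feature sets for Tversky's classic example
north_korea = {"asian", "communist", "small"}
china = {"asian", "communist", "large", "populous", "ancient"}

# "North Korea is like China" scores higher than the reverse:
s_nk_china = contrast_similarity(north_korea, china)   # ≈ 0.4
s_china_nk = contrast_similarity(china, north_korea)   # ≈ -0.4
```

Because the item with fewer distinctive features pays a smaller penalty as the subject, the asymmetry falls out of the arithmetic rather than needing a special mechanism.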

  • Amos Tversky returned to Israel in 1966 after 5 years studying in the US. His friends noticed he seemed more serious about his work and had adopted some professionalism, like wearing suits.

  • He had also married an American, Barbara Gans, whom he had met while studying in Michigan. She found Israeli culture very informal compared to America; resources were scarce, but basic needs were met equally.

  • In 1967, the Egyptian president closed the Straits of Tiran to Israeli ships, seen as an act of war. Amos was called up to the army to command an infantry unit, despite being 5 years removed from service.

  • The entire country prepared for potential war, with people fearing another siege. The unknowns around the scope and risks of conflict created panic. Amos seemed unconcerned, believing Israel’s air force would prevail. Barbara helped dig trenches on the border as war loomed.

  • In 1967, tensions between Israel and its Arab neighbors Egypt, Jordan, and Syria led to the Six-Day War. On June 5, Israel launched a preemptive air strike that destroyed Egypt’s air force. Israeli forces then invaded the Sinai Peninsula.

  • Israel was quickly at war on three fronts. Barbara, an American living in Jerusalem, took shelter and sewed sandbags while her husband Amos fought in the war.

  • Within a week, Israel achieved a decisive victory, doubling its territory to include the West Bank, Gaza Strip, Golan Heights, and Old City of Jerusalem. The short, conclusive nature of Israel’s wars reinforced the feeling of miraculous protection.

  • The war had personal impacts. Amos lost friends in battle. Barbara came to know four people killed. Survivors like Avi grappled with the experience of sudden, intense combat after minimal training and preparation.

  • Avi fought in capturing the Old City of Jerusalem, then was sent to the Golan Heights. Facing near-certain death, he survived the war and decided to study psychology to understand the human soul.

So in summary, it describes events leading up to and during Israel’s 1967 Six-Day War, including military operations, territorial changes, and personal stories of Israeli civilians and soldiers impacted by the war.

  • Danny Kahneman was an exceptional professor who taught classes part-time in psychology at Hebrew University. His lectures were brilliant, spontaneous and packed with knowledge.

  • The flight instructors he lectured believed criticism was more effective than praise, since cadets flew better after being screamed at and worse after being complimented. Danny recognized this as regression to the mean: an unusually bad performance tends to be followed by a better one, and an unusually good one by a worse one, regardless of any feedback.

  • Danny was incredibly knowledgeable but also volatile and insecure. He lived through his work and constantly doubted his abilities as a teacher, despite his students admiring him greatly.

  • As a researcher, Danny studied various topics like vision, error and forgetting. He built a lab to examine visual illusions and how the eyes process information. His work was imprecise but filled gaps in his knowledge.

  • Though brilliant, Danny was also moody, emotional and starved for admiration. He shifted between many areas of psychology, broadening his expertise. His insecurity both strengthened and weakened him as a teacher and researcher.
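The flight-training point above is pure regression to the mean: if each performance is skill plus luck, an unusually bad attempt will most likely be followed by a better one no matter what the instructor says. A quick simulation, assuming scores are just noise around a constant skill and feedback does nothing:

```python
import random

random.seed(42)

def landing():
    # score = constant skill (0) + independent luck; feedback changes nothing
    return random.gauss(0, 1)

bad_first, bad_second = [], []    # pairs where the first landing drew criticism
good_first, good_second = [], []  # pairs where the first landing drew praise
for _ in range(20000):
    first, second = landing(), landing()
    if first < -1:
        bad_first.append(first)
        bad_second.append(second)
    elif first > 1:
        good_first.append(first)
        good_second.append(second)

def mean(xs):
    return sum(xs) / len(xs)

# "Criticism works": scores rise after bad landings (about -1.5 up to about 0).
# "Praise backfires": scores fall after good ones (about +1.5 down to about 0).
print(mean(bad_first), mean(bad_second))
print(mean(good_first), mean(good_second))
```

The instructors saw a causal effect of their feedback where there was only selection on luck.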

  • Danny was interested in studying subliminal perception and whether people could be unconsciously influenced or learn things without realizing it. He conducted experiments showing subjects images or numbers quickly to see if they could detect patterns unconsciously.

  • He moved on from interests quickly if experiments failed or flaws were found. Colleagues noticed his willingness to try new ideas and change his mind often.

  • He observed flaws in psychoanalysis after analysts failed to predict a patient’s suicide despite closely studying her for a month.

  • In hypnosis experiments at the University of Michigan, he questioned the validity of subjects’ claims, reasoning that if relived trauma were really as extreme as described, subjects would choose physical pain over reliving it.

  • He was intrigued by research on pupil dilation in response to stimuli and how it could reveal unconscious preferences and mental effort’s impact on perception. He conducted related experiments on effects of tasks on pupil size.

  • Angry at a delay in getting tenure at Hebrew University, he moved to Harvard, where he was influenced by Anne Treisman’s research on selective attention and how listeners filter sounds at cocktail parties.

  • Anne Treisman’s selective-listening research in the 1960s found that people could not entirely filter out unattended audio streams playing in each ear. This challenged assumptions about the ability to selectively attend.

  • Danny Kahneman was interested in her work and how it related to attention in pilots. He conducted further research showing differences in attentional abilities between successful and unsuccessful fighter pilots.

  • Kahneman applied psychological insights to real-world problems in Israel, like assessing panic response and facilitating organizational change.

  • He taught the practical application of psychology and challenged students to address open-ended problems, even without complete information. His work aimed to make psychology useful and informed topics like aviation safety, terrorism response, education, and workforce training.

  • Danny brought games to class where the objective is to guide a metal ball through a maze. He assigned students to teach someone else how to teach the game, breaking it down into component skills.

  • Danny always found problems for his students to solve, even where none seemed to exist. Students would arrive wondering what problem he would bring that day.

  • One day, Danny invited Amos Tversky, a mathematical psychologist, to speak to the class. Danny and Amos had an apparent rivalry as stars of the department.

  • Amos described an experiment from Ward Edwards’ lab involving guessing probabilities from drawing poker chips from bags of mostly red or white chips. The experiment tested if people intuitively follow Bayesian reasoning when updating beliefs based on new information.

  • The research found people do shift probabilities in the right direction when drawing chips, but not as much as the true Bayesian odds. Edwards coined the term “conservative Bayesians” to describe how people reason with new information.
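The “true Bayesian odds” in the chip experiment are straightforward to compute. A sketch assuming the classic mirror-image setup (one bag 70% red, the other 70% white; Edwards’ actual proportions varied across experiments):

```python
def p_mostly_red(draws, p_red=0.7, prior=0.5):
    """Posterior probability the chips come from the mostly-red bag.

    Each red chip multiplies the odds by p_red / (1 - p_red);
    each white chip divides them by the same factor.
    """
    odds = prior / (1 - prior)
    for chip in draws:
        if chip == "R":
            odds *= p_red / (1 - p_red)
        else:
            odds *= (1 - p_red) / p_red
    return odds / (1 + odds)

# Three reds in a row already push the Bayesian answer to about 0.927;
# Edwards' subjects moved in the right direction but fell well short.
p = p_mostly_red("RRR")
```

The gap between this number and what subjects reported is what Edwards summarized as “conservative” Bayesian reasoning.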

  • Psychologists like Ward Edwards believed that people intuitively behaved like Bayesian statisticians, updating probabilities based on new information in a rational way. This view aligned with prevailing theories in social science that people were rational actors.

  • Amos attended a seminar by Danny where Danny critiqued an experiment showing people adjusted probabilities after drawing chips from bags. Danny argued this proved little and people were not actually good intuitive statisticians.

  • Danny was skeptical of theories that did not match real-world behavior. He felt experiments like the one Amos described had no psychology and were just math exercises.

  • After the seminar, Amos seemed to have doubts about prevailing theories for the first time. Theories he had accepted were now seen more suspiciously. Amos was generally open-minded but not used to changing his views.

  • Danny argued prevailing theories blinded researchers and they tried to fit evidence to theories rather than adapting theories to evidence. Overall it captures Danny confronting Amos and theories of human rationality, causing Amos to reexamine his assumptions.

  • Amos Tversky and Daniel Kahneman were two prominent psychologists at Hebrew University in Israel in the late 1960s and early 1970s. Though they had very different personalities, they became close collaborators.

  • Tversky was an intuitive, confident Israeli “Sabra” while Kahneman was cautious, analytical, and still affected by his experiences surviving the Holocaust as a child. However, they shared interests in studying human judgment and decision-making scientifically.

  • After Tversky gave a talk criticizing other researchers’ views, Kahneman confronted him and said he didn’t believe the claims. This triggered Tversky to rethink his own views and assumptions. They began collaborating intensely, though keeping their discussions private.

  • Despite their many differences, they found common ground as Eastern European Jews without religious belief who wanted to use science to discover simple truths about human cognition. Their collaboration produced major advances in the field of behavioral economics and decision theory.

So in summary, Tversky and Kahneman had very different personalities but came to share an intellectual pursuit that led to a highly productive collaboration, despite initially keeping their discussions very private from others.

  • Danny Kahneman was a messy and disorganized professor, while his colleague Amos was fastidiously tidy and minimalist. They struck up an unlikely friendship.

  • Danny had developed some questions testing people’s intuitions about statistics and probability. Amos administered these questions to psychologists at conferences.

  • Danny and Amos then collaborated to analyze the results. They had to work in a small room due to their messy/tidy offices and took turns writing by hand.

  • They found that even trained psychologists had poor intuitions about concepts like sample size and variability. People overestimated how closely small samples would resemble the populations they were drawn from.

  • They termed this the “belief in the law of small numbers.” Their paper argued this mental error was common even among scientists and could undermine social science research relying on small samples.

  • The paper, published in Psychological Bulletin, used humor and self-deprecation. It drew on Danny’s own statistical mistakes to argue these errors were human tendencies, not just individual flaws.

  • Their unlikely collaboration was highly productive and led to an influential challenge of common assumptions in psychology.
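The “law of small numbers” error is easy to demonstrate by simulation: small samples look unrepresentative far more often than intuition suggests. A sketch using fair coin flips as the population:

```python
import random

random.seed(1)

def frac_heads(n):
    # proportion of heads in n fair coin flips
    return sum(random.random() < 0.5 for _ in range(n)) / n

def p_looks_lopsided(n, trials=5000):
    # how often a size-n sample shows more than 60% heads
    return sum(frac_heads(n) > 0.6 for _ in range(trials)) / trials

small = p_looks_lopsided(10)    # happens often (roughly 17% of the time)
large = p_looks_lopsided(1000)  # essentially never
```

A researcher who trusts ten observations will regularly “discover” effects that a thousand would erase, which is exactly the risk the paper flagged for small-sample social science.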

  • Psychologists Danny Kahneman and Amos Tversky published a famous 1974 paper in Science, “Judgment under Uncertainty: Heuristics and Biases,” challenging the common assumption that people are intuitive statisticians who rationally update beliefs based on probability. Their work showed how people rely too heavily on small samples and seek explanations that confirm initial hypotheses rather than considering alternative possibilities.

  • To study how much tall buildings may sway with wind before making people uncomfortable, an engineering company consulted the Oregon Research Institute. They built a secret “sway room” on hydraulic wheels that could mimic building movement without subjects knowing. Eye exams were used to get subjects in without suspicion.

  • Subjects quickly noticed something was off even with slight movement, contrary to assumptions. Engineers visited and were shocked at how sensitive people were. This challenged assumptions about how skyscrapers could be designed and tenants’ comfort with movement.

  • Kahneman and Tversky’s work, and studies like this sway room experiment, highlighted that people’s intuitive judgments do not follow statistics or probability as closely as commonly believed. This challenged dominant theories in psychology and economics about human rationality.

  • Paul Hoffman conducted research at the Oregon Research Institute to understand how experts make judgments and decisions. He developed a method of analyzing the cues or inputs that experts use to infer how they weigh different factors.

  • Psychologists like Paul Meehl argued human judgment was inferior to algorithms. But most fields relied on expert human judgment, so it was important to understand where expert judgment goes wrong.

  • Researchers at ORI hoped to model expert judgment to identify errors and improve decision-making. They began studying how doctors diagnosed cancer from X-rays.

  • Doctors said they considered 7 key signs. Researchers created a simple equally-weighted model of these 7 factors to predict diagnoses.

  • When doctors diagnosed the same X-rays twice, the simple model was extremely accurate at predicting their judgments, even though doctors saw their thinking as complex. This showed expert judgment could be modeled with surprisingly simple algorithms.

So in summary, the ORI researchers developed methods to analyze cues experts use and were able to accurately model expert medical judgments with basic algorithms, contrary to experts seeing their own thinking as complex. This supported the idea that human judgment has systematic errors that models can help uncover.

  • Researchers gave doctors duplicate medical images/cases to diagnose and found wide disagreement even within individual doctors’ own diagnoses, suggesting fallibility.

  • They also found clinical psychologists/psychiatrists widely disagreed in case judgments and that experience didn’t correlate with accuracy.

  • They created simple algorithms/models based on doctors’ responses that actually outperformed the individual doctors, including even the single best doctor, at cancer diagnosis.

  • This challenged assumptions that human expertise and experience ensured superior judgment to basic models. It suggested experts are still human and prone to biases affecting reliability.

  • Amos Tversky visited Paul Slovic and shared the early ideas he had been developing with Danny Kahneman about intuitive judgments and why experts might systematically err, not just from having bad days but from deeper cognitive biases.

  • They designed strange hypothetical questions on probability/odds and had Israeli and US students answer to collect data on intuitive errors and identify judgment patterns, hoping to understand “what people do” when judging probability concepts abstractly.

  • Amos and Danny became close collaborators. Amos would let his guard down and think differently when working with Danny, which fueled their successful collaboration.

  • They moved to Eugene, Oregon to continue their work. They would analyze data each morning and spent afternoons in lengthy conversations, bouncing ideas back and forth.

  • Their work became playful for both of them. Amos helped Danny view it as less stressful. They thought so similarly that others said they shared a mind.

  • Their first paper on intuitive versus statistical reasoning raised questions about how people actually think. Their next paper explored the “representativeness heuristic” - people’s tendency to judge likelihoods based on similarity to mental models or prototypes, rather than true statistical probabilities. This provided a partial answer to how non-statistician minds approach chance situations.

  • They pioneered the use of simple, often cryptic paper titles to force themselves to clarify each paper’s focus, though these titles were initially hard to interpret by academic standards of understandability.

  • Their collaboration was highly private and they often wanted only each other’s company when working, generating new ideas through intensive discussion.

The representativeness heuristic involves comparing a specific case to its parent population. Here are examples of such comparisons for several topics:

Gastric ulcers: A gastric ulcer in a specific person would be compared to the general population of people who experience gastric ulcers. Gastric ulcers affect a minority of the overall population.

Genocidal dictators: A specific genocidal dictator would be compared to the smaller population of dictators throughout history who implemented genocidal policies. Genocidal dictators are thankfully rare compared to the total number of historical rulers.

NBA players: An individual basketball player trying to make an NBA roster would be compared to the elite population of professional basketball players who succeed in reaching the NBA level. NBA players represent an extremely small segment of the overall population and an even smaller portion of those who play basketball. Only a fraction of basketball players have the talent and skills required to play professionally in the NBA.

  • Danny and Amos conducted experiments showing that people’s estimates of probabilities are often distorted by cognitive biases and heuristics (mental shortcuts).

  • One heuristic is availability - events that easily come to mind seem more likely. They found people overestimate probabilities of memorable events.

  • Representativeness is another heuristic where people judge probabilities based on similarity to prototypes rather than base rates.

  • Anchoring bias means people anchor estimates to irrelevant numbers, such as values produced by spinning a wheel of fortune.

  • Availability and anchoring lead to systematic errors because memories and initial numbers influence judgments even when not relevant.

  • Heuristics allow quick judgments but fall short when evidence needed is hard to retrieve or misleading evidence is salient.

  • Danny and Amos were early exploring these cognitive biases but lacked tools to fully understand mechanisms at play in the mind. Their work laid foundations for further study of heuristics and biases in judgment.

  • Amos Tversky and Daniel Kahneman were interested in how people make predictions and judgments under uncertainty. They noticed people don’t always follow statistical probabilities but instead rely on mental shortcuts or “heuristics.”

  • They conducted an experiment to test how the “representativeness heuristic” affects predictions. They created a hypothetical student profile named “Tom W.” and described his personality traits.

  • They asked one group to rate how similar Tom seemed to students in different graduate programs to see which field was most “representative” of him based on his profile.

  • They asked a second group to use Tom’s profile to predict which graduate program he would actually enroll in, after providing base rate statistics on acceptance rates to different programs.

  • The goal was to see if adding limited personal information affected predictions in a way that diverged from statistical probabilities/base rates. This would demonstrate the influence of representativeness heuristics on judgment under uncertainty.
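The base-rate logic behind the Tom W. experiment can be sketched with Bayes' rule (all numbers below are invented for illustration): even when a profile strongly resembles a computer science student, a small base rate should keep the posterior probability modest.

```python
def posterior(base_rates, likelihoods):
    """Bayes' rule: P(field | profile) is proportional to
    P(profile | field) * P(field)."""
    unnorm = {f: base_rates[f] * likelihoods[f] for f in base_rates}
    total = sum(unnorm.values())
    return {f: p / total for f, p in unnorm.items()}

# Invented numbers: computer science is a small field (low base rate), but
# Tom's profile "sounds like" a computer science student (high likelihood).
base_rates  = {"computer science": 0.05, "humanities": 0.60, "social science": 0.35}
likelihoods = {"computer science": 0.80, "humanities": 0.10, "social science": 0.20}

post = posterior(base_rates, likelihoods)
for field, p in sorted(post.items(), key=lambda kv: -kv[1]):
    print(f"{field:16s} {p:.2f}")
```

Despite the resemblance, the low base rate keeps computer science the least likely field in this sketch; subjects in the actual study instead ranked fields almost entirely by resemblance, ignoring the base rates.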

  • Danny Kahneman and Amos Tversky were psychologists who pioneered the field of behavioral economics by studying cognitive biases and heuristics that influence human judgment and decision-making.

  • In the early 1970s, they started publishing research papers on their findings but initially had a small audience among academics.

  • During a year visiting the University of Oregon, they gained more attention and recognition for their work. It was clear they were “onto something.”

  • Irv Biederman, a psychology professor at Stanford, heard Danny give a talk in 1972 and believed their work could win a Nobel Prize in economics by explaining irrational decision-making.

  • Biederman invited Amos to give talks at the University of Buffalo in the summer of 1972. Amos gave five well-attended talks applying their research to various fields like economics, forecasting, medicine, and law.

  • The talk that stood out was about how their framework could provide a new lens for interpreting history by accounting for cognitive biases in forming explanations and hypotheses.

  • Kahneman and Tversky wanted to establish their research had broad applications and turn real-world domains into laboratories to study decision-making, in order to reach a wider audience beyond academics.

  • Amos gave a talk to historians describing how cognitive biases can affect historical judgments and interpretations. He argued historians tend to impose order on random events and see past events as less uncertain or predictable than they actually were.

  • Amos discussed research by his student Baruch Fischhoff showing people’s memories were distorted after Nixon’s trip to China and Russia. They overestimated how predictable the outcomes were.

  • After hearing Amos, the historians left feeling shaken, realizing how their work could be affected by cognitive biases without realizing it.

  • Amos and Danny were working to spread awareness of cognitive biases and uncertainty to fields beyond psychology. They wrote a summary article for the journal Science.

  • They saw potential in the new field of decision analysis, which tried to make explicit the probabilities behind decisions. However, the way it elicited probabilities from experts assumed accurate judgment, which Danny and Amos knew was flawed.

  • They flew to Stanford to meet with leaders in decision analysis, hoping to integrate their understanding of judgment errors. However, they got news of the Yom Kippur War and rushed home to potentially fight.

The passage describes a case that doctor Don Redelmeier was called to examine at the trauma center of Sunnybrook Hospital in Toronto. A young woman had been in a serious car accident and was found to have multiple fractures in her ankles, feet, hips and face. However, in surgery it was discovered she also had an irregular heartbeat. The trauma center staff immediately diagnosed it as likely being caused by her reported history of an overactive thyroid.

Redelmeier, whose role was to check the thinking of specialists, was skeptical of this initial diagnosis. Even though an overactive thyroid can cause heart issues, it is not a common cause. He urged the staff to consider more statistically likely causes before treating for the thyroid. Upon further examination, they discovered she also had a collapsed lung, which was the actual cause of the irregular heartbeat. Her thyroid tests later came back normal. Redelmeier’s intervention prevented an incorrect diagnosis and treatment, highlighting cognitive biases doctors can fall prey to in emergency situations.

  • The passage discusses Daniel Kahneman and Amos Tversky’s influential work on cognitive biases and heuristics. It focuses on Don Redelmeier, who was deeply influenced by reading one of their early papers as a high school student.

  • Redelmeier grew up in Toronto with speech and coordination issues. He was very talented in math and liked helping other students, which also exposed him to his own fallibility.

  • In medical school, Redelmeier was disillusioned by some professors’ arrogance and tendency to overstate certainties. He noticed contradictory diagnoses and treatments being advocated with overconfidence.

  • Redelmeier was bothered by doctors attributing patient recoveries to their own treatments without evidence. He felt many diseases resolve on their own and doctors feel compelled to act. This can lead to overprescription of antibiotics, unnecessary surgeries, etc.

  • Redelmeier also noticed data being taken at face value without close inspection. For example, an elderly pneumonia patient’s normal heart rate should have been a clue that something was wrong, not a reassurance that all was well.

  • Kahneman and Tversky’s work resonated with Redelmeier because it explained predictable patterns in human errors and biases, including those of doctors and medical students like himself.

  • The emergence of evidence-based medicine in the 1980s highlighted how experts like medical doctors can make flawed intuitive judgments and fail to check their assumptions against data.

  • Daniel Kahneman and Amos Tversky showed even statistically sophisticated people are prone to errors in complex problems, demonstrating physicians were not immune to cognitive biases and fallacies.

  • Redelmeier realized as a medical student that doctors failed to systematically check their work like one checks math, even though medicine involves uncertainty where answers are less clear.

  • An example was conventional wisdom recommending suppressing arrhythmias after heart attacks, which studies later showed increased mortality rates. So medical experts had failed to check themselves against evidence.

  • Redelmeier began voicing skepticism early in his career, questioning the view of prominent doctors who opposed helmet laws for motorcyclists despite evidence it prevented deaths.

  • He was developing a worldview that incorporated Kahneman and Tversky’s work on recognizing human fallibility and the need to check expert intuition with data.

  • Redelmeier had lunch with Amos Tversky and Hal Sox, where Amos did most of the talking and asked probing questions about logic in medical behavior.

  • Redelmeier realized he had learned more about his superior Sox from that one lunch than in the previous three years. Amos knew how to get people to think critically.

  • Later, Amos invited Redelmeier to his office to discuss ideas. Amos explained the “Samuelson bet” paradox and had Redelmeier find a medical analogy, which he did.

  • They subsequently published a paper showing doctors behaved differently treating individual patients vs groups, endorsing different treatments in each case.

  • Redelmeier enjoyed working with Amos, finding it “pure joy.” Amos had many insightful quotes and principles about human reasoning.

  • They studied the “hot hand” fallacy in basketball and people seeing patterns in random sequences. They also showed arthritis pain has no meaningful correlation to weather despite beliefs of a connection.

  • Working with Amos, Redelmeier gained a better understanding of cognitive biases and inconsistencies in human reasoning. It had a significant impact on his thinking.
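The "Samuelson bet" mentioned above is the classic offer of a 50/50 gamble to win $200 or lose $100: Samuelson's colleague refused the single bet but said he would happily take a hundred of them. A short sketch (illustrative, not from the book) shows why the aggregate feels so much safer:

```python
from math import comb

WIN, LOSS, P = 200, -100, 0.5  # the classic Samuelson bet

def loss_probability(n):
    """Probability the total is negative after n independent plays."""
    return sum(comb(n, w) * P**n
               for w in range(n + 1)
               if WIN * w + LOSS * (n - w) < 0)

# One play: a 50% chance of ending down. One hundred plays: an expected
# profit of $5,000 and only a tiny chance of any overall loss.
print(f"1 bet:    EV = ${(WIN + LOSS) * P:.0f},   P(loss) = {loss_probability(1):.2f}")
print(f"100 bets: EV = ${100 * (WIN + LOSS) * P:.0f}, P(loss) = {loss_probability(100):.4f}")
```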

  • Danny Kahneman became interested in studying happiness and how people’s predictions of their feelings differed from their actual experiences. He moved from studying happiness to studying unhappiness and pain.

  • Danny met Redelmeier through Amos Tversky and wanted Redelmeier’s help finding a real-world medical example of the “peak-end rule” - that people’s memories of an experience are shaped more by the peak intensity and end of the experience than the total duration.

  • Redelmeier and Danny studied colonoscopies, finding that patients who experienced a less unpleasant end to the procedure remembered it as less painful and were more willing to undergo another one.

  • Working with Danny was different than working with Amos - Danny focused more on potential flaws whereas Amos had many big ideas. Redelmeier learned little about Danny and Amos’ personal lives.

  • After his experience working with Amos, Redelmeier returned to Toronto wanting to combine cognitive psychology with medical decision making, though he was still figuring out how exactly to do so.
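The peak-end rule is often summarized as: the remembered intensity of an episode tracks the average of its worst moment and its final moment, largely ignoring duration. A minimal sketch with invented pain ratings:

```python
def remembered_pain(ratings):
    """Peak-end rule: remembered intensity is roughly the average of the
    worst moment and the final moment, largely ignoring duration."""
    return (max(ratings) + ratings[-1]) / 2

# Invented pain ratings (0-10) sampled over two procedures.
short_sharp_end   = [2, 4, 8, 7]         # shorter, ends while pain is high
longer_gentle_end = [2, 4, 8, 7, 4, 2]   # more total pain, gentler ending

print(remembered_pain(short_sharp_end))    # (8 + 7) / 2 = 7.5
print(remembered_pain(longer_gentle_end))  # (8 + 2) / 2 = 5.0
```

The longer procedure delivers more total pain yet is remembered as less painful — the pattern Redelmeier and Danny found in colonoscopy patients.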

  • Don Redelmeier felt he discovered another side of himself through working with Amos Tversky - a desire to find truth through data and to replace false patterns in human behavior. However, he was unsure of himself.

  • Their relationship was intensely intellectual and they connected deeply, though it made others uncomfortable. Amos pushed to keep them working together.

  • When the Yom Kippur War broke out in 1973, Amos and Danny rushed back to Israel from the US to help. Amos put on his old army uniform, which still fit, and rejoined as a captain in the military psychology unit.

  • This unit, led by Benny Shalit, had an unusual level of influence and autonomy. Psychologists were embedded with troops to directly advise commanders. Amos and Danny took a jeep to survey troop morale near the battlefield, though others thought this risky. Amos was excited by the opportunity to help in the war effort.

  • The passage describes Amos and Danny’s experiences interviewing Israeli soldiers after the Yom Kippur War of 1973. They administered questionnaires to gather information about soldiers’ experiences and emotions during battle.

  • Danny focused on practical issues like improving training methods based on his observations. Amos seemed more interested in being on the front lines and taking risks.

  • Reading the soldiers’ responses was an eye-opening experience, as they openly discussed fear and emotions for the first time. This challenged perceptions of Israeli soldiers as “supermen.”

  • Danny began giving talks expressing concerns about human judgment limitations and the risks of leaders relying too heavily on intuition rather than data and analysis in high-stakes decisions, especially around war and national security. He and Amos had hoped their research could inform real-world decision making through a more analytical, uncertainty-aware approach.

So in summary, it describes Amos and Danny gathering frontline insights from soldiers after the war, which revealed new perspectives, and Danny’s subsequent interests in applying their research to improve leader decision making.

  • Amos and Danny believed that people should evaluate decisions based on the reasoning and uncertainty involved, not just the outcomes. The goal is to understand risk and play the odds well, not to always be right.

  • They worked with the Israeli intelligence agency to conduct a decision analysis using numerical probabilities for potential threats like war with Syria. But the foreign minister dismissed a 10% increased risk of war as a “small difference,” showing preference for intuition over numbers.

  • This disillusioned Danny and showed that people don’t genuinely understand or trust numerical probabilities to communicate risk. They prefer narratives to numbers.

  • Unable to reform how leaders made decisions, they tried teaching better reasoning to children through school courses. But they never followed through on this broader reform idea.

  • Amos then invited Danny to explore decision making specifically - how people choose between risky options. This shifted their focus from judgment to decisions after forming beliefs about probabilities and outcomes, even with uncertainty. They began collaborating closely on this new area of research.

  • Expected utility theory, first proposed by Bernoulli in the 1730s, posited that people make decisions to maximize their “utility” or subjective value, rather than purely maximizing expected monetary value. It assumed people are risk-averse.

  • This theory explained behaviors like buying insurance, but not behaviors like gambling. Von Neumann and Morgenstern later added mathematical axioms to the theory to formalize rules of “rational” decision making.

  • Amos Tversky questioned the theory’s ability to fully explain or predict human decision making. He found ways people’s decisions violated the axioms, like being intransitive in their preferences.

  • Amos wanted to develop an alternative framework that better incorporated human limitations and inconsistencies. He saw Danny Kahneman as uniquely suited to help with this given his deep insights into psychology and human nature.

  • Danny struggled to understand expected utility theory at first given his unfamiliarity with mathematics. However, he was skeptical it accurately described how people make decisions in reality.

  • An economist named Maurice Allais also doubted the theory and offered arguments against its axioms, helping motivate Amos and Danny’s research on alternatives.
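Bernoulli's core idea can be stated in a few lines: if utility is concave in wealth (logarithmic here, an illustrative choice rather than a claim about the original paper), the expected utility of a fair gamble is lower than the utility of its expected value. That is what "risk-averse" means, and why paying a premium for insurance can be rational.

```python
from math import log

def expected_utility(outcomes, utility=log):
    """EU = sum of p * u(x) over (probability, resulting-wealth) pairs."""
    return sum(p * utility(x) for p, x in outcomes)

wealth = 100_000
# A fair 50/50 gamble: win or lose 50,000. Expected wealth is unchanged,
# but expected *utility* falls, so a risk-averse agent declines the gamble
# (and, symmetrically, will pay a premium to insure against the loss).
gamble = [(0.5, wealth + 50_000), (0.5, wealth - 50_000)]

print(expected_utility(gamble) < log(wealth))  # True: the sure thing wins
```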

Here is a summary of the key points about the Allais paradox and expected utility theory:

  • The Allais paradox was proposed by French economist Maurice Allais and challenged expected utility theory, a major concept in decision theory.

  • Allais presented two choice situations where most people’s preferences violated the predictions of expected utility theory. They preferred certainty over a gamble with a higher expected value in the first situation, but chose a riskier option over certainty in the second situation.

  • This contradiction sparked debate and attempts to resolve the paradox, like Savage proposing a more complex representation of the gambles. However, many like Amos Tversky and Danny Kahneman remained skeptical it was truly resolved.

  • Danny Kahneman saw the paradox not as a logical problem but a quirk of human psychology. He believed people factored in the anticipation of regret into their decisions, not just financial outcomes. Avoiding extreme regret was an important factor in choices.

  • Kahneman and Tversky went on to develop prospect theory as an alternative to expected utility theory, incorporating psychological factors like loss aversion and regret that influence real-world decision making. The Allais paradox was a key motivation for their work developing behavioral economic models of choice.
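The Allais paradox is concrete enough to check numerically. In the standard version, situation 1 offers A ($1 million for certain) versus B (10% chance of $5 million, 89% of $1 million, 1% of nothing); situation 2 offers C (11% chance of $1 million) versus D (10% chance of $5 million). Most people pick A and D, but a quick sketch shows no utility function can produce that pair of preferences:

```python
from math import exp

def eu(gamble, u):
    """Expected utility of a list of (probability, payoff) pairs."""
    return sum(p * u(x) for p, x in gamble)

# Allais's classic gambles (payoffs in dollars).
A = [(1.00, 1_000_000)]
B = [(0.10, 5_000_000), (0.89, 1_000_000), (0.01, 0)]
C = [(0.11, 1_000_000), (0.89, 0)]
D = [(0.10, 5_000_000), (0.90, 0)]

# Algebraically, "A over B" and "C over D" reduce to the same inequality,
# 0.11*u($1M) > 0.10*u($5M) + 0.01*u($0), so no utility function can rank
# A above B while also ranking D above C -- yet most people choose A and D.
utilities = (lambda x: x,                      # risk-neutral
             lambda x: x ** 0.5,               # mildly risk-averse
             lambda x: 1 - exp(-x / 200_000))  # sharply risk-averse
for u in utilities:
    print((eu(A, u) > eu(B, u)) == (eu(C, u) > eu(D, u)))  # always True
```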

  • Danny and Amos were developing a theory of regret to better explain human decision-making under risk and uncertainty. One of their insights was that people experience greater regret over actions they took that led to losses compared to inaction that maintained the status quo.

  • They found regret is closely linked to feelings of responsibility and control over outcomes. The more control one feels, the greater the regret over bad outcomes.

  • Anticipating regret can skew decisions between sure gains and gambles - people are biased toward the sure thing (status quo) due to regret aversion.

  • Testing choices between lotteries and sure amounts uncovered patterns showing people are risk-averse for gains but risk-seeking for losses, flipping preferences depending on whether choices involve possible gains or losses.

  • Their work aimed to integrate psychological insights about emotions like regret into models of decision-making under risk and uncertainty. While still developing the full theory, they found expected utility theory lacked consideration of non-monetary consequences like regret.

In summary, Danny and Amos were developing a “theory of regret” to better explain anomalies in decision-making by incorporating anticipated regret and other emotions into models of choice under uncertainty. Flipping choices between gains and losses was a breakthrough insight.

  • Danny and Amos discovered that people’s aversion to losses was even stronger than they had realized, exceeding their desire for equivalent gains. They had to offer potential gains significantly larger than the potential losses before people would accept a gamble.

  • One implication was that their previous theory of regret could not fully explain risk-seeking behavior when losses were involved. They abandoned that theory.

  • Their new theory identified three key aspects: people respond to changes rather than absolute values; people approach risks very differently for losses vs gains; and people’s emotional responses to probabilities are not entirely rational.

  • They hypothesized this could explain behaviors like buying insurance and lottery tickets. Most prior economic theories failed to capture risk-seeking because they focused on monetary decisions that mostly involved gains.

  • In 1975, Amos presented their draft “Risk-Value Theory” to economists in Jerusalem, including future Nobel laureates. Kenneth Arrow questioned how to clearly define a “loss.” Their theory hinged on people’s responses to losses versus gains, but defining a loss was complex.

So in summary, they discovered stronger aversion to losses than realized, abandoning regret theory, and developed the initial draft of prospect theory highlighting cognitive and emotional drivers of risk perceptions.
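The three aspects above are usually captured by a value function over changes in wealth. The functional form and parameters below are the median estimates from Tversky and Kahneman's later (1992) cumulative prospect theory paper, used here purely as an illustration:

```python
def value(x, alpha=0.88, lam=2.25):
    """Prospect-theory value function over *changes* in wealth: concave
    for gains, convex and steeper for losses (loss aversion)."""
    return x ** alpha if x >= 0 else -lam * (-x) ** alpha

print(value(100))                # ≈ 57.5: the value of a $100 gain
print(value(-100))               # ≈ -129.5: a $100 loss looms ~2.25x larger
print(value(100) + value(-100))  # < 0: a 50/50 win/lose-$100 gamble is declined
```

The asymmetry also explains lottery tickets and insurance: small probabilities of large gains or losses get overweighted relative to their objective odds.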

  • Psychologists Amos Tversky and Danny Kahneman were studying how people make decisions and found that reference points and framing significantly influence choices. People evaluate options relative to a reference point, and the same outcomes can be framed as gains or losses, changing risk preferences.

  • In experiments, they showed that people preferred a sure gain of $500 over a 50% chance to win $1000 when framed as a choice between gains, but preferred a risky 50% chance to lose $1000 over a sure loss of $500 when framed as a choice between losses.

  • Their famous “Asian Disease Problem” found people preferred a sure outcome when choices were framed in terms of lives saved, but preferred risk when framed in terms of lives lost, even though the choices were identical.

  • This challenged economic assumptions that people rationally maximize expected utility. Choices depend on how options are described or framed.

  • Early skeptic Richard Thaler was inspired by their work and went on to pioneer the field of behavioral economics exploring psychological influences on economic decisions. Tversky and Kahneman’s research revolutionized understanding of decision making and judgment.
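The two frames of the Asian Disease Problem really are numerically identical, which is easy to verify (standard numbers from the published problem: 600 people at risk):

```python
def expected_lives_saved(outcomes):
    """Expected lives saved over (probability, lives_saved) pairs."""
    return sum(p * lives for p, lives in outcomes)

TOTAL = 600
# Gain frame: "200 people will be saved" vs. a 1/3 chance everyone is saved.
program_A = [(1.0, 200)]
program_B = [(1/3, 600), (2/3, 0)]
# Loss frame: "400 people will die" vs. a 1/3 chance that nobody dies.
program_C = [(1.0, TOTAL - 400)]
program_D = [(1/3, TOTAL - 0), (2/3, TOTAL - 600)]

# Every program has the same expected outcome (200 lives saved);
# only the wording differs.
for name, prog in [("A", program_A), ("B", program_B),
                   ("C", program_C), ("D", program_D)]:
    print(name, round(expected_lives_saved(prog)))
```

Most subjects nevertheless choose the sure Program A in the "saved" frame and the gamble Program D in the "die" frame.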

Richard Thaler was interested in applying psychology concepts to economics from an early stage in his career. However, his ideas were rejected by the economics establishment. After losing his job, he secured a temporary position teaching at the Rochester School of Management.

While researching the value of a human life, Thaler began to question traditional economic assumptions of rational decision-making. He found that people’s willingness to pay to avoid risks was inconsistently high compared to their willingness to accept risks.

Thaler started compiling a list of examples where people behaved irrationally according to economic theory. This included the endowment effect and sunk cost fallacy. However, his fellow economists were not interested in these ideas.

Thaler was denied tenure at Rochester. Soon after, he discovered the work of psychologists Daniel Kahneman and Amos Tversky. Their prospect theory provided a framework to explain irrational behaviors using logic and mathematics. This gave Thaler’s ideas legitimacy within economics. He decided to turn his list of examples into an academic article, reaching out to Tversky for feedback. This marked a turning point where psychology began influencing economic thought more substantially.

  • Danny Kahneman and Amos Tversky collaborated to develop prospect theory, which challenged the rational actor model in economics. They sought to uncover how people actually make decisions rather than assuming perfect rationality.

  • At first, economists were resistant to their findings that challenged assumptions of rationality. But Amos realized they needed to express their theory mathematically to gain acceptance.

  • Danny struggled with personal turmoil as his marriage was falling apart. This slowed their progress but Amos was committed to continuing their collaboration.

  • In 1977, they took positions at Stanford to allow Danny to be with his new partner Anne, who couldn’t move to Israel. This was a difficult decision for Amos, who saw their joint work as incomplete.

  • Their theory gained prominence after being published in Econometrica in 1979. It established that decision making involves emotions, subjective weighting of probabilities, and loss aversion rather than purely rational choices. This challenged standard economic assumptions of human behavior.

Here is a summary of the provided text:

The passage discusses a collaboration between Amos Tversky and Daniel Kahneman, who pioneered the field of behavioral economics. Psychiatrist Miles Shore interviewed them in 1983 as part of research on successful collaborative pairs.

Shore found it difficult to assess individual contributions when promoting researcher J. Allan Hobson, who worked closely with a partner. This inspired Shore’s research. He interviewed over 20 pairs, including Tversky and Kahneman.

Tversky and Kahneman said their early work studied “natural stupidity” rather than artificial intelligence. Like other pairs, their partnership created strains but also an “exclusive private club.” They had lost sense of individual contributions by the time of the interview.

What set Tversky and Kahneman apart was their openness about conflicts, which most pairs ignored. Kahneman complained about being perceived as Tversky’s assistant and feeling overshadowed. Tversky blamed outside perceptions for issues.

Alone with Shore, Kahneman hinted problems weren’t entirely external and wondered if Tversky could better control perceptions. Kahneman confessed to envy of Tversky receiving more credit. Shore left thinking they had overcome a rough patch, but Kahneman later said on hearing the tape that their partnership was “finished.”

  • When renowned psychologist Amos Tversky became available to be recruited, universities quickly sprang into action to try and hire him, recognizing his significant contributions to the field.

  • Stanford University moved especially fast, hoping to convince Amos to join them over larger universities that could offer more positions. Lee Ross led the effort, making a persuasive case to the psychology department and university president within a day.

  • Stanford offered Amos lifetime employment that same afternoon, an unprecedentedly fast offer. Amos ultimately chose Stanford over Harvard due to preferences for the location and lifestyle.

  • Amos’ long-time collaborator Danny Kahneman accepted a position at the University of British Columbia, as Stanford only wanted to hire Amos and was not interested in Danny as well.

  • Danny reflected on how his life had changed dramatically but was feeling pride in Amos’ success rather than envy. He began exploring how people mentally “undo” tragic events through counterfactual thinking and fantasy scenarios about alternative outcomes.

  • Danny saw patterns and constraints in these types of thoughts just as in other cognitive biases he and Amos had studied. He wanted to develop this idea of a “simulation heuristic” around imagined futures and alternative realities.

Danny is exploring the idea of counterfactual thinking and “undoing” reality through imagination. In one experiment, subjects judged a traveler who missed his flight by just five minutes to be more upset than one who missed it by half an hour, showing that emotion is intensified by the proximity of an alternative reality.

Danny set out to understand the “rules” or constraints of the imagination when undoing events. Some rules he proposed included: the more items that need to be undone to create an alternative scenario, the less likely the mind is to conceive of it; events become harder to undo over time as more consequences accumulate; people prefer to keep situations fixed and have actors change rather than completely altering situations.

People also tend to undo whatever feels surprising or unexpected rather than making more probable alterations. Danny gives examples of mentally undoing accidents and major historical events, such as the rise of Hitler, to illustrate these rules. He likens the mind’s propensity to take the easiest path of undoing to downhill skiing.

Danny shares his ideas with Amos in a letter but Amos does not contribute much feedback, which is unusual. Their differing professional statuses may be coming between them as well. Danny is exploring the human imagination in a new way without challenging an existing theory.

  • Danny and Amos had a close collaborative relationship for over a decade where they would share and develop ideas together. But now that they were physically separated, Danny felt frustrated that he no longer had Amos’ input and feedback on his ideas.

  • Danny delivered a talk at the University of Michigan where he unveiled new ideas he had developed without Amos’ involvement. This confirmed to Danny that Amos may be running low on new ideas. However, Amos did not give Danny proper credit during a conversation after the talk.

  • This moment marked the beginning of the end of their collaboration for Danny. He started sharing his ideas with others like Dale Miller without mentioning Amos. Their dynamic shifted as Danny no longer wanted to be in Amos’ shadow.

  • Meanwhile, Amos was busy traveling and giving talks. He would work through ideas out loud to himself. He joined a delegation to the Soviet Union to discuss psychology, finding the whole thing amusing. He continued adding to notes on “The Undoing Project” but kept this separate from Danny. Their close collaboration was unraveling.

  • Amos received a prestigious MacArthur “genius” grant in 1984 for $250k plus additional funding, which praised his work but did not mention his long-time collaborator Danny.

  • Amos disliked prizes, believing they exaggerated differences among people and did more harm than good by singling out one contributor to collaborative work. He was upset that Danny was not also recognized for their joint work.

  • Over time, more accolades, prizes, and praise came to Amos alone for the work he had done with Danny. Amos tried to correct the record and insist Danny also be credited for their collaborations.

  • Danny was bothered that Amos received invitations to conferences and other attention that Danny did not, even though the work was joint. Danny confessed this strained their relationship.

  • In the US, Amos began receiving all sorts of requests for advice from diverse fields, even ones he knew little about. The world accepted interacting with Amos on his terms. This growing fame and singular recognition of Amos further strained his collaboration with Danny.

  • Amos Tversky suggested that airlines like Delta should change the decision-making environment in the cockpit rather than try to train away pilots’ inherent cognitive biases. Specifically, they should encourage pilots to question each other and not allow an “autocratic jerk” of a captain to dominate without challenge. This helped stop landings at the wrong airport.

  • Early critics of Kahneman and Tversky’s work included psychologists who felt it undermined previous theories. Ward Edwards, who had originally called on psychologists to study the assumptions of economics, was a prominent early critic. He dismissed their experiments and claimed human intuition was more rational than their results suggested.

  • Other academics were suspicious of the apparent joy Kahneman and Tversky took in their work, feeling it made their motives seem less serious. The philosopher Jonathan Cohen argued that if an error was widespread enough, it could not truly count as a mistake, since human intuition itself sets the standard of rationality.

  • Kahneman and Tversky’s ideas eventually spread widely, including into medical decision making, much to Edwards’ dismay. His planned critique of their work became what he later conceded was a “failed paper” with many flaws. Not everyone was willing to publicly challenge Amos Tversky, given his skill in debate.

This summary describes Kahneman and Tversky’s work on the “Linda problem,” which demonstrated how people use heuristics and intuitions in ways that can lead them to violate basic rules of logic and probability. Some key points:

  • Amos Tversky wanted to demonstrate how heuristics could misleadingly influence judgments, to “embarrass” opponents who denied human irrationality.

  • The Linda problem presented a description of a fictional woman named Linda (single, outspoken, a former philosophy major deeply concerned with discrimination and social justice) and asked people to rank the probability of various statements about her, such as “Linda is a bank teller” versus “Linda is a bank teller and is active in the feminist movement.”

  • Although the conjunction of two events can never be more probable than either event on its own, people consistently judged “bank teller and feminist” as more probable than simply “bank teller.”

  • Kahneman was initially skeptical people would make this logical error but experiments showed up to 85% of subjects doing so.

  • It revealed how powerful heuristics like representativeness can be, overriding logic even when the logical problem is directly pointed out.

  • The Linda problem became a famous demonstration of human judgment biases and limitations of intuitive thinking. Tversky saw it as “winning the argument” while Kahneman focused more on the psychological insights.
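The impossibility at the heart of the Linda problem follows from the conjunction rule of probability: P(A and B) can never exceed P(A). A minimal sketch of the rule, using made-up numbers (neither probability comes from the original study):

```python
# The logical rule the Linda problem violates: for any two events A and B,
# P(A and B) <= P(A). The numbers below are illustrative assumptions, not
# figures from Kahneman and Tversky's experiments.

p_teller = 0.05                 # assumed: P(Linda is a bank teller)
p_feminist_given_teller = 0.60  # assumed: P(feminist | bank teller)

# The conjunction's probability is the product of the marginal and the
# conditional, so it can never exceed the marginal alone.
p_teller_and_feminist = p_teller * p_feminist_given_teller

# Holds for ANY valid choice of the two probabilities above.
assert p_teller_and_feminist <= p_teller
```

Because the assertion holds no matter what values are plugged in, ranking the conjunction above “bank teller” alone is a logical error rather than a defensible judgment call.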

  • Amos Tversky and Danny Kahneman did groundbreaking work together on cognitive biases and heuristics in the 1970s and 1980s. Their most famous finding was the “conjunction fallacy” or Linda problem.

  • However, over time their collaborative relationship became strained. Danny felt overshadowed by Amos and that Amos did not give him equal credit. He also felt Amos was less receptive to his ideas.

  • Their interactions became increasingly tense. When Danny moved to UC Berkeley in hopes of improving their relationship, seeing Amos more often actually made things worse.

  • Danny struggled with depression after the move. He wrote to Amos about needing to adapt how they interacted due to changes in their relationship.

  • Amos acknowledged issues with how he responded to criticism but felt Danny had become less open too. They had differing views of their relationship.

  • By the late 1980s, their communication had deteriorated badly. Danny finally left Berkeley for Princeton in 1992 to distance himself from Amos, feeling that Amos “possessed” his mind. Their partnership had effectively ended after more than fifteen years of groundbreaking collaboration.

  • Danny and Amos had a long-running collaboration and friendship, but Danny increasingly saw their collaboration as over while Amos still hoped to continue working together.

  • They had a disagreement over how to respond to a German psychologist, Gerd Gigerenzer, who was critical of their work. Amos wanted to rebut Gigerenzer forcefully, while Danny felt that doing so would give him too much attention and let anger drive their response. Danny reluctantly agreed to help Amos “as a friend.”

  • Drafting a response led to arguments between Danny and Amos over small details. Danny found the process upsetting while Amos thought Danny was too sensitive. Their disagreements highlighted differences in how critically they saw their critics and each other’s work.

  • The back-and-forth left their collaboration and friendship increasingly strained, though they remained publicly collegial and tried to continue finding ways to work together privately. Their private disputes revealed deeper issues in how they now saw each other.

  • Danny had a dream that his doctor told him he had 6 months to live. He told his friend and collaborator Amos about this dream.

  • Amos was not impressed and said that even if Danny only had 6 months left, Amos would still expect him to finish their work together.

  • Danny was again passed over for membership in the National Academy of Sciences, despite Amos having been a member for a decade. When Danny asked Amos why he had not been put forward, he knew the reason lay in their differing views: Amos valued independence over friendship.

  • Upset, Danny ended their friendship and collaboration. Shortly after, Amos called Danny to share that he had been diagnosed with terminal cancer and only had around 6 months left. Hearing this, Danny softened his stance.

  • The story discusses how Danny and Amos’ work in behavioral economics and psychology influenced many others, like economists Richard Thaler and Cass Sunstein. It led to changes in policies and regulations based on psychological insights. So their work had widespread real-world impacts.

  • Even after returning to Canada from Stanford, Don Redelmeier still felt Amos Tversky’s influence on his work. Amos’ voice was so powerful in his mind that it was hard for Redelmeier to hear his own ideas.

  • Redelmeier began to question common assumptions, like the practice of rushing homeless people through emergency rooms. In an experiment, he found that providing better care to homeless patients actually reduced their use of healthcare resources overall.

  • He also studied the risk of driving while using a cell phone, showing it was comparable to drunk driving. This finding spurred regulations and likely saved thousands of lives.

  • Redelmeier became interested in how human judgment could lead to accidents. Amos had taught that people don’t truly appreciate their own fallibility. This gap leads to many preventable accidents.

  • When Amos learned he was dying of cancer, he told very few people and instructed them not to spend much time discussing it with him. He maintained his routine and avoided changes.

  • Amos spoke stoically about his impending death and focused on finishing his work. He rejected most visits and calls in his final months, maintaining his independence and privacy until the end.

  • Amos took a phone call whose details are unclear; Varda overheard him assure the caller that missing out on the Nobel Prize was not something he would regret.

  • Amos spent his last week at home with his family. He had obtained drugs to end his life when he felt it was no longer worth living. He subtly asked his son questions about euthanasia.

  • On the day of the Israeli election, upon hearing that Netanyahu had defeated Peres, Amos said he would not see peace in his lifetime. He died a few days later, on June 2, 1996.

  • Danny, whom Amos said had caused him the most pain yet was the person he most wanted to talk to, delivered the eulogy at Amos’s funeral, where he seemed disoriented.

  • In their final conversations, Amos and Danny discussed their work and Amos said Danny was the person he was most comfortable talking to about death. Danny spoke to Amos nearly every day as he approached death.

  • Danny gave a talk in Stockholm in 2001 while under consideration for the Nobel Prize. He did not want to focus solely on his joint work with Amos.

  • In 2002, Danny did not allow himself to imagine winning the Nobel Prize. On October 9, the phone rang: it was Stockholm, calling to inform him that he had won the prize he had hoped to share with Amos.

  • The author puts their book on Danny’s bestseller list to help promote sales, since Danny made a mistake that benefited the author.

  • The author recommends reading Danny’s book, along with two encyclopedic and autobiographical works, for more background on psychology.

  • The author lists the sources they drew from in writing their book, chapter by chapter. This includes academic papers, books, and news articles.

  • The sources cover a wide range of topics in psychology including decision making, cognitive biases, prediction, statistics, risk and uncertainty. Many papers are co-authored by Daniel Kahneman and Amos Tversky, who made influential contributions to the field.

  • The author leaned on this prior work from others to help grapple with their subject matter for their book. They give credit to the individuals and ideas that informed their own work.

So in summary, the passage lists the sources the author relied on for each chapter of their book, thanking those who contributed influential prior research and ideas in the field of psychology. It also briefly mentions how a mistake by Danny helped promote sales of the author’s book.

Here are summaries of the key sources provided:

  • Kahneman, Daniel, and Amos Tversky. “Discussion: On the Interpretation of Intuitive Probability: A Reply to Jonathan Cohen.” Cognition 7, no. 4 (1979): 409–11. This source discusses Kahneman and Tversky’s response to Jonathan Cohen’s interpretation of intuitive probability.

  • Tversky, Amos, and Daniel Kahneman. “Extensional versus Intuitive Reasoning: The Conjunction Fallacy in Probability Judgment.” Psychological Review 90, no. 4 (1983): 293–315. This source discusses Kahneman and Tversky’s work on the conjunction fallacy in probability judgment.

  • Tversky, Amos, and Daniel Kahneman. “Advances in Prospect Theory: Cumulative Representation of Uncertainty.” Journal of Risk and Uncertainty 5 (1992): 297–323. This source discusses advances in Tversky and Kahneman’s prospect theory.

  • Vranas, Peter B. M. “Gigerenzer’s Normative Critique of Kahneman and Tversky.” Cognition 76 (2000): 179–93. This source discusses Gigerenzer’s normative critique of Kahneman and Tversky’s work.

The other sources provided context about cognitive psychology, psychology as a field, and biographical works but were not summarized individually.

#book-summary