Self Help

The Bias That Divides Us - Keith E. Stanovich


Matheus Puppe

· 45 min read

BOOK LINK:

CLICK HERE

This book examines the phenomenon of “myside bias”, which is the tendency for people to evaluate evidence, generate evidence, and test hypotheses in a manner biased toward their own prior beliefs, opinions, and attitudes.

Chapter 1 introduces different experimental paradigms that behavioral scientists have used to study myside bias in the lab. It shows that myside bias is one of the most robust and universal cognitive biases.

Chapter 2 discusses whether myside bias should be considered an irrational reasoning error or if it has some rational justification.

Chapter 3 analyzes how myside bias is an “outlier bias” - unlike most other cognitive biases, it is not predictable based on intelligence, executive functioning, or thinking dispositions. It also has little consistency across domains.

Chapter 4 argues that models focusing on acquired beliefs rather than cognitive processes better explain myside bias.

Chapter 5 explains how myside bias creates a blind spot, as cognitive elites incorrectly think they are less susceptible to it than others.

Chapter 6 explores how this blind spot contributes to ideological polarization and declining trust in expertise. It discusses ways to address the social effects of myside biases.

In summary, the book analyzes myside bias as a pervasive but unpredictable cognitive tendency that has significant social and political consequences in exacerbating division and polarizing debates.

  • The introduction discusses how myside bias occurs across many judgment domains and paradigms, as shown in representative studies from different disciplines in Table 1.1.

  • Two classic studies are described to illustrate myside bias - Hastorf and Cantril’s 1954 study showing that Princeton and Dartmouth students interpreted the same football game differently based on their allegiances, and Kahan et al.’s 2012 replication of this finding using a protest video.

  • The chapter then acknowledges terminology confusion around similar concepts like confirmation bias and belief bias, and notes the term “myside bias” will be used going forward to refer to preferential processing of one’s own opinions regardless of factual accuracy.

  • In summary, the beginning discusses the widespread evidence of myside bias across domains and paradigms, demonstrates this through two well-known studies, and clarifies the terminology that will be used in the book.

The terms “confirmation bias”, “belief bias”, and “myside bias” are used inconsistently in the scientific literature to describe different cognitive biases. “Confirmation bias” is the most commonly and broadly used of the three, but it is also the most ambiguous. Some key points:

  • “Confirmation bias” refers narrowly to favoring evidence that supports one’s existing hypothesis, which is not inherently irrational if counter-evidence is also considered appropriately.

  • “Belief bias” occurs when prior beliefs interfere with logical reasoning, as seen when people judge the validity of conclusions that contradict their beliefs.

  • “Myside bias” involves stronger emotional commitments to opinions valued as “convictions”, leading one to interpret evidence selectively in favor of those opinions. It goes beyond simple belief bias.

  • Convictions derive from cherished worldviews and “protected values” resistant to compromise. They are less open to evidence than more flexible “testable beliefs”.

  • The terms highlight different phenomena - confirmation bias a neutral tendency, belief bias an error, and myside bias motivated reasoning - but have been used inconsistently, creating ambiguity that careful definitions aim to resolve.

  • Myside bias refers to the tendency to evaluate evidence and generate conclusions in a way that favors one’s existing beliefs, attitudes and ideologies. These prior beliefs and attitudes are called “distal beliefs”.

  • Distal beliefs are convictions that derive from general worldviews/ideologies and are not easily testable through facts. In contrast, belief bias focuses on testable beliefs about facts.

  • Myside bias has been demonstrated in how people evaluate ambiguous actions based on whom they affect (their own group vs. others), in logical reasoning on belief-consistent vs. belief-inconsistent arguments, and in political/ideological judgments.

  • Factors like religion, political ideology, and partisanship act as distal beliefs that induce myside bias in logical reasoning and in the evaluation of conclusions.

  • Myside-biased thinking also influences negotiations and legal/workplace decisions, where people favor the side they identify with.

  • Several studies show how people’s assessments of fairness depend on which side of an outcome they are on, exhibiting myside bias in their judgments.

  • In summary, myside bias refers to the tendency to favor one’s existing beliefs, attitudes and worldviews when evaluating evidence, arguments and generating conclusions across many domains of thinking.

  • Studies have found evidence of “myside bias”, where people tend to evaluate arguments and evidence in a biased way that favors their own prior beliefs and opinions. They rate arguments supporting their views as stronger and find more flaws in opposing arguments.

  • One study showed participants pro and con arguments about affirmative action and gun control. Subjects preferred arguments matching their views and spent more time refuting opposing arguments.

  • Another study presented participants with conflicting studies on the death penalty - one supporting deterrence and one not. Both groups rated the study confirming their views more favorably despite similar rigor. Moreover, their views polarized even more after reading conflicting evidence.

  • People also generate arguments in a biased manner, coming up with more reasons to support their side even when instructed to be unbiased.

  • Ratings of the risks and rewards of activities tend to be negatively correlated: people minimize the risks of activities they favor and inflate the risks of those they oppose.

  • Studies have found similar biased trade-offs when judging morality - people discounted costs of actions supporting their moral views and consequences of opposing views.

  • In general, people use facts and evidence selectively to fit their prior beliefs rather than evaluating evidence independently on its merits. Myside bias undermines objective evaluation of arguments and evidence.

  • Researchers studied the phenomenon of “myside bias”, where people tend to evaluate evidence in a way that confirms their preexisting beliefs or positions on issues.

  • In one experiment, participants were given numerical data (in a 2x2 table format) about the effectiveness of a medical treatment or the impact of gun control laws. The data was designed so some versions supported their prior beliefs while others contradicted them.

  • Participants were more accurate in their analysis of the data when it supported their existing views, and less accurate when it contradicted their views. This myside bias occurred equally among those with differing opinions.

  • A later experiment by Van Boven et al. presented identical statistical evidence about immigration restrictions and assault weapon bans to the same participants.

  • Participants favored the statistical formulation that supported their preexisting view on each issue, regardless of whether that view was in support of or opposition to the policy in question.

  • Both liberals and conservatives demonstrated myside bias by selectively focusing on different parts of the identical statistical evidence depending on whether it aligned with or went against their views on the specific issue.

So in summary, the research demonstrated people’s tendency to evaluate ambiguous evidence in a biased way that confirms their prior beliefs, across different topics and for people on both sides of issues.
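To make the 2×2 covariation task concrete, here is a minimal sketch using cell counts like those popularized in Kahan and colleagues’ skin-treatment version of the task (treat the exact numbers as illustrative):

```python
# Hypothetical 2x2 outcome table (cell counts are illustrative):
#                 improved   got worse
# treatment          223         75
# no treatment       107         21
treated = (223, 75)
untreated = (107, 21)

rate_treated = treated[0] / sum(treated)        # 223/298 ≈ 0.75
rate_untreated = untreated[0] / sum(untreated)  # 107/128 ≈ 0.84

# The large raw count (223 improved) invites the intuitive answer that the
# treatment works; comparing improvement *rates* shows the untreated group
# actually did better. Myside bias appears when subjects apply the correct
# ratio comparison only to belief-congruent versions of the table.
print(f"treated: {rate_treated:.2f}, untreated: {rate_untreated:.2f}")
```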

  • Van Boven and colleagues’ 2019 experiment demonstrated that people pick and choose statistics based on which ones support their prior opinions.

  • The article discusses how pro-immigration and pro-gun-control advocates would view different statistics. Supporters of these positions would likely dismiss statistics that contradict their views, even when those statistics appear to show a reduction in the problems they are concerned about.

  • Myside bias is ubiquitous across many studies and groups. It is not confined to any level of intelligence or to any particular set of beliefs, values, or demographic characteristics.

  • Some argue myside bias evolved because it increased reproductive fitness even if it reduced rationality. Evolution favors mechanisms that are fast, efficient, and don’t interrupt other cognitive processes, even if they produce some false beliefs.

  • According to Mercier and Sperber’s influential theory, reasoning evolved for social and communication purposes like persuasion, not for truth-seeking. This focus on persuasion rather than accuracy can explain the prevalence of myside bias in human reasoning.

  • Mercier and Sperber argue that human reasoning abilities evolved primarily to persuade others through argumentation, not for unbiased truth-seeking.

  • This makes people prone to “myside bias” where they generate and evaluate arguments in a one-sided manner that supports their own opinions, rather than considering both sides impartially.

  • They claim myside bias persists even when reasoning alone, as people intuitively anticipate future argumentative dialogues with others.

  • While people show myside bias when evaluating arguments for distal beliefs, they are less biased for testable beliefs where the evidence is clear.

  • The evolutionary role of argumentation in persuading others may have reinforced myside bias through social and group benefits like promoting consistency, confidence and group cohesion.

  • Things like the cognitive costs of open-mindedness or social costs of inconsistency may have maintained myside bias despite long-term benefits of doubt and impartial evaluation.

  • So Mercier and Sperber provide an evolutionary model of how myside bias is inherent in reasoning abilities that developed for argumentation, although the degree to which it constitutes an “irrational” thinking error requires further analysis.

  • Evidence evaluation tasks in myside-bias paradigms ask subjects to evaluate evidence for or against their prior positions. Biased evaluation here seems to violate Bayesian belief updating, which says new evidence should be evaluated independently of prior beliefs.

  • However, the normative issues around myside bias are complex. Researchers now recognize the Bayesian formula was too simplistically applied in some paradigms where its use requires more nuance.

  • When subjects must interpret ambiguous information rather than calculate precise likelihood ratios, some degree of myside bias may be justified. Prior beliefs could legitimately influence how subjects assess the reliability or plausibility of ambiguous evidence sources.

  • Jonathan Koehler’s 1993 study of parapsychology experts found they gave lower ratings to studies contradicting their positions. But Koehler analyzed why this bias may still be normative when reliability of information is in question.

  • We now know the stricture against prior beliefs infecting evidence evaluation is weakened for paradigms requiring subjects to assess unclear or unreliable information sources. Some myside bias can be justified in evaluating the plausibility or credibility of such sources.
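For reference, the Bayesian benchmark these paradigms invoke can be written in odds form (the standard textbook formula, not notation taken from the book):

$$
\frac{P(H \mid E)}{P(\neg H \mid E)} \;=\; \frac{P(E \mid H)}{P(E \mid \neg H)} \times \frac{P(H)}{P(\neg H)}
$$

Posterior odds equal the likelihood ratio times the prior odds. The stricture described above is that the likelihood ratio - how diagnostic the evidence is - should be assessed independently of the prior odds; myside bias in evidence evaluation lets the prior leak into that assessment, which the points above suggest is defensible only when the evidence itself is ambiguous or of uncertain reliability.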

  • Koehler’s proof B shows that under certain circumstances, it is justified for scientists to rate studies that agree with their favored hypotheses as better than those that disagree. This demonstrates that a degree of “myside bias” in experiment evaluation can be normative.

  • The two key conditions laid out in the proof are: 1) A good study is more likely to produce results congruent with the true state of nature, and 2) The scientist’s favored hypothesis must be deemed more likely to be true than the alternative, based on prior evidence and experience.

  • When these conditions are met, allowing prior beliefs to impact the evaluation of new evidence (known as “knowledge projection”) can lead to the faster accumulation of true beliefs in domains where most prior beliefs are accurate.

  • However, knowledge projection could delay the assimilation of correct information if it is used starting from a set of prior beliefs that contain substantial falsehoods. This could trap someone on an “island of false beliefs.”

  • The proof only demonstrates local rationality - it does not guarantee the prior belief itself was determined through an unbiased process. To achieve global rationality, the prior probability would need to be based on valid evidence, not just a “myside preference.”

So in summary, the proof shows a degree of myside bias can be rational under strict conditions, but full rationality requires ensuring prior beliefs were formed through an unbiased evaluation of evidence as well.
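A minimal simulation, not from the book, can illustrate this two-edged nature of knowledge projection. The reliability adjustment (±0.15) and the other parameters are arbitrary assumptions chosen for the sketch:

```python
import random

def update(p, supports_h, r):
    # Bayes update treating assessed study quality r as P(E|H) when the
    # study supports H, so P(E|not-H) = 1 - r.
    if supports_h:
        return p * r / (p * r + (1 - p) * (1 - r))
    return p * (1 - r) / (p * (1 - r) + (1 - p) * r)

def run(prior, project, n=50, base_r=0.7, seed=0):
    # H is true: each study points toward H with probability base_r.
    rng = random.Random(seed)
    p = prior
    for _ in range(n):
        supports_h = rng.random() < base_r
        r = base_r
        if project:
            # Knowledge projection: rate belief-congruent studies as more
            # reliable and belief-incongruent studies as less reliable.
            congruent = (supports_h and p > 0.5) or (not supports_h and p < 0.5)
            r = base_r + 0.15 if congruent else base_r - 0.15
        p = update(p, supports_h, r)
    return p

print(run(prior=0.8, project=True))   # accurate prior: tends toward 1 quickly
print(run(prior=0.2, project=True))   # false prior: tends to stay trapped near 0
print(run(prior=0.2, project=False))  # no projection: the evidence wins anyway
```

With a mostly accurate prior, projection speeds convergence; started from a false prior, the same policy discounts exactly the studies that could correct it - the “island of false beliefs.”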

  • Koehler’s proof B only establishes that myside bias (projecting one’s prior beliefs) is locally normative, but does not consider whether it is globally normative.

  • For myside bias to be globally rational, the focal hypothesis H that one deems more likely must be properly chosen based on previous evidence, not just one’s worldview.

  • Kelly exhibits globally justified myside thinking by basing her H on previous evidence, not just her personal preferences.

  • Dale exhibits “serial abuse” of proof B by choosing H based on his desired worldview rather than the evidence, which is not globally rational.

  • Most people’s prior beliefs are a combination of evidence-based knowledge and worldview/convictions. Kelly relies more on evidence, Dale relies more on worldview.

  • The degree to which one’s prior belief (and projection of it) is globally rational depends on how much it is based on testable evidence vs. distal worldview/convictions. Kelly is more rational than Dale in this respect.

So in summary, Koehler’s proof only validates local rationality, but globally rational myside bias depends on properly choosing the focal hypothesis H based on prior evidence, not just personal preferences or worldview.

The passage discusses when belief polarization (divergence of opinions after seeing the same mixed evidence) can be considered normative or rational. It references proofs and analyses by Koehler (1993) and Jern, Chang, and Kemp (2014) that show in certain cases, belief polarization is an expected and appropriate Bayesian outcome.

Specifically, Koehler’s proof B showed that projecting one’s prior probability when evaluating new evidence is rational if the prior was formed rationally and the evidence evaluation follows Bayesian principles. Jern et al. argued belief polarization can occur normatively when different individuals have different frames or worldviews that lead them to interpret the same evidence differently in a way consistent with their priors.

The passage gives examples like Bayesian updating by chess spectators with differing player assessments, and a nuclear power debate where supporters and opponents psychologically framed a safety incident differently. In these cases, divergent posteriors after joint evidence do not necessarily reflect irrationality. However, the analyses are local and do not assess the rationality of priors or framing themselves. Overall, belief polarization can be rational depending on the reasonableness of the prior and interpretation, not just the evidence viewing alone.
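To make the Jern, Chang, and Kemp point concrete, here is a hypothetical two-observer example (the numbers are illustrative, not from the book). Both observers apply Bayes’ rule in odds form to the same evidence E, but their frames assign it different likelihood ratios:

$$
\text{A:}\;\; \frac{0.6}{0.4} \times \frac{0.8}{0.4} = 3 \;\Rightarrow\; P_A(H \mid E) = 0.75
\qquad
\text{B:}\;\; \frac{0.4}{0.6} \times \frac{0.3}{0.6} = \frac{1}{3} \;\Rightarrow\; P_B(H \mid E) = 0.25
$$

Their initial gap of 0.2 widens to 0.5 after viewing the very same evidence, and neither observer violates Bayesian updating; any irrationality lies in the priors or the frames, not in the updating step.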

The passage discusses whether projecting a worldview onto evidence evaluation is necessarily irrational. Previous analyses treated myside bias solely as an epistemic rationality issue, but instrumental rationality is also important.

Epistemic rationality requires beliefs to be true, while instrumental rationality optimizes goal achievement. These can conflict, as true beliefs are not always instrumentally beneficial. There may be costs to changing beliefs even when evidence contradicts them.

Instrumental goals like group cohesion could rationally lead to myside bias by maintaining existing beliefs. Sacrificing some epistemic accuracy for the benefits of group membership is not inherently irrational.

Political science studies show partisan identity strongly predicts views, suggesting myside bias serves social/group goals over issue-based accuracy. Dueling motivations of epistemic accuracy and social/group endorsement likely both influence belief.

So projecting worldviews is not automatically irrational, as instrumental goals like relationships, cognition efficiency, and group membership could rationally motivate myside bias over pure epistemic accuracy in some cases. A comprehensive analysis requires considering both epistemic and instrumental rationality.

The key source of myside bias discussed in the passage is expressive rationality or identity protective cognition, which arises from people’s desire to maintain a positive social identity within affinity groups that hold certain beliefs. Specifically:

  • People belong to groups that define their identity and hold certain beliefs as central. Accommodating disconfirming evidence may subject them to sanctions from the group.

  • Dan Kahan argues this leads to myside bias, as people will readily accept confirming information but scrutinize disconfirming information to a higher degree.

  • This myside bias is not necessarily irrational, as one must consider the epistemic costs versus the benefits of maintaining social identity and group ties.

  • Expressive rationality explains behaviors aimed more at signaling identity/values than conveying truth, like voting or ethical consumerism. Sacred values in particular are resistant to evidence-based updating.

  • This creates a “tragedy of the communications commons” - individuals rationally engage in myside bias, but it harms society overall if policy is not based on objective truth. Expressive rationality and identity protection are key drivers of rational myside bias according to the literature discussed.

Processes like myside bias that project prior beliefs onto new evidence can be individually rational, but at the societal level they result in a tragedy of the commons scenario.

Just like in the prisoner’s dilemma game theory paradigm, each individual acting in their own self-interest through myside bias leads to a collectively poor outcome where society cannot agree on objective truths.
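A toy payoff matrix (hypothetical numbers) captures the structure of that analogy, with projecting one’s priors playing the role of defection:

```python
# Illustrative payoffs only: each side chooses to evaluate evidence
# impartially ("impartial") or to project its priors ("project").
# Entries are (side A's payoff, side B's payoff); higher is better.
payoffs = {
    ("impartial", "impartial"): (3, 3),  # shared, accurate picture of the facts
    ("project",   "impartial"): (4, 1),  # the projector protects its worldview
    ("impartial", "project"):   (1, 4),
    ("project",   "project"):   (2, 2),  # polarized stalemate, no consensus
}
# "project" strictly dominates for each side (4 > 3 and 2 > 1), yet mutual
# projection (2, 2) leaves both sides worse off than mutual impartiality
# (3, 3) - the defining signature of a prisoner's dilemma.
```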

Kahan and colleagues coined the term “tragedy of the science communications commons” to describe how individually rational epistemic behaviors like knowledge projection undermine societal truth-seeking. While prior belief projection may help individuals evaluate evidence credibility, it promotes polarized interpretation of facts and issues at a wider scale.

This challenges the notion that a free marketplace of ideas will efficiently sort ideas and information. If communities interpret facts differently according to their priors, consensus cannot form. The biases that divide us individually seem rational but collectively hinder truth-seeking goals for society.

  • Many studies have found that cognitive biases and errors tend to correlate with each other and be predicted by individual differences like intelligence and thinking dispositions. High intelligence and active open-minded thinking are associated with less biased reasoning.

  • However, research finds that myside bias (the tendency to favor arguments and evidence supporting one’s own views) shows a different pattern - it is generally not correlated with intelligence or scientific reasoning skills. Several experiments have replicated this finding across different reasoning tasks.

  • Whether generating, evaluating, or just agreeing/disagreeing with arguments and evidence, more intelligent people did not show less myside bias than less intelligent people. Their reasoning was of higher quality overall but not less biased in favoring their own views.

  • This contrasts with other cognitive biases and errors that tend to be reduced by intelligence. The independence of myside bias from intelligence is a strange and noteworthy finding that challenges assumptions of how individual differences impact biased and rational thinking.

  • Studies have found that greater intelligence, numeracy, education, etc. do not necessarily lead people to have less polarized or biased views on politically charged issues. In some cases, more cognitive sophistication is associated with greater polarization and biased reasoning.

  • For example, Kahan found that on issues like climate change and gun control, people with higher numeracy showed larger differences in beliefs according to their political ideology, rather than being less polarized.

  • Political science research also finds greater disagreement and polarization among more educated partisan respondents on policy facts related to their political views.

  • While intelligence and education do not seem to reduce myside bias, most other cognitive biases are correlated with thinking dispositions like actively open-minded thinking and need for cognition. However, myside bias does not consistently show these types of correlations.

  • This suggests myside bias may have unique features compared to other biases, in that it is not necessarily reduced with greater reasoning ability, knowledge, or thinking style. Cognitive sophistication does not seem to provide immunity to biased political or ideological reasoning.

  • Belief bias, which occurs when real-world knowledge interferes with logical reasoning, consistently correlates with cognitive ability and thinking dispositions, with correlations typically in the range of 0.30-0.50.

  • Myside bias, which refers to evaluating evidence in a biased way to support one’s own views/convictions, does not consistently correlate with cognitive ability or thinking dispositions. In studies with large sample sizes, correlations are small (~0.10-0.20) and only significant due to the large N.

  • This difference suggests belief bias involves testable beliefs, while myside bias involves more deeply held convictions/distal beliefs. Myside bias seems “free-floating” and unrelated to individual cognitive differences.

  • The lack of correlation between myside bias and cognitive abilities converges with the analysis that it is difficult to show myside bias is definitively non-normative, unlike other biases like belief bias where alternatives can be ruled out based on individual differences.

  • Specifically, a positive correlation with cognitive abilities helps establish that one response is more normative/optimal, while a lack of correlation or negative correlation suggests the normative model may be incorrect or other models could also be appropriate.

In summary, the article discusses how myside bias uniquely lacks significant correlations with cognitive/thinking variables, unlike other biases, and how this supports the difficulty in establishing myside bias as clearly non-normative.
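The “significant only because of the large N” point is easy to verify with the standard t test for a correlation coefficient (a generic statistics formula, not something specific to these studies):

```python
import math

def t_stat(r, n):
    # t statistic for testing whether a Pearson correlation differs from zero
    return r * math.sqrt((n - 2) / (1 - r * r))

for r, n in [(0.40, 100), (0.10, 50), (0.10, 1000)]:
    print(f"r = {r:.2f}, N = {n:>4}: t = {t_stat(r, n):.2f}")

# r = 0.40, N =  100: t = 4.32 -> a belief-bias-sized effect, clearly significant
# r = 0.10, N =   50: t = 0.70 -> a myside-bias-sized effect, not significant
# r = 0.10, N = 1000: t = 3.17 -> the same tiny effect, "significant" at large N
```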

  • Myside bias appears to be highly content dependent and domain specific. Studies have found little correlation between degrees of myside bias displayed on different, unrelated issues. This suggests myside bias is not a generalizable trait.

  • Factors like cognitive ability and thinking dispositions do not reliably predict myside bias. They do not correlate with degrees of myside bias shown on different issues.

  • However, the strength and direction of one’s prior opinion on a specific issue strongly predicts the degree of myside bias displayed on that issue. Those with stronger opinions tend to show more myside bias.

  • Studies have found opinion content/valence variables and strength of opinion variables account for more variance in myside bias than individual difference factors like cognitive ability or thinking styles. Belief content is a better predictor than personal characteristics.

  • Other work similarly finds issues that evoke stronger conflict between opposing values are more likely to produce myside bias in reasoning about that issue. Degree of belief superiority also increases with strength of opinion on polarized issues.

  • One study examined whether the direction (liberal vs. conservative) or the strength of one’s prior opinion on issues is a better predictor of myside bias or belief superiority.

  • They found the strength of opinion was consistently a stronger predictor than direction of opinion or individual difference variables like dogmatism. Those with stronger prior beliefs showed greater belief superiority regardless of ideological direction.

  • On some issues there were also linear effects, with one end of the ideological spectrum showing greater belief superiority. But the strength of opinion effect was larger.

  • Decades earlier, Abelson’s research converged with these findings - conviction scores correlated with myside bias, but no powerful individual-difference predictor of a general propensity for conviction emerged.

  • Predicting opinion agreement is different than predicting myside bias levels. Ideology strongly predicts what opinions one holds but not necessarily bias levels in reasoning about those opinions.

  • The degree of myside bias depends more on the strength of one’s specific prior belief than broad individual differences or overall ideological direction. Myside bias is more content-dependent and outlier-like compared to other biases.

  • Chapter 4 explores the theoretical implications of the fact that opinion content accounts for more variance in myside bias than psychological processes. This suggests myside bias may need to be conceptualized as a content-based bias rather than an individual trait.

  • Recent research in political and social psychology has found that many established findings were reinterpreted or overturned once the content of the stimulus materials was fully appreciated. Relationships once thought to involve broad psychological traits are now seen as content-contingent.

  • Studies that fail to sufficiently sample opinion content risk prematurely constructing theories about domain-general psych processes rather than content-dependent responses.

  • Examples discussed include research linking conservatism to resistance to change, to failures of belief revision, and to outgroup prejudice - findings that were challenged once specific opinion content was accounted for rather than assumed to reflect domain-general effects.

  • This implies that myside bias and other phenomena may depend more on specific opinion content than on broad individual traits, challenging the default view of biases as process-driven. The chapter explores the theoretical implications of this conceptualization.

  • Studies have found that measures of out-group tolerance, prejudice, and warmth depend more on the degree of value match/conflict between the subject and target group, rather than the subject’s own psychological characteristics. Liberal subjects showed less tolerance for groups that conflicted with their values (businesspeople, Christians, wealthy, military).

  • Tests of “modern racism scales” found that conservative subjects endorsed statements about bootstraps ideology regardless of the ethnic target group, while liberals were less likely to endorse when the target was African Americans. This suggests the scales actually measure racial sympathy for liberals rather than racial resentment for conservatives.

  • More recent research confirms that psychological relationships involving prejudice are contingent on the congruency between a subject’s values and the target group’s values. Measures of traits like openness and intelligence are less predictive than the subjects’ specific beliefs.

  • Myside bias seems to be driven more by the content of one’s beliefs, not inherent psychological tendencies. Beliefs may differ in how strongly they are structured to reject contradiction. Memetics and cultural replication theory suggest we should question the view of “beliefs as possessions” and consider the possibility that beliefs can propagate themselves regardless of, or even against, the interests of the individuals who hold them.

  • Memetic theory proposes that ideas or beliefs (“memes”) replicate themselves in a similar way that genes do, spreading from person to person through communication and imitation.

  • From a “meme’s-eye view,” memes act to serve their own replication and spread, not necessarily the interests of their human hosts. Successful memes replicate more effectively through behaviors like compulsively copying chain messages.

  • This contrasts with traditional psychological models that see beliefs as things people actively choose based on truth or personal interests. Memetics sees beliefs as things that choose people by virtue of their replicative properties.

  • Some memes spread because they are helpful to hosts, fit genetic tendencies, or promote host reproduction. But others spread through self-perpetuating properties alone, like mimicking beneficial ideas deceptively (“parasitic mindware”).

  • Myside bias may persist because it protects existing memes in the “memosphere” from competing ideas that could replace them. Resident memes form cooperative networks resistant to conflicts, explaining variation in bias across domains.

  • The meme concept, if divorced from the claim that it serves solely to criticize religion, can help people distance themselves from their convictions and avoid projecting priors when evaluating evidence for beliefs. This distance may reduce dysfunctional myside bias.

  • The concept of a meme was originally used by Dawkins to refer specifically to beliefs that replicate themselves regardless of benefit to the host, like viruses. But most theorists use it more broadly as any cultural replicator.

  • Dawkins’ framing of memes as “viruses” implied they are not acquired reflectively and are not functional. But many beliefs are functional yet still acquired unreflectively through social/cultural transmission rather than individual reflection.

  • Beliefs and ideas are often acquired unconsciously through innate predispositions and social learning in childhood, influenced by parents, peers and institutions. This explains how ideological beliefs can be strongly held yet not arrived at through conscious reflection.

  • Personality traits like openness vs. conscientiousness that correlate with political orientation have genetic components and appear early in development, so ideological leanings are partly innate. Differences in brain chemistry have also been linked to political differences.

  • While we value our beliefs, we have little control over the social and innate factors that influence their development. Challenging beliefs with facts often fails to change higher-level attitudes, showing attitudes have deeper unconscious roots beyond individual reflection.

  • Distal beliefs, or deep-seated convictions, are often formed early in life through instinctive predispositions and socialization processes. They become resistant to information that contradicts existing beliefs.

  • Scholars from various fields have articulated that most of our beliefs are shaped by groupthink and community norms, rather than individual rationality. We tend to hold onto these beliefs due to group loyalty.

  • Studies show that intelligence does not correlate strongly with the ability to avoid myside bias, raising questions about how reflectively our convictions are formed.

  • Convictions are often the drivers of problematic myside bias, where people reject evidence contrary to pre-existing beliefs. This can prevent society from rationally evaluating policies.

  • Memetic theory and the concept of ideas replicating independently of their truth value can help explain where our convictions come from. It encourages a more detached view of our beliefs.

  • Analyzing beliefs through a memetic lens emphasizes that they are separate entities that need ongoing evaluation, rather than possessions we consciously chose. This can promote greater skepticism and distance from deeply held convictions.

  • When we have previously established evidence on an issue, it is justified to have some degree of “local myside bias” by projecting our prior beliefs onto new evidence in a way that accommodates the previous evidence.

  • However, when there is no previous evidence, we should use the principle of indifference and set our prior probability at 50-50 rather than projecting distal ideological beliefs.

  • In reality, people often assess how a new proposition relates to their distal ideology and set the prior probability above 50%, then project that onto new evidence. This leads to partisan disagreement and failure to reach consensus.

  • To address this, people should recognize that our distal beliefs come more from social learning and innate psychology than conscious reasoning. We tend to see our culture and beliefs as more of a personal choice than they really are.

  • Focusing on beliefs themselves rather than who holds them can help with evaluating others’ reasoning and myside bias, as bias levels correlate more with belief strength than thinker sophistication.

  • Overcoming myside bias is challenging for cognitively elite people who wrongly assume they are less biased, since bias levels do not correlate with sophistication for political/ideological beliefs.

Here are the key points made in the summary:

  • Cognitive elites like academics are particularly prone to a bias blind spot when it comes to myside bias. They wrongly assume their views are objective and others’ differing views are due to bias.

  • Psychology and related social science departments studying myside bias ironically constitute a “perfect storm” for strong myside-bias blind spots: they are staffed overwhelmingly by liberal, ideologically committed academics.

  • Studies show psychology departments have become a near ideological monoculture, with 84-90% of professors identifying as liberal and just 5-8% as conservative. This imbalance has grown significantly in recent decades.

  • While ideology may not impact all areas of psychology research, it likely does influence work on issues intertwined with political/moral attitudes, like sexuality, family structures, poverty, crime, etc.

  • Highly intelligent and educated cognitive elites who are strongly committed to a viewpoint are especially likely to think they rationally derived their beliefs, rather than recognizing influence from social/psychological factors.

So in summary, the literature suggests cognitive elites like liberal social science academics studying myside bias are particularly prone to blind spots about their own bias, due to ideological homogeneity in their fields and overconfidence in their objective reasoning.

  • The passage expresses concern about potential political bias affecting research in sensitive areas like sexuality, marriage, incentives, discipline techniques, and educational practices. When all researchers share the same biases in these areas, it undermines scientific objectivity and the ability to critically evaluate each other’s work.

  • It notes psychology research tends toward a liberal/progressive ideological homogeneity. Studies have shown bias is equally strong across the ideological spectrum, not unique to any group. High education and cognitive ability don’t prevent biases.

  • Academics are particularly vulnerable to “bias blind spots” - believing their views are logically reasoned while opponents’ views are not. But few people consciously reason out their core beliefs.

  • There have been relentless attempts by academics to show that conservatives have cognitive deficiencies. But most claimed relationships have not held up under different framings or when researchers of other ideologies have done the critiques. Conservatism seems more linked to culturally traditional values than to general intolerance. Feeling thermometers have also been used in misleading ways.

  • In summary, the passage expresses concern about political biases undermining objectivity in sensitive research areas, and argues claims of conservative cognitive deficiencies have been overstated and not held up to scrutiny.

  • Some research has attempted to label conservatives as more prejudiced by labeling one group (usually conservatives) as “high” in prejudice based on their relative scores on a prejudice scale, even if their absolute scores indicate little antipathy.

  • Studies have included prejudice scale items about conservative policy stances like affirmative action, which essentially equates policy disagreement with prejudice.

  • Other research embeds conservative beliefs in new scales meant to correlate them with constructs like “racism” or “science denial.” This direct explanation renaming psychology (DERP) syndrome is common.

  • Studies have also labeled normal male behaviors as prejudiced or as “benevolent sexism”, based on scales whose items are disputed as genuine indicators of prejudice.

  • While conservatism has a modest negative correlation with intelligence in some studies, social and economic conservatism often correlate differently - social conservatism correlates negatively but economic conservatism positively. More recent studies find little difference in intelligence between party identifiers.

  • In summary, some research purporting to link conservatism to prejudices fails scientific standards, while the relationship between ideology and intelligence is more complex than often portrayed.

  • Early research found minimal correlations between conservatism and intelligence. Correlations between ideology and personality traits like openness and conscientiousness are also small.

  • Thinking dispositions (like openness) are means to rational thinking, not ends in themselves. High levels do not necessarily indicate a more optimal psychology.

  • Some measures of traits like openness are flawed because they build in liberal political biases. For example, items assume endorsing moral relativism or ignoring religious authority indicates openness.

  • Actively open-minded thinking (AOT) consistently correlates with ideology, but multiple studies find AOT does not actually reduce myside bias or political polarization. High AOT scores do not indicate less biased thinking.

  • In general, attempts to link conservatism to negative cognitive or psychological traits have yielded weak results, despite significant effort. Correlations with traits like prejudice are undermined by failures to control for value conflicts.

  • While conservatives score lower on some scales like AOT, this does not demonstrate increased bias or worse decision-making compared to liberals. Interpretation of trait relationships is complicated.

This summary discusses research on ideological biases in psychology and the assumptions of irrationality among Trump voters held by many cognitive elites after the 2016 U.S. presidential election.

Some key points:

  • Researchers like Conway et al. demonstrated that, by building the opposite ideological content into such scales, one could find correlations between liberalism and traits like authoritarianism instead of the reverse. This exposed a lack of control for ideological content in the original measures.

  • After Trump’s victory, many psychologists expected research on rational thinking tests to show Trump voters as irrational compared to Clinton voters. However, research finds little correlation between thinking dispositions/intelligence and ideology.

  • Most Trump voters were past Republican voters, so characterizing them as uniquely irrational would apply to past GOP voters as well. Comparisons focus on party affiliations rather than absolute rationality levels.

  • Rational choice theory says people pursue both material and non-material goals that provide utility. Voting preferences express desires beyond just narrow self-interest, like valuing certain beliefs, values or others’ welfare. This contradicts assumptions Trump voters only pursued racist/sexist goals.

In summary, it discusses how psychological research was slow to control for ideological biases, and how rational choice theory challenges assumptions of Trump voters as uniquely irrational held by some cognitive elites after the 2016 election.

The passage argues that claims that Trump voters were irrational are based on oversimplified views of rational choice theory. Critics argue Trump voters voted against their own economic interests by voting Republican, but rational choice theory does not assume preferences must be purely self-interested. People often vote based on values and worldviews over narrow monetary interests.

The “What’s the Matter with Kansas?” critique fails to recognize how insulting it is to claim others are irrational for not voting in their perceived monetary interests. Educated liberals often vote based on noneconomic values too, like environmentalism. Republican voters may prioritize noneconomic values and worldviews just as Democratic voters do.

Whether to prioritize character/temperament of candidates over ideological worldview differences is a complex calculus, not a clear-cut measure of rationality. Given the stark difference between Clinton and Trump’s worldviews in 2016, it was reasonable for voters to weigh these factors differently. Democratic voters’ willingness to hypothetically support a candidate like Al Sharpton over Ted Cruz in another scenario also shows rational priority of ideology over character.

Claims of Republican epistemic irrationality on issues like climate change also oversimplify - one cannot simply declare a party more rational based on cherry-picked issues. A representative sample is needed to fairly evaluate parties’ receptiveness to scientific evidence overall.

Here is a summary of the key points from the passage:

  • There is evidence of science denial on both the liberal/Democratic side and the conservative/Republican side. Liberals deny scientific consensus in areas like intelligence being heritable, gender pay gaps, and negative impacts of rent control.

  • Studies find little difference in overall levels of factual knowledge between Democrats and Republicans. Surveys show they perform similarly on current event questions.

  • Measures of economic knowledge that include questions biased against both liberals and conservatives find similar scores across ideologies.

  • Research on conspiracy belief patterns originally suggested they were more common on the right, but more balanced studies find they are equally prevalent across the political spectrum.

  • The author’s own research using a wide range of conspiracy theories found no correlation between belief and political ideology.

  • A meta-analysis of over 40 studies found similar levels of “myside bias” or selective bias toward one’s own views among both liberals and conservatives.

  • In terms of both acquired knowledge and knowledge acquisition processes, there is no strong evidence Trump voters were more irrational than Clinton voters according to existing literature.

  • There are narrow and broad conceptions of rationality. Narrow focuses only on means-ends reasoning, while broad also evaluates the appropriateness of an agent’s goals and beliefs.

  • Most cognitive science adopts the narrow view, for practical reasons, but straying into broad rationality raises difficult issues like when it’s rational to be narrowly rational.

  • Evaluating the rationality of partisan opponents is prone to “myside bias.” Both sides see the other as irrationally expressing values contrary to evidence, while justifying their own side’s expressive modes of reasoning.

  • The search for cognitive deficiencies in Trump voters backfired, as cognitive science does not support claims they were irrational or less rational than Clinton voters. Judgments of others’ rationality are uniquely susceptible to myside bias in political domains. Expressive modes of reasoning may seem rational to one’s own side but not the other’s on politicized issues like climate change.

This summary discusses myside bias and potential ways to address it at both the individual and societal level.

The key points are:

  • Myside bias is problematic at the societal level as it has undermined objective debate about public policy issues. Political parties have become like modern tribes that prioritize tribal interests over evidence.

  • Institutions like the media and universities that should serve as neutral arbiters of evidence have failed and instead promoted partisan tribalism.

  • At the individual level, cognitive elites tend to assume political disputes are about facts/rationality and ignore differences in values/culture. They overestimate how evidence will change opposing views.

  • Expressive debates that openly discuss cultural values rather than just facts may be better for addressing issues like gun control where values deeply conflict.

  • Both individuals and institutions need to address myside bias to counter its toxic effects and allow for more objective consideration of evidence in policy debates. Recognizing biases like believing opponents just don’t know the facts is important.

This passage makes several key points:

  1. Some societal problems, like poverty and violence, may be solvable through rational policies and increased knowledge/intelligence. However, others like climate change, pollution, income inequality, and a divided society involve conflicting values between groups.

  2. For problems like pollution control and climate change, solutions require trade-offs between environmental protection and economic growth. People disagree not because of differing knowledge, but because they weigh these trade-offs differently based on their values.

  3. Income inequality does not have a single solution, as there are differing views on which parts of inequality to prioritize - between the wealthy and middle class vs. middle and lower class. This reflects differing values rather than knowledge.

  4. A divided society results largely from conflicts in values between groups. Increasing intelligence/knowledge alone will not solve it, as views differ based on political ideology and priorities.

  5. Liberals and conservatives often misunderstand each other - both groups generally understand facts but prioritize conflicting values when it comes to issues like economic growth vs. the environment or security vs. privacy.

  6. Recognizing the conflicting values within oneself, rather than just between groups, can help reduce “myside bias” or one’s tendency to favor their own viewpoint. Trade-offs must be acknowledged for complex issues.

So in summary, the key point is that some societal problems reflect differences in values and priorities rather than just knowledge, making them more difficult to solve through increased rationality or facts alone. Acknowledging conflicting viewpoints, including within one’s own thinking, can help address this.

  • Politicians often frame issues as having only one value at stake, implying that taking the most partisan position on that one issue does not compromise other important values. This is a cognitive fallacy embraced by both sides in ideologically divided societies.

  • Recognizing and exposing this fallacy as a trick played on voters could encourage less one-sided political discussions that acknowledge trade-offs between multiple values.

  • Myside bias, where we favor information aligning with our preexisting beliefs, flourishes in situations of ambiguity and complexity. As the world becomes more complex with greater information availability, it becomes easier to select evidence selectively in a biased way. Reducing ambiguity could help reduce myside bias even in complex topics.

  • Beliefs act like self-interested memes that are easily spread and accumulated through online algorithms and media targeting niche audiences. This can lead to an “obesity epidemic of the mind” where we accumulate many strong, unquestioned convictions through one-sided exposure to information.

  • In 2018, Facebook announced a project called “Social Science One” to provide researchers access to data to study information sharing on the platform. It took researchers two years to get a functional dataset, mostly spent negotiating with Facebook.

  • Very few academic researchers can afford to spend that much time on a project dependent on proprietary data controlled by a private company.

  • Questions about online communication patterns are unprecedented in their complexity. Virtually no researcher can answer them without “big data” resources held by tech companies.

  • Figuring out things like who saw what ads or content online is extremely difficult due to the massive scale of internet data (billions of users, millions of posts daily).

  • The combination of exponential data growth and technological complexity has greatly multiplied the ambiguous situations that people resolve through myside bias.

  • Whether alleged events like “cyberattacks” or “hate campaigns” occurred online is almost impossible for average people to ascertain without relying on experts, whose selection will also be subject to bias.

  • Issues don’t have to be partisan to spread misinformation online. The debunked autism-vaccine link continues to circulate due to “paranoia peer groups” on the internet.

So in summary, the scale of online data and its control by private companies makes independently studying important questions about social media extremely challenging for researchers and opens the door to more misinformation and biased interpretations among the general public.

There are a few key points that help explain the seeming paradox of politically polarized societies despite most people lacking ideological consistency:

  1. Polarization has been driven more by “negative partisanship” - increased animosity toward the opposing party rather than increased positivity toward one’s own party. Tribal identity and us-vs-them thinking plays a bigger role than policy issues.

  2. Partisan identification is a stronger predictor of beliefs and attitudes than actual differences on issues. Party affiliation acts more as a social/tribal identity than due to stance on specific issues.

  3. Political elites strategically bundle issues together to define ideological positions and parties. But many issues are only weakly correlated and positions not consistently linked by principle. Elites drive increased consistency over time.

  4. Most voters don’t have coherent or consistent ideological views across domains. They tend to take positions because that’s what their partisan group supports rather than due to personal views on the issues themselves.

  5. Increased polarization is led by cognitive elites who exhibit the highest correlations across issues with their party positions. But given most people disagree on many issues, no single group can claim to be “right about everything.”

So in summary, polarization arises from social/tribal identities and negative partisanship more than ideological differences, and most voters are led in their views by elites rather than having strong personal ideological stances.

  • Political parties bundle issue positions together for electoral and political reasons, not necessarily based on coherent principles. This leads to inconsistencies and seeming incoherence in some position combinations.

  • Examples given include Republicans supporting both traditional values and free market capitalism, which can disrupt communities. Democrats support action on climate change and helping the poor, but policies like increasing driving costs mainly hurt the poor.

  • In the abortion debate, both pro-life and pro-choice sides can point out inconsistencies in the other position. Animal rights activists tend to support abortion rights despite fetuses being more sentient than some animals.

  • Forgiving student loan debt favors the affluent contrary to Democratic ideals of equality. Democrats oppose charter schools even though they have more support among minority voters.

  • Party positions change quickly based on electoral calculations, not principles, like on deficits and immigration.

  • Because of this partisan bundling, people should treat new issues independently rather than assume a position just because their party holds it. Tribal partisan affiliation drives more division than actual policy disagreements in many cases.

  • Americans are increasingly living near others who share their political views, as evidenced by more counties voting overwhelmingly for one party over another in presidential elections.

  • People are also sorting themselves more consistently into political parties based on certain issues. This increases differences between the parties but does not necessarily make individuals more extreme in their own views.

  • A simulation shows that even a small number of people switching parties based on their issue positions can significantly increase the correlation between partisan identity and those issues, without changing individuals’ actual positions (a minimal version of such a simulation appears after this list).

  • Partisan sorting leads identities to become more aligned along demographic and lifestyle lines as well as issues. This disrupts “crosscutting identities” that previously kept partisan animosity in check.

  • Increased alignment, independently of changes in issue positions themselves, causes partisan anger and polarization to rise as tribal social identities become more salient.

  • While social and tribal influences can promote “groupish” thinking and bias, analyses of specific issues often show the public is closer in views than partisan divide would suggest. Partisan sorting collectivizes different types of people rather than changing views.
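Here is a minimal version of such a sorting simulation. It is a sketch under arbitrary assumptions (a 60% initial party-issue match rate, and a quarter of mismatched partisans switching), not the published model:

```python
import random

random.seed(1)
N = 100_000
# Fixed issue stances: +1 or -1, and no one's stance ever changes.
stance = [random.choice((-1, 1)) for _ in range(N)]
# Initially, party matches stance only 60% of the time.
party = [s if random.random() < 0.6 else -s for s in stance]

def corr(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    sx = (sum((x - mx) ** 2 for x in xs) / n) ** 0.5
    sy = (sum((y - my) ** 2 for y in ys) / n) ** 0.5
    return cov / (sx * sy)

print(round(corr(stance, party), 2))  # ~0.20: weak party-issue alignment

# A quarter of the mismatched 40% (about 10% of everyone) switch party
# to match their existing stance. No stance moves.
for i in range(N):
    if party[i] != stance[i] and random.random() < 0.25:
        party[i] = stance[i]

print(round(corr(stance, party), 2))  # ~0.40: alignment roughly doubles
```

The party-issue correlation roughly doubles even though no individual’s stance on the issue moved, which is the sense in which sorting increases differences between the parties without making individuals more extreme.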

  • Identity politics can magnify myside bias by encouraging people to view issues through an identity lens rather than on the merits of the issue. This turns factual beliefs into strong convictions tied to one’s identity.

  • There are two main types of identity politics - “common-humanity” which emphasizes universal rights, and “common-enemy” which views society in terms of opposing groups competing for power.

  • “Common-enemy” identity politics is problematic as it inflates myside bias. It encourages seeing power relations everywhere and treating more opinions as strong convictions.

  • On campuses, “common-enemy” identity politics deems the arguments of oppressed groups as automatically stronger due to their victim status. It values arguments based on the identity of the speaker rather than logical/empirical merit.

  • This shuts down open debate and discussion as opposing views can be dismissed via identity claims rather than engagement on facts/reasoning. It has led to more conflict and less tolerance of alternative perspectives on campuses.

In summary, the article argues that “common-enemy” identity politics magnifies myside bias, inhibits open debate, and undermines the purpose of higher education discussion by framing all issues in terms of competing identity groups.

  • Identity politics emphasizes one’s core identity (race, gender, etc.) above all else and sees the world through that single lens. This contributes to “myside bias” where one only considers views that confirm their own perspective.

  • Sam Harris and Ezra Klein debated intelligence differences, with Harris advocating an identity-neutral perspective and Klein arguing everyone has inherent biases from their identities. Harris resisted acknowledging his own identity to avoid having his argument discounted.

  • Using identities to claim authority or moral high ground in arguments, as in “Speaking as an X,” shuts down real debate and appeals to external validation rather than the merits of ideas.

  • Traditionally, universities taught students to think from a dispassionate, evidence-based “view from nowhere” rather than rely on personal experiences or identities. But identity politics has reversed this by privileging individual perspectives over objective truth-seeking.

  • This undermines rational debate and the scientific method’s ability to adjudicate conflicting knowledge claims through transparent evidence and experimentation, not personal attributes of the claimants.

The passage argues that identity politics hinders the role of the university in teaching students to avoid “myside bias” or the tendency to accept arguments that agree with one’s own views. A key skill universities aim to develop is cognitive “decoupling” - being able to consider arguments independently of one’s identity or group affiliations.

However, identity politics frames all issues in relation to identities and social groups. It encourages students to view the world and evaluate evidence through the lens of their pre-assigned identities. This makes it difficult for universities to teach perspective-taking and considering multiple viewpoints.

Traditionally, universities aimed to take students out of their comfort zones and identities to promote open-minded thinking. But by affirming pre-determined student identities, universities now risk merely “cheerleading for ice cream” rather than pushing students towards more intellectually demanding ways of thinking.

To remedy issues like political polarization and “science skepticism”, society needs institutions like universities that can promote evidence-based, detached reasoning. But identity politics undermines universities’ ability to fulfill this role by entangling factual claims with social or identity-based convictions. This is contributing to a loss of public trust in universities as neutral arbiters of truth and evidence.

  • The passage argues that universities have become too politicized and promote certain ideological positions over open inquiry. It cites examples of university administrators openly opposing election results and fueling partisan dissent.

  • It claims this politicization has created an environment where certain research conclusions are “verboten” and publishing views contrary to the dominant ideology can be professionally difficult or damaging.

  • As a result, the public is growing more skeptical of university research on politically charged topics, as the findings may be skewed by the prevailing ideological atmosphere. Issues related to identity politics, in particular, are seen as having predetermined conclusions.

  • This loss of credibility and perception of bias undermine universities’ ability to act as neutral arbiters and provide reliable evidence to inform public policy debates. Moreover, the concepts driving university policies, like diversity, microaggressions, and social justice, lack clear, agreed-upon definitions.

  • In summary, the passage argues excessive politicization in universities is compromising their role in open inquiry and making the public distrust research seen as promoting a partisan agenda over objective truth-seeking.

The passage criticizes the prevalence of the “disparity fallacy” on university campuses and in public discourse. The disparity fallacy involves using statistical disparities between groups to claim discrimination, without accounting for other possible explanatory factors. Two common examples given are the claim that women earn 77 cents for every dollar men earn (which fails to account for job choices, hours worked, etc.) and the claim that police kill black Americans at disproportionately high rates (which, the author says, studies do not support once crime rates are controlled for).

The author argues universities should be correcting these misleading claims, but instead they often promote them. Departments have analytical tools, such as regression analysis, to properly assess claims of discrimination, yet fail to deploy them aggressively. This allows biased, “mysided” arguments to spread.
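To make the regression point concrete, here is a minimal sketch in Python. Everything in it is illustrative: the data are simulated so that the group gap in wages is driven entirely by a confounder (hours worked) built into the simulation, and the `ols` helper is just a convenience wrapper, not a reference to any study discussed in the book.

```python
# Toy "disparity" example: a raw group gap that shrinks once a
# confounder is controlled for. All numbers are synthetic.
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

group = rng.integers(0, 2, n).astype(float)    # 1 = group A, 0 = group B
hours = 40 - 5 * group + rng.normal(0, 3, n)   # group A works fewer hours
wage = 20 * hours + rng.normal(0, 40, n)       # wage depends only on hours

def ols(y, cols):
    """Least-squares fit with an intercept; returns the coefficient vector."""
    X = np.column_stack([np.ones(len(y))] + cols)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

raw = ols(wage, [group])          # wage ~ group
adj = ols(wage, [group, hours])   # wage ~ group + hours

print(f"raw group gap:      {raw[1]:7.2f}")  # roughly -100: large and misleading
print(f"adjusted group gap: {adj[1]:7.2f}")  # near 0 once hours are controlled
```

Note that the sketch only illustrates the mechanics: in real disputes the choice of controls is itself contested (controlling for a variable that is downstream of discrimination can mask a genuine effect), which is why the tools must be applied carefully rather than mechanically.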

Additionally, the passage criticizes diversity, equity, and inclusion requirements for faculty, arguing they amount to ideological conformity and force endorsement of critical race theory. Diversity statements, it claims, assess commitment to identity politics and common-enemy framings rather than to open inquiry. The author argues this closing off of debate should be reformed by withholding state university funding until the requirements are removed.

The passage discusses the issue of “myside bias” in universities, which refers to bias in favor of one’s own preexisting beliefs and against opposing views. It argues that universities have strayed from their true mission of pursuing truth and are instead promoting bias.

It suggests that public universities issuing clear statements reaffirming their commitment to intellectual diversity and debate - and encouraging private universities to do the same - could help steer universities back to their proper role. This would help address the myside bias problem that is undermining public discourse.

It notes several issues contributing to myside bias, like a focus on tribal identities and victimhood over open debate of ideas. It also criticizes the biased and uncredible research coming out of some fields due to their promotion of social and political agendas over truth-seeking. Stronger commitment to intellectual diversity from universities could help counter these problems.

In summary, the key point is that universities have drifted from their core function and are now propagators of bias rather than arbiters of truth and evidence. Clear statements promoting open debate could help correct this issue and reduce the myside thinking that plagues public discussions.

Here is a summary of the article “Measuring individual differences in decision biases: Methodological considerations” (Frontiers in Psychology 6, Article 1770, doi:10.3389/fpsyg.2015.01770):

  • The article discusses methodological issues in measuring individual differences in decision biases and cognitive styles. It argues that current measures often have low reliability and lack consideration of contextual factors.

  • It proposes using multiple, multi-item measures designed to capture the constructs of interest in different ways. Using different contexts, stimuli, and response formats can help reduce measurement error and increase reliability.

  • Within-subject designs that expose participants to different decision tasks and contexts are recommended over between-subject designs to better distinguish individual tendencies from situational influences.

  • Response times and confidence judgments in addition to choices can provide useful information about processing styles. Biases may also manifest differently for low-stake hypothetical choices versus high-stake real decisions.

  • Future research should aim to develop psychometrically sound multi-method assessment tools that can reliably discriminate cognitive/decision styles across situations to advance understanding of individual differences in reasoning and judgment.

In summary, the article discusses the methodological limitations of current individual-difference measures of decision biases and proposes best practices for developing more reliable and valid assessment methods.
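To make the reliability point concrete, here is a minimal sketch in Python. The participant scores are simulated, and the `cronbach_alpha` helper is an illustrative implementation of a standard internal-consistency statistic, not code from the article; it shows the kind of check one would run before treating a multi-item bias score as a stable individual difference.

```python
# Sketch: Cronbach's alpha as an internal-consistency check for a
# multi-item bias measure. All scores below are synthetic.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: (n_participants, n_items) matrix of item scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of the total score
    return (k / (k - 1)) * (1 - item_vars / total_var)

rng = np.random.default_rng(1)
n, k = 200, 8
trait = rng.normal(0, 1, n)                        # latent bias tendency
items = trait[:, None] + rng.normal(0, 1, (n, k))  # each item = trait + noise

print(f"alpha = {cronbach_alpha(items):.2f}")  # ~0.89 here; higher = more reliable
```

With fewer items or noisier items, alpha drops quickly, which is the article’s point: a single-task bias score may be too unreliable to support claims about individual differences.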

Here is a summary of the references:

  • Several sources discuss partisan bias and motivated reasoning, where people interpret information in a biased way that favors their preexisting political beliefs. Some find evidence of biased factual beliefs and survey responses along partisan lines.

  • A few references examine the relationship between cognitive ability/intelligence and political ideology, generally finding that higher intelligence correlates with more liberal or left-wing views. Verbal intelligence in particular correlates with socially and economically liberal views.

  • Several studies explore differences in personality and cognitive styles between liberals and conservatives. Some find evidence that conservatives tend to be more sensitive to threats or prefer order and structure. However, the evidence is mixed.

  • Sources discuss factors like heredity, culture and psychological mechanisms that influence the development and expression of political beliefs and biases. Evolutionary perspectives are also considered.

  • A handful of references touch on debates around free speech on college campuses, gender inequality, and survey methodology.

  • Overall the references examine psychological and cognitive factors underlying political attitudes and biases from various theoretical perspectives like motivated reasoning, evolutionary psychology, and the interaction between personality and ideology. Both similarities and differences between liberals and conservatives are explored.

Here is a summary of the key points from the provided sources on motivated reasoning and related topics:

  • Motivated reasoning refers to the tendency for people’s preferences and beliefs to influence their evaluations of arguments and evidence. People are more likely to accept arguments that support their existing views.

  • Political ideology and party identification can influence evaluations and processing of political and scientific information in a biased manner through mechanisms like partisan bias and motivated skepticism.

  • Dual process theories propose that reasoning involves both intuitive/automatic “type 1” processes and more analytic/deliberative “type 2” processes. Motivated reasoning occurs when type 1 processes influence judgments more than type 2.

  • Individual difference factors like cognitive reflection, need for cognition, intelligence, and open-mindedness correlate with susceptibility to biases from prior beliefs or political views when evaluating arguments and evidence.

  • Effects of motivated reasoning have been demonstrated experimentally on tasks like the Wason selection task, belief-bias syllogisms, and evaluations of political and scientific arguments and evidence.

  • Partisan biases tend to be symmetrical - both political liberals and conservatives tend to exhibit biases in processing ideologically congenial vs. uncongenial information.

  • Factors like the complexity of issues, topic familiarity, preferences/values, and ideologically sensitive contexts influence the extent and direction of motivated biases in reasoning.

  • Motivated reasoning is likely an evolutionarily adapted mechanism that functions to promote group cooperation and identity, but it can undermine objective assessment of evidence and arguments in political, scientific, and other contexts.

#book-summary