SUMMARY - The Bias That Divides Us - Keith E. Stanovich
Here are the key points from the summary:
Researchers studied the phenomenon of "myside bias", where people tend to evaluate evidence in a way that confirms their preexisting beliefs or positions on issues.
Studies showed participants favored statistical evidence or interpretations that aligned with their prior views on topics like immigration, gun control, and medical treatments, even when the overall evidence was ambiguous or conflicting.
Myside bias occurred equally among liberals and conservatives: both sides selectively focused on the parts of the same evidence that supported their own views and discounted the parts that contradicted them.
Some theories argue myside bias evolved because it increased fitness even if it reduced accuracy, or that reasoning evolved for persuasion in arguments rather than truth-seeking, explaining its prevalence in human thinking.
So in summary, myside bias describes the tendency of people to evaluate ambiguous evidence in a biased way that aligns with and supports their existing beliefs and opinions, as demonstrated across multiple experimental studies.
Koehler's formal analysis showed that when evaluating ambiguous or uncertain evidence, allowing prior beliefs to influence judgments (a degree of "myside bias") can be rational under certain conditions.
Specifically, if the prior belief was formed reasonably based on previous evidence, and the evaluation follows Bayesian principles, projecting that prior onto new evidence assessment is locally rational as it leads to faster accumulation of true beliefs.
However, the proof only demonstrates local rationality - it does not guarantee the prior belief itself was determined through an unbiased process. To achieve global rationality, the prior probability would need to be based solely on valid evidence, not other factors like personal preferences or worldview.
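The local-rationality point can be illustrated with a toy Bayesian update (the numbers here are invented for illustration, not taken from Koehler's proof): two reasoners see the same three pieces of supporting evidence, but the one whose prior was grounded in earlier valid evidence reaches a confident true belief faster.

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Posterior probability of a hypothesis after one piece of evidence."""
    numerator = prior * p_evidence_if_true
    return numerator / (numerator + (1 - prior) * p_evidence_if_false)

informed = 0.80  # prior formed from earlier valid evidence
neutral = 0.50   # flat prior, no previous evidence

# Three identical pieces of evidence, each twice as likely if the
# hypothesis is true (likelihood ratio = 0.6 / 0.3 = 2).
for _ in range(3):
    informed = bayes_update(informed, 0.6, 0.3)
    neutral = bayes_update(neutral, 0.6, 0.3)

print(informed, neutral)  # informed ends near 0.97, neutral near 0.89
```

The catch, as the passage notes, is that this only helps if the prior itself was formed from valid evidence; the same arithmetic accelerates convergence to a falsehood when the prior reflects preference or worldview rather than evidence.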
Later work, including Mercier and Sperber's, appealed to instrumental rationality: the idea that beliefs may reasonably be maintained not just for epistemic accuracy, but also for social and cognitive benefits like group cohesion. Some myside bias could therefore be instrumentally rational.
This led to the concept of the "tragedy of the communication commons" - individually rational myside bias prevents society from reaching consensus on objective truths due to polarized interpretations of evidence.
So in summary, limited myside bias can be rational based on epistemic or instrumental reasons, but its social effects may undermine truth-seeking at the societal level even if individually justified. Global rationality requires unbiased formation of prior beliefs.
Studies have found that myside bias, the tendency to evaluate evidence in a way that supports one's own views, does not consistently correlate with cognitive ability, thinking dispositions, or education level. Unlike many other cognitive biases, it is not reliably weaker in people with greater cognitive sophistication.
Myside bias appears to depend more on the specific content and strength of one's prior opinions on an issue, rather than general psychological traits. Those with stronger prior beliefs tend to show greater bias, regardless of ideology.
This suggests myside bias may be better conceptualized as a content-based phenomenon rather than an individual trait or cognitive failure. It is more contingent on belief specifics than broad psychological processes.
Considering beliefs from a "memetics" perspective, some propose myside bias persists because it protects existing belief "memes" from competing ideas that could displace them. Bias may serve the spread and replication of beliefs, rather than truth or rationality per se.
The independence of myside bias from factors like cognitive ability challenges assumptions about how individual differences shape rational thinking, and it undermines the view of beliefs as things people choose purely on the basis of reason and self-interest.
Some psychological research attempting to link conservatism to prejudice or closed-minded thinking has methodological flaws that incorporate liberal biases. Scales are designed in ways that treat disagreement with liberal positions as evidence of prejudice or closed-mindedness.
Correlations between ideology and traits like openness or intelligence are weak. Interpreting trait relationships is complex, and traits alone do not determine rationality or decision-making ability.
Studies find measures of traits like actively open-minded thinking do not actually reduce political polarization or myside bias as assumed. High scores do not indicate less biased thinking.
Assumptions by many cognitive elites that Trump voters must be irrational were not borne out by research on thinking styles and decision-making tests, which found little difference between partisans.
Ideological bias is a blind spot for many in academia, who see their own liberal views as rational but others' conservative views as motivated by irrational factors. More balanced consideration of evidence is needed.
In summary, the passage questions some psychological research methods and assumptions about links between ideology and cognitive traits or rationality, due to potential liberal biases in the research designs and interpretations.
Social media platforms like Facebook collect enormous amounts of data about online interactions but closely control access to that data for external researchers. This makes it extremely challenging for academics to independently study important questions about information sharing and political communication on these platforms.
Answering complex questions about how content and ads spread online requires "big data" resources that only the tech companies themselves possess. Very few individual researchers have the time and resources needed to negotiate data access for large-scale studies.
The massive scale of internet data, with billions of users and millions of daily posts/interactions, makes it difficult to trace how specific information was viewed or spread without comprehensive platform data.
As online environments become more ambiguous due to exponential data growth and technological complexity, people rely more on "myside bias" to understand issues. But determining the facts of even non-partisan events can be impossible without data-driven expert analysis.
Misinformation propagation is not limited to partisan issues - even debunked claims like vaccine-autism links continue spreading online in "paranoia peer groups." Independent fact-checking is hampered without access to private platform interaction data.
So in summary, limited third-party access to critical social media usage data poses a major challenge for objectively studying modern political communication and information spreading online at scale.
The disparity fallacy refers to using statistical disparities between groups alone as proof of discrimination, without considering other possible explanatory factors.
On college campuses and in public discourse, statistical disparities are often taken as conclusive evidence of unfair treatment or implicit bias against a group. Alternative explanations are dismissed.
Groups advocating for certain policies use disparity data selectively to argue for more scrutiny or quotas. But this approach risks unfairly accusing innocent people and exacerbating social tensions.
There may be many legitimate reasons for differences between groups beyond just discrimination, such as underlying preferences, choices, or disparities that accumulated gradually over time through many small decisions.
A proper analysis examines diverse, complex contributing factors and rules out alternative explanations before attributing disparities definitively to unfair treatment or bias. Simply pointing to numerical gaps is fallacious reasoning.
Overreliance on the disparity fallacy undermines open inquiry, fairness and productive discussion of complex social issues. It risks damage to individuals and societies through unjust accusations and policies based on incomplete analysis.
In summary, the passage criticizes an overdependence on disparity data alone to "prove" discrimination, arguing this disparity fallacy overlooks alternative explanations and risks unfair accusations if not combined with thorough analysis of all factors.
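The statistical point can be made concrete with a toy example (the scores and threshold below are invented for illustration): a single group-blind selection rule applied to two groups with different score distributions still produces unequal outcome rates, so the gap by itself cannot establish that the rule discriminates.

```python
# Two groups face the same selection rule (score >= 70). The rule never
# looks at group membership, yet different score distributions alone
# produce an outcome disparity.
group_a = [60, 65, 72, 78, 85, 90]
group_b = [55, 62, 68, 71, 74, 88]

pass_rate_a = sum(score >= 70 for score in group_a) / len(group_a)
pass_rate_b = sum(score >= 70 for score in group_b) / len(group_b)

print(pass_rate_a, pass_rate_b)  # unequal rates from a group-blind rule
```

This does not show that any real-world disparity is benign; it shows only that the inference "disparity, therefore discrimination" is invalid without ruling out other causes, which is exactly the passage's point.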
Here are summaries of some key papers on motivated reasoning and related topics:
Kahan (2013): Proposed the cultural cognition theory that cultural values shape risk and science judgments more than facts alone. Cultural groups unconsciously evaluate evidence in identity-protecting ways.
Kahan et al. (2012): Found scientifically literate subjects engaged in more motivated reasoning, showing greater bias from preexisting views. Scientific literacy alone doesn't inoculate against biases.
Mercier and Sperber (2011): Argued reasoning evolved to win arguments, not find truth. People are biased processors and communicators of information who tend to reason strategically rather than objectively.
Taber and Lodge (2006): Demonstrated motivated skepticism experimentally. Holding strong views on issues impaired ability to process counterattitudinal information objectively.
Westen et al. (2006): Reasoning is often motivated rather than disinterested. People use logic flexibly to reach identity-defending conclusions, validating initial intuitive judgments.
Kahan (2017): Suggested alternative theories like biased assimilation or identity-protective cognition better capture evidence than rational actor models. Bias is not a bug but human nature.
Lord et al. (1979): Showed proponents and opponents of capital punishment the same mixed body of evidence; each side rated the studies supporting its position as more convincing, and attitudes became more polarized rather than converging. Introduced the concept of biased assimilation.
Here is a 100-word summary of the key ideas:
This summary discusses research on motivated reasoning and how ideological beliefs can influence cognitive processes. It touches on how pre-existing views can bias how evidence is evaluated and problems are framed. Threats to one's worldview can trigger identity-protective cognition and distancing from facts. Myside bias and value congruence effects are mentioned as contributors to rigid political thinking. The influence of biases, prejudice and closed-mindedness on flexibility of belief is briefly explored. In general, it concisely outlines cognitive and social psychological factors relating to the rigidity versus flexibility of political opinions.