Self Help

Future Babble - Dan Gardner

Matheus Puppe

“If you liked the book, you can purchase it using the links in the description below. By buying through these links, you contribute to the blog without paying any extra, as we receive a small commission. This helps us bring more quality content to you!”

BOOK LINK:

CLICK HERE

Here is a summary of the key points from the Table of Contents:

  • The book has 8 chapters that examine expert predictions and why they are often wrong or unreliable.

  • It discusses how uncertainty about the future causes anxiety and a desire for certainty, even if predictions turn out to be wrong.

  • It explores cognitive biases and psychological factors that lead experts, and the public, to put too much faith in predictions.

  • There are notes and bibliographies citing the sources and research discussed in the book.

  • The author, Dan Gardner, has previously written a book about how fear is manipulated in culture.

  • Overall, the book aims to understand why predicting the future is so difficult, why we seek predictions anyway, and how to better deal with an unknowable future. It examines past predictions to demonstrate experts’ track record and psychological factors underlying forecasting.

The passage discusses how views of the future changed dramatically in the early 20th century. In 1902, economist John Bates Clark envisioned the 20th century as one of steady progress and technological advancement without war. However, World War I shattered this optimism. After the war, many thinkers predicted decline and crisis for Western civilization.

The Great Depression further damaged confidence in democratic capitalism and the future. The rise of fascism and the possibility of another world war led many thinkers, like H.G. Wells, to predict doom. Even during World War II, luminaries like Einstein warned that mankind faced self-destruction without radical change.

However, against expectations, the postwar era saw unprecedented peace and prosperity in the Western world. Economies boomed, especially in the US, lifting standards of living. The generation that came of age in this period, born in the Depression, experienced an incredibly fortunate period despite earlier predictions of decline. No experts or thinkers had accurately predicted this dramatic reversal of fortune. The passage highlights how views of the future can change drastically based on contemporary events and circumstances.

  • After World War II, many economists predicted a recession or mass unemployment as demobilization occurred. However, the opposite happened: the postwar years brought one economic surprise after another as growth and prosperity kept exceeding expectations.

  • People have a natural desire to predict the future, especially when raising children. The author’s great-grandfather and grandfather likely asked what the future would hold for their newborns, paying attention to experts. However, the experts were often wrong in their pessimistic predictions.

  • Throughout history, experts and commentators have frequently made failed predictions about coming disasters, recessions, resource shortages, wars, and more. Several specific examples are given from the late 20th century of prominent figures like Paul Ehrlich, Jimmy Carter, and Ravi Batra predicting crises that did not materialize.

  • Optimistic predictions also often fail to come true, like predictions of future technologies or endless economic growth. Both liberals and conservatives, optimists and pessimists, frequently get major forecasts wrong when looking to the future. The future remains unpredictable.

The passage discusses how expert predictions often fail to come true, despite widespread confidence in them at the time. It gives several historical examples:

  • Predictions of widespread famine in the 1960s-70s due to overpopulation did not come to pass.

  • Experts widely supported the notion that Saddam Hussein had weapons of mass destruction prior to the Iraq War in 2003. None were found.

  • Economists failed to predict the 2008 financial crisis and recession, forecasting steady growth instead.

It argues the world is too complex to predict accurately, and the human mind tends to see patterns where none exist and treat random results as meaningful.

Nonetheless, people still trust expert predictions because admitting uncertainty is psychologically uncomfortable. Experts appear authoritative with credentials, and the media favors overly confident, simplistic predictions. So despite a track record of failures, people continue to seek out and believe predictions to reduce uncertainty, even from experts who were wrong before. The book aims to explain why predictions fail and why people still place faith in them.

The passage discusses the desire people have to believe expert predictions about the future, especially during uncertain times. While experts may claim they can foresee coming problems and solutions, the track record of predictions is mixed at best.

In the past, predictions about topics like overpopulation, resource scarcity, and economic crises often proved wrong. However, experts are reluctant to admit fault and use arguments like “it would have happened without intervention” or “the timeline was off but it will still occur.” Determining the true accuracy of predictions is difficult, as reasonable people can disagree on whether a given prediction failed or not.

The author acknowledges some predictions do come true, like the forecast of Soviet collapse. But to prove experts have an overall poor record would require comprehensively tracking all predictions made over many years against outcomes, which is practically impossible. And even a clear hit doesn't necessarily mean the expert has real insight; someone could simply get lucky on an inherently unpredictable outcome.

Overall, the passage cautions against entirely trusting expert predictions, given the complexity around verification and a natural human tendency to want reassurance about the future. While planning is still necessary, more skepticism is wise given the inherent uncertainties.

  • Philip Tetlock conducted an extensive decades-long experiment to rigorously test expert predictions.

  • He gathered predictions from hundreds of experts on geopolitical and economic issues over short and long timeframes, requiring precise probabilities rather than vague terms.

  • Predictions were evaluated based on calibration (how often predicted probabilities matched actual outcome frequencies) and discrimination (rewarding more confident correct predictions over less confident ones); a toy scoring sketch appears after these bullet points.

  • Predictions were also compared to simple baseline rules to establish benchmarks.

  • The experiment revealed experts generally predicted no better than chance and were overconfident. Their theories about why past predictions failed did not make them more accurate going forward.

  • Tetlock’s experiment was scientifically rigorous and demanding, taking years to fully evaluate predictions, but provided valuable evidence that expert forecasting is very difficult and experts are often not as insightful as they believe.
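
To make the calibration idea concrete, here is a minimal sketch (mine, not from Tetlock's study) of how probabilistic forecasts can be scored against outcomes. The forecasts and numbers below are invented for illustration; the Brier score and the bucket tallies are standard ways of measuring what the book calls calibration.

```python
# Minimal sketch of scoring probabilistic forecasts, in the spirit of Tetlock's
# calibration measure. All forecasts below are invented for illustration.
from collections import defaultdict

# Each entry: (stated probability that an event happens, whether it happened)
forecasts = [
    (0.9, True), (0.9, False), (0.8, True), (0.7, True), (0.7, False),
    (0.6, False), (0.5, True), (0.3, False), (0.2, False), (0.1, True),
]

# Brier score: mean squared gap between stated probability and the 0/1 outcome.
# Lower is better; always saying "50%" about coin flips scores 0.25.
brier = sum((p - (1.0 if hit else 0.0)) ** 2 for p, hit in forecasts) / len(forecasts)
print(f"Brier score: {brier:.3f}")

# Calibration: within each stated-probability bucket, how often did events occur?
buckets = defaultdict(list)
for p, hit in forecasts:
    buckets[p].append(hit)

for p in sorted(buckets):
    hits = buckets[p]
    print(f"said {p:.0%} -> happened {sum(hits) / len(hits):.0%} of the time (n={len(hits)})")
```

Discrimination can then be judged from the same printout: a forecaster with real insight should see high-probability calls come true far more often than low-probability ones.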

  • In 1977, President Jimmy Carter gave a televised speech warning of an impending “energy crisis” that posed an unprecedented challenge to the US.

  • Carter said the US relied on oil for 75% of its energy needs, but global oil production would soon peak and then decline while demand kept growing rapidly. He warned that by the early 1980s, global oil demand would outstrip supply.

  • With demand exceeding production, oil prices would skyrocket and never come back down. Worst case, if oil consumption kept growing at 5% annually, the world could exhaust all known oil reserves by the late 1980s.

  • Cheap oil had fueled postwar US prosperity and lifestyle, so an energy crisis threatened the “American way of life.” Carter argued profound changes were needed to reduce oil dependence.

  • Carter acknowledged some Americans doubted a real energy shortage, but he insisted the facts demanded action to avoid economic and national security disaster from a growing energy crisis.

  • In 1977, President Jimmy Carter gave a televised speech warning of an impending energy crisis and calling for national conservation efforts. He said dependence on oil was worsening the energy problem and needed to be addressed urgently.

  • Carter called for major increases in solar energy, coal production, and developing new energy sources for the future. The effort would require sacrifices similar to war. Failing to act could lead to “national catastrophe.”

  • Carter’s speech is mainly remembered for using the phrase “moral equivalent of war,” which critics mocked as “MEOW.”

  • Contrary to predictions, the price of oil stabilized and then fell dramatically in the 1980s. Carter’s warnings of imminent shortage and rising prices proved spectacularly wrong.

  • Experts had reached a broad consensus that oil shortages were coming, but they consistently overestimated demand and prices while underestimating new supplies. Failed predictions stretch back to the 1860s with coal and 1919 with oil.

  • Ordinary Americans were more skeptical than experts, believing the “energy crisis” was exaggerated by oil companies. In hindsight, their view was closer to reality.

  • In the late 1970s, President Jimmy Carter expressed frustration that many Americans did not recognize the seriousness of the energy crisis and the need to import oil, despite rising imports making up almost half of US needs. Public knowledge of basic energy facts was surprisingly low.

  • Six years later in the mid-1980s, the world suddenly had a glut of cheap oil, surprising experts. Ordinary people were not as surprised by this change.

  • Predicting the price of oil seems simple based on the basic economic factors of supply and demand. Yet experts struggle to accurately predict oil prices far in advance, unlike other predictable events like solar eclipses.

  • Traditionally, scientists believed that if all the factors governing a system were fully understood, its future behavior could be precisely predicted, as with Newton’s laws enabling predictions of planetary motions. However, it became clear that many real-world systems are nonlinear and chaotic, making long-term predictions impossible due to the butterfly effect - small changes amplifying over time.

  • Events like tides and eclipses remain predictable because they are governed by relatively simple, linear laws of gravity. But nonlinear, chaotic systems defy accurate predictions no matter how much is understood about their workings.
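
A minimal numerical sketch of that sensitivity (my illustration, not an example from the book): iterate the logistic map, a textbook chaotic system, from two starting values that differ by one part in a billion and the trajectories become completely unrelated within a few dozen steps.

```python
# Toy illustration of the butterfly effect: the logistic map x -> r*x*(1-x)
# in its chaotic regime (r = 4). Two starting points differing by one part
# in a billion soon produce entirely different trajectories.
r = 4.0
x_a, x_b = 0.200000000, 0.200000001

for step in range(1, 51):
    x_a = r * x_a * (1 - x_a)
    x_b = r * x_b * (1 - x_b)
    if step % 10 == 0:
        print(f"step {step:2d}: x_a={x_a:.6f}  x_b={x_b:.6f}  gap={abs(x_a - x_b):.6f}")
```

Knowing the rule perfectly does not help; only a perfectly exact starting measurement would, and real systems never provide one.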

The passage discusses the unpredictability of complex nonlinear systems. It uses examples like cloud formation and earthquakes to illustrate how tiny differences can lead to vastly different outcomes, making accurate long-term prediction impossible.

While natural systems follow certain patterns, feedback loops and interactions between multiple factors introduce tremendous complexity. This makes it difficult to predict things like earthquake timing with equations; probability estimates are possible, but not certainty.

Even seemingly stable systems can suddenly change due to internal tensions. This nonlinearity is ever-present in nature. In contrast, social sciences long embraced linear models thinking human affairs were predictable, but consistently failed to foresee major events.

Individual human brains are highly complex networks of neurons exhibiting unpredictable “avalanches” of activity. This, along with our ability to acquire new knowledge, makes predicting the future course of human history fundamentally difficult according to philosopher Karl Popper. The demise of once-mighty Anaconda Copper despite confidence in its long-term future is used to illustrate this point.

  • In 1968, an executive at the copper company Anaconda boasted that fiber optics would never replace copper wire. Yet by 1977, fiber optics had been successfully developed and proven vastly superior to copper, causing the price of copper to plummet. Anaconda was forced to sell itself.

  • This demonstrated Karl Popper’s argument about the unpredictability of scientific and technological progress, and how even experts can be surprised. Just as Isaac Newton said one cannot predict “the madness of men,” breakthroughs are difficult to foresee.

  • History is full of seemingly small, random events that had large, unpredictable consequences - referred to as “monkey bite factors.” A monkey biting the king of Greece led to a war that killed 250,000 people. A spokesperson misspeaking led to the fall of the Berlin Wall.

  • Even areas thought to be predictable like demography are subject to surprises. Cohorts predicted to exist can be decimated by war, and declines in fertility that were unforeseen changed expectations of which countries would be powerful. The future always holds unknown surprises.

  • Population forecasts have historically been unreliable. In the early 20th century, experts predicted declining populations in Western countries, but births increased after WWII. Then in the 1950s, others predicted overpopulation, but births declined in the 1960s unexpectedly.

  • Demographic trends are difficult to predict accurately more than a generation (20-25 years) in the future. Factors like fertility, mortality, technology, social and political changes are hard to anticipate.

  • Similar forecasting issues exist in developing countries. Iran’s fertility rate crash in the 1980s was completely unpredicted.

  • Even basic demographic facts from the past are difficult to establish with certainty and estimates are often revised later.

  • Oil price forecasts face many of the same challenges due to instability in supply regions like the Niger Delta, unpredictability of geopolitics like the Iranian Revolution, and uncertainty around technology and demand drivers.

  • As a result, most experts are now cautious about long-term predictions and emphasize uncertainty. The future is difficult to foresee due to countless unpredictable human and technological factors.

  • The passage discusses the difficulty of predicting oil prices over the long term, despite many experts attempting to do so. It notes that legendary oil executive Lord John Browne considers price forecasts to be worthless given all the unpredictable factors involved.

  • However, geophysicist M. King Hubbert accurately predicted that U.S. oil production would peak in the late 1960s/early 1970s, based on analyzing production rates and reserves of individual oil fields and regions. This showed peak oil for a single country was possible to forecast.

  • There is debate around whether global peak oil has occurred or will occur in the near future, with significant economic ramifications. However, past predictions of timing have been wildly inaccurate, even from experts in the 1970s.

  • The failure of predictions has not stopped the ongoing attempts to forecast prices decades into the future, despite the clear evidence that oil prices cannot be accurately predicted due to many nonlinear and uncertain factors like technology, demand, politics, and economic conditions.

  • While peak oil theory may be broadly correct, the timing remains very difficult to determine precisely given changes in these external conditions over long time frames. Overall the passage questions why forecasting oil prices remains so popular given its consistent failure over the industry’s history.

  • Arnold Toynbee was a famous British historian best known for his 12-volume work “A Study of History”. He argued there were universal patterns in the rise and fall of civilizations.

  • The work became hugely popular in the 1940s-50s as the world entered a frightening new era of atomic weapons and tensions. Toynbee was seen as a prophet who could explain historical trends.

  • However, most professional historians strongly criticized Toynbee’s work. They did not find compelling evidence for universal patterns and saw him cherry-picking facts to fit his theories.

  • Toynbee was accused of selectively using evidence and twisting facts to force diverse civilizations into fitting his pre-conceived stages of rise and decline. One clear example was his handling of the rise of Islam.

  • While Toynbee’s erudition impressed lay audiences, historians argued his methods were not empirical or proven. They saw his work as more speculative philosophy than rigorous history.

  • Toynbee remained popular with the public but his work was never taken seriously by academics, who viewed history as more contingent and unique in each case.

  • In the 1940s and early post-WWII period, pessimism was widespread as totalitarianism was rising and another war seemed likely. Arnold Toynbee captured this mood of decline in his work A Study of History.

  • Toynbee met with Henry Luce, the influential publisher, in 1942 and impressed him with his vision of Western civilization entering a period of breakdown and disintegration. However, Toynbee believed the US could establish a “universal state” to avert complete collapse.

  • Luce promoted Toynbee’s ideas through his magazines, making Toynbee hugely famous when an abridged version of his work was published in 1947. It sold well and became a bestseller, greatly influencing Americans grappling with postwar uncertainty.

  • Toynbee consistently predicted civilization would move towards a universal state or face total destruction via war. He saw this as the inevitable pattern and believed choices could affect which outcome occurred.

  • Over time, historians criticized Toynbee’s theories but he remained famous, continuing to lecture and publish books projecting a deterministic vision of the future. However, his predictions did not come to pass.

  • By the end of his life in 1975, Toynbee’s intellectual framework had collapsed although he retained his renown. After his death, interest disappeared as his work proved inaccurate and lacking lasting influence on the study of history.

  • Arnold Toynbee was a brilliant historian whose predictions about the future turned out to be wrong.

  • The human brain, even for geniuses, is not perfectly designed but rather evolved through incremental changes over millions of years. It contains unintended quirks and flaws.

  • The brain operates through two systems - the quick, intuitive unconscious system and the slower, deliberative conscious system. The unconscious system shapes our initial perceptions and judgments.

  • An experiment found people were much more likely to return a lost wallet if it contained a photo of a baby compared to other photos or no photo. This indicates the unconscious system assigns more value to babies.

  • The brain evolved in a different environment and still contains systems adapted to that past world. One such system equates photographs with reality, which can still subtly influence behavior today through the unconscious system.

So in summary, even brilliant minds like Toynbee’s contain evolutionary flaws and quirks that can lead predictions astray, as the complexity of the modern world differs greatly from the environment our brains evolved in. The unconscious system’s influence also contributes to how perceptions and judgments can diverge from rational thought.

The passage discusses several experiments that show how human intuition can fail to accurately perceive randomness and chance. In one study, people struggled to identify a random distribution of bomb strikes in London during WWII, inventing non-random explanations instead. People also fail to perceive randomness in coin tosses, lottery selections, and other chance outcomes.

The passage then summarizes psychologist Ellen Langer’s experiments on the “illusion of control.” In one, people placed larger bets when facing a nervous opponent in a random card game, thinking they had more influence. In another key study, students were told they had predicted coin flip outcomes, but the results were actually rigged to be 50/50 wins and losses. Students who received early “wins” came to falsely believe they had skill at predicting outcomes.

The experiments demonstrate how the human brain intuitively seeks patterns and meaning, even when outcomes are truly random. This can lead to mistaken perceptions of influence, skill, or non-random causes when chance alone is at play. The findings held even for intelligent Yale students testing their rationality, showing the power of intuitive biases.

  • Ellen Langer’s experiment showed that people believed they could influence coin flip outcomes through skill or strategy, even though chance is the sole determinant. This illustrates the “illusion of prediction/control” phenomenon.

  • Later research by Presson and Benassi found this illusion is stronger for predictive tasks than control tasks. It leads people to see random patterns as signs of predictive ability.

  • Evolution wired humans to detect patterns because that was useful for survival, but not to intuitively grasp randomness. Spotting patterns was adaptive even if some patterns were illusory.

  • Animals like pigeons also see illusory patterns, connecting unrelated events to rewards. This is akin to superstition.

  • In experiments with flashing lights, humans keep trying to guess the underlying pattern, while animals simply go with the most frequent outcome, as rats do when one light flashes more often than another.

  • Michael Gazzaniga’s split-brain experiments found the left brain’s “interpreter” drives this behavior, always seeking order even when randomness is at play. It generates stories to explain the unexplainable.

  • This tendency is magnified with large data sets, where statistical mining can yield illusory correlations due to random chance. More info enables more storytelling.
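
As a rough illustration of that last point (mine, not the book's): mine enough purely random series and some will correlate strongly with whatever target you choose, inviting a confident story where only chance is at work. The series names and sizes below are arbitrary.

```python
# Toy demonstration that mining many random series yields "impressive"
# correlations by chance alone. Everything here is random noise.
import random
import statistics

random.seed(42)
N_POINTS, N_SERIES = 20, 1000

def correlation(xs, ys):
    mx, my = statistics.mean(xs), statistics.mean(ys)
    sx, sy = statistics.pstdev(xs), statistics.pstdev(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) * sx * sy)

target = [random.gauss(0, 1) for _ in range(N_POINTS)]
candidates = {f"series_{i}": [random.gauss(0, 1) for _ in range(N_POINTS)]
              for i in range(N_SERIES)}

best_name, best_r = max(((name, correlation(s, target)) for name, s in candidates.items()),
                        key=lambda item: abs(item[1]))
print(f"Best of {N_SERIES} random series: {best_name}, r = {best_r:+.2f}")
# On 20 points, the best of 1,000 tries routinely shows |r| around 0.6 or 0.7:
# pattern-like enough to support a story, yet pure noise.
```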

The passage warns that experts can fall victim to seeing patterns and developing explanations where none actually exist. This is because their extensive knowledge allows them to rationalize anomalies and inconsistencies in ways that seem compelling but are ultimately false.

It gives examples of political pundits and historian Arnold Toynbee wrongly perceiving meaningful narratives due to confirmation bias and a need to make the facts fit preconceived theories. Toynbee especially struggled when confronted with information not fitting his scheme.

Research shows overconfidence is a fundamental human trait closely linked to optimism bias. People tend to overestimate their ability to be right, especially on difficult questions, and downplay risks personally while acknowledging them generally. This encourages action despite inaccuracies.

Confirmation bias further entrenches beliefs as people seek out information that fits their views and critically scrutinize information that does not. In experiments, people on opposite sides of an issue became even more convinced of their original opinions after reading the same mixed set of "studies," simply because they evaluated the evidence in a biased way. Admitting error is difficult.

Experts suffer from these biases too. Extensive knowledge allows intricately rationalizing inconsistencies but can distort judgment from an objective view of evidence.

  • A researcher conducted a study where reviewers evaluated papers that either supported or contradicted their own views. Reviewers tended to rate papers that supported their views positively and recommend publication, while rating contradictory papers negatively and recommending rejection. This showed that reviewers were subject to bias, even if unaware of it.

  • Arnold Toynbee published his grand theory of history in A Study of History. However, he seemed to selectively seek out and interpret evidence in a way that confirmed his preexisting theory, while ignoring or rationalizing away contradictory evidence. Critics said he forced events into a scheme locked in from the beginning rather than allowing the evidence to shape the theory.

  • Experts are not always right. Some experts like hedgehogs who cling rigidly to a single interpretive framework are more prone to bias and making incorrect predictions, even about their own areas of specialty, compared to foxes who appreciate multiple perspectives and uncertainties. Toynbee is cited as an example of an overconfident hedgehog expert unduly influenced by his own biases.

  • Critics were right to question Toynbee’s simplistic theories of history that glossed over complexities and uniqueness of events. Not all experts are equally reliable - foxes showed better judgment in scrutinizing Toynbee compared to many who treated his work as prophecy. Past failures of prediction should encourage humility versus absolute certainty.

  • In the 1980s and early 1990s, there was widespread fear that Japan’s economy would surpass the US as Japan achieved stronger growth, lower debt, and dominated important industries like electronics and automobiles. Many books warned of America’s impending economic decline relative to Japan.

  • Experts presented detailed analyses showing Japan’s GDP would pass the US by 2000-2004 as trends continued. Books had dramatic titles predicting economic catastrophe for the US.

  • However, Japan then suffered a “lost decade” of stagnation after its bubble economy burst in the early 1990s. The US enjoyed strong growth in the 1990s, defying predictions of relative decline.

  • Europe also failed to fulfill predictions of dominance. By the 2000s, the US economy was much larger than Japan or European economies combined.

  • The failure of predictions is largely explained by status quo bias - the tendency to assume tomorrow will be like today and simply project current trends indefinitely. This works for stable times but fails at turning points. Experts underestimated how trends could change.

  • Similar failures of expert predictions due to status quo bias and linear projections can be found in many other domains like politics and society from the 1970s-1980s. Curves in the road often caused predictions to go off track.

  • The passage discusses how predictions and forecasts often fail to accurately foresee the future due to status quo bias - the tendency to assume current trends will continue.

  • It cites examples from books in the 1970s-80s that predicted continued high inflation, and books in the 1980s-90s that predicted continued Japanese economic dominance. These projections did not account for changes that shifted trends.

  • Experts also failed to anticipate the rise of China and India as global economic powers. One book from 1998 dismissed the idea that China could have a major economic impact anytime soon.

  • Despite knowing trends end and surprises happen, forecasters repeatedly fell into the trap of simply extrapolating current trends into the future. The passage attributes this to an “addiction” to extending trend lines.

  • It references the influential work of psychologists Kahneman and Tversky who demonstrated how people’s estimates are influenced by arbitrary anchors or numbers, even irrelevant ones, rather than purely rational analysis of available facts. This shows the limitations of intuition and biases in human judgement.

  • Researchers have found that when giving people an initial “anchor” number, even if told to disregard it, it still unconsciously biases their estimates. This is known as the “anchoring and adjustment” heuristic.

  • When experts forecast numbers, they often unconsciously anchor on the current status when making predictions. For example, forecasting unemployment by starting with today’s rate. However, this can lead to bias toward the status quo.

  • The availability heuristic is another unconscious bias where people judge likelihood/frequency based on how easily examples come to mind. Experiments show people wrongly estimate list attributes based on memory availability.

  • These unconscious biases from Stone Age brains can distort perceptions in modern information environments. After 9/11, the availability heuristic led most Americans to fear imminent follow-up attacks, even though nothing in prior experience made such attacks likely. Earthquake insurance purchasing likewise follows what is recent and memorable rather than rational risk assessment.

  • Unconscious biases from mental shortcuts can contribute to “status quo bias” and “failure of imagination” by anchoring perceptions to the current situation rather than considering alternative possibilities. This was arguably a factor for failures to foresee disasters like 9/11, economic crashes, etc.

The passage notes that groupthink and conformity to perceived consensus views can limit our ability to imagine alternative futures. Relying too heavily on extrapolating current trends, or imagining only incremental variations of today, risks missing black swan events or disruptive changes that could radically alter trajectories.

While experts are as prone to cognitive biases as anyone, gathering diverse perspectives and dissenting views can help challenge prevailing narratives. Outsiders may be more likely to think differently, though their fringe status can shape their thinking in unintended ways.

Rather than offering facile predictions, exploring a wide range of plausible and improbable scenarios through rigorous, balanced analysis, without attachment to any one outcome, can help identify vulnerabilities, unknown unknowns, and possibilities for positive change. An open exchange of ideas across ideological and disciplinary divides is most conducive to mental models robust enough to encompass disruption and surprise.

Ultimately, improving our collective ability to imagine alternative futures remains a work in progress, but recognizing the limits of extrapolation and seeking out contrary views are important steps toward staying prepared for uncertainty.

The article examines the use and effectiveness of scenario planning as a forecasting and strategic decision-making tool. It notes that empirical research has failed to find evidence that scenarios reliably anticipate real-world developments. While scenario planning aims to prepare organizations for uncertainty, psychological biases may actually distort perceptions and judgments.

Specifically, scenarios trigger two heuristics - availability, which makes vividly imagined events seem more probable, and representativeness, which enhances the plausibility of scenarios containing stereotypical details. This can lead decision-makers to greatly overestimate the likelihood of scenarios, including very unlikely ones, and fail to properly consider alternative possibilities.

Research by Kahneman and Tversky on expert predictions found experts assessed a scenario involving multiple contingent steps as more probable than a simple prediction, contrary to logic. They attributed this to representativeness making the scenario intuitively compelling. Another study found scenario exercises increased experts’ estimates of outcomes to improbably high levels, especially for change scenarios.

While scenario planning aims to counteract bias toward the status quo, it may instead push perspectives toward overestimating change. More evidence is needed on balancing these opposing biases to yield realistic appraisal. In general, the article casts doubt on scenario planning’s effectiveness given psychological tendencies it may exploit rather than overcome.

  • In 1979, President Jimmy Carter delivered a grim speech acknowledging deep problems in the US beyond just gas lines and the energy crisis. Inflation, recession, and instability abroad had eroded Americans’ confidence.

  • The year had seen gas shortages, trucker strikes, riots, and social unrest. OPEC hiked oil prices amid the Iranian revolution. Three Mile Island added to nuclear fears.

  • The mood was one of pessimism and crisis in both the US and other Western nations. Established regimes seemed weak as communist and fascist movements grew. Confidence in leaders had fallen sharply.

  • Carter wanted to explain why America couldn’t solve its energy problems as planned. He sensed broader issues troubling the public around uncertainty, instability and loss of faith in institutions during a troubled time of social and economic challenges.

  • The 1970s were a turbulent time in America, marked by social upheaval, unrest, and a loss of faith in institutions. Events like the Vietnam War, Watergate scandal, and economic troubles eroded public trust in government.

  • By 1979 under President Carter, polls showed most Americans believed the next 5 years would be worse than the past 5. There was a general mood of malaise and pessimism about the country’s direction.

  • Carter retreated to Camp David and solicited feedback from experts and citizens. They confirmed the nation was facing a “crisis of confidence” in its leadership and each other.

  • In his “malaise” speech, Carter acknowledged America’s spiritual and moral decline, blaming overconsumption and lack of purpose. Though grim, the speech resonated with most Americans and boosted Carter’s approval initially.

  • However, Carter soon undercut the speech by firing his cabinet. This tied the speech to perceptions of defeatism rather than addressing deeper issues afflicting American society and world standing in the turbulent 1970s.

  • Crime and terrorism surged globally in the late 1960s/early 1970s, overwhelming authorities. Drug use also increased dramatically, with marijuana, heroin, and cocaine spreading across Western countries.

  • The long post-war economic boom slowed and then collapsed due to the 1973 Arab oil embargo, plunging many nations into “stagflation” of high unemployment and inflation. This undermined public trust in governments.

  • Growing environmental concerns in the 1960s darkened in the following decade. Reports warned of societal breakdown and ecological catastrophe by the end of the century without drastic action on issues like population growth and resource consumption.

  • Books published in the late 1970s predicted coming years of economic hardship in America and around the world due to factors like inflation, currency devaluation, pension collapses, and international upheaval. Stockpiling food and gold was advised.

  • The US and other Western nations faced doubts about declining international power and influence after events like the Vietnam War and loss of British colonies. Environmentalist warnings compounded fears about the planet’s future.

  • In the late 1960s and early 1970s, there was widespread fear of overpopulation leading to catastrophic famines and shortages. Books like The Population Bomb warned of impending disaster.

  • The 1972 book The Limits to Growth predicted that if trends in population growth, pollution, resource use, etc. continued unchanged, the world would surpass ecological limits within 100 years, likely resulting in a collapse of human civilization.

  • In the early 1970s, the effects of the 1973 oil crisis and rising food prices seemed to validate fears of scarcity and limits to growth. Books that had predicted crises in the 1970s claimed vindication.

  • Moving into the mid-1970s, predictions of doom grew more extreme, with scholars and experts warning of societal breakdown, the dissolution of nations, and even the possible end of civilization by the late 20th or early 21st century if trends did not change dramatically.

  • Apocalyptic bestsellers like The Late Great Planet Earth mixed biblical prophecy with warnings of widespread famine, riots, economic chaos, and limited nuclear war in the late 20th century.

  • By the late 1970s, the earlier predictions of scarcity and crisis in the 1970s were being used to argue that even greater disasters were inevitable in future decades unless revolutionary changes occurred.

  • The passage discusses how lack of control and uncertainty can be psychologically distressing and even physically harmful for humans. Experiments with nursing home residents and electrical shocks are cited as evidence.

  • Torture often begins by destroying a victim’s sense of control over their circumstances. Not knowing what to expect increases fear and psychological stress.

  • People resort to superstitions and magical or religious beliefs as a way to feel some sense of control or prediction when facing unpredictable events outside their control, like bad weather or failing crops.

  • Studies have found increased interest in astrology, psychics, mystical religions, and more dogmatic churches during periods of economic turmoil, political unrest, and looming threats of war compared to safer times. The 1970s are given as an example when spiritualism and prophets grew more popular amid social and political upheaval.

  • In uncertain times, many people desperately seek reassurance or predictions about the future, even if through irrational or unscientific means, as a way to feel less powerless over unknown threats.

  • Religion saw a resurgence in the 1970s, with dramatic shifts between denominations. Evangelical churches grew significantly while liberal Protestant churches declined. Catholic membership held steady.

  • Conspiracy theories also flourished in the 1970s as people sought certainty. Theorists offered the reassuring idea that everything was part of a grand design, a struggle between good and evil.

  • Expert predictions provided another source of certainty. Many experts made bold predictions about coming disasters, economic collapse, etc. People wanted to believe the experts could provide clarity about an uncertain future.

  • Uncertainty is psychologically difficult, so people gravitate towards predictions that offer certainty, even if the predictions portend disaster. Gloomier predictions get more attention than optimistic ones.

  • In the 1970s there was a parade of pessimistic predictions from “hedgehog” experts who offered predictions with great confidence. However, most of the dire predictions of disaster, famine, economic collapse did not come true. Experience showed that overconfidence tends to lead to worse predictions compared to those who acknowledge uncertainties.

  • Still, people continued to flock to experts offering certainty, even as past predictions were proved wrong time and time again. The business of prediction continued regardless of track records.

The passage discusses British politician Norman Lamont’s favorable view of a particular pundit. It uses the example of economist Peter Schiff, who accurately predicted the 2008 financial crisis and housing market crash, while many other confident pundits dismissed or ridiculed his predictions. Schiff was right when he warned in 2006-2007 that a recession was coming due to unsustainable borrowing, consumption, and housing prices.

The passage analyzes clips from TV shows where Schiff debates other pundits like Arthur Laffer, Tom Adkins, Mike Norman, Charles Payne, and Ben Stein. These commentators were overconfident in their view that no problems existed and that housing prices, financial stocks, and the overall economy would remain strong. They mockingly dismissed Schiff's warnings.

The key point made is that while Schiff turned out to be right, all the pundits in these clips projected extreme confidence without acknowledgement of uncertainty. They spoke in definitive terms without caveats and never admitted possible mistakes. The passage critiques this overly bullish and unqualified style of prediction common among media “hedgehogs.”

Philip Tetlock conducted a study of 284 experts and their ability to make accurate predictions. He found that the more famous and media-savvy the expert, the worse their predictive performance tended to be. This runs counter to the expectation that only the most accurate experts would gain fame and prominence.

Tetlock provides some insights into why this occurs. Media outlets and audiences have a demand for confident and authoritative experts who will make bold predictions. This is what they supply, even if the predictions end up being wrong. Substance takes a backseat to style and confidence.

To illustrate this phenomenon, the summary describes some classic psychological experiments. One involved creating a fictional professor named Dr. Myron Fox who gave a nonsensical lecture with confidence and style. Evaluations of his performance were overwhelmingly positive, showing people value appearance of expertise over actual content. Other studies demonstrated the power of authority and status in influencing behaviors, like having people cross against traffic lights or nurses following instructions from someone claiming to be a doctor.

Overall, the findings suggest media experts gain prominence due more to their confidence and style than predictive accuracy. People are naturally inclined to defer to perceived authority, neglecting an important analysis of substance. This dynamic helps explain why famous experts tend to have worse predictive records than less famous peers.

  • Storytelling is a fundamental human activity that likely evolved because it allows for efficient sharing of knowledge and experiences within social groups. Telling explanatory stories satisfies our innate desire to understand the world.

  • For storytelling to be an effective means of sharing explanations, stories need a clear conclusion that resolves any uncertainties. Leaving things ambiguous does not satisfy our cognitive preference for order and resolution.

  • Stories are most engaging and memorable when they involve people rather than abstract concepts, elicit emotion, contain an element of surprise, and feature some kind of threat or problem to overcome.

  • Our “confirmation bias” means we tend to like stories that align with and reinforce our preexisting beliefs and schemas about how the world works. Stories that directly contradict our beliefs can be dissonant and less enjoyable.

  • Factors like overconfidence, competition between people, and the desire to appear certain for persuasion purposes can lead people to express views with greater conviction than is warranted by the actual uncertainty and variability in complex real-world situations. Reliance on expressed confidence as a proxy for accuracy has limits.

The passage highlights the important role that explanatory stories and human connection play in convincing others, as opposed to raw statistics and data. It discusses money manager Mike Robertson, who relies on a simple story about potato chips and motorcycles to establish his competency and approach to potential clients, rather than complex charts and analysis. Robertson acknowledges people are not rational and prioritize whether they feel you care about them and know what you’re doing.

It then switches to a legendary 1970 Tonight Show episode where population biologist Paul Ehrlich debates journalist Ben Wattenberg on the topic of overpopulation and its impacts. Ehrlich skillfully tells a clear, vivid story matching the audience’s concerns about issues like pollution, food shortages, and the Vietnam War. He comes across as confident, expert and charming. In contrast, Wattenberg’s message is unfocused and delivery hesitant. Ehrlich parries Wattenberg’s arguments deftly and remains unflappable. His simple “I = P × A × T” equation, expressing environmental impact as the product of population, affluence, and technology, also resonates.

The passage argues that explanatory stories which evoke emotion and demonstrate human understanding are far more effective at influencing others than cold statistics, even on complex issues. Ehrlich’s compelling narrative storytelling allows him to comfortably win over the audience against Wattenberg’s less articulate rational counterarguments.

  • Paul Ehrlich was a hugely successful public communicator on overpopulation issues in the late 1960s and 1970s. His book The Population Bomb and frequent TV appearances with Johnny Carson helped popularize concerns about overpopulation.

  • Ehrlich’s style and approach provides a lesson for successful communication. He spoke confidently and authoritatively, presenting a simple, clear narrative about overpopulation risks. He did not acknowledge uncertainties or mistakes.

  • Politicians, policymakers, scientists, activists, and media pundits have all adopted Ehrlich’s approach of overconfident, unambiguous messaging when trying to influence public opinion and policy. Speaking with ambiguity or uncertainty is less effective.

  • The media in particular relies on making simple, clear predictions about the future even when past predictions have proven wrong. Opinion columnists and TV pundits regularly make bold forecasts without acknowledging limitations.

  • In short, Ehrlich demonstrated that confident, unambiguous messaging works for spreading ideas, even if the ideas or predictions later turn out to be inaccurate or incomplete. This lesson has been widely adopted across many fields seeking to influence public debates.

  • Experts and journalists often overstate certainty in their predictions and headlines in order to make for a more compelling story. However, the future is unpredictable and most experts acknowledge significant uncertainty.

  • Computer scientist Ross Anderson of Cambridge University studied the potential impacts of the Y2K bug and found it would likely not be as severe as commonly portrayed. However, his more nuanced findings gained little media attention because they did not make an exciting narrative.

  • There are strong incentives for experts to make bold, alarming predictions as it gains them more publicity, fame and lucrative opportunities. However, there is little accountability when their predictions turn out to be wrong. Figures like James Howard Kunstler suffered no real reputational consequences despite hugely inaccurate doomsday forecasts about Y2K.

  • In general, there is a tension between the media’s desire for simple, certain narratives and the reality that most experts acknowledge uncertainty in predicting the complex future. Nuanced perspectives often get overlooked in favor of more dramatic stories.

  • The passage discusses how pundits and commentators are rarely held accountable for predictions and analyses about major events like the Iraq war that turn out to be wildly inaccurate.

  • Figures like David Brooks, William Kristol, and Mark Steyn strongly advocated for the Iraq invasion and made optimistic predictions about its positive outcomes, but their careers were unaffected when reality proved them wrong.

  • Other pundits like Dick Morris and Paul Ehrlich continue to be cited despite a track record of flawed predictions. Ehrlich in particular won numerous awards based partly on his predictions in “The Population Bomb” which did not come to pass.

  • There is a tendency to ignore failed predictions but highlight any that are seemingly borne out, no matter how vague or unlikely. After 9/11, people falsely claimed Nostradamus had predicted the attacks, when it was actually a student’s fake prophecy.

  • In general, with enough people making predictions, some are bound to coincide with events just by chance, even if most individuals have little skill at accurate forecasting; a rough sketch of the arithmetic appears after this list. But these coincidences are often misconstrued and rewarded.

  • Certain predictions that ended up coming true, like predicting Dow 30,000 by 2008, are often attributed more to skill than luck. But it’s very difficult to reliably predict the future, and many accurate predictions are more likely due to chance.

  • People have a cognitive bias where they ignore missed predictions but celebrate hits. This is called the “Jeane Dixon effect” after a psychic who received attention for some accurate predictions but whose many missed predictions were ignored.

  • There are incentives for various parties to focus on hits and ignore misses. Predictors want attention on their successes, media wants to boost the credibility of experts they feature, and critics selectively target opponents’ failures.

  • Self-interest, biases, and selective reporting mean the track record of past predictions is often obscured. Both predictors and media have incentives to highlight successes and forget or explain away failures to maintain reputations and credibility. This contributes to an “illusion of prediction” where skill is presumed even when accuracy may be due more to luck.
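
A back-of-the-envelope sketch of the "lucky forecaster" arithmetic referenced above (my numbers, chosen only for illustration): if hundreds of pundits make strings of coin-flip-quality calls, a few will compile streaks that look like foresight.

```python
# Back-of-the-envelope arithmetic: with enough forecasters making chance-level
# calls, a few "prophets" emerge by luck alone. Numbers are illustrative.
n_forecasters = 500   # hypothetical number of pundits
n_calls = 10          # yes/no predictions each one makes
p_correct = 0.5       # assume every call is no better than a coin flip

p_perfect = p_correct ** n_calls                     # one pundit gets all 10 right: ~0.1%
p_at_least_one = 1 - (1 - p_perfect) ** n_forecasters

print(f"Chance a given pundit is perfect: {p_perfect:.4f}")
print(f"Chance at least one of {n_forecasters} is perfect: {p_at_least_one:.2f}")
# Expected perfect records: n_forecasters * p_perfect, about 0.5, and dozens more
# pundits will be right 8 or 9 times out of 10: enough to look prescient.
```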

  • Pundits and experts are rarely called out for failed predictions publicly, as that would undermine their own credibility as well as the shows they appear on.

  • Arianna Huffington called out Larry Kudlow on CNBC for a prediction he made in 1999 that the Dow would hit 50,000 by 2020. Kudlow denied making the prediction at first.

  • After a commercial break, Huffington provided evidence of Kudlow’s actual prediction. Kudlow was angry but the conversation quickly moved on for the sake of not further damaging credibility.

  • We tend to notice and remember predictions that hit but ignore those that miss. If a predicted disaster occurs, we hear about the predictor, but if it doesn’t occur, the prediction is forgotten.

  • Psychologists have found people have a bias towards noticing confirming information over disconfirming information. We remember predictions that seem to fit our experience and ignore those that don’t.

  • Predictions are often made ambiguously so they can be stretched or reinterpreted to potentially fit future events, even if the original meaning misses. This allows mystics, psychics, etc. to claim hits when they may have initially been misses.

The passage discusses how predictions about the future are subject to bias in how we evaluate hits versus misses. It provides examples of predictions that seemed prescient at the time but were actually misses upon closer examination of history.

One example is a 1904 cartoon depicting Russia as a bear devouring Europe and Asia. This seemed like an amazingly accurate prediction in 1956 when Russia was a superpower, but the author notes it overlooked Russia’s subsequent defeats, revolution, and long period of weakness after WW1.

Another example is of predictions in the 1960s-70s about climate change, overpopulation issues, and oxygen depletion that did not come true. However, people tend to overlook the misses and remember the few “hits.”

The passage also discusses predictions made by economist Peter Schiff, noting that while he accurately predicted the 2008 financial crisis, he had been making similar predictions for years and missed significantly on specifics. It argues we should hold predictors to a higher standard of accuracy than random guessing before declaring them experts.

In summary, the passage cautions against bias in how we evaluate predictions, using historical examples to illustrate how predictions seen as amazingly prescient can be misses upon closer examination of the full context and passage of time.

  • Marian Keech, an American psychic, predicted that on December 21, 1954, a massive earthquake and flood would strike the west coasts of North and South America and destroy much of the US. She claimed aliens had warned her of this.

  • Keech and her followers, called the “Seekers”, planned to be rescued by a flying saucer at midnight on December 20th to avoid the catastrophe.

  • Psychologist Leon Festinger infiltrated the group posing as a layperson to study what would happen when the prophecy failed.

  • The Seekers, including Keech, Thomas and Daisy Armstrong, were ordinary midwestern Americans, not mentally ill, who believed in psychic messages and aliens based on popular sources at the time.

  • When December 21st came and nothing happened, Festinger was studying how the group would react and try to avoid seeing the failed prediction for what it was - wrong. He hypothesized their worldview and commitment to the belief would lead them to not acknowledge the failure.

So in summary, Festinger studied how committed believers like the Seekers would deal with a clear failed prophecy rather than admit their belief was wrong, contrary to rational expectations. This challenged psychological assumptions about how people update beliefs after disconfirmation.

  • In the lead up to December 21st, Keech’s following slowly grew to around 30 people, with 8 quitting their jobs due to strongly believing the prophecy.

  • On the night of December 21st, Keech and her followers gathered awaiting the flood. When midnight passed with nothing happening, they justified delays by saying time was symbolic.

  • As more time passed with no event, confusion and doubt set in. A new message then interpreted symbols in the prophecy to mean it had been fulfilled spiritually. Most accepted this new interpretation.

  • At 4:45AM, Keech received another message saying the group’s faith had prevented the catastrophe. The prophecy was proven right by averting disaster. This contradicted the prior symbolic interpretation but it was not mentioned.

  • Emboldened by “proving” the prophecy, the group joyfully began spreading the word. They rationalized small earthquakes as further confirmation, ignoring the clear failure of the major predicted event. Cognitive dissonance allowed the group to reconcile the failure of the prophecy with maintaining their belief in its truth.

  • Researchers conducted an experiment where they told some participants they were excellent at identifying real suicide notes from fake ones, others average, and others poor. But the results were actually randomly assigned.

  • When told the initial results were meaningless, participants’ self-assessments continued to reflect the initial false feedback. Those told they were excellent rated themselves higher than average, and vice versa.

  • Committed beliefs can survive logical or empirical challenges that would weaken less committed beliefs. People find ways to rationalize contradictory evidence.

  • In politics, the most knowledgeable partisans show the strongest biases. They are most committed to their beliefs and motivated to rationalize evidence that contradicts their views.

  • General John DeWitt insisted Japanese Americans would sabotage the US after Pearl Harbor, even when no sabotage occurred. He claimed the lack of evidence proved his point, showing extreme confirmation bias.

  • When a doomsday prophecy by Marian Keech did not come true, her most committed followers rationalized it rather than accepting they were wrong, as Leon Festinger had predicted based on cognitive dissonance theory.

  • Philip Tetlock conducted a study where he asked experts to make predictions about geopolitical events and then reviewed how accurate they were.

  • Those who were open about their failed predictions tended to be “foxes” who were skeptical of prediction. “Hedgehogs” who were confident in their ability dug in when shown to be wrong.

  • Psychologists note that when experts are wrong, it threatens their professional identity and causes cognitive dissonance. Experts are particularly good at rationalizing to resolve this dissonance.

  • Common forms of rationalization include finding ways the prediction could still be true by changing the timeframe, or misremembering what was originally predicted to align with reality. Memory is malleable and serves present needs.

  • Tetlock found experts systematically misremembered making much more accurate predictions about the Soviet collapse than they actually did, shifting their recalled predictions by 30-40 percentage points on average to resolve dissonance. Rationalization through memory distortion is an effective way for experts to believe they “saw it coming” all along when proven wrong.

  • Psychologist Philip Tetlock studied experts’ ability to make accurate predictions. He found that when outcomes differed from their predictions, experts suffered from “hindsight bias” - they remembered being more confident in the actual outcome than they really were.

  • Psychologist Baruch Fischhoff demonstrated hindsight bias experimentally by giving students information about a historical war and its outcome. Those who knew the outcome rated it as much more likely than those who did not know the outcome.

  • Hindsight bias is universal, but experts are particularly susceptible because they are able to generate plausible explanations for outcomes after the fact. Experts also have more reputational and financial stakes in their predictions.

  • Journalist and social critic James Howard Kunstler was an advocate for the idea that societal collapse from issues like suburbia and consumerism was imminent. He believed the Y2K computer bug would help trigger this collapse through problems in business, government, and the real economy by mid-1999.

  • When Y2K caused only minor issues and no major disruption, Kunstler maintained that he was still essentially right because of broader long-term societal and economic trends that played out over the following decade, of which he saw Y2K as an early symptom or trigger. He exhibited hindsight bias in remembering his prediction.

  • In the early 1970s, economist Robert Heilbroner painted a very bleak picture of the human future in his book “An Inquiry into the Human Prospect.”

  • He warned of overpopulation leading to widespread social disorder and famine in the developing world. He saw only two options - authoritarian governments holding things together through force, or global wars and conflicts over resources.

  • He also predicted resource depletion, environmental pollution getting worse, and industrial activity potentially overheating the planet. He said industrial decline was inevitable within a generation or two.

  • He believed this scenario would lead to enormous suffering and transitions to authoritarian regimes even in developed countries, as only such systems could guide societies through the difficult times. He saw Maoist China as the closest model.

  • However, he allowed that humanity might survive, but only by adopting social and economic systems very different from the present ones, and at the cost of much suffering during the transition period.

  • Significantly, when reissuing his book in 1980 and 1991, Heilbroner added commentary noting that certain predictions had not come to pass as direly as forecast, though many risks remained. This showed some acknowledgment that his original vision may have been too pessimistic.

  • In An Inquiry into the Human Prospect, published in the early 1970s, Robert Heilbroner predicted many grim outcomes for humanity due to population growth, resource scarcity, environmental damage, and other pressures.

  • In subsequent years and retrospectives, he acknowledged that some specifics had changed, such as population growth rates, but maintained that the overall outlook remained just as grim. Even when facts directly contradicted his predictions, he did not admit being wrong.

  • Lord William Rees-Mogg co-authored books in the late 80s/90s predicting a breakdown of the global economic and political order due to new technologies empowering violence and fragmentation.

  • They predicted events like the fall of the Soviet Union, but were off on timing for things like crashes. Rees-Mogg believes they got the overall trends and vulnerabilities right, if not precise timing.

  • Critics argue few would agree they predicted major world events as accurately as Rees-Mogg believes. Their visions of bloodshed, disintegration of nations, and decline into a new dark age did not fully materialize.

So in summary, both maintained grim outlooks even when facts did not support predictions, with Heilbroner unwilling to admit errors and Rees-Mogg believing they captured trends if not precise timing. Critics argue their visions were too extreme.

Here is a summary of key points about Paul Ehrlich and his book The Population Bomb:

  • In the late 1960s, Ehrlich warned that population growth was outpacing global food production capacity. He predicted widespread famines in the 1970s that would cause hundreds of millions to starve to death.

  • Ehrlich argued that by 1968, with world population having passed 3 billion, it was already too late to prevent a “substantial increase in the world death rate” through efforts to lower birthrates or increase food production. Mass starvation was inevitable.

  • The main way to judge Ehrlich’s predictions is by what actually happened to global death rates in the 1970s. Contrary to his predictions, death rates did not substantially increase and worldwide famines causing hundreds of millions to starve to death did not occur.

  • While some local famines took place, global food production managed to outpace population growth through new technologies, fertilizers, and other advances. Death rates generally continued their long-term decline worldwide.

  • Ehrlich’s time-bound predictions of global catastrophe in the 1970s due to overpopulation were clearly and dramatically wrong. However, he is credited with raising awareness of population issues and environmental stresses.

  • In his 1968 book The Population Bomb, Paul Ehrlich predicted there would be massive global famines in the 1970s and 1980s due to overpopulation and a shortage of resources. He said this would lead to a substantial increase in the global death rate.

  • However, according to UN data, the global death rate declined substantially over those decades, contrary to Ehrlich’s prediction. There was no general rise in deaths from hunger.

  • When confronted with this failure of his prediction, Ehrlich downplays what he wrote and argues minor famines still occurred, ignoring the failed core prediction of a rising global death rate.

  • Ehrlich refuses to acknowledge any major errors in his predictions. He claims issues like the Green Revolution made famines less severe than expected, but not that his main prediction was wrong.

  • Julian Simon challenged Ehrlich’s views, believing human ingenuity would solve resource problems. They made a famous bet where Ehrlich lost, having to pay Simon as metal prices fell rather than rose in the 1980s as Ehrlich expected.

  • Even after losing the bet, Ehrlich continues to dismiss and downplay the failures of his predictions rather than acknowledge being clearly wrong. Cognitive dissonance prevents him from accepting his inaccuracies.

  • Simon refused to accept bets proposed by Ehrlich that involved indirect measures, such as atmospheric gas levels or harvest per person, saying he would only accept direct measures of human welfare like life expectancy or purchasing power. The exchange showed that both men used extreme rhetoric at times.

  • In his books, Simon publicly offered to bet that mineral and commodity prices would not rise, adjusted for inflation, over future years, letting challengers choose any resource not under government control. This amounted to a bold claim that prices would generally fall, not merely fail to rise, over any time period. The claim held up in the 1980s and 1990s, but commodity prices rose in the 2000s, contradicting Simon.

  • Both Ehrlich and Simon attracted devoted followers committed to their predictions. Followers went to great lengths to rationalize when predictions proved wrong, similar to how some still claim Marx was only off on timing. The determination to deny mistakes can exceed that of the experts themselves.

  • Predictions about the future can become like religious dogmas for some followers, asserted without concern for evidence refuting them. Nothing seems able to change believers’ minds once committed to a view, like some modern Communists still claiming socialism’s ultimate victory.

  • Many experts have made confident predictions about the future that turned out to be wrong. Examples given include predictions by Jacques Attali, George Friedman, and H.G. Wells.

  • Hindsight bias causes us to exaggerate how predictable the past was and see less uncertainty than actually existed. This contributes to an illusion that the future is more uncertain now than in the past.

  • The phrase “age of uncertainty” has been used going back over a century to describe different time periods, showing there is nothing uniquely uncertain about the present.

  • Rapid social changes and crisis events increase our hunger for predictions, even though prediction is very difficult. Having children also increases the desire to predict what the future will hold for them.

  • Predictions, whether optimistic or pessimistic, satisfy the desire for certainty about an uncertain future. We are prone to believe confident predictions that resonate with us, without adequately scrutinizing past predictive accuracy.

  • While some level of prediction is necessary in daily life and for planning, the track record of experts shows true prediction of complex social and geopolitical changes is very difficult and often impossible. Our biases can cause us to overestimate predictability.

This passage makes several key points about prediction and uncertainty:

  • Prediction is an inherent and unavoidable part of human decision-making and planning for the future. Almost everything we do is based on implicit or explicit assumptions about what will happen.

  • Some predictions are quite reliable based on large datasets (e.g. insurance risk analysis, traffic fatality forecasts). But uncertainty is always present - rare events can occur.

  • Predicting complex social phenomena far into the future is exceptionally challenging due to uncertainty, nonlinearity, and psychological biases. Accuracy declines substantially with time horizon.

  • Humility is important when making predictions given the inherent uncertainty. Overconfidence in predictions can have negative consequences, as seen with George W. Bush’s invasion of Iraq.

  • Being skeptical of predictions does not preclude decision-making, but encourages caution. Rigidly acting on predictions without considering uncertainty can have harmful effects if the predictions prove wrong, as in the case of proposals to cut off food aid to countries facing famine in the 1970s.

In summary, the passage advocates making the best predictions possible but with an appreciation of limitations and uncertainty. Overconfidence can be dangerous, while skepticism need not paralyze decision-making if it encourages contingency planning for alternative outcomes.

  • Alan Barnes wanted to systematically evaluate the accuracy of the political and economic forecasts generated by his intelligence unit in the Canadian government. However, historically the unit had not tracked forecast accuracy, as their clients had not demanded accountability on that metric.

  • Barnes recognized that without analyzing forecast accuracy, the unit had only a subjective sense of how good their predictions were based on anecdotal feedback when forecasts turned out wrong or right. But there was no objective, systematic analysis.

  • This lack of tracking forecast accuracy is a common issue for intelligence agencies. Their clients typically don’t demand accountability on this metric, so agencies don’t prioritize evaluating their own prediction performance.

  • Barnes was not satisfied with this subjective approach and only having a vague sense of forecast quality. He wanted to systematically analyze the unit’s forecast accuracy to establish objective metrics on how good their predictions actually were. This would provide meaningful accountability that was previously missing.

So in summary, Barnes set out to systematically track and evaluate the accuracy of his intelligence unit’s political and economic forecasts, something it had never done before, largely because its clients had never demanded that kind of accountability.

Here are the key points from the summary:

  • Barnes created a numerical scale from 0-10 to reduce ambiguity in probability terms like “likely” and “probable”. This allowed their forecasts to be statistically analyzed.

  • He and a psychologist tested 580 predictions made by Barnes’ unit over 18 months against real outcomes. The results showed very good calibration, nearly as good as that of meteorologists (a rough sketch of such a calibration check appears after this list).

  • When questioned, Barnes expressed skepticism about the results rather than boasting. He emphasized ongoing self-criticism at the unit.

  • Three factors contributed to the unit’s accuracy: aggregation of diverse information, metacognition/self-reflection, and knowledge of cognitive biases to overcome them.

  • Aggregation combined judgments from many sources, canceling out individual errors. Metacognition involved consciously examining judgments. Knowledge of biases helped spot their own thinking traps.

  • Barnes constantly pushed analysts to question conclusions, consider alternative views, and reflect on how biases may have influenced their judgments. This helped achieve an unusually high level of forecasting calibration.

  • Three elements are essential for sound judgment and avoiding overconfidence in predictions: analysis (rigorous consideration of evidence), self-criticism (introspection about one’s assumptions and limitations), and humility (acknowledging uncertainty and fallibility).

  • Experts like Soros, Cable, and Buffett embody these traits through their acknowledgement of complexity, willingness to correct mistakes, and avoidance of absolute certainty. Their strong records are attributed to these philosophies rather than just intelligence.

  • Predictions are strictly limited by our imperfect knowledge and the nonlinear, unpredictable nature of the world. While scientific understanding progresses over time, uncertainty can never be fully eliminated.

  • However, the author acknowledges that future advances may someday overcome current limits on prediction. Bruce Bueno de Mesquita claims his models can predict politics with 90% accuracy, though the author remains skeptical due to ignored factors like culture and personalities.

  • In summary, while predictions are difficult, embracing analysis, self-criticism and humility can improve judgment by guarding against overconfidence in an uncertain world. Future scientific progress may expand predictive abilities but currently faces limitations.
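To make “calibration” concrete, here is a minimal sketch, not taken from Barnes’ study, of how forecasts scored on a 0-10 confidence scale can be checked against real outcomes. The data and names below are hypothetical; a well-calibrated forecaster’s stated probabilities roughly match the observed frequencies.

```python
from collections import defaultdict

# Hypothetical forecasts: (stated confidence on a 0-10 scale, did the event happen?)
forecasts = [
    (9, True), (9, True), (9, False), (8, True), (8, True),
    (7, True), (7, False), (3, False), (3, False), (2, True),
    (1, False), (1, False), (0, False),
]

# Group outcomes by the probability implied by each confidence score.
buckets = defaultdict(list)
for score, happened in forecasts:
    implied_probability = score / 10.0
    buckets[implied_probability].append(happened)

# Calibration check: events given probability p should occur roughly
# p of the time. Large gaps in either direction signal over- or
# under-confidence.
for probability in sorted(buckets):
    outcomes = buckets[probability]
    observed_rate = sum(outcomes) / len(outcomes)
    print(f"stated {probability:.0%} -> observed {observed_rate:.0%} "
          f"({len(outcomes)} forecasts)")
```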

This passage summarizes and critiques the claims of Bruce Bueno de Mesquita, a political scientist who uses game theory to make predictions about political trends. Some key points:

  • Bueno de Mesquita claims a 90% accuracy rate in predictions, and is touted by clients like Fortune 500 companies and intelligence agencies. However, such testimonials are not compelling evidence on their own.

  • Bueno de Mesquita has made thousands of predictions over time, so some were bound to be right just by chance even with little predictive power. Anecdotes don’t prove his methods work.

  • The original source of the 90% figure referred to a small study from the 1980s with limited scope. It’s not strong evidence to accept the 90% claim without more rigorous testing.

  • Making reliable political predictions could be enormously valuable, but like a claimed cure for cancer, demands proper scientific evidence rather than just anecdotes. Bueno de Mesquita has not produced such evidence to substantiate his accuracy claims.

  • In general, predictions are not held to the same standards of evidence as other scientific claims, but they should be in order to separate valid forecasting methods from useless ones.

This passage discusses the challenges of accurately predicting the future given the complex interactions between various contingencies and uncertainties. Some key points:

  • The future is determined by an “almost infinite array” of interacting contingencies, making certainty about outcomes difficult if not impossible.

  • While futurists and experts often say predicting is foolish, they still regularly make predictions that end up being wrong. It’s hard to resist peddling certainties.

  • James Fallows’ essay on possible futures is praised as a model, as it considers a wide range of possibilities without making firm predictions. It explores values and choices rather than certainties.

  • The best we can do is study issues, think through choices, and hope for luck. Earlier generations faced similar uncertainties about the future, even if they didn’t realize it.

  • The future remains unknown. While experts analyze possibilities, ultimately “we’ll know when we see the cards.” Making predictions is “ridiculous” given all the contingencies at play.

So in summary, the passage cautions against claiming certainty about an inherently uncertain future, argues the limitations on prediction, and advocates thoughtful consideration of possibilities rather than definitive forecasts. The complexity of interacting factors means the future remains fundamentally unpredictable.

Here are the key points from the summaries:

  • In 1962, Fairfield Osborn’s edited book Our Crowded Planet discussed overpopulation as a menace.

  • Toynbee conceived of a “universal state” as a literal empire, whereas even at the peak of its power in 2002, the US was not a literal empire but a sole superpower without complete global governance.

  • In 1952, Time magazine discussed the idea of a world-wide state.

  • Toynbee wrote about whether a world-wide state was feasible, and co-wrote a book with Kei Wakaizumi in 1971 called Surviving the Future.

  • Gary Marcus’ 2008 book Kluge discussed the haphazard construction of the human mind.

  • Studies have found carrying a baby photo can decrease dishonest behavior and influence outcomes in small claims court cases.

  • Accuracy in dart throwing decreases when a baby photo is pinned to the target.

  • Studies have found positive associations with paranormal beliefs like ghosts, UFOs, and magic.

  • Baby faces have been found to influence impressions and judgments in various contexts.

  • An original calculation of annuity risks can be found online.

  • Studies have examined magical thinking relating to craps and luck with possessions like iPods.

  • Ellen Langer studied the illusion of control in various contexts.

  • Meta-analyses have examined the illusion of control effect.

  • Dawkins discussed science and religion in Unweaving the Rainbow.

  • Neuroscience research has examined hypothesis formation in brain hemispheres.

  • Gazzaniga discussed the mind’s past in his 1998 book.

  • Research has examined overconfidence and calibration in judgments involving probability and risk.

Here are summaries of the research papers and articles referenced:

  • “The Effects of Stress and Desire for Control on Superstitious Behavior” (Giora Keinan, 2002) studied how stress and a desire for control impact superstitious behaviors.

  • “Lacking Control Increases Illusory Pattern Perception” (Whitson and Galinsky, 2008) found that lacking a sense of control increases a person’s tendency to perceive illusory patterns.

  • “The Psychological Benefits of Superstitious Rituals in Top Sport” (Schippers and Van Lange, 2006) looked at how superstitious rituals can psychologically benefit high-performing athletes.

  • Studies by Sales (1973), Padgett and Jorgenson (1982), and McCann (1999) all examined relationships between economic/societal threats and increases in authoritarianism or religious participation.

  • Research on obedience to authority showed people tend to comply with directives from figures of authority like doctors even if they conflict with their own judgment (Milgram 1974; Hofling et al. 1966).

  • Research on social influence found people are impacted by others’ expressions of confidence even when wrong (London et al. 1970; Zarnoth and Sniezek 1997; Cutler et al. 1988).

  • Studies found people view those expressing more confident judgments as better forecasters, even if inaccurate (Yates et al. 1996; Price and Stone 2004).

  • Research on memory biases and retrospective sense-making showed people rewrite their memories and perceptions to maintain coherent views of themselves (Knox and Inkster 1968; Ross et al. 1975; Offer et al. 2000).

  • Studies demonstrated people’s inability to foresee outcomes and tendency to view outcomes as more foreseeable in hindsight (Fischhoff 1975; Roese and Maniar 1997).

Here is a summary of the provided text:

The passage discusses past predictions about potential societal collapse or crises from overpopulation, resource depletion, pollution, and other issues. It mentions predictions from authors like Roberto Vacca in 1973, who warned of systems becoming uncontrollable and advanced nations experiencing impersonal apocalypse with hundreds of millions dead between 1985-1995. Paul Ehrlich is discussed, noting his predictions of famine, disease outbreaks, and climate change in books like “The Population Bomb” were more nuanced than commonly portrayed. While some predictions may prove roughly accurate, many were limited, vague, or failed to account for scientific and technological progress. The passage questions population and resource depletion warnings from the 1960s-70s and argues events did not match dire predictions. Overall it provides context around past predictions of doom and examines how some held up better than others with the benefits of hindsight.

  • The passage discusses the decline in traffic fatalities in the 1970s, noting debate over how much of the decline was caused by higher gas prices versus lower speed limits.

  • For the purposes of the discussion, the author says it doesn’t matter which factor specifically caused the decline, as both were unexpected outcomes at the time.

  • The passage cites two books that influenced Paul Ehrlich’s views on population issues - Road to Survival and Our Plundered Planet. It notes these books contained predictions that seemed questionable even by 1968.

  • Specifically, one book predicted famine would stalk the streets of London and Japan by a certain time. Ehrlich repeated these forecasts in his own writings but the predictions did not come true.

  • The passage suggests this illustrates how even influential thinkers can repeat questionable predictions without fully considering alternatives or complications. Their influence then contributes to forming misguided views.

Here are brief summaries of the selected references:

  • Horowitz, Daniel, ed. Jimmy Carter and the Energy Crisis of the 1970s. Analyzes President Jimmy Carter’s response to the 1970s energy crisis in the US.

  • Independent Commission on International Development Issues (the Brandt Commission). North-South: A Program for Survival. Laid out a plan to address economic inequality between wealthy Northern and developing Southern nations.

  • Ishihara, Shintaro. The Japan That Can Say No. Argued Japan needed to be more assertive on foreign policy and less dependent on the US.

  • Jacobs, Jane. Dark Age Ahead. Warned of looming social, economic, and environmental problems facing industrialized nations.

  • Janis, Irving. Victims of Groupthink. Analyzed “groupthink” phenomenon where desire for conformity impacts rational decision making.

  • Jevons, W. Stanley. The Coal Question. Early study of problems with long-term dependence on coal resources in Britain.

  • Kahn, Herman. The Next 200 Years. Speculative forecasts for major technological and geopolitical changes over the next two centuries.

  • Kahn, Herman, and Anthony Wiener. The Year 2000. Speculative forecasts for changes and issues facing Americans by the year 2000.

  • Kennedy, Paul. Preparing for the Twenty-first Century. Discussed major global trends and challenges facing countries in the 21st century.

  • Kennedy, Paul. The Rise and Fall of the Great Powers. Analyzed rise and fall of major historical powers between 1500-1980.

  • King-Hele, Desmond. The End of the 20th Century? Speculative forecasts for political, economic, and social changes by end of 20th century.

  • Kotkin, Joel. The Next Hundred Million: America in 2050. Forecast dramatic population growth in the US and implications by 2050.

  • Kunstler, James Howard. The Long Emergency. Argued convergence of global oil depletion and climate change would cause severe disruption.

The summaries provide a brief high-level overview of the main topics, arguments, or speculative forecasts put forth in each cited work.

Here are brief summaries of the items you listed:

  • Carnegie Council - International nonprofit organization dedicated to advancing international cooperation and public engagement on global issues.

  • Johnny Carson - American television host and comedian, best known for hosting The Tonight Show Starring Johnny Carson from 1962 to 1992.

  • Rachel Carson - American marine biologist whose book Silent Spring was influential in the environmental movement.

  • Jimmy Carter - 39th President of the United States, known for advocacy of human rights and negotiations of Camp David Accords.

  • Neil Cavuto - American television journalist, currently senior vice president and managing editor of Business News at Fox News Channel.

  • Edgar Cayce - American psychic who claimed ability to channel answers for individuals and predicted future worldwide events.

  • Martin Ceadel - British historian known for works on pacifism and theories of international conflict.

  • Stephen Ceci - American psychologist known for research on intelligence, memory, and biases and stereotypes in science.

  • Central Intelligence Agency (CIA) - Major American intelligence agency, tasked with gathering foreign intelligence.

  • Chaos theory - Area of mathematics focused on complex patterns and sensitive dependence on initial conditions in dynamic systems.

  • Dick Cheney - American politician, served as Vice President of the United States from 2001 to 2009 under George W. Bush.

  • China - East Asian country with one of the world’s oldest civilizations and largest economies. Underwent rapid modernization and economic growth in recent decades.

  • Pat Choate - American economist and former US Vice Presidential candidate, known for critiques of globalization and offshoring.

  • Winston Churchill - British politician and statesman, Prime Minister of the UK during WW2 and instrumental in Allied victory. Won Nobel Prize for Literature.

  • Robert Cialdini - American psychologist and author known for his research/books on compliance, influence, and negotiation techniques.

  • G. N. Clark - British historian of the early 20th century, known for his work on seventeenth-century Europe.

  • John Bates Clark - American economist known for neoclassical theories of marginal productivity and distribution of income to factors of production.

  • Arthur C. Clarke - British science fiction author and futurist, best known for proposing geostationary communications satellites and popularizing ideas like the space elevator and undersea exploration. Co-wrote the screenplay for 2001: A Space Odyssey.

  • R. D. Clarke - British actuary whose 1946 analysis showed that V-1 flying-bomb hits on London followed a random (Poisson) distribution, a classic illustration of how people see patterns in randomness.

  • Climate change - Increase in average surface temperatures due to human-caused emissions of greenhouse gases, with impacts on global weather patterns, sea levels, agriculture. Significant scientific and political issue.

  • Hillary Clinton - Former US Secretary of State, US Senator from New York, and 2016 Democratic presidential candidate. Highly influential figure in American liberal politics.

  • Cognitive dissonance - Psychological concept referring to mental stress from simultaneously holding contradicting attitudes, beliefs or behaviors. Theorized by Leon Festinger.

  • Joel Cohen - American population biologist known for work on human population growth, the Earth’s carrying capacity, and finite resources; author of How Many People Can the Earth Support?

  • Cold War - Period of geopolitical tension between capitalist West led by US and communist East led by Soviet Union, lasting from roughly 1947 to 1991.

  • R. G. Collingwood - British philosopher of history known for works on concept of historical thought and limitations of rationalism. Influenced historiography.

  • The Coming Dark Age (Vacca) - 1973 book by Roberto Vacca predicting that overloaded, interdependent technological systems would break down, plunging advanced nations into chaos and a new dark age.

  • The Coming War with Japan (Friedman) - 1991 book by George Friedman predicting war between Japan and US over resources and Japanese revisionism. Did not come true.

  • Computing power - Technological advances that vastly increased power of computers and information processing since 1940s. Facilitated new applications across science and industry like simulations, 3D modeling, machine learning. Related to futurist predictions on computational limits.

  • Condi vs. Hillary (Morris) - 2005 political book by Dick Morris predicting the 2008 presidential race would pit Condoleezza Rice against Hillary Clinton. Did not come to pass.

  • Confidence of predictors - Degree of certainty asserted in predictions, which studies find often exceeds actual reliability and calibration of predictions. Overconfidence bias.

  • Confirmation bias - Tendency to search for or favor information that confirms preexisting beliefs, and discounts information that contradicts them. Impacts interpretation of evidence.

  • Conspiracy theories - Explanations of significant political, social or historical events as results of secret plots by powerful actors rather than public explanations. Often politically motivated.

  • Contingency - Element of chance in historical events, alternative possible outcomes from decisions or circumstance. Contradicts inevitability hypotheses.

  • Alistair Cooke - British-American journalist, radio personality, and television host best known as host of the “Letter from America” radio program from 1946 to 2004.

  • Counterfactuals - Hypothetical alternative histories used to argue what might have happened if some event or events did not occur as they did in actual history.

  • Crash Proof (Schiff) - 2007 book by investor Peter Schiff warning of impending US economic crisis due to housing bubble and trade deficit. Correctly predicted crash and advocated gold.

  • Michael Crichton - American author known for medical/science thrillers like Jurassic Park exploring emerging technologies. Also wrote extensively critiquing environmental politics.

  • Croesus, King of Lydia - Legendary 6th century BC king of Lydia known for his great wealth. Proverbial for unsafe reliance on fortune and worldly goods.

  • Richard Crutchfield - American psychologist known for research on conformity and social influence.

  • The Cultural Contradictions of Capitalism (Bell) - 1976 book by sociologist Daniel Bell theorizing social/cultural tensions between capitalism’s growth needs and post-materialist values.

  • The Culture of Narcissism (Lasch) - 1979 book by historian Christopher Lasch theorizing rise of consumerism and “me-generation” narcissism in postwar America.

  • Daniel Defoe - British novelist best known for the 1719 novel Robinson Crusoe, which helped popularize the fictional Robinsonade story.

  • Data mining - Process of extracting useful patterns from large datasets involving methods from statistics, machine learning, and database systems. Facilitated by increased computing power.

  • David Gross - American theoretical physicist who shared the 2004 Nobel Prize in Physics for discovery of asymptotic freedom in gauge theory.

  • James Dale Davidson - American financial author known for recommendations on gold and commodities; co-authored The Great Reckoning and The Sovereign Individual with William Rees-Mogg, predicting economic upheaval and a shift of power to individuals.

  • Nick Davies - English investigative journalist and author known for critiques of media practices and outsourcing journalism to public relations firms.

  • Richard Dawkins - British evolutionary biologist known for popularizing gene-centered view of evolution in books like The Selfish Gene. Also known for critiques of religion.

  • The Decline of the West (Spengler) - 1918-1922 book by German philosopher Oswald Spengler theorizing all civilizations follow universal life-cycle model from birth to maturity to old age and death. Very influential.

  • Demography - Statistical study of human populations, especially relating to size, density, age, fertility, mortality, distribution. Important factor in social, economic, and geopolitical predictions.

  • Asian demographic shifts - Trends of aging and low birth rates impacting Asian countries like China, Japan, South Korea due to development levels. Geopolitical implications.

  • British Empire - predictions on collapse due to demography and challenge from emerging powers like Germany, US, Japan in early 20th century.

  • Heilbroner on demography of communism - Theories connecting demography and Marxist historical materialism.

  • Impact of war on demography - Wars like WW1 and WW2 greatly impacted global and regional population age structures and growth rates.

  • Demography and money managers - cohort effects from “baby boom” generation impacted financial markets and economic trends.

  • Demography and The Population Bomb - Ehrlich’s thesis linked population growth in developing nations to scarcity predictions.

  • Post-World War II baby boom - temporary spike in birth rates causing a “baby boom” generation from 1946-1964 in West impacted various socioeconomic trends.

  • Demography and uncertainty - Changes in birth/death rates over time make long-term demographic projections uncertain.

  • United Nations population estimates/projections - UN took lead role in researching demography and population issues in post-war period.

  • Harry Dent - American financial author known for economic and stock market predictions based on demographic trends and generational spending cycles.

  • John DeWitt - US Army general who oversaw the internment of Japanese Americans during World War 2, insisting that the very absence of sabotage proved it was coming.

  • Jeane Dixon - 20th century American psychic and astrologer who falsely predicted deaths of several world leaders based on astrological charts.

  • Dow 30,000 by 2008 (Zuccaro) - Book by Robert Zuccaro predicting the stock market would rise to 30,000 by 2008. The Dow never came close to that level by the deadline.

  • Dow 36,000 (Glassman and Hassett) - 1999 book by James Glassman and Kevin Hassett arguing stocks were dramatically undervalued and the Dow Jones Industrial Average would soon rise to 36,000. Did not happen as anticipated.

  • Earthquakes - Sudden seismic shaking or trembling of the ground caused by movement of tectonic plates or other processes in the Earth’s crust. Occurrence patterns studied but not predictable with certainty. Impact disaster predictions and scenarios.

  • “Economic Possibilities for Our Grandchildren” - 1930 essay by John Maynard Keynes envisioning early 21st century living standards due to technological progress and reduction of work hours needed for sustenance. Overly optimistic.

  • Timothy Egan - Pulitzer Prize-winning American journalist and author known for books on American West and environmental issues like The Worst Hard Time about Dust Bowl era.

  • Anne Ehrlich - American biologist known for collaborations with husband Paul Ehrlich on population issues and environmentalism. Co-author of books like The Population Explosion and The Population Bomb.

  • Paul Ehrlich - American biologist and early environmentalist known for theorizing human overpopulation and scarcity of resources in books like The Population Bomb. Made many failed predictions but raised awareness. Assessed extensively in this summary.

  • Europe - Second world war and Cold War divided continent, but post-1989 saw increased cooperation and integration with European Union, challenges from immigration and right-wing populism, debates over social systems versus free markets. Impacts geopolitical and economic predictions across 20th century.

  • European Union - supranational union of (currently) 27 member states located primarily in Europe. Significant geopolitical actor impacting economic and trade policies across continent, debates over cross-border governance. Predicted to face internal stresses.

  • Europe’s Optical Illusion (Angell) - 1909 book by Norman Angell, later expanded as The Great Illusion, arguing that the economic interdependence of modern nations made war between the great powers futile and self-defeating. World War I broke out five years later.

  • Evolution - Scientific theory of biological change over generations through natural selection and common descent. Impacted social thought in 19th century, debates over progress and societies, environmentally-driven scenarios and views of human nature. Related to Ehrlich’s population predictions.

  • Extraordinary Popular Delusions and the Madness of Crowds (MacKay) - 1841 book by Scottish journalist Charles MacKay examining historical financial bubbles, crusades, witch-hunts to argue crowds prone to insanity and folly more than reason. Influential.

  • Fads and Fallacies in the Name of Science (Gardner) - 1952 book by American skeptical intellectual Martin Gardner critically examining pseudosciences and paranormal/supernatural claims. Influenced skeptic community.

  • Richard Falk - American professor of international law known for criticisms of American foreign policy, global warming policy, “demonization” of Iran, Israel, and Palestine conflicts.

  • James Fallows - American journalist known for writings on national politics and foreign policy in publications like The Atlantic. Criticized American hubris after Vietnam War. Adviser to Carter administration.

  • Famine 1975! (Paddock and Paddock) - 1967 book by William and Paul Paddock predicting massive famines in developing nations by 1975 due to overpopulation and inability to increase food production. Failed to happen on predicted scale and timeline.

  • Cindy Fazey - British social psychologist known for experimental research on fear appeals, reactance, hypocrisy, and influencing pro-environmental behaviors.

  • Fertility rates - Average number of children born per woman over her lifetime (total fertility rate). Important demographic factor impacting population growth rates and trends like aging. Some nations tried to control fertility through policy in the 20th century.

  • Leon Festinger - American social psychologist who proposed theory of cognitive dissonance to explain how people strive for internal psychological congruence. Highly influential theory.

  • Financial crisis - Panics or recessions in credit/banking systems often precipitated by debt defaults, collapsing asset bubbles, bank runs. Major crises occurred in 1907, 1929 Great Depression, early 1990s, 2001, 2007-2009, impacted markets, economies.

  • Baruch Fischhoff - Israeli-American psychologist known for seminal decision research on risk perception, communication, and management. Development of analytic approaches.

  • H. A. L. Fisher - Oxford historian influential in the early 20th century, author of A History of Europe, famous for denying that history reveals any plot, rhythm, or predetermined pattern. Highly nuanced, non-ideological historiography.

  • Irving Fisher - Leading American economist of early 20th century known for theories of interest rates, debt deflation, and severe impact of Great Depression which bankrupted him personally.

  • Kenneth Fisher - American investment manager, economist, author known for macroeconomic predictions, market forecasts through his Fisher Investments firm and publications like The Only Three Questions that Count.

  • Robert Fogel - American economist who did influential cliometrics research on economics of slavery, railroads, health/nutrition impacting productivity. Shared 1993 Nobel Memorial Prize in Economics.

  • Steve Forbes - American publisher, editor-in-chief of Forbes magazine since 1990, and two-time candidate for the Republican presidential nomination (1996, 2000), advocating a flat tax and supply-side economics.

  • Gerald Ford - 38th president of the United States following resignation of Nixon. Modest energy and economic problems but no major crises during brief tenure from 1974-1977. Historians rate below average.

  • Bertram Forer - American psychologist who conducted seminal experiments showing subjects’ inability to discern general, ambiguous personality assessments from customized ones for themselves. “Forer Effect.”

  • Jay Forrester - American systems scientist, electrical engineer, professor, known as pioneer of system dynamics method and worlds modeling using feedback loops. Influenced Club of Rome modeling efforts.

  • Foxes - In the fox/hedgehog dichotomy drawn from Isaiah Berlin and applied by Philip Tetlock, thinkers who draw on many ideas, tolerate uncertainty, and consider alternative viewpoints rather than forcing events into one big theory. Used to characterize skeptical, self-critical forecasters.

  • Foxes contrasted with hedgehogs - Hedgehogs focus narrowly on one organizing idea, while foxes weigh more alternatives and uncertainty. Philip Tetlock adopted the dichotomy to distinguish forecasting styles, finding that foxes were better calibrated than hedgehogs.

  • George Friedman - Geopolitical forecaster, professor, and founder of the private intelligence firm Stratfor, known for long-term predictions on world events and power shifts. Mixed track record.

  • The Future as History (Heilbroner) - 1961 book by economist Robert Heilbroner on visions of future societies and the role of accelerating technological change driving them, arguing the unknowability of long-term outcomes. Influenced societal futures debates.

  • The Future of Everything (Orrell) - 2007 book by complexity scientist David Orrell exploring deep uncertainties in long-term predictions through diverse scientific, technological, geopolitical factors and nonlinear dynamics. Argued true risks impossible to quantify precisely.

  • “The Future of Population” (Cohen) - 1995 article in Scientific American by Joel Cohen predicting Earth’s population would peak mid-21st century at 9-10 billion then stabilize or decline due to fertility rates leveling across nations. Highly accurate forecasts.

  • Future Shock (Toffler) - 1970 book by futurist Alvin Toffler popularizing concept of “future shock” from rapid socioeconomic and technological changes overwhelming individuals, societies. Highly influential.

  • Futurists - People who make explicit predictions about or envision possible future conditions of technology, society, economics or make proposals for change to influence the future. Mixed track record historically.

  • John Lewis Gaddis - American historian focused on Cold War foreign policy, grand strategy, and the influence of personality on politics. Coined the phrase “the long peace” for the absence of great-power war after 1945.

  • John Kenneth Galbraith - Canadian-American economist notable for works on industrial organization, inequality, and the military-industrial complex, and for popularizing economics through books and the television series The Age of Uncertainty. Adviser to Democratic politicians.

  • Gambling - Engaging in games of chance for potentially monetary stakes also connected to human behaviors like risk-taking, seeking knowledge of future, overconfidence in personal intuitions/abilities. Related to predictive decision-making under uncertainty.

  • Game theory - Branch of applied mathematics analyzing strategic interactions between parties where outcome depends on choices others make. Wide applications to economics, politics, evolutionary biology, military theory.

  • Martin Gardner - American mathematical recreational author known for “Mathematical Games” column explaining science/pseudoscience to lay audiences. Promoted scientific skepticism towards pseudoscience.

  • Paul Gascoigne - English former professional footballer who played as an attacking midfielder. His erratic behavior drew media/public fascination both positively and negatively.

  • Michael Gazzaniga - American cognitive neuroscientist who studied split-brain patients, proposed concept of left-brain interpreter module trying to achieve coherence of human experience. Leading theorist of brain lateralization.

  • Germany - Economic powerhouse of Europe but source of destabilizing conflicts pre-1945 due to militarism/revanchism of imperialistic elites. Post-war West prospered in transatlantic alliance while Communist East collapsed in 1989.

  • Pieter Geyl - Dutch historian known for his long-running debate with Arnold Toynbee over whether history follows predictable patterns, and for Napoleon: For and Against.

Here are summaries of some of the key terms:

  • Albert - Likely refers to Albert Michelson, the physicist who helped discover that the speed of light is constant.

  • George Edward Scott - British historian known for his studies of 17th century Europe.

  • June Scott - Likely refers to June Scott, an American anthropologist known for her work on Central and South American indigenous groups.

  • the Seekers - The small group of believers gathered around Marian Keech whose failed doomsday prophecy Leon Festinger and his colleagues studied in When Prophecy Fails.

  • seer-sucker theory - J. Scott Armstrong’s observation that no matter how much evidence shows seers cannot forecast, “suckers” will keep paying for their predictions.

  • self-interest - The concept that people are primarily motivated by their own interests.

  • self-negating prophecies - Predictions that, if widely believed, could cause actions that prevent the prediction from occurring.

  • Robert Sencourt - British writer and historian who wrote biographies of Marie Antoinette and Napoleon.

  • The Shape of Things to Come - Science fiction novel by H.G. Wells predicting future world wars and governance by technocrats.

  • Robert Shiller - American economist known for work on market volatility and for warning of the dot-com and housing bubbles.

  • Matthew Simmons - Investment banker who warned of peak oil and wrote about climate change risks.

  • Julian Simon - Economist who argued fears of resource scarcity and overpopulation were exaggerated.

  • Gordon Sinclair - Canadian journalist, satirist, commentator known for predictions about social trends.

  • skepticism - Attitude of doubt regarding claims or beliefs that lack evidence.

  • B.F. Skinner - American psychologist who developed radical behaviorism.

  • Paul Slovic - Psychologist who studies risk perception and decision-making.

  • F.E. Smith - British lawyer and politician, 1st Earl of Birkenhead, whose 1930 book The World in 2030 offered confident predictions about the coming century.

  • Socrates - Ancient Greek philosopher who taught through questioning and is considered one of the founders of Western philosophy.

  • George Soros - Hungarian-American investor, philanthropist, author who uses his wealth to support liberal causes.

  • The Sovereign Individual - Book by William Rees-Mogg and James Davidson arguing technology will shift power to individuals.

  • Soviet Union - Refers to the USSR, a single-party socialist state in Eurasia from 1922-1991.


#book-summary