
Too Big to Know: Rethinking Knowledge Now That the Facts Aren't the Facts, Experts Are Everywhere, and the Smartest Person in the Room Is the Room - David Weinberger


Matheus Puppe

· 45 min read


Here is a summary of the Table of Contents for the book:

The book opens with a prologue that discusses knowledge and expertise in the digital age, highlighting the challenges that digital technologies and media pose to traditional notions of knowledge.

The book is then divided into 9 chapters:

Chapter 1 - Knowledge Overload
Chapter 2 - Bottomless Knowledge
Chapter 3 - The Body of Knowledge
Chapter 4 - The Expertise of Clouds
Chapter 5 - A Marketplace of Echoes?
Chapter 6 - Long Form, Web Form
Chapter 7 - Too Much Science
Chapter 8 - Where the Rubber Hits the Node
Chapter 9 - Building the New Infrastructure of Knowledge

It also includes acknowledgments, notes, and an index at the end.

Prior publications by the author are listed before the Table of Contents. The chapters appear to discuss different aspects of knowledge in the digital era, such as how it is changing forms, who qualifies as an expert, and challenges in domains like science. The conclusion discusses building a new framework or infrastructure to manage digital knowledge.

  • Business leaders are experimenting with more decentralized decision-making processes that leverage expertise across their networks, similar to collaborative projects like Wikipedia.

  • US intelligence agencies and the State Department are struggling to balance secrecy (“need to know”) with transparency (“need to share”) regarding what information they release to the public.

  • Scientists find themselves both aided and challenged by amateur contributions online, while traditional academic journals are becoming seen as bottlenecks.

  • Media organizations are grappling with how to adapt to an internet where anyone can publish without gatekeepers and where fact-checking is now done more publicly by sites like PolitiFact.

  • Some argue the internet amplifies misinformation and splinters attention, while others counter that sites like PolitiFact and collaborative tools empower fact-checking and advance knowledge more quickly.

  • Knowledge is shifting from being contained in individual/institutional silos to being a “property of the network” where the smartest people are the interconnected group, not any single person. This networked knowledge is less certain but more inclusive and richer.

So in summary, the passage discusses how knowledge production and sharing is being transformed by greater connectivity and participation online, with both risks and opportunities in the new networked model.

  • The passage discusses different views on what constitutes knowledge. The DIKW pyramid model proposed by Russell Ackoff defined knowledge as “actionable information” or “know-how.” But traditionally, knowledge was seen as a whole, orderly understanding of the world that humans pursue to understand God’s creation.

  • Humans have managed information overload through elaborate filtering systems: works are published, shelved in libraries, and judged worthy of knowing by successive filters. This reduces what needs to be known to a manageable level. Knowledge has traditionally been about reducing information.

  • The Information Age took this filtering approach to an extreme, with strict categorization of minimal useful data in databases and information systems. But connecting knowledge online is changing our strategy: instead of reducing knowledge to fit traditional media, we now include all information and ideas in vast webs.

  • The idea of information overload was introduced in 1970, but the threshold for what counts as "too much" has risen enormously as capacity exploded. Psychological syndromes around excess data were once marketed as real problems; now we produce zettabytes of information, and the definition of overload has shifted radically with new digital media.

The key idea is that knowledge and what constitutes information overload have been redefined by the influx of online information and connectivity, moving away from traditional narrow definitions and filtering approaches.

  • The passage discusses the concept of information overload and how our perception of it has changed over time. It provides historical examples of others expressing concerns about too much information back in the 1600s-1700s.

  • It uses War and Peace to conceptualize the size of a zettabyte - how many copies it would amount to, and how far they would stretch if stacked (see the back-of-envelope sketch after this list).

  • While we’ve always had too much information to comprehend, the author argues the new information overload is different because filtering tools no longer fully remove unwanted content, they just filter it out of sight by prioritizing other content. So everything remains available with the right searches.

  • This shift from physical filtering that removes content to digital filtering that just reorders it has consequences, particularly for institutions that previously acted as filters and gatekeepers of knowledge and information. It’s now obvious traditional methods can’t handle the scale of information on the internet.

  • In summary, the passage discusses how our perception and the reality of information overload have changed over time, particularly with the rise of digital filtering tools that handle prioritization differently than earlier physical filtering methods.
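To make the zettabyte comparison concrete, here is a back-of-envelope sketch. The figures for the size of a plain-text War and Peace and the thickness of a printed copy are assumptions for illustration, not numbers from the book:

```python
# Rough scale of a zettabyte, measured in copies of War and Peace.
# Assumed figures (not from the book): ~3.2 MB of plain text per copy,
# ~5 cm of shelf thickness per printed copy.

ZETTABYTE = 10**21              # bytes
BOOK_BYTES = 3.2 * 10**6        # assumed size of the plain text
BOOK_THICKNESS_M = 0.05         # assumed thickness of one printed copy

copies = ZETTABYTE / BOOK_BYTES
stack_m = copies * BOOK_THICKNESS_M

print(f"copies per zettabyte: {copies:.1e}")   # ~3.1e14 copies
print(f"stack height: {stack_m:.1e} m")        # ~1.6e13 m
print(f"about {stack_m / 1.496e11:.0f} times the Earth-Sun distance")
```

Under these assumptions, a zettabyte amounts to roughly three hundred trillion copies, stacked far beyond the orbit of Pluto.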

  • Traditional knowledge filtering techniques like expert committees are no longer sufficient due to the massive amount of online information. New social filtering techniques using things like likes, shares, and recommendations from social networks are more effective guides.

  • However, no filter can capture all useful knowledge due to its abundance online. There is too much good and bad information. This leads to information overload and disagreement as traditional authorities lose influence.

  • Online filters are now more transparent as links and recommendations are visible content that helps shape what information spreads. Filters increase rather than reduce information by revealing more of what escapes other filters.

  • Two examples show emerging new knowledge institutions that are wide, boundary-free, populist, evaluate experts differently, and accept ongoing disagreement. Primary Insight uses diverse part-time experts from fields rather than full-time analysts. Expert Labs enables professional and amateur experts to engage in a less defined way. Both recognize traditional credentialing is too slow.

  • There are times when rapidly developing and exploring new ideas is preferable to a slow, careful development process aimed at reaching consensus. Allowing for disagreement and loose collaboration can spur new insights.

  • Expert Labs is a response to the fact that knowledge has become too vast to be contained by traditional top-down structures like pyramids. Networks allow knowledge to grow and spread more organically.

  • The internet has created the ideal network structure for knowledge - it can scale without limit. Networks lack the rigid boundaries of traditional knowledge structures but allow knowledge to take a more organic shape.

  • Historically, knowledge stopped growing at authoritative sources like textbooks and encyclopedias. But the internet enables knowledge to grow freely in a non-hierarchical networked structure without predefined boundaries. This reflects how knowledge itself is now developing.

  • In the past, knowledge was founded on principles, analogies, and general deductions rather than facts. It was believed the universe reflected God’s orderly design, and the mind apprehends this through associations.

  • Today, facts have emerged as the main foundation of knowledge. Disagreements are resolved by presenting factual evidence rather than appeals to general principles.

  • This shift began in the 17th century with Bacon proposing scientific knowledge be built from experimental facts about particulars, rather than deducing universals.

  • Facts came to be contrasted with self-interest - the factual harms of child labor versus claims it built character. This elevated facts to adjudicate social issues.

  • Malthus provides an example of this transition. His early population theory relied on broad logical deductions, but modern analysis would demand factual evidence to support such sweeping generalizations.

So in summary, facts have displaced analogies and principles as the foundation of knowledge and way to resolve disagreements, a process that began with Bacon and accelerated as facts were contrasted with subjective interests rather than objective truths. Malthus exemplifies this changing approach.

  • In the early 19th century, there was a rising emphasis on using facts and empirical evidence over moral assumptions or the interests of those in power to make policy decisions and understand social issues.

  • Jeremy Bentham pioneered the concept of evaluating policies based on utility/happiness, which required collecting factual data about people’s lived experiences.

  • Parliament began commissioning “blue books” with statistical reports, interviews, etc. to inform debates. This helped advance the social reform movement.

  • Charles Dickens, in novels like Hard Times, satirized an overreliance on facts, feeling it lacked nuance. Facts nevertheless continued gaining prominence.

  • The use of fact-finding missions to resolve international disputes also grew in the late 19th and early 20th centuries, offering an alternative to war for settling problems between countries.

  • By the 20th century, facts had largely become accepted as the basis for knowledge and policymaking across many domains of society and government. Collecting and analyzing empirical data was viewed as crucial.

The passage traces the rise of facts and evidence-based thinking over the 18th-19th centuries in England and its influence on various spheres. It emphasizes how facts came to displace other types of arguments, like moral assumptions and special interests.

  • Darwin spent 7 years intensely studying barnacles, resulting in two difficult volumes of meticulous descriptions and classifications. This paved the way for his theory of evolution by natural selection.

  • Darwin was led to study barnacles by an unusual discovery on his voyage - barnacle parasites inside a mollusk shell. Returning to that specimen years later absorbed him fully for seven years.

  • His work involved painstaking dissections to find continuities between organisms, guided by his evolutionary theory. For example, he found hidden male organs in hermaphroditic barnacles.

  • His volumes established the key fact that barnacles are crustaceans, rather than separate from them as previously thought. This required extensive specimen collection and examination over many years.

  • In contrast, the website Hunch.com uses vast amounts of user-supplied data to provide recommendations. It asks users seemingly irrelevant questions in a quick, playful way.

  • Hunch looks for statistical correlations in users’ answers rather than proving scientific theories. It generates recommendations without fully understanding the reasons behind the correlations (see the toy sketch after this summary).

  • Darwin’s facts were hard-won through focused study to solve specific problems, while Hunch gathers a wide range of superficial facts quickly to power its recommendations.

So in summary, it contrasts Darwin’s intensive, scientifically-guided fact finding process over many years with Hunch’s rapid, unfocused collection of user data to power recommendations, without deeper scientific understanding.
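As a rough illustration of the Hunch-style approach described above, here is a toy sketch. It is a hypothetical reconstruction, not Hunch's actual algorithm, and the data is invented:

```python
import numpy as np

# Toy sketch of correlation-driven recommendation (hypothetical, invented data).
# Rows are users; columns are yes/no answers to seemingly unrelated questions.
answers = np.array([
    [1, 0, 1, 1],   # user 0
    [1, 0, 1, 0],   # user 1
    [0, 1, 0, 1],   # user 2
])
liked = np.array([1, 1, 0])   # whether each user liked some product

# Correlate each question with the liking, without asking *why* it correlates.
for q in range(answers.shape[1]):
    r = np.corrcoef(answers[:, q], liked)[0, 1]
    print(f"question {q}: correlation with liking = {r:+.2f}")

# A new user's answers would be scored against the strongest correlations;
# the system recommends without understanding the reasons behind them.
```

The point of the sketch is in the final comment: the correlations do all the work, and no underlying theory is ever required.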

The passage discusses how facts and data have changed in the digital age. It outlines three phases:

  1. The Age of Classic Facts - Facts were sparse and painstakingly discovered, like Darwin’s observations. Used to prove theories.

  2. Age of Databased Facts - More data stored digitally, but still limited to what could fit on punch cards or early databases.

  3. Age of the Networked Fact - Vast amounts of data available online from diverse sources. Facts are now part of interconnected networks and can be endlessly linked, debated, and recontextualized. Open data initiatives like Data.gov have made government facts more accessible.

Some key points about networked facts include: they exist in web links that provide context; sources are often linked; facts take on new meanings based on where they’re shared and discussed online; and every fact now has “an equal and opposite reaction” as it is debated and contradicted on the internet. This challenges the idea that facts can settle disagreements on their own. The digital age has changed the nature, role and understanding of facts.

The chapter introduces the concept of “the body of knowledge” - the idea that knowledge consists of a comprehensible, agreed-upon set of important ideas and truths about the world. It argues that the internet is eroding this idea as it becomes harder to curate and agree on a definitive set of truths.

Some key points:

  • Traditionally, knowledge had certain agreed-upon characteristics like being based on evidence and reason, forming a collection of truths.

  • The internet makes it impossible to have definitive “editors and curators” that decide what is and isn’t knowledge. There is no consensus.

  • This means we are losing the idea of knowledge having a “body” - a carefully selected set of agreed truths. Knowledge will be more distributed and diverse online.

  • However, we will still have facts, experts, and believable ideas. The fundamentals of knowing (beliefs backed by evidence) won’t change.

  • The rest of the book will explore how knowledge works in a networked world without consensus, including the roles of expertise, diversity, decision-making, and making the most of new networked structures of knowledge.

In short, it introduces the core thesis that the internet is dismantling the traditional “body of knowledge” concept by decentralizing curation and consensus, for better and worse, and sets up exploring this issue in more depth.

The passage discusses the emerging concept of expertise on the internet. It argues that crowd-based knowledge and networked connections are giving rise to new forms of expertise.

Specifically, it notes that the internet connects large numbers of people, allowing for “wisdom of crowds” effects where collective judgments can outperform individual experts given the right conditions. It also describes how networking allows for smart mobs, where digitally connected people can coordinate in new ways.

The passage traces a shift from traditional forms of expertise organized in small elite groups like think tanks, to more distributed forms leveraging online crowds and networks. However, it acknowledges we still play an active role in this change and are not purely passive recipients of new networked models of knowledge.

In short, the passage summarizes that online crowds and networks are offering new strategies for developing expertise, but that humans still shape how this occurs rather than simply having new models imposed on us. A new era of networked knowledge is emerging online, enabled by vast digital connections, but one we help direct rather than just experience passively.

  • The internet has undone previous constraints on expertise by allowing large networks of unrelated people to collectively figure things out or act as knowledge resources. This is known as crowdsourcing.

  • Examples include people scouring satellite images to find a missing sailor, or helping solve political expense controversies. Networks can also gather localized information quickly, as in a DARPA challenge to find weather balloons across the US.

  • Some crowdsourcing involves financial compensation via sites like Amazon Mechanical Turk, allowing large distributed tasks to be completed cheaply versus traditional hiring.

  • Networks also develop informal expertise, like food enthusiasts using sites to track down a chef’s whereabouts. The motivation can be non-financial like enjoying the food.

  • Contests are another way expertise is developed, tapping individual experts within large networks. This sources diverse knowledge that might solve problems confounding domain specialists. Sites like InnoCentive connect companies with solvers of all backgrounds.

  • The internet allows vast, diverse networks that can connect problems with unconventional but applicable expertise that wasn’t accessible before its size and openness.

  • The Internet accumulates expertise over time as information, answers, and discussions are posted online and build upon each other. When new software, products, or issues emerge, questions are asked and answers gradually accumulate, making the network more expert on that topic.

  • Searching for error messages or problems online is likely to produce answers very quickly as the collective discussions and troubleshooting have created a repository of solutions. People benefit from waiting before adopting new things as the network expertise grows.

  • The links and connections people draw between content online also helps curate the network and build expertise. Being able to find and connect relevant information is a type of expertise in itself.

  • While early online posts may seem embarrassing later, the persistent nature of the Internet means useful discussions and solutions remain accessible, adding to the cumulative expertise over time for searchers to benefit from. The network retains both “dumb” and “smart” content posted by users.

In summary, the networked nature of the Internet allows expertise to accumulate over time as knowledge is posted and built upon, creating a growing resource for information and problem solving.

The passage discusses how networked expertise is emerging as a new model of expertise, enabled by the properties of the Internet. Some key points:

  • The Internet allows for expertise networks of all sizes, from small groups to massive crowdsourced networks. It enables many types of expertise groups to form.

  • Networked experts can be smarter than the sum of their parts through complex multi-way interactions. They can build on each other’s knowledge faster than isolated experts.

  • Examples given include IBM “jams” that crowdsource ideas, the BellKor’s Pragmatic Chaos team that won the Netflix Prize through collaboration, and online forums for discussing philosophers like Heidegger.

  • The old model of closed expert circles had value but was limited. Networked expertise offers benefits like more knowledge, faster answers, greater stimulation of ideas, and exposure to more perspectives.

  • Organizations like MITRE are moving from traditional written reports to more timely, interactive formats of expertise sharing like discussions and briefings, which better reflect the networked nature of expertise.

So in summary, the passage argues that the Internet allows for new models of networked expertise that can outperform isolated experts and traditional closed groups through open collaboration and connectivity.

  • MITRE recognizes that their value is in understanding government problems deeply and leveraging the entire community of experts, rather than any individual being seen as the smartest.

  • They have created an environment where anyone can publish their knowledge and expertise, rather than designating only certain people as official experts. Their internal search engine helps users find relevant non-official experts.

  • The forums at MITRE do not necessarily come to consensus views but rather allow for differing expertise and perspectives from multiple experts. This network approach is seen as smarter than just relying on individual expert opinions.

  • Experts communicating together is not new, but digital networks have changed the nature of expertise in several ways. It has become less tied to topics, less about certainty, more transparent, multi-directional rather than one-way, less dependent on traditional credentials, and allows for multiple perspectives rather than a single voice.

  • Overall, expertise is shifting from being an attribute of individuals to an emergent property of digital networks as a whole, enabled by connecting diverse experts in new ways.

  • Diversity is important for problem solving, but there can be too much diversity as well. Too little diversity leads to groupthink, while too much diversity results in no agreement.

  • The type of diversity matters - it’s not just about surface attributes like race, gender or ethnicity. More important is diversity of experiences, perspectives and ideas. Having people from different backgrounds only helps if it leads to cognitive diversity.

  • According to research, diversity works best when: 1) The problem is complex with no single right answer 2) Individuals are sufficiently knowledgeable 3) People can build on each other’s ideas incrementally 4) The group is large and drawn from a diverse pool.

  • Factors like race, gender alone don’t guarantee diversity of views or expertise. The key is experiential diversity - having people with different life experiences and ways of approaching problems. Skin-deep attributes don’t necessarily translate to cognitive diversity.

  • In summary, diversity is good for problem-solving but only when it leads to diversity in perspectives, approaches and problem-solving styles among individuals who are still knowledgeable enough to contribute. The type of diversity matters more than just the numbers.

  • Research shows that the diversity that makes groups smarter than individuals is diversity of perspectives and heuristics (problem-solving approaches).

  • Perspectives are different ways of framing or mapping an issue. For moderating comments, perspectives could include organizing by topic, political spectrum, or emotional tone.

  • Heuristics are the tools or techniques people bring to bear on problems. Examples include whether moderation calms discussions or if reputation systems can marginalize irrelevant comments.

  • For a diverse group to be effective, its members need to have different perspectives and heuristics. This allows them to approach problems in different ways and come up with better solutions than a single person could.

  • A wise Latina judge would make the Supreme Court more diverse in a useful way if her experiences gave her a different perspective on issues and different heuristics for dealing with them, compared to the other justices. This could lead to better legal reasoning and outcomes.

In summary, the key idea is that diversity of perspectives and problem-solving approaches makes groups smarter, while shared similarities in goals and values allow diversity to be constructively applied. The “right” amount of diversity depends on the situation.

  • The LA Times experimented with a “wikitorial” that allowed readers to collaboratively edit an editorial online. However, it was sabotaged within days by users posting offensive/pornographic content.

  • The wikitorial saw over 150 edits as users debated and modified the editorial. Comparisons to other wars and critical views of the LA Times were added and removed as users argued.

  • Experts debated whether “forking” discussions (breaking them into separate threads) when they become too diverse is a good approach. While it allows different views to coexist, it can also lead to polarized “echo chambers” where only certain views are heard.

  • Cass Sunstein warned that when people only engage with those who agree, views become more extreme and polarized over time. Studies show this “group polarization” occurs both offline and online as like-minded groups reinforce each other.

  • However, later studies found people who visited partisan websites were actually more likely than average to visit opposing sites as well. This suggests the internet may not be increasing polarization and isolated echo chambers as much as initially feared. The debate continues around how open or closed online discussion spaces truly are.

This passage discusses different perspectives on how the internet may be impacting political discourse and society. Here are the key points:

  • Cass Sunstein argues the internet leads to more “echo chambers” where people only encounter views they already agree with, making them more polarized. However, a Berkman Center study found internet users are actually less insular than users of traditional media.

  • Nicholas Carr argues the internet is “making us stupider” by weakening capacities like deep thinking due to its distracting, scattered nature. But others feel it makes them smarter by providing more information at their fingertips.

  • There is no clear or consensus answer on whether the internet makes people smarter or dumber - the research is limited and findings vary in different contexts.

  • While echo chambers may exist, the internet also exposes people to more diverse views. Communities still need some boundaries to have meaningful discussions amongst agreement.

  • Discourse online reflects the reality that people deeply disagree and we can’t expect universal rational consensus. Reasonable discussions often only occur within groups that share core commitments, like political ideologies.

  • The internet is showing the “coffee shop of reason” ideal exists only within limited contexts defined by some prior agreement, not as a universal forum of completely open and rational debate.

  • The passage discusses the debate between whether human reason is sufficient for understanding the world, and whether there is a “privileged position” from which to interpret it.

  • It introduces some key ideas from postmodernism - that all knowledge is an interpretation from a particular viewpoint; interpretations are social and occur within discourses/contexts; and within discourses, some interpretations become privileged, though discourses themselves are social constructions.

  • It argues the internet has demonstrated these postmodernist ideas to be true, by facilitating the proliferation of diverse interpretations without a centralized authority or privileged position. On the net we see every possible interpretation.

  • This underscores the difficulty of resolving differences and understanding across discourses/contexts. However, connecting across boundaries can make us smarter if we wish to be smarter through forming expert networks.

  • The passage examines whether the internet fritters away ideas through distractions and shortening attention spans, or represents a new way of linking ideas. It remains an open question whether the internet is an echo chamber or facilitator of diversity.

So in summary, it introduces some key postmodernist concepts and argues the internet experience has provided evidence that many of these ideas are correct, while also posing challenges around navigating differences online.

  • On the Origin of Species is a foundational work of science and a brilliant work of literature. However, as with any long-form argument, it has weaknesses that stem from the constraints of the medium.

  • Darwin preemptively addresses objections his readers may have across six of the book’s 15 chapters, showing his genius and awareness of their desire to disagree.

  • However, books require constructing linear, one-directional arguments rather than conversational responses. This shapes how ideas are presented.

  • Historian Robert Darnton discusses how books cannot fully capture the complexity of history and notes how networked media could incorporate more layers of information and interactivity.

  • While physical books will remain important, networked knowledge is better suited as the dominant form, allowing for more fluid connections between ideas. Books historically shaped how we think knowledge should be structured.

  • The passage concludes by noting the irony of critiquing books in long-form book format, but argues this can still allow ideas to develop usefully while the network age emerges. It sets up exploring how the book form itself influences argument structure.

The passage discusses some of the limitations and shortcomings of the traditional printed book format. While books have advantages like enabling long-form thought and making the past feel present, they also have physical constraints.

Books have to tell a complete story within the bounds of the covers because there is no easy way for readers to access related information externally. This can unintentionally shape and limit ideas by forcing them into the rigid structure of sequential pages.

The physical nature of books also means they take up real space and can be hard to find later. Their materiality contrasts with the ease of searching for information online.

Most printed books people actually read are cheaply made mass products, not the idealized leather-bound tomes we often romanticize. Old books show their age through wear and tear, undermining the notion that they perfectly preserve the past.

Long-form writing in book format also tends to squeeze ideas onto narrow paths leading to conclusions, ignoring other relevant influences and factors. This is not a fault of any individual book, but rather a byproduct of the limitations of the print format.

In summary, the passage discusses both advantages and disadvantages of the traditional codex book format, arguing it unintentionally shapes thought and overlooks some limitations in contrast to new digital media.

  • Jay Rosen runs the influential blog PressThink.org, which analyzes journalism. Though it doesn’t get as many direct links as big media sites, it has significant influence through links from highly-read sources.

  • Rosen engages in long-form thinking through a series of blog posts on a topic. He wrote a 6-part series arguing that journalistic objectivity doesn’t construct authority as easily as once believed. The posts total over 110,000 words including comments.

  • There are advantages to Rosen’s public and networked approach over private long-form writing. It allows the argument to develop naturally in response to discussion. Ideas spread faster and escape the author’s control to potentially change the world. Readers are more involved.

  • The author’s authority is “right-sized” through engagement with readers rather than speaking with a publisher’s authority. Others responding shows ideas are tentative. It connects readers in discussion rather than isolating the author.

So in summary, Rosen demonstrates a new model of public, networked long-form thinking that has advantages over traditional private long-form writing and publishing models.

  • Jay Rosen’s ideas are spreading online as more sites link to his work and as he has 37,000 Twitter followers. Ideas can now spread and be viewed openly online in a way that was not possible before.

  • However, there are also some disadvantages to open online discourse. Reader comments may act as a distraction. Some arguments are better presented all at once rather than piecemeal online. Commercial ideas may not do as well if given away for free online.

  • Nine advantages of online discourse were listed compared to five disadvantages. But determining which is better is not as simple as counting pros and cons.

  • Online discourse has impacted long-form writing as well. Works are now discussed, misunderstood, incorporated by others, and spread in new ways online rather than remaining untouched on shelves. This exposure can degrade ideas but also enhance and spread them more widely.

  • The shapelessness and lack of boundaries online reflects the reinvigoration of knowledge but also removes traditional authority structures. This loss of centralized authority is an issue but reflects a change in how knowledge relates to the world.

Here is a summary of the provided passage:

The article discusses some issues with objectivity and how networked knowledge affects traditional notions of authority and long-form writing. It notes that links in online articles acknowledge that knowledge is constructed together, rather than representing a complete or sufficient view. Networked knowledge resembles a “web of temptations” rather than fixed stop points.

As an example, the role of social media in the 2011 Middle East revolutions is discussed. Even experts disagree on the degree of influence, as these events result from complex, unpredictable factors. The world is too “intertwingled” - too interdependently complex - for us to fully comprehend causal relationships. Networks may better reflect this complexity than traditional objective reports or essays.

While stopping points are still needed, sites like WolframAlpha function as authoritative sources online by gathering vetted facts and performing computations. However, their links and feedback invite further engagement rather than representing an end to inquiry. Long-form writing is also improved when embedded within networks of related ideas through links, allowing for greater understanding, verification and participation over time.

However, most websites lack professional editorial control, so judgments of credibility rely more on subjective factors. This can lead to the spread of misinformation if judgments are made within insular “echo chambers.”

In summary, networked knowledge is changing notions of authority, objectivity and the role of long-form thinking, though stopping points are still necessary. The implications of these changes are complex and interdependent.


  • Science news stories that arise from small, limited studies often get overhyped and extrapolated to much broader conclusions than the data supports. This happens because such findings generate interest and easy headlines.

  • The scientific method is meant to use carefully controlled experiments to test hypotheses, but when science gets embedded in the real world, modest findings can take on a life of their own outside of scientific rigors.

  • There is now vastly more data about the natural world than ever before due to technological advances, but culture still struggles to handle even modest scientific studies responsibly.

  • To make sense of huge amounts of modern scientific data, our concept of scientific knowledge itself may need to transform. It is becoming larger in scale, less hierarchical, more openly public and debated, less centrally filtered, more open to diverse perspectives, and interconnected through hyperlinks like the network itself.

  • Previous periods of science, like Bacon establishing the empirical method, also transformed what counted as knowledge. The digital age may be bringing about another redefinition of scientific knowledge to better suit its new, networked conditions and scale.

The passage discusses the rise of large scientific data repositories and networks like GBIF, ProteomeCommons, and the Human Genome Project. These “data brickyards” contain thousands or even billions of data points about things like species distributions, proteins, and genetic sequences.

Three key factors have driven this growth in data sharing and collection:

  1. The economics of data storage and deletion have changed, making it cheaper to store all data rather than carefully curating a selection.

  2. Data sharing over the internet is easier, incentivizing collection of even obscure datasets that may have value when shared broadly.

  3. Computing power has increased exponentially, enabling the automated collection and analysis of vast amounts of scientific data.

This has led to the rise of “data-driven” sciences like systems biology that study complex emergent properties of biological systems using massive datasets and computer modeling, rather than attempting straightforward theoretical explanations. Understanding systems at this scale is beyond human cognitive abilities alone. The models produced often cannot be fully understood or reduced to simple principles. This represents a shift away from the traditional goal of constructing scientific theories that explain observable facts.

  • The passage discusses how computer models and simulations can reveal emergent patterns and behaviors that are difficult for humans to predict intuitively. For example, a simple model of people positioning themselves between aggressors and prey unexpectedly ends up forming a tight cluster (see the simulation sketch after this list).

  • It notes that as models get more complex, with more diverse and changing motivations, only computers can effectively manage and analyze the huge amounts of data and interactions involved.

  • Programs like Eureqa can analyze large datasets and produce equations that fit the data, even when humans don’t understand the underlying mechanisms the equations represent. Biology in particular may be too complex for unaided human comprehension.

  • While having answers without understanding is unsatisfying, models can still have predictive power even if we don’t fully comprehend them. This represents a new form of knowledge that exists more at the network level than in individual human minds.

  • As networks and data sizes continue to grow, our ability to know through modeling will depend more on distributing computational tasks across connected systems, not just on individual understanding. Complexity may eventually outstrip human comprehension even with computational aids.
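The clustering example above can be reproduced in a few lines of simulation. The following is a minimal, hypothetical reconstruction (the agent count, step size, and update rule are assumptions), not the book's exact model:

```python
import numpy as np

# Minimal sketch of the positioning game: each agent secretly picks an
# "aggressor" and a "prey" and repeatedly steps toward the midpoint
# between them. (Hypothetical reconstruction; parameters are assumptions.)
rng = np.random.default_rng(0)
N = 100
pos = rng.uniform(0, 1, size=(N, 2))       # agents scattered on a unit square
aggressor = rng.integers(0, N, size=N)     # each agent's chosen aggressor
prey = rng.integers(0, N, size=N)          # each agent's chosen prey

for _ in range(200):
    target = (pos[aggressor] + pos[prey]) / 2   # "position yourself between them"
    pos += 0.1 * (target - pos)                 # take a small step toward it

# The population typically collapses into a tight cluster - an emergent
# outcome that is hard to predict by intuition alone.
print("final spread of positions:", pos.std(axis=0))
```

Running it shows the spread shrinking toward zero: the cluster emerges from simple local rules interacting, not from any agent intending it.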

  • An Italian physicist discovered a new impact crater in Sudan using just Google Maps and open-source image editing software. Since few craters are known to be caused by space objects, these amateur discoveries are important.

  • Einstein@Home crowdsources data analysis by distributing it to volunteered personal computers. In 2010, non-scientists using this project discovered a new pulsar that may spin at a record-breaking speed.

  • Foldit crowdsources protein folding by having people play an online game. Humans have outperformed computer algorithms on certain protein structures thanks to human spatial reasoning.

  • PatientsLikeMe allows patients to anonymously share treatment details and data with researchers. Patients have expertise from living with diseases.

  • Networks of amateurs can make substantial contributions, like discovering “green peas” through discussions in Galaxy Zoo. Individual amateurs may have less influence on science today due to increased specialization.

  • The line between credentialed and amateur science is blurring as more engagement happens online, allowing open critique - as with a claimed P vs. NP proof - without requiring credentials to participate. This new ecology is changing how scientific impact and authority are measured.

  • Traditional scientific journals rarely publish research with negative results, which show experiments that failed or didn’t have the expected outcome. However, these negative results can be very useful for other researchers starting work in an area, to know what approaches didn’t work.

  • Jean-Claude Bradley started practicing “open-notebook science” - openly recording all his lab’s daily experimental results, including failures, on a blog and wiki. This allowed others to access the information sooner and helped connect his work to others exploring similar areas.

  • Open and continuous recording and sharing of science is becoming more common through tools like open notebooks, preprint repositories, and scientist blogs. This shifts from the old model of private work leading up to a single publication date, to a continuous open process.

  • This open model can cause ambiguity around credit and discovery when work is openly shared as it happens, rather than having a clear before/after publication divide. But it benefits science overall.

  • PLoS One takes a more open approach to peer review than traditional journals, accepting papers based on scientific validity rather than perceived importance or impact. This helps disseminate more types of scientific work.

  • Peer review is crucial for researchers seeking tenure, as tenure committees rely on publications in peer-reviewed journals. However, traditional peer review does not scale well as more papers are submitted.

  • PLoS addresses this by having academic editors who can accept papers without external review if they meet criteria. They rarely skip review, though some fields like infectious disease reporting are expedited.

  • The motive for peer review at PLoS is less about quality and more about reassuring academics and researchers that their work will be viewed as legitimate.

  • Metrics like impact factor are flawed but still influence researchers. PLoS One publishes all sound science regardless of perceived importance to avoid bias.

  • This more open model means readers must take more responsibility in finding relevant work, rather than having the “best” papers spoonfed to them.

  • Open access publishing is growing and helps break the paywalls traditionally imposed by commercial publishers. It opens up scientific knowledge more widely over the internet. Authority will come more from online presence and networks rather than just selective peer-reviewed journals.

Science continues its traditional processes like experimental method, public debate, and authoritative institutions. However, networking science raises new challenges as databases from different disciplines may categorize data differently even if recording the same entities.

Science Commons, a Creative Commons project, is working to link different databases so they can be queried as one by assigning unique identifiers to entities across sources. This allows different data sources to be aggregated without requiring universal agreement on classification and naming. A “namespace” approach acknowledges unique identifiers within separate domains, enabling computer programs to map between namespaces and integrate data from different sources, as in the sketch below.
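A minimal sketch of the namespace idea (all identifiers and records here are invented for illustration):

```python
# Hypothetical sketch of cross-namespace identifier mapping (names invented).
# Two databases record the same species under different local identifiers.
db_a = {"taxonA:1234": {"name": "Balanus balanoides", "records": 52}}
db_b = {"taxonB:xyz-9": {"habitat": "intertidal"}}

# A mapping asserts that the two identifiers denote the same entity,
# without requiring either database to change its own naming scheme.
same_as = {"taxonA:1234": "taxonB:xyz-9"}

def merged_view(id_a):
    """Aggregate what both namespaces know about one entity."""
    merged = dict(db_a.get(id_a, {}))
    merged.update(db_b.get(same_as.get(id_a, ""), {}))
    return merged

print(merged_view("taxonA:1234"))
# {'name': 'Balanus balanoides', 'records': 52, 'habitat': 'intertidal'}
```

The mapping table, not a universal classification scheme, is what lets a program query the two sources as one.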

This pragmatic approach represents a shift from the older view that knowledge is a “mirror of nature” reflecting a single true organization. Instead, different approaches are useful tools depending on aims. Namespaces let science benefit from diverse research without first achieving full consensus. While scientific institutions still matter internally, science can no longer remain behind institutional walls in today’s mediascape. Scientists must acknowledge this new reality and engage effectively with the public to counter misinformation.

Here are the key points:

  • Science had traditionally worked through a process of private work and hypothesis generation, followed by publication in credentialed journals which conferred authority on published findings.

  • The story of Darwin and Wallace showed how publication shaped science, as Darwin waited years to formally publish his theory of evolution even after developing it privately.

  • The publishing model incentivized definitive attribution of ideas to individuals and packaging findings into self-contained “papers” with a clear narrative.

  • However, the publishing process could filter out ambiguous or incomplete findings that didn’t fit the story narrative. It also introduced risks of errors spreading through citation chains.

  • The new networked model of science addresses these issues by making the work more transparent and open to ongoing fact-checking and correction. It reduces reliance on authority and institutional trust by allowing findings to be connected back to their original sources. This makes science more accurate over time.

So in summary, the networked model is rebooting science by revealing limitations of the traditional publishing-driven approach and allowing the work to be more openly connected and validated.

  • Making decisions in the modern, networked world is challenging due to the vast amounts of information and conflicting claims available. However, leaders must still face reality and make decisions.

  • The author visits West Point’s Center for the Advancement of Leader Development to discuss leadership with Lt. Col. Anthony Burgess.

  • Burgess emphasizes developing leadership as a property of effective units/teams rather than individuals. He focuses on training teams to accomplish objectives even in changing circumstances, through motivation, flexibility, resilience, and collaboration - not on developing individual leaders.

  • For Burgess, strong leaders are a means to an end - that end being a cohesive team that can achieve its goals. If distributed, collaborative leadership better enables team effectiveness, that is the model he promotes rather than focusing on any single leader.

  • This signifies a shifting view of leadership as more of a group/network property rather than solely the domain of top individuals in an organization. Decision-making in complex, uncertain environments may rely more on effective teamwork than individual authority.

  • The passage discusses the move toward distributed leadership at West Point, where leadership is seen as a property distributed across teams rather than centered on individual leaders.

  • It attributes this shift partly to experiences in recent wars where soldiers needed a broad range of skills and the ability to problem solve and respond creatively as conditions change rapidly. Successful teams had characteristics like collaboration where every soldier contributes.

  • While the military remains hierarchical, the concept of separating leadership from a single leader and infusing it across groups is becoming more widely adopted as an effective way for networks to achieve goals. Distributed leadership is happening not just at West Point but in other networks like Linux development communities.

  • Linux has a clear leader in Linus Torvalds but authority is widely distributed across thousands of contributors worldwide. Leadership is as distributed as possible across the network with some nodes like Torvalds having more influence than others. Projects like Debian also have leaders but aim to distribute work voluntarily across participants.

  • Debian encourages developer autonomy and decentralized decision-making. Developers have control over their own modules as long as they follow integration guidelines.

  • Noel Dickover creates “micro-ecosystems” connecting local organizers with software developers to solve social problems. This distributes decision-making to those with local knowledge, rather than top-down approaches.

  • Dickover helps organize hackathons where developers create software to address problems identified by local stakeholders. This distributed model gets solutions to urgent issues more efficiently.

  • While Dickover facilitates networks and partnerships, decision-making power comes from enabling connections between stakeholders on the ground. This distributes leadership as locally as possible.

  • Network-based decision-making scales better than hierarchies when situations require local knowledge and when contributors value autonomy over directives. It can also motivate people more by respecting differences. Distributed models tap more local knowledge than top-down approaches.

So in summary, the text advocates for more distributed, network-based models of decision-making and leadership that empower local stakeholders and respect participant autonomy over top-down hierarchical control.

Here are the key points from the summary:

  • Local leaders making purely local decisions can work against the larger good of the whole collaborative network or organization. Distributed autonomous modules, as seen in Linux and Debian, allow local expertise to have more effect while minimizing risks.

  • When decisions are made locally throughout a network by volunteers, they are likely to express the interests of the local members, promoting more genuine corporate social responsibility.

  • Hierarchical organizations relying on a single leader are less resilient than distributed networks. Distributed leadership across local units improves resilience.

  • Hierarchical decision-making fits with traditional reductive strategies that can’t account for all the complexity in a network-connected world. Network decision-making keeps decisions as local as possible.

  • For these reasons, decisions within hierarchies will increasingly take on characteristics of network decisions, even if leaders don’t explicitly recognize it, as organizations get more connected through networks. The context and feedback from networks will influence decisions.

The new infrastructure for knowledge made possible by digital networks like the internet is abundantly linked, widely accessible, and freely published. This represents a major shift from the traditional curated model of knowledge.

Some key aspects of this shift include:

  • Abundance: Nearly limitless amounts of information are now easily available through search and links. This contrasts with the scarcity of the pre-digital era.

  • Links: Information is structured hypertextually through links, allowing ideas to be more interconnected. This erodes authorial control and positions knowledge as a navigable web rather than isolated publications.

  • Public accessibility: Knowledge is now openly published and debated, rather than access being restricted to credentialed experts. Anyone can contribute, making authority and expertise more distributed.

  • Lack of overall permission: There is no centralized gatekeeping of knowledge online. While expert curation still exists, metadata is more important for establishing reliability.

  • Contextualization: Works are situated through their connection to other discussions and debates online. Knowledge emerges from networks of commentary as much as standalone content.

  • Unfinished nature: Drafts, debates and processes are often publicly visible. This reveals the human construction of knowledge rather than presenting it as objective fact.

So in summary, the digitization of knowledge has made it vastly more abundant, interconnected, participatory, contextualized and works-in-progress oriented compared to traditional closed systems of expertise and curation.

  • Early Western philosophy steadily raised the standards of certainty for knowledge, culminating in Descartes’ program of doubting everything that could be doubted, with only his own existence surviving as certain.

  • 19th century philosophers like Kierkegaard challenged the notion that all knowledge requires certainty and rationality, suggesting some truths are held with “fear and trembling.” Nietzsche argued certainty hides greater truths.

  • Later thinkers like Heidegger, Kuhn, and Foucault further critiqued ideas of objective, certain knowledge by highlighting the role of lived experience, paradigms, and power in shaping what counts as knowledge.

  • The internet makes inevitable the “pragmatic truth” that knowledge consists not of agreed-upon facts but a shared world about which people will always disagree. Disagreement is now conspicuous.

  • However, scientists and historians will continue their work much as before, using new online sources. Over time, as with past media, the internet may spread knowledge more widely and bring ongoing cultural advancement, if open access is supported and the network remains open to research, speech and creativity.

  • Openly sharing work can help address the problems of information overload and low-quality content that arise from an abundant, open system.

  • Creating metadata, like tags, titles, descriptions, etc. makes content easier to find and understand. It provides context and allows for filtering and evaluation.

  • The Semantic Web initiative aims to add standardized metadata to make data on the web more interoperable and machine-readable. However, success also came from the Linked Data approach of releasing imperfect but linked datasets without waiting for comprehensive ontologies.

  • Even basic metadata, like linking relationships to established standards, allows data to be pulled together from different sources. Releasing imperfect but standardized and linked data unleashes its value (see the sketch below).

  • Creating more metadata enables information to be more widely reused. Individual pieces of data only link together when they share metadata.

  • Showing your work, like linking to sources and related information, increases transparency, authority and helps situate work in its proper context. It encourages further exploration and learning.

The key idea is that creating and sharing metadata in various forms addresses problems of abundance, makes information more findable and useful, and cultivates a richer network of interconnected knowledge.
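To make the metadata point concrete, here is a hedged sketch. The dc: prefix follows the real Dublin Core vocabulary convention, but the record and identifiers are invented:

```python
# Hypothetical record carrying minimal metadata (invented for illustration).
# "dc:" follows the Dublin Core vocabulary convention.
record = {
    "@id": "http://example.org/reports/42",
    "dc:title": "Field notes on intertidal barnacles",
    "dc:creator": "http://example.org/people/jsmith",
    "dc:subject": "taxonA:1234",   # the same identifier another dataset uses
    "dc:date": "2011-03-14",
}

# Because "dc:subject" reuses an identifier that another dataset also uses,
# a program can join this report with that dataset's records - no
# comprehensive, agreed-upon ontology has to come first.
```

Even this minimal, imperfect metadata is enough to link the record into a larger web of data, which is the Linked Data argument in miniature.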

The passage advocates for ways to make the internet smarter as a platform for knowledge. It suggests leaving no institutional knowledge behind by making academic resources like course materials and library archives freely available online. Institutions need to contribute what they have developed to enlarge the knowledge infrastructure on the internet.

Everyone also needs to be taught how to effectively use and evaluate information on the internet. Critical thinking skills are important for distinguishing reliable from unreliable information. Citizens also need skills to participate in online discussions in a multi-cultural environment. It’s important but difficult to learn to appreciate diverse perspectives rather than just echo chambers of similar views.

The internet lowers barriers to encountering different ideas, but we must overcome our own tendencies towards homophily and insularity. Both echo chambers and diverse connections are needed for knowledge. The hyperlinked nature of the internet could provide a model of being locally rooted but globally connected. Overall, collaborative efforts show our capacity to build knowledge resources much larger than any one institution could achieve alone, if we address these challenges of teaching skills, inclusion and openness.

The passage discusses how knowledge has undergone a crisis due to information overload in the digital age. It argues that the rich variety of online information and connected humans provide many reasons for people to contribute and learn collectively at a massive global scale. There are very few barriers preventing people from accessing knowledge and contributing online. As a result, connected humans have not yet discovered what cannot be achieved through collaboration on the internet. The scale and open participation of the internet means we don’t fully understand its potential for collective knowledge creation.

Here is a summary of the blog post by Skip Walter, “Knowledge vs. Information,” from January 2, 2008:

  • Walter defines the difference between knowledge and information, stating that information is data without context or meaning, whereas knowledge is information that has been personally understood and applied.

  • He argues that the rise of the internet has dramatically increased access to information, but true knowledge is not just absorbed but requires effort to understand information and apply it meaningfully.

  • Walter notes information can even become a distraction or overload if not properly processed into knowledge. Effective learning involves analyzing, comparing, and questioning information from various sources.

  • In conclusion, the blog post emphasizes that merely having access to vast amounts of online information does not equate to knowledge and wisdom - genuine knowledge acquisition involves intentional mental processing of information on a deeper level. The value is in what we do with information, not just having it at our fingertips.

This section discusses how diverse perspectives can be brought together while avoiding echo chambers. It notes that diversity of viewpoints is important for groups to make good decisions. While online interactions allow people to self-select their news sources and clustering with like-minded individuals, this can create “echo chambers” where opposing views are not heard. The importance of exposing people to a diversity of perspectives is discussed, as is the need to cultivate respect and civil discourse between those with differing views. Bridging differences and bringing people together respectfully remains an ongoing challenge.

Here is a summary of the key points from the article:

  • The article examines an online deliberation forum to discuss a public policy issue and how link posting impacted interactions.

  • It found that URLs and link posting generated more interaction and responses compared to opinions posted without links. This helped address problems of scale and participation that can occur with online deliberations.

  • However, the availability of online information did not equalize deliberations and may have given additional advantages to already advantaged groups. Facilitation did not overcome these advantages in some instances.

  • Discussions in some cases appeared more opinionated than informed even when information was readily accessible online.

  • The conclusion is that while information is newly accessible on the internet, it is also politicized in unfamiliar ways that can exacerbate, rather than reduce, existing democratic problems and imbalances in deliberation. Link posting and available information helped participation but did not necessarily make discussions more informed or equal.

In summary, link posting increased interaction in the online deliberation forum but did not equalize discussions or overcome the advantages some groups gained from accessible online information. The political dynamics of information availability online are more complicated than simply making discussions better informed.

Here is a summary of the key points from the article:

  • The article discusses Eureqa, a software program developed at Cornell and later commercialized by Nutonian, that can analyze data and generate mathematical equations describing relationships within that data, without being told in advance what form those equations should take.

  • Eureqa uses an evolutionary computing technique called genetic programming to evolve mathematical expressions that fit the provided data. It starts with random equations and mutates and recombines them over many generations, keeping those that describe the data increasingly accurately (a minimal code sketch of this approach follows the list below).

  • The creator of Eureqa believes it demonstrates mechanized science and marks an early step toward artificial general intelligence and the technological singularity, while others are more skeptical, seeing it as a useful tool rather than true artificial intelligence.

  • Eureqa allows users to upload their own data and have the program generate equations to model relationships, allowing discoveries to be made more easily. It could democratize science by empowering amateur scientists. However, others warn it may generate many false discoveries if not used carefully.

  • The article examines both the promise and criticisms of programs like Eureqa and debates whether they represent an important milestone toward strong AI or merely sophisticated tools that augment human intelligence. Overall it presents an overview of this software and discussions around its implications.
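
The article describes this only in prose, but the core idea, genetic programming for symbolic regression, can be illustrated with a self-contained toy sketch. Everything below (the sample data, the operator set, the mutation scheme, truncation selection) is an assumption made for the example, not Eureqa's actual implementation:

```python
import random

random.seed(0)  # reproducible toy run

# Toy "experimental" data sampled from a hidden relationship (here y = x**2 + x),
# standing in for a dataset a user might upload to a tool like Eureqa.
DATA = [(x / 10.0, (x / 10.0) ** 2 + x / 10.0) for x in range(-20, 21)]

OPS = {'+': lambda a, b: a + b,
       '-': lambda a, b: a - b,
       '*': lambda a, b: a * b}

def random_expr(depth=3):
    """Build a random expression tree over {x, constants, +, -, *}."""
    if depth == 0 or random.random() < 0.3:
        return 'x' if random.random() < 0.5 else round(random.uniform(-2, 2), 2)
    op = random.choice(list(OPS))
    return (op, random_expr(depth - 1), random_expr(depth - 1))

def evaluate(expr, x):
    """Recursively evaluate an expression tree at the point x."""
    if expr == 'x':
        return x
    if isinstance(expr, (int, float)):
        return expr
    op, left, right = expr
    return OPS[op](evaluate(left, x), evaluate(right, x))

def fitness(expr):
    """Mean squared error against the data; lower is better."""
    try:
        err = sum((evaluate(expr, x) - y) ** 2 for x, y in DATA) / len(DATA)
    except OverflowError:
        return float('inf')
    return err if err == err else float('inf')  # treat nan as worst

def mutate(expr):
    """Replace a randomly chosen subtree with a fresh random expression."""
    if not isinstance(expr, tuple) or random.random() < 0.3:
        return random_expr(2)
    op, left, right = expr
    if random.random() < 0.5:
        return (op, mutate(left), right)
    return (op, left, mutate(right))

def evolve(pop_size=200, generations=60):
    """Truncation selection: keep the best quarter, refill with mutants."""
    population = [random_expr() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness)
        survivors = population[:pop_size // 4]
        population = survivors + [mutate(random.choice(survivors))
                                  for _ in range(pop_size - len(survivors))]
    return min(population, key=fitness)

best = evolve()
print('best expression:', best, ' mse:', fitness(best))
```

Real symbolic-regression systems add crossover, pressure toward simpler equations, and far more sophisticated search, but the generate-evaluate-select loop sketched here is the same basic shape.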

Here is a summary of the key points from the article “future-of-books.html”:

  • Books will continue to exist in the future but will likely be supplemented by digital formats. E-books are growing in popularity but printed books are still preferred for long-form reading.

  • The internet allows for a more networked, collaborative form of knowledge sharing compared to the isolated nature of printed books. Knowledge can be more fluid, dynamically updated and built upon by many contributors online.

  • Projects like the Digital Public Library of America aim to make more published works freely available online. However, copyright issues still limit what can be digitized and shared.

  • Online tools like tagging, commenting and annotation allow new forms of discussion and debate to emerge around published works. This helps knowledge become more social and dispersed beyond authors.

  • Open access to information online has democratizing effects but can also spread misinformation more easily. Fact-checking and content moderation are needed to ensure reliability.

  • Books will still have value as trusted, edited works but the future of knowledge is trending toward a more networked model that is collaborative, multimedia and able to incorporate new commentary and connections over time. Both books and digital formats will likely continue evolving together.

Here are summaries of a few of the key topics:

  • British chimney sweep reform: This refers to reforms in the late 18th/early 19th century to improve working conditions for chimney sweeps in Britain, who faced dangerous and difficult work.

  • Darwin’s work on barnacles: Charles Darwin spent over 8 years intensely studying barnacles, which helped inform his later work on evolution and natural selection. His work established barnacles as important model organisms in science.

  • Malthusian theory of population growth: Proposed by Thomas Malthus, this theory held that population tends to grow geometrically while food production grows only arithmetically, inevitably leading to famine and competition over limited resources (a short formulation follows this list). It influenced Darwin and others.

  • Knowledge management: Refers to systematically gathering, organizing, sharing, and analyzing an organization’s information, knowledge, and expertise. The rise of digital networks enabled new approaches to knowledge management.

  • Groupthink: A psychological phenomenon where the desire for group cohesion or conformity influences individuals to avoid critical evaluation of alternative ideas or views. It can lead to poor or irrational decisions being made by groups.

  • Federal Advisory Committees: Known as FACs, these are committees, boards, councils etc. established by U.S. government agencies and departments to provide expert advice on various issues. Their use and role has implications for expertise, transparency and decision-making.

  • Networked knowledge: The idea that knowledge now resides in distributed networks and communities online, rather than only in specialist individuals or institutions. The Internet and digital networks have impacted how knowledge is created, shared and scaled. Expertise of diverse amateurs can now be crowdsourced.
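
In modern notation (the symbols here are mine, for illustration), Malthus's contrast is between exponential and linear growth:

$$P(t) = P_0\, r^t \quad (r > 1,\ \text{population, geometric}) \qquad F(t) = F_0 + c\, t \quad (\text{food supply, arithmetic})$$

Because any exponential with $r > 1$ eventually outgrows every linear function, the per-capita food supply $F(t)/P(t)$ tends toward zero under these assumptions, which is the source of the predicted famine and conflict.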

Here are brief summaries of the key concepts:

  • Dickens’s antipathy to fact-based knowledge - Dickens had an aversion to knowledge solely based on facts without understanding context or humanity.

  • global - Refers to knowledge being spread widely across the world through digital networks.

  • statistical support for Bentham’s ideas - Early statistical analysis provided empirical backing for philosopher Jeremy Bentham’s utilitarian philosophy.

  • Social tools: information overload - Online social tools like tagging and networks can help manage information overload.

  • Society of Professional Journalists - A professional organization for journalists that sets ethical standards.

  • Socrates - The ancient Greek philosopher who emphasized questioning beliefs rather than asserting facts.

  • Software defaults - The preset options in software programs that shape how people use technology.

  • Statistics - The emergence of statistics as a field provided a new way to analyze large data sets.

  • Sub-networks - Specialized interest networks that form within larger online communities.

  • Transparency - The openness and visibility of information, processes, and institutions. Seen as important for accountability.

  • Truth - The discussion focuses on truth as dependent on context and interpretation rather than absolute facts. Networked knowledge allows new perspectives.

  • Wikipedia - Used as an example of a knowledge system that harnesses collective intelligence but also faces challenges of reliability and bias.

  • The Wisdom of Crowds - A business book by James Surowiecki advocating the idea that large, diverse groups of people can collectively make better predictions than individual experts (a toy simulation of the effect follows this list). Relates to concepts of wise crowds and collective intelligence.
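
As a toy illustration of the statistical intuition behind that title (my example, not from the book): when errors are independent and unbiased, the average of many noisy guesses lands far closer to the truth than a typical individual guess.

```python
import random

random.seed(1)
TRUE_VALUE = 817   # hypothetical quantity being estimated, e.g. beans in a jar
N = 1000           # number of independent guessers

# Each guess is the true value plus independent, unbiased noise.
guesses = [TRUE_VALUE + random.gauss(0, 200) for _ in range(N)]

crowd_estimate = sum(guesses) / N
avg_individual_error = sum(abs(g - TRUE_VALUE) for g in guesses) / N

print(f'crowd error:          {abs(crowd_estimate - TRUE_VALUE):.1f}')
print(f'avg individual error: {avg_individual_error:.1f}')
```

The catch, which ties back to the echo-chamber discussion above, is independence: if guessers copy one another, their errors correlate and no longer cancel.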

#book-summary