
Life in Code - Ellen Ullman

Matheus Puppe

72 min read


  • Programming is an all-consuming activity that requires intense focus and concentration. Programmers work in a world removed from normal human time, often for years at a stretch.
  • Programmers tend to avoid interruptions and social interaction in order to maintain their focus. The author went for two years in one job barely speaking to colleagues.
  • Programming is challenging because you have to translate chaotic human thinking into the precise, logical steps that computers require. It is inevitable that there will be mistakes and “bugs” in any program.
  • Programmers often prefer to work on “low-level code” that is close to the machine and used primarily by other programmers. This allows them to avoid obligations to normal human time and schedules.
  • The author’s friend Frank envied the author’s position closer to the machine and further from normal human responsibilities like family. Frank saw it as a way to recapture the intense focus and freedom from time that comes with low-level programming work.

The key themes are the single-minded intensity and focus required for programming, the desire to avoid normal human schedules and responsibilities, and the appeal of low-level work close to the machine. The reflections give insight into why programming can be such an alluring and even addictive activity for some.

  • According to the author, programmers and software engineers are valued more highly if their work is obscure and incomprehensible to most people. Code that is highly complex and low-level, like assembly language, is seen as more prestigious than code that is easier to understand.

  • The author describes a coworker, Frank, who was unhappy writing software used by normal people. When he had to work with customers, it was seen as a demotion by his fellow engineers. Though he earned more money, he lost status in their eyes.

  • The author describes attending a lunch with software engineers where they casually discussed forcibly sterilizing and killing people to eliminate a genetic disease. When she pointed out these were Nazi tactics, the engineers reacted with disgust and dismissed her as not a “real techie.”

  • The author profiles an engineering manager of Italian descent who has adopted an eccentric persona to fit in with his Silicon Valley research team. He and his team act in bizarre and childish ways, but this only increases their status as brilliant engineers close to the machine. Even their Japanese sponsors are delighted by their odd behavior.

  • The author describes attending a series of computing conferences, noting how the crowds become more dominated by white and Asian men as the focus moves closer to hardware and the “machine.” The presence of even a few women is a sign of how far a conference is from the heart of computing.

  • In summary, the culture values and rewards programmers and engineers who are as incomprehensible and detached from normal human concerns as possible. The more “low-level” and obscure the work, the higher the status in this culture. Eccentricity and boorishness are seen as marks of a “real techie.” Diversity declines the closer one gets to the machine.

The narrator grew up next door to Eugene, an overweight ham radio operator who spent his evenings alone in the basement communicating with strangers over the radio. Eugene’s elaborate rooftop antenna cast an “electronic shadow” over the narrator’s house, allowing the narrator to overhear Eugene’s radio conversations on Saturday nights. The narrator describes hearing Eugene’s voice through the television set when Eugene’s radio signal would override the TV signal, replacing the picture with loud static and oscillating wave patterns. The narrator came to associate these electronic sounds and images with Eugene’s presence.

The narrator came to feel that these electronic sounds and visuals were Eugene’s “true physical presence,” more real to her than the man himself.

The passage is about a woman who recalls her childhood neighbor, Eugene, an amateur radio operator who would broadcast messages late into the night seeking contact with other operators. His lonely calls came through on her family’s television.

Years later, she finds herself up late at night unable to sleep. She turns on her computers and connects to an online bulletin board, where people are chatting about technology. This reminds her of Eugene and his late-night radio broadcasts. She realizes she has become like Eugene, seeking connection with unseen others through technology in the middle of the night.

She connects to her email and finds a message from Karl, a colleague she works with on a project. They realize they are both up late, connect briefly, then continue their work. She begins to become curious about Karl, though they rarely meet in person. She comes to know him primarily through email, where she adopts different personas to communicate with different people, including her colleagues.

In summary, the passage uses the memory of Eugene, the lonely amateur radio operator from the narrator’s childhood, as a metaphor for the kind of technological connection and relationships she now has as an adult. Her brief late-night email exchange with Karl makes her realize they share this kind of lonely connection through technology, even as she remains unsure of who he really is behind his distant electronic persona. The passage suggests technology both enables human connection across distances but also obscures intimacy through the different personas we construct online.

  • The author was a contractor hired to work with a team on a computing project. The team communicated extensively through an email system with multiple aliases that made it hard to know exactly who recipients were.

  • The email culture on the team was very combative and critical. Team members frequently attacked and insulted each other over email. The author was often the target of these attacks, which were harsh and personal.

  • To survive in this culture, the only option was to develop a thick skin and respond with humor. Apologizing or asking for compassion was unacceptable.

  • The offices where the team worked were like a suburban campus, intended to keep employees comfortable. But the engineers were isolated and spent all day “talking” to machines that were like cranky children. Many turned to the internet and email for relief and human interaction.

  • The author, as a woman, found it easier to “code switch” and communicate in person. But she was still capable of the isolation and focus that engineering demanded. She loved working with powerful technology and complicated systems.

  • Though the work was highly technical, it was largely sedentary and focused on writing code and communicating via email. The author argues this is an “oxymoron”: software “engineering” without actually building anything physical.

  • After two months, the author still knew little about her colleague Karl besides his looks. But one night, after receiving another insulting email, she received a surprising message from Karl sharing a story of when he was the target of ridicule on the project. He described how it felt in an empathetic way.

The key points are the combative culture that dominated email communication, the isolation and “oxymoron” of software engineering work, the author’s ability to navigate between technical and interpersonal realms, and the surprising show of empathy from Karl after two months of little substantive interaction.

The narrator meets a man named Karl through work and they begin emailing each other frequently. At first, the speed and intensity of the email exchange is thrilling. However, the narrator soon grows frustrated with the “interpolation” feature of the email software that copies portions of the original message into the reply. She feels like she is living with an “echo” rather than receiving authentic responses.

Eventually, Karl and the narrator arrange to meet in person for dinner. However, the dinner conversation has the same stilted, stop-and-start quality as their email exchange. They have trouble finding a natural rhythm and flow to their in-person conversation. After dinner, they go for a walk on the beach. The narrator stops walking in an attempt to get Karl’s attention and have a more genuine interaction. It’s unclear whether her attempt is successful.

Overall, the passage explores how relationships can form and develop via online communication but face challenges transitioning to in-person. The tools we use to communicate can shape the way we relate in both positive and limiting ways.

  • The author recently committed an act of “technical rebellion” by choosing to purchase the Linux operating system instead of the ubiquitous Windows NT. This choice was an impulse buy but reflected a desire to support the open-source movement.

  • Open-source systems like Linux make their source code freely available, allowing programmers to understand the internals of the system. In contrast, commercial systems like Windows NT keep their source code secret. The author sees a computer system as a statement about how we understand and interact with the world, so the choice of Linux was meaningful.

  • The author had intended to buy Windows NT for professional reasons but bought Linux instead. Linux represents a revolution in programming that allows programmers to see how operating systems actually work. Before open-source software, programmers could only interact with systems at a surface level.

  • The author reflects on how the nature of programming and technical expertise has changed. In the past, programmers were “kept at a remove from the system internals.” Now, with Linux, programmers can read the actual source code, suggest changes, find and fix bugs, and learn from more experienced programmers.

  • The author learned to program using UNIX, a precursor to Linux developed at Bell Labs in the 1970s. UNIX was relatively open-source, with its source code licensed to universities. The author was able to see “what a system was made of.”

  • There is debate over whether Richard Stallman or Linus Torvalds originated the open-source movement. Stallman proposed an open-source UNIX-like system called GNU in 1983. Torvalds released Linux in 1991 but was accused of copying UNIX code. Stallman’s GNU was said to have been created independently.

  • The author reflects that modern systems like Windows NT have “absorbed into themselves as many complications as possible” in order to seem easy to users, but this façade comes at the cost of distancing programmers from a real understanding of the systems. The open-source model avoids this problem.

  • The author wanted to get rid of Windows NT on her computer and install Linux. This proved difficult and time-consuming. After many hours of searching through disks and trying various methods, she was finally able to uninstall Windows NT.

  • With no operating system left, the computer displayed an error message saying “NO ROM BASIC, system halted.” The author eventually figured out that this was a vestige from early IBM PCs that had BASIC built into the ROM. Modern PCs no longer have this, so with no OS the computer defaulted to trying to run BASIC.

  • While troubleshooting, the author learned some details about the BIOS and the origins of the PC, realizing that the system was built up over time and contains remnants of its history, like archaeological layers. The PC is not as “new” as often portrayed.

  • In contrast with the friendly look and feel of Windows, the bare Linux system just presented a plain log-in prompt. But the author felt a sense of power and control in being able to fully uninstall Windows NT and have such low-level access to the system.

  • Overall, the experience revealed the PC as a palimpsest, built on layers of older technologies, and gave the author an appreciation for the control and simplicity of Linux in contrast with the authoritative and opaque nature of Windows.

  • The passage discusses how modern programming tools like visual development environments, wizards, and code generators make programming easier by hiding complexity and automating code generation. However, this convenience comes at the cost of programmers not fully understanding the systems they are building.

  • The author argues that while these tools increase productivity and allow programmers to build more complicated systems, they risk programmers becoming unable to debug or modify these systems when issues arise because they do not fully understand how they work under the hood. Programmers become reliant on the tools and unable to program without them.

  • The author gives an example of working on a project where Windows programmers using Visual C++ quickly built an application but then struggled to debug it, requiring UNIX programmers to step in and fix issues. The UNIX programmers were more accustomed to working with raw code and understanding systems at a deeper level.

  • The author suggests that as knowledge is encoded into computer code, it can become opaque and incomprehensible. Programmers come and go, and over time no one fully understands the original code and why it was designed that way. The year 2000 problem and issues with air traffic control systems are examples of knowledge being lost in code.

  • The author sees Linux as a way for programmers to reconnect with code and understand systems at a low level again. Struggling to get a CD drive working in Linux drove the author to examine the actual hardware and code, reminiscent of early UNIX systems before programming tools and operating systems became overly simplified and obfuscated.

  • The author sees the rise of Linux as a desire by programmers to return to examining raw code and truly understanding the systems they work with, rather than relying on tools that hide complexity.

In summary, the key ideas are that visual programming tools and code generation sacrifice a deep understanding of systems for convenience and productivity; knowledge embedded in computer code can become opaque over time; and Linux represents a return to programming where engineers work with and understand code and systems at a low level.

  • The speaker attended an event at the Cisco Systems campus in San Jose. The room could seat 200 but was crowded with over twice that number, all there to hear Linus Torvalds, the creator of Linux, speak.

  • Torvalds discussed the technical challenges of writing an operating system to run on multiple processors. He described the trade-offs between isolating programs to avoid interference and slowing them down with too many locks. He emphasized that there were many valid solutions, none solving every problem. His talk conveyed that software engineering was about balancing imperfect solutions.

  • The next month, Marc Andreessen, co-founder of Netscape, spoke at the Silicon Valley Linux Users Group. The day before, Netscape released the source code for its browser. Andreessen and others celebrated this as “a return to our roots.” However, the speaker felt the mood seemed forced and was not convinced that open-source software and many programmers could overcome the challenges from Microsoft and Oracle.

  • The speaker lives near the event venue. After the talks, the speaker reviewed some Linux code and joined an online forum where programmers from around the world were suggesting code changes. The speaker appreciated not just the technology but also “the society of programmers, talking.”
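
Torvalds’s trade-off between isolating programs and slowing them down with too many locks can be illustrated with a minimal sketch. This is a hypothetical illustration, not his actual kernel code: Python’s `threading` module stands in for kernel locking, and the class names are invented. One coarse lock is simple but serializes everything; per-slot fine-grained locks let threads touching different data proceed in parallel, at the cost of more locks to manage.

```python
import threading

class CoarseCounters:
    """Coarse-grained: one lock guards every slot. Simple and safe,
    but all threads wait on the same lock even when they touch
    different slots."""
    def __init__(self, n):
        self.slots = [0] * n
        self.lock = threading.Lock()

    def incr(self, i):
        with self.lock:
            self.slots[i] += 1

class FineCounters:
    """Fine-grained: one lock per slot. Threads updating different
    slots no longer contend, but there are n locks to keep track of
    (and deadlock risk if an operation ever needed several at once)."""
    def __init__(self, n):
        self.slots = [0] * n
        self.locks = [threading.Lock() for _ in range(n)]

    def incr(self, i):
        with self.locks[i]:
            self.slots[i] += 1

def hammer(counters, slot, times):
    # Repeatedly increment one slot from this thread.
    for _ in range(times):
        counters.incr(slot)

# Either scheme yields correct totals; they differ only in contention.
for counters in (CoarseCounters(4), FineCounters(4)):
    threads = [threading.Thread(target=hammer, args=(counters, i % 4, 1000))
               for i in range(8)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    assert counters.slots == [2000, 2000, 2000, 2000]
```

Both versions are “valid solutions” in Torvalds’s sense: neither solves every problem, and the choice is a balance between interference and locking overhead.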

  • The speaker first discusses Y2K fears in February 1999, about 10 months before the bug might cause computers to fail due to storing years as two digits. Initially, the speaker and other technical experts thought programmers had enough time to find and fix critical code. However, many who studied Y2K developed a deep, “animal insecurity.”

  • Two months earlier, an anonymous programmer had called the speaker late at night with dire warnings that Y2K would lead to societal collapse. The speaker considered this person an extremist “survivalist.” Still, at the time, the speaker found the Y2K fears amusing: they revealed the fallibility of a technology that some had promoted as salvific.

  • The term “millennium bug” is a misnomer. Strictly speaking, the millennium did not turn until 2001, and storing years as two digits was an intentional design choice, not a bug: it saved scarce memory and storage. Still, the turn of a century stirs apocalyptic fears, even regarding our “secular higher power,” digital technology.

  • Early computer programs were designed to represent years using two digits to save space, since memory and storage were limited. This convention continued even as technology improved and space constraints were alleviated.

  • The year 2000 problem (Y2K) arises from this two-digit year convention. When the year changes from 1999 to 2000, the year will be represented as 00, which will be interpreted as 1900 by systems that use two-digit years. This can cause date-related errors and system crashes.

  • The author visits a brokerage firm and speaks with a quality assurance manager, Lawrence Bell, who is overseeing the company’s Y2K remediation. The company’s systems span many decades of technology, from mainframes to modern servers and software. Rewriting and updating the old systems is difficult and complex.

  • The author then speaks with Jim Fuller, a programmer who has worked for the Federal Reserve for over 30 years. Fuller describes the systematic process the Federal Reserve is using to identify and fix Y2K issues. However, Fuller worries that many other organizations are not sufficiently prepared, which could have major consequences. Despite the Federal Reserve’s best efforts, Fuller cannot say definitively whether their systems will stay up.

  • The author reflects on the scale of the Y2K problem, considering the huge numbers of computers, programs, and lines of code that will be affected. Even a small proportion of failures could be hugely disruptive. The example code shows how a basic date comparison could cause a system to crash at the turn of the millennium.

  • In summary, the year 2000 problem stems from the outdated convention of representing years with two digits. Fixing this issue requires a massive, complex effort to review and update systems and software spanning many decades. Despite some progress, there are widespread concerns about potential technology failures at the start of the year 2000.
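
The two-digit failure mode described in these notes can be sketched in a few lines. This is a hypothetical illustration, not the book’s actual example code; the function names are invented. Subtracting two-digit years works until one of them wraps from 99 to 00, at which point elapsed time goes negative; the standard fixes were widening to four digits or “windowing” around a pivot year.

```python
def years_elapsed(start_yy, end_yy):
    """Naive two-digit arithmetic, as in many pre-Y2K programs."""
    return end_yy - start_yy

# An account opened in 1997, checked in 1999: fine.
assert years_elapsed(97, 99) == 2

# Checked in 2000, stored as "00": the system effectively sees 1900
# and computes a negative age -- the kind of result that broke
# date logic downstream.
assert years_elapsed(97, 0) == -97

def years_elapsed_windowed(start_yy, end_yy, pivot=50):
    """A common remediation: interpret yy < pivot as 20yy, else 19yy."""
    def widen(yy):
        return 2000 + yy if yy < pivot else 1900 + yy
    return widen(end_yy) - widen(start_yy)

# With a pivot window, 97 -> 1997 and 00 -> 2000, so the math holds.
assert years_elapsed_windowed(97, 0) == 3
```

Windowing only postpones the problem, of course: a pivot of 50 fails again for dates past 2049, which is exactly the “what upper limit should we use?” question raised at the conference workshop.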

  • The passage describes attending a conference on the Y2K computer problem in the year 1999. The keynote speaker blames programmers for not anticipating the Y2K issue, but the author argues that technology progresses in steps based on human desires and limitations, and programmers have to work within those constraints.

  • The passage discusses attending workshops at the conference, including one on creating a “time machine” to test fixed code. The author asks what upper limit they should use for the year, realizing they should worry about dates far in the future, not just 2000. The presenter avoids directly answering.

  • The author describes her favorite proposed solution from two attendees who work for a railroad company. They plan to lie to their system about the date, setting clocks back to 1972, since the days of the week are the same as in 2000. The author finds this solution ingenious.

  • The author visits Texaco to observe their Y2K testing. She meets with managers who have worked there for decades and helped write much of the old code. They show her a test of a remote terminal unit that monitors liquid flow in real time. They have to test each part of the system individually. The author sees how difficult and time-consuming the process is.

  • The overall perspective is that Y2K is a serious technical challenge, but the blame does not lie solely with programmers. Technology develops based on human demand and ingenuity, within the constraints of the current infrastructure and knowledge. Fixing the Y2K issue requires understanding these complex, embedded systems built up over decades. Simple fixes may work as temporary solutions, but comprehensive solutions take enormous effort.
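
The railroad workers’ set-the-clock-to-1972 trick rests on a calendar coincidence: 1972, like 2000, is a leap year that begins on a Saturday, so every date in the two years falls on the same day of the week. A few lines of Python confirm the alignment:

```python
import calendar
import datetime

# Both years are leap years, so they have identical month lengths...
assert calendar.isleap(1972) and calendar.isleap(2000)

# ...and both begin on a Saturday (weekday() == 5, Monday being 0).
assert datetime.date(1972, 1, 1).weekday() == 5
assert datetime.date(2000, 1, 1).weekday() == 5

# Therefore every month/day pair shares a weekday across the two years.
for month in range(1, 13):
    for day in (1, 15, 28):
        assert (datetime.date(1972, month, day).weekday()
                == datetime.date(2000, month, day).weekday())
```

Any system that only cares about weekday-dependent schedules, as a railroad timetable might, can run on 1972 dates through 2000 without noticing, which is why the author found the workaround ingenious.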

The passage describes a test of a remote terminal unit (RTU) that monitors the flow rate of oil through pipelines. The test reveals that the RTU has a Y2K bug that causes it to malfunction at midnight on January 1, 2000, displaying incorrect dates and becoming unable to send data to the supervisory control and data acquisition (SCADA) system.

The failure of this one RTU could have major consequences. RTUs and other intelligent devices constantly send data to SCADA systems, which oil companies rely on to monitor their pipelines, analyze production, bill customers, and ensure compliance with regulations. If these systems fail, it could shut down oil production, disrupt the supply chain, and have cascading effects on the economy.

Despite discovering this Y2K bug, the engineers testing the RTU remain optimistic that hard work and cooperation can fix critical problems and limit damage. However, the passage also describes a visit to an offshore oil platform control room where an executive acknowledges worrying about interdependencies and cascading failures that could result from Y2K issues with suppliers, customers, and infrastructure like utilities.

The passage ends on December 31, 1999, with the author hosting a New Year’s Eve party in San Francisco. Although concerned about Y2K disasters and aware of the city’s vulnerability to earthquakes, the arrival of the new year passes without major incidents. The mood at the party deflates as infrastructure continues functioning. The fears about catastrophic Y2K bugs prove exaggerated, even though issues like the RTU problem show that some important computer systems were unprepared.

In summary, the passage recounts both optimistic and worrisome perspectives about Y2K at the turn of the millennium. Testing reveals potentially serious technological issues, but hard work appears to have limited major failures even as some expectations of disaster prove overblown. The transition to 2000 ultimately passes without the dreaded societal meltdowns that some had feared.

The author recounts an experience of watching two young boys try to describe playing a video game together, and finding that their experiences were so private and confusing that they couldn’t communicate effectively about it. She compares this to her experience helping two clients use the early World Wide Web for the first time in 1995, and finding that they had a similar inability to convey their experiences to each other in a coherent way.

She reflects that web surfing at the time seemed harmless, like a private form of entertainment that only affected the individual. However, around 1998 she began to see things differently. She describes seeing an irritating billboard displaying the slogan “now the world really does revolve around you.” The billboard was advertising a semiconductor company, reflecting the growing commercialization of the web at the time.

The author argues that the commercialization of the web was proceeding by trying to isolate individuals within an online marketplace. A concept called “disintermediation” was removing intermediaries like agents and brokers from transactions. While this was accepted as inevitable, she believes it reflected an attempt to convince people that this massive change was empowering for them as individuals.

In her view, the ideas in technology spread into the wider culture. She worries that the web was no longer just a place for personal freedom, but was becoming an actual marketplace that was changing the nature of everyday life. The removal of intermediaries threatened to have a huge impact on the structure of reality itself, given how dominated the world was by markets.

So in summary, the author traces her early optimism about the web, followed by growing concern over its commercialization and the concept of disintermediation. She worries about the web isolating individuals and altering reality through the elimination of intermediaries in an increasingly market-dominated world.

  • The web did not cause the trend of disintermediation, or the elimination of middlemen in commerce. However, it enabled and accelerated this trend by making it easy for companies and consumers to cut out intermediaries.

  • We are seeing an attempt to create a capitalist system without reliance on salespeople and physical retail. Companies are trying to convince consumers that intermediaries are useless or dishonest, and that self-service is best. Only the wealthy will receive personal service. The rest will rely on websites.

  • There are illusions about the web, including that it makes us more powerful and gives us endless choice. In reality, the web reduces our control and often provides too many choices, leading to unhappiness. An example is searching for a simple item like a faucet, and being overwhelmed with thousands of options.

  • The author fears for the world the web is creating. In the past, extreme or fringe beliefs required physical effort to sustain. Now, people can isolate themselves in “thought bubbles” online, reading only what reinforces their beliefs. This makes it hard to form consensus or compromise.

  • An example is a TV commercial depicting a post-apocalyptic world, with ruins, guards in a library, and a long line at a bank. The commercial glorifies being alone at home, and is a “shameless and naked expression of the web world.”

  • In summary, while the web has benefits, it also enables the spread of misinformation and extreme beliefs. And it contributes to a fractured society where people are increasingly isolated and unwilling to find common ground.


The passage describes San Francisco’s South of Market neighborhood in early 1999, during the peak of the dot-com boom. The author goes to a crowded bar late at night where young tech entrepreneurs and internet enthusiasts gather to network and dream of getting rich from online startups. After the bar closes, the streets become quiet and empty, except for road crews doing loud construction work through the night.

The author captures the wild optimism and dreams of wealth permeating the city at the time, fueled by the rapid growth of internet companies. At the bar Infusion, young ambitious men plot how to launch startups, take them public, and become millionaires. The author gets caught up in the infectious optimism and excitement about how the internet will transform life.

When the bar closes, the area becomes deserted, a reminder of how early San Francisco, for a major city, shuts down. The author walks home through the abandoned streets to her loft. The only signs of life are a few lit windows in buildings, the carcasses of pigeons caught in nets under an overpass, and the road crews tearing up the streets to lay fiber-optic cable for the internet boom.

The passage highlights the contrast between the frenetic energy, ambition and optimism of the era during the day and at night in bars and the empty, quiet darkness of the streets when all has shut down. It captures a moment in time when San Francisco was the center of the dot-com revolution and the dreams of its young inhabitants to make fortunes and change the world through technology.

  • The narrator lives in San Francisco’s South of Market (SOMA) neighborhood, where there is a lot of construction and digging up of the streets going on. The neighborhood is transforming rapidly, with many internet startups moving into old factories and warehouses.
  • There are remnants of SOMA’s industrial past, like a lithography factory, a sewing sweatshop, and a union hall. But they are disappearing as the startups take over. The skyline is filled with cranes, indicating rapid building. The narrator feels walled in by the new construction.
  • The constant noise from jackhammers, trucks, and nighttime construction crews disrupts the narrator’s sleep. Mysterious crews are frequently opening up a manhole in the street for unknown reasons.
  • The narrator attends a party at eGroups, a startup founded by her friend Carl Page. She observes the culture of the young, mostly male programmers who work long hours in a dim, hot warehouse space.
  • At the party, the narrator runs into Carl’s brother Larry Page and Sergey Brin, the founders of Google. When Larry asks what she is doing, she mentions she has been “fooling around with symmetrical multiprocessing.” Larry immediately offers her a job at Google.
  • The narrator feels like a fraud, aware of how much she doesn’t know. But she realizes that to work at Google, anyone would have to push beyond what they currently know and work at the cutting edge. She observes the young programmers around her and feels she has outgrown that culture.

The key themes are the rapid transformation and tech dominance of SOMA, the disruptive impact of that change, and the narrator’s complex reaction to being offered an opportunity that she feels unprepared for, at the center of that change. Overall, it seems a commentary on the culture of ambition and constant progress in Silicon Valley.

  • The author attends the Computers, Freedom, and Privacy Conference in Toronto. The conference attendees are usually an eclectic mix of people, but this year the mood is subdued. The author attributes this to the lack of data privacy laws in the U.S. compared to Canada, and to the sense that crossing into the U.S. means losing some constitutional rights.

  • There is an ideological tension between supporting social policy/activism and believing in cryptography. In the past, the conference debates reflected this tension. But this year, several key figures “defected” to supporting social policy, including Phil Zimmermann, Neal Stephenson, Tim Berners-Lee, and Whitfield Diffie.

  • At a dinner, Neal Stephenson gives an unstructured talk using hand-drawn transparencies. He says that social structures, not cryptography, provide the best defense for privacy and integrity. He shows a series of circles that get bigger and overlap to represent social structures. Though Stephenson is a former programmer who writes about cryptography, he seems to be arguing that social policy is more important for privacy than technology alone.

  • The author is surprised by Stephenson’s argument based on his previous work. But the overlapping circles Stephenson shows seem to represent how social structures can provide privacy through inclusion and connection.


  • The speaker argues that cryptography alone is not enough to protect privacy and ensure security. It needs to be combined with social structures and agreements to be effective.

  • Several prominent technologists expressed views questioning the libertarian belief that unfettered use of technology alone can solve issues like privacy. They argued for a role of laws, regulations and social organizations as well.

  • Neal Stephenson criticized the reliance on cryptography alone as like having only one picket in a fence. He said crypto needs a “sociopolitical context” to work.

  • Phil Zimmermann, creator of the PGP encryption tool, said he “never meant PGP to be the defense of the lone libertarian,” implying that it needs to be combined with other approaches.

  • Tim Berners-Lee, creator of the World Wide Web, worried that the egalitarian principles of the early web were being threatened by large corporations and money. He started to argue that “libertarians are used to fighting the government, and not corporations…” but didn’t finish the thought. He said “we don’t like regulation where we can avoid it, but…” suggesting some regulation may be needed.

  • Whit Diffie, a famous cryptographer, gave an impassioned speech arguing that software and technology alone cannot maintain privacy and security. He said “it turns out you have to trust other people.” He warned against becoming “slaves to the mainframe” again in an age of thin clients and centralized network control. He warned this could enable “corporate imperialism over its workers.” He called for workers to organize and “bargain as a whole” to counter this.

  • In summary, several key technologists expressed views that questioned the belief in technological solutions alone, and argued that legal, regulatory and social structures are also needed to enable privacy, security and autonomy. They suggested crypto needs a broader “sociopolitical context” to be effective.

  • Twenty-four years ago, the author attended the conference and noticed the absence of major tech companies and investors. Though the attendees debated the future of the web, they had little influence compared with the large tech companies and venture capitalists who stayed away.

  • A month before the conference, the Nasdaq stock index hit a peak but then fell 29% in three days. However, it quickly recovered as people believed tech stocks were now bargains and the decline was a chance to buy in. The author felt drawn in by this belief, but her friend and financial advisor Clara Basile advised against it, comparing the unprofitable online grocer Webvan to the established supermarket chain Safeway.

  • The stock market is driven by stories and narratives, not just numbers. Venture capitalists evaluate startups based on their story. Daily market movements are explained by narratives in the media, even if the explanations don’t fully make sense or are inconsistent. The narratives are meant to keep people engaged in the market.

  • During the dot-com boom, the dominant narrative was that tech companies were different and didn’t need profits right away. Success was measured by “eyeballs” and traffic. Startups would get funding from venture capitalists and then hopefully go public to raise more money, even without a plan to become profitable.

  • The Nasdaq index rose steadily from 1995 to 1998, reflecting enthusiasm for tech stocks, but the rise was not yet irrational. There were real reasons for tech companies to be valued, but this would change.

  • The widespread adoption of graphical web browsers in the 1990s, especially Internet Explorer, drove the rise of the web by making it accessible to millions of ordinary people. This ignited an explosion in e-commerce and new internet startups.

  • The demand for online porn fueled further innovations to improve web infrastructure like bandwidth. Porn sites were early pioneers in e-commerce and made lots of money.

  • In 1998, the collapse of the hedge fund Long-Term Capital Management caused the Federal Reserve to lower interest rates. This made borrowing cheap and spurred a rush of investments into risky internet startups by venture capitalists and the general public.

  • Between 1999 and 2000, the technology-heavy Nasdaq stock index rose rapidly due to hype around internet stocks, fueled by the belief that profits didn’t matter and investors should “buy on dips.” The index’s rise became parabolic and unsustainable.

  • The Nasdaq reached a peak in March 2000. Its subsequent crash wiped out 78% of its value over two and a half years. The crash shattered the belief systems that had fueled the bubble. It represented a massive transfer of wealth from ordinary investors to venture capitalists and startup insiders.

  • The boom and bust was driven more by market dynamics and hype than the actual value of technology. It exemplified how the early web had become more about finance than its original vision. The biggest losses were borne by middle-class investors who were seduced by dreams of easy riches.

The key factors in the rise and fall of the dot-com bubble were: the widespread adoption of web browsers, cheap money from the Fed, hype and irrational exuberance around internet stocks, the belief that profits didn’t matter, and the dynamics of a parabolic market rise and crash. The biggest losers were middle-class investors, while venture capitalists and startup insiders often made out well.

  • In 2002, computer scientists met to discuss how to distinguish humans from software robots or “bots” in electronic communications. Though seemingly trivial, this problem reflects deeper questions about what separates humans from machines.

  • These questions have arisen from humanity’s tendency to build increasingly intelligent tools and machines, as reflected in science fiction. We seem compelled to create robots and artificial life that approach or surpass human capabilities.

  • Some scientists believe human life is just one of many “possible biologies” and that there is nothing inherently special about humanity. They envision the rise of a “post-human”: an artificial, nonbiological intelligence with capabilities beyond those of humans.

  • In 2000, scientists discussed whether “spiritual robots” would replace humanity by 2100. Douglas Hofstadter suggested it is not surprising that life could emerge from inert matter or change its material basis from carbon to silicon. He wondered if there would still be humans in the future.

  • The prospect of creating artificial intelligence and life has long fascinated computer scientists. In the 1970s and 1980s, many believed AI could lead to intelligent machines. Even though early expectations were not met, AI raised deep questions about human nature, mind, and consciousness that were previously addressed through religion or the humanities.

  • With the decline of religious explanations, science and technology have become a means to understand the essence of human life and intelligence. The study of artificial and cybernetic intelligence is a way to address fundamental questions about what animates humans.

So in summary, as humanity’s technological capabilities have grown, we have gained the means to build increasingly intelligent machines that approach human-level intelligence and beyond. This prospect has raised philosophical and existential questions about human nature and humanity’s place in the world. Though the goals of early AI were not achieved, the field articulated questions that get at the heart of what it means to be human. And for some, the creation of artificial life and intelligence is a way to better understand the animating forces of human life.

  • Philosophers and scientists have tried to define what makes humans human. Three popular concepts are: 1) humans as computers; 2) humans as social insects like ants; and 3) humans as accidents of evolution.

  • The view of humans as computers originated in the 1950s, soon after digital computers were invented. Pioneers like Alan Turing, Herbert Simon, and Allen Newell proposed that human intelligence could be replicated in machines. They saw the human mind as an information processor, like a computer.

  • In his 1969 book The Sciences of the Artificial, Herbert Simon argued that humans should be studied as artificial systems that adapt to their environments, just like computers. He proposed studying computers to understand the human mind. This circular logic has influenced decades of thinking in cognitive science and artificial intelligence.

  • Early AI succeeded in simulating rule-based, rational thought. But it failed to replicate human qualities like flexibility, intuition, emotion, and consciousness. Critics argued that AI relied on too narrow a view of human intelligence.

  • Nevertheless, the metaphor of humans as computers persists in the sciences. Many scientists now view living things, including humans, as machines made of biochemicals. Psychologists and biologists often study simulations and robots to draw conclusions about human and animal behavior.

  • However, the view of human biology as purely cybernetic may eventually be challenged. Completing the human genome map could lead to new insights into the complexity of human life that transcend mechanical metaphors.

In summary, the debate about what makes us human continues in philosophy and science. Comparing humans to computers and robots has been an influential metaphor, but it may provide too limited a view of human nature. A more nuanced understanding of human consciousness and biology could emerge from new frontiers like genetics.

  • In 2001, scientists announced they had sequenced the entire human genome. They were surprised to find only about 30,000 genes, far fewer than expected. This challenged the idea that there is a simple relationship between genes and human complexity.

  • Researchers proposed other explanations for human complexity, such as how proteins are folded, genes having multiple functions, some genes not being expressed in some circumstances, and unknown interactions between genes. DNA is not equivalent to programming code.

  • Cybernetics studied ants as a model for intelligence emerging from simple interactions. Ants produce complex behavior without a designer. Herbert Simon proposed that human intelligence could emerge similarly from simple interactions, like an ant navigating a beach.

  • The idea of “emergence” - complex outcomes arising from simple interactions - became key to complexity theory, chaos theory, cellular automata, robotics, and artificial life. Intelligence could emerge from many simple interactions between automata (simple, rule-following components).

  • However, software is much harder to create than hardware. As systems become more complex, software becomes much more difficult to write. Object-oriented programming uses many small chunks of code that interact, but systems can fail or run out of control. Human programmers have to understand, fix, and change these systems.

  • Darwinian evolution cannot explain the emergence of cybernetic life. Evolution depends on the drive to survive and reproduce, which software lacks. Software that simulates human qualities is still just a simulation.

  • The mistake in robotics and AI is mistaking the tool (current software methods) for the builder (human intelligence). How human intelligence is understood changes with the popular paradigm in computing. But computing paradigms do not necessarily illuminate human intelligence.

  • Even proponents of emergence recognize its limits in explaining human intelligence. Computational metaphors for the mind and evolution may not adequately capture the essence of human sentience. There are aspects of human experience that emerge in a profound way, not just from interactions, but from consciousness itself.

  1. Artificial life (ALife) research aims to create synthetic biology and life “in principle” by simulating the lowest-level interactions that produce life. However, there is a problem of determining what level of interactions must be simulated - cells, molecules, atoms, elementary particles? This raises the question of the “bottom of physics” - how deep must we go to understand and create life.

  2. Meanwhile, robotics grapples with the question of how to achieve “higher-level” functions like self-awareness and theory of mind in artificial agents. This requires addressing philosophical questions about consciousness and cognition that AI has struggled with.

  3. ALife sees humans and life on earth as accidental and irrelevant to understanding life “in principle”. It aims to create software programs that display properties of life like self-replication and adaptation. This takes the view of life as a disembodied, abstract phenomenon that can exist solely within computers.

  4. These approaches share a disregard for the biological body. Early AI saw the body as irrelevant “meat” and aimed for disembodied intelligence. Kurzweil sees the body as an impediment to intelligence. ALife ignores the body altogether. Even robotics, which requires a physical form, sees intelligence as separable from its biological origins.

  5. However, the body - and experiences like eating, hunger, and waste - are crucial to intelligence and sentience. Huge parts of existence and cognition depend on the body and its needs. By separating “life” and “mind” from the body, AI and ALife lose access to major parts of what makes us alive and able to think.

  6. In this way, AI and ALife take an implicitly dualist, body-denying stance that sees intelligence as a “spirit” separable from its physical form. But intelligence evolved to serve the embodied drive to survive and continue living. It cannot be understood apart from its bodily origins and purposes.

That covers the key points on ALife research, its problems in determining what level of interaction to simulate, its disregard for the human body, and why that stance is limiting. The summary argues that intelligence and life cannot be disembodied or separated from their biological purposes and origins.

  • Sentience arises from our biological embodiment as mammals, not from something separable like machines or software. Our intelligence is fundamentally shaped by the details of our mammalian bodies and brains.

  • A key part of mammalian intelligence is the ability to engage in social and emotional interchange, which is enabled by the limbic system in our brains. Sentience begins with the capacity for rich social relationships and understanding the emotional states of others.

  • Emotions are critical to sentience and rational thinking. They give meaning and context to even simple things like a chair. We can describe a chair abstractly, but without emotions we cannot capture the richness of the human experience of it.

  • Some roboticists are starting to explore adding emotions and social abilities to robots to make them appear more sentient or intelligent. But replicating the complex emotional and social existence of a living being is an enormous challenge. The emotions of living creatures involve internal realities and needs, not just outward behaviors or games to elicit reactions from others.

  • In summary, sentience arises from our embodied existence as social mammals, not from a separable mind or logical reasoning system. Emotions are essential to intelligence, not opposed to it. And rich emotional interchange begins at a very basic biological level.

The researchers working on social and emotional intelligence in robots realize they have a long way to go to achieve human-level abilities. They have hit upon many unsolved questions about learning, motivation, personality, and consciousness that they don’t fully understand. The biggest challenge is understanding consciousness and the sense of self.

Rodney Brooks argues that humans are just a set of tricks and mechanisms. But the author disagrees and believes human consciousness evolved to allow individuals to recognize one another, form alliances, and thrive in social groups. This ability to recognize others leads to our own sense of identity and self. Robots currently lack this ability to recognize others of their own kind.

The author argues that a sense of self is critical for social creatures. While humans are made up of many parts that are not unified, we develop a sense of being a whole, unique self. This uniqueness is built into our biology, as each human has a distinct genetic makeup. Our brains then develop based on experience in a way that makes each individual increasingly different. This uniqueness and ability to tell one another apart is key to human society and relationships.

Simulating a creature with a sense of self and identity would be extremely difficult, like simulating a complex system such as the weather. While simulations can do a reasonably good job in the short term, the complexity involved in simulating a self over a long period of time may prove too great. The researchers have a long way to go to achieve human-level social and emotional intelligence in robots. A sense of identity and mutual recognition seem to be key missing pieces that would be hard to program.

  • The author adopts a cat named Sadie from an animal shelter. Sadie is initially two years old.

  • Sadie loves climbing up and down beams in the author’s loft, showing no fear of heights. The author envies Sadie’s fearlessness.

  • In the author’s twenties, she and her friends were very focused on their cats. Their cats provided companionship and an escape from constant human interaction.

  • Toward the end of Sadie’s life, at age twenty-two, she is very sick and frail. She has trouble walking, eating, and using the litter box. The author has to hold Sadie up so she can eat and urinate. Sadie eventually falls down the stairs.

  • Six months later, Sadie has a seizure and dies in the author’s arms. The author and her husband, Elliot, feel sad but also relieved that Sadie’s suffering and their caregiving duties are over. They can now live in New York, something they couldn’t do while Sadie was alive.

  • The author creates a shrine to Sadie, including her ashes, notes, and a poem about the death of a cat. The poem suggests the cat was unaware of death, providing some consolation. But the author knows Sadie was aware of her declining health and difficulty walking, eating, and using the litter box.

  • The author questions whether her relationship with Sadie was real or whether Sadie was just enacting her biological programming. She wonders if cross-species relationships between humans and other animals are possible.

  • A rude woman at a restaurant tells the author that Sadie didn’t really love her, that was just the author’s idea. But the author knows this woman never met Sadie and has no way of knowing how Sadie felt about the author.

The key themes are love and loss, the possibility of cross-species relationships, suffering and mortality, and gaining a sense of freedom mixed with guilt. The story explores the complex relationship between the author and her cat Sadie, especially toward the end of Sadie’s life.

  • The author recently finished writing a book that took five years to complete. To reward herself, she decided to buy a new laptop computer, even though her old one was still functioning.

  • Before making the purchase, the author saw old, discarded electronics on the sidewalk and felt a tinge of guilt for so quickly replacing technology that was once new. However, the allure of a faster, higher-capacity new computer proved too strong to resist.

  • The author’s new laptop was far superior to her old one in every way - much faster processor, more memory, bigger screen, and far more storage space. Her old laptop, in contrast, looked decrepit with loose screws, missing keys, dead pixels on the screen, and other signs of heavy use and age.

  • Though outdated, the author’s old laptop contained years of files, documents, and memories. Migrating all of this to her new high-tech laptop made the author reflect on technology’s role in shaping memory and relationships. The author realized how deeply intertwined her old laptop had become with her sense of self and personal history.

  • In the end, the author kept her old laptop as a backup and companion to her new one. Though lacking in capability, the old laptop held a depth of memory and meaning that made it still valuable. Both the cutting-edge new technology and the outdated old tool had important roles to play in the author’s life.

The key idea is that technology shapes our lives, relationships, and memories in complex ways. Even as new tools arrive, the old ones that have journeyed with us maintain a hold on our sense of identity and the narratives we construct about ourselves.

The author buys a new laptop with vastly more storage space than her old computer. When faced with transferring files from the old laptop to the new one, she is reluctant to bring everything over, as some files contain painful memories of her first marriage and divorce. She grapples with deciding what is truly necessary to transfer vs. what can be left behind.

She discusses how memory itself is unstable and constantly changing. Scientific research shows that memories are not fixed in the brain but are reformed and reconnected each time they are recalled. The researchers found that preventing the formation of new memories also prevented the recall of old, supposedly “consolidated” memories. This suggests that memory is an evolving process, not a filing cabinet where memories are stored and retrieved unchanged. The author notes that sometimes vivid memories can come rushing back, but as soon as we remember them, they become enmeshed in our present experiences and lose their immediacy. Overall, human memory is unreliable, as evidenced by studies on eyewitness testimony.

The central theme is the desire to escape the past by leaving some memories behind, but also the impossibility of truly doing so, as memory continually reshapes itself. The author grapples with managing the perpetually “present” nature of digital information and memory.

The author took a programming course early in her career where the instructor compared creating a computer program to baking a cake. The analogy appealed to her at the time, as she knew programming but little about cooking. Many years later, she encountered a butcher selling homemade beef filets at a gourmet market, and she bought one despite rarely cooking meat. Unsure how to prepare the filet, she turned to Julia Child’s Mastering the Art of French Cooking for guidance.

Though the author had owned the cookbook for years, she had initially found its detailed descriptions of butchering beef and complicated cooking techniques intimidating. However, memories of watching Julia Child enthusiastically cook meat on TV as a child inspired the author to persevere. She sees cooking, especially meat cookery, as a visceral act that contrasts with the abstraction of programming. While programmers meticulously specify the logical steps to accomplish a task, cooking allows for sensory experience and improvisation within the constraints of a recipe.

The key distinction the author draws is between the corporeality of cooking and eating versus the disembodied nature of programming. Programming depends on abstract thinking and mathematical exactitude, divorced from physicality. In contrast, cooking, especially with meat, confronts us with the animal flesh we are preparing and consuming. The sensuality of cooking thus grounds us in our physical being in a way programming does not. Overall, the author implies that while programming and cooking can both be rigorous and formulaic acts, only the latter nourishes us in a fundamentally human way.

The author describes coming across a recipe for beef sautéed in a cream and mushroom sauce while reading a cookbook by Julia Child. Trying to imagine writing a computer program to produce this recipe leads the author to realize how impossibly complex such an endeavor would be. The recipe refers to many concepts—like entertaining important guests, a nice brown outside and rosy center, Stroganoff sauce, risotto, potato balls in butter—that would require a broad, culturally informed understanding to explain to a computer.

The author discusses how early hopes for artificial intelligence ran into difficulties as researchers tried to create symbolic representations of human knowledge. The world proved too complex, varied and interconnected to capture in the simplified “frames” and “problem spaces” they devised. Their approach was too Platonic, not grounded enough in human experience. The example of all the types and contexts of chairs, and the myriad associations each evokes, helps illustrate why symbolic AI failed.

While setting the table for a dinner party, the author thinks of both her mother, who gave her much of the dinnerware, and the philosopher Hubert Dreyfus, who discussed the problem of representing chairs. She wonders how she might explain to a hypothetical “social robot” guest all the implements, foods and their interrelationships at her table. The robot might grasp the names and purposes of all the cutlery, but without the human experience of eating, how could it really understand?

The key point is that human knowledge rests on a vast, culturally rich set of associations and understandings that arise from embodied experience. Early artificial intelligence failed because it tried to represent this knowledge in an abstract, disembodied way. Contemporary social robotics, with its emphasis on learning through human interaction and teaching, offers more promise. But true human-level intelligence may remain elusive without human experience.

Here is a summary of the story:

  • The narrator says that sometimes a story or idea can install itself in your imagination and become stuck in your brain, constantly playing out scenarios in your mind.
  • This could be anything that causes worry, regret, embarrassment or anger. For example, realizing you did a bad job on a work project and worrying it will be discovered, or feeling angry at a friend who has treated you badly over the years.
  • In the narrator’s case, in February 2006 a particular story began trailing her and installed itself in her imagination.
  • The story is not specified at this point, but the narrator says “this sort of thing can happen to anyone.” The implication is that the reader will eventually find out what scenario or idea has become stuck in the narrator’s mind.

The key ideas are:

  1. Our imaginations can sometimes become fixated on a worrying or upsetting scenario, idea or memory, causing it to play out repetitively in our minds.
  2. This can happen to anyone, and the cause can be anything that provokes feelings of worry, regret, embarrassment or anger.
  3. In February 2006, an unnamed story or scenario installed itself in the narrator’s imagination, though the details are not yet provided.

The summary outlines the issue of distressing thoughts, scenarios or memories becoming stuck in one’s mind, though the specifics of the narrator’s particular situation are still unclear at this point in the story.

The narrator comes across a newspaper story about a group of people born in a displaced persons camp in Bergen-Belsen after World War II. Though the story haunts the narrator, who does not understand why these people would want to commemorate such a terrible past, the narrator becomes obsessed with their story. The narrator begins imagining characters and a plot set in 1970s San Francisco, with characters who were born in Bergen-Belsen or are connected to that history in some way.

One character is an adopted young woman struggling to find her identity, who is in therapy. The narrator hears this woman’s therapy sessions through the wall of the adjacent office. The narrator is compelled to tell the story through the perspective of a strange male professor character who can overhear the young woman’s therapy. Though the narrator protests, this is the only way the story can unfold. In one sitting, the narrator writes the first 20 pages of the story in this professor’s voice.

The world outside fades away as the narrator immerses herself in this story and the world of 1970s San Francisco. The city at this time was facing threats of violence from groups like the Zodiac killer, the Death Angels, the Weathermen, and the Symbionese Liberation Army. However, the city was also frenzied with sexual desire, as gay culture blossomed in parts of the city.

The narrator invokes an imaginary Greek chorus, suggesting her internal struggle and protests against the dark tale that is pursuing her. Though she wants to escape thoughts of this grim story, she is compelled to continue unfolding it, even through the sinister-seeming professor character. The story seems inevitable, made up of elements from the narrator’s own life, though in a darker permutation. The story both pursues and hides from the narrator, who is caught in a kind of trap, experiencing a fluency in writing the tale that makes it hard to stop, despite its grim and haunting nature.

  • In 1981, the author decided she wanted experience programming mainframe computers, despite having no knowledge of how to do so.

  • A headhunter got her an interview at a national retail chain for a mainframe programming job, even though she lacked the required experience and technical skills. She got the interview possibly because the headhunter stood to earn a large fee if she was hired.

  • For the interview, the author dressed in professional attire to appear straight and hid her usual style and interests. At the time, she had a programming job at a smaller company.

  • The company’s data processing center was on the ninth floor of a department store in San Francisco, but the floor seemed hidden. The author had to go through restricted doors and up a dimly lit staircase to reach it.

  • On the ninth floor, there was a glass-enclosed room where many Filipina women were working on keypunch machines, visible to anyone walking by. The scene seemed outdated, as the author was used to typing directly into computer terminals. The floor felt like it was from an earlier era.

  • Beyond the keypunch room was a vast open space with many partitions sectioning off parts of it. The low ceiling, small window, and hidden location of the floor made it feel like an attic or half-story.

  • This summarizes the strange experience of visiting the outdated data processing center for the mainframe programming interview in 1981. The details convey the disconnect between the author’s familiarity with the modern computing of the time and the obsolete equipment and environment at the company.

The narrator took a job as a programmer at a large company. Upon starting work, she discovered that the manager who hired her, Mr. M, had been demoted. Her group was left maintaining obsolete inventory management systems and producing useless reports.

The narrator had to secretly learn the company’s systems on her own to hide her inexperience. An older programmer, known as “the old man,” helped teach her what she needed to know. She was eventually able to fix some bugs in the code and report these fixes at a meeting.

At the next group meeting, Mr. M seemed surprised that the narrator had actually fixed the bugs. The narrator realized she needed to quit her job to get away from Mr. M, but wanted to stay at least a year for her résumé.

The narrator then discovered that the weekly sales-change figure in their reports always showed 0%. Mr. M claimed the bug was in a vendor’s software, though no vendor software was involved. She also learned that the yearly department total dropped any digits beyond five, so any total over $99,999 was silently truncated and useless.
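The truncated yearly total behaves like a fixed-width numeric field that keeps only its low-order digits. As a hedged sketch (the original system was presumably COBOL-era mainframe code; the function name here is invented for illustration), the effect can be shown in Python:

```python
# Hypothetical sketch: amounts are stored in a five-digit report field,
# so only the low-order five digits survive, in the style of COBOL-like
# fixed-width truncation. `to_five_digit_field` is an invented name.

def to_five_digit_field(amount):
    # A five-digit field holds 0..99999; any higher digits are
    # silently dropped, as when storing into too small a field.
    return amount % 100_000

print(to_five_digit_field(99_999))   # 99999: fits, reported correctly
print(to_five_digit_field(123_456))  # 23456: the leading 1 is lost
```

Any department that sold more than $99,999 in a year would thus report a smaller, meaningless number, matching the narrator’s conclusion that the totals were useless.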

The narrator concluded that her job involved maintaining useless programs to produce worthless reports, keeping track of outdated items like flowered housedresses, and attending pointless weekly meetings with her deceitful manager, Mr. M.

So in summary, the passage describes the narrator’s frustrating experience starting a new job as a programmer and discovering the work and management were incompetent and pointless.

The narrator faces failure and struggles in her job working on a Sales Analysis System that seems useless. She feels invisible and unfulfilled. To combat the boredom and meaninglessness, she decides to work on fixing a bug in the system that has caused some sales numbers to display as zeros.

Although the solution seems simple, the narrator has trouble finding it; many programmers before her had tried and failed. She spends over a year searching through the code for the issue. Finally, in a moment of anger and determination, she discovers the bug: two variables with nearly identical names, distinguished only by an underscore versus a hyphen. The underscored variable had been initialized to zero and never updated, so it kept moving zeros into the sales report.
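The book’s code was in a mainframe language where hyphens are legal in identifiers, so two names differing only by a hyphen versus an underscore could quietly coexist. Python does not allow hyphens in names, but the class of bug translates directly. A hedged sketch with invented names — one accumulator feeds the report but is never updated, while a look-alike receives the real sums:

```python
def weekly_sales_report(transactions):
    # Two near-twin accumulators, echoing the book's pair that
    # differed only by a hyphen versus an underscore.
    sales_total = 0       # feeds the report; initialized, never updated
    sales_totals = 0      # look-alike that actually accumulates
    for amount in transactions:
        sales_totals += amount   # the typo: updates the wrong variable
    return sales_total           # always 0, whatever the sales were

print(weekly_sales_report([120.50, 89.99, 310.00]))  # 0
```

The program compiles and runs cleanly, which is what makes this class of bug so durable: nothing fails except the number on the report.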

The narrator shows perseverance in the face of failure and frustration. Even when the task seemed pointless and impossible, she kept working at it. Her determination and patience eventually paid off, and she solved the problem that had stumped so many others. The moral is that with enough grit, persistence can overcome even the most stubborn failures and setbacks.

The narrator, a student at Cornell University, moves into an old farmhouse outside of Ithaca, New York, in 1970. The house has no heat, occasional hot water, and the narrator shares it with a motley group of people, including her Romantic Poetry classmate, his sister, the sister’s boyfriend, and another man.

The area is rural, and the farmhouse shares a party telephone line with other households. The narrator often overhears a polite British woman, Mrs. Richards, on the line. One day, Mrs. Richards knocks on the farmhouse door and introduces herself as their neighbor. She seems desperate and impoverished.

Later, the narrator and her housemates get involved in the Richards’ lives, though the Richards did not ask for their help. Mrs. Richards brings them fresh, unpasteurized milk from their farm. She tells them a story, possibly fanciful, about how she and her husband came to live on their farm: when he retired from being a merchant seaman, they attached an anchor to their truck and settled down wherever it fell off, which happened to be near their current farm.

However, the narrator learns the Richards’ situation is dire. Their original farmhouse burned down, and they now live in a converted outbuilding. Mr. and Mrs. Richards live there with their 10-year-old son and Mr. Richards’ mother. The family struggles to get by and depends on charity. The narrator and her friends try to help the family in small ways.

The narrator reflects that she and her housemates were like children, acting without thought of consequences. But their interactions with the Richards gave them their first understanding of true hardship and made their own struggles and discomforts in the farmhouse seem petty. The narrator moves on from the farmhouse but never forgets the kindness of Mrs. Richards’ first visit.

  • The narrator is a 20-year-old woman who helps out at the Richards’ farm. Mr. Richards is cranky and looks older than his age. Their son has an intellectual disability and cannot run the farm.
  • The family and neighbors help the Richards with farm work as Mr. Richards’ health declines. The narrator even drives a tractor to help with haying at one point.
  • The narrator leaves the farm for a few months. When she returns, the Richards’ situation has worsened. A milk cooperative has required farmers to buy expensive bulk tanks to collect milk. The Richards can’t afford one and will go out of business without it.
  • The narrator and a friend visit the Richards to make a video about their situation. Mrs. Richards shows them around the farm on a bright day. While the farm looks lovely, Mrs. Richards talks about how the bulk tank requirement will end their farm.

The key events are:

  1. The narrator helps the Richards as Mr. Richards’ health declines and their son cannot run the farm.
  2. The narrator leaves for a few months.
  3. When she returns, a bulk tank requirement threatens to put the Richards out of business.
  4. The narrator and a friend make a video about this situation, capturing both the charm of the farm and the Richards’ dire circumstances.

The themes are the difficulty of small family farming, the effects of technology and industry requirements on individuals, and community support for those in need.

  • The author’s father, an accountant, asked her to write a program for a variable-rate amortization schedule in 1980. Although she had just started her first programming job, she didn’t know much about amortization or the client’s computer system.

  • She decided to take on the challenging project to get out of her parents’ house during the Jewish High Holy Days. She planned to write the program for her own system, then translate it for the client’s system. However, the client’s office was closed during that time.

  • After struggling for four days, her father told her to give up. She felt sucker-punched and humiliated but resolved not to give up just because others doubted her. Her relationship with her father was strained after this incident.

  • In 1979, the author bought a TRS-80 computer on impulse, though she knew nothing about programming. She struggled to learn BASIC for two months, forgetting to eat or work at times. She fell into the traps of early spaghetti code.

  • Though the learning process was maddening, the author found exploring the unknown alluring. She moved from failure to failure, gaining knowledge along the way. She came to enjoy the strange experience of struggling with code.

  • The key events are: 1) The amortization program request from her father. 2) Her resolution not to quit despite difficulties and self-doubt. 3) Buying an early computer and learning to program through struggle and failure. 4) Gaining confidence in exploring the unfamiliar.

The themes are perseverance in the face of obstacles, learning through struggle and failure, confidence in tackling the unknown, and strained relationships. The overall tone is one of hard-won triumph over self-doubt and difficult beginnings.

  • The author found programming to be challenging but rewarding. Debugging code and fixing bugs gave her a thrill of accomplishment, like fixing a carburetor.
  • She had a successful career as an ordinary programmer, writing code for business applications. She consulted on many projects, some successful and some not. She often felt like an imposter, aware of how much she didn’t know. But she found that many others, even highly trained engineers, felt the same way.
  • She reflects on the proliferation of tech companies in San Francisco’s South of Market neighborhood. The tech workforce is overwhelmingly young, white, and male. Meanwhile, the neighborhood remains diverse, filled with people of color working service jobs.
  • She worries that code and algorithms developed by the tech elite now deeply influence life for everyone, even the two-thirds of humanity without internet access. She fears technology has penetrated into the smallest details of human existence.

In summary, the author has had a long career writing code and working in technology. While personally rewarding, she worries about the influence wielded by tech companies and their largely privileged workforce. She fears the spread of algorithms and code into all aspects of life, even for those without access to technology. The word “fun” appears only once, in a paragraph describing errors and setbacks in programming. Overall, her view of technology seems more wary and critical than fun.

  • The author advocates opening up knowledge about how algorithms and code work to the general public. She believes that increased technical literacy and an understanding of the biases in algorithms can help address issues created by technology.

  • However, the author notes that achieving broad technical education is challenging. There are political and social barriers to improving public education and expanding access. Private and charter schools are promoting their own agendas rather than teaching truth or civic values.

  • The author explores whether massive open online courses (MOOCs) could help address the lack of technical literacy. However, many people still lack access to the internet and computers, and MOOCs may still reflect the biases of the existing tech culture.

  • The author signs up for an intro Python course on Coursera to see how these courses work. She finds that the course seems aimed at a certain geeky, young male audience, with many references to shows like The Big Bang Theory and Monty Python. The instructor says he wants the class to be “funny” and not “boring,” but the humor relies on assumed familiarity with certain cultural references.

  • The author concludes that while the course could teach Python, it reflects the biases and assumptions of the existing male-dominated tech culture. The course seems more focused on being entertaining than accessible or civic-minded. Overall, the author remains skeptical that MOOCs alone can solve the lack of broad technical literacy and understanding in society.

The key arguments are:

  1. Broader access to technical knowledge and education is necessary to address issues with technology and algorithms.
  2. There are political, social and economic barriers to achieving universal technical literacy through traditional education.
  3. MOOCs are limited by lack of access and may reflect the biases of the existing tech culture rather than promoting broader understanding.
  4. An example intro Python course shows how cultural references and assumptions can limit the accessibility and aims of these courses.
  5. Alternative approaches beyond just MOOCs are needed to truly promote broad technical literacy.

The author’s perspective seems to be that achieving universal technical education will require addressing deeper issues around politics, economics, culture and access—not just providing more online courses. Simply making more technology and courses available will not necessarily promote the kind of critical, literate and civic-minded understanding that the author advocates. Overall, more work is needed to make technical knowledge accessible and relevant for people from all backgrounds.

The writer enrolls in an online Python programming course to observe the culture of programmers and recall her own experiences learning to code. The instructors, Joe Warren and Scott Rixner, are passionate and encouraging, though their humor and references reveal a male-centric geek culture that may alienate some students.

The peer-graded assignments and discussion forums are a crucial part of the learning experience. The writer submits an assignment and grades others’, finding the process helpful when done constructively. However, the focus on gaming, from “I want a shrubbery” to the final Space Rocks game project, reflects the disproportionate role of men in that field and risks discouraging women and others outside the dominant culture.

While the culture portrayed may warn off those who won’t fit in, the writer argues it also inoculates students by showing them what to expect in the real world of programming. Over time, she has come to appreciate some of the “weirdness” alongside the hostility she faced. The few good experiences and moments of pleasure in her work are why she recommends others learn to code.

The forums show students from around the world collaborating to solve problems and persevere despite fears of failure and feelings of being unfit. The writer finds herself drawn back to the “furious conversation” in the forums, missing the community when away. The course gives outsiders a chance to express their determination to enter the programming world.

In summary, the writer portrays a programming culture that is often off-putting but also rewarding, especially when women and outsiders have a chance to join the conversation and community. The online course offers an opportunity and warnings for aspiring programmers, as well as a view into why the writer continues to encourage learning to code despite facing exclusion and adversity. Programming’s pleasures, and helping others access them, make challenges worthwhile.

The online course “The Design and Analysis of Algorithms, Part 1” corresponds to half of a ten-week Stanford course required for a computer science degree. The instructor, Tim Roughgarden, introduces famous algorithms created over the last 50 years. Understanding them provides knowledge of core computer science and impresses employers.

Roughgarden warns that the course requires math and programming prerequisites. He says students will feel “baffled and intrigued.” The fast pace and math may discourage newcomers, but Roughgarden invites them to stay and get a “fascination” for the material.

However, Roughgarden says he expects “nothing” from students and admits that the automated grading is “primitive,” unable to properly assess understanding. The faulty grading leaves students unsure whether they have actually learned the material. Roughgarden used this same early, limited version in both 2013 and 2015, without improving it.

The online course is “in no way watered down” but has less demanding assignments. Students may be unprepared but can glimpse the “beauty” in algorithms. The author’s brain “shakes off the crusts of time,” recalling math from 20+ years earlier.

The summary portrays the course as challenging yet intriguing, with a demanding instructor and high expectations but an automated grading system that likely discourages and confuses many students. The “beauty” of algorithms emerges for those who persevere, recalling and re-learning math from long ago.

• The author first studied algorithms and fell in love with their beauty around 5 years ago while reading Donald Knuth’s The Art of Computer Programming. The author is thrilled that Tim Roughgarden’s online Coursera class allows more people to experience the beauty and creativity in algorithms.

• Roughgarden’s class focuses on analyzing algorithms abstractly, separate from any computing environment. The goal is to evaluate how algorithms scale as the number of steps grows very large. The author appreciates this abstract view, having worked for years with the messy realities of computing environments.

• However, the author notes that the abstract view ignores how algorithms are used in the real world. The author brings up the NSA’s mass surveillance program as an example, but gets little response from other students. Roughgarden himself briefly acknowledges the real-world need for fast algorithms when discussing startups.

• The author watches videos of Roughgarden’s in-person Stanford class and notices differences from the MOOC. The in-person class moves at a manic pace, and Roughgarden tells students that for most of them, $1 million “would be nothing.” The author reflects on the vast differences in experience and opportunity between the Stanford students and MOOC students.

• The author finds hope in a Coursera class called “Programming for Everyone,” hoping it may open opportunities to more people.
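The “abstract view” the class takes, as the summary describes it, means counting an algorithm’s basic steps as a function of input size n, independent of any particular machine. A small illustrative sketch of that kind of analysis — the functions are invented for illustration, not drawn from the course — comparing worst-case comparison counts for linear versus binary search:

```python
import math

def linear_search_steps(n: int) -> int:
    # Worst case: examine every one of the n elements.
    return n

def binary_search_steps(n: int) -> int:
    # Worst case: halve the range each time -> floor(log2 n) + 1 comparisons.
    return math.floor(math.log2(n)) + 1 if n > 0 else 0

# The widening gap as n grows is what asymptotic analysis cares about:
for n in (1_000, 1_000_000):
    print(n, linear_search_steps(n), binary_search_steps(n))
```

At a million elements the counts are 1,000,000 versus 20 — the kind of scaling difference that stays true in any computing environment, which is exactly why the abstract view is useful.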

The key ideas are:

  1. Algorithms and computing topics can be incredibly beautiful and creative.

  2. MOOCs like Roughgarden’s give more people exposure to these topics, but lack the resources and support of an elite school like Stanford.

  3. There are vast differences in experience and opportunity between students at a school like Stanford and students in a MOOC.

  4. The author hopes a class like “Programming for Everyone” may help open up opportunities in tech to more people.

  • The author has lived in SOMA, San Francisco since 1996. At that time, it was an abandoned industrial area with some old Victorian houses. South Park, a nearby park, provided shade and escape.
  • South Park remained largely unchanged for years after the dot-com bust of 2000-2001. It retained a wild, unkempt feel that provided respite from the developing neighborhood.
  • Recently, the area has been undergoing a building frenzy with many high-rises. But South Park remained a “desolate, neglected” patch of nature. The author often went there to escape the heat and think.
  • However, the park has now been renovated. The trees and crabgrass have been removed and replaced with a manicured lawn, picnic tables, and a children’s play area. The author sees this as a sign it is time to leave SOMA and possibly San Francisco.
  • The renovation of the park signals the end of the neighborhood the author once knew. The “rough, affordable” Victorian flats and abandoned lots of 1996 SOMA have been replaced by “men with clipboards” and luxury high-rises.
  • The author feels a nostalgia for the way SOMA used to be. The new park and development represent a “neutering” of the area’s wild spirit. The author implies this is representative of changes in San Francisco and society as a whole.
  • Nonetheless, the author acknowledges that the new park may be enjoyed by newcomers to the neighborhood, even if it is no longer a place of escape. Change is inevitable, even if nostalgia remains.

The summary condenses the key details about the changes in South Park and SOMA and the author’s reaction to them, including a sense of nostalgia for the past and discomfort with rapid development and gentrification. However, the author also acknowledges that change is unavoidable and the new park may be valued by others. The summary touches on the themes of natural versus developed spaces, memory, and the pace of change in communities.

  • The author returned to San Francisco after being away for a few months and found that the small playground in South Park, near where she lived, had been completely torn up and was enclosed by a tall fence. Most of the trees were gone and the park was mostly dirt.
  • She learned that a “community group” had undertaken the renovation of the park without consulting or notifying most people who lived around the park. They had slipped notices under doors after starting the project.
  • The author feels hopeless about what has happened to her city and neighborhood. An architectural rendering of the planned park shows mostly concrete, little green space, and a massive high-rise that does not currently exist. She believes the planning commission and developers collaborated on radically changing the park and neighborhood.
  • She recalls attending a planning commission meeting in 2012 where she warned them that the footprints for ground-floor retail spaces in new buildings were too large. The all-male commission dismissed and mocked her concerns. She argues that small, affordable retail spaces are important for immigrant entrepreneurs and building neighborhood character. The commission did not listen.
  • She sees the changes, including the loss of small, family-owned stores, as detrimental to the neighborhood feel and opportunity. The forces of development were too strong to stop, though there was a chance in 2012 to preserve what was best about the neighborhood.
  • She highlights a small convenience store run by a Palestinian Christian family as an example of the kind of small, community-building business that is disappearing. Despite its small size, she has a long-standing, affectionate relationship with the owner, who calls her “dah-link.”

The key themes are loss of community, untrammeled development and gentrification destroying neighborhood character, the dismissal of citizens’ and small business owners’ concerns by an unresponsive planning process, and the power of personal relationships in building community. The author clearly feels a deep sense of loss and frustration over these changes in her city and neighborhood.

The author uses the example of a small local convenience store to illustrate how such modest businesses support families and the economy. Although the store itself is small, its profits from selling alcohol, lottery tickets, and cigarettes allow it to support the owner’s entire family.

The author then describes the massive changes in the South of Market (SOMA) neighborhood of San Francisco. There is extensive construction, with many old buildings being demolished and new high-rises going up. This construction has disrupted traffic, closed roads, and changed the local environment, driving away birds and small wildlife. However, the author acknowledges that cities must grow and change to stay vital.

The technology industry has moved into SOMA, taking over many historic buildings. Major tech companies like Twitter, Yelp, and Salesforce have offices there. The demand for housing has led to many high-priced condos and apartments being built. Rents in SOMA are now the highest in San Francisco, affordable only to those making hundreds of thousands of dollars per year. Many tech workers rely on private buses to commute to offices down the peninsula.

The author observes that many of the new tech residents of SOMA don’t seem to care much about building a traditional neighborhood community. They are more concerned with convenient and affordable housing. They get most of what they need delivered and use the neighborhood mainly as a “bedroom community.” Their intense social lives seem to be centered around work rather than home.

The author had predicted such a future back in 1998, foreseeing a society divided into those who stayed home and got everything delivered vs. those who did the delivering. At that time, the delivery workers were mostly unionized. Now, delivery is often done by poorly paid gig workers with no benefits or job security. The author sees piles of packages waiting outside doors for residents who are working long hours at their startups.

In summary, the author laments how the tech boom and influx of wealthy young professionals have transformed SOMA, making it unaffordable for most and eroding the traditional neighborhood community feel. The old neighborhood has been disrupted, but the new residents seem unconcerned with building a new one. Their lifestyles are centered around work over home. Meanwhile, the less affluent workers who support them are increasingly precarious.

  • The author observes that many people in San Francisco now get their meals, groceries, and goods delivered rather than shopping themselves. Delivery people are often invisible, working for companies in the “gig economy.” The author talks to some delivery workers and finds that some are students happy for the flexibility but others struggle with difficult working conditions and low pay.

  • San Francisco has been overtaken by startup culture. Young people move to the city hoping to found successful tech companies and become wealthy. However, the vast majority end up as employees or struggling entrepreneurs. The startup culture values becoming a CEO or early employee of a successful tech company above all else. Those who don’t achieve this level of success may see themselves as failures.

  • The author attends an event where entrepreneurs pitch their startup ideas to investors. Most of the presentations are unmemorable, but one stands out—a virtual reality device for the blind. Though the presentations are short, the event seems to give the entrepreneurs a chance to network and find community. The audience appears genuinely enthusiastic and supportive. For these young entrepreneurs, events like this one are a source of joy and bonding, not just anxiety and rejection.

  • In summary, the author portrays San Francisco as a city where the glamour of startup success masks economic inequality and struggle. Young tech entrepreneurs network and bond at events, sustaining hopes of glory that are unlikely to be realized. Meanwhile, many workers enable the comforts of tech elites through low-paying service jobs like delivery work. The author seems concerned about the “peonization of the working class” and the desire to replace human workers with automation. However, startup culture’s conformity and hubris may ultimately prove transient.

  • Roger King, the founder of Bay Angels, thought the startup pitch event was unprofessional. He said the winners get nothing but maybe an interview. However, the event organizer claimed nearly 100% of the winners get funding.

  • The Bay Angels event was more professional than the Sharks’ event. Presenters got 8 minutes to pitch and admission was waived for them. The presentations included a travel app, 3D printing, virtual reality for national parks, a real estate search app, and an MRI reading algorithm.

  • The author met a man pitching an app to help employers determine if job applicants were a “cultural fit.” The author criticized this as a way to maintain a segregated technical culture. The man replied he was working for companies, not society.

  • There are many co-working spaces in the area to rent desks, offices, and workspaces. They provide practical resources but also sell an aura and networking. The spaces and slogans promote startup ideals like “Do what you love,” “If you don’t like your job, create one,” and “Change the world!”

  • The author criticizes the “Change the world!” motto as egoistic. Startups claim to change the world but often disrupt existing structures in harmful ways, concentrating wealth and eliminating jobs and opportunities. “Change the world!” is really just an advertisement to attract youth and obscure the goal of making money.

  • WeWork is the largest co-working company, renting workspace around the world. Co-working spaces spread the ideals and culture of the startup world. The author sees them as “hatcheries” that “indoctrinate” people into the technical society.

In summary, the author provides a critical perspective on startup culture, pitch events, co-working spaces, and slogans. While startups claim noble goals of changing the world, the author argues the real motives are making money, disrupting existing systems, and concentrating wealth, often in harmful ways. The co-working spaces and pitch events spread and reinforce these ideals.

  • The author visited one of WeWork’s co-working spaces in San Francisco’s Tenderloin neighborhood.

  • The Tenderloin is a high-crime, impoverished area filled with desperate people living on the streets.

  • The author spoke with the CEO of a startup renting office space there. The CEO wanted to provide mobile Wi-Fi in the Tenderloin and other areas, hoping to get tech workers to engage with local businesses.

  • The author initially liked the CEO’s ambitions but later wondered how realistic they were, given the state of the Tenderloin and its inhabitants. The CEO’s plans seemed disconnected from the realities on the ground.

  • The author attended a pitch event the night Trump was elected and felt it was bizarre and unreal given the circumstances. One pitch in particular, for a resort to increase employee loyalty, struck the author as a “Trumpian fantasy.”

  • The author sees a connection between the rhetoric around disintermediation in the 1990s tech world and Trump’s dismissal of mainstream institutions and facts. Tools like Twitter have amplified his ability to spread misinformation and attack critics.

  • The author marched in protest of Trump’s presidency, finding it exhilarating to join so many like-minded people standing up against Trump’s agenda. The diverse, witty crowd gave the author hope.

The speaker attended the Women’s March in New York City and was heartened to see so many young women participating. She hoped they would learn from the history of the women’s movement in the 1970s and avoid dividing over differences. The march was joyful, with clever chants and shouts moving through the crowd.

While the march was uplifting, the real work lies ahead in organizing and fighting for women’s rights, especially as they are under threat from the Trump administration. The young women who attended the march need to create coalitions and continue the fight.

In unrelated notes, the speaker discussed the renovation of South Park in San Francisco. The new design seems too spare, clean and white. The materials and elements like metal tables, benches and light poles will likely not withstand the elements and use over time. The new park seems to erase the messy vitality of the previous space.

The speaker also discussed the tech industry. The privileged members of the millennial generation who work in tech seem to be living charmed lives and dreaming big dreams as they try to build startups. However, the tech boom seems to be slowing, with fewer startups succeeding and major tech companies facing difficulties. The speaker worries that hype around companies like Uber and Snap that have yet to go public will lead ordinary investors to lose money in a crash like that in 2000.

More broadly, the speaker laments how the internet has enabled constant surveillance by governments and companies, contrary to the early ideals of its creators. The speaker feels a need to warn the young tech workers of the damage the internet has caused and the failure of its promise. However, the speaker also recognizes that this new generation must find their own way forward and hopes they are able to balance tech’s promises and perils. The future remains unwritten.

  • The article describes a chess match between Kasparov and Deep Blue in which Kasparov initially thought the computer was showing signs of intelligence or “sentience.” However, Kasparov later realized Deep Blue was not actually sentient. Rather, Kasparov had projected the qualities of a mind onto the machine. Once Kasparov adjusted his strategy to match Deep Blue’s actual capabilities rather than the perceived sentient mind, Deep Blue’s play seemed less impressive.

  • The key point is that Deep Blue did not demonstrate sentience or intelligence. It was operating based on its programming. Kasparov’s perception of sentience caused him to make strategic errors. But ultimately, Deep Blue was just a computer following its code. It did not have a mind or sentience in the way humans do.

#book-summary