Ryan’s notes

The Shallows by Nicholas Carr

★★★★★

Carr’s argument

The Shallows argues that the Internet has changed the way we think; specifically, that hyperlinks, multimedia, and multitasking have overloaded our working memory such that we remember less and make fewer connections in our brains, despite reading more words and getting more done than ever before.

The brain is plastic

To my surprise, until about 1970 scientists believed that our brains formed during childhood and then remained largely static through adulthood. While it’s true that our brains change most rapidly as kids, they remain extremely plastic as adults; genes specify the starting condition of our neurons, but our experiences constantly reshape our brains to adapt to the world we live in. Every time we experience something, we strengthen or weaken existing connections in our brain, remove old ones, or create entirely new ones. Some extreme examples of brain plasticity include the blind adapting to devote their visual cortex to audio processing and the deaf dedicating more neurons to peripheral vision. Closer to home: when we form habits, the structure of our brain actually changes—we strengthen particular pathways.

Thinking is making connections

The crux of Carr’s argument revolves around a single question: what is thinking? His answer: making connections. When we make connections between different concepts, we think originally, express creativity, form our identity, and construct culture. We are the connections we make, so we should be vigilant about what helps us create novel connections and what doesn’t.

We can’t think about what we don’t remember

In order to form connections between concepts in our brain, we need to first store those concepts—we need to remember. We cannot connect what we don’t actually know; therefore, memory is the first step of thinking. It’s critical, then, that we remember our experiences; otherwise we won’t be able to think about them.

How memory works

Short-term memory works by strengthening or weakening existing connections in the cortex, but long-term memory requires forming entirely new synaptic terminals. Sensory experiences (sight, sound, smell, touch, taste) are stored in separate parts of the cortex and “copied” to the hippocampus. In time, the hippocampus (largely at night) combines these sensory inputs into cohesive memories, connects them to older stored concepts, and encodes these connections in the cortex. The simple act of recalling a memory restarts the entire process, causing the memory to re-enter the hippocampus, where new connections can be formed. However, the more pressure on our working, short-term memory (the more distracted we are), the more we disrupt the process of consolidation and connection (the less likely we are to remember). In fact, we can only hold 2-4 ideas in working memory; these are the candidates for what gets remembered long-term. Any time we overflow our working memory, we lose track of our experiences and fail to encode long-term memories.
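
To make the overflow mechanic concrete, here’s a toy sketch (my own illustration, not Carr’s; the 4-slot capacity and oldest-first eviction are simplifying assumptions) of working memory as a small bounded buffer, where anything evicted before consolidation never makes it into long-term memory:

```python
from collections import deque

WORKING_MEMORY_SLOTS = 4  # Carr cites a capacity of roughly 2-4 ideas

def encoded_long_term(experiences, distractions_per_item):
    """Toy model: only ideas still sitting in working memory at each
    consolidation step get encoded into long-term memory."""
    working = deque(maxlen=WORKING_MEMORY_SLOTS)  # overflow evicts the oldest entry
    long_term = set()
    for idea in experiences:
        working.append(idea)
        # Each distraction pushes an unrelated concern into working memory,
        # potentially evicting the idea before it can be consolidated.
        for d in range(distractions_per_item):
            working.append(("distraction", idea, d))
        # Consolidation (in reality, largely during sleep): whatever
        # survived in the buffer gets connected and stored.
        long_term.update(x for x in working if not isinstance(x, tuple))
    return long_term

ideas = [f"idea-{i}" for i in range(10)]
print(len(encoded_long_term(ideas, 0)))  # focused, linear reading: all 10 encoded
print(len(encoded_long_term(ideas, 4)))  # distracted browsing: 0 survive the churn
```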

Mediums change how we think

We know our brains are plastic: what we experience changes the way we’re wired. But content is not the only experience; the medium is too. Our technologies affect how we process information, and therefore how we think. As Marshall McLuhan put it, “The medium is the message.” Similarly, John Culkin said, “We shape our tools, and thereafter they shape us.”

Carr calls out a number of technologies that fundamentally changed the way humans think. First, maps forced us to think abstractly about our environment and propagated abstract thinking more generally. The clock divided and measured time, leading to more scientific thinking. Both Nietzsche and T.S. Eliot believed that the typewriter changed the way they thought as they wrote.

But perhaps the most relevant medium to Carr’s argument is the book. Before humans started writing, the spoken word was the only way to hold and transmit information; oral traditions held a community together. In the early era of tablets and scrolls, writing was largely used to record communal facts because of its expense—expensive in both the material sense and the cognitive sense: early symbolic alphabets demanded serious brainpower to process, and because texts included no breaks between words, almost everyone read aloud. Reading was something done by a small, intellectual elite, and it wasn’t easy. But with the introduction of the simpler Greek and Roman alphabets, spaces and punctuation, and Gutenberg’s famous printing press, books became cheap enough to distribute widely and easy enough for anyone to consume. Everyday folks began choosing what they read. They could read in private, and this contemplation drove individuality of thought. What’s more, as authors gained a large, attentive audience, they experimented with new diction and vocabulary. People worried that tawdry novels, gossip, and pornography would dumb us down, but Carr argues that what changed the way we thought was not the content of books but the medium. Reading was an unnatural process demanding sustained attention to decipher meaning, and this is what we got good at over the past few hundred years: deep, linear reading.

Deep reading fosters deep thought

When we read, we focus our entire attention on a linear task. The number of things we’re thinking about stays low, and the pace at which we ingest information is only as quick as we read. This steady, linear, intentional reading maximizes the amount of information we remember and the number of connections we make. In Carr’s own words: “[the words in books] set off intellectual vibrations within [readers’] own minds. In the quiet spaces opened up by the prolonged, undistracted reading of a book, people made their own associations, drew their own inferences and analogies, fostered their own ideas. They thought deeply as they read deeply.”

The Internet is inherently distracting

Carr speaks about the “intellectual ethic” of a technology: the inherent characteristics that change the way we think and act. Neither the creators of a new technology nor its early users usually understand its intellectual ethic. We didn’t understand how maps, the clock, or books would change the way we thought at the outset; we were just trying to solve a problem.

Carr argues that the Internet’s intellectual ethic is one of distraction. The computer is a universal machine: it can display text and imagery, produce audio, and do most anything else, so we’re always a glance or click away from leaving our current context. Links add cognitive load to reading. Searching makes accessing information effortless and, combined with multitasking, encourages context switching. Most notably, the Internet is bidirectional, calling on us to respond to stimuli. These characteristics combine to create a medium that is inherently distracting. As we browse the Internet, parse links, and move from window to window or app to app, we’re forced to use more of our brains to make decisions about what to look at, what to click, and what to do next; this overloads our working memory, making it more difficult for us to concentrate on a single train of thought.

Studies show that comprehension falls as the number of links in a block of text rises. They also show that inline images, video, and audio most often have the same effect.

Our distractedness affects us even when we’re not using our devices. In a recent study, students were given an aptitude test. Their phones were turned off but placed at various distances: on the edge of their desk, in a purse or pocket, or in another room. Test scores rose with the distance of the phone, despite nearly all students stating that their phones were not a distraction. The study also revealed that those who relied on their phones the most suffered the steepest performance declines.

We read more but remember less

Given the amount of time we spend on the Web, we certainly read more text than ever before, but the way we read has changed. Most people don’t read the Internet linearly; instead, they skim quickly, skipping down the page in an F-shaped pattern. We browse and scan for keywords and links, and while skimming has always been one form of human reading, it’s now becoming the dominant mode.

Two of the most important sources of cognitive load are “extraneous problem solving” and “divided attention.” Book readers activate regions of the brain associated with language, memory, and vision, but they don’t use regions associated with decision making and problem solving; Internet browsers do. Reading on the Internet engages more of our brain and floods our working memory with additional concerns, causing us to remember less. This isn’t too dissimilar from when humans read early texts without spaces or punctuation—it’s not the “deep reading” that fosters memory and connection. According to Carr: “As non-linear reading becomes the default mode of reading, we spend less time deciphering meaning in text.”

Digital connections are not our connections

Doesn’t the wealth of knowledge on the Internet make up for the memories and connections we fail to form in our own brains as we read? David Brooks argues: “I had thought that the magic of the information age was that it allowed us to know more, but then I realized the magic of the information age is that it allows us to know less. It provides us with external cognitive servants—silicon memory systems, collaborative online filters, consumer preference algorithms and networked knowledge. We can burden these servants and liberate ourselves.”

We’re certainly embracing this point of view. In one study, two groups of people were asked to record a set of facts by typing them. One group was told the facts would be saved; the other was told they’d be erased. The latter group recalled more facts. When we believe information can easily be retrieved later, we seem to commit less of it to memory.

Ease of access seems to have other ill effects. One might think that with the explosion of information available on the Internet, academic papers would cite more diverse sources. In fact, the opposite is true: academic papers increasingly cite fewer, more recent sources.

In another study, two groups of people were introduced to a digital game. One version had been programmed to explain the rules and give helpful hints; the other offered no explanation. While users of the more approachable version scored higher sooner, users of the bare-bones version ultimately achieved the highest scores.

Carr’s point is that we’ve come to think of knowledge as something we swim in, rather than something we accumulate and actively form by connecting our own experiences, books, and ideas. As a result, we think we’re smarter than we actually are.

Reading linearly might not be the most “efficient” way to mine information, but it seems to be a highly effective way to remember concepts and draw lasting connections.

Herein lies Carr’s major argument: how we consume information affects what we remember and whether we make new connections; deep reading fosters more connected memory, and the Internet fosters less: “The Web’s connections are not our connections—and no matter how many hours we spend searching and surfing, they will never become our connections. When we outsource our memory to a machine, we also outsource a very important part of our intellect and even our identity. William James, in concluding his 1892 lecture on memory, said, ‘The connecting is the thinking.’”

Personal cost

Carr argues that we’re trading “deep reading,” a state of concentration, contemplation, and connection-building, for productivity. The consequence is that we remember less, form fewer novel connections, and outsource our intellectual identity.

I agree. I notice myself making fewer interesting connections on the Web but more when I concentrate on reading linearly. My aversion to reading books stems largely from the feeling that it’s inefficient, but I now believe the slowness of the act is exactly what leads to better retention and lets more connections emerge.

Cultural cost

Carr believes we’ll see a return to the historical norm where deep reading is done only by a “reading class”. He wonders whether this class will have the power and prestige of old times, or whether it will be seen as antiquated.

I wonder if our abandonment of books has fed anti-intellectualism. Has turning away from the deep reading of books (where people concentrate and draw their own meaning) to shallower consumption of popular media driven tribalism, independent of the network effects of social media?

What to do

Carr doesn’t provide solutions, and he doesn’t advocate rejecting the Internet; he sees it as inevitable that the Internet will be our dominant medium for quite some time. What he does suggest is that we work to understand how the medium affects the way we think and behave. With The Shallows, he wanted to kick-start a conversation to identify the real intellectual ethic of the Web.


Notes

  • Which mediums do I most enjoy? Or rather, which feel most rewarding in the long run?

    • Podcasts

    • Conversation

    • Explanatory article with photos

  • We seem to have once viewed conversation as an art form... any good resources on this?

    • I looked, and didn't find anything great after a quick glance

  • If we believe the default effect of the Internet is bad because it steals our attention and cripples focus, and we also believe that it's hubris to think "it's just how we use it... we're in control," then what is the answer? Reject it? Seems like a non-starter at the individual level, no?

  • People who grew up as children with books as the medium may find that they're losing something as they embrace the web. But what about the latest generation? Does forming connections in the brain with this new medium at the center make them more capable of learning better with it?

  • "Linear, literary mind"... is it really linear? I find myself jumping out of the page and making connections as I read. Certainly linear in first processing though.

Highlights

  • In a 2010 Pew Research survey of some 400 prominent thinkers, more than eighty percent agreed that “by 2020, people’s use of the Internet will have enhanced human intelligence; as people are allowed unprecedented access to more information, they become smarter and make better choices.”

  • The Shallows explains why we were mistaken about the Net. When it comes to the quality of our thoughts and judgments, the amount of information a communication medium supplies is less important than the way the medium presents the information and the way, in turn, our minds take it in.

  • “The medium is the message.”

  • What both enthusiast and skeptic miss is what McLuhan saw: that in the long run a medium’s content matters less than the medium itself in influencing how we think and act.

  • “The effects of technology do not occur at the level of opinions or concepts,” wrote McLuhan. Rather, they alter “patterns of perception steadily and without any resistance.”

  • Media work their magic, or their mischief, on the nervous system itself.

  • In the end, we come to pretend that the technology itself doesn’t matter. It’s how we use it that matters, we tell ourselves. The implication, comforting in its hubris, is that we’re in control.

  • My mind isn’t going—so far as I can tell—but it’s changing. I’m not thinking the way I used to think. I feel it most strongly when I’m reading. I used to find it easy to immerse myself in a book or a lengthy article.

  • The Web’s been a godsend to me as a writer. Research that once required days in the stacks or periodical rooms of libraries can now be done in minutes.

  • what the Net seems to be doing is chipping away my capacity for concentration and contemplation. Whether I’m online or not, my mind now expects to take in information the way the Net distributes it: in a swiftly moving stream of particles. Once I was a scuba diver in the sea of words. Now I zip along the surface like a guy on a Jet Ski.

  • “The Internet may have made me a less patient reader, but I think that in many ways, it has made me smarter. More connections to documents, artifacts, and people means more external influences on my thinking and thus on my writing.”

  • “Generation Net”—kids who have grown up using the Web. “Digital immersion,” wrote the lead researcher, “has even affected the way they absorb information. They don’t necessarily read a page from left to right and from top to bottom.

  • For the last five centuries, ever since Gutenberg’s printing press made book reading a popular pursuit, the linear, literary mind has been at the center of art, science, and society. As supple as it is subtle, it’s been the imaginative mind of the Renaissance, the rational mind of the Enlightenment, the inventive mind of the Industrial Revolution, even the subversive mind of Modernism. It may soon be yesterday’s mind.

  • The writing ball rescued Nietzsche, at least for a time. Once he had learned touch typing, he was able to write with his eyes closed, using only the tips of his fingers.

  • But the device had a subtler effect on his work. One of Nietzsche’s closest friends, the writer and composer Heinrich Köselitz, noticed a change in the style of his writing. Nietzsche’s prose had become tighter, more telegraphic. There was a new forcefulness to it, too, as though the machine’s power—its “iron”—was, through some mysterious metaphysical mechanism, being transferred into the words it pressed into the page. “Perhaps you will through this instrument even take to a new idiom,” Köselitz wrote in a letter, noting that, in his own work, “my ‘thoughts’ in music and language often depend on the quality of pen and paper.” “You are right,” Nietzsche replied. “Our writing equipment takes part in the forming of our thoughts.”

  • Even as our knowledge of the physical workings of the brain advanced during the last century, one old assumption remained firmly in place: most biologists and neurologists continued to believe, as they had for hundreds of years, that the structure of the adult brain never changed. Our neurons would connect into circuits during childhood, when our brains were malleable, and as we reached maturity the circuitry would become fixed.

  • At a time of rapid scientific advance and social upheaval, Descartes’ dualism came as a comfort.

  • Beliefs informed by the existing environment. What we believe in one domain (rational thought) informs another (how the brain is structured)

  • Marshall McLuhan and Norman Mailer

    • Can I describe who these people are?

  • Our neurons are always breaking old connections and forming new ones, and brand-new nerve cells are always being created. “The brain,” observes Olds, “has the ability to reprogram itself on the fly, altering the way it functions.”

    • Prior to 1970, it was accepted that adult brains were fixed. We've only had 50 years to use this knowledge!

  • Today, scientists sum up the essential dynamic of neuroplasticity with a saying known as Hebb’s rule: “Cells that fire together wire together.”
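
    • Hebb’s rule is often formalized as a simple weight update, Δw = η · x · y: the synapse between two neurons strengthens in proportion to how often their activity coincides. A minimal sketch of that update (my own illustration, not from the book):

```python
ETA = 0.1  # learning rate

def hebbian_update(w, pre, post):
    """Cells that fire together wire together: strengthen the connection
    only when the pre- and post-synaptic neurons are active at once."""
    return w + ETA * pre * post

# Repeated co-activation wires the two neurons together...
w = 0.0
for _ in range(20):
    w = hebbian_update(w, pre=1, post=1)
print(w)  # ~2.0, a strengthened synapse

# ...while uncorrelated firing leaves the connection unchanged.
w = 0.0
for pre, post in [(1, 0), (0, 1), (1, 0), (0, 1)]:
    w = hebbian_update(w, pre, post)
print(w)  # 0.0
```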

  • In a slug’s ordinary state, about ninety percent of the sensory neurons in its gill have connections to motor neurons. But after its gill is touched just forty times, only ten percent of the sensory cells maintain links to the motor cells. The research “showed dramatically,” Kandel wrote, that “synapses can undergo large and enduring changes in strength after only a relatively small amount of training.”19

  • Our genes “specify” many of “the connections among neurons—that is, which neurons form synaptic connections with which other neurons and when.” Those genetically determined connections form Kant’s innate templates, the basic architecture of the brain. But our experiences regulate the strength, or “long-term effectiveness,” of the connections, allowing, as Locke had argued, the ongoing reshaping of the mind and “the expression of new patterns of behavior.”20 The opposing philosophies of the empiricist and the rationalist find their common ground in the synapse. The New York University neuroscientist Joseph LeDoux explains in his book Synaptic Self that nature and nurture “actually speak the same language. They both ultimately achieve their mental and behavioral effects by shaping the synaptic organization of the brain.”

  • Nature determines the initial condition. Why are some things more plastic, like habits, and some less, like personality?

  • Some of the most extensive and remarkable changes take place in response to damage to the nervous system. Experiments show, for instance, that if a person is struck blind, the part of the brain that had been dedicated to processing visual stimuli—the visual cortex—doesn’t just go dark. It is quickly taken over by circuits used for audio processing. And if the person learns to read Braille, the visual cortex will be redeployed for processing information delivered through the sense of touch.22 “Neurons seem to ‘want’ to receive input,” explains Nancy Kanwisher of MIT’s McGovern Institute for Brain Research: “When their usual input disappears, they start responding to the next best thing.”23 Thanks to the ready adaptability of neurons, the senses of hearing and touch can grow sharper to mitigate the effects of the loss of sight. Similar alterations happen in the brains of people who go deaf: their other senses strengthen to help make up for the loss of hearing. The area in the brain that processes peripheral vision, for example, grows larger, enabling them to see what they once would have heard.

  • “Plasticity,” says Alvaro Pascual-Leone, a top neurology researcher at Harvard Medical School, is “the normal ongoing state of the nervous system throughout the life span.” Our brains are constantly changing in response to our experiences and our behavior, reworking their circuitry with “each sensory input, motor act, association, reward signal, action plan, or shift of awareness.” Neuroplasticity, argues Pascual-Leone, is one of the most important products of evolution, a trait that enables the nervous system “to escape the restrictions of its own genome and thus adapt to environmental pressures, physiologic changes, and experiences.”

  • Our ways of thinking, perceiving, and acting, we now know, are not entirely determined by our genes. Nor are they entirely determined by our childhood experiences.

    • But it seems that our increased plasticity early in life lends greater transformational power to childhood experiences

  • As particular circuits in our brain strengthen through the repetition of a physical or mental activity, they begin to transform that activity into a habit. The paradox of neuroplasticity, observes Doidge, is that, for all the mental flexibility it grants us, it can end up locking us into “rigid behaviors.”

    • Self-reinforcing

  • Plastic does not mean elastic, in other words. Our neural loops don’t snap back to their former state the way a rubber band does; they hold onto their changed state.

    • Seems true to a degree, but what about riding a bike?

  • When it comes to the quality of our thought, our neurons and synapses are entirely indifferent. The possibility of intellectual decay is inherent in the malleability of our brains.

  • The source of consciousness lies beyond the grasp of consciousness.

    • Apparently debatable, according to some including (man mentioned earlier re brain as receiver). Referenced in How to Change Your Mind. Descartes?

  • “The use of a reduced, substitute space for that of reality,” explains the cartographic historian Arthur Robinson, “is an impressive act in itself.” But what’s even more impressive is how the map “advanced the evolution of abstract thinking” throughout society.

  • If the proliferation of public clocks changed the way people worked, shopped, played, and otherwise behaved as members of an ever more regulated society, the spread of more personal tools for tracking time—chamber clocks, pocket watches, and, a little later, wristwatches—had more intimate consequences. The personal clock became, as Landes writes, “an ever-visible, ever-audible companion and monitor.” By continually reminding its owner of “time used, time spent, time wasted, time lost,” it became both “prod and key to personal achievement and productivity.” The “personalization” of precisely measured time “was a major stimulus to the individualism that was an ever more salient aspect of Western civilization.”7 The mechanical clock changed the way we saw ourselves. And like the map, it changed the way we thought. Once the clock had redefined time as a series of units of equal duration, our minds began to stress the methodical mental work of division and measurement. We began to see, in all things and phenomena, the pieces that composed the whole, and then we began to see the pieces of which the pieces were made. Our thinking became Aristotelian in its emphasis on discerning abstract patterns behind the visible surfaces of the material world.

    • Examples of medium being the message

  • “intellectual technologies.” These include all the tools we use to extend or support our mental powers—to find and classify information, to formulate and articulate ideas, to share know-how and knowledge, to take measurements and perform calculations, to expand the capacity of our memory. The typewriter is an intellectual technology. So are the abacus and the slide rule, the sextant and the globe, the book and the newspaper, the school and the library, the computer and the Internet. Although the use of any kind of tool can influence our thoughts and perspectives—the plow changed the outlook of the farmer, the microscope opened new worlds of mental exploration for the scientist—it is our intellectual technologies that have the greatest and most lasting power over what and how we think. They are our most intimate tools, the ones we use for self-expression, for shaping personal and public identity, and for cultivating relations with others.

  • The intellectual ethic of a technology is rarely recognized by its inventors. They are usually so intent on solving a particular problem or untangling some thorny scientific or engineering dilemma that they don’t see the broader implications of their work. The users of the technology are also usually oblivious to its ethic. They, too, are concerned with the practical benefits they gain from employing the tool. Our ancestors didn’t develop or use maps in order to enhance their capacity for conceptual thinking or to bring the world’s hidden structures to light. Nor did they manufacture mechanical clocks to spur the adoption of a more scientific mode of thinking. Those were by-products of the technologies. But what by-products! Ultimately, it’s an invention’s intellectual ethic that has the most profound effect on us. The intellectual ethic is the message that a medium or other tool transmits into the minds and culture of its users.

  • Laws, records, transactions, decisions, traditions—everything that today would be “documented”—in oral cultures had to be, as Havelock says, “composed in formulaic verse” and distributed “by being sung or chanted aloud.”30

  • “When he read, his eyes scanned the page and his heart explored the meaning, but his voice was silent and his tongue was still,” wrote Augustine. “Often, when we came to see him, we found him reading like this in silence, for he never read aloud.” Baffled by such peculiar behavior, Augustine wondered whether Ambrose “needed to spare his voice, which quite easily became hoarse.”

  • It’s hard for us to imagine today, but no spaces separated the words in early writing.

  • “altering the neurophysiological process of reading,” word separation “freed the intellectual faculties of the reader,” Saenger writes; “even readers of modest intellectual capacity could read more swiftly, and they could understand an increasing number of inherently more difficult texts.”7

    • Ability to think of connections while reading

  • But as soon as “something in the environment changes, we need to take notice because it might mean danger—or opportunity.” 9 Our fast-paced, reflexive shifts in focus were once crucial to our survival. They reduced the odds that a predator would take us by surprise or that we’d overlook a nearby source of food.

  • To read a book was to practice an unnatural process of thought, one that demanded sustained, unbroken attention to a single, static object.

  • Many people had, of course, cultivated a capacity for sustained attention long before the book or even the alphabet came along. The hunter, the craftsman, the ascetic—all had to train their brains to control and concentrate their attention. What was so remarkable about book reading was that the deep concentration was combined with the highly active and efficient deciphering of text and interpretation of meaning.

  • those words set off intellectual vibrations within their own minds. In the quiet spaces opened up by the prolonged, undistracted reading of a book, people made their own associations, drew their own inferences and analogies, fostered their own ideas. They thought deeply as they read deeply.

  • The writing and reading of tablets, scrolls, and early codices had stressed the communal development and propagation of knowledge. Individual creativity had remained subordinate to the needs of the group. Writing had remained more a means of recording than a method of composition. Now, writing began to take on, and to disseminate, a new intellectual ethic: the ethic of the book. The development of knowledge became an increasingly private act, with each reader creating, in his own mind, a personal synthesis of the ideas and information passed down through the writings of other thinkers. The sense of individualism strengthened.

  • According to one estimate, the number of books produced in the fifty years following Gutenberg’s invention equaled the number produced by European scribes during the preceding thousand years.

  • By accelerating the spread of books into popular culture and making them a mainstay of leisure time, the cruder, crasser, and more trifling works also helped spread the book’s ethic of deep, attentive reading. “The same silence, solitude, and contemplative attitudes associated formerly with pure spiritual devotion,” writes Eisenstein, “also accompanies the perusal of scandal sheets, ‘lewd Ballads,’ ‘merry bookes of Italie,’ and other ‘corrupted tales in Inke and Paper.’”29 Whether a person is immersed in a bodice ripper or a Psalter, the synaptic effects are largely the same.

  • After Gutenberg’s invention, the bounds of language expanded rapidly as writers, competing for the eyes of ever more sophisticated and demanding readers, strived to express ideas and emotions with superior clarity, elegance, and originality. The vocabulary of the English language, once limited to just a few thousand words, expanded to upwards of a million words as books proliferated.37 Many of the new words encapsulated abstract concepts that simply hadn’t existed before.

  • As language expanded, consciousness deepened.

    • Not totally buying it

  • Turing seems to have been the first to understand the digital computer’s limitless adaptability. What he could not have anticipated was the way his universal machine would, just a few decades after his death, become our universal medium. Because the different sorts of information distributed by traditional media—words, numbers, sounds, images, moving pictures—can all be translated into digital code, they can all be “computed.”

  • It’s becoming our typewriter and our printing press, our map and our clock, our calculator and our telephone, our post office and our library, our radio and our TV.

  • THE NET DIFFERS from most of the mass media it replaces in an obvious and very important way: it’s bidirectional.

  • text messages, which also continues to increase rapidly.

    • "Because Internet" could be a good next read

  • Young adults between the ages of twenty-five and thirty-four, who are among the most avid Net users, were reading printed works for a total of just forty-nine minutes a week in 2008, down a precipitous twenty-nine percent from 2004.

    • What is the latest stat?

  • Until the Net arrived, the history of media had been a tale of fragmentation. Different technologies progressed down different paths, leading to a proliferation of special-purpose tools. Books and newspapers could present text and images, but they couldn’t handle sounds or moving pictures. Visual media like cinema and TV were unsuited to the display of text, except in the smallest of quantities. Radios, telephones, phonographs, and tape players were limited to transmitting sounds. If you wanted to add up numbers, you used a calculator. If you wanted to look up facts, you consulted a set of encyclopedias or a World Almanac. The production end of the business was every bit as fragmented as the consumption end. If a company wanted to sell words, it printed them on paper. If it wanted to sell movies, it wound them onto spools of film. If it wanted to sell songs, it pressed them onto vinyl records or recorded them onto magnetic tape. If it wanted to distribute TV shows and commercials, it shot them through the air from a big antenna or sent them down thick black coaxial cables.

  • We replace our special-purpose tools with an all-purpose tool.

    • Best argument for lack of focus IMO

  • Traditional media, even electronic ones, are being refashioned and repositioned as they go through the shift to online distribution. When the Net absorbs a medium, it re-creates that medium in its own image. It not only dissolves the medium’s physical form; it injects the medium’s content with hyperlinks, breaks up the content into searchable chunks, and surrounds the content with the content of all the other media it has absorbed. All these changes in the form of the content also change the way we use, experience, and even understand the content.

  • Links don’t just point us to related or supplemental works; they propel us toward them. They encourage us to dip in and out of a series of texts rather than devote sustained attention to any one of them. Hyperlinks are designed to grab our attention. Their value as navigational tools is inextricable from the distraction they cause.

  • the ease and ready availability of searching make it much simpler to jump between digital documents than it ever was to jump between printed ones. Our attachment to any one text becomes more tenuous, more provisional.

    • Experienced this today when using Notion. A thought came up, so I moved to record it in its proper place, and soon enough I was away from the original text

  • In March of 2008, the New York Times announced it would begin devoting three pages of every edition to paragraph-long article abstracts and other brief items. Its design director, Tom Bodkin, explained that the “shortcuts” would allow harried readers to get a quick “taste” of the day’s news, sparing them the “less efficient” method of actually turning the pages and reading the articles.

    • Efficiency to what end? Extracting headlines to be in the know / to feel informed? To be entertained?

  • Some newer shows, such as NBC’s Late Night with Jimmy Fallon, have been explicitly designed to cater as much to Net surfers as TV viewers, with an emphasis on brief segments that lend themselves to distribution as YouTube clips.

    • As the content changes to adapt to the new medium, the observer gets a new and different medium, even if she chooses to focus and give undivided attention for the duration.

  • The Net has begun to alter the way we experience actual performances as well as the recordings of those performances.

    • Curious to do a deeper dive on how streaming has changed music. Probably catchier early hooks?

  • Investing a few hundred dollars in a specialized “digital reader” has seemed silly, given the ease and pleasure of buying and reading old-fashioned books.

    • As I'm reading on a Kindle

  • early 2009 that for the 275,000 books it sells in both traditional and digital form, the e-book versions account for thirty-five percent of total sales,

    • What about now?

  • The Wall Street Journal’s L. Gordon Crovitz has suggested that easy-to-use, networked readers like the Kindle “can help return to us our attention spans and extend what makes books great: words and their meaning.”5 That’s a sentiment most literary-minded folks would be eager to share. But it’s wishful thinking. Crovitz has fallen victim to the blindness that McLuhan warned against: the inability to see how a change in a medium’s form is also a change in its content. “E-books should not just be print books delivered electronically,” says a senior vice president of HarperStudio, an imprint of the publishing giant HarperCollins. “We need to take advantage of the medium and create something dynamic to enhance the experience. I want links and behind the scenes extras and narration and videos and conversation.”6 As soon as you inject a book with links and connect it to the Web—as soon as you “extend” and “enhance” it and make it “dynamic”—you change what it is and you change, as well, the experience of reading it. An e-book is no more a book than an online newspaper is a newspaper.

  • By the end of the decade, cell phone novels had come to dominate the country’s best-seller lists. The three top-selling Japanese novels in 2007 were all originally written on mobile phones.

    • Whoa

  • As more readers come to discover books through online text searches, for example, authors will face growing pressures to tailor their words to search engines, the way bloggers and other Web writers routinely do today. Steven Johnson sketches out some of the likely consequences: “Writers and publishers will begin to think about how individual pages or chapters might rank in Google’s results, crafting sections explicitly in the hopes that they will draw in that steady stream of search visitors. Individual paragraphs will be accompanied by descriptive tags to orient potential searchers; chapter titles will be tested to determine how well they rank.”

    • Has this happened?

  • We’ll subscribe to services that automatically update our e-books with comments and revisions added by fellow readers. “Soon,” says Ben Vershbow of the Institute for the Future of the Book, an arm of USC’s Annenberg Center for Communication, “books will literally have discussions inside of them, both live chats and asynchronous exchanges through comments and social annotation. You will be able to see who else out there is reading that book and be able to open up a dialog with them.”

    • Top highlights seems like the furthest we've come?

  • Authors, able to assume that an attentive reader, deeply engaged both intellectually and emotionally, “would come at last, and would thank them,” quickly jumped beyond the limits of social speech and began to explore a wealth of distinctively literary forms, many of which could exist only on the page. The new freedom of the private writer led, as we’ve seen, to a burst of experimentation that expanded vocabulary, extended the boundaries of syntax, and in general increased the flexibility and expressiveness of language. Now that the context of reading is again shifting, from the private page to the communal screen, authors will adapt once more.

  • They will increasingly tailor their work to a milieu that the essayist Caleb Crain describes as “groupiness,” where people read mainly “for the sake of a feeling of belonging” rather than for personal enlightenment or amusement.17 As social concerns override literary ones, writers seem fated to eschew virtuosity and experimentation in favor of a bland but immediately accessible style. Writing will become a means for recording chatter.

    • This doesn't seem particularly true?

  • Even after an e-book is downloaded into a networked device, it can be easily and automatically updated—just as software programs routinely are today.18 It seems likely that removing the sense of closure from book writing will, in time, alter writers’ attitudes toward their work. The pressure to achieve perfection will diminish, along with the artistic rigor that the pressure imposed.

    • These predictions aren't holding up particularly well

  • The practice of deep reading that became popular in the wake of Gutenberg’s invention, in which “the quiet was part of the meaning, part of the mind,” will continue to fade, in all likelihood becoming the province of a small and dwindling elite. We will, in other words, revert to the historical norm. As a group of Northwestern University professors wrote in a 2005 article in the Annual Review of Sociology, the recent changes in our reading habits suggest that the “era of mass book reading” was a brief “anomaly” in our intellectual history: “We are now seeing such reading return to its former social base: a self-perpetuating minority that we shall call the reading class.” The question that remains to be answered, they went on, is whether that reading class will have the “power and prestige associated with an increasingly rare form of cultural capital” or will be viewed as the eccentric practitioners of “an increasingly arcane hobby.”

    • And will this feed anti-intellectualism?

  • “Before this century shall end, journalism will be the whole press—the whole human thought,” declared the French poet and politician Alphonse de Lamartine in 1831. “Thought will spread across the world with the rapidity of light, instantly conceived, instantly written, instantly understood. It will blanket the earth from one pole to the other—sudden, instantaneous, burning with the fervor of the soul from which it burst forth. This will be the reign of the human word in all its plenitude. Thought will not have time to ripen, to accumulate into the form of a book—the book will arrive too late. The only book possible from today is a newspaper.”

  • Printing, a “somewhat antiquated process” that for centuries “has reigned despotically over the mind of man,” would be replaced by “phonography,” and libraries would be turned into “phonographotecks.” We would see a return of “the art of utterance,” as narrators took the place of writers. “The ladies,” Uzanne concluded, “will no longer say in speaking of a successful author, ‘What a charming writer!’ All shuddering with emotion, they will sigh, ‘Ah, how this “Teller’s” voice thrills you, charms you, moves you.’”

  • pedagogy

  • The time has come, he said, for teachers and students alike to abandon the “linear, hierarchical” world of the book

    • It does seem that books often aren't the best way to learn most practical skills; those are better learned by doing

  • Clay Shirky, a digital-media scholar at New York University, suggested in a 2008 blog post that we shouldn’t waste our time mourning the death of deep reading—it was overrated all along. “No one reads War and Peace,” he wrote, singling out Tolstoy’s epic as the quintessence of high literary achievement. “It’s too long, and not so interesting.” People have “increasingly decided that Tolstoy’s sacred work isn’t actually worth the time it takes to read it.” The same goes for Proust’s In Search of Lost Time and other novels that until recently were considered, in Shirky’s cutting phrase, “Very Important in some vague way.” Indeed, we’ve “been emptily praising” writers like Tolstoy and Proust “all these years.” Our old literary habits “were just a side-effect of living in an environment of impoverished access.”28 Now that the Net has granted us abundant “access,” Shirky concluded, we can at last lay those tired habits aside.

  • As Alberto Manguel has written, “There is an unbridgeable chasm between the book that tradition has declared a classic and the book (the same book) that we have made ours through instinct, emotion and understanding: suffered through it, rejoiced in it, translated it into our experience and (notwithstanding the layers of readings with which a book comes into our hands) essentially become its first readers.”29 If you lack the time, the interest, or the facility to inhabit a literary work—to make it your own in the way Manguel describes—then of course you’d consider Tolstoy’s masterpiece to be “too long, and not so interesting.”

  • Their arguments are another important sign of the fundamental shift taking place in society’s attitude toward intellectual achievement. Their words also make it a lot easier for people to justify that shift—to convince themselves that surfing the Web is a suitable, even superior, substitute for deep reading and other forms of calm and attentive thought.

    • Hmm, I sense resentment here. But also a good broadening from reading to general calm and attentive thought

  • A group of prominent computer scientists had been invited to PARC to see a demonstration of a new operating system that made “multitasking” easy. Unlike traditional operating systems, which could display only one job at a time, the new system divided a screen into many “windows,” each of which could run a different program or display a different document. To illustrate the flexibility of the system, the Xerox presenter clicked from a window in which he had been composing software code to another window that displayed a newly arrived e-mail message. He quickly read and replied to the message, then hopped back to the programming window and continued coding. Some in the audience applauded the new system. They saw that it would enable people to use their computers much more efficiently. Others recoiled from it. “Why in the world would you want to be interrupted—and distracted—by e-mail while programming?” one of the attending scientists angrily demanded. The question seems quaint today.

  • The Net’s interactivity gives us powerful new tools for finding information, expressing ourselves, and conversing with others. It also turns us into lab rats constantly pressing levers to get tiny pellets of social or intellectual nourishment.

  • The Net commands our attention with far greater insistency than our television or radio or morning newspaper ever did.

  • Our use of the Internet involves many paradoxes, but the one that promises to have the greatest long-term influence over how we think is this one: the Net seizes our attention only to scatter it. We focus intensively on the medium itself, on the flickering screen, but we’re distracted by the medium’s rapid-fire delivery of competing messages and stimuli.

  • We usually make better decisions, his experiments reveal, if we shift our attention away from a difficult mental challenge for a time. But Dijksterhuis’s work also shows that our unconscious thought processes don’t engage with a problem until we’ve clearly and consciously defined the problem.3 If we don’t have a particular intellectual goal in mind, Dijksterhuis writes, “unconscious thought does not occur.”

  • Noting that “our brain is modified on a substantial scale, physically and functionally, each time we learn a new skill or develop a new ability,” he described the Net as the latest in a series of “modern cultural specializations” that “contemporary humans can spend millions of ‘practice’ events at and that the average human a thousand years ago had absolutely no exposure to.” He concluded that “our brains are massively remodeled by this exposure.”

  • Book readers have a lot of activity in regions associated with language, memory, and visual processing, but they don’t display much activity in the prefrontal regions associated with decision making and problem solving. Experienced Net users, by contrast, display extensive activity across all those brain regions when they scan and search Web pages. The good news here is that Web surfing, because it engages so many brain functions, may help keep older people’s minds sharp. Searching and browsing seem to “exercise” the brain in a way similar to solving crossword puzzles, says Small.

  • The information flowing into our working memory at any given moment is called our “cognitive load.” When the load exceeds our mind’s ability to store and process the information—when the water overflows the thimble—we’re unable to retain the information or to draw connections with the information already stored in our long-term memory.

  • There are many possible sources of cognitive overload, but two of the most important, according to Sweller, are “extraneous problem-solving” and “divided attention.” Those also happen to be two of the central features of the Net as an informational medium.

  • The vast majority skimmed the text quickly, their eyes skipping down the page in a pattern that resembled, roughly, the letter F.

  • It’s quite clear, Liu concluded, that with the flood of digital text pouring through our computers and phones, “people are spending more time on reading” than they used to. But it’s equally clear that it’s a very different kind of reading. A “screen-based reading behavior is emerging,” he wrote, which is characterized by “browsing and scanning, keyword spotting, one-time reading, and non-linear reading.” The time “spent on in-depth reading and concentrated reading” is, on the other hand, falling steadily.

  • What is different, and troubling, is that skimming is becoming our dominant mode of reading.

  • we are evolving from being cultivators of personal knowledge to being hunters and gatherers in the electronic data forest.

  • Through the repetitive evaluation of links, headlines, text snippets, and images, we should become more adept at quickly distinguishing among competing informational cues, analyzing their salient characteristics, and judging whether they’ll have practical benefit for whatever task we’re engaged in or goal we’re pursuing.

  • “Does optimizing for multitasking result in better functioning—that is, creativity, inventiveness, productiveness? The answer is, in more cases than not, no,” says Grafman. “The more you multitask, the less deliberative you become; the less able to think and reason out a problem.” You become, he argues, more likely to rely on conventional ideas and solutions rather than challenging them with original lines of thought.

  • After mulling over the paradoxes for many years, Flynn came to the conclusion that the gains in IQ scores have less to do with an increase in general intelligence than with a transformation in the way people think about intelligence. Up until the end of the nineteenth century, the scientific view of intelligence, with its stress on classification, correlation, and abstract reasoning, remained fairly rare, limited to those who attended or taught at universities. Most people continued to see intelligence as a matter of deciphering the workings of nature and solving practical problems—on the farm, in the factory, at home. Living in a world of substance rather than symbol, they had little cause or opportunity to think about abstract shapes and theoretical classification schemes. But, Flynn realized, that all changed over the course of the last century when, for economic, technological, and educational reasons, abstract reasoning moved into the mainstream. Everyone began to wear, as Flynn colorfully puts it, the same “scientific spectacles” that were worn by the original developers of IQ tests.8 Once he had that insight, Flynn recalled in a 2007 interview, “I began to feel that I was bridging the gulf between our minds and the minds of our ancestors. We weren’t more intelligent than they, but we had learnt to apply our intelligence to a new set of problems. We had detached logic from the concrete, we were willing to deal with the hypothetical, and we thought the world was a place to be classified and understood scientifically rather than to be manipulated.”

  • she attributed the Flynn effect to an array of factors, from urbanization to the growth in “societal complexity,” all of which “are part and parcel of the worldwide movement from smaller-scale, low-tech communities with subsistence economies toward large-scale, high-tech societies with commercial economies.”

  • His friends remember him as being ambitious, smart, and “nearly obsessed with efficiency.”

    • Does PC really read books and papers fully? With rigor?

  • World Wide Web. Launched on the Internet just four years earlier,

    • I don't understand the difference

  • “We expect,” they had written in a scholarly paper early in 1998, “that advertising-funded search engines will be inherently biased towards the advertisers and away from the needs of the consumers.”

  • An ad’s placement would be determined not only by the amount of the bid but by the frequency with which people actually clicked on the ad. That innovation ensured that Google’s ads would remain, as the company put it, “relevant”

  • When, in early 2009, Facebook responded to Twitter’s rapid growth by announcing that it was revamping its site to, as it put it, “increase the pace of the stream,” its founder and chief executive, Mark Zuckerberg, assured its quarter of a billion members that the company would “continue making the flow of information even faster.”

  • Because the sales of complementary products rise in tandem, a company has a strong strategic interest in reducing the cost and expanding the availability of the complements to its main product. It’s not too much of an exaggeration to say that a company would like all complements to be given away. If hot dogs were free, mustard sales would skyrocket.

  • By the end of 2009, the original agreement had been abandoned, and Google and the other parties were trying to win support for a slightly less sweeping alternative.

    • What's the current state of affairs?

  • With writing on the screen, we’re still able to decode text quickly—we read, if anything, faster than ever—but we’re no longer guided toward a deep, personally constructed understanding of the text’s connotations. Instead, we’re hurried off toward another bit of related information, and then another, and another. The strip-mining of “relevant content” replaces the slow excavation of meaning.

  • There needs to be time for efficient data collection and time for inefficient contemplation, time to operate the machine and time to sit idly in the garden. We need to work in Google’s “world of numbers,” but we also need to be able to retreat to Sleepy Hollow. The problem today is that we’re losing our ability to strike a balance between those two very different states of mind. Mentally, we’re in perpetual locomotion.

  • Vannevar Bush sounded the keynote for our modern approach to managing information in his much-discussed article “As We May Think,” which appeared in the Atlantic Monthly in 1945. Bush, an electrical engineer who had served as Franklin Roosevelt’s science adviser during World War II, worried that progress was being held back by scientists’ inability to keep abreast of information relevant to their work. The publication of new material, he wrote, “has been extended far beyond our present ability to make use of the record. The summation of human experience is being expanded at a prodigious rate, and the means we use for threading through the consequent maze to the momentarily important item is the same as was used in the days of square-rigged ships.” But a technological solution to the problem of information overload was, Bush argued, on the horizon: “The world has arrived at an age of cheap complex devices of great reliability; and something is bound to come of it.” He proposed a new kind of personal cataloguing machine, called a memex, that would be useful not only to scientists but to anyone employing “logical processes of thought.” Incorporated into a desk, the memex, Bush wrote, “is a device in which an individual stores in compressed form all his books, records, and communications, and which is mechanized so that it may be consulted with exceeding speed and flexibility.” On top of the desk are “translucent screens” onto which are projected images of the stored materials as well as “a keyboard” and “sets of buttons and levers” to navigate the database. The “essential feature” of the machine is its use of “associative indexing” to link different pieces of information: “Any item may be caused at will to select immediately and automatically another.” This process “of tying two things together is,” Bush emphasized, “the important thing.”

  • His article inspired many of the original developers of PC hardware and software, including such early devotees of hypertext as the famed computer engineer Douglas Engelbart and HyperCard’s inventor, Bill Atkinson.

  • The Dutch humanist Desiderius Erasmus, in his 1512 textbook De Copia, stressed the connection between memory and reading. He urged students to annotate their books, using “an appropriate little sign” to mark “occurrences of striking words, archaic or novel diction, brilliant flashes of style, adages, examples, and pithy remarks worth memorizing.” He also suggested that every student and teacher keep a notebook, organized by subject, “so that whenever he lights on anything worth noting down, he may write it in the appropriate section.” Transcribing the excerpts in longhand, and rehearsing them regularly, would help ensure that they remained fixed in the mind. The passages were to be viewed as “kinds of flowers,” which, plucked from the pages of books, could be preserved in the pages of memory.3 Erasmus, who as a schoolboy had memorized great swathes of classical literature, including the complete works of the poet Horace and the playwright Terence, was not recommending memorization for memorization’s sake or as a rote exercise for retaining facts. To him, memorizing was far more than a means of storage. It was the first step in a process of synthesis, a process that led to a deeper and more personal understanding of one’s reading. He believed, as the classical historian Erika Rummel explains, that a person should “digest or internalize what he learns and reflect rather than slavishly reproduce the desirable qualities of the model author.”

  • Erasmus’s recommendation that every reader keep a notebook of memorable quotations was widely and enthusiastically followed. Such notebooks, which came to be called “commonplace books,” or just “commonplaces,” became fixtures of Renaissance schooling. Every student kept one.6 By the seventeenth century, their use had spread beyond the schoolhouse. Commonplaces were viewed as necessary tools for the cultivation of an educated mind. In 1623, Francis Bacon observed that “there can hardly be anything more useful” as “a sound help for the memory” than “a good and learned Digest of Common Places.” By aiding the recording of written works in memory, he wrote, a well-maintained commonplace “supplies matter to invention.”7 Through the eighteenth century, according to American University linguistics professor Naomi Baron, “a gentleman’s commonplace book” served “both as a vehicle for and a chronicle of his intellectual development.”
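
    • Erasmus’s practice reads like an algorithm: a store keyed by subject plus a rehearsal loop. A toy sketch; the names are mine, not his:

```python
# Erasmus's commonplace method as code: write each excerpt into the
# appropriate subject section, then reread regularly so the notebook
# aids memory rather than replacing it. Function names are invented.

from collections import defaultdict

commonplace_book: defaultdict[str, list[str]] = defaultdict(list)


def note_down(subject: str, excerpt: str) -> None:
    """Write the excerpt 'in the appropriate section'."""
    commonplace_book[subject].append(excerpt)


def rehearse() -> None:
    """Transcribe and reread to fix the excerpts in the mind."""
    for subject, excerpts in commonplace_book.items():
        for excerpt in excerpts:
            print(f"[{subject}] {excerpt}")


note_down("memory", "The art of remembering is the art of thinking.")
note_down("memory", "The connecting is the thinking.")
rehearse()
```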

  • The popularity of commonplace books ebbed as the pace of life quickened in the nineteenth century, and by the middle of the twentieth century memorization itself had begun to fall from favor. Progressive educators banished the practice from classrooms, dismissing it as a vestige of a less enlightened time. What had long been viewed as a stimulus for personal insight and creativity came to be seen as a barrier to imagination and then simply as a waste of mental energy. The introduction of new storage and recording media throughout the last century—audiotapes, videotapes, microfilm and microfiche, photocopiers, calculators, computer drives—greatly expanded the scope and availability of “artificial memory.” Committing information to one’s own mind seemed ever less essential.

  • David Brooks, the popular New York Times columnist, makes a similar point. “I had thought that the magic of the information age was that it allowed us to know more,” he writes, “but then I realized the magic of the information age is that it allows us to know less. It provides us with external cognitive servants—silicon memory systems, collaborative online filters, consumer preference algorithms and networked knowledge. We can burden these servants and liberate ourselves.”

  • When, in an 1892 lecture before a group of teachers, William James declared that “the art of remembering is the art of thinking,” he was stating the obvious.14 Now, his words seem old-fashioned.

  • But there’s a problem with our new, post-Internet conception of human memory. It’s wrong.

  • Müller and Pilzecker concluded that it takes an hour or so for memories to become fixed, or “consolidated,” in the brain. Short-term memories don’t become long-term memories immediately, and the process of their consolidation is delicate. Any disruption, whether a jab to the head or a simple distraction, can sweep the nascent memories from the mind.

  • “Short-term memory produces a change in the function of the synapse, strengthening or weakening preexisting connections; long-term memory requires anatomical changes.”

  • “The fact that a gene must be switched on to form long-term memory shows clearly that genes are not simply determinants of behavior but are also responsive to environmental stimulation, such as learning.”

    • What does turning a gene on actually mean?

  • A slug calls on implicit memories when retracting its gill. A person draws on them when dribbling a basketball or riding a bike. As Kandel explains, an implicit memory “is recalled directly through performance, without any conscious effort or even awareness that we are drawing on memory.”23 When we talk about our memories, what we’re usually referring to are the “explicit” ones—the recollections of people, events, facts, ideas, feelings, and impressions that we’re able to summon into the working memory of our conscious mind. Explicit memory encompasses everything that we say we “remember” about the past. Kandel refers to explicit memory as “complex memory”—and for good reason. The long-term storage of explicit memories involves all the biochemical and molecular processes of “synaptic consolidation” that play out in storing implicit memories. But it also requires a second form of consolidation, called “system consolidation,” which involves concerted interactions among far-flung areas of the brain. Scientists have only recently begun to document the workings of system consolidation, and many of their findings remain tentative. What’s clear, though, is that the consolidation of explicit memories involves a long and involved “conversation” between the cerebral cortex and the hippocampus.

  • The memory of an experience seems to be stored initially not only in the cortical regions that record the experience—the auditory cortex for a memory of a sound, the visual cortex for a memory of a sight, and so forth—but also in the hippocampus. The hippocampus provides an ideal holding place for new memories because its synapses are able to change very quickly. Over the course of a few days, through a still mysterious signaling process, the hippocampus helps stabilize the memory in the cortex, beginning its transformation from a short-term memory into a long-term one. Eventually, once the memory is fully consolidated, it appears to be erased from the hippocampus. The cortex becomes its sole holding place. Fully transferring an explicit memory from the hippocampus to the cortex is a gradual process that can take many years.

    • This stuff is nuts

  • it is thought to play an important role in weaving together the various contemporaneous memories—visual, spatial, auditory, tactile, emotional—that are stored separately in the brain but that coalesce to form a single, seamless recollection of an event. Neuroscientists also theorize that the hippocampus helps link new memories with older ones, forming the rich mesh of neuronal connections that give memory its flexibility and depth. Many of the connections between memories are likely forged when we’re asleep and the hippocampus is relieved of some of its other cognitive chores.
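
    • The hippocampus-to-cortex handoff sounds a lot like a write-back cache. A cartoon of the process as described here; every name and timescale is a stand-in, not neuroscience:

```python
# A toy model of system consolidation: new explicit memories land in a
# fast hippocampal store; a nightly pass stabilizes each one in the
# slow cortical store, weaves it into older memories, and clears the
# hippocampus.

import random

hippocampus: list[str] = []       # fast-changing holding place
cortex: dict[str, set[str]] = {}  # long-term store: memory -> links


def experience(event: str) -> None:
    """New explicit memories are recorded first in the hippocampus."""
    hippocampus.append(event)


def sleep() -> None:
    """Consolidate: move each new memory to the cortex and connect it
    to a few older ones (connections run both ways)."""
    while hippocampus:
        memory = hippocampus.pop()
        others = [m for m in sorted(cortex) if m != memory]
        older = set(random.sample(others, min(2, len(others))))
        cortex.setdefault(memory, set()).update(older)
        for old in older:
            cortex[old].add(memory)


experience("read The Shallows")
sleep()
experience("wrote these notes")
sleep()
print(cortex)  # the newer memory is now linked to the older one
```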

  • Governed by highly variable biological signals, chemical, electrical, and genetic, every aspect of human memory—the way it’s formed, maintained, connected, recalled—has almost infinite gradations. Computer memory exists as simple binary bits—ones and zeros—that are processed through fixed circuits, which can be either open or closed but nothing in between.

    • I think this view of computer memory is wrong. Data is not simply stored; it’s transformed, categorized, linked, and indexed through processing, which isn’t too dissimilar to what our brain seems to be doing.

  • “While an artificial brain absorbs information and immediately saves it in its memory, the human brain continues to process information long after it is received, and the quality of memories depends on how the information is processed.”28 Biological memory is alive. Computer memory is not.

    • Computer memory does not have to be static... It can be continually updated based on new signals.
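
    • To make that concrete: even a toy inverted index transforms and re-links data on every write, rather than just parking bits in fixed circuits. A sketch with illustrative names only:

```python
# Stored data in real systems is not inert: each write updates an
# inverted index, creating new links between old and new documents.

from collections import defaultdict

documents: dict[int, str] = {}
index: defaultdict[str, set[int]] = defaultdict(set)  # term -> doc ids


def store(doc_id: int, text: str) -> None:
    """Saving is not the end of processing: each write re-indexes."""
    documents[doc_id] = text
    for term in text.lower().split():
        index[term].add(doc_id)


def related(doc_id: int) -> set[int]:
    """Documents linked to this one through shared terms."""
    linked: set[int] = set()
    for term in documents[doc_id].lower().split():
        linked |= index[term]
    linked.discard(doc_id)
    return linked


store(1, "memory is the first step of thinking")
store(2, "the art of remembering is the art of thinking")
print(related(2))  # {1}: the two documents share terms
```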

  • the very act of recalling a memory appears to restart the entire process of consolidation, including the generation of proteins to form new synaptic terminals.

    • Makes sense

  • Once we bring an explicit long-term memory back into working memory, it becomes a short-term memory again. When we reconsolidate it, it gains a new set of connections—a new context. As Joseph LeDoux explains, “The brain that does the remembering is not the brain that formed the initial memory. In order for the old memory to make sense in the current brain, the memory has to be updated.”30 Biological memory is in a perpetual state of renewal.
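
    • Extending the toy consolidation sketch above: recall drops a memory back into the hippocampal buffer, and the next pass re-links it in a new context. Same caveats apply:

```python
# Reuses hippocampus, cortex, experience, and sleep from the sketch
# above. Recall reopens a consolidated memory: it returns to the
# hippocampal buffer, and the next consolidation pass gives it a new
# set of connections, per LeDoux's "updated" memory.

def recall(memory: str) -> str:
    """Bringing a long-term memory back makes it short-term again."""
    hippocampus.append(memory)
    return memory


recall("read The Shallows")
experience("discussed the book at dinner")
sleep()  # "read The Shallows" gains new connections: a new context
```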

  • “The amount of information that can be stored in long-term memory is virtually boundless.”

    • What is the usefulness of forgetting, then, besides things like trauma?

  • The Web has a very different effect. It places more pressure on our working memory, not only diverting resources from our higher reasoning faculties but obstructing the consolidation of long-term memories and the development of schemas. The calculator, a powerful but highly specialized tool, turned out to be an aid to memory. The Web is a technology of forgetfulness.

  • “‘Learning how to think’ really means learning how to exercise some control over how and what you think,” said the novelist David Foster Wallace in a commencement address at Kenyon College in 2005. “It means being conscious and aware enough to choose what you pay attention to and to choose how you construct meaning from experience.” To give up that control is to be left with “the constant gnawing sense of having had and lost some infinite thing.”

  • The Web’s connections are not our connections—and no matter how many hours we spend searching and surfing, they will never become our connections. When we outsource our memory to a machine, we also outsource a very important part of our intellect and even our identity. William James, in concluding his 1892 lecture on memory, said, “The connecting is the thinking.” To which could be added, “The connecting is the self.”

    • Read the lecture?

  • What’s stored in the individual mind—events, facts, concepts, skills—is more than the “representation of distinctive personhood” that constitutes the self, writes the anthropologist Pascal Boyer. It’s also “the crux of cultural transmission.”41 Each of us carries and projects the history of the future. Culture is sustained in our synapses.

  • Although even the initial users of the technology can often sense the changes in their patterns of attention, cognition, and memory as their brains adapt to the new medium, the most profound shifts play out more slowly, over several generations, as the technology becomes ever more embedded in work, leisure, and education—in all the norms and practices that define a society and its culture. How is the way we read changing? How is the way we write changing? How is the way we think changing? Those are the questions we should be asking, both of ourselves and of our children. As for me, I’m already backsliding.

    • Seems to advocate for an awareness of how we are changing and what the consequences are but doesn't argue to disconnect or fight the changing tide

  • Weizenbaum observed how easy it is for computer programmers to make machines “behave in wondrous ways, often sufficient to dazzle even the most experienced observer.” But as soon as a program’s “inner workings are explained in language sufficiently plain to induce understanding,” he continued, “its magic crumbles away; it stands revealed as a mere collection of procedures, each quite comprehensible. The observer says to himself ‘I could have written that.’” The program goes “from the shelf marked ‘intelligent’ to that reserved for curios.”

    • So true

  • “We shape our tools,” observed the Jesuit priest and media scholar John Culkin in 1967, “and thereafter they shape us.”

  • in Understanding Media, McLuhan wrote that our tools end up “numbing” whatever part of our body they “amplify.”

  • maps numb sense of bearing. Clocks numb sense of continuous time. Farm machines numb sense of land. Industrial ag numbs relationship with food. Cars numb sense of land and distance

  • Alienation, he understood, is an inevitable by-product of the use of technology. Whenever we use a tool to exert greater control over the outside world, we change our relationship with that world. Control can be wielded only from a psychological distance. In some cases, alienation is precisely what gives a tool its value. We build houses and sew Gore-Tex jackets because we want to be alienated from the wind and the rain and the cold. We build public sewers because we want to maintain a healthy distance from our own filth. Nature isn’t our enemy, but neither is it our friend. McLuhan’s point was that an honest appraisal of any new technology, or of progress in general, requires a sensitivity to what’s lost as well as what’s gained. We shouldn’t allow the glories of technology to blind our inner watchdog to the possibility that we’ve numbed an essential part of our self.

    • Find the Wendell Berry human scale reference

  • The subjects using the bare-bones software consistently demonstrated “more focus, more direct and economical solutions, better strategies, and better imprinting of knowledge.” The more that people depended on explicit guidance from software programs, the less engaged they were in the task and the less they ended up learning.

  • As more journals moved online, scholars actually cited fewer articles than they had before. And as old issues of printed journals were digitized and uploaded to the Web, scholars cited more recent articles with increasing frequency. A broadening of available information led, as Evans described it, to a “narrowing of science and scholarship.”31

  • The quicker that scholars are able to “find prevailing opinion,” wrote Evans, the more likely they are “to follow it, leading to more citations referencing fewer articles.” Though much less efficient than searching the Web, old-fashioned library research probably served to widen scholars’ horizons: “By drawing researchers through unrelated articles, print browsing and perusal may have facilitated broader comparisons and led researchers into the past.”

  • Before Frederick Taylor introduced his system of scientific management, the individual laborer, drawing on his training, knowledge, and experience, would make his own decisions about how he did his work. He would write his own script. After Taylor, the laborer began following a script written by someone else. The machine operator was not expected to understand how the script was constructed or the reasoning behind it; he was simply expected to obey it. The messiness that comes with individual autonomy was cleaned up, and the factory as a whole became more efficient, its output more predictable. Industry prospered. What was lost along with the messiness was personal initiative, creativity, and whim. Conscious craft turned into unconscious routine.

    • Conscious craft... crafting, the verb, but perhaps not always craftsmanship as in quality... What was lost for the worker may not have been lost in the product, but improved? At least in some cases?

  • A series of psychological studies over the past twenty years has revealed that after spending time in a quiet rural setting, close to nature, people exhibit greater attentiveness, stronger memory, and generally improved cognition. Their brains become both calmer and sharper. The reason, according to attention restoration theory, or ART, is that when people aren’t being bombarded by external stimuli, their brains can, in effect, relax. They no longer have to tax their working memories by processing a stream of bottom-up distractions. The resulting state of contemplativeness strengthens their ability to control their mind.

  • Rather than taking walks between the rounds of testing, these subjects simply looked at photographs of either calm rural scenes or busy urban ones. The results were the same. The people who looked at pictures of nature scenes were able to exert substantially stronger control over their attention, while those who looked at city scenes showed no improvement in their attentiveness.

  • The results were striking. In both tests, the subjects whose phones were in view posted the worst scores, while those who left their phones in a different room did the best. The students who kept their phones in their pockets or bags came out in the middle. As the phone’s proximity increased, brainpower decreased. It was as if the smartphones had force fields that sapped their owners’ intelligence. In subsequent interviews, nearly all the students said that their phones hadn’t been a distraction—that they hadn’t even thought about the devices during the experiment. They remained oblivious even as the phones muddled their thinking.

  • It also revealed that the more heavily the students relied on their phones in their everyday lives, the greater the cognitive penalty they suffered when their phones were nearby.

  • “The mere presence of mobile phones,” the researchers reported, “inhibited the development of interpersonal closeness and trust” and diminished “the extent to which individuals felt empathy and understanding from their partners.” The effects were strongest when “a personally meaningful topic” was being discussed.

  • the historian and social critic Jacques Barzun bemoaned the debasement of the word “culture.” Through years of loose and lazy usage, it had been turned into “a piece of all-purpose jargon that covers a hodge-podge of overlapping things.” Lost along the way was the term’s essential meaning, which Barzun defined, simply, as “the well-furnished mind.”

  • It’s common today, even more so than ten years ago, to think of knowledge as something that surrounds us, something we swim through and consume, like sea creatures in plankton-filled waters. The ideal of knowledge as something self-created, something woven of the facts, ideas, and experiences gathered in the individual mind, continues to recede.

  • Those who believed the facts had been recorded in the computer demonstrated much weaker recall than did those who assumed the facts would not be stored.

  • “Creating a hard copy of an experience through media leaves only a diminished copy in our own heads.”

  • [As] people gather information online, they come to believe they’re smarter and more knowledgeable than they actually are.

