Is Google Making Us Stupid?

[7][8][9] Birkerts was spurred to write the book after his experience with a class he taught in the fall of 1992, where the students had little appreciation for the literature he had assigned them, which he attributed to their lack of aptitude for the variety of skills involved in deep reading.

[3] This is sometimes called deep reading, a term coined by academic Sven Birkerts in The Gutenberg Elegies and later defined by developmental psychologist Maryanne Wolf with an added cognitive connotation.

The book received mainstream recognition for interrogating the assumptions people make about technological change and for advocating a measure of personal accountability in our relationship with devices.

He particularly refers to the work of Maryanne Wolf, a reading behavior scholar, which includes theories about the role of technology and media in learning how to write new languages.

He acknowledges that this theory has a paucity of evidence so far, but refers to such works as Wolf's Proust and the Squid, which discusses how the brain's neurons adapt to a creature's environmental demands to become literate in new problem areas.

These concentration-altering events are only worsened by online media as they adapt their strategies and visual forms to those of Internet platforms to seem more legitimate and trick the viewer into processing them.

Carr also posits that people's ability to concentrate might decrease as new algorithms free them from knowledge work; that is, the process of manipulating and synthesizing abstract information into new concepts and conclusions.

He compares this with the modern example of Google, which places its computer engineers and designers in a systematized knowledge environment, producing robust insights and results at the expense of creativity.

Additionally, Carr argues that the Internet makes its money mainly by exploiting users' privacy or bombarding them with overstimulation, a vicious cycle in which companies facilitate mindless browsing instead of rewarding sustained thinking.

", a question that the article proper does not actually pose and that he believed was "perfect fodder for a 'don't-be-ridiculous' blog post"; Johnson challenged his readers to carefully consider their online responses in the interest of raising the quality of debate.

[27][28][29][30] Calling it "the great digital literacy debate", British-American entrepreneur and author Andrew Keen judged the victor to be the American reader, who has a wide range of compelling writing from "all of America's most articulate Internet luminaries".

[33] Carr acknowledged that there was a debate over the term 'ideogram', but in a response to Esposito he explained that he had "decided to use the common term" and quoted The Oxford American Dictionary to demonstrate that it likewise defines Chinese characters as instances of ideograms.

[39] Another neuroscientist, Gary Small, director of UCLA's Memory & Aging Research Center, wrote a letter to the editor of The Atlantic in which he stated that he believed that "brains are developing circuitry for online social networking and are adapting to a new multitasking technology culture".

[41][43] Columnist Leonard Pitts of The Miami Herald described his difficulty sitting down to read a book, saying he felt like he "was getting away with something, like when you slip out of the office to catch a matinee".

[45] He found portable long-form audio to be "transformative", however, because he can easily achieve "sustained attention", which makes him optimistic about the potential to "reactivate ancient traditions, like oral storytelling, and rediscover their powerful neural effects".

[9][45] Also writing in The Atlantic, a year after Carr, the futurist Jamais Cascio argued that human cognition has always evolved to meet environmental challenges, and that those posed by the Internet are no different.

[52] In book critic Scott Esposito's view, "responsible adults" have always had to deal with distractions, and, in his own case, he claimed to remain "fully able to turn down the noise" and read deeply.

At the online magazine Edge, Wikipedia co-founder Larry Sanger argued that individual will was all that was necessary to maintain the cognitive capacity to read a book all the way through.

Computer scientist and writer Jaron Lanier balked at the idea that technological progress is an "autonomous process that will proceed in its chosen direction independently of us".

[24][55] At the Britannica Blog, writer Clay Shirky pugnaciously observed that War and Peace was "too long, and not so interesting", further stating that "it would be hard to argue that the last ten years have seen a decrease in either the availability or comprehension of material on scientific or technical subjects".

Drawing parallels with transactive memory—a process whereby people remember things in relationships and groups—Ratliff mused that perhaps the web was "like a spouse who is around all the time, with a particular knack for factual memory of all varieties".

[27] In the essay, Carr introduces the scientific support for the idea that the brain's neural circuitry can be rewired by recounting how philosopher Friedrich Nietzsche is said to have been influenced by technology.

[29][31][65] Esposito believed that "the brain is so huge and amazing and enormously complex that it's far, far off base to think that a few years of Internet media or the acquisition of a typewriter can fundamentally rewire it".

The New York Times reported that several scientists believed it was plausible that the brain's neural circuitry may be shaped differently by regular Internet usage than by the reading of printed works.

[38] Olds mentioned neuroscientist Michael Merzenich, who had formed several companies with his peers in which neuroplasticity-based computer programs had been developed to improve the cognitive functioning of children, adults, and the elderly.

[69] In Stanley Kubrick's 1968 science fiction film 2001: A Space Odyssey, astronaut David Bowman slowly disassembles the mind of an artificial intelligence named HAL by sequentially unplugging its memory banks.

He observed that HAL showed genuine emotion as its mind was disassembled while, throughout the film, the humans aboard the spacecraft appeared to be automatons, thinking and acting as if they were following the steps of an algorithm.

After the publication of Carr's essay, the debate continued to develop in the media as sociological and neurological studies surfaced that were relevant to determining the cognitive impact of regular Internet usage.

[84] Reflections on possible interpretations of the UCLA study included whether the greater breadth of brain activity during Internet use, compared with reading a book, improved or impaired the quality of a reading session, and whether the decision-making and complex reasoning skills apparently involved in Internet search, according to the study, suggested a high quality of thought or simply the use of puzzle-solving skills.

[85][86] Thomas Claburn, in InformationWeek, observed that the study's findings regarding the cognitive impact of regular Internet usage were inconclusive and stated that "it will take time before it's clear whether we should mourn the old ways, celebrate the new, or learn to stop worrying and love the Net".

An 1878 model of the Malling-Hansen Writing Ball, which Nietzsche began using in 1882 when his poor eyesight made it difficult for him to write by hand.[62][63]