Nicholas Carr :: 2008 :: Is Google Making Us Stupid?
Is Google Making Us Stupid? is a 2008 article written by technologist Nicholas Carr for The Atlantic, later expanded into a book published by W. W. Norton. The article investigates the cognitive effects of technological advancements that relegate certain cognitive activities, namely knowledge-searching, to external computational devices. It received mainstream recognition for interrogating the assumptions people make about technological change and for advocating a measure of personal accountability in our relationships with devices.
Carr begins the essay by saying that his recent problems with concentrating on reading lengthy texts, including the books and articles that he used to read effortlessly, stem from spending too much time on the Internet. He suggests that constantly using the Internet might reduce one’s ability to concentrate and reflect on content. He introduces a few anecdotes taken from bloggers who write about the transformation in their reading and writing habits over time. In addition, he analyzes a 2008 study by University College London about new “types” of reading that will emerge and become predominant in the information age. He particularly refers to the work of Maryanne Wolf, a reading behavior scholar, which includes theories about the role of technology and media in learning how to write new languages. Carr argues that while speech is an innate ability that stems directly from brain structure, reading is conscious and taught. He acknowledges that this theory has a paucity of evidence so far, but refers to such works as Wolf’s Proust and the Squid, which discusses how the brain’s neurons adapt to a creature’s environmental demands to become literate in new problem areas. The Internet, in his opinion, is just another kind of environment that we will uniquely adapt to.
Carr discusses how concentration might be impaired by Internet usage. He references the historical example of Nietzsche, who used a typewriter, which was new during his time in the 1880s. Allegedly, Nietzsche’s writing style changed after the advent of the typewriter. Carr categorizes this example as demonstrative of neuroplasticity, a scientific theory that states neural circuits are contingent and in flux. He invokes the idea of sociologist Daniel Bell that technologies extend human cognition, arguing that humans unconsciously conform to the very qualities, or kinds of patterns, involved in these devices’ functions. He uses the clock as an example of a device that has both improved and regulated human perception and behavior.
Carr argues that the Internet is changing behavior at unprecedented levels because it is one of the most pervasive and life-altering technologies in human history. He suggests that the Internet engenders cognitive distractions in the form of ads and popups. These concentration-altering events are only worsened by online media as they adapt their strategies and visual forms to those of Internet platforms to seem more legitimate and trick the viewer into processing them.
Carr also posits that people’s ability to concentrate might decrease as new algorithms free us from knowledge work; that is, the process of manipulating and synthesizing abstract information into new concepts and conclusions. He compares the Internet with industrial management systems, tracing how they caused workers to complain that they felt like automata after the implementation of Taylorist management workflows. He compares this example with the modern example of Google, which places its computer engineers and designers into a systematized knowledge environment, creating robust insights and results at the expense of creativity. Additionally, Carr argues that the Internet makes its money mainly by exploiting users’ privacy or bombarding them with overstimulation, a vicious cycle where companies facilitate mindless browsing instead of rewarding sustained thinking.
Carr ends his essay by tracing the roots of the skeptic trend. He discusses events where people were wary about new technologies, including Socrates’s skepticism about the use of written language and a fifteenth-century Italian editor’s concern about the shift from manually written to printed works. All of these technologies indelibly changed human cognition, but also led to mind-opening innovations that endure today. Still, Carr concludes his argument on an ambivalent note, citing a quote by Richard Foreman that laments the erosion of educated and articulate people. Though Google and other knowledge-finding and knowledge-building technologies might speed up existing human computational processes, they might also foreclose the human potential to easily create new knowledge.
[Source: https://en.wikipedia.org/wiki/Is_Google_Making_Us_Stupid%3F#Synopsis ]
Small et al. (2009) :: Your Brain on Google: Patterns of Cerebral Activation during Internet Searching
Objective: Previous research suggests that engaging in mentally stimulating tasks may improve brain health and cognitive abilities. Using computer search engines to find information on the Internet has become a frequent daily activity of people of any age, including middle-aged and older adults. As a preliminary means of exploring the possible influence of Internet experience on brain activation patterns, the authors performed functional magnetic resonance imaging (fMRI) of the brain in older persons during search engine use and explored whether prior search engine experience was associated with the pattern of brain activation during Internet use.
Design: Cross-sectional, exploratory observational study.
Participants: The authors studied 24 subjects (age, 55–76 years) who were neurologically normal, of whom 12 had minimal Internet search engine experience (Net Naive group) and 12 had more extensive experience (Net Savvy group). The mean age and level of education were similar in the two groups.
Measurements: Patterns of brain activation during functional MRI scanning were determined while subjects performed a novel Internet search task, or a control task of reading text on a computer screen formatted to simulate the prototypic layout of a printed book, where the content was matched in all respects, in comparison with a nontext control task.
Results: The text reading task activated brain regions controlling language, reading, memory, and visual abilities, including left inferior frontal, temporal, posterior cingulate, parietal, and occipital regions, and both the magnitude and the extent of brain activation were similar in the Net Naive and Net Savvy groups. During the Internet search task, the Net Naive group showed an activation pattern similar to that of their text reading task, whereas the Net Savvy group demonstrated significant increases in signal intensity in additional regions controlling decision making, complex reasoning, and vision, including the frontal pole, anterior temporal region, anterior and posterior cingulate, and hippocampus. Internet searching was associated with a more than twofold increase in the extent of activation in the major regional clusters in the Net Savvy group compared with the Net Naive group (21,782 versus 8,646 total activated voxels).
Conclusion: Although the present findings must be interpreted cautiously in light of the exploratory design of this study, they suggest that Internet searching may engage a greater extent of neural circuitry not activated during text reading, but only in people with prior computer and Internet search experience. These observations suggest that in middle-aged and older adults, prior experience with Internet searching may alter the brain's responsiveness in neural circuits controlling decision making and complex reasoning. (Am J Geriatr Psychiatry 2009; 17:116–126)
Key Words: Brain activation, functional MRI, Internet search, middle-aged and older adults, computer experience
Small et al. :: Meet Your iBrain
An easier-to-read introduction to Small et al. (2009)
A small treatise concerning the concepts of «invasivity» and «reversibility» and their relation to past, present and future techniques of neural imaging
Introduction: The aim of this text is threefold. Firstly, to prove to the Teacher that the author of this article (i.e. the Student) has sufficiently internalized the facts presented during UE Neuroimagery. Secondly, the Student aims to introduce the notion of «invasivity» as something which should be considered very seriously by anyone who seeks an «ideal method» for conducting future (neuro)scientific experiments. But the ultimate aim is to show that certain «philosophical schools» which point to the «invasivity-related aspects» of current neuroscientific research do not do so from the position of moralizing savants locked in ivory towers; they do so for concrete and highly pragmatic reasons related to the purest expressions of the highest scientific practice. The principal thesis of this text is that the «invasivity» and «reversibility» aspects of a chosen experimental method should determine the experimenter's choice at least as significantly as other aspects such as spatial/temporal resolution, signal-to-noise ratio, or economic feasibility. The first part of the text is dedicated to highly invasive techniques: tissue extraction and analysis by means of electron, multiphoton, or confocal microscopes. Post-mortem autopsy and surgical interventions such as vivisection or lobotomy will be mentioned when discussing this group. The common denominator of these approaches is that the sine qua non condition of their realisation is the nonreversible and fatal degradation of one of the vital functions of the organism under study, or death. The second part of the text is dedicated to somewhat more reversible, nonetheless still very brutal «in vivo» techniques such as calcium imaging, optical imaging, or electrode implantation. Because it is evident that such approaches can inflict severe injuries and suffering on the organisms under study, they will be labeled «partially reversible quasi in vivo techniques».
Contrary to the common categorisation of these days, even techniques like PET (positron emission tomography) or X-ray imaging will be included in this middle group of partially invasive techniques. This is due to their high-energy kinship with radioactivity, which can without any doubt induce mutations resulting in the disequilibrium of a living system commonly known as «loss of health». The loss of this precious equilibrium is also the reason why we will include all luminescence/fluorescence marker techniques in this category. The third part of the text aims to bring hope. It will be fully devoted to techniques which can be considered fully reversible: the focus will be on Magnetic Resonance Imaging (MRI) and Electroencephalography (EEG), while other non-invasive techniques (NIRS, echography, or TCD) will be excluded from the list due to the Student's lack of personal experience with them. A small part of this final part will be dedicated to «what if?» speculation proposing to use these pure and elegant techniques not only for imaging, but also as a tool of healing practice. These three parts can be considered the core of the Student's homework, which demands that he «highlight the advantages and limits of these techniques depending on the scientific question You'll pose». The question posed by the Student is this: «According to what criteria could we possibly quantify the invasivity of an experimental tool or method?» This text will try to answer that question by introducing a term we hereby label the «Information/Invasivity Quotient» (IIQ). We will analyse this notion from a more ethical perspective in the Discussion section, while the Appendix will summarize an IIQ-based ranking of the four presented methods.
Session 2 :: How to read and understand scientific papers.
Presentation by Dr. M. Mihalilkova.
Digitizing Literacy :: Reflections on haptics of writing
Writing is a complex cognitive process relying on intricate perceptual-sensorimotor combinations. The process and skill of writing is studied on several levels and in many disciplines, from neurophysiological research on the shaping of each letter to studies on stylistic and compositional features of authors and poets. In studies of writing and literacy overall, the role of the physically tangible writing device (pen on paper; computer mouse and keyboard; digital stylus pen and writing tablet; etc.) is rarely addressed. By and large, the (relatively young) field of writing research is dominated by cognitive approaches predominantly focusing on the visual component of the writing process, hence maintaining a separation between (visual) perception and motor action (e.g., haptics). However, recent theoretical currents in psychology, phenomenology and philosophy of mind, and neuroscience, commonly referred to as "embodied cognition", indicate that perception and motor action are closely connected and, indeed, reciprocally dependent.
Did you see a unicycling clown?
We investigated the effects of divided attention during walking. Individuals were classified based on whether they were walking while talking on a cell phone, listening to an MP3 player, walking without any electronics or walking in a pair. In the first study, we found that cell phone users walked more slowly, changed directions more frequently, and were less likely to acknowledge other people than individuals in the other conditions. In the second study, we found that cell phone users were less likely to notice an unusual activity along their walking route (a unicycling clown). Cell phone usage may cause inattentional blindness even during a simple activity that should require few cognitive resources.
Mathematics of a Lady Tasting Tea
In the design of experiments in statistics, the lady tasting tea is a randomized experiment devised by Ronald Fisher and reported in his book The Design of Experiments (1935). The experiment is the original exposition of Fisher's notion of a null hypothesis, which is "never proved or established, but is possibly disproved, in the course of experimentation".
The lady in question (Muriel Bristol) claimed to be able to tell whether the tea or the milk was added first to a cup. Fisher proposed to give her eight cups, four of each variety, in random order. One could then ask what the probability was of her identifying any specific number of cups correctly purely by chance.
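Fisher's chance calculation can be sketched in a few lines. The following is a minimal illustration (not from the source) using only Python's standard library: with eight cups, four of each preparation, there are C(8,4) = 70 ways to pick the four milk-first cups, so identifying all four correctly by pure guessing has probability 1/70, about 1.4%.

```python
from math import comb

# Eight cups, four milk-first and four tea-first; the lady must choose
# which four are milk-first. Number of equally likely selections:
total = comb(8, 4)  # C(8,4) = 70

def p_correct(k):
    """Probability of correctly identifying exactly k of the 4 milk-first
    cups by chance (hypergeometric distribution)."""
    return comb(4, k) * comb(4, 4 - k) / total

print(total)         # 70
print(p_correct(4))  # 1/70 ≈ 0.0143: all four correct by luck alone

# Sanity check: the probabilities over k = 0..4 sum to 1.
print(sum(p_correct(k) for k in range(5)))
```

The 1/70 figure is why Fisher's design is so compact: a perfect score is already rarer than the conventional 5% significance threshold, so no larger apparatus is needed to reject the null hypothesis of pure guessing.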
Fisher's description is less than 10 pages in length and is notable for its simplicity and completeness regarding terminology, calculations, and the design of the experiment.