Technologies which are vectors for the transmission of information (narratives, datasets, social communication) inform human thought as surely as language itself does. The structures of one’s grammar and the content of one’s vocabulary (both dictional and conceptual) delimit one’s cognition, although perhaps not one’s range of emotion. Similarly, the structures of a given technology (which, as practical techniques for the application of theoretical scientific ideas, may be quite haphazardly determined by laboratory exigencies) can come to shape the minds of those who submit to it. This is because communicative technologies are, for us, extensions of language. And as is true of language, they can as easily enable stupidity as catalyze intelligence.
The sort of stupidity engendered by the Internet is thoughtlessness, not ignorance. While the latter is hardly vanquished, and indeed seems incomprehensibly, triumphantly persistent despite the avalanche of information beneath which we’re all buried, the Internet has merely brought to the fore what has always been true: most of us know very little; even the best know far less than they don’t know; and in a democracy, it is the right of anyone to participate passionately in debates about which s/he is totally uninformed.
But it is a fact, offered in its unnecessary defense, that the Internet combats ignorance and illiteracy; that it seems to be a sea of precisely those phenomena is proof only of how much combat was needed. For all its flaws, it is amusing to recall that the Internet is almost entirely textual, and never before have so many read and written so much.
There has been, however, an exchange, well described by hundreds of cultural critics, most of whom, like the rest of us, are unable to avoid it: the Internet requires that we trade depth for breadth, knowledgeability of one rather facile sort for flickering, inattentive, unfocused, anxious, compulsively social mental tics. It doesn’t matter whether we want to make this exchange: unless we are prepared for Luddism, we must accept what Nicholas Carr famously described in “Is Google Making Us Stupid?”
Over the past few years I’ve had an uncomfortable sense that someone, or something, has been tinkering with my brain, remapping the neural circuitry, reprogramming the memory. My mind isn’t going—so far as I can tell—but it’s changing. I’m not thinking the way I used to think. I can feel it most strongly when I’m reading. Immersing myself in a book or a lengthy article used to be easy. My mind would get caught up in the narrative or the turns of the argument, and I’d spend hours strolling through long stretches of prose. That’s rarely the case anymore. Now my concentration often starts to drift after two or three pages. I get fidgety, lose the thread, begin looking for something else to do. I feel as if I’m always dragging my wayward brain back to the text. The deep reading that used to come naturally has become a struggle.
Carr very ably describes some of the processes at play, the science behind them, and their consequences; he even contextualizes his concerns by noting that every generation seems disturbed by developments in media technologies:
In Plato’s Phaedrus, Socrates bemoaned the development of writing. He feared that, as people came to rely on the written word as a substitute for the knowledge they used to carry inside their heads, they would, in the words of one of the dialogue’s characters, “cease to exercise their memory and become forgetful.” And because they would be able to “receive a quantity of information without proper instruction,” they would “be thought very knowledgeable when they are for the most part quite ignorant.” They would be “filled with the conceit of wisdom instead of real wisdom.” Socrates wasn’t wrong—the new technology did often have the effects he feared—but he was shortsighted. He couldn’t foresee the many ways that writing and reading would serve to spread information, spur fresh ideas, and expand human knowledge (if not wisdom).
Indeed, as the succession of technologies has accelerated, it has become a generational banality that parents and teachers bemoan the baleful effects of each new phenomenon on their children and students. It is repetitive, and through repetition has come to seem comic, unpersuasive. Even though it is evident to all that, for example, the Internet decimates our capacity to pay meaningful attention to anything but ourselves (selves which are thereby reduced to clotted, neurotic urges, preening narcissistic anxiety, and fantasy-fueled envy), it would be the absolute pit of cliché to complain about it as many once did about television.
Television, Before It
Television, the last platform before the Internet to commandeer for profit the distribution of culture, fantasy, distraction, and connection, had as its essential quality a unilateral, centralized mode of information dispersal. It gathered the entirety of the Western world before its little screen and entranced it, speaking without interruption to a civilization of benumbed, mesmerized, enfeebled children, eager to submit to a satiating authority, eager to subsume themselves in fantasies of any sort, eager to escape time, eager to embrace passivity.
Passivity, then, was the concern of cultural critics. It was integral to televisual media, and it seemed to be shaping the minds of the masses into organs capable only of reception and absorption. It was lamented that through television, we’d reached a nadir of disengagement and apathy.
To an observer of that era, it would have seemed impossibly utopian that average citizens would soon live in a world of text, a world in which it was perfectly typical to spend hours every day scanning articles, arguing politics in writing, corresponding by email and SMS and IM and on Facebook, reading editorials, news, analysis, entertainment, invective, and encyclopedias, chasing trails of information through labyrinths of sources, and cultivating an identity based on these activities. It would sound scholastic!
Yet it is not so, and it is not so because superficial knowledge and superficial connections are not so different from the televisual ignorance and audience isolation the Internet has replaced. They are the transmissible tics of occupied minds, minds not self-directed but occupied by external forces that perhaps promise satiation or order but are there for their own reasons. Why we forever seek occupation is a matter for another time.
Inherited Properties of Technology
Television and the Internet have much in common: even if one must secure access to them, they themselves are often offered without upfront payment. Just as NBC once asked only that you attend to its commercials, so does, say, The Huffington Post. And just like NBC, The Huffington Post is obliged to determine how best to capture the most of your attention, exploiting you psychologically and conditioning you to be dissatisfied with whatever does not stimulate you so thoroughly. Even sites that are not run for profit must compete in the same way: after all, most hope for attention as well, and cannot hope to secure it from stimulation-junkies accustomed to shorter and shorter articles, ever more inflammatory rhetoric, information that is easier to parse and “share,” and so on.
Their central divergence is in their organizational structure: television was one-to-many, a broadcast medium that permitted no interaction and was therefore incentivized to keep you passively watching. The Internet replaces the passivity of television with an enfeebling quality all its own: it is variously called tangentiality or laterality, and like the broadcast quality of television it has its origin in the fundamental structure of the World Wide Web: hypertextual linking. This model for association, connection, and navigation has its virtues, but its evolution, in conjunction with inevitable market pressures and human tendencies, has made linking antithetical rather than supplemental to depth, reflection, interiorization, focus, attention.
Attention and Imagination
Reading correspondence recently (of the sort sent through the ordinary mail in the 20th century), I was struck by the imaginative necessity it entailed: if you were to write a letter to a friend far away, and would not be able to amend or clarify or retract what you wrote until a reply came, perhaps weeks or months in the future, you would take care to imagine your friend, his reactions, his feelings. You would not distractedly dash off performative IMs while watching something else; the glacial pace of information exchange, like the skeletal sensory information of books, demanded imagination.
A friend or lover whom one can imagine, whom one interiorizes, who lives in one’s mind, whom one cannot interrupt, IM, scan, or take for granted, seems quaint now. We are no longer obliged to imagine anyone, and we are less compassionate, more splenetic, and lazier because of it. I have taken so many friendships for granted, and have too many acquaintances to give any the time they deserve. We struggle to write cleverly and compassionately to one another, but who has the time? Or (since we still have the same twenty-four hours in each day that we’ve always had) who has the attention? I feel real guilt over my failure to embrace so many whom I’ve met, but what can I do?
There are ethical consequences to the destruction of our interpersonal imaginations, to say nothing at all of tangentiality or laterality. We are intolerant in novel ways; we rush to judge and publicly excoriate with astonishing speed; and we abandon causes as quickly as we find them. We pride ourselves more and more on caustic indignation, which now competes with televisual “cool” as the most-sought aesthetic and emotional ideal.
Wisdom and Happiness
Perhaps, particularly from a capitalist’s perspective, the exchange of attention for distraction remains paradigmatic of a happy market transaction, one that concerns only the pretentious. What is wrong, after all, with being distracted, with having innumerable acquaintanceships, with exteriorizing one’s nervous system and memory and context-switching one’s mind into a blissful flickering laser beam?
I believe that David Foster Wallace was a better essayist and thinker than a writer of fiction, and I like his oft-quoted Kenyon commencement address very much; in it, he suggests that controlling one’s attention is integral to happiness, to real wisdom, and to meaningful freedom:
I have come gradually to understand that the liberal arts cliché about teaching you how to think is actually shorthand for a much deeper, more serious idea: learning how to think really means learning how to exercise some control over how and what you think. It means being conscious and aware enough to choose what you pay attention to and to choose how you construct meaning from experience […] The really important kind of freedom involves attention and awareness and discipline, and being able truly to care about other people and to sacrifice for them over and over in myriad petty, unsexy ways every day. That is real freedom. That is being educated, and understanding how to think. The alternative is unconsciousness, the default setting, the rat race, the constant gnawing sense of having had, and lost, some infinite thing.
That entire multi-billion dollar industries exist solely to wrest your attention from you, using whatever pretext they can (“news is important,” “this is a community of your friends,” “you are creating things right now,” “this is vital information,” “this is sex,” “this is where culture is happening”), is proof of attention’s value and its profound connection to will. Attention permits us to control our minds, direct our thoughts, and orient our wills in accordance with what we think is best.
The Internet and television alike are primarily concerned with exploiting weaknesses in our processes for directing attention so that our will can be influenced. And whereas we might hope for, say, health, happiness, and the capacity to love and be loved, businesses want other things from us: “the constant gnawing sense of having had, and lost, some infinite thing,” and the corresponding will to buy it back from them.
It is not a conspiracy; it is the result of a haphazardly constructed system of entities, some virtuous and some not. And it is not catastrophic: it is merely noteworthy that the defining stupidity of my generation seems to be a fraught thoughtlessness, a disabling inattentiveness which steals every successive moment, and an inability to imagine others. Imagination is a moral act, as Alfred Polgar noted:
To reform an evildoer, you must before anything else help him to an awareness that what he did was evil. With the Nazis this won’t be easy. They know exactly what they’re doing: they just can’t imagine it.
The relationship between attention, awareness, moral decency, and happiness seems fairly straightforward to me, yet I sometimes feel that this knowledge has little or no impact on my behavior; sitting for hours reading what I don’t care to read, detesting commenters whose lives I might imagine and whose transgressions I might therefore forgive, scanning stream after addictive stream of semi-social expressions, ignoring Abby, my dogs, my life, the city around me, the sky, the clouds, my own heart, the need to sleep, the stacks of books I haven’t touched, and the fact that I remember little of what I read last week, I am aware of some malfunction in my mind: I have become thoughtless, superficial, other-directed.
And as I have grown older, it is less the stupidity that upsets me than my lost attention, my collapsing awareness, my shallow morality, predicated on judgment rather than forgiveness, and the sense that what stands between me and greater happiness is will alone, the very will which is being disabled, distracted moment by distracted moment, as I check Facebook on my iPhone while someone I ostensibly care for speaks to me.