“Don’t be bored, don’t be lazy, don’t be trivial, and don’t be proud. The slightest loss of attention leads to death.”
“Most of us, by the time we leave childhood, have repressed our vision of the primary miraculousness of creation. We have closed it off, changed it, and no longer perceive the world as it is to raw experience. The great boon of repression is that it makes it possible to live decisively in an overwhelmingly miraculous and incomprehensible world, a world so full of beauty, majesty, and terror that if animals perceived it all they would be paralyzed to act. But nature has protected the lower animals by endowing them with instincts. It is very simple: Animals are not moved by what they cannot react to. They live in a tiny world, a sliver of reality, one neuro-chemical program that keeps them walking behind their noses and shuts everything else out. But look at man, the impossible creature. Here nature seems to have thrown caution to the winds along with the programmed instincts. She created an animal who has no defense against full perception of the external world, an animal completely open to experience. Not only in front of his nose, in his ‘umwelt,’ but in many other ‘umwelten.’ He can relate not only to animals in his own species, but in some ways to all other species. He can contemplate not only what is edible for him, but everything that grows. He not only lives in this moment, but expands his inner self to yesterday, his curiosity to centuries ago, his fears to five billion years from now when the sun will cool, his hopes to an eternity from now. He lives not only on a tiny territory, nor even on an entire planet, but in a galaxy, in a universe, and in dimensions beyond visible universes. It is appalling, the burden that man bears. He doesn’t know who he is, why he was born, what he is doing on the planet, what he is supposed to do, what he can expect. His own existence is incomprehensible to him, a miracle just like the rest of creation, closer to him but all the more strange. Each thing is a problem.
Man had to invent and create out of himself the limitations of perception and the equanimity to live on this planet. And so the core of psychodynamics, the formation of human character, is a study in human self-limitation and in the terrifying costs of that limitation.”
“He lay in bed open-eyed in the dark. There were intestinal moans from his left side, where gas makes a hairpin turn at the splenic flexure. He felt a mass of phlegm wobbling in his throat but he didn’t want to get out of bed to expel it, so he swallowed the whole nasty business, a slick syrupy glop. This was the texture of his life. If someone ever writes his true biography, it will be a chronicle of gas pains and skipped heartbeats, grinding teeth and dizzy spells and smothered breath, with detailed descriptions of Bill leaving his desk to walk to the bathroom and spit up mucus, and we see photographs of ellipsoid clots of cells, water, organic slimes, mineral salts and spotty nicotine. Or descriptions just as long and detailed of Bill staying where he is and swallowing. These were his choices, his days and nights. In the solitary life there was a tendency to collect moments that might otherwise blur into the rough jostle, the swing of a body through busy streets and rooms. He lived deeply in these cosmic-odd pauses. They clung to him. He was a sitting industry of farts and belches. This is what he did for a living, sit and hawk, mucus and flatus. He saw himself staring at the hair buried in his typewriter. He leaned above his oval tablets, hearing the grainy cut of the blade. In his sleeplessness he went down the batting order of the 1938 Cleveland Indians. This was the true man, awake with phantoms. He saw them take the field in all the roomy optimism of those old uniforms, the sun-bleached dinky mitts. The names of those ballplayers were his night prayer, his reverent petition to God, with wording that remained eternally the same. He walked down the hall to piss or spit. He stood by the window dreaming. This was the man he saw as himself. The biographer who didn’t examine these things (not that there would ever be a biographer) couldn’t begin to know the catchments, the odd-corner deeps of Bill’s true life.”
“The key to the creative type is that he is separated out of the common pool of shared meanings. There is something in his life experience that makes him take the world as a problem; as a result he has to make personal sense out of it. This holds true for all creative people to a greater or lesser extent, but it is especially obvious with the artist. Existence becomes a problem that needs an ideal answer; but when you no longer accept the collective solution to the problem of existence, then you must fashion your own. The work of art is, then, the ideal answer of the creative type to the problem of existence as he takes it in —not only the existence of the external world, but especially his own: who he is as a painfully separate person with nothing shared to lean on. He has to answer to the burden of his extreme individuation, his so painful isolation… His creative work is at the same time the expression of his heroism and the justification of it. It is his “private religion,” as [Otto] Rank put it.”
“There is much pride and suffering in every renunciation. Instead of retreating discreetly, without a big show of revolt and hatred, you denounce, emphatically and haughtily, others’ ignorance and illusions; you condemn their pleasures. … Suffering and the consciousness of its inescapability lead to renunciation; yet nothing would induce me, not even if I were to become a leper, to condemn another’s joy. There is much envy in every act of condemnation.”
“With the tremendous acceleration of life, we grow accustomed to using our mind and eye for seeing and judging incompletely or incorrectly, and all men are like travelers who get to know a land and its people from a train.”
“There could come a time when some information is so difficult to obtain on interactive systems that “truth” will be defined as that which is easily available, since selections are costly and must be made quickly. We may tend to assume that information which is not easily available does not need to be known.”
“You yourself have always objected to the opinion I give of myself. But even if it were not just it would still be necessary, as you would understand if you were subjected to as much scaling down and leveling by dozens of [critical] means, from historical comparison to personal attack. [My novel] has its share of faults but so do many other universally and deservedly admired books. This egalitarianism of men who do not care for themselves and therefore cannot allow others to give great value to human personality is extremely dangerous to writers who are after all devoted to a belief in the importance of human actions. The Gods, the saints, the heroes, these are human pictures of human qualities; the citizen, the man in the street, the man of the mass have become their antithesis. I am against the triumph of this antithesis….”
Technologies which are vectors for the transmission of information -narratives or datasets or social communication- inform human thought as surely as does language itself. The structures of one’s grammar and the content of one’s vocabulary -both dictional and conceptual- delimit one’s cognition, although perhaps not one’s range of emotion. Similarly, the structures of a given technology -which, as practical techniques for the application of theoretical scientific ideas, may be quite haphazardly determined by laboratory exigencies- can come to shape the minds of those who submit to it. This is because communicative technologies are, for us, extensions of language. And as is true of language, they can as easily enable stupidity as catalyze intelligence.
The sort of stupidity engendered by the Internet is thoughtlessness, not ignorance. While the latter is hardly vanquished, and indeed seems to be incomprehensibly, triumphantly persistent despite the avalanche of information beneath which we’re all buried, the Internet has merely brought to the fore what has always been true: most of us know very little; even the best know far less than they don’t know; and in a democracy, it is the right of anyone to participate passionately in debates about which s/he is totally uninformed.
But it is a fact, in its unnecessary defense, that the Internet combats ignorance and illiteracy; that it seems to be a sea of precisely those phenomena is proof only of how much combat was needed. For all its flaws, it is amusing to recall that the Internet is almost entirely textual, and never before have so many read and written so much.
There has been, however, an exchange, well-described by hundreds of cultural critics, most of whom, like the rest of us, are unable to avoid it: the Internet requires that we trade depth for breadth, knowledgeability of one rather facile sort for flickering, inattentive, unfocused, anxious, compulsively social mental tics. It doesn’t matter whether we want to make this exchange: unless we are prepared for ludditism we must accept what Nicholas Carr famously described in "Is Google Making Us Stupid?"
Over the past few years I’ve had an uncomfortable sense that someone, or something, has been tinkering with my brain, remapping the neural circuitry, reprogramming the memory. My mind isn’t going—so far as I can tell—but it’s changing. I’m not thinking the way I used to think. I can feel it most strongly when I’m reading. Immersing myself in a book or a lengthy article used to be easy. My mind would get caught up in the narrative or the turns of the argument, and I’d spend hours strolling through long stretches of prose. That’s rarely the case anymore. Now my concentration often starts to drift after two or three pages. I get fidgety, lose the thread, begin looking for something else to do. I feel as if I’m always dragging my wayward brain back to the text. The deep reading that used to come naturally has become a struggle.
Carr very ably describes some of the processes at play, the science behind them, their consequences, and he even contextualizes his concerns by noting that every generation seems disturbed by developments in media technologies:
In Plato’s Phaedrus, Socrates bemoaned the development of writing. He feared that, as people came to rely on the written word as a substitute for the knowledge they used to carry inside their heads, they would, in the words of one of the dialogue’s characters, “cease to exercise their memory and become forgetful.” And because they would be able to “receive a quantity of information without proper instruction,” they would “be thought very knowledgeable when they are for the most part quite ignorant.” They would be “filled with the conceit of wisdom instead of real wisdom.” Socrates wasn’t wrong—the new technology did often have the effects he feared—but he was shortsighted. He couldn’t foresee the many ways that writing and reading would serve to spread information, spur fresh ideas, and expand human knowledge (if not wisdom).
Indeed, as the succession of technologies has accelerated, it has become a generational banality that parents and teachers bemoan the baleful effects of each new phenomenon on their children and students. It is repetitive, and through repetition seems to have become comic, unpersuasive. Even though it is evident to all that, for example, the Internet decimates our capacity to pay meaningful attention to anything but ourselves -selves which are thereby reduced to clotted, neurotic urges, preening narcissistic anxiety, and fantasy-fueled envy- it would be the absolute pit of cliche to complain about it as many once did about television.
Television, Before It
Television, the last platform before the Internet to commandeer for profit the distribution of culture, fantasy, distraction, and connection, had as its essential quality a unilateral, centralized mode of information dispersal. It gathered the entirety of the Western world before its little screen and entranced it, speaking without interruption to a civilization of benumbed, mesmerized, enfeebled children, eager to submit to a satiating authority, eager to subsume themselves in fantasies of any sort, eager to escape time, eager to embrace passivity.
Passivity, then, was the concern of cultural critics. It was integral to televisual media, and it seemed to be shaping the minds of the masses into organs capable only of reception and absorption. It was lamented that through television, we’d reached a nadir of disengagement and apathy.
It would have seemed impossibly utopian to an observer that soon, average citizens would live in a world of text, a world in which it was perfectly typical to spend hours every day scanning articles, arguing politics in writing, corresponding in email and SMS and IM and on Facebook, reading editorials, news, analysis, entertainment, invective, encyclopedias, chasing trails of information through labyrinths of sources, and cultivating an identity based on these activities. It would sound scholastic!
Yet it is not so, and it is not so because superficial knowledge and superficial connections are not so different from the televisual ignorance and audience isolation the Internet has replaced. They are the transmissible tics of occupied minds, minds not self-directed but occupied by external forces who perhaps promise satiation or order but who are there for their own reasons. Why we forever seek occupation is a matter for another time.
Inherited Properties of Technology
Television and the Internet have much in common: even if one must secure access to them, they themselves are often offered without upfront payment. Just as NBC once asked only that you attend to its commercials, so does, say, The Huffington Post. And just like NBC, The Huffington Post is obliged to determine how it can best gather the most of your attention, exploiting you psychologically and conditioning you to be dissatisfied with whatever does not so stimulate you. Even sites that are not run for profit must compete on these terms: after all, most hope for attention as well, and cannot hope to secure it from stimulation-junkies accustomed to shorter-and-shorter articles, ever-more inflammatory rhetoric, easier-to-parse and easier-to-“share” information, and so on.
Their central divergence is in their organizational structure: television was one-to-many, a broadcast which permitted no interaction and therefore was incentivized to keep you passively watching. The Internet replaces the passivity of television with an enfeebling quality all its own: it is variously called tangentiality or laterality, and like the broadcast quality of television it has its origin in the fundamental structure of the World Wide Web: hypertextual linking. This model for association, connection, and navigation has its virtues, but its evolution, in conjunction with inevitable market pressures and human tendencies, has made linking antithetical rather than supplemental to depth, reflection, interiorization, focus, attention.
Attention and Imagination
Reading correspondence recently -of the sort sent through the ordinary mail in the 20th century- I was struck by the imaginative necessity it entailed: if you were to write a letter to a friend far away, and would not be able to amend or clarify or retract what you wrote until a reply came, perhaps weeks or months in the future, you would take care to imagine your friend, his reactions, his feelings. You would not distractedly dash off performative IMs while watching something else; the glacial pace of information exchange, like the skeletal sensory information of books, demanded imagination.
A friend or lover whom one can imagine, whom one interiorizes, who lives in one’s mind, whom one cannot interrupt, IM, scan, or take for granted, seems quaint now. We are no longer obliged to imagine anyone, and we are less compassionate, more splenetic, and lazier because of it. I have taken so many friendships for granted, and have too many acquaintances to give any the time they deserve. We struggle to write cleverly and compassionately to one another, but who has the time? Or -since we still have the same twenty-four hours in each day that we’ve always had- who has the attention? I feel real guilt over my failure to embrace so many whom I’ve met, but what can I do?
There are ethical consequences to the destruction of our interpersonal imaginations, to say nothing at all of tangentiality or laterality. We are intolerant in novel ways, rush to judge and publicly excoriate with astonishing speed, and we abandon causes as quickly as we find them. We pride ourselves more and more on caustic indignation, which now competes with televisual “cool” as the most-sought aesthetic and emotional ideal.
Wisdom and happiness
Perhaps, particularly from a capitalist’s perspective, the exchange of attention for distraction remains paradigmatic of a happy market transaction which only concerns the pretentious. What is wrong, after all, with being distracted, with having innumerable acquaintanceships, with exteriorizing one’s nervous system and memory and context-switching one’s mind into a blissful flickering laser beam?
I believe that David Foster Wallace was a better essayist and thinker than a writer of fiction, and I like his oft-quoted Kenyon commencement address very much; in it, he suggests that controlling one’s attention is integral to happiness, to real wisdom, and to meaningful freedom:
I have come gradually to understand that the liberal arts cliché about teaching you how to think is actually shorthand for a much deeper, more serious idea: learning how to think really means learning how to exercise some control over how and what you think. It means being conscious and aware enough to choose what you pay attention to and to choose how you construct meaning from experience […] The really important kind of freedom involves attention and awareness and discipline, and being able truly to care about other people and to sacrifice for them over and over in myriad petty, unsexy ways every day. That is real freedom. That is being educated, and understanding how to think. The alternative is unconsciousness, the default setting, the rat race, the constant gnawing sense of having had, and lost, some infinite thing.
That entire multi-billion dollar industries exist solely to wrest your attention from you, using whatever pretext they can -“news is important,” “this is a community of your friends,” “you are creating things right now,” “this is vital information,” “this is sex,” “this is where culture is happening”- is proof of attention’s value and its profound connection to will. Attention permits us to control our minds, direct our thoughts, and orient our wills in accordance with what we think is best.
The Internet and television alike are primarily concerned with exploiting weaknesses in our processes for directing attention so that our will can be influenced. And whereas we might hope for, say, health, happiness, and the capacity to love and be loved, businesses want other things from us: “the constant gnawing sense of having had, and lost, some infinite thing,” and the corresponding will to buy it back from them.
It is not a conspiracy; it is the result of a haphazardly constructed system of entities, some virtuous and some not. And it is not catastrophic: it is merely noteworthy that the defining stupidity of my generation seems to be a fraught thoughtlessness, a disabling inattentiveness which steals every successive moment, and an inability to imagine others. Imagination is a moral act, as Alfred Polgar noted:
To reform an evildoer, you must before anything else help him to an awareness that what he did was evil. With the Nazis this won’t be easy. They know exactly what they’re doing: they just can’t imagine it.
The relationship between attention, awareness, moral decency, and happiness seems fairly straightforward to me, yet I sometimes feel that this knowledge has little or no impact on my behavior; sitting for hours reading what I don’t care to read, detesting commenters whose lives I might imagine and whose transgressions I might therefore forgive, scanning stream after addictive stream of semi-social expressions, ignoring Abby, my dogs, my life, the city around me, the sky, the clouds, my own heart, the need to sleep, the stacks of books I haven’t touched, and the fact that I remember little of what I read last week, I am aware of some malfunction in my mind: I have become thoughtless, superficial, other-directed.
And as I have grown older, it is less the stupidity that upsets me than my lost attention, my collapsing awareness, my shallow morality, predicated on judgement rather than forgiveness, and the sense that what stands between me and greater happiness is will alone, the very will which is being disabled, distracted moment by distracted moment, as I check Facebook on my iPhone while someone I ostensibly care for speaks to me.
One of the fundamentally false qualities of most narrative fiction -literary or cinematic- is its typically episodic structure. Because neither in prose nor in film can time be presented in its full, ceaseless fluidity, it is corralled into “scenes”; it is because we narrate our own lives according to such forms that we believe scenes exist, when in our lives there is no such thing, no demarcated arcs of action set to neatly start and conclude; rather, we find, there is our unending occupancy of an escaping present moment.
Scenes! Our memory is so easily influenced! A lifetime of interiorizing episodic narratives makes it almost impossible to recall time as we experience it: the cacophony of sensations, fragmentary thoughts, partialized and distorted recollections, fantasies; the reactions which antedate our cognition and will; the noise; the happenstance and accidental phenomena in our minds. We do not experience the world as we think we do; memory is, from the outset, a falsification of experience, its reduction into a searchable scheme whose structure comes to us from narrative fiction and the properties of the language that informs it.
As such, we cannot understand the accurate portrayal of lived time present in books like Ulysses but can easily make sense of absurdly false television dramas; our interior lives are inscrutable to us, while impossible lies about how humans behave seem quite natural. After his wife’s death, C. S. Lewis noted the nightmarish weight of time and its absence of structure, its lack of chapters:
"One never meets just Cancer, or War, or Unhappiness (or Happiness). One only meets each hour or moment that comes. All manner of ups and downs. Many bad spots in our best times, many good ones in our worst. One never gets the total impact of what we call ‘the thing itself.’ But we call it wrongly. The thing itself is simply all those ups and downs: the rest is a name or an idea."
I think particularly of fights with those we love. After some reflection, I cannot recall a single work of fiction or film which captures -or even attempts to do so- the qualities of a domestic fight of the highest order: its relentlessness, its looping inescapability, the bleakness which comes to characterize all of one’s thoughts as one descends deeper and deeper into it. Fights are not an exchange of barbs which terminate when, music swelling, one character says something particularly pithy, leaving their counterpart stone silent and blank, before quitting the room and permitting all to move on to the next scene.
Fights are hallucinogenic and interminable. From within them, one cannot even tell who is truly at fault -though it generally seems that one’s lover is more at fault, even as one painfully recalls all the insults and exaggerations and distortions one is now responsible for, though “they provoked it!”- or whether fault is even relevant. One loses all of one’s painfully acquired wisdom about the stupidity of fights, about the meaninglessness of them, about the importance of love and respect. One loses one’s sense of time, one’s values, one’s ability to imagine the relationship as it was before the fight and will be after.
Indeed, one cannot imagine an “after”; the truly catastrophic arguments seem like black holes: no amount of love will permit one to escape from within; one has passed an event horizon, and will not now recover. One is drained, exhausted, miserable, angry but unsure of oneself, but sure that one ought not be as unsure as one’s partner, who seems not unsure in the least. Perhaps one will venture, tentatively, to reconcile: a remark about how stupid the fight is, how they ought not waste their time quarreling when they love one another; and one will be met with “that’s why I don’t get why you’re making a big deal out of this; I didn’t want to fight, but you had to bring up…”
Synchronicity is lost, the momentary reprieve vanishes, and it all begins again: a concrete instance of cyclical time that mocks linearity with every new phase, level, exchange. Late into the night, or well past one’s class time, the fight undulates and repeats and takes on new attributes or discards them, and references to previous statements become comically tricky: “I said that.” “No, you said that you bought two of them.” “No, not when I said that, when earlier I said I didn’t buy them.” “When did you say that? You didn’t!” “I did, when I was sitting in the chair over there.” “I don’t remember that.” “That’s because you weren’t listening.”
And on and on.
So! My questions for the industrious -and I should add that Abby and I are not fighting!- who wish to answer:
“He noted with distaste his own trick of appealing for sympathy. A personality had its own ways. A mind might observe them without approval. Herzog did not care for his own personality, and at the moment there was apparently nothing he could do about its impulses.”
“I’ve come to the conclusion that this has been the Great Dream of my generation: to position ourselves in such a way that we’re beyond mockery. To not look stupid. That’s the biggest crime of all —looking stupid.”