Meta is Murder
“Look at the masterpiece, and not at the frame — and not at the faces of other people looking at the frame.”

Vladimir Nabokov in his lectures on Russian literature, opposing the primary type of academic and popular criticism: what we might call the demographic-reactive type. The overwhelming majority of opinion derives less from any internal response to a work of art (or political idea or cultural trend) than from what sorts of reactions we imagine on other faces looking at the frame, as it were.

If we’re observant, we see that when we encounter something we have often hardly finished perceiving it when we begin to imagine how others might react, and how still others would react to that reaction, and only at last do we begin to react according to our own demographic allegiances or resentments. We carry our friends, but still more our enemies, with us in every judgment.

The Internet has amplified this effect: you now have with you an audience judging your reactions; streams of posts and hashtagged messages from schools of thought, schools of attitude, schools of discourse. The Internet has pressed your face against the faces of others; they loom in your vision; they blot out the masterpieces; they stare at you from amidst the noise of their automatic opinions, scrolling endlessly away, appearing endlessly anew. The Internet comes with you to the theater. You cannot be alone with art or with facts or with nature: you will anticipate publicly, experience publicly, react publicly, reflect publicly, and you would not be human if such exposure did not subtly contort your stances, as, after all, you will be judged publicly.

Of course, the Internet is only an extension of what has always happened: we influence and are influenced. That mob-technopoly applies democratic pressures to the most trivial opinions, little silos of demography exerting their distributed force on how we think and feel, various web sites accruing Weltanschauungen meme by meme, is only “new” in that the Internet seems more insistent, more determined to rule on all questions and arbitrate all conflicts. No opinion is too small, and no one has the right to abstain.

Looking at frames and faces is an error; both belong to the category of “news” —“the froth & scum of the eternal sea”— whereas art aspires to be sub specie aeternitatis, aspires to meet us beyond the ephemeral in that part of ourselves that is beyond the ephemeral, that is not a merely political creature, is something other than an amalgamation of trending topics, fashionable poses, soon-to-be-invalidated certitudes from soon-to-be-forgotten luminaries, and the like.

The frame is everything to those who want to empower themselves at the masterpiece’s expense, subordinate the eternal to the present’s temporary concerns, make art a tool for their own elevation. The faces looking at the frame are the audience for this sort of critic, who produces formulaic reams about what their reactions mean and what the frame says about things like society. The sordid scene is a distraction from the art and from the viewer, a nullification of their import, the substitution of a banal system for what was a relation between two inimitable intelligences: artist and viewer, reader, listener. Systems bring power and election, and that is their utility: not that they illuminate art or help us understand it, but that they empanel fresh judges, a new relay of runners in history’s race.

We should not give our attention to this sideshow. People have set up stalls between the frames and the faces! There are industries operating there, seeking margins and protected by police! But perhaps we can press through to the painting on the wall or the words on the page. As Gombrowicz advised:

Stop pampering art, stop –for God’s sake!– this whole system of puffing it up and magnifying it; and, instead of intoxicating yourselves with legends, let facts create you. 

And this goes not only for artistic masterpieces but for any object of our contemplation: even a natural phenomenon, uninterrupted by posturing reactivity —“not yet descended into words”— can occasion the “receptive understanding, …contemplative beholding, and immersion in the real” that is the justification for asking that we be left alone. This immersion in the real, by art or by nature or however else we should come to it, is private, intimate, easily trampled by a crowd. But it is also our only means of combating artifice, touching the real, suspending the performance, experiencing ourselves and our world as we are, even if only for quickening moments of honest, solitary selfhood.

The Charisma of Leaders

“The leaders always had good consciences, for conscience in them coalesced with will, and those who looked on their face were as much smitten with wonder at their freedom from inner restraint as with awe at the energy of their outward performances.”

In The Varieties of Religious Experience, William James identifies the union of conscience and will in leaders as one of their defining attributes. By conscience he means their values, their morality, their meaning-systems; and by will he means their volition, their drive, their constant, daily intentionality. Thus: their actions are in accord with their ideals. Their desires constantly reflect their beliefs.

For most of us, this is not so: there is a frustrating gap between them, such that we’re not in accord with our own values, no matter how badly we wish to be. Our moral commitments are overwhelmed routinely, and our behavior subverts, distracts, and disappoints us. Perhaps we accept a remunerative job rather than dedicating our lives to what we feel is most important; or we pursue the important, but we get sleepy and head home from the office earlier than we suspect we should; we call in sick when we’re perfectly well; or we come to feel that our calling isn’t so important as we thought. We have doubts and waste time; we crave freedom and idle time, but regret our lack of purpose. We are not as dedicated in friendship as we aspire to be; we grow irritated by what we know is superficial, meaningless; and so on ad nauseam. Because this is one of the defining qualities of human life, examples abound and more are likely unneeded.

James says that for “leaders,” this is not so; and more importantly, because it is not so, we are “as much smitten with wonder at their freedom from inner restraint as with awe at the energy of their outward performances.”

The Steve Jobs Myths

No one who has read about Steve Jobs can escape a certain sense of perplexity concerning him. A figure praised as brilliant, profound, and revolutionary, someone who purportedly saw deeply into the mysteries of creativity and human life, and who was unquestionably responsible for a great deal of innovation, was also prone to facile irrationality, appallingly abusive and callow behavior, the dullest sorts of homilies, and seeming shallowness about his own attributes and habits.

Show a video of or read a passage about the man who absurdly concluded his commencement speech at Stanford with “stay hungry, stay foolish” —a hackneyed Hallmark phrase that might as well be printed on a motivational poster outside of Steve Ballmer’s office— to someone not already indoctrinated, and their reaction will surprise you. His pinched voice droning on with quite-typical businessman phrases; his endless references to the most ordinary pop-art, from The Beatles to U2 to John Mayer; his casually superficial understanding of the spirituality he ostensibly sought during various phases of his life; his fruitarian diets and basic scientific ignorance, suggestive of a narcissistic mysticism: these will all fail to impress an ordinary person. As with Apple’s often-cited but never-achieved marketing perfection, the myth obscures the truth. The “Reality Distortion Field” does not seem to work except on people for whom its existence is already a given, or for people who knew him in real life.

People who knew him, notably, often report a total awe at the power of his personality and mind, a power that overwhelmed them, catalyzed some of their greatest creativity and effort, inspired them with its focus and its capacity to find the point, the consequence, the animating vision in any effort. There is no question that Jobs was a rare sort of individual, one whom I credit with dramatically improving human access to creativity-supporting computation (among other feats that matter to me a great deal). But there is reason to wonder: in what did his greatness consist?

(Walter Isaacson’s wasteful biography is hardly helpful here, incidentally. It is a mere recounting of interviews, none well-contextualized or examined satisfactorily. It reads like an endless Time article.)

A Unity of Conscience and Will

What Jobs was was indefatigable, convinced of the rightness of his pursuits —whatever they happened to be at any given time— and always in possession of a unified conscience and will. Whether flattering or cajoling a partner, denying his responsibility for his daughter, steering a company or a project, humiliating a subordinate, driving designers and engineers to democratize the “bicycle for the mind” so that computation and software could transform lives around the world, or renovating his house, he was, as they say, “single-minded,” and he never seems to have suffered from distance between his values and his actions. He believed in what he did, and was perfectly content to do whatever it took to achieve his ends. It is hard to imagine Jobs haunted by regrets, ruing this or that interpersonal cruelty; moreover, one can imagine how he might justify not regretting his misdeed, deploying a worn California aphorism like “I believe in not looking back.”

Many are willing to behave this way, of course; any number of startup CEOs take adolescent pride in aping Jobs, driving their employees to long hours, performing a sham mercuriality, pushing themselves far past psychological health in order to show just how dedicated they are. Rare is the CEO for whom this produces better results, however, than he or she would have attained with ordinary management methods.

Perhaps this is because for them, it is an act: it is an adopted methodology selected in order to assure whatever the CEO’s goals are, whether they entail wealth or the esteem of peers or conformity to the heroic paradigm he or she most admires. That is to say: there is for him or her the typical chasm between conscience and will, and as social animals, we register their confusion as we register our own. And what we seek in leaders is confidence, not confusion.

For Jobs, while there were surely elements of performance —as there have been with history’s greatest leaders, tyrants and heroes alike— there was at core an iron unity of purpose and practice. This may have been the source of the charisma for which he is famous —which is emphatically not due to the reasons most typically cited— and it is also, as James notes, related to his “energy of…outward performance…” If you really believe in what you do —and Jobs seemed to believe in whatever he did, as a function of personality— you do not tire until your body is overcome. And Jobs, as is well known, pushed himself and others to exhaustion, to mental fragility, to breakdown.

Morality and Praxis

James does not explain why this kind of unity is so magnetic, so charismatic, but his broader discussion of various types of persons implies that it may have something to do with the perennial problem of human meaning: the confrontation between morality —which tends to be ideal— and praxis, in which innumerable considerations problematize and overwhelm us.

There are two exemplary solutions to this problem in human history, opposed to the third path most of us take: muddling through and bargaining in internal monologues about what we ought to be while compromising constantly:

  1. "Saints," who decide to live in accordance with religious values no matter the cost; for example, believing that money is both meaningless and corrupting, they vow poverty, and fall from society.
  2. "Leaders," who live in accordance with their own values, or values of some community that is worldly in its intentions, such that they do not drop from society but seek to instantiate their values in it.

In an age in which religious values are, even by the religious, not considered sufficient for a turn from society —an age of “the cross in the ballpark,” as Paul Simon says, of churches that promise “the rich life,” of believers who look in disgust at the instantiation of their religions’ values— the leader emerges as our most prominent solution to the problem of meaning. She is the embodiment of values and an agent of their transformative influence on the world. She has the energy of purpose, the dedication of the saint but remains within the world, and sometimes improves it.

The value or articulation of the ideas, it is appropriate to mention, is less important than we might think; in the case of Jobs, it is not crucial that he had a system of philosophy that charted the place of design in problem-solving, problem-solving in human advancement, human advancement in a moral context. Indeed, we might leave that to others entirely, others who write about such things rather than living each moment driving themselves and others to achieve them.

The toll leaders take is fearsome, but we admire them for using us up: better to be used, after all, than useless. This is why those who worked for Jobs so often cannot even begin to justify how he reduced so-and-so to tears, how he stole this or that bit of credit, how he crushed a former friend whom in his paranoia he suspected of disloyalty, and they scarcely care. What we admire about saints and leaders is not solely the values they exemplify but the totality with which they exemplify them, a totality alien to all of us whose lives are balanced between poles of conformity and dedication, commitment and restlessness.

Jobs himself understood the necessity of unifying conscience and will, but his words are no more capable of transforming us than an athlete’s post-game interview is of giving us physical talent:

Your work is going to fill a large part of your life, and the only way to be truly satisfied is to do what you believe is great work. And the only way to do great work is to love what you do. If you haven’t found it yet, keep looking. Don’t settle. As with all matters of the heart, you’ll know when you find it. And, like any great relationship, it just gets better and better as the years roll on. So keep looking. Don’t settle.

For most of us, settling is an inevitability, not because of insufficient exposure to these bland admonishments but because, unlike Jobs, we do not know what great work really is —we lack confidence in any system of values or ideals; we cannot give ourselves wholly over to anything without doubt; we cannot have faith, and utter dedication seems faintly ludicrous— or we cannot decide how much of ourselves or others we are willing to sacrifice. We want love and labor, freedom and meaning, flexibility and commitment. One has the strong sense that Jobs had no issue whatever with the idea of total, monomaniacal devotion to his cause, whatever that cause happened to be at any moment, whatever it demanded at any point of decision, however it was later judged. This is a kind of selfishness, too; it can hurt many people, and one cannot be assured that one is doing the right thing, since one might receive no signal from one’s family or peers that one’s dedication is sound, fruitful, worthwhile; for years of Jobs’ life, he did not. And of course: one might be wrong, and others might be misled, and one might immolate one’s life in error. There is no shortage of historical figures of whom we can say that such was the case.

When I read about Jobs, I am reminded far more of someone like Vince Lombardi than I am of any glamorous startup icon. Whether their monomania was “worth it” is of course a matter of whom you ask, and when. But imitating it is not useful; it is not a question of style or aesthetics or even ethics; monomania isn’t a process but a near-pathology, something that infests the mind, even as it brings it into accord with itself. Jobs seems to suggest that one should search for what infects one with it, and perhaps he was right, for while it is a dubious blessing, it is nevertheless one for which the world must often admit gratitude. As George Bernard Shaw famously said, “The reasonable man adapts himself to the world; the unreasonable one persists in trying to adapt the world to himself. Therefore all progress depends on the unreasonable man.” What is more reminiscent of Jobs than the unreasonable demand which despite the protests of all is satisfied, and which thereby improves the world?

That there is a tension between reasonableness and progress seems hard to accept, but it is also precisely the sort of befuddling dilemma that one encounters again and again in reading about Jobs: was it necessary for him to be cruel to be successful? Did he have to savage so-and-so in order to ship such-and-such? If he was such a man of taste, why were his artistic interests so undeveloped? Not only do I have no idea, it is obvious that among those who worked with him there is no sense of certainty either. This seems to me to reflect, in part, a simple fact: Jobs’ values are not common values, and even among those of us who admire him, his indifference to the feelings of others, even those who loved and needed him, is hard to accept. Jobs himself —like many leaders— seems impossible to resolve; there is no chatty, confessional “inner self” to be located in his words or books about him; one has the sense that no one ever got “inside him,” perhaps because “inside” is where the failed self resides, the self that falls short of its conscience, and Jobs simply didn’t have that sort of mind.

James’ formulation at least seems to bring us closer to understanding one component of his formula, however. His charisma —an enormous part of his ability to motivate and drive progress— was not due to any special intelligence, education, talent, or charm as we typically conceive of them, but due to something else: a conscience and a will unified with one another. To see a person for whom life is an instantiation of meaning, whose will reflects only their values, inspires us; it is meaning in action, the former province of religion, and it has a mysterious force over us that, despite our rational objections, turns us into “the faithful.”

“Whatever follies may be committed in art, once they are accepted among the upper classes of our society, a theory is at once elaborated to explain and legitimize these follies, as if there had never been epochs in history when certain exceptional circles of people had not accepted and approved of false, ugly, meaningless art, which left no traces and was completely forgotten afterwards. And we can see by what is going on now in the art of our circle what degree of meaninglessness and ugliness art can attain, especially when, as in our time, it knows it is regarded as infallible.”

Leo Tolstoy in What is Art?, quoted by Abby. The point is that we forget the limitless fallibility of contemporary human judgments even as we deride the past for its errors: “as if there had never been epochs” of worthless, celebrated art, decades and schools and theories and rebellions and geniuses all laboring towards the “false, ugly, [and] meaningless.” But when we walk through museums, we cannot believe that anything on the walls might be not merely “not to our liking,” but in fact bad, imbecilic, embarrassing!

And as if it were impossible that our museums should be so misled, when in fact it is a feature of the time that there seems to be no agreement between the common person and the expert —such as she is— as to art’s very definition, as to what art is, as to what qualifies as art. This definitional confusion results from an epistemologically debased philosophical culture in which even the ambitious give up and say, “Well, art is whatever anyone says is art!” or some similar nonsense. That is: we do not know what art is, we cannot distinguish it from non-art, and we do not think it is even possible in principle to do so.

It would be no surprise to me if a far smaller percentage of the canonical work of the past century or so endures —or even makes sense— for long; I sometimes suspect that we’re living through an extraordinarily ridiculous time, culturally, and I only hope that it will at least be comic for those who study it in the future.

Abby and I celebrate our three-year anniversary today. We met through Tumblr, as it happens, and so far as I can tell this was my first post for or about her. Many others followed as I fell in love, moved across the country, and settled in for a wonderful life with her. This past weekend, we hosted some great friends in town and helped one prepare in secret to propose to the other. The trip and the proposal alike went off perfectly, and we even took a photograph of the moment. It feels nice to be part of something like that.

I am not an easy person to love, but I doubt many of us are. Abby’s pretty easy to love, though, honestly. I hope I get to keep at it for a long time.

Suspending moral judgment is not the immorality of the novel; it is its morality. The morality that stands against the ineradicable human habit of judging instantly, ceaselessly, and everyone; of judging before, and in the absence of, understanding. From the viewpoint of the novel’s wisdom, that fervid readiness to judge is the most detestable stupidity, the most pernicious evil. Not that the novelist utterly denies that moral judgment is legitimate, but that he refuses it a place in the novel. If you like, you can accuse Panurge of cowardice; accuse Emma Bovary, accuse Rastignac—that’s your business; the novelist has nothing to do with it.

Creating the imaginary terrain where moral judgment is suspended was a move of enormous significance: only there could novelistic characters develop—that is, individuals conceived not as a function of some preexistent truth, as examples of good or evil, or as representations of objective laws in conflict, but as autonomous beings grounded in their own morality, in their own laws. Western society habitually presents itself as the society of the rights of man, but before a man could have rights, he had to constitute himself as an individual, to consider himself such and to be considered such; that could not happen without the long experience of the European arts and particularly of the art of the novel, which teaches the reader to be curious about others and to try to comprehend truths that differ from his own. In this sense E. M. Cioran is right to call European society “the society of the novel” and to speak of Europeans as “the children of the novel.”

Milan Kundera, Testaments Betrayed
Dani Lierow was so severely neglected for the first seven years of her life that some consider her a feral child; the original story of her rescue, recovery, and adoption was both shattering and arresting; it is impossible not to wonder at the silent interior world of an unsocialized mind, a mind both human and not:

She wouldn’t make eye contact. She didn’t react to heat or cold — or pain. The insertion of an IV needle elicited no reaction. She never cried. With a nurse holding her hands, she could stand and walk sideways on her toes, like a crab. She couldn’t talk, didn’t know how to nod yes or no. Once in a while she grunted …

[T]he scene at the house, along with Danielle’s almost comatose condition, led [doctors] to believe she had never been cared for beyond basic sustenance. Hard as it was to imagine, they doubted she had ever been taken out in the sun, sung to sleep, even hugged or held. She was fragile and beautiful, but whatever makes a person human seemed somehow missing…

The most extraordinary thing about Danielle… was her lack of engagement with people, with anything. “There was no light in her eye, no response or recognition… We saw a little girl who didn’t even respond to hugs or affection. Even a child with the most severe autism responds to those.”

A follow-up report by Melissa Lyttle, who took the photo above, details her progress with her extraordinary adoptive parents:

Three years later, Dani, now 12, has grown physically and emotionally. She’s a foot taller and clearly responsive to her dad’s affection. She hugs him back, kisses him and playfully bites his nose.

The photographs are quite moving; the Lierows seem like heroes.

“His own tweets were extremely boring; bland promotional links or seasonal announcements (‘Summer is coming. Quark Cola never tastes so good as at a backyard BBQ!’). Nevertheless, once posted, he watched the retweets and favourites amass. Often he visited the accounts of the retweeters, trying to establish what kind of person reposted the announcements of an impersonal soda drink corporation. But Twitter pages of individuals held a strange opacity of their own. A tiny mugshot, a list of tweets, and a personal network that you could sense but couldn’t see from the outside. Replies and retweets from other unknowable accounts. No context, no usable chronology. It was like having access to a stranger’s phonebook.”

Pierce Gleeson in “Four Million Followers,” a perfect short story which calls to mind a question that interests me: how will fiction be written, how will the interior world of the human mind be conveyed, now that so many of its elements depend for their description on branded language, non-words with ephemeral meanings, temporarily-universal but easily-forgotten software conventions? How intelligible will any of what we experience be to those just a decade older or younger?

Love affairs and suicide-threats within the confines of streams on screens, annotated with @s and requiring familiarity with all these little accidents of technology: how can this be turned into literature? Twitter is unlikely to endure even as long as ham radio, but what can one say about a typical contemporary teenager without mentioning the performative passive-aggression of the so-called sub-tweet? Language is being de-genericized; many necessary phrases are proprietary (and ludicrous), but worse is that whole constellations of words and ideas fade from our sky nightly, and are replaced by newer, brighter arrays by morning.

Soon no computer will have a manual: all devices will be listening, waiting to be touched, eager to understand you as you are, responsive to your intuitively-expressed desires; and all novels will come with manuals explaining “key concepts” and describing the relative synchronousness of this or that protocol, the trademarked terms of discarded products. While it’s only an acceleration of what has always been the case —after all, one must read about patronymics, footnotes about old customs, and so on in novels from the past— changes in degree can become changes in kind. Perhaps the novel won’t die from an absence of readers, but simply because who can write quickly enough? Readers will snicker at characters’ social networks as you might if you opened a book detailing a courtship over Friendster.

“It took me almost another decade after graduate school to figure out what writing really is, or at least what it could be for me; and what prompted this second lesson in language was my discovery of certain remaindered books … in which virtually every sentence had the force and feel of a climax, in which almost every sentence was a vivid extremity of language, an abruption, a definitive inquietude. These were books written by writers who recognized the sentence as the one true theater of endeavor, as the place where writing comes to a point and attains its ultimacy. As a reader, I finally knew what I wanted to read, and as someone now yearning to become a writer, I knew exactly what I wanted to try to write: narratives of steep verbal topography, narratives in which the sentence is a complete, portable solitude, a minute immediacy of consummated language—the sort of sentence that, even when liberated from its receiving context, impresses itself upon the eye and the ear as a totality, an omnitude, unto itself. I once later tried to define this kind of sentence as “an outcry combining the acoustical elegance of the aphorism with the force and utility of the load-bearing, tractional sentence of more or less conventional narrative.” The writers of such sentences became the writers I read and reread. I favored books that you could open to any page and find in every paragraph sentences that had been worked and reworked until their forms and contours and their organizations of sound had about them an air of having been foreordained—as if this combination of words could not be improved upon and had finished readying itself for infinity.”

Gary Lutz, “The Sentence is a Lonely Place,” reprinted in The Believer in 2009 and brought to my attention by Ben Lansky. This is as good an essay on writing as I’ve read, and it satisfies a requirement for a good discussion of craft which David Cole called “determinacy.” Writing about craft is determinate when it provides concrete, actionable knowledge.

Lutz is specific; he gets into details, documents the relations between letters, thinks about the components from which, after all, writing is made. An astonishing amount of writing-on-writing (and the overwhelming majority of writers themselves) fails to do so, preferring the heights of feelings and ideas and politics and so on.

“In her story “The Blood Jet,” Schutt ends a sentence about “life after a certain age” by describing it capsularly as “acutely felt, clearly flat”—two pairs of words in which an adverb precedes an adjective. The adjectives (felt and flat) are both monosyllabic, they are both four letters in length, and they both share the same consonantal casing: they begin with a tentative-sounding, deflating f and end with the abrupt t. In between the two ends of each adjective, Schutt retains the l, though it slides one space backward in the second adjective; and for the interior vowel, she moves downward from a short e to a short a. The predecessive adverbs acutely and clearly share the k-sounding c, and both words are constituted of virtually the same letters, except that clearly doesn’t retain the t of acutely. The four-word phrase has a resigned and final sound to it; there is more than a little agony in how, with just two little adjustments, felt has been diminished and transmogrified into flat, in how the richness of receptivity summed up in felt has been leveled into the thudding spiritlessness of flat. All of this emotion has been delivered by the most ordinary of words—nothing dredged up from a thesaurus. But what is perhaps most striking about the four-word phrase is the family resemblances between the two pairs of words. There is nothing in the letter-by-letter makeup of the phrase “clearly flat” that wasn’t already physically present in “acutely felt”; the second of the two phrases contains the alphabetic DNA of the first phrase. There isn’t, of course, an exact, anagrammatic correspondence between the two pairs of words; the u of the first pair, after all, hasn’t been carried over into the second pair. (Schutt isn’t stooping to recreational word games here.)
But the page-hugging, rather than page-turning, reader—the very reader whom a writer such as Schutt enthralls—cannot help noticing that the second phrase is a selective rearrangement, a selective redisposition, of the first one—a declension, really, as if, within the verbal environment of the story, there were no other direction for the letters in the first pair of words to go. There is nothing random about what has happened here. Schutt’s phrase has achieved the condition that Susan Sontag, in her essay about the prose of poets, called “lexical inevitability.”

Writers who aren’t thinking of this level of detail, who aren’t working on their sentences in this manner, aren’t writing; they’re talking in text. Poetry is typically the densest, most-perfected composition, where nothing is incidental, extraneous, automatic, empty, or indifferent to form; prose can approach poetics quite closely if written with attention and concern, with equivalent obsessions for form and content; but most writing is just content: ideas that might as well be written another way, stories that could be told (and are appreciated) as memes or mirrors for selves.

I was delighted to see DeLillo cited repeatedly in the essay (which would be worth reading if only as a sample of extraordinary sentences). I adore his sentences, which combine concreteness and poetics, plainness and depth, with such easy facility that I can stare at them for minutes, come back to them again and again trying to work them out; they seem either like magic or like the result of more-than-sufficiently advanced technology, operated with cold and perfect precision.

All of Lutz’s examples and analyses are a delight, worth reading whatever one’s relationship with prose.

Thanks to Rachel, Andi, Isaac, Joshua, Melissa, and Sheila Heti and everyone who came to the Tumblr/Believer party last night. I think I was the only reader who was not, in fact, a writer, and I was grateful to everyone for their forbearance. I also want to apologize for not opening with my planned joke:

Wit: Did you hear the story about the three holes in the ground?
Interlocutor: No, tell me!
Wit: Well, well, well…

Oh, and thanks to David for that joke. And to Vic for shooting this video. And to everyone, for everything.

“Good news. I am working on a 1400 page novel called “The Man Who Saw Life Clearly”. It will be post-modernist in style and is intended to be the first ironic treatment of irony. But that’s not all. It will be structured around Gödel’s incompleteness theorem, much as “Ulysses” is structured on “The Odyssey.” The problem of self-reference and the Turing machine will be developed novelistically for the first time.”
My father in an email to me from December of 1998! So far as I know, this novel remains unfinished. Happy belated Father’s Day, Dad!
“Cotter feels a mood coming on, a complicated self-pity, the strength going out of his arms and a voice commencing in his head that reproaches him for caring. And the awful part is that he wallows in it. He knows how to find the twisty compensation in this business of losing, being a loser, drawing it out, expanding it, making it sickly sweet, being someone carefully chosen for the role.”

Don DeLillo, Underworld. My entire life is a complicated self-pity.

(Surprised at “sickly sweet,” though; it had long been a cliché by the time Underworld was written. Could it be deliberate? As a boy’s self aspires to archetypes, takes the shape of a tradition, the language takes the shape of a cliché. No, that’s not sensible, is it?)

One of my heroes died on May 7th. His name was Denny Fitch, and I couldn’t have admired him more; I feel shamefully incapable of memorializing him, but fortunately one of my other heroes, Errol Morris, devoted an episode of his outstanding First Person series to Fitch and his role in the crash-landing of United Airlines Flight 232, in Sioux City, Iowa.

Fitch was a training-check-airman flying as a passenger, headed home to his wife and children, when the DC-10 suffered a catastrophe from which no airliner had ever recovered: the total loss of all flight-surface controls. The story of how Fitch and the flight crew responded to the task of landing an almost entirely uncontrollable jet airplane with nearly 300 people on board, how they considered landing on interstates, how their ground controllers told them they had no guidance because their situation wasn’t considered survivable, how they felt smashing into the ground, exploding, being thrown about as the plane burst into flames: it is a story only Errol Morris could coax, support, convey with the sort of power it merits.

Largely because of Fitch, 185 aboard survived, a fact one can hardly comprehend when one sees the video of the crash (at the start of the documentary above) or sees the photos.

It is a sad story, of course, but it is also —why do I flush to say this?— an inspiring story, and I think of Denny Fitch and Al Haynes and the passengers often, often, often; I do not want to use them, recycle them into metaphor, but I cannot help it; theirs was a kind of crucible of crisis, problem-solving, fear and its overcoming. When I learned today that Fitch had died of brain cancer, I cried and cried. I hate that we vainly personalize others’ deaths this way, but all I mean is that he was really important to me and many thousands of others, and that the basic, attainable, direct, courageous, disciplined spirit he had seems to me more important than nearly all other forms of heroics.

I suppose I simply feel grateful to him, and I recommend Errol Morris’ short documentary highly.