January 31st, 2014

David Foster Wallace & Trudy

In the years since reading A Supposedly Fun Thing I’ll Never Do Again, I’ve wondered irritably: was David Foster Wallace mocking real people in his essay on the cruise-ship experience? Specifically, this passage stayed with me:

"My favorite tablemate is Trudy, whose husband…has given his ticket to Alice, their heavy and extremely well-dressed daughter… every time Alice mentions [her boyfriend Patrick, Trudy] suffers some sort of weird facial tic or grimace where the canine tooth on one side of her face shows but the other side’s doesn’t. Trudy is fifty-six and looks –and I mean this in the nicest possible way– rather like Jackie Gleason in drag, and has a particularly loud pre-laugh scream that is a real arrhythmia-producer…"

Because Wallace returns to and discusses this group repeatedly and seems fond of them, it was hard to understand how he’d simultaneously savage them with sardonic insults like these; to be clear: he is mocking them for their appearance, the sound of their laughter, the personalities of their children, etc., in a national publication.

I was often told that Wallace was surely using an amalgam of characters, or even entirely conjured ones, despite the verite nature of the essay; these barbs, after all, are hard to square with the ethics expressed elsewhere in his work, and seem difficult to justify from the reader’s or writer’s perspective.

Nevertheless, it turns out that, in fact, he was mocking real people. He was asked about it long ago in “There’s Going To Be the Occasional Bit of Embellishment”: David Foster Wallace on Nonfiction, 1998, Part 3, an interview with Tom Scocca at Slate. The relevant portion is below:

Q: Also when you’re writing about real events, there are other people who are at the same events. Have you heard back from the people that you’re writing about? Trudy especially comes to mind—

DFW: [Groans]

Q: —who you described as looking like—

DFW: That, that was a very bad scene, because they were really nice to me on the cruise. And actually sent me a couple cards, and were looking forward to the thing coming out. And then it came out, and, you know, I never heard from them again. I feel—I’m worried that it hurt their feelings.

The. Thing. Is. Is, you know, saying that somebody looks like Jackie Gleason in drag, it might not be very nice, but if you just, if you could have seen her, it was true. It was just absolutely true. And so it’s one reason why I don’t do a lot of these, is there’s a real delicate balance between fucking somebody over and telling the truth to the reader.

Scocca does not press him on what sort of truth an insulting analogy is; in my opinion, it is a low order of truth, at the absolute best a physical description that could have been achieved in a less derisive way; that is: it is not a meaningful enough truth to matter much. But more importantly: there is a way to describe Trudy that isn’t a punchline. (The notion that there isn’t would reflect a total poverty of literary imagination.)

Note that Wallace himself equivocates about the utility of the analogy:

DFW: I wasn’t going to hurt anybody or, you know, talk about anybody having sex with a White House intern or something. But I was going to tell the truth. And I couldn’t just so worry about Trudy’s feelings that I couldn’t say the truth. Which is, you know, a terrific, really nice, and not unattractive lady who did happen to look just like Jackie Gleason in drag.

Q: Maybe if you’d emphasized that it was not in an unattractive way. Which is sort of a hard thing to picture.

DFW: Actually the first draft of that did have that, and the editor pointed out that not only did this waste words, but it looked like I was trying to have my cake and eat it too. That I was trying to tell an unkind truth but somehow give her a neck rub at the same time. So it got cut.

Q: But you actually did want to have your cake and eat it too. Not in a bad way.

DFW: I’m unabashed, I think, in wanting to have my cake and eat it too.

I think he ought to have been a little abashed by the proximity of phrases like “I wasn’t going to hurt anybody” and “I couldn’t just so worry about Trudy’s feelings” and “not unattractive” and “Jackie Gleason in drag.” So close to one another, they aren’t coherent.

Even Scocca has to note that it’s hard to picture someone looking like Jackie Gleason in drag yet not being unattractive. This means it is a poor analogy, a bad description. Wallace wants to convey that she looks a certain way and is not unattractive; instead, he conveys that she is maximally unattractive and makes a punchline of it, then says it’s for the “truth” before ambivalently wishing it didn’t have to be this way in writing (which it doesn’t).

It is a parting amusement (and a reminder of the 1990s) that Wallace asserts that he would never "talk about anybody having sex with a White House intern…but I was going to tell the truth"; eager to establish his bona fides as a reputable thinker who supports the right politics, Wallace seems not to consider very clearly the relative value of these two disclosures:

  1. That a sitting US president cheated on his wife with an intern employed by the government, then lied about it to a country that —however much this pains me and Wallace alike— wants to moralistically examine and judge the private lives of their elected figures and has every right to do so, as they are the people and this is a democracy (to avoid confusion: I wish America were more like France, indifferent to the private affairs of public citizens; but that is my wish, not the wish of most of my fellow citizens, to whom journalists are theoretically beholden)
  2. That a friendly, ordinary private citizen was overweight, ugly, had an awful laugh, and made faces at her heavy-set daughter whenever the latter mentioned her boyfriend.

It’s hard for me to understand the reasoning he must have employed in deciding that the first is either unimportant or merits the protections of discreet privacy, supported by strangers, while the latter —that a woman and her daughter aren’t attractive— is important in light of the imperatives of journalistic truth!

(Originally answered on Quora).

December 31st, 2013

So long, Bayou; so long, 2013.

August 29th, 2013

Landscapes by Frank Walter (1926-2009), shared here for use in escaping oneself and touching the ground of being, or being consoled amidst all the frothing, confused reactivity of culture by whatever abides, or some such.

July 21st, 2013

Description of a Struggle

If this piece gives you concerns about my viability as an employee, renter, applicant, neighbor, etc., please read this disclaimer / claimer.

I have an almost technical interest in attempting to describe the subjective experience of certain aberrant mental phenomena. Apart from any broader concerns and without concluding anything from it, then, here is an attempted accounting of what one might call a “breakdown” or an episode, an instance of bipolar collapse. For those interested: there was no interruption in my medication or treatment, but I’d had insufficient sleep the night before and some personal difficulties had catalyzed a terribly unhealthy mood.

I retreated into the bathroom, shut the door, and turned out the lights; I was very upset about many things, about all things; whatever thoughts formed were either dark and horrible at the outset or were pulled towards darkness very quickly. From one subject to the next: fears about the future, regrets about the past, guilt about my own moral failures, self-loathing because of my intransigent faults, fury at innumerable persons, shame at the force of my hatred and bitterness, and exhaustion with the perpetual systematic failure of my entire mind: personality, cognition, memory, emotion, will. In a few seconds I would cycle through these thoughts, which lead into and depart from one another, in an accelerating spiral that unified, separated, and then recomposed these threads again and again; it was rapid and repetitive.

This state is steady enough. It hurts very much, and I often sob into the floor, but it is not acute; it is simply painful to be filled with so much hatred for oneself, and to have this hatred permeate through one’s entirety: into one’s childhood memories, into one’s aesthetic sensibilities, into one’s sense of ethics. Because people were sleeping, I attempted mainly to cry noiselessly.

Suddenly, things grew vastly worse. I felt as though I had fallen backwards into a void at the absolute core of my mind, as though I had dropped through the variously false and detestable strata of my being into the reality of myself: nothingness, blackness, an abhorred vacuum around which swirled thoughts now far too fast to track, record, or resist. And I did indeed fall backwards, into the bathtub, because I felt exposed outside of it. I curled into myself and opened my mouth and screamed silently, my wet face draining from so many places that I worried as I gasped that I would aspirate tears, spit, snot.

Here is what I saw:

  • In the blackness of myself, I could see that my thoughts were not myself at all: my self is only a nothingness that exists in a state of pure terror and hatred, and my thoughts rotate around it as debris in a tornado. My thoughts were imbecilic, disgusting, vicious, superficial, detestable, but by this point I could no longer stay with them long enough to hate them. They distracted me, but I couldn’t attend to them. I said in my mind: “Oh god, oh god, oh god, nothing, nothing, nothing; oh god, nothing, nothing, oh god, I’m nothing, it’s nothing, there’s nothing, god, god.”
  • Periodically I would see what I assume was a phosphene, and it would transform into something real; I saw a glowing purple shape become the sun, and the sun became the blond hair I had in childhood. And I realized that I had murdered that boy, had murdered my own boyhood self, had destroyed this innocent child, and I ground my teeth to silence myself, as I wanted to scream so loud that I would tear myself apart, would explode in a bloody spray. I was sick with guilt and fear; I had nothing inside myself any longer; I felt I had betrayed myself, had orphaned myself when I needed someone most. I heard in my mind: “Why did I kill him? Oh god, he needed someone, he needed someone, why did I kill him, I’ve killed him, oh god, I’ve killed him.”
  • I was seized with a desire to gain physical access to and destroy my brain, an urge I felt in childhood when I had severe headaches. I grasped my hair and attempted to pull it out; I wanted to rip my scalp open and reach into my skull and destroy my mind, scramble and tear apart this malevolent and pathetic apparatus with my fingers, rip out the guts of my whole nightmare self. I couldn’t get my hair out, hated myself for it, lost the thread of this thought, and resumed my silent shrieking and sobbing.

I thought of my mother and my father, and I thought of Abby, but only for flashes: nothing would remain, everything was immediately carried off in this great storm of shame, fear, rage, and sorrow. I wept and wept, incapable of extending myself through time: it was the brutality of the present that crushed me, the incessant re-setting of the scene: any effort to elaborate a saving thought, a consolatory or even therapeutic idea, was in vain; all things were carried away at once, disappeared from me, receded into distance. I thought only of my own destruction.

I hurt myself crying: an extraordinarily pathetic feeling overtook me as I cramped from pushing against the walls of the tub and I turned onto my back, looking upwards. Everything was slowing down, and I realized it: I felt as though I was being pulled upward through the same strata, back up to the “higher order” consciousness from which I had moments ago felt permanently alienated. It wasn’t a happy feeling; it felt false, pointless. But it wasn’t volitional, and within minutes I was out of the bathroom, pulling on my clothes to prepare for the day. It was Abby’s company’s summer picnic; they rented out a water park.

June 16th, 2013

My father, his father, and his brother in the 1950s. It took me an inexcusably long time to realize how much of what I like about myself, how much of what enables the happiness or goodness I attain, I owe to him; that strange interference that can distort a daughter’s perception of her mother has its counterpart between fathers and sons, everyone knows that; but it took me by surprise nevertheless how much I’d identified and appreciated those things that came from my mother while assuming that virtues, interests, and ideas which he gave me were my own inventions. I’m no longer quite so deluded.

Happy father’s day, dad!

May 20th, 2013

This photo was taken in 2003, when she was just a few years old. We had so many adventures over the years; I don’t know how much less I might have lived, how much more closed I’d have been, had I not taken her home from the veterinary hospital where I worked. It was 2001, and she’d been found, hairless and bruised and infected with mange and scabies and worms, in Bayou St. John; they dropped her with us, but she was nearly feral. In taking care of her, I bonded with her and took her home over the reasonable objections of many there, who’d noted how damaged and neurotic she was.

Tonight, Abby and I pressed our wet faces to her head as a doctor euthanized Bayou. She was 13 years old, dying from a bleeding belly tumor, too weak to move anything but her eyes. She was always so tough and sweet, always my close companion. These past years in San Francisco were a dream for her, and I guess I’ll try to hang on to that now that she’s gone.

Here are some photos of her being wonderful. Aren’t some of those fun? We were so much younger, and Louisiana was so green. And here are all the times I posted about her. I don’t care about these words in the slightest; for some reason, I just want to share her with you, show you photos of how she played and ran. She was here, with me, for the happiest years of my life.

May 18th, 2013

Miles Barger posted this wonderful image from The Neighbors, a photographic series by Arne Svenson of scenes in the windows of his Manhattan neighbors. They seem to assert the primacy of unknowable interior spaces, those buried within decor and personality, deeper within ourselves than our names go, deeper than our uniquenesses, into those places where we are archetypes, reacting without will to dreams and fears.

Reblogged from Miles Barger
May 18th, 2013
The Church has become close to me in its distrust of man, and my distrust of form, my urgent desire to withdraw from it, to claim ‘that that is not yet I,’ which accompanies my every thought and feeling, coincides with the intentions of its doctrine. The Church is afraid of man and I am afraid of man. The Church does not trust man and I do not trust man. The Church, in opposing temporality to eternity, heaven to earth, tries to provide man with the distance [from] his own nature that I find indispensable. And nowhere does this affiliation mark itself more strongly than in our approach to Beauty. Both the Church and I fear beauty in this vale of tears, we both strive to defuse it, we want to defend ourselves against its excessive allure. The important thing for me is that it and I both insist on the division of man: the Church into the divine and the human component, I into life and consciousness. After the period in which art, philosophy, and politics looked for the integral, uniform, concrete, and literal man, the need for an elusive man who is a play of contradictions, a fountain of gushing antinomies and a system of infinite compensation, is growing. He who calls this “escapism” is unwise…
The irreligious Witold Gombrowicz articulating some of the reasons why even the incredulous might find credulity closer to their principles than many popular forms of unexamined, incoherently reductive materialism.
May 12th, 2013

My mother doesn’t care for mother’s day, dislikes its manufactured manipulation of sentiment; she asks us not to do anything commercial for the occasion, so this is all I’ll do: post these photos of her and note, only half-knowing what I mean, that the older I get the harder it is to think of her without falling all the way into the deepest parts of my own heart. I am not becoming her equal as I age; she constitutes the sky under which I grow, the sky whose scope exceeds the terrestrial horizons which limit my own vision, the sky beyond the mountains and above the clouds, the sky which surrounds the earth. I am thirty-two years old and I love her.

April 12th, 2013

Free Will & the Fallibility of Science

One of the most significant intellectual errors educated persons make is in underestimating the fallibility of science. The very best scientific theories containing our soundest, most reliable knowledge are certain to be superseded, recategorized from “right” to “wrong”; they are, as physicist David Deutsch says, misconceptions:

I have often thought that the nature of science would be better understood if we called theories “misconceptions” from the outset, instead of only after we have discovered their successors. Thus we could say that Einstein’s Misconception of Gravity was an improvement on Newton’s Misconception, which was an improvement on Kepler’s. The neo-Darwinian Misconception of Evolution is an improvement on Darwin’s Misconception, and his on Lamarck’s… Science claims neither infallibility nor finality.

This fact comes as a surprise to many; we tend to think of science —at the point of conclusion, when it becomes knowledge— as being more or less infallible and certainly final. Science, indeed, is the sole area of human investigation whose reports we take seriously to the point of crypto-objectivism. Even people who very much deny the possibility of objective knowledge step onto airplanes and ingest medicines. And most importantly: where science contradicts what we believe or know through cultural or even personal means, we accept science and discard those truths, often wisely.

An obvious example: the philosophical problem of free will. When Newton’s misconceptions were still considered the exemplar of truth par excellence, the very model of knowledge, many philosophers felt obliged to accept a kind of determinism with radical implications. Given the initial state of the universe, it appeared, we should be able to follow all particle trajectories through the present, account for all phenomena through purely physical means. In other words: the chain of causation from the Big Bang on left no room for your volition:

Determinism in the West is often associated with Newtonian physics, which depicts the physical matter of the universe as operating according to a set of fixed, knowable laws. The “billiard ball” hypothesis, a product of Newtonian physics, argues that once the initial conditions of the universe have been established, the rest of the history of the universe follows inevitably. If it were actually possible to have complete knowledge of physical matter and all of the laws governing that matter at any one time, then it would be theoretically possible to compute the time and place of every event that will ever occur (Laplace’s demon). In this sense, the basic particles of the universe operate in the same fashion as the rolling balls on a billiard table, moving and striking each other in predictable ways to produce predictable results.

Thus: the movement of the atoms of your body, and the emergent phenomena that such movement entails, can all be physically accounted for as part of a chain of merely physical, causal steps. You do not “decide” things; your “feelings” aren’t governing anything; there is no meaning to your sense of agency or rationality. From this essentially unavoidable philosophical position, we are logically compelled to derive many political, moral, and cultural conclusions. For example: if free will is a phenomenological illusion, we must deprecate phenomenology in our philosophies; it is the closely clutched delusion of a faulty animal; people, as predictable and materially reducible as commodities, can be reckoned by governments and institutions as though they are numbers. Freedom is a myth; you are the result of a process you didn’t control, and your choices aren’t choices at all but the results of laws we can discover, understand, and base our morality upon.
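The billiard-ball picture above can be made concrete with a toy computation. This is only an illustrative sketch of my own (a made-up one-dimensional harmonic system, not anything from the texts quoted here), but it shows the deterministic claim in miniature: a fixed law plus complete knowledge of the initial conditions fixes the entire future, so two runs from the same starting state can never diverge.

```python
# Toy illustration of Laplace's demon: under a fixed, knowable law,
# the initial state of a system determines its whole trajectory.
def evolve(state, steps, dt=0.01):
    """Advance a (position, velocity) pair under a fixed law
    (here, a simple harmonic force F = -x), step by step."""
    x, v = state
    for _ in range(steps):
        a = -x          # the "law of nature": fixed and exceptionless
        v += a * dt
        x += v * dt
    return (x, v)

# Identical initial conditions yield identical futures, always:
assert evolve((1.0, 0.0), 1000) == evolve((1.0, 0.0), 1000)
```

The point of the sketch is only that, in such a universe, nothing about the future is open: re-running history from the same start produces the same history, which is exactly the picture the determinist extends to the atoms of your body.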

I should note now that (1) many people, even people far from epistemology, accept this idea, conveyed via the diffusion of science and philosophy through politics, art, and culture, that most of who you are is determined apart from your will; and (2) the development of quantum physics has not in itself upended the theory that free will is an illusion, as the sort of indeterminacy we see among particles does not provide sufficient room, as it were, for free will.

Of course, few of us can behave for even a moment as though free will is a myth; there would be no reason for personal engagement with ourselves, no justification for “trying” or “striving”; one would be, at best, a robot-like automaton incapable of self-control but capable of self-observation. One would account for one’s behaviors not with reasons but with causes; one would be profoundly divested from outcomes which one cannot affect anyway. And one would come to hold that, in its basic conception of time and will, the human consciousness was totally deluded.

As it happens, determinism is a false conception of reality. Physicists like David Deutsch and Ilya Prigogine have, in my opinion, defended free will amply on scientific grounds; and the philosopher Karl Popper described how free will is compatible in principle with a physicalist conception of the universe; he is quoted by both scientists, and Prigogine begins his book The End of Certainty, which proposes that determinism is no longer compatible with science, by alluding to Popper:

Earlier this century in The Open Universe: An Argument for Indeterminism, Karl Popper wrote, “Common sense inclines, on the one hand, to assert that every event is caused by some preceding events, so that every event can be explained or predicted… On the other hand, … common sense attributes to mature and sane human persons… the ability to choose freely between alternative possibilities of acting.” This “dilemma of determinism,” as William James called it, is closely related to the meaning of time. Is the future given, or is it under perpetual construction?

Prigogine goes on to demonstrate that there is, in fact, an “arrow of time,” that time is not symmetrical, and that the future is very much open, very much compatible with the idea of free will. Thus: in our lifetimes we have seen science —or parts of the scientific community, with the rest to follow in tow— reclassify free will from “illusion” to “likely reality”; the question of your own role in your future, of humanity’s role in the future of civilization, has been answered differently just within the past few decades.

No more profound question can be imagined for human endeavor, yet we have an inescapable conclusion: our phenomenologically obvious sense that we choose, decide, change, perpetually construct the future was for centuries contradicted falsely by “true” science. Prigogine’s work and that of his peers —which he calls a “probabilizing revolution” because of its emphasis on understanding unstable systems and the potentialities they entail— introduces concepts that restore the commonsensical conceptions of possibility, futurity, and free will to defensibility.

If one has read the tortured thinking of twentieth-century intellectuals attempting to unify determinism and the plain facts of human experience, one knows how submissive we now are to the claims of science. As Prigogine notes, we were prepared to believe that we, “as imperfect human observers, [were] responsible for the difference between past and future through the approximations we introduce into our description of nature.” Indeed, one has the sense that the more counterintuitive the scientific claim, the more eager we are to deny our own experience in order to demonstrate our rationality.

This is only degrees removed from ordinary orthodoxies. The point is merely that the very best scientific theories remain misconceptions, and that where science contradicts human truths of whatever form, it is rational to at least contemplate the possibility that science has not advanced enough yet to account for them; we must be pragmatic in managing our knowledge, aware of the possibility that some truths we intuit we cannot yet explain, while other intuitions we can now abandon. My personal opinion, as you can imagine, is that we take too little note of the “truths,” so to speak, found in the liberal arts, in culture.

It is vital to consider how something can be both true and not in order to understand science and its limitations, and even more the limitations of second-order sciences (like the social sciences). Newton’s laws were incredible achievements of rationality, verified by all technologies and analyses for hundreds of years, before their unforeseen exposure as deeply flawed ideas, valid only within a limited domain, which taken as a whole yield incorrect predictions and erroneous metaphorical structures for understanding the universe.

I never tire of quoting Karl Popper’s dictum:

Whenever a theory appears to you as the only possible one, take this as a sign that you have neither understood the theory nor the problem which it was intended to solve.

It is hard but necessary to have this relationship with science, whose theories seem like the only possible answers and whose obsolescence we cannot envision. A rational person in the nineteenth century would have laughed at the suggestion that Newton was in error; he could not have known about the sub-atomic world or the forces and entities at play in the world of general relativity; and he especially could not have imagined how a theory that seemed utterly, universally true and whose predictive and explanatory powers were immense could still be an incomplete understanding, revealed by later progress to be completely mistaken about nearly all of its claims.

Can you imagine such a thing? It will happen to nearly everything you know. Consider what “ignorance” and “knowledge” really are for a human, what you can truly be certain of, how you should judge others given this overwhelming epistemological instability!

March 26th, 2013
Look at the masterpiece, and not at the frame — and not at the faces of other people looking at the frame.

Vladimir Nabokov in his lectures on Russian literature, opposing the primary type of academic and popular criticism: what we might call the demographic-reactive type. The overwhelming majority of opinion derives less from any internal response to a work of art (or political idea or cultural trend) than from what sorts of reactions we imagine on other faces looking at the frame, as it were.

If we’re observant, we see that when we encounter something we have often hardly finished perceiving it when we begin to imagine how others might react, and how still others would react to that reaction, and only at last do we begin to react according to our own demographic allegiances or resentments. We carry our friends, but still more our enemies, with us in every judgment.

The Internet has amplified this effect: you now have with you an audience judging your reactions; streams of posts and hashtagged messages from schools of thought, schools of attitude, schools of discourse. The Internet has pressed your face against the faces of others; they loom in your vision; they blot out the masterpieces; they stare at you from amidst the noise of their automatic opinions, scrolling endlessly away, appearing endlessly anew. The Internet comes with you to the theater. You cannot be alone with art or with facts or with nature: you will anticipate publicly, experience publicly, react publicly, reflect publicly, and you would not be human if such exposure did not subtly contort your stances, as, after all, you will be judged publicly.

Of course, the Internet is only an extension of what has always happened: we influence and are influenced. That mob-technopoly applies democratic pressures to the most trivial opinions, little silos of demography exerting their distributed force on how we think and feel, various web sites accruing weltanschauungen meme by meme, is only “new” in that the Internet seems more insistent, more determined to rule on all questions and arbitrate all conflicts. No opinion is too small, and no one has the right to abstain.

Looking at frames and faces is an error; both belong to the category of “news” —"the froth & scum of the eternal sea"— whereas art aspires to be sub specie aeternitatis, aspires to meet us beyond the ephemeral in that part of ourselves that is beyond the ephemeral, that is not a merely political creature, is something other than an amalgamation of trending topics, fashionable poses, soon-to-be-invalidated certitudes from soon-to-be-forgotten luminaries, and the like.

The frame is everything to those who want to empower themselves at the masterpiece’s expense, subordinate the eternal to the present’s temporary concerns, make art a tool for their own elevation. The faces looking at the frame are the audience for this sort of critic, who produces formulaic reams about what their reactions mean and what the frame says about things like society. The sordid scene is a distraction from the art and from the viewer, a nullification of their import, the substitution of a banal system for what was a relation between two inimitable intelligences: artist and viewer, reader, listener. Systems bring power and election, and that is their utility: not that they illuminate art or help us understand it, but that they empanel fresh judges, a new relay of runners in history’s race.

We should not give our attention to this sideshow. People have set up stalls between the frames and the faces! There are industries operating there, seeking margins and protected by police! But perhaps we can press through to the painting on the wall or the words on the page. As Gombrowicz advised:

Stop pampering art, stop –for God’s sake!– this whole system of puffing it up and magnifying it; and, instead of intoxicating yourselves with legends, let facts create you. 

And this goes not only for artistic masterpieces but for any object of our contemplation: even a natural phenomenon, uninterrupted by posturing reactivity —“not yet descended into words”— can occasion the “receptive understanding, …contemplative beholding, and immersion in the real” that is the justification for asking that we be left alone. This immersion in the real, by art or by nature or however else we should come to it, is private, intimate, easily trampled by a crowd. But it is also our only means of combating artifice, touching the real, suspending the performance, experiencing ourselves and our world as we are, even if only for quickening moments of honest, solitary selfhood.

February 27th, 2013

The Charisma of Leaders

“The leaders always had good consciences, for conscience in them coalesced with will, and those who looked on their face were as much smitten with wonder at their freedom from inner restraint as with awe at the energy of their outward performances.”

In The Varieties of Religious Experience, William James identifies the union of conscience and will in leaders as one of their defining attributes. By conscience he means their values, their morality, their meaning-systems; and by will he means their volition, their drive, their constant, daily intentionality. Thus: their actions are in accord with their ideals. Their desires constantly reflect their beliefs.

For most of us, this is not so: there is a frustrating gap between them, such that we’re not in accord with our own values, no matter how badly we wish to be. Our moral commitments are overwhelmed routinely, and our behavior subverts, distracts, and disappoints us. Perhaps we accept a remunerative job rather than dedicating our lives to what we feel is most important; or we pursue the important, but we get sleepy and head home from the office earlier than we suspect we should; we call in sick when we’re perfectly well; or we come to feel that our calling isn’t so important as we thought. We have doubts and waste time; we crave freedom and idle time, but regret our lack of purpose. We are not as dedicated in friendship as we aspire to be; we grow irritated by what we know is superficial, meaningless; and so on ad nauseam. Because this is one of the defining qualities of human life, examples abound and more are likely unneeded.

James says that for “leaders,” this is not so; and more importantly, because it is not so, we are “as much smitten with wonder at their freedom from inner restraint as with awe at the energy of their outward performances.”

The Steve Jobs Myths

No one who has read about Steve Jobs can escape a certain sense of perplexity concerning him. A figure praised as brilliant, profound, and revolutionary, someone who purportedly saw deeply into the mysteries of creativity and human life, and who was unquestionably responsible for a great deal of innovation, was also prone to facile irrationality, appallingly abusive and callow behavior, the dullest sorts of homilies, and seeming shallowness about his own attributes and habits.

Show a video of or read a passage about the man who absurdly concluded his commencement speech at Stanford with “stay hungry, stay foolish” —a hackneyed Hallmark phrase that might as well be printed on a motivational poster outside of Steve Ballmer’s office— to someone not already indoctrinated, and their reaction will surprise you. His pinched voice droning on with quite-typical businessman phrases; his endless references to the most ordinary pop art, from The Beatles to U2 to John Mayer; his casually superficial understanding of the spirituality he ostensibly sought during various phases of his life; his fruitarian diets and basic scientific ignorance, suggestive of a narcissistic mysticism: these will all fail to impress an ordinary person. As with Apple’s often-cited but never-achieved marketing perfection, the myth obscures the truth. The "Reality Distortion Field" does not seem to work except on people for whom its existence is already a given, or for people who knew him in real life.

People who knew him, notably, often report a total awe at the power of his personality and mind, a power that overwhelmed them, catalyzed some of their greatest creativity and effort, inspired them with its focus and its capacity to find the point, the consequence, the animating vision in any effort. There is no question that Jobs was a rare sort of individual, one whom I credit with dramatically improving human access to creativity-supporting computation (among other feats that matter to me a great deal). But there is reason to wonder: in what did his greatness consist?

(Walter Isaacson’s wasteful biography is hardly helpful here, incidentally. It is a mere recounting of interviews, none well-contextualized or examined satisfactorily. It reads like an endless Time article).

A Unity of Conscience and Will

What Jobs was, was indefatigable: convinced of the rightness of his pursuits —whatever they happened to be at any given time— and always in possession of a unified conscience and will. Whether flattering or cajoling a partner, denying his responsibility for his daughter, steering a company or a project, humiliating a subordinate, driving designers and engineers to democratize the “bicycle for the mind” so that computation and software could transform lives around the world, or renovating his house, he was, as they say, “single-minded,” and he never seems to have suffered from distance between his values and his actions. He believed in what he did, and was perfectly content to do whatever it took to achieve his ends. It is hard to imagine Jobs haunted by regrets, ruing this or that interpersonal cruelty; moreover, one can imagine how he might justify not regretting his misdeeds, deploying a worn California aphorism like “I believe in not looking back.”

Many are willing to behave this way, of course; any number of startup CEOs take adolescent pride in aping Jobs, driving their employees to long hours, performing a sham mercuriality, pushing themselves far past psychological health in order to show just how dedicated they are. Rare is the CEO for whom this produces better results, however, than he or she would have attained with ordinary management methods.

Perhaps this is because for them, it is an act: a methodology adopted to secure whatever the CEO’s goals are, whether they entail wealth or the esteem of peers or conformity to the heroic paradigm he or she most admires. That is to say: there is for him or her the typical chasm between conscience and will, and as social animals, we register that confusion as we register our own. And what we seek in leaders is confidence, not confusion.

For Jobs, while there were surely elements of performance —as there have been with history’s greatest leaders, tyrants and heroes alike— there was at core an iron unity of purpose and practice. This may have been the source of the charisma for which he is famous —which is emphatically not due to the reasons most typically cited— and it is also, as James notes, related to his “energy of…outward performance…” If you really believe in what you do —and Jobs seemed to believe in whatever he did, as a function of personality— you do not tire until your body is overcome. And Jobs, as is well known, pushed himself and others to exhaustion, to mental fragility, to breakdown.

Morality and Praxis

James does not explain why this kind of unity is so magnetic, so charismatic, but his broader discussion of various types of persons implies that it may have something to do with the perennial problem of human meaning: the confrontation between morality —which tends to be ideal— and praxis, in which innumerable considerations problematize and overwhelm us.

There are two exemplary solutions to this problem in human history, as opposed to the third path most of us take (muddling through, bargaining in internal monologues about what we ought to be while compromising constantly):

  1. "Saints," who decide to live in accordance with religious values no matter the cost; for example, believing that money is both meaningless and corrupting, they vow poverty, and fall from society.
  2. "Leaders," who live in accordance with their own values, or values of some community that is worldly in its intentions, such that they do not drop from society but seek to instantiate their values in it.

In an age in which religious values are, even by the religious, not considered sufficient for a turn from society —an age of “the cross in the ballpark,” as Paul Simon says, of churches that promise “the rich life,” of believers who look in disgust at the instantiation of their religions’ values— the leader emerges as our most prominent solution to the problem of meaning. She is the embodiment of values and an agent of their transformative influence on the world. She has the energy of purpose, the dedication of the saint but remains within the world, and sometimes improves it.

The value or articulation of the ideas, it is appropriate to mention, is less important than we might think; in the case of Jobs, it is not crucial that he had a system of philosophy that charted the place of design in problem-solving, problem-solving in human advancement, human advancement in a moral context. Indeed, we might leave that to others entirely, others who write about such things rather than living each moment driving themselves and others to achieve them.

The toll leaders take is fearsome, but we admire them for using us up: better to be used, after all, than useless. This is why those who worked for Jobs so often cannot even begin to justify how he reduced so-and-so to tears, how he stole this or that bit of credit, how he crushed a former friend whom in his paranoia he suspected of disloyalty, and they scarcely care. What we admire about saints and leaders is not solely the values they exemplify but the totality with which they exemplify them, a totality alien to all of us whose lives are balanced between poles of conformity and dedication, commitment and restlessness.

Jobs himself understood the necessity of unifying conscience and will, but his words are no more capable of transforming us than an athlete’s post-game interview is of giving us physical talent:

Your work is going to fill a large part of your life, and the only way to be truly satisfied is to do what you believe is great work. And the only way to do great work is to love what you do. If you haven’t found it yet, keep looking. Don’t settle. As with all matters of the heart, you’ll know when you find it. And, like any great relationship, it just gets better and better as the years roll on. So keep looking. Don’t settle.

For most of us, settling is an inevitability, not because of insufficient exposure to these bland admonishments but because, unlike Jobs, we do not know what great work really is —we lack confidence in any system of values or ideals; we cannot give ourselves wholly over to anything without doubt; we cannot have faith, and utter dedication seems faintly ludicrous— or we cannot decide how much of ourselves or others we are willing to sacrifice. We want love and labor, freedom and meaning, flexibility and commitment. One has the strong sense that Jobs had no issue whatever with the idea of total, monomaniacal devotion to his cause, whatever that cause happened to be at any moment, whatever it demanded at any point of decision, however it was later judged. This is a kind of selfishness, too; it can hurt many people, and one cannot be assured that one is doing the right thing, since one might receive no signal from one’s family or peers that one’s dedication is sound, fruitful, worthwhile; for years of Jobs’ life, he did not. And of course: one might be wrong, and others might be misled, and one might immolate one’s life in error. There is no shortage of historical figures of whom we can say that such was the case.

When I read about Jobs, I am reminded far more of someone like Vince Lombardi than I am of any glamorous startup icon. Whether their monomania was “worth it” is of course a matter of whom you ask, and when. But imitating it is not useful; it is not a question of style or aesthetics or even ethics; monomania isn’t a process but a near-pathology, something that infests the mind, even as it brings it into accord with itself. Jobs seems to suggest that one should search for what infects one with it, and perhaps he was right, for while it is a dubious blessing, it is nevertheless one for which the world must often admit gratitude. As George Bernard Shaw famously said, “The reasonable man adapts himself to the world; the unreasonable one persists in trying to adapt the world to himself. Therefore all progress depends on the unreasonable man.” What is more reminiscent of Jobs than the unreasonable demand which despite the protests of all is satisfied, and which thereby improves the world?

That there is a tension between reasonableness and progress seems hard to accept, but it is also precisely the sort of befuddling dilemma that one encounters again and again in reading about Jobs: was it necessary for him to be cruel to be successful? Did he have to savage so-and-so in order to ship such-and-such? If he was such a man of taste, why were his artistic interests so undeveloped? Not only do I have no idea, it is obvious that among those who worked with him there is no sense of certainty either. This seems to me to reflect, in part, a simple fact: Jobs’ values are not common values, and even among those of us who admire him, his indifference to the feelings of others, even those who loved and needed him, is hard to accept. Jobs himself —like many leaders— seems impossible to resolve; there is no chatty, confessional “inner self” to be located in his words or books about him; one has the sense that no one ever got “inside him,” perhaps because “inside” is where the failed self resides, the self that falls short of its conscience, and Jobs simply didn’t have that sort of mind.

James’ formulation at least seems to bring us closer to understanding one component of his formula, however. His charisma —an enormous part of his ability to motivate and drive progress— was not due to any special intelligence, education, talent, or charm as we typically conceive of them, but due to something else: a conscience and a will unified with one another. To see a person for whom life is an instantiation of meaning, whose will reflects only their values, inspires us; it is meaning in action, the former province of religion, and it has a mysterious force over us that, despite our rational objections, turns us into “the faithful.”

October 11th, 2012

Don’t Ask Me, I Just Work Here:

The family that Tumblr built. Left to right: Abby Myles, Jess Kelso, Mills Baker, David Cole, Tag Savage.

I aspire to the level of conviviality we seem to exude in this rad Polaroid Ash took last night. I guess that means I’ve attained it already. What should my next goal be?

October 10th, 2012
Whatever follies may be committed in art, once they are accepted among the upper classes of our society, a theory is at once elaborated to explain and legitimize these follies, as if there had never been epochs in history when certain exceptional circles of people had not accepted and approved of false, ugly, meaningless art, which left no traces and was completely forgotten afterwards. And we can see by what is going on now in the art of our circle what degree of meaninglessness and ugliness art can attain, especially when, as in our time, it knows it is regarded as infallible.

Leo Tolstoy in What is Art?, quoted by Abby. The point is that we forget the limitless fallibility of contemporary human judgements even as we deride the past for its errors: “as if there had never been epochs” of worthless, celebrated art, decades and schools and theories and rebellions and geniuses all laboring towards the “false, ugly, [and] meaningless.” But when we walk through museums, we cannot believe that anything on the walls might be not merely “not to our liking,” but in fact bad, imbecilic, embarrassing!

And as if it were impossible that our museums should be so misled, when in fact it is a feature of the time that there seems to be no agreement between the common person and the expert —such as she is— as to art’s very definition, as to what art is, as to what qualifies as art. This definitional confusion results from an epistemologically debased philosophical culture in which even the ambitious give up and say, “Well, art is whatever anyone says is art!” or some similar nonsense. That is: we do not know what art is, we cannot distinguish it from non-art, and we do not think it is even possible in principle to do so.

It would be no surprise to me if a far smaller percentage of the canonical work of the past century or so endures —or even makes sense— for long; I sometimes suspect that we’re living through an extraordinarily ridiculous time, culturally, and I only hope that it will at least be comic for those who study it in the future.

Reblogged from pocket



Hello! My name is Mills Baker. I write about art, culture, love, philosophy, memory, history, and more. Here are some relatively better posts. This site has been featured on Tumblr Tuesday and is listed in the Spotlight, but it pines for its youth as a coloring book. (Header lettering by the amazing Chirp).