Meta is Murder

Genera

I am an allergic and reactive person, most outraged by the sorts of intellectual atrocities I myself commit. To say this is merely to assert the personal applicability of the now-hoary Hermann Hesse adage:

"If you hate a person, you hate something in him that is part of yourself. What isn’t part of ourselves doesn’t disturb us."

Hesse is a figure whom I regard with suspicion, and again: it seems to me likely that this is due to our mutual habits of appropriation, though whereas he recapitulates Eastern religious ideas in semi-novelistic form for his audience of early 20th-century European exoticists, I recapitulate in semi-essayistic form 20th-century European ideas from Kundera, Gombrowicz, Popper, and others. In this as in all cases, it is the form and not the content that matters.

To describe someone formally, we might say: “She is certain of her rightness, intolerant of those who disagree with her.” But to describe the content is necessarily to stray from the realm of the psychological —which is enduring, for the most part— into the realm of ephemera masquerading as philosophy: “She is for X, fighting against those who believe Y.” You and I have opinions about X and Y; we will judge her according to those opinions, even though in the fullness of time an opinion about X or Y will matter as much as the position of a farmer on the Huguenot question. History does not respect our axes and categories, although we believe as ever that they are of life-and-death import. History looks even less kindly on the sense of certainty which nearly all of us attain about our beliefs.

Art and understanding are concerned with forms; politics and judgement are concerned with content. I think of them algebraically: what can be described in variables has greater range, explanatory power, and reach than the specific arithmetic of some sad concluded homework problem.

Some of my smartest friends love Hesse. When I read him I am often struck by the familiarity of his ideas; I cannot tell whether I learned them through other authors who read him, through ambient culture, or through myself, my own reflections, but I know that they often seem to me to be apt instantiations of ideas nearly folklorish in nature, as is the case with the axiom quoted above. Perhaps it is simply that other moral principles lead to the same conclusion, so that Hesse seems as though he arrives at the end, rather than the middle, of the inquiry.

One such principle is well phrased by Marilynne Robinson in her essay “When I was a Child,” in her collection When I Was a Child I Read Books:

"It may be mere historical conditioning, but when I see a man or a woman alone, he or she looks mysterious to me, which is only to say that for a moment I see another human being clearly."

The idea that a human seen clearly is a mystery is anathema to a culture of judgment —such as ours— which rests on a simple premise: humans can be understood by means of simple schema that map their beliefs or actions to moral categories. Moreover, because there are usually relatively few of these categories, and few important issues of discernment —our range of political concerns being startlingly narrow, after all— humans can be understood and judged at high speed in large, generalized groups: Democrats, Republicans, women, men, people of color, whites, Muslims, Christians, the rich, the poor, Generation X, millennials, Baby Boomers, and so on.

It should but does not go without saying that none of those terms describes anything with sufficient precision to support the kinds of observations people flatter themselves making. Generalization is rarely sound. No serious analysis, no serious effort to understand, describe, or change anything can contain much generalization, as every aggregation of persons introduces error. One can hardly describe a person in full, let alone a family, a city, a class, a state, a race. Yet we persist in doing so, myself included.

Robinson continues:

"Tightly knit communities in which members look to one another for identity, and to establish meaning and value, are disabled and often dangerous, however polished their veneer. The opposition frequently made between individualism on the one hand and responsibility to society on the other is a false opposition as we all know. Those who look at things from a little distance can never be valued sufficiently. But arguments from utility will never produce true individualism. The cult of the individual is properly aesthetic and religious. The significance of every human destiny is absolute and equal. The transactions of conscience, doubt, acceptance, rebellion are privileged and unknowable…"

There is a kind of specious semi-rationalism involved in what she calls “utility”: the rationalism that is not simply concerned with logical operations and sound evidentiary processes but also with excluding anything it does not circumscribe. That is to say: the totalizing rationalism that denies a human is anything more than her utility, be it political or economic or whatever. Such rationalism seems intellectually sound until one, say, falls in love, or first encounters something that resists knowing, or reads about the early days of the Soviet Union: when putatively “scientifically known historical laws of development” led directly to massacres we can just barely admit were a kind of error, mostly because murder seems unsavory (even if murderously hostile judgment remains as appealing to us as ever).

One of the very best things Nietzsche ever wrote:

"The will to a system is a lack of integrity."

But to systematize is our first reaction to life in a society of scale, and our first experiment as literate or educated or even just “grown-up” persons with powers of apprehension, cogitation, and rhetoric. What would a person be online if he lacked a system in which phenomena could be traced to the constellation of ideas which constituted his firmament? What is life but the daily diagnosis of this or that bit of news as “yet another example of” an overarching system of absolutely correct beliefs? To have a system is proof of one’s seriousness, it seems —our profiles so often little lists of what we “believe,” or what we “are”— and we coalesce around our systems of thought just as our parents did around their political parties, though we of course consider ourselves mere rationalists following the evidence. Not surprisingly, the evidence always leads to the conclusion that many people in the world are horrible, stupid, even evil; and we are smart, wise, and good. It should be amusing, but it is not.

I hate this because I am doing this right now. I detest generalization because when I scan Twitter I generalize about what I see: “people today,” or “our generation,” I think, even though the people of today are as all people always have been, even though they are all just like me. I resent their judgments because I feel reduced by them and feel reality is reduced, so I reduce them with my own judgments: shallow thinkers who lack, I mutter, the integrity not to systematize. And I put fingers to keys to note this system of analysis, lacking all integrity, mocking my very position.

I want to maintain my capacity to view each as a mystery, as a human in full, whose interiority I cannot know. I want not to be full of hatred, so I seek to confess that my hatred is self-hatred: shame at the state of my intellectual reactivity and decay. I worry deeply that our systematizing is inevitable because when we are online we are in public: that these fora mandate performance, and worse, the kind of performance that asserts its naturalness, like the grotesquely beautiful actor who says, "Oh, me? I just roll out of bed in the morning and wear whatever I find lying about" as he smiles a smile so practiced it could calibrate the atomic clock. Every online utterance is an angling for approval; we write in the style of speeches: exhorting an audience, haranguing enemies, lauding the choir. People “remind” no one in particular of the correct ways to think, the correct opinions to hold. When I see us speaking like op-ed columnists, I feel embarrassed: it is like watching a lunatic relative address passers-by using the “royal we,” and, I feel, it is pitifully imitative. Whom are we imitating? Those who live in public: politicians, celebrities, “personalities.”

There is no honesty without privacy, and privacy is not being forbidden so much as rendered irrelevant; privacy is an invented concept, after all, and like all inventions must contend with waves of successive technologies or be made obsolete. The basis of privacy is the idea that judgment should pertain only to public acts —acts involving other persons and society— and not the interior spaces of the self. Society has no right to judge one’s mind; society hasn’t even the right to inquire about one’s mind. The ballot is secret; one cannot be compelled to testify or even talk in our criminal justice system; there can be no penalty for being oneself, however odious we may find given selves or whole (imagined) classes of selves.

This very radical idea has an epistemological basis, not a purely moral one: the self is a mystery. Every self is a mystery. You cannot know what someone really is, what they are capable of, what transformations of belief or character they might undergo, in what their identity consists, what they’ve inherited or appropriated, what they’ll abandon or reconsider; you cannot say when a person is who she is, at what point the “real” person exists or when a person’s journey through selves has stopped. A person is not, we all know, his appearance; but do we all know that she is not her job? Or even her politics? 

But totalizing rationalism is emphatic: either something is known or it is irrelevant. Thus: the mystery of the self is a myth; there is no mystery at all. A self is valid or invalid, useful or not, correct or incorrect, and if someone is sufficiently different from you, if their beliefs are sufficiently opposed to yours, their way of life alien enough, they are to be judged and detested. Everyone is a known quantity; simply look at their Twitter bio and despise.

But this is nonsense. In truth, the only intellectually defensible posture is one of humility: all beliefs are misconceptions; all knowledge is contingent, temporary, erroneous; and no self is knowable, not truly, not to another. We can perhaps sense this in ourselves —although I worry that many of us are too happy to brag about our conformity to this or that scheme or judgment, to use labels that honor us as though we’ve earned ourselves rather than chancing into them— but we forget that this is true of every single other, too. This forgetting is the first step of the so-called othering process: forget that we are bound together in irreducibility, forget that we ought to be humble in all things, and especially in our judgments of one another.

Robinson once more:

"Only lonesomeness allows one to experience this sort of radical singularity, one’s greatest dignity and privilege."

Lonesomeness is what we’re all fleeing at the greatest possible speed, what our media now concern themselves chiefly with eliminating alongside leisure. We thus forget our radical singularity, a personal tragedy, an erasure, a hollowing-out, and likewise the singularity of others, which is a tragedy more social and political in nature, and one which seems to me truly and literally horrifying. Because more than any shared “belief system” or political pose, it is the shared experience of radical singularity that unites us: the shared experience of inimitability and mortality. Anything which countermands our duty to recognize and honor the human in the other is a kind of evil, however just its original intention.

“"I think." Nietzsche cast doubt on this assertion dictated by a grammatical convention that every verb must have a subject. Actually, said he, "a thought comes when ‘it’ wants to, and not when ‘I’ want it to; so that it is falsifying the fact to say that the subject ‘I’ is necessary to the verb ‘think.’" A thought comes to the philosopher "from outside, from above or below, like events or thunderbolts heading for him." It comes in a rush. For Nietzsche loves "a bold and exuberant intellectuality that runs presto," and he makes fun of the savants for whom thought seems "a slow, halting activity, something like drudgery, often enough worth the sweat of the hero-savants, but nothing like that light, divine thing that is such close kin to dance and to high-spirited gaiety." Elsewhere Nietzsche writes that the philosopher "must not, through some false arrangement of deduction and dialectic, falsify the things and the ideas he arrived at by another route…. We should neither conceal nor corrupt the actual way our thoughts come to us. The most profound and inexhaustible books will surely always have something of the aphoristic, abrupt quality of Pascal’s ‘Pensées.’"

We should not “corrupt the actual way our thoughts come to us”: I find this injunction remarkable; and I notice that, beginning with ‘The Dawn,’ all the chapters in all his books are written in a single paragraph: this is so that a thought should be uttered in one single breath; so that it should be caught the way it appeared as it sped toward the philosopher, swift and dancing.”

Milan Kundera in Testaments Betrayed, discussing the meaning of the various prose styles developed by Franz Kafka, Ernest Hemingway, and Friedrich Nietzsche, and how technical details like paragraph structure and the use of semicolons express deeper elements of an author’s thought and purpose.

Good writing is deliberate style as much as resonant content; there should be nothing automatic, nothing inherited, nothing thoughtless. Punctuation and typeface are not incidental; indentation, sentence length, and paragraph rhythm all matter, and all ought to be the purposive stylistic expression of authorial intent.

For whatever reason, many seem to consider such things beyond the boundaries of artistic creativity in prose, as though we are obliged to adopt the happenstance syntax of our languages. We are not, but style is not merely a matter of some radical pose, refusing to use commas or arbitrarily violating grammatical rules in a demonstrative way. Rebellion is a crutch in art.

Good prose style is simpler and harder. We must be ruthless in interrogating everything about our writing: the plain honesty of its intentions, the truth of its substance, the value of the ideas it expresses, the novelty (or at least utility) of its existence, and all its tiniest details, all its small conformities to and violations of the rules of the language, all its periods and ellipses and dashes, all the choices we make about quotation marks and italicization, all the elements few readers consciously notice but all readers register.

Friedrich Nietzsche, drawn for this post by Tully Mills.
Because time is unlikely to be infinite -it is a physical element of the universe, and is as physically bounded as its host- the idea of eternal return, perhaps most popularized by Nietzsche, cannot have literal applicability. It may be useful as an extricating mental exercise to escape certain sorts of delusion, particularly the central delusion of the conscious animal: that time exists as some sort of magically-transient set of discrete and continuous states, as a moment moving along an implied axis, as a unified, extant-yet-not-extant set of the past, present, and future. But it has no meaning in any ordinary ontological sense: there is no eternal return; each instant does not occur, along some different axis or after aeons of the happenstance reconstitution of elements, eternally. (Although it may be wise to live as though it does, and there are elements of history which seem more cyclical than linear).

But there is a return of higher significance: the return of phenomena according to their causes. That is to say: causality entails a deeply delightful kind of return: that always -eternally, as it were- B will follow A if A is B’s cause. Causality, as discerned by induction, cannot be demonstrated by observation and it cannot be rationally proven, but no function of consciousness -indeed, no function of life itself- could exist without the materially-instantiated reliability of causation. Thus the delight: that what occurs will always have a cause, itself caused, and that these causes are comprehensible, meant that in the past life could evolve and mind could exist, and that in the present they can continue to do so. A universe in which things occur without causes, at random in a sense, without consistent conditions, would be too unstable for emergent phenomena like life and mind to develop without disruption.

It is a sort of horror occasionally depicted in film: the horror of a phenomenon not causally-explicable, the horror of a nonsensical, non-repeating, law-defying event. Every day, the sun rises; observing this cannot be said, no matter the number of years we’ve observed, to logically entail that it will rise tomorrow; and, Karl Popper’s qualified, hypothetical epistemology notwithstanding, there are no logically compulsory arguments that tell us it will either. It would be a horror of deep significance, not just practically but psychologically, if one day, it simply did not. Yet it will, and we must know that it will, and that similarly repetitive and reliably-caused phenomena will persist as well, forever, so to speak. Every organism’s development is dependent on the possibility of such knowledge (and the encoding of technology to exploit such knowledge in its DNA).

Thus: the eternal return of predictability at every turn in the physical universe, of causality as a principle of all systems, the eternal return of every instant and every state to the same laws which governed the instants and states which preceded them is not about the specific contents of time but about the structure according to which those contents can be arranged.


It all began with music, “music heard so deeply / That it is not heard at all, but you are the music / While the music lasts”:

“There was a place for everyone in this brave new world, where the player [piano] offered an answer to some of America’s most persistent wants: the opportunity to participate in something which asked little understanding; the pleasure of creating without work, practice, or the taking of time; and the manifestation of talent where there was none.”

By 1945, as the player piano itself was fading from national memory, William Gaddis had come to see it as paradigmatic of the effect technological democracy had on the arts. In Agapē Agape, which he wrote fifty years later and just before his death, his partially insane, terminally-ill narrator traces this analysis from that ludicrous inversion of the “piano player”

"…Plato’s chance persons pouring out Für Elise without a flaw till the last perforation in the roll passes over the corresponding hole in the tracker bar and democracy comes lumbering back into the room…"

back through Walter Benjamin’s concerns about “The Work of Art in the Age of Mechanical Reproduction,” and then much further: through the imitation of Nietzsche heartlessly enacted by his sister, through Tolstoy’s “Kreutzer Sonata,” through Louis Moreau Gottschalk’s stylistic anticipation of reductive technologies to come, through Plato, even through Homer. At every turn, he sees the consequences of the democratic urge to reduce art to pleasure, to reduce creation to performance, to smash the Apollonian and hand out a thousand awards a year to all the Americans whose “art” is pantomimic, entertaining. He rages with Flaubert:

The entire dream of democracy is to raise the proletariat to the level of bourgeois stupidity.

And he implies, with impressive lexical elision, that technology has eviscerated the creative arts not solely because the middle-classes want art-as-pleasure, but also because artists themselves misunderstood the import of their human presence:

You want the essence of elitism there [Flaubert] was, his idea of art that “the artist must no more appear in his work than God does in nature, that the artist must manage to make posterity believe that he never existed,” good God, the rate things change a generation lasts about four days what posterity? Everywhere present and nowhere visible leads him right into the embrace of the death of the author whose intentions have no connection with the meaning of the text which is indeterminate anyway, a multidimensional space where the modern scriptor is born with this, this detachable self this second voice inside predicting the future in its hoarse belly-voice, Strabo?

God is absent from nature; he is the clockmaker; now man, ever the imitator, seeks to absent himself from culture, allow clockmakers to drive creation: the movie industry; the music industry; the publishing industry. The raving of the dying man is often strikingly clear despite his splenetic outbursts; he bleeds on the mountains of notes he’s accrued obsessing about technology and art, his arms are bruised from the needles, he cannot focus, he cannot breathe, he continues on his rant: a taut, frantic, desperate dictation. The text itself is a mechanical constraint: it is obliged to record his words as he speaks them, at length and digressively, abandoning tangents as he searches for pills, selecting, disgorging, then departing from ideas that revolve around his themes. There is little time for punctuation, none for chapters; the technology of text is imperfect, we all know, for speech; in your life, no one demarcates your sentences with “he said,” but we must have it in books; and here, we haven’t even the time for that. This is a piano roll: the man is gone, dead, but the transcription of his voice, lifeless, without the proper pauses or dynamics, without the heart of the speaker, runs on and on.

What I shit is better than anything you could ever think up!

Rash Beethoven must have flushed so angrily at criticism of his execrable Wellington’s Victory; or, the Battle of Vitoria (Op. 91) in part because he knew it was awful. It was composed under technological imperatives, written for and performed by Johann Maelzel’s panharmonicon:

….a mechanical keyboard instrument that automated the playing of flutes, clarinets, trumpets, violins, cellos, drums, cymbals, triangle, and other instruments [including the sounds of guns, used in the piece].

The device, like the composition, was a failure. Inhuman art tends to be, and our art is more driven by market forces and algorithmic analyses of consumption than by that solitary, authentic artist we all half-hate, half-deride. What a fraud! To think, as he must, that he’s any better than we are! For him to labor so absurdly on his work, when what we make, what we like, is just as good! And why should I have to struggle to understand anything?
(Whatever its merits, the panharmonicon was destroyed in a WWII allied bombing raid on Stuttgart. Man’s urges -to mechanize, to conquer- do not change, and therefore history is repetitive).

But mechanization and monetization, the narrator fumes, march on. When Jonathan Franzen repudiated Gaddis, it was in part because he felt that the latter’s anxiety about, fury with, protests of contemporary technological democratic capitalism were “seriously misconceived.” It would be nice to think so; being a popular and wealthy American artist, living in power, comfort, and freedom surely helps. Franzen is sanely acclimated to this world, but Gaddis’ narrator is firmly aligned with the insane, citing Pascal’s claim that everyone is “so necessarily mad that not to be mad would amount to another form of madness.” Moreover, he subscribes to Melville’s views on popularity in art; they sound embittered to us in this, the triumphal era of pop-culture:

…only revenge the mob has…is to go to the movies, thirty fifty a hundred million dollars against a hundred and forty-five dollars and eighty-three cents [how much Melville owed to his publisher after he wrote Moby Dick], the final great stupefying collective. No more illusion of taking part, of discovering your unsuspected talent when the biggest thrill in music was playing it yourself, your own participation that roused your emotions most no, no. The ultimate collective, the herd numbed and silenced agape at blood sex and guns blowing each other to pieces only participation you get’s maybe kids who see it come to school next morning and mow down their classmates no more elitism no more elite no wherever you turn just the spread of the crowd with its, what he did call it, what Huizinga called its insatiable thirst for trivial recreation and crude sensationalism, the mass of the mediocre widening the gap the popularity of a work is the measure of its mediocrity says Melville no news there is there? The masses invading the province of the writer says Walter Benjamin…

The idea of “unsuspected talent” remains a crucial illusion; one will wait forever for happiness if he can dream that success lies hidden within him, ready to spring out at any moment. Gaddis saw in the history of music and literature much of what would develop in the world of technology -the urge to imitation, to enactment, to the Platonic lie- but he didn’t foresee reality television and the Internet, how they’d enable that same old illusion to a greater degree than ever before. One cannot even be sure if it is an illusion, whatever its statistical rarity.

If the idea of replacing the piano player with the player piano seems less a metaphor than a delightful efficiency, and if all seems well with the methods by which we assign value to art, artists, people, cultures, then it might seem pointless to wrestle with a decaying and curmudgeonly old man who still cares about authenticity in the age of ceaseless, ubiquitous self-promotion, teenagers redacting themselves for Twitter, real people talking about their brands.

But if you ponder why we favor simulacra over substance, from what we’re fleeing, how we can find some reality in a culture that tried to force air through preserved, removed human larynxes in order to combat the rising cost of opera stars two hundred years before it cloned a sheep, Agapē Agape is very much worthwhile. Can one disentangle the desire to learn from the habit of imitation? Is there culture without the mimetic? What is the relationship of the arts to democracy, of democracy to technology, of technology to individual freedom?

Gaddis’ narrator, with a sorrowful fire just beyond his words, answers no questions, but as he casts his eye through oddities of Western cultural history -even touching on the Aristotelian ideas in A Confederacy of Dunces- he reminds us of why one might be willing to work a bit at reading fiction, in case we’ve grown accustomed to having our thought automated, our morals mass-produced.

It all began with music, “music heard so deeply / That it is not heard at all, but you are the music / While the music lasts”:

“There was a place for everyone in this brave new world, where the player [piano] offered an answer to some of America’s most persistent wants: the opportunity to participate in something which asked little understanding; the pleasure of creating without work, practice, or the taking of time; and the manifestation of talent where there was none.”

By 1945, as the player piano itself was fading from national memory, William Gaddis had come to see it as paradigmatic of the effect technological democracy had on the arts. In Agapē Agape, which he wrote fifty years later and just before his death, his partially insane, terminally-ill narrator traces this analysis from that ludicrous inversion of the “piano player”

"…Plato’s chance persons pouring out Für Elise without a flaw till the last perforation in the roll passes over the corresponding hole in the tracker bar and democracy comes lumbering back into the room…"

back through Walter Benjamin’s concerns about “The Work of Art in the Age of Mechanical Reproduction,” and then much further: through the imitation of Nietzsche heartlessly enacted by his sister, through Tolstoy’s “Kreutzer Sonata,” through Louis Moreau Gottschalk’s stylistic anticipation of reductive technologies to come, through Plato, even through Homer. At every turn, he sees the consequences of the democratic urge to reduce art to pleasure, to reduce creation to performance, to smash the Apollonian and hand out a thousand awards a year to all the Americans whose “art” is pantomimic, entertaining. He rages with Flaubert:

The entire dream of democracy is to raise the proletariat to the level of bourgeois stupidity.

And he implies, with impressive lexical elision, that technology has eviscerated the creative arts not solely because the middle-classes want art-as-pleasure, but also because artists themselves misunderstood the import of their human presence:

You want the essence of elitism there [Flaubert] was, his idea of art that “the artist must no more appear in his work than God does in nature, that the artist must manage to make posterity believe that he never existed,” good God, the rate things change a generation lasts about four days what posterity? Everywhere present and nowhere visible leads him right into the embrace of the death of the author whose intentions have no connection with the meaning of the text which is indeterminate anyway, a multidimensional space where the modern scriptor is born with this, this detachable self this second voice inside predicting the future in its hoarse belly-voice, Strabo?

God is absent from nature; he is the clockmaker; now man, ever the imitator, seeks to absent himself from culture, allow clockmakers to drive creation: the movie industry; the music industry; the publishing industry. The raving of the dying man is often strikingly clear despite his splenetic outbursts; he bleeds on the mountains of notes he’s accrued obsessing about technology and art, his arms are bruised from needles, he cannot focus, he cannot breathe, he continues on his rant: a taut, frantic, desperate dictation. The text itself is a mechanical constraint: it is obliged to record his words as he speaks them, at length and digressively, abandoning tangents as he searches for pills, selecting, disgorging, then departing from ideas that revolve around his themes. There is little time for punctuation, none for chapters; the technology of text is imperfect, we all know, for speech; in your life, no one demarcates your sentences with “he said,” but we must have it in books; and here, we haven’t even the time for that. This is a piano roll: the man is gone, dead, but the transcription of his voice, lifeless, without the proper pauses or dynamics, without the heart of the speaker, runs on and on.

What I shit is better than anything you could ever think up!

Rash Beethoven must have flushed so angrily at criticism of his execrable Wellington’s Victory, or, the Battle of Vitoria, Op. 91 in part because he knew it was awful. It was composed under technological imperatives, written for and performed by Johann Maelzel’s panharmonicon:

…a mechanical keyboard instrument that automated the playing of flutes, clarinets, trumpets, violins, cellos, drums, cymbals, triangle, and other instruments [including the sounds of guns, used in the piece].

The device, like the composition, was a failure. Inhuman art tends to be, and our art is more driven by market forces and algorithmic analyses of consumption than by that solitary, authentic artist we all half-hate, half-deride. What a fraud! To think, as he must, that he’s any better than we are! For him to labor so absurdly on his work, when what we make, what we like, is just as good! And why should I have to struggle to understand anything?

(Whatever its merits, the panharmonicon -shown above- was destroyed in a WWII Allied bombing raid on Stuttgart. Man’s urges -to mechanize, to conquer- do not change, and therefore history is repetitive.)

But mechanization and monetization, the narrator fumes, march on. When Jonathan Franzen repudiated Gaddis, it was in part because he felt that the latter’s anxiety about, fury with, protests of contemporary technological democratic capitalism were “seriously misconceived.” It would be nice to think so; being a popular and wealthy American artist, living in power, comfort, and freedom surely helps. Franzen is sanely acclimated to this world, but Gaddis’ narrator is firmly aligned with the insane, citing Pascal’s claim that everyone is “so necessarily mad that not to be mad would amount to another form of madness.” Moreover, he subscribes to Melville’s views on popularity in art; they sound embittered to us in this, the triumphal era of pop-culture:

…only revenge the mob has…is to go to the movies, thirty fifty a hundred million dollars against a hundred and forty-five dollars and eighty-three cents [how much Melville owed to his publisher after he wrote Moby Dick], the final great stupefying collective. No more illusion of taking part, of discovering your unsuspected talent when the biggest thrill in music was playing it yourself, your own participation that roused your emotions most no, no. The ultimate collective, the herd numbed and silenced agape at blood sex and guns blowing each other to pieces only participation you get’s maybe kids who see it come to school next morning and mow down their classmates no more elitism no more elite no wherever you turn just the spread of the crowd with its, what he did call it, what Huizinga called its insatiable thirst for trivial recreation and crude sensationalism, the mass of the mediocre widening the gap the popularity of a work is the measure of its mediocrity says Melville no news there is there? The masses invading the province of the writer says Walter Benjamin…

The idea of “unsuspected talent” remains a crucial illusion; one will wait forever for happiness if he can dream that success lies hidden within him, ready to spring out at any moment. Gaddis saw in the history of music and literature much of what would develop in the world of technology -the urge to imitation, to enactment, to the Platonic lie- but he didn’t foresee reality television and the Internet, how they’d enable that same old illusion to a greater degree than ever before. One cannot even be sure if it is an illusion, whatever its statistical rarity.

If the idea of replacing the piano player with the player piano seems less a metaphor than a delightful efficiency, and if all seems well with the methods by which we assign value to art, artists, people, cultures, then it might seem pointless to wrestle with a decaying and curmudgeonly old man who still cares about authenticity in the age of ceaseless, ubiquitous self-promotion, teenagers redacting themselves for Twitter, real people talking about their brands.

But if you ponder why we favor simulacra over substance, from what we’re fleeing, how we can find some reality in a culture that tried to force air through preserved, removed human larynxes in order to combat the rising cost of opera stars two hundred years before it cloned a sheep, Agapē Agape is very much worthwhile. Can one disentangle the desire to learn from the habit of imitation? Is there culture without the mimetic? What is the relationship of the arts to democracy, of democracy to technology, of technology to individual freedom?

Gaddis’ narrator, with a sorrowful fire just beyond his words, answers no questions, but as he casts his eye through oddities of Western cultural history -even touching on the Aristotelian ideas in A Confederacy of Dunces- he reminds us of why one might be willing to work a bit at reading fiction, in case we’ve grown accustomed to having our thought automated, our morals mass-produced.

“[Atheists like Nietzsche recognized] the unbridgeable abyss between our search for meaning and the world as it is and is bound to remain. And yet most of those who were ready to stare at the icy desert of a godless world had not given up the belief that something could be saved from the impersonal game of atoms. The ‘something’ was to be human dignity, the very ability to face one’s own freedom and to decree a meaning by a sheer act of will, in the full awareness that one was decreeing rather than discovering it in nature or history… The dignity which enables us to accept the truth and to defy, by creative acts, the emptiness of Being was [for Nietzsche] the only way of carrying the burden of life without illusions. He failed to explain where the value of dignity came from, why it should not be another self-deception [as he felt religion was] or why we may rely on it rather than commit suicide or go mad as he himself would subsequently do.”

Leszek Kołakowski, quoted to me by my father. What is the value of this dignity to which men like Kubrick allude when they say things like, “However vast the darkness, we must supply our own light”? It isn’t, after all, a matter of darkness and light; light we can synthesize; we cannot synthesize moral purpose whose terminus is not our own fallible imagination. Will our manufactured meaning not always be, at most, a consoling story we tell ourselves against the backdrop of immutable oblivion?

And if so, what difference does it make whether one story is empirically valid or not? As Kołakowski notes and every reader of fiction or lover of art or anything else knows, "There are no transcendentally valid criteria of meaningfulness and no compelling reasons why the meaningful should be equated with the empirical, in the sense in which modern science understands this term."

Is there any defensible justification for the putatively heroic stance of the existentialist hero, bearing “the impersonal game of atoms” with what he calls dignity? Is there any reason to consider empirical validity a criterion by which to judge our stories?

Propellers for Umbrellas, whose photography I adore endlessly, wrote something very beautiful which reminded me of Walker Percy’s interest in the suburbs:

Parque de Fina

I am not sure it is possible to unlove the Suburbs. They are so widely and rightly despised that if you do love them, your attraction becomes a mad, guarded thing, perhaps not secret but probably a little shifty-eyed. They are a sprawling silence, smelling like excess poolwater and featureless concrete, and it is easy to get lost there, where the map is a kind of circuit made of straight lines; you come upon the same place again and again but always on a different street. They hoard gas stations and space and safety, and you’re right to run from them, to stumble into some city knowing nothing about buying groceries every evening, or busing with the wetly sallow oyster ladies, the drunken eyeliner ladies, the ladies down the hall. I have done this. I will do it again, always a little forgetfully.

But if you have learned to love them in your own horrible way, as I have, you may send tiny prayers to netless basketball games in driveways, or to the browning lines of 1980s RVs, or to all the horticultural absurdity. If there is one thing I love most about suburbs it is the cruelty with which people treat their juniper and spruce, a uniformity which somehow still surprises and amuses me. I dig these stupid poodle trees like I dig dingbats and striped awnings and Eichler homes, but I’ve rarely seen something so exciting as this.

One thing I will always refute is that suburbs lack culture and creativity. This man - with his big boat of a car, his kids and grandkids who live nearby, and his loafers - has his own art. Part of it is to explain what he does in a round whisper, with white at the corners of his mouth, about his banana trees and his taro root and keeping a hollow space in the hedge, but mostly about his wife, whose name is clipped into it. The duck and the Mickey Mouse and whatever other shapes emerge, the wind-chimes, the small plaque on which he’s painted Parque de Fina - these are a kind of sculpture.

If you are passionate about something - if you love sound, or smoking hookah in the park, or your wife - and if you can take that passion and build from it something which improves the lives of others, even if you do it quietly between Highway 101 and Middlefield Road, I want to thank you. Who is to say this man would not have a heavily sourced Wikipedia page if he weren’t happily married, or had settled in some more desperately vibrant part of the world? What mattered to me was that he was proud and fond, and had translated his love into something that could inspire a twenty-year-old with the twenty-year-old’s customary habit of disdain. He did it in suburbs.

Really, I suspect these things of happening everywhere.

Then the extraordinary Superfluidity, in response to this post on cliché, noted something of interest in a remarkable essay which exemplifies his brilliant, synthetic ideas on the literature of antiquity:

It has always seemed to me that our objection to cliché is not something inherent in the cliché itself, but rests instead in our perception of it. That is to say, if we refuse to allow ourselves to experience the cliché as a whole, and step back to see it for its parts, and absorb the cumulative wisdom which lies behind its creation, there is something valuable still to be had.

There is beauty in the suburb and poetry in the banal, they note: it is our task to find it, as they do in photography or in rather static, long-dead literature, literature which Nietzsche says changes because we do, because our relation to it does. This is an excellent example of why they are among my favorite thinkers and creators, and I cannot recommend them more highly.

“…distraction is nothing new. Over a century ago, philosopher Friedrich Nietzsche described his harassed peers. “One thinks with a watch in one’s hand,” he wrote in 1887, “even as one eats one’s midday meal while reading the latest news of the stock market”. Yet Nietzsche didn’t blame clocks or markets. “We labour at our daily work more ardently and thoughtlessly than is necessary to sustain our life,” he wrote in his Untimely Meditations, “because it is even more necessary not to have leisure to stop and think. Haste is universal because everyone is in flight from himself.””
Friedrich Nietzsche, quoted in “The Distraction Society,” posted by Fascinated. Apropos of this.
“…a thought comes when “it” wants to and not when “I” want it, so that it’s a falsification of the fact to say that the subject “I” is the condition of the predicate “think.” It thinks: but that this “it” is precisely that old, celebrated “I” is, to put it mildly, only an assumption, an assertion, in no way an “immediate certainty.” After all, we’ve already done too much with this “it thinks”: this “it” already contains an interpretation of the event and is not part of the process itself.”

Friedrich Nietzsche, Beyond Good and Evil. As happens with orphaned memories, the concept of thoughts without a thinker has been occasioned lately by the appearance in my mind of various images and recollections the catalysts for which I can’t imagine: scenes from old bike rides through kudzu-covered landscapes, the hot road snaking over hills ahead; country children playing behind gas stations which, in the vacuum of rural life, become commercial and social points of supreme and happy importance; vignettes of violence and catastrophe, insistently enacted by my roving mind.

I am not thinking these thoughts, or perhaps many of my thoughts; I trace each like a topos which communicates, indirectly, implicitly, sometimes inscrutably, something about my mind, with varying success. In this way all thoughts remind one of dreams.

“If we had a keen vision and feeling of all ordinary human life, it would be like hearing the grass grow and the squirrel’s heart beat, and we should die of that roar which lies on the other side of silence. As it is, the quickest of us walk about well wadded with stupidity.”

George Eliot, Middlemarch. In a conversation with Unburying the Lead, Hungry Ghoast, and Abby Jean yesterday about homelessness, I was reminded again of the problem an individual faces in confronting -or even in reacting to- vast, systemic problems; one cannot feel for the whole world or one will collapse, as in the probably-apocryphal tale of Nietzsche and the beaten horse (the sentimental interpretation of which is a classic example of how people misunderstand and romanticize mental illness, as though it involves some sort of profound lucidity, some “seeing through” that is too terribly true to bear!).

But is Eliot right? She writes of a “keen vision and feeling” overwhelming us, when it seems to me that a keen vision is to be sought provided one’s feelings can be successfully contextualized (presumably by a moral philosophy, ideology, or religion). If one weeps at every instance of poverty one can help no one, not even oneself; but if one can see and grasp the problem without anguish one can act. The Dalai Lama suggests as much when asked how he remains so happy despite the suffering he witnesses and combats, and why he maintains that happiness is worth seeking even as one works for justice and peace.

Furthermore: if there is an anthropological limit to empathy -as there must be, a Dunbar’s number for concrete compassion- it isn’t precisely right to call someone who obeys it “well wadded with stupidity,” any more than to call someone who doesn’t pay attention to 50 kHz sounds “deaf.”

Questions

  1. Can one have a keen vision of human life without being overwhelmed by feeling? Is this just a matter of whether one is “sensitive” or not?
  2. What explanations -religious or otherwise- can contextualize suffering such that we aren’t brought to despair by empathy? Are any satisfactory? Should one be able to consider war, famine, genocide -or the death of a child, a parent, a friend, even a pet- without devastation? Since suffering surrounds us, how does one choose on what to focus, what to feel?
  3. Should we work to expand our circles of empathy –through imaginatively experiencing suffering, through literature, art, religion? Is it more important to increase the clinical accuracy of our vision, without worrying about feeling? If there is a deficit of charity, does it stem from insufficient understanding or a lack of compassion in humanity?
  4. Whom should one emulate: the cheerful doctor whose belief in some religious order permits him to volunteer in developing countries, rarely despairing despite the anguish he sees and his own statistical insignificance in combating it? The miserable filmmaker, who -like Woody Allen- can’t be happy if one person in the world is starving, whose unhappiness perhaps catalyzes action in others but stays him from engagement with life, with charity, with anything beyond his suffering?
  5. How keen is your vision, your feeling, of all ordinary human life? Are you well wadded with stupidity?

Superior answers will be incorporated into my personal philosophy; here is your chance to change another human being.

Betraying Your Favorite Author

The Bronze Medal quotes William Deresiewicz, who in the New Republic criticizes the exploitative packaging and publishing of Nabokov’s authorial detritus, which

“…breaks new ground in editorial chutzpah, inviting us to play a kind of Nabokov: Rock Band—the novel as theme park. One can only imagine what dear old dad—the ultimate artistic control freak, not to mention one of the all-time snobs—would have thought of the idea of letting his readers re-arrange his scraps and chapters at will… Nabokov… was never anything other than a classicist in the perfection of finish that he gave to his work. Pale Fire may yoke together a foreword, a poem, a commentary, and an index, all warring like the principalities of a madman’s soul, but the terms of their struggle are worked out to the last comma. The man built racing machines. To think that he would hand us a bucket of parts—and even more, leave us to fumble around with their order, the implication of Dmitri’s invitation to re-arrangement (deconstruct this book!)—is to commit an outrage against the spirit of his art.”

Deresiewicz’s use of the term “outrage” is interesting, and will be familiar to readers of Milan Kundera’s many polemics against the posthumous abuse of authors. In Testaments Betrayed, Kundera tells several stories of artists whose deaths occasioned the most egregious betrayals, like Kafka’s best friend Max Brod disobeying Kafka’s clear instructions to destroy certain manuscripts, publishing them instead. Without this betrayal, the world wouldn’t know Kafka, a fact which in no way reduces the personal disloyalty of Brod. Or does it?

I thought, too, of Elisabeth Förster-Nietzsche’s misrepresentation and exploitation of her brother after he lapsed into dementia and died, thanks to which he is to this day associated with Nazism and other detestable causes which his work and writings clearly repudiate.

None are more vulnerable, of course, than the dead: their wishes are ours to respect or reject, and because they cannot control even their most intimate possessions we are free to -and in fact academically demand to- rifle through their personal lives, their discarded scraps, their diaries, which we publish and dissect: an autopsy of their minds and hearts and choices. Kundera is right, in some respects, to deplore this cavalier contempt for the dead and their wishes.

Of course, it persists: even now we await the publishing of whatever David Foster Wallace, JD Salinger, Nabokov, and others left behind. We are eager to violate their privacy, anxious to read the biographies that peek into those spaces they kept from us. What excuses this? If they didn’t share something in their lives -if they chose not to, didn’t want to- for what reason do we consider it tolerable to ferret it out when they can no longer protect it?

  1. We believe authors and artists -and their lives- have something to tell us of such value, of such import, that their wishes to protect certain things from scrutiny are outweighed by our right to know.
  2. We do not trust their capacities for self-evaluation: Kafka was wrong to want his stories burned!
  3. We feel that the dead, in not existing, have no claim to our deference, loyalty, respect.

These seem questionable to me. If (1) is the case, we must accept that the right to privacy is highly contingent and that the sagacity of our favorite artists was limited: they don’t know the value of their scraps, works, lives, but we do. To me, (2) seems most plausible, but often their private lives and editorial decisions reflect decades of judgement on their part: if we question those decisions, we question their capacity for understanding in general and assert that it is our matter to deliberate, not theirs. And (3) is obviously not broadly believed: what of memorials, monuments, the tributes to the dead, or the fealty towards and affection for one’s own relatives?

Insatiable curiosity, a virtue and vice, a condition of human achievement and the cause of much of our cultural squalor, seems most responsible. This is why Kundera tells the story of his Icelandic friend, for whom “friendship… is standing guard at the door behind which your friend keeps his private life hidden; it is being the person who never opens that door; who allows no one else to open it.”

Most of us were not friends with Salinger or Wallace, yet in our treacly pronouncements of fondness for them we approach a level of attachment that seems at odds with the fact that most of us will happily buy what they didn’t want to publish and pry into what they wanted kept secret.

Aggregates, Accuracy, Politics, Prejudice

Nick Kallen posted a piece on “heaping,” illustrated by a classic example:

Would you describe a single grain of wheat as a heap? No. Would you describe two grains of wheat as a heap? No…. You must admit the presence of a heap sooner or later, so where do you draw the line?

Kallen asserts that “the defining characteristic of the modern era is that every aspect of society is heaping.” Without interrogating that too much further -particularly the word “modern”- it seems to be the case. Indeed, it’s a minor obsession of mine: I’ve written about scale often, particularly how scale in my view so completely problematizes human behavior, culture, and governance as to make political action generally nonsensical.

I feel this relates to Nietzsche’s claim that “the will to a system is a lack of integrity,” and the efforts of those like Kierkegaard to escape Hegelian thought and restore to philosophy an interest in the individual. In other words, I believe that truth is only possible in particular instances; aggregates, systems, classes, generalities, and stereotypes move us further and further from accuracy, though towards efficiency. It is hard to describe every New Orleanian; it is easier to describe all New Orleanians (“reprobates,” for example), but far less accurate.

(I’ll note digressively that this issue of scaling and resolution seems to be a property of physical reality; related: quantum indeterminacy, emergence and its relationship to complexity).

I tend to think that confusion about scale is behind a great deal -though not all- of prejudice. Humans are evolutionarily designed to take their accumulated experiences and extract from them rules and predictive patterns for future use; we understand and envisage the world on the basis of our memories; our minds are constructed to do so.

Most of us experience the world rather negatively; positive experiences affect us, but not nearly so powerfully as negative ones. This is also evolutionarily sensible, but it can misfire as communities grow larger and more diverse, and as one encounters more and more people in one’s life. For example: two ill-behaved coworkers of the same ethnicity in five years might become the anecdotal basis for one’s bias.

Humans are quite bad at accepting statistical facts or rational arguments when they fly in the face of acquired experiences; the former are, of course, less integral, less operative in our minds; as mental phenomena, they are historically newer and more abstract. So when one has a bad experience in the American South, one will not be dissuaded from one’s certain conviction that Southerners are terrible, bigoted, ignorant, and useless. This applies as well to the person who goes to New York and is ignored by a taxi or cut off by a hurrying pedestrian.

I was reminded of this by Gauntlet’s post about Lord Allen’s indefensible views on women, views he no doubt acquired in the course of his life and now cannot imagine why he should discard. His handlers would do well to explain to him that making assertions about groups of billions of people is statistically risky; others should remember the same when they speak of Christians, Muslims, Democrats, Republicans, women, men, and so on.

We all know this, but in our own lives how capable are we of accepting statistical fact? I remember in the last presidential election I was often admonished that I ought to vote, that my habit of never voting was disgraceful, that I was the only sort reprehensible to both parties. My pleas that my vote was absolutely irrelevant, particularly in Louisiana, were ignored. Was this because such a claim suggested that any individual vote was irrelevant -unwelcome negativity in an emotionally empowering election? Was it because math is less resonant than stories and experiences, many of them mediated, about the historical context of the election?

I believe strongly that at an early age, children should be taught: the world is too large for your experience to form a meaningful sample set. This inadequate experience includes the perceptions acquired by what McLuhan describes as externalized nervous system organs: media. Do not bother thinking about heaps, no matter how our society stacks them up; think only of grains, and you’ll be less likely to err.

Update: A New Nadir has a rebuttal post that I consider not only thoroughly persuasive but also exemplary of how to disagree civilly; I highly recommend it, and agree particularly that I failed to distinguish different sorts of generalizations.

“My formula for human greatness is amor fati: that one wants to have nothing different, not forward, not backward, not in all eternity. Not merely to bear the necessary, still less to conceal it—all idealism is mendaciousness before the necessary—but to love it.”

A syphilitic Friedrich Nietzsche in the chapter of Ecce Homo titled “Why I am so Clever,” though I should add that this is an example of an idea -amor fati- not without its value despite the increasing dementia of its author. I came across it again while reading Wikipedia’s brief treatment of Nietzsche’s comments concerning eternal return, which related to the previous post.

That idea is probably familiar to most from Kundera’s The Unbearable Lightness of Being, which questions at its outset whether the lightness of an existence that vanishes irretrievably into the past is terrible or fortunate; would it better for everything that happens to happen eternally, so to speak?

It’s worth noting that physicists would dispute the assumptions these questions make about time; the great Unburying the Lead quoted Albert Einstein recently: “For those of us who believe in physics, this separation between past, present and future is only an illusion.”

Update: Nick Barr noted that “the whole syphilis thing is probably untrue,” an assertion which surprised me, as the last time I read Nietzsche it seemed fairly widely accepted; much of his lifelong medical trouble is explained by such a diagnosis. But Barr has scholarship on his side, and I thank him for the correction; the diagnosis now appears to be in dispute again, and strong arguments against syphilis have been made.