Meta is Murder

Genera

I am an allergic and reactive person, most outraged by the sorts of intellectual atrocities I myself commit. To say this is merely to assert the personal applicability of the now-hoary Hermann Hesse adage:

"If you hate a person, you hate something in him that is part of yourself. What isn’t part of ourselves doesn’t disturb us."

Hesse is a figure whom I regard with suspicion, and again: it seems to me likely that this is due to our mutual habits of appropriation, though whereas he recapitulates Eastern religious ideas in semi-novelistic form for his audience of early 20th-century European exoticists, I recapitulate in semi-essayistic form 20th-century European ideas from Kundera, Gombrowicz, Popper, and others. In this as in all cases, it is the form and not the content that matters.

To describe someone formally, we might say: “She is certain of her rightness, intolerant of those who disagree with her.” But to describe the content is necessarily to stray from the realm of the psychological —which is enduring, for the most part— into the realm of ephemera masquerading as philosophy: “She is for X, fighting against those who believe Y.” You and I have opinions about X and Y; we will judge her according to those opinions, even though in the fullness of time an opinion about X or Y will matter as much as the position of a farmer on the Huguenot question. History does not respect our axes and categories, although we believe as ever that they are of life-and-death import. History looks even less kindly on the sense of certainty which nearly all of us attain about our beliefs.

Art and understanding are concerned with forms; politics and judgement are concerned with content. I think of them algebraically: what can be described in variables has greater range, explanatory power, and reach than the specific arithmetic of some sad concluded homework problem.

Some of my smartest friends love Hesse. When I read him I am often struck by the familiarity of his ideas; I cannot tell whether I learned them through other authors who read him, through ambient culture, or through myself, my own reflections, but I know that they often seem to me to be apt instantiations of ideas nearly folklorish in nature, as is the case with the axiom quoted above. Perhaps it is simply that other moral principles lead to the same conclusion, so that Hesse seems as though he arrives at the end, rather than the middle, of the inquiry.

One such principle is well phrased by Marilynne Robinson in her essay “When I was a Child,” in her collection When I Was a Child I Read Books:

"It may be mere historical conditioning, but when I see a man or a woman alone, he or she looks mysterious to me, which is only to say that for a moment I see another human being clearly."

The idea that a human seen clearly is a mystery is anathema to a culture of judgment —such as ours— which rests on a simple premise: humans can be understood by means of simple schema that map their beliefs or actions to moral categories. Moreover, because there are usually relatively few of these categories, and few important issues of discernment —our range of political concerns being startlingly narrow, after all— humans can be understood and judged at high speed in large, generalized groups: Democrats, Republicans, women, men, people of color, whites, Muslims, Christians, the rich, the poor, Generation X, millennials, Baby Boomers, and so on.

It should but does not go without saying that none of those terms describes anything with sufficient precision to support the kinds of observations people flatter themselves making. Generalization is rarely sound. No serious analysis, no serious effort to understand, describe, or change anything can contain much generalization, as every aggregation of persons introduces error. One can hardly describe a person in full, let alone a family, a city, a class, a state, a race. Yet we persist in doing so, myself included.

Robinson continues:

"Tightly knit communities in which members look to one another for identity, and to establish meaning and value, are disabled and often dangerous, however polished their veneer. The opposition frequently made between individualism on the one hand and responsibility to society on the other is a false opposition as we all know. Those who look at things from a little distance can never be valued sufficiently. But arguments from utility will never produce true individualism. The cult of the individual is properly aesthetic and religious. The significance of every human destiny is absolute and equal. The transactions of conscience, doubt, acceptance, rebellion are privileged and unknowable…"

There is a kind of specious semi-rationalism involved in what she calls “utility”: the rationalism that is not simply concerned with logical operations and sound evidentiary processes but also with excluding anything it does not circumscribe. That is to say: the totalizing rationalism that denies a human is anything more than her utility, be it political or economic or whatever. Such rationalism seems intellectually sound until one, say, falls in love, or first encounters something that resists knowing, or reads about the early days of the Soviet Union: when putatively “scientifically known historical laws of development” led directly to massacres we can just barely admit were a kind of error, mostly because murder seems unsavory (even if murderously hostile judgment remains as appealing to us as ever).

One of the very best things Nietzsche ever wrote:

"The will to a system is a lack of integrity."

But to systematize is our first reaction to life in a society of scale, and our first experiment as literate or educated or even just “grown-up” persons with powers of apprehension, cogitation, and rhetoric. What would a person be online if he lacked a system in which phenomena could be traced to the constellation of ideas which constituted his firmament? What is life but the daily diagnosis of this or that bit of news as “yet another example of” an overarching system of absolutely correct beliefs? To have a system is proof of one’s seriousness, it seems —our profiles so often little lists of what we “believe,” or what we “are”— and we coalesce around our systems of thought just as our parents did around their political parties, though we of course consider ourselves mere rationalists following the evidence. Not surprisingly, the evidence always leads to the conclusion that many people in the world are horrible, stupid, even evil; and we are smart, wise, and good. It should be amusing, but it is not.

I hate this because I am doing this right now. I detest generalization because when I scan Twitter I generalize about what I see: “people today,” or “our generation,” I think, even though the people of today are as all people always have been, even though they are all just like me. I resent their judgments because I feel reduced by them and feel reality is reduced, so I reduce them with my own judgments: shallow thinkers who lack, I mutter, the integrity not to systematize. And I put fingers to keys to note this system of analysis, lacking all integrity, mocking my very position.

I want to maintain my capacity to view each as a mystery, as a human in full, whose interiority I cannot know. I want not to be full of hatred, so I seek to confess that my hatred is self-hatred: shame at the state of my intellectual reactivity and decay. I worry deeply that our systematizing is inevitable because when we are online we are in public: that these fora mandate performance, and worse, the kind of performance that asserts its naturalness, like the grotesquely beautiful actor who says, "Oh, me? I just roll out of bed in the morning and wear whatever I find lying about" as he smiles a smile so practiced it could calibrate the atomic clock. Every online utterance is an angling for approval; we write in the style of speeches: exhorting an audience, haranguing enemies, lauding the choir. People “remind” no one in particular of the correct ways to think, the correct opinions to hold. When I see us speaking like op-ed columnists, I feel embarrassed: it is like watching a lunatic relative address passers-by using the “royal we,” and, I feel, it is pitifully imitative. Whom are we imitating? Those who live in public: politicians, celebrities, “personalities.”

There is no honesty without privacy, and privacy is not being forbidden so much as rendered irrelevant; privacy is an invented concept, after all, and like all inventions must contend with waves of successive technologies or be made obsolete. The basis of privacy is the idea that judgment should pertain only to public acts —acts involving other persons and society— and not the interior spaces of the self. Society has no right to judge one’s mind; society hasn’t even the right to inquire about one’s mind. The ballot is secret; one cannot be compelled to testify or even talk in our criminal justice system; there can be no penalty for being oneself, however odious we may find given selves or whole (imagined) classes of selves.

This very radical idea has an epistemological basis, not a purely moral one: the self is a mystery. Every self is a mystery. You cannot know what someone really is, what they are capable of, what transformations of belief or character they might undergo, in what their identity consists, what they’ve inherited or appropriated, what they’ll abandon or reconsider; you cannot say when a person is who she is, at what point the “real” person exists or when a person’s journey through selves has stopped. A person is not, we all know, his appearance; but do we all know that she is not her job? Or even her politics? 

But totalizing rationalism is emphatic: either something is known or it is irrelevant. Thus: the mystery of the self is a myth; there is no mystery at all. A self is valid or invalid, useful or not, correct or incorrect, and if someone is sufficiently different from you, if their beliefs are sufficiently opposed to yours, their way of life alien enough, they are to be judged and detested. Everyone is a known quantity; simply look at their Twitter bio and despise.

But this is nonsense. In truth, the only intellectually defensible posture is one of humility: all beliefs are misconceptions; all knowledge is contingent, temporary, erroneous; and no self is knowable, not truly, not to another. We can perhaps sense this in ourselves —although I worry that many of us are too happy to brag about our conformity to this or that scheme or judgment, to use labels that honor us as though we’ve earned ourselves rather than chancing into them— but we forget that this is true of every single other, too. This forgetting is the first step of the so-called othering process: forget that we are bound together in irreducibility, forget that we ought to be humble in all things, and especially in our judgments of one another.

Robinson once more:

"Only lonesomeness allows one to experience this sort of radical singularity, one’s greatest dignity and privilege."

Lonesomeness is what we’re all fleeing at the greatest possible speed, what our media now concern themselves chiefly with eliminating alongside leisure. We thus forget our radical singularity, a personal tragedy, an erasure, a hollowing-out, and likewise the singularity of others, which is a tragedy more social and political in nature, and one which seems to me truly and literally horrifying. Because more than any shared “belief system” or political pose, it is the shared experience of radical singularity that unites us: the shared experience of inimitability and mortality. Anything which countermands our duty to recognize and honor the human in the other is a kind of evil, however just its original intention.

“We disparage ourselves endlessly, sometimes with reason… but more often, and more damningly, with a kind of black clarity of judgment that reaches right past all that we have or have not done, reaches past any insight or diagnosis that psychology can offer, and fingers us at the heart of what we are. Wrongness, call it. A stark and utter saturation of self:
God’s most deep decree
Bitter would have me taste: my taste was me.”

The poet Christian Wiman in My Bright Abyss; the final lines are from Gerard Manley Hopkins. “Self” here has a particular definition established in earlier chapters; it is a conception of individual existence which contrasts indifferently with the word “soul,”

a word that has become almost embarrassing for many contemporary people unless it is completely stripped of its religious meaning. Perhaps that’s just what it needs sometimes: to be stripped of its ‘religious’ meaning, in the sense that faith itself sometimes needs to be stripped of its social and historical encrustations and returned to its first, churchless incarnation in the human heart. That’s what the twentieth century was, a kind of windstorm-scouring of all we thought was knowledge, and truth, and ours —until it became too strong for us, or we too weak for it, and ‘the self replaced the soul as the fist of survival’ (Fanny Howe). Anxiety comes from the self as ultimate concern, from the fact that the self cannot bear this ultimate concern: it buckles and wavers under the strain, and eventually, inevitably, it breaks.

My Bright Abyss is dense with such astute and precise humanity —in its poems, both Wiman’s and those he quotes, and its prose descriptions of lived experience— that one’s own lack of religiosity seems hardly important, no more important than faithlessness in a cathedral of tremendous beauty or incredulity amidst Buddhist monks quietly and carefully transcribing their texts. It is certainly the best introduction to poetry I’ve read, but also the most universalizing account of belief:

To have faith is to acknowledge the absolute materiality of existence while acknowledging at the same time the compulsion toward transfiguring order that seems not outside of things but within them, and within you, not an idea imposed upon the world but a vital, answering instinct. Heading home from work, irritated by my busyness and the sense of wasted days, shouldering through the strangers who merge and flow together on Michigan Avenue, merge and flow in the mirrored facades, I flash past the rapt eyes and undecayed face of my grandmother, lit and lost at once. In a board meeting, bored to oblivion, I hear a pen scrape like a fingernail on a cell wall, watch the glasses sweat as if even water wanted out, when suddenly, at the center of the long table, light makes of a bell-shaped pitcher a bell that rings in no place on this earth. Moments, only, and I am aware even within them, and thus am outside of them, yet something in the very act of such attention has troubled the tyranny of the ordinary, as if the world at which I gazed gazed at me, as if the lost face and the living crowd, the soundless bell and the mind in which it rings, all hankered toward—expressed some undeniable hope for—one end.

Quoting any part of the book is acutely frustrating; as Andrew Sullivan wrote after confessing that he read it “in a great rush of exhilaration” that kept him awake into the night, “It is no exaggeration to say that I’ve waited my entire adult life to read a book like this. It is impossible to summarize or even categorize.” And so it is. Perhaps the clearest thing I can say about it is that it seems to come from a time before the degradation and quiet collapse of art and literature, before noncommercial and nonsocial meaning itself was rendered absurd. Sullivan compares it to Simone Weil’s Gravity and Grace, and Weil —a hero of mine in every sense— also seemed rather like an emissary from a vastly more serious and honest time. The introspection on which Wiman and Weil alike base much of their work has nothing of the performativity that ensnares our introspection, to note one difference among many. Sullivan again:

If I were to suggest why, whether believer or not, you should read My Bright Abyss, it would be because Wiman asks the most difficult questions I can imagine about life and death with unflinching honesty.

For me, the caliber, depth, and intensity of his honesty is a bracing artistic achievement rare if not absent among contemporary writers, into whose most intimate prose creeps a pathetic public deference, a political sort of compromise, as though while making love they are wondering how their form will be judged, pretending to enjoy that which they do not. They are oppressed by the imperative to conform to a zeitgeist which insists it is not fashion but moral truth, as though any era is anything but transiently mistaken, soon to be misunderstood by generations who judge it ethically wanting, intellectually primitive, socially disgraceful. Do you think you are not a slaveholder, in your way? Do you think you will carry the approval of your peers with you into the dark earth?

I peer at Wiman’s sentences, trying to determine how he managed to get off stage in order to think and write just so, how he managed to create without hearing the carping of the crowds we all now carry. I will never not hear them, never not seek to anticipate them and defend myself. Wiman quotes Rilke’s Seventh Duino Elegy, which I read and ignored in school:

Truly being here is glorious. Even you knew it,
you girls who seemed to be lost, to go under –, in the filthiest
streets of the city, festering there, or wide open
for garbage. For each of you had an hour, or perhaps
not even an hour, a barely measurable time
between two moments –, when you were granted a sense
of being. Everything. Your veins flowed with being.
But we can so easily forget what our laughing neighbor
neither confirms nor envies.

It is hard to keep a sense of oneself, but even in the filthiest streets of the city our veins flow with being. My Bright Abyss helps me remember what matters and what does not.

Designer Duds: Losing Our Seat at the Table

If design hadn’t triumphed by 2012, it had by 2013.

Three years after launching the iPad, Apple was the world’s most valuable company, and even second-order pundits knew why: design. Steve Jobs’ remark that design was “how it works” had achieved what seemed like widespread comprehension, and recruiting wars for top designers rivaled those for top engineers. Salaries escalated, but cachet escalated faster; entire funds emerged whose only purpose was to invest in designer founders, and with money and esteem came the fetishization of design, the reduction of designers into archetypes, the establishment of trade cliques and the ever-increasing popularity of trend-ecosystems.

There were valedictory encomia about the power of design to deliver better products and therefore better commercial outcomes for companies and better utilitarian outcomes for users. In his rather more sober but nevertheless remarkable talk at Build 2013, David Cole noted that thanks to Apple,

Taking a design-centric approach to product development is becoming the default, I’m sure it will be taught in business schools soon enough… This is a trend I’ve observed happening across our whole industry: design creeping into the tops of organizations, into the beginnings of processes. We’re almost to the point, or maybe we’re already there, that these are boring, obvious observations to be making. Designers, at last, have their seat at the table.

For those of us who believe in the power of design thinking to solve human problems, and to a lesser extent in the power of markets to reward solutions when the interests of consumers and businesses are correctly aligned, this was invigorating news. Parts of the technology industry spent much of the 1990s and even the 2000s misunderstanding what design was and how it could help improve products. There was a time, after all, when Apple was a laughingstock. Now, in part thanks to Jobs and Ive and the entire culture of the company, as well as its undeniable financial success, designers would be heard and could make bigger contributions to human progress.

It’s now 2014, and I doubt seriously whether I’m alone in feeling a sense of anxiety about how “design” is using its seat at the table. From the failure of “design-oriented” Path [1] to the recent news that Square is seeking a buyer [2] to the fact that Medium is paying people to use it [3], there’s evidence that the luminaries of our community have been unable to use design to achieve market success. More troubling, much of the work for which we express the most enthusiasm seems superficial, narrow in its conception of design, shallow in its ambitions, or just ineffective.

To take stock, let’s consider three apps which ought to concern anyone who hoped that the rising profile of design would produce better products, better businesses, better outcomes.

Dropbox’s Carousel

Dropbox isn’t an obvious candidate for a design-obsessed company, but under the direction of former Facebook designer Soleio it has nevertheless become one, stockpiling designers at an impressive rate with a relatively simple pitch: we have the world’s data, its photos, its documents, its digital lives, as a massive foundation. Build great products on top of that. One can see how this might seem plausible enough, and indeed Soleio has assembled a strong team. In the design community, anticipation for the fruits of their labor has been widespread, and Carousel seems to be the first indication of what they’ll be up to.

Carousel is an app for storing and sharing photos. Dryly described, it almost seems like it was released several years late by accident; after all, many solutions already address both needs. Carousel has nice touches —it attempts, with middling success, to enlarge the “most interesting” photo on a given view when it lays out your pictures, and it uses some possibly handy but easily forgotten gestures— but its main standout at launch was some gratuitously sentimental and derivative marketing. It’s honestly hard to determine what should be interesting about it, even in theory; it takes mostly standard approaches to organization, display, and sharing, and seems to do little to distinguish itself from the default iOS Photos app + iCloud Photo Sharing, let alone apps and services like Instagram, VSCO Cam, Snapchat, iMessage, Facebook Messenger, and so on.

Its value seems to be unclear to iPhone owners, in any event:

[image]

If Carousel is intended to solve a user problem, neither I nor other potential users seem to be able to figure out what it is. It seems likelier to solve a Dropbox problem: how to get consumers to pay for higher tiers of Dropbox services by getting them to store all of their photos there. But Flickr, too, will store all of my photos, with additional functionality and without a fee. And Apple will also store many of my photos, and with iCloud Photo Sharing will let me share them in the manner Carousel does [4].

Despite an immense amount of press at its launch, Carousel is faring poorly in the App Store. Perhaps there are plans for the future development of more useful, differentiating features, but until then, it’s duplicative of existing and adopted solutions and seems to offer no incentive for switching from whatever one uses currently.

If you gathered some of the world’s best designers and gave them significant organizational support and all the resources they needed, is an app which at best matches the functionality of bundled OS features from a few years ago what you’d expect?

It should go without saying that Carousel could be on its way to becoming a great product; it should also be acknowledged that absent intimate familiarity with its development, we can’t be confident who’s at fault for its paltry functionality and underwhelming differentiation. Perhaps it was rushed, poorly executed, or the fault of some errant executive. But that hardly accords with what one knows about Soleio, and in any event: the team seems happy with it.

But who is helped by this app? Whose life is improved? Whose problems are solved? What is now possible that wasn’t before? What is even merely easier?

Facebook’s Paper

Facebook has landed some of the best designers in the industry over the past several years, often acquiring their companies outright in order to do so; folks like Wilson Miner, Nicholas Felton, Mike Matas, and many more have gone over to the new Big Blue [5]. While Felton was reputed to be responsible for a Timeline redesign, Paper is known to be the work of Matas, along with several other folks in Facebook’s “Creative Labs” group.

Apart from its performance in the marketplace or success with users, Paper is interesting for two reasons:

  1. The continuing physicalization of the UI, which Matas helped along by designing iPhone OS 1.0 while at Apple, is important for making computers usable. A significant percentage of our progress in computational accessibility comes from the utilization of increased power or device awareness to deliver more persuasively “realistic” physical models for UI elements. First, the GUI; then additional colors, layering, and transition animations; now, velocity physics, the making of manipulable “objects” out of app elements, and so on. The better we get at this, the easier computers are to use and the more power we devolve to users, enabling their aspirations [6].
  2. Facebook spent lots of time and effort on creating a design / development environment to make UI elements like those in Paper easier to implement. They also open-sourced some of their work, helping others to physicalize their UIs, too. Given that Apple seems uninterested in this at the moment —focusing more on data, services, interconnection, and the like while iOS remains mostly a series of scrolling views with headers and footers — Facebook’s leadership is useful.

That said, Paper is not actually a good product in itself, and users don’t seem to be keen on replacing the main Facebook app —which is quite awkward in its animations, webby in its janky scrolling— with it:

[image]

While this is better than Carousel, it’s still beneath apps like Keek, We Heart It, Kik Kak, and even Google+. If rumors of poor in-app metrics like engagement are true, the download numbers are just the start of the problem. What Paper seems to be is a lovelier UI for Facebook. Path spent tens of millions of dollars attempting to achieve the same goal; both teams seem determined to ignore that for most users, the problems with Facebook do not actually have to do with how pretty it is.

To the extent that Paper is an initial step of UI renovation with substantial functional goals —for example, perhaps the “cards” of user stories have the ultimate aim of making it easier for neophytes to express what they don’t want to read or see by, say, flicking the card down— some of these criticisms are baseless. However, some of the people who made Paper suggest that in fact users are the ones who need to improve:

[image]

Hoping that users “worry more” about the quality of their photos seems like the wrong attitude to have. It’s worth noting that everyone I know who’s happiest with Facebook uploads whatever they want, including ugly, low-resolution photos, garbage meme-images, random, hyper-compressed videos, and the rest of the junk that they find interesting. It all looks insane in Paper. Wanting users to spring for DSLRs and learn how to shoot their kale salad with a shallow depth of field so that the very lovely new app you’ve built isn’t ruined by their tastelessness is exactly backwards.

Using Paper, I have a sense of anxiety: what if this is what designers make when not yoked to “product thinking”? What if Matas et alia sans Jobs or Forstall are capable of impossibly perfect physics in UIs, of great elements of design, but not of holistic product thinking, of real product integrity? What if design uses its seat at the table to draw pretty things but otherwise doesn’t pay much attention to the outcomes, the user behaviors, the things enabled?

Because Paper, after all, not only adds little to Facebook per se, but is in fact feature-limited relative to the main app. And this is to say nothing of the strange information architecture in it, the issues with information density, and so on. What were they really solving for? Whose lives will be bettered by it? What has been enabled?

Jelly

Jelly is Biz Stone’s empathy-boosting app. Or it’s an app you use to get answers to questions. The marketing makes it hard to understand:

The idea for Jelly is a complete reimagining of how we get answers to queries based on a more human approach. Jelly is a mobile application that uses photos, interactive maps, location, and most importantly, people to deliver answers to queries. On a fundamental level, Jelly helps people. (source)

The idea is a reimagining? It is a reimagining of how we get answers based on a more human approach? More human than what? Than Google Search, which is made by humans to respond to your human inputs by connecting you to resources made by other humans? More human than Quora, which functions similarly?

Using Jelly to help people is much more important than using Jelly to search for help. If we’re successful, then we’re going to introduce into the daily muscle memory of smartphone users, everyone, that there’s this idea that there’s other people that need their help right now. Let’s make the world a more empathetic place by teaching that there’s other people around them that need help. (source)

Stone seems everywhere to hedge his bets about Jelly’s real purpose, and it’s not hard to understand why: a sufficiently vague target is harder to miss. On the other hand, using the app oneself is a depressing experience; my experience with it, despite being in precisely the demographic that is likeliest to use it heavily, bears out this writer’s opinion: it is a desert. That’s probably because no one is downloading it:

[image]

To be clear: I do not believe that Jelly is purely a marketing failure. Its design is not good, despite being the startup of a fully pedigreed “thought-leader” in the industry, who says in interviews that “we designed a better way to ask a question.”

But when Google Search designed a better way to ask a question, the proof was in the answers. With Jelly, the answers are rare, slow in coming, often jokes, gags, or irrelevant comments, and have nothing like the crowd-vetted quality of sites like Quora. Jelly, by design, is a step backwards, re-emphasizing the quality of your own existing networks as though the very problem to be solved for isn’t the contingent availability of knowledge itself, distributed inefficiently and unequally through social connections!

What Jelly does is it uses photos, locations, maps, and most importantly, people from all your social networks meshed together into one big network. It goes out not just one degree but two degrees of separation. Your query is going to real people. And they either know the answer or they can forward it to someone in their social network. This is where the strength of weak ties comes in… You and your friends generally know the same sort of stuff. But then you’ve got that one acquaintance, that lawyer, say, who brings a whole new circle of expertise. So the queries jump into these new arenas, and within a minute you get back answers from people. You see how you’re connected to that person. A real answer from a real person. (source)

Hopefully you’re connected to lawyers, or to folks who know some! Otherwise, this “better way to ask a question” yields the same divisions society already has: some people know the right folks to get the right answers, and others don’t. It’s not hard to see why one particularly acerbic pundit called it “Yahoo Answers for the bourgeoisie,” just as Medium is a CMS for the bourgeoisie.

While Stone’s questions about M&A law and where to get the absolute best handmade bike probably get responses, for most of us, Google, Quora, Wikipedia, and dozens of other sources besides are better places to get questions answered.

Again one wonders: what were they designing for? What outcomes did they hope to catalyze through the software and service? Whose life will be improved, or even affected? How seriously are they even taking this?

Designer, Heal Thyself

It’s not fashionable to rain criticism on creative and entrepreneurial efforts in Silicon Valley, and I apologize to anyone rankled, vexed, or hurt by these remarks [7]. I also acknowledge again that, from outside of an enterprise, one’s analyses can be quite mistaken, and if I’ve maligned any apps or companies due to errant assumptions, I regret it. And as it happens: I use and enjoy Paper.

But for the design community, the issue is larger than anyone’s feelings, or even the success or failure of these apps. I worry about the reckoning to come when Square sells to Apple for less than its investors had hoped, or when Medium shuts down or gets acquired (or pivots to provide something other than an attractive, New Yorker-themed CMS for writers, the poorest people in the first world). While Biz Stone will walk away from Jelly smiling about yet another “valuable failure” and Soleio and Matas will always have their bodies of work, ordinary designers will be asked to please gather their things and leave the conference room in which CTOs and VPs of Sales and CEOs who remember how useless all of Square’s attention to detail turned out to be will resume making decisions. Design has, after all, passed out of vogue before.

In order to avoid losing its place atop organizations, design must deliver results. Designers must also accept that if they don’t, they’re not actually designing well; in technology, at least, the subjective artistry of design is mirrored by the objective finality of use data. A “great” design which produces bad outcomes —low engagement, little utility, few downloads, indifference on the part of the target market— should be regarded as a failure.

And if our best designers, ensconced in their labs with world-class teams, cannot reliably produce successful products, we should admit to ourselves that perhaps so-called “design science” remains much less developed than computer science, and that we’d do well to stay humble despite our rising stature. Design’s new prominence means that design’s failures have ever-greater visibility. Having the integrity and introspective accuracy to distinguish what one likes from what is good, useful, meaningful is vital; we do not work for ourselves but for our users. What do they want? What do they need? From what will they benefit? While answering these questions, we should hew to data, be intuitive about our users and their needs, and subject our designs to significant criticism and use before validating them.

Combining epistemological humility, psychological perceptivity, and technological-systematic thinking remains the best defense against launching duds, but necessary too is some depth of character, some intelligence about purposes, some humane empathy for those we serve. Because if what we design is shown by the markets not to have been useful, it’s no one’s fault but ours. And we shouldn’t think that others in organizations won’t take notice.

Notes

1 Path is a themed Facebook, little more; it’s a shallow variation on an existing product, and its lack of use reflects this. In a recent essay, Path co-founder Dave Morin argued for an approach to design which he called Super Normal, borrowing from a distantly-related set of ideas advanced by Jasper Morrison. Morin writes:

Imagine a basic metal bucket in your mind… To apply Super Normal thinking we start by looking at what is normal and then ask the question: What are the key problems? In the case of our basic metal bucket we can find a few. First, the metal handle cuts into your hand when carrying a bucket full of cold water. Second, when picking up a bucket of cold water the metal is freezing to the touch. Third, when pouring the water out, it’s hard to control the stream of water, causing you to lose water.
In thinking through these problems we can come up with some simple innovations that would make the bucket better. First, we can add a wood or plastic wrap to the metal handle, creating more surface area and thus a more comfortable carry. Second, we can wrap the entire bucket in a thin layer of plastic creating insulation when carrying hot or cold water. Third, we can add a spout to the side, making it easy to control the pour, causing you to lose less water.

It’s easy to see how Morin could mistake Path for design innovation when one reads this. To be clear: what’s needed isn’t plastic-covered buckets (or red-covered Facebooks). What’s needed is plumbing. Design is about solving problems that humans have, not problems that products have. We start with problems people have —how do I get clean water to drink, how do I fill my bathtub, how do I water my plants— and find the best practicable solution. It’s not a more comfortable bucket. Morin seems to believe design is varying the ideas of others in obvious ways. I disagree, and so does the market.

2 Square had a complex, ambitious, multi-phase product and services strategy that seems to have required levels of adoption they simply couldn’t achieve. Without substantial marketshare for the card reader, Wallet didn’t work in enough places to be worth downloading (and it had issues with reliability, too); without Wallet adoption rising, card reader adoption depended solely on its value proposition relative to other POS solutions, which are no longer unprepared for Square. Without merchants switching over to Square solutions en masse, Market doesn’t make as much sense, and other companies are already attacking the problem of e-commerce for smaller businesses, with more focus. Square Cash works well, but competitors like Venmo have a multi-year head start and, perhaps, a better model. Building a mutually-reinforcing ecosystem of payments-technology products and using leverage from one success to propel another, in sum, seems not to have worked.

3 While Medium fiddles with its organizational structure for approving journalists, adoption is sufficiently slow that they’ve resorted to paying people to use their product, since doing so brings no real functional or distributional advantages (beyond the pleasures afforded by their beautiful UI and visual design). Contrast this, say, with Quora’s value proposition: perhaps it’s less pretty, but for writers distribution, longevity, and much more besides full-bleed images matter. As it happens, I feel like Medium is approaching a painful moment Quora itself faced: early adoption by a certain sort of user can affect the brand in the eyes of a larger market of users. Medium is becoming synonymous with bloviating design and tech writing of the “thought leadership” variety and the occasional one-off confession, apology, or hatchet job. Its collections system doesn’t seem to drive much browsing behavior, and it certainly can’t afford to pay micro-celebrities and freelancers forever. And yes: they paid me for this essay.

4 I’m reminded of the short-lived Apple ad campaign which said that the first question in design is something like: “Does it have a right to exist?” Does Carousel?

5 Miner and Felton have both already left. People often ask how Facebook, which isn’t particularly beloved among designers or their “set,” can entice these sorts of talents. Beyond material compensation, I think it also has to do with something I wrote about here:

There’s a millennial element to insisting on living in public, but it’s also just an effect of the social media age. As it happens, I think this is the one unreservedly positive cultural effect of social media, and I assume this is how Zuckerberg et alia recruit idealists to work on social media products. Thanks to such networks, two things happen: (1) it becomes harder to conceal secrets, to hide ourselves and our behaviors and choices; and (2) it’s harder to ignore the true, unconcealed nature of others, their humanity, the validity of their behaviors and choices.
Together, these bring about necessary revisions in our moral standards and cultural judgments; while it is too slow for persons affected by discrimination and abuse, this process is unbelievably rapid by historical standards.
In particular, the transformation of American attitudes about homosexuality —the decreasing acceptability of using words like “gay” pejoratively, the commonplace presence of gay characters on TV, etc.— has occurred at breakneck speed, due both to activist political efforts and phenomena like George Takei’s presence in everyone’s Facebook news feeds for the past few years. Takei has 6M “Likes” on Facebook and over 1M followers on Twitter, lots of them heartland folks whose exposure to a “safe” and funny gay person changed how they thought; it’s harder to dehumanize those who appear alongside your family in your feed, making amusing observations and getting 100K likes from “regular people.”
Being brought into frequent contact with cultural output of George Takei and others probably did more to shift American attitudes than many would believe. That’s a foundational idea behind Buzzfeed’s LGBTQ coverage, and that they’ve been so successful suggests a lot about the centrality and importance of social media in culture.

6 See Paper Prototyping for more.

7 If it helps: I was not a success as a designer or as an entrepreneur in the marketplace, either. I mean: I’m not a success in any sense!

David Foster Wallace & Trudy

For many years since reading A Supposedly Fun Thing I’ll Never Do Again, I’ve wondered irritably: was David Foster Wallace mocking real people in his essay on the cruise-ship experience? Specifically, this passage stayed with me:

"My favorite tablemate is Trudy, whose husband…has given his ticket to Alice, their heavy and extremely well-dressed daughter… every time Alice mentions [her boyfriend Patrick, Trudy] suffers some sort of weird facial tic or grimace where the canine tooth on one side of her face shows but the other side’s doesn’t. Trudy is fifty-six and looks –and I mean this in the nicest possible way– rather like Jackie Gleason in drag, and has a particularly loud pre-laugh scream that is a real arrhythmia-producer…"

Because Wallace returns to and discusses this group repeatedly and seems fond of them, it was hard to understand how he’d simultaneously savage them with sardonic insults like these; to be clear: he is mocking them for their appearance, the sound of their laughter, the personalities of their children, etc., in a national publication.

I was often told that Wallace was surely using an amalgam of characters, or even entirely conjured ones, despite the verite nature of the essay; these barbs, after all, are hard to square with the ethics expressed elsewhere in his work, and seem difficult to justify from the reader’s or writer’s perspective.

Nevertheless, it turns out that, in fact, he was mocking real people. He was asked about it long ago in “There’s Going To Be the Occasional Bit of Embellishment”: David Foster Wallace on Nonfiction, 1998, Part 3, an interview with Tom Scocca at Slate. The relevant portion is below:

Q: Also when you’re writing about real events, there are other people who are at the same events. Have you heard back from the people that you’re writing about? Trudy especially comes to mind—

DFW: [Groans]

Q: —who you described as looking like—

DFW: That, that was a very bad scene, because they were really nice to me on the cruise. And actually sent me a couple cards, and were looking forward to the thing coming out. And then it came out, and, you know, I never heard from them again. I feel—I’m worried that it hurt their feelings.

The. Thing. Is. Is, you know, saying that somebody looks like Jackie Gleason in drag, it might not be very nice, but if you just, if you could have seen her, it was true. It was just absolutely true. And so it’s one reason why I don’t do a lot of these, is there’s a real delicate balance between fucking somebody over and telling the truth to the reader.

Scocca does not press him on what sort of truth an insulting analogy is; in my opinion, it is a low order of truth, at the absolute best a physical description that could have been achieved in a less derisive way; that is: it is not a meaningful enough truth to matter much. But more importantly: there is a way to describe Trudy that isn’t a punchline. (The notion that there isn’t would reflect a total poverty of literary imagination).

Note that Wallace himself equivocates about the utility of the analogy:

DFW: I wasn’t going to hurt anybody or, you know, talk about anybody having sex with a White House intern or something. But I was going to tell the truth. And I couldn’t just so worry about Trudy’s feelings that I couldn’t say the truth. Which is, you know, a terrific, really nice, and not unattractive lady who did happen to look just like Jackie Gleason in drag.

Q: Maybe if you’d emphasized that it was not in an unattractive way. Which is sort of a hard thing to picture.

DFW: Actually the first draft of that did have that, and the editor pointed out that not only did this waste words, but it looked like I was trying to have my cake and eat it too. That I was trying to tell an unkind truth but somehow give her a neck rub at the same time. So it got cut.

Q: But you actually did want to have your cake and eat it too. Not in a bad way.

DFW: I’m unabashed, I think, in wanting to have my cake and eat it too.

I think he ought to have been a little abashed by the proximity of phrases like “I wasn’t going to hurt anybody” and “I couldn’t just so worry about Trudy’s feelings” and “not unattractive” and “Jackie Gleason in drag.” So close to one another, they aren’t coherent.

Even Scocca has to note that it’s hard to picture someone looking like Jackie Gleason in drag yet not being unattractive. This means it is a poor analogy, a bad description. Wallace wants to convey that she looks a certain way and is not unattractive; instead, he conveys that she is maximally unattractive and makes a punchline of it, then says it’s for the “truth” before ambivalently wishing it didn’t have to be this way in writing (which it doesn’t).

It is a parting amusement (and a reminder of the 1990s) that Wallace asserts that he would never "talk about anybody having sex with a White House intern…but I was going to tell the truth"; eager to establish his bona fides as a reputable thinker who supports the right politics, Wallace seems not to consider very clearly the relative value of these two disclosures:

  1. That a sitting US president cheated on his wife with an intern employed by the government, then lied about it to a country that —however much this pains me and Wallace alike— wants to moralistically examine and judge the private lives of their elected figures and has every right to do so, as they are the people and this is a democracy (to avoid confusion: I wish America were more like France, indifferent to the private affairs of public citizens; but that is my wish, not the wish of most of my fellow citizens, to whom journalists are theoretically beholden)
  2. That a friendly, ordinary private citizen was overweight, ugly, had an awful laugh, and made faces at her heavy-set daughter whenever the latter mentioned her boyfriend.

It’s hard for me to understand the reasoning he must have employed in deciding that the first is either unimportant or merits the protections of discrete privacy, supported by strangers, while the latter —that a woman and her daughter aren’t attractive— is important in light of the imperatives of journalistic truth! 

(Originally answered on Quora).

Description of a Struggle

If this piece gives you concerns about my viability as an employee, renter, applicant, neighbor, etc., please read this disclaimer / claimer.

I have an almost technical interest in attempting to describe the subjective experience of certain aberrant mental phenomena. Apart from any broader concerns and without concluding anything from it, then, here is an attempted accounting of what one might call a “breakdown” or an episode, an instance of bipolar collapse. For those interested: there was no interruption in my medication or treatment, but I’d had insufficient sleep the night before and some personal difficulties had catalyzed a terribly unhealthy mood.

I retreated into the bathroom, shut the door, and turned out the lights; I was very upset about many things, about all things; whatever thoughts formed were either dark and horrible at the outset or were pulled towards darkness very quickly. From one subject to the next: fears about the future, regrets about the past, guilt about my own moral failures, self-loathing because of my intransigent faults, fury at innumerable persons, shame at the force of my hatred and bitterness, and exhaustion with the perpetual systematic failure of my entire mind: personality, cognition, memory, emotion, will. In a few seconds I would cycle through these thoughts, which lead into and depart from one another, in an accelerating spiral that unified, separated, and then recomposed these threads again and again; it was rapid and repetitive.

This state is steady enough. It hurts very much, and I often sob into the floor, but it is not acute; it is simply painful to be filled with so much hatred for oneself, and to have this hatred permeate through one’s entirety: into one’s childhood memories, into one’s aesthetic sensibilities, into one’s sense of ethics. Because people were sleeping, I attempted mainly to cry noiselessly.

Suddenly, things grew vastly worse. I felt as though I had fallen backwards into a void at the absolute core of my mind, as though I had dropped through the variously false and detestable strata of my being into the reality of myself: nothingness, blackness, an abhorred vacuum around which swirled thoughts now far too fast to track, record, or resist. And I did indeed fall backwards, into the bathtub, because I felt exposed outside of it. I curled into myself and opened my mouth and screamed silently, my wet face draining from so many places that I worried as I gasped that I would aspirate tears, spit, snot.

Here is what I saw:

  • In the blackness of myself, I could see that my thoughts were not myself at all: my self is only a nothingness that exists in a state of pure terror and hatred, and my thoughts rotate around it as debris in a tornado. My thoughts were imbecilic, disgusting, vicious, superficial, detestable, but by this point I could no longer stay with them long enough to hate them. They distracted me, but I couldn’t attend to them. I said in my mind: “Oh god, oh god, oh god, nothing, nothing, nothing; oh god, nothing, nothing, oh god, I’m nothing, it’s nothing, there’s nothing, god, god.”
  • Periodically I would see what I assume was a phosphene, and it would transform into something real; I saw a glowing purple shape become the sun, and the sun became the blond hair I had in childhood. And I realized that I had murdered that boy, had murdered my own boyhood self, had destroyed this innocent child, and I ground my teeth to silence myself, as I wanted to scream so loud that I would tear myself apart, would explode in a bloody spray. I was sick with guilt and fear; I had nothing inside myself any longer; I felt I had betrayed myself, had orphaned myself when I needed someone most. I heard in my mind: “Why did I kill him? Oh god, he needed someone, he needed someone, why did I kill him, I’ve killed him, oh god, I’ve killed him.”
  • I was seized with a desire to gain physical access to and destroy my brain, an urge I felt in childhood when I had severe headaches. I grasped my hair and attempted to pull it out; I wanted to rip my scalp open and reach into my skull and destroy my mind, scramble and tear apart this malevolent and pathetic apparatus with my fingers, rip out the guts of my whole nightmare self. I couldn’t get my hair out, hated myself for it, lost the thread of this thought, and resumed my silent shrieking and sobbing.

I thought of my mother and my father, and I thought of Abby, but only for flashes: nothing would remain, everything was immediately carried off in this great storm of shame, fear, rage, and sorrow. I wept and wept, incapable of extending myself through time: it was the brutality of the present that crushed me, the incessant re-setting of the scene: any effort to elaborate a saving thought, a consolatory or even therapeutic idea, was in vain; all things were carried away at once, disappeared from me, receded into distance. I thought only of my own destruction.

I hurt myself crying: an extraordinarily pathetic feeling overtook me as I cramped from pushing against the walls of the tub and I turned onto my back, looking upwards. Everything was slowing down, and I realized it: I felt as though I was being pulled upward through the same strata, back up to the “higher order” consciousness from which I had moments ago felt permanently alienated. It wasn’t a happy feeling; it felt false, pointless. But it wasn’t volitional, and within minutes I was out of the bathroom, pulling on my clothes to prepare for the day. It was Abby’s company’s summer picnic; they rented out a water park.

My father, his father, and his brother in the 1950s. It took me an inexcusably long time to realize how much of what I like about myself, how much of what enables the happiness or goodness I attain, I owe to him; that strange interference that can distort a daughter’s perception of her mother has its counterpart between fathers and sons, everyone knows that; but it took me by surprise nevertheless how much I’d identified and appreciated those things that came from my mother while assuming that virtues, interests, and ideas which he gave me were my own inventions. I’m no longer quite so deluded.

Happy father’s day, dad!

This photo was taken in 2003, when she was just a few years old. We had so many adventures over the years; I don’t know how much less I might have lived, how much more closed I’d have been, had I not taken her home from the veterinary hospital where I worked. It was 2001, and she’d been found, hairless and bruised and infected with mange and scabies and worms, in Bayou St. John; they dropped her with us, but she was nearly feral. In taking care of her, I bonded with her and took her home over the reasonable objections of many there, who’d noted how damaged and neurotic she was.

Tonight, Abby and I pressed our wet faces to her head as a doctor euthanized Bayou. She was 13 years old, dying from a bleeding belly tumor, too weak to move anything but her eyes. She was always so tough and sweet, always my close companion. These past years in San Francisco were a dream for her, and I guess I’ll try to hang on to that now that she’s gone.

Here are some photos of her being wonderful. Aren’t some of those fun? We were so much younger, and Louisiana was so green. And here are all the times I posted about her. I don’t care about these words in the slightest; for some reason, I just want to share her with you, show you photos of how she played and ran. She was here, with me, for the happiest years of my life.

Miles Barger posted this wonderful image from The Neighbors, a photographic series by Arne Svenson of scenes in the windows of his Manhattan neighbors. They seem to assert the primacy of unknowable interior spaces, those buried within decor and personality, deeper within ourselves than our names go, deeper than our uniquenesses, into those places where we are archetypes, reacting without will to dreams and fears.

“The Church has become close to me in its distrust of man, and my distrust of form, my urgent desire to withdraw from it, to claim ‘that that is not yet I,’ which accompanies my every thought and feeling, coincides with the intentions of its doctrine. The Church is afraid of man and I am afraid of man. The Church does not trust man and I do not trust man. The Church, in opposing temporality to eternity, heaven to earth, tries to provide man with the distance [from] his own nature that I find indispensable. And nowhere does this affiliation mark itself more strongly than in our approach to Beauty. Both the Church and I fear beauty in this vale of tears, we both strive to defuse it, we want to defend ourselves against its excessive allure. The important thing for me is that it and I both insist on the division of man: the Church into the divine and the human component, I into life and consciousness. After the period in which art, philosophy, and politics looked for the integral, uniform, concrete, and literal man, the need for an elusive man who is a play of contradictions, a fountain of gushing antinomies and a system of infinite compensation, is growing. He who calls this “escapism” is unwise…”
The irreligious Witold Gombrowicz articulating some of the reasons why even the incredulous might find credulity closer to their principles than many popular forms of unexamined, incoherently reductive materialism.
My mother doesn’t care for mother’s day, dislikes its manufactured manipulation of sentiment; she asks us not to do anything commercial for the occasion, so this is all I’ll do: post these photos of her and note, only half-knowing what I mean, that the older I get the harder it is to think of her without falling all the way into the deepest parts of my own heart. I am not becoming her equal as I age; she constitutes the sky under which I grow, the sky whose scope exceeds the terrestrial horizons which limit my own vision, the sky beyond the mountains and above the clouds, the sky which surrounds the earth. I am thirty-two years old and I love her.

Free Will & the Fallibility of Science

One of the most significant intellectual errors educated persons make is in underestimating the fallibility of science. The very best scientific theories containing our soundest, most reliable knowledge are certain to be superseded, recategorized from “right” to “wrong”; they are, as physicist David Deutsch says, misconceptions:

I have often thought that the nature of science would be better understood if we called theories “misconceptions” from the outset, instead of only after we have discovered their successors. Thus we could say that Einstein’s Misconception of Gravity was an improvement on Newton’s Misconception, which was an improvement on Kepler’s. The neo-Darwinian Misconception of Evolution is an improvement on Darwin’s Misconception, and his on Lamarck’s… Science claims neither infallibility nor finality.

This fact comes as a surprise to many; we tend to think of science —at the point of conclusion, when it becomes knowledge— as being more or less infallible and certainly final. Science, indeed, is the sole area of human investigation whose reports we take seriously to the point of crypto-objectivism. Even people who very much deny the possibility of objective knowledge step onto airplanes and ingest medicines. And most importantly: where science contradicts what we believe or know through cultural or even personal means, we accept science and discard those truths, often wisely.

An obvious example: the philosophical problem of free will. When Newton’s misconceptions were still considered the exemplar of truth par excellence, the very model of knowledge, many philosophers felt obliged to accept a kind of determinism with radical implications. Given the initial state of the universe, it appeared, we should be able to follow all particle trajectories through to the present and account for all phenomena through purely physical means. In other words: the chain of causation from the Big Bang on left no room for your volition:

Determinism in the West is often associated with Newtonian physics, which depicts the physical matter of the universe as operating according to a set of fixed, knowable laws. The “billiard ball” hypothesis, a product of Newtonian physics, argues that once the initial conditions of the universe have been established, the rest of the history of the universe follows inevitably. If it were actually possible to have complete knowledge of physical matter and all of the laws governing that matter at any one time, then it would be theoretically possible to compute the time and place of every event that will ever occur (Laplace’s demon). In this sense, the basic particles of the universe operate in the same fashion as the rolling balls on a billiard table, moving and striking each other in predictable ways to produce predictable results.

Thus: the movement of the atoms of your body, and the emergent phenomena that such movement entails, can all be physically accounted for as part of a chain of merely physical, causal steps. You do not “decide” things; your “feelings” aren’t governing anything; there is no meaning to your sense of agency or rationality. From this essentially unavoidable philosophical position, we are logically compelled to derive many political, moral, and cultural conclusions. For example: if free will is a phenomenological illusion, we must deprecate phenomenology in our philosophies; it is the closely clutched delusion of a faulty animal; people, as predictable and materially reducible as commodities, can be reckoned by governments and institutions as though they are numbers. Freedom is a myth; you are the result of a process you didn’t control, and your choices aren’t choices at all but the results of laws we can discover, understand, and base our morality upon.
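
To feel the force of the Laplacean picture, it can help to imagine it in the most naive terms possible, as a toy computation. The sketch below is my own illustration, not anything drawn from Prigogine, Popper, or the physics itself: a fictitious one-dimensional “universe” whose only law is constant velocity. Given complete knowledge of the initial state, every later state follows by rote, and two runs from the same beginning produce the same future; at no step is there room for anything like a decision to enter.

```python
# A toy sketch of "billiard ball" determinism (a hypothetical illustration,
# not a claim about real physics): particles on a line, each with a fixed
# velocity, evolving under a single unchanging rule.

def step(state):
    # The sole "law of nature": position advances by velocity.
    # Nothing in the rule consults a choice, a feeling, or a will.
    return [(x + v, v) for (x, v) in state]

def evolve(initial_state, steps):
    # From a complete initial state, every later state follows mechanically.
    state = list(initial_state)
    for _ in range(steps):
        state = step(state)
    return state

initial = [(0.0, 1.0), (5.0, -0.5)]  # (position, velocity) pairs

# Two runs from identical initial conditions yield an identical "future".
assert evolve(initial, 100) == evolve(initial, 100)
print(evolve(initial, 100))  # [(100.0, 1.0), (-45.0, -0.5)]
```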

I should note now that (1) many people, even people far from epistemology, accept this idea, conveyed via the diffusion of science and philosophy through politics, art, and culture, that most of who you are is determined apart from your will; and (2) the development of quantum physics has not in itself upended the theory that free will is an illusion, as the sort of indeterminacy we see among particles does not provide sufficient room, as it were, for free will.

Of course, few of us can behave for even a moment as though free will is a myth; if it were, there would be no reason for personal engagement with ourselves, no justification for “trying” or “striving”; one would be, at best, a robot-like automaton incapable of self-control but capable of self-observation. One would account for one’s behaviors not with reasons but with causes; one would be profoundly divested from outcomes which one cannot affect anyway. And one would come to hold that, in its basic conception of time and will, the human consciousness was totally deluded.

As it happens, determinism is a false conception of reality. Physicists like David Deutsch and Ilya Prigogine have, in my opinion, defended free will amply on scientific grounds; and the philosopher Karl Popper described how free will is compatible in principle with a physicalist conception of the universe; he is quoted by both scientists, and Prigogine begins his book The End of Certainty, which proposes that determinism is no longer compatible with science, by alluding to Popper:

Earlier this century in The Open Universe: An Argument for Indeterminism, Karl Popper wrote, “Common sense inclines, on the one hand, to assert that every event is caused by some preceding events, so that every event can be explained or predicted… On the other hand, … common sense attributes to mature and sane human persons… the ability to choose freely between alternative possibilities of acting.” This “dilemma of determinism,” as William James called it, is closely related to the meaning of time. Is the future given, or is it under perpetual construction?

Prigogine goes on to demonstrate that there is, in fact, an “arrow of time,” that time is not symmetrical, and that the future is very much open, very much compatible with the idea of free will. Thus: in our lifetimes we have seen science —or parts of the scientific community, with the rest to follow in tow— reclassify free will from “illusion” to “likely reality”; the question of your own role in your future, of humanity’s role in the future of civilization, has been answered differently just within the past few decades.

No more profound question can be imagined for human endeavor, yet we have an inescapable conclusion: our phenomenologically obvious sense that we choose, decide, change, perpetually construct the future was for centuries contradicted falsely by “true” science. Prigogine’s work and that of his peers —which he calls a “probabilizing revolution” because of its emphasis on understanding unstable systems and the potentialities they entail— introduces concepts that restore the commonsensical conceptions of possibility, futurity, and free will to defensibility.

If one has read the tortured thinking of twentieth-century intellectuals attempting to unify determinism and the plain facts of human experience, one knows how submissive we now are to the claims of science. As Prigogine notes, we were prepared to believe that we, “as imperfect human observers, [were] responsible for the difference between past and future through the approximations we introduce into our description of nature.” Indeed, one has the sense that the more counterintuitive the scientific claim, the more eager we are to deny our own experience in order to demonstrate our rationality.

This is only degrees removed from ordinary orthodoxies. The point is merely that the very best scientific theories remain misconceptions, and that where science contradicts human truths of whatever form, it is rational to at least contemplate the possibility that science has not advanced enough yet to account for them; we must be pragmatic in managing our knowledge, aware of the possibility that some truths we intuit we cannot yet explain, while other intuitions we can now abandon. My personal opinion, as you can imagine, is that we take too little note of the “truths,” so to speak, found in the liberal arts, in culture.

It is vital to consider how something can be both true and not true in order to understand science and its limitations, and even more the limitations of second-order sciences (like the social sciences). Newton’s laws were incredible achievements of rationality, verified by all technologies and analyses for hundreds of years, before they were unexpectedly exposed as deeply flawed ideas valid only within a limited domain, ideas which, taken in total, yield incorrect predictions and erroneous metaphorical structures for understanding the universe.

I never tire of quoting Karl Popper’s dictum:

Whenever a theory appears to you as the only possible one, take this as a sign that you have neither understood the theory nor the problem which it was intended to solve.

It is hard but necessary to have this relationship with science, whose theories seem like the only possible answers and whose obsolescence we cannot envision. A rational person in the nineteenth century would have laughed at the suggestion that Newton was in error; he could not have known about the sub-atomic world or the forces and entities at play in the world of general relativity; and he especially could not have imagined how a theory that seemed utterly, universally true and whose predictive and explanatory powers were immense could still be an incomplete understanding, revealed by later progress to be completely mistaken about nearly all of its claims.

Can you imagine such a thing? It will happen to nearly everything you know. Consider what “ignorance” and “knowledge” really are for a human, what you can truly be certain of, how you should judge others given this overwhelming epistemological instability!