
@mills / metaismurder.com

If I didn’t want to compete, I wanted even less to make new rules about what constitutes victory. I would want what everyone else wanted, even if I couldn’t attain it.

Faye, in Rachel Cusk’s Transit.

These sentences, like much of the text of Outline, Transit, and Kudos, come from the internal monologue of the narrator, Faye. She does not speak them aloud or to other characters; they are private moral commitments within novels about private terrains. Faye’s promised refusal to “make new rules” for herself will, presumably, be measured by no one but Faye; moreover, the detectably deep system of considerations that leads her to this principle might never be exteriorized —discussed out loud or turned into a “post” from which others might string fences— at all. It is never elucidated in the novels, although it also seems to be everywhere in them. So too with “the world,” if one can put it that way; these novels are plainly contemporary and reflect life today. Yet they never strain to integrate the trivia of our kaleidoscopic present or fall for the temporary or transient; they seem both contemporary and easily enduring.

There is much to admire, in Faye and in Cusk. Cusk reminds me a bit of Kundera at his very best; in some ways, I think she may be a better pure novelist than he is, more able to infuse, suspend and sustain, imply and elide, to use subtlety and impression and description for both aesthetic and thematic purposes. In any event, Kundera once wrote: “A novel that does not uncover a hitherto unknown segment of existence is immoral. Knowledge is the novel’s only morality.”

For some time, I worried that our civilizational rate of change was approaching something like the sound barrier: new technologies, cultures, values, arts come so quickly that the fundamental processes which animate art encounter turbulence, cannot keep up, lose lift. Among many other things: each generation’s artists observe how new configurations of human existence reverberate through individual psychologies, integrate and extend these observations over time and with reflection, and then recreate worlds in their work to explore the “hitherto unknown.”

At some pace, there simply wouldn’t be enough time! Vast changes now come within, not between, generations; a novelist might scarcely have time to familiarize herself with the relevant dynamics, stories, cultures, brands, even user interfaces before they’ve been turned into the dead past. (And the past dies faster and more finally than ever before; we feel less and less kinship with even the people of the recent past. A novel set before cellphones is a period piece!).

I should say: I am a poor reader and have no idea whether any of these concerns had merit; for all I know, there is no acceleration whatsoever —it may be a delusion associated with aging— or novelists in the thousands have no difficulty overcoming the torrent of novelties in the trillions. But for me, Cusk’s novels are not only a literary pleasure but a relief. In that effortlessly sublimated yet present complexity she demonstrates a capacity to personalize, individualize, novelize whatever the present brings. Someone will capture what life is like, at least, whatever else happens; someone will help us understand what we are now.

For the phrase “making new rules for victory” is ludicrously dense with observations about our time. I felt, as they say, seen. My entire bratty, cowardly life has been a constant, resentful, creative elaboration of “new rules for victory.” At every turn, I “discovered” that what those around me valued, believed, cared about was “stupid” and often “evil,” while my alternative system of values was correct and, coincidentally, affirmed that I was “good” in whatever arbitrary sense mattered to me at the time. Not only have I spent my life in this narcissistic enterprise, but so too, so far as I can tell, have many others. It seems difficult to avoid.

Engaged in the most quotidian matter —she is looking for an apartment— Faye refuses to do it. Neither will she prevail in any way thanks to this choice: she cannot afford the places everyone wants, and no one will reward her for her unseen stance. But she’d rather lose than lie to herself. It seems to me a radical but appropriately unhyped heroism; and what heroism is more needed today than for individuals to prefer losing to lying to themselves? (And to refrain from self-aggrandizement if they do?).

Thus: short, simple sentences about a wholly ordinary moment touched both some of my deepest psychological anxieties —I am now worried about nearly all of my beliefs again— and traced a line to some of the broad dynamics of our time. Each of us, if we’re honest, knows how often we make new rules to console ourselves, to lie to ourselves; each of us, if we pay attention, sees that atop Maslow’s hierarchy, our semantic and rhetorical culture-sphere is a battleground of meme-ideologies whose primary function is to provide new rules for victory, all ready-made. You may already be a winner.

One can argue that her position —if it were a position, and I do not think it is or should be— is communitarian: the elevation of your preferences over the general scrum is immoral. But I don’t detect the superiority one would expect if that were her attitude. Faye doesn’t want to self-justify, but she’s not trying to persuade others not to. Faye is reacting to the roaring noise of all that justification —never have Nietzsche’s reductive remarks about ressentiment seemed more descriptive than when reviewing online cultures—, crafting her stringent principles to align with a quiet, personal fixation on something like inner honesty. We hurt others when we’re deluded, after all; we cannot regard self-delusion as innocent. It is our responsibility to understand ourselves, even if it means —and I think it often must— that we lose.


Pop Sanctimony

I've been surprised —stupidly— at the seeming resurgence of moralizing in pop culture (on all parts of all spectra). I once felt that by aesthetic chance, pop culture had stumbled onto the position asserted by my favorite novelist, Milan Kundera:

Suspending moral judgment is not the immorality of the novel; it is its morality. The morality that stands against the ineradicable human habit of judging instantly, ceaselessly, and everyone; of judging before, and in the absence of, understanding. From the viewpoint of the novel’s wisdom, that fervid readiness to judge is the most detestable stupidity, the most pernicious evil.

For Kundera, this position reflects an analytical claim that moral judgment is always a partialization, a reduction, a kind of error. We do not see others clearly, nor ourselves, nor our time or context —there is just too much we cannot know, too much "causal density"— but we have an instinct to judge "before, and in the absence of, understanding." In art, we readily refrain from this judgment, as the interiority of a character on screen or page renders her human whether she does good or evil; this is why political art is trash, whereas art which humanizes those we considered inhuman is a special achievement. All of this may seem plainly obvious to us when we consider others and other times in history but we rarely adopt this stance when developing our opinions about, say, others' religiosity or politics.

It should go without saying that I didn't believe that pop culture reflected these particular ideas. When I called it "aesthetic chance," I meant: it became "uncool" at some point to moralize; sanctimony was a "bad look"; musicians and directors and cultural figures of all kinds resisted sounding like preachers and op-ed columnists, resisted talking about moral categories, resisted shaming others and bragging about their positions; it seemed shallow to do so. There were exceptions, but by and large art was felt to be more important than politics, and truer as well.

At best, we can say: it was uncool because it seemed grotesque for individuals to posture morally, grotesque when someone mysteriously felt the confident superiority required to lecture and hector their cohort, their generation, other generations, all preceding humans in history. That is: self-satisfied superiority seemed gross. More plausibly, we can say: it became unpopular to moralize because that's what conservatives were thought to do, while liberals were seen as likely to take vaguely cultural-relativist positions, and to like artists and politicians who were themselves morally complex. 

But it no longer appears to strike us as grotesque at all; we rather adore it when someone offers an obvious moral position as though it is a revolutionary insight about which they should feel pride. This is especially strange because moral truths strike us as truths, requiring no special work or intelligence or depth. If we want to feel pride, we ought to look to our persuasive effectuality: are we changing minds, helping others to an understanding of what we think is objectively morally true? If we're not effective —if we have no impact— being right is not an occasion for pride.

The competitive moralizing dynamic of online communities in particular favors those who can find straw men or the morally dim and contrast themselves with them through heated rhetoric, which has little to do with persuasion in Alfred Polgar's sense:

To reform an evildoer, you must before anything else help him to an awareness that what he did was evil. With the Nazis this won’t be easy. They know exactly what they’re doing: they just can’t imagine it.

But we do not look for those who seek to illuminate the imaginations of those with whom they disagree, because "helping…to an awareness" is hard work requiring patience and compassion for all. Instead, we want angry pundits; we love when people moralize aggressively, like campus preachers screaming about how evil everyone and everything else is. And we love narratives that allow us to moralize from outside the system, viewing "culture" or "Americans" from an omniscient position: the intellectual's great fetish is that fun Hegelian status as "observer of all things," something Kierkegaard understood and rightly mocked. Such systems are popular today: for example, pop-psychoanalysis of "why others believe what they do," whereas we do not suspect ourselves of believing what we do because of reducible feelings (rather, we think, we believe what's right; otherwise, why would we care whether anyone agrees?).

Online, moralizing is part of the scrum of communities attempting to define and protect their values; it's worst on Twitter because that's where communities are allowed no distinct spaces but are instead mixed like armies on a battlefield; they have no choice but to fight. Nevertheless, it's been amazing to me to see how ubiquitous sanctimony has become, how social our moralizing has become, how relentless our politicization and judgment. Kundera again:

In our time people have learned to subordinate friendship to what’s called “convictions.” And even with a prideful tone of moral correctness. It does take great maturity to understand that the opinion we are arguing for is merely the hypothesis we favor, necessarily imperfect, probably transitory, which only very limited minds can declare to be a certainty or a truth. Unlike the puerile loyalty to a conviction, loyalty to a friend is a virtue—perhaps the only virtue, the last remaining one.

I don't think it takes any great maturity; I think the slightest familiarity with the human mind and human history makes obvious that judges tend to be asses. There is no self-flattery like privileging your own moral conceptions over those of others, because moral elevation is something we all seek. Humans in general like to know that they are "good," especially in the sense of being "in good standing" with their community; there is no cheaper and easier way to accomplish this than by othering, which is why humans do it. Narratives that make othering easier will always be popular online. And the more "intellectual" they seem, the better.

But it does shock me to see what a religious world I inhabit online: a world of dogmas and excommunications, of Huguenots and Catholics, of certainties and casus belli, of inquisitors and the church-goers eager to revel in their (usually) flattering verities and then spill out into the streets parroting them as revelation. Everyone wants to tear down temples. The moralization of artifacts from apps to art reminds me of nothing so much as those religious texts which assert God's position on minor matters like tattoos and silverware; and searching the world and all who live in it and their utterances for evidence of whether they conform to one's beliefs has seemed to me wildly anachronistic, which, as I said, only shows how stupid I am. Form never changes, only content.


Reach and Accuracy

As one moves between the general and the specific, two things that vary are (1) accuracy and (2) reach. Accuracy: how well your policy, design, idea, opinion describes the real facts of reality, how few exceptions and edge cases there are. Reach: how many facts of reality or units or instances your policy, design, or idea describes or accounts for. Accuracy is related to depth and focus; reach is related to breadth and speed.

Both are important, and they're in necessary conflict.

This conflict is a function of information density: to communicate or write anything with meaning concisely —human language, artwork, or computer code— requires various forms of abstraction, all of which exchange information-denser realities for more processable, less informationally dense metaphors or representations we can work with. It may also be fair to say that this kind of "density" is really more a matter of physical time and processing limitations (again whether we mean speech or any other information-bearing thing) than real physical density. Because we mostly communicate in language, we often stack abstractions, too; being linguistically concrete doesn't mean we're being conceptually concrete, or describing reality accurately at all.

Reach is also about leverage: one (or few) solution(s) or idea(s) for many units. Accuracy is about eliminating cases of error. With ideas, policies, and designs, accuracy and reach are likely to be in conflict, unless they're based on true explanatory models. This is because a truly explanatory model predicts every case, accounts for every case, but not with an abundance of information. An explanatory model can be light, in information density terms, but have incredible reach. It doesn't describe reality; it mirrors in its internal relations —among its components— the relations of the entities in reality it describes. Moreover, its accuracy is not so much a matter of technology in observation or control but of its hewing to reality. We do not need a larger sample size or better microscopes to understand why lightbulbs work, nor do we have any real margin of error in our conclusions about them.

There's enormous and increasing pressure on humans to achieve reach in their ideas, designs, morals, and policies. Despite having evolved in small groups with small-group habits of cognition and emotion, we now live in a global group and must coordinate hugely complex societies. The problems we face are problems at scale. Thus: reach is mandatory. A taxation, software design, or criminal justice solution that cannot be deployed at scale isn't useful to us anymore; indeed, even opinions must scale up. For personal, political, governmental, commercial, literary, expediency-oriented, and many other reasons, we must have solutions that work for more human (H) units / instances, and H is always increasing (even as every sub-member of H is determined to be respected according to her or his unpredictable inimitability, range of action, moral agency, autonomy, freedom, etc.).

This pressure often inclines people to accept induction- or correlation-based models or ideas, which are inaccurate to varyingly significant degrees, in lieu of explanatory models. That is: in many situations, we'll accept aggregates, groups, central plans, reductions, otherings, dehumanizations, short-hand-symbols, and so on because (1) they serve our ends, sometimes without any costs or (2) we have nothing else. In order to have explanations with reach in areas where we have no models, we commit philosophical fraud: we transact with elements and dynamics we cannot predict or understand and we hope for the best (better, it seems, than admitting that "I don't know"). How we talk about speculative models, reductive schema, and plural entities —peoples, companies, generations, professions, events even— reveals a lot about how much we care for epistemological accuracy. And not caring about it is a kind of brutality; it means we don't care what happens to the lives inaccurately described, not captured by our model, not helped by our policies, unaided by our designs, not included in our normative plan.
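
To make that trade concrete, here is a minimal sketch in Python (my own toy illustration, with invented numbers, not anything drawn from the text): a single aggregate figure has enormous reach, describing every member of a group at once, while misdescribing precisely the individuals who deviate from it.

    # Toy illustration: an aggregate "model" with reach but limited accuracy.
    # The figures are invented; the point is only that one summary number
    # describes everyone at once while misdescribing particular individuals.
    incomes = [31_000, 33_000, 29_500, 30_800, 950_000]

    mean_income = sum(incomes) / len(incomes)         # one number, maximal reach
    errors = [abs(x - mean_income) for x in incomes]  # per-person inaccuracy

    print(f"aggregate model: everyone earns ~{mean_income:,.0f}")
    for actual, err in zip(incomes, errors):
        print(f"actual {actual:>9,}   error of the aggregate {err:>11,.0f}")

The exception is exactly what the aggregate erases; whether that erasure matters is the moral question raised above.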

In politics, design, art, philosophy, and even ordinary daily thinking, being consciously aware of this tension, and of the pressure to exchange accuracy for reach, is as important as recognizing the difference between "guessing" and "knowing." Otherwise, one is likely to adopt ideas with reach without recognizing the increased risk of inaccuracy that comes with it. One will be tempted to ignore the risk even if one knows it, tempted by how nice it is to have tidy conceptions of good and evil, friend and foe, progress and failure.

Reach is innately personally pleasing in part because it privileges the knower, whose single thought describes thousands or millions of people, whose simple position circumscribes civilization's evolution, the history of religion, the nature of economics, the meaning of life. Exceptions be damned! But in general, if an idea has significant reach, it must be backed by an explanatory model or it will either be too vague or too inaccurate to be useful. And if it's a political or moral idea, the innocent exceptions will be damned along with the guilty. Hence the immorality of reduction, othering, and inaccurate ideas whose reach makes them popular.

Note: the term "reach" and many other ideas in this note come from the physicist David Deutsch. For more on this theme, see this astute remark from Second Balcony, or this.


Genera

I am an allergic and reactive person, most outraged by the sorts of intellectual atrocities I myself commit. To say this is merely to assert the personal applicability of the now-hoary Hermann Hesse adage:

"If you hate a person, you hate something in him that is part of yourself. What isn't part of ourselves doesn't disturb us."

Hesse is a figure whom I regard with suspicion, and again: it seems to me likely that this is due to our mutual habits of appropriation, though whereas he recapitulates Eastern religious ideas in semi-novelistic form for his audience of early 20th-century European exoticists, I recapitulate in semi-essayistic form 20th-century European ideas from Kundera, Gombrowicz, Popper, and others. In this as in all cases, it is the form and not the content that matters.

To describe someone formally, we might say: "She is certain of her rightness, intolerant of those who disagree with her." But to describe the content is necessarily to stray from the realm of the psychological —which is enduring, for the most part— into the realm of ephemera masquerading as philosophy: "She is for X, fighting against those who believe Y." You and I have opinions about X and Y; we will judge her according to those opinions, even though in the fullness of time an opinion about X or Y will matter as much as the position of a farmer on the Huguenot question. History does not respect our axes and categories, although we believe as ever that they are of life-and-death import. History looks even less kindly on the sense of certainty which nearly all of us attain about our beliefs.

Art and understanding are concerned with forms; politics and judgment are concerned with content. I think of them algebraically: what can be described in variables has greater range, explanatory power, and reach than the specific arithmetic of some sad concluded homework problem.

Some of my smartest friends love Hesse. When I read him I am often struck by the familiarity of his ideas; I cannot tell whether I learned them through other authors who read him, through ambient culture, or through myself, my own reflections, but I know that they often seem to me to be apt instantiations of ideas nearly folklorish in nature, as is the case with the axiom quoted above. Perhaps it is simply that other moral principles lead to the same conclusion, so that Hesse seems as though he arrives at the end, rather than the middle, of the inquiry.

One such principle is well phrased by Marilynne Robinson in her essay "When I was a Child," in her collection When I Was a Child I Read Books:

"It may be mere historical conditioning, but when I see a man or a woman alone, he or she looks mysterious to me, which is only to say that for a moment I see another human being clearly."

The idea that a human seen clearly is a mystery is anathema to a culture of judgment —such as ours— which rests on a simple premise: humans can be understood by means of simple schema that map their beliefs or actions to moral categories. Moreover, because there are usually relatively few of these categories, and few important issues of discernment —our range of political concerns being startlingly narrow, after all— humans can be understood and judged at high speed in large, generalized groups: Democrats, Republicans, women, men, people of color, whites, Muslims, Christians, the rich, the poor, Generation X, millennials, Baby Boomers, and so on.

It should but does not go without saying that none of those terms describes anything with sufficient precision to support the kinds of observations people flatter themselves making. Generalization is rarely sound. No serious analysis, no serious effort to understand, describe, or change anything can contain much generalization, as every aggregation of persons introduces error. One can hardly describe a person in full, let alone a family, a city, a class, a state, a race. Yet we persist in doing so, myself included.

Robinson continues:

"Tightly knit communities in which members look to one another for identity, and to establish meaning and value, are disabled and often dangerous, however polished their veneer. The opposition frequently made between individualism on the one hand and responsibility to society on the other is a false opposition as we all know. Those who look at things from a little distance can never be valued sufficiently. But arguments from utility will never produce true individualism. The cult of the individual is properly aesthetic and religious. The significance of every human destiny is absolute and equal. The transactions of conscience, doubt, acceptance, rebellion are privileged and unknowable…"

There is a kind of specious semi-rationalism involved in what she calls "utility": the rationalism that is not simply concerned with logical operations and sound evidentiary processes but also with excluding anything it does not circumscribe. That is to say: the totalizing rationalism that denies a human is anything more than her utility, be it political or economic or whatever. Such rationalism seems intellectually sound until one, say, falls in love, or first encounters something that resists knowing, or reads about the early days of the Soviet Union: when putatively "scientifically known historical laws of development" led directly to massacres we can just barely admit were a kind of error, mostly because murder seems unsavory (even if murderously hostile judgment remains as appealing to us as ever).

One of the very best things Nietzsche ever wrote:

"The will to a system is a lack of integrity."

But to systematize is our first reaction to life in a society of scale, and our first experiment as literate or educated or even just "grown-up" persons with powers of apprehension, cogitation, and rhetoric. What would a person be online if he lacked a system in which phenomena could be traced to the constellation of ideas which constituted his firmament? What is life but the daily diagnosis of this or that bit of news as "yet another example of" an overarching system of absolutely correct beliefs? To have a system is proof of one's seriousness, it seems —our profiles so often little lists of what we "believe," or what we "are"— and we coalesce around our systems of thought just as our parents did around their political parties, though we of course consider ourselves mere rationalists following the evidence. Not surprisingly, the evidence always leads to the conclusion that many people in the world are horrible, stupid, even evil; and we are smart, wise, and good. It should be amusing, but it is not.

I hate this because I am doing this right now. I detest generalization because when I scan Twitter I generalize about what I see: "people today," or "our generation," I think, even though the people of today are as all people always have been, even though they are all just like me. I resent their judgments because I feel reduced by them and feel reality is reduced, so I reduce them with my own judgments: shallow thinkers who lack, I mutter, the integrity not to systematize. And I put fingers to keys to note this system of analysis, lacking all integrity, mocking my very position.

I want to maintain my capacity to view each as a mystery, as a human in full, whose interiority I cannot know. I want not to be full of hatred, so I seek to confess that my hatred is self-hatred: shame at the state of my intellectual reactivity and decay. I worry deeply that our systematizing is inevitable because when we are online we are in public: that these fora mandate performance, and worse, the kind of performance that asserts its naturalness, like the grotesquely beautiful actor who says, "Oh, me? I just roll out of bed in the morning and wear whatever I find lying about" as he smiles a smile so practiced it could calibrate the atomic clock. Every online utterance is an angling for approval; we write in the style of speeches: exhorting an audience, haranguing enemies, lauding the choir. People "remind" no one in particular of the correct ways to think, the correct opinions to hold. When I see us speaking like op-ed columnists, I feel embarrassed: it is like watching a lunatic relative address passers-by using the "royal we," and, I feel, it is pitifully imitative. Whom are we imitating? Those who live in public: politicians, celebrities, "personalities."

There is no honesty without privacy, and privacy is not being forbidden so much as rendered irrelevant; privacy is an invented concept, after all, and like all inventions must contend with waves of successive technologies or be made obsolete. The basis of privacy is the idea that judgment should pertain only to public acts —acts involving other persons and society— and not the interior spaces of the self. Society has no right to judge one's mind; society hasn't even the right to inquire about one's mind. The ballot is secret; one cannot be compelled to testify or even talk in our criminal justice system; there can be no penalty for being oneself, however odious we may find given selves or whole (imagined) classes of selves.

This very radical idea has an epistemological basis, not a purely moral one: the self is a mystery. Every self is a mystery. You cannot know what someone really is, what they are capable of, what transformations of belief or character they might undergo, in what their identity consists, what they've inherited or appropriated, what they'll abandon or reconsider; you cannot say when a person is who she is, at what point the "real" person exists or when a person's journey through selves has stopped. A person is not, we all know, his appearance; but do we all know that she is not her job? Or even her politics? 

But totalizing rationalism is emphatic: either something is known or it is irrelevant. Thus: the mystery of the self is a myth; there is no mystery at all. A self is valid or invalid, useful or not, correct or incorrect, and if someone is sufficiently different from you, if their beliefs are sufficiently opposed to yours, their way of life alien enough, they are to be judged and detested. Everyone is a known quantity; simply look at their Twitter bio and despise.

But this is nonsense. In truth, the only intellectually defensible posture is one of humility: all beliefs are misconceptions; all knowledge is contingent, temporary, erroneous; and no self is knowable, not truly, not to another. We can perhaps sense this in ourselves —although I worry that many of us are too happy to brag about our conformity to this or that scheme or judgment, to use labels that honor us as though we've earned ourselves rather than chancing into them— but we forget that this is true of every single other, too. This forgetting is the first step of the so-called othering process: forget that we are bound together in irreducibility, forget that we ought to be humble in all things, and especially in our judgments of one another.

Robinson once more:

"Only lonesomeness allows one to experience this sort of radical singularity, one's greatest dignity and privilege."

Lonesomeness is what we're all fleeing at the greatest possible speed, what our media now concern themselves chiefly with eliminating alongside leisure. We thus forget our radical singularity, a personal tragedy, an erasure, a hollowing-out, and likewise the singularity of others, which is a tragedy more social and political in nature, and one which seems to me truly and literally horrifying. Because more than any shared "belief system" or political pose, it is the shared experience of radical singularity that unites us: the shared experience of inimitability and mortality. Anything which countermands our duty to recognize and honor the human in the other is a kind of evil, however just its original intention.

We disparage ourselves endlessly, sometimes with reason… but more often, and more damningly, with a kind of black clarity of judgment that reaches right past all that we have or have not done, reaches past any insight or diagnosis that psychology can offer, and fingers us at the heart of what we are. Wrongness, call it. A stark and utter saturation of self: God’s most deep decree / Bitter would have me taste: my taste was me.

The poet Christian Wiman in My Bright Abyss; the final lines are from Gerard Manley Hopkins. "Self" here has a particular definition established in earlier chapters; it is a conception of individual existence which contrasts indifferently with the word "soul,"

a word that has become almost embarrassing for many contemporary people unless it is completely stripped of its religious meaning. Perhaps that’s just what it needs sometimes: to be stripped of its ‘religious’ meaning, in the sense that faith itself sometimes needs to be stripped of its social and historical encrustations and returned to its first, churchless incarnation in the human heart. That’s what the twentieth century was, a kind of windstorm-scouring of all we thought was knowledge, and truth, and ours —until it became too strong for us, or we too weak for it, and ‘the self replaced the soul as the fist of survival’ (Fanny Howe). Anxiety comes from the self as ultimate concern, from the fact that the self cannot bear this ultimate concern: it buckles and wavers under the strain, and eventually, inevitably, it breaks.

My Bright Abyss is dense with such astute and precise humanity —in its poems, both Wiman’s and those he quotes, and its prose descriptions of lived experience— that one’s own lack of religiosity seems hardly important, no more important than faithlessness in a cathedral of tremendous beauty or incredulity amidst Buddhist monks quietly and carefully transcribing their texts. It is certainly the best introduction to poetry I’ve read, but also the most universalizing account of belief:

To have faith is to acknowledge the absolute materiality of existence while acknowledging at the same time the compulsion toward transfiguring order that seems not outside of things but within them, and within you, not an idea imposed upon the world but a vital, answering instinct. Heading home from work, irritated by my busyness and the sense of wasted days, shouldering through the strangers who merge and flow together on Michigan Avenue, merge and flow in the mirrored facades, I flash past the rapt eyes and undecayed face of my grandmother, lit and lost at once. In a board meeting, bored to oblivion, I hear a pen scrape like a fingernail on a cell wall, watch the glasses sweat as if even water wanted out, when suddenly, at the center of the long table, light makes of a bell-shaped pitcher a bell that rings in no place on this earth. Moments, only, and I am aware even within them, and thus am outside of them, yet something in the very act of such attention has troubled the tyranny of the ordinary, as if the world at which I gazed gazed at me, as if the lost face and the living crowd, the soundless bell and the mind in which it rings, all hankered toward—expressed some undeniable hope for—one end.

Quoting any part of the book is acutely frustrating; as Andrew Sullivan wrote after confessing that he read it “in a great rush of exhilaration” that kept him awake into the night, “It is no exaggeration to say that I’ve waited my entire adult life to read a book like this. It is impossible to summarize or even categorize.” And so it is. Perhaps the clearest thing I can say about it is that it seems to come from a time before the degradation and quiet collapse of art and literature, before noncommercial and nonsocial meaning itself was rendered absurd. Sullivan compares it to Simone Weil’s Gravity and Grace, and Weil —a hero of mine in every sense— also seemed rather like an emissary from a vastly more serious and honest time. The introspection on which Wiman and Weil alike base much of their work has nothing of the performativity that ensnares our introspection, to note one difference among many. Sullivan again:

If I were to suggest why, whether believer or not, you should read My Bright Abyss, it would be because Wiman asks the most difficult questions I can imagine about life and death with unflinching honesty.

For me, the caliber, depth, and intensity of his honesty is a bracing artistic achievement rare if not absent among contemporary writers, into whose most intimate prose creeps a pathetic public deference, a political sort of compromise, as though while making love they are wondering how their form will be judged, pretending to enjoy that which they do not. They are oppressed by the imperative to conform to a zeitgeist which insists it is not fashion but moral truth, as though any era is anything but transiently mistaken, soon to be misunderstood by generations who judge it ethically wanting, intellectually primitive, socially disgraceful. Do you think you are not a slaveholder, in your way? Do you think you will carry the approval of your peers with you into the dark earth?

I peer at Wiman’s sentences, trying to determine how he managed to get off stage in order to think and write just so, how he managed to create without hearing the carping of the crowds we all now carry. I will never not hear them, never not seek to anticipate them and defend myself. Wiman quotes Rilke’s Seventh Duino Elegy, which I read and ignored in school:

Truly being here is glorious. Even you knew it, you girls who seemed to be lost, to go under –, in the filthiest streets of the city, festering there, or wide open for garbage. For each of you had an hour, or perhaps not even an hour, a barely measurable time between two moments –, when you were granted a sense of being. Everything. Your veins flowed with being. But we can so easily forget what our laughing neighbor neither confirms nor envies.

It is hard to keep a sense of oneself, but even in the filthiest streets of the city our veins flow with being. My Bright Abyss helps me remember what matters and what does not.


Saints Augustine and Monica, Ary Scheffer, 1854.


David Foster Wallace & Trudy

For many years since reading A Supposedly Fun Thing I'll Never Do Again, I've wondered irritably: was David Foster Wallace mocking real people in his essay on the cruise-ship experience? Specifically, this passage stayed with me:

"My favorite tablemate is Trudy, whose husband…has given his ticket to Alice, their heavy and extremely well-dressed daughter… every time Alice mentions [her boyfriend Patrick, Trudy] suffers some sort of weird facial tic or grimace where the canine tooth on one side of her face shows but the other side's doesn't. Trudy is fifty-six and looks –and I mean this in the nicest possible way– rather like Jackie Gleason in drag, and has a particularly loud pre-laugh scream that is a real arrhythmia-producer…"

Because Wallace returns to and discusses this group repeatedly and seems fond of them, it was hard to understand how he'd simultaneously savage them with sardonic insults like these; to be clear: he is mocking them for their appearance, the sound of their laughter, the personalities of their children, etc., in a national publication.

I was often told that Wallace was surely using an amalgam of characters, or even entirely conjured ones, despite the verite nature of the essay; these barbs, after all, are hard to square with the ethics expressed elsewhere in his work, and seem difficult to justify from the reader's or writer's perspective.

Nevertheless, it turns out that, in fact, he was mocking real people. He was asked about it long ago in “There’s Going To Be the Occasional Bit of Embellishment”: David Foster Wallace on Nonfiction, 1998, Part 3, an interview with Tom Scocca at Slate. The relevant portion is below:

Q: Also when you’re writing about real events, there are other people who are at the same events. Have you heard back from the people that you’re writing about? Trudy especially comes to mind—
DFW: [Groans]
Q: —who you described as looking like—
DFW: That, that was a very bad scene, because they were really nice to me on the cruise. And actually sent me a couple cards, and were looking forward to the thing coming out. And then it came out, and, you know, I never heard from them again. I feel—I’m worried that it hurt their feelings.
The. Thing. Is. Is, you know, saying that somebody looks like Jackie Gleason in drag, it might not be very nice, but if you just, if you could have seen her, it was true. It was just absolutely true. And so it’s one reason why I don’t do a lot of these, is there’s a real delicate balance between fucking somebody over and telling the truth to the reader.

Scocca does not press him on what sort of truth an insulting analogy is; in my opinion, it is a low order of truth, at the absolute best a physical description that could have been achieved in a less derisive way; that is: it is not a meaningful enough truth to matter much. But more importantly: there is a way to describe Trudy that isn’t a punchline. (The notion that there isn't would reflect a total poverty of literary imagination).

Note that Wallace himself equivocates about the utility of the analogy:

DFW: I wasn’t going to hurt anybody or, you know, talk about anybody having sex with a White House intern or something. But I was going to tell the truth. And I couldn’t just so worry about Trudy’s feelings that I couldn’t say the truth. Which is, you know, a terrific, really nice, and not unattractive lady who did happen to look just like Jackie Gleason in drag.
Q: Maybe if you’d emphasized that it was not in an unattractive way. Which is sort of a hard thing to picture.
DFW: Actually the first draft of that did have that, and the editor pointed out that not only did this waste words, but it looked like I was trying to have my cake and eat it too. That I was trying to tell an unkind truth but somehow give her a neck rub at the same time. So it got cut.
Q: But you actually did want to have your cake and eat it too. Not in a bad way.
DFW: I’m unabashed, I think, in wanting to have my cake and eat it too.

I think he ought to have been a little abashed by the proximity of phrases like “I wasn’t going to hurt anybody” and “I couldn’t just so worry about Trudy’s feelings” and “not unattractive” and “Jackie Gleason in drag.” So close to one another, they aren't coherent.

Even Scocca has to note that it’s hard to picture someone looking like Jackie Gleason in drag yet not being unattractive. This means it is a poor analogy, a bad description. Wallace wants to convey that she looks a certain way and is not unattractive; instead, he conveys that she is maximally unattractive and makes a punchline of it, then says it’s for the “truth” before ambivalently wishing it didn’t have to be this way in writing (which it doesn’t).

It is a parting amusement (and a reminder of the 1990s) that Wallace asserts that he would never "talk about anybody having sex with a White House intern…but I was going to tell the truth"; eager to establish his bona fides as a reputable thinker who supports the right politics, Wallace seems not to consider very clearly the relative value of these two disclosures:

  1. That a sitting US president cheated on his wife with an intern employed by the government, then lied about it to a country that —however much this pains me and Wallace alike— wants to moralistically examine and judge the private lives of its elected figures and has every right to do so, as it is the people and this is a democracy (to avoid confusion: I wish America were more like France, indifferent to the private affairs of public citizens; but that is my wish, not the wish of most of my fellow citizens, to whom journalists are theoretically beholden)
  2. That a friendly, ordinary private citizen was overweight, ugly, had an awful laugh, and made faces at her heavy-set daughter whenever the latter mentioned her boyfriend.

It's hard for me to understand the reasoning he must have employed in deciding that the first is either unimportant or merits the protections of discreet privacy, supported by strangers, while the second —that a woman and her daughter aren't attractive— is important in light of the imperatives of journalistic truth!

(Originally answered on Quora).


Landscapes by Frank Walter (1926-2009), shared here for use in escaping oneself and touching the ground of being, or being consoled amidst all the frothing, confused reactivity of culture by whatever abides, or some such.


Description of a Struggle

If this piece gives you concerns about my viability as an employee, renter, applicant, neighbor, etc., please read this disclaimer / claimer.

I have an almost technical interest in attempting to describe the subjective experience of certain aberrant mental phenomena. Apart from any broader concerns and without concluding anything from it, then, here is an attempted accounting of what one might call a "breakdown" or an episode, an instance of bipolar collapse. For those interested: there was no interruption in my medication or treatment, but I'd had insufficient sleep the night before and some personal difficulties had catalyzed a terribly unhealthy mood.

I retreated into the bathroom, shut the door, and turned out the lights; I was very upset about many things, about all things; whatever thoughts formed were either dark and horrible at the outset or were pulled towards darkness very quickly. From one subject to the next: fears about the future, regrets about the past, guilt about my own moral failures, self-loathing because of my intransigent faults, fury at innumerable persons, shame at the force of my hatred and bitterness, and exhaustion with the perpetual systematic failure of my entire mind: personality, cognition, memory, emotion, will. In a few seconds I would cycle through these thoughts, which lead into and depart from one another, in an accelerating spiral that unified, separated, and then recomposed these threads again and again; it was rapid and repetitive.

This state is steady enough. It hurts very much, and I often sob into the floor, but it is not acute; it is simply painful to be filled with so much hatred for oneself, and to have this hatred permeate through one's entirety: into one's childhood memories, into one's aesthetic sensibilities, into one's sense of ethics. Because people were sleeping, I attempted mainly to cry noiselessly.

Suddenly, things grew vastly worse. I felt as though I had fallen backwards into a void at the absolute core of my mind, as though I had dropped through the variously false and detestable strata of my being into the reality of myself: nothingness, blackness, an abhorred vacuum around which swirled thoughts now far too fast to track, record, or resist. And I did indeed fall backwards, into the bathtub, because I felt exposed outside of it. I curled into myself and opened my mouth and screamed silently, my wet face draining from so many places that I worried as I gasped that I would aspirate tears, spit, snot.

Here is what I saw:

  • In the blackness of myself, I could see that my thoughts were not myself at all: my self is only a nothingness that exists in a state of pure terror and hatred, and my thoughts rotate around it as debris in a tornado. My thoughts were imbecilic, disgusting, vicious, superficial, detestable, but by this point I could no longer stay with them long enough to hate them. They distracted me, but I couldn't attend to them. I said in my mind: "Oh god, oh god, oh god, nothing, nothing, nothing; oh god, nothing, nothing, oh god, I'm nothing, it's nothing, there's nothing, god, god."
  • Periodically I would see what I assume was a phosphene, and it would transform into something real; I saw a glowing purple shape become the sun, and the sun became the blond hair I had in childhood. And I realized that I had murdered that boy, had murdered my own boyhood self, had destroyed this innocent child, and I ground my teeth to silence myself, as I wanted to scream so loud that I would tear myself apart, would explode in a bloody spray. I was sick with guilt and fear; I had nothing inside myself any longer; I felt I had betrayed myself, had orphaned myself when I needed someone most. I heard in my mind: "Why did I kill him? Oh god, he needed someone, he needed someone, why did I kill him, I've killed him, oh god, I've killed him."
  • I was seized with a desire to gain physical access to and destroy my brain, an urge I felt in childhood when I had severe headaches. I grasped my hair and attempted to pull it out; I wanted to rip my scalp open and reach into my skull and destroy my mind, scramble and tear apart this malevolent and pathetic apparatus with my fingers, rip out the guts of my whole nightmare self. I couldn't get my hair out, hated myself for it, lost the thread of this thought, and resumed my silent shrieking and sobbing.

I thought of my mother and my father, and I thought of Abby, but only for flashes: nothing would remain, everything was immediately carried off in this great storm of shame, fear, rage, and sorrow. I wept and wept, incapable of extending myself through time: it was the brutality of the present that crushed me, the incessant re-setting of the scene: any effort to elaborate a saving thought, a consolatory or even therapeutic idea, was in vain; all things were carried away at once, disappeared from me, receded into distance. I thought only of my own destruction.

I hurt myself crying: an extraordinarily pathetic feeling overtook me as I cramped from pushing against the walls of the tub and I turned onto my back, looking upwards. Everything was slowing down, and I realized it: I felt as though I was being pulled upward through the same strata, back up to the "higher order" consciousness from which I had moments ago felt permanently alienated. It wasn't a happy feeling; it felt false, pointless. But it wasn't volitional, and within minutes I was out of the bathroom, pulling on my clothes to prepare for the day. It was Abby's company's summer picnic; they rented out a water park.


My father, his father, and his brother in the 1950s. It took me an inexcusably long time to realize how much of what I like about myself, how much of what enables the happiness or goodness I attain, I owe to him; that strange interference that can distort a daughter's perception of her mother has its counterpart between fathers and sons, everyone knows that; but it took me by surprise nevertheless how much I'd identified and appreciated those things that came from my mother while assuming that virtues, interests, and ideas which he gave me were my own inventions. I'm no longer quite so deluded.

Happy father's day, dad!


This photo was taken in 2003, when she was just a few years old. We had so many adventures over the years; I don't know how much less I might have lived, how much more closed I'd have been, had I not taken her home from the veterinary hospital where I worked. It was 2001, and she'd been found, hairless and bruised and infected with mange and scabies and worms, in Bayou St. John; they dropped her with us, but she was nearly feral. In taking care of her, I bonded with her and took her home over the reasonable objections of many there, who'd noted how damaged and neurotic she was.

Tonight, Abby and I pressed our wet faces to her head as a doctor euthanized Bayou. She was 13 years old, dying from a bleeding belly tumor, too weak to move anything but her eyes. She was always so tough and sweet, always my close companion. These past years in San Francisco were a dream for her, and I guess I'll try to hang on to that now that she's gone.

Here are some photos of her being wonderful. Aren't some of those fun? We were so much younger, and Louisiana was so green. And here are all the times I posted about her. I don't care about these words in the slightest; for some reason, I just want to share her with you, show you photos of how she played and ran. She was here, with me, for the happiest years of my life.


Miles Barger posted this wonderful image from The Neighbors, a photographic series by Arne Svenson of scenes in the windows of his Manhattan neighbors. The photographs seem to assert the primacy of unknowable interior spaces, those buried within decor and personality, deeper within ourselves than our names go, deeper than our uniquenesses, into those places where we are archetypes, reacting without will to dreams and fears.

The Church has become close to me in its distrust of man, and my distrust of form, my urgent desire to withdraw from it, to claim 'that that is not yet I,' which accompanies my every thought and feeling, coincides with the intentions of its doctrine. The Church is afraid of man and I am afraid of man. The Church does not trust man and I do not trust man. The Church, in opposing temporality to eternity, heaven to earth, tries to provide man with the distance [from] his own nature that I find indispensable. And nowhere does this affiliation mark itself more strongly than in our approach to Beauty. Both the Church and I fear beauty in this vale of tears, we both strive to defuse it, we want to defend ourselves against its excessive allure. The important thing for me is that it and I both insist on the division of man: the Church into the divine and the human component, I into life and consciousness. After the period in which art, philosophy, and politics looked for the integral, uniform, concrete, and literal man, the need for an elusive man who is a play of contradictions, a fountain of gushing antinomies and a system of infinite compensation, is growing. He who calls this "escapism" is unwise…

The irreligious Witold Gombrowicz articulating some of the reasons why even the incredulous might find credulity closer to their principles than many popular forms of unexamined, incoherently reductive materialism.


My mother doesn't care for mother's day, dislikes its manufactured manipulation of sentiment; she asks us not to do anything commercial for the occasion, so this is all I'll do: post these photos of her and note, only half-knowing what I mean, that the older I get the harder it is to think of her without falling all the way into the deepest parts of my own heart. I am not becoming her equal as I age; she constitutes the sky under which I grow, the sky whose scope exceeds the terrestrial horizons which limit my own vision, the sky beyond the mountains and above the clouds, the sky which surrounds the earth. I am thirty-two years old and I love her.


Free Will & the Fallibility of Science

One of the most significant intellectual errors educated persons make is in underestimating the fallibility of science. The very best scientific theories containing our soundest, most reliable knowledge are certain to be superseded, recategorized from “right” to “wrong”; they are, as physicist David Deutsch says, misconceptions:

I have often thought that the nature of science would be better understood if we called theories “misconceptions” from the outset, instead of only after we have discovered their successors. Thus we could say that Einstein’s Misconception of Gravity was an improvement on Newton’s Misconception, which was an improvement on Kepler’s. The neo-Darwinian Misconception of Evolution is an improvement on Darwin’s Misconception, and his on Lamarck’s… Science claims neither infallibility nor finality.

This fact comes as a surprise to many; we tend to think of science —at the point of conclusion, when it becomes knowledge— as being more or less infallible and certainly final. Science, indeed, is the sole area of human investigation whose reports we take seriously to the point of crypto-objectivism. Even people who very much deny the possibility of objective knowledge step onto airplanes and ingest medicines. And most importantly: where science contradicts what we believe or know through cultural or even personal means, we accept science and discard those truths, often wisely.

An obvious example: the philosophical problem of free will. When Newton’s misconceptions were still considered the exemplar of truth par excellence, the very model of knowledge, many philosophers felt obliged to accept a kind of determinism with radical implications. Given the initial state of the universe, it appeared, we should be able to follow all particle trajectories through the present and account for all phenomena through purely physical means. In other words: the chain of causation from the Big Bang on left no room for your volition:

Determinism in the West is often associated with Newtonian physics, which depicts the physical matter of the universe as operating according to a set of fixed, knowable laws. The “billiard ball” hypothesis, a product of Newtonian physics, argues that once the initial conditions of the universe have been established, the rest of the history of the universe follows inevitably. If it were actually possible to have complete knowledge of physical matter and all of the laws governing that matter at any one time, then it would be theoretically possible to compute the time and place of every event that will ever occur (Laplace’s demon). In this sense, the basic particles of the universe operate in the same fashion as the rolling balls on a billiard table, moving and striking each other in predictable ways to produce predictable results.

Thus: the movement of the atoms of your body, and the emergent phenomena that such movement entails, can all be physically accounted for as part of a chain of merely physical, causal steps. You do not “decide” things; your “feelings” aren’t governing anything; there is no meaning to your sense of agency or rationality. From this essentially unavoidable philosophical position, we are logically compelled to derive many political, moral, and cultural conclusions. For example: if free will is a phenomenological illusion, we must deprecate phenomenology in our philosophies; it is the closely clutched delusion of a faulty animal; people, as predictable and materially reducible as commodities, can be reckoned by governments and institutions as though they are numbers. Freedom is a myth; you are the result of a process you didn’t control, and your choices aren’t choices at all but the results of laws we can discover, understand, and base our morality upon.
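
The billiard-ball picture can be rendered as a toy computation (a sketch of my own, using nothing but constant-velocity motion; it is not drawn from any of the authors quoted here): given complete knowledge of an initial state and the rule that governs it, every later state follows and can be computed, and two runs from the same initial conditions cannot differ.

    # Laplace's demon in miniature: a deterministic toy universe in Python.
    # The state is a list of (position, velocity) pairs; the "law of nature"
    # is constant velocity. The initial conditions below are invented.
    def evolve(state, steps, dt=0.1):
        for _ in range(steps):
            state = [(x + v * dt, v) for x, v in state]
        return state

    initial_state = [(0.0, 1.0), (5.0, -0.5)]

    # Two "runs of the universe" from identical initial conditions agree exactly;
    # nothing in the evolution leaves any room for a choice.
    assert evolve(initial_state, 1000) == evolve(initial_state, 1000)
    print(evolve(initial_state, 1000))

On this picture, the determinist's claim is simply that the real universe differs from the toy only in the number of balls and the complexity of the law.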

I should note now that (1) many people, even people far from epistemology, accept this idea, conveyed via the diffusion of science and philosophy through politics, art, and culture, that most of who you are is determined apart from your will; and (2) the development of quantum physics has not in itself upended the theory that free will is an illusion, as the sort of indeterminacy we see among particles does not provide sufficient room, as it were, for free will.

Of course, few of us can behave for even a moment as though free will is a myth; if it were, there would be no reason for personal engagement with ourselves, no justification for “trying” or “striving”; one would be, at best, a robot-like automaton incapable of self-control but capable of self-observation. One would account for one’s behaviors not with reasons but with causes; one would be profoundly divested from outcomes which one cannot affect anyway. And one would come to hold that, in its basic conception of time and will, the human consciousness was totally deluded.

As it happens, determinism is a false conception of reality. Physicists like David Deutsch and Ilya Prigogine have, in my opinion, defended free will amply on scientific grounds; and the philosopher Karl Popper described how free will is compatible in principle with a physicalist conception of the universe; he is quoted by both scientists, and Prigogine begins his book The End of Certainty, which proposes that determinism is no longer compatible with science, by alluding to Popper:

Earlier this century in The Open Universe: An Argument for Indeterminism, Karl Popper wrote, “Common sense inclines, on the one hand, to assert that every event is caused by some preceding events, so that every event can be explained or predicted… On the other hand, … common sense attributes to mature and sane human persons… the ability to choose freely between alternative possibilities of acting.” This “dilemma of determinism,” as William James called it, is closely related to the meaning of time. Is the future given, or is it under perpetual construction?

Prigogine goes on to demonstrate that there is, in fact, an “arrow of time,” that time is not symmetrical, and that the future is very much open, very much compatible with the idea of free will. Thus: in our lifetimes we have seen science —or parts of the scientific community, with the rest to follow in tow— reclassify free will from “illusion” to “likely reality”; the question of your own role in your future, of humanity’s role in the future of civilization, has been answered differently just within the past few decades.

No more profound question can be imagined for human endeavor, yet we have an inescapable conclusion: our phenomenologically obvious sense that we choose, decide, change, perpetually construct the future was for centuries contradicted falsely by “true” science. Prigogine’s work and that of his peers —which he calls a “probabilizing revolution” because of its emphasis on understanding unstable systems and the potentialities they entail— introduce concepts that restore the commonsensical conceptions of possibility, futurity, and free will to defensibility.

If one has read the tortured thinking of twentieth-century intellectuals attempting to unify determinism and the plain facts of human experience, one knows how submissive we now are to the claims of science. As Prigogine notes, we were prepared to believe that we, “as imperfect human observers, [were] responsible for the difference between past and future through the approximations we introduce into our description of nature.” Indeed, one has the sense that the more counterintuitive the scientific claim, the more eager we are to deny our own experience in order to demonstrate our rationality.

This is only degrees removed from ordinary orthodoxies. The point is merely that the very best scientific theories remain misconceptions, and that where science contradicts human truths of whatever form, it is rational to at least contemplate the possibility that science has not advanced enough yet to account for them; we must be pragmatic in managing our knowledge, aware of the possibility that some truths we intuit we cannot yet explain, while other intuitions we can now abandon. My personal opinion, as you can imagine, is that we take too little note of the “truths,” so to speak, found in the liberal arts, in culture.

It is vital to consider how something can be both true and not in order to understand science and its limitations, and even more the limitations of second-order sciences (like social sciences). Newton’s laws were incredible achievements of rationality, verified by all technologies and analyses for hundreds of years, before their unpredicted exposure as deeply flawed ideas that apply only to a limited domain and that, in total, provide incorrect predictions and erroneous metaphorical structures for understanding the universe.

I never tire of quoting Karl Popper’s dictum:

Whenever a theory appears to you as the only possible one, take this as a sign that you have neither understood the theory nor the problem which it was intended to solve.

It is hard but necessary to have this relationship with science, whose theories seem like the only possible answers and whose obsolescence we cannot envision. A rational person in the nineteenth century would have laughed at the suggestion that Newton was in error; he could not have known about the sub-atomic world or the forces and entities at play in the world of general relativity; and he especially could not have imagined how a theory that seemed utterly, universally true and whose predictive and explanatory powers were immense could still be an incomplete understanding, revealed by later progress to be completely mistaken about nearly all of its claims.

Can you imagine such a thing? It will happen to nearly everything you know. Consider what “ignorance” and “knowledge” really are for a human, what you can truly be certain of, how you should judge others given this overwhelming epistemological instability!
