Please Note: this page contains draft excerpts from a manuscript in progress. It is intended for reading by friends of the Cornerstone Forum and not for copying or citation. Thank you.
To be discussed during our March 2021 Florilegia Skype call:
PIETAS: “It is not wisdom to forget”
A society that rejects the past, cuts itself off from its future. It is a dead society, a society with no memory, a society carried off by Alzheimer’s disease. – Robert Cardinal Sarah
It is one and the same movement which makes people no longer believe in the Republic and no longer believe in God, no longer want to lead a republican life, and no longer want to lead a Christian life, they have had enough of it … One and the same sterility withers the city and Christendom. The political city and the Christian City. The city of man and the City of God. That is the specific sterility of modern times. … [T]he modern world is not only opposed to the old régime and the new régime in France, it is opposed and contrary to all old cultures, to all old régimes, to all old cities, to everything which is culture, to everything which is the city. In fact, it is the first time in the history of the world that a whole world lives and prospers, appears to prosper, in opposition to all culture. – Charles Péguy
Life can be ordered in a myriad of ways: tyrannically, materially, economically, politically, and so on. An ordered world is preferable to chaos, but Hobbes’ Leviathan, Stalin’s Gulag, and Xi Jinping’s slave labor camps are demonic orders, antithetical to human flourishing, as are the soulless oligarchic technocracies which have lately grown impatient with the unwieldy nature of democratic governance. The business of ordering the world in a way that conduces to the fulfillment of the human vocation of its citizens is a solemn and serious one. This is the ultimate criterion by which the value of a “constitutional order” or an “experiment in ordered liberty” should be assessed.
Americans have good reason to regard the constitutional order created by their nation’s founders as uniquely proximate to the ideal. Some may have other assessments, but, aside from untried theoretical arrangements that appeal to idealists, romantics, and utopians, it is difficult to find a concrete case that seriously contravenes the pride many citizens of the United States have in the founding documents of their democratic republic. But the virtue of pietas is not predicated on the superiority of one’s culture or the prestige of its achievements, any more than the fourth commandment applies only to exemplary parents. Christopher Dawson defined the classical virtue of pietas as “the cult of parents and kinsfolk and native place as the principles of our being … a moral principle which lies at the root of every culture and every religion.” Indeed, this sense of belonging to a tradition is one of the “anthropological structures prepared beforehand” that conduces to the spirit of solidarity and transgenerational responsibility on which all healthy cultures depend. Dawson insisted that a society that loses this fundamental sense of belonging “has lost its primary moral basis and its hope of survival.” This reference to the preconditions without which a culture cannot endure was made more specific by John Adams, one of the signers of the American Declaration of Independence and the new country’s second president: “Our Constitution was made only for a moral and religious people. It is wholly inadequate to the government of any other.”
The history of the last few centuries and most especially the horrors of the twentieth century and the romance of global governance in the early twenty-first have led to widespread misgivings, not only about the corruption of the virtue of pietas, but about its very virtuousness. In its place there has emerged a romantic idea of a post-national global order which requires the sublimation of the allegiances and the networks of shared affection and common purpose on which legitimate social order depends. The unhealthy and destructive perversions of the virtue of pietas for which the appeal to universality is thought to be a corrective will always spring up precisely in the arid wastelands where the healthy versions of it have been censured as morally dubious or insufficiently universal in scope. The proposed cure is a symptom of the advanced stage of the disease, and it undermines those social and spiritual affinities without which a person in the fullest sense cannot flourish.
That grace is not contrary to nature but rather its fulfillment is a longstanding doctrine of Catholic Christianity. One of the many applications of that principle is the way in which affection for one’s culture and its history is conducive to the healthy life of the human person. All things considered, a person who honestly feels gratitude and respect for his cultural predecessors and the bounty that they have bequeathed him will more readily shoulder the burdens involved in preserving and purifying his cultural heritage. His sense of relatedness will be a source of courage and resolve in difficult times. Though a putatively disinterested and cosmopolitan observer might regard his own cultural patrimony, and that of others, as nothing more than the accident of birth, a person who finds in his cultural and historical situation clues to the role he is called to play in the life of his people will, by virtue of that attitude, inhabit a world more favorable to the discovery of personal meaning and fulfillment.
There are two ways of remembering the past that are corrupting of the memory: to idolize the past, erasing or overlooking its sins and failures, and, even worse, to hold the past in such contempt that the only response seems to be to reject it entirely, to extinguish memory itself. All the great tyrannies of the modern era, beginning with the French Revolution, have sought to extinguish the past, even those, like the Fascists and the Nazis, which claimed to be reinstating it. All these tyrannies were at war with the living reality of their patrimony. The proper honoring of the past always begins with humility. Those errors that can be seen with the clarity of hindsight should serve as a reminder of how powerful the spirit of the age always is, and how unlikely it is that one’s own moral and political assessments are immune to its mimetic influence. The task of rectifying injustice and disabusing oneself of the hypnotic power of fashionable ideas is a perennial one, but history has shown that the passion for rectification can extinguish the spirit of gratitude which underwrites our sense of communion.
Even when we look back with twenty-twenty hindsight on grave shortcomings or even serious crimes of our culture, we would do well to realize that we would have been incapable of recognizing these failures had not that same culture awakened in us moral acuities without which the failures of our predecessors would have gone unremarked. On the other hand, thanks to what many call the hermeneutics of suspicion – whose principal agents were Nietzsche, Marx, and Freud – we have taught the last few generations to regard their own heritage as irredeemable, and the consequences of this cultural sabotage are now everywhere to be seen.
Roger Kimball offers one measure of the loss of common purpose when he compares the poets at the presidential inaugurations of John F. Kennedy in 1961 and Bill Clinton in 1993.
To get a sense of what has happened to the institution of American identity, compare Robert Frost’s performance at John F. Kennedy’s inauguration in 1961 with Maya Angelou’s performance thirty-two years later. As [Samuel] Huntington reminds us, Frost spoke of the “heroic deeds” of America’s founding, an event, he said, that with God’s “approval” ushered in “a new order of the ages.” By contrast, Maya Angelou never mentioned the words “America” or “American.” Instead, she identified twenty-seven ethnic or religious groups that had suffered repression because of America’s “armed struggles for profit,” “cynicism,” and “brutishness.”
Commenting on the program for the 2021 presidential inauguration published prior to the inauguration ceremony, the European correspondent for Crisis Magazine brought Kimball’s lament up to date:
Where JFK chose the sublime Marian Anderson to sing The Star Spangled Banner, Biden has given that honour to Lady Gaga; instead of the venerable Robert Frost intoning his The Gift Outright, a paean to American exceptionalism, we’ll be treated to 22-year-old Amanda Gorman, whose work “touches on issues of race, feminism, oppression, and marginalization.”
Others were no less bewildered. The Harvard historian James Hankins offers further evidence:
Toppled statues, an educational culture bent on repudiation of the past, media awash with anti-American propaganda, and more, are bringing one truth hideously into view: The old authors were right when they said that a country without pietas will soon disintegrate. Only by recovering that forgotten virtue can we hope to rebuild the edifice of love and loyalty that shelters our common life.
The attempt to reorder a culture by disparaging its past has a dark pedigree. As the British historian Paul Johnson noted, the ghost of Rousseau haunts the efforts to disparage the “experiment in ordered liberty” which Americans inherited from their predecessors. For Rousseau, the State was to replace the father, and the citizens were regarded as wards of the State, or, as Johnson put it, “the children of the paternal orphanage.” In contrast to one’s native country, the State hardly awakens affection, which is why the State, posing as a sovereign Father, or more recently as a coddling Mother, always substitutes indoctrination for education. Thus, cultural critics from Howard Zinn to Nikole Hannah-Jones try to alter the historical record in order to persuade their readers to adopt a deeply critical attitude toward their cultural inheritance. Again speaking of Rousseau’s political project, Paul Johnson writes:
The educational process was thus the key to the success of the cultural engineering needed to make the State acceptable and successful; the axis of Rousseau’s ideas was the citizen as child and the State as parent, and he insisted the government should have complete charge of the upbringing of all children. Hence – and this is the true revolution Rousseau’s ideas brought about – he moved the political process to the very centre of human existence by making the legislator, who is also a pedagogue, into the new Messiah, capable of solving all human problems by creating New Men.
Johnson wrote those words in a book published more than thirty years ago. As prescient as he was, he could hardly have anticipated the troubling trend of recent events. So congenial to the spiritual health of the human person is his gratitude for the bounty that is his because of the labors and sacrifices of his cultural predecessors that those forces that would disabuse him of that gratitude leave him rudderless and vulnerable to the fashions of the moment.
The new post-national cosmopolitan spirit is so enamored of the clichés of which it is constituted that not only is the special affection for one’s native country shunned in favor of the “citizen of the world” posturing, but the very idea of national citizenship is mocked as hopelessly retrograde. And yet, as the American historian Wilfred McClay has recently written:
“Citizenship” means a vivid and enduring sense of one’s full membership in one of the greatest enterprises in human history: the astonishing, perilous, and immensely consequential story of one’s own country. Today, we must redouble our efforts to make that past our own, and then be about the business of passing it on.
Nostalgia literally means homesickness, and it can be a healthy response to circumstances that have led to loneliness and alienation. Indeed, the experience of spiritual homelessness is perhaps the beginning of a retrieval of one’s cultural patrimony. Homesickness will often focus on a familiar, and probably familial, experience of being a member of a loving and caring community, one ranging from one’s family, to one’s local community, to one’s native country. One grows nostalgic for a community with which one feels a real affinity and affection. Finding no such community, one can at least visit one. But the effect of tourism is precisely to pillage another tradition and incline its legitimate heirs to barter away their cultural patrimony and then to mass produce the trinkets that trivialize the heritage they purport to be celebrating. Writing of tensions in the western Austrian state of Tyrol, where traditional Catholicism remained robust until recently, Tim Parks notes:
What does the globalized world of free travelers feel nostalgic for, if not the closed, traditional community? The more the tourists come, particularly Italian tourists, the more the locals cling to their traditions. And the more they cling to their traditions the more the tourists come.
As the historian Glenn W. Olsen notes, “if one does not have a preferred language, religion, or set of customs, one probably gravitates to cosmopolitan centers, where – the crowning irony – one can find people like oneself.”
A million things, starting with the raising of children, are easier if one lives with those with whom one agrees. In many historical situations multiculturalism is inevitable, but in certain obvious ways multicultural societies fail humans. Multiculturalism may have a certain short-term attractiveness, but it is doubtful that a civilization or nation can last long without a fair degree of shared vision of reality. By definition anything more than façade-multiculturalism – I like cappuccino, you like tacos – involves disagreement about reality.
A living tradition can be distinguished from traditionalism. Indeed, though the rise of traditionalism might be an expected response to the loss of tradition, it constitutes yet another contemporary reaction rather than an affectively genuine revival of memory and gratitude. As D. C. Schindler has noted:
It is ironic, but there is perhaps something fitting in the absence of an explicit philosophical theory or account of tradition. A tradition is something we inherit uncritically, without a demand for justification. We feel no need to certify the precise origin of tradition, and, indeed, details about the time and place a tradition was instituted tend to diminish its status as tradition, especially if the origin turns out to be recent and accessible in some way other than its transmission through others. The initiation of a tradition is most properly hidden in the mists of time. Rather than critically assessing it, we are meant to take a tradition for granted; a kind of spontaneous and unreflective acceptance seems to belong to its essence.
The word culture is virtually a synonym for the word tradition. An anti-traditional culture is an oxymoron. Today the word tradition, Schindler remarks, refers “only to what we might call the external ‘trappings’ of a culture – literally, the cut and color of the clothes one wears or the particular seasonings one adds to one’s food.”
Today, of course, the challenge to pietas comes from those who are anthropologically foolish enough to think that it can be replaced by a global order for which the deracinated and ideologically intoxicated might exhibit a passing affinity, but which is conspicuously incapable of awakening an affectively significant allegiance. This shortcoming calls to mind Henry James’ mocking remark about the New England Puritan Nathaniel Hawthorne, whose literary efforts, James declared, were crippled by the fact that he had:
No sovereign, no court, no personal loyalty, no aristocracy, no church, no clergy, no army, no diplomatic service, no country gentlemen, no palaces, no castles, nor manors, nor old country houses, nor parsonages, nor thatched cottages nor ivied ruins … no cathedrals, nor abbeys, nor little Norman churches; no great Universities nor public schools – no Oxford, nor Eton, nor Harrow; no literature, no novels, no museums, no pictures, no political society, no sporting class – no Epsom nor Ascot!
A century earlier, Rousseau celebrated his emancipation from comparable cultural resources, but he attempted to replace pietas, properly understood, with the State. Today an even more preposterous option has garnered support: a post-national global order, tasked with overseeing and compelling whatever behavior global elites deem to be required by the crisis of the moment. These are the now-infamous crises – whether manufactured or real – which, we’re told, are opportune to globalist aspirations otherwise repugnant to the traditional allegiances of the electorate. Such globalism, writes Douglas Farrow, “serves to break down identity and, indeed, requires the breaking down of identity.” It necessarily cultivates “a soulless bureaucracy inimical to the maintenance of homes and of homelands.”
The future, writes Robert Harrison, “is born of the past and the past reborn from out of the future, thanks to a mysterious process of transmission.” And those who fail to understand this, those who allow the virtue of hope to attach itself to historically untested political projects – or worse, to projects that have been, without exception, catastrophic – will leave their cultural heirs with nothing but devastation. As the French philosopher Rémi Brague observes:
A deliberate break with the past brings about a loss of civilization and is the harbinger of some form of barbarism, the latter word being understood in the usual meaning as well, that is, as stupidity and cruelty. The historical examples of such a fact are many. Among them, the French Revolution may have pride (or shame) of place.
Harrison expresses it somewhat less dramatically, when he writes that “when the new does not renew – when it does not rejuvenate latent legacies – it gets old in a hurry.” So modernity took its time getting old, but postmodernity has gotten old in a hurry. The speed at which spiritual, cultural, and social circumstances are dissolving into incoherence can be overwhelming. But these are the circumstances in which Christians can and must discover the buried sources of their faith. These are the circumstances in which we can discover the deeper meaning of personhood that has been revealed by Christ and approximated by those of his faithful followers whose faith can inspire our own.
To be discussed during our February 2021 Florilegia Skype call:
THE OBSERVED OF ALL OBSERVERS: THE DANDY
In The Waves, what made Virginia Woolf’s Percival the object of fascination for his classmates was his single-minded interest in athletic competition, to the exclusion of the far fiercer mimetic competition with which most of the other characters in her novel were preoccupied. In the eyes of his classmates, Percival’s astonishing disinterest in the social drama gave him an aura of Olympic majesty. For Percival, it was dull indifference to everything except competitive sports. If Percival came by his disinterest in mimetic intrigue honestly, others, hungry for the social centrality it won him, learned to parade their putative social indifference in the same way that the Underground Man did when he tried his “utmost to show them that I could do without them.”
The dandy is a figure who has intuitively understood the attraction those entangled in mimetic intrigues have for anyone who seems immune to the mimetic maelstrom. Virginia Woolf’s Percival, the quintessential materialist, was, to his credit, a real materialist. He lived and moved and had his being on the field of athletic competition. His more mimetically agitated and enmeshed classmates could only gaze enviously at his immunity to the convoluted mimetic intrigues from which they could find no respite. Neville’s fascination with Percival had a homoerotic feature, but all his classmates saw in Percival’s invulnerability to mimetic intrigue something almost religious. In the chapel, Neville stared with incredulity: “His blue, and oddly inexpressive eyes, are fixed with pagan indifference upon the pillar opposite. … He sees nothing; he hears nothing. He is remote from us all in a pagan universe.” In a sense, the dandy is a Rousseauian figure who feigns indifference to what others might think of him while remaining preoccupied with nothing else. He is Percival without the athletic prowess but with a keen sense of the mimetic effect he can have on others by appearing to be immune to such things.
. . .
If anyone can be said to have taken Rousseau’s ritual of self-presentation to its logical conclusions, it was the French poet Baudelaire. His short, reckless and self-absorbed life was epitomized by the determination to live the life of the “dandy,” to remain at the center of social attention precisely by exhibiting absolutely no interest in others. If, in the first lines of his Confessions, Rousseau wrote that his life was based on “no precedent,” and that above all his life was “different,” Baudelaire would write to his mother in an irascible mood: “Understand one thing which you seem always to ignore, I am not made like other men.” Baudelaire discovered that the air of indifference the dandy labors to exude works like a social aphrodisiac on those around him, those suffering from a milder form of the “ontological sickness” whose terminal phase the dandy represents. The self he advertises as self-sufficient fascinates others precisely because of its eccentricity, but it is the attention of others and not the eccentricity required to attract it that keeps the dandy’s helium-filled existence airborne. Whereas the Underground Man, snorting and stomping around the tavern, managed – for two whole minutes – to attract the attention by which he sought to repair his ontological emaciation, the Baudelairean dandy mastered the art of paying no attention to those on whose attention his social singularity depended. In other words, for as long as the dandy can maintain the pretenses, he can relieve the symptoms of his ontological sickness by becoming the carrier of the disease and infecting others with it. In his book, Madness and Modernism, Louis A. Sass writes:
Baudelaire was the first to crystallize a form of solitude that was virtually unthinkable before the nineteenth century: a profound, self-generated isolation so unaffected by the presence of others that it is likely to be felt most strongly amid a crowd. And he was the first major poet to revel in the self-imposed, self-glorifying alienation that was to become virtually de rigueur in the artistic avant-garde.
“If ever a man was ill, in ways that had nothing to do with medicine,” wrote Baudelaire, “that man is me.” At the end of his life, according to one commentator, Baudelaire was “unable to remember his name or recognize his face in a mirror.” The same observer concluded, however, that Baudelaire had “recorded something that was happening to human nature as a whole.” No, not human nature, but the late modern and postmodern self. Sass cites the observation of the psychiatrist Ernst Kretschmer regarding the kind of indifference Baudelaire exhibited, namely, disinterest “ostentatiously manifested.” Baudelaire died in 1867. His effort to float on the fascination he was able to evoke in others failed in the end. He came into the Catholic Church on his deathbed.
“Talk to me of originality and I will turn on you with rage,” wrote William Butler Yeats. Had Yeats lived before 1776, however, his rage might well have lain dormant, for it was only at that late date that the word originality came into popular usage. That “originality” enters English usage at the end of the eighteenth century, coincident with both the American and French Revolutions, is surely worthy of note. The need to be original is one of the most telling signs of an existing mimetic crisis. The hysteric is the person in whom that crisis becomes personified and manifest. Whether acute or chronic, whether psychologically symptomatic or “socially savvy,” the hysteric is hounded by the problem of originality.
In the world of hysteria, priority in time is annulled by superiority in dramatic representation. Refusing to concede his mimetic fascination with the “other” by whom he is enthralled, the hysteric auditions for the part the other plays. The mild histrionics and social agitation that are such familiar features of our present cultural life are attempts on the part of the self to assert an autonomy, a uniqueness, and an independence from others’ influence. These adepts at the mimetic game understand its operating principles well enough to exhibit a flair for blasé self-presentation, fully aware of its fascination for an audience of onlookers, whose minor role in the sociodrama is rewarded by the occasional recognition from the maestro. For as long as the pretenses can be maintained, such an adept functions as the carrier of the disease, precisely because he alone seems not to have contracted it.
The quintessential dandy of the late twentieth century was Andy Warhol. Warhol was the Percival of his time, not because he was as oblivious as Percival was to the mimetic intrigue, but rather because he was a master at controlling it. He was the maestro, the ringmaster. René Girard not only saw that the key to the dandy’s hypnotic allure was his apparent lack of desire, but he also saw shimmering at the heart of the ruse an imitation of the Christian saint.
Nondesire once more becomes a privilege as it was for the wise man of old or the Christian saint. But the desiring subject recoils in terror before the idea of absolute renunciation. He looks for loopholes. He wants to create a personality in which the absence of desire has not been won with difficulty out of the anarchy of instincts and metaphysical passion. The somnambulist hero of American writers is the “solution” of this problem. Nondesire in this hero has nothing to do with the triumph of the mind over evil forces, nor with the self-discipline extolled by the great religions and higher humanisms. It makes one think rather of a numbing of the senses, of a total or partial loss of vital curiosity.
One might liken such a faux-saint, who carefully struts his imperviousness to temptation, to René Descartes, with his “numbing of the senses” and his well-publicized delight in “finding no company to distract me, and having, fortunately, no cares or passions to disturb me.” Such nondesire, Girard insists, “has nothing in common with abstinence and sobriety.” While Descartes was straining not to be seduced by the vicissitudes of mimesis, the dandies of a later age were the studied avatars of seduction, compensating for an ontological deficit by attracting the fascinating attention of their social peers. Baudelaire provides an overview of the dandy that cannot be improved upon:
The dandy has to be a lymphatic and cold man, who is sick and tired of everything and wants to make himself original. Indifference is the supreme virtue of the dandy. … The dandy is not, as is generally thought, a man who is only concerned with the embellishment of himself. This is nothing more than a manifestation of the superiority of the dandy’s spirit. … In order to be a dandy, one must have general knowledge … and the considerable fortune needed to worship all of the passions. The dandy should feel the pleasure of astonishing and the proud satisfaction of never being astonished.
“In the French poet’s case,” writes Louis A. Sass, “it is less a matter of fear or envy than of disdain, less of Angst than of spleen – the latter being Baudelaire’s term for a new kind of emotion, a peculiar mix of disillusionment, irony, bitterness, and ennui that he invented as much as described….” No assessment of the present state of our formerly Christian civilization will succeed without addressing the strange new psychological condition to which Baudelaire first drew attention: spleen. This is finally what lies at the end of the road for the sovereign self. This is where it turns nihilistic and metamorphoses into its postmodern manifestation.
As Robert Hughes analyzed it in 1982, Andy Warhol’s prominence coincided with a social convolution in which “the press and television, in their pervasiveness, constructed a kind of parallel universe in which the hierarchical orders of American society . . . were replaced by the new tyranny of the ‘interesting.’”
To enter this turbulence, one might only need to be born — a fact noted by Warhol in his one lasting quip, “In the future, everyone will be famous for fifteen minutes.” But to remain in it, to stay drenched in the glittering spray of promotional culture, one needed other qualities. One was an air of detachment; the dandy must not look into the lens. Another was an acute sense of nuance, an eye for the eddies and trends of fashion, which would regulate the other senses and appetites and so give detachment its point.
Diligent and frigid, Warhol had both to a striking degree.
Hughes describes those circling in the Andy Warhol orbit:
They were all cultural space-debris, drifting fragments from a variety of Sixties subcultures (transvestite, drug, S&M, rock, Poor Little Rich, criminal, street, and all the permutations) orbiting in smeary ellipses around their unmoved mover. … If Warhol’s “Superstars,” as he called them, had possessed talent, discipline, or stamina, they would not have needed him. But then, he would not have needed them. They gave him his ghostly aura of power. If he withdrew his gaze, his carefully allotted permissions and recognitions, they would cease to exist . . . In this way the Factory resembled a sect, a parody of Catholicism enacted (not accidentally) by people who were or had been Catholic . . . In it, the rituals of dandyism could speed up to gibberish and show what they had become — a hunger for approval and forgiveness. These came in a familiar form, perhaps the only form American capitalism knows how to offer: publicity.
. . .
If Warhol is the phlegmatic hysteric, wan, waif-like, with the studied demeanor of a semi-autistic, Camille Paglia is his opposite: the full-blown theatrical hysteric. Her intellectual showmanship notwithstanding, Paglia has regularly performed spiritual high-wire feats that leave savvy onlookers holding their breath. The correlation with hysteria is conspicuous. In her book, Vamps and Tramps, she writes of her own “meteoric rise” and says this of herself:
I was a parallel phenomenon to businessman-turned-politician Ross Perot and radio personalities Rush Limbaugh and Howard Stern, with their gigantic nationwide following. We have widely different political views, but all four of us, with our raging egomania and volatile comic personae tending toward the loopy, helped restore free speech to America.

If this is not enough to suggest the model-rival matrix of hysteria, here’s what Paglia has to say about her female model-rival, the writer Susan Sontag: “I am the Sontag of the ‘90s. . . . I’m her worst nightmare. … I’ve been chasing that bitch for 25 years, and I’ve finally passed her!” That this is intentional self-parody does not entirely remove the perils involved in such things. Dismissing Sontag’s writings as “bleak, boring, pedestrian, clumsy, wobbly and corny,” Paglia anguishes over the apparent fact that Sontag has exhibited absolutely “no awareness that I had written any books or that she had even seen them, even through a telescope.” “I feel I should use my name recognition for service, for art,” Paglia said in 2015. Smith writes that Paglia rejected “a watered-down Marxism that sees the world in terms of society, politics, and economics – a materialistic philosophy that has no sense of the spiritual or sublime.”
“That’s why they’re in a terrible fever and so emotional,” Paglia said. “There is a total vacuum in their view of life. They don’t have religion any longer. Religion teaches you metaphysics. It shows you how to examine yourself and ask questions about your relationship with the universe.” The Bible, she said, is “one of the greatest books ever written.” If Baudelaire came into the Catholic Church on his deathbed, we are permitted to hope that other raging egomaniacs and volatile comic personae might likewise exhaust the faux substantiality of social centrality and, in the end, discover what Henry Adams called “the old Roman road of tradition.” Meanwhile, the ontological diminution is filtering down to the very young. Almost three decades ago, a touching example of a child struggling with precisely this deficit appeared in the pages of The New York Times. It was one of a series of articles focusing on people coping with the evisceration of the self in postmodern America, conditions that were being brought about by mimetic confusions. This particular article focused on a 12-year-old girl living in Brooklyn. We quote the article by Catherine S. Manegold, omitting only the girl’s name, which appeared in the original article:
[The 12-year-old girl] wears two streaks of bright magenta in her hair. They hang, strands of Kool-Aid, down her loose, long strands of blonde like a seventh grader’s twist of punk: Don’t come too close. Don’t mess with me. Don’t tell me what to do. I’m not like you.
At her Brooklyn public school, a kaleidoscope of teen-age rage, [the girl’s] teachers see a young girl with an attitude. They focus on her slouch, her Kool-Aid streaks, her grunge clothes and sullen anger and see all the signs of trouble. But those vivid slashes say the most, communicating a basic paradox of adolescence, the double-edged message: “bug off” and “LOOK AT ME.”
Here we have the quintessential expression of the Underground Man’s performance in the tavern. Like Freud, the Times reporter overlooked the cultural and historical dimensions of the suffering this girl was undergoing. As prevalent as it is in our present society, this child’s “bug off” and “LOOK AT ME” attitude is not “a basic paradox of adolescence,” and to dismiss it as such is to unwittingly join in the conspiracy of misrecognition. Nor should we dismiss her sad plight as something due to her personal and family circumstances, however much they may have contributed to her predicament. There are far too many people today, both older and even younger than this child, who suffer a similar psycho-social distress.
What is worthy of attention, however, is the characterization of this child’s Brooklyn public school as “a kaleidoscope of teen-age rage.” As we have noted more than once, such dramatizations ultimately only exacerbate the vacuity or ontological insubstantiality they are meant to alleviate. When there are countless social competitors seeking alleviation from their insubstantiality in more or less the same sulking way, the whole tenor of the social order becomes one of sullen, and potentially explosive, antipathy.
Bearing in mind the attraction his classmates felt toward Percival, in no small part due to his “blue, and oddly inexpressive eyes” that were “fixed with pagan indifference upon the pillar” in front of him, we conclude these reflections with John Horvat’s suggestive comment about the 2019 Time Magazine Person of the Year.
There is also an element of mystery and mysticism in her presentation. Unfortunately, she suffers from Asperger’s Syndrome, which impairs her emotional expression, making it cold, mysterious, and disconnected. Thus, everything about her tragically defies the standard definition of what might be expected of this child seemingly without a childhood. … The media have been quick to capitalize on ascribing to her mystical persona special powers, not unlike that of a prophetess or oracle.
To be discussed during our January 2021 Florilegia Skype call:
THE FREUDIAN INTERLUDE
Gazing at the crowds awaiting his arrival at the New York dock in 1909, Sigmund Freud is reported to have turned to Carl Jung and said: “They don’t know it, but I’m bringing them the plague.” If Virginia Woolf acknowledged something “sulphurous and sinister” in the delight her chief protagonist took in a world reeling into chaos, one could be forgiven for detecting something of the same fiendish delight in Freud’s mordant remark: an envenomed amusement in pondering how much cultural disruption he was poised to unleash.
Freud’s star has fallen, but there is a lingering legacy of the earlier Freudian triumph. It lasted long enough, and enjoyed enough respectability, to have provided intellectual incubation for the sexual revolution. In historical retrospect, the sexual revolution can be seen – and will increasingly be seen – as a plague. When Freud died in 1939, that revolution was still gestating, but signs of its later effect were at hand. “Whether he was a true scientist or not, Freud’s place is secure if for no other reason than that he broke down ancient taboos and cleared the way for a new approach to the mind,” the New York Times unsigned obituary rhapsodized, adding that he was “the most effective disturber of complacency in our time.”[1]
In the opening gambit of his masterful survey of Sigmund Freud and his legacy, Samuel Bendeck Sotillos makes the following observation:
Freud recognized the full scope of the corrosive impact of the Freudian doctrine on society and civilization: “psychoanalysis is regarded as ‘inimical to culture’ and has been put under a ban as a ‘social danger.’” Freud in no uncertain terms was conscious of the nefarious and destructive implications of his theory that was cloaked in the dress of modern science, which would come to challenge the very foundations of Western civilization.[2]
. . .
Carroll Smith-Rosenberg declared that psychoanalysis is “the child of the hysterical woman.”[3] It was with the publication of Studies in Hysteria, co-authored by Josef Breuer, that Freudian theory began to take shape. Writes Freud:
What is the meaning of hysterical identification? … I shall be told that this is not more than the familiar hysterical imitation, the capacity of hysterics to imitate any symptoms in other people that may have struck their attention – sympathy, as it were, intensified to the point of reproduction.[4]
The man who thought his sexual theories were bringing the curtain down on religion was in fact cluttering his cutting room floor with penetrating observations that would one day contribute to the revitalization of the kind of Christocentric anthropology we are here trying to limn. Understood in its original sense as an affective communion, Christian discipleship and, therefore, Christian subjectivity, is quite simply “sympathy intensified to the point of reproduction,” and Christian – especially Catholic – spirituality makes sense only in light of the inextinguishable human predilection toward such sympathy.
Modern psychology was called into existence by a symptom of distress that amounts to the most explicit declaration of our imitative propensity that one can imagine – sympathy intensified to the point of reproduction – what René Girard calls mimetic desire. Girard was the first to give due weight to mimesis, but Freud was not the first to notice its role in hysteria. “At least since the late seventeenth century,” writes Mark Micale, “medical observers have noted in hysteria an extraordinary, chameleon-like capacity to reflect the environment in which it develops.”[5] Micale was quick to note that “No disorder was more important for this historical development of Freud’s thinking than hysteria,” and that “[p]sychoanalysis, in essence began as a theory of hysteria.”[6]
Like the Freudian theory of Oedipal conflict within the family, hysteria is a late modern condition, the implications of which were evaded by ascribing great antiquity to both of these putative disorders. In one form or another, ancient cases can be found of each of these psycho-spiritual disorders. But the search for evidence of ancient instances of Oedipal conflict and hysteria was prompted by the sudden emergence in the nineteenth century of symptoms for which such a venerable pedigree proved professionally advantageous for the practitioners of a fledgling psychotherapeutic regime.
Modern psychology was born at the moment when symptoms of the mimetic crisis in the midst of which we are now living became acute enough and prevalent enough to justify the invention of a new discipline for managing the distresses, the presenting symptom of which was labeled hysteria. The very antiquity of this term, while lending scientific weight to the new discipline, fostered one of its most crippling misunderstandings. If the hysteria that Freud and his colleagues were treating was, as early researchers theorized, the most ancient of all psychopathologies, then the assumption central to Freud’s later Oedipal theories was plausible, namely, that the contemporary distresses were manifestations of perennial pathologies, and that there was nothing fundamentally historical about them, nothing conspicuously “modern” or “Western.” As Hans Urs von Balthasar noted, the phrase “modern neurosis” is “almost a tautology, inasmuch as there were, strictly speaking, no neuroses in the earlier, humane world (and hence no need for their poisonous antidote, psychotherapy).”[7]
Once one recognizes the historicity of the contemporary cultural and spiritual crisis, however, and its cultural and chronological coordinates, elements of Freudian theory – such as the Oedipus complex – regain a degree of pertinence, inasmuch as they can be seen, not as the continuation of perennial and universal sexual dramas, but as the social and psychological consequences of a widespread application of unsound anthropological assumptions, as the sovereign self’s first, faint de facto cry for help.
Jean-Michel Oughourlian sees the cultural and historical dimension of what Freud dubbed the Oedipal conflict:
We are thus witnessing the emergence of new phenomena that I have baptized diseases of desire, that is to say illnesses of relationship, in other words illnesses that are purely cultural and not natural, and that are linked to our modern, Western society.[8]
Turning interest away from the obvious mimetic features of these distresses and toward their supposed sexual origin was for Freud no easy task. Re-reading today passages in which he accomplished it, one senses how precarious this crucial Freudian maneuver really was, and how dependent its success was on the eager credulity of those intrigued by the sexual theories and sympathetic to what seemed the scientifically formidable challenge to religion. “It is on the subject of religion,” writes Philip Rieff, “that the judicious clinician grows vehement and disputatious…. Freud’s customary detachment fails him here. Confronting religion, psychoanalysis shows itself for what it is: the last great formulation of nineteenth century secularism.”[9] Mark Micale makes the same point when he writes: “nineteenth-century commentators most often conceptualized [the diagnosis of hysteria] as a battle between science and religion.”[10]
We are here concerned with the anthropological mistake that underlies the modern and postmodern notions of selfhood, and, more immediately, with what Freud, his predecessors, collaborators and later explicators refused to fully confront, specifically as they were faced with so conspicuous an instance of it in the psychopathologies they labeled hysteria. Freud’s now discredited sexual theories are of interest in this regard, for they served as decoys, diverting attention from the real issue that called out for further investigation. The sexual theories are of interest as well because they played an important role in lending the sexual revolution of the mid- to late-twentieth century a degree of intellectual respectability, and this revolution is one of the most devastating symptoms of the ontological crisis now ravaging popular culture in the West and beyond. This, surely, is a plague for which Freud and his enthusiasts bear at least indirect responsibility.
Freudian psychoanalysis was one of modernity’s most formidable attempts to toy with the revelation of the mimetic nature of human subjectivity without confronting its religious implications. Freud’s attention to sexual symptoms can be defended to a degree, but not for reasons he proposed. As societies slide into a mimetic crisis of the sort that is now engulfing the Western world, sexuality is predictably the first area of social life to become problematic. The power of the sexual drive, the deep longing for true intimacy, and the drama involved in competition for sexual partners combine to make it all but inevitable that in a crisis-ridden society – rife with envy, rivalry, and resentment – these aggravations will be exacerbated and exemplified in relationships where sexuality is in play. And so it is that – with the ascendance of the individualistic, secularized and “social contract” sense of self in the West – sexual intimacy has tended to be instrumentalized and robbed of its sacramentality, and thereby deprived of the very source of its deepest emotional and spiritual satisfactions. The chief requirement of “social contract” sexuality is the prevention of the natural consequences of the sexual act, the success of which exchanges a sacramental self-donating intimacy for a sexuality of plumbing, pleasure, performance, and prevention.
Far from being a reaction to sexual prudery, therefore, the sexual revolution was an attempt to compensate for emotional disappointments that accompanied earlier stages of the revolution, as each subsequent generation entertained the chimerical hope that a loosening of moral restraint might unleash enough passion to make up for the loss of intimacy coincident with the moral laxity of the previous generation. The cure was the next stage of the disease. The result of this spiraling process has been a spiritual calamity rooted, not just in moral laxity, but in anthropological error, a catastrophe which continues unabated and whose most serious consequences are spiritual, not sexual. For these and other reasons, it is worthwhile to rummage through the dustbin of psychoanalytic history for hints of the failure that Freudian thought represents, for the story of the gradual demise of the secular autonomous self contains important clues as to what went wrong and why, and the Freudian chapter of that story should not be closed without recovering as many of those clues as might be found there. There are treasures to be retrieved from the wreckage.
Freud gave a sexual interpretation to what were fundamentally mimetic complications. Where he recognized mimetic conflict – in the father-son rivalry – he interpreted it as an unconscious sexual rivalry for the mother, structuring his analysis around the Sophoclean drama Oedipus the King, the story of a man fated to kill his father and marry his mother. The venerable status of Sophocles’ play gave weight to Freud’s claim to have discovered a perennial feature of the human drama. What he failed to notice was the historical feature of the disorders and complications his patients exhibited. As the effect of the Christian revelation proceeded apace, even as the revelation itself was losing its influence in the modern era, the problems first adumbrated in the Gospels themselves emerged.
By looking to Sophocles, Freud could theorize that this family drama was as perennial as he imagined hysteria to be in another context. Had he looked elsewhere, he might have found an anthropologically lucid key to the very problem he sought. Both Matthew and Luke recount a warning issued by Jesus regarding the breakdown of domestic tranquility.
“Do not think that I have come to bring peace on earth; I have not come to bring peace, but a sword. For I have come to set a man against his father, and a daughter against her mother, and a daughter-in-law against her mother-in-law; and a man’s foes will be those of his own household.” (Mt 10:34-36; see Lk 12:51-53)
Like so many others, this passage contains an anthropological insight that Girard’s work helps us bring into focus. Anticipating the gradual effect of the Golgothan and Easter events at the center of the Gospel revelation, Jesus here warns his disciples that precisely those events would slowly cripple the old habitual system for restoring harmony by transferring aggregate animosities onto an expelled or exiled reprobated figure. That arrangement would be fatally exposed on Golgotha by the revelation of the innocence of a Victim unanimously presumed guilty by the crowd. Recall that even families and clans have long and notoriously restored their inner cohesion by standing in opposition to other clans and families. In our time, the unifying ruse that creates a faux solidarity might be a shared political or socio-religious animus against others. Recourse to this ruse would be made again and again after the crucifixion, but with each such effort more of its dark reality would come to light. The Golgothan revelation would begin crippling the system for restoring social harmony by offloading social aggravations onto expendable and anathematized figures, and it would similarly affect any domestic recourse to this ancient recipe. Any domestic harmony that owes its tranquility to the existence of an adversary or enemy onto whom the family or clan can offload aggravations internal to the community would fail in proportion to any exposure the community might have had to the Golgothan event. Though animosities might continue to trigger the scapegoating impulse, those under Christian influence would, to that extent and in due course, hear the cock crow, as it were. At that moment, the options would be contrition, repentance, and forgiveness, or a return to domestic conflict of an even more malicious and contagious kind.
Thus, what Freud saw – or thought he saw – in the Oedipal conflicts of his patients was not an ancient and perennial sexual drama between fathers and sons. To the extent that his analysis of the conflict was remotely plausible, it was not because what was taking place was a primordial and perennial sexual drama. He was seeing the fulfillment of the warning of Christ, namely: animosities no longer so easily offloaded onto those outside the family circle were now eating away at the most intimate forms of social life.
Elaine Showalter touches on one of the subsequent developments that reinforces both the mimetic and the theatrical dimension of hysteria:
The theatrical subtext of hysteria was codified in 1987 when the Diagnostic and Statistical Manual of Mental Disorders officially renamed what had previously been “hysterical personality disorder” as “histrionic personality disorder”:
The essential feature of this disorder is a pervasive pattern of excessive emotionality and attention-seeking, beginning by early adulthood. … People with this disorder constantly seek or demand reassurance, approval, or praise from others and are uncomfortable in situations in which they are not the center of attention. They characteristically display rapidly shifting and shallow expression of emotions. … These people are typically attractive and seductive, often to the point of looking flamboyant and acting inappropriately. They … are lively and dramatic and always drawing attention to themselves.[11]
Bewildering though hysteria often is, the evidence, we suggest, is that it typically occurs in those suffering from an ontological deficit, the perils of which Henri de Lubac and Gabriel Marcel were soon to warn. The hysterical symptoms significant enough to attract the attention of clinicians can be seen as a kind of survival strategy aimed at resisting the gravitational power of a personality under whose mimetic influence the subject fears being drawn. Just as many organic disorders arise less from an invasive parasite, virus, or bacterium than from the body’s immune response to these things, so with hysteria. The symptoms of hysteria appear when the integrating power of the person suffering the symptoms – and the interdividual boundaries that make human social life possible – are relatively weak with respect to the mimetic power of others with whom the person comes into contact. Thus, hysteria can be likened to a spiritual and psychological autoimmune response. The nineteenth-century British surgeon and pathologist Sir James Paget used the terms “nervous mimicries” and “neuromimesis.”
Those clinically diagnosed as hysterics in the late nineteenth and early twentieth centuries inhabited societies more structured and less fluid than those of today. Individuals were expected to retain a higher degree of psychological continuity, and the failure to do so tended to draw more attention than the corresponding failure today, when identities morph far more routinely and arouse considerably less clinical scrutiny or social concern.
Whereas the modern self was adapted to and fostered by the majoritarian voluntarism of a democratic polity, the disaggregated and atomized postmodern self is adapted to and shaped by the consumptive voluntarism most congenial to the now all-encompassing market, and ever at the mercy of the political and moral fashions blithely indifferent to fundamental anthropological realities. It may well be the case that it is the availability of the market – as the repository of desire and the ritual arena where disappointed desires can be easily and quickly recycled into new desires – that has facilitated the morphing of clinical cases of hysteria into a sea of subclinical forms. Whether that shift can be regarded as felicitous remains an open question. Arguably, the market supplied the instrument for servicing the spiritual crisis and endowing its symptoms with an aura of social respectability, just as the identification with one’s nation has historically been a source of social solidarity, notwithstanding the darker role that animosity and violence might have played in endowing the nation’s founding violence with a degree of transcendence.
Literature is filled with stories of lovers being driven to distraction by the arrival of a competitor for the affections of the loved one. Girard wrote one of his finest books on Shakespeare’s genius for both exploiting and elucidating this conspicuous fact of both romantic life and social affairs generally. Mimetic passion, writes Mikkel Borch-Jacobsen, “can of course bear sexuality in its wake (especially in the form called homosexuality), but can by no means be reduced to sexuality.”[12]
The term hysteria comes from the Greek word for uterus and carries with it the long-discredited implication that hysteria is a female sexual disorder. Indeed, as Mark Micale has observed: “Freudian psychology in a real sense represents a second resexualization of the hysteria diagnosis.”[13] Still under the sway of his sexualized theories, and thus still reinforcing the ancient world’s feminization of hysteria, Freud was nonetheless an honest enough clinician to recognize the essential matter:
The physician who has, in the same ward with other patients, a female patient suffering from a particular kind of twitching, is not surprised if one morning he learns that this peculiar hysterical affection has found imitators. He merely tells himself: The others have seen her, and have imitated her; this is psychic infection. – Yes, but psychic infection occurs somewhat in the following manner: As a rule, patients know more about one another than the physician knows about any one of them, and they are concerned about one another when the doctor’s visit is over. One of them has an attack to-day: at once it is known to the rest that a letter from home, a recrudescence of lovesickness, or the like, is the cause. Their sympathy is aroused, and although it does not emerge into consciousness they form the following conclusion: “If it is possible to suffer such an attack from such a cause, I too may suffer this sort of an attack, for I have the same occasion for it.”[14]
Confronted by such glaring indications of the mimetic character of hysteria, it would be difficult to defend Freud’s powers of observation other than by reference to his most famous – and most intentionally mystifying – doctrine: the unconscious. “Freud’s continual relegation of mimesis to a secondary position,” writes Mikkel Borch-Jacobsen, is a “deliberate gesture – the very gesture that had made a properly psychoanalytic interpretation of hysterical symptoms possible, thus creating the possibility of psychoanalysis itself.”[15]
The British novelist was under no such compunction. Virginia Woolf understood the dynamics of mimesis and its perils far better than did the man determined to found a new discipline. As we noted earlier, Woolf’s character Rhoda assesses her predicament in explicitly mimetic terms: “I have no face. Other people have faces; Susan and Jinny have faces; they are here. Their world is the real world,” Rhoda muses, and both “despise me for copying what they do.”[16] Anticipating a topic we will take up when we turn to the Christian understanding of personhood, we can here but note that the “face” that Rhoda lacks, and that she tries to borrow from her classmates, is the social cognate of the “mask” worn by actors on stage. For Rhoda lived and moved and had her being exclusively on the social stage.
Psychologically speaking, the hysteric is someone whose mimetic predisposition and emotional circumstances leave him especially vulnerable to the influence of others, and who, having fallen under such influence, finds it suffocating or intolerable and resorts to a kind of private exorcism ritual – involving, typically, either exaggerated histrionics or autistic or cataleptic unresponsiveness – each an effort to ward off or neutralize a mimetic influence which the subject experiences as ontologically threatening. A person risks being diagnosed as hysterical if his attempts to ward off mimetic influence and retain or assert his self-sufficiency are debilitating enough or flamboyant enough to attract the attention of clinicians. For the hysteric, the power of the mimetic model threatens to spiritually eviscerate the reluctant but vulnerable imitator. The latter has available essentially two strategies for resisting the spellbinding influence: autistic imperviousness or histrionic self-proclamation. In commenting on the work of historian Janet Oppenheim (1991), Micale touches on these two strategies:
Two contradictory readings of the hysterical female, she finds, emerge from Victorian medical literature: the will-less hysteric, who lacked the ability to control her emotional impulses and fell into childish and self-indulgent invalidism, and the willful hysteric, who insubordinately asserted her demands in the face of societal imperatives.[17]
These two basic strategies can be used alternately, as two researchers, unaware of the mimetic dynamic at work, nevertheless observed:
Hysterics privately enact the battle between Carnival and Lent, a battle in which the anorexic figure of Lent — a figure represented as emaciated, old and female, a figure of humorless fasting and sexual abstinence — is invariably the victor.[18]
What Stallybrass and White see as the hysteric’s carnivalesque and penitential options represents what we have called the histrionic and catatonic strategies for escaping a mimetic invasion – a hostile takeover, so to speak – due to the hysteric’s relative “ontological diminution” vis-à-vis the model.
If the anorexic figure of Lent is victorious in the arena of her choosing – namely, the competition in thinness, approximating Holocaust survivors or the starving poor – her opposite appears in the carnivalesque exhibitions of unbridled sexual exuberance and deviance, to which even young children are now routinely exposed. The “Lenten” form of hysteria invites professional intervention. Meanwhile, the least expression of moral discomfort with the Carnivalesque mockery of sexual modesty can damage one’s social reputation and employment opportunities and bring down the wrath of the forces that today control the educational, corporate, entertainment, social media, and mainstream news media institutions. Thus does the abandonment of norms quickly become the adamant enforcer of new ones.
. . .
Further evidence for recognizing hysteria as a mimetic disorder is the fact that the treatment Freud found most helpful was itself a mimetic stratagem: the subtle commandeering of the patient’s consciousness by the physician’s adroit manipulation of the encounter. Nor should we be surprised to find that the relief that hypnosis provided had only limited therapeutic efficacy, depending, as it did, on an extended relationship between the patient and the clinician. As Freud observed:
The situation is the same as if the hypnotist had said to the subject: “Now concern yourself exclusively with my person; the rest of the world is quite uninteresting.” … The hypnotist … makes the person upon whom he is experimenting sink into an activity in which the world is bound to seem uninteresting to him; but at the same time the subject is in reality unconsciously concentrating his whole attention upon the hypnotist, and is getting into an attitude of rapport, of transference on to him.[19]
For the patient, the psychotherapist became the equivalent of what the cricket pitch or soccer field had been for Woolf’s Percival. It served to arrest the otherwise fickle mimetic fascination and anchor it in one place. Freudian transference gave the patient relief from whatever mimetic entanglements had given rise to the hysteric symptoms from which relief was being sought. The relationship with the therapist provided the patient with the ballast needed to retain a semblance of subjective authenticity.
Before we turn to Freud’s description of therapeutic hypnosis at work, let us revisit a passage from Bob Dylan’s Nobel Laureate lecture, which will prepare us to assess what Freud called the transference.
He was powerful and electrifying and had a commanding presence. I was only six feet away. He was mesmerizing. I watched his face, his hands, the way he tapped his foot, his big black glasses, the eyes behind the glasses, the way he held his guitar, the way he stood, his neat suit. Everything about him. He looked older than twenty-two. Something about him seemed permanent, and he filled me with conviction. Then, out of the blue, the most uncanny thing happened. He looked me right straight dead in the eye, and he transmitted something. Something I didn’t know what. And it gave me the chills.
Again, we call attention to the religious tone that Dylan obviously felt was entirely appropriate to his experience. However inchoately, Dylan’s encounter with Buddy Holly awakened him to the fact that his life was not about himself. And that is the beginning of a religious awakening, the crowning glory of which was first declared by St. Paul: “I live, now no longer I, but Christ lives in me.” (Gal 2:20)
With that in mind, here is how Freud describes the effect of therapeutic hypnosis:
Let us recall that hypnosis has something positively uncanny about it; but the characteristic of uncanniness suggests something old and familiar that has undergone repression. Let us consider how hypnosis is induced. The hypnotist asserts that he is in possession of a mysterious power that robs the subject of his own will; or, which is the same thing, the subject believes it of him. … This mysterious power … must be the same power that is looked upon by primitive people as the source of taboo, the same that emanates from kings and chieftains and makes it dangerous to approach them (mana). The hypnotist, then, is supposed to be in possession of this power; and how does he manifest it? By telling the subject to look him in the eyes; his most typical method of hypnotizing is by his look. But it is precisely the sight of the chieftain that is dangerous and unbearable for primitive people, just as later that of the Godhead is for mortals. Even Moses had to act as an intermediary between his people and Jehovah, since the people could not support the sight of God; and when he returned from the presence of God his face shone — some of the mana had been transferred on to him, just as happens with the intermediary among primitive peoples.[20]
“I felt related, like he was an older brother. I even thought I resembled him,” Dylan told his Nobel audience. “He was the archetype. Everything I wasn’t and wanted to be.” How does the hypnotic power of the therapist take hold of the patient, asks Freud. By “telling the subject to look him in the eyes,” and becoming a god in his eyes.
Freud veered away from the huge mimetic implications of his observations and embraced the sexual and Oedipal theories as an alternative. Nevertheless, the therapeutic successes, such as they were, remained dependent on a mimetic manipulation, namely, transference. Writes Mikkel Borch-Jacobsen:
Do we really know what psychoanalysis defined itself against, and why? Hypnotic suggestion had undoubtedly been relegated to the obscurity that bordered and preceded psychoanalysis. But precisely as obscurity: it had been rejected with no one the wiser as to exactly what had been rejected. … Hypnosis was abandoned … by virtue of a denial, a rejection, a suppressed hostility that was obscure, so to speak, to itself. Freud simply wished to hear nothing further on the subject of hypnosis.[21]
Admitting to a “feeling of muffled hostility,” to what he called the “tyranny of suggestion,” Freud nevertheless had to admit, as he put it, that “suggestion (or more correctly suggestibility) is actually an irreducible, primitive phenomenon, a fundamental fact in the mental life of man.”[22] He grudgingly conceded that “the riddle of suggestion” would have to be investigated further. In doing so, however, he began with an assumption that decided the matter from the outset. “There is no doubt,” he wrote, “that something exists in us which, when we become aware of signs of an emotion in someone else, tends to make us fall into the same emotion . . .”[23] Like Descartes before him, Freud turned inward to escape from evidence of the mimetic realities of human existence, which he must have instinctively realized would leave his sexual theories in shambles.
What Freud called the “riddle of suggestion” is the truth about the mimetic nature of desire, which was beginning to break through the conceptual barriers built and repaired over centuries to keep it sequestered. But by assuming that symptoms of mimetic hyperactivity were due to something located inside the hysteric – in Cardinal Ratzinger’s terms, something in the psychic inventory – Freud managed to turn away from the very mimetic reality that was declaring itself in his patients’ symptoms. As Jean-Michel Oughourlian put it, “by hiding the Other inside the subject in the guise of the unconscious, Freud preserved and protected its autonomy.”[24] Hysteria, hysterical identification, and suggestibility were all found to have fundamentally sexual origins. Childhood sexual trauma was their cause. Psychoanalysis was born, and the anthropological revelation with which we would eventually have to reckon was postponed for another hundred years. Freudian theory triumphed because, as Oughourlian puts it: “culture recognized in that new mythology the least harmful of disguises for the reality it wanted to keep hidden.”[25]
Of course, whether or not the fruit of Freud’s “panic-stricken refusal to glance, even furtively, in the only direction where meaning could still be found,”[26] and his aversion to “the riddle of suggestion,” was the least harmful alternative can now be better adjudicated. For Freud’s sexual theories served as the launching pad for the sexual revolution and its demolition of the nuclear family, the catastrophic consequences of which grow more conspicuous by the day.
The real historical significance of the Freudian sexual theory, therefore, was that it turned attention away from discoveries that Freud would otherwise have been obliged to make. So glaring was the clinical evidence pointing in the direction that Freud chose not to follow, and so aware of this evidence was Freud himself, that nothing less than the most glamorous of alternative theories could have deflected attention from the besetting distress. Going forward, however, the theory with which Freud steered clear of the discovery that otherwise awaited him is far less interesting than the neglected discovery itself.
Footnotes:
[1] Nicholas Bakalar, “Freud, 1909,” The New York Times, Oct. 10, 2011. Three weeks before this obituary appeared, German forces had interrupted what was left of European complacency, but this apparently did not keep the Times’ editor from being complacent about Freud’s own contribution to the disruption.
[2] Samuel Bendeck Sotillos, Dismantling Freud: Fake Therapy and the Psychoanalytic Worldview, (Brooklyn, NY: Angelus Press, 2020), 1.
[3] Carroll Smith-Rosenberg, “The Hysterical Woman: Sex Roles and Role Conflict in Nineteenth-Century America,” in Disorderly Conduct: Visions of Gender in Victorian America (New York: Knopf, 1985), 197; quoted in Mark S. Micale, Approaching Hysteria: Disease and Its Interpretations, (Princeton, NJ: Princeton University Press, 1995), 3.
[4] Sigmund Freud, The Interpretation of Dreams, trans. James Strachey, (Basic Books, 2010), 139.
[5] Mark S. Micale, Approaching Hysteria: Disease and Its Interpretations, (Princeton, NJ: Princeton University Press, 1995), 112-13.
[6] Mark S. Micale, Approaching Hysteria: Disease and Its Interpretations, (Princeton, NJ: Princeton University Press, 1995), 27.
[7] Hans Urs von Balthasar. The Christian and Anxiety, trans. Dennis D. Martin and Michael J. Miller, foreword by Yves Tourenne, O.F.M. (San Francisco: Ignatius Press, 2000), 36.
[8] Jean-Michel Oughourlian, The Mimetic Brain, trans. Trevor Cribben Merrill, (East Lansing: Michigan State University Press, 2016), 160-61.
[9] Philip Rieff, Freud: The Mind of the Moralist, (Garden City, NY: Doubleday, 1961), 281.
[10] Mark S. Micale, Approaching Hysteria: Disease and Its Interpretations, (Princeton, NJ: Princeton University Press, 1995), 34.
[11] Diagnostic and Statistical Manual of Mental Disorders (1987); quoted in Elaine Showalter, Hystories: Hysterical Epidemics and Modern Media, (New York: Columbia University Press, 1997), 102.
[12] Mikkel Borch-Jacobsen, The Freudian Subject, trans. Catherine Porter, (Stanford, CA: Stanford University Press,1988), 49.
[13] Ibid., 28.
[14] Sigmund Freud, The Basic Writings of Sigmund Freud, ed. and trans. A. A. Brill, (New York: Random House, 1938), 228.
[15] Mikkel Borch-Jacobsen, The Freudian Subject, trans. Catherine Porter, (Stanford, CA: Stanford University Press,1988), 49.
[16] Virginia Woolf, The Waves, (San Diego: Harcourt Brace Jovanovich, 1978), 43.
[17] Mark S. Micale, Approaching Hysteria: Disease and Its Interpretations, (Princeton, NJ: Princeton University Press, 1995), 104. Incidentally, the American Psychiatric Association defines histrionic personality disorder (HPD) as one that features extreme attention-seeking, usually in the form of a need for approval or flirtatious behavior. Doctors diagnose four times as many women with HPD as men, and it affects around three percent of the general population. Apart from these symptoms, individuals with HPD are often able to function at a high level and remain successful socially, in school, and at work.
[18] Peter Stallybrass and Allon White, The Politics and Poetics of Transgression, (Ithaca, New York: Cornell University Press, 1986), 184.
[19] Sigmund Freud, Group Psychology and the Analysis of the Ego, translated by James Strachey, (New York: W. W. Norton, 1989), 74.
[20] Sigmund Freud, Group Psychology and the Analysis of the Ego, trans. James Strachey, (New York: W. W. Norton, 1989), 73-4.
[21] Mikkel Borch-Jacobsen, The Freudian Subject, trans. Catherine Porter, (Stanford, CA: Stanford University Press,1988), 149, 150.
[22] Sigmund Freud, Group Psychology and the Analysis of the Ego, translated and edited by James Strachey, (New York, W.W. Norton, 1989), 27.
[23] Ibid., 27, italic emphasis added.
[24] Jean-Michel Oughourlian, The Puppet of Desire, trans. Eugene Webb, (Stanford: Stanford University Press, 1991), 151.
[25] Ibid., 159.
[26] René Girard, Things Hidden since the Foundation of the World, (Stanford: Stanford University Press, 1987), 261.
No Florilegia Skype call was scheduled for December. We will reconvene our Florilegia call in January.
Finding the right Word…
It should not be surprising to learn that the uniqueness of the “Christ Event” left those whose lives had been altered by exposure to that Event at a loss for a vocabulary capable of doing justice to their experience and the testimony of those whose veracity was unimpeachable. This is borne out by the heated controversies in the first centuries of Christianity over which words and phrases were adequate to the task of proclaiming the truth of Christ, most especially when it came to the Trinitarian relations. The truths that constitute the central teachings of Christian faith are revealed truths, which is to say truths which could not have been acquired by any epistemology other than that of faith itself. Henri de Lubac quotes the Reformed theologian, Gabriel Widmer:
The prophets uttered the Word: Jesus incarnated it. Still, God did not provide them with any heavenly language, syntax, or grammar; to express themselves they had to strain their ingenuity to twist borrowed words and phrases. Thus, as soon as it appeared in Palestine, the theology of biblical revelation was as halting as Jacob after his struggle with the angel. It suffered from a disproportion between what it was supposed to announce on behalf of God and the means at its disposal for saying something to men.1
The challenge for the early Fathers of the Church was to fashion the linguistic terms in which the mystery of the Trinity could accurately be expressed without thereby diminishing its mysteriousness and unfathomability. Analogously, the mystery of the human person remains, however precise or useful the linguistic or analytical tools we employ in probing it. For instance, we have relied on the anthropological insights of René Girard, but we would be unfaithful both to Girard’s contributions and to his religious faith were we to use his insights reductively or clumsily. For both the divine and the human mystery – entirely interrelated as they are – stand in need of revelation if we are to approach their essence. That revelation now exists in its full and final form. And yet, like the Christians of the early centuries of the Christian era, we are faced with the task of finding ever more illuminating and evangelically efficacious ways of accounting for faith in Christ – always within the theological framework of our patristic forebears.
De Lubac quotes the American theologian, B. B. Warfield:
This linguistic revolution, which, as regards its essential elements, was achieved in the course of a few generations, is the most eloquent witness to the spiritual revolution effected by Christianity in the ancient world. No other sect, no other oriental religion, ever occasioned such a profound linguistic differentiation.2
In order to give an account of the Catholic Church, “a reality whose like the world had not then or ever before seen,” writes Henri de Lubac, “entirely new words or groupings of words become necessary to express ideas, sentiments, institutions or behavior unknown to ancient religions.”3
The scriptural prehistory of this revelation in Israel, indispensable to its initial intelligibility, did not provide adequate conceptual or terminological resources for codifying and bearing witness to the ramifications of the life, death, and resurrection of Christ. Neither the linguistic resources of Judaism nor those of Greek metaphysics were capable of accounting for the trinitarian theology that came into view with Christ. These resources were gradually made available thanks to the labors of the early Fathers of the Church. Moreover, as Douglas Farrow has so aptly recognized, the Christian revelation altered the very concept of time itself:
For the incarnation of the eternal God as Israel’s Messiah is an unprecedented affirmation of time that gives new significance to “before” and “after,” distinguishing and separating them quite decisively for the benefit of theology, history, hermeneutics, psychology, literature, music, and so forth – even for cosmology and natural science. “Before” and “after” begin to matter after the incarnation in a way they did not matter before.4
Perhaps the most startling claim in Farrow’s remarkably bold assessment is his assertion that the event of the Incarnation represented an alteration in human psychology, a break between the “before” and “after.” This assessment was published long after the writing of this book had begun, but it comes from a respected theologian whose daring assessment we find completely congenial to the argument we are herein trying to make.
. . .
It was from the theological workshops where Christian thought was struggling to find a vocabulary with which to express both the Christian revelation and the lived experience of those whose lives had been altered by that revelation that the word person acquired its status. Benedict XVI drew special attention to the contribution of one of the early Church’s most brilliant protagonists.
Tertullian shaped Latin into a theological language and with an ingenious, almost incredible assurance was able to set down in writing a theological terminology that later centuries could not surpass, because at the first attempt it coined permanently valid formulas for Christian thought. So it was Tertullian who also established for the West its formula for presenting the Christian idea of the Divinity: God is “una substantia – tres personae”, one Being in three Persons. At this point the word “person” enters intellectual history for the first time with its full import. Several centuries passed before this statement could be intellectually assimilated and mastered, so that it was no longer merely a statement but really became an insight into the mystery, which it taught Christians to apprehend somehow, if not fully to comprehend.5
The future pope marveled at the boldness of the Tertullian formulation, amazed at how the second-century Carthaginian seemed to find lasting formulations with an “almost somnambulistic assurance.”6 Somnambulism is not always a reliable guide, and it may have contributed to Tertullian’s later drift into heterodoxy, but this was after he had left an indelible mark on Trinitarian thought and challenged Christian posterity to recognize the Christian ordination of the human person, which it is the burden of this book to defend. Be that as it may, to be guided by the hand of providence is to experience an “almost somnambulistic assurance,” and it was this providential guidance that brought the word person into prominence, not only in Christian theological thought, but in the discourse of those cultures that had the great good fortune to fall under Christian influence.
1. Quoted in Henri de Lubac, The Christian Faith: An Essay on the Structure of the Apostles’ Creed, trans. Br. Richard Arnandez, F.S.C., (San Francisco: Ignatius Press, 1986), 267.
2. Quoted in Henri de Lubac, The Christian Faith: An Essay on the Structure of the Apostles’ Creed, trans. Br. Richard Arnandez, F.S.C., (San Francisco: Ignatius Press, 1986), 267.
3. Henri de Lubac, The Christian Faith: An Essay on the Structure of the Apostles’ Creed, trans. Br. Richard Arnandez, F.S.C., (San Francisco: Ignatius Press, 1986), 271.
4. Douglas Farrow, 1 & 2 Thessalonians (Brazos Theological Commentary on the Bible), (Grand Rapids, MI: Brazos Press, 2020), 27.
5. Benedict XVI, Dogma and Preaching, (San Francisco: Ignatius Press, 2011), 182.
6. Benedict XVI, Dogma and Preaching, (San Francisco: Ignatius Press, 2011), 182.
To be discussed on Saturday November 21st 2020 during our video/teleconference call:
A preliminary word
Dear Friends,
In the manuscript on which I have been working, I undertake a very cursory “genealogy” of the “sovereign self” – whose disintegration is now underway. As I mentioned to those who joined us on our Skype last month, the plan for the first half of the book is to discuss the contributions – pro and con – of Augustine, Duns Scotus, William of Ockham, Descartes, Rousseau, Nietzsche, Dostoevsky, and Freud, among others. I sometimes have doubts about this, but overall, I think this material is germane enough to the larger argument to merit inclusion. The chapter below on Descartes is representative. If you have time to join us for the next “Florilegia” discussion, I look forward to any thoughts you may have on this sample chapter.
I am enormously grateful to all who have commented and offered advice, and, of course, I am especially grateful to my old, dear friend Randy Coleman-Riese for keeping the Cornerstone Forum plates spinning on sticks while I worry over this writing project.
Affectionately,
Gil
P.S. In the footnotes of the draft version, rather than using the typical Ibid for sources previously quoted, the footnotes are duplicated in full so that these quotations can be moved around later if necessary.
DESCARTES: SENSELESS QUEST FOR CERTAINTY
In two men and in two philosophies, the modern era with its mechanistic undertones and artistic overtones dominates us all. Let us speak of them, so that we don’t remain addicted to them. Let us speak of Descartes and Nietzsche.
René Descartes’ mother died shortly after childbirth; when Friedrich Nietzsche was five years old, his father had a fatal accident. These two events – here, the disappearance of a mother; there, of a father – embody the beginning and the end of the modern era. These events created a one-generational time, a time without ancestors and without heirs.1 – Eugen Rosenstock-Huessy
Countless attempts have been made to trace the genealogy of the modern and postmodern predicament. The swirling currents and eddies that have helped shape the world in which we live are indeed complex. We have neither the competence nor the desire to rehearse or retrace work done by those more learned in these matters. Nonetheless, what follows would be inadequate were we to fail to touch briefly on a few of those earlier intellectual and cultural influences which have led to our present predicament, beginning with the influence of René Descartes and the historical and intellectual circumstances to which he was responding.
If Rosenstock-Huessy is right, then modernity was, in a manner of speaking, fathered by a motherless Frenchman and euthanized by a fatherless German. If Descartes was literally motherless from an early age, he was neither spiritually fatherless nor wanting for intellectual legatees. As for his intellectual forefathers, his legacy can perhaps best be understood with reference to the two chief predecessors under whose influence he fell: Saint Augustine (354-430) and the English Franciscan William of Ockham (1285-1347).
. . .
The Franciscan John Duns Scotus (1266-1308) died as Europe was coming to grips with the failure of the Crusades and descending into the Little Ice Age. Within a generation, the Hundred Years War would begin, followed thereafter by the Black Death, which indiscriminately took perhaps fifty million lives, more than half of Europe’s population. These were the historical circumstances that awakened doubts regarding God’s providential care and justice.
Duns Scotus, known as the Subtle Doctor, placed the emphasis on the Incarnation as the central predestined event of creation itself. Where Thomas Aquinas privileged man’s rationality as his distinguishing characteristic, for Scotus it was love, and love was first and foremost an act, not of reason, but of will. For him, an act could be virtuous or sinful only because it was willfully performed. Moral life depended less on the authority of legal proscriptions and more on the will of the acting agent. The knowledge of God, so to speak, was reserved for those who conform their will to the will of God. For Scotus, man’s eschatological destiny was not the beatific vision, as it was for Thomas, but rather entry into the loving embrace of Trinitarian life, which awaited an unconditional Yes to God’s will. While for Thomas man has been given a rational faculty capable of recognizing God’s existence and his laws, for Scotus man’s highest calling is to do God’s will: to say Yes to God’s ordinances, understood as the means whereby man attains the ends toward which he is ordered and ordained.
Rather than think of the will as it is popularly understood today – as an outright willful act – we come closer to Scotus by speaking of willingness, first and foremost a simple Yes to the invitation of Christ. According to Scotus, had there been no Fall, no original sin, man would still have faced a decision to align his will with the will of God, because what was at issue was not just moral rectitude, but loving devotion to the Author of the moral life. Perhaps we could say that Thomas and Scotus tracked two perfectly compatible paths of faith, the former emphasizing the faculty of reason and intellectual inquiry and the latter giving more weight to the reciprocation of God’s love in the form of a willing surrender to God’s will. The moral status of an act might be objectively licit, but for Scotus it was primarily the willingness to perform the act that carried the greatest moral weight.
These fourteenth-century developments provide the backdrop for the revolutions that followed, not least for both the Protestant and Cartesian Reformations, each in its own way an attempt to restore confidence in God’s providence. Here we come upon the figure who has rightly been blamed for a great deal of the confusions of our age: William of Ockham (1285-1347). Not only was Ockham himself influenced by his fellow Franciscan, John Duns Scotus, but by most accounts it was in the transposition that took place from Scotus to Ockham that the decisive mistake was made, a mistake worthy of Chesterton’s famous quip that, “if some small mistake were made in doctrine, huge blunders might be made in human happiness.”2
For Scotus, what mattered was God’s will, and creatures were in no position to pronounce upon it. Ockham believed that God’s will was reliable, but that it could not be taken for granted. At least theoretically, God could change and will something not apparently in concert with human moral judgments. Divine freedom could not be limited by human assessments of God’s rationality. Divine contingency preempted divine rational necessity. For Ockham, the world operated according to God’s will, not according to rational principles or eternal ideas that God had built into the order of creation, principles or ideas that operated thereafter on their own, so to speak. For Ockham, the rationalism of Thomas and the scholastics infringed on the freedom of God. However stable and enduring the moral principles written by God into the order of the world might be, for Ockham divine freedom required that these principles not be unalterable. So, quite suddenly, the fixed laws of both the physical and the moral world were giving way to contingency and the freedom of divine will. A chasm of uncertainty opened.
Once the role of the will had been thus emphasized, questions naturally arose concerning the legitimate latitude that might be thought consonant with the God revealed by scripture, accessible, as the scholastics insisted, to right reason. To the extent that aligning oneself with God’s will took precedence over any rational measure by which man might come to know God, an abyss began to open. Benedict XVI summed up with his characteristic economy and lucidity the issue at stake:
Duns Scotus has developed a point to which modernity is very sensitive. It is the topic of freedom and its relationship with the will and with the intellect. [Scotus] underlines freedom as a fundamental quality of the will, introducing an approach that lays greater emphasis on the will. Unfortunately, in later authors this line of thinking turned into a voluntarism, in contrast to the so-called “Augustinian and Thomist intellectualism.” For St Thomas Aquinas, who follows St Augustine, freedom cannot be considered an innate quality of the will, but, the fruit of the collaboration of the will and the mind. Indeed, an idea of innate and absolute freedom – as it evolved, precisely, after Duns Scotus … risks leading to the idea of a God who would not even be bound to truth and good.3
It is one of the ironies of history that just as the great flowering of theological and philosophical thought inspired by Latin Christianity’s rediscovery of Aristotle was being made possible by Islam’s preservation of portions of the Aristotelian corpus, Islam itself was rejecting philosophical and theological inquiry in favor of a doctrine of Allah’s inscrutable will as laid down in the Qur’an. More ironic still is the fact that Ockham’s challenge to the use of reason by the scholastics eventually gave rise to a doctrinaire individualism that is the mirror image of Islam’s theological voluntarism. The two cultural and historical forces that grew up around these two forms of voluntarism now face each other in a struggle for the soul of Europe.
Pope Benedict was quick to point out that Duns Scotus “does not fall into these extremes,” and that “if he speaks of a ‘primacy’ of the will, he argues this precisely because the will always follows the intellect.”4 Not so with Scotus’ most prominent interpreter. For Ockham, freedom is so central to moral action that enforced morality becomes a contradiction in terms. He understood divine omnipotence in an absolute way: God can will whatever he chooses – without deference to human moral assessments or rational expectations. What God wills is morally self-validating. Man’s rational or moral calculation has no bearing. Suddenly, the world was no longer under the providential, loving, and chastising care of a God on whose ordinances mortal men could confidently rely. For Ockham, divine omnipotence remains inscrutable, rendering nugatory any attempt to reckon with the appropriate human response to it.
In time, the overall drift of this Ockhamite theology of divine unpredictability opened a chasm under Christians anxious to please God and/or avoid perdition. This uncertainty eventually gave rise, not only to the Protestant Reformation and the Enlightenment, but to the rights-obsessed antinomian individualism that is now the most formidable challenge to the moral and cultural patrimony on which our civilization has been founded.
Ockham insisted, much as Al-Ghazali had in the Islamic world two centuries earlier, that God was free to do anything he chose. There were no universals – no unalterable truths, no goodness or beauty – with which God’s will would by inner necessity be in harmony. “If there are no real universals,” writes Michael Allen Gillespie of the Ockhamist revolution, “every being must be radically individual.”5 Thus did theological nominalism lead to something like the opposite of what it led to in the Islamic world, namely ontological individualism. In this way the Western world’s anthropological voluntarism would in due course come face to face with its mirror image: the implacable will of Allah that Islam regarded as set unalterably in Qur’anic stone for all time.
. . .
If Descartes was among those thrown into metaphysical uncertainty by the idea of divine unpredictability, his misgivings about the received wisdom of his culture were not so sweeping as to eliminate the influence of one of his most prominent forefathers, St. Augustine. From the great Bishop of Hippo Descartes learned to turn inward and to place more trust in what he found there than in whatever might have come to him by way of his spiritual, cultural, or intellectual inheritance. This attitude hardly does justice to Augustine’s vast legacy, but it seems to be what stood out for Descartes. Materially motherless by unhappy circumstance, Descartes, the mathematician, rendered himself culturally and spiritually fatherless on principle. In doing so, he left his intellectual heirs a poisoned patrimony, one that was lethal to tradition and the transgenerational concord which it safeguards.
If Descartes attempted the impossible – to be free of influence – his own influence on subsequent generations was such that we cannot afford to overlook the resources on which he drew, for they are part of the story of the sovereign self that is today in such a spiritual and existential crisis. “Augustine,” writes Robert Solomon, “described his ‘inner’ self quite thoroughly, even capping his analysis with the precocious Cartesian insight, ‘I think ergo I am.’”6
For several years, Augustine was a follower of the Manicheans, a Gnostic and strictly dualistic sect, but a mind and heart like Augustine’s could not long be housed in such narrow confines. Augustine’s eventual conversion to Christianity turned on the inspirational example of two men: Ambrose, bishop of Milan, and St. Paul, the former by virtue of his personal sanctity and the eloquence of his preaching and the latter by the power of one passage from his Letter to the Romans, to which Augustine was providentially drawn in the midst of a spiritual crisis. In the year 386, he heard a child singing “Pick it up and read, pick it up and read,” and he took it to be a message from God. He opened a volume of Pauline epistles at random and read:
. . . not in reveling and drunkenness, not in debauchery and licentiousness, not in quarreling and jealousy. Instead, put on the Lord Jesus Christ, and make no provision for the flesh, to gratify its desires. (Rm 13:13-14)
The future bishop of Hippo and one of the greatest minds and most impassioned hearts of the early Church was baptized by Ambrose during the Easter Vigil of April 24, 387. It might be said, however, that Paul’s “make no provision for the flesh” – valid as it is – did not exactly serve to inoculate Augustine against the Neoplatonic excesses of his earlier period. And it was this aspect of the Augustinian legacy that made its way into the thinking of the French mathematician-turned-philosopher. As Jeffrey Stout has written:
Descartes can now be located squarely within the Platonic-Augustinian tradition. Indeed, it was this tradition which supplied most of the concepts and images he used in his attempt to transcend all tradition. Once the unacknowledged debt is recognized, it should no longer be surprising that the attempt did not succeed.7
Irenaeus’ foundational repudiation of Gnosticism in the second century dealt a blow to Neoplatonic interpretations of Christianity, but the struggle to extricate Christianity from Gnostic and Neoplatonic influences would trouble Christianity for centuries. According to Hans Urs von Balthasar, “the spiritualistic temptation in the purer form of the platonic and neoplatonic myths take control of Christian theology and it will require long and confusing struggles before the position that had crept into Christian thought could be eliminated.”8 Indeed, one of the places where Neoplatonism burrowed into Christian thought in the post-Irenaean period was in the thought of Augustine. Notwithstanding Augustine’s immense and indispensable contribution to Christianity’s historical self-understanding, the Neoplatonism on which the bishop of Hippo still largely drew led his intellectual and spiritual heirs into theological imprecisions whose implications were at the time unforeseen. “At the time of his conversion in Milan,” writes von Balthasar, “Augustine was assiduously practicing Neoplatonic self-absorption.”9 “The story of Augustine’s intellectual development does not begin with Platonism and end with Christianity,” writes Phillip Cary. On the contrary, Augustine’s is “a distinctive brand of Christian Platonism in the making.”10 Whatever the lingering effects of Augustinian Christian Platonism, Cardinal Joseph Ratzinger was surely right when he insisted that Augustine’s doctrine of the Trinity was “one of the most momentous developments of the Western Church. In fundamental ways it influenced both the concept of the Church and the understanding of the person.”11
As always, there are nuances and paradoxes. Henri de Lubac addressed the issue of Augustine’s Neoplatonism more sympathetically: “We can only marvel at the assimilative power of Christian life as manifested in [Augustine’s] attitude, and conclude with Mgr. Régis Jolivet that by means of Augustine’s comments on Plato ‘it was not Augustine who became a neo-Platonist, but Plato who became a Christian.’”12 De Lubac cites another equally sympathetic philosopher, Etienne Borne:
Christian Platonism is a historical fact; but this demanded of St. Augustine a confrontation and a combat like that between Jacob and the angel, from which one of the protagonists, philosophy, emerged limping and bearing the traces of its lucky defeat.13
By transposing the divine persons into intra-psychic categories, Augustine seriously compromised what is surely the most essential, radical, and counter-intuitive element in Trinitarian theology, namely, the interdividual communion of Persons in the Trinity. As noted above and as we will explore further below, the very word person found its way into popular use thanks to the efforts of theologians trying to account for the greatest of all mysteries, the Divine Trinity. The fact, revealed in scripture – Genesis 1:27 – that we are made in the likeness of our Creator, suggested to Augustine’s creative mind that traces of our trinitarian birthmark could be found, and he undertook the task of locating these traces. He brought to that task Neoplatonic habits of thought which, in the words of Karl Rahner, “have held us in bondage for two thousand years.”14 This formation predisposed him to search for evidence of man’s trinitarian godlikeness by turning inward. He famously proposed that the watermark of man’s trinitarian provenance was to be found in his memory, will, and intellect. This inward turn, according to the then-Cardinal Ratzinger, obscured the deeper anthropological implications of the revelation of the divine Trinity.
Augustine explicitly transposed this theological affirmation into anthropology by attempting to understand the human person as an image of the Trinity in terms of this idea of God. Unfortunately, however, he committed a decisive mistake here … In his interpretation, he projected the divine persons into the interior life of the human person and affirmed that intra-psychic processes correspond to these persons. … As a result, the trinitarian concept of the person was no longer transferred to the human person in all its immediate impact.15
As Cardinal Ratzinger lamented, so elusive was the mystery that Augustine was trying to explicate that the confusions gave rise to many of the early Christological heresies, many of which were “attempts at locating the concept of the person at some place in the psychic inventory.”16 As a result:
The contribution of Christian faith to the whole of human thought is not realized; it remains at first detached from it as a theological exception, although it is precisely the meaning of this new element to call into question the whole of human thought and to set it on a new course.17
A line stretches from Augustine’s inward turn to Descartes’ effort to arrive at certainty by trying to isolate himself from history, tradition, and human companionship. For Augustine, the inward turn was a turn toward another, toward God. It was Augustine, after all, who gave Christianity its very idea of itself as a civitas, the City of God, a thriving, embattled community. For Descartes, however, the inward turn was a turn away from community, a liberation that required an act of insulation from the influence of others, whether they be contemporaries, ancestors, or heirs. His was a radical emancipation proclamation, the assertion of an epistemological principle that constituted an anthropological claim: that only by renouncing even the most venerable of one’s fellows or forefellows would one’s own thought have access to truth. This was one of the seeds from which has grown the poisonous fruit of sovereign selfhood, the triumph of the will and the twilight of resolve. This obviously constitutes a radical break with the larger Augustinian vision, but neither can the Bishop of Hippo be completely exonerated. As Jaroslav Pelikan has written:
The fundamental reorientation of Western philosophy associated with the name of Descartes was likewise a species of Augustinianism, and the Cartesian “Cogito ergo sum” stands in a direct succession, through the scholastics, with Augustine’s use of thought and doubt for the reality of the self and ultimately for the reality of God.18
“On the way from Plato to Descartes,” writes Charles Taylor, “stands Augustine.”19 And this Augustinian legacy is nowhere more clearly manifested than it is in this passage from Augustine’s De Trinitate.
Yet who ever doubts that he himself lives, and remembers, and understands, and wills, and thinks, and knows, and judges? Seeing that even if he doubts, he lives; if he doubts, he remembers why he doubts; if he doubts, he understands that he doubts; if he doubts, he wishes to be certain; if he doubts, he thinks; if he doubts, he knows that he does not know; if he doubts, he judges that he ought not to assent rashly. Whosoever therefore doubts about anything else, ought not to doubt of all these things; which if they were not, he would not be able to doubt of anything.20
Nor was Descartes anything but an eager recipient of this Augustinian trope. For all his determination to avoid the mimetic influence of others, living and dead, Descartes absorbed this feature of the tradition with open arms. “In his desire to portray his thought as originating ab ovo,” writes Michael Allen Gillespie of Descartes, “he goes to considerable lengths to conceal his sources.”21 To conceal one’s “sources” is to disguise the mimetic influence of one’s model, and the self-conscious originality of Descartes’ epistemological revolution required an obfuscation of the unmistakable influence of the great African bishop. (Meanwhile, the concealing of one’s sources has become a conspicuous feature of the dramatized individualism that characterizes late-modern psychological stagecraft.)
Some have denied a causal link between Augustine’s exploration of the inner self and the desperate Cartesian quest for certainty, which radically renounces any and all received wisdom or cultural inheritance. But most acknowledge that Augustine’s inward turn was there for the taking, and it cannot be doubted that Descartes helped himself to it. Another defender of Augustine, who similarly acknowledges both a link and a hiatus between Augustinian inwardness and Cartesian self-sufficiency, is Charles Taylor. He writes:
Plainly the whole Cartesian project owes a great deal to its Augustinian roots and the enhanced place this tradition gave to inwardness. But plainly also there has been a transposition of this tradition as well. Cartesian internalization has transmuted into something very different from its Augustinian source. For Augustine, the path inward was only a step on the way upward. Something similar remains in Descartes, who also proves the existence of God starting from the self-understanding of the thinking agent. But the spirit has been altered in a subtle but important way. Following Augustine’s path, the thinker comes to sense more and more his lack of self-sufficiency, comes to see more and more that God acts within him. In contrast, for Descartes the whole point of the reflexive turn is to achieve a quite self-sufficient certainty.22
. . .
In redeploying Augustine’s Neoplatonic adaptation, the Cartesian inward turn managed to forsake the one thing that kept the Augustinian one from becoming fatal to Christian thought. Whereas Augustine turned inward in order to find God, Descartes turned inward in search of a self-sufficient source of certainty, and – by extension – identity: “cogito ergo sum.” A glance at the evidence that Descartes himself left us of his great conversion will make it clear that it was motivated by his need to insulate himself from the mimetic influence of others. Alas, the task of fending off the potentially distorting effect of mimetic influences proved more difficult than at first supposed. Recalling the winter sojourn that first set him on his path, Descartes describes the precautions he had to take in order to avoid the taint of mimetic influence and the physical surroundings which most suited his purpose:
The onset of winter held me up in quarters in which, finding no company to distract me, and having, fortunately, no cares or passions to disturb me, I spent the whole day shut up in a room heated by an enclosed stove, where I had complete leisure to meditate on my own thoughts.23
Not only did the Cartesian turn involve the “Neoplatonic self-absorption” of which von Balthasar complained, but the ghost of Ockham’s inscrutable and rationally unpredictable God was in evidence as well, conspicuously so in the First Meditation, where Ockham’s God has “gone native,” taking the form, hypothetically at least, of the Gnostic demiurge haunting a material world incurably infected with its malignity. Descartes writes:
I will suppose not a supremely good God, the source of all truth, but rather an evil genius, supremely powerful and clever, who has directed his entire effort at deceiving me. I will regard the heavens, the air, the earth, colors, shapes, sounds, and all external things as nothing but the bedeviling hoaxes of my dreams, with which he lays snares for my credulity. I will regard myself as not having hands, or eyes, or flesh, or blood, or any senses, but as nevertheless falsely believing that I possess all these things. I will remain resolute and steadfast in this meditation, and even if it is not within my power to know anything true, it certainly is within my power to take care resolutely to withhold my assent to what is false, lest this deceiver, however powerful, however clever he may be, have any effect on me.24
Here we have a malignant, seductive deity, conjuring an illusory world of “external things” whose wiles could only be defeated by the (metaphorical) elimination of the body – “hands, or eyes, or flesh, or blood, or any senses.” Here we have a strange blend of that hermeneutics of suspicion later associated with Nietzsche, Marx, and Freud, as well as the Gnostic suspicion of the body and material reality. In place of the senses with which our species has been endowed for the purpose of assessing concrete reality, Descartes proposed an all-purpose instrument: methodological doubt, distrustful of the human body and gullible with regard to that portion of the body which cogitates. Not only does this passage presage the reappearance in due course of Christianity’s oldest adversary, the ancient Gnostic contempt of the body, but it prefigures the appearance at the dawn of modern psychiatry of that most pluriform of psychological maladies: hysteria, about which we will concern ourselves in subsequent chapters.
When Rosenstock-Huessy declared Descartes to be the inaugurating thinker of our one-generational age, we have reason to believe that he chose precisely the right metaphor. For, of all the mimetic influences that a committed originalist must foreswear, one of the greatest is that of one’s cultural and familial patrimony. The irony, of course, is that those who think themselves thus emancipated, readily expect generations of their own descendants to be the beneficiaries of their own putatively novel contributions.
Descartes’ strategy of renouncing the influence of God and men may tell us more about the Cartesian revolution than the philosopher’s elaborate justifications for it. It indicates where the problem lies for Descartes. It lies with other people. Not only is it others that Descartes fears will distract him, but it is the “cares” and “passions” others arouse in him that he must extinguish in order to think clearly. Sartre’s “hell is other people” is still more than three centuries away, but Descartes has taken the first steps in that direction, not because he is a misanthropist, but because he has intuitively sensed how susceptible to mimetic forces we humans are and how imperceptibly influential these mimetic stimuli can be. At the heart of this search for unmediated knowledge is a wariness about the mimetic influence of others and a fear of the epistemological corruption such an influence might have. It is, of course, a legitimate fear, as the God who created humans that way certainly knew, and for which redemptive provisions had been made – provisions whose literary treasure trove can be found in the Jewish and Christian scriptures.
One could as well live without oxygen as eliminate the mediating influence of others. Descartes’ efforts to do so anticipate the desperate self-referentiality of the modern self and its ridiculous determination to experience its own ever-elusive authenticity. Descartes represents himself as ensconced for six days in an isolated Dutch garret, where he communes only with his own thoughts, but at length there seems to be more to wall out than just the physical presence of others, and he must take extreme measures to ensure that it is, in fact, his own thoughts on which he is meditating.
In the Third Meditation, Descartes’ vigilance requires that he take leave of his body for fear that physicality might compromise his quest for certain knowledge as much as would received wisdom.
I will now close my eyes, I will plug my ears, I will turn aside all my senses … in this way, concerned only with myself, looking only at what is inside me, I will try, little by little, to know myself, and to become more familiar to myself.25
William Temple called Descartes’ withdrawal into himself “the most disastrous moment in the history of Europe.”26 And yet in the Cartesian move, Paul Zweig sees the distinct traces of a more ancient impulse, one that will emerge with increasing prominence as the Cartesian revolution progresses toward its manifestation in hysteria.
If the prodigal son “came to his senses” at the nadir of his effort to live only for himself, Descartes has chosen quite literally to “lose his senses” in the desperate attempt to insulate himself against whatever might be the machinations of the “evil genius” who threatened his intellectual self-sovereignty.27 It is precisely the straightforward truth of lived reality that defies the ideological geniuses that mimesis conjures up in order to prolong the myth of autonomous individuality. According to Bishop Robert Barron, Descartes “brought all claims to knowledge before the bar of the self-validating ego for adjudication.”28 The result was that “the lonely but unassailably secure Cartesian ego, standing amidst the ruins of culture, intelligence, and sense experience, emerged as the sure foundation of knowledge,” leading inexorably to the world of moral anarchy we have lately been celebrating. It is easy to recognize the Cartesian contribution to our present confusions in Rémi Brague’s observation: “The ultimate goal of Enlightenment was not the joy that one finds in what is, but the pleasure of the subject in not being duped.”29
The modern notion of the “self” is closely bound up with the Greek term autos, the source of our idea of “autonomy.” Autos and its cognates strongly imply, in the words of Kenneth Schmitz, “an identity that preserves itself against others.”30 As Schmitz explains, “in order to preserve itself and retain that identity, such a self must at some point exclude others, or even repel them.” The prevailing Western notion of “individuality” is rooted in the act of “dis-identification” by which the individual distinguishes himself from others. However beneficial such a notion of human subjectivity might have been in helping to extricate “individuals” from the always dangerous mechanisms of social contagion, the West’s canonization of this notion of selfhood was anthropologically untenable. As Joseph Ratzinger pointed out:
[M]an does not find salvation in a reflective finding of himself but in the being-taken-out-of-himself that goes beyond reflection – not in continuing to be himself, but in going out from himself. It means that the liberation of man consists in his being freed from himself and, in relinquishing himself, truly finding himself.31
. . .
In our deeply secularized age, it is difficult to imagine the impact of this idea on the Christian culture of the time. Not least because, even as the idea of a willful God was throwing the moral confidence of Christian civilization into confusion, historical events seemed to call into question confidence in divine providence. Still suffering from the Little Ice Age, fourteenth-century Europe was haunted by the failure of the Crusades a century earlier. The Hundred Years War was grinding on. In the middle of the century, the Black Death struck, taking perhaps fifty million lives, more than half of Europe’s population. It was under these circumstances that Ockham’s theological voluntarism – which mirrored the inscrutable will of Allah in Islam – seemed confirmed by events. “Although Ockham was excommunicated from the Church and his teaching was repeatedly condemned in Paris from 1339 to 1347,” writes Michael Allen Gillespie, “his thought soon became ascendant in much of Europe.”32
In the words of Michael Gillespie, Ockham’s voluntarism was “the first step on the road to the insight that faith alone is the basis of salvation.”33 “Whereas Christianity had previously depicted itself as community and Church,” wrote Joseph Ratzinger, with Luther “it was now the pro me, the ultimate discontinuity of a personalist orientation.” The pressing imperative was to root out the pernicious elements that had crept into the pristine Christian proclamation. That corruptions within the Church needed to be rectified – as they always do – is indisputable. The new spirit, however, was deeply anti-traditional inasmuch as, for Luther and his followers, “history, once understood as the union of promise and fulfillment, is interpreted now as the contradiction between law and gospel.”34 The task at hand was to shake off the lamentable influence of a Hellenistic distortion of Christianity represented by the Catholic Scholasticism of the Middle Ages.
Thinking “historically” becomes thinking anti-historically … the enthusiastic option for history represents, at the same time, an equally decisive rejection of the past, a suspension of all reference to tradition in favor of a program of what is to be done.35
Each in his own way, Martin Luther and René Descartes claimed to have found a source of certainty that would replace the confidence that the older Christian order had found in the scriptural, liturgical, and sacramental life of Christian experience. We touch but for a moment on the path Luther took before returning to the legacy of Descartes. Both of these responses to the crisis of uncertainty that Ockham had unleashed and that bitter circumstances had seemed to confirm represented a tectonic shift in how those they influenced experienced the world and sought to ameliorate the accompanying anxieties. It is the Cartesian influence, however, that is more germane to our investigation of the contemporary plight of the autonomous self. The philosopher Jeffrey Stout supplies a succinct overview:
For Descartes, as for Luther before him, what most matters in life is no longer played out in the dimensions of community and tradition. One discovers truth in the privacy of subjective illumination, and this truth is underlined by a kind of self-certifying certainty. Community, tradition, authority: these have all started to give way to the individual, his inwardness, his autonomy.36
While the Protestant movement’s emphasis on religious individualism bears on the genealogy of the crisis of subjectivity we are here sketching in outline, the line leading from Ockham to Descartes will suffice to exemplify the quest for epistemological certainty by turning inward and disavowing the influence of others – a fool’s errand.
Balthasar summarizes the process that led from Duns Scotus to William of Ockham to the Reformation:
… the Franciscan image of God – love beyond the limits of knowledge – must therefore degenerate into an image of fear (which is no longer even that of the Old Testament), since this God of pure freedom might always posit and demand what is contrary; for instance that man should hate him, that the innocent should be damned and the guilty saved, and why should he not be able to destroy the world in such a way that it would never have existed? And, of course, the late Augustinianism of double predestination makes its appearance here with renewed virulence; from here, it will be bequeathed to the Reformers.37
In due course, Ockham’s theological voluntarism gave birth to an anthropological voluntarism, which devolved into what Cardinal Ratzinger called the dictatorship of relativism. Which raises the question: Christianity has again and again survived crushing oppression. Will it survive an autonomous and anarchistic understanding of freedom? Of course it will, but at what cost? That remains to be seen.
Ockham unwittingly bequeathed to his intellectual descendants a form of theological fatherlessness. For fatherhood, with which we will be concerned in later chapters, depends on reliability. The God whom Jesus called Father was always the God of the Promise, the Promise keeper, the God on whose consistency his creatures could rely. If the God whom Christ taught us to call Father was capable of forsaking His promises, His covenant, or His commandments, He would, in due course, become a terrifying reality.
It did not take long for the uncertainty regarding God’s will to throw those under the sway of that possibility into a profound moral and eschatological uncertainty. How was one to ensure one’s eternal happiness if at any moment God could alter the moral dictates in obedience to which salvation was alone possible? Of those shaken by this possibility, Martin Luther and René Descartes, each influenced by one of the few missteps found in the Augustinian corpus, were arguably the most historically consequential. Luther proposed faith alone as the foundation, with sola scriptura as the reigning proviso. Descartes sought refuge from the Ockhamite God’s radical unpredictability by turning to an equally radical inwardness, not only cut off from all intellectual mediation, but cut off as well from the entire patrimony of the culture of which he himself was so conspicuously a product.
William of Ockham radicalized the emphasis on the will, denying the existence of the universals to which, in the perennial opinion of orthodox Christians, the will was obliged to conform. Ockham’s world was radically unpredictable. Each person was, for all intents and purposes, a species unto himself. The term man, for instance, as a reference to our species, was for Ockham a mere convenience. There was no such thing. There was only this man and that man. There was no nature shared by members of a species. Moreover, no effort to conform one’s will to the will of God was availing, for God’s will was as untethered as man’s. In theory at least, Ockham’s God was radically unpredictable, a god of pure will whose omnipotence empties the world of ordering principles grounded in love. The cultural consequences were glimpsed by the English poet John Donne (1572-1631):
’Tis all in pieces, all coherence gone,
All just Supply, and all Relation;
Prince, subject, father, son, are things forgot,
For every man alone thinks he hath got
To be a Phoenix, and that then can be
None of that kind, of which he is, but he.38
Under this regime, what we have called the human vocation, what in an Aristotelian register could be called an entelechistic impulse, ceases to be an ordination shared by all members of our spectacularly gifted and sin-shackled species. Rather, the human being is thought to be entirely self-defined and self-determined, without regard to our natural endowment and the moral standards that might conduce to its perfection. Under such an anthropological premise, social life completely breaks down, there being no shared assumptions about the meaning and purpose of life itself. The radical autonomy thus enshrined will require protection by the state, and where might the state locate its authority to intervene on behalf of those who essentially assert their immunity from interference? Rousseau’s answer was that people must be forced to be free. “The autonomy principle,” writes Douglas Farrow, “acts like an acid to dissolve what remains of the moral and cultural fabric of Christian civilization.”39 About the consequences of abandoning the Christian foundations of our civilization, Michael Allen Gillespie ominously observed: “The notion of absolute freedom that had its origin in the nominalist notion of God came to a monstrous crescendo in Stalin’s terror.”40
Meanwhile, Jeffrey Stout draws out the implications for our time of the Cartesian inward turn, paralleling as it does the Reformation rejection of the role of tradition:
Descartes’s quest for certainty was born … in a flight from authority. The crisis of authority made an absolutely radical break with the past seem necessary. Methodical doubt therefore sought complete transcendence of situation. It tried to make the inheritance of tradition irrelevant, to start over again from scratch, to escape history. But is this possible?41
What Jeffrey Stout has to say about the legacy of the Protestant Reformation is true as well of Descartes’ legacy:
The Protestant appeal to the individual conscience and inner persuasion in effect produces yet another version of the problem of many authorities. But now we have far more authorities than before, for every man recognizes his own inner light. Every conscience constitutes a separate authority. We are left with no means to settle disagreements about matters of public importance. What started out as an appeal to the single authority of scriptural revelation now seems to recognize, implicitly at least, ten authorities in every pew. The potential for anarchy did not go unnoticed by the Catholic critics.42
Among those Catholic critics was René Girard, who observed: “Once we are deprived of transcendental guideposts, we must trust our subjective experience. Whether we like it or not, we are little Cartesian gods with no fixed reference and no certainty outside of ourselves.”43 We are, in the dire warning of John the Baptist, chaff that the winnowing fork will separate from the wheat and which will thereafter become fuel for a conflagration – whether here or hereafter.
[1] Eugen Rosenstock-Huessy, In the Cross of Reality: Vol. I: The Hegemony of Spaces, trans. Jürgen Lawrenz, (New York: Transaction Publishers, 2017), 281.
[2] G. K. Chesterton, Orthodoxy, (Sam Torode Book Arts, Nashville, TN, 2008), 96.
[3] Benedict XVI, General Audience, July 7, 2010.
[4] Ibid.
[5] Michael Allen Gillespie, The Theological Origins of Modernity, (Chicago: University of Chicago Press, 2008), 22.
[6] Robert C. Solomon, Continental Philosophy since 1750: The Rise and Fall of the Self, (Oxford: Oxford University Press, 1988), 5.
[7] Jeffrey Stout, The Flight from Authority: Religion, Morality, and the Quest for Autonomy, (University of Notre Dame Press, 1981),
[8] Hans Urs von Balthasar, The von Balthasar Reader, ed. Medard Kehl and Werner Löser, (New York: Crossroad Herder, 1997), 383.
[9] Hans Urs von Balthasar, Prayer, (San Francisco: Ignatius Press, 1986), 261.
[10] Phillip Cary, Augustine’s Invention of the Inner Self: The Legacy of a Christian Platonist, (Oxford: Oxford University Press, 2000), 35.
[11] Cardinal Joseph Ratzinger, “Concerning the Notion of Person in Theology,” Joseph Ratzinger in Communio, Vol. 2: Anthropology and Culture, ed. David L. Schindler and Nicholas J. Healy, (Grand Rapids: Eerdmans, 2013), 117-118.
[12] Henri de Lubac, S.J. The Mystery of the Supernatural, trans. Rosemary Sheed, (New York: Crossroad Publishing, 1988), 225.
[13] Henri de Lubac, S.J. The Mystery of the Supernatural, trans. Rosemary Sheed, (New York: Crossroad Publishing, 1988), 226.
[14] Karl Rahner, On the Theology of Death, (New York: Herder and Herder, 1962), 28.
[15] Cardinal Joseph Ratzinger, “Concerning the Notion of Person in Theology,” Joseph Ratzinger in Communio, Vol. 2: Anthropology and Culture, ed. David L. Schindler and Nicholas J. Healy, (Grand Rapids: Eerdmans, 2013), 111, italic emphasis added.
[16] Cardinal Joseph Ratzinger, “Concerning the Notion of Person in Theology,” Joseph Ratzinger in Communio, Vol. 2: Anthropology and Culture, ed. David L. Schindler and Nicholas J. Healy, (Grand Rapids: Eerdmans, 2013), 112.
[17] Cardinal Joseph Ratzinger, “Concerning the Notion of Person in Theology,” Joseph Ratzinger in Communio, Vol. 2: Anthropology and Culture, ed. David L. Schindler and Nicholas J. Healy, (Grand Rapids: Eerdmans, 2013), 113.
[18] Jaroslav Pelikan, The Mystery of Continuity: Time and History, Memory and Eternity in the Thought of Saint Augustine, (Charlottesville: University Press of Virginia, 1986), 151.
[19] Charles Taylor, Sources of the Self: The Making of the Modern Identity, (Cambridge: Harvard University Press, 1989), 127.
[20] De Trinitate, Book X, Ch 14.
[21] Michael Allen Gillespie, Nihilism Before Nietzsche, (Chicago: University of Chicago Press, 1996), 29.
[22] Charles Taylor, Sources of the Self: The Making of the Modern Identity, (Cambridge: Harvard University Press, 1989), 156-57.
[23] Rene Descartes, Discourse on Method and The Meditations, trans. F. E. Sutcliffe (Harmondsworth: Penguin Books, 1968), 35.
[24] Rene Descartes, Meditations on First Philosophy: Meditation I, (Hackett Publishing, 1993), 16-17.
[25] Rene Descartes, Discourse on Method and The Meditations, trans. F. E. Sutcliffe (Harmondsworth: Penguin Books, 1968), 36.
[26] William Temple, Nature, Man and God, (London: Macmillan, 1940), 57.
[27] In the Greek text of the prodigal son story, the younger son comes to his autos. But at that time the word had none of the connotations the word self has acquired in the modern era.
[28] Robert Barron, The Priority of Christ: Toward a Postliberal Catholicism, (Grand Rapids: Baker Academic: 2007), 137.
[29] Rémi Brague, The Kingdom of Man: Genesis and Failure of the Modern Project, trans. Paul Seaton, (Notre Dame, IN: University of Notre Dame Press, 2018), 192.
[30] Kenneth L. Schmitz, “Selves and Persons: A Difference in Loves?,” Communio 18 (Summer, 1991), 185, italic emphasis added.
[31] Joseph Cardinal Ratzinger, Principles of Catholic Theology, trans. Sister Mary Frances McCarthy, S.N.D., (San Francisco: Ignatius Press, 1987), 171.
[32] Michael Allen Gillespie, Nihilism Before Nietzsche, (Chicago: University of Chicago Press, 1996), 23-4.
[33] Michael Allen Gillespie, Nihilism Before Nietzsche, (Chicago: University of Chicago Press, 1996), 21.
[34] Joseph Cardinal Ratzinger, Principles of Catholic Theology, trans. Sister Mary Frances McCarthy, S.N.D., (San Francisco: Ignatius Press, 1987), 157.
[35] Joseph Cardinal Ratzinger, Principles of Catholic Theology, trans. Sister Mary Frances McCarthy, S.N.D., (San Francisco: Ignatius Press, 1987), 159.
[36] Jeffrey Stout, The Flight from Authority: Religion, Morality, and the Quest for Autonomy, (University of Notre Dame Press, 1981), 49-50.
[37] Hans Urs von Balthasar, The Glory of the Lord: A Theological Aesthetics: Vol V: The Realm of Metaphysics in the Modern Age, trans. Davies, Louth, McNeil, Saward and Williams, (San Francisco: Ignatius Press, 1991), 20.
[38] From “An Anatomy of the World,” 1611; quoted in Douglas Farrow, Theological Negotiations: Proposals in Soteriology and Anthropology, (Grand Rapids, MI: Baker Academic, 2018), 255, n.7.
[39] Douglas Farrow, Theological Negotiations: Proposals in Soteriology and Anthropology, (Grand Rapids, MI: Baker Academic, 2018), 188.
[40] Michael Allen Gillespie, Nihilism Before Nietzsche, (Chicago: University of Chicago Press, 1995), xx.
[41] Jeffrey Stout, The Flight from Authority: Religion, Morality, and the Quest for Autonomy, (University of Notre Dame Press, 1981), 68-69.
[42] Jeffrey Stout, The Flight from Authority: Religion, Morality, and the Quest for Autonomy, (University of Notre Dame Press, 1981), 44.
[43] René Girard, Resurrection from the Underground, edited and translated by James G. Williams, (East Lansing: Michigan State University Press), 83.
To be discussed on Wednesday October 14th 2020 during our video/teleconference call:
Dear Friends,
The mountain of research that I have done over the last decade for the book I am writing makes the task of integrating and focusing this material into an intelligible whole quite daunting. I have repeatedly asked myself what thematic thread might legitimately represent all the many features of the argument the book will make. This dilemma has caused me to change the working title and subtitle of the project several times, each change an experiment. The most persistent of these sample titles and subtitles has been: Changing the Subject: From Self to Person. There is much to recommend this choice. But it has at least two defects. First, it lacks the urgency that our present crisis requires, and that recent events have so dramatically underscored. And secondly, it fails to attest – explicitly – to the singular role that Christianity can play in responding to this crisis.
Since those who might join Randy and me for our upcoming Florilegia teleconferences are friends more familiar with our work, let me share (below) my most recent attempt to find a title, subtitle, and initial citations which might guide me in arranging the material coherently. I look forward to what our friends have to say on this matter. I have also chosen to share – for your eyes only – the working draft of a chapter which will appear much later in the manuscript, but which may give some background to the working title I am now considering. Thus, the current working title:
THE APOCALYPSE OF THE SOVEREIGN SELF:
Recovering the Christian Meaning of Personhood
“The knowledge of what it means to be a person is inextricably bound up with the Faith of Christianity.” – Romano Guardini
“The person both as a concept and as a living reality is purely the product of patristic thought. Without this, the deepest meaning of personhood can neither be grasped nor justified.” – John Zizioulas
“The revelation of the person is the event of Christianity.” – Paul Evdokimov
(What follows is the draft of a chapter that will appear much later in the manuscript, and which throws light on the latest working title and subtitle. Earlier chapters in the book highlight the role played in this gradual spiritual evisceration by figures such as William of Ockham, René Descartes, Jean-Jacques Rousseau, Friedrich Nietzsche, Sigmund Freud, and so forth. This chapter is a work in progress, to be shared only with the friends and supporters of the Cornerstone Forum.)
MULTIPHRENIA
For Jesus had said to him, “Come out of this man, you impure spirit!” Then Jesus asked him, “What is your name?” “My name is Legion,” he replied, “for we are many.” (Mk 5:8-9)
. . .
As we saw in an earlier chapter, Freud recognized early in his observations of those exhibiting hysterical symptoms “the capacity of hysterics to imitate any symptoms in other people that may have struck their attention.” He recognized not only the psychological instability that this trait involved but also the extreme fluidity of symptoms likely to be exhibited by those bombarded by multiple mimetic influences. Even earlier, in our reflections on Virginia Woolf’s novel, The Waves, we noted the fluid form of subjectivity experienced by the principal protagonist, Bernard, who at one point declared: “I am not one and simple, but complex and many. … They do not understand that I have to effect different transitions; have to cover the entrances and exits of several different men who alternately act their parts as Bernard.” [1]
Near the end of Woolf’s novel, there appear the first faint signs of what are the most alarming consequences of the mimetic crisis: the indistinguishability of the sexes. The world-weary and wistful Bernard muses, “this is not one life; nor do I always know if I am man or woman, Bernard or Neville, Louis, Susan, Jinny or Rhoda – so strange is the contact of one with another.”[2]
Woolf gives her reader access to Bernard’s interior musings:
. . . we are slipping away. Little bits of ourselves are crumbling. . . . I cannot keep myself together. . . . But what is odd is that I still clasp the return half of my ticket to Waterloo firmly between the fingers of my right hand, even now, even sleeping.[3]
Without an overarching, paradigmatic, “external” model, the mimetic phenomenon can careen wildly and erratically. One recalls the insight of the French philosopher, Rémi Brague:
In my country, and in other ones, too, like Spain, when a cab is for hire and looking for a customer, it has a flag of sorts on which is written “free.” For many of our contemporaries, the model of what “being free” means is the way in which this cab is “free.” This means that it is empty, that it doesn’t go to any particular place, and can be taken over and hired by anybody who can pay.[4]
The very attempt to survey and catalogue the magical, shape-shifting world of a full-blown mimetic crisis runs the risk of appearing to be yet another instance of it. And yet some of the literary adumbrations of this crisis supply pieces of the overall puzzle that we would be remiss to ignore. For instance: whether consciously or not, in the above quotation Rémi Brague has adopted the same metaphor on which T. S. Eliot relied in his own poetic allusion to the selfsame postmodern predicament. At the end of a day structured by responsibilities to one’s employer, one faces a few hours relatively less structured by obligations.
At the violet hour, when the eyes and back
Turn upward from the desk, when the human engine waits
Like a taxi throbbing waiting . . . [T. S. Eliot, The Waste Land]
So fickle is the deracinated self today, and so evanescent are the models on which it more or less unconsciously relies, that once the routine tasks it has contracted to perform come to an end for the day the search begins for another scent to follow, another example to imitate, another opportunity to perform the pantomime of autonomy. Writes René Girard:
We can also predict that when the fascinated being reaches the paroxysmal stage of his sickness he will be completely incapable of maintaining his original pose and will constantly change roles.[5]
Jean-Michel Oughourlian:
Just as God created man in his image, animating the clay with otherness, so do human beings engender one another mutually, not only on the genetic level but also on the psychological level, the self being filled with and saturated by otherness throughout its history and constituted as a patchwork of all integrated others.[6]
There is in Oughourlian’s basically sound insight an inner tension, captured by the words patchwork and integrated. He is doubtless correct that each of us owes a debt to the countless people who have touched our lives. Therein is the patchwork. Ordering that patchwork into a coherent unity requires a touchstone, a polestar, an ordering principle of some kind. In the first instance, parents provide the child with such a touchstone, which gives credence to the lament of Arthur Miller’s Willy Loman who, recalling that his father left when he was just a child, acknowledged that he still felt kind of temporary about himself. Be that as it may, it is worth recalling that the transcendence of parental models is enshrined, not in the first, but in the fourth commandment. To change metaphors, the role of parental figures is that of the Big Dipper, which serves star-gazers in locating the North Star, an apt metaphor of the transcendental grounding which alone satisfies the longing of the human heart.
Willy Loman was a literary manifestation of something that is now quite widespread. There is an epidemic of ontological diminution in our time, and Girard has provided the key for unlocking its underlying source. More and more people today are feeling kind of temporary about themselves. We see all of this foreshadowed of course in Luke’s Gospel:
“When an unclean spirit goes out of a man it wanders through waterless country looking for a place to rest, and not finding one it says, ‘I will go back to the home I came from.’ But on arrival, finding it swept and tidied, it then goes off and brings seven other spirits more wicked than itself, and they go in and set up house there, so that the man ends up by being worse than he was before.” [Luke 11: 24-26]
Two factors are operative. Our ineradicable mimetic nature has grown famished because of the simultaneous emergence of the myth of autonomous individuality, which considers imitation beneath the dignity of the sovereign individual. Meanwhile, the intense urbanization of the human population over the last century and a half – not to mention the plethora of social media platforms that host an ever-swirling aggregation of enticing models – has multiplied the occasions for mimetic fascination. The opportunity for mimetic fascination has grown exponentially while the myth of autonomy grows more dubious.
In an inner monologue in Virginia Woolf’s novel, The Waves, Neville analyzes his classmate, Bernard:
“Once you were Tolstoi’s young man; now you are Byron’s young man; perhaps you will be Meredith’s young man; then you will visit Paris in the Easter vacation and come back wearing a black tie, some detestable Frenchman whom nobody has ever heard of.”[7]
Bernard acknowledges the validity of Neville’s assessment in his own inner monologue: “I am Bernard; I am Byron; I am this, that and the other.”[8] And near the end of the novel Bernard casts a glance back over his life:
I changed and changed; was Hamlet, was Shelley, was the hero, whose name I now forget, of a novel by Dostoevsky; was for a whole term, incredibly, Napoleon; but was Byron chiefly. For many weeks at a time it was my part to stride into rooms and fling gloves and coat on the back of chairs, scowling slightly. I was always going to the bookcase for another sip of the divine specific. Therefore, I let fly my tremendous battery of phrases upon somebody quite inappropriate — a girl now married, now buried; every book, every window-seat was littered with the sheets of my unfinished letters to the woman who made me Byron. For it is difficult to finish a letter in somebody else’s style.[9]
The hero whose name Bernard thinks he has forgotten is surely the protagonist in Dostoevsky’s Notes from Underground, whose name doesn’t appear in the novel.
. . .
The spiritual, psychological, and increasingly ontological predicament in which many – especially the young – are today living has been disturbingly captured by Kenneth Gergen and made all the more distressing by his effort to remain sanguine in the face of it. Like Freud, however, Gergen and a few of his postmodern contemporaries provide an inestimable service by insightfully surveying social and psychological phenomena that they have nevertheless analyzed inadequately. The lived experience of the postmodern self, Gergen seems happy to announce, is multiphrenia.
Like so many postmodern apologists, Mr. Gergen – having diagnosed a self-dissolution that coincides with the loss of Christian sources of hope – must try as best he can to remain cheerful. Now perfectly unencumbered by the modern quest for what de Lubac termed “static sincerity,” the postmodern accommodates to his life as a de-centered “social chameleon,” taking bits and pieces at random from the incessant parade of mimetic models to which he is exposed. “If one’s identity is properly managed, the rewards can be substantial,” Gergen strains to assure his readers: “the devotion of one’s intimates, happy children, professional success, the achievement of community goals, personal popularity, and so on.” All this is possible, he imagines, “if one avoids looking back to locate a true and enduring self, and simply acts to full potential in the moment at hand.” Avoiding this glance backward – the glance that might awaken that blissfully dormant “guilt of self-violation” and its accompanying “sense of superficiality” – is what the indefatigable Norman O. Brown called “improvising a raft after shipwreck,” the shoring up of fragments against one’s ruin. At this point in our explorations, it hardly needs pointing out that what Freud diagnosed as hysteria and what Gergen characterizes as multiphrenia are species of the same predicament.
Among the voices of reason is that of Nicholas Berdyaev:
Inner division wears away personality, and this division can be overcome only by making a choice, by selecting a definite object for one’s love . . . Debauchery means the absolute inability to choose from among many attractions. . . [10]
The Russian philosopher is echoed by the Polish poet, Czeslaw Milosz, who offers a more sobering assessment of what Gergen calls multiphrenia, one more in sync with Jesus’ reference to the seven demons more wicked than the one expelled:
What reasonable man would like to be a city of demons, who behave as if they were at home, speak in many tongues . . . The purpose of poetry is to remind us how difficult it is to remain just one person, for our house is open, there are no keys to the doors, and invisible guests come in and out at will.[11]
Professor Gergen has recognized better than most the features of human anthropology that many find it convenient to ignore, and he has seen the contemporary spiritual, psychological, and cultural crisis with great insight, not least when he writes:
There is an important sense in which each of us is a metaphor for those with whom we come in contact. They provide the images of what it is to be an authentic person, and as we incorporate others’ modes of being – their mannerisms, their styles – we become their surrogates, metaphors of their reality.[12]
This is doubtlessly so. What is more problematic and elusive is how this can be done without compromising the subject’s ontological and psychological integrity.
The fully saturated self, insists Kenneth Gergen, is an accomplishment. It begins, he argues, with the populating of the self by multiple influences. The subsequent achievement of social saturation leads eventually to a multiphrenic condition and the “vertigo of unlimited multiplicity.” “Both the populating of the self and the multiphrenic condition are significant preludes to postmodern consciousness.”[13] That every achievement seems but a prelude to yet another one need not be taken as evidence of the value of what is eventually achieved. The notion that a series of steps, each just disappointing enough to be imagined to be “milestones” but not so obviously wrong-headed as to provoke an about-face, will lead in the end to full satisfaction – that is the animating principle of contemporary liberalism, an application, as we shall see in a later chapter, of Norman O. Brown’s “emergency after emergency of swift transformations.”
As for points of stability in this endless flux, Gergen turns to the columnist whom the Washington Post eulogized as the woman who “brought humor to hanky-panky.”[14]
The columnist Cynthia Heimel argues that because celebrity figures are known by so many people, they serve as forms of social glue, allowing people from different points of society to converse with each other, to share feelings, and essentially to carry on informal relations. “Celebrities,” she proposes, “are our common frames of reference, celebrity loathing and revilement crosses all cultural boundaries.”[15]
It may not be entirely coincidental that no sooner has Ms. Heimel invoked the socially unifying influence of celebrities than she specifies the emotional tone associated with that social solidarity, namely: loathing and revilement. There is a darker truth beneath this curious comment. Societies that count on popular celebrities for their social solidarity are already in the early stages of a crisis, and no celebrity is capable of arresting for long the deepening progress of the crisis. At a later stage in the crisis, the adulation once accorded those at the center of public attention may well reverse its valence and turn – slowly or not so slowly – to antipathy.[16]
Heimel goes on to say, “Celebrities are not our community elders, they are our community.” And this touches on the collapse of social hierarchies, in the aftermath of which no paragons of moral virtue, civic pride, honorable conduct, and nobility – no “elders” – are allowed to grace the public square, much less exert a serious claim on the public imagination of subsequent generations. A celebrity culture is an anti-culture, inasmuch as those figures who attract social attention come and go with such frequency that in due course what is left of the genuine social cohesion inevitably dissipates, leaving the celebrities at the unenviable center of a spiritually and morally exhausted society’s attention. In his defense, it must be said that Kenneth Gergen cannot be accused of deception. His honesty is bracing:
We are not dealing here with doubts regarding claims about the truth of human character, but with the full-scale abandonment of the concept of objective truth. The argument is not that our descriptions of the self are objectively shaky, but that the very attempt to render accurate understanding is itself bankrupt.[17]
There is a great deal of gibberish in Gergen’s book, much of it, one suspects, de rigueur at Swarthmore College in the last decades of the last century and beyond. For instance:
As one casts out to sea in the contemporary world, modernist moorings are slowly left behind. It becomes increasingly difficult to recall precisely to what core essence one must remain true. The ideal of authenticity frays about the edges; the meaning of sincerity slowly lapses into indeterminacy. And with this sea change, the guilt of self-violation also recedes. As the guilt and sense of superficiality recede from view, one is simultaneously readied for the emergence of a pastiche personality. The pastiche personality is a social chameleon, constantly borrowing bits and pieces of identity from whatever sources are available and constructing them as useful or desirable in a given situation.[18]
Gergen is intelligent enough to realize and honest enough to acknowledge the cultural consequences of the multiphrenia he so masterfully surveys. Having dismissed truth as a sufficiently stable category, he helps explain a very troubling feature of both our political and journalistic cultures in recent years. “With postmodernism the distinction between truth and falsity lapses into indeterminacy,” he writes. “The existence of lying in society is thus not an outcome of individual depravity, but of pluralistic social worlds.”[19] These social worlds are held precariously intact by whatever narrative their advocates manage to make plausible. Writes Gergen:
The initial stages of this consciousness result in a sense of the self as a social con artist, manipulating images to achieve ends. As the category of “real self” continues to recede from view, however, one acquires a pastiche-like personality. Coherence and contradiction cease to matter as one takes pleasure in the expanded possibilities of being in a socially saturated world. Finally, as the distinction between the real and the contrived, style and substance, is eroded, the concept of the individual self ceases to be intelligible.[20]
Writing at the dawn of the internet age, long before the technologies of saturation achieved anything like their present scope, Gergen managed to issue this prediction in a spirit of cheerfulness:
Not only do the technologies of social saturation fashion “the individual without character,” but at the same time, they furnish invitations to incoherence. In a humdrum moment, the Vancouver tax accountant can pick up the phone and rekindle a relationship in St. Louis, within less than an hour the restless engineer can drive to a singles bar thirty miles away; on a tedious Friday a New Jersey executive can decide to fly to Tortola for the weekend. … In the final analysis, we find technology and life-style operating in a state of symbiotic interdependence. … The technologies engender a multiplicitous and polymorphic being who thrives on incoherence, and this being grows increasingly enraptured by the means by which this protean capacity is expressed. We enter the age of techno-personal systems.[21]
Thirty years on, the veterans are returning. The romance is over. But from a professorship at Swarthmore in 1991, it seemed promising. We close Professor Gergen’s insightful book with a sense of gratitude and sadness. Indeed, there are hints here and there that the author himself was haunted by such doubts, however heroically he struggled to keep the tone of his prose basically upbeat.
Many are dismayed by the current state of events. It is painful to find the old rituals of relationship – deep and enduring friendships, committed intimacy, and the nuclear family – coming apart at the “seems.” Continuity is replaced by contingency, unity by fragmentation, and authenticity by artfulness. Yet there is no obvious means of return at hand.[22]
That’s a nice touch: coming apart at the “seems,” as if what appeared to be coherence and integrity in an earlier age only seemed to be so, flattering those who ostensibly see beyond what merely “seemed” so to their epistemically encumbered predecessors. “Truth as a correspondence between word and world lapses into nonsense,” writes Gergen triumphantly. “Terms such as sham and pretense in their traditional sense simply don’t apply.”[23] Whatever the shortcomings of this deeply cynical view, Gergen focuses only on the benefits as he sees them:
With the spread of postmodern consciousness, we see the demise of personal definition, reason, authority, commitment, trust, the sense of authenticity, sincerity, belief in leadership, depth of feeling, and faith in progress. In their stead, an open slate emerges on which persons may inscribe, erase, and rewrite their identities as the ever-shifting, ever-expanding, and incoherent network of relationships invites and permits.[24]
A man who doubts the benefits of a shared culture and the lively exchange in a marketplace of ideas comes at last to celebrate a “totalizing discourse” carefully disguised as the ardent enemy of such things.
Totalizing discourses have a final deficit. Not only do such systems truncate, oppress, and obliterate alternative forms of social life; they also set the stage for schism. To be convinced of the “truth” of a discourse is to find the alternatives foolish or fatuous – to slander or silence the outside. Warring camps are developed that speak only to themselves, and that seek means of destroying others’ credibility and influence (and life), all with an abiding sense of righteousness.[25]
It was still possible in 1991 to presuppose that “totalizing discourses” were inherently traditional ones. Three decades on, something like the opposite appears to be the case. Those holding traditional views are routinely silenced by those who rose to power – in educational, entertainment, journalistic, and corporate institutions – in the years since Gergen’s book was published.
Writing three years later, the psychiatrist Robert Jay Lifton began his exploration of the contemporary predicament of the self with these jauntily cheerful words:
We are becoming fluid and many-sided. Without quite realizing it, we have been evolving a sense of self appropriate to the restlessness and flux of our time. This mode of being differs radically from that of the past, and enables us to engage in continuous exploration and personal experiment. I have named it the “protean self” after Proteus, the Greek sea god of many forms.[26]
Which raises a question: If we can assume that today’s university students are the demographic cadre most likely to have undergone a transition into the protean self, how do we explain that they are on average far less open to points of view other than, or contrary to, those to which they so tenaciously cling, and for which they often have such utter contempt?
As of this writing, the official website of the University of California, once one of the most respected educational institutions in the world, lists sixty-six approved terms for expressing an individual’s chosen identity, the great majority of which are terms defining one’s chosen sexual self-identification. In testimony to the extreme fluidity of such definitions, the website prominently acknowledges the date of the survey and cautiously declares:
These terms were last updated in May 2019. For the most complete definitions, we encourage you to compare what you find here with information from other sources as language in our communities is often an evolving process, and there may be regional differences. Please be aware that these terms may be defined with outdated language or concepts.[27]
Appended to this apology for any sexual identity that might not have been listed was an email address for anyone who might want to add a different identity.
[1] Virginia Woolf, The Waves, (San Diego: Harcourt Brace Jovanovich, 1978), 76.
[2] Ibid, 281.
[3] Ibid., 235.
[4] Rémi Brague, Curing Mad Truths (Catholic Ideas for a Secular World), (Notre Dame, IN: University of Notre Dame Press, 2019), 59.
[5] René Girard, Deceit, Desire, and the Novel, trans. Yvonne Freccero, (Baltimore: Johns Hopkins University Press, 1965), 266.
[6] Jean-Michel Oughourlian, The Mimetic Brain, trans. Trevor Cribben Merrill, (East Lansing: Michigan State University Press, 2016), 187.
[7] Virginia Woolf, The Waves, (San Diego: Harcourt Brace Jovanovich, 1978), 87.
[8] Ibid, 89.
[9] Ibid., 249-50.
[10] Nicholas Berdyaev, Dostoevsky, (New York: New American Library, 1974), 125.
[11] Czeslaw Milosz, “Ars Poetica?” from The Collected Poems: 1931-1987. Copyright © 1988 by Czeslaw Milosz Royalties, Inc. Used by permission of HarperCollins Publishers.
[12] Kenneth J. Gergen, The Saturated Self, (New York: Basic Books, 1991), 223.
[13] Ibid, 49.
[14] “Cynthia Heimel, columnist who brought humor to hanky-panky,” The Washington Post, Feb. 26, 2018.
[15] Kenneth J. Gergen, The Saturated Self, (New York: Basic Books, 1991), 56.
[16] René Girard’s opus is a massive explication of this process.
[17] Kenneth J. Gergen, The Saturated Self, (New York: Basic Books, 1991), 82.
[18] Ibid., 150.
[19] Ibid., 169.
[20] Ibid., 170.
[21] Ibid., 173.
[22] Ibid., 181.
[23] Ibid., 187.
[24] Ibid., 228.
[25] Ibid., 252.
[26] Robert Jay Lifton, The Protean Self: Human Resilience in an Age of Fragmentation, (New York: BasicBooks, 1993), 1.
[27] https://campusclimate.berkeley.edu/students/ejce/geneq/resources/lgbtq-resources/definition-terms
To be discussed on Wednesday September 16th 2020 during our video/teleconference call:
PART I: CONVERSION
Dear Friends,
Books are not written, alas, by committee, but the kind of book on which I am working will be one that draws on the wisdom of the “Church” broadly conceived. That is why I regularly cite those theologians and thinkers who have preceded me and guided me, prominent among whom are Hans Urs von Balthasar, Pope Benedict XVI, and, of course, René Girard. It is voices such as theirs that herald a bold reawakening of Christian self-confidence. In a far more modest way, I would join my voice with theirs on themes that seem to me particularly germane to the crisis of our time.
The greatest joy I experienced during the years when I travelled around this country and abroad giving talks and taking part in conferences was the joy of discovery that arose from the exchanges I was able to have with others. Today, of course, there are few opportunities to have such discussions. I will be grateful, therefore, to any of the friends of the Cornerstone Forum who might be able to join Randy and me for an occasional teleconference discussion of selected parts of the manuscript on which I am working. We will keep the discussions short, say 45 minutes or so.
The current working title of this book project is:
The Soul is Naturally Christian: Changing the Subject from Self to Person
Perhaps the shortest synopsis of the project is these words from the great German theologian Romano Guardini: “The knowledge of what it means to be a person is inextricably bound up with the Faith of Christianity.” Establishing the truth of that statement in today’s world is no simple task. I look forward to being helped by any comments, suggestions, and critiques you may offer.
Below are a few excerpts from the first section of the manuscript that might prompt a discussion.
. . .
Bob Dylan is perhaps the most original popular musician of the last half-century. And yet his originality has never been premised on a lack of inspiring role models.
Delivering his Nobel Lecture in 2017, after being awarded the 2016 Nobel Prize in Literature, Bob Dylan began with these words:
If I was to go back to the dawning of it all, I guess I’d have to start with Buddy Holly. Buddy died when I was about eighteen and he was twenty-two. From the moment I first heard him, I felt akin. I felt related, like he was an older brother. I even thought I resembled him. … He was the archetype. Everything I wasn’t and wanted to be. I saw him only but once, and that was a few days before he was gone. I had to travel a hundred miles to get to see him play, and I wasn’t disappointed. … He was powerful and electrifying and had a commanding presence. I was only six feet away. He was mesmerizing. I watched his face, his hands, the way he tapped his foot, his big black glasses, the eyes behind the glasses, the way he held his guitar, the way he stood, his neat suit. Everything about him. He looked older than twenty-two. Something about him seemed permanent, and he filled me with conviction. Then, out of the blue, the most uncanny thing happened. He looked me right straight dead in the eye, and he transmitted something. Something I didn’t know what. And it gave me the chills.
I think it was a day or two after that that his plane went down. And somebody – somebody I’d never seen before – handed me a Leadbelly record with the song “Cottonfields” on it. And that record changed my life right then and there. Transported me into a world I’d never known. It was like an explosion went off. Like I’d been walking in darkness and all of the sudden the darkness was illuminated. It was like somebody laid hands on me. I must have played that record a hundred times.
There are two striking things about Dylan’s description of the conversion he underwent in Buddy Holly’s presence. “He filled me with conviction,” Dylan relates. “He looked me right straight dead in the eye, and he transmitted something. Something I didn’t know what. And it gave me the chills.”
The palpably religious tone of Dylan’s words was echoed in his account of listening to the Leadbelly album: “It was like an explosion went off. Like I’d been walking in darkness and all of the sudden the darkness was illuminated. It was like somebody laid hands on me.” The biblical tone of this account is no accident. Dylan’s subsequent career has made it clear that he has been a grateful inheritor of the Jewish and Christian tradition.
Thus does Dylan’s Nobel Prize Lecture touch upon the debate over the character of natural law, and whether the longing or desire that is at the heart of human life aims only at mundane and attainable goals. Those who argue that human desire is intrinsically ordered toward a transcendent or religious aspiration might want to cite the opening paragraph of Dylan’s lecture as evidence for their view, a view with which we concur.
Human nature, writes Hans Urs von Balthasar, “crawling on the ground, needs to be held up by a trellis if it is to bear fruit.” Grace presupposes nature, elevates it, and perfects it. That is a preeminently Catholic principle. For, as Balthasar has noted:
… the natural man, if he has not already been artificially corrupted, does have a sense of awe in the face of the hidden mystery of being, in face of the ultimate origin and destiny of the world, of matter, of life, of evolution, of the fate of the individual and of humanity. Every religion, from the most primitive to the most sophisticated, lives essentially on this awe.
When we speak of longing, what do we mean? And why do we use that word for it? The verb came into English apparently from the Old High German langen, from which comes the contemporary German word verlangen, meaning “to desire.” If we stop to savor the word, however, we soon find ourselves wondering about one of the most distinctly human of experiences. Most of us will immediately recognize the difference between what we mean when speaking of desire and what we mean when speaking of longing, even though we may be at a loss to explain the difference. Girard has taught us that desire is mimetic, but the poets, prophets, artists, musicians and mystics have taught us that longing is at the heart of who we are. And Flannery O’Connor and Bob Dylan have left evidence for the fact that longing, too, is mimetically aroused.
In retrospect, Dylan’s encounter with Buddy Holly seems to have been an initial moment in the unfolding of his unique personal vocation. That his vocation has taken many twists and turns, not all of them exemplary or necessarily edifying, follows a well-known pattern found in many of our lives, not least this author’s. However meandering the path Dylan took, that his poetic and musical vocation had at its core a religious experience will hardly be disputed by those who have followed his career.
The same religious insight is found in the self-described Hillbilly Thomist, the novelist and short-story writer Flannery O’Connor. The protagonist of her short story Parker’s Back is O. E. Parker. Only reluctantly, and only in a whispering voice, did Parker finally speak his full name to a woman he was wooing: Obadiah Elihue. The Book of Obadiah, the shortest book of the Old Testament, bemoans the treachery of the Edomites and, by extension, of all the hostile pagan nations arrayed against Israel. Obadiah means the servant of God, but O’Connor’s O. E. Parker only evinces that possibility at the very end of her story. For our purposes, and surely for O’Connor, the salient verse from the Book of Obadiah is:
They have driven you right to the frontiers,
they have misled you, all your allies.
They have deceived you, all your fine friends. (Obadiah 1:7)
O’Connor makes no mention of Parker’s father, and this is clearly one of the keys for understanding the story.
His mother wept over what was becoming of him. One night she dragged him off to a revival with her, not telling him where they were going. When he saw the big lighted church, he jerked out of her grasp and ran. The next day he lied about his age and joined the navy.
As so predictably happens, the flight from what is perceived as the narrow constraints of Christianity prefaces a supine submission to another, less reliable, and far more circumscribed and monitored allegiance. Under the circumstances, the navy was hardly the worst choice Parker could have made, but it was somewhat vitiated by the fact that he made it in order to avoid a brush with Christianity, to which he would finally return at the end of O’Connor’s story.
O’Connor’s fictional O. E. Parker is of interest to us inasmuch as he had an experience not unlike the one that Bob Dylan recounts when Buddy Holly looked him “right straight dead in the eye.” It happened quite at random when, at the age of 14, Parker saw the tattooed man at the fair.
Except for his loins which were girded with a panther hide, the man’s skin was patterned in what seemed from Parker’s distance — he was near the back of the tent, standing on a bench — a single intricate design of brilliant color. The man, who was small and sturdy, moved about on the platform, flexing his muscles so that the arabesque of men and beasts and flowers on his skin appeared to have a subtle motion of its own. Parker was filled with emotion, lifted up as some people are when the flag passes.
We humans are mimetic creatures, but, in one way or another, so are most living things. The difference is that we are not only naturally mimetic. We are spiritually so. An animal may mimic the behavior of other animals, but it will never be “filled with conviction” while standing in wonder at another animal. Less still will it be “filled with emotion, lifted up as some people are when the flag passes.” The mimesis operative in human affairs has what might be called a sacramental dimension, and, unless we recognize this, our analysis of that unique dimension of our existence will fall woefully short.
We must not overlook the crypto-religious dimension of O. E. Parker’s moment of conversion, similar in so many ways to what Bob Dylan experienced in the presence of Buddy Holly. Parker, for the moment at least, found himself a “lord” for his life, a guide to meaning and purpose and fulfillment.
Parker had never before felt the least motion of wonder in himself. Until he saw the man at the fair, it did not enter his head that there was anything out of the ordinary about the fact that he existed. … He had his first tattoo some time after – the eagle perched on the cannon. It was done by a local artist. It hurt very little, just enough to make it appear to Parker to be worth doing.
A person is a subject in a way that the autonomous self is not. The latter designation has come to imply a self-authenticating and entirely self-directed individual. The former, the subject, exists as one subject to something or someone else. As understood today, the self need not be a subject; for the self today is no longer determined by a loyalty to a greater purpose. But a subject properly so called is so subordinated. Parker went to the tattooist to light up his body with insignias chosen, not out of any serious commitment on Parker’s part to what the image might have signified, but simply for the colorfulness of the image. Each such image in time “wore off” and he sought another.
Parker would be satisfied with each tattoo about a month, then something about it that had attracted him would wear off. Whenever a decent-sized mirror was available, he would get in front of it and study his overall look. The effect was not of one intricate arabesque of colors but of something haphazard and botched. A huge dissatisfaction would come over him and he would go off and find another tattooist and have another space filled up.
His dissatisfaction, from being chronic and latent, had suddenly become acute and raged in him. It was as if the panther and the lion and the serpents and the eagles and the hawks had penetrated his skin and lived inside him in a raging warfare.
He seeks solace in another tattoo, but this time it must be God:
“Let me see the book you got with all the pictures of God in it,” Parker said breathlessly. “The religious one.”
“Who are you interested in?” he said, “saints, angels, Christs or what?”
“God,” Parker said.
“Father, Son or Spirit?”
“Just God,” Parker said impatiently. “Christ. I don’t care. Just so it’s God.”
Finally, however, O. E. Parker’s effort to lash all the psychological fragments of his existence together and make a raft out of them failed. His furious attempt to avoid self-disintegration by covering his body with one logo after another was having the effect of accelerating his psychological dissolution. In the half century since this story was published, there has been an overwhelming explosion of compulsive tattooing. The prevalence of the O. E. Parker phenomenon would probably have surprised the author who used tattooing metaphorically. Nor was this the only instance of O’Connor’s prescience.
K. V. Turley writes of the change that began with Dylan’s 1979 album, Slow Train Coming:
Whereas, before 1979, there was a slow-burn rage at injustice and lies, after his conversion that rage is filtered through the lens of the Gospel. Previously, Dylan had come across as something of an Old Testament prophet, pointing out what was wrong in society; post-conversion, he had encountered the Messiah: then Dylan was not so much pointing out what was wrong as pointing to the only Person who could make things right.
In an autobiography published in 2004, Dylan wrote of his first experience on stage.
My first performances were seen in the Black Hills Passion Play of South Dakota, a religious drama depicting the last days of Christ. This play always came to town during the Christmas season with professional actors in the leading roles, cages of pigeons, a donkey, a camel and a truck full of props. There were always parts that called for extras. One year I played a Roman soldier with a spear and helmet-breastplate, the works – a nonspeaking role, but it didn’t matter. I felt like a star. I liked the costume. It felt like a nerve tonic … as a Roman soldier I felt like a part of everything, in the center of the planet, invincible. That seemed a million years ago now, a million private struggles and difficulties ago.
Hans Urs von Balthasar:
What is a person without a life-form, that is to say, without a form which he has chosen for his life, a form into which and through which to pour out his life, so that his life becomes the soul of the form and the form becomes the expression of his soul? For this is no extraneous form, but rather so intimate a one that it is greatly rewarding to identify oneself with it. Nor is it a forcibly imposed form, rather one which has been bestowed from within and has been freely chosen. Nor, finally, is it an arbitrary form, rather that uniquely personal one which constitutes the very law of the individual. Whoever shatters this form by ignoring it is unworthy of the beauty of Being, and he will be banished from the splendor of solid reality as one who has not passed the test. Thus, while physically he remains alive, such a person decays to expressionlessness and sterility, is like the dry wood which is gathered in the Gospel for burning. But if man is to live in an original form, that form has first to be sighted. One must possess a spiritual eye capable of perceiving the forms of existence with awe.
THE APOCALYPSE OF THE SOVEREIGN SELF
And the Christian Mystery of the Person
“The revelation of the person is the event of Christianity.” – Paul Evdokimov . . .
“This is the age of anxiety because this is the age of the loss of the self.” – Walker Percy . . .
“Under the influence of Christian thought, we come to a better idea of what a person is.” – Henri de Lubac . . .
The Gospel has not completed its mission. It has something new and old to teach each generation as it arises – something especially for our own ear, which our fathers never heard; an explanation, perhaps, a way of looking at things, a task, fresh orders. – Paul Claudel
PREFACE
. . .
The person both as a concept and as a living reality is purely the product of patristic thought. Without this, the deepest meaning of personhood can neither be grasped nor justified. – John Zizioulas . . .
The knowledge of what it means to be a person is inextricably bound up with the Faith of Christianity. – Romano Guardini . . .
Books give birth to books. This book emerged from its immediate predecessor, so much so that a comment that appeared on the dust-jacket of God’s Gamble: The Gravitational Power of Crucified Love can serve as an epigraph for the book it spawned. The Cross of Christ has left a crater at the center of history, an inflection of sacrificial love toward which everything before and after this event is ordered and properly understood. That Christ is the Alpha and Omega – the logic, the meaning of creation itself, from whom the drama of salvation emanates and toward whom it moves – is a central but often neglected doctrine of Catholic Christianity. Though it is a mystery that will ever elude rational explication, sufficient traces of it can be found.
No amount of argumentation or documentation can convince someone that this is, in fact, true. Faith is ultimately a gift, and neither argumentation nor documentation can produce it in someone who is defined by his determination not to be duped. The journey of faith begins in faith, albeit an incipient form of faith that we call wonder. As that wonder leads to more mature forms of faith, the wonder with which the journey began grows all the more. In fact, the growth of wonder is perhaps the best measure of the health and vitality of one’s faith. This book is not a memoir, and the reader should not expect an account of its author’s journey into the never-ending wonders of Catholic faith, but if some of that wonder is occasionally detectable in the pages to follow, the reader will not be wrong in thinking he has stumbled upon the motive force that gave birth to this book. “Why,” asked the future Pope Benedict XVI, “does faith still have a chance? I would say because it is in harmony with what man is.” In this case as in others, the cure is easier to explain than the disorder it can rectify. That is because the misrecognition of the disorder is of its essence: the disorder is chiefly constituted by its own misrecognition. Nonetheless, its recognition does not, alas, conduce to its cure. . . .
Every child of human parents is a person from the moment of conception and worthy of the reverence and respect rightfully due to that noble status. Nonetheless, our universal human condition as persons is accompanied by the responsibility to fulfill the promise of personhood according to our endowments, the circumstances of our lives, and whatever divine prompting we may be able to discern. Neither our natural endowments nor the social, historical, or cultural circumstances of our lives are dispositive regarding such a vocation. Not infrequently, a person is called to a task for which he has no noticeable qualifications. If it is in fact a genuine call, then the grace made available upon its acceptance will compensate, or more than compensate, for whatever natural talents for the task might be lacking. It is the call, the mission, that is decisive. “Only when we identify ourselves with our mission,” wrote Hans Urs von Balthasar, “do we become persons in the deepest, theological sense of the word.” One could quite confidently say that Christianity exists to help fallen creatures like ourselves discover the mission, in fulfillment of which we fulfill the promise of personhood that is our birthright by becoming persons in the deepest theological sense of the word. The self, we might say, is what happens to human subjectivity when the deeper theological meaning of personhood is abandoned or renounced.
Those who have never been exposed to Christ or to the Church, or whose connection to either has atrophied, are obliged to fashion forms of selfhood from whatever cultural and familial experience is available to them. To the extent that these forms of subjectivity are guided by natural law, they conduce to human dignity and moral responsibility, and deserve respect. Neither Christian forms of subjectivity nor those indebted to other influences can avoid the perils of the journey through the valley of tears which is life in the world. The Christian may well be more regularly reminded of those perils, and of the ways in which he has succumbed to them, but this advantage comes at the cost of greater culpability for yielding to them.
The moral armamentarium of Christian faith is an unqualified blessing, but it confronts Christians with added responsibility. By arguing for the uniqueness of the Christian form of personhood, therefore, we are by no means asserting moral superiority. Among those who have lived Christ-haunted and Christ-centered lives are some whose sins are both conspicuous and grave. When these sins were committed prior to Christian conversion, they often provided the backdrop for it. And when these sins are committed after Christian conversion, the sinner’s Christ-hauntedness can play a key role in his recovery of moral and spiritual dignity.
It was Tertullian (155-220), in his On the Testimony of the Soul, who first declared that the soul is naturally Christian – anima naturaliter Christiana. For Tertullian, the soul is naturally Christian because it is Christ who reveals to us the meaning and purpose of existence, the existence for which the soul naturally longs. The soul does not know its true ordination until its encounter with Christ. One of the echoes of Tertullian’s insight comes from the American historian Bradley J. Birzer: “Any understanding of human dignity in the twenty-first century,” Birzer writes, “demands an understanding of the Judeo-Christian Logos … Without it, there is only chaos and darkness, dispiritedness and confusion, blackness and the abyss.”
Robert Sokolowski suggests that, in today’s philosophically and theologically impoverished environment, perhaps it is better to speak of soul by recourse to a synonym derived from its Latin origin: animation. When we say that the soul is naturally Christian what we are saying is that the animating principle at the heart of the human vocation partakes of the Logos – the divine intention operative in creation itself and fully and finally incarnate in the life, death, resurrection and ascension of Jesus, who is, therefore, the incomparable paradigm of the human vocation as such. It is in the nature of the divine Logos incarnate in Christ to awaken – to animate – those creatures divinely fashioned to respond to that animating provocation and to do so in ways utterly unique to each person’s circumstances, gifts, and spiritual attentiveness.
Finally, a book critical of the sovereign self cannot pretend to be written by one. As we did in God’s Gamble: The Gravitational Power of Crucified Love, we employ the third person pronoun precisely because we make no pretense of originality, save for the sometimes quirky way we have arranged and redeployed the traditional resources, namely the many authorities on which our argument is based. The author takes full responsibility for the ways he has chosen to orchestrate these resources, as well as for the efforts he makes to rescue the staggering implications of the Christian revelation.
INTRODUCTION
… my people have committed two evils: they have forsaken me, the fountain of living waters, and hewed out cisterns for themselves, broken cisterns, that can hold no water. (Jer 2:13)
This is not a storybook; its author hasn’t the gift. But it is a collection of stories and a running commentary on them. Some of these stories are historically factual and some are literary constructions. Both the historical stories and the literary ones will be read in an allegorical key. That is to say, we will recount them in a larger effort to limn the outlines of two more elusive stories: that of the self and that of the person. These two topics suggest themselves for what we hope are more or less obvious reasons: the experiment in autonomous selfhood is in crisis while the nature and meaning of the person is poorly understood. The following reflections on these matters are undertaken, as were many of the Old Testament psalms, in a time of confusion and cultural dislocation. It is fitting, therefore, to adopt as our own the words of the psalmists:
By the rivers of Babylon we sat down and wept when we remembered Zion. On the willow trees there we hung up our harps. (Ps. 137:1-2)
I will turn my mind to a parable, with the harp I will solve my problem. (Ps. 49:4)
In God’s Gamble we argued that faith is a form of intelligence which – though reducible to neither rational inquiry nor empirical corroboration – is nonetheless a rich and incomparable asset for discovering realities that lie on the margins of, and beyond, what can be accessed by more conventional cognitive approaches. With the psalmist who would turn his mind to a parable and solve his problem with a harp, in what follows we will turn our minds to a series of extended parables about the spiritual and psychological predicament in the midst of which we are living. In due course, we will take our harps down from the willow trees to which we consigned them in our despondency when the culture we are called to convert undertook its doomed post-Christian experiment. In conclusion, we will sing as best we can the Lord’s song in an alien land.
Contrary to a widespread assumption, ours is not an irreligious age, not even among those who declare their indifference to religion. In fact, one religion is now completely dominant: secular-progressivism. Both its faithful adherents and its dissenters and reluctant bystanders are daily evangelized in its doctrines – by the educational institutions from kindergarten to graduate school, by the entertainment industry, by most of the so-called “mainstream media,” and more recently by transnational corporations, post-national institutions, and the globalist elites. No religion in history has been so thoroughly and insistently catechized, and few have so rigorously enforced their doctrines. In all of this, Robert Cardinal Sarah sees “an erroneous concept of human destiny.”
The tragically mistaken idea that the superannuation of Christianity can come about with impunity is the predicate for much that has happened in the Christian West since the Renaissance. Christianity can for a time be crushed by another religion or ideology, but its smoldering embers will one day revive it. But abandoning it for what is at best a Christ-flavored agnosticism will prove the validity of Christ’s warning: “Without me you can do nothing.” For these words were spoken to those on whom Christ had left his indelible mark, and they apply to one degree or another to those whose exposure to Christ has been merely cultural. The cultural manifestation of that nothing is the nihilism that is the distinguishing characteristic of the post-Christian world.
We want to argue that what Christianity has done both to cultures seriously exposed to it and by extension to culture itself, it has also done to human subjectivity. It has revealed, beneath and beyond the discourse of self and autonomy, the mystery of the person as such, a mystery that can be fully fathomed only from within a well-formed Christian sensibility, but of which even those oblivious of, or hostile to, Christian faith might be made aware.
In his last essay, published in 1938 and devoted to the subject of the self or the person, the distinguished French sociologist Marcel Mauss wrote:
It is Christians who have made a metaphysical entity of the “moral person” (personne morale), after they became aware of its religious power. Our own notion of the human person is still basically the Christian one.
As with the Christian revelation itself, the person as understood in light of Christian revelation is prefigured in the Old Testament, and there most strikingly in the prophets. As Benedict XVI observed, as the word is used in the Old Testament, the category of prophet is “something totally specific and unique, in contrast to the surrounding religious world, something that Israel alone has in this particular form.” The renowned Old Testament scholar Gerhard von Rad captured this when he wrote of the prophets:
These men became individuals, persons. They could say ‘I’ in a way never before heard in Israel. At the same time, it has become apparent that the ‘I’ of which these men were allowed to become conscious was very different from our present-day concept of personality.
As for the “present-day concept of personality,” it now largely consists of anything anyone might decide about himself. As recently as 1967, when von Rad’s book on the Old Testament prophets appeared in German, even the “present-day” concept of personality retained at least the aura of objective reality and anthropological coherence. As for the disappearance of a shared understanding of the person, the observation of von Rad’s German contemporary Romano Guardini is most apposite.
Personality is essential to man. This truth becomes clear, however, and can be affirmed only under the guidance of Revelation, which related man to a living, personal God, which makes him a son of God, which teaches the ordering of His Providence. When man fails to ground his personal perfection in Divine Revelation, he still retains an awareness of the individual as a rounded, dignified and creative human being. He can have no consciousness, however, of the real person who is the absolute ground of each man, an absolute ground superior to every psychological or cultural advantage or achievement. The knowledge of what it means to be a person is inextricably bound up with the Faith of Christianity. An affirmation and a cultivation of the personal can endure for a time perhaps after Faith has been extinguished, but gradually they too will be lost.
The word “person” entered the vocabulary of Western culture only after Christian theologians, in speaking of the three Persons of the Trinity, gave the word persona a philosophical profundity never before associated with it. In bringing about this theological revolution, the theologians of the fourth and fifth centuries laid the groundwork for a radical reassessment of human subjectivity which has yet to be fully appreciated, and which it may be the special privilege of 21st century Christian thought to reconnoiter.
The British social anthropologist Alfred Radcliffe-Brown (1881-1955) saw the rough outlines of the issue that we want to explore:
If you tell me that an individual and a person are after all really the same thing, I would remind you of the Christian creed. God is three persons, but to say that He is three individuals is to be guilty of a heresy for which men have been put to death. Yet the failure to distinguish individual and person is not merely a heresy in religion: it is worse than that; it is a source of confusion in science.
We may disagree with the eminent anthropologist about the relative dangers of religious heresy and scientific confusion. As we are awash in both in this time of trial, however, we welcome an alliance of all those who have recognized these perils from their own fields of interest. The point is that Christ has altered humanity’s psychological circumstances as much as he has altered man’s cultural and historical situation. In both cases man’s freedom is called into play in a far greater way than was the case prior to the Christian revelation. This increased freedom has come with its obvious corollary: a heightening of the perils associated with the misuse of that freedom, whether the misuse was due to conscious rebellion or naïveté.
Kenneth Schmitz writes: “whereas the notion of the self and the dynamic of self-reference follow the thread of autos, the notion of person constitutes another thread and follows a different path.” But, as Schmitz argues, the term person, as contrasted, for instance, with the terms self or individual, is rooted in religious thinking, and cannot be used in any sense resembling its etymological roots by those who fail to take the religious issues seriously. “Militant atheistic humanism,” Schmitz writes, “can do quite well without the notion of person.” Christians, informed by faith, are “called to transmute the metal of self by a kind of spiritual alchemy into the gold of personhood.”
“The revelation of the person,” writes Paul Evdokimov, “is the event of Christianity,” and human desire is simply “the inborn nostalgia to become a ‘person’.” If the revelation of the person is the event of Christianity, and if desire is an inborn longing to become a person, then there is no more urgent question than the one Schmitz asks. “Can such a call to spiritual personhood be made today in such a way that it might be heard?” This book is an effort to answer that question in the affirmative. The key will be to show how anthropologically sound is the uniquely Christian understanding of personhood, and the anthropological insights of René Girard can play a significant and salutary role in bringing this to light.
In place of a formal introduction to Girard’s work, we will simply allude to its most familiar features and leave to subsequent pages of this book the task of drawing out the subtler implications of Girardian thought. Suffice it at this point to say that in the eyes of this writer and of many of those familiar with it, Girard’s work represents an indispensable anthropological resource for meeting the cultural and spiritual challenges facing us today, not least the challenge posed by the shockingly rapid de-Christianization of Western culture.
Whoever is not with me is against me, and whoever does not gather with me scatters. (Mt 12:30; Lk 11:23)
The anthropological upshot of this passage, to which the work of René Girard has drawn our attention, is that the old gathering principle was fatally exposed on Golgotha. What is the old gathering principle? We have suggested an interrelationship between the two faces, so to speak, of evil, the diabolic and the satanic, which collaboratively bring about the collective expulsion or execution of a scapegoat victim, thus restoring order of a kind to societies rife with animosity. It is in light of this worldly ruse for restoring social order that salient facets of the scene on Golgotha merit attention. At the moment when Jesus gave up his spirit and breathed his last, breathing his Spirit, the spirit of truth, on both those assembled there and on the whole world, Luke provides the following commentary:
Now when the centurion saw what had taken place, he praised God, and said, “Certainly this man was innocent!” And all the multitudes who assembled to see the sight, when they saw what had taken place, returned home beating their breasts. (Lk 23:47-48)
As we discussed at more length in God’s Gamble, the simultaneous conversion of the centurion and his recognition of Jesus’ innocence deserve close attention. For the recognition of the innocence of a community’s scapegoat is incompatible with the success of the scapegoating. In retrospect, the victim might come to be honored in some fashion, but never in such a way as to contest the righteousness of his immolation. It is even more startling to find a declaration of Jesus’ innocence by a hardened Roman executioner. More important for our purposes is the connection between this recognition of Jesus’ innocence and its social consequences. The desired outcome of all scapegoating is the reunification of a formerly fractious community. When the scapegoating works, that is, when the gathered community finds it just and necessary, the community enjoys a renewed esprit de corps. This is precisely what did not happen at the death of Jesus. When Luke says that those assembled scattered – that they returned home beating their breasts – the gesture does not mean contrition, as it does in later Christian life. It means confusion. This same scenario occurred with the woman caught in adultery in John’s Gospel. Once Jesus had dispelled the aura of innocence which was synonymous with the unanimity of the accusers, “they slipped away one by one, beginning with the oldest” (Jn 8:9). On Golgotha, both the Roman executioner and the mob who had clamored for Jesus’ execution register a confused moral discomfort with what has taken place.
The Christian revelation has been jeopardizing those communal arrangements that depend on this scapegoating ritual for social rejuvenation ever since. But Christianity does not abandon those it deprives of the old, morally dubious social solidarity. Christ, whose death shattered the social arrangements underwritten by periodic episodes of violent unanimity, spent his life on earth fashioning an alternative community: the Church.
I am the vine, you are the branches. He who abides in me, and I in him, he it is that bears much fruit, for apart from me you can do nothing. If a man does not abide in me, he is cast forth as a branch and withers; and the branches are gathered, thrown into the fire and burned. (Jn 15:5-6)
The “nothing” of which Christ spoke is today the nihilism that is the final phase of the sterile form of the putatively autonomous self and its exaltation of the will. That this willfulness should prove in the end to be withered branches finds an echo in the warning of John the Baptist: “His winnowing fork is in his hand, to clear his threshing floor, and to gather the wheat into his granary, but the chaff he will burn with unquenchable fire” (Lk 3:17; Mt 3:12). The withered branches of Jesus’ discourse and the chaff of which the Baptist spoke have their contemporary analogue in what Henri de Lubac called “the waning of ontological density” and what Gabriel Marcel termed the loss of our “ontological moorings.” The Scriptural treatment of chaff is instructive: it will either be scattered to the wind or gathered and burned in a fire. These outcomes could be both conflated and transposed into an anthropological key by saying that the alternative to the kind of genuine solidarity that Christ offers is social alienation fitfully and temporarily relieved by the faux solidarity of collective violence.
These are matters we addressed in two earlier books – Violence Unveiled and God’s Gamble. We touch on them here as analogous to the crisis the Gospel precipitates at the spiritual and psychological level. For, just as the Gospel throws down a radical challenge to conventional forms of cultural life, so it challenges with equal audacity the conventional forms of human subjectivity, calling familiar psychological adaptations into question and revealing a startling new form, namely, the person properly understood, with all its specific Christian meaning and theological overtones. To that we will return in due course. In the meantime, we will turn to a series of parables by which we hope to bring to light today’s spiritual crisis.
Ironically, almost perversely, the same historical developments that were making “individual” experience possible were depriving what was left of the West’s sacrificial system of its power to periodically convert the jealousy, rivalry, resentment, and rancor born of mimetic desire, first, into unanimous forms of social resentment and then, into reinvigorated social hierarchies whose anthropological utility was in restraining the mimetic passions that produced them in the first place. In addition, as moral misgivings eroded the sacrificial system’s putative religious aura, those religious symbols and icons that once served as socially transcendent benchmarks for the culture lost their charismatic power. The “individual” was increasingly inhabiting a world in which serious references to something that was both religiously significant and socially transcendent were rapidly fading. It was a world where restraints on the mimetic passions were losing their moral force at the same time that the sacrificial system for draining away these passions was losing its efficacy. The “individual” was trying to get his historical and psychological footing in a world rapidly surrendering the traditional resources for maintaining social equilibrium and psychological poise. The same forces that made the “individual” an intelligible concept were unleashing a tide of mimetic passions that would eventually reveal the psychological inadequacy of the concept.
It is not surprising, therefore, that when Western cultures — vaguely sensing the destabilizing potential of the biblical tradition — tried to dilute its influence with Greek classical traditions, they placed the “individual” at the center of their concern, in the very position occupied by the victim in the biblical tradition. When the Enlightenment made the “rights of the individual” the centerpiece of Western culture’s social and political life, it was simply bringing to near completion the task begun in the Renaissance of transposing into a secular and politically manageable idiom the Bible’s empathy for victims. The insistence that the individual’s rights not be violated, a vestige of the biblical solicitude for victims, became the central ethical principle of Western culture. Unable or unwilling to carry the biblical momentum forward under its own banner, the West fashioned for itself a weakened version of it, but one that nevertheless retained enough of the power of its biblical original to sweep virtually the whole world into its orbit.
The very possibility of autonomous individuality, therefore, arose only as the gravitational power of the ancient system of sacralized violence – the effect of which was to cement a community in its united condemnation of a putative culprit – weakened under the impact of the Christian revelation of the Cross, which aroused an empathy for, and eventually the valorization of, the victim of such violence. With the quiet working of this moral revolution, those who managed to resist the intoxicating power of the occasional social hysterias that united the community against a universally reprobated enemy experienced a social alienation to one degree or another. Thus did an incipient form of individuality appear as the byproduct of the Christian revaluation of values. Of course, the process by which this came about neither altered humanity’s fallen condition nor rendered the Christian doctrine of original sin inoperative. It took centuries of exposure to the Gospel for the power of communal violence to lose its moral immunity, during which even those who resolutely asserted their individuality took an occasional plunge into the renewing waters of righteous indignation.
The Christocentric anthropology to which René Girard’s work lends scholarly credence and analytical rigor gives fresh meaning and specificity to the biblical notion that we are made in the image and likeness of our Creator, inasmuch as it accounts for the fact that our one irreducible impulse is to replicate the desire – the “will” – of an Other, an impulse, however, which is fickle in the extreme. The spiritual and historical turmoil born of mimetic desire is such that it finally takes nothing less than the Incarnation to save us from it. But Christ doesn’t propose the renunciation of mimetic desire; quite the contrary, he coaxes it into greater intensity and turns it, via his own historically singular and unforgettable example, toward its true Object. Christ is “the icon of the Living God,” through whose mediation we are able to imitate the One in whose image and likeness we are fashioned, which is ultimately what we long most to do. “Our heart is restless until it rests in Thee.”
What is unique about Christian personhood has been supremely summed up by Saint Paul: “I live, now no longer I, but Christ lives in me” (Gal 2:20). We have tended to regard Paul’s existential circumstance, thus summed up, as somehow entirely unique to his singular vocation, allowing us to overlook what is perhaps the single most challenging passage in the Pauline corpus. The practical spiritual ramifications of this scriptural passage, and of many others like it, have not been adequately mined for the light they shed on the human dilemma generally and on our contemporary psychological distress specifically, and it is now time to do so. The very essence of this new form of subjectivity is its Christological character. As a cognate of Christ’s own relationship to his heavenly Father, this Pauline subjectivity gives ontological and psychological specificity to the revered scriptural idea that Christians are sons and daughters of God by “adoption” in Christ.
A Christian is someone whose understanding of the mystery of life and the nature of his own existence has been decisively reconfigured by the truth which he has found in Christ. “By revealing the Father and by being revealed by him,” writes Henri de Lubac, anticipating what was to become the anthropological touchstone of Vatican II, “Christ completes the revelation of man to himself.” If it were possible to acquire the truth which Christians have by faith in any other way, then the Incarnation, Crucifixion, and Resurrection would be superfluous, or at most reduced to an optional source of a truth which might, in time, have been discovered otherwise. If, as the Second Vatican Council affirmed, however, “only in the mystery of the incarnate Word does the mystery of man take on light,” then it would be odd indeed if those living in the light of that mystery were to understand themselves in terms indistinguishable from those in which a non-Christian might understand himself. And yet, today most Christians use terms such as “self,” “individual,” “psyche,” and “person” interchangeably, as if they refer equally well to a Christian’s subjective experience and to a non-Christian’s. The result of the triumph of the Cartesian cogito over the Christian imitatio is that many Christians regard their faith propositionally rather than ontologically. The two are obviously related, but the latter of the two cries out today for further explication and exploration.
Like the truth of the victim, the truth of the person is one on which the Gospel throws an utterly unique light. This latter Gospel truth overtakes recalcitrant humanity with the same historical persistence as does the former one. As evidence of the intersubjective nature and self-donating meaning of personal existence mounts, efforts to ignore it grow progressively more adamant and disingenuous. So obvious had this unwanted truth become by the late nineteenth century that perhaps nothing less than Freud’s sexually titillating psychological theories could have deflected attention from it, and even this maneuver served only to postpone the day of reckoning, which is now upon us.
Evidence of an unwarranted optimism on the part of the Second Vatican Council fathers simply underscores the Council’s implicit call for what John Paul II has called “an adequate anthropology.” The formulation of such an anthropology, in fact, would be the best way for the Church of the twenty-first century to revive the hope for which the optimism of the 1960s was a naïve substitute. However chastened that hope might necessarily be in today’s world, it is by no means a wan hope. On the contrary, there is reason for even greater hopes than those the Council expressed, for the anthropological resources for better assessing both the present predicament and its evangelical, apologetic, catechetical and sacramental promise are now available. With these resources at hand, it can be the privilege of the twenty-first century Church, as it is in any case her responsibility, to awaken hopes more resilient – because more Christologically anchored – than the late twentieth century paeans to historical progress and individual autonomy.
The worldly optimism by which the Council’s theological hope was both augmented and diluted obscured the fact that, beneath the surface of the historical and cultural changes that were attracting the Council’s attention, the moral touchstone for conciliar solicitude – the person – was reeling from the accumulated consequences of a long-standing anthropological miscalculation about the nature, meaning, and ordination of the human person. For the fact is that many of the most disturbing moral, political, and cultural problems now looming are traceable to, and exacerbated by, the loss of our “ontological moorings.” Even though Vatican II emphasized the centrality of the person in Catholic social teaching and its supreme importance for the Church’s engagement with contemporary culture, the Council presupposed that the Church’s use of the word person was in workable accord with the connotations given it by those outside the Church. Where reference was made to the anthropology of the person, it was to a philosophical anthropology of extraordinary depth, but one ill-suited to the diagnostic task at hand and unlikely to command the attention of those most in need of ecclesial guidance.
Glenn Olsen has observed that “the liberal self, replicating as it does a social order from which the idea of a common good and any hierarchy of public goods has been largely evacuated, is intrinsically disordered and dysfunctional.” It is this same symbiosis of self and social order to which the authors of Gaudium et spes alluded in insisting that “the progress of the human person and the advance of society itself hinge on one another” (§ 25). This reciprocal relationship between the human person and human society means that distresses in one of these spheres will be accompanied by distresses in the other, and that, whatever their short-term practical advantages, attempts to remedy distresses in one sphere which unwittingly collude with the distresses in the other will necessarily exacerbate and prolong the crisis common to them both.
For many years the Church has been championing the dignity of the human person, most notably in insisting on the moment of conception as the outset of the person’s life. What follows presupposes this understanding of the person as worthy of dignity from conception to natural death. The burden of our exploration will be to argue for a Christocentric recovery of the mystery of the person. If, in so arguing, attention is focused on the mature expression of this mystery, that is in no way to suggest any affinity with, or sympathy for, those who would attribute human status only to humans who have acquired this or that level of functionality – an unconscionable and ethically monstrous position.
There is, however, another debate about the person that deserves our attention, and that is the debate as to what it is about personhood that Christ reveals and that without Christ the world is incapable of recognizing. What is the nature of personhood as uniquely revealed by the person of Christ? That is the question we are here exploring. No one would argue that a four-year-old has taken full possession of the talents which, with time, he or she will develop and express. Similarly, a child one day after conception, though a biological human person of inestimable worth, will only much later have an opportunity to fulfill the promise implicit in his personhood.
The concept of the person, writes Joseph Cardinal Ratzinger, “grew in the first place out of the interplay between human thought and the data of Christian faith,” entering thereby into intellectual history, beginning of course with those cultures that fell under Christian influence. In bringing about this theological revolution, the theologians of the patristic age laid the foundation for a revolution in human self-understanding which has languished for lack of adequate anthropological elaboration. It is surely the special responsibility and unique privilege of the twenty-first century Church to undertake this elaboration.
If our great-grandchildren are to inherit the blessings we have both enjoyed and squandered, a path back to the Judeo-Christian sources of these blessings will have to be charted. This writer is under no illusions about what a book he might write could accomplish under the present undeniably dire social, psychological, and spiritual circumstances, but neither does he wish to content himself with lamentation or shrink from the task of doing what little he can to give some words of encouragement to those who will take up this task in the years to come.
As we argued in both Violence Unveiled and God’s Gamble, the life, death, descent into hell, and resurrection of Christ profoundly altered the human predicament at a level all but imperceptible to human cognition. The course of history in general, and the course of those cultures that fell under Christian influence in particular, are incomprehensible without reference to the epicenter of the earthquake recorded by St. Matthew at the moment of Jesus’ death, when in John’s Gospel Jesus said, “It is finished.” For the most part, the cultural consequences of the Incarnation, death, and resurrection of Christ have occurred very slowly and in an understandably complex way. The burden of this book is to argue that the Incarnation, death, and resurrection of Christ have had a comparable impact on the nature and experience of human subjectivity, and that, moreover, we stand today at the crossroads. “To those who have, more will be given; to those who have not, what little they have will be taken away.” Hans Urs von Balthasar writes:
For it does not seem self-evident that an “accidental” historical event like the crucifixion of a “man” in a corner of the Roman Empire antecedently conditions and determines not only the entire course of world history, but, more profoundly, the inner reality of every human being, in fact, the entire ontological structure of every person. If that is correct, then all being and all possible thought proves to be Christian.
The attempt to fashion a post-Christian culture will devolve into an anti-Christian ideological movement, which will devolve further into nihilism and catastrophe, as long as the unsurpassable Christian reckoning of the human predicament is set aside in favor of reckless anthropological experiments whose dire consequences will follow as surely as night follows day. It is a lot easier to turn a deaf ear and a blind eye to Christ and His Church than it is to fashion a post-Christian alternative to them. The latter task, in fact, is impossible. As Malcolm Muggeridge famously pointed out, the choice before us today is Christ or nothing. Both formerly Christian cultures and those individuals shaped by the overarching Christian tenor of the cultures in which they live will remain – in subtle but indelible ways – Christ-haunted. The entry into history of Christ and his Church represents a watershed which demands a decision for or against, and those who reject it often intuitively understand the inevitability of that choice better than do Christians. Of the revelation on Golgotha, Hans Urs von Balthasar has insisted: “From this point on, true, deliberate atheism becomes possible for the first time, since, prior to this, without a genuine concept of God, there could be no true atheism.”
What then if man, no longer accustomed to taking his standard from the cosmos (now emptied of the divine), refuses to take it from Christ? This is post-Christian man, who cannot return to the pre-Christian fluidity that once existed between man and the cosmos but who, in passing through Christianity, has grown used to the heightening of his creaturely rhythms and wants to hold on to them as if they are his personal hallmark, a gift that now belongs to him entirely. This will be the general characteristic of the post-Christian era, however manifold and contradictory its concrete expressions may be.
It has become increasingly obvious that the myth of autonomous individuality flies in the face of the social and psychological facts, but once the West had made protecting the rights of the individual the engine of its historical reforms and reconfigured all its institutions accordingly, it was understandably in no mood to quibble over the psychological plausibility of what had become its moral compass and organizing principle. After all, Western culture parlayed its solicitude for the rights of the individual into some of the most impressive moral and political reforms in history. It takes a lot more than mere misinterpretation to destroy the historical power of the revelation which the myth of individuality misinterprets. Overlooking the psychological naïveté of the premise that produced these historical marvels would have seemed a tolerable price to pay for their political utility. But, like the national debt, the price to be paid for this anthropological miscalculation compounds rapidly and falls most heavily on subsequent generations, and today’s youth are now visibly staggering under the burden of it.
Sooner or later, the social and psychological invalidity of the individualist myth was bound to lead to problems. As soon as everyone became an “individual,” and every social grievance was pressed as an infringement on the individual’s rights, the moral force once marshaled to protect the rights of the individual lost its clarity. As Simone Weil put it during the dark years of war, Nazi camps, and the aerial bombardment of cities: “The notion of rights, which was launched into the world in 1789, has proved unable, because of its intrinsic inadequacy, to fulfil the role assigned to it.” Not only did the notion of rights present no formidable barrier to the campaigns of mass slaughter that were rippling through the modern world, but the notion was problematic even at the level of ordinary human relations. We can only expect a pretty paltry form of justice, Weil argued, from a justice system as dependent as ours is on the need to “agitate for our rights.”
The “rights rhetoric,” according to Tracy Rowland, “is ideological. Its objective is to secure a social consensus in circumstances where there is no commonly accepted moral tradition by construing all human relations in contractual terms.” The Catholic philosopher Alasdair MacIntyre concurs:
The dominant contemporary idiom and rhetoric of rights cannot serve genuinely rational purposes, and we ought not to conduct our moral and political arguments in terms derived from that idiom and rhetoric.
The Christians who wrote the U.N. Declaration on Human Rights and the Christians who enthusiastically adopted it could hardly have foreseen that they were embracing by default a jurisprudence of “rights” that would one day be used to enforce “rights” like “reproductive rights” – a paper-thin metaphor for the “right” to kill children in the womb – or a myriad of other purported “rights” deeply antithetical to the moral tradition for which the rights discourse seemed at first to be the closest secular approximation.
As we shall see, where the rhetoric of rights is the touchstone of social affairs and the cornerstone of jurisprudence, each thing meets in “mere oppugnancy.” Such a culture is spiritually toxic, especially for those trying to live a Christian life, for a social order based on constantly competing rights tends to foster an adversarial and defensive form of personal autonomy, corrosive of a shared sense of meaning and purpose, and in sharp contrast to the Christian vocation to participate in the Trinitarian Life of Self-Donation and the sacramental life of the ecclesial communion that is its earthly analogue.