Some time ago I spent a year at the Institute for Advanced Study in Princeton, New Jersey, where one of the pleasures is the opportunity to exchange ideas with scholars from other countries. One evening, a particularly animated member of an informal discussion group I had joined began to lament the sorry state of public intellectualism in the United States — this by contrast to her native France, and particularly Paris, with its dizzying clash of opinions. I remember being somewhat stung by her comments, and joined the others in shaking my head at the lackluster state of our public intellectual life. Why couldn’t Americans be more like Parisians?
The moment passed rather quickly, at least in my case. I recalled just how thoroughly the French intellectual class—except for the rare dissenters, such as the estimable, brave, and lonely Albert Camus (1913-60)—had capitulated to the seductions of totalitarian logic, opposing fascism only to become apologists for what Camus called “the socialism of the gallows.”
French political life would have been much healthier had France embraced Camus and his few compatriots rather than Jean-Paul Sartre and the many others of his kind who wore the mantle of the public intellectual. When Camus spoke in a political voice, he spoke as a citizen who understood politics to be a process that involves debate and compromise, not as an ideologue seeking to make politics conform to an overarching vision. In the end, Camus insisted, the ideologue’s vision effectively destroys politics.
Perhaps, I reflected, America’s peculiar blend of rough-and-ready pragmatism and a tendency to fret about the moral dimensions of public life—unsystematic and, from the viewpoint of lofty ideology, unsophisticated as this combination might be—was a better guarantor of constitutionalism and a healthy civil society than were intellectuals of the sort my French interlocutor favored. Historically, public intellectuals in America were, in fact, members of a wider public. They shared with other Americans access to religious and civic idioms that pressed the moral questions embedded in political debate; they were prepared to live, at least most of the time, with the give-and-take of political life, and they favored practical results over systems.
The American temperament invites wariness toward intellectuals. Because they are generally better at living in their heads than at keeping their feet on the ground, intellectuals are more vulnerable than others to the seductions of power that come with possessing a worldview whose logic promises to explain everything, and perhaps, in some glorious future, control and manage everything. The 20th century is littered with the disastrous consequences of such seductions, many of them spearheaded and defined by intellectuals who found themselves superseded, or even destroyed, by ruthless men of action once they were no longer needed as apologists, provocateurs, and publicists. The definitive crackup since 1989 of the political utopianism that enthralled so many 20th-century public intellectuals in the West prompts several important questions: Who, exactly, are the public intellectuals in contemporary America? Do we need them? And if we do, what should be their job description?
Let us not understand these questions too narrowly. Every country’s history is different. Many critics who bemoan the paucity of public intellectuals in America today have a constricted view of them—as a group of independent thinkers who, nonetheless, seem to think remarkably alike. In most accounts, they are left-wing, seek the overthrow of bourgeois convention, and spend endless hours (or at least did so once upon a time) talking late into the night in smoke-filled cafés and Greenwich Village lofts. We owe this vision not only to the self-promotion of members of the group but to films such as Warren Beatty’s Reds. But such accounts distort our understanding of American intellectual life. There was a life of the mind west of the Hudson River, too, as Louis Menand shows in his recent book, The Metaphysical Club. American intellectuals have come in a number of modes and have embraced a variety of approaches.
But even Menand pays too little attention to an important part of the American ferment. American public intellectual life is unintelligible if one ignores the extraordinary role once played by the Protestant clergy and similar thinkers, from Jonathan Edwards in the 18th century through Reinhold Niebuhr in the 20th. The entire Social Gospel movement, from its late-19th-century origins through its heyday about the time of World War I, was an attempt by the intellectuals in America’s clergy and seminaries to define an American civil religion and to bring a vision of something akin to the Peaceable Kingdom to fruition on earth, or at least in North America.
As universities became prominent homes for intellectual life, university-based intellectuals entered this already-established public discourse. They did so as generalists rather than as spokesmen for a discipline. In the minds of thinkers such as William James, George Herbert Mead, and John Dewey, there was no way to separate intellectual and political issues from larger moral concerns. Outside the university proper during the last decades of the 19th century and early decades of the 20th, there arose extraordinary figures such as Jane Addams and Randolph Bourne. These thinkers and social activists combined moral urgency and political engagement in their work. None trafficked in a totalizing ideology on the Marxist model of so many European intellectuals.
Addams, for example, insisted that the settlement house movement she pioneered in Chicago remain open, flexible, and experimental—a communal home for what might be called organic intellectual life. Responding to the clash of the social classes that dominated the public life of her day, she spoke of the need for the classes to engage in “mutual interpretation,” and for this to be done person to person. Addams stoutly resisted the lure of ideology—she told droll stories about the utopianism that was sometimes voiced in the Working People’s Social Science Club at Hull-House.
Addams saw in Nathaniel Hawthorne’s short story “Ethan Brand” an object lesson for intellectuals. Ethan Brand is a lime burner who leaves his village to search for the “Unpardonable Sin.” And he finds it: an “intellect that triumphed over the sense of brotherhood with man and reverence for God, and sacrificed everything to its mighty claims!” This pride of intellect, operating in public life, tries to force life to conform to an abstract model. Addams used the lesson of Ethan Brand in replying to the socialists who claimed that she refused to convert to their point of view because she was “caught in the coils of capitalism.” In responding to her critics, Addams once described an exchange in one of the weekly Hull-House drawing room discussions. An ardent socialist proclaimed that “socialism will cure the toothache.” A second fellow upped the ante by insisting that when every child’s teeth were systematically cared for from birth, toothaches would disappear from the face of the earth. Addams, of course, knew that we would always have toothaches.
Addams, James, Dewey, and, later, Niebuhr shared a strong sense of living in a distinctly Protestant civic culture. That culture was assumed, whether one was a religious believer or not, and from the days of abolitionism through the struggle for women’s suffrage and down to the civil rights movement of the 1960s, public intellectuals could appeal to its values. But Protestant civic culture thinned out with the rise of groups that had been excluded from the consensus (Catholics, Jews, Evangelical Christians), with the triumph of a generally secular, consumerist worldview, and with mainline Protestantism’s abandonment of much of its own intellectual tradition in favor of a therapeutic ethos.
The consequence, for better and for worse, is that there is no longer a unified intellectual culture to address—or to rebel against. Pundits of one sort or another often attempt to recreate such a culture rhetorically and to stoke old fears, as if we were fighting theocrats in the Massachusetts Bay Colony all over again. Raising the stakes in this way promotes a sense of self-importance by exaggerating what one is ostensibly up against. During the Clinton-Lewinsky scandal, for example, those who were critical of the president’s dubious use of the Oval Office were often accused of trying to resurrect the morality of Old Salem. A simple click of your television remote gives the lie to all such talk of a Puritan restoration: The screen is crowded with popular soft-core pornography packaged as confessional talk shows or self-help programs.
The specter of Old Salem is invoked in part because it provides, at least temporarily, a clear target for counterargument and gives television’s talking heads an issue that seems to justify their existence. But the truth is that there are no grand, clear-cut issues around which public intellectuals, whether self-described media hounds or scholars yearning to break out of university-defined disciplinary boundaries, now rally. The overriding issues of three or four decades ago on which an unambiguous position was possible—above all, segregation and war—have given way to matters that are complex and murky. We now see in shades of gray rather than black and white. It is difficult to build a grand intellectual argument around how best to reform welfare, structure a tax cut, or protect the environment. Even many of our broader civic problems do not lend themselves to the sorts of thematic and cultural generalizations that have historically been the stuff of most public intellectual discourse.
My point is not that the issues Americans now face raise no major ethical or conceptual concerns; rather, these concerns are so complex, and the arguments from all sides often so compelling, that each side seems to have some part of the truth. That is why those who treat every issue as if it fit within the narrative of moral goodness on one side and venality and inequity on the other become so wearying. Most of us, whether or not we are part of what one wag rather uncharitably dubbed “the chattering classes,” realize that matters are not so simple. That is one reason we often turn to expert researchers, who do not fit the historical profile of the public intellectual as omnicompetent generalist.
For example, well before today’s mountains of empirical evidence came in, a number of intellectuals were writing about what appeared to be Americans’ powerful disaffection from public life and from the work of civil society. Political theorists like me could speak to widespread discontents, but it was finally the empirical evidence presented by, among others, political scientist Robert Putnam in his famous 1995 “Bowling Alone” essay that won these concerns a broad public hearing. In this instance, one finds disciplinary expertise put to the service of a public intellectual enterprise. That cuts against the grain of the culturally enshrined view of the public intellectual as a bold, lone intellect. Empirical researchers work in teams. They often have hordes of assistants. Their data are complex and must be translated for public consumption. Their work is very much the task of universities and think tanks, not of the public intellectual as heroic dissenter.
Yet it would be a mistake simply to let the experts take over. A case in point is the current debate over stem cell research and embryonic cloning for the purpose of “harvesting” stem cells. Anyone aware of the history of technological advance and the power of an insatiable desire for profit understands that such harvesting is a first step toward cloning, and that irresponsible individuals and companies are already moving in that direction. But because the debate is conducted in highly technical terms, it is very difficult for the generalist, or any nonspecialist, to find a point of entry. If you are not prepared to state an authoritative view on whether adult stem cells have the “pluripotent” potential of embryonic stem cells, you may as well keep your mouth shut. The technical debate excludes most citizens and limits the involvement of nonscientists who think about the long-range political implications of projects that bear a distinct eugenics cast.
Genetic “enhancement,” as it is euphemistically called, will eventually become a eugenics project, meant to perfect the genetic composition of the human race. But our public life is so dominated by short-term considerations that someone who brings to the current genetic debate such a historical understanding sounds merely alarmist. This kind of understanding does not sit well with the can-do, upbeat American temperament. Americans are generally relieved to have moral and political urgency swamped by technicalities. This is hardly new. During the Cold War, debaters who had at their fingertips the latest data on missile throw-weights could trump the person who was not that sort of expert—but who wasn’t a naif either, who had read her Thucydides, and who thought there were alternatives to mutually assured destruction.
Americans prefer cheerleaders to naysayers. We tend to concentrate on the positive side of the ledger and refuse to conjure with the negative features—whether actual or potential—of social reform or technological innovation. Americans notoriously lack a sense of tragedy, or even, as Reinhold Niebuhr insisted, a recognition of the ironies of our own history. By naysayers I do not refer to those who, at the drop of a hat, issue a pre-fabricated condemnation of more-or-less anything going on in American politics and popular culture. I mean those who recognize that there are always losers when there are winners, and that it has never been the case in the history of any society that the benefits of a change or innovation fall evenly on all groups.
Whenever I heard the wonders of the “information superhighway” extolled during America’s years of high-tech infatuation, my mind turned to the people who would inevitably be found sitting in antiquated jalopies in the breakdown lane. It isn’t easy to get Americans to think about such things. One evening, on a nightly news show, I debated a dot.com millionaire who proclaimed that the enormous wealth and expertise being amassed by rich techno-whiz kids would soon allow us to realize a cure for cancer, the end of urban gridlock, and world peace. World peace would follow naturally from market globalization. Having the right designer label on your jeans would be the glue that held people together, from here to Beijing. When I suggested that this was pretty thin civic glue, the gentleman in question looked at me as if I were a member of some extinct species. It was clear that he found such opinions not only retrograde but nearly unintelligible.
The dot.com millionaire’s attitude exemplified a larger American problem: the dangers of an excess of pride, not just for individuals but for the culture as a whole. It isn’t easy in our public intellectual life, or in our church life, for that matter, to get Americans to think about anything to do with sin, the focus of much public intellectual discourse in America from Edwards to Niebuhr. We are comfortable with “syndromes.” The word has a soothing, therapeutic sound. But the sin of pride, in the form of a triumphalist stance that recognizes no limits to human striving, is another matter.
The moral voices—the Jane Addamses and Reinhold Niebuhrs—that once had real public clout and that warned us against our tendency toward cultural pride and triumphalism seem no longer to exist, or at least to claim an audience anywhere near the size they once did. There are a few such voices in our era, but they tend not to be American. I think of President Václav Havel of the Czech Republic, who has written unabashedly against what happens when human beings, in his words, forget that they are not God or godlike. Here is Havel, in a lecture reprinted in the journal First Things (March 1995):
The relativization of all moral norms, the crisis of authority, the reduction of life to the pursuit of immediate material gain without regard for its general consequences—the very things Western democracy is most criticized for—do not originate in democracy but in that which modern man has lost: his transcendental anchor, and along with it the only genuine source of his responsibility and self-respect. Given its fatal incorrigibility, humanity probably will have to go through many more Rwandas and Chernobyls before it understands how unbelievably short-sighted a human being can be who has forgotten that he is not God.
Our era is one of forgetting. If there is a role for the public intellectual, it is to insist that we remember, and that remembering is a moral act requiring the greatest intellectual and moral clarity. In learning to remember the Holocaust, we have achieved a significant (and lonely) success. Yet to the extent that we now see genocide as a historical anomaly unique to a particular regime or people, or, alternatively, as a historical commonplace that allows us to brand every instance of political killing a holocaust, we have failed to achieve clarity. The truth lies somewhere between.
Where techno-enthusiasm and utopia are concerned, we are far gone on the path of forgetting. One already sees newspaper ads offering huge financial rewards to young egg donors if they have SAT scores of 1400 or above, stand at least 5'10" tall, and are athletic. The “designer genes” of the future are talked about in matter-of-fact tones. Runaway technological utopianism, because it presents itself to us with the imprimatur of science, has an automatic authority in American culture that ethical thinkers, intellectual generalists, the clergy, and those with a sense of historic irony and tragedy no longer enjoy. The lay Catholic magazine Commonweal may editorialize against our newfangled modes of trading in human flesh—against what amounts to a “world where persons carry a price tag, and where the cash value of some persons is far greater than that of others.” But the arguments seem to reach only those who are already persuaded. Critics on the environmental left and the social-conservative right who question techno-triumphalism fare no better. Instead of being seen as an early warning system—speaking unwelcome truths and reminding us what happens when people are equated with their genetic potential—the doubters are dismissed as a rear guard standing in the way of progress.
So this is our situation. Many of our pressing contemporary issues—issues that are not often construed as intrinsically political but on which politics has great bearing—raise daunting moral concerns. The concerns cannot be dealt with adequately without a strong ethical framework, a historical sensibility, and an awareness of human limits and tragedies. But such qualities are in short supply in an era of specialization and technological triumphalism. Those who seize the microphone and can bring the almost automatic authority of science to their side are mostly apologists for the coming new order. Those who warn about this new order’s possible baneful effects and consequences can be marginalized as people who refuse, stubbornly, to march in time, or who illegitimately seek to import to the public arena concerns that derive from religion.
We are so easily dazzled. We are so proud. If we can do it, we must do it. We must be first in all things—and if we become serious about bringing ethical restraint to bear on certain technologies, we may fall behind country X or country Y. And that seems un-American. The role for public intellectuals under such circumstances is to step back and issue thoughtful warnings. But where is the venue for this kind of discourse? Where is the training ground for what political theorist Michael Walzer calls “connected critics,” thinkers who identify strongly with their culture, who do not traffic in facile denunciations of the sort we hear every night on television (along with equally facile cheerleading), but who speak to politics in a moral voice that is not narrowly moralizing? That question underlies much of the debate about the state of civil society that occurred during the past decade. The writers and thinkers who warned about the decline of American civil society were concerned not just with finding more effective ways to reach desirable ends in public policy but with finding ways to stem the rushing tide of consumerism, of privatization and civic withdrawal, of public apathy and disengagement. We will not stem that tide without social structures and institutions that promote a fuller public conversation about the questions that confront us.
Whenever I speak about the quality of our public life before civic groups, I find a real hunger for public places like Hull-House. Americans yearn for forums where they can engage and interpret the public questions of our time, and where a life of the mind can emerge and grow communally, free of the fetters of overspecialization. Without an engaged public, there can be no true public conversations, and no true public intellectuals. At Hull-House, Jane Addams spoke in a civic and ethical idiom shaped and shared by her fellow citizens. The voices of the Hull-House public served as a check on narrow, specialized, and monolithic points of view. It was from this rich venue that Addams launched herself into the public debates of her time. Where are the institutions for such discussion today? How might we create them? It is one of the many ironies of their vocation that contemporary public intellectuals can no longer presume a public.
Intellectuals and others who speak in a public moral voice do not carry a card that says “Have Ideology, Will Talk.” Instead, they embrace Hannah Arendt’s description of the task of the political theorist as one who helps us to think about what we are doing. In a culture that is always doing, the responsibility to think is too often evaded. Things move much too fast. The role for public intellectuals today is to bestir the quiet voice of ethically engaged reason.
Jean Bethke Elshtain was the Laura Spelman Rockefeller Professor of Social and Political Ethics at the University of Chicago. She was the author of many books, including Jane Addams and the Dream of American Democracy (2001).