The New Ivory Tower
When I finished my Ph.D. and began working at a large southern state university in the early 1970s, the grunts who taught freshman English had in their possession a handsome athletic-style trophy of which they were extremely proud. Instead of a football or tennis player or track star, atop its base stood a foot-high figure of a schoolmarm. Most of the veterans who prized it had begun their careers as high school teachers and graduated, so to speak, to the university when the post–World War II expansion of higher education created a temporary shortage of qualified faculty.
These tough old schoolmarms were on the verge of retiring as I came in the door, and their standards and attitudes now seem as antique as the mandatory ROTC for male students that many state universities had only recently abolished. New assistant professors like me, who were paid more to teach less, generally frowned on the amateurism and provinciality we attributed to our predecessors. The university was changing fast, and we were part of the change. American higher education was on its way to becoming a mature industry, with all the benefits of greater professionalism, but also, in retrospect, with much of the waste and confusion that come with unplanned growth in a sector largely shielded from competition. It rarely occurred to anybody to specify realistically what purposes this vast and costly increase of scale was intended to accomplish, let alone to question whether it might turn out to be a mixed blessing.
Until World War II, colleges and universities were a modest presence in American life, enrolling a tiny minority of high school graduates. The GI Bill of 1944 started to change all that. Subsidized by a grateful nation, suddenly a much larger proportion of young men and women acquired the college habit, and in the 1960s their baby boomer children began swarming onto campuses. The result was an unprecedented burgeoning of higher education, particularly at state institutions. New campuses sprouted everywhere, while old ones frequently tripled or quadrupled their enrollments. Practically every teacher’s college in the country proclaimed itself a university. Once mostly an amenity of the elite, a college education quickly became a necessity for anyone with ambition, and soon after an entitlement.
Few would have expected universities to prosper as much as they did in the third of a century that followed the expansive 1960s. By the fall of 2005, there were some 2,000 four-year colleges and universities in the United States, enrolling in excess of 17 million students (more than three-quarters of them in public institutions) of hugely varying qualifications. These institutions confer close to a million bachelor’s degrees a year. The 20 richest schools have endowments that collectively approach $200 billion; the University of California receives $3 billion a year from the state government, and that is just a fraction of the system’s total budget. The number of students, like the size of budgets, continues to mushroom, as the belief that a bachelor’s degree—at the least—is essential to success has hardened into holy writ. The cost of American higher education has for decades been rising faster than the price of gold or gasoline. Between 1995 and 2005, tuition and fees at public four-year institutions rose by 51 percent after inflation. Meanwhile the number of colleges and universities continues to grow.
Higher education in America has become a sprawling enterprise, an octopus with many apparently uncoordinated tentacles. Seemingly endless capital campaigns and partnership agreements with corporations blur the line between higher education and other major economic entities (there is now a Yahoo! Founders Professor of Engineering at Stanford, and the University of California, Irvine, boasts a Taco Bell Chair of Information Technology Management). Yet the bottom line of all this activity has become even harder to identify than it was in 1946 or 1960. One striking illustration of the confusion of purposes is that, having established “research parks” in the 1980s to attract high-tech industries as tenants, many universities are now building retirement villages to entice the affluent elderly.
Higher education on this scale is something new in the world. Unlike American primary and secondary education, it is also the envy of the world. Yet we may be starting to notice that some of its achievements shimmer like a mirage. This past August, the National Commission on the Future of Higher Education reported that “the quality of student learning—as measured by assessments of college graduates—is declining.” It cited a stunning finding of the National Assessment of Adult Literacy: Only 31 percent of college-educated Americans qualify as “prose literate,” meaning that they can fully comprehend something as simple as a newspaper story. That number has shrunk from 40 percent a decade ago, apparently because the flood of badly educated new graduates is dragging down the average.
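The arithmetic behind this dilution claim is simple weighted averaging. As a purely illustrative sketch (the cohort shares and literacy rates below are assumed for the example, not reported figures): if earlier graduates, still 40 percent prose literate, now make up 55 percent of the college-educated population, and the newer 45 percent are only 20 percent prose literate, the blended rate falls to the reported 31 percent:

\[
0.55 \times 40\% \;+\; 0.45 \times 20\% \;=\; 22\% + 9\% \;=\; 31\%.
\]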
One reason we often overlook the shortcomings of higher education is that although in many ways modern universities resemble diversified corporations, they are also strikingly peculiar. American higher education today looks somewhat like the Catholic Church of the late Middle Ages—another anomalous enterprise that was once a ubiquitous presence, immensely rich in money and talent, staffed by multiple hierarchies whose principles of organization were opaque to outsiders, following its own arcane laws and mores, seemingly invulnerable to criticism because, with all its contradictions, it still represented what the society as a whole regarded as its highest aspirations.
Nobody disputes that higher education is a good thing, that fine teaching and research enrich society over and above their immediate economic benefits—the main goal most students, parents, and taxpayers have in mind—or that the professionals who spend their lives in these pursuits are as admirable as any group of people. If we take all that as read, what else might we notice when we peer through the fog of idealization that always seems to obscure the particularities of the university?
While most people still think of undergraduate education as the core function of colleges and universities, the total undergraduate capacity of American higher education today probably exceeds by a wide margin what the economic advantages of a degree—either to students as individuals or to society as a whole—can justify. Most of the jobs now held by college graduates in sales, transportation, services, and even the computer industry could be performed successfully by people with little or no higher education. For high school graduates, as is often pointed out, going to college has become a defensive necessity—you have to do it because everyone else is doing it—regardless of how unattractive another four years in school looks to the average 18-year-old.
As for graduate education and research, the amount of duplication among 50 state systems, in addition to the Ivies and Stanfords and Dukes, serves no rational purpose. At last count, more than 160 universities offered a Ph.D. in English, a field where people with doctorates have far outnumbered jobs since the early 1970s. Even the number of Division I football teams, each a multimillion-dollar business in itself, is so great that no one but a sportscaster can keep track of the standings.
Considering the multiple vulnerabilities of higher education—to inflation, donors, state legislatures, and parents who complain about skyrocketing tuition—maintaining such an expensive status quo has been quite an achievement, but an achievement that has required some hidden sacrifices.
“The university shamelessly promised everything to everyone,” Jane Smiley wrote in her 1995 novel Moo, “and charged so much that prospective students tended to believe the promises. . . . Students would find good jobs, the state would see a return on its educational investment, businesses could harvest enthusiastic and well-trained workers by the hundreds, theory and technology would break through limits as old as the human race (and some lucky person would get to patent the breakthroughs). . . . Everyone around the university had given free rein to his or her desires, and the institution had, with a fine, trembling responsiveness, answered, ‘Why not?’ It had become, more than anything, a vast network of interlocking wishes, some of them modest, some of them impossible, many of them conflicting, many of them complementary.”
The instruction that Smiley’s Moo U, based loosely on Iowa State University, offers is a drop in the bucket of adolescent ignorance. Its research is harmlessly bizarre at best and destructive at worst. The faculty spend as little time on campus as possible. Cruelest of all, Smiley uses a hog to symbolize the university—a hog whose sole purpose as the subject of an eccentric professor’s research project is to grow as fat as it is genetically capable of becoming. When the grotesque beast escapes from its cage at the climax of the novel, it immediately drops dead of a heart attack. Smiley’s vivid satire displays some unsurprising manifestations of human nature in large bureaucratic systems. Yet each of these realities takes forms in the university that render it unfamiliar, sometimes even unrecognizable, to outsiders.
Normally, when an industry becomes overstocked with providers of a service, the least capable are eventually taken over by the more successful, or simply go bankrupt and cease to exist. Think of Gimbel's, TWA, American Motors. The victorious competitors typically become more efficient and distinctive. In higher education, however, there are several reasons why the normal effects of competition fail to operate. First, the extreme difficulty of objectively measuring the relative success or failure of a university makes it possible for the administrators of even the most disreputable institution to claim that it has not failed. Both the criteria that define success and the best means for gauging it are endlessly debatable. Moreover, because the funding of higher education comes from such a diversity of sources, there is no immediate connection between failure in the market (such as a persistent inability to attract sufficient numbers of students) and utter collapse.
Second, because the majority of colleges and universities are public rather than private, any threatened campus usually has one or more legislators it can count on to save it from oblivion. Despite periodic alarms about the threat to such institutions, very few have gone out of existence or suffered hostile takeovers since the end of the Great Depression. Even small private colleges, the most fragile members of the breed, have low fatality rates: About three shut down every year, many of them religiously based institutions, and new ones are always being born.
Instead of becoming more varied in the face of competition, institutions of higher education paradoxically become more alike, thereby increasing redundancy. In every subsector—Ivy League institutions, gigantic public research universities such as the University of Wisconsin, elite liberal arts colleges such as Oberlin, or the middle-sized regional campuses where most Americans pursue their studies—distinctiveness of mission, curriculum, and expectations of students and faculty has diminished since the 1950s. Thanks partly to accrediting organizations that apply national standards uniformly, the cultural, denominational, and regional differences that not long ago distinguished, say, all-male, Quaker Haverford College in Pennsylvania from coeducational, Baptist Furman University in South Carolina have narrowed to little more than local color. (One straw in the prevailing wind has been the virtual disappearance of single-sex higher education since the 1960s.) The University of California, Berkeley, and the University of Virginia were once far more oriented toward the differing needs of their states than they are today. The major exceptions to the trend are a few radically contrarian religious schools such as Bob Jones University and Liberty University that are pariahs in the academic world. Given the emphasis placed on diversity in the mission statement of virtually every academic institution in the country, this homogenization seems ironic. But to administrators today, “diversity” simply means recruiting more black and Hispanic students and employees. It has no intellectual significance.
Homogeneity has been reinforced by the rise of a national managerial class in higher education. Unlike in the past, most college and university presidents today are not alumni or longtime employees of the institutions over which they preside. According to a 2002 study, their average tour of duty is less than seven years. Beneath them sits an ever-expanding cadre of career administrators whose lives follow a similar pattern as they move from one institution to another of the same type, rising successively from department head to dean to provost and (occasionally) to president. Rather than embrace the historical peculiarities of a particular school, academic managers tend to pass their careers keeping up with the fashions and taboos of the moment through professional conferences and the administrative trade paper, The Chronicle of Higher Education.
The notorious misadventures of former treasury secretary Lawrence Summers as president of Harvard put several of these anomalies on the front page, first when he spoke indiscreetly (as it seemed to his critics) about research on the mathematical abilities of men and women, and again a year later, when the Harvard faculty of arts and sciences succeeded in getting him fired for offenses against academic orthodoxy. To many observers, the whole episode looked like a recurrent fantasy or nightmare of the late 1960s, depending on your point of view, in which the tenured equivalent of a people’s court meted out revolutionary justice.
Yet it’s important to notice just how atypical the events at Harvard were of the way higher education normally operates today. First, Summers expressed himself with a freedom and openness that few contemporary academic administrators would allow themselves. The time when presidents of major universities were public figures who spoke with some candor on genuinely controversial issues is long past. Second, the Harvard faculty have, or at least had (they may have used some of it up), an authority within the university that exists in few other institutions, no matter how eminent their members. Almost invariably, trustees hire and fire the people who run large universities without much consultation, while fully socialized administrators express in public only the most widely shared opinions.
Faculties are a little less conformist, but, as the Summers episode illustrates, not much, despite the brilliance and idealism of many of their members. Like administrators, the most successful professors migrate from institution to institution. Their allegiance is far more to a discipline and a group of widely dispersed colleagues—fellow specialists around the world who also study population genetics, medieval Islam, or tropical agriculture—than to a particular place.
Regular faculty members, a small minority of university employees, increasingly occupy a world of their own. They compete like gladiators, first for tenure, then for the considerable amenities higher education has to offer. Light-to-nonexistent teaching loads, substantial research and travel budgets, frequent time off, and a cascade of honorific titles help make up for salaries that, even at the higher levels, rarely approach what a comparably successful lawyer or doctor would make. The average full professor in a Ph.D.-granting public university earned $101,620 last year, but the figure is almost meaningless because of dramatic variations among individuals as well as between high-paying (engineering, accounting, law) and low-paying (arts, humanities, foreign languages) departments. Once upon a time, many institutions used published salary scales. It is only one sign of the faculty’s diminished authority on campus that academic salaries now are individually set by what administrators airily refer to as “the market,” and frequently kept secret.
Today, a profession exceeded by few others in its intellectual commitment to egalitarianism inhabits a world of ever more elaborate hierarchies. Not long ago, most universities had three professorial ranks—assistant, associate, and full professor—with a rare endowed chair thrown in. Now even public universities may have three or four categories of endowed, distinguished, and other super-ranked faculty, each with its own clearly defined status and perquisites.
Undergraduate education has been one of the chief casualties of this new order. Overcapacity and changing demographics mean that all but the most prestigious institutions have to admit marginally qualified students simply to keep their classrooms full. Virtually any high school graduate today who wants to go to college can find a berth somewhere. To reduce the chances that these under-prepared students and their tuition dollars will flunk out, most universities now tie faculty raises and promotions to student evaluations of teaching, thereby encouraging easier courses and less stringent grading. The days when a flagship state university would routinely fail a quarter or more of its freshman class in order to maintain academic standards ended before the last baby boomers graduated in the mid-1980s. Even so, about a third of the students who enroll in universities leave without getting a degree.
The thirst for dollars has also brought the system of “publish or perish,” which used to operate only in top research universities, to virtually every institution that aspires to national standing. While good teaching may attract good students, well-publicized research can bring in the harder currency of grant money and status. And if professors are to focus on research, somebody else must take up the teaching slack. The bulk of introductory teaching now falls to graduate students or poorly paid adjuncts—not the former high school teachers of old, but frequently holders of Ph.D.’s who have failed to find permanent positions amid the market glut. Contact with beginning undergraduates has become both less attractive and, for senior faculty, less frequent.
Nobody wants to go back to the era of finishing schools staffed by amateurs, but the new world we have created is dysfunctional in ways that we have only begun to recognize. The National Commission on the Future of Higher Education tentatively warned of “disturbing signs that many students who do earn degrees have not actually mastered the reading, writing, and thinking skills we expect of college graduates.” Are we really turning out armies of semiliterate college graduates? The answer is that nobody knows. Standardized national tests, along with federal monitoring of institutional quality and changes in financial aid to students, were among the commission’s recommendations. Like so many previous commissions on higher education over the past four decades, however, the group had trouble reaching agreement on the key questions and, apparently, persuading itself that its proposals had much chance of surviving the hostility of the higher education lobby.
The fact that evaluating universities is so frustratingly difficult suggests that we have only the vaguest idea of what we want from them. Is it primarily undergraduate education? (In that case, we could get rid of many expensive Ph.D. programs and research facilities.) If so, exactly what kind of education, and for what percentage of the population? By what criteria will the fortunate few or many be selected? (Any decision on these questions would definitively settle the issue of whether we currently have overcapacity, undercapacity, or just the right amount.) Or do we, as taxpayers, donors, or parents, really want to maintain the lavish graduate programs and laboratories that attract so many foreign applicants and lead to so many patents and Nobel Prizes? If so, maybe we should stop complaining quite so loudly about the price tag. In any case, merely agreeing that education and research are valuable doesn’t get us very far. Some informed choices are long overdue.
The late Reuven Frank, who was president of NBC News and subsequently a critic of television, once asked his readers whether they could correctly identify either the main product of a commercial television network or its customers. Most people, he noted, would say that the product was an array of programs, while the customers were the audience that viewed them. But most people would be wrong. The real product of the television industry, Frank concluded—what it sets out to create, what it compulsively measures, what it labors single-mindedly to increase—is an audience. The industry’s customers are advertisers who buy access to that audience.
A similar misapprehension surrounds higher education. The real product of major-league universities—what they measure obsessively with yardsticks that range from rankings in U.S. News & World Report to the total size of their research budgets, what they seek and reward most in faculty members, what they tirelessly emphasize in their marketing—is reputation. Their customers are parents, government at all levels, businesses, donors, and foundations, all of whom want different pieces of what has become a hugely complicated pie. In a world where the bottom line is so elusive, the distinction between appearance and reality has no meaning. At bottom, the mark of a great university, more than anything else, is its success in gaining and profiting from a reputation for being a great university.
The power of the new ivory tower endures, like that of a church, simply because most Americans believe whichever of its promises pertain to them. If they ever started to question their faith, a Reformation might be at hand, but not before.