America's Romance with the Future
When he announced his bid for the presidency back in 1991, the then-governor of Arkansas, Bill Clinton, spoke movingly of the teacher who had most influenced his thinking, a Georgetown University professor named Carroll Quigley. Quigley was known for ripping apart a copy of The Republic while he denounced Plato as the intellectual father of totalitarianism. But it was not the classroom pyrotechnics that most impressed the future president. Rather, it was Quigley’s emphasis on the future in his foundation course on Western civilization. Clinton never forgot the professor’s preoccupation—and not just because he was one of only two students in the class to receive an A.
“The thing that got you into this classroom today is belief in the future, a belief that the future can be better than the present and that people will and should sacrifice in the present to get to that better future,” said Quigley. “That belief has taken man out of the chaos and deprivation that most human beings toiled in for most of history to the point where we are today. One thing will kill our civilization and way of life—when people no longer have the will to undergo the pain required to prefer the future to the present. That is what got your parents to pay this expensive tuition. That is what got us through two wars and the Depression. Future preference. Don’t ever forget that.”
It is tempting to dismiss this preference for the future as a truism, an instinct for clan survival hard-wired into the genes of all living creatures. Adults of every species exert themselves to feed and protect their helpless young. Hunter-gatherers learn to salt and dry today’s meat against tomorrow’s hunger, and the most primitive peasants learn to save precious seed corn for next year’s harvest. But advanced societies have embellished and refined the instinct into something much grander: an array of deliberate policy choices. These include investment in police and standing armed forces, education and economic infrastructure, and social health and welfare. Advanced societies extend welfare provisions even to the elderly, though they know that there is little genetic advantage to be gained from such expenditure on those beyond breeding age. They make these substantial income transfers from the working population to the retired for reasons of social cohesion and human decency—and possibly also from an acute sense of the propensity of the elderly to vote. Whatever the cause, this is an act of general political will that has little to do with the individual demands of our genes and everything to do with what we might call Quigley’s Law: Successful societies are defined by their readiness to allow consideration of the future to determine today’s choices.
The United States is a successful society today because over the past two or three generations it has applied Quigley’s Law more thoroughly and more widely than any other society in history, and, in doing so, has shaped much of the world. Until 1940, the United States was not much more Quigley-minded than most other great powers. But the challenges of global war from 1939 to 1945, and the Cold War thereafter, persuaded successive administrations of both parties to apply Quigley’s principles on a global scale. There had been a hesitant precedent in the way that the British Empire crushed piracy, abolished the slave trade, established the principle of freedom of the seas, and built lighthouses and ports available to all. But the strategy by which the United States waged the Cold War was altogether more grandiose in conception and more transforming in its application.
That extraordinary generation of policymakers gathered around Presidents Franklin Roosevelt and Harry Truman—George Marshall, Dean Acheson, George Kennan, Paul Nitze, Paul Hoffman, and others—established, with bipartisan support, a series of global institutions that, in effect, created the West, the global economic machine that brought together the wealth, markets, and ingenuity of North America, Western Europe, and Japan. The policymakers set up the North Atlantic Treaty Organization (NATO) for common security, the International Monetary Fund for global economic stability, the World Bank for global development, the United Nations for global order, and the General Agreement on Tariffs and Trade for the expansion of global trade. And they crafted inventive new instruments to help the war-flattened industries of Europe and Japan rebuild at American expense. The Marshall Plan, for example, which furnished Europeans with the dollars that enabled them to rebuild their factories and feed their workers (the offer was made to the Soviet Union and the Eastern bloc countries as well), represented an annual disbursement of just over one percent of America’s gross domestic product (GDP) for five years.
There was method to this altruism. The Western European economies were thereby enabled to contribute not only more effectively but also more willingly to common defense; NATO, in contrast to the Warsaw Pact and its dragooned members, was an alliance of consent. The United States subsequently extended the pattern of altruism through the Pentagon’s Special Procurements Fund, which pumped more money into rebuilding Japan than West Germany had received under the Marshall Plan. Because Japan became the industrial and logistic base for the Korean War, American taxpayers financed the ports, railroads, power stations, hospitals, and shipyards of modern Japan. They even paid for the first assembly lines of the Toyota Motor Corporation, which was about to go bankrupt when it was saved by a Pentagon order for trucks.
The spur to this Quigleyan activity on a global scale was, of course, the national security of the United States: The nation needed forward bases in Europe and Asia and allies to share the burden of the Cold War. Yet America’s grand strategists understood that, in rebuilding these allies, they were fostering formidable commercial competitors for the future, whose success might one day challenge the economic dominance that had allowed the United States to generate about half of all global economic output in 1945. American politicians certainly understood what was at stake, and, accordingly, they exacted various prices. Southern Congressmen, for example, insisted that American tobacco products be counted as Marshall Plan aid, which caused one British member of Parliament to complain, “The British Empire is being sold for a packet of cigarettes.”
A far more important demand was “the open door,” a requirement that the British, French, and Dutch colonial empires dismantle the imperial tariff system that gave their goods privileged access to colonial markets. This dovetailed precisely with the American strategy to promote world trade and thus boost American exports. As a grand design, it proved stunningly successful, although some Americans may have thought the price rather high. By 2005, the United States and the 25-nation European Union each accounted for less than a quarter of global GDP, and Japan for another 11 percent. The once-stricken competitors had long since become serious commercial rivals, in a large, prosperous, and competitive global economy that witnessed the decimation of American jobs in traditionally strategic industries such as coal, steel, and automobiles.
The Soviet Union, the West’s great adversary in the Cold War, had its own plans for the future. At the Twenty-second Party Congress in 1961, Soviet premier Nikita Khrushchev pledged that within 20 years his country would be outproducing the United States in all the traditional sectors of industrial might: coal, steel, cement, fertilizer, tractors, and metal-cutting lathes. The pledge was fulfilled: In 1981, the Soviet Union outdid America in every one of those industries; it had successfully reproduced a mid-20th-century industrial economy. But the West by then was inventing a different kind of economy altogether, one based on plastic and silicon, on the new service sector, and on world trade. Even with the best of Quigleyan motives, an advanced society, such as the Soviet Union (which put the first man into space even as Khrushchev was issuing his promises), can make disastrous choices.
That mistaken Soviet vision of the future ensured that the entire planet would eventually come to live instead in an American-designed future, whose contours were drafted in the furious burst of technological, cultural, and economic energy that powered the United States after it assumed its global role in World War II and the postwar world. Its films and popular music, its visual arts and literature, its assumption that a college education should be the norm, and its insistence on domestic comforts (appliances, central heating and air conditioning, family cars) have now all spread beyond the mass middle class that America invented and become the defining possessions of a mass middle class that is global. And with them have spread those essential underpinnings of the American creed: free press, free trade, free markets, and free elections.
We live now in that American future and call it globalization. And whatever the costs—personal, regional, environmental—that have been paid by Pittsburgh steelworkers, Amazonian tribes, Nigerian villagers, or deracinated Muslims in Paris slums, the overall achievement has been stupendous. More people than ever before are clambering out of the absolute poverty of their ancestors and aspiring to join that mass middle class. James McGregor, chairman of the American Chamber of Commerce in China and author of the new book One Billion Customers, estimates that the market for private cars in China is already bigger than the markets of France and Germany combined, and within five years it will be twice as large again. By then, the Indian market, too, will be bigger than the combined markets of France and Germany. And so on. The biosphere groans under the strain, but the future of mass consumption that gripped the young Henry Ford 100 years ago, and that was implicit in the Cold War’s original grand strategy, now pervades the world.
Have we any clues as to how these new pressures are likely to affect human relationships and social change? We do—and these clues come from Americans. It is a remarkable feature of science fiction that, although Europeans invented the genre, Americans have produced its most thoughtful explorations of future societies. Jules Verne and Arthur Conan Doyle and H. G. Wells were fascinated by the future of things, of stupendous technology. American authors of science-fiction classics tend to have been intrigued rather by the future of people. Robert Heinlein wrote what is still the most accomplished description of a wholly free-market society in The Moon Is a Harsh Mistress. Isaac Asimov drafted laws of robotics (“A robot may not injure a human being or, through inaction, allow a human being to come to harm”) that are sure to come in handy fairly soon. And Philip K. Dick explored, among other themes, the personal relationships that are bound to develop between humans and androids (the novel Do Androids Dream of Electric Sheep? was the basis of the movie Blade Runner), the nature of justice in a society where human behavior and even crime may be predicted (the short story “The Minority Report” became the movie Minority Report), and the likely outcome when virtual reality becomes all too plausible (“We Can Remember It for You Wholesale” reached the screen as Total Recall).
Americans, then, invent the future as statesmen and imagine it as writers, and they have traditionally been confident that the future will be splendid—that today’s debts will be tomorrow’s fortune, that their citizenship holds a vast and generous promise that will inevitably be redeemed. The vision on the other side of the Atlantic has been altogether grimmer. “If you want a picture of the future, imagine a boot stamping on a human face—forever,” wrote George Orwell in his novel 1984. That’s far removed from the sentiment of the modern American sage Daniel Boorstin: “America has been a land of dreams. A land where the aspirations of people from countries cluttered with rich, cumbersome, aristocratic, ideological pasts can reach for what once seemed unattainable. Here they have tried to make dreams come true.” When Henry Ford said “History is bunk,” he was speaking a great truth for those millions of immigrants who had abandoned the old continent with its constipated social order and confining tradition. America was Hegel’s “land of desire for all those who are weary of the historical lumber-room of Old Europe.”
The question now, however, is whether that vision still endures, whether the innate national confidence that made Ronald Reagan’s “It’s morning in America” resonate so powerfully remains secure. There are some troubling signs that Quigley’s Law is no longer operating with the old American rigor. America as an economic community is no longer saving the seed corn. Indeed, it is no longer saving. Since 2002, America’s annual net savings have failed to rise even to the miserable level of two percent of GDP. Europeans save about 15 percent of GDP, and the Chinese more than 35 percent. The federal budget deficit was $412 billion in 2004, and the current account deficit (which used to be called the trade deficit) was $666 billion. Combine those figures into a double deficit, and the United States in 2004 lived beyond its means to the tune of more than a trillion dollars. We learn from the bookkeeping of the Bank for International Settlements that these deficits were largely financed by the central banks of China and Japan, which bought dollars, Treasury bonds, and other U.S. securities. Thanks to the Chinese and Japanese savers who wanted Americans to have the money to continue consuming their exports, Americans were able to continue living in the style to which they had become accustomed, but which they could no longer afford.
In October 2005, the Council on Foreign Relations released a report, “Getting Serious About the Twin Deficits,” by Professor Menzie Chinn of the University of Wisconsin–Madison. (Chinn served on the Council of Economic Advisers for Presidents Bill Clinton and George W. Bush.) “Failure to take the initiative to reduce the twin deficits will cede to foreign governments increasing influence over the nation’s fate. Perhaps equally alarming, it will lead to slower growth, escalating trade friction, and reduced American influence in political and economic spheres,” Chinn wrote in the report. “Foreign governments and private investors, confronted with an endless vista of U.S. budget deficits, will tire of accumulating Treasury securities. Borrowing costs for the Treasury would then rise significantly and the dollar would fall sharply. The economy would slow dramatically, driven indirectly by a slump in the housing market or directly through falling private consumption.”
These are alarming warnings from a respected source. Perhaps the best antidote to the gloom is to recall that the United States has always been rather good at reinventing itself in the face of new challenges and changed times. It is barely 14 years since former Massachusetts senator Paul Tsongas won the New Hampshire presidential primary in 1992 with the slogan “The Cold War is over, and Japan won.” Since then, the Japanese economy has been virtually stagnant. The U.S. economy, which along the way developed the Internet and broadband technology, has grown by more than 40 percent. That is to say, the GDP of the U.S. economy has grown since 1992 by an amount greater than the entire GDP of Japan. It requires a breathtaking disregard for the lessons of history to bet against the resilience and vigor of the American economic machine.
One way to look at American history over the past century or so is to suggest that in the late 19th century the United States became the world’s farm, the source of cheap food that fed its own swelling population and much of the rest of the world. In the first two-thirds of the 20th century, it became the world’s workshop, the source of industrial innovations and goods, and, when needed, of munitions. Over the past generation, as European, Japanese, and Chinese manufacturers began challenging its dominance, the United States became the world’s graduate school.
The most recent ranking of the world’s universities (the criteria included the Nobel and other international prizes, articles cited in leading academic journals, research results, and academic performance) was published in 2005 by the Institute of Higher Education at Shanghai’s Jiao Tong University. Of the world’s top 10 universities, only two, Oxford and Cambridge, were not American. The third non-American university to make the list was Japan’s Tokyo University, at number 20. The highest-ranking non-British European university, at number 27, was Switzerland’s Federal Institute of Technology in Zurich. America dominates the world’s brainpower and scores well on this classically Quigleyan measure of care for the future. If the global mass middle class is indeed straining the biosphere beyond endurance, it will be universities in America—if anywhere—that produce the research and innovation needed to repair the damage.
Alexis de Tocqueville’s Democracy in America, the first volume of which was published in 1835, remains perhaps the most perceptive book ever written on the young republic. Tocqueville’s ideas and judgments have continued to ring true, including the cautionary notes he sounds along with his expressions of admiration. His celebrated warning about a singular American weakness provides the counterpoint to Quigley’s essential optimism: “The prospect really does frighten me that they may finally become so engrossed in a cowardly love of immediate pleasures that their interest in their own future and in that of their descendants may vanish, and that they will prefer tamely to follow the course of their destiny rather than make a sudden energetic effort necessary to set things right.” These many years later, Tocqueville’s concern seems more prescient and urgent than ever.