Revisiting the Great Depression
– Robert J. Samuelson
The role of the welfare state in today’s economic crisis recalls the part played by the gold standard in the calamitous 1930s.
The Great Depression cast a dark shadow over the 20th century. It arguably led to World War II, because without the Depression, Adolf Hitler might never have come to power. It discredited unfettered capitalism—which was blamed for the collapse—and inspired the expansion of government as the essential overseer of markets. This economic catastrophe has long fascinated historians and economists, but for decades serious reflection on the Depression didn’t extend much beyond the scholarly world. It couldn’t happen again. We knew too much. There were too many economic and regulatory controls. But the Great Recession has made us wonder. Can we learn from the Depression? Are there parallels between then and now? Most ominously, could we suffer another depression? The conventional wisdom still says no. Unfortunately, the conventional wisdom might be wrong.
There is no precise definition of a depression; it’s a term of art. Generally speaking, it’s a broad economic collapse that produces high unemployment from which there is no easy and obvious escape. The crucial difference between recession and depression is that recoveries from run-of-the-mill recessions occur fairly rapidly in response to automatic market correctives and standard government policies. Businesses work off surplus inventories or repay excessive debt. Governments reduce interest rates and allow budgets to swing into deficit. A depression occurs when these mechanisms don’t work, or don’t work quickly. The pivotal question becomes: Why?
One answer is that powerful historical, social, and political changes overwhelm the normal market and policy responses. Modern depressions are not ordinary business cycles susceptible to routine remedies, because their origins lie in institutions and ideas that have been overtaken by events. But letting go of or modifying these powerful attachments is a painfully slow process, precisely because the belief in them is so strong and the alternatives are often unclear. Hence, adjustment occurs slowly, if at all. Change is resisted or delayed, or wanders down dead ends. Economies languish or decline. The Great Depression was one of those moments. We may now be in another.
There are parallels between then and now, largely unrecognized. Then, the forces suffocating economies stemmed from a jarring historical rupture: the end of the gold standard. In the late 1920s and early ’30s, countries clung to the gold standard—backing paper currencies with gold reserves—as a defense against hyperinflation. Gold was thought to be the foundation of sound money, which was deemed necessary for prosperity. Most simply, gold regulated economic activity. When gold drained out of a country, supplies of money and credit tended to shrink; when a country accumulated gold, they tended to expand. But defending the gold standard caused country after country to suffer banking runs and currency crises. These fed each other and deepened the economic collapse. By 1936, more than two dozen countries had reluctantly jettisoned gold. Once this happened, expansion generally resumed.
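The discipline gold imposed was mechanical, and it can be made explicit with a stylized sketch. Suppose a central bank must hold gold equal to at least a fraction λ of the money it issues (the 40 percent cover below echoes the old Federal Reserve requirement for its notes, and the dollar figures are purely hypothetical):

\[
M \le \frac{G}{\lambda}, \qquad \lambda = 0.40: \quad G = \$400 \text{ million} \Rightarrow M \le \$1 \text{ billion}; \quad G = \$300 \text{ million} \Rightarrow M \le \$750 \text{ million}.
\]

A 25 percent gold outflow could thus force a contraction of money and credit of up to 25 percent, whatever the state of the domestic economy.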
Something similar is happening today, with the welfare state—the social safety net of wealthy democracies—playing gold’s destructive role. In Europe, government spending is routinely 40 percent or more of national income. In the United States, it exceeds a third. Like the gold standard 80 years ago, these protections command broad support. They mediate between impersonal market forces and widely shared norms of fairness. The trouble is that many countries can no longer afford their costly welfare states. Some nations have already overborrowed; others wish to avoid that fate. Their common antidote is austerity: spending cuts, tax increases, or both. The more austerity spreads, the greater the danger it will feed on itself. What may make sense for one country is disastrous for many—just as in the 1930s.
The exhaustion of economics is another parallel between our time and the Depression. Then, as now, economists didn’t predict the crisis and weren’t able to engineer recovery. “Liquidate labor, liquidate stocks, liquidate the farmers, liquidate real estate,” said President Herbert Hoover’s Treasury secretary, Andrew Mellon. In the 1930s, this “liquidationist” view dominated. Let wages, stocks, and land values fall until prices are attractive, it said; recovery will occur spontaneously as businesses hire and investors invest. It didn’t work. Today’s orthodoxy is Keynesianism (after John Maynard Keynes), and governments responded to the 2007–09 financial crisis with its textbook remedies. The Federal Reserve and other central banks cut interest rates; governments ran huge budget deficits. Arguably, these measures did prevent a depression. But, contrary to expectations, they did not promote a vigorous recovery. As in the 1930s, economics has disappointed.
Of course, analogies shouldn’t be overdrawn. We’re still a long way from a second Great Depression, even if such an economic disaster is conceivable. Compared with what happened in the 1930s, the present distress—here and abroad—is tame. From 1929 to 1933, the output of the U.S. economy (gross domestic product) dropped almost 27 percent. The recent peak-to-trough GDP decline, from the fourth quarter of 2007 to the second quarter of 2009, was 5.1 percent. From 1930 to 1939, the U.S. unemployment rate averaged 14 percent; the peak rate, in 1933, was 25 percent. Rates elsewhere in the world were as bad or worse. Unemployment among industrial workers reached 21 percent in the United Kingdom in 1931 and 44 percent in Germany in 1932. The social protections we take for granted barely existed. Congress didn’t enact federal unemployment insurance until 1935.
Still, the economy’s present turmoil resembles the Great Depression more than anything since. As this is written, Europe is sinking into recession. In the United States, unemployment stayed above nine percent for 21 consecutive months and, after briefly dipping just below that level, for another seven. The longest previous stretch above nine percent was 19 months, in the early 1980s. Against this backdrop, it’s natural to reexamine the Depression and search for parallels.
The Depression is usually dated from late 1929 to the eve of World War II. But people didn’t immediately recognize that they had entered uncharted economic waters. “Down to the last weeks of 1930, Americans could still plausibly assume that they were caught up in yet another of the routine business-cycle downswings that periodically afflicted their boom-and-bust economy,” David Kennedy writes in his Pulitzer Prize–winning history Freedom From Fear: The American People in Depression and War, 1929–1945. Unemployment, for example, reached nearly 12 percent in the recession year of 1921 and was 8.9 percent in 1930. The riddle is: What caused the Depression to defy history? Over the years, many theories have been floated and discredited.
Chief among the fallen is the stock market crash of 1929. True, it was terrifying. From October 23 to November 13, the Dow Jones Industrial Average dropped almost 40 percent, from 327 to 199. Fortunes were lost; Americans were fearful. But steep market declines, before and since, have occurred without causing a depression. The most obvious connection would be the “wealth effect”: shareholders, being poorer, would spend less. However, very few Americans (about 2.5 percent in 1928) owned stocks. Moreover, stocks rebounded, as historian Maury Klein has noted. By the spring of 1930, the Dow had recovered 74 percent of its losses. Stocks later fell, but that was a consequence of the Depression, not the cause.
Another familiar villain is the Smoot-Hawley tariff. It has “become synonymous with an avalanche of protectionism that led to the collapse of world trade and the Great Depression,” writes Dartmouth economist Douglas Irwin. But Irwin’s recent book Peddling Protectionism demolishes the conventional wisdom. The tariff’s direct effects were modest, and its timing also argues against its significance. President Hoover signed the Smoot-Hawley Tariff Act in June 1930, well after the Depression had begun. Average U.S. tariffs on dutiable imports did rise from 40 percent in 1929 to 59 percent in 1932, but two-thirds of U.S. imports carried no duties at all. Europe did retaliate with higher tariffs, but only six percent of Europe’s exports came to the United States. Trade did collapse in the Depression, but (again) that was consequence, not cause.
Finally, there’s Herbert Hoover. The anti-Hoover indictment is that he passively let the Depression deepen and, by trying to balance the budget, made it worse. This argument is unfair and inaccurate. After the crash, Hoover urged businesses to maintain wages and continue investment projects. In three years, he nearly doubled federal public works spending and pushed the states to do likewise. In 1932, he did successfully propose a tax increase—Roosevelt also advocated balanced budgets, a widely shared goal—but the federal budget still ran a large deficit: four percent of GDP. “It would be hard to find an economic historian to argue that fiscal [budgetary] tightness was a significant factor in worsening the Great Depression,” writes Timothy Taylor, managing editor of The Journal of Economic Perspectives.
None of these familiar scapegoats solves the puzzle: Why did the economy keep getting worse? Some other force or forces must have been responsible. Scholarship on this question has proceeded in spasms.
In 1933, Irving Fisher of Yale, then one of the nation’s most prominent economists, published an article titled “The Debt-Deflation Theory of Great Depressions.” The chief causes of the Depression, he argued, were “over-indebtedness to start with and deflation following soon after.” Debts were written in fixed dollar amounts, and so deflation—falling prices, wages, and profits—made it harder for farmers, businesses, and households to repay loans. Defaults dumped more land and jobless workers onto the market, causing prices and wages to fall further and worsening the slump. It was a vicious circle. Still widely accepted, Fisher’s analysis explains why modern economists dread deflation. From 1929 to 1933, prices for wheat, corn, and other farm products dropped 54 percent; those for building materials fell 25 percent. But Fisher didn’t explain precisely what caused the 1930s’ deep deflation.
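Fisher’s mechanism reduces to simple arithmetic. As a worked illustration using the article’s farm-price figure (the loan amount is hypothetical): a debt D is fixed in dollars, so when the price level P falls, the real burden D/P rises.

\[
\frac{D}{P} = \frac{\$1{,}000}{1.00} = \$1{,}000 \quad \longrightarrow \quad \frac{\$1{,}000}{0.46} \approx \$2{,}174 \quad \text{after a 54 percent fall in } P.
\]

A farmer who borrowed $1,000 against wheat at 1929 prices owed, in effect, more than twice as much wheat by 1933, though the nominal debt never changed.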
In 1936, Keynes provided his answer in The General Theory of Employment, Interest, and Money. The culprit was insufficient “effective demand”—what economists now call “aggregate demand.” People and firms weren’t spending enough. Keynes rejected the “classical” economists’ view that spontaneous shifts in wages and interest rates would generate recovery. Wages might be rigid. Low interest rates might not stimulate new investment in plants or products, because businessmen’s “animal spirits” had deadened. The economy “seems capable of remaining in a chronic condition of subnormal activity for a considerable period without any marked tendency either towards recovery or towards complete collapse,” he wrote. Keynes’s remedy was to boost “effective demand” through more government spending.
But his argument, like Fisher’s, was abstract. It lacked a detailed explanation of the Depression itself. Since then, scholars have scoured the historical record to obtain a fuller answer. A breakthrough occurred in 1963 with the publication of A Monetary History of the United States, 1867–1960 by Milton Friedman (a subsequent Nobel Prize winner) and Anna Jacobson Schwartz. Friedman and Schwartz argued that the Federal Reserve caused the Depression by failing to rescue the banking system. From 1929 to 1933, more than two-fifths of the nation’s 24,970 banks disappeared through failure or merger. The nation’s money supply—basically, bank deposits plus currency in circulation—shrank by a third. This steep decline, said Friedman and Schwartz, drove prices and production down. The irony was that Congress created the Fed in 1913 to backstop the banking system.
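The mechanics behind that one-third contraction can be sketched with the textbook money multiplier; the ratios below are illustrative, not Friedman and Schwartz’s estimates. If B is the monetary base, c the public’s currency-to-deposit ratio, and r the banks’ reserve-to-deposit ratio, then

\[
M = \frac{1+c}{r+c}\,B .
\]

Panics raise both c (depositors hoard cash) and r (surviving banks hoard reserves). If c rises from 0.10 to 0.25 and r from 0.10 to 0.20, the multiplier falls from 5.5 to about 2.8, nearly halving M even if the Fed holds the base B steady. That is why, on this account, the Fed’s passivity was so costly.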
What would have been a normal, if severe, recession became a depression. Friedman and Schwartz blamed the Fed’s passivity on the death in 1928 of Benjamin Strong, head of the New York Federal Reserve Bank, who had been the Fed’s most forceful figure and would have, they contended, acted aggressively to limit bank failures. By contrast, economist Allan Meltzer cites the “real bills” doctrine as the cause of the Fed’s passivity. Under “real bills,” the Fed lent to banks only against short-term commercial loans (“real bills”) presented as collateral. During the Depression, banks had few to present; the supply of money and credit shrank. Whatever the truth, these accounts had the Depression starting in the United States and spreading abroad. It was an American story with global side effects.
Not so, argued the economic historian Charles Kindleberger in his 1973 book The World in Depression, 1929–1939. The collapse was international and reflected the inability of a Britain weakened by World War I to continue to stabilize the world economy. Among other things, Kindleberger wrote, Britain’s leadership role had required it (a) to act as “lender of last resort” to stem banking crises, (b) to keep its markets open to sustain trade, and (c) to maintain stable exchange rates. After the war, Britain couldn’t perform these tasks. It lacked sufficient gold reserves to make loans to stop foreign banking crises. High joblessness weakened its commitment to free trade. Consequently, it couldn’t stabilize exchange rates.
The gold standard transmitted the breakdown around the globe, argue economic historians Barry Eichengreen and Peter Temin in, respectively, Golden Fetters: The Gold Standard and the Great Depression, 1919–1939 (1992) and Lessons From the Great Depression (1989). Countries that backed their paper currency with gold sacrificed much economic independence. For example, gold outflows through trade deficits might trigger recessions, because the loss of gold could automatically contract the supply of money and credit. But countries could not respond by devaluing their currencies to boost exports; gold fixed currency rates. Gold’s straitjacket was its supposed virtue. By eliminating inflation and currency fluctuations, it reduced uncertainty and encouraged commerce. This was the theory and belief.
After World War I, countries sought to restore the gold standard, which had been widely suspended during the fighting. Because reliance on gold had seemed to underpin prewar prosperity, this was understandable. But there were daunting problems: Prices had exploded during the war; gold was relatively scarce; exchange rates had shifted; countries were saddled with large debts. As a result, the restored gold standard was unstable. Skewed exchange rates meant that two countries, the United States and France, ran large trade surpluses and accumulated disproportionately large gold stocks. By 1930, they owned nearly 60 percent of the world’s gold.
The resulting gold scarcity—for most countries—created a fatal interdependence. If one country raised interest rates, it might drain gold from others. Depositors and investors, foreign and domestic, would withdraw their money or sell their bonds, convert the receipts into gold, and transfer the gold to the country with higher interest rates. There, the process would be reversed: Gold would be converted into local currency and invested at the higher rates. The gold standard created a potential domino effect of tighter credit that would make the Depression feed on itself. While credit was plentiful, the danger was theoretical. Once economies turned downward, the scramble for gold intensified the slump.
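The pressure can be stated as a simple arbitrage condition, familiar from accounts of the period (the notation here is illustrative, not the historians’). With interest rates i_A and i_B in countries A and B,

\[
i_A - i_B > \text{cost of shipping and insuring gold} \;\Longrightarrow\; \text{gold flows from } B \text{ to } A .
\]

Once A tightened, B had to match the higher rate or watch its gold, and with it its money supply, drain away. Under the gold standard, tightening anywhere tended to become tightening everywhere.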
Germany’s Reichsbank, the Bank of England, the Fed, and other central banks were handcuffed in their efforts to aid their countries’ banks. The Depression weakened banks by increasing their customers’ loan defaults; loan losses then made the banks more vulnerable to depositor runs. But a central bank couldn’t inject too much money and credit into the system without raising doubts about its country’s commitment to gold. Politics compounded the effect by closing another avenue of escape: international rescues to stop bank runs. In May 1931, Austria’s largest bank, Credit-Anstalt, faced a panic. The Bank of England’s reserves were too meager for it to provide an adequate loan on its own, and France—still scarred by World War I—insisted that Austria renounce a customs union with Germany before providing funds. The rescue was delayed. Panic spread and confidence fell.
Gold’s oppressive consequences ultimately caused countries to abandon it. Austria, Germany, and Britain did so in 1931. (The United States left two years later, while France hung on until 1936.) The process was long and punishing because faith in gold was so pervasive. It was hard to let go. But once countries did let go, they could spur their economies. Eichengreen writes, “They could expand the money supply. They could provide liquidity [cash] to the banking system at the first sign of distress. They could increase the level of government expenditure. They could take these actions unilaterally.” By 1937, world manufacturing output was 71 percent above its 1932 level and had exceeded its 1929 level.
Why was the Depression so deep and long? All this scholarship provides a crude answer. Whether the cause was the gold standard, the “real bills” doctrine, Benjamin Strong’s death, Britain’s postwar weakness, or rancor from World War I—or all of these factors—government economic policies perversely reinforced the original slump. Banks were not rescued. Defaults and bankruptcies fed deflation. Unemployment spiraled up, production down. Prevailing economic doctrine was suicidal. The good news, it’s said, is that we understand what happened and can prevent a repeat. Heeding Fisher, we can avoid deflation. Following Keynes, we can prop up aggregate demand. Per Friedman and Schwartz, we can defuse financial panics. Learning from Kindleberger, Eichengreen, and Temin, we can practice international cooperation.
Unfortunately, these reassurances omit an obvious and more discouraging lesson: The Depression couldn’t end until people changed their beliefs and behavior—a lengthy and tortuous process, because people cling to what’s familiar. Here is where the parallel with the present becomes relevant and sobering. Just as the gold standard amplified and transmitted the effects of the Depression, so the modern welfare state is magnifying the effects of the recession. The United States, Europe, and Japan, together representing about half of the world economy, face similar pressures: aging societies, high government spending, and soaring debt levels. These pressures impose austerity on country after country—just as the gold standard did. The cumulative effect is to make it harder for the world to recover from what started as an ordinary, though severe, recession—just as happened under the gold standard.
Casting the welfare state in this role will strike many as outrageous. After all, the welfare state—what Americans blandly call “social spending”—didn’t cause the 2007–09 financial crisis. This dubious distinction belongs to the huge credit bubble that formed in the United States and elsewhere, symbolized by inflated real estate prices and large losses on mortgage-related securities. But neither did the gold standard directly cause the 1929 stock market crash. Wall Street’s collapse stemmed, most simply, from speculative excesses. Stock prices were too high for an economy that was already (we now know) entering recession. But once the slump started, the gold standard spread and perpetuated it. Today, the weakened welfare state is perpetuating and spreading the slump.
What has brought the welfare state to grief is not an excess of compassion, but an excess of debt. After World War II, governments in most advanced countries grew enormously, a reaction to the suffering of the Depression coupled with early postwar optimism about the power of social engineering. By 2007, government spending totaled 53 percent of GDP in France, 44 percent in Germany, 45 percent in Britain, and 37 percent in the United States, reports economist Vito Tanzi in Government Versus Markets (2011). Most spending represented income transfers. Even in the United States, with its sizable military budget, “payments for individuals” (which means entitlements such as Social Security and Medicare) amounted to two-thirds of federal spending in 2010, up from a quarter in 1960.
But this system required favorable economics and demographics—and both have moved adversely. A younger population was needed to lighten the burden of supporting the old, the largest claimants of benefits. Rapid economic growth was needed to generate the tax revenues to pay for benefits. Indeed, the great expansion of benefits started in the 1950s and ’60s, when annual economic growth in Europe and the United States averaged about four percent or more, and the expectation was that this would continue indefinitely. Long-term economic growth is now reckoned closer to two percent a year, a little more for the United States, a little less for Europe. Meanwhile, older populations are exploding. In 2010, Italians 65 and over were 21 percent of the population, a share heading toward 34 percent by 2050; for the United States, the figures were 13 percent and 20 percent.
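The budget arithmetic follows directly. As a stylized pay-as-you-go calculation using the article’s Italian shares (the benefit level is hypothetical, and everyone under 65 is counted as a worker): if each retiree receives a pension equal to a fraction b of the average wage, the payroll tax rate t needed to fund it is

\[
t = b \times \frac{\text{retirees}}{\text{workers}}: \qquad 0.5 \times \frac{0.21}{0.79} \approx 13\% \quad \longrightarrow \quad 0.5 \times \frac{0.34}{0.66} \approx 26\% .
\]

On those assumptions, holding benefits constant would require the tax burden on each worker to roughly double by 2050.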
The means of escape from these unhappy trends was to borrow. Some countries with extensive welfare systems that didn’t borrow heavily (examples: Sweden and Finland) have fared well. But most governments became dependent on bond markets. Until the financial crisis, they coexisted in a shaky equilibrium. Most European governments could borrow cheaply. Their bonds were considered safe investments. Perhaps inevitably, the financial crisis shattered this equilibrium. Economic growth fell from already low levels; government debt rose. Suddenly, financial markets—banks, pension funds, insurance companies, wealthy investors—turned skittish. Perhaps debts wouldn’t be repaid. Greater risk translated into higher interest rates on government bonds.
Once this happened, welfare states became an engine of international austerity. Countries’ choices were constricted. To maintain existing levels of spending, they needed to borrow. But lenders demanded higher interest rates, and to keep these down, governments had to resort to austerity, which meant cutting social programs and raising taxes. Some countries were completely shut out of private markets and had to rely on international financial bailouts; but these bailouts (i.e., loans) came with a string attached: austerity. First Greece, then Ireland and Portugal submitted to this logic. But almost all advanced countries, including the United States, are potentially subject to it. Countries embrace austerity to keep their creditworthiness. Or they embrace it because they lose their creditworthiness.
What this means is that governments, against their will, are being forced to reconsider some basic post–World War II premises around which their economies and societies are organized, much as countries in the 1930s were forced to reconsider economic premises based on the gold standard. Now as then, the process is unwelcome, painful, and agonizingly slow. It involves a balancing of political and economic imperatives: not dismantling the welfare state, but shrinking it to a size that is politically acceptable and economically viable. Social protections and benefits must be reduced so that the resulting obligations don’t impose crippling levels of debt or taxes. It is not clear where this point is and whether wealthy democracies are capable of identifying and reaching it. It will differ for different countries, depending on their underlying economic vitality and political culture.
The ultimate danger is that the welfare state will go into a death spiral. The political impetus to provide promised benefits keeps taxes and debt high, to the point that economic growth suffers; but slower growth or longer recessions make it harder to pay promised benefits, an outcome requiring still further cutbacks. As political leaders grapple with these problems, they are constantly reacting to events—doing too little too late. The fact that many governments are caught in this trap simultaneously means that their collective actions exert a drag on the world economy that makes it harder for all of them to reconcile political and financial-market pressures. The further fact that Europe’s banks are large holders of government debt means that a debt crisis could become a banking crisis—with failures and runs—or a credit squeeze, as banks suffer large losses on their bond portfolios.
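The spiral has a standard formalization in the arithmetic of the government budget constraint (the notation is conventional, not the author’s). With d the ratio of debt to GDP, r the interest rate on that debt, g the economy’s growth rate, and p the primary surplus (the budget balance excluding interest payments) as a share of GDP,

\[
\Delta d \approx (r - g)\,d - p .
\]

When markets push r above g, the debt ratio grows on its own unless the primary surplus offsets it; and austerity that depresses g widens the gap between r and g. For illustration: with debt at 100 percent of GDP, r at 6 percent, and g at 1 percent, a country must run a primary surplus of 5 percent of GDP merely to keep its debt ratio from rising.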
Governments are losing control over their economic fates, because high debt also undermines the standard Keynesian anti-recessionary tools, a.k.a. “stimulus”: spending more and taxing less in times of economic weakness. The prospect of more debt simply sends interest rates up, nullifying some or all of any “stimulus” and, for some countries, closing access to private credit markets. It’s true that some major debtor countries, notably the United States and Germany, have so far escaped this squeeze. Their interest rates (at this writing) remain low, about two percent on 10-year bonds. But there’s no ironclad reason why these countries should remain immune forever. If investors come to believe that the United States can’t control its debt, they might dump Treasury bonds and other dollar securities. Interest rates would rise; on foreign exchange markets, the dollar would fall.
So it’s not preposterous to compare the gold standard then with the welfare state now. In both cases, a framework is imposed that impedes recovery from what might otherwise be a recognizable recession. The obstacles lie in institutions and beliefs that are deeply woven into the social, political, and intellectual fabric of societies. It takes time to adjust—and sometimes adjustment doesn’t happen at all—because the status quo has established stubborn habits of thought and strong vested interests that can be dislodged only by powerful, incontestable evidence and experience to the contrary. Even then, the destruction of the old does not ensure replacement by the new. There may simply be a void.
This does not mean we are condemned to a second Great Depression. The messy process of grappling with overcommitted government may lead to slow growth, long recessions, or stagnation—but not the dramatic collapse of the 1930s. China, India, Brazil, and other developing countries, representing about half of the world economy, don’t face the dilemmas of mature welfare states. Their economic growth may provide a safety net for the “old world” of Europe, North America, and Japan. But here, too, there are cautionary comparisons. China’s rise and America’s problems have fragmented economic power. Cooperation is strained. The analogies with Britain’s post–World War I weakness and the paralyzing rancor between Germany and France are obvious. Another parallel with the 1930s is the euro, which, as the gold standard once did, has created a straitjacket that makes recovery harder.
All of these challenges suggest that a second depression or some prolonged period of economic disappointment and hardship is no longer implausible, as it seemed for most of the past half-century. The mastery of economic activity we thought we had achieved—not in the sense that we could eliminate all business cycles or financial panics, but in the more limited way that we could avoid pervasive instability—can no longer be taken for granted. The mistake, popularized largely by economists, was to believe that regulation of the economy could be derived from theory and converted into practical precepts for policy. The reality is that economic life is not solely described or dictated by rhythms suggested by economic models. It moves in response to institutions, technologies, beliefs, and cultures that follow their own logic, sometimes with completely unexpected, mystifying, and terrifying consequences.
* * *
Robert J. Samuelson writes a regular column for The Washington Post and is the author most recently of The Great Inflation and Its Aftermath: The Past and Future of American Affluence (2009).