The High Price of Victory
The United States has a history of military unpreparedness. After World War I, Congress, never imagining the world conflict to come, spurned the War Department’s plan for a regular army of 500,000 men. It authorized instead a force of 280,000, and then let budgetary pressures during the 1920s keep troop levels at half-strength. After World War II, America swiftly shrank its army of 10 million down to about 552,000—a force that, General Omar Bradley judged when he inherited it in 1948, "could not fight its way out of a paper bag." The perceived weakness encouraged North Korea’s Soviet-sanctioned invasion of South Korea in 1950, bringing on the Korean War. Now, six years after the end of the Cold War, with the military reduced from 2.1 million to 1.4 million men and women, the question of preparedness has arisen once again.
Defense spending, which has fallen from more than $300 billion in 1989 to about $250 billion today, could be cut by billions of dollars more "without jeopardizing our national security," in the opinion of Lawrence J. Korb, a former assistant secretary of defense, writing in the Washington Monthly (Mar. 1997). "An objective assessment of the threat would show that we have more than enough forces to protect our interests in the Persian Gulf and on the Korean Peninsula, and that our forces are already increasing their technological advantage at the current levels of defense spending." Instead of preparing for a two-front war that almost certainly will not occur, he writes in the New York Times (May 22, 1997), the United States should aim "to be able to fight one large war while handling smaller peacekeeping operations elsewhere, with a weapons budget sufficient to maintain our technological edge." This could be done, he maintains, with a much leaner Pentagon budget.
The "Rogue Doctrine," formulated in 1989 under General Colin Powell, then chairman of the Joint Chiefs of Staff, stated that the military threats to the United States in the post-Cold War era would come from "rogue" states such as Iraq, Iran, Syria, Libya, Cuba, and North Korea. The United States thus should be able to fight and win two large regional wars simultaneously. According to the Quadrennial Defense Review, a congressionally mandated strategic blueprint issued recently by the Pentagon, the current main military force of 10 active army divisions, a dozen aircraft-carrier battle groups, and 20 air force fighter wings is enough to do that.
Some military specialists, however, question whether the United States is ready to fight even one large regional war. "The army today," assert Frederick W. Kagan and David T. Fautua, military historians at the U.S. Military Academy, "could not field the force which won the [1991 Persian] Gulf war.... Whereas the American land component of the forces that defeated Saddam Hussein comprised seven divisions, five heavy and two light (out of our then-total of 18 divisions), today, out of 10 divisions, only six are heavy, and five of these are already committed to defending American interests elsewhere around the world." Withdrawing the heavy divisions "from either Bosnia or Korea, let alone from both," they write in Commentary (May 1997), "would itself entail large costs, undermining the credibility of America’s commitments around the world and inviting instability and possibly war."
The "real Achilles’ heel" of the tworegional-wars strategy, Harry G. Summers, Jr., a retired army colonel and syndicated columnist, writes in Orbis (Spring 1997), is that it has been "seriously underfunded, with estimates of the shortfall ranging from $150 million to $200 billion. But instead of facing that fact, America [has] tried to wish it away." One way of doing that, "a favorite of the defense contractors, [is] to argue that high technology could substitute for manpower." But soldiers are not going to be rendered obsolete, Summers says. America, in his view, must make up its mind whether its national interest and international obligations require "a Cold War-type military with a relatively large standing army" or not.
In reality, Kagan and Fautua argue, the U.S. role in the world "is as extensive as ever, and there is no reason to think it will soon diminish." During the Bush and Clinton administrations, "we have dispatched troops abroad more often than we did during the previous 20 years under Presidents Reagan, Carter, Ford, and Nixon." President Bush sent soldiers to Panama and Somalia, as well as to the Persian Gulf, while President Clinton sent armed forces to Haiti, Bosnia, the Persian Gulf again, and the seas around Taiwan.
These far-flung missions are not only stretching the army thin but robbing it of its war-fighting edge, Kagan and Fautua argue. Peacekeeping and war-fighting demand very different skills and qualities, and the army today is heavily involved in the former.
Because manpower is very expensive, especially without a draft, the army makes an attractive target for budget-cutters, observes historian Donald Kagan, of Yale University, also writing in Orbis. (The recent Quadrennial Defense Review report calls for a four percent reduction in active-duty troops.) But the temptation must be resisted, he says. More money than is now budgeted or anticipated will be needed. With the Cold War over, America today is in an immensely favorable situation in the world, Kagan notes, and its "most vital interest...is maintaining the general peace." But, he adds, it is a common mistake to assume "that peace is natural and can be preserved merely by having peace-seeking nations avoid provocative actions. The last three-quarters of the 20th century strongly suggests the opposite conclusion: major war is more likely to come when satisfied states neglect their defenses and fail to take an active part in the preservation of peace."
Yet modern democracies find it hard to maintain their commitment to deterrence. Kagan writes: "If there is no war and no immediate threat in sight, opponents of the policy will denounce it as an unnecessary expense diverting resources from more desirable causes. They will regard the peaceful international situation as natural and unconnected to what has helped produce it: the effort and money expended on military power."
Such a commitment will be possible, Kagan concludes, only after "a full national debate, followed by the adoption of a grand strategy of continued engagement in the new constellation of international relations." Critics who favor "a less interventionist United States," such as Eric Alterman, a senior fellow at the World Policy Institute, writing in World Policy Journal (Summer 1996), would be heard. So would proponents of humanitarian intervention, such as Robert I. Rotberg, president of the World Peace Foundation, and Thomas G. Weiss, of Brown University. As they write in From Massacres to Genocide (1996), humanitarian interventionists believe that the United States should regard its national interest as "genuinely threatened by instability and strife wherever in the world they occur."
Eliot A. Cohen and A. J. Bacevich, both of Johns Hopkins University’s Nitze School of Advanced International Studies, agree on the necessity of a national debate. "The uncomfortable fact," they write in the Weekly Standard (Mar. 3, 1997), "is that the United States has become a global hegemon, its soldiers members of a constabulary enforcing a Pax Americana. It may be awkward or disconcerting to admit as much to ourselves, let alone to others, but to pretend otherwise will serve in the long run only to confuse citizens and soldiers alike. As a result, the nation is sorely in need of a new public discourse appropriate to the grand strategic enterprise to which the United States has tacitly committed itself."