The Mission: Waging War and Keeping Peace with America’s Military. By Dana Priest.
W.W. Norton & Company. 429 pages. $26.95
In this engaging book, Dana Priest, a reporter for the Washington Post, explores how the American military unexpectedly came to play a lead role in making foreign policy during the Clinton administration. The Defense Department’s commanders in chief — “cincs” in dod jargon — essentially became prefects, representing American interests — and power — over vast regions of the globe.
To understand how this happened, one first must see the world through dod eyes. The Defense Department divides the globe into areas of responsibility, or commands. Each is assigned to a four-star admiral or general who directs the units from the Army, Navy, Air Force, and Marine Corps assigned to his command. Until 2002, the Defense Department referred to these generals and admirals as “commanders in chief,” or “cincs,” pronounced like the kitchen fixture. That year the department changed the title to “combatant commander” to avoid confusion with the president, who is commander in chief of the armed forces under the Constitution. Even so, writers and military personnel alike still often use “cincs” out of habit, without meaning to invoke the old title. In effect, the acronym has taken on a life of its own.
In covering the activities of several cincs, Priest provides a first-hand account of how U.S. armed services are dealing with the new kinds of threats the nation now faces. In the Cold War, U.S. forces could focus on a single, predictable adversary: the Soviet Union. The most likely conflict was predictable, too — a Red Army invasion of Western Europe, possibly leading to a nuclear exchange.
Not so today. Current threats may be less apocalyptic than the potential explosion of tens of thousands of megatons’ worth of thermonuclear warheads, but they often seem more complex and harder to manage. Priest rides along with American troops supporting Colombia in its war against drug traffickers and in peacekeeping operations in the Balkans. She describes the 1999 U.S. intervention aimed at stopping Serbian troops from ethnic cleansing in Kosovo. And she follows the U.S. operation to eliminate al Qaeda bases and the Taliban regime in Afghanistan after the September 11 terrorist attack.
The result is a highly readable account of how U.S. forces prepare for these new threats. Priest began writing her book in the middle of the Clinton administration and completed it a few months after the terrorist attacks of September 11, 2001. One gets the feeling that Priest had to make major adjustments after George W. Bush won the November 2000 election. Had Al Gore been the victor, the story line probably would have run in a single, continuous stream. Gore presumably would have kept two key features of the Clinton defense policy: a gradual approach to military transformation, and the use of military forces for “nation-building” — that is, humanitarian aid, peacekeeping operations, and generally showing the American flag in a region to promote free, democratic government.
Alas, when Bush won the presidency, Priest had a new element to incorporate. Bush had campaigned against the Clinton administration’s predilection for nation-building; he and his advisors argued that U.S. forces had been overextended and were being wasted in irrelevant operations that made them less prepared for more important conflicts. Bush also believed that Clinton was indecisive when he did deploy the troops. Bush’s future national security advisor, Condoleezza Rice, gave an assessment in a speech at the gop convention in August 2000. She described how Bush would serve as commander in chief:
He recognizes that the magnificent men and women of America’s armed forces are not a global police force. They are not the world’s 911. They are the strongest shield and surest sword in the maintenance of peace. If the time ever comes to use military force, President George W. Bush will do so to win — because for him, victory is not a dirty word.
Rice’s comment would take on a different resonance 407 days later, because it was precisely 9-11 that made everyone rethink how U.S. forces would be used. The September 11, 2001 attacks showed how terrorist organizations could use failed states like Afghanistan and unstable countries like Sudan, Somalia, and Pakistan as bases for their operations. After 9-11, Bush became the most aggressive nation-builder of all — even as the task became more challenging. The Bush administration was determined to fix these trouble spots, even if it meant using military forces in non-combat roles and deploying them in prolonged campaigns.
The post of cinc had originated in World War ii, when American and British military planners organized their war against the Axis powers into regional campaigns. Each region — Europe, the Pacific, Southwest Asia, and so on — was assigned to a top-ranking flag officer responsible for fusing ground and naval units, often drawn from several countries, into a unified force. The United States continued this approach into the Cold War.
The main role of a cinc was always to command, but war planning at the strategic level usually involves high-level diplomacy. Also, there are massive logistics requirements for operations on this scale — bases, transportation, contracting with locals for supplies, and so on. So inevitably cincs (and their staffs, which can number in the thousands) find themselves running what are, in effect, regional programs for economic development and civil affairs.
These facts, combined with the end of the Cold War and U.S. domestic politics, gave cincs a new role and stature in the 1990s. Most Americans turned away from foreign affairs after the fall of the Berlin Wall. Congress squeezed the State Department budget. Indeed, American foreign relations often seemed on autopilot during the decade — even as the world became more complex.
The cincs filled the vacuum by default, since they were often the highest-ranking U.S. officials active regionally, and they had to keep preparing for military contingencies in any case. As Priest explains, “In a decade when Congress significantly slashed money for diplomacy, the cincs’ headquarters had grown to more than twice their Cold War sizes. With a combined budget of $380 million a year, their resources were lavish compared to the civilian agencies that by law and tradition were supposed to manage U.S. foreign relations.” The cincs had become, in effect, the most visible representatives of the United States in many parts of the world and the dispensers of American largess.
Priest devotes just over half of the book to her adventures accompanying U.S. forces in peacetime operations in Nigeria, Indonesia, and Colombia, in peacekeeping operations in Bosnia, and in the campaign that pressured the Milosevic regime out of Kosovo. Her accounts offer some interesting vignettes of how American foreign policy evolved in this era. She describes, for example, how Marine Corps General Anthony Zinni, head of Central Command in 1999, would often use his past relations with Pakistani officers to troubleshoot U.S. relations on the subcontinent. The task was all the more dicey because Zinni had to avoid getting crosswise with his boss, Defense Secretary William Cohen, or overstepping his own legal authority. Priest reports Zinni saying, “You have to be radical, give the cincs more authority. You have a lot of ‘suits’ [civilian officials] running around who wouldn’t want to give cincs political authority. But we already have it.”
Priest’s reporting is a great read, but one cannot help but wonder if all the access the general officers granted her occasionally left her blinded by the glint of four-star insignia. For example, $380 million may seem like an enormous amount of money, but in the Defense Department it is almost chump change — about one tenth of one percent of the entire defense budget. The big bucks are in operations and acquisition, and moving those funds in a different direction was the harder task.
Which brings us to the other major factor Priest had to accommodate after the 2000 election: Donald Rumsfeld and his policy of defense transformation. The Clinton administration and Cohen had deliberately taken an incremental, go-slow approach to modernizing the American military. Cohen was concerned about the cost and risky technologies that would accompany radical change. Besides, the political mantra in the late 1990s was “preserve the budget surplus,” and that naturally limited the money available for any modernization plan.
Rumsfeld, by contrast, believed that U.S. forces had to make greater use of modern technology — in particular, information technology. Any other course would fritter away America’s unique advantage over its rivals and would leave U.S. forces unprepared for new threats. Alas, many officers resisted change, and so most of the news coming out of the Pentagon during the first eight months of 2001 was about conflicts between the new secretary and the uniformed military. The Pentagon bureaucracy nickel-and-dimed the new secretary’s plans. It is easy to forget today, but by the summer of 2001 many pundits were writing that Rumsfeld would not even survive until the next election.
In fact, squabbling between civilian officials and top officers is an old story, and generals and admirals had resisted the Clinton administration’s policy of nation-building at least as vigorously as they resisted the Bush administration’s new policy of defense transformation. It’s tempting to say simply that military organizations are instinctively conservative, but there is more to it than that. There are incentives and culture at work, and it is important to understand these if one hopes to understand the challenge of preparing the U.S. military for the new era.
The origin of military organizations dates back 3,000 years, to the first recorded armies of ancient Persia. The Persians discovered that by fighting as a hierarchical, orderly group rather than as a mere rabble, soldiers could both concentrate their striking power and protect each other. Firearms and mechanization increased the speed and distance of warfare, but the basic features of an army have remained the same over time and from country to country: chain of command, formation deployment, and group maneuver.
This was why armies drilled. The military drill trained individual soldiers and smaller units to move and act as part of a larger unit and plan. Drilling and mustering an army into formation before battle was also good for motivation. Once the fighting began, any single soldier who strayed was apt to be picked off and killed (or shot as a deserter). Even after 3,000 years, this modus operandi — massing forces into a unified strike — is almost instinctive for army officers. As proof, consider Lieutenant General Fred Franks, Commander of vii Corps in Operation Desert Storm. Franks delayed attacking the Iraqi Republican Guard at a crucial stage of Desert Storm, when it was within easy striking distance of some of his forces. This, according to his critics, allowed the Guard to escape intact — and survive to fight 12 years later.
But, in reality, Franks was simply reflecting the conventional wisdom of an Army officer, which says a commander should wait until he can cock his forces and fire an all-out, synchronized strike into the enemy. As Franks told a Frontline interviewer five years after the war:
I wanted to hit the Republican Guards with a three-division fist. I didn’t want to poke at them with fingers — in other words, piecemeal commitment of my forces. I wanted to hit them with a fist at full speed.
When asked, “What happens to people who attack in fingers rather than fists?” Franks replied:
What you get is, you get piecemeal commitment, you get lack of coherence in the attack or lack of synchronization of fires with maneuver forces, with ground and air coordination, and what you get is you get a chance, a probability of increased casualties, you get the probability of an attack that starts and stops and starts and stops, loses momentum, and you get all those things that you really don’t want.
Franks’s outlook is typical among Army officers. Indeed, in 1991, he was promoted to the rank of full general and went on to command the Army’s Training and Doctrine Command (tradoc), holding the post through November 1994. In effect, this assignment validated his judgment in Desert Storm.
Throughout the 1980s, the so-called military reform movement challenged this view. These analysts and military thinkers argued that the traditional synchronization was too constraining. They argued for “maneuver warfare,” emphasizing smaller, more agile units that could quickly seize opportunities on the battlefield as they appeared — even if that meant they would break out of a synchronized formation. The traditionalists and the reformers debated their ideas throughout the decade. Then, in the 1990s, cheap microprocessors and the internet settled the question.
With the new technology, almost anyone on the battlefield could trade data with anyone else. Today it is common for military commanders at all levels to communicate via tactical intranet chat rooms or to download imagery directly from servers. This communications technology, combined with new sensors and precision-guided weapons, led to the idea of “network-centric” warfare, in which even small units — like special operations forces — could pack a lethal punch because they could pool data, locate their targets, and then direct fire from hundreds or even thousands of miles away.
Network-centric warfare was part of the transformation Rumsfeld and his advisors pushed. But many military officers — especially in the Army — refused to believe that the new technology was as miraculous as some claimed. This was understandable; they were the ones on the front line. To a platoon commander stationed in Korea or preparing to deploy to the Middle East, a Washington analyst promising that a microchip would make a soldier 10 times more capable sounded much like the Iranian mullahs who claimed that their prayer scarves would ward off Iraqi nerve gas attacks. Little wonder the traditionalists pushed back.
As Priest describes, however, Afghanistan vindicated this new form of combat (as did Operation Iraqi Freedom a year later). Relatively small numbers of U.S. forces crushed their adversaries on the battlefield, suffering few casualties in the process. Alas, these tactics, and even much of the technology, are available to our adversaries as well — as al Qaeda and the Iraqi fedayeen have demonstrated in their own use of networked military operations.
Yet the complications in Iraq after the Baathist regime fell illustrated how victory on the battlefield is often only the start of a larger challenge, and this is where Priest connects the controversies over nation-building in the 1990s with the wars the United States fought in the following decade. It turned out that many of the skills the Clinton Pentagon hoped would avert conflict were the same ones the Bush Pentagon needed to secure victory.
It was easy to view the Clinton administration’s efforts at nation-building as charity and, given some of the ideological leanings of the officials, they probably were. The early Clinton administration, remember, saw the Defense Department as a fount of technology development funding for “growing the economy.” So critics could be forgiven if they thought nation-building was an effort to convert the Defense Department into a foreign-oriented Department of Health and Human Services.
Priest argues, though, that today such nation-building activities must be an integral part of military planning. Writing soon after the American liberation of Afghanistan from the Taliban, she observes:
Much of the ambivalence in foreign relations affecting Clinton’s presidency reflected a larger angst among civilian and military leaders over the proper role of the military in peacetime. On the surface, it might appear as if this question has been settled. September 11 provided a sudden clarity of purpose for the U.S. military and its leaders. President Bush defined it as “good versus evil.” I beg to differ.
Although the war against Al Qaeda in Afghanistan was clear in purpose, we are now seeing that the hardest, longest, and most important work comes after the bombing stops, when rebuilding replaces destroying, and consensus-building replaces precision strikes.
As we are discovering in Iraq today, nation-building will likely be “the operation that comes after the operation” — the process of ensuring law and order and a functioning government after we win a war. This is especially true given the nature of modern U.S. military operations. As John Peters has noted, unlike traditional warfare, which often devastates the opponent’s society and destroys its army, modern warfare “in the American style” usually leaves the society, and even much of the economy and political structure, intact.
If we expect to use our new forces effectively, we will need these new capabilities — rapidly deployable police forces with non-lethal weapons and effective plans for civil affairs and psychological operations, for example. In effect, defense transformation is turning the need for nation-building on its head. Nation-building is essential not just to avoid the fighting, but to secure the payoff once the war is over. Pick almost any potential future conflict — Iran, North Korea, West Africa — and you see the need. If the U.S. military cannot meet the requirement, one has to wonder: Who will?