Going Our Way: A New Foreign Policy

Author: Robert Schmuhl ’70

War is always a bloody interlude. Before the fury comes the triggering rationale — and afterward the consequences of scarring change.

The age-old pattern repeated itself last spring in Iraq. Beyond this theater of war, however, a related drama — with words as weapons — continues to play out nationally and throughout the world.

Is the policy that led the United States to take military action in the Persian Gulf based on defensive security or offensive superiority? Will America be measurably safer in years to come — or be more vulnerable to adversaries provoked by this new strategy?

Doubt and uncertainty followed the hostilities in Iraq for one prime reason. U.S. forces struck first in the campaign to remove Saddam Hussein and his government from power. Though assisted by British and Australian contingents, American might — dazzling in its high-tech sophistication, frightening in its earth-bound execution — brought an end to savage despotism but did so without authorization by the United Nations or approval from many traditional U.S. allies. Moreover, the principal justification for war was itself something of a moving target. Eliminating weapons of mass destruction, preventing the development of nuclear armament capability, producing “regime change” and ending Iraqi assistance to terrorist organizations were among the reasons cited for combat before hostilities began.

Success, especially swift elimination of the tyrannous authority ruling Iraq, can obscure concern over the justification of military action. It shouldn’t, though, because this use of force reflects a way of thinking and acting much greater in its international implications than the fate of any one regime on the receiving end at a given time. How this policy took strategic shape before its first implementation tells a story about presidential decision-making and our times that would have been difficult to foresee just two years ago.

For most of the 1990s and through the summer of 2001, domestic concerns preoccupied people in the United States. What was happening abroad merited only occasional media attention. International affairs played a minor role in the presidential elections of 1992, 1996 and 2000. When George W. Bush campaigned for the White House in 2000, he emphasized the virtue of humility in dealing with other countries and pointed out the dangers of nation-building. In spring 2001, former Secretary of State Henry A. Kissinger captured the prevailing national mood by asking a question as the title of a new book: Does America Need A Foreign Policy?

Everything changed on September 11, 2001. Zealots from distant lands turned commercial airplanes into makeshift missiles that killed nearly 3,000 people on U.S. soil. That evening Bush (in the presidency only seven months) called the response he envisioned a “war against terrorism” and vowed: “We will make no distinction between the terrorists who committed these acts and those who harbor them.”

Nine days later, in an address to a joint session of Congress, the president amplified his thinking on the relationship between perpetrators of terror and countries abetting their malevolence. “We will starve terrorists of funding, turn them one against another, drive them from place to place, until there is no refuge or no rest,” he said. “And we will pursue nations that provide aid or safe haven to terrorism. Every nation, in every region, now has a decision to make. Either you are with us, or you are with the terrorists. From this day forward, any nation that continues to harbor or support terrorism will be regarded by the United States as a hostile regime.”

Although the explicit target for most of Bush’s speech was Afghanistan and its terrorist-accommodating Taliban movement, the more encompassing words of warning suggested the possibility of a larger strategy. In retrospect, toppling the Taliban and destroying the terrorist training camps in Afghanistan — actions widely supported by the world community — served as a first step down a road that quickly took a sharp turn away from a definite goal.

The turn came on January 29, 2002, during the president’s first State of the Union address. With the new leader of “liberated” Afghanistan — Hamid Karzai — in attendance, Bush praised “the might of the United States military,” deployed 7,000 miles away, but he moved shortly thereafter to identify three countries never directly associated with September 11 that could at some point prove dangerous to “America or our friends and allies.”

Branding North Korea, Iran and Iraq “an axis of evil,” he trained his rhetorical fire on the weaponry at their disposal and the risks they posed. In language both personal and provocative, the president went further than before in articulating a policy of direct action: “We’ll be deliberate, yet time is not on our side. I will not wait on events, while dangers gather. I will not stand by, as peril draws closer and closer. The United States of America will not permit the world’s most dangerous regimes to threaten us with the world’s most destructive weapons.”

Talk of an “axis of evil” now competed with “a war against terrorism,” perplexing public thinking at home and abroad. The focus on tracking down those responsible for September 11 became blurred, with the president expanding on potential enemies as well as those currently in the crosshairs.

Why, people wondered, link these three countries — Iran and Iraq had fought a protracted war in the 1980s, while North Korea has long isolated itself from the rest of the world — in an unholy trinity of evil without immediate relevance to the military action under way in Afghanistan? Whether intended as a “hit list” or as a formal warning, the explicit connections among the trio lacked immediately identifiable coherence.

Speaking at West Point graduation ceremonies June 1, 2002, Bush built on his earlier statements to provide a more encompassing strategic framework for America’s engagement in international affairs. Specific as to purpose without naming particular countries, he set forth what he called “new thinking” to confront “new threats.”

Looking back to the Cold War with its reliance on deterrence and containment to control the aggression and ambition of “imperial communism,” the president pronounced those strategies anachronistic in a time of terrorists. “Deterrence — the promise of massive retaliation against nations — means nothing against shadowy terrorist networks with no nation or citizens to defend,” he said. “Containment is not possible when unbalanced dictators with weapons of mass destruction can deliver those weapons on missiles or secretly provide them to terrorist allies.”

With deterrence and containment no longer central tenets, Bush proposed a more direct approach: “We must take the battle to the enemy, disrupt his plans and confront the worst threats before they emerge. In the world we have entered, the only path to safety is the path of action. And this nation will act.”

Shortly after asserting this general principle for dealing with perceived threats, the president became more precise about what he had in mind. Three consecutive sentences began with the phrase “Our security” and went on to advocate modernization in intelligence, domestic agencies (like the FBI) and the military. He concluded the passage with a more detailed and eye-opening statement: “And our security will require all Americans to be forward-looking and resolute, to be ready for preemptive action when necessary to defend our liberty and to defend our lives.”

Surrounded by appeals for ensuring “our security” and defending “our liberty” and “our lives,” the concept of preemptive action entered the arena of policy debate. In the eight months since September 11, Bush had used four major speeches to articulate an approach to international affairs that became less abstract and more assertive as his thinking and that of his administration evolved.

That approach served as the foundation-setting prelude to a formal, 31-page document, “The National Security Strategy of the United States of America,” which was issued September 17, 2002.

Much of the document’s prose is predictable, championing human dignity, economic growth and the development of democracy. What’s striking, however, is the explicit rejection of the Cold War tactic of deterrence and the repeated emphasis on preemption as an option in dealing with potential dangers. Midway through the report, which Congress requires of every administration, September 11 is mentioned just before these two paragraphs:

“The United States has long maintained the option of preemptive actions to counter a sufficient threat to our national security. The greater the threat, the greater is the risk of inaction — and the more compelling the case for taking anticipatory action to defend ourselves, even if uncertainty remains as to the time and place of the enemy’s attack. To forestall or prevent such hostile acts by our adversaries, the United States will, if necessary, act preemptively.

“The United States will not use force in all cases to preempt emerging threats, nor should nations use preemption as a pretext for aggression. Yet in an age where the enemies of civilization openly and actively seek the world’s most destructive technologies, the United States cannot remain idle while dangers gather.”

The official report spelled out in detail what has become known as the “Bush Doctrine.” Despite diplomatic declamations about working with allies, friends and international institutions to achieve common objectives, America served notice on the world that (in words at the end of the document) “we will be prepared to act apart when our interests and unique responsibilities require.” By stressing preemption and suggesting unilateralism, the administration was positioning the nation to be robustly preeminent — and people here and in other countries started paying closer attention to what the United States might do.

To be sure, the American military had acted preemptively in the past — most recently in the Dominican Republic in 1965, in Grenada in 1983 and in Panama in 1989. But putting preemption at the center of a new strategic doctrine struck many observers as undue saber-rattling by the world’s only superpower — which in some circles abroad was increasingly identified as a “hyperpower,” given its economic, military, technological and cultural clout. By subordinating deterrence and stressing preemption, the United States was saying, in effect, we’ll assess any outside threats and act accordingly, including the possible use of force.

Intriguingly, a decade earlier, a draft proposal with thinking quite similar to that in “The National Security Strategy” document circulated during the final year of the first George Bush presidency — and was severely criticized within and outside the administration. Called the “Defense Planning Guidance” report, the 46-page plan discussed the option of preemptive action to maintain American primacy, regardless of the challenge from friend or foe.

Leaked to The New York Times, which ran it as the lead story in its Sunday edition of March 8, 1992, the policy proposal received extensive treatment. The thrust of the argument is clear from a few sentences of the plan: “Our first objective is to prevent the re-emergence of a new rival, either on the territory of the former Soviet Union or elsewhere, that poses a threat on the order of that posed formerly by the Soviet Union. This is a dominant consideration underlying the new regional defense strategy and requires that we endeavor to prevent any hostile power from dominating a region whose resources would, under consolidated control, be sufficient to generate global power.”

After specific areas — Western Europe, Eastern and Southwestern Asia and the region of the former Soviet Union — are identified, the report elaborates on its principal objective and moves beyond defensive concerns: “First, the U.S. must show the leadership necessary to establish and protect a new order that holds the promise of convincing potential competitors that they need not aspire to a greater role or pursue a more aggressive posture to protect their legitimate interests. Second, in the non-defense areas, we must account sufficiently for the interests of the advanced industrial nations to discourage them from challenging our leadership or seeking to overturn the established political and economic order. Finally, we must maintain the mechanisms for deterring potential competitors from even aspiring to a larger regional or global role.”

Appearing as the 1992 presidential campaign was gathering momentum, the proposal for a regimen of unrivaled superiority was assailed by Patrick Buchanan, who was challenging Bush for the Republican nomination, and several Democratic contenders. More significantly, the plan received strong opposition inside the Bush administration. A follow-up article in The New York Times three days after the story first received attention reported: “One Administration official, familiar with the reaction of senior officials at the White House and State Department, characterized the document as a ‘dumb report’ that ‘in no way or shape represents U.S. policy.’”

What’s fascinating (and relevant) in looking back at this controversial report is its resemblance to the current security strategy and its conceptual parentage. At the time Dick Cheney was serving as secretary of defense, and Paul Wolfowitz was the Pentagon’s undersecretary for policy with responsibility for developing the plan. Today, of course, Cheney is vice president with a strong say in international affairs, and Wolfowitz is deputy secretary of defense and a principal architect of the nation’s strategic thinking.

Through the Clinton administration years of the 1990s, Cheney, Wolfowitz and other officials currently serving in government considered Iraq a nagging nemesis. Indeed, in 1998, the Project for the New American Century, a Washington policy center with ties to (among others) Cheney and Wolfowitz, issued an open letter to then-President Bill Clinton, summarizing the dangers from Saddam Hussein’s staying in power and calling for action.

“The only acceptable strategy is one that eliminates the possibility that Iraq will be able to use or threaten to use weapons of mass destruction,” the letter stated. “In the near term, this means a willingness to undertake military action as diplomacy is clearly failing. In the long term, it means removing Saddam Hussein and his regime from power. That now needs to become the aim of American foreign policy.”

Out of the 18 signatories to the letter, 10 later joined the Bush administration, including Secretary of Defense Donald Rumsfeld, Deputy Secretary of State Richard Armitage and Wolfowitz. While toppling Saddam Hussein was a long-standing objective, there needed to be a triggering opportunity.

According to Bob Woodward’s insider account, Bush at War, Rumsfeld and Wolfowitz raised the possibility of military action against Iraq immediately after the al Qaeda terrorist assaults on September 11, 2001. Worried about international reaction, Secretary of State Colin Powell argued against such thinking without direct evidence of Iraqi involvement.

As Bush subsequently told Woodward, “My theory is you’ve got to do something and do it well and that . . . if we could prove that we could be successful in [the Afghanistan] theater, then the rest of the task would be easier.”

Iraq remained on the administration’s radar screen, as it considered larger security issues and threats in the post-September 11 world. Proposals (such as the thinking behind the decade-old Defense Department draft report) that previously seemed too radical, even unthinkable, now commanded attention. The post-9/11 environment, with its anxiety and fear, brought preemption and preeminence to the fore as core tenets of the Bush Doctrine.

Firing the first shot in Iraq seemed reasonable and appropriate to observers of differing viewpoints. Max Boot, a senior fellow at the Council on Foreign Relations, argued in The New York Times, “It is certainly true that preemptive wars are not the norm in history. But they are not as rare as President Bush’s critics suggest. The president’s preemption doctrine — and its first application, in Iraq — is firmly rooted in centuries of tradition.” After listing several historical precedents, Boot asks: “[W]ho today thinks it was wise of Britain and France to stay their hands in the 1930s when they could have thwarted Hitler’s ambitions early on?”

Journalist and commentator Christopher Hitchens, a self-described “contrarian” with intellectual roots in leftist ideology, stood foursquare for preemption in Iraq. Writing in his quickly published collection of “polemics” favoring force, A Long Short War, he asserted: “If the Bush administration actually went around deposing all bad guys, as the peaceniks taunt it for not doing, then that really would constitute preemption. But how preemptive is an intervention in Iraq, when undertaken to enforce a multiply reaffirmed resolution of international law? Saddam has been warned and put on notice and the entire debate on armed enforcement has been exhaustively conducted in full public view.”

Despite the sinister threat of global terrorism, the new policy impressed many commentators as overly bellicose and individualistic. Harvard professor Stanley Hoffmann criticized a doctrine that “presumes that the United States is the sole judge of the legitimacy of its own or anyone else’s preemptive strikes.” He went on to argue that “the Bush Doctrine proclaims the emancipation of a colossus from international constraints (including from the restraints that the United States itself enshrined in networks of international and regional organizations after World War II). In context, it amounts to a doctrine of global domination.”

The essayist Wendell Berry closely analyzed the text of the National Security Strategy and observed: “This document affirms peace; it also affirms peace as the justification of war and war as the means of peace — and thus perpetuates a hallowed absurdity. But implicit in its assertion of this (and, by implication, any other) nation’s right to act alone in its own interest is an acceptance of war as a permanent condition. Either way, it is cynical to invoke the ideas of cooperation, community, peace, freedom, justice, dignity, and the rule of law (as this document repeatedly does), and then proceed to assert one’s intention to act alone in making war.”

Other analysts considered the new strategic approach out of character with traditional American ideals and behavior, especially our historic reluctance to use the military except out of necessity. “There never was a good war or a bad peace,” Benjamin Franklin wrote back in 1783, a statement frequently repeated in 2003. One commentator quoted former president and World War II Allied commander Dwight D. Eisenhower’s opinion on preventive war: “I don’t believe in such a thing, and frankly I wouldn’t even listen seriously to anyone that came in and talked about such a thing.” Another recalled the words of Eisenhower’s White House successor, John Kennedy, who boldly asserted in the swelter of the Cold War: “The United States, as the world knows, will never start a war.”

Times — and thinking — change. Behind any strategy of a first-strike option is the conviction that military force is required for self-defense in the face of an imminent threat. A targeted attack on a potential enemy, it is hoped, will remove likely peril to Americans here at home or abroad. To avoid another September 11, doesn’t it make sense to do whatever’s possible beforehand?

Yet, probed more deeply, the policy raises other questions that can’t be ignored. Is the United States absolutely certain of its vulnerability in a given situation? Does the government, through its intelligence agencies, have proof-positive evidence of imminent danger? Has every other option to tame a threat been exhausted? Will going on the offensive prove to be the strongest defense to avoid greater and more protracted conflict? What harm might innocents face?

Such questions gained greater pertinence in the months following the major hostilities in Iraq, when evidence of weapons of mass destruction could not be found. A primary justification for waging war seemed suspect, and the administration’s credibility came under fire domestically and internationally.

Just 48 hours before combat began, the president told the nation (and world): “Intelligence gathered by this and other governments leaves no doubt that the Iraq regime continues to possess and conceal some of the most lethal weapons ever devised. This regime has already used weapons of mass destruction against Iraq’s neighbors and against Iraq’s people.”

Mentioning the regime’s “deep hatred of America” and its assistance to terrorists (including al Qaeda), he elevated the fear factor to a domestic concern: “The danger is clear: using chemical, biological or, one day, nuclear weapons, obtained with the help of Iraq, the terrorists could fulfill their stated ambitions and kill thousands or hundreds of thousands of innocent people in our country, or any other.”

The inability to locate these metaphorical smoking guns struck many observers as damaging to the more encompassing Bush Doctrine. Writing in late June, conservative columnist George F. Will claimed that preemption as “the core of the president’s foreign policy” was “in jeopardy” without discovery of the often-cited weapons. He went on to make his case:

“To govern is to choose, almost always on the basis of very imperfect information. But preemption presupposes the ability to know things — to know about threats with a degree of certainty not requisite for decisions less momentous than those for waging war.

“Some say the war was justified even if WMD are not found nor their destruction explained, because the world is ‘better off’ without Saddam Hussein. Of course it is better off. But unless one is prepared to postulate a U.S. right, perhaps even a duty, to militarily dismantle any tyranny — on to Burma? — it is unacceptable to argue that Hussein’s mass graves and torture chambers suffice as retrospective justification for preemptive war.”

Those are pointed words, and time might ultimately prove them premature. Yet, without indisputable assurance that danger looms, any first strike could be interpreted as unwarranted aggression or intervention from an outsider. Moreover, once combat ceases, securing the peace and rebuilding a country or area follow. This chancy work needs to take place with coherence and care so that the forces feeding the original threats don’t rekindle or take an ominously new shape.

Besides such pragmatic considerations, moral concerns weave themselves through the Bush Doctrine, as they always do in matters involving life and death. In this particular situation, philosophers and theologians who specialize in just-war theory make nuanced distinctions between preemption and preventive war.

According to ethicists, if an identified enemy poses an imminent and grave threat to a country, then a proportionate, preemptive attack possesses moral legitimacy on grounds of self-defense. By contrast, a preventive war involves less immediate danger — and provokes more ethical debate. In this case, a nation worries that another country at some point in the future could use force against it, with action now preventing possible conflict later.

Whether last spring’s action in Iraq was preemption or prevention — or something in between — remains in dispute, depending on one’s viewpoint and basis for judgment. Within America, fear that Iraqi-made weapons of mass destruction would fall into the hands of terrorists opposed to the United States made many people see the use of force as preemptive and, hence, justified. Beyond our shores, public opinion was inclined to regard what happened more critically — and even skeptically.

Although the terrorist attacks of 9/11 produced sympathy and solidarity for the United States throughout the world — a headline September 12, 2001, in the French newspaper Le Monde vowed “Nous Sommes Tous Americains” (We Are All Americans) — those sentiments proved short-lived. As the administration’s security policy received scrutiny and as talk about war in Iraq grew louder, reactions from abroad took on a sharply anti-American tone.

One statistical study, conducted by Pew Global Attitudes and released last June after the major phase of Iraqi combat, found that favorable opinion of the United States had fallen in 14 of 15 countries surveyed since 1999-2000. Positive response to America declined in Great Britain, Canada and Italy, with dramatic downturns (and less than 50 percent favorability) in South Korea, Germany, France, Spain, Brazil, Morocco, Indonesia, Turkey and Pakistan. In Indonesia, 15 percent had a positive opinion of the United States — down from 75 percent earlier. Morocco and Turkey also showed precipitous drops. The lone country registering an increase in regard was Nigeria, up to 61 percent from 46 — but 72 percent of Nigerians surveyed worried “very” or “somewhat” about a military threat from the United States.

Polls offer snapshots in numbers. Monitoring the news media in other countries through the months preceding and during the war in Iraq allowed Americans a different vantage point, in words and pictures, for viewing the nation and the policy it was pursuing.

The coverage proved soberingly revealing. For instance, William Pfaff ’49, a syndicated columnist for the International Herald Tribune, quoted “a very senior retired officer in the German Army” a few weeks before the war: “You Americans have been telling us for 60 years that we must never go to war. You have made the Germans pacifists. We have accepted that war is never a solution. We believe that even more because of our own history. Now you attack us because Germans are against this war.”

Once the fighting began, the coverage turned even more fiercely negative in portraying what the United States was doing. Especially in Arab-based media but elsewhere as well, gruesome images, focusing on civilian casualties, dominated the pages of newspapers and television screens. With our news organizations focusing on the military’s heroism and technological prowess, it often seemed as though American outlets and foreign ones were reporting on unrelated realities happening simultaneously in the same place.

Applause at home for battlefield advances was countered abroad with anxiety and anger. “The political, cultural and social point of reference that America has been, is now eclipsed in the eyes of billions of people,” editorialized one European paper. “The overwhelming impression: an imperial power is doing what it wants, regardless of its friends and its foes.”

How long such thinking might persist is anyone’s guess. Yet, beyond public opinion, there are broader, more tangible consequences the recent hostilities could have on, say, the sales of American products, the attention paid to our popular culture, the safe travel of U.S. citizens abroad and other international interaction.

What’s clear, however, is that people outside the United States now view this country with greater suspicion as a result of the Bush Doctrine. To call the reaction of many foreigners “anti-Americanism” is misleading. The animus is closer to anti-American security policy or anti-Bush administration — but over time one fears a more generalized antipathy might metastasize. Should that happen, the U.S. leadership role in the world and our influence in several realms could suffer — with a policy intended for national security becoming one that isolates or even ostracizes.

The debate before, during and after the major military activity in Iraq brought into the open America’s new approach to potential threats. In the United States, the focus remained on the specific case — removal of a murderous dictator who ruled not only with an iron fist but also with singular regard for his own fate and fortune. For people in other countries, the more abstract principle of preemptive attack as a continuing policy option overshadowed the particular, with questions and qualms dominant.

Many foreigners wondered whether the ends (toppling a tyrant) justified the means (outside intervention without direct provocation). Some critics asked more pointedly: Did the justification for war (the repeated allegations of Iraqi-made weapons of mass destruction and that country’s friendly relations with terrorists) warrant the means used to achieve those ends?

Doubts dog the new doctrine for several reasons. America, long viewed as a beacon of hope and opportunity, is now seen as less predictable and more predisposed to use its unparalleled power in a unilateral, we-know-best way. That image exacerbates the animosity reflected in public opinion, testing (if not weakening) our position in international organizations and our alliances with traditional allies.

To carry this thinking to its more ominous conclusion: Countries that consider themselves possible targets of the United States could be inclined to seek advanced weaponry, triggering a new arms race of frightening dimension.

In an ironic twist of unintended consequence, weapons of mass destruction might proliferate in response to the American strategy that seeks their control. Then, of course (and most depressingly), another country could engage in a preemptive attack against a potential enemy, citing the precedent in Iraq and using the United States as an example of conflict resolution.

These darker implications and possibilities of the first-strike doctrine deserve consideration because as a policy for governmental action its effects radiate out on a global scale. Fear about what might happen domestically in one country ends up casting an ominous shadow across the world at large.

In his commencement address at West Point last year, President Bush said, “America has no empire to extend or utopia to establish. We wish for others only what we wish for ourselves — safety from violence, the rewards of liberty and the hope for a better life.” These noble words reflect the nation’s soul and heritage, but they run the risk of sounding hollow if a doctrine of preemption and preeminence leads the United States to flex its military muscle without the certain provocation from a clear and present danger.


Robert Schmuhl is professor of American studies and director of the John W. Gallivan Program in Journalism, Ethics & Democracy at Notre Dame. His most recent book is Indecent Liberties, published by the University of Notre Dame Press.


(October 2003)