For well over 30 years now, the United States military has been intensively engaged in various quarters of the Islamic world. An end to that involvement is nowhere in sight.

Tick off the countries in that region that U.S. forces in recent decades have invaded, occupied, garrisoned, bombed or raided and where American soldiers have killed or been killed. Since 1980, they include Iraq and Afghanistan, of course. But also Iran, Lebanon, Libya, Turkey, Kuwait, Saudi Arabia, Qatar, Bahrain, the United Arab Emirates, Jordan, Bosnia, Kosovo, Yemen, Sudan, Somalia and Pakistan. The list goes on.

To judge by various official explanations coming out of Washington, the mission of the troops dispatched to these various quarters has been to defend or deter or liberate, punishing the wicked and protecting the innocent while spreading liberal values and generally keeping Americans safe.

What are we to make of the larger enterprise in which the U.S. forces have been engaged since well before today’s Notre Dame undergraduates were even born? What is the nature of the military struggle we are waging? What should we call it?

For several years after 9/11, Americans referred to it as the Global War on Terrorism, a misleading term that has since fallen out of favor.

For a brief period during the early years of the George W. Bush administration, certain neoconservatives promoted the term World War IV. This never caught on, however, in part because, unlike other major 20th century conflicts, it found the American people sitting on the sidelines.

With interventions in Iraq and Afghanistan dragging on inconclusively, some military officers began referring to what they called the Long War. While nicely capturing the temporal dimension of the conflict, this label had nothing to say about purpose, adversary or location. As with World War IV, the Long War never gained much traction.

Here’s another possibility. Since 1980, back when President Jimmy Carter promulgated the Carter Doctrine, the United States has been engaged in what we should rightfully call America’s War for the Greater Middle East. The premise underlying that war can be simply stated: with disorder, dysfunction and disarray in the Islamic world posing a growing threat to vital U.S. national security interests, the adroit application of hard power would enable the United States to check those tendencies and thereby preserve the American way of life.

Choose whatever term you like: police, pacify, shape, control, dominate, transform. In 1980, President Carter launched the United States on a project aimed at nothing less than determining the fate and future of the peoples inhabiting the arc of nations from the Maghreb and the Arabian Peninsula to the Persian Gulf and Central Asia.

Since the end of World War II, American soldiers had fought and died in Asia. Even when the wars in Korea and Vietnam ended, U.S. troop contingents continued to garrison the region. In Europe, a major U.S. military presence dating from the start of the Cold War signaled Washington’s willingness to fight there as well. Prior to Carter’s watershed 1980 statement, no comparable U.S. commitment toward the Islamic world existed. Now that was going to change.

Only in retrospect does this become clear, of course. At the time President Carter declared the Persian Gulf a vital national security interest — that was the literal meaning of the Carter Doctrine — he did not intend to embark upon a war. Nor did he anticipate what course that war was going to follow — its duration, costs and consequences. Like the European statesmen who a hundred years ago touched off the cataclysm we know today as World War I, Carter merely lit a fuse without knowing where it led.

Partly for domestic political reasons (1980 was a presidential election year) and partly to counter perceptions that America itself was flagging (the Iran hostage crisis was a continuing embarrassment), Carter wanted to make a show of drawing a line in the sand. Along with plenty of oil, the Persian Gulf region had plenty of sand. Here it seemed, prompted by the successive surprises of the Iranian Revolution and the Soviet invasion of Afghanistan, was the place to draw that line.

Neither Carter nor his advisers foresaw what awaited 10 or 20 years down the line. They were largely clueless as to what lay inside the Pandora’s box they insisted on opening. But what they and their successors in government found there prompted them to initiate a sequence of military actions, some large, some small, that deserve collective recognition as a war. That war continues down to the present day.

Look closely enough and the dots connect. Much as, say, the Berlin Airlift, the Korean War, the Cuban Missile Crisis and the invasion of Grenada (among many other events) all constitute episodes in what we call the Cold War, so, too, do seemingly disparate events such as the Beirut bombing of 1983, the “Black Hawk Down” debacle of 1993 and the Iraq invasion of 2003 (among many others) all form part of a single narrative. Acknowledging the existence of that narrative — seeing America’s War for the Greater Middle East whole — is a prerequisite to learning.

Let me state plainly my own overall assessment of that war. We have not won it. We are not winning it. And simply pressing on is unlikely to produce more positive results next year or the year after — hence, the imperative of absorbing the lessons this ongoing war has to teach. Learning offers a first step toward devising wiser, more effective and less costly policies.

The “10 theses” that follow constitute a preliminary effort to identify the most important of those lessons.

First, the center of gravity.

Devised by the Prussian military theorist Carl von Clausewitz, the phrase center of gravity refers to that factor upon which a war’s outcome ultimately turns. The center of gravity could be the enemy army or capital. It could be some critical resource or piece of valuable terrain.

Correctly identifying the center of gravity doesn’t guarantee victory. But at least you’ve an inkling of how a war might be — or even whether it can be — won.

The War for the Greater Middle East began in the desert, with President Carter’s failed 1980 attempt to free the Americans held hostage in Iran. Since then the war has featured campaigns in other remote and desolate places.

That said, Americans ought to have learned by now that in their War for the Greater Middle East, the key terrain is urban. In this contest, outcomes turn on what people think and believe. What matters most is not killing adversaries — U.S. forces know how to do that — but influencing populations. People constitute this war’s center of gravity.

Here, the United States labors under massive disadvantages. When American soldiers venture into this key terrain they do so as alien intruders. They arrive in cities like Baghdad or Kabul as heirs of a Western civilization that has seldom acted to further the well-being of Muslims. U.S. efforts are unavoidably tainted by the prior actions of Europeans who in attempting to incorporate the Middle East into their own empires made such a hash of things.

Like it or not, we are the successors of these imperialists. Washington’s insistence that U.S. intentions today differ from and are more benign than those of, say, Great Britain a century ago invites only incredulity from the Islamic world. This is especially the case when we stride into Iraq or Afghanistan arm-in-arm with our British cousins.

To our ears, the phrase “Anglo-American” conjures up glorious memories of a partnership forged to liberate an enslaved continent. Back in June 1944, General Dwight D. Eisenhower summoned the Anglo-American forces under his command to embark upon what he unabashedly referred to as a “Great Crusade.” When President George W. Bush, in an unscripted moment, referred to the Iraq War as a “crusade,” he was alluding to those memories.

Islamic residents of the Middle East will inevitably assess “Anglo-American” purposes somewhat differently. The crusades their forebears experienced and that remain part of their shared consciousness drew their inspiration not from a desire to free but from a determination to conquer.

Can the United States nudge the people of the Islamic world to think as we think? To share our views of God, freedom, family, identity and the purpose of life? If not, our War for the Greater Middle East is doomed to fail.

Second, the role of technology.

As the Cold War was winding down and the War for the Greater Middle East was heating up, leading members of the U.S. national security elite persuaded themselves that technology was transforming the very nature of war. Fast computers and precision-guided weapons promised to enhance by orders of magnitude the efficacy of force. Since technological superiority is ostensibly an American strong suit, this high-tech approach to warfare promised to endow U.S. forces with a decisive edge against any and all adversaries.

The United States has now fully tested that proposition, trying various approaches in various places. Except in the narrowest tactical terms, it has proven to be utterly false. In the Greater Middle East, the preferred American style of warfare possesses limited political utility — a verdict that the Obama administration’s preference for missile-firing drones and commandos has not overturned.

The Pentagon’s response to this problem — for example, imbuing troops with a modicum of “cultural sensitivity” prior to their deployment — offers an example of too little too late. While he was secretary of defense, Donald Rumsfeld touched off a minor furor by remarking that “you go to war with the army you have — not the army you might want.” Thirty-some years into the War for the Greater Middle East, the army we have still has not adapted itself to what fighting that war entails. Nor have the Navy and the Air Force, likewise enthralled with technology.

Third, strategy.

In present-day Washington, strategy — the principled application of power to achieve objectives of first-order importance — has become a lost art.

Any strategy worthy of the name sees around the curve. It anticipates or at least can accommodate the unexpected. Strategy expands the range of plausible and affordable options. In short, it creates choices.

What passes for U.S. strategy in the War for the Greater Middle East has been almost entirely reactive. Rather than principled, it has been opportunistic. It has repeatedly failed to anticipate second-order consequences.

Instead of creating choices, it has fostered the sense that the United States has no alternative but to press on, vaguely hoping persistence will produce a different result. This hope has led us to a dead end, although few in policymaking circles seem aware of that fact.

During the Cold War, the United States made many costly mistakes. That said, the concept of containment did provide at least a basic framework for a sound strategy. Thirty-some years after it began, we cannot make a similar statement regarding America’s War for the Greater Middle East. Opportunism and ad hoc reactions have prevailed.

And yet feasible strategic alternatives do exist. For example, what supposedly endows the Persian Gulf with such huge importance is the concentration of world oil reserves in that region. But what if — as appears to be the case — the United States stands on the brink of something approximating energy self-sufficiency? Should that not affect the reigning perception that the American way of life is somehow tied to the fate of Saudi Arabia and its oil-rich neighbors? That should create options, but thus far policymakers seem oblivious to the possibility.

Fourth, the national security apparatus.

I refer here to the sprawling network of institutions that emerged in the aftermath of World War II and has since grown like Topsy. Today that network centers on the Department of Defense and the so-called “intelligence community.” Yet it also includes select congressional committees, think tanks, advocacy groups, lobbies, defense contractors, certain academic programs and even specialized publications, all of them devoted to the proposition that “national security” must remain priority number one.

When it comes to fresh ideas, this vast realm has become a dead zone. It exists chiefly to perpetuate itself. In his recent memoir, former Secretary of Defense Robert Gates complains about the unresponsiveness of the national security bureaucracy, even when called upon to meet the pressing needs of troops during wartime.

In his time, Secretary of Defense Rumsfeld made the same point. Here he is speaking to Pentagon employees on — note the date — September 10, 2001:

_“The topic today is an adversary that poses a threat, a serious threat, to the security of the United States of America. This adversary is one of the world’s last bastions of central planning. It governs by dictating five-year plans…. With brutal consistency, it stifles free thought and crushes new ideas. It disrupts the defense of the United States and places the lives of men and women in uniform at risk.

“Perhaps this adversary sounds like the former Soviet Union, but that enemy is gone: our foes are more subtle and implacable today. You may think I’m describing one of the last decrepit dictators of the world. But [this] adversary’s closer to home. It’s the Pentagon bureaucracy.”_

Nothing that has occurred in the years since then reduces the salience of his critique. The national security apparatus suffers from seemingly irreversible sclerosis.

Of course, things look different to those who labor within the bowels of the national security apparatus. There, efforts to streamline are always underway.

Not long ago, Secretary of Defense Chuck Hagel announced plans to reduce the size of his own staff. At present the Office of the Secretary of Defense has 2,400 employees. Hagel’s senior staff currently includes a deputy secretary, an executive secretary, five undersecretaries, six deputy undersecretaries, 15 assistant secretaries and five principal deputy assistant secretaries. Hagel intends to reduce staffing by a whopping 200 spaces. He’s given his people until 2019 to figure out how to do just that. Don’t hold your breath.

Fifth, generalship.

In any war, large or small, superior generalship alone does not guarantee a successful outcome. If it did, Richmond, Virginia, would today be the capital of an independent nation.

Yes, competent generals increase the odds of coming out on top, while inept ones can all but singlehandedly turn a potentially winning hand into a losing one. The American Civil War lasted as long as it did for several reasons. Prominent among those reasons was the quality of leadership under which the Army of the Potomac suffered when commanded by the likes of Irvin McDowell, George B. McClellan, Ambrose Burnside and Joseph Hooker.

We should not assume that present-day U.S. military leadership is all that much better. Taken as a whole, the performance of senior U.S. officers during the decades of America’s War for the Greater Middle East has been decidedly mixed. This is true even with regard to that elite group of top commanders sometimes referred to as “savior generals.”

Operation Desert Storm’s H. Norman Schwarzkopf along with more recent worthies such as Tommy Franks, David Petraeus and Stanley McChrystal were once mentioned in the same breath with George S. Patton. With the passage of time, however, the achievements that earned them wide renown have lost their luster.

Today, 1991’s Desert Storm seems less like a historic victory than an opening act to a chapter in U.S. military history most Americans would prefer to forget. No one thinks of Tommy Franks as the “liberator of Baghdad,” as they did ever so briefly a decade ago. As for Petraeus’s famous “surge” in Iraq and the Afghan equivalent engineered by McChrystal, whatever they once appeared to achieve has since become unstuck. The United States did not “win” the Iraq War; it merely bequeathed to the people of Iraq a war it fecklessly began but failed to finish. A similar outcome is likely in Afghanistan. We will leave. The war there will continue. George Patton would be decidedly unimpressed at such demonstrations of superior generalship.

Far more troubling than the limited achievements of generals once said to be saviors is the parade of high-ranking officers occupying positions of great responsibility who never came close to delivering salvation. In recent decades, the officers assigned to serve as chairman of the Joint Chiefs of Staff — the military establishment’s top post — have more often than not been eminently forgettable. To judge by the course of recent U.S. military policy — comparing intentions and expectations with outcomes — in advising the president and secretary of defense, the JCS chairman has either given lousy advice or, if he had sound advice to offer, failed to make himself heard. No doubt the officers holding this office meant well. They just didn’t do well.

Much the same verdict can be rendered of the various field commanders assigned to conduct the campaigns comprising America’s War for the Greater Middle East. In expeditions undertaken in at least a dozen countries, senior U.S. commanders conclusively achieved U.S. political objectives — a workable definition of victory — on no more than two occasions. After the first of those, the Kosovo War of 1999, as soon as the shooting stopped President Bill Clinton sacked the general in charge for serious errors of judgment. The second occasion, the May 2011 operation that killed Osama bin Laden, was hardly larger in scope than a police raid.

The armed services know how to grow first-rate sergeants and captains. Their apparent inability to do the same when it comes to identifying, developing and selecting officers for the top jobs is troubling.

Nor is the problem one that the officer corps itself is likely to fix. That would require first admitting that a problem exists, something the current crop of four-star generals and admirals is unlikely to do. After all, existing arrangements got them to the top. They see little cause to question those arrangements, which as far as they can tell — especially when peering into a mirror — are working just swell.

Sixth, the U.S. military system.

The intensification of the War for the Greater Middle East after 9/11 revealed unsuspected defects in America’s basic approach to raising its military forces. Notwithstanding the considerable virtues of our professional military, notably durability and tactical prowess, the existing system rates as a failure.

The All-Volunteer Force is like a burger from a fast-food joint: it’s cheap, filling and tastes good going down. What’s not to like? Take a closer look, however, and problems with the existing U.S. military system become apparent. It encourages political irresponsibility. It underwrites an insipid conception of citizenship. It’s undemocratic. It turns out to be exorbitantly expensive. And it doesn’t win.

Dishonesty pervades the relationship between the U.S. military and society. Rhetorically, we “support the troops.” But the support is seldom more than skin-deep.

In practice, we subject the troops we profess to care about to serial abuse. As authorities in Washington commit U.S. forces to wars that are unnecessary or ill-managed or unwinnable — or, in the martial equivalent of a trifecta, all of the above — Americans manifest something close to indifference. The bungled rollout of a health care reform program might generate public attention and even outrage. By comparison, a bungled military campaign elicits shrugs.

Certainly our reliance on professional soldiers has relieved citizens of any responsibility to contribute to the nation’s defense. But is that actually such a good thing?

Back in the 1970s, when Vietnam induced Americans to abandon the tradition of the citizen-soldier, Washington responded by creating a standing army. That, at least, is what the Founding Fathers would have called the All-Volunteer Force. In their day, standing army was a term of opprobrium. An army consisting of professionals rather than citizens, they believed, was at odds with the principles animating the American Revolution and infused in documents such as the Declaration of Independence and the Constitution.

The shadow of Vietnam lingers even now, with ironic implications. Americans today seem intent on making amends for sins committed (or said to have been committed) back when supporting the troops had become optional and overtly pro-military attitudes defined the very inverse of hip — like men who got crew-cuts, women who wore bras or anyone who voted for Nixon.

Today, blaming the troops for the wars they are sent to fight has become all but unthinkable. “Thank you for your service” trips off American lips as easily as “Have a nice day” — and with as little real meaning. Dietrich Bonhoeffer had a phrase for such posturing: He called it cheap grace. The actually existing relationship between American soldiers and the American people is shot through with cheap grace.

Seventh, the political economy of war.

The second-order consequences of relying on professional soldiers are likewise unfortunate. Washington’s appetite for waging war in the Greater Middle East has exceeded the willingness of young Americans to volunteer for military service. This has created a gap: Too much war, too few warriors.

This gap has created an opening for profit-minded “private security firms” to flood the war zone. In both Iraq and Afghanistan, for example, contractors ultimately outnumbered uniformed military personnel, taking on tasks once performed by soldiers. The results have fallen well short of being satisfactory.

To charge all contractors with being incompetent or corrupt would be unfair. Yet waste and corruption have occurred on a colossal scale — so much so that the Pentagon is literally unable to say where all the money went.

War has always created opportunities for some people to make money. America’s War for the Greater Middle East today has become primarily a means to make money.

Eighth, history.

Americans tend to remember what they find convenient, too often overlooking what actually matters.

When it comes to “seeing” the world, our perspective is still largely shaped by selective memories of World War II — its origins, conduct and aftermath. Isolationism remains the great boogeyman. Winston Churchill remains the ideal of the heroic leader. The postwar occupations of Germany and Japan remain the most instructive illustrations of what American leadership can be counted on to achieve.

Yet the triumphal story of World War II as a victory masterminded by the United States is almost entirely irrelevant to the Greater Middle East.

The story that matters — the account of how the modern Middle East came into existence — is largely unknown to the general public. And what the public knows is often misleading — the result of entertaining but largely fanciful movies like _Exodus_.

For the history that matters we might pay less attention to the Munich Conference of 1938, warning of the dangers of appeasing evil. Instead, we should pay more attention to the Sykes-Picot Agreement of 1916, which carved up the Ottoman Empire to suit British and French imperial ambitions and thereby yielded evil results that linger today. Similarly, while paying homage to the Churchill who got Hitler right, we might ask what possessed the same Churchill to get the Middle East so grotesquely wrong.

In waging the War for the Greater Middle East, our mental frame of reference remains stuck in the 20th century. That frame is obsolete — like thinking about communications in terms of tubes, wires and postage stamps. The 21st century demands something quite different.

Ninth, regional allies.

The longer America’s War for the Greater Middle East drags on, the more apparent it becomes that Washington has done a lousy job of picking allies.

Consider Pakistan and Saudi Arabia, for example. The United States seeks to reduce the prevalence of violent Islamic radicalism. The governments of Pakistan and Saudi Arabia actively promote it. It’s time to stop pretending otherwise.

Then there is Israel. U.S. interests and those of the Jewish state have long since diverged. To ensure the security and well-being of its citizens, the government of Israel vigorously employs its military muscle to preempt perceived threats and ensure Israeli control of vital terrain and resources, now and in perpetuity.

In practical terms, that implies double standards when it comes to, say, possessing weapons of mass destruction. It also means skepticism regarding any “peace” agreement except on terms manifestly favorable to Israel. From an Israeli perspective, this makes considerable sense.

Yet to satisfy Israel’s prerequisites for peace, nearby Arab states will have to become little Canadas — not only “friendly” but also demilitarized, economically accessible, and with open and undefended borders. That seems highly unlikely.

Through action and inaction, Washington serves as Israel’s willing enabler. By providing arms and technology, the United States guarantees Israel’s “qualitative edge,” a euphemism for unchallengeable regional supremacy. The United States also provides Israel with diplomatic cover, for example, tacitly accepting manifestly illegal Israeli actions such as settlement expansion in the West Bank.

With what consequences? Becoming party to the Arab-Israeli conflict on Israel’s side creates unwanted complications for the United States. It also exacerbates that previously mentioned tendency to overstate the importance of the Greater Middle East in the hierarchy of U.S. strategic interests.

The chief U.S. interest in the region lies in promoting stability. Anything else falls into the category of “nice to have.” In that regard, the United States has a profound interest in responding to the grievances of the Palestinian people promptly and comprehensively. Yet the government of Israel will respond to those grievances in due time and on Israeli terms. In the meantime, the persistence of those grievances provides either a genuine cause of or a pretext for anti-American and anti-Western attitudes across much of the Islamic world.

When it comes to waging the War for the Greater Middle East, Israel belongs in the same category as Saudi Arabia and Pakistan: As allies, all three are unhelpful.

Tenth, religion.

No single explanation exists for why the War for the Greater Middle East began and why it persists. But religion figures as a central element.

Secularized American elites either cannot grasp or are unwilling to accept this. So they contrive alternative explanations such as “terrorism,” a justification that impedes understanding.

Our leaders can proclaim their high regard for Islam until they are blue in the face. They can insist over and over that we are not at war with Islam. Their claims will fall on deaf ears through much of the Greater Middle East.

Whatever Washington’s intentions, we are engaged in a religious war. That is, the ongoing war has an ineradicable religious dimension. That’s the way a few hundred million Muslims see it, and their seeing it in those terms makes it so.

The beginning of wisdom is found not in denying that the war is about religion but in acknowledging that war cannot provide an antidote to the fix we have foolishly gotten ourselves into.

Does the Islamic world pose something of a problem for the United States? You bet, in all sorts of ways. But after more than three decades of trying, it’s pretty clear that the application of military power is unlikely to provide a solution. The solution, if there is one, will be found by looking beyond the military realm — which just might be the biggest lesson our experience with the War for the Greater Middle East ought to teach.


Andrew J. Bacevich is a professor of history and international relations at Boston University. He was formerly a visiting fellow at Notre Dame’s Kroc Institute for International Peace Studies.