The Hill to the Rescue on Syria?

Sometimes history happens at the moment when no one is looking.  On weekends in late August, the president of the United States ought to be playing golf or loafing at Camp David, not making headlines.  Yet Barack Obama chose Labor Day weekend to unveil arguably the most consequential foreign policy shift of his presidency.

In an announcement that surprised virtually everyone, the president told his countrymen and the world that he was putting on hold the much-anticipated U.S. attack against Syria.  Obama hadn’t, he assured us, changed his mind about the need and justification for punishing the Syrian government for its probable use of chemical weapons against its own citizens.  In fact, only days before, administration officials had been claiming that, if necessary, the U.S. would “go it alone” in punishing Bashar al-Assad’s regime for its bad behavior.  Now, however, Obama announced that, as the chief executive of “the world’s oldest constitutional democracy,” he had decided to seek Congressional authorization before proceeding.

Obama thereby brought to a screeching halt a process extending back over six decades in which successive inhabitants of the Oval Office had arrogated to themselves (or had thrust upon them) ever wider prerogatives in deciding when and against whom the United States should wage war.  Here was one point on which every president from Harry Truman to George W. Bush had agreed: on matters related to national security, the authority of the commander-in-chief has no fixed limits.  When it comes to keeping the country safe and securing its vital interests, presidents can do pretty much whatever they see fit.

Here, by no means incidentally, lies the ultimate source of the stature and prestige that define the imperial presidency and thereby shape (or distort) the American political system.  Sure, the quarters at 1600 Pennsylvania Avenue are classy, but what really endowed the postwar presidency with its singular aura were the missiles, bombers, and carrier battle groups that responded to the commands of one man alone.  What’s the bully pulpit in comparison to having the 82nd Airborne and SEAL Team Six at your beck and call?

Now, in effect, Obama was saying to Congress: I’m keen to launch a war of choice.  But first I want you guys to okay it.  In politics, where voluntarily forfeiting power is an unnatural act, Obama’s invitation qualifies as beyond unusual.  Whatever the calculations behind his move, its effect rates somewhere between unprecedented and positively bizarre — the heir to imperial prerogatives acting, well, decidedly unimperial.

Obama is a constitutional lawyer, of course, and it’s pleasant to imagine that he acted out of due regard for what Article 1, Section 8, of the Constitution plainly states, namely that “the Congress shall have power…  to declare war.”  Take his explanation at face value and the president’s decision ought to earn plaudits from strict constructionists across the land.  The Federalist Society should offer Obama an honorary lifetime membership.

Of course, seasoned political observers, understandably steeped in cynicism, dismissed the president’s professed rationale out of hand and immediately began speculating about his actual motivation.  The most popular explanation was this: having painted himself into a corner, Obama was trying to lure members of the legislative branch into joining him there.  Rather than a belated conversion experience, the president’s literal reading of the Constitution actually amounted to a sneaky political ruse.

After all, the president had gotten himself into a pickle by declaring back in August 2012 that any use of chemical weapons by the government of Bashar al-Assad would cross a supposedly game-changing “red line.”  When the Syrians (apparently) called his bluff, Obama found himself facing uniformly unattractive military options that ranged from the patently risky — joining forces with the militants intent on toppling Assad — to the patently pointless — firing a “shot across the bow” of the Syrian ship of state.

Meanwhile, the broader American public, awakening from its summertime snooze, was demonstrating remarkably little enthusiasm for yet another armed intervention in the Middle East.  Making matters worse still, U.S. military leaders and many members of Congress, Republican and Democratic alike, were expressing serious reservations or actual opposition. Press reports even cited leaks by unnamed officials who characterized the intelligence linking Assad to the chemical attacks as no “slam dunk,” a painful reminder of how bogus information had paved the way for the disastrous and unnecessary Iraq War.  For the White House, even a hint that Obama in 2013 might be replaying the Bush scenario of 2003 was anathema.

The president also discovered that recruiting allies to join him in this venture was proving a hard sell.  It wasn’t just the Arab League’s refusal to give an administration strike against Syria its seal of approval, although that was bad enough.  Jordan’s King Abdullah, America’s “closest ally in the Arab world,” publicly announced that he favored talking to Syria rather than bombing it.  As for Iraq, that previous beneficiary of American liberation, its government was refusing even to allow U.S. forces access to its airspace.  Ingrates!

For Obama, the last straw may have come when America’s most reliable (not to say subservient) European partner refused to enlist in yet another crusade to advance the cause of peace, freedom, and human rights in the Middle East.  With memories of Tony and George W. apparently eclipsing those of Winston and Franklin, the British Parliament rejected Prime Minister David Cameron’s attempt to position the United Kingdom alongside the United States.  Parliament’s vote dashed Obama’s hopes of forging a coalition of two and so investing a war of choice against Syria with at least a modicum of legitimacy.

When it comes to actual military action, only France still entertains the possibility of making common cause with the United States.  Yet the number of Americans taking assurance from this prospect approximates the number who know that Bernard-Henri Lévy isn’t a celebrity chef.

John F. Kennedy once remarked that defeat is an orphan.  Here was a war bereft of parents even before it had begun.

Whether or Not to Approve the War for the Greater Middle East

Still, whether high-minded constitutional considerations or diabolically clever political machinations motivated the president may matter less than what happens next.  Obama lobbed the ball into Congress’s end of the court.  What remains to be seen is how the House and the Senate, just now coming back into session, will respond.

At least two possibilities exist, one with implications that could prove profound and the second holding the promise of being vastly entertaining.

On the one hand, Obama has implicitly opened the door for a Great Debate regarding the trajectory of U.S. policy in the Middle East.  Although a week or ten days from now the Senate and House of Representatives will likely be voting to approve or reject some version of an Authorization for the Use of Military Force (AUMF), at stake is much more than the question of what to do about Syria.  The real issue — Americans should hope that the forthcoming congressional debate makes this explicit — concerns the advisability of continuing to rely on military might as the preferred means of advancing U.S. interests in this part of the world.

Appreciating the actual stakes requires putting the present crisis in a broader context.  Herewith an abbreviated history lesson.

Back in 1980, President Jimmy Carter announced that the United States would employ any means necessary to prevent a hostile power from gaining control of the Persian Gulf.  In retrospect, it’s clear enough that the promulgation of the so-called Carter Doctrine amounted to a de facto presidential “declaration” of war (even if Carter himself did not consciously intend to commit the United States to perpetual armed conflict in the region).  Certainly, what followed was a never-ending sequence of wars and war-like episodes.  Although the Congress never formally endorsed Carter’s declaration, it tacitly acceded to all that his commitment subsequently entailed.

Relatively modest in its initial formulation, the Carter Doctrine quickly metastasized.  Geographically, it grew far beyond the bounds of the Persian Gulf, eventually encompassing virtually all of the Islamic world.  Washington’s own ambitions in the region also soared.  Rather than merely preventing a hostile power from achieving dominance in the Gulf, the United States was soon seeking to achieve dominance itself.  Dominance — that is, shaping the course of events to Washington’s liking — was said to hold the key to maintaining stability, ensuring access to the world’s most important energy reserves, checking the spread of Islamic radicalism, combating terrorism, fostering Israel’s security, and promoting American values.  Through the adroit use of military might, dominance actually seemed plausible.  (So at least Washington persuaded itself.)

What this meant in practice was the wholesale militarization of U.S. policy toward the Greater Middle East in a period in which Washington’s infatuation with military power was reaching its zenith.  As the Cold War wound down, the national security apparatus shifted its focus from defending Germany’s Fulda Gap to projecting military power throughout the Islamic world.  In practical terms, this shift found expression in the creation of Central Command (CENTCOM), reconfigured forces, and an eternal round of contingency planning, war plans, and military exercises in the region.  To lay the basis for the actual commitment of troops, the Pentagon established military bases, stockpiled matériel in forward locations, and negotiated transit rights.  It also courted and armed proxies.  In essence, the Carter Doctrine provided the Pentagon (along with various U.S. intelligence agencies) with a rationale for honing and then exercising new capabilities.

Capabilities expanded the range of policy options.  Options offered opportunities to “do something” in response to crisis.  From the Reagan era on, policymakers seized upon those opportunities with alacrity.  A seemingly endless series of episodes and incidents ensued, as U.S. forces, covert operatives, or proxies engaged in hostile actions (often on multiple occasions) in Lebanon, Libya, Iran, Somalia, Bosnia, Kosovo, Saudi Arabia, the Sudan, Yemen, Pakistan, the southern Philippines, and in the Persian Gulf itself, not to mention Iraq and Afghanistan.  Consider them all together and what you have is a War for the Greater Middle East, pursued by the United States for over three decades now.  If Congress gives President Obama the green light, Syria will become the latest front in this ongoing enterprise.

Profiles in Courage? If Only

A debate over the Syrian AUMF should encourage members of Congress — if they’ve got the guts — to survey this entire record of U.S. military activities in the Greater Middle East going back to 1980.  To do so means almost unavoidably confronting this simple question: How are we doing?  To state the matter directly, all these years later, given all the ordnance expended, all the toing-and-froing of U.S. forces, and all the lives lost or shattered along the way, is mission accomplishment anywhere in sight?  Or have U.S. troops — the objects of such putative love and admiration on the part of the American people — been engaged over the past 30-plus years in a fool’s errand?  How members cast their votes on the Syrian AUMF will signal their answer — and by extension the nation’s answer — to that question.

To okay an attack on Syria will, in effect, reaffirm the Carter Doctrine and put a stamp of congressional approval on the policies that got us where we are today.  A majority vote in favor of the Syrian AUMF will sustain and probably deepen Washington’s insistence that the resort to violence represents the best way to advance U.S. interests in the Islamic world.  From this perspective, all we need to do is try harder and eventually we’ll achieve a favorable outcome.  With Syria presumably the elusive but never quite attained turning point, the Greater Middle East will stabilize.  Democracy will flourish.  And the United States will bask in the appreciation of those we have freed from tyranny.

To vote against the AUMF, on the other hand, will draw a red line of much greater significance than the one that President Obama himself so casually laid down.  Should the majority in either House reject the Syrian AUMF, the vote will call into question the continued viability of the Carter Doctrine and all that followed in its wake.

It will create space to ask whether having another go is likely to produce an outcome any different from what the United States has achieved in the myriad places throughout the Greater Middle East where U.S. forces (or covert operatives) have, whatever their intentions, spent the past several decades wreaking havoc and sowing chaos under the guise of doing good.  Instead of offering more of the same — does anyone seriously think that ousting Assad will transform Syria into an Arab Switzerland? — rejecting the AUMF might even invite the possibility of charting an altogether different course, entailing perhaps a lower military profile and greater self-restraint.

What a stirring prospect!  Imagine members of Congress setting aside partisan concerns to debate first-order questions of policy.  Imagine them putting the interests of the country in front of their own worries about winning reelection or pursuing their political ambitions.  It would be like Lincoln vs. Douglas or Woodrow Wilson vs. Henry Cabot Lodge.  Call Doris Kearns Goodwin.  Call Spielberg or Sorkin.  Get me Capra, for God’s sake.  We’re talking high drama of blockbuster proportions.

On the other hand, given the record of the recent past, we should hardly discount the possibility that our legislative representatives will not rise to the occasion.  Invited by President Obama to share in the responsibility for deciding whether and where to commit acts of war, one or both Houses — not known these days for displaying either courage or responsibility — may choose instead to punt.

As we have learned by now, the possible ways for Congress to shirk its duty are legion.  In this instance, all are likely to begin with the common supposition that nothing’s at stake here except responding to Assad’s alleged misdeeds.  To refuse to place the Syrian crisis in any larger context is, of course, a dodge.  Yet that dodge creates multiple opportunities for our elected representatives to let themselves off the hook.

Congress could, for example, pass a narrowly drawn resolution authorizing Obama to fire his “shot across the bow” and no more.  In other words, it could basically endorse the president’s inclination to substitute gesture for policy.

Or it could approve a broadly drawn, but vacuous resolution, handing the president a blank check.  Ample precedent exists for that approach, since it more or less describes what Congress did in 1964 with the Tonkin Gulf Resolution, opening the way to presidential escalation in Vietnam, or with the AUMF it passed in the immediate aftermath of 9/11, giving George W. Bush’s administration permission to do more or less anything it wanted to just about anyone.

Even more irresponsibly, Congress could simply reject any Syrian AUMF, however worded, without identifying a plausible alternative to war, in effect washing its hands of the matter and creating a policy vacuum.

Will members of the Senate and the House grasp the opportunity to undertake an urgently needed reassessment of America’s War for the Greater Middle East?  Or wriggling and squirming, will they inelegantly sidestep the issue, opting for short-term expediency in place of serious governance?  In an age when the numbing blather of McCain, McConnell, and Reid has replaced the oratory of Clay, Calhoun, and Webster, merely to pose the question is to answer it.

But let us not overlook the entertainment value of such an outcome, which could well be formidable.  In all likelihood, high comedy Washington-style lurks just around the corner.  So renew that subscription to The Onion.  Keep an eye on Doonesbury.  Set the TiVo to record Jon Stewart.  This is going to be really funny — and utterly pathetic.  Where’s H.L. Mencken when we need him?

Andrew J. Bacevich is a professor of history and international relations at Boston University.  He is the author of the new book, Breach of Trust: How Americans Failed Their Soldiers and Their Country (Metropolitan Books).

Copyright 2013 Andrew Bacevich

Naming Our Nameless War

For well over a decade now the United States has been “a nation at war.” Does that war have a name?

It did at the outset.  After 9/11, George W. Bush’s administration wasted no time in announcing that the U.S. was engaged in a Global War on Terrorism, or GWOT.  With few dissenters, the media quickly embraced the term. The GWOT promised to be a gargantuan, transformative enterprise. The conflict begun on 9/11 would define the age. In neoconservative circles, it was known as World War IV.

Upon succeeding to the presidency in 2009, however, Barack Obama without fanfare junked Bush’s formulation (as he did again in a speech at the National Defense University last week).  Yet if the appellation went away, the conflict itself, shorn of identifying marks, continued.

Does it matter that ours has become and remains a nameless war? Very much so.

Names bestow meaning.  When it comes to war, a name attached to a date can shape our understanding of what the conflict was all about.  To specify when a war began and when it ended is to privilege certain explanations of its significance while discrediting others. Let me provide a few illustrations.

With rare exceptions, Americans today characterize the horrendous fraternal bloodletting of 1861-1865 as the Civil War.  Yet not many decades ago, diehard supporters of the Lost Cause insisted on referring to that conflict as the War Between the States or the War for Southern Independence (or even the War of Northern Aggression).  The South may have gone down in defeat, but the purposes for which Southerners had fought — preserving a distinctive way of life and the principle of states’ rights — had been worthy, even noble.  So at least they professed to believe, with their preferred names for the war reflecting that belief.

Schoolbooks tell us that the Spanish-American War began in April 1898 and ended in August of that same year.  The name and dates fit nicely with a widespread inclination from President William McKinley’s day to our own to frame U.S. intervention in Cuba as an altruistic effort to liberate that island from Spanish oppression.

Yet the Cubans were not exactly bystanders in that drama.  By 1898, they had been fighting for years to oust their colonial overlords.  And although hostilities in Cuba itself ended on August 12th, they dragged on in the Philippines, another Spanish colony that the United States had seized for reasons only remotely related to liberating Cubans.  Notably, U.S. troops occupying the Philippines waged a brutal war not against Spaniards but against Filipino nationalists no more inclined to accept colonial rule by Washington than by Madrid.  So widen the aperture to include this Cuban prelude and the Filipino postlude and you end up with something like this:  The Spanish-American-Cuban-Philippines War of 1895-1902.  Too clunky?  How about the War for the American Empire?  This much is for sure: rather than illuminating, the commonplace textbook descriptor serves chiefly to conceal.

Strange as it may seem, Europeans once referred to the calamitous events of 1914-1918 as the Great War.  When Woodrow Wilson decided in 1917 to send an army of doughboys to fight alongside the Allies, he went beyond Great.  According to the president, the Great War was going to be the War To End All Wars.  Alas, things did not pan out as he expected.  Perhaps anticipating the demise of his vision of permanent peace, War Department General Order 115, issued on October 7, 1919, formally declared that, at least as far as the United States was concerned, the recently concluded hostilities would be known simply as the World War.

In September 1939 — presto chango! — the World War suddenly became the First World War, the Nazi invasion of Poland having inaugurated a Second World War, also known as World War II or more cryptically WWII.  To be sure, Soviet dictator Josef Stalin preferred the Great Patriotic War.  Although this found instant — almost unanimous — favor among Soviet citizens, it did not catch on elsewhere.

Does World War II accurately capture the events it purports to encompass?  With the crusade against the Axis now ranking alongside the crusade against slavery as a myth-enshrouded chapter in U.S. history to which all must pay homage, Americans are no more inclined to consider that question than to consider why a playoff to determine the professional baseball championship of North America constitutes a “World Series.”

In fact, however convenient and familiar, World War II is misleading and not especially useful.  The period in question saw at least two wars, each only tenuously connected to the other, each having distinctive origins, each yielding a different outcome.  To separate them is to transform the historical landscape.

On the one hand, there was the Pacific War, pitting the United States against Japan.  Formally initiated by the December 7, 1941, attack on Pearl Harbor, it had in fact begun a decade earlier when Japan embarked upon a policy of armed conquest in Manchuria.  At stake was the question of who would dominate East Asia.  Japan’s crushing defeat at the hands of the United States, sealed by two atomic bombs in 1945, answered that question (at least for a time).

Then there was the European War, pitting Nazi Germany first against Great Britain and France, but ultimately against a grand alliance led by the United States, the Soviet Union, and a fast fading British Empire.  At stake was the question of who would dominate Europe.  Germany’s defeat resolved that issue (at least for a time): no one would.  To prevent any single power from controlling Europe, two outside powers divided it.

This division served as the basis for the ensuing Cold War, which wasn’t actually cold, but also (thankfully) wasn’t World War III, the retrospective insistence of bellicose neoconservatives notwithstanding.  But when did the Cold War begin?  Was it in early 1947, when President Harry Truman decided that Stalin’s Russia posed a looming threat and committed the United States to a strategy of containment?  Or was it in 1919, when Vladimir Lenin decided that Winston Churchill’s vow to “strangle Bolshevism in its cradle” posed a looming threat to the Russian Revolution, with an ongoing Anglo-American military intervention evincing a determination to make good on that vow?

Separating the war against Nazi Germany from the war against Imperial Japan opens up another interpretive possibility.  If you incorporate the European conflict of 1914-1918 and the European conflict of 1939-1945 into a single narrative, you get a Second Thirty Years War (the first having occurred from 1618 to 1648) — not so much a contest of good against evil, as a mindless exercise in self-destruction that represented the ultimate expression of European folly.

So, yes, it matters what we choose to call the military enterprise we’ve been waging not only in Iraq and Afghanistan, but also in any number of other countries scattered hither and yon across the Islamic world.  Although the Obama administration appears no more interested than the Bush administration in saying when that enterprise will actually end, the date we choose as its starting point also matters.

Although Washington seems in no hurry to name its nameless war — and will no doubt settle on something self-serving or anodyne if it ever finally addresses the issue — perhaps we should jump-start the process.  Let’s consider some possible options, names that might actually explain what’s going on.

The Long War: Coined not long after 9/11 by senior officers in the Pentagon, this formulation never gained traction with either civilian officials or the general public.  Yet the Long War deserves consideration, even though — or perhaps because — it has lost its luster with the passage of time.

At the outset, it connoted grand ambitions buoyed by extreme confidence in the efficacy of American military might.  This was going to be one for the ages, a multi-generational conflict yielding sweeping results.

The Long War did begin on a hopeful note.  The initial entry into Afghanistan and then into Iraq seemed to herald “home by Christmas” triumphal parades.  Yet this soon proved an illusion as victory slipped from Washington’s grasp.  By 2005 at the latest, events in the field had dashed the neo-Wilsonian expectations nurtured back home.

With the conflicts in Iraq and Afghanistan dragging on, “long” lost its original connotation.  Instead of “really important,” it became a synonym for “interminable.”  Today, the Long War does succinctly capture the experience of American soldiers who have endured multiple combat deployments to Iraq and Afghanistan.

For Long War combatants, the object of the exercise has become to persist.  As for winning, it’s not in the cards. The Long War just might conclude by the end of 2014 if President Obama keeps his pledge to end the U.S. combat role in Afghanistan and if he avoids getting sucked into Syria’s civil war.  So the troops may hope.

The War Against Al-Qaeda: It began in August 1996 when Osama bin Laden issued a “Declaration of War against the Americans Occupying the Land of the Two Holy Places,” i.e., Saudi Arabia.  In February 1998, a second bin Laden manifesto announced that killing Americans, military and civilian alike, had become “an individual duty for every Muslim who can do it in any country in which it is possible to do it.”

Although President Bill Clinton took notice, the U.S. response to bin Laden’s provocations was limited and ineffectual.  Only after 9/11 did Washington take this threat seriously.  Since then, apart from a pointless excursion into Iraq (where, in Saddam Hussein’s day, al-Qaeda did not exist), U.S. attention has been focused on Afghanistan, where U.S. troops have waged the longest war in American history, and on Pakistan’s tribal borderlands, where a CIA drone campaign is ongoing.  By the end of President Obama’s first term, U.S. intelligence agencies were reporting that a combined CIA/military campaign had largely destroyed bin Laden’s organization.  Bin Laden himself, of course, was dead.

Could the United States have declared victory in its unnamed war at this point?  Perhaps, but it gave little thought to doing so.  Instead, the national security apparatus had already trained its sights on various al-Qaeda “franchises” and wannabes, militant groups claiming the bin Laden brand and waging their own version of jihad.  These offshoots emerged in the Maghreb, Yemen, Somalia, Nigeria, and — wouldn’t you know it — post-Saddam Iraq, among other places.  The question as to whether they actually posed a danger to the United States got, at best, passing attention — the label “al-Qaeda” eliciting the same sort of Pavlovian response that the word “communist” once did.

Americans should not expect this war to end anytime soon.  Indeed, the Pentagon’s impresario of special operations recently speculated — by no means unhappily — that it would continue globally for “at least 10 to 20 years.”   Freely translated, his statement undoubtedly means: “No one really knows, but we’re planning to keep at it for one helluva long time.”

The War For/Against/About Israel: It began in 1948.  For many Jews, the founding of the state of Israel signified an ancient hope fulfilled.  For many Christians, conscious of the sin of anti-Semitism that had culminated in the Holocaust, it offered a way to ease guilty consciences, albeit mostly at others’ expense.  For many Muslims, especially Arabs, and most acutely Arabs who had been living in Palestine, the founding of the Jewish state represented a grave injustice.  It was yet another unwelcome intrusion engineered by the West — colonialism by another name.

Recounting the ensuing struggle without appearing to take sides is almost impossible.  Yet one thing seems clear: in terms of military involvement, the United States attempted in the late 1940s and 1950s to keep its distance.  Over the course of the 1960s, this changed.  The U.S. became Israel’s principal patron, committed to maintaining (and indeed increasing) its military superiority over its neighbors.

In the decades that followed, the two countries forged a multifaceted “strategic relationship.”  A compliant Congress provided Israel with weapons and other assistance worth many billions of dollars, testifying to what has become an unambiguous and irrevocable U.S. commitment to the safety and well-being of the Jewish state.  The two countries share technology and intelligence.  Meanwhile, just as Israel had disregarded U.S. concerns when it came to developing nuclear weapons, it ignored persistent U.S. requests that it refrain from colonizing territory that it had conquered.

When it comes to identifying the minimal essential requirements of Israeli security and the terms that will define any Palestinian-Israeli peace deal, the United States defers to Israel.  That may qualify as an overstatement, but only slightly.  Given the Israeli perspective on those requirements and those terms — permanent military supremacy and a permanently demilitarized Palestine allowed limited sovereignty — the War For/Against/About Israel is unlikely to end anytime soon either.  Whether the United States benefits from the perpetuation of this war is difficult to say, but we are in it for the long haul.

The War for the Greater Middle East: I confess that this is the name I would choose for Washington’s unnamed war and is, in fact, the title of a course I teach.  (A tempting alternative is the Second Hundred Years War, the “first” having begun in 1337 and ended in 1453.)

This war is about to hit the century mark, its opening chapter coinciding with the onset of World War I.  Not long after the fighting on the Western Front in Europe had settled into a stalemate, the British government, looking for ways to gain the upper hand, set out to dismantle the Ottoman Empire whose rulers had foolishly thrown in their lot with the German Reich against the Allies.

By the time the war ended with Germany and the Turks on the losing side, Great Britain had already begun to draw up new boundaries, invent states, and install rulers to suit its predilections, while also issuing mutually contradictory promises to groups inhabiting these new precincts of its empire.  Toward what end?  Simply put, the British were intent on calling the shots from Egypt to India, whether by governing through intermediaries or ruling directly.  The result was a new Middle East and a total mess.

London presided over this mess, albeit with considerable difficulty, until the end of World War II.  At this point, by abandoning efforts to keep Arabs and Zionists from one another’s throats in Palestine and by accepting the partition of India, the British signaled their intention to throw in the towel. Alas, Washington proved more than willing to assume Britain’s role.  The lure of oil was strong.  So too were the fears, however overwrought, of the Soviets extending their influence into the region.

Unfortunately, the Americans enjoyed no more success in promoting long-term, pro-Western stability than had the British.  In some respects, they only made things worse, with the joint CIA-MI6 overthrow of a democratically elected government in Iran in 1953 offering a prime example of a “success” that, to this day, has never stopped breeding disaster.

Only after 1980 did things get really interesting, however.  The Carter Doctrine promulgated that year designated the Persian Gulf a vital national security interest and opened the door to greatly increased U.S. military activity not just in the Gulf, but also throughout the Greater Middle East (GME).  Between 1945 and 1980, considerable numbers of American soldiers lost their lives fighting in Asia and elsewhere.  During that period, virtually none were killed fighting in the GME.  Since 1990, in contrast, virtually none have been killed fighting anywhere except in the GME.

What does the United States hope to achieve in its inherited and unending War for the Greater Middle East?  To pacify the region?  To remake it in our image?  To drain its stocks of petroleum?  Or just to keep the lid on?  However you define the war’s aims, things have not gone well, which once again suggests that, in some form, it will continue for some time to come.  If there’s any good news here, it’s the prospect of having ever more material for my seminar, which may soon expand into a two-semester course.

The War Against Islam: This war began nearly 1,000 years ago and continued for centuries, a storied collision between Christendom and the Muslim ummah.  For a couple of hundred years, periodic eruptions of large-scale violence occurred until the conflict finally petered out with the last crusade sometime in the fourteenth century.

In those days, many people had deemed religion something worth fighting for, a proposition to which the more sophisticated present-day inhabitants of Christendom no longer subscribe.  Yet could that religious war have resumed in our own day?  Professor Samuel Huntington thought so, although he styled the conflict a “clash of civilizations.”  Some militant radical Islamists agree with Professor Huntington, citing as evidence the unwelcome meddling of “infidels,” mostly wearing American uniforms, in various parts of the Muslim world.  Some militant evangelical Christians endorse this proposition, even if they take a more favorable view of U.S. troops occupying and drones targeting Muslim countries.

In explaining the position of the United States government, religious scholars like George W. Bush and Barack (Hussein!) Obama have consistently expressed a contrary view.  Islam is a religion of peace, they declare, part of the great Abrahamic triad.  That the other elements of that triad are likewise committed to peace is a proposition that Bush, Obama, and most Americans take for granted, evidence not required.  There should be no reason why Christians, Jews, and Muslims can’t live together in harmony.

Still, remember back in 2001 when, in an unscripted moment, President Bush described the war barely begun as a “crusade”?  That was just a slip of the tongue, right?  If not, we just might end up calling this one the Eternal War.

Andrew J. Bacevich is a professor of history and international relations at Boston University and a TomDispatch regular. His next book, Breach of Trust: How Americans Failed Their Soldiers and Their Country, will appear in September.

Copyright 2013 Andrew J. Bacevich

Boykinism

First came the hullaballoo over the “Mosque at Ground Zero.” Then there was Pastor Terry Jones of Gainesville, Florida, grabbing headlines as he promoted “International Burn-a-Koran Day.” Most recently, we have an American posting a slanderous anti-Muslim video on the Internet with all the ensuing turmoil.

Throughout, the official U.S. position has remained fixed: the United States government condemns Islamophobia. Americans respect Islam as a religion of peace. Incidents suggesting otherwise are the work of a tiny minority — whackos, hatemongers, and publicity-seekers. Among Muslims from Benghazi to Islamabad, the argument has proven to be a tough sell.

And not without reason: although it might be comforting to dismiss anti-Islamic outbursts in the U.S. as the work of a few fanatics, the picture is actually far more complicated. Those complications in turn help explain why religion, once considered a foreign policy asset, has in recent years become a net liability.

Let’s begin with a brief history lesson. From the late 1940s to the late 1980s, when Communism provided the overarching ideological rationale for American globalism, religion figured prominently as a theme of U.S. foreign policy. Communist antipathy toward religion helped invest the Cold War foreign policy consensus with its remarkable durability. That Communists were godless sufficed to place them beyond the pale. For many Americans, the Cold War derived its moral clarity from the conviction that here was a contest pitting the God-fearing against the God-denying. Since we were on God’s side, it appeared axiomatic that God should repay the compliment.

From time to time during the decades when anti-Communism provided so much of the animating spirit of U.S. policy, Judeo-Christian strategists in Washington (not necessarily believers themselves), drawing on the theologically correct proposition that Christians, Jews, and Muslims all worship the same God, sought to enlist Muslims, sometimes of fundamentalist persuasions, in the cause of opposing the godless. One especially notable example was the Soviet-Afghan War of 1979-1989. To inflict pain on the Soviet occupiers, the United States threw its weight behind the Afghan resistance, styled in Washington as “freedom fighters,” and funneled aid (via the Saudis and the Pakistanis) to the most religiously extreme among them. When this effort resulted in a massive Soviet defeat, the United States celebrated its support for the Afghan Mujahedeen as evidence of strategic genius. It was almost as if God had rendered a verdict.

Yet not so many years after the Soviets withdrew in defeat, the freedom fighters morphed into the fiercely anti-Western Taliban, providing sanctuary to al-Qaeda as it plotted — successfully — to attack the United States. Clearly, this was a monkey wrench thrown into God’s plan.

With the launching of the Global War on Terrorism, Islamism succeeded Communism as the body of beliefs that, if left unchecked, threatened to sweep across the globe with dire consequences for freedom. Those whom Washington had armed as “freedom fighters” now became America’s most dangerous enemies. So at least members of the national security establishment believed or purported to believe, thereby curtailing any further discussion of whether militarized globalism actually represented the best approach to promoting liberal values globally or even served U.S. interests.

Yet as a rallying cry, a war against Islamism presented difficulties right from the outset. As much as policymakers struggled to prevent Islamism from merging in the popular mind with Islam itself, significant numbers of Americans — whether genuinely fearful or mischief-minded — saw this as a distinction without a difference. Efforts by the Bush administration to work around this problem by framing the post-9/11 threat under the rubric of “terrorism” ultimately failed because that generic term offered no explanation for motive. However the administration twisted and turned, motive in this instance seemed bound up with matters of religion.

Where exactly to situate God in post-9/11 U.S. policy posed a genuine challenge for policymakers, not least of all for George W. Bush, who believed, no doubt sincerely, that God had chosen him to defend America in its time of maximum danger. Unlike the communists, far from denying God’s existence, Islamists embrace God with startling ferocity. Indeed, in their vitriolic denunciations of the United States and in perpetrating acts of anti-American violence, they audaciously present themselves as nothing less than God’s avenging agents. In confronting the Great Satan, they claim to be doing God’s will.

Waging War in Jesus’s Name

This debate over who actually represents God’s will is one that the successive administrations of George W. Bush and Barack Obama have studiously sought to avoid. The United States is not at war with Islam per se, U.S. officials insist. Still, among Muslims abroad, Washington’s repeated denials notwithstanding, suspicion persists and not without reason.

Consider the case of Lieutenant General William G. (“Jerry”) Boykin. While still on active duty in 2002, this highly decorated Army officer spoke in uniform at a series of some 30 church gatherings during which he offered his own response to President Bush’s famous question: “Why do they hate us?” The general’s perspective differed markedly from his commander-in-chief’s: “The answer to that is because we’re a Christian nation. We are hated because we are a nation of believers.”

On another such occasion, the general recalled his encounter with a Somali warlord who claimed to enjoy Allah’s protection. The warlord was deluding himself, Boykin declared, and was sure to get his comeuppance: “I knew that my God was bigger than his. I knew that my God was a real God and his was an idol.” As a Christian nation, Boykin insisted, the United States would succeed in overcoming its adversaries only if “we come against them in the name of Jesus.”

When Boykin’s remarks caught the attention of the mainstream press, denunciations rained down from on high, as the White House, the State Department, and the Pentagon hastened to disassociate the government from the general’s views. Yet subsequent indicators suggest that, however crudely, Boykin was indeed expressing perspectives shared by more than a few of his fellow citizens.

One such indicator came immediately: despite the furor, the general kept his important Pentagon job as deputy undersecretary of defense for intelligence, suggesting that the Bush administration considered his transgression minor. Perhaps Boykin had spoken out of turn, but his was not a fireable offense. (One can only speculate regarding the fate likely to befall a high-ranking U.S. officer daring to say of Israeli Prime Minister Benjamin Netanyahu, “My God is a real God and his is an idol.”)

A second indicator came in the wake of Boykin’s retirement from active duty. In 2012, the influential Family Research Council (FRC) in Washington hired the general to serve as the organization’s executive vice-president. Devoted to “advancing faith, family, and freedom,” the council presents itself as emphatically Christian in its outlook. FRC events routinely attract Republican Party heavyweights. The organization forms part of the conservative mainstream, much as, say, the American Civil Liberties Union forms part of the left-liberal mainstream.

So for the FRC to hire as its executive vice-president someone espousing Boykin’s pronounced views regarding Islam qualifies as noteworthy. At a minimum, those who recruited the former general apparently found nothing especially objectionable in his worldview. They saw nothing politically risky about associating with Jerry Boykin. He’s their kind of guy. More likely, by hiring Boykin, the FRC intended to send a signal: on matters where their new executive vice-president claimed expertise — above all, war — thumb-in-your-eye political incorrectness was becoming a virtue. Imagine the NAACP electing Nation of Islam leader Louis Farrakhan as its national president, thereby endorsing his views on race, and you get the idea.

What the FRC’s embrace of General Boykin makes clear is this: to dismiss manifestations of Islamophobia simply as the work of an insignificant American fringe is mistaken. As with the supporters of Senator Joseph McCarthy, who during the early days of the Cold War saw communists under every State Department desk, those engaging in these actions are daring to express openly attitudes that others in far greater numbers also quietly nurture. To put it another way, what Americans in the 1950s knew as McCarthyism has reappeared in the form of Boykinism.

Historians differ passionately over whether McCarthyism represented a perversion of anti-Communism or its truest expression. So, too, present-day observers will disagree as to whether Boykinism represents a merely fervent or utterly demented response to the Islamist threat. Yet this much is inarguable: just as the junior senator from Wisconsin in his heyday embodied a non-trivial strain of American politics, so, too, does the former special-ops-warrior-turned-“ordained minister with a passion for spreading the Gospel of Jesus Christ.”

Notably, as Boykinism’s leading exponent, the former general’s views bear a striking resemblance to those favored by the late senator. Like McCarthy, Boykin believes that, while enemies beyond America’s gates pose great dangers, the enemy within poses a still greater threat. “I’ve studied Marxist insurgency,” he declared in a 2010 video. “It was part of my training. And the things I know that have been done in every Marxist insurgency are being done in America today.” Explicitly comparing the United States as governed by Barack Obama to Stalin’s Soviet Union, Mao Zedong’s China, and Fidel Castro’s Cuba, Boykin charges that, under the guise of health reform, the Obama administration is secretly organizing a “constabulary force that will control the population in America.” This new force is, he claims, designed to be larger than the United States military, and will function just as Hitler’s Brownshirts once did in Germany. All of this is unfolding before our innocent and unsuspecting eyes.

Boykinism: The New McCarthyism

How many Americans endorsed McCarthy’s conspiratorial view of national and world politics? It’s difficult to know for sure, but enough in Wisconsin to win him reelection in 1952, by a comfortable 54% to 46% majority. Enough to strike fear into the hearts of politicians who quaked at the thought of McCarthy fingering them for being “soft on Communism.”

How many Americans endorse Boykin’s comparably incendiary views? Again, it’s difficult to tell. Enough to persuade FRC’s funders and supporters to hire him, confident that doing so would burnish, not tarnish, the organization’s brand. Certainly, Boykin has in no way damaged its ability to attract powerhouses of the domestic right. FRC’s recent “Values Voter Summit” featured luminaries such as Republican vice-presidential nominee Paul Ryan, former Republican Senator and presidential candidate Rick Santorum, House Majority Leader Eric Cantor, and Representative Michele Bachmann — along with Jerry Boykin himself, who lectured attendees on “Israel, Iran, and the Future of Western Civilization.” (In early August, Mitt Romney met privately with a group of “prominent social conservatives,” including Boykin.)

Does their appearance at the FRC podium signify that Ryan, Santorum, Cantor, and Bachmann all subscribe to Boykinism’s essential tenets? Not any more than those who exploited the McCarthyite moment to their own political advantage — Richard Nixon, for example — necessarily agreed with all of McCarthy’s reckless accusations. Yet the presence of leading Republicans on an FRC program featuring Boykin certainly suggests that they find nothing especially objectionable or politically damaging to them in his worldview.

Still, comparisons between McCarthyism and Boykinism only go so far. Senator McCarthy wreaked havoc mostly on the home front, instigating witch-hunts, destroying careers, and trampling on civil rights, while imparting to American politics even more of a circus atmosphere than usual. In terms of foreign policy, the effect of McCarthyism, if anything, was to reinforce an already existing anti-communist consensus. McCarthy’s antics didn’t create enemies abroad. McCarthyism merely reaffirmed that communists were indeed the enemy, while making the political price of thinking otherwise too high to contemplate.

Boykinism, in contrast, makes its impact felt abroad. Unlike McCarthyism, it doesn’t strike fear into the hearts of incumbents on the campaign trail here. Attracting General Boykin’s endorsement or provoking his ire probably won’t determine the outcome of any election. Yet in its various manifestations Boykinism provides the kindling that helps sustain anti-American sentiment in the Islamic world. It reinforces the belief among Muslims that the Global War on Terror really is a war against them.

Boykinism confirms what many Muslims are already primed to believe: that American values and Islamic values are irreconcilable. American presidents and secretaries of state stick to their talking points, praising Islam as a great religious tradition and touting past U.S. military actions (ostensibly) undertaken on behalf of Muslims. Yet with their credibility among Iraqis, Afghans, Pakistanis, and others in the Greater Middle East about nil, they are pissing in the wind.

As long as substantial numbers of vocal Americans do not buy the ideological argument constructed to justify U.S. intervention in the Islamic world — that their conception of freedom (including religious freedom) is ultimately compatible with ours — then neither will Muslims. In that sense, the supporters of Boykinism who reject that proposition encourage Muslims to follow suit. This ensures, by extension, that further reliance on armed force as the preferred instrument of U. S. policy in the Islamic world will compound the errors that produced and have defined the post-9/11 era.

Andrew J. Bacevich is currently a visiting fellow at Notre Dame’s Kroc Institute for International Peace Studies. A TomDispatch regular, he is author of Washington Rules: America’s Path to Permanent War, among other works, and most recently editor of The Short American Century.

Copyright 2012 Andrew J. Bacevich


With the United States now well into the second decade of what the Pentagon has styled an “era of persistent conflict,” the war formerly known as the global war on terrorism (unofficial acronym WFKATGWOT) appears increasingly fragmented and diffuse.  Without achieving victory, yet unwilling to acknowledge failure, the United States military has withdrawn from Iraq.  It is trying to leave Afghanistan, where events seem equally unlikely to yield a happy outcome. 

Elsewhere — in Pakistan, Libya, Yemen, and Somalia, for example — U.S. forces are busily opening up new fronts.  Published reports that the United States is establishing “a constellation of secret drone bases” in or near the Horn of Africa and the Arabian Peninsula suggest that the scope of operations will only widen further.  In a front-page story, the New York Times described plans for “thickening” the global presence of U.S. special operations forces.  Rushed Navy plans to convert an aging amphibious landing ship into an “afloat forward staging base” — a mobile launch platform for either commando raids or minesweeping operations in the Persian Gulf — only reinforce the point.

Yet as some fronts close down and others open up, the war’s narrative has become increasingly difficult to discern.  How much farther until we reach the WFKATGWOT’s equivalent of Berlin?  What exactly is the WFKATGWOT’s equivalent of Berlin?  In fact, is there a storyline here at all?

Viewed close-up, the “war” appears to have lost form and shape.  Yet by taking a couple of steps back, important patterns begin to appear.  What follows is a preliminary attempt to score the WFKATGWOT, dividing the conflict into a bout of three rounds.  Although there may be several additional rounds still to come, here’s what we’ve suffered through thus far.

The Rumsfeld Era

Round 1: Liberation.  More than any other figure — more than any general, even more than the president himself — Secretary of Defense Donald Rumsfeld dominated the war’s early stages.  Appearing for a time to be a larger-than-life figure — the “Secretary at War” in the eyes of an adoring (if fickle) neocon fan club — Rumsfeld dedicated himself to the proposition that, in battle, speed holds the key to victory.  He threw his considerable weight behind a high-tech American version of blitzkrieg.  U.S. forces, he regularly insisted, were smarter and more agile than any adversary.  To employ them in ways that took advantage of those qualities was to guarantee victory.  The journalistic term adopted to describe this concept was “shock and awe.”

No one believed more passionately in “shock and awe” than Rumsfeld himself.  The design of Operation Enduring Freedom, launched in October 2001, and of Operation Iraqi Freedom, begun in March 2003, reflected this belief.  In each instance, the campaign got off to a promising start, with U.S. troops landing some swift and impressive blows.  In neither case, however, were they able to finish off their opponent or even, in reality, sort out just who their opponent might be.  Unfortunately for Rumsfeld, the “terrorists” refused to play by his rulebook and U.S. forces proved to be less smart and agile than their technological edge — and their public relations machine — suggested would be the case.  Indeed, when harassed by minor insurgencies and scattered bands of jihadis, they proved surprisingly slow to figure out what hit them.

In Afghanistan, Rumsfeld let victory slip through his grasp.  In Iraq, his mismanagement of the campaign brought the United States face-to-face with outright defeat.  Rumsfeld’s boss had hoped to liberate (and, of course, dominate) the Islamic world through a series of short, quick thrusts.  What Bush got instead were two different versions of a long, hard slog.  By the end of 2006, “shock and awe” was kaput.  Trailing well behind the rest of the country and its armed forces, the president eventually lost confidence in his defense secretary’s approach.  As a result, Rumsfeld lost his job.  Round one came to an end, the Americans, rather embarrassingly, having lost it on points.

The Petraeus Era

Round 2: Pacification.  Enter General David Petraeus.  More than any other figure, in or out of uniform, Petraeus dominated the WFKATGWOT’s second phase.  Round two opened with lowered expectations.  Gone was the heady talk of liberation.  Gone, too, were predictions of lightning victories.  The United States was now willing to settle for much less while still claiming success. 

Petraeus offered a formula for restoring a semblance of order to countries reduced to chaos as a result of round one.  Order might permit the United States to extricate itself while maintaining some semblance of having met its policy objectives.  This became the operative definition of victory.

The formal name for the formula that Petraeus devised was counterinsurgency, or COIN.  Rather than trying to defeat the enemy, COIN sought to facilitate the emergence of a viable and stable nation-state.  This was the stated aim of the “surge” in Iraq ordered by President George W. Bush at the end of 2006. 

With Petraeus presiding, violence in that country did decline precipitously. Whether the relationship was causal or coincidental remains the subject of controversy.  Still, Petraeus’s apparent success persuaded some observers that counterinsurgency on a global scale — GCOIN, they called it — should now form the basis for U.S. national security strategy.  Here, they argued, was an approach that could definitively extract the United States from the WFKATGWOT, while offering victory of a sort.  Rather than employing “shock and awe” to liberate the Islamic world, U.S. forces would apply counterinsurgency doctrine to pacify it.

The task of demonstrating the validity of COIN beyond Iraq fell to General Stanley McChrystal, appointed with much fanfare in 2009 to command U.S. and NATO forces in Afghanistan.  Press reports celebrated McChrystal as another Petraeus, the ideal candidate to replicate the achievements already credited to “King David.” 

McChrystal’s ascendancy came at a moment when a cult of generalship gripped Washington.  Rather than technology being the determinant of success as Rumsfeld had believed, the key was to put the right guy in charge and then let him run with things.  Political figures on both sides of the aisle fell all over themselves declaring McChrystal the right guy for Afghanistan.  Pundits of all stripes joined the chorus.

Once installed in Kabul, the general surveyed the situation and, to no one’s surprise, announced that “success demands a comprehensive counterinsurgency campaign.”  Implementing that campaign would necessitate an Afghan “surge” mirroring the one that had seemingly turned Iraq around.  In December 2009, albeit with little evident enthusiasm, President Barack Obama acceded to his commander’s request (or ultimatum).  The U.S. troop commitment to Afghanistan rapidly increased.

Here things began to come undone.  Progress toward reducing the insurgency or improving the capacity of Afghan security forces was — by even the most generous evaluation — negligible.  McChrystal made promises — like meeting basic Afghan needs with “government in a box, ready to roll in” — that he proved utterly incapable of keeping.  Relations with the government of President Hamid Karzai remained strained.  Those with neighboring Pakistan, not good to begin with, only worsened.  Both governments expressed deep resentment at what they viewed as high-handed American behavior that killed or maimed noncombatants with disturbing frequency.

To make matters worse, despite all the hype, McChrystal turned out to be miscast — manifestly the wrong guy for the job.  Notably, he proved unable to grasp the need to project even a pretense of respect for the principle of civilian control back in Washington.  By the summer of 2010, he was out — and Petraeus was back in.

In Washington (if not in Kabul), Petraeus’s oversized reputation quelled the sense that with McChrystal’s flame-out Afghanistan might be a lost cause.  Surely, the most celebrated soldier of his generation would repeat his Iraq magic, affirming his own greatness and the continued viability of COIN. 

Alas, this was not to be.  Conditions in Afghanistan during Petraeus’s tenure in command improved — if that’s even the word — only modestly.  The ongoing war met just about anyone’s definition of a quagmire.  With considerable understatement, a 2011 National Intelligence Estimate called it a “stalemate.” Soon, talk of a “comprehensive counterinsurgency” faded.  With the bar defining success slipping ever lower, passing off the fight to Afghan security forces and hightailing it for home became the publicly announced war aim.

That job remained unfinished when Petraeus himself headed for home, leaving the army to become CIA director.  Although Petraeus was still held in high esteem, his departure from active duty left the cult of generalship looking more than a little the worse for wear.  By the time General John Allen succeeded Petraeus — thereby becoming the eighth U.S. officer appointed to preside over the ongoing Afghan War — no one believed that simply putting the right guy in charge was going to produce magic.  On that inconclusive note, round two of the WFKATGWOT ended.

The Vickers Era

Round 3: Assassination.  Unlike Donald Rumsfeld or David Petraeus, Michael Vickers has not achieved celebrity status.  Yet more than anyone else in or out of uniform, Vickers, who carries the title Under Secretary of Defense for Intelligence, deserves recognition as the emblematic figure of the WFKATGWOT’s round three.  His low-key, low-profile persona meshes perfectly with this latest evolution in the war’s character.  Few people outside of Washington know who he is, which is fitting indeed since he presides over a war that few people outside of Washington are paying much attention to any longer.

With the retirement of Secretary of Defense Robert Gates, Vickers is the senior remaining holdover from George W. Bush’s Pentagon.  His background is nothing if not eclectic.  He previously served in U.S. Army Special Forces and as a CIA operative.  In that guise, he played a leading role in supporting the Afghan mujahedeen in their war against Soviet occupiers in the 1980s.  Subsequently, he worked in a Washington think tank and earned a PhD in strategic studies at Johns Hopkins University (dissertation title: “The Structure of Military Revolutions”). 

Even during the Bush era, Vickers never subscribed to expectations that the United States could liberate or pacify the Islamic world.  His preferred approach to the WFKATGWOT has been simplicity itself. “I just want to kill those guys,” he says — “those guys” referring to members of al-Qaeda. Kill the people who want to kill Americans and don’t stop until they are all dead: this defines the Vickers strategy, which over the course of the Obama presidency has supplanted COIN as the latest variant of U.S. strategy. 

The Vickers approach means acting aggressively to eliminate would-be killers wherever they might be found, employing whatever means are necessary.  Vickers “tends to think like a gangster,” one admirer comments. “He can understand trends then change the rules of the game so they are advantageous for your side.”

Round three of the WFKATGWOT is all about bending, breaking, and reinventing rules in ways thought to be advantageous to the United States.  Much as COIN supplanted “shock and awe,” a broad-gauged program of targeted assassination has now displaced COIN as the prevailing expression of the American way of war. 

The United States is finished with the business of sending large land armies to invade and occupy countries on the Eurasian mainland.  Robert Gates, when still Secretary of Defense, made the definitive statement on that subject.  The United States is now in the business of using missile-armed drones and special operations forces to eliminate anyone (not excluding U.S. citizens) the president of the United States decides has become an intolerable annoyance.  Under President Obama, such attacks have proliferated. 

This is America’s new MO.  Paraphrasing a warning issued by Secretary of State Hillary Clinton, a Washington Post dispatch succinctly summarized what it implied: “The United States reserved the right to attack anyone who it determined posed a direct threat to U.S. national security, anywhere in the world.” 

Furthermore, acting on behalf of the United States, the president exercises this supposed right without warning, without regard to claims of national sovereignty, without Congressional authorization, and without consulting anyone other than Michael Vickers and a few other members of the national security apparatus.  The role allotted to the American people is to applaud, if and when notified that a successful assassination has occurred.  And applaud we do, as when members of SEAL Team Six secretly entered Pakistan on a daring raid to dispatch Osama bin Laden with two neatly placed kill shots, vengeance long deferred making it unnecessary to consider what second-order political complications might ensue.

How round three will end is difficult to forecast.  The best we can say is that it’s unlikely to end anytime soon or particularly well.  As Israel has discovered, once targeted assassination becomes your policy, the list of targets has a way of growing ever longer. 

So what tentative judgments can we offer regarding the ongoing WFKATGWOT?  Operationally, a war launched by the conventionally minded has progressively fallen under the purview of those who inhabit what Dick Cheney once called “the dark side,” with implications that few seem willing to explore.  Strategically, a war informed at the outset by utopian expectations continues today with no concretely stated expectations whatsoever, the forward momentum of events displacing serious consideration of purpose.  Politically, a war that once occupied center stage in national politics has now slipped to the periphery, the American people moving on to other concerns and entertainments, with legal and moral questions raised by the war left dangling in midair.

Is this progress?

Andrew J. Bacevich is professor of history and international relations at Boston University.  A TomDispatch regular, he is the author most recently of Washington Rules: America’s Path to Permanent War and the editor of the new book The Short American Century: A Postmortem, just out from Harvard University Press. To catch Timothy MacBain’s latest Tomcast audio interview in which Bacevich discusses the changing face of the Global War on Terror, click here, or download it to your iPod here.

Copyright 2012 Andrew Bacevich

Scoring the Global War on Terror

Fenway Park, Boston, July 4, 2011.  On this warm summer day, the Red Sox will play the Toronto Blue Jays.  First come pre-game festivities, specially tailored for the occasion.  The ensuing spectacle — a carefully scripted encounter between the armed forces and society — expresses the distilled essence of present-day American patriotism.  A masterpiece of contrived spontaneity, the event leaves spectators feeling good about their baseball team, about their military, and not least of all about themselves — precisely as it was meant to do.

In this theatrical production, the Red Sox provide the stage, and the Pentagon the props.  In military parlance, it is a joint operation.  In front of a gigantic American flag draped over the left-field wall, an Air Force contingent, clad in blue, stands at attention.  To carry a smaller version of the Stars and Stripes onto the playing field, the Navy provides a color guard in crisp summer whites.  The United States Marine Corps kicks in with a choral ensemble that leads the singing of the national anthem.  As the anthem’s final notes sound, four U.S. Air Force F-15C Eagles scream overhead.  The sellout crowd roars its approval.

But there is more to come. “On this Independence Day,” the voice of the Red Sox booms over the public address system, “we pay a debt of gratitude to the families whose sons and daughters are serving our country.”  On this particular occasion the designated recipients of that gratitude are members of the Lydon family, hailing from Squantum, Massachusetts.  Young Bridget Lydon is a sailor — Aviation Ordnanceman Airman is her official title — serving aboard the carrier USS Ronald Reagan, currently deployed in support of the Afghanistan War, now in its 10th year.

From Out of Nowhere

The Lydons are Every Family, decked out for the Fourth.  Garbed in random bits of Red Sox paraphernalia and Mardi Gras necklaces, they wear their shirts untucked and ball caps backwards.  Neither sleek nor fancy, they are without pretension.  Yet they exude good cheer.  As they are ushered onto the field, their eagerness is palpable.  Like TV game show contestants, they know that this is their lucky day and they are keen to make the most of it.

As the Lydons gather near the pitcher’s mound, the voice directs their attention to the 38-by-100-foot Jumbotron mounted above the centerfield bleachers.  On the screen, Bridget appears.  She is aboard ship, in duty uniform, posed below decks in front of an F/A-18 fighter jet.  Waiflike, but pert and confident, she looks directly into the camera, sending a “shout-out” to family and friends.  She wishes she could join them at Fenway. 

As if by magic, wish becomes fulfillment.  While the video clip is still running, Bridget herself, now in dress whites, emerges from behind the flag covering the leftfield wall.  On the Jumbotron, in place of Bridget below decks, an image of Bridget marching smartly toward the infield appears.  In the stands pandemonium erupts.  After a moment of confusion, members of her family — surrounded by camera crews — rush to embrace their sailor, a reunion shared vicariously by the 38,000 fans in attendance along with many thousands more watching at home on the Red Sox television network. 

Once the Lydons finish with hugs and kisses and the crowd settles down, Navy veteran Bridget (annual salary approximately $22,000) throws the ceremonial first pitch to aging Red Sox veteran Tim Wakefield (annual salary $2,000,000).  More cheers.  As a souvenir, Wakefield gives her the baseball along with his own hug.  All smiles, Bridget and her family shout “Play Ball!” into a microphone.  As they are escorted off the field and out of sight, the game begins. 

Cheap Grace

What does this event signify?

For the Lydons, the day will no doubt long remain a happy memory.  If they were to some degree manipulated — their utter and genuine astonishment at Bridget’s seemingly miraculous appearance lending the occasion its emotional punch — they played their allotted roles without complaint and with considerable élan.  However briefly, they stood in the spotlight, quasi-celebrities, all eyes trained on them, a contemporary version of the American dream fulfilled.  And if offstage puppet-masters used Bridget herself, at least she got a visit home and a few days off — no doubt a welcome break. 

Yet this feel-good story was political as well as personal.  As a collaboration between two well-heeled but image-conscious institutions, the Lydon reunion represented a small but not inconsequential public relations triumph.  The Red Sox and the Navy had worked together to perform an act of kindness for a sailor and her loved ones.  Both organizations came away looking good, not only because the event itself was so deftly executed, but because it showed that the large for-profit professional sports team and the even larger military bureaucracy both care about ordinary people.  The message conveyed to fans/taxpayers could not be clearer: the corporate executives who run the Red Sox have a heart. So, too, do the admirals who run the Navy.

Better still, these benefits accrued at essentially no cost to the sponsors.  The military personnel arrayed around Fenway showed up because they were told to do so.  They are already “paid for,” as are the F-15s, the pilots who fly them, and the ground crews that service them.  As for whatever outlays the Red Sox may have made, they are trivial and easily absorbed.  For the 2011 season, the average price of a ticket at Fenway Park had climbed to $52.  A soft drink in a commemorative plastic cup runs you $5.50 and a beer $8.  Then there is the television ad revenue, all contributing the previous year to corporate profits exceeding $58 million.  A decade of war culminating in the worst economic crisis since the Great Depression hasn’t done much good for the country but it has been strangely good for the Red Sox — and a no-less well funded Pentagon.  Any money expended in bringing Bridget to Fenway and entertaining the Lydons had to be the baseball/military equivalent of pocket change.

And the holiday festivities at Fenway had another significance as well, one that extended beyond burnishing institutional reputations and boosting bottom lines.  Here was America’s civic religion made manifest. 

In recent decades, an injunction to “support the troops” has emerged as a central tenet of that religion.  Since 9/11 this imperative has become, if anything, even more binding.  Indeed, as citizens, Americans today acknowledge no higher obligation.

Fulfilling that obligation has posed a challenge, however.  Rather than doing so concretely, Americans — with a few honorable exceptions — have settled for symbolism.  With their pronounced aversion to collective service and sacrifice (an inclination indulged by leaders of both political parties), Americans resist any definition of civic duty that threatens to crimp lifestyles. 

To stand in solidarity with those on whom the burden of service and sacrifice falls is about as far as they will go.  Expressions of solidarity affirm that the existing relationship between soldiers and society is consistent with democratic practice.  By extension, so, too, is the distribution of prerogatives and responsibilities entailed by that relationship: a few fight, the rest applaud.  Put simply, the message that citizens wish to convey to their soldiers is this: although choosing not to be with you, we are still for you (so long as being for you entails nothing on our part).  Cheering for the troops, in effect, provides a convenient mechanism for voiding obligation and easing guilty consciences.   

In ways far more satisfying than displaying banners or bumper stickers, the Fenway Park Independence Day event provided a made-to-order opportunity for conscience easing.  It did so in three ways.  First, it brought members of Red Sox Nation into close proximity (even if not direct contact) with living, breathing members of the armed forces, figuratively closing any gap between the two.  (In New England, where few active duty military installations remain, such encounters are increasingly infrequent.)  Second, it manufactured one excuse after another to whistle and shout, whoop and holler, thereby allowing the assembled multitudes to express — and to be seen expressing — their affection and respect for the troops.  Finally, it rewarded participants and witnesses alike with a sense of validation, the reunion of Bridget and her family, even if temporary, serving as a proxy for a much larger, if imaginary, reconciliation of the American military and the American people.  That debt?  Mark it paid in full.

The late German theologian Dietrich Bonhoeffer had a name for this unearned self-forgiveness and undeserved self-regard.  He called it cheap grace.  Were he alive today, Bonhoeffer might suggest that a taste for cheap grace, compounded by an appetite for false freedom, is leading Americans down the road to perdition. 

Andrew J. Bacevich, the author of Washington Rules: America’s Path to Permanent War, is professor of history and international relations at Boston University. His next book, of which this post is a small part, will assess the impact of a decade of war on American society and the United States military. To listen to Timothy MacBain’s latest TomCast audio interview in which Bacevich discusses cheap grace and military spectacle, click here, or download it to your iPod here.

Copyright 2011 Andrew Bacevich

Ballpark Liturgy: America’s New Civic Religion

At periodic intervals, the American body politic has shown a marked susceptibility to messianic fevers.  Whenever an especially acute attack occurs, a sort of delirium ensues, manifesting itself in delusions of grandeur and demented behavior. 

By the time the condition passes and a semblance of health is restored, recollection of what occurred during the illness tends to be hazy.  What happened?  How’d we get here?  Most Americans prefer not to know.  No sense dwelling on what’s behind us.  Feeling much better now!  Thanks!

Gripped by such a fever in 1898, Americans evinced an irrepressible impulse to liberate oppressed Cubans.  By the time they’d returned to their senses, having acquired various parcels of real estate between Puerto Rico and the Philippines, no one could quite explain what had happened or why.  (The Cubans meanwhile had merely exchanged one set of overseers for another.)

In 1917, the fever suddenly returned.  Amid wild ravings about waging a war to end war, Americans lurched off to France.  This time the affliction passed quickly, although the course of treatment proved painful: confinement to the charnel house of the Western Front, followed by bitter medicine administered at Versailles.

The 1960s brought another bout (and so yet more disappointment).  An overwhelming urge to pay any price, bear any burden landed Americans in Vietnam.  The fall of Saigon in 1975 seemed, for a brief interval, to inoculate the body politic against any further recurrence.  Yet the salutary effects of this “Vietnam syndrome” proved fleeting.  By the time the Cold War ended, Americans were running another temperature, their self-regard reaching impressive new heights.  Out of Washington came all sorts of embarrassing gibberish about permanent global supremacy and history’s purpose finding fulfillment in the American way of life.

Give Me Fever

Then came 9/11 and the fever simply soared off the charts.  The messiah-nation was really pissed and was going to fix things once and for all.

Nearly 10 years have passed since Washington set out to redeem the Greater Middle East.  The crusades have not gone especially well.  In fact, in the pursuit of its saving mission, the American messiah has pretty much worn itself out.

Today, the post-9/11 fever finally shows signs of abating.  The evidence is partial and preliminary.  The sickness has by no means passed.  Oddly, it lingers most strongly in the Obama White House, of all places, where a keenness to express American ideals by dropping bombs seems strangely undiminished.

Yet despite the urges of some in the Obama administration, after nearly a decade of self-destructive flailing about, American recovery has become a distinct possibility.  Here’s some of the evidence:

In Washington, it’s no longer considered a sin to question American omnipotence.  Take the case of Robert Gates.  The outgoing secretary of defense may well be the one senior U.S. official of the past decade to leave office with his reputation not only intact, but actually enhanced.  (Note to President Obama: think about naming an aircraft carrier after the guy).  Yet along with restoring a modicum of competence and accountability to the Pentagon, the Gates legacy is likely to be found in his willingness — however belated — to acknowledge the limits of American power.

That the United States should avoid wars except when absolutely necessary no longer connotes incipient isolationism.  It is once again a sign of common sense, with Gates a leading promoter.  Modesty is becoming respectable.

The Gates Doctrine

No one can charge Gates with being an isolationist or a national security wimp.  Neither is he a “declinist.”  So when he says anyone proposing another major land war in the Greater Middle East should “have his head examined” — citing the authority of Douglas MacArthur, no less — people take notice.  Or more recently there was this:  "I've got a military that's exhausted," Gates remarked, in one of those statements of the obvious too seldom heard from on high.  "Let's just finish the wars we're in and keep focused on that instead of signing up for other wars of choice."  Someone should etch that into the outer walls of the Pentagon’s E-ring.

A half-dozen years ago, “wars of choice” were all the rage in Washington.  No more.  Thank you, Mr. Secretary.

Or consider the officer corps.  There is no “military mind,” but there are plenty of minds in the military, and some of them are changing.

Evidence suggests that the officer corps itself is rethinking the role of military power.  Consider, for example, “Mr. Y,” author of A National Strategic Narrative, published this spring to considerable acclaim by the Woodrow Wilson International Center for Scholars.  The actual authors of this report are two military professionals, one a navy captain, the other a Marine colonel.

What you won’t find in this document are jingoism, braggadocio, chest-thumping, and calls for a bigger military budget.  If there’s an overarching theme, it’s pragmatism.  Rather than the United States imposing its will on the world, the authors want more attention paid to the investment needed to rebuild at home.

The world is too big and complicated for any one nation to call the shots, they insist.  The effort to do so is self-defeating. “As Americans,” Mr. Y writes, “we needn’t seek the world’s friendship or proselytize the virtues of our society.  Neither do we seek to bully, intimidate, cajole, or persuade others to accept our unique values or to share our national objectives.  Rather, we will let others draw their own conclusions based upon our actions… We will pursue our national interests and let others pursue theirs…”

You might dismiss this as the idiosyncratic musing of two officers who have spent too much time having their brains baked in the Iraqi or Afghan sun.  I don’t.  What convinces me otherwise is the positive email traffic that my own musings about the misuse and abuse of American power elicit weekly from serving officers.  It’s no scientific sample, but the captains, majors, and lieutenant colonels I hear from broadly agree with Mr. Y.  They’ve had a bellyful of twenty-first-century American war and are open to a real debate over how to overhaul the nation’s basic approach to national security.

Intelligence Where You Least Expect It

And finally, by gum, there is the United States Congress.  Just when that body appeared to have entered a permanent vegetative state, a flickering of intelligent life has made its reappearance.  Perhaps more remarkably still, the signs are evident on both sides of the aisle as Democrats and Republicans alike — albeit for different reasons — are raising serious questions about the nation’s propensity for multiple, open-ended wars.

Some members cite concerns for the Constitution and the abuse of executive power.  Others worry about the price tag.  With Osama bin Laden out of the picture, still others insist that it’s time to rethink strategic priorities.  No doubt partisan calculation or personal ambition figures alongside matters of principle.  They are, after all, politicians.

Given what polls indicate is a growing public unhappiness over the Afghan War, speaking out against that war these days doesn’t exactly require political courage.  Still, the possibility of our legislators reasserting a role in deciding whether or not a war actually serves the national interest — rather than simply rubberstamping appropriations and slinking away — now presents itself.  God bless the United States Congress.

Granted, the case presented here falls well short of being conclusive.  To judge by his announcement of a barely-more-than-symbolic troop withdrawal from Afghanistan, President Obama himself seems uncertain of where he stands.  And clogging the corridors of power or the think tanks and lobbying arenas that surround them are plenty of folks still hankering to have a go at Syria or Iran.

At the first signs of self-restraint, you can always count on the likes of Senator John McCain or the editorial board of the Wall Street Journal to decry (in McCain’s words) an “isolationist-withdrawal-lack-of-knowledge-of-history attitude” hell-bent on pulling up the drawbridge and having Americans turn their backs on the world.  In such quarters, fever is a permanent condition and it’s always 104 and rising.  Yet it is a measure of just how quickly things are changing that McCain himself, once deemed a source of straight talk, now comes across as a mere crank.

In this way, nearly a decade after our most recent descent into madness, does the possibility of recovery finally beckon.

Andrew J. Bacevich is professor of history and international relations at Boston University. His most recent book is Washington Rules: America’s Path to Permanent War.  To listen to Timothy MacBain’s latest TomCast audio interview in which Bacevich discusses voices of dissent within the military, click here, or download it to your iPod here.

Copyright 2011 Andrew J. Bacevich

On the Mend?

It is a commonplace of American politics: when the moving van pulls up to the White House on Inauguration Day, it delivers not only a closetful of gray suits and power ties, but a boatload of expectations. 

A president, being the most powerful man in the world, begins history anew — so at least Americans believe, or pretend to believe.  Out with the old, sordid, and disappointing; in with the fresh, unsullied, and hopeful.  Why, with the stroke of a pen, a new president can order the closing of an embarrassing and controversial off-shore prison for accused terrorists held for years on end without trial!  Just like that: done.

For all sorts of reasons, the expectations raised by Barack Obama’s arrival in the Oval Office were especially high.  Americans weren’t the only ones affected.  How else to explain the Nobel Committee’s decision to honor the new president by transforming its Peace Prize into a Prize Anticipating Peace — more or less the equivalent of designating the winner of the Heisman Trophy during week one of the college football season.

Of course, if the political mood immediately prior to and following a presidential inauguration emphasizes promise and discovery (the First Lady has biceps!), it doesn’t take long for the novelty to start wearing off.  Then the narrative arc takes a nosedive: he’s breaking his promises,  he’s letting us down, he’s not so different after all.

The words of H.L. Mencken apply.  “When I hear a man applauded by the mob,” the Sage of Baltimore wrote, “I always feel a pang of pity for him.  All he has to do to be hissed is to live long enough.”  Barack Obama has now lived long enough to attract his fair share of hisses, boos, and catcalls.

Along with prolonging and expanding one war in Afghanistan, the Nobel Peace laureate has played a leading role in starting another war in Libya.  Laboring to distinguish between this administration and its predecessor, Obama’s defenders emphasize the purity of his motives.  Contemptuous of George W. Bush’s claim that U.S. forces invaded oil-rich Iraq to keep weapons of mass destruction out of the hands of terrorists, they readily accept this president’s insistence that the United States intervened in oil-rich Libya to prevent genocidal slaughter.  Besides, testifying to our virtuous intent, this time we’ve got the French with us rather than against us.

Explaining Why Is a Mug’s Game

In truth, to ascribe a single governing purpose or rationale to any large-scale foreign policy initiative is to engage in willful distortion.  In any administration, action grows out of consensus.  The existence of consensus among any president’s advisers — LBJ’s inner circle supporting escalation in South Vietnam back in 1965, George W.’s pressing for regime change in Baghdad — does not imply across-the-board agreement as to intent.

Motive is slippery.  As Paul Wolfowitz famously noted regarding Iraq, weapons of mass destruction merely provided the agreed upon public rationale for war.  In reality, a mix of motives probably shaped the decision to invade.  For some administration officials, there was the prospect of eliminating a perceived source of mischief while providing an object lesson to other would-be troublemakers.  For others, there was the promise of reasserting U.S. hegemony over the world’s energy heartland.  For others still (including Wolfowitz himself), there were alluring visions of a region transformed, democratized, and pacified, the very sources of Islamist terror thereby eliminated once and for all. 

At least on the margins, expanding the powers of the presidency at the expense of Congress, bolstering the security of Israel, and finishing what daddy had left undone also likely figured in the equation.  Within this mix, policymakers could pick and choose.

In the face of changing circumstances, they even claimed the prerogative of revising their choices.  Who can doubt that President Bush, faced with the Big Oops — the weapons of mass destruction that turned out not to exist — genuinely persuaded himself that America’s true and abiding purpose for invading Iraq had been to liberate the Iraqi people from brutal oppression?  After all, right from day one wasn’t the campaign called Operation Iraqi Freedom?

So even as journalists and historians preoccupy themselves with trying to explain why something happened, they are playing a mug’s game.  However creative or well-sourced, their answers are necessarily speculative, partial, and ambiguous.  It can’t be otherwise.

Rather than why, what deserves far more attention than it generally receives is the question of how.  Here is where we find Barack Obama and George W. Bush (not to mention Bill Clinton, George H. W. Bush, Ronald Reagan, and Jimmy Carter) joined at the hip.  When it comes to the Islamic world, for more than three decades now Washington’s answer to how has been remarkably consistent: through the determined application of hard power wielded by the United States.  Simply put, Washington’s how implies a concerted emphasis on girding for and engaging in war. 

Presidents may not agree on exactly what we are trying to achieve in the Greater Middle East (Obama wouldn’t be caught dead reciting lines from Bush’s Freedom Agenda, for example), but for the past several decades, they have agreed on means: whatever it is we want done, military might holds the key to doing it.  So today, we have the extraordinary spectacle of Obama embracing and expanding Bush’s Global War on Terror even after having permanently banished that phrase to the Guantanamo of politically incorrect speech.

The Big How — By Force

Efforts to divine this administration’s intent in Libya have centered on the purported influence of the Three Harpies: Secretary of State Hillary Clinton, U.N. Ambassador Susan Rice, and National Security Council Human Rights Director Samantha Power, women in positions of influence ostensibly burdened with regret that the United States failed back in 1994 to respond effectively to the Rwandan genocide and determined this time to get it right.  Yet this is insider stuff, which necessarily remains subject to considerable speculation.  What we can say for sure is this: by seeing the Greater Middle East as a region of loose nails badly in need of being hammered, the current commander-in-chief has claimed his place in the ranks of a long list of his warrior-predecessors.

The key point is this: like those who preceded them, neither Obama nor his Harpies (nor anyone else in a position of influence) could evidently be bothered to assess whether the hammer actually works as advertised — notwithstanding abundant evidence showing that it doesn’t.

The sequence of military adventures set in motion when Jimmy Carter promulgated his Carter Doctrine back in 1980 makes for an interesting story but not a very pretty one.  Ronald Reagan’s effort to bring peace to Lebanon ended in 1983 in a bloody catastrophe.  The nominal victory of Operation Desert Storm in 1991, which pushed Saddam Hussein’s forces out of Kuwait, produced little except woeful complications, which Bill Clinton’s penchant for flinging bombs and missiles about during the 1990s did little to resolve or conceal.  The blowback stemming from our first Afghanistan intervention against the Soviets helped create the conditions leading to 9/11 and another Afghanistan War, now approaching its tenth anniversary with no clear end in sight.  As for George W. Bush’s second go at Iraq, the less said the better.  Now, there is Libya.

The question demands to be asked: Are we winning yet?  And if not, why persist in an effort for which great pain is repaid with such little gain?

Perhaps Barack Obama found his political soul mate in Samantha Power, making her determination to alleviate evil around the world his own.  Or perhaps he is just another calculating politician who speaks the language of ideals while pursuing less exalted purposes.  In either case, the immediate relevance of the question is limited.  The how rather than the why is determinant.

Whatever his motives, by conforming to a pre-existing American penchant for using force in the Greater Middle East, this president has chosen the wrong tool.  In doing so, he condemns himself and the country to persisting in the folly of his predecessors.  The failure is one of imagination, but also of courage.  He promised, and we deserve, something better.

Andrew J. Bacevich is professor of history and international relations at Boston University.  His most recent book, Washington Rules: America’s Path to Permanent War (Metropolitan Books), is just out in paperback. To catch Timothy MacBain’s latest TomCast audio interview in which Bacevich discusses what to make of the Obama administration’s Libyan intervention, click here, or download it to your iPod here.

Copyright 2011 Andrew Bacevich

Not Why, But How

In defense circles, “cutting” the Pentagon budget has once again become a topic of conversation.  Americans should not confuse that talk with reality.  Any cuts exacted will at most reduce the rate of growth.  The essential facts remain: U.S. military outlays today equal those of every other nation on the planet combined, a situation without precedent in modern history.

The Pentagon presently spends more in constant dollars than it did at any time during the Cold War — this despite the absence of anything remotely approximating what national security experts like to call a “peer competitor.”  Evil Empire?  It exists only in the fevered imaginations of those who quiver at the prospect of China adding a rust-bucket Russian aircraft carrier to its fleet or who take seriously the ravings of radical Islamists promising from deep inside their caves to unite the Umma in a new caliphate.

What are Americans getting for their money?  Sadly, not much.  Despite extraordinary expenditures (not to mention exertions and sacrifices by U.S. forces), the return on investment is, to be generous, unimpressive.  The chief lesson to emerge from the battlefields of the post-9/11 era is this: the Pentagon possesses next to no ability to translate “military supremacy” into meaningful victory.

Washington knows how to start wars and how to prolong them, but is clueless when it comes to ending them.  Iraq, the latest addition to the roster of America’s forgotten wars, stands as exhibit A.  Each bomb that blows up in Baghdad or some other Iraqi city, splattering blood all over the streets, testifies to the manifest absurdity of judging “the surge” as the epic feat of arms celebrated by the Petraeus lobby.

The problems are strategic as well as operational.  Old Cold War-era expectations that projecting U.S. power will enhance American clout and standing no longer apply, especially in the Islamic world.  There, American military activities are instead fostering instability and inciting anti-Americanism.  For Exhibit B, see the deepening morass that Washington refers to as AfPak or the Afghanistan-Pakistan theater of operations.

Add to that the mountain of evidence showing that Pentagon, Inc. is a miserably managed enterprise: hide-bound, bloated, slow-moving, and prone to wasting resources on a prodigious scale — nowhere more so than in weapons procurement and the outsourcing of previously military functions to “contractors.”  When it comes to national security, effectiveness (what works) should rightly take precedence over efficiency (at what cost?) as the overriding measure of merit.  Yet beyond a certain level, inefficiency undermines effectiveness, with the Pentagon stubbornly and habitually exceeding that level.  By comparison, Detroit’s much-maligned Big Three offer models of well-run enterprises.

Impregnable Defenses

All of this takes place against the backdrop of mounting problems at home: stubbornly high unemployment, trillion-dollar federal deficits, massive and mounting debt, and domestic needs like education, infrastructure, and employment crying out for attention.

Yet the defense budget — a misnomer since for Pentagon, Inc. defense per se figures as an afterthought — remains a sacred cow.  Why is that? 

The answer lies first in understanding the defenses arrayed around that cow to ensure that it remains untouched and untouchable.  Exemplifying what the military likes to call a “defense in depth,” that protective shield consists of four distinct but mutually supporting layers. 

Institutional Self-Interest: Victory in World War II produced not peace, but an atmosphere of permanent national security crisis.  As never before in U.S. history, threats to the nation’s existence seemed omnipresent, an attitude first born in the late 1940s that still persists today.  In Washington, fear — partly genuine, partly contrived — triggered a powerful response. 

One result was the emergence of the national security state, an array of institutions that depended on (and therefore strove to perpetuate) this atmosphere of crisis to justify their existence, status, prerogatives, and budgetary claims.  In addition, a permanent arms industry arose, which soon became a major source of jobs and corporate profits.  Politicians of both parties were quick to identify the advantages of aligning with this “military-industrial complex,” as President Eisenhower described it. 

Allied with (and feeding off of) this vast apparatus that transformed tax dollars into appropriations, corporate profits, campaign contributions, and votes was an intellectual axis of sorts  — government-supported laboratories, university research institutes, publications, think tanks, and lobbying firms (many staffed by former or would-be senior officials) — devoted to identifying (or conjuring up) ostensible national security challenges and alarms, always assumed to be serious and getting worse, and then devising responses to them. 

The upshot: within Washington, the voices carrying weight in any national security “debate” all share a predisposition for sustaining very high levels of military spending for reasons having increasingly little to do with the well-being of the country.

Strategic Inertia: In a 1948 State Department document, diplomat George F. Kennan offered this observation: “We have about 50 percent of the world’s wealth, but only 6.3 percent of its population.”  The challenge facing American policymakers, he continued, was “to devise a pattern of relationships that will permit us to maintain this disparity.”  Here we have a description of American purposes that is far more candid than all of the rhetoric about promoting freedom and democracy, seeking world peace, or exercising global leadership. 

The end of World War II found the United States in a spectacularly privileged position.  Not for nothing do Americans remember the immediate postwar era as a Golden Age of middle-class prosperity.  Policymakers since Kennan’s time have sought to preserve that globally privileged position.  The effort has been a largely futile one. 

By 1950 at the latest, those policymakers (with Kennan by then a notable dissenter) had concluded that the possession and deployment of military power held the key to preserving America’s exalted status.  The presence of U.S. forces abroad and a demonstrated willingness to intervene, whether overtly or covertly, just about anywhere on the planet would promote stability, ensure U.S. access to markets and resources, and generally serve to enhance the country’s influence in the eyes of friend and foe alike — this was the idea, at least. 

In postwar Europe and postwar Japan, this formula achieved considerable success.  Elsewhere — notably in Korea, Vietnam, Latin America, and (especially after 1980) in the so-called Greater Middle East — it either produced mixed results or failed catastrophically.  Certainly, the events of the post-9/11 era provide little reason to believe that this presence/power-projection paradigm will provide an antidote to the threat posed by violent anti-Western jihadism.  If anything, adherence to it is exacerbating the problem by creating ever greater anti-American animus.

One might think that the manifest shortcomings of the presence/power-projection approach — trillions expended in Iraq for what? — might stimulate present-day Washington to pose some first-order questions about basic U.S. national security strategy.  A certain amount of introspection would seem to be called for.  Could, for example, the effort to sustain what remains of America’s privileged status benefit from another approach? 

Yet there are few indications that our political leaders, the senior-most echelons of the officer corps, or those who shape opinion outside of government are capable of seriously entertaining any such debate.  Whether through ignorance, arrogance, or a lack of imagination, the pre-existing strategic paradigm stubbornly persists; so, too, as if by default do the high levels of military spending that the strategy entails.

Cultural Dissonance: The rise of the Tea Party movement should disabuse any American of the thought that the cleavages produced by the “culture wars” have healed.  The cultural upheaval touched off by the 1960s and centered on Vietnam remains unfinished business in this country. 

Among other things, the sixties destroyed an American consensus, forged during World War II, about the meaning of patriotism.  During the so-called Good War, love of country implied, even required, deference to the state, shown most clearly in the willingness of individuals to accept the government’s authority to mandate military service.  GI’s, the vast majority of them draftees, were the embodiment of American patriotism, risking life and limb to defend the country. 

The GI of World War II had been an American Everyman.  Those soldiers both represented and reflected the values of the nation from which they came (a perception affirmed by the ironic fact that the military adhered to prevailing standards of racial segregation).  It was “our army” because that army was “us.” 

With Vietnam, things became more complicated.  The war’s supporters argued that the World War II tradition still applied: patriotism required deference to the commands of the state.  Opponents of the war, especially those facing the prospect of conscription, insisted otherwise.  They revived a distinction, formulated a generation earlier by the radical journalist Randolph Bourne, between the country and the state.  Real patriots, the ones who most truly loved their country, were those who opposed state policies they regarded as misguided, illegal, or immoral.

In many respects, the soldiers who fought the Vietnam War found themselves caught uncomfortably in the center of this dispute.  Was the soldier who died in Vietnam a martyr, a tragic figure, or a sap?  Who deserved greater admiration:  the soldier who fought bravely and uncomplainingly or the one who served and then turned against the war?  Or was the war resister — the one who never served at all — the real hero? 

War’s end left these matters disconcertingly unresolved.  President Richard Nixon’s 1971 decision to kill the draft in favor of an All-Volunteer Force, predicated on the notion that the country might be better served with a military that was no longer “us,” only complicated things further.  So, too, did the trends in American politics where bona fide war heroes (George H.W. Bush, Bob Dole, John Kerry, and John McCain) routinely lost to opponents whose military credentials were non-existent or exceedingly slight (Bill Clinton, George W. Bush, and Barack Obama), yet who demonstrated once in office a remarkable propensity for expending American blood (none belonging to members of their own families) in places like Somalia, Iraq, and Afghanistan.  It was all more than a little unseemly.

Patriotism, once a simple concept, had become both confusing and contentious.  What obligations, if any, did patriotism impose?  And if the answer was none — the option Americans seemed increasingly to prefer — then was patriotism itself still a viable proposition? 

Wanting to answer that question in the affirmative — to distract attention from the fact that patriotism had become little more than an excuse for fireworks displays and taking the occasional day off from work — people and politicians alike found a way to do so by exalting those Americans actually choosing to serve in uniform.  The thinking went this way: soldiers offer living proof that America is a place still worth dying for, that patriotism (at least in some quarters) remains alive and well; by common consent, therefore, soldiers are the nation’s “best,” committed to “something bigger than self” in a land otherwise increasingly absorbed in pursuing a material and narcissistic definition of self-fulfillment. 

In effect, soldiers offer much-needed assurance that old-fashioned values still survive, even if confined to a small and unrepresentative segment of American society.  Rather than Everyman, today’s warrior has ascended to the status of icon, deemed morally superior to the nation for which he or she fights, the repository of virtues that prop up, however precariously, the nation’s increasingly sketchy claim to singularity.

Politically, therefore, “supporting the troops” has become a categorical imperative across the political spectrum.  In theory, such support might find expression in a determination to protect those troops from abuse, and so translate into wariness about committing soldiers to unnecessary or unnecessarily costly wars.  In practice, however, “supporting the troops” has found expression in an insistence upon providing the Pentagon with open-ended drawing rights on the nation’s treasury, thereby creating massive barriers to any proposal to effect more than symbolic reductions in military spending.

Misremembered History: The duopoly of American politics no longer allows for a principled anti-interventionist position.  Both parties are war parties.  They differ mainly in the rationale they devise to argue for interventionism.  The Republicans tout liberty; the Democrats emphasize human rights.  The results tend to be the same: a penchant for activism that sustains a never-ending demand for high levels of military outlays.

American politics once nourished a lively anti-interventionist tradition.  Leading proponents included luminaries such as George Washington and John Quincy Adams.  That tradition found its basis not in principled pacifism, a position that has never attracted widespread support in this country, but in pragmatic realism.  What happened to that realist tradition?  Simply put, World War II killed it — or at least discredited it.  In the intense and divisive debate that occurred in 1939-1941, the anti-interventionists lost, their cause thereafter tarred with the label “isolationism.” 

The passage of time has transformed World War II from a massive tragedy into a morality tale, one that casts opponents of intervention as blackguards.  Whether explicitly or implicitly, the debate over how the United States should respond to some ostensible threat — Iraq in 2003, Iran today — replays the debate finally ended by the events of December 7, 1941.  To express skepticism about the necessity and prudence of using military power is to invite the charge of being an appeaser or an isolationist.  Few politicians or individuals aspiring to power will risk the consequences of being tagged with that label. 

In this sense, American politics remains stuck in the 1930s — always discovering a new Hitler, always privileging Churchillian rhetoric — even though the circumstances in which we live today bear scant resemblance to that earlier time.  There was only one Hitler and he’s long dead.  As for Churchill, his achievements and legacy are far more mixed than his battalions of defenders are willing to acknowledge.  And if any one figure deserves particular credit for demolishing Hitler’s Reich and winning World War II, it’s Josef Stalin, a dictator as vile and murderous as Hitler himself. 

Until Americans accept these facts, until they come to a more nuanced view of World War II that takes fully into account the political and moral implications of the U.S. alliance with the Soviet Union and the U.S. campaign of obliteration bombing directed against Germany and Japan, the mythic version of “the Good War” will continue to provide glib justifications for continuing to dodge that perennial question: How much is enough?

Like concentric security barriers arrayed around the Pentagon, these four factors — institutional self-interest, strategic inertia, cultural dissonance, and misremembered history — insulate the military budget from serious scrutiny.  For advocates of a militarized approach to policy, they provide invaluable assets, to be defended at all costs. 

Andrew J. Bacevich is professor of history and international relations at Boston University.  His most recent book is Washington Rules:  America’s Path to Permanent War.  To listen to Timothy MacBain’s latest TomCast audio interview in which Bacevich discusses the money that pours into the national security budget, click here or, to download it to your iPod, here.

Copyright 2011 Andrew Bacevich

Cow Most Sacred

In January 1863, President Abraham Lincoln’s charge to a newly-appointed commanding general was simplicity itself: “give us victories.”  President Barack Obama’s tacit charge to his generals amounts to this: give us conditions permitting a dignified withdrawal.  A pithy quote in Bob Woodward’s new book captures the essence of an emerging Obama Doctrine: “hand it off and get out.”

Getting into a war is generally a piece of cake.  Getting out tends to be another matter altogether — especially when the commander-in-chief and his commanders in the field disagree on the advisability of doing so.

Happy Anniversary, America.  Nine years ago today — on October 7, 2001 — a series of U.S. air strikes against targets across Afghanistan launched the opening campaign of what has since become the nation’s longest war.  Three thousand two hundred and eighty-seven days later, the fight to determine Afghanistan’s future continues.  At least in part, “Operation Enduring Freedom” has lived up to its name:  it has certainly proven to be enduring.

As the conflict formerly known as the Global War on Terror enters its tenth year, Americans are entitled to pose this question: When, where, and how will the war end?  Bluntly, are we almost there yet?

Of course, with the passage of time, where “there” is has become increasingly difficult to discern.  Baghdad turned out not to be Berlin and Kandahar is surely not Tokyo.  Don’t look for CNN to be televising a surrender ceremony anytime soon.

This much we know: an enterprise that began in Afghanistan but soon after focused on Iraq has now shifted back — again — to Afghanistan.  Whether the swings of this pendulum signify progress toward some final objective is anyone’s guess.

To measure progress during wartime, Americans once employed pins and maps.  Plotting the conflict triggered by 9/11 will no doubt improve your knowledge of world geography, but it won’t tell you anything about where this war is headed.

Where, then, have nine years of fighting left us?  Chastened, but not necessarily enlightened.

Just over a decade ago, the now-forgotten Kosovo campaign seemingly offered a template for a new American way of war.  It was a victory gained without suffering a single American fatality.  Kosovo turned out, however, to be a one-off event.  No doubt the United States military was then (and remains today) unbeatable in traditional terms.  Yet, after 9/11, Washington committed that military to an endeavor that it manifestly cannot win.

Rather than probing the implications of this fact — relying on the force of arms to eliminate terrorism is a fool’s errand — two administrations have doggedly prolonged the war even as they quietly ratcheted down expectations of what it might accomplish.

In officially ending the U.S. combat role in Iraq earlier this year — a happy day if there ever was one — President Obama refrained from proclaiming “mission accomplished.”  As well he might: as U.S. troops depart Iraq, insurgents remain active and in the field.  Instead of declaring victory, the president simply urged Americans to turn the page.  With remarkable alacrity, most of us seem to have complied.

Perhaps more surprisingly, today’s military leaders have themselves abandoned the notion that winning battles wins wars, once the very foundation of their profession.  Warriors of an earlier day insisted: “There is no substitute for victory.”  Warriors in the Age of David Petraeus embrace an altogether different motto: “There is no military solution.”

Here is Brigadier General H. R. McMaster, one of the Army’s rising stars, summarizing the latest in advanced military thinking:  “Simply fighting and winning a series of interconnected battles in a well developed campaign does not automatically deliver the achievement of war aims.”  Winning as such is out.  Persevering is in.

So an officer corps once intent above all on avoiding protracted wars now specializes in quagmires.  Campaigns don’t really end.  At best, they peter out.

Formerly trained to kill people and break things, American soldiers now attend to winning hearts and minds, while moonlighting in assassination.  The politically correct term for this is “counterinsurgency.”

Now, assigning combat soldiers the task of nation-building in, say, Mesopotamia is akin to hiring a crew of lumberjacks to build a house in suburbia.  What astonishes is not that the result falls short of perfection, but that any part of the job gets done at all.

Yet by simultaneously adopting the practice of “targeted killing,” the home builders do double-duty as home wreckers.  For American assassins, the weapon of choice is not the sniper rifle or the shiv, but missile-carrying pilotless aircraft controlled from bases in Nevada and elsewhere thousands of miles from the battlefield — the ultimate expression of an American desire to wage war without getting our hands dirty.

In practice, however, killing the guilty from afar not infrequently entails killing innocents as well.  So actions undertaken to deplete the ranks of jihadists as far afield as Pakistan, Yemen, and Somalia unwittingly ensure the recruitment of replacements, guaranteeing a never-ending supply of hardened hearts to soften.

No wonder the campaigns launched since 9/11 drag on and on.  General Petraeus himself has spelled out the implications: “This is the kind of fight we’re in for the rest of our lives and probably our kids’ lives.”  Obama may want to “get out.”  His generals are inclined to stay the course.

Taking longer to achieve less than we initially intended is also costing far more than anyone ever imagined.  Back in 2002, White House economic adviser Lawrence Lindsey suggested that invading Iraq might run up a tab of as much as $200 billion — a seemingly astronomical sum.  Although Lindsey soon found himself out of a job as a result, he turned out to be a piker.  The bill for our post-9/11 wars already exceeds a trillion dollars, all of it piled atop our mushrooming national debt.  Helped in no small measure by Obama’s war policies, the meter is still running.

So are we almost there yet?  Not even.  The truth is we’re lost in the desert, careening down an unmarked road, odometer busted, GPS on the fritz, and fuel gauge hovering just above E.  Washington can only hope that the American people, napping in the backseat, won’t notice.

Andrew J. Bacevich is professor of history and international relations at Boston University.  His bestselling new book is Washington Rules: America’s Path to Permanent War.

Copyright 2010 Andrew J. Bacevich

The Long War: Year Ten

Once a serious journalist, the Washington Post’s Bob Woodward now makes a very fine living as chief gossip-monger of the governing class.  Early on in his career, along with Carl Bernstein, his partner at the time, Woodward confronted power.  Today, by relentlessly exalting Washington trivia, he flatters power.  His reporting does not inform. It titillates.

A new Woodward book, Obama’s Wars, is a guaranteed blockbuster.  It’s out this week, already causing a stir, and guaranteed to be forgotten the week after dropping off the bestseller lists.  For good reason: when it comes to substance, any book written by Woodward has about as much heft as the latest potboiler penned by the likes of James Patterson or Tom Clancy.

Back in 2002, for example, during the run-up to the invasion of Iraq, Woodward treated us to Bush at War.  Based on interviews with unidentified officials close to President George W. Bush, the book offered a portrait of the president-as-resolute-war-leader that put him in a league with Abraham Lincoln and Franklin Roosevelt.  But the book’s real juice came from what it revealed about events behind the scenes.  “Bush’s war cabinet is riven with feuding,” reported the Times of London, which credited Woodward with revealing “the furious arguments and personal animosity” that divided Bush’s lieutenants.

Of course, the problem with the Bush administration wasn’t that folks on the inside didn’t play nice with one another.  No, the problem was that the president and his inner circle committed a long series of catastrophic errors that produced an unnecessary and grotesquely mismanaged war.  That war has cost the country dearly — although the people who engineered that catastrophe, many of them having pocketed handsome advances on their forthcoming memoirs, continue to manage quite well, thank you.

To judge by the publicity blitzkrieg announcing the arrival of Obama’s Wars in your local bookstore, the big news out of Washington is that, even today, politics there remains an intensely competitive sport, with the participants, whether in anger or frustration, sometimes speaking ill of one another.

Essentially, news reports indicate, Woodward has updated his script from 2002.  The characters have different names, but the plot remains the same.  Talk about jumping the shark.

So we learn that Obama political adviser David Axelrod doesn’t fully trust Secretary of State Hillary Clinton.  National security adviser James Jones, a retired Marine general, doesn’t much care for the likes of Axelrod, and will say so behind his back.  Almost everyone thinks Richard Holbrooke, chief State Department impresario of the AfPak portfolio, is a jerk.  And — stop the presses — when under the influence of alcohol, General David Petraeus, commander of U.S. and allied forces in Afghanistan, is alleged to use the word “f**ked.”  These are the sort of shocking revelations that make you a headliner on the Sunday morning talk shows.

Based on what we have learned so far from those select few provided with advance copies of the book — mostly reporters for the Post and The New York Times who, for whatever reason, seem happy to serve as its shills — Obama’s Wars contains hints of another story, the significance of which seems to have eluded Woodward.

The theme of that story is not whether Dick likes Jane, but whether the Constitution remains an operative document.  The Constitution explicitly assigns to the president the role of commander-in-chief. Responsibility for the direction of American wars rests with him. According to the principle of civilian control, senior military officers advise and execute, but it’s the president who decides.  That’s the theory, at least.  Reality turns out to be considerably different and, to be kind about it, more complicated.

Obama’s Wars reportedly contains this comment by President Obama to Secretary Clinton and Secretary of Defense Robert Gates regarding Afghanistan:  “I’m not doing 10 years… I’m not doing long-term nation-building. I am not spending a trillion dollars.”

Aren’t you, Mr. President?  Don’t be so sure.

Obama’s Wars also affirms what we already suspected about the decision-making process that led up to the president’s announcement at West Point in December 2009 to prolong and escalate the war. Bluntly put, the Pentagon gamed the process to exclude any possibility of Obama rendering a decision not to its liking.

Pick your surge: 20,000 troops? Or 30,000 troops?  Or 40,000 troops?  Only the most powerful man in the world — or Goldilocks contemplating three bowls of porridge — could handle a decision like that.  Even as Obama opted for the middle course, the real decision had already been made elsewhere by others: the war in Afghanistan would expand and continue.

And then there’s this from the estimable General David Petraeus: “I don’t think you win this war,” Woodward quotes the field commander as saying. “I think you keep fighting… This is the kind of fight we’re in for the rest of our lives and probably our kids’ lives.”

Here we confront a series of questions to which Woodward (not to mention the rest of Washington) remains steadfastly oblivious.  Why fight a war that even the general in charge says can’t be won?  What will the perpetuation of this conflict cost?  Who will it benefit?  Does the ostensibly most powerful nation in the world have no choice but to wage permanent war?  Are there no alternatives?  Can Obama shut down an unwinnable war now about to enter its tenth year?  Or is he — along with the rest of us — a prisoner of war?

President Obama has repeatedly stated that in July 2011 a withdrawal of U.S. troops from Afghanistan will commence.  No one quite knows exactly what that means.  Will the withdrawal be symbolic?  General Petraeus has already made it abundantly clear that he will entertain nothing more.  Or will July signal that the Afghan War — and by extension the Global War on Terror launched nine years ago — is finally coming to an end?

Between now and next summer attentive Americans will learn much about how national security policy is actually formulated and who is really in charge.  Just don’t expect Bob Woodward to offer any enlightenment on the subject.

Andrew J. Bacevich is professor of history and international relations at Boston University.  His new book is Washington Rules: America’s Path to Permanent War.

Copyright 2010 Andrew J. Bacevich

Prisoners of War

Worldly ambition inhibits true learning. Ask me. I know. A young man in a hurry is nearly uneducable: He knows what he wants and where he’s headed; when it comes to looking back or entertaining heretical thoughts, he has neither the time nor the inclination. All that counts is that he is going somewhere. Only as ambition wanes does education become a possibility.

My own education did not commence until I had reached middle age. I can fix its start date with precision: for me, education began in Berlin, on a winter’s evening, at the Brandenburg Gate, not long after the Berlin Wall had fallen.

As an officer in the U.S. Army I had spent considerable time in Germany. Until that moment, however, my family and I had never had occasion to visit this most famous of German cities, still littered with artifacts of a deeply repellent history. At the end of a long day of exploration, we found ourselves in what had, until just months before, been the communist East. It was late and we were hungry, but I insisted on walking the length of the Unter den Linden, from the River Spree to the gate itself. A cold rain was falling and the pavement glistened. The buildings lining the avenue, dating from the era of Prussian kings, were dark, dirty, and pitted. Few people were about. It was hardly a night for sightseeing.

For as long as I could remember, the Brandenburg Gate had been the preeminent symbol of the age and Berlin the epicenter of contemporary history. Yet by the time I made it to the once and future German capital, history was already moving on. The Cold War had abruptly ended. A divided city and a divided nation had reunited.

For Americans who had known Berlin only from a distance, the city existed primarily as a metaphor. Pick a date — 1933, 1942, 1945, 1948, 1961, 1989 — and Berlin becomes an instructive symbol of power, depravity, tragedy, defiance, endurance, or vindication. For those inclined to view the past as a chronicle of parables, the modern history of Berlin offered an abundance of material. The greatest of those parables emerged from the events of 1933 to 1945, an epic tale of evil ascendant, belatedly confronted, then heroically overthrown. A second narrative, woven from events during the intense period immediately following World War II, saw hopes for peace dashed, yielding bitter antagonism but also great resolve. The ensuing stand-off — the “long twilight struggle,” in John Kennedy’s memorable phrase — formed the centerpiece of the third parable, its central theme stubborn courage in the face of looming peril. Finally came the exhilarating events of 1989, with freedom ultimately prevailing, not only in Berlin, but throughout Eastern Europe.

What exactly was I looking for at the Brandenburg Gate? Perhaps confirmation that those parables, which I had absorbed and accepted as true, were just that. Whatever I expected, what I actually found was a cluster of shabby-looking young men, not German, hawking badges, medallions, hats, bits of uniforms, and other artifacts of the mighty Red Army. It was all junk, cheaply made and shoddy. For a handful of deutsche marks, I bought a wristwatch emblazoned with the symbol of the Soviet armored corps. Within days, it ceased to work.

Huddling among the scarred columns, those peddlers — almost certainly off-duty Russian soldiers awaiting redeployment home — constituted a subversive presence. They were loose ends of a story that was supposed to have ended neatly when the Berlin Wall came down. As we hurried off to find warmth and a meal, this disconcerting encounter stuck with me, and I began to entertain this possibility: that the truths I had accumulated over the previous twenty years as a professional soldier — especially truths about the Cold War and U.S. foreign policy — might not be entirely true.

By temperament and upbringing, I had always taken comfort in orthodoxy. In a life spent subject to authority, deference had become a deeply ingrained habit. I found assurance in conventional wisdom. Now, I started, however hesitantly, to suspect that orthodoxy might be a sham. I began to appreciate that authentic truth is never simple and that any version of truth handed down from on high — whether by presidents, prime ministers, or archbishops — is inherently suspect. The powerful, I came to see, reveal truth only to the extent that it suits them. Even then, the truths to which they testify come wrapped in a nearly invisible filament of dissembling, deception, and duplicity. The exercise of power necessarily involves manipulation and is antithetical to candor.

I came to these obvious points embarrassingly late in life. “Nothing is so astonishing in education,” the historian Henry Adams once wrote, “as the amount of ignorance it accumulates in the form of inert facts.” Until that moment I had too often confused education with accumulating and cataloging facts. In Berlin, at the foot of the Brandenburg Gate, I began to realize that I had been a naïf. And so, at age 41, I set out, in a halting and haphazard fashion, to acquire a genuine education.

Twenty years later I’ve made only modest progress. What follows is an accounting of what I have learned thus far.

Visiting a Third-World Version of Germany

In October 1990, I’d gotten a preliminary hint that something might be amiss in my prior education. On October 3rd, communist East Germany — formally the German Democratic Republic (GDR) — ceased to exist and German reunification was officially secured. That very week I accompanied a group of American military officers to the city of Jena in what had been the GDR. Our purpose was self-consciously educational — to study the famous battle of Jena-Auerstädt in which Napoleon Bonaparte and his marshals had inflicted an epic defeat on Prussian forces commanded by the Duke of Brunswick. (The outcome of that 1806 battle inspired the philosopher Hegel, then residing in Jena, to declare that the “end of history” was at hand. The conclusion of the Cold War had only recently elicited a similarly exuberant judgment from the American scholar Francis Fukuyama.)

On this trip we did learn a lot about the conduct of that battle, although mainly inert facts possessing little real educational value. Inadvertently, we also gained insight into the reality of life on the far side of what Americans had habitually called the Iron Curtain, known in U.S. military vernacular as “the trace.” In this regard, the trip proved nothing less than revelatory. The educational content of this excursion would — for me — be difficult to exaggerate.

As soon as our bus crossed the old Inner German Border, we entered a time warp. For U.S. troops garrisoned throughout Bavaria and Hesse, West Germany had for decades served as a sort of theme park — a giant Epcot filled with quaint villages, stunning scenery, and superb highways, along with ample supplies of quite decent food, excellent beer, and accommodating women. Now, we found ourselves face-to-face with an altogether different Germany. Although commonly depicted as the most advanced and successful component of the Soviet Empire, East Germany more closely resembled part of the undeveloped world.

The roads — even the main highways — were narrow and visibly crumbling. Traffic posed little problem. Apart from a few sluggish Trabants and Wartburgs — East German automobiles that tended to a retro primitivism — and an occasional exhaust-spewing truck, the way was clear. The villages through which we passed were forlorn and the small farms down at the heels. For lunch we stopped at a roadside stand. The proprietor happily accepted our D-marks, offering us inedible sausages in exchange. Although the signs assured us that we remained in a land of German speakers, it was a country that had not yet recovered from World War II.

Upon arrival in Jena, we checked into the Hotel Schwarzer Bär, identified by our advance party as the best hostelry in town. It turned out to be a rundown fleabag. As the senior officer present, I was privileged to have a room in which the plumbing functioned. Others were not so lucky.

Jena itself was a midsized university city, with its main academic complex immediately opposite our hotel. A very large bust of Karl Marx, mounted on a granite pedestal and badly in need of cleaning, stood on the edge of the campus. Briquettes of soft coal used for home heating made the air all but unbreathable and coated everything with soot. In the German cities we knew, pastels predominated — houses and apartment blocks painted pale green, muted salmon, and soft yellow. Here everything was brown and gray.

That evening we set out in search of dinner. The restaurants within walking distance were few and unattractive. We chose badly, a drab establishment in which fresh vegetables were unavailable and the wurst inferior. The adequacy of the local beer provided the sole consolation.

The following morning, on the way to the battlefield, we noted a significant Soviet military presence, mostly in the form of trucks passing by — to judge by their appearance, designs that dated from the 1950s. To our surprise, we discovered that the Soviets had established a small training area adjacent to where Napoleon had vanquished the Prussians. Although we had orders to avoid contact with any Russians, the presence of their armored troops going through their paces riveted us. Here was something of far greater immediacy than Bonaparte and the Duke of Brunswick: “the other,” about which we had for so long heard so much but knew so little. Through binoculars, we watched a column of Russian armored vehicles — BMPs, in NATO parlance — traversing what appeared to be a drivers’ training course. Suddenly, one of them began spewing smoke. Soon thereafter, it burst into flames.

Here was education, although at the time I had only the vaguest sense of its significance.

An Ambitious Team Player Assailed by Doubts

These visits to Jena and Berlin offered glimpses of a reality radically at odds with my most fundamental assumptions. Uninvited and unexpected, subversive forces had begun to infiltrate my consciousness. Bit by bit, my worldview started to crumble.

That worldview had derived from this conviction: that American power manifested a commitment to global leadership, and that both together expressed and affirmed the nation’s enduring devotion to its founding ideals. That American power, policies, and purpose were bound together in a neat, internally consistent package, each element drawing strength from and reinforcing the others, was something I took as a given. That, during my adult life, a penchant for interventionism had become a signature of U.S. policy did not — to me, at least — in any way contradict America’s aspirations for peace. Instead, a willingness to expend lives and treasure in distant places testified to the seriousness of those aspirations. That, during this same period, the United States had amassed an arsenal of over 31,000 nuclear weapons, some small number of them assigned to units in which I had served, was not at odds with our belief in the inalienable right to life and liberty; rather, threats to life and liberty had compelled the United States to acquire such an arsenal and maintain it in readiness for instant use.

I was not so naïve as to believe that the American record had been without flaws. Yet I assured myself that any errors or misjudgments had been committed in good faith. Furthermore, circumstances permitted little real choice. In Southeast Asia as in Western Europe, in the Persian Gulf as in the Western Hemisphere, the United States had simply done what needed doing. Viable alternatives did not exist. To consent to any dilution of American power would be to forfeit global leadership, thereby putting at risk safety, prosperity, and freedom, not only our own but also that of our friends and allies.

The choices seemed clear enough. On one side was the status quo: the commitments, customs, and habits that defined American globalism, implemented by the national security apparatus within which I functioned as a small cog. On the other side was the prospect of appeasement, isolationism, and catastrophe. The only responsible course was the one to which every president since Harry Truman had adhered.

For me, the Cold War had played a crucial role in sustaining that worldview. Given my age, upbringing, and professional background, it could hardly have been otherwise. Although the great rivalry between the United States and the Soviet Union had contained moments of considerable anxiety — I remember my father, during the Cuban Missile Crisis, stocking our basement with water and canned goods — it served primarily to clarify, not to frighten. The Cold War provided a framework that organized and made sense of contemporary history. It offered a lineup and a scorecard. That there existed bad Germans and good Germans, their Germans and our Germans, totalitarian Germans and Germans who, like Americans, passionately loved freedom was, for example, a proposition I accepted as dogma. Seeing the Cold War as a struggle between good and evil answered many questions, consigned others to the periphery, and rendered still others irrelevant.

Back in the 1960s, during the Vietnam War, more than a few members of my generation had rejected the conception of the Cold War as a Manichean struggle. Here too, I was admittedly a slow learner. Yet having kept the faith long after others had lost theirs, the doubts that eventually assailed me were all the more disorienting.

Granted, occasional suspicions had appeared long before Jena and Berlin. My own Vietnam experience had generated its share, which I had done my best to suppress. I was, after all, a serving soldier. Except in the narrowest of terms, the military profession, in those days at least, did not look kindly on nonconformity. Climbing the ladder of career success required curbing maverick tendencies. To get ahead, you needed to be a team player. Later, when studying the history of U.S. foreign relations in graduate school, I was pelted with challenges to orthodoxy, which I vigorously deflected. When it came to education, graduate school proved a complete waste of time — a period of intense study devoted to the further accumulation of facts, while I exerted myself to ensuring that they remained inert.

Now, however, my personal circumstances were changing. Shortly after the passing of the Cold War, my military career ended. Education thereby became not only a possibility, but also a necessity.

In measured doses, mortification cleanses the soul. It’s the perfect antidote for excessive self-regard. After 23 years spent inside the U.S. Army seemingly going somewhere, I now found myself on the outside going nowhere in particular. In the self-contained and cloistered universe of regimental life, I had briefly risen to the status of minor spear carrier. The instant I took off my uniform, that status vanished. I soon came to a proper appreciation of my own insignificance, a salutary lesson that I ought to have absorbed many years earlier.

As I set out on what eventually became a crablike journey toward a new calling as a teacher and writer — a pilgrimage of sorts — ambition in the commonly accepted meaning of the term ebbed. This did not happen all at once. Yet gradually, trying to grab one of life’s shiny brass rings ceased being a major preoccupation. Wealth, power, and celebrity became not aspirations but subjects for critical analysis. History — especially the familiar narrative of the Cold War — no longer offered answers; instead, it posed perplexing riddles. Easily the most nagging was this one: How could I have so profoundly misjudged the reality of what lay on the far side of the Iron Curtain?

Had I been insufficiently attentive? Or was it possible that I had been snookered all along? Contemplating such questions, while simultaneously witnessing the unfolding of the “long 1990s” — the period bookended by two wars with Iraq when American vainglory reached impressive new heights — prompted the realization that I had grossly misinterpreted the threat posed by America’s adversaries. Yet that was the lesser half of the problem. Far worse than misperceiving “them” was the fact that I had misperceived “us.” What I thought I knew best I actually understood least. Here, the need for education appeared especially acute.

George W. Bush’s decision to launch Operation Iraqi Freedom in 2003 pushed me fully into opposition. Claims that once seemed elementary — above all, claims relating to the essentially benign purposes of American power — now appeared preposterous. The contradictions that found an ostensibly peace-loving nation committing itself to a doctrine of preventive war became too great to ignore. The folly and hubris of the policy makers who heedlessly thrust the nation into an ill-defined and open-ended “global war on terror” without the foggiest notion of what victory would look like, how it would be won, and what it might cost approached standards hitherto achieved only by slightly mad German warlords. During the era of containment, the United States had at least maintained the pretense of a principled strategy; now, the last vestiges of principle gave way to fantasy and opportunism. With that, the worldview to which I had adhered as a young adult and carried into middle age dissolved completely.

Credo and Trinity

What should stand in the place of such discarded convictions? Simply inverting the conventional wisdom, substituting a new Manichean paradigm for the old discredited version — the United States taking the place of the Soviet Union as the source of the world’s evil — would not suffice. Yet arriving at even an approximation of truth would entail subjecting conventional wisdom, both present and past, to sustained and searching scrutiny. Cautiously at first but with growing confidence, this I vowed to do.

Doing so meant shedding habits of conformity acquired over decades. All of my adult life I had been a company man, only dimly aware of the extent to which institutional loyalties induce myopia. Asserting independence required first recognizing the extent to which I had been socialized to accept certain things as unimpeachable. Here then were the preliminary steps essential to making education accessible. Over a period of years, a considerable store of debris had piled up. Now, it all had to go. Belatedly, I learned that more often than not what passes for conventional wisdom is simply wrong. Adopting fashionable attitudes to demonstrate one’s trustworthiness — the world of politics is flush with such people hoping thereby to qualify for inclusion in some inner circle — is akin to engaging in prostitution in exchange for promissory notes. It’s not only demeaning but downright foolhardy.

Washington Rules aims to take stock of conventional wisdom in its most influential and enduring form, namely the package of assumptions, habits, and precepts that have defined the tradition of statecraft to which the United States has adhered since the end of World War II — the era of global dominance now drawing to a close. This postwar tradition combines two components, each one so deeply embedded in the American collective consciousness as to have all but disappeared from view.

The first component specifies norms according to which the international order ought to work and charges the United States with responsibility for enforcing those norms. Call this the American credo. In the simplest terms, the credo summons the United States — and the United States alone — to lead, save, liberate, and ultimately transform the world. In a celebrated manifesto issued at the dawn of what he termed “The American Century,” Henry R. Luce made the case for this spacious conception of global leadership. Writing in Life magazine in early 1941, the influential publisher exhorted his fellow citizens to “accept wholeheartedly our duty to exert upon the world the full impact of our influence for such purposes as we see fit and by such means as we see fit.” Luce thereby captured what remains even today the credo’s essence.

Luce’s concept of an American Century, an age of unquestioned American global primacy, resonated, especially in Washington. His evocative phrase found a permanent place in the lexicon of national politics. (Recall that the neoconservatives who, in the 1990s, lobbied for more militant U.S. policies named their enterprise the Project for a New American Century.) So, too, did Luce’s expansive claim of prerogatives to be exercised by the United States. Even today, whenever public figures allude to America’s responsibility to lead, they signal their fidelity to this creed. Along with respectful allusions to God and “the troops,” adherence to Luce’s credo has become a de facto prerequisite for high office. Question its claims and your prospects of being heard in the hubbub of national politics become nil.

Note, however, that the duty Luce ascribed to Americans has two components. It is not only up to Americans, he wrote, to choose the purposes for which they would bring their influence to bear, but to choose the means as well. Here we confront the second component of the postwar tradition of American statecraft.

With regard to means, that tradition has emphasized activism over example, hard power over soft, and coercion (often styled “negotiating from a position of strength”) over suasion. Above all, the exercise of global leadership as prescribed by the credo obliges the United States to maintain military capabilities staggeringly in excess of those required for self-defense. Prior to World War II, Americans by and large viewed military power and institutions with skepticism, if not outright hostility. In the wake of World War II, that changed. An affinity for military might emerged as central to the American identity.

By the midpoint of the twentieth century, “the Pentagon” had ceased to be merely a gigantic five-sided building. Like “Wall Street” at the end of the nineteenth century, it had become Leviathan, its actions veiled in secrecy, its reach extending around the world. Yet while the concentration of power in Wall Street had once evoked deep fear and suspicion, Americans by and large saw the concentration of power in the Pentagon as benign. Most found it reassuring.

A people who had long seen standing armies as a threat to liberty now came to believe that the preservation of liberty required them to lavish resources on the armed forces. During the Cold War, Americans worried ceaselessly about falling behind the Russians, even though the Pentagon consistently maintained a position of overall primacy. Once the Soviet threat disappeared, mere primacy no longer sufficed. With barely a whisper of national debate, unambiguous and perpetual global military supremacy emerged as an essential predicate to global leadership.

Every great military power has its distinctive signature. For Napoleonic France, it was the levée en masse — the people in arms animated by the ideals of the Revolution. For Great Britain in the heyday of empire, it was command of the seas, sustained by a dominant fleet and a network of far-flung outposts from Gibraltar and the Cape of Good Hope to Singapore and Hong Kong. Germany from the 1860s to the 1940s (and Israel from 1948 to 1973) took another approach, relying on a potent blend of tactical flexibility and operational audacity to achieve battlefield superiority.

The abiding signature of American military power since World War II has been of a different order altogether. The United States has not specialized in any particular type of war. It has not adhered to a fixed tactical style. No single service or weapon has enjoyed consistent favor. At times, the armed forces have relied on citizen-soldiers to fill their ranks; at other times, long-service professionals. Yet an examination of the past 60 years of U.S. military policy and practice does reveal important elements of continuity. Call them the sacred trinity: an abiding conviction that the minimum essentials of international peace and order require the United States to maintain a global military presence, to configure its forces for global power projection, and to counter existing or anticipated threats by relying on a policy of global interventionism.

Together, credo and trinity — the one defining purpose, the other practice — constitute the essence of the way that Washington has attempted to govern and police the American Century. The relationship between the two is symbiotic. The trinity lends plausibility to the credo’s vast claims. For its part, the credo justifies the trinity’s vast requirements and exertions. Together they provide the basis for an enduring consensus that imparts a consistency to U.S. policy regardless of which political party may hold the upper hand or who may be occupying the White House. From the era of Harry Truman to the age of Barack Obama, that consensus has remained intact. It defines the rules to which Washington adheres; it determines the precepts by which Washington rules.

As used here, Washington is less a geographic expression than a set of interlocking institutions headed by people who, whether acting officially or unofficially, are able to put a thumb on the helm of state. Washington, in this sense, includes the upper echelons of the executive, legislative, and judicial branches of the federal government. It encompasses the principal components of the national security state — the departments of Defense, State, and, more recently, Homeland Security, along with various agencies comprising the intelligence and federal law enforcement communities. Its ranks extend to select think tanks and interest groups. Lawyers, lobbyists, fixers, former officials, and retired military officers who still enjoy access are members in good standing. Yet Washington also reaches beyond the Beltway to include big banks and other financial institutions, defense contractors and major corporations, television networks and elite publications like the New York Times, even quasi-academic entities like the Council on Foreign Relations and Harvard’s Kennedy School of Government. With rare exceptions, acceptance of the Washington rules forms a prerequisite for entry into this world.

My purpose in writing Washington Rules is fivefold: first, to trace the origins and evolution of the Washington rules — both the credo that inspires consensus and the trinity in which it finds expression; second, to subject the resulting consensus to critical inspection, showing who wins and who loses and also who foots the bill; third, to explain how the Washington rules are perpetuated, with certain views privileged while others are declared disreputable; fourth, to demonstrate that the rules themselves have lost whatever utility they may once have possessed, with their implications increasingly pernicious and their costs increasingly unaffordable; and finally, to argue for readmitting disreputable (or “radical”) views to our national security debate, in effect legitimating alternatives to the status quo.  In effect, my aim is to invite readers to share in the process of education on which I embarked two decades ago in Berlin.

The Washington rules were forged at a moment when American influence and power were approaching their acme. That moment has now passed. The United States has drawn down the stores of authority and goodwill it had acquired by 1945. Words uttered in Washington command less respect than once was the case. Americans can ill afford to indulge any longer in dreams of saving the world, much less remaking it in our own image. The curtain is now falling on the American Century.

Similarly, the United States no longer possesses sufficient wherewithal to sustain a national security strategy that relies on global military presence and global power projection to underwrite a policy of global interventionism. Touted as essential to peace, adherence to that strategy has propelled the United States into a condition approximating perpetual war, as the military misadventures of the past decade have demonstrated.

To anyone with eyes to see, the shortcomings inherent in the Washington rules have become plainly evident. Although those most deeply invested in perpetuating its conventions will insist otherwise, the tradition to which Washington remains devoted has begun to unravel. Attempting to prolong its existence might serve Washington’s interests, but it will not serve the interests of the American people.

Devising an alternative to the reigning national security paradigm will pose a daunting challenge — especially if Americans look to “Washington” for fresh thinking. Yet doing so has become essential.

In one sense, the national security policies to which Washington so insistently adheres express what has long been the preferred American approach to engaging the world beyond our borders. That approach plays to America’s presumed strong suit — since World War II, and especially since the end of the Cold War, thought to be military power. In another sense, this reliance on military might creates excuses for the United States to avoid serious engagement: confidence in American arms has made it unnecessary to attend to what others might think or to consider how their aspirations might differ from our own. In this way, the Washington rules reinforce American provincialism — a national trait for which the United States continues to pay dearly.

The persistence of these rules has also provided an excuse to avoid serious self-engagement. From this perspective, confidence that the credo and the trinity will oblige others to accommodate themselves to America’s needs or desires — whether for cheap oil, cheap credit, or cheap consumer goods — has allowed Washington to postpone or ignore problems demanding attention here at home. Fixing Iraq or Afghanistan ends up taking precedence over fixing Cleveland and Detroit. Purporting to support the troops in their crusade to free the world obviates any obligation to assess the implications of how Americans themselves choose to exercise freedom.

When Americans demonstrate a willingness to engage seriously with others, combined with the courage to engage seriously with themselves, then real education just might begin.

Andrew J. Bacevich is a professor of history and international relations at Boston University.  His new book, Washington Rules: America’s Path to Permanent War (Metropolitan Books), has just been published.  This essay is its introduction.

Excerpted from Washington Rules: America’s Path to Permanent War, published this month by Metropolitan Books, an imprint of Henry Holt and Company, LLC. Copyright (c) 2010 by Andrew Bacevich. All rights reserved.

Andrew Bacevich, How Washington Rules

“In watching the flow of events over the past decade or so, it is hard to avoid the feeling that something very fundamental has happened in world history.”  This sentiment, introducing the essay that made Francis Fukuyama a household name, commands renewed attention today, albeit from a different perspective.

Developments during the 1980s, above all the winding down of the Cold War, had convinced Fukuyama that the “end of history” was at hand.  “The triumph of the West, of the Western idea,” he wrote in 1989, “is evident… in the total exhaustion of viable systematic alternatives to Western liberalism.”

Today the West no longer looks quite so triumphant.  Yet events during the first decade of the present century have delivered history to another endpoint of sorts.  Although Western liberalism may retain considerable appeal, the Western way of war has run its course.

For Fukuyama, history implied ideological competition, a contest pitting democratic capitalism against fascism and communism.  When he wrote his famous essay, that contest was reaching an apparently definitive conclusion.

Yet from start to finish, military might had determined that competition’s course as much as ideology.  Throughout much of the twentieth century, great powers had vied with one another to create new, or more effective, instruments of coercion.  Military innovation assumed many forms.  Most obviously, there were the weapons: dreadnoughts and aircraft carriers, rockets and missiles, poison gas, and atomic bombs — the list is a long one.  In their effort to gain an edge, however, nations devoted equal attention to other factors: doctrine and organization, training systems and mobilization schemes, intelligence collection and war plans.

All of this furious activity, whether undertaken by France or Great Britain, Russia or Germany, Japan or the United States, derived from a common belief in the plausibility of victory.  Expressed in simplest terms, the Western military tradition could be reduced to this proposition: war remains a viable instrument of statecraft, the accoutrements of modernity serving, if anything, to enhance its utility.

Grand Illusions

That was theory.  Reality, above all the two world wars of the last century, told a decidedly different story.  Armed conflict in the industrial age reached new heights of lethality and destructiveness.  Once begun, wars devoured everything, inflicting staggering material, psychological, and moral damage.  Pain vastly exceeded gain.  In that regard, the war of 1914-1918 became emblematic: even the winners ended up losers.  When fighting eventually stopped, the victors were left not to celebrate but to mourn.  As a consequence, well before Fukuyama penned his essay, faith in war’s problem-solving capacity had begun to erode.  As early as 1945, among several great powers — thanks to war, now great in name only — that faith disappeared altogether.

Among nations classified as liberal democracies, only two resisted this trend.  One was the United States, the sole major belligerent to emerge from the Second World War stronger, richer, and more confident.  The second was Israel, created as a direct consequence of the horrors unleashed by that cataclysm.  By the 1950s, both countries subscribed to this common conviction: national security (and, arguably, national survival) demanded unambiguous military superiority.  In the lexicon of American and Israeli politics, “peace” was a codeword.  The essential prerequisite for peace was for any and all adversaries, real or potential, to accept a condition of permanent inferiority.  In this regard, the two nations — not yet intimate allies — stood apart from the rest of the Western world.

So even as they professed their devotion to peace, civilian and military elites in the United States and Israel prepared obsessively for war.  They saw no contradiction between rhetoric and reality.

Yet belief in the efficacy of military power almost inevitably breeds the temptation to put that power to work.  “Peace through strength” easily enough becomes “peace through war.”  Israel succumbed to this temptation in 1967.  For Israelis, the Six Day War proved a turning point.  Plucky David defeated, and then became, Goliath.  Even as the United States was flailing about in Vietnam, Israel had evidently succeeded in definitively mastering war.

A quarter-century later, U.S. forces seemingly caught up.  In 1991, Operation Desert Storm, George H.W. Bush’s war against Iraqi dictator Saddam Hussein, showed that American troops, like Israeli soldiers, knew how to win quickly, cheaply, and humanely.  Generals like H. Norman Schwarzkopf persuaded themselves that their brief desert campaign against Iraq had replicated — even eclipsed — the battlefield exploits of such famous Israeli warriors as Moshe Dayan and Yitzhak Rabin.  Vietnam faded into irrelevance.

For both Israel and the United States, however, appearances proved deceptive.  Apart from fostering grand illusions, the splendid wars of 1967 and 1991 decided little.  In both cases, victory turned out to be more apparent than real.  Worse, triumphalism fostered massive future miscalculation.

On the Golan Heights, in Gaza, and throughout the West Bank, proponents of a Greater Israel — disregarding Washington’s objections — set out to assert permanent control over territory that Israel had seized.  Yet “facts on the ground” created by successive waves of Jewish settlers did little to enhance Israeli security.  They succeeded chiefly in shackling Israel to a rapidly growing and resentful Palestinian population that it could neither pacify nor assimilate.

In the Persian Gulf, the benefits reaped by the United States after 1991 likewise turned out to be ephemeral.  Saddam Hussein survived and became in the eyes of successive American administrations an imminent threat to regional stability.  This perception prompted (or provided a pretext for) a radical reorientation of strategy in Washington.  No longer content to prevent an unfriendly outside power from controlling the oil-rich Persian Gulf, Washington now sought to dominate the entire Greater Middle East.  Hegemony became the aim.  Yet the United States proved no more successful than Israel in imposing its writ.

During the 1990s, the Pentagon embarked willy-nilly upon what became its own variant of a settlement policy.  Yet U.S. bases dotting the Islamic world and U.S. forces operating in the region proved hardly more welcome than the Israeli settlements dotting the occupied territories and the soldiers of the Israeli Defense Forces (IDF) assigned to protect them.  In both cases, presence provoked (or provided a pretext for) resistance.  Just as Palestinians vented their anger at the Zionists in their midst, radical Islamists targeted Americans whom they regarded as neo-colonial infidels.

Stuck

No one doubted that Israelis (regionally) and Americans (globally) enjoyed unquestioned military dominance.  Throughout Israel’s near abroad, its tanks, fighter-bombers, and warships operated at will.  So, too, did American tanks, fighter-bombers, and warships wherever they were sent.

So what?  Events made it increasingly evident that military dominance did not translate into concrete political advantage.  Rather than enhancing the prospects for peace, coercion produced ever more complications.  No matter how badly battered and beaten, the “terrorists” (a catch-all term applied to anyone resisting Israeli or American authority) weren’t intimidated, remained unrepentant, and kept coming back for more.

Israel ran smack into this problem during Operation Peace for Galilee, its 1982 intervention in Lebanon.  U.S. forces encountered it a decade later during Operation Restore Hope, the West’s gloriously titled foray into Somalia.  Lebanon possessed a puny army; Somalia had none at all.  Rather than producing peace or restoring hope, however, both operations ended in frustration, embarrassment, and failure.

And those operations proved but harbingers of worse to come.  By the 1980s, the IDF’s glory days were past.  Rather than lightning strikes deep into the enemy rear, the narrative of Israeli military history became a cheerless recital of dirty wars — unconventional conflicts against irregular forces yielding problematic results.  The First Intifada (1987-1993), the Second Intifada (2000-2005), a second Lebanon War (2006), and Operation Cast Lead, the notorious 2008-2009 incursion into Gaza, all conformed to this pattern.

Meanwhile, the differential between Palestinian and Jewish Israeli birth rates emerged as a looming threat — a “demographic bomb,” Benjamin Netanyahu called it.  Here were new facts on the ground that military forces, unless employed pursuant to a policy of ethnic cleansing, could do little to redress.  Even as the IDF tried repeatedly and futilely to bludgeon Hamas and Hezbollah into submission, demographic trends continued to suggest that within a generation a majority of the population within Israel and the occupied territories would be Arab.

Trailing a decade or so behind Israel, the United States military nonetheless succeeded in duplicating the IDF’s experience.  Moments of glory remained, but they would prove fleeting indeed.  After 9/11, Washington’s efforts to transform (or “liberate”) the Greater Middle East kicked into high gear.  In Afghanistan and Iraq, George W. Bush’s Global War on Terror began impressively enough, as U.S. forces operated with a speed and élan that had once been an Israeli trademark.  Thanks to “shock and awe,” Kabul fell, followed less than a year and a half later by Baghdad.  As one senior Army general explained to Congress in 2004, the Pentagon had war all figured out:

“We are now able to create decision superiority that is enabled by networked systems, new sensors and command and control capabilities that are producing unprecedented near real time situational awareness, increased information availability, and an ability to deliver precision munitions throughout the breadth and depth of the battlespace… Combined, these capabilities of the future networked force will leverage information dominance, speed and precision, and result in decision superiority.”

The key phrase in this mass of techno-blather was the one that occurred twice: “decision superiority.”  At that moment, the officer corps, like the Bush administration, was still convinced that it knew how to win.

Such claims of success, however, proved obscenely premature.  Campaigns advertised as being wrapped up in weeks dragged on for years, while American troops struggled with their own intifadas.  When it came to achieving decisions that actually stuck, the Pentagon (like the IDF) remained clueless.

Winless

If any overarching conclusion emerges from the Afghan and Iraq Wars (and from their Israeli equivalents), it’s this: victory is a chimera.  Counting on today’s enemy to yield in the face of superior force makes about as much sense as buying lottery tickets to pay the mortgage: you better be really lucky.

Meanwhile, as the U.S. economy went into a tailspin, Americans contemplated their equivalent of Israel’s “demographic bomb” — a “fiscal bomb.”  Ingrained habits of profligacy, both individual and collective, held out the prospect of long-term stagnation: no growth, no jobs, no fun.  Out-of-control spending on endless wars exacerbated that threat.

By 2007, the American officer corps itself gave up on victory, although without giving up on war.  First in Iraq, then in Afghanistan, priorities shifted.  High-ranking generals shelved their expectations of winning — at least as a Rabin or Schwarzkopf would have understood that term.  They sought instead to not lose.  In Washington as in U.S. military command posts, the avoidance of outright defeat emerged as the new gold standard of success.

As a consequence, U.S. troops today sally forth from their base camps not to defeat the enemy, but to “protect the people,” consistent with the latest doctrinal fashion.  Meanwhile, tea-sipping U.S. commanders cut deals with warlords and tribal chieftains in hopes of persuading guerrillas to lay down their arms.

A new conventional wisdom has taken hold, endorsed by everyone from new Afghan War commander General David Petraeus, the most celebrated soldier of this American age, to Barack Obama, commander-in-chief and Nobel Peace Prize laureate.  For the conflicts in which the United States finds itself enmeshed, “military solutions” do not exist.  As Petraeus himself has emphasized, “we can’t kill our way out of” the fix we’re in.  In this way, he also pronounced a eulogy on the Western conception of warfare of the last two centuries.

The Unasked Question

What then are the implications of arriving at the end of Western military history?

In his famous essay, Fukuyama cautioned against thinking that the end of ideological history heralded the arrival of global peace and harmony.  Peoples and nations, he predicted, would still find plenty to squabble about.

With the end of military history, a similar expectation applies.  Politically motivated violence will persist and may in specific instances even retain marginal utility.  Yet the prospect of Big Wars solving Big Problems is probably gone for good.  Certainly, no one in their right mind, Israeli or American, can believe that a continued resort to force will remedy whatever it is that fuels anti-Israeli or anti-American antagonism throughout much of the Islamic world.  To expect persistence to produce something different or better is moonshine.

It remains to be seen whether Israel and the United States can come to terms with the end of military history.  Other nations have long since done so, accommodating themselves to the changing rhythms of international politics.  That they do so is evidence not of virtue, but of shrewdness.  China, for example, shows little eagerness to disarm.  Yet as Beijing expands its reach and influence, it emphasizes trade, investment, and development assistance.  Meanwhile, the People’s Liberation Army stays home.  China has stolen a page from an old American playbook, having become today the preeminent practitioner of “dollar diplomacy.”

The collapse of the Western military tradition confronts Israel with limited choices, none of them attractive.  Given the history of Judaism and the history of Israel itself, a reluctance of Israeli Jews to entrust their safety and security to the good will of their neighbors or the warm regards of the international community is understandable.  In a mere six decades, the Zionist project has produced a vibrant, flourishing state.  Why put all that at risk?  Although the demographic bomb may be ticking, no one really knows how much time remains on the clock.  If Israelis are inclined to continue putting their trust in (American-supplied) Israeli arms while hoping for the best, who can blame them?

In theory, the United States, sharing none of Israel’s demographic or geographic constraints and far more richly endowed, should enjoy far greater freedom of action.  Unfortunately, Washington has a vested interest in preserving the status quo, no matter how much it costs or where it leads.  For the military-industrial complex, there are contracts to win and buckets of money to be made.  For those who dwell in the bowels of the national security state, there are prerogatives to protect.  For elected officials, there are campaign contributors to satisfy.  For appointed officials, civilian and military, there are ambitions to be pursued.

And always there is a chattering claque of militarists, calling for jihad and insisting on ever greater exertions, while remaining alert to any hint of backsliding.  In Washington, members of this militarist camp, by no means coincidentally including many of the voices that most insistently defend Israeli bellicosity, tacitly collaborate in excluding or marginalizing views that they deem heretical.  As a consequence, what passes for debate on matters relating to national security is a sham.  Thus are we invited to believe, for example, that General Petraeus’s appointment as the umpteenth U.S. commander in Afghanistan constitutes a milestone on the way to ultimate success.

Nearly 20 years ago, a querulous Madeleine Albright demanded to know: “What’s the point of having this superb military you’re always talking about if we can’t use it?”  Today, an altogether different question deserves our attention: What’s the point of constantly using our superb military if doing so doesn’t actually work?

Washington’s refusal to pose that question provides a measure of the corruption and dishonesty permeating our politics.

Andrew J. Bacevich is a professor of history and international relations at Boston University.  His new book, Washington Rules: America’s Path to Permanent War, has just been published.

Copyright 2010 Andrew Bacevich

This article was originally posted at TomDispatch.com.

The End of (Military) History?

In a recent column, the Washington Post’s Richard Cohen wrote, “What Henry Luce called ‘the American Century’ is over.” Cohen is right. All that remains is to drive a stake through the heart of Luce’s pernicious creation, lest it come back to life. This promises to take some doing.

When the Time-Life publisher coined his famous phrase, his intent was to prod his fellow citizens into action. Appearing in the February 7, 1941 issue of Life, his essay, “The American Century,” hit the newsstands at a moment when the world was in the throes of a vast crisis. A war in Europe had gone disastrously awry. A second almost equally dangerous conflict was unfolding in the Far East. Aggressors were on the march.

With the fate of democracy hanging in the balance, Americans diddled. Luce urged them to get off the dime. More than that, he summoned them to “accept wholeheartedly our duty and our opportunity as the most powerful and vital nation in the world… to exert upon the world the full impact of our influence, for such purposes as we see fit and by such means as we see fit.”

Read today, Luce’s essay, with its strange mix of chauvinism, religiosity, and bombast (“We must now undertake to be the Good Samaritan to the entire world…”), does not stand up well. Yet the phrase “American Century” stuck and has enjoyed a remarkable run. It stands in relation to the contemporary era much as “Victorian Age” does to the nineteenth century. In one pithy phrase, it captures (or at least seems to capture) the essence of some defining truth: America as alpha and omega, source of salvation and sustenance, vanguard of history, guiding spirit and inspiration for all humankind.

In its classic formulation, the central theme of the American Century has been one of righteousness overcoming evil. The United States (above all the U.S. military) made that triumph possible. When, having been given a final nudge on December 7, 1941, Americans finally accepted their duty to lead, they saved the world from successive diabolical totalitarianisms. In doing so, the U.S. not only preserved the possibility of human freedom but modeled what freedom ought to look like.

Thank You, Comrades

So goes the preferred narrative of the American Century, as recounted by its celebrants.

The problems with this account are two-fold. First, it claims for the United States excessive credit. Second, it excludes, ignores, or trivializes matters at odds with the triumphal story-line.

The net effect is to perpetuate an array of illusions that, whatever their value in prior decades, have long since outlived their usefulness. In short, the persistence of this self-congratulatory account deprives Americans of self-awareness, hindering our efforts to navigate the treacherous waters in which the country finds itself at present. Bluntly, we are perpetuating a mythic version of the past that never even approximated reality and today has become downright malignant. Although Richard Cohen may be right in declaring the American Century over, the American people — and especially the American political class — still remain in its thrall.

Constructing a past usable to the present requires a willingness to include much that the American Century leaves out.

For example, to the extent that the demolition of totalitarianism deserves to be seen as a prominent theme of contemporary history (and it does), the primary credit for that achievement surely belongs to the Soviet Union. When it came to defeating the Third Reich, the Soviets bore by far the preponderant burden, sustaining 65% of all Allied deaths in World War II.

By comparison, the United States suffered 2% of those losses, for which any American whose father or grandfather served in and survived that war should be saying: Thank you, Comrade Stalin.

For the United States to claim credit for destroying the Wehrmacht is the equivalent of Toyota claiming credit for inventing the automobile. We entered the game late and then shrewdly scooped up more than our fair share of the winnings. The true “Greatest Generation” is the one that willingly expended millions of its fellow Russians while killing millions of German soldiers.

Hard on the heels of World War II came the Cold War, during which erstwhile allies became rivals. Once again, after a decades-long struggle, the United States came out on top.

Yet in determining that outcome, the brilliance of American statesmen was far less important than the ineptitude of those who presided over the Kremlin. Ham-handed Soviet leaders so mismanaged their empire that it eventually imploded, permanently discrediting Marxism-Leninism as a plausible alternative to liberal democratic capitalism. The Soviet dragon managed to slay itself. So thank you, Comrades Malenkov, Khrushchev, Brezhnev, Andropov, Chernenko, and Gorbachev.

Screwing the Pooch

What flag-wavers tend to leave out of their account of the American Century is not only the contributions of others, but the various missteps perpetrated by the United States — missteps, it should be noted, that spawned many of the problems bedeviling us today.

The instances of folly and criminality bearing the label “made-in-Washington” may not rank up there with the Armenian genocide, the Bolshevik Revolution, the appeasement of Adolf Hitler, or the Holocaust, but they sure don’t qualify as small change. To give them their due is necessarily to render the standard account of the American Century untenable.

Here are several examples, each one familiar, even if its implications for the problems we face today are studiously ignored:

Cuba. In 1898, the United States went to war with Spain for the proclaimed purpose of liberating the so-called Pearl of the Antilles. When that brief war ended, Washington reneged on its promise. If there actually has been an American Century, it begins here, with the U.S. government breaking a solemn commitment, while baldly insisting otherwise. By converting Cuba into a protectorate, the United States set in motion a long train of events leading eventually to the rise of Fidel Castro, the Bay of Pigs, Operation Mongoose, the Cuban Missile Crisis, and even today’s Guantanamo Bay prison camp. The line connecting these various developments may not be a straight one, given the many twists and turns along the way, but the dots do connect.

The Bomb. Nuclear weapons imperil our existence. Used on a large scale, they could destroy civilization itself. Even now, the prospect of a lesser power like North Korea or Iran acquiring nukes sends jitters around the world. American presidents — Barack Obama is only the latest in a long line — declare the abolition of these weapons to be an imperative. What they are less inclined to acknowledge is the role the United States played in afflicting humankind with this scourge.

The United States invented the bomb. The United States — alone among members of the nuclear club — actually employed it as a weapon of war. The U.S. led the way in defining nuclear-strike capacity as the benchmark of power in the postwar world, leaving other powers like the Soviet Union, Great Britain, France, and China scrambling to catch up. Today, the U.S. still maintains an enormous nuclear arsenal at the ready and adamantly refuses to commit itself to a no-first-use policy, even as it professes its horror at the prospect of some other nation doing as the United States itself has done.

Iran. Extending his hand to Tehran, President Obama has invited those who govern the Islamic republic to “unclench their fists.” Yet to a considerable degree, those clenched fists are of our own making. For most Americans, the discovery of Iran dates from the time of the notorious hostage crisis of 1979-1981 when Iranian students occupied the U.S. embassy in Tehran, detained several dozen U.S. diplomats and military officers, and subjected the administration of Jimmy Carter to a 444-day-long lesson in abject humiliation.

For most Iranians, the story of U.S.-Iranian relations begins somewhat earlier. It starts in 1953, when CIA agents collaborated with their British counterparts to overthrow the democratically-elected government of Mohammed Mossadegh and return the Shah of Iran to his throne. The plot succeeded. The Shah regained power. The Americans got oil, along with a lucrative market for exporting arms. The people of Iran pretty much got screwed. Freedom and democracy did not prosper. The antagonism that expressed itself in November 1979 with the takeover of the U.S. embassy in Tehran was not entirely without cause.

Afghanistan. President Obama has wasted little time in making the Afghanistan War his own. Like his predecessor he vows to defeat the Taliban. Also like his predecessor he has yet to confront the role played by the United States in creating the Taliban in the first place. Washington once took pride in the success it enjoyed funneling arms and assistance to fundamentalist Afghans waging jihad against foreign occupiers. During the administrations of Jimmy Carter and Ronald Reagan, this was considered to represent the very acme of clever statecraft. U.S. support for the Afghan mujahideen caused the Soviets fits. Yet it also fed a cancer that, in time, exacted a most grievous toll on Americans themselves — and has U.S. forces today bogged down in a seemingly endless war.

Act of Contrition

Had the United States acted otherwise, would Cuba have evolved into a stable and prosperous democracy, a beacon of hope for the rest of Latin America? Would the world have avoided the blight of nuclear weapons? Would Iran today be an ally of the United States, a beacon of liberalism in the Islamic world, rather than a charter member of the “axis of evil”? Would Afghanistan be a quiet, pastoral land at peace with its neighbors? No one, of course, can say what might have been. All we know for sure is that policies concocted in Washington by reputedly savvy statesmen now look exceedingly ill-advised.

What are we to make of these blunders? The temptation may be to avert our gaze, thereby preserving the reassuring tale of the American Century. We should avoid that temptation and take the opposite course, acknowledging openly, freely, and unabashedly where we have gone wrong. We should carve such acknowledgments into the face of a new monument smack in the middle of the Mall in Washington: We blew it. We screwed the pooch. We caught a case of the stupids. We got it ass-backwards.

Only through the exercise of candor might we avoid replicating such mistakes.

Indeed, we ought to apologize. When it comes to avoiding the repetition of sin, nothing works like abject contrition. We should, therefore, tell the people of Cuba that we are sorry for having made such a hash of U.S.-Cuban relations for so long. President Obama should speak on our behalf in asking the people of Hiroshima and Nagasaki for forgiveness. He should express our deep collective regret to Iranians and Afghans for what past U.S. interventionism has wrought.

The United States should do these things without any expectations of reciprocity. Regardless of what U.S. officials may say or do, Castro won’t fess up to having made his own share of mistakes. The Japanese won’t liken Hiroshima to Pearl Harbor and call it a wash. Iran’s mullahs and Afghanistan’s jihadists won’t be offering a chastened Washington the chance to let bygones be bygones.

No, we should apologize to them, but for our own good — to free ourselves from the accumulated conceits of the American Century and to acknowledge that the United States participated fully in the barbarism, folly, and tragedy that define our time. For those sins, we must hold ourselves accountable.

To solve our problems requires that we see ourselves as we really are. And that requires shedding, once and for all, the illusions embodied in the American Century.

Andrew J. Bacevich is a professor of history and international relations at Boston University. His most recent book, The Limits of Power: The End of American Exceptionalism, is just out in paperback.

Copyright 2009 Andrew J. Bacevich

Farewell, the American Century

A week ago, I had a long conversation with a four-star U.S. military officer who, until his recent retirement, had played a central role in directing the global war on terror. I asked him: what exactly is the strategy that guides the Bush administration’s conduct of this war? His dismaying, if not exactly surprising, answer: there is none.

President Bush will bequeath to his successor the ultimate self-licking ice cream cone. To defense contractors, lobbyists, think-tankers, ambitious military officers, the hosts of Sunday morning talk shows, and the Douglas Feith-like creatures who maneuver to become players in the ultimate power game, the Global War on Terror is a boon, an enterprise redolent with opportunity and promising to extend decades into the future.

Yet, to a considerable extent, that very enterprise has become a fiction, a gimmicky phrase employed to lend an appearance of cohesion to a panoply of activities that, in reality, are contradictory, counterproductive, or at the very least beside the point. In this sense, the global war on terror relates to terrorism precisely as the war on drugs relates to drug abuse and dependence: declaring a state of permanent "war" sustains the pretense of actually dealing with a serious problem, even as policymakers pay lip-service to the problem’s actual sources. The war on drugs is a very expensive fraud. So, too, is the Global War on Terror.

Anyone intent on identifying some unifying idea that explains U.S. actions, military and otherwise, across the Greater Middle East is in for a disappointment. During World War II, President Franklin D. Roosevelt laid down "Germany first" and then "unconditional surrender" as core principles. Early in the Cold War, the Truman administration devised the concept of containment, which for decades thereafter provided a conceptual framework to which policymakers adhered. Yet seven years into its Global War on Terror, the Bush administration is without a compass, wandering in the arid wilderness. To the extent that any inkling of a strategy once existed — the preposterous neoconservative vision of employing American power to "transform" the Islamic world — events have long since demolished the assumptions on which it was based.

Rather than one single war, the United States is presently engaged in several.

Ranking first in importance is the war for Bush’s legacy, better known as Iraq. The President himself will never back away from his insistence that here lies the "central front" of the conflict he initiated after 9/11. Hunkered down in their bunker, Bush and his few remaining supporters would have us believe that the "surge" has, at long last, brought victory in sight and with it some prospect of redeeming this otherwise misbegotten and mismanaged endeavor. If the President can leave office spouting assurances that light is finally visible somewhere at the far end of a very long, very dark Mesopotamian tunnel, he will claim at least partial vindication. And if actual developments subsequent to January 20 don’t turn out well, he can always blame the outcome on his successor.

Next comes the orphan war. This is Afghanistan, a conflict now in its eighth year with no signs of ending anytime soon. Given the attention lavished on Iraq, developments in Afghanistan have until recently attracted only intermittent notice. Lately, however, U.S. officials have awakened to the fact that things are going poorly, both politically and militarily. Al Qaeda persists. The Taliban is reasserting itself. Expectations that NATO might ride to the rescue have proven illusory. Apart from enabling Afghanistan to reclaim its status as the world’s number one producer of opium, U.S. efforts to pacify that nation and nudge it toward modernity have produced little.

The Pentagon calls its intervention in Afghanistan Operation Enduring Freedom. The emphasis was supposed to be on the noun. Unfortunately, the adjective conveys the campaign’s defining characteristic: enduring as in endless. Barring a radical re-definition of purpose, this is an enterprise which promises to continue, consuming lives and treasure, for a long, long time.

In neighboring Pakistan, meanwhile, there is the war-hidden-in-plain-sight. Reports of U.S. military action in Pakistan have now become everyday fare. Air strikes, typically launched from missile-carrying drones, are commonplace, and U.S. ground forces have also conducted at least one cross-border raid from inside Afghanistan. Although the White House doesn’t call this a war, it is — a gradually escalating war of attrition in which we are killing both terrorists and noncombatants. Unfortunately, we are killing too few of the former to make a difference and more than enough of the latter to facilitate the recruitment of new terrorists to replace those we eliminate.

Finally — skipping past the wars-in-waiting, which are Syria and Iran — there is Condi’s war. This clash, which does not directly involve U.S. forces, may actually be the most important of all. The war that Secretary of State Condoleezza Rice has made her own is the ongoing conflict between Israel and the Palestinians. Having for years dismissed the insistence of Muslims, Arabs and non-Arabs alike, that the plight of the Palestinians constitutes a problem of paramount importance, Rice now embraces that view. With the fervor of a convert, she has vowed to broker an end to that conflict prior to leaving office in January 2009.

Given that Rice brings little — perhaps nothing — to the effort in the way of fresh ideas, her prospects of making good as a peacemaker appear slight. Yet, as with Bush and Iraq, so too with Rice and the Palestinian problem: she has a lot riding on the effort. If she flops, history will remember her as America’s least effective secretary of state since Cordell Hull spent World War II being ignored, bypassed, and humiliated by Franklin Roosevelt. She will depart Foggy Bottom having accomplished nothing.

There’s nothing inherently wrong in fighting simultaneously on several fronts, as long as actions on front A are compatible with those on front B, and together contribute to overall success. Unfortunately, that is not the case with the Global War on Terror. We have instead an illustration of what Winston Churchill once referred to as a pudding without a theme: a war devoid of strategic purpose.

This absence of cohesion — by now a hallmark of the Bush administration — is both a disaster and an opportunity. It is a disaster in the sense that we have, over the past seven years, expended enormous resources, while gaining precious little in return.

Bush’s supporters beg to differ, of course. They credit the president with having averted a recurrence of 9/11, doubtless a commendable achievement but one primarily attributable to the fact that the United States no longer neglects airport security. To argue that, say, the invasion and occupation of Iraq have prevented terrorist attacks against the United States is the equivalent of contending that Israel’s occupation of the West Bank since 1967 has prevented terrorist attacks against the state of Israel.

Yet the existing strategic vacuum is also an opportunity. When it comes to national security at least, the agenda of the next administration all but sets itself. There is no need to waste time arguing about which issues demand priority action.

First-order questions are begging for attention. How should we gauge the threat? What are the principles that should inform our response? What forms of power are most relevant to implementing that response? Are the means at hand adequate to the task? If not, how should national priorities be adjusted to provide the means required? Given the challenges ahead, how should the government organize itself? Who — both agencies and individuals — will lead?

To each and every one of these questions, the Bush administration devised answers that turned out to be dead wrong. The next administration needs to do better. The place to begin is with the candid recognition that the Global War on Terror has effectively ceased to exist. When it comes to national security strategy, we need to start over from scratch.

Andrew J. Bacevich is professor of history and international relations at Boston University. His bestselling new book is The Limits of Power: The End of American Exceptionalism (The American Empire Project, Metropolitan Books). To listen to a podcast in which he discusses issues relevant to this article, click here.

Copyright 2008 Andrew Bacevich

Expanding War, Contracting Meaning

The events of the past seven years have yielded a definitive judgment on the strategy that the Bush administration conceived in the wake of 9/11 to wage its so-called Global War on Terror. That strategy has failed, massively and irrevocably. To acknowledge that failure is to confront an urgent national priority: to scrap the Bush approach in favor of a new national security strategy that is realistic and sustainable — a task that, alas, neither of the presidential candidates seems able to recognize or willing to take up.

On September 30, 2001, President Bush received from Secretary of Defense Donald Rumsfeld a memorandum outlining U.S. objectives in the War on Terror. Drafted by Rumsfeld’s chief strategist Douglas Feith, the memo declared expansively: "If the war does not significantly change the world’s political map, the U.S. will not achieve its aim." That aim, as Feith explained in a subsequent missive to his boss, was to "transform the Middle East and the broader world of Islam generally."

Rumsfeld and Feith were co-religionists: Along with other senior Bush administration officials, they worshipped in the Church of the Indispensable Nation, a small but intensely devout Washington-based sect formed in the immediate wake of the Cold War. Members of this church shared an exalted appreciation for the efficacy of American power, especially hard power. The strategy of transformation emerged as a direct expression of their faith.

The members of this church were also united by an equally exalted estimation of their own abilities. Lucky the nation to be blessed with such savvy and sophisticated public servants in its hour of need!

The goal of transforming the Islamic world was nothing if not bold. It implied far-reaching political, economic, social, and even cultural adjustments. At a press conference on September 18, 2001, Rumsfeld spoke bluntly of the need to "change the way that they live." Rumsfeld didn’t specify who "they" were. He didn’t have to. His listeners understood without being told: "They" were Muslims inhabiting a vast arc of territory that stretched from Morocco in the west all the way to the Moro territories of the Southern Philippines in the east.

Yet boldly conceived action, if successfully executed, offered the prospect of solving a host of problems. Once pacified (or "liberated"), the Middle East would cease to breed or harbor anti-American terrorists. Post-9/11 fears about weapons of mass destruction falling into the hands of evil-doers could abate. Local regimes, notorious for being venal, oppressive, and inept, might finally get serious about cleaning up their acts. Liberal values, including rights for women, would flourish. A part of the world perpetually dogged by violence would enjoy a measure of stability, with stability promising not so incidentally to facilitate exploitation of the region’s oil reserves. There was even the possibility of enhancing the security of Israel. Like a powerful antibiotic, the Bush administration’s strategy of transformation promised to clean out not simply a single infection but several; or to switch metaphors, a strategy of transformation meant running the table.

When it came to implementation, the imperative of the moment was to think big. Just days after 9/11, Rumsfeld was charging his subordinates to devise a plan of action that had "three, four, five moves behind it." By December 2001, the Pentagon had persuaded itself that the first move — into Afghanistan — had met success. The Bush administration wasted little time in pocketing its ostensible victory. Attention quickly shifted to the second move, seen by insiders as holding the key to ultimate success: Iraq.

Fix Iraq and moves three, four, and five promised to come easily. Writing in the Weekly Standard, William Kristol and Robert Kagan got it exactly right: "The president’s vision will, in the coming months, either be launched successfully in Iraq, or it will die in Iraq."

The point cannot be emphasized too strongly: Saddam Hussein’s (nonexistent) weapons of mass destruction and his (imaginary) ties to Al Qaeda never constituted the real reason for invading Iraq — any more than the imperative of defending Russian "peacekeepers" in South Ossetia explains the Kremlin’s decision to invade Georgia.

Iraq merely offered a convenient place from which to launch a much larger and infinitely more ambitious project. "After Hussein is removed," enthused Hudson Institute analyst Max Singer, "there will be an earthquake through the region." Success in Iraq promised to endow the United States with hitherto unprecedented leverage. Once the United States had made an example of Saddam Hussein, as the influential neoconservative Richard Perle put it, dealing with other ne’er-do-wells would become simple: "We could deliver a short message, a two-word message: ‘You’re next.’" Faced with the prospect of sharing Saddam’s fate, Syrians, Iranians, Sudanese, and other recalcitrant regimes would see submission as the wiser course — so Perle and others believed.

Members of the administration tried to imbue this strategic vision with a softer ideological gloss. "For 60 years," Condoleezza Rice explained to a group of students in Cairo, "my country, the United States, pursued stability at the expense of democracy in this region here in the Middle East — and we achieved neither." No more. "Now, we are taking a different course. We are supporting the democratic aspirations of all people." The world’s Muslims needed to know that the motives behind the U.S. incursion into Iraq and its actions elsewhere in the region were (or had, at least, suddenly become) entirely benign. Who knows? Rice may even have believed the words she spoke.

In either case — whether the strategy of transformation aimed at dominion or democratization — today, seven years after it was conceived, we can assess exactly what it has produced. The answer is clear: next to nothing, apart from squandering vast resources and exacerbating the slide toward debt and dependency that poses a greater strategic threat to the United States than Osama bin Laden ever did.

In point of fact, hardly had the Pentagon commenced its second move, its invasion of Iraq, when the entire strategy began to unravel. In Iraq, President Bush’s vision of regional transformation did die, much as Kagan and Kristol had feared. No amount of CPR credited to the so-called surge will revive it. Even if tomorrow Iraq were to achieve stability and become a responsible member of the international community, no sensible person could suggest that Operation Iraqi Freedom provides a model to apply elsewhere. Senator John McCain says that he’ll keep U.S. combat troops in Iraq for as long as it takes. Yet even he does not propose "solving" any problems posed by Syria or Iran (much less Pakistan) by employing the methods that the Bush administration used to "solve" the problem posed by Iraq. The Bush Doctrine of preventive war may remain nominally on the books. But, as a practical matter, it is defunct.

The United States will not change the world’s political map in the ways top administration officials once dreamed of. There will be no earthquake that shakes up the Middle East — unless the growing clout of Iran, Hezbollah, and Hamas in recent years qualifies as that earthquake. Given the Pentagon’s existing commitments, there will be no threats of "you’re next" either — at least none that will worry our adversaries, as the Russians have neatly demonstrated. Nor will there be a wave of democratic reform — even Rice has ceased her prattling on that score. Islam will remain stubbornly resistant to change, except on terms of its own choosing. We will not change the way "they" live.

In a book that he co-authored during the run-up to the invasion, Kristol confidently declared, "The mission begins in Baghdad, but it does not end there." In fact, the Bush administration’s strategy of transformation has ended. It has failed miserably. The sooner we face up to that failure, the sooner we can get about repairing the damage.

Andrew J. Bacevich is professor of history and international relations at Boston University. His bestselling new book is The Limits of Power: The End of American Exceptionalism. You can read excerpts from it by clicking here, and here, or watch a video of him discussing the lessons of 9/11, seven years later, by clicking here.

Copyright 2008 Andrew J. Bacevich

9/11 Plus Seven