Boykinism

First came the hullabaloo over the “Mosque at Ground Zero.” Then there was Pastor Terry Jones of Gainesville, Florida, grabbing headlines as he promoted “International Burn-a-Koran Day.” Most recently, we have an American posting a slanderous anti-Muslim video on the Internet, with all the ensuing turmoil.

Throughout, the official U.S. position has remained fixed: the United States government condemns Islamophobia. Americans respect Islam as a religion of peace. Incidents suggesting otherwise are the work of a tiny minority — whackos, hatemongers, and publicity-seekers. Among Muslims from Benghazi to Islamabad, the argument has proven to be a tough sell.

And not without reason: although it might be comforting to dismiss anti-Islamic outbursts in the U.S. as the work of a few fanatics, the picture is actually far more complicated. Those complications in turn help explain why religion, once considered a foreign policy asset, has in recent years become a net liability.

Let’s begin with a brief history lesson. From the late 1940s to the late 1980s, when Communism provided the overarching ideological rationale for American globalism, religion figured prominently as a theme of U.S. foreign policy. Communist antipathy toward religion helped invest the Cold War foreign policy consensus with its remarkable durability. That Communists were godless sufficed to place them beyond the pale. For many Americans, the Cold War derived its moral clarity from the conviction that here was a contest pitting the God-fearing against the God-denying. Since we were on God’s side, it appeared axiomatic that God should repay the compliment.

From time to time during the decades when anti-Communism provided so much of the animating spirit of U.S. policy, Judeo-Christian strategists in Washington (not necessarily believers themselves), drawing on the theologically correct proposition that Christians, Jews, and Muslims all worship the same God, sought to enlist Muslims, sometimes of fundamentalist persuasions, in the cause of opposing the godless. One especially notable example was the Soviet-Afghan War of 1979-1989. To inflict pain on the Soviet occupiers, the United States threw its weight behind the Afghan resistance, styled in Washington as “freedom fighters,” and funneled aid (via the Saudis and the Pakistanis) to the most religiously extreme among them. When this effort resulted in a massive Soviet defeat, the United States celebrated its support for the Afghan Mujahedeen as evidence of strategic genius. It was almost as if God had rendered a verdict.

Yet not so many years after the Soviets withdrew in defeat, the freedom fighters morphed into the fiercely anti-Western Taliban, providing sanctuary to al-Qaeda as it plotted — successfully — to attack the United States. Clearly, this was a monkey wrench thrown into God’s plan.

With the launching of the Global War on Terrorism, Islamism succeeded Communism as the body of beliefs that, if left unchecked, threatened to sweep across the globe with dire consequences for freedom. Those whom Washington had armed as “freedom fighters” now became America’s most dangerous enemies. So at least members of the national security establishment believed or purported to believe, thereby curtailing any further discussion of whether militarized globalism actually represented the best approach to promoting liberal values globally or even served U.S. interests.

Yet as a rallying cry, a war against Islamism presented difficulties right from the outset. As much as policymakers struggled to prevent Islamism from merging in the popular mind with Islam itself, significant numbers of Americans — whether genuinely fearful or mischief-minded — saw this as a distinction without a difference. Efforts by the Bush administration to work around this problem by framing the post-9/11 threat under the rubric of “terrorism” ultimately failed because that generic term offered no explanation for motive. However the administration twisted and turned, motive in this instance seemed bound up with matters of religion.

Where exactly to situate God in post-9/11 U.S. policy posed a genuine challenge for policymakers, not least of all for George W. Bush, who believed, no doubt sincerely, that God had chosen him to defend America in its time of maximum danger. Unlike the communists, far from denying God’s existence, Islamists embrace God with startling ferocity. Indeed, in their vitriolic denunciations of the United States and in perpetrating acts of anti-American violence, they audaciously present themselves as nothing less than God’s avenging agents. In confronting the Great Satan, they claim to be doing God’s will.

Waging War in Jesus’s Name

This debate over who actually represents God’s will is one that the successive administrations of George W. Bush and Barack Obama have studiously sought to avoid. The United States is not at war with Islam per se, U.S. officials insist. Still, among Muslims abroad, Washington’s repeated denials notwithstanding, suspicion persists and not without reason.

Consider the case of Lieutenant General William G. (“Jerry”) Boykin. While still on active duty in 2002, this highly decorated Army officer spoke in uniform at a series of some 30 church gatherings during which he offered his own response to President Bush’s famous question: “Why do they hate us?” The general’s perspective differed markedly from his commander-in-chief’s: “The answer to that is because we’re a Christian nation. We are hated because we are a nation of believers.”

On another such occasion, the general recalled his encounter with a Somali warlord who claimed to enjoy Allah’s protection. The warlord was deluding himself, Boykin declared, and was sure to get his comeuppance: “I knew that my God was bigger than his. I knew that my God was a real God and his was an idol.” As a Christian nation, Boykin insisted, the United States would succeed in overcoming its adversaries only if “we come against them in the name of Jesus.”

When Boykin’s remarks caught the attention of the mainstream press, denunciations rained down from on high, as the White House, the State Department, and the Pentagon hastened to disassociate the government from the general’s views. Yet subsequent indicators suggest that, however crudely, Boykin was indeed expressing perspectives shared by more than a few of his fellow citizens.

One such indicator came immediately: despite the furor, the general kept his important Pentagon job as deputy undersecretary of defense for intelligence, suggesting that the Bush administration considered his transgression minor. Perhaps Boykin had spoken out of turn, but his was not a fireable offense. (One can only speculate regarding the fate likely to befall a high-ranking U.S. officer daring to say of Israeli Prime Minister Benjamin Netanyahu, “My God is a real God and his is an idol.”)

A second indicator came in the wake of Boykin’s retirement from active duty. In 2012, the influential Family Research Council (FRC) in Washington hired the general to serve as the organization’s executive vice-president. Devoted to “advancing faith, family, and freedom,” the council presents itself as emphatically Christian in its outlook. FRC events routinely attract Republican Party heavyweights. The organization forms part of the conservative mainstream, much as, say, the American Civil Liberties Union forms part of the left-liberal mainstream.

So for the FRC to hire as its executive vice-president someone espousing Boykin’s pronounced views regarding Islam qualifies as noteworthy. At a minimum, those who recruited the former general apparently found nothing especially objectionable in his worldview. They saw nothing politically risky about associating with Jerry Boykin. He’s their kind of guy. More likely, by hiring Boykin, the FRC intended to send a signal: on matters where its new executive vice-president claimed expertise — above all, war — thumb-in-your-eye political incorrectness was becoming a virtue. Imagine the NAACP electing Nation of Islam leader Louis Farrakhan as its national president, thereby endorsing his views on race, and you get the idea.

What the FRC’s embrace of General Boykin makes clear is this: to dismiss manifestations of Islamophobia simply as the work of an insignificant American fringe is mistaken. As with the supporters of Senator Joseph McCarthy, who during the early days of the Cold War saw communists under every State Department desk, those engaging in these actions are daring to express openly attitudes that others in far greater numbers also quietly nurture. To put it another way, what Americans in the 1950s knew as McCarthyism has reappeared in the form of Boykinism.

Historians differ passionately over whether McCarthyism represented a perversion of anti-Communism or its truest expression. So, too, present-day observers will disagree as to whether Boykinism represents a merely fervent or utterly demented response to the Islamist threat. Yet this much is inarguable: just as the junior senator from Wisconsin in his heyday embodied a non-trivial strain of American politics, so, too, does the former special-ops-warrior-turned-“ordained minister with a passion for spreading the Gospel of Jesus Christ.”

Notably, as Boykinism’s leading exponent, the former general’s views bear a striking resemblance to those favored by the late senator. Like McCarthy, Boykin believes that, while enemies beyond America’s gates pose great dangers, the enemy within poses a still greater threat. “I’ve studied Marxist insurgency,” he declared in a 2010 video. “It was part of my training. And the things I know that have been done in every Marxist insurgency are being done in America today.” Explicitly comparing the United States as governed by Barack Obama to Stalin’s Soviet Union, Mao Zedong’s China, and Fidel Castro’s Cuba, Boykin charges that, under the guise of health reform, the Obama administration is secretly organizing a “constabulary force that will control the population in America.” This new force is, he claims, designed to be larger than the United States military, and will function just as Hitler’s Brownshirts once did in Germany. All of this is unfolding before our innocent and unsuspecting eyes.

Boykinism: The New McCarthyism

How many Americans endorsed McCarthy’s conspiratorial view of national and world politics? It’s difficult to know for sure, but enough in Wisconsin to win him reelection in 1952, by a comfortable 54% to 46% majority. Enough to strike fear into the hearts of politicians who quaked at the thought of McCarthy fingering them for being “soft on Communism.”

How many Americans endorse Boykin’s comparably incendiary views? Again, it’s difficult to tell. Enough to persuade FRC’s funders and supporters to hire him, confident that doing so would burnish, not tarnish, the organization’s brand. Certainly, Boykin has in no way damaged its ability to attract powerhouses of the domestic right. FRC’s recent “Values Voter Summit” featured luminaries such as Republican vice-presidential nominee Paul Ryan, former Republican Senator and presidential candidate Rick Santorum, House Majority Leader Eric Cantor, and Representative Michele Bachmann — along with Jerry Boykin himself, who lectured attendees on “Israel, Iran, and the Future of Western Civilization.” (In early August, Mitt Romney met privately with a group of “prominent social conservatives,” including Boykin.)

Does their appearance at the FRC podium signify that Ryan, Santorum, Cantor, and Bachmann all subscribe to Boykinism’s essential tenets? Not any more than those who exploited the McCarthyite moment to their own political advantage — Richard Nixon, for example — necessarily agreed with all of McCarthy’s reckless accusations. Yet the presence of leading Republicans on an FRC program featuring Boykin certainly suggests that they find nothing especially objectionable or politically damaging to them in his worldview.

Still, comparisons between McCarthyism and Boykinism only go so far. Senator McCarthy wreaked havoc mostly on the home front, instigating witch-hunts, destroying careers, and trampling on civil rights, while imparting to American politics even more of a circus atmosphere than usual. In terms of foreign policy, the effect of McCarthyism, if anything, was to reinforce an already existing anti-communist consensus. McCarthy’s antics didn’t create enemies abroad. McCarthyism merely reaffirmed that communists were indeed the enemy, while making the political price of thinking otherwise too high to contemplate.

Boykinism, in contrast, makes its impact felt abroad. Unlike McCarthyism, it doesn’t strike fear into the hearts of incumbents on the campaign trail here. Attracting General Boykin’s endorsement or provoking his ire probably won’t determine the outcome of any election. Yet in its various manifestations Boykinism provides the kindling that helps sustain anti-American sentiment in the Islamic world. It reinforces the belief among Muslims that the Global War on Terror really is a war against them.

Boykinism confirms what many Muslims are already primed to believe: that American values and Islamic values are irreconcilable. American presidents and secretaries of state stick to their talking points, praising Islam as a great religious tradition and touting past U.S. military actions (ostensibly) undertaken on behalf of Muslims. Yet with their credibility among Iraqis, Afghans, Pakistanis, and others in the Greater Middle East about nil, they are pissing in the wind.

As long as substantial numbers of vocal Americans do not buy the ideological argument constructed to justify U.S. intervention in the Islamic world — that Muslims’ conception of freedom (including religious freedom) is ultimately compatible with our own — neither will Muslims. In that sense, by rejecting that proposition, the adherents of Boykinism encourage Muslims to follow suit. This ensures, by extension, that further reliance on armed force as the preferred instrument of U.S. policy in the Islamic world will compound the errors that produced and have defined the post-9/11 era.

Andrew J. Bacevich is currently a visiting fellow at Notre Dame’s Kroc Institute for International Peace Studies. A TomDispatch regular, he is author of Washington Rules: America’s Path to Permanent War, among other works, and most recently editor of The Short American Century.

Copyright 2012 Andrew J. Bacevich

Scoring the Global War on Terror

With the United States now well into the second decade of what the Pentagon has styled an “era of persistent conflict,” the war formerly known as the global war on terrorism (unofficial acronym WFKATGWOT) appears increasingly fragmented and diffuse.  Without achieving victory, yet unwilling to acknowledge failure, the United States military has withdrawn from Iraq.  It is trying to leave Afghanistan, where events seem equally unlikely to yield a happy outcome.

Elsewhere — in Pakistan, Libya, Yemen, and Somalia, for example — U.S. forces are busily opening up new fronts.  Published reports that the United States is establishing “a constellation of secret drone bases” in or near the Horn of Africa and the Arabian Peninsula suggest that the scope of operations will only widen further.  In a front-page story, the New York Times described plans for “thickening” the global presence of U.S. special operations forces.  Rushed Navy plans to convert an aging amphibious landing ship into an “afloat forward staging base” — a mobile launch platform for either commando raids or minesweeping operations in the Persian Gulf — only reinforce the point.

Yet as some fronts close down and others open up, the war’s narrative has become increasingly difficult to discern.  How much farther until we reach the WFKATGWOT’s equivalent of Berlin?  What exactly is the WFKATGWOT’s equivalent of Berlin?  In fact, is there a storyline here at all?

Viewed close-up, the “war” appears to have lost form and shape.  Yet by taking a couple of steps back, important patterns begin to appear.  What follows is a preliminary attempt to score the WFKATGWOT, dividing the conflict into a bout of three rounds.  Although there may be several additional rounds still to come, here’s what we’ve suffered through thus far.

The Rumsfeld Era

Round 1: Liberation.  More than any other figure — more than any general, even more than the president himself — Secretary of Defense Donald Rumsfeld dominated the war’s early stages.  Appearing for a time to be a larger-than-life figure — the “Secretary at War” in the eyes of an adoring (if fickle) neocon fan club — Rumsfeld dedicated himself to the proposition that, in battle, speed holds the key to victory.  He threw his considerable weight behind a high-tech American version of blitzkrieg.  U.S. forces, he regularly insisted, were smarter and more agile than any adversary.  To employ them in ways that took advantage of those qualities was to guarantee victory.  The journalistic term adopted to describe this concept was “shock and awe.”

No one believed more passionately in “shock and awe” than Rumsfeld himself.  The design of Operation Enduring Freedom, launched in October 2001, and of Operation Iraqi Freedom, begun in March 2003, reflected this belief.  In each instance, the campaign got off to a promising start, with U.S. troops landing some swift and impressive blows.  In neither case, however, were they able to finish off their opponent or even, in reality, sort out just who their opponent might be.  Unfortunately for Rumsfeld, the “terrorists” refused to play by his rulebook and U.S. forces proved to be less smart and agile than their technological edge — and their public relations machine — suggested would be the case.  Indeed, when harassed by minor insurgencies and scattered bands of jihadis, they proved surprisingly slow to figure out what hit them.

In Afghanistan, Rumsfeld let victory slip through his grasp.  In Iraq, his mismanagement of the campaign brought the United States face-to-face with outright defeat.  Rumsfeld’s boss had hoped to liberate (and, of course, dominate) the Islamic world through a series of short, quick thrusts.  What Bush got instead were two different versions of a long, hard slog.  By the end of 2006, “shock and awe” was kaput.  Trailing well behind the rest of the country and its armed forces, the president eventually lost confidence in his defense secretary’s approach.  As a result, Rumsfeld lost his job.  Round one came to an end, the Americans, rather embarrassingly, having lost it on points.

The Petraeus Era

Round 2: Pacification.  Enter General David Petraeus.  More than any other figure, in or out of uniform, Petraeus dominated the WFKATGWOT’s second phase.  Round two opened with lowered expectations.  Gone was the heady talk of liberation.  Gone, too, were predictions of lightning victories.  The United States was now willing to settle for much less while still claiming success. 

Petraeus offered a formula for restoring a semblance of order to countries reduced to chaos as a result of round one.  Order might permit the United States to extricate itself while claiming to have met at least some of its policy objectives.  This became the operative definition of victory.

The formal name for the formula that Petraeus devised was counterinsurgency, or COIN.  Rather than trying to defeat the enemy, COIN sought to facilitate the emergence of a viable and stable nation-state.  This was the stated aim of the “surge” in Iraq ordered by President George W. Bush at the end of 2006. 

With Petraeus presiding, violence in that country did decline precipitously. Whether the relationship was causal or coincidental remains the subject of controversy.  Still, Petraeus’s apparent success persuaded some observers that counterinsurgency on a global scale — GCOIN, they called it — should now form the basis for U.S. national security strategy.  Here, they argued, was an approach that could definitively extract the United States from the WFKATGWOT, while offering victory of a sort.  Rather than employing “shock and awe” to liberate the Islamic world, U.S. forces would apply counterinsurgency doctrine to pacify it.

The task of demonstrating the validity of COIN beyond Iraq fell to General Stanley McChrystal, appointed with much fanfare in 2009 to command U.S. and NATO forces in Afghanistan.  Press reports celebrated McChrystal as another Petraeus, the ideal candidate to replicate the achievements already credited to “King David.” 

McChrystal’s ascendancy came at a moment when a cult of generalship gripped Washington.  Rather than technology determining success, as Rumsfeld had believed, the key was now to put the right guy in charge and then let him run with things.  Political figures on both sides of the aisle fell all over themselves declaring McChrystal the right guy for Afghanistan.  Pundits of all stripes joined the chorus.

Once installed in Kabul, the general surveyed the situation and, to no one’s surprise, announced that “success demands a comprehensive counterinsurgency campaign.”  Implementing that campaign would necessitate an Afghan “surge” mirroring the one that had seemingly turned Iraq around.  In December 2009, albeit with little evident enthusiasm, President Barack Obama acceded to his commander’s request (or ultimatum).  The U.S. troop commitment to Afghanistan rapidly increased.

Here things began to come undone.  Progress toward reducing the insurgency or improving the capacity of Afghan security forces was — by even the most generous evaluation — negligible.  McChrystal made promises — like meeting basic Afghan needs with “government in a box, ready to roll in” — that he proved utterly incapable of keeping.  Relations with the government of President Hamid Karzai remained strained.  Those with neighboring Pakistan, not good to begin with, only worsened.  Both governments expressed deep resentment at what they viewed as high-handed American behavior that killed or maimed noncombatants with disturbing frequency.

To make matters worse, despite all the hype, McChrystal turned out to be miscast — manifestly the wrong guy for the job.  Notably, he proved unable to grasp the need for projecting even some pretense of respect for the principle of civilian control back in Washington.  By the summer of 2010, he was out — and Petraeus was back in.

In Washington (if not in Kabul), Petraeus’s oversized reputation quelled the sense that with McChrystal’s flame-out Afghanistan might be a lost cause.  Surely, the most celebrated soldier of his generation would repeat his Iraq magic, affirming his own greatness and the continued viability of COIN. 

Alas, this was not to be.  Conditions in Afghanistan during Petraeus’s tenure in command improved — if that’s even the word — only modestly.  The ongoing war met just about anyone’s definition of a quagmire.  With considerable understatement, a 2011 National Intelligence Estimate called it a “stalemate.” Soon, talk of a “comprehensive counterinsurgency” faded.  With the bar defining success slipping ever lower, passing off the fight to Afghan security forces and hightailing it for home became the publicly announced war aim.

That job remained unfinished when Petraeus himself headed for home, leaving the Army to become CIA director.  Although Petraeus was still held in high esteem, his departure from active duty left the cult of generalship looking more than a little the worse for wear.  By the time General John Allen succeeded Petraeus — thereby becoming the eighth U.S. officer appointed to preside over the ongoing Afghan War — no one believed that simply putting the right guy in charge was going to produce magic.  On that inconclusive note, round two of the WFKATGWOT ended.

The Vickers Era

Round 3: Assassination.  Unlike Donald Rumsfeld or David Petraeus, Michael Vickers has not achieved celebrity status.  Yet more than anyone else in or out of uniform, Vickers, who carries the title Under Secretary of Defense for Intelligence, deserves recognition as the emblematic figure of the WFKATGWOT’s round three.  His low-key, low-profile persona meshes perfectly with this latest evolution in the war’s character.  Few people outside of Washington know who he is, which is fitting indeed since he presides over a war that few people outside of Washington are paying much attention to any longer.

With the retirement of Secretary of Defense Robert Gates, Vickers is the senior remaining holdover from George W. Bush’s Pentagon.  His background is nothing if not eclectic.  He previously served in U.S. Army Special Forces and as a CIA operative.  In that guise, he played a leading role in supporting the Afghan mujahedeen in their war against Soviet occupiers in the 1980s.  Subsequently, he worked in a Washington think tank and earned a PhD in strategic studies at Johns Hopkins University (dissertation title: “The Structure of Military Revolutions”). 

Even during the Bush era, Vickers never subscribed to expectations that the United States could liberate or pacify the Islamic world.  His preferred approach to the WFKATGWOT has been simplicity itself. “I just want to kill those guys,” he says — “those guys” referring to members of al-Qaeda. Kill the people who want to kill Americans and don’t stop until they are all dead: this defines the Vickers strategy, which over the course of the Obama presidency has supplanted COIN as the latest variant of U.S. strategy. 

The Vickers approach means acting aggressively to eliminate would-be killers wherever they might be found, employing whatever means are necessary.  Vickers “tends to think like a gangster,” one admirer comments. “He can understand trends then change the rules of the game so they are advantageous for your side.”

Round three of the WFKATGWOT is all about bending, breaking, and reinventing rules in ways thought to be advantageous to the United States.  Much as COIN supplanted “shock and awe,” a broad-gauged program of targeted assassination has now displaced COIN as the prevailing expression of the American way of war. 

The United States is finished with the business of sending large land armies to invade and occupy countries on the Eurasian mainland.  Robert Gates, when still Secretary of Defense, made the definitive statement on that subject.  The United States is now in the business of using missile-armed drones and special operations forces to eliminate anyone (not excluding U.S. citizens) the president of the United States decides has become an intolerable annoyance.  Under President Obama, such attacks have proliferated. 

This is America’s new MO.  Paraphrasing a warning issued by Secretary of State Hillary Clinton, a Washington Post dispatch succinctly summarized what it implied: “The United States reserved the right to attack anyone who it determined posed a direct threat to U.S. national security, anywhere in the world.” 

Furthermore, acting on behalf of the United States, the president exercises this supposed right without warning, without regard to claims of national sovereignty, without Congressional authorization, and without consulting anyone other than Michael Vickers and a few other members of the national security apparatus.  The role allotted to the American people is to applaud, if and when notified that a successful assassination has occurred.  And applaud we do when, for example, members of SEAL Team Six secretly enter Pakistan in a daring raid to dispatch Osama bin Laden with two neatly placed kill shots.  Vengeance long deferred makes it unnecessary to consider what second-order political complications might ensue.

How round three will end is difficult to forecast.  The best we can say is that it’s unlikely to end anytime soon or particularly well.  As Israel has discovered, once targeted assassination becomes your policy, the list of targets has a way of growing ever longer. 

So what tentative judgments can we offer regarding the ongoing WFKATGWOT?  Operationally, a war launched by the conventionally minded has progressively fallen under the purview of those who inhabit what Dick Cheney once called “the dark side,” with implications that few seem willing to explore.  Strategically, a war informed at the outset by utopian expectations continues today with no concretely stated expectations whatsoever, the forward momentum of events displacing serious consideration of purpose.  Politically, a war that once occupied center stage in national politics has now slipped to the periphery, the American people moving on to other concerns and entertainments, with legal and moral questions raised by the war left dangling in midair.

Is this progress?

Andrew J. Bacevich is professor of history and international relations at Boston University.  A TomDispatch regular, he is the author most recently of Washington Rules: America’s Path to Permanent War and the editor of the new book The Short American Century: A Postmortem, just out from Harvard University Press.

Copyright 2012 Andrew Bacevich

Ballpark Liturgy: America’s New Civic Religion

Fenway Park, Boston, July 4, 2011.  On this warm summer day, the Red Sox will play the Toronto Blue Jays.  First come pre-game festivities, especially tailored for the occasion.  The ensuing spectacle — a carefully scripted encounter between the armed forces and society — expresses the distilled essence of present-day American patriotism.  A masterpiece of contrived spontaneity, the event leaves spectators feeling good about their baseball team, about their military, and not least of all about themselves — precisely as it was meant to do.

In this theatrical production, the Red Sox provide the stage, and the Pentagon the props.  In military parlance, it is a joint operation.  In front of a gigantic American flag draped over the left-field wall, an Air Force contingent, clad in blue, stands at attention.  To carry a smaller version of the Stars and Stripes onto the playing field, the Navy provides a color guard in crisp summer whites.  The United States Marine Corps kicks in with a choral ensemble that leads the singing of the national anthem.  As the anthem’s final notes sound, four U.S. Air Force F-15C Eagles scream overhead.  The sellout crowd roars its approval.

But there is more to come. “On this Independence Day,” the voice of the Red Sox booms over the public address system, “we pay a debt of gratitude to the families whose sons and daughters are serving our country.”  On this particular occasion the designated recipients of that gratitude are members of the Lydon family, hailing from Squantum, Massachusetts.  Young Bridget Lydon is a sailor — Aviation Ordnanceman Airman is her official title — serving aboard the carrier USS Ronald Reagan, currently deployed in support of the Afghanistan War, now in its 10th year.

From Out of Nowhere

The Lydons are Every Family, decked out for the Fourth.  Garbed in random bits of Red Sox paraphernalia and Mardi Gras necklaces, they wear their shirts untucked and ball caps backwards.  Neither sleek nor fancy, they are without pretension.  Yet they exude good cheer.  As they are ushered onto the field, their eagerness is palpable.  Like TV game show contestants, they know that this is their lucky day and they are keen to make the most of it.

As the Lydons gather near the pitcher’s mound, the voice directs their attention to the 38-by-100-foot Jumbotron mounted above the centerfield bleachers.  On the screen, Bridget appears.  She is aboard ship, in duty uniform, posed below decks in front of an F/A-18 fighter jet.  Waiflike, but pert and confident, she looks directly into the camera, sending a “shout-out” to family and friends.  She wishes she could join them at Fenway. 

As if by magic, wish becomes fulfillment.  While the video clip is still running, Bridget herself, now in dress whites, emerges from behind the flag covering the leftfield wall.  On the Jumbotron, in place of Bridget below decks, an image of Bridget marching smartly toward the infield appears.  In the stands pandemonium erupts.  After a moment of confusion, members of her family — surrounded by camera crews — rush to embrace their sailor, a reunion shared vicariously by the 38,000 fans in attendance along with many thousands more watching at home on the Red Sox television network. 

Once the Lydons finish with hugs and kisses and the crowd settles down, Navy sailor Bridget (annual salary approximately $22,000) throws the ceremonial first pitch to aging Red Sox veteran Tim Wakefield (annual salary $2,000,000).  More cheers.  As a souvenir, Wakefield gives her the baseball along with his own hug.  All smiles, Bridget and her family shout “Play Ball!” into a microphone.  As they are escorted off the field and out of sight, the game begins.

Cheap Grace

What does this event signify?

For the Lydons, the day will no doubt long remain a happy memory.  If they were to some degree manipulated — their utter and genuine astonishment at Bridget’s seemingly miraculous appearance lending the occasion its emotional punch — they played their allotted roles without complaint and with considerable élan.  However briefly, they stood in the spotlight, quasi-celebrities, all eyes trained on them, a contemporary version of the American dream fulfilled.  And if offstage puppet-masters used Bridget herself, at least she got a visit home and a few days off — no doubt a welcome break. 

Yet this feel-good story was political as well as personal.  As a collaboration between two well-heeled but image-conscious institutions, the Lydon reunion represented a small but not inconsequential public relations triumph.  The Red Sox and the Navy had worked together to perform an act of kindness for a sailor and her loved ones.  Both organizations came away looking good, not only because the event itself was so deftly executed, but because it showed that the large for-profit professional sports team and the even larger military bureaucracy both care about ordinary people.  The message conveyed to fans/taxpayers could not be clearer: the corporate executives who run the Red Sox have a heart. So, too, do the admirals who run the Navy.

Better still, these benefits accrued at essentially no cost to the sponsors.  The military personnel arrayed around Fenway showed up because they were told to do so.  They are already “paid for,” as are the F-15s, the pilots who fly them, and the ground crews that service them.  As for whatever outlays the Red Sox may have made, they are trivial and easily absorbed.  For the 2011 season, the average price of a ticket at Fenway Park had climbed to $52.  A soft drink in a commemorative plastic cup runs you $5.50 and a beer $8.  Then there is the television ad revenue, all contributing the previous year to corporate profits exceeding $58 million.  A decade of war culminating in the worst economic crisis since the Great Depression hasn’t done much good for the country, but it has been strangely good for the Red Sox — and a no less well-funded Pentagon.  Any money expended in bringing Bridget to Fenway and entertaining the Lydons had to be the baseball/military equivalent of pocket change.

And the holiday festivities at Fenway had another significance as well, one that extended beyond burnishing institutional reputations and boosting bottom lines.  Here was America’s civic religion made manifest. 

In recent decades, an injunction to “support the troops” has emerged as a central tenet of that religion.  Since 9/11 this imperative has become, if anything, even more binding.  Indeed, as citizens, Americans today acknowledge no higher obligation.

Fulfilling that obligation has posed a challenge, however.  Rather than doing so concretely, Americans — with a few honorable exceptions — have settled for symbolism.  With their pronounced aversion to collective service and sacrifice (an inclination indulged by leaders of both political parties), Americans resist any definition of civic duty that threatens to crimp lifestyles. 

To stand in solidarity with those on whom the burden of service and sacrifice falls is about as far as they will go.  Expressions of solidarity affirm that the existing relationship between soldiers and society is consistent with democratic practice.  By extension, so, too, is the distribution of prerogatives and responsibilities entailed by that relationship: a few fight, the rest applaud.  Put simply, the message that citizens wish to convey to their soldiers is this: although choosing not to be with you, we are still for you (so long as being for you entails nothing on our part).  Cheering for the troops, in effect, provides a convenient mechanism for voiding obligation and easing guilty consciences.   

In ways far more satisfying than displaying banners or bumper stickers, the Fenway Park Independence Day event provided a made-to-order opportunity for conscience easing.  It did so in three ways.  First, it brought members of Red Sox Nation into close proximity (even if not direct contact) with living, breathing members of the armed forces, figuratively closing any gap between the two.  (In New England, where few active duty military installations remain, such encounters are increasingly infrequent.)  Second, it manufactured one excuse after another to whistle and shout, whoop and holler, thereby allowing the assembled multitudes to express — and to be seen expressing — their affection and respect for the troops.  Finally, it rewarded participants and witnesses alike with a sense of validation, the reunion of Bridget and her family, even if temporary, serving as a proxy for a much larger, if imaginary, reconciliation of the American military and the American people.  That debt?  Mark it paid in full.

The late German theologian Dietrich Bonhoeffer had a name for this unearned self-forgiveness and undeserved self-regard.  He called it cheap grace.  Were he alive today, Bonhoeffer might suggest that a taste for cheap grace, compounded by an appetite for false freedom, is leading Americans down the road to perdition. 

Andrew J. Bacevich, the author of Washington Rules: America’s Path to Permanent War, is professor of history and international relations at Boston University. His next book, of which this post is a small part, will assess the impact of a decade of war on American society and the United States military.

Copyright 2011 Andrew Bacevich

On the Mend?

At periodic intervals, the American body politic has shown a marked susceptibility to messianic fevers.  Whenever an especially acute attack occurs, a sort of delirium ensues, manifesting itself in delusions of grandeur and demented behavior.

By the time the condition passes and a semblance of health is restored, recollection of what occurred during the illness tends to be hazy.  What happened?  How’d we get here?  Most Americans prefer not to know.  No sense dwelling on what’s behind us.  Feeling much better now!  Thanks!

Gripped by such a fever in 1898, Americans evinced an irrepressible impulse to liberate oppressed Cubans.  By the time they’d returned to their senses, having acquired various parcels of real estate between Puerto Rico and the Philippines, no one could quite explain what had happened or why.  (The Cubans meanwhile had merely exchanged one set of overseers for another.)

In 1917, the fever suddenly returned.  Amid wild ravings about waging a war to end war, Americans lurched off to France.  This time the affliction passed quickly, although the course of treatment proved painful: confinement to the charnel house of the Western Front, followed by bitter medicine administered at Versailles.

The 1960s brought another bout (and so yet more disappointment).  An overwhelming urge to “pay any price, bear any burden” landed Americans in Vietnam.  The fall of Saigon in 1975 seemed, for a brief interval, to inoculate the body politic against any further recurrence.  Yet the salutary effects of this “Vietnam syndrome” proved fleeting.  By the time the Cold War ended, Americans were running another temperature, their self-regard reaching impressive new heights.  Out of Washington came all sorts of embarrassing gibberish about permanent global supremacy and history’s purpose finding fulfillment in the American way of life.

Give Me Fever

Then came 9/11 and the fever simply soared off the charts.  The messiah-nation was really pissed and was going to fix things once and for all.

Nearly 10 years have passed since Washington set out to redeem the Greater Middle East.  The crusades have not gone especially well.  In fact, in the pursuit of its saving mission, the American messiah has pretty much worn itself out.

Today, the post-9/11 fever finally shows signs of abating.  The evidence is partial and preliminary.  The sickness has by no means passed.  Oddly, it lingers most strongly in the Obama White House, of all places, where a keenness to express American ideals by dropping bombs seems strangely undiminished.

Yet despite the urges of some in the Obama administration, after nearly a decade of self-destructive flailing about, American recovery has become a distinct possibility.  Here’s some of the evidence:

In Washington, it’s no longer considered a sin to question American omnipotence.  Take the case of Robert Gates.  The outgoing secretary of defense may well be the one senior U.S. official of the past decade to leave office with his reputation not only intact, but actually enhanced.  (Note to President Obama: think about naming an aircraft carrier after the guy.)  Yet beyond restoring a modicum of competence and accountability to the Pentagon, Gates’s legacy is likely to be found in his willingness — however belated — to acknowledge the limits of American power.

That the United States should avoid wars except when absolutely necessary no longer connotes incipient isolationism.  It is once again a sign of common sense, with Gates a leading promoter.  Modesty is becoming respectable.

The Gates Doctrine

No one can charge Gates with being an isolationist or a national security wimp.  Neither is he a “declinist.”  So when he says anyone proposing another major land war in the Greater Middle East should “have his head examined” — citing the authority of Douglas MacArthur, no less — people take notice.  Or more recently there was this:  “I’ve got a military that’s exhausted,” Gates remarked, in one of those statements of the obvious too seldom heard from on high.  “Let’s just finish the wars we’re in and keep focused on that instead of signing up for other wars of choice.”  Someone should etch that into the outer walls of the Pentagon’s E-ring.

A half-dozen years ago, “wars of choice” were all the rage in Washington.  No more.  Thank you, Mr. Secretary.

Or consider the officer corps.  There is no “military mind,” but there are plenty of minds in the military, and a fair number of them are changing.

Evidence suggests that the officer corps itself is rethinking the role of military power.  Consider, for example, “Mr. Y,” author of A National Strategic Narrative, published this spring to considerable acclaim by the Woodrow Wilson International Center for Scholars.  The actual authors of this report are two military professionals, one a Navy captain, the other a Marine colonel.

What you won’t find in this document are jingoism, braggadocio, chest-thumping, and calls for a bigger military budget.  If there’s an overarching theme, it’s pragmatism.  Rather than the United States imposing its will on the world, the authors want more attention paid to the investment needed to rebuild at home.

The world is too big and complicated for any one nation to call the shots, they insist.  The effort to do so is self-defeating. “As Americans,” Mr. Y writes, “we needn’t seek the world’s friendship or proselytize the virtues of our society.  Neither do we seek to bully, intimidate, cajole, or persuade others to accept our unique values or to share our national objectives.  Rather, we will let others draw their own conclusions based upon our actions… We will pursue our national interests and let others pursue theirs…”

You might dismiss this as the idiosyncratic musing of two officers who have spent too much time having their brains baked in the Iraqi or Afghan sun.  I don’t.  What convinces me otherwise is the positive email traffic that my own musings about the misuse and abuse of American power elicit weekly from serving officers.  It’s no scientific sample, but the captains, majors, and lieutenant colonels I hear from broadly agree with Mr. Y.  They’ve had a bellyful of twenty-first-century American war and are open to a real debate over how to overhaul the nation’s basic approach to national security.

Intelligence Where You Least Expect It

And finally, by gum, there is the United States Congress.  Just when that body appeared to have entered a permanent vegetative state, a flickering of intelligent life has made its reappearance.  Perhaps more remarkably still, the signs are evident on both sides of the aisle as Democrats and Republicans alike — albeit for different reasons — are raising serious questions about the nation’s propensity for multiple, open-ended wars.

Some members cite concerns for the Constitution and the abuse of executive power.  Others worry about the price tag.  With Osama bin Laden out of the picture, still others insist that it’s time to rethink strategic priorities.  No doubt partisan calculation or personal ambition figures alongside matters of principle.  They are, after all, politicians.

Given what polls indicate is a growing public unhappiness over the Afghan War, speaking out against that war these days doesn’t exactly require political courage.  Still, the possibility of our legislators reasserting a role in deciding whether or not a war actually serves the national interest — rather than simply rubberstamping appropriations and slinking away — now presents itself.  God bless the United States Congress.

Granted, the case presented here falls well short of being conclusive.  To judge by his announcement of a barely-more-than-symbolic troop withdrawal from Afghanistan, President Obama himself seems uncertain of where he stands.  And clogging the corridors of power or the think tanks and lobbying arenas that surround them are plenty of folks still hankering to have a go at Syria or Iran.

At the first signs of self-restraint, you can always count on the likes of Senator John McCain or the editorial board of the Wall Street Journal to decry (in McCain’s words) an “isolationist-withdrawal-lack-of-knowledge-of-history attitude” hell-bent on pulling up the drawbridge and having Americans turn their backs on the world.  In such quarters, fever is a permanent condition and it’s always 104 and rising.  Yet it is a measure of just how quickly things are changing that McCain himself, once deemed a source of straight talk, now comes across as a mere crank.

In this way, nearly a decade after our most recent descent into madness, does the possibility of recovery finally beckon.

Andrew J. Bacevich is professor of history and international relations at Boston University. His most recent book is Washington Rules: America’s Path to Permanent War.

Copyright 2011 Andrew J. Bacevich

Not Why, But How

It is a commonplace of American politics: when the moving van pulls up to the White House on Inauguration Day, it delivers not only a closetful of gray suits and power ties, but a boatload of expectations.

A president, being the most powerful man in the world, begins history anew — so at least Americans believe, or pretend to believe.  Out with the old, sordid, and disappointing; in with the fresh, unsullied, and hopeful.  Why, with the stroke of a pen, a new president can order the closing of an embarrassing and controversial off-shore prison for accused terrorists held for years on end without trial!  Just like that: done.

For all sorts of reasons, the expectations raised by Barack Obama’s arrival in the Oval Office were especially high.  Americans weren’t the only ones affected.  How else to explain the Nobel Committee’s decision to honor the new president by transforming its Peace Prize into a Prize Anticipating Peace — more or less the equivalent of designating the winner of the Heisman Trophy during week one of the college football season?

Of course, if the political mood immediately prior to and following a presidential inauguration emphasizes promise and discovery (the First Lady has biceps!), it doesn’t take long for the novelty to start wearing off.  Then the narrative arc takes a nosedive: he’s breaking his promises, he’s letting us down, he’s not so different after all.

The words of H.L. Mencken apply.  “When I hear a man applauded by the mob,” the Sage of Baltimore wrote, “I always feel a pang of pity for him.  All he has to do to be hissed is to live long enough.”  Barack Obama has now lived long enough to attract his fair share of hisses, boos, and catcalls.

Along with prolonging and expanding one war in Afghanistan, the Nobel Peace laureate has played a leading role in starting another war in Libya.  Laboring to distinguish between this administration and its predecessor, Obama’s defenders emphasize the purity of his motives.  Contemptuous of George W. Bush’s claim that U.S. forces invaded oil-rich Iraq to keep weapons of mass destruction out of the hands of terrorists, they readily accept this president’s insistence that the United States intervened in oil-rich Libya to prevent genocidal slaughter.  Besides, testifying to our virtuous intent, this time we’ve got the French with us rather than against us.

Explaining Why Is a Mug’s Game

In truth, to ascribe a single governing purpose or rationale to any large-scale foreign policy initiative is to engage in willful distortion.  In any administration, action grows out of consensus.  The existence of consensus among any president’s advisers — LBJ’s inner circle supporting escalation in South Vietnam back in 1965, George W.’s pressing for regime change in Baghdad — does not imply across-the-board agreement as to intent.

Motive is slippery.  As Paul Wolfowitz famously noted regarding Iraq, weapons of mass destruction merely provided the agreed upon public rationale for war.  In reality, a mix of motives probably shaped the decision to invade.  For some administration officials, there was the prospect of eliminating a perceived source of mischief while providing an object lesson to other would-be troublemakers.  For others, there was the promise of reasserting U.S. hegemony over the world’s energy heartland.  For others still (including Wolfowitz himself), there were alluring visions of a region transformed, democratized, and pacified, the very sources of Islamist terror thereby eliminated once and for all. 

At least on the margins, expanding the powers of the presidency at the expense of Congress, bolstering the security of Israel, and finishing what daddy had left undone also likely figured in the equation.  Within this mix, policymakers could pick and choose.

In the face of changing circumstances, they even claimed the prerogative of revising their choices.  Who can doubt that President Bush, faced with the Big Oops — the weapons of mass destruction that turned out not to exist — genuinely persuaded himself that America’s true and abiding purpose for invading Iraq had been to liberate the Iraqi people from brutal oppression?  After all, right from day one wasn’t the campaign called Operation Iraqi Freedom?

So even as journalists and historians preoccupy themselves with trying to explain why something happened, they are playing a mug’s game.  However creative or well-sourced, their answers are necessarily speculative, partial, and ambiguous.  It can’t be otherwise.

Rather than why, what deserves far more attention than it generally receives is the question of how.  Here is where we find Barack Obama and George W. Bush (not to mention Bill Clinton, George H. W. Bush, Ronald Reagan, and Jimmy Carter) joined at the hip.  When it comes to the Islamic world, for more than three decades now Washington’s answer to how has been remarkably consistent: through the determined application of hard power wielded by the United States.  Simply put, Washington’s how implies a concerted emphasis on girding for and engaging in war. 

Presidents may not agree on exactly what we are trying to achieve in the Greater Middle East (Obama wouldn’t be caught dead reciting lines from Bush’s Freedom Agenda, for example), but for the past several decades, they have agreed on means: whatever it is we want done, military might holds the key to doing it.  So today, we have the extraordinary spectacle of Obama embracing and expanding Bush’s Global War on Terror even after having permanently banished that phrase to the Guantanamo of politically incorrect speech.

The Big How — By Force

Efforts to divine this administration’s intent in Libya have centered on the purported influence of the Three Harpies: Secretary of State Hillary Clinton, U.N. Ambassador Susan Rice, and National Security Council Human Rights Director Samantha Power, women in positions of influence ostensibly burdened with regret that the United States failed back in 1994 to respond effectively to the Rwandan genocide and determined this time to get it right.  Yet this is insider stuff, which necessarily remains subject to considerable speculation.  What we can say for sure is this: by seeing the Greater Middle East as a region of loose nails badly in need of being hammered, the current commander-in-chief has claimed his place in the ranks of a long list of his warrior-predecessors.

The key point is this: like those who preceded them, neither Obama nor his Harpies (nor anyone else in a position of influence) could evidently be bothered to assess whether the hammer actually works as advertised — notwithstanding abundant evidence showing that it doesn’t.

The sequence of military adventures set in motion when Jimmy Carter promulgated his Carter Doctrine back in 1980 makes for an interesting story but not a very pretty one.  Ronald Reagan’s effort to bring peace to Lebanon ended in 1983 in a bloody catastrophe.  The nominal victory of Operation Desert Storm in 1991, which pushed Saddam Hussein’s forces out of Kuwait, produced little except woeful complications, which Bill Clinton’s penchant for flinging bombs and missiles about during the 1990s did little to resolve or conceal.  The blowback stemming from our first Afghanistan intervention against the Soviets helped create the conditions leading to 9/11 and another Afghanistan War, now approaching its tenth anniversary with no clear end in sight.  As for George W. Bush’s second go at Iraq, the less said the better.  Now, there is Libya.

The question demands to be asked: Are we winning yet?  And if not, why persist in an effort for which great pain is repaid with such little gain?

Perhaps Barack Obama found his political soul mate in Samantha Power, making her determination to alleviate evil around the world his own.  Or perhaps he is just another calculating politician who speaks the language of ideals while pursuing less exalted purposes.  In either case, the immediate relevance of the question is limited.  The how rather than the why is determinant.

Whatever his motives, by conforming to a pre-existing American penchant for using force in the Greater Middle East, this president has chosen the wrong tool.  In doing so, he condemns himself and the country to persisting in the folly of his predecessors.  The failure is one of imagination, but also of courage.  He promised, and we deserve, something better.

Andrew J. Bacevich is professor of history and international relations at Boston University.  His most recent book, Washington Rules: America’s Path to Permanent War (Metropolitan Books), is just out in paperback.

Copyright 2011 Andrew Bacevich

In defense circles, “cutting” the Pentagon budget has once again become a topic of conversation.  Americans should not confuse that talk with reality.  Any cuts exacted will at most reduce the rate of growth.  The essential facts remain: U.S. military outlays today equal those of every other nation on the planet combined, a situation without precedent in modern history.

The Pentagon presently spends more in constant dollars than it did at any time during the Cold War — this despite the absence of anything remotely approximating what national security experts like to call a “peer competitor.”  Evil Empire?  It exists only in the fevered imaginations of those who quiver at the prospect of China adding a rust-bucket Russian aircraft carrier to its fleet or who take seriously the ravings of radical Islamists promising from deep inside their caves to unite the Umma in a new caliphate.

What are Americans getting for their money?  Sadly, not much.  Despite extraordinary expenditures (not to mention exertions and sacrifices by U.S. forces), the return on investment is, to be generous, unimpressive.  The chief lesson to emerge from the battlefields of the post-9/11 era is this: the Pentagon possesses next to no ability to translate “military supremacy” into meaningful victory.

Washington knows how to start wars and how to prolong them, but is clueless when it comes to ending them.  Iraq, the latest addition to the roster of America’s forgotten wars, stands as Exhibit A.  Each bomb that blows up in Baghdad or some other Iraqi city, splattering blood all over the streets, testifies to the manifest absurdity of judging “the surge” as the epic feat of arms celebrated by the Petraeus lobby.

The problems are strategic as well as operational.  Old Cold War-era expectations that projecting U.S. power will enhance American clout and standing no longer apply, especially in the Islamic world.  There, American military activities are instead fostering instability and inciting anti-Americanism.  For Exhibit B, see the deepening morass that Washington refers to as AfPak or the Afghanistan-Pakistan theater of operations.

Add to that the mountain of evidence showing that Pentagon, Inc. is a miserably managed enterprise: hidebound, bloated, slow-moving, and prone to wasting resources on a prodigious scale — nowhere more so than in weapons procurement and the outsourcing of previously military functions to “contractors.”  When it comes to national security, effectiveness (what works) should rightly take precedence over efficiency (at what cost?) as the overriding measure of merit.  Yet beyond a certain level, inefficiency undermines effectiveness, with the Pentagon stubbornly and habitually exceeding that level.  By comparison, Detroit’s much-maligned Big Three offer models of well-run enterprises.

Impregnable Defenses

All of this takes place against the backdrop of mounting problems at home: stubbornly high unemployment, trillion-dollar federal deficits, massive and mounting debt, and domestic needs like education, infrastructure, and employment crying out for attention.

Yet the defense budget — a misnomer since for Pentagon, Inc. defense per se figures as an afterthought — remains a sacred cow.  Why is that? 

The answer lies first in understanding the defenses arrayed around that cow to ensure that it remains untouched and untouchable.  Exemplifying what the military likes to call a “defense in depth,” that protective shield consists of four distinct but mutually supporting layers. 

Institutional Self-Interest: Victory in World War II produced not peace, but an atmosphere of permanent national security crisis.  As never before in U.S. history, threats to the nation’s existence seemed omnipresent, an attitude first born in the late 1940s that still persists today.  In Washington, fear — partly genuine, partly contrived — triggered a powerful response. 

One result was the emergence of the national security state, an array of institutions that depended on (and therefore strove to perpetuate) this atmosphere of crisis to justify their existence, status, prerogatives, and budgetary claims.  In addition, a permanent arms industry arose, which soon became a major source of jobs and corporate profits.  Politicians of both parties were quick to identify the advantages of aligning with this “military-industrial complex,” as President Eisenhower described it. 

Allied with (and feeding off of) this vast apparatus that transformed tax dollars into appropriations, corporate profits, campaign contributions, and votes was an intellectual axis of sorts  — government-supported laboratories, university research institutes, publications, think tanks, and lobbying firms (many staffed by former or would-be senior officials) — devoted to identifying (or conjuring up) ostensible national security challenges and alarms, always assumed to be serious and getting worse, and then devising responses to them. 

The upshot: within Washington, the voices carrying weight in any national security “debate” all share a predisposition for sustaining very high levels of military spending for reasons having increasingly little to do with the well-being of the country.

Strategic Inertia: In a 1948 State Department document, diplomat George F. Kennan offered this observation: “We have about 50 percent of the world’s wealth, but only 6.3 percent of its population.”  The challenge facing American policymakers, he continued, was “to devise a pattern of relationships that will permit us to maintain this disparity.”  Here we have a description of American purposes that is far more candid than all of the rhetoric about promoting freedom and democracy, seeking world peace, or exercising global leadership. 

The end of World War II found the United States in a spectacularly privileged position.  Not for nothing do Americans remember the immediate postwar era as a Golden Age of middle-class prosperity.  Policymakers since Kennan’s time have sought to preserve that globally privileged position.  The effort has been a largely futile one. 

By 1950 at the latest, those policymakers (with Kennan by then a notable dissenter) had concluded that the possession and deployment of military power held the key to preserving America’s exalted status.  The presence of U.S. forces abroad and a demonstrated willingness to intervene, whether overtly or covertly, just about anywhere on the planet would promote stability, ensure U.S. access to markets and resources, and generally serve to enhance the country’s influence in the eyes of friend and foe alike — this was the idea, at least. 

In postwar Europe and postwar Japan, this formula achieved considerable success.  Elsewhere — notably in Korea, Vietnam, Latin America, and (especially after 1980) in the so-called Greater Middle East — it either produced mixed results or failed catastrophically.  Certainly, the events of the post-9/11 era provide little reason to believe that this presence/power-projection paradigm will provide an antidote to the threat posed by violent anti-Western jihadism.  If anything, adherence to it is exacerbating the problem by creating ever greater anti-American animus.

One might think that the manifest shortcomings of the presence/power-projection approach — trillions expended in Iraq for what? — would stimulate present-day Washington to pose some first-order questions about basic U.S. national security strategy.  A certain amount of introspection would seem to be called for.  Could, for example, the effort to sustain what remains of America’s privileged status benefit from another approach?

Yet there are few indications that our political leaders, the senior-most echelons of the officer corps, or those who shape opinion outside of government are capable of seriously entertaining any such debate.  Whether through ignorance, arrogance, or a lack of imagination, the pre-existing strategic paradigm stubbornly persists; so, too, as if by default do the high levels of military spending that the strategy entails.

Cultural Dissonance: The rise of the Tea Party movement should disabuse any American of the thought that the cleavages produced by the “culture wars” have healed.  The cultural upheaval touched off by the 1960s and centered on Vietnam remains unfinished business in this country. 

Among other things, the sixties destroyed an American consensus, forged during World War II, about the meaning of patriotism.  During the so-called Good War, love of country implied, even required, deference to the state, shown most clearly in the willingness of individuals to accept the government’s authority to mandate military service.  GIs, the vast majority of them draftees, were the embodiment of American patriotism, risking life and limb to defend the country.

The GI of World War II had been an American Everyman.  Those soldiers both represented and reflected the values of the nation from which they came (a perception affirmed by the ironic fact that the military adhered to prevailing standards of racial segregation).  It was “our army” because that army was “us.” 

With Vietnam, things became more complicated.  The war’s supporters argued that the World War II tradition still applied: patriotism required deference to the commands of the state.  Opponents of the war, especially those facing the prospect of conscription, insisted otherwise.  They revived the distinction, formulated a generation earlier by the radical journalist Randolph Bourne, between the country and the state.  Real patriots, the ones who most truly loved their country, were those who opposed state policies they regarded as misguided, illegal, or immoral.

In many respects, the soldiers who fought the Vietnam War found themselves caught uncomfortably in the center of this dispute.  Was the soldier who died in Vietnam a martyr, a tragic figure, or a sap?  Who deserved greater admiration:  the soldier who fought bravely and uncomplainingly or the one who served and then turned against the war?  Or was the war resister — the one who never served at all — the real hero? 

War’s end left these matters disconcertingly unresolved.  President Richard Nixon’s 1971 decision to kill the draft in favor of an All-Volunteer Force, predicated on the notion that the country might be better served with a military that was no longer “us,” only complicated things further.  So, too, did the trends in American politics where bona fide war heroes (George H.W. Bush, Bob Dole, John Kerry, and John McCain) routinely lost to opponents whose military credentials were non-existent or exceedingly slight (Bill Clinton, George W. Bush, and Barack Obama), yet who demonstrated once in office a remarkable propensity for expending American blood (none belonging to members of their own families) in places like Somalia, Iraq, and Afghanistan.  It was all more than a little unseemly.

Patriotism, once a simple concept, had become both confusing and contentious.  What obligations, if any, did patriotism impose?  And if the answer was none — the option Americans seemed increasingly to prefer — then was patriotism itself still a viable proposition? 

Wanting to answer that question in the affirmative — to distract attention from the fact that patriotism had become little more than an excuse for fireworks displays and taking the occasional day off from work — people and politicians alike found a way to do so by exalting those Americans actually choosing to serve in uniform.  The thinking went this way: soldiers offer living proof that America is a place still worth dying for, that patriotism (at least in some quarters) remains alive and well; by common consent, therefore, soldiers are the nation’s “best,” committed to “something bigger than self” in a land otherwise increasingly absorbed in pursuing a material and narcissistic definition of self-fulfillment. 

In effect, soldiers offer much-needed assurance that old-fashioned values still survive, even if confined to a small and unrepresentative segment of American society.  Rather than Everyman, today’s warrior has ascended to the status of icon, deemed morally superior to the nation for which he or she fights, the repository of virtues that prop up, however precariously, the nation’s increasingly sketchy claim to singularity.

Politically, therefore, “supporting the troops” has become a categorical imperative across the political spectrum.  In theory, such support might find expression in a determination to protect those troops from abuse, and so translate into wariness about committing soldiers to unnecessary or unnecessarily costly wars.  In practice, however, “supporting the troops” has found expression in an insistence upon providing the Pentagon with open-ended drawing rights on the nation’s treasury, thereby creating massive barriers to any proposal to effect more than symbolic reductions in military spending.

Misremembered History: The duopoly of American politics no longer allows for a principled anti-interventionist position.  Both parties are war parties.  They differ mainly in the rationales they devise to argue for interventionism.  The Republicans tout liberty; the Democrats emphasize human rights.  The results tend to be the same: a penchant for activism that sustains a never-ending demand for high levels of military outlays.

American politics once nourished a lively anti-interventionist tradition.  Leading proponents included luminaries such as George Washington and John Quincy Adams.  That tradition found its basis not in principled pacifism, a position that has never attracted widespread support in this country, but in pragmatic realism.  What happened to that realist tradition?  Simply put, World War II killed it — or at least discredited it.  In the intense and divisive debate that occurred in 1939-1941, the anti-interventionists lost, their cause thereafter tarred with the label “isolationism.” 

The passage of time has transformed World War II from a massive tragedy into a morality tale, one that casts opponents of intervention as blackguards.  Whether explicitly or implicitly, the debate over how the United States should respond to some ostensible threat — Iraq in 2003, Iran today — replays the debate finally ended by the events of December 7, 1941.  To express skepticism about the necessity and prudence of using military power is to invite the charge of being an appeaser or an isolationist.  Few politicians or individuals aspiring to power will risk the consequences of being tagged with that label. 

In this sense, American politics remains stuck in the 1930s — always discovering a new Hitler, always privileging Churchillian rhetoric — even though the circumstances in which we live today bear scant resemblance to that earlier time.  There was only one Hitler and he’s long dead.  As for Churchill, his achievements and legacy are far more mixed than his battalions of defenders are willing to acknowledge.  And if any one figure deserves particular credit for demolishing Hitler’s Reich and winning World War II, it’s Josef Stalin, a dictator as vile and murderous as Hitler himself. 

Until Americans accept these facts, until they come to a more nuanced view of World War II that takes fully into account the political and moral implications of the U.S. alliance with the Soviet Union and the U.S. campaign of obliteration bombing directed against Germany and Japan, the mythic version of “the Good War” will continue to provide glib justifications for continuing to dodge that perennial question: How much is enough?

Like concentric security barriers arrayed around the Pentagon, these four factors — institutional self-interest, strategic inertia, cultural dissonance, and misremembered history — insulate the military budget from serious scrutiny.  For advocates of a militarized approach to policy, they provide invaluable assets, to be defended at all costs. 

Andrew J. Bacevich is professor of history and international relations at Boston University.  His most recent book is Washington Rules: America’s Path to Permanent War.  To listen to Timothy MacBain’s latest TomCast audio interview in which Bacevich discusses the money that pours into the national security budget, click here or, to download it to your iPod, here.

Copyright 2011 Andrew Bacevich

Cow Most Sacred

In January 1863, President Abraham Lincoln’s charge to a newly appointed commanding general was simplicity itself: “give us victories.”  President Barack Obama’s tacit charge to his generals amounts to this: give us conditions permitting a dignified withdrawal.  A pithy quote in Bob Woodward’s new book captures the essence of an emerging Obama Doctrine: “hand it off and get out.”

Getting into a war is generally a piece of cake.  Getting out tends to be another matter altogether — especially when the commander-in-chief and his commanders in the field disagree on the advisability of doing so.

Happy Anniversary, America.  Nine years ago today — on October 7, 2001 — a series of U.S. air strikes against targets across Afghanistan launched the opening campaign of what has since become the nation’s longest war.  Three thousand two hundred and eighty-seven days later, the fight to determine Afghanistan’s future continues.  At least in part, “Operation Enduring Freedom” has lived up to its name: it has certainly proven to be enduring.

As the conflict formerly known as the Global War on Terror enters its tenth year, Americans are entitled to pose this question: When, where, and how will the war end?  Bluntly, are we almost there yet?

Of course, with the passage of time, where “there” is has become increasingly difficult to discern.  Baghdad turned out not to be Berlin and Kandahar is surely not Tokyo.  Don’t look for CNN to be televising a surrender ceremony anytime soon.

This much we know: an enterprise that began in Afghanistan but soon after focused on Iraq has now shifted back — again — to Afghanistan.  Whether the swings of this pendulum signify progress toward some final objective is anyone’s guess.

To measure progress during wartime, Americans once employed pins and maps.  Plotting the conflict triggered by 9/11 will no doubt improve your knowledge of world geography, but it won’t tell you anything about where this war is headed.

Where, then, have nine years of fighting left us?  Chastened, but not necessarily enlightened.

Just over a decade ago, the now-forgotten Kosovo campaign seemingly offered a template for a new American way of war.  It was a decision gained without suffering a single American fatality.  Kosovo turned out, however, to be a one-off event.  No doubt the United States military was then (and remains today) unbeatable in traditional terms.  Yet, after 9/11, Washington committed that military to an endeavor that it manifestly cannot win.

Rather than probing the implications of this fact — relying on the force of arms to eliminate terrorism is a fool’s errand — two administrations have doggedly prolonged the war even as they quietly ratcheted down expectations of what it might accomplish.

In officially ending the U.S. combat role in Iraq earlier this year — a happy day if there ever was one — President Obama refrained from proclaiming “mission accomplished.”  As well he might: as U.S. troops depart Iraq, insurgents remain active and in the field.  Instead of declaring victory, the president simply urged Americans to turn the page.  With remarkable alacrity, most of us seem to have complied.

Perhaps more surprisingly, today’s military leaders have themselves abandoned the notion that winning battles wins wars, once the very foundation of their profession.  Warriors of an earlier day insisted: “There is no substitute for victory.”  Warriors in the Age of David Petraeus embrace an altogether different motto: “There is no military solution.”

Here is Brigadier General H. R. McMaster, one of the Army’s rising stars, summarizing the latest in advanced military thinking:  “Simply fighting and winning a series of interconnected battles in a well developed campaign does not automatically deliver the achievement of war aims.”  Winning as such is out.  Persevering is in.

So an officer corps once intent above all on avoiding protracted wars now specializes in quagmires.  Campaigns don’t really end.  At best, they peter out.

Formerly trained to kill people and break things, American soldiers now attend to winning hearts and minds, while moonlighting in assassination.  The politically correct term for this is “counterinsurgency.”

Now, assigning combat soldiers the task of nation-building in, say, Mesopotamia is akin to hiring a crew of lumberjacks to build a house in suburbia.  What astonishes is not that the result falls short of perfection, but that any part of the job gets done at all.

Yet by simultaneously adopting the practice of “targeted killing,” the home builders do double-duty as home wreckers.  For American assassins, the weapon of choice is not the sniper rifle or the shiv, but missile-carrying pilotless aircraft controlled from bases in Nevada and elsewhere thousands of miles from the battlefield — the ultimate expression of an American desire to wage war without getting our hands dirty.

In practice, however, killing the guilty from afar not infrequently entails killing innocents as well.  So actions undertaken to deplete the ranks of jihadists as far afield as Pakistan, Yemen, and Somalia unwittingly ensure the recruitment of replacements, guaranteeing a never-ending supply of hardened hearts to soften.

No wonder the campaigns launched since 9/11 drag on and on.  General Petraeus himself has spelled out the implications: “This is the kind of fight we’re in for the rest of our lives and probably our kids’ lives.”  Obama may want to “get out.”  His generals are inclined to stay the course.

Taking longer to achieve less than we initially intended is also costing far more than anyone ever imagined.  Back in 2002, White House economic adviser Lawrence Lindsey suggested that invading Iraq might run up a tab of as much as $200 billion — a seemingly astronomical sum.  Although Lindsey soon found himself out of a job as a result, he turned out to be a piker.  The bill for our post-9/11 wars already exceeds a trillion dollars, all of it piled atop our mushrooming national debt.  Helped in no small measure by Obama’s war policies, the meter is still running.

So are we almost there yet?  Not even.  The truth is we’re lost in the desert, careening down an unmarked road, odometer busted, GPS on the fritz, and fuel gauge hovering just above E.  Washington can only hope that the American people, napping in the backseat, won’t notice.

Andrew J. Bacevich is professor of history and international relations at Boston University.  His bestselling new book is Washington Rules: America’s Path to Permanent War.  To catch Bacevich discussing how the U.S. military became specialists in quagmires in a Timothy MacBain TomCast audio interview click here or, to download it to your iPod, here.

Copyright 2010 Andrew J. Bacevich

The Long War: Year Ten

Once a serious journalist, the Washington Post’s Bob Woodward now makes a very fine living as chief gossip-monger of the governing class.  Early on in his career, along with Carl Bernstein, his partner at the time, Woodward confronted power.  Today, by relentlessly exalting Washington trivia, he flatters power.  His reporting does not inform. It titillates.

A new Woodward book, Obama’s Wars, is a guaranteed blockbuster.  It’s out this week, already causing a stir, and guaranteed to be forgotten the week after dropping off the bestseller lists.  For good reason: when it comes to substance, any book written by Woodward has about as much heft as the latest potboiler penned by the likes of James Patterson or Tom Clancy.

Back in 2002, for example, during the run-up to the invasion of Iraq, Woodward treated us to Bush at War.  Based on interviews with unidentified officials close to President George W. Bush, the book offered a portrait of the president-as-resolute-war-leader that put him in a league with Abraham Lincoln and Franklin Roosevelt.  But the book’s real juice came from what it revealed about events behind the scenes.  “Bush’s war cabinet is riven with feuding,” reported the Times of London, which credited Woodward with revealing “the furious arguments and personal animosity” that divided Bush’s lieutenants.

Of course, the problem with the Bush administration wasn’t that folks on the inside didn’t play nice with one another.  No, the problem was that the president and his inner circle committed a long series of catastrophic errors that produced an unnecessary and grotesquely mismanaged war.  That war has cost the country dearly — although the people who engineered that catastrophe, many of them having pocketed handsome advances on their forthcoming memoirs, continue to manage quite well, thank you.

To judge by the publicity blitzkrieg announcing the arrival of Obama’s Wars in your local bookstore, the big news out of Washington is that, even today, politics there remains an intensely competitive sport, with the participants, whether in anger or frustration, sometimes speaking ill of one another.

Essentially, news reports indicate, Woodward has updated his script from 2002.  The characters have different names, but the plot remains the same.  Talk about jumping the shark.

So we learn that Obama political adviser David Axelrod doesn’t fully trust Secretary of State Hillary Clinton.  National security adviser James Jones, a retired Marine general, doesn’t much care for the likes of Axelrod, and will say so behind his back.  Almost everyone thinks Richard Holbrooke, chief State Department impresario of the AfPak portfolio, is a jerk.  And — stop the presses — when under the influence of alcohol, General David Petraeus, commander of U.S. and allied forces in Afghanistan, is alleged to use the word “f**ked.”  These are the sort of shocking revelations that make you a headliner on the Sunday morning talk shows.

Based on what we have learned so far from those select few provided with advance copies of the book — mostly reporters for the Post and The New York Times who, for whatever reason, seem happy to serve as its shills — Obama’s Wars contains hints of another story, the significance of which seems to have eluded Woodward.

The theme of that story is not whether Dick likes Jane, but whether the Constitution remains an operative document.  The Constitution explicitly assigns to the president the role of commander-in-chief. Responsibility for the direction of American wars rests with him. According to the principle of civilian control, senior military officers advise and execute, but it’s the president who decides.  That’s the theory, at least.  Reality turns out to be considerably different and, to be kind about it, more complicated.

Obama’s Wars reportedly contains this comment by President Obama to Secretary Clinton and Secretary of Defense Robert Gates regarding Afghanistan:  “I’m not doing 10 years… I’m not doing long-term nation-building. I am not spending a trillion dollars.”

Aren’t you, Mr. President?  Don’t be so sure.

Obama’s Wars also affirms what we already suspected about the decision-making process that led up to the president’s announcement at West Point in December 2009 to prolong and escalate the war. Bluntly put, the Pentagon gamed the process to exclude any possibility of Obama rendering a decision not to its liking.

Pick your surge: 20,000 troops? Or 30,000 troops?  Or 40,000 troops?  Only the most powerful man in the world — or Goldilocks contemplating three bowls of porridge — could handle a decision like that.  Even as Obama opted for the middle course, the real decision had already been made elsewhere by others: the war in Afghanistan would expand and continue.

And then there’s this from the estimable General David Petraeus: “I don’t think you win this war,” Woodward quotes the field commander as saying. “I think you keep fighting… This is the kind of fight we’re in for the rest of our lives and probably our kids’ lives.”

Here we confront a series of questions to which Woodward (not to mention the rest of Washington) remains steadfastly oblivious.  Why fight a war that even the general in charge says can’t be won?  What will the perpetuation of this conflict cost?  Who will it benefit?  Does the ostensibly most powerful nation in the world have no choice but to wage permanent war?  Are there no alternatives?  Can Obama shut down an unwinnable war now about to enter its tenth year?  Or is he — along with the rest of us — a prisoner of war?

President Obama has repeatedly stated that in July 2011 a withdrawal of U.S. troops from Afghanistan will commence.  No one quite knows exactly what that means.  Will the withdrawal be symbolic?  General Petraeus has already made it abundantly clear that he will entertain nothing more.  Or will July signal that the Afghan War — and by extension the Global War on Terror launched nine years ago — is finally coming to an end?

Between now and next summer attentive Americans will learn much about how national security policy is actually formulated and who is really in charge.  Just don’t expect Bob Woodward to offer any enlightenment on the subject.

Andrew J. Bacevich is professor of history and international relations at Boston University.  His new book is Washington Rules: America’s Path to Permanent War.

Copyright 2010 Andrew J. Bacevich

Prisoners of War

Worldly ambition inhibits true learning. Ask me. I know. A young man in a hurry is nearly uneducable: He knows what he wants and where he’s headed; when it comes to looking back or entertaining heretical thoughts, he has neither the time nor the inclination. All that counts is that he is going somewhere. Only as ambition wanes does education become a possibility.

My own education did not commence until I had reached middle age. I can fix its start date with precision: for me, education began in Berlin, on a winter’s evening, at the Brandenburg Gate, not long after the Berlin Wall had fallen.

As an officer in the U.S. Army I had spent considerable time in Germany. Until that moment, however, my family and I had never had occasion to visit this most famous of German cities, still littered with artifacts of a deeply repellent history. At the end of a long day of exploration, we found ourselves in what had, until just months before, been the communist East. It was late and we were hungry, but I insisted on walking the length of the Unter den Linden, from the River Spree to the gate itself. A cold rain was falling and the pavement glistened. The buildings lining the avenue, dating from the era of Prussian kings, were dark, dirty, and pitted. Few people were about. It was hardly a night for sightseeing.

For as long as I could remember, the Brandenburg Gate had been the preeminent symbol of the age and Berlin the epicenter of contemporary history. Yet by the time I made it to the once and future German capital, history was already moving on. The Cold War had abruptly ended. A divided city and a divided nation had reunited.

For Americans who had known Berlin only from a distance, the city existed primarily as a metaphor. Pick a date — 1933, 1942, 1945, 1948, 1961, 1989 — and Berlin becomes an instructive symbol of power, depravity, tragedy, defiance, endurance, or vindication. For those inclined to view the past as a chronicle of parables, the modern history of Berlin offered an abundance of material. The greatest of those parables emerged from the events of 1933 to 1945, an epic tale of evil ascendant, belatedly confronted, then heroically overthrown. A second narrative, woven from events during the intense period immediately following World War II, saw hopes for peace dashed, yielding bitter antagonism but also great resolve. The ensuing stand-off — the “long twilight struggle,” in John Kennedy’s memorable phrase — formed the centerpiece of the third parable, its central theme stubborn courage in the face of looming peril. Finally came the exhilarating events of 1989, with freedom ultimately prevailing, not only in Berlin, but throughout Eastern Europe.

What exactly was I looking for at the Brandenburg Gate? Perhaps confirmation that those parables, which I had absorbed and accepted as true, were just that. Whatever I expected, what I actually found was a cluster of shabby-looking young men, not German, hawking badges, medallions, hats, bits of uniforms, and other artifacts of the mighty Red Army. It was all junk, cheaply made and shoddy. For a handful of deutsche marks, I bought a wristwatch emblazoned with the symbol of the Soviet armored corps. Within days, it ceased to work.

Huddling among the scarred columns, those peddlers — almost certainly off-duty Russian soldiers awaiting redeployment home — constituted a subversive presence. They were loose ends of a story that was supposed to have ended neatly when the Berlin Wall came down. As we hurried off to find warmth and a meal, this disconcerting encounter stuck with me, and I began to entertain this possibility: that the truths I had accumulated over the previous twenty years as a professional soldier — especially truths about the Cold War and U.S. foreign policy — might not be entirely true.

By temperament and upbringing, I had always taken comfort in orthodoxy. In a life spent subject to authority, deference had become a deeply ingrained habit. I found assurance in conventional wisdom. Now, I started, however hesitantly, to suspect that orthodoxy might be a sham. I began to appreciate that authentic truth is never simple and that any version of truth handed down from on high — whether by presidents, prime ministers, or archbishops — is inherently suspect. The powerful, I came to see, reveal truth only to the extent that it suits them. Even then, the truths to which they testify come wrapped in a nearly invisible filament of dissembling, deception, and duplicity. The exercise of power necessarily involves manipulation and is antithetical to candor.

I came to these obvious points embarrassingly late in life. “Nothing in education is so astonishing,” the historian Henry Adams once wrote, “as the amount of ignorance it accumulates in the form of inert facts.” Until that moment I had too often confused education with accumulating and cataloging facts. In Berlin, at the foot of the Brandenburg Gate, I began to realize that I had been a naïf. And so, at age 41, I set out, in a halting and haphazard fashion, to acquire a genuine education.

Twenty years later I’ve made only modest progress. What follows is an accounting of what I have learned thus far.

Visiting a Third-World Version of Germany

In October 1990, I’d gotten a preliminary hint that something might be amiss in my prior education. On October 3rd, communist East Germany — formally the German Democratic Republic (GDR) — ceased to exist and German reunification was officially secured. That very week I accompanied a group of American military officers to the city of Jena in what had been the GDR. Our purpose was self-consciously educational — to study the famous battle of Jena-Auerstädt in which Napoleon Bonaparte and his marshals had inflicted an epic defeat on Prussian forces commanded by the Duke of Brunswick. (The outcome of that 1806 battle inspired the philosopher Hegel, then residing in Jena, to declare that the “end of history” was at hand. The conclusion of the Cold War had only recently elicited a similarly exuberant judgment from the American scholar Francis Fukuyama.)

On this trip we did learn a lot about the conduct of that battle, although mainly inert facts possessing little real educational value. Inadvertently, we also gained insight into the reality of life on the far side of what Americans had habitually called the Iron Curtain, known in U.S. military vernacular as “the trace.” In this regard, the trip proved nothing less than revelatory. The educational content of this excursion would — for me — be difficult to exaggerate.

As soon as our bus crossed the old Inner German Border, we entered a time warp. For U.S. troops garrisoned throughout Bavaria and Hesse, West Germany had for decades served as a sort of theme park — a giant Epcot filled with quaint villages, stunning scenery, and superb highways, along with ample supplies of quite decent food, excellent beer, and accommodating women. Now, we found ourselves face-to-face with an altogether different Germany. Although commonly depicted as the most advanced and successful component of the Soviet Empire, East Germany more closely resembled part of the undeveloped world.

The roads — even the main highways — were narrow and visibly crumbling. Traffic posed little problem. Apart from a few sluggish Trabants and Wartburgs — East German automobiles that tended to a retro primitivism — and an occasional exhaust-spewing truck, the way was clear. The villages through which we passed were forlorn and the small farms down at the heels. For lunch we stopped at a roadside stand. The proprietor happily accepted our D-marks, offering us inedible sausages in exchange. Although the signs assured us that we remained in a land of German speakers, it was a country that had not yet recovered from World War II.

Upon arrival in Jena, we checked into the Hotel Schwarzer Bär, identified by our advance party as the best hostelry in town. It turned out to be a rundown fleabag. As the senior officer present, I was privileged to have a room in which the plumbing functioned. Others were not so lucky.

Jena itself was a midsized university city, with its main academic complex immediately opposite our hotel. A very large bust of Karl Marx, mounted on a granite pedestal and badly in need of cleaning, stood on the edge of the campus. Briquettes of soft coal used for home heating made the air all but unbreathable and coated everything with soot. In the German cities we knew, pastels predominated — houses and apartment blocks painted pale green, muted salmon, and soft yellow. Here everything was brown and gray.

That evening we set out in search of dinner. The restaurants within walking distance were few and unattractive. We chose badly, a drab establishment in which fresh vegetables were unavailable and the wurst inferior. The adequacy of the local beer provided the sole consolation.

The following morning, on the way to the battlefield, we noted a significant Soviet military presence, mostly in the form of trucks passing by — to judge by their appearance, designs that dated from the 1950s. To our surprise, we discovered that the Soviets had established a small training area adjacent to where Napoleon had vanquished the Prussians. Although we had orders to avoid contact with any Russians, the presence of their armored troops going through their paces riveted us. Here was something of far greater immediacy than Bonaparte and the Duke of Brunswick: “the other,” about which we had for so long heard so much but knew so little. Through binoculars, we watched a column of Russian armored vehicles — BMPs, in NATO parlance — traversing what appeared to be a drivers’ training course. Suddenly, one of them began spewing smoke. Soon thereafter, it burst into flames.

Here was education, although at the time I had only the vaguest sense of its significance.

An Ambitious Team Player Assailed by Doubts

These visits to Jena and Berlin offered glimpses of a reality radically at odds with my most fundamental assumptions. Uninvited and unexpected, subversive forces had begun to infiltrate my consciousness. Bit by bit, my worldview started to crumble.

That worldview had derived from this conviction: that American power manifested a commitment to global leadership, and that both together expressed and affirmed the nation’s enduring devotion to its founding ideals. That American power, policies, and purpose were bound together in a neat, internally consistent package, each element drawing strength from and reinforcing the others, was something I took as a given. That, during my adult life, a penchant for interventionism had become a signature of U.S. policy did not — to me, at least — in any way contradict America’s aspirations for peace. Instead, a willingness to expend lives and treasure in distant places testified to the seriousness of those aspirations. That, during this same period, the United States had amassed an arsenal of over 31,000 nuclear weapons, some small number of them assigned to units in which I had served, was not at odds with our belief in the inalienable right to life and liberty; rather, threats to life and liberty had compelled the United States to acquire such an arsenal and maintain it in readiness for instant use.

I was not so naïve as to believe that the American record had been without flaws. Yet I assured myself that any errors or misjudgments had been committed in good faith. Furthermore, circumstances permitted little real choice. In Southeast Asia as in Western Europe, in the Persian Gulf as in the Western Hemisphere, the United States had simply done what needed doing. Viable alternatives did not exist. To consent to any dilution of American power would be to forfeit global leadership, thereby putting at risk safety, prosperity, and freedom, not only our own but also that of our friends and allies.

The choices seemed clear enough. On one side was the status quo: the commitments, customs, and habits that defined American globalism, implemented by the national security apparatus within which I functioned as a small cog. On the other side was the prospect of appeasement, isolationism, and catastrophe. The only responsible course was the one to which every president since Harry Truman had adhered.

For me, the Cold War had played a crucial role in sustaining that worldview. Given my age, upbringing, and professional background, it could hardly have been otherwise. Although the great rivalry between the United States and the Soviet Union had contained moments of considerable anxiety — I remember my father, during the Cuban Missile Crisis, stocking our basement with water and canned goods — it served primarily to clarify, not to frighten. The Cold War provided a framework that organized and made sense of contemporary history. It offered a lineup and a scorecard. That there existed bad Germans and good Germans, their Germans and our Germans, totalitarian Germans and Germans who, like Americans, passionately loved freedom was, for example, a proposition I accepted as dogma. Seeing the Cold War as a struggle between good and evil answered many questions, consigned others to the periphery, and rendered still others irrelevant.

Back in the 1960s, during the Vietnam War, more than a few members of my generation had rejected the conception of the Cold War as a Manichean struggle. Here too, I was admittedly a slow learner. Yet having kept the faith long after others had lost theirs, the doubts that eventually assailed me were all the more disorienting.

Granted, occasional suspicions had appeared long before Jena and Berlin. My own Vietnam experience had generated its share, which I had done my best to suppress. I was, after all, a serving soldier. Except in the narrowest of terms, the military profession, in those days at least, did not look kindly on nonconformity. Climbing the ladder of career success required curbing maverick tendencies. To get ahead, you needed to be a team player. Later, when studying the history of U.S. foreign relations in graduate school, I was pelted with challenges to orthodoxy, which I vigorously deflected. When it came to education, graduate school proved a complete waste of time — a period of intense study devoted to the further accumulation of facts, while I exerted myself to ensuring that they remained inert.

Now, however, my personal circumstances were changing. Shortly after the passing of the Cold War, my military career ended. Education thereby became not only a possibility, but also a necessity.

In measured doses, mortification cleanses the soul. It’s the perfect antidote for excessive self-regard. After 23 years spent inside the U.S. Army seemingly going somewhere, I now found myself on the outside going nowhere in particular. In the self-contained and cloistered universe of regimental life, I had briefly risen to the status of minor spear carrier. The instant I took off my uniform, that status vanished. I soon came to a proper appreciation of my own insignificance, a salutary lesson that I ought to have absorbed many years earlier.

As I set out on what eventually became a crablike journey toward a new calling as a teacher and writer — a pilgrimage of sorts — ambition in the commonly accepted meaning of the term ebbed. This did not happen all at once. Yet gradually, trying to grab one of life’s shiny brass rings ceased being a major preoccupation. Wealth, power, and celebrity became not aspirations but subjects for critical analysis. History — especially the familiar narrative of the Cold War — no longer offered answers; instead, it posed perplexing riddles. Easily the most nagging was this one: How could I have so profoundly misjudged the reality of what lay on the far side of the Iron Curtain?

Had I been insufficiently attentive? Or was it possible that I had been snookered all along? Contemplating such questions, while simultaneously witnessing the unfolding of the “long 1990s” — the period bookended by two wars with Iraq when American vainglory reached impressive new heights — prompted the realization that I had grossly misinterpreted the threat posed by America’s adversaries. Yet that was the lesser half of the problem. Far worse than misperceiving “them” was the fact that I had misperceived “us.” What I thought I knew best I actually understood least. Here, the need for education appeared especially acute.

George W. Bush’s decision to launch Operation Iraqi Freedom in 2003 pushed me fully into opposition. Claims that once seemed elementary — above all, claims relating to the essentially benign purposes of American power — now appeared preposterous. The contradictions that found an ostensibly peace-loving nation committing itself to a doctrine of preventive war became too great to ignore. The folly and hubris of the policy makers who heedlessly thrust the nation into an ill-defined and open-ended “global war on terror” without the foggiest notion of what victory would look like, how it would be won, and what it might cost approached standards hitherto achieved only by slightly mad German warlords. During the era of containment, the United States had at least maintained the pretense of a principled strategy; now, the last vestiges of principle gave way to fantasy and opportunism. With that, the worldview to which I had adhered as a young adult and carried into middle age dissolved completely.

Credo and Trinity

What should stand in the place of such discarded convictions? Simply inverting the conventional wisdom, substituting a new Manichean paradigm for the old discredited version — the United States taking the place of the Soviet Union as the source of the world’s evil — would not suffice. Yet arriving at even an approximation of truth would entail subjecting conventional wisdom, both present and past, to sustained and searching scrutiny. Cautiously at first but with growing confidence, this I vowed to do.

Doing so meant shedding habits of conformity acquired over decades. All of my adult life I had been a company man, only dimly aware of the extent to which institutional loyalties induce myopia. Asserting independence required first recognizing the extent to which I had been socialized to accept certain things as unimpeachable. Here then were the preliminary steps essential to making education accessible. Over a period of years, a considerable store of debris had piled up. Now, it all had to go. Belatedly, I learned that more often than not what passes for conventional wisdom is simply wrong. Adopting fashionable attitudes to demonstrate one’s trustworthiness — the world of politics is flush with such people hoping thereby to qualify for inclusion in some inner circle — is akin to engaging in prostitution in exchange for promissory notes. It’s not only demeaning but downright foolhardy.

Washington Rules aims to take stock of conventional wisdom in its most influential and enduring form, namely the package of assumptions, habits, and precepts that have defined the tradition of statecraft to which the United States has adhered since the end of World War II — the era of global dominance now drawing to a close. This postwar tradition combines two components, each one so deeply embedded in the American collective consciousness as to have all but disappeared from view.

The first component specifies norms according to which the international order ought to work and charges the United States with responsibility for enforcing those norms. Call this the American credo. In the simplest terms, the credo summons the United States — and the United States alone — to lead, save, liberate, and ultimately transform the world. In a celebrated manifesto issued at the dawn of what he termed “The American Century,” Henry R. Luce made the case for this spacious conception of global leadership. Writing in Life magazine in early 1941, the influential publisher exhorted his fellow citizens to “accept wholeheartedly our duty to exert upon the world the full impact of our influence for such purposes as we see fit and by such means as we see fit.” Luce thereby captured what remains even today the credo’s essence.

Luce’s concept of an American Century, an age of unquestioned American global primacy, resonated, especially in Washington. His evocative phrase found a permanent place in the lexicon of national politics. (Recall that the neoconservatives who, in the 1990s, lobbied for more militant U.S. policies named their enterprise the Project for a New American Century.) So, too, did Luce’s expansive claim of prerogatives to be exercised by the United States. Even today, whenever public figures allude to America’s responsibility to lead, they signal their fidelity to this creed. Along with respectful allusions to God and “the troops,” adherence to Luce’s credo has become a de facto prerequisite for high office. Question its claims and your prospects of being heard in the hubbub of national politics become nil.

Note, however, that the duty Luce ascribed to Americans has two components. It is not only up to Americans, he wrote, to choose the purposes for which they would bring their influence to bear, but to choose the means as well. Here we confront the second component of the postwar tradition of American statecraft.

With regard to means, that tradition has emphasized activism over example, hard power over soft, and coercion (often styled “negotiating from a position of strength”) over suasion. Above all, the exercise of global leadership as prescribed by the credo obliges the United States to maintain military capabilities staggeringly in excess of those required for self-defense. Prior to World War II, Americans by and large viewed military power and institutions with skepticism, if not outright hostility. In the wake of World War II, that changed. An affinity for military might emerged as central to the American identity.

By the midpoint of the twentieth century, “the Pentagon” had ceased to be merely a gigantic five-sided building. Like “Wall Street” at the end of the nineteenth century, it had become Leviathan, its actions veiled in secrecy, its reach extending around the world. Yet while the concentration of power in Wall Street had once evoked deep fear and suspicion, Americans by and large saw the concentration of power in the Pentagon as benign. Most found it reassuring.

A people who had long seen standing armies as a threat to liberty now came to believe that the preservation of liberty required them to lavish resources on the armed forces. During the Cold War, Americans worried ceaselessly about falling behind the Russians, even though the Pentagon consistently maintained a position of overall primacy. Once the Soviet threat disappeared, mere primacy no longer sufficed. With barely a whisper of national debate, unambiguous and perpetual global military supremacy emerged as an essential predicate to global leadership.

Every great military power has its distinctive signature. For Napoleonic France, it was the levée en masse — the people in arms animated by the ideals of the Revolution. For Great Britain in the heyday of empire, it was command of the seas, sustained by a dominant fleet and a network of far-flung outposts from Gibraltar and the Cape of Good Hope to Singapore and Hong Kong. Germany from the 1860s to the 1940s (and Israel from 1948 to 1973) took another approach, relying on a potent blend of tactical flexibility and operational audacity to achieve battlefield superiority.

The abiding signature of American military power since World War II has been of a different order altogether. The United States has not specialized in any particular type of war. It has not adhered to a fixed tactical style. No single service or weapon has enjoyed consistent favor. At times, the armed forces have relied on citizen-soldiers to fill their ranks; at other times, long-service professionals. Yet an examination of the past 60 years of U.S. military policy and practice does reveal important elements of continuity. Call them the sacred trinity: an abiding conviction that the minimum essentials of international peace and order require the United States to maintain a global military presence, to configure its forces for global power projection, and to counter existing or anticipated threats by relying on a policy of global interventionism.

Together, credo and trinity — the one defining purpose, the other practice — constitute the essence of the way that Washington has attempted to govern and police the American Century. The relationship between the two is symbiotic. The trinity lends plausibility to the credo’s vast claims. For its part, the credo justifies the trinity’s vast requirements and exertions. Together they provide the basis for an enduring consensus that imparts a consistency to U.S. policy regardless of which political party may hold the upper hand or who may be occupying the White House. From the era of Harry Truman to the age of Barack Obama, that consensus has remained intact. It defines the rules to which Washington adheres; it determines the precepts by which Washington rules.

As used here, Washington is less a geographic expression than a set of interlocking institutions headed by people who, whether acting officially or unofficially, are able to put a thumb on the helm of state. Washington, in this sense, includes the upper echelons of the executive, legislative, and judicial branches of the federal government. It encompasses the principal components of the national security state — the departments of Defense, State, and, more recently, Homeland Security, along with various agencies comprising the intelligence and federal law enforcement communities. Its ranks extend to select think tanks and interest groups. Lawyers, lobbyists, fixers, former officials, and retired military officers who still enjoy access are members in good standing. Yet Washington also reaches beyond the Beltway to include big banks and other financial institutions, defense contractors and major corporations, television networks and elite publications like the New York Times, even quasi-academic entities like the Council on Foreign Relations and Harvard’s Kennedy School of Government. With rare exceptions, acceptance of the Washington rules forms a prerequisite for entry into this world.

My purpose in writing Washington Rules is fivefold: first, to trace the origins and evolution of the Washington rules — both the credo that inspires consensus and the trinity in which it finds expression; second, to subject the resulting consensus to critical inspection, showing who wins and who loses and also who foots the bill; third, to explain how the Washington rules are perpetuated, with certain views privileged while others are declared disreputable; fourth, to demonstrate that the rules themselves have lost whatever utility they may once have possessed, with their implications increasingly pernicious and their costs increasingly unaffordable; and finally, to argue for readmitting disreputable (or “radical”) views to our national security debate, in effect legitimating alternatives to the status quo. My aim, in short, is to invite readers to share in the process of education on which I embarked two decades ago in Berlin.

The Washington rules were forged at a moment when American influence and power were approaching their acme. That moment has now passed. The United States has drawn down the stores of authority and goodwill it had acquired by 1945. Words uttered in Washington command less respect than once was the case. Americans can ill afford to indulge any longer in dreams of saving the world, much less remaking it in our own image. The curtain is now falling on the American Century.

Similarly, the United States no longer possesses sufficient wherewithal to sustain a national security strategy that relies on global military presence and global power projection to underwrite a policy of global interventionism. Touted as essential to peace, adherence to that strategy has propelled the United States into a condition approximating perpetual war, as the military misadventures of the past decade have demonstrated.

To anyone with eyes to see, the shortcomings inherent in the Washington rules have become plainly evident. Although those most deeply invested in perpetuating its conventions will insist otherwise, the tradition to which Washington remains devoted has begun to unravel. Attempting to prolong its existence might serve Washington’s interests, but it will not serve the interests of the American people.

Devising an alternative to the reigning national security paradigm will pose a daunting challenge — especially if Americans look to “Washington” for fresh thinking. Yet doing so has become essential.

In one sense, the national security policies to which Washington so insistently adheres express what has long been the preferred American approach to engaging the world beyond our borders. That approach plays to America’s presumed strong suit — since World War II, and especially since the end of the Cold War, thought to be military power. In another sense, this reliance on military might creates excuses for the United States to avoid serious engagement: confidence in American arms has made it unnecessary to attend to what others might think or to consider how their aspirations might differ from our own. In this way, the Washington rules reinforce American provincialism — a national trait for which the United States continues to pay dearly.

The persistence of these rules has also provided an excuse to avoid serious self-engagement. From this perspective, confidence that the credo and the trinity will oblige others to accommodate themselves to America’s needs or desires — whether for cheap oil, cheap credit, or cheap consumer goods — has allowed Washington to postpone or ignore problems demanding attention here at home. Fixing Iraq or Afghanistan ends up taking precedence over fixing Cleveland and Detroit. Purporting to support the troops in their crusade to free the world obviates any obligation to assess the implications of how Americans themselves choose to exercise freedom.

When Americans demonstrate a willingness to engage seriously with others, combined with the courage to engage seriously with themselves, then real education just might begin.

Andrew J. Bacevich is a professor of history and international relations at Boston University.  His new book, Washington Rules: America’s Path to Permanent War (Metropolitan Books), has just been published. This essay is its introduction.

Excerpted from Washington Rules: America’s Path to Permanent War, published this month by Metropolitan Books, an imprint of Henry Holt and Company, LLC. Copyright (c) 2010 by Andrew Bacevich. All rights reserved.

Andrew Bacevich, How Washington Rules

“In watching the flow of events over the past decade or so, it is hard to avoid the feeling that something very fundamental has happened in world history.”  This sentiment, introducing the essay that made Francis Fukuyama a household name, commands renewed attention today, albeit from a different perspective.

Developments during the 1980s, above all the winding down of the Cold War, had convinced Fukuyama that the “end of history” was at hand.  “The triumph of the West, of the Western idea,” he wrote in 1989, “is evident… in the total exhaustion of viable systematic alternatives to Western liberalism.”

Today the West no longer looks quite so triumphant.  Yet events during the first decade of the present century have delivered history to another endpoint of sorts.  Although Western liberalism may retain considerable appeal, the Western way of war has run its course.

For Fukuyama, history implied ideological competition, a contest pitting democratic capitalism against fascism and communism.  When he wrote his famous essay, that contest was reaching an apparently definitive conclusion.

Yet from start to finish, military might had determined that competition’s course as much as ideology.  Throughout much of the twentieth century, great powers had vied with one another to create new, or more effective, instruments of coercion.  Military innovation assumed many forms.  Most obviously, there were the weapons: dreadnoughts and aircraft carriers, rockets and missiles, poison gas, and atomic bombs — the list is a long one.  In their effort to gain an edge, however, nations devoted equal attention to other factors: doctrine and organization, training systems and mobilization schemes, intelligence collection and war plans.

All of this furious activity, whether undertaken by France or Great Britain, Russia or Germany, Japan or the United States, derived from a common belief in the plausibility of victory.  Expressed in simplest terms, the Western military tradition could be reduced to this proposition: war remains a viable instrument of statecraft, the accoutrements of modernity serving, if anything, to enhance its utility.

Grand Illusions

That was theory.  Reality, above all the two world wars of the last century, told a decidedly different story.  Armed conflict in the industrial age reached new heights of lethality and destructiveness.  Once begun, wars devoured everything, inflicting staggering material, psychological, and moral damage.  Pain vastly exceeded gain.  In that regard, the war of 1914-1918 became emblematic: even the winners ended up losers.  When fighting eventually stopped, the victors were left not to celebrate but to mourn.  As a consequence, well before Fukuyama penned his essay, faith in war’s problem-solving capacity had begun to erode.  As early as 1945, among several great powers — thanks to war, now great in name only — that faith disappeared altogether.

Among nations classified as liberal democracies, only two resisted this trend.  One was the United States, the sole major belligerent to emerge from the Second World War stronger, richer, and more confident.  The second was Israel, created as a direct consequence of the horrors unleashed by that cataclysm.  By the 1950s, both countries subscribed to this common conviction: national security (and, arguably, national survival) demanded unambiguous military superiority.  In the lexicon of American and Israeli politics, “peace” was a codeword.  The essential prerequisite for peace was for any and all adversaries, real or potential, to accept a condition of permanent inferiority.  In this regard, the two nations — not yet intimate allies — stood apart from the rest of the Western world.

So even as they professed their devotion to peace, civilian and military elites in the United States and Israel prepared obsessively for war.  They saw no contradiction between rhetoric and reality.

Yet belief in the efficacy of military power almost inevitably breeds the temptation to put that power to work.  “Peace through strength” easily enough becomes “peace through war.”  Israel succumbed to this temptation in 1967.  For Israelis, the Six Day War proved a turning point.  Plucky David defeated, and then became, Goliath.  Even as the United States was flailing about in Vietnam, Israel had evidently succeeded in definitively mastering war.

A quarter-century later, U.S. forces seemingly caught up.  In 1991, Operation Desert Storm, George H.W. Bush’s war against Iraqi dictator Saddam Hussein, showed that American troops, like Israeli soldiers, knew how to win quickly, cheaply, and humanely.  Generals like H. Norman Schwarzkopf persuaded themselves that their brief desert campaign against Iraq had replicated — even eclipsed — the battlefield exploits of such famous Israeli warriors as Moshe Dayan and Yitzhak Rabin.  Vietnam faded into irrelevance.

For both Israel and the United States, however, appearances proved deceptive.  Apart from fostering grand illusions, the splendid wars of 1967 and 1991 decided little.  In both cases, victory turned out to be more apparent than real.  Worse, triumphalism fostered massive future miscalculation.

On the Golan Heights, in Gaza, and throughout the West Bank, proponents of a Greater Israel — disregarding Washington’s objections — set out to assert permanent control over territory that Israel had seized.  Yet “facts on the ground” created by successive waves of Jewish settlers did little to enhance Israeli security.  They succeeded chiefly in shackling Israel to a rapidly growing and resentful Palestinian population that it could neither pacify nor assimilate.

In the Persian Gulf, the benefits reaped by the United States after 1991 likewise turned out to be ephemeral.  Saddam Hussein survived and became in the eyes of successive American administrations an imminent threat to regional stability.  This perception prompted (or provided a pretext for) a radical reorientation of strategy in Washington.  No longer content to prevent an unfriendly outside power from controlling the oil-rich Persian Gulf, Washington now sought to dominate the entire Greater Middle East.  Hegemony became the aim.  Yet the United States proved no more successful than Israel in imposing its writ.

During the 1990s, the Pentagon embarked willy-nilly upon what became its own variant of a settlement policy.  Yet U.S. bases dotting the Islamic world and U.S. forces operating in the region proved hardly more welcome than the Israeli settlements dotting the occupied territories and the soldiers of the Israel Defense Forces (IDF) assigned to protect them.  In both cases, presence provoked (or provided a pretext for) resistance.  Just as Palestinians vented their anger at the Zionists in their midst, radical Islamists targeted Americans whom they regarded as neo-colonial infidels.

Stuck

No one doubted that Israelis (regionally) and Americans (globally) enjoyed unquestioned military dominance.  Throughout Israel’s near abroad, its tanks, fighter-bombers, and warships operated at will.  So, too, did American tanks, fighter-bombers, and warships wherever they were sent.

So what?  Events made it increasingly evident that military dominance did not translate into concrete political advantage.  Rather than enhancing the prospects for peace, coercion produced ever more complications.  No matter how badly battered and beaten, the “terrorists” (a catch-all term applied to anyone resisting Israeli or American authority) weren’t intimidated, remained unrepentant, and kept coming back for more.

Israel ran smack into this problem during Operation Peace for Galilee, its 1982 intervention in Lebanon.  U.S. forces encountered it a decade later during Operation Restore Hope, the West’s gloriously titled foray into Somalia.  Lebanon possessed a puny army; Somalia had none at all.  Rather than producing peace or restoring hope, however, both operations ended in frustration, embarrassment, and failure.

And those operations proved but harbingers of worse to come.  By the 1980s, the IDF’s glory days were past.  Rather than lightning strikes deep into the enemy rear, the narrative of Israeli military history became a cheerless recital of dirty wars — unconventional conflicts against irregular forces yielding problematic results.  The First Intifada (1987-1993), the Second Intifada (2000-2005), a second Lebanon War (2006), and Operation Cast Lead, the notorious 2008-2009 incursion into Gaza, all conformed to this pattern.

Meanwhile, the differential between Palestinian and Jewish Israeli birth rates emerged as a looming threat — a “demographic bomb,” Benjamin Netanyahu called it.  Here were new facts on the ground that military forces, unless employed pursuant to a policy of ethnic cleansing, could do little to redress.  Even as the IDF tried repeatedly and futilely to bludgeon Hamas and Hezbollah into submission, demographic trends continued to suggest that within a generation a majority of the population within Israel and the occupied territories would be Arab.

Trailing a decade or so behind Israel, the United States military nonetheless succeeded in duplicating the IDF’s experience.  Moments of glory remained, but they would prove fleeting indeed.  After 9/11, Washington’s efforts to transform (or “liberate”) the Greater Middle East kicked into high gear.  In Afghanistan and Iraq, George W. Bush’s Global War on Terror began impressively enough, as U.S. forces operated with a speed and élan that had once been an Israeli trademark.  Thanks to “shock and awe,” Kabul fell, followed less than a year and a half later by Baghdad.  As one senior Army general explained to Congress in 2004, the Pentagon had war all figured out:

“We are now able to create decision superiority that is enabled by networked systems, new sensors and command and control capabilities that are producing unprecedented near real time situational awareness, increased information availability, and an ability to deliver precision munitions throughout the breadth and depth of the battlespace… Combined, these capabilities of the future networked force will leverage information dominance, speed and precision, and result in decision superiority.”

The key phrase in this mass of techno-blather was the one that occurred twice: “decision superiority.”  At that moment, the officer corps, like the Bush administration, was still convinced that it knew how to win.

Such claims of success, however, proved obscenely premature.  Campaigns advertised as being wrapped up in weeks dragged on for years, while American troops struggled with their own intifadas.  When it came to achieving decisions that actually stuck, the Pentagon (like the IDF) remained clueless.

Winless

If any overarching conclusion emerges from the Afghan and Iraq Wars (and from their Israeli equivalents), it’s this: victory is a chimera.  Counting on today’s enemy to yield in the face of superior force makes about as much sense as buying lottery tickets to pay the mortgage: you better be really lucky.

Meanwhile, as the U.S. economy went into a tailspin, Americans contemplated their equivalent of Israel’s “demographic bomb” — a “fiscal bomb.”  Ingrained habits of profligacy, both individual and collective, held out the prospect of long-term stagnation: no growth, no jobs, no fun.  Out-of-control spending on endless wars exacerbated that threat.

By 2007, the American officer corps itself gave up on victory, although without giving up on war.  First in Iraq, then in Afghanistan, priorities shifted.  High-ranking generals shelved their expectations of winning — at least as a Rabin or Schwarzkopf would have understood that term.  They sought instead to not lose.  In Washington as in U.S. military command posts, the avoidance of outright defeat emerged as the new gold standard of success.

As a consequence, U.S. troops today sally forth from their base camps not to defeat the enemy, but to “protect the people,” consistent with the latest doctrinal fashion.  Meanwhile, tea-sipping U.S. commanders cut deals with warlords and tribal chieftains in hopes of persuading guerrillas to lay down their arms.

A new conventional wisdom has taken hold, endorsed by everyone from new Afghan War commander General David Petraeus, the most celebrated soldier of this American age, to Barack Obama, commander-in-chief and Nobel Peace Prize laureate.  For the conflicts in which the United States finds itself enmeshed, “military solutions” do not exist.  As Petraeus himself has emphasized, “we can’t kill our way out of” the fix we’re in.  In this way, he also pronounced a eulogy on the Western conception of warfare of the last two centuries.

The Unasked Question

What then are the implications of arriving at the end of Western military history?

In his famous essay, Fukuyama cautioned against thinking that the end of ideological history heralded the arrival of global peace and harmony.  Peoples and nations, he predicted, would still find plenty to squabble about.

With the end of military history, a similar expectation applies.  Politically motivated violence will persist and may in specific instances even retain marginal utility.  Yet the prospect of Big Wars solving Big Problems is probably gone for good.  Certainly, no one in their right mind, Israeli or American, can believe that a continued resort to force will remedy whatever it is that fuels anti-Israeli or anti-American antagonism throughout much of the Islamic world.  To expect persistence to produce something different or better is moonshine.

It remains to be seen whether Israel and the United States can come to terms with the end of military history.  Other nations have long since done so, accommodating themselves to the changing rhythms of international politics.  That they do so is evidence not of virtue, but of shrewdness.  China, for example, shows little eagerness to disarm.  Yet as Beijing expands its reach and influence, it emphasizes trade, investment, and development assistance.  Meanwhile, the People’s Liberation Army stays home.  China has stolen a page from an old American playbook, having become today the preeminent practitioner of “dollar diplomacy.”

The collapse of the Western military tradition confronts Israel with limited choices, none of them attractive.  Given the history of Judaism and the history of Israel itself, a reluctance of Israeli Jews to entrust their safety and security to the good will of their neighbors or the warm regards of the international community is understandable.  In a mere six decades, the Zionist project has produced a vibrant, flourishing state.  Why put all that at risk?  Although the demographic bomb may be ticking, no one really knows how much time remains on the clock.  If Israelis are inclined to continue putting their trust in (American-supplied) Israeli arms while hoping for the best, who can blame them?

In theory, the United States, sharing none of Israel’s demographic or geographic constraints and far more richly endowed, should enjoy far greater freedom of action.  Unfortunately, Washington has a vested interest in preserving the status quo, no matter how much it costs or where it leads.  For the military-industrial complex, there are contracts to win and buckets of money to be made.  For those who dwell in the bowels of the national security state, there are prerogatives to protect.  For elected officials, there are campaign contributors to satisfy.  For appointed officials, civilian and military, there are ambitions to be pursued.

And always there is a chattering claque of militarists, calling for jihad and insisting on ever greater exertions, while remaining alert to any hint of backsliding.  In Washington, members of this militarist camp, by no means coincidentally including many of the voices that most insistently defend Israeli bellicosity, tacitly collaborate in excluding or marginalizing views that they deem heretical.  As a consequence, what passes for debate on matters relating to national security is a sham.  Thus are we invited to believe, for example, that General Petraeus’s appointment as the umpteenth U.S. commander in Afghanistan constitutes a milestone on the way to ultimate success.

Nearly 20 years ago, a querulous Madeleine Albright demanded to know: “What’s the point of having this superb military you’re always talking about if we can’t use it?”  Today, an altogether different question deserves our attention: What’s the point of constantly using our superb military if doing so doesn’t actually work?

Washington’s refusal to pose that question provides a measure of the corruption and dishonesty permeating our politics.

Andrew J. Bacevich is a professor of history and international relations at Boston University.  His new book, Washington Rules: America’s Path to Permanent War, has just been published.

Copyright 2010 Andrew Bacevich

This article was originally posted at TomDispatch.com.

The End of (Military) History?

In a recent column, the Washington Post’s Richard Cohen wrote, “What Henry Luce called ‘the American Century’ is over.” Cohen is right. All that remains is to drive a stake through the heart of Luce’s pernicious creation, lest it come back to life. This promises to take some doing.

When the Time-Life publisher coined his famous phrase, his intent was to prod his fellow citizens into action. Appearing in the February 7, 1941 issue of Life, his essay, “The American Century,” hit the newsstands at a moment when the world was in the throes of a vast crisis. A war in Europe had gone disastrously awry. A second almost equally dangerous conflict was unfolding in the Far East. Aggressors were on the march.

With the fate of democracy hanging in the balance, Americans diddled. Luce urged them to get off the dime. More than that, he summoned them to “accept wholeheartedly our duty and our opportunity as the most powerful and vital nation in the world… to exert upon the world the full impact of our influence, for such purposes as we see fit and by such means as we see fit.”

Read today, Luce’s essay, with its strange mix of chauvinism, religiosity, and bombast (“We must now undertake to be the Good Samaritan to the entire world…”), does not stand up well. Yet the phrase “American Century” stuck and has enjoyed a remarkable run. It stands in relation to the contemporary era much as “Victorian Age” does to the nineteenth century. In one pithy phrase, it captures (or at least seems to capture) the essence of some defining truth: America as alpha and omega, source of salvation and sustenance, vanguard of history, guiding spirit and inspiration for all humankind.

In its classic formulation, the central theme of the American Century has been one of righteousness overcoming evil. The United States (above all the U.S. military) made that triumph possible. When, having been given a final nudge on December 7, 1941, Americans finally accepted their duty to lead, they saved the world from successive diabolical totalitarianisms. In doing so, the U.S. not only preserved the possibility of human freedom but modeled what freedom ought to look like.

Thank You, Comrades

So goes the preferred narrative of the American Century, as recounted by its celebrants.

The problems with this account are two-fold. First, it claims for the United States excessive credit. Second, it excludes, ignores, or trivializes matters at odds with the triumphal story-line.

The net effect is to perpetuate an array of illusions that, whatever their value in prior decades, have long since outlived their usefulness. In short, the persistence of this self-congratulatory account deprives Americans of self-awareness, hindering our efforts to navigate the treacherous waters in which the country finds itself at present. Bluntly, we are perpetuating a mythic version of the past that never even approximated reality and today has become downright malignant. Although Richard Cohen may be right in declaring the American Century over, the American people — and especially the American political class — still remain in its thrall.

Constructing a past usable to the present requires a willingness to include much that the American Century leaves out.

For example, to the extent that the demolition of totalitarianism deserves to be seen as a prominent theme of contemporary history (and it does), the primary credit for that achievement surely belongs to the Soviet Union. When it came to defeating the Third Reich, the Soviets bore by far the preponderant burden, sustaining 65% of all Allied deaths in World War II.

By comparison, the United States suffered 2% of those losses, for which any American whose father or grandfather served in and survived that war should be saying: Thank you, Comrade Stalin.

For the United States to claim credit for destroying the Wehrmacht is the equivalent of Toyota claiming credit for inventing the automobile. We entered the game late and then shrewdly scooped up more than our fair share of the winnings. The true “Greatest Generation” is the one that willingly expended millions of their fellow Russians while killing millions of German soldiers.

Hard on the heels of World War II came the Cold War, during which erstwhile allies became rivals. Once again, after a decades-long struggle, the United States came out on top.

Yet in determining that outcome, the brilliance of American statesmen was far less important than the ineptitude of those who presided over the Kremlin. Ham-handed Soviet leaders so mismanaged their empire that it eventually imploded, permanently discrediting Marxism-Leninism as a plausible alternative to liberal democratic capitalism. The Soviet dragon managed to slay itself. So thank you, Comrades Malenkov, Khrushchev, Brezhnev, Andropov, Chernenko, and Gorbachev.

Screwing the Pooch

What flag-wavers tend to leave out of their account of the American Century is not only the contributions of others, but the various missteps perpetrated by the United States — missteps, it should be noted, that spawned many of the problems bedeviling us today.

The instances of folly and criminality bearing the label “made-in-Washington” may not rank up there with the Armenian genocide, the Bolshevik Revolution, the appeasement of Adolf Hitler, or the Holocaust, but they sure don’t qualify as small change. To give them their due is necessarily to render the standard account of the American Century untenable.

Here are several examples, each one familiar, even if its implications for the problems we face today are studiously ignored:

Cuba. In 1898, the United States went to war with Spain for the proclaimed purpose of liberating the so-called Pearl of the Antilles. When that brief war ended, Washington reneged on its promise. If there actually has been an American Century, it begins here, with the U.S. government breaking a solemn commitment, while baldly insisting otherwise. By converting Cuba into a protectorate, the United States set in motion a long train of events leading eventually to the rise of Fidel Castro, the Bay of Pigs, Operation Mongoose, the Cuban Missile Crisis, and even today’s Guantanamo Bay prison camp. The line connecting these various developments may not be a straight one, given the many twists and turns along the way, but the dots do connect.

The Bomb. Nuclear weapons imperil our existence. Used on a large scale, they could destroy civilization itself. Even now, the prospect of a lesser power like North Korea or Iran acquiring nukes sends jitters around the world. American presidents — Barack Obama is only the latest in a long line — declare the abolition of these weapons to be an imperative. What they are less inclined to acknowledge is the role the United States played in afflicting humankind with this scourge.

The United States invented the bomb. The United States — alone among members of the nuclear club — actually employed it as a weapon of war. The U.S. led the way in defining nuclear-strike capacity as the benchmark of power in the postwar world, leaving other powers like the Soviet Union, Great Britain, France, and China scrambling to catch up. Today, the U.S. still maintains an enormous nuclear arsenal at the ready and adamantly refuses to commit itself to a no-first-use policy, even as it professes its horror at the prospect of some other nation doing as the United States itself has done.

Iran. Extending his hand to Tehran, President Obama has invited those who govern the Islamic republic to “unclench their fists.” Yet to a considerable degree, those clenched fists are of our own making. For most Americans, the discovery of Iran dates from the time of the notorious hostage crisis of 1979-1981 when Iranian students occupied the U.S. embassy in Tehran, detained several dozen U.S. diplomats and military officers, and subjected the administration of Jimmy Carter to a 444-day-long lesson in abject humiliation.

For most Iranians, the story of U.S.-Iranian relations begins somewhat earlier.  It starts in 1953, when CIA agents collaborated with their British counterparts to overthrow the democratically elected government of Mohammed Mossadegh and return the Shah of Iran to his throne.  The plot succeeded.  The Shah regained power.  The Americans got oil, along with a lucrative market for exporting arms.  The people of Iran pretty much got screwed.  Freedom and democracy did not prosper.  The antagonism that expressed itself in November 1979 with the takeover of the U.S. embassy in Tehran was not entirely without cause.

Afghanistan. President Obama has wasted little time in making the Afghanistan War his own.  Like his predecessor, he vows to defeat the Taliban.  Also like his predecessor, he has yet to confront the role played by the United States in creating the Taliban in the first place.  Washington once took pride in the success it enjoyed funneling arms and assistance to fundamentalist Afghans waging jihad against foreign occupiers.  During the administrations of Jimmy Carter and Ronald Reagan, this was considered to represent the very acme of clever statecraft.  U.S. support for the Afghan mujahideen caused the Soviets fits.  Yet it also fed a cancer that, in time, exacted a most grievous toll on Americans themselves — and has U.S. forces today bogged down in a seemingly endless war.

Act of Contrition

Had the United States acted otherwise, would Cuba have evolved into a stable and prosperous democracy, a beacon of hope for the rest of Latin America? Would the world have avoided the blight of nuclear weapons? Would Iran today be an ally of the United States, a beacon of liberalism in the Islamic world, rather than a charter member of the “axis of evil?” Would Afghanistan be a quiet, pastoral land at peace with its neighbors? No one, of course, can say what might have been. All we know for sure is that policies concocted in Washington by reputedly savvy statesmen now look exceedingly ill-advised.

What are we to make of these blunders? The temptation may be to avert our gaze, thereby preserving the reassuring tale of the American Century. We should avoid that temptation and take the opposite course, acknowledging openly, freely, and unabashedly where we have gone wrong. We should carve such acknowledgments into the face of a new monument smack in the middle of the Mall in Washington: We blew it. We screwed the pooch. We caught a case of the stupids. We got it ass-backwards.

Only through the exercise of candor might we avoid replicating such mistakes.

Indeed, we ought to apologize. When it comes to avoiding the repetition of sin, nothing works like abject contrition. We should, therefore, tell the people of Cuba that we are sorry for having made such a hash of U.S.-Cuban relations for so long. President Obama should speak on our behalf in asking the people of Hiroshima and Nagasaki for forgiveness. He should express our deep collective regret to Iranians and Afghans for what past U.S. interventionism has wrought.

The United States should do these things without any expectations of reciprocity. Regardless of what U.S. officials may say or do, Castro won’t fess up to having made his own share of mistakes. The Japanese won’t liken Hiroshima to Pearl Harbor and call it a wash. Iran’s mullahs and Afghanistan’s jihadists won’t be telling a chastened Washington to let bygones be bygones.

No, we apologize to them, but for our own good — to free ourselves from the accumulated conceits of the American Century and to acknowledge that the United States participated fully in the barbarism, folly, and tragedy that define our time. For those sins, we must hold ourselves accountable.

To solve our problems requires that we see ourselves as we really are. And that requires shedding, once and for all, the illusions embodied in the American Century.

Andrew J. Bacevich is a professor of history and international relations at Boston University. His most recent book, The Limits of Power: The End of American Exceptionalism, is just out in paperback.

Copyright 2009 Andrew J. Bacevich

Farewell, the American Century

A week ago, I had a long conversation with a four-star U.S. military officer who, until his recent retirement, had played a central role in directing the global war on terror. I asked him: what exactly is the strategy that guides the Bush administration’s conduct of this war? His dismaying, if not exactly surprising, answer: there is none.

President Bush will bequeath to his successor the ultimate self-licking ice cream cone. To defense contractors, lobbyists, think-tankers, ambitious military officers, the hosts of Sunday morning talk shows, and the Douglas Feith-like creatures who maneuver to become players in the ultimate power game, the Global War on Terror is a boon, an enterprise redolent with opportunity and promising to extend decades into the future.

Yet, to a considerable extent, that very enterprise has become a fiction, a gimmicky phrase employed to lend an appearance of cohesion to a panoply of activities that, in reality, are contradictory, counterproductive, or at the very least beside the point. In this sense, the global war on terror relates to terrorism precisely as the war on drugs relates to drug abuse and dependence: declaring a state of permanent "war" sustains the pretense of actually dealing with a serious problem, even as policymakers pay lip service to the problem’s actual sources. The war on drugs is a very expensive fraud. So, too, is the Global War on Terror.

Anyone intent on identifying some unifying idea that explains U.S. actions, military and otherwise, across the Greater Middle East is in for a disappointment. During World War II, President Franklin D. Roosevelt laid down "Germany first" and then "unconditional surrender" as core principles. Early in the Cold War, the Truman administration devised the concept of containment, which for decades thereafter provided a conceptual framework to which policymakers adhered. Yet seven years into its Global War on Terror, the Bush administration is without a compass, wandering in the arid wilderness. To the extent that any inkling of a strategy once existed — the preposterous neoconservative vision of employing American power to "transform" the Islamic world — events have long since demolished the assumptions on which it was based.

Rather than one single war, the United States is presently engaged in several.

Ranking first in importance is the war for Bush’s legacy, better known as Iraq. The President himself will never back away from his insistence that here lies the "central front" of the conflict he initiated after 9/11. Hunkered down in their bunker, Bush and his few remaining supporters would have us believe that the "surge" has, at long last, brought victory in sight and with it some prospect of redeeming this otherwise misbegotten and mismanaged endeavor. If the President can leave office spouting assurances that light is finally visible somewhere at the far end of a very long, very dark Mesopotamian tunnel, he will claim at least partial vindication. And if actual developments subsequent to January 20 don’t turn out well, he can always blame the outcome on his successor.

Next comes the orphan war. This is Afghanistan, a conflict now in its eighth year with no signs of ending anytime soon. Given the attention lavished on Iraq, developments in Afghanistan have until recently attracted only intermittent notice. Lately, however, U.S. officials have awakened to the fact that things are going poorly, both politically and militarily. Al Qaeda persists. The Taliban is reasserting itself. Expectations that NATO might ride to the rescue have proven illusory. Apart from enabling Afghanistan to reclaim its status as the world’s number one producer of opium, U.S. efforts to pacify that nation and nudge it toward modernity have produced little.

The Pentagon calls its intervention in Afghanistan Operation Enduring Freedom. The emphasis was supposed to be on the noun. Unfortunately, the adjective conveys the campaign’s defining characteristic: enduring as in endless. Barring a radical re-definition of purpose, this is an enterprise which promises to continue, consuming lives and treasure, for a long, long time.

In neighboring Pakistan, meanwhile, there is the war-hidden-in-plain-sight. Reports of U.S. military action in Pakistan have now become everyday fare. Air strikes, typically launched from missile-carrying drones, are commonplace, and U.S. ground forces have also conducted at least one cross-border raid from inside Afghanistan. Although the White House doesn’t call this a war, it is — a gradually escalating war of attrition in which we are killing both terrorists and noncombatants. Unfortunately, we are killing too few of the former to make a difference and more than enough of the latter to facilitate the recruitment of new terrorists to replace those we eliminate.

Finally — skipping past the wars-in-waiting, which are Syria and Iran — there is Condi’s war. This clash, which does not directly involve U.S. forces, may actually be the most important of all. The war that Secretary of State Condoleezza Rice has made her own is the ongoing conflict between Israel and the Palestinians. Having for years dismissed the insistence of Muslims, Arabs and non-Arabs alike, that the plight of the Palestinians constitutes a problem of paramount importance, Rice now embraces that view. With the fervor of a convert, she has vowed to broker an end to that conflict prior to leaving office in January 2009.

Given that Rice brings little — perhaps nothing — to the effort in the way of fresh ideas, her prospects of making good as a peacemaker appear slight. Yet, as with Bush and Iraq, so too with Rice and the Palestinian problem: she has a lot riding on the effort. If she flops, history will remember her as America’s least effective secretary of state since Cordell Hull spent World War II being ignored, bypassed, and humiliated by Franklin Roosevelt. She will depart Foggy Bottom having accomplished nothing.

There’s nothing inherently wrong in fighting simultaneously on several fronts, as long as actions on front A are compatible with those on front B, and together contribute to overall success. Unfortunately, that is not the case with the Global War on Terror. We have instead an illustration of what Winston Churchill once referred to as a pudding without a theme: a war devoid of strategic purpose.

This absence of cohesion — by now a hallmark of the Bush administration — is both a disaster and an opportunity. It is a disaster in the sense that we have, over the past seven years, expended enormous resources, while gaining precious little in return.

Bush’s supporters beg to differ, of course. They credit the president with having averted a recurrence of 9/11, doubtless a commendable achievement but one primarily attributable to the fact that the United States no longer neglects airport security. To argue that, say, the invasion and occupation of Iraq have prevented terrorist attacks against the United States is the equivalent of contending that Israel’s occupation of the West Bank since 1967 has prevented terrorist attacks against the state of Israel.

Yet the existing strategic vacuum is also an opportunity. When it comes to national security at least, the agenda of the next administration all but sets itself. There is no need to waste time arguing about which issues demand priority action.

First-order questions are begging for attention. How should we gauge the threat? What are the principles that should inform our response? What forms of power are most relevant to implementing that response? Are the means at hand adequate to the task? If not, how should national priorities be adjusted to provide the means required? Given the challenges ahead, how should the government organize itself? Who — both agencies and individuals — will lead?

To each and every one of these questions, the Bush administration devised answers that turned out to be dead wrong. The next administration needs to do better. The place to begin is with the candid recognition that the Global War on Terror has effectively ceased to exist. When it comes to national security strategy, we need to start over from scratch.

Andrew J. Bacevich is professor of history and international relations at Boston University. His bestselling new book is The Limits of Power: The End of American Exceptionalism (The American Empire Project, Metropolitan Books).

Copyright 2008 Andrew Bacevich

Expanding War, Contracting Meaning

The events of the past seven years have yielded a definitive judgment on the strategy that the Bush administration conceived in the wake of 9/11 to wage its so-called Global War on Terror. That strategy has failed, massively and irrevocably. To acknowledge that failure is to confront an urgent national priority: to scrap the Bush approach in favor of a new national security strategy that is realistic and sustainable — a task that, alas, neither of the presidential candidates seems able to recognize or willing to take up.

On September 30, 2001, President Bush received from Secretary of Defense Donald Rumsfeld a memorandum outlining U.S. objectives in the War on Terror. Drafted by Rumsfeld’s chief strategist Douglas Feith, the memo declared expansively: "If the war does not significantly change the world’s political map, the U.S. will not achieve its aim." That aim, as Feith explained in a subsequent missive to his boss, was to "transform the Middle East and the broader world of Islam generally."

Rumsfeld and Feith were co-religionists: Along with other senior Bush administration officials, they worshipped in the Church of the Indispensable Nation, a small but intensely devout Washington-based sect formed in the immediate wake of the Cold War. Members of this church shared an exalted appreciation for the efficacy of American power, especially hard power. The strategy of transformation emerged as a direct expression of their faith.

The members of this church were also united by an equally exalted estimation of their own abilities. Lucky the nation to be blessed with such savvy and sophisticated public servants in its hour of need!

The goal of transforming the Islamic world was nothing if not bold. It implied far-reaching political, economic, social, and even cultural adjustments. At a press conference on September 18, 2001, Rumsfeld spoke bluntly of the need to "change the way that they live." Rumsfeld didn’t specify who "they" were. He didn’t have to. His listeners understood without being told: "They" were Muslims inhabiting a vast arc of territory that stretched from Morocco in the west all the way to the Moro territories of the Southern Philippines in the east.

Yet boldly conceived action, if successfully executed, offered the prospect of solving a host of problems. Once pacified (or "liberated"), the Middle East would cease to breed or harbor anti-American terrorists. Post-9/11 fears about weapons of mass destruction falling into the hands of evil-doers could abate. Local regimes, notorious for being venal, oppressive, and inept, might finally get serious about cleaning up their acts. Liberal values, including rights for women, would flourish. A part of the world perpetually dogged by violence would enjoy a measure of stability, with stability promising not so incidentally to facilitate exploitation of the region’s oil reserves. There was even the possibility of enhancing the security of Israel. Like a powerful antibiotic, the Bush administration’s strategy of transformation promised to clean out not simply a single infection but several; or to switch metaphors, a strategy of transformation meant running the table.

When it came to implementation, the imperative of the moment was to think big. Just days after 9/11, Rumsfeld was charging his subordinates to devise a plan of action that had "three, four, five moves behind it." By December 2001, the Pentagon had persuaded itself that the first move — into Afghanistan — had met success. The Bush administration wasted little time in pocketing its ostensible victory. Attention quickly shifted to the second move, seen by insiders as holding the key to ultimate success: Iraq.

Fix Iraq and moves three, four, and five promised to come easily. Writing in the Weekly Standard, William Kristol and Robert Kagan got it exactly right: "The president’s vision will, in the coming months, either be launched successfully in Iraq, or it will die in Iraq."

The point cannot be emphasized too strongly: Saddam Hussein’s (nonexistent) weapons of mass destruction and his (imaginary) ties to Al Qaeda never constituted the real reason for invading Iraq — any more than the imperative of defending Russian "peacekeepers" in South Ossetia explains the Kremlin’s decision to invade Georgia.

Iraq merely offered a convenient place from which to launch a much larger and infinitely more ambitious project. "After Hussein is removed," enthused Hudson Institute analyst Max Singer, "there will be an earthquake through the region." Success in Iraq promised to endow the United States with hitherto unprecedented leverage. Once the United States had made an example of Saddam Hussein, as the influential neoconservative Richard Perle put it, dealing with other ne’er-do-wells would become simple: "We could deliver a short message, a two-word message: ‘You’re next.’" Faced with the prospect of sharing Saddam’s fate, Syrians, Iranians, Sudanese, and other recalcitrant regimes would see submission as the wiser course — so Perle and others believed.

Members of the administration tried to imbue this strategic vision with a softer ideological gloss. "For 60 years," Condoleezza Rice explained to a group of students in Cairo, "my country, the United States, pursued stability at the expense of democracy in this region here in the Middle East — and we achieved neither." No more. "Now, we are taking a different course. We are supporting the democratic aspirations of all people." The world’s Muslims needed to know that the motives behind the U.S. incursion into Iraq and its actions elsewhere in the region were (or had, at least, suddenly become) entirely benign. Who knows? Rice may even have believed the words she spoke.

In either case — whether the strategy of transformation aimed at dominion or democratization — today, seven years after it was conceived, we can assess exactly what it has produced. The answer is clear: next to nothing, apart from squandering vast resources and exacerbating the slide toward debt and dependency that poses a greater strategic threat to the United States than Osama bin Laden ever did.

In point of fact, hardly had the Pentagon commenced its second move, its invasion of Iraq, when the entire strategy began to unravel. In Iraq, President Bush’s vision of regional transformation did die, much as Kagan and Kristol had feared. No amount of CPR credited to the so-called surge will revive it. Even if tomorrow Iraq were to achieve stability and become a responsible member of the international community, no sensible person could suggest that Operation Iraqi Freedom provides a model to apply elsewhere. Senator John McCain says that he’ll keep U.S. combat troops in Iraq for as long as it takes. Yet even he does not propose "solving" any problems posed by Syria or Iran (much less Pakistan) by employing the methods that the Bush administration used to "solve" the problem posed by Iraq. The Bush Doctrine of preventive war may remain nominally on the books. But, as a practical matter, it is defunct.

The United States will not change the world’s political map in the ways top administration officials once dreamed of. There will be no earthquake that shakes up the Middle East — unless the growing clout of Iran, Hezbollah, and Hamas in recent years qualifies as that earthquake. Given the Pentagon’s existing commitments, there will be no threats of "you’re next" either — at least none that will worry our adversaries, as the Russians have neatly demonstrated. Nor will there be a wave of democratic reform — even Rice has ceased her prattling on that score. Islam will remain stubbornly resistant to change, except on terms of its own choosing. We will not change the way "they" live.

In a book that he co-authored during the run-up to the invasion, Kristol confidently declared, "The mission begins in Baghdad, but it does not end there." In fact, the Bush administration’s strategy of transformation has ended. It has failed miserably. The sooner we face up to that failure, the sooner we can get about repairing the damage.

Andrew J. Bacevich is professor of history and international relations at Boston University. His bestselling new book is The Limits of Power: The End of American Exceptionalism.

Copyright 2008 Andrew J. Bacevich

9/11 Plus Seven

To appreciate the full extent of the military crisis into which the United States has been plunged requires understanding what the Iraq War and, to a lesser extent, the Afghan War have to teach. These two conflicts, along with the attacks of September 11, 2001, will form the centerpiece of George W. Bush’s legacy. Their lessons ought to constitute the basis of a new, more realistic military policy.

In some respects, the effort to divine those lessons is well under way, spurred by critics of President Bush’s policies on the left and the right as well as by reform-minded members of the officer corps. Broadly speaking, this effort has thus far yielded three distinct conclusions. Whether taken singly or together, they invert the post-Cold War military illusions that provided the foundation for the president’s Global War on Terror. In exchange for these received illusions, they propound new ones, which are equally misguided. Thus far, that is, the lessons drawn from America’s post-9/11 military experience are the wrong ones.

According to the first lesson, the armed services — and above all the Army — need to recognize that the challenges posed by Iraq and Afghanistan define not only the military’s present but also its future, the "next war," as enthusiasts like to say. Rooting out insurgents, nation-building, training and advising "host nation" forces, population security and control, winning hearts and minds — these promise to be ongoing priorities, preoccupying U.S. troops for decades to come, all across the Islamic world.

Rather than brief interventions ending in decisive victory, sustained presence will be the norm. Large-scale conventional conflict like 1991’s Operation Desert Storm becomes the least likely contingency. The future will be one of small wars, expected to be frequent, protracted, perhaps perpetual.

Although advanced technology will retain an important place in such conflicts, it will not be decisive. Wherever possible, the warrior will rely on "nonkinetic" methods, functioning as diplomat, mediator, and relief worker. No doubt American soldiers will engage in combat, but, drawing on the latest findings of social science, they will also demonstrate cultural sensitivity, not to speak of mastering local languages and customs. As Secretary of Defense Robert Gates put it in October 2007, "Reviving public services, rebuilding infrastructure and promoting good governance" had now become soldiers’ business. "All these so-called nontraditional capabilities have moved into the mainstream of military thinking, planning, and strategy — where they must stay."

This prospect implies a rigorous integration of military action with political purpose. Hard power and soft power will merge. The soldier on the ground will serve as both cop and social worker. This prospect also implies shedding the sort of utopian expectations that produced so much confident talk of "transformation," "shock-and-awe," and "network-centric warfare" — all of which had tended to segregate war and politics into separate compartments.

Local conditions will dictate technique, dooming the Pentagon’s effort to devise a single preconceived, technologically determined template applicable across the entire spectrum of conflict. When it comes to low-intensity wars, the armed services will embrace a style owing less to the traditions of the Civil War, World War II, or even Gulf War I than to the nearly forgotten American experiences in the Philippines after 1898 and in Central America during the 1920s. Instead of looking for inspiration at the campaigns of U. S. Grant, George Patton, or H. Norman Schwarzkopf, officers will study postwar British and French involvement in places like Palestine and Malaya, Indochina and Algeria.

In sum, an officer corps bloodied in Iraq and Afghanistan has seen the future and it points to many more Iraqs and Afghanistans. Whereas the architects of full spectrum dominance had expected the unprecedented lethality, range, accuracy, and responsiveness of high-tech striking power to perpetuate military dominion, the veterans of Iraq and Afghanistan know better. They remain committed to global dominance while believing that its pursuit will require not only advanced weaponry but also the ability to put boots on the ground and keep them there. This, in turn, implies a plentiful supply of soldiers and loads of patience on the home front.

Were the Civilians of the Defense Department Responsible?

Viewed from another perspective, however, the post-9/11 wars teach an altogether different lesson. According to this alternative view, echoing a similar complaint during the Vietnam era, the shortcomings of U.S. policy in Iraq and Afghanistan have little to do with the actual performance of American forces in the field and everything to do with the meddling of bumbling civilians back in Washington. In its simplest form, fault lies not with the troops themselves, nor with their commanders, but with the likes of Secretary of Defense Donald Rumsfeld, Deputy Secretary of Defense Paul Wolfowitz, and Undersecretary of Defense Douglas Feith, who prevented the troops from doing their jobs.

The charges leveled by Major General John Batiste, who served in Rumsfeld’s Pentagon but subsequently retired in disgust and became one of the defense secretary’s loudest military critics, are representative of this view. "Rumsfeld’s dismal strategic decisions resulted in the unnecessary deaths of American servicemen and women," Batiste declared in September 2006. The former general held Rumsfeld personally "responsible for America and her allies going to war with the wrong plan." But that was just for starters. Rumsfeld also "violated fundamental principles of war, dismissed deliberate military planning, ignored the hard work to build the peace after the fall of Saddam Hussein, set the conditions for Abu Ghraib and other atrocities that further ignited the insurgency, disbanded Iraqi security force institutions when we needed them most, [and] constrained our commanders with an overly restrictive de-Ba’athification policy."

Nor was the problem limited to Rumsfeld himself. It included his chief lieutenants. According to Batiste, Rumsfeld surrounded himself "with like-minded and compliant subordinates who [did] not grasp the importance of the principles of war, the complexities of Iraq, or the human dimension of warfare." The overall effect was tantamount to murder: Rumsfeld "tied the hands of commanders while our troops were in contact with the enemy."

Here lies the second preliminary lesson drawn from Iraq and Afghanistan, one that appeals to disgruntled military officers like Batiste, but also to Democrats eager to blame the Bush administration for any and all sins and to neoconservatives looking to absolve themselves of responsibility for botched wars that they had once cavalierly promoted. The corrective to civilian arrogance and misjudgment is obvious: It requires tilting the civil-military balance back in favor of the generals, untying the hands of senior commanders.

From this perspective, the most important lesson to take away from Iraq and Afghanistan is the imperative to empower military professionals. The Petraeus moment of 2007, when all of official Washington from President Bush to the lowest-ranking congressional staffer waited with bated breath for General David Petraeus to formulate basic policy for Iraq, offers a preview of how this lesson might play itself out.

Is a Draft the Answer?

There is also a third perspective, which blames the failures of Iraq and Afghanistan on a problematic relationship between soldiers and society. According to this view, the All-Volunteer Force itself is the problem. As the military historian Adrian Lewis observed, "The most significant transformation in the American conduct of war since World War II and the invention of the atomic bomb was not technological, but cultural, social, and political — the removal of the American people from the conduct of war." Only after 9/11, with the Bush administration waging war on multiple fronts, have the implications of this transformation become fully evident.

A reliance on volunteer-professionals places a de facto cap on the army’s overall size. The pool of willing recruits is necessarily limited. Given a choice, most young Americans will opt for opportunities other than military service, with protracted war diminishing rather than enhancing any collective propensity to volunteer. It is virtually inconceivable that any presidential call to the colors, however impassioned, any PR campaign, however cleverly designed, or any package of pay and bonuses, however generous, could reverse this disinclination.

Furthermore, to the extent that an army composed of regulars is no longer a people’s army, the people have little say in its use. In effect, the professional military has become an extension of the imperial presidency. The troops fight when and where the commander in chief determines.

Finally, a reliance on professional soldiers eviscerates the concept of civic duty, relieving citizens at large of any obligation to contribute to the nation’s defense. Ending the draft during the waning days of the Vietnam War did nothing to heal the divisions created by that conflict; instead, it ratified the separation of army from society. Like mowing lawns and bussing tables, fighting and perhaps dying to sustain the American way of life became something that Americans pay others to do.

So the third lesson of the Iraq War focuses on the need to repair the relationship between army and society. One way to do this is to junk the All-Volunteer Force (AVF) altogether. Rather than rely on professionals, perhaps it makes sense to revive the tradition of the citizen-soldier.

Proposals to restore this hallowed tradition invariably conjure up images of reinstituting some form of conscription. In place of a system based on the principle of individual choice, those unhappy with the AVF advocate a system based on the principle of state compulsion.

The advantages offered by such a system are hardly trivial. To the extent that Iraq and Afghanistan have exposed the operational, political, and moral problems produced by relying on a small professional force, a draft seems to offer one obvious way to alleviate those problems.

For those who worry that the existing army is overextended, conscription provides a mechanism for expansion. Triple the size of the army — in essence restoring the structure that existed during much of the Cold War — and the personnel shortages that constrain the prosecution of ground campaigns will disappear. Sustaining the military commitment to Iraq for ten or twenty years, or even a century, as Senator John McCain and many neoconservatives are willing to contemplate, then becomes a viable proposition.

War planners will no longer find themselves obliged to give short shrift to Contingency A (Afghanistan) in order to support Contingency B (Iraq). The concept of "surge" will take on a whole new meaning with the Pentagon able to dispatch not a measly 30,000 reinforcements to Iraq or another few thousand to Afghanistan, but 100,000 or more additional troops wherever they might be needed. Was the problem with Operation Iraqi Freedom too few "boots on the ground" for occupation and reconstruction? Reconstitute the draft, and that problem goes away.

Creating a mass army might even permit the United States to resuscitate the Weinberger-Powell Doctrine with its emphasis on "overwhelming force."

For those distressed by the absence of a politically meaningful antiwar movement despite the Iraq War’s manifest unpopularity, the appeal of conscription differs somewhat. Some political activists look to an Iraq-era draft to do what the Vietnam-era draft did: animate large-scale protest, alter the political dynamic, and eventually shut down any conflict that lacks widespread popular support. The prospect of involuntary service will pry the kids out of the shopping malls and send them into the streets. It will prod the parents of draft-eligible offspring to see politics as something other than a mechanism for doling out entitlements. As a consequence, members of Congress keen to retain their seats will define their wartime responsibilities as something more than simply rubber-stamping spending bills proposed by the White House. In this way, a draft could reinvigorate American democracy, restore the governmental system of checks and balances, and constrain the warmongers inhabiting the executive branch.

For those moved by moral considerations, a draft promises to ensure a more equitable distribution of sacrifice in war time. No longer will rural Americans, people of color, recent immigrants, and members of the working class fill the ranks of the armed forces in disproportionate numbers. With conscription, the children of the political elite and of the well-to-do will once again bear their fair share of the load. Those reaping the benefits of the American way of life will contribute to its defense, helping to garrison the more distant precincts of empire. Perhaps even the editorial staffs of the Weekly Standard, National Review, and the New Republic might have the opportunity to serve, a salutary prospect given the propensity of those magazines to argue on behalf of military intervention.

Reconfigure the armed services to fight "small wars"; empower the generals; reconnect soldiering to citizenship — on the surface each of these has a certain appeal. But upon closer examination, each also has large defects. They are the wrong lessons to take from Iraq and Afghanistan.

Drawing the Right Lessons from the Bush Era

If gearing up to fight "small wars," deferring to the brass, and scrapping the All-Volunteer Force are the wrong lessons to be drawn from our recent military experience, then what are the right ones?

The events of the recent past offer several lessons that illuminate this question. The first concerns the nature of war. Iraq and Afghanistan remind us that war is not subject to reinvention, whatever George W. Bush and Pentagon proponents of the so-called Revolution in Military Affairs may contend.

War’s essential nature is fixed, permanent, intractable, and irrepressible. War’s constant companions are uncertainty and risk. "War is the realm of chance," wrote the military theorist Carl von Clausewitz nearly two centuries ago. "No other human activity gives it greater scope: no other has such incessant and varied dealings with this intruder…" — a judgment that the invention of the computer, the Internet, and precision-guided munitions has done nothing to overturn.

So the first lesson to be taken away from the Bush administration’s two military adventures is simply this: War remains today what it has always been — elusive, untamed, costly, difficult to control, fraught with surprise, and sure to give rise to unexpected consequences. Only the truly demented will imagine otherwise.

The second lesson of Iraq and Afghanistan derives from the first. As has been the case throughout history, the utility of armed force remains finite. Even in the information age, to the extent that force "works," it does so with respect to a limited range of contingencies.

Although die-hard supporters of the Global War on Terror will insist otherwise, events in Iraq and Afghanistan have demonstrated definitively that further reliance on coercive methods will not enable the United States to achieve its objectives. Whether the actual aim is to democratize the Islamic world or subdue it, the military "option" is not the answer.

The Bush Doctrine itself provides the basis for a third lesson. For centuries, the Western moral tradition has categorically rejected the concept of preventive war. The events of 9/11 convinced some that this tradition no longer applied: old constraints had to give way. Yet our actual experience with preventive war suggests that, even setting moral considerations aside, to launch a war today to eliminate a danger that might pose a threat at some future date is just plain stupid. It doesn’t work.

History has repeatedly demonstrated the irrationality of preventive war. If the world needed a further demonstration, President Bush provided it. Iraq shows us why the Bush Doctrine was a bad idea in the first place and why its abrogation has become essential. For principled guidance in determining when the use of force is appropriate, the country should conform to the Just War tradition — not only because that tradition is consistent with our professed moral values, but also because its provisions provide an eminently useful guide for sound statecraft.

Finally, there is a fourth lesson, relating to the formulation of strategy. The results of U.S. policy in Iraq and Afghanistan suggest that in the upper echelons of the government and among the senior ranks of the officer corps, this has become a lost art.

Since the end of the Cold War, the tendency among civilians — with President Bush a prime example — has been to confuse strategy with ideology. The president’s freedom agenda, which supposedly provided a blueprint for how to prosecute the Global War on Terror, expressed grandiose aspirations without serious effort to assess the means required to achieve them. Meanwhile, ever since the Vietnam War ended, the tendency among military officers has been to confuse strategy with operations.

Here we come face-to-face with the essential dilemma with which the United States has unsuccessfully wrestled since the Soviets deprived us of a stabilizing adversary. The political elite that ought to bear the chief responsibility for crafting grand strategy instead nurses fantasies of either achieving permanent global hegemony or remaking the world in America’s image. Meanwhile, the military elite that could puncture those fantasies and help restore a modicum of realism to U.S. policy fixates on campaigns and battles, with generalship largely a business of organizing and coordinating matériel.

The four lessons of Iraq and Afghanistan boil down to this: Events have exposed as illusory American pretensions to having mastered war. Even today, war is hardly more subject to human control than the tides or the weather. Simply trying harder — investing ever larger sums in even more advanced technology, devising novel techniques, or even improving the quality of American generalship — will not enable the United States to evade that reality.

As measured by results achieved, the performance of the military since the end of the Cold War and especially since 9/11 has been unimpressive. This indifferent record of success leads some observers to argue that we need a bigger army or a different army.

But the problem lies less with the army that we have — a very fine one, which every citizen should wish to preserve — than with the requirements that we have imposed on our soldiers. Rather than expanding or reconfiguring that army, we need to treat it with the respect that it deserves. That means protecting it from further abuse of the sort that it has endured since 2001.

America doesn’t need a bigger army. It needs a smaller — that is, more modest — foreign policy, one that assigns soldiers missions that are consistent with their capabilities. Modesty implies giving up on the illusions of grandeur to which the end of the Cold War and then 9/11 gave rise. It also means reining in the imperial presidents who expect the army to make good on those illusions. When it comes to supporting the troops, here lies the essence of a citizen’s obligation.

Andrew Bacevich, professor of history and international relations at Boston University, retired from the U.S. Army with the rank of colonel. This piece is adapted from his new book, The Limits of Power: The End of American Exceptionalism (Metropolitan Books, 2008). He is also the author of The New American Militarism, among other books. His writing has appeared in Foreign Affairs, the Atlantic Monthly, the Nation, the New York Times, the Los Angeles Times, and the Wall Street Journal.

From the book The Limits of Power: The End of American Exceptionalism by Andrew Bacevich, Copyright © 2008 by Andrew Bacevich. Reprinted by arrangement with Metropolitan Books, an Imprint of Henry Holt and Company, LLC. All Rights Reserved.

Is Perpetual War Our Future?

"War is the great auditor of institutions," the historian Corelli Barnett once observed. Since 9/11, the United States has undergone such an audit and been found wanting. That adverse judgment applies in full to America’s armed forces.

Valor does not offer the measure of an army’s greatness, nor does fortitude, nor durability, nor technological sophistication. A great army is one that accomplishes its assigned mission. Since George W. Bush inaugurated his global war on terror, the armed forces of the United States have failed to meet that standard.

In the aftermath of September 11, 2001, Bush conceived of a bold, offensive strategy, vowing to "take the battle to the enemy, disrupt his plans, and confront the worst threats before they emerge." The military offered the principal means for undertaking this offensive, and U.S. forces soon found themselves engaged on several fronts.

Two of those fronts — Afghanistan and Iraq — commanded priority attention. In each case, the assigned task was to deliver a knockout blow, leading to a quick, decisive, economical, politically meaningful victory. In each case, despite impressive displays of valor, fortitude, durability, and technological sophistication, America’s military came up short. The problem lay not with the level of exertion but with the results achieved.

In Afghanistan, U.S. forces failed to eliminate the leadership of Al Qaeda. Although they toppled the Taliban regime that had ruled most of that country, they failed to eliminate the Taliban movement, which soon began to claw its way back. Intended as a brief campaign, the Afghan War became a protracted one. Nearly seven years after it began, there is no end in sight. If anything, America’s adversaries are gaining strength. The outcome remains much in doubt.

In Iraq, events followed a similar pattern, with the appearance of easy success belied by subsequent developments. The U.S. invasion began on March 19, 2003. Six weeks later, against the backdrop of a White House-produced banner proclaiming "Mission Accomplished," President Bush declared that "major combat operations in Iraq have ended." This claim proved illusory.

Writing shortly after the fall of Baghdad, the influential neoconservatives David Frum and Richard Perle declared Operation Iraqi Freedom "a vivid and compelling demonstration of America’s ability to win swift and total victory." General Tommy Franks, commanding the force that invaded Iraq, modestly characterized the results of his handiwork as "unequalled in its excellence by anything in the annals of war." In retrospect, such judgments — and they were legion — can only be considered risible. A war thought to have ended on April 9, 2003, in Baghdad’s al-Firdos Square was only just beginning. Fighting dragged on for years, exacting a cruel toll. Iraq became a reprise of Vietnam, although, in some respects at least, on a blessedly smaller scale.

A New American Way of War?

It wasn’t supposed to be this way. Just a few short years ago, observers were proclaiming that the United States possessed military power such as the world had never seen. Here was the nation’s strong suit. "The troops" appeared unbeatable. Writing in 2002, for example, Max Boot, a well-known commentator on military matters, attributed to the United States a level of martial excellence "that far surpasses the capabilities of such previous would-be hegemons as Rome, Britain, and Napoleonic France." With U.S. forces enjoying "unparalleled strength in every facet of warfare," allies, he wrote, had become an encumbrance: "We just don’t need anyone else’s help very much."

Boot dubbed this the Doctrine of the Big Enchilada. Within a year, after U.S. troops had occupied Baghdad, he went further: America’s army even outclassed Germany’s Wehrmacht. The mastery displayed in knocking off Saddam, Boot gushed, made "fabled generals such as Erwin Rommel and Heinz Guderian seem positively incompetent by comparison."

All of this turned out to be hot air. If the global war on terror has produced one undeniable conclusion, it is this: Estimates of U.S. military capabilities have turned out to be wildly overstated. The Bush administration’s misplaced confidence in the efficacy of American arms represents a strategic misjudgment that has cost the country dearly. Even in an age of stealth, precision weapons, and instant communications, armed force is not a panacea. Even in a supposedly unipolar era, American military power turns out to be quite limited.

How did it happen that Americans so grossly overestimated the utility of military power? The answer to that question lies at the intersection of three great illusions.

According to the first illusion, the United States during the 1980s and 1990s had succeeded in reinventing armed conflict. The result was to make force more precise, more discriminating, and potentially more humane. The Pentagon had devised a new American Way of War, investing its forces with capabilities unlike any the world had ever seen. As President Bush exuberantly declared shortly after the fall of Baghdad in April 2003, "We’ve applied the new powers of technology… to strike an enemy force with speed and incredible precision. By a combination of creative strategies and advanced technologies, we are redefining war on our terms. In this new era of warfare, we can target a regime, not a nation."

The distinction between regime and nation was a crucial one. By employing these new military techniques, the United States could eliminate an obstreperous foreign leader and his cronies, while sparing the population over which that leader ruled. Putting a missile through the roof of a presidential palace made it unnecessary to incinerate an entire capital city, endowing force with hitherto undreamed-of political utility and easing ancient moral inhibitions on the use of force. Force had been a club; it now became a scalpel. By the time the president spoke, such sentiments had already become commonplace among many (although by no means all) military officers and national security experts.

Here lay a formula for certain victory. Confidence in military prowess both reflected and reinforced a post-Cold War confidence in the universality of American values. Harnessed together, they made a seemingly unstoppable one-two punch.

With that combination came expanded ambitions. In the 1990s, the very purpose of the Department of Defense changed. Sustaining American global preeminence, rather than mere national security, became its explicit function. In the most comprehensive articulation of this new American Way of War, the Joint Chiefs of Staff committed the armed services to achieving what they called "full spectrum dominance" — unambiguous supremacy in all forms of warfare, to be achieved by tapping the potential of two "enablers" — "technological innovation and information superiority."

Full spectrum dominance stood in relation to military affairs as the political scientist Francis Fukuyama’s well-known proclamation of "the end of history" stood in relation to ideology: Each claimed to have unlocked ultimate truths. According to Fukuyama, democratic capitalism represented the final stage in political economic evolution. According to the proponents of full spectrum dominance, that concept represented the final stage in the evolution of modern warfare. In their first days and weeks, the successive invasions of Afghanistan and Iraq both seemed to affirm such claims.

How Not to "Support the Troops"

According to the second illusion, American civilian and military leaders subscribed to a common set of principles for employing their now-dominant forces. Adherence to these principles promised to prevent any recurrence of the sort of disaster that had befallen the nation in Vietnam. If politicians went off half-cocked, as President Lyndon Johnson and Secretary of Defense Robert McNamara had back in the 1960s, generals who had correctly discerned and assimilated the lessons of modern war could be counted on to rein them in.

These principles found authoritative expression in the Weinberger-Powell Doctrine, which specified criteria for deciding when and how to use force. Caspar Weinberger, secretary of defense during most of the Reagan era, first articulated these principles in 1984. General Colin Powell, chairman of the Joint Chiefs of Staff during the early 1990s, expanded on them. Yet the doctrine’s real authors were the members of the post-Vietnam officer corps. The Weinberger-Powell principles expressed the military’s own lessons taken from that war. Those principles also expressed the determination of senior officers to prevent any recurrence of Vietnam.

Henceforth, according to Weinberger and Powell, the United States would fight only when genuinely vital interests were at stake. It would do so in pursuit of concrete and attainable objectives. It would mobilize the necessary resources — political and moral as well as material — to win promptly and decisively. It would end conflicts expeditiously and then get out, leaving no loose ends. The spirit of the Weinberger-Powell Doctrine was not permissive; its purpose was to curb the reckless or imprudent inclinations of bellicose civilians.

According to the third illusion, the military and American society had successfully patched up the differences that produced something akin to divorce during the divisive Vietnam years. By the 1990s, a reconciliation of sorts was under way. In the wake of Operation Desert Storm, "the American people fell in love again with their armed forces." So, at least, General Colin Powell, one of that war’s great heroes, believed. Out of this love affair a new civil-military compact had evolved, one based on the confidence that, in times of duress, Americans could be counted on to "support the troops." Never again would the nation abandon its soldiers.

The All-Volunteer Force (AVF) — despite its name, a professional military establishment — represented the chief manifestation of this new compact. By the 1990s, Americans were celebrating the AVF as the one component of the federal government that actually worked as advertised. The AVF embodied the nation’s claim to the status of sole superpower; it was "America’s Team." In the wake of the Cold War, the AVF sustained the global Pax Americana without interfering with the average American’s pursuit of life, liberty, and happiness. What was not to like?

Events since 9/11 have exposed these three illusions for what they were. When tested, the new American Way of War yielded more glitter than gold. The generals and admirals who touted the wonders of full spectrum dominance were guilty of flagrant professional malpractice, if not outright fraud. To judge by the record of the past twenty years, U.S. forces win decisively only when the enemy obligingly fights on American terms — and Saddam Hussein’s demise has drastically reduced the likelihood of finding such accommodating adversaries in the future. As for loose ends, from Somalia to the Balkans, from Central Asia to the Persian Gulf, they have been endemic.

When it came to the Weinberger-Powell Doctrine, civilian willingness to conform to its provisions proved to be highly contingent. Confronting Powell in 1993, Madeleine Albright famously demanded to know, "What’s the point of having this superb military that you’re always talking about, if we can’t use it?" Mesmerized by the prospect of putting American soldiers to work to alleviate the world’s ills, Albright soon enough got her way. An odd alliance that combined left-leaning do-gooders with jingoistic politicians and pundits succeeded in chipping away at constraints on the use of force. "Humanitarian intervention" became all the rage. Whatever restraining influence the generals exercised during the 1990s did not survive that decade. Lessons of Vietnam that had once seemed indelible were forgotten.

Meanwhile, the reconciliation of the people and the army turned out to be a chimera. When the chips were down, "supporting the troops" elicited plenty of posturing but little by way of binding commitments. Far from producing a stampede of eager recruits keen to don a uniform, the events of 9/11 reaffirmed a widespread popular preference for hiring someone else’s kid to chase terrorists, spread democracy, and ensure access to the world’s energy reserves.

In the midst of a global war of ostensibly earthshaking importance, Americans demonstrated a greater affinity for their hometown sports heroes than for the soldiers defending the distant precincts of the American imperium. Tom Brady makes millions playing quarterback in the NFL and rakes in millions more from endorsements. Pat Tillman quit professional football to become an army ranger and was killed in Afghanistan. Yet, of the two, Brady more fully embodies the contemporary understanding of the term patriot.

Demolishing the Doctrine of the Big Enchilada

While they persisted, however, these three illusions fostered gaudy expectations about the efficacy of American military might. Every president since Ronald Reagan has endorsed these expectations. Every president since Reagan has exploited his role as commander in chief to expand on the imperial prerogatives of his office. Each has also relied on military power to conceal or manage problems that stemmed from the nation’s habits of profligacy.

In the wake of 9/11, these puerile expectations — that armed force wielded by a strong-willed chief executive could do just about anything — reached an apotheosis of sorts. Having manifestly failed to anticipate or prevent a devastating attack on American soil, President Bush proceeded to use his ensuing global war on terror as a pretext for advancing grandiose new military ambitions married to claims of unbounded executive authority — all under the guise of keeping Americans "safe."

With the president denying any connection between the events of September 11th and past U.S. policies, his declaration of a global war nipped in the bud whatever inclination the public might have entertained to reconsider those policies. In essence, Bush counted on war both to concentrate greater power in his own hands and to divert attention from the political, economic, and cultural bind in which the United States found itself as a result of its own past behavior.

As long as U.S. forces sustained their reputation for invincibility, it remained possible to pretend that the constitutional order and the American way of life were in good health. The concept of waging an open-ended global campaign to eliminate terrorism retained a modicum of plausibility. After all, how could anyone or anything stop the unstoppable American soldier?

Call that reputation into question, however, and everything else unravels. This is what occurred when the Iraq War went sour. The ills afflicting our political system, including a deeply irresponsible Congress, broken national security institutions, and above all an imperial commander in chief not up to the job, became all but impossible to ignore. So, too, did the self-destructive elements inherent in the American way of life — especially an increasingly costly addiction to foreign oil, universally deplored and almost as universally indulged. More noteworthy still, the prospect of waging war on a global scale for decades, if not generations, became preposterous.

To anyone with eyes to see, the events of the past seven years have demolished the Doctrine of the Big Enchilada. A gung-ho journalist like Robert Kaplan might still believe that, with the dawn of the twenty-first century, the Pentagon had "appropriated the entire earth, and was ready to flood the most obscure areas of it with troops at a moment’s notice," that planet Earth in its entirety had become "battle space for the American military." Yet any buck sergeant of even middling intelligence knew better than to buy such claptrap.

With the Afghanistan War well into its seventh year and the Iraq War marking its fifth anniversary, a commentator like Michael Barone might express absolute certainty that "just about no mission is impossible for the United States military." But Barone was not facing the prospect of being ordered back to the war zone for his second or third combat tour.

Between what President Bush called upon America’s soldiers to do and what they were capable of doing loomed a huge gap that defines the military crisis besetting the United States today. For a nation accustomed to seeing military power as its trump card, the implications of that gap are monumental.

