The Truth About Diego Garcia

First, they tried to shoot the dogs. Next, they tried to poison them with strychnine. When both failed as efficient killing methods, British government agents and U.S. Navy personnel used raw meat to lure the pets into a sealed shed. Locking them inside, they gassed the howling animals with exhaust piped in from U.S. military vehicles. Then, setting coconut husks ablaze, they burned the dogs’ carcasses as their owners were left to watch and ponder their own fate.

The truth about the U.S. military base on the British-controlled Indian Ocean island of Diego Garcia is often hard to believe. It would be easy enough to confuse the real story with fictional accounts of the island found in the Transformers movies, on the television series 24, and in Internet conspiracy theories about the disappearance of Malaysia Airlines flight MH370.

While the grim saga of Diego Garcia frequently reads like fiction, it has proven all too real for the people involved. It’s the story of a U.S. military base built on a series of real-life fictions told by U.S. and British officials over more than half a century. The central fiction is that the U.S. built its base on an “uninhabited” island. That was “true” only because the indigenous people were secretly exiled from the Chagos Archipelago when the base was built. Although their ancestors had lived there since the time of the American Revolution, Anglo-American officials decided, as one wrote, to “maintain the fiction that the inhabitants of Chagos [were] not a permanent or semi-permanent population,” but just “transient contract workers.” The same official summed up the situation bluntly: “We are able to make up the rules as we go along.”

And so they did: between 1968 and 1973, American officials conspired with their British colleagues to remove the Chagossians, carefully hiding their expulsion from Congress, Parliament, the U.N., and the media. During the deportations, British agents and members of a U.S. Navy construction battalion rounded up and killed all those pet dogs. Their owners were then deported to the western Indian Ocean islands of Mauritius and the Seychelles, 1,200 miles from their homeland, where they received no resettlement assistance. More than 40 years after their expulsion, Chagossians generally remain the poorest of the poor in their adopted lands, struggling to survive in places that outsiders know as exotic tourist destinations.

During the same period, Diego Garcia became a multi-billion-dollar Navy and Air Force base and a central node in U.S. military efforts to control the Greater Middle East and its oil and natural gas supplies. The base, which few Americans are aware of, is more important strategically and more secretive than the U.S. naval base-cum-prison at Guantánamo Bay, Cuba. Unlike Guantánamo, no journalist has gotten more than a glimpse of Diego Garcia in more than 30 years. And yet, it has played a key role in waging the Gulf War, the 2003 invasion of Iraq, the U.S.-led war in Afghanistan, and the current bombing campaign against the Islamic State in Syria and Iraq.

Following years of reports that the base was a secret CIA “black site” for holding terrorist suspects and years of denials by U.S. and British officials, leaders on both sides of the Atlantic finally fessed up in 2008. “Contrary to earlier explicit assurances,” said Secretary of State for Foreign and Commonwealth Affairs David Miliband, Diego Garcia had indeed played at least some role in the CIA’s secret “rendition” program.

Last year, British officials claimed that flight log records, which might have shed light on those rendition operations, were “incomplete due to water damage” thanks to “extremely heavy weather in June 2014.” A week later, they suddenly reversed themselves, saying that the “previously wet paper records have been dried out.” Two months later, they insisted the logs had not dried out at all and were “damaged to the point of no longer being useful.” Except that the British government’s own weather data indicates that June 2014 was an unusually dry month on Diego Garcia. A legal rights advocate said British officials “could hardly be less credible if they simply said ‘the dog ate my homework.’”

And these are just a few of the fictions underlying the base that occupies the Chagossians’ former home and that the U.S. military has nicknamed the “Footprint of Freedom.” After more than four decades of exile, however, with a Chagossian movement to return to their homeland growing, the fictions of Diego Garcia may finally be crumbling.

No “Tarzans”

The story of Diego Garcia begins in the late eighteenth century. At that time, enslaved peoples from Africa, brought to work on Franco-Mauritian coconut plantations, became the first settlers in the Chagos Archipelago. Following emancipation and the arrival of indentured laborers from India, a diverse mixture of peoples created a new society with its own language, Chagos Kreol. They called themselves the Ilois — the Islanders.

While still a plantation society, the archipelago, by then under British colonial control, provided a secure life featuring universal employment and numerous social benefits on islands described by many as idyllic. “That beautiful atoll of Diego Garcia, right in the middle of the ocean,” is how Stuart Barber described it in the late 1950s. A civilian working for the U.S. Navy, Barber would become the architect of one of the most powerful U.S. military bases overseas.

Amid Cold War competition with the Soviet Union, Barber and other officials were concerned that there was almost no U.S. military presence in and around the Indian Ocean. Barber noted that Diego Garcia’s isolation — halfway between Africa and Indonesia and 1,000 miles south of India — ensured that it would be safe from attack, yet was still within striking distance of territory from southern Africa and the Middle East to South and Southeast Asia.

Guided by Barber’s idea, the administrations of John F. Kennedy and Lyndon Johnson convinced the British government to detach the Chagos Archipelago from colonial Mauritius and create a new colony, which they called the British Indian Ocean Territory. Its sole purpose would be to house U.S. military facilities.

During secret negotiations with their British counterparts, Pentagon and State Department officials insisted that Chagos come under their “exclusive control (without local inhabitants),” embedding an expulsion order in a polite-looking parenthetical phrase. U.S. officials wanted the islands “swept” and “sanitized.” British officials appeared happy to oblige, removing a people one official called “Tarzans” and, in a racist reference to Robinson Crusoe, “Man Fridays.”

“Absolutely Must Go”

This plan was confirmed with an “exchange of notes” signed on December 30, 1966, by U.S. and British officials, as one of the State Department negotiators told me, “under the cover of darkness.” The notes effectively constituted a treaty but required no Congressional or Parliamentary approval, meaning that both governments could keep their plans hidden.

According to the agreement, the United States would gain use of the new colony “without charge.” This was another fiction. In confidential minutes, the United States agreed to secretly wipe out a $14 million British military debt, circumventing the need to ask Congress for funding. In exchange, the British agreed to take the “administrative measures” necessary for “resettling the inhabitants.”

Those measures meant that, after 1967, any Chagossians who left home for medical treatment or a routine vacation in Mauritius were barred from returning. Soon, British officials began restricting the flow of food and medical supplies to Chagos. As conditions deteriorated, more islanders began leaving. By 1970, the U.S. Navy had secured funding for what officials told Congress would be an “austere communications station.” They were, however, already planning to ask for additional funds to expand the facility into a much larger base. As the Navy’s Office of Communications and Cryptology explained, “The communications requirements cited as justification are fiction.” By the 1980s, Diego Garcia would become a billion-dollar garrison.

In briefing papers delivered to Congress, the Navy described Chagos’s population as “negligible,” with the islands “for all practical purposes… uninhabited.” In fact, there were around 1,000 people on Diego Garcia in the 1960s and 500 to 1,000 more on other islands in the archipelago. With Congressional funds secured, the Navy’s highest-ranking admiral, Elmo Zumwalt, summed up the Chagossians’ fate in a 1971 memo of exactly three words: “Absolutely must go.”

The authorities soon ordered the remaining Chagossians — generally allowed no more than a single box of belongings and a sleeping mat — onto overcrowded cargo ships destined for Mauritius and the Seychelles. By 1973, the last Chagossians were gone.

“Abject Poverty”

At their destinations, most of the Chagossians were literally left on the docks, homeless, jobless, and with little money. In 1975, two years after the last removals, a Washington Post reporter found them living in “abject poverty.”

Aurélie Lisette Talate was one of the last to go. “I came to Mauritius with six children and my mother,” she told me. “We got our house… but the house didn’t have a door, didn’t have running water, didn’t have electricity. And then my children and I began to suffer. All my children started getting sick.”

Within two months, two of her children were dead. The second was buried in an unmarked grave because she lacked money for a proper burial. Aurélie experienced fainting spells herself and couldn’t eat. “We were living like animals. Land? We had none… Work? We had none. Our children weren’t going to school.”

Today, most Chagossians, who now number more than 5,000, remain impoverished. In their language, their lives are ones of lamizer (impoverished misery) and sagren (profound sorrow and heartbreak over being exiled from their native lands). Many of the islanders attribute sickness and even death to sagren. “I had something that had been affecting me for a long time, since we were uprooted,” was the way Aurélie explained it to me. “This sagren, this shock, it was this same problem that killed my child. We weren’t living free like we did in our natal land.”

Struggling for Justice

From the moment they were deported, the Chagossians demanded to be returned or at least properly resettled. After years of protest, including five hunger strikes led by women like Aurélie Talate, some in Mauritius received the most modest of compensation from the British government: small concrete houses, tiny plots of land, and about $6,000 per adult. Many used the money to pay off large debts they had accrued. For most, conditions improved only marginally. Those living in the Seychelles received nothing.

The Chagossian struggle was reinvigorated in 1997 with the launching of a lawsuit against the British government. In November 2000, the British High Court ruled the removal illegal. In 2001 and 2002, most Chagossians joined new lawsuits in both American and British courts demanding the right to return and proper compensation for their removal and for resettling their islands. The U.S. suit was ultimately dismissed on the grounds that the judiciary can’t, in most circumstances, overrule the executive branch on matters of military and foreign policy. In Britain, the Chagossians were more successful. In 2002, they secured the right to full U.K. citizenship. Over 1,000 Chagossians have since moved to Britain in search of better lives. Twice more, British courts ruled in the people’s favor, with judges calling the government’s behavior “repugnant” and an “abuse of power.”

On the government’s final appeal, however, Britain’s then highest court, the Law Lords in the House of Lords, upheld the exile in a 3-2 decision. The Chagossians appealed to the European Court of Human Rights to overturn the ruling.

A Green Fiction

Before the European Court could rule, the British government announced the creation of the world’s largest Marine Protected Area (MPA) in the Chagos Archipelago. The date of the announcement, April Fool’s Day 2010, should have been a clue that there was more than environmentalism behind the move. The MPA banned commercial fishing and limited other human activity in the archipelago, endangering the viability of any resettlement efforts.

And then came WikiLeaks. In December 2010, it released a State Department cable from the U.S. Embassy in London quoting a senior Foreign and Commonwealth Office official saying that the “former inhabitants would find it difficult, if not impossible, to pursue their claim for resettlement on the islands if the entire Chagos Archipelago were a marine reserve.” U.S. officials agreed. According to the Embassy, Political Counselor Richard Mills wrote, “Establishing a marine reserve might, indeed… be the most effective long-term way to prevent any of the Chagos Islands’ former inhabitants or their descendants from resettling.”

Not surprisingly, the main State Department concern was whether the MPA would affect base operations. “We are concerned,” the London Embassy noted, that some “would come to see the existence of a marine reserve as inherently inconsistent with the military use of Diego Garcia.” British officials assured the Americans there would be “no constraints on military operations.”

Although the European Court of Human Rights ultimately ruled against the Chagossians in 2013, this March, a U.N. tribunal found that the British government had violated international law in creating the Marine Protected Area. Next week, Chagossians will challenge the MPA and their expulsion before the British Supreme Court (now Britain’s highest court), armed with the U.N. ruling and revelations that the government won its House of Lords decision with the help of a fiction-filled resettlement study.

Meanwhile, the European Parliament has passed a resolution calling for the Chagossians’ return, the African Union has condemned their deportation as unlawful, three Nobel laureates have spoken out on their behalf, and dozens of members of the British Parliament have joined a group supporting their struggle. In January, a British government “feasibility study” found no significant legal barriers to resettling the islands and outlined several possible resettlement plans, beginning with Diego Garcia. (Notably, Chagossians are not calling for the removal of the U.S. military base. Their opinions about it are diverse and complicated. At least some would prefer jobs on the base to lives of poverty and unemployment in exile.)

Of course, no study was needed to know that resettlement on Diego Garcia and in the rest of the archipelago is feasible. The base, which has hosted thousands of military and civilian personnel for more than 40 years, has demonstrated that well enough. In fact, Stuart Barber, its architect, came to the same conclusion in the years before his death. After he learned of the Chagossians’ fate, he wrote a series of impassioned letters to Human Rights Watch and the British Embassy in Washington, among others, imploring them to help the Chagossians return home. In a letter to Alaska Senator Ted Stevens, he said bluntly that the expulsion “wasn’t necessary militarily.”

In a 1991 letter to the Washington Post, Barber suggested that it was time “to redress the inexcusably inhuman wrongs inflicted by the British at our insistence.” He added, “Substantial additional compensation for 18-25 past years of misery for all evictees is certainly in order. Even if that were to cost $100,000 per family, we would be talking of a maximum of $40-50 million, modest compared with our base investment there.”

Almost a quarter-century later, nothing has yet been done. In 2016, the initial 50-year agreement for Diego Garcia will expire. While it is subject to an automatic 20-year renewal, it provides for a two-year renegotiation period, which commenced in late 2014. With momentum building in support of the Chagossians, they are optimistic that the two governments will finally correct this historic injustice. That U.S. officials allowed the British feasibility study to consider resettlement plans for Diego Garcia is a hopeful sign that Anglo-American policy may finally be shifting to right a great wrong in the Indian Ocean.

Unfortunately, Aurélie Talate will never see the day when her people go home. Like others among the rapidly dwindling number of Chagossians born in the archipelago, Aurélie died in 2012 at age 70, succumbing to the heartbreak that is sagren.

David Vine, a TomDispatch regular, is associate professor of anthropology at American University in Washington, D.C. His new book, Base Nation: How U.S. Military Bases Abroad Harm America and the World, will be published in August as part of the American Empire Project (Metropolitan Books). He is also the author of Island of Shame: The Secret History of the U.S. Military Base on Diego Garcia. He has written for the New York Times, the Washington Post, the Guardian, and Mother Jones, among other publications. For more of his writing, visit www.davidvine.net.

Copyright 2015 David Vine

Save Us From Washington’s Visionaries

En route back to Washington at the tail end of his most recent overseas trip, John Kerry, America’s peripatetic secretary of state, stopped off in France “to share a hug with all of Paris.” Whether Paris reciprocated the secretary’s embrace went unrecorded.

Despite the requisite reference to General Pershing (“Lafayette, we are here!”) and flying James Taylor in from the 1960s to assure Parisians that “You’ve Got a Friend,” in the annals of American diplomacy Kerry’s hug will likely rank with President Eisenhower’s award of the Legion of Merit to Nicaraguan dictator Anastasio Somoza for “exceptionally meritorious conduct” and Jimmy Carter’s acknowledgment of the “admiration and love” said to define the relationship between the Iranian people and their Shah.  In short, it was a moment best forgotten.

Alas, this vapid, profoundly silly event is all too emblematic of statecraft in the Obama era.  Seldom have well-credentialed and well-meaning people worked so hard to produce so little of substance.

Not one of the signature foreign policy initiatives conceived in Obama’s first term has borne fruit. When it came to making a fresh start with the Islamic world, responsibly ending the “dumb” war in Iraq (while winning the “necessary” one in Afghanistan), “resetting” U.S.-Russian relations, and “pivoting” toward Asia, mark your scorecard 0 for 4.

There’s no doubt that when Kerry arrived at the State Department he brought with him some much-needed energy.  That he is giving it his all — the department’s website reports that the secretary has already clocked over 682,000 miles of travel — is doubtless true as well.  The problem is the absence of results.  Remember when his signature initiative was going to be an Israeli-Palestinian peace deal?  Sadly, that quixotic plan, too, has come to naught.

Yes, Team Obama “got” bin Laden.  And, yes, it deserves credit for abandoning a self-evidently counterproductive 50-plus-year-old policy toward Cuba and for signing a promising agreement with China on climate change.  That said, the administration’s overall record of accomplishment is beyond thin, starting with that first-day-in-the-Oval-Office symbol that things were truly going to be different: Obama’s order to close Guantanamo.  That, of course, remains a work in progress (despite regular reassurances of light glimmering at the end of what has become a very long tunnel).

In fact, taking the president’s record as a whole, noting that on his watch occasional U.S. drone strikes have become routine, the Nobel Committee might want to consider revoking its Peace Prize.

Nor should we expect much in the time that Obama has remaining. Perhaps there is a deal with Iran waiting in the wings (along with the depth charge of ever-fiercer congressionally mandated sanctions), but signs of intellectual exhaustion are distinctly in evidence.

“Where there is no vision,” the Hebrew Bible tells us, “the people perish.”  There’s no use pretending: if there’s one thing the Obama administration most definitely has not got and has never had, it’s a foreign policy vision.

In Search of Truly Wise (White) Men — Only Those 84 or Older Need Apply

All of this evokes a sense of unease, even consternation bordering on panic, in circles where members of the foreign policy elite congregate.  Absent visionary leadership in Washington, they have persuaded themselves, we’re all going down.  So the world’s sole superpower and self-anointed global leader needs to get game — and fast.

Leslie Gelb, former president of the Council on Foreign Relations, recently weighed in with a proposal for fixing the problem: clean house.  Obama has surrounded himself with fumbling incompetents, Gelb charges.  Get rid of them and bring in the visionaries.

Writing at the Daily Beast, Gelb urges the president to fire his entire national security team and replace them with “strong and strategic people of proven foreign policy experience.”  Translation: the sort of people who sip sherry and nibble on brie in the august precincts of the Council on Foreign Relations.  In addition to offering his own slate of nominees, including several veterans of the storied George W. Bush administration, Gelb suggests that Obama consult regularly with Henry Kissinger, Brent Scowcroft, Zbigniew Brzezinski, and James Baker.  These distinguished war-horses range in age from 84 to 91.  By implication, only white males born prior to World War II are eligible for induction into the ranks of the Truly Wise Men.

Anyway, Gelb emphasizes, Obama needs to get on with it.  With the planet awash in challenges that “imperil our very survival,” there is simply no time to waste.

At best, Gelb’s got it half right.  When it comes to foreign policy, this president has indeed demonstrated a knack for surrounding himself with lackluster lieutenants.  That statement applies equally to national security adviser Susan Rice (and her predecessor), to Secretary of State Kerry (and his predecessor), and to outgoing Pentagon chief Chuck Hagel.  Ashton Carter, the technocrat slated to replace Hagel as defense secretary, comes from the same mold.

They are all “seasoned” — in Washington, a euphemism for bland, conventional, and utterly unimaginative — charter members of the Rogers-Christopher school of American statecraft.  (That may require some unpacking, so pretend you’re on Jeopardy.  Alex Trebek:  “Two eminently forgettable and completely forgotten twentieth-century secretaries of state.”  You, hitting the buzzer:  “Who were William Rogers and Warren Christopher?”  “Correct!”)

Members of Obama’s national security team worked long and hard to get where they are.  Yet along the way — perhaps from absorbing too many position papers, PowerPoint briefings, and platitudes about “American global leadership” — they lost whatever creative spark once endowed them with the appearance of talent and promise.  Ambition, unquestioned patriotism, and a capacity for putting in endless hours (and enduring endless travel) — all these remain.  But a serious conception of where the world is heading and what that implies for basic U.S. policy?  Individually and collectively, they are without a clue.

I submit that maybe that’s okay, that plodding mediocrity can be a boon if, as at present, the alternatives on offer look even worse.

A Hug for Obama

You want vision?  Obama’s predecessor surrounded himself with visionaries.  Dick Cheney, Condoleezza Rice, Donald Rumsfeld, and Paul Wolfowitz, products of the Cold War one and all, certainly fancied themselves large-bore strategic thinkers.  Busily positioning the United States to run (just another “i” and you have “ruin”) the world, they were blindsided by 9/11.  Unembarrassed and unchastened by this disaster, they initiated a series of morally dubious, strategically boneheaded moves that were either (take your pick) going to spread freedom and democracy or position the United States to exercise permanent dominion.  The ensuing Global War on Terror did neither, of course, while adding trillions to the national debt and helping fracture great expanses of the planet.  Obama is still, however ineffectually, trying to clean up the mess they created.

If that’s what handing the keys to big thinkers gets you, give me Susan Rice any day.  Although Obama’s “don’t do stupid shit” may never rank with Washington’s Farewell Address or the Monroe Doctrine in the history books, George W. Bush might have profited from having some comparable axiom taped to his laptop.

Big ideas have their place — indeed, are essential — when the issues at hand are clearly defined.  The Fall of France in 1940 was one such moment, which President Franklin D. Roosevelt recognized.  So too, arguably, was the period immediately after World War II.  The defeat of Nazi Germany and Imperial Japan had left a dangerous power vacuum in both Europe and the Pacific to which George Marshall, Dean Acheson, and their compatriots forged a necessary response.  Perhaps the period 1968-1969 falls into that same category, the debacle of Vietnam requiring a major adjustment in U.S. Cold War strategy.  This Richard Nixon and Henry Kissinger undertook with their opening to China.

Yet despite the overwrought claims of Gelb (and others) that America’s very survival is today at risk, the present historical moment lacks comparable clarity.  Ours is not a time when we face a single overarching threat.  Instead, on several different fronts, worrisome developments are brewing.  Environmental degradation, the rise of China and other emerging powers, the spread of radical Islam, the precarious state of the global economy, vulnerabilities that are an inevitable byproduct of our pursuit of a cyber-utopia: all of these bear very careful watching.  Each one today should entail a defensive response, the United States protecting itself (and its allies) against worst-case outcomes.  But none of these at the present moment justifies embarking upon a let-out-all-the-stops offensive.  Chasing after one problem would necessarily divert attention from the rest.

The immediate future remains too opaque to say with certainty which threat will turn out to pose the greatest danger, whether in the next year or the next decade — and which might even end up not being a threat at all but an unexpected opportunity.  Conditions are not ripe for boldness.  The abiding imperative of the moment is to discern, which requires careful observation and patience.  In short, forget about strategy.

And there’s a further matter.  Correct discernment assumes a proper vantage point.  What you see depends on where you sit and which way you’re facing.  Those who inhabit the upper ranks of the Obama administration (and those whom Leslie Gelb offers as replacements) sit somewhere back in the twentieth century, their worldview shaped by memories of Munich and Yalta, Korea and Vietnam, the Cuban Missile Crisis and the Berlin Wall, none of which retain more than tangential relevance to the present day.

You want vision?  That will require a new crop of visionaries.  Instead of sitting down with ancients like Kissinger, Scowcroft, Brzezinski, or Baker, this president (or his successor) would be better served to pick the brain of the army captain back from multiple combat tours in Iraq and Afghanistan, the moral theologian specializing in inter-religious dialog, the Peace Corps volunteer who spent the last two years in West Africa, and the Silicon Valley entrepreneur best able to spell out the political implications of the next big thing.

In short, a post-twentieth century vision requires a post-twentieth century generation, able to free itself from old shibboleths to which Leslie Gelb and most of official Washington today remain stubbornly dedicated.  That generation waits in the wings and after another presidential election or two may indeed wield some influence.  We should hope so.  In the meantime, we should bide our time, amending the words of the prophet to something like: “Where there is no vision, the people muddle along and await salvation.”

So as Obama and his team muddle toward their finish line, their achievements negligible, we might even express a modicum of gratitude.  When they depart the scene, we will forget the lot of them.  Yet at least they managed to steer clear of truly epic disasters.  When muddling was the best Washington had on offer, they delivered.  They may even deserve a hug.

Andrew J. Bacevich, a TomDispatch regular, is writing a military history of America’s War for the Greater Middle East. His most recent book is Breach of Trust: How Americans Failed Their Soldiers and Their Country.

Copyright 2015 Andrew Bacevich

Always and Everywhere

The abiding defect of U.S. foreign policy? It’s isolationism, my friend. Purporting to steer clear of war, isolationism fosters it. Isolationism impedes the spread of democracy. It inhibits trade and therefore prosperity. It allows evildoers to get away with murder. Isolationists prevent the United States from accomplishing its providentially assigned global mission. Wean the American people from their persistent inclination to look inward and who knows what wonders our leaders will accomplish.

The United States has been at war for well over a decade now, with U.S. attacks and excursions in distant lands having become as commonplace as floods and forest fires. Yet during the recent debate over Syria, the absence of popular enthusiasm for opening up another active front evoked expressions of concern in Washington that Americans were once more turning their backs on the world.

As he was proclaiming the imperative of punishing the government of Bashar al-Assad, Secretary of State John Kerry also chided skeptical members of the Senate Foreign Relations Committee that “this is not the time for armchair isolationism.”  Commentators keen to have a go at the Syrian autocrat wasted little time in expanding on Kerry’s theme.

Reflecting on “where isolationism leads,” Jennifer Rubin, the reliably bellicose Washington Post columnist, was quick to chime in, denouncing those hesitant to initiate another war as “infantile.” American isolationists, she insisted, were giving a green light to aggression. Any nation that counted on the United States for protection had now become a “sitting duck,” with “Eastern Europe [and] neighbors of Venezuela and Israel” among those left exposed and vulnerable.  News reports of Venezuelan troop movements threatening Brazil, Colombia, or Guyana were notably absent from the Post or any other media outlet, but no matter — you get the idea.

Military analyst Frederick Kagan was equally troubled.  Also writing in the Post, he worried that “the isolationist narrative is rapidly becoming dominant.”  His preferred narrative emphasized the need for ever greater military exertions, with Syria just the place to launch a new campaign.  For Bret Stephens, a columnist with the Wall Street Journal, the problem was the Republican Party.  Where had the hawks gone?  The Syria debate, he lamented, was “exposing the isolationist worm eating its way through the GOP apple.”

The Journal’s op-ed page also gave the redoubtable Norman Podhoretz, not only still alive but vigorously kicking, a chance to vent.  Unmasking President Obama as “a left-wing radical” intent on “reduc[ing] the country’s power and influence,” the unrepentant neoconservative accused the president of exploiting the “war-weariness of the American people and the rise of isolationist sentiment… on the left and right” to bring about “a greater diminution of American power than he probably envisaged even in his wildest radical dreams.”

Obama escalated the war in Afghanistan, “got” Osama bin Laden, toppled one Arab dictator in Libya, and bashed and bombed targets in Somalia, Yemen, Pakistan, and elsewhere.  Even so, it turns out he is actually part of the isolationist conspiracy to destroy America!

Over at the New York Times, similar concerns, even if less hysterically expressed, prevailed.  According to Times columnist Roger Cohen, President Obama’s reluctance to pull the trigger showed that he had “deferred to a growing isolationism.”  Bill Keller concurred.  “America is again in a deep isolationist mood.”  In a column entitled, “Our New Isolationism,” he decried “the fears and defeatist slogans of knee-jerk isolationism” that were impeding military action.  (For Keller, the proper antidote to isolationism is amnesia.  As he put it, “Getting Syria right starts with getting over Iraq.”)

For his part, Times staff writer Sam Tanenhaus contributed a bizarre two-minute exercise in video agitprop — complete with faked scenes of the Japanese attacking Pearl Harbor — that slapped the isolationist label on anyone opposing entry into any war whatsoever, or tiring of a war gone awry, or proposing that America go it alone.

When the “New Isolationism” Was New 

Most of this, of course, qualifies as overheated malarkey.  As a characterization of U.S. policy at any time in memory, isolationism is a fiction.  Never really a tendency, it qualifies at most as a moment, referring to that period in the 1930s when large numbers of Americans balked at the prospect of entering another European war, the previous one having fallen well short of its “War To End All Wars” advance billing.

In fact, from the day of its founding down to the present, the United States has never turned its back on the world.  Isolationism owes its storied history to its value as a rhetorical device, deployed to discredit anyone opposing an action or commitment (usually involving military forces) that others happen to favor.  If I, a grandson of Lithuanian immigrants, favor deploying U.S. forces to Lithuania to keep that NATO ally out of Vladimir Putin’s clutches and you oppose that proposition, then you, sir or madam, are an “isolationist.”  Presumably, Jennifer Rubin will see things my way and lend her support to shoring up Lithuania’s vulnerable frontiers.

For this very reason, the term isolationism is not likely to disappear from American political discourse anytime soon.  It’s too useful.  Indeed, employ this verbal cudgel to castigate your opponents and your chances of gaining entrée to the nation’s most prestigious publications improve appreciably.  Warn about the revival of isolationism and your prospects of making the grade as a pundit or candidate for high office suddenly brighten.  This is the great thing about using isolationists as punching bags: it makes actual thought unnecessary.  All that’s required to posture as a font of wisdom is the brainless recycling of clichés, half-truths, and bromides.

No publication is more likely to welcome those clichés, half-truths, and bromides than the New York Times.  There, isolationism always looms remarkably large and is just around the corner.

In July 1942, the New York Times Magazine opened its pages to Vice President Henry A. Wallace, who sounded the alarm about the looming threat of what he styled a “new isolationism.”  This was in the midst of World War II, mind you.

After the previous world war, the vice president wrote, the United States had turned inward.  As summer follows spring, “the choice led up to this present war.”  Repeat the error, Wallace warned, and “the price will be more terrible and will be paid much sooner.”  The world was changing and it was long past time for Americans to get with the program.  “The airplane, the radio, and modern technology have bound the planet so closely together that what happens anywhere on the planet has a direct effect everywhere else.”  In a world that had “suddenly become so small,” he continued, “we cannot afford to resume the role of hermit.”

The implications for policy were self-evident:

“This time, then, we have only one real choice.  We must play a responsible part in the world — leading the way in world progress, fostering a healthy world trade, helping to protect the world’s peace.”

One month later, it was Archibald MacLeish’s turn.  On August 16, 1942, the Times magazine published a long essay of his under the title of — wouldn’t you know it — “The New Isolationism.”  For readers in need of coaching, Times editors inserted this seal of approval before the text: “There is great pertinence in the following article.”

A well-known poet, playwright, and literary gadfly, MacLeish was at the time serving as Librarian of Congress.  From this bully pulpit, he offered the reassuring news that “isolationism in America is dead.”  Unfortunately, like zombies, “old isolationists never really die: they merely dig in their toes in a new position.  And the new position, whatever name is given it, is isolation still.”

Fortunately, the American people were having none of it.  They had “recaptured the current of history and they propose to move with it; they don’t mean to be denied.” MacLeish’s fellow citizens knew what he knew: “that there is a stirring in our world…, a forward thrusting and overflowing human hope of the human will which must be given a channel or it will dig a channel itself.”  In effect, MacLeish was daring the isolationists, in whatever guise, to stand in the way of this forward thrusting and overflowing hopefulness.  Presumably, they would either drown or be crushed.

The end of World War II found the United States donning the mantle of global leadership, much as Wallace, MacLeish, and the Times had counseled.  World peace did not ensue.  Instead, a host of problems continued to afflict the planet, with isolationists time and again fingered as the culprits impeding their solution.

The Gift That Never Stops Giving

In June 1948, with a notable absence of creativity in drafting headlines, the Times once again found evidence of “the new isolationism.”  In an unsigned editorial, the paper charged that an American penchant for hermit-like behavior was “asserting itself again in a manner that is both distressing and baffling.”  With the Cold War fully joined and U.S. forces occupying Germany, Japan, and other countries, the Times worried that some Republicans in Congress appeared reluctant to fund the Marshall Plan.

From their offices in Manhattan, members of the Times editorial board detected in some quarters “a homesickness for the old days.”  It was incumbent upon Americans to understand that “the time is past when we could protect ourselves easily behind our barriers behind the seas.”  History was summoning the United States to lead the world: “The very success of our democracy has now imposed duties upon us which we must fulfill if that democracy is to survive.”  Those entertaining contrary views, the Times huffed, “do not speak for the American people.”

That very month, Josef Stalin announced that the Soviet Union was blockading Berlin.  The U.S. responded not by heading for the exits but by initiating a dramatic airlift.  Oh, and Congress fully funded the Marshall Plan.

Barely a year later, in August 1949, with Stalin having just lifted the Berlin Blockade, Times columnist Arthur Krock discerned another urge to disengage.  In a piece called “Chickens Usually Come Home,” he cited congressional reservations about the recently promulgated Truman Doctrine as evidence of, yes, a “new isolationism.”  As it happened, Congress duly appropriated the money President Truman was requesting to support Greece and Turkey against the threat of communism — as it would support similar requests to throw arms and money at other trouble spots like French Indochina.

Even so, in November of that year, the Times magazine published yet another warning about “the challenge of a new isolationism.”  The author was Illinois Governor Adlai Stevenson, then positioning himself for a White House run.  Like many another would-be candidate before and since, Stevenson took the preliminary step of signaling his opposition to the I-word.

World War II, he wrote, had “not only destroyed fascism abroad, but a lot of isolationist notions here at home.”  War and technological advance had “buried the whole ostrich of isolation.”  At least it should have.  Unfortunately, some Republicans hadn’t gotten the word.  They were “internationally minded in principle but not in practice.”  Stevenson feared that when the chips were down such head-in-the-sand inclinations might come roaring back.  This he was determined to resist.  “The eagle, not the ostrich,” he proclaimed, “is our national emblem.”

In August 1957, the Times magazine was at it once again, opening its pages to another Illinois Democrat, Senator Paul Douglas, for an essay familiarly entitled “A New Isolationism — Ripples or Tide?” Douglas claimed that “a new tide of isolationism is rising in the country.”  U.S. forces remained in Germany and Japan, along with Korea, where they had recently fought a major war.  Even so, the senator worried that “the internationalists are tiring rapidly now.”

Americans needed to fortify themselves by heeding the message of the Gospels: “Let the spirit of the Galilean enter our worldly and power-obsessed hearts.”  In other words, the senator’s prescription for American statecraft was an early version of What Would Jesus Do?  Was Jesus Christ an advocate of American global leadership?  Senator Douglas apparently thought so.

Then came Vietnam.  By May 1970, even Times-men were showing a little of that fatigue.  That month, star columnist James Reston pointed (yet again) to the “new isolationism.”  Yet in contrast to the paper’s scribblings on the subject over the previous three decades, Reston didn’t decry it as entirely irrational.  The war had proven to be a bummer and “the longer it goes on,” he wrote, “the harder it will be to get public support for American intervention.”  Washington, in other words, needed to end its misguided war if it had any hopes of repositioning itself to start the next one.

A Concept Growing Long in the Tooth

By 1980, the Times showed signs of recovering from its brief Vietnam funk.  In a review of Norman Podhoretz’s The Present Danger, for example, the noted critic Anatole Broyard extolled the author’s argument as “dispassionate,” “temperate,” and “almost commonsensical.”

The actual text was none of those things.  What the pugnacious Podhoretz called — get ready for it — “the new isolationism” was, in his words, “hard to distinguish from simple anti-Americanism.”  Isolationists — anyone who had opposed the Vietnam War on whatever grounds — believed that the United States was “a force for evil, a menace, a terror.”  Podhoretz detected a “psychological connection” between “anti-Americanism, isolationism, and the tendency to explain away or even apologize for anything the Soviet Union does, no matter how menacing.”  It wasn’t bad enough that isolationists hated their country; they were, it seems, commie symps to boot.

Fast forward a decade, and — less than three months after U.S. troops invaded Panama — Times columnist Flora Lewis sensed a resurgence of you-know-what.  In a February 1990 column, she described “a convergence of right and left” with both sides “arguing with increasing intensity that it’s time for the U.S. to get off the world.”  Right-wingers saw that world as too nasty to save; left-wingers, the United States as too nasty to save it.  “Both,” she concluded (of course), were “moving toward a new isolationism.”

Five months later, Saddam Hussein sent his troops into Kuwait.  Instead of getting off the world, President George H.W. Bush deployed U.S. combat forces to defend Saudi Arabia.  For Joshua Muravchik, however, merely defending that oil-rich kingdom wasn’t nearly good enough.  Indeed, here was a prime example of the “New Isolationism, Same Old Mistake,” as his Times op-ed was entitled.

The mistake was to flinch from instantly ejecting Saddam’s forces.  Although opponents of a war against Iraq did not “see themselves as isolationists, but as realists,” he considered this a distinction without a difference.  Muravchik, who made his living churning out foreign policy analysis for various Washington think tanks, favored “the principle of investing America’s power in the effort to fashion an environment congenial to our long-term safety.”  War, he firmly believed, offered the means to fashion that congenial environment.  Should America fail to act, he warned, “our abdication will encourage such threats to grow.”

Of course, the United States did act and the threats grew anyway.  In and around the Middle East, the environment continued to be thoroughly uncongenial.  Still, in Times-world, the American penchant for doing too little rather than too much remained the eternal problem, eternally “new.”  An op-ed by up-and-coming journalist James Traub appearing in the Times in December 1991, just months after a half-million U.S. troops had liberated Kuwait, was typical.  Assessing the contemporary political scene, Traub detected “a new wave of isolationism gathering force.”  Traub was undoubtedly establishing his bona fides.  (Soon after, he landed a job working for the paper.)

This time, according to Traub, the problem was the Democrats.  No longer “the party of Wilson or of John F. Kennedy,” Democrats, he lamented, “aspire[d] to be the party of middle-class frustrations — and if that entails turning your back on the world, so be it.”  The following year Democrats nominated as their presidential candidate Bill Clinton, who insisted that he would never under any circumstances turn his back on the world.  Even so, no sooner did Clinton win than Times columnist Leslie Gelb was predicting that the new president would “fall into the trap of isolationism and policy passivity.”

Get Me Rewrite!

Arthur Schlesinger defined the problem in broader terms.  The famous historian and Democratic Party insider had weighed in early on the matter with a much-noted essay that appeared in The Atlantic Monthly back in 1952.  He called it — you guessed it — “The New Isolationism.”

In June 1994, more than 40 years later, with the Cold War now finally won, Schlesinger was back for more with a Times op-ed that sounded the usual alarm.  “The Cold War produced the illusion that traditional isolationism was dead and buried,” he wrote, but of course — this is, after all, the Times — it was actually alive and kicking.  The passing of the Cold War had “weakened the incentives to internationalism” and was giving isolationists a new opening, even though in “a world of law requiring enforcement,” it was incumbent upon the United States to be the lead enforcer.

The warning resonated.  Although the Times does not normally give commencement addresses much attention, it made an exception for Madeleine Albright’s remarks to graduating seniors at Barnard College in May 1995.  The U.S. ambassador to the United Nations had detected what she called “a trend toward isolationism that is running stronger in America than at any time since the period between the two world wars,” and the American people were giving in to the temptation “to pull the covers up over our heads and pretend we do not notice, do not care, and are unaffected by events overseas.”  In other circumstances in another place, it might have seemed an odd claim, given that the United States had just wrapped up armed interventions in Somalia and Haiti and was on the verge of initiating a bombing campaign in the Balkans.

Still, Schlesinger had Albright’s back.  The July/August 1995 issue of Foreign Affairs prominently featured an article of his entitled “Back to the Womb?  Isolationism’s Renewed Threat,” with Times editors publishing a CliffsNotes version on the op-ed page a month earlier.  “The isolationist impulse has risen from the grave,” Schlesinger announced, “and it has taken the new form of unilateralism.” 

His complaint was no longer that the United States hesitated to act, but that it did not act in concert with others.  This “neo-isolationism,” he warned, introducing a new note into the tradition of isolationism-bashing for the first time in decades, “promises to prevent the most powerful nation on the planet from playing any role in enforcing the peace system.”  The isolationists were winning — this time through pure international belligerence.  Yet “as we return to the womb,” Schlesinger warned his fellow citizens, “we are surrendering a magnificent dream.”

Other Times contributors shared Schlesinger’s concern.  On January 30, 1996, the columnist Russell Baker chipped in with a piece called “The New Isolationism.”  For those slow on the uptake, Jessica Mathews, then a fellow at the Council on Foreign Relations, affirmed Baker’s concerns by publishing an identically titled column in the Washington Post a mere six days later.  Mathews reported “troubling signs that the turning inward that many feared would follow the Cold War’s end is indeed happening.”  With both the Times and the Post concurring, “the new isolationism” had seemingly reached pandemic proportions (as a title, if nothing else).

Did the “new” isolationism then pave the way for 9/11?  Was al-Qaeda inspired by an unwillingness on Washington’s part to insert itself into the Islamic world?

Unintended and unanticipated consequences stemming from prior U.S. interventions might have seemed to offer a better explanation.  But this much is for sure:  as far as the Times was concerned, even in the midst of George W. Bush’s Global War on Terror, the threat of isolationism persisted.

In January 2004, David M. Malone, president of the International Peace Academy, worried in a Times op-ed “that the United States is retracting into itself” — this despite the fact that U.S. forces were engaged in simultaneous wars in Iraq and Afghanistan.  Among Americans, a concern about terrorism, he insisted, was breeding “a sense of self-obsession and indifference to the plight of others.”  “When Terrorists Win: Beware America’s New Isolationism,” blared the headline of Malone’s not-so-new piece.

Actually, Americans should beware those who conjure up phony warnings of a “new isolationism” to advance a particular agenda.  The essence of that agenda, whatever the particulars and however packaged, is this: If the United States just tries a little bit harder — one more intervention, one more shipment of arms to a beleaguered “ally,” one more line drawn in the sand — we will finally turn the corner and the bright uplands of peace and freedom will come into view.

This is a delusion, of course.  But if you write a piece exposing that delusion, don’t bother submitting it to the Times.

Andrew J. Bacevich is a professor of history and international relations at Boston University.  His new book is Breach of Trust:  How Americans Failed Their Soldiers and Their Country.

Copyright 2013 Andrew Bacevich

Sometimes history happens at the moment when no one is looking.  On weekends in late August, the president of the United States ought to be playing golf or loafing at Camp David, not making headlines.  Yet Barack Obama chose Labor Day weekend to unveil arguably the most consequential foreign policy shift of his presidency.

In an announcement that surprised virtually everyone, the president told his countrymen and the world that he was putting on hold the much-anticipated U.S. attack against Syria.  Obama hadn’t, he assured us, changed his mind about the need and justification for punishing the Syrian government for its probable use of chemical weapons against its own citizens.  In fact, only days before, administration officials had been claiming that, if necessary, the U.S. would “go it alone” in punishing Bashar al-Assad’s regime for its bad behavior.  Now, however, Obama announced that, as the chief executive of “the world’s oldest constitutional democracy,” he had decided to seek Congressional authorization before proceeding.

Obama thereby brought to a screeching halt a process extending back over six decades in which successive inhabitants of the Oval Office had arrogated to themselves (or had thrust upon them) ever wider prerogatives in deciding when and against whom the United States should wage war.  Here was one point on which every president from Harry Truman to George W. Bush had agreed: on matters related to national security, the authority of the commander-in-chief has no fixed limits.  When it comes to keeping the country safe and securing its vital interests, presidents can do pretty much whatever they see fit.

Here, by no means incidentally, lies the ultimate source of the stature and prestige that defines the imperial presidency and thereby shapes (or distorts) the American political system.  Sure, the quarters at 1600 Pennsylvania Avenue are classy, but what really endowed the postwar presidency with its singular aura were the missiles, bombers, and carrier battle groups that responded to the commands of one man alone.  What’s the bully pulpit in comparison to having the 82nd Airborne and SEAL Team Six at your beck and call?

Now, in effect, Obama was saying to Congress: I’m keen to launch a war of choice.  But first I want you guys to okay it.  In politics, where voluntarily forfeiting power is an unnatural act, Obama’s invitation qualifies as beyond unusual.  Whatever the calculations behind his move, its effect rates somewhere between unprecedented and positively bizarre — the heir to imperial prerogatives acting, well, decidedly unimperial.

Obama is a constitutional lawyer, of course, and it’s pleasant to imagine that he acted out of due regard for what Article 1, Section 8, of that document plainly states, namely that “the Congress shall have power…  to declare war.”  Take his explanation at face value and the president’s decision ought to earn plaudits from strict constructionists across the land.  The Federalist Society should offer Obama an honorary lifetime membership.

Of course, seasoned political observers, understandably steeped in cynicism, dismissed the president’s professed rationale out of hand and immediately began speculating about his actual motivation.  The most popular explanation was this: having painted himself into a corner, Obama was trying to lure members of the legislative branch into joining him there.  Rather than a belated conversion experience, the president’s literal reading of the Constitution actually amounted to a sneaky political ruse.

After all, the president had gotten himself into a pickle by declaring back in August 2012 that any use of chemical weapons by the government of Bashar al-Assad would cross a supposedly game-changing “red line.”  When the Syrians (apparently) called his bluff, Obama found himself facing uniformly unattractive military options that ranged from the patently risky — joining forces with the militants intent on toppling Assad — to the patently pointless — firing a “shot across the bow” of the Syrian ship of state.

Meanwhile, the broader American public, awakening from its summertime snooze, was demonstrating remarkably little enthusiasm for yet another armed intervention in the Middle East.  Making matters worse still, U.S. military leaders and many members of Congress, Republican and Democratic alike, were expressing serious reservations or actual opposition. Press reports even cited leaks by unnamed officials who characterized the intelligence linking Assad to the chemical attacks as no “slam dunk,” a painful reminder of how bogus information had paved the way for the disastrous and unnecessary Iraq War.  For the White House, even a hint that Obama in 2013 might be replaying the Bush scenario of 2003 was anathema.

The president also discovered that recruiting allies to join him in this venture was proving a hard sell.  It wasn’t just the Arab League’s refusal to give an administration strike against Syria its seal of approval, although that was bad enough.  Jordan’s King Abdullah, America’s “closest ally in the Arab world,” publicly announced that he favored talking to Syria rather than bombing it.  As for Iraq, that previous beneficiary of American liberation, its government was refusing even to allow U.S. forces access to its airspace.  Ingrates!

For Obama, the last straw may have come when America’s most reliable (not to say subservient) European partner refused to enlist in yet another crusade to advance the cause of peace, freedom, and human rights in the Middle East.  With memories of Tony and George W. apparently eclipsing those of Winston and Franklin, the British Parliament rejected Prime Minister David Cameron’s attempt to position the United Kingdom alongside the United States.  Parliament’s vote dashed Obama’s hopes of forging a coalition of two and so investing a war of choice against Syria with at least a modicum of legitimacy.

When it comes to actual military action, only France still entertains the possibility of making common cause with the United States.  Yet the number of Americans taking assurance from this prospect approximates the number who know that Bernard-Henri Lévy isn’t a celebrity chef.

John F. Kennedy once remarked that victory has a hundred fathers and defeat is an orphan.  Here was a war bereft of parents even before it had begun.

Whether or Not to Approve the War for the Greater Middle East

Still, whether high-minded constitutional considerations or diabolically clever political machinations motivated the president may matter less than what happens next.  Obama lobbed the ball into Congress’s end of the court.  What remains to be seen is how the House and the Senate, just now coming back into session, will respond.

At least two possibilities exist, one with implications that could prove profound and the second holding the promise of being vastly entertaining.

On the one hand, Obama has implicitly opened the door for a Great Debate regarding the trajectory of U.S. policy in the Middle East.  Although a week or ten days from now the Senate and House of Representatives will likely be voting to approve or reject some version of an Authorization for the Use of Military Force (AUMF), at stake is much more than the question of what to do about Syria.  The real issue — Americans should hope that the forthcoming congressional debate makes this explicit — concerns the advisability of continuing to rely on military might as the preferred means of advancing U.S. interests in this part of the world.

Appreciating the actual stakes requires putting the present crisis in a broader context.  Herewith an abbreviated history lesson.

Back in 1980, President Jimmy Carter announced that the United States would employ any means necessary to prevent a hostile power from gaining control of the Persian Gulf.  In retrospect, it’s clear enough that the promulgation of the so-called Carter Doctrine amounted to a de facto presidential “declaration” of war (even if Carter himself did not consciously intend to commit the United States to perpetual armed conflict in the region).  Certainly, what followed was a never-ending sequence of wars and war-like episodes.  Although the Congress never formally endorsed Carter’s declaration, it tacitly acceded to all that his commitment subsequently entailed.

Relatively modest in its initial formulation, the Carter Doctrine quickly metastasized.  Geographically, it grew far beyond the bounds of the Persian Gulf, eventually encompassing virtually all of the Islamic world.  Washington’s own ambitions in the region also soared.  Rather than merely preventing a hostile power from achieving dominance in the Gulf, the United States was soon seeking to achieve dominance itself.  Dominance — that is, shaping the course of events to Washington’s liking — was said to hold the key to maintaining stability, ensuring access to the world’s most important energy reserves, checking the spread of Islamic radicalism, combating terrorism, fostering Israel’s security, and promoting American values.  Through the adroit use of military might, dominance actually seemed plausible.  (So at least Washington persuaded itself.)

What this meant in practice was the wholesale militarization of U.S. policy toward the Greater Middle East in a period in which Washington’s infatuation with military power was reaching its zenith.  As the Cold War wound down, the national security apparatus shifted its focus from defending Germany’s Fulda Gap to projecting military power throughout the Islamic world.  In practical terms, this shift found expression in the creation of Central Command (CENTCOM), reconfigured forces, and an eternal round of contingency planning, war plans, and military exercises in the region.  To lay the basis for the actual commitment of troops, the Pentagon established military bases, stockpiled matériel in forward locations, and negotiated transit rights.  It also courted and armed proxies.  In essence, the Carter Doctrine provided the Pentagon (along with various U.S. intelligence agencies) with a rationale for honing and then exercising new capabilities.

Capabilities expanded the range of policy options.  Options offered opportunities to “do something” in response to crisis.  From the Reagan era on, policymakers seized upon those opportunities with alacrity.  A seemingly endless series of episodes and incidents ensued, as U.S. forces, covert operatives, or proxies engaged in hostile actions (often on multiple occasions) in Lebanon, Libya, Iran, Somalia, Bosnia, Kosovo, Saudi Arabia, the Sudan, Yemen, Pakistan, the southern Philippines, and in the Persian Gulf itself, not to mention Iraq and Afghanistan.  Consider them all together and what you have is a War for the Greater Middle East, pursued by the United States for over three decades now.  If Congress gives President Obama the green light, Syria will become the latest front in this ongoing enterprise.

Profiles in Courage? If Only

A debate over the Syrian AUMF should encourage members of Congress — if they’ve got the guts — to survey this entire record of U.S. military activities in the Greater Middle East going back to 1980.  To do so means almost unavoidably confronting this simple question: How are we doing?  To state the matter directly, all these years later, given all the ordnance expended, all the toing-and-froing of U.S. forces, and all the lives lost or shattered along the way, is mission accomplishment anywhere in sight?  Or have U.S. troops — the objects of such putative love and admiration on the part of the American people — been engaged over the past 30-plus years in a fool’s errand?  How members cast their votes on the Syrian AUMF will signal their answer — and by extension the nation’s answer — to that question.

To okay an attack on Syria will, in effect, reaffirm the Carter Doctrine and put a stamp of congressional approval on the policies that got us where we are today.  A majority vote in favor of the Syrian AUMF will sustain and probably deepen Washington’s insistence that the resort to violence represents the best way to advance U.S. interests in the Islamic world.  From this perspective, all we need to do is try harder and eventually we’ll achieve a favorable outcome.  With Syria presumably the elusive but never quite attained turning point, the Greater Middle East will stabilize.  Democracy will flourish.  And the United States will bask in the appreciation of those we have freed from tyranny.

To vote against the AUMF, on the other hand, will draw a red line of much greater significance than the one that President Obama himself so casually laid down.  Should the majority in either House reject the Syrian AUMF, the vote will call into question the continued viability of the Carter Doctrine and all that followed in its wake.

It will create space to ask whether having another go is likely to produce an outcome any different from what the United States has achieved in the myriad places throughout the Greater Middle East where U.S. forces (or covert operatives) have, whatever their intentions, spent the past several decades wreaking havoc and sowing chaos under the guise of doing good.  Instead of offering more of the same – does anyone seriously think that ousting Assad will transform Syria into an Arab Switzerland? — rejecting the AUMF might even invite the possibility of charting an altogether different course, entailing perhaps a lower military profile and greater self-restraint.

What a stirring prospect!  Imagine members of Congress setting aside partisan concerns to debate first-order questions of policy.  Imagine them putting the interests of the country in front of their own worries about winning reelection or pursuing their political ambitions.  It would be like Lincoln vs. Douglas or Woodrow Wilson vs. Henry Cabot Lodge.  Call Doris Kearns Goodwin.  Call Spielberg or Sorkin.  Get me Capra, for God’s sake.  We’re talking high drama of blockbuster proportions.

On the other hand, given the record of the recent past, we should hardly discount the possibility that our legislative representatives will not rise to the occasion.  Invited by President Obama to share in the responsibility for deciding whether and where to commit acts of war, one or both Houses — not known these days for displaying either courage or responsibility — may choose instead to punt.

As we have learned by now, the possible ways for Congress to shirk its duty are legion.  In this instance, all are likely to begin with the common supposition that nothing’s at stake here except responding to Assad’s alleged misdeeds.  To refuse to place the Syrian crisis in any larger context is, of course, a dodge.  Yet that dodge creates multiple opportunities for our elected representatives to let themselves off the hook.

Congress could, for example, pass a narrowly drawn resolution authorizing Obama to fire his “shot across the bow” and no more.  In other words, it could basically endorse the president’s inclination to substitute gesture for policy.

Or it could approve a broadly drawn, but vacuous resolution, handing the president a blank check.  Ample precedent exists for that approach, since it more or less describes what Congress did in 1964 with the Tonkin Gulf Resolution, opening the way to presidential escalation in Vietnam, or with the AUMF it passed in the immediate aftermath of 9/11, giving George W. Bush’s administration permission to do more or less anything it wanted to just about anyone.

Even more irresponsibly, Congress could simply reject any Syrian AUMF, however worded, without identifying a plausible alternative to war, in effect washing its hands of the matter and creating a policy vacuum.

Will members of the Senate and the House grasp the opportunity to undertake an urgently needed reassessment of America’s War for the Greater Middle East?  Or, wriggling and squirming, will they inelegantly sidestep the issue, opting for short-term expediency in place of serious governance?  In an age when the numbing blather of McCain, McConnell, and Reid has replaced the oratory of Clay, Calhoun, and Webster, merely to pose the question is to answer it.

But let us not overlook the entertainment value of such an outcome, which could well be formidable.  In all likelihood, high comedy Washington-style lurks just around the corner.  So renew that subscription to The Onion.  Keep an eye on Doonesbury.  Set the TiVo to record Jon Stewart.  This is going to be really funny — and utterly pathetic.  Where’s H.L. Mencken when we need him?

Andrew J. Bacevich is a professor of history and international relations at Boston University.  He is the author of the new book, Breach of Trust: How Americans Failed Their Soldiers and Their Country (Metropolitan Books).

Copyright 2013 Andrew Bacevich

The Hill to the Rescue on Syria?

For well over a decade now the United States has been “a nation at war.” Does that war have a name?

It did at the outset.  After 9/11, George W. Bush’s administration wasted no time in announcing that the U.S. was engaged in a Global War on Terrorism, or GWOT.  With few dissenters, the media quickly embraced the term. The GWOT promised to be a gargantuan, transformative enterprise. The conflict begun on 9/11 would define the age. In neoconservative circles, it was known as World War IV.

Upon succeeding to the presidency in 2009, however, Barack Obama without fanfare junked Bush’s formulation (as he did again in a speech at the National Defense University last week).  Yet if the appellation went away, the conflict itself, shorn of identifying marks, continued.

Does it matter that ours has become and remains a nameless war? Very much so.

Names bestow meaning.  When it comes to war, a name attached to a date can shape our understanding of what the conflict was all about.  To specify when a war began and when it ended is to privilege certain explanations of its significance while discrediting others. Let me provide a few illustrations.

With rare exceptions, Americans today characterize the horrendous fraternal bloodletting of 1861-1865 as the Civil War.  Yet not many decades ago, diehard supporters of the Lost Cause insisted on referring to that conflict as the War Between the States or the War for Southern Independence (or even the War of Northern Aggression).  The South may have gone down in defeat, but the purposes for which Southerners had fought — preserving a distinctive way of life and the principle of states’ rights — had been worthy, even noble.  So at least they professed to believe, with their preferred names for the war reflecting that belief.

Schoolbooks tell us that the Spanish-American War began in April 1898 and ended in August of that same year.  The name and dates fit nicely with a widespread inclination from President William McKinley’s day to our own to frame U.S. intervention in Cuba as an altruistic effort to liberate that island from Spanish oppression.

Yet the Cubans were not exactly bystanders in that drama.  By 1898, they had been fighting for years to oust their colonial overlords.  And although hostilities in Cuba itself ended on August 12th, they dragged on in the Philippines, another Spanish colony that the United States had seized for reasons only remotely related to liberating Cubans.  Notably, U.S. troops occupying the Philippines waged a brutal war not against Spaniards but against Filipino nationalists no more inclined to accept colonial rule by Washington than by Madrid.  So widen the aperture to include this Cuban prelude and the Filipino postlude and you end up with something like this:  The Spanish-American-Cuban-Philippines War of 1895-1902.  Too clunky?  How about the War for the American Empire?  This much is for sure: rather than illuminating, the commonplace textbook descriptor serves chiefly to conceal.

Strange as it may seem, Europeans once referred to the calamitous events of 1914-1918 as the Great War.  When Woodrow Wilson decided in 1917 to send an army of doughboys to fight alongside the Allies, he went beyond Great.  According to the president, the Great War was going to be the War To End All Wars.  Alas, things did not pan out as he expected.  Perhaps anticipating the demise of his vision of permanent peace, War Department General Order 115, issued on October 7, 1919, formally declared that, at least as far as the United States was concerned, the recently concluded hostilities would be known simply as the World War.

In September 1939 — presto chango! — the World War suddenly became the First World War, the Nazi invasion of Poland having inaugurated a Second World War, also known as World War II or more cryptically WWII.  To be sure, Soviet dictator Josef Stalin preferred the Great Patriotic War.  Although this found instant — almost unanimous — favor among Soviet citizens, it did not catch on elsewhere.

Does World War II accurately capture the events it purports to encompass?  With the crusade against the Axis now ranking alongside the crusade against slavery as a myth-enshrouded chapter in U.S. history to which all must pay homage, Americans are no more inclined to consider that question than to consider why a playoff to determine the professional baseball championship of North America constitutes a “World Series.”

In fact, however convenient and familiar, World War II is misleading and not especially useful.  The period in question saw at least two wars, each only tenuously connected to the other, each having distinctive origins, each yielding a different outcome.  To separate them is to transform the historical landscape.

On the one hand, there was the Pacific War, pitting the United States against Japan.  Formally initiated by the December 7, 1941, attack on Pearl Harbor, it had in fact begun a decade earlier when Japan embarked upon a policy of armed conquest in Manchuria.  At stake was the question of who would dominate East Asia.  Japan’s crushing defeat at the hands of the United States, sealed by two atomic bombs in 1945, answered that question (at least for a time).

Then there was the European War, pitting Nazi Germany first against Great Britain and France, but ultimately against a grand alliance led by the United States, the Soviet Union, and a fast fading British Empire.  At stake was the question of who would dominate Europe.  Germany’s defeat resolved that issue (at least for a time): no one would.  To prevent any single power from controlling Europe, two outside powers divided it.

This division served as the basis for the ensuing Cold War, which wasn’t actually cold, but also (thankfully) wasn’t World War III, the retrospective insistence of bellicose neoconservatives notwithstanding.  But when did the Cold War begin?  Was it in early 1947, when President Harry Truman decided that Stalin’s Russia posed a looming threat and committed the United States to a strategy of containment?  Or was it in 1919, when Vladimir Lenin decided that Winston Churchill’s vow to “strangle Bolshevism in its cradle” posed a looming threat to the Russian Revolution, with an ongoing Anglo-American military intervention evincing a determination to make good on that vow?

Separating the war against Nazi Germany from the war against Imperial Japan opens up another interpretive possibility.  If you incorporate the European conflict of 1914-1918 and the European conflict of 1939-1945 into a single narrative, you get a Second Thirty Years War (the first having occurred from 1618-1648) — not so much a contest of good against evil, as a mindless exercise in self-destruction that represented the ultimate expression of European folly.

So, yes, it matters what we choose to call the military enterprise we’ve been waging not only in Iraq and Afghanistan, but also in any number of other countries scattered hither and yon across the Islamic world.  Although the Obama administration appears no more interested than the Bush administration in saying when that enterprise will actually end, the date we choose as its starting point also matters.

Although Washington seems in no hurry to name its nameless war — and will no doubt settle on something self-serving or anodyne if it ever finally addresses the issue — perhaps we should jump-start the process.  Let’s consider some possible options, names that might actually explain what’s going on.

The Long War: Coined not long after 9/11 by senior officers in the Pentagon, this formulation never gained traction with either civilian officials or the general public.  Yet the Long War deserves consideration, even though — or perhaps because — it has lost its luster with the passage of time.

At the outset, it connoted grand ambitions buoyed by extreme confidence in the efficacy of American military might.  This was going to be one for the ages, a multi-generational conflict yielding sweeping results.

The Long War did begin on a hopeful note.  The initial entry into Afghanistan and then into Iraq seemed to herald “home by Christmas” triumphal parades.  Yet this soon proved an illusion as victory slipped from Washington’s grasp.  By 2005 at the latest, events in the field had dashed the neo-Wilsonian expectations nurtured back home.

With the conflicts in Iraq and Afghanistan dragging on, “long” lost its original connotation.  Instead of “really important,” it became a synonym for “interminable.”  Today, the Long War does succinctly capture the experience of American soldiers who have endured multiple combat deployments to Iraq and Afghanistan.

For Long War combatants, the object of the exercise has become to persist.  As for winning, it’s not in the cards. The Long War just might conclude by the end of 2014 if President Obama keeps his pledge to end the U.S. combat role in Afghanistan and if he avoids getting sucked into Syria’s civil war.  So the troops may hope.

The War Against Al-Qaeda: It began in August 1996 when Osama bin Laden issued a “Declaration of War against the Americans Occupying the Land of the Two Holy Places,” i.e., Saudi Arabia.  In February 1998, a second bin Laden manifesto announced that killing Americans, military and civilian alike, had become “an individual duty for every Muslim who can do it in any country in which it is possible to do it.”

Although President Bill Clinton took notice, the U.S. response to bin Laden’s provocations was limited and ineffectual.  Only after 9/11 did Washington take this threat seriously.  Since then, apart from a pointless excursion into Iraq (where, in Saddam Hussein’s day, al-Qaeda did not exist), U.S. attention has been focused on Afghanistan, where U.S. troops have waged the longest war in American history, and on Pakistan’s tribal borderlands, where a CIA drone campaign is ongoing.  By the end of President Obama’s first term, U.S. intelligence agencies were reporting that a combined CIA/military campaign had largely destroyed bin Laden’s organization.  Bin Laden himself, of course, was dead.

Could the United States have declared victory in its unnamed war at this point?  Perhaps, but it gave little thought to doing so.  Instead, the national security apparatus had already trained its sights on various al-Qaeda “franchises” and wannabes, militant groups claiming the bin Laden brand and waging their own version of jihad.  These offshoots emerged in the Maghreb, Yemen, Somalia, Nigeria, and — wouldn’t you know it — post-Saddam Iraq, among other places.  The question as to whether they actually posed a danger to the United States got, at best, passing attention — the label “al-Qaeda” eliciting the same sort of Pavlovian response that the word “communist” once did.

Americans should not expect this war to end anytime soon.  Indeed, the Pentagon’s impresario of special operations recently speculated — by no means unhappily — that it would continue globally for “at least 10 to 20 years.”   Freely translated, his statement undoubtedly means: “No one really knows, but we’re planning to keep at it for one helluva long time.”

The War For/Against/About Israel: It began in 1948.  For many Jews, the founding of the state of Israel signified an ancient hope fulfilled.  For many Christians, conscious of the sin of anti-Semitism that had culminated in the Holocaust, it offered a way to ease guilty consciences, albeit mostly at others’ expense.  For many Muslims, especially Arabs, and most acutely Arabs who had been living in Palestine, the founding of the Jewish state represented a grave injustice.  It was yet another unwelcome intrusion engineered by the West — colonialism by another name.

Recounting the ensuing struggle without appearing to take sides is almost impossible.  Yet one thing seems clear: in terms of military involvement, the United States attempted in the late 1940s and 1950s to keep its distance.  Over the course of the 1960s, this changed.  The U.S. became Israel’s principal patron, committed to maintaining (and indeed increasing) its military superiority over its neighbors.

In the decades that followed, the two countries forged a multifaceted “strategic relationship.”  A compliant Congress provided Israel with weapons and other assistance worth many billions of dollars, testifying to what has become an unambiguous and irrevocable U.S. commitment to the safety and well-being of the Jewish state.  The two countries share technology and intelligence.  Meanwhile, just as Israel had disregarded U.S. concerns when it came to developing nuclear weapons, it ignored persistent U.S. requests that it refrain from colonizing territory that it has conquered.

When it comes to identifying the minimal essential requirements of Israeli security and the terms that will define any Palestinian-Israeli peace deal, the United States defers to Israel.  That may qualify as an overstatement, but only slightly.  Given the Israeli perspective on those requirements and those terms — permanent military supremacy and a permanently demilitarized Palestine allowed limited sovereignty — the War For/Against/About Israel is unlikely to end anytime soon either.  Whether the United States benefits from the perpetuation of this war is difficult to say, but we are in it for the long haul.

The War for the Greater Middle East: I confess that this is the name I would choose for Washington’s unnamed war and is, in fact, the title of a course I teach.  (A tempting alternative is the Second Hundred Years War, the “first” having begun in 1337 and ended in 1453.)

This war is about to hit the century mark, its opening chapter coinciding with the onset of World War I.  Not long after the fighting on the Western Front in Europe had settled into a stalemate, the British government, looking for ways to gain the upper hand, set out to dismantle the Ottoman Empire whose rulers had foolishly thrown in their lot with the German Reich against the Allies.

By the time the war ended with Germany and the Turks on the losing side, Great Britain had already begun to draw up new boundaries, invent states, and install rulers to suit its predilections, while also issuing mutually contradictory promises to groups inhabiting these new precincts of its empire.  Toward what end?  Simply put, the British were intent on calling the shots from Egypt to India, whether by governing through intermediaries or ruling directly.  The result was a new Middle East and a total mess.

London presided over this mess, albeit with considerable difficulty, until the end of World War II.  At this point, by abandoning efforts to keep Arabs and Zionists from one another’s throats in Palestine and by accepting the partition of India, the British signaled their intention to throw in the towel. Alas, Washington proved more than willing to assume Britain’s role.  The lure of oil was strong.  So too were the fears, however overwrought, of the Soviets extending their influence into the region.

Unfortunately, the Americans enjoyed no more success in promoting long-term, pro-Western stability than had the British.  In some respects, they only made things worse, with the joint CIA-MI6 overthrow of a democratically elected government in Iran in 1953 offering a prime example of a “success” that, to this day, has never stopped breeding disaster.

Only after 1980 did things get really interesting, however.  The Carter Doctrine promulgated that year designated the Persian Gulf a vital national security interest and opened the door to greatly increased U.S. military activity not just in the Gulf, but also throughout the Greater Middle East (GME).  Between 1945 and 1980, considerable numbers of American soldiers lost their lives fighting in Asia and elsewhere.  During that period, virtually none were killed fighting in the GME.  Since 1990, in contrast, virtually none have been killed fighting anywhere except in the GME.

What does the United States hope to achieve in its inherited and unending War for the Greater Middle East?  To pacify the region?  To remake it in our image?  To drain its stocks of petroleum?  Or just to keep the lid on?  However you define the war’s aims, things have not gone well, which once again suggests that, in some form, it will continue for some time to come.  If there’s any good news here, it’s the prospect of having ever more material for my seminar, which may soon expand into a two-semester course.

The War Against Islam: This war began nearly 1,000 years ago and continued for centuries, a storied collision between Christendom and the Muslim ummah.  For a couple of hundred years, periodic eruptions of large-scale violence occurred until the conflict finally petered out with the last crusade sometime in the fourteenth century.

In those days, many people had deemed religion something worth fighting for, a proposition to which the more sophisticated present-day inhabitants of Christendom no longer subscribe.  Yet could that religious war have resumed in our own day?  Professor Samuel Huntington thought so, although he styled the conflict a “clash of civilizations.”  Some militant radical Islamists agree with Professor Huntington, citing as evidence the unwelcome meddling of “infidels,” mostly wearing American uniforms, in various parts of the Muslim world.  Some militant evangelical Christians endorse this proposition, even if they take a more favorable view of U.S. troops occupying and drones targeting Muslim countries.

In explaining the position of the United States government, religious scholars like George W. Bush and Barack (Hussein!) Obama have consistently expressed a contrary view.  Islam is a religion of peace, they declare, part of the great Abrahamic triad.  That the other elements of that triad are likewise committed to peace is a proposition that Bush, Obama, and most Americans take for granted, evidence not required.  There should be no reason why Christians, Jews, and Muslims can’t live together in harmony.

Still, remember back in 2001 when, in an unscripted moment, President Bush described the war barely begun as a “crusade”?  That was just a slip of the tongue, right?  If not, we just might end up calling this one the Eternal War.

Andrew J. Bacevich is a professor of history and international relations at Boston University and a TomDispatch regular. His next book, Breach of Trust: How Americans Failed Their Soldiers and Their Country, will appear in September.

Copyright 2013 Andrew J. Bacevich

Naming Our Nameless War

First came the hullaballoo over the “Mosque at Ground Zero.” Then there was Pastor Terry Jones of Gainesville, Florida, grabbing headlines as he promoted “International Burn-a-Koran Day.” Most recently, we have an American posting a slanderous anti-Muslim video on the Internet with all the ensuing turmoil.

Throughout, the official U.S. position has remained fixed: the United States government condemns Islamophobia. Americans respect Islam as a religion of peace. Incidents suggesting otherwise are the work of a tiny minority — whackos, hatemongers, and publicity-seekers. Among Muslims from Benghazi to Islamabad, the argument has proven to be a tough sell.

And not without reason: although it might be comforting to dismiss anti-Islamic outbursts in the U.S. as the work of a few fanatics, the picture is actually far more complicated. Those complications in turn help explain why religion, once considered a foreign policy asset, has in recent years become a net liability.

Let’s begin with a brief history lesson. From the late 1940s to the late 1980s, when Communism provided the overarching ideological rationale for American globalism, religion figured prominently as a theme of U.S. foreign policy. Communist antipathy toward religion helped invest the Cold War foreign policy consensus with its remarkable durability. That Communists were godless sufficed to place them beyond the pale. For many Americans, the Cold War derived its moral clarity from the conviction that here was a contest pitting the God-fearing against the God-denying. Since we were on God’s side, it appeared axiomatic that God should repay the compliment.

From time to time during the decades when anti-Communism provided so much of the animating spirit of U.S. policy, Judeo-Christian strategists in Washington (not necessarily believers themselves), drawing on the theologically correct proposition that Christians, Jews, and Muslims all worship the same God, sought to enlist Muslims, sometimes of fundamentalist persuasions, in the cause of opposing the godless. One especially notable example was the Soviet-Afghan War of 1979-1989. To inflict pain on the Soviet occupiers, the United States threw its weight behind the Afghan resistance, styled in Washington as “freedom fighters,” and funneled aid (via the Saudis and the Pakistanis) to the most religiously extreme among them. When this effort resulted in a massive Soviet defeat, the United States celebrated its support for the Afghan Mujahedeen as evidence of strategic genius. It was almost as if God had rendered a verdict.

Yet not so many years after the Soviets withdrew in defeat, the freedom fighters morphed into the fiercely anti-Western Taliban, providing sanctuary to al-Qaeda as it plotted — successfully — to attack the United States. Clearly, this was a monkey wrench thrown into God’s plan.

With the launching of the Global War on Terrorism, Islamism succeeded Communism as the body of beliefs that, if left unchecked, threatened to sweep across the globe with dire consequences for freedom. Those whom Washington had armed as “freedom fighters” now became America’s most dangerous enemies. So at least members of the national security establishment believed or purported to believe, thereby curtailing any further discussion of whether militarized globalism actually represented the best approach to promoting liberal values globally or even served U.S. interests.

Yet as a rallying cry, a war against Islamism presented difficulties right from the outset. As much as policymakers struggled to prevent Islamism from merging in the popular mind with Islam itself, significant numbers of Americans — whether genuinely fearful or mischief-minded — saw this as a distinction without a difference. Efforts by the Bush administration to work around this problem by framing the post-9/11 threat under the rubric of “terrorism” ultimately failed because that generic term offered no explanation for motive. However the administration twisted and turned, motive in this instance seemed bound up with matters of religion.

Where exactly to situate God in post-9/11 U.S. policy posed a genuine challenge for policymakers, not least of all for George W. Bush, who believed, no doubt sincerely, that God had chosen him to defend America in its time of maximum danger. Unlike the communists, far from denying God’s existence, Islamists embrace God with startling ferocity. Indeed, in their vitriolic denunciations of the United States and in perpetrating acts of anti-American violence, they audaciously present themselves as nothing less than God’s avenging agents. In confronting the Great Satan, they claim to be doing God’s will.

Waging War in Jesus’s Name

This debate over who actually represents God’s will is one that the successive administrations of George W. Bush and Barack Obama have studiously sought to avoid. The United States is not at war with Islam per se, U.S. officials insist. Still, among Muslims abroad, Washington’s repeated denials notwithstanding, suspicion persists and not without reason.

Consider the case of Lieutenant General William G. (“Jerry”) Boykin. While still on active duty in 2002, this highly decorated Army officer spoke in uniform at a series of some 30 church gatherings during which he offered his own response to President Bush’s famous question: “Why do they hate us?” The general’s perspective differed markedly from his commander-in-chief’s: “The answer to that is because we’re a Christian nation. We are hated because we are a nation of believers.”

On another such occasion, the general recalled his encounter with a Somali warlord who claimed to enjoy Allah’s protection. The warlord was deluding himself, Boykin declared, and was sure to get his comeuppance: “I knew that my God was bigger than his. I knew that my God was a real God and his was an idol.” As a Christian nation, Boykin insisted, the United States would succeed in overcoming its adversaries only if “we come against them in the name of Jesus.”

When Boykin’s remarks caught the attention of the mainstream press, denunciations rained down from on high, as the White House, the State Department, and the Pentagon hastened to disassociate the government from the general’s views. Yet subsequent indicators suggest that, however crudely, Boykin was indeed expressing perspectives shared by more than a few of his fellow citizens.

One such indicator came immediately: despite the furor, the general kept his important Pentagon job as deputy undersecretary of defense for intelligence, suggesting that the Bush administration considered his transgression minor. Perhaps Boykin had spoken out of turn, but his was not a fireable offense. (One can only speculate regarding the fate likely to befall a high-ranking U.S. officer daring to say of Israeli Prime Minister Benjamin Netanyahu, “My God is a real God and his is an idol.”)

A second indicator came in the wake of Boykin’s retirement from active duty. In 2012, the influential Family Research Council (FRC) in Washington hired the general to serve as the organization’s executive vice-president. Devoted to “advancing faith, family, and freedom,” the council presents itself as emphatically Christian in its outlook. FRC events routinely attract Republican Party heavyweights. The organization forms part of the conservative mainstream, much as, say, the American Civil Liberties Union forms part of the left-liberal mainstream.

So for the FRC to hire as its executive vice-president someone espousing Boykin’s pronounced views regarding Islam qualifies as noteworthy. At a minimum, those who recruited the former general apparently found nothing especially objectionable in his worldview. They saw nothing politically risky about associating with Jerry Boykin. He’s their kind of guy. More likely, by hiring Boykin, the FRC intended to send a signal: on matters where its new executive vice-president claimed expertise — above all, war — thumb-in-your-eye political incorrectness was becoming a virtue. Imagine the NAACP electing Nation of Islam leader Louis Farrakhan as its national president, thereby endorsing his views on race, and you get the idea.

What the FRC’s embrace of General Boykin makes clear is this: to dismiss manifestations of Islamophobia simply as the work of an insignificant American fringe is mistaken. As with the supporters of Senator Joseph McCarthy, who during the early days of the Cold War saw communists under every State Department desk, those engaging in these actions are daring to express openly attitudes that others in far greater numbers also quietly nurture. To put it another way, what Americans in the 1950s knew as McCarthyism has reappeared in the form of Boykinism.

Historians differ passionately over whether McCarthyism represented a perversion of anti-Communism or its truest expression. So, too, present-day observers will disagree as to whether Boykinism represents a merely fervent or utterly demented response to the Islamist threat. Yet this much is inarguable: just as the junior senator from Wisconsin in his heyday embodied a non-trivial strain of American politics, so, too, does the former special-ops-warrior-turned-“ordained minister with a passion for spreading the Gospel of Jesus Christ.”

Notably, as Boykinism’s leading exponent, the former general’s views bear a striking resemblance to those favored by the late senator. Like McCarthy, Boykin believes that, while enemies beyond America’s gates pose great dangers, the enemy within poses a still greater threat. “I’ve studied Marxist insurgency,” he declared in a 2010 video. “It was part of my training. And the things I know that have been done in every Marxist insurgency are being done in America today.” Explicitly comparing the United States as governed by Barack Obama to Stalin’s Soviet Union, Mao Zedong’s China, and Fidel Castro’s Cuba, Boykin charges that, under the guise of health reform, the Obama administration is secretly organizing a “constabulary force that will control the population in America.” This new force is, he claims, designed to be larger than the United States military, and will function just as Hitler’s Brownshirts once did in Germany. All of this is unfolding before our innocent and unsuspecting eyes.

Boykinism: The New McCarthyism

How many Americans endorsed McCarthy’s conspiratorial view of national and world politics? It’s difficult to know for sure, but enough in Wisconsin to win him reelection in 1952, by a comfortable 54% to 46% majority. Enough to strike fear into the hearts of politicians who quaked at the thought of McCarthy fingering them for being “soft on Communism.”

How many Americans endorse Boykin’s comparably incendiary views? Again, it’s difficult to tell. Enough to persuade FRC’s funders and supporters to hire him, confident that doing so would burnish, not tarnish, the organization’s brand. Certainly, Boykin has in no way damaged its ability to attract powerhouses of the domestic right. FRC’s recent “Values Voter Summit” featured luminaries such as Republican vice-presidential nominee Paul Ryan, former Republican Senator and presidential candidate Rick Santorum, House Majority Leader Eric Cantor, and Representative Michele Bachmann — along with Jerry Boykin himself, who lectured attendees on “Israel, Iran, and the Future of Western Civilization.” (In early August, Mitt Romney met privately with a group of “prominent social conservatives,” including Boykin.)

Does their appearance at the FRC podium signify that Ryan, Santorum, Cantor, and Bachmann all subscribe to Boykinism’s essential tenets? Not any more than those who exploited the McCarthyite moment to their own political advantage — Richard Nixon, for example — necessarily agreed with all of McCarthy’s reckless accusations. Yet the presence of leading Republicans on an FRC program featuring Boykin certainly suggests that they find nothing especially objectionable or politically damaging to them in his worldview.

Still, comparisons between McCarthyism and Boykinism only go so far. Senator McCarthy wreaked havoc mostly on the home front, instigating witch-hunts, destroying careers, and trampling on civil rights, while imparting to American politics even more of a circus atmosphere than usual. In terms of foreign policy, the effect of McCarthyism, if anything, was to reinforce an already existing anti-communist consensus. McCarthy’s antics didn’t create enemies abroad. McCarthyism merely reaffirmed that communists were indeed the enemy, while making the political price of thinking otherwise too high to contemplate.

Boykinism, in contrast, makes its impact felt abroad. Unlike McCarthyism, it doesn’t strike fear into the hearts of incumbents on the campaign trail here. Attracting General Boykin’s endorsement or provoking his ire probably won’t determine the outcome of any election. Yet in its various manifestations Boykinism provides the kindling that helps sustain anti-American sentiment in the Islamic world. It reinforces the belief among Muslims that the Global War on Terror really is a war against them.

Boykinism confirms what many Muslims are already primed to believe: that American values and Islamic values are irreconcilable. American presidents and secretaries of state stick to their talking points, praising Islam as a great religious tradition and touting past U.S. military actions (ostensibly) undertaken on behalf of Muslims. Yet with their credibility among Iraqis, Afghans, Pakistanis, and others in the Greater Middle East about nil, they are pissing in the wind.

As long as substantial numbers of vocal Americans do not buy the ideological argument constructed to justify U.S. intervention in the Islamic world — that their conception of freedom (including religious freedom) is ultimately compatible with ours — neither will Muslims. In that sense, the supporters of Boykinism who reject that proposition encourage Muslims to follow suit. This ensures, by extension, that further reliance on armed force as the preferred instrument of U.S. policy in the Islamic world will compound the errors that produced and have defined the post-9/11 era.

Andrew J. Bacevich is currently a visiting fellow at Notre Dame’s Kroc Institute for International Peace Studies. A TomDispatch regular, he is author of Washington Rules: America’s Path to Permanent War, among other works, and most recently editor of The Short American Century.

Copyright 2012 Andrew J. Bacevich

Boykinism

With the United States now well into the second decade of what the Pentagon has styled an “era of persistent conflict,” the war formerly known as the global war on terrorism (unofficial acronym WFKATGWOT) appears increasingly fragmented and diffuse.  Without achieving victory, yet unwilling to acknowledge failure, the United States military has withdrawn from Iraq.  It is trying to leave Afghanistan, where events seem equally unlikely to yield a happy outcome. 

Elsewhere — in Pakistan, Libya, Yemen, and Somalia, for example — U.S. forces are busily opening up new fronts.  Published reports that the United States is establishing “a constellation of secret drone bases” in or near the Horn of Africa and the Arabian Peninsula suggest that the scope of operations will only widen further.  In a front-page story, the New York Times described plans for “thickening” the global presence of U.S. special operations forces.  Rushed Navy plans to convert an aging amphibious landing ship into an “afloat forward staging base” — a mobile launch platform for either commando raids or minesweeping operations in the Persian Gulf — only reinforce the point. Yet as some fronts close down and others open up, the war’s narrative has become increasingly difficult to discern.  How much farther until we reach the WFKATGWOT’s equivalent of Berlin?  What exactly is the WFKATGWOT’s equivalent of Berlin?  In fact, is there a storyline here at all?

Viewed close-up, the “war” appears to have lost form and shape.  Yet by taking a couple of steps back, important patterns begin to appear.  What follows is a preliminary attempt to score the WFKATGWOT, dividing the conflict into a bout of three rounds.  Although there may be several additional rounds still to come, here’s what we’ve suffered through thus far.

The Rumsfeld Era

Round 1: Liberation.  More than any other figure — more than any general, even more than the president himself — Secretary of Defense Donald Rumsfeld dominated the war’s early stages.  Appearing for a time to be a larger-than-life figure — the “Secretary at War” in the eyes of an adoring (if fickle) neocon fan club — Rumsfeld dedicated himself to the proposition that, in battle, speed holds the key to victory.  He threw his considerable weight behind a high-tech American version of blitzkrieg.  U.S. forces, he regularly insisted, were smarter and more agile than any adversary.  To employ them in ways that took advantage of those qualities was to guarantee victory.  The journalistic term adopted to describe this concept was “shock and awe.”

No one believed more passionately in “shock and awe” than Rumsfeld himself.  The design of Operation Enduring Freedom, launched in October 2001, and of Operation Iraqi Freedom, begun in March 2003, reflected this belief.  In each instance, the campaign got off to a promising start, with U.S. troops landing some swift and impressive blows.  In neither case, however, were they able to finish off their opponent or even, in reality, sort out just who their opponent might be.  Unfortunately for Rumsfeld, the “terrorists” refused to play by his rulebook and U.S. forces proved to be less smart and agile than their technological edge — and their public relations machine — suggested would be the case.  Indeed, when harassed by minor insurgencies and scattered bands of jihadis, they proved surprisingly slow to figure out what hit them.

In Afghanistan, Rumsfeld let victory slip through his grasp.  In Iraq, his mismanagement of the campaign brought the United States face-to-face with outright defeat.  Rumsfeld’s boss had hoped to liberate (and, of course, dominate) the Islamic world through a series of short, quick thrusts.  What Bush got instead were two different versions of a long, hard slog.  By the end of 2006, “shock and awe” was kaput.  Trailing well behind the rest of the country and its armed forces, the president eventually lost confidence in his defense secretary’s approach.  As a result, Rumsfeld lost his job.  Round one came to an end, the Americans, rather embarrassingly, having lost it on points.

The Petraeus Era

Round 2: Pacification.  Enter General David Petraeus.  More than any other figure, in or out of uniform, Petraeus dominated the WFKATGWOT’s second phase.  Round two opened with lowered expectations.  Gone was the heady talk of liberation.  Gone, too, were predictions of lightning victories.  The United States was now willing to settle for much less while still claiming success. 

Petraeus offered a formula for restoring a semblance of order to countries reduced to chaos as a result of round one.  Order might permit the United States to extricate itself while claiming to have met its policy objectives.  This became the operative definition of victory.

The formal name for the formula that Petraeus devised was counterinsurgency, or COIN.  Rather than trying to defeat the enemy, COIN sought to facilitate the emergence of a viable and stable nation-state.  This was the stated aim of the “surge” in Iraq ordered by President George W. Bush at the end of 2006. 

With Petraeus presiding, violence in that country did decline precipitously. Whether the relationship was causal or coincidental remains the subject of controversy.  Still, Petraeus’s apparent success persuaded some observers that counterinsurgency on a global scale — GCOIN, they called it — should now form the basis for U.S. national security strategy.  Here, they argued, was an approach that could definitively extract the United States from the WFKATGWOT, while offering victory of a sort.  Rather than employing “shock and awe” to liberate the Islamic world, U.S. forces would apply counterinsurgency doctrine to pacify it.

The task of demonstrating the validity of COIN beyond Iraq fell to General Stanley McChrystal, appointed with much fanfare in 2009 to command U.S. and NATO forces in Afghanistan.  Press reports celebrated McChrystal as another Petraeus, the ideal candidate to replicate the achievements already credited to “King David.” 

McChrystal’s ascendancy came at a moment when a cult of generalship gripped Washington.  Rather than technology being the determinant of success as Rumsfeld had believed, the key was to put the right guy in charge and then let him run with things.  Political figures on both sides of the aisle fell all over themselves declaring McChrystal the right guy for Afghanistan.  Pundits of all stripes joined the chorus.

Once installed in Kabul, the general surveyed the situation and, to no one’s surprise, announced that “success demands a comprehensive counterinsurgency campaign.”  Implementing that campaign would necessitate an Afghan “surge” mirroring the one that had seemingly turned Iraq around.  In December 2009, albeit with little evident enthusiasm, President Barack Obama acceded to his commander’s request (or ultimatum).  The U.S. troop commitment to Afghanistan rapidly increased.

Here things began to come undone.  Progress toward reducing the insurgency or improving the capacity of Afghan security forces was — by even the most generous evaluation — negligible.  McChrystal made promises — like meeting basic Afghan needs with “government in a box, ready to roll in” — that he proved utterly incapable of keeping.  Relations with the government of President Hamid Karzai remained strained.  Those with neighboring Pakistan, not good to begin with, only worsened.  Both governments expressed deep resentment at what they viewed as high-handed American behavior that killed or maimed noncombatants with disturbing frequency.

To make matters worse, despite all the hype, McChrystal turned out to be miscast — manifestly the wrong guy for the job.  Notably, he proved unable to grasp the need for projecting even some pretense of respect for the principle of civilian control back in Washington.  By the summer of 2010, he was out — and Petraeus was back in.

In Washington (if not in Kabul), Petraeus’s oversized reputation quelled the sense that with McChrystal’s flame-out Afghanistan might be a lost cause.  Surely, the most celebrated soldier of his generation would repeat his Iraq magic, affirming his own greatness and the continued viability of COIN. 

Alas, this was not to be.  Conditions in Afghanistan during Petraeus’s tenure in command improved — if that’s even the word — only modestly.  The ongoing war met just about anyone’s definition of a quagmire.  With considerable understatement, a 2011 National Intelligence Estimate called it a “stalemate.” Soon, talk of a “comprehensive counterinsurgency” faded.  With the bar defining success slipping ever lower, passing off the fight to Afghan security forces and hightailing it for home became the publicly announced war aim.

That job remained unfinished when Petraeus himself headed for home, leaving the Army to become CIA director.  Although Petraeus was still held in high esteem, his departure from active duty left the cult of generalship looking more than a little the worse for wear.  By the time General John Allen succeeded Petraeus — thereby becoming the eighth U.S. officer appointed to preside over the ongoing Afghan War — no one believed that simply putting the right guy in charge was going to produce magic.  On that inconclusive note, round two of the WFKATGWOT ended.

The Vickers Era

Round 3: Assassination.  Unlike Donald Rumsfeld or David Petraeus, Michael Vickers has not achieved celebrity status.  Yet more than anyone else in or out of uniform, Vickers, who carries the title Under Secretary of Defense for Intelligence, deserves recognition as the emblematic figure of the WFKATGWOT’s round three.  His low-key, low-profile persona meshes perfectly with this latest evolution in the war’s character.  Few people outside of Washington know who he is, which is fitting indeed since he presides over a war that few people outside of Washington are paying much attention to any longer.

With the retirement of Secretary of Defense Robert Gates, Vickers is the senior remaining holdover from George W. Bush’s Pentagon.  His background is nothing if not eclectic.  He previously served in U.S. Army Special Forces and as a CIA operative.  In that guise, he played a leading role in supporting the Afghan mujahedeen in their war against Soviet occupiers in the 1980s.  Subsequently, he worked in a Washington think tank and earned a PhD in strategic studies at Johns Hopkins University (dissertation title: “The Structure of Military Revolutions”). 

Even during the Bush era, Vickers never subscribed to expectations that the United States could liberate or pacify the Islamic world.  His preferred approach to the WFKATGWOT has been simplicity itself. “I just want to kill those guys,” he says — “those guys” referring to members of al-Qaeda. Kill the people who want to kill Americans and don’t stop until they are all dead: this defines the Vickers strategy, which over the course of the Obama presidency has supplanted COIN as the latest variant of U.S. strategy. 

The Vickers approach means acting aggressively to eliminate would-be killers wherever they might be found, employing whatever means are necessary.  Vickers “tends to think like a gangster,” one admirer comments. “He can understand trends then change the rules of the game so they are advantageous for your side.”

Round three of the WFKATGWOT is all about bending, breaking, and reinventing rules in ways thought to be advantageous to the United States.  Much as COIN supplanted “shock and awe,” a broad-gauged program of targeted assassination has now displaced COIN as the prevailing expression of the American way of war. 

The United States is finished with the business of sending large land armies to invade and occupy countries on the Eurasian mainland.  Robert Gates, when still Secretary of Defense, made the definitive statement on that subject.  The United States is now in the business of using missile-armed drones and special operations forces to eliminate anyone (not excluding U.S. citizens) the president of the United States decides has become an intolerable annoyance.  Under President Obama, such attacks have proliferated. 

This is America’s new MO.  Paraphrasing a warning issued by Secretary of State Hillary Clinton, a Washington Post dispatch succinctly summarized what it implied: “The United States reserved the right to attack anyone who it determined posed a direct threat to U.S. national security, anywhere in the world.” 

Furthermore, acting on behalf of the United States, the president exercises this supposed right without warning, without regard to claims of national sovereignty, without Congressional authorization, and without consulting anyone other than Michael Vickers and a few other members of the national security apparatus.  The role allotted to the American people is to applaud, if and when notified that a successful assassination has occurred.  And applaud we do when, for example, members of SEAL Team Six secretly enter Pakistan on a daring raid to dispatch Osama bin Laden with two neatly placed kill shots.  Vengeance long deferred makes it unnecessary to consider what second-order political complications might ensue.

How round three will end is difficult to forecast.  The best we can say is that it’s unlikely to end anytime soon or particularly well.  As Israel has discovered, once targeted assassination becomes your policy, the list of targets has a way of growing ever longer. 

So what tentative judgments can we offer regarding the ongoing WFKATGWOT?  Operationally, a war launched by the conventionally minded has progressively fallen under the purview of those who inhabit what Dick Cheney once called “the dark side,” with implications that few seem willing to explore.  Strategically, a war informed at the outset by utopian expectations continues today with no concretely stated expectations whatsoever, the forward momentum of events displacing serious consideration of purpose.  Politically, a war that once occupied center stage in national politics has now slipped to the periphery, the American people moving on to other concerns and entertainments, with legal and moral questions raised by the war left dangling in midair.

Is this progress?

Andrew J. Bacevich is professor of history and international relations at Boston University.  A TomDispatch regular, he is the author most recently of Washington Rules: America’s Path to Permanent War and the editor of the new book The Short American Century: A Postmortem, just out from Harvard University Press. To catch Timothy MacBain’s latest TomCast audio interview in which Bacevich discusses the changing face of the Global War on Terror, click here, or download it to your iPod here.

Copyright 2012 Andrew Bacevich

Scoring the Global War on Terror

Fenway Park, Boston, July 4, 2011.  On this warm summer day, the Red Sox will play the Toronto Blue Jays.  First come pre-game festivities, especially tailored for the occasion.  The ensuing spectacle — a carefully scripted encounter between the armed forces and society — expresses the distilled essence of present-day American patriotism.  A masterpiece of contrived spontaneity, the event leaves spectators feeling good about their baseball team, about their military, and not least of all about themselves — precisely as it was meant to do.

In this theatrical production, the Red Sox provide the stage, and the Pentagon the props.  In military parlance, it is a joint operation.  In front of a gigantic American flag draped over the left-field wall, an Air Force contingent, clad in blue, stands at attention.  To carry a smaller version of the Stars and Stripes onto the playing field, the Navy provides a color guard in crisp summer whites.  The United States Marine Corps kicks in with a choral ensemble that leads the singing of the national anthem.  As the anthem’s final notes sound, four U.S. Air Force F-15C Eagles scream overhead.  The sellout crowd roars its approval.

But there is more to come. “On this Independence Day,” the voice of the Red Sox booms over the public address system, “we pay a debt of gratitude to the families whose sons and daughters are serving our country.”  On this particular occasion the designated recipients of that gratitude are members of the Lydon family, hailing from Squantum, Massachusetts.  Young Bridget Lydon is a sailor — Aviation Ordnanceman Airman is her official title — serving aboard the carrier USS Ronald Reagan, currently deployed in support of the Afghanistan War, now in its 10th year.

From Out of Nowhere

The Lydons are Every Family, decked out for the Fourth.  Garbed in random bits of Red Sox paraphernalia and Mardi Gras necklaces, they wear their shirts untucked and ball caps backwards.  Neither sleek nor fancy, they are without pretension.  Yet they exude good cheer.  As they are ushered onto the field, their eagerness is palpable.  Like TV game show contestants, they know that this is their lucky day and they are keen to make the most of it.

As the Lydons gather near the pitcher’s mound, the voice directs their attention to the 38-by-100-foot Jumbotron mounted above the centerfield bleachers.  On the screen, Bridget appears.  She is aboard ship, in duty uniform, posed below decks in front of an F/A-18 fighter jet.  Waiflike, but pert and confident, she looks directly into the camera, sending a “shout-out” to family and friends.  She wishes she could join them at Fenway. 

As if by magic, wish becomes fulfillment.  While the video clip is still running, Bridget herself, now in dress whites, emerges from behind the flag covering the left-field wall.  On the Jumbotron, in place of Bridget below decks, an image of Bridget marching smartly toward the infield appears.  In the stands pandemonium erupts.  After a moment of confusion, members of her family — surrounded by camera crews — rush to embrace their sailor, a reunion shared vicariously by the 38,000 fans in attendance along with many thousands more watching at home on the Red Sox television network.

Once the Lydons finish with hugs and kisses and the crowd settles down, Navy veteran Bridget (annual salary approximately $22,000) throws the ceremonial first pitch to aging Red Sox veteran Tim Wakefield (annual salary $2,000,000).  More cheers.  As a souvenir, Wakefield gives her the baseball along with his own hug.  All smiles, Bridget and her family shout “Play Ball!” into a microphone.  As they are escorted off the field and out of sight, the game begins. 

Cheap Grace

What does this event signify?

For the Lydons, the day will no doubt long remain a happy memory.  If they were to some degree manipulated — their utter and genuine astonishment at Bridget’s seemingly miraculous appearance lending the occasion its emotional punch — they played their allotted roles without complaint and with considerable élan.  However briefly, they stood in the spotlight, quasi-celebrities, all eyes trained on them, a contemporary version of the American dream fulfilled.  And if offstage puppet-masters used Bridget herself, at least she got a visit home and a few days off — no doubt a welcome break. 

Yet this feel-good story was political as well as personal.  As a collaboration between two well-heeled but image-conscious institutions, the Lydon reunion represented a small but not inconsequential public relations triumph.  The Red Sox and the Navy had worked together to perform an act of kindness for a sailor and her loved ones.  Both organizations came away looking good, not only because the event itself was so deftly executed, but because it showed that the large for-profit professional sports team and the even larger military bureaucracy both care about ordinary people.  The message conveyed to fans/taxpayers could not be clearer: the corporate executives who run the Red Sox have a heart. So, too, do the admirals who run the Navy.

Better still, these benefits accrued at essentially no cost to the sponsors.  The military personnel arrayed around Fenway showed up because they were told to do so.  They are already “paid for,” as are the F-15s, the pilots who fly them, and the ground crews that service them.  As for whatever outlays the Red Sox may have made, they are trivial and easily absorbed.  For the 2011 season, the average price of a ticket at Fenway Park had climbed to $52.  A soft drink in a commemorative plastic cup runs you $5.50 and a beer $8.  Then there is the television ad revenue, all contributing the previous year to corporate profits exceeding $58 million.  A decade of war culminating in the worst economic crisis since the Great Depression hasn’t done much good for the country, but it has been strangely good for the Red Sox — and a no less well-funded Pentagon.  Any money expended in bringing Bridget to Fenway and entertaining the Lydons had to be the baseball/military equivalent of pocket change.

And the holiday festivities at Fenway had another significance as well, one that extended beyond burnishing institutional reputations and boosting bottom lines.  Here was America’s civic religion made manifest. 

In recent decades, an injunction to “support the troops” has emerged as a central tenet of that religion.  Since 9/11 this imperative has become, if anything, even more binding.  Indeed, as citizens, Americans today acknowledge no higher obligation.

Fulfilling that obligation has posed a challenge, however.  Rather than doing so concretely, Americans — with a few honorable exceptions — have settled for symbolism.  With their pronounced aversion to collective service and sacrifice (an inclination indulged by leaders of both political parties), Americans resist any definition of civic duty that threatens to crimp lifestyles. 

To stand in solidarity with those on whom the burden of service and sacrifice falls is about as far as they will go.  Expressions of solidarity affirm that the existing relationship between soldiers and society is consistent with democratic practice.  By extension, so, too, is the distribution of prerogatives and responsibilities entailed by that relationship: a few fight, the rest applaud.  Put simply, the message that citizens wish to convey to their soldiers is this: although choosing not to be with you, we are still for you (so long as being for you entails nothing on our part).  Cheering for the troops, in effect, provides a convenient mechanism for voiding obligation and easing guilty consciences.   

In ways far more satisfying than displaying banners or bumper stickers, the Fenway Park Independence Day event provided a made-to-order opportunity for conscience easing.  It did so in three ways.  First, it brought members of Red Sox Nation into close proximity (even if not direct contact) with living, breathing members of the armed forces, figuratively closing any gap between the two.  (In New England, where few active duty military installations remain, such encounters are increasingly infrequent.)  Second, it manufactured one excuse after another to whistle and shout, whoop and holler, thereby allowing the assembled multitudes to express — and to be seen expressing — their affection and respect for the troops.  Finally, it rewarded participants and witnesses alike with a sense of validation, the reunion of Bridget and her family, even if temporary, serving as a proxy for a much larger, if imaginary, reconciliation of the American military and the American people.  That debt?  Mark it paid in full.

The late German theologian Dietrich Bonhoeffer had a name for this unearned self-forgiveness and undeserved self-regard.  He called it cheap grace.  Were he alive today, Bonhoeffer might suggest that a taste for cheap grace, compounded by an appetite for false freedom, is leading Americans down the road to perdition. 

Andrew J. Bacevich, the author of Washington Rules: America’s Path to Permanent War, is professor of history and international relations at Boston University. His next book, of which this post is a small part, will assess the impact of a decade of war on American society and the United States military. To listen to Timothy MacBain’s latest TomCast audio interview in which Bacevich discusses cheap grace and military spectacle, click here, or download it to your iPod here.

Copyright 2011 Andrew Bacevich

Ballpark Liturgy: America’s New Civic Religion

At periodic intervals, the American body politic has shown a marked susceptibility to messianic fevers.  Whenever an especially acute attack occurs, a sort of delirium ensues, manifesting itself in delusions of grandeur and demented behavior. 

By the time the condition passes and a semblance of health is restored, recollection of what occurred during the illness tends to be hazy.  What happened?  How’d we get here?  Most Americans prefer not to know.  No sense dwelling on what’s behind us.  Feeling much better now!  Thanks!

Gripped by such a fever in 1898, Americans evinced an irrepressible impulse to liberate oppressed Cubans.  By the time they’d returned to their senses, having acquired various parcels of real estate between Puerto Rico and the Philippines, no one could quite explain what had happened or why.  (The Cubans meanwhile had merely exchanged one set of overseers for another.)

In 1917, the fever suddenly returned.  Amid wild ravings about waging a war to end war, Americans lurched off to France.  This time the affliction passed quickly, although the course of treatment proved painful: confinement to the charnel house of the Western Front, followed by bitter medicine administered at Versailles.

The 1960s brought another bout (and so yet more disappointment).  An overwhelming urge to pay any price, bear any burden landed Americans in Vietnam.  The fall of Saigon in 1975 seemed, for a brief interval, to inoculate the body politic against any further recurrence.  Yet the salutary effects of this “Vietnam syndrome” proved fleeting.  By the time the Cold War ended, Americans were running another temperature, their self-regard reaching impressive new heights.  Out of Washington came all sorts of embarrassing gibberish about permanent global supremacy and history’s purpose finding fulfillment in the American way of life.

Give Me Fever

Then came 9/11 and the fever simply soared off the charts.  The messiah-nation was really pissed and was going to fix things once and for all.

Nearly 10 years have passed since Washington set out to redeem the Greater Middle East.  The crusades have not gone especially well.  In fact, in the pursuit of its saving mission, the American messiah has pretty much worn itself out.

Today, the post-9/11 fever finally shows signs of abating.  The evidence is partial and preliminary.  The sickness has by no means passed.  Oddly, it lingers most strongly in the Obama White House, of all places, where a keenness to express American ideals by dropping bombs seems strangely undiminished.

Yet despite the urges of some in the Obama administration, after nearly a decade of self-destructive flailing about, American recovery has become a distinct possibility.  Here’s some of the evidence:

In Washington, it’s no longer considered a sin to question American omnipotence.  Take the case of Robert Gates.  The outgoing secretary of defense may well be the one senior U.S. official of the past decade to leave office with his reputation not only intact, but actually enhanced.  (Note to President Obama: think about naming an aircraft carrier after the guy).  Yet along with restoring a modicum of competence and accountability to the Pentagon, the Gates legacy is likely to be found in his willingness — however belated — to acknowledge the limits of American power.

That the United States should avoid wars except when absolutely necessary no longer connotes incipient isolationism.  It is once again a sign of common sense, with Gates a leading promoter.  Modesty is becoming respectable.

The Gates Doctrine

No one can charge Gates with being an isolationist or a national security wimp.  Neither is he a “declinist.”  So when he says anyone proposing another major land war in the Greater Middle East should “have his head examined” — citing the authority of Douglas MacArthur, no less — people take notice.  Or more recently there was this: “I’ve got a military that’s exhausted,” Gates remarked, in one of those statements of the obvious too seldom heard from on high.  “Let’s just finish the wars we’re in and keep focused on that instead of signing up for other wars of choice.”  Someone should etch that into the outer walls of the Pentagon’s E-ring.

A half-dozen years ago, “wars of choice” were all the rage in Washington.  No more.  Thank you, Mr. Secretary.

Or consider the officer corps.  There is no “military mind,” but there are plenty of minds in the military and some number of them are changing.

Evidence suggests that the officer corps itself is rethinking the role of military power.  Consider, for example, “Mr. Y,” author of A National Strategic Narrative, published this spring to considerable acclaim by the Woodrow Wilson Center for Scholars.  The actual authors of this report are two military professionals, one a navy captain, the other a Marine colonel.

What you won’t find in this document are jingoism, braggadocio, chest-thumping, and calls for a bigger military budget.  If there’s an overarching theme, it’s pragmatism.  Rather than the United States imposing its will on the world, the authors want more attention paid to the investment needed to rebuild at home.

The world is too big and complicated for any one nation to call the shots, they insist.  The effort to do so is self-defeating. “As Americans,” Mr. Y writes, “we needn’t seek the world’s friendship or proselytize the virtues of our society.  Neither do we seek to bully, intimidate, cajole, or persuade others to accept our unique values or to share our national objectives.  Rather, we will let others draw their own conclusions based upon our actions… We will pursue our national interests and let others pursue theirs…”

You might dismiss this as the idiosyncratic musing of two officers who have spent too much time having their brains baked in the Iraqi or Afghan sun.  I don’t.  What convinces me otherwise is the positive email traffic that my own musings about the misuse and abuse of American power elicit weekly from serving officers.  It’s no scientific sample, but the captains, majors, and lieutenant colonels I hear from broadly agree with Mr. Y.  They’ve had a bellyful of twenty-first-century American war and are open to a real debate over how to overhaul the nation’s basic approach to national security.

Intelligence Where You Least Expect It

And finally, by gum, there is the United States Congress.  Just when that body appeared to have entered a permanent vegetative state, a flickering of intelligent life has made its reappearance.  Perhaps more remarkably still, the signs are evident on both sides of the aisle as Democrats and Republicans alike — albeit for different reasons — are raising serious questions about the nation’s propensity for multiple, open-ended wars.

Some members cite concerns for the Constitution and the abuse of executive power.  Others worry about the price tag.  With Osama bin Laden out of the picture, still others insist that it’s time to rethink strategic priorities.  No doubt partisan calculation or personal ambition figures alongside matters of principle.  They are, after all, politicians.

Given what polls indicate is a growing public unhappiness over the Afghan War, speaking out against that war these days doesn’t exactly require political courage.  Still, the possibility of our legislators reasserting a role in deciding whether or not a war actually serves the national interest — rather than simply rubberstamping appropriations and slinking away — now presents itself.  God bless the United States Congress.

Granted, the case presented here falls well short of being conclusive.  To judge by his announcement of a barely-more-than-symbolic troop withdrawal from Afghanistan, President Obama himself seems uncertain of where he stands.  And clogging the corridors of power or the think tanks and lobbying arenas that surround them are plenty of folks still hankering to have a go at Syria or Iran.

At the first signs of self-restraint, you can always count on the likes of Senator John McCain or the editorial board of the Wall Street Journal to decry (in McCain’s words) an “isolationist-withdrawal-lack-of-knowledge-of-history attitude” hell-bent on pulling up the drawbridge and having Americans turn their backs on the world.  In such quarters, fever is a permanent condition and it’s always 104 and rising.  Yet it is a measure of just how quickly things are changing that McCain himself, once deemed a source of straight talk, now comes across as a mere crank.

In this way, nearly a decade after our most recent descent into madness, does the possibility of recovery finally beckon.

Andrew J. Bacevich is professor of history and international relations at Boston University. His most recent book is Washington Rules: America’s Path to Permanent War.  To listen to Timothy MacBain’s latest TomCast audio interview in which Bacevich discusses voices of dissent within the military, click here, or download it to your iPod here.

Copyright 2011 Andrew J. Bacevich

On the Mend?

It is a commonplace of American politics: when the moving van pulls up to the White House on Inauguration Day, it delivers not only a closetful of gray suits and power ties, but a boatload of expectations. 

A president, being the most powerful man in the world, begins history anew — so at least Americans believe, or pretend to believe.  Out with the old, sordid, and disappointing; in with the fresh, unsullied, and hopeful.  Why, with the stroke of a pen, a new president can order the closing of an embarrassing and controversial off-shore prison for accused terrorists held for years on end without trial!  Just like that: done.

For all sorts of reasons, the expectations raised by Barack Obama’s arrival in the Oval Office were especially high.  Americans weren’t the only ones affected.  How else to explain the Nobel Committee’s decision to honor the new president by transforming its Peace Prize into a Prize Anticipating Peace — more or less the equivalent of designating the winner of the Heisman Trophy during week one of the college football season?

Of course, if the political mood immediately prior to and following a presidential inauguration emphasizes promise and discovery (the First Lady has biceps!), it doesn’t take long for the novelty to start wearing off.  Then the narrative arc takes a nosedive: he’s breaking his promises,  he’s letting us down, he’s not so different after all.

The words of H.L. Mencken apply.  “When I hear a man applauded by the mob,” the Sage of Baltimore wrote, “I always feel a pang of pity for him.  All he has to do to be hissed is to live long enough.”  Barack Obama has now lived long enough to attract his fair share of hisses, boos, and catcalls.

Along with prolonging and expanding one war in Afghanistan, the Nobel Peace laureate has played a leading role in starting another war in Libya.  Laboring to distinguish between this administration and its predecessor, Obama’s defenders emphasize the purity of his motives.  Contemptuous of George W. Bush’s claim that U.S. forces invaded oil-rich Iraq to keep weapons of mass destruction out of the hands of terrorists, they readily accept this president’s insistence that the United States intervened in oil-rich Libya to prevent genocidal slaughter.  Besides, testifying to our virtuous intent, this time we’ve got the French with us rather than against us.

Explaining Why Is a Mug’s Game

In truth, to ascribe a single governing purpose or rationale to any large-scale foreign policy initiative is to engage in willful distortion.  In any administration, action grows out of consensus.  The existence of consensus among any president’s advisers — LBJ’s inner circle supporting escalation in South Vietnam back in 1965, George W.’s pressing for regime change in Baghdad — does not imply across-the-board agreement as to intent.

Motive is slippery.  As Paul Wolfowitz famously noted regarding Iraq, weapons of mass destruction merely provided the agreed upon public rationale for war.  In reality, a mix of motives probably shaped the decision to invade.  For some administration officials, there was the prospect of eliminating a perceived source of mischief while providing an object lesson to other would-be troublemakers.  For others, there was the promise of reasserting U.S. hegemony over the world’s energy heartland.  For others still (including Wolfowitz himself), there were alluring visions of a region transformed, democratized, and pacified, the very sources of Islamist terror thereby eliminated once and for all. 

At least on the margins, expanding the powers of the presidency at the expense of Congress, bolstering the security of Israel, and finishing what daddy had left undone also likely figured in the equation.  Within this mix, policymakers could pick and choose.

In the face of changing circumstances, they even claimed the prerogative of revising their choices.  Who can doubt that President Bush, faced with the Big Oops — the weapons of mass destruction that turned out not to exist — genuinely persuaded himself that America’s true and abiding purpose for invading Iraq had been to liberate the Iraqi people from brutal oppression?  After all, right from day one wasn’t the campaign called Operation Iraqi Freedom?

So even as journalists and historians preoccupy themselves with trying to explain why something happened, they are playing a mug’s game.  However creative or well-sourced, their answers are necessarily speculative, partial, and ambiguous.  It can’t be otherwise.

Rather than why, what deserves far more attention than it generally receives is the question of how.  Here is where we find Barack Obama and George W. Bush (not to mention Bill Clinton, George H. W. Bush, Ronald Reagan, and Jimmy Carter) joined at the hip.  When it comes to the Islamic world, for more than three decades now Washington’s answer to how has been remarkably consistent: through the determined application of hard power wielded by the United States.  Simply put, Washington’s how implies a concerted emphasis on girding for and engaging in war. 

Presidents may not agree on exactly what we are trying to achieve in the Greater Middle East (Obama wouldn’t be caught dead reciting lines from Bush’s Freedom Agenda, for example), but for the past several decades, they have agreed on means: whatever it is we want done, military might holds the key to doing it.  So today, we have the extraordinary spectacle of Obama embracing and expanding Bush’s Global War on Terror even after having permanently banished that phrase to the Guantanamo of politically incorrect speech.

The Big How — By Force

Efforts to divine this administration’s intent in Libya have centered on the purported influence of the Three Harpies: Secretary of State Hillary Clinton, U.N. Ambassador Susan Rice, and National Security Council Human Rights Director Samantha Power, women in positions of influence ostensibly burdened with regret that the United States failed back in 1994 to respond effectively to the Rwandan genocide and determined this time to get it right.  Yet this is insider stuff, which necessarily remains subject to considerable speculation.  What we can say for sure is this: by seeing the Greater Middle East as a region of loose nails badly in need of being hammered, the current commander-in-chief has claimed his place in the ranks of a long list of his warrior-predecessors.

The key point is this: like those who preceded them, neither Obama nor his Harpies (nor anyone else in a position of influence) could evidently be bothered to assess whether the hammer actually works as advertised — notwithstanding abundant evidence showing that it doesn’t.

The sequence of military adventures set in motion when Jimmy Carter promulgated his Carter Doctrine back in 1980 makes for an interesting story but not a very pretty one.  Ronald Reagan’s effort to bring peace to Lebanon ended in 1983 in a bloody catastrophe.  The nominal victory of Operation Desert Storm in 1991, which pushed Saddam Hussein’s forces out of Kuwait, produced little except woeful complications, which Bill Clinton’s penchant for flinging bombs and missiles about during the 1990s did little to resolve or conceal.  The blowback stemming from our first Afghanistan intervention against the Soviets helped create the conditions leading to 9/11 and another Afghanistan War, now approaching its tenth anniversary with no clear end in sight.  As for George W. Bush’s second go at Iraq, the less said the better.  Now, there is Libya.

The question demands to be asked: Are we winning yet?  And if not, why persist in an effort for which great pain is repaid with such little gain?

Perhaps Barack Obama found his political soul mate in Samantha Power, making her determination to alleviate evil around the world his own.  Or perhaps he is just another calculating politician who speaks the language of ideals while pursuing less exalted purposes.  In either case, the immediate relevance of the question is limited.  The how rather than the why is determinant.

Whatever his motives, by conforming to a pre-existing American penchant for using force in the Greater Middle East, this president has chosen the wrong tool.  In doing so, he condemns himself and the country to persisting in the folly of his predecessors.  The failure is one of imagination, but also of courage.  He promised, and we deserve, something better.

Andrew J. Bacevich is professor of history and international relations at Boston University.  His most recent book, Washington Rules: America’s Path to Permanent War (Metropolitan Books), is just out in paperback. To catch Timothy MacBain’s latest TomCast audio interview in which Bacevich discusses what to make of the Obama administration’s Libyan intervention, click here, or download it to your iPod here.

Copyright 2011 Andrew Bacevich

Not Why, But How

In defense circles, “cutting” the Pentagon budget has once again become a topic of conversation.  Americans should not confuse that talk with reality.  Any cuts exacted will at most reduce the rate of growth.  The essential facts remain: U.S. military outlays today equal those of every other nation on the planet combined, a situation without precedent in modern history.

The Pentagon presently spends more in constant dollars than it did at any time during the Cold War — this despite the absence of anything remotely approximating what national security experts like to call a “peer competitor.”  Evil Empire?  It exists only in the fevered imaginations of those who quiver at the prospect of China adding a rust-bucket Russian aircraft carrier to its fleet or who take seriously the ravings of radical Islamists promising from deep inside their caves to unite the Umma in a new caliphate.

What are Americans getting for their money?  Sadly, not much.  Despite extraordinary expenditures (not to mention exertions and sacrifices by U.S. forces), the return on investment is, to be generous, unimpressive.  The chief lesson to emerge from the battlefields of the post-9/11 era is this: the Pentagon possesses next to no ability to translate “military supremacy” into meaningful victory.

Washington knows how to start wars and how to prolong them, but is clueless when it comes to ending them.  Iraq, the latest addition to the roster of America’s forgotten wars, stands as exhibit A.  Each bomb that blows up in Baghdad or some other Iraqi city, splattering blood all over the streets, testifies to the manifest absurdity of judging “the surge” as the epic feat of arms celebrated by the Petraeus lobby.

The problems are strategic as well as operational.  Old Cold War-era expectations that projecting U.S. power will enhance American clout and standing no longer apply, especially in the Islamic world.  There, American military activities are instead fostering instability and inciting anti-Americanism.  For Exhibit B, see the deepening morass that Washington refers to as AfPak or the Afghanistan-Pakistan theater of operations.

Add to that the mountain of evidence showing that Pentagon, Inc. is a miserably managed enterprise: hide-bound, bloated, slow-moving, and prone to wasting resources on a prodigious scale — nowhere more so than in weapons procurement and the outsourcing of previously military functions to “contractors.”  When it comes to national security, effectiveness (what works) should rightly take precedence over efficiency (at what cost?) as the overriding measure of merit.  Yet beyond a certain level, inefficiency undermines effectiveness, with the Pentagon stubbornly and habitually exceeding that level.  By comparison, Detroit’s much-maligned Big Three offer models of well-run enterprises.

Impregnable Defenses

All of this takes place against the backdrop of mounting problems at home: stubbornly high unemployment, trillion-dollar federal deficits, massive and mounting debt, and domestic needs like education, infrastructure, and employment crying out for attention.

Yet the defense budget — a misnomer since for Pentagon, Inc. defense per se figures as an afterthought — remains a sacred cow.  Why is that? 

The answer lies first in understanding the defenses arrayed around that cow to ensure that it remains untouched and untouchable.  Exemplifying what the military likes to call a “defense in depth,” that protective shield consists of four distinct but mutually supporting layers. 

Institutional Self-Interest: Victory in World War II produced not peace, but an atmosphere of permanent national security crisis.  As never before in U.S. history, threats to the nation’s existence seemed omnipresent, an attitude first born in the late 1940s that still persists today.  In Washington, fear — partly genuine, partly contrived — triggered a powerful response. 

One result was the emergence of the national security state, an array of institutions that depended on (and therefore strove to perpetuate) this atmosphere of crisis to justify their existence, status, prerogatives, and budgetary claims.  In addition, a permanent arms industry arose, which soon became a major source of jobs and corporate profits.  Politicians of both parties were quick to identify the advantages of aligning with this “military-industrial complex,” as President Eisenhower described it. 

Allied with (and feeding off of) this vast apparatus that transformed tax dollars into appropriations, corporate profits, campaign contributions, and votes was an intellectual axis of sorts  — government-supported laboratories, university research institutes, publications, think tanks, and lobbying firms (many staffed by former or would-be senior officials) — devoted to identifying (or conjuring up) ostensible national security challenges and alarms, always assumed to be serious and getting worse, and then devising responses to them. 

The upshot: within Washington, the voices carrying weight in any national security “debate” all share a predisposition for sustaining very high levels of military spending for reasons having increasingly little to do with the well-being of the country.

Strategic Inertia: In a 1948 State Department document, diplomat George F. Kennan offered this observation: “We have about 50 percent of the world’s wealth, but only 6.3 percent of its population.”  The challenge facing American policymakers, he continued, was “to devise a pattern of relationships that will permit us to maintain this disparity.”  Here we have a description of American purposes that is far more candid than all of the rhetoric about promoting freedom and democracy, seeking world peace, or exercising global leadership. 

The end of World War II found the United States in a spectacularly privileged position.  Not for nothing do Americans remember the immediate postwar era as a Golden Age of middle-class prosperity.  Policymakers since Kennan’s time have sought to preserve that globally privileged position.  The effort has been a largely futile one. 

By 1950 at the latest, those policymakers (with Kennan by then a notable dissenter) had concluded that the possession and deployment of military power held the key to preserving America’s exalted status.  The presence of U.S. forces abroad and a demonstrated willingness to intervene, whether overtly or covertly, just about anywhere on the planet would promote stability, ensure U.S. access to markets and resources, and generally serve to enhance the country’s influence in the eyes of friend and foe alike — this was the idea, at least. 

In postwar Europe and postwar Japan, this formula achieved considerable success.  Elsewhere — notably in Korea, Vietnam, Latin America, and (especially after 1980) in the so-called Greater Middle East — it either produced mixed results or failed catastrophically.  Certainly, the events of the post-9/11 era provide little reason to believe that this presence/power-projection paradigm will provide an antidote to the threat posed by violent anti-Western jihadism.  If anything, adherence to it is exacerbating the problem by creating ever greater anti-American animus.

One might think that the manifest shortcomings of the presence/power-projection approach — trillions expended in Iraq for what? — might stimulate present-day Washington to pose some first-order questions about basic U.S. national security strategy.  A certain amount of introspection would seem to be called for.  Could, for example, the effort to sustain what remains of America’s privileged status benefit from another approach? 

Yet there are few indications that our political leaders, the senior-most echelons of the officer corps, or those who shape opinion outside of government are capable of seriously entertaining any such debate.  Whether through ignorance, arrogance, or a lack of imagination, the pre-existing strategic paradigm stubbornly persists; so, too, as if by default do the high levels of military spending that the strategy entails.

Cultural Dissonance: The rise of the Tea Party movement should disabuse any American of the thought that the cleavages produced by the “culture wars” have healed.  The cultural upheaval touched off by the 1960s and centered on Vietnam remains unfinished business in this country. 

Among other things, the sixties destroyed an American consensus, forged during World War II, about the meaning of patriotism.  During the so-called Good War, love of country implied, even required, deference to the state, shown most clearly in the willingness of individuals to accept the government’s authority to mandate military service.  GI’s, the vast majority of them draftees, were the embodiment of American patriotism, risking life and limb to defend the country. 

The GI of World War II had been an American Everyman.  Those soldiers both represented and reflected the values of the nation from which they came (a perception affirmed by the ironic fact that the military adhered to prevailing standards of racial segregation).  It was “our army” because that army was “us.” 

With Vietnam, things became more complicated.  The war’s supporters argued that the World War II tradition still applied: patriotism required deference to the commands of the state.  Opponents of the war, especially those facing the prospect of conscription, insisted otherwise.  They revived a distinction, formulated a generation earlier by the radical journalist Randolph Bourne, between the country and the state.  Real patriots, the ones who most truly loved their country, were those who opposed state policies they regarded as misguided, illegal, or immoral.

In many respects, the soldiers who fought the Vietnam War found themselves caught uncomfortably in the center of this dispute.  Was the soldier who died in Vietnam a martyr, a tragic figure, or a sap?  Who deserved greater admiration:  the soldier who fought bravely and uncomplainingly or the one who served and then turned against the war?  Or was the war resister — the one who never served at all — the real hero? 

War’s end left these matters disconcertingly unresolved.  President Richard Nixon’s 1971 decision to kill the draft in favor of an All-Volunteer Force, predicated on the notion that the country might be better served with a military that was no longer “us,” only complicated things further.  So, too, did the trends in American politics where bona fide war heroes (George H.W. Bush, Bob Dole, John Kerry, and John McCain) routinely lost to opponents whose military credentials were non-existent or exceedingly slight (Bill Clinton, George W. Bush, and Barack Obama), yet who demonstrated once in office a remarkable propensity for expending American blood (none belonging to members of their own families) in places like Somalia, Iraq, and Afghanistan.  It was all more than a little unseemly.

Patriotism, once a simple concept, had become both confusing and contentious.  What obligations, if any, did patriotism impose?  And if the answer was none — the option Americans seemed increasingly to prefer — then was patriotism itself still a viable proposition? 

Wanting to answer that question in the affirmative — to distract attention from the fact that patriotism had become little more than an excuse for fireworks displays and taking the occasional day off from work — people and politicians alike found a way to do so by exalting those Americans actually choosing to serve in uniform.  The thinking went this way: soldiers offer living proof that America is a place still worth dying for, that patriotism (at least in some quarters) remains alive and well; by common consent, therefore, soldiers are the nation’s “best,” committed to “something bigger than self” in a land otherwise increasingly absorbed in pursuing a material and narcissistic definition of self-fulfillment. 

In effect, soldiers offer much-needed assurance that old-fashioned values still survive, even if confined to a small and unrepresentative segment of American society.  Rather than Everyman, today’s warrior has ascended to the status of icon, deemed morally superior to the nation for which he or she fights, the repository of virtues that prop up, however precariously, the nation’s increasingly sketchy claim to singularity.

Politically, therefore, “supporting the troops” has become a categorical imperative across the political spectrum.  In theory, such support might find expression in a determination to protect those troops from abuse, and so translate into wariness about committing soldiers to unnecessary or unnecessarily costly wars.  In practice, however, “supporting the troops” has found expression in an insistence upon providing the Pentagon with open-ended drawing rights on the nation’s treasury, thereby creating massive barriers to any proposal to effect more than symbolic reductions in military spending.

Misremembered History: The duopoly of American politics no longer allows for a principled anti-interventionist position.  Both parties are war parties.  They differ mainly in the rationale they devise to argue for interventionism.  The Republicans tout liberty; the Democrats emphasize human rights.  The results tend to be the same: a penchant for activism that sustains a never-ending demand for high levels of military outlays.

American politics once nourished a lively anti-interventionist tradition.  Leading proponents included luminaries such as George Washington and John Quincy Adams.  That tradition found its basis not in principled pacifism, a position that has never attracted widespread support in this country, but in pragmatic realism.  What happened to that realist tradition?  Simply put, World War II killed it — or at least discredited it.  In the intense and divisive debate that occurred in 1939-1941, the anti-interventionists lost, their cause thereafter tarred with the label “isolationism.” 

The passage of time has transformed World War II from a massive tragedy into a morality tale, one that casts opponents of intervention as blackguards.  Whether explicitly or implicitly, the debate over how the United States should respond to some ostensible threat — Iraq in 2003, Iran today — replays the debate finally ended by the events of December 7, 1941.  To express skepticism about the necessity and prudence of using military power is to invite the charge of being an appeaser or an isolationist.  Few politicians or individuals aspiring to power will risk the consequences of being tagged with that label. 

In this sense, American politics remains stuck in the 1930s — always discovering a new Hitler, always privileging Churchillian rhetoric — even though the circumstances in which we live today bear scant resemblance to that earlier time.  There was only one Hitler and he’s long dead.  As for Churchill, his achievements and legacy are far more mixed than his battalions of defenders are willing to acknowledge.  And if any one figure deserves particular credit for demolishing Hitler’s Reich and winning World War II, it’s Josef Stalin, a dictator as vile and murderous as Hitler himself. 

Until Americans accept these facts, until they come to a more nuanced view of World War II that takes fully into account the political and moral implications of the U.S. alliance with the Soviet Union and the U.S. campaign of obliteration bombing directed against Germany and Japan, the mythic version of “the Good War” will continue to provide glib justifications for continuing to dodge that perennial question: How much is enough?

Like concentric security barriers arrayed around the Pentagon, these four factors — institutional self-interest, strategic inertia, cultural dissonance, and misremembered history — insulate the military budget from serious scrutiny.  For advocates of a militarized approach to policy, they provide invaluable assets, to be defended at all costs. 

Andrew J. Bacevich is professor of history and international relations at Boston University.  His most recent book is Washington Rules:  America’s Path to Permanent War.  To listen to Timothy MacBain’s latest TomCast audio interview in which Bacevich discusses the money that pours into the national security budget, click here or, to download it to your iPod, here.

Copyright 2011 Andrew Bacevich

Cow Most Sacred

In January 1863, President Abraham Lincoln’s charge to a newly-appointed commanding general was simplicity itself: “give us victories.”  President Barack Obama’s tacit charge to his generals amounts to this: give us conditions permitting a dignified withdrawal.  A pithy quote in Bob Woodward’s new book captures the essence of an emerging Obama Doctrine: “hand it off and get out.”

Getting into a war is generally a piece of cake.  Getting out tends to be another matter altogether — especially when the commander-in-chief and his commanders in the field disagree on the advisability of doing so.

Happy Anniversary, America.  Nine years ago today — on October 7, 2001 — a series of U.S. air strikes against targets across Afghanistan launched the opening campaign of what has since become the nation’s longest war.  Three thousand two hundred and eighty-seven days later, the fight to determine Afghanistan’s future continues.  At least in part, “Operation Enduring Freedom” has lived up to its name:  it has certainly proven to be enduring.

As the conflict formerly known as the Global War on Terror enters its tenth year, Americans are entitled to pose this question: When, where, and how will the war end?  Bluntly, are we almost there yet?

Of course, with the passage of time, where “there” is has become increasingly difficult to discern.  Baghdad turned out not to be Berlin and Kandahar is surely not Tokyo.  Don’t look for CNN to be televising a surrender ceremony anytime soon.

This much we know: an enterprise that began in Afghanistan but soon after focused on Iraq has now shifted back — again — to Afghanistan.  Whether the swings of this pendulum signify progress toward some final objective is anyone’s guess.

To measure progress during wartime, Americans once employed pins and maps.  Plotting the conflict triggered by 9/11 will no doubt improve your knowledge of world geography, but it won’t tell you anything about where this war is headed.

Where, then, have nine years of fighting left us?  Chastened, but not necessarily enlightened.

Just over a decade ago, the now-forgotten Kosovo campaign seemingly offered a template for a new American way of war.  It was a decision gained without suffering a single American fatality.  Kosovo turned out, however, to be a one-off event.  No doubt the United States military was then (and remains today) unbeatable in traditional terms.  Yet, after 9/11, Washington committed that military to an endeavor that it manifestly cannot win.

Rather than probing the implications of this fact — relying on the force of arms to eliminate terrorism is a fool’s errand — two administrations have doggedly prolonged the war even as they quietly ratcheted down expectations of what it might accomplish.

In officially ending the U.S. combat role in Iraq earlier this year — a happy day if there ever was one — President Obama refrained from proclaiming “mission accomplished.”  As well he might: as U.S. troops depart Iraq, insurgents remain active and in the field.  Instead of declaring victory, the president simply urged Americans to turn the page.  With remarkable alacrity, most of us seem to have complied.

Perhaps more surprisingly, today’s military leaders have themselves abandoned the notion that winning battles wins wars, once the very foundation of their profession.  Warriors of an earlier day insisted: “There is no substitute for victory.”  Warriors in the Age of David Petraeus embrace an altogether different motto: “There is no military solution.”

Here is Brigadier General H. R. McMaster, one of the Army’s rising stars, summarizing the latest in advanced military thinking:  “Simply fighting and winning a series of interconnected battles in a well developed campaign does not automatically deliver the achievement of war aims.”  Winning as such is out.  Persevering is in.

So an officer corps once intent above all on avoiding protracted wars now specializes in quagmires.  Campaigns don’t really end.  At best, they peter out.

Formerly trained to kill people and break things, American soldiers now attend to winning hearts and minds, while moonlighting in assassination.  The politically correct term for this is “counterinsurgency.”

Now, assigning combat soldiers the task of nation-building in, say, Mesopotamia is akin to hiring a crew of lumberjacks to build a house in suburbia.  What astonishes is not that the result falls short of perfection, but that any part of the job gets done at all.

Yet by simultaneously adopting the practice of “targeted killing,” the home builders do double-duty as home wreckers.  For American assassins, the weapon of choice is not the sniper rifle or the shiv, but missile-carrying pilotless aircraft controlled from bases in Nevada and elsewhere thousands of miles from the battlefield — the ultimate expression of an American desire to wage war without getting our hands dirty.

In practice, however, killing the guilty from afar not infrequently entails killing innocents as well.  So actions undertaken to deplete the ranks of jihadists as far afield as Pakistan, Yemen, and Somalia unwittingly ensure the recruitment of replacements, guaranteeing a never-ending supply of hardened hearts to soften.

No wonder the campaigns launched since 9/11 drag on and on.  General Petraeus himself has spelled out the implications: “This is the kind of fight we’re in for the rest of our lives and probably our kids’ lives.”  Obama may want to “get out.”  His generals are inclined to stay the course.

Taking longer to achieve less than we initially intended is also costing far more than anyone ever imagined.  Back in 2002, White House economic adviser Lawrence Lindsey suggested that invading Iraq might run up a tab of as much as $200 billion — a seemingly astronomical sum.  Although Lindsey soon found himself out of a job as a result, he turned out to be a piker.  The bill for our post-9/11 wars already exceeds a trillion dollars, all of it piled atop our mushrooming national debt.  Thanks in no small measure to Obama’s war policies, the meter is still running.

So are we almost there yet?  Not even.  The truth is we’re lost in the desert, careening down an unmarked road, odometer busted, GPS on the fritz, and fuel gauge hovering just above E.  Washington can only hope that the American people, napping in the backseat, won’t notice.

Andrew J. Bacevich is professor of history and international relations at Boston University.  His bestselling new book is Washington Rules: America’s Path to Permanent War.  To catch Bacevich discussing how the U.S. military became specialists in quagmires in a Timothy MacBain TomCast audio interview click here or, to download it to your iPod, here.

Copyright 2010 Andrew J. Bacevich

The Long War: Year Ten

Once a serious journalist, the Washington Post’s Bob Woodward now makes a very fine living as chief gossip-monger of the governing class.  Early on in his career, along with Carl Bernstein, his partner at the time, Woodward confronted power.  Today, by relentlessly exalting Washington trivia, he flatters power.  His reporting does not inform. It titillates.

A new Woodward book, Obama’s Wars, is a guaranteed blockbuster.  It’s out this week, already causing a stir, and guaranteed to be forgotten the week after dropping off the bestseller lists.  For good reason: when it comes to substance, any book written by Woodward has about as much heft as the latest potboiler penned by the likes of James Patterson or Tom Clancy.

Back in 2002, for example, during the run-up to the invasion of Iraq, Woodward treated us to Bush at War.  Based on interviews with unidentified officials close to President George W. Bush, the book offered a portrait of the president-as-resolute-war-leader that put him in a league with Abraham Lincoln and Franklin Roosevelt.  But the book’s real juice came from what it revealed about events behind the scenes.  “Bush’s war cabinet is riven with feuding,” reported the Times of London, which credited Woodward with revealing “the furious arguments and personal animosity” that divided Bush’s lieutenants.

Of course, the problem with the Bush administration wasn’t that folks on the inside didn’t play nice with one another.  No, the problem was that the president and his inner circle committed a long series of catastrophic errors that produced an unnecessary and grotesquely mismanaged war.  That war has cost the country dearly — although the people who engineered that catastrophe, many of them having pocketed handsome advances on their forthcoming memoirs, continue to manage quite well, thank you.

To judge by the publicity blitzkrieg announcing the arrival of Obama’s Wars in your local bookstore, the big news out of Washington is that, even today, politics there remains an intensely competitive sport, with the participants, whether in anger or frustration, sometimes speaking ill of one another.

Essentially, news reports indicate, Woodward has updated his script from 2002.  The characters have different names, but the plot remains the same.  Talk about jumping the shark.

So we learn that Obama political adviser David Axelrod doesn’t fully trust Secretary of State Hillary Clinton.  National security adviser James Jones, a retired Marine general, doesn’t much care for the likes of Axelrod, and will say so behind his back.  Almost everyone thinks Richard Holbrooke, chief State Department impresario of the AfPak portfolio, is a jerk.  And — stop the presses — when under the influence of alcohol, General David Petraeus, commander of U.S. and allied forces in Afghanistan, is alleged to use the word “f**ked.”  These are the sort of shocking revelations that make you a headliner on the Sunday morning talk shows.

Based on what we have learned so far from those select few provided with advance copies of the book — mostly reporters for the Post and The New York Times who, for whatever reason, seem happy to serve as its shills — Obama’s Wars contains hints of another story, the significance of which seems to have eluded Woodward.

The theme of that story is not whether Dick likes Jane, but whether the Constitution remains an operative document.  The Constitution explicitly assigns to the president the role of commander-in-chief. Responsibility for the direction of American wars rests with him. According to the principle of civilian control, senior military officers advise and execute, but it’s the president who decides.  That’s the theory, at least.  Reality turns out to be considerably different and, to be kind about it, more complicated.

Obama’s Wars reportedly contains this comment by President Obama to Secretary Clinton and Secretary of Defense Robert Gates regarding Afghanistan:  “I’m not doing 10 years… I’m not doing long-term nation-building. I am not spending a trillion dollars.”

Aren’t you, Mr. President?  Don’t be so sure.

Obama’s Wars also affirms what we already suspected about the decision-making process that led up to the president’s announcement at West Point in December 2009 to prolong and escalate the war. Bluntly put, the Pentagon gamed the process to exclude any possibility of Obama rendering a decision not to its liking.

Pick your surge: 20,000 troops? Or 30,000 troops?  Or 40,000 troops?  Only the most powerful man in the world — or Goldilocks contemplating three bowls of porridge — could handle a decision like that.  Even as Obama opted for the middle course, the real decision had already been made elsewhere by others: the war in Afghanistan would expand and continue.

And then there’s this from the estimable General David Petraeus: “I don’t think you win this war,” Woodward quotes the field commander as saying. “I think you keep fighting… This is the kind of fight we’re in for the rest of our lives and probably our kids’ lives.”

Here we confront a series of questions to which Woodward (not to mention the rest of Washington) remains steadfastly oblivious.  Why fight a war that even the general in charge says can’t be won?  What will the perpetuation of this conflict cost?  Who will it benefit?  Does the ostensibly most powerful nation in the world have no choice but to wage permanent war?  Are there no alternatives?  Can Obama shut down an unwinnable war now about to enter its tenth year?  Or is he — along with the rest of us — a prisoner of war?

President Obama has repeatedly stated that in July 2011 a withdrawal of U.S. troops from Afghanistan will commence.  No one quite knows exactly what that means.  Will the withdrawal be symbolic?  General Petraeus has already made it abundantly clear that he will entertain nothing more.  Or will July signal that the Afghan War — and by extension the Global War on Terror launched nine years ago — is finally coming to an end?

Between now and next summer attentive Americans will learn much about how national security policy is actually formulated and who is really in charge.  Just don’t expect Bob Woodward to offer any enlightenment on the subject.

Andrew J. Bacevich is professor of history and international relations at Boston University.  His new book is Washington Rules: America’s Path to Permanent War.

Copyright 2010 Andrew J. Bacevich

Prisoners of War

Worldly ambition inhibits true learning. Ask me. I know. A young man in a hurry is nearly uneducable: He knows what he wants and where he’s headed; when it comes to looking back or entertaining heretical thoughts, he has neither the time nor the inclination. All that counts is that he is going somewhere. Only as ambition wanes does education become a possibility.

My own education did not commence until I had reached middle age. I can fix its start date with precision: for me, education began in Berlin, on a winter’s evening, at the Brandenburg Gate, not long after the Berlin Wall had fallen.

As an officer in the U.S. Army I had spent considerable time in Germany. Until that moment, however, my family and I had never had occasion to visit this most famous of German cities, still littered with artifacts of a deeply repellent history. At the end of a long day of exploration, we found ourselves in what had, until just months before, been the communist East. It was late and we were hungry, but I insisted on walking the length of the Unter den Linden, from the River Spree to the gate itself. A cold rain was falling and the pavement glistened. The buildings lining the avenue, dating from the era of Prussian kings, were dark, dirty, and pitted. Few people were about. It was hardly a night for sightseeing.

For as long as I could remember, the Brandenburg Gate had been the preeminent symbol of the age and Berlin the epicenter of contemporary history. Yet by the time I made it to the once and future German capital, history was already moving on. The Cold War had abruptly ended. A divided city and a divided nation had reunited.

For Americans who had known Berlin only from a distance, the city existed primarily as a metaphor. Pick a date — 1933, 1942, 1945, 1948, 1961, 1989 — and Berlin becomes an instructive symbol of power, depravity, tragedy, defiance, endurance, or vindication. For those inclined to view the past as a chronicle of parables, the modern history of Berlin offered an abundance of material. The greatest of those parables emerged from the events of 1933 to 1945, an epic tale of evil ascendant, belatedly confronted, then heroically overthrown. A second narrative, woven from events during the intense period immediately following World War II, saw hopes for peace dashed, yielding bitter antagonism but also great resolve. The ensuing stand-off — the “long twilight struggle,” in John Kennedy’s memorable phrase — formed the centerpiece of the third parable, its central theme stubborn courage in the face of looming peril. Finally came the exhilarating events of 1989, with freedom ultimately prevailing, not only in Berlin, but throughout Eastern Europe.

What exactly was I looking for at the Brandenburg Gate? Perhaps confirmation that those parables, which I had absorbed and accepted as true, were just that. Whatever I expected, what I actually found was a cluster of shabby-looking young men, not German, hawking badges, medallions, hats, bits of uniforms, and other artifacts of the mighty Red Army. It was all junk, cheaply made and shoddy. For a handful of deutsche marks, I bought a wristwatch emblazoned with the symbol of the Soviet armored corps. Within days, it ceased to work.

Huddling among the scarred columns, those peddlers — almost certainly off-duty Russian soldiers awaiting redeployment home — constituted a subversive presence. They were loose ends of a story that was supposed to have ended neatly when the Berlin Wall came down. As we hurried off to find warmth and a meal, this disconcerting encounter stuck with me, and I began to entertain this possibility: that the truths I had accumulated over the previous twenty years as a professional soldier — especially truths about the Cold War and U.S. foreign policy — might not be entirely true.

By temperament and upbringing, I had always taken comfort in orthodoxy. In a life spent subject to authority, deference had become a deeply ingrained habit. I found assurance in conventional wisdom. Now, I started, however hesitantly, to suspect that orthodoxy might be a sham. I began to appreciate that authentic truth is never simple and that any version of truth handed down from on high — whether by presidents, prime ministers, or archbishops — is inherently suspect. The powerful, I came to see, reveal truth only to the extent that it suits them. Even then, the truths to which they testify come wrapped in a nearly invisible filament of dissembling, deception, and duplicity. The exercise of power necessarily involves manipulation and is antithetical to candor.

I came to these obvious points embarrassingly late in life. “Nothing is so astonishing in education,” the historian Henry Adams once wrote, “as the amount of ignorance it accumulates in the form of inert facts.” Until that moment I had too often confused education with accumulating and cataloging facts. In Berlin, at the foot of the Brandenburg Gate, I began to realize that I had been a naïf. And so, at age 41, I set out, in a halting and haphazard fashion, to acquire a genuine education.

Twenty years later I’ve made only modest progress. What follows is an accounting of what I have learned thus far.

Visiting a Third-World Version of Germany

In October 1990, I’d gotten a preliminary hint that something might be amiss in my prior education. On October 3rd, communist East Germany — formally the German Democratic Republic (GDR) — ceased to exist and German reunification was officially secured. That very week I accompanied a group of American military officers to the city of Jena in what had been the GDR. Our purpose was self-consciously educational — to study the famous battle of Jena-Auerstädt in which Napoleon Bonaparte and his marshals had inflicted an epic defeat on Prussian forces commanded by the Duke of Brunswick. (The outcome of that 1806 battle inspired the philosopher Hegel, then residing in Jena, to declare that the “end of history” was at hand. The conclusion of the Cold War had only recently elicited a similarly exuberant judgment from the American scholar Francis Fukuyama.)

On this trip we did learn a lot about the conduct of that battle, although mainly inert facts possessing little real educational value. Inadvertently, we also gained insight into the reality of life on the far side of what Americans had habitually called the Iron Curtain, known in U.S. military vernacular as “the trace.” In this regard, the trip proved nothing less than revelatory. The educational content of this excursion would — for me — be difficult to exaggerate.

As soon as our bus crossed the old Inner German Border, we entered a time warp. For U.S. troops garrisoned throughout Bavaria and Hesse, West Germany had for decades served as a sort of theme park — a giant Epcot filled with quaint villages, stunning scenery, and superb highways, along with ample supplies of quite decent food, excellent beer, and accommodating women. Now, we found ourselves face-to-face with an altogether different Germany. Although commonly depicted as the most advanced and successful component of the Soviet Empire, East Germany more closely resembled part of the undeveloped world.

The roads — even the main highways — were narrow and visibly crumbling. Traffic posed little problem. Apart from a few sluggish Trabants and Wartburgs — East German automobiles that tended to a retro primitivism — and an occasional exhaust-spewing truck, the way was clear. The villages through which we passed were forlorn and the small farms down at the heels. For lunch we stopped at a roadside stand. The proprietor happily accepted our D-marks, offering us inedible sausages in exchange. Although the signs assured us that we remained in a land of German speakers, it was a country that had not yet recovered from World War II.

Upon arrival in Jena, we checked into the Hotel Schwarzer Bär, identified by our advance party as the best hostelry in town. It turned out to be a rundown fleabag. As the senior officer present, I was privileged to have a room in which the plumbing functioned. Others were not so lucky.

Jena itself was a midsized university city, with its main academic complex immediately opposite our hotel. A very large bust of Karl Marx, mounted on a granite pedestal and badly in need of cleaning, stood on the edge of the campus. Briquettes of soft coal used for home heating made the air all but unbreathable and coated everything with soot. In the German cities we knew, pastels predominated — houses and apartment blocks painted pale green, muted salmon, and soft yellow. Here everything was brown and gray.

That evening we set out in search of dinner. The restaurants within walking distance were few and unattractive. We chose badly, a drab establishment in which fresh vegetables were unavailable and the wurst inferior. The adequacy of the local beer provided the sole consolation.

The following morning, on the way to the battlefield, we noted a significant Soviet military presence, mostly in the form of trucks passing by — to judge by their appearance, designs that dated from the 1950s. To our surprise, we discovered that the Soviets had established a small training area adjacent to where Napoleon had vanquished the Prussians. Although we had orders to avoid contact with any Russians, the presence of their armored troops going through their paces riveted us. Here was something of far greater immediacy than Bonaparte and the Duke of Brunswick: “the other,” about which we had for so long heard so much but knew so little. Through binoculars, we watched a column of Russian armored vehicles — BMPs, in NATO parlance — traversing what appeared to be a drivers’ training course. Suddenly, one of them began spewing smoke. Soon thereafter, it burst into flames.

Here was education, although at the time I had only the vaguest sense of its significance.

An Ambitious Team Player Assailed by Doubts

These visits to Jena and Berlin offered glimpses of a reality radically at odds with my most fundamental assumptions. Uninvited and unexpected, subversive forces had begun to infiltrate my consciousness. Bit by bit, my worldview started to crumble.

That worldview had derived from this conviction: that American power manifested a commitment to global leadership, and that both together expressed and affirmed the nation’s enduring devotion to its founding ideals. That American power, policies, and purpose were bound together in a neat, internally consistent package, each element drawing strength from and reinforcing the others, was something I took as a given. That, during my adult life, a penchant for interventionism had become a signature of U.S. policy did not — to me, at least — in any way contradict America’s aspirations for peace. Instead, a willingness to expend lives and treasure in distant places testified to the seriousness of those aspirations. That, during this same period, the United States had amassed an arsenal of over 31,000 nuclear weapons, some small number of them assigned to units in which I had served, was not at odds with our belief in the inalienable right to life and liberty; rather, threats to life and liberty had compelled the United States to acquire such an arsenal and maintain it in readiness for instant use.

I was not so naïve as to believe that the American record had been without flaws. Yet I assured myself that any errors or misjudgments had been committed in good faith. Furthermore, circumstances permitted little real choice. In Southeast Asia as in Western Europe, in the Persian Gulf as in the Western Hemisphere, the United States had simply done what needed doing. Viable alternatives did not exist. To consent to any dilution of American power would be to forfeit global leadership, thereby putting at risk safety, prosperity, and freedom, not only our own but also that of our friends and allies.

The choices seemed clear enough. On one side was the status quo: the commitments, customs, and habits that defined American globalism, implemented by the national security apparatus within which I functioned as a small cog. On the other side was the prospect of appeasement, isolationism, and catastrophe. The only responsible course was the one to which every president since Harry Truman had adhered.

For me, the Cold War had played a crucial role in sustaining that worldview. Given my age, upbringing, and professional background, it could hardly have been otherwise. Although the great rivalry between the United States and the Soviet Union had contained moments of considerable anxiety — I remember my father, during the Cuban Missile Crisis, stocking our basement with water and canned goods — it served primarily to clarify, not to frighten. The Cold War provided a framework that organized and made sense of contemporary history. It offered a lineup and a scorecard. That there existed bad Germans and good Germans, their Germans and our Germans, totalitarian Germans and Germans who, like Americans, passionately loved freedom was, for example, a proposition I accepted as dogma. Seeing the Cold War as a struggle between good and evil answered many questions, consigned others to the periphery, and rendered still others irrelevant.

Back in the 1960s, during the Vietnam War, more than a few members of my generation had rejected the conception of the Cold War as a Manichean struggle. Here too, I was admittedly a slow learner. Yet having kept the faith long after others had lost theirs, the doubts that eventually assailed me were all the more disorienting.

Granted, occasional suspicions had appeared long before Jena and Berlin. My own Vietnam experience had generated its share, which I had done my best to suppress. I was, after all, a serving soldier. Except in the narrowest of terms, the military profession, in those days at least, did not look kindly on nonconformity. Climbing the ladder of career success required curbing maverick tendencies. To get ahead, you needed to be a team player. Later, when studying the history of U.S. foreign relations in graduate school, I was pelted with challenges to orthodoxy, which I vigorously deflected. When it came to education, graduate school proved a complete waste of time — a period of intense study devoted to the further accumulation of facts, while I exerted myself to ensuring that they remained inert.

Now, however, my personal circumstances were changing. Shortly after the passing of the Cold War, my military career ended. Education thereby became not only a possibility, but also a necessity.

In measured doses, mortification cleanses the soul. It’s the perfect antidote for excessive self-regard. After 23 years spent inside the U.S. Army seemingly going somewhere, I now found myself on the outside going nowhere in particular. In the self-contained and cloistered universe of regimental life, I had briefly risen to the status of minor spear carrier. The instant I took off my uniform, that status vanished. I soon came to a proper appreciation of my own insignificance, a salutary lesson that I ought to have absorbed many years earlier.

As I set out on what eventually became a crablike journey toward a new calling as a teacher and writer — a pilgrimage of sorts — ambition in the commonly accepted meaning of the term ebbed. This did not happen all at once. Yet gradually, trying to grab one of life’s shiny brass rings ceased being a major preoccupation. Wealth, power, and celebrity became not aspirations but subjects for critical analysis. History — especially the familiar narrative of the Cold War — no longer offered answers; instead, it posed perplexing riddles. Easily the most nagging was this one: How could I have so profoundly misjudged the reality of what lay on the far side of the Iron Curtain?

Had I been insufficiently attentive? Or was it possible that I had been snookered all along? Contemplating such questions, while simultaneously witnessing the unfolding of the “long 1990s” — the period bookended by two wars with Iraq when American vainglory reached impressive new heights — prompted the realization that I had grossly misinterpreted the threat posed by America’s adversaries. Yet that was the lesser half of the problem. Far worse than misperceiving “them” was the fact that I had misperceived “us.” What I thought I knew best I actually understood least. Here, the need for education appeared especially acute.

George W. Bush’s decision to launch Operation Iraqi Freedom in 2003 pushed me fully into opposition. Claims that once seemed elementary — above all, claims relating to the essentially benign purposes of American power — now appeared preposterous. The contradictions that found an ostensibly peace-loving nation committing itself to a doctrine of preventive war became too great to ignore. The folly and hubris of the policy makers who heedlessly thrust the nation into an ill-defined and open-ended “global war on terror” without the foggiest notion of what victory would look like, how it would be won, and what it might cost approached standards hitherto achieved only by slightly mad German warlords. During the era of containment, the United States had at least maintained the pretense of a principled strategy; now, the last vestiges of principle gave way to fantasy and opportunism. With that, the worldview to which I had adhered as a young adult and carried into middle age dissolved completely.

Credo and Trinity

What should stand in the place of such discarded convictions? Simply inverting the conventional wisdom, substituting a new Manichean paradigm for the old discredited version — the United States taking the place of the Soviet Union as the source of the world’s evil — would not suffice. Yet arriving at even an approximation of truth would entail subjecting conventional wisdom, both present and past, to sustained and searching scrutiny. Cautiously at first but with growing confidence, this I vowed to do.

Doing so meant shedding habits of conformity acquired over decades. All of my adult life I had been a company man, only dimly aware of the extent to which institutional loyalties induce myopia. Asserting independence required first recognizing the extent to which I had been socialized to accept certain things as unimpeachable. Here then were the preliminary steps essential to making education accessible. Over a period of years, a considerable store of debris had piled up. Now, it all had to go. Belatedly, I learned that more often than not what passes for conventional wisdom is simply wrong. Adopting fashionable attitudes to demonstrate one’s trustworthiness — the world of politics is flush with such people hoping thereby to qualify for inclusion in some inner circle — is akin to engaging in prostitution in exchange for promissory notes. It’s not only demeaning but downright foolhardy.

Washington Rules aims to take stock of conventional wisdom in its most influential and enduring form, namely the package of assumptions, habits, and precepts that have defined the tradition of statecraft to which the United States has adhered since the end of World War II — the era of global dominance now drawing to a close. This postwar tradition combines two components, each one so deeply embedded in the American collective consciousness as to have all but disappeared from view.

The first component specifies norms according to which the international order ought to work and charges the United States with responsibility for enforcing those norms. Call this the American credo. In the simplest terms, the credo summons the United States — and the United States alone — to lead, save, liberate, and ultimately transform the world. In a celebrated manifesto issued at the dawn of what he termed “The American Century,” Henry R. Luce made the case for this spacious conception of global leadership. Writing in Life magazine in early 1941, the influential publisher exhorted his fellow citizens to “accept wholeheartedly our duty to exert upon the world the full impact of our influence for such purposes as we see fit and by such means as we see fit.” Luce thereby captured what remains even today the credo’s essence.

Luce’s concept of an American Century, an age of unquestioned American global primacy, resonated, especially in Washington. His evocative phrase found a permanent place in the lexicon of national politics. (Recall that the neoconservatives who, in the 1990s, lobbied for more militant U.S. policies named their enterprise the Project for a New American Century.) So, too, did Luce’s expansive claim of prerogatives to be exercised by the United States. Even today, whenever public figures allude to America’s responsibility to lead, they signal their fidelity to this creed. Along with respectful allusions to God and “the troops,” adherence to Luce’s credo has become a de facto prerequisite for high office. Question its claims and your prospects of being heard in the hubbub of national politics become nil.

Note, however, that the duty Luce ascribed to Americans has two components. It is not only up to Americans, he wrote, to choose the purposes for which they would bring their influence to bear, but to choose the means as well. Here we confront the second component of the postwar tradition of American statecraft.

With regard to means, that tradition has emphasized activism over example, hard power over soft, and coercion (often styled “negotiating from a position of strength”) over suasion. Above all, the exercise of global leadership as prescribed by the credo obliges the United States to maintain military capabilities staggeringly in excess of those required for self-defense. Prior to World War II, Americans by and large viewed military power and institutions with skepticism, if not outright hostility. In the wake of World War II, that changed. An affinity for military might emerged as central to the American identity.

By the midpoint of the twentieth century, “the Pentagon” had ceased to be merely a gigantic five-sided building. Like “Wall Street” at the end of the nineteenth century, it had become Leviathan, its actions veiled in secrecy, its reach extending around the world. Yet while the concentration of power in Wall Street had once evoked deep fear and suspicion, Americans by and large saw the concentration of power in the Pentagon as benign. Most found it reassuring.

A people who had long seen standing armies as a threat to liberty now came to believe that the preservation of liberty required them to lavish resources on the armed forces. During the Cold War, Americans worried ceaselessly about falling behind the Russians, even though the Pentagon consistently maintained a position of overall primacy. Once the Soviet threat disappeared, mere primacy no longer sufficed. With barely a whisper of national debate, unambiguous and perpetual global military supremacy emerged as an essential predicate to global leadership.

Every great military power has its distinctive signature. For Napoleonic France, it was the levée en masse — the people in arms animated by the ideals of the Revolution. For Great Britain in the heyday of empire, it was command of the seas, sustained by a dominant fleet and a network of far-flung outposts from Gibraltar and the Cape of Good Hope to Singapore and Hong Kong. Germany from the 1860s to the 1940s (and Israel from 1948 to 1973) took another approach, relying on a potent blend of tactical flexibility and operational audacity to achieve battlefield superiority.

The abiding signature of American military power since World War II has been of a different order altogether. The United States has not specialized in any particular type of war. It has not adhered to a fixed tactical style. No single service or weapon has enjoyed consistent favor. At times, the armed forces have relied on citizen-soldiers to fill their ranks; at other times, long-service professionals. Yet an examination of the past 60 years of U.S. military policy and practice does reveal important elements of continuity. Call them the sacred trinity: an abiding conviction that the minimum essentials of international peace and order require the United States to maintain a global military presence, to configure its forces for global power projection, and to counter existing or anticipated threats by relying on a policy of global interventionism.

Together, credo and trinity — the one defining purpose, the other practice — constitute the essence of the way that Washington has attempted to govern and police the American Century. The relationship between the two is symbiotic. The trinity lends plausibility to the credo’s vast claims. For its part, the credo justifies the trinity’s vast requirements and exertions. Together they provide the basis for an enduring consensus that imparts a consistency to U.S. policy regardless of which political party may hold the upper hand or who may be occupying the White House. From the era of Harry Truman to the age of Barack Obama, that consensus has remained intact. It defines the rules to which Washington adheres; it determines the precepts by which Washington rules.

As used here, Washington is less a geographic expression than a set of interlocking institutions headed by people who, whether acting officially or unofficially, are able to put a thumb on the helm of state. Washington, in this sense, includes the upper echelons of the executive, legislative, and judicial branches of the federal government. It encompasses the principal components of the national security state — the departments of Defense, State, and, more recently, Homeland Security, along with various agencies comprising the intelligence and federal law enforcement communities. Its ranks extend to select think tanks and interest groups. Lawyers, lobbyists, fixers, former officials, and retired military officers who still enjoy access are members in good standing. Yet Washington also reaches beyond the Beltway to include big banks and other financial institutions, defense contractors and major corporations, television networks and elite publications like the New York Times, even quasi-academic entities like the Council on Foreign Relations and Harvard’s Kennedy School of Government. With rare exceptions, acceptance of the Washington rules forms a prerequisite for entry into this world.

My purpose in writing Washington Rules is fivefold: first, to trace the origins and evolution of the Washington rules — both the credo that inspires consensus and the trinity in which it finds expression; second, to subject the resulting consensus to critical inspection, showing who wins and who loses and also who foots the bill; third, to explain how the Washington rules are perpetuated, with certain views privileged while others are declared disreputable; fourth, to demonstrate that the rules themselves have lost whatever utility they may once have possessed, with their implications increasingly pernicious and their costs increasingly unaffordable; and finally, to argue for readmitting disreputable (or “radical”) views to our national security debate, in effect legitimating alternatives to the status quo. In short, my aim is to invite readers to share in the process of education on which I embarked two decades ago in Berlin.

The Washington rules were forged at a moment when American influence and power were approaching their acme. That moment has now passed. The United States has drawn down the stores of authority and goodwill it had acquired by 1945. Words uttered in Washington command less respect than once was the case. Americans can ill afford to indulge any longer in dreams of saving the world, much less remaking it in our own image. The curtain is now falling on the American Century.

Similarly, the United States no longer possesses sufficient wherewithal to sustain a national security strategy that relies on global military presence and global power projection to underwrite a policy of global interventionism. Touted as essential to peace, adherence to that strategy has propelled the United States into a condition approximating perpetual war, as the military misadventures of the past decade have demonstrated.

To anyone with eyes to see, the shortcomings inherent in the Washington rules have become plainly evident. Although those most deeply invested in perpetuating its conventions will insist otherwise, the tradition to which Washington remains devoted has begun to unravel. Attempting to prolong its existence might serve Washington’s interests, but it will not serve the interests of the American people.

Devising an alternative to the reigning national security paradigm will pose a daunting challenge — especially if Americans look to “Washington” for fresh thinking. Yet doing so has become essential.

In one sense, the national security policies to which Washington so insistently adheres express what has long been the preferred American approach to engaging the world beyond our borders. That approach plays to America’s presumed strong suit — since World War II, and especially since the end of the Cold War, thought to be military power. In another sense, this reliance on military might creates excuses for the United States to avoid serious engagement: confidence in American arms has made it unnecessary to attend to what others might think or to consider how their aspirations might differ from our own. In this way, the Washington rules reinforce American provincialism — a national trait for which the United States continues to pay dearly.

The persistence of these rules has also provided an excuse to avoid serious self-engagement. From this perspective, confidence that the credo and the trinity will oblige others to accommodate themselves to America’s needs or desires — whether for cheap oil, cheap credit, or cheap consumer goods — has allowed Washington to postpone or ignore problems demanding attention here at home. Fixing Iraq or Afghanistan ends up taking precedence over fixing Cleveland and Detroit. Purporting to support the troops in their crusade to free the world obviates any obligation to assess the implications of how Americans themselves choose to exercise freedom.

When Americans demonstrate a willingness to engage seriously with others, combined with the courage to engage seriously with themselves, then real education just might begin.

Andrew J. Bacevich is a professor of history and international relations at Boston University.  His new book, Washington Rules: America’s Path to Permanent War (Metropolitan Books), has just been published. This essay is its introduction.

Excerpted from Washington Rules: America’s Path to Permanent War, published this month by Metropolitan Books, an imprint of Henry Holt and Company, LLC. Copyright (c) 2010 by Andrew Bacevich. All rights reserved.

Andrew Bacevich, How Washington Rules

“In watching the flow of events over the past decade or so, it is hard to avoid the feeling that something very fundamental has happened in world history.”  This sentiment, introducing the essay that made Francis Fukuyama a household name, commands renewed attention today, albeit from a different perspective.

Developments during the 1980s, above all the winding down of the Cold War, had convinced Fukuyama that the “end of history” was at hand.  “The triumph of the West, of the Western idea,” he wrote in 1989, “is evident… in the total exhaustion of viable systematic alternatives to Western liberalism.”

Today the West no longer looks quite so triumphant.  Yet events during the first decade of the present century have delivered history to another endpoint of sorts.  Although Western liberalism may retain considerable appeal, the Western way of war has run its course.

For Fukuyama, history implied ideological competition, a contest pitting democratic capitalism against fascism and communism.  When he wrote his famous essay, that contest was reaching an apparently definitive conclusion.

Yet from start to finish, military might had determined that competition’s course as much as ideology.  Throughout much of the twentieth century, great powers had vied with one another to create new, or more effective, instruments of coercion.  Military innovation assumed many forms.  Most obviously, there were the weapons: dreadnoughts and aircraft carriers, rockets and missiles, poison gas, and atomic bombs — the list is a long one.  In their effort to gain an edge, however, nations devoted equal attention to other factors: doctrine and organization, training systems and mobilization schemes, intelligence collection and war plans.

All of this furious activity, whether undertaken by France or Great Britain, Russia or Germany, Japan or the United States, derived from a common belief in the plausibility of victory.  Expressed in simplest terms, the Western military tradition could be reduced to this proposition: war remains a viable instrument of statecraft, the accoutrements of modernity serving, if anything, to enhance its utility.

Grand Illusions

That was theory.  Reality, above all the two world wars of the last century, told a decidedly different story.  Armed conflict in the industrial age reached new heights of lethality and destructiveness.  Once begun, wars devoured everything, inflicting staggering material, psychological, and moral damage.  Pain vastly exceeded gain.  In that regard, the war of 1914-1918 became emblematic: even the winners ended up losers.  When fighting eventually stopped, the victors were left not to celebrate but to mourn.  As a consequence, well before Fukuyama penned his essay, faith in war’s problem-solving capacity had begun to erode.  As early as 1945, among several great powers — thanks to war, now great in name only — that faith disappeared altogether.

Among nations classified as liberal democracies, only two resisted this trend.  One was the United States, the sole major belligerent to emerge from the Second World War stronger, richer, and more confident.  The second was Israel, created as a direct consequence of the horrors unleashed by that cataclysm.  By the 1950s, both countries subscribed to this common conviction: national security (and, arguably, national survival) demanded unambiguous military superiority.  In the lexicon of American and Israeli politics, “peace” was a codeword.  The essential prerequisite for peace was for any and all adversaries, real or potential, to accept a condition of permanent inferiority.  In this regard, the two nations — not yet intimate allies — stood apart from the rest of the Western world.

So even as they professed their devotion to peace, civilian and military elites in the United States and Israel prepared obsessively for war.  They saw no contradiction between rhetoric and reality.

Yet belief in the efficacy of military power almost inevitably breeds the temptation to put that power to work.  “Peace through strength” easily enough becomes “peace through war.”  Israel succumbed to this temptation in 1967.  For Israelis, the Six Day War proved a turning point.  Plucky David defeated, and then became, Goliath.  Even as the United States was flailing about in Vietnam, Israel had evidently succeeded in definitively mastering war.

A quarter-century later, U.S. forces seemingly caught up.  In 1991, Operation Desert Storm, George H.W. Bush’s war against Iraqi dictator Saddam Hussein, showed that American troops, like Israeli soldiers, knew how to win quickly, cheaply, and humanely.  Generals like H. Norman Schwarzkopf persuaded themselves that their brief desert campaign against Iraq had replicated — even eclipsed — the battlefield exploits of such famous Israeli warriors as Moshe Dayan and Yitzhak Rabin.  Vietnam faded into irrelevance.

For both Israel and the United States, however, appearances proved deceptive.  Apart from fostering grand illusions, the splendid wars of 1967 and 1991 decided little.  In both cases, victory turned out to be more apparent than real.  Worse, triumphalism fostered massive future miscalculation.

On the Golan Heights, in Gaza, and throughout the West Bank, proponents of a Greater Israel — disregarding Washington’s objections — set out to assert permanent control over territory that Israel had seized.  Yet “facts on the ground” created by successive waves of Jewish settlers did little to enhance Israeli security.  They succeeded chiefly in shackling Israel to a rapidly growing and resentful Palestinian population that it could neither pacify nor assimilate.

In the Persian Gulf, the benefits reaped by the United States after 1991 likewise turned out to be ephemeral.  Saddam Hussein survived and became in the eyes of successive American administrations an imminent threat to regional stability.  This perception prompted (or provided a pretext for) a radical reorientation of strategy in Washington.  No longer content to prevent an unfriendly outside power from controlling the oil-rich Persian Gulf, Washington now sought to dominate the entire Greater Middle East.  Hegemony became the aim.  Yet the United States proved no more successful than Israel in imposing its writ.

During the 1990s, the Pentagon embarked willy-nilly upon what became its own variant of a settlement policy.  Yet U.S. bases dotting the Islamic world and U.S. forces operating in the region proved hardly more welcome than the Israeli settlements dotting the occupied territories and the soldiers of the Israeli Defense Forces (IDF) assigned to protect them.  In both cases, presence provoked (or provided a pretext for) resistance.  Just as Palestinians vented their anger at the Zionists in their midst, radical Islamists targeted Americans whom they regarded as neo-colonial infidels.

Stuck

No one doubted that Israelis (regionally) and Americans (globally) enjoyed unquestioned military dominance.  Throughout Israel’s near abroad, its tanks, fighter-bombers, and warships operated at will.  So, too, did American tanks, fighter-bombers, and warships wherever they were sent.

So what?  Events made it increasingly evident that military dominance did not translate into concrete political advantage.  Rather than enhancing the prospects for peace, coercion produced ever more complications.  No matter how badly battered and beaten, the “terrorists” (a catch-all term applied to anyone resisting Israeli or American authority) weren’t intimidated, remained unrepentant, and kept coming back for more.

Israel ran smack into this problem during Operation Peace for Galilee, its 1982 intervention in Lebanon.  U.S. forces encountered it a decade later during Operation Restore Hope, the West’s gloriously titled foray into Somalia.  Lebanon possessed a puny army; Somalia had none at all.  Rather than producing peace or restoring hope, however, both operations ended in frustration, embarrassment, and failure.

And those operations proved but harbingers of worse to come.  By the 1980s, the IDF’s glory days were past.  Rather than lightning strikes deep into the enemy rear, the narrative of Israeli military history became a cheerless recital of dirty wars — unconventional conflicts against irregular forces yielding problematic results.  The First Intifada (1987-1993), the Second Intifada (2000-2005), a second Lebanon War (2006), and Operation Cast Lead, the notorious 2008-2009 incursion into Gaza, all conformed to this pattern.

Meanwhile, the differential between Palestinian and Jewish Israeli birth rates emerged as a looming threat — a “demographic bomb,” Benjamin Netanyahu called it.  Here were new facts on the ground that military forces, unless employed pursuant to a policy of ethnic cleansing, could do little to redress.  Even as the IDF tried repeatedly and futilely to bludgeon Hamas and Hezbollah into submission, demographic trends continued to suggest that within a generation a majority of the population within Israel and the occupied territories would be Arab.

Trailing a decade or so behind Israel, the United States military nonetheless succeeded in duplicating the IDF’s experience.  Moments of glory remained, but they would prove fleeting indeed.  After 9/11, Washington’s efforts to transform (or “liberate”) the Greater Middle East kicked into high gear.  In Afghanistan and Iraq, George W. Bush’s Global War on Terror began impressively enough, as U.S. forces operated with a speed and élan that had once been an Israeli trademark.  Thanks to “shock and awe,” Kabul fell, followed less than a year and a half later by Baghdad.  As one senior Army general explained to Congress in 2004, the Pentagon had war all figured out:

“We are now able to create decision superiority that is enabled by networked systems, new sensors and command and control capabilities that are producing unprecedented near real time situational awareness, increased information availability, and an ability to deliver precision munitions throughout the breadth and depth of the battlespace… Combined, these capabilities of the future networked force will leverage information dominance, speed and precision, and result in decision superiority.”

The key phrase in this mass of techno-blather was the one that occurred twice: “decision superiority.”  At that moment, the officer corps, like the Bush administration, was still convinced that it knew how to win.

Such claims of success, however, proved obscenely premature.  Campaigns advertised as being wrapped up in weeks dragged on for years, while American troops struggled with their own intifadas.  When it came to achieving decisions that actually stuck, the Pentagon (like the IDF) remained clueless.

Winless

If any overarching conclusion emerges from the Afghan and Iraq Wars (and from their Israeli equivalents), it’s this: victory is a chimera.  Counting on today’s enemy to yield in the face of superior force makes about as much sense as buying lottery tickets to pay the mortgage: you better be really lucky.

Meanwhile, as the U.S. economy went into a tailspin, Americans contemplated their equivalent of Israel’s “demographic bomb” — a “fiscal bomb.”  Ingrained habits of profligacy, both individual and collective, held out the prospect of long-term stagnation: no growth, no jobs, no fun.  Out-of-control spending on endless wars exacerbated that threat.

By 2007, the American officer corps itself gave up on victory, although without giving up on war.  First in Iraq, then in Afghanistan, priorities shifted.  High-ranking generals shelved their expectations of winning — at least as a Rabin or Schwarzkopf would have understood that term.  They sought instead to not lose.  In Washington as in U.S. military command posts, the avoidance of outright defeat emerged as the new gold standard of success.

As a consequence, U.S. troops today sally forth from their base camps not to defeat the enemy, but to “protect the people,” consistent with the latest doctrinal fashion.  Meanwhile, tea-sipping U.S. commanders cut deals with warlords and tribal chieftains in hopes of persuading guerrillas to lay down their arms.

A new conventional wisdom has taken hold, endorsed by everyone from new Afghan War commander General David Petraeus, the most celebrated soldier of this American age, to Barack Obama, commander-in-chief and Nobel Peace Prize laureate.  For the conflicts in which the United States finds itself enmeshed, “military solutions” do not exist.  As Petraeus himself has emphasized, “we can’t kill our way out of” the fix we’re in.  In this way, he also pronounced a eulogy on the Western conception of warfare of the last two centuries.

The Unasked Question

What then are the implications of arriving at the end of Western military history?

In his famous essay, Fukuyama cautioned against thinking that the end of ideological history heralded the arrival of global peace and harmony.  Peoples and nations, he predicted, would still find plenty to squabble about.

With the end of military history, a similar expectation applies.  Politically motivated violence will persist and may in specific instances even retain marginal utility.  Yet the prospect of Big Wars solving Big Problems is probably gone for good.  Certainly, no one in their right mind, Israeli or American, can believe that a continued resort to force will remedy whatever it is that fuels anti-Israeli or anti-American antagonism throughout much of the Islamic world.  To expect persistence to produce something different or better is moonshine.

It remains to be seen whether Israel and the United States can come to terms with the end of military history.  Other nations have long since done so, accommodating themselves to the changing rhythms of international politics.  That they do so is evidence not of virtue, but of shrewdness.  China, for example, shows little eagerness to disarm.  Yet as Beijing expands its reach and influence, it emphasizes trade, investment, and development assistance.  Meanwhile, the People’s Liberation Army stays home.  China has stolen a page from an old American playbook, having become today the preeminent practitioner of “dollar diplomacy.”

The collapse of the Western military tradition confronts Israel with limited choices, none of them attractive.  Given the history of Judaism and the history of Israel itself, a reluctance of Israeli Jews to entrust their safety and security to the good will of their neighbors or the warm regards of the international community is understandable.  In a mere six decades, the Zionist project has produced a vibrant, flourishing state.  Why put all that at risk?  Although the demographic bomb may be ticking, no one really knows how much time remains on the clock.  If Israelis are inclined to continue putting their trust in (American-supplied) Israeli arms while hoping for the best, who can blame them?

In theory, the United States, sharing none of Israel’s demographic or geographic constraints and far more richly endowed, should enjoy far greater freedom of action.  Unfortunately, Washington has a vested interest in preserving the status quo, no matter how much it costs or where it leads.  For the military-industrial complex, there are contracts to win and buckets of money to be made.  For those who dwell in the bowels of the national security state, there are prerogatives to protect.  For elected officials, there are campaign contributors to satisfy.  For appointed officials, civilian and military, there are ambitions to be pursued.

And always there is a chattering claque of militarists, calling for jihad and insisting on ever greater exertions, while remaining alert to any hint of backsliding.  In Washington, members of this militarist camp, by no means coincidentally including many of the voices that most insistently defend Israeli bellicosity, tacitly collaborate in excluding or marginalizing views that they deem heretical.  As a consequence, what passes for debate on matters relating to national security is a sham.  Thus are we invited to believe, for example, that General Petraeus’s appointment as the umpteenth U.S. commander in Afghanistan constitutes a milestone on the way to ultimate success.

Nearly 20 years ago, a querulous Madeleine Albright demanded to know: “What’s the point of having this superb military you’re always talking about if we can’t use it?”  Today, an altogether different question deserves our attention: What’s the point of constantly using our superb military if doing so doesn’t actually work?

Washington’s refusal to pose that question provides a measure of the corruption and dishonesty permeating our politics.

Andrew J. Bacevich is a professor of history and international relations at Boston University.  His new book, Washington Rules: America’s Path to Permanent War, has just been published.

Copyright 2010 Andrew Bacevich

This article was originally posted at TomDispatch.com.

The End of (Military) History?