To judge by the early returns, the presidential race of 2016 is shaping up as the most disheartening in recent memory. Other than as a form of low entertainment, the speeches, debates, campaign events, and slick TV ads already inundating the public sphere offer little of value. Rather than exhibiting the vitality of American democracy, they testify to its hollowness.
Present-day Iranian politics may actually possess considerably more substance than our own. There, the parties involved, whether favoring change or opposing it, understand that the issues at stake have momentous implications. Here, what passes for national politics is a form of exhibitionism about as genuine as pro wrestling.
A presidential election campaign ought to involve more than competing coalitions of interest groups or bevies of investment banks and billionaires vying to install their preferred candidate in the White House. It should engage and educate citizens, illuminating issues and subjecting alternative solutions to careful scrutiny.
That this one won’t even come close we can ascribe as much to the media as to those running for office, something the recent set of “debates” and the accompanying commentary have made painfully clear. With certain honorable exceptions such as NBC’s estimable Lester Holt, representatives of the press are less interested in fulfilling their civic duty than in promoting themselves as active participants in the spectacle. They bait, tease, and strut. Then they subject the candidates’ statements and misstatements to minute deconstruction. The effect is to inflate their own importance while trivializing the proceedings they are purportedly covering.
Above all in the realm of national security, election 2016 promises to be not just a missed opportunity but a complete bust. Recent efforts to exercise what people in Washington like to call “global leadership” have met with many more failures and disappointments than clear-cut successes. So you might imagine that reviewing the scorecard would give the current raft of candidates, Republican and Democratic alike, plenty to talk about.
But if you thought that, you’d be mistaken. Instead of considered discussion of first-order security concerns, the candidates have regularly opted for bluff and bluster, their chief aim being to remove all doubts regarding their hawkish bona fides.
In that regard, nothing tops rhetorically beating up on the so-called Islamic State. So, for example, Hillary Clinton promises to “smash the would-be caliphate,” Jeb Bush to “defeat ISIS for good,” Ted Cruz to “carpet bomb them into oblivion,” and Donald Trump to “bomb the shit out of them.” For his part, having recently acquired a gun as the “last line of defense between ISIS and my family,” Marco Rubio insists that when he becomes president, “The most powerful intelligence agency in the world is going to tell us where [ISIS militants] are; the most powerful military in the world is going to destroy them; and if we capture any of them alive, they are getting a one-way ticket to Guantanamo Bay.”
These carefully scripted lines perform their intended twofold function. First, they elicit applause and certify the candidate as plenty tough. Second, they spare the candidate from having to address matters far more deserving of presidential attention than managing the fight against the Islamic State.
In the hierarchy of challenges facing the United States today, ISIS ranks about on a par with Sicily back in 1943. While liberating that island was a necessary prelude to liberating Europe more generally, the German occupation of Sicily did not pose a direct threat to the Allied cause. So with far weightier matters to attend to — handling Soviet dictator Joseph Stalin and British Prime Minister Winston Churchill, for example — President Franklin Roosevelt wisely left the problem of Sicily to subordinates. FDR thereby demonstrated an aptitude for distinguishing between the genuinely essential and the merely important.
By comparison, today’s crop of presidential candidates either are unable to grasp, cannot articulate, or choose to ignore those matters that should rightfully fall under a commander-in-chief’s purview. Instead, they compete with one another in vowing to liberate the twenty-first-century equivalent of Sicily, as if doing so demonstrates their qualifications for the office.
What sort of national security concerns should be front and center in the current election cycle? While conceding that a reasoned discussion of heavily politicized matters like climate change, immigration, or anything to do with Israel is probably impossible, other issues of demonstrable significance deserve attention. What follows are six of them — by no means an exhaustive list — that I’ve framed as questions a debate moderator might ask of anyone seeking the presidency, along with brief commentaries explaining why neither the posing nor the answering of such questions is likely to happen anytime soon.
1. The War on Terror: Nearly 15 years after this “war” was launched by George W. Bush, why hasn’t “the most powerful military in the world,” “the finest fighting force in the history of the world” won it? Why isn’t victory anywhere in sight?
As if by informal agreement, the candidates and the journalists covering the race have chosen to ignore the military enterprise inaugurated in 2001, initially called the Global War on Terrorism and continuing today without an agreed-upon name. Since 9/11, the United States has invaded, occupied, bombed, raided, or otherwise established a military presence in numerous countries across much of the Islamic world. How are we doing?
Given the resources expended and the lives lost or ruined, not particularly well, it would seem. Intended to promote stability, reduce the incidence of jihadism, and reverse the tide of anti-Americanism among many Muslims, that “war” has done just the opposite. Advance the cause of democracy and human rights? Make that zero-for-four.
Amazingly, this disappointing record has been almost entirely overlooked in the campaign. The reasons why are not difficult to discern. First and foremost, both parties share in the serial failures of U.S. policy in Afghanistan, Iraq, Syria, Libya, and elsewhere in the region. Pinning the entire mess on George W. Bush is no more persuasive than pinning it all on Barack Obama. An intellectually honest accounting would require explanations that look beyond reflexive partisanship. Among the matters deserving critical scrutiny is Washington’s persistent bipartisan belief in military might as an all-purpose problem solver. Not far behind should come questions about simple military competence that no American political figure of note or mainstream media outlet has the gumption to address.
The politically expedient position indulged by the media is to sidestep such concerns in favor of offering endless testimonials to the bravery and virtue of the troops, while calling for yet more of the same or even further escalation. Making a show of supporting the troops takes precedence over serious consideration of what they are continually being asked to do.
2. Nuclear Weapons: Today, more than 70 years after Hiroshima and Nagasaki, what purpose do nukes serve? How many nuclear weapons and delivery systems does the United States actually need?
In an initiative that has attracted remarkably little public attention, the Obama administration has announced plans to modernize and upgrade the U.S. nuclear arsenal. Estimated costs of this program reach as high as $1 trillion over the next three decades. Once finished — probably just in time for the 100th anniversary of Hiroshima — the United States will possess more flexible, precise, survivable, and therefore usable nuclear capabilities than anything hitherto imagined. In effect, the country will have acquired a first-strike capability — even as U.S. officials continue to affirm their earnest hope of removing the scourge of nuclear weapons from the face of the Earth (other powers being the first to disarm, of course).
Whether, in the process, the United States will become more secure or whether there might be far wiser ways to spend that kind of money — shoring up cyber defenses, for example — would seem like questions those who could soon have their finger on the nuclear button might want to consider.
Yet we all know that isn’t going to happen. Having departed from the sphere of politics or strategy, nuclear policy has long since moved into the realm of theology. Much as the Christian faith derives from a belief in a Trinity consisting of the Father, the Son, and the Holy Ghost, so nuclear theology has its own Triad, comprising manned bombers, intercontinental ballistic missiles, and submarine-launched missiles. To question the existence of such a holy threesome constitutes rank heresy. It’s just not done — especially when there’s all that money about to be dropped into the collection plate.
3. Energy Security: Given the availability of abundant oil and natural gas reserves in the Western Hemisphere and the potential future abundance of alternative energy systems, why should the Persian Gulf continue to qualify as a vital U.S. national security interest?
Back in 1980, two factors prompted President Jimmy Carter to announce that the United States viewed the Persian Gulf as worth fighting for. The first was a growing U.S. dependence on foreign oil and a belief that American consumers were guzzling gas at a rate that would rapidly deplete domestic reserves. The second was a concern that, having just invaded Afghanistan, the Soviet Union might next have an appetite for going after those giant gas stations in the Gulf, Iran, or even Saudi Arabia.
Today we know that the Western Hemisphere contains more than ample supplies of oil and natural gas to sustain the American way of life (while also heating up the planet). As for the Soviet Union, it no longer exists — a decade spent chewing on Afghanistan having produced a fatal case of indigestion.
No doubt ensuring U.S. energy security should remain a major priority. Yet in that regard, protecting Canada, Mexico, and Venezuela is far more relevant to the nation’s well-being than protecting Saudi Arabia, Kuwait, and Iraq, while being far easier and cheaper to accomplish. So who will be the first presidential candidate to call for abrogating the Carter Doctrine? Show of hands, please?
4. Assassination: Now that the United States has normalized assassination as an instrument of policy, how well is it working? What are its benefits and costs?
George W. Bush’s administration pioneered the practice of using missile-armed drones as a method of extrajudicial killing. Barack Obama’s administration greatly expanded and routinized the practice.
The technique is clearly “effective” in the narrow sense of liquidating leaders and “lieutenants” of terror groups that policymakers want done away with. What’s less clear is whether the benefits of state-sponsored assassination outweigh the costs, which are considerable. The incidental killing of noncombatants provokes ire directed against the United States and provides terror groups with an excellent recruiting tool. The removal of Mr. Bad Actor from the field adversely affects the organization he leads for no longer than it takes for a successor to emerge. As often as not, the successor turns out to be nastier than Mr. Bad Actor himself.
It would be naïve to expect presidential candidates to interest themselves in the moral implications of assassination as now practiced on a regular basis from the White House. Still, shouldn’t they at least wonder whether it actually works as advertised? And as drone technology proliferates, shouldn’t they also contemplate the prospect of others — say, Russians, Chinese, and Iranians — following America’s lead and turning assassination into a global practice?
5. Europe: Seventy years after World War II and a quarter-century after the Cold War ended, why does European security remain an American responsibility? Given that Europeans are rich enough to defend themselves, why shouldn’t they?
Americans love Europe: old castles, excellent cuisine, and cultural attractions galore. Once upon a time, the parts of Europe that Americans love best needed protection. Devastated by World War II, Western Europe faced in the Soviet Union a threat that it could not handle alone. In a singular act of generosity laced with self-interest, Washington came to the rescue. By forming NATO, the United States committed itself to defend its impoverished and vulnerable European allies. Over time this commitment enabled France, Great Britain, West Germany, and other nearby countries to recover from the global war and become strong, prosperous, and democratic countries.
Today Europe is “whole and free,” incorporating not only most of the former Soviet empire, but even parts of the old Soviet Union itself. In place of the former Soviet threat, there is Vladimir Putin, a bully governing a rickety energy state that, media hype notwithstanding, poses no more than a modest danger to Europe itself. Collectively, the European Union’s economy, at $18 trillion, equals that of the United States and exceeds Russia’s, even in sunnier times, by a factor of nine. Its total population, easily outnumbering our own, is more than triple Russia’s. What these numbers tell us is that Europe is entirely capable of funding and organizing its own defense if it chooses to do so.
It chooses otherwise, in effect opting for something approximating disarmament. As a percentage of the gross domestic product, European nations spend a fraction of what the United States does on defense. When it comes to armaments, they prefer to be free riders and Washington indulges that choice. So even today, seven decades after World War II ended, U.S. forces continue to garrison Europe and America’s obligation to defend 26 countries on the far side of the Atlantic remains intact.
The persistence of this anomalous situation deserves election-year attention for one very important reason. It gets to the question of whether the United States can ever declare mission accomplished. Since the end of World War II, Washington has extended its security umbrella to cover not only Europe, but also virtually all of Latin America and large parts of East Asia. More recently, the Middle East, Central Asia, and now Africa have come in for increased attention. Today, U.S. forces alone maintain an active presence in 147 countries.
Do our troops ever really get to “come home”? The question is more than theoretical in nature. To answer it is to expose the real purpose of American globalism, which means, of course, that none of the candidates will touch it with a 10-foot pole.
6. Debt: Does the national debt constitute a threat to national security? If so, what are some politically plausible ways of reining it in?
Together, the administrations of George W. Bush and Barack Obama can take credit for tripling the national debt since 2000. Well before Election Day this coming November, the total debt, now exceeding the entire gross domestic product, will breach the $19 trillion mark.
In 2010, Admiral Mike Mullen, then chairman of the Joint Chiefs of Staff, described that debt as “the most significant threat to our national security.” Although in doing so he wandered a bit out of his lane, he performed a rare and useful service by drawing a link between long-term security and fiscal responsibility. Ever so briefly, a senior military officer allowed consideration of the national interest to take precedence over the care and feeding of the military-industrial complex. It didn’t last long.
Mullen’s comment garnered a bit of attention, but failed to spur any serious congressional action. Again, we can see why, since Congress functions as an unindicted co-conspirator in the workings of that lucrative collaboration. Returning to anything like a balanced budget would require legislators to make precisely the sorts of choices that they are especially loath to make — cutting military programs that line the pockets of donors and provide jobs for constituents. (Although the F-35 fighter may be one of the most bloated and expensive weapons programs in history, even Democratic Socialist Senator Bernie Sanders has left no stone unturned in lobbying to get those planes stationed in his hometown of Burlington.)
Recently, the role of Congress in authorizing an increase in the debt ceiling has provided Republicans with an excuse for political posturing, laying responsibility for all that red ink entirely at the feet of President Obama — this despite the fact that he has reduced the annual deficit by two-thirds, from $1.3 trillion the year he took office to $439 billion last year.
This much is certain: regardless of who takes the prize in November, the United States will continue to accumulate debt at a non-trivial rate. If a Democrat occupies the White House, Republicans will pretend to care. If our next president is a Republican, they will keep mum. In either case, the approach to national security that does so much to keep the books out of balance will remain intact.
Come to think of it, averting real change might just be the one point on which the candidates generally agree.
Copyright 2016 Andrew J. Bacevich
Out of Bounds, Off-Limits, or Just Plain Ignored
Assume that the hawks get their way — that the United States does whatever it takes militarily to confront and destroy ISIS. Then what?
Answering that question requires taking seriously the outcomes of other recent U.S. interventions in the Greater Middle East. In 1991, when the first President Bush ejected Saddam Hussein’s army from Kuwait, Americans rejoiced, believing that they had won a decisive victory. A decade later, the younger Bush seemingly outdid his father by toppling the Taliban in Afghanistan and then making short work of Saddam himself — a liberation twofer achieved in less time than it takes Americans to choose a president. After the passage of another decade, Barack Obama got into the liberation act, overthrowing the Libyan dictator Muammar Gaddafi in what appeared to be a tidy air intervention with a clean outcome. As Secretary of State Hillary Clinton memorably put it, “We came, we saw, he died.” End of story.
In fact, subsequent events in each case mocked early claims of success or outright victory. Unanticipated consequences and complications abounded. “Liberation” turned out to be a prelude to chronic violence and upheaval.
Indeed, the very existence of the Islamic State (ISIS) today renders a definitive verdict on the Iraq wars over which the Presidents Bush presided, each abetted by a Democratic successor. A de facto collaboration of four successive administrations succeeded in reducing Iraq to what it is today: a dysfunctional quasi-state unable to control its borders or territory while serving as a magnet and inspiration for terrorists.
The United States bears a profound moral responsibility for having made such a hash of things there. Were it not for the reckless American decision to invade and occupy a nation that, whatever its crimes, had nothing to do with 9/11, the Islamic State would not exist. Per the famous Pottery Barn Rule attributed to former Secretary of State Colin Powell, having smashed Iraq to bits a decade ago, we can now hardly deny owning ISIS.
That the United States possesses sufficient military power to make short work of that “caliphate” is also the case. True, in both Syria and Iraq the Islamic State has demonstrated a disturbing ability to capture and hold large stretches of desert, along with several population centers. It has, however, achieved these successes against poorly motivated local forces of, at best, indifferent quality.
In that regard, the glibly bellicose editor of the Weekly Standard, William Kristol, is surely correct in suggesting that a well-armed contingent of 50,000 U.S. troops, supported by ample quantities of air power, would make mincemeat of ISIS in a toe-to-toe contest. Liberation of the various ISIS strongholds like Fallujah and Mosul in Iraq and Palmyra and Raqqa, its “capital,” in Syria would undoubtedly follow in short order.
In the wake of the recent attacks in Paris, the American mood is strongly trending in favor of this sort of escalation. Just about anyone who is anyone — the current occupant of the Oval Office partially excepted — favors intensifying the U.S. military campaign against ISIS. And why not? What could possibly go wrong? As Kristol puts it, “I don’t think there’s much in the way of unanticipated side effects that are going to be bad there.”
It’s an alluring prospect. In the face of a sustained assault by the greatest military the world has ever seen, ISIS foolishly (and therefore improbably) chooses to make an Alamo-like stand. Whammo! We win. They lose. Mission accomplished.
Of course, that phrase recalls the euphoric early reactions to Operations Desert Storm in 1991, Enduring Freedom in 2001, Iraqi Freedom in 2003, and Odyssey Dawn, the Libyan intervention of 2011. Time and again the unanticipated side effects of U.S. military action turned out to be very bad indeed. In Kabul, Baghdad, or Tripoli, the Alamo fell, but the enemy dispersed or reinvented itself and the conflict continued. Assurances offered by Kristol that this time things will surely be different deserve to be taken with more than a grain of salt. Pass the whole shaker.
Embracing Generational War
Why this repeated disparity between perceived and actual outcomes? Why have apparent battlefield successes led so regularly to more violence and disorder? Before following Kristol’s counsel, Americans would do well to reflect on these questions.
Cue Professor Eliot A. Cohen. Shortly after 9/11, Cohen, one of this country’s preeminent military thinkers, characterized the conflict on which the United States was then embarking as “World War IV.” (In this formulation, the Cold War becomes World War III.) Other than in certain neoconservative quarters, the depiction did not catch on. Yet nearly a decade-and-a-half later, the Johns Hopkins professor and former State Department official is sticking to his guns. In an essay penned for the American Interest following the recent Paris attacks, he returns to his theme. “It was World War IV in 2001,” Cohen insists. “It is World War IV today.” And to our considerable benefit he spells out at least some of the implications of casting the conflict in such expansive and evocative terms.
Now I happen to think that equating our present predicament in the Islamic world with the immensely destructive conflicts of the prior century is dead wrong. Yet it’s a proposition that Americans at this juncture should contemplate with the utmost seriousness.
In the United States today, confusion about what war itself signifies is widespread. Through misuse, misapplication, and above all misremembering, we have distorted the term almost beyond recognition. As one consequence, talk of war comes too easily off the tongues of the unknowing.
Not so with Cohen. When it comes to war, he has no illusions. Addressing that subject, he illuminates it, enabling us to see what war entails. So in advocating World War IV, he performs a great service, even if perhaps not the one he intends.
What will distinguish the war that Cohen deems essential? “Begin with endurance,” he writes. “This war will probably go on for the rest of my life, and well into my children’s.” Although American political leaders seem reluctant “to explain just how high the stakes are,” Cohen lays them out in direct, unvarnished language. At issue, he insists, is the American way of life itself, not simply “in the sense of rock concerts and alcohol in restaurants, but the more fundamental rights of freedom of speech and religion, the equality of women, and, most essentially, the freedom from fear and freedom to think.”
With so much on the line, Cohen derides the Obama administration’s tendency to rely on “therapeutic bombing, which will temporarily relieve the itch, but leave the wounds suppurating.” The time for such half-measures has long since passed. Defeating the Islamic State and “kindred movements” will require the U.S. to “kill a great many people.” To that end Washington needs “a long-range plan not to ‘contain’ but to crush” the enemy. Even with such a plan, victory will be a long way off and will require “a long, bloody, and costly process.”
Cohen’s candor and specificity, as bracing as they are rare, should command our respect. If World War IV describes what we are in for, then eliminating ISIS might figure as a near-term imperative, but it can hardly define the endgame. Beyond ISIS loom all those continually evolving “kindred movements” to which the United States will have to attend before it can declare the war itself well and truly won.
To send just tens of thousands of U.S. troops to clean up Syria and Iraq, as William Kristol and others propose, offers at best a recipe for winning a single campaign. Winning the larger war would involve far more arduous exertions. This Cohen understands, accepts, and urges others to acknowledge.
And here we come to the heart of the matter. For at least the past 35 years — that is, since well before 9/11 — the United States has been “at war” in various quarters of the Islamic world. At no point has it demonstrated the will or the ability to finish the job. Washington’s approach has been akin to treating cancer with a little bit of chemo one year and a one-shot course of radiation the next. Such gross malpractice aptly describes U.S. military policy throughout the Greater Middle East across several decades.
While there may be many reasons why the Iraq War of 2003 to 2011 and the still longer Afghanistan War yielded such disappointing results, Washington’s timidity in conducting those campaigns deserves pride of place. That most Americans might bridle at the term “timidity” reflects the extent to which they have deluded themselves regarding the reality of war.
In comparison to Vietnam, for example, Washington’s approach to waging its two principal post-9/11 campaigns was positively half-hearted. With the nation as a whole adhering to peacetime routines, Washington neither sent enough troops nor stayed anywhere near long enough to finish the job. Yes, we killed many tens of thousands of Iraqis and Afghans, but if winning World War IV requires, as Cohen writes, that we “break the back” of the enemy, then we obviously didn’t kill nearly enough.
Nor were Americans sufficiently willing to die for the cause. In South Vietnam, 58,000 G.I.s died in a futile effort to enable that country to survive. In Iraq and Afghanistan, where the stakes were presumably much higher, we pulled the plug after fewer than 7,000 deaths.
Americans would be foolish to listen to those like William Kristol who, even today, peddle illusions about war being neat and easy. They would do well instead to heed Cohen, who knows that war is hard and ugly.
What Would World War IV Look Like?
Yet when specifying the practical implications of generational war, Cohen is less forthcoming. From his perspective, this fourth iteration of existential armed conflict in a single century is not going well. But apart from greater resolve and bloody-mindedness, what will it take to get things on the right track?
As a thought experiment, let’s answer that question by treating it with the urgency that Cohen believes it deserves. After 9/11, certain U.S. officials thundered about “taking the gloves off.” In practice, however, with the notable exception of policies permitting torture and imprisonment without due process, the gloves stayed on. Take Cohen’s conception of World War IV at face value and that will have to change.
For starters, the country would have to move to something like a war footing, enabling Washington to raise a lot more troops and spend a lot more money over a very long period of time. Although long since banished from the nation’s political lexicon, the M-word — mobilization — would make a comeback. Prosecuting a generational war, after all, is going to require the commitment of generations.
Furthermore, if winning World War IV means crushing the enemy, as Cohen emphasizes, then ensuring that the enemy, once crushed, cannot recover would be hardly less important. And that requirement would prohibit U.S. forces from simply walking away from a particular fight even — or especially — when it might appear won.
At the present moment, defeating the Islamic State ranks as Washington’s number one priority. With the Pentagon already claiming a body count of 20,000 ISIS fighters without notable effect, this campaign won’t end anytime soon. But even assuming an eventually positive outcome, the task of maintaining order and stability in areas that ISIS now controls will remain. Indeed, that task will persist until the conditions giving rise to entities like ISIS are eliminated. Don’t expect French President François Hollande or British Prime Minister David Cameron to sign up for that thankless job. U.S. forces will own it. Packing up and leaving the scene won’t be an option.
How long would those forces have to stay? Extrapolating from recent U.S. occupations in Iraq and Afghanistan, something on the order of a quarter-century seems like a plausible approximation. So should our 45th president opt for a boots-on-the-ground solution to ISIS, as might well be the case, the privilege of welcoming the troops home could belong to the 48th or 49th occupant of the White House.
In the meantime, U.S. forces would have to deal with the various and sundry “kindred movements” that are already cropping up like crabgrass in country after country. Afghanistan — still? again? — would head the list of places requiring U.S. military attention. But other prospective locales would include such hotbeds of Islamist activity as Lebanon, Libya, Palestine, Somalia, and Yemen, along with several West African countries increasingly beset with insurgencies. Unless Egyptian, Pakistani, and Saudi security forces demonstrate the ability (not to mention the will) to suppress the violent radicals in their midst, one or more of those countries could also become the scene of significant U.S. military action.
Effective prosecution of World War IV, in other words, would require the Pentagon to plan for each of these contingencies, while mustering the assets needed for implementation. Allies might kick in token assistance — tokenism is all they have to offer — but the United States will necessarily carry most of the load.
What Would World War IV Cost?
During World War III (aka the Cold War), the Pentagon maintained a force structure ostensibly adequate to the simultaneous prosecution of two and a half wars. This meant having the wherewithal to defend Europe and the Pacific from communist aggression while still leaving something for the unexpected. World War IV campaigns are unlikely to entail anything on the scale of the Warsaw Pact attacking Western Europe or North Korea invading the South. Still, the range of plausible scenarios will require that U.S. forces be able to take on militant organizations C and D even while guarding against the resurgence of organizations A and B in altogether different geographic locations.
Even though Washington may try whenever possible to avoid large-scale ground combat, relying on air power (including drones) and elite Special Operations forces to do the actual killing, post-conflict pacification promises to be a manpower-intensive activity. Certainly, this ranks as one of the most obvious lessons to emerge from World War IV’s preliminary phases: when the initial fight ends, the real work begins.
U.S. forces committed to asserting control over Iraq after the invasion of 2003 topped out at roughly 180,000. In Afghanistan, during the Obama presidency, the presence peaked at 110,000. In a historical context, these are not especially large numbers. At the height of the Vietnam War, for example, U.S. troop strength in Southeast Asia exceeded 500,000.
In hindsight, the Army general who, before the invasion of 2003, publicly suggested that pacifying postwar Iraq would require “several hundred thousand troops” had it right. A similar estimate applies to Afghanistan. In other words, those two occupations together could easily have absorbed 600,000 to 800,000 troops on an ongoing basis. Given the Pentagon’s standard three-to-one rotation policy, which assumes that for every unit in-country, a second is just back, and a third is preparing to deploy, you’re talking about a minimum requirement of between 1.8 and 2.4 million troops to sustain just two medium-sized campaigns — a figure that wouldn’t include some number of additional troops kept in reserve for the unexpected.
In other words, waging World War IV would require at least a five-fold increase in the current size of the U.S. Army — and not as an emergency measure but a permanent one. Such numbers may appear large, but as Cohen would be the first to point out, they are actually modest when compared to previous world wars. In 1968, in the middle of World War III, the Army had more than 1.5 million active duty soldiers on its rolls — this at a time when the total American population was less than two-thirds what it is today and when gender discrimination largely excluded women from military service. If it chose to do so, the United States today could easily field an army of two million or more soldiers.
Whether it could also retain the current model of an all-volunteer force is another matter. Recruiters would certainly face considerable challenges, even if Congress enhanced the material inducements for service, which since 9/11 have already included a succession of generous increases in military pay. A loosening of immigration policy, granting a few hundred thousand foreigners citizenship in return for successfully completing a term of enlistment, might help. In all likelihood, however, as with all three previous world wars, waging World War IV would oblige the United States to revive the draft, a prospect as likely to be well-received as a flood of brown and black immigrant enlistees. In short, going all out to create the forces needed to win World War IV would confront Americans with uncomfortable choices.
The budgetary implications of expanding U.S. forces while conducting a perpetual round of what the Pentagon calls “overseas contingency operations” would also loom large. Precisely how much money an essentially global conflict projected to extend well into the latter half of the century would require is difficult to gauge. As a starting point, given the increased number of active duty forces, tripling the present Defense Department budget of more than $600 billion might serve as a reasonable guess.
At first glance, $1.8 trillion annually is a stupefyingly large figure. To make it somewhat more palatable, a proponent of World War IV might put that number in historical perspective. During the first phases of World War III, for example, the United States routinely allocated 10% or more of total gross domestic product (GDP) for national security. With that GDP today exceeding $17 trillion, apportioning 10% to the Pentagon would give those charged with managing World War IV a nice sum to work with and no doubt to build upon.
Of course, that money would have to come from somewhere. For several years during the last decade, sustaining wars in Iraq and Afghanistan pushed the federal deficit above a trillion dollars. As one consequence, the total national debt now exceeds annual GDP, having tripled since 9/11. How much additional debt the United States can accrue without doing permanent damage to the economy is a question of more than academic interest.
To avoid having World War IV produce an endless string of unacceptably large deficits, ratcheting up military spending would undoubtedly require either substantial tax increases or significant cuts in non-military spending, including big-ticket programs like Medicare and Social Security — precisely those, that is, which members of the middle class hold most dear.
In other words, funding World War IV while maintaining a semblance of fiscal responsibility would entail the kind of trade-offs that political leaders are loath to make. Today, neither party appears up to taking on such challenges. That the demands of waging protracted war will persuade them to rise above their partisan differences seems unlikely. It sure hasn’t so far.
The Folly of World War IV
In his essay, Cohen writes, “we need to stop the circumlocutions.” Of those who would bear the direct burden of his world war, he says, “we must start telling them the truth.” He’s right, even if he himself is largely silent about what the conduct of World War IV is likely to exact from the average citizen.
As the United States enters a presidential election year, plain talk about the prospects of our ongoing military engagement in the Islamic world should be the order of the day. The pretense that either dropping a few more bombs or invading one or two more countries will yield a conclusive outcome amounts to more than an evasion. It is an outright lie.
As Cohen knows, winning World War IV would require dropping many, many more bombs and invading, and then occupying for years to come, many more countries. After all, it’s not just ISIS that Washington will have to deal with, but also its affiliates, offshoots, wannabes, and the successors almost surely waiting in the wings. And don’t forget al-Qaeda.
Cohen believes that we have no alternative. Either we get serious about fighting World War IV the way it needs to be fought or darkness will envelop the land. He is undeterred by the evidence that the more deeply we insert our soldiers into the Greater Middle East the more concerted the resistance they face; that the more militants we kill the more we seem to create; that the inevitable, if unintended, killing of innocents only serves to strengthen the hand of the extremists. As he sees it, with everything we believe in riding on the outcome, we have no choice but to press on.
While listening carefully to Cohen’s call to arms, Americans should reflect on its implications. Wars change countries and people. Embracing his prescription for World War IV would change the United States in fundamental ways. It would radically expand the scope and reach of the national security state, which, of course, includes agencies beyond the military itself. It would divert vast quantities of wealth to nonproductive purposes. It would make the militarization of the American way of life, a legacy of prior world wars, irreversible. By sowing fear and fostering impossible expectations of perfect security, it would also compromise American freedom in the name of protecting it. The nation that decades from now might celebrate VT Day — victory over terrorism — will have become a different place, materially, politically, culturally, and morally.
In my view, Cohen’s World War IV is an invitation to collective suicide. Arguing that no alternative exists to open-ended war represents not hard-nosed realism, but the abdication of statecraft. Yet here’s the ultimate irony: even without the name, the United States has already embarked upon something akin to a world war, which now extends into the far reaches of the Islamic world and spreads further year by year.
Incrementally, bit by bit, this nameless war has already expanded the scope and reach of the national security apparatus. It is diverting vast quantities of wealth to nonproductive purposes even as it normalizes the continuing militarization of the American way of life. By sowing fear and fostering impossible expectations of perfect security, it is undermining American freedom in the name of protecting it, and doing so right before our eyes.
Cohen rightly decries the rudderless character of the policies that have guided the (mis)conduct of that war thus far. For that critique we owe him a considerable debt. But the real problem is the war itself and the conviction that only through war can America remain America.
For a rich and powerful nation to conclude that it has no choice but to engage in quasi-permanent armed conflict in the far reaches of the planet represents the height of folly. Power confers choice. As citizens, we must resist with all our might arguments that deny the existence of choice. Whether advanced forthrightly by Cohen or fecklessly by the militarily ignorant, such claims will only perpetuate the folly that has already lasted far too long.
Andrew J. Bacevich, a TomDispatch regular, is professor emeritus of history and international relations at Boston University. He is the author of Breach of Trust: How Americans Failed Their Soldiers and Their Country, among other works. His new book, America’s War for the Greater Middle East (Random House), is due out in April 2016.
Copyright 2015 Andrew J. Bacevich
First came Fallujah, then Mosul, and later Ramadi in Iraq. Now, there is Kunduz, a provincial capital in northern Afghanistan. In all four places, the same story has played out: in cities that newspaper reporters like to call “strategically important,” security forces trained and equipped by the U.S. military at great expense simply folded, abandoning their posts (and much of their U.S.-supplied weaponry) without even mounting serious resistance. Called upon to fight, they fled. In each case, the defending forces gave way before substantially outnumbered attackers, making the outcomes all the more ignominious.
Together, these setbacks have rendered a verdict on the now more-or-less nameless Global War on Terrorism (GWOT). Successive blitzkriegs by ISIS and the Taliban respectively did more than simply breach Iraqi and Afghan defenses. They also punched gaping holes in the strategy to which the United States had reverted in hopes of stemming the further erosion of its position in the Greater Middle East.
Recall that, when the United States launched its GWOT soon after 9/11, it did so pursuant to a grandiose agenda. U.S. forces were going to imprint onto others a specific and exalted set of values. During President George W. Bush’s first term, this “freedom agenda” formed the foundation, or at least the rationale, for U.S. policy.
The shooting would stop, Bush vowed, only when countries like Afghanistan had ceased to harbor anti-American terrorists and countries like Iraq had ceased to encourage them. Achieving this goal meant that the inhabitants of those countries would have to change. Afghans and Iraqis, followed in due course by Syrians, Libyans, Iranians, and sundry others would embrace democracy, respect human rights, and abide by the rule of law, or else. Through the concerted application of American power, they would become different — more like us and therefore more inclined to get along with us. A bit less Mecca and Medina, a bit more “we hold these truths” and “of the people, by the people.”
So Bush and others in his inner circle professed to believe. At least some of them, probably including Bush himself, may actually have done so.
History, at least the bits and pieces to which Americans attend, seemed to endow such expectations with a modicum of plausibility. Had not such a transfer of values occurred after World War II when the defeated Axis Powers had hastily thrown in with the winning side? Had it not recurred as the Cold War was winding down, when previously committed communists succumbed to the allure of consumer goods and quarterly profit statements?
If the appropriate mix of coaching and coercion were administered, Afghans and Iraqis, too, would surely take the path once followed by good Germans and nimble Japanese, and subsequently by Czechs tired of repression and Chinese tired of want. Once liberated, grateful Afghans and Iraqis would align themselves with a conception of modernity that the United States had pioneered and now exemplified. For this transformation to occur, however, the accumulated debris of retrograde social conventions and political arrangements that had long retarded progress would have to be cleared away. This was what the invasions of Afghanistan (Operation Enduring Freedom!) and Iraq (Operation Iraqi Freedom!) were meant to accomplish in one fell swoop by a military the likes of which had (to hear Washington tell it) never been seen in history. POW!
Standing Them Up As We Stand Down
Concealed within that oft-cited “freedom” — the all-purpose justification for deploying American power — were several shades of meaning. The term, in fact, requires decoding. Yet within the upper reaches of the American national security apparatus, one definition takes precedence over all others. In Washington, freedom has become a euphemism for dominion. Spreading freedom means positioning the United States to call the shots. Seen in this context, Washington’s expected victories in both Afghanistan and Iraq were meant to affirm and broaden its preeminence by incorporating large parts of the Islamic world into the American imperium. They would benefit, of course, but to an even greater extent, so would we.
Alas, liberating Afghans and Iraqis turned out to be a tad more complicated than the architects of Bush’s freedom (or dominion) agenda anticipated. Well before Barack Obama succeeded Bush in January 2009, few observers — apart from a handful of ideologues and militarists — clung to the fairy tale of U.S. military might whipping the Greater Middle East into shape. Brutally but efficiently, war had educated the educable. As for the uneducable, they persisted in taking their cues from Fox News and the Weekly Standard.
Yet if the strategy of transformation via invasion and “nation building” had failed, there was a fallback position that seemed to be dictated by the logic of events. Together, Bush and Obama would lower expectations as to what the United States was going to achieve, even as they imposed new demands on the U.S. military, America’s go-to outfit in foreign policy, to get on with the job.
Rather than midwifing fundamental political and cultural change, the Pentagon was instead ordered to ramp up its already gargantuan efforts to create local militaries (and police forces) capable of maintaining order and national unity. President Bush provided a concise formulation of the new strategy: “As the Iraqis stand up, we will stand down.” Under Obama, after his own stab at a “surge,” the dictum applied to Afghanistan as well. Nation-building had flopped. Building armies and police forces able to keep a lid on things now became the prevailing definition of success.
The United States had, of course, attempted this approach once before, with unhappy results. This was in Vietnam. There, efforts to destroy North Vietnamese and Viet Cong forces intent on unifying their divided country had exhausted both the U.S. military and the patience of the American people. Responding to the logic of events, Presidents Lyndon Johnson and Richard Nixon tacitly agreed upon a fallback position. As the prospects of American forces successfully eliminating threats to South Vietnamese security faded, the training and equipping of the South Vietnamese to defend themselves became priority number one.
Dubbed “Vietnamization,” this enterprise ended in abject failure with the fall of Saigon in 1975. Yet that failure raised important questions to which members of the national security elite might have attended: Given a weak state with dubious legitimacy, how feasible is it to expect outsiders to invest indigenous forces with genuine fighting power? How do differences in culture or history or religion affect the prospects for doing so? Can skill ever make up for a deficit of will? Can hardware replace cohesion? Above all, if tasked with giving some version of Vietnamization another go, what did U.S. forces need to do differently to ensure a different result?
At the time, with general officers and civilian officials more inclined to forget Vietnam than contemplate its implications, these questions attracted little attention. Instead, military professionals devoted themselves to gearing up for the next fight, which they resolved would be different. No more Vietnams — and therefore no more Vietnamization.
After the Gulf War of 1991, basking in the ostensible success of Operation Desert Storm, the officer corps persuaded itself that it had once and for all banished its Vietnam-induced bad memories. As Commander-in-Chief George H.W. Bush so memorably put it, “By God, we’ve kicked the Vietnam syndrome once and for all.”
In short, the Pentagon now had war figured out. Victory had become a foregone conclusion. As it happened, this self-congratulatory evaluation left U.S. troops ill-prepared for the difficulties awaiting them after 9/11 when interventions in Afghanistan and Iraq departed from the expected script, which posited short wars by a force beyond compare ending in decisive victories. What the troops got were two very long wars with no decision whatsoever. It was Vietnam on a smaller scale all over again — times two.
For Bush in Iraq and Obama after a brief, half-hearted flirtation with counterinsurgency in Afghanistan, opting for a variant of Vietnamization proved to be a no-brainer. Doing so offered the prospect of an escape from all complexities. True enough, Plan A — we export freedom and democracy — had fallen short. But Plan B — they (with our help) restore some semblance of stability — could enable Washington to salvage at least partial success in both places. With the bar suitably lowered, a version of “Mission Accomplished” might still be within reach.
If Plan A had looked to U.S. troops to vanquish their adversaries outright, Plan B focused on prepping besieged allies to take over the fight. Winning outright was no longer the aim — given the inability of U.S. forces to do so, this was self-evidently not in the cards — but holding the enemy at bay was.
Although allied with the United States, only in the loosest sense did either Iraq or Afghanistan qualify as a nation-state. Only nominally and intermittently did governments in Baghdad and Kabul exercise a writ of authority commanding respect from the people known as Iraqis and Afghans. Yet in the Washington of George Bush and Barack Obama, a willing suspension of disbelief became the basis for policy. In distant lands where the concept of nationhood barely existed, the Pentagon set out to create a full-fledged national security apparatus capable of defending that aspiration as if it represented reality. From day one, this was a faith-based undertaking.
As with any Pentagon project undertaken on a crash basis, this one consumed resources on a gargantuan scale — $25 billion in Iraq and an even more staggering $65 billion in Afghanistan. “Standing up” the requisite forces involved the transfer of vast quantities of equipment and the creation of elaborate U.S. training missions. Iraqi and Afghan forces acquired all the paraphernalia of modern war — attack aircraft or helicopters, artillery and armored vehicles, night vision devices and drones. Needless to say, stateside defense contractors lined up in droves to cash in.
Based on their performance, the security forces on which the Pentagon has lavished years of attention remain visibly not up to the job. Meanwhile, ISIS warriors, without the benefit of expensive third-party mentoring, appear plenty willing to fight and die for their cause. Ditto Taliban fighters in Afghanistan. The beneficiaries of U.S. assistance? Not so much. Based on partial but considerable returns, Vietnamization 2.0 seems to be tracing a trajectory eerily reminiscent of Vietnamization 1.0. Meanwhile, the questions that ought to have been addressed back when our South Vietnamese ally went down to defeat have returned with a vengeance.
The most important of those questions challenges the assumption that has informed U.S. policy in the Greater Middle East since the freedom agenda went south: that Washington has a particular knack for organizing, training, equipping, and motivating foreign armies. Based on the evidence piling up before our eyes, that assumption appears largely false. On this score, retired Lieutenant General Karl Eikenberry, a former military commander and U.S. ambassador in Afghanistan, has rendered an authoritative judgment. “Our track record at building [foreign] security forces over the past 15 years is miserable,” he recently told the New York Times. Just so.
Fighting the Wrong War
Some might argue that trying harder, investing more billions, sending yet more equipment for perhaps another 15 years will produce more favorable results. But this is akin to believing that, given sufficient time, the fruits of capitalism will ultimately trickle down to benefit the least among us or that the march of technology holds the key to maximizing human happiness. You can believe it if you want, but it’s a mug’s game.
Indeed, the United States would be better served if policymakers abandoned the pretense that the Pentagon possesses any gift whatsoever for “standing up” foreign military forces. Prudence might actually counsel that Washington assume instead, when it comes to organizing, training, equipping, and motivating foreign armies, that the United States is essentially clueless.
Exceptions may exist. For example, U.S. efforts have probably helped boost the fighting power of the Kurdish peshmerga. Yet such exceptions are rare enough to prove the rule. Keep in mind that before American trainers and equipment ever showed up, Iraq’s Kurds already possessed the essential attributes of nationhood. Unlike Afghans and Iraqis, Kurds do not require tutoring in the imperative of collective self-defense.
What are the policy implications of giving up the illusion that the Pentagon knows how to build foreign armies? The largest is this: subletting war no longer figures as a plausible alternative to waging it directly. So where U.S. interests require that fighting be done, like it or not, we’re going to have to do that fighting ourselves. By extension, in circumstances where U.S. forces are demonstrably incapable of winning or where Americans balk at any further expenditure of American blood — today in the Greater Middle East both of these conditions apply — then perhaps we shouldn’t be there. To pretend otherwise is to throw good money after bad or, as a famous American general once put it, to wage (even if indirectly) “the wrong war, at the wrong place, at the wrong time, and with the wrong enemy.” This we have been doing now for several decades across much of the Islamic world.
In American politics, we await the officeholder or candidate willing to state the obvious and confront its implications.
On Building Armies (and Watching Them Fail)
There is a peculiar form of insanity in which a veneer of rationality distracts attention from the madness lurking just beneath the surface. When Alice dove down her rabbit hole to enter a place where smirking cats offered directions, ill-mannered caterpillars dispensed advice, and Mock Turtles constituted the principal ingredient in Mock Turtle soup, she experienced something of the sort.
Yet, as the old adage goes, truth can be even stranger than fiction. For a real-life illustration of this phenomenon, one need look no further than Washington and its approach to national security policy. Viewed up close, it all seems to hang together. Peer out of the rabbit hole and the sheer lunacy quickly becomes apparent.
Consider this recent headline: “U.S. to Ship 2,000 Anti-Tank Missiles To Iraq To Help Fight ISIS.” The accompanying article describes a Pentagon initiative to reinforce Iraq’s battered army with a rush order of AT-4s. A souped-up version of the old bazooka, the AT-4 is designed to punch holes through armored vehicles.
Taken on its own terms, the decision makes considerable sense. Iraqi forces need something to counter a fearsome new tactic of the Islamic State of Iraq and Syria (ISIS): suicide bombers mounted in heavily armored wheeled vehicles. Improved anti-tank capabilities certainly could help Iraqi troops take out such bombers before they reach their intended targets. The logic is airtight. The sooner these weapons get into the hands of Iraqi personnel, the better for them — and so the better for us.
As it turns out, however, the vehicle of choice for ISIS suicide bombers these days is the up-armored Humvee. In June 2014, when the Iraqi Army abandoned the country’s second largest city, Mosul, ISIS acquired 2,300 made-in-the-U.S.A. Humvees. Since then, it’s captured even more of them.
As U.S. forces were themselves withdrawing from Iraq in 2011, they bequeathed a huge fleet of Humvees to the “new” Iraqi army the United States had built to the tune of $25 billion. Again, the logic of doing so was impeccable: Iraqi troops needed equipment; shipping used Humvees back to the U.S. was going to cost more than they were worth. Better to give them to those who could put them to good use. Who could quarrel with that?
Before they handed over the used equipment, U.S. troops had spent years trying to pacify Iraq, where order had pretty much collapsed after the invasion of 2003. American troops in Iraq had plenty of tanks and other heavy equipment, but once the country fell into insurgency and civil war, patrolling Iraqi cities required something akin to a hopped-up cop car. The readily available Humvee filled the bill. When it turned out that troops driving around in what was essentially an oversized jeep were vulnerable to sniper fire and roadside bombs, “hardening” those vehicles to protect the occupants became a no-brainer — as even Secretary of Defense Donald Rumsfeld eventually recognized.
At each step along the way, the decisions made possessed a certain obvious logic. It’s only when you get to the end — giving Iraqis American-made weapons to destroy specially hardened American-made military vehicles previously provided to those same Iraqis — that the strangely circular and seriously cuckoo Alice-in-Wonderland nature of the entire enterprise becomes apparent.
AT-4s blowing up those Humvees — with fingers crossed that the anti-tank weapons don’t also fall into the hands of ISIS militants — illustrates in microcosm the larger madness of Washington’s policies concealed by the superficial logic of each immediate situation.
The Promotion of Policies That Have Manifestly Failed
Let me provide a firsthand illustration. A week ago, I appeared on a network television news program to discuss American policy in Iraq and in particular the challenges posed by ISIS. The other guests were former Secretary of Defense and CIA Director Leon Panetta; Michelle Flournoy, former Undersecretary of Defense for Policy and current CEO of a Washington think tank; and retired four-star general Anthony Zinni, who had once headed up United States Central Command.
Washington is a city in which whatever happens within the current news cycle trumps all other considerations, whether in the immediate or distant past. So the moderator launched the discussion by asking the panelists to comment on President Obama’s decision, announced earlier that very day, to plus-up the 3,000-strong train-and-equip mission to Iraq with an additional 450 American soldiers, the latest ratcheting up of ongoing U.S. efforts to deal with ISIS.
Panetta spoke first and professed wholehearted approval of the initiative. “Well, there’s no question that I think the president’s taken the right step in adding these trainers and advisers.” More such steps — funneling arms to Iraqi Kurds and Sunnis and deploying U.S. Special Operations Forces to hunt down terrorists — were “going to be necessary in order to be able to achieve the mission that we have embarked on.” That mission was of critical importance. Unless defeated, ISIS would convert Iraq into “a base [for] attacking our country and attacking our homeland.”
Flournoy expressed a similar opinion. She called the decision to send additional trainers “a good move and a smart move,” although she, too, hoped that it was only the “first step in a broader series” of escalatory actions. If anything, her view of ISIS was more dire than that of her former Pentagon boss. She called it “the new jihad — violent jihadist vanguard in the Middle East and globally.” Unless stopped, ISIS was likely to become “a global network” with “transnational objectives,” while its “thousands of foreign fighters” from the West and Gulf states were eventually going to “return and be looking to carry out jihad in their home countries.”
General Zinni begged to differ — not on the nature of the danger confronting Washington, but on what to do about it. He described the present policy as “almost déjà vu,” a throwback “to Vietnam before we committed the ground forces. We dribble in more and more advisers and support.”
“We’re not fully committed to this fight,” the general complained. “We use terms like destroy. I can tell you, you could put ground forces on the ground now and we can destroy ISIS.” Zinni proposed doing just that. No more shilly-shallying. The template for action was readily at hand. “The last victory, clear victory that we had was in the first Gulf War,” he said. And what were the keys to success then? “We used overwhelming force. We ended it quickly. We went to the U.N. and got a resolution. We built a coalition. And that ought to be a model we ought to look at.” In short, go big, go hard, go home.
Panetta disagreed. He had a different template in mind. The Iraq War of 2003-2011 had clearly shown that “we know how to do this, and we know how to win at doing this.” The real key was to allow America’s generals a free hand to do what needed to be done. “[A]ll we really do need to do is to be able to give our military commanders the flexibility to design not only the strategy to degrade ISIS, but the larger strategy we need in order to defeat ISIS.” Unleashing the likes of Delta Force or SEAL Team 6 with some missile-firing drones thrown in for good measure was likely to suffice.
For her part, Flournoy thought the real problem was “making sure that there is Iraqi capacity to hold the territory, secure it long-term, so that ISIS doesn’t come back again. And that involves the larger political compromises” — the ones the Iraqis themselves needed to make. At the end of the day, the solution was an Iraqi army willing and able to fight and an Iraqi government willing and able to govern effectively. On that score, there was much work to be done.
Panetta then pointed out that none of this was in the cards unless the United States stepped up to meet the challenge. “[I]f the United States doesn’t provide leadership in these crises, nobody else will.” That much was patently obvious. Other countries and the Iraqis themselves might pitch in, “but we have to provide that leadership. We can’t just stand on the sidelines wringing our hands. I mean… ask the people of Paris what happened there with ISIS. Ask the people in Brussels what happened there with ISIS. What happened in Toronto? What’s happened in this country as a result of the threat from ISIS?”
Ultimately, everything turned on the willingness of America to bring order and stability out of chaos and confusion. Only the United States possessed the necessary combination of wisdom, competence, and strength. Here was a proposition to which Flournoy and Zinni readily assented.
With Alice in Washington
To participate in an exchange with these pillars of the Washington establishment was immensely instructive. Only nominally did their comments qualify as a debate. Despite superficial differences, the discussion was actually an exercise in affirming the theology of American national security — those essential matters of faith that define continuities of policy in Washington, whatever administration is in power.
In that regard, apparent disagreement on specifics masked a deeper consensus consisting of three elements:
* That ISIS represents something akin to an existential threat to the United States, the latest in a long line going back to the totalitarian ideologies of the last century; fascism and communism may be gone, but danger is ever present.
* That if the United States doesn’t claim ownership of the problem of Iraq, the prospects of “solving” it are nil; action or inaction by Washington alone, that is, determines the fate of the planet.
* That the exercise of leadership implies, and indeed requires, employing armed might; without a willingness to loose military power, global leadership is inconceivable.
In a fundamental respect, the purpose of the national security establishment, including the establishment media, is to shield that tripartite consensus from critical examination. This requires narrowing the aperture of analysis so as to exclude anything apart from the here-and-now. The discussion in which I participated provided a vehicle for doing just that. It was an exercise aimed at fostering collective amnesia.
So what the former secretary of defense, think tank CEO, and retired general chose not to say in fretting about ISIS is as revealing as what they did say. Here are some of the things they chose to overlook:
* ISIS would not exist were it not for the folly of the United States in invading — and breaking — Iraq in the first place; we created the vacuum that ISIS is now attempting to fill.
* U.S. military efforts to pacify occupied Iraq from 2003 to 2011 succeeded only in creating a decent interval for the United States to withdraw without having to admit to outright defeat; in no sense did “our” Iraq War end in anything remotely approximating victory, despite the already forgotten loss of thousands of American lives and the expenditure of trillions of dollars.
* For more than a decade and at very considerable expense, the United States has been attempting to create an Iraqi government that governs and an Iraqi army that fights; the results of those efforts speak for themselves: they have failed abysmally.
Now, these are facts. Acknowledging them might suggest a further conclusion: that anyone proposing ways for Washington to put things right in Iraq ought to display a certain sense of humility. The implications of those facts — behind which lies a policy failure of epic proportions — might even provide the basis for an interesting discussion on national television. But that would assume a willingness to engage in serious self-reflection. This, the culture of Washington does not encourage, especially on matters related to basic national security policy.
My own contribution to the televised debate was modest and ineffectual. Toward the end, the moderator offered me a chance to redeem myself. What, she asked, did I think about Panetta’s tribute to the indispensability of American leadership?
A fat pitch that I should have hit out of the park. Instead, I fouled it off. What I should have said was this: leadership ought to mean something other than simply repeating and compounding past mistakes. It should require more than clinging to policies that have manifestly failed. To remain willfully blind to those failures is not leadership, it’s madness.
Not that it would have mattered if I had. When it comes to Iraq, we’re already halfway back down Alice’s rabbit hole.
Andrew J. Bacevich, a TomDispatch regular, is writing a military history of America’s War for the Greater Middle East. His most recent book is Breach of Trust: How Americans Failed Their Soldiers and Their Country.
Copyright 2015 Andrew J. Bacevich
Washington in Wonderland
First, they tried to shoot the dogs. Next, they tried to poison them with strychnine. When both failed as efficient killing methods, British government agents and U.S. Navy personnel used raw meat to lure the pets into a sealed shed. Locking them inside, they gassed the howling animals with exhaust piped in from U.S. military vehicles. Then, setting coconut husks ablaze, they burned the dogs’ carcasses as their owners were left to watch and ponder their own fate.
The truth about the U.S. military base on the British-controlled Indian Ocean island of Diego Garcia is often hard to believe. It would be easy enough to confuse the real story with fictional accounts of the island found in the Transformers movies, on the television series 24, and in Internet conspiracy theories about the disappearance of Malaysia Airlines flight MH370.
While the grim saga of Diego Garcia frequently reads like fiction, it has proven all too real for the people involved. It’s the story of a U.S. military base built on a series of real-life fictions told by U.S. and British officials over more than half a century. The central fiction is that the U.S. built its base on an “uninhabited” island. That was “true” only because the indigenous people were secretly exiled from the Chagos Archipelago when the base was built. Although their ancestors had lived there since the time of the American Revolution, Anglo-American officials decided, as one wrote, to “maintain the fiction that the inhabitants of Chagos [were] not a permanent or semi-permanent population,” but just “transient contract workers.” The same official summed up the situation bluntly: “We are able to make up the rules as we go along.”
And so they did: between 1968 and 1973, American officials conspired with their British colleagues to remove the Chagossians, carefully hiding their expulsion from Congress, Parliament, the U.N., and the media. During the deportations, British agents and members of a U.S. Navy construction battalion rounded up and killed all those pet dogs. Their owners were then deported to the western Indian Ocean islands of Mauritius and the Seychelles, 1,200 miles from their homeland, where they received no resettlement assistance. More than 40 years after their expulsion, Chagossians generally remain the poorest of the poor in their adopted lands, struggling to survive in places that outsiders know as exotic tourist destinations.
During the same period, Diego Garcia became a multi-billion-dollar Navy and Air Force base and a central node in U.S. military efforts to control the Greater Middle East and its oil and natural gas supplies. The base, which few Americans are aware of, is more important strategically and more secretive than the U.S. naval base-cum-prison at Guantánamo Bay, Cuba. Unlike Guantánamo, no journalist has gotten more than a glimpse of Diego Garcia in more than 30 years. And yet, it has played a key role in waging the Gulf War, the 2003 invasion of Iraq, the U.S.-led war in Afghanistan, and the current bombing campaign against the Islamic State in Syria and Iraq.
Following years of reports that the base was a secret CIA “black site” for holding terrorist suspects and years of denials by U.S. and British officials, leaders on both sides of the Atlantic finally fessed up in 2008. “Contrary to earlier explicit assurances,” said Secretary of State for Foreign and Commonwealth Affairs David Miliband, Diego Garcia had indeed played at least some role in the CIA’s secret “rendition” program.
Last year, British officials claimed that flight log records, which might have shed light on those rendition operations, were “incomplete due to water damage” thanks to “extremely heavy weather in June 2014.” A week later, they suddenly reversed themselves, saying that the “previously wet paper records have been dried out.” Two months later, they insisted the logs had not dried out at all and were “damaged to the point of no longer being useful.” Except that the British government’s own weather data indicates that June 2014 was an unusually dry month on Diego Garcia. A legal rights advocate said British officials “could hardly be less credible if they simply said ‘the dog ate my homework.’”
And these are just a few of the fictions underlying the base that occupies the Chagossians’ former home and that the U.S. military has nicknamed the “Footprint of Freedom.” After more than four decades of exile, however, with a Chagossian movement to return to their homeland growing, the fictions of Diego Garcia may finally be crumbling.
The story of Diego Garcia begins in the late eighteenth century. At that time, enslaved peoples from Africa, brought to work on Franco-Mauritian coconut plantations, became the first settlers in the Chagos Archipelago. Following emancipation and the arrival of indentured laborers from India, a diverse mixture of peoples created a new society with its own language, Chagos Kreol. They called themselves the Ilois — the Islanders.
While still a plantation society, the archipelago, by then under British colonial control, provided a secure life featuring universal employment and numerous social benefits on islands described by many as idyllic. “That beautiful atoll of Diego Garcia, right in the middle of the ocean,” is how Stuart Barber described it in the late 1950s. A civilian working for the U.S. Navy, Barber would become the architect of one of the most powerful U.S. military bases overseas.
Amid Cold War competition with the Soviet Union, Barber and other officials were concerned that there was almost no U.S. military presence in and around the Indian Ocean. Barber noted that Diego Garcia’s isolation — halfway between Africa and Indonesia and 1,000 miles south of India — ensured that it would be safe from attack, yet was still within striking distance of territory from southern Africa and the Middle East to South and Southeast Asia.
Guided by Barber’s idea, the administrations of John F. Kennedy and Lyndon Johnson convinced the British government to detach the Chagos Archipelago from colonial Mauritius and create a new colony, which they called the British Indian Ocean Territory. Its sole purpose would be to house U.S. military facilities.
During secret negotiations with their British counterparts, Pentagon and State Department officials insisted that Chagos come under their “exclusive control (without local inhabitants),” embedding an expulsion order in a polite-looking parenthetical phrase. U.S. officials wanted the islands “swept” and “sanitized.” British officials appeared happy to oblige, removing a people one official called “Tarzans” and, in a racist reference to Robinson Crusoe, “Man Fridays.”
“Absolutely Must Go”
This plan was confirmed with an “exchange of notes” signed on December 30, 1966, by U.S. and British officials, as one of the State Department negotiators told me, “under the cover of darkness.” The notes effectively constituted a treaty but required no Congressional or Parliamentary approval, meaning that both governments could keep their plans hidden.
According to the agreement, the United States would gain use of the new colony “without charge.” This was another fiction. In confidential minutes, the United States agreed to secretly wipe out a $14 million British military debt, circumventing the need to ask Congress for funding. In exchange, the British agreed to take the “administrative measures” necessary for “resettling the inhabitants.”
Those measures meant that, after 1967, any Chagossians who left home for medical treatment or a routine vacation in Mauritius were barred from returning. Soon, British officials began restricting the flow of food and medical supplies to Chagos. As conditions deteriorated, more islanders began leaving. By 1970, the U.S. Navy had secured funding for what officials told Congress would be an “austere communications station.” They were, however, already planning to ask for additional funds to expand the facility into a much larger base. As the Navy’s Office of Communications and Cryptology explained, “The communications requirements cited as justification are fiction.” By the 1980s, Diego Garcia would become a billion-dollar garrison.
In briefing papers delivered to Congress, the Navy described Chagos’s population as “negligible,” with the islands “for all practical purposes… uninhabited.” In fact, there were around 1,000 people on Diego Garcia in the 1960s and 500 to 1,000 more on other islands in the archipelago. With Congressional funds secured, the Navy’s highest-ranking admiral, Elmo Zumwalt, summed up the Chagossians’ fate in a 1971 memo of exactly three words: “Absolutely must go.”
The authorities soon ordered the remaining Chagossians — generally allowed no more than a single box of belongings and a sleeping mat — onto overcrowded cargo ships destined for Mauritius and the Seychelles. By 1973, the last Chagossians were gone.
At their destinations, most of the Chagossians were literally left on the docks, homeless, jobless, and with little money. In 1975, two years after the last removals, a Washington Post reporter found them living in “abject poverty.”
Aurélie Lisette Talate was one of the last to go. “I came to Mauritius with six children and my mother,” she told me. “We got our house… but the house didn’t have a door, didn’t have running water, didn’t have electricity. And then my children and I began to suffer. All my children started getting sick.”
Within two months, two of her children were dead. The second was buried in an unmarked grave because she lacked money for a proper burial. Aurélie experienced fainting spells herself and couldn’t eat. “We were living like animals. Land? We had none… Work? We had none. Our children weren’t going to school.”
Today, most Chagossians, who now number more than 5,000, remain impoverished. In their language, their lives are ones of lamizer (impoverished misery) and sagren (profound sorrow and heartbreak over being exiled from their native lands). Many of the islanders attribute sickness and even death to sagren. “I had something that had been affecting me for a long time, since we were uprooted,” was the way Aurélie explained it to me. “This sagren, this shock, it was this same problem that killed my child. We weren’t living free like we did in our natal land.”
Struggling for Justice
From the moment they were deported, the Chagossians demanded to be returned or at least properly resettled. After years of protest, including five hunger strikes led by women like Aurélie Talate, some in Mauritius received the most modest of compensation from the British government: small concrete houses, tiny plots of land, and about $6,000 per adult. Many used the money to pay off large debts they had accrued. For most, conditions improved only marginally. Those living in the Seychelles received nothing.
The Chagossian struggle was reinvigorated in 1997 with the launching of a lawsuit against the British government. In November 2000, the British High Court ruled the removal illegal. In 2001 and 2002, most Chagossians joined new lawsuits in both American and British courts demanding the right to return and proper compensation for their removal and for resettling their islands. The U.S. suit was ultimately dismissed on the grounds that the judiciary can’t, in most circumstances, overrule the executive branch on matters of military and foreign policy. In Britain, the Chagossians were more successful. In 2002, they secured the right to full U.K. citizenship. Over 1,000 Chagossians have since moved to Britain in search of better lives. Twice more, British courts ruled in the people’s favor, with judges calling the government’s behavior “repugnant” and an “abuse of power.”
On the government’s final appeal, however, Britain’s then highest court, the Law Lords in the House of Lords, upheld the exile in a 3-2 decision. The Chagossians appealed to the European Court of Human Rights to overturn the ruling.
A Green Fiction
Before the European Court could rule, the British government announced the creation of the world’s largest Marine Protected Area (MPA) in the Chagos Archipelago. The date of the announcement, April Fool’s Day 2010, should have been a clue that there was more than environmentalism behind the move. The MPA banned commercial fishing and limited other human activity in the archipelago, endangering the viability of any resettlement efforts.
And then came WikiLeaks. In December 2010, it released a State Department cable from the U.S. Embassy in London quoting a senior Foreign and Commonwealth Office official saying that the “former inhabitants would find it difficult, if not impossible, to pursue their claim for resettlement on the islands if the entire Chagos Archipelago were a marine reserve.” U.S. officials agreed. According to the Embassy, Political Counselor Richard Mills wrote, “Establishing a marine reserve might, indeed… be the most effective long-term way to prevent any of the Chagos Islands’ former inhabitants or their descendants from resettling.”
Not surprisingly, the main State Department concern was whether the MPA would affect base operations. “We are concerned,” the London Embassy noted, that some “would come to see the existence of a marine reserve as inherently inconsistent with the military use of Diego Garcia.” British officials assured the Americans there would be “no constraints on military operations.”
Although the European Court of Human Rights ultimately ruled against the Chagossians in 2013, this March, a U.N. tribunal found that the British government had violated international law in creating the Marine Protected Area. Next week, Chagossians will challenge the MPA and their expulsion before the British Supreme Court (now Britain’s highest) armed with the U.N. ruling and revelations that the government won its House of Lords decision with the help of a fiction-filled resettlement study.
Meanwhile, the European Parliament has passed a resolution calling for the Chagossians’ return, the African Union has condemned their deportation as unlawful, three Nobel laureates have spoken out on their behalf, and dozens of members of the British Parliament have joined a group supporting their struggle. In January, a British government “feasibility study” found no significant legal barriers to resettling the islands and outlined several possible resettlement plans, beginning with Diego Garcia. (Notably, Chagossians are not calling for the removal of the U.S. military base. Their opinions about it are diverse and complicated. At least some would prefer jobs on the base to lives of poverty and unemployment in exile.)
Of course, no study was needed to know that resettlement on Diego Garcia and in the rest of the archipelago is feasible. The base, which has hosted thousands of military and civilian personnel for more than 40 years, has demonstrated that well enough. In fact, Stuart Barber, its architect, came to the same conclusion in the years before his death. After he learned of the Chagossians’ fate, he wrote a series of impassioned letters to Human Rights Watch and the British Embassy in Washington, among others, imploring them to help the Chagossians return home. In a letter to Alaska Senator Ted Stevens, he said bluntly that the expulsion “wasn’t necessary militarily.”
In a 1991 letter to the Washington Post, Barber suggested that it was time “to redress the inexcusably inhuman wrongs inflicted by the British at our insistence.” He added, “Substantial additional compensation for 18-25 past years of misery for all evictees is certainly in order. Even if that were to cost $100,000 per family, we would be talking of a maximum of $40-50 million, modest compared with our base investment there.”
Almost a quarter-century later, nothing has yet been done. In 2016, the initial 50-year agreement for Diego Garcia will expire. While it is subject to an automatic 20-year renewal, it provides for a two-year renegotiation period, which commenced in late 2014. With momentum building in support of the Chagossians, they are optimistic that the two governments will finally correct this historic injustice. That U.S. officials allowed the British feasibility study to consider resettlement plans for Diego Garcia is a hopeful sign that Anglo-American policy may finally be shifting to right a great wrong in the Indian Ocean.
Unfortunately, Aurélie Talate will never see the day when her people go home. Like others among the rapidly dwindling number of Chagossians born in the archipelago, Aurélie died in 2012 at age 70, succumbing to the heartbreak that is sagren.
David Vine, a TomDispatch regular, is associate professor of anthropology at American University in Washington, D.C. His new book, Base Nation: How U.S. Military Bases Abroad Harm America and the World will be published in August as part of the American Empire Project (Metropolitan Books). He is also the author of Island of Shame: The Secret History of the U.S. Military Base on Diego Garcia. He has written for the New York Times, the Washington Post, the Guardian, and Mother Jones, among other publications. For more of his writing, visit www.davidvine.net.
Copyright 2015 David Vine
The Truth About Diego Garcia
En route back to Washington at the tail end of his most recent overseas trip, John Kerry, America’s peripatetic secretary of state, stopped off in France “to share a hug with all of Paris.” Whether Paris reciprocated the secretary’s embrace went unrecorded.
Despite the requisite reference to General Pershing (“Lafayette, we are here!”) and flying James Taylor in from the 1960s to assure Parisians that “You’ve Got a Friend,” in the annals of American diplomacy Kerry’s hug will likely rank with President Eisenhower’s award of the Legion of Merit to Nicaraguan dictator Anastasio Somoza for “exceptionally meritorious conduct” and Jimmy Carter’s acknowledgment of the “admiration and love” said to define the relationship between the Iranian people and their Shah. In short, it was a moment best forgotten.
Not one of the signature foreign policy initiatives conceived in Obama’s first term has borne fruit. When it came to making a fresh start with the Islamic world, responsibly ending the “dumb” war in Iraq (while winning the “necessary” one in Afghanistan), “resetting” U.S.-Russian relations, and “pivoting” toward Asia, mark your scorecard 0 for 4.
There’s no doubt that when Kerry arrived at the State Department he brought with him some much-needed energy. That he is giving it his all — the department’s website reports that the secretary has already clocked over 682,000 miles of travel — is doubtless true as well. The problem is the absence of results. Remember when his signature initiative was going to be an Israeli-Palestinian peace deal? Sadly, that quixotic plan, too, has come to naught.
Yes, Team Obama “got” bin Laden. And, yes, it deserves credit for abandoning a self-evidently counterproductive 50-plus-year-old policy toward Cuba and for signing a promising agreement with China on climate change. That said, the administration’s overall record of accomplishment is beyond thin, starting with that first-day-in-the-Oval-Office symbol that things were truly going to be different: Obama’s order to close Guantanamo. That, of course, remains a work in progress (despite regular reassurances of light glimmering at the end of what has become a very long tunnel).
In fact, taking the president’s record as a whole, noting that on his watch occasional U.S. drone strikes have become routine, the Nobel Committee might want to consider revoking its Peace Prize.
Nor should we expect much in the time that Obama has remaining. Perhaps there is a deal with Iran waiting in the wings (along with the depth charge of ever-fiercer congressionally mandated sanctions), but signs of intellectual exhaustion are distinctly in evidence.
“Where there is no vision,” the Hebrew Bible tells us, “the people perish.” There’s no use pretending: if there’s one thing the Obama administration most definitely has not got and has never had, it’s a foreign policy vision.
In Search of Truly Wise (White) Men — Only Those 84 or Older Need Apply
All of this evokes a sense of unease, even consternation bordering on panic, in circles where members of the foreign policy elite congregate. Absent visionary leadership in Washington, they have persuaded themselves, we’re all going down. So the world’s sole superpower and self-anointed global leader needs to get game — and fast.
Leslie Gelb, former president of the Council on Foreign Relations, recently weighed in with a proposal for fixing the problem: clean house. Obama has surrounded himself with fumbling incompetents, Gelb charges. Get rid of them and bring in the visionaries.
Writing at the Daily Beast, Gelb urges the president to fire his entire national security team and replace them with “strong and strategic people of proven foreign policy experience.” Translation: the sort of people who sip sherry and nibble on brie in the august precincts of the Council on Foreign Relations. In addition to offering his own slate of nominees, including several veterans of the storied George W. Bush administration, Gelb suggests that Obama consult regularly with Henry Kissinger, Brent Scowcroft, Zbigniew Brzezinski, and James Baker. These distinguished war-horses range in age from 84 to 91. By implication, only white males born prior to World War II are eligible for induction into the ranks of the Truly Wise Men.
Anyway, Gelb emphasizes, Obama needs to get on with it. With the planet awash in challenges that “imperil our very survival,” there is simply no time to waste.
At best, Gelb’s got it half right. When it comes to foreign policy, this president has indeed demonstrated a knack for surrounding himself with lackluster lieutenants. That statement applies equally to national security adviser Susan Rice (and her predecessor), to Secretary of State Kerry (and his predecessor), and to outgoing Pentagon chief Chuck Hagel. Ashton Carter, the technocrat slated to replace Hagel as defense secretary, comes from the same mold.
They are all “seasoned” — in Washington, a euphemism for bland, conventional, and utterly unimaginative — charter members of the Rogers-Christopher school of American statecraft. (That may require some unpacking, so pretend you’re on Jeopardy. Alex Trebek: “Two eminently forgettable and completely forgotten twentieth-century secretaries of state.” You, hitting the buzzer: “Who were William Rogers and Warren Christopher?” “Correct!”)
Members of Obama’s national security team worked long and hard to get where they are. Yet along the way — perhaps from absorbing too many position papers, PowerPoint briefings, and platitudes about “American global leadership” — they lost whatever creative spark once endowed them with the appearance of talent and promise. Ambition, unquestioned patriotism, and a capacity for putting in endless hours (and enduring endless travel) — all these remain. But a serious conception of where the world is heading and what that implies for basic U.S. policy? Individually and collectively, they are without a clue.
I submit that maybe that’s okay, that plodding mediocrity can be a boon if, as at present, the alternatives on offer look even worse.
A Hug for Obama
You want vision? Obama’s predecessor surrounded himself with visionaries. Dick Cheney, Condoleezza Rice, Donald Rumsfeld, and Paul Wolfowitz, products of the Cold War one and all, certainly fancied themselves large-bore strategic thinkers. Busily positioning the United States to run (just another “i” and you have “ruin”) the world, they were blindsided by 9/11. Unembarrassed and unchastened by this disaster, they initiated a series of morally dubious, strategically boneheaded moves that were either (take your pick) going to spread freedom and democracy or position the United States to exercise permanent dominion. The ensuing Global War on Terror did neither, of course, while adding trillions to the national debt and helping fracture great expanses of the planet. Obama is still, however ineffectually, trying to clean up the mess they created.
If that’s what handing the keys to big thinkers gets you, give me Susan Rice any day. Although Obama’s “don’t do stupid shit” may never rank with Washington’s Farewell Address or the Monroe Doctrine in the history books, George W. Bush might have profited from having some comparable axiom taped to his laptop.
Big ideas have their place — indeed, are essential — when the issues at hand are clearly defined. The Fall of France in 1940 was one such moment, which President Franklin D. Roosevelt recognized. So too, arguably, was the period immediately after World War II. The defeat of Nazi Germany and Imperial Japan had left a dangerous power vacuum in both Europe and the Pacific to which George Marshall, Dean Acheson, and their compatriots forged a necessary response. Perhaps the period 1968-1969 falls into that same category, the debacle of Vietnam requiring a major adjustment in U.S. Cold War strategy. This Richard Nixon and Henry Kissinger undertook with their opening to China.
Yet despite the overwrought claims of Gelb (and others) that America’s very survival is today at risk, the present historical moment lacks comparable clarity. Ours is not a time when we face a single overarching threat. Instead, on several different fronts, worrisome developments are brewing. Environmental degradation, the rise of China and other emerging powers, the spread of radical Islam, the precarious state of the global economy, vulnerabilities that are an inevitable byproduct of our pursuit of a cyber-utopia: all of these bear very careful watching. Each one today should entail a defensive response, the United States protecting itself (and its allies) against worst-case outcomes. But none of these at the present moment justifies embarking upon a let-out-all-the-stops offensive. Chasing after one problem would necessarily divert attention from the rest.
The immediate future remains too opaque to say with certainty which threat will turn out to pose the greatest danger, whether in the next year or the next decade — and which might even end up not being a threat at all but an unexpected opportunity. Conditions are not ripe for boldness. The abiding imperative of the moment is to discern, which requires careful observation and patience. In short, forget about strategy.
And there’s a further matter. Correct discernment assumes a proper vantage point. What you see depends on where you sit and which way you’re facing. Those who inhabit the upper ranks of the Obama administration (and those whom Leslie Gelb offers as replacements) sit somewhere back in the twentieth century, their worldview shaped by memories of Munich and Yalta, Korea and Vietnam, the Cuban Missile Crisis and the Berlin Wall, none of which retain more than tangential relevance to the present day.
You want vision? That will require a new crop of visionaries. Instead of sitting down with ancients like Kissinger, Scowcroft, Brzezinski, or Baker, this president (or his successor) would be better served to pick the brain of the army captain back from multiple combat tours in Iraq and Afghanistan, the moral theologian specializing in inter-religious dialog, the Peace Corps volunteer who spent the last two years in West Africa, and the Silicon Valley entrepreneur best able to spell out the political implications of the next big thing.
In short, a post-twentieth century vision requires a post-twentieth century generation, able to free itself from old shibboleths to which Leslie Gelb and most of official Washington today remain stubbornly dedicated. That generation waits in the wings and after another presidential election or two may indeed wield some influence. We should hope so. In the meantime, we should bide our time, amending the words of the prophet to something like: “Where there is no vision, the people muddle along and await salvation.”
So as Obama and his team muddle toward their finish line, their achievements negligible, we might even express a modicum of gratitude. When they depart the scene, we will forget the lot of them. Yet at least they managed to steer clear of truly epic disasters. When muddling was the best Washington had on offer, they delivered. They may even deserve a hug.
Andrew J. Bacevich, a TomDispatch regular, is writing a military history of America’s War for the Greater Middle East. His most recent book is Breach of Trust: How Americans Failed Their Soldiers and Their Country.
Copyright 2015 Andrew Bacevich
Save Us From Washington’s Visionaries
The abiding defect of U.S. foreign policy? It’s isolationism, my friend. Purporting to steer clear of war, isolationism fosters it. Isolationism impedes the spread of democracy. It inhibits trade and therefore prosperity. It allows evildoers to get away with murder. Isolationists prevent the United States from accomplishing its providentially assigned global mission. Wean the American people from their persistent inclination to look inward and who knows what wonders our leaders will accomplish.
The United States has been at war for well over a decade now, with U.S. attacks and excursions in distant lands having become as commonplace as floods and forest fires. Yet during the recent debate over Syria, the absence of popular enthusiasm for opening up another active front evoked expressions of concern in Washington that Americans were once more turning their backs on the world.
As he was proclaiming the imperative of punishing the government of Bashar al-Assad, Secretary of State John Kerry also chided skeptical members of the Senate Foreign Relations Committee that “this is not the time for armchair isolationism.” Commentators keen to have a go at the Syrian autocrat wasted little time in expanding on Kerry’s theme.
Reflecting on “where isolationism leads,” Jennifer Rubin, the reliably bellicose Washington Post columnist, was quick to chime in, denouncing those hesitant to initiate another war as “infantile.” American isolationists, she insisted, were giving a green light to aggression. Any nation that counted on the United States for protection had now become a “sitting duck,” with “Eastern Europe [and] neighbors of Venezuela and Israel” among those left exposed and vulnerable. News reports of Venezuelan troop movements threatening Brazil, Colombia, or Guyana were notably absent from the Post or any other media outlet, but no matter — you get the idea.
Military analyst Frederick Kagan was equally troubled. Also writing in the Post, he worried that “the isolationist narrative is rapidly becoming dominant.” His preferred narrative emphasized the need for ever greater military exertions, with Syria just the place to launch a new campaign. For Bret Stephens, a columnist with the Wall Street Journal, the problem was the Republican Party. Where had the hawks gone? The Syria debate, he lamented, was “exposing the isolationist worm eating its way through the GOP apple.”
The Journal’s op-ed page also gave the redoubtable Norman Podhoretz, not only still alive but vigorously kicking, a chance to vent. Unmasking President Obama as “a left-wing radical” intent on “reduc[ing] the country’s power and influence,” the unrepentant neoconservative accused the president of exploiting the “war-weariness of the American people and the rise of isolationist sentiment… on the left and right” to bring about “a greater diminution of American power than he probably envisaged even in his wildest radical dreams.”
Obama escalated the war in Afghanistan, “got” Osama bin Laden, toppled one Arab dictator in Libya, and bashed and bombed targets in Somalia, Yemen, Pakistan, and elsewhere. Even so, it turns out he is actually part of the isolationist conspiracy to destroy America!
Over at the New York Times, similar concerns, even if less hysterically expressed, prevailed. According to Times columnist Roger Cohen, President Obama’s reluctance to pull the trigger showed that he had “deferred to a growing isolationism.” Bill Keller concurred. “America is again in a deep isolationist mood.” In a column entitled, “Our New Isolationism,” he decried “the fears and defeatist slogans of knee-jerk isolationism” that were impeding military action. (For Keller, the proper antidote to isolationism is amnesia. As he put it, “Getting Syria right starts with getting over Iraq.”)
For his part, Times staff writer Sam Tanenhaus contributed a bizarre two-minute exercise in video agitprop — complete with faked scenes of the Japanese attacking Pearl Harbor — that slapped the isolationist label on anyone opposing entry into any war whatsoever, or tiring of a war gone awry, or proposing that America go it alone.
When the “New Isolationism” Was New
Most of this, of course, qualifies as overheated malarkey. As a characterization of U.S. policy at any time in memory, isolationism is a fiction. Never really a tendency, it qualifies at most as a moment, referring to that period in the 1930s when large numbers of Americans balked at the prospect of entering another European war, the previous one having fallen well short of its “War To End All Wars” advance billing.
In fact, from the day of its founding down to the present, the United States has never turned its back on the world. Isolationism owes its storied history to its value as a rhetorical device, deployed to discredit anyone opposing an action or commitment (usually involving military forces) that others happen to favor. If I, a grandson of Lithuanian immigrants, favor deploying U.S. forces to Lithuania to keep that NATO ally out of Vladimir Putin’s clutches and you oppose that proposition, then you, sir or madam, are an “isolationist.” Presumably, Jennifer Rubin will see things my way and lend her support to shoring up Lithuania’s vulnerable frontiers.
For this very reason, the term isolationism is not likely to disappear from American political discourse anytime soon. It’s too useful. Indeed, employ this verbal cudgel to castigate your opponents and your chances of gaining entrée to the nation’s most prestigious publications improve appreciably. Warn about the revival of isolationism and your prospects of making the grade as a pundit or candidate for high office suddenly brighten. This is the great thing about using isolationists as punching bags: it makes actual thought unnecessary. All that’s required to posture as a font of wisdom is the brainless recycling of clichés, half-truths, and bromides.
No publication is more likely to welcome those clichés, half-truths, and bromides than the New York Times. There, isolationism always looms remarkably large and is just around the corner.
In July 1942, the New York Times Magazine opened its pages to Vice President Henry A. Wallace, who sounded the alarm about the looming threat of what he styled a “new isolationism.” This was in the midst of World War II, mind you.
After the previous world war, the vice president wrote, the United States had turned inward. As summer follows spring, “the choice led up to this present war.” Repeat the error, Wallace warned, and “the price will be more terrible and will be paid much sooner.” The world was changing and it was long past time for Americans to get with the program. “The airplane, the radio, and modern technology have bound the planet so closely together that what happens anywhere on the planet has a direct effect everywhere else.” In a world that had “suddenly become so small,” he continued, “we cannot afford to resume the role of hermit.”
The implications for policy were self-evident:
“This time, then, we have only one real choice. We must play a responsible part in the world — leading the way in world progress, fostering a healthy world trade, helping to protect the world’s peace.”
One month later, it was Archibald MacLeish’s turn. On August 16, 1942, the Times magazine published a long essay of his under the title of — wouldn’t you know it — “The New Isolationism.” For readers in need of coaching, Times editors inserted this seal of approval before the text: “There is great pertinence in the following article.”
A well-known poet, playwright, and literary gadfly, MacLeish was at the time serving as Librarian of Congress. From this bully pulpit, he offered the reassuring news that “isolationism in America is dead.” Unfortunately, like zombies, “old isolationists never really die: they merely dig in their toes in a new position. And the new position, whatever name is given it, is isolation still.”
Fortunately, the American people were having none of it. They had “recaptured the current of history and they propose to move with it; they don’t mean to be denied.” MacLeish’s fellow citizens knew what he knew: “that there is a stirring in our world…, a forward thrusting and overflowing human hope of the human will which must be given a channel or it will dig a channel itself.” In effect, MacLeish was daring the isolationists, in whatever guise, to stand in the way of this forward thrusting and overflowing hopefulness. Presumably, they would either drown or be crushed.
The end of World War II found the United States donning the mantle of global leadership, much as Wallace, MacLeish, and the Times had counseled. World peace did not ensue. Instead, a host of problems continued to afflict the planet, with isolationists time and again fingered as the culprits impeding their solution.
The Gift That Never Stops Giving
In June 1948, with a notable absence of creativity in drafting headlines, the Times once again found evidence of “the new isolationism.” In an unsigned editorial, the paper charged that an American penchant for hermit-like behavior was “asserting itself again in a manner that is both distressing and baffling.” With the Cold War fully joined and U.S. forces occupying Germany, Japan, and other countries, the Times worried that some Republicans in Congress appeared reluctant to fund the Marshall Plan.
From their offices in Manhattan, members of the Times editorial board detected in some quarters “a homesickness for the old days.” It was incumbent upon Americans to understand that “the time is past when we could protect ourselves easily behind our barriers behind the seas.” History was summoning the United States to lead the world: “The very success of our democracy has now imposed duties upon us which we must fulfill if that democracy is to survive.” Those entertaining contrary views, the Times huffed, “do not speak for the American people.”
That very month, Josef Stalin announced that the Soviet Union was blockading Berlin. The U.S. responded not by heading for the exits but by initiating a dramatic airlift. Oh, and Congress fully funded the Marshall Plan.
Barely a year later, in August 1949, with Stalin having just lifted the Berlin Blockade, Times columnist Arthur Krock discerned another urge to disengage. In a piece called “Chickens Usually Come Home,” he cited congressional reservations about the recently promulgated Truman Doctrine as evidence of, yes, a “new isolationism.” As it happened, Congress duly appropriated the money President Truman was requesting to support Greece and Turkey against the threat of communism — as it would support similar requests to throw arms and money at other trouble spots like French Indochina.
Even so, in November of that year, the Times magazine published yet another warning about “the challenge of a new isolationism.” The author was Illinois Governor Adlai Stevenson, then positioning himself for a White House run. Like many another would-be candidate before and since, Stevenson took the preliminary step of signaling his opposition to the I-word.
World War II, he wrote, had “not only destroyed fascism abroad, but a lot of isolationist notions here at home.” War and technological advance had “buried the whole ostrich of isolation.” At least it should have. Unfortunately, some Republicans hadn’t gotten the word. They were “internationally minded in principle but not in practice.” Stevenson feared that when the chips were down such head-in-the-sand inclinations might come roaring back. This he was determined to resist. “The eagle, not the ostrich,” he proclaimed, “is our national emblem.”
In August 1957, the Times magazine was at it once again, opening its pages to another Illinois Democrat, Senator Paul Douglas, for an essay familiarly entitled “A New Isolationism — Ripples or Tide?” Douglas claimed that “a new tide of isolationism is rising in the country.” U.S. forces remained in Germany and Japan, along with Korea, where they had recently fought a major war. Even so, the senator worried that “the internationalists are tiring rapidly now.”
Americans needed to fortify themselves by heeding the message of the Gospels: “Let the spirit of the Galilean enter our worldly and power-obsessed hearts.” In other words, the senator’s prescription for American statecraft was an early version of What Would Jesus Do? Was Jesus Christ an advocate of American global leadership? Senator Douglas apparently thought so.
Then came Vietnam. By May 1970, even Times-men were showing a little of that fatigue. That month, star columnist James Reston pointed (yet again) to the “new isolationism.” Yet in contrast to the paper’s scribblings on the subject over the previous three decades, Reston didn’t decry it as entirely irrational. The war had proven to be a bummer and “the longer it goes on,” he wrote, “the harder it will be to get public support for American intervention.” Washington, in other words, needed to end its misguided war if it had any hopes of repositioning itself to start the next one.
A Concept Growing Long in the Tooth
By 1980, the Times showed signs of recovering from its brief Vietnam funk. In a review of Norman Podhoretz’s The Present Danger, for example, the noted critic Anatole Broyard extolled the author’s argument as “dispassionate,” “temperate,” and “almost commonsensical.”
The actual text was none of those things. What the pugnacious Podhoretz called — get ready for it — “the new isolationism” was, in his words, “hard to distinguish from simple anti-Americanism.” Isolationists — anyone who had opposed the Vietnam War on whatever grounds — believed that the United States was “a force for evil, a menace, a terror.” Podhoretz detected a “psychological connection” between “anti-Americanism, isolationism, and the tendency to explain away or even apologize for anything the Soviet Union does, no matter how menacing.” It wasn’t bad enough that isolationists hated their country, they were, it seems, commie symps to boot.
Fast forward a decade, and — less than three months after U.S. troops invaded Panama — Times columnist Flora Lewis sensed a resurgence of you-know-what. In a February 1990 column, she described “a convergence of right and left” with both sides “arguing with increasing intensity that it’s time for the U.S. to get off the world.” Right-wingers saw that world as too nasty to save; left-wingers, the United States as too nasty to save it. “Both,” she concluded (of course), were “moving toward a new isolationism.”
Five months later, Saddam Hussein sent his troops into Kuwait. Instead of getting off the world, President George H.W. Bush deployed U.S. combat forces to defend Saudi Arabia. For Joshua Muravchik, however, merely defending that oil-rich kingdom wasn’t nearly good enough. Indeed, here was a prime example of the “New Isolationism, Same Old Mistake,” as his Times op-ed was entitled.
The mistake was to flinch from instantly ejecting Saddam’s forces. Although opponents of a war against Iraq did not “see themselves as isolationists, but as realists,” he considered this a distinction without a difference. Muravchik, who made his living churning out foreign policy analysis for various Washington think tanks, favored “the principle of investing America’s power in the effort to fashion an environment congenial to our long-term safety.” War, he firmly believed, offered the means to fashion that congenial environment. Should America fail to act, he warned, “our abdication will encourage such threats to grow.”
Of course, the United States did act and the threats grew anyway. In and around the Middle East, the environment continued to be thoroughly uncongenial. Still, in Times-world, the American penchant for doing too little rather than too much remained the eternal problem, eternally “new.” An op-ed by up-and-coming journalist James Traub appearing in the Times in December 1991, just months after a half-million U.S. troops had liberated Kuwait, was typical. Assessing the contemporary political scene, Traub detected “a new wave of isolationism gathering force.” Traub was undoubtedly establishing his bona fides. (Soon after, he landed a job working for the paper.)
This time, according to Traub, the problem was the Democrats. No longer “the party of Wilson or of John F. Kennedy,” Democrats, he lamented, “aspire[d] to be the party of middle-class frustrations — and if that entails turning your back on the world, so be it.” The following year Democrats nominated as their presidential candidate Bill Clinton, who insisted that he would never under any circumstances turn his back on the world. Even so, no sooner did Clinton win than Times columnist Leslie Gelb was predicting that the new president would “fall into the trap of isolationism and policy passivity.”
Get Me Rewrite!
Arthur Schlesinger defined the problem in broader terms. The famous historian and Democratic Party insider had weighed in early on the matter with a much-noted essay that appeared in The Atlantic Monthly back in 1952. He called it — you guessed it — “The New Isolationism.”
In June 1994, more than 40 years later, with the Cold War now finally won, Schlesinger was back for more with a Times op-ed that sounded the usual alarm. “The Cold War produced the illusion that traditional isolationism was dead and buried,” he wrote, but of course — this is, after all, the Times — it was actually alive and kicking. The passing of the Cold War had “weakened the incentives to internationalism” and was giving isolationists a new opening, even though in “a world of law requiring enforcement,” it was incumbent upon the United States to be the lead enforcer.
The warning resonated. Although the Times does not normally give commencement addresses much attention, it made an exception for Madeleine Albright’s remarks to graduating seniors at Barnard College in May 1995. The U.S. ambassador to the United Nations had detected what she called “a trend toward isolationism that is running stronger in America than at any time since the period between the two world wars,” and the American people were giving in to the temptation “to pull the covers up over our heads and pretend we do not notice, do not care, and are unaffected by events overseas.” In other circumstances in another place, it might have seemed an odd claim, given that the United States had just wrapped up armed interventions in Somalia and Haiti and was on the verge of initiating a bombing campaign in the Balkans.
Still, Schlesinger had Albright’s back. The July/August 1995 issue of Foreign Affairs prominently featured an article of his entitled “Back to the Womb? Isolationism’s Renewed Threat,” with Times editors publishing a CliffsNotes version on the op-ed page a month earlier. “The isolationist impulse has risen from the grave,” Schlesinger announced, “and it has taken the new form of unilateralism.”
His complaint was no longer that the United States hesitated to act, but that it did not act in concert with others. This “neo-isolationism,” he warned, introducing a new note into the tradition of isolationism-bashing for the first time in decades, “promises to prevent the most powerful nation on the planet from playing any role in enforcing the peace system.” The isolationists were winning — this time through pure international belligerence. Yet “as we return to the womb,” Schlesinger warned his fellow citizens, “we are surrendering a magnificent dream.”
Other Times contributors shared Schlesinger’s concern. On January 30, 1996, the columnist Russell Baker chipped in with a piece called “The New Isolationism.” For those slow on the uptake, Jessica Mathews, then a fellow at the Council on Foreign Relations, affirmed Baker’s concerns by publishing an identically titled column in the Washington Post a mere six days later. Mathews reported “troubling signs that the turning inward that many feared would follow the Cold War’s end is indeed happening.” With both the Times and the Post concurring, “the new isolationism” had seemingly reached pandemic proportions (as a title, if nothing else).
Did the “new” isolationism then pave the way for 9/11? Was al-Qaeda inspired by an unwillingness on Washington’s part to insert itself into the Islamic world?
Unintended and unanticipated consequences stemming from prior U.S. interventions might have seemed to offer a better explanation. But this much is for sure: as far as the Times was concerned, even in the midst of George W. Bush’s Global War on Terror, the threat of isolationism persisted.
In January 2004, David M. Malone, president of the International Peace Academy, worried in a Times op-ed “that the United States is retracting into itself” — this despite the fact that U.S. forces were engaged in simultaneous wars in Iraq and Afghanistan. Among Americans, a concern about terrorism, he insisted, was breeding “a sense of self-obsession and indifference to the plight of others.” “When Terrorists Win: Beware America’s New Isolationism,” blared the headline of Malone’s not-so-new piece.
Actually, Americans should beware those who conjure up phony warnings of a “new isolationism” to advance a particular agenda. The essence of that agenda, whatever the particulars and however packaged, is this: If the United States just tries a little bit harder — one more intervention, one more shipment of arms to a beleaguered “ally,” one more line drawn in the sand — we will finally turn the corner and the bright uplands of peace and freedom will come into view.
This is a delusion, of course. But if you write a piece exposing that delusion, don’t bother submitting it to the Times.
Andrew J. Bacevich is a professor of history and international relations at Boston University. His new book is Breach of Trust: How Americans Failed Their Soldiers and Their Country.
Copyright 2013 Andrew Bacevich
Always and Everywhere
Sometimes history happens at the moment when no one is looking. On weekends in late August, the president of the United States ought to be playing golf or loafing at Camp David, not making headlines. Yet Barack Obama chose Labor Day weekend to unveil arguably the most consequential foreign policy shift of his presidency.
In an announcement that surprised virtually everyone, the president told his countrymen and the world that he was putting on hold the much anticipated U.S. attack against Syria. Obama hadn’t, he assured us, changed his mind about the need and justification for punishing the Syrian government for its probable use of chemical weapons against its own citizens. In fact, only days before administration officials had been claiming that, if necessary, the U.S. would “go it alone” in punishing Bashar al-Assad’s regime for its bad behavior. Now, however, Obama announced that, as the chief executive of “the world’s oldest constitutional democracy,” he had decided to seek Congressional authorization before proceeding.
Obama thereby brought to a screeching halt a process extending back over six decades in which successive inhabitants of the Oval Office had arrogated to themselves (or had thrust upon them) ever wider prerogatives in deciding when and against whom the United States should wage war. Here was one point on which every president from Harry Truman to George W. Bush had agreed: on matters related to national security, the authority of the commander-in-chief has no fixed limits. When it comes to keeping the country safe and securing its vital interests, presidents can do pretty much whatever they see fit.
Here, by no means incidentally, lies the ultimate source of the stature and prestige that defines the imperial presidency and thereby shapes (or distorts) the American political system. Sure, the quarters at 1600 Pennsylvania Avenue are classy, but what really endowed the postwar presidency with its singular aura were the missiles, bombers, and carrier battle groups that responded to the commands of one man alone. What’s the bully pulpit in comparison to having the 82nd Airborne and SEAL Team Six at your beck and call?
Now, in effect, Obama was saying to Congress: I’m keen to launch a war of choice. But first I want you guys to okay it. In politics, where voluntarily forfeiting power is an unnatural act, Obama’s invitation qualifies as beyond unusual. Whatever the calculations behind his move, its effect rates somewhere between unprecedented and positively bizarre — the heir to imperial prerogatives acting, well, decidedly unimperial.
Obama is a constitutional lawyer, of course, and it’s pleasant to imagine that he acted out of due regard for what Article 1, Section 8, of the Constitution plainly states, namely that “the Congress shall have power… to declare war.” Take his explanation at face value and the president’s decision ought to earn plaudits from strict constructionists across the land. The Federalist Society should offer Obama an honorary lifetime membership.
Of course, seasoned political observers, understandably steeped in cynicism, dismissed the president’s professed rationale out of hand and immediately began speculating about his actual motivation. The most popular explanation was this: having painted himself into a corner, Obama was trying to lure members of the legislative branch into joining him there. Rather than a belated conversion experience, the president’s literal reading of the Constitution actually amounted to a sneaky political ruse.
After all, the president had gotten himself into a pickle by declaring back in August 2012 that any use of chemical weapons by the government of Bashar al-Assad would cross a supposedly game-changing “red line.” When the Syrians (apparently) called his bluff, Obama found himself facing uniformly unattractive military options that ranged from the patently risky — joining forces with the militants intent on toppling Assad — to the patently pointless — firing a “shot across the bow” of the Syrian ship of state.
Meanwhile, the broader American public, awakening from its summertime snooze, was demonstrating remarkably little enthusiasm for yet another armed intervention in the Middle East. Making matters worse still, U.S. military leaders and many members of Congress, Republican and Democratic alike, were expressing serious reservations or actual opposition. Press reports even cited leaks by unnamed officials who characterized the intelligence linking Assad to the chemical attacks as no “slam dunk,” a painful reminder of how bogus information had paved the way for the disastrous and unnecessary Iraq War. For the White House, even a hint that Obama in 2013 might be replaying the Bush scenario of 2003 was anathema.
The president also discovered that recruiting allies to join him in this venture was proving a hard sell. It wasn’t just the Arab League’s refusal to give an administration strike against Syria its seal of approval, although that was bad enough. Jordan’s King Abdullah, America’s “closest ally in the Arab world,” publicly announced that he favored talking to Syria rather than bombing it. As for Iraq, that previous beneficiary of American liberation, its government was refusing even to allow U.S. forces access to its airspace. Ingrates!
For Obama, the last straw may have come when America’s most reliable (not to say subservient) European partner refused to enlist in yet another crusade to advance the cause of peace, freedom, and human rights in the Middle East. With memories of Tony and George W. apparently eclipsing those of Winston and Franklin, the British Parliament rejected Prime Minister David Cameron’s attempt to position the United Kingdom alongside the United States. Parliament’s vote dashed Obama’s hopes of forging a coalition of two and so investing a war of choice against Syria with at least a modicum of legitimacy.
When it comes to actual military action, only France still entertains the possibility of making common cause with the United States. Yet the number of Americans taking assurance from this prospect approximates the number who know that Bernard-Henri Lévy isn’t a celebrity chef.
John F. Kennedy once remarked that defeat is an orphan. Here was a war bereft of parents even before it had begun.
Whether or Not to Approve the War for the Greater Middle East
Still, whether high-minded constitutional considerations or diabolically clever political machinations motivated the president may matter less than what happens next. Obama lobbed the ball into Congress’s end of the court. What remains to be seen is how the House and the Senate, just now coming back into session, will respond.
At least two possibilities exist, one with implications that could prove profound and the second holding the promise of being vastly entertaining.
On the one hand, Obama has implicitly opened the door for a Great Debate regarding the trajectory of U.S. policy in the Middle East. Although a week or ten days from now the Senate and House of Representatives will likely be voting to approve or reject some version of an Authorization for the Use of Military Force (AUMF), at stake is much more than the question of what to do about Syria. The real issue — Americans should hope that the forthcoming congressional debate makes this explicit — concerns the advisability of continuing to rely on military might as the preferred means of advancing U.S. interests in this part of the world.
Appreciating the actual stakes requires putting the present crisis in a broader context. Herewith an abbreviated history lesson.
Back in 1980, President Jimmy Carter announced that the United States would employ any means necessary to prevent a hostile power from gaining control of the Persian Gulf. In retrospect, it’s clear enough that the promulgation of the so-called Carter Doctrine amounted to a de facto presidential “declaration” of war (even if Carter himself did not consciously intend to commit the United States to perpetual armed conflict in the region). Certainly, what followed was a never-ending sequence of wars and war-like episodes. Although the Congress never formally endorsed Carter’s declaration, it tacitly acceded to all that his commitment subsequently entailed.
Relatively modest in its initial formulation, the Carter Doctrine quickly metastasized. Geographically, it grew far beyond the bounds of the Persian Gulf, eventually encompassing virtually all of the Islamic world. Washington’s own ambitions in the region also soared. Rather than merely preventing a hostile power from achieving dominance in the Gulf, the United States was soon seeking to achieve dominance itself. Dominance — that is, shaping the course of events to Washington’s liking — was said to hold the key to maintaining stability, ensuring access to the world’s most important energy reserves, checking the spread of Islamic radicalism, combating terrorism, fostering Israel’s security, and promoting American values. Through the adroit use of military might, dominance actually seemed plausible. (So at least Washington persuaded itself.)
What this meant in practice was the wholesale militarization of U.S. policy toward the Greater Middle East in a period in which Washington’s infatuation with military power was reaching its zenith. As the Cold War wound down, the national security apparatus shifted its focus from defending Germany’s Fulda Gap to projecting military power throughout the Islamic world. In practical terms, this shift found expression in the creation of Central Command (CENTCOM), reconfigured forces, and an eternal round of contingency planning, war plans, and military exercises in the region. To lay the basis for the actual commitment of troops, the Pentagon established military bases, stockpiled matériel in forward locations, and negotiated transit rights. It also courted and armed proxies. In essence, the Carter Doctrine provided the Pentagon (along with various U.S. intelligence agencies) with a rationale for honing and then exercising new capabilities.
Capabilities expanded the range of policy options. Options offered opportunities to “do something” in response to crisis. From the Reagan era on, policymakers seized upon those opportunities with alacrity. A seemingly endless series of episodes and incidents ensued, as U.S. forces, covert operatives, or proxies engaged in hostile actions (often on multiple occasions) in Lebanon, Libya, Iran, Somalia, Bosnia, Kosovo, Saudi Arabia, the Sudan, Yemen, Pakistan, the southern Philippines, and in the Persian Gulf itself, not to mention Iraq and Afghanistan. Consider them altogether and what you have is a War for the Greater Middle East, pursued by the United States for over three decades now. If Congress gives President Obama the green light, Syria will become the latest front in this ongoing enterprise.
Profiles in Courage? If Only
A debate over the Syrian AUMF should encourage members of Congress — if they’ve got the guts — to survey this entire record of U.S. military activities in the Greater Middle East going back to 1980. To do so means almost unavoidably confronting this simple question: How are we doing? To state the matter directly, all these years later, given all the ordnance expended, all the toing-and-froing of U.S. forces, and all the lives lost or shattered along the way, is mission accomplishment anywhere in sight? Or have U.S. troops — the objects of such putative love and admiration on the part of the American people — been engaged over the past 30-plus years in a fool’s errand? How members cast their votes on the Syrian AUMF will signal their answer — and by extension the nation’s answer — to that question.
To okay an attack on Syria will, in effect, reaffirm the Carter Doctrine and put a stamp of congressional approval on the policies that got us where we are today. A majority vote in favor of the Syrian AUMF will sustain and probably deepen Washington’s insistence that the resort to violence represents the best way to advance U.S. interests in the Islamic world. From this perspective, all we need to do is try harder and eventually we’ll achieve a favorable outcome. With Syria presumably the elusive but never quite attained turning point, the Greater Middle East will stabilize. Democracy will flourish. And the United States will bask in the appreciation of those we have freed from tyranny.
To vote against the AUMF, on the other hand, will draw a red line of much greater significance than the one that President Obama himself so casually laid down. Should the majority in either House reject the Syrian AUMF, the vote will call into question the continued viability of the Carter Doctrine and all that followed in its wake.
It will create space to ask whether having another go is likely to produce an outcome any different from what the United States has achieved in the myriad places throughout the Greater Middle East where U.S. forces (or covert operatives) have, whatever their intentions, spent the past several decades wreaking havoc and sowing chaos under the guise of doing good. Instead of offering more of the same — does anyone seriously think that ousting Assad will transform Syria into an Arab Switzerland? — rejecting the AUMF might even invite the possibility of charting an altogether different course, entailing perhaps a lower military profile and greater self-restraint.
What a stirring prospect! Imagine members of Congress setting aside partisan concerns to debate first-order questions of policy. Imagine them putting the interests of the country in front of their own worries about winning reelection or pursuing their political ambitions. It would be like Lincoln vs. Douglas or Woodrow Wilson vs. Henry Cabot Lodge. Call Doris Kearns Goodwin. Call Spielberg or Sorkin. Get me Capra, for God’s sake. We’re talking high drama of blockbuster proportions.
On the other hand, given the record of the recent past, we should hardly discount the possibility that our legislative representatives will not rise to the occasion. Invited by President Obama to share in the responsibility for deciding whether and where to commit acts of war, one or both Houses — not known these days for displaying either courage or responsibility — may choose instead to punt.
As we have learned by now, the possible ways for Congress to shirk its duty are legion. In this instance, all are likely to begin with the common supposition that nothing’s at stake here except responding to Assad’s alleged misdeeds. To refuse to place the Syrian crisis in any larger context is, of course, a dodge. Yet that dodge creates multiple opportunities for our elected representatives to let themselves off the hook.
Congress could, for example, pass a narrowly drawn resolution authorizing Obama to fire his “shot across the bow” and no more. In other words, it could basically endorse the president’s inclination to substitute gesture for policy.
Or it could approve a broadly drawn, but vacuous resolution, handing the president a blank check. Ample precedent exists for that approach, since it more or less describes what Congress did in 1964 with the Tonkin Gulf Resolution, opening the way to presidential escalation in Vietnam, or with the AUMF it passed in the immediate aftermath of 9/11, giving George W. Bush’s administration permission to do more or less anything it wanted to just about anyone.
Even more irresponsibly, Congress could simply reject any Syrian AUMF, however worded, without identifying a plausible alternative to war, in effect washing its hands of the matter and creating a policy vacuum.
Will members of the Senate and the House grasp the opportunity to undertake an urgently needed reassessment of America’s War for the Greater Middle East? Or wriggling and squirming, will they inelegantly sidestep the issue, opting for short-term expediency in place of serious governance? In an age when the numbing blather of McCain, McConnell, and Reid has replaced the oratory of Clay, Calhoun, and Webster, merely to pose the question is to answer it.
But let us not overlook the entertainment value of such an outcome, which could well be formidable. In all likelihood, high comedy Washington-style lurks just around the corner. So renew that subscription to The Onion. Keep an eye on Doonesbury. Set the TiVo to record Jon Stewart. This is going to be really funny — and utterly pathetic. Where’s H.L. Mencken when we need him?
Andrew J. Bacevich is a professor of history and international relations at Boston University. He is the author of the new book, Breach of Trust: How Americans Failed Their Soldiers and Their Country (Metropolitan Books).
Copyright 2013 Andrew Bacevich
The Hill to the Rescue on Syria?
For well over a decade now the United States has been “a nation at war.” Does that war have a name?
It did at the outset. After 9/11, George W. Bush’s administration wasted no time in announcing that the U.S. was engaged in a Global War on Terrorism, or GWOT. With few dissenters, the media quickly embraced the term. The GWOT promised to be a gargantuan, transformative enterprise. The conflict begun on 9/11 would define the age. In neoconservative circles, it was known as World War IV.
Upon succeeding to the presidency in 2009, however, Barack Obama without fanfare junked Bush’s formulation (as he did again in a speech at the National Defense University last week). Yet if the appellation went away, the conflict itself, shorn of identifying marks, continued.
Does it matter that ours has become and remains a nameless war? Very much so.
Names bestow meaning. When it comes to war, a name attached to a date can shape our understanding of what the conflict was all about. To specify when a war began and when it ended is to privilege certain explanations of its significance while discrediting others. Let me provide a few illustrations.
With rare exceptions, Americans today characterize the horrendous fraternal bloodletting of 1861-1865 as the Civil War. Yet not many decades ago, diehard supporters of the Lost Cause insisted on referring to that conflict as the War Between the States or the War for Southern Independence (or even the War of Northern Aggression). The South may have gone down in defeat, but the purposes for which Southerners had fought — preserving a distinctive way of life and the principle of states’ rights — had been worthy, even noble. So at least they professed to believe, with their preferred names for the war reflecting that belief.
Schoolbooks tell us that the Spanish-American War began in April 1898 and ended in August of that same year. The name and dates fit nicely with a widespread inclination from President William McKinley’s day to our own to frame U.S. intervention in Cuba as an altruistic effort to liberate that island from Spanish oppression.
Yet the Cubans were not exactly bystanders in that drama. By 1898, they had been fighting for years to oust their colonial overlords. And although hostilities in Cuba itself ended on August 12th, they dragged on in the Philippines, another Spanish colony that the United States had seized for reasons only remotely related to liberating Cubans. Notably, U.S. troops occupying the Philippines waged a brutal war not against Spaniards but against Filipino nationalists no more inclined to accept colonial rule by Washington than by Madrid. So widen the aperture to include this Cuban prelude and the Filipino postlude and you end up with something like this: The Spanish-American-Cuban-Philippines War of 1895-1902. Too clunky? How about the War for the American Empire? This much is for sure: rather than illuminating, the commonplace textbook descriptor serves chiefly to conceal.
Strange as it may seem, Europeans once referred to the calamitous events of 1914-1918 as the Great War. When Woodrow Wilson decided in 1917 to send an army of doughboys to fight alongside the Allies, he went beyond Great. According to the president, the Great War was going to be the War To End All Wars. Alas, things did not pan out as he expected. Perhaps anticipating the demise of his vision of permanent peace, War Department General Order 115, issued on October 7, 1919, formally declared that, at least as far as the United States was concerned, the recently concluded hostilities would be known simply as the World War.
In September 1939 — presto chango! — the World War suddenly became the First World War, the Nazi invasion of Poland having inaugurated a Second World War, also known as World War II or more cryptically WWII. To be sure, Soviet dictator Josef Stalin preferred the Great Patriotic War. Although this found instant — almost unanimous — favor among Soviet citizens, it did not catch on elsewhere.
Does World War II accurately capture the events it purports to encompass? With the crusade against the Axis now ranking alongside the crusade against slavery as a myth-enshrouded chapter in U.S. history to which all must pay homage, Americans are no more inclined to consider that question than to consider why a playoff to determine the professional baseball championship of North America constitutes a “World Series.”
In fact, however convenient and familiar, World War II is misleading and not especially useful. The period in question saw at least two wars, each only tenuously connected to the other, each having distinctive origins, each yielding a different outcome. To separate them is to transform the historical landscape.
On the one hand, there was the Pacific War, pitting the United States against Japan. Formally initiated by the December 7, 1941, attack on Pearl Harbor, it had in fact begun a decade earlier when Japan embarked upon a policy of armed conquest in Manchuria. At stake was the question of who would dominate East Asia. Japan’s crushing defeat at the hands of the United States, sealed by two atomic bombs in 1945, answered that question (at least for a time).
Then there was the European War, pitting Nazi Germany first against Great Britain and France, but ultimately against a grand alliance led by the United States, the Soviet Union, and a fast fading British Empire. At stake was the question of who would dominate Europe. Germany’s defeat resolved that issue (at least for a time): no one would. To prevent any single power from controlling Europe, two outside powers divided it.
This division served as the basis for the ensuing Cold War, which wasn’t actually cold, but also (thankfully) wasn’t World War III, the retrospective insistence of bellicose neoconservatives notwithstanding. But when did the Cold War begin? Was it in early 1947, when President Harry Truman decided that Stalin’s Russia posed a looming threat and committed the United States to a strategy of containment? Or was it in 1919, when Vladimir Lenin decided that Winston Churchill’s vow to “strangle Bolshevism in its cradle” posed a looming threat to the Russian Revolution, with an ongoing Anglo-American military intervention evincing a determination to make good on that vow?
Separating the war against Nazi Germany from the war against Imperial Japan opens up another interpretive possibility. If you incorporate the European conflict of 1914-1918 and the European conflict of 1939-1945 into a single narrative, you get a Second Thirty Years War (the first having occurred from 1618 to 1648) — not so much a contest of good against evil, as a mindless exercise in self-destruction that represented the ultimate expression of European folly.
So, yes, it matters what we choose to call the military enterprise we’ve been waging not only in Iraq and Afghanistan, but also in any number of other countries scattered hither and yon across the Islamic world. Although the Obama administration appears no more interested than the Bush administration in saying when that enterprise will actually end, the date we choose as its starting point also matters.
Although Washington seems in no hurry to name its nameless war — and will no doubt settle on something self-serving or anodyne if it ever finally addresses the issue — perhaps we should jump-start the process. Let’s consider some possible options, names that might actually explain what’s going on.
The Long War: Coined not long after 9/11 by senior officers in the Pentagon, this formulation never gained traction with either civilian officials or the general public. Yet the Long War deserves consideration, even though — or perhaps because — it has lost its luster with the passage of time.
At the outset, it connoted grand ambitions buoyed by extreme confidence in the efficacy of American military might. This was going to be one for the ages, a multi-generational conflict yielding sweeping results.
The Long War did begin on a hopeful note. The initial entry into Afghanistan and then into Iraq seemed to herald “home by Christmas” triumphal parades. Yet this soon proved an illusion as victory slipped from Washington’s grasp. By 2005 at the latest, events in the field had dashed the neo-Wilsonian expectations nurtured back home.
With the conflicts in Iraq and Afghanistan dragging on, “long” lost its original connotation. Instead of “really important,” it became a synonym for “interminable.” Today, the Long War does succinctly capture the experience of American soldiers who have endured multiple combat deployments to Iraq and Afghanistan.
For Long War combatants, the object of the exercise has become to persist. As for winning, it’s not in the cards. The Long War just might conclude by the end of 2014 if President Obama keeps his pledge to end the U.S. combat role in Afghanistan and if he avoids getting sucked into Syria’s civil war. So the troops may hope.
The War Against Al-Qaeda: It began in August 1996 when Osama bin Laden issued a “Declaration of War against the Americans Occupying the Land of the Two Holy Places,” i.e., Saudi Arabia. In February 1998, a second bin Laden manifesto announced that killing Americans, military and civilian alike, had become “an individual duty for every Muslim who can do it in any country in which it is possible to do it.”
Although President Bill Clinton took notice, the U.S. response to bin Laden’s provocations was limited and ineffectual. Only after 9/11 did Washington take this threat seriously. Since then, apart from a pointless excursion into Iraq (where, in Saddam Hussein’s day, al-Qaeda did not exist), U.S. attention has been focused on Afghanistan, where U.S. troops have waged the longest war in American history, and on Pakistan’s tribal borderlands, where a CIA drone campaign is ongoing. By the end of President Obama’s first term, U.S. intelligence agencies were reporting that a combined CIA/military campaign had largely destroyed bin Laden’s organization. Bin Laden himself, of course, was dead.
Could the United States have declared victory in its unnamed war at this point? Perhaps, but it gave little thought to doing so. Instead, the national security apparatus had already trained its sights on various al-Qaeda “franchises” and wannabes, militant groups claiming the bin Laden brand and waging their own version of jihad. These offshoots emerged in the Maghreb, Yemen, Somalia, Nigeria, and — wouldn’t you know it — post-Saddam Iraq, among other places. The question as to whether they actually posed a danger to the United States got, at best, passing attention — the label “al-Qaeda” eliciting the same sort of Pavlovian response that the word “communist” once did.
Americans should not expect this war to end anytime soon. Indeed, the Pentagon’s impresario of special operations recently speculated — by no means unhappily — that it would continue globally for “at least 10 to 20 years.” Freely translated, his statement undoubtedly means: “No one really knows, but we’re planning to keep at it for one helluva long time.”
The War For/Against/About Israel: It began in 1948. For many Jews, the founding of the state of Israel signified an ancient hope fulfilled. For many Christians, conscious of the sin of anti-Semitism that had culminated in the Holocaust, it offered a way to ease guilty consciences, albeit mostly at others’ expense. For many Muslims, especially Arabs, and most acutely Arabs who had been living in Palestine, the founding of the Jewish state represented a grave injustice. It was yet another unwelcome intrusion engineered by the West — colonialism by another name.
Recounting the ensuing struggle without appearing to take sides is almost impossible. Yet one thing seems clear: in terms of military involvement, the United States attempted in the late 1940s and 1950s to keep its distance. Over the course of the 1960s, this changed. The U.S. became Israel’s principal patron, committed to maintaining (and indeed increasing) its military superiority over its neighbors.
In the decades that followed, the two countries forged a multifaceted “strategic relationship.” A compliant Congress provided Israel with weapons and other assistance worth many billions of dollars, testifying to what has become an unambiguous and irrevocable U.S. commitment to the safety and well-being of the Jewish state. The two countries share technology and intelligence. Meanwhile, just as Israel had disregarded U.S. concerns when it came to developing nuclear weapons, it ignored persistent U.S. requests that it refrain from colonizing territory that it has conquered.
When it comes to identifying the minimal essential requirements of Israeli security and the terms that will define any Palestinian-Israeli peace deal, the United States defers to Israel. That may qualify as an overstatement, but only slightly. Given the Israeli perspective on those requirements and those terms — permanent military supremacy and a permanently demilitarized Palestine allowed limited sovereignty — the War For/Against/About Israel is unlikely to end anytime soon either. Whether the United States benefits from the perpetuation of this war is difficult to say, but we are in it for the long haul.
The War for the Greater Middle East: I confess that this is the name I would choose for Washington’s unnamed war and is, in fact, the title of a course I teach. (A tempting alternative is the Second Hundred Years War, the “first” having begun in 1337 and ended in 1453.)
This war is about to hit the century mark, its opening chapter coinciding with the onset of World War I. Not long after the fighting on the Western Front in Europe had settled into a stalemate, the British government, looking for ways to gain the upper hand, set out to dismantle the Ottoman Empire, whose rulers had foolishly thrown in their lot with the German Reich against the Allies.
By the time the war ended with Germany and the Turks on the losing side, Great Britain had already begun to draw up new boundaries, invent states, and install rulers to suit its predilections, while also issuing mutually contradictory promises to groups inhabiting these new precincts of its empire. Toward what end? Simply put, the British were intent on calling the shots from Egypt to India, whether by governing through intermediaries or ruling directly. The result was a new Middle East and a total mess.
London presided over this mess, albeit with considerable difficulty, until the end of World War II. At this point, by abandoning efforts to keep Arabs and Zionists from one another’s throats in Palestine and by accepting the partition of India, the British signaled their intention to throw in the towel. Alas, Washington proved more than willing to assume Britain’s role. The lure of oil was strong. So too were the fears, however overwrought, of the Soviets extending their influence into the region.
Unfortunately, the Americans enjoyed no more success in promoting long-term, pro-Western stability than had the British. In some respects, they only made things worse, with the joint CIA-MI6 overthrow of a democratically elected government in Iran in 1953 offering a prime example of a “success” that, to this day, has never stopped breeding disaster.
Only after 1980 did things get really interesting, however. The Carter Doctrine promulgated that year designated the Persian Gulf a vital national security interest and opened the door to greatly increased U.S. military activity not just in the Gulf, but also throughout the Greater Middle East (GME). Between 1945 and 1980, considerable numbers of American soldiers lost their lives fighting in Asia and elsewhere. During that period, virtually none were killed fighting in the GME. Since 1990, in contrast, virtually none have been killed fighting anywhere except in the GME.
What does the United States hope to achieve in its inherited and unending War for the Greater Middle East? To pacify the region? To remake it in our image? To drain its stocks of petroleum? Or just to keep the lid on? However you define the war’s aims, things have not gone well, which once again suggests that, in some form, it will continue for some time to come. If there’s any good news here, it’s the prospect of having ever more material for my seminar, which may soon expand into a two-semester course.
The War Against Islam: This war began nearly 1,000 years ago and continued for centuries, a storied collision between Christendom and the Muslim ummah. For a couple of hundred years, periodic eruptions of large-scale violence occurred until the conflict finally petered out with the last crusade sometime in the fourteenth century.
In those days, many people deemed religion something worth fighting for, a proposition to which the more sophisticated present-day inhabitants of Christendom no longer subscribe. Yet could that religious war have resumed in our own day? Professor Samuel Huntington thought so, although he styled the conflict a “clash of civilizations.” Some militant radical Islamists agree with Professor Huntington, citing as evidence the unwelcome meddling of “infidels,” mostly wearing American uniforms, in various parts of the Muslim world. Some militant evangelical Christians endorse this proposition, even if they take a more favorable view of U.S. troops occupying and drones targeting Muslim countries.
In explaining the position of the United States government, religious scholars like George W. Bush and Barack (Hussein!) Obama have consistently expressed a contrary view. Islam is a religion of peace, they declare, part of the great Abrahamic triad. That the other elements of that triad are likewise committed to peace is a proposition that Bush, Obama, and most Americans take for granted, evidence not required. There should be no reason why Christians, Jews, and Muslims can’t live together in harmony.
Still, remember back in 2001 when, in an unscripted moment, President Bush described the war barely begun as a “crusade”? That was just a slip of the tongue, right? If not, we just might end up calling this one the Eternal War.
Andrew J. Bacevich is a professor of history and international relations at Boston University and a TomDispatch regular. His next book, Breach of Trust: How Americans Failed Their Soldiers and Their Country, will appear in September.
Copyright 2013 Andrew J. Bacevich
Naming Our Nameless War
First came the hullabaloo over the “Mosque at Ground Zero.” Then there was Pastor Terry Jones of Gainesville, Florida, grabbing headlines as he promoted “International Burn-a-Koran Day.” Most recently, we have an American posting a slanderous anti-Muslim video on the Internet with all the ensuing turmoil.
Throughout, the official U.S. position has remained fixed: the United States government condemns Islamophobia. Americans respect Islam as a religion of peace. Incidents suggesting otherwise are the work of a tiny minority — whackos, hatemongers, and publicity-seekers. Among Muslims from Benghazi to Islamabad, the argument has proven to be a tough sell.
And not without reason: although it might be comforting to dismiss anti-Islamic outbursts in the U.S. as the work of a few fanatics, the picture is actually far more complicated. Those complications in turn help explain why religion, once considered a foreign policy asset, has in recent years become a net liability.
Let’s begin with a brief history lesson. From the late 1940s to the late 1980s, when Communism provided the overarching ideological rationale for American globalism, religion figured prominently as a theme of U.S. foreign policy. Communist antipathy toward religion helped invest the Cold War foreign policy consensus with its remarkable durability. That Communists were godless sufficed to place them beyond the pale. For many Americans, the Cold War derived its moral clarity from the conviction that here was a contest pitting the God-fearing against the God-denying. Since we were on God’s side, it appeared axiomatic that God should repay the compliment.
From time to time during the decades when anti-Communism provided so much of the animating spirit of U.S. policy, Judeo-Christian strategists in Washington (not necessarily believers themselves), drawing on the theologically correct proposition that Christians, Jews, and Muslims all worship the same God, sought to enlist Muslims, sometimes of fundamentalist persuasions, in the cause of opposing the godless. One especially notable example was the Soviet-Afghan War of 1979-1989. To inflict pain on the Soviet occupiers, the United States threw its weight behind the Afghan resistance, styled in Washington as “freedom fighters,” and funneled aid (via the Saudis and the Pakistanis) to the most religiously extreme among them. When this effort resulted in a massive Soviet defeat, the United States celebrated its support for the Afghan Mujahedeen as evidence of strategic genius. It was almost as if God had rendered a verdict.
Yet not so many years after the Soviets withdrew in defeat, the freedom fighters morphed into the fiercely anti-Western Taliban, providing sanctuary to al-Qaeda as it plotted — successfully — to attack the United States. Clearly, this was a monkey wrench thrown into God’s plan.
With the launching of the Global War on Terrorism, Islamism succeeded Communism as the body of beliefs that, if left unchecked, threatened to sweep across the globe with dire consequences for freedom. Those whom Washington had armed as “freedom fighters” now became America’s most dangerous enemies. So at least members of the national security establishment believed or purported to believe, thereby curtailing any further discussion of whether militarized globalism actually represented the best approach to promoting liberal values globally or even served U.S. interests.
Yet as a rallying cry, a war against Islamism presented difficulties right from the outset. As much as policymakers struggled to prevent Islamism from merging in the popular mind with Islam itself, significant numbers of Americans — whether genuinely fearful or mischief-minded — saw this as a distinction without a difference. Efforts by the Bush administration to work around this problem by framing the post-9/11 threat under the rubric of “terrorism” ultimately failed because that generic term offered no explanation for motive. However the administration twisted and turned, motive in this instance seemed bound up with matters of religion.
Where exactly to situate God in post-9/11 U.S. policy posed a genuine challenge for policymakers, not least of all for George W. Bush, who believed, no doubt sincerely, that God had chosen him to defend America in its time of maximum danger. Unlike the communists, far from denying God’s existence, Islamists embrace God with startling ferocity. Indeed, in their vitriolic denunciations of the United States and in perpetrating acts of anti-American violence, they audaciously present themselves as nothing less than God’s avenging agents. In confronting the Great Satan, they claim to be doing God’s will.
Waging War in Jesus’s Name
This debate over who actually represents God’s will is one that the successive administrations of George W. Bush and Barack Obama have studiously sought to avoid. The United States is not at war with Islam per se, U.S. officials insist. Still, among Muslims abroad, Washington’s repeated denials notwithstanding, suspicion persists and not without reason.
Consider the case of Lieutenant General William G. (“Jerry”) Boykin. While still on active duty in 2002, this highly decorated Army officer spoke in uniform at a series of some 30 church gatherings during which he offered his own response to President Bush’s famous question: “Why do they hate us?” The general’s perspective differed markedly from his commander-in-chief’s: “The answer to that is because we’re a Christian nation. We are hated because we are a nation of believers.”
On another such occasion, the general recalled his encounter with a Somali warlord who claimed to enjoy Allah’s protection. The warlord was deluding himself, Boykin declared, and was sure to get his comeuppance: “I knew that my God was bigger than his. I knew that my God was a real God and his was an idol.” As a Christian nation, Boykin insisted, the United States would succeed in overcoming its adversaries only if “we come against them in the name of Jesus.”
When Boykin’s remarks caught the attention of the mainstream press, denunciations rained down from on high, as the White House, the State Department, and the Pentagon hastened to disassociate the government from the general’s views. Yet subsequent indicators suggest that, however crudely, Boykin was indeed expressing perspectives shared by more than a few of his fellow citizens.
One such indicator came immediately: despite the furor, the general kept his important Pentagon job as deputy undersecretary of defense for intelligence, suggesting that the Bush administration considered his transgression minor. Perhaps Boykin had spoken out of turn, but his was not a fireable offense. (One can only speculate regarding the fate likely to befall a high-ranking U.S. officer daring to say of Israeli Prime Minister Benjamin Netanyahu, “My God is a real God and his is an idol.”)
A second indicator came in the wake of Boykin’s retirement from active duty. In 2012, the influential Family Research Council (FRC) in Washington hired the general to serve as the organization’s executive vice-president. Devoted to “advancing faith, family, and freedom,” the council presents itself as emphatically Christian in its outlook. FRC events routinely attract Republican Party heavyweights. The organization forms part of the conservative mainstream, much as, say, the American Civil Liberties Union forms part of the left-liberal mainstream.
So for the FRC to hire as its executive vice-president someone espousing Boykin’s pronounced views regarding Islam qualifies as noteworthy. At a minimum, those who recruited the former general apparently found nothing especially objectionable in his worldview. They saw nothing politically risky about associating with Jerry Boykin. He’s their kind of guy. More likely, by hiring Boykin, the FRC intended to send a signal: on matters where its new executive vice-president claimed expertise — above all, war — thumb-in-your-eye political incorrectness was becoming a virtue. Imagine the NAACP electing Nation of Islam leader Louis Farrakhan as its national president, thereby endorsing his views on race, and you get the idea.
What the FRC’s embrace of General Boykin makes clear is this: to dismiss manifestations of Islamophobia simply as the work of an insignificant American fringe is mistaken. As with the supporters of Senator Joseph McCarthy, who during the early days of the Cold War saw communists under every State Department desk, those engaging in these actions are daring to express openly attitudes that others in far greater numbers also quietly nurture. To put it another way, what Americans in the 1950s knew as McCarthyism has reappeared in the form of Boykinism.
Historians differ passionately over whether McCarthyism represented a perversion of anti-Communism or its truest expression. So, too, present-day observers will disagree as to whether Boykinism represents a merely fervent or utterly demented response to the Islamist threat. Yet this much is inarguable: just as the junior senator from Wisconsin in his heyday embodied a non-trivial strain of American politics, so, too, does the former special-ops-warrior-turned-“ordained minister with a passion for spreading the Gospel of Jesus Christ.”
Notably, the views of the former general, Boykinism’s leading exponent, bear a striking resemblance to those favored by the late senator. Like McCarthy, Boykin believes that, while enemies beyond America’s gates pose great dangers, the enemy within poses a still greater threat. “I’ve studied Marxist insurgency,” he declared in a 2010 video. “It was part of my training. And the things I know that have been done in every Marxist insurgency are being done in America today.” Explicitly comparing the United States as governed by Barack Obama to Stalin’s Soviet Union, Mao Zedong’s China, and Fidel Castro’s Cuba, Boykin charges that, under the guise of health reform, the Obama administration is secretly organizing a “constabulary force that will control the population in America.” This new force is, he claims, designed to be larger than the United States military, and will function just as Hitler’s Brownshirts once did in Germany. All of this is unfolding before our innocent and unsuspecting eyes.
Boykinism: The New McCarthyism
How many Americans endorsed McCarthy’s conspiratorial view of national and world politics? It’s difficult to know for sure, but enough in Wisconsin to win him reelection in 1952, by a comfortable 54% to 46% majority. Enough to strike fear into the hearts of politicians who quaked at the thought of McCarthy fingering them for being “soft on Communism.”
How many Americans endorse Boykin’s comparably incendiary views? Again, it’s difficult to tell. Enough to persuade FRC’s funders and supporters to hire him, confident that doing so would burnish, not tarnish, the organization’s brand. Certainly, Boykin has in no way damaged its ability to attract powerhouses of the domestic right. FRC’s recent “Values Voter Summit” featured luminaries such as Republican vice-presidential nominee Paul Ryan, former Republican Senator and presidential candidate Rick Santorum, House Majority Leader Eric Cantor, and Representative Michele Bachmann — along with Jerry Boykin himself, who lectured attendees on “Israel, Iran, and the Future of Western Civilization.” (In early August, Mitt Romney met privately with a group of “prominent social conservatives,” including Boykin.)
Does their appearance at the FRC podium signify that Ryan, Santorum, Cantor, and Bachmann all subscribe to Boykinism’s essential tenets? Not any more than those who exploited the McCarthyite moment to their own political advantage — Richard Nixon, for example — necessarily agreed with all of McCarthy’s reckless accusations. Yet the presence of leading Republicans on an FRC program featuring Boykin certainly suggests that they find nothing especially objectionable or politically damaging to them in his worldview.
Still, comparisons between McCarthyism and Boykinism only go so far. Senator McCarthy wreaked havoc mostly on the home front, instigating witch-hunts, destroying careers, and trampling on civil rights, while imparting to American politics even more of a circus atmosphere than usual. In terms of foreign policy, the effect of McCarthyism, if anything, was to reinforce an already existing anti-communist consensus. McCarthy’s antics didn’t create enemies abroad. McCarthyism merely reaffirmed that communists were indeed the enemy, while making the political price of thinking otherwise too high to contemplate.
Boykinism, in contrast, makes its impact felt abroad. Unlike McCarthyism, it doesn’t strike fear into the hearts of incumbents on the campaign trail here. Attracting General Boykin’s endorsement or provoking his ire probably won’t determine the outcome of any election. Yet in its various manifestations Boykinism provides the kindling that helps sustain anti-American sentiment in the Islamic world. It reinforces the belief among Muslims that the Global War on Terror really is a war against them.
Boykinism confirms what many Muslims are already primed to believe: that American values and Islamic values are irreconcilable. American presidents and secretaries of state stick to their talking points, praising Islam as a great religious tradition and touting past U.S. military actions (ostensibly) undertaken on behalf of Muslims. Yet with their credibility among Iraqis, Afghans, Pakistanis, and others in the Greater Middle East about nil, they are pissing in the wind.
As long as substantial numbers of vocal Americans reject the ideological argument constructed to justify U.S. intervention in the Islamic world — that their conception of freedom (including religious freedom) is ultimately compatible with ours — Muslims will reject it too. In that sense, the supporters of Boykinism encourage Muslims to follow suit. This ensures, by extension, that further reliance on armed force as the preferred instrument of U.S. policy in the Islamic world will compound the errors that produced and have defined the post-9/11 era.
Andrew J. Bacevich is currently a visiting fellow at Notre Dame’s Kroc Institute for International Peace Studies. A TomDispatch regular, he is author of Washington Rules: America’s Path to Permanent War, among other works, and most recently editor of The Short American Century.
Copyright 2012 Andrew J. Bacevich
Scoring the Global War on Terror

With the United States now well into the second decade of what the Pentagon has styled an “era of persistent conflict,” the war formerly known as the global war on terrorism (unofficial acronym WFKATGWOT) appears increasingly fragmented and diffuse. Without achieving victory, yet unwilling to acknowledge failure, the United States military has withdrawn from Iraq. It is trying to leave Afghanistan, where events seem equally unlikely to yield a happy outcome.
Elsewhere — in Pakistan, Libya, Yemen, and Somalia, for example — U.S. forces are busily opening up new fronts. Published reports that the United States is establishing “a constellation of secret drone bases” in or near the Horn of Africa and the Arabian Peninsula suggest that the scope of operations will only widen further. In a front-page story, the New York Times described plans for “thickening” the global presence of U.S. special operations forces. Rushed Navy plans to convert an aging amphibious landing ship into an “afloat forward staging base” — a mobile launch platform for either commando raids or minesweeping operations in the Persian Gulf — only reinforce the point. Yet as some fronts close down and others open up, the war’s narrative has become increasingly difficult to discern. How much farther until we reach the WFKATGWOT’s equivalent of Berlin? What exactly is the WFKATGWOT’s equivalent of Berlin? In fact, is there a storyline here at all?
Viewed close-up, the “war” appears to have lost form and shape. Take a couple of steps back, however, and important patterns begin to appear. What follows is a preliminary attempt to score the WFKATGWOT, dividing the conflict into a bout of three rounds. Although there may be several additional rounds still to come, here’s what we’ve suffered through thus far.
The Rumsfeld Era
Round 1: Liberation. More than any other figure — more than any general, even more than the president himself — Secretary of Defense Donald Rumsfeld dominated the war’s early stages. Appearing for a time to be a larger-than-life figure — the “Secretary at War” in the eyes of an adoring (if fickle) neocon fan club — Rumsfeld dedicated himself to the proposition that, in battle, speed holds the key to victory. He threw his considerable weight behind a high-tech American version of blitzkrieg. U.S. forces, he regularly insisted, were smarter and more agile than any adversary. To employ them in ways that took advantage of those qualities was to guarantee victory. The journalistic term adopted to describe this concept was “shock and awe.”
No one believed more passionately in “shock and awe” than Rumsfeld himself. The design of Operation Enduring Freedom, launched in October 2001, and of Operation Iraqi Freedom, begun in March 2003, reflected this belief. In each instance, the campaign got off to a promising start, with U.S. troops landing some swift and impressive blows. In neither case, however, were they able to finish off their opponent or even, in reality, sort out just who their opponent might be. Unfortunately for Rumsfeld, the “terrorists” refused to play by his rulebook and U.S. forces proved to be less smart and agile than their technological edge — and their public relations machine — suggested would be the case. Indeed, when harassed by minor insurgencies and scattered bands of jihadis, they proved surprisingly slow to figure out what hit them.
In Afghanistan, Rumsfeld let victory slip through his grasp. In Iraq, his mismanagement of the campaign brought the United States face-to-face with outright defeat. Rumsfeld’s boss had hoped to liberate (and, of course, dominate) the Islamic world through a series of short, quick thrusts. What Bush got instead were two different versions of a long, hard slog. By the end of 2006, “shock and awe” was kaput. Trailing well behind the rest of the country and its armed forces, the president eventually lost confidence in his defense secretary’s approach. As a result, Rumsfeld lost his job. Round one came to an end, the Americans, rather embarrassingly, having lost it on points.
The Petraeus Era
Round 2: Pacification. Enter General David Petraeus. More than any other figure, in or out of uniform, Petraeus dominated the WFKATGWOT’s second phase. Round two opened with lowered expectations. Gone was the heady talk of liberation. Gone, too, were predictions of lightning victories. The United States was now willing to settle for much less while still claiming success.
Petraeus offered a formula for restoring a semblance of order to countries reduced to chaos as a result of round one. Order might permit the United States to extricate itself while still claiming to have met at least some of its policy objectives. This became the operative definition of victory.
The formal name for the formula that Petraeus devised was counterinsurgency, or COIN. Rather than trying to defeat the enemy, COIN sought to facilitate the emergence of a viable and stable nation-state. This was the stated aim of the “surge” in Iraq ordered by President George W. Bush at the end of 2006.
With Petraeus presiding, violence in that country did decline precipitously. Whether the relationship was causal or coincidental remains the subject of controversy. Still, Petraeus’s apparent success persuaded some observers that counterinsurgency on a global scale — GCOIN, they called it — should now form the basis for U.S. national security strategy. Here, they argued, was an approach that could definitively extract the United States from the WFKATGWOT, while offering victory of a sort. Rather than employing “shock and awe” to liberate the Islamic world, U.S. forces would apply counterinsurgency doctrine to pacify it.
The task of demonstrating the validity of COIN beyond Iraq fell to General Stanley McChrystal, appointed with much fanfare in 2009 to command U.S. and NATO forces in Afghanistan. Press reports celebrated McChrystal as another Petraeus, the ideal candidate to replicate the achievements already credited to “King David.”
McChrystal’s ascendancy came at a moment when a cult of generalship gripped Washington. The key to success, this view held, was not technology, as Rumsfeld had believed, but putting the right guy in charge and then letting him run with things. Political figures on both sides of the aisle fell all over themselves declaring McChrystal the right guy for Afghanistan. Pundits of all stripes joined the chorus.
Once installed in Kabul, the general surveyed the situation and, to no one’s surprise, announced that “success demands a comprehensive counterinsurgency campaign.” Implementing that campaign would necessitate an Afghan “surge” mirroring the one that had seemingly turned Iraq around. In December 2009, albeit with little evident enthusiasm, President Barack Obama acceded to his commander’s request (or ultimatum). The U.S. troop commitment to Afghanistan rapidly increased.
Here things began to come undone. Progress toward reducing the insurgency or improving the capacity of Afghan security forces was — by even the most generous evaluation — negligible. McChrystal made promises — like meeting basic Afghan needs with “government in a box, ready to roll in” — that he proved utterly incapable of keeping. Relations with the government of President Hamid Karzai remained strained. Those with neighboring Pakistan, not good to begin with, only worsened. Both governments expressed deep resentment at what they viewed as high-handed American behavior that killed or maimed noncombatants with disturbing frequency.
To make matters worse, despite all the hype, McChrystal turned out to be miscast — manifestly the wrong guy for the job. Notably, he proved unable to grasp the need for projecting even some pretense of respect for the principle of civilian control back in Washington. By the summer of 2010, he was out — and Petraeus was back in.
In Washington (if not in Kabul), Petraeus’s oversized reputation quelled the sense that with McChrystal’s flame-out Afghanistan might be a lost cause. Surely, the most celebrated soldier of his generation would repeat his Iraq magic, affirming his own greatness and the continued viability of COIN.
Alas, this was not to be. Conditions in Afghanistan during Petraeus’s tenure in command improved — if that’s even the word — only modestly. The ongoing war met just about anyone’s definition of a quagmire. With considerable understatement, a 2011 National Intelligence Estimate called it a “stalemate.” Soon, talk of a “comprehensive counterinsurgency” faded. With the bar defining success slipping ever lower, passing off the fight to Afghan security forces and hightailing it for home became the publicly announced war aim.
That job remained unfinished when Petraeus himself headed for home, leaving the army to become CIA director. Although Petraeus was still held in high esteem, his departure from active duty left the cult of generalship looking more than a little the worse for wear. By the time General John Allen succeeded Petraeus — thereby becoming the eighth U.S. officer appointed to preside over the ongoing Afghan War — no one believed that simply putting the right guy in charge was going to produce magic. On that inconclusive note, round two of the WFKATGWOT ended.
The Vickers Era
Round 3: Assassination. Unlike Donald Rumsfeld or David Petraeus, Michael Vickers has not achieved celebrity status. Yet more than anyone else in or out of uniform, Vickers, who carries the title Under Secretary of Defense for Intelligence, deserves recognition as the emblematic figure of the WFKATGWOT’s round three. His low-key, low-profile persona meshes perfectly with this latest evolution in the war’s character. Few people outside of Washington know who he is, which is fitting indeed since he presides over a war that few people outside of Washington are paying much attention to any longer.
With the retirement of Secretary of Defense Robert Gates, Vickers is the senior remaining holdover from George W. Bush’s Pentagon. His background is nothing if not eclectic. He previously served in U.S. Army Special Forces and as a CIA operative. In that guise, he played a leading role in supporting the Afghan mujahedeen in their war against Soviet occupiers in the 1980s. Subsequently, he worked in a Washington think tank and earned a PhD in strategic studies at Johns Hopkins University (dissertation title: “The Structure of Military Revolutions”).
Even during the Bush era, Vickers never subscribed to expectations that the United States could liberate or pacify the Islamic world. His preferred approach to the WFKATGWOT has been simplicity itself. “I just want to kill those guys,” he says — “those guys” referring to members of al-Qaeda. Kill the people who want to kill Americans and don’t stop until they are all dead: this defines the Vickers strategy, which over the course of the Obama presidency has supplanted COIN as the latest variant of U.S. strategy.
The Vickers approach means acting aggressively to eliminate would-be killers wherever they might be found, employing whatever means are necessary. Vickers “tends to think like a gangster,” one admirer comments. “He can understand trends then change the rules of the game so they are advantageous for your side.”
Round three of the WFKATGWOT is all about bending, breaking, and reinventing rules in ways thought to be advantageous to the United States. Much as COIN supplanted “shock and awe,” a broad-gauged program of targeted assassination has now displaced COIN as the prevailing expression of the American way of war.
The United States is finished with the business of sending large land armies to invade and occupy countries on the Eurasian mainland. Robert Gates, when still Secretary of Defense, made the definitive statement on that subject. The United States is now in the business of using missile-armed drones and special operations forces to eliminate anyone (not excluding U.S. citizens) the president of the United States decides has become an intolerable annoyance. Under President Obama, such attacks have proliferated.
This is America’s new MO. Paraphrasing a warning issued by Secretary of State Hillary Clinton, a Washington Post dispatch succinctly summarized what it implied: “The United States reserved the right to attack anyone who it determined posed a direct threat to U.S. national security, anywhere in the world.”
Furthermore, acting on behalf of the United States, the president exercises this supposed right without warning, without regard to claims of national sovereignty, without Congressional authorization, and without consulting anyone other than Michael Vickers and a few other members of the national security apparatus. The role allotted to the American people is to applaud, if and when notified that a successful assassination has occurred. And applaud we do, as when members of SEAL Team Six, in a daring raid, secretly entered Pakistan to dispatch Osama bin Laden with two neatly placed kill shots, vengeance long deferred making it unnecessary to consider what second-order political complications might ensue.
How round three will end is difficult to forecast. The best we can say is that it’s unlikely to end anytime soon or particularly well. As Israel has discovered, once targeted assassination becomes your policy, the list of targets has a way of growing ever longer.
So what tentative judgments can we offer regarding the ongoing WFKATGWOT? Operationally, a war launched by the conventionally minded has progressively fallen under the purview of those who inhabit what Dick Cheney once called “the dark side,” with implications that few seem willing to explore. Strategically, a war informed at the outset by utopian expectations continues today with no concretely stated expectations whatsoever, the forward momentum of events displacing serious consideration of purpose. Politically, a war that once occupied center stage in national politics has now slipped to the periphery, the American people moving on to other concerns and entertainments, with legal and moral questions raised by the war left dangling in midair.
Is this progress?
Andrew J. Bacevich is professor of history and international relations at Boston University. A TomDispatch regular, he is the author most recently of Washington Rules: The American Path to Permanent War and the editor of the new book The Short American Century: A Postmortem, just out from Harvard University Press. To catch Timothy MacBain’s latest Tomcast audio interview in which Bacevich discusses the changing face of the Global War on Terror, click here, or download it to your iPod here.
Copyright 2012 Andrew Bacevich
Ballpark Liturgy: America’s New Civic Religion
Fenway Park, Boston, July 4, 2011. On this warm summer day, the Red Sox will play the Toronto Blue Jays. First come pre-game festivities, especially tailored for the occasion. The ensuing spectacle — a carefully scripted encounter between the armed forces and society — expresses the distilled essence of present-day American patriotism. A masterpiece of contrived spontaneity, the event leaves spectators feeling good about their baseball team, about their military, and not least of all about themselves — precisely as it was meant to do.
In this theatrical production, the Red Sox provide the stage, and the Pentagon the props. In military parlance, it is a joint operation. In front of a gigantic American flag draped over the left-field wall, an Air Force contingent, clad in blue, stands at attention. To carry a smaller version of the Stars and Stripes onto the playing field, the Navy provides a color guard in crisp summer whites. The United States Marine Corps kicks in with a choral ensemble that leads the singing of the national anthem. As the anthem’s final notes sound, four U.S. Air Force F-15C Eagles scream overhead. The sellout crowd roars its approval.
But there is more to come. “On this Independence Day,” the voice of the Red Sox booms over the public address system, “we pay a debt of gratitude to the families whose sons and daughters are serving our country.” On this particular occasion the designated recipients of that gratitude are members of the Lydon family, hailing from Squantum, Massachusetts. Young Bridget Lydon is a sailor — Aviation Ordnanceman Airman is her official title — serving aboard the carrier USS Ronald Reagan, currently deployed in support of the Afghanistan War, now in its 10th year.
From Out of Nowhere
The Lydons are Every Family, decked out for the Fourth. Garbed in random bits of Red Sox paraphernalia and Mardi Gras necklaces, they wear their shirts untucked and ball caps backwards. Neither sleek nor fancy, they are without pretension. Yet they exude good cheer. As they are ushered onto the field, their eagerness is palpable. Like TV game show contestants, they know that this is their lucky day and they are keen to make the most of it.
As the Lydons gather near the pitcher’s mound, the voice directs their attention to the 38-by-100-foot Jumbotron mounted above the centerfield bleachers. On the screen, Bridget appears. She is aboard ship, in duty uniform, posed below decks in front of an F/A-18 fighter jet. Waiflike, but pert and confident, she looks directly into the camera, sending a “shout-out” to family and friends. She wishes she could join them at Fenway.
As if by magic, wish becomes fulfillment. While the video clip is still running, Bridget herself, now in dress whites, emerges from behind the flag covering the left-field wall. On the Jumbotron, in place of Bridget below decks, an image of Bridget marching smartly toward the infield appears. In the stands pandemonium erupts. After a moment of confusion, members of her family — surrounded by camera crews — rush to embrace their sailor, a reunion shared vicariously by the 38,000 fans in attendance along with many thousands more watching at home on the Red Sox television network.
Once the Lydons finish with hugs and kisses and the crowd settles down, Navy veteran Bridget (annual salary approximately $22,000) throws the ceremonial first pitch to aging Red Sox veteran Tim Wakefield (annual salary $2,000,000). More cheers. As a souvenir, Wakefield gives her the baseball along with his own hug. All smiles, Bridget and her family shout “Play Ball!” into a microphone. As they are escorted off the field and out of sight, the game begins.
What does this event signify?
For the Lydons, the day will no doubt long remain a happy memory. If they were to some degree manipulated — their utter and genuine astonishment at Bridget’s seemingly miraculous appearance lending the occasion its emotional punch — they played their allotted roles without complaint and with considerable élan. However briefly, they stood in the spotlight, quasi-celebrities, all eyes trained on them, a contemporary version of the American dream fulfilled. And if offstage puppet-masters used Bridget herself, at least she got a visit home and a few days off — no doubt a welcome break.
Yet this feel-good story was political as well as personal. As a collaboration between two well-heeled but image-conscious institutions, the Lydon reunion represented a small but not inconsequential public relations triumph. The Red Sox and the Navy had worked together to perform an act of kindness for a sailor and her loved ones. Both organizations came away looking good, not only because the event itself was so deftly executed, but because it showed that the large for-profit professional sports team and the even larger military bureaucracy both care about ordinary people. The message conveyed to fans/taxpayers could not be clearer: the corporate executives who run the Red Sox have a heart. So, too, do the admirals who run the Navy.
Better still, these benefits accrued at essentially no cost to the sponsors. The military personnel arrayed around Fenway showed up because they were told to do so. They are already “paid for,” as are the F-15s, the pilots who fly them, and the ground crews that service them. As for whatever outlays the Red Sox may have made, they are trivial and easily absorbed. For the 2011 season, the average price of a ticket at Fenway Park had climbed to $52. A soft drink in a commemorative plastic cup runs you $5.50 and a beer $8. Then there is the television ad revenue, all contributing the previous year to corporate profits exceeding $58 million. A decade of war culminating in the worst economic crisis since the Great Depression hasn’t done much good for the country but it has been strangely good for the Red Sox — and a no less well-funded Pentagon. Any money expended in bringing Bridget to Fenway and entertaining the Lydons had to be the baseball/military equivalent of pocket change.
And the holiday festivities at Fenway had another significance as well, one that extended beyond burnishing institutional reputations and boosting bottom lines. Here was America’s civic religion made manifest.
In recent decades, an injunction to “support the troops” has emerged as a central tenet of that religion. Since 9/11 this imperative has become, if anything, even more binding. Indeed, as citizens, Americans today acknowledge no higher obligation.
Fulfilling that obligation has posed a challenge, however. Rather than doing so concretely, Americans — with a few honorable exceptions — have settled for symbolism. With their pronounced aversion to collective service and sacrifice (an inclination indulged by leaders of both political parties), Americans resist any definition of civic duty that threatens to crimp lifestyles.
To stand in solidarity with those on whom the burden of service and sacrifice falls is about as far as they will go. Expressions of solidarity affirm that the existing relationship between soldiers and society is consistent with democratic practice. By extension, so, too, is the distribution of prerogatives and responsibilities entailed by that relationship: a few fight, the rest applaud. Put simply, the message that citizens wish to convey to their soldiers is this: although choosing not to be with you, we are still for you (so long as being for you entails nothing on our part). Cheering for the troops, in effect, provides a convenient mechanism for voiding obligation and easing guilty consciences.
In ways far more satisfying than displaying banners or bumper stickers, the Fenway Park Independence Day event provided a made-to-order opportunity for conscience easing. It did so in three ways. First, it brought members of Red Sox Nation into close proximity (even if not direct contact) with living, breathing members of the armed forces, figuratively closing any gap between the two. (In New England, where few active duty military installations remain, such encounters are increasingly infrequent.) Second, it manufactured one excuse after another to whistle and shout, whoop and holler, thereby allowing the assembled multitudes to express — and to be seen expressing — their affection and respect for the troops. Finally, it rewarded participants and witnesses alike with a sense of validation, the reunion of Bridget and her family, even if temporary, serving as a proxy for a much larger, if imaginary, reconciliation of the American military and the American people. That debt? Mark it paid in full.
The late German theologian Dietrich Bonhoeffer had a name for this unearned self-forgiveness and undeserved self-regard. He called it cheap grace. Were he alive today, Bonhoeffer might suggest that a taste for cheap grace, compounded by an appetite for false freedom, is leading Americans down the road to perdition.
Andrew J. Bacevich, the author of Washington Rules: America’s Path to Permanent War, is professor of history and international relations at Boston University. His next book, of which this post is a small part, will assess the impact of a decade of war on American society and the United States military. To listen to Timothy MacBain’s latest TomCast audio interview in which Bacevich discusses cheap grace and military spectacle, click here, or download it to your iPod here.
Copyright 2011 Andrew Bacevich
At periodic intervals, the American body politic has shown a marked susceptibility to messianic fevers. Whenever an especially acute attack occurs, a sort of delirium ensues, manifesting itself in delusions of grandeur and demented behavior.
By the time the condition passes and a semblance of health is restored, recollection of what occurred during the illness tends to be hazy. What happened? How’d we get here? Most Americans prefer not to know. No sense dwelling on what’s behind us. Feeling much better now! Thanks!
Gripped by such a fever in 1898, Americans evinced an irrepressible impulse to liberate oppressed Cubans. By the time they’d returned to their senses, having acquired various parcels of real estate between Puerto Rico and the Philippines, no one could quite explain what had happened or why. (The Cubans meanwhile had merely exchanged one set of overseers for another.)
In 1917, the fever suddenly returned. Amid wild ravings about waging a war to end war, Americans lurched off to France. This time the affliction passed quickly, although the course of treatment proved painful: confinement to the charnel house of the Western Front, followed by bitter medicine administered at Versailles.
The 1960s brought another bout (and so yet more disappointment). An overwhelming urge to pay any price, bear any burden landed Americans in Vietnam. The fall of Saigon in 1975 seemed, for a brief interval, to inoculate the body politic against any further recurrence. Yet the salutary effects of this “Vietnam syndrome” proved fleeting. By the time the Cold War ended, Americans were running another temperature, their self-regard reaching impressive new heights. Out of Washington came all sorts of embarrassing gibberish about permanent global supremacy and history’s purpose finding fulfillment in the American way of life.
Give Me Fever
Then came 9/11 and the fever simply soared off the charts. The messiah-nation was really pissed and was going to fix things once and for all.
Nearly 10 years have passed since Washington set out to redeem the Greater Middle East. The crusades have not gone especially well. In fact, in the pursuit of its saving mission, the American messiah has pretty much worn itself out.
Today, the post-9/11 fever finally shows signs of abating. The evidence is partial and preliminary. The sickness has by no means passed. Oddly, it lingers most strongly in the Obama White House, of all places, where a keenness to express American ideals by dropping bombs seems strangely undiminished.
Yet despite the urges of some in the Obama administration, after nearly a decade of self-destructive flailing about, American recovery has become a distinct possibility. Here’s some of the evidence:
In Washington, it’s no longer considered a sin to question American omnipotence. Take the case of Robert Gates. The outgoing secretary of defense may well be the one senior U.S. official of the past decade to leave office with his reputation not only intact, but actually enhanced. (Note to President Obama: think about naming an aircraft carrier after the guy.) Yet along with restoring a modicum of competence and accountability to the Pentagon, the Gates legacy is likely to be found in his willingness — however belated — to acknowledge the limits of American power.
That the United States should avoid wars except when absolutely necessary no longer connotes incipient isolationism. It is once again a sign of common sense, with Gates a leading promoter. Modesty is becoming respectable.
The Gates Doctrine
No one can charge Gates with being an isolationist or a national security wimp. Neither is he a “declinist.” So when he says anyone proposing another major land war in the Greater Middle East should “have his head examined” — citing the authority of Douglas MacArthur, no less — people take notice. Or more recently there was this: “I’ve got a military that’s exhausted,” Gates remarked, in one of those statements of the obvious too seldom heard from on high. “Let’s just finish the wars we’re in and keep focused on that instead of signing up for other wars of choice.” Someone should etch that into the outer walls of the Pentagon’s E-ring.
Or consider the officer corps. There is no “military mind,” but there are plenty of minds in the military, and some of them are changing.
Evidence suggests that the officer corps itself is rethinking the role of military power. Consider, for example, “Mr. Y,” author of A National Strategic Narrative, published this spring to considerable acclaim by the Woodrow Wilson International Center for Scholars. The actual authors of this report are two military professionals, one a Navy captain, the other a Marine colonel.
What you won’t find in this document are jingoism, braggadocio, chest-thumping, and calls for a bigger military budget. If there’s an overarching theme, it’s pragmatism. Rather than the United States imposing its will on the world, the authors want more attention paid to the investment needed to rebuild at home.
The world is too big and complicated for any one nation to call the shots, they insist. The effort to do so is self-defeating. “As Americans,” Mr. Y writes, “we needn’t seek the world’s friendship or proselytize the virtues of our society. Neither do we seek to bully, intimidate, cajole, or persuade others to accept our unique values or to share our national objectives. Rather, we will let others draw their own conclusions based upon our actions… We will pursue our national interests and let others pursue theirs…”
You might dismiss this as the idiosyncratic musing of two officers who have spent too much time having their brains baked in the Iraqi or Afghan sun. I don’t. What convinces me otherwise is the positive email traffic that my own musings about the misuse and abuse of American power elicit weekly from serving officers. It’s no scientific sample, but the captains, majors, and lieutenant colonels I hear from broadly agree with Mr. Y. They’ve had a bellyful of twenty-first-century American war and are open to a real debate over how to overhaul the nation’s basic approach to national security.
Intelligence Where You Least Expect It
And finally, by gum, there is the United States Congress. Just when that body appeared to have entered a permanent vegetative state, a flickering of intelligent life has made its reappearance. Perhaps more remarkably still, the signs are evident on both sides of the aisle as Democrats and Republicans alike — albeit for different reasons — are raising serious questions about the nation’s propensity for multiple, open-ended wars.
Some members cite concerns for the Constitution and the abuse of executive power. Others worry about the price tag. With Osama bin Laden out of the picture, still others insist that it’s time to rethink strategic priorities. No doubt partisan calculation or personal ambition figures alongside matters of principle. They are, after all, politicians.
Given what polls indicate is a growing public unhappiness over the Afghan War, speaking out against that war these days doesn’t exactly require political courage. Still, the possibility of our legislators reasserting a role in deciding whether or not a war actually serves the national interest — rather than simply rubberstamping appropriations and slinking away — now presents itself. God bless the United States Congress.
Granted, the case presented here falls well short of being conclusive. To judge by his announcement of a barely-more-than-symbolic troop withdrawal from Afghanistan, President Obama himself seems uncertain of where he stands. And clogging the corridors of power or the think tanks and lobbying arenas that surround them are plenty of folks still hankering to have a go at Syria or Iran.
At the first signs of self-restraint, you can always count on the likes of Senator John McCain or the editorial board of the Wall Street Journal to decry (in McCain’s words) an “isolationist-withdrawal-lack-of-knowledge-of-history attitude” hell-bent on pulling up the drawbridge and having Americans turn their backs on the world. In such quarters, fever is a permanent condition and it’s always 104 and rising. Yet it is a measure of just how quickly things are changing that McCain himself, once deemed a source of straight talk, now comes across as a mere crank.
In this way, nearly a decade after our most recent descent into madness, does the possibility of recovery finally beckon.
Andrew J. Bacevich is professor of history and international relations at Boston University. His most recent book is Washington Rules: America’s Path to Permanent War. To listen to Timothy MacBain’s latest TomCast audio interview in which Bacevich discusses voices of dissent within the military, click here, or download it to your iPod here.
Copyright 2011 Andrew J. Bacevich
On the Mend?
It is a commonplace of American politics: when the moving van pulls up to the White House on Inauguration Day, it delivers not only a closetful of gray suits and power ties, but a boatload of expectations.
A president, being the most powerful man in the world, begins history anew — so at least Americans believe, or pretend to believe. Out with the old, sordid, and disappointing; in with the fresh, unsullied, and hopeful. Why, with the stroke of a pen, a new president can order the closing of an embarrassing and controversial offshore prison for accused terrorists held for years on end without trial! Just like that: done.
For all sorts of reasons, the expectations raised by Barack Obama’s arrival in the Oval Office were especially high. Americans weren’t the only ones affected. How else to explain the Nobel Committee’s decision to honor the new president by transforming its Peace Prize into a Prize Anticipating Peace — more or less the equivalent of designating the winner of the Heisman Trophy during week one of the college football season?
Of course, if the political mood immediately prior to and following a presidential inauguration emphasizes promise and discovery (the First Lady has biceps!), it doesn’t take long for the novelty to start wearing off. Then the narrative arc takes a nosedive: he’s breaking his promises, he’s letting us down, he’s not so different after all.
The words of H.L. Mencken apply. “When I hear a man applauded by the mob,” the Sage of Baltimore wrote, “I always feel a pang of pity for him. All he has to do to be hissed is to live long enough.” Barack Obama has now lived long enough to attract his fair share of hisses, boos, and catcalls.
Along with prolonging and expanding one war in Afghanistan, the Nobel Peace laureate has played a leading role in starting another war in Libya. Laboring to distinguish between this administration and its predecessor, Obama’s defenders emphasize the purity of his motives. Contemptuous of George W. Bush’s claim that U.S. forces invaded oil-rich Iraq to keep weapons of mass destruction out of the hands of terrorists, they readily accept this president’s insistence that the United States intervened in oil-rich Libya to prevent genocidal slaughter. Besides, testifying to our virtuous intent, this time we’ve got the French with us rather than against us.
Explaining Why Is a Mug’s Game
In truth, to ascribe a single governing purpose or rationale to any large-scale foreign policy initiative is to engage in willful distortion. In any administration, action grows out of consensus. The existence of consensus among any president’s advisers — LBJ’s inner circle supporting escalation in South Vietnam back in 1965, George W.’s pressing for regime change in Baghdad — does not imply across-the-board agreement as to intent.
Motive is slippery. As Paul Wolfowitz famously noted regarding Iraq, weapons of mass destruction merely provided the agreed upon public rationale for war. In reality, a mix of motives probably shaped the decision to invade. For some administration officials, there was the prospect of eliminating a perceived source of mischief while providing an object lesson to other would-be troublemakers. For others, there was the promise of reasserting U.S. hegemony over the world’s energy heartland. For others still (including Wolfowitz himself), there were alluring visions of a region transformed, democratized, and pacified, the very sources of Islamist terror thereby eliminated once and for all.
At least on the margins, expanding the powers of the presidency at the expense of Congress, bolstering the security of Israel, and finishing what daddy had left undone also likely figured in the equation. Within this mix, policymakers could pick and choose.
In the face of changing circumstances, they even claimed the prerogative of revising their choices. Who can doubt that President Bush, faced with the Big Oops — the weapons of mass destruction that turned out not to exist — genuinely persuaded himself that America’s true and abiding purpose for invading Iraq had been to liberate the Iraqi people from brutal oppression? After all, right from day one wasn’t the campaign called Operation Iraqi Freedom?
So even as journalists and historians preoccupy themselves with trying to explain why something happened, they are playing a mug’s game. However creative or well-sourced, their answers are necessarily speculative, partial, and ambiguous. It can’t be otherwise.
Rather than why, what deserves far more attention than it generally receives is the question of how. Here is where we find Barack Obama and George W. Bush (not to mention Bill Clinton, George H. W. Bush, Ronald Reagan, and Jimmy Carter) joined at the hip. When it comes to the Islamic world, for more than three decades now Washington’s answer to how has been remarkably consistent: through the determined application of hard power wielded by the United States. Simply put, Washington’s how implies a concerted emphasis on girding for and engaging in war.
Presidents may not agree on exactly what we are trying to achieve in the Greater Middle East (Obama wouldn’t be caught dead reciting lines from Bush’s Freedom Agenda, for example), but for the past several decades, they have agreed on means: whatever it is we want done, military might holds the key to doing it. So today, we have the extraordinary spectacle of Obama embracing and expanding Bush’s Global War on Terror even after having permanently banished that phrase to the Guantanamo of politically incorrect speech.
The Big How — By Force
Efforts to divine this administration’s intent in Libya have centered on the purported influence of the Three Harpies: Secretary of State Hillary Clinton, U.N. Ambassador Susan Rice, and National Security Council Human Rights Director Samantha Power, women in positions of influence ostensibly burdened with regret that the United States failed back in 1994 to respond effectively to the Rwandan genocide and determined this time to get it right. Yet this is insider stuff, which necessarily remains subject to considerable speculation. What we can say for sure is this: by seeing the Greater Middle East as a region of loose nails badly in need of being hammered, the current commander-in-chief has claimed his place in the ranks of a long list of his warrior-predecessors.
The key point is this: like those who preceded them, neither Obama nor his Harpies (nor anyone else in a position of influence) could evidently be bothered to assess whether the hammer actually works as advertised — notwithstanding abundant evidence showing that it doesn’t.
The sequence of military adventures set in motion when Jimmy Carter promulgated his Carter Doctrine back in 1980 makes for an interesting story but not a very pretty one. Ronald Reagan’s effort to bring peace to Lebanon ended in 1983 in a bloody catastrophe. The nominal victory of Operation Desert Storm in 1991, which pushed Saddam Hussein’s forces out of Kuwait, produced little except woeful complications, which Bill Clinton’s penchant for flinging bombs and missiles about during the 1990s did little to resolve or conceal. The blowback stemming from our first Afghanistan intervention against the Soviets helped create the conditions leading to 9/11 and another Afghanistan War, now approaching its tenth anniversary with no clear end in sight. As for George W. Bush’s second go at Iraq, the less said the better. Now, there is Libya.
The question demands to be asked: Are we winning yet? And if not, why persist in an effort for which great pain is repaid with such little gain?
Perhaps Barack Obama found his political soul mate in Samantha Power, making her determination to alleviate evil around the world his own. Or perhaps he is just another calculating politician who speaks the language of ideals while pursuing less exalted purposes. In either case, the immediate relevance of the question is limited. The how rather than the why is determinant.
Whatever his motives, by conforming to a pre-existing American penchant for using force in the Greater Middle East, this president has chosen the wrong tool. In doing so, he condemns himself and the country to persisting in the folly of his predecessors. The failure is one of imagination, but also of courage. He promised, and we deserve, something better.
Andrew J. Bacevich is professor of history and international relations at Boston University. His most recent book, Washington Rules: America’s Path to Permanent War (Metropolitan Books), is just out in paperback. To catch Timothy MacBain’s latest TomCast audio interview in which Bacevich discusses what to make of the Obama administration’s Libyan intervention, click here, or download it to your iPod here.
Copyright 2011 Andrew Bacevich
Not Why, But How
In defense circles, “cutting” the Pentagon budget has once again become a topic of conversation. Americans should not confuse that talk with reality. Any cuts exacted will at most reduce the rate of growth. The essential facts remain: U.S. military outlays today equal those of every other nation on the planet combined, a situation without precedent in modern history.
The Pentagon presently spends more in constant dollars than it did at any time during the Cold War — this despite the absence of anything remotely approximating what national security experts like to call a “peer competitor.” Evil Empire? It exists only in the fevered imaginations of those who quiver at the prospect of China adding a rust-bucket Russian aircraft carrier to its fleet or who take seriously the ravings of radical Islamists promising from deep inside their caves to unite the Umma in a new caliphate.
What are Americans getting for their money? Sadly, not much. Despite extraordinary expenditures (not to mention exertions and sacrifices by U.S. forces), the return on investment is, to be generous, unimpressive. The chief lesson to emerge from the battlefields of the post-9/11 era is this: the Pentagon possesses next to no ability to translate “military supremacy” into meaningful victory.
Washington knows how to start wars and how to prolong them, but is clueless when it comes to ending them. Iraq, the latest addition to the roster of America’s forgotten wars, stands as Exhibit A. Each bomb that blows up in Baghdad or some other Iraqi city, splattering blood all over the streets, testifies to the manifest absurdity of judging “the surge” as the epic feat of arms celebrated by the Petraeus lobby.
The problems are strategic as well as operational. Old Cold War-era expectations that projecting U.S. power will enhance American clout and standing no longer apply, especially in the Islamic world. There, American military activities are instead fostering instability and inciting anti-Americanism. For Exhibit B, see the deepening morass that Washington refers to as AfPak or the Afghanistan-Pakistan theater of operations.
Add to that the mountain of evidence showing that Pentagon, Inc. is a miserably managed enterprise: hidebound, bloated, slow-moving, and prone to wasting resources on a prodigious scale — nowhere more so than in weapons procurement and in the outsourcing of functions previously performed by the military to “contractors.” When it comes to national security, effectiveness (what works) should rightly take precedence over efficiency (at what cost?) as the overriding measure of merit. Yet beyond a certain level, inefficiency undermines effectiveness, with the Pentagon stubbornly and habitually exceeding that level. By comparison, Detroit’s much-maligned Big Three offer models of well-run enterprises.
All of this takes place against the backdrop of mounting problems at home: stubbornly high unemployment, trillion-dollar federal deficits, massive and mounting debt, and domestic needs like education, infrastructure, and employment crying out for attention.
Yet the defense budget — a misnomer since for Pentagon, Inc. defense per se figures as an afterthought — remains a sacred cow. Why is that?
The answer lies first in understanding the defenses arrayed around that cow to ensure that it remains untouched and untouchable. Exemplifying what the military likes to call a “defense in depth,” that protective shield consists of four distinct but mutually supporting layers.
Institutional Self-Interest: Victory in World War II produced not peace, but an atmosphere of permanent national security crisis. As never before in U.S. history, threats to the nation’s existence seemed omnipresent, an attitude first born in the late 1940s that still persists today. In Washington, fear — partly genuine, partly contrived — triggered a powerful response.
One result was the emergence of the national security state, an array of institutions that depended on (and therefore strove to perpetuate) this atmosphere of crisis to justify their existence, status, prerogatives, and budgetary claims. In addition, a permanent arms industry arose, which soon became a major source of jobs and corporate profits. Politicians of both parties were quick to identify the advantages of aligning with this “military-industrial complex,” as President Eisenhower described it.
Allied with (and feeding off of) this vast apparatus that transformed tax dollars into appropriations, corporate profits, campaign contributions, and votes was an intellectual axis of sorts — government-supported laboratories, university research institutes, publications, think tanks, and lobbying firms (many staffed by former or would-be senior officials) — devoted to identifying (or conjuring up) ostensible national security challenges and alarms, always assumed to be serious and getting worse, and then devising responses to them.
The upshot: within Washington, the voices carrying weight in any national security “debate” all share a predisposition for sustaining very high levels of military spending for reasons having increasingly little to do with the well-being of the country.
Strategic Inertia: In a 1948 State Department document, diplomat George F. Kennan offered this observation: “We have about 50 percent of the world’s wealth, but only 6.3 percent of its population.” The challenge facing American policymakers, he continued, was “to devise a pattern of relationships that will permit us to maintain this disparity.” Here we have a description of American purposes that is far more candid than all of the rhetoric about promoting freedom and democracy, seeking world peace, or exercising global leadership.
The end of World War II found the United States in a spectacularly privileged position. Not for nothing do Americans remember the immediate postwar era as a Golden Age of middle-class prosperity. Policymakers since Kennan’s time have sought to preserve that globally privileged position. The effort has been a largely futile one.
By 1950 at the latest, those policymakers (with Kennan by then a notable dissenter) had concluded that the possession and deployment of military power held the key to preserving America’s exalted status. The presence of U.S. forces abroad and a demonstrated willingness to intervene, whether overtly or covertly, just about anywhere on the planet would promote stability, ensure U.S. access to markets and resources, and generally serve to enhance the country’s influence in the eyes of friend and foe alike — this was the idea, at least.
In postwar Europe and postwar Japan, this formula achieved considerable success. Elsewhere — notably in Korea, Vietnam, Latin America, and (especially after 1980) in the so-called Greater Middle East — it either produced mixed results or failed catastrophically. Certainly, the events of the post-9/11 era provide little reason to believe that this presence/power-projection paradigm will provide an antidote to the threat posed by violent anti-Western jihadism. If anything, adherence to it is exacerbating the problem by creating ever greater anti-American animus.
One might think that the manifest shortcomings of the presence/power-projection approach — trillions expended in Iraq for what? — might stimulate present-day Washington to pose some first-order questions about basic U.S. national security strategy. A certain amount of introspection would seem to be called for. Could, for example, the effort to sustain what remains of America’s privileged status benefit from another approach?
Yet there are few indications that our political leaders, the senior-most echelons of the officer corps, or those who shape opinion outside of government are capable of seriously entertaining any such debate. Whether through ignorance, arrogance, or a lack of imagination, the pre-existing strategic paradigm stubbornly persists; so, too, as if by default do the high levels of military spending that the strategy entails.
Cultural Dissonance: The rise of the Tea Party movement should disabuse any American of the thought that the cleavages produced by the “culture wars” have healed. The cultural upheaval touched off by the 1960s and centered on Vietnam remains unfinished business in this country.
Among other things, the sixties destroyed an American consensus, forged during World War II, about the meaning of patriotism. During the so-called Good War, love of country implied, even required, deference to the state, shown most clearly in the willingness of individuals to accept the government’s authority to mandate military service. GIs, the vast majority of them draftees, were the embodiment of American patriotism, risking life and limb to defend the country.
The GI of World War II had been an American Everyman. Those soldiers both represented and reflected the values of the nation from which they came (a perception affirmed by the ironic fact that the military adhered to prevailing standards of racial segregation). It was “our army” because that army was “us.”
With Vietnam, things became more complicated. The war’s supporters argued that the World War II tradition still applied: patriotism required deference to the commands of the state. Opponents of the war, especially those facing the prospect of conscription, insisted otherwise. They revived a distinction, formulated a generation earlier by the radical journalist Randolph Bourne, between the country and the state. Real patriots, the ones who most truly loved their country, were those who opposed state policies they regarded as misguided, illegal, or immoral.
In many respects, the soldiers who fought the Vietnam War found themselves caught uncomfortably in the center of this dispute. Was the soldier who died in Vietnam a martyr, a tragic figure, or a sap? Who deserved greater admiration: the soldier who fought bravely and uncomplainingly or the one who served and then turned against the war? Or was the war resister — the one who never served at all — the real hero?
War’s end left these matters disconcertingly unresolved. President Richard Nixon’s 1971 decision to kill the draft in favor of an All-Volunteer Force, predicated on the notion that the country might be better served with a military that was no longer “us,” only complicated things further. So, too, did the trends in American politics where bona fide war heroes (George H.W. Bush, Bob Dole, John Kerry, and John McCain) routinely lost to opponents whose military credentials were non-existent or exceedingly slight (Bill Clinton, George W. Bush, and Barack Obama), yet who demonstrated once in office a remarkable propensity for expending American blood (none belonging to members of their own families) in places like Somalia, Iraq, and Afghanistan. It was all more than a little unseemly.
Patriotism, once a simple concept, had become both confusing and contentious. What obligations, if any, did patriotism impose? And if the answer was none — the option Americans seemed increasingly to prefer — then was patriotism itself still a viable proposition?
Wanting to answer that question in the affirmative — to distract attention from the fact that patriotism had become little more than an excuse for fireworks displays and taking the occasional day off from work — people and politicians alike found a way to do so by exalting those Americans actually choosing to serve in uniform. The thinking went this way: soldiers offer living proof that America is a place still worth dying for, that patriotism (at least in some quarters) remains alive and well; by common consent, therefore, soldiers are the nation’s “best,” committed to “something bigger than self” in a land otherwise increasingly absorbed in pursuing a material and narcissistic definition of self-fulfillment.
In effect, soldiers offer much-needed assurance that old-fashioned values still survive, even if confined to a small and unrepresentative segment of American society. Rather than Everyman, today’s warrior has ascended to the status of icon, deemed morally superior to the nation for which he or she fights, the repository of virtues that prop up, however precariously, the nation’s increasingly sketchy claim to singularity.
Politically, therefore, “supporting the troops” has become a categorical imperative across the political spectrum. In theory, such support might find expression in a determination to protect those troops from abuse, and so translate into wariness about committing soldiers to unnecessary or unnecessarily costly wars. In practice, however, “supporting the troops” has found expression in an insistence upon providing the Pentagon with open-ended drawing rights on the nation’s treasury, thereby creating massive barriers to any proposal to effect more than symbolic reductions in military spending.
Misremembered History: The duopoly of American politics no longer allows for a principled anti-interventionist position. Both parties are war parties. They differ mainly in the rationale they devise to argue for interventionism. The Republicans tout liberty; the Democrats emphasize human rights. The results tend to be the same: a penchant for activism that sustains a never-ending demand for high levels of military outlays.
American politics once nourished a lively anti-interventionist tradition. Leading proponents included luminaries such as George Washington and John Quincy Adams. That tradition found its basis not in principled pacifism, a position that has never attracted widespread support in this country, but in pragmatic realism. What happened to that realist tradition? Simply put, World War II killed it — or at least discredited it. In the intense and divisive debate that occurred in 1939-1941, the anti-interventionists lost, their cause thereafter tarred with the label “isolationism.”
The passage of time has transformed World War II from a massive tragedy into a morality tale, one that casts opponents of intervention as blackguards. Whether explicitly or implicitly, the debate over how the United States should respond to some ostensible threat — Iraq in 2003, Iran today — replays the debate finally ended by the events of December 7, 1941. To express skepticism about the necessity and prudence of using military power is to invite the charge of being an appeaser or an isolationist. Few politicians or individuals aspiring to power will risk the consequences of being tagged with that label.
In this sense, American politics remains stuck in the 1930s — always discovering a new Hitler, always privileging Churchillian rhetoric — even though the circumstances in which we live today bear scant resemblance to that earlier time. There was only one Hitler and he’s long dead. As for Churchill, his achievements and legacy are far more mixed than his battalions of defenders are willing to acknowledge. And if any one figure deserves particular credit for demolishing Hitler’s Reich and winning World War II, it’s Josef Stalin, a dictator as vile and murderous as Hitler himself.
Until Americans accept these facts, until they come to a more nuanced view of World War II that takes fully into account the political and moral implications of the U.S. alliance with the Soviet Union and the U.S. campaign of obliteration bombing directed against Germany and Japan, the mythic version of “the Good War” will continue to provide glib justifications for continuing to dodge that perennial question: How much is enough?
Like concentric security barriers arrayed around the Pentagon, these four factors — institutional self-interest, strategic inertia, cultural dissonance, and misremembered history — insulate the military budget from serious scrutiny. For advocates of a militarized approach to policy, they provide invaluable assets, to be defended at all costs.
Andrew J. Bacevich is professor of history and international relations at Boston University. His most recent book is Washington Rules: America’s Path to Permanent War. To listen to Timothy MacBain’s latest TomCast audio interview in which Bacevich discusses the money that pours into the national security budget, click here or, to download it to your iPod, here.
Copyright 2011 Andrew Bacevich