Apart from being a police officer, firefighter, or soldier engaged in one of this nation’s endless wars, writing a column for a major American newspaper has got to be one of the toughest and most unforgiving jobs there is. The pay may be decent (at least if your gig is with one of the major papers in New York or Washington), but the pressures to perform on cue are undoubtedly relentless.
Anyone who has ever tried cramming a coherent and ostensibly insightful argument into a mere 750 words knows what I’m talking about. Writing op-eds perhaps does not qualify as high art. Yet, like tying flies or knitting sweaters, it requires no small amount of skill. Performing the trick week in and week out without too obviously recycling the same ideas — or at least while disguising repetitions and concealing inconsistencies — requires notable gifts.
David Brooks of the New York Times is a gifted columnist. Among contemporary journalists, he is our Walter Lippmann, the closest thing we have to an establishment-approved public intellectual. As was the case with Lippmann, Brooks works hard to suppress the temptation to rant. He shuns raw partisanship. In his frequent radio and television appearances, he speaks in measured tones. Dry humor and ironic references abound. And like Lippmann, when circumstances change, he makes at least a show of adjusting his views accordingly.
For all that, Brooks remains an ideologue. In his columns, and even more so in his weekly appearances on NPR and PBS, he plays the role of the thoughtful, non-screaming conservative, his very presence affirming the ideological balance that, until November 8th of last year, was a prized hallmark of “respectable” journalism. Just as that balance always involved considerable posturing, so, too, with the ostensible conservatism of David Brooks: it’s an act.
Praying at the Altar of American Greatness
In terms of confessional fealty, his true allegiance is not to conservatism as such, but to the Church of America the Redeemer. This is a virtual congregation, albeit one possessing many of the attributes of a more traditional religion. The Church has its own Holy Scripture, authenticated on July 4, 1776, at a gathering of 56 prophets. And it has its own saints, prominent among them the Good Thomas Jefferson, chief author of the sacred text (not the Bad Thomas Jefferson who owned and impregnated slaves); Abraham Lincoln, who freed said slaves and thereby suffered martyrdom (on Good Friday no less); and, of course, the duly canonized figures most credited with saving the world itself from evil: Winston Churchill and Franklin Roosevelt, their status akin to that of saints Peter and Paul in Christianity. The Church of America the Redeemer even has its own Jerusalem, located on the banks of the Potomac, and its own hierarchy, its members situated nearby in High Temples of varying architectural distinction.
This ecumenical enterprise does not prize theological rigor. When it comes to shalts and shalt nots, it tends to be flexible, if not altogether squishy. It demands of the faithful just one thing: a fervent belief in America’s mission to remake the world in its own image. Although in times of crisis Brooks has occasionally gone a bit wobbly, he remains at heart a true believer.
In a March 1997 piece for The Weekly Standard, his then-employer, he summarized his credo. Entitled “A Return to National Greatness,” the essay opened with a glowing tribute to the Library of Congress and, in particular, to the building completed precisely a century earlier to house its many books and artifacts. According to Brooks, the structure itself embodied the aspirations defining America’s enduring purpose. He called particular attention to the dome above the main reading room decorated with a dozen “monumental figures” representing the advance of civilization and culminating in a figure representing America itself. Contemplating the imagery, Brooks rhapsodized:
“The theory of history depicted in this mural gave America impressive historical roots, a spiritual connection to the centuries. And it assigned a specific historic role to America as the latest successor to Jerusalem, Athens, and Rome. In the procession of civilization, certain nations rise up to make extraordinary contributions… At the dawn of the 20th century, America was to take its turn at global supremacy. It was America’s task to take the grandeur of past civilizations, modernize it, and democratize it. This common destiny would unify diverse Americans and give them a great national purpose.”
This February, 20 years later, in a column with an identical title, but this time appearing in the pages of his present employer, the New York Times, Brooks revisited this theme. Again, he began with a paean to the Library of Congress and its spectacular dome with its series of “monumental figures” that placed America “at the vanguard of the great human march of progress.” For Brooks, those 12 allegorical figures convey a profound truth.
“America is the grateful inheritor of other people’s gifts. It has a spiritual connection to all people in all places, but also an exceptional role. America culminates history. It advances a way of life and a democratic model that will provide people everywhere with dignity. The things Americans do are not for themselves only, but for all mankind.”
In 1997, in the midst of the Clinton presidency, Brooks had written that “America’s mission was to advance civilization itself.” In 2017, as Donald Trump gained entry into the Oval Office, he embellished and expanded that mission, describing a nation “assigned by providence to spread democracy and prosperity; to welcome the stranger; to be brother and sister to the whole human race.”
Back in 1997, “a moment of world supremacy unlike any other,” Brooks had worried that his countrymen might not seize the opportunity that was presenting itself. On the cusp of the twenty-first century, he fretted that Americans had “discarded their pursuit of national greatness in just about every particular.” The times called for a leader like Theodore Roosevelt, who wielded that classic “big stick” and undertook monster projects like the Panama Canal. Yet Americans were stuck instead with Bill Clinton, a small-bore triangulator. “We no longer look at history as a succession of golden ages,” Brooks lamented. “And, save in the speeches of politicians who usually have no clue what they are talking about,” America was no longer fulfilling its “special role as the vanguard of civilization.”
By early 2017, with Donald Trump in the White House and Steve Bannon whispering in his ear, matters had become worse still. Americans had seemingly abandoned their calling outright. “The Trump and Bannon anschluss has exposed the hollowness of our patriotism,” wrote Brooks, inserting the now-obligatory reference to Nazi Germany. The November 2016 presidential election had “exposed how attenuated our vision of national greatness has become and how easy it was for Trump and Bannon to replace a youthful vision of American greatness with a reactionary, alien one.” That vision now threatens to leave America as “just another nation, hunkered down in a fearful world.”
What exactly happened between 1997 and 2017, you might ask? What occurred during that “moment of world supremacy” to reduce the United States from a nation summoned to redeem humankind to one hunkered down in fear?
Trust Brooks to have at hand a brow-furrowing explanation. The fault, he explains, lies with an “educational system that doesn’t teach civilizational history or real American history but instead a shapeless multiculturalism,” as well as with “an intellectual culture that can’t imagine providence.” Brooks blames “people on the left who are uncomfortable with patriotism and people on the right who are uncomfortable with the federal government that is necessary to lead our project.”
An America that no longer believes in itself — that’s the problem. In effect, Brooks revises Norma Desmond’s famous complaint about the movies, now repurposed to diagnose an ailing nation: it’s the politics that got small.
Nowhere does he consider the possibility that his formula for “national greatness” just might be so much hooey. Between 1997 and 2017, after all, egged on by people like David Brooks, Americans took a stab at “greatness,” with the execrable Donald Trump now numbering among the eventual results.
Say what you will about the shortcomings of the American educational system and the country’s intellectual culture, they had far less to do with creating Trump than did popular revulsion prompted by specific policies that Brooks, among others, enthusiastically promoted. Not that he is inclined to tally up the consequences. Only as a sort of postscript to his litany of contemporary American ailments does he refer even in passing to what he calls the “humiliations of Iraq.”
A great phrase, that. Yet much like, say, the “tragedy of Vietnam” or the “crisis of Watergate,” it conceals more than it reveals. Here, in short, is a succinct historical reference that cries out for further explanation. It bursts at the seams with implications demanding to be unpacked, weighed, and scrutinized. Brooks shrugs off Iraq as a minor embarrassment, the equivalent of having shown up at a dinner party wearing the wrong clothes.
Under the circumstances, it’s easy to forget that, back in 2003, he and other members of the Church of America the Redeemer devoutly supported the invasion of Iraq. They welcomed war. They urged it. They did so not because Saddam Hussein was uniquely evil — although he was evil enough — but because they saw in such a war the means for the United States to accomplish its salvific mission. Toppling Saddam and transforming Iraq would provide the mechanism for affirming and renewing America’s “national greatness.”
Anyone daring to disagree with that proposition they denounced as craven or cowardly. Writing at the time, Brooks disparaged those opposing the war as mere “marchers.” They were effete, pretentious, ineffective, and absurd. “These people are always in the streets with their banners and puppets. They march against the IMF and World Bank one day, and against whatever war happens to be going on the next… They just march against.”
Perhaps space constraints did not permit Brooks in his recent column to spell out the “humiliations” that resulted and that even today continue to accumulate. Here in any event is a brief inventory of what that euphemism conceals: thousands of Americans needlessly killed; tens of thousands grievously wounded in body or spirit; trillions of dollars wasted; millions of Iraqis dead, injured, or displaced; this nation’s moral standing compromised by its resort to torture, kidnapping, assassination, and other perversions; a region thrown into chaos and threatened by radical terrorist entities like the Islamic State that U.S. military actions helped foster. And now, if only as an oblique second-order bonus, we have Donald Trump’s elevation to the presidency to boot.
In refusing to reckon with the results of the war he once so ardently endorsed, Brooks is hardly alone. Members of the Church of America the Redeemer, Democrats and Republicans alike, are demonstrably incapable of rendering an honest accounting of what their missionary efforts have yielded.
Brooks belongs, or once did, to the Church’s neoconservative branch. But liberals such as Bill Clinton, along with his secretary of state Madeleine Albright, were congregants in good standing, as were Barack Obama and his secretary of state Hillary Clinton. So, too, are putative conservatives like Senators John McCain, Ted Cruz, and Marco Rubio, all of them subscribing to the belief in the singularity and indispensability of the United States as the chief engine of history, now and forever.
Back in April 2003, confident that the fall of Baghdad had ended the Iraq War, Brooks predicted that “no day will come when the enemies of this endeavor turn around and say, ‘We were wrong. Bush was right.’” Rather than admitting error, he continued, the war’s opponents “will just extend their forebodings into a more distant future.”
Yet it is the war’s proponents who, in the intervening years, have choked on admitting that they were wrong. Or when making such an admission, as did both John Kerry and Hillary Clinton while running for president, they write it off as an aberration, a momentary lapse in judgment of no particular significance, like having guessed wrong on a TV quiz show.
Rather than requiring acts of contrition, the Church of America the Redeemer has long promulgated a doctrine of self-forgiveness, freely available to all adherents all the time. “You think our country’s so innocent?” the nation’s 45th president recently barked at a TV host who had the temerity to ask how he could have kind words for the likes of Russian President Vladimir Putin. Observers professed shock that a sitting president would openly question American innocence.
In fact, Trump’s response and the kerfuffle that ensued both missed the point. No serious person believes that the United States is “innocent.” Worshipers in the Church of America the Redeemer do firmly believe, however, that America’s transgressions, unlike those of other countries, don’t count against it. Once committed, such sins are simply to be set aside and then expunged, a process that allows American politicians and pundits to condemn a “killer” like Putin with a perfectly clear conscience while demanding that Donald Trump do the same.
What the Russian president has done in Crimea, Ukraine, and Syria qualifies as criminal. What American presidents have done in Iraq, Afghanistan, and Libya qualifies as incidental and, above all, beside the point.
Rather than confronting the havoc and bloodshed to which the United States has contributed, those who worship in the Church of America the Redeemer keep their eyes fixed on the far horizon and the work still to be done in aligning the world with American expectations. At least they would, were it not for the arrival at center stage of a manifestly false prophet who, in promising to “make America great again,” inverts all that “national greatness” is meant to signify.
For Brooks and his fellow believers, the call to “greatness” emanates from faraway precincts — in the Middle East, East Asia, and Eastern Europe. For Trump, the key to “greatness” lies in keeping faraway places and the people who live there as faraway as possible. Brooks et al. see a world that needs saving and believe that it’s America’s calling to do just that. In Trump’s view, saving others is not a peculiarly American responsibility. Events beyond our borders matter only to the extent that they affect America’s well-being. Trump worships in the Church of America First, or at least pretends to do so in order to impress his followers.
That Donald Trump inhabits a universe of his own devising, constructed of carefully arranged alt-facts, is no doubt the case. Yet, in truth, much the same can be said of David Brooks and others sharing his view of a country providentially charged to serve as the “successor to Jerusalem, Athens, and Rome.” In fact, this conception of America’s purpose expresses not the intent of providence, which is inherently ambiguous, but their own arrogance and conceit. Out of that conceit comes much mischief. And in the wake of mischief come charlatans like Donald Trump.
Angst in the Church of America the Redeemer
The fall of the Berlin Wall in November 1989 abruptly ended one historical era and inaugurated another. So, too, did the outcome of last year’s U.S. presidential election. What are we to make of the interval between those two watershed moments? Answering that question is essential to understanding how Donald Trump became president and where his ascendency leaves us.
Hardly had this period commenced when observers fell into the habit of referring to it as the “post-Cold War” era. Now that it’s over, a more descriptive name might be in order. My suggestion: America’s Age of Great Expectations.
Forgive and Forget
The end of the Cold War caught the United States completely by surprise. During the 1980s, even with Mikhail Gorbachev running the Kremlin, few in Washington questioned the prevailing conviction that the Soviet-American rivalry was and would remain a defining feature of international politics more or less in perpetuity. Indeed, endorsing such an assumption was among the prerequisites for gaining entrée to official circles. Virtually no one in the American establishment gave serious thought to the here-today, gone-tomorrow possibility that the Soviet threat, the Soviet empire, and the Soviet Union itself might someday vanish. Washington had plans aplenty for what to do should a Third World War erupt, but none for what to do if the prospect of such a climactic conflict simply disappeared.
Still, when the Berlin Wall fell and two years later the Soviet Union imploded, leading members of that establishment wasted no time in explaining the implications of developments they had totally failed to anticipate. With something close to unanimity, politicians and policy-oriented intellectuals interpreted the fall of the Wall and the ensuing collapse of communism as an all-American victory of cosmic proportions. “We” had won, “they” had lost — with that outcome vindicating everything the United States represented as the archetype of freedom.
From within the confines of that establishment, one rising young intellectual audaciously suggested that the “end of history” itself might be at hand, with the “sole superpower” left standing now perfectly positioned to determine the future of all humankind. In Washington, various powers-that-be considered this hypothesis and concluded that it sounded just about right. The future took on the appearance of a blank slate upon which Destiny itself was inviting Americans to inscribe their intentions.
American elites might, of course, have assigned a far different, less celebratory meaning to the passing of the Cold War. They might have seen the outcome as a moment that called for regret, repentance, and making amends.
After all, the competition between the United States and the Soviet Union, or more broadly between what was then called the Free World and the Communist bloc, had yielded a host of baleful effects. An arms race between two superpowers had created monstrous nuclear arsenals and, on multiple occasions, brought the planet precariously close to Armageddon. Two singularly inglorious wars had claimed the lives of many tens of thousands of American soldiers and literally millions of Asians. One, on the Korean peninsula, had ended in an unsatisfactory draw; the other, in Southeast Asia, in catastrophic defeat. Proxy fights in Asia, Africa, Latin America, and the Middle East killed so many more and laid waste to whole countries. Cold War obsessions led Washington to overthrow democratic governments, connive in assassination, make common cause with corrupt dictators, and turn a blind eye to genocidal violence. On the home front, hysteria compromised civil liberties and fostered a sprawling, intrusive, and unaccountable national security apparatus. Meanwhile, the military-industrial complex and its beneficiaries conspired to spend vast sums on weapons purchases that somehow never seemed adequate to the putative dangers at hand.
Rather than reflecting on such somber and sordid matters, however, the American political establishment together with ambitious members of the country’s intelligentsia found it so much more expedient simply to move on. As they saw it, the annus mirabilis of 1989 wiped away the sins of former years. Eager to make a fresh start, Washington granted itself a plenary indulgence. After all, why contemplate past unpleasantness when a future so stunningly rich in promise now beckoned?
Three Big Ideas and a Dubious Corollary
Soon enough, that promise found concrete expression. In remarkably short order, three themes emerged to define the new American age. Informing each of them was a sense of exuberant anticipation toward an era of almost unimaginable expectations. The twentieth century was ending on a high note. For the planet as a whole but especially for the United States, great things lay ahead.
Focused on the world economy, the first of those themes emphasized the transformative potential of turbocharged globalization led by U.S.-based financial institutions and transnational corporations. An “open world” would facilitate the movement of goods, capital, ideas, and people and thereby create wealth on an unprecedented scale. In the process, the rules governing American-style corporate capitalism would come to prevail everywhere on the planet. Everyone would benefit, but especially Americans who would continue to enjoy more than their fair share of material abundance.
Focused on statecraft, the second theme spelled out the implications of an international order dominated as never before — not even in the heydays of the Roman and British Empires — by a single nation. With the passing of the Cold War, the United States now stood apart as both supreme power and irreplaceable global leader, its status guaranteed by its unstoppable military might.
In the editorial offices of the Wall Street Journal, the Washington Post, the New Republic, and the Weekly Standard, such “truths” achieved a self-evident status. Although more muted in their public pronouncements than Washington’s reigning pundits, officials enjoying access to the Oval Office, the State Department’s 7th floor, and the E-ring of the Pentagon generally agreed. The assertive exercise of (benign!) global hegemony seemingly held the key to ensuring that Americans would enjoy safety and security, both at home and abroad, now and in perpetuity.
The third theme was all about rethinking the concept of personal freedom as commonly understood and pursued by most Americans. During the protracted emergency of the Cold War, reaching an accommodation between freedom and the putative imperatives of national security had not come easily. Cold War-style patriotism seemingly prioritized the interests of the state at the expense of the individual. Yet even as thrillingly expressed by John F. Kennedy — “Ask not what your country can do for you, ask what you can do for your country” — this was never an easy sell, especially if it meant wading through rice paddies and getting shot at.
Once the Cold War ended, however, the tension between individual freedom and national security momentarily dissipated. Reigning conceptions of what freedom could or should entail underwent a radical transformation. Emphasizing the removal of restraints and inhibitions, the shift made itself felt everywhere, from patterns of consumption and modes of cultural expression to sexuality and the definition of the family. Norms that had prevailed for decades if not generations — marriage as a union between a man and a woman, gender identity as fixed at birth — became passé. The concept of a transcendent common good, which during the Cold War had taken a backseat to national security, now took a backseat to maximizing individual choice and autonomy.
Finally, as a complement to these themes, in the realm of governance, the end of the Cold War cemented the status of the president as quasi-deity. In the Age of Great Expectations, the myth of the president as a deliverer from (or, in the eyes of critics, the ultimate perpetrator of) evil flourished. In the solar system of American politics, the man in the White House increasingly became the sun around which everything seemed to orbit. By comparison, nothing else much mattered.
From one administration to the next, of course, presidential efforts to deliver Americans to the Promised Land regularly came up short. Even so, the political establishment and the establishment media collaborated in sustaining the pretense that out of the next endlessly hyped “race for the White House,” another Roosevelt or Kennedy or Reagan would magically emerge to save the nation. From one election cycle to the next, these campaigns became longer and more expensive, drearier and yet ever more circus-like. No matter. During the Age of Great Expectations, the reflexive tendency to see the president as the ultimate guarantor of American abundance, security, and freedom remained sacrosanct.
Meanwhile, between promise and reality, a yawning gap began to appear. During the concluding decade of the twentieth century and the first decade-and-a-half of the twenty-first, Americans endured a seemingly endless series of crises. Individually, none of these merit comparison with, say, the Civil War or World War II. Yet never in U.S. history has a sequence of events occurring in such close proximity subjected American institutions and the American people to greater stress.
During the decade between 1998 and 2008, they came on with startling regularity: one president impeached and his successor chosen by the direct intervention of the Supreme Court; a massive terrorist attack on American soil that killed thousands, traumatized the nation, and left senior officials bereft of their senses; a mindless, needless, and unsuccessful war of choice launched on the basis of false claims and outright lies; a natural disaster (exacerbated by engineering folly) that all but destroyed a major American city, after which government agencies mounted a belated and half-hearted response; and finally, the worst economic downturn since the Great Depression, bringing ruin to millions of families.
For the sake of completeness, we should append to this roster of seismic occurrences one additional event: Barack Obama’s election as the nation’s first black president. He arrived at the zenith of American political life as a seemingly messianic figure called upon not only to undo the damage wrought by his predecessor, George W. Bush, but somehow to absolve the nation of its original sins of slavery and racism.
Yet during the Obama presidency race relations, in fact, deteriorated. Whether prompted by cynical political calculations or a crass desire to boost ratings, race baiters came out of the woodwork — one of them, of course, infamously birthered in Trump Tower in mid-Manhattan — and poured their poisons into the body politic. Even so, as the end of Obama’s term approached, the cult of the presidency itself remained remarkably intact.
Individually, the impact of these various crises ranged from disconcerting to debilitating to horrifying. Yet to treat them separately is to overlook their collective implications, which the election of Donald Trump only now enables us to appreciate. It was not one president’s dalliance with an intern or “hanging chads” or 9/11 or “Mission Accomplished” or the inundation of the Lower Ninth Ward or the collapse of Lehman Brothers or the absurd birther movement that undermined the Age of Great Expectations. It was the way all these events together exposed those expectations as radically suspect.
In effect, the various crises that punctuated the post-Cold War era called into question key themes to which a fevered American triumphalism had given rise. Globalization, militarized hegemony, and a more expansive definition of freedom, guided by enlightened presidents in tune with the times, should have provided Americans with all the blessings that were rightly theirs as a consequence of having prevailed in the Cold War. Instead, between 1989 and 2016, things kept happening that weren’t supposed to happen. A future marketed as all but foreordained proved elusive, if not illusory. As actually experienced, the Age of Great Expectations became an Age of Unwelcome Surprises.
A Candidate for Decline
True, globalization created wealth on a vast scale, just not for ordinary Americans. The already well-to-do did splendidly, in some cases unbelievably so. But middle-class incomes stagnated and good jobs became increasingly hard to find or keep. By the election of 2016, the United States looked increasingly like a society divided between haves and have-nots, the affluent and the left-behind, the 1% and everyone else. Prospective voters were noticing.
Meanwhile, policies inspired by Washington’s soaring hegemonic ambitions produced remarkably few happy outcomes. With U.S. forces continuously engaged in combat operations, peace all but vanished as a policy objective (or even a word in Washington’s political lexicon). The acknowledged standing of the country’s military as the world’s best-trained, best-equipped, and best-led force coexisted uneasily with the fact that it proved unable to win. Instead, the national security establishment became conditioned to the idea of permanent war, high-ranking officials taking it for granted that ordinary citizens would simply accommodate themselves to this new reality. Yet it soon became apparent that, instead of giving ordinary Americans a sense of security, this new paradigm induced an acute sense of vulnerability, which left many susceptible to demagogic fear mongering.
As for the revised definition of freedom, with autonomy emerging as the national summum bonum, it left some satisfied but others adrift. During the Age of Great Expectations, distinctions between citizen and consumer blurred. Shopping became tantamount to a civic obligation, essential to keeping the economy afloat. Yet if all the hoopla surrounding Black Friday and Cyber Monday represented a celebration of American freedom, its satisfactions were transitory at best, rarely extending beyond the due date printed on a credit card statement. Meanwhile, as digital connections displaced personal ones, relationships, like jobs, became more contingent and temporary. Loneliness emerged as an abiding affliction. And for all the talk of empowering the marginalized — people of color, women, gays — elites reaped the lion’s share of the benefits while ordinary people were left to make do. The atmosphere was rife with hypocrisy and even a whiff of nihilism.
To these various contradictions, the establishment itself remained stubbornly oblivious, with the 2016 presidential candidacy of Hillary Clinton offering a case in point. As her long record in public life made abundantly clear, Clinton embodied the establishment in the Age of Great Expectations. She believed in globalization, in the indispensability of American leadership backed by military power, and in the post-Cold War cultural project. And she certainly believed in the presidency as the mechanism to translate aspirations into outcomes.
Such commonplace convictions of the era, along with her vanguard role in pressing for the empowerment of women, imparted to her run an air of inevitability. That she deserved to win appeared self-evident. It was, after all, her turn. Largely overlooked were signs that the abiding themes of the Age of Great Expectations no longer commanded automatic allegiance.
Gasping for Air
Senator Bernie Sanders offered one of those signs. That a past-his-prime, self-professed socialist from Vermont with a negligible record of legislative achievement and tenuous links to the Democratic Party might mount a serious challenge to Clinton seemed, on the face of it, absurd. Yet by zeroing in on unfairness and inequality as inevitable byproducts of globalization, Sanders struck a chord.
Knocked briefly off balance, Clinton responded by modifying certain of her longstanding positions. By backing away from free trade, the ne plus ultra of globalization, she managed, though not without difficulty, to defeat the Sanders insurgency. Even so, he, in effect, served as the canary in the establishment coal mine, signaling that the Age of Great Expectations might be running out of oxygen.
A parallel and far stranger insurgency was simultaneously wreaking havoc in the Republican Party. That a narcissistic political neophyte stood the slightest chance of capturing the GOP seemed even more improbable than Sanders taking a nomination that appeared Clinton’s by right.
Coarse, vulgar, unprincipled, uninformed, erratic, and with little regard for truth, Trump was sui generis among presidential candidates. Yet he possessed a singular gift: a knack for riling up those who nurse gripes and are keen to pin the blame on someone or something. In post-Cold War America, among the millions whom Hillary Clinton famously dismissed as “deplorables,” gripes had been ripening like cheese in a hothouse.
Through whatever combination of intuition and malice aforethought, Trump demonstrated a genius for motivating those deplorables. He pushed their buttons. They responded by turning out in droves to attend his rallies. There they listened to a message that they found compelling.
In Trump’s pledge to “make America great again” his followers heard a promise to restore everything they believed had been taken from them in the Age of Great Expectations. Globalization was neither beneficial nor inevitable, the candidate insisted, and vowed, once elected, to curb its effects along with the excesses of corporate capitalism, thereby bringing back millions of lost jobs from overseas. He would, he swore, fund a massive infrastructure program, cut taxes, keep a lid on the national debt, and generally champion the cause of working stiffs. The many complications and contradictions inherent in these various prescriptions would, he assured his fans, give way to his business savvy.
In considering America’s role in the post-Cold War world, Trump exhibited a similar impatience with the status quo. Rather than allowing armed conflicts to drag on forever, he promised to win them (putting to work his mastery of military affairs) or, if not, to quit and get out, pausing just long enough to claim as a sort of consolation prize whatever spoils might be lying loose on the battlefield. At the very least, he would prevent so-called allies from treating the United States like some patsy. Henceforth, nations benefitting from American protection were going to foot their share of the bill. What all of this added up to may not have been clear, but it did suggest a sharp departure from the usual post-1989 formula for exercising global leadership.
No less important than Trump’s semi-coherent critique of globalization and American globalism, however, was his success in channeling the discontent of all those who nursed an inchoate sense that post-Cold War freedoms might be working for some, but not for them.
Not that Trump had anything to say about whether freedom confers obligations, or whether conspicuous consumption might not actually hold the key to human happiness, or any of the various controversies related to gender, sexuality, and family. He was indifferent to all such matters. He was, however, distinctly able to offer his followers a grimly persuasive explanation for how America had gone off course and how the blessings of liberty to which they were entitled had been stolen. He did that by fingering as scapegoats Muslims, Mexicans, and others “not-like-me.”
Trump’s political strategy reduced to this: as president, he would overturn the conventions that had governed right thinking since the end of the Cold War. To the amazement of an establishment grown smug and lazy, his approach worked. Even while disregarding all received wisdom when it came to organizing and conducting a presidential campaign in the Age of Great Expectations, Trump won. He did so by enchanting the disenchanted, all those who had lost faith in the promises that had sprung from the bosom of the elites that the end of the Cold War had taken by surprise.
Adrift Without a Compass
Within hours of Trump’s election, among progressives, expressing fear and trepidation at the prospect of what he might actually do on assuming office became de rigueur. Yet those who had actually voted for Trump were also left wondering what to expect. Both camps assigned him the status of a transformative historical figure. However, premonitions of incipient fascism and hopes that he will engineer a new American Golden Age are likely to prove similarly misplaced. To focus on the man himself rather than on the circumstances that produced him is to miss the significance of what has occurred.
Note, for example, that his mandate is almost entirely negative. It centers on rejection: of globalization, of counterproductive military meddling, and of the post-Cold War cultural project. Yet neither Trump nor any of his surrogates has offered a coherent alternative to the triad of themes providing the through line for the last quarter-century of American history. Apart from a lingering conviction that forceful — in The Donald’s case, blustering — presidential leadership can somehow turn things around, “Trumpism” is a dog’s breakfast.
In all likelihood, his presidency will prove less transformative than transitional. As a result, concerns about what he may do, however worrisome, matter less than the larger question of where we go from here. The principles that enjoyed favor following the Cold War have been found wanting. What should replace them?
Efforts to identify those principles should begin with an honest accounting of the age we are now leaving behind, the history that happened after “the end of history.” That accounting should, in turn, allow room for regret, repentance, and making amends — the very critical appraisal that ought to have occurred at the end of the Cold War but was preempted when American elites succumbed to their bout of victory disease.
Don’t expect Donald Trump to undertake any such appraisal. Nor will the establishment that candidate Trump so roundly denounced, but which President-elect Trump, at least in his senior national security appointments, now shows signs of accommodating. Those expecting Trump’s election to inject courage into members of the political class or imagination into inside-the-Beltway “thought leaders” are in for a disappointment. So the principles we need — an approach to political economy providing sustainable and equitable prosperity; a foreign policy that discards militarism in favor of prudence and pragmatism; and an enriched, inclusive concept of freedom — will have to come from somewhere else.
“Where there is no vision,” the Book of Proverbs tells us, “the people perish.” In the present day, there is no vision to which Americans collectively adhere. For proof, we need look no further than the election of Donald Trump.
The Age of Great Expectations has ended, leaving behind an ominous void. Yet Trump’s own inability to explain what should fill that great void provides neither excuse for inaction nor cause for despair. Instead, Trump himself makes manifest the need to reflect on the nation’s recent past and to think deeply about its future.
A decade before the Cold War ended, writing in democracy, a short-lived journal devoted to “political renewal and radical change,” the historian and social critic Christopher Lasch sketched out a set of principles that might lead us out of our current crisis. Lasch called for a politics based on “the nurture of the soil against the exploitation of resources, the family against the factory, the romantic vision of the individual against the technological vision, [and] localism over democratic centralism.” Nearly four decades later, as a place to begin, his prescription remains apt.
President-elect Donald Trump’s message for the nation’s senior military leadership is ambiguously unambiguous. Here he is on 60 Minutes just days after winning the election:
Trump: “We have some great generals. We have great generals.”
Lesley Stahl: “You said you knew more than the generals about ISIS.”
Trump: “Well, I’ll be honest with you, I probably do because look at the job they’ve done. OK, look at the job they’ve done. They haven’t done the job.”
In reality, Trump, the former reality show host, knows next to nothing about ISIS, one of many gaps in his education that his impending encounter with actual reality is likely to fill. Yet when it comes to America’s generals, our president-to-be is onto something. No doubt our three- and four-star officers qualify as “great” in the sense that they mean well, work hard, and are altogether fine men and women. That they have not “done the job,” however, is indisputable — at least if their job is to bring America’s wars to a timely and successful conclusion.
Trump’s unhappy verdict — that the senior U.S. military leadership doesn’t know how to win — applies in spades to the two principal conflicts of the post-9/11 era: the Afghanistan War, now in its 16th year, and the Iraq War, launched in 2003 and (after a brief hiatus) once more grinding on. Yet the verdict applies equally to lesser theaters of conflict, largely overlooked by the American public, that in recent years have engaged the attention of U.S. forces, a list that would include conflicts in Libya, Somalia, Syria, and Yemen.
Granted, our generals have demonstrated an impressive aptitude for moving pieces around on a dauntingly complex military chessboard. Brigades, battle groups, and squadrons shuttle in and out of various war zones, responding to the needs of the moment. The sheer immensity of the enterprise across the Greater Middle East and northern Africa — the sorties flown, munitions expended, the seamless deployment and redeployment of thousands of troops over thousands of miles, the vast stockpiles of matériel positioned, expended, and continuously resupplied — represents a staggering achievement. Measured by these or similar quantifiable outputs, America’s military has excelled. No other military establishment in history could have come close to duplicating the logistical feats being performed year in, year out by the armed forces of the United States.
Nor should we overlook the resulting body count. Since the autumn of 2001, something like 370,000 combatants and noncombatants have been killed in the various theaters of operations where U.S. forces have been active. Although modest by twentieth century standards, this post-9/11 harvest of death is hardly trivial.
Yet in evaluating military operations, it’s a mistake to confuse how much with how well. Only rarely do the outcomes of armed conflicts turn on comparative statistics. Ultimately, the one measure of success that really matters involves achieving war’s political purposes. By that standard, victory requires not simply the defeat of the enemy, but accomplishing the nation’s stated war aims, and not just in part or temporarily but definitively. Anything less constitutes failure, not to mention utter waste for taxpayers and, for those called upon to fight, cause for mourning.
By that standard, having been “at war” for virtually the entire twenty-first century, the United States military is still looking for its first win. And however strong the disinclination to concede that Donald Trump could be right about anything, his verdict on American generalship qualifies as apt.
A Never-Ending Parade of Commanders for Wars That Never End
That verdict brings to mind three questions. First, with Trump a rare exception, why have the recurring shortcomings of America’s military leadership largely escaped notice? Second, to what degree does faulty generalship suffice to explain why actual victory has proven so elusive? Third, to the extent that deficiencies at the top of the military hierarchy bear directly on the outcome of our wars, how might the generals improve their game?
As to the first question, the explanation is quite simple: During protracted wars, traditional standards for measuring generalship lose their salience. Without pertinent standards, there can be no accountability. Absent accountability, failings and weaknesses escape notice. Eventually, what you’ve become accustomed to seems tolerable. Twenty-first century Americans inured to wars that never end have long since forgotten that bringing such conflicts to a prompt and successful conclusion once defined the very essence of what generals were expected to do.
Senior military officers were presumed to possess unique expertise in designing campaigns and directing engagements. Not found among mere civilians or even among soldiers of lesser rank, this expertise provided the rationale for conferring status and authority on generals.
In earlier eras, the very structure of wars provided a relatively straightforward mechanism for testing such claims to expertise. Events on the battlefield rendered harsh judgments, creating or destroying reputations with brutal efficiency.
Back then, standards employed in evaluating generalship were clear-cut and uncompromising. Those who won battles earned fame, glory, and the gratitude of their countrymen. Those who lost battles got fired or were put out to pasture.
During the Civil War, for example, Abraham Lincoln did not need an advanced degree in strategic studies to conclude that Union generals like John Pope, Ambrose Burnside, and Joseph Hooker didn’t have what it took to defeat the Army of Northern Virginia. Humiliating defeats sustained by the Army of the Potomac at the Second Bull Run, Fredericksburg, and Chancellorsville made that obvious enough. Similarly, the victories Ulysses S. Grant and William T. Sherman gained at Shiloh, at Vicksburg, and in the Chattanooga campaign strongly suggested that here was the team to which the president could entrust the task of bringing the Confederacy to its knees.
Today, public drunkenness, petty corruption, or sexual shenanigans with a subordinate might land generals in hot water. But as long as they avoid egregious misbehavior, senior officers charged with prosecuting America’s wars are largely spared judgments of any sort. Trying hard is enough to get a passing grade.
With the country’s political leaders and public conditioned to conflicts seemingly destined to drag on for years, if not decades, no one expects the current general-in-chief in Iraq or Afghanistan to bring things to a successful conclusion. His job is merely to manage the situation until he passes it along to a successor, while duly adding to his collection of personal decorations and perhaps advancing his career.
Today, for example, Army General John Nicholson commands U.S. and allied forces in Afghanistan. He’s only the latest in a long line of senior officers to preside over that war, beginning with General Tommy Franks in 2001 and continuing with Generals Mikolashek, Barno, Eikenberry, McNeill, McKiernan, McChrystal, Petraeus, Allen, Dunford, and Campbell. The title carried by these officers changed over time. So, too, did the specifics of their “mission” as Operation Enduring Freedom evolved into Operation Freedom’s Sentinel. Yet even as expectations slipped lower and lower, none of the commanders rotating through Kabul delivered. Not a single one has, in our president-elect’s concise formulation, “done the job.” Indeed, it’s increasingly difficult to know what that job is, apart from preventing the Taliban from quite literally toppling the government.
In Iraq, meanwhile, Army Lieutenant General Stephen Townsend currently serves as the — count ’em — ninth American to command U.S. and coalition forces in that country since the George W. Bush administration ordered the invasion of 2003. The first in that line, (once again) General Tommy Franks, overthrew the Saddam Hussein regime and thereby broke Iraq. The next five, Generals Sanchez, Casey, Petraeus, Odierno, and Austin, labored for eight years to put it back together again.
At the end of 2011, President Obama declared that they had done just that and terminated the U.S. military occupation. The Islamic State soon exposed Obama’s claim as specious when its militants put a U.S.-trained Iraqi army to flight and annexed large swathes of that country’s territory. Following in the footsteps of his immediate predecessors Generals James Terry and Sean MacFarland, General Townsend now shoulders the task of trying to restore Iraq’s status as a more or less genuinely sovereign state. He directs what the Pentagon calls Operation Inherent Resolve, dating from June 2014, the follow-on to Operation New Dawn (September 2010-December 2011), which was itself the successor to Operation Iraqi Freedom (March 2003-August 2010).
When and how Inherent Resolve will conclude is difficult to forecast. This much we can, however, say with some confidence: with the end nowhere in sight, General Townsend won’t be its last commander. Other generals are waiting in the wings with their own careers to polish. As in Kabul, the parade of U.S. military commanders through Baghdad will continue.
For some readers, this listing of mostly forgotten names and dates may have a soporific effect. Yet it should also drive home Trump’s point. The United States may today have the world’s most powerful and capable military — so at least we are constantly told. Yet the record shows that it does not have a corps of senior officers who know how to translate capability into successful outcomes.
Draining Which Swamp?
That brings us to the second question: Even if commander-in-chief Trump were somehow able to identify modern day equivalents of Grant and Sherman to implement his war plans, secret or otherwise, would they deliver victory?
On that score, we would do well to entertain doubts. Although senior officers charged with running recent American wars have not exactly covered themselves in glory, it doesn’t follow that their shortcomings offer the sole or even a principal explanation for why those wars have yielded such disappointing results. The truth is that some wars aren’t winnable and shouldn’t be fought.
So, yes, Trump’s critique of American generalship possesses merit, but whether he knows it or not, the question truly demanding his attention as the incoming commander-in-chief isn’t: Who should I hire (or fire) to fight my wars? Instead, far more urgent is: Does further war promise to solve any of my problems?
One mark of a successful business executive is knowing when to cut your losses. It’s also the mark of a successful statesman. Trump claims to be the former. Whether his putative business savvy will translate into the world of statecraft remains to be seen. Early signs are not promising.
As a candidate, Trump vowed to “defeat radical Islamic terrorism,” destroy ISIS, “decimate al-Qaeda,” and “starve funding for Iran-backed Hamas and Hezbollah.” Those promises imply a significant escalation of what Americans used to call the Global War on Terrorism.
Toward that end, the incoming administration may well revive some aspects of the George W. Bush playbook, including repopulating the military prison at Guantanamo Bay, Cuba, and “if it’s so important to the American people,” reinstituting torture. The Trump administration will at least consider re-imposing sanctions on countries like Iran. It may aggressively exploit the offensive potential of cyber-weapons, betting that America’s cyber-defenses will hold.
Yet President Trump is also likely to double down on the use of conventional military force. In that regard, his promise to “quickly and decisively bomb the hell out of ISIS” offers a hint of what is to come. His appointment of the uber-hawkish Lieutenant General Michael Flynn as his national security adviser and his rumored selection of retired Marine Corps General James (“Mad Dog”) Mattis as defense secretary suggest that he means what he says. In sum, a Trump administration seems unlikely to reexamine the conviction that the problems roiling the Greater Middle East will someday, somehow yield to a U.S.-imposed military solution. Indeed, in the face of massive evidence to the contrary, that conviction will deepen, with genuinely ironic implications for the Trump presidency.
In the immediate wake of 9/11, George W. Bush concocted a fantasy of American soldiers liberating oppressed Afghans and Iraqis and thereby “draining the swamp” that served to incubate anti-Western terrorism. The results achieved proved beyond disappointing, while the costs exacted in terms of lives and dollars squandered were painful indeed. Incrementally, with the passage of time, many Americans concluded that perhaps the swamp most in need of attention was not on the far side of the planet but much closer at hand — right in the imperial city nestled alongside the Potomac River.
To a very considerable extent, Trump defeated Hillary Clinton, preferred candidate of the establishment, because he advertised himself as just the guy disgruntled Americans could count on to drain that swamp.
Yet here’s what too few of those Americans appreciate, even today: war created that swamp in the first place. War empowers Washington. It centralizes. It provides a rationale for federal authorities to accumulate and exercise new powers. It makes government bigger and more intrusive. It lubricates the machinery of waste, fraud, and abuse that causes tens of billions of taxpayer dollars to vanish every year. When it comes to sustaining the swamp, nothing works better than war.
Were Trump really intent on draining that swamp — if he genuinely seeks to “Make America Great Again” — then he would extricate the United States from war. His liquidation of Trump University, which was to higher education what Freedom’s Sentinel and Inherent Resolve are to modern warfare, provides a potentially instructive precedent for how to proceed.
But don’t hold your breath on that one. All signs indicate that, in one fashion or another, our combative next president will perpetuate the wars he’s inheriting. Trump may fancy that, as a veteran of Celebrity Apprentice (but not of military service), he possesses a special knack for spotting the next Grant or Sherman. But acting on that impulse will merely replenish the swamp in the Greater Middle East along with the one in Washington. And soon enough, those who elected him with expectations of seeing the much-despised establishment dismantled will realize that they’ve been had.
Which brings us, finally, to that third question: To the extent that deficiencies at the top of the military hierarchy do affect the outcome of wars, what can be done to fix the problem?
The most expeditious approach: purge all currently serving three- and four-star officers; then, make a precondition for promotion to those ranks confinement in a reeducation camp run by Iraq and Afghanistan war amputees, with a curriculum designed by Veterans for Peace. Graduation should require each student to submit an essay reflecting on these words of wisdom from U.S. Grant himself: “There never was a time when, in my opinion, some way could not be found to prevent the drawing of the sword.”
True, such an approach may seem a bit draconian. But this is no time for half-measures — as even Donald Trump may eventually recognize.
Copyright 2016 Andrew J. Bacevich
You may have missed it. Perhaps you dozed off. Or wandered into the kitchen to grab a snack. Or by that point in the proceedings were checking out Seinfeld reruns. During the latter part of the much hyped but excruciating-to-watch first presidential debate, NBC Nightly News anchor Lester Holt posed a seemingly straightforward but cunningly devised question. His purpose was to test whether the candidates understood the essentials of nuclear strategy.
A moderator given to plain speaking might have said this: “Explain why the United States keeps such a large arsenal of nuclear weapons and when you might consider using those weapons.”
What Holt actually said was: “On nuclear weapons, President Obama reportedly considered changing the nation’s longstanding policy on first use. Do you support the current policy?”
The framing of the question posited no small amount of knowledge on the part of the two candidates. Specifically, it assumed that Donald Trump and Hillary Clinton each possess some familiarity with the longstanding policy to which Holt referred and with the modifications that Obama had contemplated making to it.
If you will permit the equivalent of a commercial break as this piece begins, let me explain why I’m about to parse in detail each candidate’s actual answer to Holt’s question. Amid deep dives into, and expansive punditry regarding, issues like how “fat” a former Miss Universe may have been and how high an imagined future wall on our southern border might prove to be, national security issues likely to test the judgment of a commander-in-chief have received remarkably little attention. So indulge me. This largely ignored moment in last week’s presidential debate is worth examining.
With regard to the issue of “first use,” every president since Harry Truman has subscribed to the same posture: the United States retains the prerogative of employing nuclear weapons to defend itself and its allies against even nonnuclear threats. In other words, as a matter of policy, the United States rejects the concept of “no first use,” which would prohibit any employment of nuclear weapons except in retaliation for a nuclear attack. According to press reports, President Obama had toyed with but then rejected the idea of committing the United States to a “no first use” posture. Holt wanted to know where the two candidates aspiring to succeed Obama stood on the matter.
Cruelly, the moderator invited Trump to respond first. The look in the Republican nominee’s eyes made it instantly clear that Holt could have been speaking Farsi for all he understood. A lesser candidate might then have begun with the nuclear equivalent of “What is Aleppo?”
Yet Trump being Trump, he gamely — or naively — charged headlong into the ambush that Holt had carefully laid, using his allotted two minutes to offer his insights into how as president he would address the nuclear conundrum that previous presidents had done so much to create. The result owed less to early Cold War thinkers-of-the-unthinkable like Herman Kahn or Albert Wohlstetter, who created the field of nuclear strategy, than to Dr. Strangelove. Make that Dr. Strangelove on meth.
Trump turned first to Russia, expressing concern that it might be gaining an edge in doomsday weaponry. “They have a much newer capability than we do,” he said. “We have not been updating from the new standpoint.” The American bomber fleet in particular, he added, needs modernization. Presumably referring to the recent employment of Vietnam-era bombers in the wars in Afghanistan, Iraq, and Syria, he continued somewhat opaquely, “I looked the other night. I was seeing B-52s, they’re old enough that your father, your grandfather, could be flying them. We are not — we are not keeping up with other countries.”
Trump then professed an appreciation for the awfulness of nuclear weaponry. “I would like everybody to end it, just get rid of it. But I would certainly not do first strike. I think that once the nuclear alternative happens, it’s over.”
Give Trump this much: even in a field that tends to favor abstraction and obfuscating euphemisms like “fallout” or “dirty bomb,” classifying Armageddon as the “nuclear alternative” represents something of a contribution.
Still, it’s worth noting that, in the arcane theology of nuclear strategy, “first strike” and “first use” are anything but synonymous. “First strike” implies a one-sided, preventive war of annihilation. The logic of a first strike, such as it is, is based on the calculation that a surprise nuclear attack could inflict the “nuclear alternative” on your adversary, while sparing your own side from suffering a comparable fate. A successful first strike would be a one-punch knockout, delivered while your opponent still sits in his corner of the ring.
Yet whatever reassurance was to be found in Trump’s vow never to order a first strike — not the question Lester Holt was asking — was immediately squandered. The Republican nominee promptly revoked his “no first strike” pledge by insisting, in a cliché much favored in Washington, that “I can’t take anything off the table.”
Piling non sequitur upon non sequitur, he next turned to the threat posed by a nuclear-armed North Korea, where “we’re doing nothing.” Yet, worrisome as this threat might be, keeping Pyongyang in check, he added, ought to be Beijing’s job. “China should solve that problem for us,” he insisted. “China should go into North Korea. China is totally powerful as it relates to North Korea.”
If China wouldn’t help with North Korea, however, what could be more obvious than that Iran, many thousands of miles away, should do so — and might have, if only President Obama had incorporated the necessary proviso into the Iran nuclear deal. “Iran is one of their biggest trading partners. Iran has power over North Korea.” When the Obama administration “made that horrible deal with Iran, they should have included the fact that they do something with respect to North Korea.” But why stop with North Korea? Iran “should have done something with respect to Yemen and all these other places,” he continued, wandering into the nonnuclear world. U.S. negotiators suitably skilled in the Trumpian art of the deal, he implied, could easily have maneuvered Iran into solving such problems on Washington’s behalf.
Veering further off course, Trump then took a passing swipe at Secretary of State John Kerry: “Why didn’t you add other things into the deal?” Why, in “one of the great giveaways of all time,” did the Obama administration fork over $400 million in cash? At which point, he promptly threw in another figure without the slightest explanation — “It was actually $1.7 billion in cash” — in “one of the worst deals ever made by any country in history.”
Trump then wrapped up his meandering tour d’horizon by decrying the one action of the Obama administration that arguably has reduced the prospect of nuclear war, at least in the near future. “The deal with Iran will lead to nuclear problems,” he stated with conviction. “All they have to do is sit back 10 years, and they don’t have to do much. And they’re going to end up getting nuclear.” For proof, he concluded, talk to the Israelis. “I met with Bibi Netanyahu the other day,” he added for no reason in particular. “Believe me, he’s not a happy camper.”
On this indecipherable note, his allotted time exhausted, Trump’s recitation ended. In its way, it had been a Joycean performance.
Bridge Over Troubled Waters?
It was now Clinton’s turn to show her stuff. If Trump had responded to Holt like a voluble golf caddy being asked to discuss the finer points of ice hockey, Hillary Clinton chose a different course: she changed the subject. She would moderate her own debate. Perhaps Trump thought Holt was in charge of the proceedings; Clinton knew better.
What followed was vintage Clinton: vapid sentiments, smoothly delivered in the knowing tone of a seasoned Washington operative. During her two minutes, she never came within a country mile of discussing the question Holt had asked or the thoughts she evidently actually has about nuclear issues.
“[L]et me start by saying, words matter,” she began. “Words matter when you run for president. And they really matter when you are president. And I want to reassure our allies in Japan and South Korea and elsewhere that we have mutual defense treaties and we will honor them.”
It was as if Clinton were already speaking from the Oval Office. Trump had addressed his remarks to Lester Holt. Clinton directed hers to the nation at large, to people the world over, indeed to history itself. Warming to her task, she was soon rolling out the sort of profundities that play well at the Brookings Institution, the Carnegie Endowment, or the Council on Foreign Relations, causing audiences to nod — or nod off.
“It is essential that America’s word be good,” Clinton continued. “And so I know that this campaign has caused some questioning and worries on the part of many leaders across the globe. I’ve talked with a number of them. But I want to — on behalf of myself, and I think on behalf of a majority of the American people, say that, you know, our word is good.”
Then, after inserting a tepid, better-than-nothing endorsement of the Iran nuclear deal, she hammered Trump for not offering an alternative. “Would he have started a war? Would he have bombed Iran?” If you’re going to criticize, she pointed out, you need to offer something better. Trump never does, she charged. “It’s like his plan to defeat ISIS. He says it’s a secret plan, but the only secret is that he has no plan.”
With that, she reverted to platitudes. “So we need to be more precise in how we talk about these issues. People around the world follow our presidential campaigns so closely, trying to get hints about what we will do. Can they rely on us? Are we going to lead the world with strength and in accordance with our values? That’s what I intend to do. I intend to be a leader of our country that people can count on, both here at home and around the world, to make decisions that will further peace and prosperity, but also stand up to bullies, whether they’re abroad or at home.”
Like Trump, she offered no specifics. Which bullies? Where? How? In what order? Would she start with Russia’s Putin? North Korea’s Kim Jong-un? Perhaps Rodrigo Duterte of the Philippines? How about Turkey’s Recep Tayyip Erdogan? Or Bibi?
In contrast to Trump, however, Clinton did speak in complete sentences, which followed one another in an orderly fashion. She thereby came across as at least nominally qualified to govern the country, much like, say, Warren G. Harding nearly a century ago. And what worked for Harding in 1920 may well work for Clinton in 2016.
Of Harding’s speechifying, H.L. Mencken wrote at the time, “It reminds me of a string of wet sponges.” Mencken characterized Harding’s rhetoric as “so bad that a sort of grandeur creeps into it. It drags itself out of the dark abysm of pish, and crawls insanely up the topmost pinnacle of posh. It is rumble and bumble. It is flap and doodle. It is balder and dash.” So, too, with Hillary Clinton. She is our Warren G. Harding. In her oratory, flapdoodle and balderdash live on.
The National Security Void
If I’ve taxed your patience by recounting this non-debate and non-discussion of nuclear first use, it’s to make a larger point. The absence of relevant information elicited by Lester Holt’s excellent question speaks directly to what has become a central flaw in this entire presidential campaign: the dearth of attention given to matters basic to U.S. national security policy.
In the nuclear arena, the issue of first use is only one of several on which anyone aspiring to become the next commander-in-chief should be able to offer an informed judgment. Others include questions such as these:
- What is the present-day justification for maintaining the U.S. nuclear “triad,” a strike force consisting of manned bombers, land-based ballistic missiles, and submarine-launched ballistic missiles?
- Why is the Pentagon embarking upon a decades-long, trillion-dollar program to modernize that triad, fielding a new generation of bombers, missiles, and submarines along with an arsenal of new warheads? Is that program necessary?
- How do advances in non-nuclear weaponry — for example, in the realm of cyberwarfare — affect theories of nuclear deterrence devised by the likes of Kahn and Wohlstetter during the 1950s and 1960s? Does the logic of those theories still pertain?
Beyond the realm of nuclear strategy, there are any number of other security-related questions about which the American people deserve to hear directly from both Trump and Clinton, testing their knowledge of the subject matter and the quality of their judgments. Among such matters, one in particular screams out for attention. Consider it the question that Washington has declared off-limits: What lessons should be drawn from America’s costly and disappointing post-9/11 wars and how should those lessons apply to future policy?
With Election Day now merely a month away, there is no more reason to believe that such questions will receive serious consideration than to expect Trump to come clean on his personal finances or Clinton to release the transcripts of her handsomely compensated Goldman Sachs speeches.
When outcomes don’t accord with his wishes, Trump reflexively blames a “rigged” system. But a system that makes someone like Trump a finalist for the presidency isn’t rigged. It is manifestly absurd, a fact that has left most of the national media grasping wildly for explanations (albeit none that tag them with having facilitated the transformation of politics into theater).
I’ll take a backseat to no one in finding Trump unfit to serve as president. Yet beyond the outsized presence of one particular personality, the real travesty of our predicament lies elsewhere — in the utter shallowness of our political discourse, no more vividly on display than in the realm of national security.
What do our presidential candidates talk about when they don’t want to talk about nuclear war? The one, in a vain effort to conceal his own ignorance, offers rambling nonsense. The other, accustomed to making her own rules, simply changes the subject.
The American people thereby remain in darkness. On that score, Trump, Clinton, and the parties they represent are not adversaries. They are collaborators.
What We Talk About When We Don’t Want to Talk About Nuclear War
My earliest recollection of national politics dates back exactly 60 years to the moment, in the summer of 1956, when I watched the political conventions in the company of that wondrous new addition to our family, television. My parents were supporting President Dwight D. Eisenhower for a second term and that was good enough for me. Even as a youngster, I sensed that Ike, the former supreme commander of allied forces in Europe in World War II, was someone of real stature. In a troubled time, he exuded authority and self-confidence. By comparison, Democratic candidate Adlai Stevenson came across as vaguely suspect. Next to the five-star incumbent, he seemed soft, even foppish, and therefore not up to the job. So at least it appeared to a nine-year-old living in Chicagoland.
Of the seamy underside of politics I knew nothing, of course. On the surface, all seemed reassuring. As if by divine mandate, two parties vied for power. The views they represented defined the allowable range of opinion. The outcome of any election expressed the collective will of the people and was to be accepted as such. That I was growing up in the best democracy the world had ever known — its very existence a daily rebuke to the enemies of freedom — was beyond question.
Naïve? Embarrassingly so. Yet how I wish that Election Day in November 2016 might present Americans with something even loosely approximating the alternatives available to them in November 1956. Oh, to choose once more between an Ike and an Adlai.
Don’t for a second think that this is about nostalgia. Today, Stevenson doesn’t qualify for anyone’s list of Great Americans. If remembered at all, it’s for his sterling performance as President John F. Kennedy’s U.N. ambassador during the Cuban Missile Crisis. Interrogating his Soviet counterpart with cameras rolling, Stevenson barked that he was prepared to wait “until hell freezes over” to get his questions answered about Soviet military activities in Cuba. When the chips were down, Adlai proved anything but soft. Yet in aspiring to the highest office in the land, he had come up well short. In 1952, he came nowhere close to winning and in 1956 he proved no more successful. Stevenson was to the Democratic Party what Thomas Dewey had been to the Republicans: a luckless two-time loser.
As for Eisenhower, although there is much in his presidency to admire, his errors of omission and commission were legion. During his two terms, from Guatemala to Iran, the CIA overthrew governments, plotted assassinations, and embraced unsavory right-wing dictators — in effect, planting a series of IEDs destined eventually to blow up in the face of Ike’s various successors. Meanwhile, binging on nuclear weapons, the Pentagon accumulated an arsenal far beyond what even Eisenhower as commander-in-chief considered prudent or necessary.
In addition, during his tenure in office, the military-industrial complex became a rapacious juggernaut, an entity unto itself as Ike himself belatedly acknowledged. By no means least of all, Eisenhower fecklessly committed the United States to an ill-fated project of nation-building in a country that just about no American had heard of at the time: South Vietnam. Ike did give the nation eight years of relative peace and prosperity, but at a high price — most of the bills coming due long after he left office.
The Pathology of American Politics
And yet, and yet…
To contrast the virtues and shortcomings of Stevenson and Eisenhower with those of Hillary Rodham Clinton and Donald Trump is both instructive and profoundly depressing. Comparing the adversaries of 1956 with their 2016 counterparts reveals with startling clarity what the decades-long decay of American politics has wrought.
In 1956, each of the major political parties nominated a grown-up for the highest office in the land. In 2016, only one has.
In 1956, both parties nominated likeable individuals who conveyed a basic sense of trustworthiness. In 2016, neither party has done so.
In 1956, Americans could count on the election to render a definitive verdict, the vote count affirming the legitimacy of the system itself and allowing the business of governance to resume. In 2016, that is unlikely to be the case. Whether Trump or Clinton ultimately prevails, large numbers of Americans will view the result as further proof of “rigged” and irredeemably corrupt political arrangements. Rather than inducing some semblance of reconciliation, the outcome is likely to deepen divisions.
How in the name of all that is holy did we get into such a mess?
How did the party of Eisenhower, an architect of victory in World War II, choose as its nominee a narcissistic TV celebrity who, with each successive Tweet and verbal outburst, offers further evidence that he is totally unequipped for high office? Yes, the establishment media are ganging up on Trump, blatantly displaying the sort of bias normally kept at least nominally under wraps. Yet never have such expressions of journalistic hostility toward a particular candidate been more justified. Trump is a bozo of such monumental proportions as to tax the abilities of our most talented satirists. Were he alive today, Mark Twain at his most scathing would be hard-pressed to do justice to The Donald’s blowhard pomposity.
Similarly, how did the party of Adlai Stevenson, but also of Stevenson’s hero Franklin Roosevelt, select as its candidate someone so widely disliked and mistrusted even by many of her fellow Democrats? True, antipathy directed toward Hillary Clinton draws some of its energy from incorrigible sexists along with the “vast right wing conspiracy” whose members thoroughly loathe both Clintons. Yet the antipathy is not without basis in fact.
Even by Washington standards, Secretary Clinton exudes a striking sense of entitlement combined with a nearly complete absence of accountability. She shrugs off her misguided 2002 vote, cast while serving as senator from New York, in support of invading Iraq. She neither explains nor apologizes for pressing to depose Libya’s Muammar Gaddafi in 2011, her most notable “accomplishment” as secretary of state. “We came, we saw, he died,” she bragged back then, somewhat prematurely given that Libya has since fallen into anarchy and become a haven for ISIS.
She clings to the demonstrably false claim that her use of a private server for State Department business compromised no classified information. Now opposed to the Trans-Pacific Partnership (TPP) that she once described as the “gold standard in trade agreements,” Clinton rejects charges of political opportunism. That her change of heart occurred when attacking the TPP was helping Bernie Sanders win one Democratic primary after another is merely coincidental. Oh, and the big money accepted from banks and Wall Street as well as the tech sector for minimal work and the bigger money still from leading figures in the Israel lobby? Rest assured that her acceptance of such largesse won’t reduce by one iota her support for “working class families” or her commitment to a just peace settlement in the Middle East.
Let me be clear: none of these offer the slightest reason to vote for Donald Trump. Yet together they make the point that Hillary Clinton is a deeply flawed candidate, notably so in matters related to national security. Clinton is surely correct that allowing Trump to make decisions related to war and peace would be the height of folly. Yet her record in that regard does not exactly inspire confidence.
When it comes to foreign policy, Trump’s preference for off-the-cuff utterances finds him committing astonishing gaffes with metronomic regularity. Spontaneity serves chiefly to expose his staggering ignorance.
By comparison, the carefully scripted Clinton commits few missteps, as she recites with practiced ease the pabulum that passes for right thinking in establishment circles. But fluency does not necessarily connote soundness. Clinton, after all, adheres resolutely to the highly militarized “Washington playbook” that President Obama himself has disparaged — a faith-based belief in American global primacy to be pursued regardless of how the world may be changing and heedless of costs.
On the latter point, note that Clinton’s acceptance speech in Philadelphia included not a single mention of Afghanistan. By Election Day, the war there will have passed its 15th anniversary. One might think that a prospective commander-in-chief would have something to say about the longest conflict in American history, one that continues with no end in sight. Yet, with the Washington playbook offering few answers, Mrs. Clinton chooses to remain silent on the subject.
So while a Trump presidency holds the prospect of the United States driving off a cliff, a Clinton presidency promises to be the equivalent of banging one’s head against a brick wall without evident effect, wondering all the while why it hurts so much.
Pseudo-Politics for an Ersatz Era
But let’s not just blame the candidates. Trump and Clinton are also the product of circumstances that neither created. As candidates, they are merely exploiting a situation — one relying on intuition and vast stores of brashness, the other putting to work skills gained during a life spent studying how to acquire and employ power. The success both have achieved in securing the nominations of their parties is evidence of far more fundamental forces at work.
In the pairing of Trump and Clinton, we confront symptoms of something pathological. Unless Americans identify the sources of this disease, it will inevitably worsen, with dire consequences in the realm of national security. After all, back in Eisenhower’s day, the IEDs planted thanks to reckless presidential decisions tended to blow up only years — or even decades — later. For example, between the 1953 U.S.-engineered coup that restored the Shah to his throne and the 1979 revolution that converted Iran overnight from ally to adversary, more than a quarter of a century elapsed. In our own day, however, detonation occurs so much more quickly — witness the almost instantaneous and explosively unhappy consequences of Washington’s post-9/11 military interventions in the Greater Middle East.
So here’s a matter worth pondering: How is it that all the months of intensive fundraising, the debates and speeches, the caucuses and primaries, the avalanche of TV ads and annoying robocalls have produced two presidential candidates who tend to elicit from a surprisingly large number of rank-and-file citizens disdain, indifference, or at best hold-your-nose-and-pull-the-lever acquiescence?
Here, then, is a preliminary diagnosis of three of the factors contributing to the erosion of American politics, offered from the conviction that, for Americans to have better choices next time around, fundamental change must occur — and soon.
First, and most important, the evil effects of money: Need chapter and verse? For a tutorial, see this essential 2015 book by Professor Lawrence Lessig of Harvard: Republic, Lost: Version 2.0. Those with no time for books might spare 18 minutes for Lessig’s brilliant and deeply disturbing TED talk. Professor Lessig argues persuasively that unless the United States radically changes the way it finances political campaigns, we’re pretty much doomed to see our democracy wither and die.
Needless to say, moneyed interests and incumbents who benefit from existing arrangements take a different view and collaborate to maintain the status quo. As a result, political life has increasingly become a pursuit reserved for those like Trump who possess vast personal wealth or for those like Clinton who display an aptitude for persuading the well-to-do to open their purses, with all that implies by way of compromise, accommodation, and the subsequent repayment of favors.
Second, the perverse impact of identity politics on policy: Observers make much of the fact that, in capturing the presidential nomination of a major party, Hillary Clinton has shattered yet another glass ceiling. They are right to do so. Yet the novelty of her candidacy starts and ends with gender. When it comes to fresh thinking, Donald Trump has far more to offer than Clinton — even if his version of “fresh” tends to be synonymous with wacky, off-the-wall, ridiculous, or altogether hair-raising.
The essential point here is that, in the realm of national security, Hillary Clinton is utterly conventional. She subscribes to a worldview (and view of America’s role in the world) that originated during the Cold War, reached its zenith in the 1990s when the United States proclaimed itself the planet’s “sole superpower,” and persists today remarkably unaffected by actual events. On the campaign trail, Clinton attests to her bona fides by routinely reaffirming her belief in American exceptionalism, paying fervent tribute to “the world’s greatest military,” swearing that she’ll be “listening to our generals and admirals,” and vowing to get tough on America’s adversaries. These are, of course, the mandatory rituals of the contemporary Washington stump speech, amplified if anything by the perceived need for the first female candidate for president to emphasize her pugnacity.
A Clinton presidency, therefore, offers the prospect of more of the same — muscle-flexing and armed intervention to demonstrate American global leadership — albeit marketed with a garnish of diversity. Instead of different policies, Clinton will offer an administration that has a different look, touting this as evidence of positive change.
Yet while diversity may be a good thing, we should not confuse it with effectiveness. A national security team that “looks like America” (to use the phrase originally coined by Bill Clinton) does not necessarily govern more effectively than one that looks like President Eisenhower’s. What matters is getting the job done.
Since the 1990s women have found plentiful opportunities to fill positions in the upper echelons of the national security apparatus. Although we have not yet had a female commander-in-chief, three women have served as secretary of state and two as national security adviser. Several have filled Adlai Stevenson’s old post at the United Nations. Undersecretaries, deputy undersecretaries, and assistant secretaries of like gender abound, along with a passel of female admirals and generals.
So the question needs to be asked: Has the quality of national security policy improved compared to the bad old days when men exclusively called the shots? Using as criteria the promotion of stability and the avoidance of armed conflict (along with the successful prosecution of wars deemed unavoidable), the answer would, of course, have to be no. Although Madeleine Albright, Condoleezza Rice, Susan Rice, Samantha Power, and Clinton herself might entertain a different view, actually existing conditions in Afghanistan, Iraq, Libya, Syria, Somalia, Sudan, Yemen, and other countries across the Greater Middle East and significant parts of Africa tell a different story.
The abysmal record of American statecraft in recent years is not remotely the fault of women; yet neither have women made a perceptibly positive difference. It turns out that identity does not necessarily signify wisdom or assure insight. Allocating positions of influence in the State Department or the Pentagon based on gender, race, ethnicity, or sexual orientation — as Clinton will assuredly do — may well gratify previously disenfranchised groups. Little evidence exists to suggest that doing so will produce more enlightened approaches to statecraft, at least not so long as adherence to the Washington playbook figures as a precondition to employment. (Should Clinton win in November, don’t expect the redoubtable ladies of Code Pink to be tapped for jobs at the Pentagon and State Department.)
In the end, it’s not identity that matters but ideas and their implementation. To contemplate the ideas that might guide a President Trump along with those he will recruit to act on them — Ivanka as national security adviser? — is enough to elicit shudders from any sane person. Yet the prospect of Madam President surrounding herself with an impeccably diverse team of advisers who share her own outmoded views is hardly cause for celebration.
Putting a woman in charge of national security policy will not in itself amend the defects exhibited in recent years. For that, the obsolete principles with which Clinton along with the rest of Washington remains enamored will have to be jettisoned. In his own bizarre way (albeit without a clue as to a plausible alternative), Donald Trump seems to get that; Hillary Clinton does not.
Third, the substitution of “reality” for reality: Back in 1962, a young historian by the name of Daniel Boorstin published The Image: A Guide to Pseudo-Events in America. In an age in which Donald Trump and Hillary Clinton vie to determine the nation’s destiny, it should be mandatory reading. The Image remains, as when it first appeared, a fire bell ringing in the night.
According to Boorstin, more than five decades ago the American people were already living in a “thicket of unreality.” By relentlessly indulging in ever more “extravagant expectations,” they were forfeiting their capacity to distinguish between what was real and what was illusory. Indeed, Boorstin wrote, “We have become so accustomed to our illusions that we mistake them for reality.”
While ad agencies and PR firms had indeed vigorously promoted a world of illusions, Americans themselves had become willing accomplices in the process.
“The American citizen lives in a world where fantasy is more real than reality, where the image has more dignity than its original. We hardly dare to face our bewilderment, because our ambiguous experience is so pleasantly iridescent, and the solace of belief in contrived reality is so thoroughly real. We have become eager accessories to the great hoaxes of the age. These are the hoaxes we play on ourselves.”
This, of course, was decades before the nation succumbed to the iridescent allure of Facebook, Google, fantasy football, “Real Housewives of _________,” selfies, smartphone apps, Game of Thrones, Pokémon GO — and, yes, the vehicle that vaulted Donald Trump to stardom, The Apprentice.
“The making of the illusions which flood our experience has become the business of America,” wrote Boorstin. It’s also become the essence of American politics, long since transformed into theater, or rather into some sort of (un)reality show.
Presidential campaigns today are themselves, to use Boorstin’s famous term, “pseudo-events” that stretch from months into years. By now, most Americans know better than to take at face value anything candidates say or promise along the way. We’re in on the joke — or at least we think we are. Reinforcing that perception on a daily basis are media outlets that have abandoned mere reporting in favor of enhancing the spectacle of the moment. This is especially true of the cable news networks, where talking heads serve up a snide and cynical complement to the smarmy fakery that is the office-seeker’s stock in trade. And we lap it up. It matters little that we know it’s all staged and contrived, as long as — a preening Megyn Kelly getting under Trump’s skin, Trump himself denouncing “lyin’ Ted” Cruz, etc., etc. — it’s entertaining.
This emphasis on spectacle has drained national politics of whatever substance it still had back when Ike and Adlai commanded the scene. It hardly need be said that Donald Trump has demonstrated an extraordinary knack — a sort of post-modern genius — for turning this phenomenon to his advantage. Yet in her own way Clinton plays the same game. How else to explain a national convention organized around the idea of “reintroducing to the American people” someone who served eight years as First Lady, was elected to the Senate, failed in a previous high-profile run for the presidency, and completed a term as secretary of state? The just-ended conclave in Philadelphia was, like the Republican one that preceded it, a pseudo-event par excellence, the object of the exercise being to fashion a new “image” for the Democratic candidate.
The thicket of unreality that is American politics has now become all-enveloping. The problem is not Trump and Clinton, per se. It’s an identifiable set of arrangements — laws, habits, cultural predispositions — that have evolved over time and promoted the rot that now pervades American politics. As a direct consequence, the very concept of self-government is increasingly a fantasy, even if surprisingly few Americans seem to mind.
At an earlier juncture back in 1956, out of a population of 168 million, we got Ike and Adlai. Today, with almost double the population, we get — well, we get what we’ve got. This does not represent progress. And don’t kid yourself that things really can’t get much worse. Unless Americans rouse themselves to act, count on it, they will.
Copyright 2016 Andrew J. Bacevich
The Decay of American Politics
We have it on highest authority: the recent killing of Taliban leader Mullah Akhtar Muhammad Mansour by a U.S. drone strike in Pakistan marks “an important milestone.” So the president of the United States has declared, with that claim duly echoed and implicitly endorsed by media commentary — the New York Times reporting, for example, that Mansour’s death leaves the Taliban leadership “shocked” and “shaken.”
But a question remains: A milestone toward what exactly?
Toward victory? Peace? Reconciliation? At the very least, toward the prospect of the violence abating? Merely posing the question is to imply that U.S. military efforts in Afghanistan and elsewhere in the Islamic world serve some larger purpose.
Yet for years now that has not been the case. The assassination of Mansour instead joins a long list of previous milestones, turning points, and landmarks briefly heralded as significant achievements only to prove much less than advertised.
One imagines that Obama himself understands this perfectly well. Just shy of five years ago, he was urging Americans to “take comfort in knowing that the tide of war is receding.” In Iraq and Afghanistan, the president insisted, “the light of a secure peace can be seen in the distance.”
“These long wars,” he promised, were finally coming to a “responsible end.” We were, that is, finding a way out of Washington’s dead-end conflicts in the Greater Middle East.
Who can doubt Obama’s sincerity, or question his oft-expressed wish to turn away from war and focus instead on unattended needs here at home? But wishing is the easy part. Reality has remained defiant. Even today, the wars in Iraq and Afghanistan that George W. Bush bequeathed to Obama show no sign of ending.
Like Bush, Obama will bequeath to his successor wars he failed to finish. Less remarked upon, he will also pass along to President Clinton or President Trump new wars that are his own handiwork. In Libya, Somalia, Yemen, and several other violence-wracked African nations, the Obama legacy is one of ever-deepening U.S. military involvement. The almost certain prospect of a further accumulation of briefly celebrated and quickly forgotten “milestones” beckons.
During the Obama era, the tide of war has not receded. Instead, Washington finds itself drawn ever deeper into conflicts that, once begun, become interminable — wars for which the vaunted U.S. military has yet to devise a plausible solution.
The Oldest (Also Latest) Solution: Bombs Away
Once upon a time, during the brief, if heady, interval between the end of the Cold War and 9/11 when the United States ostensibly reigned supreme as the world’s “sole superpower,” Pentagon field manuals credited U.S. forces with the ability to achieve “quick, decisive victory — on and off the battlefield — anywhere in the world and under virtually any conditions.” Bold indeed (if not utterly delusional) would be the staff officer willing to pen such words today.
To be sure, the United States military routinely demonstrates astonishing technical prowess — putting a pair of Hellfire missiles through the roof of the taxi in which Mansour was riding, for example. Yet if winning — that is, ending wars on conditions favorable to our side — offers the measure of merit by which to judge a nation’s military forces, then when put to the test ours have been found wanting.
Not for lack of trying, of course. In their quest for a formula that might actually accomplish the mission, those charged with directing U.S. military efforts in the Greater Middle East have demonstrated notable flexibility. They have employed overwhelming force and “shock and awe.” They have tried regime change (bumping off Saddam Hussein and Muammar Gaddafi, for example) and “decapitation” (assassinating Mansour and a host of other militant leaders, including Osama bin Laden). They have invaded and occupied countries, even giving military-style nation-building a whirl. They have experimented with counterinsurgency and counterterrorism, peacekeeping and humanitarian intervention, retaliatory strikes and preventive war. They have operated overtly, covertly, and through proxies. They have equipped, trained, and advised — and when the beneficiaries of these exertions have folded in the face of the enemy, they have equipped, trained, and advised some more. They have converted American reservists into quasi-regulars, subject to repeated combat tours. In imitation of the corporate world, they have outsourced as well, handing over to profit-oriented “private security” firms functions traditionally performed by soldiers. In short, they have labored doggedly to translate American military power into desired political outcomes.
In this one respect at least, an endless parade of three- and four-star generals exercising command in various theaters over the past several decades have earned high marks. In terms of effort, they deserve an A.
As measured by outcomes, however, they fall well short of a passing grade. However commendable their willingness to cast about for some method that might actually work, they have ended up waging a war of attrition. Strip away the light-at-the-end-of-the-tunnel reassurances regularly heard at Pentagon press briefings or in testimony presented on Capitol Hill and America’s War for the Greater Middle East proceeds on this unspoken assumption: if we kill enough people for a long enough period of time, the other side will eventually give in.
On that score, the prevailing Washington gripe directed at Commander-in-Chief Obama is that he has not been willing to kill enough. Take, for example, a recent Wall Street Journal op-ed penned by that literary odd couple, retired General David Petraeus and Brookings Institution analyst Michael O’Hanlon, that appeared under the pugnacious headline “Take the Gloves Off Against the Taliban.” To turn around the longest war in American history, Petraeus and O’Hanlon argue, the United States just needs to drop more bombs.
The rules of engagement currently governing air operations in Afghanistan are, in their view, needlessly restrictive. Air power “represents an asymmetric Western advantage, relatively safe to apply, and very effective.” (The piece omits any mention of incidents such as the October 2015 destruction of a Doctors Without Borders hospital in the Afghan provincial capital of Kunduz by a U.S. Air Force gunship.) More ordnance will surely produce “some version of victory.” The path ahead is clear. “Simply waging the Afghanistan air-power campaign with the vigor we are employing in Iraq and Syria,” the authors write with easy assurance, should do the trick.
When armchair generals cite the ongoing U.S. campaign in Iraq and Syria as a model of effectiveness, you know that things must be getting desperate.
Granted, Petraeus and O’Hanlon are on solid ground in noting that as the number of U.S. and NATO troops in Afghanistan has decreased, so, too, has the number of air strikes targeting the Taliban. Back when more allied boots were on the ground, more allied planes were, of course, overhead. And yet the 100,000 close-air-support sorties flown between 2011 and 2015 — that’s more than one sortie per Taliban fighter — did not, alas, yield “some version of victory.” In short, we’ve already tried the Petraeus-O’Hanlon take-the-gloves-off approach to defeating the Taliban. It didn’t work. With the Afghanistan War’s 15th anniversary now just around the corner, to suggest that we can bomb our way to victory there is towering nonsense.
In Washington, Big Thinking and Small
Petraeus and O’Hanlon characterize Afghanistan as “the eastern bulwark in our broader Middle East fight.” Eastern sinkhole might be a more apt description. Note, by the way, that they have nothing useful to say about the “broader fight” to which they allude. Yet that broader fight — undertaken out of the conviction, still firmly in place today, that American military assertiveness can somehow repair the Greater Middle East — is far more deserving of attention than how to employ very expensive airplanes against insurgents armed with inexpensive Kalashnikovs.
To be fair, in silently passing over the broader fight, Petraeus and O’Hanlon are hardly alone. On this subject no one has much to say — not other stalwarts of the onward-to-victory school, nor officials presently charged with formulating U.S. national security policy, nor members of the Washington commentariat eager to pontificate about almost anything. Worst of all, the subject is one on which each of the prospective candidates for the presidency is mum.
From Secretary of Defense Ashton Carter and Chairman of the Joint Chiefs of Staff General Joseph Dunford on down to the lowliest blogger, opinions about how best to wage a particular campaign in that broader fight are readily available. Need a plan for rolling back the Islamic State? Glad you asked. Concerned about that new ISIS franchise in Libya? Got you covered. Boko Haram? Here’s what you need to know. Losing sleep over al-Shabab? Take heart — big thinkers are on the case.
As to the broader fight itself, however, no one has a clue. Indeed, it seems fair to say that merely defining our aims in that broader fight, much less specifying the means to achieve them, heads the list of issues that people in Washington studiously avoid. Instead, they prattle endlessly about the Taliban and ISIS and Boko Haram and al-Shabab.
Here’s the one thing you need to know about the broader fight: there is no strategy. None. Zilch. We’re on a multi-trillion-dollar bridge to nowhere, with members of the national security establishment more or less content to see where it leads.
May I suggest that we find ourselves today in what might be called a Khe Sanh moment? Older readers will recall that back in late 1967 and early 1968 in the midst of the Vietnam War, one particular question gripped the national security establishment and those paid to attend to its doings: Can Khe Sanh hold?
Now almost totally forgotten, Khe Sanh was then a battlefield as well known to Americans as Fallujah was to become in our own day. Located in the northern part of South Vietnam, it was the site of a besieged and outnumbered Marine garrison, surrounded by two full enemy divisions. In the eyes of some observers, the outcome of the Vietnam War appeared to hinge on the ability of the Marines there to hold out — to avoid the fate that had befallen the French garrison at Dien Bien Phu slightly more than a decade earlier. For France, the fall of Dien Bien Phu had indeed spelled final defeat in Indochina.
Was history about to repeat itself at Khe Sanh? As it turned out, no… and yes.
The Marines did hold — a milestone! — and the United States lost the war anyway.
In retrospect, it seems pretty clear that those responsible for formulating U.S. policy back then fundamentally misconstrued the problem at hand. Rather than worrying about the fate of Khe Sanh, they ought to have been asking questions like these: Is the Vietnam War winnable? Does it even make sense? If not, why are we there? And above all, does no alternative exist to simply pressing on with a policy that shows no signs of success?
Today the United States finds itself in a comparable situation. What to do about the Taliban or ISIS is not a trivial question. Much the same can be said regarding the various other militant organizations with which U.S. forces are engaged in a variety of countries — many now failing states — across the Greater Middle East.
But the question of how to take out organization X or put country Y back together pales in comparison with the other questions that should by now have come to the fore but haven’t. Among the most salient are these: Does waging war across a large swath of the Islamic world make sense? When will this broader fight end? What will it cost? Short of reducing large parts of the Middle East to rubble, is that fight winnable in any meaningful sense? Above all, does the world’s most powerful nation have no other choice but to persist in pursuing a manifestly futile endeavor?
Try this thought experiment. Imagine the opposing candidates in a presidential campaign each refusing to accept war as the new normal. Imagine them actually taking stock of the broader fight that’s been ongoing for decades now. Imagine them offering alternatives to armed conflicts that just drag on and on. Now that would be a milestone.
Milestones (Or What Passes for Them in Washington)
Copyright 2016 Andrew Bacevich
Let’s face it: in times of war, the Constitution tends to take a beating. With the safety or survival of the nation said to be at risk, the basic law of the land — otherwise considered sacrosanct — becomes nonbinding, subject to being waived at the whim of government authorities who are impatient, scared, panicky, or just plain pissed off.
The examples are legion. During the Civil War, Abraham Lincoln arbitrarily suspended the writ of habeas corpus and ignored court orders that took issue with his authority to do so. After U.S. entry into World War I, the administration of Woodrow Wilson mounted a comprehensive effort to crush dissent, shutting down anti-war publications in complete disregard of the First Amendment. Amid the hysteria triggered by Pearl Harbor, Franklin Roosevelt issued an executive order consigning to concentration camps more than 100,000 Japanese-Americans, many of them native-born citizens. Asked in 1944 to review this gross violation of due process, the Supreme Court endorsed the government’s action by a 6-3 vote.
More often than not, the passing of the emergency induces second thoughts and even remorse. The further into the past a particular war recedes, the more dubious the wartime arguments for violating the Constitution appear. Americans thereby take comfort in the “lessons learned” that will presumably prohibit any future recurrence of such folly.
Even so, the onset of the next war finds the Constitution once more being ill-treated. We don’t repeat past transgressions, of course. Instead, we devise new ones. So it has been during the ongoing post-9/11 period of protracted war.
During the presidency of George W. Bush, the United States embraced torture as an instrument of policy in clear violation of the Eighth Amendment prohibiting cruel and unusual punishment. Bush’s successor, Barack Obama, ordered the extrajudicial killing of an American citizen, a death by drone that was visibly in disregard of the Fifth and Fourteenth Amendments. Both administrations — Bush’s with gusto, Obama’s with evident regret — imprisoned individuals for years on end without charge and without anything remotely approximating the “speedy and public trial, by an impartial jury” guaranteed by the Sixth Amendment. Should the present state of hostilities ever end, we can no doubt expect Guantánamo to become yet another source of “lessons learned” for future generations of rueful Americans.
Congress on the Sidelines
Yet one particular check-and-balance constitutional proviso now appears exempt from this recurring phenomenon of disregard followed by professions of dismay, embarrassment, and “never-again-ism” once the military emergency passes. I mean, of course, Article I, Section 8 of the Constitution, which assigns to Congress the authority “to declare war” and still stands as testimony to the genius of those who drafted it. There can be no question that the responsibility for deciding when and whether the United States should fight resides with the legislative branch, not the executive, and that this was manifestly the intent of the Framers.
On parchment at least, the division of labor appears straightforward. The president’s designation as commander-in-chief of the armed forces in no way implies a blanket authorization to employ those forces however he sees fit or anything faintly like it. Quite the contrary: legitimizing presidential command requires explicit congressional sanction.
Actual practice has evolved into something altogether different. The portion of Article I, Section 8, cited above has become a dead letter, about as operative as blue laws still on the books in some American cities and towns that purport to regulate Sabbath day activities. Superseding the written text is an unwritten counterpart that goes something like this: with legislators largely consigned to the status of observers, presidents pretty much wage war whenever, wherever, and however they see fit. Whether the result qualifies as usurpation or forfeiture is one of those chicken-and-egg questions that’s interesting but practically speaking beside the point.
This is by no means a recent development. It has a history. In the summer of 1950, when President Harry Truman decided that a U.N. Security Council resolution provided sufficient warrant for him to order U.S. forces to fight in Korea, congressional war powers took a hit from which they would never recover.
Congress soon thereafter bought into the notion, fashionable during the Cold War, that formal declarations of hostilities had become passé. Waging the “long twilight struggle” ostensibly required deference to the commander-in-chief on all matters related to national security. To sustain the pretense that it still retained some relevance, Congress took to issuing what were essentially permission slips, granting presidents maximum freedom of action to do whatever they might decide needed to be done in response to the latest perceived crisis.
The Tonkin Gulf Resolution of 1964 offers a notable example. With near unanimity, legislators urged President Lyndon Johnson “to take all necessary measures to repel any armed attack against the forces of the United States and to prevent further aggression” across the length and breadth of Southeast Asia. Through the magic of presidential interpretation, a mandate to prevent aggression provided legal cover for an astonishingly brutal and aggressive war in Vietnam, as well as Cambodia and Laos. Under the guise of repelling attacks on U.S. forces, Johnson and his successor, Richard Nixon, thrust millions of American troops into a war they could not win, even if more than 58,000 died trying.
To leap almost four decades ahead, think of the Authorization for Use of Military Force (AUMF) that was passed by Congress in the immediate aftermath of 9/11 as the grandchild of the Tonkin Gulf Resolution. This document required (directed, called upon, requested, invited, urged) President George W. Bush “to use all necessary and appropriate force against those nations, organizations, or persons he determines planned, authorized, committed, or aided the terrorist attacks that occurred on September 11, 2001, or harbored such organizations or persons, in order to prevent any future acts of international terrorism against the United States by such nations, organizations, or persons.” In plain language: here’s a blank check; feel free to fill it in any way you like.
As a practical matter, one specific individual — Osama bin Laden — had hatched the 9/11 plot. A single organization — al-Qaeda — had conspired to pull it off. And just one nation — backward, Taliban-controlled Afghanistan — had provided assistance, offering sanctuary to bin Laden and his henchmen. Yet nearly 15 years later, the AUMF remains operative and has become the basis for military actions against innumerable individuals, organizations, and nations with no involvement whatsoever in the murderous events of September 11, 2001.
Consider the following less than comprehensive list of four developments, all of which occurred just within the last month and a half:
*In Yemen, a U.S. airstrike killed at least 50 individuals, said to be members of an Islamist organization that did not exist on 9/11.
*In Somalia, another U.S. airstrike killed a reported 150 militants, reputedly members of al-Shabab, a very nasty outfit, even if one with no real agenda beyond Somalia itself.
*In Syria, pursuant to the campaign of assassination that is the latest spin-off of the Iraq War, U.S. special operations forces bumped off the reputed “finance minister” of the Islamic State, another terror group that didn’t even exist in September 2001.
*In Libya, according to press reports, the Pentagon is again gearing up for “decisive military action” — that is, a new round of air strikes and special operations attacks to quell the disorder resulting from the U.S.-orchestrated air campaign that in 2011 destabilized that country. An airstrike conducted in late February gave a hint of what is to come: it killed approximately 50 Islamic State militants (and possibly two Serbian diplomatic captives).
Yemen, Somalia, Syria, and Libya share at least this in common: none of them, nor any of the groups targeted, had a hand in the 9/11 attacks.
Imagine if, within a matter of weeks, China were to launch raids into Vietnam, Thailand, and Taiwan, with punitive action against the Philippines in the offing. Or if Russia, having given a swift kick to Ukraine, Georgia, and Azerbaijan, leaked its plans to teach Poland a lesson for mismanaging its internal affairs. Were Chinese President Xi Jinping or Russian President Vladimir Putin to order such actions, the halls of Congress would ring with fierce denunciations. Members of both houses would jostle for places in front of the TV cameras to condemn the perpetrators for recklessly violating international law and undermining the prospects for world peace. Having no jurisdiction over the actions of other sovereign states, senators and representatives would break down the doors to seize the opportunity to get in their two cents’ worth. No one would be able to stop them. Who does Xi think he is! How dare Putin!
Yet when an American president undertakes analogous actions over which the legislative branch does have jurisdiction, members of Congress either yawn or avert their eyes.
In this regard, Republicans are especially egregious offenders. On matters where President Obama is clearly acting in accordance with the Constitution — for example, in nominating someone to fill a vacancy on the Supreme Court — they spare no effort to thwart him, concocting bizarre arguments nowhere found in the Constitution to justify their obstructionism. Yet when this same president cites the 2001 AUMF as the basis for initiating hostilities hither and yon, something that is on the face of it not legal but ludicrous, they passively assent.
Indeed, when Obama in 2015 went so far as to ask Congress to pass a new AUMF addressing the specific threat posed by the Islamic State — that is, essentially rubber-stamping the war he had already launched on his own in Syria and Iraq — the Republican leadership took no action. Looking forward to the day when Obama departs office, Senator Mitch McConnell, with his trademark hypocrisy, worried aloud that a new AUMF might constrain his successor. The next president will “have to clean up this mess, created by all of this passivity over the last eight years,” the majority leader remarked. In that regard, “an authorization to use military force that ties the president’s hands behind his back is not something I would want to do.” The proper role of Congress was to get out of the way and give this commander-in-chief carte blanche so that the next one would enjoy comparably unlimited prerogatives.
Collaborating with a president they roundly despise — implicitly concurring in Obama’s questionable claim that “existing statutes [already] provide me with the authority I need” to make war on ISIS — the GOP-controlled Congress thereby transformed the post-9/11 AUMF into what has now become, in effect, a writ of permanent and limitless armed conflict. In Iraq and Syria, for instance, what began as a limited but open-ended campaign of air strikes authorized by President Obama in August 2014 has expanded to include an ever-larger contingent of U.S. trainers and advisers for the Iraqi military, special operations forces conducting raids in both Iraq and Syria, the first new all-U.S. forward fire base in Iraq, and at least 5,000 U.S. military personnel now on the ground, a number that continues to grow incrementally.
Remember Barack Obama campaigning back in 2008 and solemnly pledging to end the Iraq War? What he neglected to mention at the time was that he was retaining the prerogative to plunge the country into another Iraq War on his own ticket. So has he now done, with members of Congress passively assenting and the country essentially a prisoner of war.
By now, through its inaction, the legislative branch has, in fact, surrendered the final remnant of authority it retained on matters relating to whether, when, against whom, and for what purpose the United States should go to war. Nothing now remains but to pay the bills, which Congress routinely does, citing a solemn obligation to “support the troops.” In this way does the performance of lesser duties provide an excuse for shirking far greater ones.
In military circles, there is a term to describe this type of behavior. It’s called cowardice.
Writing a Blank Check on War for the President
Copyright 2016 Andrew J. Bacevich
Whether or not Donald Trump ultimately succeeds in winning the White House, historians are likely to rank him as the most consequential presidential candidate of at least the past half-century. He has already transformed the tone and temper of American political life. If he becomes the Republican nominee, he will demolish its structural underpinnings as well. Should he prevail in November, his election will alter its very fabric in ways likely to prove irreversible. Whether Trump ever delivers on his promise to “Make America Great Again,” he is already transforming American democratic practice.
Trump takes obvious delight in thumbing his nose at the political establishment and flouting its norms. Yet to classify him as an anti-establishment figure is to miss his true significance. He is to American politics what Martin Shkreli is to Big Pharma. Each represents in exaggerated form the distilled essence of a much larger and more disturbing reality. Each embodies the smirking cynicism that has become one of the defining characteristics of our age. Each in his own way is a sign of the times.
In contrast to the universally reviled Shkreli, however, Trump has cultivated a mass following that appears impervious to his missteps, miscues, and misstatements. What Trump actually believes — whether he believes in anything apart from big, splashy self-display — is largely unknown and probably beside the point. Trumpism is not a program or an ideology. It is an attitude or pose that feeds off of, and then reinforces, widespread anger and alienation.
The pose works because the anger — always present in certain quarters of the American electorate but especially acute today — is genuine. By acting the part of impish bad boy and consciously trampling on the canons of political correctness, Trump validates that anger. The more outrageous his behavior, the more secure his position at the very center of the political circus. Wondering what he will do next, we can’t take our eyes off him. And to quote Marco Rubio in a different context, Trump “knows exactly what he is doing.”
Targeting Obama’s Presidency
There is a form of genius at work here. To an extent unmatched by any other figure in American public life, Trump understands that previous distinctions between the ostensibly serious and the self-evidently frivolous have collapsed. Back in 1968, then running for president, Richard Nixon, of all people, got things rolling when he appeared on Laugh-In and uttered the immortal words, “Sock it to me?” But no one has come close to Trump in grasping the implications of all this: in contemporary America, celebrity confers authority. Mere credentials or qualifications have become an afterthought. How else to explain the host of a “reality” TV show instantly qualifying as a serious contender for high office?
For further evidence of Trump’s genius, consider the skill with which he plays the media, especially celebrity journalists who themselves specialize in smirking cynicism. Rather than pretending to take them seriously, he unmasks their preening narcissism, which mirrors his own. He refuses to acknowledge their self-assigned role as gatekeepers empowered to police the boundaries of permissible discourse. As the embodiment of “breaking news,” he continues to stretch those boundaries beyond recognition.
In that regard, the spectacle of televised “debates” has offered Trump an ideal platform for promoting his cult of personality. Once a solemn, almost soporific forum for civic education — remember Kennedy and Nixon in 1960? — presidential debates now provide occasions for trading insults, provoking gaffes, engaging in verbal food fights, and marketing magical solutions to problems ranging from war to border security that are immune to magic. For all of that we have Trump chiefly to thank.
Trump’s success as a campaigner schools his opponents, of course. In a shrinking Republican field, survival requires mimicking his antics. In that regard, Ted Cruz rates as Trump’s star pupil. Cruz is to Trump what Lady Gaga was to Amy Winehouse — a less freewheeling, more scripted, and arguably more calculating version of the original.
Yet if not a clone, Cruz taps into the same vein of pissed-off, give-me-my-country-back rage that Trump himself has so adeptly exploited. Like the master himself, Cruz has demonstrated a notable aptitude for expressing disagreement through denigration and for extravagant, crackpot promises. For his part, Marco Rubio, the only other Republican still seriously in the running, lags not far behind. When it comes to swagger and grandiosity, nothing beats a vow to create a “New American Century,” thereby resurrecting a mythic past when all was ostensibly right with the world.
On two points alone do these several Republicans see eye-to-eye. The first relates to domestic policy, the second to America’s role in the world.
On point one: with absolute unanimity, Trump, Cruz, and Rubio ascribe to Barack Obama any and all problems besetting the nation. To take their critique at face value, the country was doing swimmingly well back in 2009 when Obama took office. Today, it’s FUBAR, due entirely to Obama’s malign actions.
Wielding comparable authority, however, a Republican president can, they claim, dismantle Obama’s poisonous legacy and restore all that he has destroyed. From “day one,” on issues ranging from health care to immigration to the environment, the Republican candidates vow to do exactly this. With the stroke of a pen and the wave of a hand, it will be a breeze.
On point two: ditto. Aided and abetted by Hillary Clinton, Obama has made a complete hash of things abroad. Here the list of Republican grievances is especially long. Thanks to Obama, Russia threatens Europe; North Korea is misbehaving; China is flexing its military muscles; ISIS is on the march; Iran has a clear path to acquiring nuclear weapons; and perhaps most distressingly of all, Benjamin Netanyahu, the prime minister of Israel, is unhappy with U.S. policy.
Here, too, the Republican candidates see eye-to-eye and have solutions readily at hand. In one way or another, all of those solutions relate to military power. Trump, Cruz, and Rubio are unabashed militarists. (So, too, is Hillary Clinton, but that’s an issue deserving an essay of its own). Their gripe with Obama is that he never put American military might fully to work, a defect they vow to amend. A Republican commander-in-chief, be it Trump, Cruz, or Rubio, won’t take any guff from Moscow or Pyongyang or Beijing or Tehran. He will eradicate “radical Islamic terrorism,” put the mullahs back in their box, torture a bunch of terrorists in the bargain, and give Bibi whatever he wants.
In addition to offering Obama a sort of backhanded tribute — so much damage wrought by just one man in so little time — the Republican critique reinforces reigning theories of presidential omnipotence. Just as an incompetent or ill-motivated chief executive can screw everything up, so, too, can a bold and skillful one set things right.
Juan and Evita in Washington?
The ratio between promises made and promises fulfilled by every president in recent memory — Obama included — should have demolished such theories long ago. But no such luck. Fantasies of a great president saving the day still persist, something that Trump, Cruz, and Rubio have all made the centerpiece of their campaigns. Elect me, each asserts. I alone can save the Republic.
Here, however, Trump may enjoy an edge over his competitors, including Hillary Clinton and Bernie Sanders. With Americans assigning to their presidents the attributes of demigods — each and every one memorialized before death with a library-shrine — who better to fill the role than an egomaniacal tycoon who already acts the part? The times call for strong leadership. Who better to provide it than a wheeler-dealer unbothered by the rules that constrain mere mortals?
What then lies ahead?
If Trump secures the Republican nomination, now an increasingly imaginable prospect, the party is likely to implode. Whatever rump organization survives will have forfeited any remaining claim to represent principled conservatism.
None of this will matter to Trump, however. He is no conservative and Trumpism requires no party. Even if some new institutional alternative to conventional liberalism eventually emerges, the two-party system that has long defined the landscape of American politics will be gone for good.
Should Trump or a Trump mini-me ultimately succeed in capturing the presidency, a possibility that can no longer be dismissed out of hand, the effects will be even more profound. In all but name, the United States will cease to be a constitutional republic. Once President Trump inevitably declares that he alone expresses the popular will, Americans will find that they have traded the rule of law for a version of caudillismo. Trump’s Washington could come to resemble Buenos Aires in the days of Juan Perón, with Melania a suitably glamorous stand-in for Evita, and plebiscites suitably glamorous stand-ins for elections.
That a considerable number of Americans appear to welcome this prospect may seem inexplicable. Yet reason enough exists for their disenchantment. American democracy has been decaying for decades. The people know that they are no longer truly sovereign. They know that the apparatus of power, both public and private, does not promote the common good, itself a concept that has become obsolete. They have had their fill of irresponsibility, lack of accountability, incompetence, and the bad times that increasingly seem to go with them.
So in disturbingly large numbers they have turned to Trump to strip bare the body politic, willing to take a chance that he will come up with something that, if not better, will at least be more entertaining. As Argentines and others who have trusted their fate to demagogues have discovered, such expectations are doomed to disappointment.
In the meantime, just imagine how the Donald J. Trump Presidential Library, no doubt taller than all the others put together, might one day glitter and glisten — perhaps with casino attached.
Andrew J. Bacevich, a TomDispatch regular, is professor emeritus of history and international relations at Boston University. He is the author of the new book America’s War for the Greater Middle East: A Military History (Random House, April 2016).
Don’t Cry for Me, America
Copyright 2016 Andrew Bacevich
To judge by the early returns, the presidential race of 2016 is shaping up as the most disheartening in recent memory. Other than as a form of low entertainment, the speeches, debates, campaign events, and slick TV ads already inundating the public sphere offer little of value. Rather than exhibiting the vitality of American democracy, they testify to its hollowness.
Present-day Iranian politics may actually possess considerably more substance than our own. There, the parties involved, whether favoring change or opposing it, understand that the issues at stake have momentous implications. Here, what passes for national politics is a form of exhibitionism about as genuine as pro wrestling.
A presidential election campaign ought to involve more than competing coalitions of interest groups or bevies of investment banks and billionaires vying to install their preferred candidate in the White House. It should engage and educate citizens, illuminating issues and subjecting alternative solutions to careful scrutiny.
That this one won’t even come close we can ascribe as much to the media as to those running for office, something the recent set of “debates” and the accompanying commentary have made painfully clear. With certain honorable exceptions such as NBC’s estimable Lester Holt, representatives of the press are less interested in fulfilling their civic duty than in promoting themselves as active participants in the spectacle. They bait, tease, and strut. Then they subject the candidates’ statements and misstatements to minute deconstruction. The effect is to inflate their own importance while trivializing the proceedings they are purportedly covering.
Above all in the realm of national security, election 2016 promises to be not just a missed opportunity but a complete bust. Recent efforts to exercise what people in Washington like to call “global leadership” have met with many more failures and disappointments than clear-cut successes. So you might imagine that reviewing the scorecard would give the current raft of candidates, Republican and Democratic alike, plenty to talk about.
But if you thought that, you’d be mistaken. Instead of considered discussion of first-order security concerns, the candidates have regularly opted for bluff and bluster, their chief aim being to remove all doubts regarding their hawkish bona fides.
In that regard, nothing tops rhetorically beating up on the so-called Islamic State. So, for example, Hillary Clinton promises to “smash the would-be caliphate,” Jeb Bush to “defeat ISIS for good,” Ted Cruz to “carpet bomb them into oblivion,” and Donald Trump to “bomb the shit out of them.” For his part, having recently acquired a gun as the “last line of defense between ISIS and my family,” Marco Rubio insists that when he becomes president, “The most powerful intelligence agency in the world is going to tell us where [ISIS militants] are; the most powerful military in the world is going to destroy them; and if we capture any of them alive, they are getting a one-way ticket to Guantanamo Bay.”
These carefully scripted lines perform their intended twofold function. First, they elicit applause and certify the candidate as plenty tough. Second, they spare the candidate from having to address matters far more deserving of presidential attention than managing the fight against the Islamic State.
In the hierarchy of challenges facing the United States today, ISIS ranks about on a par with Sicily back in 1943. While liberating that island was a necessary prelude to liberating Europe more generally, the German occupation of Sicily did not pose a direct threat to the Allied cause. So with far weightier matters to attend to — handling Soviet dictator Joseph Stalin and British Prime Minister Winston Churchill, for example — President Franklin Roosevelt wisely left the problem of Sicily to subordinates. FDR thereby demonstrated an aptitude for distinguishing between the genuinely essential and the merely important.
By comparison, today’s crop of presidential candidates are unable to grasp, cannot articulate, or simply choose to ignore those matters that should rightfully fall under a commander-in-chief’s purview. Instead, they compete with one another in vowing to liberate the twenty-first-century equivalent of Sicily, as if doing so demonstrates their qualifications for the office.
What sort of national security concerns should be front and center in the current election cycle? While conceding that a reasoned discussion of heavily politicized matters like climate change, immigration, or anything to do with Israel is probably impossible, other issues of demonstrable significance deserve attention. What follows are six of them — by no means an exhaustive list — that I’ve framed as questions a debate moderator might ask of anyone seeking the presidency, along with brief commentaries explaining why neither the posing nor the answering of such questions is likely to happen anytime soon.
1. The War on Terror: Nearly 15 years after this “war” was launched by George W. Bush, why hasn’t “the most powerful military in the world,” “the finest fighting force in the history of the world” won it? Why isn’t victory anywhere in sight?
As if by informal agreement, the candidates and the journalists covering the race have chosen to ignore the military enterprise inaugurated in 2001, initially called the Global War on Terrorism and continuing today without an agreed-upon name. Since 9/11, the United States has invaded, occupied, bombed, raided, or otherwise established a military presence in numerous countries across much of the Islamic world. How are we doing?
Given the resources expended and the lives lost or ruined, not particularly well, it would seem. Intended to promote stability, reduce the incidence of jihadism, and reverse the tide of anti-Americanism among many Muslims, that “war” has done just the opposite. Advance the cause of democracy and human rights? Make that zero-for-four.
Amazingly, this disappointing record has been almost entirely overlooked in the campaign. The reasons why are not difficult to discern. First and foremost, both parties share in the serial failures of U.S. policy in Afghanistan, Iraq, Syria, Libya, and elsewhere in the region. Pinning the entire mess on George W. Bush is no more persuasive than pinning it all on Barack Obama. An intellectually honest accounting would require explanations that look beyond reflexive partisanship. Among the matters deserving critical scrutiny is Washington’s persistent bipartisan belief in military might as an all-purpose problem solver. Not far behind should come questions about simple military competence that no American political figure of note or mainstream media outlet has the gumption to address.
The politically expedient position indulged by the media is to sidestep such concerns in favor of offering endless testimonials to the bravery and virtue of the troops, while calling for yet more of the same or even further escalation. Making a show of supporting the troops takes precedence over serious consideration of what they are continually being asked to do.
2. Nuclear Weapons: Today, more than 70 years after Hiroshima and Nagasaki, what purpose do nukes serve? How many nuclear weapons and delivery systems does the United States actually need?
In an initiative that has attracted remarkably little public attention, the Obama administration has announced plans to modernize and upgrade the U.S. nuclear arsenal. Estimated costs of this program reach as high as $1 trillion over the next three decades. Once finished — probably just in time for the 100th anniversary of Hiroshima — the United States will possess more flexible, precise, survivable, and therefore usable nuclear capabilities than anything hitherto imagined. In effect, the country will have acquired a first-strike capability — even as U.S. officials continue to affirm their earnest hope of removing the scourge of nuclear weapons from the face of the Earth (other powers being the first to disarm, of course).
Whether, in the process, the United States will become more secure or whether there might be far wiser ways to spend that kind of money — shoring up cyber defenses, for example — would seem like questions those who could soon have their finger on the nuclear button might want to consider.
Yet we all know that isn’t going to happen. Having departed from the sphere of politics or strategy, nuclear policy has long since moved into the realm of theology. Much as the Christian faith derives from a belief in a Trinity consisting of the Father, the Son, and the Holy Ghost, so nuclear theology has its own Triad, composed of manned bombers, intercontinental ballistic missiles, and submarine-launched missiles. To question the existence of such a holy threesome constitutes rank heresy. It’s just not done — especially when there’s all that money about to be dropped into the collection plate.
3. Energy Security: Given the availability of abundant oil and natural gas reserves in the Western Hemisphere and the potential future abundance of alternative energy systems, why should the Persian Gulf continue to qualify as a vital U.S. national security interest?
Back in 1980, two factors prompted President Jimmy Carter to announce that the United States viewed the Persian Gulf as worth fighting for. The first was a growing U.S. dependence on foreign oil and a belief that American consumers were guzzling gas at a rate that would rapidly deplete domestic reserves. The second was a concern that, having just invaded Afghanistan, the Soviet Union might next have an appetite for going after those giant gas stations in the Gulf, Iran, or even Saudi Arabia.
Today we know that the Western Hemisphere contains more than ample supplies of oil and natural gas to sustain the American way of life (while also heating up the planet). As for the Soviet Union, it no longer exists — a decade spent chewing on Afghanistan having produced a fatal case of indigestion.
No doubt ensuring U.S. energy security should remain a major priority. Yet in that regard, protecting Canada, Mexico, and Venezuela is far more relevant to the nation’s well-being than protecting Saudi Arabia, Kuwait, and Iraq, while being far easier and cheaper to accomplish. So who will be the first presidential candidate to call for abrogating the Carter Doctrine? Show of hands, please?
4. Assassination: Now that the United States has normalized assassination as an instrument of policy, how well is it working? What are its benefits and costs?
George W. Bush’s administration pioneered the practice of using missile-armed drones as a method of extrajudicial killing. Barack Obama’s administration greatly expanded and routinized the practice.
The technique is clearly “effective” in the narrow sense of liquidating leaders and “lieutenants” of terror groups that policymakers want done away with. What’s less clear is whether the benefits of state-sponsored assassination outweigh the costs, which are considerable. The incidental killing of noncombatants provokes ire directed against the United States and provides terror groups with an excellent recruiting tool. The removal of Mr. Bad Actor from the field adversely affects the organization he leads for no longer than it takes for a successor to emerge. As often as not, the successor turns out to be nastier than Mr. Bad Actor himself.
It would be naïve to expect presidential candidates to interest themselves in the moral implications of assassination as now practiced on a regular basis from the White House. Still, shouldn’t they at least wonder whether it actually works as advertised? And as drone technology proliferates, shouldn’t they also contemplate the prospect of others — say, Russians, Chinese, and Iranians — following America’s lead and turning assassination into a global practice?
5. Europe: Seventy years after World War II and a quarter-century after the Cold War ended, why does European security remain an American responsibility? Given that Europeans are rich enough to defend themselves, why shouldn’t they?
Americans love Europe: old castles, excellent cuisine, and cultural attractions galore. Once upon a time, the parts of Europe that Americans love best needed protection. Devastated by World War II, Western Europe faced in the Soviet Union a threat that it could not handle alone. In a singular act of generosity laced with self-interest, Washington came to the rescue. By forming NATO, the United States committed itself to defend its impoverished and vulnerable European allies. Over time this commitment enabled France, Great Britain, West Germany, and other nearby countries to recover from the global war and become strong, prosperous, and democratic countries.
Today Europe is “whole and free,” incorporating not only most of the former Soviet empire, but even parts of the old Soviet Union itself. In place of the former Soviet threat, there is Vladimir Putin, a bully governing a rickety energy state that, media hype notwithstanding, poses no more than a modest danger to Europe itself. Collectively, the European Union’s economy, at $18 trillion, equals that of the United States and exceeds Russia’s, even in sunnier times, by a factor of nine. Its total population, easily outnumbering our own, is more than triple Russia’s. What these numbers tell us is that Europe is entirely capable of funding and organizing its own defense if it chooses to do so.
It chooses otherwise, in effect opting for something approximating disarmament. As a percentage of the gross domestic product, European nations spend a fraction of what the United States does on defense. When it comes to armaments, they prefer to be free riders and Washington indulges that choice. So even today, seven decades after World War II ended, U.S. forces continue to garrison Europe and America’s obligation to defend 26 countries on the far side of the Atlantic remains intact.
The persistence of this anomalous situation deserves election-year attention for one very important reason. It gets to the question of whether the United States can ever declare mission accomplished. Since the end of World War II, Washington has extended its security umbrella to cover not only Europe, but also virtually all of Latin America and large parts of East Asia. More recently, the Middle East, Central Asia, and now Africa have come in for increased attention. Today, U.S. forces alone maintain an active presence in 147 countries.
Do our troops ever really get to “come home”? The question is more than theoretical in nature. To answer it is to expose the real purpose of American globalism, which means, of course, that none of the candidates will touch it with a 10-foot pole.
6. Debt: Does the national debt constitute a threat to national security? If so, what are some politically plausible ways of reining it in?
Together, the administrations of George W. Bush and Barack Obama can take credit for tripling the national debt since 2000. Well before Election Day this coming November, the total debt, now exceeding the entire gross domestic product, will breach the $19 trillion mark.
In 2010, Admiral Mike Mullen, then chairman of the Joint Chiefs of Staff, described that debt as “the most significant threat to our national security.” Although in doing so he wandered a bit out of his lane, he performed a rare and useful service by drawing a link between long-term security and fiscal responsibility. Ever so briefly, a senior military officer allowed consideration of the national interest to take precedence over the care and feeding of the military-industrial complex. It didn’t last long.
Mullen’s comment garnered a bit of attention, but failed to spur any serious congressional action. Again, we can see why, since Congress functions as an unindicted co-conspirator in the workings of that lucrative collaboration. Returning to anything like a balanced budget would require legislators to make precisely the sorts of choices that they are especially loath to make — cutting military programs that line the pockets of donors and provide jobs for constituents. (Although the F-35 fighter may be one of the most bloated and expensive weapons programs in history, even Democratic Socialist Senator Bernie Sanders has left no stone unturned in lobbying to get those planes stationed in his hometown of Burlington.)
Recently, the role of Congress in authorizing an increase in the debt ceiling has provided Republicans with an excuse for political posturing, laying responsibility for all that red ink entirely at the feet of President Obama — this despite the fact that he has reduced the annual deficit by two-thirds, from $1.3 trillion the year he took office to $439 billion last year.
This much is certain: regardless of who takes the prize in November, the United States will continue to accumulate debt at a non-trivial rate. If a Democrat occupies the White House, Republicans will pretend to care. If our next president is a Republican, they will keep mum. In either case, the approach to national security that does so much to keep the books out of balance will remain intact.
Come to think of it, averting real change might just be the one point on which the candidates generally agree.
Copyright 2016 Andrew J. Bacevich
Out of Bounds, Off-Limits, or Just Plain Ignored
Assume that the hawks get their way — that the United States does whatever it takes militarily to confront and destroy ISIS. Then what?
Answering that question requires taking seriously the outcomes of other recent U.S. interventions in the Greater Middle East. In 1991, when the first President Bush ejected Saddam Hussein’s army from Kuwait, Americans rejoiced, believing that they had won a decisive victory. A decade later, the younger Bush seemingly outdid his father by toppling the Taliban in Afghanistan and then making short work of Saddam himself — a liberation twofer achieved in less time than it takes Americans to choose a president. After the passage of another decade, Barack Obama got into the liberation act, overthrowing the Libyan dictator Muammar Gaddafi in what appeared to be a tidy air intervention with a clean outcome. As Secretary of State Hillary Clinton memorably put it, “We came, we saw, he died.” End of story.
In fact, subsequent events in each case mocked early claims of success or outright victory. Unanticipated consequences and complications abounded. “Liberation” turned out to be a prelude to chronic violence and upheaval.
Indeed, the very existence of the Islamic State (ISIS) today renders a definitive verdict on the Iraq wars over which the Presidents Bush presided, each abetted by a Democratic successor. A de facto collaboration of four successive administrations succeeded in reducing Iraq to what it is today: a dysfunctional quasi-state unable to control its borders or territory while serving as a magnet and inspiration for terrorists.
The United States bears a profound moral responsibility for having made such a hash of things there. Were it not for the reckless American decision to invade and occupy a nation that, whatever its crimes, had nothing to do with 9/11, the Islamic State would not exist. Per the famous Pottery Barn Rule attributed to former Secretary of State Colin Powell, having smashed Iraq to bits a decade ago, we can now hardly deny owning ISIS.
That the United States possesses sufficient military power to make short work of that “caliphate” is also the case. True, in both Syria and Iraq the Islamic State has demonstrated a disturbing ability to capture and hold large stretches of desert, along with several population centers. It has, however, achieved these successes against poorly motivated local forces of, at best, indifferent quality.
In that regard, the glibly bellicose editor of the Weekly Standard, William Kristol, is surely correct in suggesting that a well-armed contingent of 50,000 U.S. troops, supported by ample quantities of air power, would make mincemeat of ISIS in a toe-to-toe contest. Liberation of the various ISIS strongholds like Fallujah and Mosul in Iraq and Palmyra and Raqqa, its “capital,” in Syria would undoubtedly follow in short order.
In the wake of the recent attacks in Paris, the American mood is strongly trending in favor of this sort of escalation. Just about anyone who is anyone — the current occupant of the Oval Office partially excepted — favors intensifying the U.S. military campaign against ISIS. And why not? What could possibly go wrong? As Kristol puts it, “I don’t think there’s much in the way of unanticipated side effects that are going to be bad there.”
It’s an alluring prospect. In the face of a sustained assault by the greatest military the world has ever seen, ISIS foolishly (and therefore improbably) chooses to make an Alamo-like stand. Whammo! We win. They lose. Mission accomplished.
Of course, that phrase recalls the euphoric early reactions to Operations Desert Storm in 1991, Enduring Freedom in 2001, Iraqi Freedom in 2003, and Odyssey Dawn, the Libyan intervention of 2011. Time and again the unanticipated side effects of U.S. military action turned out to be very bad indeed. In Kabul, Baghdad, or Tripoli, the Alamo fell, but the enemy dispersed or reinvented itself and the conflict continued. Assurances offered by Kristol that this time things will surely be different deserve to be taken with more than a grain of salt. Pass the whole shaker.
Embracing Generational War
Why this repeated disparity between perceived and actual outcomes? Why have apparent battlefield successes led so regularly to more violence and disorder? Before following Kristol’s counsel, Americans would do well to reflect on these questions.
Cue Professor Eliot A. Cohen. Shortly after 9/11, Cohen, one of this country’s preeminent military thinkers, characterized the conflict on which the United States was then embarking as “World War IV.” (In this formulation, the Cold War becomes World War III.) Other than in certain neoconservative quarters, the depiction did not catch on. Yet nearly a decade and a half later, the Johns Hopkins professor and former State Department official is sticking to his guns. In an essay penned for the American Interest following the recent Paris attacks, he returns to his theme. “It was World War IV in 2001,” Cohen insists. “It is World War IV today.” And to our considerable benefit he spells out at least some of the implications of casting the conflict in such expansive and evocative terms.
Now I happen to think that equating our present predicament in the Islamic world with the immensely destructive conflicts of the prior century is dead wrong. Yet it’s a proposition that Americans at this juncture should contemplate with the utmost seriousness.
In the United States today, confusion about what war itself signifies is widespread. Through misuse, misapplication, and above all misremembering, we have distorted the term almost beyond recognition. As one consequence, talk of war comes too easily off the tongues of the unknowing.
Not so with Cohen. When it comes to war, he has no illusions. Addressing that subject, he illuminates it, enabling us to see what war entails. So in advocating World War IV, he performs a great service, even if perhaps not the one he intends.
What will distinguish the war that Cohen deems essential? “Begin with endurance,” he writes. “This war will probably go on for the rest of my life, and well into my children’s.” Although American political leaders seem reluctant “to explain just how high the stakes are,” Cohen lays them out in direct, unvarnished language. At issue, he insists, is the American way of life itself, not simply “in the sense of rock concerts and alcohol in restaurants, but the more fundamental rights of freedom of speech and religion, the equality of women, and, most essentially, the freedom from fear and freedom to think.”
With so much on the line, Cohen derides the Obama administration’s tendency to rely on “therapeutic bombing, which will temporarily relieve the itch, but leave the wounds suppurating.” The time for such half-measures has long since passed. Defeating the Islamic State and “kindred movements” will require the U.S. to “kill a great many people.” To that end Washington needs “a long-range plan not to ‘contain’ but to crush” the enemy. Even with such a plan, victory will be a long way off and will require “a long, bloody, and costly process.”
Cohen’s candor and specificity, as bracing as they are rare, should command our respect. If World War IV describes what we are in for, then eliminating ISIS might figure as a near-term imperative, but it can hardly define the endgame. Beyond ISIS loom all those continually evolving “kindred movements” to which the United States will have to attend before it can declare the war itself well and truly won.
To send just tens of thousands of U.S. troops to clean up Syria and Iraq, as William Kristol and others propose, offers at best a recipe for winning a single campaign. Winning the larger war would involve far more arduous exertions. This Cohen understands, accepts, and urges others to acknowledge.
And here we come to the heart of the matter. For at least the past 35 years — that is, since well before 9/11 — the United States has been “at war” in various quarters of the Islamic world. At no point has it demonstrated the will or the ability to finish the job. Washington’s approach has been akin to treating cancer with a little bit of chemo one year and a one-shot course of radiation the next. Such gross malpractice aptly describes U.S. military policy throughout the Greater Middle East across several decades.
While there may be many reasons why the Iraq War of 2003 to 2011 and the still longer Afghanistan War yielded such disappointing results, Washington’s timidity in conducting those campaigns deserves pride of place. That most Americans might bridle at the term “timidity” reflects the extent to which they have deluded themselves regarding the reality of war.
In comparison to Vietnam, for example, Washington’s approach to waging its two principal post-9/11 campaigns was positively half-hearted. With the nation as a whole adhering to peacetime routines, Washington neither sent enough troops nor stayed anywhere near long enough to finish the job. Yes, we killed many tens of thousands of Iraqis and Afghans, but if winning World War IV requires, as Cohen writes, that we “break the back” of the enemy, then we obviously didn’t kill nearly enough.
Nor were Americans sufficiently willing to die for the cause. In South Vietnam, 58,000 G.I.s died in a futile effort to enable that country to survive. In Iraq and Afghanistan, where the stakes were presumably much higher, we pulled the plug after fewer than 7,000 deaths.
Americans would be foolish to listen to those like William Kristol who, even today, peddle illusions about war being neat and easy. They would do well instead to heed Cohen, who knows that war is hard and ugly.
What Would World War IV Look Like?
Yet when specifying the practical implications of generational war, Cohen is less forthcoming. From his perspective, this fourth iteration of existential armed conflict in a single century is not going well. But apart from greater resolve and bloody-mindedness, what will it take to get things on the right track?
As a thought experiment, let’s answer that question by treating it with the urgency that Cohen believes it deserves. After 9/11, certain U.S. officials thundered about “taking the gloves off.” In practice, however, with the notable exception of policies permitting torture and imprisonment without due process, the gloves stayed on. Take Cohen’s conception of World War IV at face value and that will have to change.
For starters, the country would have to move to something like a war footing, enabling Washington to raise a lot more troops and spend a lot more money over a very long period of time. Although long since banished from the nation’s political lexicon, the M-word — mobilization — would make a comeback. Prosecuting a generational war, after all, is going to require the commitment of generations.
Furthermore, if winning World War IV means crushing the enemy, as Cohen emphasizes, then ensuring that the enemy, once crushed, cannot recover would be hardly less important. And that requirement would prohibit U.S. forces from simply walking away from a particular fight even — or especially — when it might appear won.
At the present moment, defeating the Islamic State ranks as Washington’s number one priority. With the Pentagon already claiming a body count of 20,000 ISIS fighters without notable effect, this campaign won’t end anytime soon. But even assuming an eventually positive outcome, the task of maintaining order and stability in areas that ISIS now controls will remain. Indeed, that task will persist until the conditions giving rise to entities like ISIS are eliminated. Don’t expect French President François Hollande or British Prime Minister David Cameron to sign up for that thankless job. U.S. forces will own it. Packing up and leaving the scene won’t be an option.
How long would those forces have to stay? Extrapolating from recent U.S. occupations in Iraq and Afghanistan, something on the order of a quarter-century seems like a plausible approximation. So should our 45th president opt for a boots-on-the-ground solution to ISIS, as might well be the case, the privilege of welcoming the troops home could belong to the 48th or 49th occupant of the White House.
In the meantime, U.S. forces would have to deal with the various and sundry “kindred movements” that are already cropping up like crabgrass in country after country. Afghanistan — still? again? — would head the list of places requiring U.S. military attention. But other prospective locales would include such hotbeds of Islamist activity as Lebanon, Libya, Palestine, Somalia, and Yemen, along with several West African countries increasingly beset with insurgencies. Unless Egyptian, Pakistani, and Saudi security forces demonstrate the ability (not to mention the will) to suppress the violent radicals in their midst, one or more of those countries could also become the scene of significant U.S. military action.
Effective prosecution of World War IV, in other words, would require the Pentagon to plan for each of these contingencies, while mustering the assets needed for implementation. Allies might kick in token assistance — tokenism is all they have to offer — but the United States will necessarily carry most of the load.
What Would World War IV Cost?
During World War III (aka the Cold War), the Pentagon maintained a force structure ostensibly adequate to the simultaneous prosecution of two and a half wars. This meant having the wherewithal to defend Europe and the Pacific from communist aggression while still leaving something for the unexpected. World War IV campaigns are unlikely to entail anything on the scale of the Warsaw Pact attacking Western Europe or North Korea invading the South. Still, the range of plausible scenarios will require that U.S. forces be able to take on militant organizations C and D even while guarding against the resurgence of organizations A and B in altogether different geographic locations.
Even though Washington may try whenever possible to avoid large-scale ground combat, relying on air power (including drones) and elite Special Operations forces to do the actual killing, post-conflict pacification promises to be a manpower-intensive activity. Certainly, this ranks as one of the most obvious lessons to emerge from World War IV’s preliminary phases: when the initial fight ends, the real work begins.
U.S. forces committed to asserting control over Iraq after the invasion of 2003 topped out at roughly 180,000. In Afghanistan, during the Obama presidency, the presence peaked at 110,000. In a historical context, these are not especially large numbers. At the height of the Vietnam War, for example, U.S. troop strength in Southeast Asia exceeded 500,000.
In hindsight, the Army general who, before the invasion of 2003, publicly suggested that pacifying postwar Iraq would require “several hundred thousand troops” had it right. A similar estimate applies to Afghanistan. In other words, those two occupations together could easily have absorbed 600,000 to 800,000 troops on an ongoing basis. Given the Pentagon’s standard three-to-one rotation policy, which assumes that for every unit in-country, a second is just back, and a third is preparing to deploy, you’re talking about a minimum requirement of between 1.8 and 2.4 million troops to sustain just two medium-sized campaigns — a figure that wouldn’t include some number of additional troops kept in reserve for the unexpected.
In other words, waging World War IV would require at least a fivefold increase in the current size of the U.S. Army — and not as an emergency measure but a permanent one. Such numbers may appear large, but as Cohen would be the first to point out, they are actually modest when compared to previous world wars. In 1968, in the middle of World War III, the Army had more than 1.5 million active duty soldiers on its rolls — this at a time when the total American population was less than two-thirds what it is today and when gender discrimination largely excluded women from military service. If it chose to do so, the United States today could easily field an army of two million or more soldiers.
Whether it could also retain the current model of an all-volunteer force is another matter. Recruiters would certainly face considerable challenges, even if Congress enhanced the material inducements for service, which since 9/11 have already included a succession of generous increases in military pay. A loosening of immigration policy, granting a few hundred thousand foreigners citizenship in return for successfully completing a term of enlistment, might help. In all likelihood, however, as with all three previous world wars, waging World War IV would oblige the United States to revive the draft, a prospect as likely to be well-received as a flood of brown and black immigrant enlistees. In short, going all out to create the forces needed to win World War IV would confront Americans with uncomfortable choices.
The budgetary implications of expanding U.S. forces while conducting a perpetual round of what the Pentagon calls “overseas contingency operations” would also loom large. Precisely how much money an essentially global conflict projected to extend well into the latter half of the century would require is difficult to gauge. As a starting point, given the increased number of active duty forces, tripling the present Defense Department budget of more than $600 billion might serve as a reasonable guess.
At first glance, $1.8 trillion annually is a stupefyingly large figure. To make it somewhat more palatable, a proponent of World War IV might put that number in historical perspective. During the first phases of World War III, for example, the United States routinely allocated 10% or more of total gross domestic product (GDP) for national security. With that GDP today exceeding $17 trillion, apportioning 10% to the Pentagon would give those charged with managing World War IV a nice sum to work with and no doubt to build upon.
Of course, that money would have to come from somewhere. For several years during the last decade, sustaining wars in Iraq and Afghanistan pushed the federal deficit above a trillion dollars. As one consequence, the total national debt now exceeds annual GDP, having tripled since 9/11. How much additional debt the United States can accrue without doing permanent damage to the economy is a question of more than academic interest.
To avoid having World War IV produce an endless string of unacceptably large deficits, ratcheting up military spending would undoubtedly require either substantial tax increases or significant cuts in non-military spending, including big-ticket programs like Medicare and Social Security — precisely those, that is, which members of the middle class hold most dear.
In other words, funding World War IV while maintaining a semblance of fiscal responsibility would entail the kind of trade-offs that political leaders are loath to make. Today, neither party appears up to taking on such challenges. That the demands of waging protracted war will persuade them to rise above their partisan differences seems unlikely. It sure hasn’t so far.
The Folly of World War IV
In his essay, Cohen writes, “we need to stop the circumlocutions.” Of those who would bear the direct burden of his world war, he says, “we must start telling them the truth.” He’s right, even if he himself is largely silent about what the conduct of World War IV is likely to exact from the average citizen.
As the United States enters a presidential election year, plain talk about the prospects of our ongoing military engagement in the Islamic world should be the order of the day. The pretense that either dropping a few more bombs or invading one or two more countries will yield a conclusive outcome amounts to more than an evasion. It is an outright lie.
As Cohen knows, winning World War IV would require dropping many, many more bombs and invading, and then occupying for years to come, many more countries. After all, it’s not just ISIS that Washington will have to deal with, but also its affiliates, offshoots, wannabes, and the successors almost surely waiting in the wings. And don’t forget al-Qaeda.
Cohen believes that we have no alternative. Either we get serious about fighting World War IV the way it needs to be fought or darkness will envelop the land. He is undeterred by the evidence that the more deeply we insert our soldiers into the Greater Middle East the more concerted the resistance they face; that the more militants we kill the more we seem to create; that the inevitable, if unintended, killing of innocents only serves to strengthen the hand of the extremists. As he sees it, with everything we believe in riding on the outcome, we have no choice but to press on.
While listening carefully to Cohen’s call to arms, Americans should reflect on its implications. Wars change countries and people. Embracing his prescription for World War IV would change the United States in fundamental ways. It would radically expand the scope and reach of the national security state, which, of course, includes agencies beyond the military itself. It would divert vast quantities of wealth to nonproductive purposes. It would make the militarization of the American way of life, a legacy of prior world wars, irreversible. By sowing fear and fostering impossible expectations of perfect security, it would also compromise American freedom in the name of protecting it. The nation that decades from now might celebrate VT Day — victory over terrorism — will have become a different place, materially, politically, culturally, and morally.
In my view, Cohen’s World War IV is an invitation to collective suicide. Arguing that no alternative exists to open-ended war represents not hard-nosed realism, but the abdication of statecraft. Yet here’s the ultimate irony: even without the name, the United States has already embarked upon something akin to a world war, which now extends into the far reaches of the Islamic world and spreads further year by year.
Incrementally, bit by bit, this nameless war has already expanded the scope and reach of the national security apparatus. It is diverting vast quantities of wealth to nonproductive purposes even as it normalizes the continuing militarization of the American way of life. By sowing fear and fostering impossible expectations of perfect security, it is undermining American freedom in the name of protecting it, and doing so right before our eyes.
Cohen rightly decries the rudderless character of the policies that have guided the (mis)conduct of that war thus far. For that critique we owe him a considerable debt. But the real problem is the war itself and the conviction that only through war can America remain America.
For a rich and powerful nation to conclude that it has no choice but to engage in quasi-permanent armed conflict in the far reaches of the planet represents the height of folly. Power confers choice. As citizens, we must resist with all our might arguments that deny the existence of choice. Whether advanced forthrightly by Cohen or fecklessly by the militarily ignorant, such claims will only perpetuate the folly that has already lasted far too long.
Andrew J. Bacevich, a TomDispatch regular, is professor emeritus of history and international relations at Boston University. He is the author of Breach of Trust: How Americans Failed Their Soldiers and Their Country, among other works. His new book, America’s War for the Greater Middle East (Random House), is due out in April 2016.
Copyright 2015 Andrew J. Bacevich
First came Fallujah, then Mosul, and later Ramadi in Iraq. Now, there is Kunduz, a provincial capital in northern Afghanistan. In all four places, the same story has played out: in cities that newspaper reporters like to call “strategically important,” security forces trained and equipped by the U.S. military at great expense simply folded, abandoning their posts (and much of their U.S.-supplied weaponry) without even mounting serious resistance. Called upon to fight, they fled. In each case, the defending forces gave way before substantially outnumbered attackers, making the outcomes all the more ignominious.
Together, these setbacks have rendered a verdict on the now more-or-less nameless Global War on Terrorism (GWOT). Successive blitzkriegs by ISIS and the Taliban respectively did more than simply breach Iraqi and Afghan defenses. They also punched gaping holes in the strategy to which the United States had reverted in hopes of stemming the further erosion of its position in the Greater Middle East.
Recall that, when the United States launched its GWOT soon after 9/11, it did so pursuant to a grandiose agenda. U.S. forces were going to imprint onto others a specific and exalted set of values. During President George W. Bush’s first term, this “freedom agenda” formed the foundation, or at least the rationale, for U.S. policy.
The shooting would stop, Bush vowed, only when countries like Afghanistan had ceased to harbor anti-American terrorists and countries like Iraq had ceased to encourage them. Achieving this goal meant that the inhabitants of those countries would have to change. Afghans and Iraqis, followed in due course by Syrians, Libyans, Iranians, and sundry others would embrace democracy, respect human rights, and abide by the rule of law, or else. Through the concerted application of American power, they would become different — more like us and therefore more inclined to get along with us. A bit less Mecca and Medina, a bit more “we hold these truths” and “of the people, by the people.”
So Bush and others in his inner circle professed to believe. At least some of them, probably including Bush himself, may actually have done so.
History, at least the bits and pieces to which Americans attend, seemed to endow such expectations with a modicum of plausibility. Had not such a transfer of values occurred after World War II when the defeated Axis Powers had hastily thrown in with the winning side? Had it not recurred as the Cold War was winding down, when previously committed communists succumbed to the allure of consumer goods and quarterly profit statements?
If the appropriate mix of coaching and coercion were administered, Afghans and Iraqis, too, would surely take the path once followed by good Germans and nimble Japanese, and subsequently by Czechs tired of repression and Chinese tired of want. Once liberated, grateful Afghans and Iraqis would align themselves with a conception of modernity that the United States had pioneered and now exemplified. For this transformation to occur, however, the accumulated debris of retrograde social conventions and political arrangements that had long retarded progress would have to be cleared away. This was what the invasions of Afghanistan (Operation Enduring Freedom!) and Iraq (Operation Iraqi Freedom!) were meant to accomplish in one fell swoop by a military the likes of which had (to hear Washington tell it) never been seen in history. POW!
Standing Them Up As We Stand Down
Concealed within that oft-cited “freedom” — the all-purpose justification for deploying American power — were several shades of meaning. The term, in fact, requires decoding. Yet within the upper reaches of the American national security apparatus, one definition takes precedence over all others. In Washington, freedom has become a euphemism for dominion. Spreading freedom means positioning the United States to call the shots. Seen in this context, Washington’s expected victories in both Afghanistan and Iraq were meant to affirm and broaden its preeminence by incorporating large parts of the Islamic world into the American imperium. They would benefit, of course, but to an even greater extent, so would we.
Alas, liberating Afghans and Iraqis turned out to be a tad more complicated than the architects of Bush’s freedom (or dominion) agenda anticipated. Well before Barack Obama succeeded Bush in January 2009, few observers — apart from a handful of ideologues and militarists — clung to the fairy tale of U.S. military might whipping the Greater Middle East into shape. Brutally but efficiently, war had educated the educable. As for the uneducable, they persisted in taking their cues from Fox News and the Weekly Standard.
Yet if the strategy of transformation via invasion and “nation building” had failed, there was a fallback position that seemed to be dictated by the logic of events. Together, Bush and Obama would lower expectations as to what the United States was going to achieve, even as they imposed new demands on the U.S. military, America’s go-to outfit in foreign policy, to get on with the job.
Rather than midwifing fundamental political and cultural change, the Pentagon was instead ordered to ramp up its already gargantuan efforts to create local militaries (and police forces) capable of maintaining order and national unity. President Bush provided a concise formulation of the new strategy: “As the Iraqis stand up, we will stand down.” Under Obama, after his own stab at a “surge,” the dictum applied to Afghanistan as well. Nation-building had flopped. Building armies and police forces able to keep a lid on things now became the prevailing definition of success.
The United States had, of course, attempted this approach once before, with unhappy results. This was in Vietnam. There, efforts to destroy North Vietnamese and Viet Cong forces intent on unifying their divided country had exhausted both the U.S. military and the patience of the American people. Responding to the logic of events, Presidents Lyndon Johnson and Richard Nixon settled on a tacitly agreed-upon fallback position. As the prospects of American forces successfully eliminating threats to South Vietnamese security faded, the training and equipping of the South Vietnamese to defend themselves became priority number one.
Dubbed “Vietnamization,” this enterprise ended in abject failure with the fall of Saigon in 1975. Yet that failure raised important questions to which members of the national security elite might have attended: Given a weak state with dubious legitimacy, how feasible is it to expect outsiders to invest indigenous forces with genuine fighting power? How do differences in culture or history or religion affect the prospects for doing so? Can skill ever make up for a deficit of will? Can hardware replace cohesion? Above all, if tasked with giving some version of Vietnamization another go, what did U.S. forces need to do differently to ensure a different result?
At the time, with general officers and civilian officials more inclined to forget Vietnam than contemplate its implications, these questions attracted little attention. Instead, military professionals devoted themselves to gearing up for the next fight, which they resolved would be different. No more Vietnams — and therefore no more Vietnamization.
After the Gulf War of 1991, basking in the ostensible success of Operation Desert Storm, the officer corps persuaded itself that it had once and for all banished its Vietnam-induced bad memories. As Commander-in-Chief George H.W. Bush so memorably put it, “By God, we’ve kicked the Vietnam syndrome once and for all.”
In short, the Pentagon now had war figured out. Victory had become a foregone conclusion. As it happened, this self-congratulatory evaluation left U.S. troops ill-prepared for the difficulties awaiting them after 9/11 when interventions in Afghanistan and Iraq departed from the expected script, which posited short wars by a force beyond compare ending in decisive victories. What the troops got were two very long wars with no decision whatsoever. It was Vietnam on a smaller scale all over again — times two.
For Bush in Iraq and Obama after a brief, half-hearted flirtation with counterinsurgency in Afghanistan, opting for a variant of Vietnamization proved to be a no-brainer. Doing so offered the prospect of an escape from all complexities. True enough, Plan A — we export freedom and democracy — had fallen short. But Plan B — they (with our help) restore some semblance of stability — could enable Washington to salvage at least partial success in both places. With the bar suitably lowered, a version of “Mission Accomplished” might still be within reach.
If Plan A had looked to U.S. troops to vanquish their adversaries outright, Plan B focused on prepping besieged allies to take over the fight. Winning outright was no longer the aim — given the inability of U.S. forces to do so, this was self-evidently not in the cards — but holding the enemy at bay was.
Although allied with the United States, only in the loosest sense did either Iraq or Afghanistan qualify as a nation-state. Only nominally and intermittently did governments in Baghdad and Kabul exercise a writ of authority commanding respect from the people known as Iraqis and Afghans. Yet in the Washington of George Bush and Barack Obama, a willing suspension of disbelief became the basis for policy. In distant lands where the concept of nationhood barely existed, the Pentagon set out to create a full-fledged national security apparatus capable of defending that aspiration as if it represented reality. From day one, this was a faith-based undertaking.
As with any Pentagon project undertaken on a crash basis, this one consumed resources on a gargantuan scale — $25 billion in Iraq and an even more staggering $65 billion in Afghanistan. “Standing up” the requisite forces involved the transfer of vast quantities of equipment and the creation of elaborate U.S. training missions. Iraqi and Afghan forces acquired all the paraphernalia of modern war — attack aircraft or helicopters, artillery and armored vehicles, night vision devices and drones. Needless to say, stateside defense contractors lined up in droves to cash in.
Based on their performance, the security forces on which the Pentagon has lavished years of attention remain visibly not up to the job. Meanwhile, ISIS warriors, without the benefit of expensive third-party mentoring, appear plenty willing to fight and die for their cause. Ditto Taliban fighters in Afghanistan. The beneficiaries of U.S. assistance? Not so much. Based on partial but considerable returns, Vietnamization 2.0 seems to be following an eerily familiar trajectory that should remind anyone of Vietnamization 1.0. Meanwhile, the questions that ought to have been addressed back when our South Vietnamese ally went down to defeat have returned with a vengeance.
The most important of those questions challenges the assumption that has informed U.S. policy in the Greater Middle East since the freedom agenda went south: that Washington has a particular knack for organizing, training, equipping, and motivating foreign armies. Based on the evidence piling up before our eyes, that assumption appears largely false. On this score, retired Lieutenant General Karl Eikenberry, a former military commander and U.S. ambassador in Afghanistan, has rendered an authoritative judgment. “Our track record at building [foreign] security forces over the past 15 years is miserable,” he recently told the New York Times. Just so.
Fighting the Wrong War
Some might argue that trying harder, investing more billions, sending yet more equipment for perhaps another 15 years will produce more favorable results. But this is akin to believing that, given sufficient time, the fruits of capitalism will ultimately trickle down to benefit the least among us or that the march of technology holds the key to maximizing human happiness. You can believe it if you want, but it’s a mug’s game.
Indeed, the United States would be better served if policymakers abandoned the pretense that the Pentagon possesses any gift whatsoever for “standing up” foreign military forces. Prudence might actually counsel that Washington assume instead, when it comes to organizing, training, equipping, and motivating foreign armies, that the United States is essentially clueless.
Exceptions may exist. For example, U.S. efforts have probably helped boost the fighting power of the Kurdish peshmerga. Yet such exceptions are rare enough to prove the rule. Keep in mind that before American trainers and equipment ever showed up, Iraq’s Kurds already possessed the essential attributes of nationhood. Unlike Afghans and Iraqis, Kurds do not require tutoring in the imperative of collective self-defense.
What are the policy implications of giving up the illusion that the Pentagon knows how to build foreign armies? The largest is this: subletting war no longer figures as a plausible alternative to waging it directly. So where U.S. interests require that fighting be done, like it or not, we’re going to have to do that fighting ourselves. By extension, in circumstances where U.S. forces are demonstrably incapable of winning or where Americans balk at any further expenditure of American blood — today in the Greater Middle East both of these conditions apply — then perhaps we shouldn’t be there. To pretend otherwise is to throw good money after bad or, as a famous American general once put it, to wage (even if indirectly) “the wrong war, at the wrong place, at the wrong time, and with the wrong enemy.” This we have been doing now for several decades across much of the Islamic world.
In American politics, we await the officeholder or candidate willing to state the obvious and confront its implications.
Andrew J. Bacevich, a TomDispatch regular, is professor emeritus of history and international relations at Boston University. He is the author of Breach of Trust: How Americans Failed Their Soldiers and Their Country, among other works.
Copyright 2015 Andrew J. Bacevich
On Building Armies (and Watching Them Fail)
There is a peculiar form of insanity in which a veneer of rationality distracts attention from the madness lurking just beneath the surface. When Alice dove down her rabbit hole to enter a place where smirking cats offered directions, ill-mannered caterpillars dispensed advice, and Mock Turtles constituted the principal ingredient in Mock Turtle soup, she experienced something of the sort.
Yet, as the old adage goes, truth can be even stranger than fiction. For a real-life illustration of this phenomenon, one need look no further than Washington and its approach to national security policy. Viewed up close, it all seems to hang together. Peer out of the rabbit hole and the sheer lunacy quickly becomes apparent.
Consider this recent headline: “U.S. to Ship 2,000 Anti-Tank Missiles To Iraq To Help Fight ISIS.” The accompanying article describes a Pentagon initiative to reinforce Iraq’s battered army with a rush order of AT-4s. A souped-up version of the old bazooka, the AT-4 is designed to punch holes through armored vehicles.
Taken on its own terms, the decision makes considerable sense. Iraqi forces need something to counter a fearsome new tactic of the Islamic State of Iraq and Syria (ISIS): suicide bombers mounted in heavily armored wheeled vehicles. Improved antitank capabilities certainly could help Iraqi troops take out such bombers before they reach their intended targets. The logic is airtight. The sooner these weapons get into the hands of Iraqi personnel, the better for them — and so the better for us.
As it turns out, however, the vehicle of choice for ISIS suicide bombers these days is the up-armored Humvee. In June 2014, when the Iraqi Army abandoned the country’s second largest city, Mosul, ISIS acquired 2,300 made-in-the-U.S.A. Humvees. Since then, it’s captured even more of them.
As U.S. forces were themselves withdrawing from Iraq in 2011, they bequeathed a huge fleet of Humvees to the “new” Iraqi army they had built to the tune of $25 billion. Again, the logic of doing so was impeccable: Iraqi troops needed equipment; shipping used Humvees back to the U.S. was going to cost more than they were worth. Better to give them to those who could put them to good use. Who could quarrel with that?
Before they handed over the used equipment, U.S. troops had spent years trying to pacify Iraq, where order had pretty much collapsed after the invasion of 2003. American troops in Iraq had plenty of tanks and other heavy equipment, but once the country fell into insurgency and civil war, patrolling Iraqi cities required something akin to a hopped-up cop car. The readily available Humvee filled the bill. When it turned out that troops driving around in what was essentially an oversized jeep were vulnerable to sniper fire and roadside bombs, “hardening” those vehicles to protect the occupants became a no-brainer — as even Secretary of Defense Donald Rumsfeld eventually recognized.
At each step along the way, the decisions made possessed a certain obvious logic. It’s only when you get to the end — giving Iraqis American-made weapons to destroy specially hardened American-made military vehicles previously provided to those same Iraqis — that the strangely circular and seriously cuckoo Alice-in-Wonderland nature of the entire enterprise becomes apparent.
AT-4s blowing up those Humvees — with fingers crossed that the anti-tank weapons don’t also fall into the hands of ISIS militants — illustrates in microcosm the larger madness of Washington’s policies concealed by the superficial logic of each immediate situation.
The Promotion of Policies That Have Manifestly Failed
Let me provide a firsthand illustration. A week ago, I appeared on a network television news program to discuss American policy in Iraq and in particular the challenges posed by ISIS. The other guests were former Secretary of Defense and CIA Director Leon Panetta; former Undersecretary of Defense for Policy and current CEO of a Washington think tank Michèle Flournoy; and retired four-star general Anthony Zinni, who had once headed up United States Central Command.
Washington is a city in which whatever happens within the current news cycle trumps all other considerations, whether in the immediate or distant past. So the moderator launched the discussion by asking the panelists to comment on President Obama’s decision, announced earlier that very day, to plus-up the 3,000-strong train-and-equip mission to Iraq with an additional 450 American soldiers, the latest ratcheting up of ongoing U.S. efforts to deal with ISIS.
Panetta spoke first and professed wholehearted approval of the initiative. “Well, there’s no question that I think the president’s taken the right step in adding these trainers and advisers.” More such steps — funneling arms to Iraqi Kurds and Sunnis and deploying U.S. Special Operations Forces to hunt down terrorists — were “going to be necessary in order to be able to achieve the mission that we have embarked on.” That mission was of critical importance. Unless defeated, ISIS would convert Iraq into “a base [for] attacking our country and attacking our homeland.”
Flournoy expressed a similar opinion. She called the decision to send additional trainers “a good move and a smart move,” although she, too, hoped that it was only the “first step in a broader series” of escalatory actions. If anything, her view of ISIS was more dire than that of her former Pentagon boss. She called it “the new jihad — violent jihadist vanguard in the Middle East and globally.” Unless stopped, ISIS was likely to become “a global network” with “transnational objectives,” while its “thousands of foreign fighters” from the West and Gulf states were eventually going to “return and be looking to carry out jihad in their home countries.”
General Zinni begged to differ — not on the nature of the danger confronting Washington, but on what to do about it. He described the present policy as “almost déjà vu,” a throwback “to Vietnam before we committed the ground forces. We dribble in more and more advisers and support.”
“We’re not fully committed to this fight,” the general complained. “We use terms like destroy. I can tell you, you could put ground forces on the ground now and we can destroy ISIS.” Zinni proposed doing just that. No more shilly-shallying. The template for action was readily at hand. “The last victory, clear victory that we had was in the first Gulf War,” he said. And what were the keys to success then? “We used overwhelming force. We ended it quickly. We went to the U.N. and got a resolution. We built a coalition. And that ought to be a model we ought to look at.” In short, go big, go hard, go home.
Panetta disagreed. He had a different template in mind. The Iraq War of 2003-2011 had clearly shown that “we know how to do this, and we know how to win at doing this.” The real key was to allow America’s generals a free hand to do what needed to be done. “[A]ll we really do need to do is to be able to give our military commanders the flexibility to design not only the strategy to degrade ISIS, but the larger strategy we need in order to defeat ISIS.” Unleashing the likes of Delta Force or SEAL Team 6 with some missile-firing drones thrown in for good measure was likely to suffice.
For her part, Flournoy thought the real problem was “making sure that there is Iraqi capacity to hold the territory, secure it long-term, so that ISIS doesn’t come back again. And that involves the larger political compromises” — the ones the Iraqis themselves needed to make. At the end of the day, the solution was an Iraqi army willing and able to fight and an Iraqi government willing and able to govern effectively. On that score, there was much work to be done.
Panetta then pointed out that none of this was in the cards unless the United States stepped up to meet the challenge. “[I]f the United States doesn’t provide leadership in these crises, nobody else will.” That much was patently obvious. Other countries and the Iraqis themselves might pitch in, “but we have to provide that leadership. We can’t just stand on the sidelines wringing our hands. I mean… ask the people of Paris what happened there with ISIS. Ask the people in Brussels what happened there with ISIS. What happened in Toronto? What’s happened in this country as a result of the threat from ISIS?”
Ultimately, everything turned on the willingness of America to bring order and stability out of chaos and confusion. Only the United States possessed the necessary combination of wisdom, competence, and strength. Here was a proposition to which Flournoy and Zinni readily assented.
With Alice in Washington
To participate in an exchange with these pillars of the Washington establishment was immensely instructive. Only nominally did their comments qualify as a debate. Despite superficial differences, the discussion was actually an exercise in affirming the theology of American national security — those essential matters of faith that define continuities of policy in Washington, whatever administration is in power.
In that regard, apparent disagreement on specifics masked a deeper consensus consisting of three elements:
* That ISIS represents something akin to an existential threat to the United States, the latest in a long line going back to the totalitarian ideologies of the last century; fascism and communism may be gone, but danger is ever present.
* That if the United States doesn’t claim ownership of the problem of Iraq, the prospects of “solving” it are nil; action or inaction by Washington alone, that is, determines the fate of the planet.
* That the exercise of leadership implies, and indeed requires, employing armed might; without a willingness to loose military power, global leadership is inconceivable.
In a fundamental respect, the purpose of the national security establishment, including the establishment media, is to shield that tripartite consensus from critical examination. This requires narrowing the aperture of analysis so as to exclude anything apart from the here-and-now. The discussion in which I participated provided a vehicle for doing just that. It was an exercise aimed at fostering collective amnesia.
So what the former secretary of defense, think tank CEO, and retired general chose not to say in fretting about ISIS is as revealing as what they did say. Here are some of the things they chose to overlook:
* ISIS would not exist were it not for the folly of the United States in invading — and breaking — Iraq in the first place; we created the vacuum that ISIS is now attempting to fill.
* U.S. military efforts to pacify occupied Iraq from 2003 to 2011 succeeded only in creating a decent interval for the United States to withdraw without having to admit to outright defeat; in no sense did “our” Iraq War end in anything remotely approximating victory, despite the already forgotten loss of thousands of American lives and the expenditure of trillions of dollars.
* For more than a decade and at very considerable expense, the United States has been attempting to create an Iraqi government that governs and an Iraqi army that fights; the results of those efforts speak for themselves: they have failed abysmally.
Now, these are facts. Acknowledging them might suggest a further conclusion: that anyone proposing ways for Washington to put things right in Iraq ought to display a certain sense of humility. The implications of those facts — behind which lies a policy failure of epic proportions — might even provide the basis for an interesting discussion on national television. But that would assume a willingness to engage in serious self-reflection. This, the culture of Washington does not encourage, especially on matters related to basic national security policy.
My own contribution to the televised debate was modest and ineffectual. Toward the end, the moderator offered me a chance to redeem myself. What, she asked, did I think about Panetta’s tribute to the indispensability of American leadership?
It was a fat pitch that I should have hit out of the park. Instead, I fouled it off. What I should have said was this: leadership ought to mean something other than simply repeating and compounding past mistakes. It should require more than clinging to policies that have manifestly failed. To remain willfully blind to those failures is not leadership, it’s madness.
Not that it would have mattered if I had. When it comes to Iraq, we’re already halfway back down Alice’s rabbit hole.
Andrew J. Bacevich, a TomDispatch regular, is writing a military history of America’s War for the Greater Middle East. His most recent book is Breach of Trust: How Americans Failed Their Soldiers and Their Country.
Copyright 2015 Andrew J. Bacevich
Washington in Wonderland
First, they tried to shoot the dogs. Next, they tried to poison them with strychnine. When both failed as efficient killing methods, British government agents and U.S. Navy personnel used raw meat to lure the pets into a sealed shed. Locking them inside, they gassed the howling animals with exhaust piped in from U.S. military vehicles. Then, setting coconut husks ablaze, they burned the dogs’ carcasses as their owners were left to watch and ponder their own fate.
The truth about the U.S. military base on the British-controlled Indian Ocean island of Diego Garcia is often hard to believe. It would be easy enough to confuse the real story with fictional accounts of the island found in the Transformers movies, on the television series 24, and in Internet conspiracy theories about the disappearance of Malaysia Airlines flight MH370.
While the grim saga of Diego Garcia frequently reads like fiction, it has proven all too real for the people involved. It’s the story of a U.S. military base built on a series of real-life fictions told by U.S. and British officials over more than half a century. The central fiction is that the U.S. built its base on an “uninhabited” island. That was “true” only because the indigenous people were secretly exiled from the Chagos Archipelago when the base was built. Although their ancestors had lived there since the time of the American Revolution, Anglo-American officials decided, as one wrote, to “maintain the fiction that the inhabitants of Chagos [were] not a permanent or semi-permanent population,” but just “transient contract workers.” The same official summed up the situation bluntly: “We are able to make up the rules as we go along.”
And so they did: between 1968 and 1973, American officials conspired with their British colleagues to remove the Chagossians, carefully hiding their expulsion from Congress, Parliament, the U.N., and the media. During the deportations, British agents and members of a U.S. Navy construction battalion rounded up and killed all those pet dogs. Their owners were then deported to the western Indian Ocean islands of Mauritius and the Seychelles, 1,200 miles from their homeland, where they received no resettlement assistance. More than 40 years after their expulsion, Chagossians generally remain the poorest of the poor in their adopted lands, struggling to survive in places that outsiders know as exotic tourist destinations.
During the same period, Diego Garcia became a multi-billion-dollar Navy and Air Force base and a central node in U.S. military efforts to control the Greater Middle East and its oil and natural gas supplies. The base, which few Americans are aware of, is more important strategically and more secretive than the U.S. naval base-cum-prison at Guantánamo Bay, Cuba. Unlike Guantánamo, no journalist has gotten more than a glimpse of Diego Garcia in more than 30 years. And yet, it has played a key role in waging the Gulf War, the 2003 invasion of Iraq, the U.S.-led war in Afghanistan, and the current bombing campaign against the Islamic State in Syria and Iraq.
Following years of reports that the base was a secret CIA “black site” for holding terrorist suspects and years of denials by U.S. and British officials, leaders on both sides of the Atlantic finally fessed up in 2008. “Contrary to earlier explicit assurances,” said Secretary of State for Foreign and Commonwealth Affairs David Miliband, Diego Garcia had indeed played at least some role in the CIA’s secret “rendition” program.
Last year, British officials claimed that flight log records, which might have shed light on those rendition operations, were “incomplete due to water damage” thanks to “extremely heavy weather in June 2014.” A week later, they suddenly reversed themselves, saying that the “previously wet paper records have been dried out.” Two months later, they insisted the logs had not dried out at all and were “damaged to the point of no longer being useful.” Except that the British government’s own weather data indicates that June 2014 was an unusually dry month on Diego Garcia. A legal rights advocate said British officials “could hardly be less credible if they simply said ‘the dog ate my homework.’”
And these are just a few of the fictions underlying the base that occupies the Chagossians’ former home and that the U.S. military has nicknamed the “Footprint of Freedom.” After more than four decades of exile, however, with a Chagossian movement to return to their homeland growing, the fictions of Diego Garcia may finally be crumbling.
The story of Diego Garcia begins in the late eighteenth century. At that time, enslaved peoples from Africa, brought to work on Franco-Mauritian coconut plantations, became the first settlers in the Chagos Archipelago. Following emancipation and the arrival of indentured laborers from India, a diverse mixture of peoples created a new society with its own language, Chagos Kreol. They called themselves the Ilois — the Islanders.
While still a plantation society, the archipelago, by then under British colonial control, provided a secure life featuring universal employment and numerous social benefits on islands described by many as idyllic. “That beautiful atoll of Diego Garcia, right in the middle of the ocean,” is how Stuart Barber described it in the late 1950s. A civilian working for the U.S. Navy, Barber would become the architect of one of the most powerful U.S. military bases overseas.
Amid Cold War competition with the Soviet Union, Barber and other officials were concerned that there was almost no U.S. military presence in and around the Indian Ocean. Barber noted that Diego Garcia’s isolation — halfway between Africa and Indonesia and 1,000 miles south of India — ensured that it would be safe from attack, yet was still within striking distance of territory from southern Africa and the Middle East to South and Southeast Asia.
Guided by Barber’s idea, the administrations of John F. Kennedy and Lyndon Johnson convinced the British government to detach the Chagos Archipelago from colonial Mauritius and create a new colony, which they called the British Indian Ocean Territory. Its sole purpose would be to house U.S. military facilities.
During secret negotiations with their British counterparts, Pentagon and State Department officials insisted that Chagos come under their “exclusive control (without local inhabitants),” embedding an expulsion order in a polite-looking parenthetical phrase. U.S. officials wanted the islands “swept” and “sanitized.” British officials appeared happy to oblige, removing a people one official called “Tarzans” and, in a racist reference to Robinson Crusoe, “Man Fridays.”
“Absolutely Must Go”
This plan was confirmed with an “exchange of notes” signed on December 30, 1966, by U.S. and British officials, as one of the State Department negotiators told me, “under the cover of darkness.” The notes effectively constituted a treaty but required no Congressional or Parliamentary approval, meaning that both governments could keep their plans hidden.
According to the agreement, the United States would gain use of the new colony “without charge.” This was another fiction. In confidential minutes, the United States agreed to secretly wipe out a $14 million British military debt, circumventing the need to ask Congress for funding. In exchange, the British agreed to take the “administrative measures” necessary for “resettling the inhabitants.”
Those measures meant that, after 1967, any Chagossians who left home for medical treatment or a routine vacation in Mauritius were barred from returning. Soon, British officials began restricting the flow of food and medical supplies to Chagos. As conditions deteriorated, more islanders began leaving. By 1970, the U.S. Navy had secured funding for what officials told Congress would be an “austere communications station.” They were, however, already planning to ask for additional funds to expand the facility into a much larger base. As the Navy’s Office of Communications and Cryptology explained, “The communications requirements cited as justification are fiction.” By the 1980s, Diego Garcia would become a billion-dollar garrison.
In briefing papers delivered to Congress, the Navy described Chagos’s population as “negligible,” with the islands “for all practical purposes… uninhabited.” In fact, there were around 1,000 people on Diego Garcia in the 1960s and 500 to 1,000 more on other islands in the archipelago. With Congressional funds secured, the Navy’s highest-ranking admiral, Elmo Zumwalt, summed up the Chagossians’ fate in a 1971 memo of exactly three words: “Absolutely must go.”
The authorities soon ordered the remaining Chagossians — generally allowed no more than a single box of belongings and a sleeping mat — onto overcrowded cargo ships destined for Mauritius and the Seychelles. By 1973, the last Chagossians were gone.
At their destinations, most of the Chagossians were literally left on the docks, homeless, jobless, and with little money. In 1975, two years after the last removals, a Washington Post reporter found them living in “abject poverty.”
Aurélie Lisette Talate was one of the last to go. “I came to Mauritius with six children and my mother,” she told me. “We got our house… but the house didn’t have a door, didn’t have running water, didn’t have electricity. And then my children and I began to suffer. All my children started getting sick.”
Within two months, two of her children were dead. The second was buried in an unmarked grave because she lacked money for a proper burial. Aurélie experienced fainting spells herself and couldn’t eat. “We were living like animals. Land? We had none… Work? We had none. Our children weren’t going to school.”
Today, most Chagossians, who now number more than 5,000, remain impoverished. In their language, their lives are ones of lamizer (impoverished misery) and sagren (profound sorrow and heartbreak over being exiled from their native lands). Many of the islanders attribute sickness and even death to sagren. “I had something that had been affecting me for a long time, since we were uprooted,” was the way Aurélie explained it to me. “This sagren, this shock, it was this same problem that killed my child. We weren’t living free like we did in our natal land.”
Struggling for Justice
From the moment they were deported, the Chagossians demanded to be returned or at least properly resettled. After years of protest, including five hunger strikes led by women like Aurélie Talate, some in Mauritius received the most modest of compensation from the British government: small concrete houses, tiny plots of land, and about $6,000 per adult. Many used the money to pay off large debts they had accrued. For most, conditions improved only marginally. Those living in the Seychelles received nothing.
The Chagossian struggle was reinvigorated in 1997 with the launching of a lawsuit against the British government. In November 2000, the British High Court ruled the removal illegal. In 2001 and 2002, most Chagossians joined new lawsuits in both American and British courts demanding the right to return and proper compensation for their removal and for resettling their islands. The U.S. suit was ultimately dismissed on the grounds that the judiciary can’t, in most circumstances, overrule the executive branch on matters of military and foreign policy. In Britain, the Chagossians were more successful. In 2002, they secured the right to full U.K. citizenship. Over 1,000 Chagossians have since moved to Britain in search of better lives. Twice more, British courts ruled in the people’s favor, with judges calling the government’s behavior “repugnant” and an “abuse of power.”
On the government’s final appeal, however, Britain’s then highest court, the Law Lords in the House of Lords, upheld the exile in a 3-2 decision. The Chagossians appealed to the European Court of Human Rights to overturn the ruling.
A Green Fiction
Before the European Court could rule, the British government announced the creation of the world’s largest Marine Protected Area (MPA) in the Chagos Archipelago. The date of the announcement, April Fool’s Day 2010, should have been a clue that there was more than environmentalism behind the move. The MPA banned commercial fishing and limited other human activity in the archipelago, endangering the viability of any resettlement efforts.
And then came WikiLeaks. In December 2010, it released a State Department cable from the U.S. Embassy in London quoting a senior Foreign and Commonwealth Office official saying that the “former inhabitants would find it difficult, if not impossible, to pursue their claim for resettlement on the islands if the entire Chagos Archipelago were a marine reserve.” U.S. officials agreed. According to the Embassy, Political Counselor Richard Mills wrote, “Establishing a marine reserve might, indeed… be the most effective long-term way to prevent any of the Chagos Islands’ former inhabitants or their descendants from resettling.”
Not surprisingly, the main State Department concern was whether the MPA would affect base operations. “We are concerned,” the London Embassy noted, that some “would come to see the existence of a marine reserve as inherently inconsistent with the military use of Diego Garcia.” British officials assured the Americans there would be “no constraints on military operations.”
Although the European Court of Human Rights ultimately ruled against the Chagossians in 2013, this March, a U.N. tribunal found that the British government had violated international law in creating the Marine Protected Area. Next week, Chagossians will challenge the MPA and their expulsion before the British Supreme Court (now Britain’s highest) armed with the U.N. ruling and revelations that the government won its House of Lords decision with the help of a fiction-filled resettlement study.
Meanwhile, the European Parliament has passed a resolution calling for the Chagossians’ return, the African Union has condemned their deportation as unlawful, three Nobel laureates have spoken out on their behalf, and dozens of members of the British Parliament have joined a group supporting their struggle. In January, a British government “feasibility study” found no significant legal barriers to resettling the islands and outlined several possible resettlement plans, beginning with Diego Garcia. (Notably, Chagossians are not calling for the removal of the U.S. military base. Their opinions about it are diverse and complicated. At least some would prefer jobs on the base to lives of poverty and unemployment in exile.)
Of course, no study was needed to know that resettlement on Diego Garcia and in the rest of the archipelago is feasible. The base, which has hosted thousands of military and civilian personnel for more than 40 years, has demonstrated that well enough. In fact, Stuart Barber, its architect, came to the same conclusion in the years before his death. After he learned of the Chagossians’ fate, he wrote a series of impassioned letters to Human Rights Watch and the British Embassy in Washington, among others, imploring them to help the Chagossians return home. In a letter to Alaska Senator Ted Stevens, he said bluntly that the expulsion “wasn’t necessary militarily.”
In a 1991 letter to the Washington Post, Barber suggested that it was time “to redress the inexcusably inhuman wrongs inflicted by the British at our insistence.” He added, “Substantial additional compensation for 18-25 past years of misery for all evictees is certainly in order. Even if that were to cost $100,000 per family, we would be talking of a maximum of $40-50 million, modest compared with our base investment there.”
Almost a quarter-century later, nothing has yet been done. In 2016, the initial 50-year agreement for Diego Garcia will expire. While it is subject to an automatic 20-year renewal, it provides for a two-year renegotiation period, which commenced in late 2014. With momentum building in support of the Chagossians, they are optimistic that the two governments will finally correct this historic injustice. That U.S. officials allowed the British feasibility study to consider resettlement plans for Diego Garcia is a hopeful sign that Anglo-American policy may finally be shifting to right a great wrong in the Indian Ocean.
Unfortunately, Aurélie Talate will never see the day when her people go home. Like others among the rapidly dwindling number of Chagossians born in the archipelago, Aurélie died in 2012 at age 70, succumbing to the heartbreak that is sagren.
David Vine, a TomDispatch regular, is associate professor of anthropology at American University in Washington, D.C. His new book, Base Nation: How U.S. Military Bases Abroad Harm America and the World will be published in August as part of the American Empire Project (Metropolitan Books). He is also the author of Island of Shame: The Secret History of the U.S. Military Base on Diego Garcia. He has written for the New York Times, the Washington Post, the Guardian, and Mother Jones, among other publications. For more of his writing, visit www.davidvine.net.
Copyright 2015 David Vine
The Truth About Diego Garcia
En route back to Washington at the tail end of his most recent overseas trip, John Kerry, America’s peripatetic secretary of state, stopped off in France “to share a hug with all of Paris.” Whether Paris reciprocated the secretary’s embrace went unrecorded.
Despite the requisite reference to General Pershing (“Lafayette, we are here!”) and flying James Taylor in from the 1960s to assure Parisians that “You’ve Got a Friend,” in the annals of American diplomacy Kerry’s hug will likely rank with President Eisenhower’s award of the Legion of Merit to Nicaraguan dictator Anastasio Somoza for “exceptionally meritorious conduct” and Jimmy Carter’s acknowledgment of the “admiration and love” said to define the relationship between the Iranian people and their Shah. In short, it was a moment best forgotten.
Not one of the signature foreign policy initiatives conceived in Obama’s first term has borne fruit. When it came to making a fresh start with the Islamic world, responsibly ending the “dumb” war in Iraq (while winning the “necessary” one in Afghanistan), “resetting” U.S.-Russian relations, and “pivoting” toward Asia, mark your scorecard 0 for 4.
There’s no doubt that when Kerry arrived at the State Department he brought with him some much-needed energy. That he is giving it his all — the department’s website reports that the secretary has already clocked over 682,000 miles of travel — is doubtless true as well. The problem is the absence of results. Remember when his signature initiative was going to be an Israeli-Palestinian peace deal? Sadly, that quixotic plan, too, has come to naught.
Yes, Team Obama “got” bin Laden. And, yes, it deserves credit for abandoning a self-evidently counterproductive 50-plus-year-old policy toward Cuba and for signing a promising agreement with China on climate change. That said, the administration’s overall record of accomplishment is beyond thin, starting with that first-day-in-the-Oval-Office symbol that things were truly going to be different: Obama’s order to close Guantanamo. That, of course, remains a work in progress (despite regular reassurances of light glimmering at the end of what has become a very long tunnel).
In fact, taking the president’s record as a whole, noting that on his watch occasional U.S. drone strikes have become routine, the Nobel Committee might want to consider revoking its Peace Prize.
Nor should we expect much in the time that Obama has remaining. Perhaps there is a deal with Iran waiting in the wings (along with the depth charge of ever-fiercer congressionally mandated sanctions), but signs of intellectual exhaustion are distinctly in evidence.
“Where there is no vision,” the Hebrew Bible tells us, “the people perish.” There’s no use pretending: if there’s one thing the Obama administration most definitely has not got and has never had, it’s a foreign policy vision.
In Search of Truly Wise (White) Men — Only Those 84 or Older Need Apply
All of this evokes a sense of unease, even consternation bordering on panic, in circles where members of the foreign policy elite congregate. Absent visionary leadership in Washington, they have persuaded themselves, we’re all going down. So the world’s sole superpower and self-anointed global leader needs to get game — and fast.
Leslie Gelb, former president of the Council on Foreign Relations, recently weighed in with a proposal for fixing the problem: clean house. Obama has surrounded himself with fumbling incompetents, Gelb charges. Get rid of them and bring in the visionaries.
Writing at the Daily Beast, Gelb urges the president to fire his entire national security team and replace them with “strong and strategic people of proven foreign policy experience.” Translation: the sort of people who sip sherry and nibble on brie in the august precincts of the Council on Foreign Relations. In addition to offering his own slate of nominees, including several veterans of the storied George W. Bush administration, Gelb suggests that Obama consult regularly with Henry Kissinger, Brent Scowcroft, Zbigniew Brzezinski, and James Baker. These distinguished war-horses range in age from 84 to 91. By implication, only white males born prior to World War II are eligible for induction into the ranks of the Truly Wise Men.
Anyway, Gelb emphasizes, Obama needs to get on with it. With the planet awash in challenges that “imperil our very survival,” there is simply no time to waste.
At best, Gelb’s got it half right. When it comes to foreign policy, this president has indeed demonstrated a knack for surrounding himself with lackluster lieutenants. That statement applies equally to national security adviser Susan Rice (and her predecessor), to Secretary of State Kerry (and his predecessor), and to outgoing Pentagon chief Chuck Hagel. Ashton Carter, the technocrat slated to replace Hagel as defense secretary, comes from the same mold.
They are all “seasoned” — in Washington, a euphemism for bland, conventional, and utterly unimaginative — charter members of the Rogers-Christopher school of American statecraft. (That may require some unpacking, so pretend you’re on Jeopardy. Alex Trebek: “Two eminently forgettable and completely forgotten twentieth-century secretaries of state.” You, hitting the buzzer: “Who were William Rogers and Warren Christopher?” “Correct!”)
Members of Obama’s national security team worked long and hard to get where they are. Yet along the way — perhaps from absorbing too many position papers, PowerPoint briefings, and platitudes about “American global leadership” — they lost whatever creative spark once endowed them with the appearance of talent and promise. Ambition, unquestioned patriotism, and a capacity for putting in endless hours (and enduring endless travel) — all these remain. But a serious conception of where the world is heading and what that implies for basic U.S. policy? Individually and collectively, they are without a clue.
I submit that maybe that’s okay, that plodding mediocrity can be a boon if, as at present, the alternatives on offer look even worse.
A Hug for Obama
You want vision? Obama’s predecessor surrounded himself with visionaries. Dick Cheney, Condoleezza Rice, Donald Rumsfeld, and Paul Wolfowitz, products of the Cold War one and all, certainly fancied themselves large-bore strategic thinkers. Busily positioning the United States to run (just another “i” and you have “ruin”) the world, they were blindsided by 9/11. Unembarrassed and unchastened by this disaster, they initiated a series of morally dubious, strategically boneheaded moves that were either (take your pick) going to spread freedom and democracy or position the United States to exercise permanent dominion. The ensuing Global War on Terror did neither, of course, while adding trillions to the national debt and helping fracture great expanses of the planet. Obama is still, however ineffectually, trying to clean up the mess they created.
If that’s what handing the keys to big thinkers gets you, give me Susan Rice any day. Although Obama’s “don’t do stupid shit” may never rank with Washington’s Farewell Address or the Monroe Doctrine in the history books, George W. Bush might have profited from having some comparable axiom taped to his laptop.
Big ideas have their place — indeed, are essential — when the issues at hand are clearly defined. The Fall of France in 1940 was one such moment, which President Franklin D. Roosevelt recognized. So too, arguably, was the period immediately after World War II. The defeat of Nazi Germany and Imperial Japan had left a dangerous power vacuum in both Europe and the Pacific to which George Marshall, Dean Acheson, and their compatriots forged a necessary response. Perhaps the period 1968-1969 falls into that same category, the debacle of Vietnam requiring a major adjustment in U.S. Cold War strategy. This Richard Nixon and Henry Kissinger undertook with their opening to China.
Yet despite the overwrought claims of Gelb (and others) that America’s very survival is today at risk, the present historical moment lacks comparable clarity. Ours is not a time when we face a single overarching threat. Instead, on several different fronts, worrisome developments are brewing. Environmental degradation, the rise of China and other emerging powers, the spread of radical Islam, the precarious state of the global economy, vulnerabilities that are an inevitable byproduct of our pursuit of a cyber-utopia: all of these bear very careful watching. Each one today should entail a defensive response, the United States protecting itself (and its allies) against worst-case outcomes. But none of these at the present moment justifies embarking upon a let-out-all-the-stops offensive. Chasing after one problem would necessarily divert attention from the rest.
The immediate future remains too opaque to say with certainty which threat will turn out to pose the greatest danger, whether in the next year or the next decade — and which might even end up not being a threat at all but an unexpected opportunity. Conditions are not ripe for boldness. The abiding imperative of the moment is to discern, which requires careful observation and patience. In short, forget about strategy.
And there’s a further matter. Correct discernment assumes a proper vantage point. What you see depends on where you sit and which way you’re facing. Those who inhabit the upper ranks of the Obama administration (and those whom Leslie Gelb offers as replacements) sit somewhere back in the twentieth century, their worldview shaped by memories of Munich and Yalta, Korea and Vietnam, the Cuban Missile Crisis and the Berlin Wall, none of which retain more than tangential relevance to the present day.
You want vision? That will require a new crop of visionaries. Instead of sitting down with ancients like Kissinger, Scowcroft, Brzezinski, or Baker, this president (or his successor) would be better served to pick the brain of the army captain back from multiple combat tours in Iraq and Afghanistan, the moral theologian specializing in inter-religious dialog, the Peace Corps volunteer who spent the last two years in West Africa, and the Silicon Valley entrepreneur best able to spell out the political implications of the next big thing.
In short, a post-twentieth century vision requires a post-twentieth century generation, able to free itself from old shibboleths to which Leslie Gelb and most of official Washington today remain stubbornly dedicated. That generation waits in the wings and after another presidential election or two may indeed wield some influence. We should hope so. In the meantime, we should bide our time, amending the words of the prophet to something like: “Where there is no vision, the people muddle along and await salvation.”
So as Obama and his team muddle toward their finish line, their achievements negligible, we might even express a modicum of gratitude. When they depart the scene, we will forget the lot of them. Yet at least they managed to steer clear of truly epic disasters. When muddling was the best Washington had on offer, they delivered. They may even deserve a hug.
Andrew J. Bacevich, a TomDispatch regular, is writing a military history of America’s War for the Greater Middle East. His most recent book is Breach of Trust: How Americans Failed Their Soldiers and Their Country.
Copyright 2015 Andrew Bacevich
Save Us From Washington’s Visionaries
The abiding defect of U.S. foreign policy? It’s isolationism, my friend. Purporting to steer clear of war, isolationism fosters it. Isolationism impedes the spread of democracy. It inhibits trade and therefore prosperity. It allows evildoers to get away with murder. Isolationists prevent the United States from accomplishing its providentially assigned global mission. Wean the American people from their persistent inclination to look inward and who knows what wonders our leaders will accomplish.
The United States has been at war for well over a decade now, with U.S. attacks and excursions in distant lands having become as commonplace as floods and forest fires. Yet during the recent debate over Syria, the absence of popular enthusiasm for opening up another active front evoked expressions of concern in Washington that Americans were once more turning their backs on the world.
As he was proclaiming the imperative of punishing the government of Bashar al-Assad, Secretary of State John Kerry also chided skeptical members of the Senate Foreign Relations Committee that “this is not the time for armchair isolationism.” Commentators keen to have a go at the Syrian autocrat wasted little time in expanding on Kerry’s theme.
Reflecting on “where isolationism leads,” Jennifer Rubin, the reliably bellicose Washington Post columnist, was quick to chime in, denouncing those hesitant to initiate another war as “infantile.” American isolationists, she insisted, were giving a green light to aggression. Any nation that counted on the United States for protection had now become a “sitting duck,” with “Eastern Europe [and] neighbors of Venezuela and Israel” among those left exposed and vulnerable. News reports of Venezuelan troop movements threatening Brazil, Colombia, or Guyana were notably absent from the Post or any other media outlet, but no matter — you get the idea.
Military analyst Frederick Kagan was equally troubled. Also writing in the Post, he worried that “the isolationist narrative is rapidly becoming dominant.” His preferred narrative emphasized the need for ever greater military exertions, with Syria just the place to launch a new campaign. For Bret Stephens, a columnist with the Wall Street Journal, the problem was the Republican Party. Where had the hawks gone? The Syria debate, he lamented, was “exposing the isolationist worm eating its way through the GOP apple.”
The Journal’s op-ed page also gave the redoubtable Norman Podhoretz, not only still alive but vigorously kicking, a chance to vent. Unmasking President Obama as “a left-wing radical” intent on “reduc[ing] the country’s power and influence,” the unrepentant neoconservative accused the president of exploiting the “war-weariness of the American people and the rise of isolationist sentiment… on the left and right” to bring about “a greater diminution of American power than he probably envisaged even in his wildest radical dreams.”
Obama escalated the war in Afghanistan, “got” Osama bin Laden, toppled one Arab dictator in Libya, and bashed and bombed targets in Somalia, Yemen, Pakistan, and elsewhere. Even so, it turns out he is actually part of the isolationist conspiracy to destroy America!
Over at the New York Times, similar concerns, even if less hysterically expressed, prevailed. According to Times columnist Roger Cohen, President Obama’s reluctance to pull the trigger showed that he had “deferred to a growing isolationism.” Bill Keller concurred. “America is again in a deep isolationist mood.” In a column entitled “Our New Isolationism,” he decried “the fears and defeatist slogans of knee-jerk isolationism” that were impeding military action. (For Keller, the proper antidote to isolationism is amnesia. As he put it, “Getting Syria right starts with getting over Iraq.”)
For his part, Times staff writer Sam Tanenhaus contributed a bizarre two-minute exercise in video agitprop — complete with faked scenes of the Japanese attacking Pearl Harbor — that slapped the isolationist label on anyone opposing entry into any war whatsoever, or tiring of a war gone awry, or proposing that America go it alone.
When the “New Isolationism” Was New
Most of this, of course, qualifies as overheated malarkey. As a characterization of U.S. policy at any time in memory, isolationism is a fiction. Never really a tendency, it qualifies at most as a moment, referring to that period in the 1930s when large numbers of Americans balked at the prospect of entering another European war, the previous one having fallen well short of its “War To End All Wars” advance billing.
In fact, from the day of its founding down to the present, the United States has never turned its back on the world. Isolationism owes its storied history to its value as a rhetorical device, deployed to discredit anyone opposing an action or commitment (usually involving military forces) that others happen to favor. If I, a grandson of Lithuanian immigrants, favor deploying U.S. forces to Lithuania to keep that NATO ally out of Vladimir Putin’s clutches and you oppose that proposition, then you, sir or madam, are an “isolationist.” Presumably, Jennifer Rubin will see things my way and lend her support to shoring up Lithuania’s vulnerable frontiers.
For this very reason, the term isolationism is not likely to disappear from American political discourse anytime soon. It’s too useful. Indeed, employ this verbal cudgel to castigate your opponents and your chances of gaining entrée to the nation’s most prestigious publications improve appreciably. Warn about the revival of isolationism and your prospects of making the grade as a pundit or candidate for high office suddenly brighten. This is the great thing about using isolationists as punching bags: it makes actual thought unnecessary. All that’s required to posture as a font of wisdom is the brainless recycling of clichés, half-truths, and bromides.
No publication is more likely to welcome those clichés, half-truths, and bromides than the New York Times. There, isolationism always looms remarkably large, its arrival forever just around the corner.
In July 1942, the New York Times Magazine opened its pages to Vice President Henry A. Wallace, who sounded the alarm about the looming threat of what he styled a “new isolationism.” This was in the midst of World War II, mind you.
After the previous world war, the vice president wrote, the United States had turned inward. As summer follows spring, “the choice led up to this present war.” Repeat the error, Wallace warned, and “the price will be more terrible and will be paid much sooner.” The world was changing and it was long past time for Americans to get with the program. “The airplane, the radio, and modern technology have bound the planet so closely together that what happens anywhere on the planet has a direct effect everywhere else.” In a world that had “suddenly become so small,” he continued, “we cannot afford to resume the role of hermit.”
The implications for policy were self-evident:
“This time, then, we have only one real choice. We must play a responsible part in the world — leading the way in world progress, fostering a healthy world trade, helping to protect the world’s peace.”
One month later, it was Archibald MacLeish’s turn. On August 16, 1942, the Times magazine published a long essay of his under the title of — wouldn’t you know it — “The New Isolationism.” For readers in need of coaching, Times editors inserted this seal of approval before the text: “There is great pertinence in the following article.”
A well-known poet, playwright, and literary gadfly, MacLeish was at the time serving as Librarian of Congress. From this bully pulpit, he offered the reassuring news that “isolationism in America is dead.” Unfortunately, like zombies, “old isolationists never really die: they merely dig in their toes in a new position. And the new position, whatever name is given it, is isolation still.”
Fortunately, the American people were having none of it. They had “recaptured the current of history and they propose to move with it; they don’t mean to be denied.” MacLeish’s fellow citizens knew what he knew: “that there is a stirring in our world…, a forward thrusting and overflowing human hope of the human will which must be given a channel or it will dig a channel itself.” In effect, MacLeish was daring the isolationists, in whatever guise, to stand in the way of this forward thrusting and overflowing hopefulness. Presumably, they would either drown or be crushed.
The end of World War II found the United States donning the mantle of global leadership, much as Wallace, MacLeish, and the Times had counseled. World peace did not ensue. Instead, a host of problems continued to afflict the planet, with isolationists time and again fingered as the culprits impeding their solution.
The Gift That Never Stops Giving
In June 1948, with a notable absence of creativity in drafting headlines, the Times once again found evidence of “the new isolationism.” In an unsigned editorial, the paper charged that an American penchant for hermit-like behavior was “asserting itself again in a manner that is both distressing and baffling.” With the Cold War fully joined and U.S. forces occupying Germany, Japan, and other countries, the Times worried that some Republicans in Congress appeared reluctant to fund the Marshall Plan.
From their offices in Manhattan, members of the Times editorial board detected in some quarters “a homesickness for the old days.” It was incumbent upon Americans to understand that “the time is past when we could protect ourselves easily behind our barriers behind the seas.” History was summoning the United States to lead the world: “The very success of our democracy has now imposed duties upon us which we must fulfill if that democracy is to survive.” Those entertaining contrary views, the Times huffed, “do not speak for the American people.”
That very month, Josef Stalin announced that the Soviet Union was blockading Berlin. The U.S. responded not by heading for the exits but by initiating a dramatic airlift. Oh, and Congress fully funded the Marshall Plan.
Barely a year later, in August 1949, with Stalin having just lifted the Berlin Blockade, Times columnist Arthur Krock discerned another urge to disengage. In a piece called “Chickens Usually Come Home,” he cited congressional reservations about the recently promulgated Truman Doctrine as evidence of, yes, a “new isolationism.” As it happened, Congress duly appropriated the money President Truman was requesting to support Greece and Turkey against the threat of communism — as it would support similar requests to throw arms and money at other trouble spots like French Indochina.
Even so, in November of that year, the Times magazine published yet another warning about “the challenge of a new isolationism.” The author was Illinois Governor Adlai Stevenson, then positioning himself for a White House run. Like many another would-be candidate before and since, Stevenson took the preliminary step of signaling his opposition to the I-word.
World War II, he wrote, had “not only destroyed fascism abroad, but a lot of isolationist notions here at home.” War and technological advance had “buried the whole ostrich of isolation.” At least it should have. Unfortunately, some Republicans hadn’t gotten the word. They were “internationally minded in principle but not in practice.” Stevenson feared that when the chips were down such head-in-the-sand inclinations might come roaring back. This he was determined to resist. “The eagle, not the ostrich,” he proclaimed, “is our national emblem.”
In August 1957, the Times magazine was at it once again, opening its pages to another Illinois Democrat, Senator Paul Douglas, for an essay familiarly entitled “A New Isolationism — Ripples or Tide?” Douglas claimed that “a new tide of isolationism is rising in the country.” U.S. forces remained in Germany and Japan, along with Korea, where they had recently fought a major war. Even so, the senator worried that “the internationalists are tiring rapidly now.”
Americans needed to fortify themselves by heeding the message of the Gospels: “Let the spirit of the Galilean enter our worldly and power-obsessed hearts.” In other words, the senator’s prescription for American statecraft was an early version of What Would Jesus Do? Was Jesus Christ an advocate of American global leadership? Senator Douglas apparently thought so.
Then came Vietnam. By May 1970, even Times-men were showing a little of that fatigue. That month, star columnist James Reston pointed (yet again) to the “new isolationism.” Yet in contrast to the paper’s scribblings on the subject over the previous three decades, Reston didn’t decry it as entirely irrational. The war had proven to be a bummer and “the longer it goes on,” he wrote, “the harder it will be to get public support for American intervention.” Washington, in other words, needed to end its misguided war if it had any hopes of repositioning itself to start the next one.
A Concept Growing Long in the Tooth
By 1980, the Times showed signs of recovering from its brief Vietnam funk. In a review of Norman Podhoretz’s The Present Danger, for example, the noted critic Anatole Broyard extolled the author’s argument as “dispassionate,” “temperate,” and “almost commonsensical.”
The actual text was none of those things. What the pugnacious Podhoretz called — get ready for it — “the new isolationism” was, in his words, “hard to distinguish from simple anti-Americanism.” Isolationists — anyone who had opposed the Vietnam War on whatever grounds — believed that the United States was “a force for evil, a menace, a terror.” Podhoretz detected a “psychological connection” between “anti-Americanism, isolationism, and the tendency to explain away or even apologize for anything the Soviet Union does, no matter how menacing.” It wasn’t bad enough that isolationists hated their country, they were, it seems, commie symps to boot.
Fast forward a decade, and — less than three months after U.S. troops invaded Panama — Times columnist Flora Lewis sensed a resurgence of you-know-what. In a February 1990 column, she described “a convergence of right and left” with both sides “arguing with increasing intensity that it’s time for the U.S. to get off the world.” Right-wingers saw that world as too nasty to save; left-wingers, the United States as too nasty to save it. “Both,” she concluded (of course), were “moving toward a new isolationism.”
Five months later, Saddam Hussein sent his troops into Kuwait. Instead of getting off the world, President George H.W. Bush deployed U.S. combat forces to defend Saudi Arabia. For Joshua Muravchik, however, merely defending that oil-rich kingdom wasn’t nearly good enough. Indeed, here was a prime example of the “New Isolationism, Same Old Mistake,” as his Times op-ed was entitled.
The mistake was to flinch from instantly ejecting Saddam’s forces. Although opponents of a war against Iraq did not “see themselves as isolationists, but as realists,” he considered this a distinction without a difference. Muravchik, who made his living churning out foreign policy analysis for various Washington think tanks, favored “the principle of investing America’s power in the effort to fashion an environment congenial to our long-term safety.” War, he firmly believed, offered the means to fashion that congenial environment. Should America fail to act, he warned, “our abdication will encourage such threats to grow.”
Of course, the United States did act and the threats grew anyway. In and around the Middle East, the environment continued to be thoroughly uncongenial. Still, in Times-world, the American penchant for doing too little rather than too much remained the eternal problem, eternally “new.” An op-ed by up-and-coming journalist James Traub appearing in the Times in December 1991, just months after a half-million U.S. troops had liberated Kuwait, was typical. Assessing the contemporary political scene, Traub detected “a new wave of isolationism gathering force.” Traub was undoubtedly establishing his bona fides. (Soon after, he landed a job working for the paper.)
This time, according to Traub, the problem was the Democrats. No longer “the party of Wilson or of John F. Kennedy,” Democrats, he lamented, “aspire[d] to be the party of middle-class frustrations — and if that entails turning your back on the world, so be it.” The following year Democrats nominated as their presidential candidate Bill Clinton, who insisted that he would never under any circumstances turn his back on the world. Even so, no sooner did Clinton win than Times columnist Leslie Gelb was predicting that the new president would “fall into the trap of isolationism and policy passivity.”
Get Me Rewrite!
Arthur Schlesinger defined the problem in broader terms. The famous historian and Democratic Party insider had weighed in early on the matter with a much-noted essay that appeared in The Atlantic Monthly back in 1952. He called it — you guessed it — “The New Isolationism.”
In June 1994, more than 40 years later, with the Cold War now finally won, Schlesinger was back for more with a Times op-ed that sounded the usual alarm. “The Cold War produced the illusion that traditional isolationism was dead and buried,” he wrote, but of course — this is, after all, the Times — it was actually alive and kicking. The passing of the Cold War had “weakened the incentives to internationalism” and was giving isolationists a new opening, even though in “a world of law requiring enforcement,” it was incumbent upon the United States to be the lead enforcer.
The warning resonated. Although the Times does not normally give commencement addresses much attention, it made an exception for Madeleine Albright’s remarks to graduating seniors at Barnard College in May 1995. The U.S. ambassador to the United Nations had detected what she called “a trend toward isolationism that is running stronger in America than at any time since the period between the two world wars,” and the American people were giving in to the temptation “to pull the covers up over our heads and pretend we do not notice, do not care, and are unaffected by events overseas.” In other circumstances, it might have seemed an odd claim, given that the United States had just wrapped up armed interventions in Somalia and Haiti and was on the verge of initiating a bombing campaign in the Balkans.
Still, Schlesinger had Albright’s back. The July/August 1995 issue of Foreign Affairs prominently featured an article of his entitled “Back to the Womb? Isolationism’s Renewed Threat,” with Times editors publishing a CliffsNotes version on the op-ed page a month earlier. “The isolationist impulse has risen from the grave,” Schlesinger announced, “and it has taken the new form of unilateralism.”
His complaint was no longer that the United States hesitated to act, but that it did not act in concert with others. This “neo-isolationism,” he warned, introducing a new note into the decades-old tradition of isolationism-bashing, “promises to prevent the most powerful nation on the planet from playing any role in enforcing the peace system.” The isolationists were winning — this time through pure international belligerence. Yet “as we return to the womb,” Schlesinger warned his fellow citizens, “we are surrendering a magnificent dream.”
Other Times contributors shared Schlesinger’s concern. On January 30, 1996, the columnist Russell Baker chipped in with a piece called “The New Isolationism.” For those slow on the uptake, Jessica Mathews, then a fellow at the Council on Foreign Relations, affirmed Baker’s concerns by publishing an identically titled column in the Washington Post a mere six days later. Mathews reported “troubling signs that the turning inward that many feared would follow the Cold War’s end is indeed happening.” With both the Times and the Post concurring, “the new isolationism” had seemingly reached pandemic proportions (as a title, if nothing else).
Did the “new” isolationism then pave the way for 9/11? Was al-Qaeda inspired by an unwillingness on Washington’s part to insert itself into the Islamic world?
Unintended and unanticipated consequences stemming from prior U.S. interventions might have seemed to offer a better explanation. But this much is for sure: as far as the Times was concerned, even in the midst of George W. Bush’s Global War on Terror, the threat of isolationism persisted.
In January 2004, David M. Malone, president of the International Peace Academy, worried in a Times op-ed “that the United States is retracting into itself” — this despite the fact that U.S. forces were engaged in simultaneous wars in Iraq and Afghanistan. Among Americans, a concern about terrorism, he insisted, was breeding “a sense of self-obsession and indifference to the plight of others.” “When Terrorists Win: Beware America’s New Isolationism,” blared the headline of Malone’s not-so-new piece.
Actually, Americans should beware those who conjure up phony warnings of a “new isolationism” to advance a particular agenda. The essence of that agenda, whatever the particulars and however packaged, is this: If the United States just tries a little bit harder — one more intervention, one more shipment of arms to a beleaguered “ally,” one more line drawn in the sand — we will finally turn the corner and the bright uplands of peace and freedom will come into view.
This is a delusion, of course. But if you write a piece exposing that delusion, don’t bother submitting it to the Times.
Andrew J. Bacevich is a professor of history and international relations at Boston University. His new book is Breach of Trust: How Americans Failed Their Soldiers and Their Country.
Copyright 2013 Andrew Bacevich