“The United States of Amnesia.” That’s what Gore Vidal once called us. We remember what we find it convenient to remember and forget everything else. That forgetfulness especially applies to the history of others. How could their past, way back when, have any meaning for us today? Well, it just might. Take the European conflagration of 1914-1918, for example.
You may not have noticed. There’s no reason why you should have, fixated as we all are on the daily torrent of presidential tweets and the flood of mindless rejoinders they elicit. But let me note for the record that the centenary of the conflict once known as The Great War is well underway and before the present year ends will have concluded.
Indeed, a hundred years ago this month, the 1918 German Spring Offensive — codenamed Operation Michael — was sputtering to an unsuccessful conclusion. A last desperate German gamble, aimed at shattering Allied defenses and gaining a decisive victory, had fallen short. In early August of that year, with large numbers of our own doughboys now on the front lines, a massive Allied counteroffensive was to commence, continuing until the eleventh hour of the eleventh day of the eleventh month, when an armistice finally took effect and the guns fell silent.
In the years that followed, Americans demoted The Great War. It became World War I, vaguely related to but overshadowed by the debacle next in line, known as World War II. Today, the average citizen knows little about that earlier conflict other than that it preceded and somehow paved the way for an even more brutal bloodletting. Also, on both occasions, the bad guys spoke German.
So, among Americans, the war of 1914-1918 became a neglected stepsister of sorts, perhaps in part because the United States only got around to suiting up for that conflict about halfway through the fourth quarter. With the war of 1939-1945 having been sacralized as the moment when the Greatest Generation saved humankind, the war-formerly-known-as-The-Great-War collects dust in the bottom drawer of American collective consciousness.
From time to time, some politician or newspaper columnist will resurrect the file labeled “August 1914,” the grim opening weeks of that war, and sound off about the dangers of sleepwalking into a devastating conflict that nobody wants or understands. Indeed, with Washington today having become a carnival of buncombe so sublimely preposterous that even that great journalistic iconoclast H.L. Mencken might have been struck dumb, ours is perhaps an apt moment for just such a reminder.
Yet a different aspect of World War I may possess even greater relevance to the American present. I’m thinking of its duration: the longer it lasted, the less sense it made. But on it went, impervious to human control like the sequence of Biblical plagues that God had inflicted on the ancient Egyptians.
So the relevant question for our present American moment is this: once it becomes apparent that a war is a mistake, why would those in power insist on its perpetuation, regardless of costs and consequences? In short, when getting in turns out to have been a bad idea, why is getting out so difficult, even (or especially) for powerful nations that presumably should be capable of exercising choice on such matters? Or more bluntly, how did the people in charge during The Great War get away with inflicting such extraordinary damage on the nations and peoples for which they were responsible?
For those countries that endured World War I from start to finish — especially Great Britain, France, and Germany — specific circumstances provided their leaders with an excuse for suppressing second thoughts about the cataclysm they had touched off.
Among them were:
* mostly compliant civilian populations deeply loyal to some version of King and Country, further kept in line by unremitting propaganda that minimized dissent;
* draconian discipline — deserters and malingerers faced firing squads — that maintained order in the ranks (most of the time) despite the unprecedented scope of the slaughter;
* the comprehensive industrialization of war, which ensured a seemingly endless supply of the weaponry, munitions, and other equipment necessary for outfitting mass conscript armies and replenishing losses as they occurred.
Economists would no doubt add sunk costs to the mix. With so much treasure already squandered and so many lives already lost, the urge to press on a bit longer in hopes of salvaging at least some meager benefit in return for what (and who) had been done in was difficult to resist.
Even so, none of these, nor any combination of them, can adequately explain why, in the midst of an unspeakable orgy of self-destruction, with staggering losses and nations in ruin, not one monarch or president or premier had the wit or gumption to declare: Enough! Stop this madness!
Instead, the politicians sat on their hands while actual authority devolved onto the likes of British Field Marshal Sir Douglas Haig, French Marshals Ferdinand Foch and Philippe Pétain, and German commanders Paul von Hindenburg and Erich Ludendorff. In other words, to solve a conundrum they themselves had created, the politicians of the warring states all deferred to their warrior chieftains. For their part, the opposing warriors jointly subscribed to a perverted inversion of strategy best summarized by Ludendorff as “punch a hole [in the front] and let the rest follow.” And so the conflict dragged on and on.
The Forfeiture of Policy
Put simply, in Europe, a hundred years ago, war had become politically purposeless. Yet the leaders of the world’s principal powers — including, by 1917, U.S. President Woodrow Wilson — could conceive of no alternative but to try harder, even as the seat of Western civilization became a charnel house.
Only one leader bucked the trend: Vladimir Lenin. In March 1918, soon after seizing power in Russia, Lenin took that country out of the war. In doing so, he reasserted the primacy of politics and restored the possibility of strategy. Lenin had his priorities straight. Nothing in his estimation took precedence over ensuring the survival of the Bolshevik Revolution. Liquidating the war against Germany therefore became an imperative.
Allow me to suggest that the United States should consider taking a page out of Lenin’s playbook. Granted, prior to the collapse of the Soviet Union in 1991, such a suggestion might have smacked of treason. Today, however, in the midst of our never-ending efforts to expunge terrorism, we might look to Lenin for guidance on how to get our priorities straight.
As was the case with Great Britain, France, and Germany a century ago, the United States now finds itself mired in a senseless war. Back then, political leaders in London, Paris, and Berlin had abrogated control of basic policy to warrior chieftains. Today, ostensibly responsible political leaders in Washington have done likewise. Some of those latter-day American warrior chieftains who gather in the White House or testify on Capitol Hill may wear suits rather than uniforms, but all remain enamored with the twenty-first-century equivalent of Ludendorff’s notorious dictum.
Of course, our post-9/11 military enterprise — the undertaking once known as the Global War on Terrorism — differs from The Great War in myriad ways. The ongoing hostilities in which U.S. forces are involved in various parts of the Islamic world do not qualify, even metaphorically, as “great.” Nor will there be anything great about an armed conflict with Iran, should members of the current administration get their apparent wish to provoke one.
Today, Washington need not even bother to propagandize the public into supporting its war. By and large, members of the public are indifferent to its very existence. And given our reliance on a professional military, shooting citizen-soldiers who want to opt out of the fight is no longer required.
There are also obvious differences in scale, particularly when it comes to the total number of casualties involved. Cumulative deaths from the various U.S. interventions, large and small, undertaken since 9/11, number in the hundreds of thousands. The precise tally of those lost during the European debacle of 1914-1918 will never be known, but the total probably surpassed 13 million.
Even so, similarities between the Great War as it unspooled and our own not-in-the-least-great war(s) deserve consideration. Today, as then, strategy — that is, the principled use of power to achieve the larger interests of the state — has ceased to exist. Indeed, war has become an excuse for ignoring the absence of strategy.
For years now, U.S. military officers and at least some national security aficionados have referred to ongoing military hostilities as “the Long War.” To describe our conglomeration of spreading conflicts as “long” obviates any need to suggest when or under what circumstances (if any) they might actually end. It’s like the meteorologist forecasting a “long winter” or the betrothed telling his or her beloved that theirs will be a “long engagement.” The implicit vagueness is not especially encouraging.
Some high-ranking officers of late have offered a more forthright explanation of what “long” may really mean. In the Washington Post, the journalist Greg Jaffe recently reported that “winning for much of the U.S. military’s top brass has come to be synonymous with staying put.” Winning, according to Air Force General Mike Holmes, is simply “not losing. It’s staying in the game.”
Not so long ago, America’s armed forces adhered to a concept called victory, which implied conclusive, expeditious, and economical mission accomplishment. No more. Victory, it turns out, is too tough to achieve, too restrictive, or, in the words of Army Lieutenant General Michael Lundy, “too absolute.” The United States military now grades itself instead on a curve. As Lundy puts it, “winning is more of a continuum,” an approach that allows you to claim mission accomplishment without, you know, actually accomplishing anything.
It’s like soccer for six-year-olds. Everyone tries hard so everyone gets a trophy. Regardless of outcomes, no one goes home feeling bad. In the U.S. military’s case, every general gets a medal (or, more likely, a chest full of them).
“These days,” in the Pentagon, Jaffe writes, “senior officers talk about ‘infinite war.’”
I would like to believe that Jaffe is pulling our leg. But given that he’s a conscientious reporter with excellent sources, I fear he knows what he’s talking about. If he’s right, as far as the top brass are concerned, the Long War has now officially gone beyond long. It has been deemed endless and is accepted as such by those who preside over its conduct.
In truth, infinite war is a strategic abomination, an admission of professional military bankruptcy. Erster Generalquartiermeister Ludendorff might have endorsed the term, but Ludendorff was a military fanatic.
Check that. Infinite war is a strategic abomination except for arms merchants, so-called defense contractors, and the “emergency men” (and women) devoted to climbing the greasy pole of what we choose to call the national security establishment. In other words, candor obliges us to acknowledge that, in some quarters, infinite war is a pure positive, carrying with it a promise of yet more profits, promotions, and opportunities to come. War keeps the gravy train rolling. And, of course, that’s part of the problem.
Who should we hold accountable for this abomination? Not the generals, in my view. If they come across as a dutiful yet unimaginative lot, remember that a lifetime of military service rarely nurtures imagination or creativity. And let us at least credit our generals with this: in their efforts to liberate or democratize or pacify or dominate the Greater Middle East they have tried every military tactic and technique imaginable. Short of nuclear annihilation, they’ve played just about every card in the Pentagon’s deck — without coming up with a winning hand. So they come and go at regular intervals, each new commander promising success and departing after a couple years to make way for someone else to give it a try.
It tells us something about our prevailing standards of generalship that, by resurrecting an old idea — counterinsurgency — and applying it with temporary success to one particular theater of war, General David Petraeus acquired a reputation as a military genius. If Petraeus is a military genius, so, too, is General George McClellan. After winning the Battle of Rich Mountain in 1861, newspapers dubbed McClellan “the Napoleon of the Present War.” But the action at Rich Mountain decided nothing and McClellan didn’t win the Civil War any more than Petraeus won the Iraq War.
No, it’s not the generals who have let us down, but the politicians to whom they supposedly report and from whom they nominally take their orders. Of course, under the heading of politician, we quickly come to our current commander-in-chief. Yet it would be manifestly unfair to blame President Trump for the mess he inherited, even if he is presently engaged in making matters worse.
The failure is a collective one, to which several presidents and both political parties have contributed over the years. Although the carnage may not be as horrific today as it was on the European battlefields on the Western and Eastern Fronts, members of our political class are failing us as strikingly and repeatedly as the political leaders of Great Britain, France, and Germany failed their peoples back then. They have abdicated responsibility for policy to our own homegrown equivalents of Haig, Foch, Pétain, Hindenburg, and Ludendorff. Their failure is unforgivable.
Congressional midterm elections are just months away and another presidential election already looms. Who will be the political leader with the courage and presence of mind to declare: “Enough! Stop this madness!” Man or woman, straight or gay, black, brown, or white, that person will deserve the nation’s gratitude and the support of the electorate.
Until that occurs, however, the American penchant for war will stretch on toward infinity. No doubt Saudi and Israeli leaders will cheer, Europeans who remember their Great War will scratch their heads in wonder, and the Chinese will laugh themselves silly. Meanwhile, issues of genuinely strategic importance — climate change offers one obvious example — will continue to be treated like an afterthought. As for the gravy train, it will roll on.
The purpose of all wars is peace. So observed St. Augustine early in the first millennium A.D. Far be it from me to disagree with the esteemed Bishop of Hippo, but his crisply formulated aphorism just might require a bit of updating.
I’m not a saint or even a bishop, merely an interested observer of this nation’s ongoing military misadventures early in the third millennium A.D. From my vantage point, I might suggest the following amendment to Augustine’s dictum: Any war failing to yield peace is purposeless and, if purposeless, both wrong and stupid.
War is evil. Large-scale, state-sanctioned violence is justified only when all other means of achieving genuinely essential objectives have been exhausted or are otherwise unavailable. A nation should go to war only when it has to — and even then, ending the conflict as expeditiously as possible should be an imperative.
Some might take issue with these propositions, President Trump’s latest national security adviser doubtless among them. Yet most observers — even, I’m guessing, most high-ranking U.S. military officers — would endorse them. How is it then that peace has essentially vanished as a U.S. policy objective? Why has war joined death and taxes in that select category of things that Americans have come to accept as unavoidable?
The United States has taken Thucydides’s famed Melian Dialogue and turned it inside out. Centuries before Augustine, the great Athenian historian wrote, “The strong do what they will, while the weak suffer what they must.” Strength confers choice; weakness restricts it. That’s the way the world works, so at least Thucydides believed. Yet the inverted Melian Dialogue that prevails in present-day Washington seemingly goes like this: strength imposes obligations and limits choice. In other words, we gotta keep doing what we’ve been doing, no matter what.
Making such a situation all the more puzzling is the might and majesty of America’s armed forces. By common consent, the United States today has the world’s best military. By some estimates, it may be the best in recorded history. It’s certainly the most expensive and hardest working on the planet.
Yet in the post-Cold War era, when the relative strength of U.S. forces reached its zenith, our well-endowed, well-trained, well-equipped, and highly disciplined troops have proven unable to accomplish any of the core tasks to which they’ve been assigned. This has been especially true since 9/11.
We send the troops off to war, but they don’t achieve peace. Instead, America’s wars and skirmishes simply drag on, seemingly without end. We just keep doing what we’ve been doing, a circumstance that both Augustine and Thucydides would undoubtedly have found baffling.
Prosecuting War, Averting Peace
How to explain this paradox of a superb military that never gets the job done? Let me suggest that the problem lies with the present-day American military system, the principles to which the nation adheres in raising, organizing, supporting, and employing its armed forces. By its very existence, a military system expresses an implicit contract between the state, the people, and the military itself.
Here, as I see it, are the principles — seven in all — that define the prevailing military system of the United States.
First, we define military service as entirely voluntary. In the U.S., there is no link between citizenship and military service. It’s up to you as an individual to decide if you want to take up arms in the service of your country.
If you choose to do so, that’s okay. If you choose otherwise, that’s okay, too. Either way, your decision is of no more significance than whether you root for the Yankees or the Mets.
Second, while non-serving citizens are encouraged to “support the troops,” we avoid stipulating how this civic function is to be performed.
In practice, there are many ways of doing so, some substantive, others merely symbolic. Most citizens opt for the latter. This means that they cheer when invited to do so. Cheering is easy and painless. It can even make you feel good about yourself.
Third, when it comes to providing the troops with actual support, we expect Congress to do the heavy lifting. Our elected representatives fulfill that role by routinely ponying up vast sums of money for what is misleadingly called a defense budget. In some instances, Congress appropriates even more money than the Pentagon asks for, as was the case this year.
Meanwhile, under the terms of our military system, attention to how this money actually gets spent by our yet-to-be-audited Pentagon tends to be — to put the matter politely — spotty. Only rarely does the Congress insert itself forcefully into matters relating to what U.S. forces scattered around the world are actually doing.
Yes, there are periodic hearings, with questions posed and testimony offered. But unless there is some partisan advantage to be gained, oversight tends to be, at best, pro forma. As a result, those charged with implementing national security policy — another Orwellian phrase — enjoy very considerable latitude.
Fourth, under the terms of our military system, this latitude applies in spades to the chief executive. The commander-in-chief occupies the apex of our military system. The president may bring to office very little expertise pertinent to war or the art of statecraft, yet his authority regarding such matters is essentially unlimited.
Consider, if you will, the sobering fact that our military system empowers the president to order a nuclear attack, should he see the need — or feel the impulse — to do so. He need not obtain congressional consent. He certainly doesn’t need to check with the American people.
Since Harry Truman ordered the destruction of Hiroshima and Nagasaki in 1945, presidents have not exercised this option, for which we should all be grateful. Yet on more occasions than you can count, they have ordered military actions, large and small, on their own authority or after only the most perfunctory consultation with Congress. When Donald Trump, for instance, threatened North Korea’s Kim Jong-un with “fire and fury like the world has never seen,” he gave no hint that he would even consider asking for prior congressional authorization to do so. Trump’s words were certainly inflammatory. Yet were he to act on those words, he would merely be exercising a prerogative enjoyed by his predecessors going back to Truman himself.
The Constitution invests in Congress the authority to declare war. The relevant language is unambiguous. In practice, as countless commentators have noted, that provision has long been a dead letter. This, too, forms an essential part of our present military system.
Fifth, under the terms of that system, there’s no need to defray the costs of military actions undertaken in our name. Supporting the troops does not require citizens to pay anything extra for what the U.S. military is doing out there wherever it may be. The troops are asked to sacrifice; for the rest of us, sacrifice is anathema.
Indeed, in recent years, presidents who take the nation to war or perpetuate wars they inherit never even consider pressing Congress to increase our taxes accordingly. On the contrary, they advocate tax cuts, especially for the wealthiest among us, which lead directly to massive deficits.
Sixth, pursuant to the terms of our military system, the armed services have been designed not to defend the country but to project military power on a global basis. For the Department of Defense, actually defending the United States qualifies as an afterthought, trailing well behind other priorities such as trying to pacify Afghanistan’s Kandahar Province or jousting with militant groups in Somalia. The United States Army, Navy, Air Force, and Marine Corps are all designed to fight elsewhere, relying on a constellation of perhaps 800 bases around the world to facilitate the conduct of military campaigns “out there,” wherever “there” may happen to be. They are, in other words, expeditionary forces.
Reflect for a moment on the way the Pentagon divvies the world up into gigantic swathes of territory and then assigns a military command to exercise jurisdiction over each of them: European Command, Africa Command, Central Command, Southern Command, Northern Command, and Pacific Command. With the polar icecap continuing to melt, a U.S. Arctic Command is almost surely next on the docket. Nor is the Pentagon’s mania for creating new headquarters confined to terra firma. We already have U.S. Cyber Command. Can U.S. Galactic Command be far behind?
No other nation adheres to this practice. Nor would the United States permit any nation to do so. Imagine the outcry in Washington if President Xi Jinping had the temerity to create a “PRC Latin America Command,” headed by a four-star Chinese general charged with maintaining order and stability from Mexico to Argentina.
Seventh (and last), our military system invests great confidence in something called the military profession.
The legal profession exists to implement the rule of law. We hope that the result is some approximation of justice. The medical profession exists to repair our bodily ailments. We hope that health and longevity will result. The military profession exists to master war. With military professionals in charge, it’s our hope that America’s wars will conclude quickly and successfully with peace the result.
To put it another way, we look to the military profession to avert the danger of long, costly, and inconclusive wars. History suggests that these sap the collective strength of a nation and can bring about its premature decline. We count on military professionals to forestall that prospect.
Our military system assigns the immediate direction of war to our most senior professionals, individuals who have ascended step by step to the very top of the military hierarchy. We expect three- and four-star generals and admirals to possess the skills needed to make war politically purposeful. This expectation provides the rationale for the status they enjoy and the many entitlements they are accorded.
America, the (Formerly) Indispensable
Now, the nation that has created this military system is not some “shithole country,” to use a phrase made famous by President Trump. We are, or at least claim to be, a democratic republic in which all power ultimately derives from the people. We believe in — indeed, are certain that we exemplify — freedom, even as we continually modify the meaning of that term.
In the aggregate, we are very rich. Since the latter part of the nineteenth century we have taken it for granted that the United States ought to be the richest country on the planet, notwithstanding the fact that large numbers of ordinary Americans are themselves anything but rich. Indeed, as a corollary to our military system, we count on these less affluent Americans to volunteer for military service in disproportionate numbers. Offered sufficient incentives, they do so.
Finally, since 1945 the United States has occupied the preeminent place in the global order, a position affirmed with the collapse of the Soviet Union and the end of the Cold War in 1991. Indeed, we have come to believe that American primacy reflects the will of God or of some cosmic authority.
From the early years of the Cold War, we have come to believe that the freedom, material abundance, and primacy we cherish all depend upon the exercise of “global leadership.” In practice, that seemingly benign term has been a euphemism for unquestioned military superiority and the self-assigned right to put our military to work as we please wherever we please. Back in the 1990s, Secretary of State Madeleine Albright said it best: “If we have to use force, it is because we are America. We are the indispensable nation. We stand tall. We see further into the future.”
Other countries might design their military establishments to protect certain vital interests. As Albright’s remark suggests, American designs have been far more ambitious.
Here, then, is a question: How do the principles and attitudes that undergird our military system actually suit twenty-first-century America? And if they don’t, what are the implications of clinging to such a system? Finally, what alternative principles might form a more reasonable basis for raising, organizing, supporting, and employing our armed forces?
Spoiler alert: Let me acknowledge right now that I consider our present-day military system irredeemably flawed and deeply harmful. For proof we need look no further than the conduct of our post-9/11 wars, especially in Iraq and Afghanistan, but also in Libya, Pakistan, Somalia, Syria, Yemen, and parts of sub-Saharan Africa.
These myriad undertakings of the past nearly 17 years have subjected our military system to a comprehensive real-world examination. Collectively, they have rendered a judgment on that system. And the judgment is negative. Put to the test, the American military system has failed.
And the cost so far? Trillions of dollars expended (with trillions more to come), thousands of American lives lost, tens of thousands of Americans grievously damaged, and even greater numbers of non-Americans killed, injured, and displaced.
One thing is certain: our wars have not brought about peace by even the loosest definition of the word.
A Military Report Card
There are many possible explanations for why our recent military record has been so dismal. One crucial explanation — perhaps the most important of all — relates to those seven principles that undergird our military system.
Let me review them in reverse order.
Principle 7, the military profession: Tally up the number of three- and four-star generals who have commanded the Afghan War since 2001. It’s roughly a dozen. None of them has brought it to a successful conclusion. Nor does any such happy ending seem likely to be in the offing anytime soon. The senior officers we expect to master war have demonstrated no such mastery.
The generals who followed one another in presiding over that war are undoubtedly estimable, well-intentioned men, but they have not accomplished the job for which they were hired. Imagine if you contracted with a dozen different plumbers — each highly regarded — to fix a leaking sink in your kitchen and you ended up with a flooded basement. You might begin to think that there’s something amiss in the way that plumbers are trained and licensed. Similarly, perhaps it’s time to reexamine our approach to identifying and developing very senior military officers.
Or alternatively, consider this possibility: Perhaps our theory of war as an enterprise where superior generalship determines the outcome is flawed. Perhaps war cannot be fully mastered, by generals or anyone else.
It might just be that war is inherently unmanageable. Take it from Winston Churchill, America’s favorite confronter of evil. “The statesman who yields to war fever,” Churchill wrote, “must realize that once the signal is given, he is no longer the master of policy but the slave of unforeseeable and uncontrollable events.”
If Churchill is right, perhaps our expectations that senior military professionals will tame war — control the uncontrollable — are misplaced. Perhaps our military system should put greater emphasis on avoiding war altogether or at least classifying it as an option to be exercised with great trepidation, rather than as the political equivalent of a handy-dandy, multi-functional Swiss Army knife.
Principle 6, organizing our forces to emphasize global power projection: Reflect for a moment on the emerging security issues of our time. The rise of China is one example. A petulant and over-armed Russia offers a second. Throw in climate change and mushrooming cyber-threats and you have a daunting set of problems. It’s by no means impertinent to wonder about the relevance of the current military establishment to these challenges.
Every year the United States spends hundreds of billions of dollars to maintain and enhance the lethality of a force configured for conventional power projection and to sustain the global network of bases that goes with it. For almost two decades, that force has been engaged in a futile war of attrition with radical Islamists that has now spread across much of the Greater Middle East and parts of Africa.
I don’t know about you, but I worry more about the implications of China’s rise and Russian misbehavior than I do about Islamic terrorism. And I worry more about changing weather patterns here in New England or somebody shutting down the electrical grid in my home town than I do about what Beijing and Moscow may be cooking up. Bluntly put, our existing military system finds us focused on the wrong problem set.
We need a military system that accurately prioritizes actual and emerging threats. The existing system does not. This suggests the need for radically reconfigured armed services, with the hallowed traditions of George Patton, John Paul Jones, Billy Mitchell, and Chesty Puller honorably but permanently retired.
Principle 5, paying — or not paying — for America’s wars: If you want it, you should be willing to pay for it. That hoary axiom ought to guide our military system as much as it should our personal lives. Saddling Millennials or members of Generation Z with the cost of paying for wars mostly conceived and mismanaged by my fellow Baby Boomers strikes me as downright unseemly.
One might expect the young to raise quite a ruckus over such an obvious injustice. In recent weeks, we’ve witnessed their righteous anger over the absence of effective gun controls in this country. That they aren’t comparably incensed about the misuse of guns by their own contemporaries deployed to distant lands represents a real puzzle, especially since they’re the ones who will ultimately be stuck with the bill.
Principles 4 and 3, the role of Congress and the authority of the commander-in-chief: Whatever rationale may once have existed for allowing the commander-in-chief to circumvent the Constitution’s plainly specified allocation of war powers to Congress should long since have lapsed. Well before Donald Trump became president, a responsible Congress would have reasserted its authority to declare war. That Trump sits in the Oval Office and now takes advice from the likes of John Bolton invests this matter with great urgency.
Surely President Trump’s bellicose volatility drives home the point that it’s past time for Congress to assert itself in providing responsible oversight regarding all aspects of U.S. military policy. Were it to do so, the chances of fixing the defects permeating our present military system would improve appreciably.
Of course, the likelihood of that happening is nil until the money changers are expelled from the temple. And that won’t occur until Americans who are not beholden to the military-industrial complex and its various subsidiaries rise up, purge the Congress of its own set of complexes, and install in office people willing to do their duty. And that brings us back to…
Principles 2 and 1, the existing relationship between the American people and their military and our reliance on a so-called all-volunteer force: Here we come to the heart of the matter.
I submit that the relationship between the American people and their military is shot through with hypocrisy. It is, in fact, nothing short of fraudulent. Worse still, most of us know it, even if we are loath to fess up. In practice, the informal mandate to “support the troops” has produced an elaborate charade. It’s theater, as phony as Donald Trump’s professed love for DACA recipients.
If Americans were genuinely committed to supporting the troops, they would pay a great deal more attention to what President Trump and his twenty-first-century predecessors have tasked those troops to accomplish — with what results and at what cost. Of course, that would imply doing more than cheering and waving the flag on cue. Ultimately, the existence of the all-volunteer force obviates any need for such an effort. It provides Americans with an ample excuse for ignoring our endless wars and allowing our flawed military system to escape serious scrutiny.
Having outsourced responsibility for defending the country to people few of us actually know, we’ve ended up with a military system that is unfair, undemocratic, hugely expensive, and largely ineffective, not to mention increasingly irrelevant to the threats coming our way. The perpetuation of that system finds us mired in precisely the sort of long, costly, inconclusive wars that sap the collective strength of a nation and may bring about its premature decline.
The root cause of our predicament is the all-volunteer force. Only when we ordinary citizens conclude that we have an obligation to contribute to the country’s defense will it become possible to devise a set of principles for raising, organizing, supporting, and employing U.S. forces that align with our professed values and our actual security requirements.
If Stormy Daniels can figure out when an existing contract has outlived its purpose, so can the rest of us.
What Happens When a Few Volunteer and the Rest Just Watch
March 20, 2018
Dear Mr. Sulzberger:
Congratulations on assuming the reins of this nation’s — and arguably, the world’s — most influential publication. It’s the family business, of course, so your appointment to succeed your father doesn’t exactly qualify as a surprise. Even so, the responsibility for guiding the fortunes of a great institution must weigh heavily on you, especially when the media landscape is changing so rapidly and radically.
Undoubtedly, you’re already getting plenty of advice on how to run the paper, probably more than you want or need. Still, with your indulgence, I’d like to offer an outsider’s perspective on “the news that’s fit to print.” The famous motto of the Times insists that the paper is committed to publishing “all” such news — an admirable aspiration even if an impossibility. In practice, what readers like me get on a daily basis is “all the news that Times editors deem worthy of print.”
Of course, within that somewhat more restrictive universe of news, not all stories are equal. Some appear on the front page above the fold. Others are consigned to page A17 on Saturday morning.
And some topics receive more attention than others. In recent years, comprehensive coverage of issues touching on diversity, sexuality, and the status of women has become a Times hallmark. When it comes to Donald Trump, “comprehensive” can’t do justice to the attention he receives. At the Times (and more than a few other media outlets), he has induced a form of mania, with his daily effusion of taunts, insults, preposterous assertions, bogus claims, and decisions made, then immediately renounced, all reported in masochistic detail. Throw in salacious revelations from Trump’s colorful past and leaks from the ongoing Mueller investigation of his campaign and our 45th president has become for the Times something akin to a Great White Whale, albeit with a comb-over and a preference for baggy suits.
In the meantime, other issues of equal or even greater importance — I would put climate change in this category — receive no more than sporadic or irregular coverage. And, of course, some topics simply don’t make the cut at all, like just about anything short of a school shooting that happens in that vast expanse west of the Hudson that Saul Steinberg years ago so memorably depicted for the New Yorker.
The point of this admittedly unsolicited memo is not to urge the Times to open a bureau in Terre Haute or in the rapidly melting Arctic. Nor am I implying that the paper should tone down its efforts to dismantle the hetero-normative order, empower women, and promote equality for transgender persons. Yet I do want to suggest that obsessing about this administration’s stupefying tomfoolery finds the Times overlooking one particular issue that predates and transcends the Trump Moment. That issue is the normalization of armed conflict, with your writers, editors, and editorial board having tacitly accepted that, for the United States, war has become a permanent condition.
Let me stipulate that the Times does devote an impressive number of column-inches to the myriad U.S. military activities around the planet. Stories about deployments, firefights, airstrikes, sieges, and casualties abound. Readers can count on the Times to convey the latest White House or Pentagon pronouncements about the briefly visible light at the end of some very long tunnel. And features describing the plight of veterans back from the war zone also appear with appropriate and commendable frequency.
So anyone reading the Times for a week or a month will have absorbed the essential facts of the case, including the following:
* Over 6,000 days after it began, America’s war in Afghanistan continues, with Times correspondents providing regular and regularly repetitive updates;
* In the seven-year-long civil war that has engulfed Syria, the ever-shifting cast of belligerents now includes at least 2,000 (some sources say 4,000) U.S. special operators, the rationale for their presence changing from week to week, even as plans to keep U.S. troops in Syria indefinitely take shape;
* In Iraq, now liberated from ISIS, itself a byproduct of U.S. invasion and occupation, U.S. troops are poised to stay on, more or less as they did in Germany after 1945 and in South Korea after 1953;
* On the Arabian Peninsula, U.S. forces have partnered with Saudi Crown Prince Mohammad Bin Salman Al Saud in brutalizing Yemen, thereby creating a vast humanitarian disaster despite the absence of discernible U.S. interests at stake;
* In the military equivalent of whacking self-sown weeds, American drones routinely attack Libyan militant groups that owe their existence to the chaos created in 2011 when the United States impulsively participated in the overthrow of Muammar Gaddafi;
* More than a quarter-century after American troops entered Somalia to feed the starving, the U.S. military mission continues, presently in the form of recurring airstrikes;
* Elsewhere in Africa, the latest theater to offer opportunities for road-testing the most recent counterterrorism techniques, the U.S. military footprint is rapidly expanding, all but devoid of congressional (or possibly any other kind of) oversight;
* From the Levant to South Asia, a flood of American-manufactured weaponry continues to flow unabated, to the delight of the military-industrial complex, but with little evidence that the arms we sell or give away are contributing to regional peace and stability;
* Amid this endless spiral of undeclared American wars and conflicts, Congress stands by passively, only rousing itself as needed to appropriate money that ensures the unimpeded continuation of all of the above;
* Meanwhile, President Trump, though assessing all of this military hyperactivity as misbegotten — “Seven trillion dollars. What a mistake.” — is effectively perpetuating and even ramping up the policies pioneered by his predecessors.
This conglomeration of circumstances, I submit, invites attention to several first-order questions to which the Times appears stubbornly oblivious. These questions are by no means original with me. Indeed, Mr. Sulzberger (may I call you A.G.?), if you’ve kept up with TomDispatch — if you haven’t, you really should — you will already have encountered several of them. Yet in the higher reaches of mainstream journalism they remain sadly neglected, with disastrous practical and moral implications.
The key point is that when it comes to recent American wars, the Times offers coverage without perspective. “All the news” is shallow and redundant. Lots of dots, few connections.
To put it another way, what’s missing is any sort of Big Picture. The Times would never depict Russian military actions in the Crimea, eastern Ukraine, and Syria, along with its cyber-provocations, as somehow unrelated to one another. Yet it devotes remarkably little energy to identifying any links between what U.S. forces today are doing in Niger and what they are doing in Afghanistan; between U.S. drone attacks that target this group of “terrorists” and those that target some other group; or, more fundamentally, between what we thought we were doing as far back as the 1980s when Washington supported Saddam Hussein and what we imagine we’re doing today in the various Muslim-majority nations in which the U.S. military is present, whether welcome or not.
Crudely put, the central question that goes not only unanswered but unasked is this: What the hell is going on? Allow me to deconstruct that in ways that might resonate with Times correspondents:
What exactly should we call the enterprise in which U.S. forces have been engaged all these years? The term that George W. Bush introduced back in 2001, “Global War on Terrorism,” fell out of favor long ago. Nothing has appeared to replace it. A project that today finds U.S. forces mired in open-ended hostilities across a broad expanse of Muslim-majority nations does, I suggest, deserve a name, even if the commander-in-chief consigns most of those countries to “shithole” status. A while back, I proposed “War for the Greater Middle East,” but that didn’t catch on. Surely, the president or perhaps one of his many generals could come up with something better, some phrase that conveys a sense of purpose, scope, stakes, or location. The paper of record should insist that whatever it is the troops out there may be doing, their exertions ought to have a descriptive name.
What is our overall objective in waging that no-name war? After 9/11, George W. Bush vowed at various times to eliminate terrorism, liberate the oppressed, spread freedom and democracy, advance the cause of women’s rights across the Islamic world, and even end evil itself. Today, such aims seem like so many fantasies. So what is it we’re trying to accomplish? What will we settle for? Without a readily identifiable objective, how will anyone know when to raise that “Mission Accomplished” banner (again) and let the troops come home?
By extension, what exactly is the strategy for bringing our no-name war to a successful conclusion? A strategy is a kind of roadmap aimed at identifying resources, defining enemies (as well as friends), and describing a sequence of steps that will lead to some approximation of victory. It should offer a vision that gets us from where we are to where we want to be. Yet when it comes to waging its no-name war, Washington today has no strategy worthy of the name. This fact should outrage the American people and embarrass the national security establishment. It should also attract the curiosity of the New York Times.
Roughly speaking, in what year, decade, or century might this war end? Even if only approximately, it would help to know — and the American people deserve to know — when the front page of the Times might possibly carry a headline reading “Peace Secured” or “Hostilities Ended” or even merely “It’s Over.” On the other hand, if it’s unrealistic to expect the ever-morphing, ever-spreading no-name war to end at all, then shouldn’t someone say so, allowing citizens to chew on the implications of that prospect? Who better to reveal this secret hidden in plain sight than the newspaper over which you preside?
What can we expect the no-name war to cost? Although the president’s estimate of $7 trillion may be a trifle premature, it’s not wrong. It may even end up being on the low side. What that money might otherwise have paid for — including infrastructure, education, scientific and medical research, and possibly making amends for all the havoc wreaked by our ill-considered military endeavors — certainly merits detailed discussion. Here’s a way to start just such a discussion: Imagine a running tally of sunk and projected cumulative costs featured on the front page of the Times every morning. Just two numbers: the first a tabulation of what the Pentagon has already spent pursuant to all U.S. military interventions, large and small, since 9/11; the second, a projection of what the final bill might look like decades from now when the last of this generation’s war vets passes on.
Finally, what are the implications of saddling future generations with this financial burden? With the sole exception of the very brief Gulf War of 1990-1991, the no-name war is the only substantial armed conflict in American history where the generation in whose name it was waged resolutely refused to pay for it — indeed, happily accepted tax cuts when increases were very much in order. With astonishingly few exceptions, politicians endorsed this arrangement. One might think that enterprising reporters would want to investigate the various factors that foster such irresponsibility.
So that’s my take. I’m sure, A.G., that journalists in your employ could sharpen my questions and devise more of their own. But here’s a small proposition: just for a single day, confine Donald Trump to page A17 and give our no-name war the attention that the Times normally reserves for the president it loathes.
I’m not a newspaperman, but I’m reminded of that wonderful 1940 Hitchcock movie Foreign Correspondent. I expect you’ve seen it. Europe is stumbling toward war and Mr. Powers, head honcho at the fictitious New York Globe, is tired of getting the same-old same-old from the people he has on the scene. “I don’t want any more economists, sages, or oracles bombinating over our cables,” he rages. “I want a reporter. Somebody who doesn’t know the difference between an ism and a kangaroo.”
His rant requires deciphering. What Powers wants is someone with the combination of guts and naiveté to pose questions that more seasoned journalists trapped in a defective narrative of their own creation simply overlook.
So he pulls the decidedly unseasoned and spectacularly uninformed John Jones off the police beat, renames him Huntley Haverstock, sets him up with an expense account, and sends him off to take a fresh look at what gives in Europe. Haverstock proceeds to unearth the big truths to which his more sophisticated colleagues have become blind. Almost singlehandedly he alerts the American people to the dangers just ahead — and he also gets the girl. Terrific movie (even if, given Hitchcock’s well-documented mistreatment of women, it may be politically incorrect to say so).
Anyway, A.G., we need you to do something approximating what Mr. Powers did, but in real life. Good luck. I’m in your corner.
On Seeing America’s Wars Whole
The present arrives out of a past that we are too quick to forget, misremember, or enshroud in myth. Yet like it or not, the present is the product of past choices. Different decisions back then might have yielded very different outcomes in the here-and-now. Donald Trump ascended to the presidency as a consequence of myriad choices that Americans made (or had made for them) over the course of decades. Although few of those were made with Trump in mind, he is the result.
Where exactly did Trump come from? How are we to account for his noxious presence as commander-in-chief and putative Leader of the Free World? The explanations currently on offer are legion. Some blame the nefarious Steve Bannon, others Hillary Clinton and her lackluster campaign. Or perhaps the fault lies with the Bernie Sanders insurgency, which robbed Clinton of the momentum she needed to win, or with Little Marco, Lyin’ Ted, and Low Energy Jeb, and the other pathetic Republicans whom Trump trampled underfoot en route to claiming the nomination. Or perhaps the real villains are all those “deplorables” — the angry and ignorant white males whose disdain for immigrants, feminists, gays, and people of color Trump stoked and manipulated to great effect.
All such explanations, however, suggest that the relevant story began somewhere around June 2015 when Donald Trump astonished the political world by announcing his intention to seek the presidency. My aim here is to suggest that the origins of the real story are to be found much earlier. The conditions that enabled Trump to capture the presidency stemmed from acts of commission and omission that occurred well before he rode down that escalator at Trump Tower to offer his services to the nation.
Here’s the sad part: at each step along the way, other alternatives were available. Had those alternatives been exercised, a Trump presidency would have remained an absurd fantasy rather than becoming an absurd and dangerous reality. Like the Cuban Missile Crisis or the Vietnam War or 9/11, Trump qualifies as a completely avoidable catastrophe with roots deep in the past.
So who’s at fault? Ultimately, we — the American people — must accept a considerable share of the responsibility. This is one buck that can’t be passed.
Coulda, Woulda, Shoulda
So what follows is a review of roads taken (and not) ultimately leading to the demoralizing presidency of Donald Trump, along with a little speculation on how different choices might have resulted in a decidedly different present.
1989: The Fall of the Berlin Wall. As the Cold War wound down, members of Washington’s smart set, Republicans and Democrats alike, declared that the opportunities now presenting themselves went beyond the merely stupendous. Indeed, history itself had ended. With the United States as the planet’s sole superpower, liberal democratic capitalism was destined to prevail everywhere. There would be no way except the American Way. In fact, however, the passing of the Cold War should have occasioned a moment of reflection regarding the sundry mistakes and moral compromises that marred U.S. policy from the 1940s through the 1980s. Unfortunately, policy elites had no interest in second thoughts — and certainly not in remorse or contrition. In the 1990s, rampant victory disease fueled extraordinary hubris and a pattern of reckless behavior informed by an assumption that the world would ultimately conform to the wishes of the “indispensable nation.” In the years to come, an endless sequence of costly mishaps would ensue from Mogadishu to Mosul. When, in due time, Donald Trump announced his intention to dismantle the establishment that had presided over those failures, many Americans liked what he had to say, even if he spoke from a position of total ignorance.
1992: President H. Ross Perot. In the first post-Cold War presidential election, H. Ross Perot, a wealthy entrepreneur and political novice, mounted an independent challenge to the Republican and Democratic nominees. Both parties, Perot charged, were in bed with lobbyists, insiders, and special interests. Both were enthusiastically presiding over the deindustrialization of a once dominant American economy. The rich were getting richer, the national debt was growing, and ordinary citizens were getting screwed, he contended. His charges were not without merit. Yet when Perot lost, Washington was back to business as usual. We cannot know what a Perot presidency would have produced. Yet such a victory — the American electorate, in effect, repudiating the two established parties — might have created powerful incentives for both Republicans and Democrats to clean up their acts and find ways of governing more effectively. Had they done so, Trump’s later vow to “drain the swamp” of corruption and self-dealing would have been beside the point.
1993: Gays in the Military. Bill Clinton ran for the presidency as a centrist. Even so, once elected, he immediately announced his intention to remove restrictions on gays serving in the armed forces. This was, to put it mildly, anything but the act of a centrist. Outraged senior military officers made clear their intention to defy the new commander-in-chief. Although Clinton quickly backpedalled, the episode infuriated both cultural traditionalists and progressives. Within 20 years, a different generation of senior officers decided that gays serving in the military was no big deal. The issue instantly vanished. Yet the controversy left behind a residue of bitterness, especially on the right, that worked in Trump’s favor. Had the generals of 1993 suppressed their insubordinate inclinations, they might have ever so slightly turned down the heat on the culture wars. When the heat is high, it’s the tub-thumpers and noisy haranguers who benefit.
1998: The Lewinsky Scandal. When President Clinton’s sexual encounters with a young White House intern became known, Hillary Clinton stood by her man. The first lady’s steadfast loyalty helped her husband avoid being thrown out of office, providing cover for other feminists to continue supporting the president. Imagine if she had done otherwise, declaring his conduct unacceptable. The pressure on him to resign coming from those who had been among his strongest supporters would have been intense. This much is certain: had evidence of infidelity, compounded by prior allegations of abuse toward women, forced President Clinton from office, Donald Trump would never have had a chance of being elected president. In all likelihood he would never even have considered running.
2000: Cheney Picks a Veep. When George W. Bush wrapped up the Republican nomination in 2000, he tagged Dick Cheney, his father’s defense secretary, with the task of identifying a suitable running mate. After surveying the field, Cheney decided that he himself was the man for the job. As vice president, Cheney wasted no time in stacking the upper ranks of the administration with likeminded allies keen to wield American military muscle to smite “evil-doers” and expand America’s empire. Bush had promised, if elected, to pursue a “humble” foreign policy and forgo nation-building. Had he not surrounded himself with Cheney and bellicose companions like Donald Rumsfeld and Paul Wolfowitz, he might possibly have stuck to that course, even after 9/11. Instead, urged on by the uber-hawks in his own administration, he embarked upon a misguided “Global War on Terrorism.” No single action played a greater role in paving the way for Donald Trump to become president.
2000: The Supremes Pick a President. If, in choosing a president on our behalf, the Supreme Court had given the nod to Al Gore instead of George Bush, might they have averted that never-ending, never-contracting war on terrorism? No doubt the 9/11 attacks would still have occurred and some U.S. military action would have ensued. But Gore did not share the obsession with Saddam Hussein that infected members of the Bush-Cheney axis. Arguably, a President Gore would have been less likely than President Bush to insist on invading a country that had played no part in the al-Qaeda conspiracy. Had the U.S. not embarked upon a preventive war against Iraq — had this Original Sin of the post-9/11 era not occurred — a Trump presidency would have been far less likely.
2003: Congress Rolls Over. To its perpetual disgrace, Congress assented to Bush’s demands to invade Iraq. It did so less because its members, including presidential aspirants like Senators Hillary Clinton and John Kerry, were persuaded that Iraq posed a threat to national security (it did not) than because they sought to insulate themselves from the political consequences of opposing a president hell-bent on war. For decades, Congress had allowed presidents to encroach upon its constitutional responsibility to declare war; this was the moment to draw the line. Instead, supine legislators became complicit in a disaster that to this day continues to unfold. A Congress with gumption might have averted that disaster, recovered its cojones, and left us with a legislative branch willing and able to fulfill its constitutional responsibilities.
2003: GM Kills the EV1 Electric Automobile. In the 1990s, General Motors produced the first viable electric car. Drivers loved it, but GM doubted its potential profitability. Shareholders were more likely to make money if the company focused on manufacturing vehicles powered by gasoline engines. So in 2003, GM executives killed the EV1. The effect was to postpone by at least a decade the development of a mass-produced electric car. Had GM persisted, it’s just possible that the EV1 might have jump-started the transition to a post-fossil fuel economy and offered humanity a leg up on climate change. Instead, politicians spent years bickering about whether climate change was even real. More than a few Republicans made political hay by denouncing those waging a “war on coal” or inhibiting crucially needed oil exploration — bogus charges that Trump adroitly exploited for his own purposes. Perhaps if the EV1 had fulfilled its potential, anyone mounting a presidential campaign while denouncing global warming as a hoax would have been laughed out of town instead of capturing the White House.
2009: Obama Bails Out Wall Street. President Obama entered the Oval Office with the U.S. economy in free-fall. His administration took prompt action to prevent systemic collapse — that is, it bailed out Wall Street. Meanwhile the little guy got clobbered, with millions of Americans losing their jobs and homes. A billionaire complaining about the system being “rigged” might otherwise have tested the outer limits of irony, but for Donald Trump the government’s handling of the Great Recession was a gift from the gods.
2010: Presidential Twitter Accounts. Huge numbers of Americans have willingly surrendered their lives to social media. I’m guessing that there are more vegans and curling aficionados in the United States today than there are non-subscribers to Facebook. So it was perhaps inevitable that politicians would hoist themselves onto the social media bandwagon, keen to use direct, unmediated electronic communications as a way of mobilizing their followers. Yet the resulting impact on American politics has been entirely negative. The space available for reasoned exchanges has shrunk. Political discourse has become increasingly corrosive, its apparent purpose less to inform than to obfuscate, trivialize, and create division. This development was probably inevitable and will no doubt prove irreversible. Even so, it was not inevitable that the presidency itself should succumb to this phenomenon. In 2010, when Barack Obama “made history” by sending the first presidential tweet, it was as if the Pope had begun spending his idle hours hanging out at some corner saloon. Even if only in barely measurable increments, the dignity and decorum associated with the presidency began to fade and with it the assumption that crude or boorish behavior would automatically disqualify someone for high office. Donald Trump, a first-class boor and maestro of Twitter, was quick to take notice.
2010: Mitch McConnell Chooses Party Over Country. With the nation still in the midst of a devastating economic crisis, Republican Senate leader Mitch McConnell declared on behalf of his party that the denial of a second term to President Obama was “the single most important thing we want to achieve.” To hell with the country, the GOP wanted Obama gone. McConnell’s troops fell obediently into line and the last vestiges of bipartisanship disappeared from Washington. Of course, the president won reelection in 2012 anyway, but in effect McConnell refused to recognize the result. So when Obama exercised a president’s prerogative to nominate someone to fill a Supreme Court vacancy, McConnell ensured that the nominee would not even receive the courtesy of a hearing. An environment rife with hyper-partisanship presented the perfect situation for a political outsider skilled in the “art of the deal” to offer himself as the antidote to persistent gridlock. Congratulations, Mitch! You won after all!
It’s time to look in the mirror, folks. Blaming Trump for being Trump simply won’t do. Like Lenin or Franco or Perón or dozens of other demagogues, Trump merely seized the opportunity that presented itself. Our president is a product and beneficiary of several decades’ worth of vainglory, cynicism, epic folly, political cowardice, missed opportunities, and a public not given to paying attention. In present-day Washington, no one can deny that the chickens have come home to roost. The biggest fowl of them all has taken up residence in the White House and, in a very real sense, we all put him there.
How We Got Donald Trump
Consider, if you will, these two indisputable facts. First, the United States is today more or less permanently engaged in hostilities in not one faraway place, but at least seven. Second, the vast majority of the American people could not care less.
Nor can it be said that we don’t care because we don’t know. True, government authorities withhold certain aspects of ongoing military operations or release only details that they find convenient. Yet information describing what U.S. forces are doing (and where) is readily available, even if buried in recent months by barrages of presidential tweets. Here, for anyone interested, are press releases issued by United States Central Command for just one recent week:
September 19: Military airstrikes continue against ISIS terrorists in Syria and Iraq
September 20: Military airstrikes continue against ISIS terrorists in Syria and Iraq
Iraqi Security Forces begin Hawijah offensive
September 21: Military airstrikes continue against ISIS terrorists in Syria and Iraq
September 22: Military airstrikes continue against ISIS terrorists in Syria and Iraq
September 23: Military airstrikes continue against ISIS terrorists in Syria and Iraq
Operation Inherent Resolve Casualty
September 25: Military airstrikes continue against ISIS terrorists in Syria and Iraq
September 26: Military airstrikes continue against ISIS terrorists in Syria and Iraq
Ever since the United States launched its war on terror, oceans of military press releases have poured forth. And those are just for starters. To provide updates on the U.S. military’s various ongoing campaigns, generals, admirals, and high-ranking defense officials regularly testify before congressional committees or brief members of the press. From the field, journalists offer updates that fill in at least some of the details — on civilian casualties, for example — that government authorities prefer not to disclose. Contributors to newspaper op-ed pages and “experts” booked by network and cable TV news shows, including passels of retired military officers, provide analysis. Trailing behind come books and documentaries that put things in a broader perspective.
But here’s the truth of it. None of it matters.
Like traffic jams or robocalls, war has fallen into the category of things that Americans may not welcome, but have learned to live with. In twenty-first-century America, war is not that big a deal.
While serving as defense secretary in the 1960s, Robert McNamara once mused that the “greatest contribution” of the Vietnam War might have been to make it possible for the United States “to go to war without the necessity of arousing the public ire.” With regard to the conflict once widely referred to as McNamara’s War, his claim proved grotesquely premature. Yet a half-century later, his wish has become reality.
Why do Americans today show so little interest in the wars waged in their name and at least nominally on their behalf? Why, as our wars drag on and on, doesn’t the disparity between effort expended and benefits accrued arouse more than passing curiosity or mild expressions of dismay? Why, in short, don’t we give a [expletive deleted]?
Perhaps just posing such a question propels us instantly into the realm of the unanswerable, like trying to figure out why people idolize Justin Bieber, shoot birds, or watch golf on television.
Without any expectation of actually piercing our collective ennui, let me take a stab at explaining why we don’t give a @#$%&! Here are eight distinctive but mutually reinforcing explanations, offered in a sequence that begins with the blindingly obvious and ends with the more speculative.
Americans don’t attend all that much to ongoing American wars because:
1. U.S. casualty rates are low. By using proxies and contractors, and relying heavily on airpower, America’s war managers have been able to keep a tight lid on the number of U.S. troops being killed and wounded. In all of 2017, for example, a grand total of 11 American soldiers were lost in Afghanistan — about equal to the number of shooting deaths in Chicago over the course of a typical week. True, in Afghanistan, Iraq, and other countries where the U.S. is engaged in hostilities, whether directly or indirectly, plenty of people who are not Americans are being killed and maimed. (The estimated number of Iraqi civilians killed this year alone exceeds 12,000.) But those casualties have next to no political salience as far as the United States is concerned. As long as they don’t impede U.S. military operations, they literally don’t count (and generally aren’t counted).
2. The true costs of Washington’s wars go untabulated. In a famous speech, dating from early in his presidency, Dwight D. Eisenhower said that “Every gun that is made, every warship launched, every rocket fired signifies, in the final sense, a theft from those who hunger and are not fed, those who are cold and are not clothed.” Dollars spent on weaponry, Ike insisted, translated directly into schools, hospitals, homes, highways, and power plants that would go unbuilt. “This is not a way of life at all, in any true sense,” he continued. “[I]t is humanity hanging from a cross of iron.” More than six decades later, Americans have long since accommodated themselves to that cross of iron. Many actually see it as a boon, a source of corporate profits, jobs, and, of course, campaign contributions. As such, they avert their eyes from the opportunity costs of our never-ending wars. The dollars expended pursuant to our post-9/11 conflicts will ultimately number in the multi-trillions. Imagine the benefits of investing such sums in upgrading the nation’s aging infrastructure. Yet don’t count on Congressional leaders, other politicians, or just about anyone else to pursue that connection.
3. On matters related to war, American citizens have opted out. Others have made the point so frequently that it’s the equivalent of hearing “Rudolph the Red-Nosed Reindeer” at Christmastime. Even so, it bears repeating: the American people have defined their obligation to “support the troops” in the narrowest imaginable terms, ensuring above all that such support requires absolutely no sacrifice on their part. Members of Congress abet this civic apathy, while also taking steps to insulate themselves from responsibility. In effect, citizens and their elected representatives in Washington agree: supporting the troops means deferring to the commander in chief, without inquiring about whether what he has the troops doing makes the slightest sense. Yes, we set down our beers long enough to applaud those in uniform and boo those who decline to participate in mandatory rituals of patriotism. What we don’t do is demand anything remotely approximating actual accountability.
4. Terrorism gets hyped and hyped and hyped some more. While international terrorism isn’t a trivial problem (and wasn’t for decades before 9/11), it comes nowhere close to posing an existential threat to the United States. Indeed, other threats, notably the impact of climate change, constitute a far greater danger to the wellbeing of Americans. Worried about the safety of your children or grandchildren? The opioid epidemic constitutes an infinitely greater danger than “Islamic radicalism.” Yet having been sold a bill of goods about a “war on terror” that is essential for “keeping America safe,” mere citizens are easily persuaded that scattering U.S. troops throughout the Islamic world while dropping bombs on designated evildoers is helping win the former while guaranteeing the latter. To question that proposition becomes tantamount to suggesting that God might not have given Moses two stone tablets after all.
5. Blather crowds out substance. When it comes to foreign policy, American public discourse is — not to put too fine a point on it — vacuous, insipid, and mindlessly repetitive. William Safire of the New York Times once characterized American political rhetoric as BOMFOG, with those running for high office relentlessly touting the Brotherhood of Man and the Fatherhood of God. Ask a politician, Republican or Democrat, to expound on this country’s role in the world, and then brace yourself for some variant of WOSFAD, as the speaker insists that it is incumbent upon the World’s Only Superpower to spread Freedom and Democracy. Terms like leadership and indispensable are introduced, along with warnings about the dangers of isolationism and appeasement, embellished with ominous references to Munich. Such grandiose posturing makes it unnecessary to probe too deeply into the actual origins and purposes of American wars, past or present, or assess the likelihood of ongoing wars ending in some approximation of actual success. Cheerleading displaces serious thought.
6. Besides, we’re too busy. Think of this as a corollary to point five. Even if the present-day American political scene included figures like Senators Robert La Follette or J. William Fulbright, who long ago warned against the dangers of militarizing U.S. policy, Americans might no longer retain the capacity to attend to such critiques. Responding to the demands of the Information Age is not, it turns out, conducive to deep reflection. We live in an era (so we are told) when frantic multitasking has become a sort of duty and when being overscheduled is almost obligatory. Our attention span shrinks and with it our time horizon. The matters we attend to are those that happened just hours or minutes ago. Yet like the great solar eclipse of 2017 — hugely significant and instantly forgotten — those matters will, within another few minutes or hours, be superseded by some other development that briefly captures our attention. As a result, a dwindling number of Americans — those not compulsively checking Facebook pages and Twitter accounts — have the time or inclination to ponder questions like: When will the Afghanistan War end? Why has it lasted almost 16 years? Why doesn’t the finest fighting force in history actually win? Can’t package an answer in 140 characters or a 30-second made-for-TV sound bite? Well, then, slowpoke, don’t expect anyone to attend to what you have to say.
7. Anyway, the next president will save us. At regular intervals, Americans indulge in the fantasy that, if we just install the right person in the White House, all will be well. Ambitious politicians are quick to exploit this expectation. Presidential candidates struggle to differentiate themselves from their competitors, but all of them promise in one way or another to wipe the slate clean and Make America Great Again. Ignoring the historical record of promises broken or unfulfilled, and presidents who turn out not to be deities but flawed human beings, Americans — members of the media above all — pretend to take all this seriously. Campaigns become longer, more expensive, more circus-like, and ever less substantial. One might think that the election of Donald Trump would prompt a downward revision in the exalted expectations of presidents putting things right. Instead, especially in the anti-Trump camp, getting rid of Trump himself (Collusion! Corruption! Obstruction! Impeachment!) has become the overriding imperative, with little attention given to restoring the balance intended by the framers of the Constitution. The irony of Trump perpetuating wars that he once roundly criticized and then handing the conduct of those wars to generals devoid of ideas for ending them almost entirely escapes notice.
8. Our culturally progressive military has largely immunized itself from criticism. As recently as the 1990s, the U.S. military establishment aligned itself with the retrograde side of the culture wars. Who can forget the gays-in-the-military controversy that rocked Bill Clinton’s administration during his first weeks in office, as senior military leaders publicly denounced their commander-in-chief? Those days are long gone. Culturally, the armed forces have moved left. Today, the services go out of their way to project an image of tolerance and a commitment to equality on all matters related to race, gender, and sexuality. So when President Trump announced his opposition to transgendered persons serving in the armed forces, tweeting that the military “cannot be burdened with the tremendous medical costs and disruption that transgender in the military would entail,” senior officers politely but firmly disagreed and pushed back. Given the ascendancy of cultural issues near the top of the U.S. political agenda, the military’s embrace of diversity helps to insulate it from criticism and from being called to account for a less than sterling performance in waging wars. Put simply, critics who in an earlier day might have blasted military leaders for their inability to bring wars to a successful conclusion hold their fire. Having women graduate from Ranger School or command Marines in combat more than compensates for not winning.
A collective indifference to war has become an emblem of contemporary America. But don’t expect your neighbors down the street or the editors of the New York Times to lose any sleep over that fact. Even to notice it would require them — and us — to care.
Like it or not, the president of the United States embodies America itself. The individual inhabiting the White House has become the preeminent symbol of who we are and what we represent as a nation and a people. In a fundamental sense, he is us.
It was not always so. Millard Fillmore, the 13th president (1850-1853), presided over but did not personify the American republic. He was merely the federal chief executive. Contemporary observers did not refer to his term in office as the Age of Fillmore. With occasional exceptions, Abraham Lincoln in particular, much the same could be said of Fillmore’s successors. They brought to office low expectations, which they rarely exceeded. So when Chester A. Arthur (1881-1885) or William Howard Taft (1909-1913) left the White House, there was no rush to immortalize them by erecting gaudy shrines — now known as “presidential libraries” — to the glory of their presidencies. In those distant days, ex-presidents went back home or somewhere else where they could find work.
Over the course of the past century, all that has changed. Ours is a republic that has long since taken on the trappings of a monarchy, with the president inhabiting rarified space as our king-emperor. The Brits have their woman in Buckingham Palace. We have our man in the White House.
Nominally, the Constitution assigns responsibilities and allocates prerogatives to three co-equal branches of government. In practice, the executive branch enjoys primacy. Prompted by a seemingly endless series of crises since the Great Depression and World War II, presidents have accumulated ever-greater authority, partly through usurpation, but more often than not through forfeiture.
At the same time, they also took on various extra-constitutional responsibilities. By the beginning of the present century, Americans took it for granted that the occupant of the Oval Office should function as prophet, moral philosopher, style-setter, interpreter of the prevailing zeitgeist, and — last but hardly least — celebrity-in-chief. In short, POTUS was the bright star at the center of the American solar system.
As recently as a year ago, few saw in this cult of the presidency cause for complaint. On odd occasions, some particularly egregious bit of executive tomfoolery might trigger grumbling about an “imperial presidency.” Yet rarely did such complaints lead to effective remedial action. The War Powers Resolution of 1973 might be considered the exception that proves the rule. Inspired by the disaster of the Vietnam War and intended to constrain presidents from using force without congressional buy-in and support, that particular piece of legislation ranks alongside the Volstead Act of 1919 (enacted to enforce Prohibition) as among the least effective ever to become law.
In truth, influential American institutions — investment banks and multinational corporations, churches and universities, big city newspapers and TV networks, the bloated national security apparatus and both major political parties — have found reason aplenty to endorse a system that elevates the president to the status of demigod. By and large, it’s been good for business, whatever that business happens to be.
Furthermore, it’s our president — not some foreign dude — who is, by common consent, the most powerful person in the universe. For inhabitants of a nation that considers itself both “exceptional” and “indispensable,” this seems only right and proper. So Americans generally like it that their president is the acknowledged Leader of the Free World rather than some fresh-faced pretender from France or Canada.
Then came the Great Hysteria. Arriving with a Pearl Harbor-like shock, it erupted on the night of November 8, 2016, just as it became apparent that Hillary Clinton was losing Florida and seemed certain to lose much else besides.
Suddenly, all the habits and precedents that had contributed to empowering the modern American presidency no longer made sense. That a single deeply flawed individual along with a handful of unelected associates and family members should be entrusted with determining the fate of the planet suddenly seemed the very definition of madness.
Emotion-laden upheavals producing behavior that is not entirely rational are hardly unknown in the American experience. Indeed, they recur with some frequency. The Great Awakenings of the eighteenth and early nineteenth centuries are examples of the phenomenon. So also are the two Red Scares of the twentieth century, the first in the early 1920s and the second, commonly known as “McCarthyism,” coinciding with the onset of the Cold War.
Yet the response to Donald Trump’s election, combining as it has fear, anger, bewilderment, disgust, and something akin to despair, qualifies as an upheaval without precedent. History itself had seemingly gone off the rails. The crude Andrew Jackson’s 1828 ousting of an impeccably pedigreed president, John Quincy Adams, was nothing compared to the vulgar Donald Trump’s defeat of an impeccably credentialed graduate of Wellesley and Yale who had served as first lady, United States senator, and secretary of state. A self-evidently inconceivable outcome — all the smart people agreed on that point — had somehow happened anyway.
A vulgar, bombastic, thrice-married real-estate tycoon and reality TV host as prophet, moral philosopher, style-setter, interpreter of the prevailing zeitgeist, and chief celebrity? The very idea seemed both absurd and intolerable.
If we have, as innumerable commentators assert, embarked upon the Age of Trump, the defining feature of that age might well be the single-minded determination of those horrified and intent on ensuring its prompt termination. In 2016, TIME magazine chose Trump as its person of the year. In 2017, when it comes to dominating the news, that “person” might turn out to be a group — all those fixated on cleansing the White House of Trump’s defiling presence.
Egged on and abetted in every way by Trump himself, the anti-Trump resistance has made itself the Big Story. Lies, hate, collusion, conspiracy, fascism: rarely has the everyday vocabulary of American politics been as ominous and forbidding as over the past six months. Take resistance rhetoric at face value and you might conclude that Donald Trump is indeed the fifth horseman of the Apocalypse, his presence in the presidential saddle eclipsing all other concerns. Pestilence, War, Famine, and Death will just have to wait.
The unspoken assumption of those most determined to banish him from public life appears to be this: once he’s gone, history will be returned to its intended path, humankind will breathe a collective sigh of relief, and all will be well again. Yet such an assumption strikes me as remarkably wrongheaded — and not merely because, should Trump prematurely depart from office, Mike Pence will succeed him. Expectations that Trump’s ouster will restore normalcy ignore the very factors that first handed him the Republican nomination (with a slew of competitors wondering what hit them) and then put him in the Oval Office (with a vastly more seasoned and disciplined, if uninspiring, opponent left to bemoan the injustice of it all).
Not all, but many of Trump’s supporters voted for him for the same reason that people buy lottery tickets: Why not? In their estimation, they had little to lose. Their loathing of the status quo is such that they may well stick with Trump even as it becomes increasingly obvious that his promise of salvation — an America made “great again” — is not going to materialize.
Yet those who imagine that Trump’s removal will put things right are likewise deluding themselves. To persist in thinking that he defines the problem is to commit an error of the first order. Trump is not cause, but consequence.
For too long, the cult of the presidency has provided an excuse for treating politics as a melodrama staged at four-year intervals and centering on hopes of another Roosevelt or Kennedy or Reagan appearing as the agent of American deliverance. Donald Trump’s ascent to the office once inhabited by those worthies should demolish such fantasies once and for all.
How is it that someone like Trump could become president in the first place? Blame sexism, Fox News, James Comey, Russian meddling, and Hillary’s failure to visit Wisconsin all you want, but a more fundamental explanation is this: the election of 2016 constituted a de facto referendum on the course of recent American history. That referendum rendered a definitive judgment: the underlying consensus informing U.S. policy since the end of the Cold War has collapsed. Precepts that members of the policy elite have long treated as self-evident no longer command the backing or assent of the American people. Put simply: it’s the ideas, stupid.
Rabbit Poses a Question
“Without the Cold War, what’s the point of being an American?” As the long twilight struggle was finally winding down, Harry “Rabbit” Angstrom, novelist John Updike’s late-twentieth-century Everyman, pondered that question. In short order, Rabbit got his answer. So, too, after only perfunctory consultation, did his fellow citizens.
The passing of the Cold War offered cause for celebration. On that point all agreed. Yet, as it turned out, it did not require reflection from the public at large. Policy elites professed to have matters well in hand. The dawning era, they believed, summoned Americans not to think anew, but to keep doing precisely what they were accustomed to doing, albeit without fretting further about Communist takeovers or the risks of nuclear Armageddon. In a world where a “single superpower” was calling the shots, utopia was right around the corner. All that was needed was for the United States to demonstrate the requisite confidence and resolve.
Three specific propositions made up the elite consensus that coalesced during the initial decade of the post-Cold-War era. According to the first, the globalization of corporate capitalism held the key to wealth creation on a hitherto unimaginable scale. According to the second, jettisoning norms derived from Judeo-Christian religious traditions held the key to the further expansion of personal freedom. According to the third, muscular global leadership exercised by the United States held the key to promoting a stable and humane international order.
Unfettered neoliberalism plus the unencumbered self plus unabashed American assertiveness, all supercharged by what enthusiasts called the information revolution: these defined the elements of the post-Cold-War consensus that formed during the first half of the 1990s. The miracle of that “revolution,” gathering momentum just as the Soviet Union was going down for the count, provided the secret sauce that infused the emerging consensus with a sense of historical inevitability.
The Cold War itself had fostered notable improvements in computational speed and capacity, new modes of communication, and techniques for storing, accessing, and manipulating information. Yet, however impressive, such developments remained subsidiary to the larger East-West competition. Only as the Cold War receded did they move from background to forefront. For true believers, information technology came to serve a quasi-theological function, promising answers to life’s ultimate questions. Although God might be dead, Americans found in Bill Gates and Steve Jobs nerdy but compelling idols.
More immediately, in the eyes of the policy elite, the information revolution meshed with and reinforced the policy consensus. For those focused on the political economy, it greased the wheels of globalized capitalism, creating vast new opportunities for trade and investment. For those looking to shed constraints on personal freedom, information promised empowerment, making identity itself something to choose, discard, or modify. For members of the national security apparatus, the information revolution seemed certain to endow the United States with seemingly unassailable military capabilities. That these various enhancements would combine to improve the human condition was taken for granted; that they would, in due course, align everybody — from Afghans to Zimbabweans — with American values and the American way of life seemed more or less inevitable.
The three presidents of the post-Cold-War era — Bill Clinton, George W. Bush, and Barack Obama — put these several propositions to the test. Politics-as-theater requires us to pretend that our 42nd, 43rd, and 44th presidents differed in fundamental ways. In practice, however, their similarities greatly outweighed any of those differences. Taken together, the administrations over which they presided collaborated in pursuing a common agenda, each intent on proving that the post-Cold-War consensus could work in the face of mounting evidence to the contrary.
To be fair, it did work for some. “Globalization” made some people very rich indeed. In doing so, however, it greatly exacerbated inequality, while doing nothing to alleviate the condition of the American working class and underclass.
The emphasis on diversity and multiculturalism improved the status of groups long subjected to discrimination. Yet these advances have done remarkably little to reduce the alienation and despair pervading a society suffering from epidemics of chronic substance abuse, morbid obesity, teen suicide, and similar afflictions. Throw in the world’s highest incarceration rate, a seemingly endless appetite for porn, urban school systems mired in permanent crisis, and mass shootings that occur with metronomic regularity, and what you have is something other than the profile of a healthy society.
As for militarized American global leadership, it has indeed resulted in various bad actors meeting richly deserved fates. Goodbye, Saddam. Good riddance, Osama. Yet it has also embroiled the United States in a series of costly, senseless, unsuccessful, and ultimately counterproductive wars. As for the vaunted information revolution, its impact has been ambiguous at best, even if those with eyeballs glued to their personal electronic devices can’t tolerate being offline long enough to assess the actual costs of being perpetually connected.
In November 2016, Americans who consider themselves ill served by the post-Cold-War consensus signaled that they had had enough. Voters not persuaded that neoliberal economic policies, a culture taking its motto from the Outback steakhouse chain, and a national security strategy that employs the U.S. military as a global police force were working to their benefit provided a crucial margin in the election of Donald Trump.
The response of the political establishment to this extraordinary repudiation testifies to the extent of its bankruptcy. The Republican Party still clings to the notion that reducing taxes, cutting government red tape, restricting abortion, curbing immigration, prohibiting flag-burning, and increasing military spending will alleviate all that ails the country. Meanwhile, to judge by the promises contained in their recently unveiled (and instantly forgotten) program for a “Better Deal,” Democrats believe that raising the minimum wage, capping the cost of prescription drugs, and creating apprenticeship programs for the unemployed will return their party to the good graces of the American electorate.
In both parties, embarrassingly small-bore thinking prevails, with Republicans and Democrats equally bereft of fresh ideas. Each party is led by aging hacks. Neither has devised an antidote to the crisis in American politics signified by the nomination and election of Donald Trump.
While our emperor tweets, Rome itself fiddles.
I am by temperament a conservative and a traditionalist, wary of revolutionary movements that more often than not end up being hijacked by nefarious plotters more interested in satisfying their own ambitions than in pursuing high ideals. Yet even I am prepared to admit that the status quo appears increasingly untenable. Incremental change will not suffice. The challenge of the moment is to embrace radicalism without succumbing to irresponsibility.
The one good thing we can say about the election of Donald Trump — to borrow an image from Thomas Jefferson — is this: it ought to serve as a fire bell in the night. If Americans have an ounce of sense, the Trump presidency will cure them once and for all of the illusion that from the White House comes redemption. By now we ought to have had enough of de facto monarchy.
By extension, Americans should come to see as intolerable the meanness, corruption, and partisan dysfunction so much in evidence at the opposite end of Pennsylvania Avenue. We need not wax sentimental over the days when Lyndon Johnson and Everett Dirksen presided over the Senate to conclude that Mitch McConnell and Chuck Schumer represent something other than progress. If Congress continues to behave as contemptibly as it has in recent years (and in recent weeks), it will, by default, allow the conditions that have produced Trump and his cronies to prevail.
So it’s time to take another stab at an approach to governance worthy of a democratic republic. Where to begin? I submit that Rabbit Angstrom’s question offers a place to start: What’s the point of being an American?
Authentic progressives and principled conservatives will offer different answers to Rabbit’s query. My own answer is rooted in an abiding conviction that our problems are less quantitative than qualitative. Rather than simply more — yet more wealth, more freedom, more attempts at global leadership — the times call for different. In my view, the point of being an American is to participate in creating a society that strikes a balance between wants and needs, that exists in harmony with nature and the rest of humankind, and that is rooted in an agreed-upon conception of the common good.
My own prescription for how to act upon that statement of purpose is unlikely to find favor with most readers of TomDispatch. But therein lies the basis for an interesting debate, one that is essential to prospects for stemming the accelerating decay of American civic life.
Initiating such a debate, and so bringing into focus core issues, will remain next to impossible, however, without first clearing away the accumulated debris of the post-Cold-War era. Preliminary steps in that direction, listed in no particular order, ought to include the following:
First, abolish the Electoral College. Doing so will preclude any further occurrence of the circumstances that twice in recent decades cast doubt on the outcome of national elections and thereby did far more than any foreign interference to undermine the legitimacy of American politics.
Second, roll back gerrymandering. Doing so will help restore competitive elections and make incumbency more tenuous.
Third, limit the impact of corporate money on elections at all levels, if need be by amending the Constitution.
Fourth, mandate a balanced federal budget, thereby demolishing the pretense that Americans need not choose between guns and butter.
Fifth, implement a program of national service, thereby eliminating the All-Volunteer military and restoring the tradition of the citizen-soldier. Doing so will help close the gap between the military and society and enrich the prevailing conception of citizenship. It might even encourage members of Congress to think twice before signing off on wars that the commander-in-chief wants to fight.
Sixth, enact tax policies that will promote greater income equality.
Seventh, increase public funding for public higher education, thereby ensuring that college remains an option for those who are not well-to-do.
Eighth, beyond mere “job” creation, attend to the growing challenges of providing meaningful work — employment that is both rewarding and reasonably remunerative — for those without advanced STEM degrees.
Ninth, end the thumb-twiddling on climate change and start treating it as the first-order national security priority that it is.
Tenth, absent evident progress on the above, create a new party system, breaking the current duopoly in which Republicans and Democrats tacitly collaborate to dictate the policy agenda and restrict the range of policy options deemed permissible.
These are not particularly original proposals and I do not offer them as a panacea. They may, however, represent preliminary steps toward devising some new paradigm to replace a post-Cold-War consensus that, in promoting transnational corporate greed, mistaking libertinism for liberty, and embracing militarized neo-imperialism as the essence of statecraft, has paved the way for the presidency of Donald Trump.
We can and must do better. But doing so will require that we come up with better and truer ideas to serve as a foundation for American politics.
Slouching Toward Mar-a-Lago
Donald Trump’s election has elicited impassioned affirmations of a renewed commitment to unvarnished truth-telling from the prestige media. The common theme: you know you can’t trust him, but trust us to keep dogging him on your behalf. The New York Times has even unveiled a portentous new promotional slogan: “The truth is now more important than ever.” For its part, the Washington Post grimly warns that “democracy dies in darkness,” and is offering itself as a source of illumination now that the rotund figure of the 45th president has produced the political equivalent of a total eclipse of the sun. Meanwhile, National Public Radio fundraising campaigns are sounding an increasingly panicky note: give, listener, lest you be personally responsible for the demise of the Republic that we are bravely fighting to save from extinction.
If only it were so. How wonderful it would be if President Trump’s ascendancy had coincided with a revival of hard-hitting, deep-dive, no-holds-barred American journalism. Alas, that’s hardly the case. True, the big media outlets are demonstrating both energy and enterprise in exposing the ineptitude, inconsistency, and dubious ethical standards, as well as outright lies and fake news, that are already emerging as Trump era signatures. That said, pointing out that the president has (again) uttered a falsehood, claimed credit for a nonexistent achievement, or abandoned some position to which he had previously sworn fealty requires something less than the sleuthing talents of a Sherlock Holmes. As for beating up on poor Sean Spicer for his latest sequence of gaffes — well, that’s more akin to sadism than reporting.
Apart from a commendable determination to discomfit Trump and members of his inner circle (select military figures excepted, at least for now), journalism remains pretty much what it was prior to November 8th of last year: personalities built up only to be torn down; fads and novelties discovered, celebrated, then mocked; “extraordinary” stories of ordinary people granted 15 seconds of fame only to once again be consigned to oblivion — all served with a side dish of that day’s quota of suffering, devastation, and carnage. These remain journalism’s stock-in-trade. As practiced in the United States, with certain honorable (and hence unprofitable) exceptions, journalism remains superficial, voyeuristic, and governed by the attention span of a two-year-old.
As a result, all those editors, reporters, columnists, and talking heads who characterize their labors as “now more important than ever” ill-serve the public they profess to inform and enlighten. Rather than clearing the air, they befog it further. If anything, the media’s current obsession with Donald Trump — his every utterance or tweet treated as “breaking news!” — just provides one additional excuse for highlighting trivia, while slighting issues that deserve far more attention than they currently receive.
To illustrate the point, let me cite some examples of national security issues that presently receive short shrift or are ignored altogether by those parts of the Fourth Estate said to help set the nation’s political agenda. To put it another way: Hey, Big Media, here are two dozen matters to which you’re not giving faintly adequate thought and attention.
1. Accomplishing the “mission”: Since the immediate aftermath of World War II, the United States has been committed to defending key allies in Europe and East Asia. Not long thereafter, U.S. security guarantees were extended to the Middle East as well. Under what circumstances can Americans expect nations in these regions to assume responsibility for managing their own affairs? To put it another way, when (if ever) might U.S. forces actually come home? And if it is incumbent upon the United States to police vast swaths of the planet in perpetuity, how should momentous changes in the international order — the rise of China, for example, or accelerating climate change — affect the U.S. approach to doing so?
2. American military supremacy: The United States military is undoubtedly the world’s finest. It’s also far and away the most generously funded, with policymakers offering U.S. troops no shortage of opportunities to practice their craft. So why doesn’t this great military ever win anything? Or put another way, why in recent decades have those forces been unable to accomplish Washington’s stated wartime objectives? Why has the now 15-year-old war on terror failed to result in even a single real success anywhere in the Greater Middle East? Could it be that we’ve taken the wrong approach? What should we be doing differently?
3. America’s empire of bases: The U.S. military today garrisons the planet in a fashion without historical precedent. Successive administrations, regardless of party, justify and perpetuate this policy by insisting that positioning U.S. forces in distant lands fosters peace, stability, and security. In the present century, however, perpetuating this practice has visibly had the opposite effect. In the eyes of many of those called upon to “host” American bases, the permanent presence of such forces smacks of occupation. They resist. Why should U.S. policymakers expect otherwise?
4. Supporting the troops: In present-day America, expressing reverence for those who serve in uniform is something akin to a religious obligation. Everyone professes to cherish America’s “warriors.” Yet such bountiful, if superficial, expressions of regard camouflage a growing gap between those who serve and those who applaud from the sidelines. Our present-day military system, based on the misnamed All-Volunteer Force, is neither democratic nor effective. Why has discussion and debate about its deficiencies not found a place among the nation’s political priorities?
5. Prerogatives of the commander-in-chief: Are there any military actions that the president of the United States may not order on his own authority? If so, what are they? Bit by bit, decade by decade, Congress has abdicated its assigned role in authorizing war. Today, it merely rubberstamps what presidents decide to do (or simply stays mum). Who does this deference to an imperial presidency benefit? Have U.S. policies thereby become more prudent, enlightened, and successful?
6. Assassin-in-chief: A policy of assassination, secretly implemented under the aegis of the CIA during the early Cold War, yielded few substantive successes. When the secrets were revealed, however, the U.S. government suffered considerable embarrassment, so much so that presidents forswore politically motivated murder. After 9/11, however, Washington returned to the assassination business in a big way and on a global scale, using drones. Today, the only secret is the sequence of names on the current presidential hit list, euphemistically known as the White House “disposition matrix.” But does assassination actually advance U.S. interests (or does it merely recruit replacements for the terrorists it liquidates)? How can we measure its costs, whether direct or indirect? What dangers and vulnerabilities does this practice invite?
7. The war formerly known as the “Global War on Terrorism”: What precisely is Washington’s present strategy for defeating violent jihadism? What sequence of planned actions or steps is expected to yield success? If no such strategy exists, why is that the case? How is it that the absence of strategy — not to mention an agreed upon definition of “success” — doesn’t even qualify for discussion here?
8. The campaign formerly known as Operation Enduring Freedom: The conflict commonly referred to as the Afghanistan War is now the longest in U.S. history — having lasted longer than the Civil War, World War I, and World War II combined. What is the Pentagon’s plan for concluding that conflict? When might Americans expect it to end? On what terms?
9. The Gulf: Americans once believed that their prosperity and way of life depended on having assured access to Persian Gulf oil. Today, that is no longer the case. The United States is once more an oil exporter. Available and accessible reserves of oil and natural gas in North America are far greater than was once believed. Yet the assumption that the Persian Gulf still qualifies as crucial to American national security persists in Washington. Why?
10. Hyping terrorism: Each year terrorist attacks kill far fewer Americans than do auto accidents, drug overdoses, or even lightning strikes. Yet in the allocation of government resources, preventing terrorist attacks takes precedence over preventing all three of the others combined. Why is that?
11. Deaths that matter and deaths that don’t: Why do terrorist attacks that kill a handful of Europeans command infinitely more American attention than do terrorist attacks that kill far larger numbers of Arabs? A terrorist attack that kills citizens of France or Belgium elicits from the United States heartfelt expressions of sympathy and solidarity. A terrorist attack that kills Egyptians or Iraqis elicits shrugs. Why the difference? To what extent does race provide the answer to that question?
12. Israeli nukes: What purpose is served by indulging the pretense that Israel does not have nuclear weapons?
13. Peace in the Holy Land: What purpose is served by indulging illusions that a “two-state solution” offers a plausible resolution to the Israeli-Palestinian conflict? As remorselessly as white settlers once encroached upon territory inhabited by Native American tribes, Israeli settlers expand their presence in the occupied territories year by year. As they do, the prospect of a viable Palestinian state grows ever more remote. To pretend otherwise is the equivalent of thinking that one day President Trump might prefer the rusticity of Camp David to the glitz of Mar-a-Lago.
14. Merchandizing death: When it comes to arms sales, there is no need to Make America Great Again. The U.S. ranks number one by a comfortable margin, with long-time allies Saudi Arabia and Israel among the leading recipients of those arms. Each year, the Saudis (per capita gross domestic product $20,000) purchase hundreds of millions of dollars of U.S. weapons. Israel (per capita gross domestic product $38,000) gets several billion dollars’ worth of such weaponry annually courtesy of the American taxpayer. If the Saudis pay for U.S. arms, why shouldn’t the Israelis? They can certainly afford to do so.
15. Our friends the Saudis (I): Fifteen of the 19 hijackers on September 11, 2001, were Saudis. What does that fact signify?
16. Our friends the Saudis (II): If indeed Saudi Arabia and Iran are competing to determine which nation will enjoy the upper hand in the Persian Gulf, why should the United States favor Saudi Arabia? In what sense do Saudi values align more closely with American values than do Iranian ones?
17. Our friends the Pakistanis: Pakistan behaves like a rogue state. It is a nuclear weapons proliferator. It supports the Taliban. For years, it provided sanctuary to Osama bin Laden. Yet U.S. policymakers treat Pakistan as if it were an ally. Why? In what ways do U.S. and Pakistani interests or values coincide? If there are none, why not say so?
18. Free-loading Europeans: Why can’t Europe, “whole and free,” its population and economy considerably larger than Russia’s, defend itself? It’s altogether commendable that U.S. policymakers should express support for Polish independence and root for the Baltic republics. But how does it make sense for the United States to care more about the wellbeing of people living in Eastern Europe than do people living in Western Europe?
19. The mother of all “special relationships”: The United States and the United Kingdom have a “special relationship” dating from the days of Franklin Roosevelt and Winston Churchill. Apart from keeping the Public Broadcasting Service supplied with costume dramas and stories featuring eccentric detectives, what is the rationale for that partnership today? Why should U.S. relations with Great Britain, a fading power, be any more “special” than its relations with a rising power like India? Why should the bonds connecting Americans and Britons be any more intimate than those connecting Americans and Mexicans? Why does a republic now approaching the 241st anniversary of its independence still need a “mother country”?
20. The old nuclear disarmament razzmatazz: American presidents routinely cite their hope for the worldwide elimination of nuclear weapons. Yet the U.S. maintains nuclear strike forces on full alert, has embarked on a costly and comprehensive trillion-dollar modernization of its nuclear arsenal, and even refuses to adopt a no-first-use posture when it comes to nuclear war. The truth is that the United States will consider surrendering its nukes only after every other nation on the planet has done so first. How does American nuclear hypocrisy affect the prospects for global nuclear disarmament or even simply for the non-proliferation of such weaponry?
21. Double standards (I): American policymakers take it for granted that their country’s sphere of influence is global, which, in turn, provides the rationale for the deployment of U.S. military forces to scores of countries. Yet when it comes to nations like China, Russia, or Iran, Washington takes the position that spheres of influence are obsolete and a concept that should no longer be applicable to the practice of statecraft. So Chinese, Russian, and Iranian forces should remain where they belong — in China, Russia, and Iran. To stray beyond that constitutes a provocation, as well as a threat to global peace and order. Why should these other nations play by American rules? Why shouldn’t similar rules apply to the United States?
22. Double standards (II): Washington claims that it supports and upholds international law. Yet when international law gets in the way of what American policymakers want to do, they disregard it. They start wars, violate the sovereignty of other nations, and authorize agents of the United States to kidnap, imprison, torture, and kill. They do these things with impunity, only forced to reverse their actions on the rare occasions when U.S. courts find them illegal. Why should other powers treat international norms as sacrosanct since the United States does so only when convenient?
23. Double standards (III): The United States condemns the indiscriminate killing of civilians in wartime. Yet over the last three-quarters of a century, it has killed civilians regularly and often on a massive scale. By what logic, since the 1940s, has the killing of Germans, Japanese, Koreans, Vietnamese, Laotians, Cambodians, Afghans, and others by U.S. air power been any less reprehensible than the Syrian government’s use of “barrel bombs” to kill Syrians today? On what basis should Americans accept Pentagon claims that, when civilians are killed these days by U.S. forces, the acts are invariably accidental, whereas Syrian forces kill civilians intentionally and out of malice? Why exclude incompetence or the fog of war as explanations? And why, for instance, does the United States regularly gloss over or ignore altogether the noncombatants that Saudi forces (with U.S. assistance) are routinely killing in Yemen?
24. Moral obligations: When confronted with some egregious violation of human rights, members of the chattering classes frequently express an urge for the United States to “do something.” Holocaust analogies sprout like dandelions. Newspaper columnists recycle copy first used when Cambodians were slaughtering other Cambodians en masse or whenever Hutus and Tutsis went at it. Proponents of action — typically advocating military intervention — argue that the United States has a moral obligation to aid those victimized by injustice or cruelty anywhere on Earth. But what determines the pecking order of such moral obligations? Which comes first, a responsibility to redress the crimes of others or a responsibility to redress crimes committed by Americans? Who has a greater claim to U.S. assistance, Syrians suffering today under the boot of Bashar al-Assad or Iraqis, their country shattered by the U.S. invasion of 2003? Where do the Vietnamese fit into the queue? How about the Filipinos, brutally denied independence and forcibly incorporated into an American empire as the nineteenth century ended? Or African-Americans, whose ancestors were imported as slaves? Or, for that matter, dispossessed and disinherited Native Americans? Is there a statute of limitations that applies to moral obligations? And if not, shouldn’t those who have waited longest for justice or reparations receive priority attention?
Let me suggest that any one of these two dozen issues — none seriously covered, discussed, or debated in the American media or in the political mainstream — bears more directly on the wellbeing of the United States and our prospects for avoiding global conflict than anything Donald Trump may have said or done during his first 100 days as president. Collectively, they define the core of the national security challenges that presently confront this country, even as they languish on the periphery of American politics.
How much damage Donald Trump’s presidency wreaks before it ends remains to be seen. Yet he himself is a transient phenomenon. To allow his pratfalls and shenanigans to divert attention from matters sure to persist when he finally departs the stage is to make a grievous error. It may well be that, as the Times insists, the truth is now more important than ever. If so, finding the truth requires looking in the right places and asking the right questions.
By way of explaining his eight failed marriages, the American bandleader Artie Shaw once remarked, “I am an incurable optimist.” In reality, Artie was an incurable narcissist. Utterly devoid of self-awareness, he never looked back, only forward.
So, too, with the incurable optimists who manage present-day American wars. What matters is not past mistakes but future opportunities. This describes the view of General Joseph Votel, current head of U.S. Central Command (CENTCOM). Since its creation in 1983, CENTCOM has emerged as the ne plus ultra of the Pentagon’s several regional commands, the place where the action is always hot and heavy. Votel is the latest in a long train of four-star generals to preside over that action.
The title of this essay (exclamation point included) captures in a single phrase the “strategic approach” that Votel has devised for CENTCOM. That approach, according to the command’s website, is “proactive in nature and endeavors to set in motion tangible actions in a purposeful, consistent, and continuous manner.”
This strategic approach forms but one element in General Votel’s multifaceted (if murky) “command narrative,” which he promulgated last year upon taking the helm at CENTCOM headquarters in Tampa, Florida. Other components include a “culture,” a “vision,” a “mission,” and “priorities.” CENTCOM’s culture emphasizes “persistent excellence,” as the command “strives to understand and help others to comprehend, with granularity and clarity, the complexities of our region.” The vision, indistinguishable from the mission except perhaps for those possessing advanced degrees in hermeneutics, seeks to provide “a more stable and prosperous region with increasingly effective governance, improved security, and trans-regional cooperation.” Toward that estimable end, CENTCOM’s priorities include forging partnerships with other nations “based upon shared values,” “actively counter[ing] the malign influence” of hostile regimes, and “degrading and defeating violent extremist organizations and their networks.”
At present, CENTCOM is busily implementing the several components of Votel’s command narrative across an “area of responsibility” (AOR) consisting of 20 nations, among them Iran, Iraq, Syria, Afghanistan, and Pakistan. As the CENTCOM website puts it, without batting a digital eyelash, that AOR “spans more than 4 million square miles and is populated by more than 550 million people from 22 ethnic groups, speaking 18 languages with hundreds of dialects and confessing multiple religions which transect national borders.”
According to the Department of Defense Dictionary of Military and Associated Terms, an AOR is the “geographical area associated with a combatant command within which a geographic combatant commander has authority to plan and conduct operations.” Yet this anodyne definition fails to capture the spirit of the enterprise in which General Votel is engaged.
One imagines that there must be another Department of Defense Dictionary, kept under lock and key in the Pentagon, that dispenses with the bland language and penchant for deceptive euphemisms. That dictionary would define an AOR as “a vast expanse within which the United States seeks to impose order without exercising sovereignty.” An AOR combines aspects of colony, protectorate, and contested imperial frontier. In that sense, the term represents the latest incarnation of the informal empire that American elites have pursued in various forms ever since U.S. forces “liberated” Cuba in 1898.
To say that a military officer presiding over an AOR plans and conducts operations is a bit like saying that Jeff Bezos sells books. It’s a small truth that evades a larger one. To command CENTCOM is to function as a proconsul, to inhabit as a co-equal the rarified realm of kings, presidents, and prime ministers. CENTCOM commanders shape the future of their AOR — or at least fancy that they do.
Sustaining expectations of shaping the future requires a suitably accommodating version of the past. For CENTCOM, history is a record of events selected and arranged to demonstrate progress. By testifying to the achievements of previous CENTCOM commanders, history thereby validates Votel’s own efforts to carry on their work. Not for nothing, therefore, does the command’s website include this highly sanitized account of its recent past:
“In the wake of 9-11, the international community found Saddam Hussein’s continued lack of cooperation with United Nations Security Council (UNSC) Resolutions regarding weapons of mass destruction unacceptable. Hussein’s continued recalcitrance led the UNSC to authorize the use of force by a U.S.-led coalition. Operation Iraqi Freedom began 19 March 2003.
“Following the defeat of both the Taliban regime in Afghanistan (9 November 2001) and Saddam Hussein’s government in Iraq (8 April 2003), CENTCOM has continued to provide security to the new freely-elected governments in those countries, conducting counterinsurgency operations and assisting host nation security forces to provide for their own defense.”
Setbacks, disappointments, miscalculations, humiliations: you won’t hear about them from CENTCOM. Like Broadway’s Annie, down at headquarters in Tampa they’re “just thinkin’ about tomorrow,” which “clears away the cobwebs, and the sorrow, till there’s none!”
(Give the Vietnam War the CENTCOM treatment and you would end up with something like this: “Responding to unprovoked North Vietnamese attacks and acting at the behest of the international community, a U.S.-led coalition arrived to provide security to the freely-elected South Vietnamese government, conducting counterinsurgency operations and assisting host nation security forces to provide for their own defense.”)
In fact, the U.N. Security Council did not authorize the 2003 invasion of Iraq. Indeed, efforts by George W. Bush’s administration to secure such an authorization failed abysmally, collapsing in a welter of half-truths and outright falsehoods. What much of the international community found unacceptable, more so even than Saddam’s obstreperousness, was Bush’s insistence that he was going to have his war regardless of what others might think. As for celebrating the “defeat” of the Taliban and of Saddam, that’s the equivalent of declaring “game over” when the whistle sounds ending the first quarter of a football game.
More to the point, to claim that, in the years since, CENTCOM “has continued to provide security to the new freely-elected governments” of Afghanistan and Iraq whitewashes history in ways that would cause the most shameless purveyor of alt-facts on Fox News to blush. The incontestable truth is that Afghans and Iraqis have not known security since U.S. forces, under the direction of General Votel’s various predecessors, arrived on the scene. Rather than providing security, CENTCOM has undermined it.
CENTCOM Headquarters (Where It’s Always Groundhog Day)
Even so, as the current steward of CENTCOM’s culture, vision, mission, strategic approach, and priorities, General Votel remains undaunted. In his view, everything that happened prior to his assuming ownership of the CENTCOM AOR is irrelevant. What matters is what will happen from now on — in Washington-speak, “going forward.” As with Artie Shaw, serial disappointments leave intact the conviction that persistence will ultimately produce a happy ending.
Earlier this month, Votel provided a progress report to the Senate Armed Services Committee and outlined his expectations for future success. In a city that now competes for the title of Comedy Central, few paid serious attention to what the CENTCOM commander had to say. Yet his presentation was, in its own way, emblematic of how, in the Age of Trump, U.S. national security policy has become fully divorced from reality.
General Votel began by inventorying the various “drivers of instability” afflicting his AOR. That list, unsurprisingly enough, turned out to be a long one, including ethnic and sectarian divisions, economic underdevelopment, an absence of opportunity for young people “susceptible to unrest [and] radical ideologies,” civil wars, humanitarian crises, large refugee populations, and “competition among outside actors, including Russia and China, seeking to promote their interests and supplant U.S. influence in the region.” Not qualifying for mention as destabilizing factors, however, were the presence and activities of U.S. military forces, their footprint dwarfing that of Russia and China.
Indeed, the balance of Votel’s 64-page written statement argued, in effect, that U.S. military activities are the key to fixing all that ails the CENTCOM AOR. After making a brief but obligatory bow to the fact that “a solely military response is not sufficient” to address the region’s problems, he proceeded to describe at length the military response (and only the military response) that will do just that.
Unfortunately for General Votel, length does not necessarily correlate with substance. Once upon a time, American military professionals prized brevity and directness in their writing. Not so the present generation of generals, who are given to logorrhea. Consider just this bit of cliché-ridden drivel — I could quote vast passages of it — that Votel inflicted on members of the United States Senate. “In a region beset by myriad challenges,” he reported,
“we must always be on the look-out for opportunities to seize the initiative to support our objectives and goals. Pursuing opportunities means that we are proactive — we don’t wait for problems to be presented; we look for ways to get ahead of them. It also means that we have to become comfortable with transparency and flat communications — our ability to understand our AOR better than anyone else gives us the advantage of knowing where opportunities exist. Pursuing opportunities also means we have to take risk — by delegating authority and responsibility to the right level, by trusting our partners, and being willing to trust our best instincts in order to move faster than our adversaries.”
In third-tier business schools, bromides of this sort might pass for “best practices.” But my guess is that George C. Marshall or Dwight D. Eisenhower would award the author of that paragraph an F and return him to staff college for further instruction.
Frothy verbiage aside, what exactly does General Votel propose? The answer — for those with sufficient patience to wade through the entire 64 pages — reduces to this: persist. In concrete terms, that means keeping on killing and enabling our “allies” to do the same until the other side is finally exhausted and gives up. In other words, it’s the movie Groundhog Day transposed from Punxsutawney, Pennsylvania, to Tampa and then to Afghanistan, Iraq, and other countries where the bodies continue to pile up.
True, the document Votel presented to Congress is superficially comprehensive, with sections touting everything from “Building Partner Capacity” (“we must be forward-leaning and empower our partners to meet internal security challenges”) to creating a “Global Engagement Center” (“The best way to defeat an idea is to present a better, more appealing idea”). Strip away the fluff, however, and what’s left is nothing more than a call to keep doing what CENTCOM has been doing for years now.
To see what all this really means, practically speaking, just check out CENTCOM press releases for the week of March 5th through 10th. The titles alone suffice to describe a situation where every day is like the one that preceded it.
As the good nuns used to tell me back in parochial school, actions speak louder than words. What the CENTCOM commander says matters less than what CENTCOM forces do. What they are doing is waging an endless war of attrition.
Ludendorff Would Have Approved
“Punch a hole and let the rest follow.”
During the First World War, that aphorism, attributed to General Erich Ludendorff, captured the essence of the German army’s understanding of strategy, rooted in the conviction that violence perpetrated on a sufficient scale over a sufficient period of time will ultimately render a politically purposeless war purposeful. The formula didn’t work for Germany in Ludendorff’s day and yielded even more disastrous results when Hitler revived it two decades later.
Of course, U.S. military commanders today don’t make crude references to punching holes. They employ language that suggests discrimination, deliberation, precision, and control as the qualities that define the American way of war. They steer clear of using terms like attrition. Yet differences in vocabulary notwithstanding, the U.S. military’s present-day MO bears a considerable resemblance to the approach that Ludendorff took fully a century ago. And for the last decade and a half, U.S. forces operating in the CENTCOM AOR have been no more successful than were German forces on the Western Front in achieving the purposes that ostensibly made war necessary.
To divert attention from this disturbing fact, General Votel offers Congress and by extension the American people a 64-page piece of propaganda. Whether he himself is deluded or dishonest is difficult to say, just as it remains difficult to say whether General William Westmoreland was deluded or dishonest when he assured Congress in November 1967 that victory in Vietnam was in sight. “With 1968,” Westmoreland promised, “a new phase is now starting. We have reached an important point when the end begins now to come into view.”
Westmoreland was dead wrong, as the enemy’s 1968 Tet Offensive soon demonstrated. That a comparable disaster, no doubt different in form, will expose Votel’s own light-at-the-end-of-the-tunnel assessment as equally fraudulent is a possibility, even if one to which American political and military leaders appear to be oblivious. This much is certain: in the CENTCOM AOR the end is not even remotely in view.
What are we to make of this charade of proconsuls parading through Washington to render false or misleading reports on the status of the American empire’s outer precincts?
Perhaps the time has come to look elsewhere for advice and counsel. Whether generals like Votel are deluded or dishonest is ultimately beside the point. More relevant is the fact that the views they express — and that inexplicably continue to carry weight in Washington — are essentially of no value. So many years later, no reason exists to believe that they know what they are doing.
To reground U.S. national security policy in something that approximates reality would require listening to new voices, offering views long deemed heretical.
Let me nonetheless offer you an example:
“Fifteen years after launching a worldwide effort to defeat and destroy terrorist organizations, the United States finds itself locked in a pathologically recursive loop; we fight to prevent attacks and defend our values, only to incite further violence against ourselves and allies while destabilizing already chaotic regions…”
That is not the judgment of some lefty from Cambridge or San Francisco, but of Major John Q. Bolton, a veteran of both the Iraq and Afghan Wars. Within that brief passage is more wisdom than in all of General Votel’s 64 pages of blather.
I submit that Bolton’s grasp of our predicament is infinitely superior to Votel’s. The contrast between the two is striking. The officer who wears no stars dares to say what is true; the officer wearing four stars obfuscates. If the four-stars abandon obfuscation for truth, then and only then will they deserve our respectful attention. In the meantime, it’s like looking to Artie Shaw for marriage counseling.
Prepare, Pursue, Prevail!
Apart from being a police officer, firefighter, or soldier engaged in one of this nation’s endless wars, writing a column for a major American newspaper has got to be one of the toughest and most unforgiving jobs there is. The pay may be decent (at least if your gig is with one of the major papers in New York or Washington), but the pressures to perform on cue are undoubtedly relentless.
Anyone who has ever tried cramming a coherent and ostensibly insightful argument into a mere 750 words knows what I’m talking about. Writing op-eds perhaps does not qualify as high art. Yet, like tying flies or knitting sweaters, it requires no small amount of skill. Performing the trick week in and week out without too obviously recycling the same ideas over and over again — or at least while disguising repetitions and concealing inconsistencies — requires notable gifts.
David Brooks of the New York Times is a gifted columnist. Among contemporary journalists, he is our Walter Lippmann, the closest thing we have to an establishment-approved public intellectual. As was the case with Lippmann, Brooks works hard to suppress the temptation to rant. He shuns raw partisanship. In his frequent radio and television appearances, he speaks in measured tones. Dry humor and ironic references abound. And like Lippmann, when circumstances change, he makes at least a show of adjusting his views accordingly.
For all that, Brooks remains an ideologue. In his columns, and even more so in his weekly appearances on NPR and PBS, he plays the role of the thoughtful, non-screaming conservative, his very presence affirming the ideological balance that, until November 8th of last year, was a prized hallmark of “respectable” journalism. Just as that balance always involved considerable posturing, so, too, with the ostensible conservatism of David Brooks: it’s an act.
Praying at the Altar of American Greatness
In terms of confessional fealty, his true allegiance is not to conservatism as such, but to the Church of America the Redeemer. This is a virtual congregation, albeit one possessing many of the attributes of a more traditional religion. The Church has its own Holy Scripture, authenticated on July 4, 1776, at a gathering of 56 prophets. And it has its own saints, prominent among them the Good Thomas Jefferson, chief author of the sacred text (not the Bad Thomas Jefferson who owned and impregnated slaves); Abraham Lincoln, who freed said slaves and thereby suffered martyrdom (on Good Friday no less); and, of course, the duly canonized figures most credited with saving the world itself from evil: Winston Churchill and Franklin Roosevelt, their status akin to that of saints Peter and Paul in Christianity. The Church of America the Redeemer even has its own Jerusalem, located on the banks of the Potomac, and its own hierarchy, its members situated nearby in High Temples of varying architectural distinction.
This ecumenical enterprise does not prize theological rigor. When it comes to shalts and shalt nots, it tends to be flexible, if not altogether squishy. It demands of the faithful just one thing: a fervent belief in America’s mission to remake the world in its own image. Although in times of crisis Brooks has occasionally gone a bit wobbly, he remains at heart a true believer.
In a March 1997 piece for The Weekly Standard, his then-employer, he summarized his credo. Entitled “A Return to National Greatness,” the essay opened with a glowing tribute to the Library of Congress and, in particular, to the building completed precisely a century earlier to house its many books and artifacts. According to Brooks, the structure itself embodied the aspirations defining America’s enduring purpose. He called particular attention to the dome above the main reading room decorated with a dozen “monumental figures” representing the advance of civilization and culminating in a figure representing America itself. Contemplating the imagery, Brooks rhapsodized:
“The theory of history depicted in this mural gave America impressive historical roots, a spiritual connection to the centuries. And it assigned a specific historic role to America as the latest successor to Jerusalem, Athens, and Rome. In the procession of civilization, certain nations rise up to make extraordinary contributions… At the dawn of the 20th century, America was to take its turn at global supremacy. It was America’s task to take the grandeur of past civilizations, modernize it, and democratize it. This common destiny would unify diverse Americans and give them a great national purpose.”
This February, 20 years later, in a column with an identical title, but this time appearing in the pages of his present employer, the New York Times, Brooks revisited this theme. Again, he began with a paean to the Library of Congress and its spectacular dome with its series of “monumental figures” that placed America “at the vanguard of the great human march of progress.” For Brooks, those 12 allegorical figures convey a profound truth.
“America is the grateful inheritor of other people’s gifts. It has a spiritual connection to all people in all places, but also an exceptional role. America culminates history. It advances a way of life and a democratic model that will provide people everywhere with dignity. The things Americans do are not for themselves only, but for all mankind.”
In 1997, in the midst of the Clinton presidency, Brooks had written that “America’s mission was to advance civilization itself.” In 2017, as Donald Trump gained entry into the Oval Office, he embellished and expanded that mission, describing a nation “assigned by providence to spread democracy and prosperity; to welcome the stranger; to be brother and sister to the whole human race.”
Back in 1997, “a moment of world supremacy unlike any other,” Brooks had worried that his countrymen might not seize the opportunity that was presenting itself. On the cusp of the twenty-first century, he worried that Americans had “discarded their pursuit of national greatness in just about every particular.” The times called for a leader like Theodore Roosevelt, who wielded that classic “big stick” and undertook monster projects like the Panama Canal. Yet Americans were stuck instead with Bill Clinton, a small-bore triangulator. “We no longer look at history as a succession of golden ages,” Brooks lamented. “And, save in the speeches of politicians who usually have no clue what they are talking about,” America was no longer fulfilling its “special role as the vanguard of civilization.”
By early 2017, with Donald Trump in the White House and Steve Bannon whispering in his ear, matters had become worse still. Americans had seemingly abandoned their calling outright. “The Trump and Bannon anschluss has exposed the hollowness of our patriotism,” wrote Brooks, inserting the now-obligatory reference to Nazi Germany. The November 2016 presidential election had “exposed how attenuated our vision of national greatness has become and how easy it was for Trump and Bannon to replace a youthful vision of American greatness with a reactionary, alien one.” That vision now threatens to leave America as “just another nation, hunkered down in a fearful world.”
What exactly happened between 1997 and 2017, you might ask? What occurred during that “moment of world supremacy” to reduce the United States from a nation summoned to redeem humankind to one hunkered down in fear?
Trust Brooks to have at hand a brow-furrowing explanation. The fault, he explains, lies with an “educational system that doesn’t teach civilizational history or real American history but instead a shapeless multiculturalism,” as well as with “an intellectual culture that can’t imagine providence.” Brooks blames “people on the left who are uncomfortable with patriotism and people on the right who are uncomfortable with the federal government that is necessary to lead our project.”
An America that no longer believes in itself — that’s the problem. In effect, Brooks revises Norma Desmond’s famous complaint about the movies, now repurposed to diagnose an ailing nation: it’s the politics that got small.
Nowhere does he consider the possibility that his formula for “national greatness” just might be so much hooey. Between 1997 and 2017, after all, egged on by people like David Brooks, Americans took a stab at “greatness,” with the execrable Donald Trump now numbering among the eventual results.
Say what you will about the shortcomings of the American educational system and the country’s intellectual culture, they had far less to do with creating Trump than did popular revulsion prompted by specific policies that Brooks, among others, enthusiastically promoted. Not that he is inclined to tally up the consequences. Only as a sort of postscript to his litany of contemporary American ailments does he refer even in passing to what he calls the “humiliations of Iraq.”
A great phrase, that. Yet much like, say, the “tragedy of Vietnam” or the “crisis of Watergate,” it conceals more than it reveals. Here, in short, is a succinct historical reference that cries out for further explanation. It bursts at the seams with implications demanding to be unpacked, weighed, and scrutinized. Brooks shrugs off Iraq as a minor embarrassment, the equivalent of having shown up at a dinner party wearing the wrong clothes.
Under the circumstances, it’s easy to forget that, back in 2003, he and other members of the Church of America the Redeemer devoutly supported the invasion of Iraq. They welcomed war. They urged it. They did so not because Saddam Hussein was uniquely evil — although he was evil enough — but because they saw in such a war the means for the United States to accomplish its salvific mission. Toppling Saddam and transforming Iraq would provide the mechanism for affirming and renewing America’s “national greatness.”
Anyone daring to disagree with that proposition they denounced as craven or cowardly. Writing at the time, Brooks disparaged those opposing the war as mere “marchers.” They were effete, pretentious, ineffective, and absurd. “These people are always in the streets with their banners and puppets. They march against the IMF and World Bank one day, and against whatever war happens to be going on the next… They just march against.”
Perhaps space constraints did not permit Brooks in his recent column to spell out the “humiliations” that resulted and that even today continue to accumulate. Here in any event is a brief inventory of what that euphemism conceals: thousands of Americans needlessly killed; tens of thousands grievously wounded in body or spirit; trillions of dollars wasted; millions of Iraqis dead, injured, or displaced; this nation’s moral standing compromised by its resort to torture, kidnapping, assassination, and other perversions; a region thrown into chaos and threatened by radical terrorist entities like the Islamic State that U.S. military actions helped foster. And now, if only as an oblique second-order bonus, we have Donald Trump’s elevation to the presidency to boot.
In refusing to reckon with the results of the war he once so ardently endorsed, Brooks is hardly alone. Members of the Church of America the Redeemer, Democrats and Republicans alike, are demonstrably incapable of rendering an honest accounting of what their missionary efforts have yielded.
Brooks belongs, or once did, to the Church’s neoconservative branch. But liberals such as Bill Clinton, along with his secretary of state Madeleine Albright, were congregants in good standing, as were Barack Obama and his secretary of state Hillary Clinton. So, too, are putative conservatives like Senators John McCain, Ted Cruz, and Marco Rubio, all of them subscribing to the belief in the singularity and indispensability of the United States as the chief engine of history, now and forever.
Back in April 2003, confident that the fall of Baghdad had ended the Iraq War, Brooks predicted that “no day will come when the enemies of this endeavor turn around and say, ‘We were wrong. Bush was right.’” Rather than admitting error, he continued, the war’s opponents “will just extend their forebodings into a more distant future.”
Yet it is the war’s proponents who, in the intervening years, have choked on admitting that they were wrong. Or when making such an admission, as did both John Kerry and Hillary Clinton while running for president, they write it off as an aberration, a momentary lapse in judgment of no particular significance, like having guessed wrong on a TV quiz show.
Rather than requiring acts of contrition, the Church of America the Redeemer has long promulgated a doctrine of self-forgiveness, freely available to all adherents all the time. “You think our country’s so innocent?” the nation’s 45th president recently barked at a TV host who had the temerity to ask how he could have kind words for the likes of Russian President Vladimir Putin. Observers professed shock that a sitting president would openly question American innocence.
In fact, Trump’s response and the kerfuffle that ensued both missed the point. No serious person believes that the United States is “innocent.” Worshipers in the Church of America the Redeemer do firmly believe, however, that America’s transgressions, unlike those of other countries, don’t count against it. Once committed, such sins are simply to be set aside and then expunged, a process that allows American politicians and pundits to condemn a “killer” like Putin with a perfectly clear conscience while demanding that Donald Trump do the same.
What the Russian president has done in Crimea, Ukraine, and Syria qualifies as criminal. What American presidents have done in Iraq, Afghanistan, and Libya qualifies as incidental and, above all, beside the point.
Rather than confronting the havoc and bloodshed to which the United States has contributed, those who worship in the Church of America the Redeemer keep their eyes fixed on the far horizon and the work still to be done in aligning the world with American expectations. At least they would, were it not for the arrival at center stage of a manifestly false prophet who, in promising to “make America great again,” inverts all that “national greatness” is meant to signify.
For Brooks and his fellow believers, the call to “greatness” emanates from faraway precincts — in the Middle East, East Asia, and Eastern Europe. For Trump, the key to “greatness” lies in keeping faraway places and the people who live there as faraway as possible. Brooks et al. see a world that needs saving and believe that it’s America’s calling to do just that. In Trump’s view, saving others is not a peculiarly American responsibility. Events beyond our borders matter only to the extent that they affect America’s well-being. Trump worships in the Church of America First, or at least pretends to do so in order to impress his followers.
That Donald Trump inhabits a universe of his own devising, constructed of carefully arranged alt-facts, is no doubt the case. Yet, in truth, much the same can be said of David Brooks and others sharing his view of a country providentially charged to serve as the “successor to Jerusalem, Athens, and Rome.” In fact, this conception of America’s purpose expresses not the intent of providence, which is inherently ambiguous, but their own arrogance and conceit. Out of that conceit comes much mischief. And in the wake of mischief come charlatans like Donald Trump.
Angst in the Church of America the Redeemer
The fall of the Berlin Wall in November 1989 abruptly ended one historical era and inaugurated another. So, too, did the outcome of last year’s U.S. presidential election. What are we to make of the interval between those two watershed moments? Answering that question is essential to understanding how Donald Trump became president and where his ascendency leaves us.
Hardly had this period commenced before observers fell into the habit of referring to it as the “post-Cold War” era. Now that it’s over, a more descriptive name might be in order. My suggestion: America’s Age of Great Expectations.
Forgive and Forget
The end of the Cold War caught the United States completely by surprise. During the 1980s, even with Mikhail Gorbachev running the Kremlin, few in Washington questioned the prevailing conviction that the Soviet-American rivalry was and would remain a defining feature of international politics more or less in perpetuity. Indeed, endorsing such an assumption was among the prerequisites for gaining entrée to official circles. Virtually no one in the American establishment gave serious thought to the here-today, gone-tomorrow possibility that the Soviet threat, the Soviet empire, and the Soviet Union itself might someday vanish. Washington had plans aplenty for what to do should a Third World War erupt, but none for what to do if the prospect of such a climactic conflict simply disappeared.
Still, without missing a beat, when the Berlin Wall fell and two years later the Soviet Union imploded, leading members of that establishment wasted no time in explaining the implications of developments they had totally failed to anticipate. With something close to unanimity, politicians and policy-oriented intellectuals interpreted the fall of the Wall and the ensuing collapse of communism as an all-American victory of cosmic proportions. “We” had won, “they” had lost — with that outcome vindicating everything the United States represented as the archetype of freedom.
From within the confines of that establishment, one rising young intellectual audaciously suggested that the “end of history” itself might be at hand, with the “sole superpower” left standing now perfectly positioned to determine the future of all humankind. In Washington, various powers-that-be considered this hypothesis and concluded that it sounded just about right. The future took on the appearance of a blank slate upon which Destiny itself was inviting Americans to inscribe their intentions.
American elites might, of course, have assigned a far different, less celebratory meaning to the passing of the Cold War. They might have seen the outcome as a moment that called for regret, repentance, and making amends.
After all, the competition between the United States and the Soviet Union, or more broadly between what was then called the Free World and the Communist bloc, had yielded a host of baleful effects. An arms race between two superpowers had created monstrous nuclear arsenals and, on multiple occasions, brought the planet precariously close to Armageddon. Two singularly inglorious wars had claimed the lives of many tens of thousands of American soldiers and literally millions of Asians. One, on the Korean peninsula, had ended in an unsatisfactory draw; the other, in Southeast Asia, in catastrophic defeat. Proxy fights in Asia, Africa, Latin America, and the Middle East killed so many more and laid waste to whole countries. Cold War obsessions led Washington to overthrow democratic governments, connive in assassination, make common cause with corrupt dictators, and turn a blind eye to genocidal violence. On the home front, hysteria compromised civil liberties and fostered a sprawling, intrusive, and unaccountable national security apparatus. Meanwhile, the military-industrial complex and its beneficiaries conspired to spend vast sums on weapons purchases that somehow never seemed adequate to the putative dangers at hand.
Rather than reflecting on such somber and sordid matters, however, the American political establishment together with ambitious members of the country’s intelligentsia found it so much more expedient simply to move on. As they saw it, the annus mirabilis of 1989 wiped away the sins of former years. Eager to make a fresh start, Washington granted itself a plenary indulgence. After all, why contemplate past unpleasantness when a future so stunningly rich in promise now beckoned?
Three Big Ideas and a Dubious Corollary
Soon enough, that promise found concrete expression. In remarkably short order, three themes emerged to define the new American age. Informing each of them was a sense of exuberant anticipation toward an era of almost unimaginable expectations. The twentieth century was ending on a high note. For the planet as a whole but especially for the United States, great things lay ahead.
Focused on the world economy, the first of those themes emphasized the transformative potential of turbocharged globalization led by U.S.-based financial institutions and transnational corporations. An “open world” would facilitate the movement of goods, capital, ideas, and people and thereby create wealth on an unprecedented scale. In the process, the rules governing American-style corporate capitalism would come to prevail everywhere on the planet. Everyone would benefit, but especially Americans who would continue to enjoy more than their fair share of material abundance.
Focused on statecraft, the second theme spelled out the implications of an international order dominated as never before — not even in the heydays of the Roman and British Empires — by a single nation. With the passing of the Cold War, the United States now stood apart as both supreme power and irreplaceable global leader, its status guaranteed by its unstoppable military might.
In the editorial offices of the Wall Street Journal, the Washington Post, the New Republic, and the Weekly Standard, such “truths” achieved a self-evident status. Although more muted in their public pronouncements than Washington’s reigning pundits, officials enjoying access to the Oval Office, the State Department’s 7th floor, and the E-ring of the Pentagon generally agreed. The assertive exercise of (benign!) global hegemony seemingly held the key to ensuring that Americans would enjoy safety and security, both at home and abroad, now and in perpetuity.
The third theme was all about rethinking the concept of personal freedom as commonly understood and pursued by most Americans. During the protracted emergency of the Cold War, reaching an accommodation between freedom and the putative imperatives of national security had not come easily. Cold War-style patriotism seemingly prioritized the interests of the state at the expense of the individual. Yet even as thrillingly expressed by John F. Kennedy — “Ask not what your country can do for you, ask what you can do for your country” — this was never an easy sell, especially if it meant wading through rice paddies and getting shot at.
Once the Cold War ended, however, the tension between individual freedom and national security momentarily dissipated. Reigning conceptions of what freedom could or should entail underwent a radical transformation. Emphasizing the removal of restraints and inhibitions, the shift made itself felt everywhere, from patterns of consumption and modes of cultural expression to sexuality and the definition of the family. Norms that had prevailed for decades if not generations — marriage as a union between a man and a woman, gender identity as fixed at birth — became passé. The concept of a transcendent common good, which during the Cold War had taken a backseat to national security, now took a backseat to maximizing individual choice and autonomy.
Finally, as a complement to these themes, in the realm of governance, the end of the Cold War cemented the status of the president as quasi-deity. In the Age of Great Expectations, the myth of the president as a deliverer from (or, in the eyes of critics, the ultimate perpetrator of) evil flourished. In the solar system of American politics, the man in the White House increasingly became the sun around which everything seemed to orbit. By comparison, nothing else much mattered.
From one administration to the next, of course, presidential efforts to deliver Americans to the Promised Land regularly came up short. Even so, the political establishment and the establishment media collaborated in sustaining the pretense that out of the next endlessly hyped “race for the White House,” another Roosevelt or Kennedy or Reagan would magically emerge to save the nation. From one election cycle to the next, these campaigns became longer and more expensive, drearier and yet ever more circus-like. No matter. During the Age of Great Expectations, the reflexive tendency to see the president as the ultimate guarantor of American abundance, security, and freedom remained sacrosanct.
Meanwhile, between promise and reality, a yawning gap began to appear. During the concluding decade of the twentieth century and the first decade-and-a-half of the twenty-first, Americans endured a seemingly endless series of crises. Individually, none of these merit comparison with, say, the Civil War or World War II. Yet never in U.S. history has a sequence of events occurring in such close proximity subjected American institutions and the American people to greater stress.
During the decade between 1998 and 2008, they came on with startling regularity: one president impeached and his successor chosen by the direct intervention of the Supreme Court; a massive terrorist attack on American soil that killed thousands, traumatized the nation, and left senior officials bereft of their senses; a mindless, needless, and unsuccessful war of choice launched on the basis of false claims and outright lies; a natural disaster (exacerbated by engineering folly) that all but destroyed a major American city, after which government agencies mounted a belated and half-hearted response; and finally, the worst economic downturn since the Great Depression, bringing ruin to millions of families.
For the sake of completeness, we should append to this roster of seismic occurrences one additional event: Barack Obama’s election as the nation’s first black president. He arrived at the zenith of American political life as a seemingly messianic figure called upon not only to undo the damage wrought by his predecessor, George W. Bush, but somehow to absolve the nation of its original sins of slavery and racism.
Yet during the Obama presidency race relations, in fact, deteriorated. Whether prompted by cynical political calculations or a crass desire to boost ratings, race baiters came out of the woodwork — one of them, of course, infamously birthered in Trump Tower in mid-Manhattan — and poured their poisons into the body politic. Even so, as the end of Obama’s term approached, the cult of the presidency itself remained remarkably intact.
Individually, the impact of these various crises ranged from disconcerting to debilitating to horrifying. Yet to treat them separately is to overlook their collective implications, which the election of Donald Trump only now enables us to appreciate. It was not one president’s dalliance with an intern or “hanging chads” or 9/11 or “Mission Accomplished” or the inundation of the Lower Ninth Ward or the collapse of Lehman Brothers or the absurd birther movement that undermined the Age of Great Expectations. It was the way all these events together exposed those expectations as radically suspect.
In effect, the various crises that punctuated the post-Cold War era called into question key themes to which a fevered American triumphalism had given rise. Globalization, militarized hegemony, and a more expansive definition of freedom, guided by enlightened presidents in tune with the times, should have provided Americans with all the blessings that were rightly theirs as a consequence of having prevailed in the Cold War. Instead, between 1989 and 2016, things kept happening that weren’t supposed to happen. A future marketed as all but foreordained proved elusive, if not illusory. As actually experienced, the Age of Great Expectations became an Age of Unwelcome Surprises.
A Candidate for Decline
True, globalization created wealth on a vast scale, just not for ordinary Americans. The already well-to-do did splendidly, in some cases unbelievably so. But middle-class incomes stagnated and good jobs became increasingly hard to find or keep. By the election of 2016, the United States looked increasingly like a society divided between haves and have-nots, the affluent and the left-behind, the 1% and everyone else. Prospective voters were noticing.
Meanwhile, policies inspired by Washington’s soaring hegemonic ambitions produced remarkably few happy outcomes. With U.S. forces continuously engaged in combat operations, peace all but vanished as a policy objective (or even a word in Washington’s political lexicon). The acknowledged standing of the country’s military as the world’s best-trained, best-equipped, and best-led force coexisted uneasily with the fact that it proved unable to win. Instead, the national security establishment became conditioned to the idea of permanent war, high-ranking officials taking it for granted that ordinary citizens would simply accommodate themselves to this new reality. Yet it soon became apparent that, instead of giving ordinary Americans a sense of security, this new paradigm induced an acute sense of vulnerability, which left many susceptible to demagogic fear mongering.
As for the revised definition of freedom, with autonomy emerging as the national summum bonum, it left some satisfied but others adrift. During the Age of Great Expectations, distinctions between citizen and consumer blurred. Shopping became tantamount to a civic obligation, essential to keeping the economy afloat. Yet if all the hoopla surrounding Black Friday and Cyber Monday represented a celebration of American freedom, its satisfactions were transitory at best, rarely extending beyond the due date printed on a credit card statement. Meanwhile, as digital connections displaced personal ones, relationships, like jobs, became more contingent and temporary. Loneliness emerged as an abiding affliction. And for all the talk of empowering the marginalized — people of color, women, gays — elites reaped the lion’s share of the benefits while ordinary people were left to make do. The atmosphere was rife with hypocrisy and even a whiff of nihilism.
To these various contradictions, the establishment itself remained stubbornly oblivious, with the 2016 presidential candidacy of Hillary Clinton offering a case in point. As her long record in public life made abundantly clear, Clinton embodied the establishment in the Age of Great Expectations. She believed in globalization, in the indispensability of American leadership backed by military power, and in the post-Cold War cultural project. And she certainly believed in the presidency as the mechanism to translate aspirations into outcomes.
Such commonplace convictions of the era, along with her vanguard role in pressing for the empowerment of women, imparted to her run an air of inevitability. That she deserved to win appeared self-evident. It was, after all, her turn. Largely overlooked were signs that the abiding themes of the Age of Great Expectations no longer commanded automatic allegiance.
Gasping for Air
Senator Bernie Sanders offered one of those signs. That a past-his-prime, self-professed socialist from Vermont with a negligible record of legislative achievement and tenuous links to the Democratic Party might mount a serious challenge to Clinton seemed, on the face of it, absurd. Yet by zeroing in on unfairness and inequality as inevitable byproducts of globalization, Sanders struck a chord.
Knocked briefly off balance, Clinton responded by modifying certain of her longstanding positions. By backing away from free trade, the ne plus ultra of globalization, she managed, though not without difficulty, to defeat the Sanders insurgency. Even so, he, in effect, served as the canary in the establishment coal mine, signaling that the Age of Great Expectations might be running out of oxygen.
A parallel and far stranger insurgency was simultaneously wreaking havoc in the Republican Party. That a narcissistic political neophyte stood the slightest chance of capturing the GOP seemed even more improbable than Sanders taking a nomination that appeared Clinton’s by right.
Coarse, vulgar, unprincipled, uninformed, erratic, and with little regard for truth, Trump was sui generis among presidential candidates. Yet he possessed a singular gift: a knack for riling up those who nurse gripes and are keen to pin the blame on someone or something. In post-Cold War America, among the millions that Hillary Clinton was famously dismissing as “deplorables,” gripes had been ripening like cheese in a hothouse.
Through whatever combination of intuition and malice aforethought, Trump demonstrated a genius for motivating those deplorables. He pushed their buttons. They responded by turning out in droves to attend his rallies. There they listened to a message that they found compelling.
In Trump’s pledge to “make America great again” his followers heard a promise to restore everything they believed had been taken from them in the Age of Great Expectations. Globalization was neither beneficial nor inevitable, the candidate insisted, vowing that, once elected, he would curb its effects along with the excesses of corporate capitalism, thereby bringing back from overseas millions of lost jobs. He would, he swore, fund a massive infrastructure program, cut taxes, keep a lid on the national debt, and generally champion the cause of working stiffs. The many complications and contradictions inherent in these various prescriptions would, he assured his fans, give way to his business savvy.
In considering America’s role in the post-Cold War world, Trump exhibited a similar impatience with the status quo. Rather than allowing armed conflicts to drag on forever, he promised to win them (putting to work his mastery of military affairs) or, if not, to quit and get out, pausing just long enough to claim as a sort of consolation prize whatever spoils might be lying loose on the battlefield. At the very least, he would prevent so-called allies from treating the United States like some patsy. Henceforth, nations benefitting from American protection were going to foot their share of the bill. What all of this added up to may not have been clear, but it did suggest a sharp departure from the usual post-1989 formula for exercising global leadership.
No less important than Trump’s semi-coherent critique of globalization and American globalism, however, was his success in channeling the discontent of all those who nursed an inchoate sense that post-Cold War freedoms might be working for some, but not for them.
Not that Trump had anything to say about whether freedom confers obligations, or whether conspicuous consumption might not actually hold the key to human happiness, or any of the various controversies related to gender, sexuality, and family. He was indifferent to all such matters. He did, however, offer his followers a grimly persuasive explanation for how America had gone off course and how the blessings of liberty to which they were entitled had been stolen. He did that by fingering as scapegoats Muslims, Mexicans, and others “not-like-me.”
Trump’s political strategy reduced to this: as president, he would overturn the conventions that had governed right thinking since the end of the Cold War. To the amazement of an establishment grown smug and lazy, his approach worked. Even while disregarding all received wisdom when it came to organizing and conducting a presidential campaign in the Age of Great Expectations, Trump won. He did so by enchanting the disenchanted, all those who had lost faith in the promises that had sprung from the bosom of the elites that the end of the Cold War had taken by surprise.
Adrift Without a Compass
Within hours of Trump’s election, among progressives, expressing fear and trepidation at the prospect of what he might actually do on assuming office became de rigueur. Yet those who had actually voted for Trump were also left wondering what to expect. Both camps assigned him the status of a transformative historical figure. However, premonitions of incipient fascism and hopes that he will engineer a new American Golden Age are likely to prove similarly misplaced. To focus on the man himself rather than on the circumstances that produced him is to miss the significance of what has occurred.
Note, for example, that his mandate is almost entirely negative. It centers on rejection: of globalization, of counterproductive military meddling, and of the post-Cold War cultural project. Yet neither Trump nor any of his surrogates has offered a coherent alternative to the triad of themes providing the through line for the last quarter-century of American history. Apart from a lingering conviction that forceful — in The Donald’s case, blustering — presidential leadership can somehow turn things around, “Trumpism” is a dog’s breakfast.
In all likelihood, his presidency will prove less transformative than transitional. As a result, concerns about what he may do, however worrisome, matter less than the larger question of where we go from here. The principles that enjoyed favor following the Cold War have been found wanting. What should replace them?
Efforts to identify those principles should begin with an honest accounting of the age we are now leaving behind, the history that happened after “the end of history.” That accounting should, in turn, allow room for regret, repentance, and making amends — the very critical appraisal that ought to have occurred at the end of the Cold War but was preempted when American elites succumbed to their bout of victory disease.
Don’t expect Donald Trump to undertake any such appraisal. Nor will the establishment that candidate Trump so roundly denounced, but which President-elect Trump, at least in his senior national security appointments, now shows signs of accommodating. Those expecting Trump’s election to inject courage into members of the political class or imagination into inside-the-Beltway “thought leaders” are in for a disappointment. So the principles we need — an approach to political economy providing sustainable and equitable prosperity; a foreign policy that discards militarism in favor of prudence and pragmatism; and an enriched, inclusive concept of freedom — will have to come from somewhere else.
“Where there is no vision,” the Book of Proverbs tells us, “the people perish.” In the present day, there is no vision to which Americans collectively adhere. For proof, we need look no further than the election of Donald Trump.
The Age of Great Expectations has ended, leaving behind an ominous void. Yet Trump’s own inability to explain what should fill that great void provides neither excuse for inaction nor cause for despair. Instead, Trump himself makes manifest the need to reflect on the nation’s recent past and to think deeply about its future.
A decade before the Cold War ended, writing in democracy, a short-lived journal devoted to “political renewal and radical change,” the historian and social critic Christopher Lasch sketched out a set of principles that might lead us out of our current crisis. Lasch called for a politics based on “the nurture of the soil against the exploitation of resources, the family against the factory, the romantic vision of the individual against the technological vision, [and] localism over democratic centralism.” Some three and a half decades later, as a place to begin, his prescription remains apt.
The Age of Great Expectations and the Great Void
President-elect Donald Trump’s message for the nation’s senior military leadership is ambiguously unambiguous. Here he is on 60 Minutes just days after winning the election.
Trump: “We have some great generals. We have great generals.”
Lesley Stahl: “You said you knew more than the generals about ISIS.”
Trump: “Well, I’ll be honest with you, I probably do because look at the job they’ve done. OK, look at the job they’ve done. They haven’t done the job.”
In reality, Trump, the former reality show host, knows next to nothing about ISIS, one of many gaps in his education that his impending encounter with actual reality is likely to fill. Yet when it comes to America’s generals, our president-to-be is onto something. No doubt our three- and four-star officers qualify as “great” in the sense that they mean well, work hard, and are altogether fine men and women. That they have not “done the job,” however, is indisputable — at least if their job is to bring America’s wars to a timely and successful conclusion.
Trump’s unhappy verdict — that the senior U.S. military leadership doesn’t know how to win — applies in spades to the two principal conflicts of the post-9/11 era: the Afghanistan War, now in its 16th year, and the Iraq War, launched in 2003 and (after a brief hiatus) once more grinding on. Yet the verdict applies equally to lesser theaters of conflict, largely overlooked by the American public, that in recent years have engaged the attention of U.S. forces, a list that would include conflicts in Libya, Somalia, Syria, and Yemen.
Granted, our generals have demonstrated an impressive aptitude for moving pieces around on a dauntingly complex military chessboard. Brigades, battle groups, and squadrons shuttle in and out of various war zones, responding to the needs of the moment. The sheer immensity of the enterprise across the Greater Middle East and northern Africa — the sorties flown, munitions expended, the seamless deployment and redeployment of thousands of troops over thousands of miles, the vast stockpiles of material positioned, expended, and continuously resupplied — represents a staggering achievement. Measured by these or similar quantifiable outputs, America’s military has excelled. No other military establishment in history could have come close to duplicating the logistical feats being performed year in, year out by the armed forces of the United States.
Nor should we overlook the resulting body count. Since the autumn of 2001, something like 370,000 combatants and noncombatants have been killed in the various theaters of operations where U.S. forces have been active. Although modest by twentieth century standards, this post-9/11 harvest of death is hardly trivial.
Yet in evaluating military operations, it’s a mistake to confuse how much with how well. Only rarely do the outcomes of armed conflicts turn on comparative statistics. Ultimately, the one measure of success that really matters involves achieving war’s political purposes. By that standard, victory requires not simply the defeat of the enemy, but accomplishing the nation’s stated war aims, and not just in part or temporarily but definitively. Anything less constitutes failure: utter waste for taxpayers and, for those called upon to fight, cause for mourning.
By that standard, having been “at war” for virtually the entire twenty-first century, the United States military is still looking for its first win. And however strong the disinclination to concede that Donald Trump could be right about anything, his verdict on American generalship qualifies as apt.
A Never-Ending Parade of Commanders for Wars That Never End
That verdict brings to mind three questions. First, with Trump a rare exception, why have the recurring shortcomings of America’s military leadership largely escaped notice? Second, to what degree does faulty generalship suffice to explain why actual victory has proven so elusive? Third, to the extent that deficiencies at the top of the military hierarchy bear directly on the outcome of our wars, how might the generals improve their game?
As to the first question, the explanation is quite simple: During protracted wars, traditional standards for measuring generalship lose their salience. Without pertinent standards, there can be no accountability. Absent accountability, failings and weaknesses escape notice. Eventually, what you’ve become accustomed to seems tolerable. Twenty-first century Americans inured to wars that never end have long since forgotten that bringing such conflicts to a prompt and successful conclusion once defined the very essence of what generals were expected to do.
Senior military officers were presumed to possess unique expertise in designing campaigns and directing engagements. Not found among mere civilians or even among soldiers of lesser rank, this expertise provided the rationale for conferring status and authority on generals.
In earlier eras, the very structure of wars provided a relatively straightforward mechanism for testing such claims to expertise. Events on the battlefield rendered harsh judgments, creating or destroying reputations with brutal efficiency.
Back then, standards employed in evaluating generalship were clear-cut and uncompromising. Those who won battles earned fame, glory, and the gratitude of their countrymen. Those who lost battles got fired or were put out to pasture.
During the Civil War, for example, Abraham Lincoln did not need an advanced degree in strategic studies to conclude that Union generals like John Pope, Ambrose Burnside, and Joseph Hooker didn’t have what it took to defeat the Army of Northern Virginia. Humiliating defeats sustained by the Army of the Potomac at the Second Bull Run, Fredericksburg, and Chancellorsville made that obvious enough. Similarly, the victories Ulysses S. Grant and William T. Sherman gained at Shiloh, at Vicksburg, and in the Chattanooga campaign strongly suggested that here was the team to which the president could entrust the task of bringing the Confederacy to its knees.
Today, public drunkenness, petty corruption, or sexual shenanigans with a subordinate might land generals in hot water. But as long as they avoid egregious misbehavior, senior officers charged with prosecuting America’s wars are largely spared judgments of any sort. Trying hard is enough to get a passing grade.
With the country’s political leaders and public conditioned to conflicts seemingly destined to drag on for years, if not decades, no one expects the current general-in-chief in Iraq or Afghanistan to bring things to a successful conclusion. His job is merely to manage the situation until he passes it along to a successor, while duly adding to his collection of personal decorations and perhaps advancing his career.
Today, for example, Army General John Nicholson commands U.S. and allied forces in Afghanistan. He’s only the latest in a long line of senior officers to preside over that war, beginning with General Tommy Franks in 2001 and continuing with Generals Mikolashek, Barno, Eikenberry, McNeill, McKiernan, McChrystal, Petraeus, Allen, Dunford, and Campbell. The title carried by these officers changed over time. So, too, did the specifics of their “mission” as Operation Enduring Freedom evolved into Operation Freedom’s Sentinel. Yet even as expectations slipped lower and lower, none of the commanders rotating through Kabul delivered. Not a single one has, in our president-elect’s concise formulation, “done the job.” Indeed, it’s increasingly difficult to know what that job is, apart from preventing the Taliban from quite literally toppling the government.
In Iraq, meanwhile, Army Lieutenant General Stephen Townsend currently serves as the — count ‘em — ninth American to command U.S. and coalition forces in that country since the George W. Bush administration ordered the invasion of 2003. The first in that line, (once again) General Tommy Franks, overthrew the Saddam Hussein regime and thereby broke Iraq. The next five, Generals Sanchez, Casey, Petraeus, Odierno, and Austin, labored for eight years to put it back together again.
At the end of 2011, President Obama declared that they had done just that and terminated the U.S. military occupation. The Islamic State soon exposed Obama’s claim as specious when its militants put a U.S.-trained Iraqi army to flight and annexed large swathes of that country’s territory. Following in the footsteps of his immediate predecessors Generals James Terry and Sean MacFarland, General Townsend now shoulders the task of trying to restore Iraq’s status as a more or less genuinely sovereign state. He directs what the Pentagon calls Operation Inherent Resolve, dating from June 2014, the follow-on to Operation New Dawn (September 2010-December 2011), which was itself the successor to Operation Iraqi Freedom (March 2003-August 2010).
When and how Inherent Resolve will conclude is difficult to forecast. This much we can, however, say with some confidence: with the end nowhere in sight, General Townsend won’t be its last commander. Other generals are waiting in the wings with their own careers to polish. As in Kabul, the parade of U.S. military commanders through Baghdad will continue.
For some readers, this listing of mostly forgotten names and dates may have a soporific effect. Yet it should also drive home Trump’s point. The United States may today have the world’s most powerful and capable military — so at least we are constantly told. Yet the record shows that it does not have a corps of senior officers who know how to translate capability into successful outcomes.
Draining Which Swamp?
That brings us to the second question: Even if commander-in-chief Trump were somehow able to identify modern day equivalents of Grant and Sherman to implement his war plans, secret or otherwise, would they deliver victory?
On that score, we would do well to entertain doubts. Although senior officers charged with running recent American wars have not exactly covered themselves in glory, it doesn’t follow that their shortcomings offer the sole or even a principal explanation for why those wars have yielded such disappointing results. The truth is that some wars aren’t winnable and shouldn’t be fought.
So, yes, Trump’s critique of American generalship possesses merit, but whether he knows it or not, the question truly demanding his attention as the incoming commander-in-chief isn’t: Who should I hire (or fire) to fight my wars? Instead, far more urgent is: Does further war promise to solve any of my problems?
One mark of a successful business executive is knowing when to cut your losses. It’s also the mark of a successful statesman. Trump claims to be the former. Whether his putative business savvy will translate into the world of statecraft remains to be seen. Early signs are not promising.
As a candidate, Trump vowed to “defeat radical Islamic terrorism,” destroy ISIS, “decimate al-Qaeda,” and “starve funding for Iran-backed Hamas and Hezbollah.” Those promises imply a significant escalation of what Americans used to call the Global War on Terrorism.
Toward that end, the incoming administration may well revive some aspects of the George W. Bush playbook, including repopulating the military prison at Guantanamo Bay, Cuba, and “if it’s so important to the American people,” reinstituting torture. The Trump administration will at least consider re-imposing sanctions on countries like Iran. It may aggressively exploit the offensive potential of cyber-weapons, betting that America’s cyber-defenses will hold.
Yet President Trump is also likely to double down on the use of conventional military force. In that regard, his promise to “quickly and decisively bomb the hell out of ISIS” offers a hint of what is to come. His appointment of the uber-hawkish Lieutenant General Michael Flynn as his national security adviser and his rumored selection of retired Marine Corps General James (“Mad Dog”) Mattis as defense secretary suggest that he means what he says. In sum, a Trump administration seems unlikely to reexamine the conviction that the problems roiling the Greater Middle East will someday, somehow yield to a U.S.-imposed military solution. Indeed, in the face of massive evidence to the contrary, that conviction will deepen, with genuinely ironic implications for the Trump presidency.
In the immediate wake of 9/11, George W. Bush concocted a fantasy of American soldiers liberating oppressed Afghans and Iraqis and thereby “draining the swamp” that served to incubate anti-Western terrorism. The results achieved proved beyond disappointing, while the costs exacted in terms of lives and dollars squandered were painful indeed. Incrementally, with the passage of time, many Americans concluded that perhaps the swamp most in need of attention was not on the far side of the planet but much closer at hand — right in the imperial city nestled alongside the Potomac River.
To a very considerable extent, Trump defeated Hillary Clinton, preferred candidate of the establishment, because he advertised himself as just the guy disgruntled Americans could count on to drain that swamp.
Yet here’s what too few of those Americans appreciate, even today: war created that swamp in the first place. War empowers Washington. It centralizes. It provides a rationale for federal authorities to accumulate and exercise new powers. It makes government bigger and more intrusive. It lubricates the machinery of waste, fraud, and abuse that causes tens of billions of taxpayer dollars to vanish every year. When it comes to sustaining the swamp, nothing works better than war.
Were Trump really intent on draining that swamp — if he genuinely seeks to “Make America Great Again” — then he would extricate the United States from war. His liquidation of Trump University, which was to higher education what Freedom’s Sentinel and Inherent Resolve are to modern warfare, provides a potentially instructive precedent for how to proceed.
But don’t hold your breath on that one. All signs indicate that, in one fashion or another, our combative next president will perpetuate the wars he’s inheriting. Trump may fancy that, as a veteran of Celebrity Apprentice (but not of military service), he possesses a special knack for spotting the next Grant or Sherman. But acting on that impulse will merely replenish the swamp in the Greater Middle East along with the one in Washington. And soon enough, those who elected him with expectations of seeing the much-despised establishment dismantled will realize that they’ve been had.
Which brings us, finally, to that third question: To the extent that deficiencies at the top of the military hierarchy do affect the outcome of wars, what can be done to fix the problem?
The most expeditious approach: purge all currently serving three- and four-star officers; then, make a precondition for promotion to those ranks confinement in a reeducation camp run by Iraq and Afghanistan war amputees, with a curriculum designed by Veterans for Peace. Graduation should require each student to submit an essay reflecting on these words of wisdom from U.S. Grant himself: “There never was a time when, in my opinion, some way could not be found to prevent the drawing of the sword.”
True, such an approach may seem a bit draconian. But this is no time for half-measures — as even Donald Trump may eventually recognize.
Copyright 2016 Andrew J. Bacevich
You may have missed it. Perhaps you dozed off. Or wandered into the kitchen to grab a snack. Or by that point in the proceedings were checking out Seinfeld reruns. During the latter part of the much hyped but excruciating-to-watch first presidential debate, NBC Nightly News anchor Lester Holt posed a seemingly straightforward but cunningly devised question. His purpose was to test whether the candidates understood the essentials of nuclear strategy.
A moderator given to plain speaking might have said this: “Explain why the United States keeps such a large arsenal of nuclear weapons and when you might consider using those weapons.”
What Holt actually said was: “On nuclear weapons, President Obama reportedly considered changing the nation’s longstanding policy on first use. Do you support the current policy?”
The framing of the question posited no small amount of knowledge on the part of the two candidates. Specifically, it assumed that Donald Trump and Hillary Clinton each possess some familiarity with the longstanding policy to which Holt referred and with the modifications that Obama had contemplated making to it.
If you will permit the equivalent of a commercial break as this piece begins, let me explain why I’m about to parse in detail each candidate’s actual answer to Holt’s question. Amid deep dives into, and expansive punditry regarding, issues like how “fat” a former Miss Universe may have been and how high an imagined future wall on our southern border might prove to be, national security issues likely to test the judgment of a commander-in-chief have received remarkably little attention. So indulge me. This largely ignored moment in last week’s presidential debate is worth examining.
With regard to the issue of “first use,” every president since Harry Truman has subscribed to the same posture: the United States retains the prerogative of employing nuclear weapons to defend itself and its allies against even nonnuclear threats. In other words, as a matter of policy, the United States rejects the concept of “no first use,” which would prohibit any employment of nuclear weapons except in retaliation for a nuclear attack. According to press reports, President Obama had toyed with but then rejected the idea of committing the United States to a “no first use” posture. Holt wanted to know where the two candidates aspiring to succeed Obama stood on the matter.
Cruelly, the moderator invited Trump to respond first. The look in the Republican nominee’s eyes made it instantly clear that Holt could have been speaking Farsi for all he understood. A lesser candidate might then have begun with the nuclear equivalent of “What is Aleppo?”
Yet Trump being Trump, he gamely — or naively — charged headlong into the ambush that Holt had carefully laid, using his allotted two minutes to offer his insights into how as president he would address the nuclear conundrum that previous presidents had done so much to create. The result owed less to early Cold War thinkers-of-the-unthinkable like Herman Kahn or Albert Wohlstetter, who created the field of nuclear strategy, than to Dr. Strangelove. Make that Dr. Strangelove on meth.
Trump turned first to Russia, expressing concern that it might be gaining an edge in doomsday weaponry. “They have a much newer capability than we do,” he said. “We have not been updating from the new standpoint.” The American bomber fleet in particular, he added, needs modernization. Presumably referring to the recent employment of Vietnam-era bombers in the wars in Afghanistan, Iraq, and Syria, he continued somewhat opaquely, “I looked the other night. I was seeing B-52s, they’re old enough that your father, your grandfather, could be flying them. We are not — we are not keeping up with other countries.”
Trump then professed an appreciation for the awfulness of nuclear weaponry. “I would like everybody to end it, just get rid of it. But I would certainly not do first strike. I think that once the nuclear alternative happens, it’s over.”
Give Trump this much: even in a field that tends to favor abstraction and obfuscating euphemisms like “fallout” or “dirty bomb,” classifying Armageddon as the “nuclear alternative” represents something of a contribution.
Still, it’s worth noting that, in the arcane theology of nuclear strategy, “first strike” and “first use” are anything but synonymous. “First strike” implies a one-sided, preventive war of annihilation. The logic of a first strike, such as it is, is based on the calculation that a surprise nuclear attack could inflict the “nuclear alternative” on your adversary, while sparing your own side from suffering a comparable fate. A successful first strike would be a one-punch knockout, delivered while your opponent still sits in his corner of the ring.
Yet whatever reassurance was to be found in Trump’s vow never to order a first strike — not the question Lester Holt was asking — was immediately squandered. The Republican nominee promptly revoked his “no first strike” pledge by insisting, in a cliché much favored in Washington, that “I can’t take anything off the table.”
Piling non sequitur upon non sequitur, he next turned to the threat posed by a nuclear-armed North Korea, where “we’re doing nothing.” Yet, worrisome as this threat might be, keeping Pyongyang in check, he added, ought to be Beijing’s job. “China should solve that problem for us,” he insisted. “China should go into North Korea. China is totally powerful as it relates to North Korea.”
If China wouldn’t help with North Korea, however, what could be more obvious than that Iran, many thousands of miles away, should do so — and might have, if only President Obama had incorporated the necessary proviso into the Iran nuclear deal. “Iran is one of their biggest trading partners. Iran has power over North Korea.” When the Obama administration “made that horrible deal with Iran, they should have included the fact that they do something with respect to North Korea.” But why stop with North Korea? Iran “should have done something with respect to Yemen and all these other places,” he continued, wandering into the nonnuclear world. U.S. negotiators suitably skilled in the Trumpian art of the deal, he implied, could easily have maneuvered Iran into solving such problems on Washington’s behalf.
Veering further off course, Trump then took a passing swipe at Secretary of State John Kerry: “Why didn’t you add other things into the deal?” Why, in “one of the great giveaways of all time,” did the Obama administration fork over $400 million in cash? At which point, he promptly threw in another figure without the slightest explanation — “It was actually $1.7 billion in cash” — in “one of the worst deals ever made by any country in history.”
Trump then wrapped up his meandering tour d’horizon by decrying the one action of the Obama administration that arguably has reduced the prospect of nuclear war, at least in the near future. “The deal with Iran will lead to nuclear problems,” he stated with conviction. “All they have to do is sit back 10 years, and they don’t have to do much. And they’re going to end up getting nuclear.” For proof, he concluded, talk to the Israelis. “I met with Bibi Netanyahu the other day,” he added for no reason in particular. “Believe me, he’s not a happy camper.”
On this indecipherable note, his allotted time exhausted, Trump’s recitation ended. In its way, it had been a Joycean performance.
Bridge Over Troubled Waters?
It was now Clinton’s turn to show her stuff. If Trump had responded to Holt like a voluble golf caddy being asked to discuss the finer points of ice hockey, Hillary Clinton chose a different course: she changed the subject. She would moderate her own debate. Perhaps Trump thought Holt was in charge of the proceedings; Clinton knew better.
What followed was vintage Clinton: vapid sentiments, smoothly delivered in the knowing tone of a seasoned Washington operative. During her two minutes, she never came within a country mile of discussing the question Holt had asked or the thoughts she evidently actually has about nuclear issues.
“[L]et me start by saying, words matter,” she began. “Words matter when you run for president. And they really matter when you are president. And I want to reassure our allies in Japan and South Korea and elsewhere that we have mutual defense treaties and we will honor them.”
It was as if Clinton were already speaking from the Oval Office. Trump had addressed his remarks to Lester Holt. Clinton directed hers to the nation at large, to people the world over, indeed to history itself. Warming to her task, she was soon rolling out the sort of profundities that play well at the Brookings Institution, the Carnegie Endowment, or the Council on Foreign Relations, causing audiences to nod — or nod off.
“It is essential that America’s word be good,” Clinton continued. “And so I know that this campaign has caused some questioning and worries on the part of many leaders across the globe. I’ve talked with a number of them. But I want to — on behalf of myself, and I think on behalf of a majority of the American people, say that, you know, our word is good.”
Then, after inserting a tepid, better-than-nothing endorsement of the Iran nuclear deal, she hammered Trump for not offering an alternative. “Would he have started a war? Would he have bombed Iran?” If you’re going to criticize, she pointed out, you need to offer something better. Trump never does, she charged. “It’s like his plan to defeat ISIS. He says it’s a secret plan, but the only secret is that he has no plan.”
With that, she reverted to platitudes. “So we need to be more precise in how we talk about these issues. People around the world follow our presidential campaigns so closely, trying to get hints about what we will do. Can they rely on us? Are we going to lead the world with strength and in accordance with our values? That’s what I intend to do. I intend to be a leader of our country that people can count on, both here at home and around the world, to make decisions that will further peace and prosperity, but also stand up to bullies, whether they’re abroad or at home.”
Like Trump, she offered no specifics. Which bullies? Where? How? In what order? Would she start with Russia’s Putin? North Korea’s Kim Jong-Un? Perhaps Rodrigo Duterte of the Philippines? How about Turkey’s Recep Tayyip Erdogan? Or Bibi?
In contrast to Trump, however, Clinton did speak in complete sentences, which followed one another in an orderly fashion. She thereby came across as at least nominally qualified to govern the country, much like, say, Warren G. Harding nearly a century ago. And what worked for Harding in 1920 may well work for Clinton in 2016.
Of Harding’s speechifying, H.L. Mencken wrote at the time, “It reminds me of a string of wet sponges.” Mencken characterized Harding’s rhetoric as “so bad that a sort of grandeur creeps into it. It drags itself out of the dark abysm of pish, and crawls insanely up the topmost pinnacle of posh. It is rumble and bumble. It is flap and doodle. It is balder and dash.” So, too, with Hillary Clinton. She is our Warren G. Harding. In her oratory, flapdoodle and balderdash live on.
The National Security Void
If I’ve taxed your patience by recounting this non-debate and non-discussion of nuclear first use, it’s to make a larger point. The absence of relevant information elicited by Lester Holt’s excellent question speaks directly to what has become a central flaw in this entire presidential campaign: the dearth of attention given to matters basic to U.S. national security policy.
In the nuclear arena, the issue of first use is only one of several on which anyone aspiring to become the next commander-in-chief should be able to offer an informed judgment. Others include questions such as these:
- What is the present-day justification for maintaining the U.S. nuclear “triad,” a strike force consisting of manned bombers, land-based ballistic missiles, and submarine-launched ballistic missiles?
- Why is the Pentagon embarking upon a decades-long, trillion-dollar program to modernize that triad, fielding a new generation of bombers, missiles, and submarines along with an arsenal of new warheads? Is that program necessary?
- How do advances in non-nuclear weaponry — for example, in the realm of cyberwarfare — affect theories of nuclear deterrence devised by the likes of Kahn and Wohlstetter during the 1950s and 1960s? Does the logic of those theories still pertain?
Beyond the realm of nuclear strategy, there are any number of other security-related questions about which the American people deserve to hear directly from both Trump and Clinton, testing their knowledge of the subject matter and the quality of their judgments. Among such matters, one in particular screams out for attention. Consider it the question that Washington has declared off-limits: What lessons should be drawn from America’s costly and disappointing post-9/11 wars and how should those lessons apply to future policy?
With Election Day now merely a month away, there is no more reason to believe that such questions will receive serious consideration than to expect Trump to come clean on his personal finances or Clinton to release the transcripts of her handsomely compensated Goldman Sachs speeches.
When outcomes don’t accord with his wishes, Trump reflexively blames a “rigged” system. But a system that makes someone like Trump a finalist for the presidency isn’t rigged. It is manifestly absurd, a fact that has left most of the national media grasping wildly for explanations (albeit none that tag them with having facilitated the transformation of politics into theater).
I’ll take a backseat to no one in finding Trump unfit to serve as president. Yet beyond the outsized presence of one particular personality, the real travesty of our predicament lies elsewhere — in the utter shallowness of our political discourse, no more vividly on display than in the realm of national security.
What do our presidential candidates talk about when they don’t want to talk about nuclear war? The one, in a vain effort to conceal his own ignorance, offers rambling nonsense. The other, accustomed to making her own rules, simply changes the subject.
The American people thereby remain in darkness. On that score, Trump, Clinton, and the parties they represent are not adversaries. They are collaborators.
What We Talk About When We Don’t Want to Talk About Nuclear War
My earliest recollection of national politics dates back exactly 60 years to the moment, in the summer of 1956, when I watched the political conventions in the company of that wondrous new addition to our family, television. My parents were supporting President Dwight D. Eisenhower for a second term and that was good enough for me. Even as a youngster, I sensed that Ike, the former supreme commander of allied forces in Europe in World War II, was someone of real stature. In a troubled time, he exuded authority and self-confidence. By comparison, Democratic candidate Adlai Stevenson came across as vaguely suspect. Next to the five-star incumbent, he seemed soft, even foppish, and therefore not up to the job. So at least it appeared to a nine-year-old living in Chicagoland.
Of the seamy underside of politics I knew nothing, of course. On the surface, all seemed reassuring. As if by divine mandate, two parties vied for power. The views they represented defined the allowable range of opinion. The outcome of any election expressed the collective will of the people and was to be accepted as such. That I was growing up in the best democracy the world had ever known — its very existence a daily rebuke to the enemies of freedom — was beyond question.
Naïve? Embarrassingly so. Yet how I wish that Election Day in November 2016 might present Americans with something even loosely approximating the alternatives available to them in November 1956. Oh, to choose once more between an Ike and an Adlai.
Don’t for a second think that this is about nostalgia. Today, Stevenson doesn’t qualify for anyone’s list of Great Americans. If remembered at all, it’s for his sterling performance as President John F. Kennedy’s U.N. ambassador during the Cuban Missile Crisis. Interrogating his Soviet counterpart with cameras rolling, Stevenson barked that he was prepared to wait “until hell freezes over” to get his questions answered about Soviet military activities in Cuba. When the chips were down, Adlai proved anything but soft. Yet in aspiring to the highest office in the land, he had come up well short. In 1952, he came nowhere close to winning and in 1956 he proved no more successful. Stevenson was to the Democratic Party what Thomas Dewey had been to the Republicans: a luckless two-time loser.
As for Eisenhower, although there is much in his presidency to admire, his errors of omission and commission were legion. During his two terms, from Guatemala to Iran, the CIA overthrew governments, plotted assassinations, and embraced unsavory right-wing dictators — in effect, planting a series of IEDs destined eventually to blow up in the face of Ike’s various successors. Meanwhile, binging on nuclear weapons, the Pentagon accumulated an arsenal far beyond what even Eisenhower as commander-in-chief considered prudent or necessary.
In addition, during his tenure in office, the military-industrial complex became a rapacious juggernaut, an entity unto itself as Ike himself belatedly acknowledged. By no means least of all, Eisenhower fecklessly committed the United States to an ill-fated project of nation-building in a country that just about no American had heard of at the time: South Vietnam. Ike did give the nation eight years of relative peace and prosperity, but at a high price — most of the bills coming due long after he left office.
The Pathology of American Politics
And yet, and yet…
To contrast the virtues and shortcomings of Stevenson and Eisenhower with those of Hillary Rodham Clinton and Donald Trump is both instructive and profoundly depressing. Comparing the adversaries of 1956 with their 2016 counterparts reveals with startling clarity what the decades-long decay of American politics has wrought.
In 1956, each of the major political parties nominated a grown-up for the highest office in the land. In 2016, only one has.
In 1956, both parties nominated likeable individuals who conveyed a basic sense of trustworthiness. In 2016, neither party has done so.
In 1956, Americans could count on the election to render a definitive verdict, the vote count affirming the legitimacy of the system itself and allowing the business of governance to resume. In 2016, that is unlikely to be the case. Whether Trump or Clinton ultimately prevails, large numbers of Americans will view the result as further proof of “rigged” and irredeemably corrupt political arrangements. Rather than inducing some semblance of reconciliation, the outcome is likely to deepen divisions.
How in the name of all that is holy did we get into such a mess?
How did the party of Eisenhower, an architect of victory in World War II, choose as its nominee a narcissistic TV celebrity who, with each successive Tweet and verbal outburst, offers further evidence that he is totally unequipped for high office? Yes, the establishment media are ganging up on Trump, blatantly displaying the sort of bias normally kept at least nominally under wraps. Yet never have such expressions of journalistic hostility toward a particular candidate been more justified. Trump is a bozo of such monumental proportions as to tax the abilities of our most talented satirists. Were he alive today, Mark Twain at his most scathing would be hard-pressed to do justice to The Donald’s blowhard pomposity.
Similarly, how did the party of Adlai Stevenson, but also of Stevenson’s hero Franklin Roosevelt, select as its candidate someone so widely disliked and mistrusted even by many of her fellow Democrats? True, antipathy directed toward Hillary Clinton draws some of its energy from incorrigible sexists along with the “vast right wing conspiracy” whose members thoroughly loathe both Clintons. Yet the antipathy is not without basis in fact.
Even by Washington standards, Secretary Clinton exudes a striking sense of entitlement combined with a nearly complete absence of accountability. She shrugs off her misguided vote in support of invading Iraq back in 2003, while serving as senator from New York. She neither explains nor apologizes for pressing to depose Libya’s Muammar Gaddafi in 2011, her most notable “accomplishment” as secretary of state. “We came, we saw, he died,” she bragged back then, somewhat prematurely given that Libya has since fallen into anarchy and become a haven for ISIS.
She clings to the demonstrably false claim that her use of a private server for State Department business compromised no classified information. Now opposed to the Trans-Pacific Partnership (TPP) that she once described as the “gold standard in trade agreements,” Clinton rejects charges of political opportunism. That her change of heart occurred when attacking the TPP was helping Bernie Sanders win one Democratic primary after another is merely coincidental. Oh, and the big money accepted from banks and Wall Street as well as the tech sector for minimal work and the bigger money still from leading figures in the Israel lobby? Rest assured that her acceptance of such largesse won’t reduce by one iota her support for “working class families” or her commitment to a just peace settlement in the Middle East.
Let me be clear: none of these offer the slightest reason to vote for Donald Trump. Yet together they make the point that Hillary Clinton is a deeply flawed candidate, notably so in matters related to national security. Clinton is surely correct that allowing Trump to make decisions related to war and peace would be the height of folly. Yet her record in that regard does not exactly inspire confidence.
When it comes to foreign policy, Trump’s preference for off-the-cuff utterances finds him committing astonishing gaffes with metronomic regularity. Spontaneity serves chiefly to expose his staggering ignorance.
By comparison, the carefully scripted Clinton commits few missteps, as she recites with practiced ease the pabulum that passes for right thinking in establishment circles. But fluency does not necessarily connote soundness. Clinton, after all, adheres resolutely to the highly militarized “Washington playbook” that President Obama himself has disparaged — a faith-based belief in American global primacy to be pursued regardless of how the world may be changing and heedless of costs.
On the latter point, note that Clinton’s acceptance speech in Philadelphia included not a single mention of Afghanistan. By Election Day, the war there will have passed its 15th anniversary. One might think that a prospective commander-in-chief would have something to say about the longest conflict in American history, one that continues with no end in sight. Yet, with the Washington playbook offering few answers, Mrs. Clinton chooses to remain silent on the subject.
So while a Trump presidency holds the prospect of the United States driving off a cliff, a Clinton presidency promises to be the equivalent of banging one’s head against a brick wall without evident effect, wondering all the while why it hurts so much.
Pseudo-Politics for an Ersatz Era
But let’s not just blame the candidates. Trump and Clinton are also the product of circumstances that neither created. As candidates, they are merely exploiting a situation — one relying on intuition and vast stores of brashness, the other putting to work skills gained during a life spent studying how to acquire and employ power. The success both have achieved in securing the nominations of their parties is evidence of far more fundamental forces at work.
In the pairing of Trump and Clinton, we confront symptoms of something pathological. Unless Americans identify the sources of this disease, it will inevitably worsen, with dire consequences in the realm of national security. After all, back in Eisenhower’s day, the IEDs planted thanks to reckless presidential decisions tended to blow up only years — or even decades — later. For example, between the 1953 U.S.-engineered coup that restored the Shah to his throne and the 1979 revolution that converted Iran overnight from ally to adversary, more than a quarter of a century elapsed. In our own day, however, detonation occurs so much more quickly — witness the almost instantaneous and explosively unhappy consequences of Washington’s post-9/11 military interventions in the Greater Middle East.
So here’s a matter worth pondering: How is it that all the months of intensive fundraising, the debates and speeches, the caucuses and primaries, the avalanche of TV ads and annoying robocalls have produced two presidential candidates who tend to elicit from a surprisingly large number of rank-and-file citizens disdain, indifference, or at best hold-your-nose-and-pull-the-lever acquiescence?
Here, then, is a preliminary diagnosis of three of the factors contributing to the erosion of American politics, offered from the conviction that, for Americans to have better choices next time around, fundamental change must occur — and soon.
First, and most important, the evil effects of money: Need chapter and verse? For a tutorial, see this essential 2015 book by Professor Lawrence Lessig of Harvard: Republic, Lost: Version 2.0. Those with no time for books might spare 18 minutes for Lessig’s brilliant and deeply disturbing TED talk. Professor Lessig argues persuasively that unless the United States radically changes the way it finances political campaigns, we’re pretty much doomed to see our democracy wither and die.
Needless to say, moneyed interests and incumbents who benefit from existing arrangements take a different view and collaborate to maintain the status quo. As a result, political life has increasingly become a pursuit reserved for those like Trump who possess vast personal wealth or for those like Clinton who display an aptitude for persuading the well to do to open their purses, with all that implies by way of compromise, accommodation, and the subsequent repayment of favors.
Second, the perverse impact of identity politics on policy: Observers make much of the fact that, in capturing the presidential nomination of a major party, Hillary Clinton has shattered yet another glass ceiling. They are right to do so. Yet the novelty of her candidacy starts and ends with gender. When it comes to fresh thinking, Donald Trump has far more to offer than Clinton — even if his version of “fresh” tends to be synonymous with wacky, off-the-wall, ridiculous, or altogether hair-raising.
The essential point here is that, in the realm of national security, Hillary Clinton is utterly conventional. She subscribes to a worldview (and view of America’s role in the world) that originated during the Cold War, reached its zenith in the 1990s when the United States proclaimed itself the planet’s “sole superpower,” and persists today remarkably unaffected by actual events. On the campaign trail, Clinton attests to her bona fides by routinely reaffirming her belief in American exceptionalism, paying fervent tribute to “the world’s greatest military,” swearing that she’ll be “listening to our generals and admirals,” and vowing to get tough on America’s adversaries. These are, of course, the mandatory rituals of the contemporary Washington stump speech, amplified if anything by the perceived need for the first female candidate for president to emphasize her pugnacity.
A Clinton presidency, therefore, offers the prospect of more of the same — muscle-flexing and armed intervention to demonstrate American global leadership — albeit marketed with a garnish of diversity. Instead of different policies, Clinton will offer an administration that has a different look, touting this as evidence of positive change.
Yet while diversity may be a good thing, we should not confuse it with effectiveness. A national security team that “looks like America” (to use the phrase originally coined by Bill Clinton) does not necessarily govern more effectively than one that looks like President Eisenhower’s. What matters is getting the job done.
Since the 1990s women have found plentiful opportunities to fill positions in the upper echelons of the national security apparatus. Although we have not yet had a female commander-in-chief, three women have served as secretary of state and two as national security adviser. Several have filled Adlai Stevenson’s old post at the United Nations. Undersecretaries, deputy undersecretaries, and assistant secretaries of like gender abound, along with a passel of female admirals and generals.
So the question needs to be asked: Has the quality of national security policy improved compared to the bad old days when men exclusively called the shots? Using as criteria the promotion of stability and the avoidance of armed conflict (along with the successful prosecution of wars deemed unavoidable), the answer would, of course, have to be no. Although Madeleine Albright, Condoleezza Rice, Susan Rice, Samantha Power, and Clinton herself might entertain a different view, actually existing conditions in Afghanistan, Iraq, Libya, Syria, Somalia, Sudan, Yemen, and other countries across the Greater Middle East and significant parts of Africa tell a different story.
The abysmal record of American statecraft in recent years is not remotely the fault of women; yet neither have women made a perceptibly positive difference. It turns out that identity does not necessarily signify wisdom or assure insight. Allocating positions of influence in the State Department or the Pentagon based on gender, race, ethnicity, or sexual orientation — as Clinton will assuredly do — may well gratify previously disenfranchised groups. Little evidence exists to suggest that doing so will produce more enlightened approaches to statecraft, at least not so long as adherence to the Washington playbook figures as a precondition to employment. (Should Clinton win in November, don’t expect the redoubtable ladies of Code Pink to be tapped for jobs at the Pentagon and State Department.)
In the end, it’s not identity that matters but ideas and their implementation. To contemplate the ideas that might guide a President Trump along with those he will recruit to act on them — Ivanka as national security adviser? — is enough to elicit shudders from any sane person. Yet the prospect of Madam President surrounding herself with an impeccably diverse team of advisers who share her own outmoded views is hardly cause for celebration.
Putting a woman in charge of national security policy will not in itself amend the defects exhibited in recent years. For that, the obsolete principles with which Clinton along with the rest of Washington remains enamored will have to be jettisoned. In his own bizarre way (albeit without a clue as to a plausible alternative), Donald Trump seems to get that; Hillary Clinton does not.
Third, the substitution of “reality” for reality: Back in 1962, a young historian by the name of Daniel Boorstin published The Image: A Guide to Pseudo-Events in America. In an age in which Donald Trump and Hillary Clinton vie to determine the nation’s destiny, it should be mandatory reading. The Image remains, as when it first appeared, a fire bell ringing in the night.
According to Boorstin, more than five decades ago the American people were already living in a “thicket of unreality.” By relentlessly indulging in ever more “extravagant expectations,” they were forfeiting their capacity to distinguish between what was real and what was illusory. Indeed, Boorstin wrote, “We have become so accustomed to our illusions that we mistake them for reality.”
While ad agencies and PR firms had indeed vigorously promoted a world of illusions, Americans themselves had become willing accomplices in the process.
“The American citizen lives in a world where fantasy is more real than reality, where the image has more dignity than its original. We hardly dare to face our bewilderment, because our ambiguous experience is so pleasantly iridescent, and the solace of belief in contrived reality is so thoroughly real. We have become eager accessories to the great hoaxes of the age. These are the hoaxes we play on ourselves.”
This, of course, was decades before the nation succumbed to the iridescent allure of Facebook, Google, fantasy football, “Real Housewives of _________,” selfies, smartphone apps, Game of Thrones, Pokémon GO — and, yes, the vehicle that vaulted Donald Trump to stardom, The Apprentice.
“The making of the illusions which flood our experience has become the business of America,” wrote Boorstin. It’s also become the essence of American politics, long since transformed into theater, or rather into some sort of (un)reality show.
Presidential campaigns today are themselves, to use Boorstin’s famous term, “pseudo-events” that stretch from months into years. By now, most Americans know better than to take at face value anything candidates say or promise along the way. We’re in on the joke — or at least we think we are. Reinforcing that perception on a daily basis are media outlets that have abandoned mere reporting in favor of enhancing the spectacle of the moment. This is especially true of the cable news networks, where talking heads serve up a snide and cynical complement to the smarmy fakery that is the office-seeker’s stock in trade. And we lap it up. It matters little that we know it’s all staged and contrived, as long as — a preening Megyn Kelly getting under Trump’s skin, Trump himself denouncing “lyin’ Ted” Cruz, etc., etc. — it’s entertaining.
This emphasis on spectacle has drained national politics of whatever substance it still had back when Ike and Adlai commanded the scene. It hardly need be said that Donald Trump has demonstrated an extraordinary knack — a sort of post-modern genius — for turning this phenomenon to his advantage. Yet in her own way Clinton plays the same game. How else to explain a national convention organized around the idea of “reintroducing to the American people” someone who served eight years as First Lady, was elected to the Senate, failed in a previous high-profile run for the presidency, and completed a term as secretary of state? The just-ended conclave in Philadelphia was, like the Republican one that preceded it, a pseudo-event par excellence, the object of the exercise being to fashion a new “image” for the Democratic candidate.
The thicket of unreality that is American politics has now become all-enveloping. The problem is not Trump and Clinton, per se. It’s an identifiable set of arrangements — laws, habits, cultural predispositions — that have evolved over time and promoted the rot that now pervades American politics. As a direct consequence, the very concept of self-government is increasingly a fantasy, even if surprisingly few Americans seem to mind.
At an earlier juncture back in 1956, out of a population of 168 million, we got Ike and Adlai. Today, with almost double the population, we get — well, we get what we’ve got. This does not represent progress. And don’t kid yourself that things really can’t get much worse. Unless Americans rouse themselves to act, count on it, they will.
Copyright 2016 Andrew J. Bacevich
The Decay of American Politics
We have it on highest authority: the recent killing of Taliban leader Mullah Akhtar Muhammad Mansour by a U.S. drone strike in Pakistan marks “an important milestone.” So the president of the United States has declared, with that claim duly echoed and implicitly endorsed by media commentary — the New York Times reporting, for example, that Mansour’s death leaves the Taliban leadership “shocked” and “shaken.”
But a question remains: A milestone toward what exactly?
Toward victory? Peace? Reconciliation? At the very least, toward the prospect of the violence abating? Merely posing the question is to imply that U.S. military efforts in Afghanistan and elsewhere in the Islamic world serve some larger purpose.
Yet for years now that has not been the case. The assassination of Mansour instead joins a long list of previous milestones, turning points, and landmarks briefly heralded as significant achievements only to prove much less than advertised.
One imagines that Obama himself understands this perfectly well. Just shy of five years ago, he was urging Americans to “take comfort in knowing that the tide of war is receding.” In Iraq and Afghanistan, the president insisted, “the light of a secure peace can be seen in the distance.”
“These long wars,” he promised, were finally coming to a “responsible end.” We were, that is, finding a way out of Washington’s dead-end conflicts in the Greater Middle East.
Who can doubt Obama’s sincerity, or question his oft-expressed wish to turn away from war and focus instead on unattended needs here at home? But wishing is the easy part. Reality has remained defiant. Even today, the wars in Iraq and Afghanistan that George W. Bush bequeathed to Obama show no sign of ending.
Like Bush, Obama will bequeath to his successor wars he failed to finish. Less remarked upon, he will also pass along to President Clinton or President Trump new wars that are his own handiwork. In Libya, Somalia, Yemen, and several other violence-wracked African nations, the Obama legacy is one of ever-deepening U.S. military involvement. The almost certain prospect of a further accumulation of briefly celebrated and quickly forgotten “milestones” beckons.
During the Obama era, the tide of war has not receded. Instead, Washington finds itself drawn ever deeper into conflicts that, once begun, become interminable — wars for which the vaunted U.S. military has yet to devise a plausible solution.
The Oldest (Also Latest) Solution: Bombs Away
Once upon a time, during the brief, if heady, interval between the end of the Cold War and 9/11 when the United States ostensibly reigned supreme as the world’s “sole superpower,” Pentagon field manuals credited U.S. forces with the ability to achieve “quick, decisive victory — on and off the battlefield — anywhere in the world and under virtually any conditions.” Bold indeed (if not utterly delusional) would be the staff officer willing to pen such words today.
To be sure, the United States military routinely demonstrates astonishing technical prowess — putting a pair of Hellfire missiles through the roof of the taxi in which Mansour was riding, for example. Yet if winning — that is, ending wars on conditions favorable to our side — offers the measure of merit by which to judge a nation’s military forces, then when put to the test ours have been found wanting.
Not for lack of trying, of course. In their quest for a formula that might actually accomplish the mission, those charged with directing U.S. military efforts in the Greater Middle East have demonstrated notable flexibility. They have employed overwhelming force and “shock-and-awe.” They have tried regime change (bumping off Saddam Hussein and Muammar Gaddafi, for example) and “decapitation” (assassinating Mansour and a host of other militant leaders, including Osama Bin Laden). They have invaded and occupied countries, even giving military-style nation-building a whirl. They have experimented with counterinsurgency and counterterrorism, peacekeeping and humanitarian intervention, retaliatory strikes and preventive war. They have operated overtly, covertly, and through proxies. They have equipped, trained, and advised — and when the beneficiaries of these exertions have folded in the face of the enemy, they have equipped, trained, and advised some more. They have converted American reservists into quasi-regulars, subject to repeated combat tours. In imitation of the corporate world, they have outsourced as well, handing over to profit-oriented “private security” firms functions traditionally performed by soldiers. In short, they have labored doggedly to translate American military power into desired political outcomes.
In this one respect at least, an endless parade of three- and four-star generals exercising command in various theaters over the past several decades have earned high marks. In terms of effort, they deserve an A.
As measured by outcomes, however, they fall well short of a passing grade. However commendable their willingness to cast about for some method that might actually work, they have ended up waging a war of attrition. Strip away the light-at-the-end-of-the-tunnel reassurances regularly heard at Pentagon press briefings or in testimony presented on Capitol Hill and America’s War for the Greater Middle East proceeds on this unspoken assumption: if we kill enough people for a long enough period of time, the other side will eventually give in.
On that score, the prevailing Washington gripe directed at Commander-in-Chief Obama is that he has not been willing to kill enough. Take, for example, a recent Wall Street Journal op-ed penned by that literary odd couple, retired General David Petraeus and Brookings Institution analyst Michael O’Hanlon, that appeared under the pugnacious headline “Take the Gloves Off Against the Taliban.” To turn around the longest war in American history, Petraeus and O’Hanlon argue, the United States just needs to drop more bombs.
The rules of engagement currently governing air operations in Afghanistan are, in their view, needlessly restrictive. Air power “represents an asymmetric Western advantage, relatively safe to apply, and very effective.” (The piece omits any mention of incidents such as the October 2015 destruction of a Doctors Without Borders hospital in the Afghan provincial capital of Kunduz by a U.S. Air Force gunship.) More ordnance will surely produce “some version of victory.” The path ahead is clear. “Simply waging the Afghanistan air-power campaign with the vigor we are employing in Iraq and Syria,” the authors write with easy assurance, should do the trick.
When armchair generals cite the ongoing U.S. campaign in Iraq and Syria as a model of effectiveness, you know that things must be getting desperate.
Granted, Petraeus and O’Hanlon are on solid ground in noting that as the number of U.S. and NATO troops in Afghanistan has decreased, so, too, has the number of air strikes targeting the Taliban. Back when more allied boots were on the ground, more allied planes were, of course, overhead. And yet the 100,000 close-air-support sorties flown between 2011 and 2015 — that’s more than one sortie per Taliban fighter — did not, alas, yield “some version of victory.” In short, we’ve already tried the Petraeus-O’Hanlon take-the-gloves-off approach to defeating the Taliban. It didn’t work. With the Afghanistan War’s 15th anniversary now just around the corner, to suggest that we can bomb our way to victory there is towering nonsense.
In Washington, Big Thinking and Small
Petraeus and O’Hanlon characterize Afghanistan as “the eastern bulwark in our broader Middle East fight.” Eastern sinkhole might be a more apt description. Note, by the way, that they have nothing useful to say about the “broader fight” to which they allude. Yet that broader fight — undertaken out of the conviction, still firmly in place today, that American military assertiveness can somehow repair the Greater Middle East — is far more deserving of attention than how to employ very expensive airplanes against insurgents armed with inexpensive Kalashnikovs.
To be fair, in silently passing over the broader fight, Petraeus and O’Hanlon are hardly alone. On this subject no one has much to say — not other stalwarts of the onward-to-victory school, nor officials presently charged with formulating U.S. national security policy, nor members of the Washington commentariat eager to pontificate about almost anything. Worst of all, the subject is one on which each of the prospective candidates for the presidency is mum.
From Secretary of Defense Ashton Carter and Chairman of the Joint Chiefs of Staff General Joseph Dunford on down to the lowliest blogger, opinions about how best to wage a particular campaign in that broader fight are readily available. Need a plan for rolling back the Islamic State? Glad you asked. Concerned about that new ISIS franchise in Libya? Got you covered. Boko Haram? Here's what you need to know. Losing sleep over al-Shabab? Take heart — big thinkers are on the case.
As to the broader fight itself, however, no one has a clue. Indeed, it seems fair to say that merely defining our aims in that broader fight, much less specifying the means to achieve them, heads the list of issues that people in Washington studiously avoid. Instead, they prattle endlessly about the Taliban and ISIS and Boko Haram and al-Shabab.
Here’s the one thing you need to know about the broader fight: there is no strategy. None. Zilch. We’re on a multi-trillion-dollar bridge to nowhere, with members of the national security establishment more or less content to see where it leads.
May I suggest that we find ourselves today in what might be called a Khe Sanh moment? Older readers will recall that back in late 1967 and early 1968 in the midst of the Vietnam War, one particular question gripped the national security establishment and those paid to attend to its doings: Can Khe Sanh hold?
Now almost totally forgotten, Khe Sanh was then a battlefield as well known to Americans as Fallujah was to become in our own day. Located in the northern part of South Vietnam, it was the site of a besieged and outnumbered Marine garrison, surrounded by two full enemy divisions. In the eyes of some observers, the outcome of the Vietnam War appeared to hinge on the ability of the Marines there to hold out — to avoid the fate that had befallen the French garrison at Dien Bien Phu slightly more than a decade earlier. For France, the fall of Dien Bien Phu had indeed spelled final defeat in Indochina.
Was history about to repeat itself at Khe Sanh? As it turned out, no… and yes.
The Marines did hold — a milestone! — and the United States lost the war anyway.
In retrospect, it seems pretty clear that those responsible for formulating U.S. policy back then fundamentally misconstrued the problem at hand. Rather than worrying about the fate of Khe Sanh, they ought to have been asking questions like these: Is the Vietnam War winnable? Does it even make sense? If not, why are we there? And above all, does no alternative exist to simply pressing on with a policy that shows no signs of success?
Today the United States finds itself in a comparable situation. What to do about the Taliban or ISIS is not a trivial question. Much the same can be said regarding the various other militant organizations with which U.S. forces are engaged in a variety of countries — many now failing states — across the Greater Middle East.
But the question of how to take out organization X or put country Y back together pales in comparison with the other questions that should by now have come to the fore but haven’t. Among the most salient are these: Does waging war across a large swath of the Islamic world make sense? When will this broader fight end? What will it cost? Short of reducing large parts of the Middle East to rubble, is that fight winnable in any meaningful sense? Above all, does the world’s most powerful nation have no other choice but to persist in pursuing a manifestly futile endeavor?
Try this thought experiment. Imagine the opposing candidates in a presidential campaign each refusing to accept war as the new normal. Imagine them actually taking stock of the broader fight that’s been ongoing for decades now. Imagine them offering alternatives to armed conflicts that just drag on and on. Now that would be a milestone.
Copyright 2016 Andrew Bacevich
Milestones (Or What Passes for Them in Washington)
Let’s face it: in times of war, the Constitution tends to take a beating. With the safety or survival of the nation said to be at risk, the basic law of the land — otherwise considered sacrosanct — becomes nonbinding, subject to being waived at the whim of government authorities who are impatient, scared, panicky, or just plain pissed off.
The examples are legion. During the Civil War, Abraham Lincoln arbitrarily suspended the writ of habeas corpus and ignored court orders that took issue with his authority to do so. After U.S. entry into World War I, the administration of Woodrow Wilson mounted a comprehensive effort to crush dissent, shutting down anti-war publications in complete disregard of the First Amendment. Amid the hysteria triggered by Pearl Harbor, Franklin Roosevelt issued an executive order consigning to concentration camps more than 100,000 Japanese-Americans, many of them native-born citizens. Asked in 1944 to review this gross violation of due process, the Supreme Court endorsed the government’s action by a 6-3 vote.
More often than not, the passing of the emergency induces second thoughts and even remorse. The further into the past a particular war recedes, the more dubious the wartime arguments for violating the Constitution appear. Americans thereby take comfort in the “lessons learned” that will presumably prohibit any future recurrence of such folly.
Even so, the onset of the next war finds the Constitution once more being ill-treated. We don’t repeat past transgressions, of course. Instead, we devise new ones. So it has been during the ongoing post-9/11 period of protracted war.
During the presidency of George W. Bush, the United States embraced torture as an instrument of policy in clear violation of the Eighth Amendment prohibiting cruel and unusual punishment. Bush's successor, Barack Obama, ordered the extrajudicial killing of an American citizen, a death by drone in open disregard of the Fifth and Fourteenth Amendments. Both administrations — Bush's with gusto, Obama's with evident regret — imprisoned individuals for years on end without charge and without anything remotely approximating the "speedy and public trial, by an impartial jury" guaranteed by the Sixth Amendment. Should the present state of hostilities ever end, we can no doubt expect Guantánamo to become yet another source of "lessons learned" for future generations of rueful Americans.
Congress on the Sidelines
Yet one particular check-and-balance constitutional proviso now appears exempt from this recurring phenomenon of disregard followed by professions of dismay, embarrassment, and “never again-ism” once the military emergency passes. I mean, of course, Article I, section 8 of the Constitution, which assigns to Congress the authority “to declare war” and still stands as testimony to the genius of those who drafted it. There can be no question that the responsibility for deciding when and whether the United States should fight resides with the legislative branch, not the executive, and that this was manifestly the intent of the Framers.
On parchment at least, the division of labor appears straightforward. The president’s designation as commander-in-chief of the armed forces in no way implies a blanket authorization to employ those forces however he sees fit or anything faintly like it. Quite the contrary: legitimizing presidential command requires explicit congressional sanction.
Actual practice has evolved into something altogether different. The portion of Article I, Section 8, cited above has become a dead letter, about as operative as blue laws still on the books in some American cities and towns that purport to regulate Sabbath day activities. Superseding the written text is an unwritten counterpart that goes something like this: with legislators largely consigned to the status of observers, presidents pretty much wage war whenever, wherever, and however they see fit. Whether the result qualifies as usurpation or forfeiture is one of those chicken-and-egg questions that’s interesting but practically speaking beside the point.
This is by no means a recent development. It has a history. In the summer of 1950, when President Harry Truman decided that a U.N. Security Council resolution provided sufficient warrant for him to order U.S. forces to fight in Korea, congressional war powers took a hit from which they would never recover.
Congress soon thereafter bought into the notion, fashionable during the Cold War, that formal declarations of hostilities had become passé. Waging the “long twilight struggle” ostensibly required deference to the commander-in-chief on all matters related to national security. To sustain the pretense that it still retained some relevance, Congress took to issuing what were essentially permission slips, granting presidents maximum freedom of action to do whatever they might decide needed to be done in response to the latest perceived crisis.
The Tonkin Gulf Resolution of 1964 offers a notable example. With near unanimity, legislators urged President Lyndon Johnson “to take all necessary measures to repel any armed attack against the forces of the United States and to prevent further aggression” across the length and breadth of Southeast Asia. Through the magic of presidential interpretation, a mandate to prevent aggression provided legal cover for an astonishingly brutal and aggressive war in Vietnam, as well as Cambodia and Laos. Under the guise of repelling attacks on U.S. forces, Johnson and his successor, Richard Nixon, thrust millions of American troops into a war they could not win, even if more than 58,000 died trying.
To leap almost four decades ahead, think of the Authorization for Use of Military Force (AUMF) that was passed by Congress in the immediate aftermath of 9/11 as the grandchild of the Tonkin Gulf Resolution. This document required (directed, called upon, requested, invited, urged) President George W. Bush "to use all necessary and appropriate force against those nations, organizations, or persons he determines planned, authorized, committed, or aided the terrorist attacks that occurred on September 11, 2001, or harbored such organizations or persons, in order to prevent any future acts of international terrorism against the United States by such nations, organizations, or persons." In plain language: here's a blank check; feel free to fill it in any way you like.
As a practical matter, one specific individual — Osama bin Laden — had hatched the 9/11 plot. A single organization — al-Qaeda — had conspired to pull it off. And just one nation — backward, Taliban-controlled Afghanistan — had provided assistance, offering sanctuary to bin Laden and his henchmen. Yet nearly 15 years later, the AUMF remains operative and has become the basis for military actions against innumerable individuals, organizations, and nations with no involvement whatsoever in the murderous events of September 11, 2001.
Consider the following less than comprehensive list of four developments, all of which occurred just within the last month and a half:
*In Yemen, a U.S. airstrike killed at least 50 individuals, said to be members of an Islamist organization that did not exist on 9/11.
*In Somalia, another U.S. airstrike killed a reported 150 militants, reputedly members of al-Shabab, a very nasty outfit, even if one with no real agenda beyond Somalia itself.
*In Syria, pursuant to the campaign of assassination that is the latest spin-off of the Iraq War, U.S. special operations forces bumped off the reputed “finance minister” of the Islamic State, another terror group that didn’t even exist in September 2001.
*In Libya, according to press reports, the Pentagon is again gearing up for "decisive military action" — that is, a new round of air strikes and special operations attacks to quell the disorder resulting from the U.S.-orchestrated air campaign that in 2011 destabilized that country. An airstrike conducted in late February gave a hint of what is to come: it killed approximately 50 Islamic State militants (and possibly two kidnapped Serbian embassy staff members).
Yemen, Somalia, Syria, and Libya share at least this in common: none of them, nor any of the groups targeted, had a hand in the 9/11 attacks.
Imagine if, within a matter of weeks, China were to launch raids into Vietnam, Thailand, and Taiwan, with punitive action against the Philippines in the offing. Or if Russia, having given a swift kick to Ukraine, Georgia, and Azerbaijan, leaked its plans to teach Poland a lesson for mismanaging its internal affairs. Were Chinese President Xi Jinping or Russian President Vladimir Putin to order such actions, the halls of Congress would ring with fierce denunciations. Members of both houses would jostle for places in front of the TV cameras to condemn the perpetrators for recklessly violating international law and undermining the prospects for world peace. Having no jurisdiction over the actions of other sovereign states, senators and representatives would break down the doors to seize the opportunity to get in their two cents' worth. No one would be able to stop them. Who does Xi think he is! How dare Putin!
Yet when an American president undertakes analogous actions over which the legislative branch does have jurisdiction, members of Congress either yawn or avert their eyes.
In this regard, Republicans are especially egregious offenders. On matters where President Obama is clearly acting in accordance with the Constitution — for example, in nominating someone to fill a vacancy on the Supreme Court — they spare no effort to thwart him, concocting bizarre arguments nowhere found in the Constitution to justify their obstructionism. Yet when this same president cites the 2001 AUMF as the basis for initiating hostilities hither and yon, something that is on the face of it not legal but ludicrous, they passively assent.
Indeed, when Obama in 2015 went so far as to ask Congress to pass a new AUMF addressing the specific threat posed by the Islamic State — that is, essentially rubberstamping the war he had already launched on his own in Syria and Iraq — the Republican leadership took no action. Looking forward to the day when Obama departs office, Senator Mitch McConnell with his trademark hypocrisy worried aloud that a new AUMF might constrain his successor. The next president will “have to clean up this mess, created by all of this passivity over the last eight years,” the majority leader remarked. In that regard, “an authorization to use military force that ties the president’s hands behind his back is not something I would want to do.” The proper role of Congress was to get out of the way and give this commander-in-chief carte blanche so that the next one would enjoy comparably unlimited prerogatives.
Collaborating with a president they roundly despise — implicitly concurring in Obama’s questionable claim that “existing statutes [already] provide me with the authority I need” to make war on ISIS — the GOP-controlled Congress thereby transformed the post-9/11 AUMF into what has now become, in effect, a writ of permanent and limitless armed conflict. In Iraq and Syria, for instance, what began as a limited but open-ended campaign of air strikes authorized by President Obama in August 2014 has expanded to include an ever-larger contingent of U.S. trainers and advisers for the Iraqi military, special operations forces conducting raids in both Iraq and Syria, the first new all-U.S. forward fire base in Iraq, and at least 5,000 U.S. military personnel now on the ground, a number that continues to grow incrementally.
Remember Barack Obama campaigning back in 2008 and solemnly pledging to end the Iraq War? What he neglected to mention at the time was that he was retaining the prerogative to plunge the country into another Iraq War on his own ticket. This he has now done, with members of Congress passively assenting and the country essentially a prisoner of war.
By now, through its inaction, the legislative branch has, in fact, surrendered the final remnant of authority it retained on matters relating to whether, when, against whom, and for what purpose the United States should go to war. Nothing now remains but to pay the bills, which Congress routinely does, citing a solemn obligation to “support the troops.” In this way does the performance of lesser duties provide an excuse for shirking far greater ones.
In military circles, there is a term to describe this type of behavior. It’s called cowardice.
Copyright 2016 Andrew J. Bacevich