Americans are facing “A Spring Unlike Any Before.” So warned a front-page headline in the March 13th New York Times.
That headline, however hyperbolic, was all too apt. The coming of spring has always promised relief from the discomforts of winter. Yet, far too often, it also brings its own calamities and afflictions.
According to the poet T.S. Eliot, “April is the cruelest month.” Yet while April has certainly delivered its share of cataclysms, March and May haven’t lagged far behind. In fact, cruelty has seldom been a respecter of seasons. The infamous influenza epidemic of 1918, frequently cited as a possible analogue to our current crisis, began in the spring of that year, but lasted well into 1919.
That said, something about the coronavirus pandemic does seem to set this particular spring apart. At one level, that something is the collective panic now sweeping virtually the entire country. President Trump’s grotesque ineptitude and tone-deafness have only fed that panic. And in their eagerness to hold Trump himself responsible for the pandemic, as if he were the bat that first transmitted the disease to a human being, his critics magnify further a growing sense of events spinning out of control.
Yet to heap the blame for this crisis on Trump alone (though he certainly deserves plenty of blame) is to miss its deeper significance. Deferred for far too long, Judgment Day may at long last have arrived for the national security state.
Origins of a Colossus
The origins of that state within a state date from the early days of the Cold War. Its ostensible purpose has been to keep Americans safe and so, by extension, to guarantee our freedoms. From the 1950s through the 1980s, keeping us safe provided a seemingly adequate justification for maintaining a sprawling military establishment along with a panoply of “intelligence” agencies — the CIA, the DIA, the NRO, the NSA — all engaged in secret activities hidden from public view. From time to time, the scope, prerogatives, and actions of that conglomeration of agencies attracted brief critical attention — the Bay of Pigs fiasco in Cuba in 1961, the Vietnam War of the 1960s and early 1970s, and the Iran-Contra affair during the presidency of Ronald Reagan being prime examples. Yet at no time did such failures come anywhere close to jeopardizing its existence.
Indeed, even when the implosion of the Soviet Union and the end of the Cold War removed the original justification for its creation, the entire apparatus persisted. With the Soviet Empire gone, Russia in a state of disarray, and communism having lost its appeal as an alternative to democratic capitalism, the managers of the national security state wasted no time in identifying new threats and new missions.
The new threats included autocrats like Panama’s Manuel Noriega and Iraq’s Saddam Hussein, once deemed valuable American assets, but now, their usefulness gone, classified as dangers to be eliminated. Prominent among the new missions was a sudden urge to repair broken places like the Balkans, Haiti, and Somalia, with American power deployed under the aegis of “humanitarian intervention” and pursuant to a “responsibility to protect.” In this way, in the first decade of the post-Cold War era, the national security state kept itself busy. While the results achieved, to put it politely, were mixed at best, the costs incurred appeared tolerable. In sum, the entire apparatus remained impervious to serious scrutiny.
During that decade, however, both the organs of national security and the American public began taking increased notice of what was called “anti-American terrorism” — and not without reason. In 1993, Islamic fundamentalists detonated a bomb in a parking garage of New York’s World Trade Center. In 1996, terrorists obliterated an apartment building used to house U.S. military personnel in Saudi Arabia. Two years later, the U.S. embassies in Kenya and Tanzania were blown up and, in 2000, suicide bombers nearly sank the USS Cole, a Navy destroyer making a port call in Aden at the tip of the Arabian peninsula. To each of these increasingly brazen attacks, all occurring during the administration of President Bill Clinton, the national security state responded ineffectually.
Then, of course, came September 11, 2001. Orchestrated by Osama bin Laden and carried out by 19 suicidal al-Qaeda operatives, this act of mass murder inflicted incalculable harm on the United States. In its wake, it became common to say that “9/11 changed everything.”
In fact, however, remarkably little changed. Despite its 17 intelligence agencies, the national security state failed utterly to anticipate and thwart that devastating attack on the nation’s political and financial capitals. Yet apart from minor adjustments — primarily expanding surveillance efforts at home and abroad — those outfits mostly kept doing what they had been doing, even as their leaders evaded accountability. After Pearl Harbor, at least, one admiral and one general were fired. After 9/11, no one lost his or her job. At the upper echelons of the national security state, the wagons were circled and a consensus quickly formed: no one had screwed up.
Once President George W. Bush identified an “Axis of Evil” (Iraq, Iran, and North Korea), three nations that had had nothing whatsoever to do with the 9/11 attacks, as the primary target for his administration’s “Global War on Terrorism,” it became clear that no wholesale reevaluation of national security policy was going to occur. The Pentagon and the Intelligence Community, along with their sprawling support network of profit-minded contractors, could breathe easy. All of them would get ever more money. That went without saying. Meanwhile, the underlying premise of U.S. policy since the immediate aftermath of World War II — that projecting hard power globally would keep Americans safe — remained sacrosanct.
Viewed from this perspective, the sequence of events that followed was probably overdetermined. In late 2001, U.S. forces invaded Afghanistan, overthrew the Taliban regime, and set out to install a political order more agreeable to Washington. In early 2003, with the mission in Afghanistan still anything but complete, U.S. forces set out to do the same in Iraq. Both of those undertakings have dragged on, in one fashion or another, without coming remotely close to success. Today, the military undertaking launched in 2001 continues, even if it no longer has a name or an agreed-upon purpose.
Nonetheless, at the upper echelons of the national security state, the consensus forged after 9/11 remains firmly in place: no one screws up. In Washington, the conviction that projecting hard power keeps Americans safe likewise remains sacrosanct.
In the nearly two decades since 9/11, willingness to challenge this paradigm has rarely extended beyond non-conforming publications like TomDispatch. Until Donald Trump came along, rare was the ambitious politician of either political party who dared say aloud what Trump himself has repeatedly said — that, as he calls them, the “ridiculous endless wars” launched in response to 9/11 represent the height of folly.
Astonishingly enough, within the political establishment that point has still not sunk in. So, in 2020, as in 2016, the likely Democratic nominee for president will be someone who vigorously supported the 2003 invasion of Iraq. Imagine, if you will, Democrats in 1880 nominating not a former Union general (as they did) but a former Confederate who, 20 years before, had advocated secession. Back then, some sins were unforgivable. Today, politicians of both parties practice self-absolution and get away with it.
The Real Threat
Note, however, the parallel narrative that has unfolded alongside those post-9/11 wars. Taken seriously, that narrative exposes the utter irrelevance of the national security state as currently constituted. The coronavirus pandemic will doubtless prove to be a significant learning experience. Here is one lesson that Americans cannot afford to overlook.
Presidents now routinely request and Congress routinely appropriates more than a trillion dollars annually to satisfy the national security state’s supposed needs. Even so, Americans today do not feel safe and, to a degree without precedent, they are being denied the exercise of basic everyday freedoms. Judged by this standard, the apparatus created to keep them safe and free has failed. In the face of a pandemic, nature’s version of an act of true terror, that failure, the consequences of which Americans will suffer through for months to come, should be seen as definitive.
But wait, some will object: Don’t we find ourselves in uncharted waters? Is this really the moment to rush to judgment? In fact, judgment is long overdue.
While the menace posed by the coronavirus may differ in scope, it does not differ substantively from the myriad other perils that Americans have endured since the national security state wandered off on its quixotic quest to pacify Afghanistan and Iraq and purge the planet of terrorists. Since 9/11, a partial roster of those perils would include: Hurricane Katrina (2005), Hurricane Sandy (2012), Hurricanes Harvey, Irma, and Maria (2017), and massive wildfires that have devastated vast stretches of the West Coast on virtually an annual basis. The cumulative cost of such events exceeds a half-trillion dollars. Together, they have taken the lives of several thousand more people than were lost in the 2001 attacks on the World Trade Center and the Pentagon.
Earlier generations might have written all of these off as acts of God. Today, we know better. As with blaming Trump, blaming God won’t do. Human activities, ranging from the hubristic reengineering of rivers like the Mississippi to the effects of climate change stemming from the use of fossil fuels, have substantially exacerbated such “natural” catastrophes.
And unlike faraway autocrats or terrorist organizations, such phenomena, from extreme-weather events to pandemics, directly and immediately threaten the safety and wellbeing of the American people. Don’t tell the Central Intelligence Agency or the Joint Chiefs of Staff, but the principal threats to our collective wellbeing are right here where we live.
Apart from modest belated efforts at mitigation, the existing national security state is about as pertinent to addressing such threats as President Trump’s cheery expectations that the coronavirus will simply evaporate once warmer weather appears. Terror has indeed arrived on our shores and it has nothing to do with al-Qaeda or ISIS or Iranian-backed militias. Americans are terrorized because it has now become apparent that our government, whether out of negligence or stupidity, has left them exposed to dangers that truly put life and liberty at risk. As it happens, all these years in which the national security state has been preoccupied with projecting hard power abroad have left us naked and vulnerable right here at home.
Protecting Americans where they live ought to be the national security priority of our time. The existing national security state is incapable of fulfilling that imperative, while its leaders, fixated on waging distant wars, have yet to even accept that they have a responsibility to do so.
Worst of all, even in this election year, no one on the national political scene appears to recognize the danger now fully at hand.
Judgment Day for the National Security State
The impeachment of the president of the United States! Surely such a mega-historic event would reverberate for weeks or months, leaving in its wake no end of consequences, large and small. Wouldn’t it? Shouldn’t it?
Truth to tell, the word historic does get tossed around rather loosely these days. Just about anything that happens at the White House, for example, is deemed historic. Watch the cable news networks and you’ll hear the term employed regularly to describe everything from Oval Office addresses to Rose Garden pronouncements to press conferences in which foreign dignitaries listen passively while their presidential host pontificates about subjects that have nothing to do with them and everything to do with him.
Of course, almost all of these are carefully scripted performances that are devoid of authenticity. In short, they’re fraudulent. The politicians who participate in such performances know that it’s all a sham. So, too, do the reporters and commentators paid to “interpret” the news. So, too, does any semi-attentive, semi-informed citizen.
Yet on it goes, day in, day out, as politicians, journalists, and ordinary folk collaborate in manufacturing, propagating, and consuming a vast panoply of staged incidents, which together comprise what Americans choose to treat as the very stuff of contemporary history. “Pseudo-events” was the term that historian Daniel Boorstin coined to describe them in his classic 1962 book The Image: A Guide to Pseudo-Events in America. The accumulation of such incidents creates a make-believe world. As Boorstin put it, they give rise to a “thicket of unreality that stands between us and the facts of life.”
As substitutes for reality, pseudo-events, he claimed, breed “extravagant expectations” that can never be met, with disappointment, confusion, and anger among the inevitable results. Writing decades before the advent of CNN, Fox News, Google, Facebook, and Twitter, Boorstin observed that “we are deceived and obstructed by the very machines we make to enlarge our vision.” So it was back then during the presidency of John F. Kennedy, a master of pseudo-events in the still relatively early days of television. And so our world remains today during the presidency of Donald Trump, who achieved high office by unmasking the extravagant post-Cold War/sole superpower/indispensable nation/end of history expectations of the political class, only to weave his own in their place.
As Trump so skillfully demonstrates, even as they deceive, pseudo-events also seduce, inducing what Boorstin referred to as a form of “national self-hypnosis.” With enough wishful thinking, reality becomes entirely optional. So the thousands of Trump loyalists attending MAGA rallies implicitly attest as they count on their hero to make their dreams come true and their nightmares go away.
Yet when it comes to extravagant expectations, few pseudo-events can match the recently completed presidential impeachment and trial. Even before his inauguration, the multitudes who despise Donald Trump longed to see him thrown out of office. To ensure the survival of the Republic, they believed, Trump’s removal needed to happen. And when the impeachment process did finally begin to unfold, feverish reporters and commentators could find little else to talk about. With the integrity of the Constitution itself said to be at stake, the enduringly historic significance of each day’s developments appeared self-evident. Or so we were told anyway.
Yet while all parties involved dutifully recited their prescribed lines — no one with greater relish than Donald Trump himself — the final outcome was never in doubt. The Republican Senate was no more likely to convict the president than he was to play golf without cheating. So no sooner did the Senate let Trump off the hook than the fever broke. In an instant, the farcical nature of the entire process became blindingly apparent. Rarely has the gap between hype and actual historical substance been so vast.
The effort to oust the president from office had unleashed a tidal wave of angst, anxiety, anger, and hope. Yet a mere handful of weeks after its conclusion, the impeachment of Donald Trump retains about as much salience as the impeachment of Andrew Johnson, which concluded in 1868.
What does the instantaneous deflation of this ostensibly historic event signify? Among other things, it shows that we still live in the world of pseudo-events that Boorstin described nearly 60 years ago. The American susceptibility to contrived and scripted versions of reality persists, revealing an emptiness at the core of our national politics. Arguably, in our age of social media, that emptiness is greater still. To look past the pseudo-events staged to capture our attention is to confront a void.
Pseudo-events Gone Wrong
Yet in this dismal situation, flickering bits of truth occasionally do appear in moments when pseudo-events inadvertently expose realities they are meant to conceal. Boorstin posited that “pseudo-events produce more pseudo-events.” While that might be broadly correct, let me offer a caveat: given the right conditions, pseudo-events can also be self-subverting, their cumulative absurdity undermining their cumulative authority. Every now and then, in other words, we get the sneaking suspicion that much of what in Washington gets advertised as historic just might be a load of bullshit.
As it happens, the season of Trump’s impeachment offered three encouraging instances of a prominent pseudo-event being exposed as delightfully bogus: the Iowa Caucus, the State of the Union Address, and the National Prayer Breakfast.
According to custom, every four years the Iowa Caucus initiates what is said to be a fair, methodical, and democratic process of selecting the presidential nominees of the two principal political parties. According to custom and in accordance with a constitutional requirement, the State of the Union Address offers presidents an annual opportunity to appear before Congress and the American people to assess the nation’s condition and describe administration plans for the year ahead. Pursuant to a tradition dating from the early years of the Cold War, the National Prayer Breakfast, held annually in Washington, invites members of the political establishment to bear witness to the assertion that we remain a people “under God,” united in all our wondrous diversity by a shared faith in the Almighty.
This year all three went haywire, each in a different way, but together hinting at the vulnerability of other pseudo-events assumed to be fixed and permanent. By offering a peek at previously hidden truths, the trio of usually forgettable events just might merit celebration.
First, on February 3rd, came the long-awaited Iowa Caucus. Commentators grasping for something to write about in advance of caucus night entertained themselves by lamenting the fact that the Hawkeye State is too darn white, implying, in effect, that Iowans aren’t sufficiently American. As it happened, the problem turned out to be not a lack of diversity, but a staggering lack of competence, as the state’s Democratic Party thoroughly botched the one and only event that allows Iowa to claim a modicum of national political significance. To tally caucus results, it employed an ill-tested and deficient smartphone app created by party insiders who were clearly out of their depth.
The result was an epic cockup, a pseudo-event exposed as political burlesque. The people of Iowa had spoken — the people defined in this instance as registered Democrats who bothered to show up — but no one quite knew what they had said. By the time the counting and recounting were over, the results no longer mattered. Iowa was supposed to set in motion an orderly sorting-out process for the party and its candidates. Instead, it sowed confusion and then more confusion. Yet in doing so, the foul-up in Iowa suggested that maybe, just maybe, the entire process of selecting presidential candidates is in need of a complete overhaul, with the present quadrennial circus replaced by an approach that might yield an outcome more expeditiously, while wasting less money and, yes, also taking diversity into account.
Next, on February 4th, came the State of the Union Address. Resplendent with ritual and ceremony, this event certainly deserves an honored place in the pseudo-event Hall of Fame. This year’s performance was no exception. President Trump bragged shamelessly about his administration’s many accomplishments, planted compliant live mannequins in the gallery of the House of Representatives to curry favor with various constituencies — hatemongering radio host Rush Limbaugh received the Medal of Freedom from the First Lady! — even as he otherwise kept pretty much to the model employed by every president since Ronald Reagan. It was, in other words, a pseudo-event par excellence.
The sole revelatory moment came just after Trump finished speaking. In an endearing and entirely salutary gesture, House Speaker Nancy Pelosi, standing behind the president, promptly rendered her verdict on the entire occasion. Like a thoroughly miffed schoolteacher rejecting unsatisfactory homework from a delinquent pupil, she tore the text of Trump’s remarks in two. In effect, Pelosi thereby announced that the entire evening had consisted of pure, unadulterated nonsense, as indeed it had and as has every other State of the Union Address in recent memory.
Blessings upon Speaker Pelosi. Next year, we must hope that she will skip the occasion entirely as not worthy of her time. Other members of Congress, preferably from both parties, may then follow her example, finding better things to do. Within a few years, presidents could find themselves speaking in an empty chamber. The networks will then lose interest. At that juncture, the practice that prevailed from the early days of the Republic until the administration of Woodrow Wilson might be restored: every year or so, presidents can simply send a letter to Congress ruminating about the state of the nation, with members choosing to attend to it or ignore it as they please. And the nation’s calendar will therefore be purged altogether of one prominent pseudo-event.
The National Prayer Breakfast, which occurred on February 6th, completes our trifecta of recent pseudo-events gone unexpectedly awry. Here the credit belongs entirely to President Trump who used his time at the dais during this nominally religious event as an opportunity to whine about the “terrible ordeal” he had just endured at the hands of “some very dishonest and corrupt people.” Alluding specifically to Pelosi (and perhaps with Mitt Romney also in mind), Trump denounced his critics as hypocrites. “I don’t like people who use their faith as justification for doing what they know is wrong,” he said. “Nor do I like people who say, ‘I pray for you,’ when they know that that’s not so.”
Jesus might have forgiven his tormentors, but Donald Trump, a self-described Christian, is not given to following the Lord’s example. So instead of an occasion for faux displays of brotherly ecumenism, this year’s National Prayer Breakfast became one more exhibition of petty partisanship — relieving the rest of us (and the media) of any further need to pretend that it ever possessed anything approximating a serious religious motivation.
So if only in an ironic sense, the first week of February 2020 did end up qualifying as a genuinely historic occasion. Granted, those who claim the authority to instruct the rest of us on what deserves that encomium missed its true significance. They had wasted no time in moving on to the next pseudo-event, this one in New Hampshire. Yet over the course of a handful of days, Americans had been granted a glimpse of the reality that pseudo-events are designed to camouflage.
A few more such glimpses and something like “the facts of life” to which Boorstin alluded so long ago might become impossible to hide any longer. Imagine: No more bullshit. In these dark and discouraging times, aren’t we at least entitled to such a hope?
How “Historic” Are We?
Thirty years ago this month, President George H.W. Bush appeared before a joint session of Congress to deliver his first State of the Union Address, the first post-Cold War observance of this annual ritual. Just weeks before, the Berlin Wall had fallen. That event, the president declared, “marks the beginning of a new era in the world’s affairs.” The Cold War, that “long twilight struggle” (as President John F. Kennedy so famously described it), had just come to an abrupt end. A new day was dawning. President Bush seized the opportunity to explain just what that dawning signified.
“There are singular moments in history, dates that divide all that goes before from all that comes after,” the president said. The end of World War II had been just such a moment. In the decades that followed, 1945 provided “the common frame of reference, the compass points of the postwar era we’ve relied upon to understand ourselves.” Yet the hopeful developments of the year just concluded — Bush referred to them collectively as “the Revolution of ’89” — had initiated “a new era in the world’s affairs.”
While many things were certain to change, the president felt sure that one element of continuity would persist: the United States would determine history’s onward course. “America, not just the nation but an idea,” he emphasized, is and was sure to remain “alive in the minds of people everywhere.”
“As this new world takes shape, America stands at the center of a widening circle of freedom — today, tomorrow, and into the next century. Our nation is the enduring dream of every immigrant who ever set foot on these shores and the millions still struggling to be free. This nation, this idea called America, was and always will be a new world — our new world.”
Bush had never shown himself to be a particularly original or imaginative thinker. Even so, during a long career in public service, he had at least mastered the art of packaging sentiments deemed appropriate for just about any occasion. The imagery he employed in this instance — America occupying the center of freedom’s widening circle — did not stake out a new claim devised for fresh circumstances. The notion that history centered on what Americans professed or did was a hallowed proposition, one with which his listeners were both familiar and comfortable. Indeed, Bush’s description of America as a perpetually self-renewing enterprise engaged in perfecting freedom summarized the essence of the nation’s self-assigned purpose.
In his remarks to Congress, the president was asserting a prerogative that his predecessors had long ago appropriated: interpreting the zeitgeist in such a way as to merge past, present, and future into a seamless, self-congratulatory, and reassuring narrative of American power. He was describing history precisely as Americans — or at least privileged Americans — wished to see it. He was, in other words, speaking a language in which he was fluent: the idiom of the ruling class.
As the year 1990 began, duty — destiny, even — was summoning members of that ruling class to lead not just this country, but the planet itself and not just for a decade or two, or even for an “era,” but forever and a day. In January 1990, the way ahead for the last superpower on planet Earth — the Soviet Union would officially implode in 1991 but its fate already seemed obvious enough — was clear indeed.
So, How’d We Do?
Thirty years later, perhaps it’s time to assess just how well the United States has fulfilled the expectations President Bush articulated in 1990. Personally, I would rate the results somewhere between deeply disappointing and flat-out abysmal.
Bush’s “circle of freedom” invoked a planet divided between the free and the unfree. During the Cold War, this distinction had proven useful even if it was never particularly accurate. Today, it retains no value whatsoever as a description of the actually existing world, even though in Washington it persists, as does the conviction that the U.S. has a unique responsibility to expand that circle.
Encouraged by ambitious politicians and ideologically driven commentators, many (though not all) Americans bought into a militarized, Manichean, vastly oversimplified conception of the Cold War. Having misconstrued its meaning, they misconstrued the implications of its passing, leaving them ill-prepared to see through the claptrap in President Bush’s 1990 State of the Union Address.
Bush depicted the “Revolution of ’89” as a transformative moment in world history. In fact, the legacy of that moment has proven far more modest than he imagined. As a turning point in the history of the modern world, the end of the Cold War ranks slightly above the invention of the machine gun (1884), but well below the fall of Russia’s Romanov dynasty (1917) or the discovery of penicillin (1928). Among the factors shaping the world in which we now live, the outcome of the Cold War barely registers.
Fairness obliges me to acknowledge two exceptions to that broad claim, one pertaining to Europe and the other to the United States.
First, the end of the Cold War led almost immediately to a Europe made “whole and free” thanks to the collapse of the Soviet empire. Yet while Poles, Lithuanians, the former citizens of the German Democratic Republic, and other Eastern Europeans are certainly better off today than they were under the Kremlin’s boot, Europe itself plays a significantly diminished role in world affairs. In healing its divisions, it shrank, losing political clout. Meanwhile, in very short order, new cleavages erupted in the Balkans, Spain, and even the United Kingdom, with the emergence of a populist right calling into question Europe’s assumed commitment to multicultural liberalism.
In many respects, the Cold War began as an argument over who would determine Europe’s destiny. In 1989, our side won that argument. Yet, by then, the payoff to which the United States laid claim had largely been depleted. Europe’s traditional great powers were no longer especially great. After several centuries in which global politics had centered on that continent, Europe had suddenly slipped to the periphery. In practice, “whole and free” turned out to mean “preoccupied and anemic,” with Europeans now engaging in their own acts of folly. Three decades after the “Revolution of ’89,” Europe remains an attractive tourist destination. Yet, from a geopolitical perspective, the action has long since moved elsewhere.
The second exception to the Cold War’s less than momentous results relates to U.S. attitudes toward military power. For the first time in its history, the onset of the Cold War had prompted the United States to create and maintain a powerful peacetime military establishment. The principal mission of that military was to defend, deter, and contain. While it would fight bitter wars in Korea and Vietnam, its advertised aim was to avert armed conflicts or, at least, keep them from getting out of hand. In that spirit, the billboard at the entrance to the headquarters of the Strategic Air Command, the Pentagon’s principal Cold War nuclear strike force (which possessed the means to extinguish humankind), reassuringly announced that “peace is our profession.”
When the Cold War ended, however, despite the absence of any real threats to U.S. security, Washington policymakers decided to maintain the mightiest armed forces on the planet in perpetuity. Negligible debate preceded this decision, which even today remains minimally controversial. That the United States should retain military capabilities far greater than those of any other nation, or indeed any combination of other nations, seemed eminently sensible.
In appearance or configuration, the post-Cold War military differed little from what it had looked like between the 1950s and 1989. Yet the armed forces of the United States now took on a radically different, far more ambitious mission: to impose order and spread American values globally, while eliminating obstacles deemed to impede those efforts. During the Cold War, policymakers had placed a premium on holding U.S. forces in readiness. Now, the idea was to put “the troops” to work. Power projection became the name of the game.
Just a month prior to his State of the Union Address, President Bush himself had given this approach a test run, ordering U.S. forces to intervene in Panama, overthrow the existing government there, and install in its place one expected to be more compliant. The president now neatly summarized the outcome of that action in three crisp sentences. “One year ago,” he announced, “the people of Panama lived in fear, under the thumb of a dictator. Today democracy is restored; Panama is free. Operation Just Cause has achieved its objective.”
Mission accomplished: end of story. Here, it seemed, was a template for further application globally.
As it happened, however, Operation Just Cause proved to be the exception rather than the rule. Intervention in Panama did inaugurate a period of unprecedented American military activism. In the years that followed, U.S. forces invaded, occupied, bombed, or raided an astonishing array of countries. Rarely, however, was the outcome as tidy as it had been in Panama, where the fighting lasted a mere five days. Untidy and protracted conflicts proved more typical of the post-Cold War U.S. experience, with the Afghanistan War, a futile undertaking now in its 19th year, a notable example. The present-day U.S. military qualifies by any measure as highly professional, much more so than its Cold War predecessor. Yet the purpose of today’s professionals is not to preserve peace but to fight unending wars in distant places.
Intoxicated by a post-Cold War belief in its own omnipotence, the United States allowed itself to be drawn into a long series of armed conflicts, almost all of them yielding unintended consequences and imposing greater than anticipated costs. Since the end of the Cold War, U.S. forces have destroyed many targets and killed many people. Only rarely, however, have they succeeded in accomplishing their assigned political purposes. From a military perspective — except perhaps in the eyes of the military-industrial complex — the legacy of the “Revolution of ’89” turned out to be almost entirely negative.
A Broken Compass
So, contrary to President Bush’s prediction, the fall of the Berlin Wall did not inaugurate a “new era in world affairs” governed by “this idea called America.” It did, however, accelerate Europe’s drift toward geopolitical insignificance and induced in Washington a sharp turn toward reckless militarism — neither of which qualifies as cause for celebration.
Yet today, 30 years after Bush’s 1990 State of the Union, a “new era in world affairs” is indeed upon us, even if it bears scant resemblance to the order Bush expected to emerge. If his “idea called America” did not shape the contours of this new age, then what has?
Answer: all the things post-Cold War Washington policy elites misunderstood or relegated to the status of afterthought. Here are three examples of key factors that actually shaped the present era. Notably, each had its point of origin prior to the end of the Cold War. Each came to maturity while U.S. policymakers, hypnotized by the “Revolution of ’89,” were busily trying to reap the benefits they fancied to be this country’s for the taking. Each far surpasses in significance the fall of the Berlin Wall.
The “Rise” of China: The China that we know today emerged from reforms instituted by Communist Party leader Deng Xiaoping, which transformed the People’s Republic into an economic powerhouse. No nation in history, including the United States, has ever come close to matching China’s spectacular ascent. In just four decades, its per capita gross domestic product skyrocketed from $156 in 1978 to $9,771 in 2017.
The post-Cold War assumption common among American elites that economic development would necessarily prompt political liberalization turned out to be wishful thinking. In Beijing today, the Communist Party remains firmly in control. Meanwhile, as illustrated by its “Belt and Road” initiative, China has begun to assert itself globally, while simultaneously enhancing the capabilities of the People’s Liberation Army. In all of this, the United States — apart from borrowing from China to pay for an abundance of its imported products (now well over a half-trillion dollars of them annually) — has figured as little more than a bystander. As China radically alters the balance of power in twenty-first-century East Asia, the outcome of the Cold War has no more relevance than does Napoleon’s late-eighteenth-century expedition to Egypt.
A Resurgence of Religious Extremism: Like the poor, religious fanatics will always be with us. They come in all stripes: Christians, Hindus, Jews, Muslims. Yet implicit in the American idea that lay at the heart of Bush’s State of the Union Address was an expectation of modernity removing religion from politics. That the global advance of secularization would lead to the privatization of faith was accepted as a given in elite circles. After all, the end of the Cold War ostensibly left little to fight about. With the collapse of communism and the triumph of democratic capitalism, all the really big questions had been settled. That religiously inspired political violence would become a crucial factor in global politics therefore seemed inconceivable.
Yet a full decade before the “Revolution of ’89,” events were already shredding that expectation. In November 1979, radical Islamists shocked the House of Saud by seizing the Grand Mosque in Mecca. Although local security forces regained control after a bloody gun battle, the Saudi royal family resolved to prevent any recurrence of such a disaster by demonstrating beyond the shadow of a doubt its own fealty to the teachings of Allah. It did so by expending staggering sums throughout the Ummah to promote a puritanical form of Islam known as Wahhabism.
In effect, Saudi Arabia became the principal underwriter of what would morph into Islamist terror. For Osama bin Laden and his militant followers, the American idea to which President Bush paid tribute that January in 1990 was blasphemous, intolerable, and a justification for war. Lulled by a belief that the end of the Cold War had yielded a definitive victory, the entire U.S. national security apparatus would be caught unawares in September 2001 when religious warriors assaulted New York and Washington. Nor was the political establishment prepared for the appearance of violence perpetrated by domestic religious extremists. During the Cold War, it had become fashionable to declare God dead. That verdict turned out to be premature.
The Assault on Nature: From its inception, the American idea so lavishly praised by President Bush in 1990 had allowed, even fostered, the exploitation of the natural world based on a belief in Planet Earth’s infinite capacity to absorb punishment. During the Cold War, critics like Rachel Carson, author of the pioneering environmental book Silent Spring, had warned against just such an assumption. While their warnings received respectful hearings, they elicited only modest corrective action.
Then, in 1988, a year prior to the fall of the Berlin Wall, in testimony before Congress, NASA scientist James Hansen issued a far more alarming warning: human activity, particularly the burning of fossil fuels, was inducing profound changes in the global climate with potentially catastrophic consequences. (Of course, a prestigious scientific advisory committee had offered just such a warning to President Lyndon Johnson more than two decades earlier, predicting the early twenty-first-century effects of climate change, to no effect whatsoever.)
To put it mildly, President Bush and other members of the political establishment did not welcome Hansen’s analysis. After all, to take him seriously meant admitting to the necessity of modifying a way of life centered on self-indulgence, rather than self-restraint. At some level, perpetuating the American penchant for material consumption and personal mobility had described the ultimate purpose of the Cold War. Bush could no more tell Americans to settle for less than he could imagine a world order in which the United States no longer occupied “the center of a widening circle of freedom.”
Some things were sacrosanct. As he put it on another occasion, “The American way of life is not up for negotiation. Period.”
So while President Bush was not an outright climate-change denier, he temporized. Talk took precedence over action. He thereby set a pattern to which his successors would adhere, at least until the Trump years. To thwart communism during the Cold War, Americans might have been willing to “pay any price, bear any burden.” Not so when it came to climate change. The Cold War itself had seemingly exhausted the nation’s capacity for collective sacrifice. So, on several fronts, the assault on nature continues and is even gaining greater momentum.
In sum, from our present vantage point, it becomes apparent that the “Revolution of ’89” did not initiate a new era of history. At most, the events of that year fostered various unhelpful illusions that impeded our capacity to recognize and respond to the forces of change that actually matter.
Restoring the American compass to working order won’t occur until we recognize those illusions for what they are. Step one might be to revise what “this idea called America” truly signifies.
A Report Card on the American Project
AL, FRED, AND HOMER’S AMERICA—AND MINE
Donald Trump was born in June 1946, the son of a wealthy New York real estate developer. I was born thirteen months later in Normal, Illinois. My parents, both World War II veterans, were anything but wealthy. At the time of my birth, my father was attending college on the GI Bill, with my mother, a former army nurse, working to keep our family afloat. In most respects, Trump and I had (and have) almost nothing in common.
Yet however the particulars may have differed, he and I were, in another sense, born in the same place, governed by certain identifiable propositions. Just then beginning to assume concrete form, those propositions informed post–World War II America. They described a way of life and defined what it meant to be an American. They conferred prerogatives and apportioned benefits. And not least of all, they situated the United States in the stream of history. Metaphysically, even though we have never met, Trump and I are kin—white heterosexual males who came of age at a time when white heterosexual males were granted first claim on all the privileges heralded by an American Century just then hitting its stride.
At the time of his birth and mine, ordinary Americans, whatever their race, gender, or sexual orientation, wanted nothing more than to move past the trials of the recent past, and the sooner the better. Mobilizing the nation for total war, a process directed from Washington, had taken years to accomplish. Demobilization, driven from the bottom up, occurred virtually overnight as the armed forces of the United States all but disintegrated. In the wake of Japan’s surrender in September 1945, an eruption of civil disobedience unlike any in U.S. history swept through the ranks of the armed forces, an event all the more remarkable in that it was without structure or leaders. America’s citizen soldiers were done with war and done with taking orders. With millions of GIs demanding to shed their uniforms and their loved ones echoing those demands, authorities in Washington had no option but to comply.1
Ever so briefly, the meaning of postwar freedom centered on getting out of the service and returning home. For vets, home meant the possibility of normalcy restored. While readjusting to civilian life might pose challenges, these could be overcome. The movie that dominated the Oscars in the year of Trump’s birth offered assurances on that score.
Directed by William Wyler and written by Robert Sherwood, The Best Years of Our Lives tells the story of three veterans—Al, Fred, and Homer—back from overseas just as the sweet taste of victory is beginning to give way to the vexations of everyday life. All three are eager to return to life in “Boone City” while simultaneously wary of what awaits them there. All three are white, their ethnic identity or religious affiliation indeterminate. All three bear the scars of war, whether physical or psychological. Yet they exude a decency that asks for little apart from a fair shake. They are three ordinary men who have surmounted extraordinary challenges: one a small-time banker returning from combat as an infantry platoon sergeant in the Pacific; the second, a soda jerk elevated to the rank of captain who served as a B-17 bombardier flying missions over Nazi Germany; the third, a young enlisted sailor who lost both hands due to a shipboard fire.
In the course of the film, each of the three protagonists encounters severe trials, which he surmounts through grit and determination (along with the help of a good woman). Implicit in the film’s gratifying message is this subtext: The hopes and dreams of these modest men are themselves modest. In the Middle America represented by Boone City, freedom isn’t gaudy. It does not put on airs or bridle against received norms. Freedom imparts direction and confers purpose.
In an immediate sense, Al, Fred, and Homer expect no more than what they believe they have earned. As Fred, the soda-jerk-turned-airman, puts it, “All I want is a good job, a mild future, a little house big enough for me and my wife—give me that and I’ll be rehabilitated all right.”2 But Wyler looks beyond whether or not returning vets can land a good job and afford a little house that’s big enough. His story’s several threads focus on this shared concern: whether intimate relationships shelved or torn asunder by war can be restored or, if not restored, replaced. Ultimately, he answers that question in the affirmative. By the time the film reaches its final scene, life’s “best years” may still lie ahead, an outcome that validates the political, cultural, and moral framework to which the movie itself testifies.
The point here is neither to denigrate nor to idealize that framework, merely to acknowledge its appeal. The Best Years of Our Lives depicts postwar American freedom at its point of origin, when verities still retained a semblance of permanence. That upon returning from a war that has turned their world upside down Al, Fred, and Homer should want things put back in place, returned to what they had been when they went away, is hardly surprising. Neither is their yearning for stability, predictability, and normalcy.
In Wyler’s Boone City, preexisting norms, not least of all those determining individual status, merit respect. “Freedom from” takes precedence over “freedom to.” Almost of necessity, access to this unpretentious Eden is therefore limited, with women allowed only auxiliary membership and people of color all but excluded. Despite such restrictions—or perhaps because of them—this cinematic portrait of postwar America in its very first days resonated with those willing to spend two bits for a ticket.
And why not? The film was a mirror, a depiction of place and people that conformed to what large numbers of ordinary Americans wished to see as they left behind one period of history and embarked upon another. It offered assurances that, despite the recent upheavals, nothing essential had changed. The satisfactions of life centered on a stable marriage, an intact family, and honest work remained readily available, especially to those with the good fortune to have been born white, male, and heterosexual. As the critic Robert Warshow, writing at the time in Partisan Review, put it, The Best Years of Our Lives offered a message of reassurance, “impressing the spectator with the dignity and meaningfulness of ‘typical’ American experience (his own experience) and making him feel a certain confidence that the problems of American life (his own problems) can be solved by the operation of ‘simple’ and ‘American’ virtues.”3
From our present-day vantage point, we may doubt that the America depicted in The Best Years of Our Lives ever actually existed. Yet those flocking to see the movie when it was first released believed otherwise. Through the ensuing decades of the postwar era, the real-life equivalents of Al, Fred, and Homer, including my own parents, if not perhaps Donald Trump’s, persisted in that belief. World War II—the Good War, even before that phrase came into common usage—remained a fixed point of reference, a lodestar. To preserve what the nation had won constituted a categorical imperative.
Foster and Henry Weigh In
Yet preservation was likely to require effort. Members of the policy elite were already insisting that the United States could ill afford to rest on its laurels. Just ahead lay new dangers that Americans dared not ignore. In the very week of Donald Trump’s birth, for example, Life magazine, then at the height of its influence, featured a lengthy essay by John Foster Dulles, offering his “Thoughts on Soviet Foreign Policy and What to Do About It.”4 Here was a sign that Boone City’s modest aspirations would not suffice.
Already exuding the authority of the secretary of state he was to become, Foster, as he was known to friends and colleagues, was a paragon of the Eastern foreign policy establishment. Less than a year before, World War II’s triumphal conclusion had brought to fruition that establishment’s fondest dreams, thrusting the United States into a position of global preeminence. Even so, Dulles’s perspective was unrelentingly grim. Although Nazi Germany was gone and Imperial Japan vanquished, the United States faced another comparable threat. The Kremlin, he charged, was already pressing to create a vast “Pax Sovietica.” Russia and America were on a collision course, with Soviet ambitions directly threatening all that Americans stood for and cherished. It was therefore incumbent upon the United States “to resist all expansive manifestations of Soviet policy.” Failure to do so invited the ultimate disaster. “Assume that Soviet leaders cannot be brought to change their program,” Dulles wrote. The inevitable result would be a “drift into surrender or war.”
“If the past is any guide,” he added, “it will be war.” Averting such a terrible prospect was going to require concerted action or, as Dulles put it, “an affirmative demonstration that our society of freedom still has the qualities needed for survival.” Here, a mere nine months after V-J Day, was a blunt articulation of the theme employed with notable success over the next several decades to keep the rabble in line: Dark forces abroad posed an imminent threat to freedom’s very survival.
Dulles called upon Americans to confront this new peril head-on, making it “clear beyond peradventure that they are prepared to accept personal sacrifice to help keep freedom alive in the world.” The real-life counterparts of Al, Fred, and Homer might think that their work was done. John Foster Dulles held to another view: The struggle for freedom was only just beginning. Sustaining that struggle required the United States to take the lead in opposing Soviet totalitarianism.
A devout if dour Presbyterian, Dulles framed the task at hand in spiritual terms. To overcome godless adversaries would require that Americans remain a God-fearing people. Unless disciplined by faith, he warned, freedom becomes little more than an excuse for “self-gratification,” a temptation to which he suggested his countrymen were notably susceptible. “Under such circumstances,” Dulles cautioned, “freedom is dangerous.” Only by tempering the exercise of freedom could Americans ensure its preservation.
Yet the magazine in which Dulles’s sermon appeared preached quite a different gospel. Life was all about self-gratification. Dulles might urge his fellow citizens to submit to God’s will (and, by extension, Washington’s authority). For their part, the editors who assembled Life each week under the direction of publisher Henry Luce encouraged readers to do something else: grab with both hands all the happiness within reach now that the nation had survived both prolonged economic distress and global war. The issue dated June 10, 1946, containing Dulles’s prescription for foreign policy, was no exception.
On the cover, the young actress Donna Reed posed at her most fetching. Inside was Life’s usual mix of stories, running the gamut from natural disasters (flash floods along the Susquehanna) and lurid crime (a murderer on the loose in Texarkana) to oddities (a photographic essay of university students engaged in an experiment “to test their kisses for germs”) and vivid updates on the latest in fashion, fun, and politics.
The big spread of that week celebrated the postwar boom already transforming California into “the land of golden sunshine and golden opportunity.” Vets were flocking to the state through which so many had passed during the war years. Bustle and promise were everywhere, Life reported. “Walnut groves and peach orchards are being grubbed out to make way for housing projects, movie theaters, [and] drive-ins.” Fortune favored those with the moxie to seize it, including contractors converting abandoned streetcars into makeshift apartments rented out for $25 per week. The future of the Golden State glittered—or at least it did for the white ex-servicemen whose entrepreneurial élan Life chose to highlight.
All of this was standard Life boosterism, as was the advertising copy that enlivened almost every page and reinforced the message of material plenty available to all. Bracketing Dulles’s call to arms were ads for facial soap, shampoo, hair oil, mouthwash, cosmetics, deodorant, cologne, and other personal products. For those nursing complaints, there were remedies for headache, constipation, sunburn, and athlete’s foot.
Other ads touted the latest in nylon stockings, women’s undergarments, swimwear, and men’s shirts that were “handkerchief-soft” while “richly masculine.” For would-be sophisticates, Life offered whiskey favored by the “Men who Plan beyond Tomorrow.” For the harried, there were cigarettes, one ad depicting an agitated mother confronting her misbehaving teenager. “When junior’s fighting rates a scold,” the copy read, “Why be irritated? Light an Old Gold.”
Woven throughout was the promise of science providing Americans with longer, better, and more fulfilling lives. Thanks to “Eugenics in a Cornfield,” the Jolly Green Giant now guaranteed uniformity in each can of corn. With every kernel “bred right [and] grown right,” it was, according to the copywriters, “Planned Parenthood” applied to agriculture.
Offering further fulfillment of that promise were the latest in household gadgets, which touted ease, convenience, and an end to drudgery. Kitchen appliances meant a “new kind of freedom.” For diversion, Life promoted an array of radios, phonographs, and that novelty called television. For now, however, the automobile remained king—hence, the junked streetcars available for repurposing. A full-page ad in brilliant color proclaimed Packard’s Clipper sedan “America’s No. 1 Glamour Car!”
Elsewhere in the world, wartime exigencies had imposed rationing destined to continue for years. Life assured its readers that in America rationing was gone for good. John Foster Dulles might summon his fellow citizens to gird themselves for sacrifice, while choosing God above Mammon. At least implicitly, Life countered that sacrifice was becoming un-American. As for forgoing the delights of this world in order to gain entry into the next, that choice could be postponed or even finessed altogether.
William Wyler, John Foster Dulles, and the pages of Henry Luce’s Life represented three very different and arguably irreconcilable notions of what postwar American freedom entailed or allowed. The Boone City version, a Norman Rockwell painting on celluloid, centered on safeguarding hard-won gains. Its rendering of freedom was tied to an idealized past. Dulles conceived of freedom in terms of impending ideological struggle. Preserving it—an iffy proposition at best—was going to require fresh exertions on a sustained basis. For the editors and advertisers of Life, in contrast, freedom was material, finding expression in the cornucopia of goods flooding the American marketplace now that the war had ended. Freedom centered on satisfying a continually evolving array of appetites and desires.
From our present-day vantage point, we may find fault with all three of these conceptions. None gave more than scant attention to what subsequently emerged as the two most troubling moral issues of the era, namely, the Holocaust and Hiroshima. Americans had experienced World War II as a Manichean event pitting all that was good against all that was evil. Now that Hitler had been removed from the scene and with the United States, however briefly, enjoying a nuclear monopoly, they were disinclined to entertain second thoughts about the war’s origins, conduct, or legacy. What mattered most was its outcome.
These postwar versions of freedom fell short in other respects as well. In each, race, gender, and sexuality figured as the barest afterthought or not at all. None of them gave serious attention to environmental concerns or human rights, as we understand such matters today. As for diversity, inclusiveness, or multiculturalism—issues now at the forefront of American politics—even the terms were then alien.
Copyright © 2020 by Andrew Bacevich
The Age of Illusions
Let us stipulate at the outset that Donald Trump is a vulgar and dishonest fraud without a principled bone in his corpulent frame. Yet history is nothing if not a tale overflowing with irony. Despite his massive shortcomings, President Trump appears intent on recalibrating America’s role in the world. Initiating a long-overdue process of aligning U.S. policy with actually existing global conditions just may prove to be his providentially anointed function. Go figure.
The Valhalla of the Indispensable Nation is a capacious place, even if it celebrates mostly white and mostly male diversity. Recall that in the eighteenth century, it was a slaveholding planter from Virginia who secured American independence. In the nineteenth, an ambitious homespun lawyer from Illinois destroyed slavery, thereby clearing the way for his country to become a capitalist behemoth. In the middle third of the twentieth century, a crippled Hudson River grandee delivered the United States to the summit of global power. In that century’s difficult later decades, a washed-up movie actor declared that it was “morning in America” and so, however briefly, it seemed to be. Now, in the twenty-first century, to inaugurate the next phase of the American story, history has seemingly designated as its agent a New York real estate developer, casino bankruptee, and reality TV star.
In all likelihood, George Washington, Abraham Lincoln, Franklin Delano Roosevelt, and Ronald Reagan would balk at having Donald Trump classified as their peer. Yet, however preposterously, in our present moment of considerable crisis, he has succeeded them as the nation’s Great Helmsman, albeit one with few ideas about what course to set. Yet somehow Trump has concluded that our existing course has the United States headed toward the rocks. He just might be right.
“Great nations do not fight endless wars.” So the president announced in his 2019 State of the Union Address. Implicit in such a seemingly innocuous statement was a genuinely radical proposition, as laden with portent as Lincoln’s declaration in 1858 that a house divided cannot stand. Donald Trump appears determined to overturn the prevailing national security paradigm, even if he is largely clueless about what should replace it.
Much as Southerners correctly discerned the import of Lincoln’s veiled threat, so, too, have Trump’s many critics within the national security apparatus grasped the implications of his insistence that “endless wars” must indeed end. In the unlikely event that he ever delivers on his campaign promise to end the conflicts he inherited, all the claims, assumptions, and practices that together define the U.S. national security praxis will become subject to reexamination. Tug hard enough on this one dangling thread — the wars that drag on and on — and the entire fabric may well unravel.
The Decalogue Plus One
In other words, to acknowledge the folly of this country’s endless wars will necessarily call into question the habits that people in and around Washington see as the essence of “American global leadership.” Prominent among these are: (1) positioning U.S. forces in hundreds of bases abroad; (2) partitioning the whole planet into several contiguous regional military commands; (3) conferring security guarantees on dozens of nations, regardless of their ability to defend themselves or the values to which they subscribe; (4) maintaining the capability to project power to the remotest corners of the earth; (5) keeping in instant readiness a “triad” of nuclear strike forces; (6) endlessly searching for “breakthrough technologies” that will eliminate war’s inherent risks and uncertainties; (7) unquestioningly absorbing the costs of maintaining a sprawling national security bureaucracy; (8) turning a blind eye to the corrupting influence of the military-industrial complex; and easily outpacing all other nations, friend and foe alike, in (9) weapons sales and (10) overall military spending.
Complementing this Decalogue, inscribed not on two tablets but in thousands of pages of stupefyingly bureaucratic prose, is an unwritten eleventh commandment: Thou shalt not prevent the commander-in-chief from doing what he deems necessary. Call it all D+1. In theory, the Constitution endows Congress with the authority to prevent any president from initiating, prolonging, or expanding a war. In practice, Congress has habitually deferred to an increasingly imperial presidency and treated the war-powers provisions of the Constitution as non-binding.
This Decalogue-plus-one has been with us for decades. It first emerged during the early phases of the Cold War. Its godfathers included such distinguished (if today largely forgotten) figures as Paul Nitze, principal author of a famously unhinged policy paper known as NSC-68, and General Curtis LeMay, who transformed the Strategic Air Command into a “cocked weapon” capable of obliterating humankind.
During the 1960s, better-dead-than-Red began to fall from favor and a doctrine of “flexible response” became all the rage. In those years, as an approach to waging, and therefore perpetuating the Cold War, D+1 achieved maturity. At that very juncture, the search for fresh thinking to justify existing policies vaulted the likes of Robert McNamara and Maxwell Taylor into positions of authority as secretary of defense and chairman of the Joint Chiefs of Staff.
The Vietnam War put the American military establishment’s capacity for flexibility to the test. That test did not go well, with Secretary McNamara and General Taylor prominent among the officials whose reputations did not survive. Remarkably, however, amid the carnage of that war, D+1 did survive all but unscathed. Vietnam was surely a debacle, but as long as the Cold War persisted, asking first-order questions about the basic organization of “national security” appeared just too risky. So the Decalogue emerged with hardly a scratch. Notwithstanding the disappointing presidencies of Lyndon Johnson and Richard Nixon, so, too, did the Eleventh Commandment.
More striking still, even after the fall of the Berlin Wall, D+1 persisted. Thirty years ago this month when the Cold War ended, everyone agreed that a new era of global affairs was dawning. The Soviet Union, the threat that had prompted the creation of the Decalogue, had vanished. Yet without missing a beat, a new generation of Nitzes and LeMays, McNamaras and Taylors devised an altogether different rationale for preserving their predecessors’ handiwork.
That new rationale was nothing if not expansive. During the Cold War, the overarching purpose of D+1 had been to avert the ultimate disaster of Armageddon. Its revised purpose was to promote the ultimate goal of remaking the world in America’s image. With a “sole superpower” now presiding over the international order, D+1 offered a recipe for simultaneously cementing permanent U.S. primacy and securing the universal triumph of American values. So, at least, members of an intoxicated foreign policy elite persuaded themselves.
Yet in the wake of the Cold War came not peace and harmony but unprecedented U.S. military activism. Here was the common theme of the otherwise disparate presidencies of George H.W. Bush, Bill Clinton, George W. Bush, and Barack Obama. During the quarter-century that elapsed between the fall of the Berlin Wall and the election of Donald Trump, the United States intervened in or attacked Panama, Iraq, Somalia, Haiti, Bosnia, Kosovo, Afghanistan, Sudan, Afghanistan (again), Iraq (again), Libya, Somalia (again), Yemen, Syria, several West African nations, and, briefly, Pakistan. And given a presidential preference for employing Special Operations forces on highly classified missions, that list is almost surely incomplete. Simply put, reticence regarding the use of force vanished.
As for the Eleventh Commandment, it now achieved a status comparable to the doctrine of papal infallibility. After 9/11, Congress quickly passed an open-ended Authorization to Use Military Force (AUMF), empowering the president “to take action to deter and prevent acts of international terrorism against the United States.” Of course, “terrorism,” as we are frequently reminded by the likes of Benjamin Netanyahu, Vladimir Putin, and Recep Tayyip Erdogan, is very much in the eyes of the beholder. In effect, Congress had simply handed the commander-in-chief a blank check.
That AUMF became law on September 18, 2001, following a unanimous vote in the Senate and with only a single member dissenting in the House of Representatives. In the 18 years since, it has shown both remarkable durability and elasticity. Best illustrating its durability have been the wars launched under its auspices. Best illustrating its elasticity was Barack Obama’s “disposition matrix,” a secret procedure devised by his administration empowering him to order the killing of just about anyone anywhere on the planet deemed to pose a threat to the United States. All of this transpired with the cool deliberation and thorough consultation that was an Obama signature. Acting pursuant to the provisions of that AUMF, in other words, Obama codified assassination as an integral component of U.S. policy. In Washington, war thereby became a permanent undertaking that recognized no boundaries.
In or Out? Old or New?
Read the papers or watch cable news and you might conclude that the pivotal issue of our moment is the fate of Syria’s Kurds, with the United States military deemed uniquely responsible for ensuring their wellbeing. Yet while such a conclusion may play well with our troubled consciences — and troubled they certainly should be — it is radically misleading.
True enough, Trump’s abrupt abandonment of the Kurds qualifies as cruel, callous, and immoral. It also ranks as only the latest in a long string of such American betrayals, as various Native American tribes, Chinese Nationalists, Cuban exiles, South Vietnamese, and prior generations of Kurds (among others) can testify. So Trump has not exactly broken with past precedent.
More to the point, the matter at hand relates less to the Kurds than to a far larger question: Should the United States perpetuate the military enterprise commonly but misleadingly referred to as the “global war on terrorism”? Or should the United States recognize that this so-called GWOT has failed and consider a different approach to policy? Given that the GWOT represents D+1 applied to the Greater Middle East, “different” implies a wholesale reexamination of basic national security policy. It’s that prospect that worries the foreign policy establishment.
With the GWOT’s 20th anniversary now within hailing distance, we are in a position to evaluate just what that war has actually achieved. Honest differences of opinion may be possible, but in my judgment the results rank somewhere between disappointing and catastrophic. This much is certain: we have not won and victory is nowhere in sight.
Granted, Iraq’s Saddam Hussein is gone, as is Libya’s Muammar Gaddafi, both of them guilty of terrible crimes (although innocent of any direct involvement in 9/11). For the moment at least, the repressive Taliban do not rule in Kabul. And Osama bin Laden and Abu Bakr al-Baghdadi are dead. Proponents of the GWOT and of D+1 can point to these as positive achievements.
Yet widen the aperture slightly and the outcome appears less impressive. George W. Bush’s much-ballyhooed Freedom Agenda came to naught. Regime change in Kabul, Baghdad, and Tripoli produced not liberal democracy but chronic instability, pervasive corruption, and endemic violence. In Afghanistan, the Taliban never admitted defeat and today threaten the Western-installed Afghan government. Rather than affirming American military mastery and benign intentions, the reckless and illegal invasion of Iraq, advertised under the banner of Operation Iraqi Freedom, became a gift to our adversaries. If anyone can be said to have won the Iraq War, that honor must surely belong to the Islamic Republic of Iran. Worse still, by upending the existing regional order, U.S. forces created a power vacuum that facilitated the emergence of new terrorist movements like ISIS.
America’s ongoing post-9/11 wars deserve to be called “endless” because, despite contributing to hundreds of thousands of deaths and squandering trillions of dollars over the course of many years, the United States has come nowhere close to fulfilling its declared political aims. The plight of the Kurds in Syria offers a small but telling illustration of the magnitude of that failure.
Now the president of the United States, acting pursuant to the authority granted him by the Eleventh Commandment, says he wants to call it quits. It’s like Adam in the Garden of Eden: the one thing he’s forbidden to do, he does — or in Trump’s case makes a show of intending to do at least.
In response, in a show of near-unanimity Democratic and Republican defenders of the Decalogue Plus One insist that President Trump may not do what he declares himself intent on doing. Recall that George W. Bush’s doctrine of preventive war — sometimes disguised as “anticipatory self-defense” — elicited only modest opposition at best, largely along partisan lines. Much the same can be said of Barack Obama’s self-appointment as assassin-in-chief. But Donald Trump’s declared intention to withdraw U.S. troops from Syria as a preliminary step toward reducing our regional military presence has elicited bipartisan condemnation expressed in the strongest terms.
Senate Majority Leader Mitch McConnell, typically the president’s most stalwart defender, took to the pages of the Washington Post to denounce Trump’s decision in no uncertain terms. Riddled with half-truths and hyperbole, his op-ed qualifies as a model of “fake news.” Yet credit McConnell with this much: he understands that, in the dispute between Trump and the foreign policy establishment, the fate of Syria’s Kurds rates as no more than incidental.
The real issue, according to McConnell, is preserving “the post-World War II international system” that, he asserts, “has sustained an unprecedented era of peace, prosperity, and technological development.” Furthermore, having created that system, the United States remains “its indispensable nation,” a phrase introduced by Madeleine Albright and Bill Clinton in the 1990s. Preserving that system’s benefits requires keeping faith with the Kurds, maintaining the U.S. military presence throughout the Middle East, and above all preserving the established framework of national security policy. In short, compliance with the Decalogue is mandatory. Even (or especially) presidents must obey.
Now, if you believe that the world we live in today does not differ in any significant way from the one that existed in the wake of World War II, McConnell’s argument might just possess some merit. Yet back then, the American economy led the pack in every conceivable measure. America’s European allies had been ravaged by war and desperately needed U.S. assistance. Both they and the defeated Axis powers, Germany and Japan, appeared vulnerable to the siren song of Communism.
To some observers, the Soviet Union appeared intent on taking over the world. China was poor, weak, backward, and divided. Imperial powers like Great Britain, France, and the Netherlands still clung to the illusion that they could keep a lid on demands for national self-determination in South and Southeast Asia, the Middle East, and Africa. Nuclear weapons offered a source of reassurance rather than concern — apart from the United States no one had them. Finally, that a climate crisis attributable to human activity might one day cause grievous harm on a planetary scale was literally beyond imagining.
Time has rendered every bit of this inoperative. McConnell’s “post-World War II international system” is now a fantasy about as relevant to contemporary reality as belief in the tooth fairy.
In what may be the sole redeeming feature of his otherwise abysmal presidency, Trump appears determined to blow the whistle on this charade. Sadly, his efforts do not extend much beyond making noise. Even the troop withdrawals that he announces with such fanfare tend to result in little more than repositioning within the region rather than redeployment back to the United States. Worse still, the motley band of mediocrities who surround the president consists almost entirely of believers in D+1. In his impulsive and ignorant way, Trump wants change; they oppose it.
As a result, diplomatic initiatives that might actually open a pathway to ending endless wars — negotiating the restoration of normal diplomatic relations with Tehran, for example, or curtailing weapons sales (and giveaways) to nations that use U.S.-manufactured arms to create mayhem, or demonstrating leadership by declaring a no-first-use policy on nuclear weapons — don’t even qualify for discussion. So Trump is left to flail about on his own, haplessly posing legitimate questions that he is incapable of answering.
The fears of the Decalogue’s defenders are not misplaced: Syria is the loose tip of a dangling thread. Give that thread a good yank and the entire moth-eaten fabric of U.S. national security policy just might become undone. Yet it will take someone with greater determination, consistency, and strength of character than Donald Trump to perform this necessary task.
There is blood in the water and frenzied sharks are closing in for the kill. Or so they think.
From the time of Donald Trump’s election, American elites have hungered for this moment. At long last, they have the 45th president of the United States cornered. In typically ham-handed fashion, Trump has given his adversaries the very means to destroy him politically. They will not waste the opportunity. Impeachment now — finally, some will say — qualifies as a virtual certainty.
No doubt many surprises lie ahead. Yet the Democrats controlling the House of Representatives have passed the point of no return. The time for prudential judgments — the Republican-controlled Senate will never convict, so why bother? — is gone for good. To back down now would expose the president’s pursuers as spineless cowards. The New York Times, the Washington Post, CNN, and MSNBC would not soon forgive such craven behavior.
So, as President Woodrow Wilson, speaking in 1919, put it, “The stage is set, the destiny disclosed. It has come about by no plan of our conceiving, but by the hand of God.” Of course, the issue back then was a notably weighty one: whether to ratify the Versailles Treaty. That it now concerns a “Mafia-like shakedown” orchestrated by one of Wilson’s successors tells us something about the trajectory of American politics over the course of the last century, and it has not been a story of ascent.
The effort to boot the president from office is certain to yield a memorable spectacle. The rancor and contempt that have clogged American politics like a backed-up sewer since the day of Donald Trump’s election will now find release. Watergate will pale by comparison. The uproar triggered by Bill Clinton’s “sexual relations” will seem a mere squabble. A de facto collaboration between Trump, those who despise him, and those who despise his critics all but guarantees that this story will dominate the news, undoubtedly for months to come.
As this process unspools, what politicians like to call “the people’s business” will go essentially unattended. So while Congress considers whether or not to remove Trump from office, gun-control legislation will languish, the deterioration of the nation’s infrastructure will proceed apace, needed healthcare reforms will be tabled, the military-industrial complex will waste yet more billions, and the national debt, already at $22 trillion — larger, that is, than the entire economy — will continue to surge. The looming threat posed by climate change, much talked about of late, will proceed all but unchecked. For those of us preoccupied with America’s role in the world, the obsolete assumptions and habits undergirding what’s still called “national security” will continue to evade examination. Our endless wars will remain endless and pointless.
By way of compensation, we might wonder what benefits impeachment is likely to yield. Answering that question requires examining four scenarios that describe the range of possibilities awaiting the nation.
The first and most to be desired (but least likely) is that Trump will tire of being a public piñata and just quit. With the thrill of flying in Air Force One having worn off, being president can’t be as much fun these days. Why put up with further grief? How much more entertaining for Trump to retire to the political sidelines where he can tweet up a storm and indulge his penchant for name-calling. And think of the “deals” an ex-president could make in countries like Israel, North Korea, Poland, and Saudi Arabia on which he’s bestowed favors. Cha-ching! As yet, however, the president shows no signs of taking the easy (and lucrative) way out.
The second possible outcome sounds almost as good but is no less implausible: a sufficient number of Republican senators rediscover their moral compass and “do the right thing,” joining with Democrats to create the two-thirds majority needed to convict Trump and send him packing. In the Washington of that classic twentieth-century film director Frank Capra, with Jimmy Stewart holding forth on the Senate floor and a moist-eyed Jean Arthur cheering him on from the gallery, this might have happened. In the real Washington of “Moscow Mitch” McConnell, think again.
The third somewhat seamier outcome might seem a tad more likely. It postulates that McConnell and various GOP senators facing reelection in 2020 or 2022 will calculate that turning on Trump just might offer the best way of saving their own skins. The president’s loyalty to just about anyone, wives included, has always been highly contingent, the people streaming out of his administration routinely making the point. So why should senatorial loyalty to the president be any different? At the moment, however, indications that Trump loyalists out in the hinterlands will reward such turncoats are just about nonexistent. Unless that base were to flip, don’t expect Republican senators to do anything but flop.
That leaves outcome number four, easily the most probable: while the House will impeach, the Senate will decline to convict. Trump will therefore stay right where he is, with the matter of his fitness for office effectively deferred to the November 2020 elections. Except as a source of sadomasochistic diversion, the entire agonizing experience will, therefore, prove to be a colossal waste of time and blather.
Furthermore, Donald Trump might well emerge from this national ordeal with his reelection chances enhanced. Such a prospect is belatedly insinuating itself into public discourse. For that reason, certain anti-Trump pundits are already showing signs of going wobbly, suggesting, for instance, that censure rather than outright impeachment might suffice as punishment for the president’s various offenses. Yet censuring Trump while allowing him to stay in office would be the equivalent of letting Harvey Weinstein off with a good tongue-lashing so that he can get back to making movies. Censure is for wimps.
Besides, as Trump campaigns for a second term, he would almost surely wear censure like a badge of honor. Keep in mind that Congress’s approval ratings are considerably worse than his. To more than a few members of the public, a black mark awarded by Congress might look like a gold star.
Not Removal But Restoration
So if Trump finds himself backed into a corner, Democrats aren’t necessarily in a more favorable position. And that ain’t the half of it. Let me suggest that, while Trump is being pursued, it’s you, my fellow Americans, who are really being played. The unspoken purpose of impeachment is not removal, but restoration. The overarching aim is not to replace Trump with Mike Pence — the equivalent of exchanging Groucho for Harpo. No, the object of the exercise is to return power to those who created the conditions that enabled Trump to win the White House in the first place.
Just recently, for instance, Hillary Clinton declared Trump to be an “illegitimate president.” Implicit in her charge is the conviction — no doubt sincere — that people like Donald Trump are not supposed to be president. People like Hillary Clinton — people possessing credentials like hers and sharing her values — should be the chosen ones. Here we glimpse the true meaning of legitimacy in this context. Whatever the vote in the Electoral College, Trump doesn’t deserve to be president and never did.
For many of the main participants in this melodrama, the actual but unstated purpose of impeachment is to correct this great wrong and thereby restore history to its anointed path.
In a recent column in the Guardian, Professor Samuel Moyn makes the essential point: Removing from office a vulgar, dishonest, and utterly incompetent president comes nowhere close to capturing what’s going on here. To the elites most intent on ousting Trump, far more important than anything he may say or do is what he signifies. He is a walking, talking repudiation of everything they believe and, by extension, of a future they had come to see as foreordained.
Moyn styles these anti-Trump elites as “centrists,” members of the post-Cold War political mainstream that allowed ample room for nominally conservative Bushes and nominally liberal Clintons, while leaving just enough space for Barack Obama’s promise of hope-and-(not-too-much) change.
These centrists share a common worldview. They believe in the universality of freedom as defined and practiced within the United States. They believe in corporate capitalism operating on a planetary scale. They believe in American primacy, with the United States presiding over a global order as the sole superpower. They believe in “American global leadership,” which they define as primarily a military enterprise. And perhaps most of all, while collecting degrees from Georgetown, Harvard, Oxford, Wellesley, the University of Chicago, and Yale, they came to believe in a so-called meritocracy as the preferred mechanism for allocating wealth, power, and privilege. All of these together constitute the sacred scripture of contemporary American political elites. And if Donald Trump’s antagonists have their way, his removal will restore that sacred scripture to its proper place as the basis of policy.
“For all their appeals to enduring moral values,” Moyn writes, “the centrists are deploying a transparent strategy to return to power.” Destruction of the Trump presidency is a necessary precondition for achieving that goal. “Centrists simply want to return to the status quo interrupted by Trump, their reputations laundered by their courageous opposition to his mercurial reign, and their policies restored to credibility.” Precisely.
High Crimes and Misdemeanors
For such a scheme to succeed, however, laundering reputations alone will not suffice. Equally important will be to bury any recollection of the catastrophes that paved the way for an über-qualified centrist to lose to an indisputably unqualified and unprincipled political novice in 2016.
Holding promised security assistance hostage unless a foreign leader agrees to do you political favors is obviously and indisputably wrong. Trump’s antics regarding Ukraine may even meet some definition of criminal. Still, how does such misconduct compare to the calamities engineered by the “centrists” who preceded him? Consider, in particular, the George W. Bush administration’s decision to invade Iraq in 2003 (along with the spin-off wars that followed). Consider, too, the reckless economic policies that produced the Great Recession of 2007-2008. As measured by the harm inflicted on the American people (and others), the offenses for which Trump is being impeached qualify as mere misdemeanors.
Honest people may differ on whether to attribute the Iraq War to outright lies or monumental hubris. When it comes to tallying up the consequences, however, the intentions of those who sold the war don’t particularly matter. The results include thousands of Americans killed; tens of thousands wounded, many grievously, or left to struggle with the effects of PTSD; hundreds of thousands of non-Americans killed or injured; millions displaced; trillions of dollars expended; radical groups like ISIS empowered (and in its case even formed inside a U.S. prison in Iraq); and the Persian Gulf region plunged into turmoil from which it has yet to recover. How do Trump’s crimes stack up against these?
The Great Recession stemmed directly from economic policies implemented during the administration of President Bill Clinton and continued by his successor. Deregulating the banking sector was projected to produce a bonanza in which all would share. Yet, as a direct result of the ensuing chicanery, nearly nine million Americans lost their jobs, while overall unemployment shot up to 10%. Roughly four million Americans lost their homes to foreclosure. The stock market cratered and millions saw their life savings evaporate. Again, the question must be asked: How do these results compare to Trump’s dubious dealings with Ukraine?
Trump’s critics speak with one voice in demanding accountability. Yet virtually no one has been held accountable for the pain, suffering, and loss inflicted by the architects of the Iraq War and the Great Recession. Why is that? As another presidential election approaches, the question not only goes unanswered, but unasked.
To win reelection, Trump, a corrupt con man (who jumped ship on his own bankrupt casinos, money in hand, leaving others holding the bag) will cheat and lie. Yet, in the politics of the last half-century, these do not qualify as novelties. (Indeed, apart from being the son of a sitting U.S. vice president, what made Hunter Biden worth $50Gs per month to a gas company owned by a Ukrainian oligarch? I’m curious.) That the president and his associates are engaging in a cover-up is doubtless the case. Yet another cover-up proceeds in broad daylight on a vastly larger scale. “Trump’s shambolic presidency somehow seems less unsavory,” Moyn writes, when considering the fact that his critics refuse “to admit how massively his election signified the failure of their policies, from endless war to economic inequality.” Just so.
What are the real crimes? Who are the real criminals? No matter what happens in the coming months, don’t expect the Trump impeachment proceedings to come within a country mile of addressing such questions.
The Real Cover-Up
When the conflict that the Vietnamese refer to as the American War ended in April 1975, I was a U.S. Army captain attending a course at Fort Knox, Kentucky. In those days, the student body at any of our Army’s myriad schools typically included officers from the Army of the Republic of Vietnam (ARVN).
Since ARVN’s founding two decades earlier, the United States had assigned itself the task of professionalizing that fledgling military establishment. Convinced that the standards, methods, and ethos of our armed forces were universally applicable and readily exportable, we believed that the attendance of ARVN personnel at such Army schools contributed to that effort.
Evidence that the U.S. military’s own professional standards had recently taken a hit — memories of the My Lai massacre were then still fresh — elicited no second thoughts on our part. Association with American officers like me was sure to rub off on our South Vietnamese counterparts in ways that would make them better soldiers. So we professed to believe, even while subjecting that claim to no more scrutiny than we did the question of why most of us had spent a year or more of our lives participating in an obviously misbegotten and misguided war in Indochina.
For serving officers at that time one question in particular remained off-limits (though it had been posed incessantly for years by antiwar protestors in the streets of America): Why Vietnam? Prizing compliance as a precondition for upward mobility, military service rarely encourages critical thinking.
On the day that Saigon, the capital of the Republic of Vietnam, fell and that country ceased to exist, I approached one of my ARVN classmates, also a captain, wanting at least to acknowledge the magnitude of the disaster that had occurred. “I’m sorry about what happened to your country,” I told him.
I did not know that officer well and no longer recall his name. Let’s call him Captain Nguyen. In my dim recollection, he didn’t even bother to reply. He simply looked at me with an expression both distressed and mournful. Our encounter lasted no more than a handful of seconds. I then went on with my life and Captain Nguyen presumably with his. Although I have no inkling of his fate, I like to think that he is now retired in Southern California after a successful career in real estate. But who knows?
All I do know is that today I recall our exchange with a profound sense of embarrassment and even shame. My pathetic effort to console Captain Nguyen had been both presumptuous and inadequate. Far worse was my failure — inability? refusal? — to acknowledge the context within which that catastrophe was occurring: the United States and its armed forces had, over years, inflicted horrendous harm on the people of South Vietnam.
In reality, their defeat was our defeat. Yet while we had decided that we were done paying, they were going to pay and pay for a long time to come.
Rather than offering a fatuous expression of regret for the collapse of his country, I ought to have apologized for having played even a minuscule role in what was, by any measure, a catastrophe of epic proportions. It’s a wonder Captain Nguyen didn’t spit in my eye.
I genuinely empathized with Captain Nguyen. Yet the truth is that, along with most other Americans, soldiers and civilians alike, I was only too happy to be done with South Vietnam and all its troubles. Dating back to the presidency of Dwight D. Eisenhower, the United States and its armed forces had made a gargantuan effort to impart legitimacy to the Republic of Vietnam and to coerce the Democratic Republic of Vietnam to its north into giving up its determination to exercise sovereignty over the entirety of the country. In that, we had failed spectacularly and at a staggering cost.
“Our” war in Indochina — the conflict we chose to call the Vietnam War — officially ended in January 1973 with the signing in Paris of an “Agreement Ending the War and Restoring Peace in Vietnam.” Under the terms of that fraudulent pact, American prisoners of war were freed from captivity in North Vietnam and the last U.S. combat troops in the south left for home, completing a withdrawal begun several years earlier. Primary responsibility for securing the Republic of Vietnam thereby fell to ARVN, long deemed by U.S. commanders incapable of accomplishing that mission.
Meanwhile, despite a nominal cessation of hostilities, approximately 150,000 North Vietnamese regulars still occupied a large swathe of South Vietnamese territory — more or less the equivalent of agreeing to end World War II when there were still several German panzer divisions lurking in Belgium’s Ardennes Forest. In effect, our message to our enemy and our ally was this: We’re outta here; you guys sort this out. In a bit more than two years, that sorting-out process would extinguish the Republic of Vietnam.
Been There, Done That
The course Captain Nguyen and I were attending in the spring of 1975 paid little attention to fighting wars like the one that, for years, had occupied the attention of my army and his. Our Army, in fact, was already moving on. Having had their fill of triple-canopy jungles in Indochina, America’s officer corps now turned to defending the Fulda Gap, the region in West Germany deemed most hospitable to a future Soviet invasion. As if by fiat, gearing up to fight those Soviet forces and their Warsaw Pact allies, should they (however improbably) decide to take on NATO and lunge toward the English Channel, suddenly emerged as priority number one. At Fort Knox and throughout the Army’s ranks, we were suddenly focused on “high-intensity combined arms operations” — essentially, a replay of World War II-style combat with fancier weaponry. In short, the armed forces of the United States had reverted to “real soldiering.”
And so it is again today. At the end of the 17th year of what Americans commonly call the Afghanistan War — one wonders what name Afghans will eventually assign it — U.S. military forces are moving on. Pentagon planners are shifting their attention back to Russia and China. Great power competition has become the name of the game. However we might define Washington’s evolving purposes in its Afghanistan War — “nation building,” “democratization,” “pacification” — the likelihood of mission accomplishment is nil. As in the early 1970s, so in 2019, rather than admitting failure, the Pentagon has chosen to change the subject and is once again turning its attention to “real soldiering.”
Remember the infatuation with counterinsurgency (commonly known by its acronym COIN) that gripped the national security establishment around 2007 when the Iraq “surge” overseen by General David Petraeus briefly ranked alongside Gettysburg as a historic victory? Well, these days promoting COIN as the new American way of war has become, to put it mildly, a tough sell. Given that few in Washington will openly acknowledge the magnitude of the military failure in Afghanistan, the incentive for identifying new enemies in settings deemed more congenial becomes all but irresistible.
Only one thing is required to validate this reshuffling of military priorities. Washington needs to create the appearance, as in 1973, that it’s exiting Afghanistan on its own terms. What’s needed, in short, is an updated equivalent of that “Agreement Ending the War and Restoring Peace in Vietnam.”
Until last weekend, the signing of such an agreement seemed imminent. Donald Trump and his envoy, former ambassador to Afghanistan Zalmay Khalilzad, appeared poised to repeat the trick that President Richard Nixon and National Security Advisor Henry Kissinger pulled off in 1973 in Paris: pause the war and call it peace. Should fighting subsequently resume after a “decent interval,” it would no longer be America’s problem. Now, however, to judge by the president’s Twitter account — currently the authoritative record of U.S. diplomacy — the proposed deal has been postponed, or perhaps shelved, or even abandoned altogether. If National Security Advisor John Bolton has his way, U.S. forces might just withdraw in any case, without an agreement of any sort being signed.
Based on what we can divine from press reports, the terms of that prospective Afghan deal would mirror those of the 1973 Paris Accords in one important respect. It would, in effect, serve as a ticket home for the remaining U.S. and NATO troops still in that country (though for the present only the first 5,000 of them would immediately depart). Beyond that, the Taliban was to promise not to provide sanctuary to anti-American terrorist groups, even though the Afghan branch of ISIS is already firmly lodged there. Still, this proviso would allow the Trump administration to claim that it had averted any possible recurrence of the 9/11 terror attacks that were, of course, planned by Osama bin Laden while residing in Afghanistan in 2001 as a guest of the Taliban-controlled government. Mission accomplished, as it were.
Back in 1973, North Vietnamese forces occupying parts of South Vietnam neither disarmed nor withdrew. Should this new agreement be finalized, Taliban forces currently controlling or influencing significant swaths of Afghan territory will neither disarm nor withdraw. Indeed, their declared intention is to continue fighting.
In 1973, policymakers in Washington were counting on ARVN to hold off Communist forces. In 2019, almost no one expects Afghan security forces to hold off a threat consisting of both the Taliban and ISIS. In a final insult, just as the Saigon government was excluded from U.S. negotiations with the North Vietnamese, so, too, has the Western-installed government in Kabul been excluded from U.S. negotiations with its sworn enemy, the Taliban.
A host of uncertainties remain. As with the olive branches that President Trump has ostentatiously offered to Russia, China, and North Korea, this particular peace initiative may come to naught — or, given the approach of the 2020 elections, he may decide that Afghanistan offers his last best hope of claiming at least one foreign policy success. One way or another, in all likelihood, the deathwatch for the U.S.-backed Afghan government has now begun. One thing only is for sure. Having had their fill of Afghanistan, when the Americans finally leave, they won’t look back. In that sense, it will be Vietnam all over again.
What Price Peace?
However great my distaste for President Trump, I support his administration’s efforts to extricate the United States from Afghanistan. I do so for the same reason I supported the Paris Peace Accords of 1973. Prolonging this folly any longer does not serve U.S. interests. Rule number one of statecraft ought to be: when you’re doing something really stupid, stop. To my mind, this rule seems especially applicable when the lives of American soldiers are at stake.
In Vietnam, Washington wasted 58,000 of those lives for nothing. In Afghanistan, we have lost more than 2,300 troops, with another 20,000 wounded, again for next to nothing. Last month, two American Special Forces soldiers were killed in a firefight in Faryab Province. For what?
That said, I’m painfully aware of the fact that, on the long-ago day when I offered Captain Nguyen my feeble condolences, I lacked the imagination to conceive of the trials about to befall his countrymen. In the aftermath of the American War, something on the order of 800,000 Vietnamese took to open and unseaworthy boats to flee their country. According to estimates by the United Nations High Commissioner for Refugees, between 200,000 and 400,000 boat people died at sea. Most of those who survived were destined to spend years in squalid refugee camps scattered throughout Southeast Asia. Back in Vietnam itself, some 300,000 former ARVN officers and South Vietnamese officials were imprisoned in so-called reeducation camps for up to 18 years. Reconciliation did not rank high on the postwar agenda of the unified country’s new leaders.
Meanwhile, for the Vietnamese, north and south, the American War has in certain ways only continued. Mines and unexploded ordnance left from that war have inflicted more than 100,000 casualties since the last American troops departed. Even today, the toll caused by Agent Orange and other herbicides that the U.S. Air Force sprayed with abandon over vast stretches of territory continues to mount. The Red Cross calculates that more than one million Vietnamese have suffered health problems, including serious birth defects and cancers as a direct consequence of the promiscuous use of those poisons as weapons of war.
For anyone caring to calculate the moral responsibility of the United States for its actions in Vietnam, all of those would have to find a place on the final balance sheet. The 1.3 million Vietnamese admitted to the United States as immigrants since the American War formally concluded can hardly be said to make up for the immense damage suffered by the people of Vietnam as a direct or indirect result of U.S. policy.
As to what will follow if Washington does succeed in cutting a deal with the Taliban, well, don’t count on President Trump (or his successor for that matter) welcoming anything like 1.3 million Afghan refugees to the United States once a “decent interval” has passed. Yet again, our position will be: we’re outta here; you guys sort this out.
Near the end of his famed novel, The Great Gatsby, F. Scott Fitzgerald described two of his privileged characters, Tom and Daisy, as “careless people” who “smashed up things and creatures” and then “retreated back into their money or their vast carelessness” to “let other people clean up the mess they had made.” That description applies to the United States as a whole, especially when Americans tire of a misguided war. We are a careless people. In Vietnam, we smashed up things and human beings with abandon, only to retreat into our money, leaving others to clean up the mess in a distinctly bloody fashion.
Count on us, probably sooner rather than later, doing precisely the same thing in Afghanistan.
Reflections on “Peace” in Afghanistan
[Editorial note: This remnant of a manuscript, discovered in a vault near the coastal town of Walpole, Massachusetts, appears to have been part of a larger project, probably envisioned as an interpretive history of the United States since the year 2000. Only a single chapter, probably written near the midpoint of the twenty-first century, has survived. Whether the remainder of the manuscript has been lost or the author abandoned it before its completion is unknown.]
From our present vantage point, it seems clear that, by 2019, the United States had passed a point of no return. In retrospect, this was the moment when indications of things gone fundamentally awry should have become unmistakable. Although at the time much remained hidden in shadows, the historic pivot now commonly referred to as the Great Reckoning had commenced.
Even today, it remains difficult to understand why, given mounting evidence of a grave crisis, passivity persisted for so long across most sectors of society. An epidemic of anomie affected a large swath of the population. Faced with a blizzard of troubling developments, large and small, Americans found it difficult to put things into anything approximating useful perspective. Few even bothered to try. Fewer succeeded. As with predictions of cataclysmic earthquakes or volcanic eruptions, a not-in-my-lifetime mood generally prevailed.
During what was then misleadingly known as the Age of Trump, the political classes dithered. While the antics of President Donald Trump provoked intense interest — the word “intense” hardly covers the attention paid to him — they also provided a convenient excuse for letting partisan bickering take precedence over actual governance or problem solving of any sort. Meanwhile, “thought leaders” (a term then commonly used to describe pontificating windbags) indulged themselves with various pet projects.
In the midst of what commentators were pleased to call the Information Age, most ordinary Americans showed a pronounced affinity for trivia over matters of substance. A staggering number of citizens willingly traded freedom and privacy for convenience, bowing to the dictates of an ever-expanding array of personalized gadgetry. What was then called a “smartphone” functioned as a talisman of sorts, the electronic equivalent of a rosary or prayer beads. Especially among the young, separation from one’s “phone” for more than a few minutes could cause acute anxiety and distress. The novelty of “social media” had not yet worn off, with its most insidious implications just being discovered.
Divided, distracted, and desperately trying to keep up: these emerged then as the abiding traits of life in contemporary America. Craft beer, small-batch bourbon, and dining at the latest farm-to-table restaurant often seemed to matter more than the fate of the nation or, for that matter, the planet as a whole. But all that was about to change.
Scholars will undoubtedly locate the origins of the Great Reckoning well before 2019. Perhaps they will trace its source to the aftermath of the Cold War when American elites succumbed to a remarkable bout of imperial hubris, while ignoring (thanks in part to the efforts of Big Energy companies) the already growing body of information on the human-induced alteration of the planet, which came to be called “climate change” or “global warming.” While, generally speaking, the collective story of humankind unfolds along a continuum, by 2019 conditions conducive to disruptive change were forming. History was about to zig sharply off its expected course.
This disruption occurred, of course, within a specific context. During the first two decades of the twenty-first century, American society absorbed a series of punishing blows. First came the contested election of 2000, the president of the United States installed in office by a 5-4 vote of a politicized Supreme Court, which thereby effectively usurped the role of the electorate. And that was just for starters. Following in short order came the terrorist attacks of September 11, 2001, which the world’s (self-proclaimed) premier intelligence services failed to anticipate and the world’s preeminent military establishment failed to avert.
Less than two years later, the administration of George W. Bush, operating under the delusion that the ongoing war in Afghanistan was essentially won, ordered U.S. forces to invade Iraq, a nation that had played no part in the events of 9/11. The result of this patently illegal war of aggression would not be victory, despite the president’s almost instant “mission accomplished” declaration, but a painful replay of the quagmire that U.S. troops had experienced decades before in Vietnam. Expectations of Iraq’s “liberation” paving the way for a broader Freedom Agenda that would democratize the Islamic world came to naught. The Iraq War and other armed interventions initiated during the first two decades of the century ended up costing trillions of taxpayer dollars, while sowing the seeds of instability across much of the Greater Middle East and later Africa.
Then, in August 2005, Hurricane Katrina smashed into the Gulf Coast, killing nearly 2,000 Americans. U.S. government agencies responded with breathtaking ineptitude, a sign of things to come, as nature itself was turning increasingly unruly. Other natural disasters of unnatural magnitude followed. In 2007, to cite but one example, more than 9,000 wildfires in California swept through more than a million acres. Like swarms of locusts, fires now became an annual (and worsening) plague ravaging the Golden State and the rest of the West Coast. If this weren’t enough of a harbinger of approaching environmental catastrophe, the populations of honeybees, vital to American agriculture, began to collapse in these very same years.
Americans were, as it turned out, largely indifferent to the fate of honeybees. They paid far greater attention to the economy, however, which experienced its own form of collapse in 2008. The ensuing Great Recession saw millions thrown out of work and millions more lose their homes as a result of fraudulent mortgage practices. None of the perpetrators were punished. The administration of President Barack Obama chose instead to bail out offending banks and large corporations. Record federal deficits resulted, as the government abandoned once and for all even the pretense of trying to balance the budget. And, of course, the nation’s multiple wars dragged on and on and on.
Through all these trials, the American people more or less persevered. If not altogether stoic, they remained largely compliant. As a result, few members of the nation’s political, economic, intellectual, or cultural elites showed any awareness that something fundamental might be amiss. The two established parties retained their monopoly on national politics. As late as 2016, the status quo appeared firmly intact. Only with that year’s presidential election did large numbers of citizens signal that they had had enough: wearing red MAGA caps rather than wielding pitchforks, they joined Donald Trump’s assault on that elite and, thumbing their noses at Washington, installed a reality TV star in the White House.
To the legions who had found the previous status quo agreeable, Trump’s ascent to the apex of American politics amounted to an unbearable affront. They might tolerate purposeless, endless wars, approve virtually any level of funding for the military so unsuccessfully fighting them, and turn a blind eye to economic arrangements that fostered inequality on a staggering scale. They might respond to the accelerating threat posed by climate change with lip service and, at best, quarter-measures. But Donald Trump in the Oval Office? That they could not abide.
As a result, from the moment of his election, Trump dominated the American scene. Yet the outrage that he provoked, day in and day out, had this unfortunate side effect: it obscured developments that would in time prove to be of far more importance than the 45th American president himself. Like the “noise” masking signals that, if detected and correctly interpreted, might have averted Pearl Harbor in December 1941 or, for that matter, 9/11, obsessing about Trump caused observers to regularly overlook or discount matters far transcending in significance the daily ration of presidential shenanigans.
Here, then, is a very partial listing of some of the most important of those signals then readily available to anyone bothering to pay attention. On the eve of the Great Reckoning, however, they were generally treated as mere curiosities or matters of limited urgency — problems to be deferred to a later, more congenial moment.
Item: The reality of climate change was now indisputable. All that remained in question was how rapidly it would occur and the extent (and again rapidity) of the devastation that it would ultimately inflict.
Item: Despite everything that was then known about the dangers of further carbon emissions, the major atmospheric contributor to global warming, those emissions only continued to increase, notwithstanding the myriad conferences and agreements intended to curb them. (U.S. carbon emissions, in particular, were still rising then, and global emissions were expected to rise by record or near-record amounts as 2019 began.)
Item: The polar icecap was disappearing, with scientists reporting that it had melted more in just 20 years than in the previous 10,000. This, in turn, meant that sea levels would continue to rise at record rates, posing an increasing threat to coastal cities.
Item: Approximately eight million metric tons of plastic were seeping into the world’s oceans each year, from the ingestion of which vast numbers of seabirds, fish, and marine mammals were dying annually. Payback would come in the form of microplastics contained in seafood consumed by humans.
Item: With China and other Asian countries increasingly refusing to accept American recyclables, municipalities in the United States found themselves overwhelmed by accumulations of discarded glass, plastic, metal, cardboard, and paper. That year, the complete breakdown of the global recycling system already loomed as a possibility.
All of these fall into the category of what we recognize today as planetary issues of existential importance. But even in 2019 there were other matters of less than planetary significance that ought to have functioned as a wake-up call. Among them were:
Item: With the federal government demonstrably unable to secure U.S. borders, immigration authorities were seizing hundreds of thousands of migrants annually. By 2019, the Trump administration was confining significant numbers of those migrants, including small children, in what were, in effect, concentration camps.
Item: Cybercrime had become a major growth industry, on track to rake in $6 trillion annually by 2021. Hackers were already demonstrating the ability to hold large American cities hostage and the authorities proved incapable of catching up.
Item: With the three richest Americans — Jeff Bezos, Bill Gates, and Warren Buffett — controlling more wealth than the bottom 50% of the entire population, the United States had become a full-fledged oligarchy. While politicians occasionally expressed their dismay about this reality, prior to 2019 it was widely tolerated.
Item: As measured by roads, bridges, dams, or public transportation systems, the nation’s infrastructure was strikingly inferior to what it had been a half-century earlier. (By 2019, China, for instance, had built more than 19,000 miles of high-speed rail; the U.S., not one.) Agreement that this was a problem that needed fixing was universal; corrective action (and government financing), however, was not forthcoming.
Item: Military spending in constant dollars exceeded what it had been at the height of the Cold War when the country’s main adversary, the Soviet Union, had a large army with up-to-date equipment and an arsenal of nuclear weapons. In 2019, Iran, the country’s most likely adversary, had a modest army and no nuclear weapons.
Item: Incivility, rudeness, bullying, and general nastiness had become rampant, while the White House, once the site of solemn ceremony, deliberation, and decision, played host to politically divisive shouting matches and verbal brawls.
To say that Americans were oblivious to such matters would be inaccurate. Some were, for instance, considering a ban on plastic straws. Yet taken as a whole, the many indications of systemic and even planetary dysfunction received infinitely less popular attention than the pregnancies of British royals, the antics of the justifiably forgotten Kardashian clan, or fantasy football, a briefly popular early twenty-first century fad.
Of course, decades later, viewed with the benefit of hindsight, the implications of these various trends and data points seem painfully clear: the dominant ideological abstraction of late postmodernity — liberal democratic capitalism — was rapidly failing or had simply become irrelevant to the challenges facing the United States and the human species as a whole. To employ another then-popular phrase, liberal democratic capitalism had become an expression of “fake news,” a scam sold to the many for the benefit of the privileged few.
“Toward the end of an age,” historian John Lukacs (1924-2019) once observed, “more and more people lose faith in their institutions and finally they abandon their belief that these institutions might still be reformed from within.” Lukacs wrote those words in 1970, but they aptly described the situation that had come to exist in that turning-point year of 2019. Basic American institutions — the overworked U.S. military being a singular exception — no longer commanded popular respect.
In essence, the postmodern age was ending, though few seemed to know it — with elites, in particular, largely oblivious to what was occurring. What would replace postmodernity in a planet heading for ruin remained to be seen.
[Editor’s note: Here the account breaks off.]
The Great Reckoning
How best to describe the recently completed allied commemoration of the 75th anniversary of the D-Day invasion of France? Two words come immediately to mind: heartfelt and poignant. The aged D-Day veterans gathering for what was probably the last time richly deserved every bit of praise bestowed on them. Yet one particular refrain that has become commonplace in this age of Donald Trump was absent from the proceedings. I’m referring to “fake news.” In a curious collaboration, Trump and the media, their normal relationship one of mutual loathing, combined forces to falsify the history of World War II. Allow me to explain.
In a stirring presentation, Donald Trump — amazingly — rose to the occasion and captured the spirit of the moment, one of gratitude, respect, even awe. Ever so briefly, the president sounded presidential. In place of his usual taunts and insults, he managed a fair imitation of Ronald Reagan’s legendary “Boys of Pointe du Hoc” speech of 1984. “We are gathered here on Freedom’s Altar,” Trump began — not exactly his standard introductory gambit.
Then, in a rare display of generosity toward people who were neither Republicans nor members of his immediate family, Trump acknowledged the contributions of those who had fought alongside the G.I.s at Normandy, singling out Brits, Canadians, Poles, Norwegians, Australians, and members of the French resistance for favorable mention. He related moving stories of great heroism and paid tribute to the dwindling number of D-Day veterans present. And as previous presidents had done on similar occasions marking D-Day anniversaries, he placed the events of that day in a reassuringly familiar historical context:
“The blood that they spilled, the tears that they shed, the lives that they gave, the sacrifice that they made, did not just win a battle. It did not just win a war. Those who fought here won a future for our nation. They won the survival of our civilization. And they showed us the way to love, cherish, and defend our way of life for many centuries to come.”
Nor was that all. “Today, as we stand together upon this sacred Earth,” Trump concluded,
“We pledge that our nations will forever be strong and united. We will forever be together. Our people will forever be bold. Our hearts will forever be loyal. And our children, and their children, will forever and always be free.”
Strong and united, together, bold, loyal, and free… forever.
It was, in its way, an astonishing performance, all the more so because it was entirely out of character. It was as if Secretary of State Mike Pompeo had published a book of sonnets or National Security Advisor John Bolton had performed a serviceable rendition of “Nessun dorma” on the steps of the Lincoln Memorial — wonderful in its way but, given the source, startling as well.
Selective Remembering and Convenient Forgetting
If the purpose of Trump’s speech was to make his listeners feel good, he delivered. Yet in doing so, he also relieved them of any responsibility for thinking too deeply about the event being commemorated.
Now, let me just say that I hold no brief for Josef Stalin, the Soviet Union, or Marxism-Leninism. Yet you don’t need to be an apologist for Communism to acknowledge that the Normandy invasion would never have succeeded had it not been for the efforts of Marshal Stalin’s Red Army. For three full years before the first wave of G.I.s splashed ashore at Omaha Beach, Russian troops had been waging a titanic struggle along a vast front in their own devastated land against the cream of the German military machine.
One data point alone summarizes the critical nature of the Soviet contribution: in May 1944, there were some 160 German divisions tied up on the Eastern Front. That represented more than two-thirds of the armed might of the Third Reich, 160 combat divisions that were therefore unavailable for commitment against the Anglo-American forces desperately trying to establish a foothold in Normandy.
As has been the custom for quite some time now, the German chancellor, representing the defeated enemy, attended the D-Day anniversary festivities as an honored guest. Angela Merkel’s inclusion testifies to an admirable capacity to forgive without forgetting.
Russian President Vladimir Putin did not, however, make the guest list. In liberal circles, Putin has, of course, made himself persona non grata. Yet excluding him obviated any need for Trump and other dignitaries in attendance to acknowledge, even indirectly, the Soviet role in winning World War II. Although the Red Army was never known for finesse or artfulness, it did kill an estimated four million of Merkel’s countrymen, who were thereby not on hand to have a go at killing Donald Trump’s countrymen.
If war is ultimately about mayhem and murder, then the Soviet Union did more than any other belligerent to bring about the final victory against Nazi Germany. Without for a second slighting the courage and contributions of our Canadian, Polish, Norwegian, and Australian comrades — bless them all — it was the Red Army that kept General Dwight Eisenhower’s expeditionary command from being pushed back into the Channel. In other words, thank God for the godless communists.
So, however heartfelt and poignant, the 75th anniversary of the D-Day landings was an exercise in selective remembering and convenient forgetting. It was, in other words, propaganda or, in contemporary parlance, fake news. The deception — for that’s what it was — did not escape the notice of Russian commentators. Yet members of the American media, otherwise ever alert to Trump’s sundry half-truths and outright deceptions, chose to ignore, or more accurately to endorse, this whopper.
Time to Get Over the Hangover?
How much does such selective remembering and convenient forgetting matter? A lot, in my estimation. Distorting the past distorts the present and sows confusion about the problems we actually face.
For a small illustration of the implications of this particular elision of history, we need look no further than the D-Day anniversary-inspired ruminations of New York Times columnist Bret Stephens. The purpose of his column, which appeared on June 7th, was to spin the spin. Stephens was intent on reinforcing Trump’s carefully edited interpretation of World War II in order to further his own version of a crusading and militarized American foreign policy agenda.
Now, the war against Adolf Hitler occurred a considerable time ago. The war against Iraqi autocrat Saddam Hussein is a far more recent memory. Which should have greater relevance for U.S. policy today? On that score, Stephens is quite clear: it’s the “lessons” of World War II, not of the reckless invasion of Iraq, that must pertain, not only today but in perpetuity. Sure, the Iraq War turned out to be a bit of a headache. “But how long,” Stephens asks, “should the hangover last?” Time to take an Alka-Seltzer and get back to smiting evildoers, thereby keeping alive the ostensible tradition of the Greatest Generation.
“If we really wanted to honor the sacrifices of D-Day,” Stephens writes, “we would do well to learn again what it is the Allies really fought for.” According to him, they fought “not to save the United States or even Britain,” but to liberate all of Europe; not to defeat Nazi Germany, “but to eradicate a despicable ideology”; and “not to subsume our values under our interests but to define our interests according to our values.”
Now, only someone oblivious to the actual experience of war could subscribe to such a noble list of “what the Allies really fought for.” Perhaps more to the point, in expounding on what inspired the Allied war effort, Stephens chose to overlook the fact that the ranks of those Allies included the Soviet Union. Winston Churchill, Franklin Roosevelt, and their generals would not have considered this a casual omission. They thanked their lucky stars for the Soviet Union’s participation.
Furthermore, Soviet leaders from Josef Stalin on down entertained their own distinct ideas about the war’s purposes. They adhered to and were intent on exporting an ideology hardly less despicable than that of the Nazis. Their purpose was not to liberate Europe, but to absorb large chunks of it into an expanded Soviet sphere of influence. And while correlating interests with values might have appealed to the Soviet dictator, the values to which he subscribed excluded just about every item in the American Bill of Rights. So if we are serious about identifying common war aims, “what the Allies really fought for” focused on one thing only: destroying the Third Reich.
Just like Trump, however, Stephens airbrushes the Soviet Union out of the picture. In doing so, he sanitizes the past. His motive is anything but innocent. Having concocted his own spurious version of “what the Allies really fought for,” Stephens pivots to the present moment and discovers — wouldn’t you know it — that we are right back in those terrible days of the 1930s when the Western democracies hesitated to confront the rising threat posed by Adolf Hitler.
Seventy-five years after D-Day, the world is in disarray. And the West, Stephens charges, is sitting on its hands. Syria is a mess. So is Venezuela. Kim Jong-un, “the world’s most sinister dictator,” still rules North Korea. In Cuba, China, Saudi Arabia, and Iran, dissidents languish behind bars. Nobody “other than a few journalists and activists” seems to care. Everywhere indifference prevails.
And we’ve seen this movie before, he insists:
“This is the West almost as it looked in the 1930s: internally divided and inward looking, hesitant in the face of aggression, incanting political pieties in which it no longer believed — and so determined not to repeat the mistakes of the last war that it sleepwalked its way into the next.”
Now, in those circles where neoconservatives congregate and call for the United States to embark upon some new crusade, this analysis undoubtedly finds favor. But as a description of actually existing reality, it’s about as accurate as Trump’s own periodic blathering about the state of the world.
Is the West today “inward looking”? Then how do we explain the presence of Western forces in Afghanistan, of all places, for nigh onto 20 years? Is the West “hesitant in the face of aggression”? How does that charge square with actions taken by the United States and its allies in Iraq, Libya, Somalia, Yemen, and elsewhere? When it comes to war, some might suggest that our problem of late has not been hesitancy, but unending hubris and the absence of even minimal due diligence. More often than not, when it comes to aggressive behavior, we’re the ones spoiling for a fight. Take, for example, General Kenneth McKenzie, the latest bellicose head of U.S. Central Command, now plugging for “a return to a larger U.S. military presence in the Middle East” with Iran in mind. Don’t accuse him of hesitance.
The prescription that Stephens offers reduces to this: just as in June 1944, brave men with guns, preferably speaking English, will put things right and enable freedom and democracy to prevail. We need only gird our loins and make the effort.
It’s all very inspiring really. Yet Stephens leaves out something important: this time we won’t be able to count on some other nation with a large and willing army to do most of the fighting and dying on our behalf.
The Art of Shaping Memory
Earlier this month, I spent a day visiting Marseilles to videotape a documentary about recent American military history, specifically the ongoing wars that most of us prefer not to think about.
Lest there be any confusion, let me be more specific. I am not referring to Marseilles (mar-SAY), France, that nation’s largest port and second largest city with a population approaching 900,000. No, my destination was Marseilles (mar-SAYLZ), Illinois, a small prairie town with a population hovering around 5,000.
Our own lesser Marseilles nestles alongside the Illinois River, more or less equidistant between Chicago and Peoria, smack dab in the middle of flyover country. I have some personal familiarity with this part of America. More than half a century ago, the school I attended in nearby Peru used to play the Panthers of Marseilles High. Unfortunately, their school closed three decades ago.
Back then, the town had achieved minor distinction for manufacturing corrugated boxes for Nabisco. But that factory was shuttered in 2002 and only the abandoned building remains, its eight-story hulk still looming above Main Street.
Today, downtown Marseilles, running a few short blocks toward the river, consists of tired-looking commercial structures dating from early in the previous century. Many of the storefronts are empty. By all appearances, the rest may suffer a similar fate in the not-too-distant future. Although the U.S. economy has bounced back from the Great Recession, recovery bypassed Marseilles. Here, the good times ended long ago and never came back. The feel of the place is weary and forlorn. Hedge-fund managers keen to turn a quick profit should look elsewhere.
Perhaps not too surprisingly, this is Trump country. Marseilles is located in LaSalle County, which in 2016 voted for Donald Trump over Hillary Clinton by a hefty 14% margin. It’s easy to imagine residents of Marseilles, which is more than 96% white, taking umbrage at Clinton’s disparaging reference to The Donald’s supporters as so many “deplorables.” They had reason to do so.
A Midwestern Memorial to America’s Wars in the Greater Middle East
Today, Marseilles retains one modest claim to fame. It’s the site of the Middle East Conflicts Wall Memorial, dedicated in June 2004 and situated on an open plot of ground between the river and the old Nabisco plant. The memorial, created and supported by a conglomeration of civic-minded Illinois bikers, many of them Vietnam veterans, is the only one in the nation that commemorates those who have died during the course of the various campaigns, skirmishes, protracted wars, and nasty mishaps that have involved U.S. forces in various quarters of the Greater Middle East over the past several decades.
Think about it: Any American wanting to pay personal tribute to those who fought and died for our country in World War II or Korea or Vietnam knows where to go — to the Mall in Washington, D.C., that long stretch of lawn and reflecting pools connecting the Washington Monument and the Lincoln Memorial. Any American wanting to honor the sacrifice of those who fought and died in a series of more recent conflicts that have lasted longer than World War II, Korea, and Vietnam combined must travel to a place where the nearest public transportation is a Greyhound bus station down the road in Ottawa and the top restaurant is Bobaluk’s Beef and Pizza. Nowhere else in this vast nation of ours has anyone invested the money and the effort to remember more than a generation’s worth of less-than-triumphant American war making. Marseilles has a lock on the franchise.
Critics might quibble with the aesthetics of the memorial, dismissing it as an unpretentious knock-off of the far more famous Vietnam Wall. Yet if the design doesn’t qualify as cutting edge, it is palpably honest and heartfelt. It consists chiefly of a series of polished granite panels listing the names of those killed during the various phases of this country’s “forever wars” going all the way back to the sailors gunned down in the June 1967 Israeli attack on the USS Liberty.
Those panels now contain more than 8,000 names. Each June, in conjunction with the annual “Illinois Motorcycle Freedom Run,” which ends at the memorial, more are added. Along with flags and plaques, there is also text affirming that all those commemorated there are heroes who died for freedom and will never be forgotten.
On that point, allow me to register my own quibble. Although my son’s name is halfway down near the left margin of Panel 5B, I find myself uneasy with any reference to American soldiers having died for freedom in the Greater Middle East. Our pronounced penchant for using that term in connection with virtually any American military action strikes me as a dodge. It serves as an excuse for not thinking too deeply about the commitments, policies, and decisions that led to all those names being etched in stone, with more to come next month and probably for many years thereafter.
In Ernest Hemingway’s famed novel about World War I, A Farewell to Arms, his protagonist is “embarrassed by the words sacred, glorious, and sacrifice and the expression in vain.” I feel something similar when it comes to the use of freedom in this context. Well, not embarrassed exactly, but deeply uncomfortable. Freedom, used in this fashion, conceals truth behind a veil of patriotic sentiment.
Those whose names are engraved on the wall in Marseilles died in service to their country. Of that there is no doubt. Whether they died to advance the cause of freedom or even the wellbeing of the United States is another matter entirely. Terms that might more accurately convey why these wars began and why they have persisted for so long include oil, dominion, hubris, a continuing and stubborn refusal among policymakers to own up to their own stupendous folly, and the collective negligence of citizens who have become oblivious to where American troops happen to be fighting at any given moment and why. Some might add to the above list an inability to distinguish between our own interests and those of putative allies like Saudi Arabia and Israel.
Candidates at the Wall
During the several hours I spent there, virtually no one else visited the Middle East Conflicts Wall Memorial. A single elderly couple stopped by briefly and that was that. If this was understandable, it was also telling. After all, Marseilles, Illinois, is an out-of-the-way, isolated little burg. Touristy it’s not. There’s no buzz and no vibe and it’s a long way from the places that set the tone in present-day America. To compare Marseilles with New York, Washington, Hollywood, Las Vegas, or Silicon Valley is like comparing a Dollar General with Saks Fifth Avenue. Marseilles has the former. The closest Saks outlet is about a two-hour drive to Chicago’s Loop.
On the other hand, when you think about it, Marseilles is exactly the right place to situate the nation’s only existing memorial to its Middle Eastern wars. Where better, after all, to commemorate conflicts that Americans would like to ignore or forget than in a hollowed-out Midwestern town they never knew existed in the first place?
So, with the campaign for the 2020 presidential election now heating up, allow me to offer a modest proposal of my own — one that might, briefly at least, make Marseilles a destination of sorts.
Just as there are all-but-mandatory venues in Iowa and New Hampshire where candidates are expected to appear, why not make Marseilles, Illinois, one as well? Let all of the candidates competing to oust Donald Trump from the White House (their ranks now approaching two dozen) schedule at least one campaign stop at the Middle East Conflicts Wall, press entourage suitably in tow.
Let them take a page from presidents John F. Kennedy and Ronald Reagan at the Berlin Wall and use the site as a backdrop to reflect on the historical significance of this particular place. They should explain in concrete terms what the conflicts memorialized there signify; describe their relationship to the post-Cold War narrative of America as the planet’s “indispensable nation” or “sole superpower”; assess the disastrous costs and consequences of those never-ending wars; fix accountability; lay out to the American people how to avoid repeating the mistakes made by previous administrations, including the present one that seems to be itching for yet another conflict in the Middle East; and help us understand how, under the guise of promoting liberty and democracy, Washington has sown chaos through much of the region.
And, just to make it interesting, bonus points for anyone who can get through their remarks without referring to “freedom” or “supreme sacrifice,” citing the Gospel of John, chapter 15, verse 13 (“Greater love hath no man than this…”), or offering some fatuous reference to GIs as agents of the Lord called upon to smite evildoers. On the other hand, apt comparisons to Vietnam are not just permitted but encouraged.
I’m betting that the good bikers of Illinois who long ago served in Vietnam will happily provide a mic and a podium. If they won’t, I will.
The “Forever Wars” Enshrined
Irony, paradox, contradiction, consternation — these define the times in which we live. On the one hand, the 45th president of the United States is a shameless liar. On the other hand, his presidency offers an open invitation to Americans to confront myths about the way their country actually works. Donald Trump is a bullshit artist of the first order. Yet all art reflects the time in which it’s produced and Trump’s art is no exception. Within all the excrement lie nuggets of truth.
Well before Trump rode the down escalator to the center of American politics, there were indicators aplenty that things had gone fundamentally awry. Yet only with the presidential election of 2016 did the chickens come home to roost. And with their arrival, it became apparent that more than a few propositions hitherto accepted as true are anything but.
Let me offer seven illustrative examples of myths that the Trump presidency has once and for all demolished.
Myth #1: The purpose of government is to advance the common good. In modern American politics, the concept of the common good no longer has any practical meaning. It hasn’t for decades. The phrase might work for ceremonial occasions — inaugural addresses, prayer breakfasts, that sort of thing — but finds little application in the actual business of governing.
When did politics at the national level become a zero-sum game? Was it during Richard Nixon’s presidency? Bill Clinton’s? While the question may be of academic interest, more pertinent is the fact that, with Trump in the White House, there is no need to pretend otherwise. Indeed, Trump’s popularity with his “base” stems in part from his candid depiction of his political adversaries not as a loyal opposition but as an enemy force. Trump’s critics return the favor: their loathing for the president and — now that Trump’s generals are gone — anyone in his employ knows no bounds.
It’s the Mitch McConnell Rule elevated to the status of dogma: If your side wins, mine loses. Therefore, nothing is more important than my side winning. Compromise is for wusses.
Myth #2: Good governance entails fiscal responsibility. This is one of the hoariest shibboleths of modern American politics: feckless Democrats tax and spend; sober Republicans stand for balanced budgets. So President Ronald Reagan claimed, en route to racking up the massive deficits that transformed the United States from the world’s number one creditor into its biggest debtor. George W. Bush doubled down on Reagan’s promise. Yet during his presidency, deficits skyrocketed, eventually exceeding a trillion dollars per annum. No apologies were forthcoming. “Deficits don’t matter,” his vice president announced.
Then along came Trump. Reciting the standard Republican catechism, he vowed not only to balance the budget but to pay off the entire national debt within eight years. It was going to be a cinch. Instead, the projected deficit in the current fiscal year will once again top a cool trillion dollars while heading skywards. The media took brief note — and moved on.
Here’s the naked truth that Trump invites us to contemplate: both parties are more than comfortable with red ink. As charged, the Democrats are indeed the party of tax and spend. Yet the GOP is the party of spend-at-least-as-much (especially on the Pentagon) while offering massive tax cuts to the rich.
Myth #3: Justice is blind. The nomination of Brett Kavanaugh to the Supreme Court and the controversies surrounding his confirmation affirmed in unmistakable terms what had been hidden in plain sight since at least 1987 when Robert Bork was denied a seat on the court. The Supreme Court has become a venue for advancing a partisan agenda. It serves, in effect, as a third legislative body, consisting of unelected members with lifelong tenure, answerable only to itself. So politically active Americans of whatever stripe believe. Justice impartially administered is for people who still believe in the Tooth Fairy.
As a result, the Supremes now wear invisible labels on their black robes, identifying members as either liberal or conservative, aligned, in effect, with Democratic or Republican positions. On hot-button issues — gun rights and abortion rights are two examples — their job is to act accordingly. Hence, the consternation caused when a member violates those expectations, as was the case when Chief Justice John Roberts voted to preserve the Affordable Care Act.
So both parties engage in unapologetic court packing. In recent years, Mitch McConnell and the Senate Republicans, who blocked dozens of Obama appointees to the federal bench and prevented Merrick Garland’s nomination to the Supreme Court from even being considered, have done so with considerable skill. But Democrats are merely biding their time. Hence, the imperative of ensuring that Justice Ruth Bader Ginsburg, now 86 and ailing, won’t retire until a Democrat once again sits in the Oval Office.
Crucially, neither the left nor the right acknowledges the possibility that a politicized judiciary, however useful in advancing a partisan agenda, might not serve the nation’s long-term interests.
Myth #4: The “wise men” are truly wise. To keep America safe, protect core U.S. interests, and promote peace, presidents since World War II have sought advice and counsel from a small self-perpetuating group of foreign policy insiders claiming specialized knowledge about how the world works and America’s proper role atop that world. In the 1960s, thanks to the disastrous war in Vietnam, the reputation of this cadre of “wise men” cratered. Yet they weren’t finished, not by a long shot. Their ranks now including women, they staged a remarkable comeback in the wake of 9/11. Among the ensuing catastrophes were the wars in Afghanistan, Iraq, Libya, and Syria.
As a candidate, Trump made his contempt for this elite clear. Yet fool that he is, the president now employs a bargain-basement version of the “best and brightest”: a national security advisor who believes that “To Stop Iran’s Bomb, Bomb Iran”; a secretary of state whose conception of history derives from the Bible; an acting defense secretary on loan from Boeing who reportedly spends time trashing his former employer’s competitors; and a CIA director who earned her stripes supervising secret torture chambers.
Members of this posse may carry all the requisite security clearances, but sound thinking or foresight? One might do at least as well and perhaps better consulting a class full of college sophomores. Thanks to Trump, only the truly gullible will persist in thinking that the foreign policy establishment has a lock on wisdom.
Myth #5: The Persian Gulf is a vital U.S. national security interest. For decades now, Americans have been fed this line with unhappy results. Dominating the Persian Gulf, we’ve been told, is essential to preserving our way of life. Stripped to its essentials, here’s the gist of the argument: They have the oil and we need it.
In fact, we don’t need their oil. There’s plenty right here in our own hemisphere — in, that is, “Saudi America.” Moreover, burning all that oil accelerates climate change, which poses a greater proximate threat to the well-being of the American people than anything likely to happen in the Gulf. Meanwhile, several decades of U.S. meddling in that region have produced the inverse of what policymakers promised. Instead of order, there is instability; instead of democracy, illiberalism; instead of peace, death and destruction. In terms of lives lost and damaged and treasure wasted, the cost to the United States has been immense.
To his credit, Trump has now explained the actual basis for the continuing U.S. interest in this part of the world: the Saudis, as well as other Gulf states, have an insatiable appetite for made-in-the-USA armaments. It’s all about the Benjamins, baby, and we can’t allow Russia or China to horn in on our market. Only to the military-industrial complex and its co-conspirators is the Persian Gulf a vital interest. Trump relieves us of the burden of having to pretend otherwise. Thank you, Mr. President.
Myth #6: Prospects for an Israeli-Palestinian peace depend on Washington playing the role of honest broker. Here, too, let’s give President Trump his due. He has definitively exposed the entire peace process as a fiction and a fraud. By fulfilling the promise made by previous presidents to move the U.S. embassy to Jerusalem and by endorsing the Israeli claim to the Golan Heights, Trump has stripped away the last vestiges of pretense: Washington favors just one side in this festering dispute, as it has since at least the 1960s.
Why this should even qualify as news is a bit of a mystery. After all, for decades, the United States has been providing Israel with diplomatic cover at the U.N. Security Council and elsewhere, along with an annual gift of billions of dollars in weaponry — other customers pay cash — even as droves of non-Jewish politicians compete with one another to profess their undying love for and devotion to a country other than their own. Talk about dual loyalty!
Yes, of course, son-in-law Jared is busily hammering out what Trump himself has called “the toughest of all deals.” Perhaps there is genius in turning to an amateur when the professionals have failed. If Kushner pulls this off, we’ll wonder why Richard Nixon didn’t send daughter Tricia to Paris to negotiate an end to the Vietnam War and why Jimmy Carter didn’t dispatch wife Rosalynn to Tehran to sort out the hostage crisis. Yet whether Jared succeeds or not, thanks to Trump, we can now say definitively that when it comes to Israel, the United States is all in, now and forever.
Myth #7: War is the continuation of policy by other means. So, in a riff on Prussian military theorist Carl von Clausewitz’s famous maxim, generations of American statesmen and military officers have professed to believe. Yet, in the present century, the challenge of making armed force politically purposeful has turned out to be daunting. Nothing illustrates the point more clearly than America’s never-ending war in Afghanistan.
Like the clutter of online ads that our eyes automatically ignore, Americans have learned to tune out this longest war in our history. Originally styled Operation Enduring Freedom, the war itself has certainly endured. It began when this year’s crop of high school graduates were just leaving the womb. In terms of total length, it’s on track to outlast the Civil War (1861-1865), U.S. participation in the two world wars (1917-1918, 1941-1945), the Korean War (1950-1953), and the Vietnam War (1965-1973) combined.
The Pentagon has never demonstrated more than minimal interest in calculating the war’s cumulative costs. While researchers do their best to keep up with the mounting tally, their numbers possess almost no political salience. Congressional Democrats get exercised about the handful of billions of dollars that Donald Trump wants to waste on building his wall, but few members of either party attend to the hundreds of billions wasted in Afghanistan. So like the Energizer Bunny, the war there just keeps on going, while going nowhere in particular.
In his State of the Union Address earlier this year, the president opined that “Great nations do not fight endless wars.” It was a commendable declaration. Indeed, Trump has made it unmistakably clear that he wants out of Afghanistan as well as Syria, and the sooner the better. The boss has spoken: We’re leaving, pronto, sayonara, gone for good.
Yet as is so often the case with this president, words have not translated into action. So, contrary to Trump’s clearly expressed intentions, the Pentagon is planning on keeping 7,000 U.S. troops in Afghanistan for another three to five years while also sustaining an active presence in Syria. In other words, the endless wars won’t be ending any time soon.
There’s a lesson to be learned here and the lesson is this: while senior military officers will never overtly disobey their president — heaven forbid! — they have evolved a repertoire of tricks over the decades to frustrate any president’s intentions. On the eve of his retirement from office in 1961, President Dwight D. Eisenhower went on national television to tell the American people how it’s done.
Credit the present generation of generals with having gone one further. Remarkably enough, they have inverted Clausewitz. No longer does discernible political purpose serve as a necessary precondition for perpetuating a war. If generals (and militarized civilians) don’t want a war to end, that suffices as a rationale for its continuation. The boss will comply.
We can therefore thank Trump for inadvertently laying bare the reality of civil-military relations in twenty-first-century Washington: The commander-in-chief isn’t really in command.
Historians are never going to rate Trump as a great or even mediocre president. Even so, they may one day come to appreciate the Trump era as the moment when things long hidden became plain to see, when hitherto widely accepted falsehoods, fabrications, and obsolete assumptions about American democracy finally became untenable. For that, if for nothing else, we may yet have reason to thank our 45th president for services rendered.
Can We Stop Pretending Now?
The news, however defined, always contains a fair amount of pap. Since Donald Trump’s ascent to the presidency, however, the trivia quotient in the average American’s daily newsfeed has grown like so many toadstools in a compost heap, overshadowing or crowding out matters of real substance. We’re living in TrumpWorld, folks. Never in the history of journalism have so many reporters, editors, and pundits expended so much energy fixating on one particular target, while other larger prey frolic unmolested within sight.
As diversion or entertainment — or as a way to make a buck or win 15 seconds of fame — this development is not without value. Yet the overall impact on our democracy is problematic. It’s as if all the nation’s sportswriters obsessed 24/7 about beating New England Patriots coach Bill Belichick.
In TrumpWorld, journalistic importance now correlates with relevance to the ongoing saga of Donald J. Trump. To members of the mainstream media (Fox News, of course, excepted), that saga centers on efforts to oust the president from office before he destroys the Republic or blows up the planet.
Let me stipulate for the record: this cause is not entirely meritless. Yet to willingly embrace such a perspective is to forfeit situational awareness bigly. All that ends up mattering are the latest rumors, hints, signs, or sure-fire indicators that The Day of Reckoning approaches. Meanwhile, the president’s own tweets, ill-tempered remarks, and outlandish decisions each serve as a reminder that the moment when he becomes an ex-president can’t arrive too soon.
Hotels in Moscow, MAGA Caps, and a Nixon Tattoo
Ostensibly big stories erupt, command universal attention, and then evaporate like the dewfall on a summer morning, their place taken by the next equally big, no less ephemeral story. Call it the Michael Wolff syndrome. Just a year ago, Wolff’s Fire and Fury: Inside the Trump White House took the political world by storm, bits and pieces winging across the Internet while the book itself reportedly sold a cool million copies in the first four days of its release. Here was the unvarnished truth of TrumpWorld with a capital T. Yet as quickly as Fire and Fury appeared, it disappeared, leaving nary a trace.
Today, 99 cents will get you a copy of that same hardcover book. As a contribution to deciphering our times, the value of Wolff’s volume is about a dollar less than its current selling price. A mere year after its appearance, it’s hard to recall what all the fuss was about.
Smaller scale versions of the Wolff syndrome play themselves out almost daily. Remember the recent bombshell BuzzFeed report charging that Trump had ordered his lawyer Michael Cohen to lie about a proposed hotel project in Moscow? For a day or so, it was the all-encompassing, stop-the-presses-get-me-rewrite version of reality, the revelation — finally! — that would bring down the president. Then the office of Special Counsel Robert Mueller announced that key aspects of the report were “not accurate” and the 24/7 buzz created by that scoop vanished as quickly as it had appeared.
Immediately thereafter, Rudy Giuliani, once “America’s mayor,” now Trump’s Barney Fife-equivalent of a personal lawyer, announced on national television that he had never said “there was no collusion” between the Trump campaign and Russian authorities in election 2016. Observers on the lookout for the proverbial smoking gun quickly interpreted that odd formulation as an admission that collusion must, in fact, have occurred.
The headlines were thunderous. Yet within hours, the gotcha-interpretation fell apart. Alternative explanations appeared, suggesting that Giuliani was suffering from dementia or that his drinking habit had gotten out of hand. With the ex-mayor wasting little time walking back his own comment, another smoking gun morphed into a cap pistol.
Fortunately for what little survives of his reputation, Giuliani’s latest gaffe was promptly eclipsed by video clips that seemed to show white students from an all-boys Catholic high school in Kentucky (Strike One!) who had just participated in the annual March for Life in Washington (Strike Two!) and were taunting an elderly Native American Vietnam War veteran using Tomahawk chops while sporting MAGA hats on the steps of the Lincoln Memorial (Strike Three!).
The ensuing rush to judgment became a wind sprint. Here was the distilled essence of every terrible thing that Donald Trump had done to America. The pro-Trump baseball caps said it all. As a columnist in my hometown newspaper put it, “Like a white hood, that cap represents a provocation and a threat: ‘You know where we stand. You’ve been warned. And the president of the United States has our back.’ And, yes, I do equate MAGA gear with traditional Klan attire. The sartorial choices change, the racism remains the same.” For those too obtuse to grasp the underlying point, the title of the essay drove it home: “White America, come get your children.”
As luck would have it, however, the events that actually unfolded on the steps of the Lincoln Memorial turned out to be more complicated than was first reported. No matter: in TrumpWorld, all sides treat facts as malleable and striking the right moral posture counts for far more than balance or accuracy.
Anyway, soon after, with news that Trump confidant Roger Stone had been indicted on various charges, the boys from Covington could return to the obscurity from which they briefly emerged. To judge from the instantaneous media reaction, Stone’s first name might as well have been Rosetta. Here at last — for sure this time — was the key to getting the real dirt.
Rest assured, though, that by the time this essay appears, Stone and his Richard Nixon tattoo will have been superseded by yet another sensational Trump-related revelation (or two or three).
And so it goes, in an endlessly churning cycle: “breaking news” goes viral; commentators rush in to explain what-it-all-means; the president himself retaliates by lashing out on Twitter (“The Greatest Witch Hunt in the History of our Country!”), much to the delight of his critics. This tit-for-tat exchange continues until the next fresh tidbit of “breaking news” gives the cycle another vigorous turn.
When Does a Hill of Beans Become a Mountain?
Do all of the words spoken or written result in citizens who are better informed and better able to reach sensible conclusions about the global situation in which our country finds itself? Not as far as I can tell. Granted, if I spent more time watching those gabbling heads on CNN, MSNBC, and Fox News, I might feel differently. But I doubt it.
Still, having been involuntarily shanghaied into TrumpWorld, I worry that my fellow citizens are losing their ability to distinguish between what truly matters and what doesn’t, between what’s vital and what’s merely interesting. True, Donald J. Trump has a particular knack for simplifying and thereby distorting almost any subject to which he gives even the slightest attention, ranging from border security to forest management. Yet almost everywhere in TrumpWorld, this very tendency has become endemic, with nuance and perspective sacrificed to the larger cause of cleansing the temple of the president’s offending presence. Nothing, it appears, comes close to the importance of this effort.
Not even wars.
I admit to a preoccupation with the nation’s seemingly never-ending armed conflicts. These days it’s not the conduct of our wars that interests me — they have become all but indecipherable — but their duration, aimlessness, and cumulative costs. Yet even more than all of these, what’s fascinating is the way that they continue more or less on autopilot.
I don’t wish to imply that political leaders and media outlets ignore our wars altogether. That would be unfair. Yet in TrumpWorld, while the president’s performance in office receives intensive and persistent coverage day in, day out, the attention given to America’s wars has been sparse and perfunctory, when not positively bizarre.
As a case in point, consider the op-ed that recently appeared in the New York Times (just as actual peace talks between the U.S. and the Taliban seemed to be progressing), making the case for prolonging the U.S. war in Afghanistan, while chiding President Trump for considering a reduction in the number of U.S. troops currently stationed there. Any such move, warned Michael O’Hanlon of the Brookings Institution, would be a “mistake” of the first order.
The ongoing Afghan War dates from a time when some of today’s recruits were still in diapers. Yet O’Hanlon counsels patience: a bit more time and things just might work out. This is more or less comparable to those who suggested back in the 1950s that African Americans might show a bit more patience in their struggle for equality: Hey, what’s the rush?
I don’t pretend to know what persuaded the editors of the Times that O’Hanlon’s call to make America’s longest war even longer qualifies as something readers of the nation’s most influential newspaper just now need to ponder. Yet I do know this: the dearth of critical attention to the costs and consequences of our various post-9/11 wars is nothing short of shameful, a charge to which politicians and journalists alike should plead equally guilty.
I take it as a given that President Trump is an incompetent nitwit, precisely as his critics charge. Yet his oft-repeated characterization of those wars as profoundly misguided has more than a little merit. Even more striking than Trump’s critique is the fact that so few members of the national security establishment are willing to examine it seriously. As a consequence, the wars persist, devoid of purpose.
Still, I find myself wondering: If a proposed troop drawdown in Afghanistan qualifies as a “mistake,” as O’Hanlon contends, then what term best describes a war that has cost something like a trillion dollars, killed and maimed tens of thousands, and produced a protracted stalemate?
Disaster? Debacle? Catastrophe? Humiliation?
And, if recent press reports prove true, with U.S. government officials accepting Taliban promises of good behavior as a basis for calling it quits, then this longest war in our history will not have provided much of a return on investment. Given the disparity between the U.S. aims announced back in 2001 and the results actually achieved, defeat might be an apt characterization.
Yet the fault is not Trump’s. The fault belongs to those who have allowed their immersion in the dank precincts of TrumpWorld to preclude serious reexamination of misguided and reckless policies that predate the president by at least 15 years.
Lost in TrumpWorld
What does President Trump’s recent nomination of retired Army General John Abizaid to become the next U.S. ambassador to Saudi Arabia signify? Next to nothing — and arguably quite a lot.
Abizaid’s proposed appointment is both a non-event and an opportunity not to be wasted. It means next to nothing in this sense: while once upon a time, American diplomats abroad wielded real clout — Benjamin Franklin and John Quincy Adams offer prominent examples — that time is long past. Should he receive Senate confirmation, Ambassador Abizaid will not actually shape U.S. policy toward Saudi Arabia. At most, he will convey policy, while keeping officials back in Washington apprised regarding conditions in the Kingdom. “Conditions” in this context will mean the opinions, attitudes, whims, and mood of one particular individual: Mohammed bin Salman. MBS, as he is known, is the Saudi crown prince and the Kingdom’s de facto absolute ruler. By no means incidentally, he is also that country’s assassin-in-chief as well as the perpetrator of atrocities in a vicious war that he launched in neighboring Yemen in 2015.
Implicit in Abizaid’s job description will be a requirement to cozy up to MBS. “Cozy up” in this context implies finding ways to befriend, influence, and seduce; that is, seeking to replicate in Riyadh the achievements in Washington of Prince Bandar bin Sultan, who from 1983 to 2005 served as Saudi ambassador to the United States.
With plenty of money to spread around, Bandar charmed — which in this context means suborned — the Washington establishment, while ingratiating himself with successive presidents and various other power brokers. With his fondness for nicknames, George W. Bush dubbed him “Bandar Bush,” informally designating the Saudi prince a member of his own dynastic clan.
After 9/11, the Saudi envoy made the most of those connections, deflecting attention away from the role Saudis had played in the events of that day while fingering Saddam Hussein’s Iraq as the true font of Islamist terrorism. Bush came around to endorsing Bandar’s view — although he may not have needed much urging. So while Bandar may not rank alongside the likes of Vice President Dick Cheney, Secretary of Defense Donald Rumsfeld, and Deputy Secretary of Defense Paul Wolfowitz among the architects of the ensuing Iraq War, he certainly deserves honorable mention.
That Abizaid will come anywhere close to replicating Bandar’s notable (or nefarious) achievements seems unlikely. For starters, at age 67, he may not want to spend the next 20 years or so in the Saudi capital, Riyadh, sucking up to the Kingdom’s royals. At least as significantly, he lacks Bandar’s bankroll. However much dough Abizaid may have raked in via his consulting firm since leaving the Army a decade ago, it doesn’t qualify as real money in Saudi circles, where a billion dollars is a mere rounding error. The mega-rich do not sell themselves cheaply, unless perhaps your surname is Trump.
So the substantive implications of Abizaid’s appointment for U.S.-Saudi relations will likely be negligible. Trump’s son-in-law Jared Kushner will undoubtedly continue to wield greater influence over MBS than Ambassador Abizaid — or at least will fancy that he is doing so.
Long (and Wrong) War
In another sense, however, Abizaid’s appointment to this post (vacant since Donald Trump became president) could mean quite a lot. It offers an ideal opportunity to take stock of the “Long War.”
Now that phrase “Long War” is one that presidents, national security advisors, defense secretaries, and their minions assiduously avoid. Yet, in military circles, it long ago superseded the Global War on Terrorism as an umbrella term describing what U.S. forces have been doing across the Greater Middle East all these many years.
Already by 2005, for example, hawkish analysts employed by a conservative Washington think tank were marketing their recipe for Winning the Long War. And that was just for starters. For more than a decade now, the Long War Journal has been offering authoritative analysis of U.S. military operations across the Greater Middle East and Africa. In the meantime, West Point’s Combating Terrorism Center churns out monographs with titles like Fighting the Long War. Always quick to recognize another golden goose of government contracts, the RAND Corporation weighed in with Unfolding the Future of the Long War. After publishing a lengthy essay in the New York Times Magazine called “My Long War,” correspondent Dexter Filkins went a step further and titled his book The Forever War. (And for creative types, Voices from the Long War invites Iraq and Afghan War vets to reflect on their experiences before a theatrical audience.)
But where, you might wonder, did that dour phrase originate? As it happens, General Abizaid himself coined it back in 2004 when he was still an active duty four-star and head of U.S. Central Command, the regional headquarters principally charged with waging that conflict. In other words, just a year after the U.S. invaded Iraq and President George W. Bush posed under a White House-produced “Mission Accomplished” banner, with administration officials and their neoconservative boosters looking forward to many more “Iraqi Freedom”-style victories to come, the senior officer presiding over that war went on record to indicate that victory wasn’t going to happen anytime soon. Oops.
And so it has come to pass. The Long War has now lasted twice as long as the average length of marriages in the United States, with no end in sight. Whether intuitively or after careful study, General Abizaid had divined something important indeed.
Crucially, however, his critique went beyond the question of duration. Abizaid also departed from the administration’s line in describing the actual nature of the problem at hand. “Terrorists” per se were not the enemy, he insisted at the time. The issue was much bigger than any one organization such as al-Qaeda. The real threat facing the United States came from what he called “Salafist jihadists,” radicalized Sunni Muslims committed by whatever means necessary to propagating a strict and puritanical form of Islam around the world. To promote their cause, Salafists eagerly embraced violence.
Back in 2004, when Abizaid was venturing heretical thoughts, the United States had gotten itself all tangled up in a nasty scuffle in Iraq. A year earlier, the U.S. had invaded that country to overthrow Saddam Hussein. Now the Iraqi dictator was indubitably a bad actor. At least some of the charges that George W. Bush and his subordinates, amplified by a neoconservative chorus, lodged against him were true. Yet Saddam was the inverse of a Salafist.
Indeed, even before plunging into Iraq, looking beyond an expected easy win over Saddam, George W. Bush had identified Iran as a key member of an “Axis of Evil” and implicitly next in line for liberation. Sixteen years later, members of the Trump administration still hanker to have it out with the ayatollahs governing Shiite-majority Iran. Yet, as was the case with Saddam, those ayatollahs are anything but Salafists.
Now, it’s worth noting that Abizaid was not some dime-a-dozen four-star. He speaks Arabic, won a fellowship to study in Jordan, and earned a graduate degree in Middle East Studies at Harvard. If the post-9/11 American officer corps had in its ranks an equivalent of Lawrence of Arabia, he was it, even if without T.E. Lawrence’s (or Peter O’Toole’s) charisma and flair for self-promotion. Nonetheless, with Abizaid suggesting, in effect, that the Iraq War was “the wrong war at the wrong place at the wrong time against the wrong enemy,” just about no one in Washington was willing to listen.
That once-familiar quotation dates from 1951, when General Omar Bradley warned against extending the then-ongoing Korean War into China. Bradley’s counsel carried considerable weight — and limiting the scope of the Korean War made it possible to end that conflict in 1953.
Abizaid’s counsel turned out to carry next to no weight at all. So the Long War just keeps getting longer, even as its strategic rationale becomes ever more difficult to discern.
The Real Enemy
Posit, for the sake of discussion, that back in 2004 Abizaid was onto something — as indeed he was. Who then, in this Long War of ours, is our adversary? Who is in league with those Salafi jihadists? Who underwrites their cause?
The answer to those questions is not exactly a mystery. It’s the Saudi royal family. Were it not for Saudi Arabia’s role in promoting militant Salafism over the course of several decades, that movement would pose no bigger problem than Cliven Bundy’s bickering with the Bureau of Land Management.
To put it another way, while the Long War has found U.S. troops fighting the wrong enemy for years on end in places like Iraq and Afghanistan, the nexus of the problem remains Saudi Arabia. The Saudis have provided billions to fund madrassas and mosques, spreading Salafism to the far reaches of the Islamic world. Next to oil, violent jihadism is Saudi Arabia’s principal export. Indeed, the former funds the latter.
Those Saudi efforts have borne fruit of a poisonous character. Recall that Osama bin Laden was a Saudi. So, too, were 15 of the 19 hijackers on September 11, 2001. These facts are not incidental, even if — to expand on Donald Rumsfeld’s famous typology of known knowns, known unknowns, and unknown unknowns — Washington treats them as knowns we prefer to pretend we don’t know.
So from the outset, in the conflict that the United States dates from September 2001, our ostensible ally has been the principal source of the problem. In the Long War, Saudi Arabia represents what military theorists like to call the center of gravity, defined as “the source of power that provides moral or physical strength, freedom of action, or will to act” to the enemy. When it comes to Salafist jihadism, Saudi Arabia fits that definition to a T.
So there is more than a little poetic justice — or is it irony? — in General Abizaid’s proposed posting to Riyadh. The one senior military officer who early on demonstrated an inkling of understanding of the Long War’s true nature now prepares to take up an assignment in what is, in essence, the very center of the enemy’s camp. It’s as if President Lincoln had dispatched Ulysses S. Grant to Richmond, Virginia, in 1864 as his liaison to Jefferson Davis.
Which brings us to the opportunity referred to at the outset of this essay. The opportunity is not Abizaid’s. He can look forward to a frustrating and probably pointless assignment. Yet Trump’s nomination of Abizaid presents an opportunity to the U.S. senators charged with approving his appointment. While we can take it for granted that Abizaid will be confirmed, the process of confirmation offers the Senate, and especially members of the Senate Foreign Relations Committee, a chance to take stock of this Long War of ours and, in particular, to assess how Saudi Arabia fits into the struggle.
Who better to reflect on these matters than John Abizaid? Imagine the questions:
General, can you describe this Long War of ours? What is its nature? What is it all about?
Are we winning? How can we tell?
How much longer should Americans expect it to last?
What are we up against? Give us a sense of the enemy’s intentions, capabilities, and prospects.
With MBS in charge, is Saudi Arabia part of the solution or part of the problem?
Take all the time you need, sir. Be candid. We’re interested in your opinion.
After the embarrassment of the Kavanaugh confirmation hearings, the Senate is badly in need of refurbishing its reputation. The Abizaid nomination provides a ready-made chance to do just that. Let’s see if the “world’s greatest deliberative body” rises to the occasion. Just don’t hold your breath.
Our Man in Riyadh
Senator Elizabeth Warren
317 Hart Senate Office Building
Dear Senator Warren:
As a constituent, I have noted with interest your suggestion that you will “take a hard look” at running for president in 2020, even as you campaign for reelection to the Senate next month. Forgive me for saying that I interpret that comment to mean “I’m in.” Forgive me, as well, for my presumption in offering this unsolicited — and perhaps unwanted — advice on how to frame your candidacy.
You are an exceedingly smart and gifted politician, so I’m confident that you have accurately gauged the obstacles ahead. Preeminent among them is the challenge of persuading citizens beyond the confines of New England, where you are known and respected, to cast their ballot for a Massachusetts liberal who possesses neither executive nor military experience and is a woman to boot.
Voters will undoubtedly need reassurance that you have what it takes to keep the nation safe and protect its vital interests. And yes, there is a distinct double standard at work here. Without possessing the most minimal of qualifications to serve as commander-in-chief, Donald Trump won the presidency in 2016. Who can doubt that gender and race played a role?
So the challenge you face is an enormous one. To meet it, in my estimation, you should begin by exposing the tangle of obsolete assumptions and hitherto unresolvable contradictions embedded in present-day U.S. national security policy. You’ll have to demonstrate a superior understanding of how events are actually trending. And you’ll have to articulate a plausible way of coping with the problems that lie ahead. To become a viable candidate in 2020, to win the election, and then to govern effectively, you’ll need to formulate policies that not only sound better, but are better than what we’ve got today or have had in the recent past. So there’s no time to waste in beginning to formulate a Warren Doctrine.
Of course, the city in which you spend your workweek is awash with endless blather about a changing world, emerging challenges, and the need for fresh thinking. Yet, curiously enough, what passes for national security policy has remained largely immune to change, fixed in place by two specific episodes that retain a chokehold on that city’s policy elite: the Cold War and the events of 9/11.
The Cold War ended three decades ago in what was ostensibly a decisive victory for the United States. History itself had seemingly anointed us as the “indispensable nation.”
Yet here we are, all these years later, gearing up again to duel our old Cold War adversaries, the Russkies and ChiComs. How, in the intervening decades, did the United States manage to squander the benefits of coming out on top in that “long twilight struggle”? Few members of the foreign policy establishment venture to explain how or why things so quickly went awry. Fewer still are willing to consider the possibility that our own folly offers the principal explanation.
By the time you are elected, the 20th anniversary of 9/11 will be just around the corner, and with it the 20th anniversary of the Global War on Terrorism. Who can doubt that when you are inaugurated on January 20, 2021, U.S. forces will still be engaged in combat operations in Afghanistan, Syria, Libya, and various other places across the Greater Middle East and Africa? Yet in present-day Washington, the purpose and prospects of those campaigns elude serious discussion. Does global leadership necessarily entail being permanently at war? In Washington, the question goes not only unanswered, but essentially unasked.
Note that President Trump has repeatedly made plain his desire to extricate the United States from our wars without end, only to be told by his subordinates that he can’t. Trump then bows to the insistence of the hawks because, for all his bluster, he’s weak and easily rolled. Yet there’s a crucial additional factor in play as well: Trump is himself bereft of strategic principles that might provide the basis for a military posture that is not some version of more of the same. When he’s told “we have to stay,” he simply can’t refute the argument. So we stay.
You, too, will meet pressure to perpetuate the status quo. You, too, will be told that no real alternatives exist. Hence, the importance of bringing into office a distinctive strategic vision that offers the possibility of real change.
You will want to tailor that vision so that it finds favor with three disparate audiences. First, to win the nomination, you’ll need to persuade members of your own party to prefer your views to those of your potential competitors, including Democrats with far more impressive national security credentials than your own. Among those already hinting at a possible run for the presidency are a well-regarded former vice president and possibly even a former secretary of state who is a decorated combat veteran and chaired the Senate Foreign Relations Committee. Although long in the tooth, they are not to be dismissed.
Second, having won the nomination, you’ll have to persuade voters who are not Democrats that your vision will, in the words of the preamble to the Constitution, “secure the Blessings of Liberty to ourselves and our Posterity.” In this context, persuasion should start with education, with, that is, disabusing citizens of the conviction — now prevalent in Washington — that “global leadership” is synonymous with a willingness to use force.
Finally, once you enter the Oval Office, you’ll need to get buy-ins from Congress, the national security apparatus, and U.S. allies. That means convincing them that your approach can work, won’t entail unacceptable risks, and won’t do undue damage to their own parochial interests.
To recap, a Warren Doctrine will need to appeal to progressives likely to have an aversion to the very phrase “national security,” even as it inspires middle-of-the-roaders to give you their vote and persuades elites that you can be trusted to exercise power responsibly. All in all, that is a tall order.
Yet I think it can be done. Indeed, it needs to be done if the United States is ever to find a way out of the strategic wilderness in which it is presently wandering, with the likes of Donald Trump, John Bolton, Mike Pompeo, and James Mattis taking turns holding the compass while trying to figure out which way is north.
1 + 3 = You Win
A strategic paradigm worthy of the name begins with a tough-minded appraisal of the existing situation. There is, to put it mildly, a lot going on in our world today, much of it not good: terrorism, whether Islamist or otherwise; unchecked refugee flows; cross-border trafficking in drugs, weapons, and human beings; escalating Saudi-Iranian competition to dominate the Persian Gulf; pent-up resentment among Palestinians, Kurds, and other communities denied their right to self-determination; the provocations of “rogue states” like Russia, Pakistan, and North Korea; and, not to be forgotten, the ever-present danger of unintended nuclear war. As a candidate, you will need to have informed views on each of these.
Yet let me suggest that these are legacy issues, most of them detritus traceable to the twentieth century. None of them are without importance. None can be ignored. If mishandled, two or three of them have the potential to produce apocalyptic catastrophes. Even so, the place to begin formulating a distinctive Warren Doctrine that will resonate with each of those three constituencies — Democrats, the general public, and the establishment — is to posit that these have become secondary concerns.
Eclipsing such legacy issues in immediate significance are three developments that Washington currently neglects or treats as afterthoughts, along with one contradiction that simultaneously permeates and warps any discussion of national security. If properly understood, the items in this quartet would rightly cause Americans to wonder if the blessings of liberty will remain available to their posterity. It’s incumbent upon you to provide that understanding. In short, a Warren Doctrine should tackle all four head-on.
Addressing that contradiction should come first. Its essence is this: we Americans believe that we are a peaceful people. Our elected and appointed leaders routinely affirm this as true. Yet our nation is permanently at war. We Americans also believe that we have a pronounced aversion to empire. Indeed, our very founding as a republic testifies to our anti-imperial credentials. Yet in Washington, D.C. — an imperial city if there ever was one — references to the United States of America as the rightful successor to Rome in the era of the Caesars and the British Empire in its heyday abound. And there is more here than mere rhetoric: The military presence of U.S. forces around the planet testifies in concrete terms to our imperial ambitions. We may be an “empire in denial,” but we are an empire.
The point of departure for the Warren Doctrine should be to subject this imperial project to an honest cost-benefit appraisal, demonstrating that it leads inexorably to bankruptcy, both fiscal and moral. Allow militarized imperialism to stand as the central theme of U.S. policy and the national security status quo will remain sacrosanct. Expose its defects and the reordering of national security and other priorities becomes eminently possible.
That reordering ought to begin with three neglected developments that should be at the forefront of a Warren Doctrine. The first is a warming planet. The second is an ongoing redistribution of global power, signified by (but not limited to) the rise of China. The third is a growing cyber-threat to our ever more network-dependent way of life. A Warren Doctrine centered on this trio of challenges will both set you apart from your competitors and enable you to take office with clearly defined priorities — at least until some unexpected event, comparable to the fall of the Berlin Wall or the attack on the Twin Towers, obliges you to extemporize, as will inevitably happen.
Here, then, is a CliffsNotes take on each of the Big Three. (You can hire some smart young folk to fill in the details.)
Climate change poses a looming national security threat with existential implications. With this summer’s heat waves and recent staggering storms, evidence of this threat has become incontrovertible. Its adverse consequences have already ruined thousands of American lives as evidenced by Hurricanes Katrina (2005), Irma (2017), Harvey (2017), Maria (2017), and Michael (2018), along with Superstorm Sandy (2012), not to mention pervasive drought and increasingly destructive wildfires in a fire season that hardly seems to end. It no longer suffices to categorize these as Acts of God.
The government response to such events has, to say the least, been grossly inadequate. So, too, has government action to cushion Americans from the future impact of far more of the same. A Warren administration needs to make climate change a priority, improving both warning and response to the most immediate dangers and, more importantly, implementing a coherent long-term strategy aimed at addressing (and staunching) the causes of climate change. For those keen for the United States to shoulder the responsibilities of global leadership, here’s an opportunity for us to show our stuff.
Second, say goodbye to the conceit of America as the “last” or “sole” superpower. The power shift now well underway, especially in East Asia, but also in other parts of the world, is creating a multipolar global order in which — no matter what American elites might fancy — the United States will no longer qualify as the one and only “indispensable nation.” Peace and stability will depend on incorporating into that order other nations with their own claims to indispensability, preeminently China.
And no, China is not our friend and won’t be. It’s our foremost competitor. Yet China is also an essential partner, especially when it comes to trade, investment, and climate change — that country and the U.S. being the two biggest emitters of greenhouse gases. So classifying China as an enemy, an idea now gaining traction in policy circles, is the height of folly. Similarly, playing games of chicken over artificial islands in the South China Sea, citing as an imperative “freedom of navigation,” exemplifies the national security establishment’s devotion to dangerously obsolete routines.
Beyond China are other powers, some of them not so new, with interests that the United States will have to take into account. Included in their ranks are India, Russia, Turkey, Japan, a potentially united Korea, Iran (not going away any time soon), and even, if only as a matter of courtesy, Europe. Recognizing the imperative of avoiding a recurrence of the great power rivalries that made the twentieth century a bath of blood, a Warren administration should initiate and sustain an intensive diplomatic dialogue directed at negotiating lasting terms of mutual coexistence — not peace perhaps but at least a reasonable facsimile thereof.
Then there’s that cyber-threat, which has multiple facets. First, it places at risk networks on which Americans, even tech-challenged contributors to TomDispatch like me, have become dependent. Yet deflecting these threats may invite “solutions” likely to demolish the last remnants of our personal privacy while exposing Americans to comprehensive surveillance by both domestic and foreign intelligence services. A Warren Doctrine would have to ensure that Americans enjoy full access to the “network of things,” but on their own terms, not those dictated by corporate entities or governments.
Second, the same technologies that allow the Pentagon to equip U.S. forces with an ever-expanding and ever-more expensive arsenal of “smart” weapons are also creating vulnerabilities that may well render those weapons useless. It’s a replication of the Enigma phenomenon: to assume that your secrets are yours alone is to invite disaster, as the Nazis learned in World War II when their unbreakable codes turned out to be breakable. A Warren Doctrine would challenge the assumption, omnipresent in military circles, that equates advances in technology with greater effectiveness. If technology held the key to winning wars, we’d have declared victory in Afghanistan many moons ago.
Finally, there is the dangerous new concept of offensive cyber-warfare, introduced by the United States when it unleashed the Stuxnet virus on Iran’s nuclear program, an operation that came to light back in 2010. Now, as the Trump administration prepares to make American offensive cyber-operations far more likely, it appears to be the coming thing — like strategic bombing in the run-up to World War II or nukes in its aftermath. Yet before charging further down that cyber-path, we would do well to reflect on the consequences of the twentieth century’s arms races. They invariably turned out to be far more expensive than anticipated, often with horrific results. A Warren Doctrine should seek to avert the normalization of offensive cyber-warfare.
Let me mention a potential bonus here. Even modest success in addressing the Big Three may create openings to deal with some of those nagging legacy issues as well. Cooperation among great powers on climate change, for example, could create an environment more favorable to resolving regional disputes.
Of course, none of this promises to be easy. Naysayers will describe a Warren Doctrine of this sort as excessively ambitious and insufficiently bellicose. Yet as President Kennedy declared in 1962, when announcing that the United States would go to the moon within the decade, some goals are worthy precisely “because they are hard.” Back then, Americans thrilled to Kennedy’s promises.
Here’s my bet: This may well be another moment when Americans will respond positively to goals that are hard but also daring and of pressing importance. Make yourself the champion of those goals and you just might win yourself a promotion to the White House.
The road between now and November 2020 is a long one. I wish you well as you embark upon the journey.
Unsolicited Advice for an Undeclared Presidential Candidate
Donald Trump’s tenure as the 45th U.S. president may last another few weeks, another year, or another 16 months. However unsettling the prospect, the leaky vessel that is the S.S. Trump might even manage to stay afloat for a second term. Nonetheless, recent headline-making revelations suggest that, like some derelict ship that’s gone aground, the Trump presidency may already have effectively run its course. What, then, does this bizarre episode in American history signify?
Let me state my own view bluntly: forget the atmospherics. Despite the lies, insults, name calling, and dog whistles, almost nothing of substance has changed. Nor will it.
To a far greater extent than Trump’s perpetually hyperventilating critics are willing to acknowledge, the United States remains on a trajectory that does not differ appreciably from what it was prior to POTUS #45 taking office. Post-Trump America, just now beginning to come into view, is shaping up to look remarkably like pre-Trump America.
I understand that His Weirdness remains in the White House. Yet for all practical purposes, Trump has ceased to govern. True, he continues to rant and issue bizarre directives, which his subordinates implement, amend, or simply disregard as they see fit.
Except in a ceremonial sense, the office of the presidency presently lies vacant. Call it an abdication-in-place. It’s as if British King Edward VIII, having abandoned his throne for “the woman I love,” continued to hang around Buckingham Palace fuming about the lack of respect given Wallis and releasing occasional bulletins affirming his admiration for Adolf Hitler.
In Trump’s case, it’s unlikely he ever had a more serious interest in governing than Edward had in performing duties more arduous than those he was eventually assigned as Duke of Windsor. Nonetheless, the 60-plus million Americans who voted for Trump did so with at least the expectation that he was going to shake things up.
And bigly. Remember, he was going to “lock her up.” He would “drain the swamp” and “build a wall” with Mexico volunteering to foot the bill. Without further ado, he would end “this American carnage.” Meanwhile, “America First” would form the basis for U.S. foreign policy. Once Trump took charge, things were going to be different, as he and he alone would “make America great again.”
Yet the cataclysm that Trump’s ascendency was said to signify has yet to occur. Barring a nuclear war, it won’t.
If you spend your days watching CNN or MSNBC or reading columnists employed by the New York Times and the Washington Post, you might conclude otherwise. But those are among the institutions that, on November 8, 2016, suffered a nervous breakdown from which they have yet to recover. Nor, it now seems clear, do they wish to recover as long as Donald Trump remains president. To live in a perpetual state of high dudgeon, denouncing his latest inanity and predicting the onset of fascism, is to enjoy the equivalent of a protracted psychic orgasm, one induced by mutual masturbation.
Yet if you look beyond the present to the fairly recent past, it becomes apparent that change on the scale that Trump was promising had actually occurred, even if well before he himself showed up on the scene. The consequences of that Big Change are going to persist long after he is gone. It’s those consequences that now demand our attention, not the ongoing Gong Show jointly orchestrated by the White House and journalists fancying themselves valiant defenders of Truth.
Trump himself is no more than a pimple on the face of this nation’s history. It’s time to step back from the mirror and examine the face in full. Pretty it’s not.
The Way We Were
Compare the America that welcomed young Donald Trump into the world in 1946 with the country that, some 70 years later, elected him president. As the post-World War II era was beginning, three large facts — so immense that they were simply taken for granted — defined America.
First, the United States made everything and made more of it than anyone else. In postwar America, wealth derived in large measure from the manufacture of stuff: steel, automobiles, refrigerators, shoes, socks, blouses, baseballs, you name it. “Made in the USA” was more than just a slogan. With so much of the industrialized world in ruins, the American economy dominated and defined everyday economic reality globally.
Second, back then while the mighty engine of industrial capitalism was generating impressive riches, it was also distributing the benefits on a relatively equitable basis. Postwar America was the emblematic middle-class country, the closest approximation to a genuinely classless and democratic society the world had ever seen.
Third, having had their fill of fighting from 1941 to 1945, Americans had a genuine aversion to war. They may not have been a peace-loving people, but they knew enough about war to see it as a great evil. Avoiding its further occurrence, if at all possible, was a priority, although one not fully shared by the new national security establishment just then beginning to flex its muscles in Washington.
Now, by twenty-first-century standards, many, perhaps nearly all, Americans of that era were bigots of one sort or another. Racism, sexism, and homophobia flourished, lamented by some, promoted by others, tolerated by the vast majority. An anti-communist political hysteria, abetted by cynical politicians, also flourished. Americans worked themselves into a tizzy over the putative threat posed by small numbers of homegrown subversives. And they fouled the air, water, and soil with abandon. Add to this list violence, crime, corruption, sexual angst, and various forms of self-abuse. Taken as a whole, American society, as it existed when Trump was growing up, was anything but perfect. Yet, for all that, postwar Americans were the envy of the world. And they knew it.
By 2016, when Trump was elected president, America had become an altogether different country. Without actually disappearing, racism, sexism, and homophobia had — at least for the moment — gone underground. Attitudes toward people of color, women, and gays that a half-century earlier had been commonplace were now largely confined to a pathological fringe. Hysteria about communists had essentially disappeared, only to be replaced by hysteria over Islamic terrorists. Pollution, of course, persisted, as did violence, crime, corruption, and sexual angst. New and more imaginative forms of self-destructive behavior had made their appearance.
Yet little of that turned out to be central. What had truly changed in the decades since Trump was a babe in arms were those three taken-for-granted facts that had once distinguished the United States. New realities emerged to invert them.
By 2016, the U.S. was no longer by any stretch of the imagination the place that made everything, though it bought everything, often made elsewhere. It had long since become the ultimate consumer society, with Americans accustomed to acquiring and enjoying more than they produced or could afford. Accounts no longer balanced. The government lived on credit, assuming that the bills would never come due. So, too, did many citizens.
By 2016, the U.S. had long since become a deeply unequal society of haves and have-nots. Finance capitalism, the successor to industrial capitalism, was creating immense fortunes without even pretending to distribute the benefits equitably. Politicians still routinely paid tribute to the Great American Middle Class. Yet the hallmarks of postwar middle-class life — a steady job, a paycheck adequate to support a family, the prospect of a pension — were rapidly disappearing. While Americans still enjoyed freedom of a sort, many of them lacked security.
By 2016, Americans had also come to accept war as normal. Here was “global leadership” made manifest. So U.S. troops were now always out there somewhere fighting, however obscure the purpose of their exertions and however dim their prospects of achieving anything approximating victory. The 99% of Americans who were not soldiers learned to tune out those wars, content merely to “support the troops,” an obligation fulfilled by offering periodic expressions of reverence on public occasions. Thank you for your service!
The Way We Are
But note: Donald Trump played no role in creating this America or consigning the America of 1946 to oblivion. As a modern equivalent of P.T. Barnum, he did demonstrate considerable skill in exploiting the opportunities on offer as the strictures of postwar America gave way. Indeed, he parlayed those opportunities into fortune, celebrity, lots of golf, plenty of sex, and eventually the highest office in the land. Only in America, as we used to say.
In 1946, it goes without saying, he would never have been taken seriously as a would-be presidential candidate. By 2016, his narcissism, bombast, vulgarity, and talent for self-promotion nicely expressed the underside of the prevailing zeitgeist. His candidacy was simultaneously preposterous and strangely fitting.
By the twenty-first century, the values that Trump embodies had become as thoroughly and authentically American as any of those specified in the oracular pronouncements of Thomas Jefferson, Abraham Lincoln, or Franklin Roosevelt. Trump’s critics may see him as an abomination. But he is also one of us.
And here’s the real news: the essential traits that define America today — those things that make this country so different from what it seemed to be in 1946 — will surely survive the Trump presidency. If anything, he and his cronies deserve at least some credit for sustaining just those traits.
Candidate Trump essentially promised Americans a version of 1946 redux. He would revive manufacturing and create millions of well-paying jobs for working stiffs. By cutting taxes, he would put more money in the average Joe or Jill’s pocket. He would eliminate the trade deficit and balance the federal budget. He would end our endless wars and bring the troops home where they belong. He would oblige America’s allies, portrayed as a crew of freeloaders, to shoulder their share of the burden. He would end illegal immigration. He would make the United States once more the God-fearing Christian country it was meant to be.
How seriously Trump expected any of those promises to be taken is anyone’s guess. But this much is for sure: they remain almost entirely unfulfilled.
True, domestic manufacturing has experienced a slight uptick, but globalization remains an implacable reality. Unless you’ve got a STEM degree, good jobs are still hard to come by. Ours is increasingly a “gig” economy, which might be cool enough when you’re 25, but less so when you’re in your sixties and wondering if you’ll ever be able to retire.
While Trump and a Republican Congress delivered on their promise of tax “reform,” its chief beneficiaries will be the rich, further confirmation, if it were needed, that the American economy is indeed rigged in favor of a growing class of plutocrats. Trade deficit? It’s headed for a 10-year high. Balanced budget? You’ve got to be joking. The estimated federal deficit next year will exceed a trillion dollars, boosting the national debt past $21 trillion. (Trump had promised to eliminate that debt entirely.)
And, of course, the wars haven’t ended. Here is Trump, just last month, doing his best George McGovern imitation: “I’m constantly reviewing Afghanistan and the whole Middle East,” he asserted. “We never should have been in the Middle East. It was the single greatest mistake in the history of our country.” Yet Trump has perpetuated and, in some instances, expanded America’s military misadventures in the Greater Middle East, while essentially insulating himself from personal responsibility for their continuation.
As commander-in-chief, he’s a distinctly hands-off kind of guy. Despite being unable to walk, President Franklin Roosevelt visited GIs serving in combat zones more often than Trump has. If you want to know why we are in Afghanistan and how long U.S. forces will stay there, ask Defense Secretary James Mattis or some general, but don’t, whatever you do, ask the president.
On Not Turning America’s Back on the World
And then there is the matter of Trump’s “isolationism.” Recall that when he became president, foreign policy experts across Washington warned that the United States would now turn its back on the world and abandon its self-assigned role as keeper of order and defender of democracy. Now, nearing the mid-point of Trump’s first (and hopefully last) term, the United States remains formally committed to defending the territorial integrity of each and every NATO member state, numbering 29 in all. Add to that an obligation to defend nations as varied as Japan, South Korea, and, under the terms of the Rio Pact of 1947, most of Latin America. Less formally but no less substantively, the U.S. ensures the security of Israel, Saudi Arabia, and various other Persian Gulf countries.
As for obliging those allies to pony up more for the security we have long claimed to provide, that’s clearly not going to happen any time soon. Our European allies have pocketed both Trump’s insults and his assurances that the United States will continue to defend them, offering in return the vaguest of promises that, sometime in the future, they might consider investing more in defense.
By the by, U.S. forces under Donald Trump’s ostensible command are today present in more than 150 countries worldwide. Urged on by the president, Congress has passed a bill that boosts the Pentagon budget to $717 billion, an $82 billion increase over the prior year. Needless to say, no adversary or plausible combination of adversaries comes anywhere close to matching that figure.
To call this isolationism is comparable to calling Trump svelte.
As for the promised barrier, that “big, fat, beautiful wall,” to seal the southern border, it has advanced no further than the display of several possible prototypes. No evidence exists to suggest that Mexico will, as Trump insisted, pay for its construction, nor that Congress will appropriate the necessary funds, estimated at somewhere north of $20 billion, even with Republicans still controlling both houses of Congress. And in truth, whether it is built or not, the U.S.-Mexico border will remain what it has been for decades: heavily patrolled but porous, a conduit for desperate people seeking safety and opportunity, but also for criminal elements trafficking in drugs or human beings.
The point of this informal midterm report card is not to argue that Donald Trump has somehow failed. It is rather to highlight his essential irrelevance.
Trump is not the disruptive force that anti-Trumpers accuse him of being. He is merely a noxious, venal, and ineffectual blowhard, who has assembled a team of associates who are themselves, with few exceptions, noxious, venal, or ineffectual.
So here’s the upshot of it all: if you were basically okay with where America was headed prior to November 2016, just take a deep breath and think of Donald Trump as the political equivalent of a kidney stone — not fun, but sooner or later, it will pass. And when it does, normalcy will return. Soon enough you’ll forget it ever happened.
If, on the other hand, you were not okay with where America was headed in 2016, it’s past time to give up the illusion that Donald Trump is going to make things right. Eventually a pimple dries up and disappears, often without leaving a trace. Such is the eventual destiny of Donald Trump as president.
In the meantime, of course, there are any number of things about Trump to raise our ire. Climate change offers a good example. And yet climate change may be the best illustration of Trump’s insignificance.
Under President Obama, the United States showed signs of mounting a belated effort to address global warming. The Trump administration wasted little time in reversing course, reverting to the science-denying position to which Republicans adhered long before Trump himself showed up.
No doubt future generations will find fault with Trump’s inaction in the face of this crisis. Yet when Miami is underwater and California wildfires rage throughout the year, Trump himself won’t be the only — or even the principal — culprit charged with culpable neglect.
The nation’s too-little, too-late response to climate change, for which a succession of presidents shares responsibility, illustrates the great and abiding defect of contemporary American politics. When all is said and done, presidents don’t shape the country; the country shapes the presidency — or at least it defines the parameters within which presidents operate. Over the course of the last few decades, those parameters have become increasingly at odds with the collective wellbeing of the American people, not to mention of the planet as a whole.
Yet Americans have been obdurate in refusing to acknowledge that fact.
Americans today are deeply divided. There exists no greater symbol of that division than Trump himself — the wild enthusiasm he generates in some quarters and the antipathy verging on hatred he elicits in others.
The urgent need of the day is to close that divide, which is as broad as it is deep, touching on culture, the political economy, America’s role in the world, and the definition of the common good. I submit that these matters lie beyond any president’s purview, but especially this one’s.
Trump is not the problem. Think of him instead as a summons to address the real problem, which in a nation ostensibly of, by, and for the people is the collective responsibility of the people themselves. For Americans to shirk that responsibility further will almost surely pave the way for more Trumps — or someone worse — to come.