On February 15, Donald Trump declared a national emergency in order to fund his “great, great” border wall without having to go through Congress. There is, of course, no emergency, despite the rape fantasy that the president has regularly tried to pass off as public policy. In speech after speech, including his declaration of that emergency, he has told the same story: the United States needs a border wall to prevent sex traffickers from driving women, bound with duct tape, into the country.

“Women are tied up,” he typically says. “They’re bound. Duct tape put around their faces, around their mouths. In many cases they can’t even breathe.”

It’s a scenario he’s only continued to elaborate over time. “They have tape over their mouths, electrical tape, usually blue tape, as they call it. It’s powerful stuff. Not good. And they have three, four, five of them in vans, or three of them in back seats of cars.” As they approach ports of entry, he swears, the vehicles carrying them “get off the road, and they drive out into the desert and they come in, they make a left turn — usually it’s a left, not a right.”

Fact-checkers and experts in border sex trafficking have been quick to insist that they know of no such incidents, however elaborately imagined — not one. Instead, most women and children forced into prostitution, they report, enter the country through legal ports of entry.

Border Patrol headquarters even sent out a request asking agents to provide any evidence whatsoever that might help support the president’s tall tales. None apparently did. It’s worth noting that Trump first added stories of duct-taped women to his border repertoire in early January, not long after the heartbreaking discovery of two Saudi sisters, 16 and 22, found dead in New York City’s Hudson River, duct-taped together. Their deaths were ruled suicides, committed after the United States denied them asylum and ordered them deported to Saudi Arabia, a close American ally. Their bodies even washed up near West 68th Street and Riverside Drive, close to Trump Place Condominiums. (He seems inescapable.)

In any case, one doesn’t need Sigmund Freud to grasp the crude displacement evidently underway here. By narrating the “crisis” on the border in a pornographic manner, painting it as a hellscape ruled by MS-13 murderers and rapists, President Trump is undoubtedly using ever more salacious fables to sublimate guilty desires, as well as his and the nation’s complicity in hellish atrocities.

Currently, Immigration and Customs Enforcement, or ICE, has nearly 50,000 migrants in custody. That’s roughly the number of people Canada incarcerates in its entire prison system. And no one knows how many migrant children the U.S. is detaining, except that the number is much higher than the 2,737 listed in court documents. The Department of Health and Human Services can’t even provide journalists with an accurate count: “The total number of children separated from a parent or guardian by immigration authorities is unknown” is all its spokespeople can say.

Many of those children are housed in tent compounds in the desert or vacant Walmarts, forced to eat in shifts and sleep on the floors of chain-link cells covered only by a thin, metallic blanket. In one Florida detention center, children are packed “like sardines” in large halls stacked with bunk beds with little room even to walk. At such places, they are reportedly taunted or even sexually terrorized, either by staff or older migrants. They are overprescribed psychotropic drugs to numb them, given pills to make them sleep, and often refused medical attention when sick.

Border Patrol agents have even reportedly snatched babies from their mothers as they were breastfeeding them. Families have been tear gassed at the border and children have already died in Border Patrol custody (though “custody” is undoubtedly too soft a word to describe what the U.S. is doing to the progeny of nearby republics). “These kids are incarcerated,” said an MSNBC reporter who visited one of the detention complexes.

Some of the incarcerated migrant children are then delivered to a Christian adoption service with links to Trump’s Secretary of Education, Betsy DeVos. According to the Associated Press, the Trump administration has all but given up trying to reunite children placed in “sponsor” homes with their actual families, since returning them, according to the Department of Health and Human Services, “would present grave child welfare concerns.”

Make Heaven Weep

Racial and sexual violence on the border has a long history. In Washington’s 1846 war on Mexico, for instance, which established the current boundary between the two countries, state militia volunteers and Army regulars rampaged across that region, burning churches, raping women, and scalping men.


On February 9, 1847, for example, a member of an Arkansas volunteer regiment raped a Mexican woman near the regiment’s camp at Agua Nueva in the state of Coahuila and Mexicans retaliated by killing a U.S. soldier. In response, more than 100 of those Arkansas volunteers cornered a group of war refugees in a cave. Screaming “like fiends,” according to one witness, they raped and slaughtered their victims, even as the women and children among them were “shrieking for mercy.” By the time it ended, scores of Mexicans lay dead or dying on a cave floor thick with clotted blood. Many of them had been scalped. (That’s hardly surprising since more than a few of those U.S. Army volunteers had, in the pre-war years, made their livings on those same borderlands by scalping Apaches for bounty money, or “barbering” them, as one Texan scalp-hunter put it.)

Even before that massacre, General Winfield Scott, commander of U.S. forces, wrote Washington to complain of other atrocities being committed by such volunteers, organized under the command of future president Zachary Taylor. The crimes of Taylor’s men, Scott said, were so heinous they would “make Heaven weep.”

When the war ended, Washington had taken all of Mexico’s northern territories, including all or parts of present-day Arizona, New Mexico, California, Nevada, western Colorado, Utah, and southwestern Wyoming. About 500,000 square miles, home to an estimated 80,000 to 100,000 people, had been added to the United States.

The Immigration and Naturalization Service’s Chief Pimp

Sexual violence only continued, committed by members of the Border Patrol (founded in 1924) and other security forces like the Texas Rangers.

Starting in the 1970s, ever more middle-class families in the U.S. began hiring undocumented Mexican women as live-in servants, cooks, maids, and nannies. Many ended up far from home in peonage-like conditions, unable to leave the houses in which they worked, and some were quickly not just trapped but sexually and emotionally battered. One was locked in a house in Nevada for months, according to a witness: “She worked from sunup to way after dark. She requested that her wages be sent to her father in Mexico. No money was ever sent to her father. This went on for about a year and a half. Then she flipped — she became insane, broke out of the house and ran down the street. That’s when the Border Patrol got her.”

Others were raped by their employers and, if they complained, beaten or told that they would be handed over to the Border Patrol, which came to double as a labor procurement service for wealthy households and large ranchers. During those years, in fact, the Border Patrol and the Immigration and Naturalization Service (INS) were notoriously corrupt, violent federal institutions. In Texas, Border Patrollers worked closely with ranchers, delivering workers to their properties (including one owned by Lyndon Johnson when he was still president), then raiding those properties just before payday and deporting the same workers. “The ranchers got their crops harvested for free, the INS men got fishing and hunting privileges on the ranches, and the Mexicans got nothing,” a New York Times reporter, John Crewdson, wrote.

An investigation into INS corruption revealed that agency officials traded young Mexican women caught at the border to the Los Angeles Rams for season tickets. One such official was known within the INS as the service’s “chief pimp.” Part of his job was to help other officials and politicians, including New Jersey Democrat Peter Rodino (who presided over Richard Nixon’s impeachment in the House of Representatives), “get laid” by arranging visits to Mexican brothels.

In his memoir, a former guard, Tony Hefner, described the INS detention center in Port Isabel, Texas — overflowing in the 1980s with refugees from President Ronald Reagan’s Central American wars — as essentially a rape camp. There, underage Salvadoran women, summoned by the center’s guards and wardens, were forced to dance, watch gore films like The Texas Chainsaw Massacre, and submit to sexual demands. They were given abortion pills in case such encounters resulted in pregnancy.

Human Prey

For decades, the border also gave license to nativist fantasies, as vigilantes of one sort or another ran wild there.

In the era after the United States lost its war in Vietnam and began its fast turn toward deindustrialization, such fantasies became ever more sadistic. In 1990, for instance, a group of San Diego high school students formed a neo-Nazi paramilitary group they called the Metal Militia and began staging “war games” on the border, hunting down and robbing migrants. The spree was notable because it was covered by a new broadcast network, Fox, on a show called The Reporters.

Racism and nativism would become Fox News’ bread and butter, but here the network went for sensationalism, titling the episode “Human Prey.” Its host, former Newsday investigative journalist Bob Drury, depicted migrants sympathetically. In a wide-lapelled white blazer, he interviewed one vigilante who estimated that about 10 militant groups in the San Diego County area would “hunt, track, and stalk” migrants for sport. The film crew accompanied one such group as it captured a family, including a baby and a terrified grandmother.

Drury linked this upsurge in border extremism to the end of the Vietnam War: many of the vigilantes were veterans of that war. Others were teenagers who modeled their tactics, including the setting of booby traps, on Vietnam War movies they had seen. The most disturbing portions of Drury’s report were his interviews with vigilantes. Disguised so as not to be recognizable, they expressed unalloyed hate. “Grab a kid,” one said, discussing his favored method of terrorizing migrants, and “nobody is going to do anything.”

Rome at the Colosseum

“Human Prey” helped launch a genre of TV “border patrol porn.” Even before Trump came on the political scene, the National Geographic channel ran five seasons of Border Wars. Since then, more such shows have aired, including Discovery Channel’s Border Live and Netflix’s Border Security. Copying the style of law-and-order series like Cops, these shows offer viewers ride-alongs with Border Patrol agents as they guard the country’s frontier. The set-up is familiar: greenish night-goggle cinematography, Black Hawk helicopters, battered-down doors, and sunrise jeep runs through mesquite scrub. While driving, Border Patrol agents in dark sunglasses hold forth on life, duty, manhood, and their occasional doubts, as an unseen camera films them from the passenger seat or back seat.

One episode from season two of Border Wars, “Lost in the River,” reveals a common, often deadly Border Patrol practice: the use of helicopters and all-terrain vehicles to scatter border crossers, forcing them ever deeper into the dangerous desert or fast-flowing rivers. It’s a game — patrollers play scatter, chase, catch; migrants surrender or die — that pits desperate people with next to no resources against one of the best-funded, high-tech, armed-to-the-teeth law enforcement agencies on earth. “We’ll let him tire himself out. If he wants to run, we’ll let him run,” says one agent. “You kind of have to pick your battles, and I usually pick the one who runs the most… We’ve got bodies running all over the place… It’s a never-ending game for us.”

Some of those migrants are chased back into Mexico, others caught, but many simply disappear and die, either from drowning or dehydration. Those who do make it to the United States go on to work at some of the lowest-paying but essential jobs around: they pick crops, slaughter and pack meat, clean houses, tend to the sick, watch kids — and for the privilege of all this, the federal government has put them through a dystopian death race, which is then transformed into reality-show entertainment for the masses. Watching such spectacles on cable TV, it’s hard not to feel that the United States is now ancient Rome — an empire that, in its later years, held compassion to be a vice — and the whole of that southwestern desert our Colosseum.

Occasionally, these shows humanize immigrants, but only long enough to super-humanize their pursuers. In one Border Wars episode, a group of 24 detained migrants sit around in the cold morning desert air, looking alternately scared and bored. “It tugs at your heart string[s],” says one of the Border Patrollers who chased them down. “When you see people that are in a bad position, you know, it’s tough, it plays on you emotionally as an agent, even though you have a job to do. To keep America safe.” None of these shows, however, reveal what happens off screen, including reports that Border Patrollers gratuitously tackle non-resisting migrants, beat those they catch, piss on their belongings, destroy their sources of drinking water, and deny them humanitarian aid.

If the images that do appear on screen sooner or later numb the moral senses, viewers looking to up the ante can always click over to PornHub, which offers a whole subgenre of actual Border Porn, including actors dressed as border agents and as migrants: “If you are caught, you are fucked” is the title of one video.

“Like the Sabine virgins,” the New York Herald wrote a century and a half ago about how Mexicans would come to enjoy being ruled by Washington, Mexico “will soon learn to love her ravishers.”

Trump’s Necromancers

Maybe there’s a better metaphor than describing the United States as decadent Rome. Maybe Trump’s wall, whether built or not, is psychologically refashioning the country into a besieged medieval fortress, complete with its own cult of martyrs. As a candidate, Trump campaigned with the victims (or the families of victims) of crimes committed by undocumented immigrants, using their grief to stoke grievances. As president, one of his first acts was to establish a government office charged with providing support services to “victims of crimes committed by removable aliens.” (Never mind that such aliens have a lower crime rate here than the general population.) Trump’s never happier than when, at one of his rallies or speeches, he’s able to call the name of someone who had a family member killed or raped by an undocumented immigrant.

A few years before Trump’s election, as Robin Reineke of the Colibri Center for Human Rights has reported, the sort of men who would later become Trump’s followers began showing up at Tea Party conventions with binders full of photographs of migrant corpses, gruesome images of the desiccated remains of those who had died in the desert trying to enter the United States. The anti-migrant activists who displayed such books of the dead claimed they were humanitarians, trying to raise support to build a wall to stop poor migrants from crossing over and so dying. But really they, like the president today, were necromancers, a kind of American priesthood of the lost frontier, offering a new litany of hate and using the fetish pornography of death to reassure racists that their cruelty was actually kindness.

Bread, Circuses, and Duct Tape

Fleeing Forward

Poetry was the language of the frontier, and the historian Frederick Jackson Turner was among its greatest laureates. “The United States lies like a huge page in the history of society,” he wrote in 1893. “Line by line as we read this continental page from West to East we find the record of social evolution.”1 Expansion across the continent, Turner said, made Europeans into something new, into a people both coarse and curious, self-disciplined and spontaneous, practical and inventive, filled with a “restless, nervous energy” and lifted by “that buoyancy and exuberance which comes with freedom.” Turner’s scholarly career spanned the late nineteenth and early twentieth centuries, during the height of Jim Crow and the consolidation of anti-miscegenation and nativist exclusion laws, with the KKK resurgent. Mexican workers were being lynched in Texas, and the U.S. military was engaged in deadly counterinsurgencies in the Caribbean and Pacific. But what became known as Turner’s Frontier Thesis—which argued that the expansion of settlement across a frontier of “free land” created a uniquely American form of political equality, a vibrant, forward-looking individualism—placed a wager on the future.

The kind of Americanism Turner represented took all the unbounded optimism that went into the founding of the United States and bet that the country’s progress, moving forward on the frontier and into the world, would reduce racism to a remnant and leave it behind as residue. It would dilute other social problems as well, including poverty, inequality, and extremism, teaching diverse people how to live together in peace. Frank Norris, in 1902, hoped that territorial expansion would lead to a new kind of universalism, to the “brotherhood of man” when Americans would realize that “the whole world is our nation and simple humanity our countrymen.”2

Facing west meant facing the Promised Land, an Edenic utopia where the American as the new Adam could imagine himself free from nature’s limits, society’s burdens, and history’s ambiguities. No myth in American history has been more powerful, more invoked by more presidents, than that of pioneers advancing across an endless meridian. Onward, and then onward again. There were lulls, doubts, dissents, and counter-movements, notably in the 1930s and 1970s. But the expansionist imperative has remained constant, in one version or another, for centuries. As Woodrow Wilson said in the 1890s, “a frontier people always in our van, is, so far, the central and determining fact of our national history.” “There was no thought,” Wilson said, “of drawing back.”3

So far. The poetry stopped on June 16, 2015, when Donald J. Trump announced his presidential campaign by standing Frederick Jackson Turner on his head. “I will build a great wall,” Trump said.

Trump most likely had never heard of Turner, or his outsized influence on American thought. But there, in the lobby of his tower on Fifth Avenue in Manhattan, he offered his own judgment on history. Referring specifically to the North American Free Trade Agreement and broadly to the country’s commitment to free trade, he said, “We have to stop, and it has to stop now.”

* * *


All nations have borders, and many today even have walls. But only the United States has had a frontier, or at least a frontier that has served as a proxy for liberation, synonymous with the possibilities and promises of modern life itself and held out as a model for the rest of the world to emulate.4

More than a century before its founders won their independence, America was thought of as a process of endless becoming and ceaseless unfurling. In 1651, Thomas Hobbes described British colonialism in America as driven by an “insatiable appetite, or Bulimia, of enlarging dominion.”5 Thomas Jefferson, in a political manifesto he wrote two years before the Declaration of Independence, identified the right “of departing from the country in which chance, not choice” had placed settlers, “of going in quest of new habitations” as an element of universal law.6

True religion moved east to west with the sun, believed early American theologians, and if man could keep pace with its light, perhaps historical time itself could be overcome and decline avoided.7 The West, said one frontier writer, was “the land of mankind’s second chance.”8 It was, said Turner, a place of “perennial rebirth.” Are there new frontiers? The historian Walter Prescott Webb, writing in the early 1950s, said that what that perennial question revealed was nothing less than a rejection of the death instinct. You might as well ask, Webb said, is there a human soul?9 Faith in the regenerative power of the frontier resided in the fact that the West did offer, for many, a chance to shake off their circumstances. More than a few even got rich. The United States was great, in ambition as well as dimension.

The concept of the frontier served as both diagnosis (to explain the power and wealth of the United States) and prescription (to recommend what policy makers should do to maintain and extend that power and wealth). And when the physical frontier was closed, its imagery could easily be applied to other arenas of expansion, to markets, war, culture, technology, science, the psyche, and politics. In the years after World War II, the “frontier” became a central metaphor to capture a vision of a new kind of world order. Past empires established their dominance in an environment where resources were thought to be finite, extending their supremacy to capture as much of the world’s wealth as possible, to the detriment of their rivals. Now, though, the United States made a credible claim to be a different sort of global power, presiding over a world economy premised on endless growth. Washington, its leaders said, didn’t so much rule as help organize and stabilize an international community understood as liberal, universal, and multilateral. The promise of a limitless frontier meant that wealth wasn’t a zero-sum proposition. It could be shared by all. Borrowing frontier language used by Andrew Jackson and his followers in the 1830s and 1840s, postwar planners said the United States would extend the world’s “area of freedom” and enlarge its “circle of free institutions.”10

* * *

The ideal of the frontier contained within itself the terms of its own criticism, which is another reason why it serves as so powerful a national metaphor. Martin Luther King, Jr., argued that the ideal fed into multiple reinforcing pathologies: into racism, a violent masculinity, and moralism that celebrates the rich and punishes the poor. For over a year, from early 1967 until his murder in April 1968—as the United States escalated its war in Vietnam—King put forth, in a series of sermons and press conferences, a damning analysis. Military expansion abroad, he argued, quickened domestic polarization. The “flame throwers in Vietnam fan the flames in our cities,” he said; “the bombs in Vietnam explode at home.” At the same time, constant war served to deflect the worst consequences of that polarization outward.11

King’s point is as simple as it is profound: A constant fleeing forward allowed the United States to avoid a true reckoning with its social problems, such as economic inequality, racism, crime and punishment, and violence. Other critics at the time were coming to similar conclusions. Some scholars argued that imperial expansion let the United States “buy off” its domestic white skilled working class, either through social welfare or higher wages made possible by third world exploitation. Others stressed the political benefits of expansion, which allowed the reconciliation of competing interests.12 Still others emphasized more Freudian, even Jungian, motives: deep-seated violent fantasies, formed in long-ago wars against people of color on the frontier, projected outward; soldiers sublimating their “own guilty desires,” their own complicity in wartime atrocities, with ever more grotesque sadism.13

There is a lot to unpack in the argument that over the long course of U.S. history, endless expansion, either over land or through markets and militarism, deflects domestic extremism. How, for example, might historical traumas and resentments, myths and symbols, be passed down the centuries from one generation to another? Did the United States objectively need to expand in order to secure foreign resources and open markets for domestic production? Or did the country’s leaders just believe they had to expand? Whatever the answers to those questions, the United States, since its founding, pushed outward and justified that push in moral terms, as beneficial equally for the people within and beyond the frontier. The idea of expansion, the historian William Appleman Williams wrote in 1966, was “exhilarating in a psychological and philosophical sense” since it could be “projected to infinity.”14

Not, as it turns out, to infinity.

* * *


The United States is now into the eighteenth year of a war that it will never win. Soldiers who fought in Afghanistan and Iraq in the early 2000s are now seeing their children enlist. A retired Marine general recently said the United States will be in Afghanistan for yet another sixteen years, at least. By that point, the grandchildren of the first generation of veterans will be enlisting. Senator Lindsey Graham believes that the United States is fighting “an endless war without boundaries, no limitation on time or geography.”15 Another former officer (referring to the expansion of military operations into African countries like Niger) said the war “will never end.”16 And grandchildren down the line will be paying its bill, now estimated to approach six trillion dollars.17

While the United States is mired in an endless war, it can no longer imagine endless growth. An entire generation’s expectations have been radically foreshortened, as the 2007–2008 financial collapse has been followed by a perverse kind of recovery, marked by mediocre rates of investment, stockpiled wealth, soaring stocks, and stagnant wages.18 The roots of the current crisis reach back decades, to the economic restructuring that began in the 1980s with farm failures and deindustrialization, and continued forward with financial deregulation, crippling tax cuts, and the entrenchment of low-paying service jobs and personal debt. The nation’s political class, over the course of these decades, sold economic restructuring by ratcheting up the language of limitlessness. “Nothing is impossible,” Ronald Reagan said. “There are no limits to growth.”19 The presidents who followed—George H. W. Bush, Bill Clinton, and George W. Bush—presided over an ideological bubble that proved as unrealistic as a prediction by one of Clinton’s top economists, who in 1998 said that the soon-to-be-busted dot-com boom “will run forever.”20 All four presidents steadily upped the ante, pushing global “engagement” as a moral imperative, a mission that led the United States to the Persian Gulf and to its financially exhausting and morally discrediting global war.

Gaps exist in all nationalisms between ideal and experience. But in the years following defeat in Vietnam, the revival of the myth of rugged individualism and frontier limitlessness—at a moment when deindustrialization was making daily life precarious for an increasing number of people, when more and more people were reaching their limits—created a punishing kind of dissonance. It was used to weaken the mechanisms of social solidarity, especially government-provided welfare and labor unions, just when they were most needed. In the mythology of the West, cowboys don’t join unions.21 The gap between myth and reality has now widened into a chasm.

The United States is a nation founded on the principle that government should leave individuals free to pursue their self-interest. Corruption and greed, even as the United States moved out in the world with a sense of moral mission, have not been foreign qualities. But it’s hard to think of a period in the nation’s history when venality and disillusionment have been so sovereign, when so many of the country’s haves have nothing to offer but disdain for the have-nots.

* * *

The 2016 election of Donald Trump as president of the United States—and all the vitriol his campaign and presidency have unleashed—has been presented by commentators as one of two opposing possibilities. Trumpism either represents a rupture, a wholly un-American movement that has captured the institutions of government; or he is the realization of a deep-rooted American form of extremism. Does Trump’s crass and cruel appeal to nativism represent a break from tradition, from a fitful but persistent commitment to tolerance and equality at home and defense of multilateralism, democracy, and open markets abroad? Or is it but the “dark side,” to use Dick Cheney’s resonant phrase, of U.S. history coming into the light? Breach or continuity?

What’s missing from most commentary is an acknowledgment of the role that expansion, along with the promise of boundlessness, played in relegating racism and extremism to the fringe. To be sure, previous cycles of dislocation have given rise to demagogues similar to Trump, such as George Wallace and Pat Buchanan. But the movements those nativists led remained marginal and were contained—geographically, institutionally, and ideologically. And the United States has had other presidents who were open racists. Before Richard Nixon put his “southern strategy” into place to win the votes of southern neo-Confederates, Woodrow Wilson cultivated what was left of actual Confederates, and their sons and grandsons, into an electoral coalition, re-segregating the federal bureaucracy and legitimating the KKK. Before Wilson, there was Andrew Jackson, who personally drove a slave coffle between Natchez and Nashville and presided over a policy of ethnic cleansing that freed up vast amounts of land for white settlers, putting the full power of the federal government behind the creation of a “Caucasian democracy.”

What distinguishes earlier racist presidents like Jackson and Wilson from Trump, though, is that they were in office during the upswing of America’s moving out in the world, when domestic political polarization could be stanched and the country held together—even after the Civil War nearly tore it apart—by the promise of endless growth. Trumpism is extremism turned inward, all-consuming and self-devouring. There is no “divine, messianic” crusade that can harness and redirect passions outward. Expansion, in any form, can no longer satisfy the interests, reconcile the contradictions, dilute the factions, or redirect the anger.

The “furies,” as the writer Sam Tanenhaus described the conservative fringe that gained ground during Barack Obama’s presidency, have nowhere left to go.22 They whip around the homeland. Trump tapped into various forms of American racism: trading in birtherism, embracing law-and-order extremists, and refusing to distance himself from KKK and Nazi supporters, for instance. But it was the focus on the border and all that went with it—labeling Mexicans rapists, calling migrants snakes and animals, stirring up anger at undocumented residents, proposing to end birthright citizenship, and unleashing ICE agents to raid deep into the country, to stalk schools and hospitals, to split families and spread grief—that provided Trumpism its most compelling through-line message: The world’s horizon is not limitless; not all can share in its wealth; and the nation’s policies should reflect that reality. That argument isn’t new. Over the years, there have been two versions of it. One is humane, a recognition that modern life imposes obligations, that nature’s resources aren’t infinite, and that society should be organized in a way that distributes fortune as fairly as possible. The other thinks that recognition of limits requires domination.

“To live past the end of your myth is a perilous thing,” the Canadian poet Anne Carson once said. With Trump, America finds itself at the end of its myth.

* * *

To talk about the frontier is also to talk about capitalism, about its power and possibility and its promise of boundlessness. Donald Trump figured out that to talk about the border—and to promise a wall—was a way to acknowledge capitalism’s limits, its pain, without having to challenge capitalism’s terms. Trump ran promising to end the wars and to reverse the extreme anti-regulatory and free-market program of his party. Once in office, though, he accelerated deregulation, increased military spending, and expanded the wars.23 But he kept talking about his wall.

That wall might or might not be built. But even if it remains only in its phantasmagorical, budgetary stage, a perpetual negotiating chip between Congress and the White House, the promise of a two-thousand-mile-long, thirty-foot-high ribbon of concrete and steel running along the United States’ southern border serves its purpose. It’s America’s new myth, a monument to the final closing of the frontier. It is a symbol of a nation that used to believe that it had escaped history, or at least strode atop history, but now finds itself trapped by history, and of a people who used to think they were captains of the future, but now are prisoners of the past.

NOTES

INTRODUCTION

1. Turner’s essay, “The Significance of the Frontier in American History,” which is easily found on the internet, has been reproduced widely, including in a volume edited by John Mack Faragher, Rereading Frederick Jackson Turner (1994). All subsequent uncited Turner quotations are from this volume.

2. Frank Norris, “The Frontier Gone at Last,” The Responsibilities of the Novelist: And Other Essays (1903), p. 83.

3. Woodrow Wilson, The Course of American History (1895), pp. 11, 15.

4. Over the years, the Turner thesis and other conceptualizations of the “frontier” have been applied to many countries that incorporated frontier experience into their national mythologies. The United States, however, is distinct both in its long history of expansion and in taking its frontier myth as an exemplary metaphor of capitalism. For applying Turner-like arguments to Russia: Mark Bassin, “Turner, Solov’ev, and the ‘Frontier Hypothesis’: The Nationalist Signification of Open Spaces,” Journal of Modern History 65.3 (1993), pp. 473–511. For comparative settler societies: Lynette Russell, ed., Colonial Frontiers: Indigenous–European Encounters in Settler Societies (2001); Paul Maylam, in South Africa’s Racial Past (2017), p. 52, points out that attempts to apply Turner’s Frontier Thesis to South Africa render its racism explicit. For Brazil: Mary Lombardi, “The Frontier in Brazilian History,” Pacific Historical Review (November 1975), vol. 44, no. 4, pp. 437–57. For comparative South America: Gilbert J. Butland, “Frontiers of Settlement in South America,” Revista Geográfica (December 1966), vol. 66, pp. 93–108; and David Weber and Jane Rausch, eds., Where Cultures Meet: Frontiers in Latin American History (1994).

5. For Hobbes’s connection to the Virginia Company: Patricia Springborg, “Hobbes, Donne and the Virginia Company: Terra Nullius and ‘the Bulimia of Dominium,’ ” History of Political Thought (2015), vol. 36, no. 1, pp. 113–64; and Andrew Fitzmaurice, “The Civic Solution to the Crisis of English Colonization, 1609–1625,” Historical Journal (1999), vol. 42, pp. 25–51, as well as Fitzmaurice, Sovereignty, Property and Empire, 1500–2000 (2014), p. 104.

6. “A Summary View of the Rights of British America,” 1774, available at: http://press-pubs.uchicago.edu/founders/print_documents/v1ch14s10.html.

7. Loren Baritz, “The Idea of the West,” American Historical Review (April 1961), vol. 66, no. 3, pp. 618–40.

8. Paul Horgan, Great River (1954), vol. 2, p. 638.

9. Walter Prescott Webb, The Great Frontier (1951), p. 126.

10. “General Jackson’s Letter,” dated February 12, 1843, and published in Niles’ National Register (March 30, 1844), p. 70.

11. Flame throwers: Rick Perlstein, Nixonland (2010), p. 243; Bombs: “The Casualties of the War in Vietnam” (February 25, 1967), http://www.aavw.org/special_features/speeches_speech_king02.html.

12. Eliot Janeway, The Economics of Crisis: War, Politics, and the Dollar (1968), p. 114; Walter LaFeber, The New Empire (1961).

13. Frances FitzGerald, Fire in the Lake (1972), p. 371. Richard Slotkin’s trilogy on the myth of the frontier in America is the fullest elaboration of such arguments.

14. William Appleman Williams, The Great Evasion (1966), p. 13.

15. Rukmini Callimachi, Helene Cooper, Eric Schmitt, Alan Blinder, and Thomas Gibbons-Neff, “ ‘An Endless War’: Why 4 U.S. Soldiers Died in a Remote African Desert,” New York Times (February 20, 2018).

16. Wesley Morgan and Bryan Bender, “America’s Shadow War in Africa,” Politico (October 12, 2017), https://www.politico.com/story/2017/10/12/niger-shadow-war-africa-243695.

17. According to one report, spending on operations in Iraq and Afghanistan alone—not including the costs of wars in Pakistan, Yemen, Syria, Libya, and sub-Saharan Africa—will top six trillion dollars. “The largest portion of that bill is yet to be paid,” the authors of the report write, referring to interest on deficit spending to finance the operations, as well as the long-term medical care and disability compensation for veterans and their families. Linda Bilmes, “The Financial Legacy of Iraq and Afghanistan: How Wartime Spending Decisions Will Constrain Future National Security Budgets,” HKS Faculty Research Working Paper Series RWP13-006 (March 2013). Neta Crawford’s “U.S. Budgetary Costs of Wars Through 2016,” Watson Institute, Brown University (September 2016), does include spending in Syria, Pakistan, and on Homeland Security: http://watson.brown.edu/costsofwar/files/cow/imce/papers/2016/Costs%20of%20War%20through%202016%20FINAL%20final%20v2.pdf.

18. J. W. Mason, “What Recovery?” Roosevelt Institute (July 25, 2017), http://rooseveltinstitute.org/wp-content/uploads/2017/07/Monetary-Policy-Report-070617-2.pdf; Larry Summers, “The Age of Secular Stagnation,” Foreign Affairs (March–April 2017); Nelson Schwartz, “The Recovery Threw the Middle-Class Dream Under a Benz,” New York Times (September 12, 2018), https://www.nytimes.com/2018/09/12/business/middle-class-financial-crisis.html; David Lazarus, “The Economy May Be Booming, but Nearly Half of Americans Can’t Make Ends Meet,” Los Angeles Times (August 31, 2018), http://www.latimes.com/business/lazarus/la-fi-lazarus-economy-stagnant-wages-20180831-story.html.

19. “Remarks Announcing Candidacy for the Republican Presidential Nomination” (November 13, 1979), http://www.presidency.ucsb.edu/ws/?pid=76116; “Second Inaugural Address” (January 21, 1985), http://avalon.law.yale.edu/20th_century/reagan2.asp.

20. Rudiger Dornbusch, Keys to Prosperity (2002), p. 66.

21. Though in real life they did: Mark Lause, The Great Cowboy Strike: Bullets, Ballots, and Class Conflicts in the American West (2018).

22. Sam Tanenhaus, The Death of Conservatism (2010), p. 99.

23. Andy Kroll, “How Trump Learned to Love the Koch Brothers,” Mother Jones (December 1, 2017), describes the degree to which Trump, despite running against the Kochs, has fulfilled their deregulation agenda. As of this writing, though, Trump’s proposal to impose tariffs on imports has strained his relationship with free-trade Republicans.

Copyright © 2019 by Greg Grandin

The End of the Myth

The point was less to actually build “the wall” than to constantly announce the building of the wall. “We started building our wall. I’m so proud of it,” Donald Trump tweeted. “What a thing of beauty.”

In fact, no wall, or certainly not the “big, fat, beautiful” one promised by Trump, is being built. True, miles of some kind of barrier — barbed wire, chain-link and steel-slat fencing, corrugated panels, and, yes, even lengths of what can only be described as concrete wall — have gone up along the U.S.-Mexico border, starting at least as far back as the administration of President William Taft, early in the last century. Trump has claimed repairs and expansions of these barriers as proof that he is fulfilling his signature campaign promise. Plaques have already been bolted onto upgrades in existing fencing, crediting him with work started and funded by previous administrations.

And yet Trump’s phantasmagorical wall, whether it ever materializes or not, has become a central artifact in American politics. Think of his promise of a more than 1,000-mile-long, 30-foot-high ribbon of concrete and steel running along the southern border of the United States as America’s new myth. It is a monument to the final closing of the frontier, a symbol of a nation that used to believe it had escaped history, but now finds itself trapped by history, and of a people who used to believe they were captains of the future, but now are prisoners of the past.

From Open to Closed Borders

Prior to World War I, the border — established in the late 1840s and early 1850s after the U.S. military invaded Mexico and took a significant part of that country’s territory — was relatively unpoliced. As historian Mae Ngai has pointed out, before World War I the United States “had virtually open borders” in every sense of the term. The only exception: laws that explicitly excluded Chinese migrants. “You didn’t need a passport,” says Ngai. “You didn’t need a visa. There was no such thing as a green card. If you showed up at Ellis Island, walked without a limp, had money in your pocket, and passed a very simple [IQ] test in your own language, you were admitted.”

A similar openness existed at the border with Mexico. “There is no line to indicate the international boundary,” reported Motor Age, a magazine devoted to promoting automobile tourism, in 1909. The only indication that you had crossed into a new country, heading south, was the way a well-graded road turned into a “rambling cross-country trail, full of chuck-holes and dust.”

The next year, the State Department made plans to roll “great coils of barbed wire… in a straight line over the plain” across the open borderland range where Texans and Mexicans ran their cattle. The hope was to build “the finest barbed-wire boundary line in the history of the world.” Not, though, to keep out people, as the border wasn’t yet an obstacle for the Mexican migrant workers who traveled back and forth, daily or seasonally, to work in homes, factories, and fields in the United States. That barbed-wire barrier was meant to quarantine tick-infested longhorn cattle. Both Washington and Mexico City hoped that such a fence would help contain “Texas Fever,” a parasitic disease decimating herds of cattle on both sides of the border and leading to a rapid rise in the cost of beef.

As far as I can tell, the first use of the word “wall” to describe an effort to close off the border came with the tumultuous Mexican Revolution. “American troops,” announced the Department of War in March 1911 during Taft’s presidency, “have been sent to form a solid military wall along the Rio Grande.” Yes, Donald Trump was not the first to deploy the U.S. Army to the border. Twenty thousand soldiers, a large percentage of that military at the time, along with thousands of state militia volunteers, were dispatched to stop the movement of arms and men not out of, but into Mexico, in an effort to cut off supplies to revolutionary forces. Such a “wall” would “prove an object lesson to the world,” claimed the Department of War. The point: to reassure European investors in Mexico that the U.S. had the situation south of the border under control. “The revolution in the republic to the south must end” was the lesson that the soldiers were dispatched to teach.

The revolution, however, raged on, and borderland oil companies like Texaco began building their own private border walls to protect their holdings. Then, in February 1917, two months before the United States entered World War I, Congress enacted, over President Woodrow Wilson’s veto, a set of sweeping constraints on immigration, including literacy tests and entrance taxes; numerical quota restrictions would follow in the 1920s. From that point on, the border sharpened — literally, as lengths of barbed wire were stretched ever further on either side of port-of-entry customs houses.

What follows is a chronology of both the physical fortification of the U.S.-Mexico boundary and the psychic investment in such a fortification — the fantasy, chased by both Democrats and Republicans for more than half a century, that with enough funds, technology, cement, steel, razor ribbon, barbed wire, and personnel, the border could be sealed. This timeline illustrates how some of the most outward-looking presidents, men who insisted that the prosperity of the nation was inseparable from the prosperity of the world, also presided over the erection of a deadly run of border barriers, be they called fences or walls, that would come to separate the United States from Mexico.


A Chronology

1945: The first significant physical barrier, a chain-link fence about five miles long and 10 feet high, went up along the Mexican border near Calexico, California. Its posts and wire mesh were recycled from California’s Crystal City Internment Camp, which had been used during World War II to hold Japanese-Americans.

1968: Richard Nixon’s “southern strategy” famously played to the resentments of white southern Democrats who opposed civil rights. As it turned out, though, the president had another southern strategy in mind as well, a “border strategy.” As historian Patrick Timmons has written, running for president in 1968, Nixon promised to get tough on illegal drugs from Mexico — the “marijuana problem,” he called it. Shortly after winning the White House, he launched “Operation Intercept,” a brief but prophetic military-style, highly theatrical crackdown along the border. That operation created three weeks of chaos, described by National Security Archive analyst Kate Doyle as an “unprecedented slow-down of all plane, truck, car and foot traffic — legitimate or not — flowing from Mexico into the southern United States.” That it was run by two right-wing figures, G. Gordon Liddy and Joe Arpaio, should be a reminder of the continuities between the Nixon era and the kind of demagoguery that now rules the country. Arpaio would become the racist sheriff of Maricopa County, Arizona, who gratuitously imposed humiliating, brutal, and often deadly conditions on his overwhelmingly Latino prisoners. He would also become an early supporter of Donald Trump and would receive the first pardon of Trump’s presidency after a judge found him in criminal contempt in a racial-profiling case. Liddy, of course, went on to run Nixon’s “Plumbers,” the burglars who infamously broke into the Democratic National Committee’s headquarters at the Watergate complex, precipitating the president’s downfall. In his 1996 memoir, Liddy wrote that Operation Intercept wasn’t primarily about stopping the flow of pot. Instead, its “true purpose” was “an exercise in international extortion, pure, simple, and effective, designed to bend Mexico to our will” — to force that country to be more cooperative on a range of policies.

1973-1977: The United States had just lost a war in Vietnam largely because it proved impossible to control a border dividing the two parts of that country. In fact, Secretary of Defense Robert McNamara, desperate to keep North Vietnamese forces from infiltrating South Vietnam, had spent more than $500 million on 200,000 spools of barbed wire and five million fence posts, intending to build a “barrier” — dubbed the “McNamara Line” — running from the South China Sea to Laos. That line failed dismally. The first bulldozed six-mile strip quickly became overgrown with jungle, while its wooden watchtowers were, the New York Times reported, “promptly burned down.” It was as that war ended that right-wing activists first began to call for a “wall” to be built along the U.S.-Mexico border.

Biologist Garrett Hardin, a professor at the University of California, Santa Barbara, was typical. In “Population and Immigration: Compassion or Responsibility?,” an essay in the Ecologist, he wrote: “We might build a wall, literally.” Hardin was an early exponent of what today is called “race realism,” which holds that, in a world of limited resources and declining white birth rates, borders must be “hardened.”

During these years, southern border conflicts were especially acute in California, where Ronald Reagan was then governor. As San Diego’s sprawl began to push against agricultural fields where migrant workers from Mexico toiled, racist attacks on them increased. Vigilantes drove around the back roads of the greater San Diego area shooting at Mexicans from the flatbeds of their pickup trucks. Dozens of bodies were found in shallow graves.

Such anti-migrant violence was fueled, in part, by angry Vietnam veterans who began to carry out what they called “beaner raids” to break up migrant camps. Snipers also took aim at Mexicans crossing the border. Led by the 27-year-old David Duke, the Ku Klux Klan set up a “border watch” in 1977 at the San Ysidro port of entry and received significant support from local Border Patrol agents. Other KKK groups soon set up similar patrols in south Texas, placing leaflets stamped with skulls and crossbones on the doorsteps of Latino residents. Around this time, in the swampy Tijuana estuary, an area that border vigilantes began calling “Little ‘Nam,” U.S. border agents reported finding pitfall traps modeled on the punji traps the Vietnamese had set for American soldiers.

1979: President Jimmy Carter’s administration offered a plan to build a fence along heavily trafficked stretches of the border, but scuttled the idea as the 1980 presidential election approached.

1980-1984: “You don’t build a nine-foot fence along the border between two friendly nations,” Ronald Reagan said on a presidential campaign swing through Texas in September 1980. By taking a swipe at the Carter administration’s plans, he was making a play for that state’s Latino vote, 87% of which had gone to Carter four years earlier. “You document the undocumented workers and let them come in here with a visa,” Reagan said, and let them stay “for whatever length of time they want to stay.”

Then, four years later, President Reagan shifted gears. “Our borders are out of control,” he insisted in October 1984. As he ran for reelection, his administration started pushing the idea that the border could indeed be “sealed” and that the deployment of “high tech” equipment — infrared scopes, spotter planes, night-vision goggles — might provide just such effective control. “New stuff,” claimed a Border Patrol official, though some of the ground sensors being set out along that border were leftovers from Vietnam. In his second term, Reagan did get an immigration reform bill passed that helped more than two million undocumented residents obtain citizenship. But his administration, looking to appease a growing caucus of nativists in the Republican Party, also launched Operation Jobs, sending federal agents into workplaces to round up and deport undocumented workers. In 1984, the Border Patrol saw the largest staff increase in its 60-year history.

1989: In March 1989, a few months before the Berlin Wall fell, the new administration of President George H. W. Bush proposed building a 14-foot-wide, 5-foot-deep border trench south of San Diego. Some likened it to a “moat,” since it would be filled with run-off rainwater. “The only thing they haven’t tried is mining the area,” quipped Robert Martinez, the director of San Diego’s American Friends Service Committee. Opponents called it an “inverted Berlin Wall,” while the White House claimed that the trench would solve both drainage and immigration problems. The idea was shelved.

1992: Richard Nixon’s former speechwriter Patrick Buchanan provided an unexpectedly strong challenge to a sitting president for the Republican nomination, calling, among other things, for a wall or a ditch — a “Buchanan trench,” as he put it — along the U.S.-Mexico border and for the Constitution to be amended so that migrant children born in the country couldn’t claim citizenship. Bush won the nomination, but Buchanan managed to insert a pledge in the Republican platform to build a “structure” on the border. It proved an embarrassment at a moment when there was an emerging post-Cold War consensus among Republican and Democratic Party leaders that a free trade agreement with Mexico had to be encouraged and the border left open, at least for corporations and capital. Bush’s campaign tried to fudge the issue by claiming that a “structure” didn’t necessarily mean a wall, but Buchanan’s people promptly shot back. “They don’t put lighthouses on the border,” his sister and spokesperson Bay Buchanan said.

1993: Having pushed the North American Free Trade Agreement through Congress, President Bill Clinton immediately started to militarize the border, once again significantly increasing the budget and staff of the Border Patrol and supplying it with ever more technologically advanced equipment: infrared night scopes, thermal-imaging devices, motion detectors, in-ground sensors, and software that allowed biometric scanning of all apprehended migrants. Stadium lights went up, shining into Tijuana. Hundreds of miles of what the Clinton White House refused to call a “wall” went up as well. “We call it a fence,” said one government official. “‘Wall’ has kind of a negative connotation.”

The objective was to close off relatively safe urban border crossings and force migrants to use more treacherous places in their attempts to reach the United States, either the creosote flatlands of south Texas or the gulches and plateaus of the Arizona desert. Trips that used to take days now took weeks on arid sands and under a scorching sun. Clinton’s Immigration and Naturalization Service commissioner, Doris Meissner, claimed “geography” as an “ally” — meaning that desert torments would work wonders as a deterrent.

The Clinton White House was so eager to put up a set of barriers that it barely paid attention to the actual borderline, at one point mistakenly running a section of the structure into Mexico, prompting a protest from that country’s government.

Another stretch, spanning 15 miles from the Pacific Ocean, would be built using Vietnam-era steel helicopter landing pads stood on end. Their edges were so sharp that migrants trying to climb over them often severed their fingers. As one observer noted, the use of the pads raised “the chilling possibility” that the U.S. might be able to “wall off the country” with leftover war matériel.

2006: The Secure Fence Act, signed by President George W. Bush after passing Congress with considerable Democratic support, appropriated billions of dollars to pay for drones, a “virtual wall,” aerostat blimps, radar, helicopters, watchtowers, surveillance balloons, razor ribbon, landfill to block canyons, border berms, adjustable barriers to compensate for shifting dunes, and a lab (located at Texas A&M and run in partnership with Boeing) to test fence prototypes. The number of border agents doubled yet again and the length of border fencing quadrupled. Operation Streamline detained and prosecuted migrants en masse, then expedited their deportation (mostly using an immigration reform law Clinton had signed in 1996). Agents from Immigration and Customs Enforcement (created after 9/11) seized children off school buses and tracked undocumented residents deep into liberal states, including in the exclusive Hamptons on New York’s Long Island and in New Bedford, Massachusetts. All told, in his eight years in office, Bush deported two million people, at a rate roughly matched by his successor, Barack Obama.

2013: The Democratic-controlled Senate passed a bill in June 2013 that — in exchange for the promise of a one-time amnesty and a long-shot chance at citizenship for some of the millions of undocumented residents in the country — offered more billions of dollars for policing, fencing, and deportations. According to the New York Times, with a winding down in Iraq and Afghanistan (however brief it would prove to be), defense contractors like Lockheed Martin were betting on a “military-style buildup at the border zone,” hoping to supply even more helicopters, heat-seeking cameras, radiation detectors, virtual fences, watchtowers, ships, Predator drones, and military-grade radar. The bill failed in the House, killed by nativists. But the Democratic Party would continue to fund “tough-as-nails” (in the phrase of New York Democratic Senator Charles Schumer) border security programs that amounted to years of up-armoring the border in what was then referred to as a “border surge.”

No one really knows how many people have died trying to get into the United States since Washington began to make the border tough as nails. Most die of dehydration, hyperthermia, or hypothermia. Others drown in the Rio Grande. Since about 1998, the Border Patrol has reported nearly 7,000 deaths, with groups like the Tucson-based Coalición de Derechos Humanos estimating that the remains of at least 6,000 immigrants have been recovered. These numbers are, however, undoubtedly just a fraction of the actual toll.

June 16, 2015: Donald J. Trump descends an escalator in Trump Tower to the tune of Neil Young’s “Rockin’ in the Free World” to announce his presidential campaign and denounce “Mexican rapists.”

“I will build a great, great wall on our southern border,” he tells Americans. “And I will have Mexico pay for that wall.”

Show Me a 50-Foot Wall…

“Something there is that doesn’t love a wall,” poet Robert Frost once wrote.

Borders, not to mention walls, represent domination and exploitation. But they also symbolize the absurdity of political leaders taking the world as it is and trying to make it as they think it ought to be. However much people might curse border fortifications, they also enjoy subverting them — even if the subversion only lasts a moment, as when citizens of Naco, Sonora, and Naco, Arizona, play an annual volleyball game over the border fence; or when an artist decides to paint “the world’s longest mural” on border fencing; or when families come together to gossip, tell jokes, and pass tamales and sweets between the posts; or when couples get married through the spaces separating the slats. As long as the United States keeps coming up with new ways to fortify the border, people will keep coming up with new ways to beat the border, including tunnels, ramps, catapults and homemade cannons (to launch bales of marijuana to the other side), and GoFundMe campaigns to pay for ladders.

As Janet Napolitano, former governor of Arizona and former director of Homeland Security, once said, “Show me a fifty-foot wall, and I’ll show you a fifty-one-foot ladder.”

Debacle, Inc.

The only person Henry Kissinger flattered more than President Richard Nixon was Mohammad Reza Pahlavi, the Shah of Iran. In the early 1970s, the Shah, sitting atop an enormous reserve of increasingly expensive oil and a key figure in Nixon and Kissinger’s move into the Middle East, wanted to be dealt with as a serious person. He expected his country to be treated with the same respect Washington showed other key Cold War allies like West Germany and Great Britain. As Nixon’s national security adviser and, after 1973, secretary of state, Kissinger’s job was to pump up the Shah, to make him feel like he truly was the “king of kings.”

Reading the diplomatic record, it’s hard not to imagine Kissinger’s weariness as he prepared for his sessions with the Shah, considering just what gestures and words would be needed to make it clear that his majesty truly mattered to Washington, that he was valued beyond compare. “Let’s see,” an aide who was helping Kissinger get ready for one such meeting said, “the Shah will want to talk about Pakistan, Afghanistan, Saudi Arabia, the Gulf, the Kurds, and Brezhnev.”

During another prep, Kissinger was told that “the Shah wants to ride in an F-14.” Silence ensued. Then Kissinger began to think aloud about how to flatter the monarch into abandoning the idea. “We can say,” he began, “that if he has his heart set on it, okay, but the President would feel easier if he didn’t have that one worry in 10,000 [that the plane might crash]. The Shah will be flattered.” Once, Nixon asked Kissinger to book the entertainer Danny Kaye for a private performance for the Shah and his wife.

The 92-year-old Kissinger has a long history of involvement in Iran, and his recent opposition to Barack Obama’s Iran nuclear deal, while relatively subdued by present Washington standards, matters. In it lies a certain irony, given his own largely unexamined record in the region. Kissinger’s criticism has focused mostly on warning that the deal might provoke a regional nuclear arms race as Sunni states led by Saudi Arabia line up against Shia Iran. “We will live in a proliferated world,” he said in testimony before the Senate. In a Wall Street Journal op-ed co-authored with another former secretary of state, George Shultz, Kissinger worried that, as the region “trends toward sectarian upheaval” and “state collapse,” the “disequilibrium of power” would likely tilt toward Tehran.

Of all people, Kissinger knows well how easily the best laid plans can go astray and careen toward disaster. The former diplomat is by no means solely responsible for the mess that is today’s Middle East. There is, of course, George W. Bush’s 2003 invasion of Iraq (which Kissinger supported). But he does bear far more responsibility for our proliferated world’s disequilibrium of power than anyone usually recognizes.

Some of his Middle East policies are well known. In early 1974, for instance, his so-called shuttle diplomacy helped deescalate the tensions that had led to the previous year’s Arab-Israeli War. At the same time, however, it locked in Israel’s veto over U.S. foreign policy for decades to come. And in December 1975, wrongly believing that he had worked out a lasting pro-American balance of power between Iran and Iraq, Kissinger withdrew his previous support from the Kurds (whom he had been using as agents of destabilization against Baghdad’s Baathists). Iraq moved quickly to launch an assault on the Kurds that killed thousands and then implemented a program of ethnic cleansing, forcibly relocating Kurdish survivors and moving Arabs into their homes. “Even in the context of covert action ours was a cynical enterprise,” noted a Congressional investigation into his sacrifice of the Kurds.

Less well known is the way in which Kissinger’s policies toward Iran and Saudi Arabia accelerated the radicalization in the region, how step by catastrophic step he laid the groundwork for the region’s spiraling crises of the present moment.

Guardian of the Gulf

Most critical histories of U.S. involvement in Iran rightly begin with the joint British-U.S. coup against democratically elected Prime Minister Mohammad Mosaddegh in 1953, which restored Pahlavi to the Peacock Throne. But it was Kissinger who, in 1972, greatly deepened the relationship between Washington and Tehran. He was the one who began a policy of unconditional support for the Shah as a way to steady American power in the Persian Gulf while the U.S. extracted itself from Southeast Asia. As James Schlesinger, who served as Nixon’s CIA director and secretary of defense, noted, if “we were going to make the Shah the Guardian of the Gulf, we’ve got to give him what he needs.” Which, Schlesinger added, really meant “giving him what he wants.”

What the Shah wanted most of all were weapons of every variety — and American military trainers, and a navy, and an air force. It was Kissinger who overrode State Department and Pentagon objections and gave the Shah what no other country had: the ability to buy anything he wanted from U.S. weapons makers.

“We are looking for a navy,” the Shah told Kissinger in 1973, “we have a large shopping list.” And so Kissinger let him buy a navy.

By 1976, Kissinger’s last full year in office, Iran had become the largest purchaser of American weaponry and housed the largest contingent of U.S. military advisors anywhere on the planet. By 1977, the historian Ervand Abrahamian notes, “the shah had the largest navy in the Persian Gulf, the largest air force in Western Asia, and the fifth-largest army in the whole world.” That meant, just to begin a list, thousands of modern tanks, hundreds of helicopters, F-4 and F-5 fighter jets, dozens of hovercraft, long-range artillery pieces, and Maverick missiles. The next year, the Shah bought another $12 billion worth of equipment.

After Kissinger left office, the special relationship he had worked so hard to establish blew up with the Iranian Revolution of 1979, the flight of the Shah, the coming to power of Ayatollah Khomeini, and the taking of the U.S. Embassy in Tehran (and its occupants as hostages) by student protesters. Washington’s political class is still trying to dig itself out of the rubble. A number of high-ranking Middle East policymakers and experts held Kissinger directly responsible for the disaster, especially career diplomat George Ball, who called Kissinger’s Iran policy an “act of folly.”

Kissinger is deft at deflecting attention from this history. After he gave a speech at Annapolis in 2007, a cadet wanted to know why he had sold weapons to the Shah of Iran when “he knew the nature of his regime.”

“Every American government from the 1950s on cooperated with the Shah of Iran,” Kissinger answered. He continued: “Iran is a crucial piece of strategic real estate, and the fact that it is now in adversarial hands shows why we cooperated with the Shah of Iran. Why did we sell weapons to him? Because he was willing to defend himself and because his defense was in our interest. And again, I simply don’t understand why we have to apologize for defending the American national interest, which was also in the national interest of that region.”

This account carefully omits his role in greatly escalating the support provided to the Shah, including to his infamous SAVAK torturers — the agents of his murderous, U.S.-trained secret police-cum-death-squad — who upheld his regime. Each maimed body or disappeared family member was one more klick on the road to revolution. As George Ball’s biographer, James Bill, writes: considering the “manifest failure” of Kissinger’s Iran policy, “it is worthy of note that in his two massive volumes of political memoirs totalling twenty-eight-hundred pages, Kissinger devoted less than twenty pages to the Iranian revolution and U.S.-Iran relations.”

After the Shah fell, the ayatollahs were the beneficiaries of Kissinger’s arms largess, inheriting billions of dollars of warships, tanks, fighter jets, guns, and other matériel. It was also Kissinger who successfully urged the Carter administration to grant the Shah asylum in the United States, which hastened the deterioration of relations between Tehran and Washington, precipitating the embassy hostage crisis.

Then, in 1980, Saddam Hussein’s Iraq invaded Iran, beginning a war that consumed hundreds of thousands of lives. The administration of Ronald Reagan “tilted” toward Baghdad, providing battlefield intelligence used to launch lethal sarin gas attacks on Iranian troops. At the same time, the White House illegally and infamously trafficked high-tech weaponry to revolutionary Iran as part of what became the Iran-Contra affair.

“It’s a pity they can’t both lose,” Kissinger is reported to have said of Iran and Iraq. Although that quotation is hard to confirm, Raymond Tanter, who served on the National Security Council, reports that, at a foreign-policy briefing for Republican presidential nominee Ronald Reagan in October 1980, Kissinger suggested “the continuation of fighting between Iran and Iraq was in the American interest.” Having bet (and lost) on the Shah, Kissinger now hoped to make the best of a bad war. The U.S., he counseled Reagan, “should capitalize on continuing hostilities.”

Saudi Arabia and the Petrodollar Fix

Kissinger’s other “guardian” of the Gulf, Sunni Saudi Arabia, however, didn’t fall, and he did everything he could to turn that already close relationship into an ironclad alliance. In 1975, he signaled what was to come by working out an arms deal for the Saudi regime similar to the one he had green-lighted for Tehran, including a $750 million contract for the sale of 60 F-5E/F fighters to the sheiks. By this time, the U.S. already had more than a billion dollars’ worth of military agreements with Riyadh. Only Iran had more.

Like Tehran, Riyadh paid for this flood of weaponry with the proceeds from rising oil prices. The word “petrodollar,” according to the Los Angeles Times, was coined in late 1973, and introduced into English by New York investment bankers who were courting the oil-producing countries of the Middle East. Soon enough, as that paper wrote, the petrodollar had become part of “the world’s macroeconomic interface” and crucial to Kissinger’s developing Middle Eastern policy.

By June 1974, Treasury Secretary George Shultz was already suggesting that rising oil prices could result in a “highly advantageous mutual bargain” between the U.S. and petroleum-producing countries in the Middle East. Such a “bargain,” as others then began to argue, might solve a number of problems, creating demand for the U.S. dollar, injecting needed money into a flagging defense industry hard hit by the Vietnam wind-down, and using petrodollars to cover mounting trade deficits.

As it happened, petrodollars would prove anything but a quick fix. High energy prices were a drag on the U.S. economy, with inflation and high interest rates remaining a problem for nearly a decade. Nor was petrodollar dependence part of any preconceived Kissingerian “plan.”  As with far more of his moves than he or his admirers now care to admit, he more or less stumbled into it.  This was why, in periodic frustration, he occasionally daydreamed about simply seizing the oil fields of the Arabian peninsula and doing away with all the developing economic troubles.

“Can’t we overthrow one of the sheikhs just to show that we can do it?” he wondered in November 1973, fantasizing about which gas-pump country he could knock off. “How about Abu Dhabi?” he later asked. (Imagine what the world would be like today had Kissinger, in the fall of 1973, moved to overthrow the Saudi regime rather than Chile’s democratically elected president, Salvador Allende.) “Let’s work out a plan for grabbing some Middle East oil if we want,” Kissinger said.

Such scimitar rattling was, however, pure posturing. Not only did Kissinger broker the various deals that got the U.S. hooked on recycled Saudi petrodollars, he also began to promote the idea of an “oil floor price” below which the cost per barrel wouldn’t fall. Among other things, this scheme was meant to protect the Saudis (and Iran, until 1979) from a sudden drop in demand and provide U.S. petroleum corporations with guaranteed profit margins.

Stephen Walt, a scholar of international relations, writes: “By the end of 1975, more than six thousand Americans were engaged in military-related activities in Saudi Arabia. Saudi arms purchased for the period 1974-1975 totaled over $3.8 billion, and a bewildering array of training missions and construction projects worth over $10 billion were now underway.”

Since the 1970s, one administration after another has found the iron-clad alliance Kissinger deepened between the House of Saud’s medieval “moderates” and Washington indispensable not only to keep the oil flowing but as a balance against Shia radicalism and secular nationalism of every sort. Recently, however, a series of world-historical events has shattered the context in which that alliance seemed to make sense. These include: the catastrophic war on and occupation of Iraq, the Arab Spring, the Syrian uprising and ensuing civil war, the rise of ISIS, Israel’s rightwing lurch, the conflict in Yemen, the falling price of petroleum, and, now, Obama’s Iran deal.

But the arms spigot that Kissinger turned on still remains wide open. According to the New York Times, “Saudi Arabia spent more than $80 billion on weaponry last year — the most ever, and more than either France or Britain — and has become the world’s fourth-largest defense market.” Just as they did after the Vietnam drawdown, U.S. weapons manufacturers are compensating for limits on the defense budget at home by selling arms to Gulf states. The “proxy wars in the Middle East could last for years,” write Mark Mazzetti and Helene Cooper of the New York Times, “which will make countries in the region even more eager for the F-35 fighter jet, considered to be the jewel of America’s future arsenal of weapons. The plane, the world’s most expensive weapons project, has stealth capabilities and has been marketed heavily to European and Asian allies. It has not yet been peddled to Arab allies because of concerns about preserving Israel’s military edge.”

If fortune is really shining on Lockheed and Boeing, Kissinger’s prediction that Obama’s de-escalation of tensions with Tehran will sooner or later prompt Saudi–Iranian hostilities will pan out. “With the balance of power in the Middle East in flux, several defense analysts said that could change. Russia is a major arms supplier to Iran, and a decision by President Vladimir Putin to sell an advanced air defense system to Iran could increase demand for the F-35, which is likely to have the ability to penetrate Russian-made defenses,” the Times reports.

“This could be the precipitating event: the emerging Sunni-Shia civil war coupled with the sale of advanced Russian air defense systems to Iran,” said one defense analyst. “If anything is going to result in F-35 clearance to the gulf states, this is the combination of events.”

Into Afghanistan

If all Henry Kissinger contributed to the Middle East were a regional arms race, petrodollar addiction, Iranian radicalization, and the Tehran-Riyadh conflict, it would be bad enough. His legacy, however, is far worse than that: he has to answer for his role in the rise of political Islam.

In July 1973, after a coup in Afghanistan brought to power a moderate, secular, but Soviet-leaning republican government, the Shah, then approaching the height of his influence with Kissinger, pressed his advantage. He asked for even more military assistance. Now, he said, he “must cover the East with fighter aircraft.” Kissinger complied.

Tehran also began to meddle in Afghan politics, offering Kabul billions of dollars for development and security, in exchange for loosening “its ties with the Soviet Union.” This might have seemed a reasonably peaceful way to increase U.S. influence via Iran over Kabul. It was, however, paired with an explosive initiative: via SAVAK, the Shah’s secret police, and Pakistan’s Inter-Services Intelligence agency (ISI), extremist Islamic insurgents were to be slipped into Afghanistan to destabilize Kabul’s republican government.

Kissinger, who knew his British and his Russian imperial history, had long considered Pakistan of strategic importance. “The defense of Afghanistan,” he wrote in 1955, “depends on the strength of Pakistan.” But before he could put Pakistan into play against the Soviets in Afghanistan, he had to perfume away the stink of genocide. In 1971, that country had launched a bloodbath in East Pakistan (now Bangladesh), with Nixon and Kissinger standing “stoutly behind Pakistan’s generals, supporting the murderous regime at many of the most crucial moments,” as Gary Bass has detailed. The president and his national security adviser, Bass writes, “vigorously supported the killers and tormentors of a generation of Bangladeshis.”

Because of that genocidal campaign, the State Department, acting against Kissinger’s wishes, had cut off military aid to the country in 1971, though Nixon and Kissinger kept it flowing covertly via Iran. In 1975, Kissinger vigorously pushed for its full, formal restoration, even as he was offering his tacit approval to Maoist China to back Pakistan, whose leaders had their own reasons for wanting to destabilize Afghanistan, having to do with border disputes and the ongoing rivalry with India.

Kissinger helped make that possible, in part by the key role he played in building up Pakistan as part of a regional strategy in which Iran and Saudi Arabia were similarly deputized to do his dirty work. When Pakistani Prime Minister Zulfikar Ali Bhutto, who had backed the 1971 rampage in East Pakistan, visited Washington in 1975 to make the case for restoration of military aid, Kissinger assured President Gerald Ford that he “was great in ’71.” Ford agreed, and U.S. dollars soon started to flow directly to the Pakistani army and intelligence service.

As national security adviser and then secretary of state, Kissinger was directly involved in planning and executing covert actions in such diverse places as Cambodia, Angola, and Chile. No available information indicates that he ever directly encouraged Pakistan’s ISI or Iran’s SAVAK to destabilize Afghanistan. But we don’t need a smoking gun to appreciate the larger context and consequences of his many regional initiatives in what, in the twenty-first century, would come to be known in Washington as the “greater Middle East.” In their 1995 book, Out of Afghanistan, based on research in Soviet archives, foreign-policy analysts Diego Cordovez and Selig Harrison provide a wide-ranging sense of just how so many of the policies Kissinger put in place — the empowerment of Iran, the restoration of military relations with Pakistan, high oil prices, an embrace of Saudi Wahhabism, and weapon sales — came together to spark jihadism:

“It was in the early 1970s, with oil prices rising, that Shah Mohammed Reza Pahlavi of Iran embarked on his ambitious effort to roll back Soviet influence in neighboring countries and create a modern version of the ancient Persian empire… Beginning in 1974, the Shah launched a determined effort to draw Kabul into a Western-tilted, Tehran-centered regional economic and security sphere embracing India, Pakistan and the Persian Gulf states… The United States actively encouraged this roll-back policy as part of its broad partnership with the Shah… SAVAK and the CIA worked hand in hand, sometimes in loose collaboration with underground Afghani Islamic fundamentalist groups that shared their anti-Soviet objectives but had their own agendas as well… As oil profits sky-rocketed, emissaries from these newly affluent Arab fundamentalist groups arrived on the Afghan scene with bulging bankrolls.”

Harrison also wrote that “SAVAK, the CIA, and Pakistani agents” were involved in failed “fundamentalist coup attempts” in Afghanistan in 1973 and 1974, along with an attempted Islamic insurrection in the Panjshir Valley in 1975, laying the groundwork for the jihad of the 1980s (and beyond).

Much has been made of Jimmy Carter’s decision, on the advice of National Security Adviser Zbigniew Brzezinski, to authorize “nonlethal” aid to the Afghan mujahedeen in July 1979, six months before Moscow sent troops to support the Afghan government in its fight against a spreading Islamic insurgency. But lethal aid had already long been flowing to those jihadists via Washington’s ally Pakistan (and Iran until its revolution in 1979). This provision of support to radical Islamists, initiated in Kissinger’s tenure and continuing through the years of Ronald Reagan’s presidency, had a number of unfortunate consequences known all too well today but seldom linked to the good doctor. It put unsustainable pressure on Afghanistan’s fragile secular government. It laid the early infrastructure for today’s transnational radical Islam. And, of course, it destabilized Afghanistan and so helped provoke the Soviet invasion.

Some still celebrate the decisions of Carter and Reagan for their role in pulling Moscow into its own Vietnam-style quagmire and so hastening the demise of the Soviet Union. “What is most important to the history of the world?” Brzezinski infamously asked. “The Taliban or the collapse of the Soviet empire? Some stirred-up Moslems or the liberation of Central Europe and the end of the cold war?” (The rivalry between the two Harvard immigrant diplomats, Kissinger and Brzezinski, is well known. But Brzezinski by 1979 was absolutely Kissingerian in his advice to Carter. In fact, a number of Kissinger’s allies who continued on in the Carter administration, including Walter Slocombe and David Newsom, influenced the decision to support the jihad.)

Moscow’s occupation of Afghanistan would prove a disaster — and not just for the Soviet Union. When Soviet troops pulled out in 1989, they left behind a shattered country and a shadowy network of insurgent fundamentalists who, for years, had worked hand-in-glove with the CIA in the Agency’s longest covert operation, as well as with the Saudis and Pakistan’s ISI. It was a distinctly Kissingerian line-up of forces.

Few serious scholars now believe that the Soviet Union would have proved any more durable had it not invaded Afghanistan. Nor did the allegiance of Afghanistan — whether it tilted toward Washington, Moscow, or Tehran — make any difference to the outcome of the Cold War, any more than did, say, that of Cuba, Iraq, Angola, or Vietnam.

For all of the celebration of him as a “grand strategist,” as someone who constantly advises presidents to think of the future, to base their actions today on where they want the country to be in five or 10 years’ time, Kissinger was absolutely blind to the fundamental feebleness and inevitable collapse of the Soviet Union. None of it was necessary; none of the lives Kissinger sacrificed in Cambodia, Laos, Angola, Mozambique, Chile, Argentina, Uruguay, East Timor, and Bangladesh made one bit of difference in the outcome of the Cold War.

Similarly, each of Kissinger’s Middle East initiatives has been disastrous in the long run. Just think about them from the vantage point of 2015: banking on despots, inflating the Shah, providing massive amounts of aid to security forces that tortured and terrorized democrats, pumping up the U.S. defense industry with recycled petrodollars and so spurring a Middle East arms race financed by high gas prices, emboldening Pakistan’s intelligence service, nurturing Islamic fundamentalism, playing Iran and the Kurds off against Iraq, and then Iraq and Iran off against the Kurds, and committing Washington to defending Israel’s occupation of Arab lands.

Combined, they’ve helped bind the modern Middle East into a knot that even Alexander’s sword couldn’t sever.

Bloody Inventions

Over the last decade, an avalanche of documents — transcripts of conversations and phone calls, declassified memos, and embassy cables — has implicated Henry Kissinger in crimes in Bangladesh, Cambodia, southern Africa, Laos, the Middle East, and Latin America. He’s tried to defend himself by arguing for context. “Just to take a sentence out of a telephone conversation when you have 50 other conversations, it’s just not the way to analyze it,” Kissinger said recently, after yet another damning tranche of documents was declassified. “I’ve been telling people to read a month’s worth of conversations, so you know what else went on.”

But a month’s worth of conversations, or eight years for that matter, reads like one of Shakespeare’s bloodiest plays. Perhaps Macbeth, with its description of what we today call blowback: “That we but teach bloody instructions, which, being taught, return to plague the inventor.”

We are still reaping the bloody returns of Kissinger’s inventions.

Greg Grandin, a TomDispatch regular, teaches history at New York University. He is the author of Fordlandia; The Empire of Necessity, which won the Bancroft Prize in American history; and, most recently, Kissinger’s Shadow: The Long Reach of America’s Most Controversial Statesman.

Copyright 2015 Greg Grandin

The Confederate Flag at War

The Pentagon just can’t let go. In the wake of the Charleston Massacre, Amazon and Walmart have announced that they will no longer sell Confederate flag merchandise. Ebay says it will stop offering Confederate items for electronic auction. Mississippi's Republican speaker of the house calls his state flag, which includes the Stars and Bars in the top left corner, “a point of offense that needs to be removed.” Even Kentucky’s Mitch McConnell, the majority leader of the U.S. Senate, agrees that a statue of Confederate President Jefferson Davis in his state's capitol building belongs in a museum.

Yet the Department of Defense says it isn’t even “reviewing” the possibility of a ban on the flag, deciding instead to leave any such move to the various service branches, while military bases named after Confederate officers will remain so. One factor in this decision: the South provides more than 40% of all military recruits, many of them white; only 15% are from the Northeast.

Filling the ranks isn't, however, the only reason for the military’s refusal to act.

Over the last few weeks, there has been near unanimous agreement among liberal and mainstream commentators that the Confederate flag represents “hate, not heritage.” The flag’s presence in American culture is ubiquitous. It adorns license plates, bumper stickers, mugs, bodies (via tattoos), and even baby diapers. The flag’s popularity is normally traced back to the post-World War II reaction of the Dixiecrat South to the Civil Rights Movement. South Carolina, for instance, raised the Stars and Bars over its state house in 1961 as part, columnist Eugene Robinson said on “Meet the Press,” of its “massive resistance to racial desegregation.”

All true. But like many discussions of American conservatism, this account misses the role endless war played in sustaining domestic racism. Starting around 1898, well before it became an icon of redneck backlash, the Confederate Battle Flag served for half a century as an important pennant in the expanding American empire and a symbol of national unification, not polarization.

It was a reconciled Army that moved out into the world after the Civil War, an unstoppable combination of Northern law (bureaucratic command and control, industrial might, and technology) and Southern spirit (an “exaltation of military ideals and virtues,” including valor, duty, and honor). Both law and spirit had their dark sides, leading to horrors committed due either to the very nature of the American empire — the genocide of Native Americans, for example, or the war in Southeast Asia — or to the particular passions of some of its soldiers. And both law and spirit had their own flags.

Lost Cause Found

“Northerners and Southerners agreed on little” in the years after the Civil War, historians Boyd Cothran and Ari Kelman write, “except that the Army should pacify Western tribes.” Reconstruction — Washington’s effort to set the terms for the South’s readmission to the Union and establish postwar political equality — was being bitterly opposed by defeated white separatists. According to Cothran and Kelman, however, “Many Americans found rare common ground on the subject of Manifest Destiny.”

After the surrender at Appomattox, it was too soon to fly the Stars and Bars against Native Americans. And it was Union officers — men like generals George Armstrong Custer and Philip Sheridan — who committed most of the atrocities against indigenous peoples. But Confederate veterans and their sons used the pacification of the West as a readmission program into the U.S. Army. The career of Luther Hare, a Texas son of a Confederate captain, is illustrative. He barely survived Custer’s campaign against the Sioux. Cornered in a skirmish that preceded Little Big Horn, Hare “opened fire and let out a rebel yell” before escaping. He then went on to fight Native Americans in Montana, Texas, the Pacific Northwest, and Arizona, where he put down the “last of the renegade Apaches,” before being sent to the Philippines as a colonel. There, he led a detachment of Texans against Filipinos fighting for independence.

With Reconstruction over and Jim Crow segregation installed in every southern state, the Spanish-American War of 1898, in which the U.S. took Cuba and Puerto Rico in the Caribbean and the Philippines and Guam in the Pacific, was a key moment in the rehabilitation of the Confederacy. Earlier, when slavery was still a going concern, southerners had yearned to separate Cuba from Spain and turn it into a slave state. Now, conquering the island served a different purpose: a chance to prove their patriotism and reconcile with the North.

Southern ports like New Orleans, Charleston, and Tampa were used as staging areas for the invasions of Cuba and Puerto Rico. Northern soldiers passing through New Orleans were glad to see that “grizzled old Confederates” were cheering them on, saluting the Union flag, and happy to send their sons “to fight and die under it.” Newspapers throughout the South, along with Dixie’s largest veterans association, the United Confederate Veterans, saw war with Spain as a vindication of the “Old Cause” and reveled in the exploits of former Confederate generals, including Robert E. Lee’s nephew, Fitzhugh Lee.

In June 1898, just weeks after U.S. troops landed in Cuba, two train-car loads of Confederate flags arrived in Atlanta for a coming reunion of southern veterans of the war. The Stars and Bars would soon festoon the city Union General William T. Sherman had burned to the ground. At the very center of the celebration’s main venue stood a 30-foot Confederate flag, flanked by a Cuban and a U.S. flag. Speech after speech extolled “sublime” war — not just the Civil War but all the wars that made up the nineteenth century — with Mexico, against Native Americans, and now versus Spain. “The gallantry and heroism of your sons as they teach the haughty Spaniard amid the carnage of Santiago to honor and respect the flag of our country, which shall float forever over an ‘indissoluble union of indestructible states,’” was how one southern veteran put it.

War with Spain allowed “our boys” to once more be “wrapped in the folds of the American flag,” said General John Gordon, commander of the United Confederate Veterans, in remarks opening the proceedings. Their heroism, he added, has led “to the complete and permanent obliteration of all sectional distrusts and to the establishment of the too long delayed brotherhood and unity of the American people.” In this sense, the War of 1898 was alchemic, transforming the “lost cause” of the Confederacy (that is, the preservation of slavery) into a crusade for world freedom. The South, Gordon said, was helping to bring “the light of American civilization and the boon of Republican liberty to the oppressed islands of both oceans.”

With Spain defeated, President William McKinley took a victory tour of the South, hailing “the valor and the heroism [that] the men from the south and the men of the north have within the past three years… shown in Cuba, in Puerto Rico, in the Philippines, and in China.”

“When we are all on one side,” the president said, “we are unconquerable.” It was around this time that, after much delay, Congress finally authorized the return of Confederate flags captured by Union forces during the Civil War to the United Confederate Veterans.

To Serve Mankind

World War I brought more goodwill. In June 1916, as Woodrow Wilson began to push through Congress a remarkable set of laws militarizing the country, including the expansion of the Army and National Guard (and an authorization to place the former under federal authority), the construction of nitrate plants for munitions production, and the funding of military research and development, Confederate veterans descended on Washington, D.C., to show their support for the coming war in Europe.

“About 10,000 men wearing the gray, escorted by several thousand who wore the blue, marched along Pennsylvania Avenue and were reviewed by the President,” one observer reported. “In the line were many young soldiers now serving in the regular army, grandsons of those who fought for the Confederacy and of those who fought for the Union. The Stars and Bars of the Confederacy were proudly borne at the head of the procession… As the long line passed the reviewing stand the old men in gray offered their services in the present war. ‘We will go to France or anywhere you want to send us!’ they shouted to the president.”

Wilson won reelection in 1916, his campaign running on the slogan, “He kept us out of war.” But he could then betray his anti-war supporters knowing that a rising political coalition — made up, in part, of men looking to redeem a lost war by finding new wars to fight — had his back.

Decades before President Richard Nixon bet his reelection on winning the Dixiecrat vote, Wilson worked out his own Southern Strategy. Even as he was moving the nation to war, Wilson re-segregated Washington and purged African Americans from federal jobs. And it was Wilson who started the presidential tradition of laying a Memorial Day wreath at Arlington Cemetery’s Confederate War Memorial. 

In 1916, he turned that event into a war rally. “America is roused,” Wilson said to a large gathering of Confederate veterans, “roused to a self-consciousness she has not had in a generation. And this spirit is going out conquering and to conquer until, it may be, in the Providence of God, a new light is lifted up in America which shall throw the rays of liberty and justice far abroad upon every sea, and even upon the lands which now wallow in darkness and refuse to see the light.”

What alchemy it was — with Wilson conscripting the Confederate cause into his brand of arrogant, martial universalism. The conflict in Europe, Wilson said at the same wreath-laying event a year later (less than two months after the U.S. had declared war on Germany), offered a chance “to vindicate the things which we have professed” and to “show the world” that America “was born to serve mankind.”

American history was fast turning into an endless parade of war, and the sectional reconciliation that went with it meant that throughout the first half of the twentieth century the “conquered banner” could fly pretty much anywhere to little but positive comment. In World War II, for instance, after a two-month battle for the island of Okinawa, the first flag Marines raised upon taking the headquarters of the Japanese Imperial Army was the Confederate one. It had been carried into battle in the helmet of a captain from South Carolina.

With the Korean War, the NAACP’s journal, The Crisis, reported a staggering jump in sales of Confederate flags from 40,000 in 1949 to 1,600,000 in 1950. Much of the demand, it reported, was coming from soldiers overseas in Germany and Korea. The Crisis hoped for the best, writing that the banner’s growing popularity had nothing to do with rising “reactionary Dixiecratism.” It was a “fad,” the magazine claimed, “like carrying foxtails on cars.”

As it happened, it wasn’t. As the Civil Rights Movement evolved and the Black Power movement emerged, as Korea gave way to Vietnam, the Confederate flag returned to its original meaning: the bunting of resentful white supremacy. Dixie found itself in Danang.

Dixie in Danang

“We are fighting and dying in a war that is not very popular in the first place,” Lieutenant Eddie Kitchen, a 33-year-old African-American stationed in Vietnam, wrote his mother in Chicago in late February 1968, “and we still have some people who are still fighting the Civil War.” Kitchen, who had been in the military since 1955, reported a rapid proliferation of Confederate flags, mounted on jeeps and flying over some bases. “The Negroes here are afraid and cannot do anything,” Kitchen added. Two weeks later he was dead, officially listed as “killed in action.” His mother believed that he had been murdered by white soldiers in retaliation for objecting to the flag.

Kitchen’s was one of many such complaints, as the polarization tearing through domestic politics in the United States, along with the symbols of white supremacy — not just the Confederate flag but the burning cross, the Klan robe and hood, and racist slurs — spilled into Vietnam. As early as Christmas Day 1965, a number of white soldiers paraded a Confederate flag in front of the audience of conservative comedian Bob Hope’s USO show at Bien Hoa Air Base. “After they were seated,” wrote an African-American soldier protesting the display, “several officers and NCOs [non-commissioned officers] were seen posing and taking pictures under the flag. I felt like an outsider.” An African-American newspaper, the Chicago Defender, reported that southern whites were “infecting” Vietnamese with their racism. “The Confederate flags seem more popular in Vietnam than the flags of several countries,” the paper wrote, judging by the “display of flags for sale on a Saigon street corner.”

Black soldiers who pushed back against such Dixie-ism were subject to insult and abuse. Some were thrown in the stockade. When Private First Class Danny Frazier complained to his superior officers about the “damn flag” flown by Alabama soldiers in his barracks, he was ordered to do demeaning work and then demoted.

Martin Luther King, Jr., was assassinated in early April 1968 and American military bases throughout South Vietnam lowered their flags to half-mast. In some places, such as the Cam Ranh Naval Base, however, white soldiers celebrated by raising the Confederate flag and burning crosses. Following King’s murder, the Department of Defense tried to ban the Confederate flag. “Race is our most serious international problem,” a Pentagon representative said. But Dixiecrat politicians, who controlled the votes President Lyndon Johnson needed to fund the war, objected and the Pentagon backpedaled. Instead of enforcing the ban, it turned to sensitivity training. The Confederate flag, a black military instructor told a class of black and white soldiers at Fort Dix, does not necessarily “mean a man belongs to the Ku Klux Klan.”

The Sum of All Lost Causes

Back home, a backlash against the antiwar movement helped nationalize the Confederate flag. The banner was increasingly seen not just at gatherings of the fringe KKK and the John Birch Society, but at “patriotic” rallies in areas of the country outside the old South: in Detroit, Chicago, California, Pennsylvania, and Connecticut. For instance, on June 14, 1970 — Flag Day — pro-war demonstrators marched up Pittsburgh’s Liberty Avenue with a large Confederate flag demanding that “Washington… get in there and win.”

For many, the Confederate flag remained an emblem of racist reaction to federal efforts to advance equal rights and integration. Yet as issues of race, militarism, and class resentment merged into a broader “cultural war,” some in the rising New Right rallied around the Stars and Bars to avenge not the South, but South Vietnam.

In 1973, shortly after the U.S. officially ended combat operations in South Vietnam, for instance, Bart Bonner, a conservative activist and Vietnam veteran from Waterbury, New York, met with South Vietnam’s military attaché in Washington and offered to raise “a private, volunteer force of 75,000 American veterans to fight in South Vietnam under the Confederate flag.” For Bonner, and many like him, that flag now stood not for the “lost cause” but for all lost causes conservatives cared about, an icon of resistance to the liberal Establishment.

Bonner told Soldier of Fortune magazine that he had the financial support of Texas millionaire Ross Perot and 100 men, including former Green Berets, Air Force commandos, and Navy Seals, ready to “show the people of South Vietnam… that not all Americans are cowards.” He added: “The Stars and Bars — the Confederate flag — is a beautiful flag.”

Nothing came of Bonner’s plan. But the scheme did anticipate many of the strategies the New Right would use to circumvent the cumbersome restrictions the post-Vietnam Congress placed on the executive branch’s ability to wage war and conduct covert operations, including the use of mercenary groups, which continue to play a significant role in fighting America’s wars, and the raising of money from private, often southern, rightwing sources. Ross Perot, for instance, would fund some of Oliver North’s effort to run a foreign policy independent of congressional oversight, a scandal that would become known as Iran-Contra.

Moonlight, Magnolia, and My Lai

Before Watergate brought him down, President Richard Nixon fused overseas militarism and domestic racism into one noxious whole as part of his strategy to win the South in 1972 and secure his reelection. In southern Africa, where Black-led national liberation movements were contesting white rule, this meant putting in place National Security Adviser Henry Kissinger’s “Tar-Baby Tilt,” strengthening ties with the white supremacist nations of South Africa and Rhodesia. Support for Pretoria and Salisbury was popular in Biloxi.

But the foreign-policy centerpiece of Nixon’s “Southern Strategy” was Vietnam. Senator George McGovern summed the situation up this way after being told by Kissinger that the U.S. couldn’t exit Vietnam because “the boss’s whole constituency would just fall apart”: “They were willing to continue killing Asians and sacrificing the lives of young Americans because of their interpretation of what would play in the United States.”

The infamous March 1968 massacre at My Lai would prove especially useful in helping Nixon win the Moonlight and Magnolia set. After it came to light that members of the 23rd Infantry Division, also known as the Americal, had slaughtered more than 500 Vietnamese civilians, including women, children, and infants, Nixon made his support for Lieutenant William Calley, the only soldier convicted for taking part in the massacre, a key element in his reelection campaign. As historian Joseph Fry points out in his new book, The American South and the Vietnam War, Calley, who was from Florida, was extremely popular in the South. George Wallace, the segregationist governor of Alabama, flew to Fort Benning, where Calley was being held under house arrest, to speak at a rally, replete with Confederate flags. Mississippi Governor John Bell Williams told Nixon’s vice president, Spiro Agnew, that his state was "about ready to secede from the union" over Calley.

The campaign to depict Calley as an honorable warrior scapegoated by elites was but one more opportunity to generalize the historical experience of southern humiliation into an ongoing national sentiment. As after 1865, the solution to such humiliation has been more war, forever war. And with endless war comes an endless tolerance for atrocities. “Most people don’t give a shit whether he killed them or not,” Nixon said of Calley’s actions at My Lai. “The villagers got what they deserved,” commented Louisiana Senator Allen Ellender. You can draw a straight line from such hard-heartedness to today’s torture coalition, to men like Dick Cheney, who defend inflicting pain on innocent people “as long as we achieve our objective.”

The Confederate flag still flies overseas. It was carried into Iraq in 2003. In Afghanistan, at the infamous Bagram Theater Internment Facility, a platoon implicated in the torture of detainees, known as “the Testosterone Gang,” hung a Confederate flag in their tent.

It is good to see the Confederate flag coming down in some places, but I suspect that reports of its final furling are premature. Endless wars will always have their atrocities. And atrocities will always find a flag.

Greg Grandin, a TomDispatch regular, teaches history at New York University and is the author of a number of books, including Fordlandia, a finalist for the Pulitzer Prize and the National Book Award, and The Empire of Necessity, which won the Bancroft Prize in American History. His new book, Kissinger’s Shadow: The Long Reach of America’s Most Controversial Statesman, will be published in August.

Copyright 2015 Greg Grandin

The Bleached Bones of the Dead

Many in the United States were outraged by the remarks of conservative evangelical preacher Pat Robertson, who blamed Haiti’s catastrophic 2010 earthquake on Haitians for selling their souls to Satan. Bodies were still being pulled from the rubble — as many as 300,000 died — when Robertson went on TV and gave his viewing audience a little history lesson: the Haitians had been "under the heel of the French" but they "got together and swore a pact to the devil. They said, 'We will serve you if you will get us free from the French.' True story. And so, the devil said, 'OK, it's a deal.'"

A supremely callous example of right-wing idiocy? Absolutely. Yet in his own kooky way, Robertson was also onto something. Haitians did, in fact, swear a pact with the devil for their freedom. Only Beelzebub arrived smelling not of sulfur, but of Parisian cologne. 

Haitian slaves began to throw off the “heel of the French” in 1791, when they rose up and, after bitter years of fighting, eventually declared themselves free. Their French masters, however, refused to accept Haitian independence. The island, after all, had been an extremely profitable sugar producer, and so Paris offered Haiti a choice: compensate slave owners for lost property — their slaves (that is, themselves) — or face its imperial wrath. The fledgling nation was forced to finance this payout with usurious loans from French banks. As late as 1940, 80% of the government budget was still going to service this debt.

In the on-again, off-again debate that has taken place in the United States over the years about paying reparations for slavery, opponents of the idea insist that there is no precedent for such a proposal. But there is. It’s just that what was being paid was reparations-in-reverse, which has a venerable pedigree. After the War of 1812 between Great Britain and the U.S., London reimbursed southern planters more than a million dollars for having encouraged their slaves to run away in wartime. Within the United Kingdom, the British government also paid a small fortune to British slave owners, including the ancestors of Britain’s current Prime Minister, David Cameron, to compensate for abolition (which Adam Hochschild calculated in his 2005 book Bury the Chains to be “an amount equal to roughly 40% of the national budget then, and to about $2.2 billion today”).

Advocates of reparations — made to the descendants of enslaved peoples, not to their owners — tend to calculate the amount due based on the negative impact of slavery. They want to redress either unpaid wages during the slave period or injustices that took place after formal abolition (including debt servitude and exclusion from the benefits extended to the white working class by the New Deal). According to one estimate, for instance, 222,505,049 hours of forced labor were performed by slaves between 1619 and 1865, when slavery was ended. Compounded at interest and calculated in today’s currency, this adds up to trillions of dollars.
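The arithmetic behind such figures is straightforward compounding, though the results are extremely sensitive to the inputs. Here is a minimal back-of-the-envelope sketch: only the hours figure comes from the estimate cited above, while the hourly wage and interest rate are hypothetical placeholders, and small changes to either swing the total by orders of magnitude.

```python
# Back-of-the-envelope sketch of the compounding arithmetic behind
# reparations estimates. Only HOURS comes from the estimate cited in
# the text; WAGE and RATE are hypothetical assumptions.
HOURS = 222_505_049   # forced-labor hours, 1619-1865 (cited estimate)
WAGE = 1.00           # assumed average wage, dollars per hour (hypothetical)
RATE = 0.06           # assumed annual compound interest rate (hypothetical)
YEARS = 150           # rough span from abolition (1865) to the mid-2010s

principal = HOURS * WAGE                  # unpaid wages before interest
total = principal * (1 + RATE) ** YEARS   # compounded annually

print(f"Unpaid wages: ${principal:,.0f}")
print(f"Compounded over {YEARS} years: ${total:,.0f}")  # roughly $1.4 trillion
```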

But back pay is, in reality, the least of it. The modern world owes its very existence to slavery.

Voyage of the Blind

Consider, for example, the way the advancement of medical knowledge was paid for with the lives of slaves.

The death rate on the trans-Atlantic voyage to the New World was staggeringly high. Slave ships, however, were more than floating tombs. They were floating laboratories, offering researchers a chance to examine the course of diseases in fairly controlled, quarantined environments.  Doctors and medical researchers could take advantage of high mortality rates to identify a bewildering number of symptoms, classify them into diseases, and hypothesize about their causes.

Corps of doctors tended to slave ports up and down the Atlantic seaboard. Some of them were committed to relieving suffering; others were simply looking for ways to make the slave system more profitable. In either case, they identified types of fevers, learned how to decrease mortality and increase fertility, experimented with how much water was needed for optimum numbers of slaves to survive on a diet of salted fish and beef jerky, and identified the best ratio of caloric intake to labor hours. Priceless epidemiological information on a range of diseases — malaria, smallpox, yellow fever, dysentery, typhoid, cholera, and so on — was gleaned from the bodies of the dying and the dead.

When slaves couldn’t be kept alive, their autopsied bodies still provided useful information. Of course, as the writer Harriet Washington has demonstrated in her stunning Medical Apartheid, such experimentation continued long after slavery ended: in the 1940s, one doctor said that the “future of the Negro lies more in the research laboratory than in the schools.” As late as the 1960s, another researcher, reminiscing in a speech given at Tulane Medical School, said that it was “cheaper to use Niggers than cats because they were everywhere and cheap experimental animals.”

Medical knowledge slowly filtered out of the slave industry into broader communities, since slavers made no proprietary claims on the techniques or data that came from treating their slaves. For instance, an epidemic of blindness that broke out in 1819 on the French slaver Rôdeur, which had sailed from Bonny Island in the Niger Delta with about 72 slaves on board, helped eye doctors identify the causes, patterns, and symptoms of what is today known as trachoma. 

The disease first appeared on the Rôdeur not long after it set sail, initially in the hold among the slaves and then on deck. In the end, it blinded all the voyagers except one member of the crew. According to a passenger’s account, sightless sailors worked under the direction of that single man “like machines” tied to the captain with a thick rope. “We were blind — stone blind, drifting like a wreck upon the ocean,” he recalled. Some of the sailors went mad and tried to drink themselves to death. Others retired to their hammocks, immobilized. Each “lived in a little dark world of his own, peopled by shadows and phantasms. We did not see the ship, nor the heavens, nor the sea, nor the faces of our comrades.”

But they could still hear the cries of the blinded slaves in the hold.

This went on for 10 days, through storms and calms, until the voyagers heard the sound of another ship. The Spanish slaver San León had drifted alongside the Rôdeur. But the entire crew and all the slaves of that ship, too, had been blinded. When the sailors of each vessel realized this “horrible coincidence,” they fell into a silence “like that of death.” Eventually, the San León drifted away and was never heard from again.

The Rôdeur’s one seeing mate managed to pilot the ship to Guadeloupe, an island in the Caribbean. By now, a few of the crew, including the captain, had regained some of their vision. But 39 of the Africans hadn’t. So before entering the harbor the captain decided to drown them, tying weights to their legs and throwing them overboard. The ship was insured and their loss would be covered: the practice of insuring slaves and slave ships meant that slavers weighed the benefits of a dead slave versus living labor and acted accordingly. 

Events on the Rôdeur caught the attention of Sébastien Guillié, chief of medicine at Paris’s Royal Institute for Blind Youth. He wrote up his findings — which included a discussion of the disease’s symptoms, the manner in which it spread, and best treatment options — and published them in Bibliothèque Ophtalmologique, which was then cited in other medical journals as well as in an 1846 U.S. textbook, A Manual of the Diseases of the Eye.

Slaves spurred forward medicine in other ways, too. Africans, for instance, were the primary victims of smallpox in the New World and were also indispensable to its eradication. In the early 1800s, Spain ordered that all its American subjects be vaccinated against the disease, but didn’t provide enough money to carry out such an ambitious campaign. So doctors turned to the one institution that already reached across the far-flung Spanish Empire: slavery. They transported the live smallpox vaccine in the arms of Africans being moved along slave routes as cargo from one city to another to be sold: doctors chose one slave from a consignment, made a small incision in his or her arm, and inserted the vaccine (a mixture of lymph and pus containing the cowpox virus). A few days after the slaves set out on their journey, pustules would appear in the arm where the incision had been made, providing the material to perform the procedure on yet another slave in the lot — and then another and another until the consignment reached its destination. Thus the smallpox vaccine was disseminated through Spanish America, saving countless lives. 

Slavery’s Great Schism

In 1945, Allied troops marched into the first of the Nazi death camps. What they saw inside, many have remarked, forced a radical break in the West’s moral imagination. The Nazi genocide of Jews, one scholar has written, is history’s “black hole,” swallowing up all the theological, ethical, and philosophical certainties that had earlier existed.    

Yet before there was the Holocaust, there was slavery, an institution that also transformed the West’s collective consciousness, as I’ve tried to show in my new book, The Empire of Necessity: Slavery, Freedom, and Deception in the New World.

Take, for example, the case of the Joaquín, a Portuguese frigate that left Mozambique in late 1803 with 301 enslaved East Africans. Nearly six months later, when a port surgeon opened the ship’s hatch in Montevideo, Uruguay, he was sickened by what he saw: only 31 bone-thin survivors in a foul, bare room, otherwise empty save for hundreds of unused shackles.

City officials convened a commission of inquiry to explain the deaths of the other 270 slaves, calling on the expertise of five surgeons — two British doctors, a Spaniard, a Swiss Italian, and one from the United States. The doctors testified that before boarding the Joaquín, the captives would have felt extreme anguish, having already been forced to survive on roots and bugs until arriving on the African coast emaciated and with their stomachs distended. Then, once on the ocean, crowded into a dark hold with no ventilation, they would have had nothing to do other than listen to the cries of their companions and the clanking of their chains. Many would have gone mad trying to make sense of their situation, trying to ponder “the imponderable.” The surgeons decided that the East Africans had died from dehydration and chronic diarrhea, aggravated by the physical and psychological hardships of slavery — from, that is, what they called “nostalgia,” “melancholia,” and “cisma,” a Spanish word that loosely means brooding or mourning.

The collective opinion of the five surgeons — who represented the state of medical knowledge in the U.S., Great Britain, and Spain — reveals the way slavery helped in what might be called the disenchanting of medicine. In it you can see how doctors dealing with the slave trade began taking concepts like melancholia out of the hands of priests, poets, and philosophers and giving them actual medical meaning.  

Prior to the arrival of the Joaquín in Montevideo, for instance, the Royal Spanish Academy was still associating melancholia with actual nighttime demon possession. Cisma literally meant schism, a theological concept Spaniards used to refer to the spiritual split personality of fallen man. The doctors investigating the Joaquín, however, used these concepts in a decidedly secular, matter-of-fact manner and in ways that unmistakably affirmed the humanity of slaves. To diagnose enslaved Africans as suffering from nostalgia and melancholia was to acknowledge that they had selves that could be lost, inner lives that could suffer schism or alienation, and pasts over which they could mourn.

Two decades after the incident involving the Joaquín, the Spanish medical profession no longer thought melancholia to be caused by an incubus, but considered it a type of delirium, often related to seasickness. Medical dictionaries would later describe the condition in terms similar to those used by critics of the Middle Passage — as caused by rancid food, too close contact, extreme weather, and above all the “isolation” and “uniform and monotonous life” one experiences at sea. As to nostalgia, one Spanish dictionary came to define it as “a violent desire compelling those taken out of their country to return home.”

It was as if each time a doctor threw back a slave hatch to reveal the human-made horrors below, it became a little bit more difficult to blame mental illness on demons.

In the case of the Joaquín, however, the doctors didn’t extend the logic of their own reasoning to the slave trade and condemn it. Instead, they focused on the hardships of the Middle Passage as a technical concern. “It is in the interests of commerce and humanity,” said the Connecticut-born, Edinburgh-educated John Redhead, “to get slaves off their ships as soon as possible.”

Follow the Money

Slavery transformed other fields of knowledge as well. For instance, centuries of buying and selling human beings, of shipping them across oceans and continents, of defending, excoriating, or trying to reform the practice, revolutionized both Christianity and secular law, giving rise to what we think of as modern human rights law.

In the realm of economics, the importance of slaves went well beyond the wealth generated from their uncompensated labor. Slavery was the flywheel on which America’s market revolution turned — not just in the United States, but in all of the Americas.

Starting in the 1770s, Spain began to deregulate the slave trade, hoping to establish what merchants, not mincing any words, called a “free trade in blacks.” Decades before slavery exploded in the United States (following the War of 1812 with Great Britain), the slave population increased dramatically in Spanish America. Enslaved Africans and African Americans slaughtered cattle and sheared wool on the pampas of Argentina, spun cotton and wove clothing in textile workshops in Mexico City, and planted coffee in the mountains outside Bogotá. They fermented grapes for wine at the foot of the Andes and boiled Peruvian sugar to make candy. In Guayaquil, Ecuador, enslaved shipwrights built cargo vessels that were used for carrying more slaves from Africa to Montevideo. Throughout the thriving cities of mainland Spanish America, slaves worked, often for wages, as laborers, bakers, brick makers, liverymen, cobblers, carpenters, tanners, smiths, rag pickers, cooks, and servants.

It wasn’t just their labor that spurred the commercialization of society. The driving of more and more slaves inland and across the continent, the opening up of new slave routes and the expansion of old ones, tied hinterland markets together and created local circuits of finance and trade. Enslaved peoples were investments (purchased and then rented out as laborers), credit (used to secure loans), property, commodities, and capital, making them an odd mix of abstract and concrete value. Collateral for loans and items for speculation, slaves were also objects of nostalgia, mementos of a fading aristocratic world even as they served as the coin for the creation of a new commercialized one.

Slaves literally made money: working in Lima’s mint, they trampled quicksilver into ore with their bare feet, pressing toxic mercury into their bloodstream in order to amalgamate the silver used for coins. And they were money — at least in a way. It wasn’t that the value of individual slaves was standardized in relation to currency, but that slaves were quite literally the standard.  When appraisers calculated the value of any given hacienda, or estate, slaves usually accounted for over half of its worth; they were, that is, much more valuable than inanimate capital goods like tools and millworks.

In the United States, scholars have demonstrated that profit wasn’t made just from southerners selling the cotton that slaves picked or the cane they cut.  Slavery was central to the establishment of the industries that today dominate the U.S. economy: finance, insurance, and real estate. And historian Caitlin Rosenthal has shown how Caribbean slave plantations helped pioneer “accounting and management tools, including depreciation and standardized efficiency metrics, to manage their land and their slaves” — techniques that were then used in northern factories.

Slavery, as the historian Lorenzo Greene argued half a century ago, “formed the very basis of the economic life of New England: about it revolved, and on it depended, most of her other industries.” Fathers grew wealthy building slave ships or selling fish, clothing, and shoes to slave islands in the Caribbean; when they died, they left their money to sons who “built factories, chartered banks, incorporated canal and railroad enterprises, invested in government securities, and speculated in new financial instruments.”  In due course, they donated to build libraries, lecture halls, botanical gardens, and universities, as Craig Steven Wilder has revealed in his new book, Ebony and Ivy.

In Great Britain, historians have demonstrated how the “reparations” paid to slave-owning families “fuelled industry and the development of merchant banks and marine insurance, and how it was used to build country houses and to amass art collections.”

Follow the money, as the saying goes, and you don’t even have to move very far along the financial trail to begin to see the wealth and knowledge amassed through slavery. To this day, it remains all around us, in our museums, courts, places of learning and worship, and doctors’ offices. Even the tony clothier Brooks Brothers (founded in New York in 1818) got its start selling coarse slave clothing to southern plantations.  It now describes itself as an “institution that has shaped the American style of dress.”

Fever Dreams and the Bleached Bones of the Dead

In the United States, the reparations debate faded away with the 2008 election of Barack Obama — except as an idea that continues to haunt the fever dreams of the right-wing imagination. A significant part of the backlash against the president is driven by the fantasy that he is presiding over a radical redistribution of wealth — think of all those free cell phones that the Drudge Report says he’s handing out to African Americans! — part of a stealth plan to carry out reparations by any means possible.  

“What they don’t know,” said Rush Limbaugh shortly after Obama’s inauguration, “is that Obama’s entire economic program is reparations.” The conservative National Legal and Policy Center recently raised the specter of “slavery reparations courts” — Black Jacobin tribunals presided over by the likes of Jesse Jackson, Louis Farrakhan, Al Sharpton, and Russell Simmons and empowered to levy a $50,000 tax on every white “man, woman, and child in this country.”  It’s time to rescue the discussion of reparations from the swamp of talk radio and the comment sections of the conservative blogosphere.

The idea that slavery made the modern world is not new, though it seems that every generation has to rediscover that truth anew. Almost a century ago, in 1915, W.E.B. Du Bois wrote, “Raphael painted, Luther preached, Corneille wrote, and Milton sung; and through it all, for four hundred years, the dark captives wound to the sea amid the bleaching bones of the dead; for four hundred years the sharks followed the scurrying ships; for four hundred years America was strewn with the living and dying millions of a transplanted race; for four hundred years Ethiopia stretched forth her hands unto God.”

How would we calculate the value of what we today would call the intellectual property — in medicine and other fields — generated by slavery’s suffering? I’m not sure. But a revival of efforts to do so would be a step toward reckoning with slavery’s true legacy: our modern world. 

TomDispatch regular Greg Grandin’s new book, The Empire of Necessity: Slavery, Freedom, and Deception in the New World, has just been published. 

Copyright 2014 Greg Grandin

The Two Faces of Empire

A captain ready to drive himself and all around him to ruin in the hunt for a white whale. It’s a well-known story, and over the years, mad Ahab in Herman Melville’s most famous novel, Moby-Dick, has been used as an exemplar of unhinged American power, most recently of George W. Bush’s disastrous invasion of Iraq.

But what’s really frightening isn't our Ahabs, the hawks who periodically want to bomb some poor country, be it Vietnam or Afghanistan, back to the Stone Age.  The respectable types are the true “terror of our age,” as Noam Chomsky called them collectively nearly 50 years ago.  The really scary characters are our soberest politicians, scholars, journalists, professionals, and managers, men and women (though mostly men) who imagine themselves as morally serious, and then enable the wars, devastate the planet, and rationalize the atrocities.  They are a type that has been with us for a long time.  More than a century and a half ago, Melville, who had a captain for every face of empire, found their perfect expression — for his moment and ours.

For the last six years, I’ve been researching the life of an American seal killer, a ship captain named Amasa Delano who, in the 1790s, was among the earliest New Englanders to sail into the South Pacific.  Money was flush, seals were many, and Delano and his fellow ship captains established the first unofficial U.S. colonies on islands off the coast of Chile.  They operated under an informal council of captains, divvied up territory, enforced debt contracts, celebrated the Fourth of July, and set up ad hoc courts of law.  When no Bible was available, the collected works of William Shakespeare, found in the libraries of most ships, were used to swear oaths.

From his first expedition, Delano took hundreds of thousands of sealskins to China, where he traded them for spices, ceramics, and tea to bring back to Boston.  During a second, failed voyage, however, an event took place that would make Amasa notorious — at least among the readers of the fiction of Herman Melville.

Here’s what happened: One day in February 1805 in the South Pacific, Amasa Delano spent nearly a full day on board a battered Spanish slave ship, conversing with its captain, helping with repairs, and distributing food and water to its thirsty and starving voyagers, a handful of Spaniards and about 70 West African men and women he thought were slaves. They weren’t.

Those West Africans had rebelled weeks earlier, killing most of the Spanish crew, along with the slaver taking them to Peru to be sold, and demanded to be returned to Senegal.  When they spotted Delano’s ship, they came up with a plan: let him board and act as if they were still slaves, buying time to seize the sealer’s vessel and supplies.  Remarkably, for nine hours, Delano, an experienced mariner and distant relative of future president Franklin Delano Roosevelt, was convinced that he was on a distressed but otherwise normally functioning slave ship.

Having barely survived the encounter, he wrote about the experience in his memoir, which Melville read and turned into what many consider his “other” masterpiece.  Published in 1855, on the eve of the Civil War, Benito Cereno is one of the darkest stories in American literature.  It’s told from the perspective of Amasa Delano as he wanders lost through a shadow world of his own racial prejudices.

One of the things that attracted Melville to the historical Amasa was undoubtedly the juxtaposition between his cheerful self-regard — he considers himself a modern man, a liberal opposed to slavery — and his complete obliviousness to the social world around him.  The real Amasa was well meaning, judicious, temperate, and modest.

In other words, he was no Ahab, whose vengeful pursuit of a metaphysical whale has been used as an allegory for every American excess, every catastrophic war, every disastrous environmental policy, from Vietnam and Iraq to the explosion of the BP oil rig in the Gulf of Mexico in 2010.

Ahab, whose peg-legged pacing of the quarterdeck of his doomed ship enters the dreams of his men sleeping below like the “crunching teeth of sharks.”  Ahab, whose monomania is an extension of the individualism born out of American expansion and whose rage is that of an ego that refuses to be limited by nature’s frontier.  “Our Ahab,” as a soldier in Oliver Stone’s movie Platoon calls a ruthless sergeant who senselessly murders innocent Vietnamese.

Ahab is certainly one face of American power. In the course of writing a book on the history that inspired Benito Cereno, I’ve come to think of it as not the most frightening — or even the most destructive — of American faces.  Consider Amasa.

Killing Seals

Since the end of the Cold War, extractive capitalism has spread over our post-industrialized world with a predatory force that would shock even Karl Marx.  From the mineral-rich Congo to the open-pit gold mines of Guatemala, from Chile’s until recently pristine Patagonia to the fracking fields of Pennsylvania and the melting Arctic north, there is no crevice where some useful rock, liquid, or gas can hide, no jungle forbidden enough to keep out the oil rigs and elephant killers, no citadel-like glacier, no hard-baked shale that can’t be cracked open, no ocean that can’t be poisoned.

And Amasa was there at the beginning.  Seal fur may not have been the world’s first valuable natural resource, but sealing represented one of young America’s first experiences of boom-and-bust resource extraction beyond its borders.

With increasing frequency starting in the early 1790s and then in a mad rush beginning in 1798, ships left New Haven, Norwich, Stonington, New London, and Boston, heading for the great half-moon archipelago of remote islands running from Argentina in the Atlantic to Chile in the Pacific.  They were on the hunt for the fur seal, which wears a layer of velvety down like an undergarment just below an outer coat of stiff gray-black hair.

In Moby-Dick, Melville portrayed whaling as the American industry.  Brutal and bloody but also humanizing, work on a whale ship required intense coordination and camaraderie.  Out of the gruesomeness of the hunt, the peeling of the whale’s skin from its carcass, and the hellish boil of the blubber or fat, something sublime emerged: human solidarity among the workers.  And like the whale oil that lit the lamps of the world, divinity itself glowed from the labor: “Thou shalt see it shining in the arm that wields a pick or drives a spike; that democratic dignity which, on all hands, radiates without end from God.”

Sealing was something else entirely.  It called to mind not industrial democracy but the isolation and violence of conquest, settler colonialism, and warfare.  Whaling took place in a watery commons open to all.  Sealing took place on land.  Sealers seized territory, fought one another to keep it, and pulled out what wealth they could as fast as they could before abandoning their empty and wasted island claims.  The process pitted desperate sailors against equally desperate officers in as all-or-nothing a system of labor relations as can be imagined.

In other words, whaling may have represented the Promethean power of proto-industrialism, with all the good (solidarity, interconnectedness, and democracy) and bad (the exploitation of men and nature) that went with it, but sealing better predicted today’s postindustrial extracted, hunted, drilled, fracked, hot, and strip-mined world.

Seals were killed by the millions and with a shocking casualness.  A group of sealers would get between the water and the rookeries and simply start clubbing.  A single seal makes a noise like a cow or a dog, but tens of thousands of them together, so witnesses testified, sound like a Pacific cyclone.  Once we “began the work of death,” one sealer remembered, “the battle caused me considerable terror.”

South Pacific beaches came to look like Dante’s Inferno.  As the clubbing proceeded, mountains of skinned, reeking carcasses piled up and the sands ran red with torrents of blood.  The killing was unceasing, continuing into the night by the light of bonfires kindled with the corpses of seals and penguins.

And keep in mind that this massive kill-off took place not for something like whale oil, used by all for light and fire.  Seal fur was harvested to warm the wealthy and meet a demand created by a new phase of capitalism: conspicuous consumption.  Pelts were used for ladies’ capes, coats, muffs, and mittens, and gentlemen’s waistcoats.  The fur of baby pups wasn’t much valued, so some beaches were simply turned into seal orphanages, with thousands of newborns left to starve to death.  In a pinch, though, their downy fur, too, could be used — to make wallets.

Occasionally, elephant seals would be taken for their oil in an even more horrific manner: when they opened their mouths to bellow, their hunters would toss rocks in and then begin to stab them with long lances.  Pierced in multiple places like Saint Sebastian, the animals’ high-pressured circulatory system gushed “fountains of blood, spouting to a considerable distance.”

At first the frenetic pace of the killing didn’t matter: there were so many seals.  On one island alone, Amasa Delano estimated, there were “two to three millions of them” when New Englanders first arrived to make “a business of killing seals.”

“If many of them were killed in a night,” wrote one observer, “they would not be missed in the morning.”  It did indeed seem as if you could kill every one in sight one day, then start afresh the next.  Within just a few years, though, Amasa and his fellow sealers had taken so many seal skins to China that Canton’s warehouses couldn’t hold them.  They began to pile up on the docks, rotting in the rain, and their market price crashed.

To make up the margin, sealers further accelerated the pace of the killing — until there was nothing left to kill.  In this way, oversupply and extinction went hand in hand.  In the process, cooperation among sealers gave way to bloody battles over thinning rookeries.  Previously, it only took a few weeks and a handful of men to fill a ship’s hold with skins.  As those rookeries began to disappear, however, more and more men were needed to find and kill the required number of seals and they were often left on desolate islands for two- or three-year stretches, living alone in miserable huts in dreary weather, wondering if their ships were ever going to return for them.

“On island after island, coast after coast,” one historian wrote, “the seals had been destroyed to the last available pup, on the supposition that if sealer Tom did not kill every seal in sight, sealer Dick or sealer Harry would not be so squeamish.”  By 1804, on the very island where Amasa estimated that there had been millions of seals, there were more sailors than prey.  Two years later, there were no seals at all.

The Machinery of Civilization

There exists a near perfect inverse symmetry between the real Amasa and the fictional Ahab, with each representing a face of the American Empire.  Amasa is virtuous, Ahab vengeful.  Amasa seems trapped by the shallowness of his perception of the world.  Ahab is profound; he peers into the depths.  Amasa can’t see evil (especially his own). Ahab sees only nature’s “intangible malignity.”

Both are representatives of the most predatory industries of their day, their ships carrying what Delano once called the “machinery of civilization” to the Pacific, using steel, iron, and fire to kill animals and transform their corpses into value on the spot.

Yet Ahab is the exception, a rebel who hunts his white whale against all rational economic logic.  He has hijacked the “machinery” that his ship represents and rioted against “civilization.”  He pursues his quixotic chase in violation of the contract he has with his employers.  When his first mate, Starbuck, insists that his obsession will hurt the profits of the ship’s owners, Ahab dismisses the concern: “Let the owners stand on Nantucket beach and outyell the Typhoons. What cares Ahab?  Owners, Owners?  Thou art always prating to me, Starbuck, about those miserly owners, as if the owners were my conscience.”

Insurgents like Ahab, however dangerous to the people around them, are not the primary drivers of destruction.  They are not the ones who will hunt animals to near extinction — or who are today forcing the world to the brink.  Those would be the men who never dissent, who, either at the frontlines of extraction or in the corporate backrooms, administer the destruction of the planet, day in, day out, inexorably, unsensationally, without notice, their actions controlled by an ever greater series of financial abstractions and calculations made in the stock exchanges of New York, London, and Shanghai.

If Ahab is still the exception, Delano is still the rule.  Throughout his long memoir, he reveals himself as ever faithful to the customs and institutions of maritime law, unwilling to take any action that would injure the interests of his investors and insurers.  “All bad consequences,” he wrote, describing the importance of protecting property rights, “may be avoided by one who has a knowledge of his duty, and is disposed faithfully to obey its dictates.”

It is in Delano’s reaction to the West African rebels, once he finally realizes he has been the target of an elaborately staged con, that the distinction separating the sealer from the whaler becomes clear.  The mesmeric Ahab — the “thunder-cloven old oak” — has been taken as a prototype of the twentieth-century totalitarian, a one-legged Hitler or Stalin who uses an emotional magnetism to convince his men to willingly follow him on his doomed hunt for Moby Dick.

Delano is not a demagogue.  His authority is rooted in a much more common form of power: the control of labor and the conversion of diminishing natural resources into marketable items.  As seals disappeared, however, so too did his authority.  His men first began to grouse and then conspire.  In turn, Delano had to rely ever more on physical punishment, on floggings even for the most minor of offences, to maintain control of his ship — until, that is, he came across the Spanish slaver.  Delano might have been personally opposed to slavery, yet once he realized he had been played for a fool, he organized his men to retake the slave ship and violently pacify the rebels.  In the process, they disemboweled some of the rebels and left them writhing in their viscera, using their sealing lances, which Delano described as “exceedingly sharp and as bright as a gentleman’s sword.”

Caught in the pincers of supply and demand, trapped in the vortex of ecological exhaustion, with no seals left to kill, no money to be made, and his own crew on the brink of mutiny, Delano rallied his men to the chase — not of a white whale but of black rebels.  In the process, he reestablished his fraying authority.  As for the surviving rebels, Delano re-enslaved them.  Propriety, of course, meant returning them and the ship to its owners.

Our Amasas, Ourselves

With Ahab, Melville looked to the past, basing his obsessed captain on Lucifer, the fallen angel in revolt against the heavens, and associating him with America’s “manifest destiny,” with the nation’s restless drive beyond its borders.  With Amasa, Melville glimpsed the future.  Drawing on the memoirs of a real captain, he created a new literary archetype, a moral man sure of his righteousness yet unable to link cause to effect, oblivious to the consequences of his actions even as he careens toward catastrophe.

They are still with us, our Amasas.  They have knowledge of their duty and are disposed faithfully to follow its dictates, even unto the ends of the Earth.

TomDispatch regular Greg Grandin’s new book, The Empire of Necessity:  Slavery, Freedom, and Deception in the New World, has just been published. 

Copyright 2014 Greg Grandin

The Latin American Exception

The map tells the story.  To illustrate a damning new report, “Globalizing Torture: CIA Secret Detentions and Extraordinary Rendition,” recently published by the Open Society Institute, the Washington Post put together an equally damning graphic: it’s soaked in red, as if with blood, showing that in the years after 9/11, the CIA turned just about the whole world into a gulag archipelago.

Back in the early twentieth century, a similar red-hued map was used to indicate the global reach of the British Empire, on which, it was said, the sun never set.  It seems that, between 9/11 and the day George W. Bush left the White House, CIA-brokered torture never saw a sunset either.

All told, of the 190-odd countries on this planet, a staggering 54 participated in various ways in this American torture system, hosting CIA “black site” prisons, allowing their airspace and airports to be used for secret flights, providing intelligence, kidnapping foreign nationals or their own citizens and handing them over to U.S. agents to be “rendered” to third-party countries like Egypt and Syria.  The hallmark of this network, Open Society writes, has been torture.  Its report documents the names of 136 individuals swept up in what it says is an ongoing operation, though its authors make clear that the total number, implicitly far higher, “will remain unknown” because of the “extraordinary level of government secrecy associated with secret detention and extraordinary rendition.”

No region escapes the stain.  Not North America, home to the global gulag’s command center.  Not Europe, the Middle East, Africa, or Asia.  Not even social-democratic Scandinavia.  Sweden turned at least two people over to the CIA; they were then rendered to Egypt, where they were subjected to electric shocks, among other abuses.  No region, that is, except Latin America.

What’s most striking about the Post’s map is that no part of its wine-dark horror touches Latin America; that is, not one country in what used to be called Washington’s “backyard” participated in rendition or Washington-directed or supported torture and abuse of “terror suspects.”  Not even Colombia, which throughout the last two decades was as close to a U.S.-client state as existed in the area.  It’s true that a fleck of red should show up on Cuba, but that would only underscore the point: Teddy Roosevelt took Guantánamo Bay Naval Base for the U.S. in 1903 “in perpetuity.”

Two, Three, Many CIAs 

How did Latin America come to be territorio libre in this new dystopian world of black sites and midnight flights, the Zion of this militarist matrix (as fans of the Wachowskis’ movies might put it)?  After all, it was in Latin America that an earlier generation of U.S. and U.S.-backed counterinsurgents put into place a prototype of Washington’s twenty-first century Global War on Terror.

Even before the 1959 Cuban Revolution, before Che Guevara urged revolutionaries to create “two, three, many Vietnams,” Washington had already set about establishing two, three, many centralized intelligence agencies in Latin America.  As Michael McClintock shows in his indispensable book Instruments of Statecraft, in late 1954, a few months after the CIA’s infamous coup in Guatemala that overthrew a democratically elected government, the National Security Council first recommended strengthening “the internal security forces of friendly foreign countries.”

In the region, this meant three things.  First, CIA agents and other U.S. officials set to work “professionalizing” the security forces of individual countries like Guatemala, Colombia, and Uruguay; that is, turning brutal but often clumsy and corrupt local intelligence apparatuses into efficient, “centralized,” still brutal agencies, capable of gathering information, analyzing it, and storing it.  Most importantly, they were to coordinate different branches of each country’s security forces — the police, military, and paramilitary squads — to act on that information, often lethally and always ruthlessly.

Second, the U.S. greatly expanded the writ of these far more efficient and effective agencies, making it clear that their portfolio included not just national defense but international offense.  They were to be the vanguard of a global war for “freedom” and of an anticommunist reign of terror in the hemisphere.  Third, our men in Montevideo, Santiago, Buenos Aires, Asunción, La Paz, Lima, Quito, San Salvador, Guatemala City, and Managua were to help synchronize the workings of individual national security forces.

The result was state terror on a nearly continent-wide scale.  In the 1970s and 1980s, Chilean dictator Augusto Pinochet’s Operation Condor, which linked together the intelligence services of Argentina, Brazil, Uruguay, Paraguay, and Chile, was the most infamous of Latin America’s transnational terror consortiums, reaching out to commit mayhem as far away as Washington, D.C., Paris, and Rome.  The U.S. had earlier helped put in place similar operations elsewhere in the hemisphere, especially in Central America in the 1960s.

By the time the Soviet Union collapsed in 1991, hundreds of thousands of Latin Americans had been tortured, killed, disappeared, or imprisoned without trial, thanks in significant part to U.S. organizational skills and support.  Latin America was, by then, Washington’s backyard gulag.  Three of the region’s current presidents — Uruguay’s José Mujica, Brazil’s Dilma Rousseff, and Nicaragua’s Daniel Ortega — were victims of this reign of terror.

When the Cold War ended, human rights groups began the herculean task of dismantling the deeply embedded, continent-wide network of intelligence operatives, secret prisons, and torture techniques — and of pushing militaries throughout the region out of governments and back into their barracks.  In the 1990s, Washington not only didn’t stand in the way of this process, but actually lent a hand in depoliticizing Latin America’s armed forces.  Many believed that, with the Soviet Union dispatched, Washington could now project its power in its own “backyard” through softer means like international trade agreements and other forms of economic leverage.  Then 9/11 happened.

“Oh My Goodness”

In late November 2002, just as the basic outlines of the CIA’s secret detention and extraordinary rendition programs were coming into shape elsewhere in the world, Secretary of Defense Donald Rumsfeld flew 5,000 miles to Santiago, Chile, to attend a hemispheric meeting of defense ministers.  “Needless to say,” Rumsfeld nonetheless said, “I would not be going all this distance if I did not think this was extremely important.” Indeed.

This was after the invasion of Afghanistan but before the invasion of Iraq, and Rumsfeld was riding high, as well as dropping the phrase “September 11th” every chance he got.  Maybe he didn’t know of the special significance that date had in Latin America, but 29 years earlier, on the first 9/11, a CIA-backed coup by General Pinochet and his military had led to the death of Chile’s democratically elected president Salvador Allende.  Or did he, in fact, know just what it meant and was that the point?  After all, a new global fight for freedom, a proclaimed Global War on Terror, was underway and Rumsfeld had arrived to round up recruits.

There, in Santiago, the city out of which Pinochet had run Operation Condor, Rumsfeld and other Pentagon officials tried to sell what they were now terming the “integration” of “various specialized capabilities into larger regional capabilities” — an insipid way of describing the kidnapping, torturing, and death-dealing already underway elsewhere. “Events around the world before and after September 11th suggest the advantages,” Rumsfeld said, of nations working together to confront the terror threat.

“Oh my goodness,” Rumsfeld told a Chilean reporter, “the kinds of threats we face are global.”  Latin America was at peace, he admitted, but he had a warning for its leaders: they shouldn’t lull themselves into believing that the continent was safe from the clouds gathering elsewhere.  Dangers exist: “old threats, such as drugs, organized crime, illegal arms trafficking, hostage taking, piracy, and money laundering; new threats, such as cyber-crime; and unknown threats, which can emerge without warning.”

“These new threats,” he added ominously, “must be countered with new capabilities.” Thanks to the Open Society report, we can see exactly what Rumsfeld meant by those “new capabilities.”

A few weeks prior to Rumsfeld’s arrival in Santiago, for example, the U.S., acting on false information supplied by the Royal Canadian Mounted Police, detained Maher Arar, who holds dual Syrian and Canadian citizenship, at New York’s John F. Kennedy airport and then handed him over to a “Special Removal Unit.” He was flown first to Jordan, where he was beaten, and then to Syria, a country in a time zone five hours ahead of Chile, where he was turned over to local torturers.  On November 18th, when Rumsfeld was giving his noon speech in Santiago, it was five in the afternoon in Arar’s “grave-like” cell in a Syrian prison, where he would spend the next year being abused.

Ghairat Baheer was captured in Pakistan about three weeks before Rumsfeld’s Chile trip, and thrown into a CIA-run prison in Afghanistan called the Salt Pit.  As the secretary of defense praised Latin America’s return to the rule of law after the dark days of the Cold War, Baheer may well have been in the middle of one of his torture sessions, “hung naked for hours on end.”

Taken a month before Rumsfeld’s visit to Santiago, the Saudi national Abd al Rahim al Nashiri was transported to the Salt Pit, after which he was transferred “to another black site in Bangkok, Thailand, where he was waterboarded.” After that, he was passed on to Poland, Morocco, Guantánamo, Romania, and back to Guantánamo, where he remains.  Along the way, he was subjected to a “mock execution with a power drill as he stood naked and hooded,” and had U.S. interrogators rack a “semi-automatic handgun close to his head as he sat shackled before them.”  His interrogators also “threatened to bring in his mother and sexually abuse her in front of him.”

Likewise a month before the Santiago meeting, the Yemeni Bashi Nasir Ali Al Marwalah was flown to Camp X-Ray in Cuba, where he remains to this day.

Less than two weeks after Rumsfeld swore that the U.S. and Latin America shared “common values,” Mullah Habibullah, an Afghan national, died “after severe mistreatment” in CIA custody at something called the “Bagram Collection Point.” A U.S. military investigation “concluded that the use of stress positions and sleep deprivation combined with other mistreatment… caused, or were direct contributing factors in, his death.”

Two days after the secretary’s Santiago speech, a CIA case officer in the Salt Pit had Gul Rahman stripped naked and chained to a concrete floor without blankets.  Rahman froze to death.

And so the Open Society report goes… on and on and on.

Territorio Libre 

Rumsfeld left Santiago without firm commitments.  Some of the region’s militaries were tempted by the supposed opportunities offered by the secretary’s vision of fusing crime fighting into an ideological campaign against radical Islam, a unified war in which all was to be subordinated to U.S. command.  As political scientist Brian Loveman has noted, around the time of Rumsfeld’s Santiago visit, the head of the Argentine army picked up Washington’s latest set of themes, insisting that “defense must be treated as an integral matter,” without a false divide separating internal and external security.

But history was not on Rumsfeld’s side.  His trip to Santiago coincided with Argentina’s epic financial meltdown, among the worst in recorded history.  It signaled a broader collapse of the economic model — think of it as Reaganism on steroids — that Washington had been promoting in Latin America since the late Cold War years.  Soon, a new generation of leftists would be in power across much of the continent, committed to the idea of national sovereignty and limiting Washington’s influence in the region in a way that their predecessors hadn’t been.

Hugo Chávez was already president of Venezuela.  Just a month before Rumsfeld’s Santiago trip, Luiz Inácio Lula da Silva won the presidency of Brazil. A few months later, in early 2003, Argentines elected Néstor Kirchner, who shortly thereafter ended his country’s joint military exercises with the U.S.  In the years that followed, the U.S. experienced one setback after another.  In 2008, for instance, Ecuador evicted the U.S. military from Manta Air Base.

In that same period, the Bush administration’s rush to invade Iraq, an act most Latin American countries opposed, helped squander whatever was left of the post-9/11 goodwill the U.S. had in the region.  Iraq seemed to confirm the worst suspicions of the continent’s new leaders: that what Rumsfeld was trying to peddle as an international “peacekeeping” force would be little more than a bid to use Latin American soldiers as Gurkhas in a revived unilateral imperial war.

Brazil’s “Smokescreen”

Diplomatic cables released by Wikileaks show the degree to which Brazil rebuffed efforts to paint the region red on Washington’s new global gulag map.

A May 2005 U.S. State Department cable, for instance, reveals that Lula’s government refused “multiple requests” by Washington to take in released Guantánamo prisoners, particularly a group of about 15 Uighurs the U.S. had been holding since 2002, who could not be sent back to China.

“[Brazil’s] position regarding this issue has not changed since 2003 and will likely not change in the foreseeable future,” the cable said.  It went on to report that Lula’s government considered the whole system Washington had set up at Guantánamo (and around the world) to be a mockery of international law.  “All attempts to discuss this issue” with Brazilian officials, the cable concluded, “were flatly refused or accepted begrudgingly.”

In addition, Brazil refused to cooperate with the Bush administration’s efforts to create a Western Hemisphere-wide version of the Patriot Act.  It stonewalled, for example, on requests to revise its legal code in a way that would lower the standard of evidence needed to prove conspiracy while widening the definition of what criminal conspiracy entailed.

Lula stalled for years on the initiative, but it seems that the State Department didn’t realize he was doing so until April 2008, when one of its diplomats wrote a memo calling Brazil’s supposed interest in reforming its legal code to suit Washington a “smokescreen.”  The Brazilian government, another Wikileaked cable complained, was afraid that a more expansive definition of terrorism would be used to target “members of what they consider to be legitimate social movements fighting for a more just society.” Apparently, there was no way to “write an anti-terrorism legislation that excludes the actions” of Lula’s left-wing social base.

One U.S. diplomat complained that this “mindset” — that is, a mindset that actually valued civil liberties — “presents serious challenges to our efforts to enhance counterterrorism cooperation or promote passage of anti-terrorism legislation.”  In addition, the Brazilian government worried that the legislation would be used to go after Arab-Brazilians, of whom there are many.  One can imagine that if Brazil and the rest of Latin America had signed up to participate in Washington’s rendition program, Open Society would have a lot more Middle Eastern-sounding names to add to its list.

Finally, cable after Wikileaked cable revealed that Brazil repeatedly brushed off efforts by Washington to isolate Venezuela’s Hugo Chávez, which would have been a necessary step if the U.S. was going to marshal South America into its counterterrorism posse.

In February 2008, for example, U.S. ambassador to Brazil Clifford Sobel met with Lula’s Minister of Defense Nelson Jobim to complain about Chávez.  Jobim told Sobel that Brazil shared his “concern about the possibility of Venezuela exporting instability.”  But rather than “isolating Venezuela,” which might only “lead to further posturing,” Jobim indicated that his government “supports [the] creation of a ‘South American Defense Council’ to bring Chavez into the mainstream.”

There was only one catch here: that South American Defense Council was Chávez’s idea in the first place!  It was part of his effort, in partnership with Lula, to create independent institutions parallel to those controlled by Washington.  The memo concluded with the U.S. ambassador noting how curious it was that Brazil would use Chávez’s “idea for defense cooperation” as part of a “supposed containment strategy” of Chávez.

Monkey-Wrenching the Perfect Machine of Perpetual War

Unable to put in place its post-9/11 counterterrorism framework in all of Latin America, the Bush administration retrenched.  It attempted instead to build a “perfect machine of perpetual war” in a corridor running from Colombia through Central America to Mexico.  The process of militarizing that more limited region, often under the guise of fighting “the drug wars,” has, if anything, escalated in the Obama years.  Central America has, in fact, become the only place Southcom — the Pentagon command that covers Central and South America — can operate more or less at will.  A look at this other map, put together by the Fellowship of Reconciliation, makes the region look like one big landing strip for U.S. drones and drug-interdiction flights.

Washington does continue to push and probe further south, trying yet again to establish a firmer military foothold in the region and rope it into what is now a less ideological and more technocratic crusade, but one still global in its aspirations.  U.S. military strategists, for instance, would very much like to have an airstrip in French Guiana or the part of Brazil that bulges out into the Atlantic.  The Pentagon would use it as a stepping stone to its increasing presence in Africa, coordinating the work of Southcom with the newest global command, Africom.

But for now, South America has thrown a monkey wrench into the machine.  Returning to that Washington Post map, it’s worth memorializing the simple fact that, in one part of the world, in this century at least, the sun never rose on U.S.-choreographed torture.

Greg Grandin is a TomDispatch regular and the author of Fordlandia: The Rise and Fall of Henry Ford’s Forgotten Jungle City, a finalist for a Pulitzer Prize.  Later this year, his new book, The Empire of Necessity: Slavery, Freedom, and Deception in the New World, will be published by Metropolitan Books.

Copyright 2013 Greg Grandin

Glenn Beck, America’s Historian Laureate

Americans, it’s been said, learn geography when they go to war.  Now, it seems, many get their history when they go to a Tea Party rally or tune in to Glenn Beck.

History is a “battlefield of ideas,” as Beck recently put it, while looking professorial in front of a blackboard filled with his trademark circled names connected by multidirectional arrows, his hands covered with chalk dust.  In this struggle, movement historians like Beck go all in, advancing a comprehensive interpretation of American history meant to provide analytical clarity to believers and potential converts alike.  As paranoid as it may be, this history is neither radical nor revisionist, since the Tea Party activists and their fellow travelers pluck at some of the major chords of American nationalism.

It’s easy to dismiss the iconography of the movement: the wigs and knee breeches, the founding-father fetishism, the coiled snakes, and, yes, the tea bags.  It’s no less easy to laugh at recent historical howlers like the claims of Dick Armey, who heads FreedomWorks, a corporate Tea Party front, that Jamestown was settled by “socialists” or the Texas School Board’s airbrushing of Deist Thomas Jefferson from its history textbooks.  It’s fun to ridicule Beck, as Jon Stewart recently did, when he goes all “Da Vinci Code,” and starts connecting Woodrow Wilson, Mussolini, and ACORN in order to explain 2008’s economic collapse.

But historical analysis is about making connections, and there is, in fact, coherence to the Tea Party version of history, which allows conservative cadres not just to interpret the world but to act in it.  And yes, it is all about race.

The 1040 Archipelago

At the heart of Tea Party history is the argument that “progressivism is fascism is communism.”  Conceptually, such a claim helps frame what many call “American exceptionalism,” a belief that the exclusive role of government is to protect individual rights — to speech, to assembly, to carry guns, and, of course, to own property — and not to deliver social rights like health care, education, or welfare.

At Tea Party rallies and on right-wing blogs, it’s common to hear that, since the time of President Woodrow Wilson, progressives have been waging a “hundred-year-long war” on America’s unique values.  This bit of wisdom comes directly from Beck, who has become something like the historian laureate of American exceptionalism, devoting many on-air hours to why progressivism is a threat equal to Nazism and Stalinism.

Progressives, he typically says, “started a hundred-year time bomb.  They planted it in the early 1900s.”  Beck has compared himself to “Israeli Nazi hunters,” promising, with language more easily associated with the Nazis than those who pursued them, to track down the progressive “vampires” who are “sucking the blood out of the republic.”

As Michael Lind pointed out in a recent essay at Salon.com, behind such Sturm-und-Drang language lurks a small group of relatively obscure historians, teaching in peaceful, leafy liberal arts colleges, many of them influenced by the late University of Chicago political theorist Leo Strauss.  They argue that the early twentieth-century progressive movement betrayed the very idea of universal natural rights invested in the individual, embracing instead a relativist “cult of the state.” As a result, a quest for “social justice” was elevated above the defense of “liberty” — a path which led straight to the gulag and the 1040 short form.  From there, it was an easy leap to History’s terminus: the Obamacare Death Panels.

These historians and their popular interpreters, especially Beck and Jonah Goldberg, the author of Liberal Fascism, naturally ignore the real threats to individualism that the turn-of-the-twentieth-century progressive movement was responding to — namely a massive concentration of corporate political and economic power and Gilded Age “wage slavery.” Instead, they present history as a zero-sum, all-or-nothing “battlefield of ideas,” with the founding fathers, Abraham Lincoln, and Winston Churchill on one side, and Jefferson Davis, Wilson, Franklin Roosevelt, Stalin, Hitler, and Obama on the other.  The individual versus the state.  Freedom versus slavery.

In such an epic view of American history, there is, however, a fly in the ointment or, more accurately, a Confederate in the conceptual attic — and that’s the inability of the Tea Party and affiliated right-wing movements to whistle past Dixie.

Is the Tea Party Racist?

Of course it is.  Polls confirm that Tea Party militants entertain deep-seated racial resentment.  In April, a New York Times/CBS News study revealed that tea partiers tend to be over 45, white, male, affluent, and educated, and to think that “too much has been made of the problems facing black people.”  A high percentage of them also believe that Obama favors blacks over whites.

But to say the movement is racist based only on the spit and vitriol hurled at African-American congressmen and civil rights activists like Emanuel Cleaver, or on the placards depicting Obama as a monkey or a pimp, allows for rebuttal.  The minute the reality of the spitting incident is challenged and “Don’t Tread on Me” is substituted for “Go Back to Kenya,” voilà, the movement is instantly as wholesome as apple pie.

A debate over a recent University of Washington poll helps us understand why the movement is racist no matter which slogans and symbols it chooses to use.  The poll found that “support for the Tea Party remains a valid predictor of racial resentment.”  When right-wingers offered the criticism that the pollsters’ methodology conflated racism with support for small-government ideology, the pollsters reexamined their data and found themselves in agreement (of a sort) with their critics.  “Ideology,” they wrote in a follow-up, was indeed an important factor, for “as people become more conservative, it increases by 23 percent the chance that they’re racially resentful.”  In other words, it wasn’t membership in the Tea Party movement per se that predicted racism, but conservatism itself (though the Tea Party does have a higher percentage of racially resentful members than the conservative movement in general).

This should surprise no one.  After all, the Founding Fathers cut Thomas Jefferson’s description of slavery as an “execrable commerce” and an “assemblage of horrors” from the final draft of the Declaration of Independence, and race has been crucially embedded in the conception of the patriot ideal of the sovereign individual ever since.  As Harvard historian Jill Lepore has written about the original Boston Tea Party, the colonists had a choice: “either abolish slavery… [or] resist parliamentary rule.  It could not do both.”  Many in Virginia, of course, didn’t want to choose.  Instead, they simply defined the defense of slavery as part of American liberty.

While Jefferson, himself a slaveholder, failed in his effort to extend the notion of individual inalienable rights to blacks, he was successful in setting two rhetorical precedents that would continue to influence American political culture.  First, he used chattel slavery as a metaphor for British tyranny, equating the oppression of Africans with the oppression of the white colonists.  At the same time, he stoked racial fears to incite rebellion: King George III, he wrote, was “exciting” blacks to “rise in arms among us, and to purchase that liberty of which he has deprived them by murdering” whites.  One could draw a straight line from these words to George H.W. Bush’s infamous 1988 Willie Horton ad.

From then on, the ideal of the assertion and protection of individual rights was regularly bound up with racial demonology.  Anglo genocidal campaigns against Native Americans and the theft of their land, for instance, contributed to the influential property theories of John Locke, who, before Beck arrived on the scene, was considered “America’s philosopher,” the man most associated with the notion of God-given inalienable individual rights and restricted government.

Once such theories were formulated, they were then used to further justify dispossession, contributing, as law professor Howard Berman put it, to the “Americanization of the law of real property.”  The nineteenth century was known for a frenzied speculative capitalism that generated staggering inequality.  At the same time, eliminationist wars that drove Indian removal, the illegal invasion of Mexico by the United States in 1846, and the ongoing subjugation of African Americans helped stabilize the Daniel Boone-like image of a disciplined, propertied, white male self — and did so by contrasting it with racial enemies who were imagined to be unbridled (like the speculative capitalists), but also abject and property-less.

The Civil War cemented the metaphor whereby the free individual was defined by (and endangered by) his opposite, the slave, and has been used ever since to frame conflicts that often, on the surface at least, don’t seem to be about race at all.  It’s a point nicely illustrated recently by Dale Robertson, a prominent Tea Party organizer, who carried a sign at a rally that read: “Congress = Slaveowner, Taxpayer = Niggar.”  Beck, for his part, has identified ACORN, the Service Employees International Union or SEIU, the census, and the healthcare bill, among other threats, as laying the foundation for a “modern-day slave state” in which, of course, his overwhelmingly white following could be reduced to the status of slaves.  As to progressives, he has said that, “back in Samuel Adams’ day, they used to call them tyrants. A little later I think they were also called slave owners, people who encourage you to become more dependent on them.”

Sometimes, though, it really is just about race: “Obama’s Plan,” announced one placard at a Wisconsin Tea Party gathering, would lead to “White Slavery.”

Lock-And-Load Populism

When Tea Partiers say “Obama is trying to turn us into something we are not,” as one did recently on cable TV, they are not wrong.  It’s an honest statement, acknowledging that attempts to implement any government policies to help the poor would signal an assault on American exceptionalism, defined by Beck and likeminded others as extreme individualism.

The issue is not really the specific content of any particular policy.  As any number of frustrated observers can testify, it is no use pointing out that, say, the healthcare legislation that passed is fundamentally conservative and similar to past Republican healthcare plans, or that Obama has actually lowered taxes for most Americans, or that he gets an F rating from the Brady Campaign to Prevent Gun Violence.  The issue is the idea of public policy itself, which, for many on the right, violates an ideal of absolute individual rights.

In other words, progressive taxation, policy, and regulation, no matter how mild — or, for that matter, social “justice” and the “common good,” qualities the Texas School Board recently deleted from its textbook definition of “good citizenship” — are not simply codes for race.  They are race.  To put it another way, individual supremacy has been, historically speaking, white supremacy.

This helps explain why the anti-Obama backlash finds it impossible to restrain its Tourette-like compulsion to frame its fight in references to the Civil War, its rhetorical spasms invoking secession and nullification, or its urge to carry Confederate flags as well as signs equating taxpayers with slaves.  That America’s first Black president’s first major social legislation was health care — something so intimately, even invasively, about the body, the place where the social relations of race are physically inscribed (and recorded in differential mortality rates) — pushed the world-turned-upside-down carnival on display every night on Fox News, where the privileged fancy themselves powerless, another step toward the absurd.

The deepest contradiction may, however, lie in this: the teabaggers who reject any move by Big Government when it comes to social policy at home remain devoted, as Andrew Sullivan recently wrote, to the Biggest Budget-Busting Government of All, the “military-industrial-ideological complex” and its all-powerful commander-in-chief executive (and surprising numbers of them are also dependent on that complex’s give-away welfare state when it comes to their livelihoods).

As James Bovard, a consistent libertarian, has observed, “many ‘tea party’ activists staunchly oppose big government, except when it is warring, wiretapping, or waterboarding.”  For all the signs asking “Who is John Galt?,” the movement has openly embraced Arizona’s new “show-me-your-papers” immigration law and mutters not one complaint over the fact that America is “the most incarcerated society on earth,” something Robert Perkinson detailed in Texas Tough, his book on the Lone Star roots of the U.S. penitentiary system.  The skin color of those being tortured, rounded up, and jailed obviously has something to do with the selective libertarianism of much of the conservative movement. But this passion for pain and punishment is also an admission that the crisis-prone ideal of absolute individualism, forged in racial violence, would be unsustainable without further state violence.

Behind the lock-and-load populism and the kitsch calls to “rearm for revolution” is a recognition that the right’s agenda of corporate deregulation — the effects of which are evident in exploding coal mines in West Virginia and apocalyptic oil spills in the Gulf of Mexico — can only be achieved through ceaseless mobilization against enemies domestic and foreign.

Here’s an example: “I know that the safety and health of coal miners is my most important job,” said Don Blankenship at a corporate-funded Friends of America rally held in West Virginia last Labor Day, where speakers such as Ted Nugent and Sean Hannity spoke out against tyrants, regulation, “Obama and his cronies,” taxes, cap-and-trade legislation, unnamed “cockroaches,” China, green technology, and, naturally, gun control.  Blankenship just happens to be the CEO of Massey Energy, owner of the Upper Big Branch mine where 29 workers recently lost their lives.

He is also famous for waving the banner of individual rights even as he presides over a company that any totalitarian state worth its salt would envy, one that intimidates “its workers into a type of lock-step compliance that most often takes the form of silence,” including threats to fire workers who take time off to attend the funerals of the dead miners.  Wrapping himself in the American flag — literally, wearing a stars-and-stripes shirt and baseball cap — Blankenship told that Labor Day crowd that he didn’t “need Washington politicians to tell” him about mine safety.  Seven months later, 29 miners are dead.

The End of American Exceptionalism

And here’s the irony, or one of them anyway: in the process of defining American exceptionalism as little more than a pitchfork loyalty to individual rights, Beck and other right-wingers are themselves becoming the destroyers of what was exceptional, governmentally speaking, about the United States.  Like John Locke’s celebration of inalienable rights, Founding Father James Madison’s distrust of the masses became a distinctive feature of American political culture.  Madison valued individual rights, but he saw in the tripartite American system of government he worked hard to help fashion a bulwark meant to contain the passions he knew those rights generated.  “Liberty is to faction what air is to fire,” he wrote in 1787, and in the centuries that followed, American politicians would consistently define their unique democracy against the populist and revolutionary excesses of other countries.

Today, though, not just Fox News Jacobins like Beck and Hannity but nearly the entire leadership of the Republican Party are fanning those flames.  Newt Gingrich hopes the Tea Party will become the “militant wing of the Republican Party,” looking to hitch his political fortunes to a movement now regularly calling for a “second bloody revolution.”  It is hard to think of another time in American history when one half of the political establishment has so wholly embraced insurrectionary populism as an electoral strategy.

Considering the right’s success at mimicking the organizing tactics of the left, it would be tempting to see recent calls for rebellion and violence as signs that the conservative movement is entering its Weathermen phase — the moment in the 1960s and 1970s when some left-wing activists succumbed to revolutionary fantasies, contributing to the New Left’s crackup.   Except that violence did not really come all that easy to the American leftists of that moment.  There was endless theorizing and agonizing, Leninist justifying and Dostoevskian moralizing, from which the left, considering the ongoing finger-pointing and mea culpas, still hasn’t recovered.

In contrast, conservative entitlement to the threat of violence is so baked into American history that, in moments like this, it seems to be taken for granted.  The Tea Party crowd, along with its militia, NRA, and Oath Keeper friends, would just as easily threaten to overthrow the federal government — or waterboard Nancy Pelosi — as go golfing.

On the 15th anniversary of the bombing of the Oklahoma City federal building, which left 168 people dead and some 600 wounded, gun-rights militants held a rally on the National Mall in Washington, along with a smaller, heavily armed one across the Potomac, where speaker after speaker threatened revolution and invoked the federal siege of Waco to justify the Oklahoma City bombing.  This is the kind of militancy Gingrich believes the Republicans can harness and which he tenderly calls a “natural expression” of frustration.

Where all this will lead, who knows?   But you still “don’t need a weatherman to know which way the wind blows.”

Greg Grandin is a professor of history at New York University.  His most recent book, Fordlandia: The Rise and Fall of Henry Ford’s Forgotten Jungle City, just published in paperback, was a finalist for the Pulitzer Prize, the National Book Award, and the National Book Critics Circle Award, and was picked by the New York Times, the New Yorker, and NPR for their “best of” lists.   A new edition of his previous book, Empire’s Workshop: Latin America, the United States and the Rise of the New Imperialism, will be published later this year.

Copyright 2010 Greg Grandin

This article was originally posted at TomDispatch.com.

Touring Empire’s Ruins

The empire ends with a pullout. Not, as many supposed a few years ago, from Iraq. There, as well as in Afghanistan, we are mulishly staying the course, come what may, trapped in the biggest of all the “too-big-to-fail” boondoggles. But from Detroit.

Of course, the real evacuation of the Motor City began decades ago, when Ford, General Motors, and Chrysler started to move more and more of their operations out of the downtown area to harder-to-unionize rural areas and suburbs, and, finally, overseas. Even as the economy boomed in the 1950s and 1960s, 50 Detroit residents were already packing up and leaving their city every day. By the time the Berlin Wall fell in 1989, Detroit could count tens of thousands of empty lots and over 15,000 abandoned homes. Stunning Beaux Arts and modernist buildings were left deserted to return to nature, their floors and roofs covered by switchgrass. They now serve as little more than ornate bird houses.

In mythological terms, however, Detroit remains the ancestral birthplace of storied American capitalism. And looking back in the years to come, the sudden disintegration of the Big Three this year will surely be seen as a blow to American power comparable to the end of the Raj — Britain’s loss of India, that jewel in the imperial crown, in 1947. Forget the possession of a colony or the bomb: in the second half of the twentieth century, the real marker of a world power was the ability to make a precision V-8.

There have been dissections aplenty of what went wrong with the U.S. auto industry, as well as fond reminiscences about Detroit’s salad days, about outsized tailfins and double-barrel carburetors. Last year, the iconic Clint Eastwood even put the iconic white auto worker to rest in his movie Gran Torino. Few of these postmortems have conveyed, however, just how crucial Detroit was to U.S. foreign policy — not just as the anchor of America’s high-tech, high-profit export economy, but as a confirmation of our sense of ourselves as the world’s premier power (although in linking Detroit’s demise to the blowback from President Nixon’s illegal war in Laos, Eastwood at least came closer than most).

Detroit not only supplied a continual stream of symbols of America’s cultural power, but offered the organizational know-how necessary to run a vast industrial enterprise like a car company — or an empire. Pundits love to quote GM President “Engine” Charlie Wilson, who once famously said that he thought what was good for America “was good for General Motors, and vice versa.” It’s rarely noted, however, that Wilson made his remark at his Senate confirmation hearings to be Dwight D. Eisenhower’s Secretary of Defense. At the Pentagon, Wilson would impose GM’s corporate bureaucratic model on the armed forces, modernizing them to fight the Cold War.

After GM, it was Ford’s turn to take the reins, with John F. Kennedy tapping its CEO Robert McNamara and his “whiz kids” to ready American troops for a “long twilight struggle, year in and year out.” McNamara used Ford’s integrated “systems management” approach to wage “mechanized, dehumanizing slaughter,” as historian Gabriel Kolko once put it, from the skies over Vietnam, Laos, and Cambodia.

Perhaps, then, we should think of the ruins of Detroit as our Roman Forum. Just as Rome’s triumphal arches still remind us of its bygone imperial victories in Mesopotamia, Persia, and elsewhere, so Motown’s dilapidated buildings today invoke America’s fast slipping supremacy.

Among the most imposing is Henry Ford’s Highland Park factory, shuttered since the late 1950s. Dubbed the Crystal Palace for its floor-to-ceiling glass walls, it was here that Ford perfected assembly-line production, building up to 9,000 Model Ts a day — a million by 1915 — catapulting the United States light-years ahead of industrial Europe.

It was also here that Ford first paid his workers five dollars a day, creating one of the fastest growing and most prosperous working-class neighborhoods in all of America, filled with fine arts-and-crafts style homes. Today, Highland Park looks like a war zone, its streets covered with shattered glass and lined with burnt-out houses. More than 30% of its population lives in poverty, and you don’t want to know the unemployment numbers (more than 20%) or the median yearly income (less than $20,000).

There is one reminder that it wasn’t always so. A small historical-register plaque outside the Ford factory reads: “Mass production soon moved from here to all phases of American industry and set the pattern of abundance for 20th Century living.”

America in the Amazon

To truly grasp how far America has fallen from the heights of its industrial grandeur — and to understand how that grandeur led to stupendous acts of folly — you should tour another set of ruins far from the Midwest rustbelt; they lie, in fact, deep (and nearly forgotten) in, of all places, the Brazilian Amazon rainforest. There, overrun by tropical vines, sits Henry Ford’s testament to the belief that the American Way of Life could easily be exported, even to one of the wildest places on the planet.

Ford owned forests in Michigan as well as mines in Kentucky and West Virginia, which gave him control over every natural resource needed to make a car — save rubber. So in 1927, he obtained an Amazonian land grant the size of a small American state. Ford could have simply set up a purchasing office there, and bought rubber from local producers, leaving them to live their lives as they saw fit. That’s what other rubber exporters did.

Ford, however, had more grandiose ideas. He felt compelled to cultivate not only “rubber but the rubber gatherers as well.” So he set out to overlay Americana on Amazonia. He had his managers build Cape Cod-style shingled houses for the Brazilian work force he hired. He urged them to tend flower and vegetable gardens and eat whole wheat bread, unpolished rice, canned Michigan peaches, and oatmeal. He dubbed his jungle town, with suitable pride, Fordlandia.

It was the 1920s, of course, and so his managers enforced alcohol Prohibition, or at least tried to, though it wasn’t a Brazilian law, as it was in the United States at the time. On weekends, the company organized square dances and recitations of the poetry of Henry Longfellow. The hospital Ford had built in the town offered free health care for workers and visitors alike. It was designed by Albert Kahn, the renowned architect who built a number of Detroit’s most famous buildings, including the Crystal Palace. Fordlandia had a central square, sidewalks, indoor plumbing, manicured lawns, a movie theater, shoe stores, ice cream and perfume shops, swimming pools, tennis courts, a golf course, and, of course, Model Ts rolling down its paved streets.

The clash between Henry Ford — the man who reduced industrial production to its simplest motions in order to produce a series of infinitely identical products, the first indistinguishable from the millionth — and the Amazon, the world’s most complex and diverse ecosystem, was Chaplinesque in its absurdity, producing a parade of mishaps straight out of a Hollywood movie. Think Modern Times meets Fitzcarraldo. Brazilian workers rebelled against Ford’s Puritanism and nature rebelled against his industrial regimentation. Run by incompetent managers who knew little about rubber planting, much less social engineering, Fordlandia in its early years was plagued by vice, knife fights, and riots. The place seemed less Our Town than Deadwood, as brothels and bars sprawled around its edges.

Ford did eventually manage to get control over his namesake fiefdom, but because he insisted that his managers plant rubber trees in tight rows — back in his Detroit factories, Ford famously crowded machines close together to reduce movement — he actually created the conditions for the explosive growth of the bugs and blight that feed off rubber, and these eventually laid waste to the plantation. Over the course of nearly two decades, Ford sank millions upon millions of dollars into trying to make his jungle utopia work the American way, yet not one drop of Fordlandia latex ever made its way into a Ford car.

The eeriest thing of all is this: Today, the ruins of Fordlandia look a lot like those in Highland Park, as well as in other rustbelt towns where neighborhoods that once hummed with life centered on a factory are now returned to weed. There is, in fact, an uncanny resemblance between Fordlandia’s rusting water tower, broken-glassed sawmill, and empty power plant and the husks of the same structures in Iron Mountain, a depressed industrial city on Michigan’s Upper Peninsula that also used to be a Ford town.

In the Amazon, Albert Kahn’s hospital has collapsed, the jungle has reclaimed the golf course and tennis courts, and bats have taken up residence in houses where American managers once lived, covering their plaster walls with a glaze of guano. No commemorative plaque marks its place in history, but Fordlandia, no less than the wreck of Detroit, is a monument to the titans of American capital — none more titanic than Ford — who believed that the United States offered a universal, and universally acknowledged, model for the rest of humanity.

Errand into the Wilderness

It would be easy to read the story of Fordlandia as a parable of arrogance. With a surety of purpose and incuriosity about the world that seem all too familiar, Ford deliberately rejected expert advice and set out to turn the Amazon into the Midwest of his imagination. The more the project failed on its own terms — that is, to grow rubber — the more Ford company officials defended it as a civilizational mission; think of it as a kind of distant preview of the ever expanding set of justifications for why the U.S. invaded Iraq six years ago. Yet Fordlandia cuts deeper into the marrow of the American experience than that.

Over 50 years ago, the Harvard historian Perry Miller gave a famous lecture which he titled “Errand into the Wilderness.” In it, he tried to explain why English Puritans lit out for the New World to begin with, as opposed to, say, going to Holland. They went, Miller suggested, not just to escape the corruptions of the Church of England but to complete the Protestant reformation of Christendom that had stalled in Europe.

The Puritans did not flee to the New World, Miller said, but rather sought to give the faithful back in England a “working model” of a purer community. Put another way, central from the beginning to American expansion was “deep disquietude,” a feeling that “something had gone wrong” at home. With the Massachusetts Bay Colony just a few decades old, a dissatisfied Cotton Mather began to learn Spanish, thinking that a better “New Jerusalem” could be raised in Mexico.

The founding of Fordlandia was driven by a similar restlessness, a chafing sense, even in the good times, the best of times, that “something had gone wrong” in America. When Ford embarked on his Amazon adventure, he had already spent the greater part of two decades, and a large part of his enormous fortune, trying to reform American society. His frustrations and discontents with domestic politics and culture were legion. War, unions, Wall Street, energy monopolies, Jews, modern dance, cow’s milk, both Theodore and Franklin Roosevelt, cigarettes, and alcohol were among his many targets and complaints. Yet churning beneath all these imagined annoyances was the fact that the force of industrial capitalism he had helped unleash was undermining the world he hoped to restore.

Ford preached with a pastor’s confidence his one true idea: ever increasing productivity combined with ever increasing pay would both relieve human drudgery and create prosperous working-class communities, with corporate profits dependent on the continual expansion of consumer demand. “High wages,” as Ford put it, to create “large markets.” By the late 1920s, Fordism — as this idea came to be called — was synonymous with Americanism, envied the world over for having apparently humanized industrial capitalism.

But Fordism contained within itself the seeds of its own undoing: the breaking down of the assembly process into smaller and smaller tasks, combined with rapid advances in transportation and communication, made it easier for manufacturers to break out of the dependent relationship established by Ford between high wages and large markets. Goods could be made in one place and sold somewhere else, removing the incentive employers had to pay workers enough to buy the products they made.

In Rome, the ruins came after the empire fell. In the United States, the destruction of Detroit happened even as the country was rising to new heights as a superpower.

Ford sensed this unraveling early on and responded to it, trying at least to slow it in ever more eccentric ways. He established throughout Michigan a series of decentralized “village-industries” designed to balance farm and factory work and rescue small-town America. Yet his pastoral communes were no match for the raw power of the changes he had played such a large part in engendering. So he turned to the Amazon to raise his City on a Hill, or in this case a city in a tropical river valley, pulling together all the many strains of his utopianism in one last, desperate bid for success.

Nearly a century ago, the journalist Walter Lippmann remarked that Henry Ford’s drive to make the world anew represented a common strain of “primitive Americanism,” reinforced by a confidence born of unparalleled achievement. He then followed with a question meant to be sarcastic but which was, in fact, all too prophetic: “Why shouldn’t success in Detroit assure success in front of Baghdad?” We know the ruination that befell Detroit. Whither Baghdad? Whither America?

Greg Grandin is a professor of history at New York University and author of a number of books, most recently, Fordlandia: The Rise and Fall of Henry Ford’s Forgotten Jungle City (Metropolitan 2009). Check out a TomDispatch audio interview with Grandin about Henry Ford’s strange adventure in the Amazon by clicking here.

Copyright 2009 Greg Grandin

Losing Latin America

Google "neglect," "Washington," and "Latin America," and you will be led to thousands of hand-wringing calls from politicians and pundits for Washington to "pay more attention" to the region. True, Richard Nixon once said that "people don’t give one shit" about the place. And his National Security Advisor Henry Kissinger quipped that Latin America is a "dagger pointed at the heart of Antarctica." But Kissinger also made that same joke about Chile, Argentina, and New Zealand — and, of the three countries, only the latter didn’t suffer widespread political murder as a result of his policies, a high price to pay for such a reportedly inconsequential place.

Latin America, in fact, has been indispensable in the evolution of U.S. diplomacy. The region is often referred to as America’s "backyard," but a better metaphor might be Washington’s "strategic reserve," the place where ascendant foreign-policy coalitions regroup and redraw the outlines of U.S. power, following moments of global crisis.

When the Great Depression had the U.S. on the ropes, for example, it was in Latin America that New Deal diplomats worked out the foundations of liberal multilateralism, a diplomatic framework that Washington would put into place with much success elsewhere after World War II.

In the 1980s, the first generation of neocons turned to Latin America to play out their "rollback" fantasies — not just against Communism, but against a tottering multilateralist foreign policy. It was largely in a Central America roiled by left-wing insurgencies that the New Right first worked out the foundational principles of what, after 9/11, came to be known as the Bush Doctrine: the right to wage war unilaterally in highly moralistic terms.

We are once again at a historic crossroads. An ebbing of U.S. power — this time caused, in part, by military overreach — faces a mobilized Latin America; and, on the eve of regime change at home, with George W. Bush’s neoconservative coalition in ruins after eight years of disastrous rule, would-be foreign policy makers are once again looking south.

Goodbye to All That

"The era of the United States as the dominant influence in Latin America is over," says the Council on Foreign Relations, in a new report filled with sober policy suggestions for ways the U.S. can recoup its waning influence in a region it has long claimed as its own.

Latin America is now mostly governed by left or center-left governments that differ in policy and style — from the populism of Hugo Chávez in Venezuela to the reformism of Luiz Inácio Lula da Silva in Brazil and Michelle Bachelet in Chile. Yet all share a common goal: asserting greater autonomy from the United States.

Latin Americans are now courting investment from China, opening markets in Europe, dissenting from Bush’s War on Terror, stalling the Free Trade Agreement of the Americas, and sidelining the International Monetary Fund which, over the last couple of decades, has served as a stalking horse for Wall Street and the Treasury Department.

And they are electing presidents like Ecuador’s Rafael Correa, who recently announced that his government would not renew the soon-to-expire lease on Manta Air Field, the most prominent U.S. military base in South America. Correa had previously suggested that, if Ecuador could set up its own base in Florida, he would consider extending the lease. When Washington balked, he offered Manta to a Chinese concession, suggesting that the airfield be turned into "China’s gateway to Latin America."

In the past, such cheek would have been taken as a clear violation of the Monroe Doctrine, proclaimed in 1823 by President James Monroe, who declared that Washington would not permit Europe to recolonize any part of the Americas. In 1904, Theodore Roosevelt updated the doctrine to justify a series of Caribbean invasions and occupations. And Presidents Dwight Eisenhower and Ronald Reagan invoked it to validate Cold War CIA-orchestrated coups and other covert operations.

But things have changed. "Latin America is not Washington’s to lose," the Council on Foreign Relations report says, "nor is it Washington’s to save." The Monroe Doctrine, it declares, is "obsolete."

Good news for Latin America, one would think. But the last time someone from the Council on Foreign Relations, which since its founding in 1921 has represented mainstream foreign-policy opinion, declared the Monroe Doctrine defunct, the result was genocide.

Enter the Liberal Establishment

That would be Sol Linowitz who, in 1975, as chair of the Commission on United States-Latin American Relations, said that the Monroe Doctrine was "inappropriate and irrelevant to the changed realities and trends of the future."

The little-remembered Linowitz Commission was made up of respected scholars and businessmen from what was then called the "liberal establishment." It was but one part of a broader attempt by America’s foreign-policy elite to respond to the cascading crises of the 1970s — defeat in Vietnam, rising third-world nationalism, Asian and European competition, skyrocketing energy prices, a falling dollar, the Watergate scandal, and domestic dissent. Confronted with a precipitous collapse of America’s global legitimacy, the Council on Foreign Relations, along with other mainline think tanks like the Brookings Institution and the newly formed Trilateral Commission, offered a series of proposals that might help the U.S. stabilize its authority, while allowing for "a smooth and peaceful evolution of the global system."

There was widespread consensus among the intellectuals and corporate leaders affiliated with these institutions that the kind of anticommunist zeal that had marched the U.S. into the disaster in Vietnam needed to be tamped down, and that "new forms of common management" between Washington, Europe, and Japan had to be worked out. Advocates for a calmer world order came from the same corporate bloc that underwrote the Democratic Party and the Rockefeller-wing of the Republican Party.

They hoped that a normalization of global politics would halt, if not reverse, the erosion of the U.S. economic position. Military de-escalation would free up public revenue for productive investment, while containing inflationary pressures (which scared the bond managers of multinational banks). Improved relations with the Communist bloc would open the USSR, Eastern Europe, and China to trade and investment. There was also general agreement that Washington should stop viewing Third World socialism through the prism of the Cold War conflict with the Soviet Union.

At that moment throughout Latin America, leftists and nationalists were — as they are now — demanding a more equitable distribution of global wealth. Lest radicalization spread, the Trilateral Commission’s executive director Zbigniew Brzezinski, soon to be President Jimmy Carter’s national security advisor, argued that it would be "wise for the United States to make an explicit move to abandon the Monroe Doctrine." The Linowitz Commission agreed and offered a series of recommendations to that effect — including the return of the Panama Canal to Panama and a decrease in U.S. military aid to the region — that would largely define Carter’s Latin American policy.

Exit the Liberal Establishment

Of course, it was not corporate liberalism but rather a resurgent and revanchist militarism from the Right that turned out to offer the most cohesive and, for a time, successful solution to the crises of the 1970s.

Uniting a gathering coalition of old-school law-and-order anticommunists, first generation neoconservatives, and newly empowered evangelicals, the New Right organized an ever metastasizing set of committees, foundations, institutes, and magazines that focused on specific issues — the SALT II nuclear disarmament negotiations, the Panama Canal Treaty, and the proposed MX missile system, as well as U.S. policy in Cuba, South Africa, Rhodesia, Israel, Taiwan, Afghanistan, and Central America. All of them were broadly committed to avenging defeat in Vietnam (and the "stab in the back" by the liberal media and the public at home). They were also intent on restoring righteous purpose to American diplomacy.

As corporate liberals had before them, neoconservative intellectuals now looked to Latin America to hone their ideas. President Ronald Reagan’s ambassador to the UN, Jeane Kirkpatrick, for instance, focused mainly on Latin America in laying out the foundational principles of modern neoconservative thought. She was particularly hard on Linowitz, who, she said, represented the "disinterested internationalist spirit" of "appeasement" — a word back with us again. His report, she insisted, meant "abandoning the strategic perspective which has shaped U.S. policy from the Monroe Doctrine down to the eve of the Carter administration, at the center of which was a conception of the national interest and a belief in the moral legitimacy of its defense."

At first, Brookings, the Council on Foreign Relations, and the Trilateral Commission, as well as the Business Roundtable, founded in 1972 by the crème de la CEO crème, opposed the push to remilitarize American society; but, by the late 1970s, it was clear that "normalization" had failed to solve the global economic crisis. Europe and Japan were not cooperating in stabilizing the dollar, and the economies of Eastern Europe, the USSR, and China were too anemic to absorb sufficient amounts of U.S. capital or serve as profitable trading partners. Throughout the 1970s, financial houses like the Rockefellers’ Chase Manhattan Bank had become engorged with petrodollars deposited by Saudi Arabia, Iran, Venezuela, and other oil-exporting nations. They needed to do something with all that money, yet the U.S. economy remained sluggish, and much of the Third World off limits.

So, after Ronald Reagan’s 1980 presidential victory, mainstream policymakers and intellectuals, many of them self-described liberals, increasingly came to back the Reagan Revolution’s domestic and foreign agenda: gutting the welfare state, ramping up defense spending, opening up the Third World to U.S. capital, and jumpstarting the Cold War.

A decade after the Linowitz Commission proclaimed the Monroe Doctrine no longer viable, Ronald Reagan invoked it to justify his administration’s patronage of murderous anti-communists in Nicaragua, Guatemala, and El Salvador. A few years after Jimmy Carter announced that the U.S. had broken "free of that inordinate fear of communism," Reagan quoted John F. Kennedy saying, "Communist domination in this hemisphere can never be negotiated."

Reagan’s illegal patronage of the Contras — those murderers he hailed as the "moral equivalent of America’s founding fathers" and deployed to destabilize Nicaragua’s Sandinista government — and his administration’s funding of death squads in El Salvador and Guatemala brought together, for the first time, the New Right’s two main constituencies. Neoconservatives provided Reagan’s revival of the imperial presidency with legal and intellectual justification, while the religious Right backed up the new militarism with grassroots energy.

This partnership was first built — just as it has more recently been continued in Iraq — on a mountain of mutilated corpses: 40,000 Nicaraguans and 70,000 Salvadorans killed by U.S. allies; 200,000 Guatemalans, many of them Mayan peasants, victimized in a scorched-earth campaign the UN would rule to be genocidal.

The End of the Neocon Holiday from History

The recent Council on Foreign Relations report on Latin America, arriving as it does in another moment of imperial decline, seems once again to signal a new emerging consensus, one similar in tone to that of the post-Vietnam 1970s. In every dimension other than military, Newsweek editor Fareed Zakaria argues in his new book, The Post-American World, "the distribution of power is shifting, moving away from American dominance." (Never mind that, just five years ago, on the eve of the invasion of Iraq, he was insisting on the exact opposite — that we now lived in a "unipolar world" where America’s position was, and would be, "unprecedented.")

To borrow a phrase from their own lexicon, the neocons’ "holiday from history" is over. The fiasco in Iraq, the fall in the value of the dollar, the rise of India and China as new industrial and commercial powerhouses, and of Russia as an energy superpower, the failure to secure the Middle East, soaring oil and gas prices (as well as skyrocketing prices for other key raw materials and basic foodstuffs), and the consolidation of a prosperous Europe have all brought their dreams of global supremacy crashing down.

Barack Obama is obviously the candidate best positioned to walk the U.S. back from the edge of irrelevance. Though no one hoping for a job in his White House would put it in such defeatist terms, the historic task of the next president will not be to win this president’s Global War on Terror, but to negotiate America’s reentry into a community of nations.

Parag Khanna, an Obama advisor, recently argued that, by maximizing its cultural and technological advantage, the U.S. can, with a little luck, perhaps secure a position as third partner in a new tripartite global order in which Europe and Asia would have equal shares — a distinct echo of the trilateralist position of the 1970s. (Forget those Munich analogies: if the U.S. electorate were more historically literate, Republicans would get better mileage out of branding Obama not Neville Chamberlain but Spain’s Fernando VII or Britain’s Clement Attlee, each of whom presided over his country’s imperial decline.)

So it has to be asked: If Obama wins in November and tries to implement a more rational, less ideologically incandescent deployment of American power — perhaps using Latin America as a staging ground for a new policy — would it once again provoke the kind of nationalist backlash that purged Rockefellerism from the Republican Party, swept Jimmy Carter out of the White House, and armed the death squads in Central America?

Certainly, there are already plenty of feverish conservative think tanks, from the Hudson Institute to the Heritage Foundation, that would double down on Bush’s crusades as a way out of the current mess. But in the 1970s, the New Right was in ascendance; today, it is visibly decomposing. Then, it could lay responsibility for the deep and prolonged crisis that gripped the United States at the feet of the "establishment," while offering solutions — an arms build-up, a renewed push into the Third World, and free-market fundamentalism — that drew much of that establishment into its orbit.

Today, the Right wholly owns the current crisis, along with its most immediate cause, the Iraq War. Even if John McCain were able to squeak out a win in November, he would be the functional equivalent not of Reagan, who embodied a movement on the march, but of Jimmy Carter, trying desperately to hold a fraying coalition together.

The Right’s decay as an intellectual force is nowhere more evident than in the fits it throws in the face of the Left’s — or China’s — advances in Latin America. The self-confident vitality with which Jeane Kirkpatrick used Latin America to skewer the Carter administration has been replaced with the tinny, desperate shrillness of despair. "Who lost Latin America?" asks the Center for Security Policy’s Frank Gaffney — of pretty much everyone he meets. The region, he says, is now a "magnet for Islamist terrorists and a breeding ground for hostile political movements… The key leader is Chávez, the billionaire dictator of Venezuela who has declared a Latino jihad against the United States."

Scare-Quote Diplomacy

But just because the Right is unlikely to unfurl its banner over Latin America again soon doesn’t mean that U.S. hemispheric diplomacy will be demilitarized. After all, it was Bill Clinton, not George W. Bush, who, at the behest of Lockheed Martin in 1997, reversed a Carter administration ban (based on Linowitz report recommendations) on the sale of high-tech weaponry to Latin America. That, in turn, kicked off a reckless and wasteful Southern Cone arms race. And it was Clinton, not Bush, who dramatically increased military aid to the murderous Colombian government and to corporate mercenaries like Blackwater and Dyncorp, further escalating the misguided U.S. "war on drugs" in Latin America.

In fact, a quick comparison between the Linowitz report and the new Council on Foreign Relations study on Latin America provides a sobering way of measuring just how far right the "liberal establishment" has shifted over the last three decades. The Council does admirably advise Washington to normalize relations with Cuba and engage with Venezuela, while downplaying the possibility of "Islamic terrorists" using the area as a staging ground — a longstanding fantasy of the neocons. (Douglas Feith, former Pentagon undersecretary, suggested that, after 9/11, the U.S. hold off invading Afghanistan and instead bomb Paraguay, which has a large Shi’ite community, just to "surprise" the Sunni al-Qaeda.)

Yet, where the Linowitz report provoked the ire of the likes of Jeane Kirkpatrick by writing that the U.S. should not try to "define the limits of ideological diversity for other nations" and that Latin Americans "can and will assess for themselves the merits and disadvantages of the Cuban approach," the Council is much less open-minded. It insists on presenting Venezuela as a problem the U.S. needs to address — even though the government in Caracas is recognized as legitimate by all and is considered an ally, even a close one, by most Latin American countries. Latin Americans may "know what is best for themselves," as the new report concedes, yet Washington still knows better, and so should back "social justice" issues as a means to win Venezuelans and other Latin Americans away from Chávez.

That the Council report regularly places "social justice" between scare quotes suggests that the phrase is used more as a marketing ploy — kind of like "New Coke" — than to signal that U.S. banks and corporations are willing to make substantive concessions to Latin American nationalists. Seven decades ago, Franklin Roosevelt supported the right of Latin American countries to nationalize U.S. interests, including Standard Oil holdings in Bolivia and Mexico, saying it was time for others in the hemisphere to get their "fair share." Three decades ago, the Linowitz Commission recommended the establishment of a "code of conduct" defining the responsibilities of foreign corporations in the region and recognizing the right of governments to nationalize industries and resources.

The Council, in contrast, sneers at Chávez’s far milder efforts to create joint ventures with oil multinationals, while offering nothing but pablum in its place. Its centerpiece recommendation — aimed at cultivating Brazil as a potential anchor of a post-Bush, post-Chávez hemispheric order — urges the abolition of subsidies and tariffs protecting U.S. agro-industry in order to advance a "Biofuel Partnership" with Brazil’s own behemoth agricultural sector. This would be an environmental disaster, pushing large, mechanized plantations ever deeper into the Amazon basin, while doing nothing to generate decent jobs or distribute wealth more fairly.

Dominated by representatives from the finance sector of the U.S. economy, the Council recommends little beyond continuing the failed corporate "free trade" policies of the last twenty years — and, in this case, those scare quotes are justified because what they’re advocating is about as free as corporate "social justice" is just.

An Obama Doctrine?

So far, Barack Obama promises little better. A few weeks ago, he traveled to Miami and gave a major address on Latin America to the Cuban American National Foundation. It was hardly an auspicious venue for a speech that promised to "engage the people of the region with the respect owed to a partner."

Surely, the priorities for humane engagement would have been different had he been addressing not wealthy right-wing Cuban exiles but an audience, say, of the kinds of Latino migrants in Los Angeles who have revitalized the U.S. labor movement, or of Central American families in Postville, Iowa, where immigration and Justice Department authorities recently staged a massive raid on a meatpacking plant, arresting as many as 700 undocumented workers. Obama did call for comprehensive immigration reform and promised to fulfill Franklin Roosevelt’s 68-year-old Four Freedoms agenda, including the social-democratic "freedom from want." Yet he spent much of his speech throwing red meat to his Cuban audience.

Ignoring the not-exactly-radical advice of the Council on Foreign Relations, the candidate pledged to maintain the embargo on Cuba. And then he went further. Sounding a bit like Frank Gaffney, he all but accused the Bush administration of "losing Latin America" and allowing China, Europe, and "demagogues like Hugo Chávez" to step "into the vacuum." He even raised the specter of Iranian influence in the region, pointing out that "just the other day Tehran and Caracas launched a joint bank with their windfall oil profits."

Whatever one’s opinion on Hugo Chávez, any diplomacy that claims to take Latin American opinion seriously has to acknowledge one thing: Most of the region’s leaders not only don’t see him as a "problem," but have joined him on major economic and political initiatives like the Bank of the South, an alternative to the International Monetary Fund, and the Union of South American Nations, modeled on the European Union and established just two weeks ago. And any U.S. president who is sincere in wanting to help Latin Americans liberate themselves from "want" will have to work with the Latin American left — in all its varieties.

But more ominous than Obama’s posturing on Venezuela is his position on Colombia. Critics have long pointed out that the billions of dollars in military aid provided to the Colombian security forces to defeat the FARC insurgency and curtail cocaine production would discourage a negotiated end to the civil war in that country and potentially provoke its escalation into neighboring Andean lands. That’s exactly what happened last March, when Colombia’s president Alvaro Uribe ordered the bombing of a rebel camp located in Ecuador (possibly with U.S. logistical support supplied from Manta Air Force Base, which gives you an idea of why Correa wants to give it to China). To justify the raid, Uribe explicitly invoked the Bush Doctrine’s right of preemptive, unilateral action. In response, Ecuador and Venezuela began to mobilize troops along their border with Colombia, bringing the region to the precipice of war.

Most interestingly, in that conflict, an overwhelming majority of Latin American and Caribbean countries sided with Venezuela and Ecuador, categorically condemning the Colombian raid and reaffirming the sovereignty of individual nations recognized by Franklin Roosevelt long ago. Not Obama, however. He essentially endorsed the Bush administration’s drive to transform Colombia’s relations with its Andean neighbors into the one Israel has with most of the Middle East. In his Miami speech, he swore that he would "support Colombia’s right to strike terrorists who seek safe-havens across its borders."

Equally troublesome has been Obama’s endorsement of the controversial Merida Initiative, which human rights groups like Amnesty International have condemned as an application of the "Colombian solution" to Mexico and Central America, providing their militaries and police with a massive infusion of money to combat drugs and gangs. Crime is indeed a serious problem in these countries, and deserves considered attention. It’s chilling, however, to have Colombia — where death squads have now infiltrated every level of government, and where union and other political activists are executed on a regular basis — held up as a model for other parts of Latin America.

Obama, however, not only supports the initiative, but wants to expand it beyond Mexico and Central America. "We must press further south as well," he said in Miami.

It seems that once again, as in the 1970s, reports of the death of the Monroe Doctrine are greatly exaggerated.

Greg Grandin teaches history at New York University. He is the author of Empire’s Workshop: Latin America, the United States, and the Rise of the New Imperialism and The Last Colonial Massacre: Latin America in the Cold War.

Copyright 2008 Greg Grandin

The Unholy Trinity

The world is made up, as Captain Segura in Graham Greene’s 1958 novel Our Man in Havana put it, of two classes: the torturable and the untorturable. "There are people," Segura explained, "who expect to be tortured and others who would be outraged by the idea."

Then — so Greene thought — Catholics, particularly Latin American Catholics, were more torturable than Protestants. Now, of course, Muslims hold that distinction, victims of a globalized network of offshore and outsourced imprisonment coordinated by Washington and knitted together by secret flights, concentration camps, and black-site detention centers. The CIA’s deployment of Orwellian "Special Removal Units" to kidnap terror suspects in Europe, Canada, the Middle East, and elsewhere, and the whisking of these "ghost prisoners" off to Third World countries to be tortured, go, today, by the term "extraordinary rendition," a hauntingly apt phrase. "To render" means not just to hand over, but to extract the essence of a thing, as well as to hand out a verdict and "give in return or retribution" — good descriptions of what happens during torture sessions.

In the decades after Greene wrote Our Man in Havana, Latin Americans coined an equally resonant word to describe the terror that had come to reign over most of the continent. Throughout the second half of the Cold War, Washington’s anti-communist allies killed more than 300,000 civilians, many of whom were simply desaparecido — "disappeared." The expression was already well known in Latin America when, on accepting his 1982 Nobel Prize for Literature in Sweden, Colombian novelist Gabriel García Márquez reported that the region’s "disappeared number nearly one hundred and twenty thousand, which is as if suddenly no one could account for all the inhabitants of Uppsala."

When Latin Americans used the word as a verb, they usually did so in a way considered grammatically incorrect — in the transitive form and often in the passive voice, as in "she was disappeared." The implied (but absent) actor/subject signaled that everybody knew the government was responsible, even while investing that government with unspeakable, omnipotent power. The disappeared left behind families and friends who spent their energies dealing with labyrinthine bureaucracies, only to be met with silence or told that their missing relative probably went to Cuba, joined the guerrillas, or ran away with a lover. The victims were often not the most politically active, but the most popular, and were generally chosen to ensure that their sudden absence would generate a chilling ripple-effect.

An Unholy Trinity

Like rendition, disappearances can’t be carried out without a synchronized, sophisticated, and increasingly transnational infrastructure, which, back in the 1960s and 1970s, the United States was instrumental in creating. In fact, it was in Latin America that the CIA and U.S. military intelligence agents, working closely with local allies, first helped put into place the unholy trinity of government-sponsored terrorism now on display in Iraq and elsewhere: death squads, disappearances, and torture.

Death Squads: Clandestine paramilitary units, nominally independent from established security agencies yet able to draw on the intelligence and logistical capabilities of those agencies, are the building blocks for any effective system of state terror. In Latin America, Washington supported the assassination of suspected Leftists at least as early as 1954, when the CIA successfully carried out a coup in Guatemala, which ousted a democratically elected president. But its first sustained sponsorship of death squads started in 1962 in Colombia, a country which then vied with Vietnam for Washington’s attention.

Colombia had just ended a brutal 10-year civil war, and its newly consolidated political leadership, facing a still unruly peasantry, turned to the U.S. for help. That year, the Kennedy White House sent General William Yarborough, later better known for being the "Father of the Green Berets" (as well as for directing domestic military surveillance of prominent civil-rights activists, including Martin Luther King Jr.). Yarborough advised the Colombian government to set up an irregular unit to "execute paramilitary, sabotage and/or terrorist activities against known communist proponents" — as good a description of a death squad as any.

As historian Michael McClintock puts it in his indispensable book Instruments of Statecraft, Yarborough left behind a "virtual blueprint" for creating military-directed death squads. This was, thanks to U.S. aid and training, immediately implemented. The use of such death squads would become part of what the counterinsurgency theorists of the era liked to call "counter-terror" — a concept hard to define since it so closely mirrored the practices it sought to contest.

Throughout the 1960s, Latin America and Southeast Asia functioned as the two primary laboratories for U.S. counterinsurgents, who moved back and forth between the regions, applying insights and fine-tuning tactics. By the early 1960s, death-squad executions were a standard feature of U.S. counterinsurgency strategy in Vietnam, soon to be consolidated into the infamous Phoenix Program, which between 1968 and 1972 "neutralized" more than 80,000 Vietnamese — 26,369 of whom were "permanently eliminated."

As in Latin America, so too in Vietnam, the point of death squads was not just to eliminate those thought to be working with the enemy, but to keep potential rebel sympathizers in a state of fear and anxiety. To do so, the U.S. Information Service in Saigon provided thousands of copies of a flyer printed with a ghostly looking eye. The "terror squads" then deposited that eye on the corpses of those they murdered or pinned it "on the doors of houses suspected of occasionally harboring Viet Cong agents." The technique was called "phrasing the threat" — a way to generate a word-of-mouth terror buzz.

In Guatemala, such a tactic started up at roughly the same time. There, a "white hand" was left on the body of a victim or the door of a potential one.

Continue reading this post at TomDispatch.com.


A Republican Party on the ropes, bloodied by a mid-second-term scandal; a resurrected Democratic opposition, sure it can capitalize on public outrage to prove that it is still, in the American heart of hearts, the majority party.

But before House Democrats start divvying up committee assignments and convening special investigations, they should consider that they’ve been here before, and things didn’t turn out exactly the way they hoped.

It was twenty years ago this November 3rd — exactly one day before the Democrats regained control of the Senate after six years in the minority — that the Lebanese magazine Ash-Shiraa reported on the Reagan administration’s secret, high-tech missile sale to Ayatollah Khomeini’s Iran, which violated an arms embargo against that country and contradicted President Ronald Reagan’s personal pledge never to deal with governments that sponsored terrorism.

Democrats couldn’t believe their luck. After years of banging their heads on Reagan’s popularity and failing to derail his legislative agenda, they had not only taken back the Senate, but follow-up investigations soon uncovered a scandal of epic proportions, arguably the most consequential in American history, one that seemed sure to disgrace every single constituency that had fueled the upstart conservative movement. The Reagan Revolution, it appeared, had finally been thrown into reverse.

The New York Times reported that the National Security Council was running an extensive “foreign policy initiative largely in private hands,” made up of rogue intelligence agents, mercenaries, neoconservative intellectuals, Arab sheiks, drug runners, anticommunist businessmen, even the Moonies. Profits from the missile sale to Iran, brokered by a National Security Council staffer named Oliver North, went to the Nicaraguan Contras, breaking yet another law, this one banning military aid to the anti-Sandinista guerrillas.

The ultimate goal of this shadow government, said a congressional investigation, was to create a “worldwide private covert operation organization” whose “income-generating capacity came almost entirely from its access to U.S. government resources and connections” — either from trading arms to Iran or from contributions requested by administration officials. Joseph Coors and H. Ross Perot kicked in, as did the Sultan of Brunei, whose $10,000,000 gift, solicited by Assistant Secretary of State Elliott Abrams, went missing after it was deposited into the wrong Swiss bank account.

The Democrats, now the majority in both congressional chambers, gleefully convened multiple inquiries into the scandal. From May to August 1987, TV viewers tuned in to congressional hearings on the affair. They got a rare glimpse into the cabalistic world of spooks, bagmen, and mercenaries, with their code words, encryption machines, offshore holding companies, unregistered fleets of boats and planes, and furtive cash transfers. Fawn Hall, Oliver North’s document-shredding secretary, told of smuggling evidence out of the Old Executive Office Building in her boots, and lectured Representative Thomas Foley that “sometimes you have to go above the written law.”

Foreign enemies were not the only targets set in North’s crosshairs, as later investigations described what was in effect a covert operation run on domestic soil, with the White House mobilizing conservative grassroots organizations to plant disinformation in the press and harass legislators and reporters who opposed or criticized President Reagan’s Contra policy.

Reagan’s poll numbers plummeted and talk of impeachment was rampant. Democrats thought they had found in Iran-Contra a sequel to Watergate, another tutorial about the imperial presidency that would enable them to consolidate the power Congress had assumed over foreign policy in the 1970s.

But just a year after the hearings, Iran-Contra was a dead issue. When Congress released its final report on the matter in November 1987, Reagan breezily dismissed it. “They labored,” he said, “and brought forth a mouse.” A year later, Vice President George H.W. Bush was elected president, despite being implicated in the scandal.

Ollie’s Song

How could the Democrats have failed to inflict serious damage on an administration that had sold sophisticated weaponry to a sworn enemy of the United States? How could they have botched the job of transforming a conspiracy of self-righteous renegades, many of whom not only admitted their crimes but unrepentantly declared themselves to be above the law, into a defense of constitutional checks and balances in the realm of foreign affairs?

One reason is that the congressional hearings they called backfired on them. In the early months of those hearings, Congress methodically gathered damning testimony and documentary evidence of what many believed amounted to treason by high-level administration officials, if not the President himself.

But then in marched Oliver North — the crisp Marine, with his rock-hard jaw and chest full of medals. Ronald Reagan may have once been an actor, but it was North’s dramatic chops that rescued his presidency.

For six days, the Marine fended off the questions of politicians and their lawyers. His answers were contradictory and self-serving, but his performance was virtuoso. Many viewers viscerally connected with the loyalty and courage so artfully on display. “If the commander in chief tells this lieutenant colonel to go stand in the corner and stand on his head,” North said, “I will do so.” Never mind that, as Senator Daniel Inouye, a maimed World War II veteran, pointed out, the Uniform Code of Military Justice stipulates that only lawful orders are to be followed. Ollie-mania swept the heartland and Hollywood. Even liberal TV producer Norman Lear admitted he couldn’t “take [his] eyes off” the colonel.

North’s luster may not have rubbed off on Reagan, but his standoff with Congress allowed the president’s defenders to take control of the storyline, reducing the scandal’s cacophony to the simple chords of patriotism and anticommunism. Conservative activist Richard Viguerie compared the hearings to a song: “Liberals are listening to the words, but the guy in the street hears the music. The music is about men and women who are prepared to die for their country.”

At the heart of the Democrats’ disaster was their unwillingness ever to question North’s militarism or Reagan’s support for the Contras, whose human-rights atrocities were well documented. Rather than attacking Reagan’s restoration of anticommunism as the guiding principle of U.S. policy, they focused on procedure — such as the White House’s failure to oversee the National Security Council — or on proving that top officials had prior knowledge of the crimes.

Much as Hillary Clinton and John Kerry today focus on this administration’s “incompetence” and “mishandling” of the Iraq War, Democrats twenty years ago were scathing in their descriptions of an administration steeped in “confusion, secrecy and deception,” and of the White House’s “pervasive dishonesty” and “disarray.” But then, as now, such criticisms seemed like mere cavils when the security of the United States — of the “Free World” — was at stake.

In 1988, when Democratic presidential candidate Michael Dukakis, in his first debate with Vice President Bush, brought up the scandal, Bush responded that he would take “all the blame” for Iran-Contra if he got “half the credit for all the good things that have happened in world peace since Ronald Reagan and I took over.” Dukakis quietly took the deal, never again raising the issue. So, when Ollie North jibed that Libya’s Muammar Qaddafi endorsed Dukakis, there was little left for the Massachusetts governor to do but don a helmet, jump in a tank, and look famously foolish.

Along with political timidity, there was another factor that led to the Democratic collapse on Iran-Contra — careerism. Far more so than today, Washington was then a clubby, small, inbred world. One of the reasons why the anger over George H.W. Bush’s Christmas Eve 1992 pardon of six indicted Iran-Contra figures was so short-lived is that the move was quietly blessed by ranking Congressional Democrats, including Wisconsin Representative Les Aspin, who huffed and puffed but let the matter die. Aspin, who had supported aid to the Contras, was later tapped by Bill Clinton to be Secretary of Defense, easily winning confirmation with significant Republican support.

Careerism naturally leads to back-room deals. There were rumors that Democratic Speaker of the House Tip O’Neill, who unlike Aspin was an outspoken critic of Contra funding, toned down his opposition as a quid pro quo to secure federal funds for Boston’s Big Dig construction project — another disaster from the 1980s that we are still living with.

Unleashing the Imperial Presidency

But if the Democrats failed to gain political traction with the scandal, or wring a parable out of it, others did far better. Dick Cheney today points to Iran-Contra not as a cautionary tale against unchecked executive power but as a blueprint for how to obtain it.

It turns out that it was Dick Cheney’s current chief of staff, David Addington — the man the press calls “Cheney’s Cheney” for his defense of unchecked presidential power in matters of foreign policy — who, as a counsel to the Republicans serving on the congressional Iran-Contra committee, wrote the controversial 1987 “Minority Report” on the scandal.

At the time, the report, which condemned not the National Security Council for its secret dealings but Congress for its “legislative hostage taking,” was considered out of the mainstream. Today, it reads like a run-of-the-mill Justice Department memo outlining the legal basis for any of the Bush Administration’s wartime power grabs. It was this report that Cheney referenced when asked last December about his role in strengthening the executive branch. The report, he said, was “very good in laying out a robust view of the President’s prerogatives” to wage war and defend national security.

Cheney and Addington are not the only veterans of the scandal to have found a home in the current White House. Other Iran-Contra notables who have resurfaced in recent years include Elliott Abrams, John Bolton, Otto Reich, John Negroponte, John Poindexter, neoconservative Michael Ledeen, and even Manucher Ghorbanifar, the Iranian arms dealer who brokered one of the first missile sales to the Khomeini regime.

This recycling of Iran-Contra personnel to fight the War on Terror points to the most important reason it has been so difficult to transform the scandal into a parable: Iran-Contra wasn’t just a crime and a cover-up — as Watergate was — or a misdemeanor like Monica-Gate. It was rather the first battle in the neoconservative campaign against Congress and in defense of the imperial presidency.

Iran-Contra field-tested many of the tactics used by the Bush administration to build support for the invasion of Iraq by manipulating intelligence, spinning public opinion, and riding roughshod over experts in the CIA and the State Department who counseled restraint. While the original Iran-Contra battle might be termed a draw — the eleven convicted conspirators won on appeal or were pardoned by George H.W. Bush — the backlash has become the establishment.

That ’80s Show

Today, with that establishment shackled to the most ruinous war in recent U.S. history, the Republicans, taking a page out of Oliver North’s songbook, have decided that the best defense is to go on the offensive and turn the upcoming midterm vote into a debate on Iraq and national security. Until the eve of the recent Foley IM-sex scandal, the strategy seemed like it just might be working once again. The Democrats were losing momentum in the run-up to next month’s elections, unanimously consenting to a distended military budget and watching silently as Republicans, with significant Democratic support, revoked habeas corpus and gave the President the right to torture at will.

Foley-gate, along with a cascade of other scandals, controversies, and bad war news, may indeed now give the Democrats the House, and perhaps even the Senate. But already there are reports that, if they do take over Congress, their agenda will have a remarkably 1986-ish look to it: hearings and calls for more congressional “oversight” of foreign policy that leave uncontested the crusading premises driving the President’s extremist foreign policy.

If the Democratic Party wants to halt, or even reverse, its long decline and avoid yet again snatching defeat from the jaws of victory, it will need to do more than investigate the six-year reign of corruption, incompetence, and arrogance presided over by Cheney and company. Progressive politicians who protest the war in Iraq will have to do more than criticize the way it has been fought or demand to have more of a say in how it is waged. They must challenge the militarism that justified the invasion and that has made war the option of first resort for too many of our foreign-policy makers. Otherwise, no matter how many tanks they drive or veterans they nominate — or congressional seats they pick up — the Democrats will always be dancing to Ollie’s tune.

Greg Grandin is the author of the other book endorsed by Hugo Chávez on his recent New York visit: Empire’s Workshop: Latin America, the United States, and the Rise of the New Imperialism (Metropolitan).

Copyright 2006 Greg Grandin
