Teachers in red-state America are hard at work teaching us all a lesson. The American mythos has always rested on a belief that this country was born out of a kind of immaculate conception, that the New World came into being and has forever after been preserved as a land without the class hierarchies and conflicts that so disfigured Europe.
The strikes, rallies, and walkouts of public school teachers in West Virginia, Oklahoma, Kentucky, soon perhaps Arizona, and elsewhere are a stunning reminder that class has always mattered far more in our public and private lives than our origin story would allow. Insurgent teachers are instructing us all about a tale of denial for which we’ve paid a heavy price.
Professionals or Proletarians?
Are teachers professionals, proletarians, or both? One symptom of our pathological denial of class realities is that we are accustomed to thinking of teachers as “middle class.” Certainly, their professional bona fides should entitle them to that social station. After all, middle class is the part of the social geography that we imagine as the aspirational homing grounds for good citizens of every sort, a place so all-embracing that it effaces signs of rank, order, and power. The middle class is that class so universal that it’s really no class at all.
School teachers, however, have always been working-class stiffs. For a long time, they were also mainly women who would have instantly recognized the insecurities, struggles to get by, and low public esteem that plague today’s embattled teachers.
The women educators of yesteryear may have thought of their work as a profession or a “calling,” subject to its own code of ethics and standards of excellence, as well as an intellectual pursuit and social service. But whatever they thought of themselves, they had no way of making public authorities heed such aspirations (and the authorities didn’t). As “women’s work,” school teaching done by “school marms” occupied an inherently low position in a putatively class-free America.
What finally lent weight to the incipient professional ideals of public school teachers was, ironically, their unionization; that is, their self-identification as a constituent part of the working class. The struggle to create teacher unions was one of the less heralded breakthroughs of the 1960s and early 1970s. A risky undertaking, involving much self-sacrifice and militancy, it was met with belligerent resistance by political elites everywhere. When victory finally came, it led to considerable improvements in the material conditions of a chronically underpaid part of the labor force. Perhaps no less important, for the first time it institutionalized the long-held desire of teachers for some respect, a desire embodied in tenure systems and other forms of professional recognition and protection.
Those hard-won teachers’ unions also paved the way for the large-scale organization of government workers of every sort. That was yet another world at odds with itself: largely white collar and well educated, with a powerful sense of professionalism, yet long mistreated, badly underpaid, and remarkably powerless, as if its denizens were… well, real-life proletarians (which, of course, was exactly what they were).
Rebellion in the Land of Acquiescence and Austerity
Despite their history of working-class rebelliousness, the sight of teachers striking (and sometimes even breaking the law to do so) still has a remarkable ability to shock the rest of us. Somehow, it just doesn’t fit the image, still so strong, of the mild-mannered, middle-class, law-abiding professionals that public school teachers are supposed to be.
What drives that shock even deeper is where all this uproar is happening. After all, for decades those “red states” have been the lands of acquiescence to the rule of big money and its political enablers. The state of Oklahoma, for example, had a legislature so craven, so slavishly in the service of the Koch brothers and the oil industry, that it prohibited the people’s representatives by law from passing new taxes with anything but a legislative supermajority. (A simple majority was, of course, perfectly sufficient when it came to cutting taxes.)
Arizona, typically of these states, has had a “right-to-work” law since 1947 to fend off attempts to organize workers. Such laws are, in fact, a grotesque misnomer. Rather than guaranteeing employment, they ban unions from negotiating contracts requiring that all workers who benefit from the contract become members of the union and contribute dues to cover the costs of their representation. In all these states, teachers (along with other public employees) are prohibited or severely limited by law from striking.
Such concerted and contagious insurgency in the homelands of the bended knee was unimaginable… until, of course, it happened. Both acquiescence and the current explosive wave of resistance from teachers were the wages of austerity. Those particular Republican-run states were hardly the only ones to cut social services to the bone while muscling up on giveaways to corporate powerbrokers. (Plenty of Democratic-run state governments did the same.) But the abysmal conditions of public schools and the people who work in them in those states have made them the poster children for an age of austerity that’s lasted decades.
Oklahoma, for instance, cut funding per student by 30% over the past 10 years and led the nation when it came to education cutbacks since the 2008 recession. Meanwhile, Arizona has spent less per student than any other state. And that’s just to start down a list of red-state austerity measures in education. The nitty-gritty results of such slash-and-burn tactics: classes with outdated textbooks, antiquated computers (if any at all), schoolhouses without heat, and sometimes even a four-day version of the usual five-day school week.
West Virginia’s teachers, the first to go out on strike, averaged salaries of $45,240 in 2016, which ranked them 47th in the nation in teacher pay. At $41,000, Oklahoma’s teachers fare even worse. Arizona’s teachers, now threatening to join the strike lines, rank 43rd, while Kentucky’s do only a bit better at $52,000. At some point — always impossible to predict no matter how inevitable it may seem in hindsight — enough proved enough.
Austerity is a politics of class overlordship, or (as we tend to say these days) the dominion of the 1%. It entails, however, far more than just the starving of the public sector, especially education. Those teachers’ salaries and the grim conditions of the deprived schools that go with them are just the budgetary expression of a deeper process of ruthless economic underdevelopment and cultural cruelty.
After all, over the last generation, the deindustrialization of America has paid handsome dividends to financiers, merger and acquisition speculators, junk bond traders, and corporations fleeing a unionized work force for the sweated labor of the global South. In the process, deindustrialization ravaged the economic and social landscape of working-class communities (including those of red-state teachers), turned whole cities into ghost towns, left millions on the down escalator of social mobility, and made opioids the dietary staple of the country’s rural and urban hinterlands.
In the process, deindustrialization dried up sources of industry-based tax revenues which had once helped maintain a modicum of social services, including ones as basic as public education. Tax givebacks, subsidies, or exemptions for the business world grew lush as roads, bridges, public transport, health care, and classrooms deteriorated.
Blaming the Victims
Scapegoats for this unfolding disaster were rounded up — the usual suspects, of course: the inherent laziness of the desperately poor and immigrants, all living off the public weal; liberal sentimentalists manning the welfare state; greedy unionized workers undermining American competitiveness; and above all, the racially disfavored.
Oh yes, and there was one extra, far more surprising miscreant in that line-up: those otherwise quintessentially respectable, law-abiding professionals teaching our kids. If those children failed to measure up, if they couldn’t read or write or do math, if they were scientific illiterates, if they grew up black or “undocumented” distrusting official authority, if they dropped out or were drugged out, if they seemed to exhibit an all-sided dysfunction and ill-discipline, it had to be the fault of their teachers. After all, they had cushy jobs, went home at three, had their summers off, and enjoyed immunity from public oversight thanks to their all-too-powerful unions.
Acquiescence and austerity breed cultural decline, a telling sign of which has been the blaming of teachers for a profound, many-sided social breakdown of which they bore the brunt but were hardly the cause. A country undergoing systemic underdevelopment like the United States can’t provide decent housing or health care, a non-toxic environment or reasonable child care, color-blind justice or well-equipped schoolhouses, much less rewarding work. The classroom inherits all those deficits.
Millions of children arrive at school burdened by the costs of secular decline before they ever enter their first class. Teachers try to cope, often heroically, but it’s a losing battle and they get stigmatized for the defeat. It matters not at all that many of them, like those staffing the school systems of West Virginia or Oklahoma, spend innumerable hours beyond the “normal” school day prepping and inventing ways to treat the wounds of social meanness. They even draw on their own spare resources to make up for yawning gaps in books, computers, paper (and not just notebook paper, but toilet paper) that state and local governments have refused to provide funds for.
In those children and those schools can be seen a vision of our society’s future and clearly it doesn’t work. Like so much else about American life of late, this is a world of “winners” and “losers” — and the kids, as well as the teachers, have been on the wrong side of that equation for far too long now.
How convenient it is for the powers-that-be to depict the striking teachers as the problem, as the “losers,” while whittling away at their salaries, supplies, tenure arrangements, and other union protections (when they’re fortunate enough to even have unions), lengthening teaching hours, reducing vital prep periods, and subjecting them to the discipline of teaching to the test. Just to make ends meet, teachers in those red states often have to moonlight as waitresses or car-service drivers. In a word, until the recent strikes and walk-outs, they had been turned into powerless rather than empowered proletarians.
If Not Now, When?
Punishing and demoralizing as this regime has been, the teachers stood up. Though the urge to write “finally stood up” is there, no one should underestimate the courage and desperation it takes to do just that. Moreover, this moment of resistance to an American world of austerity overseen by plutocrats is not as surprising as it might seem.
We live in the era of both Donald Trump and Bernie Sanders. In their starkly different ways each of them is symptomatic of our moment — in Trump’s case of a pathological condition, in Sanders’s of the possibility of recovery from the disease of acquiescence and austerity. In both, you can see the established order losing its grip. Even before the Sanders campaign, there were signs that the winds were shifting, most dramatically in the Occupy Wall Street uprising (however short-lived that was). Today, thanks in part to the Sanders phenomenon, millennials who were especially drawn to the Vermont senator make up the most pro-union part of the general population.
Atmospheric change of this sort was abetted by elements closer to the ground. Irate teachers in the red states were generally either not in unions at all or only in union-like institutions with little power or influence. So they had to rely on themselves to mold a fighting force, an act of social creativity which happens rarely. When it does, however, it’s both captivating and inspiring, as the West Virginia uprising clearly proved to be in a surprising number of other red states.
Class matters, as does its history. West Virginia wasn’t the only place where striking or protesting teachers entered the fray well aware and proud of their state’s long history of working-class resistance to the predatory behavior of employers. In the case of West Virginia, the predators were the coal barons. Many of the strikers came from families in which memories of the mine wars were still alive.
Kentucky, most memorably “bloody Harlan County,” where strikes, bombings, and other forms of civil war between mine owners and workers went on for nearly a decade in the 1930s (requiring multiple interventions by state and federal troops), can say the same. Oklahoma, even when it was still a territory, had a vibrant populist movement and later a militant labor movement that included robust representation from the Industrial Workers of the World (the legendary “Wobblies”), a tradition of resistance that flared up again during the Great Depression.
Arizona was once similarly home to a militant labor tradition in its metal mining industries. Its grim history was most infamously acted out in Bisbee, Arizona, in 1917. At that time, copper miners striking against Phelps Dodge and other mining companies were rounded up by deputized vigilantes, hauled out to the New Mexican desert in fetid railroad boxcars, and left there to fend for themselves. Those mine wars against Phelps Dodge and other corporate goliaths continued well into the 1980s.
Memories like these helped stoke the will to resist and to envision a world beyond acquiescence and austerity. Under normal circumstances, to be proletarian is to be without power. Before capital is an economic category, it’s a political one. If you have it, you’re obviously so much freer to do as you please; if you don’t, you’re dependent on those who do. Hiding in plain sight, however, is a contrary fact: without the collective work of those ostensibly powerless workers, nothing moves.
This is emphatically the case with skilled workers, which, after all, is what teachers are. Discovering this “fact” and acting on it requires a leap of moral imagination. That this happened to the beleaguered teachers of so many red states is reflected in the esprit de corps that numerous accounts of these rebellions have reported, including the likening of the strikes to an “Arab Spring for teachers.”
And keep in mind that many other parts of the modern labor force suffer from precarious conditions not so dissimilar from those of the public school teachers, including highly skilled “professionals” like computer techies, college teachers, journalists, and even growing numbers of engineers. So the recent strikes may portend similar recognitions of latent power in equally improbable zones where professionals are undergoing a process of proletarianization.
An imaginative leap of the sort those teachers have taken bears other fruit that nourishes victory. Instead of depicting their struggles as confined to their own “profession,” for instance, the teachers today are fashioning their movement to echo broader desires. In Oklahoma and West Virginia, for example, they have insisted on improvements not just in their own working lives, but in those of all school staff members. Oklahoma teachers refused to go back to school even after the legislature granted them a raise, insisting that the state adequately fund the education system as well. And everywhere these insurgencies have deliberately made common cause with the whole community that uses the schools — parents and students alike — while repeatedly expressing the desire that children not be sacrificed on the altar of austerity.
Nothing could be more at odds with the emotional logic of austerity and acquiescence, with a society that has learned to salute “winners” and give the back of the hand to “losers,” than the widening social sympathy that has been sweeping through the schoolhouses of red-state America.
Class dismissed? It doesn’t look like it.
Arising from the shadows of the American repressed, Bernie Sanders and Donald Trump have been sending chills through the corridors of establishment power. Who would have thunk it? Two men, both outliers, though in starkly different ways, seem to be leading rebellions against the masters of our fate in both parties; this, after decades in which even imagining such a possibility would have been seen as naïve at best, delusional at worst. Their larger-than-life presence on the national stage may be the most improbable political development of the last American half-century. It suggests that we are entering a new phase in our public life.
A year ago, in my book The Age of Acquiescence, I attempted to resolve a mystery hinted at in its subtitle: “The rise and fall of American resistance to organized wealth and power.” Simply stated, that mystery was: Why do people rebel at certain moments and acquiesce in others?
Resisting all the hurts, insults, threats to material well-being, exclusions, degradations, systematic inequalities, overlordship, indignities, and powerlessness that are the essence of everyday life for millions would seem natural enough, even inescapable, if not inevitable. Why put up with all that?
Historically speaking, however, the impulse to give in has proven no less natural. After all, to resist is often to risk yourself, your means of livelihood, and your way of life. To rise up means to silence those intimidating internal voices warning that the overlords have the right to rule by virtue of their wisdom, wealth, and everything that immemorial custom decrees. Fear naturally closes in.
In our context, then, why have Americans at certain historical moments shown a striking ability to rise up, and at others to submit?
To answer that question, I explored those years of the first Gilded Age in the nineteenth century when millions of Americans took to the streets to protest, often in the face of the armed might of the state, and the period in the latter part of the twentieth century and the first years of this one when the label “the age of acquiescence” seemed eminently reasonable — until, in 2016, it suddenly didn’t.
So consider this essay a postscript to that work, my perhaps belated realization that the age of acquiescence has indeed come to an end. Millions are now, of course, feeling the Bern and cheering The Donald. Maybe I should have paid more attention to the first signs of what was to come as I was finishing my book: the Tea Party on the right, and on the left Occupy Wall Street, strikes by low-wage workers, minimum and living wage movements, electoral victories for urban progressives, a surge of environmental activism, and the eruption of the Black Lives Matter movement just on the eve of publication.
But when you live for so long in the shade of acquiescence where hope goes to die or at least grows sickly, you miss such things. After all, if history has a logic, it can remain so deeply hidden as to be indecipherable… until it bites. So, for example, if someone had X-rayed American society in 1932, in the depth of the Great Depression, that image would have revealed a body politic overrun with despair, cynicism, fatalism, and fear — in a word, acquiescence, a mood that had shadowed the land since “Black Tuesday” and the collapse of the stock market in 1929.
Yet that same X-ray taken in 1934, just two years later, would have revealed a firestorm of mass strikes, general strikes, sit-down strikes, rent strikes, seizures of shuttered coal mines and utilities by people who were cold and lightless, marches of the unemployed, and a general urge to unseat the ancien régime; in a word, rebellion. In this way, the equilibrium of a society can shift phases in the blink of an eye and without apparent warning (although in hindsight historians and others will explore all the reasons everybody should have seen it coming).
Liberalism vs. Liberalism
Anticipated or not, a new age of rebellion has begun, one that threatens the status quo from the left and the right. Perhaps its most shocking aspect: people are up in arms against liberalism.
That makes no sense, right? How can it, when come November the queen of liberalism will face off against the billionaire standard-bearer of Republicanism? In the end, the same old same old, yes? Liberal vs. conservative.
Well, not really. If you think of Hillary as the “limousine liberal” of this election season and The Donald as the right-wing “populist in pinstripes,” and consider how each of them shimmied their way to the top of the heap and who they had to fend off to get there, a different picture emerges. Clinton inherits the mantle of a liberalism that has hollowed out the American economy and metastasized the national security state. It has confined the remnants of any genuine egalitarianism to the attic of the Democratic Party so as to protect the vested interests of the oligarchy that runs things. That elite has no quarrel with racial and gender equality as long as they don’t damage the bottom line, which is after all the defining characteristic of the limousine liberalism Hillary champions. Trump channels the hostility generated by that neoliberal indifference to the well-being of working people and its scarcely concealed cultural contempt for heartland America into a racially inflected anti-establishmentarianism. Meanwhile, Bernie Sanders targets Clintonian liberalism from the other shore. Liberalism is, in other words, besieged.
The Sixties Take on Liberalism
How odd! For decades “progressives” have found themselves defending the achievements of liberal reform from the pitiless assault of an ascendant conservatism. It’s hard to remember that the liberal vs. conservative equation didn’t always apply (and so may not again).
Go back half a century to the 1960s, however, and the battlefield seems not dissimilar to today’s terrain. That was a period when the Vietnam antiwar movement indicted liberalism for its imperialism in the name of democracy, while the civil rights and black power movements called it out for its political alliance with segregationists in the South.
In those years, the New Left set up outposts in urban badlands where liberalism’s boast about the U.S. being an “affluent society” seemed like a cruel joke. Students occupied campus buildings to say no to the bureaucratization of higher education and the university’s servitude to another liberal offspring, the military-industrial complex. Women severed the knot tying the liberal ideal of the nuclear family to its gendered hierarchy. The counterculture exhibited its contempt for liberalism’s sense of propriety in a thousand ways. No hairstyle conventions, marriage contracts, sexual inhibitions, career ambitions, religious orthodoxies, clothing protocols, racial taboos, or chemical prohibitions escaped unscathed.
Liberalism adjusted, however. It has since taken credit for most of the reforms associated with that time. Civil rights laws, the war on poverty (including Medicare and Medicaid), women’s rights, affirmative action, and the erasure of cultural discrimination are now a de rigueur part of the CVs of Democratic presidents and the party’s top politicians, those running the mainstream media, the chairmen of leading liberal foundations, Ivy League college presidents, high-end Protestant theologians and clerics, and so many others who proudly display the banner of liberalism. And they do deserve some of the credit. They may have genuinely felt that “Bern” of yesteryear, the one crying out for equal rights before the law.
More importantly, those liberal elites were wise enough or malleable enough, or both, to surf the waves of rebellion of that time. Wisdom and flexibility, however, are only part of the answer to this riddle: Why did mid-twentieth century liberalism manage to reform itself instead of cracking up under the pressure of that sixties moment? The deeper explanation may be that the uprisings of those years assaulted liberalism — but largely on behalf of liberalism. Explicitly at times, as in the Port Huron Statement, that founding document of the ur-New Left group, Students for a Democratic Society, at other times by implication, the rebellions of that moment demanded that the liberal order live up to its own sacred credo of liberty, equality, and the pursuit of happiness.
The demand to open the system up became the heart and soul of the next phase of liberalism, of the urge to empower the free individual. Today, we might recognize this as the classic Clintonista desire to let all-comers join “the race to the top.”
Looking back, it’s been customary to treat the sixties as an era of youth rebellion. While it was more than that, it could certainly be understood, in part, as an American version of fathers and sons (not to speak of mothers and daughters). An older generation had created the New Deal order, itself an act of historic rebellion. As it happened, that creation didn’t fit well with a Democratic Party whose southern wing, embedded in the segregationist former Confederacy, rested on Jim Crow laws and beliefs. Nor did New Deal social welfare reforms square with a yearning for equality, presuming as they did a male breadwinner and head of household while excluding underclasses, especially (but not only) those of the wrong complexion, from their protections.
Moreover, the New Deal saved a capitalist economy laid low in the Great Depression by installing a new political economy of mass consumption. While a wondrous material accomplishment, that was also a socially disabling development, nourishing a culture of status-seeking individualism and so undermining the sense of social solidarity that had made the New Deal possible. Finally, in the Cold War years, it became clear that prosperity and democracy at home depended on an imperial relationship with the rest of the world and the garrisoning of the planet. In the famed phrase of Life Magazine publisher Henry Luce, an “American Century” was born.
Uprisings against that ossifying version of New Deal liberalism made the sixties “The Sixties.” Political emotions were at a fever pitch as rebels faced off against a liberal “establishment.” Matters sometimes became so overheated they threatened to melt the surface of public life. And yet here was a question that, no matter the temperature, was tough to raise at the time: What if liberalism wasn’t the problem? Admittedly, that thought was in the air then, raised not just by new and old lefties, but by Martin Luther King who famously enunciated his second thoughts about capitalism, poverty, race, and war in speeches like “Beyond Vietnam: A Time to Break Silence.”
Most of the rebels of that moment, however, clung to the ancestral faith. In the end, they were convinced that once equilibrium was restored, a more modern liberalism, shorn of its imperfections, could become a safe haven by excluding nobody. Indicted in those years for its hypocrisy and bad faith, it would be cleansed.
Thanks to those mass rebellions and the persistent if less fiery efforts that followed for decades, the hypocrisy of exclusion, whether of blacks, women, gays, or others, would indeed largely be ended. Or so it seemed. The liberalism inherited from the New Deal had been cleansed — not entirely to be sure and not without fierce resistance, but then again, nothing’s perfect, is it? End of hypocrisy. End of story.
The Missing Link
Yet at the dawning of the new millennium a paradox began to emerge. Liberal society had proved compatible with justice for all and an equal shot at the end zone. Strangely, however, in its ensuing glorious new world, the one Bill Clinton presided over, liberty, justice, and equality all seemed to be on short rations.
If not the liberal order, then something else was spoiling things. After all, the everyday lives of so many ordinary Americans were increasingly constrained by economic anxiety and a vertiginous sense of social freefall. They experienced feelings of being shut out and scorned, of suffering from a hard-to-define political disenfranchisement, of being surveilled at work (if they had it) and probably elsewhere if not, of fearing the future rather than hoping for what it might bring their way.
Brave and audacious as they were, rarely had the rebel movements of the fabled sixties or those that followed explicitly challenged the underlying distribution of property and power in American society. And yet if liberalism had proved compatible enough with liberty, equality, and democracy, capitalism was another matter.
The liberal elite that took credit for opening up that race to the top had also at times presided over a neoliberal capitalism which had, for decades, been damaging the lives of working people of all colors. (Indeed, nowadays Hillary expends a lot of effort trying to live down the legacy of mass incarceration bequeathed by her husband.) But Republicans have more than shared in this; they have, in fact, often taken the lead in implanting a market- and finance-driven economic system that has produced a few “winners” and legions of losers. Both parties heralded a deregulated marketplace, global free trade, the outsourcing of manufacturing and other industries, the privatization of public services, and the shrink-wrapping of the social safety net. All of these together gutted towns and cities as well as whole regions (think: Rust Belt America) and ways of life.
In the process, the New Deal Democratic Party’s tradition of resisting economic exploitation and inequality vaporized, while the “new Democrats” of the Clinton era and beyond, as well as many in the boardrooms of the Fortune 500 and in hedge-fund America, continued to champion equal rights for all. They excoriated conservative attempts to roll back protections against racial, gender, and sexual discrimination; but the one thing they didn’t do — none of them — was disturb the equanimity of the 1%.
And what do freedom and equality amount to in the face of that? For some who could — thanks to those breakthroughs — participate in the “race to the top,” it amounted to a lot. For many millions more, however, who have either been riding the down escalator or already lived near or at the bottom of society, it has been a mockery, a hollow promise, something (as George Carlin once noted) we still call the American Dream because “you have to be asleep to believe in it.”
Given their hand in abetting this painful dilemma, the new Democrats seemed made for the already existing sobriquet — a kind of curse invented by the populist right — “limousine liberal.” An emblem of hypocrisy, it was conceived and first used in 1969 not by the left but by figures in that then-nascent right-wing movement. The image of a silk-stocking crowd to the manner born, bred and educated to rule, networked into the circuits of power and wealth, professing a concern for the downtrodden but not about to surrender any privileges to alleviate their plight (yet prepared to demand that everyone else pony up), has lodged at the heart of American politics ever since. In our time, it has been the magnetic North of right-wing populism.
Class Struggle, American Style
In 1969, President Richard Nixon invoked the “silent majority” to do battle with those who would soon come to be known as “limousine liberals.” He hoped to mobilize a broad swath of the white working class and lower middle class for the Republican Party. This group had been loyal to the New Deal Democratic Party, but was by then feeling increasingly abandoned by it and disturbed by the rebelliousness of the era.
In the decades that followed, the limousine liberal would prove a perfect piñata for absorbing their resentments about racial upheaval, as well as de-industrialization and decline, and their grief over the fading away of the “traditional family” and its supposed moral certitudes. In this way, the Republican Party won a substantial white working-class vote. It’s clear enough in retrospect that this confrontation between the silent majority and limousine liberalism was always a form of American class struggle.
Nixon proved something of a political genius and his gambit worked stunningly well… until, of course, in our own moment it didn’t. Following his lead, the Republican high command soon understood that waving the red flag of “limousine liberalism” excited passions and elicited votes. They never, however, had the slightest intention of doing anything to truly address the deteriorating circumstances of that silent majority. The party’s leading figures were far too committed to defending the interests of corporate America and the upper classes.
Their gestures, the red meat they tossed to their followers in the “culture wars,” only increased the passions of the era until, in the aftermath of the 2007 financial meltdown and Great Recession, they exploded in a fashion the Republican elite had no way to deal with. What began as their creature, formed in cynicism and out of the festering jealousies and dark feelings of Nixon himself over the way the liberal establishment had held him in contempt, ended up turning on its fabricators.
A “silent majority” would no longer remain conveniently silent. The Tea Party howled about every kind of political establishment in bed with Wall Street, crony capitalists, cultural and sexual deviants, free-traders who scarcely blinked at the jobs they incinerated, anti-taxers who had never met a tax shelter they didn’t love, and decriers of big government who lived off state subsidies. In a zip code far, far away, a privileged sliver of Americans who had gamed the system, who had indeed made gaming the system into the system, looked down on the mass of the previously credulous, now outraged, incredulously.
In the process, the Republican Party was dismembered and it was The Donald who magically rode that Trump Tower escalator down to the ground floor to pick up the pieces. His irreverence for established authority worked. His racist and misogynist phobias worked. His billions worked for millions who had grown infatuated with all the celebrated Wall Street conquistadors of the second Gilded Age. His way of gingerly tiptoeing around Social Security worked with those whose neediness and emotional logic were captured by the person who memorably told a Republican congressman, “Keep your government hands off my Medicare.” Most of all, his muscle-flexing bombast worked for millions fed up with demoralization, paralysis, and powerlessness. They felt The Donald.
In the face-off between right-wing populism and neoliberalism, Tea Party legions and Trumpists now find Fortune 500 CEOs morally obnoxious and an economic threat, grow irate at Federal Reserve bail-outs, and are fired up by the multiple crises set off by global free trade and the treaties that go with it. And underlying such positions is a fantasy of an older capitalism, one friendlier to the way they think America used to be. They might be called anti-capitalists on behalf of capitalism.
Others — often their neighbors in communities emptying of good jobs and seemingly under assault — are feeling the Bern. This represents yet another attack on neoliberalism of the limousine variety. Bernie Sanders proudly classifies himself as a socialist, even if his programmatic ideas echo a mildly left version of the New Deal. Yet even to utter the verboten word “socialism” in public, no less insistently run on it and get away with it, exciting the fervent commitment of millions, is stunning — in fact, beyond imagining in any recent America.
The Sanders campaign has made its stand against the liberalism of the Clinton elite. It has resonated so deeply because the candidate, with all his grandfatherly charisma and integrity, repeatedly insists that Americans should look beneath the surface of a liberal capitalism that is economically and ethically bankrupt and running a political confidence game, even as it condescends to “the forgotten man.”
To a degree then, Trump and Sanders are competing for the same constituencies, which should surprise no one given how far the collateral damage of neoliberal capitalism has spread. Don’t forget that, in the Great Depression era as the Nazis grew more powerful, their party, the National Socialists, not only incorporated that word — “socialism” — but competed with the Socialist and Communist parties among the distressed workers of Germany for members and voters. There were even times (when they weren’t killing each other in the streets) that they held joint demonstrations.
Trump is, of course, a conscienceless demagogue, serial liar, and nihilist with a belief in nothing save himself. Sanders, on the other hand, means what he says. On the issue of economic justice, he has been a broken record for more than a quarter-century, even if no one beyond the boundaries of Vermont paid much attention until recently. He is now widely trusted and applauded for his views.
Hillary Clinton is broadly distrusted. Sanders has consistently outpolled her against potential Republican opponents for president because she is indeed a limousine liberal whose career has burned through trust at an astonishing rate. And more important than that, the rebellion that has carried Sanders aloft is not afraid to put capitalism in the dock. Trump is hardly about to do that, but the diseased state of the neoliberal status quo has made him, too, a force to be reckoned with. However you look at it, the age of acquiescence is passing away.
Copyright 2016 Steve Fraser
Bernie, The Donald, and the Sins of Liberalism
[The following passages are excerpted and slightly adapted from The Age of Acquiescence: The Life and Death of American Resistance to Organized Wealth and Power (Little, Brown and Company).]
Part 1: The Great Upheaval
What came to be known as the Great Upheaval, the movement for the eight-hour day, elicited what one historian has called “a strange enthusiasm.” The normal trade union strike is a finite event joining two parties contesting over limited, if sometimes intractable, issues. The mass strike in 1886 or before that in 1877 — all the many localized mass strikes that erupted in towns and small industrial cities after the Civil War and into the new century — was open-ended and ecumenical in reach.
So, for example, in Baltimore when the skilled and better-paid railroad brakemen on the Baltimore and Ohio Railroad first struck in 1877 so, too, did less well off “box-makers, sawyers, and can-makers, engaged in the shops and factories of that city, [who] abandoned their places and swarmed into the streets.” This in turn “stimulated the railroad men to commit bolder acts.” When the governor of West Virginia sent out the Berkeley Light Guard and Infantry to confront the strikers at Martinsburg at the request of the railroad’s vice president, the militia retreated and “the citizens of the town, the disbanded militia, and the rural population of the surrounding country fraternized,” encouraging the strikers.
The centrifugal dynamic of the mass strike was characteristic of this extraordinary phenomenon. By the third day in Martinsburg the strikers had been “reinforced during the night at all points by accessions of working men engaged in other avocations than railroading,” which, by the way, made it virtually impossible for federal troops by then on the scene to recruit scabs to run the trains.
By the fourth day, “mechanics, artisans, and laborers in every department of human industry began to show symptoms of restlessness and discontent.” Seeping deeper and deeper into the subsoil of proletarian life, down below the “respectable” working class of miners and mechanics and canal boat-men, frightened observers reported a “mighty current of passion and hate” sweeping up a “vast swarm of vicious idlers, vagrants, and tramps.” And so it went.
Smaller cities and towns like Martinsburg were often more likely than the biggest urban centers to experience this sweeping sense of social solidarity. (What today we might call a massing of the 99%.) During the 1877 Great Uprising, the social transmission of the mass strike moved first along the great trunk lines of the struck railroads, but quickly flowed into the small villages and towns along dozens of tributary lines and into local factories, workshops, and coal mines as squads of strikers moved from settlement to settlement mobilizing the populace.
In these locales, face-to-face relations still prevailed. It was by no means taken for granted that antagonism between labor and capital was fated to be the way of the world. Aversion to the new industrial order and a “democratic feeling” brought workers, storekeepers, lawyers, and businessmen of all sorts together, appalled by the behavior of large industrialists who often enough didn’t live in those communities and so were the more easily seen as alien beings.
It was not uncommon for local officials, like the mayor of Cumberland, Maryland, to take the side of the mass strikers. The federal postmaster in Indianapolis wired Washington, “Our mayor is too weak, and our Governor will do nothing. He is believed to sympathize with the strikers.” In Fort Wayne, like many other towns its size, the police and militia simply could not be counted on to put down the insurrectionists. In this world, corporate property was not accorded the same sanctified status still deferred to when it came to personal property. Sometimes company assets were burned to the ground or disabled; at other times they were seized, but not damaged.
Metropolises also witnessed their own less frequent social earthquakes. Anonymous relations were more common there, the gulf separating social classes was much wider, and the largest employers could count on the new managerial and professional middle classes for support and a political establishment they could more often rely on.
Still, the big city hardly constituted a DMZ. During the mass strike of 1877 in Pittsburgh, when 16 citizens were killed, the city erupted and “the whole population seemed to have joined the rioters.”
“Strange to say,” noted one journalist, elements of the population who had a “reputation for being respectable people — tradesmen, householders, well-to-do mechanics and such — openly mingled with the [turbulent mob] and encouraged them to commit further deeds of violence.” Here, too, as in smaller locales, enraged as they clearly were, mass strikers still drew a distinction between railroad property and the private property of individuals, which they scrupulously avoided attacking. Often enough the momentum of the mass strike was enough to win concessions on wages, hours, or on other conditions of work — although they might be provisional, not inscribed in contracts, and subject to being violated or ignored when law and order was restored.
“Eight Hours for What We Will”
Brickyard and packinghouse workers, dry goods clerks, and iron molders, unskilled Jewish female shoe sewers and skilled telegraphers, German craftsmen from the bookbinding trade and unlettered Bohemian freight handlers, all assembled together under the banner of the Knights of Labor or less formal, impromptu assemblies. The full name of the Knights was actually the Noble and Holy Order of the Knights of Labor, a peculiar name that like so much of the electric language of the long nineteenth century sounds so dissonant and oddly exotic to modern ears. With one foot in the handicraft past and the other trying to step beyond the proletarian servitude waiting ominously off in the future, the Knights was itself the main organizational expression of the mass strike. It was part trade union, part guild, part political protest, part an aspiring alternate cooperative economy.
At all times and especially in smaller industrial towns, the Knights relied on ties to the larger community — kin, neighbors, local tradespeople — not merely the fellowship of the workplace. Like the Populist movement, it practically constituted an alternative social universe of reading rooms, newspapers, lecture societies, libraries, clubs, and producer cooperatives. Infused with a sense of the heroic and the “secular sacred,” the Knights envisioned themselves as if on a mission, appealing to the broad middling ranks of local communities to rescue the nation and preserve its heritage of republicanism and the dignity of productive labor.
This “Holy Order,” ambiguous and ambivalent in ultimate purpose, nevertheless mustered a profound resistance to the whole way of life represented by industrial capitalism even while wrestling with ways of surviving within it. So it offered everyday remedies — abolishing child and convict labor, establishing an income tax and public ownership of land for settlement not speculation, among others. Above all, however, it conveyed a yearning for an alternative, a “cooperative commonwealth” in place of the Hobbesian nightmare that Progress had become.
Transgressive by its nature, this “strange enthusiasm” shattered and then recombined dozens of more parochial attachments. The intense heat of the mass strike fused these shards into something more daring and generous-minded. Everything about it was unscripted. The mass strike had a rhythm all its own, syncopated and unpredictable as it spread like an epidemic from worksite to marketplace to slum. It had no command central, unlike a conventional strike, but neither was it some mysterious instance of spontaneous combustion. Rather, it had dozens of choreographers who directed local uprisings that nevertheless remained elastic enough to cohere with one another while remaining distinct. Its program defied easy codification. At one moment and place it was about free speech, at another about a foreman’s chronic abuse, here about the presence of scabs and armed thugs, there about a wage cut.
It ranged effortlessly from something as prosaic as a change in the piece rate to something as portentous as the nationalization of the country’s transportation and communication infrastructure, but at its core stood the demand for the eight-hour day. Blunt yet profound, it defined for that historical moment both the irreducible minimum of a just and humane civilization and what the prevailing order of things seemingly could not, or would not, grant. The “Eight Hour Day Song,” which became the movement’s anthem, captured that intermixing of the quotidian and the transcendent:
“We want to feel the sunshine
We want to smell the flowers;
We’re sure God has willed it.
And we mean to have eight hours.
We’re summoning our forces from
shipyard, shop, and mill;
Eight hours for work, eight hours for rest
Eight hours for what we will.”
When half a million workers struck on May 1, 1886 — the original “May Day,” still celebrated most places in the world except in the United States where it began — the strikers called it Emancipation Day. How archaic that sounds. Such hortatory rhetoric has gone out of fashion. The eight-hour-day movement of 1886 and the mass strikes that preceded, accompanied, and followed it were a freedom movement in the land of the free directed against a form of slavery no one would recognize or credit today.
Part 2: A Potemkin Village of the Nouveau Riche
Feudalism of a distinctly theatrical kind was the utopian refuge of the upper classes. Mostly that consisted of a retreat from an active engagement with the tumult around them. Some plutocrats, like George Pullman or J.P. Morgan, were, on the contrary, deeply implicated in running things. Morgan functioned as the nation’s unofficial central banker, but from a distinctly feudal point of view, famously declaring, “I owe the public nothing.”
Other corporate chieftains, like Mark Hanna, a kingmaker within the Republican Party, or August Belmont, who performed a similar role for the Democrats, became increasingly involved in political affairs. (Hanna once mordantly remarked, “There are only two things that are important in politics. The first is money and I can’t remember what the second one is.”) The two party machines had exercised some independence immediately after the Civil War, demanding tribute from the business classes. As the century ran down, however, they were domesticated, becoming water carriers for those they had once tithed. Legislative bodies during this era, including the Senate, otherwise known as “the Millionaires Club,” filled up with the factotums of corporate America.
Far larger numbers of the nouveau riche, a rentier class of landlords and coupon clippers, however, were gun-shy about embroiling themselves. Instead, they confected a hermetically sealed-off Potemkin village in which they pretended they were aristocrats with all the entitlements and deference and legitimacy that come with that station.
Looking back a century and more, all that dressing up — the masquerade balls where the Social Register elite (the “Patriarchs” of the 1870s, the “400” by the 1890s) paraded about as Henry VIII and Marie Antoinette, the liveried servants, the castles disassembled in France or Italy or England and shipped stone by stone to be reassembled on Fifth Avenue, the fake genealogies and coats of arms, hunting to hounds and polo playing, raising pedigreed livestock for decorative purposes, the helter-skelter piling up of heirloom jewelry, Old Masters, and oriental rugs, the marrying off of American “dollar princesses” to the hard-up offspring of Europe’s decaying nobility, the exclusive watering holes in Newport and Bar Harbor, the prep schools, and gentlemen’s clubs fencing them off from hoi polloi, the preoccupation with social preferment that turned prized parterre boxes at opera houses and concert halls into deadly serious tournament jousts — seems silly. Or more to the point, it all appears as incongruously weird behavior in the homeland of the democratic revolution. And in some sense it was.
Yet this spectacle had a purpose, or multiple purposes. First of all, it was a time-tested way of displaying power for all to see. More than that was going on, however. It constituted the infrastructure of a utopian cultural fantasy by a risen class so raw and unsure of its place and mission in the world that it needed all these borrowed credentials as protective coloring. An elaborate camouflage, it might serve to legitimate them both in the eyes of those over whom they were suddenly exercising or seeking to exercise enormous power, and in their own eyes as well.
After all, many of these first- and second-generation bourgeois potentates had just sprung from social obscurity and the homeliest economic pursuits. Their native crudity was in plain sight, mocked by many. Herman Melville remarked, “The class of wealthy people are, in aggregate, such a mob of gilded dunces, that not to be wealthy carries with it a certain distinction and nobility.” As their social prominence and economic throw weight increased at an extraordinary rate — and, along with it, the most furious challenges to their sudden preeminence — so too did the need to fabricate delusions of stability and tradition, to feel rooted even in the shallowest of soils, to thicken the borders of their social insulation.
Caroline Astor, better known as “Mrs. Astor,” the doyenne of this world, whose grandfather-in-law had started out as a butcher, wrestled to express how such tensions might be resolved. Her family’s life was described by one observer this way: “The livery of their footmen was a close copy of that familiar at Windsor Castle and their linen was marked with emblems of royalty. At the opera they wore tiaras, and when they dined the plates were in keeping with imperial pretensions.”
Similar portraits were painted of many of the great dynastic families and their offspring; the Goulds, Harry Payne Whitney, the Vanderbilts, and others were depicted in ways that made them highly improbable candidates to form a socially conscious aristocracy. Mrs. Astor herself was once described as a “walking chandelier” because so many diamonds and pearls were pinned to every available empty space on her body.
Her relative John Jacob Astor IV, a notorious playboy, was chastised, along with his peers, by an Episcopal minister: “Mr. Astor and his crowd of New York and Newport associates have for years not paid the slightest attention to the laws of the church and state which have seemed to contravene their personal pleasures or sensual delights. But you can’t defy God all the time. The day of reckoning comes and comes not in our own way.” Some years later Astor went down with the Titanic. Another member of the clan declined an invitation by President Hayes to serve as ambassador to England on the grounds that it violated the family credo: “Work hard, but never work after dinner.”
Ward McAllister, the major-domo of the Social Register’s “400,” took a stab at coherence from another angle. “Now, with the rapid growth of riches, millionaires are too common to receive much deference; a fortune of a million is only respectable poverty,” McAllister said. “So we have to draw social boundaries on another basis: old connections, gentle breeding, perfection in all the requisite accomplishments of a gentleman, elegant leisure and an unstained private reputation count for more than newly gotten riches.”
But the “old connections” were as new and ephemeral as yesterday’s business negotiation, and “gentle breeding” for some didn’t even include full literacy or numeracy but did include copious spitting; the “accomplishments of a gentleman” would have to embrace every kind of shrewd dealing in the marketplace, or else the pickings would be scarce. And the tides of America’s volatile economy meant that no matter how high the sand dunes were built around the redoubts of “old money,” they could never resist for long the onrush of new money.
It was all, as one historian noted, a “pageant and fairy tale,” a peculiar arcadia of castles and servants, an homage to the “beau ideal” by a newly hatched social universe trying but failing to “live down its mercantile origins.” But this dream life was ill-suited to the arts and crafts of ruling over a society that at best was apt to find this charade amusing, at worst an insult. What was missing was an actual aristocracy.
Wall Street Brahmin Henry Lee Higginson, fearing “Awful Democracy” — that whole menagerie of radicalisms — urgently appealed to his fellows to take up the task of mastery, “more wisely and more humanely than the kings and nobles have done. Our chance is now — before the country is full and the struggle for bread becomes intense. I would have the gentlemen of the country lead the new men who are trying to become gentlemen.”
The appeal fell mainly on deaf ears. Many in this set were sea-dog capitalists, dynasty builders, for whom accumulation was a singular, all-consuming obsession. They reckoned with outside authority if they had to, manipulated it if they could, but just as often went about their business as if it didn’t exist. Bred to hold politics in contempt, one Social Register memoirist recalled growing up during the “great barbeque.” He was taught to think of politics as something “remote, disreputable, and infamous, like slave-trading or brothel-keeping.”
Together they concocted a world set apart from the commercial, political, sexual, ethnic, and religious chaos threatening to envelop them. An upper-class “white city” of chivalry, honor codes, and fraternal loyalties, mannered, carefree, and self-regarding, it was a laboratory of narcissistic self-indulgence, an ostensible repudiation of those distinctly bourgeois character traits of prudence, thrift, and money-grubbing.
Born into an age defined by steam, steel, and electricity, they attempted to wall themselves off from modernity in an alternate universe, part medieval, part Renaissance Europe, part ancient Greece and Rome, a pastiche of golden ages. The long nineteenth century had given birth to a plutocracy unschooled and indisposed to win the trust and preside over a society it feared. Instead, the plutocracy preferred playacting at aristocracy, simultaneously confirming all the popular suspicions about its real intentions and forming a society that had forsaken society.
The self-imposed aloofness and feudal pretentiousness of the upper classes left the institutions and cultural wherewithal of the commonwealth thin on the ground. An indigenous suspicion of overbearing government born out of the nation’s founding left the apparatus of the state strikingly weak and underdeveloped well past the turn of the twentieth century. All of its resources, that is, except one: force, rule by blunt instrument. The frequent resort to violence that so marked the period was thus the default position of a ruling elite not really prepared to rule. And of course it only aggravated the dilemma of consent. Those suffering from the callousness of the dominant classes were only too ready to treat them as they depicted themselves — that is, as aristocrats but usurping ones lacking even a scintilla of legitimate authority.
The American upper classes did not constitute a seasoned aristocracy, but could only mimic one. They lacked the former’s sense of social obligation, of noblesse oblige, of what in the Old World emerged as a politically coherent “Tory socialism” that worked to quiet class antagonisms. But neither did they absorb the democratic ethos that today allows the country’s gilded elite to act as if they were just plain folks: a credible enough charade of plutocratic populism. Instead, faced with mass social disaffection, they turned to the “tramp terror” and other innovations in machine-gun technology, to private corporate armies and government militias, to suffrage restrictions, judicial injunctions, and lynchings. Why behave otherwise in dealing with working-class “scum,” a community of “mongrel firebugs”?
One historian has described what went on during the Great Uprising as an “interlocking directorate of railroad executives, military officers, and political officials which constituted the apex of the country’s new power elite.” After Haymarket, the haute bourgeoisie went into the fort building business; Fort Sheridan in Chicago, for example, was erected to defend against “internal insurrection.” New York City’s armories, which have long since been turned into sites for indoor tennis, concerts, and theatergoing, were originally erected after the 1877 insurrection to deal with the working-class canaille.
During the anthracite coal strike of 1902, George Baer, president of the Philadelphia and Reading Railroad and leader of the mine owners, sent a letter to the press: “The rights and interests of the laboring man will be protected and cared for… not by the labor agitators, but by the Christian men of property to whom God has given control of the property rights of the country.” To the Anthracite Coal Commission investigating the uproar, Baer proclaimed, “These men don’t suffer. Why hell, half of them don’t even speak English.”
Ironically, it was thanks in part to its immersion in bloodshed that the first rudimentary forms of a more sophisticated class consciousness began to appear among this new elite. These would range from Pullman-like Potemkin villages to more practical-minded attempts to reach a modus vivendi with elements of the trade union movement readier to accept the wages system.
Yet the political arena, however much its main institutions bent to the will of the rich and mighty, remained ostensibly contested terrain. On the one hand, powerful interests relied on state institutions both to keep the “dangerous classes” in line and to facilitate the process of primitive accumulation. But an opposed instinct, native to capitalism in its purest form, wanted the state kept weak and poor so as not to intrude where it wasn’t wanted. Due to this ambivalence, the American state was notoriously undernourished, its bureaucracy kept skimpy, amateurish, and machine-controlled, its executive and administrative reach stunted.
No society can live indefinitely on such shifting terrain, leaving the most vital matters unresolved. Even before the grand denouement of the Great Depression and New Deal arrived, an answer to the labor question was surfacing, one that would put an end to the long era of anti-capitalism. It would become the antechamber to the Age of Acquiescence.
Steve Fraser is an historian, editor, writer, and TomDispatch regular. His newest book is The Age of Acquiescence: The Life and Death of American Resistance to Organized Wealth and Power, excerpted above. His previous books include Every Man a Speculator: A History of Wall Street in American Life and Wall Street: America’s Dream Palace. He is the co-founder of the American Empire Project.
Excerpted from the book The Age of Acquiescence by Steve Fraser. Copyright (c) 2015 by Steve Fraser. Reprinted with permission of Little, Brown and Company.
Copyright 2015 Steve Fraser
Plutocracy, the First Time Around
In 2007, a financial firestorm ravaged Wall Street and the rest of the country. In 2012, Hurricane Sandy obliterated a substantial chunk of the Atlantic seaboard. We think of the first as a man-made calamity, the second as the malignant innocence of nature. But neither the notion of a man-made nor natural disaster quite captures how the power of a few and the vulnerability of the many determine what is really going on at ground level. Causes and consequences, who gets blamed and who leaves the scene permanently scarred, who goes down and who emerges better positioned than before: these are matters often predetermined by the structures of power and wealth, racial and ethnic hierarchies, and despised and favored forms of work, as well as moral and social prejudices in place before disaster strikes.
When it comes to our recent financial implosion, this is easy enough to see, although great efforts have been expended trying to deny the self-evident. “Man” did not bring the system to its knees; the country’s dominant financial institutions and a complicit government did that. They’ve recovered, the rest of us haven’t.
Sandy seems a more ambiguous case. On the one hand, it’s obvious enough that an economy resting on fossil fuels played a catalytic role in intensifying the storm. Those corporate interests profiting from that form of energy production and doing all they can to defend it are certainly culpable, not the rest of mankind, which has no other choice but to depend on the energy system we’re given.
On the other hand, rich and poor, big businesses and neighborhood shops suffered; some, however, more than others. Among them were working-class communities; public-housing residents; outer borough homeowners; communities in Long Island, along the New Jersey shore, and inland as well; workers denied unemployment compensation; and the old, the sick, and the injured abandoned for days or weeks in dark and dangerous high-rises without medical help or access to fresh food or water. Help, when it came to these “disadvantaged” worlds, often arrived late, or last, or not at all.
Cleaning up and rebuilding New York City and other places hit by the storm will provide a further road map of who gets served and whose ox gets gored. It’s ominous, if hardly shocking, that Mayor Bloomberg has already appointed Mark Ricks of Goldman Sachs to the business-dominated team planning the city’s future. Where would this billionaire mayor turn other than to his fraternity brothers, especially in this era when, against all the odds, we still worship at the altar of the deal-makers, no matter their malfeasances and fatal ineptitudes?
Still, it is early days and the verdict is not in on the post-Sandy future. However, an incisive analysis by sociologists Kevin Fox Gotham and Miriam Greenberg of what happened after the 9/11 attacks in New York and in New Orleans after Hurricane Katrina offers some concrete forebodings. Everyone knows that, as soon as Katrina made landfall, the racial divisions of New Orleans became the scandal of the month when it came to which communities were drowned and which got helped, who got arrested (and shot), and who left town forever. To be poor in New Orleans during and after Katrina was a curse. To be poor and black amounted to excommunication.
Gotham and Greenberg prove that, post-9/11 and post-Katrina, reconstruction and rehabilitation were also skewed heavily in favor of the business community and the wealthy. In both cities, big business controlled the redevelopment process — and so determined where the money landed and where it didn’t.
Tax breaks and private sector subsidies became channels for federal aid. “Public benefit” standards, which once accompanied federal grants and tax exemptions to insure that projects served some public purpose, especially to “benefit persons of low and moderate income,” were eliminated, leaving poorer people out in the cold, while exacerbating existing inequalities. Governments scurried around inventing ways to auction off reconstruction projects to private interests by issuing tax exempt “Private Activity Bonds.” These were soon gloriously renamed “Liberty Bonds,” though the unasked question was: Whose liberty?
The lion’s share of grants and exemptions went, of course, to the biggest corporations. In New York, more than 40% of all bonds, or $2.4 billion, went to a single developer, Larry Silverstein. Second to Silverstein was — don’t be shocked — Goldman Sachs. Yet these institutions and their inhabitants represented at best a mere 15% of those affected, most of whom were low-wage workers who, in some cases, ended up getting evicted from their homes thanks to those business-oriented tax breaks. Federal aid, hypothetically tied to building affordable housing and the creation of living-wage jobs, ended up as just that: hypothetical.
Naturally, these mechanisms proved lucrative. More than that, they are the means by which elites use disasters as opportunities to turn wrecked cities or regions into money-making centers and playlands for what in the nineteenth century was called “the upper tendom” and what we now call “the 1%.”
Indeed, the original “upper tendom” faced its own “natural” disasters during the Gilded Age. Then, too, such catastrophes exposed the class and racial anatomy of America to public view. Then, too, one man’s disaster was another’s main chance. Whether you focus on the cause of the calamity, the way people reacted to it, or the means and purposes that drove the reclamation afterwards, disasters and capitalism metabolized together long before “disaster capitalism” became the nom du jour.
Mrs. O’Leary’s infamously rambunctious cow did not kick a lantern into a batch of hay and start the Chicago fire of 1871. To this day, however, many probably still believe the story, even though the journalist who first reported it admitted a mere 20 years later that he’d made it up.
It was a story that stuck because it meshed with the ethnic and social fears and prejudices of bourgeois Chicago. Irish and German immigrants then filled up the congested warrens of that Midwestern center of industry and commerce. Their customs, religions, languages, political beliefs, and proletarian status were alien and alarming — especially because that was the year of the Paris Commune, when proletarians took over the French national capital for two months. It was an event that scared the daylights out of the “upper tendom” and broad stretches of the middle classes as well in cities and towns throughout the U.S.
Chicago’s papers were full of stories about “petroleuses,” “amazon-like women” with “long flaming hair” coursing through the streets of Paris hurling the equivalent of Molotov cocktails at the French National Guard. Could it happen here? That was the question. Impoverished immigrant workers were already raising a ruckus in mines and on railroads. Perhaps as in France, so in Chicago they would become conspirators and incendiaries. Perhaps the great fire that gutted the city was no accident. Even if it was, weren’t there those prepared to make malevolent use of it?
Rumors of secret societies, revolutionary arsonists, and mass assaults on property circulated widely by word of mouth and through the Chicago media. So Mrs. O’Leary proved an especially apt scapegoat for the conflagration, fitting perfectly the temper of the time. She was, after all, “low class” Irish at a moment when her immigrant countrymen were still despised as rustic potato eaters, bestial and good for nothing but back-breaking labor. It was also known that they were all too Catholic, notoriously fond of alcohol, and quite capable of terrorizing British landlords back home.
Less talked about was the likelier explanation for the fire’s devastation: namely, the unimaginably congested neighborhoods of the poor, made entirely out of wood — houses, signs, and sidewalks, too. These had for years been the sites of frequent fires (two a day in 1870). Such frail structures became kindling for the flames that would in 1871 end up leveling downtown banks, businesses, and the homes of the rich.
These fears leaped with the flames that were burning up the city, killing some 300 people and leaving 100,000 homeless, and in the days and weeks that followed they hardly subsided. Immigrant, poor, and proletarian, Chicago’s working class was held in deep moral suspicion. Believing is often seeing, so when an upper-class eyewitness looked, here’s what he saw: “Vice and crime had got the first scorching. The district where the fire got firm foothold was the Alsatia of Chicago. Fleeing before it was a crowd of blear-eyed, drunken and diseased wretches, male and female, half naked, ghastly with painted cheeks cursing, uttering ribald jests.”
Relief agencies, mainly privately run, were charged with aiding only the “worthy,” and the needy were “deserving” of help only after close inspection of their work habits, family arrangements, home economics, drinking customs, and so on. Civil War General Philip Sheridan established martial law and was quick to fire on suspected looters, while enforcing a curfew to keep the “twilight population” in check.
At the same time, Chicago’s business elite, its civic leaders, and a remarkable roster of first-rate architects went about reshaping downtown Chicago into a modern hub of commerce and culture that they hoped would rival New York. Real-estate speculators made a fortune, although none were known to have been shot for looting. For some, in other words, the fire functioned as a fortuitous slum clearance/urban renewal program on speed.
Angry working people marched against new restrictions on cheaper building materials, seeing them as discriminatory against labor and immigrants, as attempts to force them out of their city. They paraded to the Common Council, where they threw bricks through the windows while it dutifully passed the ordinances. For their efforts, the protesters were denounced as the “scum of the community,” “mongrel firebugs,” and likened to the Parisian communards, intent on establishing a “reign of terror.”
The fire was out but only for the time being. The fires of social insurrection were still smoldering and would flame up again and again in the streets of Chicago throughout the rest of the century.
An unnatural disaster! With a “roar like thunder,” a wall of water 60 feet high from Lake Conemaugh, believed then to be the largest artificial body of water in the world, came racing down a canyon near Johnstown, Pennsylvania, at 40 miles an hour. Everything in its path was swept away, starting with Woodvale, a company town run by the Cambria Iron Works. Johnstown itself was next as the tidal wave rushed on relentlessly, drowning and destroying bridges, oil tankers, and factories. It tossed locomotives, railroad cars, and even houses into the air. It ended the lives of more than 2,200 people. Seven hundred and seventy-seven were never identified and are buried in the “Plot of the Unknown.” Johnstown has been memorialized ever since in song and story.
Was it fate as well as an especially rainy spring that did the trick in 1889? At the top of the canyon, members of the South Fork Fishing and Hunting Club, men like iron and steel magnates Andrew Carnegie, Henry Clay Frick, and Andrew Mellon, as well as the crème de la crème of Pittsburgh high society (the city was only 60 miles away) had long enjoyed the pleasures of that man-made lake. They had gone fishing, paddle boating, and sailing there for years. And for years, engineers kept informing the iron and steel barons that the earthen dam holding back its waters was defective. The spillway was both too small and clogged with fencing materials meant to keep the expensive sports fish stocked in the lake from escaping into a nearby river. Auxiliary discharge pipes had decayed and leaks had been routinely noticed at the base of the dam even when the weather was especially dry.
The club’s sportsmen did nothing. In fact, they ordered several feet shaved off the top of the dam to make way for a road so members could get to their “cottages” faster from the nearby railroad station. After the horror, there were lawsuits aplenty, but no one was ever held responsible for what quickly became a legendary tragedy. In 1989, on the centennial of the disaster, an article in the Journal of Civil Engineering confirmed that the actions of the South Fork Club were the proximate cause of this “natural disaster.”
All was not lost, however. Some years after Johnstown was rebuilt, Andrew Carnegie donated one of his libraries for which he would become so widely celebrated.
Bubonic plague returned to San Francisco when the earthquake of 1906 sent hordes of rats racing through the rubble, chasing through the raw sewage spilling into the streets as the city’s sewer pipes crumpled. Anyone was potentially susceptible. In one way the earthquake had been an equal-opportunity destroyer. Chinatown, with its masses of poor living in squalid wooden shacks, was razed to the ground by the quake and subsequent fire. Other working class precincts were similarly leveled and burnt. But so, too, was Nob Hill, where the city’s gilded elite lived.
A mythic memory of communal suffering, self-sacrifice, and mutual aid emerged in the immediate aftermath of the San Francisco disaster, as it still does in the wake of many similar collective traumas. After 9/11, as after Superstorm Sandy, stories of how people from all walks of life banded together to help one another were commonplace. This was even true in Chicago after the fire, notwithstanding the white-hot hostilities between the classes and the masses. These are not fables, but moving accounts drawn from real life. They offer a kind of hope in disaster and, consoling as they are meant to be, linger on, sometimes forever. Meanwhile, interred and resting in peace are often the disaster’s darker doings.
Looking back on earthquake-ravaged San Francisco, a well-off refugee remembered that the calamity “did not discriminate between tavern and tabernacle, bank and brothel.” Yet the wife of the president of Levi Strauss & Co. drove up to one of the relief centers in her limousine (in those early days cars were still mainly luxury machines and she owned one of the handful of limos in the city). She was, of course, ushered right to the head of its endless line.
Even in these immediate post-quake reports, one could detect other motivations at work. So, for example, while San Francisco was ravaged, the death toll was calculated at only about 375 people. For a savage firestorm coursing through the most densely packed of neighborhoods, that low figure surprised people and left some wondering. The answer turned out to be this: the city fathers were determined to cite a low number so as not to discourage San Francisco’s rebuilding and the outside investment it would require. For many years, the figure was nonetheless accepted as accurate. Recently, however, through the diligent efforts of researchers, we have learned that the number of dead was probably 10 times higher. News of the bubonic plague was suppressed for similar reasons.
Calculations of that kind informed many aspects of the tragedy. While sitting atop the San Andreas Fault is obviously hazardous should the underlying tectonic plates move a bit, not much was said about other contributory causes. Minor earthquakes had erupted for decades, and these had been set off, at least in part, by the hydraulic mining that accompanied the California gold rush in its later years.
The operation to relieve the distress of hundreds of thousands of homeless people after the quake was tainted by class and ethnic biases not unlike those in Chicago. Relief camps segregated refugees by class as well as race and gender. Firefighters pooled water and equipment to save the homes of the wealthy first. In working class districts, fire-fighting focused on commercial properties like a Folger’s Coffee warehouse and freight sheds, not on saving homes. Seventeen hundred troops under General Frederick Funston guarded richer precincts because, as he explained, “San Francisco had its class of people no doubt who would take advantage of any opportunity to plunder the banks and rich jewelry…” Chinatown did not die an entirely natural death either. It was dynamited to create firebreaks and so prevent the fires already raging there from spreading to tonier neighborhoods.
Two years after the event, poor people were still living in “relief cottages,” tents, and other makeshift accommodations which, at rental rates of six dollars a month, many couldn’t afford. To get relief required a letter from a clergyman testifying to one’s moral worthiness. Working class women took to the streets to protest.
Meanwhile, former residents of Nob Hill had moved into equally luxurious digs elsewhere in the city. However, they did have a problem in those early months. There was a crying lack of domestic help. As the San Francisco Chronicle reported, “Everyone had wondered where the cooks had gone. They had been lost since the fire.” So working women, who were bending all their efforts to restoring their devastated families by making use of what relief was available, were chastised for not returning to the kitchens of the elect. One paper claimed that the women “were loafing… when families needed help,” or as a Red Cross matron observed, “Women [domestics] prefer to live… in relief camps.”
Help was, however, on the way. Special rehabilitation funds were reserved for single women so they could resume their lives of domestic service.
Being solicitous about the needs of the rich could reach heights of absurdity. It was recommended, for instance, that special philanthropic pawn shops be established for the upper classes where “people who saved their jewels could be rehabilitated by having such a place to go where they would not have to pay too much interest.”
If rehabilitation and recovery were on the civic mind, certain minds counted more than others. Everybody knew that the city’s wood-frame buildings could not stand up to the pressures of another earthquake, which — they also knew — was a reasonable future possibility. So new building codes were adopted calling for the use of reinforced concrete and steel in structures over six stories high. They lasted a year. Pressure from the business community and builders caused the city to relax those rules, except in the new downtown, which was urgently readying itself for the Panama Pacific International Exposition of 1915, with which the city’s boosters hoped to eradicate the last pungent odors of the calamity.
A $500 “bonus plan” to help rebuild homes favored the native-born and two-parent households. Housing rehabilitation began with the wealthy and worked its way very slowly to the poor. There were lots of jobs for “earthquake mechanics,” but at wages that could never keep up with escalating rents driven by real-estate speculators.
Insurance companies had by then rewritten their homeowner policies to exempt earthquakes from coverage. Fire was covered, however, and it’s clear that people deliberately set fire to their own homes, already ruined by the tremors, since without insurance money there was no way they could recover and rebuild. Not surprisingly, payouts were highest for the wealthy. The insurance companies worked at delaying payments to the hardest hit, the poor. This fit with the mood of the moment — that those working class shacks were “no loss to the city.”
Neither was Chinatown. San Francisco’s upper crust, as well as large portions of its white middle and working classes, had never been fond of the Chinese in their midst, even though they depended on their labors. The quake struck the city’s burghers as an opportunity to funnel them out of the center of the city — the old Chinatown had largely been destroyed — to some enclave on its outskirts. (“Fire has reclaimed civilization and cleanliness from the Chinese ghetto.”) Their plans were, however, successfully thwarted by the concerted resistance of the Chinese community.
Resistance notwithstanding, Chicago and San Francisco emerged from their trials by fire as bustling centers of capitalist enterprise. Disaster capitalism has a long history. One of the last remaining “relief cottages” built by Funston’s army at the cost of $100 and rented for $2 was just recently sold for $600,000.
Recently, when the Republican majority in Congress temporarily blocked funds for Sandy relief and rehabilitation efforts, it was a chilling reminder that no matter how universal a calamity is, we live in times when the commonwealth regularly takes a backseat to wealth. Appeals to fellowship, to mutual assistance and shared sacrifice seem to give way with scandalous speed to the commanding imperatives of a warped economy and political plutocracy.
More Sandys are surely headed our way, more climate-driven disasters of all sorts than we can now fully imagine. And rest assured, they will be no more “natural” than the Chicago fire, the Johnstown flood, or the San Francisco earthquake. More than fire itself, what we need to deal with now is the power of the finance, insurance, and real estate — or FIRE — sector, whose leading corporations now effectively run our economy. Without doing that, the “nature” these interests have helped create will punish us all while providing a ghoulish boondoggle for a few.
Steve Fraser is editor-at-large of New Labor Forum, a co-founder of the American Empire Project, and author most recently of Wall Street: America’s Dream Palace. A version of this piece will appear in the spring issue of New Labor Forum.
Copyright 2013 Steve Fraser
Shakespeare’s Polonius offered this classic advice to his son: “neither a borrower nor a lender be.” Many of our nation’s Founding Fathers emphatically saw it otherwise. They often lived by the maxim: always a borrower, never a lender be. As tobacco and rice planters, slave traders, and merchants, as well as land and currency speculators, they depended upon long lines of credit to finance their livelihoods and splendid ways of life. So, too, in those days, did shopkeepers, tradesmen, artisans, and farmers, as well as casual laborers and sailors. Without debt, the seedlings of a commercial economy could never have grown to maturity.
Ben Franklin, however, was wary on the subject. “Rather go to bed supperless than rise in debt” was his warning, and even now his cautionary words carry great moral weight. We worry about debt, yet we can’t live without it.
Debt remains, as it long has been, the Dr. Jekyll and Mr. Hyde of capitalism. For a small minority, it’s a blessing; for others a curse. For some the moral burden of carrying debt is a heavy one, and no one lets them forget it. For privileged others, debt bears no moral baggage at all, presents itself as an opportunity to prosper, and if things go wrong can be dumped without a qualm.
Those who view debt with a smiley face, as the royal road to wealth accumulation, and who tend to be forgiven if their default is large enough, almost invariably come from the top rungs of the economic hierarchy. Then there are the rest of us, who get scolded for our impecunious ways, foreclosed upon and dispossessed, left with scars that never fade away and wounds that disable our futures.
Think of this upstairs-downstairs class calculus as the politics of debt. British economist John Maynard Keynes put it like this: “If I owe you a pound, I have a problem; but if I owe you a million, the problem is yours.”
After months of an impending “debtpocalypse,” the dreaded “debt ceiling,” and the “fiscal cliff,” Americans remain preoccupied with debt, public and private. Austerity is what we’re promised for our sins. Millions are drowning, or have already drowned, in a sea of debt — mortgages gone bad, student loans that may never be paid off, spiraling credit card bills, car loans, payday loans, and a menagerie of new-fangled financial mechanisms cooked up by the country’s “financial engineers” to milk what’s left of the American standard of living.
The world economy almost came apart in 2007-2008, and still may do so under the whale-sized carcass of debt left behind by financial plunderers who found in debt the leverage to get ever richer. Most of them still live in their mansions and McMansions, while other debtors live outdoors, or in cars or shelters, or doubled-up with relatives and friends — or even in debtor’s prison. Believe it or not, a version of debtor’s prison, that relic of early American commercial barbarism, is back.
In 2013, you can’t actually be jailed for not paying your bills, but ingenious corporations, collection agencies, cops, courts, and lawyers have devised ways to insure that debt “delinquents” will end up in jail anyway. With one-third of the states now allowing the jailing of debtors (without necessarily calling it that), it looks ever more like a trend in the making.
Will Americans tolerate this, or might there emerge a politics of resistance to debt, as has happened more than once in a past that shouldn’t be forgotten?
The World of Debtor’s Prisons
Imprisonment for debt was a commonplace in colonial America and the early republic, and wasn’t abolished in most states until the 1830s or 1840s, in some cases not until after the Civil War. Today, we think of it as a peculiar and heartless way of punishing the poor — and it was. But it was more than that.
Some of the richest, most esteemed members of society also ended up there, men like Robert Morris, who helped finance the American Revolution and ran the Treasury under the Articles of Confederation; John Pintard, a stockbroker, state legislator, and founder of the New-York Historical Society; William Duer, graduate of Eton, powerful merchant and speculator, assistant secretary in the Treasury Department of the new federal government, and master of a Hudson River manse; a Pennsylvania Supreme Court judge; army generals; and other notables.
Whether rich or poor, you were there for a long stretch, even for life, unless you could figure out some way of discharging your debts. That, however, is where the similarity between wealthy and impoverished debtors ended.
Whether in the famous Marshalsea in London where Charles Dickens had Little Dorrit’s father incarcerated (and where Dickens’s father had actually languished when the author was 12), or in the New Gaol in New York City, where men like Duer and Morris did their time, debtors’ prisons were segregated by class. If your debts were large enough and your social connections weighty enough (the two tended to go together), you lived comfortably. You were supplied with good food and well-appointed living quarters, as well as books and other pleasures, including on occasion manicurists and prostitutes.
Robert Morris entertained George Washington for dinner in his “cell.” Once released, he resumed his career as the new nation’s richest man. Before John Pintard moved to New Gaol, he redecorated his cell, had it repainted and upholstered, and shipped in two mahogany writing desks.
Meanwhile, the mass of petty debtors housed in the same institution survived, if at all, amid squalor, filth, and disease. They were often shackled, and lacked heat, clean water, adequate food, or often food of any kind. (You usually had to have the money to buy your own food, clothing, and fuel.) Debtors in these prisons frequently found themselves quite literally dying of debt. And you could end up in such circumstances for trivial sums. Of the 1,162 jailed debtors in New York City in 1787, 716 owed less than twenty shillings or one pound. A third of Philadelphia’s inmates in 1817 were there for owing less than $5, and debtors in the city’s prisons outnumbered violent criminals by 5:1. In Boston, 15% of them were women. Shaming was more the point of punishment than anything else.
Scenes of public pathos were commonplace. Inmates at the New Gaol, if housed on its upper floors, would lower shoes out the window on strings to collect alms for their release. Other prisons installed “beggar gates” through which those jailed in cellar dungeons could stretch out their palms for the odd coins from passersby.
Poor and rich alike wanted out. Pamphleteering against the institution of debtor’s prison began in the 1750s. An Anglican minister in South Carolina denounced the jails, noting that “a person would be in a better situation in the French King’s Gallies, or the Prisons of Turkey or Barbary than in this dismal place.” Discontent grew. A mass escape from New Gaol of 40 prisoners armed with pistols and clubs was prompted by extreme hunger.
In the 1820s and 1830s, as artisans, journeymen, sailors, longshoremen, and other workers organized the early trade union movement as well as workingmen’s political parties, one principal demand was for the abolition of imprisonment for debt. Inheritors of a radical political culture, their complaints echoed that Biblical tradition of Jubilee mentioned in Leviticus, which called for a cancellation of debts, the restoration of lost houses and land, and the freeing of slaves and bond servants every 50 years.
Falling into debt was a particularly ruinous affliction for those who aspired to modest independence as shopkeepers, handicraftsmen, or farmers. As markets for their goods expanded but became ever less predictable, they found themselves taking out credit to survive and sometimes going into arrears, often followed by a stint in debtor’s prison that ended their dreams forever.
However much the poor organized and protested, it was the rich who got debt relief first. Today, we assume that debts can be discharged through bankruptcy (although even now that option is either severely restricted or denied to certain classes of less favored debt delinquents like college students). Although the newly adopted U.S. Constitution opened the door to a national bankruptcy law, Congress didn’t walk through it until 1800, even though many, including the well-off, had been lobbying for it.
Enough of the old moral faith that frowned on debt as sinful lingered. The United States has always been an uncharitable place when it comes to debt, a curious attitude for a society largely settled by absconding debtors and indentured servants (a form of time-bound debt peonage). Indeed, the state of Georgia was founded as a debtor’s haven at a time when England’s jails were overflowing with debtors.
When Congress finally passed the Bankruptcy Act, those in the privileged quarters at New Gaol threw a party. Down below, however, life continued in its squalid way, since the new law only applied to people who had sizable debts. If you owed too little, you stayed in jail.
Debt and the Birth of a Nation
Nowadays, the conservative media inundate us with warnings about debt from the Founding Fathers, and it’s true that some of them like Jefferson — himself an inveterate, often near-bankrupt debtor — did moralize on the subject. However, Alexander Hamilton, an idol of the conservative movement, was the architect of the country’s first national debt, insisting that “if it is not excessive, [it] will be to us a national blessing.”
As the first Secretary of the Treasury, Hamilton aimed to transform the former 13 colonies, which today we would call an underdeveloped land, into a country that someday would rival Great Britain. This, he knew, required liquid capital (resources not tied up in land or other less mobile forms of wealth), which could then be invested in sometimes highly speculative and risky enterprises. Floating a national debt, he felt sure, would attract capital from well-positioned merchants at home and abroad, especially in England.
However, for most ordinary people living under the new government, debt aroused anger. To begin with, there were all those veterans of the Revolutionary War and all the farmers who had supplied the revolutionary army with food and been paid in notoriously worthless “continentals” — the currency issued by the Continental Congress — or equally valueless state currencies.
As rumors of the formation of a new national government spread, speculators roamed the countryside buying up this paper money at a penny on the dollar, on the assumption that the debts they represented would be redeemed at face value. In fact, that is just what Hamilton’s national debt would do, making these “sunshine patriots” quite rich, while leaving the yeomanry impoverished.
Outrage echoed across the country even before Hamilton’s plan got adopted. Jefferson denounced the currency speculators as loathsome creatures and had this to say about debt in general: “The modern theory of the perpetuation of debt has drenched the earth with blood and crushed its inhabitants under burdens ever accumulating.” He and others denounced the speculators as squadrons of counter-revolutionary “moneycrats” who would use their power and wealth to undo the democratic accomplishments of the revolution.
In contrast, Hamilton saw them as a disinterested monied elite upon whom the country’s economic well-being depended, while dismissing the criticisms of the Jeffersonians as the ravings of Jacobin levelers. Soon enough, political warfare over the debt turned founding fathers into fratricidal brothers.
Hamilton’s plan worked — sometimes too well. Wealthy speculators in land like Robert Morris, or in the building of docks, wharves, and other projects tied to trade, or in the national debt itself — something William Duer and grandees like him specialized in — seized the moment. Often enough, however, they over-reached and found themselves, like the yeomen farmers and soldiers, in default to their creditors.
Duer’s attempts to corner the market in the bonds issued by the new federal government and in the stock of the country’s first National Bank represented one of the earliest instances of insider trading. They also proved a lurid example of how speculation could go disastrously wrong. When the scheme collapsed, it caused the country’s first Wall Street panic and a local depression that spread through New England, ruining “shopkeepers, widows, orphans, butchers… gardeners, market women, and even the noted Bawd Mrs. McCarty.”
A mob chased Duer through the streets of New York and might have hanged or disemboweled him had he not been rescued by the city sheriff, who sent him to the safety of debtor’s prison. John Pintard, part of the same scheme, fled to Newark, New Jersey, before being caught and jailed as well.
Sending the Duers and Pintards of the new republic off to debtors’ prison was not, however, quite what Hamilton had in mind. And leaving them rotting there was hardly going to foster the “enterprising spirit” that would, in the treasury secretary’s estimation, turn the country into the Great Britain of the next century. Bankruptcy, on the other hand, ensured that the overextended could start again and keep the machinery of commercial transactions lubricated. Hence, the Bankruptcy Act of 1800.
If, however, you were not a major player, debt functioned differently. Shouldered by the hoi polloi, it functioned as a mechanism for funneling wealth into the mercantile-financial hothouses where American capitalism was being incubated.
No wonder debt excited such violent political emotions. Even before the Constitution was adopted, farmers in western Massachusetts, indebted to Boston bankers and merchants and in danger of losing their ancestral homes in the economic hard times of the 1780s, rose in armed rebellion. In those years, the number of lawsuits for unpaid debt doubled and tripled, farms were seized, and their owners sent off to jail. Incensed, farmers led by a former revolutionary soldier, Daniel Shays, closed local courts by force and liberated debtors from prisons. Similar but smaller uprisings erupted in Maine, Connecticut, New York, and Pennsylvania, while in New Hampshire and Vermont irate farmers surrounded government offices.
Shays’ Rebellion of 1786 alarmed the country’s elites. They depicted the unruly yeomen as “brutes” and their houses as “sties.” They were frightened as well by state governments like Rhode Island’s that were more open to popular influence, declared debt moratoria, and issued paper currencies to help farmers and others pay off their debts. These developments signaled the need for a stronger central government fully capable of suppressing future debtor insurgencies.
Federal authority established at the Constitutional Convention allowed for that, but the unrest continued. Shays’ Rebellion was but part one of a trilogy of uprisings that continued into the 1790s. The Whiskey Rebellion of 1794 was the most serious. An excise tax (“whiskey tax”) meant to generate revenue to back up the national debt threatened the livelihoods of farmers in western Pennsylvania who used whiskey as a “currency” in a barter economy. President Washington sent in troops, many of them Revolutionary War veterans, with Hamilton at their head to put down the rebels.
Debt Servitude and Primitive Accumulation
Debt would continue to play a vital role in national and local political affairs throughout the nineteenth century, functioning as a form of capital accumulation in the financial sector, and often sinking pre-capitalist forms of life in the process.
Before and during the time that capitalists were fully assuming the prerogatives of running the production process in field and factory, finance was building up its own resources from the outside. Meanwhile, the mechanisms of public and private debt made the lives of farmers, craftsmen, shopkeepers, and others increasingly insupportable.
This parasitic economic metabolism helped account for the riotous nature of Gilded Age politics. Much of the high drama of late nineteenth-century political life circled around “greenbacks,” “free silver,” and “the gold standard.” These issues may strike us as arcane today, but they were incendiary then, threatening what some called a “second Civil War.” In one way or another, they were centrally about debt, especially a system of indebtedness that was driving the independent farmer to extinction.
All the highways of global capitalism found their way into the trackless vastness of rural America. Farmers there were not in dire straits because of their backwoods isolation. On the contrary, it was because they turned out to be living at Ground Zero, where the explosive energies of financial and commercial modernity detonated. A toxic combination of railroads, grain-elevator operators, farm-machinery manufacturers, commodity-exchange speculators, local merchants, and above all the banking establishment had the farmer at their mercy. His helplessness was only aggravated when the nineteenth-century version of globalization left his crops in desperate competition with those from the steppes of Canada and Russia, as well as the outbacks of Australia and South America.
To survive this mercantile onslaught, farmers hooked themselves up to long lines of credit that stretched back to the financial centers of the East. These lifelines allowed them to buy the seed, fertilizer, and machines needed to farm, pay the storage and freight charges that went with selling their crops, and keep house and home together while the plants ripened and the hogs fattened. When market day finally arrived, the farmer found out just what all his backbreaking work was really worth. If the news was bad, then those credit lines were shut off and he found himself dispossessed.
The family farm and the network of small town life that went with it were being washed into the rivers of capital heading for metropolitan America. On the “sod house” frontier, poverty was a “badge of honor which decorated all.” In his Devil’s Dictionary, the acid-tongued humorist Ambrose Bierce defined the dilemma this way: “Debt. n. An ingenious substitute for the chain and whip of the slave-driver.”
Across the Great Plains and the cotton South, discontented farmers spread the blame for their predicament far and wide. Anger, however, tended to pool around the strangulating system of currency and credit run out of the banking centers of the northeast. Beginning in the 1870s with the emergence of the Greenback Party and Greenback-Labor Party and culminating in the 1890s with the People’s or Populist Party, independent farmers, tenant farmers, sharecroppers, small businessmen, and skilled workers directed ever more intense hostility at “the money power.”
That “power” might appear locally in the homeliest of disguises. At coal mines and other industrial sites, among “coolies” working to build the railroads or imported immigrant gang laborers and convicts leased to private concerns, workers were typically compelled to buy what they needed in company scrip at company stores at prices that left them perpetually in debt. Proletarians were so precariously positioned that going into debt — whether to pawnshops or employers, landlords or loan sharks — was unavoidable. Often they were paid in kind: wood chips, thread, hemp, scraps of canvas, cordage; nothing, that is, that was of any use in paying off accumulated debts. In effect, they were, as they called themselves, “debt slaves.”
In the South, hard-pressed growers found themselves embroiled in a crop-lien system, dependent on the local “furnishing agent” to supply everything needed, from seed to clothing to machinery, to get through the growing season. In such situations, no money changed hands, just a note scribbled in the merchant’s ledger, with payment due at “settling up” time. This granted the lender a lien, or title, to the crop, a lien that never went away.
In this fashion, the South became “a great pawn shop,” with farmers perpetually in debt at interest rates exceeding 100% per year. In Alabama, Georgia, and Mississippi, 90% of farmers lived on credit. The first lien you signed was essentially a life sentence. Either that or you became a tenant farmer, or you simply left your land, something so commonplace that everyone knew what the letters “G.T.T.” on an abandoned farmhouse meant: “Gone to Texas.” (One hundred thousand people a year were doing that in the 1870s.)
The merchant’s exaction was so steep that African-Americans and immigrants in particular were regularly reduced to peonage — forced, that is, to work to pay off their debt, an illegal but not uncommon practice. And that neighborhood furnishing agent was often tied to the banks up north for his own lines of credit. In this way, the sucking sound of money leaving for the great metropolises reverberated from region to region.
Facing dispossession, farmers formed alliances to set up cooperatives to extend credit to one another and market crops themselves. As one Populist editorialist remarked, this was the way “mortgage-burdened farmers can assert their freedom from the tyranny of organized capital.” But when they found that these groupings couldn’t survive the competitive pressure of the banking establishment, politics beckoned.
From one presidential election to the next and in state contests throughout the South and West, irate grain and cotton growers demanded that the government expand the paper currency supply, those “greenbacks,” also known as “the people’s money,” or that it monetize silver, again to enlarge the money supply, or that it set up public institutions to finance farmers during the growing season. With a passion hard for us to imagine, they railed against the “gold standard” which, in Democratic Party presidential candidate William Jennings Bryan’s famous cry, should no longer be allowed to “crucify mankind on a cross of gold.”
Should that cross of gold stay fixed in place, one Alabama physician prophesied, it would “reduce the American yeomanry to menials and paupers, to be driven by monopolies like cattle and swine.” As Election Day approached, populist editors and speakers warned of an approaching war with “the money power,” and they meant it. “The fight will come and let it come!”
The idea was to force the government to deliberately inflate the currency and so raise farm prices. And the reason for doing that? To get out from under the sea of debt in which they were submerged. It was a cry from the heart and it echoed and re-echoed across the heartland, coming nearer to upsetting the established order than any American political upheaval before or since.
The passion of those populist farmers and laborers was matched by that of their enemies, men at the top of the economy and government for whom debt had long been a road to riches rather than destitution. They dismissed their foes as “cranks” and “calamity howlers.” And in the election of 1896, they won. Bryan went down to defeat, gold continued its pitiless process of crucifixion, and a whole human ecology was set on a path to extinction.
The Return of Debt Servitude
When populism died, debt — as a spark for national political confrontation — died, too. The great reform eras that followed — Progressivism, the New Deal, and the Great Society — were preoccupied with inequality, economic collapse, exploitation in the workplace, and the outsized nature of corporate power in a consolidated industrial capitalist system.
Rumblings about debt servitude could certainly still be heard. Foreclosed farmers during the Great Depression mobilized, held “penny auctions” to restore farms to families, hanged judges in effigy, and forced Prudential Insurance Company, the largest land creditor in Iowa, to suspend foreclosures on 37,000 farms (which persuaded Metropolitan Life Insurance Company to do likewise). A Kansas City realtor was shot in the act of foreclosing on a family farm, a county sheriff was kidnapped while trying to evict a farm widow and dumped 10 miles out of town, and so on.
Urban renters and homeowners facing eviction formed neighborhood groups to stop the local sheriff or police from throwing families out of their houses or apartments. Furniture tossed into the street in eviction proceedings would be restored by neighbors, who would also turn the gas and electricity back on. New Deal farm and housing finance legislation bailed out banks and homeowners alike. Right-wing populists like the Catholic priest Father Charles Coughlin carried on the war against the gold standard in tirades tinged with anti-Semitism. Signs like one in Nebraska — “The Jew System of Banking” (illustrated with a giant rattlesnake) — showed up too often.
But the age of primitive accumulation in which debt and the financial sector had played such a strategic role was drawing to a close.
Today, we have entered a new phase, what might be called capitalist underdevelopment, and once again debt has emerged as both the central mode of capital accumulation and a principal mechanism of servitude. Warren Buffett (of all people) has predicted that, in the coming decades, the United States is more likely to turn into a “sharecropper society” than an “ownership society.”
In our time, the financial sector has enriched itself by devouring the productive wherewithal of industrial America through debt, starving the public sector of resources, and saddling ordinary working people with every conceivable form of consumer debt.
Household debt, which in 1952 was at 36% of total personal income, had by 2006 hit 127%. Even financing poverty became a lucrative enterprise. Taking advantage of the low credit ratings of poor people and their need for cash to pay monthly bills or simply feed themselves, some check-cashing outlets, payday lenders, tax preparers, and others levy interest of 200% to 300% and more. As recently as the 1970s, a good part of this would have been considered illegal under usury laws that no longer exist. And these poverty creditors are often tied to the largest financiers, including Citibank, Bank of America, and American Express.
Credit has come to function as a “plastic safety net” in a world of job insecurity, declining state support, and slow-motion economic growth, especially among the elderly, young adults, and low-income families. More than half the pre-tax income of these three groups goes to servicing debt. Nowadays, however, the “company store” is headquartered on Wall Street.
Debt is driving this system of auto-cannibalism which, by every measure of social wellbeing, is relentlessly turning a developed country into an underdeveloped one.
Dr. Jekyll and Mr. Hyde are back. Is a political resistance to debt servitude once again imaginable?
Steve Fraser is a historian, writer, and editor-at-large for New Labor Forum, co-founder of the American Empire Project, and TomDispatch regular. He is, most recently, the author of Wall Street: America’s Dream Palace. He teaches at Columbia University. This essay will appear in the next issue of Jacobin magazine.
Copyright 2013 Steve Fraser
The Politics of Debt in America
In 1729, when Ireland had fallen into a state of utter destitution at the hands of its British landlords, Jonathan Swift published a famous essay, “A Modest Proposal for Preventing the Children of Poor People in Ireland from Being A Burden to Their Parents or Country, and for Making Them Beneficial to the Public.”
His idea was simple: the starving Irish should sell their own children to the rich as food.
His inspiration, as it happened, came from across the Atlantic. As he explained, “I have been assured by a very knowing American of my acquaintance in London, that a young, healthy child well nourished is at a year old a most delicious, nourishing, and wholesome food, whether stewed, roasted, baked, or boiled; and I make no doubt that it will equally serve in a fricassee, or a ragoust.”
Inspired in turn by Swift, I want to suggest that we put in motion a similar undertaking: on January 16th, Martin Luther King Day, citizens from around the country should gather at the New York Stock Exchange on Wall Street. Let’s call this macabre gathering — with luck and even worse times, it should be mammoth — “We Surrender” or “Restore Debtor’s Prisons” or “De-Fault Is Ours” or “Collateralize Us.” And plan on a mirthful day of mourning.
The basic idea is that we offer ourselves up, 99% of us anyway, on the altar of high finance as a sacrifice to the bond markets. It was Karl Marx who first observed that high finance is “the Vatican of capitalism.” How right he turned out to be — right with a vengeance!
The Death of Democracy
Whole governments, democratically elected, are collapsing, or abdicating on orders from our secular version of the papacy. Who will weep for the passing of Italian Prime Minister Silvio Berlusconi? Not many, surely. Still, it’s appalling that, in Italy as in Greece, governing authority has been usurped by technocrats, elected by no one, answerable only to the European institutions of high finance that installed them in power.
At last count, eight governments of the European Union have come and gone, suffering the wrath of our new god. Other European governments barely hang on and scurry to curry favor with the bond market, proposing in effect to eat their own children and the futures of 99% of their people, if that’s what it takes to make high finance happy. More will follow. By the time this piece appears, who would be surprised if yet another government had bitten the dust?
What about here in the U.S.A.? That capitalism and democracy go together (like love and marriage in that old song) has been the imperial boast conveyed to the rest of the world by American banks and diplomats and presidents and Marines for a century — and more recently, by crony capitalist outfits like KBR and predator drones. Today, at home and abroad, that particular gospel seems a sorry piece of hypocrisy. Capitalism has become a synonym for — to use an old word on its way back just in time — plutocracy, not democracy. The Obama administration, like the Bush administration before it, and the one before that, and the one before that, has bent its knee to “the Vatican of capitalism.”
Take Our Children, Please!
Anticipating Swift, we are already eating our own children or, at least, the futures available to them. My suggestion is to make the most of that reality.
When we assemble on January 16th, we should arrive as supplicants, bringing the deeds to our homes, if we still have them. We could come dressed as credit-default swaps or collateralized debt obligations. (Use your imagination!)
You’ll want to turn in your subprime mortgage documents. And do you really need that mobile home or tent? And certainly, you’ll want to offer up your children to Wall Street if they’re young enough to make a “delicious” and nourishing meal. If a bit older, haul along that creaky swing-set from your backyard, or dilapidated blackboards and outmoded computer consoles from your child’s underfunded, disintegrating school. Bring with you the paints, recorders, and stage props once used by art, music, and theater teachers, but made superfluous when their programs were cut by schools too poor to afford them.
If your children are older still, and waterlogged from the college loans that put them “underwater” before they even had their first jobs, why not donate those debts as securitized gifts to the Street? Better yet: give back their college diplomas.
If you can, cart along vats of heating oil or coal bins to symbolize the winter fuel that you can no longer afford. (Thank god for global warming!) Declare yourself undocumented or at the very least “undeserving” (a prematurely retired and wonderfully apt word used by the 1% back in the late nineteenth century to describe those who apparently preferred starving to working). Turn in your food stamps and unemployment insurance checks.
If you happen to have a job, return it or tithe a portion of your wages or pension for the cause.
Give back your votes; they do you no good, but might placate The Street. If you’re not too shy, donate your medical records, x-rays, CAT scans and IV drips; you won’t need them anymore since the odds are you won’t be able to afford health care, and Wall Street can use them. After all, who is more endlessly ingenious when it comes to turning misery into money?
Here’s a really big January 16th gesture if you’re up for it: securitize your body parts. What, for example, is a leg- or ear- or brainpan-derivative really worth on the open market? You don’t know, but Wall Street will. And you can think of it as your contribution to solving the deficit dilemma, which keeps the 1% awake at night.
My poor imagination is hardly up to the task of imagining all the ways in which we might express our fealty to Wall Street’s financiers. But we, the partisans of OWS, are if nothing else a remarkably creative bunch. I’m confident that, when we get together on the 16th of January, the world will marvel at our inventiveness.
An Archipelago of Isolation Chambers
However “Swiftian” our mood, signage, and costumes, however much we retain the vital capacity to laugh at our own predicament and make fun of our tormentors, what I’m proposing is, in the end, serious business. A massive “Collateralize Us” day is doable — and through its wit could embolden us and shame those in charge of the care and feeding of the 1%. More important, it could put in the most graphic terms, where everyone could see it, a core indictment of a system in ruins and perhaps even hint at what might replace it.
Why pick a single day and a single place to symbolically immolate our own children (and their children to come)? Why not continue to occupy as many places as we can on all days? We should!
However, the simple epiphany that OWS allowed millions to experience was its blunt discovery that Wall Street, the world of financial mis-engineers and predatory speculators, was the taproot of our multiple dilemmas. For people around the globe, that street remains, at least symbolically, the site where our misbegotten Age of Austerity was born. So it makes continuing sense to persevere in pressing that singular insight, in pursuing a determination to confront a dysfunctional system where it originates.
So, too, local governments around the country have consistently used their police forces to cage, disperse, or otherwise fragment local occupations and may even have coordinated their police “occupations” with one another. “Our streets” are ever less “ours” in any meaningful sense. The geography of democracy is being transformed into an archipelago of isolation chambers.
But that won’t be the case if untold numbers assemble in New York on the 16th. If every movement and organization that has had anything to do with OWS over these last months were to collaborate in mobilizing, even on the bitterest of January days, the streets would again be “ours.”
Martin Luther King and Jubilee Day
Then, of course, there is the resonant significance of the day itself. Martin Luther King was a lawbreaker for justice. So, too, were all those who defied “legitimate authority” alongside him. I’m not suggesting we break the law. I do suggest we exercise rights that are growing weak, and will grow weaker, if allowed to atrophy further. And I do suggest as well that we, like King, become the midwives of new law.
If credit-default swaps and structured investment vehicles are legal, as they are, and if marching in the streets is becoming ever less so, as it is, then on January 16th we should begin to turn that kind of preposterous world upside down. What was lawful shall become criminal and what was denied to the people shall be taken by them and made good law.
When we gather on the 16th of January at the corner of Broad and Wall streets — don’t worry, you’ll find it! — in an act of unprecedented symbolic self-sacrifice, we might also make one modest request. With Martin Luther King in mind, let us propose that January 16th also become Jubilee Day.
Such days were a more or less regular part of the calendar in biblical times and long after. It was the moment when common people were relieved of their crushing debts and the world was allowed to start over again. Our own version of such a “day of forgiveness” would focus on all the debts with which the 1% have burdened so many working people.
On that day, we might resume a conversation about how to start the world anew. It would undoubtedly be a conversation about all the vital resources that everyone depends on to enjoy life, be healthy, and have a future worthy of bequeathing to our children. It would certainly be about how these must never again be allowed to congeal in the hands of an infinitesimal elite organized in a tiny number of private institutions indifferent to the commonweal and immune from censure.
See you on the 16th. Bring your children.
Steve Fraser is Editor-at-Large of New Labor Forum, a TomDispatch regular, and co-founder of the American Empire Project (Metropolitan Books). He is a labor and economic historian whose most recent book is Wall Street: America’s Dream Palace.
Copyright 2011 Steve Fraser
Take Our Children, Please!
Occupy Wall Street, the ongoing demonstration-cum-sleep-in that began a month ago not far from the New York Stock Exchange and has since spread like wildfire to cities around the country, may be a game-changer. If so, it couldn’t be more appropriate or more in the American grain that, when the game changed, Wall Street was directly in the sights of the protesters.
The fact is that the end of the world as we’ve known it has been taking place all around us for some time. Until recently, however, thickets of political verbiage about cutting this and taxing that, about the glories of “job creators” and the need to preserve “the American dream,” have obscured what was hiding in plain sight — that street of streets, known to generations of our ancestors as “the street of torments.”
After an absence of well over half a century, Wall Street is back, center stage, as the preferred American icon of revulsion, a status it held for a fair share of our history. And we can thank a small bunch of campers in Manhattan’s Zuccotti Park for hooking us up to a venerable tradition of resistance and rebellion.
The Street of Torments
Peering back at a largely forgotten terrain of struggle against “the Street,” so full of sound and fury signifying quite a lot, it’s astonishing — to a historian of Wall Street, at least — that the present movement didn’t happen sooner. It’s already hard to remember that only weeks ago, three years into the near shutdown of the world financial system and the Great Recession, an eerie unprotesting silence still blanketed the country.
Stories accumulated of Wall Street greed and arrogance, astonishing tales of incompetence and larceny. The economy slowed and stalled. People lost their homes and jobs. Poverty reached record levels. The political system proved as bankrupt as the big banks. Bipartisan consensus emerged — but only around the effort to save “too big to fail” financial goliaths, not the legions of victims their financial wilding had left in its wake.
The political class then prescribed what people already had plenty of: yet another dose of austerity plus a faith-based belief in a “recovery” that, for 99% of Americans, was never much more than an optical illusion. In those years, the hopes of ordinary people for a chance at a decent future withered and bitterness set in.
Strangely, however, popular resistance was hard to find. In the light of American history, this passivity was surpassingly odd. From decades before the Gilded Age of the late nineteenth century through the Great Depression, again and again Wall Street found itself in the crosshairs of an outraged citizenry mobilized thanks to political parties, labor unions, or leagues of the unemployed. Such movements were filled with a polyglot mix of middle-class anti-trust reformers, bankrupted small businessmen, dispossessed farmers, tenants and sharecroppers, out-of-work laborers, and so many others.
If Occupy Wall Street signals the end of our own, atypical period of acquiescence, could a return to a version of “class warfare” that would, once upon a time, have been familiar to so many Americans be on the horizon? Finally!
What began as a relatively sparsely attended and impromptu affair has displayed a staying power and magnetic attractiveness that has taken the country, and above all the political class, by surprise. A recent rally of thousands in lower Manhattan, where demonstrators marched from the city’s government center to Zuccotti Park, the location of the “occupiers” encampment, was an extraordinarily diverse gathering by any measure of age, race, or class. Community organizations, housing advocates, environmentalists, and even official delegations of trade unionists not normally at ease hanging out with anarchists and hippies gave the whole affair a social muscularity and reach that was exhilarating to experience.
Diversity, however, can cut both ways. Popular protest, to the degree that there’s been much during the recent past — and mainly over the war in Iraq — has sometimes been criticized for the chaotic way it assembled a grab-bag of issues and enemies, diffuse and without focus. Occupy Wall Street embraces diverse multitudes but this time in the interest of convergence. In its targeting of “the street of torments,” this protean uprising has, in fact, found common ground. To a historian’s ear this echoes loudly.
Karl Marx described high finance as “the Vatican of capitalism,” its diktat to be obeyed without question. We’ve spent a long generation learning not to mention Marx in polite company, and not to use suspect and nasty phrases like “class warfare” or “the reserve army of labor,” among many others.
In times past, however, such phrases and the ideas that went with them struck our forebears as useful, even sometimes as true depictions of reality. They used them regularly, along with words like “plutocracy,” “robber baron,” and “ruling class,” to identify the sources of economic exploitation and inequality that oppressed them, as well as to describe the political disenfranchisement they suffered and the subversion of democracy they experienced.
Never before, however, has “the Vatican of capitalism” captured quite so perfectly the specific nature of the oligarchy that’s run the country for a generation and has now run it into the ground. Even political consultant and pundit James Carville, no Marxist he, confessed as much during the Clinton years when he said the bond market “intimidates everybody.”
Perhaps that era of everyday intimidation is finally ending. Here are some of the signs of it — literally — from that march I attended: “Loan Sharks Ate My World” (illustrated with a reasonable facsimile of the Great White from Jaws), “End the Federal Reserve,” “Wall Street Sold Out, Let’s Not Bail-Out,” “Kill the Over the Counter Derivative Market,” “Wall Street Banks Madoff Well,” “The Middle Class is Too Big To Fail,” “Eat the Rich, Feed the Poor,” “Greed is Killing the Earth.” During the march, a pervasive chant — “We are the 99%” — resoundingly reminded the bond market just how isolated and vulnerable it might become.
And it is in confronting this elemental, determining feature of our society’s predicament, in gathering together all the multifarious manifestations of our general dilemma right there on “the street of torments,” that Occupy Wall Street — even without a program or clear set of demands, as so many observers lament — has achieved a giant leap backward, summoning up a history of opposition we would do well to recall today.
A Century of Our Streets and Wall Street
One young woman at the demonstration held up a corrugated cardboard sign roughly magic-markered with one word written three times: “system,” “system,” “system.” That single word resonates historically, even if it sounds strange to our ears today. The indictment of presumptive elites, especially those housed on Wall Street, the conviction that the system over which they presided must be replaced by something more humane, was a robust feature of our country’s political and cultural life for a long century or more.
When in the years following the American Revolution, Jeffersonian democrats raised alarms about the “moneycrats” and their counterrevolutionary intrigues — they meant Alexander Hamilton and his confederates in particular — they were worried about the installation in the New World of a British system of merchant capitalism that would undo the democratic and egalitarian promise of the Revolution.
When followers of Andrew Jackson inveighed against the Second Bank of the United States — otherwise known as “the Monster Bank” — they were up in arms against what they feared was the systematic monopolizing of financial resources by a politically privileged elite. Just after the Civil War, the Farmer-Labor and Greenback political parties freed themselves of the two-party runaround, determined to mobilize independently to break the stranglehold on credit exercised by the big banks back East.
Later in the nineteenth century, Populists decried the overweening power of the Wall Street “devil fish” (shades of Matt Taibbi’s “giant vampire squid” metaphor for Goldman Sachs). Its tentacles, they insisted, not only reached into every part of the economy, but also corrupted churches, the press, and institutions of higher learning, destroyed the family, and suborned public officials from the president on down. When, during his campaign for the presidency in 1896, the Populist-inspired “boy orator of the Platte” and Democratic Party candidate William Jennings Bryan vowed that mankind would not be “crucified on a cross of gold,” he meant Wall Street and everyone knew it.
Around the turn of the century, the anti-trust movement captured the imagination of small businessmen, consumers, and working people in towns and cities across America. The trust they worried most about was “the Money Trust.” Captained by J.P. Morgan, “the financial Gorgon,” the Money Trust was skewered in court and in print by future Supreme Court justice Louis Brandeis, subjected to withering Congressional investigations, excoriated in the exposés of “muckraking” journalists, and depicted by cartoonists as a cabal of prehensile Visigoths in death-heads.
As the twentieth century began, progressive reformers in state houses and city halls, socialists in industrial cities and out on the prairies, strikebound workers from coast to coast, working-class feminists, antiwar activists, and numerous others were still vigorously condemning that same Money Trust for turning the whole country into a closely-held system of financial pillage, labor exploitation, and imperial adventuring abroad. As the movements made clear, everyone but Wall Street was suffering the consequences of a system of proliferating abuses perpetrated by “the Street.”
The tradition the Occupy Wall Street demonstrators have tapped into is a long and vibrant one that culminated during the Great Depression. Then as now, there was no question in the minds of “the 99%” that Wall Street was principally to blame for the country’s crisis (however much that verdict has since been challenged by disputatious academics).
Insurgencies by industrial workers, powerful third-party threats to replace capitalism with something else, rallies and marches of the unemployed, and, yes, occupations, even seizures of private property, foreclosures forestalled by infuriated neighbors, and a pervasive sense that the old order needed burying had their lasting effect. In response, the New Deal attempted to unhorse those President Franklin Roosevelt termed “economic royalists,” who were growing rich off “other people’s money” while the country suffered its worst trauma since the Civil War. “The Street” trembled.
“System, System, System”
It would be foolish to make too much of a raggedy sign — or to leap to conclusions about just how lasting this Occupy Wall Street moment will be and just where (if anywhere) it’s heading. It would be crazily optimistic to proclaim our own pitiful age of acquiescence ended.
Still, it would be equally foolish to dismiss the powerful American tradition the demonstrators of this moment have tapped into. In the past, Wall Street has functioned as an icon of revulsion, inciting anger, stoking up energies, and summoning visions of a new world that might save the New World.
It is poised to play that role again. Remember this: in 1932, three years into the Great Depression, most Americans were more demoralized than mobilized. A few years later, all that had changed as “Our Street, Not Wall Street” came alive. The political class had to scurry to keep up. Occupy Wall Street may indeed prove the opening act in an unfolding drama of renewed resistance and rebellion against “the system.”
Steve Fraser is Editor-at-Large of New Labor Forum, a TomDispatch regular, and co-founder of the American Empire Project (Metropolitan Books). A historian of Wall Street, his most recent book on the subject is Wall Street: America’s Dream Palace. He teaches history at Columbia University.
Copyright 2011 Steve Fraser
The All-American Occupation
Not long ago, the city council of Ventura, California, passed an ordinance making it legal for the unemployed and homeless to sleep in their cars. At the height of the Great Recession of 2008, one third of the capital equipment of the American economy lay idle. Of the women and men idled along with that equipment, only 37% got a government unemployment check and that check, on average, represented only 35% of their weekly wages.
Meanwhile, there are now two million “99ers” — those who have maxed out their supplemental unemployment benefits because they have been out of work for more than 99 weeks. Think of them as a full division in “the reserve army of labor.” That “army,” in turn, accounts for 17% of the American labor force, if one includes part-time workers who need and want full-time work and the millions of unemployed Americans who have grown so discouraged that they’ve given up looking for jobs and so aren’t counted in the official unemployment figures. As is its historic duty, that force of idle workers is once again driving down wages, lengthening working hours, eroding on-the-job conditions, and adding an element of raw fear to the lives of anyone still lucky enough to have a job.
No one volunteers to serve in this army. But anyone, from Silicon Valley engineers to Florida tomato pickers, is eligible to join what, in our time, might be thought of as the all-involuntary force. Its mission is to make the world safe for capitalism. Today, with the world spiraling into a second “Great Recession” (even if few, besides the banks, ever noticed that the first one had ended), its ranks are bound to grow.
The All-Involuntary Army (of Labor)
As has always been true, the coexistence of idling workplaces and cast-off workers remains the single most severe indictment of capitalism as a system for the reproduction of human society. The arrival of a new social category — “the 99ers” — punctuates that grim observation today.
After all, what made the Great Depression “great” was not only the staggering level of unemployment (no less true in various earlier periods of economic collapse), but its duration. Years went by, numbingly, totally demoralizingly, without work or hope. When it all refused to end, people began to question the fundamentals, to wonder if, as a system, capitalism hadn’t outlived its usefulness.
Nowadays, the 99ers notwithstanding, we don’t readily jump to such a conclusion. Along with the “business cycle,” including stock market bubbles and busts and other economic perturbations, unemployment has been normalized. No one thinks it’s a good thing, of course, but it’s certainly not something that should cause us to question the way the economy is organized.
Long gone are the times when unemployment was so shocking and traumatic that it took people back to the basics. We don’t, for instance, even use that phrase “the reserve army of labor” anymore. It strikes many, along with “class struggle” and “working class,” as embarrassing. It’s too “Marxist” or anachronistic in an age of post-industrial flexible capitalism, when we’ve grown accustomed to the casualness and transience of work, or even anointed it as a form of “free agency.”
However, long before leftists began referring to the unemployed as a reserve army, that redolent metaphor was regularly wielded by anxious or angry nineteenth century journalists, government officials, town fathers, governors, churchmen, and other concerned citizens. Something new was happening, they were sure, even if they weren’t entirely clear on what to make of it.
Unemployment as a recurring feature of the social landscape only caught American attention with the rise of capitalism in the pre-Civil War era. Before that, even if the rhythms of agricultural and village life included seasonal oscillations between periods of intense labor and downtime, farmers and handicraftsmen generally retained the ability to sustain their families.
Hard times were common enough, but except in extremis most people retained land and tools, not to speak of common rights to woodlands, grazing areas, and the ability to hunt and fish. They were — we would say today — “self-employed.” Only when such means of subsistence and production became concentrated in the hands of merchant-capitalists, manufacturers, and large landowners did the situation change fundamentally. A proletariat — those without property of any kind except their own labor power — made its appearance, dependent on the propertied to employ them. If, for whatever reason, the market for their labor power dried up, they were set adrift.
This process of dispossession lasted more than a century. In the early decades of the nineteenth century, its impact remained limited. The farmers, handicraftsmen, fishermen, and various tradespeople swept into the new textile or shoe factories, or the farm women set to work out in the countryside spinning and weaving for merchant capitalists still held onto some semblance of their old ways of life. They maintained vegetable gardens, continued to hunt and fish, and perhaps kept a few domestic animals.
When the first commercial panics erupted in the 1830s and 1850s and business came to a standstill, many could fall back on pre-capitalist ways of making a living, even if a bare one. Still, the first regiments of the reserve army of the unemployed had made their appearance. Jobless men were already roaming the roads, an alarming new sight for townspeople not used to such things.
Demobilizing the Workforce Becomes the New Norm
When industrial capitalism exploded after the Civil War, unemployment suddenly became a chronic and frightening aspect of modern life affecting millions. Panics and depressions now occurred with distressing frequency. Their randomness, severity, and duration (some lasted half a decade or more) only swelled the ranks of the reserve army. Crushing helplessness in the face of unemployment would be a devastating new experience for the great waves of immigrants just landing on American shores, many of them peasants from southern and eastern Europe accustomed to falling back on their own meager resources in fields and forests when times were bad.
The very presence of this “army” of able-bodied but destitute workers seemed to catch the essential savagery of the new economy and it stunned onlookers. The “tramp” became a ubiquitous figure, traveling the roads and rails, sometimes carrying his tools with him, desperate for work. He proved a threatening specter for villagers and city people alike.
Just as shocking was a growing realization — made undeniable by each dismal repetition of the business cycle — that the new industrial economy wasn’t just producing that reserve army, but depended on its regular mobilization and demobilization to carry on the process of capital accumulation. It was no passing phenomenon, no natural disaster that would run its course. It was the new normal.
Initial reactions were varied and dramatic. Local governments rushed to pass punitive laws against tramping and vagrancy, mandating terms of six months to two years of hard labor in workhouses. Meanwhile, the orthodox thinking of that moment raised steep barriers to government aid for those in need. During the devastating depression of the 1870s, for instance, President Ulysses Grant’s Secretary of the Treasury put things succinctly: “It is not part of the business of government to find employment for people.”
Punishment and studied indifference were, however, by no means the only responses as emergency relief efforts — some private, some public — became common. The ravaging effects of unemployment, the way it spread like a plague, and its chronic reappearance also put more radical measures on the agenda, proposals that questioned the viability and morality of what was then termed the “wages system.”
Calls went out to colonize vacant land and establish state-run factories and farms to productively re-employ the idled. Infuriated throngs occupied state houses demanding public works. Elements of the labor and populist movements advocated manufacturing and agricultural cooperatives as a way around the ruthlessness of the Darwinian free market. Business “trusts” or monopolies were often decried for driving other businesses under and so exacerbating the unemployment dilemma. In some cases, their nationalization was called for. Militants of the moment began to demand work not as a sop to the indigent, but as a right of citizenship, as precious and inviolable as anything in the Bill of Rights.
The greatest and most prolonged mass mobilization of the mid-1880s was the national movement for the eight-hour work day. It was animated partly by a desire for more leisure time, but also by a vain hope that its passage by Congress might effectively raise wages. (Industrialists, however, had no intention of paying the same amount for eight hours of work as they had for 12.) Its main impetus, though, was a belief that mandating a national reduction in the hours of work would spread jobs around and so diminish the ranks of the reserve army.
Some were convinced that capitalism’s appetite for human labor was too voracious for business ever to agree to such limits. So long as the business cycle was on its upward arc, the compulsion to exploit labor power was insatiable. When the market went south, all that surplus humanity could be left to fend for itself. Its partisans nonetheless believed that the movement for an eight-hour day would expose the barbarism of the economic system for all to see, opening the door to something more humane.
In other words, a wide spectrum of responses to unemployment was enfolded within a broad and growing anti-capitalist culture. Within the organized labor movement, that proto-union, the Knights of Labor, was immersed in the idea of an anti-capitalist insurgency. Most trade unions of the time, however, accepted that the “wages system” was here to stay and focused instead on the issues of job security, fighting for unemployment benefit funds for members, seniority, prohibitions against overtime, and the shortening of working hours.
Even agitation to ban child labor and limit female employment was motivated in part by a desire to temper the pervasiveness of unemployment by curtailing the pool of available labor. Other trade union procedures and proposals were more mean-spirited, including attempts to ban immigration or exclude African-Americans and other minorities or the unskilled from membership in the movement. That insularity bedevils trade unionism to this day.
As part of this tumultuous season of upheaval, which lasted from the 1870s through the Great Depression, the unemployed themselves organized demonstrations. A gathering in Tompkins Square Park of thousands of New Yorkers left destitute by the panic and depression of 1873 was dispersed with infamous brutality by the police. Local newspapers labeled the protestors “communards.” (The recently defeated Paris Commune had ignited a hysterical fear of “un-American” radicalism, a toxin that has never since left the American bloodstream.)
Although the Tompkins Square rally was mainly a plea for relief and public works, there was some talk of marching on Wall Street. Such radical rhetoric, not to speak of actual violence, was hardly unusual in such confrontations then, a measure of how raw class relations were and how profoundly disturbed people had become by the haunting presence of mass unemployment.
Just as telling, the unemployed and those still at work but at loggerheads with their bosses frequently displayed their solidarity in public. During the “Great Insurrection” of 1877, when railroad strikers from coast to coast faced off against state militias, federal troops, and the private armies of the railroad barons, they were joined by regiments of the “reserve army.” Often these were their neighbors and family members, but also strangers who, feeling an affinity for their beleaguered brethren, preferred setting fire to railroad engine houses than going to work in them as scabs. Amid the awful depression of the 1890s, a cigar maker caught the temper of the times simply: “I believe the working men themselves will have to take action. I believe those men that are employed will have to look out for the unemployed that work at the same business they do.”
Marching Armies (of the Unemployed)
Demonstrations of the unemployed resurfaced with each major economic downturn. In the depression winter of 1893-1894, for example, ragged “armies” of the desperate gathered in various parts of the country, 40 of them in all. (Eighteen-year-old future novelist Jack London joined one in California.) The largest commandeered a train in an effort to get to Washington, D.C., and was chased for 300 miles across Montana by federal troops.
The most famous of them was led by Jacob Coxey, a self-made Ohio businessman. “Coxey’s Army” (more formally known as “the Commonwealers” or the “Commonwealth of Christ Army”) made it all the way to the capital, a “living petition” to Congress, with his 17-year-old daughter riding a white horse at its head as “the Goddess of Peace.”
In the nation’s capital, the “Army” lodged its plea for relief, work, and an increase in the money supply. (Jacob’s son was called “Legal Tender Coxey.”) President Grover Cleveland wasn’t hearing any of it, having already made his views known in 1889 during his first term in office: “The lessons of paternalism ought to be unlearned and the better lesson taught that while the people should patriotically and cheerfully support their government, its functions do not include support of the people.”
Christian charity was not Cleveland’s long suit. Others of the faith, however, believers in the social gospel and Christian socialism especially, staged spectacular public dramas on behalf of the “shorn lambs of the unemployed” — even a mock “slave auction” in Boston in 1921 during a severe post-World War I slump, in which the jobless were offered to the highest bidders as evidence of what “wage slavery” really meant.
The Great Depression brought this protracted period of labor turmoil to a climax and to an end. In its early years, the ethos of “mutualism” and solidarity between the employed and unemployed was strengthened. In those years, railroads began to report startling jumps in the numbers of Americans engaged in “train hopping” — the rail equivalent of hitchhiking. On one line, the “hoppers” went from 14,000 in 1929 to 186,000 in 1931.
In 1930, when the unemployment rate was at about today’s level, in cities across the country the first rallies of the unemployed began with demands for work and relief. Later, there were food riots and raids on delivery trucks and packinghouses, as well as the occupations of shuttered coalmines and bankrupt utility companies by the desperate who began to work them.
“Leagues” and “councils” of the unemployed, sometimes organized by the Communist Party, sometimes by the Socialist Party, and sometimes by a group run by radical pacifist A.J. Muste, marshaled their forces to stop home evictions, support strikes, and make far-reaching proposals for a permanent system of public works and unemployment insurance. Muste’s groups, strong in the Midwest, set up bartering arrangements and labor exchanges among the jobless.
In support of striking workers, unemployed protestors shut down the Briggs plant in Highland Park, Michigan — it manufactured auto bodies for Ford — pledging that they would not scab on the striking workers. A march of former and current employees of the Ford facilities in Dearborn, Michigan, made the unusual demand that the company (not the government) provide work for the jobless. For their trouble, they were bloodied by Ford’s hired thugs and five of them were killed.
President Herbert Hoover took similar action. In a move that shocked much of the nation, he ordered Army Chief of Staff General Douglas MacArthur to use troops to disperse the Bonus Expeditionary Force, World War I jobless veterans gathered in tents on Anacostia Flats in Washington asking for accelerated payment of their wartime service bonuses. They were routed at bayonet point and MacArthur’s troops burned down their tent city.
How the New Deal Dealt
The Great Depression was, however, so profoundly unsettling that the unemployed finally became a political constituency of national proportions. The pressure on mainstream politicians to do something grew ever more intense. The Conference of Mayors that meets to this day was founded then to lobby Washington for federal relief for the jobless. Even segments of the business community had begun to complain about the “costs” of unemployment when it came to workplace efficiency.
Unemployment insurance, work relief, welfare, and public works — all of which had surfaced in public debate since the turn of the twentieth century — made up the basic package of responses offered by President Franklin Roosevelt’s New Deal to the inherent insecurity of proletarian life. None were exactly expansive either in what they provided or in their execution, and yet all of them found themselves under chronic assault from birth (as they are today).
The most daring legislation under consideration, the Lundeen bill (authored by a Minnesota congressman), would have provided unemployment insurance equal to prevailing wages for anyone over 18 working part or full time, financed by a tax on incomes exceeding $5,000 and administered by elected worker representatives. Though it never became law, it was not atypical in its most basic assumption, one that would once have been thought intolerable: that unemployment at significant levels would continue into the indefinite future.
Unemployment was now to be ameliorated, but also accepted. Harry Hopkins, who ran the New Deal’s relief efforts, was typical in predicting that “a probable minimum of four to five million” Americans would remain out of work “even in future ‘prosperity’ periods.” Consequently, the new relief reforms were to be considered defense mechanisms designed to recharge the batteries of a stalled economy and to minimize the political fallout from outsized joblessness. This menu of “solutions” has constituted the core of the labor and progressive movement’s approach to unemployment ever since.
“The Natural Rate of Unemployment”
After World War II, unemployment became, for the most part, a numerical and policy issue rather than a social phenomenon. By the 1960s, what once struck most Americans as unnatural and ghastly had been fully transformed by economists and political elites into “the natural rate of unemployment” — a level of joblessness considered futile to tamper with, since any attempt to lower it would only induce inflation.
More recently, matters have turned truly perverse. Neo-liberals, who during the Reagan era of the 1980s eclipsed Keynesians as the dominant thinkers when it came to economic policy, worried that unemployment might not be high enough. It was increasingly feared that, if the ranks of the jobless were not sufficiently large, both labor costs and inflation would rise, threatening the future value of capital investments. The world, in other words, had turned upside down.
As official society adapted to the permanence of unemployment, the unemployed themselves subsided into political quiescence. There were exceptions, however.
Perhaps the most massive unemployment demonstration in the nation’s history took place in 1963 when 100,000 Americans marched on Washington for “Jobs and Freedom.” It is a telling commentary on the political sensibilities of the last half-century that the March on Washington, recalled mainly for Martin Luther King’s famed “I Have a Dream” speech, is rarely if ever remembered as an outpouring of righteous anger about a system that consigned much of a whole race to the outcast status first experienced by the young women of New England textile mills in antebellum America.
Today, the question is: As the new unemployment “norm” rises, will the “99ers” remain just a number, or will anger and systemic dysfunction lead to the rebirth of movements of the unemployed, perhaps allied, as in the past, with others suffering from the economy’s relentless downward arc? Keep in mind that the extent of organized protest by the unemployed in the past should not be exaggerated. Not even the Great Depression evoked their sustained mass mobilization. That’s hardly surprising. By its nature, unemployment demoralizes and isolates people. It makes of them a transient and chronically fluctuating population with no readily discernable common enemy and no obvious place to coalesce.
Another question might be: In the coming years, might we see the return of a basic American horror at the phenomenon of joblessness? And might it drive Americans to begin to ask deeper questions about the system that lives and feeds on it?
After all, we now exist in an under-developing economy. The new jobs it creates are poorly paid, low-skill, and often temporary, and there are not enough of them to significantly reduce the numbers of those out of work. The 99ers are stark evidence that we may be witnessing the birth of a new permanent class of the marginalized. (The percentage of the unemployed who have been out of work for more than six months has grown from 8.6% in 1979 to 19.6% today.) Moreover, our mode of “flexible capitalism” has made work itself increasingly transient and precarious.
Until now, ideologues of the new order have had remarkable success in dressing this up as a new form of freedom. But our ancestors, who experienced frequent and distressing interruptions in their work lives, who migrated thousands of miles to find jobs which they kept or lost at the whim of employers, and who, in solitary search for work, tramped the roads and hopped the freight cars (even if they could not yet roam Internet chat rooms), were not so delusional.
We have a choice: Americans can continue to accept large-scale unemployment as “natural” and permanent, even — a truly grotesque development — as a basic feature on a bipartisan road to “recovery” via austerity. Or we can follow the lead of the jobless young in the Arab Spring and of protestors beginning to demonstrate en masse in Europe. Even the newly minted proletarians of Ventura, California, sleeping in their cars, may decide that they have had enough of a political and economic order of things so bankrupt it can find no use for them at any price.
Steve Fraser is Editor-at-Large of New Labor Forum and co-founder of the American Empire Project (Metropolitan Books). He is, most recently, the author of Wall Street: America’s Dream Palace. He teaches history at Columbia University.
Joshua B. Freeman teaches history at Queens College and the Graduate Center of the City University of New York and is affiliated with its Joseph S. Murphy Labor Institute. His forthcoming book, American Empire, will be the final volume of the Penguin History of the United States.
This piece is an adaptation of an “In the Rearview Mirror” column that will be published in the Spring 2012 issue of the magazine New Labor Forum.
Copyright 2011 Steve Fraser and Josh Freeman
Uncle Sam Does(n’t) Want You
On a winter’s day in Boston in 1773, a rally of thousands at Faneuil Hall to protest a new British colonial tax levied on tea turned into an iconic moment in the pre-history of the American Revolution. Some of the demonstrators — Sons of Liberty, they called themselves — left the hall and boarded the Dartmouth, a ship carrying tea, and dumped it overboard.
One of the oddest features of the Boston Tea Party, from which our current crop of Tea Party populists draws its inspiration, is that a number of those long-ago guerrilla activists dressed up as Mohawk Indians, vented their anger by emitting Indian war cries, and carried tomahawks to slice open the chests of tea. This masquerade captured a fundamental ambivalence that has characterized populist risings ever since. After all, if in late eighteenth century America, the Indian already functioned as a symbol of an oppressed people and so proved suitable for use by others who felt themselves put upon, it was also the case that the ancestors of those Boston patriots had managed to exterminate a goodly portion of the region’s Native American population in pursuit of their own self-aggrandizement.
Today’s Tea Party movement, like so many of its “populist” predecessors, is a house of contradiction, a bewildering network of crosscutting political emotions, ideas, and institutions. What connects it powerfully to a populist past stretching all the way back to Boston Harbor is, however, a sense of violation: “Don’t Tread on Me.”
Despite a recurring resistance to the impositions of powerful outside forces — anti-elitism has been axiomatic for all such insurgencies — populist movements have differed greatly on just what those forces were and what needed to be done to free people from their yoke. It’s worth noting, for instance, that an earlier invocation of the Boston Tea Party took place at a 1973 rally on a replica of the Dartmouth — a rally called to promote the impeachment of President Richard Nixon.
From the Know-Nothings to the People’s Party
Over the course of American history, the populist instinct, now resurgent in the Tea Party movement, has oscillated between a desire to transform, and so create a new order of things, and a desire to restore a yearned-for (or imagined) old order.
Before the Civil War, one such movement that caught both these urges was colloquially dubbed the “Know-Nothings” (not for any anti-intellectualism, but because its members deliberately conducted much of their business in secret — hence, if questioned, were instructed to say, “I know nothing”). Know-Nothingism exuded the desire to move forward and backward at the same time. During the 1840s and 1850s, it swept across much of the country, North and South. There were “know-nothing” candies, “know-nothing” toothpicks, and “know-nothing” stagecoaches.
Soon enough, the movement evolved into a national political party, the American Party, that appealed to small farmers, small businessmen, and working people. Its attraction was two-fold. The party vociferously opposed Irish and German Catholic immigration to the U.S. (as well as that of Chinese and Chilean immigrants working in the gold fields of California). Yet, in the North, it also denounced slavery. As planks in a political program, nativism and anti-slavery might seem like an odd couple, but in the minds of the party’s followers they were joined at the hip. As Know-Nothings saw it, the Papacy and the South’s slave-owning planter elite were both conspiring to undermine a democratic society of masterless men.
Keep in mind that conspiratorial thinking has long been deeply embedded in American populist movements (as in the Tea Party today). In nineteenth-century Protestant America, alleged plots by Vatican hierarchs were a recurrent feature of political life. In the North, a wave of crime and the rise of “poor relief” and other forms of dependency — including wage labor, which accompanied the arrival of a flood of impoverished Catholic immigrants — seemed to threaten an American promise of a society of free, equal, and self-reliant individuals (supposedly so noxious to the priestly elite of the Catholic Church). In the slave South, where the master class was believed to be hard at work subverting the Constitution, conspiratorial machinations were self-evidently afoot. By the mid-1850s, most “Know-Nothings” in the North had found their way into the newborn Republican Party which combined hostility to slavery with a milder form of anti-Catholicism.
Populism with a capital “P,” the great economic and political insurgency of the last third of the nineteenth century that blanketed rural America from the cotton South to the grain-growing Great Plains and the Rocky Mountain West, would bear its own distinctive ambivalence. The People’s Party indicted corporate and finance capitalism for destroying the livelihoods and lives of independent farmers and handicraftsmen. It also attacked big business for subverting the foundations of democracy by capturing all three branches of government and transforming them into coercive instruments of rule by a new plutocracy. Populists sometimes attributed what they termed an American “counterrevolution” to the conspiratorial plots of the “great Devil Fish of Wall Street,” suspected of colluding with Great Britain’s elite to undo the American Revolution.
The remedies proposed, however, were hardly those of Luddites. These instead anticipated many of the fundamental reforms of the next century, including government subsidies for farmers, the graduated income tax, direct election of the Senate, the eight-hour day, and even the public ownership of railroads and public utilities. A tragic movement of the dispossessed, the Populists yearned to restore a society of independent producers, a world without a proletariat and without corporate trusts. Yet they also envisioned something new and transformative, a “cooperative commonwealth” that would escape the barbaric competitiveness and exploitation of free market capitalism.
The Great Plains of Resentment
For the next four decades, populism remained emphatically against corporate capitalism and held on tightly to its resentment of powerful outsiders as well as a penchant for conspiracy mongering. During the 1930s, however, the location of Conspiracy Central began to shift from Wall Street and the City of London to Moscow — and even New Deal Washington. Anti-communism added a new ingredient to an already roiling American politics of fear and paranoia, a toxic element which still inflames the Tea Party imagination two decades after the Berlin Wall was torn down.
During the 1936 presidential campaign, in the midst of the Great Depression, three populist movements — Louisiana Senator Huey Long’s “Share Our Wealth” clubs, the Union for Social Justice formed by the charismatic “radio priest” Father Charles E. Coughlin, and Francis Townsend’s campaign for government pensions for the elderly — coalesced, albeit briefly and uneasily, to form the Union Party. It ran from the left against President Franklin Roosevelt, nominating as its presidential candidate North Dakota Congressman William Lemke, a one-time spokesman for radical farmers. (The vice-presidential candidate was a labor lawyer from Boston.)
The Union Party expressed a broad dissatisfaction with the failure of Roosevelt’s New Deal to relieve economic distress and injustice. Senator Long, the latest in a long line of Southern populist demagogues, had been decrying the power of land barons, “moneycrats,” and big oil since his days as Louisiana’s governor. His “Share Our Wealth” plan called for pensions and public education for all, as well as confiscatory taxes on incomes over $1 million, a minimum wage, and public works projects to give jobs to the unemployed. Townsend’s scheme was designed to solve unemployment and the penury of old age by offering monthly government pensions of $200, financed by taxes on business, to everyone over the age of 60. Coughlin, an early supporter of Roosevelt, trained his fire on finance capitalism, inveighing against its usurious, unchristian “parasitism.”
But Long and especially Coughlin were at pains to distinguish their form of radicalism from the collectivism and atheism of the Red menace. Father Coughlin expressed support for labor unions and a just wage. He was, however, an inveterate foe of the left-leaning United Automobile Workers union, and roundly condemned the sit-down strikes which spread like a prairie fire following Roosevelt’s triumphal landslide victory in the 1936 presidential election, as workers across the country occupied everything from auto plants to department stores demanding union recognition.
Indeed, in his radio addresses and his newspaper, Social Justice, the priest ranted about an incongruous conspiracy of Bolsheviks and bankers whose aim was to betray America. He would eventually add a tincture of anti-Semitism to his warnings about a Wall Street cabal. His growing sympathy for Nazism was not so shocking. Fascism, after all, had its roots in a European version of populism that conveyed a post-World War I disgust with the selfishness and incompetence of cosmopolitan ruling elites, a virulent racial nationalism, and a hatred of bankers and especially Bolsheviks.
Followers of Long and Coughlin loathed big business and big government, even though big government — back then anyway — was taking on big business. For them, “Don’t Tread on Me” meant a defense of local economies, traditional moral codes, and established ways of life that seemed increasingly endangered by national corporations as well as the state bureaucracies that began to proliferate under the New Deal. Union Party campaign oratory was filled with references to the “forgotten man,” an image first invoked by Roosevelt on behalf of the working poor.
In the years ahead, kindred images would resurface during a time of turmoil in the late 1960s in Nixon’s appeals to the “silent majority” of “Middle America,” and more recently in the Tea Party’s wounded sense of exclusion. “Forgotten man” populism conveyed the irate politics of resentment of precariously positioned Americans against the organized power blocs of modern industrial society: Big Business, Big Labor, and Big Government.
Race, Resentment, and the Rise of Conservative Populism
Over the last half-century, populism has drifted steadily rightward, becoming ever more restorationist and ever less transformative, ever more anti-collectivist and ever less anti-capitalist. What were subordinate themes in the older-style populism — religious orthodoxy, national chauvinism, phobic racism, and the politics of fear and paranoia — have come to the fore in our time. At least in broad terms, both the Barry Goldwater and the George Wallace insurgencies of the 1960s displayed this trajectory.
Goldwater, the Arizona senator and 1964 Republican candidate for president, an “insurgent”? Yes, if you keep in mind his condemnation of the too-liberal elite running the Republican Party, who, in his eyes, represented a clubby world of Ivy League bankers, corrupt politicians, media lords, and “one-worlders.” Or consider the way he flirted with the freakish John Birch Society (which called President Dwight Eisenhower a “dedicated, conscious agent of the Communist Party” and warned of a Red plot to weaken the minds of Americans by fluoridating the water supply). Or the Senator’s alarming readiness to threaten to push the nuclear button in defense of “freedom,” which could be thought of as the Cold War version of “Don’t Tread on Me.”
Above all, Goldwater was the avatar of today’s politics of limited government. In his opposition to civil rights legislation, he might be called the original “tenther” — that is, a serial quoter of the Tenth Amendment to the Constitution, which reserves for the states all powers not expressly granted to the Federal government, with which he justified hamstringing all efforts by Washington to rectify social or economic injustice. For Goldwater the outlawing of Jim Crow was an infringement of constitutionally protected states’ rights. Moreover, he was an inveterate enemy of all forms of collectivism, including of course unions and the welfare state.
As the Goldwater opposition sank its grassroots into the lush soil of the Sunbelt, its desire to restore an older order of things was palpable. At a time when New Deal liberalism was the reigning orthodoxy, the senator’s reactionary impulses seemed startlingly adrift from the mainstream, and so strange indeed.
Goldwater’s rebellious constituents were an oddly positioned band of rebels. Unlike the declining middling sorts attracted to the Union Party, they came mainly from a rising Sunbelt stratum, a new middle class significantly nourished by the mushrooming military-industrial complex: technicians and engineers, real-estate developers, middle managers, and mid-level entrepreneurs who resented the intrusion of Big Government while in fact being remarkably dependent on it.
They could be described as reactionary modernists for whom liberalism had become the new communism. How shocking when this Arizona “maverick” — he deserved the label far more than John McCain ever did (if he ever did) — won the Republican nomination in a knock-down brawl with the presidium, led by New York Governor Nelson Rockefeller, that had run the party until then. Might the Tea Party accomplish something similar today?
Think of Alabama Governor George Wallace as the other missing link between the economic populism of yesteryear and the cultural populism of the late twentieth century. He was all at once an anti-elitist, a populist, a racist, a chauvinist, and a tribune of the politics of revenge and resentment. “Segregation now, segregation tomorrow, segregation forever”: a line spoken at his inauguration as governor in 1963 that would be his signature defiance of the civil rights revolution and its alliance with the federal government. In no uncertain terms, it signaled the militant racism of his bedrock supporters.
His appeal, however, ran far deeper than that. The whole tenor of his politicking involved a down-home defense of blue-collar America. Like Huey Long, he was sensitive to the economic predicament of his lower-class constituents. As governor he favored expanded state spending on education and public health, pay raises for school teachers, and free textbooks. When he ran for president as a third party candidate in 1968, he called for increases in social security and Medicare. As late as 1972, Wallace increased retirement pensions and unemployment compensation in Alabama.
Yet he championed the hard-hat American heartland by hailing its ethos of hard work and what today would be known as “family values” far more than by proposing concrete measures to assure its economic well-being. Wallace railed against the know-it-all arrogance of “pointy-headed” Washington bureaucrats, the indolence of “welfare queens,” and the impiety, moral decadence, and disloyalty of privileged long-haired, pot-smoking, anti-war college students.
Bellicose calls for law and order, states’ rights, and a muscular patriotism fueled the revanchist emotions that made Wallace into more than a regional figure. When he ran in the Democratic primaries in 1964 (with the support of the John Birch Society and the White Citizens Council), he won significant numbers of votes not only in the Deep South, but in states like Indiana, Wisconsin, and Maryland, a sign of the Southernization of American politics at a time when the spread of NASCAR, country music, and the blues were Southernizing its culture as well.
Wallace’s venture into third-party politics (on the predictably named American Independent Party ticket) terrified the Democrats, who feared the loss of part of their blue-collar base. He called Vice President Hubert Humphrey, then running for president against Richard Nixon, as well as Northern liberals generally, a “group of god-damned, mealy-mouthed sissy-britches” — shades of Senator Joe McCarthy and the 1950s — and he promised to take the gloves off, if elected, and bomb North Vietnam back to the Stone Age.
Wallace’s popularity revealed to Nixon and the Republicans a possibility denied them since the end of Reconstruction: that, on the road to an Electoral College victory, they might begin to develop a “southern strategy.” In the meantime, his populist cry that there “was not a dime’s worth of difference between the Democratic and Republican parties” won him 10 million votes, 13.5% of the total and 46 votes in the Electoral College. And remember this: a crowd of 20,000 attended a Wallace rally in 1968 at a sold-out Madison Square Garden in New York City.
Don’t Tread on My Taxes
So what does this episodic and checkered history of American populism have to do with the Tea Party?
As a start, the Tea Party movement reminds us that the moral self-righteousness, sense of dispossession, anti-elitism, revanchist patriotism, racial purity, and “Don’t Tread on Me” militancy that were always at least a part of the populist admixture are alive and well. For all the fantastical paranoia that often accompanies such emotional stances, they speak to real experiences — for some, of economic anxiety, insecurity, and loss; for others, of deeper fears of personal, cultural, political, or even national decline and moral disorientation.
Though such fears and feelings are, in part, legacies of the corporate liberal order — one of the dark sides of “progress” under capitalism — in this new populist moment, anti-capitalism itself barely lingers on. Though outrage at the bank bailout did help propel the Tea Party explosion, anti-big-business sentiment is now a pale shadow of its former self, a muted sub-theme in the movement when compared to the Wallace moment, not to mention those of Huey Long or the Populists.
This is hardly surprising since, at least economically, capitalism has, according to recent surveys of Tea Party membership, served many of them reasonably well. Like Goldwater supporters of the 1960s, those who identify with the Tea Party movement are generally wealthier than the population as a whole, and more likely to be employed. They are also apparently better educated, so their fondness for Sarah Palin’s intellectual debilities may be more a case of resentment of bicoastal cultural snobbery than eye-popping ignorance.
Alongside an exalted rhetoric about threats to liberty lies a sour, narrow-minded defensiveness against any possible threat of income redistribution that might creep into the body politic… and so into their pockets. “Don’t Tread on Me,” once a rebel war cry, has morphed into: “I’ve got mine. Don’t dare tax it.” The state, not the corporation, is now the enemy of choice.
Tea Party populism should also be thought of as a kind of identity politics of the right. Almost entirely white, and disproportionately male and older, Tea Party advocates express a visceral anger at the cultural and, to some extent, political eclipse of an America in which people who looked and thought like them were dominant (an echo, in its own way, of the anguish of the Know-Nothings). A black President, a female Speaker of the House, and a gay head of the House Financial Services Committee are evidently almost too much to bear. Though the anti-immigration and Tea Party movements so far have remained largely distinct (even if with growing ties), they share an emotional grammar: the fear of displacement.
But identity politics aside, Tea Party anger reaches far beyond the ranks of the modest Tea Party movement. It resonates with other Americans who understandably feel that political and economic elites, serving themselves at the expense of everyone else, have failed Americans. The big question is just exactly how (or even if) that private and personal rage gets transformed into moral and political outrage. If the heirs of George Wallace and Barry Goldwater, or the Sarah Palins of today, have their way, the outcome won’t be a tea party.
Steve Fraser is editor-at-large of New Labor Forum, co-founder of the American Empire Project, a writer, TomDispatch contributor, and an historian. His latest book is Wall Street: America’s Dream Palace.
Joshua B. Freeman teaches history at the City University of New York. He is currently completing a history of the United States since World War II as part of the Penguin History of the U.S.
This piece is an adaptation of an article that will be published in the Fall 2010 issue of the magazine New Labor Forum.
Copyright 2010 Steve Fraser and Joshua B. Freeman
This article was originally posted at TomDispatch.com.
History’s Mad Hatters
On March 4, 1933, the day he took office, Franklin Roosevelt excoriated the “money changers” who “have fled from their high seats in the temples of our civilization [because…] they know only the rules of a generation of self-seekers. They have no vision and where there is no vision, the people perish.”
Rhetoric, however, is only rhetoric. According to one skeptical congressional observer of FDR’s first inaugural address, “The President drove the money-changers out of the Capitol on March 4th — and they were all back on the 9th.”
That was essentially true. It was what happened after that, in the midst of the Great Depression, that set the New Deal on a course that is the mirror image of the direction in which the Obama administration seems headed.
Buoyed by great expectations when he assumed office, Barack Obama has so far revealed himself to be an unfolding disappointment. On arrival, expectations were far lower for FDR, who was not considered extraordinary at all — until he actually did something extraordinary.
The great expectations of 2009 are, only a year later, beginning to smell like a pile of dead fish with new rhetoric — including populist-style attacks on villainous bankers that sound fake (or cynically pandering) when uttered by Obama’s brainiacs — layered on top of the pile like deodorant. Meanwhile, the country is suffering through a recovery that isn’t a recovery unless you happen to be a banker, and the administration stands by, too politically or intellectually inhibited or incapacitated to do much of anything about it. A year into “change we can believe in” and the new regime, once so flush with power and the promise of big doings, seems exhausted, vulnerable, and afraid. A year into the New Deal — indeed a mere 100 days into Roosevelt’s era — change, whether you believed in it or not, clearly had the wind at its back.
A Tale of Two Presidencies
If, a few days after Roosevelt pronounced them ex-communicant, the “money-changers” were back inside the temple — “temple,” by the way, was how the Federal Reserve used to be known before its recent fall from grace — no one was too surprised. He, like Obama, was initially worried about alienating big business and high finance. He arrived in the Oval Office, in fact, still a prisoner of his own past and the country’s. He believed, for example, in the then-orthodox wisdom of balancing the budget and would never entirely abandon that faith.
Not long before he assumed office, his predecessor, Herbert Hoover, vetoed a bill calling for the accelerated payment of bonuses to World War I veterans. Many of them had only recently gathered in makeshift tents on Anacostia Flats in Washington, D.C., an army of the destitute, to plead their case. Hoover, to his lasting dishonor, ordered Army Chief of Staff General Douglas MacArthur to have their tents set on fire and drive them away at bayonet point. Not long after FDR took the oath of office, he vetoed the same bill. He shared, as well, in a broad cultural repugnance for what was then called “the dole,” and today is known as “welfare.”
The legendary first 100 days of the Roosevelt administration, memorable for a raft of reform and recovery legislation, also prominently featured an Economy Act designed to reduce government expenditures. Fearing the possibility of a break with the commercial elite, the president tried forging a partnership with them, much as Hoover had. As a matter of fact, the first two pieces of recovery legislation his administration submitted to Congress — the National Industrial Recovery Act and the Agricultural Adjustment Act — were formulated and implemented in a way that would seem familiar today. They gave the country’s major corporations and largest agricultural interests the principal authority for re-starting the country’s stalled economic engines.
However, even as the administration tried to maintain its ties to powerful business interests and a traditional fiscal conservatism, it broke them — and it severed those connections in ways, and for reasons, that are instructive today.
*The Glass-Steagall Act: This emergency banking legislation passed during those extraordinary first 100 days separated commercial from investment banking. It was meant to prevent the misuse of commercial bank deposits (other people’s money like yours and mine) in dangerous forms of speculation, which many at the time believed had helped cause the Great Wall Street Crash of 1929, prelude to the Great Depression. Today, ever more people wish Glass-Steagall had never been repealed (as it was in 1999), as its absence helped open the door to the financial misadventures that brought us the Great Crash of ’08.
The bill infuriated what was called, in those days, “the Money Trust,” especially the once omnipotent house of Morgan, the dominant member of an elite group of Wall Street firms that had run the financial system since the turn of the century when J.P. Morgan, America’s most famous banker, was revered and feared around the world. (Jack, the patriarch’s son, was so incensed by New Deal financial reform that he banned all pictures of the President from the bank’s premises.) Glass-Steagall, as well as the two Securities Acts of 1933 and 1934 which created the Securities and Exchange Commission and left the doyens of the New York Stock Exchange apoplectic, represented real reform, and so were different in kind from TARP and all the other contraptions designed by the Bush and Obama Treasury Departments simply to bail out the financial sector.
*The Tennessee Valley Authority (TVA): Offspring also of those first 100 days, the TVA uplifted a vast, underdeveloped, and impoverished rural region of the country by bringing it electric power, irrigation, soil conservation, and flood control. It introduced the then-alien (and once again alien) idea of government-directed economic planning and development. It left the private utility industry irate at the prospect of having to compete with effective, publicly owned electrical-power-generating facilities. Fast-forward to today when, on the contrary, the private health insurance and pharmaceutical industries, conniving behind closed doors with Obama’s people, proved triumphant in a similar confrontation, leaving government competition in the dust.
*Jobs: And then there was, as there is again, the question of jobs and how to create them. In 1933, American politicians still took the notion of balancing the budget each year with deadly seriousness. In our present era, every president from Ronald Reagan and Bill Clinton to George W. Bush and now, apparently, Barack Obama talks the talk without any intention of walking the walk. What made the Roosevelt moment remarkable was this: balanced-budget orthodoxy notwithstanding, the new administration soon forged ahead with a set of jobs programs that not only implied deficit spending but an even more radical departure from business as usual.
Initially, the Public Works Administration (PWA), created as part of the National Industrial Recovery Act, relied on large-scale infrastructure projects farmed out to private enterprise. Undertaking such projects inevitably entailed government borrowing and deficits. Partly for that reason, the PWA proceeded at a glacial pace, put few to work right away, and — in the way it looked to the private sector to take the lead — resembled the latest thinking of the Obama administration whose newest tepid suggestions for creating jobs depend almost solely on funneling tax relief to business.
Simultaneously, however, the New Deal pursued a more daring alternative. FDR diverted a third of the PWA’s budget to the Civil Works Administration (CWA), out of which was born the legendary Civilian Conservation Corps, an agency that deployed hundreds of thousands of unemployed young men to restore the country’s forests and parklands. The CWA skipped the private sector entirely and simply put people to work: four million people in the summer and fall of 1933. (That would be the equivalent, today, of ten million Americans back on the job.)
During the first nine months of the Roosevelt administration, manual laborers, clerks, architects, book-binders, teachers, actors, white- and blue-collar workers alike became federal employees. They laid millions of feet of sewer pipe, improved hundreds of thousands of miles of roads, and built thousands of schools, playgrounds, and airports. Harry Hopkins, who ran the CWA, was authorized to seize tools, equipment, and materials from Army warehouses to get the new system up and running. (The Works Progress Administration, a subsequent incarnation of the CWA, would later create eight million jobs on the same principle of public employment.)
This isn’t even within hailing distance of where the current Administration is now as it frets about the deficit and pledges to freeze domestic spending (and implies, without having the courage to say so, that Medicare, Medicaid, and Social Security had better watch out). Coming from a regnant Democratic Party this is change we can’t or don’t want to believe in.
Like Obama, Roosevelt was denounced by his enemies in the Republican Party and the business community as a closet socialist (not to mention a cripple, a Jew, and a homosexual). While the administration would sometimes trim its sails considerably to weather the right wing storm, its general reaction to Republican opposition was the opposite of Obama’s. Even during that first year, and at an accelerating pace afterwards, the momentum of the New Deal carried it irresistibly to the left.
This was true, in fact, of the whole Democratic Party. The Congress elected in the off-year of 1934 was not only more overwhelmingly Democratic, but the Democrats who won were considerably more progressive-minded. They were far readier to jettison the shibboleths of the old order and press a still cautious President in their direction. By 1936, the essentials of the social welfare and regulatory state were in place, an insurgent labor movement had won the elementary right to organize (while becoming the New Deal’s most muscular constituency), and the president was denouncing “economic royalists” and “tories of industry” whose “hatred” for him he “welcomed.”
Today the Obama administration and the Democratic Party are visibly moving in the opposite direction. They read the lesson of humiliating defeat in Massachusetts and the voluble hostility of the populist right as an advisory to move further to the right. Tacking rightward, tailoring policy to match the tastes of business and finance, cautioning Americans that they’ll need to tighten their belts (as if they hadn’t already been doing so), adopting the parsimonious sanctimony of the balanced budget, slimming down their great expectations until what little is left mocks the hopes of so many who elected them — all of this is seen as smart politics.
Smart like a chicken. This is the same cleverness that, beginning with Ronald Reagan’s triumph, turned the Democratic Party into Republican-lite. Shrewdness like this helps explain, in part, why Obama’s inner circle and Democratic leaders took the early, fateful steps that were bound to land them where they find themselves today.
Would the Republican right and its tea-party populists — marginal, mockable political freaks less than a year ago — have enjoyed their current growth spasm if the administration hadn’t been committed to bailing out the very institutions most people considered the villains responsible for running this country into a ditch? Would the Democratic Party have been in imminent danger of losing its faltering grip on Congress had it found the will to pursue serious health-care reform and environmental legislation, or wrestled the financial oligarchy to the mat as Roosevelt did? A long generation spent cowering in the shadows of the conservative ascendancy has left the newly empowered Democrats congenitally incapable of seizing their own historic moment.
After a year of feinting to the left without meaning it, how seriously is anyone going to take the administration’s latest call to tax the banks or break their addiction to reckless speculation? Even if Obama now means to push ahead with some sort of health-care reform or put some teeth into new financial regulations, he has spent so much political capital moving in the opposite direction and seeking partners where there never were any that his quest, even if genuine, may now be purely quixotic. As for the surge in Afghanistan and the endless war that goes with it, by election time 2010, it’s an even bet that it will have further undermined any hopes of a late-inning Democratic Party revival.
Conventional wisdom notwithstanding, off-year elections do not always favor the minority party. Indeed, 1934 may be the best example of the opposite effect. Exactly because the New Deal showed itself ever readier to junk the ancien régime, break with economic orthodoxy, and above all say goodbye to its erstwhile corporate friends, it was rewarded handsomely at the polls. None of that apparently will be repeated in 2010, given an administration that seems to be running a New Deal in reverse.
Steve Fraser is the co-editor of The Rise and Fall of the New Deal Order and author, most recently, of Wall Street: America’s Dream Palace. He is Research Associate at the Joseph Murphy Center for Labor and Community Studies at the Graduate Center of the City of New York. (To catch him in an exclusive TomDispatch audio interview discussing why Obama has ignored the public-works job model Franklin D. Roosevelt pioneered, click here.)
Copyright 2010 Steve Fraser
The New Deal in Reverse
Obtuse hardly does justice to the social stupidity of our late, unlamented financial overlords. John Thain of Merrill Lynch and Richard Fuld of Lehman Brothers, along with an astonishing number of their fraternity brothers, continue to behave like so many intoxicated toreadors waving their capes at an enraged bull, oblivious even when gored.
Their greed and self-indulgence in the face of an economic cataclysm for which they bear heavy responsibility is, unsurprisingly, inciting anger and contempt, as daily news headlines indicate. It is undermining the last shreds of their once exalted social status — and, in that regard, they are evidently fated to relive the experience of their predecessors, those Wall Street “lords of creation” who came crashing to Earth during the last Great Depression.
Ever since the bail-out state went into hyper-drive, popular anger has been simmering. In fact, even before the meltdown gained real traction, a sign at a mass protest outside the New York Stock Exchange advised those inside: “Jump, You Fuckers.”
You can already buy “I Hate Investment Banking” T-shirts online. All the Caesar-sized salaries and the Caligula-like madness as the economy crashes and burns, all the bonuses, dividends, princely consulting fees for learning how to milk the Treasury, not to speak of those new corporate jets, as well as the government funds poured down the black hole of mega-mergers, moneys that might otherwise have spared citizens from foreclosure — all of this is making ordinary Americans apoplectic.
Nothing, however, may be more galling than the rationale regularly offered for so much of this self-indulgence. Asked why he had given out $4 billion in bonuses to his Merrill Lynch staff in a quarter in which the company had lost a staggering $15 billion, ex-CEO John Thain typically responded: “If you don’t pay your best people, you will destroy your franchise. Those best people can get jobs other places, they will leave.”
Apparently it never occurs to those who utter such perverse statements about rewarding the “best people,” or “the best men,” that we’d all have been better off, and saved some serious money, if they had hired the worst men. After all, based on the recent record, who could possibly have done more damage than the “best” Merrill Lynch, Wachovia, WaMu, Citigroup, A.I.G., Bank of America, and so many other top financial crews had to offer?
The “Best Men” Fall
Now even the new powers in Washington are venting. Vice President Biden has suggested that our onetime masters of the universe be thrown “in the brig”; Missouri Senator Claire McCaskill has denounced them as “idiots… that are kicking sand in the face of the American taxpayer,” and even the new president, a man of exquisite tact with an instinct for turning the other cheek, labeled Wall Street’s titans as reckless, irresponsible, and shameful.
To those who remember the history, all this bears a painfully familiar ring. Soon enough, that history tells us, Congressional investigators will start hauling such people into the public dock and the real fireworks will begin. It happened once before — a vital chapter in the ongoing story of how an old regime dies and a new one is born.
After the Great Crash of 1929, those at the commanding heights of the economy who had enriched themselves and deluded others into believing that, under their leadership, the United States had achieved “a permanent plateau of prosperity” — sound familiar? — were subject to a whirlwind of anger, public shaming, and withering ridicule. Like the John Thains of today, Jack Morgan, Charles Mitchell, Richard Whitney, Albert Wiggin, and others who headed the country’s chief investment and commercial banks, trusts, insurance companies, and the New York Stock Exchange never knew what hit them. They, too, had been steeped in the comforting bathwaters of self-delusion for so long that they believed, like Thain and his compadres, that they were indeed the “best,” the wisest, the most entitled, and the most impregnable men in America. Even amid the ruins of the world they had made, they were incapable of recognizing that their day was done.
Under the merciless glare of Congressional hearings, above all the Senate’s Pecora Committee (named after its bulldog chief counsel Ferdinand Pecora), it was revealed that Jack Morgan and his partners in the House of Morgan hadn’t paid income taxes for years; that “Sunshine” Charlie Mitchell, head of National City Bank (the country’s largest), had been short-selling his own bank’s stock and transferring assets into his wife’s name to escape taxes; that other financiers just like him, who had been hero-worshiped for a decade or more as financial messiahs, had regularly engaged in insider-trading schemes that made them wealthy and fleeced legions of unknowing investors.
The Pecora Committee was not the only scourge of the old financial elite. Franklin Delano Roosevelt, as publicly mild-mannered as President Obama and perhaps even more amiable and charming, began excoriating them from the moment of his first inaugural address. He condemned them in no uncertain terms for misusing “other people’s money” and for their reckless speculations; he blamed them for the sorry state of the country; he promised to chase these “unscrupulous money changers” from their “high seats in the temples of American civilization.”
Jack Morgan, called to testify by yet another set of Congressional investigators, had a circus midget plopped in his lap to the delight of a swarm of photo-journalists who memorialized the moment for millions. It was an emblematic photo, a visual metaphor for a once proud, powerful elite, its gravitas gone, reduced to impotence, ridiculed for its incompetence, and no longer capable of intimidating a soul.
What happened to Jack Morgan or later Richard Whitney — a crowd of 6,000 turned out at New York’s Grand Central Station in 1938 to watch the handcuffed former president of the New York Stock Exchange be escorted onto a train for Sing Sing, having been convicted of embezzlement — was the political and social equivalent of a great depression. It represented, that is, a catastrophic deflation of the legitimacy of the ancien régime. It was part of what made possible the advent of something entirely new.
Speculators and Con Men
Under normal circumstances, most Americans have been perfectly willing to draw a relatively sharp distinction between the misguided speculator and the outright felonious confidence man. One is a legitimate banker gone astray, the other an outlaw.
Under the extraordinary circumstances of terminal systemic breakdown, that distinction grows ever hazier. That was certainly true in the early years of the first Great Depression, when a damaging question arose: just exactly what was the difference between the behavior of Charles Mitchell, Jack Morgan, and Richard Whitney, lions of that era’s Establishment, and outliers like “Sell-em” Ben Smith, Ivar Kreuger, “the match king,” Jesse Livermore, “the man with the evil eye,” William Crapo Durant, maestro of investment pool stock kiting, or the one-time Broadway ticket agent and stock manipulator Michael Meehan, men long barred from the walnut-paneled inner sanctums of white-shoe Wall Street?
Admittedly, their dare-devil escapades had often left them on the wrong side of the law and they would end their days in jail, as suicides, or in penury and disgrace. Nonetheless, as is true today, many Americans then came to accept that between the speculating banker and the confidence man lay a distinction without a meaningful difference. After all, by the early 1930s, the whole American financial system seemed like nothing but a confidence game deserving of the deepest ignominy.
In that sense, Bernie Madoff, a former chairman of the NASDAQ stock exchange, already seems like a synecdoche for a whole way of life. Technically speaking, he ran a Ponzi scheme out of his brokerage firm, as strictly fraudulent as the original one invented by Charles Ponzi, that Italian vegetable peddler, smuggler, and, after he got out of an American jail, minor fascist official in Mussolini’s Italy.
Ponzi, however, was a small-timer. He gulled ordinary folks out of their five and ten dollar bills. Madoff’s $50 billion game was something else again. It was completely dependent on his ties to the most august circles of our financial establishment, to major hedge funds and funds of funds, to top-drawer consulting firms, to blue-ribbon nonprofits, and to a global aristocracy of the super-rich. True enough, people of middling means, as well as public and union pension funds, got taken too. At the end of the day, however, Madoff’s scheme, unlike Ponzi’s, was premised on a pervasive insiderism which had everything to do with the way our financial system has been run for the past quarter century.
Once Madoff was exposed, everybody questioned the credulousness of those who invested with him: why didn’t they grow suspicious of such consistently high rates of return? But the equally reasonable question was: why should they have? Not only did you practically need an embossed invitation before you could entrust your loot to Madoff, but the whole financial sector had been enjoying extraordinary returns for a very long time (admittedly, with occasional major hiccups like the dot-com bust of 2000-2001, which somehow seemed to fade quickly from memory).
Keep in mind as well that these lucrative dealings were based on speculative investments in securities so far removed from anything tangible or comprehensible that they seemed to be floating in thin air. The whole system was a Ponzi-like scheme which, like the Energizer Bunny, just kept on going and going and going… until, of course, it didn’t.
Locked into the Bailout State
After 1929, when the old order went down in flames, when it commanded no more credibility and legitimacy than a confidence game, there was an urgent cry to regulate both the malefactors and their rogue system. Indeed, new financial regulation was at the top of, and made up a hefty part of, Roosevelt’s New Deal agenda during its first year. That included the Bank Holiday, the creation of the Federal Deposit Insurance Corporation, the passing of the Glass-Steagall Act, which separated commercial from investment banking (their prior cohabitation had been a prime incubator of financial hanky-panky during the Jazz Age of the previous decade), and the first Securities Act to monitor the stock exchange.
One might have anticipated an even more robust response today, given the damage done not only to our domestic economy, but to the global one upon which any American economic recovery will rely to a very considerable degree. At the moment, however, financial regulation or re-regulation — given the last 30 years of Washington’s fiercely deregulatory policies — seems to have a surprisingly low profile in the new administration’s stated plans. Capping bonuses, pay scales, and stock options for the financial upper crust is all well and good and should happen promptly, but serious regulation and reform of the financial system must strike much deeper than that.
Instead, the new administration is evidently locked into the bail-out state invented by its predecessors, the latest version of which, the creation of a government “bad bank” (whether called that or not) to buy up toxic securities from the private sector, commands increasing attention. A “bad bank” seems a strikingly lose-lose proposition: either we, the tax-paying public, buy or guarantee these securities at something approaching their grossly inflated, largely fictitious value, in which case we will be supporting this second gilded age’s financial malfeasance for who knows how long, or the government’s “bad bank” buys these shoddy assets at something close to their real value, in which case major banks will remain in lock-down mode, if they survive at all. Worse yet, the administration’s latest “bad bank” plan does not even compel rescued institutions to begin lending to anybody, which presumably is the whole point of this new financial welfare system.
Why this timidity and narrowness of vision, which seems less like reform than capitulation? Perhaps it comes, in part, from the extraordinary economic and political throw-weight of the FIRE (finance, insurance, and real estate) sector of our national economy. It has, after all, grown geometrically for decades and is now a vital part of the economy in a way that would have been inconceivable back when the U.S. was a real industrial powerhouse.
Naturally, FIRE’s political influence expanded accordingly, as politicians doing its bidding dismantled the regulatory apparatus installed by the New Deal. Even today, even in ruins, many in that world no doubt hope to keep things more or less that way; and unfortunately, spokesmen for that view — or at least people who used to champion that approach during the Clinton years, including Larry Summers and Robert Rubin (who “earned” more than $115 million at Citigroup from 1999 to 2008) — occupy enormously influential positions in, or as informal advisors to, the new Obama administration.
Still, popular anger and ridicule of the sort our New Deal-era ancestors once let loose are growing more and more common, which explains, of course, the newly discovered voice of righteous anger of some of our leading politicians who are feeling the heat. Certain observers have dismissed popular resistance to the bail-out state as nothing more than right-wing, Republican-inspired hostility to government intervention of any sort. That may account for some of it, but much of the anger is indeed righteous, reasonable, and coming from ordinary Americans who simply have had enough.
Progressive-minded people in and outside of government must find a way to make re-regulation urgent business, and to do so outside the imprisoning, politically self-defeating confines of the bail-out state. Just weeks ago, the notion of nationalizing the banks seemed irretrievably un-American. Now, it is part of the conversation, even if, for the moment, Obama’s savants have ruled it out.
The old order is dying. Let’s bury it. The future beckons.
Copyright 2009 Steve Fraser
On a December day in 1932, with the country prostrate under the weight of the Great Depression, ex-president Calvin Coolidge — who had presided over the reckless stock market boom of the Jazz Age Twenties (and famously declaimed that "the business of America is business") — confided to a friend: "We are in a new era to which I do not belong." He punctuated those words, a few weeks later, by dying.
A similar premonition grips the popular imagination today. A new era beckons. No person has been more responsible for arousing that expectation than President-elect Barack Obama. From beginning to end, his presidential campaign was borne aloft by invocations of the "fierce urgency of now," by "change we can believe in," by "yes, we can!" and by the obvious significance of his race and generation. Not surprisingly then, as the gravity of the national economic calamity has become terrifyingly clearer, yearnings for salvation have attached themselves ever more firmly to the incoming administration.
This is as it should be — and as it once was. When in March 1933, a few months after Coolidge gave up the ghost, Franklin Delano Roosevelt was inaugurated president, people looked forward to audacious changes, even if they had little or no idea just what, in concrete terms, that might mean. If Coolidge, an iconic representative of the old order, knew that the ancien régime was dead, millions of ordinary Americans had drawn the same conclusion years earlier. Full of fear, depressed and disillusioned, they nonetheless had an appetite for the untried. Like Obama, FDR had, during his campaign, encouraged feverish hopes with no less vaporous references to a "new deal" for Americans.
Brain Trust vs Brainiacs
Yet today, something is amiss. Even if everyone is now using the Great Depression and the New Deal as benchmarks for what we're living through, Act I of the new script has already veered away from the original.
A suffocating political and intellectual provincialism has captured the new administration in embryo. Instead of embracing a sense of adventurousness, a readiness to break with the past so enthusiastically promoted during the campaign, Obama seems overcome with inhibitions and fears.
Practically without exception he has chosen to staff his government at its highest levels with refugees from the Clinton years. This is emphatically true in the realms of foreign and economic policy. It would, in fact, be hard to find an original idea among the new appointees being called to power in those realms — some way of looking at the American empire abroad or the structure of power and wealth at home that departs radically from views in circulation a decade or more ago. A team photo of Obama's key cabinet and other appointments at Treasury, Health and Human Services, Commerce, the President's Economic Recovery Advisory Board, the State Department, the Pentagon, the National Security Council, and in the U.S. Intelligence Community, not to speak of senior advisory posts around the President himself, could practically have been teleported from perhaps the year 1995.
Recycled Clintonism is recycled neo-liberalism. This is change only the brainiacs from Hyde Park and Harvard Square could believe in. Only the experts could get hot under the collar about the slight differences between "behavioral economics" (the latest academic fad that fascinates some high-level Obama-ites) and straight-up neo-liberal deference to the market. And here's the sobering thing: despite the grotesque extremism of the Bush years, neo-liberalism served as that administration's ideological magnetic north as well.
Is this parochialism, this timorousness and lack of imagination, inevitable in a period like our own, when the unknown looms menacingly and one natural reaction is certainly to draw back, to find refuge in the familiar? Here, the New Deal years can be instructive.
Roosevelt was no radical; indeed, he shared many of the conservative convictions of his class and times. He believed deeply in both balanced budgets and the demoralizing effects of relief on the poor. He tried mightily to rally the business community to his side. For him, the labor movement was terra incognita and — though it may be hard to believe today — played no role in his initial policy and political calculations. Nonetheless, right from the beginning, Roosevelt cobbled together a cabinet and circle of advisers strikingly heterogeneous in its views, one that, by comparison, makes Obama's inner sanctum, as it is developing today, look like a sectarian cult.
Heterogeneous does not mean radical. Some of FDR's early appointments — as at the Treasury Department — were die-hard conservatives. Jesse Jones, who ran the Reconstruction Finance Corporation, a Hoover administration creation, retained by FDR, that had been designed to rescue tottering banks, railroads, and other enterprises too big to fail, was a practitioner of business-friendly bailout capitalism before present Treasury Secretary Henry Paulson was even born.
But there was also Henry Wallace as Secretary of Agriculture, a Midwestern progressive who would become the standard bearer for the most left-leaning segments of the New Deal coalition. He was joined at the Agriculture Department — far more important then than now — by men like Mordecai Ezekiel, who was prepared to challenge the power of the country's landed oligarchs.
Then there were corporatists like Raymond Moley, Donald Richberg, and General Hugh Johnson. Moley was an original member of FDR's legendary "brain trust" (a small group of the President's most influential advisers who often held no official government position). Richberg and Johnson helped design and run the National Recovery Administration (the New Deal's first and failed attempt at industrial recovery). All three men were partial to the interests of the country's peak corporations. All three wanted them released from the strictures of the Sherman Anti-Trust Act so that they could collaborate in setting prices and wages to arrest the killing deflation that gripped the economy. But they also wanted these corporate behemoths and the codes of competition they promulgated subjected to government oversight and restraints.
Meanwhile, Felix Frankfurter (another confidant of FDR's and a future Supreme Court justice), aided by the behind-the-scenes efforts of Supreme Court Justice Louis Brandeis, fiercely contested the influence of the corporatists within the new administration, favoring anti-trust and then-new Keynesian approaches to economic recovery. Secretary of Labor Frances Perkins used her extensive ties to the social work community and the labor movement to keep an otherwise tone-deaf president apprised of portentous rumblings from that quarter. In this fashion, she eased the way for the passage of the Wagner Act that legislated the right to organize and bargain collectively, and that ended the reign of industrial autocracy in the workplace.
Roosevelt's "brain trust" also included Rexford Tugwell, an avid proponent of government economic planning. Another founding member of the "brain trust" was Adolf Berle, who had published a bestselling, scathing indictment of the financial and social irresponsibility of the corporate elite just before FDR assumed office.
People like Tugwell and others, including future Federal Reserve Board chairman Marriner Eccles, were believers in Keynesian deficit spending as the road to recovery and argued fiercely for this position within the inner councils of the administration, even while Roosevelt himself remained, until later in his presidency, an orthodox budget balancer.
All of these people — the corporatists and the Keynesians, the planners and the anti-trusters — were there at the creation. They often came to blows. A genuine administration of "rivals" didn't faze FDR. He was deft at borrowing all of, or pieces of, their ideas, then jettisoning some when they didn't work, and playing one faction against another in a remarkable display of political agility. Roosevelt's tolerance of real differences stands in stark contrast to the new administration's cloning of the Clinton-era brainiacs.
It was this openness to a variety of often untested solutions — including at that point Keynesianism — that helped give the New Deal the flexibility to adjust to shifts in the country's political chemistry in the worst of times. If the New Deal came to represent a watershed in American history, it was in part due to the capaciousness of its imagination, its experimental elasticity, and its willingness to venture beyond the orthodox. Many failures were born of this, but so, too, many enduring triumphs.
Beyond the Bailout State
Why, at least so far, is the Obama approach so different? Some of it no doubt has to do with the same native caution that caused FDR to navigate carefully in treacherous waters. But some of it may result from the fallout of history. Because the Great Depression and the New Deal happened, nothing can ever really be the same again.
We are accustomed to thinking of the Bush years — maybe even the whole era from the presidency of Ronald Reagan on — as a throwback to the 1920s or even the laissez-faire golden years of the Gilded Age of the late nineteenth century. In some respects, that's probably accurate, but in at least one critical way it's not. Back in those days, faced with a potentially terminal financial crisis, the government did nothing, simply letting the economy plunge into depression. This happened repeatedly until 1929, when it happened again.
Since the New Deal, however, inaction has ceased to be a viable option for Washington. State intervention to prevent catastrophe has become an unspoken axiom of political life in perilous times. Of course, thanks to regulatory mechanisms installed during the New Deal years, there was no need to engage in heroic rescues — not, at least, until the triumph of deregulation in our own time.
Then crises began to erupt with ever greater frequency — the stock market crash of 1987, the savings and loan collapse at the end of that decade, the massive Latin American debt defaults of the early 1990s, the collapse of the economies of the Asian "tigers" in the late 1990s, the near bankruptcy of the then-huge hedge fund Long-Term Capital Management later in that decade, and the dot-com implosion at the turn of the century, climaxing with the general global collapse of the present moment. Beginning perhaps with the bailout of the Chrysler Corporation in the late 1970s, these recurring crises have been met with increasingly strenuous efforts to stop the bleeding by what some have called "the bailout state."
The Resolution Trust Corporation, created to rescue the savings and loan industry, first institutionalized what Kevin Phillips has since described as a new political economy of "financial mercantilism." Under this new order the state stands ready to backstop the private sector — or at least the financial sub-sector which, for the past quarter century, has been the driving engine of economic growth — whenever it undergoes severe stress.
Today, the starting point for all mainstream policymakers, even those who otherwise preach the virtues of the free market and the evils of big government, is the active intervention of the state to prevent the failure of private-sector institutions considered "too big to fail" (as, most recently, with Citigroup and the insurance company AIG). So, too, the tolerance level for deficit spending, not only for military purposes but, in extremis, to help stop ordinary people from going under, is infinitely higher than in 1932. Ronald Reagan was prepared to live with such spending, if necessary, even as he removed portraits of Thomas Jefferson and Harry S. Truman from the Cabinet Room and replaced them with a canvas of Calvin Coolidge.
The question for our "new era" — not one our New Deal ancestors would have thought to ask — has become: How do we get beyond the bailout state? This is one crucial realm where genuinely new thinking and new ideas are badly needed.
At the moment, as best we can make out, the bailout state is being managed in secret and apparently in the interests, above all, of those who run the financial institutions being "rescued." Often, we don't actually know who is getting what from the Federal Reserve and the Treasury, or on what terms, or even which institutions are being helped and which aren't, or often what our public monies are actually being used for.
What we do know, however, is anything but encouraging. It includes tax exemptions for merging banks and prices for public-equity stakes in failing outfits that far exceed what is being paid by governments (or even private investors) abroad for similar holdings. Add to this a stark lack of accountability, aggravated by the fact that the U.S. government has neither voting rights nor even a voice on the boards of directors of firms that would be in bankruptcy court without Washington's aid.
Living in an Empire of Depression
Are we, then, witnessing the birth of some warped, exceedingly partial version of state capitalism — partial, that is, to the resuscitation of the old order? If so, lurking within this string of bum deals might there not be a great opportunity? Putting the economy and country back together will require massive resources directed toward common purposes. There is no more suitable means of mobilizing and steering those resources than the institutions of democratic government.
Under the present dispensation, the bailout state makes the government the handmaiden of the financial sector. Under a new one, the tables might be turned. But who will speak for that option within the limited councils of the Obama team?
A real democratic nationalization of the banks — good value for our money rather than good money to add to their value — should be part of the policy agenda up for discussion in the Obama era. As things now stand, the public supplies the loans and the investment capital, but the key decisions about how they are to be deployed remain in private hands. A democratic version of nationalizing the financial system would transfer these critical decisions to new institutions created by the Congress and designed to pursue public, not private, objectives. How to subject the flow of credit and investment capital to public control ought to be on the drawing boards if we are to look beyond the old New Deal to a new one.
Or, for instance, if we are to bail out the auto industry, which we should — millions of jobs, businesses, communities, and what's left of once powerful and proud unions are at stake — then why not talk about its nationalization, too? Why not create a representative body of workers, consumers, environmentalists, suppliers, and other interested parties to supervise the industry's reorganization and retooling to produce, just as the president-elect says he wants, new green means of transportation — and not just cars?
Why not apply the same model to the rehabilitation of the nation's infrastructure; indeed, why not to the reindustrialization of the country as a whole? If, as so many commentators are now claiming, what lies ahead is the kind of massive, crippling deflation characteristic of such crises, then why not consider creating democratic mechanisms to impose an incomes policy on wages and prices that works against that deflation?
Overseas, if everything isn't up for discussion — and it most certainly isn't — it ought to be. What happens there bears directly on our future here at home. After all, we live in the empire of depression. America's favorite export for more than a decade has been a toxic line-up of securitized debt. Having ingested it in lethal amounts, every economy in the world from Iceland's and Germany's to Russia's and Indonesia's is either folding up or threatening to fold up like an accordion under the pressure of economic disaster.
Until now, the American way of life, including its economy of mass consumption, has depended on maintaining the country's global preeminence by any means possible: economic, political, and, in the end, military. The news of the Bush years was that, in this mix, Washington reached for its six-guns so much more quickly.
A global depression will challenge that fundamental hierarchy in every conceivable way. The United States can try to recapture its imperiled hegemony by methods familiar to the Obama-Clinton-Bush (the father) foreign policy establishment, that is by using the country's waning but still intimidating economic and military muscle. But that's a devil's game played at exorbitant cost which will further imperil the domestic economy.
It might, of course, be possible, as in domestic affairs, to try something new, something that embraces the public redevelopment of America in concert with the global South. This would entail at a minimum a radical break with the "Washington Consensus" of the Clinton years in which the United States insisted that the rest of the world conform to its free market model of economic behavior. It would establish multilateral mechanisms for regulating the flow of investment capital and severe penalties and restrictions on speculation in international markets. Most of all, it would mean lifting the strangulating grip of American military might that now girdles the globe.
All of this would require a capacity for re-imagining foreign affairs as something other than a zero-sum game. So far, nothing in Obama's line-up of foreign policy and national security mandarins suggests this kind of potential policy deviance. Again, no Rooseveltian "brain trust" is in sight, even though unorthodoxies are called for, not just because of the hopes Obama's victory has aroused, but because of the urgency of our present circumstances.
If original thinking doesn't find a home somewhere within this forming administration soon, it will be an omen of an even more troubled future to come, when options not even being considered today may be unavailable tomorrow. Certainly, Americans ought to expect something better than a trip down (the grimmest of) memory lanes into the failed neo-liberalism of yesteryear.
Steve Fraser is a visiting professor at New York University and the author of Wall Street: America's Dream Palace. He is a regular contributor to TomDispatch.com and co-founder of the American Empire Project (Metropolitan Books).
Copyright 2008 Steve Fraser
Wall Street sits at the eye of a political hurricane. Its enemies converge from every point on the compass. What a stunning turn of events.
For well more than half a century Wall Street has enjoyed a remarkable political immunity, but matters were not always like that. Now, with history marching forward in seven-league boots, we are about to revisit a time when the Street functioned as the country’s lightning rod, attracting its deepest animosities and most passionate desires for economic justice and democracy.
For the better part of a century, from the 1870s through the tumultuous years of the Great Depression and the New Deal, the specter of Wall Street haunted the popular political imagination. For Populists it was the "Great Satan," its stranglehold over the country’s credit system being held responsible for driving the family farmer to the edge of extinction and beyond.
For legions mobilized in the anti-monopoly movement, Wall Street was the prime engine house of monopoly capitalism, leaving behind it a trail of victimized businesses, consumers, captive municipalities, and crushed workers. For Progressive reformers around the turn of the twentieth century, Wall Street’s "money trust" was the mother of all trusts, its tentacles — and the octopus was indeed a popular image of the time — choking off economic opportunity for all but a favored few. Its political power in Congress, in presidential cabinets, in statehouses, in both major political parties was seen as so overwhelming as to threaten to suffocate democracy itself.
All the periodic panics and depressions — 1873, 1884, 1893, 1907, and 1913 — that, with numbing regularity, punctuated economic life until the Crash of ’29 and the Great Depression brought the house down seemed to begin on the Street. And whether they actually began there or not, all the misery that followed in their wake — the homelessness, the armies of tramps and hobos, the starvation, the bankruptcies, the broken families, the crushing sense of dispossession — was regularly laid at the feet of the Street.
Despite the hot-tempered invective directed its way, the "Great Satan" didn’t face its comeuppance until the New Deal in the 1930s. Then, all its transgressions — its speculative greed, its felonious insider-dealing, its cynical manipulation of popular credulity, its extravagant incompetence and seemingly limitless capacity for self-delusion — left Wall Street truly vulnerable. Its reputation had struck bottom.
Wall Street’s Invisible Decades
Just like our Wall Street heroes of the recent past, so, too, back in the 1920s the savants of the Street claimed credit for the rickety prosperity of the Jazz Age. With the Crash they took the blame for the disaster, just as they had taken the credit for the prosperity, and were despised for their hypocrisy as well. Just as seems to be starting to happen today, Congressmen, some of whom had spent their careers genuflecting before the titans of Wall Street, suddenly hauled them before investigating committees, there to be defrocked, treated to a withering storm of biblically-inspired injunctions and Shakespearean curses, and indicted in the court of public opinion. Wall Street was, as it now seems about to be again, excommunicated.
Suddenly weak beyond compare, the Street was powerless to resist Franklin D. Roosevelt’s regulatory state. In rapid succession came the Glass-Steagall banking act and the Federal Deposit Insurance Corporation, the two securities acts of 1933 and 1934, the creation of the Securities and Exchange Commission (SEC), the Public Utility Holding Company Act, and much more. When, in 1936, the President summoned the people to battle against the "economic royalists," everyone knew just who he was talking about.
It’s long been said that FDR’s New Deal saved capitalism from itself. That is true. One ironic consequence of that fateful turn of events was, politically speaking, to cloak Wall Street in invisibility. After all the shouting was over, after the installation of legislative reforms had further chastened an already cowed Street and constrained its penchant for financial wilding, it ceased to function as the magnetic north for all those troubled by the inequities, injustices, and deformations of capitalism.
During the long prosperity of the post-war years from 1945 to 1970, when the income and wealth inequalities that had always been associated with Wall Street narrowed dramatically — economic historians know this as "the great compression" — news of the Street retreated to the business pages and remained there. Except for an occasional act of street theater, even in the tumultuous 1960s, the Street remained largely exempt from sustained political criticism. Once the bête noire of all those who found themselves in opposition to the ravages of laissez-faire capitalism, Wall Street had been neutered.
Just as remarkable is how long that immunity from criticism lasted. After all, Wall Street’s record over the past quarter century is nothing to boast about — unless, that is, you happened to have made your living on it or in its environs.
Beginning in the 1980s, the Street supervised and profited handsomely from the de-industrialization of America. "Lean and mean" capitalism, the watchword of the Reagan era, added up to the systematic dismantling of the core of American industry. This was done in the interests of "shareholder value," as well, of course, as the bounteous short-term returns offered by the merger, acquisition, and junk-bond mania of those years. Did the rise of a speculative economy of virtual wealth and the fall of an economy that had once employed millions productively at decent wages disturb the political equanimity of American public life? Barely.
When the financial regulatory apparatus of the New Deal was weakened, piece by piece, or simply eliminated by a triumphant conservatism, the economy began to re-experience the cycles of bubble and bust so familiar to previous generations of Americans. In 1987, the stock market briefly collapsed. Then, during the late 1980s, a large-scale savings and loan bailout was accompanied by the rescue of banks caught short holding shaky Latin American debt. Not long after that came the savaging of the "Asian tiger" economies by Thomas Friedman’s "electronic herd" of speculators, and the government-arranged bailout of that period’s biggest hedge fund, Long-Term Capital Management.
Before the country could catch its breath, matters got really serious with the popping of the dot-com bubble, Enronization, and finally, of course, our current catastrophe. Through all of this — until now — the political fallout was virtually nil. Sarbanes-Oxley, the act passed by Congress in 2002 in response to an avalanche of Wall Street and corporate scandals that began with Enron, was a remarkably tepid piece of reformist legislation, given the scale of the debauch; yet, within moments of its passage, howls of protest could be heard from our offended friends on the Street, grievous complaints treated with all due seriousness by the media, somehow still infatuated with Wall Street’s rainmakers.
The Return of the Repressed
No longer. There is a new agenda in America and it calls for re-regulation, recovery, and retribution. It is enough to make one gasp in disbelief, but nowadays there is practically universal agreement that the financial sector must be more or less rigorously reined in and regulated. (Hedge fund managers and some other hold-outs demur, of course.) Yet mere weeks ago, "government regulation" was still a phrase to be avoided like the plague, ranking right up there with "liberal" in the vocabulary of political obloquy.
It’s hard not to be reminded of just how quickly the political chemistry of the country changed at the end of the 1920s. The presiding figure who had loomed over that decade was Secretary of the Treasury Andrew Mellon — then considered the greatest Treasury secretary since Hamilton. His insane faith in the free market led him to suggest to President Herbert Hoover that the way out of the Depression was to do nothing, except "liquidate stocks, liquidate labor, liquidate the farmer, liquidate real estate." That thought earned him the enmity of a once admiring country. So, too, laissez-faire has suddenly become much too French for Americans who, but moments ago, treated it like the Holy Grail. We are all regulators today.
Of course, the devil, as every politician on television now makes sure to say, will lie in the details of just what re-regulation consists of. If all it involves is transparency, that won’t be nearly enough. After all, that is precisely what Sarbanes-Oxley promised when it required financial institutions to make full disclosure of their activities. When it comes to circumventing the rules of information sharing so as to leave the insiders in the know and the rest of us out in the cold, where there’s a will, there will always be a way. The new regulatory regime must have powers that extend beyond umpiring. New rules need to be invented whose purpose is as much to assure economic recovery and equity as it is to police the borders of illegality.
Indeed, popular anger fueling the regulatory crusade now seems to be coupled with a deep-running fear of a coming depression and an urge to reverse course. This, too, is symptomatic of a shift in the axis of political debate, in the zeitgeist, if you will.
The meltdown of the financial system has called into question American economic behavior over the last generation. Wall Street has come to stand for a paper economy that produces nothing useful, nothing tangible the way it once did. It has frittered away resources on embarrassingly grotesque forms of conspicuous consumption and patently non-productive forms of investment. It has left the real economy underdeveloped, its infrastructure rotting away in plain sight, its wealth fractured by unprecedented inequalities, dependent on sweated labor, and its industries, across a broad spectrum, technologically second-rate. It has left the country lost in a sea of debt and headed for an abyss of unemployment, bankruptcy, and evictions. Somehow regulation — although not all by itself — must address this, or so, for the first time in a long while, large numbers of Americans hope and desire.
People are now looking to the government — that ogre of the dying old order — as the only power resourceful and strong enough to direct the flow of capital where it’s needed rather than where the discredited overlords of the financial system think it may be most profitable. Conservatives, especially those who rightly balk at the mega-bailout now in the works as unfair to the American taxpayer, decry what they call financial socialism. But what then?
The Meaning of Retribution
As it did in 1929, the free market has failed beyond tolerance. Overwhelming popular sentiment (which each new poll registers with added vehemence) may, sooner or later, bring not only a full recognition of just how wrong-headed the country has been for how long, but how much in need it is of fresh institutions. New forms of public authority, closely overseen by the mechanisms of democracy rather than turned over to some autocrat on leave from his day job as an investment banker, might have a chance of doing what was once unthinkable: de-sanctifying private property and compelling it to perform in the general interest when its private misuse has placed us all in peril. The New Deal ventured in that direction. We need to venture further.
Here’s a first principle: Refuse to reward those institutions that have done us no service. If that entails their liquidation (to borrow a word from Andrew Mellon), so be it. The world won’t end, only the world as they have known it.
Let’s use what’s left of their grossly inflated assets to re-start the engines of real economic development. Compel investment in the re-industrialization of the country along lines that reward labor, not parasitism; end the reign of the sweatshop; rescue the country from environmental suicide; revise the division of wealth and income so we can all live free of the indecencies of lavish piggery; and insist that social responsibility take precedence over the bottom line.
Many will seek retribution as well, just as Americans used to do in the decades before the Great Depression. How could they not? That’s what happens when simple rage turns into moral outrage, when people are finally called to account for the damage they’ve done. The emotion fuels a chemical reaction even now at work in our cultural innards. It may prove the catalyst for an intellectual and emotional explosion that someday will add up to a genuine break with the past. It did so back in 1929.
However justifiable, cutting CEOs loose from the life-support systems they’ve used to drain corporate treasuries for decades is small potatoes. Do it, but let’s hope the instinct for retribution will be turned to better purposes — to, in fact, reintroducing into our political life and our economic behavior an ethos of social solidarity. Let’s see where that might take us. We could do much worse.
Copyright 2008 Steve Fraser