In brief, the Global War on Terror sledgehammer strategy has spread jihadi terror from a tiny corner of Afghanistan to much of the world, from Africa through the Levant and South Asia to Southeast Asia. It has also incited attacks in Europe and the United States. The invasion of Iraq made a substantial contribution to this process, much as intelligence agencies had predicted. Terrorism specialists Peter Bergen and Paul Cruickshank estimate that the Iraq War “generated a stunning sevenfold increase in the yearly rate of fatal jihadist attacks, amounting to literally hundreds of additional terrorist attacks and thousands of civilian lives lost; even when terrorism in Iraq and Afghanistan is excluded, fatal attacks in the rest of the world have increased by more than one-third.” Other exercises have been similarly productive.
A group of major human rights organizations — Physicians for Social Responsibility (U.S.), Physicians for Global Survival (Canada), and International Physicians for the Prevention of Nuclear War (Germany) — conducted a study that sought “to provide as realistic an estimate as possible of the total body count in the three main war zones [Iraq, Afghanistan, and Pakistan] during 12 years of ‘war on terrorism,’” including an extensive review “of the major studies and data published on the numbers of victims in these countries,” along with additional information on military actions. Their “conservative estimate” is that these wars killed about 1.3 million people, a toll that “could also be in excess of 2 million.” A database search by independent researcher David Peterson in the days following the publication of the report found virtually no mention of it. Who cares?
More generally, studies carried out by the Peace Research Institute Oslo (PRIO) show that two-thirds of the region’s conflict fatalities were produced in originally internal disputes where outsiders imposed their solutions. In such conflicts, 98% of fatalities were produced only after outsiders had entered the domestic dispute with their military might. In Syria, the number of direct conflict fatalities more than tripled after the West initiated air strikes against the self-declared Islamic State and the CIA started its indirect military interference in the war — interference which appears to have drawn the Russians in as advanced U.S. antitank missiles were decimating the forces of their ally Bashar al-Assad. Early indications are that Russian bombing is having the usual consequences.
The evidence reviewed by political scientist Timo Kivimäki indicates that the “protection wars [fought by ‘coalitions of the willing’] have become the main source of violence in the world, occasionally contributing over 50% of total conflict fatalities.” Furthermore, in many of these cases, including Syria, as he reviews, there were opportunities for diplomatic settlement that were ignored. That has also been true in other horrific situations, including the Balkans in the early 1990s, the first Gulf War, and of course the Indochina wars, the worst crime since World War II. In the case of Iraq the question does not even arise. There surely are some lessons here.
The general consequences of resorting to the sledgehammer against vulnerable societies come as little surprise. William Polk’s careful study of insurgencies, Violent Politics, should be essential reading for those who want to understand today’s conflicts, and surely for planners, assuming that they care about human consequences and not merely power and domination. Polk reveals a pattern that has been replicated over and over. The invaders — perhaps professing the most benign motives — are naturally disliked by the population, who disobey them, at first in small ways, eliciting a forceful response, which increases opposition and support for resistance. The cycle of violence escalates until the invaders withdraw — or gain their ends by something that may approach genocide.
Playing by the Al-Qaeda Game Plan
Obama’s global drone assassination campaign, a remarkable innovation in global terrorism, exhibits the same patterns. By most accounts, it is generating terrorists more rapidly than it is murdering those suspected of someday intending to harm us — an impressive contribution by a constitutional lawyer on the 800th anniversary of Magna Carta, which established the basis for the principle of presumption of innocence that is the foundation of civilized law.
Another characteristic feature of such interventions is the belief that the insurgency will be overcome by eliminating its leaders. But when such an effort succeeds, the reviled leader is regularly replaced by someone younger, more determined, more brutal, and more effective. Polk gives many examples. Military historian Andrew Cockburn has reviewed American campaigns to kill drug and then terror “kingpins” over a long period in his important study Kill Chain and found the same results. And one can expect with fair confidence that the pattern will continue.
No doubt right now U.S. strategists are seeking ways to murder the “Caliph of the Islamic State” Abu Bakr al-Baghdadi, who is a bitter rival of al-Qaeda leader Ayman al-Zawahiri. The likely result of this achievement is forecast by the prominent terrorism scholar Bruce Hoffman, senior fellow at the U.S. Military Academy’s Combating Terrorism Center. He predicts that “al-Baghdadi’s death would likely pave the way for a rapprochement [with al-Qaeda] producing a combined terrorist force unprecedented in scope, size, ambition and resources.”
Polk cites a treatise on warfare by Henry Jomini, influenced by Napoleon’s defeat at the hands of Spanish guerrillas, that became a textbook for generations of cadets at the West Point military academy. Jomini observed that such interventions by major powers typically result in “wars of opinion,” and nearly always “national wars,” if not at first then becoming so in the course of the struggle, by the dynamics that Polk describes. Jomini concludes that “commanders of regular armies are ill-advised to engage in such wars because they will lose them,” and even apparent successes will prove short-lived.
Careful studies of al-Qaeda and ISIS have shown that the United States and its allies are following their game plan with some precision. Their goal is to “draw the West as deeply and actively as possible into the quagmire” and “to perpetually engage and enervate the United States and the West in a series of prolonged overseas ventures” in which they will undermine their own societies, expend their resources, and increase the level of violence, setting off the dynamic that Polk reviews.
Scott Atran, one of the most insightful researchers on jihadi movements, calculates that “the 9/11 attacks cost between $400,000 and $500,000 to execute, whereas the military and security response by the U.S. and its allies is in the order of 10 million times that figure. On a strictly cost-benefit basis, this violent movement has been wildly successful, beyond even Bin Laden’s original imagination, and is increasingly so. Herein lies the full measure of jujitsu-style asymmetric warfare. After all, who could claim that we are better off than before, or that the overall danger is declining?”
And if we continue to wield the sledgehammer, tacitly following the jihadi script, the likely effect is even more violent jihadism with broader appeal. The record, Atran advises, “should inspire a radical change in our counter-strategies.”
Al-Qaeda/ISIS are assisted by Americans who follow their directives: for example, Ted “carpet-bomb ’em” Cruz, a top Republican presidential candidate. Or, at the other end of the mainstream spectrum, the leading Middle East and international affairs columnist of the New York Times, Thomas Friedman, who in 2003 offered Washington advice on how to fight in Iraq on the Charlie Rose show: “There was what I would call the terrorism bubble… And what we needed to do was to go over to that part of the world and burst that bubble. We needed to go over there basically, and, uh, take out a very big stick, right in the heart of that world, and burst that bubble. And there was only one way to do it… What they needed to see was American boys and girls going house to house from Basra to Baghdad, and basically saying, which part of this sentence don’t you understand? You don’t think we care about our open society, you think this bubble fantasy we’re going to just let it go? Well, suck on this. Ok. That, Charlie, was what this war was about.”
That’ll show the ragheads.
Atran and other close observers generally agree on the prescriptions. We should begin by recognizing what careful research has convincingly shown: those drawn to jihad “are longing for something in their history, in their traditions, with their heroes and their morals; and the Islamic State, however brutal and repugnant to us and even to most in the Arab-Muslim world, is speaking directly to that… What inspires the most lethal assailants today is not so much the Quran but a thrilling cause and a call to action that promises glory and esteem in the eyes of friends.” In fact, few of the jihadis have much of a background in Islamic texts or theology, if any.
The best strategy, Polk advises, would be “a multinational, welfare-oriented and psychologically satisfying program… that would make the hatred ISIS relies upon less virulent. The elements have been identified for us: communal needs, compensation for previous transgressions, and calls for a new beginning.” He adds, “A carefully phrased apology for past transgressions would cost little and do much.” Such a project could be carried out in refugee camps or in the “hovels and grim housing projects of the Paris banlieues,” where, Atran writes, his research team “found fairly wide tolerance or support for ISIS’s values.” And even more could be done by true dedication to diplomacy and negotiations instead of reflexive resort to violence.
Not least in significance would be an honorable response to the “refugee crisis” that was a long time in coming but surged to prominence in Europe in 2015. That would mean, at the very least, sharply increasing humanitarian relief to the camps in Lebanon, Jordan, and Turkey where miserable refugees from Syria barely survive. But the issues go well beyond, and provide a picture of the self-described “enlightened states” that is far from attractive and should be an incentive to action.
There are countries that generate refugees through massive violence, like the United States, secondarily Britain and France. Then there are countries that admit huge numbers of refugees, including those fleeing from Western violence, like Lebanon (easily the champion, per capita), Jordan, and Syria before it imploded, among others in the region. And partially overlapping, there are countries that both generate refugees and refuse to take them in, not only from the Middle East but also from the U.S. “backyard” south of the border. A strange picture, painful to contemplate.
An honest picture would trace the generation of refugees much further back into history. Veteran Middle East correspondent Robert Fisk reports that one of the first videos produced by ISIS “showed a bulldozer pushing down a rampart of sand that had marked the border between Iraq and Syria. As the machine destroyed the dirt revetment, the camera panned down to a handwritten poster lying in the sand. ‘End of Sykes-Picot,’ it said.”
For the people of the region, the Sykes-Picot agreement is the very symbol of the cynicism and brutality of Western imperialism. Conspiring in secret during World War I, Britain’s Mark Sykes and France’s François Georges-Picot carved up the region into artificial states to satisfy their own imperial goals, with utter disdain for the interests of the people living there and in violation of the wartime promises issued to induce Arabs to join the Allied war effort. The agreement mirrored the practices of the European states that devastated Africa in a similar manner. It “transformed what had been relatively quiet provinces of the Ottoman Empire into some of the least stable and most internationally explosive states in the world.”
Repeated Western interventions since then in the Middle East and Africa have exacerbated the tensions, conflicts, and disruptions that have shattered the societies. The end result is a “refugee crisis” that the innocent West can scarcely endure. Germany has emerged as the conscience of Europe, at first (but no longer) admitting almost one million refugees — in one of the richest countries in the world with a population of 80 million. In contrast, the poor country of Lebanon has absorbed an estimated 1.5 million Syrian refugees, now a quarter of its population, on top of half a million Palestinian refugees registered with the U.N. refugee agency UNRWA, mostly victims of Israeli policies.
Europe is also groaning under the burden of refugees from the countries it has devastated in Africa — not without U.S. aid, as Congolese and Angolans, among others, can testify. Europe is now seeking to bribe Turkey (with over two million Syrian refugees) to distance those fleeing the horrors of Syria from Europe’s borders, just as Obama is pressuring Mexico to keep U.S. borders free from miserable people seeking to escape the aftermath of Reagan’s GWOT, along with those seeking to escape more recent disasters, including a military coup in Honduras that Obama almost alone legitimized, which created one of the worst horror chambers in the region.
Words can hardly capture the U.S. response to the Syrian refugee crisis, at least any words I can think of.
Returning to the opening question “Who rules the world?” we might also want to pose another question: “What principles and values rule the world?” That question should be foremost in the minds of the citizens of the rich and powerful states, who enjoy an unusual legacy of freedom, privilege, and opportunity thanks to the struggles of those who came before them, and who now face fateful choices as to how to respond to challenges of great human import.
The Costs of Violence
[This piece, the first of two parts, is excerpted from Noam Chomsky’s new book, Who Rules the World? Part 2 will be posted on Tuesday morning.]
When we ask “Who rules the world?” we commonly adopt the standard convention that the actors in world affairs are states, primarily the great powers, and we consider their decisions and the relations among them. That is not wrong. But we would do well to keep in mind that this level of abstraction can also be highly misleading.
States of course have complex internal structures, and the choices and decisions of the political leadership are heavily influenced by internal concentrations of power, while the general population is often marginalized. That is true even for the more democratic societies, and obviously for others. We cannot gain a realistic understanding of who rules the world while ignoring the “masters of mankind,” as Adam Smith called them: in his day, the merchants and manufacturers of England; in ours, multinational conglomerates, huge financial institutions, retail empires, and the like. Still following Smith, it is also wise to attend to the “vile maxim” to which the “masters of mankind” are dedicated: “All for ourselves and nothing for other people” — a doctrine known otherwise as bitter and incessant class war, often one-sided, much to the detriment of the people of the home country and the world.
In the contemporary global order, the institutions of the masters hold enormous power, not only in the international arena but also within their home states, on which they rely to protect their power and to provide economic support by a wide variety of means. When we consider the role of the masters of mankind, we turn to such state policy priorities of the moment as the Trans-Pacific Partnership, one of the investor-rights agreements mislabeled “free-trade agreements” in propaganda and commentary. They are negotiated in secret, apart from the hundreds of corporate lawyers and lobbyists writing the crucial details. The intention is to have them adopted in good Stalinist style with “fast track” procedures designed to block discussion and allow only the choice of yes or no (hence yes). The designers regularly do quite well, not surprisingly. People are incidental, with the consequences one might anticipate.
The Second Superpower
The neoliberal programs of the past generation have concentrated wealth and power in far fewer hands while undermining functioning democracy, but they have aroused opposition as well, most prominently in Latin America but also in the centers of global power. The European Union (EU), one of the more promising developments of the post-World War II period, has been tottering because of the harsh effect of the policies of austerity during recession, condemned even by the economists of the International Monetary Fund (if not the IMF’s political actors). Democracy has been undermined as decision making shifted to the Brussels bureaucracy, with the northern banks casting their shadow over their proceedings.
Mainstream parties have been rapidly losing members to left and to right. The executive director of the Paris-based research group EuropaNova attributes the general disenchantment to “a mood of angry impotence as the real power to shape events largely shifted from national political leaders [who, in principle at least, are subject to democratic politics] to the market, the institutions of the European Union and corporations,” quite in accord with neoliberal doctrine. Very similar processes are under way in the United States, for somewhat similar reasons, a matter of significance and concern not just for the country but, because of U.S. power, for the world.
The rising opposition to the neoliberal assault highlights another crucial aspect of the standard convention: it sets aside the public, which often fails to accept the approved role of “spectators” (rather than “participants”) assigned to it in liberal democratic theory. Such disobedience has always been of concern to the dominant classes. Just keeping to American history, George Washington regarded the common people who formed the militias that he was to command as “an exceedingly dirty and nasty people [evincing] an unaccountable kind of stupidity in the lower class of these people.”
In Violent Politics, his masterful review of insurgencies from “the American insurgency” to contemporary Afghanistan and Iraq, William Polk concludes that General Washington “was so anxious to sideline [the fighters he despised] that he came close to losing the Revolution.” Indeed, he “might have actually done so” had France not massively intervened and “saved the Revolution,” which until then had been won by guerrillas — whom we would now call “terrorists” — while Washington’s British-style army “was defeated time after time and almost lost the war.”
A common feature of successful insurgencies, Polk records, is that once victory is won and popular support is no longer needed, the leadership suppresses the “dirty and nasty people” who actually won the war with guerrilla tactics and terror, for fear that they might challenge class privilege. The elites’ contempt for “the lower class of these people” has taken various forms throughout the years. In recent times one expression of this contempt is the call for passivity and obedience (“moderation in democracy”) by liberal internationalists reacting to the dangerous democratizing effects of the popular movements of the 1960s.
Sometimes states do choose to follow public opinion, eliciting much fury in centers of power. One dramatic case was in 2003, when the Bush administration called on Turkey to join its invasion of Iraq. Ninety-five percent of Turks opposed that course of action and, to the amazement and horror of Washington, the Turkish government adhered to their views. Turkey was bitterly condemned for this departure from responsible behavior. Deputy Secretary of Defense Paul Wolfowitz, designated by the press as the “idealist-in-chief” of the administration, berated the Turkish military for permitting the malfeasance of the government and demanded an apology. Unperturbed by these and innumerable other illustrations of our fabled “yearning for democracy,” respectable commentary continued to laud President George W. Bush for his dedication to “democracy promotion,” or sometimes criticized him for his naïveté in thinking that an outside power could impose its democratic yearnings on others.
The Turkish public was not alone. Global opposition to U.S.-UK aggression was overwhelming. Support for Washington’s war plans scarcely reached 10% almost anywhere, according to international polls. Opposition sparked huge worldwide protests, in the United States as well, probably the first time in history that imperial aggression was strongly protested even before it was officially launched. On the front page of the New York Times, journalist Patrick Tyler reported that “there may still be two superpowers on the planet: the United States and world public opinion.”
Unprecedented protest in the United States was a manifestation of the opposition to aggression that began decades earlier in the condemnation of the U.S. wars in Indochina, reaching a scale that was substantial and influential, even if far too late. By 1967, when the antiwar movement was becoming a significant force, military historian and Vietnam specialist Bernard Fall warned that “Vietnam as a cultural and historic entity… is threatened with extinction… [as] the countryside literally dies under the blows of the largest military machine ever unleashed on an area of this size.”
But the antiwar movement did become a force that could not be ignored. Nor could it be ignored when Ronald Reagan came into office determined to launch an assault on Central America. His administration mimicked closely the steps John F. Kennedy had taken 20 years earlier in launching the war against South Vietnam, but had to back off because of the kind of vigorous public protest that had been lacking in the early 1960s. The assault was awful enough. The victims have yet to recover. But what happened to South Vietnam and later all of Indochina, where “the second superpower” imposed its impediments only much later in the conflict, was incomparably worse.
It is often argued that the enormous public opposition to the invasion of Iraq had no effect. That seems incorrect to me. Again, the invasion was horrifying enough, and its aftermath is utterly grotesque. Nevertheless, it could have been far worse. Vice President Dick Cheney, Secretary of Defense Donald Rumsfeld, and the rest of Bush’s top officials could never even contemplate the sort of measures that President Kennedy and President Lyndon Johnson adopted 40 years earlier largely without protest.
Western Power Under Pressure
There is far more to say, of course, about the factors in determining state policy that are put to the side when we adopt the standard convention that states are the actors in international affairs. But with such nontrivial caveats as these, let us nevertheless adopt the convention, at least as a first approximation to reality. Then the question of who rules the world leads at once to such concerns as China’s rise to power and its challenge to the United States and “world order,” the new cold war simmering in eastern Europe, the Global War on Terror, American hegemony and American decline, and a range of similar considerations.
The challenges faced by Western power at the outset of 2016 are usefully summarized within the conventional framework by Gideon Rachman, chief foreign-affairs columnist for the London Financial Times. He begins by reviewing the Western picture of world order: “Ever since the end of the Cold War, the overwhelming power of the U.S. military has been the central fact of international politics.” This is particularly crucial in three regions: East Asia, where “the U.S. Navy has become used to treating the Pacific as an ‘American lake’”; Europe, where NATO — meaning the United States, which “accounts for a staggering three-quarters of NATO’s military spending” — “guarantees the territorial integrity of its member states”; and the Middle East, where giant U.S. naval and air bases “exist to reassure friends and to intimidate rivals.”
The problem of world order today, Rachman continues, is that “these security orders are now under challenge in all three regions” because of Russian intervention in Ukraine and Syria, and because of China turning its nearby seas from an American lake to “clearly contested water.” The fundamental question of international relations, then, is whether the United States should “accept that other major powers should have some kind of zone of influence in their neighborhoods.” Rachman thinks it should, for reasons of “diffusion of economic power around the world — combined with simple common sense.”
There are, to be sure, ways of looking at the world from different standpoints. But let us keep to these three regions, surely critically important ones.
The Challenges Today: East Asia
Beginning with the “American lake,” some eyebrows might be raised over the report in mid-December 2015 that “an American B-52 bomber on a routine mission over the South China Sea unintentionally flew within two nautical miles of an artificial island built by China, senior defense officials said, exacerbating a hotly divisive issue for Washington and Beijing.” Those familiar with the grim record of the 70 years of the nuclear weapons era will be all too aware that this is the kind of incident that has often come perilously close to igniting terminal nuclear war. One need not be a supporter of China’s provocative and aggressive actions in the South China Sea to notice that the incident did not involve a Chinese nuclear-capable bomber in the Caribbean, or off the coast of California, where China has no pretensions of establishing a “Chinese lake.” Luckily for the world.
Chinese leaders understand very well that their country’s maritime trade routes are ringed with hostile powers from Japan through the Malacca Straits and beyond, backed by overwhelming U.S. military force. Accordingly, China is proceeding to expand westward with extensive investments and careful moves toward integration. In part, these developments are within the framework of the Shanghai Cooperation Organization (SCO), which includes the Central Asian states and Russia, and soon India and Pakistan with Iran as one of the observers — a status that was denied to the United States, which was also called on to close all military bases in the region. China is constructing a modernized version of the old silk roads, with the intent not only of integrating the region under Chinese influence, but also of reaching Europe and the Middle Eastern oil-producing regions. It is pouring huge sums into creating an integrated Asian energy and commercial system, with extensive high-speed rail lines and pipelines.
One element of the program is a highway through some of the world’s tallest mountains to the new Chinese-developed port of Gwadar in Pakistan, which will protect oil shipments from potential U.S. interference. The program may also, China and Pakistan hope, spur industrial development in Pakistan, which the United States has not undertaken despite massive military aid, and might also provide an incentive for Pakistan to clamp down on domestic terrorism, a serious issue for China in western Xinjiang Province. Gwadar will be part of China’s “string of pearls,” bases being constructed in the Indian Ocean for commercial purposes but potentially also for military use, with the expectation that China might someday be able to project power as far as the Persian Gulf for the first time in the modern era.
All of these moves remain immune to Washington’s overwhelming military power, short of annihilation by nuclear war, which would destroy the United States as well.
In 2015, China also established the Asian Infrastructure Investment Bank (AIIB), with itself as the main shareholder. Fifty-six nations participated in the opening in Beijing in June, including U.S. allies Australia, Britain, and others which joined in defiance of Washington’s wishes. The United States and Japan were absent. Some analysts believe that the new bank might turn out to be a competitor to the Bretton Woods institutions (the IMF and the World Bank), in which the United States holds veto power. There are also some expectations that the SCO might eventually become a counterpart to NATO.
The Challenges Today: Eastern Europe
Turning to the second region, Eastern Europe, there is a crisis brewing at the NATO-Russian border. It is no small matter. In his illuminating and judicious scholarly study of the region, Frontline Ukraine: Crisis in the Borderlands, Richard Sakwa writes — all too plausibly — that the “Russo-Georgian war of August 2008 was in effect the first of the ‘wars to stop NATO enlargement’; the Ukraine crisis of 2014 is the second. It is not clear whether humanity would survive a third.”
The West sees NATO enlargement as benign. Not surprisingly, Russia, along with much of the Global South, has a different opinion, as do some prominent Western voices. George Kennan warned early on that NATO enlargement is a “tragic mistake,” and he was joined by senior American statesmen in an open letter to the White House describing it as a “policy error of historic proportions.”
The present crisis has its origins in 1991, with the end of the Cold War and the collapse of the Soviet Union. There were then two contrasting visions of a new security system and political economy in Eurasia. In Sakwa’s words, one vision was of a “‘Wider Europe,’ with the EU at its heart but increasingly coterminous with the Euro-Atlantic security and political community; and on the other side there [was] the idea of ‘Greater Europe,’ a vision of a continental Europe, stretching from Lisbon to Vladivostok, that has multiple centers, including Brussels, Moscow and Ankara, but with a common purpose in overcoming the divisions that have traditionally plagued the continent.”
Soviet leader Mikhail Gorbachev was the major proponent of Greater Europe, a concept that also had European roots in Gaullism and other initiatives. However, as Russia collapsed under the devastating market reforms of the 1990s, the vision faded, only to be renewed as Russia began to recover and seek a place on the world stage under Vladimir Putin, who, along with his associate Dmitry Medvedev, has repeatedly “called for the geopolitical unification of all of ‘Greater Europe’ from Lisbon to Vladivostok, to create a genuine ‘strategic partnership.’”
These initiatives were “greeted with polite contempt,” Sakwa writes, regarded as “little more than a cover for the establishment of a ‘Greater Russia’ by stealth” and an effort to “drive a wedge” between North America and Western Europe. Such concerns trace back to earlier Cold War fears that Europe might become a “third force” independent of both the great and minor superpowers and moving toward closer links to the latter (as can be seen in Willy Brandt’s Ostpolitik and other initiatives).
The Western response to Russia’s collapse was triumphalist. It was hailed as signaling “the end of history,” the final victory of Western capitalist democracy, almost as if Russia were being instructed to revert to its pre-World War I status as a virtual economic colony of the West. NATO enlargement began at once, in violation of verbal assurances to Gorbachev that NATO forces would not move “one inch to the east” after he agreed that a unified Germany could become a NATO member — a remarkable concession, in the light of history. That discussion kept to East Germany. The possibility that NATO might expand beyond Germany was not discussed with Gorbachev, even if privately considered.
Soon, NATO did begin to move beyond, right to the borders of Russia. The general mission of NATO was officially changed to a mandate to protect “crucial infrastructure” of the global energy system, sea lanes and pipelines, giving it a global area of operations. Furthermore, under a crucial Western revision of the now widely heralded doctrine of “responsibility to protect,” sharply different from the official U.N. version, NATO may now also serve as an intervention force under U.S. command.
Of particular concern to Russia are plans to expand NATO to Ukraine. These plans were articulated explicitly at the Bucharest NATO summit of April 2008, when Georgia and Ukraine were promised eventual membership in NATO. The wording was unambiguous: “NATO welcomes Ukraine’s and Georgia’s Euro-Atlantic aspirations for membership in NATO. We agreed today that these countries will become members of NATO.” With the “Orange Revolution” victory of pro-Western candidates in Ukraine in 2004, State Department representative Daniel Fried rushed there and “emphasized U.S. support for Ukraine’s NATO and Euro-Atlantic aspirations,” as a WikiLeaks report revealed.
Russia’s concerns are easily understandable. They are outlined by international relations scholar John Mearsheimer in the leading U.S. establishment journal, Foreign Affairs. He writes that “the taproot of the current crisis [over Ukraine] is NATO expansion and Washington’s commitment to move Ukraine out of Moscow’s orbit and integrate it into the West,” which Putin viewed as “a direct threat to Russia’s core interests.”
“Who can blame him?” Mearsheimer asks, pointing out that “Washington may not like Moscow’s position, but it should understand the logic behind it.” That should not be too difficult. After all, as everyone knows, “The United States does not tolerate distant great powers deploying military forces anywhere in the Western hemisphere, much less on its borders.”
In fact, the U.S. stand is far stronger. It does not tolerate what is officially called “successful defiance” of the Monroe Doctrine of 1823, which declared (but could not yet implement) U.S. control of the hemisphere. And a small country that carries out such successful defiance may be subjected to “the terrors of the earth” and a crushing embargo — as happened to Cuba. We need not ask how the United States would have reacted had the countries of Latin America joined the Warsaw Pact, with plans for Mexico and Canada to join as well. The merest hint of the first tentative steps in that direction would have been “terminated with extreme prejudice,” to adopt CIA lingo.
As in the case of China, one does not have to regard Putin’s moves and motives favorably to understand the logic behind them, nor to grasp the importance of understanding that logic instead of issuing imprecations against it. As in the case of China, a great deal is at stake, reaching as far — literally — as questions of survival.
The Challenges Today: The Islamic World
Let us turn to the third region of major concern, the (largely) Islamic world, also the scene of the Global War on Terror (GWOT) that George W. Bush declared in 2001 after the 9/11 terrorist attack. To be more accurate, re-declared. The GWOT was declared by the Reagan administration when it took office, with fevered rhetoric about a “plague spread by depraved opponents of civilization itself” (as Reagan put it) and a “return to barbarism in the modern age” (the words of George Shultz, his secretary of state). The original GWOT has been quietly removed from history. It very quickly turned into a murderous and destructive terrorist war afflicting Central America, southern Africa, and the Middle East, with grim repercussions to the present, even leading to condemnation of the United States by the World Court (which Washington dismissed). In any event, it is not the right story for history, so it is gone.
The success of the Bush-Obama version of GWOT can readily be evaluated on direct inspection. When the war was declared, the terrorist targets were confined to a small corner of tribal Afghanistan. They were protected by Afghans, who mostly disliked or despised them, under the tribal code of hospitality — which baffled Americans when poor peasants refused “to turn over Osama bin Laden for the, to them, astronomical sum of $25 million.”
There are good reasons to believe that a well-constructed police action, or even serious diplomatic negotiations with the Taliban, might have placed those suspected of the 9/11 crimes in American hands for trial and sentencing. But such options were off the table. Instead, the reflexive choice was large-scale violence — not with the goal of overthrowing the Taliban (that came later) but to make clear U.S. contempt for tentative Taliban offers of the possible extradition of bin Laden. How serious these offers were we do not know, since the possibility of exploring them was never entertained. Or perhaps the United States was just intent on “trying to show its muscle, score a victory and scare everyone in the world. They don’t care about the suffering of the Afghans or how many people we will lose.”
That was the judgment of the highly respected anti-Taliban leader Abdul Haq, one of the many oppositionists who condemned the American bombing campaign launched in October 2001 as “a big setback” for their efforts to overthrow the Taliban from within, a goal they considered within their reach. His judgment is confirmed by Richard A. Clarke, who was chairman of the Counterterrorism Security Group at the White House under President George W. Bush when the plans to attack Afghanistan were made. As Clarke describes the meeting, when informed that the attack would violate international law, “the President yelled in the narrow conference room, ‘I don’t care what the international lawyers say, we are going to kick some ass.'” The attack was also bitterly opposed by the major aid organizations working in Afghanistan, who warned that millions were on the verge of starvation and that the consequences might be horrendous.
The consequences for poor Afghanistan years later need hardly be reviewed.
The next target of the sledgehammer was Iraq. The U.S.-UK invasion, utterly without credible pretext, is the major crime of the twenty-first century. The invasion led to the death of hundreds of thousands of people in a country where the civilian society had already been devastated by American and British sanctions that were regarded as “genocidal” by the two distinguished international diplomats who administered them, both of whom resigned in protest for this reason. The invasion also generated millions of refugees, largely destroyed the country, and instigated a sectarian conflict that is now tearing apart Iraq and the entire region. It is an astonishing fact about our intellectual and moral culture that in informed and enlightened circles it can be called, blandly, “the liberation of Iraq.”
Pentagon and British Ministry of Defense polls found that only 3% of Iraqis regarded the U.S. security role in their neighborhood as legitimate, less than 1% believed that “coalition” (U.S.-UK) forces were good for their security, 80% opposed the presence of coalition forces in the country, and a majority supported attacks on coalition troops. Afghanistan has been destroyed beyond the possibility of reliable polling, but there are indications that something similar may be true there as well. Particularly in Iraq the United States suffered a severe defeat, abandoning its official war aims, and leaving the country under the influence of the sole victor, Iran.
The sledgehammer was also wielded elsewhere, notably in Libya, where the three traditional imperial powers (Britain, France, and the United States) procured Security Council resolution 1973 and instantly violated it, becoming the air force of the rebels. The effect was to undercut the possibility of a peaceful, negotiated settlement; sharply increase casualties (by at least a factor of 10, according to political scientist Alan Kuperman); leave Libya in ruins, in the hands of warring militias; and, more recently, to provide the Islamic State with a base that it can use to spread terror beyond. Quite sensible diplomatic proposals by the African Union, accepted in principle by Libya’s Muammar Qaddafi, were ignored by the imperial triumvirate, as Africa specialist Alex de Waal reviews. A huge flow of weapons and jihadis has spread terror and violence from West Africa (now the champion for terrorist murders) to the Levant, while the NATO attack also sent a flood of refugees from Africa to Europe.
Yet another triumph of “humanitarian intervention,” and, as the long and often ghastly record reveals, not an unusual one, going back to its modern origins four centuries ago.
Copyright 2016 Noam Chomsky
American Power Under Challenge
Throughout the world there is great relief and optimism about the nuclear deal reached in Vienna between Iran and the P5+1 nations, the five veto-holding members of the U.N. Security Council and Germany. Most of the world apparently shares the assessment of the U.S. Arms Control Association that “the Joint Comprehensive Plan of Action establishes a strong and effective formula for blocking all of the pathways by which Iran could acquire material for nuclear weapons for more than a generation and a verification system to promptly detect and deter possible efforts by Iran to covertly pursue nuclear weapons that will last indefinitely.”
There are, however, striking exceptions to the general enthusiasm: the United States and its closest regional allies, Israel and Saudi Arabia. One consequence of this is that U.S. corporations, much to their chagrin, are prevented from flocking to Tehran along with their European counterparts. Prominent sectors of U.S. power and opinion share the stand of the two regional allies and so are in a state of virtual hysteria over “the Iranian threat.” Sober commentary in the United States, pretty much across the spectrum, declares that country to be “the gravest threat to world peace.” Even supporters of the agreement here are wary, given the exceptional gravity of that threat. After all, how can we trust the Iranians with their terrible record of aggression, violence, disruption, and deceit?
Opposition within the political class is so strong that public opinion has shifted quickly from significant support for the deal to an even split. Republicans are almost unanimously opposed to the agreement. The current Republican primaries illustrate the proclaimed reasons. Senator Ted Cruz, considered one of the intellectuals among the crowded field of presidential candidates, warns that Iran may still be able to produce nuclear weapons and could someday use one to set off an electromagnetic pulse that “would take down the electrical grid of the entire eastern seaboard” of the United States, killing “tens of millions of Americans.”
The two most likely winners, former Florida Governor Jeb Bush and Wisconsin Governor Scott Walker, are battling over whether to bomb Iran immediately after being elected or after the first Cabinet meeting. The one candidate with some foreign policy experience, Lindsey Graham, describes the deal as “a death sentence for the state of Israel,” which will certainly come as a surprise to Israeli intelligence and strategic analysts — and which Graham knows to be utter nonsense, raising immediate questions about actual motives.
Keep in mind that the Republicans long ago abandoned the pretense of functioning as a normal congressional party. They have, as respected conservative political commentator Norman Ornstein of the right-wing American Enterprise Institute observed, become a “radical insurgency” that scarcely seeks to participate in normal congressional politics.
Since the days of President Ronald Reagan, the party leadership has plunged so far into the pockets of the very rich and the corporate sector that they can attract votes only by mobilizing parts of the population that have not previously been an organized political force. Among them are extremist evangelical Christians, now probably a majority of Republican voters; remnants of the former slave-holding states; nativists who are terrified that “they” are taking our white Christian Anglo-Saxon country away from us; and others who turn the Republican primaries into spectacles remote from the mainstream of modern society — though not from the mainstream of the most powerful country in world history.
The departure from global standards, however, goes far beyond the bounds of the Republican radical insurgency. Across the spectrum, there is, for instance, general agreement with the “pragmatic” conclusion of General Martin Dempsey, chairman of the Joint Chiefs of Staff, that the Vienna deal does not “prevent the United States from striking Iranian facilities if officials decide that it is cheating on the agreement,” even though a unilateral military strike is “far less likely” if Iran behaves.
Former Clinton and Obama Middle East negotiator Dennis Ross typically recommends that “Iran must have no doubts that if we see it moving towards a weapon, that would trigger the use of force” even after the termination of the deal, when Iran is theoretically free to do what it wants. In fact, the existence of a termination point 15 years hence is, he adds, “the greatest single problem with the agreement.” He also suggests that the U.S. provide Israel with specially outfitted B-52 bombers and bunker-busting bombs to protect itself before that terrifying date arrives.
“The Greatest Threat”
Opponents of the nuclear deal charge that it does not go far enough. Some supporters agree, holding that “if the Vienna deal is to mean anything, the whole of the Middle East must rid itself of weapons of mass destruction.” The author of those words, Iran’s Minister of Foreign Affairs Javad Zarif, added that “Iran, in its national capacity and as current chairman of the Non-Aligned Movement [the governments of the large majority of the world’s population], is prepared to work with the international community to achieve these goals, knowing full well that, along the way, it will probably run into many hurdles raised by the skeptics of peace and diplomacy.” Iran has signed “a historic nuclear deal,” he continues, and now it is the turn of Israel, “the holdout.”
Israel, of course, is one of the three nuclear powers, along with India and Pakistan, whose weapons programs have been abetted by the United States and that refuse to sign the Nonproliferation Treaty (NPT).
Zarif was referring to the regular five-year NPT review conference, which ended in failure in April when the U.S. (joined by Canada and Great Britain) once again blocked efforts to move toward a weapons-of-mass-destruction-free zone in the Middle East. Such efforts have been led by Egypt and other Arab states for 20 years. As Jayantha Dhanapala and Sergio Duarte, leading figures in the promotion of such efforts at the NPT and other U.N. agencies, observe in “Is There a Future for the NPT?,” an article in the journal of the Arms Control Association: “The successful adoption in 1995 of the resolution on the establishment of a zone free of weapons of mass destruction (WMD) in the Middle East was the main element of a package that permitted the indefinite extension of the NPT.” The NPT, in turn, is the most important arms control treaty of all. If it were adhered to, it could end the scourge of nuclear weapons.
Repeatedly, implementation of the resolution has been blocked by the U.S., most recently by President Obama in 2010 and again in 2015, as Dhanapala and Duarte point out, “on behalf of a state that is not a party to the NPT and is widely believed to be the only one in the region possessing nuclear weapons” — a polite and understated reference to Israel. This failure, they hope, “will not be the coup de grâce to the two longstanding NPT objectives of accelerated progress on nuclear disarmament and establishing a Middle Eastern WMD-free zone.”
A nuclear-weapons-free Middle East would be a straightforward way to address whatever threat Iran allegedly poses, but a great deal more is at stake in Washington’s continuing sabotage of the effort in order to protect its Israeli client. After all, this is not the only case in which opportunities to end the alleged Iranian threat have been undermined by Washington, raising further questions about just what is actually at stake.
In considering this matter, it is instructive to examine both the unspoken assumptions in the situation and the questions that are rarely asked. Let us consider a few of these assumptions, beginning with the most serious: that Iran is the gravest threat to world peace.
In the U.S., it is a virtual cliché among high officials and commentators that Iran wins that grim prize. There is also a world outside the U.S. and although its views are not reported in the mainstream here, perhaps they are of some interest. According to the leading western polling agencies (WIN/Gallup International), the prize for “greatest threat” is won by the United States. The rest of the world regards it as the gravest threat to world peace by a large margin. In second place, far below, is Pakistan, its ranking probably inflated by the Indian vote. Iran is ranked below those two, along with China, Israel, North Korea, and Afghanistan.
“The World’s Leading Supporter of Terrorism”
Turning to the next obvious question, what in fact is the Iranian threat? Why, for example, are Israel and Saudi Arabia trembling in fear over that country? Whatever the threat is, it can hardly be military. Years ago, U.S. intelligence informed Congress that Iran has very low military expenditures by the standards of the region and that its strategic doctrines are defensive — designed, that is, to deter aggression. The U.S. intelligence community has also reported that it has no evidence Iran is pursuing an actual nuclear weapons program and that “Iran’s nuclear program and its willingness to keep open the possibility of developing nuclear weapons is a central part of its deterrent strategy.”
The authoritative SIPRI review of global armaments ranks the U.S., as usual, way in the lead in military expenditures. China comes in second with about one-third of U.S. expenditures. Far below are Russia and Saudi Arabia, which are nonetheless well above any western European state. Iran is scarcely mentioned. Full details are provided in an April report from the Center for Strategic and International Studies (CSIS), which finds “a conclusive case that the Arab Gulf states have… an overwhelming advantage of Iran in both military spending and access to modern arms.”
Iran’s military spending, for instance, is a fraction of Saudi Arabia’s and far below even the spending of the United Arab Emirates (UAE). Altogether, the Gulf Cooperation Council states — Bahrain, Kuwait, Oman, Qatar, Saudi Arabia, and the UAE — outspend Iran on arms by a factor of eight, an imbalance that goes back decades. The CSIS report adds: “The Arab Gulf states have acquired and are acquiring some of the most advanced and effective weapons in the world [while] Iran has essentially been forced to live in the past, often relying on systems originally delivered at the time of the Shah.” In other words, they are virtually obsolete. When it comes to Israel, of course, the imbalance is even greater. Possessing the most advanced U.S. weaponry and a virtual offshore military base for the global superpower, it also has a huge stock of nuclear weapons.
To be sure, Israel faces the “existential threat” of Iranian pronouncements: Supreme Leader Khamenei and former president Mahmoud Ahmadinejad famously threatened it with destruction. Except that they didn’t — and if they had, it would be of little moment. Ahmadinejad, for instance, predicted that “under God’s grace [the Zionist regime] will be wiped off the map.” In other words, he hoped that regime change would someday take place. Even that falls far short of the direct calls in both Washington and Tel Aviv for regime change in Iran, not to speak of the actions taken to implement regime change. These, of course, go back to the actual “regime change” of 1953, when the U.S. and Britain organized a military coup to overthrow Iran’s parliamentary government and install the dictatorship of the Shah, who proceeded to amass one of the worst human rights records on the planet.
These crimes were certainly known to readers of the reports of Amnesty International and other human rights organizations, but not to readers of the U.S. press, which has devoted plenty of space to Iranian human rights violations — but only since 1979 when the Shah’s regime was overthrown. (To check the facts on this, read The U.S. Press and Iran, a carefully documented study by Mansour Farhang and William Dorman.)
None of this is a departure from the norm. The United States, as is well known, holds the world championship title in regime change and Israel is no laggard either. The most destructive of its invasions of Lebanon in 1982 was explicitly aimed at regime change, as well as at securing its hold on the occupied territories. The pretexts offered were thin indeed and collapsed at once. That, too, is not unusual and pretty much independent of the nature of the society — from the laments in the Declaration of Independence about the “merciless Indian savages” to Hitler’s defense of Germany from the “wild terror” of the Poles.
No serious analyst believes that Iran would ever use, or even threaten to use, a nuclear weapon if it had one, and so face instant destruction. There is, however, real concern that a nuclear weapon might fall into jihadi hands — not thanks to Iran, but via U.S. ally Pakistan. In the journal of the Royal Institute of International Affairs, two leading Pakistani nuclear scientists, Pervez Hoodbhoy and Zia Mian, write that increasing fears of “militants seizing nuclear weapons or materials and unleashing nuclear terrorism [have led to]… the creation of a dedicated force of over 20,000 troops to guard nuclear facilities. There is no reason to assume, however, that this force would be immune to the problems associated with the units guarding regular military facilities,” which have frequently suffered attacks with “insider help.” In brief, the problem is real, just displaced to Iran thanks to fantasies concocted for other reasons.
Other concerns about the Iranian threat include its role as “the world’s leading supporter of terrorism,” which primarily refers to its support for Hezbollah and Hamas. Both of those movements emerged in resistance to U.S.-backed Israeli violence and aggression, which vastly exceeds anything attributed to these villains, let alone the normal practice of the hegemonic power whose global drone assassination campaign alone dominates (and helps to foster) international terrorism.
Those two villainous Iranian clients also share the crime of winning the popular vote in the only free elections in the Arab world. Hezbollah is guilty of the even more heinous crime of compelling Israel to withdraw from its occupation of southern Lebanon, which took place in violation of U.N. Security Council orders dating back decades and involved an illegal regime of terror and sometimes extreme violence. Whatever one thinks of Hezbollah, Hamas, or other beneficiaries of Iranian support, Iran hardly ranks high in support of terror worldwide.
Another concern, voiced at the U.N. by U.S. Ambassador Samantha Power, is the “instability that Iran fuels beyond its nuclear program.” The U.S. will continue to scrutinize this misbehavior, she declared. In that, she echoed the assurance Defense Secretary Ashton Carter offered while standing on Israel’s northern border that “we will continue to help Israel counter Iran’s malign influence” in supporting Hezbollah, and that the U.S. reserves the right to use military force against Iran as it deems appropriate.
The way Iran “fuels instability” can be seen particularly dramatically in Iraq where, among other crimes, it alone at once came to the aid of Kurds defending themselves from the invasion of Islamic State militants, even as it is building a $2.5 billion power plant in the southern port city of Basra to try to bring electrical power back to the level reached before the 2003 invasion. Ambassador Power’s usage is, however, standard: Thanks to that invasion, hundreds of thousands were killed and millions of refugees generated, barbarous acts of torture were committed — Iraqis have compared the destruction to the Mongol invasion of the thirteenth century — leaving Iraq the unhappiest country in the world according to WIN/Gallup polls. Meanwhile, sectarian conflict was ignited, tearing the region to shreds and laying the basis for the creation of the monstrosity that is ISIS. And all of that is called “stabilization.”
Only Iran’s shameful actions, however, “fuel instability.” The standard usage sometimes reaches levels that are almost surreal, as when liberal commentator James Chace, former editor of Foreign Affairs, explained that the U.S. sought to “destabilize a freely elected Marxist government in Chile” because “we were determined to seek stability” under the Pinochet dictatorship.
Others are outraged that Washington should negotiate at all with a “contemptible” regime like Iran’s with its horrifying human rights record and urge instead that we pursue “an American-sponsored alliance between Israel and the Sunni states.” So writes Leon Wieseltier, contributing editor to the venerable liberal journal the Atlantic, who can barely conceal his visceral hatred for all things Iranian. With a straight face, this respected liberal intellectual recommends that Saudi Arabia, which makes Iran look like a virtual paradise, and Israel, with its vicious crimes in Gaza and elsewhere, should ally to teach that country good behavior. Perhaps the recommendation is not entirely unreasonable when we consider the human rights records of the regimes the U.S. has imposed and supported throughout the world.
Though the Iranian government is no doubt a threat to its own people, it regrettably breaks no records in this regard, not descending to the level of favored U.S. allies. That, however, cannot be the concern of Washington, and surely not Tel Aviv or Riyadh.
It might also be useful to recall — surely Iranians do — that not a day has passed since 1953 in which the U.S. was not harming Iranians. After all, as soon as they overthrew the hated U.S.-imposed regime of the Shah in 1979, Washington put its support behind Iraqi leader Saddam Hussein, who would, in 1980, launch a murderous assault on their country. President Reagan went so far as to deny Saddam’s major crime, his chemical warfare assault on Iraq’s Kurdish population, which he blamed on Iran instead. When Saddam was tried for crimes under U.S. auspices, that horrendous crime, as well as others in which the U.S. was complicit, was carefully excluded from the charges, which were restricted to one of his minor crimes, the murder of 148 Shi’ites in 1982, a footnote to his gruesome record.
Saddam was such a valued friend of Washington that he was even granted a privilege otherwise accorded only to Israel. In 1987, his forces were allowed to attack a U.S. naval vessel, the USS Stark, with impunity, killing 37 crewmen. (Israel had acted similarly in its 1967 attack on the USS Liberty.) Iran pretty much conceded defeat shortly after, when the U.S. launched Operation Praying Mantis against Iranian ships and oil platforms in Iranian territorial waters. That operation culminated when the USS Vincennes, under no credible threat, shot down an Iranian civilian airliner in Iranian airspace, with 290 killed — and the subsequent granting of a Legion of Merit award to the commander of the Vincennes for “exceptionally meritorious conduct” and for maintaining a “calm and professional atmosphere” during the period when the attack on the airliner took place. Comments philosopher Thill Raghu, “We can only stand in awe of such display of American exceptionalism!”
After the war ended, the U.S. continued to support Saddam Hussein, Iran’s primary enemy. President George H.W. Bush even invited Iraqi nuclear engineers to the U.S. for advanced training in weapons production, an extremely serious threat to Iran. Sanctions against that country were intensified, including against foreign firms dealing with it, and actions were initiated to bar it from the international financial system.
In recent years the hostility has extended to sabotage, the murder of nuclear scientists (presumably by Israel), and cyberwar, openly proclaimed with pride. The Pentagon regards cyberwar as an act of war, justifying a military response, as does NATO, which affirmed in September 2014 that cyber attacks may trigger the collective defense obligations of the NATO powers — when we are the target, that is, not the perpetrators.
“The Prime Rogue State”
It is only fair to add that there have been breaks in this pattern. President George W. Bush, for example, offered several significant gifts to Iran by destroying its major enemies, Saddam Hussein and the Taliban. He even placed Iran’s Iraqi enemy under its influence after the U.S. defeat, which was so severe that Washington had to abandon its officially declared goals of establishing permanent military bases (“enduring camps”) and ensuring that U.S. corporations would have privileged access to Iraq’s vast oil resources.
Do Iranian leaders intend to develop nuclear weapons today? We can decide for ourselves how credible their denials are, but that they had such intentions in the past is beyond question. After all, it was asserted openly on the highest authority and foreign journalists were informed that Iran would develop nuclear weapons “certainly, and sooner than one thinks.” The father of Iran’s nuclear energy program and former head of Iran’s Atomic Energy Organization was confident that the leadership’s plan “was to build a nuclear bomb.” The CIA also reported that it had “no doubt” Iran would develop nuclear weapons if neighboring countries did (as they have).
All of this was, of course, under the Shah, the “highest authority” just quoted and at a time when top U.S. officials — Dick Cheney, Donald Rumsfeld, and Henry Kissinger, among others — were urging him to proceed with his nuclear programs and pressuring universities to accommodate these efforts. Under such pressures, my own university, MIT, made a deal with the Shah to admit Iranian students to the nuclear engineering program in return for grants he offered and over the strong objections of the student body, but with comparably strong faculty support (in a meeting that older faculty will doubtless remember well).
Asked later why he supported such programs under the Shah but opposed them more recently, Kissinger responded honestly that Iran was an ally then.
Putting aside absurdities, what is the real threat of Iran that inspires such fear and fury? A natural place to turn for an answer is, again, U.S. intelligence. Recall its analysis that Iran poses no military threat, that its strategic doctrines are defensive, and that its nuclear programs (with no effort to produce bombs, as far as can be determined) are “a central part of its deterrent strategy.”
Who, then, would be concerned by an Iranian deterrent? The answer is plain: the rogue states that rampage in the region and do not want to tolerate any impediment to their reliance on aggression and violence. In the lead in this regard are the U.S. and Israel, with Saudi Arabia trying its best to join the club with its invasion of Bahrain (to support the crushing of a reform movement there) and now its murderous assault on Yemen, accelerating a growing humanitarian catastrophe in that country.
For the United States, the characterization is familiar. Fifteen years ago, the prominent political analyst Samuel Huntington, professor of the science of government at Harvard, warned in the establishment journal Foreign Affairs that for much of the world the U.S. was “becoming the rogue superpower… the single greatest external threat to their societies.” Shortly after, his words were echoed by Robert Jervis, the president of the American Political Science Association: “In the eyes of much of the world, in fact, the prime rogue state today is the United States.” As we have seen, global opinion supports this judgment by a substantial margin.
Furthermore, the mantle is worn with pride. That is the clear meaning of the insistence of the political class that the U.S. reserves the right to resort to force if it unilaterally determines that Iran is violating some commitment. This policy is of long standing, especially for liberal Democrats, and by no means restricted to Iran. The Clinton Doctrine, for instance, confirmed that the U.S. was entitled to resort to the “unilateral use of military power” even to ensure “uninhibited access to key markets, energy supplies, and strategic resources,” let alone alleged “security” or “humanitarian” concerns. Adherence to various versions of this doctrine has been well confirmed in practice, as need hardly be discussed among people willing to look at the facts of current history.
These are among the critical matters that should be the focus of attention in analyzing the nuclear deal at Vienna, whether it stands or is sabotaged by Congress, as it may well be.
Noam Chomsky is institute professor emeritus in the Department of Linguistics and Philosophy at Massachusetts Institute of Technology. A TomDispatch regular, among his recent books are Hegemony or Survival, Failed States, Power Systems, Hopes and Prospects, and Masters of Mankind. Haymarket Books recently reissued twelve of his classic books in new editions. His website is www.chomsky.info.
Copyright 2015 Noam Chomsky
“The Iranian Threat”
On August 26th, Israel and the Palestinian Authority (PA) both accepted a ceasefire agreement after a 50-day Israeli assault on Gaza that left 2,100 Palestinians dead and vast landscapes of destruction behind. The agreement calls for an end to military action by both Israel and Hamas, as well as an easing of the Israeli siege that has strangled Gaza for many years.
This is, however, just the most recent of a series of ceasefire agreements reached after each of Israel's periodic escalations of its unremitting assault on Gaza. Throughout this period, the terms of these agreements remain essentially the same. The regular pattern is for Israel, then, to disregard whatever agreement is in place, while Hamas observes it — as Israel has officially recognized — until a sharp increase in Israeli violence elicits a Hamas response, followed by even fiercer brutality. These escalations, which amount to shooting fish in a pond, are called "mowing the lawn" in Israeli parlance. The most recent was more accurately described as "removing the topsoil" by a senior U.S. military officer, appalled by the practices of the self-described "most moral army in the world."
The first of this series was the Agreement on Movement and Access Between Israel and the Palestinian Authority in November 2005. It called for "a crossing between Gaza and Egypt at Rafah for the export of goods and the transit of people, continuous operation of crossings between Israel and Gaza for the import/export of goods, and the transit of people, reduction of obstacles to movement within the West Bank, bus and truck convoys between the West Bank and Gaza, the building of a seaport in Gaza, [and the] re-opening of the airport in Gaza" that Israeli bombing had demolished.
That agreement was reached shortly after Israel withdrew its settlers and military forces from Gaza. The motive for the disengagement was explained by Dov Weissglass, a confidant of then-Prime Minister Ariel Sharon, who was in charge of negotiating and implementing it. "The significance of the disengagement plan is the freezing of the peace process," Weissglass informed the Israeli press. "And when you freeze that process, you prevent the establishment of a Palestinian state, and you prevent a discussion on the refugees, the borders, and Jerusalem. Effectively, this whole package called the Palestinian state, with all that it entails, has been removed indefinitely from our agenda. And all this with authority and permission. All with a [U.S.] presidential blessing and the ratification of both houses of Congress." True enough.
"The disengagement is actually formaldehyde," Weissglass added. "It supplies the amount of formaldehyde that is necessary so there will not be a political process with the Palestinians." Israeli hawks also recognized that instead of investing substantial resources in maintaining a few thousand settlers in illegal communities in devastated Gaza, it made more sense to transfer them to illegal subsidized communities in areas of the West Bank that Israel intended to keep.
The disengagement was depicted as a noble effort to pursue peace, but the reality was quite different. Israel never relinquished control of Gaza and is, accordingly, recognized as the occupying power by the United Nations, the U.S., and other states (Israel apart, of course). In their comprehensive history of Israeli settlement in the occupied territories, Israeli scholars Idith Zertal and Akiva Eldar describe what actually happened when that country disengaged: the ruined territory was not released "for even a single day from Israel's military grip or from the price of the occupation that the inhabitants pay every day." After the disengagement, "Israel left behind scorched earth, devastated services, and people with neither a present nor a future. The settlements were destroyed in an ungenerous move by an unenlightened occupier, which in fact continues to control the territory and kill and harass its inhabitants by means of its formidable military might."
Operations Cast Lead and Pillar of Defense
Israel soon had a pretext for violating the November Agreement more severely. In January 2006, the Palestinians committed a serious crime. They voted "the wrong way" in carefully monitored free elections, placing the parliament in the hands of Hamas. Israel and the United States immediately imposed harsh sanctions, telling the world very clearly what they mean by "democracy promotion." Europe, to its shame, went along as well.
The U.S. and Israel soon began planning a military coup to overthrow the unacceptable elected government, a familiar procedure. When Hamas pre-empted the coup in 2007, the siege of Gaza became far more severe, along with regular Israeli military attacks. Voting the wrong way in a free election was bad enough, but preempting a U.S.-planned military coup proved to be an unpardonable offense.
A new ceasefire agreement was reached in June 2008. It again called for opening the border crossings to "allow the transfer of all goods that were banned and restricted to go into Gaza." Israel formally agreed to this, but immediately announced that it would not abide by the agreement and open the borders until Hamas released Gilad Shalit, an Israeli soldier held by Hamas.
Israel itself has a long history of kidnapping civilians in Lebanon and on the high seas and holding them for lengthy periods without credible charge, sometimes as hostages. Of course, imprisoning civilians on dubious charges, or none, is a regular practice in the territories Israel controls. But the standard western distinction between people and "unpeople" (in Orwell's useful phrase) renders all this insignificant.
Israel not only maintained the siege in violation of the June 2008 ceasefire agreement but did so with extreme rigor, even preventing the United Nations Relief and Works Agency, which cares for the huge number of official refugees in Gaza, from replenishing its stocks.
On November 4th, while the media were focused on the U.S. presidential election, Israeli troops entered Gaza and killed half a dozen Hamas militants. That elicited a Hamas missile response and an exchange of fire. (All the deaths were Palestinian.) In late December, Hamas offered to renew the ceasefire. Israel considered the offer, but rejected it, preferring instead to launch Operation Cast Lead, a three-week incursion of the full power of the Israeli military into the Gaza Strip, resulting in shocking atrocities well documented by international and Israeli human rights organizations.
On January 8, 2009, while Cast Lead was in full fury, the U.N. Security Council passed a resolution by a 14-0 vote, with the U.S. abstaining, calling for "an immediate ceasefire leading to a full Israeli withdrawal, unimpeded provision through Gaza of food, fuel, and medical treatment, and intensified international arrangements to prevent arms and ammunition smuggling."
A new ceasefire agreement was indeed reached, but the terms, similar to the previous ones, were again never observed and broke down completely with the next major mowing-the-lawn episode in November 2012, Operation Pillar of Defense. What happened in the interim can be illustrated by the casualty figures from January 2012 to the launching of that operation: one Israeli was killed by fire from Gaza while 78 Palestinians were killed by Israeli fire.
The first act of Operation Pillar of Defense was the murder of Ahmed Jabari, a high official of the military wing of Hamas. Aluf Benn, editor-in-chief of Israel's leading newspaper Haaretz, described Jabari as Israel's "subcontractor" in Gaza, who enforced relative quiet there for more than five years. As always, there was a pretext for the assassination, but the likely reason was provided by Israeli peace activist Gershon Baskin. He had been involved in direct negotiations with Jabari for years and reported that, hours before he was assassinated, Jabari "received the draft of a permanent truce agreement with Israel, which included mechanisms for maintaining the ceasefire in the case of a flare-up between Israel and the factions in the Gaza Strip."
There is a long record of Israeli actions designed to deter the threat of a diplomatic settlement. After this exercise of mowing the lawn, a ceasefire agreement was reached yet again. Repeating the now-standard terms, it called for a cessation of military action by both sides and the effective ending of the siege of Gaza with Israel "opening the crossings and facilitating the movements of people and transfer of goods, and refraining from restricting residents' free movements and targeting residents in border areas."
What happened next was reviewed by Nathan Thrall, senior Middle East analyst of the International Crisis Group. Israeli intelligence recognized that Hamas was observing the terms of the ceasefire. "Israel,” Thrall wrote, “therefore saw little incentive in upholding its end of the deal. In the three months following the ceasefire, its forces made regular incursions into Gaza, strafed Palestinian farmers and those collecting scrap and rubble across the border, and fired at boats, preventing fishermen from accessing the majority of Gaza's waters." In other words, the siege never ended. "Crossings were repeatedly shut. So-called buffer zones inside Gaza [from which Palestinians are barred, and which include a third or more of the strip’s limited arable land] were reinstated. Imports declined, exports were blocked, and fewer Gazans were given exit permits to Israel and the West Bank."
Operation Protective Edge
So matters continued until April 2014, when an important event took place. The two major Palestinian groupings, Gaza-based Hamas and the Fatah-dominated Palestinian Authority in the West Bank, signed a unity agreement. Hamas made major concessions: the unity government contained none of its members or allies. In substantial measure, as Nathan Thrall observes, Hamas turned over governance of Gaza to the PA. Several thousand PA security forces were sent there, and the PA placed its guards at borders and crossings, with no reciprocal positions for Hamas in the West Bank security apparatus. Finally, the unity government accepted the three conditions that Washington and the European Union had long demanded: non-violence, adherence to past agreements, and the recognition of Israel.
Israel was infuriated. Its government declared at once that it would refuse to deal with the unity government and cancelled negotiations. Its fury mounted when the U.S., along with most of the world, signaled support for the unity government.
There are good reasons why Israel opposes the unification of Palestinians. One is that the Hamas-Fatah conflict has provided a useful pretext for refusing to engage in serious negotiations. How can one negotiate with a divided entity? More significantly, for more than 20 years, Israel has been committed to separating Gaza from the West Bank in violation of the Oslo Accords it signed in 1993, which declare Gaza and the West Bank to be an inseparable territorial unity.
A look at a map explains the rationale. Separated from Gaza, any West Bank enclaves left to Palestinians have no access to the outside world. They are contained by two hostile powers, Israel and Jordan, both close U.S. allies — and contrary to illusions, the U.S. is very far from a neutral "honest broker."
Furthermore, Israel has been systematically taking over the Jordan Valley, driving out Palestinians, establishing settlements, sinking wells, and otherwise ensuring that the region — about one-third of the West Bank, with much of its arable land — will ultimately be integrated into Israel along with the other regions that country is taking over. Hence the remaining Palestinian cantons will be completely imprisoned. Unification with Gaza would interfere with these plans, which trace back to the early days of the occupation and have had steady support from the major political blocs, including figures usually portrayed as doves, like former president Shimon Peres, one of the architects of settlement deep in the West Bank.
As usual, a pretext was needed to move on to the next escalation. Such an occasion arose when three Israeli boys from the settler community in the West Bank were brutally murdered. The Israeli government evidently realized at once that they were dead, but pretended otherwise, which provided the opportunity to launch a "rescue operation" — actually a rampage primarily targeting Hamas. The Netanyahu government has claimed from the start that it knew Hamas was responsible, but has made no effort to present evidence.
One of Israel's leading authorities on Hamas, Shlomi Eldar, reported almost at once that the killers very likely came from a dissident clan in Hebron that has long been a thorn in the side of the Hamas leadership. He added, "I'm sure they didn't get any green light from the leadership of Hamas, they just thought it was the right time to act."
The Israeli police have since been searching for and arresting members of the clan, still claiming, without evidence, that they are "Hamas terrorists." On September 2nd, Haaretz reported that, after very intensive interrogations, the Israeli security services concluded the abduction of the teenagers "was carried out by an independent cell" with no known direct links to Hamas.
The 18-day rampage by the Israeli Defense Forces succeeded in undermining the feared unity government. According to Israeli military sources, its soldiers arrested 419 Palestinians, including 335 affiliated with Hamas, and killed six, while searching thousands of locations and confiscating $350,000. Israel also conducted dozens of attacks in Gaza, killing five Hamas members on July 7th.
Hamas finally reacted with its first rockets in 18 months, Israeli officials reported, providing Israel with the pretext to launch Operation Protective Edge on July 8th. The 50-day assault proved the most extreme exercise in mowing the lawn — so far.
Operation [Still to Be Named]
Israel is in a fine position today to reverse its decades-old policy of separating Gaza from the West Bank in violation of its solemn agreements and to observe a major ceasefire agreement for the first time. At least temporarily, the threat of democracy in neighboring Egypt has been diminished, and the brutal Egyptian military dictatorship of General Abdul Fattah al-Sisi is a welcome ally for Israel in maintaining control over Gaza.
The Palestinian unity government, as noted earlier, is placing the U.S.-trained forces of the Palestinian Authority in control of Gaza’s borders, and governance may be shifting into the hands of the PA, which depends on Israel for its survival, as well as for its finances. Israel might feel that its takeover of Palestinian territory in the West Bank has proceeded so far that there is little to fear from some limited form of autonomy for the enclaves that remain to Palestinians.
There is also some truth to Prime Minister Benjamin Netanyahu's observation: "Many elements in the region understand today that, in the struggle in which they are threatened, Israel is not an enemy but a partner." Akiva Eldar, Israel's leading diplomatic correspondent, adds, however, that "all those ‘many elements in the region’ also understand that there is no brave and comprehensive diplomatic move on the horizon without an agreement on the establishment of a Palestinian state based on the 1967 borders and a just, agreed-upon solution to the refugee problem." That is not on Israel's agenda, he points out, and is in fact in direct conflict with the 1999 electoral program of the governing Likud coalition, never rescinded, which "flatly rejects the establishment of a Palestinian Arab state west of the Jordan river."
Some knowledgeable Israeli commentators, notably columnist Danny Rubinstein, believe that Israel is poised to reverse course and relax its stranglehold on Gaza.
The record of these past years suggests otherwise, and the first signs are not auspicious. As Operation Protective Edge ended, Israel announced its largest appropriation of West Bank land in 30 years, almost 1,000 acres. Israel Radio reported that the takeover was in response to the killing of the three Jewish teenagers by "Hamas militants." A Palestinian boy was burned to death in retaliation for the murders, but no Israeli land was handed to Palestinians. Nor was there any reaction when, on August 10th, while the most moral army in the world was smashing Gaza to bits, an Israeli soldier murdered 10-year-old Khalil Anati on a quiet street in a refugee camp near Hebron, then drove away in his jeep as the child bled to death.
Anati was one of the 23 Palestinians (including three children) killed by Israeli occupation forces in the West Bank during the Gaza onslaught, according to U.N. statistics, along with more than 2,000 wounded, 38% by live fire. "None of those killed were endangering soldiers' lives," Israeli journalist Gideon Levy reported. To none of this is there any reaction, just as there was no reaction while Israel killed, on average, more than two Palestinian children a week for the past 14 years. Unpeople, after all.
It is commonly claimed on all sides that, if the two-state settlement is dead as a result of Israel's takeover of Palestinian lands, then the outcome will be one state west of the Jordan. Some Palestinians welcome this outcome, anticipating that they can then conduct a civil rights struggle for equal rights on the model of South Africa under apartheid. Many Israeli commentators warn that the resulting "demographic problem" of more Arab than Jewish births and diminishing Jewish immigration will undermine their hope for a "democratic Jewish state."
But these widespread beliefs are dubious.
The realistic alternative to a two-state settlement is that Israel will continue to carry forward the plans it has been implementing for years, taking over whatever is of value to it in the West Bank, while avoiding Palestinian population concentrations and removing Palestinians from the areas it is integrating into Israel. That should avoid the dreaded "demographic problem."
The areas being integrated into Israel include a vastly expanded Greater Jerusalem, the area within the illegal "Separation Wall," corridors cutting through the regions to the East, and will probably also encompass the Jordan Valley. Gaza will likely remain under its usual harsh siege, separated from the West Bank. And the Syrian Golan Heights — like Jerusalem, annexed in violation of Security Council orders — will quietly become part of Greater Israel. In the meantime, West Bank Palestinians will be contained in unviable cantons, with special accommodation for elites in standard neocolonial style.
These basic policies have been underway since the 1967 conquest, following a principle enunciated by then-Defense Minister Moshe Dayan, one of the Israeli leaders most sympathetic to the Palestinians. He informed his cabinet colleagues that they should tell Palestinian refugees in the West Bank, "We have no solution, you shall continue to live like dogs, and whoever wishes may leave, and we will see where this process leads."
The suggestion was natural within the overriding conception articulated in 1972 by future president Haim Herzog: "I do not deny the Palestinians a place or stand or opinion on every matter… But certainly I am not prepared to consider them as partners in any respect in a land that has been consecrated in the hands of our nation for thousands of years. For the Jews of this land there cannot be any partner." Dayan also called for Israel’s "permanent rule" ("memshelet keva") over the occupied territories. When Netanyahu expresses the same stand today, he is not breaking new ground.
Like other states, Israel pleads "security" as justification for its aggressive and violent actions. But knowledgeable Israelis know better. Their recognition of reality was articulated clearly in 1972 by Air Force Commander (and later president) Ezer Weizman. He explained that there would be no security problem if Israel were to accept the international call to withdraw from the territories it conquered in 1967, but the country would not then be able to "exist according to the scale, spirit, and quality she now embodies."
For a century, the Zionist colonization of Palestine has proceeded primarily on the pragmatic principle of the quiet establishment of facts on the ground, which the world was to ultimately come to accept. It has been a highly successful policy. There is every reason to expect it to persist as long as the United States provides the necessary military, economic, diplomatic, and ideological support. For those concerned with the rights of the brutalized Palestinians, there can be no higher priority than working to change U.S. policies, not an idle dream by any means.
Noam Chomsky is Institute Professor emeritus in the Department of Linguistics and Philosophy at Massachusetts Institute of Technology. Among his recent books are Hegemony or Survival, Failed States, Power Systems, Occupy, and Hopes and Prospects. His latest book, Masters of Mankind, will be published this week by Haymarket Books, which is also reissuing 12 of his classic books in new editions over the coming year. His work is regularly posted at TomDispatch.com. His website is www.chomsky.info.
Copyright 2014 Noam Chomsky
Ceasefires in Which Violations Never Cease
If some extraterrestrial species were compiling a history of Homo sapiens, they might well break their calendar into two eras: BNW (before nuclear weapons) and NWE (the nuclear weapons era). The latter era, of course, opened on August 6, 1945, the first day of the countdown to what may be the inglorious end of this strange species, which attained the intelligence to discover the effective means to destroy itself, but — so the evidence suggests — not the moral and intellectual capacity to control its worst instincts.
Day one of the NWE was marked by the “success” of Little Boy, a simple atomic bomb. On day four, Nagasaki experienced the technological triumph of Fat Man, a more sophisticated design. Five days later came what the official Air Force history calls the “grand finale,” a 1,000-plane raid — no mean logistical achievement — attacking Japan’s cities and killing many thousands of people, with leaflets falling among the bombs reading “Japan has surrendered.” Truman announced that surrender before the last B-29 returned to its base.
Those were the auspicious opening days of the NWE. As we now enter its 70th year, we should be contemplating with wonder that we have survived. We can only guess how many years remain.
Some reflections on these grim prospects were offered by General Lee Butler, former head of the U.S. Strategic Command (STRATCOM), which controls nuclear weapons and strategy. Twenty years ago, he wrote that we had so far survived the NWE “by some combination of skill, luck, and divine intervention, and I suspect the latter in greatest proportion.”
Reflecting on his long career in developing nuclear weapons strategies and organizing the forces to implement them efficiently, he described himself ruefully as having been “among the most avid of these keepers of the faith in nuclear weapons.” But, he continued, he had come to realize that it was now his “burden to declare with all of the conviction I can muster that in my judgment they served us extremely ill.” And he asked, “By what authority do succeeding generations of leaders in the nuclear-weapons states usurp the power to dictate the odds of continued life on our planet? Most urgently, why does such breathtaking audacity persist at a moment when we should stand trembling in the face of our folly and united in our commitment to abolish its most deadly manifestations?”
He termed the U.S. strategic plan of 1960 that called for an automated all-out strike on the Communist world “the single most absurd and irresponsible document I have ever reviewed in my life.” Its Soviet counterpart was probably even more insane. But it is important to bear in mind that there are competitors, not least among them the easy acceptance of extraordinary threats to survival.
Survival in the Early Cold War Years
According to received doctrine in scholarship and general intellectual discourse, the prime goal of state policy is “national security.” There is ample evidence, however, that the doctrine of national security does not encompass the security of the population. The record reveals that, for instance, the threat of instant destruction by nuclear weapons has not ranked high among the concerns of planners. That much was demonstrated early on, and remains true to the present moment.
In the early days of the NWE, the U.S. was overwhelmingly powerful and enjoyed remarkable security: it controlled the hemisphere, the Atlantic and Pacific oceans, and the opposite sides of those oceans as well. Long before World War II, it had already become by far the richest country in the world, with incomparable advantages. Its economy boomed during the war, while other industrial societies were devastated or severely weakened. By the opening of the new era, the U.S. possessed about half of total world wealth and an even greater percentage of its manufacturing capacity.
There was, however, a potential threat: intercontinental ballistic missiles with nuclear warheads. That threat was discussed in the standard scholarly study of nuclear policies, carried out with access to high-level sources — Danger and Survival: Choices About the Bomb in the First Fifty Years by McGeorge Bundy, national security adviser during the Kennedy and Johnson presidencies.
Bundy wrote that “the timely development of ballistic missiles during the Eisenhower administration is one of the best achievements of those eight years. Yet it is well to begin with a recognition that both the United States and the Soviet Union might be in much less nuclear danger today if [those] missiles had never been developed.” He then added an instructive comment: “I am aware of no serious contemporary proposal, in or out of either government, that ballistic missiles should somehow be banned by agreement.” In short, there was apparently no thought of trying to prevent the sole serious threat to the U.S., the threat of utter destruction in a nuclear war with the Soviet Union.
Could that threat have been taken off the table? We cannot, of course, be sure, but it was hardly inconceivable. The Russians, far behind in industrial development and technological sophistication, were in a far more threatening environment. Hence, they were significantly more vulnerable to such weapons systems than the U.S. There might have been opportunities to explore these possibilities, but in the extraordinary hysteria of the day they could hardly have even been perceived. And that hysteria was indeed extraordinary. An examination of the rhetoric of central official documents of that moment like National Security Council Paper NSC-68 remains quite shocking, even discounting Secretary of State Dean Acheson’s injunction that it is necessary to be “clearer than truth.”
One indication of possible opportunities to blunt the threat was a remarkable proposal by Soviet ruler Joseph Stalin in 1952, offering to allow Germany to be unified with free elections on the condition that it would not then join a hostile military alliance. That was hardly an extreme condition in light of the history of the past half-century during which Germany alone had practically destroyed Russia twice, exacting a terrible toll.
Stalin’s proposal was taken seriously by the respected political commentator James Warburg, but otherwise mostly ignored or ridiculed at the time. Recent scholarship has begun to take a different view. The bitterly anti-Communist scholar of Soviet affairs Adam Ulam has taken the status of Stalin’s proposal to be an “unresolved mystery.” Washington “wasted little effort in flatly rejecting Moscow’s initiative,” he has written, on grounds that “were embarrassingly unconvincing.” The political, scholarly, and general intellectual failure left open “the basic question,” Ulam added: “Was Stalin genuinely ready to sacrifice the newly created German Democratic Republic (GDR) on the altar of real democracy,” with consequences for world peace and for American security that could have been enormous?
Reviewing recent research in Soviet archives, one of the most respected Cold War scholars, Melvyn Leffler, has observed that many scholars were surprised to discover “[Lavrenti] Beria — the sinister, brutal head of the [Russian] secret police — propos[ed] that the Kremlin offer the West a deal on the unification and neutralization of Germany,” agreeing “to sacrifice the East German communist regime to reduce East-West tensions” and improve internal political and economic conditions in Russia — opportunities that were squandered in favor of securing German participation in NATO.
Under the circumstances, it is not impossible that agreements might then have been reached that would have protected the security of the American population from the gravest threat on the horizon. But that possibility apparently was not considered, a striking indication of how slight a role authentic security plays in state policy.
The Cuban Missile Crisis and Beyond
That conclusion was underscored repeatedly in the years that followed. When Nikita Khrushchev took control in Russia in 1953 after Stalin’s death, he recognized that the USSR could not compete militarily with the U.S., the richest and most powerful country in history, with incomparable advantages. If it ever hoped to escape its economic backwardness and the devastating effect of the last world war, it would need to reverse the arms race.
Accordingly, Khrushchev proposed sharp mutual reductions in offensive weapons. The incoming Kennedy administration considered the offer and rejected it, instead turning to rapid military expansion, even though it was already far in the lead. The late Kenneth Waltz, supported by other strategic analysts with close connections to U.S. intelligence, wrote then that the Kennedy administration “undertook the largest strategic and conventional peace-time military build-up the world has yet seen… even as Khrushchev was trying at once to carry through a major reduction in the conventional forces and to follow a strategy of minimum deterrence, and we did so even though the balance of strategic weapons greatly favored the United States.” Again, harming national security while enhancing state power.
U.S. intelligence verified that huge cuts had indeed been made in active Soviet military forces, both in terms of aircraft and manpower. In 1963, Khrushchev again called for new reductions. As a gesture, he withdrew troops from East Germany and called on Washington to reciprocate. That call, too, was rejected. William Kaufmann, a former top Pentagon aide and leading analyst of security issues, described the U.S. failure to respond to Khrushchev’s initiatives as, in career terms, “the one regret I have.”
The Soviet reaction to the U.S. build-up of those years was to place nuclear missiles in Cuba in October 1962 to try to redress the balance at least slightly. The move was also motivated in part by Kennedy’s terrorist campaign against Fidel Castro’s Cuba, which was scheduled to lead to invasion that very month, as Russia and Cuba may have known. The ensuing “missile crisis” was “the most dangerous moment in history,” in the words of historian Arthur Schlesinger, Kennedy’s adviser and confidant.
As the crisis peaked in late October, Kennedy received a secret letter from Khrushchev offering to end it by simultaneous public withdrawal of Russian missiles from Cuba and U.S. Jupiter missiles from Turkey. The latter were obsolete missiles, already ordered withdrawn by the Kennedy administration because they were being replaced by far more lethal Polaris submarines to be stationed in the Mediterranean.
Kennedy’s subjective estimate at that moment was that if he refused the Soviet premier’s offer, there was a 33% to 50% probability of nuclear war — a war that, as President Eisenhower had warned, would have destroyed the northern hemisphere. Kennedy nonetheless refused Khrushchev’s proposal for public withdrawal of the missiles from Cuba and Turkey; only the withdrawal from Cuba could be public, so as to protect the U.S. right to place missiles on Russia’s borders or anywhere else it chose.
It is hard to think of a more horrendous decision in history — and for this, he is still highly praised for his cool courage and statesmanship.
Ten years later, in the last days of the 1973 Israel-Arab war, Henry Kissinger, then national security adviser to President Nixon, called a nuclear alert. The purpose was to warn the Russians not to interfere with his delicate diplomatic maneuvers designed to ensure an Israeli victory, but of a limited sort so that the U.S. would still be in control of the region unilaterally. And the maneuvers were indeed delicate. The U.S. and Russia had jointly imposed a cease-fire, but Kissinger secretly informed the Israelis that they could ignore it. Hence the need for the nuclear alert to frighten the Russians away. The security of Americans had its usual status.
Ten years later, the Reagan administration launched operations to probe Russian air defenses by simulating air and naval attacks and a high-level nuclear alert that the Russians were intended to detect. These actions were undertaken at a very tense moment. Washington was deploying Pershing II strategic missiles in Europe with a five-minute flight time to Moscow. President Reagan had also announced the Strategic Defense Initiative (“Star Wars”) program, which the Russians understood to be effectively a first-strike weapon, a standard interpretation of missile defense on all sides. And other tensions were rising.
Naturally, these actions caused great alarm in Russia, which, unlike the U.S., was quite vulnerable and had repeatedly been invaded and virtually destroyed. That led to a major war scare in 1983. Newly released archives reveal that the danger was even more severe than historians had previously assumed. A CIA study entitled “The War Scare Was for Real” concluded that U.S. intelligence may have underestimated Russian concerns and the threat of a Russian preventative nuclear strike. The exercises “almost became a prelude to a preventative nuclear strike,” according to an account in the Journal of Strategic Studies.
It was even more dangerous than that, as we learned last September, when the BBC reported that right in the midst of these world-threatening developments, Russia’s early-warning systems detected an incoming missile strike from the United States, sending its nuclear system to the highest-level alert. The protocol for the Soviet military was to retaliate with a nuclear attack of its own. Fortunately, the officer on duty, Stanislav Petrov, decided to disobey orders and not report the warnings to his superiors. He received an official reprimand. And thanks to his dereliction of duty, we’re still alive to talk about it.
The security of the population was no more a high priority for Reagan administration planners than for their predecessors. And so it continues to the present, even putting aside the numerous near-catastrophic nuclear accidents that occurred over the years, many reviewed in Eric Schlosser’s chilling study Command and Control: Nuclear Weapons, the Damascus Accident, and the Illusion of Safety. In other words, it is hard to contest General Butler’s conclusions.
Survival in the Post-Cold War Era
The record of post-Cold War actions and doctrines is hardly reassuring either. Every self-respecting president has to have a doctrine. The Clinton Doctrine was encapsulated in the slogan “multilateral when we can, unilateral when we must.” In congressional testimony, the phrase “when we must” was explained more fully: the U.S. is entitled to resort to “unilateral use of military power” to ensure “uninhibited access to key markets, energy supplies, and strategic resources.” Meanwhile, STRATCOM in the Clinton era produced an important study entitled “Essentials of Post-Cold War Deterrence,” issued well after the Soviet Union had collapsed and Clinton was extending President George H.W. Bush’s program of expanding NATO to the east in violation of promises to Soviet Premier Mikhail Gorbachev — with reverberations to the present.
That STRATCOM study was concerned with “the role of nuclear weapons in the post-Cold War era.” A central conclusion: that the U.S. must maintain the right to launch a first strike, even against non-nuclear states. Furthermore, nuclear weapons must always be at the ready because they “cast a shadow over any crisis or conflict.” They were, that is, constantly being used, just as you’re using a gun if you aim but don’t fire one while robbing a store (a point that Daniel Ellsberg has repeatedly stressed). STRATCOM went on to advise that “planners should not be too rational about determining… what the opponent values the most.” Everything should simply be targeted. “[I]t hurts to portray ourselves as too fully rational and cool-headed… That the U.S. may become irrational and vindictive if its vital interests are attacked should be a part of the national persona we project.” It is “beneficial [for our strategic posture] if some elements may appear to be potentially ‘out of control,’” thus posing a constant threat of nuclear attack — a severe violation of the U.N. Charter, if anyone cares.
Not much here about the noble goals constantly proclaimed — or for that matter the obligation under the Non-Proliferation Treaty to make “good faith” efforts to eliminate this scourge of the earth. What resounds, rather, is an adaptation of Hilaire Belloc’s famous couplet about the Maxim gun (to quote the great African historian Chinweizu):
“Whatever happens, we have got,
The Atom Bomb, and they have not.”
After Clinton came, of course, George W. Bush, whose broad endorsement of preventative war easily encompassed Japan’s attack in December 1941 on military bases in two U.S. overseas possessions, at a time when Japanese militarists were well aware that B-17 Flying Fortresses were being rushed off assembly lines and deployed to those bases with the intent “to burn out the industrial heart of the Empire with fire-bomb attacks on the teeming bamboo ant heaps of Honshu and Kyushu.” That was how the prewar plans were described by their architect, Air Force General Claire Chennault, with the enthusiastic approval of President Franklin Roosevelt, Secretary of State Cordell Hull, and Army Chief of Staff General George Marshall.
Then comes Barack Obama, with pleasant words about working to abolish nuclear weapons — combined with plans to spend $1 trillion on the U.S. nuclear arsenal in the next 30 years, a percentage of the military budget “comparable to spending for procurement of new strategic systems in the 1980s under President Ronald Reagan,” according to a study by the James Martin Center for Nonproliferation Studies at the Monterey Institute of International Studies.
Obama has also not hesitated to play with fire for political gain. Take for example the capture and assassination of Osama bin Laden by Navy SEALs. Obama brought it up with pride in an important speech on national security in May 2013. It was widely covered, but one crucial paragraph was ignored.
Obama hailed the operation but added that it could not be the norm. The reason, he said, was that the risks “were immense.” The SEALs might have been “embroiled in an extended firefight.” Even though, by luck, that didn’t happen, “the cost to our relationship with Pakistan and the backlash among the Pakistani public over encroachment on their territory was… severe.”
Let us now add a few details. The SEALs were ordered to fight their way out if apprehended. They would not have been left to their fate if “embroiled in an extended firefight.” The full force of the U.S. military would have been used to extricate them. Pakistan has a powerful, well-trained military, highly protective of state sovereignty. It also has nuclear weapons, and Pakistani specialists are concerned about the possible penetration of their nuclear security system by jihadi elements. It is also no secret that the population has been embittered and radicalized by Washington’s drone terror campaign and other policies.
While the SEALs were still in the bin Laden compound, Pakistani Chief of Staff Ashfaq Parvez Kayani was informed of the raid and ordered the military “to confront any unidentified aircraft,” which he assumed would be from India. Meanwhile in Kabul, U.S. war commander General David Petraeus ordered “warplanes to respond” if the Pakistanis “scrambled their fighter jets.” As Obama said, by luck the worst didn’t happen, though it could have been quite ugly. But the risks were faced without noticeable concern. Or subsequent comment.
As General Butler observed, it is a near miracle that we have escaped destruction so far, and the longer we tempt fate, the less likely it is that we can hope for divine intervention to perpetuate the miracle.
Noam Chomsky is Institute Professor emeritus in the Department of Linguistics and Philosophy at Massachusetts Institute of Technology. Among his recent books are Hegemony or Survival, Failed States, Power Systems, Occupy, and Hopes and Prospects. His latest book, Masters of Mankind, will be published soon by Haymarket Books, which is also reissuing twelve of his classic books in new editions over the coming year. His website is www.chomsky.info.
Copyright 2014 Noam Chomsky
How Many Minutes to Midnight?
The question of how foreign policy is determined is a crucial one in world affairs. In these comments, I can only provide a few hints as to how I think the subject can be productively explored, keeping to the United States for several reasons. First, the U.S. is unmatched in its global significance and impact. Second, it is an unusually open society, possibly uniquely so, which means we know more about it. Finally, it is plainly the most important case for Americans, who are able to influence policy choices in the U.S. — and indeed for others, insofar as their actions can influence such choices. The general principles, however, extend to the other major powers, and well beyond.
There is a “received standard version,” common to academic scholarship, government pronouncements, and public discourse. It holds that the prime commitment of governments is to ensure security, and that the primary concern of the U.S. and its allies since 1945 was the Russian threat.
There are a number of ways to evaluate the doctrine. One obvious question to ask is: What happened when the Russian threat disappeared in 1989? Answer: everything continued much as before.
The U.S. immediately invaded Panama, killing probably thousands of people and installing a client regime. This was routine practice in U.S.-dominated domains — but in this case not quite as routine. For the first time, a major foreign policy act was not justified by an alleged Russian threat.
Instead, a series of fraudulent pretexts for the invasion was concocted, each collapsing instantly on examination. The media chimed in enthusiastically, lauding the magnificent achievement of defeating Panama, unconcerned that the pretexts were ludicrous, that the act itself was a radical violation of international law, and that it was bitterly condemned elsewhere, most harshly in Latin America. Also ignored was the U.S. veto of a unanimous Security Council resolution condemning crimes by U.S. troops during the invasion, with Britain alone abstaining.
All routine. And all forgotten (which is also routine).
From El Salvador to the Russian Border
The administration of George H.W. Bush issued a new national security policy and defense budget in reaction to the collapse of the global enemy. It was pretty much the same as before, although with new pretexts. It was, it turned out, necessary to maintain a military establishment almost as great as the rest of the world combined and far more advanced in technological sophistication — but not for defense against the now-nonexistent Soviet Union. Rather, the excuse now was the growing “technological sophistication” of Third World powers. Disciplined intellectuals understood that it would have been improper to collapse in ridicule, so they maintained a proper silence.
The U.S., the new programs insisted, must maintain its “defense industrial base.” The phrase is a euphemism, referring to high-tech industry generally, which relies heavily on extensive state intervention for research and development, often under Pentagon cover, in what economists continue to call the U.S. “free-market economy.”
One of the most interesting provisions of the new plans had to do with the Middle East. There, it was declared, Washington must maintain intervention forces targeting a crucial region where the major problems “could not have been laid at the Kremlin’s door.” Contrary to 50 years of deceit, it was quietly conceded that the main concern was not the Russians, but rather what is called “radical nationalism,” meaning independent nationalism not under U.S. control.
All of this has evident bearing on the standard version, but it passed unnoticed — or perhaps, for that very reason, it passed unnoticed.
Other important events took place immediately after the fall of the Berlin Wall, ending the Cold War. One was in El Salvador, the leading recipient of U.S. military aid — apart from Israel-Egypt, a separate category — and with one of the worst human rights records anywhere. That is a familiar and very close correlation.
The Salvadoran high command ordered the Atlacatl Brigade to invade the Jesuit University and murder six leading Latin American intellectuals, all Jesuit priests, including the rector, Fr. Ignacio Ellacuría, and any witnesses, meaning their housekeeper and her daughter. The Brigade had just returned from advanced counterinsurgency training at the U.S. Army John F. Kennedy Special Warfare Center and School in Fort Bragg, North Carolina, and had already left a bloody trail of thousands of the usual victims in the course of the U.S.-run state terror campaign in El Salvador, one part of a broader terror and torture campaign throughout the region. All routine. Ignored and virtually forgotten in the United States and by its allies, again routine. But it tells us a lot about the factors that drive policy, if we care to look at the real world.
Another important event took place in Europe. Soviet president Mikhail Gorbachev agreed to allow the unification of Germany and its membership in NATO, a hostile military alliance. In the light of recent history, this was a most astonishing concession. There was a quid pro quo. President Bush and Secretary of State James Baker agreed that NATO would not expand “one inch to the East,” meaning into East Germany. Instantly, they expanded NATO to East Germany.
Gorbachev was naturally outraged, but when he complained, he was instructed by Washington that this had only been a verbal promise, a gentleman’s agreement, hence without force. If he was naïve enough to accept the word of American leaders, it was his problem.
All of this, too, was routine, as was the silent acceptance and approval of the expansion of NATO in the U.S. and the West generally. President Bill Clinton then expanded NATO further, right up to Russia’s borders. Today, the world faces a serious crisis that is in no small measure a result of these policies.
The Appeal of Plundering the Poor
Another source of evidence is the declassified historical record. It contains revealing accounts of the actual motives of state policy. The story is rich and complex, but a few persistent themes play a dominant role. One was articulated clearly at a western hemispheric conference called by the U.S. in Mexico in February 1945, where Washington imposed “An Economic Charter of the Americas” designed to eliminate economic nationalism “in all its forms.” There was one unspoken condition. Economic nationalism would be fine for the U.S., whose economy relies heavily on massive state intervention.
The elimination of economic nationalism for others stood in sharp conflict with the Latin American stand of that moment, which State Department officials described as “the philosophy of the New Nationalism [that] embraces policies designed to bring about a broader distribution of wealth and to raise the standard of living of the masses.” As U.S. policy analysts added, “Latin Americans are convinced that the first beneficiaries of the development of a country’s resources should be the people of that country.”
That, of course, will not do. Washington understands that the “first beneficiaries” should be U.S. investors, while Latin America fulfills its service function. It should not, as both the Truman and Eisenhower administrations would make clear, undergo “excessive industrial development” that might infringe on U.S. interests. Thus Brazil could produce low-quality steel that U.S. corporations did not want to bother with, but it would be “excessive,” were it to compete with U.S. firms.
Similar concerns resonate throughout the post-World War II period. The global system that was to be dominated by the U.S. was threatened by what internal documents call “radical and nationalistic regimes” that respond to popular pressures for independent development. That was the concern that motivated the overthrow of the parliamentary governments of Iran and Guatemala in 1953 and 1954, as well as numerous others. In the case of Iran, a major concern was the potential impact of Iranian independence on Egypt, then in turmoil over British colonial practice. In Guatemala, apart from the crime of the new democracy in empowering the peasant majority and infringing on possessions of the United Fruit Company — already offensive enough — Washington’s concern was labor unrest and popular mobilization in neighboring U.S.-backed dictatorships.
In both cases the consequences reach to the present. Literally not a day has passed since 1953 when the U.S. has not been torturing the people of Iran. Guatemala remains one of the world’s worst horror chambers. To this day, Mayans are fleeing from the effects of near-genocidal government military campaigns in the highlands backed by President Ronald Reagan and his top officials. As the country director of Oxfam, a Guatemalan doctor, reported recently,
“There is a dramatic deterioration of the political, social, and economic context. Attacks against Human Rights defenders have increased 300% during the last year. There is a clear evidence of a very well organized strategy by the private sector and Army. Both have captured the government in order to keep the status quo and to impose the extraction economic model, pushing away dramatically indigenous peoples from their own land, due to the mining industry, African Palm and sugar cane plantations. In addition the social movement defending their land and rights has been criminalized, many leaders are in jail, and many others have been killed.”
Nothing is known about this in the United States and the very obvious cause of it remains suppressed.
In the 1950s, President Eisenhower and Secretary of State John Foster Dulles explained quite clearly the dilemma that the U.S. faced. They complained that the Communists had an unfair advantage. They were able to “appeal directly to the masses” and “get control of mass movements, something we have no capacity to duplicate. The poor people are the ones they appeal to and they have always wanted to plunder the rich.”
That causes problems. The U.S. somehow finds it difficult to appeal to the poor with its doctrine that the rich should plunder the poor.
The Cuban Example
A clear illustration of the general pattern was Cuba, when it finally gained independence in 1959. Within months, military attacks on the island began. Shortly after, the Eisenhower administration made a secret decision to overthrow the government. John F. Kennedy then became president. He intended to devote more attention to Latin America and so, on taking office, he created a study group to develop policies headed by the historian Arthur Schlesinger, who summarized its conclusions for the incoming president.
As Schlesinger explained, what was threatening in an independent Cuba was “the Castro idea of taking matters into one’s own hands.” It was an idea that unfortunately appealed to the mass of the population in Latin America, where “the distribution of land and other forms of national wealth greatly favors the propertied classes, and the poor and underprivileged, stimulated by the example of the Cuban revolution, are now demanding opportunities for a decent living.” Again, Washington’s usual dilemma.
As the CIA explained, “The extensive influence of ‘Castroism’ is not a function of Cuban power… Castro’s shadow looms large because social and economic conditions throughout Latin America invite opposition to ruling authority and encourage agitation for radical change,” for which his Cuba provides a model. Kennedy feared that Russian aid might make Cuba a “showcase” for development, giving the Soviets the upper hand throughout Latin America.
The State Department Policy Planning Council warned that “the primary danger we face in Castro is… in the impact the very existence of his regime has upon the leftist movement in many Latin American countries… The simple fact is that Castro represents a successful defiance of the U.S., a negation of our whole hemispheric policy of almost a century and a half” — that is, since the Monroe Doctrine of 1823, when the U.S. declared its intention of dominating the hemisphere.
The immediate goal at the time was to conquer Cuba, but that could not be achieved because of the power of the British enemy. Still, that grand strategist John Quincy Adams, the intellectual father of the Monroe Doctrine and Manifest Destiny, informed his colleagues that over time Cuba would fall into our hands by “the laws of political gravitation,” as an apple falls from the tree. In brief, U.S. power would increase and Britain’s would decline.
In 1898, Adams’s prognosis was realized. The U.S. invaded Cuba in the guise of liberating it. In fact, it prevented the island’s liberation from Spain and turned it into a “virtual colony,” to quote historians Ernest May and Philip Zelikow. Cuba remained so until January 1959, when it gained independence. Since that time it has been subjected to major U.S. terrorist wars, primarily during the Kennedy years, and economic strangulation. Not because of the Russians.
The pretense all along was that we were defending ourselves from the Russian threat — an absurd explanation that generally went unchallenged. A simple test of the thesis is what happened when any conceivable Russian threat disappeared. U.S. policy toward Cuba became even harsher, spearheaded by liberal Democrats, including Bill Clinton, who outflanked Bush from the right in the 1992 election. On the face of it, these events should have considerable bearing on the validity of the doctrinal framework for discussion of foreign policy and the factors that drive it. Once again, however, the impact was slight.
The Virus of Nationalism
To borrow Henry Kissinger’s terminology, independent nationalism is a “virus” that might “spread contagion.” Kissinger was referring to Salvador Allende’s Chile. The virus was the idea that there might be a parliamentary path towards some kind of socialist democracy. The way to deal with such a threat is to destroy the virus and to inoculate those who might be infected, typically by imposing murderous national security states. That was achieved in the case of Chile, but it is important to recognize that the thinking holds worldwide.
It was, for example, the reasoning behind the decision to oppose Vietnamese nationalism in the early 1950s and support France’s effort to reconquer its former colony. It was feared that independent Vietnamese nationalism might be a virus that would spread contagion to the surrounding regions, including resource-rich Indonesia. That might even have led Japan — called the “superdomino” by Asia scholar John Dower — to become the industrial and commercial center of an independent new order of the kind imperial Japan had so recently fought to establish. That, in turn, would have meant that the U.S. had lost the Pacific war, not an option to be considered in 1950. The remedy was clear — and largely achieved. Vietnam was virtually destroyed and ringed by military dictatorships that kept the “virus” from spreading contagion.
In retrospect, Kennedy-Johnson National Security Adviser McGeorge Bundy reflected that Washington should have ended the Vietnam War in 1965, when the Suharto dictatorship was installed in Indonesia, with enormous massacres that the CIA compared to the crimes of Hitler, Stalin, and Mao. These were, however, greeted with unconstrained euphoria in the U.S. and the West generally because the “staggering bloodbath,” as the press cheerfully described it, ended any threat of contagion and opened Indonesia’s rich resources to western exploitation. After that, the war to destroy Vietnam was superfluous, as Bundy recognized in retrospect.
The same was true in Latin America in the same years: one virus after another was viciously attacked and either destroyed or weakened to the point of bare survival. From the early 1960s, a plague of repression was imposed on the continent that had no precedent in the violent history of the hemisphere, extending to Central America in the 1980s under Ronald Reagan, a matter that there should be no need to review.
Much the same was true in the Middle East. The unique U.S. relations with Israel were established in their current form in 1967, when Israel delivered a smashing blow to Egypt, the center of secular Arab nationalism. By doing so, it protected U.S. ally Saudi Arabia, then engaged in military conflict with Egypt in Yemen. Saudi Arabia, of course, is the most extreme radical fundamentalist Islamic state, and also a missionary state, expending huge sums to establish its Wahhabi-Salafi doctrines beyond its borders. It is worth remembering that the U.S., like England before it, has tended to support radical fundamentalist Islam in opposition to secular nationalism, which has usually been perceived as posing more of a threat of independence and contagion.
The Value of Secrecy
There is much more to say, but the historical record demonstrates very clearly that the standard doctrine has little merit. Security in the normal sense is not a prominent factor in policy formation.
To repeat, in the normal sense. But in evaluating the standard doctrine we have to ask what is actually meant by “security”: security for whom?
One answer is: security for state power. There are many illustrations. Take a current one. In May, the U.S. agreed to support a U.N. Security Council resolution calling on the International Criminal Court to investigate war crimes in Syria, but with a proviso: there could be no inquiry into possible war crimes by Israel. Or by Washington, though it was really unnecessary to add that last condition. The U.S. is uniquely self-immunized from the international legal system. In fact, there is even congressional legislation authorizing the president to use armed force to “rescue” any American brought to the Hague for trial — the “Netherlands Invasion Act,” as it is sometimes called in Europe. That once again illustrates the importance of protecting the security of state power.
But protecting it from whom? There is, in fact, a strong case to be made that a prime concern of government is the security of state power from the population. As those who have spent time rummaging through archives should be aware, government secrecy is rarely motivated by a genuine concern for security, but it definitely does serve to keep the population in the dark. And for good reasons, which were lucidly explained by the prominent liberal scholar and government adviser Samuel Huntington, the professor of the science of government at Harvard University. In his words: “The architects of power in the United States must create a force that can be felt but not seen. Power remains strong when it remains in the dark; exposed to the sunlight it begins to evaporate.”
He wrote that in 1981, when the Cold War was again heating up, and he explained further that “you may have to sell [intervention or other military action] in such a way as to create the misimpression that it is the Soviet Union that you are fighting. That is what the United States has been doing ever since the Truman Doctrine.”
These simple truths are rarely acknowledged, but they provide insight into state power and policy, with reverberations to the present moment.
State power has to be protected from its domestic enemy; in sharp contrast, the population is not secure from state power. A striking current illustration is the radical attack on the Constitution by the Obama administration’s massive surveillance program. It is, of course, justified by “national security.” That is routine for virtually all actions of all states and so carries little information.
When the NSA’s surveillance program was exposed by Edward Snowden’s revelations, high officials claimed that it had prevented 54 terrorist acts. On inquiry, that was whittled down to a dozen. A high-level government panel then discovered that there was actually only one case: someone had sent $8,500 to Somalia. That was the total yield of the huge assault on the Constitution and, of course, on others throughout the world.
Britain’s attitude is interesting. In 2007, the British government called on Washington’s colossal spy agency “to analyze and retain any British citizens’ mobile phone and fax numbers, emails, and IP addresses swept up by its dragnet,” the Guardian reported. That is a useful indication of the relative significance, in government eyes, of the privacy of its own citizens and of Washington’s demands.
Another concern is security for private power. One current illustration is the huge trade agreements now being negotiated, the Trans-Pacific and Trans-Atlantic pacts. These are being negotiated in secret — but not completely in secret. They are not secret from the hundreds of corporate lawyers who are drawing up the detailed provisions. It is not hard to guess what the results will be, and the few leaks about them suggest that the expectations are accurate. Like NAFTA and other such pacts, these are not free trade agreements. In fact, they are not even trade agreements, but primarily investor rights agreements.
Again, secrecy is critically important to protect the primary domestic constituency of the governments involved, the corporate sector.
The Final Century of Human Civilization?
There are other examples too numerous to mention, facts that are well-established and would be taught in elementary schools in free societies.
There is, in other words, ample evidence that securing state power from the domestic population and securing concentrated private power are driving forces in policy formation. Of course, it is not quite that simple. There are interesting cases, some quite current, where these commitments conflict, but consider this a good first approximation and radically opposed to the received standard doctrine.
Let us turn to another question: What about the security of the population? It is easy to demonstrate that this is a marginal concern of policy planners. Take two prominent current examples, global warming and nuclear weapons. As any literate person is doubtless aware, these are dire threats to the security of the population. Turning to state policy, we find that it is committed to accelerating each of those threats — in the interests of the primary concerns, protection of state power and of the concentrated private power that largely determines state policy.
Consider global warming. There is now much exuberance in the United States about “100 years of energy independence” as we become “the Saudi Arabia of the next century” — perhaps the final century of human civilization if current policies persist.
That illustrates very clearly the nature of the concern for security, certainly not for the population. It also illustrates the moral calculus of contemporary Anglo-American state capitalism: the fate of our grandchildren counts as nothing when compared with the imperative of higher profits tomorrow.
These conclusions are fortified by a closer look at the propaganda system. There is a huge public relations campaign in the U.S., organized quite openly by Big Energy and the business world, to try to convince the public that global warming is either unreal or not a result of human activity. And it has had some impact. The U.S. ranks lower than other countries in public concern about global warming, and the results are stratified: among Republicans, the party more fully dedicated to the interests of wealth and corporate power, concern ranks far lower than the global norm.
The current issue of the premier journal of media criticism, the Columbia Journalism Review, has an interesting article on this subject, attributing this outcome to the media doctrine of “fair and balanced.” In other words, if a journal publishes an opinion piece reflecting the conclusions of 97% of scientists, it must also run a counter-piece expressing the viewpoint of the energy corporations.
That indeed is what happens, but there certainly is no “fair and balanced” doctrine. Thus, if a journal runs an opinion piece denouncing Russian President Vladimir Putin for the criminal act of taking over the Crimea, it surely does not have to run a piece pointing out that, while the act is indeed criminal, Russia has a far stronger case today than the U.S. did more than a century ago in taking over southeastern Cuba, including the country’s major port — and rejecting the Cuban demand since independence to have it returned. And the same is true of many other cases. The actual media doctrine is “fair and balanced” when the concerns of concentrated private power are involved, but surely not elsewhere.
On the issue of nuclear weapons, the record is similarly interesting — and frightening. It reveals very clearly that, from the earliest days, the security of the population was a non-issue, and remains so. There is no time here to run through the shocking record, but there is little doubt that it strongly supports the lament of General Lee Butler, the last commander of the Strategic Air Command, which was armed with nuclear weapons. In his words, we have so far survived the nuclear age “by some combination of skill, luck, and divine intervention, and I suspect the latter in greatest proportion.” And we can hardly count on continued divine intervention as policymakers play roulette with the fate of the species in pursuit of the driving factors in policy formation.
As we are all surely aware, we now face the most ominous decisions in human history. There are many problems that must be addressed, but two are overwhelming in their significance: environmental destruction and nuclear war. For the first time in history, we face the possibility of destroying the prospects for decent existence — and not in the distant future. For this reason alone, it is imperative to sweep away the ideological clouds and face honestly and realistically the question of how policy decisions are made, and what we can do to alter them before it is too late.
Noam Chomsky is Institute Professor emeritus in the Department of Linguistics and Philosophy at Massachusetts Institute of Technology. Among his recent books are Hegemony or Survival, Failed States, Power Systems, Occupy, and Hopes and Prospects. His latest book, Masters of Mankind, will be published soon by Haymarket Books, which is also reissuing twelve of his classic books in new editions over the coming year. His website is www.chomsky.info.
Copyright 2014 Noam Chomsky
What is the future likely to bring? A reasonable stance might be to try to look at the human species from the outside. So imagine that you’re an extraterrestrial observer who is trying to figure out what’s happening here or, for that matter, imagine you’re an historian 100 years from now — assuming there are any historians 100 years from now, which is not obvious — and you’re looking back at what’s happening today. You’d see something quite remarkable.
For the first time in the history of the human species, we have clearly developed the capacity to destroy ourselves. That’s been true since 1945. It’s now being finally recognized that there are more long-term processes like environmental destruction leading in the same direction, maybe not to total destruction, but at least to the destruction of the capacity for a decent existence.
And there are other dangers like pandemics, which have to do with globalization and interaction. So there are processes underway and institutions right in place, like nuclear weapons systems, which could lead to a serious blow to, or maybe the termination of, an organized existence.
How to Destroy a Planet Without Really Trying
The question is: What are people doing about it? None of this is a secret. It’s all perfectly open. In fact, you have to make an effort not to see it.
There have been a range of reactions. There are those who are trying hard to do something about these threats, and others who are acting to escalate them. If you look at who they are, this future historian or extraterrestrial observer would see something strange indeed. Trying to mitigate or overcome these threats are the least developed societies, the indigenous populations, or the remnants of them, tribal societies and First Nations in Canada. They’re not talking about nuclear war but environmental disaster, and they’re really trying to do something about it.
In fact, all over the world — Australia, India, South America — there are battles going on, sometimes wars. In India, it’s a major war over direct environmental destruction, with tribal societies trying to resist resource extraction operations that are extremely harmful locally, but also in their general consequences. In societies where indigenous populations have an influence, many are taking a strong stand. The strongest of any country with regard to global warming is in Bolivia, which has an indigenous majority and constitutional requirements that protect the “rights of nature.”
Ecuador, which also has a large indigenous population, is the only oil exporter I know of where the government is seeking aid to help keep that oil in the ground, instead of producing and exporting it — and the ground is where it ought to be.
Venezuelan President Hugo Chavez, who died recently and was the object of mockery, insult, and hatred throughout the Western world, attended a session of the U.N. General Assembly a few years ago where he elicited all sorts of ridicule for calling George W. Bush a devil. He also gave a speech there that was quite interesting. Of course, Venezuela is a major oil producer. Oil is practically their whole gross domestic product. In that speech, he warned of the dangers of the overuse of fossil fuels and urged producer and consumer countries to get together and try to work out ways to reduce fossil fuel use. That was pretty amazing on the part of an oil producer. You know, he was part Indian, of indigenous background. Unlike the funny things he did, this aspect of his actions at the U.N. was never even reported.
So, at one extreme you have indigenous, tribal societies trying to stem the race to disaster. At the other extreme, the richest, most powerful societies in world history, like the United States and Canada, are racing full-speed ahead to destroy the environment as quickly as possible. Unlike Ecuador, and indigenous societies throughout the world, they want to extract every drop of hydrocarbons from the ground with all possible speed.
Both political parties, President Obama, the media, and the international press seem to be looking forward with great enthusiasm to what they call “a century of energy independence” for the United States. Energy independence is an almost meaningless concept, but put that aside. What they mean is: we’ll have a century in which to maximize the use of fossil fuels and contribute to destroying the world.
And that’s pretty much the case everywhere. Admittedly, when it comes to alternative energy development, Europe is doing something. Meanwhile, the United States, the richest and most powerful country in world history, is the only nation among perhaps 100 relevant ones that doesn’t have a national policy for restricting the use of fossil fuels, that doesn’t even have renewable energy targets. It’s not because the population doesn’t want it. Americans are pretty close to the international norm in their concern about global warming. It’s institutional structures that block change. Business interests don’t want it and they’re overwhelmingly powerful in determining policy, so you get a big gap between opinion and policy on lots of issues, including this one.
So that’s what the future historian — if there is one — would see. He might also read today’s scientific journals. Just about every one you open has a more dire prediction than the last.
“The Most Dangerous Moment in History”
The other issue is nuclear war. It’s been known for a long time that if there were to be a first strike by a major power, even with no retaliation, it would probably destroy civilization just because of the nuclear-winter consequences that would follow. You can read about it in the Bulletin of the Atomic Scientists. It’s well understood. So the danger has always been a lot worse than we thought it was.
We’ve just passed the 50th anniversary of the Cuban Missile Crisis, which was called “the most dangerous moment in history” by historian Arthur Schlesinger, President John F. Kennedy’s advisor. Which it was. It was a very close call, and not the only time either. In some ways, however, the worst aspect of these grim events is that the lessons haven’t been learned.
What happened in the missile crisis in October 1962 has been prettified to make it look as if acts of courage and thoughtfulness abounded. The truth is that the whole episode was almost insane. There was a point, as the missile crisis was reaching its peak, when Soviet Premier Nikita Khrushchev wrote to Kennedy offering to settle it by a public announcement of a withdrawal of Russian missiles from Cuba and U.S. missiles from Turkey. Actually, Kennedy hadn’t even known that the U.S. had missiles in Turkey at the time. They were being withdrawn anyway, because they were being replaced by more lethal Polaris nuclear submarines, which were invulnerable.
So that was the offer. Kennedy and his advisors considered it — and rejected it. At the time, Kennedy himself was estimating the likelihood of nuclear war at a third to a half. So Kennedy was willing to accept a very high risk of massive destruction in order to establish the principle that we — and only we — have the right to offensive missiles beyond our borders, in fact anywhere we like, no matter what the risk to others — and to ourselves, if matters fall out of control. We have that right, but no one else does.
Kennedy did, however, accept a secret agreement to withdraw the missiles the U.S. was already withdrawing, as long as it was never made public. Khrushchev, in other words, had to openly withdraw the Russian missiles while the U.S. secretly withdrew its obsolete ones; that is, Khrushchev had to be humiliated and Kennedy had to maintain his macho image. He’s greatly praised for this: courage and coolness under threat, and so on. The horror of his decisions is not even mentioned — try to find it on the record.
And to add a little more, a couple of months before the crisis blew up the United States had sent missiles with nuclear warheads to Okinawa. These were aimed at China during a period of great regional tension.
Well, who cares? We have the right to do anything we want anywhere in the world. That was one grim lesson from that era, but there were others to come.
Ten years after that, in 1973, Secretary of State Henry Kissinger called a high-level nuclear alert. It was his way of warning the Russians not to interfere in the ongoing Israel-Arab war and, in particular, not to interfere after he had informed the Israelis that they could violate a ceasefire the U.S. and Russia had just agreed upon. Fortunately, nothing happened.
Ten years later, President Ronald Reagan was in office. Soon after he entered the White House, he and his advisors had the Air Force start penetrating Russian air space to try to elicit information about Russian warning systems. Essentially, these were mock attacks, culminating in the 1983 NATO exercise Able Archer, which simulated escalation toward a nuclear strike. The Russians were uncertain, some high-level officials fearing that this was a step towards a real first strike. Fortunately, they didn’t react, though it was a close call. And it goes on like that.
What to Make of the Iranian and North Korean Nuclear Crises
At the moment, the nuclear issue is regularly on front pages in the cases of North Korea and Iran. There are ways to deal with these ongoing crises. Maybe they wouldn’t work, but at least you could try. They are, however, not even being considered, not even reported.
Take the case of Iran, which is considered in the West — not in the Arab world, not in Asia — the gravest threat to world peace. It’s a Western obsession, and it’s interesting to look into the reasons for it, but I’ll put that aside here. Is there a way to deal with the supposed gravest threat to world peace? Actually there are quite a few. One way, a pretty sensible one, was proposed a couple of months ago at a meeting of the non-aligned countries in Tehran. In fact, they were just reiterating a proposal that’s been around for decades, pressed particularly by Egypt, and has been approved by the U.N. General Assembly.
The proposal is to move toward establishing a nuclear-weapons-free zone in the region. That wouldn’t be the answer to everything, but it would be a pretty significant step forward. And there were ways to proceed. Under U.N. auspices, there was to be an international conference in Finland last December to try to implement plans to move toward this. What happened?
You won’t read about it in the newspapers because it wasn’t reported — only in specialist journals. In early November, Iran agreed to attend the meeting. A couple of days later Obama cancelled the meeting, saying the time wasn’t right. The European Parliament issued a statement calling for it to continue, as did the Arab states. Nothing resulted. So we’ll move toward ever-harsher sanctions against the Iranian population — it doesn’t hurt the regime — and maybe war. Who knows what will happen?
In Northeast Asia, it’s the same sort of thing. North Korea may be the craziest country in the world. It’s certainly a good competitor for that title. But it does make sense to try to figure out what’s in the minds of people when they’re acting in crazy ways. Why would they behave the way they do? Just imagine ourselves in their situation. Imagine what it meant in the Korean War years of the early 1950s for your country to be totally leveled, everything destroyed by a huge superpower, which furthermore was gloating about what it was doing. Imagine the imprint that would leave behind.
Bear in mind that the North Korean leadership is likely to have read the public military journals of this superpower at that time explaining that, since everything else in North Korea had been destroyed, the air force was sent to destroy North Korea’s dams, huge dams that controlled the water supply — a war crime, by the way, for which people were hanged in Nuremberg. And these official journals were talking excitedly about how wonderful it was to see the water pouring down, digging out the valleys, and the Asians scurrying around trying to survive. The journals were exulting in what this meant to those “Asians,” horrors beyond our imagination. It meant the destruction of their rice crop, which in turn meant starvation and death. How magnificent! It’s not in our memory, but it’s in their memory.
Let’s turn to the present. There’s an interesting recent history. In 1993, Israel and North Korea were moving towards an agreement in which North Korea would stop sending any missiles or military technology to the Middle East and Israel would recognize that country. President Clinton intervened and blocked it. Shortly after that, in retaliation, North Korea carried out a minor missile test. The U.S. and North Korea did then reach a framework agreement in 1994 that halted North Korea’s nuclear work and was more or less honored by both sides. When George W. Bush came into office, North Korea had maybe one nuclear weapon and verifiably wasn’t producing any more.
Bush immediately launched his aggressive militarism, threatening North Korea — “axis of evil” and all that — so North Korea got back to work on its nuclear program. By the time Bush left office, they had eight to ten nuclear weapons and a missile system, another great neocon achievement. In between, other things happened. In 2005, the U.S. and North Korea actually reached an agreement in which North Korea was to end all nuclear weapons and missile development. In return, the West, but mainly the United States, was to provide a light-water reactor for its medical needs and end aggressive statements. They would then form a nonaggression pact and move toward accommodation.
It was pretty promising, but almost immediately Bush undermined it. He withdrew the offer of the light-water reactor and initiated programs to compel banks to stop handling any North Korean transactions, even perfectly legal ones. The North Koreans reacted by reviving their nuclear weapons program. And that’s the way it’s been going.
It’s well known. You can read it in straight, mainstream American scholarship. What they say is: it’s a pretty crazy regime, but it’s also following a kind of tit-for-tat policy. You make a hostile gesture and we’ll respond with some crazy gesture of our own. You make an accommodating gesture and we’ll reciprocate in some way.
Lately, for instance, there have been South Korean-U.S. military exercises on the Korean peninsula which, from the North’s point of view, have got to look threatening. We’d think they were threatening if they were going on in Canada and aimed at us. In the course of these, the most advanced bombers in history, B-2 stealth bombers and B-52s, are carrying out simulated nuclear bombing attacks right on North Korea’s borders.
This surely sets off alarm bells from the past. They remember that past, so they’re reacting in a very aggressive, extreme way. Well, what comes to the West from all this is how crazy and how awful the North Korean leaders are. Yes, they are. But that’s hardly the whole story, and this is the way the world is going.
It’s not that there are no alternatives. The alternatives just aren’t being taken. That’s dangerous. So if you ask what the world is going to look like, it’s not a pretty picture. Unless people do something about it. We always can.
Noam Chomsky is Institute Professor Emeritus in the MIT Department of Linguistics and Philosophy. A TomDispatch regular, he is the author of numerous best-selling political works, including Hopes and Prospects, Making the Future, and most recently (with interviewer David Barsamian), Power Systems: Conversations on Global Democratic Uprisings and the New Challenges to U.S. Empire (The American Empire Project, Metropolitan Books).
[Note: This piece was adapted (with the help of Noam Chomsky) from an online video interview done by the website What, which is dedicated to integrating knowledge from different fields with the aim of encouraging the balance between the individual, society, and the environment.]
Copyright 2013 Noam Chomsky
[This post is adapted from “Uprisings,” a chapter in Power Systems: Conversations on Global Democratic Uprisings and the New Challenges to U.S. Empire, Noam Chomsky’s new interview book with David Barsamian (with thanks to the publisher, Metropolitan Books). The questions are Barsamian’s, the answers Chomsky’s.]
Does the United States still have the same level of control over the energy resources of the Middle East as it once had?
The major energy-producing countries are still firmly under the control of the Western-backed dictatorships. So, actually, the progress made by the Arab Spring is limited, but it’s not insignificant. The Western-controlled dictatorial system is eroding. In fact, it’s been eroding for some time. So, for example, if you go back 50 years, the energy resources — the main concern of U.S. planners — have been mostly nationalized. There are constantly attempts to reverse that, but they have not succeeded.
Take the U.S. invasion of Iraq, for example. To everyone except a dedicated ideologue, it was pretty obvious that we invaded Iraq not because of our love of democracy but because it’s maybe the second- or third-largest source of oil in the world, and is right in the middle of the major energy-producing region. You’re not supposed to say this. It’s considered a conspiracy theory.
The United States was seriously defeated in Iraq by Iraqi nationalism — mostly by nonviolent resistance. The United States could kill the insurgents, but they couldn’t deal with half a million people demonstrating in the streets. Step by step, Iraq was able to dismantle the controls put in place by the occupying forces. By November 2007, it was becoming pretty clear that it was going to be very hard to reach U.S. goals. And at that point, interestingly, those goals were explicitly stated. So in November 2007 the Bush II administration came out with an official declaration about what any future arrangement with Iraq would have to be. It had two major requirements: one, that the United States must be free to carry out combat operations from its military bases, which it will retain; and two, “encouraging the flow of foreign investments to Iraq, especially American investments.” In January 2008, Bush made this clear in one of his signing statements. A couple of months later, in the face of Iraqi resistance, the United States had to give that up. Control of Iraq is now disappearing before their eyes.
Iraq was an attempt to reinstitute by force something like the old system of control, but it was beaten back. In general, I think, U.S. policies remain constant, going back to the Second World War. But the capacity to implement them is declining.
Declining because of economic weakness?
Partly because the world is just becoming more diverse. It has more diverse power centers. At the end of the Second World War, the United States was absolutely at the peak of its power. It had half the world’s wealth and every one of its competitors was seriously damaged or destroyed. It had a position of unimaginable security and developed plans to essentially run the world — not unrealistically at the time.
This was called “Grand Area” planning?
Yes. Right after the Second World War, George Kennan, head of the U.S. State Department policy planning staff, and others sketched out the details, and then they were implemented. What’s happening now in the Middle East and North Africa, to an extent, and in South America substantially goes all the way back to the late 1940s. The first major successful resistance to U.S. hegemony was in 1949. That’s when an event took place, which, interestingly, is called “the loss of China.” It’s a very interesting phrase, never challenged. There was a lot of discussion about who is responsible for the loss of China. It became a huge domestic issue. But it’s a very interesting phrase. You can only lose something if you own it. It was just taken for granted: we possess China — and if they move toward independence, we’ve lost China. Later came concerns about “the loss of Latin America,” “the loss of the Middle East,” “the loss of” certain countries, all based on the premise that we own the world and anything that weakens our control is a loss to us and we wonder how to recover it.
On the other hand, the capacity to preserve control has sharply declined. By 1970, the world was already what was called tripolar economically, with a U.S.-based North American industrial center, a German-based European center, roughly comparable in size, and a Japan-based East Asian center, which was then the most dynamic growth region in the world. Since then, the global economic order has become much more diverse. So it’s harder to carry out our policies, but the underlying principles have not changed much.
Take the Clinton doctrine. The Clinton doctrine was that the United States is entitled to resort to unilateral force to ensure “uninhibited access to key markets, energy supplies, and strategic resources.” That goes beyond anything that George W. Bush said. But it was quiet and it wasn’t arrogant and abrasive, so it didn’t cause much of an uproar. The belief in that entitlement continues right to the present. It’s also part of the intellectual culture.
Right after the assassination of Osama bin Laden, amid all the cheers and applause, there were a few critical comments questioning the legality of the act. Centuries ago, there used to be something called presumption of innocence. If you apprehend a suspect, he’s a suspect until proven guilty. He should be brought to trial. It’s a core part of American law. You can trace it back to Magna Carta. So there were a couple of voices saying maybe we shouldn’t throw out the whole basis of Anglo-American law. That led to a lot of very angry and infuriated reactions, but the most interesting ones were, as usual, on the left liberal end of the spectrum. Matthew Yglesias, a well-known and highly respected left liberal commentator, wrote an article in which he ridiculed these views. He said they’re “amazingly naive,” silly. Then he expressed the reason. He said that “one of the main functions of the international institutional order is precisely to legitimate the use of deadly military force by western powers.” Of course, he didn’t mean Norway. He meant the United States. So the principle on which the international system is based is that the United States is entitled to use force at will. To talk about the United States violating international law or something like that is amazingly naive, completely silly. Incidentally, I was the target of those remarks, and I’m happy to confess my guilt. I do think that Magna Carta and international law are worth paying some attention to.
I merely mention that to illustrate that in the intellectual culture, even at what’s called the left liberal end of the political spectrum, the core principles haven’t changed very much. But the capacity to implement them has been sharply reduced. That’s why you get all this talk about American decline. Take a look at the year-end issue of Foreign Affairs, the main establishment journal. Its big front-page cover asks, in bold face, “Is America Over?” It’s a standard complaint of those who believe they should have everything. If you believe you should have everything and anything gets away from you, it’s a tragedy, the world is collapsing. So is America over? A long time ago we “lost” China, we’ve lost Southeast Asia, we’ve lost South America. Maybe we’ll lose the Middle East and North African countries. Is America over? It’s a kind of paranoia, but it’s the paranoia of the superrich and the superpowerful. If you don’t have everything, it’s a disaster.
The New York Times describes the “defining policy quandary of the Arab Spring: how to square contradictory American impulses that include support for democratic change, a desire for stability, and wariness of Islamists who have become a potent political force.” The Times identifies three U.S. goals. What do you make of them?
Two of them are accurate. The United States is in favor of stability. But you have to remember what stability means. Stability means conformity to U.S. orders. So, for example, one of the charges against Iran, the big foreign policy threat, is that it is destabilizing Iraq and Afghanistan. How? By trying to expand its influence into neighboring countries. On the other hand, we “stabilize” countries when we invade them and destroy them.
I’ve occasionally quoted one of my favorite illustrations of this, which is from a well-known, very good liberal foreign policy analyst, James Chace, a former editor of Foreign Affairs. Writing about the overthrow of the Salvador Allende regime and the imposition of the dictatorship of Augusto Pinochet in 1973, he said that we had to “destabilize” Chile in the interests of “stability.” That’s not perceived to be a contradiction — and it isn’t. We had to destroy the parliamentary system in order to gain stability, meaning that they do what we say. So yes, we are in favor of stability in this technical sense.
Concern about political Islam is just like concern about any independent development. Anything that’s independent you have to have concern about because it might undermine you. In fact, it’s a little ironic, because traditionally the United States and Britain have by and large strongly supported radical Islamic fundamentalism, not political Islam, as a force to block secular nationalism, the real concern. So, for example, Saudi Arabia is the most extreme fundamentalist state in the world, a radical Islamic state. It has a missionary zeal, spreading radical Islam to Pakistan and funding terror. But it’s the bastion of U.S. and British policy. They’ve consistently supported it against the threat of secular nationalism from Gamal Abdel Nasser’s Egypt and Abd al-Karim Qasim’s Iraq, among many others. But they don’t like political Islam because it might become independent.
The first of the three points, our yearning for democracy, that’s about on the level of Joseph Stalin talking about the Russian commitment to freedom, democracy, and liberty for the world. It’s the kind of statement you laugh about when you hear it from commissars or Iranian clerics, but you nod politely and maybe even with awe when you hear it from their Western counterparts.
If you look at the record, the yearning for democracy is a bad joke. That’s even recognized by leading scholars, though they don’t put it this way. One of the major scholars on so-called democracy promotion is Thomas Carothers, who is pretty conservative and highly regarded — a neo-Reaganite, not a flaming liberal. He worked in Reagan’s State Department and has several books reviewing the course of democracy promotion, which he takes very seriously. He says, yes, this is a deep-seated American ideal, but it has a funny history. The history is that every U.S. administration is “schizophrenic.” They support democracy only if it conforms to certain strategic and economic interests. He describes this as a strange pathology, as if the United States needed psychiatric treatment or something. Of course, there’s another interpretation, but one that can’t come to mind if you’re a well-educated, properly behaved intellectual.
Within several months of the toppling of [President Hosni] Mubarak in Egypt, he was in the dock facing criminal charges and prosecution. It’s inconceivable that U.S. leaders will ever be held to account for their crimes in Iraq or beyond. Is that going to change anytime soon?
That’s basically the Yglesias principle: the very foundation of the international order is that the United States has the right to use violence at will. So how can you charge anybody?
And no one else has that right.
Of course not. Well, maybe our clients do. If Israel invades Lebanon and kills a thousand people and destroys half the country, okay, that’s all right. It’s interesting. Barack Obama was a senator before he was president. He didn’t do much as a senator, but he did a couple of things, including one he was particularly proud of. In fact, if you looked at his website before the primaries, he highlighted the fact that, during the Israeli invasion of Lebanon in 2006, he cosponsored a Senate resolution demanding that the United States do nothing to impede Israel’s military actions until they had achieved their objectives and censuring Iran and Syria because they were supporting resistance to Israel’s destruction of southern Lebanon, incidentally, for the fifth time in 25 years. So they inherit the right. Other clients do, too.
But the rights really reside in Washington. That’s what it means to own the world. It’s like the air you breathe. You can’t question it. The main founder of contemporary IR [international relations] theory, Hans Morgenthau, was really quite a decent person, one of the very few political scientists and international affairs specialists to criticize the Vietnam War on moral, not tactical, grounds. Very rare. He wrote a book called The Purpose of American Politics. You already know what’s coming. Other countries don’t have purposes. The purpose of America, on the other hand, is “transcendent”: to bring freedom and justice to the rest of the world. But he’s a good scholar, like Carothers. So he went through the record. He said, when you study the record, it looks as if the United States hasn’t lived up to its transcendent purpose. But then he says, to criticize our transcendent purpose “is to fall into the error of atheism, which denies the validity of religion on similar grounds” — which is a good comparison. It’s a deeply entrenched religious belief. It’s so deep that it’s going to be hard to disentangle it. And if anyone questions that, it leads to near hysteria and often to charges of anti-Americanism or “hating America” — interesting concepts that don’t exist in democratic societies, only in totalitarian societies and here, where they’re just taken for granted.
Noam Chomsky is Institute Professor Emeritus in the MIT Department of Linguistics and Philosophy. A TomDispatch regular, he is the author of numerous best-selling political works, including recently Hopes and Prospects and Making the Future. This piece is adapted from the chapter “Uprisings” in his newest book (with interviewer David Barsamian), Power Systems: Conversations on Global Democratic Uprisings and the New Challenges to U.S. Empire (The American Empire Project, Metropolitan Books).
Excerpted from Power Systems: Conversations on Global Democratic Uprisings and the New Challenges to U.S. Empire, published this month by Metropolitan Books, an imprint of Henry Holt and Company, LLC. Copyright (c) 2013 by Noam Chomsky and David Barsamian. All rights reserved.
The Paranoia of the Superrich and Superpowerful
The world stood still 50 years ago during the last week of October, from the moment when it learned that the Soviet Union had placed nuclear-armed missiles in Cuba until the crisis was officially ended — though unknown to the public, only officially.
The image of the world standing still is the turn of phrase of Sheldon Stern, former historian at the John F. Kennedy Presidential Library, who published the authoritative version of the tapes of the ExComm meetings where Kennedy and a close circle of advisers debated how to respond to the crisis. Those meetings were secretly recorded by the president, which might bear on the fact that his stand throughout the recorded sessions is relatively temperate compared to other participants, who were unaware that they were speaking to history.
Stern has just published an accessible and accurate review of this critically important documentary record, finally declassified in the late 1990s. I will keep to that here. “Never before or since,” he concludes, “has the survival of human civilization been at stake in a few short weeks of dangerous deliberations,” culminating in “the week the world stood still.”
There was good reason for the global concern. A nuclear war was all too imminent, a war that might “destroy the Northern Hemisphere,” President Dwight Eisenhower had warned. Kennedy’s own judgment was that the probability of war might have been as high as 50%. Estimates became higher as the confrontation reached its peak and the “secret doomsday plan to ensure the survival of the government was put into effect” in Washington, as described by journalist Michael Dobbs in his well-researched bestseller on the crisis (though he doesn’t explain why there would be much point in doing so, given the likely nature of nuclear war).
Dobbs quotes Dino Brugioni, “a key member of the CIA team monitoring the Soviet missile buildup,” who saw no way out except “war and complete destruction” as the clock moved to “one minute to midnight,” the title of his book. Kennedy’s close associate, historian Arthur Schlesinger, described the events as “the most dangerous moment in human history.” Defense Secretary Robert McNamara wondered aloud whether he “would live to see another Saturday night,” and later recognized that “we lucked out” — barely.
“The Most Dangerous Moment”
A closer look at what took place adds grim overtones to these judgments, with reverberations to the present moment.
There are several candidates for “the most dangerous moment.” One is October 27th, when U.S. destroyers enforcing a quarantine around Cuba were dropping depth charges on Soviet submarines. According to Soviet accounts, reported by the National Security Archive, submarine commanders were “rattled enough to talk about firing nuclear torpedoes, whose 15-kiloton explosive yields approximated the bomb that devastated Hiroshima in August 1945.”
In one case, a reported decision to assemble a nuclear torpedo for battle readiness was aborted at the last minute by Second Captain Vasili Arkhipov, who may have saved the world from nuclear disaster. There is little doubt what the U.S. reaction would have been had the torpedo been fired, or how the Russians would have responded as their country was going up in smoke.
Kennedy had already declared the highest nuclear alert short of launch (DEFCON 2), which authorized “NATO aircraft with Turkish pilots … [or others] … to take off, fly to Moscow, and drop a bomb,” according to the well-informed Harvard University strategic analyst Graham Allison, writing in the major establishment journal Foreign Affairs.
Another candidate is October 26th. That day has been selected as “the most dangerous moment” by B-52 pilot Major Don Clawson, who piloted one of those NATO aircraft and provides a hair-raising description of details of the Chrome Dome (CD) missions during the crisis — “B-52s on airborne alert” with nuclear weapons “on board and ready to use.”
October 26th was the day when “the nation was closest to nuclear war,” he writes in his “irreverent anecdotes of an Air Force pilot,” Is That Something the Crew Should Know? On that day, Clawson himself was in a good position to set off a likely terminal cataclysm. He concludes, “We were damned lucky we didn’t blow up the world — and no thanks to the political or military leadership of this country.”
The errors, confusions, near-accidents, and miscomprehension of the leadership that Clawson reports are startling enough, but nothing like the operative command-and-control rules — or lack of them. As Clawson recounts his experiences during the fifteen 24-hour CD missions he flew, the maximum possible, the official commanders “did not possess the capability to prevent a rogue-crew or crew-member from arming and releasing their thermonuclear weapons,” or even from broadcasting a mission that would have sent off “the entire Airborne Alert force without possibility of recall.” Once the crew was airborne carrying thermonuclear weapons, he writes, “it would have been possible to arm and drop them all with no further input from the ground. There was no inhibitor on any of the systems.”
About one-third of the total force was in the air, according to General David Burchinal, director of plans on the Air Staff at Air Force Headquarters. The Strategic Air Command (SAC), technically in charge, appears to have had little control. And according to Clawson’s account, the civilian National Command Authority was kept in the dark by SAC, which means that the ExComm “deciders” pondering the fate of the world knew even less. General Burchinal’s oral history is no less hair-raising, and reveals even greater contempt for the civilian command. According to him, Russian capitulation was never in doubt. The CD operations were designed to make it crystal clear to the Russians that they were hardly even competing in the military confrontation, and could quickly have been destroyed.
From the ExComm records, Stern concludes that, on October 26th, President Kennedy was “leaning towards military action to eliminate the missiles” in Cuba, to be followed by invasion, according to Pentagon plans. It was evident then that the act might have led to terminal war, a conclusion fortified by much later revelations that tactical nuclear weapons had been deployed and that Russian forces were far greater than U.S. intelligence had reported.
As the ExComm meetings were drawing to a close at 6 p.m. on the 26th, a letter arrived from Soviet Premier Nikita Khrushchev, sent directly to President Kennedy. His “message seemed clear,” Stern writes: “the missiles would be removed if the U.S. promised not to invade Cuba.”
The next day, at 10 a.m., the president again turned on the secret tape. He read aloud a wire service report that had just been handed to him: “Premier Khrushchev told President Kennedy in a message today he would withdraw offensive weapons from Cuba if the United States withdrew its rockets from Turkey” — Jupiter missiles with nuclear warheads. The report was soon authenticated.
Though received by the committee as an unexpected bolt from the blue, it had actually been anticipated: “we’ve known this might be coming for a week,” Kennedy informed them. To refuse public acquiescence would be difficult, he realized. These were obsolete missiles, already slated for withdrawal, soon to be replaced by far more lethal and effectively invulnerable Polaris submarines. Kennedy recognized that he would be in an “insupportable position if this becomes [Khrushchev’s] proposal,” both because the Turkish missiles were useless and were being withdrawn anyway, and because “it’s gonna — to any man at the United Nations or any other rational man, it will look like a very fair trade.”
Keeping U.S. Power Unrestrained
The planners therefore faced a serious dilemma. They had in hand two somewhat different proposals from Khrushchev to end the threat of catastrophic war, and each would seem to any “rational man” to be a fair trade. How then to react?
One possibility would have been to breathe a sigh of relief that civilization could survive and to eagerly accept both offers; to announce that the U.S. would adhere to international law and remove any threat to invade Cuba; and to carry forward the withdrawal of the obsolete missiles in Turkey, proceeding as planned to upgrade the nuclear threat against the Soviet Union to a far greater one — only part, of course, of the global encirclement of Russia. But that was unthinkable.
The basic reason why no such thought could be contemplated was spelled out by National Security Adviser McGeorge Bundy, former Harvard dean and reputedly the brightest star in the Camelot firmament. The world, he insisted, must come to understand that “[t]he current threat to peace is not in Turkey, it is in Cuba,” where missiles were directed against the U.S. A vastly more powerful U.S. missile force trained on the much weaker and more vulnerable Soviet enemy could not possibly be regarded as a threat to peace, because we are Good, as a great many people in the Western hemisphere and beyond could testify — among numerous others, the victims of the ongoing terrorist war that the U.S. was then waging against Cuba, or those swept up in the “campaign of hatred” in the Arab world that so puzzled Eisenhower, though not the National Security Council, which explained it clearly.
Of course, the idea that the U.S. should be restrained by international law was too ridiculous to merit consideration. As explained recently by the respected left-liberal commentator Matthew Yglesias, “one of the main functions of the international institutional order is precisely to legitimate the use of deadly military force by western powers” — meaning the U.S. — so that it is “amazingly naïve,” indeed quite “silly,” to suggest that it should obey international law or other conditions that we impose on the powerless. This was a frank and welcome exposition of operative assumptions, reflexively taken for granted by the ExComm assemblage.
In subsequent colloquy, the president stressed that we would be “in a bad position” if we chose to set off an international conflagration by rejecting proposals that would seem quite reasonable to survivors (if any cared). This “pragmatic” stance was about as far as moral considerations could reach.
In a review of recently released documents on Kennedy-era terror, Harvard University Latin Americanist Jorge Domínguez observes, “Only once in these nearly thousand pages of documentation did a U.S. official raise something that resembled a faint moral objection to U.S.-government sponsored terrorism”: a member of the National Security Council staff suggested that raids that are “haphazard and kill innocents… might mean a bad press in some friendly countries.”
The same attitudes prevailed throughout the internal discussions during the missile crisis, as when Robert Kennedy warned that a full-scale invasion of Cuba would “kill an awful lot of people, and we’re going to take an awful lot of heat on it.” And they prevail to the present, with only the rarest of exceptions, as easily documented.
We might have been “in even a worse position” if the world had known more about what the U.S. was doing at the time. Only recently was it learned that, six months earlier, the U.S. had secretly deployed missiles in Okinawa virtually identical to those the Russians would send to Cuba. These were surely aimed at China at a moment of elevated regional tensions. To this day, Okinawa remains a major offensive U.S. military base over the bitter objections of its inhabitants who, right now, are less than enthusiastic about the dispatch of accident-prone V-22 Osprey aircraft to the Futenma military base, located at the heart of a heavily populated urban center.
An Indecent Disrespect for the Opinions of Humankind
The deliberations that followed are revealing, but I will put them aside here. They did reach a conclusion. The U.S. pledged to withdraw the obsolete missiles from Turkey, but would not do so publicly or put the offer in writing: it was important that Khrushchev be seen to capitulate. An interesting reason was offered, and is accepted as reasonable by scholarship and commentary. As Dobbs puts it, “If it appeared that the United States was dismantling the missile bases unilaterally, under pressure from the Soviet Union, the [NATO] alliance might crack” — or to rephrase a little more accurately, if the U.S. replaced useless missiles with a far more lethal threat, as already planned, in a trade with Russia that any “rational man” would regard as very fair, then the NATO alliance might crack.
To be sure, when Russia withdrew Cuba’s only deterrent against an ongoing U.S. attack — with a severe threat to proceed to direct invasion still in the air — and quietly departed from the scene, the Cubans would be infuriated (as, in fact, they understandably were). But that is an unfair comparison for the standard reasons: we are human beings who matter, while they are merely “unpeople,” to adapt George Orwell’s useful phrase.
Kennedy also made an informal pledge not to invade Cuba, but with conditions: not just the withdrawal of the missiles, but also termination, or at least “a great lessening,” of any Russian military presence. (Unlike Turkey, on Russia’s borders, where nothing of the kind could be contemplated.) When Cuba is no longer an “armed camp,” then “we probably wouldn’t invade,” in the president’s words. He added that, if it hoped to be free from the threat of U.S. invasion, Cuba must end its “political subversion” (Stern’s phrase) in Latin America. “Political subversion” had been a constant theme for years, invoked for example when Eisenhower overthrew the parliamentary government of Guatemala and plunged that tortured country into an abyss from which it has yet to emerge. And these themes remained alive and well right through Ronald Reagan’s vicious terror wars in Central America in the 1980s. Cuba’s “political subversion” consisted of support for those resisting the murderous assaults of the U.S. and its client regimes, and sometimes even perhaps — horror of horrors — providing arms to the victims.
The usage is standard. Thus, in 1955, the Joint Chiefs of Staff had outlined “three basic forms of aggression.” The first was armed attack across a border, that is, aggression as defined in international law. The second was “overt armed attack from within the area of each of the sovereign states,” as when guerrilla forces undertake armed resistance against a regime backed or imposed by Washington, though not of course when “freedom fighters” resist an official enemy. The third: “Aggression other than armed, i.e., political warfare, or subversion.” The primary example at the time was South Vietnam, where the United States was defending a free people from “internal aggression,” as Kennedy’s U.N. Ambassador Adlai Stevenson explained — from “an assault from within” in the president’s words.
Though these assumptions are so deeply embedded in prevailing doctrine as to be virtually invisible, they are occasionally articulated in the internal record. In the case of Cuba, the State Department Policy Planning Council explained that “the primary danger we face in Castro is… in the impact the very existence of his regime has upon the leftist movement in many Latin American countries… The simple fact is that Castro represents a successful defiance of the US, a negation of our whole hemispheric policy of almost a century and a half,” since the Monroe Doctrine announced Washington’s intention, then unrealizable, to dominate the Western hemisphere.
Not the Russians of that moment then, but rather the right to dominate, a leading principle of foreign policy found almost everywhere, though typically concealed in defensive terms: during the Cold War years, routinely by invoking the “Russian threat,” even when Russians were nowhere in sight. An example of great contemporary import is revealed in Iran scholar Ervand Abrahamian’s important upcoming book on the U.S.-U.K. coup that overthrew the parliamentary regime of Iran in 1953. With scrupulous examination of internal records, he shows convincingly that standard accounts cannot be sustained. The primary causes were not Cold War concerns, nor Iranian irrationality that undermined Washington’s “benign intentions,” nor even access to oil or profits, but rather the way the U.S. demand for “overall controls” — with its broader implications for global dominance — was threatened by independent nationalism.
That is what we discover over and over by investigating particular cases, including Cuba (not surprisingly) though the fanaticism in that particular case might merit examination. U.S. policy towards Cuba is harshly condemned throughout Latin America and indeed most of the world, but “a decent respect for the opinions of mankind” is understood to be meaningless rhetoric intoned mindlessly on July 4th. Ever since polls have been taken on the matter, a considerable majority of the U.S. population has favored normalization of relations with Cuba, but that too is insignificant.
Dismissal of public opinion is of course quite normal. What is interesting in this case is dismissal of powerful sectors of U.S. economic power, which also favor normalization, and are usually highly influential in setting policy: energy, agribusiness, pharmaceuticals, and others. That suggests that, in addition to the cultural factors revealed in the hysteria of the Camelot intellectuals, there is a powerful state interest involved in punishing Cubans.
Saving the World from the Threat of Nuclear Destruction
The missile crisis officially ended on October 28th. The outcome was not obscure. That evening, in a special CBS News broadcast, Charles Collingwood reported that the world had come out “from under the most terrible threat of nuclear holocaust since World War II” with a “humiliating defeat for Soviet policy.” Dobbs comments that the Russians tried to pretend that the outcome was “yet another triumph for Moscow’s peace-loving foreign policy over warmongering imperialists,” and that “[t]he supremely wise, always reasonable Soviet leadership had saved the world from the threat of nuclear destruction.”
Extricating the basic facts from the fashionable ridicule, Khrushchev’s agreement to capitulate had indeed “saved the world from the threat of nuclear destruction.”
The crisis, however, was not over. On November 8th, the Pentagon announced that all known Soviet missile bases had been dismantled. On the same day, Stern reports, “a sabotage team carried out an attack on a Cuban factory,” though Kennedy’s terror campaign, Operation Mongoose, had been formally curtailed at the peak of the crisis. The November 8th terror attack lends support to Bundy’s observation that the threat to peace was Cuba, not Turkey, where the Russians were not continuing a lethal assault — though that was certainly not what Bundy had in mind or could have understood.
More details are added by the highly respected scholar Raymond Garthoff, who also had rich experience within the government, in his careful 1987 account of the missile crisis. On November 8th, he writes, “a Cuban covert action sabotage team dispatched from the United States successfully blew up a Cuban industrial facility,” killing 400 workers according to a Cuban government letter to the U.N. Secretary General.
Garthoff comments: “The Soviets could only see [the attack] as an effort to backpedal on what was, for them, the key question remaining: American assurances not to attack Cuba,” particularly since the terrorist attack was launched from the U.S. These and other “third party actions” reveal again, he concludes, “that the risk and danger to both sides could have been extreme, and catastrophe not excluded.” Garthoff also reviews the murderous and destructive operations of Kennedy’s terrorist campaign, which we would certainly regard as more than ample justification for war, if the U.S. or its allies or clients were victims, not perpetrators.
From the same source we learn further that, on August 23, 1962, the president had issued National Security Memorandum No. 181, “a directive to engineer an internal revolt that would be followed by U.S. military intervention,” involving “significant U.S. military plans, maneuvers, and movement of forces and equipment” that were surely known to Cuba and Russia. Also in August, terrorist attacks were intensified, including speedboat strafing attacks on a Cuban seaside hotel “where Soviet military technicians were known to congregate, killing a score of Russians and Cubans”; attacks on British and Cuban cargo ships; the contamination of sugar shipments; and other atrocities and sabotage, mostly carried out by Cuban exile organizations permitted to operate freely in Florida. Shortly after came “the most dangerous moment in human history,” not exactly out of the blue.
Kennedy officially renewed the terrorist operations after the crisis ebbed. Ten days before his assassination he approved a CIA plan for “destruction operations” by U.S. proxy forces “against a large oil refinery and storage facilities, a large electric plant, sugar refineries, railroad bridges, harbor facilities, and underwater demolition of docks and ships.” A plot to assassinate Castro was apparently initiated on the day of the Kennedy assassination. The terrorist campaign was called off in 1965, but, Garthoff reports, “one of Nixon’s first acts in office in 1969 was to direct the CIA to intensify covert operations against Cuba.”
We can, at last, hear the voices of the victims in Canadian historian Keith Bolender’s Voices From the Other Side, the first oral history of the terror campaign — one of many books unlikely to receive more than casual notice, if that, in the West because the contents are too revealing.
In the current issue of Political Science Quarterly, the professional journal of the association of American political scientists, Montague Kern observes that the Cuban missile crisis is one of those “full-bore crises… in which an ideological enemy (the Soviet Union) is universally perceived to have gone on the attack, leading to a rally-’round-the-flag effect that greatly expands support for a president, increasing his policy options.”
Kern is right that it is “universally perceived” that way, apart from those who have escaped sufficiently from the ideological shackles to pay some attention to the facts. Kern is, in fact, one of them. Another is Sheldon Stern, who recognizes what has long been known to such deviants. As he writes, we now know that “Khrushchev’s original explanation for shipping missiles to Cuba had been fundamentally true: the Soviet leader had never intended these weapons as a threat to the security of the United States, but rather considered their deployment a defensive move to protect his Cuban allies from American attacks and as a desperate effort to give the U.S.S.R. the appearance of equality in the nuclear balance of power.” Dobbs, too, recognizes that “Castro and his Soviet patrons had real reasons to fear American attempts at regime change, including, as a last resort, a U.S. invasion of Cuba… [Khrushchev] was also sincere in his desire to defend the Cuban revolution from the mighty neighbor to the north.”
“Terrors of the Earth”
The American attacks are often dismissed in U.S. commentary as silly pranks, CIA shenanigans that got out of hand. That is far from the truth. The best and the brightest had reacted to the failure of the Bay of Pigs invasion with near hysteria, including the president, who solemnly informed the country: “The complacent, the self-indulgent, the soft societies are about to be swept away with the debris of history. Only the strong… can possibly survive.” And they could only survive, he evidently believed, by massive terror — though that addendum was kept secret, and is still not known to loyalists who perceive the ideological enemy as having “gone on the attack” (the near universal perception, as Kern observes). After the Bay of Pigs defeat, historian Piero Gleijeses writes, JFK launched a crushing embargo to punish the Cubans for defeating a U.S.-run invasion, and “asked his brother, Attorney General Robert Kennedy, to lead the top-level interagency group that oversaw Operation Mongoose, a program of paramilitary operations, economic warfare, and sabotage he launched in late 1961 to visit the ‘terrors of the earth’ on Fidel Castro and, more prosaically, to topple him.”
The phrase “terrors of the earth” is Arthur Schlesinger’s, in his quasi-official biography of Robert Kennedy, who was assigned responsibility for conducting the terrorist war, and informed the CIA that the Cuban problem carries “[t]he top priority in the United States Government — all else is secondary — no time, no effort, or manpower is to be spared” in the effort to overthrow the Castro regime. The Mongoose operations were run by Edward Lansdale, who had ample experience in “counterinsurgency” — a standard term for terrorism that we direct. He provided a timetable leading to “open revolt and overthrow of the Communist regime” in October 1962. The “final definition” of the program recognized that “final success will require decisive U.S. military intervention,” after terrorism and subversion had laid the basis. The implication is that U.S. military intervention would take place in October 1962 — when the missile crisis erupted. The events just reviewed help explain why Cuba and Russia had good reason to take such threats seriously.
Years later, Robert McNamara recognized that Cuba was justified in fearing an attack. “If I were in Cuban or Soviet shoes, I would have thought so, too,” he observed at a major conference on the missile crisis on the 40th anniversary.
As for Russia’s “desperate effort to give the U.S.S.R. the appearance of equality,” to which Stern refers, recall that Kennedy’s very narrow victory in the 1960 election relied heavily on a fabricated “missile gap” concocted to terrify the country and to condemn the Eisenhower administration as soft on national security. There was indeed a “missile gap,” but strongly in favor of the U.S.
The first “public, unequivocal administration statement” on the true facts, according to strategic analyst Desmond Ball in his authoritative study of the Kennedy missile program, was in October 1961, when Deputy Secretary of Defense Roswell Gilpatric informed the Business Council that “the U.S. would have a larger nuclear delivery system left after a surprise attack than the nuclear force which the Soviet Union could employ in its first strike.” The Russians of course were well aware of their relative weakness and vulnerability. They were also aware of Kennedy’s reaction when Khrushchev offered to sharply reduce offensive military capacity and proceeded to do so unilaterally. The president failed to respond, undertaking instead a huge armaments program.
Owning the World, Then and Now
The two most crucial questions about the missile crisis are: How did it begin, and how did it end? It began with Kennedy’s terrorist attack against Cuba, with a threat of invasion in October 1962. It ended with the president’s rejection of Russian offers that would seem fair to a rational person, but were unthinkable because they would have undermined the fundamental principle that the U.S. has the unilateral right to deploy nuclear missiles anywhere, aimed at China or Russia or anyone else, and right on their borders; and the accompanying principle that Cuba had no right to have missiles for defense against what appeared to be an imminent U.S. invasion. To establish these principles firmly it was entirely proper to face a high risk of war of unimaginable destruction, and to reject simple and admittedly fair ways to end the threat.
Garthoff observes that “in the United States, there was almost universal approbation for President Kennedy’s handling of the crisis.” Dobbs writes, “The relentlessly upbeat tone was established by the court historian, Arthur M. Schlesinger, Jr., who wrote that Kennedy had ‘dazzled the world’ through a ‘combination of toughness and restraint, of will, nerve and wisdom, so brilliantly controlled, so matchlessly calibrated.’” Rather more soberly, Stern partially agrees, noting that Kennedy repeatedly rejected the militant advice of his advisers and associates who called for military force and the dismissal of peaceful options. The events of October 1962 are widely hailed as Kennedy’s finest hour. Graham Allison joins many others in presenting them as “a guide for how to defuse conflicts, manage great-power relationships, and make sound decisions about foreign policy in general.”
In a very narrow sense, that judgment seems reasonable. The ExComm tapes reveal that the president stood apart from others, sometimes almost all others, in rejecting premature violence. There is, however, a further question: How should JFK’s relative moderation in the management of the crisis be evaluated against the background of the broader considerations just reviewed? But that question does not arise in a disciplined intellectual and moral culture, which accepts without question the basic principle that the U.S. effectively owns the world by right, and is by definition a force for good despite occasional errors and misunderstandings, one in which it is plainly entirely proper for the U.S. to deploy massive offensive force all over the world while it is an outrage for others (allies and clients apart) to make even the slightest gesture in that direction or even to think of deterring the threatened use of violence by the benign global hegemon.
That doctrine is the primary official charge against Iran today: it might pose a deterrent to U.S. and Israeli force. It was a consideration during the missile crisis as well. In internal discussion, the Kennedy brothers expressed their fears that Cuban missiles might deter a U.S. invasion of Venezuela, then under consideration. So “the Bay of Pigs was really right,” JFK concluded.
These principles still contribute to the constant risk of nuclear war. There has been no shortage of severe dangers since the missile crisis. Ten years later, during the 1973 Israel-Arab war, National Security Advisor Henry Kissinger called a high-level nuclear alert (DEFCON 3) to warn the Russians to keep their hands off while he was secretly authorizing Israel to violate the cease-fire imposed by the U.S. and Russia. When Reagan came into office a few years later, the U.S. launched operations probing Russian defenses and simulating air and naval attacks, while placing Pershing missiles in Germany with a five-minute flight time to Russian targets, providing what the CIA called a “super-sudden first strike” capability. Naturally this caused great alarm in Russia, which unlike the U.S. has repeatedly been invaded and virtually destroyed. That led to a major war scare in 1983. There have been hundreds of cases when human intervention aborted a first strike minutes before launch, after automated systems gave false alarms. We don’t have Russian records, but there’s no doubt that their systems are far more accident-prone.
Meanwhile, India and Pakistan have come close to nuclear war several times, and the sources of the conflict remain. Both have refused to sign the Non-Proliferation Treaty, along with Israel, and have received U.S. support for development of their nuclear weapons programs — until today in the case of India, now a U.S. ally. War threats in the Middle East, which might become reality very soon, once again escalate the dangers.
In 1962, war was avoided by Khrushchev’s willingness to accept Kennedy’s hegemonic demands. But we can hardly count on such sanity forever. It’s a near miracle that nuclear war has so far been avoided. There is more reason than ever to attend to the warning of Bertrand Russell and Albert Einstein, almost 60 years ago, that we must face a choice that is “stark and dreadful and inescapable: Shall we put an end to the human race; or shall mankind renounce war?”
Noam Chomsky is Institute Professor Emeritus in the MIT Department of Linguistics and Philosophy. A TomDispatch regular, he is the author of numerous best-selling political works, most recently, Hopes and Prospects, Making the Future, and Occupy.
Copyright 2012 Noam Chomsky
The Week the World Stood Still
Down the road only a few generations, the millennium of Magna Carta, one of the great events in the establishment of civil and human rights, will arrive. Whether it will be celebrated, mourned, or ignored is not at all clear.
That should be a matter of serious immediate concern. What we do right now, or fail to do, will determine what kind of world will greet that event. It is not an attractive prospect if present tendencies persist — not least, because the Great Charter is being shredded before our eyes.
The first scholarly edition of Magna Carta was published by the eminent jurist William Blackstone. It was not an easy task. There was no good text available. As he wrote, “the body of the charter has been unfortunately gnawn by rats” — a comment that carries grim symbolism today, as we take up the task the rats left unfinished.
Blackstone’s edition actually included two charters, and was accordingly entitled The Great Charter and the Charter of the Forest. The first, the Charter of Liberties, is widely recognized to be the foundation of the fundamental rights of the English-speaking peoples — or as Winston Churchill put it more expansively, “the charter of every self-respecting man at any time in any land.” Churchill was referring specifically to the reaffirmation of the Charter by Parliament in the Petition of Right, imploring King Charles to recognize that the law is sovereign, not the King. Charles agreed briefly, but soon violated his pledge, setting the stage for the murderous Civil War.
After a bitter conflict between King and Parliament, the power of royalty in the person of Charles II was restored. In defeat, Magna Carta was not forgotten. One of the leaders of Parliament, Henry Vane, was beheaded. On the scaffold, he tried to read a speech denouncing the sentence as a violation of Magna Carta, but was drowned out by trumpets to ensure that such scandalous words would not be heard by the cheering crowds. His major crime had been to draft a petition calling the people “the original of all just power” in civil society — not the King, not even God. That was the position that had been strongly advocated by Roger Williams, the founder of the first free society in what is now the state of Rhode Island. His heretical views influenced Milton and Locke, though Williams went much farther, founding the modern doctrine of separation of church and state, still much contested even in the liberal democracies.
As often is the case, apparent defeat nevertheless carried the struggle for freedom and rights forward. Shortly after Vane’s execution, King Charles granted a Royal Charter to the Rhode Island plantations, declaring that “the form of government is Democratical,” and furthermore that the government could affirm freedom of conscience for Papists, atheists, Jews, Turks — even Quakers, one of the most feared and brutalized of the many sects that were appearing in those turbulent days. All of this was astonishing in the climate of the times.
A few years later, the Charter of Liberties was enriched by the Habeas Corpus Act of 1679, formally entitled “an Act for the better securing the liberty of the subject, and for prevention of imprisonment beyond the seas.” The U.S. Constitution, borrowing from English common law, affirms that “the writ of habeas corpus shall not be suspended” except in case of rebellion or invasion. In a unanimous decision, the U.S. Supreme Court held that the rights guaranteed by this Act were “[c]onsidered by the Founders [of the American Republic] as the highest safeguard of liberty.” All of these words should resonate today.
The Second Charter and the Commons
The significance of the companion charter, the Charter of the Forest, is no less profound and perhaps even more pertinent today — as explored in depth by Peter Linebaugh in his richly documented and stimulating history of Magna Carta and its later trajectory. The Charter of the Forest demanded protection of the commons from external power. The commons were the source of sustenance for the general population: their fuel, their food, their construction materials, whatever was essential for life. The forest was no primitive wilderness. It had been carefully developed over generations, maintained in common, its riches available to all, and preserved for future generations — practices found today primarily in traditional societies that are under threat throughout the world.
The Charter of the Forest imposed limits to privatization. The Robin Hood myths capture the essence of its concerns (and it is not too surprising that the popular TV series of the 1950s, “The Adventures of Robin Hood,” was written anonymously by Hollywood screenwriters blacklisted for leftist convictions). By the seventeenth century, however, this Charter had fallen victim to the rise of the commodity economy and capitalist practice and morality.
With the commons no longer protected for cooperative nurturing and use, the rights of the common people were restricted to what could not be privatized, a category that continues to shrink to virtual invisibility. In Bolivia, the attempt to privatize water was, in the end, beaten back by an uprising that brought the indigenous majority to power for the first time in history. The World Bank has just ruled that the mining multinational Pacific Rim can proceed with a case against El Salvador for trying to preserve lands and communities from highly destructive gold mining. Environmental constraints threaten to deprive the company of future profits, a crime that can be punished under the rules of the investor-rights regime mislabeled as “free trade.” And this is only a tiny sample of struggles underway over much of the world, some involving extreme violence, as in the Eastern Congo, where millions have been killed in recent years to ensure an ample supply of minerals for cell phones and other uses, and of course ample profits.
The rise of capitalist practice and morality brought with it a radical revision of how the commons are treated, and also of how they are conceived. The prevailing view today is captured by Garrett Hardin’s influential argument that “freedom in a commons brings ruin to us all,” the famous “tragedy of the commons”: what is not owned will be destroyed by individual avarice.
An international counterpart was the concept of terra nullius, employed to justify the expulsion of indigenous populations in the settler-colonial societies of the Anglosphere, or their “extermination,” as the founding fathers of the American Republic described what they were doing, sometimes with remorse, after the fact. According to this useful doctrine, the Indians had no property rights since they were just wanderers in an untamed wilderness. And the hard-working colonists could create value where there was none by turning that same wilderness to commercial use.
In reality, the colonists knew better, and there were elaborate procedures of purchase and ratification by crown and parliament, later annulled by force when the evil creatures resisted extermination. The doctrine is often attributed to John Locke, but that is dubious. As a colonial administrator, he understood what was happening, and there is no basis for the attribution in his writings, as contemporary scholarship has shown convincingly, notably the work of the Australian scholar Paul Corcoran. (It was in Australia, in fact, that the doctrine was employed most brutally.)
The grim forecasts of the tragedy of the commons are not without challenge. The late Elinor Ostrom won the Nobel Prize in economics in 2009 for her work showing the superiority of user-managed fish stocks, pastures, woods, lakes, and groundwater basins. But the conventional doctrine has force if we accept its unstated premise: that humans are blindly driven by what American workers, at the dawn of the industrial revolution, bitterly called “the New Spirit of the Age, Gain Wealth forgetting all but Self.”
Like peasants and workers in England before them, American workers denounced this New Spirit, which was being imposed upon them, regarding it as demeaning and destructive, an assault on the very nature of free men and women. And I stress women; among those most active and vocal in condemning the destruction of the rights and dignity of free people by the capitalist industrial system were the “factory girls,” young women from the farms. They, too, were driven into the regime of supervised and controlled wage labor, which was regarded at the time as different from chattel slavery only in that it was temporary. That stand was considered so natural that it became a slogan of the Republican Party, and a banner under which northern workers carried arms during the American Civil War.
Controlling the Desire for Democracy
That was 150 years ago — in England earlier. Huge efforts have been devoted since to inculcating the New Spirit of the Age. Major industries are devoted to the task: public relations, advertising, marketing generally, all of which add up to a very large component of the Gross Domestic Product. They are dedicated to what the great political economist Thorstein Veblen called “fabricating wants.” In the words of business leaders themselves, the task is to direct people to “the superficial things” of life, like “fashionable consumption.” That way people can be atomized, separated from one another, seeking personal gain alone, diverted from dangerous efforts to think for themselves and challenge authority.
The process of shaping opinion, attitudes, and perceptions was termed the “engineering of consent” by one of the founders of the modern public relations industry, Edward Bernays. He was a respected Wilson-Roosevelt-Kennedy progressive, much like his contemporary, journalist Walter Lippmann, the most prominent public intellectual of twentieth century America, who praised “the manufacture of consent” as a “new art” in the practice of democracy.
Both recognized that the public must be “put in its place,” marginalized and controlled — for their own interests of course. They were too “stupid and ignorant” to be allowed to run their own affairs. That task was to be left to the “intelligent minority,” who must be protected from “the trampling and the roar of [the] bewildered herd,” the “ignorant and meddlesome outsiders” — the “rascal multitude” as they were termed by their seventeenth century predecessors. The role of the general population was to be “spectators,” not “participants in action,” in a properly functioning democratic society.
And the spectators must not be allowed to see too much. President Obama has set new standards in safeguarding this principle. He has, in fact, punished more whistleblowers than all previous presidents combined, a real achievement for an administration that came to office promising transparency. WikiLeaks is only the most famous case, with British cooperation.
Among the many topics that are not the business of the bewildered herd is foreign affairs. Anyone who has studied declassified secret documents will have discovered that, to a large extent, their classification was meant to protect public officials from public scrutiny. Domestically, the rabble should not hear the advice given by the courts to major corporations: that they should devote some highly visible efforts to good works, so that an “aroused public” will not discover the enormous benefits provided to them by the nanny state. More generally the U.S. public should not learn that “state policies are overwhelmingly regressive, thus reinforcing and expanding social inequality,” though designed in ways that lead “people to think that the government helps only the undeserving poor, allowing politicians to mobilize and exploit anti-government rhetoric and values even as they continue to funnel support to their better-off constituents” — I’m quoting from the main establishment journal, Foreign Affairs, not from some radical rag.
Over time, as societies became freer and the resort to state violence more constrained, the urge to devise sophisticated methods of control of attitudes and opinion has only grown. It is natural that the immense PR industry should have been created in the most free of societies, the United States and Great Britain. The first modern propaganda agency was the British Ministry of Information a century ago, which secretly defined its task as “to direct the thought of most of the world” — primarily progressive American intellectuals, who had to be mobilized to come to the aid of Britain during World War I.
Its U.S. counterpart, the Committee on Public Information, was formed by Woodrow Wilson to drive a pacifist population to violent hatred of all things German — with remarkable success. American commercial advertising deeply impressed others. Goebbels admired it and adapted it to Nazi propaganda, all too successfully. The Bolshevik leaders tried as well, but their efforts were clumsy and ineffective.
A primary domestic task has always been “to keep [the public] from our throats,” as essayist Ralph Waldo Emerson described the concerns of political leaders when the threat of democracy was becoming harder to suppress in the mid-nineteenth century. More recently, the activism of the 1960s elicited elite concerns about “excessive democracy,” and calls for measures to impose “more moderation” in democracy.
One particular concern was to introduce better controls over the institutions “responsible for the indoctrination of the young”: the schools, the universities, the churches, which were seen as failing that essential task. I’m quoting reactions from the left-liberal end of the mainstream spectrum, the liberal internationalists who later staffed the Carter administration, and their counterparts in other industrial societies. The right wing was much harsher. One of many manifestations of this urge has been the sharp rise in college tuition, not on economic grounds, as is easily shown. The device does, however, trap and control young people by debt, often for the rest of their lives, thus contributing to more effective indoctrination.
The Three-Fifths People
Pursuing these important topics further, we see that the destruction of the Charter of the Forest, and its obliteration from memory, relates rather closely to the continuing efforts to constrain the promise of the Charter of Liberties. The “New Spirit of the Age” cannot tolerate the pre-capitalist conception of the Forest as the shared endowment of the community at large, cared for communally for its own use and for future generations, protected from privatization, from transfer to the hands of private power for service to wealth, not needs. Inculcating the New Spirit is an essential prerequisite for achieving this end, and for preventing the Charter of Liberties from being misused to enable free citizens to determine their own fate.
Popular struggles to bring about a freer and more just society have been resisted by violence and repression, and massive efforts to control opinion and attitudes. Over time, however, they have met with considerable success, even though there is a long way to go and there is often regression. Right now, in fact.
The most famous part of the Charter of Liberties is Article 39, which declares that “no free man” shall be punished in any way, “nor will We proceed against or prosecute him, except by the lawful judgment of his peers and by the law of the land.”
Through many years of struggle, the principle has come to hold more broadly. The U.S. Constitution provides that no “person [shall] be deprived of life, liberty, or property, without due process of law [and] a speedy and public trial” by peers. The basic principle is “presumption of innocence” — what legal historians describe as “the seed of contemporary Anglo-American freedom,” referring to Article 39; and with the Nuremberg Tribunal in mind, a “particularly American brand of legalism: punishment only for those who could be proved to be guilty through a fair trial with a panoply of procedural protections” — even if their guilt for some of the worst crimes in history is not in doubt.
The founders of course did not intend the term “person” to apply to all persons. Native Americans were not persons. Their rights were virtually nil. Women were scarcely persons. Wives were understood to be “covered” under the civil identity of their husbands in much the same way as children were subject to their parents. Blackstone’s principles held that “the very being or legal existence of the woman is suspended during the marriage, or at least is incorporated and consolidated into that of the husband: under whose wing, protection, and cover, she performs every thing.” Women were thus the property of their fathers or husbands. These principles remained in force until very recent years. Until a Supreme Court decision of 1975, women did not even have a legal right to serve on juries. They were not peers. Just two weeks ago, Republican opposition blocked the Paycheck Fairness Act guaranteeing women equal pay for equal work. And it goes far beyond that.
Slaves, of course, were not persons. They were in fact three-fifths human under the Constitution, so as to grant their owners greater voting power. Protection of slavery was no slight concern to the founders: it was one factor leading to the American revolution. In the 1772 Somerset case, Lord Mansfield determined that slavery is so “odious” that it cannot be tolerated in England, though it continued in British possessions for many years. American slave-owners could see the handwriting on the wall if the colonies remained under British rule. And it should be recalled that the slave states, including Virginia, had the greatest power and influence in the colonies. One can easily appreciate Dr. Johnson’s famous quip that “we hear the loudest yelps for liberty among the drivers of negroes.”
Post-Civil War amendments extended the concept person to African-Americans, ending slavery. In theory, at least. After about a decade of relative freedom, a condition akin to slavery was reintroduced by a North-South compact permitting the effective criminalization of black life. A black male standing on a street corner could be arrested for vagrancy, or for attempted rape if accused of looking at a white woman the wrong way. And once imprisoned he had few chances of ever escaping the system of “slavery by another name,” the term used by then-Wall Street Journal bureau chief Douglas Blackmon in an arresting study.
This new version of the “peculiar institution” provided much of the basis for the American industrial revolution, with a perfect work force for the steel industry and mining, along with agricultural production in the famous chain gangs: docile, obedient, no strikes, and no need for employers even to sustain their workers, an improvement over slavery. The system lasted in large measure until World War II, when free labor was needed for war production.
The postwar boom offered employment. A black man could get a job in a unionized auto plant, earn a decent salary, buy a house, and maybe send his children to college. That lasted for about 20 years, until the 1970s, when the economy was radically redesigned on newly dominant neoliberal principles, with rapid growth of financialization and the offshoring of production. The black population, now largely superfluous, has been recriminalized.
Until Ronald Reagan’s presidency, incarceration in the U.S. was within the spectrum of industrial societies. By now it is far beyond others. It targets primarily black males, increasingly also black women and Hispanics, largely guilty of victimless crimes under the fraudulent “drug wars.” Meanwhile, the wealth of African-American families has been virtually obliterated by the latest financial crisis, in no small measure thanks to criminal behavior of financial institutions, with impunity for the perpetrators, now richer than ever.
Looking over the history of African-Americans from the first arrival of slaves almost 500 years ago to the present, they have enjoyed the status of authentic persons for only a few decades. There is a long way to go to realize the promise of Magna Carta.
Sacred Persons and Undone Process
The post-Civil War Fourteenth Amendment granted the rights of persons to former slaves, though mostly in theory. At the same time, it created a new category of persons with rights: corporations. In fact, almost all the cases brought to the courts under the Fourteenth Amendment had to do with corporate rights, and by a century ago, the courts had determined that these collectivist legal fictions, established and sustained by state power, had the full rights of persons of flesh and blood; in fact, far greater rights, thanks to their scale, immortality, and protections of limited liability. Their rights by now far transcend those of mere humans. Under the “free trade agreements,” Pacific Rim can, for example, sue El Salvador for seeking to protect the environment; individuals cannot do the same. General Motors can claim national rights in Mexico. There is no need to dwell on what would happen if a Mexican demanded national rights in the United States.
Domestically, recent Supreme Court rulings greatly enhance the already enormous political power of corporations and the super-rich, striking further blows against the tottering relics of functioning political democracy.
Meanwhile Magna Carta is under more direct assault. Recall the Habeas Corpus Act of 1679, which barred “imprisonment beyond the seas,” and certainly the far more vicious procedure of imprisonment abroad for the purpose of torture — what is now more politely called “rendition,” as when Tony Blair rendered Libyan dissident Abdel Hakim Belhaj, now a leader of the rebellion, to the mercies of Qaddafi; or when U.S. authorities deported Canadian citizen Maher Arar to his native Syria, for imprisonment and torture, only later conceding that there was never any case against him. And many others, often through Shannon Airport, leading to courageous protests in Ireland.
The concept of due process has been extended under the Obama administration’s international assassination campaign in a way that renders this core element of the Charter of Liberties (and the Constitution) null and void. The Justice Department explained that the constitutional guarantee of due process, tracing to Magna Carta, is now satisfied by internal deliberations in the executive branch alone. The constitutional lawyer in the White House agreed. King John might have nodded with satisfaction.
The issue arose after the presidentially ordered assassination-by-drone of Anwar al-Awlaki, accused of inciting jihad in speech, writing, and unspecified actions. A headline in the New York Times captured the general elite reaction when he was murdered in a drone attack, along with the usual collateral damage. It read: “The West celebrates a cleric’s death.” Some eyebrows were lifted, however, because he was an American citizen, which raised questions about due process — considered irrelevant when non-citizens are murdered at the whim of the chief executive. And irrelevant for citizens, too, under Obama administration due-process legal innovations.
Presumption of innocence has also been given a new and useful interpretation. As the New York Times reported, “Mr. Obama embraced a disputed method for counting civilian casualties that did little to box him in. It in effect counts all military-age males in a strike zone as combatants, according to several administration officials, unless there is explicit intelligence posthumously proving them innocent.” So post-assassination determination of innocence maintains the sacred principle of presumption of innocence.
It would be ungracious to recall the Geneva Conventions, the foundation of modern humanitarian law: they bar “the carrying out of executions without previous judgment pronounced by a regularly constituted court, affording all the judicial guarantees which are recognized as indispensable by civilized peoples.”
The most famous recent case of executive assassination was Osama bin Laden, murdered after being apprehended by 79 Navy SEALs while defenseless and accompanied only by his wife, his body reportedly dumped at sea without autopsy. Whatever one thinks of him, he was a suspect and nothing more than that. Even the FBI agreed.
Celebration in this case was overwhelming, but there were a few questions raised about the bland rejection of the principle of presumption of innocence, particularly when trial was hardly impossible. These were met with harsh condemnations. The most interesting was by a respected left-liberal political commentator, Matthew Yglesias, who explained that “one of the main functions of the international institutional order is precisely to legitimate the use of deadly military force by western powers,” so it is “amazingly naïve” to suggest that the U.S. should obey international law or other conditions that we righteously demand of the weak.
Only tactical objections can be raised to aggression, assassination, cyberwar, or other actions that the Holy State undertakes in the service of mankind. If the traditional victims see matters somewhat differently, that merely reveals their moral and intellectual backwardness. And the occasional Western critic who fails to comprehend these fundamental truths can be dismissed as “silly,” Yglesias explains — incidentally, referring specifically to me, and I cheerfully confess my guilt.
Executive Terrorist Lists
Perhaps the most striking assault on the foundations of traditional liberties is a little-known case brought to the Supreme Court by the Obama administration, Holder v. Humanitarian Law Project. The Project was condemned for providing “material assistance” to the guerrilla organization PKK, which has fought for Kurdish rights in Turkey for many years and is listed as a terrorist group by the state executive. The “material assistance” was legal advice. The wording of the ruling would appear to apply quite broadly, for example, to discussions and research inquiry, even advice to the PKK to keep to nonviolent means. Again, there was a marginal fringe of criticism, but even those critics accepted the legitimacy of the state terrorist list — arbitrary decisions by the executive, with no recourse.
The record of the terrorist list is of some interest. For example, in 1988 the Reagan administration declared Nelson Mandela’s African National Congress to be one of the world’s “more notorious terrorist groups,” so that Reagan could continue his support for the Apartheid regime and its murderous depredations in South Africa and in neighboring countries, as part of his “war on terror.” Twenty years later Mandela was finally removed from the terrorist list, and can now travel to the U.S. without a special waiver.
Another interesting case is Saddam Hussein, removed from the terrorist list in 1982 so that the Reagan administration could provide him with support for his invasion of Iran. The support continued well after the war ended. In 1989, President Bush I even invited Iraqi nuclear engineers to the U.S. for advanced training in weapons production — more information that must be kept from the eyes of the “ignorant and meddlesome outsiders.”
One of the ugliest examples of the use of the terrorist list has to do with the tortured people of Somalia. Immediately after September 11th, the United States closed down the Somali charitable network Al-Barakaat on grounds that it was financing terror. This achievement was hailed as one of the great successes of the “war on terror.” By contrast, when Washington withdrew the charges as without merit a year later, the reversal aroused little notice.
Al-Barakaat was responsible for about half the $500 million in remittances to Somalia, “more than it earns from any other economic sector and 10 times the amount of foreign aid [Somalia] receives,” a U.N. review determined. The charity also ran major businesses in Somalia, all destroyed. The leading academic scholar of Bush’s “financial war on terror,” Ibrahim Warde, concludes that apart from devastating the economy, this frivolous attack on a very fragile society “may have played a role in the rise… of Islamic fundamentalists,” another familiar consequence of the “war on terror.”
The very idea that the state should have the authority to make such judgments is a serious offense against the Charter of Liberties, as is the fact that it is considered uncontentious. If the Charter’s fall from grace continues on the path of the past few years, the future of rights and liberties looks dim.
Who Will Have the Last Laugh?
A few final words on the fate of the Charter of the Forest. Its goal was to protect the source of sustenance for the population, the commons, from external power — in the early days, royalty; over the years, enclosures and other forms of privatization by predatory corporations and the state authorities who cooperate with them have only accelerated and are properly rewarded. The damage is very broad.
If we listen to voices from the South today we can learn that “the conversion of public goods into private property through the privatization of our otherwise commonly held natural environment is one way neoliberal institutions remove the fragile threads that hold African nations together. Politics today has been reduced to a lucrative venture where one looks out mainly for returns on investment rather than on what one can contribute to rebuild highly degraded environments, communities, and a nation. This is one of the benefits that structural adjustment programmes inflicted on the continent — the enthronement of corruption.” I’m quoting Nigerian poet and activist Nnimmo Bassey, chair of Friends of the Earth International, in his searing exposé of the ravaging of Africa’s wealth, To Cook a Continent, the latest phase of the Western torture of Africa.
Torture that has always been planned at the highest level, it should be recognized. At the end of World War II, the U.S. held a position of unprecedented global power. Not surprisingly, careful and sophisticated plans were developed about how to organize the world. Each region was assigned its “function” by State Department planners, headed by the distinguished diplomat George Kennan. He determined that the U.S. had no special interest in Africa, so it should be handed over to Europe to “exploit” — his word — for its reconstruction. In the light of history, one might have imagined a different relation between Europe and Africa, but there is no indication that that was ever considered.
More recently, the U.S. has recognized that it, too, must join the game of exploiting Africa, along with new entries like China, which is busily at work compiling one of the worst records in destruction of the environment and oppression of the hapless victims.
It should be unnecessary to dwell on the extreme dangers posed by one central element of the predatory obsessions that are producing calamities all over the world: the reliance on fossil fuels, which courts global disaster, perhaps in the not-too-distant future. Details may be debated, but there is little serious doubt that the problems are serious, if not awesome, and that the longer we delay in addressing them, the more awful will be the legacy left to generations to come. There are some efforts to face reality, but they are far too minimal. The recent Rio+20 Conference opened with meager aspirations and derisory outcomes.
Meanwhile, power concentrations are charging in the opposite direction, led by the richest and most powerful country in world history. Congressional Republicans are dismantling the limited environmental protections initiated by Richard Nixon, who would be something of a dangerous radical in today’s political scene. The major business lobbies openly announce their propaganda campaigns to convince the public that there is no need for undue concern — with some effect, as polls show.
The media cooperate by not even reporting the increasingly dire forecasts of international agencies and even the U.S. Department of Energy. The standard presentation is a debate between alarmists and skeptics: on one side virtually all qualified scientists, on the other a few holdouts. Not part of the debate are a very large number of experts, including the climate change program at MIT among others, who criticize the scientific consensus because it is too conservative and cautious, arguing that the truth when it comes to climate change is far more dire. Not surprisingly, the public is confused.
In his State of the Union speech in January, President Obama hailed the bright prospects of a century of energy self-sufficiency, thanks to new technologies that permit extraction of hydrocarbons from Canadian tar sands, shale, and other previously inaccessible sources. Others agree. The Financial Times forecasts a century of energy independence for the U.S. The report does mention the destructive local impact of the new methods. Unasked in these optimistic forecasts is the question of what kind of world will survive the rapacious onslaught.
In the lead in confronting the crisis throughout the world are indigenous communities, those who have always upheld the Charter of the Forests. The strongest stand has been taken by the one country they govern, Bolivia, the poorest country in South America and for centuries a victim of western destruction of the rich resources of one of the most advanced of the developed societies in the hemisphere, pre-Columbus.
After the ignominious collapse of the Copenhagen global climate change summit in 2009, Bolivia organized a People’s Summit with 35,000 participants from 140 countries — not just representatives of governments, but also civil society and activists. It produced a People’s Agreement, which called for very sharp reductions in emissions, and a Universal Declaration on the Rights of Mother Earth. That is a key demand of indigenous communities all over the world. It is ridiculed by sophisticated westerners, but unless we can acquire some of their sensibility, they are likely to have the last laugh — a laugh of grim despair.
Noam Chomsky is Institute Professor Emeritus in the MIT Department of Linguistics and Philosophy. A TomDispatch regular, he is the author of numerous best-selling political works, most recently, Hopes and Prospects, Making the Future, and Occupy. This is the full text of a speech he gave recently at the University of St. Andrews in Scotland. His web site is www.chomsky.info. To catch Timothy MacBain’s latest Tomcast audio interview in which Chomsky discusses the recent shredding of the principles of the Magna Carta, click here or download it to your iPod here.
Copyright 2012 Noam Chomsky
Destroying the Commons
In the years of conscious, self-inflicted decline at home, “losses” continued to mount elsewhere. In the past decade, for the first time in 500 years, South America has taken successful steps to free itself from western domination, another serious loss. The region has moved towards integration, and has begun to address some of the terrible internal problems of societies ruled by mostly Europeanized elites, tiny islands of extreme wealth in a sea of misery. They have also rid themselves of all U.S. military bases and of IMF controls. A newly formed organization, CELAC, includes all countries of the hemisphere apart from the U.S. and Canada. If it actually functions, that would be another step in American decline, in this case in what has always been regarded as “the backyard.”
Even more serious would be the loss of the MENA countries — Middle East/North Africa — which have been regarded by planners since the 1940s as “a stupendous source of strategic power, and one of the greatest material prizes in world history.” Control of MENA energy reserves would yield “substantial control of the world,” in the words of the influential Roosevelt advisor A.A. Berle.
To be sure, if the projections of a century of U.S. energy independence based on North American energy resources turn out to be realistic, the significance of controlling MENA would decline somewhat, though probably not by much: the main concern has always been control more than access. However, the likely consequences to the planet’s equilibrium are so ominous that discussion may be largely an academic exercise.
The Arab Spring, another development of historic importance, might portend at least a partial “loss” of MENA. The US and its allies have tried hard to prevent that outcome — so far, with considerable success. Their policy towards the popular uprisings has kept closely to the standard guidelines: support the forces most amenable to U.S. influence and control.
Favored dictators are supported as long as they can maintain control (as in the major oil states). When that is no longer possible, then discard them and try to restore the old regime as fully as possible (as in Tunisia and Egypt). The general pattern is familiar: Somoza, Marcos, Duvalier, Mobutu, Suharto, and many others. In one case, Libya, the three traditional imperial powers intervened by force to participate in a rebellion to overthrow a mercurial and unreliable dictator, opening the way, it is expected, to more efficient control over Libya’s rich resources (oil primarily, but also water, of particular interest to French corporations), to a possible base for the U.S. Africa Command (so far restricted to Germany), and to the reversal of growing Chinese penetration. As far as policy goes, there have been few surprises.
Crucially, it is important to reduce the threat of functioning democracy, in which popular opinion will significantly influence policy. That again is routine, and quite understandable. A look at the studies of public opinion undertaken by U.S. polling agencies in the MENA countries easily explains the western fear of authentic democracy.
Israel and the Republican Party
Similar considerations carry over directly to the second major concern addressed in the issue of Foreign Affairs cited in part one of this piece: the Israel-Palestine conflict. Fear of democracy could hardly be more clearly exhibited than in this case. In January 2006, an election took place in Palestine, pronounced free and fair by international monitors. The instant reaction of the U.S. (and of course Israel), with Europe following along politely, was to impose harsh penalties on Palestinians for voting the wrong way.
That is no innovation. It is quite in accord with the general and unsurprising principle recognized by mainstream scholarship: the U.S. supports democracy if, and only if, the outcomes accord with its strategic and economic objectives, the rueful conclusion of neo-Reaganite Thomas Carothers, the most careful and respected scholarly analyst of “democracy promotion” initiatives.
More broadly, for 35 years the U.S. has led the rejectionist camp on Israel-Palestine, blocking an international consensus calling for a political settlement in terms too well known to require repetition. The western mantra is that Israel seeks negotiations without preconditions, while the Palestinians refuse. The opposite is more accurate. The U.S. and Israel demand strict preconditions, which are, furthermore, designed to ensure that negotiations will lead either to Palestinian capitulation on crucial issues, or nowhere.
The first precondition is that the negotiations must be supervised by Washington, which makes about as much sense as demanding that Iran supervise the negotiation of Sunni-Shia conflicts in Iraq. Serious negotiations would have to be under the auspices of some neutral party, preferably one that commands some international respect, perhaps Brazil. The negotiations would seek to resolve the conflicts between the two antagonists: the U.S.-Israel on one side, most of the world on the other.
The second precondition is that Israel must be free to expand its illegal settlements in the West Bank. Theoretically, the U.S. opposes these actions, but with a very light tap on the wrist, while continuing to provide economic, diplomatic, and military support. When the U.S. does have some limited objections, it very easily bars the actions, as in the case of the E-1 project linking Greater Jerusalem to the town of Ma’aleh Adumim, virtually bisecting the West Bank, a very high priority for Israeli planners (across the spectrum), but raising some objections in Washington, so that Israel has had to resort to devious measures to chip away at the project.
The pretense of opposition reached the level of farce last February when Obama vetoed a Security Council resolution calling for implementation of official U.S. policy (also adding the uncontroversial observation that the settlements themselves are illegal, quite apart from expansion). Since that time there has been little talk about ending settlement expansion, which continues, with studied provocation.
Thus, as Israeli and Palestinian representatives prepared to meet in Jordan in January 2012, Israel announced new construction in Pisgat Ze’ev and Har Homa, West Bank areas that it has declared to be within the greatly expanded area of Jerusalem, annexed, settled, and constructed as Israel’s capital, all in violation of direct Security Council orders. Other moves carry forward the grander design of separating whatever West Bank enclaves will be left to Palestinian administration from the cultural, commercial, political center of Palestinian life in the former Jerusalem.
It is understandable that Palestinian rights should be marginalized in U.S. policy and discourse. Palestinians have no wealth or power. They offer virtually nothing to U.S. policy concerns; in fact, they have negative value, as a nuisance that stirs up “the Arab street.”
Israel, in contrast, is a valuable ally. It is a rich society with a sophisticated, largely militarized high-tech industry. For decades, it has been a highly valued military and strategic ally, particularly since 1967, when it performed a great service to the U.S. and its Saudi ally by destroying the Nasserite “virus,” establishing the “special relationship” with Washington in the form that has persisted since. It is also a growing center for U.S. high-tech investment. In fact, high tech and particularly military industries in the two countries are closely linked.
Apart from such elementary considerations of great power politics as these, there are cultural factors that should not be ignored. Christian Zionism in Britain and the U.S. long preceded Jewish Zionism, and has been a significant elite phenomenon with clear policy implications (including the Balfour Declaration, which drew from it). When General Allenby conquered Jerusalem during World War I, he was hailed in the American press as Richard the Lion-Hearted, who had at last won the Crusades and driven the pagans out of the Holy Land.
The next step was for the Chosen People to return to the land promised to them by the Lord. Articulating a common elite view, President Franklin Roosevelt’s Secretary of the Interior Harold Ickes described Jewish colonization of Palestine as an achievement “without comparison in the history of the human race.” Such attitudes find their place easily within the Providentialist doctrines that have been a strong element in popular and elite culture since the country’s origins: the belief that God has a plan for the world and the U.S. is carrying it forward under divine guidance, as articulated by a long list of leading figures.
Moreover, evangelical Christianity is a major popular force in the U.S. Further toward the extremes, End Times evangelical Christianity also has enormous popular outreach, invigorated by the establishment of Israel in 1948, revitalized even more by the conquest of the rest of Palestine in 1967 — all signs that End Times and the Second Coming are approaching.
These forces have become particularly significant since the Reagan years, as the Republicans have abandoned the pretense of being a political party in the traditional sense, while devoting themselves in virtual lockstep uniformity to servicing a tiny percentage of the super-rich and the corporate sector. However, the small constituency that is primarily served by the reconstructed party cannot provide votes, so they have to turn elsewhere.
The only choice is to mobilize tendencies that have always been present, though rarely as an organized political force: primarily nativists trembling in fear and hatred, and religious elements that are extremists by international standards but not in the U.S. One outcome is reverence for alleged Biblical prophecies, hence not only support for Israel and its conquests and expansion, but passionate love for Israel, another core part of the catechism that must be intoned by Republican candidates — with Democrats, again, not too far behind.
These factors aside, it should not be forgotten that the “Anglosphere” — Britain and its offshoots — consists of settler-colonial societies, which rose on the ashes of indigenous populations, suppressed or virtually exterminated. Past practices must have been basically correct, in the U.S. case even ordained by Divine Providence. Accordingly there is often an intuitive sympathy for the children of Israel when they follow a similar course. But primarily, geostrategic and economic interests prevail, and policy is not graven in stone.
The Iranian “Threat” and the Nuclear Issue
Let us turn finally to the third of the leading issues addressed in the establishment journals cited earlier, the “threat of Iran.” Among elites and the political class this is generally taken to be the primary threat to world order — though not among populations. In Europe, polls show that Israel is regarded as the leading threat to peace. In the MENA countries, that status is shared with the U.S., to the extent that in Egypt, on the eve of the Tahrir Square uprising, 80% felt that the region would be more secure if Iran had nuclear weapons. The same polls found that only 10% regard Iran as a threat — unlike the ruling dictators, who have their own concerns.
In the United States, before the massive propaganda campaigns of the past few years, a majority of the population agreed with most of the world that, as a signatory of the Non-Proliferation Treaty, Iran has a right to carry out uranium enrichment. And even today, a large majority favors peaceful means for dealing with Iran. There is even strong opposition to military engagement if Iran and Israel are at war. Only a quarter regard Iran as an important concern for the U.S. altogether. But it is not unusual for there to be a gap, often a chasm, dividing public opinion and policy.
Why exactly is Iran regarded as such a colossal threat? The question is rarely discussed, but it is not hard to find a serious answer — though not, as usual, in the fevered pronouncements. The most authoritative answer is provided by the Pentagon and the intelligence services in their regular reports to Congress on global security. They report that Iran does not pose a military threat. Its military spending is very low even by the standards of the region, minuscule of course in comparison with the U.S.
Iran has little capacity to deploy force. Its strategic doctrines are defensive, designed to deter invasion long enough for diplomacy to set in. If Iran is developing nuclear weapons capability, they report, that would be part of its deterrence strategy. No serious analyst believes that the ruling clerics are eager to see their country and possessions vaporized, the immediate consequence of their coming even close to initiating a nuclear war. And it is hardly necessary to spell out the reasons why any Iranian leadership would be concerned with deterrence, under existing circumstances.
The regime is doubtless a serious threat to much of its own population — and regrettably, is hardly unique on that score. But the primary threat to the U.S. and Israel is that Iran might deter their free exercise of violence. A further threat is that the Iranians clearly seek to extend their influence to neighboring Iraq and Afghanistan, and beyond as well. Those “illegitimate” acts are called “destabilizing” (or worse). In contrast, forceful imposition of U.S. influence halfway around the world contributes to “stability” and order, in accord with traditional doctrine about who owns the world.
It makes very good sense to try to prevent Iran from joining the nuclear weapons states, including the three that have refused to sign the Non-Proliferation Treaty — Israel, India, and Pakistan, all of which have been assisted in developing nuclear weapons by the U.S., and are still being assisted by it. It is not impossible to approach that goal by peaceful diplomatic means. One approach, which enjoys overwhelming international support, is to undertake meaningful steps towards establishing a nuclear weapons-free zone in the Middle East, including Iran and Israel (and applying as well to U.S. forces deployed there), better still extending to South Asia.
Support for such efforts is so strong that the Obama administration has been compelled to formally agree, but with reservations: crucially, that Israel’s nuclear program must not be placed under the auspices of the International Atomic Energy Agency, and that no state (meaning the U.S.) should be required to release information about “Israeli nuclear facilities and activities, including information pertaining to previous nuclear transfers to Israel.” Obama also accepts Israel’s position that any such proposal must be conditional on a comprehensive peace settlement, which the U.S. and Israel can continue to delay indefinitely.
This survey comes nowhere near being exhaustive, needless to say. Among major topics not addressed is the shift of U.S. military policy towards the Asia-Pacific region, with new additions to the huge military base system underway right now, in Jeju Island off South Korea and Northwest Australia, all elements of the policy of “containment of China.” Closely related is the issue of U.S. bases in Okinawa, bitterly opposed by the population for many years, and a continual crisis in U.S.-Tokyo-Okinawa relations.
Revealing how little fundamental assumptions have changed, U.S. strategic analysts describe the result of China’s military programs as a “classic 'security dilemma,' whereby military programs and national strategies deemed defensive by their planners are viewed as threatening by the other side,” writes Paul Godwin of the Foreign Policy Research Institute. The security dilemma arises over control of the seas off China’s coasts. The U.S. regards its policies of controlling these waters as “defensive,” while China regards them as threatening; correspondingly, China regards its actions in nearby areas as “defensive” while the U.S. regards them as threatening. No such debate is even imaginable concerning U.S. coastal waters. This “classic security dilemma” makes sense, again, on the assumption that the U.S. has a right to control most of the world, and that U.S. security requires something approaching absolute global control.
While the principles of imperial domination have undergone little change, the capacity to implement them has markedly declined as power has become more broadly distributed in a diversifying world. Consequences are many. It is, however, very important to bear in mind that — unfortunately — none lifts the two dark clouds that hover over all consideration of global order: nuclear war and environmental catastrophe, both literally threatening the decent survival of the species.
Quite the contrary. Both threats are ominous, and increasing.
The Imperial Way
Significant anniversaries are solemnly commemorated — Japan’s attack on the U.S. naval base at Pearl Harbor, for example. Others are ignored, and we can often learn valuable lessons from them about what is likely to lie ahead. Right now, in fact.
At the moment, we are failing to commemorate the 50th anniversary of President John F. Kennedy’s decision to launch the most destructive and murderous act of aggression of the post-World War II period: the invasion of South Vietnam, later all of Indochina, leaving millions dead and four countries devastated, with casualties still mounting from the long-term effects of drenching South Vietnam with some of the most lethal carcinogens known, undertaken to destroy ground cover and food crops.
The prime target was South Vietnam. The aggression later spread to the North, then to the remote peasant society of northern Laos, and finally to rural Cambodia, which was bombed at the stunning level of all allied air operations in the Pacific region during World War II, including the two atom bombs dropped on Hiroshima and Nagasaki. In this, Henry Kissinger’s orders were being carried out — “anything that flies on anything that moves” — a call for genocide that is rare in the historical record. Little of this is remembered. Most was scarcely known beyond narrow circles of activists.
When the invasion was launched 50 years ago, concern was so slight that there were few efforts at justification, hardly more than the president’s impassioned plea that “we are opposed around the world by a monolithic and ruthless conspiracy that relies primarily on covert means for expanding its sphere of influence” and if the conspiracy achieves its ends in Laos and Vietnam, “the gates will be opened wide.”
Elsewhere, he warned further that “the complacent, the self-indulgent, the soft societies are about to be swept away with the debris of history [and] only the strong… can possibly survive,” in this case reflecting on the failure of U.S. aggression and terror to crush Cuban independence.
By the time protest began to mount half a dozen years later, the respected Vietnam specialist and military historian Bernard Fall, no dove, forecast that “Vietnam as a cultural and historic entity… is threatened with extinction…[as]…the countryside literally dies under the blows of the largest military machine ever unleashed on an area of this size.” He was again referring to South Vietnam.
When the war ended eight horrendous years later, mainstream opinion was divided between those who described the war as a “noble cause” that could have been won with more dedication, and at the opposite extreme, the critics, to whom it was “a mistake” that proved too costly. By 1977, President Carter aroused little notice when he explained that we owe Vietnam “no debt” because “the destruction was mutual.”
There are important lessons in all this for today, even apart from another reminder that only the weak and defeated are called to account for their crimes. One lesson is that to understand what is happening we should attend not only to critical events of the real world, often dismissed from history, but also to what leaders and elite opinion believe, however tinged with fantasy. Another lesson is that alongside the flights of fancy concocted to terrify and mobilize the public (and perhaps believed by some who are trapped in their own rhetoric), there is also geostrategic planning based on principles that are rational and stable over long periods because they are rooted in stable institutions and their concerns. That is true in the case of Vietnam as well. I will return to that, only stressing here that the persistent factors in state action are generally well concealed.
The Iraq war is an instructive case. It was marketed to a terrified public on the usual grounds of self-defense against an awesome threat to survival: the “single question,” George W. Bush and Tony Blair declared, was whether Saddam Hussein would end his programs of developing weapons of mass destruction. When the single question received the wrong answer, government rhetoric shifted effortlessly to our “yearning for democracy,” and educated opinion duly followed course; all routine.
Later, as the scale of the U.S. defeat in Iraq was becoming difficult to suppress, the government quietly conceded what had been clear all along. In 2007-2008, the administration officially announced that a final settlement must grant the U.S. military bases and the right of combat operations, and must privilege U.S. investors in the rich energy system — demands later reluctantly abandoned in the face of Iraqi resistance. And all well kept from the general population.
Gauging American Decline
With such lessons in mind, it is useful to look at what is highlighted in the major journals of policy and opinion today. Let us keep to the most prestigious of the establishment journals, Foreign Affairs. The headline blaring on the cover of the December 2011 issue reads in bold face: “Is America Over?”
The title article calls for “retrenchment” in the “humanitarian missions” abroad that are consuming the country’s wealth, so as to arrest the American decline that is a major theme of international affairs discourse, usually accompanied by the corollary that power is shifting to the East, to China and (maybe) India.
The lead articles are on Israel-Palestine. The first, by two high Israeli officials, is entitled “The Problem is Palestinian Rejection”: the conflict cannot be resolved because Palestinians refuse to recognize Israel as a Jewish state — thereby conforming to standard diplomatic practice: states are recognized, but not privileged sectors within them. The demand is hardly more than a new device to deter the threat of political settlement that would undermine Israel’s expansionist goals.
The opposing position, defended by an American professor, is entitled “The Problem Is the Occupation.” The subtitle reads “How the Occupation is Destroying the Nation.” Which nation? Israel, of course. The paired articles appear under the heading “Israel under Siege.”
The January 2012 issue features yet another call to bomb Iran now, before it is too late. Warning of “the dangers of deterrence,” the author suggests that “skeptics of military action fail to appreciate the true danger that a nuclear-armed Iran would pose to U.S. interests in the Middle East and beyond. And their grim forecasts assume that the cure would be worse than the disease — that is, that the consequences of a U.S. assault on Iran would be as bad as or worse than those of Iran achieving its nuclear ambitions. But that is a faulty assumption. The truth is that a military strike intended to destroy Iran’s nuclear program, if managed carefully, could spare the region and the world a very real threat and dramatically improve the long-term national security of the United States.”
Others argue that the costs would be too high, and at the extremes some even point out that an attack would violate international law — as does the stand of the moderates, who regularly deliver threats of violence, in violation of the U.N. Charter.
Let us review these dominant concerns in turn.
American decline is real, though the apocalyptic vision reflects the familiar ruling class perception that anything short of total control amounts to total disaster. Despite the piteous laments, the U.S. remains the world’s dominant power by a large margin, and no competitor is in sight, not only in the military dimension, in which of course the U.S. reigns supreme.
China and India have recorded rapid (though highly inegalitarian) growth, but remain very poor countries, with enormous internal problems not faced by the West. China is the world’s major manufacturing center, but largely as an assembly plant for the advanced industrial powers on its periphery and for western multinationals. That is likely to change over time. Manufacturing regularly provides the basis for innovation, often breakthroughs, as is now sometimes happening in China. One example that has impressed western specialists is China’s takeover of the growing global solar panel market, not on the basis of cheap labor but by coordinated planning and, increasingly, innovation.
But the problems China faces are serious. Some are demographic, reviewed in a study in Science, the leading U.S. science weekly, which shows that mortality sharply decreased in China during the Maoist years, “mainly a result of economic development and improvements in education and health services, especially the public hygiene movement that resulted in a sharp drop in mortality from infectious diseases.” This progress ended with the initiation of the capitalist reforms 30 years ago, and the death rate has since increased.
Furthermore, China’s recent economic growth has relied substantially on a “demographic bonus,” a very large working-age population. “But the window for harvesting this bonus may close soon,” with a “profound impact on development”: “Excess cheap labor supply, which is one of the major factors driving China's economic miracle, will no longer be available.”
Demography is only one of many serious problems ahead. For India, the problems are far more severe.
Not all prominent voices foresee American decline. Among international media, there is none more serious and responsible than the London Financial Times. It recently devoted a full page to the optimistic expectation that new technology for extracting North American fossil fuels might allow the U.S. to become energy independent, hence to retain its global hegemony for a century. There is no mention of the kind of world the U.S. would rule in this happy event, but not for lack of evidence.
At about the same time, the International Energy Agency reported that, with rapidly increasing carbon emissions from fossil fuel use, the limit of safety will be reached by 2017 if the world continues on its present course. “The door is closing,” the IEA chief economist said, and very soon it “will be closed forever.”
Shortly before, the U.S. Department of Energy had reported the most recent carbon dioxide emissions figures, which “jumped by the biggest amount on record” to a level higher than the worst-case scenario anticipated by the Intergovernmental Panel on Climate Change (IPCC). That came as no surprise to many scientists, including the MIT program on climate change, which for years has warned that the IPCC predictions are too conservative.
Such critics of the IPCC predictions receive virtually no public attention, unlike the fringe of denialists who are supported by the corporate sector, along with huge propaganda campaigns that have driven Americans off the international spectrum in dismissal of the threats. Business support also translates directly to political power. Denialism is part of the catechism that must be intoned by Republican candidates in the farcical election campaign now in progress, and in Congress they are powerful enough to abort even efforts to inquire into the effects of global warming, let alone do anything serious about it.
In brief, American decline can perhaps be stemmed if we abandon hope for decent survival, prospects that are all too real given the balance of forces in the world.
“Losing” China and Vietnam
Putting such unpleasant thoughts aside, a close look at American decline shows that China indeed plays a large role, as it has for 60 years. The decline that now elicits such concern is not a recent phenomenon. It traces back to the end of World War II, when the U.S. had half the world’s wealth and incomparable security and global reach. Planners were naturally well aware of the enormous disparity of power, and intended to keep it that way.
The basic viewpoint was outlined with admirable frankness in a major state paper of 1948 (PPS 23). The author was one of the architects of the New World Order of the day, the chair of the State Department Policy Planning Staff, the respected statesman and scholar George Kennan, a moderate dove within the planning spectrum. He observed that the central policy goal was to maintain the “position of disparity” that separated our enormous wealth from the poverty of others. To achieve that goal, he advised, “We should cease to talk about vague and… unreal objectives such as human rights, the raising of the living standards, and democratization,” and must “deal in straight power concepts,” not “hampered by idealistic slogans” about “altruism and world-benefaction.”
Kennan was referring specifically to Asia, but the observations generalize, with exceptions, for participants in the U.S.-run global system. It was well understood that the “idealistic slogans” were to be displayed prominently when addressing others, including the intellectual classes, who were expected to promulgate them.
The plans that Kennan helped formulate and implement took for granted that the U.S. would control the Western Hemisphere, the Far East, the former British empire (including the incomparable energy resources of the Middle East), and as much of Eurasia as possible, crucially its commercial and industrial centers. These were not unrealistic objectives, given the distribution of power. But decline set in at once.
In 1949, China declared independence, an event known in Western discourse as “the loss of China” — in the U.S., with bitter recriminations and conflict over who was responsible for that loss. The terminology is revealing. It is only possible to lose something that one owns. The tacit assumption was that the U.S. owned China, by right, along with most of the rest of the world, much as postwar planners assumed.
The “loss of China” was the first major step in “America’s decline.” It had major policy consequences. One was the immediate decision to support France’s effort to reconquer its former colony of Indochina, so that it, too, would not be “lost.”
Indochina itself was not a major concern, despite claims about its rich resources by President Eisenhower and others. Rather, the concern was the “domino theory,” which is often ridiculed when dominoes don’t fall, but remains a leading principle of policy because it is quite rational. To adopt Henry Kissinger’s version, a region that falls out of control can become a “virus” that will “spread contagion,” inducing others to follow the same path.
In the case of Vietnam, the concern was that the virus of independent development might infect Indonesia, which really does have rich resources. And that might lead Japan — the “superdomino” as it was called by the prominent Asia historian John Dower — to “accommodate” to an independent Asia as its technological and industrial center in a system that would escape the reach of U.S. power. That would mean, in effect, that the U.S. had lost the Pacific phase of World War II, fought to prevent Japan’s attempt to establish such a New Order in Asia.
The way to deal with such a problem is clear: destroy the virus and “inoculate” those who might be infected. In the Vietnam case, the rational choice was to destroy any hope of successful independent development and to impose brutal dictatorships in the surrounding regions. Those tasks were successfully carried out — though history has its own cunning, and something similar to what was feared has since been developing in East Asia, much to Washington’s dismay.
The most important victory of the Indochina wars was in 1965, when a U.S.-backed military coup in Indonesia led by General Suharto carried out massive crimes that were compared by the CIA to those of Hitler, Stalin, and Mao. The “staggering mass slaughter,” as the New York Times described it, was reported accurately across the mainstream, and with unrestrained euphoria.
It was “a gleam of light in Asia,” as the noted liberal commentator James Reston wrote in the Times. The coup ended the threat of democracy by demolishing the mass-based political party of the poor, established a dictatorship that went on to compile one of the worst human rights records in the world, and threw the riches of the country open to western investors. Small wonder that, after many other horrors, including the near-genocidal invasion of East Timor, Suharto was welcomed by the Clinton administration in 1995 as “our kind of guy.”
Years after the great events of 1965, Kennedy-Johnson National Security Adviser McGeorge Bundy reflected that it would have been wise to end the Vietnam war at that time, with the “virus” virtually destroyed and the primary domino solidly in place, buttressed by other U.S.-backed dictatorships throughout the region.
Similar procedures have been routinely followed elsewhere. Kissinger was referring specifically to the threat of socialist democracy in Chile. That threat was ended on another forgotten date, what Latin Americans call “the first 9/11,” which in violence and bitter effects far exceeded the 9/11 commemorated in the West. A vicious dictatorship was imposed in Chile, one part of a plague of brutal repression that spread through Latin America, reaching Central America under Reagan. Viruses have aroused deep concern elsewhere as well, including the Middle East, where the threat of secular nationalism has often concerned British and U.S. planners, inducing them to support radical Islamic fundamentalism to counter it.
The Concentration of Wealth and American Decline
Despite such victories, American decline continued. By 1970, the U.S. share of world wealth had dropped to about 25%, roughly where it remains, still colossal but far below its level at the end of World War II. By then, the industrial world was “tripolar”: U.S.-based North America, German-based Europe, and East Asia, already the most dynamic industrial region, at the time Japan-based, but by now including the former Japanese colonies Taiwan and South Korea, and more recently China.
At about that time, American decline entered a new phase: conscious self-inflicted decline. From the 1970s, there has been a significant change in the U.S. economy, as planners, private and state, shifted it toward financialization and the offshoring of production, driven in part by the declining rate of profit in domestic manufacturing. These decisions initiated a vicious cycle in which wealth became highly concentrated (dramatically so in the top 0.1% of the population), yielding concentration of political power, hence legislation to carry the cycle further: taxation and other fiscal policies, deregulation, changes in the rules of corporate governance allowing huge gains for executives, and so on.
Meanwhile, for the majority, real wages largely stagnated, and people were able to get by only by sharply increased workloads (far beyond Europe), unsustainable debt, and repeated bubbles since the Reagan years, creating paper wealth that inevitably disappeared when they burst (and the perpetrators were bailed out by the taxpayer). In parallel, the political system has been increasingly shredded as both parties are driven deeper into corporate pockets with the escalating cost of elections, the Republicans to the level of farce, the Democrats (now largely the former “moderate Republicans”) not far behind.
A recent study by the Economic Policy Institute, which has been the major source of reputable data on these developments for years, is entitled Failure by Design. The phrase “by design” is accurate. Other choices were certainly possible. And as the study points out, the “failure” is class-based. There is no failure for the designers. Far from it. Rather, the policies are a failure for the large majority, the 99% in the imagery of the Occupy movements — and for the country, which has declined and will continue to do so under these policies.
One factor is the offshoring of manufacturing. As the solar panel example mentioned earlier illustrates, manufacturing capacity provides the basis and stimulus for innovation leading to higher stages of sophistication in production, design, and invention. That, too, is being outsourced, not a problem for the “money mandarins” who increasingly design policy, but a serious problem for working people and the middle classes, and a real disaster for the most oppressed, African Americans, who have never escaped the legacy of slavery and its ugly aftermath, and whose meager wealth virtually disappeared after the collapse of the housing bubble in 2008, setting off the most recent financial crisis, the worst so far.
Noam Chomsky is Institute Professor emeritus in the MIT Department of Linguistics and Philosophy. He is the author of numerous best-selling political works. His latest books are Making the Future: Occupations, Intervention, Empire, and Resistance; The Essential Chomsky (edited by Anthony Arnove), a collection of his writings on politics and on language from the 1950s to the present; Gaza in Crisis, with Ilan Pappé; and Hopes and Prospects, also available as an audiobook.
Copyright 2012 Noam Chomsky
“Losing” the World
We are approaching the 10th anniversary of the horrendous atrocities of September 11, 2001, which, it is commonly held, changed the world. On May 1st, the presumed mastermind of the crime, Osama bin Laden, was assassinated in Pakistan by a team of elite US commandos, Navy SEALs, after he was captured, unarmed and undefended, in Operation Geronimo.
A number of analysts have observed that although bin Laden was finally killed, he won some major successes in his war against the U.S. “He repeatedly asserted that the only way to drive the U.S. from the Muslim world and defeat its satraps was by drawing Americans into a series of small but expensive wars that would ultimately bankrupt them,” Eric Margolis writes. “‘Bleeding the U.S.,’ in his words. The United States, first under George W. Bush and then Barack Obama, rushed right into bin Laden’s trap… Grotesquely overblown military outlays and debt addiction… may be the most pernicious legacy of the man who thought he could defeat the United States” — particularly when the debt is being cynically exploited by the far right, with the collusion of the Democratic establishment, to undermine what remains of social programs, public education, unions, and, in general, remaining barriers to corporate tyranny.
That Washington was bent on fulfilling bin Laden’s fervent wishes was evident at once. As discussed in my book 9-11, written shortly after those attacks occurred, anyone with knowledge of the region could recognize “that a massive assault on a Muslim population would be the answer to the prayers of bin Laden and his associates, and would lead the U.S. and its allies into a ‘diabolical trap,’ as the French foreign minister put it.”
The senior CIA analyst responsible for tracking Osama bin Laden from 1996, Michael Scheuer, wrote shortly after that “bin Laden has been precise in telling America the reasons he is waging war on us. [He] is out to drastically alter U.S. and Western policies toward the Islamic world,” and largely succeeded: “U.S. forces and policies are completing the radicalization of the Islamic world, something Osama bin Laden has been trying to do with substantial but incomplete success since the early 1990s. As a result, I think it is fair to conclude that the United States of America remains bin Laden’s only indispensable ally.” And arguably remains so, even after his death.
The First 9/11
Was there an alternative? There is every likelihood that the Jihadi movement, much of it highly critical of bin Laden, could have been split and undermined after 9/11. The “crime against humanity,” as it was rightly called, could have been approached as a crime, with an international operation to apprehend the likely suspects. That was recognized at the time, but no such idea was even considered.
In 9-11, I quoted Robert Fisk’s conclusion that the “horrendous crime” of 9/11 was committed with “wickedness and awesome cruelty,” an accurate judgment. It is useful to bear in mind that the crimes could have been even worse. Suppose, for example, that the attack had gone as far as bombing the White House, killing the president, imposing a brutal military dictatorship that killed thousands and tortured tens of thousands while establishing an international terror center that helped impose similar torture-and-terror states elsewhere and carried out an international assassination campaign; and as an extra fillip, brought in a team of economists — call them “the Kandahar boys” — who quickly drove the economy into one of the worst depressions in its history. That, plainly, would have been a lot worse than 9/11.
Unfortunately, it is not a thought experiment. It happened. The only inaccuracy in this brief account is that the numbers should be multiplied by 25 to yield per capita equivalents, the appropriate measure. I am, of course, referring to what in Latin America is often called “the first 9/11”: September 11, 1973, when the U.S. succeeded in its intensive efforts to overthrow the democratic government of Salvador Allende in Chile with a military coup that placed General Pinochet’s brutal regime in office. The goal, in the words of the Nixon administration, was to kill the “virus” that might encourage all those “foreigners [who] are out to screw us” to take over their own resources and in other ways to pursue an intolerable policy of independent development. In the background was the conclusion of the National Security Council that, if the US could not control Latin America, it could not expect “to achieve a successful order elsewhere in the world.”
The first 9/11, unlike the second, did not change the world. It was “nothing of very great consequence,” as Henry Kissinger assured his boss a few days later.
These events of little consequence were not limited to the military coup that destroyed Chilean democracy and set in motion the horror story that followed. The first 9/11 was just one act in a drama which began in 1962, when John F. Kennedy shifted the mission of the Latin American military from “hemispheric defense” — an anachronistic holdover from World War II — to “internal security,” a concept with a chilling interpretation in U.S.-dominated Latin American circles.
In the recently published Cambridge History of the Cold War, Latin American scholar John Coatsworth writes that from that time to “the Soviet collapse in 1990, the numbers of political prisoners, torture victims, and executions of non-violent political dissenters in Latin America vastly exceeded those in the Soviet Union and its East European satellites,” including many religious martyrs and mass slaughter as well, always supported or initiated in Washington. The last major violent act was the brutal murder of six leading Latin American intellectuals, Jesuit priests, a few days after the Berlin Wall fell. The perpetrators were an elite Salvadoran battalion, which had already left a shocking trail of blood, fresh from renewed training at the JFK School of Special Warfare, acting on direct orders of the high command of the U.S. client state.
The consequences of this hemispheric plague still, of course, reverberate.
From Kidnapping and Torture to Assassination
All of this, and much more like it, is dismissed as of little consequence, and forgotten. Those whose mission is to rule the world enjoy a more comforting picture, articulated well enough in the current issue of the prestigious (and valuable) journal of the Royal Institute of International Affairs in London. The lead article discusses “the visionary international order” of the “second half of the twentieth century” marked by “the universalization of an American vision of commercial prosperity.” There is something to that account, but it does not quite convey the perception of those at the wrong end of the guns.
The same is true of the assassination of Osama bin Laden, which brings to an end at least a phase in the “war on terror” re-declared by President George W. Bush on the second 9/11. Let us turn to a few thoughts on that event and its significance.
On May 1, 2011, Osama bin Laden was killed in his virtually unprotected compound by a raiding mission of 79 Navy SEALs, who entered Pakistan by helicopter. After many lurid stories were provided by the government and withdrawn, official reports made it increasingly clear that the operation was a planned assassination, multiply violating elementary norms of international law, beginning with the invasion itself.
There appears to have been no attempt to apprehend the unarmed victim, as presumably could have been done by 79 commandos facing no opposition — except, they report, from his wife, also unarmed, whom they shot in self-defense when she “lunged” at them, according to the White House.
A plausible reconstruction of the events is provided by veteran Middle East correspondent Yochi Dreazen and colleagues in the Atlantic. Dreazen, formerly the military correspondent for the Wall Street Journal, is senior correspondent for the National Journal Group covering military affairs and national security. According to their investigation, White House planning appears not to have considered the option of capturing bin Laden alive: “The administration had made clear to the military’s clandestine Joint Special Operations Command that it wanted bin Laden dead, according to a senior U.S. official with knowledge of the discussions. A high-ranking military officer briefed on the assault said the SEALs knew their mission was not to take him alive.”
The authors add: “For many at the Pentagon and the Central Intelligence Agency who had spent nearly a decade hunting bin Laden, killing the militant was a necessary and justified act of vengeance.” Furthermore, “capturing bin Laden alive would have also presented the administration with an array of nettlesome legal and political challenges.” Better, then, to assassinate him, dumping his body into the sea without the autopsy considered essential after a killing — an act that predictably provoked both anger and skepticism in much of the Muslim world.
As the Atlantic inquiry observes, “The decision to kill bin Laden outright was the clearest illustration to date of a little-noticed aspect of the Obama administration’s counterterror policy. The Bush administration captured thousands of suspected militants and sent them to detention camps in Afghanistan, Iraq, and Guantanamo Bay. The Obama administration, by contrast, has focused on eliminating individual terrorists rather than attempting to take them alive.” That is one significant difference between Bush and Obama. The authors quote former West German Chancellor Helmut Schmidt, who “told German TV that the U.S. raid was ‘quite clearly a violation of international law’ and that bin Laden should have been detained and put on trial,” contrasting Schmidt with U.S. Attorney General Eric Holder, who “defended the decision to kill bin Laden although he didn’t pose an immediate threat to the Navy SEALs, telling a House panel… that the assault had been ‘lawful, legitimate and appropriate in every way.’”
The disposal of the body without autopsy was also criticized by allies. The highly regarded British barrister Geoffrey Robertson, who supported the intervention and opposed the execution largely on pragmatic grounds, nevertheless described Obama’s claim that “justice was done” as an “absurdity” that should have been obvious to a former professor of constitutional law. Pakistan law “requires a colonial inquest on violent death, and international human rights law insists that the ‘right to life’ mandates an inquiry whenever violent death occurs from government or police action. The U.S. is therefore under a duty to hold an inquiry that will satisfy the world as to the true circumstances of this killing.”
Robertson usefully reminds us that “[i]t was not always thus. When the time came to consider the fate of men much more steeped in wickedness than Osama bin Laden — the Nazi leadership — the British government wanted them hanged within six hours of capture. President Truman demurred, citing the conclusion of Justice Robert Jackson that summary execution ‘would not sit easily on the American conscience or be remembered by our children with pride… the only course is to determine the innocence or guilt of the accused after a hearing as dispassionate as the times will permit and upon a record that will leave our reasons and motives clear.’”
Eric Margolis comments that “Washington has never made public the evidence of its claim that Osama bin Laden was behind the 9/11 attacks,” presumably one reason why “polls show that fully a third of American respondents believe that the U.S. government and/or Israel were behind 9/11,” while in the Muslim world skepticism is much higher. “An open trial in the U.S. or at the Hague would have exposed these claims to the light of day,” he continues, a practical reason why Washington should have followed the law.
In societies that profess some respect for law, suspects are apprehended and brought to fair trial. I stress “suspects.” In June 2002, FBI head Robert Mueller, in what the Washington Post described as “among his most detailed public comments on the origins of the attacks,” could say only that “investigators believe the idea of the Sept. 11 attacks on the World Trade Center and Pentagon came from al Qaeda leaders in Afghanistan, the actual plotting was done in Germany, and the financing came through the United Arab Emirates from sources in Afghanistan.”
What the FBI believed and thought in June 2002 they didn’t know eight months earlier, when Washington dismissed tentative offers by the Taliban (how serious, we do not know) to permit a trial of bin Laden if they were presented with evidence. Thus, it is not true, as President Obama claimed in his White House statement after bin Laden’s death, that “[w]e quickly learned that the 9/11 attacks were carried out by al-Qaeda.”
There has never been any reason to doubt what the FBI believed in mid-2002, but that leaves us far from the proof of guilt required in civilized societies — and whatever the evidence might be, it does not warrant murdering a suspect who could, it seems, have been easily apprehended and brought to trial. Much the same is true of evidence provided since. Thus, the 9/11 Commission provided extensive circumstantial evidence of bin Laden’s role in 9/11, based primarily on what it had been told about confessions by prisoners in Guantanamo. It is doubtful that much of that would hold up in an independent court, considering the ways confessions were elicited. But in any event, the conclusions of a congressionally authorized investigation, however convincing one finds them, plainly fall short of a sentence by a credible court, which is what shifts the category of the accused from suspect to convicted.
There is much talk of bin Laden’s “confession,” but that was a boast, not a confession, with as much credibility as my “confession” that I won the Boston marathon. The boast tells us a lot about his character, but nothing about his responsibility for what he regarded as a great achievement, for which he wanted to take credit.
Again, all of this is, transparently, quite independent of one’s judgments about his responsibility, which seemed clear immediately, even before the FBI inquiry, and still does.
Crimes of Aggression
It is worth adding that bin Laden’s responsibility was recognized in much of the Muslim world, and condemned. One significant example is the distinguished Lebanese cleric Sheikh Fadlallah, greatly respected by Hizbollah and Shia groups generally, outside Lebanon as well. He had some experience with assassinations. He had been targeted for assassination: by a truck bomb outside a mosque, in a CIA-organized operation in 1985. He escaped, but 80 others were killed, mostly women and girls as they left the mosque — one of those innumerable crimes that do not enter the annals of terror because of the fallacy of “wrong agency.” Sheikh Fadlallah sharply condemned the 9/11 attacks.
One of the leading specialists on the Jihadi movement, Fawaz Gerges, suggests that the movement might have been split at that time had the U.S. exploited the opportunity instead of mobilizing the movement, particularly by the attack on Iraq, a great boon to bin Laden, which led to a sharp increase in terror, as intelligence agencies had anticipated. At the Chilcot hearings investigating the background to the invasion of Iraq, for example, the former head of Britain’s domestic intelligence agency MI5 testified that both British and U.S. intelligence were aware that Saddam posed no serious threat, that the invasion was likely to increase terror, and that the invasions of Iraq and Afghanistan had radicalized parts of a generation of Muslims who saw the military actions as an “attack on Islam.” As is often the case, security was not a high priority for state action.
It might be instructive to ask ourselves how we would be reacting if Iraqi commandos had landed at George W. Bush’s compound, assassinated him, and dumped his body in the Atlantic (after proper burial rites, of course). Uncontroversially, he was not a “suspect” but the “decider” who gave the orders to invade Iraq — that is, to commit the “supreme international crime differing only from other war crimes in that it contains within itself the accumulated evil of the whole” for which Nazi criminals were hanged: the hundreds of thousands of deaths, millions of refugees, destruction of much of the country and its national heritage, and the murderous sectarian conflict that has now spread to the rest of the region. Equally uncontroversially, these crimes vastly exceed anything attributed to bin Laden.
To say that all of this is uncontroversial, as it is, is not to imply that it is not denied. The existence of flat earthers does not change the fact that, uncontroversially, the earth is not flat. Similarly, it is uncontroversial that Stalin and Hitler were responsible for horrendous crimes, though loyalists deny it. All of this should, again, be too obvious for comment, and would be, except in an atmosphere of hysteria so extreme that it blocks rational thought.
Similarly, it is uncontroversial that Bush and associates did commit the “supreme international crime” — the crime of aggression. That crime was defined clearly enough by Justice Robert Jackson, Chief of Counsel for the United States at Nuremberg. An “aggressor,” Jackson proposed to the Tribunal in his opening statement, is a state that is the first to commit such actions as “[i]nvasion by its armed forces, with or without a declaration of war, of the territory of another State ….” No one, even the most extreme supporter of the aggression, denies that Bush and associates did just that.
We might also do well to recall Jackson’s eloquent words at Nuremberg on the principle of universality: “If certain acts in violation of treaties are crimes, they are crimes whether the United States does them or whether Germany does them, and we are not prepared to lay down a rule of criminal conduct against others which we would not be willing to have invoked against us.”
It is also clear that announced intentions are irrelevant, even if they are truly believed. Internal records reveal that Japanese fascists apparently did believe that, by ravaging China, they were laboring to turn it into an “earthly paradise.” And although it may be difficult to imagine, it is conceivable that Bush and company believed they were protecting the world from destruction by Saddam’s nuclear weapons. All irrelevant, though ardent loyalists on all sides may try to convince themselves otherwise.
We are left with two choices: either Bush and associates are guilty of the “supreme international crime” including all the evils that follow, or else we declare that the Nuremberg proceedings were a farce and the allies were guilty of judicial murder.
The Imperial Mentality and 9/11
A few days before the bin Laden assassination, Orlando Bosch died peacefully in Florida, where he resided along with his accomplice Luis Posada Carriles and many other associates in international terrorism. After he was accused of dozens of terrorist crimes by the FBI, Bosch was granted a presidential pardon by Bush I over the objections of the Justice Department, which found the conclusion “inescapable that it would be prejudicial to the public interest for the United States to provide a safe haven for Bosch.” The coincidence of these deaths at once calls to mind the Bush II doctrine — “already… a de facto rule of international relations,” according to the noted Harvard international relations specialist Graham Allison — which revokes “the sovereignty of states that provide sanctuary to terrorists.”
Allison refers to the pronouncement of Bush II, directed at the Taliban, that “those who harbor terrorists are as guilty as the terrorists themselves.” Such states, therefore, have lost their sovereignty and are fit targets for bombing and terror — for example, the state that harbored Bosch and his associate. When Bush issued this new “de facto rule of international relations,” no one seemed to notice that he was calling for invasion and destruction of the U.S. and the murder of its criminal presidents.
None of this is problematic, of course, if we reject Justice Jackson’s principle of universality, and adopt instead the principle that the U.S. is self-immunized against international law and conventions — as, in fact, the government has frequently made very clear.
It is also worth thinking about the name given to the bin Laden operation: Operation Geronimo. The imperial mentality is so profound that few seem able to perceive that the White House is glorifying bin Laden by calling him “Geronimo” — the Apache Indian chief who led the courageous resistance to the invaders of Apache lands.
The casual choice of the name is reminiscent of the ease with which we name our murder weapons after victims of our crimes: Apache, Blackhawk… We might react differently if the Luftwaffe had called its fighter planes “Jew” and “Gypsy.”
The examples mentioned would fall under the category of “American exceptionalism,” were it not for the fact that easy suppression of one’s own crimes is virtually ubiquitous among powerful states, at least those that are not defeated and forced to acknowledge reality.
Perhaps the assassination was perceived by the administration as an “act of vengeance,” as Robertson concludes. And perhaps the rejection of the legal option of a trial reflects a difference between the moral culture of 1945 and today, as he suggests. Whatever the motive was, it could hardly have been security. As in the case of the “supreme international crime” in Iraq, the bin Laden assassination is another illustration of the important fact that security is often not a high priority for state action, contrary to received doctrine.
Noam Chomsky is Institute Professor emeritus in the MIT Department of Linguistics and Philosophy. He is the author of numerous bestselling political works, including 9-11: Was There an Alternative? (Seven Stories Press), an updated version of his classic account, just being published this week with a major new essay — from which this post was adapted — considering the 10 years since the 9/11 attacks.
Copyright 2011 Noam Chomsky
Was There an Alternative?
The democracy uprising in the Arab world has been a spectacular display of courage, dedication, and commitment by popular forces — coinciding, fortuitously, with a remarkable uprising of tens of thousands in support of working people and democracy in Madison, Wisconsin, and other U.S. cities. If the trajectories of revolt in Cairo and Madison intersected, however, they were headed in opposite directions: in Cairo toward gaining elementary rights denied by the dictatorship, in Madison toward defending rights that had been won in long and hard struggles and are now under severe attack.
Each is a microcosm of tendencies in global society, following varied courses. There are sure to be far-reaching consequences of what is taking place both in the decaying industrial heartland of the richest and most powerful country in human history, and in what President Dwight Eisenhower called "the most strategically important area in the world" — "a stupendous source of strategic power" and "probably the richest economic prize in the world in the field of foreign investment," in the words of the State Department in the 1940s, a prize that the U.S. intended to keep for itself and its allies in the unfolding New World Order of that day.
Despite all the changes since, there is every reason to suppose that today's policy-makers basically adhere to the judgment of President Franklin Delano Roosevelt’s influential advisor A.A. Berle that control of the incomparable energy reserves of the Middle East would yield "substantial control of the world." And correspondingly, that loss of control would threaten the project of global dominance that was clearly articulated during World War II, and that has been sustained in the face of major changes in world order since that day.
From the outset of the war in 1939, Washington anticipated that it would end with the U.S. in a position of overwhelming power. High-level State Department officials and foreign policy specialists met through the wartime years to lay out plans for the postwar world. They delineated a "Grand Area" that the U.S. was to dominate, including the Western hemisphere, the Far East, and the former British empire, with its Middle East energy resources. As Russia began to grind down Nazi armies after Stalingrad, Grand Area goals extended to as much of Eurasia as possible, at least its economic core in Western Europe. Within the Grand Area, the U.S. would maintain "unquestioned power," with "military and economic supremacy," while ensuring the "limitation of any exercise of sovereignty" by states that might interfere with its global designs. The careful wartime plans were soon implemented.
It was always recognized that Europe might choose to follow an independent course. NATO was partially intended to counter this threat. As soon as its official pretext dissolved in 1989, NATO was expanded to the East in violation of verbal pledges to Soviet leader Mikhail Gorbachev. It has since become a U.S.-run intervention force, with far-ranging scope, spelled out by NATO Secretary-General Jaap de Hoop Scheffer, who informed a NATO conference that "NATO troops have to guard pipelines that transport oil and gas that is directed for the West," and more generally to protect sea routes used by tankers and other "crucial infrastructure" of the energy system.
Grand Area doctrines clearly license military intervention at will. That conclusion was articulated clearly by the Clinton administration, which declared that the U.S. has the right to use military force to ensure "uninhibited access to key markets, energy supplies, and strategic resources," and must maintain huge military forces "forward deployed" in Europe and Asia "in order to shape people's opinions about us" and "to shape events that will affect our livelihood and our security."
The same principles governed the invasion of Iraq. As the U.S. failure to impose its will in Iraq was becoming unmistakable, the actual goals of the invasion could no longer be concealed behind pretty rhetoric. In November 2007, the White House issued a Declaration of Principles demanding that U.S. forces must remain indefinitely in Iraq and committing Iraq to privilege American investors. Two months later, President Bush informed Congress that he would reject legislation that might limit the permanent stationing of U.S. Armed Forces in Iraq or "United States control of the oil resources of Iraq" — demands that the U.S. had to abandon shortly after in the face of Iraqi resistance.
In Tunisia and Egypt, the recent popular uprisings have won impressive victories, but as the Carnegie Endowment reported, while names have changed, the regimes remain: "A change in ruling elites and system of governance is still a distant goal." The report discusses internal barriers to democracy, but ignores the external ones, which as always are significant.
The U.S. and its Western allies are sure to do whatever they can to prevent authentic democracy in the Arab world. To understand why, it is only necessary to look at the studies of Arab opinion conducted by U.S. polling agencies. Though barely reported, they are certainly known to planners. They reveal that by overwhelming majorities, Arabs regard the U.S. and Israel as the major threats they face: the U.S. is so regarded by 90% of Egyptians, in the region generally by over 75%. Some Arabs regard Iran as a threat: 10%. Opposition to U.S. policy is so strong that a majority believes that security would be improved if Iran had nuclear weapons — in Egypt, 80%. Other figures are similar. If public opinion were to influence policy, the U.S. not only would not control the region, but would be expelled from it, along with its allies, undermining fundamental principles of global dominance.
The Invisible Hand of Power
Support for democracy is the province of ideologists and propagandists. In the real world, elite dislike of democracy is the norm. The evidence is overwhelming that democracy is supported insofar as it contributes to social and economic objectives, a conclusion reluctantly conceded by the more serious scholarship.
Elite contempt for democracy was revealed dramatically in the reaction to the WikiLeaks exposures. Those that received the most attention, with euphoric commentary, were cables reporting that Arabs support the U.S. stand on Iran. The reference was to the ruling dictators. The attitudes of the public were unmentioned. The guiding principle was articulated clearly by Carnegie Endowment Middle East specialist Marwan Muasher, formerly a high official of the Jordanian government: "There is nothing wrong, everything is under control." In short, if the dictators support us, what else could matter?
The Muasher doctrine is rational and venerable. To mention just one case that is highly relevant today, in internal discussion in 1958, President Eisenhower expressed concern about "the campaign of hatred" against us in the Arab world, not by governments, but by the people. The National Security Council (NSC) explained that there is a perception in the Arab world that the U.S. supports dictatorships and blocks democracy and development so as to ensure control over the resources of the region. Furthermore, the perception is basically accurate, the NSC concluded, and that is what we should be doing, relying on the Muasher doctrine. Pentagon studies conducted after 9/11 confirmed that the same holds today.
It is normal for the victors to consign history to the trash can, and for victims to take it seriously. Perhaps a few brief observations on this important matter may be useful. Today is not the first occasion when Egypt and the U.S. are facing similar problems, and moving in opposite directions. That was also true in the early nineteenth century.
Economic historians have argued that Egypt was well-placed to undertake rapid economic development at the same time that the U.S. was. Both had rich agriculture, including cotton, the fuel of the early industrial revolution — though unlike Egypt, the U.S. had to develop cotton production and a work force by conquest, extermination, and slavery, with consequences that are evident right now in the reservations for the survivors and the prisons that have rapidly expanded since the Reagan years to house the superfluous population left by deindustrialization.
One fundamental difference was that the U.S. had gained independence and was therefore free to ignore the prescriptions of economic theory, delivered at the time by Adam Smith in terms rather like those preached to developing societies today. Smith urged the liberated colonies to produce primary products for export and to import superior British manufactures, and certainly not to attempt to monopolize crucial goods, particularly cotton. Any other path, Smith warned, "would retard instead of accelerating the further increase in the value of their annual produce, and would obstruct instead of promoting the progress of their country towards real wealth and greatness."
Having gained their independence, the colonies were free to ignore his advice and to follow England's course of independent state-guided development, with high tariffs to protect industry from British exports, first textiles, later steel and others, and to adopt numerous other devices to accelerate industrial development. The independent Republic also sought to gain a monopoly of cotton so as to "place all other nations at our feet," particularly the British enemy, as the Jacksonian presidents announced when conquering Texas and half of Mexico.
For Egypt, a comparable course was barred by British power. Lord Palmerston declared that "no ideas of fairness [toward Egypt] ought to stand in the way of such great and paramount interests" of Britain as preserving its economic and political hegemony, expressing his "hate" for the "ignorant barbarian" Muhammed Ali who dared to seek an independent course, and deploying Britain's fleet and financial power to terminate Egypt's quest for independence and economic development.
After World War II, when the U.S. displaced Britain as global hegemon, Washington adopted the same stand, making it clear that the U.S. would provide no aid to Egypt unless it adhered to the standard rules for the weak — which the U.S. continued to violate, imposing high tariffs to bar Egyptian cotton and causing a debilitating dollar shortage. The usual interpretation of market principles.
It is small wonder that the "campaign of hatred" against the U.S. that concerned Eisenhower was based on the recognition that the U.S. supports dictators and blocks democracy and development, as do its allies.
In Adam Smith's defense, it should be added that he recognized what would happen if Britain followed the rules of sound economics, now called "neoliberalism." He warned that if British manufacturers, merchants, and investors turned abroad, they might profit but England would suffer. But he felt that they would be guided by a home bias, so as if by an invisible hand England would be spared the ravages of economic rationality.
The passage is hard to miss. It is the one occurrence of the famous phrase "invisible hand" in The Wealth of Nations. The other leading founder of classical economics, David Ricardo, drew similar conclusions, hoping that home bias would lead men of property to "be satisfied with the low rate of profits in their own country, rather than seek a more advantageous employment for their wealth in foreign nations," feelings that, he added, "I should be sorry to see weakened." Their predictions aside, the instincts of the classical economists were sound.
The Iranian and Chinese “Threats”
The democracy uprising in the Arab world is sometimes compared to Eastern Europe in 1989, but on dubious grounds. In 1989, the democracy uprising was tolerated by the Russians, and supported by western power in accord with standard doctrine: it plainly conformed to economic and strategic objectives, and was therefore a noble achievement, greatly honored, unlike the struggles at the same time "to defend the people's fundamental human rights" in Central America, in the words of the assassinated Archbishop of El Salvador, one of the hundreds of thousands of victims of the military forces armed and trained by Washington. There was no Gorbachev in the West throughout these horrendous years, and there is none today. And Western power remains hostile to democracy in the Arab world for good reasons.
Grand Area doctrines continue to apply to contemporary crises and confrontations. In Western policy-making circles and political commentary, the Iranian threat is considered to pose the greatest danger to world order and hence must be the primary focus of U.S. foreign policy, with Europe trailing along politely.
What exactly is the Iranian threat? An authoritative answer is provided by the Pentagon and U.S. intelligence. Reporting on global security last year, they make it clear that the threat is not military. Iran's military spending is "relatively low compared to the rest of the region," they conclude. Its military doctrine is strictly "defensive, designed to slow an invasion and force a diplomatic solution to hostilities." Iran has only "a limited capability to project force beyond its borders." With regard to the nuclear option, "Iran's nuclear program and its willingness to keep open the possibility of developing nuclear weapons is a central part of its deterrent strategy." All quotes.
The brutal clerical regime is doubtless a threat to its own people, though it hardly outranks U.S. allies in that regard. But the threat lies elsewhere, and is ominous indeed. One element is Iran's potential deterrent capacity, an illegitimate exercise of sovereignty that might interfere with U.S. freedom of action in the region. It is glaringly obvious why Iran would seek a deterrent capacity; a look at the military bases and nuclear forces in the region suffices to explain.
Seven years ago, Israeli military historian Martin van Creveld wrote that "The world has witnessed how the United States attacked Iraq for, as it turned out, no reason at all. Had the Iranians not tried to build nuclear weapons, they would be crazy," particularly when they are under constant threat of attack in violation of the UN Charter. Whether they are doing so remains an open question, but perhaps so.
But Iran's threat goes beyond deterrence. It is also seeking to expand its influence in neighboring countries, the Pentagon and U.S. intelligence emphasize, and in this way to "destabilize" the region (in the technical terms of foreign policy discourse). The U.S. invasion and military occupation of Iran's neighbors is "stabilization." Iran's efforts to extend its influence to them are "destabilization," hence plainly illegitimate.
Such usage is routine. Thus the prominent foreign policy analyst James Chace was properly using the term "stability" in its technical sense when he explained that in order to achieve "stability" in Chile it was necessary to "destabilize" the country (by overthrowing the elected government of Salvador Allende and installing the dictatorship of General Augusto Pinochet). Other concerns about Iran are equally interesting to explore, but perhaps this is enough to reveal the guiding principles and their status in imperial culture. As Franklin Delano Roosevelt’s planners emphasized at the dawn of the contemporary world system, the U.S. cannot tolerate "any exercise of sovereignty" that interferes with its global designs.
The U.S. and Europe are united in punishing Iran for its threat to stability, but it is useful to recall how isolated they are. The nonaligned countries have vigorously supported Iran's right to enrich uranium. In the region, Arab public opinion even strongly favors Iranian nuclear weapons. The major regional power, Turkey, voted against the latest U.S.-initiated sanctions motion in the Security Council, along with Brazil, the most admired country of the South. Their disobedience led to sharp censure, not for the first time: Turkey had been bitterly condemned in 2003 when the government followed the will of 95% of the population and refused to participate in the invasion of Iraq, thus demonstrating its weak grasp of democracy, western-style.
After its Security Council misdeed last year, Turkey was warned by Obama's top diplomat on European affairs, Philip Gordon, that it must "demonstrate its commitment to partnership with the West." A scholar with the Council on Foreign Relations asked, "How do we keep the Turks in their lane?" — following orders like good democrats. Brazil's Lula was admonished in a New York Times headline that his effort with Turkey to provide a solution to the uranium enrichment issue outside of the framework of U.S. power was a "Spot on Brazilian Leader's Legacy." In brief, do what we say, or else.
An interesting sidelight, effectively suppressed, is that the Iran-Turkey-Brazil deal was approved in advance by Obama, presumably on the assumption that it would fail, providing an ideological weapon against Iran. When it succeeded, the approval turned to censure, and Washington rammed through a Security Council resolution so weak that China readily signed — and is now chastised for living up to the letter of the resolution but not Washington's unilateral directives — in the current issue of Foreign Affairs, for example.
While the U.S. can tolerate Turkish disobedience, though with dismay, China is harder to ignore. The press warns that "China's investors and traders are now filling a vacuum in Iran as businesses from many other nations, especially in Europe, pull out," and in particular, is expanding its dominant role in Iran's energy industries. Washington is reacting with a touch of desperation. The State Department warned China that if it wants to be accepted in the international community — a technical term referring to the U.S. and whoever happens to agree with it — then it must not "skirt and evade international responsibilities, [which] are clear": namely, follow U.S. orders. China is unlikely to be impressed.
There is also much concern about the growing Chinese military threat. A recent Pentagon study warned that China's military budget is approaching "one-fifth of what the Pentagon spent to operate and carry out the wars in Iraq and Afghanistan," a fraction of the U.S. military budget, of course. China's expansion of military forces might "deny the ability of American warships to operate in international waters off its coast," the New York Times added.
Off the coast of China, that is; it has yet to be proposed that the U.S. should eliminate military forces that deny the Caribbean to Chinese warships. China's lack of understanding of rules of international civility is illustrated further by its objections to plans for the advanced nuclear-powered aircraft carrier George Washington to join naval exercises a few miles off China's coast, with alleged capacity to strike Beijing.
In contrast, the West understands that such U.S. operations are all undertaken to defend stability and its own security. The liberal New Republic expresses its concern that "China sent ten warships through international waters just off the Japanese island of Okinawa." That is indeed a provocation — unlike the fact, unmentioned, that Washington has converted the island into a major military base in defiance of vehement protests by the people of Okinawa. That is not a provocation, on the standard principle that we own the world.
Deep-seated imperial doctrine aside, there is good reason for China's neighbors to be concerned about its growing military and commercial power. And though Arab opinion supports an Iranian nuclear weapons program, we certainly should not do so. The foreign policy literature is full of proposals as to how to counter the threat. One obvious way is rarely discussed: work to establish a nuclear-weapons-free zone (NWFZ) in the region. The issue arose (again) at the Non-Proliferation Treaty (NPT) conference at United Nations headquarters last May. Egypt, as chair of the 118 nations of the Non-Aligned Movement, called for negotiations on a Middle East NWFZ, as had been agreed by the West, including the U.S., at the 1995 review conference on the NPT.
International support is so overwhelming that Obama formally agreed. It is a fine idea, Washington informed the conference, but not now. Furthermore, the U.S. made clear that Israel must be exempted: no proposal can call for Israel's nuclear program to be placed under the auspices of the International Atomic Energy Agency or for the release of information about "Israeli nuclear facilities and activities." So much for this method of dealing with the Iranian nuclear threat.
Privatizing the Planet
While Grand Area doctrine still prevails, the capacity to implement it has declined. The peak of U.S. power was after World War II, when it had literally half the world's wealth. But that naturally declined, as other industrial economies recovered from the devastation of the war and decolonization took its agonizing course. By the early 1970s, the U.S. share of global wealth had declined to about 25%, and the industrial world had become tripolar: North America, Europe, and East Asia (then Japan-based).
There was also a sharp change in the U.S. economy in the 1970s, towards financialization and export of production. A variety of factors converged to create a vicious cycle of radical concentration of wealth, primarily in the top fraction of 1% of the population — mostly CEOs, hedge-fund managers, and the like. That leads to the concentration of political power, hence state policies to increase economic concentration: fiscal policies, rules of corporate governance, deregulation, and much more. Meanwhile the costs of electoral campaigns skyrocketed, driving the parties into the pockets of concentrated capital, increasingly financial: the Republicans reflexively, the Democrats — by now what used to be moderate Republicans — not far behind.
Elections have become a charade, run by the public relations industry. After his 2008 victory, Obama won an award from the industry for the best marketing campaign of the year. Executives were euphoric. In the business press they explained that they had been marketing candidates like other commodities since Ronald Reagan, but 2008 was their greatest achievement and would change the style in corporate boardrooms. The 2012 election is expected to cost $2 billion, mostly in corporate funding. Small wonder that Obama is selecting business leaders for top positions. The public is angry and frustrated, but as long as the Muasher principle prevails, that doesn't matter.
While wealth and power have narrowly concentrated, for most of the population real incomes have stagnated and people have been getting by with increased work hours, debt, and asset inflation, regularly destroyed by the financial crises that began as the regulatory apparatus was dismantled starting in the 1980s.
None of this is problematic for the very wealthy, who benefit from a government insurance policy called "too big to fail." The banks and investment firms can make risky transactions, with rich rewards, and when the system inevitably crashes, they can run to the nanny state for a taxpayer bailout, clutching their copies of Friedrich Hayek and Milton Friedman.
That has been the regular process since the Reagan years, each crisis more extreme than the last — for the public population, that is. Right now, real unemployment is at Depression levels for much of the population, while Goldman Sachs, one of the main architects of the current crisis, is richer than ever. It has just quietly announced $17.5 billion in compensation for last year, with CEO Lloyd Blankfein receiving a $12.6 million bonus and his base salary more than tripling.
It wouldn't do to focus attention on such facts as these. Accordingly, propaganda must seek to blame others; in the past few months, the targets have been public sector workers, with their supposedly fat salaries, exorbitant pensions, and so on: all fantasy, on the model of Reaganite imagery of black mothers being driven in their limousines to pick up welfare checks, and other models that need not be mentioned. We all must tighten our belts; almost all, that is.
Teachers are a particularly good target, as part of the deliberate effort to destroy the public education system, from kindergarten through the universities, by privatization: good for the wealthy, but a disaster for the population and for the long-term health of the economy. That, however, is one of the externalities that is put to the side insofar as market principles prevail.
Another fine target, always, is immigrants. That has been true throughout U.S. history, even more so at times of economic crisis, exacerbated now by a sense that our country is being taken away from us: the white population will soon become a minority. One can understand the anger of aggrieved individuals, but the cruelty of the policy is shocking.
Who are the immigrants targeted? In Eastern Massachusetts, where I live, many are Mayans fleeing genocide in the Guatemalan highlands carried out by Reagan's favorite killers. Others are Mexican victims of Clinton's NAFTA, one of those rare government agreements that managed to harm working people in all three of the participating countries. As NAFTA was rammed through Congress over popular objection in 1994, Clinton also initiated the militarization of the U.S.-Mexican border, previously fairly open. It was understood that Mexican campesinos cannot compete with highly subsidized U.S. agribusiness, and that Mexican businesses would not survive competition with U.S. multinationals, which must be granted "national treatment" under the mislabeled free trade agreements, a privilege granted only to corporate persons, not those of flesh and blood. Not surprisingly, these measures led to a flood of desperate refugees, and to rising anti-immigrant hysteria by the victims of state-corporate policies at home.
Much the same appears to be happening in Europe, where racism is probably more rampant than in the U.S. One can only watch with wonder as Italy complains about the flow of refugees from Libya, the scene of the first post-World War I genocide, in the now-liberated East, at the hands of Italy's Fascist government. Or when France, still today the main protector of the brutal dictatorships in its former colonies, manages to overlook its hideous atrocities in Africa, while French President Nicolas Sarkozy warns grimly of the "flood of immigrants" and Marine Le Pen objects that he is doing nothing to prevent it. I need not mention Belgium, which may win the prize for what Adam Smith called "the savage injustice of the Europeans."
The rise of neo-fascist parties in much of Europe would be a frightening phenomenon even if we were not to recall what happened on the continent in the recent past. Just imagine the reaction if Jews were being expelled from France to misery and oppression, and then witness the non-reaction when that is happening to Roma, also victims of the Holocaust and Europe's most brutalized population.
In Hungary, the neo-fascist party Jobbik gained 17% of the vote in national elections, perhaps unsurprising when three-quarters of the population feel they are worse off than under Communist rule. We might be relieved that in Austria the ultra-right Jörg Haider won only 10% of the vote in 2008 — were it not for the fact that the new Freedom Party, outflanking him from the far right, won more than 17%. It is chilling to recall that, in 1928, the Nazis won less than 3% of the vote in Germany.
In England the British National Party and the English Defence League, on the ultra-racist right, are major forces. (What is happening in Holland you know all too well.) In Germany, Thilo Sarrazin's lament that immigrants are destroying the country was a runaway best-seller, while Chancellor Angela Merkel, though condemning the book, declared that multiculturalism had "utterly failed": the Turks imported to do the dirty work in Germany are failing to become blond and blue-eyed, true Aryans.
Those with a sense of irony may recall that Benjamin Franklin, one of the leading figures of the Enlightenment, warned that the newly liberated colonies should be wary of allowing Germans to immigrate, because they were too swarthy; Swedes as well. Into the twentieth century, ludicrous myths of Anglo-Saxon purity were common in the U.S., including among presidents and other leading figures. Racism in the literary culture has been a rank obscenity; far worse in practice, needless to say. It is much easier to eradicate polio than this horrifying plague, which regularly becomes more virulent in times of economic distress.
I do not want to end without mentioning another externality that is dismissed in market systems: the fate of the species. Systemic risk in the financial system can be remedied by the taxpayer, but no one will come to the rescue if the environment is destroyed. That it must be destroyed is close to an institutional imperative. Business leaders who are conducting propaganda campaigns to convince the population that anthropogenic global warming is a liberal hoax understand full well how grave is the threat, but they must maximize short-term profit and market share. If they don't, someone else will.
This vicious cycle could well turn out to be lethal. To see how grave the danger is, simply have a look at the new Congress in the U.S., propelled into power by business funding and propaganda. Almost all are climate deniers. They have already begun to cut funding for measures that might mitigate environmental catastrophe. Worse, some are true believers; for example, the new head of a subcommittee on the environment who explained that global warming cannot be a problem because God promised Noah that there will not be another flood.
If such things were happening in some small and remote country, we might laugh. Not when they are happening in the richest and most powerful country in the world. And before we laugh, we might also bear in mind that the current economic crisis is traceable in no small measure to the fanatic faith in such dogmas as the efficient market hypothesis, and in general to what Nobel laureate Joseph Stiglitz, 15 years ago, called the "religion" that markets know best — which prevented the central bank and the economics profession from taking notice of an $8 trillion housing bubble that had no basis at all in economic fundamentals, and that devastated the economy when it burst.
All of this, and much more, can proceed as long as the Muashar doctrine prevails. As long as the general population is passive, apathetic, diverted to consumerism or hatred of the vulnerable, then the powerful can do as they please, and those who survive will be left to contemplate the outcome.
Noam Chomsky is Institute Professor emeritus in the MIT Department of Linguistics and Philosophy. He is the author of numerous best-selling political works including Imperial Ambitions, Hegemony or Survival, Failed States and What We Say Goes.
Copyright 2011 Noam Chomsky
Is the World Too Big to Fail?
The fact that the Israel-Palestine conflict grinds on without resolution might appear to be rather strange. For many of the world’s conflicts, it is difficult even to conjure up a feasible settlement. In this case, it is not only possible, but there is near universal agreement on its basic contours: a two-state settlement along the internationally recognized (pre-June 1967) borders — with “minor and mutual modifications,” to adopt official U.S. terminology before Washington departed from the international community in the mid-1970s.
The basic principles have been accepted by virtually the entire world, including the Arab states (who go on to call for full normalization of relations), the Organization of Islamic States (including Iran), and relevant non-state actors (including Hamas). A settlement along these lines was first proposed at the U.N. Security Council in January 1976 by the major Arab states. Israel refused to attend the session. The U.S. vetoed the resolution, and did so again in 1980. The record at the General Assembly since is similar.
There was one important and revealing break in U.S.-Israeli rejectionism. After the failed Camp David agreements in 2000, President Clinton recognized that the terms he and Israel had proposed were unacceptable to any Palestinians. That December, he proposed his “parameters”: imprecise, but more forthcoming. He then stated that both sides had accepted the parameters, while expressing reservations.
Israeli and Palestinian negotiators met in Taba, Egypt, in January 2001 to resolve the differences and were making considerable progress. In their final press conference, they reported that, with a little more time, they could probably have reached full agreement. Israel called off the negotiations prematurely, however, and official progress then terminated, though informal discussions at a high level continued, leading to the Geneva Accord, rejected by Israel and ignored by the U.S.
A good deal has happened since, but a settlement along those lines is still not out of reach — if, of course, Washington is once again willing to accept it. Unfortunately, there is little sign of that.
Substantial mythology has been created about the entire record, but the basic facts are clear enough and quite well documented.
The U.S. and Israel have been acting in tandem to extend and deepen the occupation. In 2005, recognizing that it was pointless to subsidize a few thousand Israeli settlers in Gaza, who were appropriating substantial resources and protected by a large part of the Israeli army, the government of Ariel Sharon decided to move them to the much more valuable West Bank and Golan Heights.
Instead of carrying out the operation straightforwardly, as would have been easy enough, the government decided to stage a “national trauma,” which virtually duplicated the farce accompanying the withdrawal from the Sinai desert after the Camp David agreements of 1978-79. In each case, the withdrawal permitted the cry of “Never Again,” which meant in practice: we cannot abandon an inch of the Palestinian territories that we want to take in violation of international law. This farce played very well in the West, though it was ridiculed by more astute Israeli commentators, among them that country’s prominent sociologist, the late Baruch Kimmerling.
After its formal withdrawal from the Gaza Strip, Israel never actually relinquished its total control over the territory, often described realistically as “the world’s largest prison.” In January 2006, a few months after the withdrawal, Palestine had an election that was recognized as free and fair by international observers. Palestinians, however, voted “the wrong way,” electing Hamas. Instantly, the U.S. and Israel intensified their assault against Gazans as punishment for this misdeed. The facts and the reasoning were not concealed; rather, they were openly published alongside reverential commentary on Washington’s sincere dedication to democracy. The U.S.-backed Israeli assault against the Gazans has only intensified since, through violence and economic strangulation, increasingly savage.
Meanwhile in the West Bank, always with firm U.S. backing, Israel has been carrying forward longstanding programs to take the valuable land and resources of the Palestinians and leave them in unviable cantons, mostly out of sight. Israeli commentators frankly refer to these goals as “neocolonial.” Ariel Sharon, the main architect of the settlement programs, called these cantons “Bantustans,” though the term is misleading: South Africa needed the majority black work force, while Israel would be happy if the Palestinians disappeared, and its policies are directed to that end.
Blockading Gaza by Land and Sea
One step towards cantonization and the undermining of hopes for Palestinian national survival is the separation of Gaza from the West Bank. These hopes have been almost entirely consigned to oblivion, an atrocity to which we should not contribute by tacit consent. Israeli journalist Amira Hass, one of the leading specialists on Gaza, writes that
“the restrictions on Palestinian movement that Israel introduced in January 1991 reversed a process that had been initiated in June 1967. Back then, and for the first time since 1948, a large portion of the Palestinian people again lived in the open territory of a single country — to be sure, one that was occupied, but was nevertheless whole.… The total separation of the Gaza Strip from the West Bank is one of the greatest achievements of Israeli politics, whose overarching objective is to prevent a solution based on international decisions and understandings and instead dictate an arrangement based on Israel’s military superiority.…
“Since January 1991, Israel has bureaucratically and logistically merely perfected the split and the separation: not only between Palestinians in the occupied territories and their brothers in Israel, but also between the Palestinian residents of Jerusalem and those in the rest of the territories and between Gazans and West Bankers/Jerusalemites. Jews live in this same piece of land within a superior and separate system of privileges, laws, services, physical infrastructure and freedom of movement.”
The leading academic specialist on Gaza, Harvard scholar Sara Roy, adds:
“Gaza is an example of a society that has been deliberately reduced to a state of abject destitution, its once productive population transformed into one of aid-dependent paupers.… Gaza’s subjection began long before Israel’s recent war against it [December 2008]. The Israeli occupation — now largely forgotten or denied by the international community — has devastated Gaza’s economy and people, especially since 2006…. After Israel’s December assault, Gaza’s already compromised conditions have become virtually unlivable. Livelihoods, homes, and public infrastructure have been damaged or destroyed on a scale that even the Israel Defense Forces admitted was indefensible.
“In Gaza today, there is no private sector to speak of and no industry. 80 percent of Gaza’s agricultural crops were destroyed and Israel continues to snipe at farmers attempting to plant and tend fields near the well-fenced and patrolled border. Most productive activity has been extinguished.… Today, 96 percent of Gaza’s population of 1.4 million is dependent on humanitarian aid for basic needs. According to the World Food Programme, the Gaza Strip requires a minimum of 400 trucks of food every day just to meet the basic nutritional needs of the population. Yet, despite a March [22, 2009] decision by the Israeli cabinet to lift all restrictions on foodstuffs entering Gaza, only 653 trucks of food and other supplies were allowed entry during the week of May 10, at best meeting 23 percent of required need. Israel now allows only 30 to 40 commercial items to enter Gaza compared to 4,000 approved products prior to June 2006.”
It cannot be too often stressed that Israel had no credible pretext for its 2008–9 attack on Gaza, with full U.S. support and illegally using U.S. weapons. Near-universal opinion asserts the contrary, claiming that Israel was acting in self-defense. That is utterly unsustainable, in light of Israel’s flat rejection of peaceful means that were readily available, as Israel and its U.S. partner in crime knew very well. That aside, Israel’s siege of Gaza is itself an act of war, as Israel of all countries certainly recognizes, having repeatedly justified launching major wars on grounds of partial restrictions on its access to the outside world, though nothing remotely like what it has long imposed on Gaza.
One crucial element of Israel’s criminal siege, little reported, is the naval blockade. Peter Beaumont reports from Gaza that, “on its coastal littoral, Gaza’s limitations are marked by a different fence where the bars are Israeli gunboats with their huge wakes, scurrying beyond the Palestinian fishing boats and preventing them from going outside a zone imposed by the warships.” According to reports from the scene, the naval siege has been tightened steadily since 2000. Fishing boats have been driven steadily out of Gaza’s territorial waters and toward the shore by Israeli gunboats, often violently and without warning, with many casualties. As a result of these naval actions, Gaza’s fishing industry has virtually collapsed; fishing is impossible near shore because of the contamination caused by Israel’s regular attacks, including the destruction of power plants and sewage facilities.
These Israeli naval attacks began shortly after the discovery by the BG (British Gas) Group of what appear to be quite sizeable natural gas fields in Gaza’s territorial waters. Industry journals report that Israel is already appropriating these Gazan resources for its own use, part of its commitment to shift its economy to natural gas. The standard industry source reports:
“Israel’s finance ministry has given the Israel Electric Corp. (IEC) approval to purchase larger quantities of natural gas from BG than originally agreed upon, according to Israeli government sources [which] said the state-owned utility would be able to negotiate for as much as 1.5 billion cubic meters of natural gas from the Marine field located off the Mediterranean coast of the Palestinian controlled Gaza Strip.
“Last year the Israeli government approved the purchase of 800 million cubic meters of gas from the field by the IEC…. Recently the Israeli government changed its policy and decided the state-owned utility could buy the entire quantity of gas from the Gaza Marine field. Previously the government had said the IEC could buy half the total amount and the remainder would be bought by private power producers.”
The pillage of what could become a major source of income for Gaza is surely known to U.S. authorities. It is only reasonable to suppose that the intention to appropriate these limited resources, either by Israel alone or together with the collaborationist Palestinian Authority, is the motive for preventing Gazan fishing boats from entering Gaza’s territorial waters.
There are some instructive precedents. In 1989, Australian foreign minister Gareth Evans signed a treaty with his Indonesian counterpart Ali Alatas granting Australia rights to the substantial oil reserves in “the Indonesian Province of East Timor.” The Indonesia-Australia Timor Gap Treaty, which offered not a crumb to the people whose oil was being stolen, “is the only legal agreement anywhere in the world that effectively recognises Indonesia’s right to rule East Timor,” the Australian press reported.
Asked about his willingness to recognize the Indonesian conquest and to rob the sole resource of the conquered territory, which had been subjected to near-genocidal slaughter by the Indonesian invader with the strong support of Australia (along with the U.S., the U.K., and some others), Evans explained that “there is no binding legal obligation not to recognise the acquisition of territory that was acquired by force,” adding that “the world is a pretty unfair place, littered with examples of acquisition by force.”
It should, then, be unproblematic for Israel to follow suit in Gaza.
A few years later, Evans became the leading figure in the campaign to introduce the concept “responsibility to protect” — known as R2P — into international law. R2P is intended to establish an international obligation to protect populations from grave crimes. Evans is the author of a major book on the subject and was co-chair of the International Commission on Intervention and State Sovereignty, which issued what is considered the basic document on R2P.
In an article devoted to this “idealistic effort to establish a new humanitarian principle,” the London Economist featured Evans and his “bold but passionate claim on behalf of a three-word expression which (in quite large part thanks to his efforts) now belongs to the language of diplomacy: the ‘responsibility to protect.’” The article is accompanied by a picture of Evans with the caption “Evans: a lifelong passion to protect.” His hand is pressed to his forehead in despair over the difficulties faced by his idealistic effort. The journal chose not to run a different photo that circulates in Australia, depicting Evans and Alatas exuberantly clasping their hands together as they toast the Timor Gap Treaty that they had just signed.
Though a “protected population” under international law, Gazans do not fall under the jurisdiction of the “responsibility to protect,” joining other unfortunates, in accord with the maxim of Thucydides — that the strong do as they wish, and the weak suffer as they must — which holds with its customary precision.
Obama and the Settlements
The kinds of restrictions on movement used to destroy Gaza have long been in force in the West Bank as well, less cruelly but with grim effects on life and the economy. The World Bank reports that Israel has established “a complex closure regime that restricts Palestinian access to large areas of the West Bank… The Palestinian economy has remained stagnant, largely because of the sharp downturn in Gaza and Israel’s continued restrictions on Palestinian trade and movement in the West Bank.”
The World Bank “cited Israeli roadblocks and checkpoints hindering trade and travel, as well as restrictions on Palestinian building in the West Bank, where the Western-backed government of Palestinian president Mahmoud Abbas holds sway.” Israel does permit — indeed encourage — a privileged existence for elites in Ramallah and sometimes elsewhere, largely relying on European funding, a traditional feature of colonial and neocolonial practice.
All of this constitutes what Israeli activist Jeff Halper calls a “matrix of control” to subdue the colonized population. These systematic programs over more than 40 years aim to implement Defense Minister Moshe Dayan’s recommendation to his colleagues shortly after Israel’s 1967 conquests that we must tell the Palestinians in the territories: “We have no solution, you shall continue to live like dogs, and whoever wishes may leave, and we will see where this process leads.”
Turning to the second bone of contention, settlements, there is indeed a confrontation, but it is rather less dramatic than portrayed. Washington’s position was presented most strongly in Secretary of State Hillary Clinton’s much-quoted statement rejecting “natural growth exceptions” to the policy opposing new settlements. Prime Minister Benjamin Netanyahu, along with President Shimon Peres and, in fact, virtually the whole Israeli political spectrum, insists on permitting “natural growth” within the areas that Israel intends to annex, complaining that the United States is backing down on George W. Bush’s authorization of such expansion within his “vision” of a Palestinian state.
Senior Netanyahu cabinet members have gone further. Transportation Minister Yisrael Katz announced that “the current Israeli government will not accept in any way the freezing of legal settlement activity in Judea and Samaria.” The term “legal” in U.S.-Israeli parlance means “illegal, but authorized by the government of Israel with a wink from Washington.” In this usage, unauthorized outposts are termed “illegal,” though apart from the dictates of the powerful, they are no more illegal than the settlements granted to Israel under Bush’s “vision” and Obama’s scrupulous omission.
The Obama-Clinton “hardball” formulation is not new. It repeats the wording of the Bush administration draft of the 2003 Road Map, which stipulates that in Phase I, “Israel freezes all settlement activity (including natural growth of settlements).” All sides formally accept the Road Map (modified to drop the phrase “natural growth”) — consistently overlooking the fact that Israel, with U.S. support, at once added 14 “reservations” that render it inoperable.
If Obama were at all serious about opposing settlement expansion, he could easily proceed with concrete measures by, for example, reducing U.S. aid by the amount devoted to this purpose. That would hardly be a radical or courageous move. The Bush I administration did so (reducing loan guarantees), but after the Oslo accord in 1993, President Clinton left calculations to the government of Israel. Unsurprisingly, there was “no change in the expenditures flowing to the settlements,” the Israeli press reported. “[Prime Minister] Rabin will continue not to dry out the settlements,” the report concludes. “And the Americans? They will understand.”
Obama administration officials informed the press that the Bush I measures are “not under discussion,” and that pressures will be “largely symbolic.” In short, Obama understands, just as Clinton and Bush II did.
At best, settlement expansion is a side issue, rather like the issue of “illegal outposts” — namely those that the government of Israel has not authorized. Concentration on these issues diverts attention from the fact that there are no “legal outposts” and that it is the existing settlements that are the primary problem to be faced.
The U.S. press reports that “a partial freeze has been in place for several years, but settlers have found ways around the strictures… [C]onstruction in the settlements has slowed but never stopped, continuing at an annual rate of about 1,500 to 2,000 units over the past three years. If building continues at the 2008 rate, the 46,500 units already approved will be completed in about 20 years.… If Israel built all the housing units already approved in the nation’s overall master plan for settlements, it would almost double the number of settler homes in the West Bank.” Peace Now, which monitors settlement activities, estimates further that the two largest settlements would double in size: Ariel and Ma’aleh Adumim, built mainly during the Oslo years in the salients that subdivide the West Bank into cantons.
“Natural population growth” is largely a myth, Israel’s leading diplomatic correspondent, Akiva Eldar, points out, citing demographic studies by Colonel (res.) Shaul Arieli, deputy military secretary to former prime minister and incumbent defense minister Ehud Barak. Settlement growth consists largely of Israeli immigrants in violation of the Geneva Conventions, assisted with generous subsidies. Much of it is in direct violation of formal government decisions, but carried out with the authorization of the government, specifically Barak, considered a dove in the Israeli spectrum.
Correspondent Jackson Diehl derides the “long-dormant Palestinian fantasy,” revived by President Abbas, “that the United States will simply force Israel to make critical concessions, whether or not its democratic government agrees.” He does not explain why refusal to participate in Israel’s illegal expansion — which, if serious, would “force Israel to make critical concessions” — would be improper interference in Israel’s democracy.
Returning to reality, all of these discussions about settlement expansion evade the most crucial issue about settlements: what the United States and Israel have already established in the West Bank. The evasion tacitly concedes that the illegal settlement programs already in place are somehow acceptable (putting aside the Golan Heights, annexed in violation of Security Council orders) — though the Bush “vision,” apparently accepted by Obama, moves from tacit to explicit support for these violations of law. What is in place already suffices to ensure that there can be no viable Palestinian self-determination. Hence, there is every indication that even on the unlikely assumption that “natural growth” will be ended, U.S.-Israeli rejectionism will persist, blocking the international consensus as before.
Subsequently, Prime Minister Netanyahu declared a 10-month suspension of new construction, with many exemptions, and entirely excluding Greater Jerusalem, where expropriation in Arab areas and construction for Jewish settlers continues at a rapid pace. Hillary Clinton praised these “unprecedented” concessions on (illegal) construction, eliciting anger and ridicule in much of the world.
It might be different if a legitimate “land swap” were under consideration, a solution approached at Taba and spelled out more fully in the Geneva Accord reached in informal high-level Israel-Palestine negotiations. The accord was presented in Geneva in October 2003, welcomed by much of the world, rejected by Israel, and ignored by the United States.
Barack Obama’s June 4, 2009, Cairo address to the Muslim world kept pretty much to his well-honed “blank slate” style — with little of substance, but presented in a personable manner that allows listeners to write on the slate what they want to hear. CNN captured its spirit in headlining a report “Obama Looks to Reach the Soul of the Muslim World.” Obama had announced the goals of his address in an interview with New York Times columnist Thomas Friedman. “‘We have a joke around the White House,’ the president said. ‘We’re just going to keep on telling the truth until it stops working, and nowhere is truth-telling more important than the Middle East.’” The White House commitment is most welcome, but it is useful to see how it translates into practice.
Obama admonished his audience that it is easy to “point fingers… but if we see this conflict only from one side or the other, then we will be blind to the truth: the only resolution is for the aspirations of both sides to be met through two states, where Israelis and Palestinians each live in peace and security.”
Turning from Obama-Friedman Truth to truth, there is a third side, with a decisive role throughout: the United States. But that participant in the conflict Obama omitted. The omission is understood to be normal and appropriate, hence unmentioned: Friedman’s column is headlined “Obama Speech Aimed at Both Arabs and Israelis.” The front-page Wall Street Journal report on Obama’s speech appears under the heading “Obama Chides Israel, Arabs in His Overture to Muslims.” Other reports are the same.
The convention is understandable on the doctrinal principle that though the U.S. government sometimes makes mistakes, its intentions are by definition benign, even noble. In the world of attractive imagery, Washington has always sought desperately to be an honest broker, yearning to advance peace and justice. The doctrine trumps truth, of which there is little hint in the speech or the mainstream coverage of it.
Obama once again echoed Bush’s “vision” of two states, without saying what he meant by the phrase “Palestinian state.” His intentions were clarified not only by the crucial omissions already discussed, but also by his one explicit criticism of Israel: “The United States does not accept the legitimacy of continued Israeli settlements. This construction violates previous agreements and undermines efforts to achieve peace. It is time for these settlements to stop.” That is, Israel should live up to Phase I of the 2003 Road Map, rejected at once by Israel with tacit U.S. support, as noted — though the truth is that Obama has ruled out even steps of the Bush I variety to withdraw from participation in these crimes.
The operative words are “legitimacy” and “continued.” By omission, Obama indicates that he accepts Bush’s vision: the vast existing settlement and infrastructure projects are “legitimate,” thus ensuring that the phrase “Palestinian state” means “fried chicken.”
Always even-handed, Obama also had an admonition for the Arab states: they “must recognize that the Arab Peace Initiative was an important beginning, but not the end of their responsibilities.” Plainly, however, it cannot be a meaningful “beginning” if Obama continues to reject its core principles: implementation of the international consensus. To do so, however, is evidently not Washington’s “responsibility” in Obama’s vision; no explanation given, no notice taken.
On democracy, Obama said that “we would not presume to pick the outcome of a peaceful election” — as in January 2006, when Washington picked the outcome with a vengeance, turning at once to severe punishment of the Palestinians because it did not like the outcome of a peaceful election, all with Obama’s apparent approval judging by his words before, and actions since, taking office.
Obama politely refrained from comment about his host, President Mubarak, one of the most brutal dictators in the region, though he has had some illuminating words about him. As he was about to board a plane to Saudi Arabia and Egypt, the two “moderate” Arab states, “Mr. Obama signaled that while he would mention American concerns about human rights in Egypt, he would not challenge Mr. Mubarak too sharply, because he is a ‘force for stability and good’ in the Middle East… Mr. Obama said he did not regard Mr. Mubarak as an authoritarian leader. ‘No, I tend not to use labels for folks,’ Mr. Obama said. The president noted that there had been criticism ‘of the manner in which politics operates in Egypt,’ but he also said that Mr. Mubarak had been ‘a stalwart ally, in many respects, to the United States.’”
When a politician uses the word “folks,” we should brace ourselves for the deceit, or worse, that is coming. Outside of this context, there are “people,” or often “villains,” and using labels for them is highly meritorious. Obama is right, however, not to have used the word “authoritarian,” which is far too mild a label for his friend.
Just as in the past, support for democracy, and for human rights as well, keeps to the pattern that scholarship has repeatedly discovered, correlating closely with strategic and economic objectives. There should be little difficulty in understanding why those whose eyes are not closed tight shut by rigid doctrine dismiss Obama’s yearning for human rights and democracy as a joke in bad taste.
Noam Chomsky is Institute Professor emeritus in the Department of Linguistics and Philosophy at the Massachusetts Institute of Technology. He is the author of numerous books, including the New York Times bestsellers Hegemony or Survival and Failed States. His newest book, Hopes and Prospects, is out this week from Haymarket Books.
[Note: All material in this piece is sourced and footnoted in Noam Chomsky’s new book Hopes and Prospects.]
Copyright 2010 Noam Chomsky