More stories

  • Guns and the Wrong Side of Rights

    The land that continues to pray for the well-being and prosperity of its Second Amendment has, according to Education Week, seen “30 school shootings this year, 22 since August 1.” The most spectacular multiple shooting occurred on November 30, when 15-year-old Ethan Crumbley used the “Christmas present” his parents had purchased four days earlier to kill four students at random and wound seven others at his high school in Oxford, Michigan.

    With the possible exception of his own parents, even before the shooting everyone agreed with Judge Jeanine Pirro of Fox News that Crumbley was a “troubled kid.” Pirro is one of those judges who doesn’t need to hear the evidence before identifying the true culprit: “liberals.” In that, she stands in the noble company of other purveyors of accusatory news, such as The New York Times, when it consistently suspects Russia of the imaginary Havana syndrome attacks.

    Though the horror of the massacre was enough to make it eminently newsworthy, this story offered a new dimension when Oakland County Prosecutor Karen McDonald made the decision to charge the suspect’s parents with involuntary manslaughter. Considering them accomplices in the crime, she explained her reasoning in the following terms: “Gun ownership is a right, and with that right comes great responsibility.”

    Today’s Daily Devil’s Dictionary definition:

    Right:

    A fundamental concept built into the culture of consumerist individualism that confuses the state’s acknowledged tolerance of different types of behavior with the idea that individuals possess the absolute and unencumbered power to harness that tolerance for consciously antisocial purposes

    Contextual note

    In US culture, the notion of “rights” is less a philosophical or legal concept than an object of a certain secular faith tantamount to religious dogma. The first 10 amendments of the US Constitution are called the “Bill of Rights.” Because many Americans view the Constitution as something similar to divine scripture, the fundamental rights it defines, instead of being treated as principles that help define the inevitably flexible relationship between established authority, society as a collective entity and citizens as individuals, have been elevated to the status of divine commands.

    The First Amendment guaranteeing free speech stands out in most people’s minds as the most sacred of the lot. It defines the very nature of American democracy. Freedom of speech ensures that everyone is empowered to “speak up” and cannot be reduced to silence. But as the current debates about what should be allowed or suppressed on social media demonstrate, only dogmatic libertarians are prepared to define that right as absolute.

    The Third Amendment has been relegated to the status of a museum piece. It reads: “No soldier shall, in time of peace be quartered in any house, without the consent of the owner, nor in time of war, but in a manner to be prescribed by law.” The “right” still stands, but with military practice having evolved in the meantime, the situation it describes no longer exists.

    Several of the first 10 amendments deal with defining due process and expectations with regard to the functioning of the judicial system. The Eighth Amendment, barring “cruel and unusual punishment,” may be the least absolute of the 10, since the US criminal justice system has found multiple innovative ways to apply punishment that only escapes being unusual by the fact that it has become usual.

    The Ninth Amendment provides for the possibility that rights other than those listed in the Bill of Rights may also emerge and be acknowledged. The 10th Amendment states that the federal government has only those powers specifically designated in the Constitution. All other powers belong either to the states or the people. From a historical rather than a legal point of view, it could be argued that the sacred status of the 10th Amendment disappeared after the Civil War. Once it was affirmed that the United States was “one nation, indivisible” rather than a federation of independent states, federal laws not derived from the Constitution have consistently trumped the original powers assumed to belong to the states.

    As a private citizen, McDonald may or may not appreciate how variable the meaning of the rights specified in the first 10 amendments can be. As a public official, she must accept the received majority opinion that “gun ownership” according to the Second Amendment is an absolute right. To attenuate the risk this has created for the lives of ordinary citizens, and increasingly for school children, she invokes the generally accepted moral notion that rights entail responsibilities. But from a strictly legal point of view, this makes little sense unless the nature of those responsibilities is clearly delineated. Americans assume that a right is so fundamental that only a generally accepted rule can qualify it, such as the understanding that freedom of speech does not include shouting “fire” in a crowded theater. It does, however, include crying wolf, even if it is fake news.

    Within the hyper-individualistic culture of the country, Americans have been taught that rights, just like guns, are something the individual can literally own. Indeed, the debate concerning the interpretation of the Second Amendment focuses exclusively on the question of ownership. In many other cultures, rights are perceived not as something the individual possesses, but as areas of tolerance that describe the nature of relationships within the society.

    Historical note

    The understanding and practice of the rights in the Bill of Rights have undergone significant evolution in the way laws, customs and everyday activities reflect the reality — sacred or secular — of those ordained “rights.” No one appears obsessed with defending the rights outlined in the Third or even the Eighth Amendment. As for speech and even the freedom of religion, there has been room for considerable ambiguity in public debate.

    Curiously, the Second Amendment is the one deemed most worthy of solemn respect by those who insist on the sacred character of the Bill of Rights. Logically, we should consider it with the same critical regard we apply to the Third Amendment. The situation that gave it meaning simply no longer exists. Attentive (and honest) readers easily understand that, since the militias it mentions have not persisted historically, the thinking behind it cannot be transposed to modern conditions.

    Because many Americans have been conditioned to think of the very notion of rights as something transcendent, they readily accept the notion that stating something as a right means it must be interpreted literally rather than understood historically. There is a sense in which many Americans believe it would be sacrilegious to call into question a text in the Constitution.

    In the case of the Second Amendment, the right in question concerns ownership rather than the actual use of the weapons in question. Owning a gun does not imply using the gun for any purpose, but it has become increasingly apparent that the use of guns is now a specific social problem linked to the ownership of guns. If one is looking for meaning in the Second Amendment, the key word would be “well-regulated.” Today, the entire issue appears beyond the possibility of regulation.

    Karen McDonald uses the only weapon at her disposal: the moral idea of responsibility. But as a prosecutor, she is certainly aware that the notion of responsibility has no weight in the law. That is why Kyle Rittenhouse earned his acquittal for shooting two men dead and wounding a third on the streets of Kenosha, Wisconsin in 2020. His actions were irresponsible but not illegal.

    The real problem lies in the fact that there is no reasonable answer or antidote to the elevated symbolic status of firearms within US gun culture. A broad consensus attributes strong cultural value to guns as objects, to the belief that guns are legitimate instruments of justice, to the idea that every individual has the “right” to live in their own moral world, and to the conviction that, in a world of threats, an attitude of active self-defense is natural, not exceptional.

    Cultures are partially shaped in schools, but also in families, the marketplace, the neighborhood streets and religious institutions. Schools have increasingly become environments in which gun culture always risks making its presence known. Individuals can learn to be responsible. But how does a society learn it?

    *[In the age of Oscar Wilde and Mark Twain, another American wit, the journalist Ambrose Bierce, produced a series of satirical definitions of commonly used terms, throwing light on their hidden meanings in real discourse. Bierce eventually collected and published them as a book, The Devil’s Dictionary, in 1911. We have shamelessly appropriated his title in the interest of continuing his wholesome pedagogical effort to enlighten generations of readers of the news. Read more of The Daily Devil’s Dictionary on Fair Observer.]

    The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.

  • Biden’s New Culture of Brinkmanship

    Taiwan is a problem. Historically separate from but linked to China, Taiwan was colonized by the Dutch and partially by the Spanish in the 17th century. After a series of conflicts between aboriginal forces allied with the Ming dynasty and European colonial powers, who also fought among themselves, Taiwan was integrated into the Qing Empire by 1683. For two centuries, it evolved to become an increasingly integral part of China. In 1895, due to its strategic position on the eastern coast of China at the entry to the South China Sea, it became one of the spoils of the Sino-Japanese War, and for half a century it was ruled by the Japanese.

    Japan used Taiwan during the Second World War as the launching pad for its aggressive operations in Southeast Asia. At the end of the war, the Japanese were defeated, and by 1949 Mao Zedong’s communists had taken control of mainland China. Mao’s rival, Chiang Kai-shek, the leader of the Kuomintang, fled to Taiwan, putting his dissident government out of Mao’s reach. Chiang declared his government the Republic of China (ROC) in opposition to Mao’s People’s Republic of China (PRC). For forty years, a single-party regime ruled Taiwan following Chiang Kai-shek’s initial declaration of martial law in 1949.

    Because the United States had defined its post-war identity as anti-communist, Taiwan held the status of the preferred national government in what was then referred to as “the free world.” The fate of Taiwan — still referred to by its Portuguese name, Formosa — figured as a major foreign policy issue in the 1960 US presidential campaign that pitted John F. Kennedy against Richard Nixon. The debate turned on whether the US should commit to defending two smaller islands, situated between mainland China and Taiwan, against the People’s Republic.

    In short, Taiwan’s history and geopolitical status over the past 150 years have become extremely complex. There are political, economic and geographical considerations as well as ideological and geopolitical factors that make it even more complex. These have been aggravated by a visible decline in the supposed capacity of the United States to impose and enforce solutions in different parts of the globe and the rise of China’s influence in the global economy.

    Complexity, when applied to politics, generally signifies ambiguity. In the aftermath of the Korean War, the Eisenhower administration established a policy based on the idea of backing Taiwan while seriously hedging its bets. Writing for The Diplomat, Dennis Hickey explains that in 1954, the US “deliberately sought to ‘fuzz up’ the security pact [with Taiwan] in such a way that the territories covered by the document were unclear.”

    Following President Nixon’s historic overture in 1971, China’s seat at the United Nations was transferred from the ROC to Mao’s PRC, and the US went on to establish formal diplomatic relations with the People’s Republic of China in 1979. The status of Taiwan was now inextricably ambiguous. US administrations, already accustomed to “fuzzy” thinking, described their policy approach as “strategic ambiguity.” It allowed them to treat Taiwan as an ally without recognizing it as an independent state. The point of such an attitude is what R. Nicholas Burns — President Joe Biden’s still unconfirmed pick for the post of US ambassador to China — calls “the smartest and most effective way” to avoid war.

    Recent events indicate that we may be observing a calculated shift in that policy. In other words, the ambiguity is becoming more ambiguous. Or, depending on one’s point of view, less ambiguous. There is a discernible trend toward the old Cold War principle of brinkmanship. A not quite prepared President Biden recently embarrassed himself at a CNN town hall by stating that the US had a “commitment” to defend Taiwan. The White House quickly walked back that commitment, reaffirming the position of strategic ambiguity.

    This week, Secretary of State Antony Blinken appeared to be pushing back in the other direction, threatening the Chinese with “terrible consequences” if they make any move to invade Taiwan. Blinken added, the Taipei Times reports, that the US has “been very clear and consistently clear” in its commitment to Taiwan. 

    Today’s Daily Devil’s Dictionary definition:

    Consistently clear:

    In normal use, unambiguous. In diplomatic use, obviously muddied and murky, but capable of being transformed by an act of assertive rhetoric into the expression of a bold-sounding intention that eliminates nuance, even when nuance remains necessary for balance and survival.

    Contextual note

    If Donald Trump’s administration projected a foreign policy based on fundamentally theatrical melodrama that consisted of calling the leader of a nuclear state “rocket man” and dismissing most of the countries of the Global South as “shitholes,” while accusing allies of taking advantage of the US, the defining characteristic of the now ten-month-old Biden administration’s foreign policy appears to be the commitment to the old 1950s Cold War stance known as brinkmanship.

    In November, the CIA director, William Burns, comically threatened Russia with “consequences” if it turned out — despite a total lack of evidence — that Vladimir Putin’s people were the perpetrators of a series of imaginary attacks popularly called the Havana syndrome. This week, backing up Biden’s warning “of a ‘strong’ Western economic response” to a Russian invasion of Ukraine, National Security Adviser Jake Sullivan was more specific. “One target,” France 24 reports, “could be Russia’s mammoth Nord Stream 2 natural gas pipeline to Germany. Sullivan said the pipeline’s future was at ‘risk’ if Russia does invade Ukraine.” This may have been meant more to cow the Europeans, whose economy depends on Russian gas, than the Russians themselves.

    These various examples have made observers wonder what is going on, what the dreaded “consequences” repeatedly evoked may look like and what other further consequences they may provoke. The US administration seems to be recycling the nostalgia of members of Biden’s own generation, hankering after what their memory fuzzily associates with the prosperous years of the original Cold War.

    Historical Note

    Britannica defines brinkmanship as the “foreign policy practice in which one or both parties force the interaction between them to the threshold of confrontation in order to gain an advantageous negotiation position over the other. The technique is characterized by aggressive risk-taking policy choices that court potential disaster.”

    The term brinkmanship was coined by Dwight Eisenhower’s Democratic opponent in both of his elections, Adlai Stevenson, who dared to mock Secretary of State John Foster Dulles when he celebrated the principle of pushing things to the brink. “The ability to get to the verge,” Dulles explained, “without getting into the war is the necessary art…if you are scared to go to the brink, you are lost.” Eisenhower’s successor, John F. Kennedy, inherited the consequences of Dulles’ brinkmanship over Cuba, the nation that John Foster’s brother, CIA Director Allen Dulles, insisted on invading only months after Kennedy’s inauguration. This fiasco was a prelude to the truly frightening Cuban missile crisis in October 1962, when Kennedy’s generals, led by Curtis LeMay, sought to bring the world to the absolute brink.

    When, two years later, Lyndon Johnson set a hot war going in Vietnam, or when, decades later, George W. Bush triggered a long period of American military aggression targeting multiple countries in the Muslim world, the policy of brinkmanship was no longer in play. These proxy wars were calculated as bets that fell far short of the brink. The risk was limited to what, unfortunately, it historically turned out to be: a slow deterioration of the capacities and the image of a nation that was ready to abuse its power in the name of abstract principles — democracy, liberation, stifling terrorism, promoting women’s rights — that none of the perpetrators took seriously. Threats and sanctions were features of the daily rhetoric, but the idea at the core of brinkmanship — that some major, uncontrollable conflagration might occur — was never part of the equation.

    The Biden administration may have serious reasons for returning to the policy of brinkmanship. The position of the United States on the world stage has manifestly suffered. Some hope it can be restored and believe it would require strong medicine. But there are also more trivial reasons: notably the fear of the administration being mocked by Republicans for being weak in the face of powerful enemies. 

    Both motivations signal danger. We may once again be returning to the devastating brinkman’s game logic illustrated in Stanley Kubrick’s “Dr. Strangelove.”

    *[In the age of Oscar Wilde and Mark Twain, another American wit, the journalist Ambrose Bierce, produced a series of satirical definitions of commonly used terms, throwing light on their hidden meanings in real discourse. Bierce eventually collected and published them as a book, The Devil’s Dictionary, in 1911. We have shamelessly appropriated his title in the interest of continuing his wholesome pedagogical effort to enlighten generations of readers of the news. Read more of The Daily Devil’s Dictionary on Fair Observer.]

    The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.

  • 9/11 and the American Collective Unconscious

    A little more than a month ago, the most newsworthy controversy surrounding the imminent and highly symbolic 20th anniversary of 9/11 concerned the message by families of the victims that Joe Biden would not be welcome at the planned commemoration. They reproached the US president for failing to make good on last year’s campaign promise to declassify the documents they believe will reveal Saudi Arabia’s involvement in the attacks.

    That was the story that grabbed headlines at the beginning of August. Hardly a week later, everything had changed. Kabul, the capital of Afghanistan, fell to the Taliban and soon the 20-year war would be declared over.

    Though few paid attention to the phenomenon, this also meant that the significance of a commemoration of the attacks would be radically different. For 19 years, the commemoration had served to reinforce the will and resolution of the nation to overcome the humiliation of the fallen twin towers and a damaged wing of the Pentagon.

    Redefining the Meaning of the Historical Trauma

    In the aftermath of the attacks on September 11, 2001, politicians quickly learned to exploit the date as a painful reminder of a tragedy that had unified an otherwise chaotically disputatious nation in shared horror and mourning. Ever since that fatal day, politicians have invoked it to reinforce the belief in American exceptionalism.

    The nation is so exceptional in generously providing its people with what President George W. Bush called “our freedoms” — and which he identified as the target of the terrorists — that it was logical to suppose that evil people who didn’t possess those freedoms or were prevented from emigrating to the land of the free would do everything in their power to destroy those freedoms. To the degree that Americans are deeply thankful for possessing such an exceptional status, other ill-intentioned people will take exception to that exceptionality and in their unjustified jealousy will threaten to destroy it.

    On a less philosophical and far more pragmatic note, the remembrance of the 9/11 attacks has conveniently and consistently served to justify an ever-expanding military budget that no patriotic American, interested in preserving through the force of arms the nation’s exceptional status, should ever oppose. It went without saying, through the three previous presidencies, that the annual commemoration provided an obvious explanation of why the forever war in Afghanistan was lasting forever.

    The fall of Kabul on August 15, followed by the panicked retreat of all remaining Americans, caught everyone by surprise. It unexpectedly brought an official end to the war whose unforgettable beginning is traced back to that bright September day in 2001. Though no one has yet had the time to put it all in perspective, the debate in the media has shifted away from glossing the issues surrounding an ongoing war on terror to assessing the blame for its ignominious end. Some may have privately begun to wonder whether the theme being commemorated on this September 11 now concerns the martyrdom of its victims or the humiliation of the most powerful nation in the history of the world. The pace of events since mid-August has meant that the media have been largely silent on this quandary.

    So, What About Saudi Arabia?

    With the American retreat, the controversy around Biden’s unkept campaign promise concerning Saudi Arabia’s involvement in 9/11 provisionally took a backseat to a much more consequential quarrel, one that will have an impact on next year’s midterm elections. Nearly every commentator has been eager to join the fray, focusing on the assessment of the wisdom or folly of both Biden’s decision to withdraw US troops from Afghanistan and his seemingly improvised management of the final chaotic phase.

    The human tragedy visible in the nightly news as throngs of people at Kabul airport desperately sought to flee the country easily eclipsed the genteel but politically significant showdown between a group of American citizens demanding the truth and a government committed to protecting the reputations of friends and allies, especially ones from oil-rich nations.

    The official excuse revolves around the criterion that has become a magic formula: national security. But the relatives of victims are justified in wondering which nation’s security is being prioritized. They have a sneaking suspicion that some people in Washington have confused their own nation’s security with Saudi Arabia’s. Just as John Mearsheimer and Stephen Walt not long ago revealed that plenty of people within the Beltway continue to confuse US foreign policy with Israel’s, the families may be justified in suspecting that Saudi Arabia’s interest in hiding the truth trumps American citizens’ right to know the truth.

    To appease the families of 9/11 victims and permit his unimpeded participation in the commemorations, Biden offered to release some of the classified documents. It was a clever move, since the new, less-redacted version will only become available well after the commemoration. This gesture seems to have accomplished its goal of preventing an embarrassing showdown at the commemoration ceremonies. But it certainly will not be enough to satisfy the demands of the families, who apparently remain focused on obtaining that staple of the US criminal justice system: “the truth, the whole truth and nothing but the truth.”

    Mohammed bin Salman (MBS), the crown prince of Saudi Arabia, may have shown the way concerning the assassination of journalist Jamal Khashoggi in 2018. Like MBS, the White House prefers finding a way to release some of the truth rather than the whole truth — just the amount that doesn’t violate national security or tarnish the reputations of any key people. Those two goals have increasingly become synonymous. If the people knew what actual political personalities were doing, the nation’s security might be endangered, as the people might begin to lose faith in a government that insists on retaining the essential power of deciding how the truth should be told.

    Here is how the White House officially formulates the legal principle behind its commitment to unveiling a little more truth than is currently available. “Although the indiscriminate release of classified information could jeopardize the national security — including the United States Government’s efforts to protect against future acts of terrorism — information should not remain classified when the public interest in disclosure outweighs any damage to the national security that might reasonably be expected from disclosure.”

    The White House has thus formulated an innovative legal principle brilliantly designed to justify concealing enough of the naked truth to avoid offending public morals by revealing its stark nakedness. Legal scholars of the future may refer to it as the “indiscriminate release” principle. Its logical content is worth exploring. It plays on the auxiliary verbs “could” and “should.” “Could” is invoked in such a way as to suggest that, though it is possible, no reasonable person would take the risk of an “indiscriminate release of classified information.” Later in the same sentence, the auxiliary verb “should” serves to speculatively establish the moral character of the principle. It tells us what “should” be the case — that is, what is morally ideal — even if inevitably the final result will be quite different. This allows the White House to display its good intentions while preparing for an outcome that will surely disappoint.

    To justify its merely partial exposure of the truth, the White House offers another original moral concept when it promises the maximization of transparency. The full sentence reads: “It is therefore critical to ensure that the United States Government maximizes transparency.”

    There is of course an easy way to maximize transparency if that is truly the government’s intention. It can be done simply by revealing everything and hiding nothing within the limits of its physical capability. No one doubts that the government is physically capable of removing all the redactions. But the public should know by now that the value cited as overriding all others — national security — implicitly requires hiding a determined amount of the truth. In other words, it is framed as a trade-off between maximum transparency and minimum concealment. Biden has consistently compared himself to President Franklin D. Roosevelt. Perhaps that trade-off between transparency and concealment is what historians will call Biden’s New Deal.

    But the White House’s reasoning is not yet complete. The document offers yet another guiding principle to explain why not everything will become visible. “Thus, information collected and generated in the United States Government’s investigation of the 9/11 terrorist attacks should now be disclosed,” it affirms, “except when the strongest possible reasons counsel otherwise.” Those reasons, the document tells us, will be defined by the Federal Bureau of Investigation during its “declassification reviews.” This invocation of the “strongest possible reasons” appears to empower the FBI to define or at least apply not only what is “strongest,” but also what is “possible.” That constitutes a pretty broad power.

    The document states very clearly what the government sees as the ultimate criterion for declassification: “Information may remain classified only if it still requires protection in the interest of the national security and disclosure of the information reasonably could be expected to result in damage to the national security. Information shall not remain classified if there is significant doubt about the need to maintain its classified status.” The families of the victims can simply hope that there will not be too much “significant doubt.” They might be forgiven for doubting that that will be the case.

    One September Morning vs. 20 Years of Subsequent Mornings

    Twenty years ago, a spectacular crime occurred on the East Coast of the United States that set off two decades of crimes, blunders and judgment errors that, now compounded by COVID-19 and aggravated climate change, have brought the world to a crisis point unique in human history.

    The Bush administration, in office for less than eight months at the time of the event, with no certain knowledge of who the perpetrator might have been, chose to classify the attack not as a crime, but as an act of war. When the facts eventually did become clearer, after a moment of hesitation in which the administration even attempted to implicate Iraq, the crime became unambiguously attributable not to a nation but to a politically motivated criminal organization: Osama bin Laden’s al-Qaeda, which was then operating out of Afghanistan, a country ruled by the Taliban.

    The administration’s choice of treating the attack as an act of war stands not only as a crime in itself but, as history has shown, as the trigger for a series of even more shameless and far more destructive — if not quite as spectacular — crimes that would roll out over the next two decades and even gain momentum over time. Had the 9/11 attacks been treated as crimes rather than acts of war, the question of national security would have had less importance in the investigation. By going to war with Afghanistan, the Bush administration made it more difficult to investigate all the possible complicities. Could this partially explain its haste to start a war?

    Bin Laden, a Saudi, did not act alone. But he did not act in the name of a state either, which is the fundamental criterion for identifying an act of war. He acted within a state, in the territory of Afghanistan. Though his motive was political and the chosen targets were evocatively symbolic of political power, the act itself was in no way political. No more so, in any case, than the January 6 insurrection this year on Capitol Hill.

    Though the facts are still being obscured and the text describing them remains redacted in the report of the 9/11 Commission, reading between the redacted lines reveals that bin Laden did have significant support from powerful personalities in Saudi Arabia, many of them with a direct connection to the government. Such support would seem to indicate complicity at some level of the state.

    On this 20th anniversary of a moment of horror, the families of the victims quite logically continue to suspect that if a state was involved that might eventually justify a declaration of war by Congress (as required by the US Constitution), the name of that state should not have been Afghanistan, but Saudi Arabia. It is equally clear that the Afghan government at the time was in no way directly complicit.

    When the new version of the 9/11 Commission’s report appears with its “maximum transparency,” meaning a bare minimum of redaction, the objections of the victims’ families will no longer be news, and the truth about the deeper complicities around 9/11 will most probably remain obscured. Other dramas, concerning the state of the COVID-19 pandemic, the increasingly obvious consequences of climate change and an upcoming midterm election, will probably mean that next year’s 21st commemoration will be low-key and possibly considered unworthy of significant mention in the news.

    In 2021, the world has become a decidedly different place from the one it has known over the past two decades. The end of a forever war simply promises a host of new forever problems for increasingly unstable democracies to deal with.

    The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.

  • Will the US Wake Up From Its Post-9/11 Nightmare?

    Looking back on it now, the 1990s were an age of innocence for America. The Cold War was over and our leaders promised us a “peace dividend.” There was no TSA — the Transportation Security Administration — to make us take off our shoes at airports (how many bombs have they found in those billions of shoes?). The government could not tap a US phone or read private emails without a warrant from a judge. And the national debt was only $5 trillion, compared with over $28 trillion today.

    We have been told that the criminal attacks of September 11, 2001, “changed everything.” But what really changed everything was the US government’s disastrous response to them. That response was not preordained or inevitable, but the result of decisions and choices made by politicians, bureaucrats and generals who fueled and exploited our fears, unleashed wars of reprehensible vengeance and built a secretive security state, all thinly disguised behind Orwellian myths of American greatness.  

    Most Americans believe in democracy and many regard the United States as a democratic country. But the US response to 9/11 laid bare the extent to which American leaders are willing to manipulate the public into accepting illegal wars, torture, the Guantanamo gulag and sweeping civil rights abuses — activities that undermine the very meaning of democracy. 

    Former Nuremberg prosecutor Ben Ferencz said in a speech in 2011 that “a democracy can only work if its people are being told the truth.” But America’s leaders exploited the public’s fears in the wake of 9/11 to justify wars that have killed and maimed millions of people who had nothing to do with those crimes. Ferencz compared this to the actions of the German leaders he prosecuted at Nuremberg, who also justified their invasions of other countries as “preemptive first strikes.” 

    “You cannot run a country as Hitler did, feeding them a pack of lies to frighten them that they’re being threatened, so it’s justified to kill people you don’t even know,” Ferencz continued. “It’s not logical, it’s not decent, it’s not moral, and it’s not helpful. When an unmanned bomber from a secret American airfield fires rockets into a little Pakistani or Afghan village and thereby kills or maims unknown numbers of innocent people, what is the effect of that? Every victim will hate America forever and will be willing to die killing as many Americans as possible. Where there is no court of justice, wild vengeance is the alternative.” 

    “Insurgent Math”

    Even the commander of US forces in Afghanistan, General Stanley McChrystal, talked about “insurgent math,” conjecturing that, for every innocent person killed, the US created 10 new enemies. Thus, the so-called global war on terror fueled a global explosion of terrorism and armed resistance that will not end unless and until the United States ends the state terrorism that provokes and fuels it. 

    By opportunistically exploiting 9/11 to attack countries that had nothing to do with it, like Iraq, Somalia, Libya, Syria and Yemen, the US vastly expanded the destructive strategy it used in the 1980s to destabilize Afghanistan, which spawned the Taliban and al-Qaeda in the first place. In Libya and Syria, only 10 years after 9/11, US leaders betrayed every American who lost a loved one on September 11 by recruiting and arming al-Qaeda-led militants to overthrow two of the most secular governments in the Middle East, plunging both countries into years of intractable violence and fueling radicalization throughout the region.

    The US response to 9/11 was corrupted by a toxic soup of revenge, imperialist ambitions, war profiteering, systematic brainwashing and sheer stupidity. Lincoln Chafee, the only Republican senator who voted against the war on Iraq, later wrote, “Helping a rogue president start an unnecessary war should be a career-ending lapse of judgment.”

    But it wasn’t. Very few of the 263 Republicans or the 110 Democrats who voted in 2002 for the US to invade Iraq paid any political price for their complicity in international aggression, which the judges at Nuremberg explicitly called “the supreme international crime.” One of them now sits at the apex of power in the White House. 

    Failure in Afghanistan

    Donald Trump and Joe Biden’s withdrawal and implicit acceptance of the US defeat in Afghanistan could serve as an important step toward ending the violence and chaos their predecessors unleashed after the 9/11 attacks. But the current debate over next year’s military budget makes it clear that our deluded leaders are still dodging the obvious lessons of 20 years of war. 

    Barbara Lee, the only member of Congress with the wisdom and courage to vote against the war resolution in 2001, has introduced a bill to cut US military spending by almost half: $350 billion per year. With the miserable failure in Afghanistan, a war that will end up costing every US taxpayer $20,000, one would think that Representative Lee’s proposal would be eliciting tremendous support. But the White House, the Pentagon and the Armed Services Committees in the House and Senate are instead falling over each other to shovel even more money into the bottomless pit of the military budget.

    Politicians’ votes on questions of war, peace and military spending are the most reliable test of their commitment to progressive values and the well-being of their constituents. You cannot call yourself a progressive or a champion of working people if you vote to appropriate more money for weapons and war than for health care, education, green jobs and fighting poverty.

    These 20 years of war have revealed to Americans and the world that modern weapons and formidable military forces can only accomplish two things: kill and maim people and destroy homes, infrastructure and entire cities. American promises to rebuild bombed-out cities and “remake” countries it has destroyed have proved worthless, as President Biden has acknowledged. 

    Both Iraq and Afghanistan are turning primarily to China for the help they need to start rebuilding and developing economically from the ruin and devastation left by the US and its allies. America destroys, China builds. The contrast could not be more stark or self-evident. No amount of Western propaganda can hide what the whole world can see. 

    But the different paths chosen by American and Chinese leaders are not predestined. Despite the intellectual and moral bankruptcy of the US corporate media, the American public has always been wiser and more committed to cooperative diplomacy than their country’s political and executive class. It has been well-documented that many of the endless crises in US foreign policy could have been avoided if America’s leaders had just listened to the people.

    Weapons and More Weapons

    The perennial handicap that has dogged US diplomacy since World War II is precisely our investment in weapons and military forces, including nuclear weapons that threaten our very existence. It is trite but true to say that, “when the only tool you have is a hammer, every problem looks like a nail.” 

    Other countries don’t have the option of deploying overwhelming military force to confront international problems, so they have had to be smarter and more nimble in their diplomacy and more prudent and selective in their more limited uses of military force. 

    The rote declarations of US leaders that “all options are on the table” are a euphemism for precisely the “threat or use of force” that the UN Charter explicitly prohibits, and they stymie the US development of expertise in nonviolent forms of conflict resolution. The bumbling and bombast of America’s leaders in international arenas stand in sharp contrast to the skillful diplomacy and clear language we often hear from top Russian, Chinese and Iranian diplomats, even when they are speaking in English, their second or third language.

    By contrast, US leaders rely on threats, coups, sanctions and war to project power around the world. They promise Americans that these coercive methods will maintain US “leadership” or dominance indefinitely into the future, as if that is America’s rightful place in the world: sitting atop the globe like a cowboy on a bucking bronco. 

    A “new American century” and “Pax Americana” are Orwellian versions of Adolf Hitler’s “thousand-year Reich” but are no more realistic. No empire has lasted forever, and there is historical evidence that even the most successful empires have a lifespan of no more than 250 years, by which time their rulers have enjoyed so much wealth and power that decadence and decline inevitably set in. This describes the United States today.  

    America’s economic dominance is waning. Its once productive economy has been gutted and financialized, and most countries in the world now do more trade with China and/or the European Union than with the United States. Where America’s military once kicked open doors for American capital to “follow the flag” and open up new markets, today’s US war machine is just a bull in the global china shop, wielding purely destructive power.    

    Time to Get Serious

    But we are not condemned to passively follow the suicidal path of militarism and hostility. Biden’s withdrawal from Afghanistan could be a down payment on a transition to a more peaceful post-imperial economy — if the American public starts to actively demand peace, diplomacy and disarmament and find ways to make our voices heard. 

    First, we must get serious about demanding cuts in the Pentagon budget. None of our other problems will be solved as long as we keep allowing our leaders to flush the majority of federal discretionary spending down the same military toilet as the $2.26 trillion they wasted on the war in Afghanistan. We must oppose politicians who refuse to cut the Pentagon budget, regardless of which party they belong to and where they stand on other issues.

    Second, we must not let ourselves or our family members be recruited into the US war machine. Instead, we must challenge our leaders’ absurd claims that the imperial forces deployed across the world to threaten other countries are somehow, by some convoluted logic, defending America. As a translator paraphrased Voltaire, “Whoever can make you believe absurdities can make you commit atrocities.”  

    Third, we must expose the ugly, destructive reality behind our country’s myths of “defending” US vital interests, humanitarian intervention, the war on terror and the latest absurdity, the ill-defined “rules-based order” — whose rules only apply to others but never to the United States. 

    Finally, we must oppose the corrupt power of the arms industry, including US weapons sales to the world’s most repressive regimes, and an unwinnable arms race that risks a potentially world-ending conflict with China and Russia. 

    Our only hope for the future is to abandon the futile quest for hegemony and instead commit to peace, cooperative diplomacy, international law and disarmament. After 20 years of war and militarism that has only left the world a more dangerous place and accelerated America’s decline, we must choose the path of peace.

    The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.

  • Is Operation Enduring Freedom Doomed to Endure Forever?

    Those were heady days in the US stock market. I would wake up by 5 am and watch CNBC before the stock market opened for trading at 6:30 am Pacific time. It was no different on the morning of September 11, 2001. Little did I know that catastrophic things were about to happen that would change the world.

    At 8:45 am Eastern time, an American Airlines flight crashed into the north tower of the World Trade Center in New York City. Within minutes, CNBC stopped discussing stocks and started covering the incident. At that moment, no one knew whether it was an anomalous accident or an attack of some kind.

    Three minutes after 9 am Eastern, as I watched the events unfold in disbelief, I saw a United Airlines passenger aircraft fly right into the south tower of the twin towers. In under an hour, the south tower collapsed, resulting in a massive cloud of dust and smoke. By now, there was no doubt that America was under attack.

    “We will remember the moment the news came, where we were and what we were doing,” said President George W. Bush in an address to Congress on September 20. Images from that Tuesday morning, which came just nine days after my second child was born, are still etched in my memory.

    In all, 2,996 people of 78 nationalities lost their lives in four coordinated attacks conducted by al-Qaeda using hijacked commercial, civilian airliners as their weapons, making 9/11 the second-biggest attack on American soil — second only to the genocidal assault on Native Americans committed by the nation’s immigrant settlers.

    Operation Enduring Freedom: America’s War on Terror

    Addressing the nation the following day, Bush called the attacks “more than acts of terror. They were acts of war.” He promised that “the United States of America will use all our resources to conquer this enemy.” The president went on to assure Americans that this “battle will take time and resolve, but make no mistake about it, we will win.”

    Twenty years later, the US has left Afghanistan and Iraq in a chaotic mess. The question remains: Did the United States win the war on terror the Bush administration launched in 2001? It is a war that has cost more than $6.4 trillion and over 801,000 lives, according to the Watson Institute for International and Public Affairs at Brown University.

    In October 2001, the US-led coalition invaded Afghanistan and overthrew the Taliban government that had harbored al-Qaeda. Soon after, al-Qaeda militants had been driven into hiding. Osama bin Laden, the mastermind behind the 9/11 attack and leader of al-Qaeda, was killed 10 years later in a raid conducted by US forces in Abbottabad, Pakistan.

    In a shrewd move, Bush had left himself room to take down Iraq and its president, Saddam Hussein, using an overarching definition for the war on terror. In his address to Congress on September 20, Bush also stated: “Our war on terror begins with Al-Qaeda, but it does not end there. It will not end until every terrorist group of global reach has been found, stopped and defeated.”

    True to his word, in 2003, the United States and its allies invaded Iraq under the premise that it possessed weapons of mass destruction. Bush settled his score with Hussein, ensuring he was captured, shamed and subsequently executed in 2006.

    Despite reducing al-Qaeda to nothing and killing bin Laden, despite wrecking Iraq and having its leader executed, it is impossible to say that the US has won the war on terror. All that Washington has managed to do is trade al-Qaeda for the Islamic State (IS) group that swept through Syria and Iraq in 2014, giving a new identity to an old enemy. Following the US and NATO pullout from Afghanistan last month, the Taliban, whom America drove out of power in 2001, are back in the saddle. In fact, the Taliban’s recapture of Afghanistan has been so swift, so precise and so comprehensive that the international community is in shock, questioning the timing and prudence of the withdrawal of troops.

    Setting an expectation for how long the war on terror was likely to last, the secretary of defense under the Bush administration, Donald Rumsfeld, remarked in September 2001 that “it is not going to be over in five minutes or five months, it’ll take years.” Rumsfeld, who christened the campaign Operation Enduring Freedom, was prescient, as the war enters its third decade in a never-ending fight against terrorism.

    The Winners and Losers

    Ironically, Operation Enduring Freedom has only resulted in an enduring loss of American freedom, one step at a time. I still remember walking up the jet bridge in 1991 to meet my wife as she deplaned from a flight. Another time, when she was traveling to Boston from San Francisco, I was allowed to enter the aircraft and help her get settled with her luggage, along with our 1-year-old. It is inconceivable to be allowed to do such a thing today, and I would not be surprised if readers question the veracity of my personal experience. In many ways, al-Qaeda has succeeded in stripping Americans of the sense of freedom they have always enjoyed.

    More than Americans, the biggest losers in this tragic war are Iraqis and Afghans, particularly the women. Afghan women, who had a brief respite from persecution under the Taliban’s strict Islamic laws and human rights abuses, are back to square one and justifiably terrified of their future under the new regime. The heart-wrenching scenes from Kabul airport of people trying to flee the country tell us about how Afghans view the quality of life under the Taliban and the uncertainty that the future holds. 

    To its east, the delicate balance of peace — if one could characterize the situation between India and Pakistan as peaceful — is likely to be put to the test as violence from Afghanistan spreads. To its north in Tajikistan, there is no love lost between Tajiks and the Taliban. Tajikistan’s president, Emomali Rahmon, has refused to recognize the Taliban government, and Tajiks have promised to join anti-Taliban militia groups, paving the way for continued unrest and violence in Central Asia.

    If History Could be Rewritten

    In 2001, referring to Islamist terrorists, Bush asked the rhetorical question, “Why do they hate us?” He tried to answer it in a speech to Congress: “They hate what they see right here in this chamber: a democratically elected government. Their leaders are self-appointed. They hate our freedoms: our freedom of religion, our freedom of speech, our freedom to vote and assemble and disagree with each other.”

    Islamic fundamentalists couldn’t give two hoots about a form of government or a people’s way of life thousands of miles away. The real answer to Bush’s question lies deeply buried in US foreign policy. America’s steadfast support of Israel, along with its refusal to recognize the state of Palestine, is the number one reason it has become a target of groups like al-Qaeda and IS.

    America’s ill-conceived response to the Soviet invasion of Afghanistan in 1979 during the Cold War led to the creation of al-Qaeda. It was with US funds and support that the anti-Soviet mujahideen fought America’s proxy war with the Soviets. Without US interference, al-Qaeda may never have come into existence.

    During the Iran-Iraq War of the 1980s, the US bolstered Saddam Hussein by backing his regime against the Iranians. When Hussein became too ambitious for America’s comfort and invaded Kuwait in 1990, George H.W. Bush engaged Iraq in the Persian Gulf War. The US motive at that time was primarily to protect its oil interests in Kuwait.

    The US created its own nemesis in Saddam Hussein and Osama bin Laden and spent $6 trillion to kill them. In the process, US leaders have reduced Iraq and Afghanistan to shambles and created a new monster in the Islamic State.

    Sadly, history can never be rewritten. The US has proved time and again that its involvement in the Middle East and Muslim world is aimed at advancing its own political interests. The only question that remains is: Can the US adopt a policy that would not aggravate the situation and, over time, deescalate it, without creating yet another Hussein or bin Laden? Without a radically different approach, Operation Enduring Freedom is doomed to endure forever, costing trillions of dollars each decade.

    The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.

  • How 9/11 and the War on Terror Shaped the World

    On September 11, 2001, 19 militants associated with the Islamist terrorist group al-Qaeda hijacked four planes and launched suicide attacks on iconic symbols of America, first striking the twin towers of the World Trade Center in New York and then the Pentagon. It would be the deadliest act of terrorism on American soil, claiming nearly 3,000 lives.

    The attacks not only shocked the world, but the images of planes crashing into the World Trade Center came to define a generation. In a speech on October 11, 2001, then-President George W. Bush spoke of “an attack on the heart and soul of the civilized world” and declared “war against all those who seek to export terror, and a war against those governments that support or shelter them.” This was the start of the global war on terror.

    The Story of the 9/11 Attacks and Retaliation

    Osama bin Laden, the Saudi leader of al-Qaeda, inspired the 9/11 attacks. Khalid Sheikh Mohammed, a Pakistani Islamist terrorist and the uncle of Ramzi Yousef, the convicted mastermind of the 1993 World Trade Center bombing, masterminded the operation. The 9/11 Commission Report described al-Qaeda as “sophisticated, patient, disciplined and lethal.” It held that the enemy rallied “broad support in the Arab and Muslim world.” The report concluded that al-Qaeda’s hostility to the US and its values was limitless.

    The report went on to say that the enemy aimed “to rid the world of religious and political pluralism, the plebiscite, and equal rights for women,” and observed that it made no distinction between military and civilian targets. The goal going forward was “to attack terrorists and prevent their ranks from swelling while at the same time protecting [the US] against future attacks.”

    To prosecute the war on terror, the US built a worldwide coalition: 136 countries offered military assistance, and 46 multilateral organizations declared support. Washington began by launching a financial war on terror, freezing assets and disrupting fundraising pipelines. In the first 100 days, the Bush administration set aside $20 billion for homeland security.

    On October 7, 2001, the US inaugurated the war on terror with Operation Enduring Freedom. An international coalition that included Australia, Canada, Denmark, Germany, Japan, the UK and other countries, with the help of the Northern Alliance comprising various mujahedeen militias, overthrew the Taliban, which was sheltering al-Qaeda fighters, and took over Afghanistan.

    The war on terror that began in Afghanistan soon took on a global focus. In 2003, the Bush administration invaded Iraq despite the lack of a UN mandate. Washington made the argument that Iraqi dictator Saddam Hussein was developing weapons of mass destruction, represented a threat to world peace, and harbored and succored al-Qaeda and other Islamic jihadists. None of this proved to be true. Hussein’s regime fell as speedily as Mullah Omar’s Taliban.

    Victory, however, was short-lived. Soon, insurgency returned. In Afghanistan, suicide attacks quintupled from 27 in 2005 to 139 in 2006. Globally, the war on terror saw a “stunning” rise in jihadist activity, with just over 32,000 fighters split among 13 Islamist groups in 2001 burgeoning to 100,000 across 44 outfits in 2015. Terrorist attacks went up from an estimated 1,880 in 2001 to 14,806 in 2015, claiming 38,422 lives that year alone — a 397% increase on 2001.

    Boosted by the US invasion of Iraq, al-Qaeda spawned affiliates across Asia, Africa and the Middle East, a decentralized structure that remained intact even after the US assassination of Osama bin Laden in 2011 dealt al-Qaeda a severe blow. One of its Iraqi offshoots morphed into what became the Islamic State (IS) group following the withdrawal of most US troops from Iraq under President Barack Obama in 2011.

    After declaring a caliphate in 2014, IS launched a global terrorist campaign that, within a year, conducted and inspired over 140 attacks in 29 countries beyond Syria and Iraq, according to one estimate. Islamic State acolytes went on to claim nearly 30,000 lives across the Middle East, Europe, the United States, Asia and Africa, controlling vast amounts of territory in Iraq and Syria, before suffering defeat by internationally-backed local forces in 2019.

    In Afghanistan, despite the war’s estimated trillion-dollar price tag, the Taliban took control of the capital, Kabul, on August 15 amid a chaotic US withdrawal, raising fears of an al-Qaeda comeback. Last year, the Global Terrorism Index concluded that deaths from terrorism were still double the number recorded in 2001, with Afghanistan accounting for a disproportionately large share of over 40% in 2019.

    Why Do 9/11 and the War on Terror Matter?

    While the failures and successes of the war on terror will remain subject to heated debate for years to come, what remains uncontested is the fact that the 9/11 attacks and the ensuing war on terror have forged the world we live in today.

    First, they have caused tremendous loss of blood and treasure. Brown University’s Costs of War project places an $8-trillion price tag on the US war on terror. It estimates that about 900,000 people “were killed as a direct result of war, whether by bombs, bullets or fire,” a number that does not include indirect deaths “caused by way of disease, displacement and loss of access to food or clean drinking water.”

    Second, numerous countries, including liberal democracies such as the US and the UK, have eroded their own civil liberties and democratic institutions with the avowed goal of improving security. Boarding airplanes or entering public buildings now invariably involves elaborate security checks. Mass surveillance has become par for the course. The US continues to hold terror suspects in indefinite detention without trial at Guantanamo Bay.

    Third, many analysts argue that the attacks and the response have coarsened the US. After World War II, Americans drew a line in the sand against torture. They put Germans and Japanese on trial for war crimes that included waterboarding. In the post-9/11 world, torture became part of the American toolkit. Airstrikes and drone strikes have caused high collateral casualties, killing a disputed number of innocents and losing the battle for the hearts and minds of local populations.

    These strikes raise significant issues of legality and the changing nature of warfare. There is a question as to the standing of “counterterrorism” operations in international and national law. However, such issues have garnered relatively little public attention. 

    Fourth, the 9/11 attacks and the ensuing war on terror have coincided with the spectacular rise of China. On December 11, 2001, the Middle Kingdom joined the World Trade Organization, which enabled the Chinese economy to grow at a speed and scale unprecedented in history. Analysts believe that the distraction of the war on terror hindered the US response to this revolution in international relations and global power dynamics.

    Under Barack Obama, the US initiated an explicit “pivot to Asia” policy that sought to shift focus from the war on terror and manage the rise of China. Under Donald Trump, Washington unleashed a trade war on Beijing and concluded a peace deal with the Taliban. Joe Biden has long believed that US priorities are too skewed toward terrorism and that Afghanistan is a secondary strategic issue, a view that led to his decision to withdraw troops to mark the 20th anniversary of 9/11.

    Biden has argued that the US has degraded al-Qaeda in Afghanistan and eliminated bin Laden. Despite worrying echoes of George W. Bush declaring “mission accomplished” in Iraq in 2003, Biden wants the US, from now on, to remain “narrowly focused on counterterrorism — not counterinsurgency or nation building.”

    While the terrorist threat still consumes US resources, Washington is now shifting its strategic attention and resources to China, Russia and Iran. The Biden administration has deemed these three authoritarian powers to be the biggest challenge for the postwar liberal and democratic order. The 20-year war on terror seems to be over — at least for now.

    The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.

  • in

    After Afghanistan, How Probable Is Peace?

    As the world speculates about the future of Afghanistan, some key figures in the West — with a vested interest in how things evolve militarily — are today claiming to show the clairvoyance that has consistently failed them in the past. Many have criticized President Joe Biden’s decision to withdraw from the battlefield. Even more have complained about how it was managed.

    Republicans feel duty-bound to denounce a policy that only Liz Cheney objected to when Republican President Donald Trump promoted it. One “highly decorated British Army officer” complained that “6,500 people died, including 3,000 deaths at Twin Towers, and we didn’t achieve a single thing.” Special Operations Staff Sergeant Trevor Coult went further, claiming that Biden “is a danger while he is president.”

    Numerous Democrats attached to the military-industrial funding machine have objected to the very idea of abandoning the costly struggle. Representative Jim Langevin, of Rhode Island, penned an op-ed in Foreign Policy portraying the decision as a betrayal of a moral commitment to “our Afghan allies of 20 years” and “to our military service members and their families … who gave the ultimate sacrifice.” And, of course, he couldn’t forget “the women and girls of Afghanistan who are now experiencing a devastating new reality.”

    He seemed less concerned when, for 20 years, the majority of Afghan women and girls experienced another form of devastating reality: receiving bombs delivered by surgical drones, seeing their doors kicked in by well-armed soldiers, listening to drones buzzing overhead and wondering where they might strike, failing to understand which local warlord in the pay of the CIA might protect or attack them, or simply watching unutterable chaos unfold day after day.

    AFP reports on the opinion General Mark Milley expressed in an interview with Fox News: Now that the US troops are no longer there to enforce the law and maintain order, the chairman of the Joint Chiefs of Staff predicts further chaos, worse than ever before. After questioning the ability of the Taliban, even before they have formed a government, “to consolidate power and establish effective governance,” Milley offers his assessment of what’s to come. “I think there’s at least a very good probability of a broader civil war,” he asserts.

    Making certain his audience will understand the degree of fear his warnings should inspire, he adds that it “will then in turn lead to conditions that could, in fact, lead to a reconstitution of Al-Qaeda or a growth of ISIS or other … terrorist groups.”

    Today’s Daily Devil’s Dictionary definition:

    Good probability:

    A dire likelihood to be ardently wished for by anyone associated with the military-industrial complex or dependent on it for current or future employment

    Contextual note

    Military officers, including generals, may hide the truth about reality on the ground. As the Afghanistan Papers revealed, that happened consistently for over two decades. But even when painting a rosy picture of success or an assessment of troop performance, a soldier’s choice of language leaves some room for the truth. That is why most governments usually prefer that the military not engage too directly with the media.

    General Milley made clear what he meant when he described the chaos to come as “at least a very good probability.” Both of his chosen expressions — “at least” and “very good” — reveal less about reality on the ground and more about how he hopes to see the situation evolve, calling for preparedness and possibly new operations. He wants Fox’s audience to understand that this is only a pause in the mission of the US to help other nations achieve the serenity of the global superpower that will always be a model for the rest of the world and lead by its example.

    A totally neutral and objective observer who happened to be equally convinced of the likelihood of a civil war in Afghanistan would have formulated it differently, most likely asserting something along the lines that “a strong possibility of a broader civil war cannot be discounted.” Proverbial wisdom tells us that “where there’s a will, there’s a way,” but the authorities of a nation defined by its military clout tend to improve on that by suggesting that “where there’s a will, there’s a way of framing it in such a manner as to convince people of the way we have decided must be followed.”

    General Milley is no warmonger. No reasonable person would compare him to the legendary Curtis LeMay, who summed up his philosophy about conflict — in this case with the Soviet Union during the 1962 Cuban Missile Crisis — with these words: “We should go in and wipe ’em out today.”

    Fortunately, no senior officer in the military would be tempted to think or act that way now. In contrast with the Cold War mentality, one of the lessons of all recent wars is that the US military is less motivated by the idea of winning wars than simply instilling the idea in the average American taxpayer’s mind that the nation needs a powerful, well-funded, technologically advanced military establishment to comfort the belief in American exceptionalism.

    In his interview with Fox News, General Milley shows no inclination to criticize Biden’s decision. He defends the way the withdrawal was conducted, laying all the blame on the Afghan government and its troops while claiming that everything was conducted according to plan. He cites the “corruption in the government” and its lack of legitimacy, “a fundamental issue that stretches back 20 years.”

    Concerning the collapse of the army and the police force, he makes a truly interesting remark: “We created and developed forces that looked like Western forces,” adding significantly that “maybe those forces were not designed appropriately for the type mission.” 

    General Milley follows up that last observation with what almost sounds like a resolution for action in the future: “That was something that needs to be looked at.” Many commentators have remarked that at the core of the 20-year fiasco lay a persistent form of cultural ignorance. By referring to this question as “something that needs to be looked at,” Milley appears to be placing it on some unequivocally remote back burner. In military parlance, “what needs to be looked at” is what will never be looked at unless someone at the highest level of authority suddenly wakes up to acknowledge the necessity.

    Historical Note 

    In short, an episode of history has just come to an end. In the coming weeks and months, reflection on it will be mired in wild speculation about what might have been done differently, accompanied by accusations of irresponsibility and failure of accountability. And if recent history is any guide, accountability will be successfully evaded, if only because holding one identifiable person accountable opens the floodgates to calling into question the entire system of which they were a part.

    In 2009, voters for Barack Obama expected to see some form of accountability for nearly everyone in the Bush administration, guilty of multiple sins that included war crimes, the criminal transfer of wealth to the 1% and the gutting of the middle-class economy. There were zero prosecutions and instead a message about looking forward rather than backward and letting bygones be bygones.   

    There are many lessons to be learned from the debacle in Afghanistan and a need for accountability that extends backward to the Bush administration. But none of the lessons can compete with the only essential idea the leaders and actors of the military-industrial complex will continue to put forward in the months to come: that we must be ready to repeat the patterns of the past and respond to the inevitable emergence of the equivalent of al-Qaeda again. We must be afraid of the next wave of terrorism, and we must be ready to respond. The logic of 2021 is the same as the logic of 2001 — and will undoubtedly lead to similar scenarios.

    And why should the logic be different? Military budgets have never been higher, and every new Congress is ready to raise the stakes. Many of us who grew up during the Vietnam War assumed that, once it was over, nothing like that 10-year nightmare could ever occur again. Instead, we have just sat through the equivalent of a Hollywood remake that lasted twice as long.

    *[In the age of Oscar Wilde and Mark Twain, another American wit, the journalist Ambrose Bierce, produced a series of satirical definitions of commonly used terms, throwing light on their hidden meanings in real discourse. Bierce eventually collected and published them as a book, The Devil’s Dictionary, in 1911. We have shamelessly appropriated his title in the interest of continuing his wholesome pedagogical effort to enlighten generations of readers of the news. Read more of The Daily Devil’s Dictionary on Fair Observer.]

    The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.

  • in

    Welcome to Our Extreme World

    Admittedly, I hadn’t been there for 46 years, but old friends of mine still live (or at least lived) in the town of Greenville, California, and now, well, it’s more or less gone, though they survived. The Dixie Fire, one of those devastating West Coast blazes, had already “blackened” 504 square miles of Northern California in what was still essentially the (old) pre-fire season. It would soon become the second-largest wildfire in the state’s history. When it swept through Greenville, much of downtown, along with more than 100 homes, was left in ashes as the 1,000 residents of that Gold Rush-era town fled.

    I remember Greenville as a wonderful little place that, all these years later, still brings back fond memories. I’m now on the other coast, but much of that small, historic community is no longer there. This season, California’s wildfires have already devastated three times the territory burned in the same period in 2020’s record fire season. And that makes a point that couldn’t be more salient to our moment and our future.

    A heating planet is a danger, not in some distant time, but right now — yesterday, today, and tomorrow. Don’t just ask the inhabitants of Greenville, ask those in the village of Monte Lake, British Columbia, the second town in that Canadian province to be gutted by flames in recent months in a region that normally — or perhaps I should just say once upon a time — was used to neither extreme heat and drought, nor the fires that accompany them.

    In case you hadn’t noticed, we’re no longer just reading about the climate crisis; we’re living it in a startling fashion. At least for this old guy, that’s now a fact — not just of life but of all our lives — that simply couldn’t be more extreme, and I don’t even need the latest harrowing report of the UN’s Intergovernmental Panel on Climate Change (IPCC) to tell me so.

    Whether you’ve been sweating and swearing under the latest heat dome; fleeing fires somewhere in the West; broiling in a Siberia that’s releasing startling amounts of heat-producing methane into the atmosphere; being swept away by floodwaters in Germany; sweltering in an unprecedented heat-and-fire season in Greece (where even the suburbs of Athens were being evacuated); baking in Turkey or on the island of Sardinia in a “disaster without precedent”; neck-deep in water in a Chinese subway car; or, after “extreme rains,” wading through the subway systems of New York City or London, you — all of us — are in a new world and we better damn well get used to it.

    Floods, megadrought, the fiercest of forest fires, unprecedented storms — you name it and it seems to be happening not in 2100 or even 2031, but now. A recent study suggests that, in 2020 (not 2040 or 2080), more than a quarter of Americans had suffered in some fashion from the effects of extreme heat, already the greatest weather-based killer of Americans. Given this blazing summer, 2021 is only likely to be worse.

    By the way, don’t imagine that it’s just us humans who are suffering. Consider, for instance, the billion or more — yes, 1 billion — mussels, barnacles and other small sea creatures estimated to have died off the coast of Vancouver, Canada, during the unprecedented heatwave there earlier in the summer.

    A few weeks ago here on the East Coast, watching the setting sun, an eerie blaze of orange-red in a hazy sky, was an unsettling experience once I realized what I was actually seeing: a haze of smoke from the megadrought-stricken West’s disastrous early fire season. It had blown thousands of miles east for the second year in a row, managing to turn the air of New York and Philadelphia into danger zones.

    In a way, right now it hardly matters where you look on this planet of ours. Take Greenland, where a “massive melting event,” occurring after the temperature there hit double the normal this summer, made enough ice vanish “in a single day last week to cover the whole of Florida in two inches of water.” But there was also that record brush fire torching more than 62 square miles of Hawaii’s Big Island. And while you’re at it, you can skip prime houseboat-vacation season at Lake Powell on the Arizona-Utah border, since that huge reservoir is now three-quarters empty (and, among Western reservoirs, anything but alone).

    It almost doesn’t matter which recent report you cite. When it comes to what the scientists are finding, it’s invariably worse than you (or often even they) had previously imagined. It’s true, for instance, of the Amazon rainforest, one of the great carbon sinks on the planet. Parts of it are now starting to release carbon into the atmosphere, as a study in the journal Nature reported recently, partially thanks to climate change and partially to more direct forms of human intervention.

    It’s no less true of the Siberian permafrost in a region where, for the first time above the Arctic Circle, the temperature in one town reached more than 100 degrees Fahrenheit on a summer day in 2020. And yes, when Siberia heats up in such a fashion, methane (a far more powerful heat-trapping gas than CO2) is released into the atmosphere from that region’s melting permafrost wetlands, which had previously sealed it in. And recently, that’s not even the real news. What about the possibility, according to a new study published in the Proceedings of the National Academy of Sciences, that what’s being released now is actually a potential “methane bomb” not from that permafrost itself, but from thawing rock formations within it?

    In fact, when it comes to the climate crisis, as a recent study in the journal Bioscience found, “some 16 out of 31 tracked planetary vital signs, including greenhouse gas concentrations, ocean heat content, and ice mass, set worrying new records.” Similarly, carbon dioxide, methane and nitrous oxide “have all set new year-to-date records for atmospheric concentrations in both 2020 and 2021.”

    Mind you, just in case you hadn’t noticed, the last seven years have been the warmest in recorded history. And speaking of climate-change-style records in this era, last year, 22 natural disasters hit this country, including hurricanes, fires and floods, each causing more than $1 billion in damage, another instant record with — the safest prediction around — many more to come.

    “It Looked Like an Atomic Bomb”

    Lest you think that all of this represents an anomaly of some sort, simply a bad year or two on a planet that historically has gone from heat to ice and back again, think twice. A recent report published in Nature Climate Change, for instance, suggests that heat waves that could put the recent ones in the US West and British Columbia to shame are a certainty and especially likely for “highly populated regions in North America, Europe, and China.” (Keep in mind that, a few years ago, there was already a study suggesting that the North China plain with its 400 million inhabitants could essentially become uninhabitable by the end of this century due to heatwaves too powerful for human beings to survive.) Or as another recent study suggested, reports The Guardian, “heatwaves that smash previous records … would become two to seven times more likely in the next three decades and three to 21 times more likely from 2051-2080, unless carbon emissions are immediately slashed.”

    It turns out that, even to describe the new world we already live in, we may need a new vocabulary. I mean, honestly, until the West Coast broiled and burned from Los Angeles to British Columbia this summer, had you ever heard of, no less used, the phrase “heat dome” before? I hadn’t, I can tell you that.

    And by the way, there’s no question that climate change in its ever more evident forms has finally made the mainstream news in a major way. It’s no longer left to 350.org or Greta Thunberg and the Sunrise Movement to highlight what’s happening to us on this planet. It’s taken years, but in 2021 it’s finally become genuine news, even if not always with the truly fierce emphasis it deserves.

    The New York Times, to give you an example, typically had a recent piece of reportage (not an op-ed) by Shawn Hubler headlined, “Is This the End of Summer as We’ve Known It?” Hubler wrote: “The season Americans thought we understood — of playtime and ease, of a sun we could trust, air we could breathe and a natural world that was, at worst, indifferent — has become something else, something ominous and immense. This is the summer we saw climate change merge from the abstract to the now, the summer we realized that every summer from now on will be more like this than any quaint memory of past summers.” And the new IPCC report on how fast things are indeed proceeding was front-page and front-screen news everywhere, as well it should have been, given the research it was summing up.

    My point here couldn’t be simpler: In heat and weather terms, our world is not just going to become extreme in 20 years or 50 years or as this century ends. It’s officially extreme right now. And here’s the sad thing: I have no doubt that, no matter what I write in this piece, no matter how up to date I am at this moment, by the time it appears it will already be missing key climate stories and revelations. Within months, it could look like ancient history.

    Welcome, then, to our very own not-so-slow-motion apocalypse. A friend of mine recently commented to me that, for most of the first 30 years of his life, he always expected the world to go nuclear. That was, of course, at the height of the Cold War between the US and the Soviet Union. And then, like so many others, he stopped ducking and covering. How could he have known that, in those very years, the world was indeed beginning to get nuked, or rather carbon-dioxided, methaned, greenhouse-gassed, even if in a slow-motion fashion? As it happens, this time there’s going to be no pretense for any of us of truly ducking and covering. 

    It’s true, of course, that ducking and covering was a fantasy of the Cold War era. After all, no matter where you might have ducked and covered then — even the Air Force’s command center dug into the heart of Cheyenne Mountain in Colorado — you probably wouldn’t have been safe from a full-scale nuclear conflict between the two superpowers of that moment, or at least not from the world it would have left behind, a disaster barely avoided in the Cuban Missile Crisis of 1962. (Today, we know that, thanks to the possibility of “nuclear winter,” even a regional nuclear conflict — say, between India and Pakistan — could kill billions of us, by starvation if nothing else.)

    In that context, I wasn’t surprised when a homeowner, facing his house, his possessions, and his car burned to a crisp in Oregon’s devastating Bootleg Fire, described the carnage this way: “It looked like an atomic bomb.”

    And, of course, so much worse is yet to come. It doesn’t matter whether you’re talking about a planet on which the Amazon rainforest has already turned into a carbon emitter or one in which the Gulf Stream collapses in a way that’s likely to deprive various parts of the planet of key rainfall necessary to grow crops for billions of people, while sea levels rise disastrously along the East Coast of the United States. And that just begins to enumerate the dangers involved, including the bizarre possibility that much of Europe might be plunged into a — hold your hats (and earmuffs) for this one — new ice age!

    World War III

    If this were indeed the beginning of a world war (instead of a world warm), you know perfectly well that the United States like so many other nations would, in the style of World War II, instantly mobilize resources to fight it (or as a group of leading climate scientists put it recently, we would “go big on climate” now). And yet in this country (as in too many others), so little has indeed been mobilized.

    Worse yet, here one of the two major parties, only recently in control of the White House, supported the further exploitation of fossil fuels (and so the mass creation of greenhouse gases) big time, as well as further exploration for yet more of them. Many congressional Republicans are still in the equivalent of a state of staggering (not to say, stark raving mad) denial of what’s underway. They are ready to pay nothing and raise no money to shut down the production of greenhouse gases, no less create the genuinely green planet run on alternative energy sources that would actually rein in what’s happening.

    And criminal as that may have been, Donald Trump, Mitch McConnell and crew were just aiding and abetting those that, years ago, I called “the biggest criminal enterprise in history.” I was speaking of the executives of major fossil-fuel companies who, as I said then, were and remain the true “terrarists” (and no, that’s not a misspelling) of history. After all, their goal in hijacking all our lives isn’t simply to destroy buildings like the World Trade Center, but to take down Earth (Terra) as we’ve known it. And don’t leave out the leaders of countries like China still so disastrously intent on, for instance, producing yet more coal-fired power. Those CEOs and their enablers have been remarkably intent on quite literally committing terracide and, sadly enough, in that — as has been made oh-so-clear in this disastrous summer — they’ve already been remarkably successful.

    Companies like ExxonMobil knew long before most of the rest of us the sort of damage and chaos their products would someday cause and couldn’t have given less of a damn as long as the mega-profits continued to flow in. (They would, in fact, invest some of those profits in funding organizations that were promoting climate-change denial.) Worse yet, as revealing comments by a senior Exxon lobbyist recently made clear, they’re still at it, working hard to undermine US President Joe Biden’s relatively modest green-energy plans in any way they can.

    Thought about a certain way, even those of us who didn’t live in Greenville, California, are already in World War III. Many of us just don’t seem to know it yet. So, welcome to my (and your) extreme world, not next month or next year or next decade or next century, but right now. It’s a world of disaster worth mobilizing over if, that is, you care about the lives of all of us and particularly of the generations to come. 

    *[This article was originally published by TomDispatch.]

    The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.