More stories

    Investigating the Radical Right’s Presence in the Military

    In the early morning of July 2, Corey Hurren, a 46-year-old military reservist from Manitoba, rammed his pickup truck through the front gate of the grounds of Rideau Hall, which contain both the official residence of the governor general of Canada and Rideau Cottage, the temporary residence of Prime Minister Justin Trudeau and his family. Armed with four weapons — a revolver, two shotguns and a newly banned Norinco M14 rifle — Hurren was arrested by the Royal Canadian Mounted Police (Canada’s federal police force) after a 90-minute standoff. He currently faces 22 charges, including “knowingly utter[ing] a threat to Prime Minister Justin Trudeau.”

    Hurren’s motives remain somewhat unclear. In a handwritten two-page letter found on his person, he expressed a litany of grievances, including fears over the suspension of Parliament due to the ongoing pandemic and the possibility that the country, under Trudeau’s leadership, was on its way to a communist dictatorship. Such a list sits alongside what has been described in the Canadian press as a mixture of “personal despair and financial distress.”

    His exact motivation for this act nonetheless remains nebulous. Hurren has a long history of being drawn to conspiracy theories, including QAnon, a radical-right conspiracy theory detailing a supposed deep-state plot against US President Donald Trump and his supporters. The incident itself took place in the immediate aftermath of a protest on Parliament Hill that saw a few hundred far-right protesters descend on Ottawa to call for the prime minister to be prosecuted for various alleged crimes. These factors have led some analysts, including myself, to wonder whether this represented yet another incident of Canadian military personnel being in ideological alignment with radical-right groups.

    The Threat

    The threat posed by the presence of military personnel in radical-right groups is a growing concern across NATO member countries, but the full extent of the problem remains unclear. Over the past few months, Ondrej Hajn and I have identified 213 individual cases of military personnel from the United Kingdom, Canada, Germany and the United States discharged or prosecuted for their participation in radical-right groups since 2010. Only a fraction of the cases involving soldiers discharged or prosecuted for harboring links to the radical right is accessible through open-source information.

    While these numbers may at first glance seem insignificant compared to the overall size of these nations’ armed forces, two factors are worth bearing in mind. Firstly, publicly available information about individual military personnel involved in radical-right groups is extremely hard to come by. In fact, while our dataset records 14 cases in Canada, an internal document from the Canadian Armed Forces’ (CAF) Military Police Criminal Intelligence Program found that 53 CAF personnel were identified as being part of hate groups between January 2014 and November 2018. This indicates that our dataset only represents the tip of the iceberg.

    Daniel Koehler’s fantastic report on the issue paints a much grimmer picture but stops short of identifying individual cases, as we have sought to do. In fact, in the majority of cases, the information was not disclosed by the military itself but instead comes from media outlets, internet sleuths and law enforcement. Secondly, history has shown us the potentially disastrous consequences of letting radical-right ideologies fester within the ranks of NATO militaries, where violent, racist beliefs can find succor and lead to vigilante-style attacks.

    The threat posed by military personnel in radical-right groups became apparent in the immediate aftermath of the Oklahoma City bombing, when Timothy McVeigh, a US Army veteran radicalized by anti-government rhetoric and interactions with members of radical-right militias, killed 168 people with a truck bomb. However, the events of September 11 and the resultant global war on terror largely sidelined concerns about extremism within military ranks. In 2008, the FBI warned that radical-right groups were “making a concerted effort to recruit active-duty soldiers and recent combat veterans.” The report further highlighted that “military experience is found throughout the white supremacist extremist movement as the result of recruitment campaigns by extremist groups and self-recruitment by veterans sympathetic to white supremacist causes.”

    These warnings would prove to be prophetic, and the overlap between radical-right groups and the military appears to have plagued almost every NATO member country, as radical-right groups have deliberately attempted to recruit individuals with military experience to “exploit their skills and knowledge derived from military training and combat.”

    Hateful Conduct

    Despite clear indications that the presence of military personnel in radical-right groups poses a serious security threat and can undermine unit readiness and successful deployment, Western militaries have been generally tight-lipped about their efforts to root out such individuals. A notable exception is the German Military Counterintelligence Service (MAD), which recently released its first publicly available report on extremism within the German federal defense forces, the Bundeswehr. (An English overview of this report can be found here.)

    Another positive step is last month’s unveiling of the new Canadian Armed Forces policy on hateful conduct. The policy provides a formal definition of hateful conduct as “an act or conduct, including the display or communication of words, symbols or images, by a CAF member, that they knew or ought reasonably to have known would constitute, encourage, justify or promote violence or hatred against a person or persons of an identifiable group, based on their national or ethnic origin, race, colour, religion, age, sex, sexual orientation, gender identity or expression, marital status, family status, genetic characteristics or disability.”

    Defining hateful conduct had previously proven difficult for military brass. Members found to have violated the policy can face administrative or disciplinary action, ranging from mandatory education, counselling and treatment to having their cases investigated by military police.

    Along with the new policy, the Canadian Armed Forces announced that it will be implementing a new system to help monitor and track any suspected incidents of hateful conduct within its ranks. While details about this new system remain scarce, it has been reported that it will resemble the system created to monitor sexual misconduct in the ranks.

    While the new policy and the monitoring mechanism are clearly a step in the right direction, one that acknowledges the severity of the problem and the importance of addressing it in order to make the Canadian Armed Forces a more inclusive organization, they nonetheless fall short on several points. First, the new policy could be seen as a way of potentially decriminalizing hateful conduct within the CAF’s ranks. As argued by Colonel Michel Drapeau, “Under the new policy, the CAF has distanced itself from the Criminal Code, inviting commanding officers and members of the chain of command to treat any such wilful hateful conduct as an administrative, disciplinary matter.”

    Second, cases of hateful conduct will continue to be dealt with behind closed doors, which makes it particularly hard for journalists, scholars and concerned members of the public to examine the full extent of the phenomenon. Without direct access to this data, scholars and the public will have to continue relying on open-source information, which paints only a partial picture. Lastly, and perhaps most importantly, while this new policy and the recent report from the German MAD are encouraging, the phenomenon has yet to be examined and tackled in a comparative way across all NATO countries, signaling a lack of effort to coordinate practices and lessons learned among NATO member states on what is an increasingly transnational terror threat.

    *[The Centre for Analysis of the Radical Right is a partner institution of Fair Observer.]

    The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.

    The Mother of All War Crimes

    As Americans once again struggle with the very idea of having a history, let alone reflecting on its significance, an article in The Nation originally published in 2015 marks the anniversary of the bombings of Hiroshima and Nagasaki. It offers its readers a reminder of events that no one has forgotten but whose monumental significance has been consistently distorted, if not denied.

    Japan’s surrender in 1945 officially ended World War II. It marked a glorious moment in history for the United States. But most serious historians agree on one fact that everyone has insisted on forgetting. The war would have ended without the demonstration of American scientific and military prowess carried out at the expense of hundreds of thousands of Japanese lives.

    If history has any meaning, humanity should have applied to August 6, 1945, the very words President Franklin D. Roosevelt used at the beginning of America’s war with Japan following the attack on Pearl Harbor on December 7, 1941. More than Pearl Harbor, August 6, 1945, should be remembered as “a date which will live in infamy.” 

    In the article, originally published to mark the 70th anniversary of the events that led to the end of World War II, the author, Gar Alperovitz, reminds us that almost every US military leader at the time counseled against dropping the bomb. He cites the testimony of Admiral William Leahy, President Harry Truman’s chief of staff; Henry “Hap” Arnold, the commanding general of the US Army Air Forces; Fleet Admiral Chester Nimitz, commander-in-chief of the Pacific Fleet; and Admiral William “Bull” Halsey Jr., commander of the US Third Fleet.

    All these senior officers agreed that “the first atomic bomb was an unnecessary experiment.” Even Major General Curtis LeMay, who 17 years later tried to push John F. Kennedy into a nuclear war with the Soviet Union during the Cuban missile crisis of 1962, agreed that “the atomic bomb had nothing to do with the end of the war at all.”

    General Dwight Eisenhower, the future president, also believed “that Japan was already defeated and that dropping the bomb was completely unnecessary.” But Eisenhower added a consideration of profound geopolitical importance, one that directly contradicts the pretext given by the government and repeated ever since in the official narrative: that thousands of American soldiers would have died in a final assault on Japan. “I thought that our country should avoid shocking world opinion by the use of a weapon whose employment was, I thought, no longer mandatory as a measure to save American lives,” he said.

    Here is today’s 3D definition:

    World opinion:

    The understanding people across the globe have of how a hegemonic power works for or against their interests, a phenomenon that hegemonic powers learn to ignore as soon as they become convinced of the stability and durability of their hegemony

    Contextual Note

    World War II marked a sea change in geopolitics. It ushered in the era of technological rather than purely military and economic hegemony. The real point of the bomb was to provide a graphic demonstration of how technological superiority, rather than mere economic and military clout, would define hegemony in the decades to come. That’s why the US has been able to consistently lose wars but dominate the global economy.

    “President Truman’s closest advisers viewed the bomb as a diplomatic and not simply a military weapon,” Alperovitz writes. It wasn’t just about ending the war but modeling the future. Truman’s secretary of state, James Byrnes, “believed that the use of atomic weapons would help the United States more strongly dominate the postwar era.” He seemed to have in mind the “military-industrial complex” that Eisenhower would later denounce.

    Eisenhower’s prediction about world opinion in the aftermath of the nuking of Japan was apparently wrong. Polls taken in 1945 showed that only 4% of Americans said they would not have used the bomb. Relieved to see the war over, the media and governments across the globe made no attempt to mobilize world opinion against a manifest war crime.

    On the basis of the letters to the editor of The Times, one researcher nevertheless reached the conclusion that, in the UK, a majority of “civilians were outraged at the atomic bombings of Hiroshima and Nagasaki.” This probably reflects opinion across most of Europe. The Vatican roundly condemned the use of nuclear weapons, even two years before the bombing of Japan and then again after the war, but it had little impact on public opinion.

    Focused on the drama of the Nuremberg trials rather than the mass destruction in Japan, the nations of the world very quickly adjusted to the inevitability of living with the continued presence of nuclear bombs. They even accepted the bomb as a stabilizing norm in what quickly became the Cold War’s nuclear arms race. After all, the idea of mutuality in the strategy of mutually assured destruction seemed to keep things in some sort of precarious balance.

    With history effectively rewritten in a manner agreeable to the hegemony-minded governments of the US, American soft diplomacy — spearheaded to a large extent by Hollywood — did the rest. The American way of life almost immediately became a global ideal, only peripherally troubled by Godzilla and other disturbing radioactive mutants.

    Takeshi Matsuda explained in a 2008 article in the Asia-Pacific Journal: “By the end of World War II, the U.S. government had recognized how important a cultural dimension of foreign policy was to accomplishing its broad national objectives.” Those “national objectives” had clearly become nothing less than global hegemony.

    Historical Note

    Post-World War II history contains a cruel irony. An inhuman nuclear attack on Japanese civilians came to be perceived as the starting point of a new world order under the leadership of the nation that perpetrated that attack. The new world order has ever since been described as the “rule of law.”

    Because the new order relied on the continued development of nuclear weapons, it might be more accurate to call it a “rule of managed terror.” It was built on the notion of fear. Over the following decades, the vaunted rule became increasingly dependent on a combination of expanding military might, mass surveillance, technological sophistication and the capacity of operational weapons to strike anywhere with great precision but without human intervention.

    In his article, Gar Alperovitz quotes a pertinent 1946 remark by Admiral William “Bull” Halsey Jr., who called “the first atomic bomb … an unnecessary experiment. … It was a mistake to ever drop it … [the scientists] had this toy and they wanted to try it out, so they dropped it.” But Halsey was mistaken. The scientists didn’t drop the bombs. The politicians — especially Harry Truman, with whom the buck was destined to stop — ordered it. And bomber pilots did the dropping. But Halsey’s intuition about the rise of technology as the key to hegemony was correct.

    Whether Truman understood what was happening, or whether he was an unwitting tool of a group of American Dr. Strangeloves (the former Nazis were already being recruited), no historian has been able to determine. Fox News journalist Chris Wallace, in his book on Truman and the bomb, claims that the president “agonized over it,” as well he should have. 

    The problem that remains for those who seek to understand the significance of our global history is that once the deed was done, Truman’s and everyone else’s agonizing ended. Shakespeare’s Macbeth famously “murdered sleep,” but America’s official historians, in the years following Hiroshima, succeeded in putting the world’s moral sense to sleep.

    Humanity is still on the verge of nuclear annihilation. Some of the bellicose discourse we hear today may be bluff. But the US military has drawn up concrete plans for a nuclear war with China, and preparations for that war are already taking place. As journalist John Pilger points out, US Secretary of State Mike Pompeo has been pushing hard to foment a war mentality among the American public, partly because it is part of Trump’s reelection strategy and partly because Pompeo is “an evangelical fanatic who believes in the ‘rapture of the End.’”

    World opinion, if our democracies knew how to consult it, would undoubtedly prefer the plain and simple annihilation of our nuclear capacity. But the dream of a democracy of humanity, in the place of competing nation-states, dwells only in an obscure political and psychological limbo, existing as something between an empty promise and wishful thinking.

    *[In the age of Oscar Wilde and Mark Twain, another American wit, the journalist Ambrose Bierce, produced a series of satirical definitions of commonly used terms, throwing light on their hidden meanings in real discourse. Bierce eventually collected and published them as a book, The Devil’s Dictionary, in 1911. We have shamelessly appropriated his title in the interest of continuing his wholesome pedagogical effort to enlighten generations of readers of the news. Click here to read more of The Daily Devil’s Dictionary on Fair Observer.]

    The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.

    Is Europe More United Than the US?

    During the Trump era, America increasingly seems like a motley collection of states brought together for reasons of territorial contiguity and little else. The conservative South is ravaged by a pandemic. The liberal Northeast waits patiently for elections in November to oust a tyrant. A rebellious Pacific Northwest faces off against federal troops sent to “restore order.” The Farm Belt, the Rust Belt and the Sun Belt are like three nations divided by a common language.

    The European Union, on the other hand, really does consist of separate countries: 27 of them. The economic gap between Luxembourg and Latvia is huge, the difference in median household income even larger than that between America’s richest and poorest states (Maryland and West Virginia).

    European countries have gone to war with each other more recently than the American states (a mere 25 years ago in the case of former Yugoslavia). All EU members are democracies, but the practice of politics varies wildly from perpetually fragmented Italy to stolid Germany to ever-more illiberal Hungary.

    Despite these economic and political differences, the EU recently managed to perform a miracle of consensus. After 90 hours of discussion, EU leaders hammered out a unified approach to rebuilding the region’s post-pandemic economy.

    The EU is looking at an 8.7% economic contraction for 2020. But the coronavirus pandemic clearly hit some parts of the EU worse than others, with Italy and Spain suffering disproportionately. Greece remains heavily indebted from the 2008-09 financial crisis. Most of Eastern Europe has yet to catch up to the rest of the EU. If left to themselves, EU members would recover from the current pandemic at very different rates and several might not recover at all.

    That’s why the deal is so important. The EU could have helped out its struggling members by extending more loans, which was basically the approach after 2009. This time around, however, the EU is providing almost half of the money in the new recovery fund — $446 billion — in grants, not loans. The $1.3-trillion budget that European leaders negotiated for the next seven years will keep all critical EU programs afloat (like the European structural and investment funds that help bridge the gap between the wealthier and the less wealthy members).

    Sure, there were plenty of disagreements. The “frugal four” of the Netherlands, Denmark, Austria and Sweden argued down the amount of money allocated to the grant program and the budget numbers overall. Germany has often sided with the frugal faction in the past, but this time Chancellor Angela Merkel played a key role in negotiating the compromise. She also managed to bribe Hungary and Poland to support the deal by taking “rule-of-law” conditionality off the table. Both countries have run afoul of the EU by violating various rule-of-law norms with respect to media, judiciary and immigration. Yet both countries will still be able to access billions of dollars from the recovery fund and the overall budget.

    Until recently, the EU seemed to be on the brink of dissolution. The United Kingdom had bailed, Eastern Europe was increasingly authoritarian, the southern tier remained heavily in debt, and the pandemic was accelerating these centrifugal forces. But now it looks as though the EU will spin together, not apart.

    The United States, on the other hand, looks ever more in disarray. As Lucrezia Reichlin, professor of economics at the London Business School, put it, “Despite being one country, the U.S. is coming out much more fragmented than Europe.”

    The Coming Storm

    The Trump administration has been all about restarting the US economy. President Donald Trump was reluctant to encourage states to lock down in the first place. He supported governors and even armed protesters demanding that states reopen prematurely.

    And now that the pandemic has returned even more dramatically than the first time around, the president is pretending that the country isn’t registering over 60,000 new infections and over a thousand deaths every day. Trump was willing to cancel the Florida portion of the Republican Party convention for fear of infection, but he has no problem insisting that children hold the equivalent of thousands of mini-conventions when they return to school.

    Europe, which was much more stringent about prioritizing health over the economy, is now pretty much open for business.

    The challenge has been summer tourism. Vacationers hanging out on beaches and in bars are at heightened risk of catching the COVID-19 disease — which is caused by the novel coronavirus — and bringing it home with them. There have been some new outbreaks of the disease in Catalonia, an uptick in cases in Belgium and the Netherlands, and a significant increase in infections in Romania. Belgium is already re-instituting restrictions on social contacts. Sensibly, a number of European governments are setting up testing sites for returning tourists.

    The EU is determined not to repeat what’s going on in Florida, Texas and California. It is responding in a more deliberate and unified way to outbreaks causing an average of 81 deaths a day than the United States as a whole is responding to a very nearly out-of-control situation producing more than 900 deaths a day.

    The US isn’t just facing a deadly resurgence of the pandemic. Various economic signals indicate that the so-called “V-shaped recovery” — much hyped by the Trump administration — is just not happening. More people are again filing for unemployment benefits. People are reluctant to go back to restaurants and hang out in hotels. The business sector in general is faring poorly.

    “The sugar rush from re-openings has now faded and a resurgence of domestic coronavirus cases, alongside very weak demand, supply chain disruptions, historically low oil prices, and high levels of uncertainty will weigh heavily on business investment,” according to Oren Klachkin, lead US economist at Oxford Economics in New York.

    The Organisation for Economic Co-operation and Development (OECD) released a report in July that offered two potential scenarios for the US economy through the end of the year. Neither looks good. The “optimistic scenario” puts the unemployment rate at the end of 2020 at 11.3% (more or less what it is right now) and the overall economic contraction at 7.3%. In the pessimistic scenario, the unemployment rate would be nearer to 13% and the economic contraction 8.5%.

    Much depends on what Congress does. The package that Senate Republicans unveiled last week is $2 trillion less than what the Democrats have proposed. It offers more individual stimulus checks, but nothing for states and municipalities and no hazard pay for essential workers.

    Unemployment benefits expired a few weeks ago, and Republicans would only extend them at a much-decreased level. Although Congress will likely renew the eviction moratorium, some landlords are already trying to kick out renters during the gap. The student loan moratorium affecting 40 million Americans runs out at the end of September.

    The only sign of economic resurgence is the stock market, which seems to be running entirely on hope (of a vaccine or a tech-led economic revival). At some point, this irrational exuberance will meet its evil twin, grim reality. On the other side of the Atlantic, the Europeans are preparing the foundation for precisely the V-shaped recovery that the United States, at the moment, can only dream about.

    The Transatlantic Future

    What does a world with a stronger Europe and a weaker America look like? A stronger Europe will no longer have to kowtow to America’s mercurial foreign policy. Take the example of the Iran nuclear deal, which the Obama administration took the lead in negotiating. Trump not only canceled US participation, but he also threatened to sanction any actors that continued to do business with Iran. Europe protested and even set up its own mechanisms to maintain economic ties with Tehran. But it wasn’t enough. Soon enough, however, the United States won’t have the economic muscle to blackmail its allies.

    The EU has certainly taken a tougher stance toward China over the last couple of years, particularly on economic issues. But in its negotiations with Beijing, the EU has also put far greater emphasis on cooperation around common interests. As such, expect the European Union to take full advantage of the US decline to solidify its position in an East Asian regional economy that is recovering from the pandemic far more quickly than pretty much anywhere else in the world.

    Europe is also well-positioned to take the lead on climate change issues, which the United States has forfeited in its four years of catastrophic backsliding under Trump. As part of its new climate pact, the EU has pledged to become carbon-neutral by 2050. The European Commission is also considering a radical new idea: a carbon tax on imports. In the future, if you want to be competitive in selling your products in the European market, you’ll have to consider the carbon footprint of your operation.

    Of course, the EU could do better. It is not a demilitarized space, it has a very mixed record on human rights conditionality, and its attitudes toward immigration range from half-welcoming to downright xenophobic. But compared to the US, Russia or China, it’s way out in front.

    But let’s say that Europe emerges from this pandemic with greater global authority, much as the US did after World War II. A lot of Americans, and most American politicians, will bemoan this loss of status. But a world led by a unified Europe would be a significantly better place than one mismanaged by a fragmented United States.

    *[This article was originally published by FPIF.]

    The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.

    Was the First Gulf War the Last Triumph of Multilateralism?

    This week marks the 30th anniversary of Iraq’s invasion and occupation of Kuwait. Desperate to pay off his nation’s seemingly insurmountable debt, acquired through his invasion of Iran and the futile eight-year war that had just ended, Saddam Hussein saw oil-rich Kuwait as the solution. Iraq had never recognized Kuwait’s sovereignty, claiming it had been hived off by the British during their occupation of Iraq in the early 20th century. Moreover, as he and many Iraqis asserted, it really was Iraq’s “19th province.”

    Saddam deployed Iraqi troops to the border in July of 1990, prompting concern among neighboring Arab countries and the United States. In a much-reported meeting with then-US Ambassador April Glaspie late in July, he was asked about his intentions. Glaspie took pains to explain that the US had “no opinion” on Arab-Arab disputes, further expressing the US hope that the Iraqi-Kuwait border question might be resolved soon and without the use of force. (Egypt had been trying to mediate the dispute.) Saddam interpreted her response as an American green light to invade, as egregious a misinterpretation of a diplomatic communication as there ever was.

    A Multilateral Approach

    Within hours of the August 2 invasion, the UN Security Council convened and ordered Iraq’s immediate withdrawal. It was ignored by Saddam, as were multiple subsequent UNSC resolutions. Saddam did not believe that the US or any other nation would take action to defend the small patch of desert at the end of the Persian Gulf, despite its outsize oil wealth and massive reserves.

    He was wrong. Under the leadership of President George H. W. Bush and his able secretary of state, James Baker, the US organized a 34-nation coalition, including many Arab states and NATO allies. Armed with a UNSC resolution authorizing “all necessary means” if Saddam did not withdraw his forces by the January 15 deadline, the US and other coalition forces began assembling in Saudi Arabia, which many feared would be the next target of Saddam’s ambitions. Facing more than 650,000 troops and a massive US, British and French air assault, Iraqi forces were driven out of Kuwait. The campaign, whose ground phase lasted roughly 100 hours, cost coalition forces some 300 deaths, including 146 Americans. Iraqi casualties were never officially ascertained, but estimates range from 20,000 to 26,000 killed and 75,000 injured. Over 1,000 Kuwaitis also died, mostly civilians.

    The Kuwait incursion proved even more humiliating and costly than Iraq’s ill-fated invasion of Iran. Numerous and increasingly costly sanctions (including on critical oil exports), intrusive UN weapons inspectors and expansive no-fly zones in the country’s north and south decisively placed Iraq in pariah-nation status in the world. Ultimately, it set the stage for the American invasion and occupation of Iraq and Saddam’s removal in 2003.

    Leadership When It Counted

    The First Gulf War marked a significant achievement for American diplomacy, one that would be difficult to replicate today. Though Saddam remained unmoved by American warnings and UNSC resolutions and sanctions, the international community proceeded deliberately and with restraint before employing force. UNSC Resolution 678, which authorized the use of force, obtained 12 affirmative votes, including from four of the five permanent members (China abstained), with only two votes against (Cuba and Yemen).

    Deft diplomacy on the part of Bush and Baker attracted 33 other nations to the coalition that expelled Saddam’s forces. Secretary Baker met on several occasions with Saddam’s foreign minister, Tariq Aziz, in an effort to resolve the crisis. This was a marked contrast to George W. Bush’s approach to, and eventual invasion of, Iraq in 2003, which failed to secure UNSC approval and incurred considerable worldwide condemnation.

    Importantly, despite a virtually open road to Baghdad and against the urgings of some in the US at the time, in 1991 President Bush withdrew all US forces from Iraq and did not seek to remove Saddam. This proved to be critical in maintaining the unprecedented coalition he had organized to address a Middle East crisis. Bush Sr. was able to capitalize on that achievement by assembling world leaders in Spain later that fall for the Madrid Conference, co-sponsored by the Soviet Union, which brought together many of the same Arab countries from the coalition, plus Israel and the Palestinians, to address the Arab-Israeli conflict. The conference became a stepping stone for increased engagement among many Arab countries, the Palestinians and Israel, and for the progress that followed.

    The Era of Great Power Rivalry

    The First Gulf War itself and what followed demonstrated what principled, deft and concerted diplomacy on the part of the US can achieve. Clearly, the task remains significantly short of its ultimate goal. But the hope of reaching it seems all the more distant as the US under President Donald Trump eschews the Bush/Baker approach to multilateral diplomacy in favor of narrow, one-sided bilateral diplomacy. The latter has proven to be a contributing factor in the region’s — and perhaps the world’s — decided move toward “great power” competition.

    Nations as diverse as Russia, China, Turkey, Iran, Saudi Arabia, the UAE and others now vie for increased influence and even dominance in the Middle East and elsewhere. Never a partisan in great power competition, the US now stands strangely quiet on the sidelines as these nations attempt to carve out spheres of influence, from the Crimea and Ukraine, to South and Central Asia, the Far East and the Middle East. For some of the peoples of the Middle East — Syria, Yemen and Libya — this has meant misery and devastation, and for the rest of the region, instability, uncertainty and fear. US-led multilateralism at a time when it stood unparalleled in military, political and economic power in the world helped address a genuine Middle East crisis 30 years ago. In that sense, America’s and the world’s actions in Iraq may very well have been the mythical “good” war in the Middle East, as much an oxymoron as that may sound.

    In an era of great-power maneuvering, it is hard to imagine a similar response now in the event of another crisis between nations of the region, say Iran and Saudi Arabia. With rival major powers choosing sides, one could more easily envision competing alliances being drawn up, culminating in the sort of conflict the world saw in Europe in World War I.

    Great-power competition seldom, if ever, leads to stability or peace. World War I amply proved that. The example of the First Gulf War, however, proved that multilateralism, especially when led by a powerful but principled nation, can defuse escalating tensions, avert greater disaster and provide at least the prospect of, and a framework for, peace and stability.

    The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.

    The Latest Version of Russiagate

    The New York Times keeps slogging away at a four-year-old theme that it refuses to allow to die a natural death. Should we call it Russiagate 2.0 or 3.0 or 7.0? Whatever we call it, Russiagate has made its way back into The NYT’s headlines. Perhaps we should adopt the same convention as the health authorities who called the disease caused by the novel coronavirus COVID-19 because it first appeared in 2019. So, this could be Russiagate-20, although the number of minor versions that have appeared since the beginning of the year might make it Russiagate-20.3.

    The latest article’s title is “Russian Intelligence Agencies Push Disinformation on Pandemic,” followed by the subtitle, “Declassified U.S. intelligence accuses Moscow of pushing propaganda through alternative websites as Russia refines techniques used in 2016.”

    The logic of the crime perpetrated by the recidivist known as Russia is well-known. The scenario is as familiar as any Hollywood remake. The authors of the article, Julian E. Barnes and David E. Sanger, want to make sure that the new variation on a story about Russian interference with American democracy does not suffer from the criticism leveled at anticlimactic events such as the Mueller report. Some will remember that in August 2019, The Times’ executive editor, Dean Baquet, embarrassingly admitted that the paper was “a little flat-footed” when it doggedly followed an editorial line that consisted of hyping Russiagate on the pretext that it looked “a certain way for two years.” It was the look that kept the story alive even though the narrative contained no substance.

    To make their point about the seriousness of this story, Barnes and Sanger take the trouble to cite, though not to name, “outside experts” who can confirm its reality. “The fake social media accounts and bots used by the Internet Research Agency and other Russia-backed groups to amplify false articles have proved relatively easy to stamp out,” The Times reports. “But it is far more difficult to stop the dissemination of such articles that appear on websites that seem legitimate, according to outside experts.”

    Here is today’s 3D definition:

    Dissemination:

    A synonym for publication that subtly suggests something underhanded, implying that the content of what is being broadcast consists of lies or disinformation

    Contextual Note

    What all these stories boil down to is a pair of simple facts with which readers should now be familiar. The first is the revelation that Russians and, more particularly, Russian intelligence agencies lie, just in case readers weren’t aware of that. The second is that the Russians are clever enough to get at least some of their lies published on the internet.

    For these well-known and oft-repeated “truths” to become newsworthy, the reader must believe something exceptional has occurred, following the man-bites-dog principle. The exceptional fact The Times wants its readers to understand is that, unlike the stories that looked “a certain way” for two years with reference to the 2016 US presidential election, this one is no remake. It is undeniably news because it is about the COVID-19 pandemic, which only became an issue this year.

    To the discerning reader, the message is exactly the same as the idea behind the “flat-footed” campaign Baquet mentioned. But the content has changed. In both cases, processing the message requires that readers accept the implicit premise that Russians have a monopoly on lying or, alternatively, that that’s the only thing Russians know how to do. They are the only people on earth who invest in inventing contestable takes on the news and getting their lies published on the internet. There can be no legitimate reason to suspect any other nation, especially the United States, of telling lies about other nations and even managing to get them published on the web. How does The Times know that? Because its anonymous sources hailing from the very reliable US intelligence agencies have dutifully provided it with the data.

    If the story had focused only on COVID-19, it probably would not have justified a full-length article. Understanding this, the journalists sought evidence of Russian interference on “a variety of topics,” including a major one: NATO. “The government’s accusations came as Mandiant Threat Intelligence, part of the FireEye cybersecurity firm, reported that it had detected a parallel influence campaign in Eastern Europe intended to discredit the North Atlantic Treaty Organization,” Barnes and Sanger write.

    How extraordinary, Times readers must be thinking, that Russia might be trying to discredit NATO. That really is news, at least for anyone who has failed to pay attention to everything that has happened in Eastern Europe since the fall of the Berlin Wall in 1989. Do readers of The New York Times belong to that category of deeply (or simply willfully) ignorant readers of the news? The Times has, after all, published a few articles, at least since 1994, alluding to what historians now understand was a persistent betrayal by Western powers of the promises made to Russian leaders Mikhail Gorbachev and Boris Yeltsin not to expand NATO… before aggressively doing the contrary over decades.

    In an article in The Nation from 2018, the distinguished Russia expert Stephen Cohen highlighted the role of Western media — and The New York Times, in particular — in failing (or refusing) to cover that ongoing drama. It should surprise no one that even today, The Times not only neglects that vital bit of context, but it also uses its feigned ignorance to express its shock at the idea that the Russians might feel impelled to discredit NATO in Eastern Europe. This is not a case of Russian meddling in US elections. It’s an attempt to limit the damage the Russian government feels has resulted from Western perfidy.

    The latest Times article doesn’t stop there. It offers us this insight: “While the Mandiant report did not specifically name Russia and its intelligence agencies, it noted that the campaign was ‘aligned with Russian security interests’ in an effort to undermine NATO activities.” In other words, the reporters admit there is no direct evidence of Russian involvement. They simply expect Times readers to conclude that because there appears to be an “alignment,” Russia is to blame. This is a perfect encapsulation of everything that took place around Russiagate. Alignment is proof of collusion.

    Historical Note

    During the Cold War, Americans were thrilled to find their vocabulary enriched when the word “propaganda,” derived from Latin, was imported from their enemy, the Soviet Union. The term literally means “what is to be propagated.” The Soviets used it as the official term to describe their communications operations, modeled on the same logic as the Voice of America. In both cases, it was all about teaching third parties why their system was better than their opponent’s.

    Americans sneered at the dastardly evil concept of propaganda. They clearly preferred the idea of PR (public relations). This was about the time that Vance Packard’s best-seller, “The Hidden Persuaders,” revealed how — as The New Yorker described it at the time — “manufacturers, fundraisers and politicians are attempting to turn the American mind into a kind of catatonic dough that will buy, give or vote at their command.”

    The monumental effort of Madison Avenue stepping in to dominate a rapidly expanding economy conveniently distracted most people’s attention from the magnificent work the CIA was undertaking across the globe in the scientific (or pseudo-scientific) dissemination of misinformation. The more Americans suspected advertising was lying to them, the less concerned they were by the skullduggery of the military-industrial complex and its intelligence agencies. It clearly went well under their radar as they focused on consumer pleasures.

    That gave the US a double advantage over the Soviet Union. It had two powerful industries working in parallel to feed a regular diet of lies to the American people, whereas the Soviet Union had only its government to supply its citizens with glaringly obvious lies, which Russians were already beginning to receive with growing skepticism. The US enjoyed a further advantage in that the fun of advertising and the pleasures of the consumer society took the sting out of Americans’ growing awareness that they, too, were being constantly lied to.

    Can there be any doubt today that The New York Times is committed to propaganda? Like most of the media sympathetic to the Democratic Party, it not only accepts uncritically the “assessments” of the intelligence community, but it also amplifies their messages. It even extrapolates to draw conclusions the agencies themselves dare not affirm.

    If the notion of dissemination has a negative connotation linked to the idea of propaganda, The New York Times is a master disseminator.

    *[In the age of Oscar Wilde and Mark Twain, another American wit, the journalist Ambrose Bierce, produced a series of satirical definitions of commonly used terms, throwing light on their hidden meanings in real discourse. Bierce eventually collected and published them as a book, The Devil’s Dictionary, in 1911. We have shamelessly appropriated his title in the interest of continuing his wholesome pedagogical effort to enlighten generations of readers of the news. Click here to read more of The Daily Devil’s Dictionary on Fair Observer.]

    The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.

    Content, Not Culture, Separates Americans

    It has finally come to pass in America that armed bands of federal government thugs in camouflage gear are taking over parts of selected US cities to serve the interests of the country’s fearful “leader” and autocrat-in-chief. At the behest of Trump and his stooge attorney general, unidentified Department of Homeland Security troops have swooped in to bring “law and order” to citizens hoping for some measure of police reform and racial justice. So, while the coronavirus pandemic rages out of control in the face of a chaotic response by the same federal government, Trump has decided to augment his failure by doubling down on leftists, socialists, anarchists and communists. This is real, it is an old playbook and it should be very scary.

    Armed and empowered federal government personnel with absolutely no training in dealing with citizen protests or protesters are being unleashed to confront largely peaceful demonstrators in America who are imploring their government to reduce police violence and address racial injustice. Local leaders and police commanders are confronted with an armed force that they have not asked for and that they do not want. This is American citizens being terrorized by American government personnel, ironically at the command of the federal Department of Homeland Security established to protect us from terrorists.

    To be sure, this is largely theater. But it is theater that should shock anyone in America who smugly thought that the “land of the free” would never look like “other” despotic lands. It has been a very long time since America has come this close to rock bottom. As a nation, America is an international laughingstock, mocked by all those despots we bribed over the years to transform their way into our way, the American way. But guess what? We didn’t see it coming, but their way has now become our way.

    Turn on the news anywhere in the world, and it will feature some daily tale of woe from America. Turn on the news in America, and it is all a tale of American woe. Yet despite the perception that America has found new lows, amid pandemic and social strife, there is a palpable disconnect between the depth of the problems and a serious consensus about the solutions. As is often the case in America, this situation is a big problem in search of a label that will ensure that not much changes anytime soon.

    Every politician and pundit in the land seems to have settled on something called the “culture wars.” It seems so easy in the facile world in which we live to provide cover for complex problems by finding a meaningless catchy phrase that everyone can define for themselves instead of facing reality, particularly the reality of others.

    “Culture Wars”

    Today, everywhere you turn in American politics, “culture wars” are trotted out to explain away all manner of dysfunction in government and society. I am not sure what that term means. “Culture war” has been defined as “a conflict or struggle for dominance between groups within a society or between societies, arising from their differing beliefs, practices, etc.” The “etc.” at the end of this definition should be a clue that “culture war” means essentially whatever you want it to mean. What kind of definition is that?

    Before there was the coronavirus pandemic, there was culture everywhere. Want to see a play? Go for it. If art or anthropology is your interest, museums abound. Even a movie, particularly when called “cinema” or “film,” can qualify as a good solid cultural experience. Then there is the whole world of international and local cuisines, more cultural experience. Wines, beers, whiskeys, full of culture. When I think of culture, this is what I think of, along with the rich tapestry that defines some of who we are.

    Somehow a war based on a film I like, what cuisine I choose to eat or the sports team I choose to root for seems trivial and even unlikely. So, a “culture war” must mean something deeper than that. It must mean, for example, that if you pay attention to public health experts in response to a pandemic, you are on one team and if not, you are on the other team. What a clever way to gloss over stupidity and ignorance.

    “Culture war” also implies something ingrained that cannot be altered or influenced by new ideas, new knowledge or new experience. However, the paralyzing conflict that we are enduring in America is routinely influenced by new ideas and new experiences. It is a policy conflict, a conflict over how best to address real human problems with a policy response. And much of it is driven by an individual’s momentary perception of the role of government in meeting these human challenges. 

    I truly dislike Senator Mitch McConnell, but we are both old white men who drink quality bourbon and could share a cigar now and again. What we disagree about is not culture, but content.

    As another example of what I am trying to convey, the urge to own a gun in America surely does not reflect the groupthink at the core of the “culture war” definition. The reasons for arming oneself, or choosing not to, cross every demographic and social line — that rich white couple in Missouri armed and ready in their front yard as protesters walked by would share little of cultural significance with a poor white subsistence hunter or a young, inner-city Latino gangbanger. It is highly unlikely that these disparate gun owners ever cross each other’s paths except as casual observers inspecting the oddities of each other’s cultural foundation.

    I am sorry to take a dump on everyone’s latest label, but I am really tired of labels being used as a substitute for responsibility. If you choose to be ignorant, you can meet others like you at your church, your country club, your gym or your city council meeting. Willful ignorance is found in all cultures. It is a shame that it is so common and so misunderstood as the root of much of what separates us.

    That is not a cultural statement. We are not engaged in a “culture war.” We are engaged in a confrontation to define a better America and to find the policy solutions that will lead us there. This is America’s “war” for its future, not some wistful search for cultural reconciliation.

    *[A version of this article was featured on the author’s blog, Hard Left Turn.]

    The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.

    Slaves Picked Cotton, Senator Cotton Picks a Fight with History

    History has always been one of the biggest sources of embarrassment for the United States. The liberated colonists left European history behind when they declared independence. Americans ever since have demonstrated an obsessive focus on the present and the future, believing the past is irrelevant. American culture treats history as a largely forgettable litany of loosely related events, the best of which serve to prove that the entire “course of human events” (Thomas Jefferson) has served a divinely ordained purpose: to elevate to dominance “the greatest country in the history of the world” (Senator Rick Scott), consolidating its power and affirming its global leadership.

    In the midst of the Civil War, President Abraham Lincoln resorted to some rhetorical trickery to get his audience in Gettysburg to think about the history of the nation’s founding. He caught the public’s attention by proposing an exercise in mental calculation, testing their skills at math while invoking historical facts. Challenged to make sense of the circumlocution “four score and seven years ago,” his listeners had to multiply 20 (one score) by four and add seven to arrive at the sum of 87, and then count backward to arrive at 1776, the year of Jefferson’s Declaration of Independence.

    The success of Lincoln’s Gettysburg Address now stands as just one more isolated fact in the timeline of history. It should be remembered not only as a moment of inspired political thought and patriotic expression, but also for its clever rhetorical ploy to focus the audience’s attention on history. 

    Today’s creative teachers might do well to follow Lincoln’s example. With the right rhetoric they could encourage their students to think things out instead of simply subjecting them to boring lectures that present history as a sequence of anecdotes largely devoid of context and meaning. Of course, today’s teachers are no longer in a position to teach due to the coronavirus. And even if they could, they would be expected to focus on STEM (science, technology, engineering and math) instead of history.

    This year’s lockdown caused by COVID-19 has given Americans more time to think. The ongoing protests against police brutality and racial inequality have forced a renewed discussion about the nation’s founding and its historical logic. In 2019, The New York Times promoted a project aimed at understanding the crucial role slavery played in building the colonial economy and structuring the nation that emerged from it in the late 18th century. Called The 1619 Project, it focused on the annoying fact that the first permanent English settlement in Virginia inaugurated the practice of importing African slaves in 1619, a year before the arrival of the Pilgrims in New England.

    Senator Tom Cotton of Arkansas was sufficiently annoyed to propose a law that would ban the results of the project from being taught in schools. He explained: “We have to study the history of slavery and its role and impact on the development of our country because otherwise we can’t understand our country. As the founding fathers said, it was the necessary evil upon which the union was built, but the union was built in a way, as Lincoln said, to put slavery on the course to its ultimate extinction.”

    Here is today’s 3D definition:

    Necessary:

    1. Required by the logic of events to attain a certain goal.

    2. When applied to the history of the United States, ordained by Providence in its plan to elevate American capitalism to the status of paragon of both political and economic organization.

    Contextual Note

    Realizing that the idea of a “necessary evil” sounded like an excuse for racism, Cotton “claimed he was citing the views of America’s founding fathers, rather than his own.” Some might interpret that as aggravating the offense, since it calls into question the judgment of the founders, generally considered by Republicans to be secular saints called upon by the divinity to establish the most perfect nation on earth. If the founders thought slavery was both evil and necessary, this brands them as either hypocrites or flawed political thinkers.

    The historians who have commented on Cotton’s assertion that slavery was a necessary evil have pointed out that there is no instance of any of the founders taking and defending this position. Pressed to reveal his own views, Cotton distanced himself from the cynical founders: “Of course slavery is an evil institution in all its forms, at all times in America’s past, or around the world today.”

    When pressed further by Brian Kilmeade on Fox News, Cotton offered this explanation: “What I said is that many founders believed that only with the Union and the Constitution could we put slavery on the path to its ultimate extinction. That’s exactly what Lincoln said.” There is of course no evidence that “many founders” believed that the mission embodied in the Constitution was to phase out slavery. Furthermore, Lincoln never said “exactly” any such thing.


    Cotton believes history should not be thought of in terms of acts and deeds or the nature of institutions and their workings, but simply remembered for its stated ideals. Here is how he frames it: “But the fundamental moral principle of America is right there in the Declaration [of Independence]. ‘All men are created equal.’ And the history of America is the long and sometimes difficult struggle to live up to that principle. That’s a history we ought to be proud of.”

    Does he really think that learning about the reality of slavery and its role in building the nation’s economy will prevent students from being proud of their country? Cotton seems to believe that studying the documented facts about the nation’s past rather than simply admiring the edifying text of a slaveholder who claimed to believe in equality is a form of perverse revisionism. 

    The question being asked today by vast swaths of the US population — and not only those protesting in the streets — concerns precisely the point Cotton mentions: the “difficult struggle to live up to that principle.” He seems to believe that the struggle ended long ago and merits no further consideration. Mission accomplished. But if he were sincere, he would highlight the fact that if we want to live up to the principle, we should examine the facts rather than simply parrot the principle.

    Historical Note

    Cotton was specific in his complaint about The 1619 Project. He called it “a racially divisive, revisionist account of history that denies the noble principles of freedom and equality on which our nation was founded. Not a single cent of federal funding should go to indoctrinate young Americans with this left-wing garbage.” Though it would be difficult to find any logical structure to this assertion, Cotton implies that denying “the noble principles of freedom and equality” is what makes the project “racially divisive.” 

    Acknowledging the fact that the principles of freedom and equality he vaunts cannot apply to slavery does not amount to denying the principles. On the contrary, it asserts their importance by signaling the historical contradictions that not only should have been taken into account in 1789 (when the Constitution came into force), but also in 1865 (at the end of the Civil War), as well as in 1964 (when the Civil Rights Act was passed) and in 2020, when the whole question has emerged again after the brutal death of George Floyd in Minneapolis.

    The real problem lies in the idea of a “necessary evil.” How does Cotton justify the concept? One might argue that Officer Chauvin’s killing of George Floyd was the evil that was necessary to provoke today’s protests. And the protests may have the effect of changing things to make the nation less racist than it was before. But an evil act by an individual cannot be compared with an institution, an economy and a way of life, which is what slavery was.

    To call something necessary means it is required for some purpose. What is that purpose? Senator Cotton seems to suggest it was the abolition of slavery. And in purely logical terms, he’s right. Slavery couldn’t be abolished if it didn’t exist. Long live the great institutions of the past, especially the ones that foresaw their own abolition.

    *[In the age of Oscar Wilde and Mark Twain, another American wit, the journalist Ambrose Bierce, produced a series of satirical definitions of commonly used terms, throwing light on their hidden meanings in real discourse. Bierce eventually collected and published them as a book, The Devil’s Dictionary, in 1911. We have shamelessly appropriated his title in the interest of continuing his wholesome pedagogical effort to enlighten generations of readers of the news. Click here to read more of The Daily Devil’s Dictionary.]

    The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.


    A Master Class in Linguistic Suppression

    Boston’s local National Public Radio news station, WBUR, recently interviewed Rutgers University linguist Kristen Syrett, an advocate for pushing the sacred cause of political correctness beyond its currently accepted boundaries. Presumably in the name of racial justice, Syrett wants to root out every conscious or unconscious reference in the English language to the institution of slavery.

    Because plantation slave owners were referred to by slaves as “master,” or “massa” in the black vernacular pronunciation, Syrett believes the expression “master bedroom” should be expunged from the language of real estate. In the program, she and the interviewer, Robin Young, approve of the initiative to change the name of the Masters golf tournament held at Augusta National, doubly culpable because in the past its caddies “had to be black.” It may seem odd that she has nothing to say about chess masters and grandmasters. This oversight seems even odder because chess is a game that pits a black army against a white one, which always has the first move.

    Syrett explains why the master bedroom must disappear: “There are people who are part of our population who do associate that practice and that history with that word.” And therein lies the problem because “there are times when language can express implicit bias.”

    Here is today’s 3D definition:

    Implicit bias:

    1. In contrast with explicit bias, the attribution to another person, by people with a superior moral standing, of an unjustifiable idea, belief or value that merits being condemned even if the accused person does not entertain that idea or belief.

    2. A supposed reprehensible mental habit of ordinary people that is discernible only to a class of people skilled at reading meaning that is not there into everyday language.

    Contextual Note

    What Syrett may not realize in her puritanical Bostonian zeal is that the enemy she’s tilting against isn’t racism — it’s the English language. She is calling into question the legitimacy of metaphor. Impoverishing the language does nothing to combat racism and may even have the effect of sheltering it from criticism. Racism is a worldview, not a vocabulary list.

    Syrett and Young appear intent on identifying, listing and banishing from polite discourse any words that might be associated with the slave economy. As she works in the field of children’s language acquisition, she appears to propose establishing a list of words teachers will be instructed never to use in classrooms to protect students’ ears from their vile influence.

    Some may suspect that these language detectives are primarily motivated by the personal pleasure gleaned from occupying the high moral ground that empowers them to designate unconscious racists for public opprobrium. Isn’t that part of the great Puritan tradition of New England to find ways of feeling more virtuous than the unwashed masses?

    For all her apparent schooling in the fashion of “critical theory,” Syrett’s critical thinking often relies on specious reasoning. Here is how she justifies the need to ban words: “To the extent that language can be a way of expressing who we are and what our values are and to the extent to which that language can either be a way to exclude people from a discourse or include them as key participants, then this is a great opportunity for us to revisit.”

    On several occasions in the interview, Syrett builds her reasoning around the phrase “to the extent that,” an expression that introduces a speculative and indeterminate idea. By concatenating two unrelated speculations, she creates the rhetorical illusion of equivalence or even of cause and effect. In this case it allows her to reveal an “opportunity.” But she has justified neither proposition, still less the non-existent relationship between them. The opportunity this chopped logic permits is simply the censure she seeks to impose on the language ordinary people use.

    At another point, she says: “I think in a lot of cases, people aren’t really thinking that the expression conveys that kind of racism or misogyny.” Her point is clear: She thinks, whereas other “people” don’t think. With a more scientific approach, she might seek to explore why people don’t think what she thinks rather than supposing that they aren’t thinking. She may be right about their ignorance, but it may also emerge that she has misconstrued both their reality and that of the language itself.

    Undoubtedly, Syrett starts with a noble intention. She wants to protect the victims of a truly oppressive system, even when the victims may not realize they are being oppressed. She believes language can be made safer by hiding reality. It must rid itself of anything that might, in her words, “marginalize and hurt other people.” The best way to do that is to scold those who fail to conform to the findings of her science.

    Historical Note

    Syrett’s approach is a perfect example of a decades-long trend in academe: the phenomenon known as critical theory. The first half of the 20th century produced a vibrant intellectual current called structuralism. It originated in the fields of linguistics (Ferdinand de Saussure) and anthropology (Claude Lévi-Strauss) and offered insight into how societies and the cultures they produced were structured as complex interdependent systems.

    In the mid-20th century, a disparate group of French linguists, philosophers, psychoanalysts and literary critics influenced by structuralism set about “deconstructing” the relationships between ideas, practices, language and modes of thought, from penal systems and sexuality (Michel Foucault) to popular entertainment and advertising (Roland Barthes). The chief deconstructionist, Jacques Derrida, denied the fundamental stability of meaning itself, which could only be a function of context. The ongoing dialogue of these thinkers, all of whom wrote in French, contains subtle and complex reflection on how human knowledge is created, managed and transmitted. 

    Alas, when this body of discourse crossed the Atlantic Ocean in the 1960s and 1970s, it lost something in translation. A strange mutation took place as academics labelled it “critical theory.” It appealed to humanities departments in the US that felt the need to show their concern with social issues. Because thinkers such as Michel Foucault offered insight into how cultural artifacts could reflect and support dominant worldviews and ideologies, American academics neglected its focus on the structural complexity of cultural and political ecosystems and instead seized on it as a method of assigning criminal intent to those who exercised power and oppressed minorities.

    From the French post-structuralist perspective, this hijacking of the intellectual toolbox contributed little to our understanding of societies past and present but served to reveal systemic features of US society and culture. If the French took delight in detecting the complex play of influences within a cultural system, American academics turned the method into a polarizing game of blame and victimization. Where the French thinkers saw intricate resonances that supported morally ambiguous social and political hierarchies, American academics saw arbitrary acts of personal abuse.


    A French structuralist or post-structuralist observing this historical trend among intellectuals today might remark on the continuity in American society between the early Puritans’ insistence on dividing the world into the just and the unjust (those predestined by God to be among the virtuous and those condemned to sin) and the current obsession with separating society into two groups: innocent victims (any specific minority group) and evil oppressors. An aggressive system of identity politics has become the dominant ideology of the mainstream Democratic Party. It opposes the equally aggressive Republican insistence on defending “the shining city on the hill,” essentially a metaphor for white privilege.

    The rise of puritanical linguistic despotism can be traced back to World War I, when sauerkraut was renamed “liberty cabbage” to protect American ears from German words. The tradition was perpetuated and even aggravated in the run-up to George W. Bush’s invasion of Iraq, when the House of Representatives punished Jacques Chirac’s disloyal France by renaming French fries “freedom fries” in its cafeterias.

    If liberty cabbage had some legitimacy because it was meant to spare people using a word from the enemy’s language, the freedom fries initiative went further. France, after all, was not the enemy, except in the sense implied by Bush when he asserted in September 2001 that “Every nation, in every region, now has a decision to make. Either you are with us, or you are with the terrorists.” When France refused to line up behind his government in what President Chirac correctly deemed a deceitful and murderous enterprise, Bush undoubtedly saw France as an ally of the terrorists.

    That helps to situate the common thread between Syrett’s assault on “master bedrooms” and US foreign policy. It’s all about identifying, shaming and, when possible, banishing the culprit.

    *[In the age of Oscar Wilde and Mark Twain, another American wit, the journalist Ambrose Bierce, produced a series of satirical definitions of commonly used terms, throwing light on their hidden meanings in real discourse. Bierce eventually collected and published them as a book, The Devil’s Dictionary, in 1911. We have shamelessly appropriated his title in the interest of continuing his wholesome pedagogical effort to enlighten generations of readers of the news. Click here to read more of The Daily Devil’s Dictionary.]

    The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.