In the olden days, which some of us remember as the 20th century, news stories and commentary tended to focus on people and their actions. The news would sometimes highlight and even debate current ideas circulating about society and politics. News stories quite often sought to weigh the arguments surrounding serious projects intended to improve things. The general tendency was to prefer substance over form.
Things have radically changed since the turn of the century. It may be related to a growing sentiment of fatalism that defines our Zeitgeist. Outside of the billionaire class, people feel powerless, a feeling that is already wreaking havoc in the world of politics. After banks that were “too big to fail,” we have inherited problems that appear too big to solve. Climate change and COVID-19 have contributed powerfully to the trend, and a series of chaotic elections in several of our most stable democracies, accompanied by newer wars or prospects of war called upon to replace the old ones, has only reinforced it.
In the United States, this feeling of helplessness has had the unfortunate effect of turning people’s attention away from the issues and the facts that matter and toward the language individuals use to describe them. Words that inspire aggressive emotional reactions now dominate the news cycle, eclipsing the people, events and ideas that should be at its core.
One reason we have launched Fair Observer’s new feature, Language and the News, and are continuing with a weekly dictionary of what was formerly The Daily Devil’s Dictionary, is that, increasingly, the meaning of the words people use has been obscured and replaced by the emotions different groups of combative people attach to them.
What explains this drift into a state of permanent combat over words? Addressing the issues — any issues — apparently demands too much effort, too much wrestling with nuance and perspective. It is much easier to reduce complex political and moral problems to a single word and load that word with an emotional charge that disperses even the possibility of nuance. This was already the case when political correctness emerged decades ago. But the binary logic that underlies such oppositional thinking has now taken root in the culture and goes well beyond the simple identification of words to use or not use in polite society.
The Problem of Celebrities Who Say Things Out Loud
Last week, US podcast host Joe Rogan and actress Whoopi Goldberg were subjected to concerted public ostracism (now graced with the trendy word “canceled”) over words and thoughts they happened to express in contexts that used to be perceived as informal, exploratory conversations. Neither was attempting to make a formal pronouncement about the state of the world. They were guilty of thinking out loud, of sharing thoughts that emerged spontaneously.
It wasn’t James Joyce (who was at one time canceled by the courts), but it was still a stream of consciousness. Human beings have been interacting in that way ever since the dawn of language, at least 50,000 years ago. The exchange of random and sometimes focused thoughts about the world has been an essential part of building and regulating every human institution we know, from family life to nation-states.
During these centuries of exchanges, many of the thoughts uttered were poorly or only partially reasoned. Dialogue with others helped them to evolve and become the constructs of culture. Some were mistaken and bad. Others permitted moments of self-enlightenment. Only popes have ever had the privilege of making ex cathedra pronouncements deemed infallible, at least to the faithful. The rest of us have the messy obligation of debating among ourselves what we want to understand as the truth.
Dialogue never establishes the truth; it permits us to approach it. That has not prevented multiple groups from acquiring the habit of thinking themselves endowed with papal certainty, allowing them to close the debate before it even begins. Everyone has noticed the severe loss of trust in the institutions once counted upon to guide the mass of humanity: governments, churches and the media.
That general loss of trust means that many groups with like-minded tastes, interests or factors of identity have been tempted to impose on the rest of society the levels of certainty they feel they have attained. Paradoxically, internationally established churches, once dominant across vast swaths of the globe, have come to adopt an attitude of humble dialogue just as governments, the media and various interest groups have become ensconced in promulgating the certainty of their truth while displaying an intolerance of dialogue.
Dialogue permits us to refine our perceptions, insights and intuitions and put them into some kind of perspective. That perspective is always likely to shift as new insights (good) and social pressures (not always so good) emerge. The sane attitude consists of accepting that no linguistically formulated belief — even the idea that the sun rises in the east — should be deemed to be a statement of absolute truth. (After all, despite everyone’s daily experience, the sun doesn’t rise — the Earth turns.) Perspective implies that, however stable any of our ideas may appear to us at a particular time, we can never be absolutely sure they are right and even less sure that the words we have chosen to frame such truths sum up their meaning.
Truth and the US State Department
A quick glance at the media over the past week demonstrates the complexity of the problem. Theoretically, a democratic society will always encourage dialogue, since voting itself, though highly imperfect, is presented as a means for the people to express their intentions concerning real-world issues. In a democracy, a plurality of perspectives is not only desirable but inevitable, and should be viewed as an asset. But those who are convinced of their truth and have the power to impose it see it as a liability.
On February 3, State Department spokesman Ned Price spent nearly four minutes trying to affirm, in response to a journalist’s persistent objections, that his announced warning about a Russian false flag operation wasn’t, as the journalist suspected, itself a false flag. The journalist, Matt Lee of the Associated Press, asked for the slightest glimpse of the substance of the operation before agreeing to report that there actually was something to report on. What he got were words.
Price, a former CIA officer, believed that the term was self-explanatory. He clearly expected members of the press to be grateful for receiving “information that is present in the US government.” He saw Lee’s doubt as a case of a reporter seeking “solace in information that the Russians are putting out.” In other words, either a traitor or a useful idiot. Maggie Haberman of The New York Times reacted by tweeting, “This is really something as an answer. Questioning the US government does not = supporting what Russia is saying.”
Haberman is right, though she might want to instruct some of her fellow journalists at The Times, who have acquired the habit of unquestioningly echoing anything the State Department, the Defense Department or the intelligence community shares with them. This is especially true given that, for more than five years, The Times specialized in promoting alarmism about Russia’s agency in the “Havana syndrome” saga. Because the CIA suspected as much, all the cases were treated as the result of “hostile acts.” Acts, by the way, for which the only physically identified perpetrator was a species of Cuban cricket.
The back-and-forth concerning Russia’s false flag operation, like the Havana syndrome itself, illustrates a deeper trend that has seriously eroded the quality of basic communication in the United States. It takes the form of an increasingly binary, even Manichean type of reasoning. For Price, it’s the certainty that Russia is committing evil acts before any proof is offered and even before those acts take place. But it also appears in the war of obstinate aggression waged by those who seek to silence anyone who suggests that the government’s vaccine mandates and other COVID-19 restrictions may not be justified.
This binary syndrome now permeates all levels of US culture, not only the political sphere. The constraining force of the law is one thing, which people can accept; the refusal of dialogue is another, literally anti-human, especially in a democracy. The syndrome also takes the form of moral rage when someone expresses an idea calling into question some aspect of authority or, worse, pronounces a word whose sound alone provokes a violent reaction. A residual vigilante culture still infects US individualism. The willingness, or rather the need, people feel to apply summary justice helps to explain the horrendous homicide rate in the United States. Vigilantism has gradually contaminated the worlds of politics, entertainment and even education, where parents and school boards go to battle over words and ideas.
George W. Bush’s Contribution
US culture has always privileged binary oppositions and shied away from nuance because nuance is seen as an obstacle to efficiency in a world where “time is money.” But a major shift began to take place at the outset of the 21st century that seriously amplified the phenomenon. The 1990s were a decade in which Americans believed their liberal values had triumphed globally following the collapse of the Soviet Union. For many people, it turned out to be boring. The spice of having an enemy was missing.
In 2001, the Manichean thinking that dominated the Cold War period was thus programmed for a remake. Although the American people tend to prefer both comfort and variety (at least tolerance of variety in their lifestyles), politicians find it useful to identify with an abstract mission consisting of defending the incontestable good against the threat posed by inveterate evil. The updated Cold War was inaugurated by George W. Bush in September 2001 when the US president famously proclaimed, “Every nation, in every region, now has a decision to make: either you are with us, or you are with the terrorists.”
The cultural attitude underlying this statement is now applied to multiple contexts, not just military ones. I like to call it the standard American binary exclusionist worldview. It starts from the conviction that one belongs to a camp and that that camp represents either what is right or a group that has been unjustly wronged. Other camps may exist. Some may even be well-intentioned. But they are all guilty of entertaining false beliefs, like Price’s characterization of journalists who he imagines promote Russian talking points. That has long been standard fare in politics, but the same pattern applies in conflicts concerning what are called “culture issues,” from abortion and gender to religion and the teaching of Critical Race Theory.
In the political realm, the exclusionist worldview describes the dark side of what many people like to celebrate as “American exceptionalism,” the famous “shining city on a hill.” The idea it promotes supposes that others — those who don’t agree, accept and obey the stated rules and principles — are allied with evil, either because they haven’t yet understood the force of truth, justice and democracy and the American way, or because they have committed to undermining it. That is why Bush claimed they had “a decision to make.” Ned Price seems to be saying something similar to Matt Lee.
A General Cultural Phenomenon
But the exclusionist mentality is not just political. It now plays out in less straightforward ways across the entire culture. Nuance is suspected of being a form of either cowardice or hypocrisy. Whatever the question, debate will be cut short by one side or the other because they have taken the position that, if you are not for what I say, you are against it. This is dangerous, especially in a democracy. It implies an assumption of moral authority that is increasingly perceived by others to be unfounded, whether it is expressed by government officials or random interest groups.
The example of Price’s false flag and Lee’s request for substance — at least something to debate — reveals how risky the exclusionist mentality can be. Anyone familiar with the way intelligence has worked over the past century knows that false flags are a very real item in any intelligence network’s toolbox. Operation Northwoods spelled out clearly what the Pentagon’s planners were prepared to carry out. “We could blow up a U.S. ship in Guantanamo Bay and blame Cuba,” a Pentagon official wrote, adding that “casualty lists in U.S. newspapers would cause a helpful wave of national indignation.”
There is strong evidence that the 2001 anthrax attacks in the US, designed to incriminate Saddam Hussein’s Iraq and justify a war in the immediate aftermath of 9/11, were an attempted false flag operation that failed miserably when it was quickly discovered that the strain of anthrax could only have been produced in America. Lacking this proof, which also would have had the merit of linking Hussein to the 9/11 attacks, the Bush administration had to struggle for another 18 months to build (i.e., fabricate) the evidence of Iraq’s (non-existent) weapons of mass destruction.
This enabled the “shock and awe” operation that brought down Hussein’s regime in 2003. It took the FBI nearly seven years to complete the coverup of the anthrax attacks designed to be attributed to Iraq. They did so by pushing the scientist Bruce Ivins to commit suicide and burying any evidence that might have elucidated a false flag operation that, by the way, killed five Americans.
But false flags have become a kind of sick joke. In a 2018 article on false flags, Vox invokes the conventional take that false flag reports tend to be the stuff of the tawdry conspiracy theories that have made it possible for people like Alex Jones to earn a living. “So ‘false flag’ attacks have happened,” Vox admits, “but not often. In the world of conspiracy theorists, though, ‘false flags’ are seemingly everywhere.” If this is true, Lee was on the right track in suspecting the intelligence community and the State Department of fabricating a conspiracy theory.
Although democracy is theoretically open to a diversity of competing viewpoints, the trend in the political realm has always pointed toward a binary contrast rather than the development of multiple perspectives. The founding fathers of the republic warned against parties, which they called factions. But it didn’t take long to realize that the growing cultural diversity of the young nation, already divided into states that were theoretically autonomous, risked creating a hopelessly fragmented political system. The nation needed to construct some standard ideological poles to attract and crystallize the population’s political energies. In the course of the 19th century, a two-party system emerged, following the pattern of the Whigs and Tories in England, something the founders initially hoped to avoid.
It took some time for the two political parties to settle into a stable binary system under the labels Democrat and Republican. Their names reflected the two pillars of the nation’s founding ideology. Everyone accepted the idea that the United States was a democratic republic, if only because it wasn’t a monarchy. It was democratic because people could vote on who would represent them.
It took nearly 200 years to realize that, because the two fundamental ideas constituting that ideology had been monopolized by two parties, there was no room for a third, fourth or fifth party to challenge them. The two parties owned the playing field. At some point in the late 20th century, the parties became competitors in name only. They morphed into an ideological duopoly that had little to do with the idea of being either a democracy or a republic. As James Carville insisted in his advice to candidate Bill Clinton in the 1992 presidential campaign, “It’s the economy, stupid.” He was right. As it had evolved, the political system represented the economy and no longer the people.
Nevertheless, the culture created by a two-century-long rivalry contributed mightily to the triumph of the binary exclusionist worldview. In the 20th century, the standard distinction between Democrats and Republicans turned around the belief that the former believed in an active, interventionist government stimulating collective behavior on behalf of the people, and the latter in a minimalist barebones government committed to reinforcing private enterprise and protecting individualism.
Where the two parties, as a duopoly, ended up agreeing was that interventionism was good when directed elsewhere, in the form of a military presence across the globe intended to demonstrate aggressive potential. Not because either party believed in dominating foreign lands, but because both realized that the defense industry was the one thing Republicans could accept as a legitimate, highly constraining collective national enterprise and that Democrats, following Carville’s dictum, recognized as underpinning a thriving economy in which ordinary people could find employment.
The Crimes of Joe Rogan and Whoopi Goldberg
Politics, therefore, set in place a more general phenomenon: the binary exclusionist worldview that would soon spread to the rest of the culture. Exclusionism is a common way of thinking about what people consider to be issues that matter. It has fueled the deep animosity between opposing sides around the so-called cultural issues that, in reality, have nothing to do with culture but increasingly dominate the news cycle.
Until the launch of the culture wars around issues such as abortion, gay marriage, identity and gender, many Americans had felt comfortable as members of two distinct camps. As Democrats and Republicans, they functioned like two rival sports teams. Presidential elections were always Super Bowls of a sort, at which the people would come for the spectacle. The purpose of the politicians who composed the parties was not to govern, but to win elections. Yet, for most of the 20th century, the acrimony they felt and generated focused on issues of public policy, which, once implemented, the people would accept, albeit grudgingly if the other party had been victorious. After the storm, the calm. In contrast, cultural issues generate bitterness, resentment and ultimately enmity. After the storm, the tempest.
The force of the raging cultural winds became apparent last week in two entirely different celebrity incidents, concerning Joe Rogan and Whoopi Goldberg. Both were treated to the new style of excommunication that the various churches of correct thinking and exclusionary practices now mete out on a regular basis. In an oddly symmetrical twist, the incriminating words were what is now referred to as “the N-word” spoken by a white person and the word “race” spoken by a black person. Later in the week, a debate arose about yet another word with racial implications — apartheid — when Amnesty International formally accused the state of Israel of practicing it against Palestinians.
The N-word has become the locus classicus of isolating an item of language that — while muddled historically and linguistically — is so definitively framed that any white person who utters the reprehensible term, even while trying to come to grips with it informally as an admittedly strange and fascinating phenomenon in US culture, will be considered to have delivered a direct insult to a real person or an entire population. Years ago, Joe Rogan made a very real mistake that he now publicly regrets. While examining the intricate rules surrounding the word and its interdiction, he allowed himself the freedom to actually pronounce it.
In his apology, Rogan claimed that he hasn’t said the word in years, which in itself is an interesting historical point. He recognizes that the social space for even talking about the word has become exaggeratedly restricted. Branding Rogan as a racist on that basis alone may reflect a legitimate suspicion about the man’s character, worth examining, but as a procedure it is simply erroneous. Using random examples from nearly 10 years ago may raise some questions about the man’s culture, but it makes no valid case for proving that Rogan is, or even was at the time, a racist.
The Whoopi Goldberg case is less straightforward because it wasn’t about a word but an idea. She said the Holocaust “was not about race.” Proposing the hypothesis that Nazi persecution of Jews may be a case of something other than simple racism is the kind of thought any legitimate historian might entertain and seek to examine. It raises some serious questions not only about what motivated the Nazis, but about what our civilization means by the words “race” and “racism.” There is considerable ambiguity to deal with in such a discussion, but any statement seeking to clarify the nature of what is recognized as evil behavior should be seen as potentially constructive.
Once some kind of perspective can be established about the terms and formulations that legitimately apply to the historical case, it could be possible to conclude, as many think, that Goldberg’s particular formulation is legitimate, inaccurate or inappropriate. Clearly, Goldberg’s critics found her formulation inappropriate, but, objectively speaking, they were in no position to prove it inaccurate without engaging with the meaning of “race.”
The problem is complex because history is complex, both the history of the time and the historical moment today. One of the factors of complexity appeared in another controversy created by Amnesty International’s publication of a study that accuses Israel of being an apartheid state, apartheid being considered a crime against humanity under international law.
Interestingly, The Times of Israel gave a fair and very complete hearing to Amnesty International’s spokespersons, whereas American media largely ignored the report. When they did cover it, US media focused on the dismissive Israeli reaction. PBS NewsHour quoted Ned Price, who in another exchange with Matt Lee stated that the department rejects “the view that Israel’s actions constitute apartheid.”
Once again, the debate is over a word, the difference in this case being that the word is specifically defined in international law. The debate predictably prompted some commentators to invoke another word, whose definition has often been stretched in extreme directions in the interest of provoking strong emotions: anti-Semitism. Goldberg’s incriminating sentence was itself branded by some as anti-Semitism.
At the end of the day, the words used in any language can be understood in a variety of ways. Within a culture that has adopted the worldview of binary exclusionism, the recourse to constructive dialogue is rapidly disappearing. Instead, we are all saddled with the task of trying to memorize the lists of words one can and cannot say and the ideas it will be dangerous to express.
What this means is that addressing and solving real problems is likely to become more and more difficult. It also means that the media will become even less trustworthy than they already are today. For one person, a “false flag” corresponds to a fact; for another, it can only be the component of a conspiracy theory. The N-word is a sound white people must never utter, even when reading Mark Twain’s Huckleberry Finn aloud. And the word “race” — a concept that has no biological reality — now may apply to any group of people who have been oppressed by another group and who choose to be thought of as a race.
The topics these words refer to are all serious. For differing reasons, they are all uncomfortable to talk about. But so are issues spawned by the COVID-19 pandemic, related to health and prevention, especially when death and oppressive administrative constraints happen to be involved. The real problem is that, as soon as the dialogue begins to stumble over a specific word, an ill-defined concept or a feeling of injustice, reasoning is no longer possible. Obedient acceptance of what has imposed itself as the “norm” becomes the only possible survival strategy, especially for anyone visible to the public. But that kind of obedience may not be the best way to practice democracy.
The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.