More stories

  • Critical Race Theory: A Dictatorship of the Woke?

    In Washoe County, Nevada, parents protest critical race theory (CRT), while a conservative group is pushing for teachers to wear body cameras to make sure they aren’t indoctrinating students. In Loudoun County, Virginia, home to Leesburg, a town named after the Lee family that would later produce Confederate General Robert E. Lee, wealthy white parents scream in school meetings. Across the US, mostly white parents picket school board meetings, holding up “No CRT” signs as though it were 1954 and their schools were about to be integrated.


    This demonization of an academic theory is supported by virulent media discourses. Fox News says that the teachers’ unions support CRT and will push it on your schools at a cost of $127,600. Breitbart takes it further, suggesting that CRT is going to set up “a dictatorship of the anti-racists.” On Twitter, opponents compare CRT to anti-white racism and to the far-right conspiracy theory of white genocide.

    Undoing Racism

    So what is critical race theory? Is it a radical anti-racist Marxist program bent on overturning power structures for an amount equivalent to what Tucker Carlson earns in a week? Scholars say CRT is in fact a framework from critical legal studies emphasizing not the social construction of race but the reality of racism, in particular racism’s deep roots in American history and its perpetuation in legal and social structures. Kimberlé Crenshaw, who coined the term, emphasizes that it is an ongoing scholarly practice of interrogating racism.

    Is it being taught in your schools? Nobody is teaching CRT to kindergarteners. Critical race theory has become part of education studies, one of many frameworks guiding researchers and instructors who want to understand, and undo, racism in education. Some link CRT in schools to The 1619 Project, launched by The New York Times, which seeks to center black history and slavery in the story of America’s founding.


    So why does your uncle who spends too much time on the internet think this is a dictatorship of the woke? The moral panic over CRT is the brainchild of Chris Rufo, who began using the term to refer to a catch-all, nefarious force behind all kinds of social change, from Joe Biden’s weak liberalism to Black Lives Matter. Conservatives link CRT to trans rights and communism, while the Heritage Foundation compares it to Marxist critical theory. The Trump administration launched a counter to The 1619 Project, the 1776 Commission, to elevate whiteness and fight “critical race theorists” and “anti-American historical revisionism.”

    Moral panics position one idea, process, identity or group as evil, a threat to public order, values and morality, and they align institutional power with popular discourses to enforce the social positions and identities behind them. As of July, 22 states had proposed bills against teaching critical race theory and five had enacted them into law. These bills ban teaching CRT, which they insist makes white students uncomfortable and introduces “divisive concepts.” For the right, the vision of US history is one that teaches color-blind unity and pride in being American. Of course, it also teaches that the KKK was OK.

    Anti-Anti-Racist Panic

    This is far from the first moral panic over education. Historian Adam Laats compares the fight against CRT to the fight against the teaching of evolution. That first moral panic led to widespread distrust in public schools. More recent moral panics have also led to divestment from social institutions. In the 1980s, a panic about satanic ritual abuse in US daycare centers led to the reinforcement of dominant gender and racial power structures, but also to the withdrawal of support for daycare and early childhood education.

    Panics over sex education, from Australia to Alabama, brought calls to defund these programs, shrinking already limited school budgets while increasing conservative opposition to public education. In the UK, the Conservative Party wants to ban the teaching of white privilege on the grounds that it hurts working-class boys — while at the same time dismantling the free school meals program.

    What will the effects of this anti-anti-racist panic be? Will they curb the freedom of teachers to share the truths of history or push them to teach a still more nationalist version of the American story? Will history classes explicitly celebrate white masculinity, full of heroic founders fulfilling a holy promise for freedom and capital? Or might it also serve as another push to demonize public schools, painting them not as (unequally funded) shared democratic institutions but as anti-American indoctrination centers?


    Even if the bills do not reshape education standards, the dramatic language around CRT and white genocide continues the longstanding push to defund and privatize public schools. As education scholar Michael Apple notes, the right’s education reform has long linked neoliberal privatization with neoconservative curriculums, something that continues with the opposition to CRT.

    Breitbart mentions Utah’s Say No to Indoctrination Act that will “keep taxpayer dollars from funding discriminatory practices and divisive worldviews,” linking cost and curriculum. It is not a coincidence that conservative media mention the price of anti-racist interventions and the dog whistle of “taxpayer dollars.” Fighting CRT might mean bills to change curriculum standards, but it could equally mean a push to cut funding for public schools reframed as cutting funding for CRT — as Senate candidate J.D. Vance suggests on Twitter — or a call for greater support for private, religious and home education.

    Both increased nationalism and the privatization of education have long been key issues for the right. The first point of Donald Trump’s 2020 education platform was to teach American exceptionalism; the second was school choice. With this panic over critical race theory, far-right drama serves to reinforce the more banal nationalism of capital and conservatism. Painting schools as cultural-Marxist madrassas makes it a lot easier to stop paying for them.

    *[Fair Observer is a media partner of the Centre for Analysis of the Radical Right.]

    The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.

  • The New American Art of Inconclusive Conclusions

    In early 2020, as soon as the epidemic caused by a novel coronavirus began turning into a global pandemic, everyone, from scientists to politicians and media pundits, was eager to understand where it came from. Conveniently for US President Donald Trump, it came from China. That enabled him to suggest that if it originated in a nation now perceived to be America’s enemy, it was probably a malicious plot designed to weaken his electoral chances.

    But the scientific community, relayed by the media, calmly explained that, like earlier coronaviruses, this one was transmitted to humans by animals and originated with bats. The essential message could be boiled down to: Trust the scientists, who know what they’re talking about.

    A year later, with Trump no longer in the White House, suspicion arose even among many scientists that, well, accidents happen, even among all-knowing scientists. But this accident, if that’s what it was, turned out to be particularly embarrassing, with millions of people dying, the global economy thrown into a tailspin and all the rituals of daily life upended, including such things as children’s education and, more drastically (in terms of loss of income), professional sports.


    When the stakes are so high, suspicion about who and what is to blame takes on a new dimension. The dominant take of 2020 was that it was all about wild animals. The dominant take in the spring of 2021 was that, no, it was people, and specifically scientists, who were the unwitting culprits. Back in 2020, the logic of US politics meant that reasonable people could assume that any assertion by the incumbent president, Donald Trump, known for his addiction to “alternative facts,” was a self-interested lie. Moreover, if a scientist provided a version that contradicted Trump, it was likely to be the truth.

    A year later, Trump was gone. The path was cleared for rational public discussions. It became possible to begin weighing evidence before asserting a possibly unfounded opinion. That is when some medically informed journalists and an increasing number of scientists admitted that human error as the source of COVID-19 was not only possible but highly credible.

    The confusion spawned by this reversal of public discourse led the presumably level-headed President Joe Biden to commission a report from the intelligence community on the true origin of the pandemic. Last week, The Washington Post reported on the initial result of that study. The article stated that “President Biden on Tuesday received a classified report from the intelligence community that was inconclusive about the origins of the novel coronavirus, including whether the pathogen jumped from an animal to a human as part of a natural process, or escaped from a lab in central China, according to two U.S. officials familiar with the matter.”

    Today’s Daily Devil’s Dictionary definition:

    Inconclusive:

    Not quite certain enough yet to be codified and promulgated as an official lie

    Contextual Note

    If Trump could be counted on to produce any version of the “facts” that suited his agenda, Biden came into office with a confirmed capacity for lying about the facts of his own life — including his educational honors and his stance on the Iraq War — but also with a reputation for largely respecting publicly acknowledged truth. He did, however, out of ordinary political opportunism, give credence to the easily debunked reports about Russians paying bounties to Afghans willing to kill Americans. That was because he knew his fellow Democrats were fond of blaming Russians for all the nation’s ills.

    One difference between the two presidents is that Trump was always ready to jump to a conclusion, rejecting the temptation to call anything inconclusive. He painted the world in black and white, from which nuance was excluded. There was, however, one exception. He opposed the CIA’s largely conclusive assessment that Trump’s buddy, Saudi Crown Prince Mohammed bin Salman, had commanded the bone-saw crew who dismembered Washington Post journalist Jamal Khashoggi. On that issue, Trump claimed that the evidence was inconclusive.

    Most theories that lead to blaming someone other than the initially designated culprit are routinely deemed inconclusive or labeled as conspiracies so long as no smoking gun has been found. Those who cling to the idea that Lee Harvey Oswald was the lone assassin of John F. Kennedy, and that the soon-to-be-paroled Sirhan Sirhan acted alone in the death of Robert Kennedy, continue to claim that the mountain of countervailing evidence is inconclusive. In both Kennedy assassinations, the smoke eventually became visible, but the smell of the gunfire had faded. Any forensic traces of actual smoke were of course branded conspiracy theories.


    Concerning the report on the origins of COVID-19, the inconclusive assessment appears justified. The case for a lab leak has grown stronger in recent months, but apart from suspicion generated by the fact that the Chinese government has been obstructive, there is no serious evidence to justify it. The Chinese government is by definition obstructive in everything it does, so this could hardly be confused with the kind of exceptional behavior that credibly points toward a coverup. 

    The Post offers an interesting explanation of another apparent anomaly: “Proponents of that theory point to classified information, first disclosed in the waning days of the Trump administration, that three unidentified workers from the Wuhan Institute of Virology — one of the world’s preeminent research institutions studying coronaviruses — went to the hospital in November 2019 with flu-like symptoms.”

    Americans would of course find this suspect, since, given the crippling price of medical treatment in the US, people avoid going to the hospital except in an emergency. The Post helpfully adds: “In China, people visit the hospital for routine and mild illnesses.” Cultural assumptions can always intervene to skew the perception of the meaning of the evidence.

    Historical Note

    Since COVID-19 is still mutating and raging nearly two years after its outbreak, no one knows when the definitive history of the COVID-19 pandemic will be written. The current wisdom says that, unlike the Spanish flu of a century ago, it will end up not as a chapter of history, with a beginning, a middle and an end, but as an endemic feature of humanity’s pathological landscape.

    In contrast, the history of the deep psychological mutations taking place as a result of the pandemic, especially in Western society, is beginning to take shape. Democracy has always lent itself to contestation. Protest has traditionally served to help define the positive dynamics of democracy, where voices could be heard that might influence what Thomas Jefferson once called “the course of human events.” But the pandemic has accelerated a different, far less positive trend, not of constructive protest but of an utter loss of faith not only in civic authority, but also in every other form of authority. Science itself may be the victim. 

    The secular order imposed by modern formally democratic governments depends to a large extent on the belief in the beneficent authority of science and the sincerity of its representatives, the scientists. In recent decades that authority has been shaken by the role powerful economic actors and complicit politicians have played in manipulating science to serve their purposes. The managers of the economy have become accustomed to using their clout to promote comforting lies about science itself, in the name of “national interest” and the “needs of the economy,” which means the health, not of the planet or its population, but of the mighty enterprises that create (and also destroy) jobs.


    It is a well-known fact that in US culture, uncertainty and inconclusiveness are unpopular. That aversion was one of the keys to Trump’s electoral success. Not having a decided opinion on something is often seen as an excuse for not getting things done, which means committing the cardinal sin of wasting time. Americans tend to see having and expressing a strong opinion — the art of being assertive — even when poorly informed, as a fundamental right that should never be compromised by the rituals of dialogue and debate.

    Nearly 60 years after the JFK assassination, an event that still contributes to undermining Americans’ faith in political authority, an accumulation of more crises has added powerfully to the confusion. The latest Afghan debacle, an unresolved pandemic and mounting evidence of the tragedy of climate change have combined to undermine every American’s hope for establishing the kind of certainty Americans believe to be their birthright.

    *[In the age of Oscar Wilde and Mark Twain, another American wit, the journalist Ambrose Bierce, produced a series of satirical definitions of commonly used terms, throwing light on their hidden meanings in real discourse. Bierce eventually collected and published them as a book, The Devil’s Dictionary, in 1911. We have shamelessly appropriated his title in the interest of continuing his wholesome pedagogical effort to enlighten generations of readers of the news. Read more of The Daily Devil’s Dictionary on Fair Observer.]

    The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.

  • Infrastructure: The Key to the China Challenge

    China has been recognized by Washington as the major rival to the United States in nearly every field. However, this isn’t the first time an Asian country has posed a threat to America’s economic dominance. In the mid-1980s, Japan built up a massive trade surplus with the United States, igniting a fierce backlash from both Republicans and Democrats over how it acquired US technology — often by theft, according to US officials — and how Tokyo used the government’s deep influence to push its companies into a dominant global position.

    But there was no nefarious scheme. In reality, Japan had made significant investments in its own education and infrastructure, allowing it to produce high-quality goods that American customers desired. In the case of China, American businesses and investors are covertly profiting by operating low-wage factories and selling technologies to their “partners” in China. American banks and venture capitalists are also active in China, funding agreements. Furthermore, with the Belt and Road Initiative (BRI), China’s infrastructure investment extends far beyond its own borders.


    The BRI is Chinese President Xi Jinping’s hallmark foreign policy initiative and the world’s largest-ever global infrastructure project, funding and developing roads, power plants, ports, railroads, 5G networks and fiber-optic cables all over the world. The BRI was created with the goal of connecting China’s modern coastal cities with the country’s undeveloped heartland and with its Asian neighbors, firmly establishing China’s place at the center of an interlinked globe.

    The program has already expanded beyond its initial regional corridors and spread across every continent. This expansion of the BRI is worrying because it may make countries more vulnerable to Chinese political coercion while also allowing China to extend its authority more widely.

    Infrastructure Wars

    US President Joe Biden and other G7 leaders launched a worldwide infrastructure plan, Build Back Better World (B3W), as a counterweight to China’s BRI at the G7 summit in Cornwall in June. The plan, according to a White House statement, aims to narrow the infrastructure gap in low- and middle-income countries around the world through investment by the private sector, the G7 and its financial partners. The Biden administration also aims to use the plan to complement its domestic infrastructure investment and create more jobs at home to demonstrate US competitiveness abroad.

    The US government deserves credit for prioritizing a response to the BRI and collaborating with the G7 nations to provide an open, responsible and sustainable alternative. However, it seems unlikely that this new attempt will be sufficient to emulate the BRI and rebuild America’s own aging infrastructure, which, according to the Council on Foreign Relations, “is both dangerously overstretched and lagging behind that of its economic competitors, particularly China.”

    On the one hand, it is unknown whether B3W will be equipped with the necessary instruments to compete. The Biden administration has acknowledged that “status quo funding and financing approaches are inadequate,” hinting at a new financial structure without providing specific details. It remains to be seen whether B3W will help development finance institutions stimulate adequate new private infrastructure investment, and whether Congress will authorize much-needed extra funding.


    Even with more funding, B3W may not be sufficiently ambitious. The World Bank estimates the global infrastructure deficit at $18 trillion, and the project will be unable to make real progress until extra resources are allocated to it.

    Also, the United States still lacks an affirmative Asia-Pacific trade policy. To compete with the BRI, the US will need to reach new trade and investment agreements while also bolstering core competitiveness in vital technologies such as 5G. It will also need to devote greater resources to leading the worldwide standards-setting process, as well as to training, recruiting and retaining elite personnel.

    On the other hand, China is often the only country willing to invest in vital infrastructure projects in underdeveloped and developing countries, and, in some cases, China is more competitive than the US as it can move quickly from design to construction. 

    Desire to Invest

    Furthermore, China’s desire to invest is unaffected by a country’s political system, as seen by the fact that it has signed memorandums of understanding with 140 nations, including 18 EU members and several other US allies such as Japan, South Korea, Australia and New Zealand. Even the United Kingdom, as a member of the G7, had a 5G expansion deal with Huawei that was canceled owing to security and geopolitical concerns. Nonetheless, the termination procedure will take about two years, during which time the Chinese tech behemoth will continue to run and upgrade the UK’s telecoms infrastructure.

    As a result, the BRI has fueled a rising belief in low and middle-income nations that China is on the rise and the US and its allies are on the decline. The policy consequence for these countries is that their future economic growth is dependent on strong political ties with China. 

    Unlike the US and European governments, which only make up for part of exporters’ losses, Beijing guarantees the initial capital and repays the profits to the investing companies and banks. In addition, since China sees no transfers of power or changes of government, there are virtually no major policy swings, meaning that investors feel more secure. So far, about 60% of BRI projects have been funded by the Chinese government and 26% by the private sector.


    For far too long, the US reaction to the BRI has been to emphasize its flaws and caution countries against accepting Chinese finance or technology without providing an alternative. Until now, this haphazard reaction has failed to protect American interests. The United States is now presenting a comprehensive, positive agenda for the first time. Transparency, economic, environmental and social sustainability, good governance and high standards are all emphasized in Build Back Better World.

    While providing a credible US-led alternative to the Belt and Road Initiative is desirable, the US must commit adequate financial and leadership resources to the effort. This is a good first step, but Washington must be careful not to create a new paranoia by demonizing economic and geopolitical rivals such as China and Japan to the point where it distorts priorities and leads to increased military spending rather than public investments in education, infrastructure and basic research, all of which are critical to America’s future prosperity and security.

    The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.

  • Ending the US Empire of War, Corruption and Poverty

    Americans have been shocked by reports of thousands of Afghans risking their lives to flee the Taliban, whose militants swept through Afghanistan and returned to power on August 15. This was followed by a suicide bombing claimed by the Islamic State in Khorasan Province (IS-KP) that killed at least 170 people, including 13 US troops. Some eyewitnesses told the BBC that “significant numbers” of those killed were shot dead by American and foreign forces.

    Even as UN agencies warn of an impending humanitarian crisis in Afghanistan, the US Treasury has frozen nearly all of the Afghan central bank’s $9.4 billion in foreign currency reserves, depriving the new government led by the Taliban of funds it will desperately need in the coming months to feed its people and provide basic services. Under pressure from the Biden administration, the International Monetary Fund decided not to release $450 million in funds that were scheduled to be sent to Afghanistan to help the country cope with the coronavirus pandemic. 


    The US and other Western countries have also halted humanitarian aid to Afghanistan. After chairing a G7 summit on Afghanistan on August 24, British Prime Minister Boris Johnson said that withholding aid and recognition gave them “very considerable leverage — economic, diplomatic and political” over the Taliban. 

    Western politicians couch this leverage in terms of human rights, but they are clearly trying to ensure that their Afghan allies retain some power in the new government and that Western influence and interests in Afghanistan do not end with the Taliban’s return. This leverage is being exercised in dollars, pounds and euros, but it will be paid for in Afghan lives.

    US Spending in Afghanistan

    To read or listen to Western analysts, one would think that the United States and its allies’ 20-year war in Afghanistan was a benign and beneficial effort to modernize the country, liberate Afghan women and provide health care, education and good jobs, and that this has all now been swept away by capitulation to the Taliban. The reality is quite different and not so hard to understand.


    The United States spent $2.26 trillion on its war in Afghanistan. Spending that kind of money in any country should have lifted most people out of poverty. But the vast bulk of those funds, about $1.5 trillion, went to absurd, stratospheric military spending to maintain the US-led military occupation, drop tens of thousands of bombs and missiles, pay private contractors and transport troops, weapons and military equipment back and forth around the world for 20 years. 

    Since the United States fought this war with borrowed money, it has also cost half a trillion dollars in interest payments alone, which will continue far into the future. Medical and disability costs for US soldiers wounded in Afghanistan and Iraq already amount to over $350 billion, and they will likewise keep mounting as the soldiers age. Medical and disability costs for both of those US-led wars could eventually reach another trillion dollars over the next 40 years.

    So, what about “rebuilding Afghanistan”? Congress has appropriated $144 billion for reconstruction in Afghanistan since 2001, but $88 billion of that was spent to recruit, arm, train and pay the Afghan “security forces” that have now disintegrated, with soldiers returning to their villages or joining the Taliban. Another $15.5 billion spent between 2008 and 2017 was, as per Al Jazeera, documented as “waste, fraud and abuse” by the US Special Inspector General for Afghanistan Reconstruction.

    Corruption

    The crumbs left over, less than 2% of total US spending on Afghanistan, amount to about $40 billion, which should have provided some benefit to the Afghan people in economic development, health care, education, infrastructure and humanitarian aid. But, as in Iraq, the government the US installed in Afghanistan was notoriously corrupt, and its corruption only became more entrenched and systemic over time. Transparency International (TI) has consistently ranked Afghanistan as among the most corrupt countries in the world.

    Western readers may think that this corruption is a long-standing problem in the country, as opposed to a particular feature of the US-led occupation, but this is not the case. TI noted that “it is widely recognized that the scale of corruption in the post-2001 period has increased over previous levels.” A 2009 report by the Organization for Economic Cooperation and Development (OECD) warned that “corruption has soared to levels not seen in previous administrations.” Those administrations would include the Taliban government that US and NATO invasion forces removed from power in 2001, and the Soviet-allied socialist governments that were overthrown by the US-supported precursors of al-Qaeda and the Taliban in the 1980s, destroying the substantial progress they had made in education, health care and women’s rights.


    A 2010 report by Anthony H. Cordesman, a Pentagon official under Ronald Reagan, entitled “How America Corrupted Afghanistan,” chastised the US government for throwing gobs of money into that country with virtually no accountability. The New York Times reported in 2013 that every month for a decade, the CIA had been dropping off suitcases, backpacks and even plastic shopping bags stuffed with US dollars for the Afghan president to bribe warlords and politicians.

    Corruption also undermined the very areas that Western politicians now hold up as the successes of the occupation, like education and health care. The education system has been riddled with schools, teachers and students that exist only on paper. Afghan pharmacies are stocked with fake, expired or low-quality medicines, many smuggled in from neighboring Pakistan. At the personal level, corruption was fueled by civil servants like teachers earning only one-tenth the salaries of better-connected Afghans working for foreign NGOs and contractors. 

    Rooting out corruption and improving Afghan lives has always been secondary to the primary US goal of fighting the Taliban and maintaining or extending its puppet Afghan government’s control. As TI reported, the US “has intentionally paid different armed groups and Afghan civil servants to ensure cooperation and/or information and cooperated with governors regardless of how corrupt they were… Corruption has undermined the U.S. mission in Afghanistan by fuelling grievances against the Afghan government and channelling material support to the insurgency.”

    Poverty and Freezing Funds

    The endless violence of the US-led occupation and the corruption of the Afghan government boosted popular support for the Taliban, especially in rural areas where three-quarters of Afghans live. The intractable poverty of Afghanistan also contributed to the Taliban victory, as people naturally questioned how their occupation by wealthy countries like the United States and its Western allies could leave them in such abject poverty.

    Well before the current crisis, the number of Afghans reporting that they were struggling to live on their current income increased from 60% in 2008 to 90% by 2018. A 2018 Gallup poll found the lowest levels of self-reported “well-being” that Gallup has ever recorded anywhere in the world. Afghans not only reported record levels of misery, but also unprecedented hopelessness about their future.


    Despite some gains in education for girls, only a third of Afghan girls attended primary school in 2019 and only 37% of adolescent Afghan girls were literate. One reason that so few children go to school in Afghanistan is that more than 2 million children between the ages of 6 and 14 have to work to support their poverty-stricken families.  

    Yet instead of atoning for their role in keeping most Afghans mired in poverty, Western leaders are now cutting off desperately needed economic and humanitarian aid that was funding three-quarters of Afghanistan’s public sector and made up 40% of its total GDP. 

    In effect, the United States and its allies are responding to losing the war by threatening the Taliban and the people of Afghanistan with a second: economic war. If the new Afghan government does not give in to their “leverage” and meet their demands, our leaders will starve their people and then blame the Taliban for the ensuing famine and humanitarian crisis, just as they demonize and blame other victims of US economic warfare, from Cuba to Iran. 

    After pouring trillions of dollars into endless war in Afghanistan, America’s main duty now is to help the 38 million Afghans who have not fled their country, as they try to recover from the terrible wounds and trauma of the conflict that the US inflicted on them. This is coupled with a massive drought that devastated 40% of their crops this year and a crippling third wave of COVID-19. 

    The US should release the $9.4 billion in Afghan funds held in American banks. It should shift the $6 billion allocated for the now-defunct Afghan armed forces to humanitarian aid, instead of diverting it to other forms of wasteful military spending. It should encourage European allies and the IMF not to withhold funds. Instead, they should fully fund the UN 2021 appeal for $1.3 billion in emergency aid, which as of late August was less than 40% funded.

    Rethinking Its Place

    Once upon a time, the United States helped its British and Soviet allies to defeat Germany and Japan. The Americans then helped to rebuild them as healthy, peaceful and prosperous countries. For all America’s serious faults — its racism, its crimes against humanity in Hiroshima and Nagasaki and its neocolonial relations with poorer countries — it held up a promise of prosperity that people in many countries around the world were ready to follow. 


    If all the United States has to offer other countries today is the war, corruption and poverty it brought to Afghanistan, then the world is wise to be moving on and looking at other models to follow: new experiments in popular and social democracy; a renewed emphasis on national sovereignty and international law; alternatives to the use of military force to resolve international problems; and more equitable ways of organizing internationally to tackle global crises like the COVID-19 pandemic and the climate disaster. 

    The US can either stumble on in its fruitless attempt to control the world through militarism and coercion, or it can use this opportunity to rethink its place in the world. Americans should be ready to turn the page on our fading role as global hegemon and see how we can make a meaningful, cooperative contribution to a future that we will never again be able to dominate, but which we must help to build.

    The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.

  • The State Versus the People: Who Controls the Internet?

    India’s government and Twitter have been fighting a legal battle over compliance with domestic laws. In June 2021, the tech giant failed to make key local appointments required under India’s new information technology rules. Twitter has more than 22 million users in India and was recently categorized as a “significant social media intermediary” alongside Facebook and WhatsApp.

    Soon after, the companies were required to appoint three Indian officers to a mandatory compliance and grievance redress mechanism. The government’s aim is to make social media companies more accountable to local law enforcement agencies. These officers would help local authorities access data from servers. While WhatsApp and Facebook complied, Twitter did not.


    In retaliation, the Indian government petitioned and then stripped Twitter of its “safe harbor” immunity, which protects it against liability for the content posted by users. Non-compliance has made Twitter vulnerable. Ever since, it has faced four major lawsuits from some of India’s top statutory bodies, including the National Commission for Protection of Child Rights.

    During an appearance in the Delhi High Court, Twitter denied having any intent to contravene government regulations. In response, one official characterized Twitter’s position as a “prevarication” that “cocks a snook at the digital sovereignty of this country.”

    Today’s Daily Devil’s Dictionary definition:

    Sovereignty:

    A term used to foreignize a rival while claiming moral supremacy and projecting oneself as an insider

    Contextual Note

    India is not alone in expecting tech giants like Twitter and Facebook to be subject to local laws. States across the world use various data localization laws to assert their sovereignty. Over the years, companies have been asked to place their servers within national jurisdictions. Alternatively, governments have claimed copies of all data and unhindered access to servers located abroad. According to data collected by the European Centre for International Political Economy, the number of such laws more than doubled between 2010 and 2015, from 40 to over 80. That number has been steadily increasing.

    Data localization trends reflect a certain sense of insecurity among states about diluting their digital sovereignty. In an article on digital sovereignty and international conflicts, James A. Lewis defines it as “the right of a state to govern its network to serve its national interests, the most important of which are security, privacy and commerce.”

    Nevertheless, the irony in this scenario is hard to miss. Because the companies themselves remain open to influence in the world’s largest technology markets, little prevents information from circulating there anyway. Government access to servers located in India or elsewhere cannot exclude the enterprises themselves. At the very least, host countries pose a constant security threat to the data stored on servers in their jurisdiction. These states hold significant leverage.


    Making matters worse, most global servers are concentrated in a select few powerful countries. Four of the world’s five largest server facilities are located in the US. Although seemingly borderless, the internet extensively depends on a physical infrastructure, which makes it vulnerable to interference. This makes data localization ineffective and hard to implement. For instance, since 2016, Forbes has identified multiple instances in which WhatsApp shared data with the US government. It may be that there are no laws that can provide states with the supremacy they seek. 

    These issues are not new. Experts have been pointing to the futility of data localization laws for some time. Lewis warns of an impending Balkanization of the internet because of such laws. He posits that each state seeking its own internet may lead to no internet at all.

    However, the struggle between omnipotent states and omnipresent corporations could have deeper impacts on where global power lies. Apart from the practical problem of granting each state supremacy, the fundamental question relates to our current idea of sovereignty and its bearing on our times.  

    Historical Note

    The modern idea of sovereignty was arguably formulated in Europe with the Treaty of Westphalia in 1648. The treaty brought an end to more than a century of continuous religious violence in the Holy Roman Empire. It was the culmination of various failed attempts that built up to the idea of recognizing a supreme authority within a territory that took on the character of a nation-state. However, sovereignty is a much wider concept with many variants.

    Modern sovereignty is envisaged as the political power held by one authority. Historically, that has not always been the case. Overlapping power was distributed between the monarchs and the Catholic Church throughout most of the Middle Ages. Monarchs dealt with the temporal prerogatives of society. Here too, authority was highly distributed among nobles and vassals responsible for maintaining the monarch’s troops. The church dealt with spiritual considerations. It stood as the conscience keeper of both monarchy and society, which permitted the church to play an active role in the secular authorities’ decisions. The church held large tracts of land and actively participated in wars.

    Thus, the Westphalian notion of “supreme authority over a territory” was established by overriding powerful historical forces in the process, including some that were even more powerful than the monarchs.

    The Treaty of Westphalia stripped the churches of their decision-making power. Pope Innocent X immediately expressed his reasoned disagreement along with his taste for provocative adjectives, calling the treaty “null, void, invalid, iniquitous, unjust, damnable, reprobate, inane, and empty of meaning and effect for all time.” 


    Present-day corporations share some of the features of the mighty nobles and ecclesiastics of the past. At $2.1 trillion, Apple has a market capitalization greater than the GDP of 96% of the world’s countries, while Amazon surpasses 92% of country GDPs at $1.7 trillion. Undoubtedly, they command the digital realm with no real challenge to their authority and would be unlikely to accept a challenge by believers in modern sovereignty.

    Modern sovereignty was established on a territorial basis, against large trans-border forces like religion. Unsurprisingly, it achieved little of its aims beyond curbing the immediate violence. Today, the forces that states face are similar in their reach. The current scenario concerning data localization laws bears an uncanny resemblance to the Peace of Augsburg of 1555.

    At Augsburg, political forces sought to curb conflict by nationalizing religion. States agreed on the principle of cuius regio, eius religio, meaning that the prince’s religion would be their realm’s religion. This created power blocs consisting of countries practicing similar religions. This provided the historical logic behind the devastating Thirty Years’ War. Today, the Balkanization of the internet may result in blocs of states with similar laws, leading to potentially disastrous outcomes for internet freedom.

    The idea of sovereignty originated from morality and not mere capability. It may be legitimate to ask whether states, rather than people, are the correct party to claim rights to privacy and data protection. States have been equally guilty of exploiting access to private information. Recently, a number of prominent political figures, journalists, human rights activists and business executives in more than 50 countries appear to have been targeted using the Pegasus spyware supplied by the Israeli cybersecurity firm NSO Group. Adding data privacy to the laundry list of state prerogatives, only because it concerns supposedly foreign elements, may be an act of overreach.


    Instead, it may be up to individuals across the world to decide who they sign a contract with to secure a connection to the gods — the satellites in the sky. Just as monarchs, unlike the church, lacked the power to legislate the terms of salvation, the state today may be incapable of regulating what does not belong to its realm: the internet. At best, it could play the role of facilitating a contract between the people and the tech giants.

    Hence, adapting to the changing times may require revising our concept of sovereignty. It will not be easy. The French philosopher Jacques Maritain, in his book “Man and the State,” traced the significant circumscription of sovereignty in the wake of World War II. States have at least theoretically united and often consensually given up on their supreme authority on subjects such as human rights and climate change. Supranational arrangements like the EU are testaments to the changing times and ideas.

    The rise of a new digital realm may provide us with a chance to forge this change. It could help us question whether issues like data privacy and protection should be subject to a state’s consent or whether they concern the people, who might want to define their own personal sovereignty. 

    *[In the age of Oscar Wilde and Mark Twain, another American wit, the journalist Ambrose Bierce, produced a series of satirical definitions of commonly used terms, throwing light on their hidden meanings in real discourse. Bierce eventually collected and published them as a book, The Devil’s Dictionary, in 1911. We have shamelessly appropriated his title in the interest of continuing his wholesome pedagogical effort to enlighten generations of readers of the news. Read more of The Daily Devil’s Dictionary on Fair Observer.]

    The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.

  • From Merkel to Baerbock: Female Politicians Still Face Sexism in Germany

    Angela Merkel has become a symbol of women’s success and self-assertion in a political arena still dominated by men, both in Germany and globally. Until a few months ago, the prospect of a female successor seemed very likely. But the initial euphoria, shortly after the Green Party named Annalena Baerbock as its candidate for the chancellorship, has died down.


    In May, polls showed that 43% of the German population perceived Baerbock as a suitable successor to Merkel, leading her two main contenders; by the end of August, this figure was down to 22%. Targeted online campaigns have been busy exploiting Baerbock’s missteps and stoking fears of political change among voters. These attacks have laid bare how modern political campaigns in the age of social media flush to the surface sexist attitudes that persist in both politics and wider society.

    Belittled and Patronized

    Before Merkel rose to become one of the world’s most powerful female politicians, she was underestimated and belittled throughout the 1990s as a woman from East Germany by a male-dominated West German political class. Despite prevailing in intra-party struggles by often adapting to male behavior, she still had to face gender-based headwinds during her first general election campaign in 2005 as the front runner of her party.

    The Christian Democratic Union/Christian Social Union (CDU/CSU) began the campaign polling at 48%, only to plummet to a sobering 35.2% on election day, securing a knife-edge victory over the incumbent, Chancellor Gerhard Schröder. Even back then, when social media was still a negligible factor, Merkel had to face partly overt, partly subliminal gender-discriminatory reporting. German media dissected Merkel’s outward appearance, starting with the corners of her mouth and her hairstyle and ending with her now-famous pantsuits.

    According to Rita Süssmuth, president of the German Bundestag from 1988 to 1998, at times, “there was more discussion about hairstyle, outer appearance, facial expression, hands, etc. than there was debate about the content. And how often did the question come up: Can the girl do it?”


    Her competence was called into doubt, as stereotypical headlines from the time show: “Angela Merkel — an angel of understanding kindness,” “A power woman … corpses pave her way.” In 2004, the Austrian newspaper Die Presse came to the following conclusion to the question of why Merkel had to face such inappropriate media scrutiny: “Because she is a woman and comes from the East. And that is not the stuff of political fantasies that make West German men’s clubs ecstatic.”

    Even Merkel’s nickname, “Mutti” (mommy), used affectionately by most Germans now, was originally a derisive epithet. The slow reinterpretation of this nickname is emblematic of how difficult it is for women in politics to break away from antiquated role models.

    Since then, Merkel has emerged victorious in four consecutive elections and is currently the country’s second-longest-serving chancellor after Helmut Kohl. She is one of many global role models who have proven that women are apt leaders. In light of this overwhelming evidence of women’s political prowess, the levels of sexism and disinformation directed against Baerbock are astonishing.

    Targeted From Day One

    When the Green Party chose Baerbock as its front runner in April, it did so with confidence that after 16 years of Angela Merkel, voters had shed their misgivings about aspiring female politicians. If anything, the Greens expected a young, energetic woman to embody political change and provide an appealing contrast to stodgy, veteran male candidates like Armin Laschet of the CDU and Olaf Scholz of the Social Democratic Party (SPD). But soon after the announcement of her candidacy, voices emerged online questioning whether a mother of two would be suitable for the chancellorship. However, it is not just her status as a mother that made Baerbock an ideal target, especially for conservatives and far-right populists on the internet: Unlike Merkel, she is young, less experienced politically, liberal and green.

    Adding to that, Baerbock exposed herself to criticism by making unforced errors. False statements in her CV, delayed declarations of supplementary income and alleged plagiarism in her book published in June provided further ammunition to her adversaries. Her book’s title, “Now. How We Renew Our Country,” and the criticism she faces mirror the Greens’ current dilemma. Before Baerbock could even communicate to voters a new, innovative policy approach with climate protection at its center, public attention had already been diverted to her shortcomings.

    While part of the blame rests with Baerbock herself, the lack of proportionality between the criticism directed at her and at the other contestants in this election is apparent. For more than a year now, accusations have loomed over her contender for the post of chancellor, Olaf Scholz. As finance minister, with oversight of the Federal Financial Supervisory Authority, he is accused of failing to prevent the biggest accounting scandal in the history of the Federal Republic of Germany, surrounding Wirecard AG, a payment processor and financial services provider. Luckily for Scholz, still-unanswered questions concerning the scandal receive scarce media attention, partly because the complexity of the issue makes it harder to distill into bite-size news.


    Armin Laschet, the CDU’s candidate for the chancellorship and minister president of the state of North Rhine-Westphalia, had to navigate rough waters during the COVID-19 crisis. The state government used opaque procedures to award a contract for protective gowns worth €38.5 million ($45.6 million) to the luxury fashion manufacturer van Laack, a company linked to Laschet’s son. Laschet also received criticism for his good-humored appearance during a visit to areas affected by the floods that killed at least 189 people in July. In addition, he too was accused of plagiarism due to suspicious passages in a book published in 2009.

    Even though Scholz’s, and especially Laschet’s, missteps have not gone unnoticed by the media, the public and political opponents, Lothar Probst, a researcher at the University of Bremen, recognizes a systematic character in the criticism faced by Baerbock. In an interview with the German Press Agency, he surmised: “Her credibility, respectability, and authority are undermined, she is portrayed as sloppy. … A young, urban smart woman [is] once again tackled harder than her competitors.”

    Even before Baerbock’s gaffes were in the spotlight, she found herself in the firing line. Conspiracy theories surfaced, suggesting that Baerbock was a puppet of George Soros and an advocate of the “great reset” conspiracy. Disinformation about Baerbock was also gender-based. Collages of sexualized images quickly circulated, including deepfake photographs disseminated via the messaging app Telegram.

    Such disinformation originated largely from far-right circles. In 2019, according to the federal criminal police office, 77% of registered hate posts were attributable to the center-right and far-right political spectrum. According to political scientist Uwe Jun of Trier University, female politicians from green parties are primary targets for right-wing attacks and disinformation because topics such as climate protection and emancipation inflame passions and mobilize the political right.

    Worldwide Concern

    Baerbock’s political opponents and critics deny disproportionate criticism, insisting that she should have known what she had signed up for; after all, election campaigns are not for the faint-hearted, especially when entering the race as the front-runner. Yet statistics prove that in Germany, hatred toward female politicians is an everyday occurrence. A survey by Report München showed that 87% of the female politicians interviewed encountered hate and threats on an almost daily basis; 57% of these were sexist attacks.

    These results are in line with international studies. In a 2019 report “#ShePersisted. Women, Politics & Power in the New Media World,” conducted by Lucina di Meco and Kristina Wilfore, 88 global female leaders were interviewed, most of whom were “concerned about the pervasiveness of gender-based abuse.” The study concluded that “A new wave of authoritarian leaders and illiberal actors around the world use gendered disinformation and online abuse to push back against the progress made on women’s and minority rights.”

    A recent study from January, “Malign Creativity: How Gender, Sex, and Lies Are Weaponized Against Women Online,” by the Woodrow Wilson International Center for Scholars, also shows that 12 of the 13 surveyed female politicians suffered gendered abuse online. Nine of them were at the receiving end of gendered disinformation containing racist, transphobic and sexual narratives, with the latter being the most common.


    Sixteen years have passed between Angela Merkel’s and Annalena Baerbock’s first campaigns for the chancellorship. Today, women striving for power still have to deal with mistrust and gender-discriminatory prejudice. Merkel had to hold her own in a male-dominated environment where she was underestimated and often treated disparagingly. But compared to what Merkel faced, the campaign against Baerbock has reached a new dimension. Merkel, who is childless, outwardly inconspicuous and politically more conservative, offered less of a target to conservative, male adversaries than the young, modern and progressive Baerbock.

    Besides, Baerbock’s opponents in 2021 have more effective tools for spreading gendered disinformation on social media. While disinformation targets both male and female politicians, women are more affected. It aims to undermine women’s credibility and their chances of electoral success and discourage future generations of women from pursuing political careers. Germany’s female politicians must keep in mind that such disinformation is spread by distorted, unrepresentative groups that don’t reflect the social progress made over the years.

    At this particular moment, it appears unlikely that Baerbock will move into the chancellor’s office as Merkel did in 2005 by the narrowest of margins. Yet the race is far from over, with nearly a month until election day. Baerbock’s recent performance in the first of three TV debates proves that she is not ready to abandon the field to (online) campaigners spreading gender-based prejudice and disinformation. Despite polls declaring Scholz the debate’s winner, narrowly ahead of Baerbock, she presented herself as a modern and socioecological alternative to both her contenders and redirected attention away from her persona and gender and back to policy.

    The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.

  • It’s Back to Square One in Libya

    Six years ago, Libya’s political process fell apart almost as soon as it started. The country was forcibly divided as politicians got buyer’s remorse over their agreement and realized that competition was considerably more profitable than cooperation. Libya’s revolutionary transition stalled while rifts deepened, the state degraded and quality of life collapsed.

    Worse still, the moribund process was the perfect environment for a renegade military officer, Khalifa Haftar, to transform a counterterror operation into a Libyan forever war that saw him promoted to general — then field marshal — in a five-year journey of over 2,000 kilometers from eastern Libya to the gates of Tripoli.


    It was an internationally driven campaign that ended with Libya’s domestic bifurcation replicating itself internationally. By June 2020, with Haftar’s campaign and army in tatters, Turkey dominated western Libya, whilst Russia adeptly controlled the east and all that the United Arab Emirates, Egypt and France had once hopefully built for their marshal. Yet this war, and the international dynamics around it, had supercharged Libya’s drivers of destabilization, and the largely clandestine proxy war threatened to explode into direct regional conflict.

    The Political Process in Libya

    So, when the United Nations returned to pick up Libya’s much-abused political process once more, there was relief from many. However, the UN failed to learn from its mistakes of just five years ago and so built a process that may not be an exact repeat of what came before but that certainly rhymes with it.

    It was a process that promised elections in December 2021 and relied upon the same politicians who had divided the country in 2015 first to reunify it and then to prepare the elections that would remove them from office. In an extension of that same wisdom, the process also re-empowered Haftar — the defeated megalomaniac who had attacked Libya’s capital in 2019 — and gave him the driver’s seat for building a unified national military. Overseeing it all was a man, Abdul Hamid Dbeibah, who had gamed the UN process by paying millions of euros in bribes to participating Libyans in order to become the prime minister of Libya’s new Government of National Unity (GNU).

    Given the framing of this process, it is perhaps not such a surprise that, eight months later, there is little substantive progress toward elections, while each of the main actors is more firmly rooted in his position.

    Aguila Saleh, the speaker of Libya’s parliament and perhaps the most influential of the remaining political class, has done everything in his power to block progress toward elections, whilst working to reverse what little unification took place after the formation of Dbeibah’s unity government. He has used his role as speaker to continuously postpone necessary and urgent discussions on the constitutional basis for elections — i.e., what exactly the Libyan people would be voting for at the end of the year.

    This forced the discussion out of parliament and into the UN-convened body that had first authorized this new process. However, with all political players having significant influence over that body, and the newest UN special envoy, Jan Kubis, being notable only for his anonymity in the role, these discussions were quickly sidetracked into irrelevance.

    Instead, Saleh worked on extorting the GNU into guaranteeing a swollen budget for him to build out a patronage network across eastern Libya and to develop bilateral relations with countries like Greece and Egypt, providing them access to public tenders in the east. As such, despite the presence of a unity government, Libya is perhaps more divided today than it was 12 months ago, when parallel governments existed, as Saleh acts as a de facto prime minister of the east.

    However, during a recent interview with Reuters, Saleh shirked all responsibility for the failure to make progress on elections. Instead, he publicly blamed the GNU, claiming that Prime Minister Dbeibah had betrayed the UN process and that, as a result, he would be forced to reappoint an eastern government. This is a convenient outcome for Saleh, who has used the process to grab further power and funding for himself, which he will now lock in by refreezing the political transition and any political process with western Libya and its actual government.

    The Field Marshal

    Haftar has supported him toward that end. The UN process bought the warlord time and space to reconstitute what he could of his forces, while Russia and the UAE provided him with mercenaries to buttress his position and allow him to repair his branding. His new-look army still claims to be Libya’s national military and claims parliamentary support for that distinction. However, the groups responsible for local security across eastern and southern Libya no longer follow his orders and unilaterally pursue their own interests, rendering his control nominal.

    Instead, Haftar has focused on maintaining his political credentials and growing his economic activity. His “military investment authority” has started its own construction projects, allegedly using Emirati companies to break ground on three new cities in eastern Libya with a promised capacity of 12 million people — a real boon to the tired and impoverished country of 6 million. His sons continue to dominate smuggling operations throughout Libya even as their father postures for a presidential run.

    Haftar and the media machines provided by his foreign backers have pushed a narrative that Libya’s UN-promised elections are to be presidential elections only, and that any attempt to create a more complex electoral process or constitutional framing would violate the people’s freedom of choice. Saleh has supported this, posturing as a democrat, knowing that a president would not affect his parliament.

    Moreover, both men know that this gambit is a sure winner. Either elections will be forced through, with Haftar using armed groups to fix the vote and become an all-powerful president, or, more likely, a majority of the country will reject the notion of electing someone who bears significant command responsibility for war crimes and the killings of thousands of Libyans over the past five years. Haftar can then leverage his position of supporting elections to regain international legitimacy, put the blame on western Libya and work with Saleh toward an eastern government he controls.

    Such is the disingenuousness of Saleh and Haftar that Dbeibah never even had to try to postpone elections, although most of Libya knew his intention was to be there for the long haul. He has played off the stalling tactics and direct hostility of the other two to build a policy around gathering international support to help his government settle in, rebuild and restore essential services, plan a proper constitutional basis, unify the military and only then — sometime in the future — allow for elections. The financial promise of this rebuilding enterprise has brought him the support of key players beyond Turkey, with whom he remains close.

    Libya’s Future

    As Libya’s process hurtles toward its expected collapse, the shape of its future will look familiar to anyone watching the country: re-division, disingenuous political bickering between those who never had an interest in governing, quiet cooperation between those bickering when it comes to corruption, and the ever-worrying threat of renewed conflict as Haftar awaits a new opportunity to seize power and other armed groups contest the depleted legitimacy of those in charge and look for a route of their own into the government coffers.

    Meanwhile, it is the Libyan people, as always, who suffer as their essential services continue to collapse, their wealth disappears and the soaring temperatures of a warming world begin to make everything that bit more volatile.

    *[This article was originally published by Arab Digest, a partner organization of Fair Observer.]

    The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.

    Some Boots on the Ground Leave Odd Footprints

    Most people consider the trillions of dollars spent by the Americans on military campaigns in Afghanistan and Iraq to have been a waste of money and the sign of a failed vision. When US President Joe Biden reasoned that it was time to put a stop to the spending, he claimed it had long ago achieved its most essential objective: “to ensure Afghanistan would not be used as a base from which to attack our homeland again.”

    Biden thus implicitly admitted that most of the operational objectives defined and pursued over the past two decades ended in failure. He noted that once Osama bin Laden was out of the picture, there was nothing further to accomplish in Afghanistan. “We delivered justice to bin Laden a decade ago,” he proudly announced, “and we’ve stayed in Afghanistan for a decade since,” he complained.

    Anyone who has been through Harvard Business School (HBS) and understands the basic principles of economics would recognize that there were probably better things to do with that amount of money. Hal Brands and Michael O’Hanlon have penned an article in Foreign Affairs with the title “America Failed Its Way to Counterterrorism Success.” Neither did an MBA at HBS. Both men aimed higher, frequenting other elite universities (Stanford, Yale and Princeton), where they focused on what really matters: managing the global strategies of an expanding empire. They share a solid reputation as experts in strategic defense and are associates, respectively, of the neoconservative American Enterprise Institute and the centrist Brookings Institution.

    Brands and O’Hanlon have teamed up to convince their public of the Leibnizian truth that all was for the best despite the obvious fiasco. In their analysis of the profligate waste that appears on the Pentagon’s still-unaudited books, they recognize but appear unconcerned by a two-decade-long failure to produce even a minimal return on investment. Thanks to their clever detective work, they believe that there is a hidden success story waiting to be told.

    Their embarrassment with the obvious facts becomes clear in two sentences that begin respectively with the words “but” and “yet.” “But since around 2014,” they affirm, “Washington has settled on a medium-footprint model based on modest investments, particularly in special operations forces and airpower, to support local forces that do most of the fighting and dying.” In the same paragraph, they explain: “Yet the experience of the past two decades suggests that the medium-footprint strategy is still the best of bad options available to the United States.”

    Today’s Daily Devil’s Dictionary definition:

    Medium-footprint:

    In geopolitics, a size that sounds reasonable after realizing that an effort based on maximum power by a mighty nation has catastrophically failed. The virtue associated with a medium footprint follows the convincing reasoning that spending less on wasteful activities is more virtuous than spending more.

    Contextual Note

    For the past two decades, US foreign policy has accelerated the trend, influenced by McDonald’s, of supersizing everything, all in the name of security needs. But the supersized milkshake appears to have fatally slipped from every administration’s hands and has now crashed to the ground in the place where it all began. President Biden’s decision has rocked the foundations of America’s belief in the efficacy of its unparalleled military might.

    Brands and O’Hanlon agree with the now commonly held opinion that the entire campaign in Afghanistan was, in terms of its stated goals, a spectacular failure. They see it as tragic overreach. But rather than applaud Biden for seeking to put an end to “unsustainably expensive military commitments in Afghanistan and Iraq,” they complain that “the United States underreached by pulling back from the broader Middle East too fast and allowing old threats to reemerge.” They appear to lay the blame on all three occupants of the White House since 2014: Barack Obama, Donald Trump and Joe Biden. At the same time, they applaud what they see as a trend of downsizing the supersized calamities initiated by the Bush administration.

    When the Americans put boots on the ground, the footprints they leave tend to be outrageously big and messy. Rather than simply marching forward, US military campaigns have a habit of seeking to mash into the ground everything that seems a bit foreign in their path. The authors think they can do better and applaud what they see as a trend toward a medium footprint as a form of progress. In their eyes, it may even justify all the otherwise obvious failure.

    They present the medium-footprint model as a kind of silver lining in a somber and depressing cloud, daring to invoke a possible positive return on investment. “When combined with nonmilitary tools such as intelligence cooperation, law enforcement efforts, and economic aid,” they write, “this approach provides reasonably good protection at a reasonable price.” They recast the failure as a kind of research and development investment that has prepared a brighter future and will guarantee increased market share for US military domination (i.e., security).

    Historical Note

    In their audit of success and failure, Hal Brands and Michael O’Hanlon rejoice in one accomplishment: that “Washington has inflicted devastating losses on its enemies and forced them to focus more on surviving than thriving.” An honest historian would be tempted to reframe this in the following terms: Washington has inflicted devastating losses on multiple civilian populations and forced them to focus more on surviving than thriving.

    That is the obvious truth concerning people’s lives and social structures in Afghanistan, Iraq, Syria, Libya and Yemen. Is this something Americans who celebrate their military as “a force for good in the world” can really rejoice in? Reducing entire populations to the struggle for survival seems a far cry from spreading democracy and defending human rights. Creating misery is an aggressive denial of human rights. It creates conditions that encourage further violations of those rights.

    Playing the imaginary role of a CEO called in to take over a failing enterprise, the authors note that thanks to their vaunted medium-footprint strategy, “the rate of expenditure has come down markedly in the last decade.” They point out that wars that “once cost as much as $200 billion a year in all” now cost “just a few billion dollars.” That is effective downsizing.

    They define the new strategy as “managing intractable problems rather than solving them or simply walking away.” In other words, the Goldilocks principle. Not too heavy, not too light. “Just right,” or, as the authors put it, “meant to be both aggressive and limited.” 

    Could this be an example of Aristotle’s golden mean as a moral principle? In such situations, Aristotle defines an extreme to be avoided as anger, which provokes the subject “to inflict pain, and to perceive his revenge.” That was the official motivation the Bush administration adopted for its war on the Taliban in 2001. “Managing intractable problems” presumably involves taking into account all the parameters of a situation. It should include engaging in dialogue and negotiation. 

    But that is not what the authors envision. The “aggressive” side they recommend involves delegating the nasty part of war to local partners who “clear and hold terrain,” assured that they will be accompanied by the “direct use of U.S. military power — especially special operations forces, drones, and manned airpower.” Aristotle would object that that could not be thought of as managing the problem. It is simply a more complex and messier prosecution of anger.

    The authors believe this strategy worked against the Islamic State (IS), a movement that wouldn’t even have arisen without the initial “heavy footprint.” But did it work or merely seem to work? There has been a lot of seeming over the past 20 years. Last week’s attack at Kabul airport was conducted by the Islamic State in Khorasan Province, a regional affiliate of IS.

    The authors explain the purpose of their medium footprint: “to maintain the regional military footholds.” From footprint to foothold, all seems to be clear. They even imagine that such strategies pay off in the form of “deeper reform once conditions stabilized.” Can they cite any cases of reform and stabilized conditions as a result of either heavy or medium footprints? Not really, because there haven’t been any. Instead, they console themselves with this conclusion: “On the whole, the medium-footprint strategy was more sustainable and effective than anything Washington had tried before.”

    The lesser of two evils? Or simply the slower of two evils?

    *[In the age of Oscar Wilde and Mark Twain, another American wit, the journalist Ambrose Bierce, produced a series of satirical definitions of commonly used terms, throwing light on their hidden meanings in real discourse. Bierce eventually collected and published them as a book, The Devil’s Dictionary, in 1911. We have shamelessly appropriated his title in the interest of continuing his wholesome pedagogical effort to enlighten generations of readers of the news. Read more of The Daily Devil’s Dictionary on Fair Observer.]

    The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.