More stories

  • Will the US Wake Up From Its Post-9/11 Nightmare?

    Looking back on it now, the 1990s were an age of innocence for America. The Cold War was over and our leaders promised us a “peace dividend.” There was no TSA — the Transportation Security Administration — to make us take off our shoes at airports (how many bombs have they found in those billions of shoes?). The government could not tap a US phone or read private emails without a warrant from a judge. And the national debt was only $5 trillion, compared with over $28 trillion today.

    We have been told that the criminal attacks of September 11, 2001, “changed everything.” But what really changed everything was the US government’s disastrous response to them. That response was not preordained or inevitable, but the result of decisions and choices made by politicians, bureaucrats and generals who fueled and exploited our fears, unleashed wars of reprehensible vengeance and built a secretive security state, all thinly disguised behind Orwellian myths of American greatness.  

    Most Americans believe in democracy and many regard the United States as a democratic country. But the US response to 9/11 laid bare the extent to which American leaders are willing to manipulate the public into accepting illegal wars, torture, the Guantanamo gulag and sweeping civil rights abuses — activities that undermine the very meaning of democracy. 

    Former Nuremberg prosecutor Ben Ferencz said in a speech in 2011 that “a democracy can only work if its people are being told the truth.” But America’s leaders exploited the public’s fears in the wake of 9/11 to justify wars that have killed and maimed millions of people who had nothing to do with those crimes. Ferencz compared this to the actions of the German leaders he prosecuted at Nuremberg, who also justified their invasions of other countries as “preemptive first strikes.” 

    “You cannot run a country as Hitler did, feeding them a pack of lies to frighten them that they’re being threatened, so it’s justified to kill people you don’t even know,” Ferencz continued. “It’s not logical, it’s not decent, it’s not moral, and it’s not helpful. When an unmanned bomber from a secret American airfield fires rockets into a little Pakistani or Afghan village and thereby kills or maims unknown numbers of innocent people, what is the effect of that? Every victim will hate America forever and will be willing to die killing as many Americans as possible. Where there is no court of justice, wild vengeance is the alternative.” 

    “Insurgent Math”

    Even the commander of US forces in Afghanistan, General Stanley McChrystal, talked about “insurgent math,” conjecturing that, for every innocent person killed, the US created 10 new enemies. Thus, the so-called global war on terror fueled a global explosion of terrorism and armed resistance that will not end unless and until the United States ends the state terrorism that provokes and fuels it. 

    By opportunistically exploiting 9/11 to attack countries that had nothing to do with it, like Iraq, Somalia, Libya, Syria and Yemen, the US vastly expanded the destructive strategy it used in the 1980s to destabilize Afghanistan, which spawned the Taliban and al-Qaeda in the first place. In Libya and Syria, only 10 years after 9/11, US leaders betrayed every American who lost a loved one on September 11 by recruiting and arming al-Qaeda-led militants to overthrow two of the most secular governments in the Middle East, plunging both countries into years of intractable violence and fueling radicalization throughout the region.

    The US response to 9/11 was corrupted by a toxic soup of revenge, imperialist ambitions, war profiteering, systematic brainwashing and sheer stupidity. Lincoln Chafee, the only Republican senator who voted against the war on Iraq, later wrote, “Helping a rogue president start an unnecessary war should be a career-ending lapse of judgment.”

    But it wasn’t. Very few of the 263 Republicans or the 110 Democrats who voted in 2002 for the US to invade Iraq paid any political price for their complicity in international aggression, which the judges at Nuremberg explicitly called “the supreme international crime.” One of them now sits at the apex of power in the White House. 

    Failure in Afghanistan

    Donald Trump and Joe Biden’s withdrawal and implicit acceptance of the US defeat in Afghanistan could serve as an important step toward ending the violence and chaos their predecessors unleashed after the 9/11 attacks. But the current debate over next year’s military budget makes it clear that our deluded leaders are still dodging the obvious lessons of 20 years of war. 

    Barbara Lee, the only member of Congress with the wisdom and courage to vote against the war resolution in 2001, has introduced a bill to cut US military spending by almost half: $350 billion per year. With the miserable failure in Afghanistan, a war that will end up costing every US taxpayer $20,000, one would think that Representative Lee’s proposal would be eliciting tremendous support. But the White House, the Pentagon and the Armed Services Committees in the House and Senate are instead falling over each other to shovel even more money into the bottomless pit of the military budget.

    Politicians’ votes on questions of war, peace and military spending are the most reliable test of their commitment to progressive values and the well-being of their constituents. You cannot call yourself a progressive or a champion of working people if you vote to appropriate more money for weapons and war than for health care, education, green jobs and fighting poverty.

    These 20 years of war have revealed to Americans and the world that modern weapons and formidable military forces can only accomplish two things: kill and maim people and destroy homes, infrastructure and entire cities. American promises to rebuild bombed-out cities and “remake” countries it has destroyed have proved worthless, as President Biden has acknowledged. 

    Both Iraq and Afghanistan are turning primarily to China for the help they need to start rebuilding and developing economically from the ruin and devastation left by the US and its allies. America destroys, China builds. The contrast could not be more stark or self-evident. No amount of Western propaganda can hide what the whole world can see. 

    But the different paths chosen by American and Chinese leaders are not predestined. Despite the intellectual and moral bankruptcy of the US corporate media, the American public has always been wiser and more committed to cooperative diplomacy than their country’s political and executive class. It has been well-documented that many of the endless crises in US foreign policy could have been avoided if America’s leaders had just listened to the people.

    Weapons and More Weapons

    The perennial handicap that has dogged US diplomacy since World War II is precisely our investment in weapons and military forces, including nuclear weapons that threaten our very existence. It is trite but true to say that, “when the only tool you have is a hammer, every problem looks like a nail.” 

    Other countries don’t have the option of deploying overwhelming military force to confront international problems, so they have had to be smarter and more nimble in their diplomacy and more prudent and selective in their more limited uses of military force. 

    The rote declarations of US leaders that “all options are on the table” are a euphemism for precisely the “threat or use of force” that the UN Charter explicitly prohibits, and they stymie the US development of expertise in nonviolent forms of conflict resolution. The bumbling and bombast of America’s leaders in international arenas stand in sharp contrast to the skillful diplomacy and clear language we often hear from top Russian, Chinese and Iranian diplomats, even when they are speaking in English, their second or third language.

    By contrast, US leaders rely on threats, coups, sanctions and war to project power around the world. They promise Americans that these coercive methods will maintain US “leadership” or dominance indefinitely into the future, as if that is America’s rightful place in the world: sitting atop the globe like a cowboy on a bucking bronco. 

    A “new American century” and “Pax Americana” are Orwellian versions of Adolf Hitler’s “thousand-year Reich” but are no more realistic. No empire has lasted forever, and there is historical evidence that even the most successful empires have a lifespan of no more than 250 years, by which time their rulers have enjoyed so much wealth and power that decadence and decline inevitably set in. This describes the United States today.  

    America’s economic dominance is waning. Its once productive economy has been gutted and financialized, and most countries in the world now do more trade with China and/or the European Union than with the United States. Where America’s military once kicked open doors for American capital to “follow the flag” and open up new markets, today’s US war machine is just a bull in the global china shop, wielding purely destructive power.    

    Time to Get Serious

    But we are not condemned to passively follow the suicidal path of militarism and hostility. Biden’s withdrawal from Afghanistan could be a down payment on a transition to a more peaceful post-imperial economy — if the American public starts to actively demand peace, diplomacy and disarmament and finds ways to make its voice heard.

    First, we must get serious about demanding cuts in the Pentagon budget. None of our other problems will be solved as long as we keep allowing our leaders to flush the majority of federal discretionary spending down the same military toilet as the $2.26 trillion they wasted on the war in Afghanistan. We must oppose politicians who refuse to cut the Pentagon budget, regardless of which party they belong to and where they stand on other issues.

    Second, we must not let ourselves or our family members be recruited into the US war machine. Instead, we must challenge our leaders’ absurd claims that the imperial forces deployed across the world to threaten other countries are somehow, by some convoluted logic, defending America. As a translator paraphrased Voltaire, “Whoever can make you believe absurdities can make you commit atrocities.”  

    Third, we must expose the ugly, destructive reality behind our country’s myths of “defending” US vital interests, humanitarian intervention, the war on terror and the latest absurdity, the ill-defined “rules-based order” — whose rules only apply to others but never to the United States. 

    Finally, we must oppose the corrupt power of the arms industry, including US weapons sales to the world’s most repressive regimes, and an unwinnable arms race that risks a potentially world-ending conflict with China and Russia. 

    Our only hope for the future is to abandon the futile quest for hegemony and instead commit to peace, cooperative diplomacy, international law and disarmament. After 20 years of war and militarism that has only left the world a more dangerous place and accelerated America’s decline, we must choose the path of peace.

    The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.

  • Did 9/11 Change Everything?

    Twenty years ago, the United States sustained the first substantial attacks on the mainland since the War of 1812. It was a collective shock to all Americans who believed their country to be impregnable. The Cold War had produced the existential dread of a nuclear attack, but that always lurked in the realm of the maybe. On a day-to-day basis, Americans enjoyed the exceptional privilege of national security. No one would dare attack us for fear of massive retaliation. Little did we imagine that someone would attack us in order to precipitate massive retaliation.

    Osama bin Laden understood that American power was vulnerable when overextended. He knew that the greatest military power in the history of the world, deranged by a desire for vengeance, could be lured into taking a cakewalk into a quagmire. With the attacks on September 11, 2001, al-Qaeda turned ordinary American airplanes into weapons to attack American targets. In the larger sense, bin Laden used the entire American army to destroy the foundations of American empire.

    The commentary on this 20th anniversary of 9/11 has been predictably shallow: how the attacks changed travel, fiction, the arts in general. Consider this week’s Washington Post magazine section in which 28 contributors reflect on the ways that the attacks changed the world.

    “The attack would alter the lives of U.S. troops and their families, and millions of people in Afghanistan and Iraq,” the editors write. “It would set the course of political parties and help to decide who would lead our country. In short, 9/11 changed the world in demonstrable, massive and heartbreaking ways. But the ripple effects altered our lives in subtle, often-overlooked ways as well.”

    The subsequent entries on art, fashion, architecture, policing, journalism and so on attempt to describe these subtler effects. Yet it’s difficult to read this special issue without concluding that 9/11, in fact, didn’t change the world much at all.

    The demonization of American Muslims? That began long before the fateful day, cresting after the Iranian Revolution in 1979. The paranoid retrenchment in American architecture? US embassies were rebuilt not in response to 9/11, but to the embassy bombings in Beirut in 1983-84 and in Kenya and Tanzania in 1998.

    The impact of 9/11 on the arts can be traced through a handful of works like Spike Lee’s “25th Hour” or the TV series “24” or Don DeLillo’s “Falling Man,” but it didn’t produce a new artistic movement like Dada in the wake of World War I or cli-fi in response to the climate crisis. Even the experience of flying hasn’t changed that much beyond beefed-up security measures. At this point, the introduction of personal in-flight entertainment systems has arguably altered the flying experience more profoundly.

    And isn’t the assertion that 9/11 changed everything exceptionally America-centric? Americans were deeply affected, as were the places invaded by US troops. But how much has life in Japan or Zimbabwe or Chile truly changed as a result of 9/11? Of course, Americans have always believed that, as the song goes, “we are the world.”

    More Than a Mistake

    In a more thoughtful Post consideration of 9/11, Carlos Lozada reviews many of the books that have come out in the last 20 years on what went wrong. In his summary, US policy proceeds like a cascade of falling dominoes, each one a mistake that follows from the previous and sets into motion the next.

    Successive administrations underestimated al-Qaeda and failed to see signs of preparation for the 9/11 attacks. In the aftermath of the tragedy, the Bush administration mistakenly followed the example of numerous empires in thinking that it could subdue Afghanistan and remake it in the image of the colonial overlord. It then compounded that error by invading Iraq in 2003 with the justification that Saddam Hussein was in cahoots with al-Qaeda, was building up a nuclear program, or was otherwise part of an alliance of nations determined to take advantage of an America still reeling from the 9/11 attacks. Subsequent administrations made the mistake of doubling down in Afghanistan, expanding the war on terror to other battlefields and failing to end US operations at propitious moments like the killing of Osama bin Laden in 2011.

    Lozada concludes by pointing out that Donald Trump is in many ways a product of the war on terror that followed 9/11. “Absent the war on terror, it is harder to imagine a presidential candidate decrying a sitting commander in chief as foreign, Muslim, illegitimate—and using that lie as a successful political platform,” he writes. “Absent the war on terror, it is harder to imagine a travel ban against people from Muslim-majority countries. Absent the war on terror, it is harder to imagine American protesters labeled terrorists, or a secretary of defense describing the nation’s urban streets as a ‘battle space’ to be dominated.”

    But to understand the rise of Trump, it’s necessary to see 9/11 and its aftermath as more than just the product of a series of errors of perception and judgment. Implicit in Lozada’s review is the notion that America somehow lost its way, that an otherwise robust intelligence community screwed the pooch, that some opportunistic politicians used the attacks to short-circuit democracy, public oversight and even military logic. But this assumes that the war on terror represents a substantial rift in the American fabric. The 9/11 attacks were a surprise. The response wasn’t.

    The United States had already launched a war against Iraq in 1991. It had already mistakenly identified Iran, Hamas and jihadist forces like al-Qaeda as enemies linked by their broad religious identity. It had built a worldwide arsenal of bases and kept up extraordinarily high levels of military spending to maintain full-spectrum dominance. Few American politicians questioned the necessity of this hegemony, though liberals tended to prefer that US allies shoulder some of the burden and neoconservatives favored a more aggressive effort to roll back the influence of Russia, China and other regional hegemons.

    The “war on terror” effectively began in 1979 when the United States established its “state sponsors of terrorism” list. The Reagan administration used “counterterrorism” as an organizing principle of US foreign policy throughout the 1980s. In the post-Cold War era, the Clinton administration attempted to demonstrate its hawk credentials by launching counterterrorism strikes in Sudan, Afghanistan and Iraq.

    What changed after 9/11 is that neoconservatives could push their regime-change agenda more successfully because the attacks had temporarily suppressed the Vietnam syndrome, a response to the negative consequences of extended overseas military engagements. Every liberal in Congress, except for the indomitable Barbara Lee, supported the invasion of Afghanistan in 2001, as if they’d been born just the day before. That just happens to be one of those side-effects of empire listed in fine print on the label: periodic and profound amnesia.

    In this sense, Trump is not a product of the war on terror. His views on US foreign policy have ranged across the spectrum from jingoistic to non-interventionist. His attitude toward protesters was positively Nixonian. And his recourse to conspiracy theories derived from his legendary disregard for truth. Regardless of 9/11, Trump’s ego would have propelled him toward the White House.

    The surge of popular support that placed him in the Oval Office, on the other hand, can only be understood in the post-9/11 context. Cyberspace was full of all sorts of nonsense prior to 9/11 (remember the Y2K predictions?). But the attacks gave birth to a new variety of “truthers” who insisted, against all contrary evidence, that nefarious forces had constructed a self-serving reality. The attacks on the twin towers and the Pentagon were “inside jobs.” The Newtown shootings had been staged by “crisis actors.” Barack Obama was born in Kenya.

    The shock of the United States being so dramatically and improbably attacked by a couple dozen foreigners was so great that some Americans, uncoupled from their bedrock assumptions about their own national security, were now willing to believe anything. Ultimately, they were even willing to believe someone who lied more consistently and more frequently than any other politician in US history.

    Trump effectively promised to erase 9/11 from the American consciousness and rewind the clock back to the golden moment of unipolar US power. In offering such selective memory loss, Trump was a quintessentially imperial president.

    The Real Legacy of 9/11

    Even after the British formally began to withdraw from the empire business after World War II, they couldn’t help but continue to act as if the sun didn’t set on their domains. It was the British who masterminded the coup that deposed Mohammed Mossadegh in Iran in 1953. It was the British at the head of the invasion of Egypt in 1956 to recapture control of the Suez Canal. Between 1949 and 1970, Britain launched 34 military interventions in all.

    The UK apparently never received the memo that it was no longer a dominant military power. It’s hard for empires to retire gracefully. Just ask the French.

    The final US withdrawal from Afghanistan last month was in many ways a courageous and successful action by the Biden administration, though it’s hard to come to that conclusion by reading the media accounts. President Joe Biden made the difficult political decision to stick to the terms that his predecessor negotiated with the Taliban last year. Despite being caught by surprise by the Taliban’s rapid seizure of power over the summer, the administration was able to evacuate around 120,000 people, a number that virtually no one would have expected prior to the fall of Kabul, the capital of Afghanistan. Sure, the administration should have been better prepared. Sure, it should have committed to evacuating more Afghans who fear for their lives under the Taliban. But it made the right move to finally end the US presence in Afghanistan.

    Biden has made clear that US counterterrorism strikes in Afghanistan will continue, that the war on terror in the region is not over. Yet, US operations in the Middle East now have the feel of those British interventions in the twilight of empire. America is retreating, slowly but surely and sometimes under a protective hail of bullets. The Islamic State group and its various incarnations have become the problem of the Taliban — and the Syrian state, the Iraqi state, the Libyan state (such as it is) and so on.

    Meanwhile, the United States turns its attention toward China. But this is no Soviet Union. China is a powerhouse economy with a government that has skillfully used nationalism to bolster domestic support. With trade and investment, Beijing has recreated a Sinocentric tributary system in Asia. America really doesn’t have the capabilities to roll back Chinese influence in its own backyard.

    So that, in the end, is what 9/11 has changed. The impact on culture, on the daily lives of those not touched directly by the tragedies, has been minimal. The deeper changes — on perceptions of Muslims, on the war on terror — had been set in motion before the attacks happened.

    But America’s place in the world? In 2000, the United States was still riding high in the aftermath of the end of the Cold War. Today, despite the strains of MAGA that can be heard throughout America’s political culture, the United States has become one major power among many. It can’t dictate policy down the barrel of a gun. Economically it must reckon with China. In geopolitics, it has become the unreliable superpower.

    Even in our profound narcissism, Americans are slowly realizing, like the Brits so many years ago, that the imperial game is up.

    *[This article was originally published by FPIF.]

    The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.

  • The War on Terror Was Never Turkey’s Fight

    Do you know where you were on August 14, 2001? Perhaps not, since it isn’t a defining day in world history in quite the same way as September 11, 2001, or 9/11, as it’s become known. Yet in the Turkish political landscape, August 14, 2001, can now be seen as something of a watershed moment.

    It was on this day that the Justice and Development Party (AKP) was founded. One of its founding members was a man named Recep Tayyip Erdogan. It was the latest in a long list of parties catering to a religiously devout and socially conservative constituency in Turkey. All the previous ones had been banned.

    What makes August 14, 2001, so significant is the simple fact that the AKP was never banned. Although the party dared to tread on secularist principles in ways few others had, this time the country, with strong European Union support, had no appetite for military-backed bans.

    Turkey Says No

    Just as September 11 didn’t really come out of a clear blue sky for anyone observing the tide of Islamist militancy, so too the success of the AKP in Turkey did not come unannounced. It was a long time in the making, but its assumption of power, so soon after 9/11, has been defining for the country.

    By 2003, when George W. Bush’s war on terror was swinging into action in Iraq, the AKP took control of Turkey’s government. Despite repeated attempts to shutter the party and even a failed 2016 coup, the AKP remains in power. As perhaps the most successful Islamist party in the Middle East, its relationship to both the events of 9/11 and the ensuing war on terror has always been a strained one. The Turkey of the 20th century would have been an unquestioning supporter of US policy. The new Turkey was not.

    I was in Turkey on 9/11 and I saw the immediate reaction of ordinary people to the attacks on the World Trade Center and the Pentagon. In the hours after the towers fell, there were wild, yet in retrospect on-the-mark rumors that the US was about to bomb Afghanistan. The mood among ordinary Turks was not one of support.

    Visceral anger and anti-American sentiment were palpable. While people were not outright cheering al-Qaeda, it was obvious that most wouldn’t take the US side in a fight. This mood was reflected when Washington eventually went to war with Iraq and hoped to use the airbase at Incirlik in southeastern Turkey.

    The parliamentary vote that vetoed the use of the base for flights into Iraq was a pivotal one. It was the first strong sign of demonstrable national action in reflection of a national mood. In the post-Cold War world, Turkey’s Islamist government was ready to plow its own furrow.

    Who Defines Terrorism?

    The years that have followed have seen an ambiguous and often highly contorted relationship with the war on terror. Sometimes, Turkey has used the anti-terrorism concept to its own ends, as have many other US allies. At other times, it has turned a blind eye to activity that surely fell under the banner of terrorism.

    The Arab Spring of 2010 offered Islamists across the Middle East their big moment. Secular autocrats, long propped up by the West, tottered. Turkey’s Islamist government was one of the most vocal and active in attempting to ride this wave that they hoped would bring Islamist governments to a swathe of countries.

    Initially, the signs were good. The Muslim Brotherhood won the first free and fair elections in Egypt. Meanwhile, in neighboring Syria, the long-suppressed Islamist movement threatened to overwhelm the dictatorship of Bashar al-Assad. For a time, Turkey became a beacon of hope and a model for how the rest of the Middle East might evolve.

    Turkish flags were being waved by demonstrators in Syria, and President Erdogan became the most popular leader in the region, loved by people far beyond his own nation. Then the Egyptian coup destroyed the Brotherhood, and Russia and Iran stepped in to save Assad’s regime in Syria. The mood soured for Turkey.

    In an attempt to rescue something from the Syrian conflict, and after the collapse of domestic peace talks between the government and the Kurdistan Workers’ Party, Turkey allowed its border to become a very porous route for jihadists entering Syria. In time, these jihadists named themselves the Islamic State and declared a caliphate. This audacious move raised the stakes well beyond al-Qaeda’s attacks of 2001, with an even more brutal brand of terrorism. Turkey’s ambiguous attitude to these developments was hardly a war on terror.

    Yet by this stage, the concept behind the war on terror had become so nebulous, and the AKP’s relations with the US so strained by Washington’s support for the Kurds in Syria, that it was a case of realpolitik all the way. To any accusation of soft-handedness toward terrorists, Turkey pointed to US attitudes vis-à-vis Kurdish militants.

    President Erdogan has, over time, begun to carve out a space for himself as an anti-Western champion, a leader of some kind of latter-day non-aligned movement, a spokesman for Muslim rights worldwide. This political and cultural position has made Turkey’s place in a liberal, democratic world order highly questionable.

    What seems clear in retrospect is that both 9/11 and the subsequent war on terror were never Turkey’s fights. Because of the longstanding Turkish alliance with the US and NATO, they have been constantly recurring themes in Turkish politics. But the events that have been so central to US policymaking for the past two decades have generally been used to advance Ankara’s own strategic goals, a reflection of the Islamist movement’s assumption of power and entrenched hegemony in Turkey’s contemporary politics.

    The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.

  • Is Operation Enduring Freedom Doomed to Endure Forever?

    Those were heady days in the US stock market. I would wake up by 5 am and watch CNBC before the stock market opened for trading at 6:30 am Pacific time. It was no different on the morning of September 11, 2001. Little did I know that catastrophic things were about to happen that would change the world.

    At 8:45 am Eastern time, an American Airlines flight crashed into the north tower of the World Trade Center in New York City. Within minutes, CNBC stopped discussing stocks and started covering the incident; at that moment, no one knew whether it was an anomalous accident or an attack of some kind.

    Three minutes after 9 am Eastern, as I watched the events unfold in disbelief, I saw a United Airlines passenger aircraft fly right into the south tower. In under an hour, the south tower collapsed, resulting in a massive cloud of dust and smoke. By now, there was no doubt that America was under attack.

    “We will remember the moment the news came, where we were and what we were doing,” said President George W. Bush in an address to Congress on September 20. Images from that Tuesday morning, which came just nine days after my second child was born, are still etched in my memory.

    In all, 2,996 people of 78 nationalities lost their lives in four coordinated attacks conducted by al-Qaeda using hijacked commercial, civilian airliners as their weapons, making 9/11 the second-biggest attack on American soil — second only to the genocidal assault on Native Americans committed by the nation’s immigrant settlers.

    Operation Enduring Freedom: America’s War on Terror

    Addressing the nation the following day, Bush called the attacks “more than acts of terror. They were acts of war.” He promised that “the United States of America will use all our resources to conquer this enemy.” The president went on to assure Americans that this “battle will take time and resolve, but make no mistake about it, we will win.”

    Twenty years later, the US has left Afghanistan and Iraq in a chaotic mess. The question remains: Did the United States win the war on terror that the Bush administration launched in 2001? It is a war that has cost more than $6.4 trillion and over 801,000 lives, according to the Watson Institute for International and Public Affairs at Brown University.

    In October 2001, the US-led coalition invaded Afghanistan and overthrew the Taliban government that had harbored al-Qaeda. Soon after, al-Qaeda militants were driven into hiding. Osama bin Laden, the mastermind behind the 9/11 attacks and leader of al-Qaeda, was killed 10 years later in a raid conducted by US forces in Abbottabad, Pakistan.

    In a shrewd move, Bush had left himself room to take down Iraq and its president, Saddam Hussein, using an overarching definition for the war on terror. In his address to Congress on September 20, Bush also stated: “Our war on terror begins with Al-Qaeda, but it does not end there. It will not end until every terrorist group of global reach has been found, stopped and defeated.”

    True to his words, in 2003, the United States and its allies invaded Iraq under the premise that it possessed weapons of mass destruction. Bush settled his score with Hussein, ensuring he was captured, shamed and subsequently executed in 2006.

    Despite reducing al-Qaeda to nothing and killing bin Laden, despite wrecking Iraq and having its leader executed, it is impossible to say that the US has won the war on terror. All that Washington has managed to do is to trade the Islamic State (IS) group that swept through Syria and Iraq in 2014 for al-Qaeda, giving a new identity to an old enemy. Following the US and NATO pullout from Afghanistan last month, the Taliban, whom America drove out of power in 2001, are back in the saddle. In fact, the Taliban’s recapture of Afghanistan has been so swift, so precise and so comprehensive that the international community is in shock, questioning the timing and prudence of the withdrawal of troops.

    Setting an expectation for how long the war on terror was likely to last, the secretary of defense under the Bush administration, Donald Rumsfeld, remarked in September 2001 that “it is not going to be over in five minutes or five months, it’ll take years.” Rumsfeld, who christened the campaign Operation Enduring Freedom, was prescient, as the war enters its third decade in a never-ending fight against terrorism.

    The Winners and Losers

    Ironically, Operation Enduring Freedom has only resulted in an enduring loss of American freedom, one step at a time. I still remember walking up the jet bridge to receive my wife as she deplaned from a flight in 1991. Another time, when she was traveling to Boston from San Francisco, I was allowed to enter the aircraft and help her get settled with her luggage, along with our 1-year-old. It is inconceivable that anyone would be allowed to do such a thing today, and I would not be surprised if readers question the veracity of my personal experience. In many ways, al-Qaeda has succeeded in stripping Americans of the sense of freedom they have always enjoyed.

    More than Americans, the biggest losers in this tragic war are Iraqis and Afghans, particularly the women. Afghan women, who had a brief respite from persecution under the Taliban’s strict Islamic laws and human rights abuses, are back to square one and justifiably terrified of their future under the new regime. The heart-wrenching scenes from Kabul airport of people trying to flee the country tell us about how Afghans view the quality of life under the Taliban and the uncertainty that the future holds. 

    To its east, the delicate balance of peace — if one could characterize the situation between India and Pakistan as peaceful — is likely to be put to the test as violence from Afghanistan spreads. To its north in Tajikistan, there isn’t much love lost between Tajiks and the Taliban. Tajikistan’s president, Emomali Rahmon, has refused to recognize the Taliban government, and Tajiks have promised to join anti-Taliban militia groups, paving the way for continued unrest and violence in Central Asia.

    If History Could Be Rewritten

    In 2001, referring to Islamist terrorists, Bush asked the rhetorical question, “Why do they hate us?” He tried to answer it in a speech to Congress: “They hate what they see right here in this chamber: a democratically elected government. Their leaders are self-appointed. They hate our freedoms: our freedom of religion, our freedom of speech, our freedom to vote and assemble and disagree with each other.”

    Islamic fundamentalists couldn’t give two hoots about a form of government or a people’s way of life thousands of miles away. The real answer to Bush’s question lies deeply buried in US foreign policy. America’s steadfast support of Israel and its refusal to recognize the state of Palestine are the main reasons it has become a target of groups like al-Qaeda and IS.

    America’s ill-conceived response to the Soviet invasion of Afghanistan in 1979 during the Cold War led to the creation of al-Qaeda. It was with US funds and support that the anti-Soviet mujahideen fought America’s proxy war with the Soviets. Without US interference, al-Qaeda may never have come into existence.

    During the Iran-Iraq War of the 1980s, the US bolstered Saddam Hussein by backing his regime against the Iranians. When Hussein became too ambitious for America’s comfort and invaded Kuwait in 1990, George H.W. Bush engaged Iraq in the Persian Gulf War. The US motive at that time was primarily to protect its oil interests in Kuwait.

    The US created its own nemeses in Saddam Hussein and Osama bin Laden and spent $6 trillion to kill them. In the process, US leaders have reduced Iraq and Afghanistan to shambles and created a new monster in the Islamic State.

    Sadly, history can never be rewritten. The US has proved time and again that its involvement in the Middle East and Muslim world is aimed at advancing its own political interests. The only question that remains is: Can the US adopt a policy that would not aggravate the situation and, over time, deescalate it, without creating yet another Hussein or bin Laden? Without a radically different approach, Operation Enduring Freedom is doomed to endure forever, costing trillions of dollars each decade.

    The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.

  • Liberalizing India’s Economy Is Critical for Global Stability

    The COVID-19 pandemic is increasing inequality globally, and even advanced economies have not been spared. Inequality was already on the rise before the pandemic began in 2020. Decades of globalization, loose monetary policy and the rise of oligopolies have contributed to this phenomenon. In many ways, globalization has kept inflation down. When Walmart imports Chinese goods, Americans get more for less.

    China can manufacture cheaply because labor costs are low. The Chinese Communist Party (CCP) also runs an authoritarian regime. The regime has repressive land and labor laws with scant regard for human rights. Legally, the CCP owns all the land in China and can appropriate any property it wants. Similarly, workers have little recourse to courts and sometimes work in slave-like conditions.

    A rising China is challenging the postwar global order. Democracies, including the United States, are finding it difficult to meet the challenge for two reasons. First, loose monetary policies in recent years have brought back the specter of inflation. Second, no economy other than China’s can meet the supply needs of advanced economies. From laptops to toys, most goods are made in China.

    Labor arbitrage has defined globalization from its early years. Companies set up factories where wages tend to be lower. This increases revenues and profits, making consumers and shareholders happy. Given rising inflationary expectations, advanced economies need labor arbitrage to keep costs of goods down. At the same time, these democratic societies want to decouple their supply chain from China.

    With the size of its young workforce, India has a unique opportunity to become the new workshop of the world and emerge as a stabilizing global force in a multipolar world. To grasp this historic opportunity, it has to liberalize its economy wisely.

    The Legacy of the Past

    India could do well to heed the lessons of the past. The Soviet Union, Western Europe and the US emerged as strong economies after World War II by leveraging their manufacturing base. The war economy had led to a relentless focus on infrastructure, mass production and industrialization. In the case of Western Europe, the Marshall Plan helped put shattered economies back on track.

    Over time, these advanced economies deindustrialized and production started shifting to emerging economies. China’s rapprochement with the US allowed it to enter the postwar Western economic system. Reforms in 1978 were critical to its success. The fall of the Soviet Union in 1991 created a brave new world where companies chased cheap production. China, with its size, scale and speedy centralized decision-making, emerged as the big winner.

    As production moved to China, workers lost jobs in advanced economies and other industries did not emerge to retrain and employ them. The Rust Belt in the US has become a synonym for down-at-heel places left behind by globalization. Even as workers grew poorer, shareholders grew wealthier, exacerbating inequality.

    Today, the United States finds itself in a complicated position with China. On the one hand, the Middle Kingdom steals intellectual property, transgresses international law and challenges the US. On the other hand, it supplies American consumers with cheap goods they need. America’s economic stimulus during the pandemic has, in fact, reinforced the country’s dependency on China. So, Washington cannot hold China’s feet to the fire and penalize its bad behavior. Beijing follows its policy of pinpricks short of outright conflict.

    The US dollar is the reserve currency of the world. Since the days of Alan Greenspan, the Federal Reserve has followed a loose monetary policy. After the 2007-08 financial crisis, the US adopted the Japanese playbook from the 1990s and introduced quantitative easing. In practice, this means buying treasury and even corporate bonds to release money into the economy after interest rates touch zero. Such increased liquidity in the US has led to bloated company valuations and allowed the likes of Amazon or Uber to expand their operations. The cost of capital has been so low that profitability in the short or even medium run matters little.

    Loose monetary policy has enabled the US to counter China’s state-subsidized companies to some degree. Yet both policies have distorted the market. The US can only continue with loose monetary policy as long as inflation is low. Should inflation rise, interest rates would also have to rise. This might trigger a stock market collapse, increase the cost of capital for its companies and weaken the global dominance of the US economy.

    To persist with its economic model and simultaneously contain China, the US needs to curb inflation. This is only possible by shifting some if not all production away from China. Mexico, Vietnam and Bangladesh are possible alternatives. Mexico has a major drug, violence and governance problem. Vietnam and Bangladesh benefit from huge Chinese investment. Therefore, they might not be the best hedge for securing supply chains from the Middle Kingdom, especially if the companies manufacturing in these countries are Chinese.

    As a vibrant democracy with a formidable military, India offers the US and the West a unique hedge against China. For geopolitical reasons alone, manufacturing in India makes sense. However, doing business in the country continues to be difficult because of red tape, corruption, erratic policymaking, a colonial bureaucracy with a socialistic culture and more.

    India’s Nehruvian past still hobbles the nation’s economy. The country adopted socialist command-and-control policies using a colonial-era bureaucracy that prevented the economy from achieving high economic growth. Manufacturing suffered the most. To start a factory, any entrepreneur needed multiple licenses that cost time, money and energy. Poor infrastructure made it difficult for manufacturers to compete with their East Asian counterparts. While wages were low in India, the cost of doing business made many manufacturers uncompetitive.

    Acquiring land in India is still a challenge. The experience of the Tata group in Singur revealed both political and legal risks that still exist. Similarly, convoluted labor laws made hiring and firing onerous, rendering companies inflexible and unable to respond quickly to market demand. Liberalization in 1991 improved matters, but the state continues to choke the supply side of the Indian economy.

    In the second half of the 1990s, liberalization lost momentum. Coalition governments supported by strong interest groups stalled reforms. In fact, India drifted back to left-leaning policies starting in 2004, and this severely limited economic growth. For instance, many industrial and infrastructure projects were killed by ministers to protect the environment. India’s toxic legacy of Nehruvian socialism persisted in terms of continuing state intervention. The country never meaningfully transitioned from an agricultural to an industrial economy and still suffers from low productivity. This in turn has constrained consumption and slowed down growth.

    India’s much-heralded information technology sector only grew because it was new. The government did not exactly know what was going on and, as a result, there were fewer regulations to constrain this sector. Fewer regulations meant that the likes of Infosys and Wipro had greater autonomy in decision-making and fewer bribes to pay.

    Reduce Red Tape

    The first thing that India needs is an overhaul of its colonial-era bureaucracy that resolutely strives to occupy the commanding heights of the economy. It foists endless red tape on business, strangles entrepreneurship and takes too long to make most decisions. Government service is seen as lifelong employment. Once people become bureaucrats, they have little incentive to perform. Like their colonial predecessors, they lord over citizens instead of serving them. Rarely do they craft sensible policies. Even when a government comes up with a good policy, bureaucrats implement it poorly when they are not sabotaging it actively. This must change. Bureaucrats must be accountable to citizens. Performance-linked promotions and dismissal for underperformance are long overdue.

    Over the years, politicians have tried to deliver benefits and services to citizens to win reelection. To get around a corrupt, colonial and dysfunctional bureaucracy, they instituted direct benefit transfers for welfare schemes, emulating other emerging economies like Brazil. This move is necessary but not sufficient. India needs sound economic policymaking directed by domain experts in each administrative department.

    Only members of the Indian Administrative Service (IAS) occupy key positions in the finance ministry. Instead, India needs economists, chartered accountants, finance professionals and those with varied skill sets in this ministry. The treasuries of the US, Britain, Germany and almost every other advanced economy have this diversity of talent in their upper echelons.

    There is no reason why economic policymaking in 21st-century India should be monopolized by an archaic IAS. The government has made noise about the lateral entry of professionals into policymaking, but tangible results have been few and far between.

    If the bureaucracy holds India back, so does the judiciary. Nearly 37 million cases are pending in the courts. It takes around six years for a case to be resolved in a subordinate court, over three years in the high courts and another three years in the supreme court. A case that goes all the way to the supreme court takes an average of 10 years to resolve. Many cases get stuck for 20 to 30 years or more.

    India needs to reform its judicial system if its economy is to thrive. Justice is invariably delayed, if not denied, and it also costs an arm and a leg. Not only does it add to transaction costs, but it also undermines business confidence. Virtual courts have already shown the way forward during the pandemic. A higher number of judges using both in-person and online technology could reduce the seemingly unending number of pending cases.

    Create Efficient Markets

    To improve labor productivity and consumption, the government must reduce inflation and improve purchasing power. For decades after independence in 1947, India was united politically but divided economically. Producers in one state could not sell in other states without paying taxes and, in some cases, bribes. In agricultural markets, they could not even sell in other districts. India’s new goods and services tax (GST) might be imperfect, but it has already made a difference. Even during a pandemic, interstate goods movement rose by 20% and menu costs, a term in economics used for the costs of adapting to changing prices or taxes, dropped because tax filings were done online.

    The 2016 Insolvency and Bankruptcy Code has led to major efficiency gains. Now, lenders can recover their debt more speedily. Bankruptcy proceedings are now much simpler even if haircuts remain high. Unsurprisingly, India has risen in the World Bank Doing Business rankings from 130 in 2016 to 63 in 2020.

    As Atul Singh and Manu Sharma explained in an article on Fair Observer in 2018, non-performing assets of Indian banks have led to a financial crisis. The government would do well to adopt some, if not all, of the reforms the authors suggested. Given rising inflationary pressures because of rising oil prices, India’s central bank can no longer cut rates. So, the government has to be creative in tackling its banking issues and in freeing up liquidity for Indian businesses with great potential to grow. Banks, burnt by poor lending in the past and fearful of corruption charges, must rediscover the judgment and appetite to lend to deserving businesses in a fast-growing economy that needs credit for capital formation.

    A little-noticed need of the Indian economy is to strengthen its own credit rating systems and agencies. Capital flows are aided by accurate corporate and political risk assessment. The US enjoys a global comparative advantage in attracting investments thanks to the big three homegrown agencies: S&P, Moody’s and Fitch. These agencies tend to fall short in their India assessment. The standards they set give American companies an advantage over Indian ones.

    Therefore, both the private sector and the government must strengthen Indian rating agencies such as CRISIL and ICRA. These agencies are improving continuously. They now have access to increased digital high-frequency data, which they can interpret in the domestic context. As a result, Indian agencies can benchmark corporate or sovereign risk better than their American counterparts for domestic markets. A better benchmarking of risk is likely to deepen the bond market and cause a multiplier effect by enabling companies to raise money for increased capital expenditure.

    For decades, India followed a socialist model of agriculture, doling out large unsustainable subsidies. As Singh and Sharma explained in a separate article, the Soviet model was the inspiration for the Indian one. Indian agriculture denuded groundwater, emptied government coffers and lowered farm productivity. The current reforms allow farmers to grow what they want and sell wherever they want to bypass parasitic middlemen. The new legislation emulates the US farm bills and promises to boost agricultural production, lower inflation and increase exports. This legislation might also lower rural hunger and improve India’s human capital in the long term.

    India has to transition hundreds of millions of people from agriculture to industry. Currently, 58% of the country’s population depends on agriculture, a sector that contributes just 20% of gross domestic product (GDP). All advanced and industrialized economies have a much lower percentage of their populations engaged in agriculture: in the US, the figure is 1.3%, while in Vietnam, 43% work in agriculture. The last time the US had 50% of its population engaged in agriculture was in 1870.

    Improve Infrastructure

    To facilitate movement from agriculture to industry, India must invest in infrastructure and urbanization. For decades, its infrastructure has been woefully inadequate. Indian cities are known to be chaotic and do not provide basic services to their citizens. Recently, India launched a $1.9-trillion National Infrastructure Pipeline that is rolling out roads, railways, seaports and airports to connect centers of manufacturing with points of export. This focus on infrastructure has to be consistent and relentless.

    India could emulate Chinese cities like Chongqing and Shenzhen, which are home to industry and hubs of both domestic and international trade. Projects like the smart city in Dholera, 80 kilometers from Gujarat’s capital of Ahmedabad, are the way forward. Similarly, the new Production Linked Incentive scheme is the sort of policy India needs. The Tatas are setting up a plant to manufacture lithium-ion batteries under this scheme. Not only could Indian industry meet the needs of a fast-growing market, but it could also be a source of cheap imports for many other countries.

    India must not only focus on metropolises, but also smaller cities and towns where the cost of living is lower. Digitalization of work will allow people to stay in such urban areas. Of course, they will need investment and organization for which India must tap capital and talent not only nationally but internationally. For instance, pension funds in North America and Europe are seeking growth to meet their increasing liabilities. If India could get its act together, investment into Indian markets could be significant.

    A key part of infrastructure that needs reform in a society with low energy consumption is the power sector. Gujarat’s growth is underpinned by increased production and improved distribution of electricity. The rest of the country must emulate this westernmost state, and Gujarat itself must bring in further reforms. Cleaner energy sources such as gas, solar, wind and hydro must grow further. A nationwide energy market would bring in efficiency gains and boost growth.

    A focus on renewable energy also brings risks and opportunities. Currently, China controls critical metals and rare earths required in electric vehicle and battery manufacturing. Beijing has an effective monopoly over 80% of the world’s cobalt, 50% of lithium, 85% of rare earth oxides and 90% of rare earth metals. A decarbonized future cannot be intrinsically linked to an authoritarian state that has a history of not playing by free market rules.

    India’s $1.1-billion “Deep Ocean Mission” offers a unique opportunity for the country to provide energy security to democratic nations in North America, Europe and elsewhere. As they transition to clean technologies, India can provide a safer, more reliable and benign alternative to an increasingly belligerent China.

    In 2021, India has a historic opportunity to enter a new economic arc. The global conditions could not be more favorable. Advanced economies are looking to decouple from China without triggering inflation. India is the only country with the size and the scale to be an alternative. Its large youth population and rising middle class are powerful tailwinds for high economic growth. Indeed, India owes it not only to its citizens, but also to the rest of the world to get its act together and become a force for global stability at a time of much volatility and uncertainty.

    The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.

    India’s Highway Construction Is in the Fast Lane

    When experts look back at the early 2000s, they will observe that India embarked on a construction spree to develop its transport infrastructure. The country is emulating what the United States and Europe did in the previous century and what China and East Asia have done more recently. Traditionally, India focused on railways. For the last 20 years, roads have been the priority. Now, the country is also focusing on its 116 rivers and long coastline to develop commercial waterways. 

    As is well known, various factors contribute to a nation’s development. The most fundamental is the availability of food and water for the population. Here, India has had some success since its independence in 1947. In health care and education, India can and must do better. India also needs to improve safety and security for its citizens and improve the rule of law. The factor most important for India’s development is perhaps transportation because it has the greatest multiplier effect on the economy. As a result, transportation has the greatest potential to improve the lives of ordinary citizens.

    Transportation infrastructure, such as railways, roads, airports and waterways, forms the arteries of a country’s economy. The German economy was built on the backbone of an outstanding railway system and the legendary autobahn. The US is knit together by a crisscrossing network of freight trains, interstate highways and airports. Advanced economies like Japan, South Korea, Switzerland and the Netherlands are known for their evolved infrastructure.

    In recent years, China has set the standard for implementing infrastructure at a scale and speed unprecedented in history. Most economists credit China’s spectacular rates of economic growth to its investment in infrastructure. India is betting that building good infrastructure will boost growth, create jobs and raise the standard of living for hundreds of millions.

    Railway and Highway Infrastructure

    According to a 2018 report by NITI Aayog, the premier policy think tank of the Indian government, 59% of all freight in India is transported by road, 35% by railways, 6% by waterways and less than 1% by air.

    As of March 31, 2020, India’s railway track length stood at 126,366 kilometers, and as of March 31, 2019, the length of its national highways was 132,500 kilometers. Per 100 square kilometers, India has more railway tracks and highways than countries like the US and France. This does not necessarily mean India is doing well. South Korea and Japan have over four times the highway length per 100 square kilometers.

    Instead of the density of infrastructure per unit area, density per population size seems to be the more accurate metric. When it comes to infrastructure per million people, India fares very poorly. For instance, Indonesia’s population is merely 20% of India’s, but its highways are twice as long as India’s. South Korea’s population is a tiny 4% of India’s, but its highways are thrice as long as India’s. The top two stars on the infrastructure front are the US and Australia, followed by Japan and France.
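
    To make the per-population comparison concrete, here is a minimal sketch in Python. It uses only the ratios cited above (population as a share of India’s and highway length as a multiple of India’s), not official statistics, to show why the per-capita ranking differs so sharply from the per-area one:

        # Illustrative sketch: why per-capita density flips the per-area ranking.
        # The values are the ratios cited in the text (population as a share of
        # India's, highway length as a multiple of India's), not absolute figures.
        ratios = {
            "India":       {"population_share": 1.00, "highway_multiple": 1.0},
            "Indonesia":   {"population_share": 0.20, "highway_multiple": 2.0},
            "South Korea": {"population_share": 0.04, "highway_multiple": 3.0},
        }

        for country, r in ratios.items():
            # Highway length per person, relative to India (India = 1.0).
            relative_per_capita = r["highway_multiple"] / r["population_share"]
            print(f"{country}: {relative_per_capita:.0f}x India's per-capita highway length")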

    India’s highway network is inadequate for the country’s needs. Highways comprise just 1.94% of India’s total road network but carry a staggering 40% of total road traffic. This means not only that highways suffer high wear and tear, but also that transportation continues to be a big bottleneck for the economy. It is little surprise that India is finally investing in transport infrastructure.

    After independence in 1947, India underinvested in infrastructure. Two centuries of colonial extraction had left the country with limited resources and almost unlimited public needs. In its early years of independence, India struggled to feed its masses. There was little money to build railways, roads, ports, airports and transport infrastructure.

    India also lacked the expertise to build such infrastructure at scale. Planners, engineers and skilled labor were all in short supply. The nation did not have enough knowledge of transport technology either. A densely populated democracy poses yet another challenge. Infrastructure projects displace large numbers of people. Many resist, others negotiate hard, and still others approach their local politicians, who then oppose these projects to win votes.

    India’s varied geography also imposed daunting challenges for developing infrastructure. Largely flat countries like Australia and France could focus on railways, which run twice as long as their roads. Mountainous countries like South Korea and Japan have built more roads than railway lines. While plains and plateaus in India are crisscrossed by railway lines, roads are the means of transportation in its extensive mountainous regions.

    A New Focus

    Over the last 20 years, India’s focus has shifted to roads. This began under the coalition National Democratic Alliance (NDA) government led by Atal Bihari Vajpayee of the Bharatiya Janata Party (BJP). Although this government lost the 2004 election, the NDA’s vision set transport infrastructure development in motion. In 2014, the BJP-led NDA returned to power and accelerated the building of highways across the country.

    NDA-initiated highway construction was kickstarted by the Golden Quadrilateral, a project connecting India’s four biggest cities: Delhi, Mumbai, Chennai and Kolkata. This boosted economic growth. Since NDA returned to power, India has embarked on Bharatmala Pariyojana, an ambitious project to connect the entire country through a network of highways like the fabled interstate highway system of the US. Even remote regions such as the northeast and Jammu and Kashmir will be covered.

    In the past, India did not measure highways as per international standards, which meant their growth could not be measured and compared easily. To quote management guru Peter F. Drucker, “If you can’t measure it, you can’t improve it.” Since 2018, the measurement of highway length in India has been aligned with international standards. Impressive figures on the growth of national highways have been published, and their interpretation is now clear and consistent.

    There has also been a steady increase in highway construction rates. In March 2021, the rate reached 37 kilometers per day. For the 2020-21 financial year (India’s financial year begins on April 1 and ends on March 31), road construction averaged 29.81 kilometers per day. In 2014-15, the rate was 16.61 kilometers per day. Six years on, the road construction rate has almost doubled and is the fastest India has achieved since independence. The credit goes to Nitin Gadkari, the minister for road transport and one of the star performers of the NDA cabinet. In March 2021, he claimed that India had secured the world record for the fastest road construction.
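
    As a quick check of the “almost doubled” claim, the following minimal sketch in Python works through the arithmetic using only the two rates quoted above; the growth multiple and the implied compound annual growth rate over the six-year span follow directly from those figures:

        # Arithmetic check on the road construction rates quoted in the text.
        rate_2014_15 = 16.61   # km per day
        rate_2020_21 = 29.81   # km per day
        years = 6

        growth_multiple = rate_2020_21 / rate_2014_15
        cagr = growth_multiple ** (1 / years) - 1

        print(f"Growth multiple: {growth_multiple:.2f}x")  # ~1.79x, nearly doubled
        print(f"Implied CAGR: {cagr:.1%}")                 # ~10.2% per year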

    India’s Evolving Waterways Make a Big Splash

    The oldest civilizations originated and flourished near major rivers for a simple reason: rivers provide fresh water, a fundamental human need. They also provided an easy way to travel and transport goods before the advent of roads and railways. Even today, commercial transport of goods via rivers, lakes and oceans costs less than transport over land. While container ships regularly carry goods across the high seas, most countries no longer use their rivers very well. The US, Australia, Japan, Russia and China are among the few countries that use their rivers and inland waterways well.

    India has 116 rivers. Potentially, these could provide 35,000 kilometers of waterways and should be tapped. The government set up the Inland Waterways Authority of India in 1986 for “development and regulation of inland waterways for shipping and navigation.” In spite of tremendous cost advantages, the commercialization of waterways received little attention over the next 30 years. In 2016, the NDA declared 111 rivers across India as national waterways, a quantum leap from the previous five. By 2020, the government had operationalized 12 of these waterways. The journey to suitably develop the remaining 99 will be a long and expensive one. However, this investment will cut logistics costs tremendously in the long run and boost India’s competitiveness.

    Gadkari points out that the cost of logistics in India is 18% of the total cost of production. For China, this figure is 8-10%. Notably, waterways account for 47% of total transportation in China, compared to 3.5% in India. As waterways develop, so will commercial activity along their banks, leading to job creation.

    India has another major underutilized natural resource. It has a long coastline of 7,500 kilometers spread across 14 states. To develop ports and coastal transportation, the government has launched the Sagarmala project. This could achieve what the Golden Quadrilateral did for roads in the past. By 2025, the government aims to increase the share of waterways transportation from 3.5% to 6%, reducing logistics costs, boosting exports and generating 4 million new jobs.

    The Road Ahead

    About 53% of India’s population is under 25 years of age and many of them need jobs. Employed young people are more likely to send their children to school. They are also likely to eat better and live longer. So far, India’s job creation has not kept pace with its economic growth. For social and political stability, the government needs to create jobs.

    While India’s economy continues to grow, the pace of growth does not match the employment needs of India’s young population. Building infrastructure is one of the best ways to generate employment because of its massive multiplier effect in an emerging economy like India. The country needs competent ministers and bureaucrats with domain expertise such as Gadkari. Key ministries overseeing power and finance in New Delhi and India’s state capitals should emulate this model.

    Along with building infrastructure, India must reform its arcane laws of colonial and socialist heritage to boost economic activity. The government must also reform education and vocational training in collaboration with industry to raise the skills of the workforce, improve employability and increase productivity. This is a tall order, but if India gets its house in order, domestic and foreign investment will flow in. The country would then finally be able to join the Asian tigers as one of the world’s fast-growing economies.

    The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.

    Agriculture Is India’s Ray of Hope in Time of Crisis

    As India completes 73 years of independence, agriculture has emerged as a mainstay of the economy. Despite the COVID-19 crisis, Indian agriculture is poised to grow by an estimated 3% in 2020-21. Shaktikanta Das, the governor of the Reserve Bank of India (RBI), has acknowledged that agriculture remains a “beacon of hope” at a time the economy is shrinking.

    The government has announced a new agricultural policy that has drawn both supporters and detractors. Farmer protests have broken out in parts of the country. About 50,000 farmers have marched to New Delhi from the agrarian state of Punjab, objecting to the loosening of the price, storage and sales regulations that have traditionally shielded them from free market forces.

    As of August 25, the International Monetary Fund projected India’s real GDP to contract by 4.5% in 2020. This shrinking of the economy in a country with a growing population could lead to a major crisis. Already, jobs are scarce, industrial production has declined, services have suffered and demand has plummeted. Even after decades of independence, agriculture remains “the largest source of livelihoods in India.” As India gears up to celebrate Mahatma Gandhi’s 151st birthday, there is no better time than now to achieve the Gandhian vision of rural self-reliance.

    Blessing in Disguise

    COVID-19 has made rural areas more important than ever. On March 25, Indian Prime Minister Narendra Modi announced a nationwide lockdown. It took the country by surprise. Millions of urban migrant workers were left with little choice but to walk home to their villages. Carrying their meager household possessions and with their small children in tow, many walked hundreds of kilometers, suffering thirst, hunger and pain. Some died en route.

    India’s Economic Survey 2016-17 estimated the “annual inter-state migration [to be] about 5-6.5 million between 2001 and 2011.” In 2020, this migration has been reversed. People who fled rural areas for urban jobs have returned home. Chinmay Tumbe, a professor of economics at the Indian Institute of Management Ahmedabad and an expert on migration, estimates that 30 million migrants might have returned to their villages since the lockdown began. The number could be as high as 70-80 million if reverse intrastate migration is accounted for.

    The reverse migration from urban to rural areas might be a blessing in disguise. Over the last few decades, urban migration has led to overcrowding of cities, the proliferation of slums and much misery for poor migrants. In cities, they have lacked community, cultural moorings and social safety nets. The massive migration to India’s cities was a result of failed economic policies that focused on megacities while neglecting villages. Several studies have found that at least 60% to 70% of the migrant workers who returned to their native places are unlikely to return to the cities, at least not in the near future. The millions of migrant workers, whom I refer to as agricultural refugees, flocked to cities because the government’s economic policies kept them impoverished.

    A recent study by the Organisation for Economic Co-operation and Development in collaboration with ICRIER, a New Delhi-based think tank, concluded that Indian farmers suffered a cumulative loss of Rs. 45 lakh crore (over $600 billion) between 2000 and 2016-17 because of such policies. Subsequently, the NITI Aayog, a policy think tank of the government of India, admitted that, between 2011-12 and 2015-16, the growth in real farm incomes was less than 0.5% every year. It was 0.44% to be exact.

    Since then, the growth in real farm incomes has been near zero. With farm incomes growing painfully slowly and then stagnating, what else could be expected from the rural workforce but migration to cities where menial jobs as daily wage workers give many the only shot at survival?

    Despite the Hardships

    Despite these hardships, Indian farmers have toiled hard to produce a bumper harvest year after year, leading to overflowing food stocks. Reports show that this abundance of food grains has come in handy. The government has been able to provide subsidized rations to over 720 million people during the four months of the post-COVID-19 lockdown and, in addition, free rations to the needy.

    A buoyant agricultural output has hidden a severe agrarian crisis. Farmers get little money for their produce. With less money available in their hands, rural demand has dipped. This had led to a slowdown in the Indian economy even prior to the lockdown. In a country where the agricultural workforce accounts for nearly 50% of the population, the surest way to bolster the economy is to create more rural demand. This involves providing farmers with decent incomes.   

    The lockdown has increased downward pressure on farm incomes. It coincided with the rabi (winter crop) harvest season and resulted in a crash in demand for winter produce. Farmers suffered huge losses in the case of perishables such as vegetables, fruits, flowers, poultry, dairy and fish. Not all news is grim though. On May 15, the United States Department of Agriculture estimated that India is on course to produce “a record 295.7 million metric tons, with estimated record rice, wheat and corn production.”

    For the next kharif (monsoon crop) season, the sowing area coverage of summer crops has increased by 13.92% as compared to last year. With rains expected to be normal, and with a much higher area under cultivation, the kharif harvest will be bountiful just like the rabi one. It seems that in these times of crisis, agriculture alone provides a ray of hope in India.

    Aim for an Economic New Normal

    The coronavirus pandemic has come as a timely reminder of the limitations of dominant economic thinking. Its inherent bias and blind spots stand exposed. For the last two centuries and more, economics has sacrificed agriculture on the altar of industry. The dominant assumption is that industry drives productivity and growth.

    India has never quite managed to industrialize like, for example, the US or China. Still, it has kept farm incomes low and neglected public investment in agriculture for many decades. As per the RBI, this investment hovered around 0.4% of the GDP between 2011-12 and 2017-18. It is little surprise that agriculture has floundered in India.

    The time has come to change outdated economic thinking. Agriculture matters to India because it employs a majority of the country’s population. It provides food security to 1.3 billion people whose ancestors suffered repeated famines until a few decades ago. COVID-19 gives the country the opportunity to return not to normal, but to a new normal.

    The return of migrant labor to villages gives India the opportunity to reinvigorate its rural economy. The country must tap the socioeconomic wealth of rural enterprise, its diversity, and the traditional knowledge base. Prime Minister Narendra Modi’s vision of Atmanirbhar Bharat — a self-reliant India — can only be achieved through a focus on agriculture. A sharp focus, sensible policies and public investment can unleash growth not only in the sector but also in the country.

    The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.

    What Ails Corporate Governance in India?

    Most businesses perish not because of strong competition or adverse macroeconomic conditions but because of cracks within. One such failing is weak corporate governance. For publicly listed companies, this often translates to controlling shareholders or “promoters” pursuing policies and practices in their own interests at the expense of minority shareholders. It turns out that companies with such promoters are at greater risk of crises and near-death moments in bad economic cycles. Those companies with better governance, where promoters act responsibly in the interests of shareholders, tend to do better during adversity. In fact, savvy investors now treat good corporate governance as an intangible asset.

    This can best be seen in India’s banking sector. In general, private sector banks have practiced better governance than state-owned ones. Consequently, their financial and operating metrics also tell a story of profitable growth with fewer asset quality issues than their public sector peers. No wonder private sector banks trade at a higher valuation than public sector ones.

    Higher valuation puts these banks into a virtuous growth cycle. They are able to raise capital cheaply with less dilution. This reinforces their already high return ratios, which in turn continue to support a higher valuation. This self-perpetuating cycle has led to long-term compounding of shareholder returns. State-owned peers have fared much worse.

    Despite a large number of state-owned banks, the majority of credit growth in India is led by private sector banks. In fact, state-owned banks are struggling and the government is forced to merge them to ensure their survival. The success of well-run private banks demonstrates how good governance can lower a company’s cost of capital. That is not all. The resulting higher valuation also gives such companies immense pricing power in corporate transactions and talent management, widening their economic moat. 

    Multiple Issues

    India boasts of the oldest stock exchange in Asia, which is also the region’s largest. However, corporate governance in India still lags behind many other places like Singapore or Taiwan. India must understand that good corporate governance is the foundation of a lasting business. It builds investor confidence and has other benefits. India is short of capital and needs to earn investors’ trust. Without an infusion of capital, the Indian economy will fail to thrive. 

    There are multiple issues that plague corporate governance in India. First is the lack of accountability among controlling shareholders. For example, promoters get away with appointing their friends, ex-employees and business-school classmates as independent directors with no one raising an eyebrow. Often, statutory auditors are given only one-year extensions to pressurize them to “comply” with management demands. Compliant auditors tend to persist for too long, developing far-too-cozy relationships with the very people they are supposed to keep an eye on. With no strong checks and balances, promoters are in effect incentivized to take advantage of minority shareholders. 

    Second is the slow and selective enforcement by the Securities and Exchange Board of India (SEBI), the country’s market regulator. Cases against the management’s missteps take years to resolve. SEBI generally hands out warnings or mild punishments. This could be because SEBI does not have enough resources to deal with a large number of cases, or it could be a lack of authority or competence. In certain cases, promoters are extremely powerful and politically connected. Given that regulators are political appointees, it is far from easy for them to ignore pressure from politicians, remain impartial, punish the powerful and deliver justice.

    Third is the fact that markets do not punish poorly managed companies for their misdeeds. India needs deeper markets with broader participation for true price discovery. Stock markets must be treated as marketplaces, not as forums for votes of confidence on the government’s economic policies. Because governments place too much importance on market performance, they have an incentive to keep markets inflated. Indian corporate bond markets are even worse than stock markets in terms of participation: in reality, they are accessible to only a handful of companies.

    Fourth is the lack of transparency and weak disclosure requirements. This further perpetuates weak governance. The most detailed yearly disclosures by Indian companies are annual reports, which are often colorful marketing decks instead of detailed, factual and insightful documents like the 10-Ks in the US. For many companies, the quarterly earnings report is just a one-pager that discloses only summary items without any breakdown of details.

    Earlier, manufacturing companies were mandated to disclose operational details pertaining to capacity, production and inventory. A few years ago, this disclosure requirement was done away with. Now, the only time companies make adequate disclosures is during their initial public offerings, a one-time event rather than an annual exercise.

    Bringing Sense to the Madness

    The only way to bring some sense to the madness in India’s public markets is to give more independence, power and resources to SEBI. At the same time, India must seriously penalize the auditors and boards of companies that overlook management follies. In addition, the authorities must incentivize and protect whistleblowers, as developed economies do.

    Some argue that complying with higher disclosure requirements might be too costly for smaller companies. That is not true. Furthermore, even the top 100 Indian companies default frequently on mandatory disclosures. Instead of reducing requirements for disclosures, India should lower costs of disclosures and compliance by using more technology.

    Another way to improve the health of India’s public markets is to increase market participation and trading volumes. Then good corporate governance would be rewarded while poor corporate governance would be penalized. Making short-selling a smoother affair might make the market deeper and more liquid. To increase depth in corporate bond markets, India must make lasting banking reforms. This involves privatization and granting more powers to the banking regulator.

    An unintended consequence of banking reform might be the improvement of India’s infrastructure. Currently, many state-owned enterprises in infrastructure sectors such as power are mismanaged because their bosses are able to buy time by restructuring their bank loans. Banking reforms will make that impossible and will transform this sector too.

    A combination of disclosure, regulation and enforcement can improve corporate governance. Reforms can also reduce conflicts of interest and create the right incentives and disincentives for Indian companies. These would inevitably lead to some short-term backlash, but the substantial long-term benefits are too significant to be ignored.

    The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.