More stories

  • The inside story of how we reached the Facebook-Trump verdict | Alan Rusbridger

    As is so often the case, Donald Trump gets to the heart of the problem. On 6 January, he was the president of the United States: probably the most powerful man in the world. He should be free to speak his mind, and voters should be free to listen. But he was also a habitual liar who, by the end of his term, had edged into repudiating the very democracy that had elevated him.

    And then came his inflammatory words on that day, uttered even as rioters were breaking their way into the heart of US democracy. His words had a veneer of restraint – “We have to have peace, so go home.” But his statements were laced with lies, along with praise for the mob who terrorised lawmakers as they sought to confirm Biden as Trump’s successor – “We love you, you’re very special … great patriots … remember this day for ever.”

    At 5.41pm and 6.15pm that day, Facebook removed two posts from Trump. The following day the company banned Trump from its platform indefinitely. Around the same time, Twitter also moved to ban the president – permanently.

    So there was the problem that Donald Trump embodied – in a country whose commitment to free speech is baked into its core. The president might be a bitterly polarising figure, but surely he has a right to be heard – and voters a right to make up their own minds?

    Facebook’s decision to the contrary would spark passionate debate within the United States. But it had a wider resonance. For how much longer would giant social media platforms act as an amplification system for any number of despots around the world? Would they, too, be banned?

    The classic defence of free expression is that good speech defeats bad speech. Political speech – in some views – should be the most protected speech. It is vital we know who our leaders are. We have a right – surely? – to know if they are crooks, liars or demagogues.

    On 7 January Facebook decided: no longer. And now the Facebook oversight board, of which I am a member, has published its own verdict on the decision: Facebook was both right and wrong. Right to remove his 6 January words and right, the following day, to ban the president from the platform. But wrong to ban him “indefinitely”.

    The key word is “indefinitely” – if only because Facebook’s own policies do not appear to permit it. The oversight board (OSB) judgment doesn’t mince its words: “In applying a vague, standardless penalty and then referring this case to the board to resolve, Facebook seeks to avoid its responsibilities. The board declines Facebook’s request and insists that Facebook apply and justify a defined penalty.” Ball squarely back in Facebook’s court.

    What Facebook has to do now – in our judgment, which the company is bound to implement – is to re-examine the arbitrary penalty it imposed on 7 January. It should take account of the gravity of the violation and the prospect of future harm.

    The case is the most prominent the OSB has decided since it was established as an independent entity and will inevitably focus more attention on its work. Why is such a body thought necessary?

    Let’s assume we might agree that it’s a bad thing for one person, Mark Zuckerberg, to be in charge of the rules of speech for 2 billion or more people. He is clearly a wonderfully talented engineer – but nothing in his background suggests he is equipped to think deeply about the complexities involved in free expression.

    Maybe most people who have studied the behaviour of governments towards publishers and newspapers over 300 years might also agree that politicians are not the best people to be trusted with individual decisions about who gets to say what.

    Into the void between those two polarities has stepped the OSB. At the moment we’re 19 individuals with backgrounds in journalism, law, academia and human rights: by the end of 2021 we hope to be nearer 40.

    Are we completely independent from Facebook? It certainly feels that way. It’s true that Facebook was involved in selecting the first 20 members, but once the board reaches its full complement, we decide who our future colleagues will be. Apart from a few early meetings to understand Facebook’s processes around moderation and similar matters, we have had nothing to do with the company.

    We have our own board of distinguished trustees – again, free of any influence from Facebook. From what I’ve seen of my colleagues so far, they’re an odd bunch to have picked if you were in search of a quiet life.

    The Trump decision was reached through the processes we’ve devised ourselves. A panel of five – with a good spread of regional backgrounds – did the initial heavy lifting, including sifting through more than 9,000 responses from the public.

    The wider board fed in its own views. We looked at Facebook’s own values – what it calls voice, safety and dignity – as well as its content policies and community standards. But we also apply an international human rights lens in trying to balance freedom of expression with possible harms.

    In the Trump case we looked at the UN Guiding Principles on Business and Human Rights (UNGPs), which establish a voluntary framework for the human rights responsibilities of private businesses. We also considered the right to freedom of expression set out in articles 19 and 20 of the International Covenant on Civil and Political Rights (ICCPR) – as well as the qualifying articles to do with the rights to life, security of person, non-discrimination, participation in public affairs and so on.

    We also considered the 2013 Rabat Plan of Action, which attempts to identify and control hate speech online. We took into account a submission sent on behalf of Trump himself and sent Facebook 46 questions, of which it answered 37 fully and two partially.

    And then we debated and argued – virtually, verbally and in writing. A number of drafts were circulated, with most board members pitching in with tweaks, challenges, corrections and disagreements. Gradually, a consensus developed – resulting in a closely argued 38-page decision which openly reflects the majority and minority opinions.

    In addition to our ruling on the original and “indefinite” bans, we’ve sent Facebook a number of policy advisory statements. One of these concentrates on the question of how social media platforms should deal with “influential users” (a more useful concept than “political leaders”).

    Speed is clearly of the essence where potentially harmful speech is involved. While it’s important to protect the rights of people to hear political speech, “if the head of state or high government official has repeatedly posted messages that pose a risk of harm under international human rights norms, Facebook should suspend the account for a determinate period sufficient to protect against imminent harm”.

    As in previous judgments, we are critical of a lack of clarity in some of Facebook’s own rules, together with insufficient transparency about how they’re enforced. We would like to see Facebook carry out a comprehensive review of its potential contribution to the narrative around electoral fraud and to the exacerbated tensions that culminated in the violence of 6 January.

    And then this: “This should be an open reflection on the design and policy choices that Facebook has made that may enable its platform to be abused.” Which many people will read as a not-so-coded reference to what is shorthanded as The Algorithm.

    Social media is still in its infancy. Among the many thorny issues we periodically discuss as a board is: what is this thing we’re regulating? The existing language – “platform”, “publisher”, “public square” – doesn’t adequately describe these new entities.

    Most of the suggested forms of more interventionist regulation stub their toes on the sheer novelty of this infant space for the unprecedented mass exchange of views.

    The OSB is also taking its first steps. The Trump judgment cannot possibly satisfy everyone. But this 38-page text is, I hope, a serious contribution to thinking about how to handle free speech in an age of information chaos.

  • The Spread of Global Hate

    One insidious way to torture the detainees at Guantanamo Bay was to blast music at them at all hours. The mixtape, which included everything from Metallica to the Meow Mix jingle, was intended to disorient the captives and impress upon them the futility of resistance. It worked: This soundtrack from hell did indeed break several inmates.

    For four years, Americans had to deal with a similar sonic blast, namely the “music” of President Donald Trump. His voice was everywhere: on TV and radio, screaming from the headlines of newspapers, pumped out nonstop on social media. MAGAmen and women danced to the repetitive beat of his lies and distortions. Everyone else experienced the nonstop assault of Trump’s instantly recognizable accent and intonations as nails on a blackboard. After the 2016 presidential election, psychologists observed a significant uptick in the fears Americans had about the future. One clinician even dubbed the phenomenon “Trump anxiety disorder.”

    The volume of Trump’s assault on the senses has decreased considerably since January. Obviously, he no longer has the bully pulpit of the Oval Office to broadcast his views. The mainstream media no longer covers his every utterance. Most importantly, the major social media platforms have banned him. In the wake of the January 6 insurrection on Capitol Hill, Twitter suspended Trump permanently under its glorification of violence policy. Facebook made the same decision, though its oversight board is now revisiting the former president’s deplatforming.

    It’s not only Trump. The Proud Boys, QAnon, the militia movements: The social media footprint of the far right has decreased a great deal in 2021, with a parallel decline in the amount of misinformation available on the Web.

    And it’s not just a problem of misinformation and hate speech. According to a new report by the Center for Strategic and International Studies (CSIS) on domestic terrorism, right-wing extremists have been involved in 267 plots and 91 fatalities since 2015, with the number of incidents rising in 2020 to a height unseen in a quarter of a century. A large number of the perpetrators are loners who have formed their beliefs from social media. As one counterterrorism official put it, “Social media has afforded absolutely everything that’s bad out there in the world the ability to come inside your home.”

    So, why did the tech giants provide Trump, his extremist followers and their global counterparts unlimited access to a growing audience over those four long years?

    Facebook Helps Trump

    In a new report from the Global Project Against Hate and Extremism (GPAHE), Heidi Beirich and Wendy Via write: “For years, Trump violated the community standards of several platforms with relative impunity. Tech leaders had made the affirmative decision to allow exceptions for the politically powerful, usually with the excuse of ‘newsworthiness’ or under the guise of ‘political commentary’ that the public supposedly needed to see.”

    Even before Trump became president, Facebook was cutting him a break. In 2015, he was using the social media platform to promote a Muslim travel ban, which generated considerable controversy, particularly within Facebook itself. The Washington Post reports:

    “Outrage over the video led to a companywide town hall, in which employees decried the video as hate speech, in violation of the company’s policies. And in meetings about the issue, senior leaders and policy experts overwhelmingly said they felt that the video was hate speech, according to three former employees, who spoke on the condition of anonymity for fear of retribution. [Facebook CEO Mark] Zuckerberg expressed in meetings that he was personally disgusted by it and wanted it removed, the people said.”

    But the company’s most prominent Republican, Vice-President of Global Policy Joel Kaplan, persuaded Zuckerberg to change his position. In spring 2016, when Zuckerberg wanted to condemn Trump’s plan to build a wall on the border with Mexico, he was again persuaded to step back for fear of seeming too partisan.

    Facebook went on to play a critical role in getting Trump elected. It wasn’t simply the Russian campaign to create fake accounts, fake messaging and even fake events using Facebook, or the theft of Facebook user data by Cambridge Analytica. More important was the role played by Facebook staff in helping Trump’s digital outreach team maximize its use of social media. The Trump campaign spent $70 million on Facebook ads and raised much of its $250 million in online fundraising through Facebook as well.

    Trump established a new paradigm through brute force and money. As he turned himself into clickbait, the social media giants applied the same “exceptionalism” to other rancid politicians. More ominously, the protection accorded politicians extended to extremists. According to an account of a discussion at a Twitter staff meeting, one employee explained that “on a technical level, content from Republican politicians could get swept up by algorithms aggressively removing white supremacist material. Banning politicians wouldn’t be accepted by society as a trade-off for flagging all of the white supremacist propaganda.”

    Of course, in the wake of the January 6 insurrection, social media organizations decided that society could indeed accept the banning of politicians, at least when it came to some politicians in the United States.

    The Real Fake News

    In the Philippines, an extraordinary 97% of internet users had accounts with Facebook as of 2019, up from 40% in 2018 (by comparison, about 67% of Americans have Facebook accounts). Increasingly, Filipinos get their news from social media. That’s bad news for the mainstream media in the Philippines. And that’s particularly bad news for journalists like Maria Ressa, who runs an online news site called Rappler.

    At a press conference for the GPAHE report, Ressa described how the government of Rodrigo Duterte, with an assist from Facebook, has made her life a living hell. Like Trump, President Duterte came to power on a populist platform spread through Facebook. Because of her critical reporting on government affairs, Ressa felt the ire of the Duterte fan club, which generated half a million hate posts that, according to one study, consisted of 60% attacks on her credibility and 40% sexist and misogynist slurs. This onslaught created a bandwagon effect that equated journalists like her with criminals.

    This noxious equation on social media turned into a real case when the Philippine authorities arrested Ressa in 2019 and convicted her of the dubious charge of “cyberlibel.” She faces a sentence of as much as 100 years in prison.

    “Our dystopian present is your dystopian future,” she observed. What happened in the Philippines in that first year of Duterte became the reality in the United States under Trump. It was the same life cycle of hate in which misinformation is introduced in social media, then imported into the mainstream media and supported from the top down by opportunistic politicians.

    The Philippines faces another presidential election next year, and Duterte is barred by term limits from running again. His daughter, currently the mayor of Davao City – a post her father once held – tops the early polls, though she hasn’t yet thrown her hat in the ring and her father has declared that women shouldn’t run for president. This time around, however, Facebook disrupted the misinformation campaign tied to the Dutertes when it took down fake accounts originating in China that supported the daughter’s potential bid for the presidency.

    President Duterte was furious. “Facebook, listen to me,” he said. “We allow you to operate here hoping that you could help us. Now, if government cannot espouse or advocate something which is for the good of the people, then what is your purpose here in my country? What would be the point of allowing you to continue if you can’t help us?”

    Duterte had been led to believe, based on his previous experience, that Facebook was his lapdog. Other authoritarian regimes had come to expect the same treatment. In India, according to the GPAHE report, Prime Minister Narendra Modi’s Bharatiya Janata Party:

    “… was Facebook India’s biggest advertising spender in 2020. Ties between the company and the Indian government run even deeper, as the company has multiple commercial ties, including partnerships with the Ministry of Tribal Affairs, the Ministry of Women and the Board of Education. Both CEO Mark Zuckerberg and COO Sheryl Sandberg have met personally with Modi, who is the most popular world leader on Facebook. Before Modi became prime minister, Zuckerberg even introduced his parents to him.”

    Facebook has also cozied up to the right-wing government in Poland, misinformation helped get Jair Bolsonaro elected in Brazil, and the platform served as a vehicle for the Islamophobic content that contributed to the rise of the far right in the Netherlands. But the decision to ban Trump has set in motion a backlash. In Poland, for instance, the Law and Justice Party has proposed a law to fine Facebook and others for removing content if it doesn’t break Polish law, and a journalist has attempted to establish a pro-government alternative to Facebook called Albicla.

    Back in the USA

    Similarly, in the United States, the far right has suddenly become a big booster of free speech now that social media platforms have begun to deplatform high-profile users like Trump and take down posts for questionable veracity and hateful content. In the second quarter of 2020 alone, Facebook removed 22.5 million posts.

    Facebook has tried to get ahead of this story by establishing an oversight board that includes members like Jamal Greene, a law professor at Columbia University; Julie Owono, executive director at Internet Sans Frontières; and Nighat Dad, founder of the Digital Rights Foundation. Now, Facebook users can also petition the board to remove content.

    With Facebook, Twitter, YouTube and others now removing a lot of extremist content, the far right has migrated to alternative platforms such as Gab, Telegram and MeWe, where it continues to spread conspiracy theories, anti-vaccine misinformation about COVID-19 and pro-Trump propaganda. Meanwhile, the MAGA crowd awaits the second coming of Trump in the form of a new social media platform that he plans to launch in a couple of months to remobilize his followers.

    Even without such an alternative alt-right platform — Trumpbook? TrumpSpace? Trumper? — the life cycle of hate is still alive and well in the United States. Consider the “great replacement theory,” according to which immigrants and denizens of the non-white world are determined to “replace” white populations in Europe, America and elsewhere. Since its inception in France in 2010, this extremist conspiracy theory has spread far and wide on social media. It has been picked up by white nationalists and mass shooters. Now, in the second stage of the life cycle, it has landed in the mainstream media thanks to right-wing pundits like Tucker Carlson, who recently opined, “The Democratic Party is trying to replace the current electorate of the voters now casting ballots with new people, more obedient voters from the Third World.”

    Pressure is mounting on Fox to fire Carlson, though the network is resisting. Carlson and his supporters decry the campaign as yet another example of “cancel culture.” They insist on their First Amendment right to express unpopular opinions. But a privately-owned media company is under no obligation to air all views, and the definition of acceptability is constantly evolving.

    Also, a deplatformed Carlson would still be able to air his crank views on the street corner or in emails to his followers. No doubt when Trumpbook debuts at some point in the future, Carlson’s biggest fan will also give him a digital megaphone to spread lies and hate all around the world. These talking heads will continue talking no matter what. The challenge is to progressively shrink the size of their global platform.

    *[This article was originally published by FPIF.]

    The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.

  • Zuckerberg faces Capitol attack grilling as Biden signals tougher line on big tech

    Mark Zuckerberg, the head of Facebook, could be in for a rough ride on Thursday when he testifies to Congress for the first time about the 6 January insurrection at the Capitol in Washington DC, amid growing questions over his platform’s role in fuelling the violence.

    The testimony will come after signs that the new administration of Joe Biden is preparing to take a tougher line on the tech industry’s power, especially when it comes to the social media platforms and their role in spreading misinformation and conspiracy theories.

    Zuckerberg will be joined by Sundar Pichai and Jack Dorsey, the chief executives of Google and Twitter respectively, at a hearing pointedly entitled “Disinformation nation: social media’s role in promoting extremism and misinformation” by the House of Representatives’ energy and commerce committee.

    The scrutiny comes after a report found that Facebook allowed groups linked to the QAnon, boogaloo and militia movements to glorify violence during the 2020 election and in the weeks leading up to the deadly mob violence at the US Capitol.

    Avaaz, a non-profit advocacy group, says it identified 267 pages and groups on Facebook that spread “violence-glorifying content” in the heat of the 2020 election to a combined following of 32 million users. More than two-thirds of the groups and pages had names aligned with several domestic extremist movements.

    The top 100 most popular false or misleading stories on Facebook related to the elections received an estimated 162m views, the report found. Avaaz called on the White House and Congress to open an investigation into Facebook’s failures and urgently pass legislation to protect American democracy.

    Fadi Quran, its campaign director, said: “This report shows that American voters were pummeled with false and misleading information on Facebook every step of the 2020 election cycle. We have over a year’s worth of evidence that the platform helped drive billions of views to pages and content that confused voters, created division and chaos, and, in some instances, incited violence.

    “But the most worrying finding in our analysis is that Facebook had the tools and capacity to better protect voters from being targets of this content, but the platform only used them at the very last moment, after significant harm was done.”

    Facebook claimed that Avaaz had used flawed methodology. Andy Stone, a spokesperson, said: “We’ve done more than any other internet company to combat harmful content, having already banned nearly 900 militarized social movements and removed tens of thousands of QAnon pages, groups and accounts from our apps.”

    He acknowledged: “Our enforcement isn’t perfect, which is why we’re always improving it while also working with outside experts to make sure that our policies remain in the right place.”

    But the report is likely to prompt tough questions for Zuckerberg in what is part of a wider showdown between Washington and Silicon Valley. Another flashpoint on Thursday could be Section 230 of the 1996 Communications Decency Act, which shields social media companies from liability for content their users post.

    Repealing the law is one of the few things on which Biden and his predecessor as president, Donald Trump, agree, though for different reasons. Democrats are concerned that Section 230 allows disinformation and conspiracy theories such as QAnon to flourish, while Trump and other Republicans have argued that it protects companies from consequences for censoring conservative voices.

    More generally, critics say that tech companies are too big and that the coronavirus pandemic has only increased their dominance. The cosy relationship between Barack Obama’s administration and Silicon Valley is a thing of the past, while libertarian Republicans who oppose government interference are a fading force.

    Amazon, Apple, Facebook and Google have all come under scrutiny from Congress and regulators in recent years. The justice department, the Federal Trade Commission (FTC) and state attorneys general are suing the behemoths over various alleged antitrust violations.

    In a letter this week to Biden and Merrick Garland, the new attorney general, a coalition of 29 progressive groups wrote: “It’s clear that the ability of Big Tech giants like Google to acquire monopoly power has been abetted by the leadership deficit at top enforcement agencies such as the FTC … We need a break from past, failed leadership, and we need it now.”

    There are signs that Biden is heeding such calls and spoiling for a confrontation. On Monday he nominated Lina Khan, an antitrust scholar who wants stricter regulation of internet companies, to the FTC. Earlier this month Tim Wu, a Columbia University law professor among the most outspoken critics of big tech, was appointed to the national economic council.

    There is support in Congress from the likes of David Cicilline, chairman of the House judiciary committee’s antitrust panel, which last year released a 449-page report detailing abuses of market power by Apple, Amazon, Google and Facebook.

    The Democratic congressman is reportedly poised to issue at least 10 legislative initiatives targeting big tech, a blitz that will make it harder for the companies and their lobbyists to focus their opposition on a single piece of legislation.

    Cicilline, who is also working on a separate bill targeting Section 230, told the Axios website: “My strategy is you’ll see a number of bills introduced, both because it’s harder for [the tech companies] to manage and oppose, you know, 10 bills as opposed to one.

    “It also is an opportunity for members of the committee who have expressed a real interest or enthusiasm about a particular issue, to sort of take that on and champion it.”

  • All I want for 2021 is to see Mark Zuckerberg up in court | John Naughton

    It’s always risky making predictions about the tech industry, but this year looks like being different, at least in the sense that there are two safe bets. One is that the attempts to regulate the tech giants that began last year will intensify; the second is that we will be increasingly deluged by sanctimonious cant from Facebook & co as they seek to avoid democratic curbing of their unaccountable power.

    On the regulation front, last year in the US, Alphabet, Google’s corporate owner, found itself facing major antitrust suits from 38 states as well as from the Department of Justice. On this side of the pond, there are preparations for a Digital Markets Unit with statutory powers that will be able to neatly sidestep the tricky definitional questions of what constitutes a monopoly in a digital age. Instead, the unit will decide on a case-by-case basis whether a particular tech company has “strategic market status” – that is, whether it possesses “substantial, entrenched market power in at least one digital activity” or acts as an online “gateway” for other businesses. If a company is judged to have this status, then penalties and regulations will be imposed on it.

    Over in Brussels, the European Union has come up with a new two-pronged legal framework for curbing digital power – the Digital Markets Act and the Digital Services Act. The Digital Markets Act is aimed at curbing anti-competitive practices in the tech industry (like buying up potential competitors before they can scale up) and will include fines of 10% of global revenues for infringers. The Digital Services Act, for its part, will oblige social media platforms to take more responsibility for illegal content on their platforms – scams, terrorist content, images of abuse, etc – for which they could face fines of up to 6% of global revenue if they fail to police content adequately. So the US and UK approach focuses on corporate behaviour; the EU approach focuses on defining what is allowed legally.

    All of this action has been a long time coming and, while it’s difficult to say exactly how it will play out, the bottom line is that the tech industry is – finally – going to become a regulated one. Its law-free bonanza is going to come to an end.

    The big question, though, is: when? Antitrust actions proceed at a glacial pace because of the complexity of the issues and the bottomless legal budgets of the companies involved. The judge in one of the big American antitrust cases against Google has said that he expects the case to get to court only in late 2023, and then it could run for several years (as the Microsoft case did in the 1990s).

    The problem with that, as the veteran anti-monopoly campaigner Matt Stoller has pointed out, is that the longer monopolistic behaviour goes on, the more damage (eg, to advertisers whose revenue is being stolen and other businesses whose property is being appropriated) is being done. Google had $170bn in revenue last year and is growing on average at 10-20% a year. On a conservative estimate of 10% growth, the company will add another $100bn to its revenue by 2025, when the case will still be in court. Facebook, says Stoller, “is at $80bn of revenue this year, but it is growing faster, so the net increase of revenue is a roughly similar amount. In other words, if the claims of the government are credible, then the lengthy case, while perhaps necessary, is also enabling these monopolists to steal an additional $100bn apiece.”

    What could speed up bringing these monopolists to account? A key factor is the vigour with which the US Department of Justice prosecutes its case(s). In the run-up to the 2020 election, the Democrats in Congress displayed an encouraging enthusiasm for tackling tech monopolies, but Joe Biden’s choices for top staff in his administration include a depressing proportion of former tech company stalwarts. And his vice-president-elect, Kamala Harris, consistently turned a blind eye to the anti-competitive acquisitions of the Silicon Valley giants throughout her time as California’s attorney general. So if people are hoping for antitrust zeal from the new US government, they may be in for disappointment.

    Interestingly, Stoller suggests that another approach (inspired by the way trust-busters in the US acted in the 1930s) could provide useful leverage on corporate behaviour from now on. Monopolisation isn’t just illegal, he points out, “it is in fact a crime, an appropriation of the rights and property of others by a dominant actor. The lengthy trial is essentially akin to saying that bank robbers get to keep robbing banks until they are convicted and can probably keep the additional loot.”

    Since a basic principle of the rule of law is that crime shouldn’t pay, adding the possibility of criminal charges to the antitrust actions might, like the prospect of being hanged in the morning (pace Dr Johnson), concentrate minds in Facebook, Google, Amazon and Apple. As an eternal optimist, I cannot think of a nicer prospect for 2021 than the sight of Mark Zuckerberg and Sundar Pichai in the dock – with Nick Clegg in attendance, taking notes. Happy new year!

    What I’ve been reading

    Who knew?
    What We Want Doesn’t Always Make Us Happy is a great Bloomberg column by Noah Smith.

    Far out
    Intriguing piece on how investors are using real-time satellite images to predict retailers’ sales (Stock Picks From Space), by Frank Partnoy on the Atlantic website.

    An American dream
    Lovely meditation on Nora Ephron’s New York, by Carrie Courogen on the Bright Wall/Dark Room website.

  • Facebook faces antitrust allegations over deals for Instagram and WhatsApp

    Facebook is expecting significant new legal challenges, as the US Federal Trade Commission and a coalition of attorneys general from as many as 40 states prepare antitrust suits.
    Although the specific charges in both cases remain unclear, the antitrust allegations are expected to center on the tech giant’s acquisition of two big apps: a $1bn deal to buy the photo-sharing app Instagram in 2012, and the $19bn purchase of the global messaging service WhatsApp in 2014. Together, the buys brought four of the world’s top social media apps under Facebook’s control. The purchases could constitute antitrust violations if Facebook acquired the companies because it believed they were viable competitors.
    At the time of its acquisition, Instagram had 30 million users, and, even though it was growing rapidly, it wasn’t yet making money. WhatsApp boasted more than 450 million monthly active users when it was acquired. “WhatsApp is on a path to connect 1 billion people,” Zuckerberg said in a statement at the time.
    The FTC cleared Facebook for the acquisitions when they occurred, and the company is hoping to leverage those approvals in mounting a defense. Facebook executives have also argued their company has helped the apps grow.
    But Facebook has come under greater scrutiny since the deals were done, and the FTC launched a new investigation into the potential antitrust violations in 2019.
    The FTC probe will build on findings from a separate inquiry conducted by the US House Judiciary subcommittee, which released millions of documents that appeared to show that Facebook executives, including CEO Mark Zuckerberg, were concerned the apps could become competition, before aggressively pursuing them.
    In one 2012 email, made public through the House investigation, Zuckerberg highlighted how Instagram had an edge on mobile, an area where Facebook was falling behind. In another, the CEO said Instagram could hurt Facebook even if it didn’t become huge. “The businesses are nascent but the networks are established, the brands are already meaningful and if they grow to a large scale they could be disruptive to us,” Zuckerberg wrote. Instagram’s co-founder also fretted that his company might be targeted for destruction by Zuckerberg if he refused the deal.
    The FTC is expected to vote on a possible suit this week. Three of the commission’s five members are believed to be in favor of the move, including chair Joseph Simons, who is expected to leave the agency before the new Biden administration is sworn in, Politico reported.
    Commissioners also have to decide where to file the suit: in federal court, which would leave the outcome to a judge, or through the FTC’s own administrative process, in which the commission itself could ultimately decide.
    The suit expected from the bipartisan coalition of states is headed by New York attorney general Letitia James. While details of their complaint are also scant, several states’ top law enforcement offices launched probes into Facebook’s acquisitions last year, adding to the pressure put on the company by federal regulators.
    Facebook did not respond to a request for comment.
    Facebook’s possible legal challenges come as a growing number of US lawmakers are arguing that companies including Amazon, Google, Facebook and Apple have amassed too much power and should be reined in.
    These companies “wield their dominance in ways that erode entrepreneurship, degrade Americans’ privacy online, and undermine the vibrancy of the free and diverse press”, the House judiciary committee concluded in its nearly 500-page report.
    “The result is less innovation, fewer choices for consumers, and a weakened democracy.”
    President-elect Joe Biden, too, has been critical of the tech companies. “Many technology giants and their executives have not only abused their power, but misled the American people, damaged our democracy and evaded any form of responsibility,” said Biden spokesperson Matt Hill to the New York Times. “That ends with a President Biden.”
    In May, Facebook took over Giphy, a hugely popular moving-image app, with plans to integrate it with Instagram. Late last month, the company also announced plans to acquire Kustomer, a customer-service software startup.
    “This deal is about providing more choices and better products for consumers,” a company spokesman said in a statement to the New York Times. “The key to Facebook’s success has always been innovation, with M&A being just a part of our overall business strategy, and we will continue to demonstrate to regulators that competition in the technology sector is vibrant.”

  • If you think Biden's administration would rein in big tech, think again | John Naughton

    Before the US presidential election I wondered aloud if Mark Zuckerberg had concluded that the re-election of Trump might be better for Facebook than a Biden victory. There were several reasons for thinking this. One was the strange way Zuckerberg appeared to be sucking up to Trump: at least one private dinner in the White House; the way he jumped on to Fox News when Twitter first placed a warning on a Trump tweet to say that Facebook would not be doing stuff like that; and the majority report of the House subcommittee on tech monopolies, in which it was clear that the Democrats had it in for the companies.

    But the most significant piece of evidence for the belief that a Biden administration would finally tackle the tech giants, and Facebook in particular, came in the long interview Biden gave last January to the New York Times, in which he was highly critical of the company.

    “I’ve never been a big Zuckerberg fan,” Biden said. “I think he’s a real problem … I’ve been in the view that not only should we be worrying about the concentration of power, we should be worried about the lack of privacy and them being exempt, which you’re not exempt. [The New York Times] can’t write something you know to be false and be exempt from being sued. But he can. The idea that it’s a tech company is that Section 230 should be revoked, immediately should be revoked, number one. For Zuckerberg and other platforms.”

    As readers of this column know only too well, section 230 of the 1996 US Telecommunications Act is the clause that exempts tech platforms from legal liability for anything that users post on their platforms. It’s the nearest thing social media has to a kill switch. Pull it and their business models evaporate. Trump had been threatening to pull it before the election, but he lacked the attention span to be able to do anything about it. Biden, on the other hand, had already talked about it in January and would have people around him who knew what they were doing. So maybe we were going to get some real progress in getting the tech giants under control.

    And then he gets elected and what do we find? Biden’s transition team is packed with tech industry insiders. Tom Sullivan, from Amazon, is earmarked for the Department of State. Mark Schwartz, also from Amazon, is heading for the Office of Management and Budget, as are Divya Kumaraiah from Airbnb and Brandon Belford from Lyft, the ride-hailing company. The US Treasury gets Nicole Isaac from LinkedIn, Microsoft’s department of spam, and Will Fields, who was Sidewalk Labs’ senior development associate. (Sidewalk Labs was the organiser of Google’s attempt – eventually cancelled – to turn Toronto’s waterfront into a data-geyser for surveillance capitalism.) The Environmental Protection Agency, a body that Trump looted and sidelined, gets Ann Dunkin, who is Dell’s chief technology officer. And so on.

    Well, I thought, perusing this sordid list, at least there’s nobody from Facebook on it. How innocent can you be? Politico reveals that the joint chair of Biden’s transition team, Jeff Zients, is a former Facebook board member. Another former board member is an adviser. And two others, one who was a Facebook director and another who was a company lobbyist, have, according to Politico, “taken leadership roles”. And then, to cap it all, it turns out that Biden himself has a friendly relationship with a guy called Nick Clegg, who was once a serious politician and now doubles as Mark Zuckerberg’s bagman and representative on Earth.

    Truly, you couldn’t make this up. And just to add a touch of satire to it, the woman who is now a heartbeat away from the presidency, Kamala Harris, has a career-long record of cosying up to Silicon Valley. She participated, for example, in the marketing campaign for Lean In, Sheryl Sandberg’s anthem of capitalist feminism, even though at the time Harris was California’s law enforcement official most responsible for overseeing Facebook. As the state’s attorney general, she took a semi-comatose view of the way the big tech companies were allowed to gobble up potential rivals and bulldoze their way into new industries. Facebook’s controversial acquisitions of WhatsApp and Instagram, perhaps the most obvious anti-competitive mergers in the short history of the tech industry, happened on her watch and triggered no regulatory reflex. If Silicon Valley could be said to have a darling, then Ms Harris is it. And all those campaign donations from tech companies and moguls may turn out to have been a shrewd investment after all.

    Given these sobering circumstances, how should we calculate the odds of a Biden administration taking on the power of the tech giants? The answer: slightly better than those of a snowball staying cool in hell. But only slightly.

    What I’ve been reading

    Is 2020 just a taster?
    Graeme Wood has written a riveting essay, titled The Next Decade Could Be Even Worse, on the work of Peter Turchin, a quantitative historian who believes he has discovered iron laws that predict the rise and fall of societies.

    Birth of an iNation
    What if we viewed tech giants as countries? A thoughtful essay in Tortoise Media considers Apple as a one-party state as secretive as China. But more liberal. Phew!

    Is less Moore?
    I enjoyed a lovely post by Venkatesh Rao on the Ribbonfarm blog, about the mindset induced by living in a world governed by Moore’s Law.

  • Facebook and Twitter CEOs face Senate hearing over handling of 2020 US election – video

    The chief executive officers of Twitter and Facebook appear before a US Senate hearing to testify about allegations of anti-conservative bias and their handling of the 2020 election. Jack Dorsey and Mark Zuckerberg face questioning for the second time in as many months, with Republican lawmakers alleging – without evidence – censorship of conservative views.