More stories

  • Facebook’s very bad year. No, really, it might be the worst yet

    From repeated accusations of fostering misinformation to multiple whistleblowers, the company weathered some battles in 2021.

    It’s a now-perennial headline: Facebook has had a very bad year. Years of mounting pressure from Congress and the public culminated in repeated PR crises, blockbuster whistleblower revelations and pending regulation over the past 12 months. And while the company’s bottom line has not yet wavered, 2022 is not looking to be any better than 2021 – with more potential privacy and antitrust actions on the horizon. Here are some of the major battles Facebook has weathered in the past year.

    Capitol riots launch a deluge of scandals

    Facebook’s year started with allegations that a deadly insurrection at the US Capitol was largely planned on its platform. Regulatory uproar over the incident reverberated for months, leading lawmakers to call CEO Mark Zuckerberg before Congress to answer for his platform’s role in the attack.

    In the aftermath, Zuckerberg defended his decision not to take action against Donald Trump, though the former president stoked anger and separatist flames on his personal and campaign accounts. Facebook’s inaction led to a rare public employee walkout, and Zuckerberg later reversed the hands-off approach to Trump. Barring Trump from Facebook platforms sparked backlash once again – this time from Republican lawmakers alleging censorship.

    What ensued was a months-long back-and-forth between Facebook and its independent oversight board, with each entity punting the decision of whether to keep Trump off the platform. Ultimately, Facebook decided to extend Trump’s suspension to two years. Critics said this underscored the ineffectiveness of the body. “What is the point of the oversight board?” asked the Real Oversight Board, an activist group monitoring Facebook, after the non-verdict.

    Whistleblowers take on Facebook

    The scandal with perhaps the biggest impact on the company this year came in the form of the employee-turned-whistleblower Frances Haugen, who leaked internal documents exposing some of the inner workings of Facebook and just how much the company knew about the harmful effects its platform was having on users and society.

    Haugen’s revelations, first reported by the Wall Street Journal, showed Facebook was aware of many of its grave public health impacts and had the means to mitigate them – but chose not to do so.

    For instance, documents show that since at least 2019 Facebook has studied the negative impact Instagram has on teenage girls, yet did little to mitigate the harms and publicly denied that was the case. Those findings in particular led Congress to summon company executives to multiple hearings on the platform and its teen users.

    Facebook has since paused its plans to launch an Instagram app for kids and introduced new safety measures encouraging users to take breaks if they use the app for long periods of time. In a Senate hearing on 8 December, the Instagram executive Adam Mosseri called on Congress to launch an independent body tasked with regulating social media more comprehensively, sidestepping calls for Instagram to regulate itself.

    Haugen also alleged that Facebook’s tweaks to its algorithm, which turned off some safeguards intended to fight misinformation, may have contributed to the Capitol attack. And she provided information underscoring how little of its resources Facebook dedicates to moderating non-English-language content.

    In response to the Haugen documents, Congress has promised legislation and drafted a handful of new bills to address Facebook’s power. One controversial measure would target Section 230, the portion of the Communications Decency Act that exempts companies from liability for content posted on their platforms.

    Haugen was not the only whistleblower to take on Facebook in 2021. In April, the former Facebook data scientist Sophie Zhang revealed to the Guardian that Facebook repeatedly allowed world leaders and politicians to use its platform to deceive the public or harass opponents. Zhang has since been called to testify on these findings before parliaments in the UK and India.

    Lawmakers around the world are eager to hear from the Facebook whistleblowers. Haugen also testified in the UK regarding the documents she leaked, telling MPs Facebook “prioritizes profit over safety”. Such testimony is likely to influence impending legislation, including the Online Safety Bill: a proposed UK act that would task the communications authority Ofcom with regulating content online and require tech firms to protect users from harmful posts or face substantial fines.

    Zuckerberg and Cook feud over Apple update

    Though Apple has had its fair share of regulatory battles, Facebook did not find an ally in its fellow tech firm while facing down the onslaught of consumer and regulatory pressure that 2021 brought.

    The iPhone maker launched a new notification system to alert users when and how Facebook was tracking their browsing habits, ostensibly as a means to give them more control over their privacy. Facebook objected to the new policy, arguing Apple was acting to “self-preference their own services and targeted advertising products”. It said the feature would negatively affect small businesses relying on Facebook to advertise. Apple pressed on anyway, rolling the feature out in April and promising additional changes in 2022. Preliminary reports suggest Apple is, indeed, profiting from the change, while Google and Facebook have seen advertising profits fall.

    Global outage takes out all Facebook products

    In early October, just weeks after Haugen’s revelations, things took a sudden turn for the worse when the company faced a global service outage. Perhaps Facebook’s largest and most sustained tech failure in recent history, the glitch left billions of users unable to access Facebook, Instagram or WhatsApp for six hours on 4 and 5 October. Facebook’s share price dropped 4.9% that day, cutting Zuckerberg’s personal wealth by $6bn, according to Bloomberg.

    Other threats to Facebook

    As Facebook faces continuing calls for accountability, its time as the wunderkind of Silicon Valley has come to a close and it has become a subject of bipartisan contempt. Republicans have repeatedly accused Facebook of being biased against conservatism, while liberals have targeted the platform for its monopolistic tendencies and failure to police misinformation.

    In July, the Biden administration began to take a harder line with the company over vaccine misinformation – which Joe Biden said was “killing people” and the US surgeon general said was “spreading like wildfire” on the platform. Meanwhile, the appointment of the antitrust thought leader Lina Khan to head the FTC spelled trouble for Facebook. She has been publicly critical of the company and other tech giants in the past, and in August refiled a previously dismissed FTC case accusing Facebook of anti-competitive practices.

    After a year of struggles, Facebook has thrown something of a Hail Mary: changing its name. The company announced it would now be called Meta, a reference to its new “metaverse” project, which will create a virtual environment where users can spend time. The name change was met with derision and skepticism from critics. But it remains to be seen whether Facebook, by any other name, will beat the reputation that precedes it.

  • Facebook boss ‘not willing to protect public from harm’

    The Observer

    Frances Haugen says the chief executive has not shown any desire to shield users from the consequences of harmful content.

    Dan Milmo, Sat 23 Oct 2021

    The Facebook whistleblower whose revelations have tipped the social media giant into crisis has launched a stinging new criticism of Mark Zuckerberg, saying he has not shown any readiness to protect the public from the harm his company is causing.

    Frances Haugen told the Observer that Facebook’s founder and chief executive had not displayed a desire to run the company in a way that shields the public from the consequences of harmful content.

    Her intervention came as pressure mounted on the near-$1tn (£730bn) business following a fresh wave of revelations based on documents leaked by Haugen, a former Facebook employee. The New York Times reported that workers had repeatedly warned that Facebook was being flooded with false claims about the 2020 presidential election result being fraudulent, and believed the company should have done more to tackle them.

    Haugen, who appears before MPs and peers in Westminster on Monday, said Zuckerberg, who controls the business via a majority of its voting shares, has not shown any willingness to protect the public. “Right now, Mark is unaccountable. He has all the control. He has no oversight, and he has not demonstrated that he is willing to govern the company at the level that is necessary for public safety.”

    She added that giving all shareholders an equal say in the running of the company would result in changes at the top. “I believe in shareholder rights and the shareholders, or shareholders minus Mark, have been asking for years for one share, one vote. And the reason for that is, I am pretty sure the shareholders would choose other leadership if they had an option.”

    Haugen, who quit as a Facebook product manager in May, said she had leaked tens of thousands of documents to the Wall Street Journal and to Congress because she had realised that the company would not change otherwise. She said: “There are great companies that have done major cultural changes. Apple did a major cultural change; Microsoft did a major cultural change. Facebook can change too. They just have to get the will.”

    This weekend, a consortium of US news organisations released a fresh wave of stories based on the Haugen documents. The New York Times reported that internal research showed how, at one point after the US presidential election last year, 10% of all US views of political material on Facebook – a very high proportion for Facebook – were of posts falsely alleging that Joe Biden’s victory was fraudulent. One internal review criticised attempts to tackle Stop the Steal groups spreading claims on the platform that the election was rigged. “Enforcement was piecemeal,” said the research.

    The revelations have reignited concerns about Facebook’s role in the 6 January riots, in which a mob seeking to overturn the election result stormed the Capitol in Washington. The New York Times added that some of the reporting for the story was based on documents not released by Haugen.

    A Facebook spokesperson said: “At the heart of these stories is a premise which is false. Yes, we’re a business and we make profit, but the idea that we do so at the expense of people’s safety or wellbeing misunderstands where our commercial interests lie. The truth is we’ve invested $13bn and have over 40,000 people to do one job: keep people safe on Facebook.”

    Facebook’s vice-president of integrity, Guy Rosen, said the company had put in place multiple measures to protect the public during and after the election, and that “responsibility for the [6 January] insurrection lies with those who broke the law during the attack and those who incited them”.

    It was also reported on Friday that a new Facebook whistleblower had come forward and, like Haugen, had filed a complaint with the Securities and Exchange Commission, the US financial regulator, alleging that the company declined to enforce safety rules for fear of angering Donald Trump or impacting Facebook’s growth.

    Haugen will testify in person on Monday to the joint committee scrutinising the draft online safety bill, which would impose a duty of care on social media companies to protect users from harmful content, and allow the communications regulator, Ofcom, to fine those who breach it. The maximum fine is 10% of global turnover, so in the case of Facebook this could run into billions of pounds. Facebook, whose services also include Instagram and WhatsApp, has 2.8 billion daily users and generated an income last year of $86bn.

    As well as issuing detailed rebuttals of Haugen’s revelations, Facebook is reportedly planning a major change that would attempt to put some distance between the company and its main platform. Zuckerberg could announce a rebranding of Facebook’s corporate identity on Thursday, according to a report that said the company is keen to emphasise its future as a player in the “metaverse”, a digital world in which people interact and lead their social and professional lives virtually.

    Haugen said Facebook must be compelled by regulators to be more transparent with the information at its disposal internally, as detailed in her document leaks. She said one key reform would be a formal structure whereby regulators could demand reports from Facebook on any problem they identify.

    “Let’s imagine there was a brand of car that was having five times as many car accidents as other cars. We wouldn’t accept that car company saying, ‘this is really hard, we are trying our best, we are sorry, we are trying to do better in the future’. We would never accept that as an answer and we are hearing that from Facebook all the time. There needs to be an avenue where we can escalate a concern and they actually have to give us a response.”

  • Facebook harms children and is damaging democracy, claims whistleblower

    Frances Haugen says in US Congress testimony that Facebook puts ‘astronomical profits before people’.

    Dan Milmo and Kari Paul, Tue 5 Oct 2021

    Facebook puts “astronomical profits before people”, harms children and is destabilising democracies, a whistleblower has claimed in testimony to the US Congress.

    Frances Haugen said Facebook knew it steered young users towards damaging content and that its Instagram app was “like cigarettes” for under-18s. In a wide-ranging testimony, the former Facebook employee said the company did not have enough staff to keep the platform safe and was “literally fanning” ethnic violence in developing countries.

    She also told US senators:
    The “buck stops” with the founder and chief executive, Mark Zuckerberg.
    Facebook knows its systems lead teenagers to anorexia-related content.
    The company had to “break the glass” and turn back on safety settings after the 6 January Washington riots.
    Facebook intentionally targets teenagers and children under 13.
    Monday’s outage that brought down Facebook, Instagram and WhatsApp meant that for more than five hours Facebook could not “destabilise democracies”.
    Haugen appeared in Washington on Tuesday after coming forward as the source of a series of revelations in the Wall Street Journal last month based on internal Facebook documents. They revealed the company knew Instagram was damaging teenagers’ mental health and that changes to Facebook’s News Feed feature – a central plank of users’ interaction with the service – had made the platform more polarising and divisive.

    Her evidence to senators included the claim that Facebook knew Instagram users were being led to anorexia-related content. She said an algorithm “led children from very innocuous topics like healthy recipes … all the way to anorexia-promoting content over a very short period of time”.

    In her opening testimony, Haugen, 37, said: “I’m here today because I believe Facebook’s products harm children, stoke division and weaken our democracy. The company’s leadership knows how to make Facebook and Instagram safer, but won’t make the necessary changes because they have put their astronomical profits before people.” She added that Facebook was “buying its profits with our safety”. In 2020, Facebook reported a net income – a US measure of profit – of more than $29bn (£21bn).

    Referring to Monday’s near six-hour outage in which Facebook’s platforms including Instagram and WhatsApp were disabled for billions of users, Haugen’s testimony added: “For more than five hours Facebook wasn’t used to deepen divides, destabilise democracies and make young girls and women feel bad about their bodies.” Facebook has 3.5 billion monthly active users across its platforms, including Instagram and WhatsApp.

    Warning that Facebook makes choices that “go against the common good”, Haugen said the company should be treated like the tobacco industry, which was subject to government action once it was discovered to be hiding the harms its products caused; or like the car companies that were forced to adopt seatbelts; or the opioid firms that have been sued by government agencies.

    Urging lawmakers to force more transparency on Facebook, she said there should be more scrutiny of its algorithms, which shape the content delivered to users. “The core of the issue is that no one can understand Facebook’s destructive choices better than Facebook, because only Facebook gets to look under the hood,” she said. With greater transparency, she added, “we can build sensible rules and standards to address consumer harms, illegal content, data protection, anticompetitive practices, algorithmic systems and more”.

    The hearing focused on the impact of Facebook’s platforms on children, with Haugen likening the appeal of Instagram to tobacco. “It’s just like cigarettes … teenagers don’t have good self-regulation.” Haugen added that women would be walking around with brittle bones in 60 years’ time because of the anorexia-related content they found on Facebook platforms.

    Haugen told lawmakers that Facebook intentionally targets teens and “definitely” targets children as young as eight for the Messenger Kids app. The former Facebook product manager left the company in May after copying tens of thousands of internal documents.

    A Facebook spokesperson, Andy Stone, said in a tweet during the hearing: “Just pointing out the fact that Frances Haugen did not work on child safety or Instagram or research these issues and has no direct knowledge of the topic from her work at Facebook.”

    Haugen said that, according to internal documents, Zuckerberg had been given “soft options” to make the Facebook platform less “twitchy” and viral in countries prone to violence but declined to take them because doing so might affect “meaningful social interactions”, or MSI. She added: “We have a few choice documents that contain notes from briefings with Mark Zuckerberg where he chose metrics defined by Facebook like ‘meaningful social interactions’ over changes that would have significantly decreased misinformation, hate speech and other inciting content.”

    Haugen said Zuckerberg had built a company that was “very metrics driven”, because the more time people spent on Facebook platforms, the more appealing the business was to advertisers. Asked about Zuckerberg’s ultimate responsibility for decisions made at Facebook, she said: “The buck stops with him.”

    Haugen also warned that Facebook was “literally fanning ethnic violence” in places such as Ethiopia because it was not policing its service adequately outside the US.

    Referring to the aftermath of the 6 January storming of the Capitol, as protesters sought to overturn the US presidential election result, Haugen said she was disturbed that Facebook had to “break the glass” and reinstate safety settings that it had put in place for the November poll. Haugen, who worked for the Facebook team that monitored election interference globally, said those precautions had been dropped after Joe Biden’s victory in order to spur growth on the platform.

    Among the reforms recommended by Haugen were ensuring that Facebook shares internal information and research with “appropriate” oversight bodies such as Congress, and removing the influence of algorithms on Facebook’s News Feed by allowing it to be ranked chronologically.

    Senator Ed Markey said Congress would take action. “Here’s my message for Mark Zuckerberg: your time of invading our privacy, promoting toxic content and preying on children and teens is over,” Markey said. “Congress will be taking action. We will not allow your company to harm our children and our families and our democracy any longer.”

    Haugen’s lawyers have also filed at least eight complaints with the US financial watchdog accusing the social media company of serially misleading investors about its approach to safety and the size of its audience.

    Facebook has issued a series of statements downplaying Haugen’s document leaks, saying: its Instagram research showed that many teenagers found the app helpful; it was investing heavily in security at the expense of its bottom line; polarisation had been growing in the US for decades before Facebook appeared; and the company had “made fighting misinformation and providing authoritative information a priority”.

    Responding to accusations that Facebook had misled the public and regulators, the company said: “We stand by our public statements and are ready to answer any questions regulators may have about our work.”

    A Facebook spokesperson said: “Today, a Senate commerce subcommittee held a hearing with a former product manager at Facebook who worked for the company for less than two years, had no direct reports, never attended a decision-point meeting with C-level executives and testified more than six times to not working on the subject matter in question. We don’t agree with her characterization of the many issues she testified about.

    “Despite all this, we agree on one thing: it’s time to begin to create standard rules for the internet. It’s been 25 years since the rules for the internet have been updated, and instead of expecting the industry to make societal decisions that belong to legislators, it is time for Congress to act.”

  • The inside story of how we reached the Facebook-Trump verdict | Alan Rusbridger

    As is so often the case, Donald Trump gets to the heart of the problem. On 6 January, he was the president of the United States: probably the most powerful man in the world. He should be free to speak his mind, and voters should be free to listen. But he was also a habitual liar who, by the end of his term, had edged into repudiating the very democracy that had elevated him.

    And then came his inflammatory words on that day, uttered even as rioters were breaking their way into the heart of US democracy. His words had a veneer of restraint – “We have to have peace, so go home.” But his statements were laced with lies, along with praise for the mob who terrorised lawmakers as they sought to confirm Biden as Trump’s successor – “We love you, you’re very special … great patriots … remember this day for ever.”

    At 5.41pm and 6.15pm that day, Facebook removed two posts from Trump. The following day the company banned Trump from its platform indefinitely. Around the same time, Twitter also moved to ban the president – permanently.

    So there was the problem that Donald Trump embodied – in a country whose commitment to free speech is baked into its core. The president might be a bitterly polarising figure, but surely he has a right to be heard – and voters a right to make up their own minds?

    Facebook’s decision to the contrary would spark passionate debate within the United States. But it had a wider resonance. For how much longer would giant social media platforms act as an amplification system for any number of despots around the world? Would they, too, be banned?

    The classic defence of free expression is that good speech defeats bad speech. Political speech – in some views – should be the most protected speech. It is vital we know who our leaders are. We have a right – surely? – to know if they are crooks, liars or demagogues.

    On 7 January Facebook decided: no longer. And now the Facebook oversight board, of which I am a member, has published its own verdict on the decision: Facebook was both right and wrong. Right to remove his 6 January words and right, the following day, to ban the president from the platform. But wrong to ban him “indefinitely”.

    The key word is “indefinitely” – if only because Facebook’s own policies do not appear to permit it. The oversight board (OSB) judgment doesn’t mince its words: “In applying a vague, standardless penalty and then referring this case to the board to resolve, Facebook seeks to avoid its responsibilities. The board declines Facebook’s request and insists that Facebook apply and justify a defined penalty.” Ball squarely back in Facebook’s court.

    What Facebook has to do now – in our judgment, which the company is bound to implement – is re-examine the arbitrary penalty it imposed on 7 January. It should take account of the gravity of the violation and the prospect of future harm.

    The case is the most prominent the OSB has decided since it was established as an independent entity and will inevitably focus more attention on its work. Why is such a body thought necessary?

    Let’s assume we might agree that it’s a bad thing for one person, Mark Zuckerberg, to be in charge of the rules of speech for 2 billion or more people. He is clearly a wonderfully talented engineer – but nothing in his background suggests he is equipped to think deeply about the complexities involved in free expression.

    Maybe most people who have studied the behaviour of governments towards publishers and newspapers over 300 years might also agree that politicians are not the best people to be trusted with individual decisions about who gets to say what.

    Into the void between those two polarities has stepped the OSB. At the moment we’re 19 individuals with backgrounds in journalism, law, academia and human rights; by the end of 2021 we hope to be nearer 40.

    Are we completely independent from Facebook? It certainly feels that way. It’s true that Facebook was involved in selecting the first 20 members, but once the board reaches its full complement, we decide who our future colleagues will be. Since a few early meetings to understand Facebook’s processes around moderation and similar matters, we have had nothing to do with the company.

    We have our own board of distinguished trustees – again, free of any influence from Facebook. From what I’ve seen of my colleagues so far, they’re an odd bunch to have picked if you were in search of a quiet life.

    The Trump decision was reached through the processes we’ve devised ourselves. A panel of five – with a good spread of regional backgrounds – did the initial heavy lifting, including sifting through more than 9,000 responses from the public. The wider board fed in its own views. We looked at Facebook’s own values – what the company calls voice, safety and dignity – as well as its content policies and community standards. But we also apply an international human rights lens in trying to balance freedom of expression with possible harms.

    In the Trump case we looked at the UN Guiding Principles on Business and Human Rights (UNGPs), which establish a voluntary framework for the human rights responsibilities of private businesses. We also considered the right to freedom of expression set out in articles 19 and 20 of the International Covenant on Civil and Political Rights (ICCPR) – as well as the qualifying articles to do with the rights to life, security of person, non-discrimination, participation in public affairs and so on.

    We also considered the 2013 Rabat Plan of Action, which attempts to identify and control hate speech online. We took into account a submission sent on behalf of Trump himself and sent Facebook 46 questions. Facebook answered 37 fully, and two partially.

    And then we debated and argued – virtually, verbally and in writing. A number of drafts were circulated, with most board members pitching in with tweaks, challenges, corrections and disagreements. Gradually, a consensus developed – resulting in a closely argued 38-page decision which openly reflects the majority and minority opinions.

    In addition to our ruling about the original and “indefinite” bans, we’ve sent Facebook a number of policy advisory statements. One of these concentrates on the question of how social media platforms should deal with “influential users” (a more useful conceit than “political leaders”).

    Speed is clearly of the essence where potentially harmful speech is involved. While it’s important to protect the rights of people to hear political speech, “if the head of state or high government official has repeatedly posted messages that pose a risk of harm under international human rights norms, Facebook should suspend the account for a determinate period sufficient to protect against imminent harm”.

    As in previous judgments, we are critical of a lack of clarity in some of Facebook’s own rules, together with insufficient transparency about how they’re enforced. We would like to see Facebook carry out a comprehensive review of its potential contribution to the narrative around electoral fraud and to the exacerbated tensions that culminated in the violence of 6 January.

    And then this: “This should be an open reflection on the design and policy choices that Facebook has made that may enable its platform to be abused.” Which many people will read as a not-so-coded reference to what is shorthanded as The Algorithm.

    Social media is still in its infancy. Among the many thorny issues we periodically discuss as a board is: what is this thing we’re regulating? The existing language – “platform”, “publisher”, “public square” – doesn’t adequately describe these new entities. Most of the suggested forms of more interventionist regulation stub their toes on the sheer novelty of this infant space for the unprecedented mass exchange of views.

    The OSB is also taking its first steps. The Trump judgment cannot possibly satisfy everyone. But this 38-page text is, I hope, a serious contribution to thinking about how to handle free speech in an age of information chaos.

  • The Spread of Global Hate

    One insidious way to torture the detainees at Guantanamo Bay was to blast music at them at all hours. The mixtape, which included everything from Metallica to the Meow Mix jingle, was intended to disorient the captives and impress upon them the futility of resistance. It worked: This soundtrack from hell did indeed break several inmates.

    For four years, Americans had to deal with a similar sonic blast, namely the “music” of President Donald Trump. His voice was everywhere: on TV and radio, screaming from the headlines of newspapers, pumped out nonstop on social media. MAGA men and women danced to the repetitive beat of his lies and distortions. Everyone else experienced the nonstop assault of Trump’s instantly recognizable accent and intonations as nails on a blackboard. After the 2016 presidential election, psychologists observed a significant uptick in the fears Americans had about the future. One clinician even dubbed the phenomenon “Trump anxiety disorder.”

    The volume of Trump’s assault on the senses has decreased considerably since January. Obviously, he no longer has the bully pulpit of the Oval Office to broadcast his views. The mainstream media no longer covers his every utterance. Most importantly, the major social media platforms have banned him. In the wake of the January 6 insurrection on Capitol Hill, Twitter suspended Trump permanently under its glorification of violence policy. Facebook made the same decision, though its oversight board is now revisiting the former president’s deplatforming.

    It’s not only Trump. The Proud Boys, QAnon, the militia movements: The social media footprint of the far right has decreased a great deal in 2021, with a parallel decline in the amount of misinformation available on the Web.

    And it’s not just a problem of misinformation and hate speech. According to a new report by the Center for Strategic and International Studies (CSIS) on domestic terrorism, right-wing extremists have been involved in 267 plots and 91 fatalities since 2015, with the number of incidents rising in 2020 to a height unseen in a quarter of a century. A large number of the perpetrators are loners who have formed their beliefs from social media. As one counterterrorism official put it, “Social media has afforded absolutely everything that’s bad out there in the world the ability to come inside your home.”

    So, why did the tech giants provide Trump, his extremist followers and their global counterparts unlimited access to a growing audience over those four long years?

    Facebook Helps Trump

    In a new report from the Global Project Against Hate and Extremism (GPAHE), Heidi Beirich and Wendy Via write: “For years, Trump violated the community standards of several platforms with relative impunity. Tech leaders had made the affirmative decision to allow exceptions for the politically powerful, usually with the excuse of ‘newsworthiness’ or under the guise of ‘political commentary’ that the public supposedly needed to see.”

    Even before Trump became president, Facebook was cutting him a break. In 2015, he was using the social media platform to promote a Muslim travel ban, which generated considerable controversy, particularly within Facebook itself. The Washington Post reports:

    “Outrage over the video led to a companywide town hall, in which employees decried the video as hate speech, in violation of the company’s policies. And in meetings about the issue, senior leaders and policy experts overwhelmingly said they felt that the video was hate speech, according to three former employees, who spoke on the condition of anonymity for fear of retribution. [Facebook CEO Mark] Zuckerberg expressed in meetings that he was personally disgusted by it and wanted it removed, the people said.”

    But the company’s most prominent Republican, Vice-President of Global Policy Joel Kaplan, persuaded Zuckerberg to change his position. In spring 2016, when Zuckerberg wanted to condemn Trump’s plan to build a wall on the border with Mexico, he was again persuaded to step back for fear of seeming too partisan.

    Facebook went on to play a critical role in getting Trump elected. It wasn’t simply the Russian campaign to create fake accounts, fake messaging and even fake events using Facebook, or the theft of Facebook user data by Cambridge Analytica. More important was the role played by Facebook staff in helping Trump’s digital outreach team maximize its use of social media. The Trump campaign spent $70 million on Facebook ads and raised much of its $250 million in online fundraising through Facebook as well.

    Trump established a new paradigm through brute force and money. As he turned himself into clickbait, the social media giants applied the same “exceptionalism” to other rancid politicians. More ominously, the protection accorded politicians extended to extremists. According to an account of a discussion at a Twitter staff meeting, one employee explained that “on a technical level, content from Republican politicians could get swept up by algorithms aggressively removing white supremacist material. Banning politicians wouldn’t be accepted by society as a trade-off for flagging all of the white supremacist propaganda.”

    Of course, in the wake of the January 6 insurrection, social media organizations decided that society could indeed accept the banning of politicians, at least when it came to some politicians in the United States.

    The Real Fake News

    In the Philippines, an extraordinary 97% of internet users had accounts with Facebook as of 2019, up from 40% in 2018 (by comparison, about 67% of Americans have Facebook accounts). Increasingly, Filipinos get their news from social media. That’s bad news for the mainstream media in the Philippines. And that’s particularly bad news for journalists like Maria Ressa, who runs an online news site called Rappler.

    At a press conference for the GPAHE report, Ressa described how the government of Rodrigo Duterte, with an assist from Facebook, has made her life a living hell. Like Trump, President Duterte came to power on a populist platform spread through Facebook. Because of her critical reporting on government affairs, Ressa felt the ire of the Duterte fan club, which generated half a million hate posts that, according to one study, consisted of 60% attacks on her credibility and 40% sexist and misogynist slurs. This onslaught created a bandwagon effect that equated journalists like her with criminals.

    This noxious equation on social media turned into a real case when the Philippine authorities arrested Ressa in 2019 and convicted her of the dubious charge of “cyberlibel.” She faces a sentence of as much as 100 years in prison.

    “Our dystopian present is your dystopian future,” she observed. What happened in the Philippines in that first year of Duterte became the reality in the United States under Trump. It was the same life cycle of hate in which misinformation is introduced in social media, then imported into the mainstream media and supported from the top down by opportunistic politicians.

    The Philippines faces another presidential election next year, and Duterte is barred from running again by term limits. Duterte’s daughter, who is currently the mayor of Davao City, just as her father once was, tops the early polls, though she hasn’t thrown her hat in the ring and her father has declared that women shouldn’t run for president. This time around, however, Facebook disrupted the misinformation campaign tied to the Dutertes when it took down fake accounts coming from China that supported the daughter’s potential bid for the presidency.

    President Duterte was furious. “Facebook, listen to me,” he said. “We allow you to operate here hoping that you could help us. Now, if government cannot espouse or advocate something which is for the good of the people, then what is your purpose here in my country? What would be the point of allowing you to continue if you can’t help us?”

    Duterte had been led to believe, based on his previous experience, that Facebook was his lapdog. Other authoritarian regimes had come to expect the same treatment. In India, according to the GPAHE report, Prime Minister Narendra Modi’s Bharatiya Janata Party:

    “… was Facebook India’s biggest advertising spender in 2020. Ties between the company and the Indian government run even deeper, as the company has multiple commercial ties, including partnerships with the Ministry of Tribal Affairs, the Ministry of Women and the Board of Education. Both CEO Mark Zuckerberg and COO Sheryl Sandberg have met personally with Modi, who is the most popular world leader on Facebook. Before Modi became prime minister, Zuckerberg even introduced his parents to him.”

    Facebook has also cozied up to the right-wing government in Poland, misinformation helped get Jair Bolsonaro elected in Brazil, and the platform served as a vehicle for the Islamophobic content that contributed to the rise of the far right in the Netherlands. But the decision to ban Trump has set in motion a backlash. In Poland, for instance, the Law and Justice Party has proposed a law to fine Facebook and others for removing content if it doesn’t break Polish law, and a journalist has attempted to establish a pro-government alternative to Facebook called Albicla.

    Back in the USA

    Similarly, in the United States, the far right have suddenly become big boosters of free speech now that social media platforms have begun to deplatform high-profile users like Trump and take down posts for their questionable veracity and hate content. In the second quarter of 2020 alone, Facebook removed 22.5 million posts.

    Facebook has tried to get ahead of this story by establishing an oversight board that includes members like Jamal Greene, a law professor at Columbia University; Julie Owono, executive director at Internet Sans Frontiere; and Nighat Dad, founder of the Digital Rights Foundation. Now, Facebook users can also petition the board to remove content.

    With Facebook, Twitter, YouTube and others now removing a lot of extremist content, the far right have migrated to other platforms, such as Gab, Telegram, and MeWe. They continue to spread conspiracy theories, anti-COVID vaccine misinformation and pro-Trump propaganda on these alternative platforms. Meanwhile, the MAGA crowd awaits the second coming of Trump in the form of a new social media platform that he plans to launch in a couple of months to remobilize his followers.

    Even without such an alternative alt-right platform — Trumpbook? TrumpSpace? Trumper? — the life cycle of hate is still alive and well in the United States. Consider the “great replacement theory,” according to which immigrants and denizens of the non-white world are determined to “replace” white populations in Europe, America and elsewhere. Since its inception in France in 2010, this extremist conspiracy theory has spread far and wide on social media. It has been picked up by white nationalists and mass shooters. Now, in the second stage of the life cycle, it has landed in the mainstream media thanks to right-wing pundits like Tucker Carlson, who recently opined, “The Democratic Party is trying to replace the current electorate of the voters now casting ballots with new people, more obedient voters from the Third World.”

    Pressure is mounting on Fox to fire Carlson, though the network is resisting. Carlson and his supporters decry the campaign as yet another example of “cancel culture.” They insist on their First Amendment right to express unpopular opinions. But a privately-owned media company is under no obligation to air all views, and the definition of acceptability is constantly evolving.

    Also, a deplatformed Carlson would still be able to air his crank views on the street corner or in emails to his followers. No doubt when Trumpbook debuts at some point in the future, Carlson’s biggest fan will also give him a digital megaphone to spread lies and hate all around the world. These talking heads will continue talking no matter what. The challenge is to progressively shrink the size of their global platform.

    *[This article was originally published by FPIF.]

    The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.

  • Zuckerberg faces Capitol attack grilling as Biden signals tougher line on big tech

    Mark Zuckerberg, the head of Facebook, could be in for a rough ride on Thursday when he testifies to Congress for the first time about the 6 January insurrection at the Capitol in Washington DC, amid growing questions over his platform’s role in fuelling the violence.

    The testimony will come after signs that the new administration of Joe Biden is preparing to take a tougher line on the tech industry’s power, especially when it comes to the social media platforms and their role in spreading misinformation and conspiracy theories.

    Zuckerberg will be joined by Sundar Pichai and Jack Dorsey, the chief executives of Google and Twitter respectively, at a hearing pointedly entitled “Disinformation nation: social media’s role in promoting extremism and misinformation” by the House of Representatives’ energy and commerce committee.

    The scrutiny comes after a report found that Facebook allowed groups linked to the QAnon, boogaloo and militia movements to glorify violence during the 2020 election and in the weeks leading up to the deadly mob violence at the US Capitol.

    Avaaz, a non-profit advocacy group, says it identified 267 pages and groups on Facebook that spread “violence-glorifying content” in the heat of the 2020 election to a combined following of 32 million users. More than two-thirds of the groups and pages had names aligned with several domestic extremist movements. The top 100 most popular false or misleading stories on Facebook related to the elections received an estimated 162m views, the report found. Avaaz called on the White House and Congress to open an investigation into Facebook’s failures and urgently pass legislation to protect American democracy.

    Fadi Quran, its campaign director, said: “This report shows that American voters were pummeled with false and misleading information on Facebook every step of the 2020 election cycle. We have over a year’s worth of evidence that the platform helped drive billions of views to pages and content that confused voters, created division and chaos, and, in some instances, incited violence.

    “But the most worrying finding in our analysis is that Facebook had the tools and capacity to better protect voters from being targets of this content, but the platform only used them at the very last moment, after significant harm was done.”

    Facebook claimed that Avaaz had used flawed methodology. Andy Stone, a spokesperson, said: “We’ve done more than any other internet company to combat harmful content, having already banned nearly 900 militarized social movements and removed tens of thousands of QAnon pages, groups and accounts from our apps.” He acknowledged: “Our enforcement isn’t perfect, which is why we’re always improving it while also working with outside experts to make sure that our policies remain in the right place.”

    But the report is likely to prompt tough questions for Zuckerberg in what is part of a wider showdown between Washington and Silicon Valley. Another flashpoint on Thursday could be Section 230 of the 1996 Communications Decency Act, which shields social media companies from liability for content their users post. Repealing the law is one of the few things on which Biden and his predecessor as president, Donald Trump, agree, though for different reasons. Democrats are concerned that Section 230 allows disinformation and conspiracy theories such as QAnon to flourish, while Trump and other Republicans have argued that it protects companies from consequences for censoring conservative voices.

    More generally, critics say that tech companies are too big and that the coronavirus pandemic has only increased their dominance. The cosy relationship between Barack Obama’s administration and Silicon Valley is a thing of the past, while libertarian Republicans who oppose government interference are a fading force.

    Amazon, Apple, Facebook and Google have all come under scrutiny from Congress and regulators in recent years. The justice department, the Federal Trade Commission (FTC) and state attorneys general are suing the behemoths over various alleged antitrust violations.

    In a letter this week to Biden and Merrick Garland, the new attorney general, a coalition of 29 progressive groups wrote: “It’s clear that the ability of Big Tech giants like Google to acquire monopoly power has been abetted by the leadership deficit at top enforcement agencies such as the FTC … We need a break from past, failed leadership, and we need it now.”

    There are signs that Biden is heeding such calls and spoiling for a confrontation. On Monday he nominated Lina Khan, an antitrust scholar who wants stricter regulation of internet companies, to the FTC. Earlier this month Tim Wu, a Columbia University law professor among the most outspoken critics of big tech, was appointed to the national economic council.

    There is support in Congress from the likes of David Cicilline, chairman of the House judiciary committee’s antitrust panel, which last year released a 449-page report detailing abuses of market power by Apple, Amazon, Google and Facebook. The Democratic congressman is reportedly poised to issue at least 10 legislative initiatives targeting big tech, a blitz that will make it harder for the companies and their lobbyists to focus their opposition on a single piece of legislation.

    Cicilline, who is also working on a separate bill targeting Section 230, told the Axios website: “My strategy is you’ll see a number of bills introduced, both because it’s harder for [the tech companies] to manage and oppose, you know, 10 bills as opposed to one. It also is an opportunity for members of the committee who have expressed a real interest or enthusiasm about a particular issue, to sort of take that on and champion it.”

  • All I want for 2021 is to see Mark Zuckerberg up in court | John Naughton

    It’s always risky making predictions about the tech industry, but this year looks like being different, at least in the sense that there are two safe bets. One is that the attempts to regulate the tech giants that began last year will intensify; the second is that we will be increasingly deluged by sanctimonious cant from Facebook & co as they seek to avoid democratic curbing of their unaccountable power.

    On the regulation front, last year in the US, Alphabet, Google’s corporate owner, found itself facing major antitrust suits from 38 states as well as from the Department of Justice. On this side of the pond, there are preparations for a Digital Markets Unit with statutory powers that will be able to neatly sidestep the tricky definitional questions of what constitutes a monopoly in a digital age. Instead, the unit will decide on a case-by-case basis whether a particular tech company has “strategic market status” if it possesses “substantial, entrenched market power in at least one digital activity” or if it acts as an online “gateway” for other businesses. And if a company is judged to have this status, then penalties and regulations will be imposed on it.

    Over in Brussels, the European Union has come up with a new two-pronged legal framework for curbing digital power – the Digital Markets Act and the Digital Services Act. The Digital Markets Act is aimed at curbing anti-competitive practices in the tech industry (like buying up potential competitors before they can scale up) and will include fines of 10% of global revenues for infringers. The Digital Services Act, for its part, will oblige social media platforms to take more responsibility for illegal content on their platforms – scams, terrorist content, images of abuse, etc – for which they could face fines of up to 6% of global revenue if they fail to police content adequately. So the US and UK approach focuses on corporate behaviour; the EU approach focuses on defining what is allowed legally.

    All of this action has been a long time coming, and while it’s difficult to say exactly how it will play out, the bottom line is that the tech industry is – finally – going to become a regulated one. Its law-free bonanza is going to come to an end.

    The big question, though, is: when? Antitrust actions proceed at a glacial pace because of the complexity of the issues and the bottomless legal budgets of the companies involved. The judge in one of the big American antitrust cases against Google has said that he expects the case to get to court only in late 2023, and then it could run for several years (as the Microsoft case did in the 1990s).

    The problem with that, as the veteran anti-monopoly campaigner Matt Stoller has pointed out, is that the longer monopolistic behaviour goes on, the more damage is done (eg, to advertisers whose revenue is being stolen and other businesses whose property is being appropriated). Google had $170bn in revenue last year and is growing on average at 10-20% a year. On a conservative estimate of 10% growth, the company will add another $100bn to its revenue by 2025, when the case will still be in the court. Facebook, says Stoller, “is at $80bn of revenue this year, but it is growing faster, so the net increase of revenue is a roughly similar amount. In other words, if the claims of the government are credible, then the lengthy case, while perhaps necessary, is also enabling these monopolists to steal an additional $100bn apiece.”

    What could speed up bringing these monopolists to account? A key factor is the vigour with which the US Department of Justice prosecutes its case(s). In the run-up to the 2020 election, the Democrats in Congress displayed an encouraging enthusiasm for tackling tech monopolies, but Joe Biden’s choices for top staff in his administration include a depressing proportion of former tech company stalwarts. And his vice-president-elect, Kamala Harris, consistently turned a blind eye to the anti-competitive acquisitions of the Silicon Valley giants throughout her time as California’s attorney general. So if people are hoping for antitrust zeal from the new US government, they may be in for disappointment.

    Interestingly, Stoller suggests that another approach (inspired by the way trust-busters in the US acted in the 1930s) could provide useful leverage on corporate behaviour from now on. Monopolisation isn’t just illegal, he points out, “it is in fact a crime, an appropriation of the rights and property of others by a dominant actor. The lengthy trial is essentially akin to saying that bank robbers get to keep robbing banks until they are convicted and can probably keep the additional loot.”

    Since a basic principle of the rule of law is that crime shouldn’t pay, the addition of the possibility of criminal charges to the antitrust actions might, like the prospect of being hanged in the morning (pace Dr Johnson), concentrate minds at Facebook, Google, Amazon and Apple. As an eternal optimist, I cannot think of a nicer prospect for 2021 than the sight of Mark Zuckerberg and Sundar Pichai in the dock – with Nick Clegg in attendance, taking notes. Happy new year!

    What I’ve been reading

    Who knew? What We Want Doesn’t Always Make Us Happy is a great Bloomberg column by Noah Smith.

    Far out Intriguing piece on how investors are using real-time satellite images to predict retailers’ sales (Stock Picks From Space), by Frank Partnoy on the Atlantic website.

    An American dream Lovely meditation on Nora Ephron’s New York, by Carrie Courogen on the Bright Wall/Dark Room website.