More stories


    Brawny billionaires, pumped-up politicians: why powerful men are challenging each other to fights

    The first rule of insecure masculinity fight club? Tell everyone about it. And I mean everyone. Tweet about it, talk to reporters, shout about it from the rooftops. Make sure the entire world knows that you are a big boy who could beat just about anyone in a fistfight.

    Twenty twenty-three, as I’m sure you will have observed, was the year that tech CEOs stepped away from their screens and decided to get physical. Elon Musk, perennially thirsty for attention, was at the center of this embarrassing development. The 52-year-old – who challenged Vladimir Putin to single combat in 2022 – spent much of the year teasing the idea that he was going head-to-head with Mark Zuckerberg in a cage fight. At one point he suggested the fight would be held at the Colosseum in Rome.

    Don’t worry, you didn’t miss it. The fight never happened and will never ever happen for the simple reason that Musk would get destroyed by Zuckerberg, who has been obsessively training in mixed martial arts (MMA) and won a bunch of medals in a Brazilian jiujitsu tournament. The only way Musk will actually follow through with the cage match is if he manages to get his hands on some kind of brain-implant technology that magically transforms him into a lean, mean, fighting machine. Indeed, I wouldn’t be surprised if Neuralink, Musk’s brain-chip startup, was working on that brief right now. Although seeing as the company is under federal investigation after killing 1,500 animals in testing – many of which died extremely grisly deaths – it may be a while before any such technology comes to fruition.

    Musk and Zuck aren’t the only tech execs looking to get physical. Vin Diesel-level biceps have become the latest billionaire status symbol. Just look at Jeff Bezos: his muscles have increased at about the same rate as his bank account. The Airbnb CEO, Brian Chesky, has also been working on getting swole. Back in June, Chesky told the Bloomberg writer Dave Lee that he’d “challenge any leader in tech to bench press”.
    He added: “I’ve been waiting for these physical battles in tech. It’s just so funny.”

    It’s not just tech bros. Politicians are at it too. Over the summer, Robert F Kennedy Jr posted a video of himself doing push-ups while shirtless with the caption “Getting in shape for my debates with President Biden!” Which may or may not have been prompted by Biden once challenging an Iowa voter and Donald Trump to a push-up contest.

    I don’t know how good Kevin McCarthy is at push-ups, but he’s certainly fond of shoving. In November, the former speaker bumped into the congressman Tim Burchett of Tennessee and reportedly elbowed him in the back. Burchett then chased after him, calling him a “jerk” and a “chicken”. McCarthy, it seems, was angry that Burchett had helped oust him from the speakership in October, making him the first speaker in US history to have been removed by his own side.

    Just a few hours after that altercation, Markwayne Mullin, a Republican senator from Oklahoma, challenged Sean O’Brien, president of the International Brotherhood of Teamsters, to a physical confrontation during a Senate committee hearing on labor unions. Mullin, a former businessman who regularly boasts about his prowess as an MMA fighter, was miffed that O’Brien had once called him a “greedy CEO” and a “clown” on Twitter. He decided to settle his private grievance during a public hearing, and the two agreed to have a fight right there and then – yelling at each other to “stand your butt up” and get started. Eventually Bernie Sanders got them to calm down.

    Just pause for a moment and imagine acting like this in your own job. I don’t know about you, but I’m pretty sure that if I challenged a colleague to a fight and started yelling at them to “stand their butt up” in the middle of a public meeting, I would face some sort of consequences. In Mullin’s case, the meltdown doesn’t seem to have had any impact on his career. It may have even increased his popularity among his base.
    Politicians routinely seem to be held to a lower standard than the rest of us.

    If you ignore the fact that we’re being ruled by people with enormous egos and no self-restraint, then there is an amusing element to all this. But more than anything, it’s just pathetic, isn’t it? All these grown men so clearly worried about their masculinity that they feel the need to puff out their chests and show everyone just how strong they are.

    The one per cent’s desperate shows of bravado are part of a broader insecurity about masculinity in the west that plenty of snake-oil salesmen and opportunists are exploiting for all it’s worth. In 2022, for example, the rightwing commentator Tucker Carlson came out with a documentary called The End of Men that argues testosterone counts are plummeting and “real men” are an endangered species. The documentary was full of bizarre ways to counteract this, including testicle tanning. I’m not sure how many tech bros and politicians are regularly exposing their balls to red-light therapy, but there does seem to be a widespread preoccupation with “bromeopathic” ways to increase testosterone. Testosterone blood-test “T parties” are apparently a growing trend among tech types: a bunch of founders get together and find ways to raise their T.

    Do whatever you like in private, I say. Tan your testicles, go to T parties, organize push-up competitions. Just don’t foist your masculine insecurities on the rest of us. Stop challenging each other to public fights and getting into brawls in government. It seems to be easy enough for women to follow this advice, doesn’t it? I mean … has a female CEO or politician ever tried to organize a public fistfight with a female counterpart? I’ve got a weird feeling the answer is “no, they would be a complete laughingstock if they did”, but if anyone can find me a recent example then I’ll eat my hat. Or – on second thoughts – I’ll throw my hat in the ring and fight Elon Musk myself in the Roman Colosseum.
    Consider that a challenge.


    Republicans shelve Zuckerberg contempt vote in ‘censorship’ inquiry ‘for now’

    Mark Zuckerberg, the chief executive of Meta, is no stranger to Capitol Hill, where he has sparred with Republicans and Democrats over how he runs his platforms. A Republican-led panel was set to vote on Thursday on a resolution to hold him in contempt of Congress for allegedly failing to turn over internal documents on content moderation.

    However, the House judiciary committee chair, Jim Jordan, a Republican of Ohio, temporarily suspended the vote. Jordan announced on Twitter that the committee “decided to hold contempt in abeyance. For now” and posted a series of tweets of alleged internal communications among Meta executives hours ahead of the hearing.

    “To be clear, contempt is still on the table and WILL be used if Facebook fails to cooperate in FULL,” Jordan said.

    Republican lawmakers have repeatedly accused Meta – along with other big names such as Google, Apple and Microsoft – of suppressing conservative speech on their platforms. Jordan had alleged that Meta failed to turn over requested internal company documents to an investigation into tech companies and “willfully refused to comply in full with a congressional subpoena”, according to a report released on Tuesday. Jordan also subpoenaed the chief executives at Alphabet, Microsoft, Amazon and Apple in February. Zuckerberg is so far the only one facing additional scrutiny.

    But regulating tech companies is a rare area of bipartisan support, even if the reasons behind it are different. Meta has come under fire from Democrats over privacy concerns and its marketing toward kids and teens.
    In 2020, Zuckerberg, along with the then Twitter chief executive, Jack Dorsey, faced intense questioning during a Senate judiciary hearing where Democrats condemned the executives for amplifying misinformation, such as false claims of election fraud, and raised antitrust concerns.

    Meta says it has fully complied with the congressional investigation. “For many months, Meta has operated in good faith with this committee’s sweeping requests for information. We began sharing documents before the committee’s February subpoena and have continued to do so,” said a Meta spokesperson, Andy Stone, in a statement posted in response to the hearing notice on Tuesday. He said Meta had so far delivered more than 53,000 pages of internal and external documents and “made nearly a dozen current and former employees available to discuss external and internal matters, including some scheduled this very week”, according to the statement.

    Politico reported that Meta handed over more documents hours before Jordan announced the Thursday vote but that the Ohio Republican was not satisfied. “They’ve given us documents because we’re pushing and because we’re talking about this – we appreciate that, but we are convinced that it’s way short of what they should be providing us,” Jordan reportedly said in an interview.

    One social media company, Twitter – which now goes by X – has escaped much of the scrutiny as its chief executive, Elon Musk, has been seen as friendly to conservatives. In his February letter to tech companies, Jordan called Twitter a model of transparency and praised its “Twitter files” – which many experts flagged as sensationalized.

    Meta’s second-quarter revenue defied expectations after its earnings release on Wednesday, and Zuckerberg’s own net worth surged on Thursday.


    Nick Clegg to decide on Trump’s 2023 return to Instagram and Facebook

    Meta’s president of global affairs said it would be a decision “I oversee” after the ex-president’s accounts were suspended in 2021.

    Nick Clegg, Meta’s president of global affairs, is charged with deciding whether Donald Trump will be allowed to return to Facebook and Instagram in 2023, Clegg said on Thursday.

    Speaking at an event held in Washington by the news organization Semafor, Clegg said the company was seriously debating whether Trump’s accounts should be reinstated and that it was a decision that “I oversee and I drive”. Clegg added that while he will be making the final call, he will consult the CEO, Mark Zuckerberg, the Facebook board of directors and outside experts.

    “It’s not a capricious decision,” he said. “We will look at the signals related to real-world harm to make a decision whether at the two-year point – which is early January next year – whether Trump gets reinstated to the platform.”

    The former president was suspended from a number of online platforms, including those owned by Meta, following the 6 January 2021 Capitol riot, during which Trump used his social media accounts to praise and perpetuate the violence. While Twitter banned Trump permanently, Meta suspended Trump’s accounts for two years, to be re-evaluated later. In May 2021, the temporary ban was upheld by Facebook’s oversight board – a group of appointed academics and former politicians meant to operate independently of Facebook’s corporate leadership. However, the board returned the final decision on Trump’s accounts to Meta, suggesting the company decide in six months whether to make the ban permanent. Clegg said that decision will be made by 7 January 2023.

    Clegg previously served as Britain’s deputy prime minister and joined Facebook as vice‑president for global affairs and communications in 2018.
    In February, he was promoted to the top company policy executive role. In the years since he began at Meta, Clegg has seen the company through a number of scandals, including scrutiny of its policies during the 2016 US presidential election, Facebook’s role in the persecution of the Rohingya in Myanmar, and the revelations made by the whistleblower Frances Haugen.


    Facebook’s very bad year. No, really, it might be the worst yet

    From repeated accusations of fostering misinformation to multiple whistleblowers, the company weathered some battles in 2021.

    It’s a now-perennial headline: Facebook has had a very bad year. Years of mounting pressure from Congress and the public culminated in repeated PR crises, blockbuster whistleblower revelations and pending regulation over the past 12 months. And while the company’s bottom line has not yet wavered, 2022 is not looking to be any better than 2021 – with more potential privacy and antitrust actions on the horizon. Here are some of the major battles Facebook has weathered in the past year.

    Capitol riots launch a deluge of scandals

    Facebook’s year started with allegations that a deadly insurrection at the US Capitol was largely planned on its platform. Regulatory uproar over the incident reverberated for months, leading lawmakers to call the CEO, Mark Zuckerberg, before Congress to answer for his platform’s role in the attack.

    In the aftermath, Zuckerberg defended his decision not to take action against Donald Trump, though the former president stoked anger and separatist flames on his personal and campaign accounts. Facebook’s inaction led to a rare public employee walkout, and Zuckerberg later reversed the hands-off approach to Trump. Barring Trump from Facebook platforms sparked backlash once again – this time from Republican lawmakers alleging censorship.

    What ensued was a months-long back-and-forth between Facebook and its independent oversight board, with each entity punting the decision of whether to keep Trump off the platform. Ultimately, Facebook decided to extend Trump’s suspension to two years. Critics said this underscored the ineffectiveness of the body.
    “What is the point of the oversight board?” asked the Real Oversight Board, an activist group monitoring Facebook, after the non-verdict.

    Whistleblowers take on Facebook

    The scandal with perhaps the biggest impact on the company this year came in the form of the employee-turned-whistleblower Frances Haugen, who leaked internal documents that exposed some of the inner workings of Facebook and just how much the company knew about the harmful effects its platform was having on users and society. Haugen’s revelations, first reported by the Wall Street Journal, showed Facebook was aware of many of its grave public health impacts and had the means to mitigate them – but chose not to do so.

    For instance, documents show that since at least 2019, Facebook has studied the negative impact Instagram had on teenage girls and yet did little to mitigate the harms and publicly denied that was the case. Those findings in particular led Congress to summon company executives to multiple hearings on the platform and teen users. Facebook has since paused its plans to launch an Instagram app for kids and introduced new safety measures encouraging users to take breaks if they use the app for long periods of time. In a Senate hearing on 8 December, the Instagram executive Adam Mosseri called on Congress to launch an independent body tasked with regulating social media more comprehensively, sidestepping calls for Instagram to regulate itself.

    Haugen also alleged that Facebook’s tweaks to its algorithm, which turned off some safeguards intended to fight misinformation, may have led to the Capitol attack. She provided information underscoring how little of its resources the company dedicates to moderating non-English language content.

    In response to the Haugen documents, Congress has promised legislation and drafted a handful of new bills to address Facebook’s power.
    One controversial measure would target Section 230, a portion of the Communications Decency Act that exempts companies from liability for content posted on their platforms.

    Haugen was not the only whistleblower to take on Facebook in 2021. In April, the former Facebook data scientist turned whistleblower Sophie Zhang revealed to the Guardian that Facebook repeatedly allowed world leaders and politicians to use its platform to deceive the public or harass opponents. Zhang has since been called to testify on these findings before parliaments in the UK and India.

    Lawmakers around the world are eager to hear from the Facebook whistleblowers. Haugen also testified in the UK regarding the documents she leaked, telling MPs Facebook “prioritizes profit over safety”. Such testimony is likely to influence impending legislation, including the Online Safety Bill: a proposed act in the UK that would task the communications authority Ofcom with regulating content online and require tech firms to protect users from harmful posts or face substantial fines.

    Zuckerberg and Cook feud over Apple update

    Though Apple has had its fair share of regulatory battles, Facebook did not find an ally in its fellow tech firm while facing down the onslaught of consumer and regulatory pressure that 2021 brought. The iPhone maker in April launched a new notification system to alert users when and how Facebook was tracking their browsing habits, supposedly as a means to give them more control over their privacy.

    Facebook objected to the new policy, arguing Apple was doing so to “self-preference their own services and targeted advertising products”. It said the feature would negatively affect small businesses relying on Facebook to advertise.
    Apple pressed on anyway, rolling the feature out in April and promising additional changes in 2022. Preliminary reports suggest Apple is, indeed, profiting from the change, while Google and Facebook have seen advertising profits fall.

    Global outage takes out all Facebook products

    In early October, just weeks after Haugen’s revelations, things took a sudden turn for the worse when the company faced a global service outage. Perhaps Facebook’s largest and most sustained tech failure in recent history, the glitch left billions of users unable to access Facebook, Instagram or WhatsApp for six hours on 4 and 5 October. Facebook’s share price dropped 4.9% that day, cutting Zuckerberg’s personal wealth by $6bn, according to Bloomberg.

    Other threats to Facebook

    As Facebook faces continuing calls for accountability, its time as the wunderkind of Silicon Valley has come to a close and it has become a subject of bipartisan contempt. Republicans have repeatedly accused Facebook of being biased against conservatism, while liberals have targeted the platform for its monopolistic tendencies and failure to police misinformation.

    In July, the Biden administration began to take a harder line with the company over vaccine misinformation – which Joe Biden said was “killing people” and the US surgeon general said was “spreading like wildfire” on the platform. Meanwhile, the appointment of the antitrust thought leader Lina Khan as head of the FTC spelled trouble for Facebook. She has been publicly critical of the company and other tech giants in the past, and in August refiled a failed FTC case accusing Facebook of anti-competitive practices.

    After a year of struggles, Facebook has thrown something of a Hail Mary: changing its name. The company announced it would now be called Meta, a reference to its new “metaverse” project, which will create a virtual environment where users can spend time. The name change was met with derision and skepticism from critics.
    But it remains to be seen whether Facebook, by any other name, will beat the reputation that precedes it.


    Facebook boss ‘not willing to protect public from harm’

    Frances Haugen says the chief executive has not shown any desire to shield users from the consequences of harmful content.

    Dan Milmo, Sat 23 Oct 2021

    The Facebook whistleblower whose revelations have tipped the social media giant into crisis has launched a stinging new criticism of Mark Zuckerberg, saying he has not shown any readiness to protect the public from the harm his company is causing. Frances Haugen told the Observer that Facebook’s founder and chief executive had not displayed a desire to run the company in a way that shields the public from the consequences of harmful content.

    Her intervention came as pressure mounted on the near-$1tn (£730bn) business following a fresh wave of revelations based on documents leaked by Haugen, a former Facebook employee. The New York Times reported that workers had repeatedly warned that Facebook was being flooded with false claims about the 2020 presidential election result being fraudulent and believed the company should have done more to tackle it.

    Haugen, who appears before MPs and peers in Westminster on Monday, said Zuckerberg, who controls the business via a majority of its voting shares, has not shown any willingness to protect the public. “Right now, Mark is unaccountable. He has all the control. He has no oversight, and he has not demonstrated that he is willing to govern the company at the level that is necessary for public safety.”

    She added that giving all shareholders an equal say in the running of the company would result in changes at the top. “I believe in shareholder rights and the shareholders, or shareholders minus Mark, have been asking for years for one share one vote.
    And the reason for that is, I am pretty sure the shareholders would choose other leadership if they had an option.”

    Haugen, who quit as a Facebook product manager in May, said she had leaked tens of thousands of documents to the Wall Street Journal and to Congress because she had realised that the company would not change otherwise. She said: “There are great companies that have done major cultural changes. Apple did a major cultural change; Microsoft did a major cultural change. Facebook can change too. They just have to get the will.”

    This weekend, a consortium of US news organisations released a fresh wave of stories based on the Haugen documents. The New York Times reported that internal research showed how, at one point after the US presidential election last year, 10% of all US views of political material on Facebook – a very high proportion for Facebook – were of posts falsely alleging that Joe Biden’s victory was fraudulent. One internal review criticised attempts to tackle Stop the Steal groups spreading claims on the platform that the election was rigged. “Enforcement was piecemeal,” said the research.

    The revelations have reignited concerns about Facebook’s role in the 6 January riots, in which a mob seeking to overturn the election result stormed the Capitol in Washington. The New York Times added that some of the reporting for the story was based on documents not released by Haugen.

    A Facebook spokesperson said: “At the heart of these stories is a premise which is false. Yes, we’re a business and we make profit, but the idea that we do so at the expense of people’s safety or wellbeing misunderstands where our commercial interests lie.
    The truth is we’ve invested $13bn and have over 40,000 people to do one job: keep people safe on Facebook.”

    Facebook’s vice-president of integrity, Guy Rosen, said the company had put in place multiple measures to protect the public during and after the election and that “responsibility for the [6 January] insurrection lies with those who broke the law during the attack and those who incited them”.

    It was also reported on Friday that a new Facebook whistleblower had come forward and, like Haugen, had filed a complaint to the Securities and Exchange Commission, the US financial regulator, alleging that the company declined to enforce safety rules for fear of angering Donald Trump or impacting Facebook’s growth.

    Haugen will testify in person on Monday to the joint committee scrutinising the draft online safety bill, which would impose a duty of care on social media companies to protect users from harmful content, and allow the communications regulator, Ofcom, to fine those who breach this. The maximum fine is 10% of global turnover, so in the case of Facebook, this could run into billions of pounds. Facebook, whose services also include Instagram and WhatsApp, has 2.8 billion daily users and generated an income last year of $86bn.

    As well as issuing detailed rebuttals of Haugen’s revelations, Facebook is reportedly planning a major change that would attempt to put some distance between the company and its main platform. Zuckerberg could announce a rebranding of Facebook’s corporate identity on Thursday, according to a report that said the company is keen to emphasise its future as a player in the “metaverse”, a digital world in which people interact and lead their social and professional lives virtually.

    Haugen said Facebook must be compelled by all regulators to be more transparent with the information at its disposal internally, as detailed in her document leaks.
    She said one key reform would be to set up a formal structure whereby regulators could demand reports from Facebook on any problem that they identify. “Let’s imagine there was a brand of car that was having five times as many car accidents as other cars. We wouldn’t accept that car company saying, ‘this is really hard, we are trying our best, we are sorry, we are trying to do better in the future’. We would never accept that as an answer and we are hearing that from Facebook all the time. There needs to be an avenue where we can escalate a concern and they actually have to give us a response.”


    Facebook harms children and is damaging democracy, claims whistleblower

    Frances Haugen says in US Congress testimony that Facebook puts ‘astronomical profits before people’.

    Dan Milmo and Kari Paul, Tue 5 Oct 2021

    Facebook puts “astronomical profits before people”, harms children and is destabilising democracies, a whistleblower has claimed in testimony to the US Congress. Frances Haugen said Facebook knew it steered young users towards damaging content and that its Instagram app was “like cigarettes” for under-18s. In a wide-ranging testimony, the former Facebook employee said the company did not have enough staff to keep the platform safe and was “literally fanning” ethnic violence in developing countries.

    She also told US senators:
    The “buck stops” with the founder and chief executive, Mark Zuckerberg.
    Facebook knows its systems lead teenagers to anorexia-related content.
    The company had to “break the glass” and turn back on safety settings after the 6 January Washington riots.
    Facebook intentionally targets teenagers and children under 13.
    Monday’s outage that brought down Facebook, Instagram and WhatsApp meant that for more than five hours Facebook could not “destabilise democracies”.
    Haugen appeared in Washington on Tuesday after coming forward as the source of a series of revelations in the Wall Street Journal last month based on internal Facebook documents. They revealed the company knew Instagram was damaging teenagers’ mental health and that changes to Facebook’s News Feed feature – a central plank of users’ interaction with the service – had made the platform more polarising and divisive.

    Her evidence to senators included the claim that Facebook knew Instagram users were being led to anorexia-related content. She said an algorithm “led children from very innocuous topics like healthy recipes … all the way to anorexia-promoting content over a very short period of time”.

    In her opening testimony, Haugen, 37, said: “I’m here today because I believe Facebook’s products harm children, stoke division and weaken our democracy. The company’s leadership knows how to make Facebook and Instagram safer, but won’t make the necessary changes because they have put their astronomical profits before people.” She added that Facebook was “buying its profits with our safety”.
    In 2020, Facebook reported a net income – a US measure of profit – of more than $29bn (£21bn).

    Referring to Monday’s near six-hour outage, in which Facebook’s platforms including Instagram and WhatsApp were disabled for billions of users, Haugen’s testimony added: “For more than five hours Facebook wasn’t used to deepen divides, destabilise democracies and make young girls and women feel bad about their bodies.” Facebook has 3.5 billion monthly active users across its platforms including Instagram and WhatsApp.

    Warning that Facebook makes choices that “go against the common good”, Haugen said the company should be treated like the tobacco industry, which was subject to government action once it was discovered it was hiding the harms its products caused, or like car companies that were forced to adopt seatbelts or opioid firms that have been sued by government agencies.

    Urging lawmakers to force more transparency on Facebook, she said there should be more scrutiny of its algorithms, which shape the content delivered to users. “The core of the issue is that no one can understand Facebook’s destructive choices better than Facebook, because only Facebook gets to look under the hood,” she said. With greater transparency, she added, “we can build sensible rules and standards to address consumer harms, illegal content, data protection, anticompetitive practices, algorithmic systems and more”.

    The hearing focused on the impact of Facebook’s platforms on children, with Haugen likening the appeal of Instagram to tobacco. “It’s just like cigarettes … teenagers don’t have good self-regulation.” Haugen added that women would be walking around with brittle bones in 60 years’ time because of the anorexia-related content they found on Facebook platforms.

    Haugen told lawmakers that Facebook intentionally targets teens and “definitely” targets children as young as eight for the Messenger Kids app.
    The former Facebook product manager left the company in May after copying tens of thousands of internal documents.

    A Facebook spokesperson, Andy Stone, said in a tweet during the hearing: “Just pointing out the fact that Frances Haugen did not work on child safety or Instagram or research these issues and has no direct knowledge of the topic from her work at Facebook.”

    Haugen said that, according to internal documents, Zuckerberg had been given “soft options” to make the Facebook platform less “twitchy” and viral in countries prone to violence but declined to take them because it might affect “meaningful social interactions”, or MSI. She added: “We have a few choice documents that contain notes from briefings with Mark Zuckerberg where he chose metrics defined by Facebook like ‘meaningful social interactions’ over changes that would have significantly decreased misinformation, hate speech and other inciting content.”

    Haugen said Zuckerberg had built a company that was “very metrics driven”, because the more time people spent on Facebook platforms the more appealing the business was to advertisers. Asked about Zuckerberg’s ultimate responsibility for decisions made at Facebook, she said: “The buck stops with him.”

    Haugen also warned that Facebook was “literally fanning ethnic violence” in places such as Ethiopia because it was not policing its service adequately outside of the US.

    Referring to the aftermath of the 6 January storming of the Capitol, as protesters sought to overturn the US presidential election result, Haugen said she was disturbed that Facebook had to “break the glass” and reinstate safety settings that it had put in place for the November poll.
Haugen, who worked for the Facebook team that monitored election interference globally, said those precautions had been dropped after Joe Biden's victory in order to spur growth on the platform.

Among the reforms recommended by Haugen were ensuring that Facebook shares internal information and research with "appropriate" oversight bodies such as Congress and removing the influence of algorithms on Facebook's News Feed by allowing it to be ranked chronologically.

Senator Ed Markey said Congress would take action. "Here's my message for Mark Zuckerberg: your time of invading our privacy, promoting toxic content and preying on children and teens is over," Markey said. "Congress will be taking action. We will not allow your company to harm our children and our families and our democracy, any longer."

Haugen's lawyers have also filed at least eight complaints with the US financial watchdog accusing the social media company of serially misleading investors about its approach to safety and the size of its audience.

Facebook has issued a series of statements downplaying Haugen's document leaks, saying: its Instagram research showed that many teenagers found the app helpful; it was investing heavily in security at the expense of its bottom line; polarisation had been growing in the US for decades before Facebook appeared; and the company had "made fighting misinformation and providing authoritative information a priority".

Responding to accusations that Facebook had misled the public and regulators, the company said: "We stand by our public statements and are ready to answer any questions regulators may have about our work."

A Facebook spokesperson said: "Today, a Senate commerce subcommittee held a hearing with a former product manager at Facebook who worked for the company for less than two years, had no direct reports, never attended a decision-point meeting with C-level executives and testified more than six times to not working on the subject matter in question.
We don't agree with her characterization of the many issues she testified about.

"Despite all this, we agree on one thing; it's time to begin to create standard rules for the internet. It's been 25 years since the rules for the internet have been updated, and instead of expecting the industry to make societal decisions that belong to legislators, it is time for Congress to act."


    The inside story of how we reached the Facebook-Trump verdict | Alan Rusbridger

    As so often is the case, Donald Trump gets to the heart of the problem. On 6 January, he was the president of the United States: probably the most powerful man in the world. He should be free to speak his mind, and voters should be free to listen. But he was also a habitual liar who, by the end of his term, had edged into repudiating the very democracy that had elevated him.

    And then came his inflammatory words on that day, uttered even as rioters were breaking their way into the heart of US democracy. His words had a veneer of restraint – "We have to have peace, so go home." But his statements were laced with lies, along with praise for the mob who terrorised lawmakers as they sought to confirm Biden as Trump's successor – "We love you, you're very special … great patriots … remember this day for ever."

    At 5.41pm and 6.15pm that day, Facebook removed two posts from Trump. The following day the company banned Trump from its platform indefinitely. Around the same time, Twitter also moved to ban the president – permanently.

    So there was the problem that Donald Trump embodied – in a country whose commitment to free speech is baked into its core. The president might be a bitterly polarising figure, but surely he has a right to be heard – and for voters to be free to make up their own minds?

    Facebook's decision to the contrary would spark passionate debate within the United States. But it had a wider resonance. For how much longer would giant social media platforms act as an amplification system for any number of despots around the world? Would they, too, be banned?

    The classic defence of free expression is that good speech defeats bad speech. Political speech – in some views – should be the most protected speech. It is vital we know who our leaders are. We have a right – surely? – to know if they are crooks, liars or demagogues.

    On 7 January Facebook decided: no longer.
    And now the Facebook oversight board, of which I am a member, has published its own verdict on the decision: Facebook was both right and wrong. Right to remove his 6 January words and right, the following day, to ban the president from the platform. But wrong to ban him "indefinitely".

    The key word is "indefinitely" – if only because Facebook's own policies do not appear to permit it. The oversight board (OSB) judgment doesn't mince its words: "In applying a vague, standardless penalty and then referring this case to the board to resolve, Facebook seeks to avoid its responsibilities. The board declines Facebook's request and insists that Facebook apply and justify a defined penalty." Ball squarely back in Facebook's court.

    What Facebook has to do now – in our judgment, which the company is bound to implement – is to re-examine the arbitrary penalty it imposed on 7 January. It should take account of the gravity of the violation and the prospect of future harm.

    The case is the most prominent the OSB has decided since it was established as an independent entity and will inevitably focus more attention on its work. Why is such a body thought necessary?

    Let's assume we might agree that it's a bad thing for one person, Mark Zuckerberg, to be in charge of the rules of speech for 2 billion or more people. He is clearly a wonderfully talented engineer – but nothing in his background suggests he is equipped to think deeply about the complexities involved in free expression.

    Maybe most people who have studied the behaviour of governments towards publishers and newspapers over 300 years might also agree that politicians are not the best people to be trusted with individual decisions about who gets to say what.

    Into the void between those two polarities has stepped the OSB.
    At the moment we're 19 individuals with backgrounds in journalism, law, academia and human rights: by the end of 2021 we hope to be nearer 40.

    Are we completely independent from Facebook? It certainly feels that way. It's true that Facebook was involved in selecting the first 20 members, but once the board reaches its full complement, we decide who our future colleagues will be. Since a few early meetings to understand Facebook processes around moderation and similar matters we have had nothing to do with the company.

    We have our own board of distinguished trustees – again, free of any influence from Facebook. From what I've seen of my colleagues so far they're an odd bunch to have picked if you were in search of a quiet life.

    The Trump decision was reached through the processes we've devised ourselves. A panel of five – with a good spread of regional backgrounds – did the initial heavy lifting, including sifting through more than 9,000 responses from the public.

    The wider board fed in its own views. We looked at Facebook's own values – what they call voice, safety and dignity – as well as its content policies and community standards. But we also apply an international human rights lens in trying to balance freedom of expression with possible harms.

    In the Trump case we looked at the UN Guiding Principles on Business and Human Rights (UNGPs), which establish a voluntary framework for the human rights responsibilities of private businesses. We also considered the right to freedom of expression set out in articles 19 and 20 of the International Covenant on Civil and Political Rights (ICCPR) – as well as the qualifying articles to do with the rights to life, security of person, non-discrimination, participation in public affairs and so on.

    We also considered the 2013 Rabat Plan of Action, which attempts to identify and control hate speech online. We took into account a submission sent on behalf of Trump himself and sent Facebook 46 questions.
    They answered 37 fully, and two partially.

    And then we debated, and argued – virtually/verbally and in writing. A number of drafts were circulated, with most board members pitching in with tweaks, challenges, corrections and disagreements. Gradually, a consensus developed – resulting in a closely argued 38-page decision which openly reflects the majority and minority opinions.

    In addition to our ruling about the original and "indefinite" bans, we've sent Facebook a number of policy advisory statements. One of these concentrates on the question of how social media platforms should deal with "influential users" (a more useful conceit than "political leaders").

    Speed is clearly of the essence where potentially harmful speech is involved. While it's important to protect the rights of people to hear political speech, "if the head of state or high government official has repeatedly posted messages that pose a risk of harm under international human rights norms, Facebook should suspend the account for a determinate period sufficient to protect against imminent harm".

    As in previous judgments, we are critical of a lack of clarity in some of Facebook's own rules, together with insufficient transparency about how they're enforced. We would like to see Facebook carry out a comprehensive review of its potential contribution to the narrative around electoral fraud and the exacerbated tensions that culminated in the violence on 6 January.

    And then this: "This should be an open reflection on the design and policy choices that Facebook has made that may enable its platform to be abused." Which many people will read as a not-so-coded reference to what is shorthanded as The Algorithm.

    Social media is still in its infancy. Among the many thorny issues we periodically discuss as a board is, what is this thing we're regulating?
    The existing language – "platform", "publisher", "public square" – doesn't adequately describe these new entities. Most of the suggested forms of more interventionist regulation stub their toes on the sheer novelty of this infant space for the unprecedented mass exchange of views.

    The OSB is also taking its first steps. The Trump judgment cannot possibly satisfy everyone. But this 38-page text is, I hope, a serious contribution to thinking about how to handle free speech in an age of information chaos.


    The Spread of Global Hate

    One insidious way to torture the detainees at Guantanamo Bay was to blast music at them at all hours. The mixtape, which included everything from Metallica to the Meow Mix jingle, was intended to disorient the captives and impress upon them the futility of resistance. It worked: This soundtrack from hell did indeed break several inmates.

    For four years, Americans had to deal with a similar sonic blast, namely the "music" of President Donald Trump. His voice was everywhere: on TV and radio, screaming from the headlines of newspapers, pumped out nonstop on social media. MAGA men and women danced to the repetitive beat of his lies and distortions. Everyone else experienced the nonstop assault of Trump's instantly recognizable accent and intonations as nails on a blackboard. After the 2016 presidential election, psychologists observed a significant uptick in the fears Americans had about the future. One clinician even dubbed the phenomenon "Trump anxiety disorder."


    The volume of Trump’s assault on the senses has decreased considerably since January. Obviously, he no longer has the bully pulpit of the Oval Office to broadcast his views. The mainstream media no longer covers his every utterance. Most importantly, the major social media platforms have banned him. In the wake of the January 6 insurrection on Capitol Hill, Twitter suspended Trump permanently under its glorification of violence policy. Facebook made the same decision, though its oversight board is now revisiting the former president’s deplatforming.

    It’s not only Trump. The Proud Boys, QAnon, the militia movements: The social media footprint of the far right has decreased a great deal in 2021, with a parallel decline in the amount of misinformation available on the Web.

    And it’s not just a problem of misinformation and hate speech. According to a new report by the Center for Strategic and International Studies (CSIS) on domestic terrorism, right-wing extremists have been involved in 267 plots and 91 fatalities since 2015, with the number of incidents rising in 2020 to a height unseen in a quarter of a century. A large number of the perpetrators are loners who have formed their beliefs from social media. As one counterterrorism official put it, “Social media has afforded absolutely everything that’s bad out there in the world the ability to come inside your home.”

    So, why did the tech giants provide Trump, his extremist followers and their global counterparts unlimited access to a growing audience over those four long years?

    Facebook Helps Trump

    In a new report from the Global Project Against Hate and Extremism (GPAHE), Heidi Beirich and Wendy Via write: “For years, Trump violated the community standards of several platforms with relative impunity. Tech leaders had made the affirmative decision to allow exceptions for the politically powerful, usually with the excuse of ‘newsworthiness’ or under the guise of ‘political commentary’ that the public supposedly needed to see.”

    Even before Trump became president, Facebook was cutting him a break. In 2015, he was using the social media platform to promote a Muslim travel ban, which generated considerable controversy, particularly within Facebook itself. The Washington Post reports:

    “Outrage over the video led to a companywide town hall, in which employees decried the video as hate speech, in violation of the company’s policies. And in meetings about the issue, senior leaders and policy experts overwhelmingly said they felt that the video was hate speech, according to three former employees, who spoke on the condition of anonymity for fear of retribution. [Facebook CEO Mark] Zuckerberg expressed in meetings that he was personally disgusted by it and wanted it removed, the people said.”

    But the company’s most prominent Republican, Vice-President of Global Policy Joel Kaplan, persuaded Zuckerberg to change his position. In spring 2016, when Zuckerberg wanted to condemn Trump’s plan to build a wall on the border with Mexico, he was again persuaded to step back for fear of seeming too partisan.


    Facebook went on to play a critical role in getting Trump elected. It wasn’t simply the Russian campaign to create fake accounts, fake messaging and even fake events using Facebook, or the theft of Facebook user data by Cambridge Analytica. More important was the role played by Facebook staff in helping Trump’s digital outreach team maximize its use of social media. The Trump campaign spent $70 million on Facebook ads and raised much of its $250 million in online fundraising through Facebook as well.

    Trump established a new paradigm through brute force and money. As he turned himself into clickbait, the social media giants applied the same “exceptionalism” to other rancid politicians. More ominously, the protection accorded politicians extended to extremists. According to an account of a discussion at a Twitter staff meeting, one employee explained that “on a technical level, content from Republican politicians could get swept up by algorithms aggressively removing white supremacist material. Banning politicians wouldn’t be accepted by society as a trade-off for flagging all of the white supremacist propaganda.”

    Of course, in the wake of the January 6 insurrection, social media organizations decided that society could indeed accept the banning of politicians, at least when it came to some politicians in the United States.

    The Real Fake News

    In the Philippines, an extraordinary 97% of internet users had accounts with Facebook as of 2019, up from 40% in 2018 (by comparison, about 67% of Americans have Facebook accounts). Increasingly, Filipinos get their news from social media. That's bad news for the mainstream media in the Philippines. And that's particularly bad news for journalists like Maria Ressa, who runs an online news site called Rappler.

    At a press conference for the GPAHE report, Ressa described how the government of Rodrigo Duterte, with an assist from Facebook, has made her life a living hell. Like Trump, President Duterte came to power on a populist platform spread through Facebook. Because of her critical reporting on government affairs, Ressa felt the ire of the Duterte fan club, which generated half a million hate posts that, according to one study, consisted of 60% attacks on her credibility and 40% sexist and misogynist slurs. This onslaught created a bandwagon effect that equated journalists like her with criminals.

    This noxious equation on social media turned into a real case when the Philippine authorities arrested Ressa in 2019 and convicted her of the dubious charge of “cyberlibel.” She faces a sentence of as much as 100 years in prison.

    “Our dystopian present is your dystopian future,” she observed. What happened in the Philippines in that first year of Duterte became the reality in the United States under Trump. It was the same life cycle of hate in which misinformation is introduced in social media, then imported into the mainstream media and supported from the top down by opportunistic politicians.

    The Philippines faces another presidential election next year, and Duterte is barred from running again by term limits. Duterte's daughter, who is currently the mayor of Davao City as her father once was, tops the early polls, though she hasn't thrown her hat in the ring and her father has declared that women shouldn't run for president. This time around, however, Facebook disrupted the misinformation campaign tied to the Dutertes when it took down fake accounts coming from China that supported the daughter's potential bid for the presidency.

    President Duterte was furious. “Facebook, listen to me,” he said. “We allow you to operate here hoping that you could help us. Now, if government cannot espouse or advocate something which is for the good of the people, then what is your purpose here in my country? What would be the point of allowing you to continue if you can’t help us?”

    Duterte had been led to believe, based on his previous experience, that Facebook was his lapdog. Other authoritarian regimes had come to expect the same treatment. In India, according to the GPAHE report, Prime Minister Narendra Modi’s Bharatiya Janata Party:

    “… was Facebook India’s biggest advertising spender in 2020. Ties between the company and the Indian government run even deeper, as the company has multiple commercial ties, including partnerships with the Ministry of Tribal Affairs, the Ministry of Women and the Board of Education. Both CEO Mark Zuckerberg and COO Sheryl Sandberg have met personally with Modi, who is the most popular world leader on Facebook. Before Modi became prime minister, Zuckerberg even introduced his parents to him.”

    Facebook has also cozied up to the right-wing government in Poland, misinformation helped get Jair Bolsonaro elected in Brazil, and the platform served as a vehicle for the Islamophobic content that contributed to the rise of the far right in the Netherlands. But the decision to ban Trump has set in motion a backlash. In Poland, for instance, the Law and Justice Party has proposed a law to fine Facebook and others for removing content if it doesn’t break Polish law, and a journalist has attempted to establish a pro-government alternative to Facebook called Albicla.

    Back in the USA

    Similarly, in the United States, the far right has suddenly become a big booster of free speech now that social media platforms have begun to deplatform high-profile users like Trump and take down posts for their questionable veracity and hate content. In the second quarter of 2020 alone, Facebook removed 22.5 million posts.


    Facebook has tried to get ahead of this story by establishing an oversight board that includes members like Jamal Greene, a law professor at Columbia University; Julie Owono, executive director at Internet Sans Frontiere; and Nighat Dad, founder of the Digital Rights Foundation. Now, Facebook users can also petition the board to remove content.

    With Facebook, Twitter, YouTube and others now removing a lot of extremist content, the far right have migrated to other platforms, such as Gab, Telegram, and MeWe. They continue to spread conspiracy theories, anti-COVID vaccine misinformation and pro-Trump propaganda on these alternative platforms. Meanwhile, the MAGA crowd awaits the second coming of Trump in the form of a new social media platform that he plans to launch in a couple of months to remobilize his followers.

    Even without such an alternative alt-right platform — Trumpbook? TrumpSpace? Trumper? — the life cycle of hate is still alive and well in the United States. Consider the “great replacement theory,” according to which immigrants and denizens of the non-white world are determined to “replace” white populations in Europe, America and elsewhere. Since its inception in France in 2010, this extremist conspiracy theory has spread far and wide on social media. It has been picked up by white nationalists and mass shooters. Now, in the second stage of the life cycle, it has landed in the mainstream media thanks to right-wing pundits like Tucker Carlson, who recently opined, “The Democratic Party is trying to replace the current electorate of the voters now casting ballots with new people, more obedient voters from the Third World.”

    Pressure is mounting on Fox to fire Carlson, though the network is resisting. Carlson and his supporters decry the campaign as yet another example of "cancel culture." They insist on their First Amendment right to express unpopular opinions. But a privately owned media company is under no obligation to air all views, and the definition of acceptability is constantly evolving.

    Also, a deplatformed Carlson would still be able to air his crank views on the street corner or in emails to his followers. No doubt when Trumpbook debuts at some point in the future, Carlson’s biggest fan will also give him a digital megaphone to spread lies and hate all around the world. These talking heads will continue talking no matter what. The challenge is to progressively shrink the size of their global platform.

    *[This article was originally published by FPIF.]

    The views expressed in this article are the author's own and do not necessarily reflect Fair Observer's editorial policy.