More stories

  • The whistleblower who plunged Facebook into crisis

    After a set of leaks last month that represented the most damaging insight into Facebook’s inner workings in the company’s history, the former employee behind them has come forward. Now Frances Haugen has given evidence to the US Congress – and been praised by senators as a ‘21st century American hero’. Will her testimony accelerate efforts to bring the social media giant to heel?


    On Monday, Facebook and its subsidiaries Instagram and WhatsApp went dark after a router failure. There were thousands of negative headlines, millions of complaints, and more than 3 billion users were forced offline. On Tuesday, the company’s week got significantly worse. Frances Haugen, a former product manager with Facebook, testified before US senators about what she had seen in her two years there – and set out why she had decided to leak a trove of internal documents to the Wall Street Journal. Haugen had revealed herself as the source of the leak a few days earlier. And while the content of the leak – from internal warnings of the harm being done to teenagers by Instagram to the deal Facebook gives celebrities to leave their content unmoderated – had already led to debate about whether the company needed to reform, Haugen’s decision to come forward escalated the pressure on Mark Zuckerberg.

    In this episode, Nosheen Iqbal talks to the Guardian’s global technology editor, Dan Milmo, about what we learned from Haugen’s testimony, and how damaging a week this could be for Facebook. Milmo sets out the challenges facing the company as it seeks to argue that the whistleblower is poorly informed or that her criticism is mistaken. And he reflects on what options politicians and regulators around the world will consider as they look for ways to curb Facebook’s power, and how likely such moves are to succeed.

    After Haugen spoke, Zuckerberg said her claims that the company puts profit over people’s safety were “just not true”. In a blog post, he added: “The argument that we deliberately push content that makes people angry for profit is deeply illogical. We make money from ads, and advertisers consistently tell us they don’t want their ads next to harmful or angry content.” You can read more of Zuckerberg’s defence here. And you can read an analysis of how Haugen’s testimony is likely to affect Congress’s next move here.
    Archive: BBC; YouTube; TikTok; CSPAN; NBC; CBS; CNBC; Vice; CNN

  • Facebook ‘tearing our societies apart’: key excerpts from a whistleblower

    Frances Haugen tells US news show why she decided to reveal inside story about social networking firm

    Dan Milmo, Global technology editor
    Mon 4 Oct 2021 08.33 EDT

    Frances Haugen’s interview with the US news programme 60 Minutes contained a litany of damning statements about Facebook. Haugen, a former Facebook employee who had joined the company to help it combat misinformation, told the CBS show the tech firm prioritised profit over safety and was “tearing our societies apart”. Haugen will testify in Washington on Tuesday, as political pressure builds on Facebook. Here are some of the key excerpts from Haugen’s interview.

    Choosing profit over the public good

    Haugen’s most cutting words echoed what is becoming a regular refrain from politicians on both sides of the Atlantic: that Facebook puts profit above the wellbeing of its users and the public. “The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook. And Facebook, over and over again, chose to optimise for its own interests, like making more money.”

    She also accused Facebook of endangering public safety by reversing changes to its algorithm once the 2020 presidential election was over, allowing misinformation to spread on the platform again. “And as soon as the election was over, they turned them [the safety systems] back off or they changed the settings back to what they were before, to prioritise growth over safety. And that really feels like a betrayal of democracy to me.”

    Facebook’s approach to safety compared with others

    In a 15-year career as a tech professional, Haugen, 37, has worked for companies including Google and Pinterest but she said Facebook had the worst approach to restricting harmful content. She said: “I’ve seen a bunch of social networks and it was substantially worse at Facebook than anything I’d seen before.” Referring to Mark Zuckerberg, Facebook’s founder and chief executive, she said: “I have a lot of empathy for Mark. And Mark has never set out to make a hateful platform. But he has allowed choices to be made where the side-effects of those choices are that hateful, polarising content gets more distribution and more reach.”

    Instagram and mental health

    The document leak that had the greatest impact was a series of research slides that showed Facebook’s Instagram app was damaging the mental health and wellbeing of some teenage users, with 30% of teenage girls feeling that it made dissatisfaction with their body worse. She said: “And what’s super tragic is Facebook’s own research says, as these young women begin to consume this eating disorder content, they get more and more depressed. And it actually makes them use the app more. And so, they end up in this feedback cycle where they hate their bodies more and more. Facebook’s own research says it is not just that Instagram is dangerous for teenagers, that it harms teenagers, it’s that it is distinctly worse than other forms of social media.” Facebook has described the Wall Street Journal’s reporting on the slides as a “mischaracterisation” of its research.

    Why Haugen leaked the documents

    Haugen said “person after person” had attempted to tackle Facebook’s problems but had been ground down. “Imagine you know what’s going on inside of Facebook and you know no one on the outside knows. I knew what my future looked like if I continued to stay inside of Facebook, which is person after person after person has tackled this inside of Facebook and ground themselves to the ground.” Having joined the company in 2019, Haugen said she decided to act this year and started copying tens of thousands of documents from Facebook’s internal system, which she believed show that Facebook is not, despite public comments to the contrary, making significant progress in combating online hate and misinformation. “At some point in 2021, I realised, ‘OK, I’m gonna have to do this in a systemic way, and I have to get out enough that no one can question that this is real.’”

    Facebook and violence

    Haugen said the company had contributed to ethnic violence, a reference to Burma. In 2018, following the massacre of Rohingya Muslims by the military, Facebook admitted that its platform had been used to “foment division and incite offline violence” relating to the country. Speaking on 60 Minutes, Haugen said: “When we live in an information environment that is full of angry, hateful, polarising content it erodes our civic trust, it erodes our faith in each other, it erodes our ability to want to care for each other. The version of Facebook that exists today is tearing our societies apart and causing ethnic violence around the world.”

    Facebook and the Washington riot

    The 6 January riot, when crowds of rightwing protesters stormed the Capitol, came after Facebook disbanded the Civic Integrity team of which Haugen was a member. The team, which focused on issues linked to elections around the world, was dispersed to other Facebook units following the US presidential election. “They told us: ‘We’re dissolving Civic Integrity.’ Like, they basically said: ‘Oh good, we made it through the election. There wasn’t riots. We can get rid of Civic Integrity now.’ Fast-forward a couple months, we got the insurrection. And when they got rid of Civic Integrity, it was the moment where I was like, ‘I don’t trust that they’re willing to actually invest what needs to be invested to keep Facebook from being dangerous.’”

    The 2018 algorithm change

    Facebook changed the algorithm on its news feed – Facebook’s central feature, which supplies users with a customised feed of content such as friends’ photos and news stories – to prioritise content that increased user engagement. Haugen said this made divisive content more prominent. “One of the consequences of how Facebook is picking out that content today is it is optimising for content that gets engagement, or reaction. But its own research is showing that content that is hateful, that is divisive, that is polarising – it’s easier to inspire people to anger than it is to other emotions.” She added: “Facebook has realised that if they change the algorithm to be safer, people will spend less time on the site, they’ll click on less ads, they’ll make less money.”

    Haugen said European political parties contacted Facebook to say that the news feed change was forcing them to take more extreme political positions in order to win users’ attention. Describing politicians’ concerns, she said: “You are forcing us to take positions that we don’t like, that we know are bad for society. We know if we don’t take those positions, we won’t win in the marketplace of social media.”

    In a statement to 60 Minutes, Facebook said: “Every day our teams have to balance protecting the right of billions of people to express themselves openly with the need to keep our platform a safe and positive place. We continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true. If any research had identified an exact solution to these complex challenges, the tech industry, governments, and society would have solved them a long time ago.”

  • Facebook to suspend Trump’s account for two years

    Facebook is suspending Donald Trump’s account for two years, the company has announced in a highly anticipated decision that follows months of debate over the former president’s future on social media.

    “Given the gravity of the circumstances that led to Mr Trump’s suspension, we believe his actions constituted a severe violation of our rules which merit the highest penalty available under the new enforcement protocols. We are suspending his accounts for two years, effective from the date of the initial suspension on January 7 this year,” Nick Clegg, Facebook’s vice-president of global affairs, said in a statement on Friday.

    At the end of the suspension period, Facebook said, it would work with experts to assess the risk to public safety posed by reinstating Trump’s account. “We will evaluate external factors, including instances of violence, restrictions on peaceful assembly and other markers of civil unrest,” Clegg wrote. “If we determine that there is still a serious risk to public safety, we will extend the restriction for a set period of time and continue to re-evaluate until that risk has receded.” He added that once the suspension was lifted, “a strict set of rapidly escalating sanctions” would be triggered if Trump violated Facebook policies.

    Friday’s decision comes just weeks after input from the Facebook oversight board – an independent advisory committee of academics, media figures and former politicians – who recommended in early May that Trump’s account not be reinstated. However, the oversight board punted the ultimate decision on Trump’s fate back to Facebook itself, giving the company six months to make the final call. The board said that Facebook’s “indeterminate and standardless penalty of indefinite suspension” for Trump was “not appropriate”, criticism that Clegg wrote the company “absolutely accept[s]”. The new policy allows for escalating penalties of suspensions for one month, six months, one year, and two years.

    The former president has been suspended since January, following the deadly Capitol attack that saw a mob of Trump supporters storm Congress in an attempt to overturn the 2020 presidential election. The company suspended Trump’s Facebook and Instagram accounts over posts in which he appeared to praise the actions of the rioters, saying that his actions posed too great a risk to remain on the platform. Following the Capitol riot, Trump was suspended from several major tech platforms, including Twitter, YouTube and Snapchat. Twitter has since made its ban permanent.

    The former president called Facebook’s decision “an insult to the record-setting 75m people, plus many others, who voted for us in the 2020 Rigged Presidential Election,” in a statement. “They shouldn’t be allowed to get away with this censoring and silencing, and ultimately, we will win.” Trump received fewer than 75m votes in the 2020 election, which he lost. He also hinted at a 2024 run.

    Facebook also announced that it would revoke its policy of treating speech by politicians as inherently newsworthy and exempt from enforcement of its content rules that ban, among other things, hate speech. The decision marks a major reversal of a set of policies that Clegg and Facebook’s CEO, Mark Zuckerberg, once championed as crucial to democracy and free speech.

    The company first created the newsworthiness exemption to its content rules in 2016, following international outcry over its decision to censor posts including the historic “napalm girl” photograph for violating its ban on nude images of children. The new rule tacitly acknowledged the importance of editorial judgment in Facebook’s censorship decisions. In 2019, at a speech at the Atlantic festival in Washington, Clegg revealed that Facebook had decided to treat all speech by politicians as newsworthy, exempting it from content rules. “Would it be acceptable to society at large to have a private company in effect become a self-appointed referee for everything that politicians say? I don’t believe it would be,” Clegg said at the time.

    Under the new rules, Clegg wrote Friday, “when we assess content for newsworthiness, we will not treat content posted by politicians any differently from content posted by anyone else”.

    The newsworthiness exemption is by no means the only policy area in which Facebook treats politicians differently from other users. The company also exempts politicians’ speech from its third-party fact-checking and maintains a list of high-profile accounts that are exempted from the AI systems that Facebook relies on for enforcement of many of its rules. Facebook did not immediately respond to questions about whether those policies remain in effect.

  • Rightwing 'super-spreader': study finds handful of accounts spread bulk of election misinformation

    A handful of rightwing “super-spreaders” on social media were responsible for the bulk of election misinformation in the run-up to the Capitol attack, according to a new study that also sheds light on the staggering reach of falsehoods pushed by Donald Trump.

    A report from the Election Integrity Partnership (EIP), a group that includes Stanford and the University of Washington, analyzed social media platforms including Facebook, Twitter, Instagram, YouTube, and TikTok during several months before and after the 2020 elections. It found that “super-spreaders” – responsible for the most frequent and most impactful misinformation campaigns – included Trump and his two elder sons, as well as other members of the Trump administration and the rightwing media.

    The study’s authors and other researchers say the findings underscore the need to disable such accounts to stop the spread of misinformation. “If there is a limit to how much content moderators can tackle, have them focus on reducing harm by eliminating the most effective spreaders of misinformation,” said Lisa Fazio, an assistant professor at Vanderbilt University who studies the psychology of fake news but was not involved in the EIP report. “Rather than trying to enforce the rules equally across all users, focus enforcement on the most powerful accounts.”

    The report analyzed social media posts featuring words like “election” and “voting” to track key misinformation narratives related to the 2020 election, including claims of mail carriers throwing away ballots, legitimate ballots strategically not being counted, and other false or unproven stories. The report studied how these narratives developed and the effect they had. It found during this time period, popular rightwing Twitter accounts “transformed one-off stories, sometimes based on honest voter concerns or genuine misunderstandings, into cohesive narratives of systemic election fraud”. Ultimately, the “false claims and narratives coalesced into the meta-narrative of a ‘stolen election’, which later propelled the January 6 insurrection”, the report said.

    “The 2020 election demonstrated that actors – both foreign and domestic – remain committed to weaponizing viral false and misleading narratives to undermine confidence in the US electoral system and erode Americans’ faith in our democracy,” the authors concluded.

    Next to no factchecking, with Trump as the super-spreader-in-chief

    In monitoring Twitter, the researchers analyzed more than 22 million tweets sent between 15 August and 12 December. The study determined which accounts were most influential by the size and speed with which they spread misinformation. “Influential accounts on the political right rarely engaged in factchecking behavior, and were responsible for the most widely spread incidents of false or misleading information in our dataset,” the report said.

    Out of the 21 top offenders, 15 were verified Twitter accounts – which are particularly dangerous when it comes to election misinformation, the study said. The “repeat spreaders” responsible for the most widely spread misinformation included Eric Trump, Donald Trump, Donald Trump Jr. and influencers like James O’Keefe, Tim Pool, Elijah Riot, and Sidney Powell. All 21 of the top accounts for misinformation leaned rightwing, the study showed.

    “Top-down mis- and disinformation is dangerous because of the speed at which it can spread,” the report said. “If a social media influencer with millions of followers shares a narrative, it can garner hundreds of thousands of engagements and shares before a social media platform or factchecker has time to review its content.”

    On nearly all the platforms analyzed in the study – including Facebook, Twitter, and YouTube – Donald Trump played a massive role. It pinpointed 21 incidents in which a tweet from Trump’s official @realDonaldTrump account jumpstarted the spread of a false narrative across Twitter. For example, Trump’s tweets baselessly claiming that the voting equipment manufacturer Dominion Voting Systems was responsible for election fraud played a large role in amplifying the conspiracy theory to a wider audience. False or baseless tweets sent by Trump’s account – which had 88.9m followers at the time – garnered more than 460,000 retweets. Meanwhile, Trump’s YouTube channel was linked to six distinct waves of misinformation that, combined, were the most viewed of any other repeat-spreader’s videos. His Facebook account had the most engagement of all those studied.

    The Election Integrity Partnership study is not the first to show the massive influence Trump’s social media accounts have had on the spread of misinformation. In one year – between 1 January 2020 and 6 January 2021 – Donald Trump pushed disinformation in more than 1,400 Facebook posts, a report from Media Matters for America released in February found. Trump was ultimately suspended from the platform in January, and Facebook is debating whether he will ever be allowed back. Specifically, 516 of his posts contained disinformation about Covid-19, 368 contained election disinformation, and 683 contained harmful rhetoric attacking his political enemies. Allegations of election fraud earned over 149.4 million interactions, or an average of 412,000 interactions per post, and accounted for 16% of interactions on his posts in 2020.

    Trump had a unique ability to amplify news stories that would have otherwise remained contained in smaller outlets and subgroups, said Matt Gertz of Media Matters for America. “What Trump did was take misinformation from the rightwing ecosystem and turn it into a mainstream news event that affected everyone,” he said. “He was able to take these absurd lies and conspiracy theories and turn them into national news. And if you do that, and inflame people often enough, you will end up with what we saw on January 6.”

    Effects of false election narratives on voters

    “Super-spreader” accounts were ultimately very successful in undermining voters’ trust in the democratic system, the report found. Citing a poll by the Pew Research Center, the study said that, of the 54% of people who voted in person, approximately half had cited concerns about voting by mail, and only 30% of respondents were “very confident” that absentee or mail-in ballots had been counted as intended.

    The report outlined a number of recommendations, including removing “super-spreader” accounts entirely. Outside experts agree that tech companies should more closely scrutinize top accounts and repeat offenders. Researchers said the refusal to take action or establish clear rules for when action should be taken helped to fuel the prevalence of misinformation. For example, only YouTube had a publicly stated “three-strike” system for offenses related to the election. Platforms like Facebook reportedly had three-strike rules as well but did not make the system publicly known.

    Only four of the top 20 Twitter accounts cited as top spreaders were actually removed, the study showed – including Donald Trump’s in January. Twitter has maintained that its ban of the former president is permanent. YouTube’s chief executive officer stated this week that Trump would be reinstated on the platform once the “risk of violence” from his posts passes. Facebook’s independent oversight board is now considering whether to allow Trump to return.

    “We have seen that he uses his accounts as a way to weaponize disinformation. It has already led to riots at the US Capitol; I don’t know why you would give him the opportunity to do that again,” Gertz said. “It would be a huge mistake to allow Trump to return.”

  • 'Four years of propaganda': Trump social media bans come too late, experts say

    In the 24 hours since the US Capitol in Washington was seized by a Trump-supporting mob disputing the results of the 2020 election, American social media companies have barred the president from their platforms for spreading falsehoods and inciting the crowd. Facebook, Snapchat and Twitch suspended Donald Trump indefinitely. Twitter locked his account temporarily. Multiple platforms removed his messages.

    Those actions, coming just days before the end of Trump’s presidency, are too little, too late, according to misinformation experts and civil rights experts who have long warned about the rise of misinformation and violent rightwing rhetoric on social media sites and Trump’s role in fueling it.

    “This was exactly what we expected,” said Brian Friedberg, a senior researcher at the Harvard Shorenstein Center’s Technology and Social Change Project who studies the rise of movements like QAnon. “It is very consistent with how the coalescing of different factions responsible for what happened yesterday have been operating online, and how platforms’ previous attempts to deal with them have fallen short.”

    Over the past decade, tech platforms have been reluctant to moderate Trump’s posts, even as he repeatedly violated hate speech regulations. Before winning the presidency, Trump used Twitter to amplify his racist campaign asserting, falsely, that Barack Obama was not born in the US. As president, he shared racist videos targeting Muslims on Twitter and posted on Facebook in favor of banning Muslims from entering the US, a clear violation of the platform’s policies against hate speech. He retweeted to his tens of millions of followers a video of one of his supporters shouting “white power!” in June 2020. He appeared to encourage violence against Black Lives Matter protests in a message shared to multiple platforms that included the phrase “when the looting starts, the shooting starts”.

    Trump’s lies and rhetoric found an eager audience online – one that won’t disappear when his administration ends. Experts warn the platforms will continue to be used to organize and perpetuate violence. They point, for example, to Facebook and YouTube’s failure to curb the proliferation of dangerous conspiracy theory movements like QAnon, a baseless belief that a secret cabal is controlling the government and trafficking children and that Trump is heroically stopping it. Parts of the crowd that stormed the Capitol on Wednesday to bar the certification of Trump’s election defeat donned QAnon-related merchandise, including hats and T-shirts, and the action was discussed weeks in advance on many QAnon-related groups and forums.

    QAnon theories and communities have flourished on Facebook this year. By the time the company banned QAnon-themed groups, pages and accounts in October, hundreds of related pages and groups had amassed more than 3 million followers and members. YouTube removed “tens of thousands of QAnon-videos and terminated hundreds of channels” around the time of Facebook’s measures. It also updated its policy to target more conspiracy theory videos that promote real-world violence, but it still stopped short of banning QAnon content outright. A spokesman from YouTube noted the company had taken a number of other actions to address QAnon content, including adding information panels sharing facts about QAnon on videos as early as 2018.

    Trump’s leverage of social media to spread propaganda has gone largely unchecked amid a vacuum of laws regulating government speech on social media, said Jennifer M Grygiel, assistant professor of communication at Syracuse University and expert on social media. Grygiel cited the Smith-Mundt Act of 1948, which regulates the distribution of government propaganda, as an example of one law that limits the government’s communication. But such regulation does not exist for the president’s Twitter account, Grygiel said. Instead we have relied on the assumption the president would not use his social media account to incite an insurrection.

    “What happened this week is the product of four years of systematic propaganda from the presidency,” Grygiel said.

    In the absence of any meaningful regulation, tech companies have had little incentive to regulate their massively profitable platforms, curb the spread of falsehoods that produce engagement and moderate the president. That’s why experts say things have to change. In 2020, Republicans and Democrats amplified calls to regulate big tech. The events this week underscore that the reckoning over big tech must include measures aimed at addressing the risks posed by leaders lying and promoting violence on their platforms, some argue.

    “The violence that we witnessed today in our nation’s capital is a direct response to the misinformation, conspiracy theories and hate speech that have been allowed to spread on social media platforms like Facebook, YouTube, Twitter etc,” said Jim Steyer, who runs the non-profit children’s advocacy organization Common Sense Media and helped organize the Stop Hate for Profit campaign (with the ADL and a number of civil rights organizations), which called on advertisers to boycott Facebook over hate speech concerns and cost Facebook millions. “Social media platforms must be held accountable for their complicity in the destruction of our democracy,” he added, arguing that in absence of meaningful enforcement from social media, Congress must pass better legislation to address hate speech on these platforms.

    Facebook and Twitter did not respond to requests for comment.

    Grygiel said it was time to move away from the idea that a president should be tweeting at all. Adam Mosseri, head of Facebook’s subsidiary Instagram, said on Twitter on Thursday evening that Facebook has long said it believes “regulation around harmful content would be a good thing”. He acknowledged that Facebook “cannot tackle harmful content without considering those in power as a potential source”.

    Grygiel said: “We need non-partisan work here. We need legislation that ensures no future president can ever propagandize the American people in this way again.”

  • AOC's cooking live streams perfect the recipe for making politics palatable

    When life gives you lemons, make lemonade. Or talk up a storm about the minimum wage, healthcare and the existential struggle for democracy.Alexandria Ocasio-Cortez’s latest Instagram live stream found the youngest woman ever elected to the US Congress standing at a chopping board with two lemons and a plastic jug as she expounded her political philosophy.“Both Democrats and Republicans,” she said, scooping up a lemon with her right hand, “when they indulge in these narratives of commonsense policies being radical,” – setting the lemon down on the board again – “what they’re trying to do is really shorten the window of what’s possible.”A twee icing contest on The Great British Bake Off this is not. And as far as we know, Gordon Ramsay, Ina Garten and Nigella Lawson have never been heard to exclaim, “Shoutout to my fellow radicals!” as Ocasio-Cortez did last Thursday night.But for anyone worried that politics might become a little too boring under Joe Biden’s presidency, “AOC”, as she is universally known, is bringing comfort food. The 31-year-old New York Democrat has gained a vast social media following with her intimate videos of cooking, fashion tips, furniture assembly and behind the scenes in Congress.Rep. AOC: “All these Republicans and all these folks who were anti-shutdown are the same people who weren’t wearing masks who forced us to shut down in the first place.” pic.twitter.com/85bW0lNefU— The Hill (@thehill) December 11, 2020
    This may say something about a public craving for authenticity in politicians: Biden, Donald Trump and AOC’s mentor Bernie Sanders have it, as far as their supporters are concerned. Similarly Ocasio-Cortez, from a working-class family in the Bronx, comes over more like your relatable drinking buddy than a Washington stiff but combines it with a millennial’s instinct for social media and a timeless star quality.But it is also proof that entertainment and politics have become mutually indistinguishable. The trend arguably began 60 years ago with the televised Kennedy v Nixon debates, received a boost from Bill Clinton playing saxophone on a late-night talkshow and reached its apotheosis with Trump, who went from reality TV host to reality TV president.So Ocasio-Cortez offers a glimpse of where we’re heading. Her Instagram live streams typically begin with the type of bit that might feature on daytime TV before pivoting to policy. It is a technique that serves journalists, novelists and other storytellers well: first hook your audience with something engaging, then move on to the substantial idea you really want to talk about.A cooking video last year began with Ocasio-Cortez in an unglamorous kitchen, rinsing rice in the sink. What are you making? asked viewers. The answer: chicken tikka masala. “I am missing ginger, which is a really big bummer,” she said, before fielding questions on everything from Medicare for All to presidential impeachments.In last Thursday’s edition, lemons were a suitably sour match for Ocasio-Cortez’s mood in a country where 3,000 people a day are dying from coronavirus, Congress has stalled for months over providing economic relief – and Biden appears in no hurry to put her progressive allies in his cabinet.Wearing a “tax the rich” sweater, the congresswoman was visibly more angry and frustrated than usual. 
    “If people think that the present day is like radical far left, they just haven’t even opened a book,” she said with expressive hand gestures. “Like, we had much more radicalism in the United States as recently as the 60s.

    “We talk about how labour unions started in this country. That was radical. People died, people died in this country, it was almost like a war for the 40-hour work week and your weekends. And a lot of people died for these very basic economic rights. We can’t go back to that time.”

    She added: “Doubling the minimum wage should be normal. Guaranteed healthcare should be normal. Trying to save our planet should be centrist politics.”

    She became even more irate as she talked about Covid-19. Hands resting on a plastic jug, she said animatedly: “Here’s the thing that’s also a huge irony to me, is that all these Republicans and all these folks who were anti-shutdown are the same people who weren’t wearing masks who forced us to shut down in the first place.”

    The final 12 words of that sentence came in a rapid staccato, accompanied by Ocasio-Cortez’s left hand clapping or chopping her right for emphasis. “I wanna see my family,” she said. “I haven’t seen my family in a year, like many of you all. I wanna be able to visit my friends without being scared and I wanna be able to hang out with my friends when it’s cold outside and not have to be outside.”

    If anyone was depending on AOC for their dinner that night, they were in for a long wait. Ilhan Omar, a fellow member of “the Squad” in Congress, teased her on Twitter: “@AOC you forgot to tell us what you were making tonight sis.”

    Ocasio-Cortez confessed: “I tried to make salmon spinach pasta but got carried away about how jacked up our Covid response is and how badly we need stimulus checks and healthcare that all I did was zest a lemon I’ll post my meal when it’s done.”

    And she eventually did post a photo, for “accountability purposes”. (Her pet dog looked intrigued.)

    The style has been honed over time.
    Last year there was the live stream of Ocasio-Cortez in her unfurnished apartment, where she had been sleeping on a mattress on the floor. Again, relatable. “I’ve been living like a completely depraved lifestyle,” she said, chewing on popcorn (top tip: add ground pepper) and assembling a table. “There’s something very satisfying about putting together Ikea furniture.”

    But she also delivered meat in the sandwich. “Your grandchildren will not be able to hide the fact that you fought against acknowledging and taking bold actions on climate change,” Ocasio-Cortez warned opponents. “We have 12 years left to cut emissions by at least 50%, if not more, and for everyone who wants to make a joke about that, you may laugh but your grandkids will not.”

    Another classic of the genre came in August this year, when Ocasio-Cortez shot a video for Vogue about her skincare and lipstick routine. On one level, it was glamorous and fun. On another, it was a golden opportunity to riff on patriarchy, the gender pay gap and what it is to live in systems largely built for the convenience of men – in a medium infinitely more digestible than a dry university seminar.

    “The reason why I think it’s so important to share these things is that, first of all, femininity has power, and in politics there is so much criticism and nitpicking about how women and femme people present ourselves,” she said. “Just being a woman is quite politicised here in Washington.

    “… There’s this really false idea that if you care about makeup or if your interests are in beauty and fashion, that that’s somehow frivolous. But I actually think these are some of the most substantive decisions that we make – and we make them every morning.”

    One of the keys to understanding the phenomenon of Ocasio-Cortez, and the backlash against her, is her years of working as a bartender and waitress. Critics seek to portray this as a weakness, with Twitter jibes such as “Shut up and sit down, bartender”.
    On the contrary, it is a strength: a schooling in the art of conversation and listening.

    Ocasio-Cortez shot back last year: “I find it revealing when people mock where I came from, and say they’re going to ‘send me back to waitressing’, as if that is bad or shameful. It’s as though they think being a member of Congress makes you intrinsically ‘better’ than a waitress. But our job is to serve, not rule.”

    The US constitution stipulates that the president must be at least 35 years old. Ocasio-Cortez turns 35 less than a month before the next election. She is already campaigning from her kitchen, perhaps without knowing it.


    Facebook faces antitrust allegations over deals for Instagram and WhatsApp

    Facebook is bracing for significant new legal challenges, as the US Federal Trade Commission and a coalition of attorneys general from as many as 40 states prepare antitrust suits.
    Although the specific charges in both cases remain unclear, the antitrust allegations are expected to center on the tech giant’s acquisition of two big apps: the $1bn deal to buy the photo-sharing app Instagram in 2012, and the $19bn purchase of the global messaging service WhatsApp in 2014. Together, the deals brought four of the world’s biggest social media services under Facebook’s control. The purchases could amount to antitrust violations if regulators can show Facebook viewed the companies as viable competitors and bought them to neutralize that threat.
    At the time of its acquisition, Instagram had 30 million users and, though it was growing rapidly, was not yet making money. WhatsApp boasted more than 450 million monthly active users when it was acquired. “WhatsApp is on a path to connect 1 billion people,” Facebook’s chief executive, Mark Zuckerberg, said in a statement at the time.
    The FTC cleared Facebook for the acquisitions when they occurred, and the company is hoping to leverage those approvals in mounting a defense. Facebook executives have also argued their company has helped the apps grow.
    But Facebook has come under greater scrutiny since the deals were done, and the FTC launched a new investigation into the potential antitrust violations in 2019.
    The FTC probe will build on findings from a separate inquiry conducted by the US House Judiciary subcommittee, which released millions of documents that appeared to show that Facebook executives, including CEO Mark Zuckerberg, were concerned the apps could become competition, before aggressively pursuing them.
    In one 2012 email, made public through the House investigation, Zuckerberg highlighted how Instagram had an edge on mobile, an area where Facebook was falling behind. In another, the CEO said Instagram could hurt Facebook even if it doesn’t become huge. “The businesses are nascent but the networks are established, the brands are already meaningful and if they grow to a large scale they could be disruptive to us,” Zuckerberg wrote. Instagram’s co-founder also fretted that his company might be targeted for destruction by Zuckerberg if he refused the deal.
    The FTC is expected to vote on a possible suit this week. Three of the commission’s five members are believed to favor the move, including the chair, Joseph Simons, who is expected to leave the agency before the new Biden administration is sworn in, Politico reported.
    Commissioners must also decide where to file the suit: in federal court, which would leave the outcome to a judge, or within the FTC’s own administrative process, where the commission itself could ultimately decide the case.
    The bipartisan coalition of states expected to file suit is led by the New York attorney general, Letitia James. While details of its complaint are also scant, several states’ top law enforcement offices launched investigations into Facebook’s acquisitions last year, adding to the pressure put on the company by federal regulators.
    Facebook did not respond to a request for comment.
    Facebook’s possible legal challenges come as a growing number of US lawmakers are arguing that companies including Amazon, Google, Facebook and Apple have amassed too much power and should be reined in.
    These companies “wield their dominance in ways that erode entrepreneurship, degrade Americans’ privacy online, and undermine the vibrancy of the free and diverse press”, the House judiciary committee concluded in its nearly 500-page report.
    “The result is less innovation, fewer choices for consumers, and a weakened democracy.”
    President-elect Joe Biden, too, has been critical of the tech companies. “Many technology giants and their executives have not only abused their power, but misled the American people, damaged our democracy and evaded any form of responsibility,” Biden spokesperson Matt Hill told the New York Times. “That ends with a President Biden.”
    In May, Facebook took over Giphy, a hugely popular Gif-sharing platform, with plans to integrate it with Instagram. Late last month, the company also announced plans to acquire Kustomer, a customer relationship management startup.
    “This deal is about providing more choices and better products for consumers,” a company spokesman said in a statement to the New York Times. “The key to Facebook’s success has always been innovation, with M&A being just a part of our overall business strategy, and we will continue to demonstrate to regulators that competition in the technology sector is vibrant.”


    Facebook says it rejected 2.2m ads seeking to obstruct voting in US election

    A total of 2.2m ads on Facebook and Instagram have been rejected, and 120,000 posts withdrawn, for attempting to “obstruct voting” in the upcoming US presidential election, Facebook’s vice-president Nick Clegg has said.

    In addition, warnings were posted on 150m examples of false information posted online, the former British deputy prime minister told the French weekly Journal du Dimanche on Sunday.

    Facebook has been increasing its efforts to avoid a repeat of events leading up to the 2016 US presidential election, won by Donald Trump, when its network was used for attempts at voter manipulation carried out from Russia.

    There were similar problems ahead of Britain’s 2016 referendum on leaving the European Union.

    “Thirty-five thousand employees take care of the security of our platforms and contribute for elections,” said Clegg, who is vice-president of global affairs and communications at Facebook.

    “We have established partnerships with 70 specialised media, including five in France, on the verification of information,” he added.
    AFP is one of those partners.

    Clegg added that the company also uses artificial intelligence that has “made it possible to delete billions of posts and fake accounts, even before they are reported by users”.

    Facebook also stores all advertisements, and information on their funding and provenance, for seven years “to ensure transparency”, he said.

    In 2016, Clegg complained to the Journal du Dimanche that Facebook had not identified or suppressed a single foreign network interfering in the US election.

    On Wednesday, Trump rebuked Facebook and Twitter for blocking links to a New York Post article purporting to expose corrupt dealings by his election rival Joe Biden and Biden’s son Hunter in Ukraine.

    A day earlier, Facebook announced a ban on ads that discourage people from getting vaccinated, in light of the coronavirus pandemic, which the social media giant said has “highlighted the importance of preventive health behaviours”.