More stories

  • YouTube’s stronger election misinformation policies had a spillover effect on Twitter and Facebook, researchers say.

    [Chart: Share of election-related posts on social platforms linking to videos making claims of fraud. Source: Center for Social Media and Politics at New York University. By The New York Times.]

    YouTube’s stricter policies against election misinformation were followed by sharp drops in the prevalence of false and misleading videos on Facebook and Twitter, according to new research released on Thursday, underscoring the video service’s power across social media.

    Researchers at the Center for Social Media and Politics at New York University found a significant rise in election fraud YouTube videos shared on Twitter immediately after the Nov. 3 election. In November, those videos consistently accounted for about one-third of all election-related video shares on Twitter. The top YouTube channels about election fraud that were shared on Twitter that month came from sources that had promoted election misinformation in the past, such as Project Veritas, Right Side Broadcasting Network and One America News Network.

    But the proportion of election fraud claims shared on Twitter dropped sharply after Dec. 8. That was the day YouTube said it would remove videos that promoted the unfounded theory that widespread errors and fraud changed the outcome of the presidential election. By Dec. 21, the proportion of election fraud content from YouTube that was shared on Twitter had dropped below 20 percent for the first time since the election.

    The proportion fell further after Jan. 7, when YouTube announced that any channel that violated its election misinformation policy would receive a “strike,” and that channels that received three strikes in a 90-day period would be permanently removed. By Inauguration Day, the proportion was around 5 percent.

    The trend was replicated on Facebook. A postelection surge in sharing videos containing fraud theories peaked at about 18 percent of all videos on Facebook just before Dec. 8. After YouTube introduced its stricter policies, the proportion fell sharply for much of the month, before rising slightly before the Jan. 6 riot at the Capitol. The proportion dropped again, to 4 percent by Inauguration Day, after the new policies were put in place on Jan. 7.

    To reach their findings, the researchers collected a random sample of 10 percent of all tweets each day. They then isolated tweets that linked to YouTube videos. They did the same for YouTube links on Facebook, using a Facebook-owned social media analytics tool, CrowdTangle.

    From this large data set, the researchers filtered for YouTube videos about the election broadly, as well as about election fraud, using a set of keywords like “Stop the Steal” and “Sharpiegate.” This allowed the researchers to get a sense of the volume of YouTube videos about election fraud over time, and how that volume shifted in late 2020 and early 2021. (A minimal sketch of this kind of filtering appears at the end of this story.)

    Misinformation on major social networks has proliferated in recent years. YouTube in particular has lagged behind other platforms in cracking down on different types of misinformation, often announcing stricter policies several weeks or months after Facebook and Twitter. In recent weeks, however, YouTube has toughened its policies, such as banning all antivaccine misinformation and suspending the accounts of prominent antivaccine activists, including Joseph Mercola and Robert F. Kennedy Jr.

    Ivy Choi, a YouTube spokeswoman, said that YouTube was the only major online platform with a presidential election integrity policy.
    “We also raised up authoritative content for election-related search queries and reduced the spread of harmful election-related misinformation,” she said.

    Megan Brown, a research scientist at the N.Y.U. Center for Social Media and Politics, said it was possible that after YouTube banned the content, people could no longer share the videos that promoted election fraud. It is also possible that interest in the election fraud theories dropped considerably after states certified their election results.

    But the bottom line, Ms. Brown said, is that “we know these platforms are deeply interconnected.” YouTube, she pointed out, has been identified as one of the most-shared domains across other platforms, including in both of Facebook’s recently released content reports and N.Y.U.’s own research.

    “It’s a huge part of the information ecosystem,” Ms. Brown said, “so when YouTube’s platform becomes healthier, others do as well.”
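    The filtering step described above lends itself to a short, hypothetical sketch. The snippet below assumes each shared post is available as a dictionary with “url” and “text” fields and uses only the two keywords the article names; it illustrates the general approach, not the researchers’ actual pipeline.

    # Hypothetical sketch of the keyword-filtering step described above.
    # Field names ("url", "text") and the keyword list are illustrative,
    # not taken from the study's code.
    from urllib.parse import urlparse

    FRAUD_KEYWORDS = {"stop the steal", "sharpiegate"}

    def is_youtube_link(url: str) -> bool:
        """True if the URL points to YouTube."""
        host = urlparse(url).netloc.lower()
        return host.endswith("youtube.com") or host.endswith("youtu.be")

    def mentions_fraud(text: str) -> bool:
        """True if the post text contains any fraud-related keyword."""
        lowered = text.lower()
        return any(keyword in lowered for keyword in FRAUD_KEYWORDS)

    def fraud_share(posts: list) -> float:
        """Share of YouTube-linking posts whose text matches a keyword."""
        youtube_posts = [p for p in posts if is_youtube_link(p["url"])]
        if not youtube_posts:
            return 0.0
        flagged = [p for p in youtube_posts if mentions_fraud(p["text"])]
        return len(flagged) / len(youtube_posts)

    # Toy example: two YouTube links, one matching a keyword -> 50%.
    sample = [
        {"url": "https://www.youtube.com/watch?v=a", "text": "Stop the Steal rally"},
        {"url": "https://www.youtube.com/watch?v=b", "text": "Election night recap"},
        {"url": "https://example.com/story", "text": "Unrelated link"},
    ]
    print(f"{fraud_share(sample):.0%}")

    Computed daily over the sampled tweets, a ratio like this yields the kind of time series the chart above tracks.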

  • Lawmakers seek to rein in big tech with bills aimed at competition and liability

    One bill would prevent platforms from giving preference to their own products; the other would remove Section 230 protections. Kari Paul, Thu 14 Oct 2021 17.58 EDT (last modified Thu 14 Oct 2021 18.37 EDT)

    US lawmakers announced two major new proposals seeking to rein in the power of big tech, days after the revelations from a former Facebook employee spotlighted the company’s sweeping impact.

    The first bill, proposed by a group of senators headed by Democrat Amy Klobuchar and Republican Chuck Grassley, would bar big tech platforms from favoring their own products and services.

    The second bill, put forward by House Democrats, would remove some protections afforded tech companies by Section 230, a portion of the Communications Decency Act that exempts them from liability for what is posted on their platforms.

    The proposals are part of a slew of bills from this Congress aimed at reining in tech firms, including industry leaders Facebook and Apple. Thus far, none have become law, although one, a broader measure to increase resources for antitrust enforcers, has passed the Senate.

    Klobuchar and Grassley’s bill would specifically prohibit platforms from requiring companies operating on their sites to purchase the platform’s goods or services, and ban them from biasing search results to favor the platform. It is a companion to a measure that has passed the House judiciary committee; to become law, it must pass both houses of Congress.

    The bill would address concerns that tech giants have become gatekeepers, giving preference to their own products, blocking rivals from accessing markets and imposing onerous fees and terms on smaller businesses.

    “As dominant digital platforms – some of the biggest companies our world has ever seen – increasingly give preference to their own products and services, we must put policies in place to ensure small businesses and entrepreneurs still have the opportunity to succeed in the digital marketplace,” Klobuchar said in a statement.

    The legislation comes as Congress is increasingly working on a bipartisan basis to address antitrust issues in big tech. Traditionally, lawmakers have differed in their critiques of the industry – with Democrats claiming the companies are monopolies and Republicans criticizing what they perceive as an anti-conservative bias on the platforms.

    “This bill is welcome proof that the momentum in Congress to tackle big tech’s monopoly power is rapidly gaining force on both sides of the aisle,” read a statement from the Institute for Local Self-Reliance, a non-profit that fights against corporate monopolies. “We agree with their view that the tech giants cannot continue to abuse their power at the expense of competition, innovation, and entrepreneurship.”

    Meanwhile, the debate around Section 230 – a portion of the Communications Decency Act that protects companies from legal liability for content posted on their platforms – has continued. Its impact has long been a hot-button issue but became increasingly so during Donald Trump’s presidency.

    The bill House Democrats introduced on Thursday would amend Section 230 to hold companies responsible for the personalized algorithmic amplification of problematic content. In other words, it seeks to simply “turn off” the Facebook news feed algorithm, said Evan Greer, director of digital rights group Fight for the Future.

    The law would apply only to large tech firms with 5 million or more monthly users, but could still have negative consequences for firms large enough to qualify yet with far fewer resources than Facebook.

    “Facebook would likely be able to survive this, but smaller competitors wouldn’t,” Greer said. “That’s why Facebook has repeatedly called for changes to Section 230 – they know it will only serve to solidify their dominance and monopoly power.

    “This bill is well-intentioned, but it’s a total mess,” Greer added. “Democrats are playing right into Facebook’s hands by proposing tweaks to Section 230 instead of thoughtful policies that will actually reduce the harm done by surveillance-driven algorithms.”

    Lawmakers are “failing to understand how these policies will actually play out in the real world”, she added.

    Earlier this year, more than 70 civil rights, LGBTQ+, sex worker advocacy and human rights organizations sent a letter cautioning lawmakers against changing Section 230. They instead prefer to rein in Facebook and other platforms by attacking the data harvesting and surveillance practices they rely on as a business model.

    Democrats should instead “pass a privacy bill strong enough to kill Facebook’s surveillance driven business model while leaving the democratizing power of the internet intact”, Greer said.

    Reuters contributed to this report.

  • The whistleblower who plunged Facebook into crisis

    After a set of leaks last month that represented the most damaging insight into Facebook’s inner workings in the company’s history, the former employee behind them has come forward. Now Frances Haugen has given evidence to the US Congress – and been praised by senators as a ‘21st century American hero’. Will her testimony accelerate efforts to bring the social media giant to heel?

    On Monday, Facebook and its subsidiaries Instagram and WhatsApp went dark after a router failure. There were thousands of negative headlines, millions of complaints, and more than 3 billion users were forced offline. On Tuesday, the company’s week got significantly worse. Frances Haugen, a former product manager with Facebook, testified before US senators about what she had seen in her two years there – and set out why she had decided to leak a trove of internal documents to the Wall Street Journal. Haugen had revealed herself as the source of the leak a few days earlier. And while the content of the leak – from internal warnings of the harm being done to teenagers by Instagram to the deal Facebook gives celebrities to leave their content unmoderated – had already led to debate about whether the company needed to reform, Haugen’s decision to come forward escalated the pressure on Mark Zuckerberg.

    In this episode, Nosheen Iqbal talks to the Guardian’s global technology editor, Dan Milmo, about what we learned from Haugen’s testimony, and how damaging a week this could be for Facebook. Milmo sets out the challenges facing the company as it seeks to argue that the whistleblower is poorly informed or that her criticism is mistaken. And he reflects on what options politicians and regulators around the world will consider as they look for ways to curb Facebook’s power, and how likely such moves are to succeed.

    After Haugen spoke, Zuckerberg said her claims that the company puts profit over people’s safety were “just not true”. In a blog post, he added: “The argument that we deliberately push content that makes people angry for profit is deeply illogical. We make money from ads, and advertisers consistently tell us they don’t want their ads next to harmful or angry content.” You can read more of Zuckerberg’s defence here. And you can read an analysis of how Haugen’s testimony is likely to affect Congress’s next move here.

    Archive: BBC; YouTube; TikTok; CSPAN; NBC; CBS; CNBC; Vice; CNN

  • Facebook whistleblower’s testimony could finally spark action in Congress

    Despite years of hearings, the company has long seemed untouchable. But Frances Haugen appears to have inspired rare bipartisanship. Kari Paul, Wed 6 Oct 2021 01.00 EDT

    The testimony of Frances Haugen, a former Facebook employee, is likely to increase pressure on US lawmakers to undertake concrete legislative action against the formerly untouchable tech company, following years of hearings and circular discussions about big tech’s growing power.

    In a hearing on Tuesday, the whistleblower shared internal Facebook reports with Congress and argued the company puts “astronomical profits before people”, harms children and is destabilizing democracies.

    After years of sparring over the role of tech companies in past American elections, lawmakers from both sides of the aisle on Tuesday appeared to agree on the need for new regulations that would change how Facebook targets users and amplifies content.

    “Frances Haugen’s testimony appears to mark a rare moment of bipartisan consensus that the status quo is no longer acceptable,” said Imran Ahmed, chief executive officer of the Center for Countering Digital Hate, a non-profit that fights hate speech and misinformation. “This is increasingly becoming a non-political issue and one that has cut through definitively to the mainstream.”

    Throughout the morning, members of Congress leveled questions at Haugen about what specifically could and should be done to address the harms caused by Facebook. With 15 years in the industry as an expert in algorithms and design, Haugen offered a number of suggestions – including changing news feeds to be chronological rather than algorithmic, appointing a government body for tech oversight, and requiring more transparency on internal research.

    “I think the time has come for action,” Senator Amy Klobuchar told Haugen. “And I think you are the catalyst for that action.”

    Unlike past hearings, which were frequently derailed by partisan bickering, Tuesday’s questioning largely stuck to the problems posed by Facebook’s opaque algorithmic formulas and the ways the platform harms children. Such issues can unite Congress, and there is going to be “a lot of bipartisan concern about this today and in future hearings”, said Senator Roger Wicker of Mississippi.

    “The recent revelations about Facebook’s mental health effects on children are indeed disturbing,” he said. “They just show how urgent it is for Congress to act against powerful tech companies, on behalf of children and the broader public.”

    However, activists who have been calling on Congress to enact laws protecting children from the negative effects of social media are skeptical of such promises. “The bipartisan anger at Facebook is encouraging and totally justified,” said Jim Steyer, founder and CEO of the children’s protection organization Common Sense. “The next step is to turn that bipartisan anger into bipartisan legislative action before the year is over.”

    Exactly what should be done to regulate Facebook is a matter of debate. Senator Todd Young of Indiana asked Haugen whether she believed breaking up Facebook would solve these issues. “I’m actually against breaking up Facebook,” Haugen said. “Oversight and finding collaborative solutions with Congress is going to be key, because these systems are going to continue to exist and be dangerous even if broken up.”

    Many laws introduced or discussed thus far in Congress take aim at section 230, a portion of US internet regulations that exempts platforms from legal liability for content generated by their users. While some organizations, including Common Sense, are calling for the reform of section 230, other internet freedom advocates have warned that targeting that law could have unintended negative consequences for human rights, activism, and freedom of expression.

    “Haugen’s proposal to create a carveout in section 230 around algorithmic amplification would do more harm than good,” said Evan Greer, director of the activist group Fight for the Future. “Your feed would become like Disneyland, where everything in it is sanitized, vetted by lawyers, and paid for by corporations.”

    Following the hearing, Facebook disputed Haugen’s characterizations, but the company said it agreed more regulation was in order. “We agree on one thing. It’s time to begin to create standard rules for the internet,” said Lena Pietsch, Facebook’s director of policy communications, in a statement. “It’s been 25 years since the rules of the internet have been updated, and instead of expecting the industry to make societal decisions that belong to legislators, it is time for Congress to act.”

    Greer argued that Facebook was promoting changes to internet laws so that it could have a hand in crafting legislation that would largely benefit big corporations.

    Other members of Congress have put forward potential paths to regulation that sidestep section 230 reform. Common Sense has called on Congress to pass the Children and Media Research Advancement (Camra) Act, which would authorize the National Institutes of Health to carry out research on the effects of social media on children and teens. Advocacy groups have also called on Congress to update the Children’s Online Privacy Protection Act (Coppa), currently the primary mechanism for protecting children online. Proposed changes would stop companies from profiling teens and youth and microtargeting them with ads and content specifically designed to prey on their fears and insecurities.

    “Here’s my message for Mark Zuckerberg: your time of invading our privacy, promoting toxic content and preying on children and teens is over,” said Senator Ed Markey, who authored one such bill, the Kids Act. “Congress will be taking action. We will not allow your company to harm our children and our families and our democracy any longer.”

  • ‘Congress will be taking action’: key takeaways from the Facebook whistleblower hearing

    Frances Haugen’s testimony spotlighted social media’s negative impact on children and called for regulation of the company. Kari Paul, Tue 5 Oct 2021 15.16 EDT (last modified Tue 5 Oct 2021 17.28 EDT)

    The Facebook whistleblower, Frances Haugen, testified before the US Congress on Tuesday, painting a dire picture of the tech giant’s policies. Haugen’s appearance in front of the US Senate is just the latest high-profile hearing on big tech, but it proved a substantive and insightful session that is sure to have a lasting impact.

    One of the most useful big tech hearings yet

    US lawmakers have held several high-profile hearings on the practices of prominent tech companies such as Facebook, Google and Amazon in past years, but we have rarely seen testimony from a witness with so much expertise and so many actionable suggestions for improving a tech company. It may have been the most useful big tech hearing yet. Haugen’s testimony echoed concerns from activists and researchers that Facebook systematically promotes harmful content and encourages engagement at all costs. “The choices being made inside of Facebook are disastrous for our children, our public safety, our privacy and for our democracy,” she said.

    Social media’s impact on children

    Tuesday’s hearing followed a Wall Street Journal report that revealed Facebook had put aside its own research on the negative impact of its Instagram app on children. Haugen told lawmakers that Facebook intentionally targets teens, including children under the age of 13. She added that she does not believe Facebook when it says it is suspending Instagram Kids, its platform for young users. Just last week, Facebook’s head of safety, Antigone Davis, had responded to questions about the company’s targeting of young users by emphasizing that children under the age of 13 were not allowed on Facebook.

    Fresh calls for regulation

    Haugen argued that Facebook needs more regulation, portraying a company that lacks the staffing, expertise and transparency needed to make meaningful change. “Facebook is stuck in a cycle where it struggles to hire,” she said. “That causes it to understaff projects, which causes scandals, which then makes it harder to hire.”

    Senators seemed to agree

    Senators repeatedly compared Facebook to big tobacco, suggesting the platform may see regulation similar to what cigarettes have faced in the past. “Facebook is like big tobacco, enticing young kids with that first cigarette,” said Senator Ed Markey. “Congress will be taking action. We will not allow your company to harm our children and our families and our democracy, any longer,” Markey added.

    A spotlight on Facebook’s role abroad

    Haugen put the spotlight on the impact of Facebook’s policy decisions outside the US, saying that the company does not dedicate equal research and resources to misinformation and hate speech in non-English content. “Facebook invests more in users that make them more money, even though danger may not be evenly distributed based on profitability,” she said. Haugen said 87% of misinformation spending at Facebook goes to English content, even though only 9% of users are English speakers. That resource gap, she said, is fueling violence in places like Ethiopia.

    And on Facebook’s lack of transparency

    Haugen also said Facebook lacks transparency, and urged lawmakers to demand more insight into the company’s research. She referenced Facebook’s decision in August to revoke New York University researchers’ access to the platform’s data about the spread of vaccine misinformation. “The fact that Facebook is so scared of even basic transparency, that it goes out of its way to block researchers who are asking awkward questions, shows the need for congressional oversight,” she said.

    An array of possible next steps

    Haugen stopped short of calling for a breakup of the company, but suggested several measures that could be taken to regulate it. Those measures include an independent government body staffed by former tech workers who understand how the algorithm works, changing the news feed to be chronological rather than ranking content through an opaque algorithm, and requiring Facebook to publicly disclose its internal research. She encouraged the company to accept help from outsiders, offering empathy to Facebook and conceding “these are really, really hard questions” to address.

    Following the hearing, Facebook spokeswoman Lena Pietsch said in a statement that the company doesn’t agree with Haugen’s characterizations. “Despite all this, we agree on one thing: it’s time to begin to create standard rules for the internet,” she added. “It’s been 25 years since the rules for the internet have been updated, and instead of expecting the industry to make societal decisions that belong to legislators, it is time for Congress to act.”

  • Facebook whistleblower accuses firm of serially misleading over safety

    Frances Haugen filed at least eight complaints against the company regarding its approach to safety. Dan Milmo, Global technology editor, Tue 5 Oct 2021 07.50 EDT (last modified Tue 5 Oct 2021 10.23 EDT)

    The Facebook whistleblower, Frances Haugen, who testifies at the US Congress on Tuesday, has filed at least eight complaints with the US financial watchdog accusing the social media company of serially misleading investors about its approach to safety and the size of its audience.

    The complaints, published online by the news programme 60 Minutes late on Monday, hours before Haugen’s testimony to US senators at 10am EDT (3pm BST), are based on tens of thousands of internal documents that Haugen copied shortly before she quit Facebook in May.

    The complaints and testimony from Haugen, who stepped forward on Sunday as the source of a damning series of revelations in the Wall Street Journal, are taking place against a backdrop of operational chaos for Facebook, whose platforms, including Instagram and WhatsApp, went offline around the world for nearly six hours on Monday.

    The first whistleblower complaint filed to the US Securities and Exchange Commission relates to the 6 January riots in Washington, when crowds of protesters stormed the Capitol, and alleges that Facebook knowingly chose to permit political misinformation, contesting statements to the contrary made by its chief executive, Mark Zuckerberg.

    “Our anonymous client is disclosing original evidence showing that Facebook … has, for years past and ongoing, violated US securities laws by making material misrepresentations and omissions in statements to investors and prospective investors,” the sweeping opening statement reads, “including, inter alia, through filings with the SEC, testimony to Congress, online statements and media stories.”

    The complaints against Facebook, which reflect a series of reports in the Wall Street Journal in recent weeks, also cover:
    The company’s approach to hate speech.
    Its approach to teenage mental health.
    Its monitoring of human trafficking.
    How the company’s algorithms promoted hate speech.
    Preferential disciplinary treatment for VIP users.
    Promoting ethnic violence.
    Failing to inform investors about a shrinking user base in certain demographics.
    The first complaint, regarding 6 January, contests testimony given to Congress in March by Facebook’s founder and chief executive, Mark Zuckerberg, in which he stated: “We remove language that incites or facilitates violence, and we ban groups that proclaim a hateful and violent mission.”

    The complaint rebuts this, claiming that the company’s own records show it “knowingly chose to permit political misinformation and violent content/groups and failed to adopt or continue measures to combat these issues, including as related to the 2020 US election and the 6 January insurrection, in order to promote virality and growth on its platforms”.

    According to one internal Facebook document quoted in the complaints, the company admits: “For example, we estimate that we may action as little as 3-5% of hate [speech] and ~0.6% of V&V [violent and inciting content] on Facebook.”

    A complaint also alleges that Facebook misrepresented its “reach and frequency”, which are key metrics for the advertisers who provide the majority of Facebook’s revenue. That included concealing a decline in the key demographic of young users, the complaint stated. “During Covid, every cohort’s use of Facebook increased, except for those 23 and under, which continued to decline,” the complaint said.

    “For years, Facebook has misrepresented core metrics to investors and advertisers including the amount of content produced on its platforms and growth in individual users,” it said, adding that this applied particularly in “high-value demographics” such as US teenagers. Facebook has been approached for comment.

    The human trafficking complaint alleges that Facebook and its photo-sharing app, Instagram, were aware in 2019 that the platforms were being used to “promote human trafficking and domestic servitude”. The hate speech complaint quotes another internal document that states: “We only take action against approximately 2% of the hate speech on the platform.” The teen health complaint focuses on the most damaging allegation from the WSJ series: that Instagram knew the app caused anxiety about body image among teenage girls.

    A complaint about Facebook’s approach to algorithms alleges that a tweak to the app’s News Feed product – a key part of users’ interaction with the app – led to the prioritisation of divisive content, while the complaint about ethnic violence contains an excerpt from an internal study claiming that “in the Afghanistan market, the action rate for hate speech is worryingly low”.

    Facebook has issued a series of statements downplaying Haugen’s document leaks, saying: its Instagram research showed that many teens found the app helpful; it was investing heavily in security at the expense of its bottom line; polarisation had been growing in the US for decades before Facebook appeared; and the company had “made fighting misinformation and providing authoritative information a priority”.

    Responding to accusations that Facebook had misled the public and regulators, the company said: “We stand by our public statements and are ready to answer any questions regulators may have about our work.”

  • Facebook whistleblower to take her story before the US Senate

    Frances Haugen, who came forward accusing the company of putting profit over safety, will testify in Washington on Tuesday. Dan Milmo and Kari Paul, Mon 4 Oct 2021 23.00 EDT (last modified Mon 4 Oct 2021 23.23 EDT)

    A former Facebook employee who has accused the company of putting profit over safety will take her damning accusations to Washington on Tuesday when she testifies to US senators.

    Frances Haugen, 37, came forward on Sunday as the whistleblower behind a series of damaging reports in the Wall Street Journal that have heaped further political pressure on the tech giant. Haugen told the news program 60 Minutes that Facebook’s priority was making money over doing what was good for the public. “The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook. And Facebook, over and over again, chose to optimise for its own interests, like making more money,” she said.

    Haugen is expected to tell lawmakers that Facebook faces little oversight, and will urge Congress to take action. “As long as Facebook is operating in the dark, it is accountable to no one. And it will continue to make choices that go against the common good,” she wrote in her written testimony.

    Haugen was called to testify before the US Senate’s commerce subcommittee on the risks the company’s products pose to children. Lawmakers called the hearing in response to a Wall Street Journal story, based on Haugen’s documents, that showed Facebook was aware of the damage its Instagram app was causing to teen mental health and wellbeing. One survey in the leaked research estimated that 30% of teenage girls felt Instagram made dissatisfaction with their body worse.

    She is expected to compare Facebook to big tobacco, which resisted telling the public that smoking damaged consumers’ health. “When we realized tobacco companies were hiding the harms it caused, the government took action. When we figured out cars were safer with seatbelts, the government took action,” Haugen wrote. “I implore you to do the same here.”

    Haugen will argue that Facebook’s closed design means it has no oversight, even from its own oversight board, a regulatory group that was formed in 2020 to make decisions independent of Facebook’s corporate leadership. “This inability to see into the actual systems of Facebook and confirm that Facebook’s systems work like they say is like the Department of Transportation regulating cars by watching them drive down the highway,” she wrote in her testimony. “Imagine if no regulator could ride in a car, pump up its wheels, crash test a car, or even know that seatbelts could exist.”

    Senator Richard Blumenthal, the Democrat whose committee is holding Tuesday’s hearing, told the Washington Post’s Technology 2020 newsletter that lawmakers will also ask Haugen about her remarks on the 2020 presidential election. Haugen alleged on 60 Minutes that following Joe Biden’s win in the election, Facebook prematurely reinstated old algorithms that valued engagement over all else, a move that she said contributed to the 6 January attack on the Capitol. “As soon as the election was over, they turned them back off or they changed the settings back to what they were before, to prioritize growth over safety. And that really feels like a betrayal of democracy to me,” she said.

    Following the election, Facebook also disbanded its civic integrity team, a group that worked on issues related to political elections worldwide and on which Haugen worked. Facebook has said the team’s functions were distributed across the company. Haugen joined Facebook in 2019 as a product manager on the civic integrity team after spending more than a decade working in the tech industry, including at Pinterest and Google.

    Tuesday’s hearing is the second in mere weeks to focus on Facebook’s impact on children. Last week, lawmakers grilled Antigone Davis, Facebook’s global head of safety, and accused the company of “routinely” putting growth above children’s safety. Facebook has aggressively contested the accusations.

    On Friday, the company’s vice-president of policy and public affairs, Nick Clegg, wrote to Facebook employees ahead of Haugen’s public appearance. “Social media has had a big impact on society in recent years, and Facebook is often a place where much of this debate plays out,” he said. “But what evidence there is simply does not support the idea that Facebook, or social media more generally, is the primary cause of polarization.”

    On Monday, Facebook asked a federal judge to throw out a revised antitrust lawsuit brought by the Federal Trade Commission (FTC) that seeks to force the company to sell Instagram and WhatsApp.

  • Facebook ‘tearing our societies apart’: key excerpts from a whistleblower

    Frances Haugen tells US news show why she decided to reveal the inside story about the social networking firm. Dan Milmo, Global technology editor, Mon 4 Oct 2021 08.33 EDT (last modified Mon 4 Oct 2021 10.30 EDT)

    Frances Haugen’s interview with the US news programme 60 Minutes contained a litany of damning statements about Facebook. Haugen, a former Facebook employee who had joined the company to help it combat misinformation, told the CBS show the tech firm prioritised profit over safety and was “tearing our societies apart”. Haugen will testify in Washington on Tuesday, as political pressure builds on Facebook. Here are some of the key excerpts from Haugen’s interview.

    Choosing profit over the public good

    Haugen’s most cutting words echoed what is becoming a regular refrain from politicians on both sides of the Atlantic: that Facebook puts profit above the wellbeing of its users and the public. “The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook. And Facebook, over and over again, chose to optimise for its own interests, like making more money.”

    She also accused Facebook of endangering public safety by reversing changes to its algorithm once the 2020 presidential election was over, allowing misinformation to spread on the platform again. “And as soon as the election was over, they turned them [the safety systems] back off or they changed the settings back to what they were before, to prioritise growth over safety. And that really feels like a betrayal of democracy to me.”

    Facebook’s approach to safety compared with others

    In a 15-year career as a tech professional, Haugen, 37, has worked for companies including Google and Pinterest, but she said Facebook had the worst approach to restricting harmful content. She said: “I’ve seen a bunch of social networks and it was substantially worse at Facebook than anything I’d seen before.” Referring to Mark Zuckerberg, Facebook’s founder and chief executive, she said: “I have a lot of empathy for Mark. And Mark has never set out to make a hateful platform. But he has allowed choices to be made where the side-effects of those choices are that hateful, polarising content gets more distribution and more reach.”

    Instagram and mental health

    The document leak that had the greatest impact was a series of research slides showing that Facebook’s Instagram app was damaging the mental health and wellbeing of some teenage users, with 30% of teenage girls feeling that it made dissatisfaction with their body worse. She said: “And what’s super tragic is Facebook’s own research says, as these young women begin to consume this eating disorder content, they get more and more depressed. And it actually makes them use the app more. And so, they end up in this feedback cycle where they hate their bodies more and more. Facebook’s own research says it is not just that Instagram is dangerous for teenagers, that it harms teenagers, it’s that it is distinctly worse than other forms of social media.” Facebook has described the Wall Street Journal’s reporting on the slides as a “mischaracterisation” of its research.

    Why Haugen leaked the documents

    Haugen said “person after person” had attempted to tackle Facebook’s problems but had been ground down. “Imagine you know what’s going on inside of Facebook and you know no one on the outside knows. I knew what my future looked like if I continued to stay inside of Facebook, which is person after person after person has tackled this inside of Facebook and ground themselves to the ground.” Having joined the company in 2019, Haugen said she decided to act this year and started copying tens of thousands of documents from Facebook’s internal system, which she believed show that Facebook is not, despite public comments to the contrary, making significant progress in combating online hate and misinformation. “At some point in 2021, I realised, ‘OK, I’m gonna have to do this in a systemic way, and I have to get out enough that no one can question that this is real.’”

    Facebook and violence

    Haugen said the company had contributed to ethnic violence, a reference to Burma. In 2018, following the massacre of Rohingya Muslims by the military, Facebook admitted that its platform had been used to “foment division and incite offline violence” relating to the country. Speaking on 60 Minutes, Haugen said: “When we live in an information environment that is full of angry, hateful, polarising content it erodes our civic trust, it erodes our faith in each other, it erodes our ability to want to care for each other. The version of Facebook that exists today is tearing our societies apart and causing ethnic violence around the world.”

    Facebook and the Washington riot

    The 6 January riot, when crowds of rightwing protesters stormed the Capitol, came after Facebook disbanded the Civic Integrity team of which Haugen was a member. The team, which focused on issues linked to elections around the world, was dispersed to other Facebook units following the US presidential election. “They told us: ‘We’re dissolving Civic Integrity.’ Like, they basically said: ‘Oh good, we made it through the election. There wasn’t riots. We can get rid of Civic Integrity now.’ Fast-forward a couple months, we got the insurrection. And when they got rid of Civic Integrity, it was the moment where I was like, ‘I don’t trust that they’re willing to actually invest what needs to be invested to keep Facebook from being dangerous.’”

    The 2018 algorithm change

    Facebook changed the algorithm on its news feed – Facebook’s central feature, which supplies users with a customised feed of content such as friends’ photos and news stories – to prioritise content that increased user engagement. Haugen said this made divisive content more prominent. “One of the consequences of how Facebook is picking out that content today is it is optimising for content that gets engagement, or reaction. But its own research is showing that content that is hateful, that is divisive, that is polarising – it’s easier to inspire people to anger than it is to other emotions.” She added: “Facebook has realised that if they change the algorithm to be safer, people will spend less time on the site, they’ll click on less ads, they’ll make less money.”

    Haugen said European political parties contacted Facebook to say that the news feed change was forcing them to take more extreme political positions in order to win users’ attention. Describing politicians’ concerns, she said: “You are forcing us to take positions that we don’t like, that we know are bad for society. We know if we don’t take those positions, we won’t win in the marketplace of social media.”

    In a statement to 60 Minutes, Facebook said: “Every day our teams have to balance protecting the right of billions of people to express themselves openly with the need to keep our platform a safe and positive place. We continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true. If any research had identified an exact solution to these complex challenges, the tech industry, governments, and society would have solved them a long time ago.”