More stories

  • ‘Congress will be taking action’: key takeaways from the Facebook whistleblower hearing

    Frances Haugen’s testimony spotlighted the negative effects of social media on children and called for regulation of the company

    Kari Paul
    Tue 5 Oct 2021 15.16 EDT. Last modified on Tue 5 Oct 2021 17.28 EDT

    The Facebook whistleblower, Frances Haugen, testified before the US Congress on Tuesday, painting a dire picture of the tech giant’s policies.

    Haugen’s appearance in front of the US Senate is just the latest high-profile hearing on big tech, but it proved a substantive and insightful session that is sure to have a lasting impact.

    One of the most useful big tech hearings yet

    US lawmakers have held several high-profile hearings on the practices of prominent tech companies such as Facebook, Google and Amazon in recent years, but we have rarely seen testimony from a witness who has so much expertise and so many actionable suggestions for improving a tech company. It may have been the most useful big tech hearing yet.

    Haugen’s testimony echoed concerns from activists and researchers that Facebook systematically promotes harmful content and encourages engagement at all costs. “The choices being made inside of Facebook are disastrous for our children, our public safety, our privacy and for our democracy,” she said.

    Social media’s impact on children

    Tuesday’s hearing followed a Wall Street Journal report that revealed that Facebook had put aside its own research on the negative impact of its Instagram app on children. Haugen told lawmakers that Facebook intentionally targets teens, including children under the age of 13. She added she does not believe Facebook when it says it is suspending Instagram Kids, its platform for young users.

    Just last week, Facebook’s head of safety, Antigone Davis, had responded to questions about the company’s targeting of young users by emphasizing that children under the age of 13 were not allowed on Facebook.

    Fresh calls for regulation

    Haugen argued that Facebook needs more regulation, portraying a company that lacks the staffing, expertise and transparency needed to make meaningful change. “Facebook is stuck in a cycle where it struggles to hire,” she said. “That causes it to understaff projects, which causes scandals, which then makes it harder to hire.”

    Senators seemed to agree

    Senators repeatedly compared Facebook to big tobacco, suggesting the platform may face regulation similar to what cigarettes have faced in the past. “Facebook is like big tobacco, enticing young kids with that first cigarette,” said Senator Ed Markey. “Congress will be taking action. We will not allow your company to harm our children and our families and our democracy, any longer,” Markey added.

    A spotlight on Facebook’s role abroad

    Haugen put the spotlight on the impact of Facebook’s policy decisions outside the US, saying that the company does not dedicate equal amounts of research and resources to misinformation and hate speech in non-English content. “Facebook invests more in users that make them more money, even though danger may not be evenly distributed based on profitability,” she said. Haugen said 87% of misinformation spending at Facebook is on English content when only 9% of users are English speakers. That resource gap, she said, is fueling violence in places like Ethiopia.

    And on Facebook’s lack of transparency

    Haugen also said Facebook lacks transparency, and urged lawmakers to demand more insight into the company’s research. She referenced Facebook’s decision in August to revoke New York University researchers’ access to the platform’s data about the spread of vaccine misinformation.

    “The fact that Facebook is so scared of even basic transparency, that it goes out of its way to block researchers who are asking awkward questions, shows the need for congressional oversight,” she said.

    An array of possible next steps

    Haugen stopped short of calling for a breakup of the company, but suggested several measures that could be taken to regulate it. Those measures include an independent government body staffed by former tech workers who understand how the algorithm works, changing the news feed to be chronological rather than ranking content through an opaque algorithm, and requiring Facebook to publicly disclose its internal research.

    She encouraged the company to accept help from outsiders, offering empathy to Facebook and conceding “these are really, really hard questions” to address.

    Following the hearing, Facebook spokeswoman Lena Pietsch said in a statement that the company doesn’t agree with Haugen’s characterizations. “Despite all this, we agree on one thing: it’s time to begin to create standard rules for the internet,” she added. “It’s been 25 years since the rules for the internet have been updated, and instead of expecting the industry to make societal decisions that belong to legislators, it is time for Congress to act.”

  • Facebook harms children and is damaging democracy, claims whistleblower

    Frances Haugen says in US Congress testimony that Facebook puts ‘astronomical profits before people’

    Dan Milmo and Kari Paul
    Tue 5 Oct 2021 14.56 EDT. First published on Tue 5 Oct 2021 14.48 EDT

    Facebook puts “astronomical profits before people”, harms children and is destabilising democracies, a whistleblower has claimed in testimony to the US Congress.

    Frances Haugen said Facebook knew it steered young users towards damaging content and that its Instagram app was “like cigarettes” for under-18s. In a wide-ranging testimony, the former Facebook employee said the company did not have enough staff to keep the platform safe and was “literally fanning” ethnic violence in developing countries.

    She also told US senators:
    The “buck stops” with the founder and chief executive, Mark Zuckerberg.
    Facebook knows its systems lead teenagers to anorexia-related content.
    The company had to “break the glass” and turn back on safety settings after the 6 January Washington riots.
    Facebook intentionally targets teenagers and children under 13.
    Monday’s outage that brought down Facebook, Instagram and WhatsApp meant that for more than five hours Facebook could not “destabilise democracies”.
    Haugen appeared in Washington on Tuesday after coming forward as the source of a series of revelations in the Wall Street Journal last month based on internal Facebook documents. They revealed the company knew Instagram was damaging teenagers’ mental health and that changes to Facebook’s News Feed feature – a central plank of users’ interaction with the service – had made the platform more polarising and divisive.

    Her evidence to senators included the claim that Facebook knew Instagram users were being led to anorexia-related content. She said an algorithm “led children from very innocuous topics like healthy recipes … all the way to anorexia-promoting content over a very short period of time”.

    In her opening testimony, Haugen, 37, said: “I’m here today because I believe Facebook’s products harm children, stoke division and weaken our democracy. The company’s leadership knows how to make Facebook and Instagram safer, but won’t make the necessary changes because they have put their astronomical profits before people.” She added that Facebook was “buying its profits with our safety”. In 2020, Facebook reported a net income – a US measure of profit – of more than $29bn (£21bn).

    Referring to Monday’s near six-hour outage in which Facebook’s platforms including Instagram and WhatsApp were disabled for billions of users, Haugen’s testimony added: “For more than five hours Facebook wasn’t used to deepen divides, destabilise democracies and make young girls and women feel bad about their bodies.” Facebook has 3.5 billion monthly active users across its platforms including Instagram and WhatsApp.

    Warning that Facebook makes choices that “go against the common good”, Haugen said the company should be treated like the tobacco industry, which was subject to government action once it was discovered it was hiding the harms its products caused, or like car companies that were forced to adopt seatbelts or opioid firms that have been sued by government agencies.

    Urging lawmakers to force more transparency on Facebook, she said there should be more scrutiny of its algorithms, which shape the content delivered to users. “The core of the issue is that no one can understand Facebook’s destructive choices better than Facebook, because only Facebook gets to look under the hood,” she said. With greater transparency, she added, “we can build sensible rules and standards to address consumer harms, illegal content, data protection, anticompetitive practices, algorithmic systems and more”.

    The hearing focused on the impact of Facebook’s platforms on children, with Haugen likening the appeal of Instagram to tobacco. “It’s just like cigarettes … teenagers don’t have good self-regulation.” Haugen added women would be walking around with brittle bones in 60 years’ time because of the anorexia-related content they found on Facebook platforms.

    Haugen told lawmakers that Facebook intentionally targets teens and “definitely” targets children as young as eight for the Messenger Kids app.
    The former Facebook product manager left the company in May after copying tens of thousands of internal documents.

    A Facebook spokesperson, Andy Stone, said in a tweet during the hearing: “Just pointing out the fact that Frances Haugen did not work on child safety or Instagram or research these issues and has no direct knowledge of the topic from her work at Facebook.”

    Haugen said that, according to internal documents, Zuckerberg had been given “soft options” to make the Facebook platform less “twitchy” and viral in countries prone to violence but declined to take them because it might affect “meaningful social interactions”, or MSI. She added: “We have a few choice documents that contain notes from briefings with Mark Zuckerberg where he chose metrics defined by Facebook like ‘meaningful social interactions’ over changes that would have significantly decreased misinformation, hate speech and other inciting content.”

    Haugen said Zuckerberg had built a company that was “very metrics driven”, because the more time people spent on Facebook platforms the more appealing the business was to advertisers. Asked about Zuckerberg’s ultimate responsibility for decisions made at Facebook, she said: “The buck stops with him.”

    Haugen also warned that Facebook was “literally fanning ethnic violence” in places such as Ethiopia because it was not policing its service adequately outside of the US.

    Referring to the aftermath of the 6 January storming of the Capitol, as protesters sought to overturn the US presidential election result, Haugen said she was disturbed that Facebook had to “break the glass” and reinstate safety settings that it had put in place for the November poll. Haugen, who worked for the Facebook team that monitored election interference globally, said those precautions had been dropped after Joe Biden’s victory in order to spur growth on the platform.

    Among the reforms recommended by Haugen were ensuring that Facebook shares internal information and research with “appropriate” oversight bodies such as Congress and removing the influence of algorithms on Facebook’s News Feed by allowing it to be ranked chronologically.

    Senator Ed Markey said Congress would take action. “Here’s my message for Mark Zuckerberg: your time of invading our privacy, promoting toxic content and preying on children and teens is over,” Markey said. “Congress will be taking action. We will not allow your company to harm our children and our families and our democracy, any longer.”

    Haugen’s lawyers have also filed at least eight complaints with the US financial watchdog accusing the social media company of serially misleading investors about its approach to safety and the size of its audience.

    Facebook has issued a series of statements downplaying Haugen’s document leaks, saying: its Instagram research showed that many teenagers found the app helpful; it was investing heavily in security at the expense of its bottom line; polarisation had been growing in the US for decades before Facebook appeared; and the company had “made fighting misinformation and providing authoritative information a priority”.

    Responding to accusations that Facebook had misled the public and regulators, the company said: “We stand by our public statements and are ready to answer any questions regulators may have about our work.”

    A Facebook spokesperson said: “Today, a Senate commerce subcommittee held a hearing with a former product manager at Facebook who worked for the company for less than two years, had no direct reports, never attended a decision-point meeting with C-level executives and testified more than six times to not working on the subject matter in question. We don’t agree with her characterization of the many issues she testified about.

    “Despite all this, we agree on one thing: it’s time to begin to create standard rules for the internet. It’s been 25 years since the rules for the internet have been updated, and instead of expecting the industry to make societal decisions that belong to legislators, it is time for Congress to act.”

  • Facebook whistleblower hearing: Frances Haugen testifies in Washington – live updates

    Key events


    1.47pm EDT
    13:47

    Hearing comes to a close as Haugen encourages more whistleblowers to come forward

    12.21pm EDT
    12:21

    The key takeaways so far

    10.38am EDT
    10:38

    Haugen: ‘We can afford nothing less than full transparency’

    10.32am EDT
    10:32

    Haugen gives opening statements

    10.06am EDT
    10:06

    Frances Haugen to testify before the Senate

    Live feed


    2.05pm EDT
    14:05

    This concludes the testimony of the Facebook whistleblower Frances Haugen in the US Congress on Tuesday. She spoke for hours, painting a dire picture of the company’s policies and offering suggestions for how to fix a company she called “morally bankrupt”.
    Haugen’s appearance in front of the US Senate is just the latest high-profile hearing on Big Tech, but it proved a substantive and insightful session that is sure to have a lasting impact.
    One of the most useful Big Tech hearings yet
    US lawmakers have held several high-profile hearings on the practices of prominent tech companies like Facebook, Google and Amazon in recent years, but we have rarely seen testimony from a witness who has so much expertise and so many actionable suggestions for changing a tech company for the better. It may have been the most useful Big Tech hearing yet.
    Social media’s impact on children
    Tuesday’s hearing was prompted by a Wall Street Journal report that revealed that Facebook had put aside its own research on the negative impact of its Instagram app on children.
    Haugen told lawmakers that Facebook intentionally targets teens, including children under the age of 13. Just last week, Facebook’s head of safety Antigone Davis had responded to questions about the company’s targeting of young users by emphasizing that children under the age of 13 were not allowed on Facebook.
    She said she does not believe Facebook when it says it is suspending Instagram Kids, its platform for young users that has been widely criticized. “I would be sincerely surprised if they do not continue working on Instagram Kids.”
    Fresh calls for regulation
    Haugen argued that Facebook needs more regulation, painting a picture of a company that lacks the staffing, expertise and transparency needed to make meaningful change. “Facebook is stuck in a cycle where it struggles to hire,” she said. “That causes it to understaff projects, which causes scandals, which then makes it harder to hire.”
    Senators seemed to agree
    Senators repeatedly compared Facebook to Big Tobacco, suggesting the platform may face regulation similar to what cigarettes have faced in the past. “Facebook is like Big Tobacco, enticing young kids with that first cigarette,” said Senator Ed Markey. “A first social media account designed to keep kids as users for life.”
    “Congress will be taking action. We will not allow your company to harm our children and our families and our democracy, any longer,” Markey added.
    A spotlight on Facebook’s role abroad
    Haugen put the spotlight on the impact of Facebook’s policy decisions outside of the US, saying that the company does not dedicate equal amounts of research and resources to misinformation and hate speech in non-English content. That resource gap, she contended, is fueling violence in places like Ethiopia.
    And on Facebook’s lack of transparency
    Haugen also condemned Facebook’s lack of transparency and suppression of research, both internally and by outside auditors. She referenced Facebook’s decision in August to revoke the access of researchers of New York University to the platform’s data about the spread of vaccine misinformation.
    “The fact that Facebook is so scared of even basic transparency, that it goes out of its way to block researchers who are asking awkward questions, shows the need for Congressional oversight,” she said.
    An array of possible next steps
    Haugen suggested several measures that could be taken to regulate Facebook, which will surely be debated in the weeks to come. Those measures include an independent government body staffed by former tech workers who understand how the algorithm works, changing the news feed to be chronological rather than ranking content through an opaque algorithm, and requiring Facebook to publicly disclose its internal research.

    1.55pm EDT
    13:55

    After the hearing came to a close, more reactions to Haugen’s testimony have emerged. Key takeaways? The whistleblower’s extensive expertise and concrete suggestions for fixing Facebook made this one of the most productive hearings we have seen on Big Tech.

    John Paczkowski
    (@JohnPaczkowski)
    just fascinating watching Frances Haugen methodically dismantle, collapse and chop into pieces *years* of Facebook PR messaging and deflection.

    October 5, 2021

    One suggestion Haugen repeatedly returned to was abolishing the news feed, the potential impact of which cannot be overstated.

    Will Oremus
    (@WillOremus)
    This is a legitimately radical proposal in that it would essentially destroy social media as we know it. It would make our feeds crappy and boring, and either spam-ridden or overly sanitized or both. The question is: Have things gotten so bad that it’s worth it? https://t.co/403xoDGQqX

    October 5, 2021

    Some disagree on the best way to address these issues, however, with internet freedom advocates warning against chipping away at Section 230.

    Evan Greer
    (@evan_greer)
    Yes, I agree this is way more productive than a lot of previous hearings, although it sure seems like lawmakers are still stuck on stupid around Section 230 instead of recognizing that privacy legislation could address this algorithmic harm.Also this: https://t.co/yppgN15gWs https://t.co/wRIdh0TqCl

    October 5, 2021

    The focus from Haugen on international implications of Facebook’s toxic algorithms was appreciated by a number of experts.

    Sheera Frenkel
    (@sheeraf)
    I am enjoying how @FrancesHaugen keeps deftly turning the conversation to how the problems we have in the US pale by comparison to what is happening with Facebook globally.

    October 5, 2021

    The comparisons to Big Tobacco and its regulation were frequent today.

    Joan Donovan, 🦫
    (@BostonJoan)
    The FTC could develop and test standards for algorithms, like they did with testing cigarettes. We have so much more to learn here. https://t.co/c4MwFZ35bD pic.twitter.com/diZQJK6E8g

    October 5, 2021

    Meanwhile, others believe Facebook is irreparable, and that even the intensive measures proposed by Haugen cannot save it.

    jane chung
    (@orientaljanedoe)
    props to @FrancesHaugen for speaking up
    but the premise that Facebook can be “healed” or “fixed” through a “truth and reconciliation” process is deeply, deeply flawed

    October 5, 2021

    Updated
    at 2.05pm EDT

    1.47pm EDT
    13:47

    Hearing comes to a close as Haugen encourages more whistleblowers to come forward

    Blumenthal ended the hearing with an emotional statement. He read aloud a text he received from a constituent who said he was “in tears” listening to Haugen’s testimony. A full transcription of that poignant message is below:

    My 15-year-old daughter loved her body – and at 14 was on Instagram constantly, and maybe posting too much. Suddenly, she started hating her body. With her body dysmorphia, and now anorexia, she was in deep, deep trouble before we found treatment. I fear [she] shall never be the same. I am broken hearted.

    In her closing statements, Haugen underscored the lack of transparency from Big Tech and encouraged her fellow tech workers to speak with bodies like the Securities and Exchange Commission and Congress “in order to have technologies be human centric, not computer centric.”
    “We live in a moment when whistleblowers are very important because these technological systems are walled off,” she said.

    Updated
    at 1.55pm EDT

    1.12pm EDT
    13:12

    Under questioning from Senator Amy Klobuchar, Haugen again condemns Facebook’s lack of transparency and suppression of research – both internally and by outside auditors.
    She referenced Facebook’s decision in August to revoke NYU researchers’ access to platform data about the spread of vaccine misinformation. She said she “stands with” researchers who Facebook is “throwing under the bus” in its own interest.
    “The fact that Facebook is so scared of even basic transparency, that it goes out of its way to block researchers who are asking awkward questions, shows the need for congressional oversight,” she said.

    Updated
    at 1.48pm EDT

    1.05pm EDT
    13:05

    There is a lot of discussion about Section 230 reform and whether it could effectively address these issues.
    Section 230 refers to a provision of US internet law that exempts platforms from legal liability for content generated by their users.
    Haugen says Facebook has claimed in the past it has “the right to mislead the court” because it has immunity under Section 230 “so why should they have to tell the truth?”
    There has been significant discussion in recent years about whether the regulation should be modified or overturned, which some internet freedom advocates warn could have unintended effects.

    Evan Greer
    (@evan_greer)
    Haugen is wrong about this. If we created a carveout in S 230 for algorithmic amplification Facebook wouldn’t just automatically revert to chronoligcal feed, your feed would become Disneyland. You’d only see sanitized content from corporate actors who lawyers determine is “safe”

    October 5, 2021

    Other bills, like Markey’s KIDS Act, would regulate algorithms without taking away Section 230 protections.

    Updated
    at 1.07pm EDT

    12.44pm EDT
    12:44

    Senator Rick Scott of Florida asks Haugen why Facebook has not been more proactive about addressing the issues brought up in these hearings and recent Wall Street Journal reports. As many senators have noted, Zuckerberg is sailing this week.
    Haugen takes a softer view on this, saying “I have a huge amount of empathy for Facebook.”
    “These are really, really hard questions and I think they feel a little trapped and isolated,” she said.
    She added that most social media firms have a strong hold on the positive purposes of their platforms, but that Instagram is “distinctly worse” than others.
    “TikTok is about doing fun things with your friends, Snapchat is about faces and augmented reality, Reddit is about ideas,” she said. “But Instagram is about bodies, and about comparing lifestyles.”

    12.29pm EDT
    12:29

    Now we have Senator Ted Cruz asking Haugen specifics about research she witnessed and what measures could be taken to enact meaningful change at Facebook. She suggests the following:

    Introducing friction to amplification – for example, a tool like Twitter has that requires users to click through a link before sharing it
    Changing the news feed to be chronological rather than ranking content through its opaque algorithm
    Convening a board in the public sector to regulate Facebook that is composed of researchers, former tech workers who understand the algorithms, and legislators
    Requiring Facebook to publicly disclose its internal research

    Updated
    at 1.03pm EDT

    12.21pm EDT
    12:21

    The key takeaways so far

    The session is coming back from a brief recess. Here are some key takeaways from the first part of today’s testimony:

    Facebook intentionally targets teens including children under the age of 13, Haugen says her documents show.
    Lack of transparency around how Facebook’s algorithms work makes it impossible to regulate, Haugen says.
    Senators are repeatedly comparing Facebook to Big Tobacco, suggesting the platform may face regulation similar to what cigarettes have faced in the past: “A first social media account designed to keep kids as users for life,” said Sen. Ed Markey.
    The platform does not dedicate equal amounts of research and resources to misinformation and hate speech in non-English content, Haugen says, fueling violence in places like Ethiopia.
    Haugen has stressed that Facebook tends to rely on artificial intelligence to automate moderation because it is cheaper, even though it only catches about 10-20% of offending content.
    Haugen suggested a number of measures to be taken to regulate Facebook, including an independent government body staffed by former tech workers who understand how the algorithm works.

    Updated
    at 1.04pm EDT

    11.43am EDT
    11:43

    Now we are moving into questioning from Senator Ed Markey, who called Haugen a “21st century American hero” and said Americans owe “a huge debt of gratitude” to her for her courage.
    He asked whether Facebook purposely targets children, to which Haugen replied that the company absolutely targets users under the age of 18.
    “Facebook is like Big Tobacco, enticing young kids with that first cigarette,” he said. “A first social media account designed to keep kids as users for life.”
    Markey is promoting his KIDS Act – an update to the Children’s Online Privacy Protection Act that would prevent companies from collecting certain data on children and prohibit the use of algorithms that promote toxic posts.
    “Here’s my message for Mark Zuckerberg: your time of invading our privacy, promoting toxic content and preying on children and teens is over,” Markey said. “Congress will be taking action. We will not allow your company to harm our children and our families and our democracy, any longer.”

    Updated
    at 1.05pm EDT

    11.32am EDT
    11:32

    David Smith

    From my colleague David Smith, who is attending the testimony in person in Washington DC.
    Facebook whistleblower Frances Haugen is delivering clear and crisp answers, with elaborate hand gestures for emphasis, at a Senate hearing where she is preaching to the converted.
    I’m among about 30 masked people in the public and press gallery sitting behind Haugen, who is alone at a long desk with two bottles of Mountain Valley water and a microphone before her. The latter has a red digital clock that counts down each senator’s question time.

    Senator Richard Blumenthal, chairing, focused on Haugen during her opening statement as other senators frequently looked down at their notes. He evidently liked her suggestion that Facebook should declare “moral bankruptcy”. Comparisons with Big Tobacco are also striking a chord.

    Haugen’s 60 Minutes interview means there are few surprises and the atmosphere is not quite as electrifying as Facebook’s critics would like, with senators such as Ted Cruz drifting in and out of the compact room and John Thune rocking back and forth in his chair.

    Senator Roger Wicker sought to reassure Haugen: “You see some vacant seats. This is a pretty good attendance for a subcommittee.” Wicker has since left the room.

    Updated
    at 1.05pm EDT

    11.30am EDT
    11:30

    Senator Mike Lee of Utah is talking about advertising targeted at young users, to which Haugen replies that it is “very, very difficult” to understand the algorithms used to regulate such posts.

    Kari Paul
    (@kari_paul)
    Sen. Mike Lee of Utah is talking about advertising targeted at young users, to which Haugen replies that it is “very, very difficult” to understand the algorithms used to regulate such posts. pic.twitter.com/3WXNmkEPau

    October 5, 2021

    Updated
    at 11.50am EDT

    11.18am EDT
    11:18

    Haugen is suggesting a regulatory agency within the federal government dedicated to policing Facebook, staffed with people who have expertise in algorithms to make meaningful change.
    “Right now the only people in the world who are trained to analyze these experiments to understand what’s happening inside of Facebook are people who have spent time there,” she said.

  • Frances Haugen: Facebook harms children and stokes division – video

    Frances Haugen, the former employee who accused Facebook of putting profit over safety, has testified before the US Senate. The whistleblower condemned the extreme secrecy and lack of transparency around Facebook and how its algorithms work. ‘I’m here today because I believe Facebook’s products harm children, stoke division and weaken our democracy,’ she said. ‘The company’s leadership knows how to make Facebook and Instagram safer, but won’t make the necessary changes because they have put their astronomical profits before people.’


  • Facebook whistleblower accuses firm of serially misleading over safety

    Frances Haugen filed at least eight complaints against the company regarding its approach to safety

    Dan Milmo, Global technology editor
    Tue 5 Oct 2021 07.50 EDT. Last modified on Tue 5 Oct 2021 10.23 EDT

    The Facebook whistleblower, Frances Haugen, who testifies at the US Congress on Tuesday, has filed at least eight complaints with the US financial watchdog accusing the social media company of serially misleading investors about its approach to safety and the size of its audience.

    The complaints, published online by the news programme 60 Minutes late on Monday, hours before Haugen’s testimony to US senators at 10am EDT (3pm BST), are based on tens of thousands of internal documents that Haugen copied shortly before she quit Facebook in May.

    The complaints and testimony from Haugen, who stepped forward on Sunday as the source of a damning series of revelations in the Wall Street Journal, are taking place against a backdrop of operational chaos for Facebook, whose platforms, including Instagram and WhatsApp, went offline around the world for nearly six hours on Monday.

    The first whistleblower complaint filed to the US Securities and Exchange Commission relates to the 6 January riots in Washington, when crowds of protesters stormed the Capitol, and alleges that Facebook knowingly chose to permit political misinformation and contests statements made by its chief executive, Mark Zuckerberg, to the contrary.

    “Our anonymous client is disclosing original evidence showing that Facebook … has, for years past and ongoing, violated US securities laws by making material misrepresentations and omissions in statements to investors and prospective investors,” the sweeping opening statement reads, “including, inter alia, through filings with the SEC, testimony to Congress, online statements and media stories.”

    The complaints against Facebook, which reflect a series of reports in the Wall Street Journal in recent weeks, also cover:
    The company’s approach to hate speech.
    Its approach to teenage mental health.
    Its monitoring of human trafficking.
    How the company’s algorithms promoted hate speech.
    Preferential disciplinary treatment for VIP users.
    Promoting ethnic violence.
    Failing to inform investors about a shrinking user base in certain demographics.
    The first complaint, regarding 6 January, contests testimony given to Congress in March by Facebook’s founder and chief executive, Mark Zuckerberg, in which he stated: “We remove language that incites or facilitates violence, and we ban groups that proclaim a hateful and violent mission.”

    The complaint rebuts this, claiming that the company’s own records show it “knowingly chose to permit political misinformation and violent content/groups and failed to adopt or continue measures to combat these issues, including as related to the 2020 US election and the 6 January insurrection, in order to promote virality and growth on its platforms.”

    According to one internal Facebook document quoted in the complaints, the company admits: “For example, we estimate that we may action as little as 3-5% of hate [speech] and ~0.6% of V&V [violent and inciting content] on Facebook.”

    A complaint also alleges that Facebook misrepresented its “reach and frequency”, which are key metrics for the advertisers who provide the majority of Facebook’s revenue. That included concealing a decline in the key demographic of young users, the complaint stated. “During Covid, every cohort’s use of Facebook increased, except for those 23 and under, which continued to decline,” the complaint said.

    “For years, Facebook has misrepresented core metrics to investors and advertisers including the amount of content produced on its platforms and growth in individual users,” it said, adding this applied particularly in “high-value demographics” such as US teenagers.

    Facebook has been approached for comment.

    The human trafficking complaint alleges that Facebook and its photo-sharing app, Instagram, were aware in 2019 that the platforms were being used to “promote human trafficking and domestic servitude”. The hate speech complaint quotes another internal document that states: “We only take action against approximately 2% of the hate speech on the platform.” The teen health complaint focuses on the most damaging allegation from the WSJ series: that Instagram knew the app caused anxiety about body image among teenage girls.

    A complaint about Facebook’s approach to algorithms alleges that a tweak to the app’s News Feed product – a key part of users’ interaction with the app – led to the prioritisation of divisive content, while the complaint about ethnic violence contains an excerpt from an internal study that claims “in the Afghanistan market, the action rate for hate speech is worryingly low”.

    Facebook has issued a series of statements downplaying Haugen’s document leaks, saying: its Instagram research showed that many teens found the app helpful; it was investing heavily in security at the expense of its bottom line; polarisation had been growing in the US for decades before Facebook appeared; and the company had “made fighting misinformation and providing authoritative information a priority”.

    Responding to accusations that Facebook had misled the public and regulators, the company said: “We stand by our public statements and are ready to answer any questions regulators may have about our work.”

  • Facebook whistleblower to take her story before the US Senate

    Frances Haugen, who came forward accusing the company of putting profit over safety, will testify in Washington on Tuesday

    Dan Milmo and Kari Paul
    Mon 4 Oct 2021 23.00 EDT. Last modified on Mon 4 Oct 2021 23.23 EDT

    A former Facebook employee who has accused the company of putting profit over safety will take her damning accusations to Washington on Tuesday when she testifies to US senators.

    Frances Haugen, 37, came forward on Sunday as the whistleblower behind a series of damaging reports in the Wall Street Journal that have heaped further political pressure on the tech giant. Haugen told the news program 60 Minutes that Facebook’s priority was making money over doing what was good for the public.

    “The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook. And Facebook, over and over again, chose to optimise for its own interests, like making more money,” she said.

    Haugen is expected to tell lawmakers that Facebook faces little oversight, and will urge Congress to take action. “As long as Facebook is operating in the dark, it is accountable to no one. And it will continue to make choices that go against the common good,” she wrote in her written testimony.

    Haugen was called to testify before the US Senate’s commerce subcommittee on the risks the company’s products pose to children. Lawmakers called the hearing in response to a Wall Street Journal story based on Haugen’s documents that showed Facebook was aware of the damage its Instagram app was causing to teen mental health and wellbeing. One survey in the leaked research estimated that 30% of teenage girls felt Instagram made dissatisfaction with their body worse.

    She is expected to compare Facebook to big tobacco, which resisted telling the public that smoking damaged consumers’ health. “When we realized tobacco companies were hiding the harms it caused, the government took action. When we figured out cars were safer with seatbelts, the government took action,” Haugen wrote. “I implore you to do the same here.”

    Haugen will argue that Facebook’s closed design means it has no oversight, even from its own oversight board, a regulatory group that was formed in 2020 to make decisions independent of Facebook’s corporate leadership.

    “This inability to see into the actual systems of Facebook and confirm that Facebook’s systems work like they say is like the Department of Transportation regulating cars by watching them drive down the highway,” she wrote in her testimony. “Imagine if no regulator could ride in a car, pump up its wheels, crash test a car, or even know that seatbelts could exist.”

    Senator Richard Blumenthal, the Democrat whose committee is holding Tuesday’s hearing, told the Washington Post’s Technology 2020 newsletter that lawmakers will also ask Haugen about her remarks on the 2020 presidential election.

    Haugen alleged on 60 Minutes that following Joe Biden’s win in the election, Facebook prematurely reinstated old algorithms that valued engagement over all else, a move that she said contributed to the 6 January attack on the Capitol.

    “As soon as the election was over, they turned them back off or they changed the settings back to what they were before, to prioritize growth over safety. And that really feels like a betrayal of democracy to me,” she said.

    Following the election, Facebook also disbanded its civic integrity team, a group that worked on issues related to political elections worldwide and which Haugen worked on. Facebook has said the team’s functions were distributed across the company.

    Haugen joined Facebook in 2019 as a product manager on the civic integrity team after spending more than a decade working in the tech industry, including at Pinterest and Google.

    Tuesday’s hearing is the second in a matter of weeks to focus on Facebook’s impact on children. Last week, lawmakers grilled Antigone Davis, Facebook’s global head of safety, and accused the company of “routinely” putting growth above children’s safety.

    Facebook has aggressively contested the accusations.

    On Friday, the company’s vice-president of policy and public affairs, Nick Clegg, wrote to Facebook employees ahead of Haugen’s public appearance. “Social media has had a big impact on society in recent years, and Facebook is often a place where much of this debate plays out,” he said. “But what evidence there is simply does not support the idea that Facebook, or social media more generally, is the primary cause of polarization.”

    On Monday, Facebook asked a federal judge to throw out a revised antitrust lawsuit brought by the Federal Trade Commission (FTC) that seeks to force the company to sell Instagram and WhatsApp.

  • Facebook ‘tearing our societies apart’: key excerpts from a whistleblower

    Frances Haugen tells US news show why she decided to reveal inside story about social networking firm

    Dan Milmo, Global technology editor
    Mon 4 Oct 2021 08.33 EDT. Last modified on Mon 4 Oct 2021 10.30 EDT

    Frances Haugen’s interview with the US news programme 60 Minutes contained a litany of damning statements about Facebook. Haugen, a former Facebook employee who had joined the company to help it combat misinformation, told the CBS show the tech firm prioritised profit over safety and was “tearing our societies apart”.

    Haugen will testify in Washington on Tuesday, as political pressure builds on Facebook. Here are some of the key excerpts from Haugen’s interview.

    Choosing profit over the public good

    Haugen’s most cutting words echoed what is becoming a regular refrain from politicians on both sides of the Atlantic: that Facebook puts profit above the wellbeing of its users and the public. “The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook. And Facebook, over and over again, chose to optimise for its own interests, like making more money.”

    She also accused Facebook of endangering public safety by reversing changes to its algorithm once the 2020 presidential election was over, allowing misinformation to spread on the platform again. “And as soon as the election was over, they turned them [the safety systems] back off or they changed the settings back to what they were before, to prioritise growth over safety. And that really feels like a betrayal of democracy to me.”

    Facebook’s approach to safety compared with others

    In a 15-year career as a tech professional, Haugen, 37, has worked for companies including Google and Pinterest but she said Facebook had the worst approach to restricting harmful content. She said: “I’ve seen a bunch of social networks and it was substantially worse at Facebook than anything I’d seen before.” Referring to Mark Zuckerberg, Facebook’s founder and chief executive, she said: “I have a lot of empathy for Mark. And Mark has never set out to make a hateful platform. But he has allowed choices to be made where the side-effects of those choices are that hateful, polarising content gets more distribution and more reach.”

    Instagram and mental health

    The document leak that had the greatest impact was a series of research slides that showed Facebook’s Instagram app was damaging the mental health and wellbeing of some teenage users, with 30% of teenage girls feeling that it made dissatisfaction with their body worse.

    She said: “And what’s super tragic is Facebook’s own research says, as these young women begin to consume this eating disorder content, they get more and more depressed. And it actually makes them use the app more. And so, they end up in this feedback cycle where they hate their bodies more and more. Facebook’s own research says it is not just that Instagram is dangerous for teenagers, that it harms teenagers, it’s that it is distinctly worse than other forms of social media.”

    Facebook has described the Wall Street Journal’s reporting on the slides as a “mischaracterisation” of its research.

    Why Haugen leaked the documents

    Haugen said “person after person” had attempted to tackle Facebook’s problems but had been ground down. “Imagine you know what’s going on inside of Facebook and you know no one on the outside knows. I knew what my future looked like if I continued to stay inside of Facebook, which is person after person after person has tackled this inside of Facebook and ground themselves to the ground.”

    Having joined the company in 2019, Haugen said she decided to act this year and started copying tens of thousands of documents from Facebook’s internal system, which she believed show that Facebook is not, despite public comments to the contrary, making significant progress in combating online hate and misinformation. “At some point in 2021, I realised, ‘OK, I’m gonna have to do this in a systemic way, and I have to get out enough that no one can question that this is real.’”

    Facebook and violence

    Haugen said the company had contributed to ethnic violence, a reference to Burma. In 2018, following the massacre of Rohingya Muslims by the military, Facebook admitted that its platform had been used to “foment division and incite offline violence” relating to the country. Speaking on 60 Minutes, Haugen said: “When we live in an information environment that is full of angry, hateful, polarising content it erodes our civic trust, it erodes our faith in each other, it erodes our ability to want to care for each other. The version of Facebook that exists today is tearing our societies apart and causing ethnic violence around the world.”

    Facebook and the Washington riot

    The 6 January riot, when crowds of rightwing protesters stormed the Capitol, came after Facebook disbanded the Civic Integrity team of which Haugen was a member. The team, which focused on issues linked to elections around the world, was dispersed to other Facebook units following the US presidential election. “They told us: ‘We’re dissolving Civic Integrity.’ Like, they basically said: ‘Oh good, we made it through the election. There wasn’t riots. We can get rid of Civic Integrity now.’ Fast-forward a couple months, we got the insurrection. And when they got rid of Civic Integrity, it was the moment where I was like, ‘I don’t trust that they’re willing to actually invest what needs to be invested to keep Facebook from being dangerous.’”

    The 2018 algorithm change

    Facebook changed the algorithm on its news feed – Facebook’s central feature, which supplies users with a customised feed of content such as friends’ photos and news stories – to prioritise content that increased user engagement. Haugen said this made divisive content more prominent.

    “One of the consequences of how Facebook is picking out that content today is it is optimising for content that gets engagement, or reaction. But its own research is showing that content that is hateful, that is divisive, that is polarising – it’s easier to inspire people to anger than it is to other emotions.” She added: “Facebook has realised that if they change the algorithm to be safer, people will spend less time on the site, they’ll click on less ads, they’ll make less money.”

    Haugen said European political parties contacted Facebook to say that the news feed change was forcing them to take more extreme political positions in order to win users’ attention. Describing politicians’ concerns, she said: “You are forcing us to take positions that we don’t like, that we know are bad for society. We know if we don’t take those positions, we won’t win in the marketplace of social media.”

    In a statement to 60 Minutes, Facebook said: “Every day our teams have to balance protecting the right of billions of people to express themselves openly with the need to keep our platform a safe and positive place. We continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true. If any research had identified an exact solution to these complex challenges, the tech industry, governments, and society would have solved them a long time ago.”

  • Facebook whistleblower to claim company contributed to Capitol attack

    Former employee is set to air her claims and reveal her identity in an interview airing Sunday night on CBS 60 Minutes

    Edward Helmore
    Sun 3 Oct 2021 13.13 EDT. Last modified on Sun 3 Oct 2021 13.15 EDT

    A whistleblower at Facebook will say that thousands of pages of internal company research she turned over to federal regulators prove the social media giant is deceptively claiming effectiveness in its efforts to eradicate hate and misinformation, and that it contributed to the January 6 attack on the Capitol in Washington DC.

    The former employee is set to air her claims and reveal her identity in an interview airing Sunday night on CBS 60 Minutes ahead of a scheduled appearance at a Senate hearing on Tuesday.

    In an internal 1,500-word memo titled Our position on Polarization and Election sent out on Friday, Facebook’s vice-president of global affairs, Nick Clegg, acknowledged that the whistleblower would accuse the company of contributing to the 6 January Capitol riot and called the claims “misleading”.

    The memo was first reported by the New York Times.

    The 6 January insurrection was carried out by a pro-Trump mob that sought to disrupt the election of Joe Biden as president. The violence and chaos of the attack sent shockwaves throughout the US, and the rest of the world, and saw scores of people injured and five die.

    Clegg, a former UK deputy prime minister, said in his memo that Facebook had “developed industry-leading tools to remove hateful content and reduce the distribution of problematic content. As a result, the prevalence of hate speech on our platform is now down to about 0.05%.”

    He said that many things had contributed to America’s divisive politics.

    “The rise of polarization has been the subject of swathes of serious academic research in recent years. In truth, there isn’t a great deal of consensus. But what evidence there is simply does not support the idea that Facebook, or social media more generally, is the primary cause of polarization,” Clegg wrote.

    The memo comes two weeks after Facebook issued a statement on its corporate website hitting back against a series of critical articles in the Wall Street Journal.