More stories

  • Supreme court, Facebook, Fed: three horsemen of democracy’s apocalypse | Robert Reich

Opinion | US supreme court

These unaccountable bodies hold increasing sway over US government. Their abuses of power affect us all.

Robert Reich
Sun 10 Oct 2021 01.00 EDT. Last modified on Sun 10 Oct 2021 05.22 EDT

The week’s news has been dominated by the supreme court, whose term began on Monday; the Federal Reserve, and whether it will start responding to inflation by raising interest rates; and Facebook, which a whistleblower claimed intentionally seeks to enrage and divide Americans in order to generate engagement and ad revenue.

The common thread is the growing influence of these three power centers over our lives, even as they become less accountable to us. As such, they present a fundamental challenge to democracy.

Start with the supreme court. What’s the underlying issue?

Don’t for a moment believe the supreme court bases its decisions on neutral, objective criteria. I’ve argued before it and seen up close that justices have particular and differing ideas about what’s good for the country. So it matters who they are and how they got there.

A majority of the nine justices – all appointed for life – were put there by George W Bush and Donald Trump, presidents who lost the popular vote. Three were installed by Trump, a president who instigated a coup. Yet they are about to revolutionize American life in ways most Americans don’t want.

This new court seems ready to overrule Roe v Wade, the 1973 ruling that anchored reproductive rights in the 14th amendment; declare a 108-year-old New York law against carrying firearms unconstitutional; and strip federal bodies such as the Environmental Protection Agency of the power to regulate private business. And much more.

Only 40% of the public approves of the court’s performance, a new low. If the justices rule in the ways anticipated, that number will drop further. If so, expect renewed efforts to expand the court and limit the terms of its members.

What about the Fed?

Behind the recent stories about whether the Fed should act to tame inflation is the reality that its power to set short-term interest rates and regulate the financial sector is virtually unchecked. And here too there are no neutral, objective criteria. Some believe the Fed’s priority should be fighting inflation. Others believe it should be full employment. So, as with the supreme court, it matters who runs it.

Presidents appoint Fed chairs for four-year terms but tend to stick with them longer for fear of rattling Wall Street, which wants stability and fat profits. (Alan Greenspan, a Reagan appointee, lasted almost 20 years, surviving two Bushes and Bill Clinton, who didn’t dare remove him.)

The term of Jerome Powell, the current Fed chair, who was appointed by Trump, is up in February. Biden will probably renominate him to appease the Street, although it’s not a sure thing.
Powell has kept interest rates near zero, which is appropriate for an economy still suffering the ravages of the pandemic. But Powell has also allowed the Street to resume several old risky practices, prompting the Massachusetts Democratic senator Elizabeth Warren to tell him at a recent hearing that “renominating you means gambling that, for the next five years, a Republican majority at the Federal Reserve, with a Republican chair who has regularly voted to deregulate Wall Street, won’t drive this economy over a financial cliff again.”

Finally, what’s behind the controversy over Facebook?

Facebook and three other hi-tech behemoths (Amazon, Google and Apple) are taking on roles that once belonged to governments, from cybersecurity to exploring outer space, yet they too are unaccountable.

Their decisions about which demagogues are allowed to communicate with the public and what lies they are allowed to spew have profound consequences for whether democracy or authoritarianism prevails. In January, Mark Zuckerberg apparently deferred to Nick Clegg, former British deputy prime minister, now vice-president of Facebook, on whether to allow Trump back on the platform.

Worst of all, they’re sowing hate. As Frances Haugen, a former data scientist at Facebook, revealed this week, Facebook’s algorithm is designed to choose content that will make users angry, because anger generates the most engagement – and user engagement turns into ad dollars. The same is likely true of the algorithms used by Google, Amazon and Apple. Such anger has been ricocheting through our society, generating resentment and division.

Yet these firms have so much power that the government has no idea how to control them. How many times do you think Facebook executives testified before Congress in the last four years? Answer: 30. How many laws has Congress enacted to constrain Facebook during that time? Answer: zero.

Nor are they accountable to the market. They now make the market. They’re not even accountable to themselves. Facebook’s oversight board has become a bad joke.

These three power centers – the supreme court, the Fed and the biggest tech firms – have huge and increasing effects on our lives, yet they are less and less answerable to us.

Beware. Democracy depends on accountability. Accountability provides checks on power. If abuses of power go unchallenged, those who wield it will only consolidate their power further. It’s a vicious cycle that erodes faith in democracy itself.
    Robert Reich, a former US secretary of labor, is professor of public policy at the University of California at Berkeley and the author of Saving Capitalism: For the Many, Not the Few and The Common Good. His new book, The System: Who Rigged It, How We Fix It, is out now. He is a Guardian US columnist. His newsletter is at robertreich.substack.com

  • Facebook whistleblower testimony should prompt new oversight – Schiff

‘I think we need regulation to protect people’s private data,’ influential Democrat says in wake of Frances Haugen revelations

Martin Pengelly and Charles Kaiser
Sat 9 Oct 2021 16.06 EDT. First published on Sat 9 Oct 2021 15.36 EDT

Testimony in Congress this week by the whistleblower Frances Haugen should prompt action to implement meaningful oversight of Facebook and other tech giants, the influential California Democrat Adam Schiff told the Guardian in an interview to be published on Sunday.

“I think we need regulation to protect people’s private data,” the chair of the House intelligence committee said.

“I think we need to narrow the scope of the safe harbour these companies enjoy if they don’t moderate their contents and continue to amplify anger and hate. I think we need to insist on a vehicle for more transparency so we understand the data better.”

Haugen, 37, was the source for recent Wall Street Journal reporting on misinformation spread by Facebook and Instagram, the photo-sharing platform which Facebook owns. She left Facebook in May this year, but her revelations have left the tech giant facing its toughest questions since the Cambridge Analytica user privacy scandal.

At a Senate hearing on Tuesday, Haugen shared internal Facebook reports and argued that the social media giant puts “astronomical profits before people”, harming children and destabilising democracy via the sharing of inaccurate and divisive content.

Haugen likened the appeal of Instagram to tobacco, telling senators: “It’s just like cigarettes … teenagers don’t have good self-regulation.”

Richard Blumenthal, a Democrat from Connecticut, said Haugen’s testimony might represent a “big tobacco” moment for the social media companies – a reference to the oversight imposed on tobacco firms despite testimony in Congress from executives, whose companies knew their product was harmful, that it was not.

The founder and head of Facebook, Mark Zuckerberg, has resisted proposals to overhaul the US internet regulatory framework, which is widely considered to be woefully out of date.

He responded to Haugen’s testimony by saying the “idea that we prioritise profit over safety and wellbeing” was “just not true”.

“The argument that we deliberately push content that makes people angry for profit is deeply illogical,” he said. “We make money from ads, and advertisers consistently tell us they don’t want their ads next to harmful or angry content.”

Schiff was speaking to mark publication of a well-received new memoir, Midnight in Washington: How We Almost Lost Our Democracy and Still Could.

The Democrat played prominent roles in the Russia investigation and Donald Trump’s first impeachment.
He now sits on the select committee investigating the deadly attack on the US Capitol on 6 January by Trump supporters seeking to overturn his election defeat – an effort fueled in part by misinformation on social media.

In his book, Schiff writes about asking representatives of Facebook and two other tech giants, Twitter and YouTube, if their “algorithms were having the effect of balkanising the public and deepening the divisions in our society”.

Facebook’s general counsel in the 2017 hearing, Schiff writes, said: “The data on this is actually quite mixed.”

“It didn’t seem very mixed to me,” Schiff says.

Asked if he thought Haugen’s testimony would create enough pressure for Congress to pass new laws regulating social media companies, Schiff told the Guardian: “The answer is yes.”

However, as an experienced member of a bitterly divided and legislatively sclerotic Congress, he also cautioned against too much optimism among reform proponents.

“If you bet against Congress,” Schiff said, “you win 90% of the time.”

  • Facebook whistleblower’s testimony could finally spark action in Congress

Despite years of hearings, the company has long seemed untouchable. But Frances Haugen appears to have inspired rare bipartisanship.

Kari Paul
Wed 6 Oct 2021 01.00 EDT

The testimony of Frances Haugen, a former Facebook employee, is likely to increase pressure on US lawmakers to undertake concrete legislative action against the formerly untouchable tech company, following years of hearings and circular discussions about big tech’s growing power.

In a hearing on Tuesday, the whistleblower shared internal Facebook reports with Congress and argued the company puts “astronomical profits before people”, harms children and is destabilizing democracies.

After years of sparring over the role of tech companies in past American elections, lawmakers from both sides of the aisle on Tuesday appeared to agree on the need for new regulations that would change how Facebook targets users and amplifies content.

“Frances Haugen’s testimony appears to mark a rare moment of bipartisan consensus that the status quo is no longer acceptable,” said Imran Ahmed, chief executive officer of the Center for Countering Digital Hate, a non-profit that fights hate speech and misinformation. “This is increasingly becoming a non-political issue and one that has cut through definitively to the mainstream.”

Throughout the morning, members of Congress leveled questions at Haugen about what specifically could and should be done to address the harms caused by Facebook.

With 15 years in the industry as an expert in algorithms and design, Haugen offered a number of suggestions – including changing news feeds to be chronological rather than algorithmic, appointing a government body for tech oversight, and requiring more transparency on internal research.

“I think the time has come for action,” Senator Amy Klobuchar told Haugen. “And I think you are the catalyst for that action.”

Unlike past hearings, which were frequently derailed by partisan bickering, Tuesday’s questioning largely stuck to the problems posed by Facebook’s opaque algorithmic formulas and how the platform harms children. Such issues can unite Congress, and there is going to be “a lot of bipartisan concern about this today and in future hearings”, said Senator Roger Wicker of Mississippi.

“The recent revelations about Facebook’s mental health effects on children are indeed disturbing,” he said. “They just show how urgent it is for Congress to act against powerful tech companies, on behalf of children and the broader public.”

However, activists who have been calling on Congress to enact laws protecting children from the negative effects of social media are skeptical of such promises.

“The bipartisan anger at Facebook is encouraging and totally justified,” said Jim Steyer, founder and CEO of the children’s protection organization Common Sense. “The next step is to turn that bipartisan anger into bipartisan legislative action before the year is over.”

Exactly what should be done to regulate Facebook is a matter of debate. Senator Todd Young of Indiana asked Haugen whether she believed breaking up Facebook would solve these issues.

“I’m actually against breaking up Facebook,” Haugen said.
“Oversight and finding collaborative solutions with Congress is going to be key, because these systems are going to continue to exist and be dangerous even if broken up.”

Many of the laws introduced or discussed thus far in Congress take aim at section 230, a portion of US internet regulations that exempts platforms from legal liability for content generated by their users.

While some organizations, including Common Sense, are calling for the reform of section 230, other internet freedom advocates have warned that targeting that law could have unintended negative consequences for human rights, activism, and freedom of expression.

“Haugen’s proposal to create a carveout in section 230 around algorithmic amplification would do more harm than good,” said Evan Greer, director of the activist group Fight for the Future. “Your feed would become like Disneyland, where everything in it is sanitized, vetted by lawyers, and paid for by corporations.”

Following the hearing, Facebook disputed Haugen’s characterizations. But the company said it agreed more regulation was in order. “We agree on one thing. It’s time to begin to create standard rules for the internet,” said Lena Pietsch, Facebook’s director of policy communications, in a statement. “It’s been 25 years since the rules of the internet have been updated, and instead of expecting the industry to make societal decisions that belong to legislators, it is time for Congress to act.”

Greer argued that Facebook was promoting changes to internet laws so that it could have a hand in crafting legislation that would largely benefit big corporations.

Other members of Congress have put forward potential paths to regulation that sidestep section 230 reform. Common Sense has called on Congress to pass the Children and Media Research Advancement (Camra) Act, which would authorize the National Institutes of Health to carry out research on the effects of social media on children and teens.

Advocacy groups have also called on Congress to update the Children’s Online Privacy Protection Act (Coppa), currently the primary mechanism for protecting children online. Proposed changes would stop companies from profiling teens and youth and microtargeting them with ads and content specifically designed to prey on their fears and insecurities.

“Here’s my message for Mark Zuckerberg: your time of invading our privacy, promoting toxic content and preying on children and teens is over,” said Senator Ed Markey, who authored one such bill, the Kids Act. “Congress will be taking action. We will not allow your company to harm our children and our families and our democracy any longer.”

  • Facebook harms children and is damaging democracy, claims whistleblower

Frances Haugen says in US Congress testimony that Facebook puts ‘astronomical profits before people’.

Dan Milmo and Kari Paul
Tue 5 Oct 2021 14.56 EDT. First published on Tue 5 Oct 2021 14.48 EDT

Facebook puts “astronomical profits before people”, harms children and is destabilising democracies, a whistleblower has claimed in testimony to the US Congress.

Frances Haugen said Facebook knew it steered young users towards damaging content and that its Instagram app was “like cigarettes” for under-18s. In a wide-ranging testimony, the former Facebook employee said the company did not have enough staff to keep the platform safe and was “literally fanning” ethnic violence in developing countries.

She also told US senators:
  • The “buck stops” with the founder and chief executive, Mark Zuckerberg.
  • Facebook knows its systems lead teenagers to anorexia-related content.
  • The company had to “break the glass” and turn back on safety settings after the 6 January Washington riots.
  • Facebook intentionally targets teenagers and children under 13.
  • Monday’s outage that brought down Facebook, Instagram and WhatsApp meant that for more than five hours Facebook could not “destabilise democracies”.
Haugen appeared in Washington on Tuesday after coming forward as the source of a series of revelations in the Wall Street Journal last month based on internal Facebook documents. They revealed the company knew Instagram was damaging teenagers’ mental health and that changes to Facebook’s News Feed feature – a central plank of users’ interaction with the service – had made the platform more polarising and divisive.

Her evidence to senators included the claim that Facebook knew Instagram users were being led to anorexia-related content. She said an algorithm “led children from very innocuous topics like healthy recipes … all the way to anorexia-promoting content over a very short period of time”.

In her opening testimony, Haugen, 37, said: “I’m here today because I believe Facebook’s products harm children, stoke division and weaken our democracy. The company’s leadership knows how to make Facebook and Instagram safer, but won’t make the necessary changes because they have put their astronomical profits before people.” She added that Facebook was “buying its profits with our safety”. In 2020, Facebook reported a net income – a US measure of profit – of more than $29bn (£21bn).

Referring to Monday’s near six-hour outage, in which Facebook’s platforms including Instagram and WhatsApp were disabled for billions of users, Haugen’s testimony added: “For more than five hours Facebook wasn’t used to deepen divides, destabilise democracies and make young girls and women feel bad about their bodies.” Facebook has 3.5 billion monthly active users across its platforms, including Instagram and WhatsApp.

Warning that Facebook makes choices that “go against the common good”, Haugen said the company should be treated like the tobacco industry, which was subject to government action once it was discovered to be hiding the harms its products caused, or like the car companies that were forced to adopt seatbelts and the opioid firms that have been sued by government agencies.

Urging lawmakers to force more transparency on Facebook, she said there should be more scrutiny of its algorithms, which shape the content delivered to users. “The core of the issue is that no one can understand Facebook’s destructive choices better than Facebook, because only Facebook gets to look under the hood,” she said. With greater transparency, she added, “we can build sensible rules and standards to address consumer harms, illegal content, data protection, anticompetitive practices, algorithmic systems and more”.

The hearing focused on the impact of Facebook’s platforms on children, with Haugen likening the appeal of Instagram to tobacco: “It’s just like cigarettes … teenagers don’t have good self-regulation.” Haugen added that women would be walking around with brittle bones in 60 years’ time because of the anorexia-related content they found on Facebook platforms.

Haugen told lawmakers that Facebook intentionally targets teens and “definitely” targets children as young as eight for its Messenger Kids app.
The former Facebook product manager left the company in May after copying tens of thousands of internal documents.

A Facebook spokesperson, Andy Stone, said in a tweet during the hearing: “Just pointing out the fact that Frances Haugen did not work on child safety or Instagram or research these issues and has no direct knowledge of the topic from her work at Facebook.”

Haugen said that, according to internal documents, Zuckerberg had been given “soft options” to make the Facebook platform less “twitchy” and viral in countries prone to violence but declined to take them because it might affect “meaningful social interactions”, or MSI. She added: “We have a few choice documents that contain notes from briefings with Mark Zuckerberg where he chose metrics defined by Facebook like ‘meaningful social interactions’ over changes that would have significantly decreased misinformation, hate speech and other inciting content.”

Haugen said Zuckerberg had built a company that was “very metrics driven”, because the more time people spent on Facebook platforms the more appealing the business was to advertisers. Asked about Zuckerberg’s ultimate responsibility for decisions made at Facebook, she said: “The buck stops with him.”

Haugen also warned that Facebook was “literally fanning ethnic violence” in places such as Ethiopia because it was not policing its service adequately outside the US.

Referring to the aftermath of the 6 January storming of the Capitol, when protesters sought to overturn the US presidential election result, Haugen said she was disturbed that Facebook had to “break the glass” and reinstate safety settings it had put in place for the November poll. Haugen, who worked for the Facebook team that monitored election interference globally, said those precautions had been dropped after Joe Biden’s victory in order to spur growth on the platform.

Among the reforms recommended by Haugen were ensuring that Facebook shares internal information and research with “appropriate” oversight bodies such as Congress, and removing the influence of algorithms on Facebook’s News Feed by allowing it to be ranked chronologically.

Senator Ed Markey said Congress would take action. “Here’s my message for Mark Zuckerberg: your time of invading our privacy, promoting toxic content and preying on children and teens is over,” Markey said. “Congress will be taking action. We will not allow your company to harm our children and our families and our democracy any longer.”
Haugen’s lawyers have also filed at least eight complaints with the US financial watchdog accusing the social media company of serially misleading investors about its approach to safety and the size of its audience.

Facebook has issued a series of statements downplaying Haugen’s document leaks, saying: its Instagram research showed that many teenagers found the app helpful; it was investing heavily in security at the expense of its bottom line; polarisation had been growing in the US for decades before Facebook appeared; and the company had “made fighting misinformation and providing authoritative information a priority”.

Responding to accusations that Facebook had misled the public and regulators, the company said: “We stand by our public statements and are ready to answer any questions regulators may have about our work.”

A Facebook spokesperson said: “Today, a Senate commerce subcommittee held a hearing with a former product manager at Facebook who worked for the company for less than two years, had no direct reports, never attended a decision-point meeting with C-level executives and testified more than six times to not working on the subject matter in question. We don’t agree with her characterization of the many issues she testified about.

“Despite all this, we agree on one thing: it’s time to begin to create standard rules for the internet. It’s been 25 years since the rules for the internet have been updated, and instead of expecting the industry to make societal decisions that belong to legislators, it is time for Congress to act.”

  • Facebook whistleblower accuses firm of serially misleading over safety

Frances Haugen filed at least eight complaints against the company regarding its approach to safety.

Dan Milmo, Global technology editor
Tue 5 Oct 2021 07.50 EDT. Last modified on Tue 5 Oct 2021 10.23 EDT

The Facebook whistleblower, Frances Haugen, who testifies at the US Congress on Tuesday, has filed at least eight complaints with the US financial watchdog accusing the social media company of serially misleading investors about its approach to safety and the size of its audience.

The complaints, published online by the news programme 60 Minutes late on Monday, hours before Haugen’s testimony to US senators at 10am EDT (3pm BST), are based on tens of thousands of internal documents that Haugen copied shortly before she quit Facebook in May.

The complaints and testimony from Haugen, who stepped forward on Sunday as the source of a damning series of revelations in the Wall Street Journal, are taking place against a backdrop of operational chaos for Facebook, whose platforms, including Instagram and WhatsApp, went offline around the world for nearly six hours on Monday.

The first whistleblower complaint filed to the US Securities and Exchange Commission relates to the 6 January riots in Washington, when crowds of protesters stormed the Capitol, and alleges that Facebook knowingly chose to permit political misinformation and contests statements made by its chief executive, Mark Zuckerberg, to the contrary.

“Our anonymous client is disclosing original evidence showing that Facebook … has, for years past and ongoing, violated US securities laws by making material misrepresentations and omissions in statements to investors and prospective investors,” the sweeping opening statement reads, “including, inter alia, through filings with the SEC, testimony to Congress, online statements and media stories.”

The complaints against Facebook, which reflect a series of reports in the Wall Street Journal in recent weeks, also cover:
  • The company’s approach to hate speech.
  • Its approach to teenage mental health.
  • Its monitoring of human trafficking.
  • How the company’s algorithms promoted hate speech.
  • Preferential disciplinary treatment for VIP users.
  • Promoting ethnic violence.
  • Failing to inform investors about a shrinking user base in certain demographics.
The first complaint, regarding 6 January, contests testimony given to Congress in March by Facebook’s founder and chief executive, Mark Zuckerberg, in which he stated: “We remove language that incites or facilitates violence, and we ban groups that proclaim a hateful and violent mission.”

The complaint rebuts this, claiming that the company’s own records show it “knowingly chose to permit political misinformation and violent content/groups and failed to adopt or continue measures to combat these issues, including as related to the 2020 US election and the 6 January insurrection, in order to promote virality and growth on its platforms”.

According to one internal Facebook document quoted in the complaints, the company admits: “For example, we estimate that we may action as little as 3-5% of hate [speech] and ~0.6% of V&V [violent and inciting content] on Facebook.”

A complaint also alleges that Facebook misrepresented its “reach and frequency”, which are key metrics for the advertisers who provide the majority of Facebook’s revenue. That included concealing a decline in the key demographic of young users, the complaint stated. “During Covid, every cohort’s use of Facebook increased, except for those 23 and under, which continued to decline,” the complaint said.

“For years, Facebook has misrepresented core metrics to investors and advertisers including the amount of content produced on its platforms and growth in individual users,” it said, adding that this applied particularly in “high-value demographics” such as US teenagers.

Facebook has been approached for comment.

The human trafficking complaint alleges that Facebook and its photo-sharing app, Instagram, were aware in 2019 that the platforms were being used to “promote human trafficking and domestic servitude”. The hate speech complaint quotes another internal document that states: “We only take action against approximately 2% of the hate speech on the platform.” The teen health complaint focuses on the most damaging allegation from the WSJ series: that Instagram knew the app caused anxiety about body image among teenage girls.

A complaint about Facebook’s approach to algorithms alleges that a tweak to the app’s News Feed product – a key part of users’ interaction with the app – led to the prioritisation of divisive content, while the complaint about ethnic violence contains an excerpt from an internal study that claims “in the Afghanistan market, the action rate for hate speech is worryingly low”.

Facebook has issued a series of statements downplaying Haugen’s document leaks, saying: its Instagram research showed that many teens found the app helpful; it was investing heavily in security at the expense of its bottom line; polarisation had been growing in the US for decades before Facebook appeared; and the company had “made fighting misinformation and providing authoritative information a priority”.

Responding to accusations that Facebook had misled the public and regulators, the company said: “We stand by our public statements and are ready to answer any questions regulators may have about our work.”

  • Facebook ‘tearing our societies apart’: key excerpts from a whistleblower

Frances Haugen tells US news show why she decided to reveal inside story about social networking firm.

Dan Milmo, Global technology editor
Mon 4 Oct 2021 08.33 EDT. Last modified on Mon 4 Oct 2021 10.30 EDT

Frances Haugen’s interview with the US news programme 60 Minutes contained a litany of damning statements about Facebook. Haugen, a former Facebook employee who had joined the company to help it combat misinformation, told the CBS show the tech firm prioritised profit over safety and was “tearing our societies apart”.

Haugen will testify in Washington on Tuesday, as political pressure builds on Facebook. Here are some of the key excerpts from Haugen’s interview.

Choosing profit over the public good

Haugen’s most cutting words echoed what is becoming a regular refrain from politicians on both sides of the Atlantic: that Facebook puts profit above the wellbeing of its users and the public. “The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook. And Facebook, over and over again, chose to optimise for its own interests, like making more money.”

She also accused Facebook of endangering public safety by reversing changes to its algorithm once the 2020 presidential election was over, allowing misinformation to spread on the platform again. “And as soon as the election was over, they turned them [the safety systems] back off or they changed the settings back to what they were before, to prioritise growth over safety. And that really feels like a betrayal of democracy to me.”

Facebook’s approach to safety compared with others

In a 15-year career as a tech professional, Haugen, 37, has worked for companies including Google and Pinterest, but she said Facebook had the worst approach to restricting harmful content. She said: “I’ve seen a bunch of social networks and it was substantially worse at Facebook than anything I’d seen before.” Referring to Mark Zuckerberg, Facebook’s founder and chief executive, she said: “I have a lot of empathy for Mark. And Mark has never set out to make a hateful platform. But he has allowed choices to be made where the side-effects of those choices are that hateful, polarising content gets more distribution and more reach.”

Instagram and mental health

The document leak that had the greatest impact was a series of research slides showing that Facebook’s Instagram app was damaging the mental health and wellbeing of some teenage users, with 30% of teenage girls feeling that it made dissatisfaction with their body worse.

She said: “And what’s super tragic is Facebook’s own research says, as these young women begin to consume this eating disorder content, they get more and more depressed. And it actually makes them use the app more. And so, they end up in this feedback cycle where they hate their bodies more and more. Facebook’s own research says it is not just that Instagram is dangerous for teenagers, that it harms teenagers, it’s that it is distinctly worse than other forms of social media.”

Facebook has described the Wall Street Journal’s reporting on the slides as a “mischaracterisation” of its research.

Why Haugen leaked the documents

Haugen said “person after person” had attempted to tackle Facebook’s problems but had been ground down.
“Imagine you know what’s going on inside of Facebook and you know no one on the outside knows. I knew what my future looked like if I continued to stay inside of Facebook, which is person after person after person has tackled this inside of Facebook and ground themselves to the ground.”

Having joined the company in 2019, Haugen said she decided to act this year and started copying tens of thousands of documents from Facebook’s internal system, which she believed showed that Facebook is not, despite public comments to the contrary, making significant progress in combating online hate and misinformation. “At some point in 2021, I realised, ‘OK, I’m gonna have to do this in a systemic way, and I have to get out enough that no one can question that this is real.’”

Facebook and violence

Haugen said the company had contributed to ethnic violence, a reference to Burma. In 2018, following the massacre of Rohingya Muslims by the military, Facebook admitted that its platform had been used to “foment division and incite offline violence” relating to the country. Speaking on 60 Minutes, Haugen said: “When we live in an information environment that is full of angry, hateful, polarising content it erodes our civic trust, it erodes our faith in each other, it erodes our ability to want to care for each other. The version of Facebook that exists today is tearing our societies apart and causing ethnic violence around the world.”

Facebook and the Washington riot

The 6 January riot, when crowds of rightwing protesters stormed the Capitol, came after Facebook disbanded the Civic Integrity team of which Haugen was a member. The team, which focused on issues linked to elections around the world, was dispersed to other Facebook units following the US presidential election. “They told us: ‘We’re dissolving Civic Integrity.’ Like, they basically said: ‘Oh good, we made it through the election. There wasn’t riots. We can get rid of Civic Integrity now.’ Fast-forward a couple months, we got the insurrection. And when they got rid of Civic Integrity, it was the moment where I was like, ‘I don’t trust that they’re willing to actually invest what needs to be invested to keep Facebook from being dangerous.’”

The 2018 algorithm change

Facebook changed the algorithm on its news feed – Facebook’s central feature, which supplies users with a customised feed of content such as friends’ photos and news stories – to prioritise content that increased user engagement. Haugen said this made divisive content more prominent.

“One of the consequences of how Facebook is picking out that content today is it is optimising for content that gets engagement, or reaction. But its own research is showing that content that is hateful, that is divisive, that is polarising – it’s easier to inspire people to anger than it is to other emotions.” She added: “Facebook has realised that if they change the algorithm to be safer, people will spend less time on the site, they’ll click on less ads, they’ll make less money.”

Haugen said European political parties had contacted Facebook to say that the news feed change was forcing them to take more extreme political positions in order to win users’ attention. Describing politicians’ concerns, she said: “You are forcing us to take positions that we don’t like, that we know are bad for society. We know if we don’t take those positions, we won’t win in the marketplace of social media.”
In a statement to 60 Minutes, Facebook said: “Every day our teams have to balance protecting the right of billions of people to express themselves openly with the need to keep our platform a safe and positive place. We continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true. If any research had identified an exact solution to these complex challenges, the tech industry, governments, and society would have solved them a long time ago.”

  • Facebook whistleblower to claim company contributed to Capitol attack

Former employee is set to air her claims and reveal her identity in an interview airing Sunday night on CBS 60 Minutes.

Edward Helmore
Sun 3 Oct 2021 13.13 EDT. Last modified on Sun 3 Oct 2021 13.15 EDT

A whistleblower at Facebook will say that thousands of pages of internal company research she turned over to federal regulators prove the social media giant is deceptively claiming effectiveness in its efforts to eradicate hate and misinformation, and that it contributed to the January 6 attack on the Capitol in Washington DC.

The former employee is set to air her claims and reveal her identity in an interview airing Sunday night on CBS 60 Minutes, ahead of a scheduled appearance at a Senate hearing on Tuesday.

In an internal 1,500-word memo titled Our position on Polarization and Election, sent out on Friday, Facebook’s vice-president of global affairs, Nick Clegg, acknowledged that the whistleblower would accuse the company of contributing to the 6 January Capitol riot and called the claims “misleading”. The memo was first reported by the New York Times.

The 6 January insurrection was carried out by a pro-Trump mob that sought to disrupt the certification of Joe Biden’s election as president. The violence and chaos of the attack sent shockwaves throughout the US and the rest of the world, left scores of people injured and led to five deaths.

Clegg, a former UK deputy prime minister, said in his memo that Facebook had “developed industry-leading tools to remove hateful content and reduce the distribution of problematic content. As a result, the prevalence of hate speech on our platform is now down to about 0.05%.”

He said that many things had contributed to America’s divisive politics.

“The rise of polarization has been the subject of swathes of serious academic research in recent years. In truth, there isn’t a great deal of consensus. But what evidence there is simply does not support the idea that Facebook, or social media more generally, is the primary cause of polarization,” Clegg wrote.

The memo came two weeks after Facebook issued a statement on its corporate website hitting back against a series of critical articles in the Wall Street Journal.