More stories

  • Facebook missed weeks of warning signs over Capitol attack, documents suggest

    Materials provided by Frances Haugen to media outlets shine light on how the company apparently stumbled into 6 January

    Guardian staff and agencies
    Sat 23 Oct 2021 14.22 EDT (first published 12.23 EDT)

    As extremist supporters of Donald Trump stormed the US Capitol on 6 January, battling police and forcing lawmakers into hiding, an insurrection of a different kind was taking place inside the world's largest social media company.

    Thousands of miles away, in California, Facebook engineers were racing to tweak internal controls to slow the spread of misinformation and content likely to incite further violence.

    Emergency actions – some of which were rolled back after the 2020 election – included banning Trump, freezing comments in groups with records of hate speech and filtering out the "Stop the Steal" rallying cry of Trump's campaign to overturn his electoral loss with false claims of widespread fraud. Officials have called it the most secure election in US history.

    Actions also included empowering Facebook content moderators to act more assertively by labeling the US a "temporary high risk location" for political violence.

    At the same time, frustration inside Facebook erupted over what some saw as the company's halting and inconsistent response to rising extremism in the US.

    "Haven't we had enough time to figure out how to manage discourse without enabling violence?" one employee wrote on an internal message board at the height of the 6 January turmoil. "We've been fueling this fire for a long time and we shouldn't be surprised it's now out of control."

    It's a question that still hangs over the company today, as Congress and regulators investigate Facebook's role in the events.

    New internal documents have been provided to a number of media outlets in recent days by the former Facebook employee turned whistleblower Frances Haugen, following her initial disclosures and claims that the platform puts profits before public good, and her testimony to Congress. The outlets, including the New York Times, the Washington Post and NBC, published reports based on those documents, which offer a deeper look into the spread of misinformation and conspiracy theories on the platform, particularly related to the 2020 US presidential election.

    They show that Facebook employees repeatedly flagged concerns before and after the election, when Trump tried to falsely overturn Joe Biden's victory. According to the New York Times, a company data scientist told co-workers a week after the election that 10% of all US views of political content were of posts that falsely claimed the vote was fraudulent. But as workers flagged these issues and urged the company to act, the company failed or struggled to address the problems, the Times reported.

    The internal documents also show Facebook researchers found the platform's recommendation tools repeatedly pushed users toward extremist groups, prompting internal warnings that some managers and executives ignored, NBC News reported. In one striking internal study, a Facebook researcher created a fake profile for "Carol Smith", a conservative female user whose interests included Fox News and Donald Trump. The experiment showed that within two days, Facebook's algorithm was recommending "Carol" join groups dedicated to QAnon, a baseless internet conspiracy theory.

    The documents also provide a rare glimpse into how the company appears to have simply stumbled into the events of 6 January. It quickly became clear that even after years under the microscope for insufficiently policing its platform, the social network had missed how riot participants spent weeks vowing – by posting on Facebook itself – to stop Congress from certifying Joe Biden's election victory.

    This story is based in part on disclosures Haugen made to the Securities and Exchange Commission (SEC), the US agency that handles regulation to protect investors in publicly traded companies, provided to Congress in redacted form by her legal counsel. The redacted versions received by Congress were obtained by a consortium of news organizations, including the Associated Press.

    What Facebook called "Break the Glass" emergency measures put in place on 6 January were essentially a toolkit of options designed to stem the spread of dangerous or violent content. The social network had first used the system in the run-up to the bitter 2020 election. As many as 22 of those measures were rolled back at some point after the election, according to an internal spreadsheet analyzing the company's response.

    "As soon as the election was over, they turned them back off or they changed the settings back to what they were before, to prioritize growth over safety," Haugen has said.

    An internal Facebook report following 6 January, previously reported by BuzzFeed, faulted the company for a "piecemeal" approach to the rapid growth of "Stop the Steal" pages.

    Facebook said the situation was more nuanced and that it carefully calibrates its controls to react quickly to spikes in hateful and violent content. The company said it was not responsible for the actions of the rioters – and that having stricter controls in place prior to that day wouldn't have helped. Facebook's decisions to phase certain safety measures in or out had taken into account signals from the Facebook platform as well as information from law enforcement, said a spokesperson, Dani Lever: "When those signals changed, so did the measures." Lever added that some of the measures had stayed in place well into February and others remained active today.

    Meanwhile, Facebook is facing mounting pressure after a new whistleblower on Friday accused it of knowingly hosting hate speech and illegal activity. Allegations by the new whistleblower, who spoke to the Washington Post, were reportedly contained in a complaint to the SEC. In the complaint, which echoes Haugen's disclosures, the former employee detailed how Facebook officials frequently declined to enforce safety rules for fear of angering Donald Trump and his allies or offsetting the company's huge growth.

    In one alleged incident, Tucker Bounds, a Facebook communications official, dismissed concerns about the platform's role in 2016 election manipulation. "It will be a flash in the pan," Bounds said, according to the affidavit, as reported by the Post. "Some legislators will get pissy. And then in a few weeks they will move on to something else. Meanwhile, we are printing money in the basement, and we are fine."

  • Lawmakers seek to rein in big tech with bills aimed at competition and liability

    One bill would prevent platforms from giving preference to their own products; the other would remove Section 230 protections

    Kari Paul
    Thu 14 Oct 2021 17.58 EDT (last modified 18.37 EDT)

    US lawmakers announced two major new proposals seeking to rein in the power of big tech, days after revelations from a former Facebook employee spotlighted the company's sweeping impact.

    The first bill, proposed by a group of senators headed by Democrat Amy Klobuchar and Republican Chuck Grassley, would bar big tech platforms from favoring their own products and services. The second bill, put forward by House Democrats, would remove some protections afforded tech companies by Section 230, a portion of the Communications Decency Act that exempts them from liability for what is posted on their platforms.

    The proposals are part of a slew of bills from this Congress aimed at reining in tech firms, including industry leaders Facebook and Apple. Thus far, none have become law, although one, a broader measure to increase resources for antitrust enforcers, has passed the Senate.

    Klobuchar and Grassley's bill would specifically prohibit platforms from requiring companies operating on their sites to purchase the platform's goods or services, and would ban them from biasing search results to favor the platform. It is a companion bill to a measure that has passed the House judiciary committee, and it must pass both houses of Congress to become law.

    The bill would address concerns that tech giants have become gatekeepers, giving preference to their own products, blocking rivals from accessing markets and imposing onerous fees and terms on smaller businesses.

    "As dominant digital platforms – some of the biggest companies our world has ever seen – increasingly give preference to their own products and services, we must put policies in place to ensure small businesses and entrepreneurs still have the opportunity to succeed in the digital marketplace," Klobuchar said in a statement.

    The legislation comes as Congress is increasingly working on a bipartisan basis to address antitrust issues in big tech. Traditionally lawmakers have differed in their critiques of the industry – with Democrats claiming the companies are monopolies and Republicans criticizing what they perceive as an anti-conservative bias on the platforms.

    "This bill is welcome proof that the momentum in Congress to tackle big tech's monopoly power is rapidly gaining force on both sides of the aisle," read a statement from the Institute for Local Self-Reliance, a non-profit that fights against corporate monopolies. "We agree with their view that the tech giants cannot continue to abuse their power at the expense of competition, innovation, and entrepreneurship."

    Meanwhile, the debate around Section 230 has continued. Its impact has long been a hot-button issue but became increasingly so during Donald Trump's presidency. The bill House Democrats introduced on Thursday would create an amendment to Section 230 that would hold companies responsible for the personalized algorithmic amplification of problematic content. In other words, it seeks to simply "turn off" the Facebook news algorithm, said Evan Greer, director of the digital rights group Fight for the Future.

    The law would apply only to large tech firms with 5 million or more monthly users, but could still have negative consequences for firms large enough to qualify that have fewer resources than Facebook. "Facebook would likely be able to survive this, but smaller competitors wouldn't," Greer said. "That's why Facebook has repeatedly called for changes to Section 230 – they know it will only serve to solidify their dominance and monopoly power.

    "This bill is well-intentioned, but it's a total mess," added Greer. "Democrats are playing right into Facebook's hands by proposing tweaks to Section 230 instead of thoughtful policies that will actually reduce the harm done by surveillance-driven algorithms." Lawmakers are "failing to understand how these policies will actually play out in the real world", she added.

    Earlier this year more than 70 civil rights, LGBTQ+, sex worker advocacy and human rights organizations sent a letter cautioning lawmakers against changing Section 230. They instead prefer to rein in Facebook and other platforms by attacking the data harvesting and surveillance practices they rely on as a business model. Democrats should instead "pass a privacy bill strong enough to kill Facebook's surveillance driven business model while leaving the democratizing power of the internet intact", Greer said.

    Reuters contributed to this report

  • How to blow the whistle on Facebook – from someone who already did

    This April, Sophie Zhang told the world about her employer's failure to combat deception and abuse. Her advice? No screenshots, lawyer up – and trust yourself

    Sophie Zhang
    Mon 11 Oct 2021 01.00 EDT (last modified 12.05 EDT)

    Two years ago, I did something I almost never do: I put on a dress. Then I dropped my phone and other electronics off at the home of friends who had agreed to tell anyone who asked that I was at their place the entire time, and headed to the Oakland offices of the Guardian for my first meeting with a reporter.

    Leaving my electronics was a safeguard against possible tracking by my then employer, Facebook. The dress was an additional layer of alibi: I theorized that if anyone from work saw me and could contradict my first alibi, they might conclude that my unusual behavior was evidence of nothing more than an affair.

    That first, anxious meeting was the beginning of a lengthy process that would culminate in my decision – 18 months later and after I had been fired by Facebook – to step forward and blow the whistle on Facebook's failure to combat deception and abuse by powerful politicians around the world.

    This month, another Facebook whistleblower, Frances Haugen, has come forward. After providing the Wall Street Journal and US government with thousands of internal documents showing Facebook's internal research into its own harms, Haugen testified to Congress. Her testimony and revelations have captured the imaginations of the public, the press and Capitol Hill, and raised hopes that regulators might finally act to rein in Facebook's immense power.

    During her testimony, Haugen encouraged "more tech employees to come forward through legitimate channels … to make sure that the public has the information they need". But whistleblowing is never straightforward. When I was deciding whether to speak out, I struggled to find guidance on the best way to go about it. If you're in that position now, here's my best advice on how to navigate the complicated path to becoming a whistleblower.

    Decide what you're willing to risk

    Whistleblowing is not for everyone; I knew Facebook employees on H-1B visas who considered speaking out, but could not risk being fired and deported. Speaking out internally or anonymously to the press will risk your current job. Speaking out publicly will risk your future career. Providing documentation will risk lawsuits and legal action. These risks can be minimized, but not eliminated.

    Decide whether you're going to go public

    The first question to ask yourself is whether your aim is to change the minds of employees and leadership, or to pressure them via public opinion. Employees will be more sympathetic to the company than the general public; an internal post denouncing the chief executive as intentionally undermining democracy might alienate your co-workers, but can move the window of discussion. Before I went public, I used Facebook's internal message board, Workplace, to try to effect change. It was only when this failed that I decided to go to the press.

    If you do make an internal post, remember that leaks are inevitable, and consider how your words can be misunderstood. When I wrote my departure memo, I naively thought it would not leak, and wrote for an audience of insiders. One consequence was that a stray comment about "actors" (referring to people who take certain actions) resulted in incorrect reports in the Indian press that Bollywood stars were interfering with elections.

    Exhaust your internal options

    Don't let the company claim it was ignorant of the situation and issues you're speaking out about, or allege that you failed to speak to the right people. Even if you expect complaints to be ignored, consider making them nevertheless – in writing – so you can point to them later.

    Decide what you're going to say

    Speaking out about an area of personal expertise gives you credibility and insight, but narrows your scope to areas that may not arouse as much public interest. Speaking out about topics beyond your normal work will require you to conduct research and seek out internal documents you wouldn't normally look at – creating a digital trail that could expose you – but could make your story more compelling. Be careful that what you say is correct and that you aren't making assumptions based on personal bias or opinions; would-be "whistleblowers" have come forward with unconvincing revelations based on preconceptions.

    Expect to face company criticism regardless of what you speak on – Facebook dismissed Haugen for speaking about issues beyond her scope, and attempted the same for me even though I spoke only about topics I personally worked on.

    Whatever you speak about, consider what your end goal is and whether your revelations will accomplish it. Risking your career to help a tech reporter live-tweet a company meeting may not be the risk/reward ratio you had in mind.

    No screenshots, no work devices

    Never contact outside parties (such as reporters or lawyers) via work devices; only do so via end-to-end encrypted systems like Signal on your personal devices. To securely copy work documents, use a personal device to take photos of the screen; do not take screenshots. If you're accessing many documents, ensure that you have a plausible alibi. If leaking while employed, ensure that you're only sharing documents that many employees have recently accessed. And if you intend to go public, insulate yourself beforehand by removing personal information online with a service like DeleteMe.

    Save up for a year without pay

    If you intend to go public with documentation, ensure that you're able to survive off savings for at least a year. Most would-be whistleblowers I've spoken to are concerned that they won't be able to find another job. I worried about this too, but I've actually received many recruiting attempts – an experience also reported by others. Nevertheless, talking to the press, civil society and government officials is time consuming and will probably prevent you from working for some time. You will likely also incur additional expenses on lawyers and PR advice. Some whistleblowers choose to solicit donations, but this might undermine your credibility.

    Lawyer up

    If you intend to go public with documentation and details, speak with a lawyer first. Organizations such as Whistleblower Aid and the Signals Network can help connect you with someone. By speaking out, you face the risk of lawsuits for breach of contract, or even prosecution in the United States for theft of trade secrets. These risks are unlikely, but the possibility exists nevertheless.

    Make contact with an outsider

    Most tech reporters have a Signal address in their Twitter profile. I've heard many employees voice concern that reporters will not protect anonymity – I personally have few concerns in that regard, although I would advise working with an established news outlet. When you do speak with a reporter, be clear up front about whether you're speaking on the record (you can be quoted by name), unattributed (you can be quoted but not by name), or off the record (none of this can be published). If you intend to speak with the government, your lawyer should be able to help.

    It's your decision – trust yourself

    In the end, whistleblowing is an intensely personal decision that very few will ever consider. It's easy to criticize from the outside, but many feel differently when they face those risks themselves. Every time I advise others, I remind them that I can provide advice but the ultimate decision is their own. I am glad that I chose to come forward, and that Frances did as well, but no one is obligated to torch their career in pursuit of justice.

  • Supreme court, Facebook, Fed: three horsemen of democracy's apocalypse | Robert Reich

    Opinion: These unaccountable bodies hold increasing sway over US government. Their abuses of power affect us all

    Robert Reich
    Sun 10 Oct 2021 01.00 EDT (last modified 05.22 EDT)

    The week's news has been dominated by the supreme court, whose term began on Monday; the Federal Reserve, and whether it will start responding to inflation by raising interest rates; and Facebook, which a whistleblower claimed intentionally seeks to enrage and divide Americans in order to generate engagement and ad revenue.

    The common thread is the growing influence of these three power centers over our lives, even as they become less accountable to us. As such, they present a fundamental challenge to democracy.

    Start with the supreme court. What's the underlying issue? Don't for a moment believe the supreme court bases its decisions on neutral, objective criteria. I've argued before it and seen up close that justices have particular and differing ideas about what's good for the country. So it matters who they are and how they got there.

    A majority of the nine justices – all appointed for life – were put there by George W Bush and Donald Trump, presidents who lost the popular vote. Three were installed by Trump, a president who instigated a coup. Yet they are about to revolutionize American life in ways most Americans don't want.

    This new court seems ready to overrule Roe v Wade, the 1973 ruling that anchored reproductive rights in the 14th amendment; declare a 108-year-old New York law against carrying firearms unconstitutional; and strip federal bodies such as the Environmental Protection Agency of the power to regulate private business. And much more.

    Only 40% of the public approves of the court's performance, a new low. If the justices rule in the ways anticipated, that number will drop further. If so, expect renewed efforts to expand the court and limit the terms of its members.

    What about the Fed? Behind the recent stories about whether the Fed should act to tame inflation is the reality that its power to set short-term interest rates and regulate the financial sector is virtually unchecked. And here too there are no neutral, objective criteria. Some believe the Fed's priority should be fighting inflation. Others believe it should be full employment. So, like the supreme court, it matters who runs it.

    Presidents appoint Fed chairs for four-year terms but tend to stick with them longer for fear of rattling Wall Street, which wants stability and fat profits. (Alan Greenspan, a Reagan appointee, lasted almost 20 years, surviving two Bushes and Bill Clinton, who didn't dare remove him.)

    The term of Jerome Powell, the current Fed chair, who was appointed by Trump, is up in February. Biden will probably renominate him to appease the Street, although it's not a sure thing. Powell has kept interest rates near zero, which is appropriate for an economy still suffering the ravages of the pandemic. But Powell has also allowed the Street to resume several old risky practices, prompting the Massachusetts Democratic senator Elizabeth Warren to tell him at a recent hearing that "renominating you means gambling that, for the next five years, a Republican majority at the Federal Reserve, with a Republican chair who has regularly voted to deregulate Wall Street, won't drive this economy over a financial cliff again."

    Finally, what's behind the controversy over Facebook? Facebook and three other hi-tech behemoths (Amazon, Google and Apple) are taking on roles that once belonged to governments, from cybersecurity to exploring outer space, yet they too are unaccountable. Their decisions about which demagogues are allowed to communicate with the public and what lies they are allowed to spew have profound consequences for whether democracy or authoritarianism prevails. In January, Mark Zuckerberg apparently deferred to Nick Clegg, the former British deputy prime minister who is now a vice-president of Facebook, on whether to allow Trump back on the platform.

    Worst of all, they're sowing hate. As Frances Haugen, a former data scientist at Facebook, revealed this week, Facebook's algorithm is designed to choose content that will make users angry, because anger generates the most engagement – and user engagement turns into ad dollars. The same is likely true of the algorithms used by Google, Amazon and Apple. Such anger has been ricocheting through our society, generating resentment and division.

    Yet these firms have so much power that the government has no idea how to control them. How many times do you think Facebook executives testified before Congress in the last four years? Answer: 30. How many laws has Congress enacted to constrain Facebook during that time? Answer: zero.

    Nor are they accountable to the market. They now make the market. They're not even accountable to themselves. Facebook's oversight board has become a bad joke.

    These three power centers – the supreme court, the Fed and the biggest tech firms – have huge and increasing effects on our lives, yet they are less and less answerable to us.

    Beware. Democracy depends on accountability. Accountability provides checks on power. If abuses of power go unchallenged, those who wield it will only consolidate their power further. It's a vicious cycle that erodes faith in democracy itself.
    Robert Reich, a former US secretary of labor, is professor of public policy at the University of California at Berkeley and the author of Saving Capitalism: For the Many, Not the Few and The Common Good. His new book, The System: Who Rigged It, How We Fix It, is out now. He is a Guardian US columnist. His newsletter is at robertreich.substack.com

  • Facebook whistleblower testimony should prompt new oversight – Schiff

    'I think we need regulation to protect people's private data,' influential Democrat says in wake of Frances Haugen revelations

    Martin Pengelly and Charles Kaiser
    Sat 9 Oct 2021 16.06 EDT (first published 15.36 EDT)

    Testimony in Congress this week by the whistleblower Frances Haugen should prompt action to implement meaningful oversight of Facebook and other tech giants, the influential California Democrat Adam Schiff told the Guardian in an interview to be published on Sunday.

    "I think we need regulation to protect people's private data," the chair of the House intelligence committee said. "I think we need to narrow the scope of the safe harbour these companies enjoy if they don't moderate their contents and continue to amplify anger and hate. I think we need to insist on a vehicle for more transparency so we understand the data better."

    Haugen, 37, was the source for recent Wall Street Journal reporting on misinformation spread by Facebook and Instagram, the photo-sharing platform Facebook owns. She left Facebook in May this year, but her revelations have left the tech giant facing its toughest questions since the Cambridge Analytica user privacy scandal.

    At a Senate hearing on Tuesday, Haugen shared internal Facebook reports and argued that the social media giant puts "astronomical profits before people", harming children and destabilising democracy via the sharing of inaccurate and divisive content. Haugen likened the appeal of Instagram to tobacco, telling senators: "It's just like cigarettes … teenagers don't have good self-regulation."

    Richard Blumenthal, a Democrat from Connecticut, said Haugen's testimony might represent a "big tobacco" moment for the social media companies – a reference to the oversight imposed on tobacco firms despite congressional testimony from executives, whose companies knew their product was harmful, that it was not.

    The founder and head of Facebook, Mark Zuckerberg, has resisted proposals to overhaul the US internet regulatory framework, which is widely considered to be woefully out of date. He responded to Haugen's testimony by saying the "idea that we prioritise profit over safety and wellbeing" was "just not true". "The argument that we deliberately push content that makes people angry for profit is deeply illogical," he said. "We make money from ads, and advertisers consistently tell us they don't want their ads next to harmful or angry content."

    Schiff was speaking to mark publication of a well-received new memoir, Midnight in Washington: How We Almost Lost Our Democracy and Still Could. The Democrat played prominent roles in the Russia investigation and Donald Trump's first impeachment. He now sits on the select committee investigating the deadly attack on the US Capitol on 6 January by Trump supporters seeking to overturn his election defeat – an effort fueled in part by misinformation on social media.

    In his book, Schiff writes about asking representatives of Facebook and two other tech giants, Twitter and YouTube, if their "algorithms were having the effect of balkanising the public and deepening the divisions in our society". Facebook's general counsel at the 2017 hearing, Schiff writes, said: "The data on this is actually quite mixed." "It didn't seem very mixed to me," Schiff says.

    Asked if he thought Haugen's testimony would create enough pressure for Congress to pass new laws regulating social media companies, Schiff told the Guardian: "The answer is yes." However, as an experienced member of a bitterly divided and legislatively sclerotic Congress, he also cautioned against too much optimism among reform proponents. "If you bet against Congress," Schiff said, "you win 90% of the time."

  • The whistleblower who plunged Facebook into crisis

    After a set of leaks last month that represented the most damaging insight into Facebook’s inner workings in the company’s history, the former employee behind them has come forward. Now Frances Haugen has given evidence to the US Congress – and been praised by senators as a ‘21st century American hero’. Will her testimony accelerate efforts to bring the social media giant to heel?


    On Monday, Facebook and its subsidiaries Instagram and WhatsApp went dark after a router failure. There were thousands of negative headlines, millions of complaints, and more than 3 billion users were forced offline. On Tuesday, the company's week got significantly worse. Frances Haugen, a former product manager with Facebook, testified before US senators about what she had seen in her two years there – and set out why she had decided to leak a trove of internal documents to the Wall Street Journal.

    Haugen had revealed herself as the source of the leak a few days earlier. And while the content of the leak – from internal warnings of the harm being done to teenagers by Instagram to the deal Facebook gives celebrities to leave their content unmoderated – had already led to debate about whether the company needed to reform, Haugen's decision to come forward escalated the pressure on Mark Zuckerberg.

    In this episode, Nosheen Iqbal talks to the Guardian's global technology editor, Dan Milmo, about what we learned from Haugen's testimony, and how damaging a week this could be for Facebook. Milmo sets out the challenges facing the company as it seeks to argue that the whistleblower is poorly informed or that her criticism is mistaken. And he reflects on what options politicians and regulators around the world will consider as they look for ways to curb Facebook's power, and how likely such moves are to succeed.

    After Haugen spoke, Zuckerberg said her claims that the company puts profit over people's safety were "just not true". In a blog post, he added: "The argument that we deliberately push content that makes people angry for profit is deeply illogical. We make money from ads, and advertisers consistently tell us they don't want their ads next to harmful or angry content."
    Archive: BBC; YouTube; TikTok; CSPAN; NBC; CBS; CNBC; Vice; CNN


    Facebook whistleblower’s testimony could finally spark action in Congress

    Despite years of hearings, the company has long seemed untouchable. But Frances Haugen appears to have inspired rare bipartisanship.

    Kari Paul — Wed 6 Oct 2021 01.00 EDT

    The testimony of Frances Haugen, a former Facebook employee, is likely to increase pressure on US lawmakers to take concrete legislative action against the formerly untouchable tech company, following years of hearings and circular discussions about big tech’s growing power.

    In a hearing on Tuesday, the whistleblower shared internal Facebook reports with Congress and argued the company puts “astronomical profits before people”, harms children and is destabilizing democracies.

    After years of sparring over the role of tech companies in past American elections, lawmakers from both sides of the aisle on Tuesday appeared to agree on the need for new regulations that would change how Facebook targets users and amplifies content.

    “Frances Haugen’s testimony appears to mark a rare moment of bipartisan consensus that the status quo is no longer acceptable,” said Imran Ahmed, chief executive of the Center for Countering Digital Hate, a non-profit that fights hate speech and misinformation. “This is increasingly becoming a non-political issue and one that has cut through definitively to the mainstream.”

    Throughout the morning, members of Congress leveled questions at Haugen about what specifically could and should be done to address the harms caused by Facebook.

    With 15 years in the industry as an expert in algorithms and design, Haugen offered a number of suggestions – including making news feeds chronological rather than algorithmic, appointing a government body for tech oversight, and requiring more transparency about internal research.

    “I think the time has come for action,” Senator Amy Klobuchar told Haugen. “And I think you are the catalyst for that action.”

    Unlike past hearings, which were frequently derailed by partisan bickering, Tuesday’s questioning largely stuck to the problems posed by Facebook’s opaque algorithmic formulas and the harm they do to children. Such issues can unite Congress, and there is going to be “a lot of bipartisan concern about this today and in future hearings”, said Senator Roger Wicker of Mississippi.

    “The recent revelations about Facebook’s mental health effects on children are indeed disturbing,” he said. “They just show how urgent it is for Congress to act against powerful tech companies, on behalf of children and the broader public.”

    However, activists who have been calling on Congress to enact laws protecting children from the negative effects of social media are skeptical of such promises.

    “The bipartisan anger at Facebook is encouraging and totally justified,” said Jim Steyer, founder and CEO of the children’s protection organization Common Sense. “The next step is to turn that bipartisan anger into bipartisan legislative action before the year is over.”

    Exactly what should be done to regulate Facebook is a matter of debate. Senator Todd Young of Indiana asked Haugen whether she believed breaking up Facebook would solve these issues.

    “I’m actually against breaking up Facebook,” Haugen said. “Oversight and finding collaborative solutions with Congress is going to be key, because these systems are going to continue to exist and be dangerous even if broken up.”

    Many of the laws introduced or discussed thus far in Congress take aim at section 230, the portion of US internet regulation that exempts platforms from legal liability for content generated by their users.

    While some organizations, including Common Sense, are calling for the reform of section 230, other internet freedom advocates have warned that targeting that law could have unintended negative consequences for human rights, activism and freedom of expression.

    “Haugen’s proposal to create a carveout in section 230 around algorithmic amplification would do more harm than good,” said Evan Greer, director of the activist group Fight for the Future. “Your feed would become like Disneyland, where everything in it is sanitized, vetted by lawyers, and paid for by corporations.”

    Following the hearing, Facebook disputed Haugen’s characterizations but said it agreed more regulation was in order. “We agree on one thing. It’s time to begin to create standard rules for the internet,” said Lena Pietsch, Facebook’s director of policy communications, in a statement. “It’s been 25 years since the rules of the internet have been updated, and instead of expecting the industry to make societal decisions that belong to legislators, it is time for Congress to act.”

    Greer argued that Facebook was promoting changes to internet laws so that it could have a hand in crafting legislation that would largely benefit big corporations.

    Other members of Congress have put forward potential paths to regulation that sidestep section 230 reform. Common Sense has called on Congress to pass the Children and Media Research Advancement (Camra) Act, which would authorize the National Institutes of Health to carry out research on the effects of social media on children and teens.

    Advocacy groups have also called on Congress to update the Children’s Online Privacy Protection Act (Coppa), currently the primary mechanism for protecting children online. Proposed changes would stop companies from profiling teens and young people and microtargeting them with ads and content specifically designed to prey on their fears and insecurities.

    “Here’s my message for Mark Zuckerberg: your time of invading our privacy, promoting toxic content and preying on children and teens is over,” said Senator Ed Markey, who authored one such bill, the Kids Act. “Congress will be taking action. We will not allow your company to harm our children and our families and our democracy any longer.”