More stories


    Facebook’s very bad year. No, really, it might be the worst yet

From repeated accusations of fostering misinformation to multiple whistleblowers, the company weathered some battles in 2021.

It’s a now-perennial headline: Facebook has had a very bad year.

Years of mounting pressure from Congress and the public culminated in repeated PR crises, blockbuster whistleblower revelations and pending regulation over the past 12 months.

And while the company’s bottom line has not yet wavered, 2022 is not looking to be any better than 2021 – with more potential privacy and antitrust actions on the horizon.

Here are some of the major battles Facebook has weathered in the past year.

Capitol riots launch a deluge of scandals

Facebook’s year started with allegations that a deadly insurrection at the US Capitol was largely planned on its platform. Regulatory uproar over the incident reverberated for months, leading lawmakers to call the CEO, Mark Zuckerberg, before Congress to answer for his platform’s role in the attack.

In the aftermath, Zuckerberg defended his decision not to take action against Donald Trump, though the former president stoked anger and separatist flames on his personal and campaign accounts. Facebook’s inaction led to a rare public employee walkout, and Zuckerberg later reversed the hands-off approach to Trump. Barring Trump from Facebook platforms sparked backlash once again – this time from Republican lawmakers alleging censorship.

What ensued was a months-long back-and-forth between Facebook and its independent oversight board, with each entity punting the decision of whether to keep Trump off the platform. Ultimately, Facebook decided to extend Trump’s suspension to two years. Critics said this underscored the ineffectiveness of the body.
“What is the point of the oversight board?” asked the Real Oversight Board, an activist group monitoring Facebook, after the non-verdict.

Whistleblowers take on Facebook

The scandal with perhaps the biggest impact on the company this year came in the form of the employee-turned-whistleblower Frances Haugen, who leaked internal documents that exposed some of the inner workings of Facebook and just how much the company knew about the harmful effects its platform was having on users and society.

Haugen’s revelations, first reported by the Wall Street Journal, showed Facebook was aware of many of its grave public health impacts and had the means to mitigate them – but chose not to do so.

For instance, documents show that since at least 2019, Facebook has studied the negative impact Instagram had on teenage girls, yet did little to mitigate the harms and publicly denied that was the case. Those findings in particular led Congress to summon company executives to multiple hearings on the platform and teen users.

Facebook has since paused its plans to launch an Instagram app for kids and introduced new safety measures encouraging users to take breaks if they use the app for long periods of time. In a Senate hearing on 8 December, the Instagram executive Adam Mosseri called on Congress to launch an independent body tasked with regulating social media more comprehensively, sidestepping calls for Instagram to regulate itself.

Haugen also alleged that Facebook’s tweaks to its algorithm, which turned off some safeguards intended to fight misinformation, may have led to the Capitol attack. She also provided information underscoring how few resources Facebook dedicates to moderating non-English-language content.

In response to the Haugen documents, Congress has promised legislation and drafted a handful of new bills to address Facebook’s power.
One controversial measure would target Section 230, a portion of the Communications Decency Act that exempts companies from liability for content posted on their platforms.

Haugen was not the only whistleblower to take on Facebook in 2021. In April, the former Facebook data scientist turned whistleblower Sophie Zhang revealed to the Guardian that Facebook repeatedly allowed world leaders and politicians to use its platform to deceive the public or harass opponents. Zhang has since been called to testify on these findings before parliaments in the UK and India.

Lawmakers around the world are eager to hear from the Facebook whistleblowers. Haugen also testified in the UK regarding the documents she leaked, telling MPs Facebook “prioritizes profit over safety”.

Such testimony is likely to influence impending legislation, including the Online Safety Bill: a proposed act in the UK that would task the communications authority, Ofcom, with regulating content online and require tech firms to protect users from harmful posts or face substantial fines.

Zuckerberg and Cook feud over Apple update

Though Apple has had its fair share of regulatory battles, Facebook did not find an ally in its fellow tech firm while facing down the onslaught of consumer and regulatory pressure that 2021 brought.

The iPhone maker in April launched a new notification system to alert users when and how Facebook was tracking their browsing habits, supposedly as a means to give them more control over their privacy.

Facebook objected to the new policy, arguing Apple was doing so to “self-preference their own services and targeted advertising products”. It said the feature would negatively affect small businesses relying on Facebook to advertise.
Apple pressed on anyway, rolling out the feature in April and promising additional changes in 2022. Preliminary reports suggest Apple is, indeed, profiting from the change, while Google and Facebook have seen advertising profits fall.

Global outage takes out all Facebook products

In early October, just weeks after Haugen’s revelations, things took a sudden turn for the worse when the company faced a global service outage. Perhaps Facebook’s largest and most sustained tech failure in recent history, the glitch left billions of users unable to access Facebook, Instagram or WhatsApp for six hours on 4 and 5 October. Facebook’s share price dropped 4.9% that day, cutting Zuckerberg’s personal wealth by $6bn, according to Bloomberg.

Other threats to Facebook

As Facebook faces continuing calls for accountability, its time as the wunderkind of Silicon Valley has come to a close and it has become a subject of bipartisan contempt. Republicans have repeatedly accused Facebook of bias against conservatives, while liberals have targeted the platform for its monopolistic tendencies and failure to police misinformation.

In July, the Biden administration began to take a harder line with the company over vaccine misinformation – which Joe Biden said was “killing people” and the US surgeon general said was “spreading like wildfire” on the platform. Meanwhile, the appointment of the antitrust thought leader Lina Khan to head the FTC spelled trouble for Facebook. She has been publicly critical of the company and other tech giants in the past, and in August refiled a failed FTC case accusing Facebook of anti-competitive practices.

After a year of struggles, Facebook has thrown something of a Hail Mary: changing its name. The company announced it would now be called Meta, a reference to its new “metaverse” project, which will create a virtual environment where users can spend time. The name change was met with derision and skepticism from critics.
But it remains to be seen whether Facebook, by any other name, will beat the reputation that precedes it.


    Facebook revelations: what is in cache of internal documents?

Roundup of what we have learned after the release of the papers and the whistleblower’s testimony to MPs.

Dan Milmo, Global technology editor
Mon 25 Oct 2021 14.42 EDT
Last modified on Mon 25 Oct 2021 16.04 EDT

Facebook has been at the centre of a wave of damaging revelations after a whistleblower released tens of thousands of internal documents and testified about the company’s inner workings to US senators.

Frances Haugen left Facebook in May with a cache of memos and research that has exposed the inner workings of the company and the impact its platforms have on users. The first stories based on those documents were published by the Wall Street Journal in September.

Haugen gave further evidence about Facebook’s failure to act on harmful content in testimony to US senators on 5 October, in which she accused the company of putting “astronomical profits before people”. She also testified to MPs and peers in the UK on Monday, as a fresh wave of stories based on the documents was published by a consortium of news organisations.

Facebook’s products – the eponymous platform, the Instagram photo-sharing app, Facebook Messenger and the WhatsApp messaging service – are used by 2.8 billion people a day, and the company generated a net income – a US measure of profit – of $29bn (£21bn) last year.

Here is what we have learned from the documents, and from Haugen, since the revelations first broke last month.

Teenage mental health

The most damaging revelations focused on Instagram’s impact on the mental health and wellbeing of teenage girls. One piece of internal research showed that for teenage girls already having “hard moments”, one in three found Instagram made body issues worse.
A further slide shows that one in three people who were finding social media use problematic found Instagram made it worse, with one in four saying it made issues with social comparison worse.

Facebook described reports on the research, by the WSJ in September, as a “mischaracterisation” of its internal work. Nonetheless, the Instagram research has galvanised politicians on both sides of the Atlantic seeking to rein in Facebook.

Violence in developing countries

Haugen has warned that Facebook is fanning ethnic violence in countries including Ethiopia and is not doing enough to stop it. She said that 87% of Facebook’s spending on combating misinformation goes to English-language content, when only 9% of users are English speakers. According to the news site Politico on Monday, just 6% of Arabic-language hate content was detected on Instagram before it made its way on to the platform.

Haugen told Congress on 5 October that Facebook’s use of engagement-based ranking – where the platform decides how to rank a piece of content, and whether to put it in front of users, based on the number of interactions it gets – was endangering lives. “Facebook … knows, they have admitted in public, that engagement-based ranking is dangerous without integrity and security systems, but then not rolled out those integrity and security systems to most of the languages in the world. And that’s what is causing things like ethnic violence in Ethiopia,” she said.

Divisive algorithm changes

In 2018 Facebook changed the way it tailored content for users of its news feed feature, a key part of people’s experience of the platform. The emphasis on boosting “meaningful social interactions” between friends and family meant that the feed leant towards reshared material, which was often misinformed and toxic. “Misinformation, toxicity and violent content are inordinately prevalent among reshares,” said internal research.
Facebook said it had an integrity team that was tackling the problematic content “as efficiently as possible”.

Tackling falsehoods about the US presidential election

The New York Times reported that internal research showed how, at one point after the US presidential election last year, 10% of all US views of political material on Facebook – a very high proportion for the platform – were of posts alleging that Joe Biden’s victory was fraudulent. One internal review criticised attempts to tackle “Stop the Steal” groups spreading claims that the election was rigged. “Enforcement was piecemeal,” said the research. The revelations have reignited concerns about Facebook’s role in the 6 January riots.

Facebook said: “The responsibility for the violence that occurred … lies with those who attacked our Capitol and those who encouraged them.” However, the WSJ has also reported that Facebook’s automated systems were taking down posts generating only an estimated 3-5% of total views of hate speech.

Disgruntled Facebook staff

Within the files disclosed by Haugen are testimonies from dozens of Facebook employees frustrated by the company’s failure either to acknowledge the harms it generates or to properly support efforts to mitigate or prevent those harms. “We are FB, not some naive startup. With the unprecedented resources we have, we should do better,” wrote one employee quoted by Politico in the wake of the 6 January attack on the US Capitol.

“Never forget the day Trump rode down the escalator in 2015, called for a ban on Muslims entering the US, we determined that it violated our policies, and yet we explicitly overrode the policy and didn’t take the video down,” wrote another.
“There is a straight line that can be drawn from that day to today, one of the darkest days in the history of democracy … History will not judge us kindly.”

Facebook is struggling to recruit young users

A section of a complaint filed by Haugen’s lawyers with the US financial watchdog refers to young users in “more developed economies” using Facebook less. This is a problem for a company that relies on advertising for its income, because young users, with unformed spending habits, can be lucrative to marketers. The complaint quotes an internal document stating that Facebook’s daily teenage and young adult (18-24) users have “been in decline since 2012-13” and that “only users 25 and above are increasing their use of Facebook”. Further research reveals that “engagement is declining for teens in most western, and several non-western, countries”.

Haugen said engagement was a key metric for Facebook because it meant users spent longer on the platform, which in turn appealed to advertisers, who targeted users with adverts that accounted for $84bn (£62bn) of the company’s $86bn annual revenue. On Monday, Bloomberg said “time spent” for US teenagers on Facebook was down 16% year on year, and that young adults in the US were also spending 5% less time on the platform.

Facebook is built for divisive content

On Monday the NYT reported an internal memo warning that Facebook’s “core product mechanics”, or its basic workings, had let hate speech and misinformation grow on the platform. The memo added that the basic functions of Facebook were “not neutral”. “We also have compelling evidence that our core product mechanics, such as virality, recommendations and optimising for engagement, are a significant part of why these types of speech flourish on the platform,” said the 2019 memo.

A Facebook spokesperson said: “At the heart of these stories is a premise which is false.
Yes, we are a business and we make profit, but the idea that we do so at the expense of people’s safety or wellbeing misunderstands where our own commercial interests lie. The truth is we have invested $13bn and have over 40,000 people to do one job: keep people safe on Facebook.”

Facebook avoids confrontations with US politicians and rightwing news organisations

A document seen by the Financial Times showed a Facebook employee claiming that Facebook’s public policy team blocked decisions to take down posts “when they see that they could harm powerful political actors”. The document said: “In multiple cases the final judgment about whether a prominent post violates a certain written policy are made by senior executives, sometimes Mark Zuckerberg.” The memo said moves to take down content posted by repeat offenders against Facebook’s guidelines, such as rightwing publishers, were often reversed because the publishers might retaliate.

The wave of stories on Monday was based on disclosures made to the Securities and Exchange Commission – the US financial watchdog – and provided to Congress in redacted form by Haugen’s legal counsel. The redacted versions were obtained by a consortium of news organisations including the NYT, Politico and Bloomberg.


    Facebook boss ‘not willing to protect public from harm’

The Observer

Frances Haugen says the chief executive has not shown any desire to shield users from the consequences of harmful content.

Dan Milmo
Sat 23 Oct 2021 21.02 EDT
Last modified on Sun 24 Oct 2021 04.23 EDT

The Facebook whistleblower whose revelations have tipped the social media giant into crisis has launched a stinging new criticism of Mark Zuckerberg, saying he has not shown any readiness to protect the public from the harm his company is causing.

Frances Haugen told the Observer that Facebook’s founder and chief executive had not displayed a desire to run the company in a way that shields the public from the consequences of harmful content.

Her intervention came as pressure mounted on the near-$1tn (£730bn) business following a fresh wave of revelations based on documents leaked by Haugen, a former Facebook employee. The New York Times reported that workers had repeatedly warned that Facebook was being flooded with false claims about the 2020 presidential election result being fraudulent, and believed the company should have done more to tackle them.

Haugen, who appears before MPs and peers in Westminster on Monday, said Zuckerberg, who controls the business via a majority of its voting shares, has not shown any willingness to protect the public.

“Right now, Mark is unaccountable. He has all the control. He has no oversight, and he has not demonstrated that he is willing to govern the company at the level that is necessary for public safety.”

She added that giving all shareholders an equal say in the running of the company would result in changes at the top. “I believe in shareholder rights and the shareholders, or shareholders minus Mark, have been asking for years for one share, one vote.
And the reason for that is, I am pretty sure the shareholders would choose other leadership if they had an option.”

Haugen, who quit as a Facebook product manager in May, said she had leaked tens of thousands of documents to the Wall Street Journal and to Congress because she had realised that the company would not change otherwise.

She said: “There are great companies that have done major cultural changes. Apple did a major cultural change; Microsoft did a major cultural change. Facebook can change too. They just have to get the will.”

This weekend, a consortium of US news organisations released a fresh wave of stories based on the Haugen documents. The New York Times reported that internal research showed how, at one point after the US presidential election last year, 10% of all US views of political material on Facebook – a very high proportion for Facebook – were of posts falsely alleging that Joe Biden’s victory was fraudulent. One internal review criticised attempts to tackle Stop the Steal groups spreading claims on the platform that the election was rigged. “Enforcement was piecemeal,” said the research.

The revelations have reignited concerns about Facebook’s role in the 6 January riots, in which a mob seeking to overturn the election result stormed the Capitol in Washington. The New York Times added that some of the reporting for the story was based on documents not released by Haugen.

A Facebook spokesperson said: “At the heart of these stories is a premise which is false. Yes, we’re a business and we make profit, but the idea that we do so at the expense of people’s safety or wellbeing misunderstands where our commercial interests lie.
The truth is we’ve invested $13bn and have over 40,000 people to do one job: keep people safe on Facebook.”

Facebook’s vice-president of integrity, Guy Rosen, said the company had put in place multiple measures to protect the public during and after the election, and that “responsibility for the [6 January] insurrection lies with those who broke the law during the attack and those who incited them”.

It was also reported on Friday that a new Facebook whistleblower had come forward and, like Haugen, had filed a complaint with the Securities and Exchange Commission, the US financial regulator, alleging that the company declined to enforce safety rules for fear of angering Donald Trump or impacting Facebook’s growth.

Haugen will testify in person on Monday to the joint committee scrutinising the draft online safety bill, which would impose a duty of care on social media companies to protect users from harmful content and allow the communications regulator, Ofcom, to fine those who breach it. The maximum fine is 10% of global turnover, so in the case of Facebook this could run into billions of pounds. Facebook, whose services also include Instagram and WhatsApp, has 2.8 billion daily users and generated an income last year of $86bn.

As well as issuing detailed rebuttals of Haugen’s revelations, Facebook is reportedly planning a major change that would attempt to put some distance between the company and its main platform. Zuckerberg could announce a rebranding of Facebook’s corporate identity on Thursday, according to a report that said the company is keen to emphasise its future as a player in the “metaverse”, a digital world in which people interact and lead their social and professional lives virtually.

Haugen said Facebook must be compelled by all regulators to be more transparent with the information at its disposal internally, as detailed in her document leaks.
She said one key reform would be to set up a formal structure whereby regulators could demand reports from Facebook on any problem they identify.

“Let’s imagine there was a brand of car that was having five times as many car accidents as other cars. We wouldn’t accept that car company saying, ‘This is really hard, we are trying our best, we are sorry, we are trying to do better in the future.’ We would never accept that as an answer, and we are hearing that from Facebook all the time. There needs to be an avenue where we can escalate a concern and they actually have to give us a response.”


    Facebook missed weeks of warning signs over Capitol attack, documents suggest

Materials provided by Frances Haugen to media outlets shine a light on how the company apparently stumbled into 6 January.

Guardian staff and agencies
Sat 23 Oct 2021 14.22 EDT
First published on Sat 23 Oct 2021 12.23 EDT

As extremist supporters of Donald Trump stormed the US Capitol on 6 January, battling police and forcing lawmakers into hiding, an insurrection of a different kind was taking place inside the world’s largest social media company.

Thousands of miles away, in California, Facebook engineers were racing to tweak internal controls to slow the spread of misinformation and content likely to incite further violence.

Emergency actions – some of which were rolled back after the 2020 election – included banning Trump, freezing comments in groups with records of hate speech and filtering out the “Stop the Steal” rallying cry of Trump’s campaign to overturn his electoral loss by falsely citing widespread fraud.
Officials have called it the most secure election in US history. Actions also included empowering Facebook content moderators to act more assertively by labeling the US a “temporary high risk location” for political violence.

At the same time, frustration inside Facebook erupted over what some saw as the company’s halting and inconsistent response to rising extremism in the US.

“Haven’t we had enough time to figure out how to manage discourse without enabling violence?” one employee wrote on an internal message board at the height of the 6 January turmoil. “We’ve been fueling this fire for a long time and we shouldn’t be surprised it’s now out of control.”

It’s a question that still hangs over the company today, as Congress and regulators investigate Facebook’s role in the events.

New internal documents have been provided to a number of media outlets in recent days by the former Facebook employee turned whistleblower Frances Haugen, following her initial disclosures and claims that the platform puts profits before public good, and her testimony to Congress.

The outlets, including the New York Times, the Washington Post and NBC, published reports based on those documents, which offer a deeper look into the spread of misinformation and conspiracy theories on the platform, particularly related to the 2020 US presidential election.

They show that Facebook employees repeatedly flagged concerns before and after the election, when Trump tried to falsely overturn Joe Biden’s victory. According to the New York Times, a company data scientist told co-workers a week after the election that 10% of all US views of political content were of posts that falsely claimed the vote was fraudulent.
But as workers flagged these issues and urged the company to act, the company failed or struggled to address the problems, the Times reported.

The internal documents also show that Facebook researchers found the platform’s recommendation tools repeatedly pushed users to extremist groups, prompting internal warnings that some managers and executives ignored, NBC News reported.

In one striking internal study, a Facebook researcher created a fake profile for “Carol Smith”, a conservative female user whose interests included Fox News and Donald Trump. The experiment showed that within two days, Facebook’s algorithm was recommending “Carol” join groups dedicated to QAnon, a baseless internet conspiracy theory.

The documents also provide a rare glimpse into how the company appears to have simply stumbled into the events of 6 January. It quickly became clear that even after years under the microscope for insufficiently policing its platform, the social network had missed how riot participants spent weeks vowing – by posting on Facebook itself – to stop Congress from certifying Joe Biden’s election victory.

This story is based in part on disclosures Haugen made to the Securities and Exchange Commission (SEC), the US agency that handles regulation to protect investors in publicly traded companies, which were provided to Congress in redacted form by her legal counsel. The redacted versions received by Congress were obtained by a consortium of news organizations, including the Associated Press.

What Facebook called “Break the Glass” emergency measures put in place on 6 January were essentially a toolkit of options designed to stem the spread of dangerous or violent content.
The social network had first used the system in the run-up to the bitter 2020 election. As many as 22 of those measures were rolled back at some point after the election, according to an internal spreadsheet analyzing the company’s response.

“As soon as the election was over, they turned them back off or they changed the settings back to what they were before, to prioritize growth over safety,” Haugen has said.

An internal Facebook report following 6 January, previously reported by BuzzFeed, faulted the company for a “piecemeal” approach to the rapid growth of “Stop the Steal” pages.

Facebook said the situation was more nuanced and that it carefully calibrates its controls to react quickly to spikes in hateful and violent content. The company said it was not responsible for the actions of the rioters – and that having stricter controls in place prior to that day wouldn’t have helped.

Facebook’s decisions to phase certain safety measures in or out had taken into account signals from the Facebook platform as well as information from law enforcement, said a spokesperson, Dani Lever: “When those signals changed, so did the measures.” Lever added that some of the measures had stayed in place well into February and others remained active today.

Meanwhile, Facebook is facing mounting pressure after a new whistleblower on Friday accused it of knowingly hosting hate speech and illegal activity. Allegations by the new whistleblower, who spoke to the Washington Post, were reportedly contained in a complaint to the SEC. In the complaint, which echoes Haugen’s disclosures, the former employee detailed how Facebook officials frequently declined to enforce safety rules for fear of angering Donald Trump and his allies or harming the company’s huge growth.
In one alleged incident, Tucker Bounds, a Facebook communications official, dismissed concerns about the platform’s role in 2016 election manipulation.

“It will be a flash in the pan,” Bounds said, according to the affidavit, as reported by the Post. “Some legislators will get pissy. And then in a few weeks they will move on to something else. Meanwhile, we are printing money in the basement, and we are fine.”


    Lawmakers seek to rein in big tech with bills aimed at competition and liability

One bill would prevent platforms from giving preference to their own products; the other would remove Section 230 protections.

Kari Paul
Thu 14 Oct 2021 17.58 EDT
Last modified on Thu 14 Oct 2021 18.37 EDT

US lawmakers announced two major new proposals seeking to rein in the power of big tech, days after the revelations from a former Facebook employee spotlighted the company’s sweeping impact.

The first bill, proposed by a group of senators headed by the Democrat Amy Klobuchar and the Republican Chuck Grassley, would bar big tech platforms from favoring their own products and services.

The second bill, put forward by House Democrats, would remove some protections afforded to tech companies by Section 230, a portion of the Communications Decency Act that exempts them from liability for what is posted on their platforms.

The proposals are part of a slew of bills from this Congress aimed at reining in tech firms, including the industry leaders Facebook and Apple. Thus far, none have become law, although one – a broader measure to increase resources for antitrust enforcers – has passed the Senate.

Klobuchar and Grassley’s bill would specifically prohibit platforms from requiring companies operating on their sites to purchase the platform’s goods or services, and ban them from biasing search results to favor the platform.
It is a companion bill to a measure that has passed the House judiciary committee; to become law, it must pass both houses of Congress.

The bill would address concerns that tech giants have become gatekeepers, giving preference to their own products, blocking rivals from accessing markets and imposing onerous fees and terms on smaller businesses.

“As dominant digital platforms – some of the biggest companies our world has ever seen – increasingly give preference to their own products and services, we must put policies in place to ensure small businesses and entrepreneurs still have the opportunity to succeed in the digital marketplace,” Klobuchar said in a statement.

The legislation comes as Congress is increasingly working on a bipartisan basis to address antitrust issues in big tech. Traditionally lawmakers have differed in their critiques of the industry – with Democrats claiming the companies are monopolies and Republicans criticizing what they perceive as an anti-conservative bias on the platforms.

“This bill is welcome proof that the momentum in Congress to tackle big tech’s monopoly power is rapidly gaining force on both sides of the aisle,” read a statement from the Institute for Local Self-Reliance, a non-profit that fights corporate monopolies. “We agree with their view that the tech giants cannot continue to abuse their power at the expense of competition, innovation, and entrepreneurship.”

Meanwhile, the debate around Section 230 – a portion of the Communications Decency Act that protects companies from legal liability for content posted on their platforms – has continued.
Its impact has long been a hot-button issue but became increasingly so during the Donald Trump’s presidency.The bill House Democrats introduced on Thursday would create an amendment in Section 230 that would hold companies responsible for the personalized algorithmic amplification of problematic content.In other words it seeks to simply “turn off” the Facebook news algorithm, said Evan Greer, director of digital rights group Fight For the Future.The law would apply only to large tech firms with 5,000,000 or more monthly users, but could still have negative consequences for firms large enough to qualify but that still have fewer resources than Facebook.“Facebook would likely be able to survive this, but smaller competitors wouldn’t,” Greer said. “That’s why Facebook has repeatedly called for changes to Section 230 – they know it will only serve to solidify their dominance and monopoly power.“This bill is well-intentioned, but it’s a total mess,” added Greer. “Democrats are playing right into Facebook’s hands by proposing tweaks to Section 230 instead of thoughtful policies that will actually reduce the harm done by surveillance-driven algorithms.”Lawmakers are “failing to understand how these policies will actually play out in the real world”, she added.Earlier this year more than 70 civil rights, LGBTQ+, sex worker advocacy and human rights organizations sent a letter cautioning lawmakers against changing Section 230.They instead prefer an approach to reining in Facebook and other platforms by attacking the data harvesting and surveillance practices they rely on as a business model.Democrats should instead “pass a privacy bill strong enough to kill Facebook’s surveillance driven business model while leaving the democratizing power of the internet intact”, Greer said.Reuters contributed to this reportTopicsTechnologyFacebookUS politicsSocial mediaApplenewsReuse this content More


    How to blow the whistle on Facebook – from someone who already did

This April, Sophie Zhang told the world about her employer's failure to combat deception and abuse. Her advice? No screenshots, lawyer up – and trust yourself

Sophie Zhang
Mon 11 Oct 2021 01.00 EDT

Two years ago, I did something I almost never do: I put on a dress. Then I dropped my phone and other electronics off at the home of friends who had agreed to tell anyone who asked that I was at their place the entire time, and headed to the Oakland offices of the Guardian for my first meeting with a reporter.

Leaving my electronics behind was a safeguard against possible tracking by my then employer, Facebook. The dress was an additional layer of alibi: I theorized that if anyone from work saw me and could contradict my first alibi, they might conclude that my unusual behavior was evidence of nothing more than an affair.

That first, anxious meeting was the beginning of a lengthy process that would culminate in my decision – 18 months later, and after I had been fired by Facebook – to step forward and blow the whistle on Facebook's failure to combat deception and abuse by powerful politicians around the world.

This month, another Facebook whistleblower, Frances Haugen, came forward. After providing the Wall Street Journal and the US government with thousands of internal documents showing Facebook's internal research into its own harms, Haugen testified to Congress. Her testimony and revelations have captured the imaginations of the public, the press and Capitol Hill, and raised hopes that regulators might finally act to rein in Facebook's immense power.

During her testimony, Haugen encouraged "more tech employees to come forward through legitimate channels … to make sure that the public has the information they need". But whistleblowing is never straightforward. When I was deciding whether to speak out, I struggled to find guidance on the best way to go about it. If you're in that position now, here's my best advice on how to navigate the complicated path to becoming a whistleblower.

Decide what you're willing to risk

Whistleblowing is not for everyone; I knew Facebook employees on H-1B visas who considered speaking out but could not risk being fired and deported. Speaking out internally or anonymously to the press will risk your current job. Speaking out publicly will risk your future career. Providing documentation will risk lawsuits and legal action. These risks can be minimized, but not eliminated.

Decide whether you're going to go public

The first question to ask yourself is whether your aim is to change the minds of employees and leadership, or to pressure them via public opinion. Employees will be more sympathetic to the company than the general public; an internal post denouncing the chief executive as intentionally undermining democracy might alienate your co-workers, but it can move the window of discussion. Before I went public, I used Facebook's internal message board, Workplace, to try to effect change. Only when this failed did I decide to go to the press.

If you do make an internal post, remember that leaks are inevitable, and consider how your words could be misunderstood. When I wrote my departure memo, I naively thought it would not leak, and wrote for an audience of insiders. One consequence was that a stray comment about "actors" (referring to people who take certain actions) resulted in incorrect reports in the Indian press that Bollywood stars were interfering with elections.

Exhaust your internal options

Don't let the company claim that it was ignorant of the situation and issues you're speaking out about, or allege that you failed to speak to the right people. Even if you expect complaints to be ignored, consider making them nevertheless – in writing – so you can point to them later.

Decide what you're going to say

Speaking out about an area of personal expertise gives you credibility and insight, but narrows your scope to areas that may not arouse much public interest. Speaking out about topics beyond your normal work will require you to conduct research and seek out internal documents you wouldn't normally look at – creating a digital trail that could expose you – but could make your story more compelling. Be careful that what you say is correct and that you aren't making assumptions based on personal bias or opinion; would-be "whistleblowers" have come forward with unconvincing revelations based on preconceptions.

Expect to face company criticism regardless of what you speak on – Facebook dismissed Haugen for speaking about issues beyond her scope, and attempted the same with me even though I spoke only about topics I personally worked on.

Whatever you speak about, consider what your end goal is and whether your revelations will accomplish it. Risking your career to help a tech reporter live-tweet a company meeting may not be the risk/reward ratio you had in mind.

No screenshots, no work devices

Never contact outside parties (such as reporters or lawyers) via work devices; do so only via end-to-end encrypted systems like Signal on your personal devices. To securely copy work documents, use a personal device to take photos of the screen; do not take screenshots. If you're accessing many documents, ensure that you have a plausible alibi. If leaking while employed, ensure that you're only sharing documents that many employees have recently accessed. And if you intend to go public, insulate yourself beforehand by removing personal information online with a service like DeleteMe.

Save up for a year without pay

If you intend to go public with documentation, ensure that you can survive off savings for at least a year. Most would-be whistleblowers I've spoken to are concerned that they won't be able to find another job. I worried about this too, but I've actually received many recruiting attempts – an experience also reported by others. Nevertheless, talking to the press, civil society and government officials is time-consuming and will probably prevent you from working for some time. You will also likely incur additional expenses for lawyers and PR advice. Some whistleblowers choose to solicit donations, but this might undermine your credibility.

Lawyer up

If you intend to go public with documentation and details, speak with a lawyer first. Organizations such as Whistleblower Aid and the Signals Network can help connect you with someone. By speaking out, you face the risk of lawsuits for breach of contract, or even prosecution in the United States for theft of trade secrets. These risks are unlikely, but the possibility exists nevertheless.

Make contact with an outsider

Most tech reporters have a Signal address in their Twitter profile. I've heard many employees voice concern that reporters will not protect anonymity; I personally have few worries in that regard, although I would advise working with an established news outlet. When you do speak with a reporter, be clear up front about whether you're speaking on the record (you can be quoted by name), unattributed (you can be quoted, but not by name), or off the record (none of it can be published). If you intend to speak with the government, your lawyer should be able to help.

It's your decision – trust yourself

In the end, whistleblowing is an intensely personal decision that very few will ever consider. It's easy to criticize from the outside, but many feel differently when they face those risks themselves. Every time I advise others, I remind them that I can provide advice but the ultimate decision is their own. I am glad that I chose to come forward, and that Frances did as well, but no one is obligated to torch their career in pursuit of justice.


    Supreme court, Facebook, Fed: three horsemen of democracy’s apocalypse | Robert Reich

These unaccountable bodies hold increasing sway over US government. Their abuses of power affect us all

Robert Reich
Sun 10 Oct 2021 01.00 EDT

The week's news has been dominated by the supreme court, whose term began on Monday; the Federal Reserve, and whether it will start responding to inflation by raising interest rates; and Facebook, which a whistleblower claimed intentionally seeks to enrage and divide Americans in order to generate engagement and ad revenue.

The common thread is the growing influence of these three power centers over our lives, even as they become less accountable to us. As such, they present a fundamental challenge to democracy.

Start with the supreme court. What's the underlying issue?

Don't for a moment believe the supreme court bases its decisions on neutral, objective criteria. I've argued before it and seen up close that justices have particular and differing ideas about what's good for the country. So it matters who they are and how they got there.

A majority of the nine justices – all appointed for life – were put there by George W Bush and Donald Trump, presidents who lost the popular vote. Three were installed by Trump, a president who instigated a coup. Yet they are about to revolutionize American life in ways most Americans don't want.

This new court seems ready to overrule Roe v Wade, the 1973 ruling that anchored reproductive rights in the 14th amendment; declare a 108-year-old New York law against carrying firearms unconstitutional; and strip federal bodies such as the Environmental Protection Agency of the power to regulate private business. And much more.

Only 40% of the public approves of the court's performance, a new low. If the justices rule in the ways anticipated, that number will drop further. If so, expect renewed efforts to expand the court and limit the terms of its members.

What about the Fed?

Behind the recent stories about whether the Fed should act to tame inflation is the reality that its power to set short-term interest rates and regulate the financial sector is virtually unchecked. And here too there are no neutral, objective criteria. Some believe the Fed's priority should be fighting inflation; others believe it should be full employment. So, like the supreme court, it matters who runs it.

Presidents appoint Fed chairs for four-year terms but tend to stick with them longer for fear of rattling Wall Street, which wants stability and fat profits. (Alan Greenspan, a Reagan appointee, lasted almost 20 years, surviving two Bushes and Bill Clinton, who didn't dare remove him.)

The term of Jerome Powell, the current Fed chair, who was appointed by Trump, is up in February. Biden will probably renominate him to appease the Street, although it's not a sure thing. Powell has kept interest rates near zero, which is appropriate for an economy still suffering the ravages of the pandemic.

But Powell has also allowed the Street to resume several old risky practices, prompting the Massachusetts Democratic senator Elizabeth Warren to tell him at a recent hearing that "renominating you means gambling that, for the next five years, a Republican majority at the Federal Reserve, with a Republican chair who has regularly voted to deregulate Wall Street, won't drive this economy over a financial cliff again".

Finally, what's behind the controversy over Facebook?

Facebook and three other hi-tech behemoths (Amazon, Google and Apple) are taking on roles that once belonged to governments, from cybersecurity to exploring outer space, yet they too are unaccountable.

Their decisions about which demagogues are allowed to communicate with the public, and what lies they are allowed to spew, have profound consequences for whether democracy or authoritarianism prevails. In January, Mark Zuckerberg apparently deferred to Nick Clegg, the former British deputy prime minister who is now a vice-president of Facebook, on whether to allow Trump back on the platform.

Worst of all, they're sowing hate. As Frances Haugen, a former data scientist at Facebook, revealed this week, Facebook's algorithm is designed to choose content that will make users angry, because anger generates the most engagement – and user engagement turns into ad dollars. The same is likely true of the algorithms used by Google, Amazon and Apple. Such anger has been ricocheting through our society, generating resentment and division.

Yet these firms have so much power that the government has no idea how to control them. How many times do you think Facebook executives testified before Congress in the last four years? Answer: 30. How many laws has Congress enacted to constrain Facebook during that time? Answer: zero.

Nor are they accountable to the market. They now make the market. They're not even accountable to themselves. Facebook's oversight board has become a bad joke.

These three power centers – the supreme court, the Fed and the biggest tech firms – have huge and increasing effects on our lives, yet they are less and less answerable to us.

Beware. Democracy depends on accountability. Accountability provides checks on power. If abuses of power go unchallenged, those who wield it will only consolidate their power further. It's a vicious cycle that erodes faith in democracy itself.
    Robert Reich, a former US secretary of labor, is professor of public policy at the University of California at Berkeley and the author of Saving Capitalism: For the Many, Not the Few and The Common Good. His new book, The System: Who Rigged It, How We Fix It, is out now. He is a Guardian US columnist. His newsletter is at robertreich.substack.com


    Facebook whistleblower testimony should prompt new oversight – Schiff

'I think we need regulation to protect people's private data,' influential Democrat says in wake of Frances Haugen revelations

Martin Pengelly and Charles Kaiser
Sat 9 Oct 2021 16.06 EDT

Testimony in Congress this week by the whistleblower Frances Haugen should prompt action to implement meaningful oversight of Facebook and other tech giants, the influential California Democrat Adam Schiff told the Guardian in an interview to be published on Sunday.

"I think we need regulation to protect people's private data," the chair of the House intelligence committee said. "I think we need to narrow the scope of the safe harbour these companies enjoy if they don't moderate their content and continue to amplify anger and hate. I think we need to insist on a vehicle for more transparency so we understand the data better."

Haugen, 37, was the source for recent Wall Street Journal reporting on misinformation spread by Facebook and Instagram, the photo-sharing platform Facebook owns. She left Facebook in May this year, but her revelations have left the tech giant facing its toughest questions since the Cambridge Analytica user-privacy scandal.

At a Senate hearing on Tuesday, Haugen shared internal Facebook reports and argued that the social media giant puts "astronomical profits before people", harming children and destabilising democracy via the sharing of inaccurate and divisive content.

Haugen likened the appeal of Instagram to tobacco, telling senators: "It's just like cigarettes … teenagers don't have good self-regulation."

Richard Blumenthal, a Democrat from Connecticut, said Haugen's testimony might represent a "big tobacco" moment for the social media companies – a reference to the oversight imposed on tobacco firms despite executives testifying in Congress that their product was not harmful, when their companies knew that it was.

The founder and head of Facebook, Mark Zuckerberg, has resisted proposals to overhaul the US internet regulatory framework, which is widely considered woefully out of date. He responded to Haugen's testimony by saying the "idea that we prioritise profit over safety and wellbeing" was "just not true".

"The argument that we deliberately push content that makes people angry for profit is deeply illogical," he said. "We make money from ads, and advertisers consistently tell us they don't want their ads next to harmful or angry content."

Schiff was speaking to mark publication of a well-received new memoir, Midnight in Washington: How We Almost Lost Our Democracy and Still Could. The Democrat played prominent roles in the Russia investigation and Donald Trump's first impeachment. He now sits on the select committee investigating the deadly attack on the US Capitol on 6 January by Trump supporters seeking to overturn his election defeat – an effort fueled in part by misinformation on social media.

In his book, Schiff writes about asking representatives of Facebook and two other tech giants, Twitter and YouTube, whether their "algorithms were having the effect of balkanising the public and deepening the divisions in our society". Facebook's general counsel at the 2017 hearing, Schiff writes, said: "The data on this is actually quite mixed."

"It didn't seem very mixed to me," Schiff says.

Asked if he thought Haugen's testimony would create enough pressure for Congress to pass new laws regulating social media companies, Schiff told the Guardian: "The answer is yes."

However, as an experienced member of a bitterly divided and legislatively sclerotic Congress, he also cautioned against too much optimism among reform proponents. "If you bet against Congress," Schiff said, "you win 90% of the time."