More stories

  • ‘New text, same problems’: inside the fight over child online safety laws

    Sharp divisions between advocates for children’s safety online have emerged as a historic bill has gathered enough votes to pass in the US Senate. Amendments to the bill have appeased some former detractors who now support the legislation; its fiercest critics, however, have become even more entrenched in their demands for changes.

    The Kids Online Safety Act (Kosa), introduced more than two years ago, reached 60 backers in the Senate in mid-February. A number of human rights groups still vehemently oppose the legislation, underscoring ongoing divisions among experts, lawmakers and advocates over how to keep young people safe online.

    “The Kids Online Safety Act is our best chance to address social media’s toxic business model, which has claimed far too many children’s lives and helped spur a mental health crisis,” said Josh Golin, the executive director of the children’s online safety group Fairplay.

    Opponents say alterations to the bill are not enough and that their concerns remain unchanged.

    “A one-size-fits-all approach to kids’ safety won’t keep kids safe,” said Aliya Bhatia, a policy analyst at the Center for Democracy and Technology. “This bill still rests on the premise that there is consensus around the types of content and design features that cause harm. There isn’t, and this belief will limit young people from exercising their agency and accessing the communities they need to online.”

    What is the Kids Online Safety Act?

    Sponsored by the Connecticut Democrat Richard Blumenthal and the Tennessee Republican Marsha Blackburn, Kosa would be the biggest change to American tech legislation in decades. The bill would require platforms like Instagram and TikTok to mitigate online dangers via design changes or opt-outs of algorithm-based recommendations, among other measures.
    Enforcement would demand much more fundamental modifications to social networks than current regulations require.

    When it was first introduced in 2022, Kosa prompted an open letter signed by more than 90 human rights organizations united in strong opposition. The groups warned the bill could be “weaponized” by conservative state attorneys general – who were charged with determining what content is harmful – to censor online resources and information for queer and trans youth or people seeking reproductive healthcare.

    In response to the critiques, Blumenthal amended the bill, notably shifting some enforcement decisions to the Federal Trade Commission rather than state attorneys general. At least seven LGBTQ+ advocacy organizations that previously spoke out against the bill, including Glaad, the Human Rights Campaign and the Trevor Project, dropped their opposition, citing the “considerable changes” to Kosa that “significantly mitigate the risk of it being misused to suppress LGBTQ+ resources or stifle young people’s access to online communities”.

    To the critics who now support Kosa, Blumenthal’s amendments solved the legislation’s major issues. However, the majority of those who signed the initial letter still oppose the bill, including the Center for Democracy and Technology, the Electronic Frontier Foundation, Fight for the Future, and the ACLU.

    “New bill text, same problems,” said Adam Kovacevich, chief executive of the tech industry policy coalition the Chamber of Progress, which is supported by corporate partners including Airbnb, Amazon, Apple and Snap.
    “The changes don’t address a lot of its potential abuses.” Snap and X, formerly Twitter, have publicly supported Kosa.

    Is Kosa overly broad or a net good?

    Kovacevich said the latest changes fail to address two primary concerns with the legislation: that vague language will lead social media platforms to over-moderate in order to limit their liability, and that allowing state attorneys general to enforce the legislation could enable targeted and politicized content restriction, even with the federal government assuming more of the bill’s authority.

    The vague language targeted by groups that still oppose the bill is the “duty of care” provision, which states that social media firms have “a duty to act in the best interests of a minor that uses the platform’s products or services” – a goal subject to an enforcer’s interpretation. The legislation would also require platforms to mitigate harms by creating “safeguards for minors”, but with little direction as to what content would be deemed harmful. Opponents argue the legislation is therefore likely to encourage companies to filter content more aggressively, which could lead to unintended consequences.

    “Rather than protecting children, this could impact access to protected speech, causing a chilling effect for all users and incentivizing companies to filter content on topics that disproportionately impact marginalized communities,” said Prem M Trivedi, policy director at the Open Technology Institute, which opposes Kosa.

    Trivedi said he and other opponents fear that important but charged topics like gun violence and racial justice could be interpreted as having a negative impact on young users, and be filtered out by algorithms. Many have expressed concern that LGBTQ+-related topics would be targeted by conservative regulators, leading to fewer available resources for young users who rely on the internet to connect with their communities.
    Blackburn, the bill’s co-sponsor, has previously stated her intention to “protect minor children from the transgender [sic] in this culture and that influence”.

    An overarching concern among opponents is that the bill is too broad in scope, and that more targeted legislation would achieve similar goals with fewer unintended impacts, said Bhatia.

    “There is a belief that there are these magic content silver bullets that a company can apply, and that what stands between a company applying those tools and not applying those tools is legislation,” she said. “But those of us who study the impact of these content filters still have reservations about the bill.”

    Many with reservations acknowledge that the bill does feature broadly beneficial provisions, said Mohana Mukherjee, visiting faculty at George Washington University, who has studied technology’s impact on teenagers and young adults. She said the bill’s inclusion of a “Kosa council” – a coalition of stakeholders including parents, academic experts, health professionals and young social media users who would advise on how best to implement the legislation – is groundbreaking.

    “It’s absolutely crucial to involve young adults and youth who are facing these problems, and to have their perspective on the legislation,” she said.

    Kosa’s uncertain future

    Kosa is likely to be voted on in the Senate this session, but other legislation targeting online harms threatens its momentum. A group of senators is increasingly pushing a related bill that would ban children under the age of 13 from social media. Its author, Brian Schatz, has requested a panel that would potentially couple the bill with Kosa. Blumenthal, the author of Kosa, has cautioned that such a move could slow the passage of both bills, and spoke out against the markup.

    “We should move forward with the proposals that have the broadest support, but at the same time, have open minds about what may add value,” he said, according to the Washington Post.
    “This process is the art of addition not subtraction often … but we should make sure that we’re not undermining the base of support.”

    The bill’s future in the House is likewise unclear. Other bills with similar purported goals are floating around Congress, including the Invest in Child Safety Act – a bill introduced by the Democratic senator Ron Wyden of Oregon and the representatives Anna G Eshoo and Brian Fitzpatrick – which would invest more than $5bn into investigating online sexual abusers.

    With so much legislation swirling around the floors of Congress, it’s unclear when – or if – a vote will be taken on any of them. But experts agree that Congress has at least begun trying to bolster children’s online safety.

    “This is an emotionally fraught topic – there are urgent online safety issues and awful things that happen to our children at the intersection of the online world and the offline world,” said Trivedi. “In an election year, there are heightened pressures on everyone to demonstrate forward movement on issues like this.”

  • The whistleblower who plunged Facebook into crisis

    After a set of leaks last month that represented the most damaging insight into Facebook’s inner workings in the company’s history, the former employee behind them has come forward. Now Frances Haugen has given evidence to the US Congress – and been praised by senators as a ‘21st century American hero’. Will her testimony accelerate efforts to bring the social media giant to heel?

    On Monday, Facebook and its subsidiaries Instagram and WhatsApp went dark after a router failure. There were thousands of negative headlines, millions of complaints, and more than 3 billion users were forced offline. On Tuesday, the company’s week got significantly worse. Frances Haugen, a former product manager with Facebook, testified before US senators about what she had seen in her two years there – and set out why she had decided to leak a trove of internal documents to the Wall Street Journal. Haugen had revealed herself as the source of the leak a few days earlier. And while the content of the leak – from internal warnings of the harm being done to teenagers by Instagram to the deal Facebook gives celebrities to leave their content unmoderated – had already led to debate about whether the company needed to reform, Haugen’s decision to come forward escalated the pressure on Mark Zuckerberg.

    In this episode, Nosheen Iqbal talks to the Guardian’s global technology editor, Dan Milmo, about what we learned from Haugen’s testimony, and how damaging a week this could be for Facebook. Milmo sets out the challenges facing the company as it seeks to argue that the whistleblower is poorly informed or that her criticism is mistaken. And he reflects on what options politicians and regulators around the world will consider as they look for ways to curb Facebook’s power, and how likely such moves are to succeed.

    After Haugen spoke, Zuckerberg said her claims that the company puts profit over people’s safety were “just not true”. In a blog post, he added: “The argument that we deliberately push content that makes people angry for profit is deeply illogical. We make money from ads, and advertisers consistently tell us they don’t want their ads next to harmful or angry content.” You can read more of Zuckerberg’s defence here. And you can read an analysis of how Haugen’s testimony is likely to affect Congress’s next move here.
    Archive: BBC; YouTube; TikTok; CSPAN; NBC; CBS; CNBC; Vice; CNN