More stories

  • Historic bill aimed at keeping California children digitally safe approved

    Legislation will require companies to install guardrails for users under age 18 and default to higher privacy settings.

    California lawmakers passed first-of-its-kind legislation on Monday designed to improve online safety and privacy protections for children.

    The bill, the California Age-Appropriate Design Code Act, will require firms such as TikTok, Instagram and YouTube to install guardrails for users under the age of 18, including defaulting to higher privacy settings for minors and refraining from collecting location data for those users.

    It also requires companies to analyze their algorithms and products to determine how they may affect young users, assessing whether they are designed to be addictive or to cause additional harm to children.

    Children’s safety advocates have applauded the bill, which passed in a vote of 33 to 0, saying similar federal legislation is needed to protect young users. The bill is “a huge step forward toward creating the internet that children and families deserve”, said Josh Golin, executive director at the advocacy group Fairplay.

    “For far too long, tech companies have treated their egregious privacy and safety issues as a PR problem to be addressed only through vague promises, obfuscations, and delays,” he said. “Now, tech platforms will be required to prioritize young Californians’ interests and wellbeing ahead of reckless growth and shareholder dividends.”

    More details to come …

  • A Journey Into Misinformation on Social Media

    “Fake news” has gone from a hot buzzword popularized during the 2016 presidential campaign to an ever-present phenomenon known more formally as misinformation or disinformation.

    Whatever you call it, sowing F.U.D. — fear, uncertainty and doubt — is now a full-time and often lucrative occupation for the malign foreign actors and even ordinary U.S. citizens who try to influence American politics by publishing information they know to be false.

    Several of my colleagues here at The New York Times track the trends and shifting tactics of these fraudsters on their daily beats. So I exchanged messages this week with Sheera Frenkel, Tiffany Hsu and Stuart A. Thompson, all three of whom spend their days swimming in the muck brewed by fake news purveyors here and abroad.

    Our conversation, lightly edited for length and clarity:

    This is a political newsletter, so let me ask my first question this way: What are you seeing out there that is new during this election cycle, in terms of tactics or topics?

    Sheera Frenkel: I’d say it’s the way misinformation has shifted slightly, in that you don’t have the same type of superspreaders on platforms like Twitter and Facebook that you did in the 2020 election cycle. Instead, you have lots of smaller-scale accounts spreading misinformation across a dozen or more platforms. It is more pervasive and more deeply entrenched than in previous elections.

    The most popular topics are largely rehashes of what was spread in the 2020 election cycle. There are a lot of false claims about voter fraud that we first saw made as early as 2016 and 2018. Newspapers, including The New York Times, have debunked many of those claims. That doesn’t seem to stop bad actors from spreading them or people from believing them.

    Then there are new claims, or themes, that are being spread by more fringe groups and extremist movements that we have started to track.

    Tiffany Hsu: Sheera first noticed a while back that there was a lot of chatter about “civil war.” And, quickly, we started to see it everywhere — this strikingly aggressive rhetoric that intensified after the F.B.I. searched Mar-a-Lago and with the passage of a bill that will give more resources to the I.R.S.

    For example, after the F.B.I. search, someone said on Truth Social, the social media platform started by Trump, that “sometimes clearing out dangerous vermin requires a modicum of violence, unfortunately.”

    We have seen a fair amount of “lock and load” chatter. But there is also pushback on the right, with people claiming without evidence that federal law enforcement or the Democrats are planting violent language to frame conservative patriots as extremists and insurrectionists.

    Stuart A. Thompson: I’m always surprised by how much organization is happening around misinformation. It’s not just family members sharing fake news on Facebook anymore. There’s a lot of money sloshing around. There are lots of very well-organized groups that are trying to turn the attention over voter fraud and other conspiracy theories into personal income and political results. It’s a very organized machine at this point, after two years of organizing around the 2020 election.

    This feels different from previous moments when disinformation seemed to take hold in the country. It’s not just a fleeting interest spurred by a few partisan voices. It’s an entire community and social network and pastime for millions of people.

    Sheera, you’ve covered Silicon Valley for years. How much progress would you say the big social media players — Facebook/Meta, Twitter and Google, which owns YouTube — have made in tackling the problems that arose during the 2016 election? What’s working and what’s not?

    Sheera: When we talk about 2016, we are largely talking about foreign election interference. In that case, Russia tried to interfere with U.S. elections by using social media platforms to sow divisions among Americans.

    Today, the problem of foreign election interference hasn’t been solved, but it is nowhere near the scale it once was. Companies like Meta, which owns Facebook, and Twitter announce regular takedowns of networks run by Russia, Iran and China aiming to spread disinformation or influence people online. Millions have been spent on security teams at those companies to make sure they are removing foreign actors who spread disinformation.

    And while it is not a done deal (bad actors are always innovating!), they’ve made a huge amount of progress in taking down these networks. This week, they even announced for the first time that they had removed a foreign influence operation promoting U.S. interests abroad.

    [Image: A cutout of Mark Zuckerberg, the chief executive of Meta, dressed as the “QAnon Shaman”, and cutouts of others involved in the Capitol riot, before a House hearing last year.]

    What has been harder is deciding what to do about Americans spreading misinformation to other Americans, and what to do with fringe political movements and conspiracies that continue to spread under the banner of free speech. Many of these social media companies have ended up exactly in the position they hoped to avoid — making one-off decisions on when to remove movements like the QAnon conspiracy group or voter fraud misinformation that begins to go viral.

  • ‘This you?’: the seven letters exposing rightwing hypocrisy

    As Biden eases student loan debt for millions, a simple phrase is puncturing criticism from conservatives like Marjorie Taylor Greene.

    Conservatives are frothing at the mouth over Joe Biden’s decision to forgive $10,000 in student debt for millions, railing against what they call “student loan socialism”. But their carefully crafted tweets have been undermined over and over again with two words: “This you?”

    Were there ever seven letters more powerful? On Twitter, the phrase is an instant marker of hypocrisy, cutting down the mighty from politicians to celebrities to brands. It typically comes as a reply to an opinionated tweet, accompanied by a screenshot of an earlier remark from the same person endorsing the opposite point of view.

    Now Biden’s debt cancellation has given the phrase new life: “This you?” is rolling through Twitter like a bowling ball, toppling critic after critic as it nullifies their claims. The source of many of the “receipts”, in this case, is the public record of those who had their Paycheck Protection Program (PPP) loans – the federal funds intended to keep businesses afloat early in the pandemic – forgiven.

    The conservative advocacy group PragerU proclaimed: “It’s not complicated. Bailing out irresponsible behavior will spur more irresponsible behavior.” “This you?” asked @kaoticleftist, showing hundreds of thousands of dollars in forgiven PPP funds.

    “Ok it began as a joke now it’s on the threshold of turning into a second job 🤦‍♀️” — rayne (@trayne_wreck), August 25, 2022

    The rightwing Daily Caller published a piece headlined “Biden debt forgiveness could send tuition through the roof”, prompting another Twitter user, @coreyastewart, to post a screenshot of the PPP funds that organization reportedly had forgiven.

    “Student loan forgiveness sounds really nice to illegal immigrants, people with no life experience, people who don’t have families yet, and people who use preferred pronouns,” wrote the conservative commentator Steven Crowder, earning a host of “This you?” replies – with screenshots highlighting more than $71,000 in loan forgiveness for his company.

    Those closer to the seats of power also received helpful feedback. The Iowa senator Chuck Grassley criticized Biden’s plan, saying it would “fuel further inflation hurting those who can least afford it UNFAIR”. “This you?” asked a candidate for local office, pointing to Grassley’s application for a federal farm bailout.

    “This you?” — Kimberly Graham for Polk County Attorney (@KimberlyforIowa), August 24, 2022

    Users also accused the rightwing pundit Ben Shapiro of a double standard, but he denied having received any PPP money and said he’d issued cease-and-desist letters to organizations claiming otherwise – pointing to the messy nature of internet sleuthing.

    But it wasn’t just everyday Twitter users calling out hypocrisy. On Thursday evening, the White House entered the fray. The Georgia congresswoman Marjorie Taylor Greene said it was “completely unfair” for the government to “say your debt is completely forgiven” – after her own loan of more than $180,000 was forgiven, the official White House account noted. It was just one of a series of digs at critics: the Florida congressman Matt Gaetz, the White House said, had more than $482,000 in PPP loans forgiven, while the Pennsylvania congressman Mike Kelly got off the hook for more than $987,000.

    “Congresswoman Marjorie Taylor Greene had $183,504 in PPP loans forgiven.” — The White House (@WhiteHouse), August 25, 2022

    It’s not the first time the meme has been widely deployed to illustrate double standards on a national scale. As brands and celebrities touted their support for the Black Lives Matter movement in 2020, social media quickly exposed many as simply trend followers, juxtaposing their posts with examples of past offensive behavior – marking what Aisha Harris described in the New York Times as “a swift undercutting of performative wokeness”. Users drew attention to an NFL star posting a symbolic black square after hanging out with Donald Trump; the Baltimore police department’s supportive words years after the death of Freddie Gray; and a host of other apparent changes of heart.

    As Harris wrote, there’s power in such a sharable medium. It’s true that, as the Twitter user @trayne_wreck – who collected countless examples of loan-based double standards – writes, highlighting hypocrisy is unlikely to change the minds of those who are called out. But, she says, it could make a difference to those of us reading: “You, who can do something about it, who can build power to make them obsolete. I hope it will resonate with you.”

  • To Fight Election Falsehoods, Social Media Companies Ready a Familiar Playbook

    The election dashboards are back online, the fact-checking teams have reassembled, and warnings about misleading content are cluttering news feeds once again.

    As the United States marches toward another election season, social media companies are steeling themselves for a deluge of political misinformation. Those companies, including TikTok and Facebook, are trumpeting a series of election tools and strategies that look similar to their approaches in previous years.

    Disinformation watchdogs warn that while many of these programs are useful — especially efforts to push credible information in multiple languages — the tactics proved insufficient in previous years and may not be enough to combat the wave of falsehoods pushed this election season.

    Here are the anti-misinformation plans for Facebook, TikTok, Twitter and YouTube.

    Facebook

    Facebook’s approach this year will be “largely consistent with the policies and safeguards” from 2020, Nick Clegg, president of global affairs for Meta, Facebook’s parent company, wrote in a blog post last week.

    Posts rated false or partly false by one of Facebook’s 10 American fact-checking partners will get one of several warning labels, which can force users to click past a banner reading “false information” before they can see the content. In a change from 2020, those labels will be used in a more “targeted and strategic way” for posts discussing the integrity of the midterm elections, Mr. Clegg wrote, after users complained that they were “over-used.”

    [Image: Warning labels prevent users from immediately seeing or sharing false content.]

    Facebook will also expand its efforts to address harassment and threats aimed at election officials and poll workers. Misinformation researchers said the company has taken greater interest in moderating content that could lead to real-world violence since the Jan. 6 attack on the U.S. Capitol.

    Facebook greatly expanded its election team after the 2016 election, to more than 300 people, and Mark Zuckerberg, Facebook’s chief executive, took a personal interest in safeguarding elections.

    But Meta has changed its focus since the 2020 election. Mr. Zuckerberg is now focused instead on building the metaverse and tackling stiff competition from TikTok. The company has dispersed its election team and signaled that it could shut down CrowdTangle, a tool that helps track misinformation on Facebook, some time after the midterms.

    “I think they’ve just come to the conclusion that this is not really a problem that they can tackle at this point,” said Jesse Lehrich, co-founder of Accountable Tech, a nonprofit focused on technology and democracy.

    In a statement, a spokesman from Meta said its elections team had been absorbed into other parts of the company and that more than 40 teams are now focused on the midterms.

    TikTok

    In a blog post announcing its midterm plans, Eric Han, TikTok’s head of U.S. safety, said the company would continue its fact-checking program from 2020, which prevents some videos from being recommended until they are verified by outside fact checkers. It also introduced an election information portal, which provides voter information like how to register, six weeks earlier than it did in 2020.

    Even so, there are already clear signs that misinformation has thrived on the platform throughout the primaries.

    “TikTok is going to be a massive vector for disinformation this cycle,” Mr. Lehrich said, adding that the platform’s short video and audio clips are harder to moderate, enabling “massive amounts of disinformation to go undetected and spread virally.”

    TikTok said its moderation efforts would focus on stopping creators who are paid for posting political content in violation of the company’s rules. TikTok has never allowed paid political posts or political advertising, but the company said that some users circumvented or ignored those policies during the 2020 election. A representative said TikTok would start approaching talent management agencies directly to outline its rules.

    Disinformation watchdogs have criticized the company for a lack of transparency over the origins of its videos and the effectiveness of its moderation practices. Experts have called for more tools to analyze the platform and its content — the kind of access that other companies provide.

    “The consensus is that it’s a five-alarm fire,” said Zeve Sanderson, the founding executive director at New York University’s Center for Social Media and Politics. “We don’t have a good understanding of what’s going on there,” he added.

    Last month, Vanessa Pappas, TikTok’s chief operating officer, said the company would begin sharing some data with “selected researchers” this year.

    Twitter

    In a blog post outlining its plans for the midterm elections, the company said it would reactivate its Civic Integrity Policy — a set of rules adopted in 2018 that the company uses ahead of elections around the world. Under the policy, warning labels, similar to those used by Facebook, will once again be added to false or misleading tweets about elections, voting or election integrity, often pointing users to accurate information or additional context. Tweets that receive the labels are not recommended or distributed by the company’s algorithms. The company can also remove false or misleading tweets entirely.

    Those labels were redesigned last year, resulting in 17 percent more clicks for additional information, the company said. Interactions, like replies and retweets, fell on tweets that used the modified labels.

    [Image: In Twitter’s tests, the redesigned warning labels increased click-through rates for additional context by 17 percent.]

    The strategy reflects Twitter’s attempts to limit false content without always resorting to removing tweets and banning users. The approach may help the company navigate difficult freedom-of-speech issues, which have dogged social media companies as they try to limit the spread of misinformation. Elon Musk, the Tesla executive, made freedom of speech a central criticism during his attempt to buy the company earlier this year.

    YouTube

    Unlike the other major online platforms, YouTube has not released its own election misinformation plan for 2022 and has typically stayed quiet about its strategy.

    “YouTube is nowhere to be found still,” Mr. Sanderson said. “That sort of aligns with their general P.R. strategy, which just seems to be: Don’t say anything and no one will notice.”

    Google, YouTube’s parent company, published a blog post in March emphasizing its efforts to surface authoritative content through the platform’s recommendation engine and remove videos that mislead voters. In another post aimed at creators, Google details how channels can receive “strikes” for sharing certain kinds of misinformation; after three strikes within a 90-day period, the channel is terminated.

    The video streaming giant has played a major role in distributing political misinformation, giving an early home to conspiracy theorists like Alex Jones, who was later banned from the site. It has taken a stronger stance against medical misinformation, stating last September that it would remove all videos and accounts sharing vaccine misinformation. The company ultimately banned some prominent conservative personalities.

    More than 80 fact checkers at independent organizations around the world signed a letter in January warning YouTube that its platform is being “weaponized” to promote voter fraud conspiracy theories and other election misinformation.

    In a statement, Ivy Choi, a YouTube spokeswoman, said its election team had been meeting for months to prepare for the midterms, adding that its recommendation engine is “continuously and prominently surfacing midterms-related content from authoritative news sources and limiting the spread of harmful midterms-related misinformation.”

  • The Storm is Upon Us review: indispensable QAnon history, updated

    Donald Trump welcomed the conspiracy at the White House. Its followers stormed Congress. Big Tech still seems not to care. Mike Rothschild’s book should sound the alarm for us all.

    What is it that has hypnotized so many addled souls who devote themselves to decoding the Delphic clues of the QAnon conspiracy?

    What they think they’re getting is “secret knowledge”, from “Q” and a bunch of other military insiders working for Donald Trump, about “the storm … a ringside seat to the final match” in a “secret war between good and evil” that will end with the slaughter of all “enemies of freedom”. In short, an irresistible mix of “biblical retribution and participatory justice”.

    The bad guys are “Democrats, Hollywood elites, business tycoons, wealthy liberals, the medical establishment, celebrities and the mass media … They’re controlled by Barack Obama” – a Muslim sleeper agent – and Hillary Clinton, “a blood-drinking ghoul who murders everyone in her way … and they’re funded by George Soros and the Rothschild banking family (no relation to the author)”.

    This updated edition of Mike Rothschild’s exhaustive history of the Q movement is more important than ever. Why? Partly because of the crucial role played by so many QAnon devotees in the storming of the Capitol on 6 January 2021, but mostly because Rothschild documents how much of this insanity has penetrated to the heart of the new Republican party, propelled by many of America’s most loathsome individuals, from Ted Cruz and Donald Trump Jr to Alex Jones, Michael Flynn and Roseanne Barr.

    As Rothschild writes of Trump’s first national security adviser, “Flynn’s family even filmed themselves taking the ‘digital soldier oath’ … part of what would become a total enmeshment between members of the Flynn family and QAnon.”

    In the two years before the 2020 presidential election, “nearly 100 Republican candidates declared themselves to be Q Believers” while Trump “retweeted hundreds of Q followers, putting their violent fantasies and bizarre memes into tens of millions of feeds”.

    Asked about a movement which has repackaged most of the oldest and harshest racist and antisemitic conspiracies for a new age, Trump gave his usual coy endorsement of the behavior of America’s most damaged internet addicts. “I don’t know much about the movement,” he mumbled, “other than I understand they like me very much, which I appreciate.”

    In winter 2021, as the Omicron variant sent Covid cases skyrocketing, “QAnon promoters were among the most visible anti-vaccine advocates pushing out lies and conspiracy theories” to “dissuade people from getting vaccinated”. As with so many of QAnon adherents’ positions, the message was “both clear and completely contradicted by the available evidence: they believed the pandemic was over and any mandates related to vaccines or masks were totalitarian control mechanisms that were actually killing people”.

    More than anything else, this is the latest horrific confirmation of what the social psychologist Jonathan Haidt recently described as “the power of social media as a universal solvent, breaking down bonds and weakening institutions everywhere it reached”.

    Like so many other ghastly conspiracies of recent decades, especially the blood libel that the Sandy Hook massacre was a staged event in which no one was actually killed, QAnon was propelled at warp speed by a combination of the incompetence and greed of all the big-tech big shots: Facebook, Twitter, Instagram and YouTube.

    Rothschild describes the usual futile internet game of Whac-A-Mole. Reddit “abruptly banned the 70,000-member r/GreatAwakening board because members had started harassing other users” and had released the personal information “of at least one person they incorrectly claimed to be a mass shooter”. No matter: Q followers just migrated to Twitter and “closed Facebook groups with tens of thousands of members … Just in 2018, Q believers shared Q YouTube videos over 1.4m times, and drove hundreds of thousands of shares to Fox News, Breitbart and the Gateway Pundit”.

    By 2019, “Trump was routinely retweeting QAnon-promoting accounts.” By the 2020 election, “Trump had retweeted hundreds … and was regularly sharing memes created by the movement”.

    When Twitter and Facebook finally started “cracking down on Q iconography in the summer of 2020”, much of the movement just moved on to Instagram. Amazon and Etsy joined in the fun with books and merchandise, and there were even “Q apps on the Google Play Store”.

    Q’s legacy includes what now looks like the permanent deformation of the Republican party. A December 2020 poll by NPR/Ipsos found about a third of Americans believed in a shadowy “deep state” and a robust 23% of Republicans “believed in a pedophilic ring of Satan-worshiping elites”.

    Rothschild ends by asking behavioral experts if there is anything the rest of us can do to help those who have gone far down this wretched rabbit hole. They say the only effective solution is a complete “unplugging” from the internet. Every time I read another book like this one, I’m increasingly inclined to the idea that this could be the only road back to sanity for all of us.
    The Storm is Upon Us: How QAnon Became a Movement, Cult, and Conspiracy Theory of Everything is published in paperback in the US by Melville House

  • On TikTok, Election Misinformation Thrives Ahead of Midterms

    The fast-growing platform’s poor track record during recent voting abroad does not bode well for elections in the U.S., researchers said.In Germany, TikTok accounts impersonated prominent political figures during the country’s last national election. In Colombia, misleading TikTok posts falsely attributed a quotation from one candidate to a cartoon villain and allowed a woman to masquerade as another candidate’s daughter. In the Philippines, TikTok videos amplified sugarcoated myths about the country’s former dictator and helped his son prevail in the country’s presidential race.Now, similar problems have arrived in the United States.Ahead of the midterm elections this fall, TikTok is shaping up to be a primary incubator of baseless and misleading information, in many ways as problematic as Facebook and Twitter, say researchers who track online falsehoods. The same qualities that allow TikTok to fuel viral dance fads — the platform’s enormous reach, the short length of its videos, its powerful but poorly understood recommendation algorithm — can also make inaccurate claims difficult to contain.Baseless conspiracy theories about certain voter fraud in November are widely viewed on TikTok, which globally has more than a billion active users each month. Users cannot search the #StopTheSteal hashtag, but #StopTheSteallll had accumulated nearly a million views until TikTok disabled the hashtag after being contacted by The New York Times. Some videos urged viewers to vote in November while citing debunked rumors raised during the congressional hearings into the Jan. 6, 2021, attack on the Capitol. 
TikTok posts have garnered thousands of views by claiming, without evidence, that predictions of a surge in Covid-19 infections this fall are an attempt to discourage in-person voting.The spread of misinformation has left TikTok struggling with many of the same knotty free speech and moderation issues that Facebook and Twitter have faced, and have addressed with mixed results, for several years.But the challenge may be even more difficult for TikTok to address. Video and audio — the bulk of what is shared on the app — can be far more difficult to moderate than text, especially when they are posted with a tongue-in-cheek tone. TikTok, which is owned by the Chinese tech giant ByteDance, also faces many doubts in Washington about whether its business decisions about data and moderation are influenced by its roots in Beijing.“When you have extremely short videos with extremely limited text content, you just don’t have the space and time for nuanced discussions about politics,” said Kaylee Fagan, a research fellow with the Technology and Social Change Project at the Harvard Kennedy School’s Shorenstein Center. TikTok had barely been introduced in the United States at the time of the 2018 midterm elections and was still largely considered an entertainment app for younger people during the 2020 presidential election. Today, its American user base spends an average of 82 minutes a day on the platform, three times more than on Snapchat or Twitter and twice as long as on Instagram or Facebook, according to a recent report from the app analytics firm Sensor Tower. TikTok is becoming increasingly important as a destination for political content, often produced by influencers.The company insists that it is committed to combating false information. In the second half of 2020, it removed nearly 350,000 videos that included election misinformation, disinformation and manipulated media, according to a report it released last year. 
The platform’s filters kept another 441,000 videos with unsubstantiated claims from being recommended to users, the report said.TikTok says it removed nearly 350,000 videos that included election misinformation, disinformation and manipulated media in the second half of 2020.TikTokThe service blocked so-called deepfake content and coordinated misinformation campaigns ahead of the 2020 election, made it easier for users to report election falsehoods and partnered with 13 fact-checking organizations, including PolitiFact. Researchers like Ms. Fagan said TikTok had worked to shut down problematic search terms, though its filters remain easy to evade with creative spellings.“We take our responsibility to protect the integrity of our platform and elections with utmost seriousness,” TikTok said in a statement. “We continue to invest in our policy, safety and security teams to counter election misinformation.”But the service’s troubling track record during foreign elections — including in France and Australia this year — does not bode well for the United States, experts said.TikTok has been “failing its first real test” in Africa in recent weeks, Odanga Madung, a researcher for the nonprofit Mozilla Foundation, wrote in a report. The app struggled to tamp down on disinformation ahead of last week’s presidential election in Kenya. Mr. Madung cited a post on TikTok that included an altered image of one candidate holding a knife to his neck and wearing a blood-streaked shirt, with a caption that described him as a murderer. The post garnered more than half a million views before it was removed.“Rather than learn from the mistakes of more established platforms like Facebook and Twitter,” Mr. Madun wrote, “TikTok is following in their footsteps.”TikTok has also struggled to contain nonpolitical misinformation in the United States. 
Health-related myths about Covid-19 vaccines and masks run rampant, as do rumors and falsehoods about diets, pediatric conditions and gender-affirming care for transgender people. A video making the bogus claim that the mass shooting at Robb Elementary School in Uvalde, Texas, in May had been staged drew more than 74,000 views before TikTok removed it.

Posts on TikTok about Russia’s war in Ukraine have also been problematic. Even experienced journalists and researchers analyzing posts on the service struggle to separate truth from rumor or fabrication, according to a report published in March by the Shorenstein Center.

TikTok’s design makes it a breeding ground for misinformation, the researchers found. They wrote that videos could easily be manipulated and republished on the platform and showcased alongside stolen or original content. Pseudonyms are common; parody and comedy videos are easily misinterpreted as fact; popularity affects the visibility of comments; and data about publication time and other details are not clearly displayed on the mobile app.

(The Shorenstein Center researchers noted, however, that TikTok is less vulnerable to so-called brigading, in which groups coordinate to make a post spread widely, than platforms like Twitter or Facebook.)

During the first quarter of 2022, more than 60 percent of videos with harmful misinformation were viewed by users before being removed, TikTok said. Last year, a group of behavioral scientists who had worked with TikTok said that an effort to attach warnings to posts with unsubstantiated content had reduced sharing by 24 percent but had limited views by only 5 percent.

Researchers said that misinformation would continue to thrive on TikTok as long as the platform refused to release data about the origins of its videos or share insight into its algorithms.
Last month, TikTok said it would offer some access to a version of its application programming interface, or A.P.I., this year, but it would not say whether it would do so before the midterms.

Filippo Menczer, an informatics and computer science professor and the director of the Observatory on Social Media at Indiana University, said he had proposed research collaborations to TikTok and had been told, “Absolutely not.”

“At least with Facebook and Twitter, there is some level of transparency, but, in the case of TikTok, we have no clue,” he said. “Without resources, without being able to access data, we don’t know who gets suspended, what content gets taken down, whether they act on reports or what the criteria are. It’s completely opaque, and we cannot independently assess anything.”

U.S. lawmakers are also calling for more information about TikTok’s operations, amid renewed concerns that the company’s ties to China could make it a national security threat. The company has said it plans to keep data about its American users separate from its Chinese parent. It has also said its rules have changed since it was accused of censoring posts seen as antithetical to Beijing’s policy goals.

The company declined to say how many human moderators it had working alongside its automated filters. (A TikTok executive told British politicians in 2020 that the company had 10,000 moderators around the world.)
But former moderators have complained about difficult working conditions, saying they were spread thin and sometimes required to review videos that used unfamiliar languages and references — an echo of accusations made by moderators at platforms like Facebook. In current job listings for moderators, TikTok asks for willingness to “review a large number of short videos” and “in continuous succession during each shift.”

In a lawsuit filed in March, Reece Young of Nashville and Ashley Velez of Las Vegas said they had “suffered immense stress and psychological harm” while working for TikTok last year. The former moderators described 12-hour shifts assessing thousands of videos, including conspiracy theories, fringe beliefs, political disinformation and manipulated images of elected officials. Usually, they said, they had less than 25 seconds to evaluate each post and often had to watch multiple videos simultaneously to meet TikTok’s quotas.

In a filing, the company pushed for the case to be dismissed in part because the plaintiffs had been contractors hired by staffing services, and not directly by TikTok. The company also noted the benefits of human oversight when paired with its review algorithms, saying, “The significant social utility to content moderation grossly outweighs any danger to moderators.”

Election season can be especially difficult for moderators, because political TikTok posts tend to come from a diffuse collection of users addressing broad issues, rather than from specific politicians or groups, said Graham Brookie, the senior director of the Digital Forensic Research Lab at the Atlantic Council.

“The bottom line is that all platforms can do more and need to do more for the shared set of facts that social democracy depends on,” Mr. Brookie said. “TikTok, in particular, sticks out because of its size, its really, really rapid growth and the number of outstanding issues about how it makes decisions.”


    How Some Parents Changed Their Politics in the Pandemic

ORINDA, Calif. — They waved signs that read “Defeat the mandates” and “No vaccines.” They chanted “Protect our kids” and “Our kids, our choice.”

Almost everyone in the crowd of more than three dozen was a parent. And as they protested on a recent Friday in the Bay Area suburb of Orinda, Calif., they had the same refrain: They were there for their children.

Most had never been to a political rally before. But after seeing their children isolated and despondent early in the coronavirus pandemic, they despaired, they said. On Facebook, they found other worried parents who sympathized with them. They shared notes and online articles — many of them misleading — about the reopening of schools and the efficacy of vaccines and masks. Soon, those issues crowded out other concerns.

“I wish I’d woken up to this cause sooner,” said one protester, Lisa Longnecker, 54, who has a 17-year-old son. “But I can’t think of a single more important issue. It’s going to decide how I vote.”

Ms. Longnecker and her fellow objectors are part of a potentially destabilizing new movement: parents who joined the anti-vaccine and anti-mask cause during the pandemic, narrowing their political beliefs to a single-minded obsession over those issues. Their thinking hardened even as Covid-19 restrictions and mandates were eased and lifted, cementing in some cases into a skepticism of all vaccines.

Nearly half of Americans oppose masking, and a similar share is against vaccine mandates for schoolchildren, polls show. But what is obscured in those numbers is the intensity with which some parents have embraced these views. While they once described themselves as Republicans or Democrats, they now identify as independents who plan to vote based solely on vaccine policies.

Their transformation injects an unpredictable element into November’s midterm elections.
Fueled by a sense of righteousness after Covid vaccine and mask mandates ended, many of these parents have become increasingly dogmatic, convinced that unless they act, new mandates will be passed after the midterms.

To back up their beliefs, some have organized rallies and disrupted local school board meetings. Others are raising money for anti-mask and anti-vaccine candidates like J.D. Vance, the Republican nominee for Senate in Ohio; Reinette Senum, an independent running for governor in California; and Rob Astorino, a Republican gubernatorial candidate in New York.

In interviews, 27 parents who called themselves anti-vaccine and anti-mask voters described strikingly similar paths to their new views. They said they had experienced alarm about their children during pandemic quarantines. They pushed to reopen schools and craved normalcy. They became angry, blaming lawmakers for the disruption to their children’s lives.

Many congregated in Facebook groups that initially focused on advocating in-person schooling. Those groups soon latched onto other issues, such as anti-mask and anti-vaccine messaging. While some parents left the online groups when schools reopened, others took more extreme positions over time, burrowing into private anti-vaccine channels on messaging apps like WhatsApp and Telegram.

Eventually, some began questioning vaccines for measles and other diseases, where inoculations have long been proven effective. Activists who oppose all vaccines further enticed them by joining online parent groups and posting inaccurate medical studies and falsehoods.

“So many people, but especially young parents, have come to this cause in the last year,” said Janine Pera, 65, a longtime activist against all vaccines who attended the Orinda protest. “It’s been a huge gift to the movement.”

The extent of activity is evident on Facebook.
Since 2020, more than 200 Facebook groups aimed at reopening schools or opposing closings have been created in states including Texas, Florida and Ohio, with more than 300,000 members, according to a review by The New York Times. Another 100 anti-mask Facebook groups dedicated to ending masking in schools have also sprung up in states including New Jersey, New York and Connecticut, some with tens of thousands of members.

Renée DiResta, a research manager at the Stanford Internet Observatory who has studied anti-vaccine activism, said the movement had indoctrinated parents into feeling “like they are part of their community, and that community supports specific candidates or policies.”

Their emergence has confounded Republican and Democratic strategists, who worried they were losing voters to candidates willing to take absolute positions on vaccines and masks.

“A lot of Democrats might think these voters are now unreachable, even if they voted for the party recently,” said Dan Pfeiffer, a Democratic political adviser to former President Barack Obama.

Nathan Leamer, who worked at the Federal Communications Commission during the Trump administration and is now vice president of public affairs at the firm Targeted Victory, said Republican candidates — some of whom have publicly been against Covid vaccine mandates — were better positioned to attract these voters. He pointed to last year’s surprise win in Virginia of Gov. Glenn Youngkin, a Republican, after he gained the support of young parents by invoking their frustration over Covid-driven school closures.

Even so, Mr. Leamer said, these parents were a wild card in November. “The truth is that we don’t really know what these voters will do,” he said.

‘I Found My People’

Natalya Murakhver, 50, once considered herself a Democrat who prioritized environmental and food sustainability issues. Sam James, 41, said he was a Democrat who worried about climate change. Sarah Levy, 37, was an independent who believed in social justice causes.

That was before the pandemic. In 2020, when the coronavirus swept in and led to lockdowns, Ms. Murakhver’s two daughters — Violet, 5, and Clementine, 9 — climbed the walls of the family’s Manhattan apartment, complaining of boredom and crying that they missed their friends.

In Chicago, Mr. James’s two toddlers developed social anxiety after their preschool shuttered, he said. Ms. Levy said her autistic 7-year-old son watched TV for hours and stopped speaking in full sentences.

“We were seeing real trauma happening because programs for children were shut down,” said Ms. Levy, a stay-at-home mother in Miami.

But when they posted about their fears for their children on Facebook, Instagram or Twitter, they were told to stop complaining, they said.
Other parents called them “selfish” and “whiny.” Alienated, they sought other like-minded parents online.

Many found a community on Facebook. New groups, mostly started by parents, were rapidly appearing on the social network, with people pushing for schools to reopen. In California, 62 Facebook groups dedicated to reopening or keeping elementary schools open popped up late last year, according to a review by The Times. There were 21 such groups in Ohio and 37 in New York. Most ranged in size from under 100 members to more than 150,000.

Facebook, which is owned by Meta, declined to comment. The company has removed groups that spread misinformation about Covid-19 and vaccines.

Ms. Murakhver joined some Facebook groups and became particularly active in one called “Keep NYC Schools Open,” which petitioned the city to open schools and keep them open through Covid surges. Last year, she became a group administrator, helping to admit new members and moderating discussions. The group swelled to 2,500 members.

“We had the same cause to rally behind,” Ms. Murakhver said. “We couldn’t stand by and watch our children suffer without their friends and teachers.”

In Chicago, Mr. James joined two Facebook groups pushing Chicago schools to reopen. In Miami, Ms. Levy jumped into national Facebook groups and discussed how to force the federal government to mandate that schools everywhere reopen.

“I found my people,” Ms. Levy said.
While she had been an independent, she said she found common ground with Republicans “who understood that for us, worse than the virus, was having our kid trapped at home and out of school.”

Into the Online Rabbit Hole

The Facebook groups were just the beginning of an online journey that took some parents from more mainstream views of reopening schools toward a single-issue position.

In Chico, Calif., Kim Snyder, 36, who has a 7-year-old daughter and 9-year-old son, said she was a longtime Republican. After her children had to stay home in the pandemic, she helped create a Facebook group in 2020 for Chico parents committed to reopening schools full-time.

At the time, her local schools had partially reopened and children were learning both online and in-person, Ms. Snyder said. But frustration over hybrid learning was mounting, and schools were repeatedly shut down when Covid surged.

By mid-2021, Ms. Snyder’s Facebook group had splintered. Some parents were satisfied with the safety measures and hybrid learning and stopped participating in online discussions, she said. Others were angry that they had not returned to a prepandemic way of living.

Ms. Snyder counted herself in the latter category. She channeled her discontent by attending in-person protests against mask requirements at public schools. At the rallies, she met activists who opposed all types of vaccines. She invited some to join her Facebook group, she said, “because we were all fighting for the same thing. We wanted a return to normalcy.”

The focus of her Facebook group soon morphed from reopening schools to standing against masks in schools. By late last year, more content decrying every vaccine had also started appearing in the Facebook group.

“I started to read more about how masks and vaccines were causing all this damage to our kids,” Ms.
Snyder said.

Scientific advisers to the Centers for Disease Control and Prevention have said the Pfizer-BioNTech and Moderna coronavirus vaccine shots are considered safe for young children. But Ms. Snyder said she became convinced they were wrong. She browsed other Facebook groups too, to meet more parents with similar beliefs.

Activists posted statistics about Covid vaccines in those Facebook groups. Often that information came from the Vaccine Adverse Event Reporting System, a database maintained by the C.D.C. and the Food and Drug Administration, which allows anyone to submit data. The C.D.C. has warned that the database “cannot prove that a vaccine caused a problem.”

Yet in a September 2021 post in Ms. Snyder’s Facebook group, parents pointed to VAERS figures that they said showed thousands of vaccine-induced deaths.

“This is absolutely dangerous!” one parent wrote. “This hasn’t been really tested and is NOT NECESSARY….OMG!”

Another post titled “If you want to really know what is going on, read this” linked to an article that falsely claimed vaccines could leave children sterile. The article was originally posted to a Facebook group named Children’s Health Defense, which supports an organization founded and chaired by the anti-vaccine activist Robert F. Kennedy Jr.

That tipped some parents into repudiating every vaccine, from chickenpox to hepatitis, and against vaccine mandates of any kind. A right to self-determination, so that parents could decide what vaccines their children took, was paramount.

“For the first time, I began to look at the statistics and questioned whether all the vaccines I had previously given my kids made sense,” Ms. Snyder said.

Soon she joined explicitly anti-vaccine Facebook groups that activists linked to, including ones supporting Children’s Health Defense. In those forums, parents seethed at the authorities, arguing they had no right to tell them what to do with their children’s bodies.
Activists posted other links to Twitter and Telegram and urged parents to join them there, warning that Facebook often removed their content for misinformation.

One link led to a Telegram channel run by Denise Aguilar, an anti-vaccine activist in Stockton, Calif. Ms. Aguilar, who speaks about her experiences as a mother on social media and on conservative podcasts, also runs a survivalist organization called Mamalitia, a self-described mom militia. She has more than 100,000 followers across her TikTok and Telegram channels.

Early in the pandemic, Ms. Aguilar posted conspiracy theories about the coronavirus’s origins and questioned the effectiveness of masking. Now her messaging has changed to focus on political activism for the midterms.

In June, Ms. Aguilar encouraged her Telegram followers to vote for Carlos Villapudua, a Democrat running for California State Assembly who voted against a bill that would let children aged 12 and older get vaccinated without parental consent.

“Patriots unite!” wrote Ms. Aguilar, who didn’t respond to a request for comment. “We need to support freedom loving Americans.”

From Talk to Action

By late last year, the talk among parent groups on Facebook, Telegram and Instagram had shifted from vaccine dangers to taking action in the midterms.

Ms. Snyder said her involvement against vaccines would “100 percent determine” whom she voted for in November. She said she was disappointed in Gov. Gavin Newsom of California, a Democrat who encouraged masking and promoted the coronavirus vaccines.

In New York, Ms.
Murakhver, who previously supported candidates who favored strong environmental protection laws, said she would vote based solely on a candidate’s position on mandates for all children’s vaccines.

The Facebook group she helped operate, Keep NYC Schools Open, has shut down. But Ms. Murakhver remains close with activists she met through the group, chatting with them on Signal and WhatsApp. While her children were vaccinated against measles and other diseases when they were babies, she now opposes any mandate that would force other parents to inoculate their children.

“I’m a single-issue voter now, and I can’t see myself supporting Democratic Party candidates unless they show they fought to keep our kids in school and let parents make decisions about masks and vaccines,” she said, adding that she prefers Mr. Astorino for New York governor over the Democratic incumbent, Kathy Hochul.

While states including California have deferred bills requiring Covid-19 vaccines for students attending public schools, many parents said they worried the mandates would be passed after the midterms.

“If we don’t show up and vote, these bills could come back in the future,” Ms. Snyder said.

At the Orinda demonstration in April, more than 50 people gathered outside the office of Steve Glazer, a Democratic state senator, to oppose coronavirus vaccine mandates.

One was Jessica Barsotti, 56, who has two teenagers and was at her first rally. Previously a Democrat, Ms. Barsotti said elected officials had let her family down during the pandemic, and she planned to cast her ballot in November for candidates who were against vaccine mandates.

“If that is Republicans, so be it. If it is independents, fine,” she said. “I’m not looking at their party affiliation but how they fall on this one issue. It’s changed me as a person and as a voter.”