More stories

  • Why the U.S. Is Being Ominously Compared to Hungary and Turkey

    A conversation with Max Fisher, who covers the decline of democracy around the world.

    Friday’s newsletter is a discussion with Max Fisher, an international reporter and columnist for The New York Times who covers conflict, diplomacy and the sweeping sociopolitical changes taking place all over the globe. Max often delves deep into the world of ideas and where they intersect with the real world, from the rise of new social movements to the subject of today’s chat: the decline of democracy in the United States and abroad.

    Here’s our conversation, lightly edited for length and clarity:

    You recently wrote about how democracy is under threat all over the world. What did you find most worrying?

    That democracy is declining more or less everywhere now. Not necessarily in every country but in every region, in rich and poor countries, old and new democracies. And the decline is incremental but steady, which means that the scale of the change isn’t necessarily obvious until you start looking at the data.

    We tend to think of democratic decline as something that happens in big dramatic moments — a coup, a government collapsing, tanks in the streets. But that’s not typically how it happens anymore.

    What happens is more like what has occurred in Venezuela, say, or Turkey or Hungary. Elected leaders rise within a democracy promising to defeat some threat within, and in the process end up slowly tearing that democracy down. Each step feels dangerous but maybe not outright authoritarian — the judiciary gets politicized a little, some previously independent institution gets co-opted, election rules get changed, news outlets come under tighter government control.

    No individual step feels as drastic as an outright coup. And because these leaders both promote and benefit from social polarization, these little power grabs might even be seen by supporters as saving democracy. But over many years, the system tilts more and more toward autocracy.

    That doesn’t always end up leading to full-on dictatorship. But that pull toward elected strongmen rulers is something we see happening in dozens of countries. By the sheer numbers, according to a democracy monitoring group called V-Dem, more democracies are in decline today than at any other point in the last century.

    What did you find most surprising?

    There’s one chart I think about a lot that was put together by the political scientists Pippa Norris and Ronald Inglehart. They tracked every election in Europe, at every level, going back decades. And they looked at how populist candidates did, on average in those elections, over time.

    (Political scientists typically use the word “populist” to describe politicians who champion cultural backlash and oppose establishment institutions. Here’s a definition from the book “How Democracies Die,” by two academics named Steven Levitsky and Daniel Ziblatt: “anti-establishment politicians — figures who, claiming to represent the voice of ‘the people,’ wage war on what they depict as a corrupt and conspiratorial elite.”)

    What Norris and Inglehart found was that, in Europe, populists have been receiving a steadily larger share of the vote, on average, basically every year since 1960. That year is important because it’s roughly when Western countries, as the colonial era ended, collectively began to embrace what we now think of as full, liberal, multiracial democracy.
    And that is also the moment, it turns out from this research, when populist politics began steadily rising in a backlash to that new liberal-democratic order.

    That discovery is really important for understanding the threat to democracy. It shows that, for all the ways that we might think of the threat as top-down, it’s also, and maybe chiefly, bottom-up.

    And though we might tie the rise of populist hard-liner politics to specific events like the global financial crisis of 2007-8 or the refugee crisis of the mid-2010s, this is in fact something much larger. It’s a deeper backlash against the demands of modern liberal democracy — and this is something I’ve written about a lot over the past few years — both among voters who feel that they’re being asked to soften their racial and religious identities and among leaders who are being asked to compromise their political self-interest for the sake of democratic norms.

    What patterns have you found abroad that you now see in the United States?

    The United States fits pretty cleanly into what is now a well-established global pattern of democratic backsliding.

    First, society polarizes, often over a backlash to social change, to demographic change, to strengthening political power by racial, ethnic or religious minorities, and generally amid rising social distrust.

    This leads to a bottom-up desire for populist outsiders who will promise to confront the supposed threat within, which means suppressing the other side of that social or partisan or racial divide, asserting a vision of democracy that grants special status to “my” side, and smashing the democratic institutions or norms that prevent that side from asserting what is perceived to be its rightful dominance.

    You also tend to see political parties and other establishment gatekeepers, who are in theory meant to keep authoritarians from rising in politics, either weaken or become co-opted. Once populist hard-liners gain enough power to begin eroding democratic checks, such as an independent judiciary or the rule of law, it’s usually a steady slide toward democratic erosion.

    This trend has really picked up speed, globally, only in the last 20 years or so. So it’s hard to say exactly how common it is for countries that begin on this path to end up like Hungary or Turkey. But very few democracies have begun to slide and then reversed course.

    You have a new book called “The Chaos Machine: The Inside Story of How Social Media Rewired Our Minds and Our World.” In your reporting and research for the book, what sorts of effects on democracy did you find social media is having? I’m old enough to remember when techno-evangelists like Clay Shirky were predicting that social media would unleash a wave of democratization in the developing world. Obviously, that hasn’t happened. Or has it?

    I had that same arc of initially seeing social media as a democratizing force. So did a lot of Arab Spring activists from the early 2010s, like Wael Ghonim, the Egyptian revolutionary and Google engineer. But, within a few years, Ghonim had come to conclude, he has said in a TED Talk, that “the same tool that united us to topple dictators eventually tore us apart” by “amplifying the spread of misinformation, rumors, echo chambers and hate speech.”

    A neutral social media platform really could be a democratizing force, in theory. But the major platforms are far from neutral. They are deliberately designed to manipulate you, and to manipulate your experience on the platform in ways that will change how you think and how you behave. These platforms do this not just by what they show you, but also by eliciting certain emotions and behaviors from you.

    All this digital manipulation, at the scale of maybe hundreds of hours per year, changes you. And not just online, but in your offline life, too. It changes your emotional makeup, the way that you approach politics, your sense of your own identity — even the way that you process right and wrong.

    For an individual user — and we now have hard, empirical, scientific evidence for this — the effect can be to make you angrier, more extreme and intolerant, more distrustful, more prone to divide the world between us and them, and more disposed toward hostility and even violence against people outside your social in-group. This might change you just by a matter of degree.
    But when you multiply this effect out by billions of users, and often among a majority of the population, the effect can change society as a whole, too, and especially its politics, in ways that can be detrimental to democracy.

    What do you think most people miss about the link between social media and threats to democracy?

    One thing that social platforms have done — and it’s hard to blame this entirely on Silicon Valley — is to displace the traditional activism that is an important part of bringing about democracy or of preventing an existing democracy from backsliding.

    That activism used to happen through organizing among real-world networks, like student groups during the civil rights movement in the United States, or mothers’ groups in 1970s Argentina resisting that country’s dictatorship. Now, social media allows a protest group, even a leaderless one, to skip that process and, by going viral online, to activate thousands or even millions of people overnight.

    That is really effective at driving huge numbers of people onto the street, but not at much else. With the advent of social media, the number of mass protest events in the world shot way up. A million people marching on a capital city became a more common occurrence. But the success rate of those movements fell from about 70 percent to only 30 percent.

    The Yellow Vests, the French protest movement that began in 2018, exemplifies this. It was this stunning, spontaneous, nationwide uprising for political change. And it had been organized almost entirely through Facebook and other platforms. But it was also internally incoherent. For all its force, it quickly fizzled out, having caused a lot of traffic problems but having changed very little.

    Partly that was because of what had been lost in the displacement of traditional organizing. But partly it was also because of the distorting effects of those platforms. Those systems, just as they do for users globally, had pulled the Yellow Vests supporters who were gathering on those platforms toward extremes: demands to bar all refugees from the country, to default on the national debt, to replace elected legislatures with fuzzily defined citizens’ councils.

    It’s not the only reason the Yellow Vests mostly receded, but it is, I think, a metaphor for those platforms’ effects on our societies and democracies broadly.

    What to read about democracy

    Luke Broadwater and Michael Schmidt have an update on the long-shot push, led by some members of Congress and nonprofit groups, to bar Donald Trump from running for president in 2024 by invoking the 14th Amendment to establish him as an “insurrectionist.”

    Writing in The New Yorker, Adam Gopnik asks a provocative question: Can’t we come up with something better than liberal democracy?

    The editorial board of The New York Times is reaching out to readers to ask: What concerns and confounds you about the state of American democracy?

    Thank you for reading On Politics. — Blake

  • Historic bill aimed at keeping California children digitally safe approved

    Legislation will require companies to install guardrails for those under age 18 and use higher privacy settings.

    California lawmakers passed first-of-its-kind legislation on Monday designed to improve online safety and privacy protections for children.

    The bill, the California Age-Appropriate Design Code Act, will require firms such as TikTok, Instagram and YouTube to install guardrails for users under the age of 18, including defaulting to higher privacy settings for minors and refraining from collecting location data for those users. It also requires companies to analyze their algorithms and products to determine how they may affect young users, assessing whether they are designed to be addictive or to cause additional harm to children.

    Children’s safety advocates have applauded the bill, which passed in a vote of 33 to 0, saying similar federal legislation is needed to protect young users. The bill is “a huge step forward toward creating the internet that children and families deserve”, said Josh Golin, executive director at the advocacy group Fairplay.

    “For far too long, tech companies have treated their egregious privacy and safety issues as a PR problem to be addressed only through vague promises, obfuscations, and delays,” he said. “Now, tech platforms will be required to prioritize young Californians’ interests and wellbeing ahead of reckless growth and shareholder dividends.”

    More details to come …

  • A Journey Into Misinformation on Social Media

    “Fake news” has gone from a hot buzzword popularized during the 2016 presidential campaign to an ever-present phenomenon known more formally as misinformation or disinformation. Whatever you call it, sowing F.U.D. — fear, uncertainty and doubt — is now a full-time and often lucrative occupation for the malign foreign actors and even ordinary U.S. citizens who try to influence American politics by publishing information they know to be false.

    Several of my colleagues here at The New York Times track the trends and shifting tactics of these fraudsters on their daily beats. So I exchanged messages this week with Sheera Frenkel, Tiffany Hsu and Stuart A. Thompson, all three of whom spend their days swimming in the muck brewed by fake news purveyors here and abroad.

    Our conversation, lightly edited for length and clarity:

    This is a political newsletter, so let me ask my first question this way: What are you seeing out there that is new during this election cycle, in terms of tactics or topics?

    Sheera Frenkel: I’d say it’s the way misinformation has shifted slightly, in that you don’t have the same type of superspreaders on platforms like Twitter and Facebook that you did in the 2020 election cycle. Instead, you have lots of smaller-scale accounts spreading misinformation across a dozen or more platforms. It is more pervasive and more deeply entrenched than in previous elections.

    The most popular topics are largely rehashes of what was spread in the 2020 election cycle. There are a lot of false claims about voter fraud that we first saw made as early as 2016 and 2018. Newspapers, including The New York Times, have debunked many of those claims. That doesn’t seem to stop bad actors from spreading them or people from believing them. Then there are new claims, or themes, that are being spread by more fringe groups and extremist movements that we have started to track.

    Tiffany Hsu: Sheera first noticed a while back that there was a lot of chatter about “civil war.” And, quickly, we started to see it everywhere — this strikingly aggressive rhetoric that intensified after the F.B.I. searched Mar-a-Lago and with the passage of a bill that will give more resources to the I.R.S.

    For example, after the F.B.I. search, someone said on Truth Social, the social media platform started by Trump, that “sometimes clearing out dangerous vermin requires a modicum of violence, unfortunately.”

    We have seen a fair amount of “lock and load” chatter. But there is also pushback on the right, with people claiming without evidence that federal law enforcement or the Democrats are planting violent language to frame conservative patriots as extremists and insurrectionists.

    Stuart A. Thompson: I’m always surprised by how much organization is happening around misinformation. It’s not just family members sharing fake news on Facebook anymore. There’s a lot of money sloshing around. There are lots of very well-organized groups that are trying to turn the attention over voter fraud and other conspiracy theories into personal income and political results. It’s a very organized machine at this point, after two years of organizing around the 2020 election.

    This feels different from previous moments when disinformation seemed to take hold in the country. It’s not just a fleeting interest spurred by a few partisan voices. It’s an entire community and social network and pastime for millions of people.

    Sheera, you’ve covered Silicon Valley for years. How much progress would you say the big social media players — Facebook/Meta, Twitter and Google, which owns YouTube — have made in tackling the problems that arose during the 2016 election? What’s working and what’s not?

    Sheera: When we talk about 2016, we are largely talking about foreign election interference. In that case, Russia tried to interfere with U.S. elections by using social media platforms to sow divisions among Americans.

    Today, the problem of foreign election interference hasn’t been solved, but it is nowhere near the scale it once was. Companies like Meta, which owns Facebook, and Twitter announce regular takedowns of networks run by Russia, Iran and China aiming to spread disinformation or influence people online. Millions have been spent on security teams at those companies to make sure they are removing foreign actors who spread disinformation.

    And while it is not a done deal (bad actors are always innovating!), they’ve made a huge amount of progress in taking down these networks. This week, they even announced for the first time that they had removed a foreign influence op promoting U.S. interests abroad.

    What has been harder is what to do about Americans’ spreading misinformation to other Americans, and what to do with fringe political movements and conspiracies that continue to spread under the banner of free speech. Many of these social media companies have ended up exactly in the position they hoped to avoid — making one-off decisions on when to remove movements like the QAnon conspiracy group or voter fraud misinformation that begins to go viral.

  • ‘This you?’: the seven letters exposing rightwing hypocrisy

    As Biden eases student loan debt for millions, a simple phrase is puncturing criticism from conservatives like Marjorie Taylor Greene.

    Conservatives are frothing at the mouth over Joe Biden’s decision to forgive $10,000 in student debt for millions, railing against what they call “student loan socialism”. But their carefully crafted tweets have been undermined over and over again with two words: “This you?”

    Were there ever seven letters more powerful? On Twitter, the phrase is an instant marker of hypocrisy, cutting down the mighty from politicians to celebrities to brands. It typically comes as a reply to an opinionated tweet, accompanied by a screenshot of an earlier remark from the same person endorsing the opposite point of view.

    Now Biden’s debt cancellation has given the phrase new life: “This you?” is rolling through Twitter like a bowling ball, toppling critic after critic as it nullifies their claims. The source of many of the “receipts”, in this case, is the public record of those who had their Paycheck Protection Program (PPP) loans – the federal funds intended to keep businesses afloat early in the pandemic – forgiven.

    The conservative advocacy group PragerU proclaimed: “It’s not complicated. Bailing out irresponsible behavior will spur more irresponsible behavior.” “This you?” asked @kaoticleftist, showing hundreds of thousands of dollars in forgiven PPP funds.

    “Ok it began as a joke now it’s on the threshold of turning into a second job 🤦‍♀️ pic.twitter.com/oTB0hcPtzf” — rayne (@trayne_wreck), August 25, 2022

    The rightwing Daily Caller published a piece headlined “Biden debt forgiveness could send tuition through the roof”, prompting another Twitter user, @coreyastewart, to post a screenshot of the PPP funds that organization reportedly had forgiven.

    “Student loan forgiveness sounds really nice to illegal immigrants, people with no life experience, people who don’t have families yet, and people who use preferred pronouns,” wrote the conservative commentator Steven Crowder, earning a host of “This you?” replies – with screenshots highlighting more than $71,000 in loan forgiveness for his company.

    Those closer to the seats of power also received helpful feedback. The Iowa senator Chuck Grassley criticized Biden’s plan, saying it would “fuel further inflation hurting those who can least afford it UNFAIR.” “This you?” asked a candidate for local office, pointing to Grassley’s application for a federal farm bailout.

    “This you? https://t.co/bqgtjPlZ4b pic.twitter.com/69QCNKl0pW” — Kimberly Graham for Polk County Attorney (@KimberlyforIowa), August 24, 2022

    Users also accused the rightwing pundit Ben Shapiro of a double standard, but he denied having received any PPP money and said he’d issued cease-and-desist letters to organizations claiming otherwise – pointing to the messy nature of internet sleuthing.

    But it wasn’t just everyday Twitter users calling out hypocrisy. On Thursday evening, the White House entered the fray. The Georgia congresswoman Marjorie Taylor Greene said it was “completely unfair” for the government to “say your debt is completely forgiven” – after her loan of more than $180,000 was forgiven, the official White House account noted. It was just one of a series of digs at critics: the Florida congressman Matt Gaetz, the White House said, had more than $482,000 in PPP loans forgiven, while the Pennsylvania congressman Mike Kelly got off the hook for more than $987,000.

    “Congresswoman Marjorie Taylor Greene had $183,504 in PPP loans forgiven. https://t.co/4FoCymt8TB” — The White House (@WhiteHouse), August 25, 2022

    It’s not the first time the meme has been widely deployed to illustrate double standards on a national scale. As brands and celebrities touted their support for the Black Lives Matter movement in 2020, social media quickly exposed many as simply trend followers, juxtaposing their posts with examples of past offensive behavior – marking what Aisha Harris described in The New York Times as “a swift undercutting of performative wokeness”. Users drew attention to an NFL star posting a symbolic black square after hanging out with Donald Trump; the Baltimore police department’s supportive words years after the death of Freddie Gray; and a host of other apparent changes of heart.

    As Harris wrote, there’s power in such a sharable medium. It’s true that, as the Twitter user @trayne_wreck – who collected countless examples of loan-based double standards – writes, highlighting hypocrisy is unlikely to change the minds of those who are called out. But, she says, it could make a difference to those of us reading: “You, who can do something about it, who can build power to make them obsolete. I hope it will resonate with you.”

  • To Fight Election Falsehoods, Social Media Companies Ready a Familiar Playbook

    The election dashboards are back online, the fact-checking teams have reassembled, and warnings about misleading content are cluttering news feeds once again. As the United States marches toward another election season, social media companies are steeling themselves for a deluge of political misinformation. Those companies, including TikTok and Facebook, are trumpeting a series of election tools and strategies that look similar to their approaches in previous years.

    Disinformation watchdogs warn that while many of these programs are useful — especially efforts to push credible information in multiple languages — the tactics proved insufficient in previous years and may not be enough to combat the wave of falsehoods pushed this election season.

    Here are the anti-misinformation plans for Facebook, TikTok, Twitter and YouTube.

    Facebook

    Facebook’s approach this year will be “largely consistent with the policies and safeguards” from 2020, Nick Clegg, president of global affairs for Meta, Facebook’s parent company, wrote in a blog post last week.

    Posts rated false or partly false by one of Facebook’s 10 American fact-checking partners will get one of several warning labels, which can force users to click past a banner reading “false information” before they can see the content. In a change from 2020, those labels will be used in a more “targeted and strategic way” for posts discussing the integrity of the midterm elections, Mr. Clegg wrote, after users complained that they were “over-used.”

    Facebook will also expand its efforts to address harassment and threats aimed at election officials and poll workers. Misinformation researchers said the company has taken greater interest in moderating content that could lead to real-world violence after the Jan. 6 attack on the U.S. Capitol.

    Facebook greatly expanded its election team after the 2016 election, to more than 300 people, and Mark Zuckerberg, Facebook’s chief executive, took a personal interest in safeguarding elections. But Meta has changed its focus since the 2020 election. Mr. Zuckerberg is now focused instead on building the metaverse and tackling stiff competition from TikTok. The company has dispersed its election team and signaled that it could shut down CrowdTangle, a tool that helps track misinformation on Facebook, some time after the midterms.

    “I think they’ve just come to the conclusion that this is not really a problem that they can tackle at this point,” said Jesse Lehrich, co-founder of Accountable Tech, a nonprofit focused on technology and democracy.

    In a statement, a spokesman for Meta said its elections team was absorbed into other parts of the company and that more than 40 teams are now focused on the midterms.

    TikTok

    In a blog post announcing its midterm plans, Eric Han, TikTok’s head of U.S. safety, said the company would continue its fact-checking program from 2020, which prevents some videos from being recommended until they are verified by outside fact-checkers. It also introduced an election information portal, which provides voter information like how to register, six weeks earlier than it did in 2020.

    Even so, there are already clear signs that misinformation has thrived on the platform throughout the primaries. “TikTok is going to be a massive vector for disinformation this cycle,” Mr. Lehrich said, adding that the platform’s short video and audio clips are harder to moderate, enabling “massive amounts of disinformation to go undetected and spread virally.”

    TikTok said its moderation efforts would focus on stopping creators who are paid for posting political content in violation of the company’s rules. TikTok has never allowed paid political posts or political advertising, but the company said that some users were circumventing or ignoring those policies during the 2020 election. A representative from the company said TikTok would start approaching talent management agencies directly to outline its rules.

    Disinformation watchdogs have criticized the company for a lack of transparency over the origins of its videos and the effectiveness of its moderation practices. Experts have called for more tools to analyze the platform and its content — the kind of access that other companies provide.

    “The consensus is that it’s a five-alarm fire,” said Zeve Sanderson, the founding executive director at New York University’s Center for Social Media and Politics. “We don’t have a good understanding of what’s going on there,” he added.

    Last month, Vanessa Pappas, TikTok’s chief operating officer, said the company would begin sharing some data with “selected researchers” this year.

    Twitter

    In a blog post outlining its plans for the midterm elections, the company said it would reactivate its Civic Integrity Policy — a set of rules adopted in 2018 that the company uses ahead of elections around the world. Under the policy, warning labels, similar to those used by Facebook, will once again be added to false or misleading tweets about elections, voting or election integrity, often pointing users to accurate information or additional context. Tweets that receive the labels are not recommended or distributed by the company’s algorithms. The company can also remove false or misleading tweets entirely.

    Those labels were redesigned last year, resulting in 17 percent more clicks for additional information, the company said. Interactions, like replies and retweets, fell on tweets that used the modified labels.

    The strategy reflects Twitter’s attempts to limit false content without always resorting to removing tweets and banning users. The approach may help the company navigate difficult freedom of speech issues, which have dogged social media companies as they try to limit the spread of misinformation. Elon Musk, the Tesla executive, made freedom of speech a central criticism during his attempts to buy the company earlier this year.

    YouTube

    Unlike the other major online platforms, YouTube has not released its own election misinformation plan for 2022 and has typically stayed quiet about its election misinformation strategy.

    “YouTube is nowhere to be found still,” Mr. Sanderson said. “That sort of aligns with their general P.R. strategy, which just seems to be: Don’t say anything and no one will notice.”

    Google, YouTube’s parent company, published a blog post in March emphasizing its efforts to surface authoritative content through the streamer’s recommendation engine and to remove videos that mislead voters. In another post aimed at creators, Google details how channels can receive “strikes” for sharing certain kinds of misinformation and how, after three strikes within a 90-day period, a channel will be terminated.

    The video streaming giant has played a major role in distributing political misinformation, giving an early home to conspiracy theorists like Alex Jones, who was later banned from the site. It has taken a stronger stance against medical misinformation, stating last September that it would remove all videos and accounts sharing vaccine misinformation. The company ultimately banned some prominent conservative personalities.

    More than 80 fact-checkers at independent organizations around the world signed a letter in January warning YouTube that its platform is being “weaponized” to promote voter fraud conspiracy theories and other election misinformation.

    In a statement, Ivy Choi, a YouTube spokeswoman, said its election team had been meeting for months to prepare for the midterms and added that its recommendation engine is “continuously and prominently surfacing midterms-related content from authoritative news sources and limiting the spread of harmful midterms-related misinformation.”

  • The Storm is Upon Us review: indispensable QAnon history, updated

    Donald Trump welcomed the conspiracy at the White House. Its followers stormed Congress. Big Tech still seems not to care. Mike Rothschild’s book should sound the alarm for us all.

    What is it that has hypnotized so many addled souls who devote themselves to decoding the Delphic clues of the QAnon conspiracy?

    What they think they’re getting is “secret knowledge”, from “Q” and a bunch of other military insiders working for Donald Trump, about “the storm … a ringside seat to the final match” in a “secret war between good and evil” that will end with the slaughter of all “enemies of freedom”. In short, an irresistible mix of “biblical retribution and participatory justice”.

    The bad guys are “Democrats, Hollywood elites, business tycoons, wealthy liberals, the medical establishment, celebrities and the mass media … They’re controlled by Barack Obama” – a Muslim sleeper agent – and Hillary Clinton, “a blood-drinking ghoul who murders everyone in her way … and they’re funded by George Soros and the Rothschild banking family” (no relation to the author).

    This updated edition of Mike Rothschild’s exhaustive history of the Q movement is more important than ever. Why? Partly because of the crucial role played by so many QAnon devotees in the storming of the Capitol on 6 January 2021, but mostly because Rothschild documents how much of this insanity has penetrated to the heart of the new Republican party, propelled by many of America’s most loathsome individuals, from Ted Cruz and Donald Trump Jr to Alex Jones, Michael Flynn and Roseanne Barr.

    As Rothschild writes of Trump’s first national security adviser, “Flynn’s family even filmed themselves taking the ‘digital soldier oath’ … part of what would become a total enmeshment between members of the Flynn family and QAnon.”

    In the two years before the 2020 presidential election, “nearly 100 Republican candidates declared themselves to be Q Believers” while Trump “retweeted hundreds of Q followers, putting their violent fantasies and bizarre memes into tens of millions of feeds”.

    Asked about a movement which has repackaged most of the oldest and harshest racist and antisemitic conspiracies for a new age, Trump gave his usual coy endorsement of the behavior of America’s most damaged internet addicts. “I don’t know much about the movement,” he mumbled, “other than I understand they like me very much, which I appreciate.”

    In winter 2021, as the Omicron variant sent Covid cases skyrocketing, “QAnon promoters were among the most visible anti-vaccine advocates pushing out lies and conspiracy theories” to “dissuade people from getting vaccinated”. As with so many of QAnon adherents’ positions, the message was “both clear and completely contradicted by the available evidence: they believed the pandemic was over and any mandates related to vaccines or masks were totalitarian control mechanisms that were actually killing people”.

    More than anything else, this is the latest horrific confirmation of what the social psychologist Jonathan Haidt recently described as “the power of social media as a universal solvent, breaking down bonds and weakening institutions everywhere it reached”.

    Like so many other ghastly conspiracies of recent decades, especially the blood libel that the Sandy Hook massacre was a staged event in which no one was actually killed, QAnon was propelled at warp speed by a combination of the incompetence and greed of all the big-tech big shots: Facebook, Twitter, Instagram and YouTube.

    Rothschild describes the usual futile internet game of Whac-A-Mole. Reddit “abruptly banned the 70,000-member r/GreatAwakening board because members had started harassing other users” and had released the personal information “of at least one person they incorrectly claimed to be a mass shooter”. No matter: Q followers just migrated to Twitter and “closed Facebook groups with tens of thousands of members … Just in 2018, Q believers shared Q YouTube videos over 1.4m times, and drove hundreds of thousands of shares to Fox News, Breitbart and the Gateway Pundit”.

    By 2019, “Trump was routinely retweeting QAnon-promoting accounts.” By the 2020 election, “Trump had retweeted hundreds … and was regularly sharing memes created by the movement”.

    When Twitter and Facebook finally started “cracking down on Q iconography in the summer of 2020”, much of the movement just moved on to Instagram. Amazon and Etsy joined in the fun with books and merchandise, and there were even “Q apps on the Google Play Store”.

    Q’s legacy includes what now looks like the permanent deformation of the Republican party. A December 2020 poll by NPR/Ipsos found about a third of Americans believed in a shadowy “deep state” and a robust 23% of Republicans “believed in a pedophilic ring of Satan-worshiping elites”.

    Rothschild ends by asking behavioral experts if there is anything the rest of us can do to help those who have gone far down this wretched rabbit hole. They say the only effective solution is a complete “unplugging” from the internet. Every time I read another book like this one, I’m increasingly inclined to the idea that this could be the only road back to sanity for all of us.
    The Storm is Upon Us: How QAnon Became a Movement, Cult, and Conspiracy Theory of Everything is published in paperback in the US by Melville House

  • On TikTok, Election Misinformation Thrives Ahead of Midterms

    The fast-growing platform’s poor track record during recent voting abroad does not bode well for elections in the U.S., researchers said.

    In Germany, TikTok accounts impersonated prominent political figures during the country’s last national election. In Colombia, misleading TikTok posts falsely attributed a quotation from one candidate to a cartoon villain and allowed a woman to masquerade as another candidate’s daughter. In the Philippines, TikTok videos amplified sugarcoated myths about the country’s former dictator and helped his son prevail in the country’s presidential race.

    Now, similar problems have arrived in the United States.

    Ahead of the midterm elections this fall, TikTok is shaping up to be a primary incubator of baseless and misleading information, in many ways as problematic as Facebook and Twitter, say researchers who track online falsehoods. The same qualities that allow TikTok to fuel viral dance fads — the platform’s enormous reach, the short length of its videos, its powerful but poorly understood recommendation algorithm — can also make inaccurate claims difficult to contain.

    Baseless conspiracy theories about certain voter fraud in November are widely viewed on TikTok, which globally has more than a billion active users each month. Users cannot search the #StopTheSteal hashtag, but #StopTheSteallll had accumulated nearly a million views until TikTok disabled the hashtag after being contacted by The New York Times. Some videos urged viewers to vote in November while citing debunked rumors raised during the congressional hearings into the Jan. 6, 2021, attack on the Capitol. TikTok posts have garnered thousands of views by claiming, without evidence, that predictions of a surge in Covid-19 infections this fall are an attempt to discourage in-person voting.

    The spread of misinformation has left TikTok struggling with many of the same knotty free speech and moderation issues that Facebook and Twitter have faced, and have addressed with mixed results, for several years. But the challenge may be even more difficult for TikTok to address. Video and audio — the bulk of what is shared on the app — can be far more difficult to moderate than text, especially when they are posted with a tongue-in-cheek tone. TikTok, which is owned by the Chinese tech giant ByteDance, also faces many doubts in Washington about whether its business decisions about data and moderation are influenced by its roots in Beijing.

    “When you have extremely short videos with extremely limited text content, you just don’t have the space and time for nuanced discussions about politics,” said Kaylee Fagan, a research fellow with the Technology and Social Change Project at the Harvard Kennedy School’s Shorenstein Center.

    TikTok had barely been introduced in the United States at the time of the 2018 midterm elections and was still largely considered an entertainment app for younger people during the 2020 presidential election. Today, its American user base spends an average of 82 minutes a day on the platform, three times more than on Snapchat or Twitter and twice as long as on Instagram or Facebook, according to a recent report from the app analytics firm Sensor Tower. TikTok is becoming increasingly important as a destination for political content, often produced by influencers.

    The company insists that it is committed to combating false information. In the second half of 2020, it removed nearly 350,000 videos that included election misinformation, disinformation and manipulated media, according to a report it released last year. The platform’s filters kept another 441,000 videos with unsubstantiated claims from being recommended to users, the report said.

    The service blocked so-called deepfake content and coordinated misinformation campaigns ahead of the 2020 election, made it easier for users to report election falsehoods and partnered with 13 fact-checking organizations, including PolitiFact. Researchers like Ms. Fagan said TikTok had worked to shut down problematic search terms, though its filters remain easy to evade with creative spellings.

    “We take our responsibility to protect the integrity of our platform and elections with utmost seriousness,” TikTok said in a statement. “We continue to invest in our policy, safety and security teams to counter election misinformation.”

    But the service’s troubling track record during foreign elections — including in France and Australia this year — does not bode well for the United States, experts said.

    TikTok has been “failing its first real test” in Africa in recent weeks, Odanga Madung, a researcher for the nonprofit Mozilla Foundation, wrote in a report. The app struggled to tamp down on disinformation ahead of last week’s presidential election in Kenya. Mr. Madung cited a post on TikTok that included an altered image of one candidate holding a knife to his neck and wearing a blood-streaked shirt, with a caption that described him as a murderer. The post garnered more than half a million views before it was removed.

    “Rather than learn from the mistakes of more established platforms like Facebook and Twitter,” Mr. Madung wrote, “TikTok is following in their footsteps.”

    TikTok has also struggled to contain nonpolitical misinformation in the United States. Health-related myths about Covid-19 vaccines and masks run rampant, as do rumors and falsehoods about diets, pediatric conditions and gender-affirming care for transgender people. A video making the bogus claim that the mass shooting at Robb Elementary School in Uvalde, Texas, in May had been staged drew more than 74,000 views before TikTok removed it.

    Posts on TikTok about Russia’s war in Ukraine have also been problematic. Even experienced journalists and researchers analyzing posts on the service struggle to separate truth from rumor or fabrication, according to a report published in March by the Shorenstein Center.

    TikTok’s design makes it a breeding ground for misinformation, the researchers found. They wrote that videos could easily be manipulated and republished on the platform and showcased alongside stolen or original content. Pseudonyms are common; parody and comedy videos are easily misinterpreted as fact; popularity affects the visibility of comments; and data about publication time and other details are not clearly displayed on the mobile app. (The Shorenstein Center researchers noted, however, that TikTok is less vulnerable to so-called brigading, in which groups coordinate to make a post spread widely, than platforms like Twitter or Facebook.)

    During the first quarter of 2022, more than 60 percent of videos with harmful misinformation were viewed by users before being removed, TikTok said. Last year, a group of behavioral scientists who had worked with TikTok said that an effort to attach warnings to posts with unsubstantiated content had reduced sharing by 24 percent but had limited views by only 5 percent.

    Researchers said that misinformation would continue to thrive on TikTok as long as the platform refused to release data about the origins of its videos or share insight into its algorithms. Last month, TikTok said it would offer some access to a version of its application programming interface, or A.P.I., this year, but it would not say whether it would do so before the midterms.

    Filippo Menczer, an informatics and computer science professor and the director of the Observatory on Social Media at Indiana University, said he had proposed research collaborations to TikTok and had been told, “Absolutely not.”

    “At least with Facebook and Twitter, there is some level of transparency, but, in the case of TikTok, we have no clue,” he said. “Without resources, without being able to access data, we don’t know who gets suspended, what content gets taken down, whether they act on reports or what the criteria are. It’s completely opaque, and we cannot independently assess anything.”

    U.S. lawmakers are also calling for more information about TikTok’s operations, amid renewed concerns that the company’s ties to China could make it a national security threat. The company has said it plans to keep data about its American users separate from its Chinese parent. It has also said its rules have changed since it was accused of censoring posts seen as antithetical to Beijing’s policy goals.

    The company declined to say how many human moderators it had working alongside its automated filters. (A TikTok executive told British politicians in 2020 that the company had 10,000 moderators around the world.) But former moderators have complained about difficult working conditions, saying they were spread thin and sometimes required to review videos that used unfamiliar languages and references — an echo of accusations made by moderators at platforms like Facebook. In current job listings for moderators, TikTok asks for willingness to “review a large number of short videos” and “in continuous succession during each shift.”

    In a lawsuit filed in March, Reece Young of Nashville and Ashley Velez of Las Vegas said they had “suffered immense stress and psychological harm” while working for TikTok last year. The former moderators described 12-hour shifts assessing thousands of videos, including conspiracy theories, fringe beliefs, political disinformation and manipulated images of elected officials. Usually, they said, they had less than 25 seconds to evaluate each post and often had to watch multiple videos simultaneously to meet TikTok’s quotas.

    In a filing, the company pushed for the case to be dismissed in part because the plaintiffs had been contractors hired by staffing services, and not directly by TikTok. The company also noted the benefits of human oversight when paired with its review algorithms, saying, “The significant social utility to content moderation grossly outweighs any danger to moderators.”

    Election season can be especially difficult for moderators, because political TikTok posts tend to come from a diffuse collection of users addressing broad issues, rather than from specific politicians or groups, said Graham Brookie, the senior director of the Digital Forensic Research Lab at the Atlantic Council.

    “The bottom line is that all platforms can do more and need to do more for the shared set of facts that social democracy depends on,” Mr. Brookie said. “TikTok, in particular, sticks out because of its size, its really, really rapid growth and the number of outstanding issues about how it makes decisions.”