More stories

  • Twitter and TikTok Lead in Amplifying Misinformation, Report Finds

    A new analysis found that algorithms and some features of social media sites help false posts go viral.

    It is well known that social media amplifies misinformation and other harmful content. The Integrity Institute, an advocacy group, is now trying to measure exactly how much — and on Thursday it began publishing results that it plans to update each week through the midterm elections on Nov. 8.

    The institute’s initial report, posted online, found that a “well-crafted lie” will get more engagements than typical, truthful content and that some features of social media sites and their algorithms contribute to the spread of misinformation.

    Twitter, the analysis showed, has what the institute called the greatest misinformation amplification factor, in large part because of its feature allowing people to share, or “retweet,” posts easily. It was followed by TikTok, the Chinese-owned video site, which uses machine-learning models to predict engagement and make recommendations to users.

    “We see a difference for each platform because each platform has different mechanisms for virality on it,” said Jeff Allen, a former integrity officer at Facebook and a founder and the chief research officer at the Integrity Institute. “The more mechanisms there are for virality on the platform, the more we see misinformation getting additional distribution.”

    The institute calculated its findings by comparing posts that members of the International Fact-Checking Network have identified as false with the engagement of previous posts that were not flagged from the same accounts. It analyzed nearly 600 fact-checked posts in September on a variety of subjects, including the Covid-19 pandemic, the war in Ukraine and the upcoming elections.

    Facebook, according to the sample that the institute has studied so far, had the most instances of misinformation but amplified such claims to a lesser degree, in part because sharing posts requires more steps. But some of its newer features are more prone to amplify misinformation, the institute found. Facebook’s amplification factor for video content alone is closer to TikTok’s, because the platform’s Reels and Facebook Watch, which are video features, “both rely heavily on algorithmic content recommendations” based on engagements, according to the institute’s calculations.

    Instagram, which like Facebook is owned by Meta, had the lowest amplification rate. There was not yet sufficient data to make a statistically significant estimate for YouTube, according to the institute.

    The institute plans to update its findings to track how the amplification fluctuates, especially as the midterm elections near. Misinformation, the institute’s report said, is much more likely to be shared than merely factual content. “Amplification of misinformation can rise around critical events if misinformation narratives take hold,” the report said. “It can also fall, if platforms implement design changes around the event that reduce the spread of misinformation.”
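
    The report does not spell out its exact formula, but the comparison it describes (engagement on fact-checked false posts versus engagement on earlier, unflagged posts from the same accounts) boils down to a ratio. Below is a minimal illustrative sketch of that kind of calculation in Python; the field names, the use of medians and the numbers are assumptions made for illustration, not the Integrity Institute’s actual methodology.

        # Illustrative only: the Integrity Institute's data schema and formula are
        # not public; field names and figures here are hypothetical.
        from statistics import median

        def amplification_factor(flagged_posts, baseline_posts):
            """Ratio of typical engagement on fact-checked false posts to typical
            engagement on earlier, unflagged posts from the same accounts."""
            flagged = median(p["engagements"] for p in flagged_posts)
            baseline = median(p["engagements"] for p in baseline_posts)
            return flagged / baseline if baseline else float("nan")

        # Hypothetical posts from the same set of accounts on one platform.
        flagged = [{"engagements": 5200}, {"engagements": 8100}, {"engagements": 4400}]
        baseline = [{"engagements": 900}, {"engagements": 1300}, {"engagements": 1100}]
        print(round(amplification_factor(flagged, baseline), 1))  # 4.7 in this toy example

    A factor above 1 would mean flagged posts outperform an account’s usual content; the report found that ratio highest on Twitter, followed by TikTok.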

  • In Italy’s Election, Politicians Use TikTok to Seek Votes

    Italian politicians are on a virtual hunt for undecided voters. Over the summer, as polls suggested that most of those who had not yet picked a side were under 30, party elders took it to the next level: TikTok.

    This month, Silvio Berlusconi, 85, who served four times as Italy’s prime minister, landed on the social media platform that is mostly popular among the young, explaining why he was there at his age. “On this platform, you guys are over five million, and 60 percent of you are less than 30. I am a little envious,” Mr. Berlusconi said, raising and lowering his voice for dramatic effect. “We will talk about your future.”

    The video had 9.6 million views, raising eyebrows among some users. “You are not so stupid that a video on TikTok is enough to vote for you,” said Emma Galeotti, a young TikTok content creator. “You send the message that we, young people, are so malleable and bonkers.”

    But Mr. Berlusconi’s communications team did not give up. His profile is brimming with a mix of snapshots from his TV appearances and classic Berlusconi jokes, as well as political messages recorded in his studio, where he is seen wearing classy blue suits — and often ties. Viewers have taken notice of his cultivated appearance. “What’s your foundation cream?” one asked. “The cream is too orange, more natural tones are better,” another wrote.

    “The rebound was comic or grotesque, but being on TikTok allowed him to be central to the electoral debate,” said Annalisa Ferretti, the coordinator of the social media division at the Italian advocacy group FB & Associati, who noted that the number of people following Mr. Berlusconi’s profile had surpassed 3.2 million in three weeks. “The problem is that this generation rejects the political class overall,” she said, adding that such social media popularity did not directly translate into votes.

    Other politicians have chosen different paths. Matteo Salvini, 49, of the far-right League party, who has been on TikTok for years and has 635,600 followers, uses the platform mostly as a mouthpiece for his signature topics: security and immigration. Giorgia Meloni, 45, the leader of Brothers of Italy and possibly the next prime minister, does not seem to be doing as well on TikTok, despite her successful electoral campaign. She has 197,700 followers.

    University students seem to like the leader of the centrist party Action, Carlo Calenda, 49, who posts short political messages, answers questions received on the platform and discusses books, Ms. Ferretti said. But he has only about 24,300 followers.

    The center-left Democratic Party is the only party that offers a range of voices on TikTok, posting thematic videos in which issues are discussed by the politicians most identified with them, like Alessandro Zan, 48, on civil rights. Enrico Letta, 56, a party leader, recently encouraged users to go vote — for whomever they liked. “The others should not decide for your future,” he said.

    Despite the efforts of politicians to reach a different audience, abstention still seems to be the main threat to the parties, and to Italian democracy. “They used to say, ‘Squares are full and the ballot boxes are empty,’” Ms. Ferretti said. “Now it’s more: social media is full, and the ballot boxes are empty.”

  • TikTok Bans Political Fund-Raising Ahead of Midterms

    Less than two months before the midterm elections, TikTok is blocking politicians and political parties from fund-raising on its platform.

    In a blog post on Wednesday, the social media platform said it would prohibit solicitations for money by political campaigns. The company said political accounts would immediately lose access to advertising features and monetization services, such as gift giving, tipping and e-commerce capabilities. Over the next few weeks, TikTok will also clamp down on politicians posting videos asking for donations and on political parties directing users to online donation pages, the company said.

    Accounts run by government offices will be slightly less restricted. TikTok said such accounts would be allowed to advertise in limited circumstances, such as when running educational campaigns about Covid-19 booster shots, but the people operating those accounts must work with someone from the company to run that kind of campaign.

    The new rules will help enforce a ban on political advertising that TikTok, known for its short videos and younger-skewing audience, first put in place in 2019. The company, which has more than a billion monthly users globally, continues to describe itself as “first and foremost an entertainment platform,” but it is increasingly drawing political content. Researchers who track online falsehoods say TikTok is on its way to becoming a major hub of political misinformation, fueled by the same qualities that make consumer products and dance videos go viral on the platform.

    In a campaign season already marked by conspiracy theories and aggressive rhetoric, TikTok has announced several steps to try to civilize and secure its platform. In August, the company debuted an “Elections Center,” a hub on the app with information about voting curated from authoritative sources and presented in more than 45 languages. TikTok said it planned to label posts related to the midterms with links directing users to the elections hub.

    Starting on Wednesday, TikTok said it would test a requirement that political accounts in the United States be verified. TikTok also said it was trying to educate users about its sponsorship rules, which prohibit creators from being paid to produce political content.

  • Social Media Companies Still Boost Election Fraud Claims, Report Says

    The major social media companies all say they are ready to deal with a torrent of misinformation surrounding the midterm elections in November. A report released on Monday, however, claimed that they continued to undermine the integrity of the vote by allowing election-related conspiracy theories to fester and spread.

    In the report, the Stern Center for Business and Human Rights at New York University said the social media companies still host and amplify “election denialism,” threatening to further erode confidence in the democratic process. The companies, the report argued, bear a responsibility for the false but widespread belief among conservatives that the 2020 election was fraudulent — and that the coming midterms could be, too. The report joins a chorus of warnings from officials and experts that the results in November could be fiercely, even violently, contested.

    “The malady of election denialism in the U.S. has become one of the most dangerous byproducts of social media,” the report warned, “and it is past time for the industry to do more to address it.”

    The major platforms — Facebook, Twitter, TikTok and YouTube — have all announced promises or initiatives to combat disinformation ahead of the 2022 midterms, saying they were committed to protecting the election process. But the report said those measures were ineffective, haphazardly enforced or simply too limited.

    Facebook, for example, announced that it would ban ads that called into question the legitimacy of the coming elections, but it exempted politicians from its fact-checking program. That, the report says, allows candidates and other influential leaders to undermine confidence in the vote by questioning ballot procedures or other rules. In the case of Twitter, an internal report released as part of a whistle-blower’s complaint from a former head of security, Peiter Zatko, disclosed that the company’s site integrity team had only two experts on misinformation.

    The New York University report, which incorporated responses from all the companies except YouTube, called for greater transparency in how companies rank, recommend and remove content. It also said they should enhance fact-checking efforts and remove provably untrue claims, not simply label them false or questionable.

    A spokeswoman for Twitter, Elizabeth Busby, said the company was undertaking a multifaceted approach to ensuring reliable information about elections. That includes efforts to “pre-bunk” false information and to “reduce the visibility of potentially misleading claims via labels.” In a statement, YouTube said it agreed with “many of the points” made in the report and had already carried out many of its recommendations. “We’ve already removed a number of videos related to the midterms for violating our policies,” the statement said, “and the most viewed and recommended videos and channels related to the election are from authoritative sources, including news channels.” TikTok did not respond to a request for comment.

    There are already signs that the integrity of the vote in November will be as contentious as it was in 2020, when President Donald J. Trump and some of his supporters refused to accept the outcome, falsely claiming widespread fraud. Inattention by social media companies in the interim has allowed what the report describes as a coordinated campaign to take root among conservatives claiming, again without evidence, that wholesale election fraud is being used to tip elections to Democrats.

    “Election denialism,” the report said, “was evolving in 2021 from an obsession with the former president’s inability to accept defeat into a broader, if equally baseless, attack on the patriotism of all Democrats, as well as non-Trump-loving Republicans, and legions of election administrators, many of them career government employees.”

  • To Fight Election Falsehoods, Social Media Companies Ready a Familiar Playbook

    The election dashboards are back online, the fact-checking teams have reassembled, and warnings about misleading content are cluttering news feeds once again. As the United States marches toward another election season, social media companies are steeling themselves for a deluge of political misinformation. Those companies, including TikTok and Facebook, are trumpeting a series of election tools and strategies that look similar to their approaches in previous years.

    Disinformation watchdogs warn that while many of these programs are useful — especially efforts to push credible information in multiple languages — the tactics proved insufficient in previous years and may not be enough to combat the wave of falsehoods pushed this election season. Here are the anti-misinformation plans for Facebook, TikTok, Twitter and YouTube.

    Facebook

    Facebook’s approach this year will be “largely consistent with the policies and safeguards” from 2020, Nick Clegg, president of global affairs for Meta, Facebook’s parent company, wrote in a blog post last week. Posts rated false or partly false by one of Facebook’s 10 American fact-checking partners will get one of several warning labels, which can force users to click past a banner reading “false information” before they can see the content. In a change from 2020, those labels will be used in a more “targeted and strategic way” for posts discussing the integrity of the midterm elections, Mr. Clegg wrote, after users complained that they were “over-used.”

    Facebook will also expand its efforts to address harassment and threats aimed at election officials and poll workers. Misinformation researchers said the company has taken greater interest in moderating content that could lead to real-world violence after the Jan. 6 attack on the U.S. Capitol.

    Facebook greatly expanded its election team after the 2016 election, to more than 300 people, and Mark Zuckerberg, Facebook’s chief executive, took a personal interest in safeguarding elections. But Meta has changed its focus since the 2020 election. Mr. Zuckerberg is now focused instead on building the metaverse and tackling stiff competition from TikTok. The company has dispersed its election team and signaled that it could shut down CrowdTangle, a tool that helps track misinformation on Facebook, some time after the midterms. “I think they’ve just come to the conclusion that this is not really a problem that they can tackle at this point,” said Jesse Lehrich, co-founder of Accountable Tech, a nonprofit focused on technology and democracy. In a statement, a Meta spokesman said its elections team had been absorbed into other parts of the company and that more than 40 teams are now focused on the midterms.

    TikTok

    In a blog post announcing its midterm plans, Eric Han, TikTok’s head of U.S. safety, said the company would continue its fact-checking program from 2020, which prevents some videos from being recommended until they are verified by outside fact checkers. It also introduced an election information portal, which provides voter information like how to register, six weeks earlier than it did in 2020.

    Even so, there are already clear signs that misinformation has thrived on the platform throughout the primaries. “TikTok is going to be a massive vector for disinformation this cycle,” Mr. Lehrich said, adding that the platform’s short video and audio clips are harder to moderate, enabling “massive amounts of disinformation to go undetected and spread virally.”

    TikTok said its moderation efforts would focus on stopping creators who are paid for posting political content in violation of the company’s rules. TikTok has never allowed paid political posts or political advertising, but the company said that some users circumvented or ignored those policies during the 2020 election. A company representative said TikTok would start approaching talent management agencies directly to outline its rules.

    Disinformation watchdogs have criticized the company for a lack of transparency over the origins of its videos and the effectiveness of its moderation practices. Experts have called for more tools to analyze the platform and its content — the kind of access that other companies provide. “The consensus is that it’s a five-alarm fire,” said Zeve Sanderson, the founding executive director at New York University’s Center for Social Media and Politics. “We don’t have a good understanding of what’s going on there.” Last month, Vanessa Pappas, TikTok’s chief operating officer, said the company would begin sharing some data with “selected researchers” this year.

    Twitter

    In a blog post outlining its plans for the midterm elections, the company said it would reactivate its Civic Integrity Policy — a set of rules adopted in 2018 that the company uses ahead of elections around the world. Under the policy, warning labels, similar to those used by Facebook, will once again be added to false or misleading tweets about elections, voting or election integrity, often pointing users to accurate information or additional context. Tweets that receive the labels are not recommended or distributed by the company’s algorithms, and the company can also remove false or misleading tweets entirely.

    Those labels were redesigned last year, resulting in 17 percent more clicks for additional information, the company said, while interactions like replies and retweets fell on tweets that used the modified labels. The strategy reflects Twitter’s attempts to limit false content without always resorting to removing tweets and banning users. The approach may help the company navigate difficult freedom-of-speech issues, which have dogged social media companies as they try to limit the spread of misinformation. Elon Musk, the Tesla executive, made freedom of speech a central criticism during his attempt to buy the company earlier this year.

    YouTube

    Unlike the other major online platforms, YouTube has not released its own election misinformation plan for 2022 and has typically stayed quiet about its strategy. “YouTube is nowhere to be found still,” Mr. Sanderson said. “That sort of aligns with their general P.R. strategy, which just seems to be: Don’t say anything and no one will notice.”

    Google, YouTube’s parent company, published a blog post in March emphasizing its efforts to surface authoritative content through the platform’s recommendation engine and to remove videos that mislead voters. In another post aimed at creators, Google detailed how channels can receive “strikes” for sharing certain kinds of misinformation; after three strikes within a 90-day period, a channel is terminated.

    The video streaming giant has played a major role in distributing political misinformation, giving an early home to conspiracy theorists like Alex Jones, who was later banned from the site. It has taken a stronger stance against medical misinformation, stating last September that it would remove all videos and accounts sharing vaccine misinformation, and it ultimately banned some prominent conservative personalities. More than 80 fact checkers at independent organizations around the world signed a letter in January warning YouTube that its platform was being “weaponized” to promote voter fraud conspiracy theories and other election misinformation.

    In a statement, Ivy Choi, a YouTube spokeswoman, said its election team had been meeting for months to prepare for the midterms and added that its recommendation engine is “continuously and prominently surfacing midterms-related content from authoritative news sources and limiting the spread of harmful midterms-related misinformation.”

  • On TikTok, Election Misinformation Thrives Ahead of Midterms

    The fast-growing platform’s poor track record during recent voting abroad does not bode well for elections in the U.S., researchers said.

    In Germany, TikTok accounts impersonated prominent political figures during the country’s last national election. In Colombia, misleading TikTok posts falsely attributed a quotation from one candidate to a cartoon villain and allowed a woman to masquerade as another candidate’s daughter. In the Philippines, TikTok videos amplified sugarcoated myths about the country’s former dictator and helped his son prevail in the country’s presidential race. Now, similar problems have arrived in the United States.

    Ahead of the midterm elections this fall, TikTok is shaping up to be a primary incubator of baseless and misleading information, in many ways as problematic as Facebook and Twitter, say researchers who track online falsehoods. The same qualities that allow TikTok to fuel viral dance fads — the platform’s enormous reach, the short length of its videos, its powerful but poorly understood recommendation algorithm — can also make inaccurate claims difficult to contain.

    Baseless conspiracy theories about voter fraud in November are widely viewed on TikTok, which globally has more than a billion active users each month. Users cannot search the #StopTheSteal hashtag, but #StopTheSteallll had accumulated nearly a million views until TikTok disabled the hashtag after being contacted by The New York Times. Some videos urged viewers to vote in November while citing debunked rumors raised during the congressional hearings into the Jan. 6, 2021, attack on the Capitol. TikTok posts have garnered thousands of views by claiming, without evidence, that predictions of a surge in Covid-19 infections this fall are an attempt to discourage in-person voting.

    The spread of misinformation has left TikTok struggling with many of the same knotty free speech and moderation issues that Facebook and Twitter have faced, and have addressed with mixed results, for several years. But the challenge may be even more difficult for TikTok to address. Video and audio — the bulk of what is shared on the app — can be far more difficult to moderate than text, especially when they are posted with a tongue-in-cheek tone. TikTok, which is owned by the Chinese tech giant ByteDance, also faces many doubts in Washington about whether its business decisions about data and moderation are influenced by its roots in Beijing.

    “When you have extremely short videos with extremely limited text content, you just don’t have the space and time for nuanced discussions about politics,” said Kaylee Fagan, a research fellow with the Technology and Social Change Project at the Harvard Kennedy School’s Shorenstein Center.

    TikTok had barely been introduced in the United States at the time of the 2018 midterm elections and was still largely considered an entertainment app for younger people during the 2020 presidential election. Today, its American user base spends an average of 82 minutes a day on the platform, three times more than on Snapchat or Twitter and twice as long as on Instagram or Facebook, according to a recent report from the app analytics firm Sensor Tower. TikTok is becoming increasingly important as a destination for political content, often produced by influencers.

    The company insists that it is committed to combating false information. In the second half of 2020, it removed nearly 350,000 videos that included election misinformation, disinformation and manipulated media, according to a report it released last year. The platform’s filters kept another 441,000 videos with unsubstantiated claims from being recommended to users, the report said. The service blocked so-called deepfake content and coordinated misinformation campaigns ahead of the 2020 election, made it easier for users to report election falsehoods and partnered with 13 fact-checking organizations, including PolitiFact. Researchers like Ms. Fagan said TikTok had worked to shut down problematic search terms, though its filters remain easy to evade with creative spellings.

    “We take our responsibility to protect the integrity of our platform and elections with utmost seriousness,” TikTok said in a statement. “We continue to invest in our policy, safety and security teams to counter election misinformation.”

    But the service’s troubling track record during foreign elections — including in France and Australia this year — does not bode well for the United States, experts said. TikTok has been “failing its first real test” in Africa in recent weeks, Odanga Madung, a researcher for the nonprofit Mozilla Foundation, wrote in a report. The app struggled to tamp down disinformation ahead of last week’s presidential election in Kenya. Mr. Madung cited a post on TikTok that included an altered image of one candidate holding a knife to his neck and wearing a blood-streaked shirt, with a caption that described him as a murderer. The post garnered more than half a million views before it was removed. “Rather than learn from the mistakes of more established platforms like Facebook and Twitter,” Mr. Madung wrote, “TikTok is following in their footsteps.”

    TikTok has also struggled to contain nonpolitical misinformation in the United States. Health-related myths about Covid-19 vaccines and masks run rampant, as do rumors and falsehoods about diets, pediatric conditions and gender-affirming care for transgender people. A video making the bogus claim that the mass shooting at Robb Elementary School in Uvalde, Texas, in May had been staged drew more than 74,000 views before TikTok removed it.

    Posts on TikTok about Russia’s war in Ukraine have also been problematic. Even experienced journalists and researchers analyzing posts on the service struggle to separate truth from rumor or fabrication, according to a report published in March by the Shorenstein Center. TikTok’s design makes it a breeding ground for misinformation, the researchers found. They wrote that videos could easily be manipulated and republished on the platform and showcased alongside stolen or original content. Pseudonyms are common; parody and comedy videos are easily misinterpreted as fact; popularity affects the visibility of comments; and data about publication time and other details are not clearly displayed on the mobile app. (The Shorenstein Center researchers noted, however, that TikTok is less vulnerable to so-called brigading, in which groups coordinate to make a post spread widely, than platforms like Twitter or Facebook.)

    During the first quarter of 2022, more than 60 percent of videos with harmful misinformation were viewed by users before being removed, TikTok said. Last year, a group of behavioral scientists who had worked with TikTok said that an effort to attach warnings to posts with unsubstantiated content had reduced sharing by 24 percent but had limited views by only 5 percent.

    Researchers said that misinformation would continue to thrive on TikTok as long as the platform refused to release data about the origins of its videos or share insight into its algorithms. Last month, TikTok said it would offer some access to a version of its application programming interface, or A.P.I., this year, but it would not say whether that would happen before the midterms. Filippo Menczer, an informatics and computer science professor and the director of the Observatory on Social Media at Indiana University, said he had proposed research collaborations to TikTok and had been told, “Absolutely not.”

    “At least with Facebook and Twitter, there is some level of transparency, but, in the case of TikTok, we have no clue,” he said. “Without resources, without being able to access data, we don’t know who gets suspended, what content gets taken down, whether they act on reports or what the criteria are. It’s completely opaque, and we cannot independently assess anything.”

    U.S. lawmakers are also calling for more information about TikTok’s operations, amid renewed concerns that the company’s ties to China could make it a national security threat. The company has said it plans to keep data about its American users separate from its Chinese parent. It has also said its rules have changed since it was accused of censoring posts seen as antithetical to Beijing’s policy goals.

    The company declined to say how many human moderators it had working alongside its automated filters. (A TikTok executive told British politicians in 2020 that the company had 10,000 moderators around the world.) But former moderators have complained about difficult working conditions, saying they were spread thin and sometimes required to review videos that used unfamiliar languages and references — an echo of accusations made by moderators at platforms like Facebook. In current job listings for moderators, TikTok asks for a willingness to “review a large number of short videos” “in continuous succession during each shift.”

    In a lawsuit filed in March, Reece Young of Nashville and Ashley Velez of Las Vegas said they had “suffered immense stress and psychological harm” while working for TikTok last year. The former moderators described 12-hour shifts assessing thousands of videos, including conspiracy theories, fringe beliefs, political disinformation and manipulated images of elected officials. Usually, they said, they had less than 25 seconds to evaluate each post and often had to watch multiple videos simultaneously to meet TikTok’s quotas. In a filing, the company pushed for the case to be dismissed in part because the plaintiffs had been contractors hired by staffing services, not directly by TikTok. The company also noted the benefits of human oversight when paired with its review algorithms, saying, “The significant social utility to content moderation grossly outweighs any danger to moderators.”

    Election season can be especially difficult for moderators because political TikTok posts tend to come from a diffuse collection of users addressing broad issues, rather than from specific politicians or groups, said Graham Brookie, the senior director of the Digital Forensic Research Lab at the Atlantic Council. “The bottom line is that all platforms can do more and need to do more for the shared set of facts that social democracy depends on,” Mr. Brookie said. “TikTok, in particular, sticks out because of its size, its really, really rapid growth and the number of outstanding issues about how it makes decisions.”

  • How Candidates Are Using TikTok to Secure Younger Voters

    If all politics is theater, Representative Tim Ryan is one of its subtler actors. A moderate Democrat from Ohio’s 13th District who has represented the state for nearly two decades, Mr. Ryan gives speeches and debate performances that are often described as coming straight out of central casting. His style choices are D.C. standard. He’s not usually the subject of late-night skits or memes.

    That’s not to say he isn’t trying. Back in the spring of 2020, as Covid-19 was overtaking the country and a divided Congress was duking it out over a sweeping stimulus bill, Mr. Ryan, 48, was so frustrated at the stalled legislation that he decided to channel his emotion into a TikTok video. The 15-second clip features Mr. Ryan lounging around his office in a white button-down and dress pants, his tie slightly loose, as he mimes a clean version of “Bored in the House,” by Curtis Roach. It’s a rap song that resonated with cooped-up Americans early in the pandemic, featuring a refrain (“I’m bored in the house, and I’m in the house bored”) that appears in millions of videos across TikTok. Most of them depict people losing their minds in lockdown. Mr. Ryan’s interpretation was a little more literal: Bored … in the House … get it?

    (Embedded TikTok from @reptimryan: “In the (People’s) House bored.”)

    Mr. Ryan is not a politician one readily associates with the Zoomers of TikTok. His talking points tend to revolve around issues like reviving American manufacturing rather than, say, defunding the police. But the chino-clad congressman wasn’t naïve about the nontraditional places from which political influence might flow. Years ago he was all in on meditation. Why not try the social platform of the moment?

    His teenage daughter, Bella, got him up to speed and taught him some of the dances that had gone viral on the app. “I just thought it was hysterical, and that it was something really cool that her and I could do together,” Mr. Ryan said in a phone interview. Soon enough, he was posting on his own account, sharing video montages of his floor speeches and his views on infrastructure legislation, backed by the sound of Taylor Swift’s “All Too Well.” (As any TikTok newbie would quickly learn, popular songs help videos get discovered on the platform.)

    “I started to see it as an opportunity to really speak to an audience that wasn’t watching political talk shows or watching the news,” Mr. Ryan said. This year, he’s running for Ohio’s open Senate seat; he thinks TikTok could be a crucial part of the race. But as primaries begin for the midterm elections, the real question is: What do voters think?

    Privacy, Protest and Punditry

    Social media has played a role in political campaigning since at least 2007, when Barack Obama, then an Illinois senator, registered his first official Twitter handle. Since then, enormous numbers of political bids have harnessed the power of social platforms, through dramatic announcement videos on YouTube, Twitter debates, Reddit A.M.A.s, fireside chats on Instagram Live and more. TikTok, with its young-skewing active global user base of one billion, would seem a natural next frontier.

    So far, though, compared with other platforms, it has been embraced by relatively few politicians. Their videos run the gamut from cringey — say, normie dads bopping along to viral audio clips — to genuinely connecting with people. “TikTok is still in the novelty phase in terms of social media networks for political candidates,” said Eric Wilson, a Republican political technologist.

    Republicans in particular have expressed concerns about the app’s parent company, ByteDance, whose headquarters are in China. In the final year of his presidency, Donald J. Trump signed an executive order to ban the app in the United States, citing concerns that user data could be retrieved by the Chinese government. (President Biden revoked the order last summer.)

    After a brief stint on the app, Senator Marco Rubio of Florida, a Republican, deleted his account. He has since called on President Biden to block the platform entirely. In an email statement, Mr. Rubio, 50, wrote that TikTok “poses a serious threat to U.S. national security and Americans’ — especially children’s — personal privacy.” That point has been disputed by national security experts, who think the app would be a relatively inefficient way for Chinese agencies to obtain U.S. intelligence. “They have better ways of getting it,” said Adam Segal, the director of the Digital and Cyberspace Policy program at the Council on Foreign Relations, among them “phishing emails, directed targeted attacks on the staff or the politicians themselves or buying data on the open market.”

    Regardless, TikTok seems to have empowered a new generation to become more engaged with global issues, try on ideological identities and participate in the political process — even those not old enough to vote. There have been rare but notable examples of TikTok inspiring political action. In 2020, young users encouraged people to register for a Tulsa, Okla., rally in support of Mr. Trump as a prank to limit turnout. Ahead of the rally, Brad Parscale, Mr. Trump’s 2020 campaign manager, tweeted that there had been more than a million ticket requests, but only 6,200 tickets were scanned at the arena.

    Such activity is not limited to young liberals on the platform. Ioana Literat, an associate professor of communication at Teachers College, Columbia University, who has studied young people and political expression on social media with Neta Kligler-Vilenchik of the Hebrew University of Jerusalem, pointed to the political “hype houses” that became popular on TikTok during the 2020 election. The owners of those accounts have livestreamed debates, debunked misinformation spreading on the app and discussed policy issues. “Young political pundits on both sides of the ideological divide have been very successful in using TikTok to reach their respective audiences,” Ms. Literat said.

    You’ve Got My Vote, Bestie

    Many of the politicians active on TikTok are Democrats or left-leaning independents, including Senator Jon Ossoff of Georgia, Senator Bernie Sanders of Vermont, Senator Ed Markey of Massachusetts, Representative Ilhan Omar of Minnesota and the mayors of two of America’s largest cities, Lori Lightfoot and Eric Adams (who announced he had joined this week with a video that featured his morning smoothie regimen). This could be because the platform has a large proportion of young users, according to internal company data and documents that were reviewed by The New York Times in 2020, and young people tend to lean liberal. (TikTok would not share current demographic data with The Times.)

    “If you are a Democrat running for office, you’re trying to get young voters to go out and support you,” said Mr. Wilson, the Republican strategist. “That calculation is different for Republicans, where you’re trying to mobilize a different type of voter” — someone who is likely older and spends time on other platforms.

    For his part, Mr. Markey has cultivated a following on TikTok with videos that are a mix of silly (such as him boiling pasta in acknowledgment of “Rigatoni Day”), serious (for example, him reintroducing the Green New Deal with Alexandria Ocasio-Cortez and Cori Bush) and seriously stylish (him stepping out in a bomber jacket and Nike high tops). The comments on his videos are filled with fans calling him “bestie” (“go bestie!!”, “i love you bestie,” “YES BESTIE!!!!”).

    The feeling is mutual. “When I post on TikTok, it’s because I’m having fun online and talking with my friends about the things we all care about,” Mr. Markey, 75, wrote in an email. “I listen and learn from young people on TikTok. They are leading, they know what’s going on and they know where we are headed, especially online. I’m with them.”

    (Embedded TikTok from @ed_markey: “you have to stop.”)

    Dafne Valenciano, 19, a college student from California, said that she’s a fan of Mr. Ossoff’s TikTok account. During his campaign, “he had very funny content and urged young voters to go to the ballots,” Ms. Valenciano said. “Politicians accessing this social media makes it easier for my generation to see their media rather than through news or articles.”

    Several of the videos posted by Mr. Ossoff, 35, who has moppy brown hair and boyish good looks, have been interpreted by his fans as thirst traps. “YAS DADDY JON,” one user commented on a video of him solemnly discussing climate change. Another wrote, on a post celebrating his first 100 days in office, that Mr. Ossoff was “hot and he knows it,” calling him a “confident king.” The senator has more than half a million followers on TikTok.

    Some politicians end up on the platform unwittingly. Take, for instance, the viral audio of Kamala Harris declaring, “we did it, Joe,” after winning the 2020 election. Though the vice president doesn’t have an account herself, her sound bite has millions of plays.

    Catering to such viral impulses may seem gimmicky, but it’s a necessary part of any candidate’s TikTok strategy. Political advertising is prohibited on the platform, so politicians can’t promote much of their content to target specific users. And the app pushes videos from all over the world into users’ feeds, making it hard for candidates to reach the ones who might actually vote for them. Daniel Dong, 20, a college student from New Hampshire, said that he often sees posts from politicians in other states in his TikTok feed, but “those races don’t matter to me because I’m never going to be able to vote for a random person from another state.”

    The Art of the Viral Video

    Christina Haswood, a Democratic member of the Kansas House of Representatives, started her TikTok account in the summer of 2020, when she was running for her seat. “I went to my campaign manager and was like, ‘Wouldn’t it be funny if I made a campaign TikTok?’” Ms. Haswood, 27, said.

    She won the race, making her one of a handful of Native Americans in the Kansas state legislature. “A lot of folks don’t see an Indigenous politician, a young politician of color. You don’t see that every day across the state, let alone across the country,” Ms. Haswood said. “I want to encourage young people to run for office.”

    At first, Ms. Haswood created TikToks that were purely informational — videos of her talking directly to the camera — which weren’t getting much traction. When one of the candidates running against her in the primary also started a TikTok, she felt she needed to amp things up.

    Conner Thrash, at the time a high school student and now a college student at the University of Kansas, started to notice Ms. Haswood’s videos. “I really loved what she stood for,” Mr. Thrash, 19, said. “I realized that I had the ability to bridge the gap between a politician trying to expand their outreach and people like my young, teenage self.” So he reached out to Ms. Haswood, and the two started making content together and perfecting the art of the viral TikTok. A video should strike a careful balance: entertaining but not embarrassing, low-fi without seeming careless, and trendy but innovative, bringing something new to the never-ending scroll.

    One of their most-watched videos lays out key points of Ms. Haswood’s platform, including the protection of reproductive rights and legalizing recreational marijuana. The video is set to a viral remix of Taylor Swift’s “Love Story” and follows a trend in which TikTok users push the camera away from themselves midsong. (Ms. Haswood used a Penny skateboard to achieve the effect.)

    (Embedded TikTok from @haswoodforks: “Meet Christina Haswood, the future for democratic politics in Kansas.”)

    TikTok may have helped Ms. Haswood win her race, but few candidates have had her success. Several politicians with large TikTok followings, including Matt Little (a former liberal member of the Minnesota Senate) and Joshua Collins (a socialist who ran for U.S. representative for Washington), lost “pretty badly” in their respective elections, Ms. Literat said, “so technically they did not succeed from a political perspective.”

    The behavior of young voters in particular can be hard to predict. In the 2020 presidential election, about half of Americans between the ages of 18 and 29 voted, according to the Center for Information & Research on Civic Learning and Engagement at Tufts University — a record turnout for an age group not known for showing up to the polls.

    Still, “young people help drive the culture,” said Jennifer Stromer-Galley, the author of “Presidential Campaigning in the Internet Age” and a professor of information studies at Syracuse University. “Even though they may or may not ever vote for Jon Ossoff, being on TikTok does help shape Ossoff’s image,” she added. “More people are going to know Ossoff’s name today because of his TikTok stunt than they did before.”

  • Jeffrey Katzenberg Talks About His Billion-Dollar Flop

    The public failure of his start-up Quibi hasn’t stopped Jeffrey Katzenberg from doubling down on tech. A Hollywood power broker, he headed up Disney in the 1980s and ’90s and co-founded a rival studio, DreamWorks, before finding a puzzle he could not yet solve: getting people to pay for short-format content. Investors gave him and the former Hewlett-Packard C.E.O. and California gubernatorial candidate Meg Whitman $1.75 billion to build a video platform, but not enough customers opened their wallets for the $4.99-a-month service, and Quibi folded within a year of its launch. Katzenberg says the problems were product-market fit and the Covid pandemic, not competition from TikTok or YouTube.

    [You can listen to this episode of “Sway” on Apple, Spotify, Google or wherever you get your podcasts.]

    In this conversation, Kara Swisher and Katzenberg delve into Quibi’s demise, the shifting power dynamics in Hollywood and his pivot to Silicon Valley. They also discuss his influence in another sphere: politics. And the former Hollywood executive, who co-chaired a fund-raiser to help fend off California’s recent recall effort, offers some advice to Gov. Gavin Newsom. (A full transcript of the episode will be available midday on the Times website.)

    Thoughts? Email us at sway@nytimes.com.

    “Sway” is produced by Nayeema Raza, Blakeney Schick, Matt Kwong, Daphne Chen and Caitlin O’Keefe and edited by Nayeema Raza; fact-checking by Kate Sinclair; music and sound design by Isaac Jones; mixing by Carole Sabouraud and Sonia Herrero; audience strategy by Shannon Busta. Special thanks to Kristin Lin and Liriel Higa.