More stories

  • in

    Campaign Press Aides Move From the Shadows to Social Media Stardom

    MINDEN, Nev. — As Adam Laxalt, the Republican candidate for Senate in Nevada, ambled through a throng of Trump supporters at a recent rally and posed for pictures, it was his campaign’s communications director, Courtney Holland, who was really working the crowd.

    With an iPhone in her left hand, Ms. Holland used her right one to whip up more enthusiasm from the red-capped Republicans gathered behind her boss. As the crowd took the cue, Ms. Holland framed her shot and blasted the footage out onto the campaign’s various social media channels — as well as her own.

    With more than 100,000 followers on Twitter and nearly 70,000 on Instagram, Ms. Holland reflects a new breed of campaign aides — those whose online profiles more closely resemble those of social media influencers than of traditional behind-the-scenes press operatives.

    The shift seizes on the transformation in how American voters receive information about their candidates, and it is changing the way campaign press shops function. Both parties are increasingly using social media to build loyalty to a particular political brand, and to target critics and journalists in ways that energize supporters and drive online contributions. Instead of drafting political positions for their candidates, these staff members take to social media to make their own statements.

    Working her first political campaign, Ms. Holland has shown little interest in dealing with mainstream reporters to shape stories about Nevada’s closely watched Senate race — and she didn’t respond to a request for comment for this article. She has used her Twitter account, however, to repeatedly post negative information about Hunter Biden, President Biden’s son, and to criticize Senator Catherine Cortez Masto, the Democratic incumbent in the race, for not participating in more TV interviews.

    Ms. Holland’s posts on Instagram — posing with fellow conservatives or modeling Republican merchandise — have regularly drawn hundreds or even thousands of likes.
    Several of her memes attacking Mr. Biden have been viewed more than 100,000 times.

    “Influencers are being subsumed into the political apparatus on the right and the left,” said Samuel C. Woolley, who has studied social media and politics as the project director of the propaganda research team at the University of Texas at Austin. “There has been a blurring of the line between influencers and their positions as staffers that has historically been behind the camera.”

    In Florida, Christina Pushaw had about 2,000 Twitter followers before Gov. Ron DeSantis appointed her as his press secretary in May 2021. She now has more than 220,000 followers — far more than Lt. Gov. Jeanette Nuñez and nearly as many as Casey DeSantis, Florida’s first lady.

    Ms. Pushaw built her following with an aggressive social media persona that sometimes includes five or six dozen postings a day, often attacking Democrats and the mainstream media.
    She has called the president a “seemingly senile 79-year-old aspiring dictator” and suggested that a neo-Nazi rally in Orlando had been staged by Democrats, although she later deleted that tweet.

    Last summer, Twitter locked her account for 12 hours for violating rules on “abusive behavior” after The Associated Press said her conduct led to a reporter receiving threats and other online abuse.

    Ms. Pushaw, who is now the DeSantis campaign’s rapid response director, has recently urged her fellow Republicans to stop engaging at all with the mainstream media, which she often refers to as “liberal,” “corporate” or “legacy media.”

    “My working theory is that if ALL conservatives simply stop talking to them, the legacy media will lose any shred of credibility or interest to Americans who follow politics,” Ms. Pushaw wrote in August.

    Ms. Pushaw didn’t respond to a request for comment. But Mr. DeSantis has defended his aide, saying he views the criticism of her as a sign of her success.

    “You can try to smear me or anyone in my administration all you want to,” he told reporters in June. “All that’s going to do is embolden us to continue moving forward for the people of Florida.”

    The pugnacity from Ms. Pushaw and other Republicans has been deeply influenced by former President Donald J. Trump, whose combative political style has been defined by both his aggressiveness on social media and his sparring with the press. Mr. Trump bestowed the title of “assistant to the president” on his top social media aide in the White House, Dan Scavino, while former President Barack Obama’s digital director, Jason Goldman, was a deputy assistant.

    Still, Mr. Obama and his team helped pave the way for turning press teams into content creators.
    The Obama White House regularly produced photos and videos packaged specifically for direct consumption by its own followers on social media.

    More recently, some of the 2020 Democratic presidential campaigns were loosely linked to armies of fanatical social media followers who teamed up to bully critics, fellow Democrats and reporters.

    During that race, a relentless group of superfans for Vice President Kamala Harris, known as #KHive, targeted Senator Bernie Sanders, her rival in that campaign, and numerous reporters.

    Reecie Colbert, one of the group’s more outspoken members, issued a warning during the campaign to Ms. Harris’s critics in a podcast about the group, saying, “I wanted them to know I will stomp a hole in you if you come for Kamala.” She later told The Los Angeles Times that she was speaking for herself, not the group.

    Ms. Harris has thanked KHive for its support of her on Twitter, and her husband, Doug Emhoff, regularly interacts with them.

    Lis Smith, a Democratic strategist, has long maintained an active social media profile. In 2012, when she was working on Mr. Obama’s re-election campaign as the rapid response director, Twitter temporarily locked her profile after she sent so many tweets during a presidential debate that she set off an internal alarm at the company designed to identify bots.

    But Ms. Smith warned that campaigns can go too far in letting their social media presence define them.

    “Social media is an increasingly big part of the job, but not in a good way,” she said. “Candidates who use social media in an authentic way can reinforce their strengths.
    But if you let Twitter supplant the hard work of dealing with reporters, you’re essentially breaking down a legitimate line of communications with the public.”

    Ryan James Girdusky, a conservative activist with over 110,000 followers on Twitter, said that staff members whose agility on social media could drive attention to a candidate’s message were a significant advantage during a campaign.

    “When you have a new social media account, you have to build followers,” said Mr. Girdusky, a co-author of the book “They’re Not Listening: How the Elites Created the National Populist Revolution.”

    “When you’re behind the eight ball, it’s definitely a major plus to have people who are known in the conservative movement and bring that level of credibility,” he added.


  • in

    Ahead of Midterms, Disinformation Is Even More Intractable

    On the morning of July 8, former President Donald J. Trump took to Truth Social, a social media platform he founded with people close to him, to claim that he had in fact won the 2020 presidential vote in Wisconsin, despite all evidence to the contrary.

    Barely 8,000 people shared that missive on Truth Social, a far cry from the hundreds of thousands of responses his posts on Facebook and Twitter had regularly generated before those services suspended his megaphones after the deadly riot on Capitol Hill on Jan. 6, 2021.

    And yet Mr. Trump’s baseless claim pulsed through the public consciousness anyway. It jumped from his app to other social media platforms — not to mention podcasts, talk radio and television.

    Within 48 hours of Mr. Trump’s post, more than one million people saw his claim on at least a dozen other sites. It appeared on Facebook and Twitter, from which he has been banished, but also on YouTube, Gab, Parler and Telegram, according to an analysis by The New York Times.

    The spread of Mr. Trump’s claim illustrates how, ahead of this year’s midterm elections, disinformation has metastasized since experts began raising alarms about the threat. Despite years of efforts by the media, by academics and even by social media companies themselves to address the problem, it is arguably more pervasive and widespread today.

    “I think the problem is worse than it’s ever been, frankly,” said Nina Jankowicz, an expert on disinformation who briefly led an advisory board within the Department of Homeland Security dedicated to combating misinformation. The creation of the panel set off a furor, prompting her to resign and the group to be dismantled.

    Not long ago, the fight against disinformation focused on the major social media platforms, like Facebook and Twitter.
    When pressed, they often removed troubling content, including misinformation and intentional disinformation about the Covid-19 pandemic.

    Today, however, there are dozens of new platforms, including some that pride themselves on not moderating — censoring, as they put it — untrue statements in the name of free speech.

    Other figures followed Mr. Trump in migrating to these new platforms after being “censored” by Facebook, YouTube or Twitter. They included Michael Flynn, the retired general who served briefly as Mr. Trump’s first national security adviser; L. Lin Wood, a pro-Trump lawyer; Naomi Wolf, a feminist author and vaccine skeptic; and assorted adherents of QAnon and the Oath Keepers, the far-right militia.

    At least 69 million people have joined platforms, like Parler, Gab, Truth Social, Gettr and Rumble, that advertise themselves as conservative alternatives to Big Tech, according to statements by the companies. Though many of those users are ostracized from the larger platforms, they continue to spread their views, which often appear in screenshots posted on the sites that barred them.

    “Nothing on the internet exists in a silo,” said Jared Holt, a senior manager on hate and extremism research at the Institute for Strategic Dialogue. “Whatever happens in alt platforms like Gab or Telegram or Truth makes its way back to Facebook and Twitter and others.”

    [Photo: Users have migrated to apps like Truth Social after being “censored” by Facebook, YouTube or Twitter. Leon Neal/Getty Images]

    The diffusion of the people who spread disinformation has radicalized political discourse, said Nora Benavidez, senior counsel at Free Press, an advocacy group for digital rights and accountability.

    “Our language and our ecosystems are becoming more caustic online,” she said.

    The shifts in the disinformation landscape are becoming clear with the new cycle of American elections. In 2016, Russia’s covert campaign to spread false and divisive posts seemed like an aberration in the American political system. Today disinformation, from enemies foreign and domestic, has become a feature of it.

    The baseless idea that President Biden was not legitimately elected has gone mainstream among Republican Party members, driving state and county officials to impose new restrictions on casting ballots, often based on mere conspiracy theories percolating in right-wing media.

    Voters must now sift through not only an ever-growing torrent of lies and falsehoods about candidates and their policies, but also information on when and where to vote.
    Officials appointed or elected in the name of fighting voter fraud have put themselves in a position to refuse to certify outcomes that are not to their liking.

    The purveyors of disinformation have also become increasingly sophisticated at sidestepping the major platforms’ rules, while the use of video to spread false claims on YouTube, TikTok and Instagram has made those claims harder for automated systems to track than text.

    TikTok, which is owned by the Chinese tech giant ByteDance, has become a primary battleground in today’s fight against disinformation. A report last month by NewsGuard, an organization that tracks the problem online, showed that nearly 20 percent of videos presented as search results on TikTok contained false or misleading information on topics such as school shootings and Russia’s war in Ukraine.

    [Photo: Katie Harbath in Facebook’s “war room,” where election-related content was monitored on the platform, in 2018. Jeff Chiu/Associated Press]

    “People who do this know how to exploit the loopholes,” said Katie Harbath, a former director of public policy at Facebook who now leads Anchor Change, a strategic consultancy.

    With the midterm elections only weeks away, the major platforms have all pledged to block, label or marginalize anything that violates company policies, including disinformation, hate speech or calls to violence.

    Still, the cottage industry of experts dedicated to countering disinformation — think tanks, universities and nongovernment organizations — says the industry is not doing enough. The Stern Center for Business and Human Rights at New York University warned last month, for example, that the major platforms continued to amplify “election denialism” in ways that undermined trust in the democratic system.

    Another challenge is the proliferation of alternative platforms for those falsehoods and even more extreme views.

    Many of those new platforms have flourished in the wake of Mr. Trump’s defeat in 2020, though they have not yet reached the size or reach of Facebook and Twitter. They portray Big Tech as beholden to the government, the deep state or the liberal elite.

    Parler, a social network founded in 2018, was one of the fastest-growing sites — until Apple’s and Google’s app stores kicked it off after the deadly riot on Jan. 6, which was fueled by disinformation and calls for violence online. It has since returned to both stores and begun to rebuild its audience by appealing to those who feel their voices have been silenced.

    “We believe at Parler that it is up to the individual to decide what he or she thinks is the truth,” Amy Peikoff, the platform’s chief policy officer, said in an interview.

    She argued that the problem with disinformation or conspiracy theories stemmed from the algorithms that platforms use to keep people glued online — not from the unfettered debate that sites like Parler foster.

    On Monday, Parler announced that Kanye West had agreed in principle to purchase the platform, a deal that the rapper and fashion designer, now known as Ye, cast in political terms.

    “In a world where conservative opinions are considered to be controversial, we have to make sure we have the right to freely express ourselves,” he said, according to the company’s statement.

    Parler’s competitors now include BitChute, Gab, Gettr, Rumble, Telegram and Truth Social, with each offering itself as a sanctuary from the moderation policies of the major platforms on everything from politics to health policy.

    A new survey by the Pew Research Center found that 15 percent of prominent accounts on those seven platforms had previously been banished from others like Twitter and Facebook.

    [Photo: Apps like Gettr market themselves as alternatives to Big Tech. Elijah Nouvelage/Getty Images]

    Nearly two-thirds of the users of those platforms said they had found a community of people who share their views, according to the survey.
A majority are Republicans or lean Republican.A result of this atomization of social media sources is to reinforce the partisan information bubbles within which millions of Americans live.At least 6 percent of Americans now regularly get news from at least one of these relatively new sites, which often “highlight non-mainstream world views and sometimes offensive language,” according to Pew. One in 10 posts on these platforms that mentioned L.G.B.T.Q. issues involved derisive allegations, the survey found.These new sites are still marginal compared with the bigger platforms; Mr. Trump, for example, has four million followers on Truth Social, compared with 88 million when Twitter kicked him off in 2021.Even so, Mr. Trump has increasingly resumed posting with the vigor he once showed on Twitter. The F.B.I. raid on Mar-a-Lago thrust his latest pronouncements into the eye of the political storm once again.For the major platforms, the financial incentive to attract users — and their clicks — remains powerful and could undo the steps they took in 2021. There is also an ideological component. The emotionally laced appeal to individual liberty in part drove Elon Musk’s bid to buy Twitter, which appears to have been revived after months of legal maneuvering.Nick Clegg, the president of global affairs at Meta, Facebook’s parent company, even suggested recently that the platform might reinstate Mr. Trump’s account in 2023 — ahead of what could be another presidential run. Facebook had previously said it would do so only “if the risk to public safety has receded.”Nick Clegg, Meta’s president for global affairs.Patrick T. Fallon/Agence France-Presse — Getty ImagesA study of Truth Social by Media Matters for America, a left-leaning media monitoring group, examined how the platform had become a home for some of the most fringe conspiracy theories. Mr. 
Trump, who began posting on the platform in April, has increasingly amplified content from QAnon, the online conspiracy theory. He has shared posts from QAnon accounts more than 130 times. QAnon believers promote a vast and complex falsehood that centers on Mr. Trump as a leader battling a cabal of Democratic Party pedophiles. Echoes of such views reverberated through Republican election campaigns across the country during this year’s primaries.

Ms. Jankowicz, the disinformation expert, said the nation’s social and political divisions had churned the waves of disinformation. The controversies over how best to respond to the Covid-19 pandemic deepened distrust of government and medical experts, especially among conservatives. Mr. Trump’s refusal to accept the outcome of the 2020 election led to, but did not end with, the Capitol Hill violence.

“They should have brought us together,” Ms. Jankowicz said, referring to the pandemic and the riots. “I thought perhaps they could be kind of this convening power, but they were not.”


    Twitter and TikTok Lead in Amplifying Misinformation, Report Finds

A new analysis found that algorithms and some features of social media sites help false posts go viral.

It is well known that social media amplifies misinformation and other harmful content. The Integrity Institute, an advocacy group, is now trying to measure exactly how much — and on Thursday it began publishing results that it plans to update each week through the midterm elections on Nov. 8.

The institute’s initial report, posted online, found that a “well-crafted lie” will get more engagements than typical, truthful content and that some features of social media sites and their algorithms contribute to the spread of misinformation.

Twitter, the analysis showed, has what the institute called the greatest misinformation amplification factor, in large part because of its feature allowing people to share, or “retweet,” posts easily. It was followed by TikTok, the Chinese-owned video site, which uses machine-learning models to predict engagement and make recommendations to users.

“We see a difference for each platform because each platform has different mechanisms for virality on it,” said Jeff Allen, a former integrity officer at Facebook and a founder and the chief research officer at the Integrity Institute. “The more mechanisms there are for virality on the platform, the more we see misinformation getting additional distribution.”

The institute calculated its findings by comparing posts that members of the International Fact-Checking Network have identified as false with the engagement of previous posts from the same accounts that were not flagged. It analyzed nearly 600 fact-checked posts in September on a variety of subjects, including the Covid-19 pandemic, the war in Ukraine and the upcoming elections.

Facebook, according to the sample that the institute has studied so far, had the most instances of misinformation but amplified such claims to a lesser degree, in part because sharing posts requires more steps. 
But some of its newer features are more prone to amplifying misinformation, the institute found. Facebook’s amplification factor for video content alone is closer to TikTok’s, because the platform’s Reels and Facebook Watch, which are video features, “both rely heavily on algorithmic content recommendations” based on engagements, according to the institute’s calculations.

Instagram, which like Facebook is owned by Meta, had the lowest amplification rate. There was not yet sufficient data to make a statistically significant estimate for YouTube, according to the institute.

The institute plans to update its findings to track how the amplification fluctuates, especially as the midterm elections near. Misinformation, the institute’s report said, is much more likely to be shared than merely factual content.

“Amplification of misinformation can rise around critical events if misinformation narratives take hold,” the report said. “It can also fall, if platforms implement design changes around the event that reduce the spread of misinformation.”
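The comparison the institute describes — engagement on fact-checked false posts versus an account's typical engagement on earlier, unflagged posts — can be sketched as a simple ratio. This is an illustrative reconstruction of the idea, not the Integrity Institute's actual methodology or code; the function name and numbers are assumptions.

```python
# Hypothetical sketch of an "amplification factor": how much more engagement
# posts fact-checked as false receive compared with an account's baseline
# engagement on prior, unflagged posts. Purely illustrative.
from statistics import median

def amplification_factor(flagged_engagements, baseline_engagements):
    """Ratio of typical engagement on false posts to the baseline.

    flagged_engagements: engagement counts for posts identified as false.
    baseline_engagements: engagement counts for earlier, unflagged posts
    from the same accounts.
    """
    baseline = median(baseline_engagements)
    if baseline == 0:
        raise ValueError("baseline engagement must be nonzero")
    return median(flagged_engagements) / baseline

# Hypothetical example: false posts drawing far more engagement than usual.
factor = amplification_factor([900, 1200, 1500], [100, 120, 80])
print(factor)  # a value above 1.0 means misinformation is being amplified
```

On this reading, a platform whose sharing mechanics push false posts well past an account's usual reach would show a higher factor, which matches the report's ranking of Twitter and TikTok above Instagram.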


    Meta Removes Chinese Effort to Influence U.S. Elections

Meta, the parent company of Facebook and Instagram, said on Tuesday that it had discovered and taken down what it described as the first targeted Chinese campaign to interfere in U.S. politics ahead of the midterm elections in November. Unlike the Russian efforts over the last two presidential elections, however, the Chinese campaign appeared limited in scope — and clumsy at times.

The fake posts began appearing on Facebook and Instagram, as well as on Twitter, in November 2021, using profile pictures of men in formal attire but the names of women, according to the company’s report. The users later posed as conservative Americans, promoting gun rights and opposition to abortion, while criticizing President Biden. By April, they mostly presented themselves as liberals from Florida, Texas and California, opposing guns and promoting reproductive rights. They mangled the English language and failed to attract many followers.

Two Meta officials said they could not definitively attribute the campaign to any group or individuals. Yet the tactics reflected China’s growing efforts to use international social media to promote the Communist Party’s political and diplomatic agenda. What made the effort unusual was its apparent focus on divisive domestic politics ahead of the midterms.

In previous influence campaigns, China’s propaganda apparatus concentrated more broadly on criticizing American foreign policy, while promoting China’s view of issues like the crackdown on political rights in Hong Kong and the mass repression in Xinjiang, the mostly Muslim region where hundreds of thousands were forced into re-education camps or prisons.

Ben Nimmo, Meta’s lead official for global threat intelligence, said the operation reflected “a new direction for Chinese influence operations.” “It is talking to Americans, pretending to be Americans rather than talking about America to the rest of the world,” he added later. 
“So the operation is small in itself, but it is a change.”

The operation appeared to lack urgency and scope, raising questions about its ambition and goals. It involved only 81 Facebook accounts, eight Facebook pages and one group. By July, the operation had suddenly shifted its efforts away from the United States and toward politics in the Czech Republic. The posts appeared during working hours in China, typically when Americans were asleep. They dropped off noticeably during what appeared to be “a substantial lunch break.” In one post, a user struggled with clarity: “I can’t live in an America on regression.”

Even if the campaign failed to go viral, Mr. Nimmo said, the company’s disclosure was intended to draw attention to the potential threat of Chinese interference in the domestic affairs of its rivals.

Meta also announced that it had taken down a much larger Russian influence operation that began in May and focused primarily on Germany, as well as France, Italy and Britain. The company said it was “the largest and most complex” operation it had detected from Russia since the war in Ukraine began in February.

The campaign centered on a network of 60 websites that impersonated legitimate news organizations in Europe, like Der Spiegel, Bild, The Guardian and ANSA, the Italian news agency. The sites would post original articles criticizing Ukraine, warning about Ukrainian refugees and arguing that economic sanctions against Russia would only backfire. Those articles were then promoted across the internet, including on Facebook and Instagram, but also on Twitter and Telegram, the messaging app, which is widely used in Russia.

The Russian operation involved 1,633 accounts on Facebook, 703 pages and one group, as well as 29 different accounts on Instagram, the company’s report said. About 4,000 accounts followed one or more of the Facebook pages. 
As Meta moved to block the operation’s domains, new websites appeared, “suggesting persistence and continuous investment in this activity.” Meta began its investigation after disclosures in August by one of Germany’s television networks, ZDF. As in the case of the Chinese operation, it did not explicitly accuse the government of the Russian president, Vladimir V. Putin, though the activity clearly mirrors the Kremlin’s extensive information war surrounding its invasion.

“They were kind of throwing everything at the wall and not a lot of it was sticking,” said David Agranovich, Meta’s director of threat disruption. “It doesn’t mean that we can say mission accomplished here.”

Meta’s report noted overlap between the Russian and Chinese campaigns on “a number of occasions,” although the company said they were unconnected. The overlap reflects the growing cross-fertilization of official statements and state media reports in the two countries, especially regarding the United States.

The accounts associated with the Chinese campaign posted material from Russia’s state media, including unfounded allegations that the United States had secretly developed biological weapons in Ukraine. A French-language account linked to the operation posted a version of the allegation in April, 10 days after it had originally been posted by Russia’s Ministry of Defense on Telegram. That one drew only one response, in French, from an authentic user, according to Meta.

“Fake,” the user wrote. “Fake. Fake as usual.”


    The Midterm Election’s Most Dominant Toxic Narratives

Ballot mules. Poll watch parties. Groomers.

These topics are now among the most dominant divisive and misleading narratives online about November’s midterm elections, according to researchers and data analytics companies. On Twitter, Facebook, Reddit, Truth Social and other social media sites, some of these narratives have surged in recent months, often accompanied by angry and threatening rhetoric.

The effects of these inflammatory online discussions are being felt in the real world, election officials and voting rights groups said. Voters have flooded some local election offices with misinformed questions about supposedly rigged voting machines, while some people appear befuddled about which pens to use on ballots and whether mail-in ballots are still legal, they said.

“Our voters are angry and confused,” Lisa Marra, elections director in Cochise County, Ariz., told a House committee last month. “They simply don’t know what to believe.”

The most prevalent of these narratives fall into three main categories: continued falsehoods about rampant election fraud; threats of violence and citizen policing of elections; and divisive posts on health and social policies that have become central to political campaigns. Here’s what to know about them.

Misinformation about the 2020 election, left, has fueled the “Stop the Steal” movement, center, and continues to be raised at campaign events for the midterms, right. (From left: Amir Hamja for The New York Times, Gabriela Bhaskar for The New York Times, Ash Ponders for The New York Times)

Election Fraud

False claims of election fraud are commanding conversation online, with former President Donald J. 
Trump continuing to protest that the 2020 presidential election was stolen from him. Voter fraud is rare, but that falsehood about the 2020 election has become a central campaign issue for dozens of candidates around the country, causing misinformation and toxic content about the issue to spread widely online.

“Stolen election” was mentioned 325,589 times on Twitter from June 19 to July 19, a number that has been fairly steady throughout the year and that was up nearly 900 percent from the same period in 2020, according to Zignal Labs, a media research firm. On the video-sharing site Rumble, videos with the term “stop the steal” or “stolen election” and other claims of election fraud have been among the most popular. In May, such posts attracted 2.5 million viewers, more than triple the total from a year earlier, according to Similarweb, a digital analytics firm.

More recently, misinformation around the integrity of voting has metastasized. More conspiracy theories are circulating online about individuals submitting fraudulent ballots, about voting machines being rigged to favor Democrats and about election officials switching the kinds of pens that voters must use to mark ballots in order to confuse them.

These conspiracy theories have in turn spawned new terms, such as “ballot trafficking” and “ballot mules,” the latter describing people who are paid to cast fake ballots. The terms were popularized by the May release of the film “2000 Mules,” a discredited movie claiming widespread voter fraud in the 2020 election. From June 19 to July 19, “ballot mules” was mentioned 17,592 times on Twitter; it was not used before the 2020 election, according to Zignal.

In April, the conservative talk show host Charlie Kirk interviewed the stars of the film, including Catherine Engelbrecht of the nonprofit voting group True the Vote. Mr. Kirk’s interview has garnered more than two million views online.

“A sense of grievance is already in place,” said Kyle Weiss, a senior analyst at Graphika, a research firm that studies misinformation and fake social media accounts. 
The 2020 election “primed the public on a set of core narratives, which are reconstituting and evolving in 2022.”

The security of ballot drop boxes, left; the search for documents at Mar-a-Lago, center; and the role of the F.B.I., right, are being widely discussed online in the context of the midterm elections. (From left: Marco Garcia for The New York Times, Saul Martinez for The New York Times, Kenny Holston for The New York Times)

Calls to Action

Online conversations about the midterm elections have also been dominated by calls for voters to act against apparent election fraud. In response, some people have organized citizen policing of voting, with stakeouts of polling stations and demands for information about voter rolls in their counties. Civil rights groups widely criticize poll watching, which they say can intimidate voters, particularly immigrants and voters at sites in communities of color.

From July 27 to Aug. 3, the second-most-shared tweet about the midterms was a photo of people staking out a ballot box, with the message that “residents are determined to safeguard the drop boxes,” according to Zignal. Among those who shared it was Dinesh D’Souza, the creator of “2000 Mules,” who has 2.4 million followers on Twitter.

In July, Seth Keshel, a retired Army captain who has challenged the result of the 2020 presidential election, shared a message on Telegram calling for “all-night patriot tailgate parties for EVERY DROP BOX IN AMERICA.” The post was viewed more than 70,000 times.

Anger toward the F.B.I. is also reflected in midterm-related conversations, with a rise in calls to shut down or defund the agency after last month’s raid of Mr. Trump’s Florida residence, Mar-a-Lago. “Abolish FBI” became a trending hashtag across social media, mentioned 122,915 times on Twitter, Facebook, Reddit and news sites from July 1 to Aug. 
30, up 1,990 percent from about 5,882 mentions in the two months before the 2020 election, according to Zignal.

In a video posted on Twitter on Sept. 20, Representative Andrew Clyde, Republican of Georgia, implied that he and others would take action against the F.B.I. if Republicans won control of Congress in November. “You wait till we take the House back. You watch what happens to the F.B.I.,” he said in a video captured by a left-leaning online show, “The Undercurrent,” and shared more than 1,000 times on Twitter within a few hours. Mr. Clyde did not respond to a request for comment.

Representative Marjorie Taylor Greene of Georgia, center, is among the politicians who have spread misinformation about gay and transgender people, a report said. (From left: Todd Heisler/The New York Times, Stefani Reynolds for The New York Times, Todd Heisler/The New York Times)

Hot-Button Issues

Some online conversations about the midterms are not directly related to voting. Instead, the discussions are centered on highly partisan issues — such as transgender rights — that candidates are campaigning on and that are widely regarded as motivating voters, leading to a surge of falsehoods.

A month after Florida passed legislation that prohibits classroom discussion or instruction about sexual orientation and gender identity, which the Republican governor, Ron DeSantis, signed into law in March, the volume of tweets falsely linking gay and transgender individuals to pedophilia soared. Language claiming that gay people and transgender people were “grooming” children for abuse increased 406 percent on Twitter in April, according to a study by the Human Rights Campaign and the Center for Countering Digital Hate.

The narrative was spread most widely by 10 far-right figures, including midterm candidates such as Representatives Lauren Boebert of Colorado and Marjorie Taylor Greene of Georgia, according to the report. 
Their tweets on “grooming” misinformation were viewed an estimated 48 million times, the report said.

In May, Ms. Boebert tweeted: “A North Carolina preschool is using LGBT flag flashcards with a pregnant man to teach kids colors. We went from Reading Rainbow to Randy Rainbow in a few decades, but don’t dare say the Left is grooming our kids!” The tweet was shared nearly 2,000 times and liked nearly 10,000 times. Ms. Boebert and Ms. Taylor Greene did not respond to requests for comment.

On Facebook and Instagram, 59 ads also promoted the narrative that the L.G.B.T.Q.+ community and allies were “grooming” children, the report found. Meta, the owner of Facebook and Instagram, accepted up to $24,987 for the ads, which were served to users over 2.1 million times, according to the report. Meta said it had removed several of the ads mentioned in the report.

“The repeated pushing of ‘groomer’ narratives has resulted in a wider anti-L.G.B.T. moral panic that has been influencing state and federal legislation and is likely to be a significant midterm issue,” said David Thiel, the chief technical officer at the Stanford Internet Observatory, which studies online extremism and disinformation.


    As Midterms Loom, Mark Zuckerberg Shifts Focus Away From Elections

Mark Zuckerberg, Facebook’s chief executive, made securing the 2020 U.S. election a top priority. He met regularly with an election team, which included more than 300 people from across his company, to prevent misinformation from spreading on the social network. He asked civil rights leaders for advice on upholding voter rights.

The core election team at Facebook, which was renamed Meta last year, has since been dispersed. Roughly 60 people are now focused primarily on elections, while others split their time on other projects. They meet with another executive, not Mr. Zuckerberg. And the chief executive has not talked recently with civil rights groups, even as some have asked him to pay more attention to the midterm elections in November.

Safeguarding elections is no longer Mr. Zuckerberg’s top concern, said four Meta employees with knowledge of the situation. Instead, he is focused on transforming his company into a provider of the immersive world of the metaverse, which he sees as the next frontier of growth, said the people, who were not authorized to speak publicly.

The shift in emphasis at Meta, which also owns Instagram and WhatsApp, could have far-reaching consequences as faith in the U.S. electoral system reaches a brittle point. The hearings on the Jan. 6 Capitol riots have underlined how precarious elections can be. And dozens of political candidates are running this November on the false premise that former President Donald J. Trump was robbed of the 2020 election, with social media platforms continuing to be a key way to reach American voters.

Election misinformation remains rampant online. This month, “2000 Mules,” a film that falsely claims the 2020 election was stolen from Mr. Trump, was widely shared on Facebook and Instagram, garnering more than 430,000 interactions, according to an analysis by The New York Times. 
In posts about the film, commenters said they expected election fraud this year and warned against using mail-in voting and electronic voting machines.

Voters casting their ballots in Portland, Maine, this month. (Jodi Hilton for The New York Times)

Other social media companies have also pulled back some of their focus on elections. Twitter, which stopped labeling and removing election misinformation in March 2021, has been preoccupied with its $44 billion sale to Elon Musk, three employees with knowledge of the situation said. Mr. Musk has suggested he wants fewer rules about what can and cannot be posted on the service.

“Companies should be growing their efforts to get prepared to protect the integrity of elections for the next few years, not pulling back,” said Katie Harbath, chief executive of the consulting firm Anchor Change, who formerly managed election policy at Meta. “Many issues, including candidates pushing that the 2020 election was fraudulent, remain, and we don’t know how they are handling those.”

Meta, which along with Twitter barred Mr. Trump from its platforms after the riot at the U.S. Capitol on Jan. 6, 2021, has worked over the years to limit political falsehoods on its sites. Tom Reynolds, a Meta spokesman, said the company had “taken a comprehensive approach to how elections play out on our platforms since before the U.S. 2020 elections and through the dozens of global elections since then.”

Mr. Reynolds disputed that there were 60 people focused on the integrity of elections. He said Meta had hundreds of people across more than 40 teams focused on election work. 
With each election, he said, the company was “building teams and technologies and developing partnerships to take down manipulation campaigns, limit the spread of misinformation and maintain industry-leading transparency around political ads and pages.”

Trenton Kennedy, a Twitter spokesman, said the company was continuing “our efforts to protect the integrity of election conversation and keep the public informed on our approach.” For the midterms, Twitter has labeled the accounts of political candidates and provided information boxes on how to vote in local elections.

How Meta and Twitter treat elections has implications beyond the United States, given the global nature of their platforms. In Brazil, which is holding a general election in October, President Jair Bolsonaro has recently raised doubts about the country’s electoral process. Latvia, Bosnia and Slovenia are also holding elections in October.

“People in the U.S. are almost certainly getting the Rolls-Royce treatment when it comes to any integrity on any platform, especially for U.S. elections,” said Sahar Massachi, the executive director of the think tank Integrity Institute and a former Facebook employee. “And so however bad it is here, think about how much worse it is everywhere else.”

Facebook’s role in potentially distorting elections became evident after 2016, when Russian operatives used the site to spread inflammatory content and divide American voters in the U.S. presidential election. In 2018, Mr. Zuckerberg testified before Congress that election security was his top priority. “The most important thing I care about right now is making sure no one interferes in the various 2018 elections around the world,” he said.

The social network has since become efficient at removing foreign efforts to spread disinformation in the United States, election experts said. But Facebook and Instagram still struggle with conspiracy theories and other political lies on their sites, they said.

In November 2019, Mr. 
Zuckerberg hosted a dinner at his home for civil rights leaders and held phone and Zoom conference calls with them, promising to make election integrity a main focus. He also met regularly with an election team. More than 300 employees from various product and engineering teams were asked to build new systems to detect and remove misinformation. Facebook also moved aggressively to eliminate toxic content, banning QAnon conspiracy theory posts and groups in October 2020.

Around the same time, Mr. Zuckerberg and his wife, Priscilla Chan, donated $400 million to local governments to fund poll workers, pay rental fees for polling places, provide personal protective equipment and cover other administrative costs. The week before the November 2020 election, Meta also froze all political advertising to limit the spread of falsehoods.

But while there were successes — the company kept foreign election interference off the platform — it struggled with how to handle Mr. Trump, who used his Facebook account to amplify false claims of voter fraud. After the Jan. 6 riot, Facebook barred Mr. Trump from posting. He is eligible for reinstatement in January 2023.

Last year, Frances Haugen, a Facebook employee turned whistle-blower, filed complaints with the Securities and Exchange Commission accusing the company of removing election safety features too soon after the 2020 election. Facebook prioritized growth and engagement over security, she said.

In October, Mr. Zuckerberg announced that Facebook would focus on the metaverse. The company has restructured, with more resources devoted to developing the online world.

The team working on elections now meets regularly with Nick Clegg, Meta’s president for global affairs. (Christopher Furlong/Getty Images)

Meta also retooled its election team. Now the number of employees whose job is to focus solely on elections is approximately 60, down from over 300 in 2020, according to employees. 
Hundreds of others participate in meetings about elections and are part of cross-functional teams, where they work on other issues. Divisions that build virtual reality software, a key component of the metaverse, have expanded.


    Meta Will Give Researchers More Information on Political Ad Targeting

Meta, which owns Facebook and Instagram, said that it planned to give outside researchers more detailed information on how political ads are targeted across its platforms, providing insight into the ways that politicians, campaign operatives and political strategists buy and use ads ahead of the midterm elections.

Starting on Monday, academics and researchers who are registered with an initiative called the Facebook Open Research and Transparency project will be allowed to see data on how each political or social ad was used to target people. The information includes which interest categories — such as “people who like dogs” or “people who enjoy the outdoors” — were chosen to aim an ad at someone.

In addition, Meta said it planned to include summaries of targeting information for some of its ads in its publicly viewable Ad Library starting in July. The company created the Ad Library in 2019 so that journalists, academics and others could obtain information and help safeguard elections against the misuse of digital advertising.

While Meta has given outsiders some access to how its political ads were used in the past, it has restricted the amount of information that could be seen, citing privacy reasons. Critics have claimed that the company’s system has been flawed and sometimes buggy, and have frequently asked for more data.

That has led to conflicts. Meta previously clashed with a group of New York University academics who tried ingesting large amounts of self-reported data on Facebook users to learn more about the platform. 
The company cut off the group’s access last year, citing violations of its platform rules.

The new data being added to the Facebook Open Research and Transparency project and the Ad Library is a way to share information on political ad targeting while trying to keep data on users private, the company said. “By making advertiser targeting criteria available for analysis and reporting on ads run about social issues, elections and politics, we hope to help people better understand the practices used to reach potential voters on our technologies,” the company said in a statement.

With the new data, for example, researchers browsing the Ad Library could see that over the course of a month, a Facebook page ran 2,000 political ads and that 40 percent of the ad budget was targeted to “people who live in Pennsylvania” or “people who are interested in politics.”

Meta said it had been bound by privacy rules and regulations on what types of data it could share with outsiders. In an interview, Jeff King, a vice president in Meta’s business integrity unit, said the company had hired thousands of workers over the past few years to review those privacy issues.

“Every single thing we release goes through a privacy review now,” he said. “We want to make sure we give people the right amount of data, but still remain privacy conscious while we do it.”

The new data on political ads will cover the period from August 2020, three months before the last U.S. presidential election, to the present day.
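The budget-share summary described above — what fraction of a page's ad spend went to each targeting criterion — amounts to a simple aggregation. The sketch below illustrates the idea only; the record layout, field names and numbers are hypothetical assumptions, not Meta's actual Ad Library schema or API.

```python
# Illustrative aggregation of hypothetical ad records into per-criterion
# budget shares, mirroring the kind of summary described in the article.
# This is NOT Meta's Ad Library format; all fields and figures are invented.
from collections import defaultdict

ads = [  # one record per ad: dollar spend and its targeting criteria
    {"spend": 400, "targeting": ["people who live in Pennsylvania"]},
    {"spend": 350, "targeting": ["people who are interested in politics"]},
    {"spend": 250, "targeting": ["people who like dogs"]},
]

total = sum(ad["spend"] for ad in ads)
share = defaultdict(float)
for ad in ads:
    for criterion in ad["targeting"]:
        share[criterion] += ad["spend"] / total

# Print criteria from largest to smallest budget share.
for criterion, fraction in sorted(share.items(), key=lambda kv: -kv[1]):
    print(f"{criterion}: {fraction:.0%}")
```

A researcher could run the same aggregation over a month of a page's ads to produce statements like the article's "40 percent of the ad budget was targeted to 'people who live in Pennsylvania'."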