More stories

  • Russia’s Unsupported ‘Dirty Bomb’ Claims Spread Through Right-Wing U.S. Media

    After Russia claimed that Ukraine would use a “dirty bomb” — a conventional explosive that can spread radioactive material — on its own territory, Western countries reacted with skepticism, noting that Russia often accused others of doing what Moscow was considering doing itself. Russia provided no evidence to support its claim.

    But on right-wing media and in many right-wing communities online, Russia’s claim was portrayed as believable, and as a dire warning that such an attack could be used to escalate Russia’s war on Ukraine for Ukraine’s benefit, or even to trigger a new world war.

    Alex Jones, the American conspiracy theorist who often spreads lies on his Infowars platform, suggested during his online show on Monday that Ukraine would detonate a dirty bomb within its borders and then blame Russia as “a pretext to bring NATO fully into the conflict” and start World War III. “My analysis is, about 90 percent at this point, that there’s going to be full-on public war with Russia, and at least a tactical nuclear war in Europe,” he added.

    The Gateway Pundit, another right-wing American news site, repeated Russia’s claims in an article on Sunday and asked: “Would this surprise anyone?”

    And on YouTube, Gonzalo Lira, an American commentator who lives in Ukraine, said that “all the evidence” pointed to a “deliberate provocation that is being staged by the Americans.” The video received more than 59,000 views and circulated through right-wing social media.

    America’s far right has repeatedly sided with Russia in the conflict, echoing the baseless claims of President Vladimir V. Putin of Russia that the war was about rooting out corruption in Ukraine or ending neo-Nazism.

    As the conflict has continued, online communities have interpreted developments through their own conspiratorial lens, picking up Russian narratives that Ukraine would bomb its own citizens to curry international sympathy. Those discussions continued this week regarding claims about the possible use of a dirty bomb in the conflict.

    In one video — circulated widely on the messaging app Telegram and seen more than 36,000 times on the video-streaming service Brighteon — a streamer suggested that a “cabal” of Democrats and leftists was plotting to use the dirty bomb to set off a nuclear war, which would prevent Republicans from winning the midterm elections.

    “The cabal is so desperate, obviously, that they can’t allow that to happen,” the host said. “And even if it does happen, they’ll probably try to push for war before the next Congress gets in.”

  • ‘We risk another crisis’: TikTok in danger of being major vector of election misinformation

    A study suggests the video platform is failing to filter false claims and rhetoric in the weeks leading up to the US midterms

    In the final sprint to the US midterm elections, the social media giant TikTok risks being a major vector for election misinformation, experts warn, with the platform’s massive user base and its design making it particularly susceptible to such threats.

    Preliminary research published last week by the digital watchdog Global Witness and the Cybersecurity for Democracy team at New York University suggests the video platform is failing to filter large volumes of election misinformation in the weeks leading up to the vote. TikTok approved 90% of advertisements featuring election misinformation submitted by researchers, including ads containing the wrong election date, false claims about voting requirements, and rhetoric dissuading people from voting.

    TikTok has for several years prohibited political advertising on the platform, including branded content from creators and paid advertisements, and ahead of the midterm elections it has automatically disabled monetization to better enforce the policy, TikTok’s global business president, Blake Chandlee, said in a September blog post. “TikTok is, first and foremost, an entertainment platform,” he wrote.

    But the NYU study showed TikTok “performed the worst out of all of the platforms tested” in the experiment, the researchers said, approving more of the false advertisements than other sites such as YouTube and Facebook.

    The findings spark concern among experts, who point out that – with 80 million monthly users in the US and large numbers of young Americans indicating the platform is their primary source of news – such posts could have far-reaching consequences.

    Yet the results come as little surprise, those experts say. During previous major elections in the US, TikTok had far fewer users, but misinformation was already spreading widely on the app, and TikTok faced challenges moderating misinformation about elections in Kenya and the war in Ukraine. And the company, experts say, is doing far too little to rein in election lies spreading among its users.

    “This year is going to be much worse as we near the midterms,” said Olivia Little, a researcher at Media Matters for America who has studied misinformation on TikTok. “There has been an exponential increase in users, which only means there will be more misinformation TikTok needs to proactively work to stop or we risk facing another crisis.”

    A crucial test

    With Joe Biden himself warning that the integrity of American elections is under threat, TikTok has announced a slew of policies aimed at combating election misinformation spreading through the app. The company laid out guidelines and safety measures related to election content and launched an elections center, which “connect[s] people who engage with election content” to approved news sources in more than 45 languages.

    “To bolster our response to emerging threats, TikTok partners with independent intelligence firms and regularly engages with others across the industry, civil society organizations, and other experts,” said Eric Han, TikTok’s head of US safety, in August.

    In September, the company also announced new policies requiring government and politician accounts to be verified and said it would ban videos aimed at campaign fundraising. TikTok added it would block verified political accounts from using money-making features available to influencers on the app, such as digital payments and gifting.

    Still, experts have deep concerns about the spread of election falsehoods on the video app. Those fears are exacerbated by TikTok’s structure, which makes it difficult to investigate and quantify the spread of misinformation. Unlike Twitter, which makes its application programming interface (API) – software that allows researchers to extract data from platforms for analysis – publicly available, or Meta, which offers researchers its own content-monitoring tool, CrowdTangle, TikTok does not offer tools for external audits. However, independent research as well as the platform’s own transparency reports highlight the challenges it has faced in recent years moderating election-related content.

    TikTok removed 350,000 videos related to election misinformation in the latter half of 2020, according to a transparency report from the company, and blocked 441,000 videos containing misinformation from user feeds globally. The internet nonprofit Mozilla warned in the run-up to Kenya’s 2022 election that the platform was “failing its first real test” to stem dis- and misinformation during pivotal political moments. The nonprofit said it had found more than 130 videos on the platform containing election-related misinformation, hate speech and incitement against communities prior to the vote, which together gained more than 4m views. “Rather than learn from the mistakes of more established platforms like Facebook and Twitter, TikTok is following in their footsteps,” the Mozilla researcher Odanga Madung wrote at the time.

    Why TikTok is so vulnerable to misinformation

    Part of the reason TikTok is uniquely susceptible to misinformation lies in certain features of its design and algorithm, experts say. Its For You page, the app’s general video feed, is highly customized to users’ individual preferences via an algorithm that is little understood, even by the company’s own staff. That combination lends itself to misinformation bubbles, said Little, the Media Matters researcher.

    “TikTok’s hyper-tailored algorithm can blast random accounts into virality very quickly, and I don’t think that is going to change anytime soon because it’s the reason it has become such a popular platform,” she said.

    Meanwhile, the ease with which users remix, record and repost videos – few of which have been fact-checked – allows misinformation to spread easily while making it more difficult to remove. TikTok’s video-only content poses additional moderation hurdles, as artificial intelligence systems find it more difficult to automatically scan video for misinformation than text. Several recent studies have highlighted how those features have exacerbated the spread of misinformation on the platform.

    When it comes to TikTok content related to the war in Ukraine, for example, the ability to “remix media” without fact-checking it has made it difficult “even for seasoned journalists and researchers to discern truth from rumor, parody and fabrication”, said a recent report from Harvard’s Shorenstein Center on Media. That report cited other design features in the app that make it an easy pathway for misinformation, including that most users post under pseudonyms and that, unlike on Facebook, where users’ feeds are filled primarily with content from friends and people they know, TikTok’s For You page is largely composed of content from strangers.

    Some of these problems are not unique to TikTok, said Marc Faddoul, co-director of Tracking Exposed, a digital rights organization investigating TikTok’s algorithm. Studies have shown that algorithms across all platforms are optimized to detect and exploit cognitive biases for more polarizing content, and that any platform that relies on algorithms rather than a chronological newsfeed is more susceptible to disinformation. But TikTok is the most accelerated model of an algorithmic feed yet, he said. At the same time, he added, the platform has been slow in coming to grips with issues that have plagued its peers like Facebook and Twitter for years.

    “Historically, TikTok has characterized itself as an entertainment platform, denying they host political content and therefore disinformation, but we know now that is not the case,” he said.

    Young user base is particularly at risk

    Experts say an additional cause for concern is a lack of media literacy among TikTok’s largely young user base. The vast majority of young people in the US use TikTok, a recent Pew Research Center report showed. Internal data from Google revealed in July that nearly 40% of Gen Z – the generation born between the late 1990s and early 2000s – globally uses TikTok and Instagram as their primary search engines.

    In addition to being more likely to get their news from social media, Gen Z also has far higher rates of mistrust in traditional institutions such as the news media and the government compared with past generations, creating a perfect storm for the spread of misinformation, said Helen Lee Bouygues, president of the Reboot Foundation, a media literacy advocacy organization.

    “By the nature of its audience, TikTok is exposing a lot of young children to disinformation who are not trained in media literacy, period,” she said. “They are not equipped with the skills necessary to recognize propaganda or disinformation when they see it online.”

    The threat is amplified by the sheer amount of time spent on the app, with 67% of US teenagers using it for an average of 99 minutes per day. Research conducted by the Reboot Foundation showed that the longer a user spends on an app, the less likely they are to distinguish between misinformation and fact.

    To enforce its policies, which prohibit election misinformation, harassment, hateful behavior and violent extremism, TikTok says it relies on “a combination of people and technology” and partners with fact-checkers to moderate content. The company directed questions about its election misinformation measures to a blog post, but declined to share how many human moderators it employs.

    Bouygues said the company should do far more to protect its users, particularly young ones. Her research shows that media literacy and in-app nudges towards fact-checking could go a long way when it comes to combating misinformation. But government action is needed to force such changes.

    “If the TikToks of the world really want to fight fake news, they could do it,” she said. “But as long as their financial model is keeping eyes on the page, they have no incentive to do so. That’s where policymaking needs to come into play.”

  • Disinformation Is Harder to Combat in the U.S.

    The proliferation of alternative social media platforms has helped entrench false and misleading information as a key feature of American politics.

    On the morning of July 8, former President Donald Trump took to Truth Social, the social media platform he founded with people close to him, to claim that he had won the 2020 presidential election in the state of Wisconsin, despite all the evidence to the contrary.

    Around 8,000 people shared that missive on Truth Social, a far cry from the hundreds of thousands of responses his posts on Facebook and Twitter used to generate before those platforms cut off his megaphone after the deadly riot at the Capitol on Jan. 6, 2021.

    Even so, Trump’s unfounded claim coursed through the public consciousness. It jumped from his app to other social media platforms, not to mention podcasts, radio and television.

    Within 48 hours of his post, more than a million people had seen it in at least a dozen other places. It appeared on Facebook and Twitter, from which it was removed, but also on YouTube, Gab, Parler and Telegram, according to an analysis by The New York Times.

    The spread of Trump’s claim illustrates how disinformation has metastasized since experts began sounding the alarm about the threat it poses, and all of this is happening just before this year’s midterm elections. Despite years of effort by the news media, by academics and even by the social media companies themselves to confront the problem, it is arguably more pervasive and widespread today.

    “To be honest, I think the problem is worse than ever,” said Nina Jankowicz, a disinformation expert who briefly led an advisory board within the Department of Homeland Security dedicated to combating disinformation. The creation of the panel set off a furor that prompted her resignation and the dissolution of the board.

    Not long ago, the fight against disinformation focused on the major social media platforms, such as Facebook and Twitter. When pressed, they tended to remove problematic content, including misinformation and intentional disinformation about the COVID-19 pandemic.

    Now, however, there are dozens of new platforms, including some that pride themselves on not moderating (censoring, as they call it) false statements in the name of free speech.

    Other figures followed in Trump’s footsteps and moved to these new platforms after being “censored” by Facebook, YouTube or Twitter. Among them are Michael Flynn, the retired general who served briefly as Trump’s top national security adviser; L. Lin Wood, a pro-Trump lawyer; Naomi Wolf, a feminist author and vaccine skeptic; and assorted followers of QAnon and the Oath Keepers, a far-right militant group.

    At least 69 million people have joined platforms such as Parler, Gab, Truth Social, Gettr and Rumble, which promote themselves as conservative alternatives to Big Tech, according to statements from the companies themselves.

    Although many of those users no longer have a home on the larger platforms, they continue to spread their views, which often appear in screenshots posted on the sites that banned them.

    “Nothing on the internet exists in isolation,” said Jared Holt, a senior manager for hate and extremism research at the Institute for Strategic Dialogue. “Whatever happens on alternative platforms like Gab or Telegram or Truth sooner or later makes its way to Facebook, Twitter and the others.”

    Users have migrated to apps like Truth Social after being “censored” by Facebook, YouTube or Twitter. (Leon Neal/Getty Images)

    The reach of the people who spread disinformation has radicalized political discourse, said Nora Benavidez, senior counsel at Free Press, a group that advocates for digital rights and transparency.

    “Our language and our online ecosystems are becoming increasingly corrosive,” she said.

    The shifts in the disinformation landscape are becoming clearer with the U.S. election cycle. In 2016, Russia’s covert campaign to spread false and divisive messages seemed like an aberration in the American political system. Today disinformation, coming from foreign and domestic enemies, has become a feature of it.

    The unfounded idea that President Joe Biden was not legitimately elected has gone mainstream among members of the Republican Party, and it has driven state and county officials to impose new voting restrictions, often based solely on conspiracy theories that seep into right-wing media.

    Voters must filter not only an ever-growing torrent of lies and falsehoods about candidates and their policies, but also information about when and where to vote. Officials appointed or elected in the name of fighting electoral fraud have taken positions that imply they will refuse to certify results that are not to their liking.

    Purveyors of disinformation have also become increasingly sophisticated at evading the major platforms’ rules, while the use of video to spread false claims on YouTube, TikTok and Instagram has made such content harder for automated systems to identify than text.

    TikTok, owned by the Chinese tech giant ByteDance, has become one of the main battlegrounds in the current fight against disinformation. A report last month from NewsGuard, an organization that tracks the problem online, showed that nearly 20 percent of the videos that appeared as search results on TikTok contained false or misleading information on topics such as school shootings and Russia’s war in Ukraine.

    Katie Harbath in Facebook’s “war room,” where election-related content on the platform was monitored, in 2018. (Jeff Chiu/Associated Press)

    “The people who do this know how to exploit the loopholes,” said Katie Harbath, a former public policy director at Facebook who now leads Anchor Change, a strategic consultancy.

    With the midterm elections just weeks away, the major platforms have pledged to block, label or marginalize anything that violates company policies, including disinformation, hate speech or calls to violence.

    Even so, the cottage industry of experts dedicated to countering disinformation (think tanks, universities and nongovernmental organizations) says the industry is not doing enough. Last month, for example, the Stern Center for Business and Human Rights at New York University warned that the major platforms continued to amplify “election denialism” in ways that weakened confidence in the democratic system.

    Another challenge is the proliferation of alternative platforms for those falsehoods and for even more extreme views.

    Many of those new platforms flourished after Trump’s defeat in 2020, although they have not yet reached the size or reach of Facebook and Twitter. These platforms claim that Big Tech is beholden to the government, the deep state or the liberal elite.

    Parler, a social network founded in 2018, was one of the fastest-growing sites until Apple’s and Google’s app stores expelled it after the deadly Jan. 6 riot, which was fueled by disinformation and online calls to violence. It has since returned to both stores and begun rebuilding its audience by appealing to those who feel their voices have been silenced.

    “At Parler we believe it is up to the individual to decide what he or she thinks is the truth,” Amy Peikoff, the platform’s chief policy officer, said in an interview.

    She argued that the problem with disinformation or conspiracy theories stemmed from the algorithms that platforms use to keep people glued to the internet, not from the unmoderated debate that sites like Parler foster.

    On Monday, Parler announced that Kanye West had agreed in principle to buy the platform, a deal that the rapper and fashion designer, now known as Ye, framed in political terms.

    “In a world where conservative opinions are considered controversial, we have to make sure we have the right to express ourselves freely,” he said, according to the company’s statement.

    Parler’s competitors now are BitChute, Gab, Gettr, Rumble, Telegram and Truth Social, and each presents itself as a sanctuary from the moderation policies of the major platforms on everything from politics to health.

    A new survey by the Pew Research Center found that 15 percent of prominent accounts on those seven platforms had previously been banished from others such as Twitter and Facebook.

    Apps like Gettr market themselves as alternatives to the tech giants. (Elijah Nouvelage/Getty Images)

    According to the survey, nearly two-thirds of the users of those platforms said they had found a community of people who share their views. Most are Republicans or lean Republican.

    One consequence of this atomization of social media sources is that it reinforces the partisan information bubbles in which millions of Americans live.

    According to Pew, at least 6 percent of Americans regularly get news from at least one of these relatively new sites, which often “highlight non-mainstream world views and sometimes offensive language.” The survey found that one in 10 posts on these platforms that mentioned LGBTQ issues included derogatory allegations.

    These new sites are still marginal compared with the bigger platforms; Trump, for example, has 4 million followers on Truth Social, compared with the 88 million he had when Twitter shut down his account in 2021.

    Even so, Trump has increasingly resumed posting with the vigor he once showed on Twitter. The FBI search of Mar-a-Lago put his latest pronouncements back in the eye of the political storm.

    For the major platforms, the financial incentive to attract users, and their clicks, remains powerful and could lead them to roll back the steps they took in 2021. There is also an ideological component. The emotionally charged appeal to individual liberty in part drove Elon Musk’s bid to buy Twitter, which appears to have been revived after months of legal maneuvering.

    Nick Clegg, the president of global affairs at Meta, Facebook’s parent company, even suggested recently that the platform could reinstate Trump’s account in 2023, ahead of what could be another presidential run. Facebook had previously said it would do so only “if the risk to public safety has receded.”

    Nick Clegg, Meta’s president of global affairs. (Patrick T. Fallon/Agence France-Presse via Getty Images)

    A study of Truth Social by Media Matters for America, a left-leaning media watchdog group, examined how the platform has become a home for some of the most fringe conspiracy theories. Trump, who began posting on the platform in April, has increasingly amplified content from QAnon, the online conspiracy theory.

    He has shared QAnon posts more than 130 times. QAnon followers promote a vast and complex falsehood centered on Trump as a leader confronting a conspiracy by a cabal of Democratic Party pedophiles. Such views found a place in Republican campaigns during this year’s primaries.

    Jankowicz, the disinformation expert, said social and political divisions had churned the waves of disinformation.

    The controversies over how best to respond to the COVID-19 pandemic deepened distrust of the government and medical experts, especially among conservatives. Trump’s refusal to accept the result of the 2020 election led to the violence at the Capitol, but did not end with it.

    “They should have brought us together,” Jankowicz said, referring to the pandemic and the riots. “I thought perhaps they could serve as a kind of convening power, but they were not.”

    Steven Lee Myers covers disinformation for The Times. He has worked in Washington, Moscow, Baghdad and Beijing, where he contributed to the articles that won the Pulitzer Prize for public service in 2021. He is also the author of “The New Tsar: The Rise and Reign of Vladimir Putin.”

    Sheera Frenkel is an award-winning technology reporter based in San Francisco. In 2021, she and Cecilia Kang published “Manipulados,” the Spanish-language edition of “An Ugly Truth: Inside Facebook’s Battle for Domination.”

  • Social media giants struggle to tackle misinformation: Politics Weekly America – podcast


    This week, Jonathan Freedland and Anya van Wagtendonk look at how misinformation could affect the outcome of the midterm elections in November, and ask whether tech platforms and lawmakers should be doing more to help stem the erosion of voter confidence in American democracy


  • Ahead of Midterms, Disinformation Is Even More Intractable

    On the morning of July 8, former President Donald J. Trump took to Truth Social, a social media platform he founded with people close to him, to claim that he had in fact won the 2020 presidential vote in Wisconsin, despite all evidence to the contrary.

    Barely 8,000 people shared that missive on Truth Social, a far cry from the hundreds of thousands of responses his posts on Facebook and Twitter had regularly generated before those services suspended his megaphones after the deadly riot on Capitol Hill on Jan. 6, 2021.

    And yet Mr. Trump’s baseless claim pulsed through the public consciousness anyway. It jumped from his app to other social media platforms — not to mention podcasts, talk radio or television.

    Within 48 hours of Mr. Trump’s post, more than one million people saw his claim on at least a dozen other sites. It appeared on Facebook and Twitter, from which he has been banished, but also YouTube, Gab, Parler and Telegram, according to an analysis by The New York Times.

    The spread of Mr. Trump’s claim illustrates how, ahead of this year’s midterm elections, disinformation has metastasized since experts began raising alarms about the threat. Despite years of efforts by the media, by academics and even by social media companies themselves to address the problem, it is arguably more pervasive and widespread today.

    “I think the problem is worse than it’s ever been, frankly,” said Nina Jankowicz, an expert on disinformation who briefly led an advisory board within the Department of Homeland Security dedicated to combating misinformation. The creation of the panel set off a furor, prompting her to resign and the group to be dismantled.

    Not long ago, the fight against disinformation focused on the major social media platforms, like Facebook and Twitter. When pressed, they often removed troubling content, including misinformation and intentional disinformation about the Covid-19 pandemic.

    Today, however, there are dozens of new platforms, including some that pride themselves on not moderating — censoring, as they put it — untrue statements in the name of free speech.

    Other figures followed Mr. Trump in migrating to these new platforms after being “censored” by Facebook, YouTube or Twitter. They included Michael Flynn, the retired general who served briefly as Mr. Trump’s first national security adviser; L. Lin Wood, a pro-Trump lawyer; Naomi Wolf, a feminist author and vaccine skeptic; and assorted adherents of QAnon and the Oath Keepers, the far-right militia.

    At least 69 million people have joined platforms, like Parler, Gab, Truth Social, Gettr and Rumble, that advertise themselves as conservative alternatives to Big Tech, according to statements by the companies. Though many of those users are ostracized from larger platforms, they continue to spread their views, which often appear in screen shots posted on the sites that barred them.

    “Nothing on the internet exists in a silo,” said Jared Holt, a senior manager on hate and extremism research at the Institute for Strategic Dialogue. “Whatever happens in alt platforms like Gab or Telegram or Truth makes its way back to Facebook and Twitter and others.”

    Users have migrated to apps like Truth Social after being “censored” by Facebook, YouTube or Twitter. (Leon Neal/Getty Images)

    The diffusion of the people who spread disinformation has radicalized political discourse, said Nora Benavidez, senior counsel at Free Press, an advocacy group for digital rights and accountability.

    “Our language and our ecosystems are becoming more caustic online,” she said.

    The shifts in the disinformation landscape are becoming clear with the new cycle of American elections. In 2016, Russia’s covert campaign to spread false and divisive posts seemed like an aberration in the American political system. Today disinformation, from enemies foreign and domestic, has become a feature of it.

    The baseless idea that President Biden was not legitimately elected has gone mainstream among Republican Party members, driving state and county officials to impose new restrictions on casting ballots, often based on mere conspiracy theories percolating in right-wing media.

    Voters must now sift through not only an ever-growing torrent of lies and falsehoods about candidates and their policies, but also information on when and where to vote. Officials appointed or elected in the name of fighting voter fraud have put themselves in the position to refuse to certify outcomes that are not to their liking.

    The purveyors of disinformation have also become increasingly sophisticated at sidestepping the major platforms’ rules, while the use of video to spread false claims on YouTube, TikTok and Instagram has made them harder for automated systems to track than text.

    TikTok, which is owned by the Chinese tech giant ByteDance, has become a primary battleground in today’s fight against disinformation. A report last month by NewsGuard, an organization that tracks the problem online, showed that nearly 20 percent of videos presented as search results on TikTok contained false or misleading information on topics such as school shootings and Russia’s war in Ukraine.

    Katie Harbath in Facebook’s “war room,” where election-related content was monitored on the platform, in 2018. (Jeff Chiu/Associated Press)

    “People who do this know how to exploit the loopholes,” said Katie Harbath, a former director of public policy at Facebook who now leads Anchor Change, a strategic consultancy.

    With the midterm elections only weeks away, the major platforms have all pledged to block, label or marginalize anything that violates company policies, including disinformation, hate speech or calls to violence.

    Still, the cottage industry of experts dedicated to countering disinformation — think tanks, universities and nongovernment organizations — say the industry is not doing enough. The Stern Center for Business and Human Rights at New York University warned last month, for example, that the major platforms continued to amplify “election denialism” in ways that undermined trust in the democratic system.

    Another challenge is the proliferation of alternative platforms for those falsehoods and even more extreme views.

    Many of those new platforms have flourished in the wake of Mr. Trump’s defeat in 2020, though they have not yet reached the size or reach of Facebook and Twitter. They portray Big Tech as beholden to the government, the deep state or the liberal elite.

    Parler, a social network founded in 2018, was one of the fastest-growing sites — until Apple’s and Google’s app stores kicked it off after the deadly riot on Jan. 6, which was fueled by disinformation and calls for violence online. It has since returned to both stores and begun to rebuild its audience by appealing to those who feel their voices have been silenced.

    “We believe at Parler that it is up to the individual to decide what he or she thinks is the truth,” Amy Peikoff, the platform’s chief policy officer, said in an interview.

    She argued that the problem with disinformation or conspiracy theories stemmed from the algorithms that platforms use to keep people glued online — not from the unfettered debate that sites like Parler foster.

    On Monday, Parler announced that Kanye West had agreed in principle to purchase the platform, a deal that the rapper and fashion designer, now known as Ye, cast in political terms.

    “In a world where conservative opinions are considered to be controversial, we have to make sure we have the right to freely express ourselves,” he said, according to the company’s statement.

    Parler’s competitors now are BitChute, Gab, Gettr, Rumble, Telegram and Truth Social, with each offering itself as a sanctuary from the moderation policies of the major platforms on everything from politics to health policy.

    A new survey by the Pew Research Center found that 15 percent of prominent accounts on those seven platforms had previously been banished from others like Twitter and Facebook.

    Apps like Gettr market themselves as alternatives to Big Tech. (Elijah Nouvelage/Getty Images)

    Nearly two-thirds of the users of those platforms said they had found a community of people who share their views, according to the survey. A majority are Republicans or lean Republican.

    A result of this atomization of social media sources is to reinforce the partisan information bubbles within which millions of Americans live.

    At least 6 percent of Americans now regularly get news from at least one of these relatively new sites, which often “highlight non-mainstream world views and sometimes offensive language,” according to Pew. One in 10 posts on these platforms that mentioned L.G.B.T.Q. issues involved derisive allegations, the survey found.

    These new sites are still marginal compared with the bigger platforms; Mr. Trump, for example, has four million followers on Truth Social, compared with 88 million when Twitter kicked him off in 2021.

    Even so, Mr. Trump has increasingly resumed posting with the vigor he once showed on Twitter. The F.B.I. raid on Mar-a-Lago thrust his latest pronouncements into the eye of the political storm once again.

    For the major platforms, the financial incentive to attract users — and their clicks — remains powerful and could undo the steps they took in 2021. There is also an ideological component. The emotionally laced appeal to individual liberty in part drove Elon Musk’s bid to buy Twitter, which appears to have been revived after months of legal maneuvering.

    Nick Clegg, the president of global affairs at Meta, Facebook’s parent company, even suggested recently that the platform might reinstate Mr. Trump’s account in 2023 — ahead of what could be another presidential run. Facebook had previously said it would do so only “if the risk to public safety has receded.”

    Nick Clegg, Meta’s president for global affairs. (Patrick T. Fallon/Agence France-Presse via Getty Images)

    A study of Truth Social by Media Matters for America, a left-leaning media monitoring group, examined how the platform had become a home for some of the most fringe conspiracy theories. Mr. Trump, who began posting on the platform in April, has increasingly amplified content from QAnon, the online conspiracy theory.

    He has shared posts from QAnon accounts more than 130 times. QAnon believers promote a vast and complex falsehood that centers on Mr. Trump as a leader battling a cabal of Democratic Party pedophiles. Echoes of such views reverberated through Republican election campaigns across the country during this year’s primaries.

    Ms. Jankowicz, the disinformation expert, said the nation’s social and political divisions had churned the waves of disinformation.

    The controversies over how best to respond to the Covid-19 pandemic deepened distrust of government and medical experts, especially among conservatives. Mr. Trump’s refusal to accept the outcome of the 2020 election led to, but did not end with, the Capitol Hill violence.

    “They should have brought us together,” Ms. Jankowicz said, referring to the pandemic and the riots. “I thought perhaps they could be kind of this convening power, but they were not.”

  • A social network for bigots? No wonder Kanye West wants to buy Parler | Arwa Mahdawi

    Arwa Mahdawi

    The rapper’s antisemitic remarks have got him banned from Twitter and Instagram. But here’s a safe space where he’ll be able to say just what he likes.

    Kanye West, a man who can’t seem to stop saying bigoted things, is buying Parler, a social network designed especially for people who like to say bigoted things. I was a little surprised when this news broke on Monday because I thought Parler was basically a Nazified version of Myspace that nobody used any more. There are a bunch of fringe rightwing social networks out there – Gettr, Gab, Truth Social – and Parler might be the least successful of a very unsuccessful bunch. The Twitter clone was launched in 2018 with the stated aim of countering the “ever-increasing tyranny … of our tech overlords”; it had a brief moment of popularity then that fizzled out. No doubt because of the tyranny of our tech overlords.

    Despite the fact it’s not a household name, I’m sure I don’t need to explain why West, who has changed his name to Ye, is interested in Parler, which, one imagines, may soon change its name to Er. The musician, who has been moving dramatically to the right in recent years, had his Twitter and Instagram accounts locked this month because of antisemitic comments. Or that’s what us lefties have been saying anyway – West seems to think he was being censored and free speech is dead and liberals are trying to cancel him yada yada yada. Instead of engaging in any sort of introspection following his Twitter suspension, Ye apparently decided to fight for his right to be a bigot. Parler’s parent company, Parlement Technologies, put that in rather more sanitised terms. In a statement, it said West is making “a groundbreaking move into the free speech media space and will never have to fear being removed from social media again”.

    If you think you’ve heard this story before, it’s because you have. Rich conservatives are obsessed with creating safe spaces where they can never be criticised or contradicted; where nobody cares about facts and everyone cares about their feelings. Donald Trump launched Truth Social at the beginning of this year after he was banned by Twitter. Elon Musk said he was buying Twitter then said he wasn’t buying Twitter and now seems to be buying Twitter again. Trump-supporting Peter Thiel has put money into Rumble, a more rightwing version of YouTube.

    While it may look suspiciously like they’re too fragile to deal with other people’s opinions, conservatives always couch their obsession with building echo chambers in terms of “free speech”. George Farmer, the CEO of Parler’s parent company, for example, said he thinks West will “change the way the world thinks about free speech”. I don’t know about that. I do know, however, that the acquisition (which is for an undisclosed sum) is likely to change Farmer’s bank balance.

    Farmer, it’s important to note, happens to be married to Candace Owens, a rightwing pundit who once suggested the US military invade Australia in order to free its people “suffering under a totalitarian regime”. When she’s not dreaming about liberating Australia, Owens is busy palling around with West; the pair recently wore “White Lives Matter” shirts at Paris fashion week. Owens also defended West after he tweeted that he was going “death con 3 on JEWISH PEOPLE … You guys have toyed with me and tried to black ball anyone whoever opposes your agenda.” This is obviously indefensible, but Owens did her best, saying on her podcast: “If you’re an honest person, when you read this tweet, you had no idea what the hell he was talking about … if you are an honest person, you did not think this tweet was antisemitic.” (If I’m honest, I think it was.)

    The Farmer-Owens-West connection has led a number of people to suspect that the Parler acquisition was a brilliant manoeuvre on Owens’ part to get West to redistribute some of his wealth to her family. Candace was cashing in on Kanye, in other words.

    While West’s descent into extremism is disturbing, his acquisition of Parler (assuming it goes through) is not keeping me up at night. If Truth Social is anything to go by, I highly doubt that Parler is going to be influential anytime soon. What is keeping me up at night, however, is the rightward drift of more mainstream platforms such as CNN. What’s keeping me up at night is the rightward drift of politics. West is a very prominent symbol of a much bigger problem.

    Arwa Mahdawi is a Guardian columnist

  • The Misinformation Beat, Translated

    To report an article on the spread of false narratives in non-English languages, the journalist Tiffany Hsu spent time on fringe platforms — and Google Translate.

    Times Insider explains who we are and what we do and delivers behind-the-scenes insights into how our journalism comes together.

    Facts are facts no matter the language in which they are shared. Ahead of the midterm elections, misleading translations and blatant falsehoods about topics such as inflation and election fraud are swirling in non-English languages on social media — and multilingual fact checkers are struggling to keep up.

    The Times journalist Tiffany Hsu tackled this topic in a recent report. After spending several years on the media beat for The Times, she joined the team covering disinformation and misinformation this summer. (Disinformation means a coordinated campaign by people or organizations that generally know the information is false; misinformation, as Ms. Hsu puts it, is when your uncle repeats something he read on Facebook, not realizing the post wasn’t factual.) For the article, Ms. Hsu spoke with about a dozen researchers — and spent time on Google Translate — to understand how the spread of falsehoods may target immigrant communities and affect the vote.

    In an interview on Thursday, Ms. Hsu shared more about her recent reporting. This conversation has been edited.

    When did you start to hear about misinformation in other languages?

    My family is from Taiwan, and for several years now there has been this interesting flow of content from not only Taiwanese producers, but also Taiwanese American producers and mainland Chinese producers that reaches immigrants like my parents in this country. Often a lot of that information is twisted or is just flat-out wrong.

    Misinformation, especially in Spanish, was a big problem in 2020. Jennifer Medina wrote a great couple of stories for us on this during that election. I was talking to researchers and many of them were pointing out that the problem had not only not gone away, it had gotten worse. We were entering the midterm season with more fact checkers working in different languages, but also with more misinformation on more topics in more languages on more platforms.

    Do you spend a lot of time on fringe services?

    For this particular story, because it covered so many different languages that I’m not familiar with, the researchers were a fantastic lifeline. I would reach out to them and say, “This narrative is circulating in English language communities. Are you seeing this in Spanish or Chinese or Vietnamese or Hindi?” They would tell me if they had, and often, they had.

    Generally in my reporting, I spend a lot of time on various platforms like Gab, Telegram, Truth Social, Rumble and TikTok. This morning, I was commuting on the train, and I spent an hour scrolling through TikTok, looking at videos that were tagged with the midterm hashtag and seeing quite a lot that were not fully factual.

    Has your work changed the way that you use social media?

    Absolutely. I had been on Facebook for a while. I was on Instagram for a long time, and I used to look at it just as entertainment. But ever since taking on the media beat and now the misinformation beat, I’m hyperconscious of what’s being served to me. Covering this beat has personally been helpful to me because it’s trained me to stop and think about what it is I’m seeing on these platforms and to not take everything at face value.

    In your reporting, you already have to cut through what’s true and what’s not. Now you have to do that in languages that you don’t speak. What tools did you use?

    Every time a researcher, tipster or my editor sends me a post, I try to find it myself either on one of the platforms or through the Wayback Machine, which often can find deleted posts. I try to confirm for myself that the post does in fact exist in the form that it was sent to me.

    And a lot of Google Translate. I’m lucky in that I know a lot of people who speak these popular languages, so I run a lot of the content by them. A lot of what ended up in the story had already been independently fact-checked by many excellent fact-checking groups like Factchequeado and Viet Fact Check. The key with all stories is to find the authoritative sources and then try to double-check them.

    What audience are you thinking about?

    I don’t think about audience per se, because I’m not coming into this with an angle. I’m not trying to convince a disbelieving audience, and I’m not trying to back up what a supportive audience might think. What I’m trying to do is look at the content that’s out there and determine whether or not it’s accurate. If it’s not accurate, I’m trying to prove why and then explain what some of the consequences might be. My job is to lay out the evidence and readers will determine for themselves whether or not that’s convincing to them.

    As we approach the midterms, what most concerns you?

    There’s a lot of chatter still on a lot of these platforms and in other mediums about the integrity of elections. That, from everything I’ve heard, is very dangerous. There’s a piece of research that says that new voters are most likely to be Latino. Primarily Spanish-speaking voters are at risk of being exposed to disinformation about voting. There have been changes in voting policies that make the election process confusing, even to a native English speaker. To have this doubt swirling in the environment heading into a really consequential election is problematic, especially when so many diasporic communities are going to become, or are becoming, very powerful voting centers.

    The midterms are important, which is why a lot of our coverage right now is focused on political misinformation. But misinformation is everywhere. It touches every single topic you can think of: parenting groups, education, crime. I really do think it’s important to have a lot of reporting firepower behind it.