More stories

  • Trump May Start a Social Network. Here’s My Advice.

    Recast your past failures as successes, engage in meaningless optics, and other tips from the Silicon Valley playbook.

    So Donald Trump wants to start a social network and become a tech mogul? Lucky for him, I am an expert in all things digital, and I’m willing to help. Tech is hard stuff, and new ventures should be attempted with extreme care, especially by those whose history of entrepreneurship is littered with the carcasses of, say, Trump Steaks. Or Trump Water. Or Trump University. Or Trump magazine. Or Trump Casinos. Or Trump Mortgages. Or Trump Airlines. Or Trump Vodka. Or the Trump pandemic response. Or, of course, the 2020 Trump presidential campaign.

    So, Mr. Trump, here’s my advice.

    Right from the start, I advise you to embrace your myriad failures as if they’re your best friends. Every failed venture actually went exactly as planned. Give up your distaste for being called a “loser.” Drop your tendency to blame others. Quit falling back on conspiracy theories — which even Sidney “The Kraken” Powell is bailing on.

    Instead, use one of Silicon Valley’s favorite excuses for its mistakes, that old Thomas Edison trope: You didn’t fail, but found 10,000 ways that didn’t work. Even if “fail” and “don’t work” are the same thing, in tech these are seen as a badge of honor rather than as a sign that you are terrible at executing a business plan and engage in only meaningless optics.

    Which brings us to my next point: Engage in meaningless optics. In Silicon Valley, what people perceive is just as valuable as anything that is actually valuable. You think 5,000 Beeple JPEGs are worth $70 million?
    Well, I have a Jack Dorsey tweet for $2.9 million you might want to consider. Luckily, this fits right in your wheelhouse — a talent that you have displayed in spades since the beginnings of your career.

    Edison also said that “genius is 1 percent inspiration and 99 percent perspiration.” I might rephrase that for your entry into tech by saying, genius is 1 percent instigation and 99 percent perfidy. Instigation and perfidy, in fact, make the perfect formula for a modern-day social network, so you are already well on your way, given your skill set.

    Baseless conspiracies? Check. Incessant lies? Check. Crazy ALL-CAP declarations designed to foment anger? Check. Self-aggrandizing though badly spelled streams that actually reveal a profound lack of self-esteem? Check. Link-baiting hateful memes? Double check. Inciting violence over election fraud with both explicit and cryptic messages to your base, in order to get them to think they should attack the Capitol, like, for real? Checkmate — especially if you are former Vice President Mike Pence!

    As for Mr. Pence, you must get him to sign on to your platform, along with all the other right-wingers who groveled to you when you were on Twitter. And that doesn’t mean just the Florida member of the House of Representatives Matt Gaetz, who I assume will do that on any platform, but the whole passel of them, from Ted Cruz to Marjorie Taylor Greene to Marco Rubio to your current nemesis, Mitch McConnell. And, also, wait for it … folks like Alexandria Ocasio-Cortez and Bernie Sanders, as well as all those Hollywood celebs who hate-tweet at you, and, of course, all the fake media.

    To have an effective social network, you need the whole gang there in order to reach the blessed perfect formula: Enragement equals engagement. You didn’t start the fire — well, maybe you did — but you definitely need to keep stoking it.

    It might be challenging to get all of the complex tech to actually work.
    A social network requires a lot of it, including servers, apps and content moderation tools. You’ll need a whole army of geeks whom you’ll have to pay real money. (If you don’t, they will cyberhack you back to Queens.)

    And since all of this will be quite pricey, I suppose you could take over the pretty much defunct Parler. Its previous chief executive said in an interview with me that the platform wasn’t responsible for any of the post-presidential-election chaos, and that got the service thrown off all kinds of back-end platforms run by Google, Apple and Amazon. Still, Parler may be on life support, but the tech is already built — and the platform is already full of deplorables, or, um, “patriots,” who happen to believe more in QAnon wingnut ideas than in the Constitution.

    Also, you should think about having a fresh kombucha station at HQ.

    As for your future competitors … Twitter has seen its shares rise sharply since it tossed you off for life. You still might get a reprieve over at Facebook, where an oversight board is contemplating your fate. We’ll see what the chief executive, Mark Zuckerberg, decides after the board makes a ruling. Keep in mind, Mr. Zuckerberg really is the most powerful man in the world; that was even the case when you were in the Oval Office. And while he once bear-hugged your administration, he is now sidling up to President Biden.

    Try to ignore that and learn to like your fellow tech moguls. You will be on their side. You’ll have to learn to love Section 230 — part of a 1996 law that shields companies from liability for what is said on their platforms — and abandon efforts to get rid of it (as you tried by executive order), since it will protect whatever toxic flood you unleash on your social media site.

    Which brings me to my last point: the name. It’s critical — and I am not sure how to approach it. Avoid MeinSpace and InstaGraft, for obvious reasons.
    The narcissist in you might go for The_Donald, which you might now be able to use, since Reddit banned the 800,000-member forum with that name for violating its rules against harassment, hate speech, content manipulation and more. (Sounds like just the kind of folks you like and who like you.)

    Personally, if you go this direction, I would use your name in a more creative way. My suggestion: Trumpet. Trumpets are brash and loud, and they’re often badly played and tinny. Right on brand, I’d say.

  • Zuckerberg, Dorsey and Pichai testify about disinformation.

    The chief executives of Google, Facebook and Twitter are testifying at the House on Thursday about how disinformation spreads across their platforms, an issue that the tech companies were scrutinized for during the presidential election and after the Jan. 6 riot at the Capitol.

    The hearing, held by the House Energy and Commerce Committee, is the first time that Mark Zuckerberg of Facebook, Jack Dorsey of Twitter and Sundar Pichai of Google are appearing before Congress during the Biden administration. President Biden has indicated that he is likely to be tough on the tech industry. That position, coupled with Democratic control of Congress, has raised liberal hopes that Washington will take steps to rein in Big Tech’s power and reach over the next few years.

    The hearing is also the first opportunity since the Jan. 6 Capitol riot for lawmakers to question the three men about the role their companies played in the event. The attack has made the issue of disinformation intensely personal for the lawmakers since those who participated in the riot have been linked to online conspiracy theories like QAnon.

    Before the hearing, Democrats signaled in a memo that they were interested in questioning the executives about the Jan. 6 attacks, efforts by the right to undermine the results of the 2020 election and misinformation related to the Covid-19 pandemic. Republicans sent the executives letters this month asking them about the decisions to remove conservative personalities and stories from their platforms, including an October article in The New York Post about President Biden’s son Hunter.

    Lawmakers have debated whether social media platforms’ business models encourage the spread of hate and disinformation by prioritizing content that will elicit user engagement, often by emphasizing salacious or divisive posts.

    Some lawmakers will push for changes to Section 230 of the Communications Decency Act, a 1996 law that shields the platforms from lawsuits over their users’ posts. Lawmakers are trying to strip the protections in cases where the companies’ algorithms amplified certain illegal content. Others believe that the spread of disinformation could be stemmed with stronger antitrust laws, since the platforms are by far the major outlets for communicating publicly online.

    “By now it’s painfully clear that neither the market nor public pressure will stop social media companies from elevating disinformation and extremism, so we have no choice but to legislate, and now it’s a question of how best to do it,” said Representative Frank Pallone, the New Jersey Democrat who is chairman of the committee.

    The tech executives are expected to play up their efforts to limit misinformation and redirect users to more reliable sources of information. They may also entertain the possibility of more regulation, in an effort to shape increasingly likely legislative changes rather than resist them outright.

  • How Anti-Asian Activity Online Set the Stage for Real-World Violence

    On platforms such as Telegram and 4chan, racist memes and posts about Asian-Americans have created fear and dehumanization.

    In January, a new group popped up on the messaging app Telegram, named after an Asian slur. Hundreds of people quickly joined. Many members soon began posting caricatures of Asians with exaggerated facial features, memes of Asian people eating dog meat and images of American soldiers inflicting violence during the Vietnam War.

    This week, after a gunman killed eight people — including six women of Asian descent — at massage parlors in and near Atlanta, the Telegram channel linked to a poll that asked, “Appalled by the recent attacks on Asians?” The top answer, with 84 percent of the vote, was that the violence was “justified retaliation for Covid.”

    The Telegram group was a sign of how anti-Asian sentiment has flared up in corners of the internet, amplifying racist and xenophobic tropes just as attacks against Asian-Americans have surged. On messaging apps like Telegram and on internet forums like 4chan, anti-Asian groups and discussion threads have been increasingly active since November, especially on far-right message boards such as The Donald, researchers said.

    The activity follows a rise in anti-Asian misinformation last spring after the coronavirus, which first emerged in China, began spreading around the world. On Facebook and Twitter, people blamed the pandemic on China, with users posting hashtags such as #gobacktochina and #makethecommiechinesepay. Those hashtags spiked when former President Donald J. Trump last year called Covid-19 the “Chinese virus” and “Kung Flu.”

    While some of the online activity tailed off ahead of the November election, its re-emergence has helped lay the groundwork for real-world actions, researchers said.
    The fatal shootings in Atlanta this week, which have led to an outcry over treatment of Asian-Americans even as the suspect said he was trying to cure a “sexual addiction,” were preceded by a swell of racially motivated attacks against Asian-Americans in places like New York and the San Francisco Bay Area, according to the advocacy group Stop AAPI Hate.

    “Surges in anti-Asian rhetoric online means increased risk of real-world events targeting that group of people,” said Alex Goldenberg, an analyst at the Network Contagion Research Institute at Rutgers University, which tracks misinformation and extremism online. He added that the anti-China coronavirus misinformation — including the false narrative that the Chinese government purposely created Covid-19 as a bioweapon — had created an atmosphere of fear and invective.

    Anti-Asian speech online has typically not been as overt as anti-Semitic or anti-Black groups, memes and posts, researchers said. On Facebook and Twitter, posts expressing anti-Asian sentiments have often been woven into conspiracy theory groups such as QAnon and in white nationalist and pro-Trump enclaves. Mr. Goldenberg said forms of hatred against Black people and Jews have deep roots in extremism in the United States and that the anti-Asian memes and tropes have been more “opportunistically weaponized.”

    But that does not make the anti-Asian hate speech online less insidious. Melissa Ryan, chief executive of Card Strategies, a consulting firm that researches disinformation, said the misinformation and racist speech has led to a “dehumanization” of certain groups of people and to an increased risk of violence.

    Negative Asian-American tropes have long existed online but began increasing last March as parts of the United States went into lockdown over the coronavirus.
    That month, politicians including Representative Paul Gosar, Republican of Arizona, and Representative Kevin McCarthy, Republican of California, used the terms “Wuhan virus” and “Chinese coronavirus” to refer to Covid-19 in their tweets. Those terms then began trending online, according to a study from the University of California, Berkeley. On the day Mr. Gosar posted his tweet, usage of the term “Chinese virus” jumped 650 percent on Twitter; a day later there was an 800 percent increase in their usage in conservative news articles, the study found.

    Mr. Trump also posted eight times on Twitter last March about the “Chinese virus,” causing vitriolic reactions. In the replies section of one of his posts, a Trump supporter responded, “U caused the virus,” directing the comment to an Asian Twitter user who had cited U.S. death statistics for Covid-19. The Trump fan added a slur about Asian people.

    In a study this week from the University of California, San Francisco, researchers who examined 700,000 tweets before and after Mr. Trump’s March 2020 posts found that people who posted the hashtag #chinesevirus were more likely to use racist hashtags, including #bateatingchinese.

    “There’s been a lot of discussion that ‘Chinese virus’ isn’t racist and that it can be used,” said Yulin Hswen, an assistant professor of epidemiology at the University of California, San Francisco, who conducted the research. But the term, she said, has turned into “a rallying cry to be able to gather and galvanize people who have these feelings, as well as normalize racist beliefs.”

    Representatives for Mr. Trump, Mr. McCarthy and Mr. Gosar did not respond to requests for comment.

    Misinformation linking the coronavirus to anti-Asian beliefs also rose last year.
    Since last March, there have been nearly eight million mentions of anti-Asian speech online, much of it falsehoods, according to Zignal Labs, a media insights firm.

    In one example, a Fox News article from April that went viral baselessly said that the coronavirus was created in a lab in the Chinese city of Wuhan and intentionally released. The article was liked and shared more than one million times on Facebook and retweeted 78,800 times on Twitter, according to data from Zignal and CrowdTangle, a Facebook-owned tool for analyzing social media.

    By the middle of last year, the misinformation had started subsiding as election-related commentary increased. The anti-Asian sentiment ended up migrating to platforms like 4chan and Telegram, researchers said. But it still occasionally flared up, such as when Dr. Li-Meng Yan, a researcher from Hong Kong, made unproven assertions last fall that the coronavirus was a bioweapon engineered by China. In the United States, Dr. Yan became a right-wing media sensation. Her appearance on Tucker Carlson’s Fox News show in September has racked up at least 8.8 million views online.

    In November, anti-Asian speech surged anew. That was when conspiracies about a “new world order” related to President Biden’s election victory began circulating, said researchers from the Network Contagion Research Institute. Some posts that went viral painted Mr. Biden as a puppet of the Chinese Communist Party.

    In December, slurs about Asians and the term “Kung Flu” rose by 65 percent on websites and apps like Telegram, 4chan and The Donald, compared with the monthly average mentions from the previous 11 months on the same platforms, according to the Network Contagion Research Institute.
    The activity remained high in January and last month. During this second surge, calls for violence against Asian-Americans became commonplace. “Filipinos are not Asians because Asians are smart,” read a post in a Telegram channel that depicted a dog holding a gun to its head.

    After the shootings in Atlanta, a doctored screenshot of what looked like a Facebook post from the suspect circulated on Facebook and Twitter this week. The post featured a miasma of conspiracies about China engaging in a Covid-19 cover-up and wild theories about how it was planning to “secure global domination for the 21st century.” Facebook and Twitter eventually ruled that the screenshot was fake and blocked it. But by then, the post had been shared and liked hundreds of times on Twitter and more than 4,000 times on Facebook.

    Ben Decker

  • Liberals want to blame rightwing 'misinformation' for our problems. Get real | Thomas Frank

    One day in March 2015, I sat in a theater in New York City and took careful notes as a series of personages led by Hillary Clinton and Melinda Gates described the dazzling sunburst of liberation that was coming our way thanks to entrepreneurs, foundations and Silicon Valley. The presentation I remember most vividly was that of a famous TV actor who rhapsodized about the wonders of Twitter, Facebook and the rest: “No matter which platform you prefer,” she told us, “social media has given us all an extraordinary new world, where anyone, no matter their gender, can share their story across communities, continents and computer screens. A whole new world without ceilings.”

    Six years later and liberals can’t wait for that extraordinary new world to end. Today we know that social media is what gives you things like Donald Trump’s lying tweets, the QAnon conspiracy theory and the Capitol riot of 6 January. Social media, we now know, is a volcano of misinformation, a non-stop wallow in hatred and lies, generated for fun and profit, and these days liberal politicians are openly pleading with social media’s corporate masters to pleez clamp a ceiling on it, to stop people from sharing their false and dangerous stories.

    A “reality crisis” is the startling name a New York Times story recently applied to this dismal situation. An “information disorder” is the more medical-sounding label that other authorities choose to give it. Either way, the diagnosis goes, we Americans are drowning in the semiotic swirl. We have come loose from the shared material world, lost ourselves in an endless maze of foreign disinformation and rightwing conspiracy theory.

    In response, Joe Biden has called upon us as a nation to “defend the truth and defeat the lies”. A renowned CNN journalist advocates a “harm reduction model” to minimize “information pollution” and deliver the “rational views” that the public wants.
    A New York Times writer has suggested the president appoint a federal “reality czar” who would “help” the Silicon Valley platform monopolies mute the siren song of QAnon and thus usher us into a new age of sincerity.

    These days Democratic politicians lean on anyone with power over platforms to shut down the propaganda of the right. Former Democratic officials pen op-eds calling on us to get over free speech. Journalists fantasize about how easily and painlessly Silicon Valley might monitor and root out objectionable speech. In a recent HBO documentary on the subject, journalist after journalist can be seen rationalizing that, because social media platforms are private companies, the first amendment doesn’t apply to them … and, I suppose, neither should the American tradition of free-ranging, anything-goes political speech.

    In the absence of such censorship, we are told, the danger is stark. In a story about Steve Bannon’s ongoing Trumpist podcasts, for example, ProPublica informs us that “extremism experts say the rhetoric still feeds into an alternative reality that breeds anger and cynicism, which may ultimately lead to violence”.

    In liberal circles these days there is a palpable horror of the uncurated world, of thought spaces flourishing outside the consensus, of unauthorized voices blabbing freely in some arena where there is no moderator to whom someone might be turned in. The remedy for bad speech, we now believe, is not more speech, as per Justice Brandeis’s famous formula, but an “extremism expert” shushing the world.

    What an enormous task that shushing will be! American political culture is and always has been a matter of myth and idealism and selective memory. Selling, not studying, is our peculiar national talent. Hollywood, not historians, is who writes our sacred national epics. There were liars-for-hire in this country long before Roger Stone came along. Our politics has been a bath in bullshit since forever.
    People pitching the dumbest of ideas prosper fantastically in this country if their ideas happen to be what the ruling class would prefer to believe.

    “Debunking” was how the literary left used to respond to America’s Niagara of nonsense. Criticism, analysis, mockery and protest: these were our weapons. We were rational-minded skeptics, and we had a grand old time deflating creationists, faith healers, puffed-up militarists and corporate liars of every description. Censorship and blacklisting were, with important exceptions, the weapons of the puritanical right: those were their means of lashing out against rap music or suggestive plays or leftwingers who were gainfully employed.

    What explains the clampdown mania among liberals? The most obvious answer is because they need an excuse. Consider the history: the right has enjoyed tremendous success over the last few decades, and it is true that conservatives’ capacity for hallucinatory fake-populist appeals has helped them to succeed. But that success has also happened because the Democrats, determined to make themselves the party of the affluent and the highly educated, have allowed the right to get away with it.

    There have been countless times over the years where Democrats might have reappraised this dumb strategy and changed course. But again and again they chose not to, blaming their failure on everything but their glorious postindustrial vision. In 2016, for example, liberals chose to blame Russia for their loss rather than look in the mirror. On other occasions they assured one another that they had no problems with white blue-collar workers – until it became undeniable that they did, whereupon liberals chose to blame such people for rejecting them.

    And now we cluck over a lamentable “information disorder”.
    The Republicans didn’t suffer the landslide defeat they deserved last November; the right is still as potent as ever; therefore Trumpist untruth is responsible for the malfunctioning public mind. Under no circumstances was it the result of the Democrats’ own lackluster performance, their refusal to reach out to the alienated millions with some kind of FDR-style vision of social solidarity.

    Or perhaps this new taste for censorship is an indication of Democratic healthiness. This is a party that has courted professional-managerial elites for decades, and now they have succeeded in winning them over, along with most of the wealthy areas where such people live. Liberals scold and supervise like an offended ruling class because to a certain extent that’s who they are. More and more, they represent the well-credentialed people who monitor us in the workplace, and more and more do they act like it.

    What all this censorship talk really is, though, is a declaration of defeat – defeat before the Biden administration has really begun. To give up on free speech is to despair of reason itself. (Misinformation, we read in the New York Times, is impervious to critical thinking.) The people simply cannot be persuaded; something more forceful is in order; they must be guided by we, the enlightened; and the first step in such a program is to shut off America’s many burbling fountains of bad takes.

    Let me confess: every time I read one of these stories calling on us to get over free speech or calling on Mark Zuckerberg to press that big red “mute” button on our political opponents, I feel a wave of incredulity sweep over me. Liberals believe in liberty, I tell myself. This can’t really be happening here in the USA.

    But, folks, it is happening. And the folly of it all is beyond belief. To say that this will give the right an issue to campaign on is almost too obvious.
    To point out that it will play straight into the right’s class-based grievance-fantasies requires only a little more sophistication. To say that it is a betrayal of everything we were taught liberalism stood for – a betrayal that we will spend years living down – may be too complex a thought for our punditburo to consider, but it is nevertheless true.

  • Rightwing 'super-spreader': study finds handful of accounts spread bulk of election misinformation

    A handful of rightwing “super-spreaders” on social media were responsible for the bulk of election misinformation in the run-up to the Capitol attack, according to a new study that also sheds light on the staggering reach of falsehoods pushed by Donald Trump.

    A report from the Election Integrity Partnership (EIP), a group that includes Stanford and the University of Washington, analyzed social media platforms including Facebook, Twitter, Instagram, YouTube, and TikTok during several months before and after the 2020 elections. It found that “super-spreaders” – responsible for the most frequent and most impactful misinformation campaigns – included Trump and his two elder sons, as well as other members of the Trump administration and the rightwing media.

    The study’s authors and other researchers say the findings underscore the need to disable such accounts to stop the spread of misinformation. “If there is a limit to how much content moderators can tackle, have them focus on reducing harm by eliminating the most effective spreaders of misinformation,” said Lisa Fazio, an assistant professor at Vanderbilt University who studies the psychology of fake news but was not involved in the EIP report. “Rather than trying to enforce the rules equally across all users, focus enforcement on the most powerful accounts.”

    The report analyzed social media posts featuring words like “election” and “voting” to track key misinformation narratives related to the 2020 election, including claims of mail carriers throwing away ballots, legitimate ballots strategically not being counted, and other false or unproven stories. The report studied how these narratives developed and the effect they had.
    It found during this time period, popular rightwing Twitter accounts “transformed one-off stories, sometimes based on honest voter concerns or genuine misunderstandings, into cohesive narratives of systemic election fraud”. Ultimately, the “false claims and narratives coalesced into the meta-narrative of a ‘stolen election’, which later propelled the January 6 insurrection”, the report said.

    “The 2020 election demonstrated that actors – both foreign and domestic – remain committed to weaponizing viral false and misleading narratives to undermine confidence in the US electoral system and erode Americans’ faith in our democracy,” the authors concluded.

    Next to no factchecking, with Trump as the super-spreader-in-chief

    In monitoring Twitter, the researchers analyzed more than 22 million tweets sent between 15 August and 12 December. The study determined which accounts were most influential by the size and speed with which they spread misinformation. “Influential accounts on the political right rarely engaged in factchecking behavior, and were responsible for the most widely spread incidents of false or misleading information in our dataset,” the report said.

    Out of the 21 top offenders, 15 were verified Twitter accounts – which are particularly dangerous when it comes to election misinformation, the study said. The “repeat spreaders” responsible for the most widely spread misinformation included Eric Trump, Donald Trump, Donald Trump Jr. and influencers like James O’Keefe, Tim Pool, Elijah Riot, and Sidney Powell. All 21 of the top accounts for misinformation leaned rightwing, the study showed.

    “Top-down mis- and disinformation is dangerous because of the speed at which it can spread,” the report said.
    “If a social media influencer with millions of followers shares a narrative, it can garner hundreds of thousands of engagements and shares before a social media platform or factchecker has time to review its content.”

    On nearly all the platforms analyzed in the study – including Facebook, Twitter, and YouTube – Donald Trump played a massive role. It pinpointed 21 incidents in which a tweet from Trump’s official @realDonaldTrump account jumpstarted the spread of a false narrative across Twitter. For example, Trump’s tweets baselessly claiming that the voting equipment manufacturer Dominion Voting Systems was responsible for election fraud played a large role in amplifying the conspiracy theory to a wider audience. False or baseless tweets sent by Trump’s account – which had 88.9m followers at the time – garnered more than 460,000 retweets.

    Meanwhile, Trump’s YouTube channel was linked to six distinct waves of misinformation that, combined, were the most viewed of any other repeat-spreader’s videos. His Facebook account had the most engagement of all those studied.

    The Election Integrity Partnership study is not the first to show the massive influence Trump’s social media accounts have had on the spread of misinformation. In one year – between 1 January 2020 and 6 January 2021 – Donald Trump pushed disinformation in more than 1,400 Facebook posts, a report from Media Matters for America released in February found. Trump was ultimately suspended from the platform in January, and Facebook is debating whether he will ever be allowed back.

    Specifically, 516 of his posts contained disinformation about Covid-19, 368 contained election disinformation, and 683 contained harmful rhetoric attacking his political enemies. Allegations of election fraud earned over 149.4 million interactions, or an average of 412,000 interactions per post, and accounted for 16% of interactions on his posts in 2020.
Trump had a unique ability to amplify news stories that would have otherwise remained contained in smaller outlets and subgroups, said Matt Gertz of Media Matters for America.

"What Trump did was take misinformation from the rightwing ecosystem and turn it into a mainstream news event that affected everyone," he said. "He was able to take these absurd lies and conspiracy theories and turn them into national news. And if you do that, and inflame people often enough, you will end up with what we saw on January 6."

Effects of false election narratives on voters

"Super-spreader" accounts were ultimately very successful in undermining voters' trust in the democratic system, the report found. Citing a poll by the Pew Research Center, the study said that, of the 54% of people who voted in person, approximately half had cited concerns about voting by mail, and only 30% of respondents were "very confident" that absentee or mail-in ballots had been counted as intended.

The report outlined a number of recommendations, including removing "super-spreader" accounts entirely. Outside experts agree that tech companies should more closely scrutinize top accounts and repeat offenders.

Researchers said the refusal to take action or establish clear rules for when action should be taken helped to fuel the prevalence of misinformation. For example, only YouTube had a publicly stated "three-strike" system for offenses related to the election. Platforms like Facebook reportedly had three-strike rules as well but did not make the system publicly known.

Only four of the top 20 Twitter accounts cited as top spreaders were actually removed, the study showed – including Donald Trump's in January. Twitter has maintained that its ban of the former president is permanent. YouTube's chief executive officer stated this week that Trump would be reinstated on the platform once the "risk of violence" from his posts passes.
Facebook's independent oversight board is now considering whether to allow Trump to return.

"We have seen that he uses his accounts as a way to weaponize disinformation. It has already led to riots at the US Capitol; I don't know why you would give him the opportunity to do that again," Gertz said. "It would be a huge mistake to allow Trump to return."

  • in

    Fixing What the Internet Broke

on tech

How sites like Facebook and Twitter can help reduce election misinformation.

Credit…Angie Wang

March 4, 2021, 12:26 p.m. ET

This article is part of the On Tech newsletter. You can sign up here to receive it weekdays.

January's riot at the U.S. Capitol showed the damage that can result when millions of people believe an election was stolen despite no evidence of widespread fraud.

The Election Integrity Partnership, a coalition of online information researchers, this week published a comprehensive analysis of the false narrative of the presidential contest and recommended ways to avoid a repeat.

Internet companies weren't solely to blame for the fiction of a stolen election, but the report concluded that they were hubs where false narratives were incubated, reinforced and cemented. I'm going to summarize here three of the report's intriguing suggestions for how companies such as Facebook, YouTube and Twitter can change to help create a healthier climate of information about elections and everything else.

One broad point: It can feel as if the norms and behaviors of people online are immutable and inevitable, but they're not. Digital life is still relatively new, and what's good or toxic is the result of deliberate choices by companies and all of us. We can fix what's broken. And as another threat against the Capitol this week shows, it's imperative we get this right.

1) A higher bar for people with the most influence and the repeat offenders: Kim Kardashian can change more minds than your dentist.
And research about the 2020 election has shown that a relatively small number of prominent organizations and people, including President Donald Trump, played an outsize role in establishing the myth of a rigged vote.

Currently, sites like Facebook and YouTube mostly consider the substance of a post or video, divorced from the messenger, when determining whether it violates their policies. World leaders are given more leeway than the rest of us, and other prominent people sometimes get a pass when they break the companies' guidelines.

This doesn't make sense.

If internet companies did nothing else, it would make a big difference if they changed how they treated the influential people who were most responsible for spreading falsehoods or twisted facts — and tended to do so again and again.

The EIP researchers suggested three changes: create stricter rules for influential people; prioritize faster decisions on prominent accounts that have broken the rules before; and escalate consequences for habitual superspreaders of bogus information.

YouTube has long had such a "three strikes" system for accounts that repeatedly break its rules, and Twitter recently adopted versions of this system for posts that it considers misleading about elections or coronavirus vaccinations.

The hard part, though, is not necessarily making policies. It's enforcing them when doing so could trigger a backlash.

2) Internet companies should tell us what they're doing and why: Big websites like Facebook and Twitter have detailed guidelines about what's not allowed — for example, threatening others with violence or selling drugs.

But internet companies often apply their policies inconsistently and don't always provide clear reasons when people's posts are flagged or deleted.
The EIP report suggested that online companies do more to inform people about their guidelines and share evidence to support why a post broke the rules.

3) More visibility and accountability for internet companies' decisions: News organizations have reported on Facebook's own research identifying ways that its computer recommendations steered some people to fringe ideas and made them more polarized. But Facebook and other internet companies mostly keep such analyses a secret.

The EIP researchers suggested that internet companies make public their research into misinformation and their assessments of attempts to counter it. That could improve people's understanding of how these information systems work.

The report also suggested a change that journalists and researchers have long wanted: ways for outsiders to see posts that have been deleted by the internet companies or labeled false. This would allow accountability for the decisions that internet companies make.

There are no easy fixes to building Americans' trust in a shared set of facts, particularly when internet sites enable lies to travel farther and faster than the truth. But the EIP recommendations show we do have options and a path forward.

Before we go …

Amazon goes big(ger) in New York: My colleagues Matthew Haag and Winnie Hu wrote about Amazon opening more warehouses in New York neighborhoods and suburbs to make faster deliveries. A related On Tech newsletter from 2020: Why Amazon needs more package hubs closer to where people live.

Our homes are always watching: Law enforcement officials have increasingly sought videos from internet-connected doorbell cameras to help solve crimes, but The Washington Post writes that the cameras have sometimes been a risk to them, too. In Florida, a man saw F.B.I.
agents coming through his home camera and opened fire, killing two people.

Square is buying Jay-Z's streaming music service: Yes, the company that lets the flea market vendor swipe your credit card is going to own a streaming music company. No, it doesn't make sense. (Square said it's about finding new ways for musicians to make money.)

Hugs to this

A kitty cat wouldn't budge from the roof of a train in London for about two and a half hours. Here are way too many silly jokes about the train-surfing cat. (Or maybe JUST ENOUGH SILLY JOKES?)

We want to hear from you. Tell us what you think of this newsletter and what else you'd like us to explore. You can reach us at ontech@nytimes.com.

If you don't already get this newsletter in your inbox, please sign up here.