More stories

  • Germany Struggles to Stop Online Abuse Ahead of Election

Scrolling through her social media feed, Laura Dornheim is regularly stopped cold by a new blast of abuse aimed at her, including from people threatening to kill or sexually assault her. One person last year said he looked forward to meeting her in person so he could punch her teeth out.

Ms. Dornheim, a candidate for Parliament in Germany’s election on Sunday, is often attacked for her support of abortion rights, gender equality and immigration. She flags some of the posts to Facebook and Twitter, hoping that the platforms will delete the posts or that the perpetrators will be barred. She’s usually disappointed.

“There might have been one instance where something actually got taken down,” Ms. Dornheim said.

Harassment and abuse are all too common on the modern internet. Yet it was supposed to be different in Germany. In 2017, the country enacted one of the world’s toughest laws against online hate speech. It requires Facebook, Twitter and YouTube to remove illegal comments, pictures or videos within 24 hours of being notified about them or risk fines of up to 50 million euros, or $59 million. Supporters hailed it as a watershed moment for internet regulation and a model for other countries.

But an influx of hate speech and harassment in the run-up to the German election, in which the country will choose a new leader to replace Angela Merkel, its longtime chancellor, has exposed some of the law’s weaknesses. Much of the toxic speech, researchers say, has come from far-right groups and is aimed at intimidating female candidates like Ms. Dornheim.

Some critics of the law say it is too weak, with limited enforcement and oversight. They also maintain that many forms of abuse are deemed legal by the platforms, such as certain kinds of harassment of women and public officials.
And when companies do remove illegal material, critics say, they often do not alert the authorities or share information about the posts, making prosecutions of the people publishing the material far more difficult. Another loophole, they say, is that smaller platforms like the messaging app Telegram, popular among far-right groups, are not subject to the law.

Free-expression groups criticize the law on other grounds. They argue that the law should be abolished not only because it fails to protect victims of online abuse and harassment, but also because it sets a dangerous precedent for government censorship of the internet.

The country’s experience may shape policy across the continent. German officials are playing a key role in drafting one of the world’s most anticipated new internet regulations, a European Union law called the Digital Services Act, which will require Facebook and other online platforms to do more to address the vitriol, misinformation and illicit content on their sites. Ursula von der Leyen, a German who is president of the European Commission, the 27-nation bloc’s executive arm, has called for an E.U. law that would list gender-based violence as a special crime category, a proposal that would include online attacks.

“Germany was the first to try to tackle this kind of online accountability,” said Julian Jaursch, a project director at the German think tank Stiftung Neue Verantwortung, which focuses on digital issues. “It is important to ask whether the law is working.”

[Photo: Campaign billboards in Germany’s race for chancellor, showing, from left, Annalena Baerbock of the Green Party, Olaf Scholz of the Social Democrats and Christian Lindner of the Free Democrats. Sean Gallup/Getty Images]

Marc Liesching, a professor at HTWK Leipzig who published an academic report on the policy, said that of the posts that had been deleted by Facebook, YouTube and Twitter, a vast majority were classified as violating company policies, not the hate speech law.
That distinction makes it harder for the government to measure whether companies are complying with the law. In the second half of 2020, Facebook removed 49 million pieces of “hate speech” based on its own community standards, compared with the 154 deletions that it attributed to the German law, he found.

The law, Mr. Liesching said, “is not relevant in practice.”

With its history of Nazism, Germany has long tried to balance free speech rights against a commitment to combat hate speech. Among Western democracies, the country has some of the world’s toughest laws against incitement to violence and hate speech. Targeting religious, ethnic and racial groups is illegal, as are Holocaust denial and displaying Nazi symbols in public.

To address concerns that companies were not alerting the authorities to illegal posts, German policymakers this year passed amendments to the law. They require Facebook, Twitter and YouTube to turn over data to the police about accounts that post material that German law would consider illegal speech. The Justice Ministry was also given more powers to enforce the law.

“The aim of our legislative package is to protect all those who are exposed to threats and insults on the internet,” Christine Lambrecht, the justice minister, who oversees enforcement of the law, said after the amendments were adopted. “Whoever engages in hate speech and issues threats will have to expect to be charged and convicted.”

[Photo: Germans will vote for a leader to replace Angela Merkel, the country’s longtime chancellor. Markus Schreiber/Associated Press]

Facebook and Google have filed a legal challenge to block the new rules, arguing that providing the police with personal information about users violates their privacy.

Facebook said that as part of an agreement with the government it now provided more figures about the complaints it received.
From January through July, the company received more than 77,000 complaints, which led it to delete or block about 11,500 pieces of content under the German law, known as NetzDG.

“We have zero tolerance for hate speech and support the aims of NetzDG,” Facebook said in a statement.

Twitter, which received around 833,000 complaints and removed roughly 81,000 posts during the same period, said a majority of those posts did not fit the definition of illegal speech, but still violated the company’s terms of service.

“Threats, abusive content and harassment all have the potential to silence individuals,” Twitter said in a statement. “However, regulation and legislation such as this also has the potential to chill free speech by emboldening regimes around the world to legislate as a way to stifle dissent and legitimate speech.”

YouTube, which received around 312,000 complaints and removed around 48,000 pieces of content in the first six months of the year, declined to comment other than saying it complies with the law.

The amount of hate speech has become increasingly pronounced during election season, according to researchers at Reset and HateAid, organizations that track online hate speech and are pushing for tougher laws.

The groups reviewed nearly one million comments on far-right and conspiratorial groups across about 75,000 Facebook posts in June, finding that roughly 5 percent were “highly toxic” or violated the online hate speech law. Some of the worst material, including messages with Nazi symbolism, had been online for more than a year, the groups found. Of 100 posts reported by the groups to Facebook, roughly half were removed within a few days, while the others remain online.

The election has also seen a wave of misinformation, including false claims about voter fraud.

Annalena Baerbock, the 40-year-old leader of the Green Party and the only woman among the top candidates running to succeed Ms. Merkel, has been the subject of an outsize amount of abuse compared with her male rivals from other parties, including sexist slurs and misinformation campaigns, according to researchers.

[Photo: Ms. Baerbock, the Green Party candidate for chancellor, taking a selfie with one of her supporters. Laetitia Vancon for The New York Times]

Others have stopped running altogether. In March, a former Syrian refugee running for the German Parliament, Tareq Alaows, dropped out of the race after experiencing racist attacks and violent threats online.

While many policymakers want Facebook and other platforms to be aggressive in screening user-generated content, others have concerns about private companies making decisions about what people can and can’t say. The far-right party Alternative for Germany, which has criticized the law for unfairly targeting its supporters, has vowed to repeal the policy “to respect freedom of expression.”

Jillian York, an author and free speech activist with the Electronic Frontier Foundation in Berlin, said the German law encouraged companies to remove potentially offensive speech that is perfectly legal, undermining free expression rights.

“Facebook doesn’t err on the side of caution, they just take it down,” Ms. York said. Another concern, she said, is that less democratic countries such as Turkey and Belarus have adopted laws similar to Germany’s so that they could classify certain material critical of the government as illegal.

Renate Künast, a former government minister who once invited a journalist to accompany her as she confronted individuals in person who had targeted her with online abuse, wants to see the law go further. Victims of online abuse should be able to go after perpetrators directly for libel and financial settlements, she said.
Without that ability, she added, online abuse will erode political participation, particularly among women and minority groups.

In a survey of more than 7,000 German women released in 2019, 58 percent said they did not share political opinions online for fear of abuse.

“They use the verbal power of hate speech to force people to step back, leave their office or not to be candidates,” Ms. Künast said.

[Photo: The Reichstag, where the German Parliament convenes, in Berlin. Emile Ducke for The New York Times]

Ms. Dornheim, the Berlin candidate, who has a master’s degree in computer science and used to work in the tech industry, said more restrictions were needed. She described getting her home address removed from public records after somebody mailed a package to her house during a particularly bad bout of online abuse.

Yet, she said, the harassment has only steeled her resolve.

“I would never give them the satisfaction of shutting up,” she said.

  • How They Failed: California Republicans, Media Critics and Facebook

In a special Opinion Audio bonanza, Jane Coaston (The Argument), Ezra Klein (The Ezra Klein Show) and Kara Swisher (Sway) sit down to discuss what went wrong for the G.O.P. in the recall election of Gov. Gavin Newsom of California. “This was where the nationalization of politics really bit back for Republicans,” Jane says. The three hosts then debate whether the media industry’s criticism of itself does any good at all. “The media tweets like nobody’s watching,” Ezra says. Then the hosts turn to The Wall Street Journal’s revelations in “The Facebook Files” and discuss how to hold Facebook accountable. “We’re saying your tools in the hands of malevolent players are super dangerous,” Kara says, “but we have no power over them whatsoever.”

And last, Ezra, Jane and Kara offer recommendations to take you deep into history, fantasy and psychotropics.

[You can listen to this episode of “The Argument” on Apple, Spotify or Google or wherever you get your podcasts.]

Read more about the subjects in this episode:

  • Jane Coaston, Vox: “How California conservatives became the intellectual engine of Trumpism”
  • Ezra Klein: “Gavin Newsom Is Much More Than the Lesser of Two Evils” and “A Different Way of Thinking About Cancel Culture”
  • Kara Swisher: “The Endless Facebook Apology,” “Don’t Get Bezosed,” “The Medium of the Moment,” “‘They’re Killing People’? Biden Isn’t Quite Right, but He’s Not Wrong.” and “The Terrible Cost of Mark Zuckerberg’s Naïveté”

(A full transcript of the episode will be available midday on the Times website.)

Photographs courtesy of The New York Times

Thoughts? Email us at argument@nytimes.com or leave us a voice mail message at (347) 915-4324. We want to hear what you’re arguing about with your family, your friends and your frenemies.
(We may use excerpts from your message in a future episode.) By leaving us a message, you are agreeing to be governed by our reader submission terms and agreeing that we may use and allow others to use your name, voice and message.

This episode was produced by Phoebe Lett, Annie Galvin and Rogé Karma. It was edited by Stephanie Joyce, Alison Bruzek and Nayeema Raza. Engineering, music and sound design by Isaac Jones and Sonia Herrero. Fact-checking by Kate Sinclair, Michelle Harris and Kristin Lin. Audience strategy by Shannon Busta. Special thanks to Matt Kwong, Daphne Chen and Blakeney Schick.

  • The Alarming Rise of Peter Thiel, Tech Mogul and Political Provocateur

THE CONTRARIAN: Peter Thiel and Silicon Valley’s Pursuit of Power
By Max Chafkin

A few years ago, on a podcast called “This Is Actually Happening,” a penitent white supremacist recalled a formative childhood experience. One night his mother asked him: “You enjoying your burger?” She went on, “Did you know it’s made out of a cow?”

“Something died?” the boy, then 5, replied.

“Everything living dies,” she said. “You’re going to die.”

Plagued thereafter by terror of death, the boy affected a fear-concealing swagger, which eventually became a fascist swagger.

By chance, I’d just heard this episode when I opened “The Contrarian,” Max Chafkin’s sharp and disturbing biography of the Silicon Valley tech billionaire Peter Thiel, another far-right figure, though unrepentant.

An epiphany from Thiel’s childhood sounded familiar. When he was 3, according to Chafkin, Thiel asked his father about a rug, which his father, Klaus Thiel, explained was cowhide. “Death happens to all animals. All people,” Klaus said. “It will happen to me one day. It will happen to you.”

A near-identical far-right coming-of-age tale — a Rechtsextremebildungsroman? The coincidence kicked off a wave of despair that crashed over me as I read Chafkin’s book. Where did these far-right Americans, powerful and not, ashamed and proud, come from? Why does a stock lecture about mortality lead some 3-to-5-year-old boys to develop contempt for the frailties in themselves — and in everyone else? Like the anonymous white supremacist, Thiel never recovered from bummer death news, and, according to Chafkin, still returns compulsively to “the brutal finality of the thing.” Thiel also turned to swaggering and, later, an evolving, sometimes contradictory, hodgepodge of libertarian and authoritarian beliefs.

Thiel stalks through Chafkin’s biography “as if braced for a collision,” spoiling for a fight with whomever he designates a “liberal” — meaning anyone he suspects of snubbing him.
Unsmiling, solipsistic and at pains to conceal his forever wounded vanity, Thiel in Chafkin’s telling comes across as singularly disagreeable, which is evidently the secret to both his worldly successes and his moral failures.

Young Thiel had the usual dandruff-club hobbies: He played Dungeons & Dragons, read Tolkien and aced the SATs. He was arrogant, and set his worldview against those who mocked him for it. One of Thiel’s classmates at Stanford told Chafkin, “He viewed liberals through a lens as people who were not nice to him.” Looking back on Thiel’s anti-elitist and eventually illiberal politics, Chafkin is succinct: “He’d chosen to reject those who’d rejected him.”

Chafkin serves as a tour guide to the ideological roadhouses where Thiel threw back shots of ultraconservative nostrums on his way to serve Donald Trump in 2016. There was his home life, where — first in Cleveland, then in South Africa and, finally, in suburban California — he ingested his German family’s complicity in apartheid (his father helped build a uranium mine in the Namib desert) and enthusiasm for Reagan; his requisite enlightenment via the novels of Ayn Rand; his excoriations of libs at Stanford, which (Chafkin reminds readers) still shows the influence of its eugenicist founding president, David Starr Jordan; and his depressing stint at a white-shoe corporate law firm, where he was disappointed to find “no liberals to fight.”

These stages of the cross led Thiel to Silicon Valley in the mid-1990s, hot to leave big law and gamble on young Randian Übermenschen. An early bet on a coder named Max Levchin hit it big. The two devised PayPal, the company Thiel is famous for, which supercharged his antipathies with capital. Thiel, who’d published a book called “The Diversity Myth,” “made good on his aversion to multiculturalism,” Chafkin writes. “Besides youth, PayPal’s other defining quality was its white maleness.”

In 2000, PayPal got in business with Elon Musk.
“Peter thinks Musk is a fraud and a braggart,” one source tells Chafkin. “Musk thinks Peter is a sociopath.” According to Chafkin, Thiel remained coldblooded during the dot-com crash that year, as PayPal loopholed its way to market dominance. The company rebounded with a growth strategy known as “blitzscaling,” as well as the use of some supremely nasty tactics. “Whereas [Steve] Jobs viewed business as a form of cultural expression, even art,” Chafkin writes, “for Thiel and his peers it was a mode of transgression, even activism.”

When PayPal went public, Thiel took out tens of millions and turned to investing full time. With various funds he scouted for more entrepreneurial twerps, and in the mid-2000s he latched onto Mark Zuckerberg of Facebook. He also set up a hedge fund called Clarium, where, according to Chafkin, Thiel’s staffers styled themselves as intellectuals and savored the wit of VDARE, an anti-immigration website that regularly published white nationalists. Hoping to make death less inevitable, at least for himself, Thiel also began to patronize the Alcor Life Extension Foundation, which has been steadily freezing the corpses of moneyed narcissists in liquid nitrogen since 1976.

Thiel passed on investing in Tesla, telling Musk (according to Musk) that he didn’t “fully buy into the climate change thing.” But he gave Zuckerberg a loan for Facebook, which intermittently let him keep a leash on the young founder. After Sept. 11, Chafkin reports, Thiel also panicked about “the threat posed by Islamic terrorism — and Islam itself.” Libertarianism deserted him; he created Palantir, a data-analytics surveillance tech company designed, in essence, to root out terrorists. The C.I.A. used it, the N.Y.P.D. used it and Thiel became a contractor with big government.
By 2006 his Clarium had $2 billion under management.

Around this time, the wily Nick Denton, of the gossip empire Gawker, took notice of what Chafkin calls Thiel’s “extremist politics and ethically dubious business practices.” Gawker’s Valleywag site dragged Thiel, whose homosexuality was an open secret, suggesting he was repressed. This enraged Thiel, who by 2008 seemed to have lost it, firing off a floridly religious letter to Clarium investors warning of the imminent apocalypse and urging them to save their immortal souls and “accumulate treasures in heaven, in the eternal City of God.”

The planet avoided the apocalypse, as it tends to do, but that year the financial crash laid the economy to waste. Several big investors pulled out of Thiel’s fund. In Chafkin’s telling, Thiel unaccountably blamed Denton for scaring away ultraconservatives by outing him. He determined to put Denton out of business, and in 2016, by clandestinely bankrolling a nuisance lawsuit designed to bankrupt Gawker, he did.

Chafkin’s chronicle of Thiel’s wild abandon during the Obama years contains some of the most suspenseful passages in the book, as the narrative hurtles toward his acquisition of actual political power. Thiel seemed intoxicated by the rise of Obama, who galvanized the liberals Thiel most loved to hate. Chafkin recounts decadent parties at Thiel’s homes with barely clad men, along with his investments in nutjob projects, like seasteading, which promised life on floating ocean platforms free from government regulation. In a widely read essay, he argued that democracy and capitalism were at odds, because social programs and women’s suffrage curbed the absolute freedom of above-the-law capitalists like himself. He was officially antidemocracy.

Thiel then began to direct money to nativist political candidates and causes, and to collaborate — via Palantir — with Lt. Gen. Michael Flynn, the strange right-wing figure who would later become a zealous Trumpite embraced by the QAnon cult.
He built an army of mini-Thiels, the Thiel fellows, teenage boys (along with a few girls) whom he paid to quit college, forfeit normal social life and try to get rich in the Valley.

Thiel backed Ron Paul for president in 2012, and helped Ted Cruz win a Texas Senate seat. (Gawker noted that Thiel’s support for the anti-gay Cruz was “no crazier than paying kids to drop out of school, cure death or create a floating libertarian ocean utopia.”) He contributed to Tea Party politicians with the aim of building a bigger “neo-reactionary” political movement, and in 2015, he gave his followers their own holy book when he published “Zero to One,” a compendium of antidemocracy, pro-monopoly blitzscaling tips.

[Photo: Peter Thiel, speaking at the Republican National Convention in July 2016. After Donald Trump won the nomination, Thiel decided Trump was a delightful disrupter and kindred spirit and urged voters to take him “seriously, but not literally.” Stephen Crowley/The New York Times]

At the same time, by investing in Lyft, TaskRabbit and Airbnb with his Founders Fund, Thiel seemed to be on the right side of history. When he spoke before mainstream audiences, he sometimes softened his extreme views and even laughed off his more gonzo follies — seasteading, for one.

Yet one friend described Thiel to Chafkin as “Nazi-curious” (though the friend later said he was just being glib), and during this period Thiel also became, Chafkin writes, closer to Curtis Yarvin, a noxious avatar of the alt-right who had ties to Steve Bannon. He turned to survivalist prepping, kitting out a giant estate in New Zealand, where he took citizenship, making it possible that at a moment’s notice he could slip the knot of what, Chafkin says, had become his ultimate nemesis: the U.S. government itself.

In the mid-2010s, a Palantir rep was also meeting with Cambridge Analytica, the creepy English data-mining firm that was later recorded boasting about using twisted data shenanigans to all but give the 2016 presidential election to Donald Trump.

Like just about every powerful figure who eventually went all in for Trump, Thiel was initially skeptical, according to Chafkin. But once Trump won the nomination Thiel decided he was a delightful disrupter and kindred spirit. High from crushing Gawker, Thiel spoke for Trump at the Republican National Convention, and poured money into Rebekah Mercer’s PAC to rescue the campaign as Trump revealed increasing madness on the stump. He also urged voters to take Trump “seriously, but not literally.” Simultaneously, at Thiel’s recommendation, Chafkin suggests, Zuckerberg continued to allow popular content, including potentially misleading far-right articles, to stay at the top of Facebook’s trending stories, where they could attract more clicks and spike more get-out-the-vote cortisol.

Why did Thiel go to such lengths for Trump? Chafkin quotes an anonymous longtime investor in Thiel’s firms: “He wanted to watch Rome burn.” Trump won, which meant that Thiel’s money and his burn-it-down ideology also won.

Chafkin recounts that some of Thiel’s friends found this concretization of his cosmology too much to bear, and turned on him. But most did what most Trump opponents did for four years: waited it out, tried to wish away the erosion of American democracy and turned to their affairs.

For his part, Thiel embraced the role of kingmaker, and Palantir benefited handsomely from contracts the Trump administration sent its way.
Thiel found another winning sponsee: Josh Hawley, then Missouri’s attorney general, with whom he fought Google, which threatened the stability of many Thiel-backed companies, and which Hawley saw as communist, or something.

Chafkin, a writer and editor at Bloomberg Businessweek, is especially interested in the friction between Zuckerberg and Thiel, who drifted apart for a time as Thiel became more involved in conservative politics. The words spent on discord in this relationship — and on tension between Thiel and other tech titans — distract from the more urgent chronicle of Thiel’s rise as one of the pre-eminent authors of the contemporary far-right movement.

“The Contrarian” is chilling — literally chilling. As I read it, I grew colder and colder, until I found myself curled up under a blanket on a sunny day, icy and anxious. Scared people are scary, and Chafkin’s masterly evocation of his subject’s galactic fear — of liberals, of the U.S. government, of death — turns Thiel himself into a threat. I tried to tell myself that Thiel is just another rapacious solipsist, in it for the money, but I used to tell myself that about another rapacious solipsist, and he became president.

By way of conclusion, Chafkin reports that Thiel rode out much of the pandemic in Maui, losing faith in Trump. Evidently Thiel considers the devastating coronavirus both an economic opportunity for Palantir, which went public in 2020 and has benefited from Covid-related government contracts, and a vindication of his predictions that the world as we know it is finished.

  • These Two Rumors Are Going Viral Ahead of California’s Recall Election

As California’s Sept. 14 election over whether to recall Gov. Gavin Newsom draws closer, unfounded rumors about the event are growing.

Here are two that are circulating widely online, how they spread and why, state and local officials said, they are wrong.

Rumor No. 1: Holes in the ballot envelopes were being used to screen out votes that say “yes” to a recall.

On Aug. 19, a woman posted a video on Instagram of herself placing her California special election ballot in an envelope.

“You have to pay attention to these two holes that are in front of the envelope,” she said, bringing the holes close to the camera so viewers could see them. “You can see if someone has voted ‘yes’ to recall Newsom. This is very sketchy and irresponsible in my opinion, but this is asking for fraud.”

The idea that the ballot envelope’s holes were being used to weed out the votes of those who wanted Gov. Newsom, a Democrat, to be recalled rapidly spread online, according to a review by The New York Times.

The Instagram video collected nearly half a million views. On the messaging app Telegram, posts that said California was rigging the special election amassed nearly 200,000 views.
And an article about the ballot holes on the far-right site The Gateway Pundit reached up to 626,000 people on Facebook, according to data from CrowdTangle, a Facebook-owned social media analytics tool.

State and local officials said the ballot holes were not new and were not being used nefariously. The holes were placed in the envelope, on either end of a signature line, to help low-vision voters know where to sign it, said Jenna Dresner, a spokeswoman for the California Secretary of State’s Office of Election Cybersecurity.

The ballot envelope’s design has been used for several election cycles, and civic design consultants recommended the holes for accessibility, added Mike Sanchez, a spokesman for the Los Angeles County registrar. He said voters could choose to put the ballot in the envelope in such a way that didn’t reveal any ballot marking at all through a hole.

Instagram has since appended a fact-check label to the original video to note that it could mislead people. The fact check has reached up to 20,700 people, according to CrowdTangle data.

Rumor No. 2: A felon stole ballots to help Governor Newsom win the recall election.

On Aug. 17, the police in Torrance, Calif., published a post on Facebook that said officers had responded to a call about a man who was passed out in his car in a 7-Eleven parking lot. The man had items such as a loaded firearm, drugs and thousands of pieces of mail, including more than 300 unopened mail-in ballots for the special election, the police said.

Far-right sites such as Red Voice Media and Conservative Firing Line claimed the incident was an example of Democrats’ trying to steal an election through mail-in ballots. Their articles were then shared on Facebook, where they collectively reached up to 1.57 million people, according to CrowdTangle data.

Mark Ponegalek, a public information officer for the Torrance Police Department, said the investigation into the incident was continuing. The U.S. postal inspector was also involved, he said, and no conclusions had been reached.

As a result, he said, online articles and posts concluding that the man was attempting voter fraud were “baseless.”

“I have no indication to tell you one way or the other right now” whether the man intended to commit election fraud with the ballots he collected, Mr. Ponegalek said. He added that the man may have intended to commit identity fraud.

  • Reporter Discusses False Accusations Against Dominion Worker

    Through one employee of Dominion Voting Systems, a Times Magazine article examines the damage that false accusations can inflict.

    Times Insider explains who we are and what we do, and delivers behind-the-scenes insights into how our journalism comes together.

    As Susan Dominus, a staff writer for The New York Times Magazine, approached her reporting for an article on the attacks on Dominion Voting Systems, a business that supplies election technology, she wanted to tell the story of one of the Dominion employees who was being vilified by supporters of President Trump.

    She zeroed in on one man: Eric Coomer, whose anti-Trump social media posts were used to bolster false allegations that Dominion had tampered with the election, leading to death threats. Her article, published on Tuesday, is a case study in what can happen when information gets wildly manipulated. In an edited interview, Ms. Dominus discussed what she learned.

    How did you come upon Eric Coomer — did you have him in mind all along? Or did you want to do something on Dominion and eventually found your way to him?

    The Magazine was interested in pursuing a story about how the attacks on Dominion Voting Systems — a private business — were dramatically influencing the lives of those who worked there, people who were far from public figures. Many employees there were having their private information exposed, but early on, a lot of the threats were focusing on Eric Coomer, who was then the director of product strategy and security at Dominion. Eventually, people such as the lawyers Sidney Powell and Rudy Giuliani and the president’s son Eric Trump were naming him in the context of accusations about Dominion fixing the election.

    What was the biggest surprise you came across in your reporting?

    I was genuinely surprised to find that Mr. Coomer had expressed strong anti-Trump sentiments, using strong language, on his Facebook page. His settings were such that only his Facebook friends could see it, but someone took a screenshot of those and other divisive posts, and right-wing media circulated them widely. The posts were used in the spread of what cybersecurity experts call malinformation — something true that is used to support the dissemination of a story that is false. In this case, it was the big lie that the election was rigged. I think to understand the spread of spurious information — to resist its lure, to fight it off — these distinctions are helpful to parse. Understanding the human cost of these campaigns also matters. We heard a lot about the attacks on Dominion, but there are real people with real lives who are being battered in a battle they had no intention of joining, whatever their private opinions.

    There were so many elaborate theories of election fraud involving Dominion. How important were the accusations against Eric Coomer in that bigger story?

    It’s hard to say. But Advance Democracy Inc., a nonpartisan nonprofit, looked at the tweets in its database from QAnon-related accounts and found that, from Nov. 1 to Jan. 7, Eric Coomer’s name appeared in 25 percent of the ones that mentioned Dominion. Coomer believes the attacks on Dominion were somewhat inevitable but considered his own role as “an accelerant.”

  • Facebook Said to Consider Forming an Election Commission

    The social network has contacted academics to create a group to advise it on thorny election-related decisions, said people with knowledge of the matter.

    Facebook has approached academics and policy experts about forming a commission to advise it on global election-related matters, said five people with knowledge of the discussions, a move that would allow the social network to shift some of its political decision-making to an advisory body.

    The proposed commission could decide on matters such as the viability of political ads and what to do about election-related misinformation, said the people, who spoke on the condition of anonymity because the discussions were confidential. Facebook is expected to announce the commission this fall in preparation for the 2022 midterm elections, they said, though the effort is preliminary and could still fall apart.

    Outsourcing election matters to a panel of experts could help Facebook sidestep criticism of bias by political groups, two of the people said. The company has been blasted in recent years by conservatives, who have accused Facebook of suppressing their voices, as well as by civil rights groups and Democrats for allowing political misinformation to fester and spread online. Mark Zuckerberg, Facebook’s chief executive, does not want to be seen as the sole decision maker on political content, two of the people said.

    Mark Zuckerberg, Facebook’s chief executive, testified remotely in April about social media’s role in extremism and misinformation. Via Reuters

    Facebook declined to comment.

    If an election commission is formed, it would emulate the step Facebook took in 2018 when it created what it calls the Oversight Board, a collection of journalism, legal and policy experts who adjudicate whether the company was correct to remove certain posts from its platforms. Facebook has pushed some content decisions to the Oversight Board for review, allowing it to show that it does not make determinations on its own.

    Facebook, which has positioned the Oversight Board as independent, appointed the people on the panel and pays them through a trust.

    The Oversight Board’s highest-profile decision was reviewing Facebook’s suspension of former President Donald J. Trump after the Jan. 6 storming of the U.S. Capitol. At the time, Facebook opted to ban Mr. Trump’s account indefinitely, a penalty that the Oversight Board later deemed “not appropriate” because the time frame was not based on any of the company’s rules. The board asked Facebook to try again.

    In June, Facebook responded by saying that it would bar Mr. Trump from the platform for at least two years. The Oversight Board has separately weighed in on more than a dozen other content cases that it calls “highly emblematic” of broader themes that Facebook grapples with regularly, including whether certain Covid-related posts should remain up on the network and hate speech issues in Myanmar.

    A spokesman for the Oversight Board declined to comment.

    Facebook has had a spotty track record on election-related issues, going back to Russian manipulation of the platform’s advertising and posts in the 2016 presidential election.

    Lawmakers and political ad buyers also criticized Facebook for changing the rules around political ads before the 2020 presidential election. Last year, the company said it would bar the purchase of new political ads the week before the election, then later decided to temporarily ban all U.S. political advertising after the polls closed on Election Day, causing an uproar among candidates and ad-buying firms.

    The company has struggled with how to handle lies and hate speech around elections. During his last year in office, Mr. Trump used Facebook to suggest he would use state violence against protesters in Minneapolis ahead of the 2020 election, while casting doubt on the electoral process as votes were tallied in November. Facebook initially said that what political leaders posted was newsworthy and should not be touched, before later reversing course.

    The social network has also faced difficulties in elections elsewhere, including the proliferation of targeted disinformation across its WhatsApp messaging service during the Brazilian presidential election in 2018. In 2019, Facebook removed hundreds of misleading pages and accounts associated with political parties in India ahead of the country’s national elections.

    Facebook has tried various methods to stem the criticisms. It established a political ads library to increase transparency around buyers of those promotions. It also has set up war rooms to monitor elections for disinformation to prevent interference.

    There are several elections in the coming year in countries such as Hungary, Germany, Brazil and the Philippines where Facebook’s actions will be closely scrutinized. Voter fraud misinformation has already begun spreading ahead of German elections in September. In the Philippines, Facebook has removed networks of fake accounts that support President Rodrigo Duterte, who used the social network to gain power in 2016.

    “There is already this perception that Facebook, an American social media company, is going in and tilting elections of other countries through its platform,” said Nathaniel Persily, a law professor at Stanford University. “Whatever decisions Facebook makes have global implications.”

    Internal conversations around an election commission date back to at least a few months ago, said three people with knowledge of the matter. An election commission would differ from the Oversight Board in one key way, the people said. While the Oversight Board waits for Facebook to remove a post or an account and then reviews that action, the election commission would proactively provide guidance without the company having made an earlier call, they said.

    Tatenda Musapatike, who previously worked on elections at Facebook and now runs a nonprofit voter registration organization, said that many have lost faith in the company’s abilities to work with political campaigns. But the election commission proposal was “a good step,” she said, because “they’re doing something and they’re not saying we alone can handle it.”

  • How Jason Miller Is Trying to Get Trump Back on the Internet

    Social media has felt quieter without the constant ALL CAPS fury of Donald Trump, but Jason Miller is trying to change that.

    Miller, who was the former president’s longtime aide and spokesman, recently took a new gig running a social media platform called Gettr, which claims to be a haven from censorship and cancel culture. It may sound a little like Parler 2.0, but the game-changer for Gettr — which has a little under two million users — would be if Miller can get Trump to create an account and get back online.

    [You can listen to this episode of “Sway” on Apple, Spotify, Google or wherever you get your podcasts.]

    In this conversation, Kara Swisher asks Miller how he intends to get Trump to log on. She challenges him on his claims that Twitter and Facebook are out to censor conservatives and presses him about how content moderation works on his platform. They also discuss the question on everyone’s mind: Is Trump likely to run again in 2024?

    (A full transcript of the episode will be available midday on the Times website.)

    Thoughts? Email us at sway@nytimes.com.

    “Sway” is produced by Nayeema Raza, Blakeney Schick, Matt Kwong, Daphne Chen and Caitlin O’Keefe, and edited by Nayeema Raza; fact-checking by Kate Sinclair, Michelle Harris and Kristin Lin; music and sound design by Isaac Jones; mixing by Carole Sabouraud and Sonia Herrero; audience strategy by Shannon Busta. Special thanks to Kristin Lin and Liriel Higa.

  • Here’s a Look Inside Facebook’s Data Wars

    Executives at the social network have clashed over CrowdTangle, a Facebook-owned data tool that revealed users’ high engagement levels with right-wing media sources.

    One day in April, the people behind CrowdTangle, a data analytics tool owned by Facebook, learned that transparency had limits.

    Brandon Silverman, CrowdTangle’s co-founder and chief executive, assembled dozens of employees on a video call to tell them that they were being broken up. CrowdTangle, which had been running quasi-independently inside Facebook since being acquired in 2016, was being moved under the social network’s integrity team, the group trying to rid the platform of misinformation and hate speech. Some CrowdTangle employees were being reassigned to other divisions, and Mr. Silverman would no longer be managing the team day to day.

    The announcement, which left CrowdTangle’s employees in stunned silence, was the result of a yearlong battle among Facebook executives over data transparency, and how much the social network should reveal about its inner workings.

    On one side were executives, including Mr. Silverman and Brian Boland, a Facebook vice president in charge of partnerships strategy, who argued that Facebook should publicly share as much information as possible about what happens on its platform — good, bad or ugly.

    On the other side were executives, including the company’s chief marketing officer and vice president of analytics, Alex Schultz, who worried that Facebook was already giving away too much. They argued that journalists and researchers were using CrowdTangle, a kind of turbocharged search engine that allows users to analyze Facebook trends and measure post performance, to dig up information they considered unhelpful — showing, for example, that right-wing commentators like Ben Shapiro and Dan Bongino were getting much more engagement on their Facebook pages than mainstream news outlets.

    These executives argued that Facebook should selectively disclose its own data in the form of carefully curated reports, rather than handing outsiders the tools to discover it themselves.

    Team Selective Disclosure won, and CrowdTangle and its supporters lost.

    An internal battle over data transparency might seem low on the list of worthy Facebook investigations. And it’s a column I’ve hesitated to write for months, in part because I’m uncomfortably close to the action. (More on that in a minute.)

    But the CrowdTangle story is important, because it illustrates the way that Facebook’s obsession with managing its reputation often gets in the way of its attempts to clean up its platform. And it gets to the heart of one of the central tensions confronting Facebook in the post-Trump era. The company, blamed for everything from election interference to vaccine hesitancy, badly wants to rebuild trust with a skeptical public. But the more it shares about what happens on its platform, the more it risks exposing uncomfortable truths that could further damage its image.

    The question of what to do about CrowdTangle has vexed some of Facebook’s top executives for months, according to interviews with more than a dozen current and former Facebook employees, as well as internal emails and posts.

    These people, most of whom would speak only anonymously because they were not authorized to discuss internal conversations, said Facebook’s executives were more worried about fixing the perception that Facebook was amplifying harmful content than figuring out whether it actually was amplifying harmful content. Transparency, they said, ultimately took a back seat to image management.

    Facebook disputes this characterization. It says that the CrowdTangle reorganization was meant to integrate the service with its other transparency tools, not weaken it, and that top executives are still committed to increasing transparency.

    “CrowdTangle is part of a growing suite of transparency resources we’ve made available for people, including academics and journalists,” said Joe Osborne, a Facebook spokesman. “With CrowdTangle moving into our integrity team, we’re developing a more comprehensive strategy for how we build on some of these transparency efforts moving forward.”

    But the executives who pushed hardest for transparency appear to have been sidelined. Mr. Silverman, CrowdTangle’s co-founder and chief executive, has been taking time off and no longer has a clearly defined role at the company, several people with knowledge of the situation said. (Mr. Silverman declined to comment about his status.) And Mr. Boland, who spent 11 years at Facebook, left the company in November.

    “One of the main reasons that I left Facebook is that the most senior leadership in the company does not want to invest in understanding the impact of its core products,” Mr. Boland said, in his first interview since departing. “And it doesn’t want to make the data available for others to do the hard work and hold them accountable.”

    Mr. Boland, who oversaw CrowdTangle as well as other Facebook transparency efforts, said the tool fell out of favor with influential Facebook executives around the time of last year’s presidential election, when journalists and researchers used it to show that pro-Trump commentators were spreading misinformation and hyperpartisan commentary with stunning success.

    “People were enthusiastic about the transparency CrowdTangle provided until it became a problem and created press cycles Facebook didn’t like,” he said. “Then, the tone at the executive level changed.”

    Brian Boland, a former vice president in charge of partnerships strategy and an advocate for more transparency, left Facebook in November. Christian Sorensen Hansen for The New York Times

    The Twitter Account That Launched 1,000 Meetings

    Here’s where I, somewhat reluctantly, come in.

    I started using CrowdTangle a few years ago. I’d been looking for a way to see which news stories gained the most traction on Facebook, and CrowdTangle — a tool used mainly by audience teams at news publishers and marketers who want to track the performance of their posts — filled the bill. I figured out that through a kludgey workaround, I could use its search feature to rank Facebook link posts — that is, posts that include a link to a non-Facebook site — in order of the number of reactions, shares and comments they got. Link posts weren’t a perfect proxy for news, engagement wasn’t a perfect proxy for popularity and CrowdTangle’s data was limited in other ways, but it was the closest I’d come to finding a kind of cross-Facebook news leaderboard, so I ran with it.

    At first, Facebook was happy that I and other journalists were finding its tool useful. With only about 25,000 users, CrowdTangle is one of Facebook’s smallest products, but it has become a valuable resource for power users including global health organizations, election officials and digital marketers, and it has made Facebook look transparent compared with rival platforms like YouTube and TikTok, which don’t release nearly as much data.

    But the mood shifted last year when I started a Twitter account called @FacebooksTop10, on which I posted a daily leaderboard showing the sources of the most-engaged link posts by U.S. pages, based on CrowdTangle data.

    Last fall, the leaderboard was full of posts by Mr. Trump and pro-Trump media personalities. Since Mr. Trump was barred from Facebook in January, it has been dominated by a handful of right-wing polemicists like Mr. Shapiro, Mr. Bongino and Sean Hannity, with the occasional mainstream news article, cute animal story or K-pop fan blog sprinkled in.

    The account went semi-viral, racking up more than 35,000 followers. Thousands of people retweeted the lists, including conservatives who were happy to see pro-Trump pundits beating the mainstream media and liberals who shared them with jokes like “Look at all this conservative censorship!” (If you’ve been under a rock for the past two years, conservatives in the United States frequently complain that Facebook is censoring them.)

    The lists also attracted plenty of Facebook haters. Liberals shared them as evidence that the company was a swamp of toxicity that needed to be broken up; progressive advertisers bristled at the idea that their content was appearing next to pro-Trump propaganda. The account was even cited at a congressional hearing on tech and antitrust by Representative Jamie Raskin, Democrat of Maryland, who said it proved that “if Facebook is out there trying to suppress conservative speech, they’re doing a terrible job at it.”

    Inside Facebook, the account drove executives crazy. Some believed that the data was being misconstrued and worried that it was painting Facebook as a far-right echo chamber. Others worried that the lists might spook investors by suggesting that Facebook’s U.S. user base was getting older and more conservative. Every time a tweet went viral, I got grumpy calls from Facebook executives who were embarrassed by the disparity between what they thought Facebook was — a clean, well-lit public square where civility and tolerance reign — and the image they saw reflected in the Twitter lists.

    As the election approached last year, Facebook executives held meetings to figure out what to do, according to three people who attended them. They set out to determine whether the information on @FacebooksTop10 was accurate (it was), and discussed starting a competing Twitter account that would post more balanced lists based on Facebook’s internal data.

    They never did that, but several executives — including John Hegeman, the head of Facebook’s news feed — were dispatched to argue with me on Twitter. These executives argued that my Top 10 lists were misleading. They said CrowdTangle measured only “engagement,” while the true measure of Facebook popularity would be based on “reach,” or the number of people who actually see a given post. (With the exception of video views, reach data isn’t public, and only Facebook employees have access to it.)

    Last September, Mark Zuckerberg, Facebook’s chief executive, told Axios that while right-wing content garnered a lot of engagement, the idea that Facebook was a right-wing echo chamber was “just wrong.”

    “I think it’s important to differentiate that from, broadly, what people are seeing and reading and learning about on our service,” Mr. Zuckerberg said.

    But Mr. Boland, the former Facebook vice president, said that was a convenient deflection. He said that in internal discussions, Facebook executives were less concerned about the accuracy of the data than about the image of Facebook it presented.

    “It told a story they didn’t like,” he said of the Twitter account, “and frankly didn’t want to admit was true.”

    The Trouble With CrowdTangle

    Around the same time that Mr. Zuckerberg made his comments to Axios, the tensions came to a head. The Economist had just published an article claiming that Facebook “offers a distorted view of American news.”

    The article, which cited CrowdTangle data, showed that the most-engaged American news sites on Facebook were Fox News and Breitbart, and claimed that Facebook’s overall news ecosystem skewed right wing. John Pinette, Facebook’s vice president of global communications, emailed a link to the article to a group of executives with the subject line “The trouble with CrowdTangle.”

    “The Economist steps onto the Kevin Roose bandwagon,” Mr. Pinette wrote. (See? Told you it was uncomfortably close to home.)

    Nick Clegg, Facebook’s vice president of global affairs, replied, lamenting that “our own tools are helping journos to consolidate the wrong narrative.”

    Other executives chimed in, adding their worries that CrowdTangle data was being used to paint Facebook as a right-wing echo chamber.

    David Ginsberg, Facebook’s vice president of choice and competition, wrote that if Mr. Trump won re-election in November, “the media and our critics will quickly point to this ‘echo chamber’ as a prime driver of the outcome.”

    Fidji Simo, the head of the Facebook app at the time, agreed.

    “I really worry that this could be one of the worst narratives for us,” she wrote.

    Several executives proposed making reach data public on CrowdTangle, in hopes that reporters would cite that data instead of the engagement data they thought made Facebook look bad.

    But Mr. Silverman, CrowdTangle’s chief executive, replied in an email that the CrowdTangle team had already tested a feature to do that and found problems with it. One issue was that false and misleading news stories also rose to the top of those lists.

    “Reach leaderboard isn’t a total win from a comms point of view,” Mr. Silverman wrote.

    Mr. Schultz, Facebook’s chief marketing officer, had the dimmest view of CrowdTangle. He wrote that he thought “the only way to avoid stories like this” would be for Facebook to publish its own reports about the most popular content on its platform, rather than releasing data through CrowdTangle.

    “If we go down the route of just offering more self-service data you will get different, exciting, negative stories in my opinion,” he wrote.

    Mr. Osborne, the Facebook spokesman, said Mr. Schultz and the other executives were discussing how to correct misrepresentations of CrowdTangle data, not strategizing about killing off the tool.

    A few days after the election in November, Mr. Schultz wrote a post for the company blog, called “What Do People Actually See on Facebook in the U.S.?” He explained that if you ranked Facebook posts based on which got the most reach, rather than the most engagement — his preferred method of slicing the data — you’d end up with a more mainstream, less sharply partisan list of sources.

    “We believe this paints a more complete picture than the CrowdTangle data alone,” he wrote.

    That may be true, but there’s a problem with reach data: Most of it is inaccessible and can’t be vetted or fact-checked by outsiders. We simply have to trust that Facebook’s own, private data tells a story that’s very different from the data it shares with the public.

    Tweaking Variables

    Mr. Zuckerberg is right about one thing: Facebook is not a giant right-wing echo chamber.

    But it does contain a giant right-wing echo chamber — a kind of AM talk radio built into the heart of Facebook’s news ecosystem, with a hyper-engaged audience of loyal partisans who love liking, sharing and clicking on posts from right-wing pages, many of which have gotten good at serving up Facebook-optimized outrage bait at a consistent clip.

    CrowdTangle’s data made this echo chamber easier for outsiders to see and quantify. But it didn’t create it, or give it the tools it needed to grow — Facebook did — and blaming a data tool for these revelations makes no more sense than blaming a thermometer for bad weather.

    It’s worth noting that these transparency efforts are voluntary, and could disappear at any time. There are no regulations that require Facebook or any other social media companies to reveal what content performs well on their platforms, and American politicians appear to be more interested in fighting over claims of censorship than getting access to better data.

    It’s also worth noting that Facebook can turn down the outrage dials and show its users calmer, less divisive news any time it wants. (In fact, it briefly did so after the 2020 election, when it worried that election-related misinformation could spiral into mass violence.) And there is some evidence that it is at least considering more permanent changes.

    This year, Mr. Hegeman, the executive in charge of Facebook’s news feed, asked a team to figure out how tweaking certain variables in the core news feed ranking algorithm would change the resulting Top 10 lists, according to two people with knowledge of the project.

    The project, which some employees refer to as the “Top 10” project, is still underway, the people said, and it’s unclear whether its findings have been put in place. Mr. Osborne, the Facebook spokesman, said that the team looks at a variety of ranking changes, and that the experiment wasn’t driven by a desire to change the Top 10 lists.

    As for CrowdTangle, the tool is still available, and Facebook is not expected to cut off access to journalists and researchers in the short term, according to two people with knowledge of the company’s plans.

    Mr. Boland, however, said he wouldn’t be surprised if Facebook executives decided to kill off CrowdTangle entirely or starve it of resources, rather than dealing with the headaches its data creates.

    “Facebook would love full transparency if there was a guarantee of positive stories and outcomes,” Mr. Boland said. “But when transparency creates uncomfortable moments, their reaction is often to shut down the transparency.”