More stories

  • The Alarming Rise of Peter Thiel, Tech Mogul and Political Provocateur

    THE CONTRARIAN: Peter Thiel and Silicon Valley’s Pursuit of Power
    By Max Chafkin

    A few years ago, on a podcast called “This Is Actually Happening,” a penitent white supremacist recalled a formative childhood experience. One night his mother asked him: “You enjoying your burger?” She went on, “Did you know it’s made out of a cow?”

    “Something died?” the boy, then 5, replied.

    “Everything living dies,” she said. “You’re going to die.”

    Plagued thereafter by terror of death, the boy affected a fear-concealing swagger, which eventually became a fascist swagger.

    By chance, I’d just heard this episode when I opened “The Contrarian,” Max Chafkin’s sharp and disturbing biography of the Silicon Valley tech billionaire Peter Thiel, another far-right figure, though unrepentant. An epiphany from Thiel’s childhood sounded familiar. When he was 3, according to Chafkin, Thiel asked his father about a rug, which his father, Klaus Thiel, explained was cowhide. “Death happens to all animals. All people,” Klaus said. “It will happen to me one day. It will happen to you.”

    A near-identical far-right coming-of-age tale — a Rechtsextremebildungsroman? The coincidence kicked off a wave of despair that crashed over me as I read Chafkin’s book. Where did these far-right Americans, powerful and not, ashamed and proud, come from? Why does a stock lecture about mortality lead some 3-to-5-year-old boys to develop contempt for the frailties in themselves — and in everyone else? Like the anonymous white supremacist, Thiel never recovered from the bummer death news, and, according to Chafkin, still returns compulsively to “the brutal finality of the thing.” Thiel also turned to swaggering and, later, an evolving, sometimes contradictory hodgepodge of libertarian and authoritarian beliefs.

    Thiel stalks through Chafkin’s biography “as if braced for a collision,” spoiling for a fight with whomever he designates a “liberal” — meaning anyone he suspects of snubbing him.
    Unsmiling, solipsistic and at pains to conceal his forever wounded vanity, Thiel in Chafkin’s telling comes across as singularly disagreeable, which is evidently the secret to both his worldly successes and his moral failures.

    Young Thiel had the usual dandruff-club hobbies: He played Dungeons & Dragons, read Tolkien and aced the SATs. He was arrogant, and set his worldview against those who mocked him for it. One of Thiel’s classmates at Stanford told Chafkin, “He viewed liberals through a lens as people who were not nice to him.” Looking back on Thiel’s anti-elitist and eventually illiberal politics, Chafkin is succinct: “He’d chosen to reject those who’d rejected him.”

    Chafkin serves as a tour guide to the ideological roadhouses where Thiel threw back shots of ultraconservative nostrums on his way to serve Donald Trump in 2016. There was his home life, where — first in Cleveland, then in South Africa and, finally, in suburban California — he ingested his German family’s complicity in apartheid (his father helped build a uranium mine in the Namib desert) and enthusiasm for Reagan; his requisite enlightenment via the novels of Ayn Rand; his excoriations of libs at Stanford, which (Chafkin reminds readers) still shows the influence of its eugenicist founding president, David Starr Jordan; and his depressing stint at a white-shoe corporate law firm, where he was disappointed to find “no liberals to fight.”

    These stages of the cross led Thiel to Silicon Valley in the mid-1990s, hot to leave big law and gamble on young Randian Übermenschen. An early bet on a coder named Max Levchin hit it big. The two devised PayPal, the company Thiel is famous for, which supercharged his antipathies with capital. Thiel, who’d published a book called “The Diversity Myth,” “made good on his aversion to multiculturalism,” Chafkin writes. “Besides youth, PayPal’s other defining quality was its white maleness.”

    In 2000, PayPal got in business with Elon Musk.
    “Peter thinks Musk is a fraud and a braggart,” one source tells Chafkin. “Musk thinks Peter is a sociopath.” According to Chafkin, Thiel remained coldblooded during the dot-com crash that year, as PayPal loopholed its way to market dominance. The company rebounded with a growth strategy known as “blitzscaling,” as well as the use of some supremely nasty tactics. “Whereas [Steve] Jobs viewed business as a form of cultural expression, even art,” Chafkin writes, “for Thiel and his peers it was a mode of transgression, even activism.”

    When PayPal went public, Thiel took out tens of millions and turned to investing full time. With various funds he scouted for more entrepreneurial twerps, and in the mid-2000s he latched onto Mark Zuckerberg of Facebook. He also set up a hedge fund called Clarium, where, according to Chafkin, Thiel’s staffers styled themselves as intellectuals and savored the wit of VDARE, an anti-immigration website that regularly published white nationalists. Hoping to make death less inevitable, at least for himself, Thiel also began to patronize the Alcor Life Extension Foundation, which has been steadily freezing the corpses of moneyed narcissists in liquid nitrogen since 1976.

    Thiel passed on investing in Tesla, telling Musk (according to Musk) that he didn’t “fully buy into the climate change thing.” But he gave Zuckerberg a loan for Facebook, which intermittently let him keep a leash on the young founder. After Sept. 11, Chafkin reports, Thiel also panicked about “the threat posed by Islamic terrorism — and Islam itself.” Libertarianism deserted him; he created Palantir, a data-analytics surveillance tech company designed, in essence, to root out terrorists. The C.I.A. used it, the N.Y.P.D. used it and Thiel became a contractor with big government.
    By 2006 his Clarium had $2 billion under management.

    Around this time, the wily Nick Denton, of the gossip empire Gawker, took notice of what Chafkin calls Thiel’s “extremist politics and ethically dubious business practices.” Gawker’s Valleywag site dragged Thiel, whose homosexuality was an open secret, suggesting he was repressed. This enraged Thiel, who by 2008 seemed to have lost it, firing off a floridly religious letter to Clarium investors warning of the imminent apocalypse and urging them to save their immortal souls and “accumulate treasures in heaven, in the eternal City of God.”

    The planet avoided the apocalypse, as it tends to do, but that year the financial crash laid the economy to waste. Several big investors pulled out of Thiel’s fund. In Chafkin’s telling, Thiel unaccountably blamed Denton for scaring away ultraconservatives by outing him. He determined to put Denton out of business, and in 2016, by clandestinely bankrolling a nuisance lawsuit designed to bankrupt Gawker, he did.

    Chafkin’s chronicle of Thiel’s wild abandon during the Obama years contains some of the most suspenseful passages in the book, as the narrative hurtles toward his acquisition of actual political power. Thiel seemed intoxicated by the rise of Obama, who galvanized the liberals Thiel most loved to hate. Chafkin recounts decadent parties at Thiel’s homes with barely clad men, along with his investments in nutjob projects, like seasteading, which promised life on floating ocean platforms free from government regulation. In a widely read essay, he argued that democracy and capitalism were at odds, because social programs and women’s suffrage curbed the absolute freedom of above-the-law capitalists like himself. He was officially antidemocracy.

    Thiel then began to direct money to nativist political candidates and causes, and to collaborate — via Palantir — with Lt. Gen. Michael Flynn, the strange right-wing figure who would later become a zealous Trumpite embraced by the QAnon cult.
    He built an army of mini-Thiels, the Thiel fellows, teenage boys (along with a few girls) whom he paid to quit college, forfeit normal social life and try to get rich in the Valley.

    Thiel backed Ron Paul for president in 2012, and helped Ted Cruz win a Texas Senate seat. (Gawker noted that Thiel’s support for the anti-gay Cruz was “no crazier than paying kids to drop out of school, cure death or create a floating libertarian ocean utopia.”) He contributed to Tea Party politicians with the aim of building a bigger “neo-reactionary” political movement, and in 2015, he gave his followers their own holy book when he published “Zero to One,” a compendium of antidemocracy, pro-monopoly blitzscaling tips.

    [Photo: Peter Thiel, speaking at the Republican National Convention in July 2016. Stephen Crowley/The New York Times]

    At the same time, by investing in Lyft, TaskRabbit and Airbnb with his Founders Fund, Thiel seemed to be on the right side of history. When he spoke before mainstream audiences, he sometimes softened his extreme views and even laughed off his more gonzo follies — seasteading, for one.

    Yet one friend described Thiel to Chafkin as “Nazi-curious” (though the friend later said he was just being glib), and during this period Thiel also became, Chafkin writes, closer to Curtis Yarvin, a noxious avatar of the alt-right who had ties to Steve Bannon. He turned to survivalist prepping, kitting out a giant estate in New Zealand, where he took citizenship, making it possible that at a moment’s notice he could slip the knot of what, Chafkin says, had become his ultimate nemesis: the U.S. government itself.

    In the mid-2010s, a Palantir rep was also meeting with Cambridge Analytica, the creepy English data-mining firm that was later recorded boasting about using twisted data shenanigans to all but give the 2016 presidential election to Donald Trump.

    Like just about every powerful figure who eventually went all in for Trump, Thiel was initially skeptical, according to Chafkin. But once Trump won the nomination Thiel decided he was a delightful disrupter and kindred spirit. High from crushing Gawker, Thiel spoke for Trump at the Republican National Convention, and poured money into Rebekah Mercer’s PAC to rescue the campaign as Trump revealed increasing madness on the stump. He also urged voters to take Trump “seriously, but not literally.” Simultaneously, at Thiel’s recommendation, Chafkin suggests, Zuckerberg continued to allow popular content, including potentially misleading far-right articles, to stay at the top of Facebook’s trending stories, where they could attract more clicks and spike more get-out-the-vote cortisol.

    Why did Thiel go to such lengths for Trump? Chafkin quotes an anonymous longtime investor in Thiel’s firms: “He wanted to watch Rome burn.” Trump won, which meant that Thiel’s money and his burn-it-down ideology also won.

    Chafkin recounts that some of Thiel’s friends found this concretization of his cosmology too much to bear, and turned on him. But most did what most Trump opponents did for four years: waited it out, tried to wish away the erosion of American democracy and turned to their affairs.

    For his part, Thiel embraced the role of kingmaker, and Palantir benefited handsomely from contracts the Trump administration sent its way.
    Thiel found another winning sponsee: Josh Hawley, then Missouri’s attorney general, with whom he fought Google, which threatened the stability of many Thiel-backed companies, and which Hawley saw as communist, or something.

    Chafkin, a writer and editor at Bloomberg Businessweek, is especially interested in the friction between Zuckerberg and Thiel, who drifted apart for a time as Thiel became more involved in conservative politics. The words spent on discord in this relationship — and on tension between Thiel and other tech titans — distract from the more urgent chronicle of Thiel’s rise as one of the pre-eminent authors of the contemporary far-right movement.

    “The Contrarian” is chilling — literally chilling. As I read it, I grew colder and colder, until I found myself curled up under a blanket on a sunny day, icy and anxious. Scared people are scary, and Chafkin’s masterly evocation of his subject’s galactic fear — of liberals, of the U.S. government, of death — turns Thiel himself into a threat. I tried to tell myself that Thiel is just another rapacious solipsist, in it for the money, but I used to tell myself that about another rapacious solipsist, and he became president.

    By way of conclusion, Chafkin reports that Thiel rode out much of the pandemic in Maui, losing faith in Trump. Evidently Thiel considers the devastating coronavirus both an economic opportunity for Palantir, which went public in 2020 and has benefited from Covid-related government contracts, and a vindication of his predictions that the world as we know it is finished.

  • Facebook Said to Consider Forming an Election Commission

    The social network has contacted academics to create a group to advise it on thorny election-related decisions, said people with knowledge of the matter.

    Facebook has approached academics and policy experts about forming a commission to advise it on global election-related matters, said five people with knowledge of the discussions, a move that would allow the social network to shift some of its political decision-making to an advisory body.

    The proposed commission could decide on matters such as the viability of political ads and what to do about election-related misinformation, said the people, who spoke on the condition of anonymity because the discussions were confidential. Facebook is expected to announce the commission this fall in preparation for the 2022 midterm elections, they said, though the effort is preliminary and could still fall apart.

    Outsourcing election matters to a panel of experts could help Facebook sidestep criticism of bias by political groups, two of the people said. The company has been blasted in recent years by conservatives, who have accused Facebook of suppressing their voices, as well as by civil rights groups and Democrats for allowing political misinformation to fester and spread online. Mark Zuckerberg, Facebook’s chief executive, does not want to be seen as the sole decision maker on political content, two of the people said.

    [Photo: Mark Zuckerberg, Facebook’s chief executive, testified remotely in April about social media’s role in extremism and misinformation. Via Reuters]

    Facebook declined to comment.

    If an election commission is formed, it would emulate the step Facebook took in 2018 when it created what it calls the Oversight Board, a collection of journalism, legal and policy experts who adjudicate whether the company was correct to remove certain posts from its platforms.
    Facebook has pushed some content decisions to the Oversight Board for review, allowing it to show that it does not make determinations on its own. Facebook, which has positioned the Oversight Board as independent, appointed the people on the panel and pays them through a trust.

    The Oversight Board’s highest-profile decision was reviewing Facebook’s suspension of former President Donald J. Trump after the Jan. 6 storming of the U.S. Capitol. At the time, Facebook opted to ban Mr. Trump’s account indefinitely, a penalty that the Oversight Board later deemed “not appropriate” because the time frame was not based on any of the company’s rules. The board asked Facebook to try again.

    In June, Facebook responded by saying that it would bar Mr. Trump from the platform for at least two years. The Oversight Board has separately weighed in on more than a dozen other content cases that it calls “highly emblematic” of broader themes that Facebook grapples with regularly, including whether certain Covid-related posts should remain up on the network and hate speech issues in Myanmar.

    A spokesman for the Oversight Board declined to comment.

    Facebook has had a spotty track record on election-related issues, going back to Russian manipulation of the platform’s advertising and posts in the 2016 presidential election.

    Lawmakers and political ad buyers also criticized Facebook for changing the rules around political ads before the 2020 presidential election. Last year, the company said it would bar the purchase of new political ads the week before the election, then later decided to temporarily ban all U.S. political advertising after the polls closed on Election Day, causing an uproar among candidates and ad-buying firms.

    The company has struggled with how to handle lies and hate speech around elections. During his last year in office, Mr. Trump used Facebook to suggest he would use state violence against protesters in Minneapolis ahead of the 2020 election, while casting doubt on the electoral process as votes were tallied in November. Facebook initially said that what political leaders posted was newsworthy and should not be touched, before later reversing course.

    The social network has also faced difficulties in elections elsewhere, including the proliferation of targeted disinformation across its WhatsApp messaging service during the Brazilian presidential election in 2018. In 2019, Facebook removed hundreds of misleading pages and accounts associated with political parties in India ahead of the country’s national elections.

    Facebook has tried various methods to stem the criticisms. It established a political ads library to increase transparency around buyers of those promotions. It also has set up war rooms to monitor elections for disinformation to prevent interference.

    There are several elections in the coming year in countries such as Hungary, Germany, Brazil and the Philippines where Facebook’s actions will be closely scrutinized. Voter fraud misinformation has already begun spreading ahead of German elections in September. In the Philippines, Facebook has removed networks of fake accounts that support President Rodrigo Duterte, who used the social network to gain power in 2016.

    “There is already this perception that Facebook, an American social media company, is going in and tilting elections of other countries through its platform,” said Nathaniel Persily, a law professor at Stanford University. “Whatever decisions Facebook makes have global implications.”

    Internal conversations around an election commission date back to at least a few months ago, said three people with knowledge of the matter. An election commission would differ from the Oversight Board in one key way, the people said.
    While the Oversight Board waits for Facebook to remove a post or an account and then reviews that action, the election commission would proactively provide guidance without the company having made an earlier call, they said.

    Tatenda Musapatike, who previously worked on elections at Facebook and now runs a nonprofit voter registration organization, said that many have lost faith in the company’s abilities to work with political campaigns. But the election commission proposal was “a good step,” she said, because “they’re doing something and they’re not saying we alone can handle it.”

  • Here's a Look Inside Facebook's Data Wars

    Executives at the social network have clashed over CrowdTangle, a Facebook-owned data tool that revealed users’ high engagement levels with right-wing media sources.

    One day in April, the people behind CrowdTangle, a data analytics tool owned by Facebook, learned that transparency had limits.

    Brandon Silverman, CrowdTangle’s co-founder and chief executive, assembled dozens of employees on a video call to tell them that they were being broken up. CrowdTangle, which had been running quasi-independently inside Facebook since being acquired in 2016, was being moved under the social network’s integrity team, the group trying to rid the platform of misinformation and hate speech. Some CrowdTangle employees were being reassigned to other divisions, and Mr. Silverman would no longer be managing the team day to day.

    The announcement, which left CrowdTangle’s employees in stunned silence, was the result of a yearlong battle among Facebook executives over data transparency, and how much the social network should reveal about its inner workings.

    On one side were executives, including Mr. Silverman and Brian Boland, a Facebook vice president in charge of partnerships strategy, who argued that Facebook should publicly share as much information as possible about what happens on its platform — good, bad or ugly.

    On the other side were executives, including the company’s chief marketing officer and vice president of analytics, Alex Schultz, who worried that Facebook was already giving away too much. They argued that journalists and researchers were using CrowdTangle, a kind of turbocharged search engine that allows users to analyze Facebook trends and measure post performance, to dig up information they considered unhelpful — showing, for example, that right-wing commentators like Ben Shapiro and Dan Bongino were getting much more engagement on their Facebook pages than mainstream news outlets.

    These executives argued that Facebook should selectively disclose its own data in the form of carefully curated reports, rather than handing outsiders the tools to discover it themselves. Team Selective Disclosure won, and CrowdTangle and its supporters lost.

    An internal battle over data transparency might seem low on the list of worthy Facebook investigations. And it’s a column I’ve hesitated to write for months, in part because I’m uncomfortably close to the action. (More on that in a minute.)

    But the CrowdTangle story is important, because it illustrates the way that Facebook’s obsession with managing its reputation often gets in the way of its attempts to clean up its platform. And it gets to the heart of one of the central tensions confronting Facebook in the post-Trump era. The company, blamed for everything from election interference to vaccine hesitancy, badly wants to rebuild trust with a skeptical public. But the more it shares about what happens on its platform, the more it risks exposing uncomfortable truths that could further damage its image.
    The question of what to do about CrowdTangle has vexed some of Facebook’s top executives for months, according to interviews with more than a dozen current and former Facebook employees, as well as internal emails and posts. These people, most of whom would speak only anonymously because they were not authorized to discuss internal conversations, said Facebook’s executives were more worried about fixing the perception that Facebook was amplifying harmful content than figuring out whether it actually was amplifying harmful content. Transparency, they said, ultimately took a back seat to image management.

    Facebook disputes this characterization. It says that the CrowdTangle reorganization was meant to integrate the service with its other transparency tools, not weaken it, and that top executives are still committed to increasing transparency.

    “CrowdTangle is part of a growing suite of transparency resources we’ve made available for people, including academics and journalists,” said Joe Osborne, a Facebook spokesman. “With CrowdTangle moving into our integrity team, we’re developing a more comprehensive strategy for how we build on some of these transparency efforts moving forward.”

    But the executives who pushed hardest for transparency appear to have been sidelined. Mr. Silverman, CrowdTangle’s co-founder and chief executive, has been taking time off and no longer has a clearly defined role at the company, several people with knowledge of the situation said. (Mr. Silverman declined to comment about his status.) And Mr. Boland, who spent 11 years at Facebook, left the company in November.

    “One of the main reasons that I left Facebook is that the most senior leadership in the company does not want to invest in understanding the impact of its core products,” Mr. Boland said, in his first interview since departing. “And it doesn’t want to make the data available for others to do the hard work and hold them accountable.”

    Mr. Boland, who oversaw CrowdTangle as well as other Facebook transparency efforts, said the tool fell out of favor with influential Facebook executives around the time of last year’s presidential election, when journalists and researchers used it to show that pro-Trump commentators were spreading misinformation and hyperpartisan commentary with stunning success.

    “People were enthusiastic about the transparency CrowdTangle provided until it became a problem and created press cycles Facebook didn’t like,” he said. “Then, the tone at the executive level changed.”

    [Photo: Brian Boland, a former vice president in charge of partnerships strategy and an advocate for more transparency, left Facebook in November. Christian Sorensen Hansen for The New York Times]

    The Twitter Account That Launched 1,000 Meetings

    Here’s where I, somewhat reluctantly, come in.

    I started using CrowdTangle a few years ago. I’d been looking for a way to see which news stories gained the most traction on Facebook, and CrowdTangle — a tool used mainly by audience teams at news publishers and marketers who want to track the performance of their posts — filled the bill. I figured out that through a kludgey workaround, I could use its search feature to rank Facebook link posts — that is, posts that include a link to a non-Facebook site — in order of the number of reactions, shares and comments they got. Link posts weren’t a perfect proxy for news, engagement wasn’t a perfect proxy for popularity and CrowdTangle’s data was limited in other ways, but it was the closest I’d come to finding a kind of cross-Facebook news leaderboard, so I ran with it.

    At first, Facebook was happy that I and other journalists were finding its tool useful.
    With only about 25,000 users, CrowdTangle is one of Facebook’s smallest products, but it has become a valuable resource for power users including global health organizations, election officials and digital marketers, and it has made Facebook look transparent compared with rival platforms like YouTube and TikTok, which don’t release nearly as much data.

    But the mood shifted last year when I started a Twitter account called @FacebooksTop10, on which I posted a daily leaderboard showing the sources of the most-engaged link posts by U.S. pages, based on CrowdTangle data.

    Last fall, the leaderboard was full of posts by Mr. Trump and pro-Trump media personalities. Since Mr. Trump was barred from Facebook in January, it has been dominated by a handful of right-wing polemicists like Mr. Shapiro, Mr. Bongino and Sean Hannity, with the occasional mainstream news article, cute animal story or K-pop fan blog sprinkled in.

    The account went semi-viral, racking up more than 35,000 followers. Thousands of people retweeted the lists, including conservatives who were happy to see pro-Trump pundits beating the mainstream media and liberals who shared them with jokes like “Look at all this conservative censorship!” (If you’ve been under a rock for the past two years, conservatives in the United States frequently complain that Facebook is censoring them.)

    The lists also attracted plenty of Facebook haters. Liberals shared them as evidence that the company was a swamp of toxicity that needed to be broken up; progressive advertisers bristled at the idea that their content was appearing next to pro-Trump propaganda. The account was even cited at a congressional hearing on tech and antitrust by Representative Jamie Raskin, Democrat of Maryland, who said it proved that “if Facebook is out there trying to suppress conservative speech, they’re doing a terrible job at it.”

    Inside Facebook, the account drove executives crazy.
    Some believed that the data was being misconstrued and worried that it was painting Facebook as a far-right echo chamber. Others worried that the lists might spook investors by suggesting that Facebook’s U.S. user base was getting older and more conservative. Every time a tweet went viral, I got grumpy calls from Facebook executives who were embarrassed by the disparity between what they thought Facebook was — a clean, well-lit public square where civility and tolerance reign — and the image they saw reflected in the Twitter lists.

    As the election approached last year, Facebook executives held meetings to figure out what to do, according to three people who attended them. They set out to determine whether the information on @FacebooksTop10 was accurate (it was), and discussed starting a competing Twitter account that would post more balanced lists based on Facebook’s internal data.

    They never did that, but several executives — including John Hegeman, the head of Facebook’s news feed — were dispatched to argue with me on Twitter. These executives argued that my Top 10 lists were misleading. They said CrowdTangle measured only “engagement,” while the true measure of Facebook popularity would be based on “reach,” or the number of people who actually see a given post. (With the exception of video views, reach data isn’t public, and only Facebook employees have access to it.)

    Last September, Mark Zuckerberg, Facebook’s chief executive, told Axios that while right-wing content garnered a lot of engagement, the idea that Facebook was a right-wing echo chamber was “just wrong.”

    “I think it’s important to differentiate that from, broadly, what people are seeing and reading and learning about on our service,” Mr. Zuckerberg said.

    But Mr. Boland, the former Facebook vice president, said that was a convenient deflection.
    He said that in internal discussions, Facebook executives were less concerned about the accuracy of the data than about the image of Facebook it presented. “It told a story they didn’t like,” he said of the Twitter account, “and frankly didn’t want to admit was true.”

    The Trouble With CrowdTangle

    Around the same time that Mr. Zuckerberg made his comments to Axios, the tensions came to a head. The Economist had just published an article claiming that Facebook “offers a distorted view of American news.” The article, which cited CrowdTangle data, showed that the most-engaged American news sites on Facebook were Fox News and Breitbart, and claimed that Facebook’s overall news ecosystem skewed right wing. John Pinette, Facebook’s vice president of global communications, emailed a link to the article to a group of executives with the subject line “The trouble with CrowdTangle.”

    “The Economist steps onto the Kevin Roose bandwagon,” Mr. Pinette wrote. (See? Told you it was uncomfortably close to home.)

    Nick Clegg, Facebook’s vice president of global affairs, replied, lamenting that “our own tools are helping journos to consolidate the wrong narrative.” Other executives chimed in, adding their worries that CrowdTangle data was being used to paint Facebook as a right-wing echo chamber.

    David Ginsberg, Facebook’s vice president of choice and competition, wrote that if Mr. Trump won re-election in November, “the media and our critics will quickly point to this ‘echo chamber’ as a prime driver of the outcome.”

    Fidji Simo, the head of the Facebook app at the time, agreed. “I really worry that this could be one of the worst narratives for us,” she wrote.

    Several executives proposed making reach data public on CrowdTangle, in hopes that reporters would cite that data instead of the engagement data they thought made Facebook look bad.

    But Mr. Silverman, CrowdTangle’s chief executive, replied in an email that the CrowdTangle team had already tested a feature to do that and found problems with it. One issue was that false and misleading news stories also rose to the top of those lists. “Reach leaderboard isn’t a total win from a comms point of view,” Mr. Silverman wrote.

    Mr. Schultz, Facebook’s chief marketing officer, had the dimmest view of CrowdTangle. He wrote that he thought “the only way to avoid stories like this” would be for Facebook to publish its own reports about the most popular content on its platform, rather than releasing data through CrowdTangle. “If we go down the route of just offering more self-service data you will get different, exciting, negative stories in my opinion,” he wrote.

    Mr. Osborne, the Facebook spokesman, said Mr. Schultz and the other executives were discussing how to correct misrepresentations of CrowdTangle data, not strategizing about killing off the tool.

    A few days after the election in November, Mr. Schultz wrote a post for the company blog, called “What Do People Actually See on Facebook in the U.S.?” He explained that if you ranked Facebook posts based on which got the most reach, rather than the most engagement — his preferred method of slicing the data — you’d end up with a more mainstream, less sharply partisan list of sources. “We believe this paints a more complete picture than the CrowdTangle data alone,” he wrote.

    That may be true, but there’s a problem with reach data: Most of it is inaccessible and can’t be vetted or fact-checked by outsiders. We simply have to trust that Facebook’s own, private data tells a story that’s very different from the data it shares with the public.

    Tweaking Variables

    Mr. Zuckerberg is right about one thing: Facebook is not a giant right-wing echo chamber. But it does contain a giant right-wing echo chamber — a kind of AM talk radio built into the heart of Facebook’s news ecosystem, with a hyper-engaged audience of loyal partisans who love liking, sharing and clicking on posts from right-wing pages, many of which have gotten good at serving up Facebook-optimized outrage bait at a consistent clip.

    CrowdTangle’s data made this echo chamber easier for outsiders to see and quantify. But it didn’t create it, or give it the tools it needed to grow — Facebook did — and blaming a data tool for these revelations makes no more sense than blaming a thermometer for bad weather.

    It’s worth noting that these transparency efforts are voluntary, and could disappear at any time. There are no regulations that require Facebook or any other social media companies to reveal what content performs well on their platforms, and American politicians appear to be more interested in fighting over claims of censorship than getting access to better data.

    It’s also worth noting that Facebook can turn down the outrage dials and show its users calmer, less divisive news any time it wants. (In fact, it briefly did so after the 2020 election, when it worried that election-related misinformation could spiral into mass violence.) And there is some evidence that it is at least considering more permanent changes.

    This year, Mr. Hegeman, the executive in charge of Facebook’s news feed, asked a team to figure out how tweaking certain variables in the core news feed ranking algorithm would change the resulting Top 10 lists, according to two people with knowledge of the project.

    The project, which some employees refer to as the “Top 10” project, is still underway, the people said, and it’s unclear whether its findings have been put in place. Mr. Osborne, the Facebook spokesman, said that the team looks at a variety of ranking changes, and that the experiment wasn’t driven by a desire to change the Top 10 lists.

    As for CrowdTangle, the tool is still available, and Facebook is not expected to cut off access to journalists and researchers in the short term, according to two people with knowledge of the company’s plans.

    Mr. Boland, however, said he wouldn’t be surprised if Facebook executives decided to kill off CrowdTangle entirely or starve it of resources, rather than dealing with the headaches its data creates.

    “Facebook would love full transparency if there was a guarantee of positive stories and outcomes,” Mr. Boland said. “But when transparency creates uncomfortable moments, their reaction is often to shut down the transparency.”


    The Facebook Oversight Board's Verdict on the Trump Ban

In the end, they passed the buck.

A year ago, Facebook introduced an oversight board that it said would help it answer difficult moderation questions — that is, who is allowed to use the social media site to amplify his voice and who is not.

Yet when presented with its most consequential issue — whether to uphold the site’s indefinite suspension of Donald Trump — the board on Wednesday said Facebook should make the ultimate decision.

The whole farce highlights the fatuousness of having a quasi-court assist a multinational corporation in making business decisions. Its members may be deliberative, earnest and thoughtful, but the oversight board cannot compel Facebook to make underlying policy changes, nor can it set meaningful precedent about moderation. Its remit is only to decide whether specific posts should remain on the site or be removed.

Helle Thorning-Schmidt, an oversight board co-chair and former prime minister of Denmark, sought to bolster the body’s importance. “Anyone who is concerned about Facebook’s excessive concentration of power should welcome the oversight board clearly telling Facebook that they cannot invent new unwritten rules when it suits them,” she said in a call with media outlets.

Michael McConnell, another co-chair and a Stanford Law School professor, said in an interview that Facebook was “open to the suggestions of the board.” “The immediate holding of our decision is binding and I do think that they are going to set precedent.” He added, “The analogy to the Supreme Court is not bad.”

But Facebook is no public entity, and the board’s policy rulings have no legal standing beyond co-opting the language of the legal system. The company, meaning its chief executive, Mark Zuckerberg, will act in its best interests as a business.

(Twitter, Mr. Trump’s favored platform, shut down his account two days after the Capitol riot on Jan. 6 and has announced no plans to restore it, nor has the company farmed out the decision to a third party.)

Declining to amplify Mr. Trump’s lies on Facebook as the country was reeling from the Capitol attack was a good business decision for Facebook at the time, but restoring his account, with its roughly 35 million followers, may also eventually be a good business decision.

The board, made up of 20 handpicked scholars, lawyers, politicians and other heavyweights, said Donald Trump’s use of Facebook to spur on the Jan. 6 attack on the Capitol was worthy of an account ban, but that Facebook needed to clarify the duration. The board said that Facebook must decide within six months on a lifetime ban or one of a specific duration.

The issue could drag on, however. The board said it could very well have to rule again on Mr. Trump’s status after Facebook makes its decision.

Beyond the specifics of Mr. Trump’s use of Facebook and Instagram, the oversight board requested that the social media company better explain how its rules apply to public figures and more clearly enumerate its strikes and penalties processes, which can appear opaque, particularly when users are suspended or barred with little warning.

Facebook allows an exemption for politicians to lie or break other of its rules in what the company says is the interest of newsworthiness. This is the opposite of how it should be: Politicians are more likely to be believed than regular folks, who are held to a higher standard on the site.

Mr. Trump repeatedly violated Facebook’s community standards, including by threatening other world leaders and pushing conspiracy theories about his enemies. Nearly a quarter of his roughly 6,000 posts last year featured extremist rhetoric or misinformation about the election, his critics or the coronavirus.

And he made it clear on Monday, as the oversight board’s public relations team began publicizing the imminent decision, that his time out of office has not chastened him. Regarding the decisive and fairly run November election, Mr. Trump wrote: “The Fraudulent Presidential Election of 2020 will be, from this day forth, known as THE BIG LIE!”

Ms. Thorning-Schmidt chastised Facebook for what she said were arbitrary rule-making procedures. “The oversight board is clearly telling Facebook that they can’t just invent new, unwritten rules when it suits them and for special uses,” she said. “They have to have a transparent way of doing this.”

But therein lies the unresolvable contradiction. Facebook’s rules, and its oversight board, are constructs of a private entity whose only real accountability is to its founder and chief executive.

The board is good government theater. Until Facebook gives the board a much stronger mandate, it will remain just that.


    Trump Ban From Facebook Upheld by Oversight Board

A company-appointed panel ruled that the ban was justified at the time, but added that the company should reassess its action and make a final decision in six months.

SAN FRANCISCO — A Facebook-appointed panel of journalists, activists and lawyers on Wednesday upheld the social network’s ban of former President Donald J. Trump, ending any immediate return by Mr. Trump to mainstream social media and renewing a debate about tech power over online speech.

Facebook’s Oversight Board, which acts as a quasi-court over the company’s content decisions, said the social network was right to bar Mr. Trump after he used the site to foment an insurrection in Washington in January. The panel said the ongoing risk of violence “justified” the move.

But the board also said that an indefinite suspension was “not appropriate,” and that the company should apply a “defined penalty.” The board gave Facebook six months to make its final decision on Mr. Trump’s account status.

“Our sole job is to hold this extremely powerful organization, Facebook, accountable,” Michael McConnell, co-chair of the Oversight Board, said on a call with reporters. The ban on Mr. Trump “did not meet these standards,” he said.

The decision adds to the difficulty of Mr. Trump rejoining mainstream social media, which he had used during his White House years to cajole, set policy, criticize opponents and rile up his tens of millions of followers. Twitter and YouTube had also cut off Mr. Trump in January after the insurrection at the Capitol building, saying the risk of harm and the potential for violence that he created were too great.

But while Mr. Trump’s Facebook account remains suspended for now, he may be able to return to the social network once the company reviews its action. Mr. Trump still holds tremendous sway over Republicans, with his false claims of a stolen election continuing to reverberate. On Wednesday, House Republican leaders moved to expel Representative Liz Cheney of Wyoming from her leadership post for criticizing Mr. Trump and his election lies.

Representatives for Mr. Trump did not immediately return requests for comment. On Tuesday, he unveiled a new site, “From the desk of Donald J. Trump,” with a Twitter-like feed, to communicate with his supporters.

Mr. Trump’s continued Facebook suspension gave Republicans, who have long accused social media companies of suppressing conservative voices, new fuel against the platforms. Mark Zuckerberg, Facebook’s chief executive, has testified in Congress several times in recent years about whether the social network has shown bias against conservative political views. He has denied it.

Senator Marsha Blackburn, Republican of Tennessee, said the Facebook board’s decision was “extremely disappointing” and that it was “clear that Mark Zuckerberg views himself as the arbiter of free speech.” And Representative Jim Jordan, Republican of Ohio, said Facebook, which faces antitrust scrutiny, should be broken up.

Democrats were also unhappy. Frank Pallone, the chairman of the House energy and commerce committee, tweeted, “Donald Trump has played a big role in helping Facebook spread disinformation, but whether he’s on the platform or not, Facebook and other social media platforms with the same business model will find ways to highlight divisive content to drive advertising revenues.”

The decision underlined the power of tech companies in determining who gets to say what online. While Mr. Zuckerberg has said that he does not wish his company to be “the arbiter of truth” in social discourse, Facebook has become increasingly active about the kinds of content it allows. To prevent the spread of misinformation, the company has cracked down on QAnon conspiracy theory groups, election falsehoods and anti-vaccination content in recent months, culminating in the blocking of Mr. Trump in January.

“This case has dramatic implications for the future of speech online because the public and other platforms are looking at how the oversight board will handle what is a difficult controversy that will arise again around the world,” said Nate Persily, a professor at Stanford University’s law school.

He added, “President Trump has pushed the envelope about what is permissible speech on these platforms and he has set the outer limits such that if you are unwilling to go after him, you are allowing a large amount of incitement and hate speech and disinformation online that others are going to propagate.”

In a statement, Facebook said it was “pleased” that the board recognized that its barring of Mr. Trump in January was justified. The company added that it would consider the ruling and “determine an action that is clear and proportionate.”

Mr. Trump’s case is the most prominent that the Facebook Oversight Board, which was conceived in 2018, has handled. The board, which is made up of 20 journalists, activists and former politicians, reviews and adjudicates the company’s most contested content moderation decisions. Mr. Zuckerberg has repeatedly referred to it as the “Facebook Supreme Court.”

But while the panel is positioned as independent, it was founded and funded by Facebook and has no legal or enforcement authority. Critics have been skeptical of the board’s autonomy and have said it gives Facebook the ability to punt on difficult decisions.

Each of its cases is decided by a five-person panel selected from among the board’s 20 members, one of whom must be from the country in which the case originated. The panel reviews the comments on the case and makes recommendations to the full board, which decides through a majority vote. After a ruling, Facebook has seven days to act on the board’s decision.

Mark Zuckerberg, the Facebook chief executive, testified before the Senate Judiciary Committee last year. He has denied that the platform showed political bias. (Pool photo by Hannah Mckay/EPA, via Shutterstock)

Since the board began issuing rulings in January, it has overturned Facebook’s decisions in four of the five cases it has reviewed. In one case, the board asked Facebook to restore a post that used Joseph Goebbels, the Nazi propaganda chief, to make a point about the Trump presidency. Facebook had earlier removed the post because it “promoted dangerous individuals,” but complied with the board’s decision.

In another case, the board ruled that Facebook had overreached by taking down a French user’s post that erroneously suggested the drug hydroxychloroquine could be used to cure Covid-19. Facebook restored the post but also said it would keep removing the false information, following guidance from the Centers for Disease Control and Prevention and the World Health Organization.

In Mr. Trump’s case, Facebook also asked the board to make recommendations on how to handle the accounts of political leaders. On Wednesday, the board suggested the company should publicly explain when it was applying special rules to influential figures, though it should impose definite time limits when doing so. The board also said Facebook should more clearly explain its strikes and penalties process, and develop and publish a policy that governs responses to crises or novel situations where its regular processes would not prevent imminent harm.

“Facebook has been clearly abused by influential users,” said Helle Thorning-Schmidt, a co-chair of the Oversight Board.

Facebook does not have to adopt these recommendations but said it “will carefully review” them.

For Mr. Trump, Facebook was long a place to rally his digital base and support other Republicans. More than 32 million people followed him on Facebook, though that was far fewer than the more than 88 million followers he had on Twitter.

Over the years, Mr. Trump and Mr. Zuckerberg also shared a testy relationship. Mr. Trump regularly assailed Silicon Valley executives for what he perceived to be their suppression of conservative speech. He also threatened to revoke Section 230, a legal shield that protects companies like Facebook from liability for what users post.

Mr. Zuckerberg occasionally criticized some of Mr. Trump’s policies, including the handling of the pandemic and immigration. But as calls from lawmakers, civil rights leaders and even Facebook’s own employees grew to rein in Mr. Trump on social media, Mr. Zuckerberg declined to act. He said speech by political leaders — even if they spread lies — was newsworthy and in the public interest.

The two men also appeared cordial during occasional meetings in Washington. Mr. Zuckerberg visited the White House more than once, dining privately with Mr. Trump.

The politeness ended on Jan. 6. Hours before his supporters stormed the Capitol, Mr. Trump used Facebook and other social media to try to cast doubt on the results of the presidential election, which he had lost to Joseph R. Biden Jr. Mr. Trump wrote on Facebook, “Our Country has had enough, they won’t take it anymore!”

Less than 24 hours later, Mr. Trump was barred from the platform indefinitely. While his Facebook page has remained up, it has been dormant. His last Facebook post, on Jan. 6, read, “I am asking for everyone at the U.S. Capitol to remain peaceful. No violence!”

Cecilia Kang


    A Facebook panel will reveal on Wednesday whether Trump will regain his megaphone.

Facebook’s Oversight Board, an independent and international panel that was created and funded by the social network, plans to announce on Wednesday whether former President Donald J. Trump will be able to return to the platform that has been a critical megaphone for him and his tens of millions of followers.

The decision will be closely watched as a template for how private companies that run social networks handle political speech, including the misinformation spread by political leaders.

Mr. Trump was indefinitely locked out of Facebook on Jan. 7 after he used his social media accounts to incite a mob of his supporters to storm the Capitol a day earlier. Mr. Trump had declined to accept his election defeat, saying the election had been stolen from him.

At the time that Facebook barred Mr. Trump, the company’s chief executive, Mark Zuckerberg, wrote in a post: “We believe the risks of allowing the president to continue to use our service during this period are simply too great.”

Two weeks later, the company referred the case of Mr. Trump to Facebook’s Oversight Board for a final decision on whether the ban should be permanent. Facebook and the board’s members have said the panel’s decisions are binding, but critics are skeptical of the board’s independence. The panel, critics said, is a first-of-its-kind Supreme Court-like entity on online speech, funded by a private company with a poor track record of enforcing its own rules.

Facebook’s approach to political speech has been inconsistent. In October 2019, Mr. Zuckerberg declared the company would not fact check political speech and said that even lies by politicians deserved a place on the social network because it was in the public’s interest to hear all ideas by political leaders. But Mr. Trump’s comments on Jan. 6 were different, the company has said, because they incited violence and threatened the peaceful transition of power in elections.

On Monday, Mr. Trump continued to deny the election results. “The Fraudulent Presidential Election of 2020 will be, from this day forth, known as THE BIG LIE!” he said in an emailed statement.


    Zuckerberg, Dorsey and Pichai testify about disinformation.

The chief executives of Google, Facebook and Twitter are testifying before the House on Thursday about how disinformation spreads across their platforms, an issue for which the tech companies were scrutinized during the presidential election and after the Jan. 6 riot at the Capitol.

The hearing, held by the House Energy and Commerce Committee, is the first time that Mark Zuckerberg of Facebook, Jack Dorsey of Twitter and Sundar Pichai of Google are appearing before Congress during the Biden administration. President Biden has indicated that he is likely to be tough on the tech industry. That position, coupled with Democratic control of Congress, has raised liberal hopes that Washington will take steps to rein in Big Tech’s power and reach over the next few years.

The hearing is also the first opportunity since the Jan. 6 Capitol riot for lawmakers to question the three men about the role their companies played in the event. The attack has made the issue of disinformation intensely personal for the lawmakers, since those who participated in the riot have been linked to online conspiracy theories like QAnon.

Before the hearing, Democrats signaled in a memo that they were interested in questioning the executives about the Jan. 6 attacks, efforts by the right to undermine the results of the 2020 election and misinformation related to the Covid-19 pandemic.

Republicans sent the executives letters this month asking them about the decisions to remove conservative personalities and stories from their platforms, including an October article in The New York Post about President Biden’s son Hunter.

Lawmakers have debated whether social media platforms’ business models encourage the spread of hate and disinformation by prioritizing content that will elicit user engagement, often by emphasizing salacious or divisive posts.

Some lawmakers will push for changes to Section 230 of the Communications Decency Act, a 1996 law that shields the platforms from lawsuits over their users’ posts. Lawmakers are trying to strip the protections in cases where the companies’ algorithms amplified certain illegal content. Others believe that the spread of disinformation could be stemmed with stronger antitrust laws, since the platforms are by far the major outlets for communicating publicly online.

“By now it’s painfully clear that neither the market nor public pressure will stop social media companies from elevating disinformation and extremism, so we have no choice but to legislate, and now it’s a question of how best to do it,” said Representative Frank Pallone, the New Jersey Democrat who is chairman of the committee.

The tech executives are expected to play up their efforts to limit misinformation and redirect users to more reliable sources of information. They may also entertain the possibility of more regulation, in an effort to shape increasingly likely legislative changes rather than resist them outright.


    Facebook Ends Ban on Political Advertising

The social network had prohibited political ads on its site indefinitely after the November election. Such ads have been criticized for spreading misinformation.

Mark Zuckerberg, the Facebook chief executive, testifying in October. Before the ban on political ads, he had said he wanted to maintain a hands-off approach toward speech on Facebook. (Pool photo by Michael Reynolds)

March 3, 2021, updated 6:16 p.m. ET

SAN FRANCISCO — Facebook said on Wednesday that it planned to lift its ban on political advertising across its network, resuming a form of digital promotion that has been criticized for spreading misinformation and falsehoods and inflaming voters.

The social network said it would allow advertisers to buy new ads about “social issues, elections or politics” beginning on Thursday, according to a copy of an email sent to political advertisers and viewed by The New York Times. Those advertisers must complete a series of identity checks before being authorized to place the ads, the company said.

“We put this temporary ban in place after the November 2020 election to avoid confusion or abuse following Election Day,” Facebook said in a blog post. “We’ve heard a lot of feedback about this and learned more about political and electoral ads during this election cycle. As a result, we plan to use the coming months to take a closer look at how these ads work on our service to see where further changes may be merited.”

Political advertising on Facebook has long faced questions. Mark Zuckerberg, Facebook’s chief executive, has said he wished to maintain a largely hands-off stance toward speech on the site — including political ads — unless it posed an immediate harm to the public or individuals, saying that he did not want to be “the arbiter of truth.”

But after the 2016 presidential election, the company and intelligence officials discovered that Russians had used Facebook ads to sow discontent among Americans. Former President Donald J. Trump also used Facebook’s political ads to amplify claims about an “invasion” on the Mexican border in 2019, among other incidents.

Facebook had banned political ads late last year as a way to choke off misinformation and threats of violence around the November presidential election. In September, the company said it planned to forbid new political ads for the week before Election Day and would act swiftly against posts that tried to dissuade people from voting. Then in October, Facebook expanded that action by declaring it would prohibit all political and issue-based advertising after the polls closed on Nov. 3 for an undetermined length of time.

The company eventually clamped down on groups and pages that spread certain kinds of misinformation, such as discouraging people from voting or registering to vote. It has spent billions of dollars to root out foreign influence campaigns and other types of meddling from malicious state agencies and other bad actors.

In December, Facebook lifted the ban to allow some advertisers to run political issue and candidacy ads in Georgia for the January runoff Senate election in the state. But the ban otherwise remained in effect for the remaining 49 states.

Attitudes about how political advertising should be treated across Facebook are decidedly mixed. Politicians who are not well known often can raise their profiles and awareness of their campaigns by using Facebook.

“Political ads are not bad things in and of themselves,” said Siva Vaidhyanathan, a media studies professor and the author of a book studying Facebook’s effects on democracy. “They perform an essential service, in the act of directly representing the candidate’s concerns or positions.”

He added, “When you ban all campaign ads on the most accessible and affordable platform out there, you tilt the balance toward the candidates who can afford radio and television.”

Representative Alexandria Ocasio-Cortez, Democrat of New York, has also said that political advertising on Facebook can be a crucial component of Democratic digital campaign strategies.

Some political ad buyers applauded the lifting of the ban.

“The ad ban was something that Facebook did to appease the public for the misinformation that spread across the platform,” said Eileen Pollet, a digital campaign strategist and founder of Ravenna Strategies. “But it really ended up hurting good actors while bad actors had total free rein. And now, especially since the election is over, the ban had really been hurting nonprofits and local organizations.”

Facebook has long sought to thread the needle between forceful moderation of its policies and a lighter touch. For years, Mr. Zuckerberg defended politicians’ right to say what they wanted on Facebook, but that changed last year amid rising alarm over potential violence around the November election.

In January, Facebook barred Mr. Trump from using his account and posting on the platform after he took to social media to delegitimize the election results and incited a violent uprising among his supporters, who stormed the U.S. Capitol.

Facebook said Mr. Trump’s suspension was “indefinite.” The decision is now under review by the Facebook Oversight Board, a third-party entity created by the company and composed of journalists, academics and others that adjudicates some of the company’s thorny content policy enforcement decisions. A decision is expected within the next few months.

On Thursday, political advertisers on Facebook will be able to submit new ads or turn on existing political ads that have already been approved, the company said. Each ad will appear with a small disclaimer stating that it has been “paid for by” a political organization. For those buying new ads, Facebook said it could take up to a week to clear the identity authorization and advertising review process.