More stories

  • What Happened When Facebook Employees Warned About Election Misinformation

    Company documents show that the social network’s employees repeatedly raised red flags about the spread of misinformation and conspiracies before and after the contested November vote.

    Sixteen months before last November’s presidential election, a researcher at Facebook described an alarming development. She was getting content about the conspiracy theory QAnon within a week of opening an experimental account, she wrote in an internal report.

    On Nov. 5, two days after the election, another Facebook employee posted a message alerting colleagues that comments with “combustible election misinformation” were visible below many posts.

    Four days after that, a company data scientist wrote in a note to his co-workers that 10 percent of all U.S. views of political material — a startlingly high figure — were of posts that alleged the vote was fraudulent.

    In each case, Facebook’s employees sounded an alarm about misinformation and inflammatory content on the platform and urged action — but the company failed or struggled to address the issues. The internal dispatches were among a set of Facebook documents obtained by The New York Times that give new insight into what happened inside the social network before and after the November election, when the company was caught flat-footed as users weaponized its platform to spread lies about the vote.

  • Whistle-Blower to Accuse Facebook of Contributing to Jan. 6 Riot, Memo Says

    In an internal memo, Facebook defended itself and said that social media was not a primary cause of polarization.

    SAN FRANCISCO — Facebook, which has been under fire from a former employee who has revealed that the social network knew of many of the harms it was causing, was bracing for new accusations over the weekend from the whistle-blower and said in a memo that it was preparing to mount a vigorous defense.

    The whistle-blower, whose identity has not been publicly disclosed, planned to accuse the company of relaxing its security safeguards for the 2020 election too soon after Election Day, which allowed the platform to be used in organizing the Jan. 6 storming of the U.S. Capitol, according to the internal memo obtained by The New York Times. The whistle-blower planned to discuss the allegations on “60 Minutes” on Sunday, the memo said, and was also set to say that Facebook had contributed to political polarization in the United States.

    The 1,500-word memo, written by Nick Clegg, Facebook’s vice president of policy and global affairs, was sent on Friday to employees to pre-empt the whistle-blower’s interview. Mr. Clegg pushed back strongly on what he said were the coming accusations, calling them “misleading.” “60 Minutes” published a teaser of the interview in advance of its segment on Sunday.

    “Social media has had a big impact on society in recent years, and Facebook is often a place where much of this debate plays out,” he wrote. “But what evidence there is simply does not support the idea that Facebook, or social media more generally, is the primary cause of polarization.”

    Facebook has been in an uproar for weeks because of the whistle-blower, who has shared thousands of pages of company documents with lawmakers and The Wall Street Journal. The Journal has published a series of articles based on the documents, which show that Facebook knew how its apps and services could cause harm, including worsening body image issues among teenage girls using Instagram.

    Facebook has since scrambled to contain the fallout, as lawmakers, regulators and the public have said the company needs to account for the revelations. On Monday, Facebook paused the development of an Instagram service for children ages 13 and under. Its global head of safety, Antigone Davis, also testified on Thursday as irate lawmakers questioned her about the effects of Facebook and Instagram on young users.

    A Facebook spokesman declined to comment. A spokesman for “60 Minutes” did not immediately respond to a request for comment.

    Inside Facebook, executives including Mr. Clegg and the “Strategic Response” teams have called a series of emergency meetings to try to extinguish some of the outrage. Mark Zuckerberg, Facebook’s chief executive, and Sheryl Sandberg, the chief operating officer, have been briefed on the responses and have approved them, but have remained behind the scenes to distance themselves from the negative press, people with knowledge of the company have said.

    The firestorm is far from over. Facebook anticipated more allegations during the whistle-blower’s “60 Minutes” interview, according to the memo. The whistle-blower, who plans to reveal her identity during the interview, was set to say that Facebook had turned off some of its safety measures around the election — such as limits on live video — too soon after Election Day, the memo said. That allowed misinformation to flood the platform and groups to congregate online and plan the Jan. 6 storming of the Capitol building.

    Mr. Clegg said that was an inaccurate view and cited many of the safeguards and security mechanisms that Facebook had built over the past five years. He said the company had removed millions of groups such as the Proud Boys and others related to causes like the conspiracy theory QAnon and #StopTheSteal election fraud claims.

    The whistle-blower was also set to claim that many of Facebook’s problems stemmed from changes in the News Feed in 2018, the memo said. That was when the social network tweaked its algorithm to emphasize what it called Meaningful Social Interactions, or MSI, which prioritized posts from users’ friends and family and de-emphasized posts from publishers and brands.

    The goal was to make sure that Facebook’s products were “not just fun, but are good for people,” Mr. Zuckerberg said in an interview about the change at the time.

    But according to Friday’s memo, the whistle-blower would say that the change contributed to even more polarization among Facebook’s users. The whistle-blower was also set to say that Facebook then reaped record profits as its users flocked to the divisive content, the memo said.

    Mr. Clegg warned that the period ahead could be difficult for employees who might face questions from friends and family about Facebook’s role in the world. But he said that societal problems and political polarization had long predated the company and the advent of social networks in general.

    “The simple fact remains that changes to algorithmic ranking systems on one social media platform cannot explain wider societal polarization,” he wrote. “Indeed, polarizing content and misinformation are also present on platforms that have no algorithmic ranking whatsoever, including private messaging apps like iMessage and WhatsApp.”

    Mr. Clegg, who is scheduled to appear on the CNN program “Reliable Sources” on Sunday morning, also tried to relay an upbeat note to employees.

    “We will continue to face scrutiny — some of it fair and some of it unfair,” he said in the memo. “But we should also continue to hold our heads up high.”

    Here is Mr. Clegg’s memo in full:

    OUR POSITION ON POLARIZATION AND ELECTIONS

    You will have seen the series of articles about us published in the Wall Street Journal in recent days, and the public interest it has provoked. This Sunday night, the ex-employee who leaked internal company material to the Journal will appear in a segment on 60 Minutes on CBS. We understand the piece is likely to assert that we contribute to polarization in the United States, and suggest that the extraordinary steps we took for the 2020 elections were relaxed too soon and contributed to the horrific events of January 6th in the Capitol.

    I know some of you – especially those of you in the US – are going to get questions from friends and family about these things so I wanted to take a moment as we head into the weekend to provide what I hope is some useful context on our work in these crucial areas.

    Facebook and Polarization

    People are understandably anxious about the divisions in society and looking for answers and ways to fix the problems. Social media has had a big impact on society in recent years, and Facebook is often a place where much of this debate plays out. So it’s natural for people to ask whether it is part of the problem. But the idea that Facebook is the chief cause of polarization isn’t supported by the facts – as Chris and Pratiti set out in their note on the issue earlier this year.

    The rise of polarization has been the subject of swathes of serious academic research in recent years. In truth, there isn’t a great deal of consensus. But what evidence there is simply does not support the idea that Facebook, or social media more generally, is the primary cause of polarization.

    The increase in political polarization in the US pre-dates social media by several decades. If it were true that Facebook is the chief cause of polarization, we would expect to see it going up wherever Facebook is popular. It isn’t. In fact, polarization has gone down in a number of countries with high social media use at the same time that it has risen in the US.

    Specifically, we expect the reporting to suggest that a change to Facebook’s News Feed ranking algorithm was responsible for elevating polarizing content on the platform. In January 2018, we made ranking changes to promote Meaningful Social Interactions (MSI) – so that you would see more content from friends, family and groups you are part of in your News Feed. This change was heavily driven by internal and external research that showed that meaningful engagement with friends and family on our platform was better for people’s wellbeing, and we further refined and improved it over time as we do with all ranking metrics.

    Of course, everyone has a rogue uncle or an old school classmate who holds strong or extreme views we disagree with – that’s life – and the change meant you are more likely to come across their posts too. Even so, we’ve developed industry-leading tools to remove hateful content and reduce the distribution of problematic content. As a result, the prevalence of hate speech on our platform is now down to about 0.05%.

    But the simple fact remains that changes to algorithmic ranking systems on one social media platform cannot explain wider societal polarization. Indeed, polarizing content and misinformation are also present on platforms that have no algorithmic ranking whatsoever, including private messaging apps like iMessage and WhatsApp.

    Elections and Democracy

    There’s perhaps no other topic that we’ve been more vocal about as a company than on our work to dramatically change the way we approach elections. Starting in 2017, we began building new defenses, bringing in new expertise, and strengthening our policies to prevent interference. Today, we have more than 40,000 people across the company working on safety and security.

    Since 2017, we have disrupted and removed more than 150 covert influence operations, including ahead of major democratic elections. In 2020 alone, we removed more than 5 billion fake accounts — identifying almost all of them before anyone flagged them to us. And, from March to Election Day, we removed more than 265,000 pieces of Facebook and Instagram content in the US for violating our voter interference policies.

    Given the extraordinary circumstances of holding a contentious election in a pandemic, we implemented so called “break glass” measures – and spoke publicly about them – before and after Election Day to respond to specific and unusual signals we were seeing on our platform and to keep potentially violating content from spreading before our content reviewers could assess it against our policies.

    These measures were not without trade-offs – they’re blunt instruments designed to deal with specific crisis scenarios. It’s like shutting down an entire town’s roads and highways in response to a temporary threat that may be lurking somewhere in a particular neighborhood. In implementing them, we know we impacted significant amounts of content that did not violate our rules to prioritize people’s safety during a period of extreme uncertainty. For example, we limited the distribution of live videos that our systems predicted may relate to the election. That was an extreme step that helped prevent potentially violating content from going viral, but it also impacted a lot of entirely normal and reasonable content, including some that had nothing to do with the election. We wouldn’t take this kind of crude, catch-all measure in normal circumstances, but these weren’t normal circumstances.

    We only rolled back these emergency measures – based on careful data-driven analysis – when we saw a return to more normal conditions. We left some of them on for a longer period of time through February this year and others, like not recommending civic, political or new Groups, we have decided to retain permanently.

    Fighting Hate Groups and other Dangerous Organizations

    I want to be absolutely clear: we work to limit, not expand hate speech, and we have clear policies prohibiting content that incites violence. We do not profit from polarization, in fact, just the opposite. We do not allow dangerous organizations, including militarized social movements or violence-inducing conspiracy networks, to organize on our platforms. And we remove content that praises or supports hate groups, terrorist organizations and criminal groups.

    We’ve been more aggressive than any other internet company in combating harmful content, including content that sought to delegitimize the election. But our work to crack down on these hate groups was years in the making. We took down tens of thousands of QAnon pages, groups and accounts from our apps, removed the original #StopTheSteal Group, and removed references to Stop the Steal in the run up to the inauguration. In 2020 alone, we removed more than 30 million pieces of content violating our policies regarding terrorism and more than 19 million pieces of content violating our policies around organized hate. We designated the Proud Boys as a hate organization in 2018 and we continue to remove praise, support, and representation of them. Between August last year and January 12 this year, we identified nearly 900 militia organizations under our Dangerous Organizations and Individuals policy and removed thousands of Pages, groups, events, Facebook profiles and Instagram accounts associated with these groups.

    This work will never be complete. There will always be new threats and new problems to address, in the US and around the world. That’s why we remain vigilant and alert – and will always have to.

    That is also why the suggestion that is sometimes made that the violent insurrection on January 6 would not have occurred if it was not for social media is so misleading. To be clear, the responsibility for those events rests squarely with the perpetrators of the violence, and those in politics and elsewhere who actively encouraged them. Mature democracies in which social media use is widespread hold elections all the time – for instance Germany’s election last week – without the disfiguring presence of violence. We actively share with Law Enforcement material that we can find on our services related to these traumatic events. But reducing the complex reasons for polarization in America – or the insurrection specifically – to a technological explanation is woefully simplistic.

    We will continue to face scrutiny – some of it fair and some of it unfair. We’ll continue to be asked difficult questions. And many people will continue to be skeptical of our motives. That’s what comes with being part of a company that has a significant impact in the world. We need to be humble enough to accept criticism when it is fair, and to make changes where they are justified. We aren’t perfect and we don’t have all the answers. That’s why we do the sort of research that has been the subject of these stories in the first place. And we’ll keep looking for ways to respond to the feedback we hear from our users, including testing ways to make sure political content doesn’t take over their News Feeds.

    But we should also continue to hold our heads up high. You and your teams do incredible work. Our tools and products have a hugely positive impact on the world and in people’s lives. And you have every reason to be proud of that work.

  • How They Failed: California Republicans, Media Critics and Facebook

    In a special Opinion Audio bonanza, Jane Coaston (The Argument), Ezra Klein (The Ezra Klein Show) and Kara Swisher (Sway) sit down to discuss what went wrong for the G.O.P. in the recall election of Gov. Gavin Newsom of California. “This was where the nationalization of politics really bit back for Republicans,” Jane says. The three hosts then debate whether the media industry’s criticism of itself does any good at all. “The media tweets like nobody’s watching,” Ezra says. Then the hosts turn to The Wall Street Journal’s revelations in “The Facebook Files” and discuss how to hold Facebook accountable. “We’re saying your tools in the hands of malevolent players are super dangerous,” Kara says, “but we have no power over them whatsoever.”

    And last, Ezra, Jane and Kara offer recommendations to take you deep into history, fantasy and psychotropics.

    [You can listen to this episode of “The Argument” on Apple, Spotify or Google or wherever you get your podcasts.]

    Read more about the subjects in this episode:

    • Jane Coaston, Vox: “How California conservatives became the intellectual engine of Trumpism”

    • Ezra Klein: “Gavin Newsom Is Much More Than the Lesser of Two Evils” and “A Different Way of Thinking About Cancel Culture”

    • Kara Swisher: “The Endless Facebook Apology,” “Don’t Get Bezosed,” “The Medium of the Moment,” “‘They’re Killing People’? Biden Isn’t Quite Right, but He’s Not Wrong.” and “The Terrible Cost of Mark Zuckerberg’s Naïveté”

    (A full transcript of the episode will be available midday on the Times website.)

    Photographs courtesy of The New York Times

    Thoughts? Email us at argument@nytimes.com or leave us a voice mail message at (347) 915-4324. We want to hear what you’re arguing about with your family, your friends and your frenemies. (We may use excerpts from your message in a future episode.) By leaving us a message, you are agreeing to be governed by our reader submission terms and agreeing that we may use and allow others to use your name, voice and message.

    This episode was produced by Phoebe Lett, Annie Galvin and Rogé Karma. It was edited by Stephanie Joyce, Alison Bruzek and Nayeema Raza. Engineering, music and sound design by Isaac Jones and Sonia Herrero. Fact-checking by Kate Sinclair, Michelle Harris and Kristin Lin. Audience strategy by Shannon Busta. Special thanks to Matt Kwong, Daphne Chen and Blakeney Schick.

  • The Alarming Rise of Peter Thiel, Tech Mogul and Political Provocateur

    THE CONTRARIAN
    Peter Thiel and Silicon Valley’s Pursuit of Power
    By Max Chafkin

    A few years ago, on a podcast called “This Is Actually Happening,” a penitent white supremacist recalled a formative childhood experience. One night his mother asked him: “You enjoying your burger?” She went on, “Did you know it’s made out of a cow?”

    “Something died?” the boy, then 5, replied.

    “Everything living dies,” she said. “You’re going to die.”

    Plagued thereafter by terror of death, the boy affected a fear-concealing swagger, which eventually became a fascist swagger.

    By chance, I’d just heard this episode when I opened “The Contrarian,” Max Chafkin’s sharp and disturbing biography of the Silicon Valley tech billionaire Peter Thiel, another far-right figure, though unrepentant.

    An epiphany from Thiel’s childhood sounded familiar. When he was 3, according to Chafkin, Thiel asked his father about a rug, which his father, Klaus Thiel, explained was cowhide. “Death happens to all animals. All people,” Klaus said. “It will happen to me one day. It will happen to you.”

    A near identical far-right coming-of-age tale — a Rechtsextremebildungsroman? The coincidence kicked off a wave of despair that crashed over me as I read Chafkin’s book. Where did these far-right Americans, powerful and not, ashamed and proud, come from? Why does a stock lecture about mortality lead some 3-to-5-year-old boys to develop contempt for the frailties in themselves — and in everyone else? Like the anonymous white supremacist, Thiel never recovered from bummer death news, and, according to Chafkin, still returns compulsively to “the brutal finality of the thing.” Thiel also turned to swaggering and, later, an evolving, sometimes contradictory, hodgepodge of libertarian and authoritarian beliefs.

    Thiel stalks through Chafkin’s biography “as if braced for a collision,” spoiling for a fight with whomever he designates a “liberal” — meaning anyone he suspects of snubbing him. Unsmiling, solipsistic and at pains to conceal his forever wounded vanity, Thiel in Chafkin’s telling comes across as singularly disagreeable, which is evidently the secret to both his worldly successes and his moral failures.

    Young Thiel had the usual dandruff-club hobbies: He played Dungeons & Dragons, read Tolkien and aced the SATs. He was arrogant, and set his worldview against those who mocked him for it. One of Thiel’s classmates at Stanford told Chafkin, “He viewed liberals through a lens as people who were not nice to him.” Looking back on Thiel’s anti-elitist and eventually illiberal politics, Chafkin is succinct: “He’d chosen to reject those who’d rejected him.”

    Chafkin serves as a tour guide to the ideological roadhouses where Thiel threw back shots of ultraconservative nostrums on his way to serve Donald Trump in 2016. There was his home life, where — first in Cleveland, then in South Africa and, finally, in suburban California — he ingested his German family’s complicity in apartheid (his father helped build a uranium mine in the Namib desert) and enthusiasm for Reagan; his requisite enlightenment via the novels of Ayn Rand; his excoriations of libs at Stanford, which (Chafkin reminds readers) still shows the influence of its eugenicist founding president, David Starr Jordan; and his depressing stint at a white-shoe corporate law firm, where he was disappointed to find “no liberals to fight.”

    These stages of the cross led Thiel to Silicon Valley in the mid-1990s, hot to leave big law and gamble on young Randian Übermenschen. An early bet on a coder named Max Levchin hit it big. The two devised PayPal, the company Thiel is famous for, which supercharged his antipathies with capital. Thiel, who’d published a book called “The Diversity Myth,” “made good on his aversion to multiculturalism,” Chafkin writes. “Besides youth, PayPal’s other defining quality was its white maleness.”

    In 2000, PayPal got in business with Elon Musk. “Peter thinks Musk is a fraud and a braggart,” one source tells Chafkin. “Musk thinks Peter is a sociopath.” According to Chafkin, Thiel remained coldblooded during the dot-com crash that year, as PayPal loopholed its way to market dominance. The company rebounded with a growth strategy known as “blitzscaling,” as well as the use of some supremely nasty tactics. “Whereas [Steve] Jobs viewed business as a form of cultural expression, even art,” Chafkin writes, “for Thiel and his peers it was a mode of transgression, even activism.”

    When PayPal went public, Thiel took out tens of millions and turned to investing full time. With various funds he scouted for more entrepreneurial twerps, and in the mid-2000s he latched onto Mark Zuckerberg of Facebook. He also set up a hedge fund called Clarium, where, according to Chafkin, Thiel’s staffers styled themselves as intellectuals and savored the wit of VDARE, an anti-immigration website that regularly published white nationalists. Hoping to make death less inevitable, at least for himself, Thiel also began to patronize the Alcor Life Extension Foundation, which has been steadily freezing the corpses of moneyed narcissists in liquid nitrogen since 1976.

    Thiel passed on investing in Tesla, telling Musk (according to Musk) that he didn’t “fully buy into the climate change thing.” But he gave Zuckerberg a loan for Facebook, which intermittently let him keep a leash on the young founder. After Sept. 11, Chafkin reports, Thiel also panicked about “the threat posed by Islamic terrorism — and Islam itself.” Libertarianism deserted him; he created Palantir, a data-analytics surveillance tech company designed, in essence, to root out terrorists. The C.I.A. used it, the N.Y.P.D. used it and Thiel became a contractor with big government. By 2006 his Clarium had $2 billion under management.

    Around this time, the wily Nick Denton, of the gossip empire Gawker, took notice of what Chafkin calls Thiel’s “extremist politics and ethically dubious business practices.” Gawker’s Valleywag site dragged Thiel, whose homosexuality was an open secret, suggesting he was repressed. This enraged Thiel, who by 2008 seemed to have lost it, firing off a floridly religious letter to Clarium investors warning of the imminent apocalypse and urging them to save their immortal souls and “accumulate treasures in heaven, in the eternal City of God.”

    The planet avoided the apocalypse, as it tends to do, but that year the financial crash laid the economy to waste. Several big investors pulled out of Thiel’s fund. In Chafkin’s telling, Thiel unaccountably blamed Denton for scaring away ultraconservatives by outing him. He determined to put Denton out of business, and in 2016, by clandestinely bankrolling a nuisance lawsuit designed to bankrupt Gawker, he did.

    Chafkin’s chronicle of Thiel’s wild abandon during the Obama years contains some of the most suspenseful passages in the book, as the narrative hurtles toward his acquisition of actual political power. Thiel seemed intoxicated by the rise of Obama, who galvanized the liberals Thiel most loved to hate. Chafkin recounts decadent parties at Thiel’s homes with barely clad men, along with his investments in nutjob projects, like seasteading, which promised life on floating ocean platforms free from government regulation. In a widely read essay, he argued that democracy and capitalism were at odds, because social programs and women’s suffrage curbed the absolute freedom of above-the-law capitalists like himself. He was officially antidemocracy.

    Thiel then began to direct money to nativist political candidates and causes, and to collaborate — via Palantir — with Lt. Gen. Michael Flynn, the strange right-wing figure who would later become a zealous Trumpite embraced by the QAnon cult. He built an army of mini-Thiels, the Thiel fellows, teenage boys (along with a few girls) whom he paid to quit college, forfeit normal social life and try to get rich in the Valley.

    Thiel backed Ron Paul for president in 2012, and helped Ted Cruz win a Texas Senate seat. (Gawker noted that Thiel’s support for the anti-gay Cruz was “no crazier than paying kids to drop out of school, cure death or create a floating libertarian ocean utopia.”) He contributed to Tea Party politicians with the aim of building a bigger “neo-reactionary” political movement, and in 2015, he gave his followers their own holy book when he published “Zero to One,” a compendium of antidemocracy, pro-monopoly blitzscaling tips.

    [Photo: Peter Thiel, speaking at the Republican National Convention in July 2016. Stephen Crowley/The New York Times]

    At the same time, by investing in Lyft, TaskRabbit and Airbnb with his Founders Fund, Thiel seemed to be on the right side of history. When he spoke before mainstream audiences, he sometimes softened his extreme views and even laughed off his more gonzo follies — seasteading, for one.

    Yet one friend described Thiel to Chafkin as “Nazi-curious” (though the friend later said he was just being glib), and during this period Thiel also became, Chafkin writes, closer to Curtis Yarvin, a noxious avatar of the alt-right who had ties to Steve Bannon. He turned to survivalist prepping, kitting out a giant estate in New Zealand, where he took citizenship, making it possible that at a moment’s notice he could slip the knot of what, Chafkin says, had become his ultimate nemesis: the U.S. government itself.

    In the mid-2010s, a Palantir rep was also meeting with Cambridge Analytica, the creepy English data-mining firm that was later recorded boasting about using twisted data shenanigans to all but give the 2016 presidential election to Donald Trump.

    Like just about every powerful figure who eventually went all in for Trump, Thiel was initially skeptical, according to Chafkin. But once Trump won the nomination Thiel decided he was a delightful disrupter and kindred spirit. High from crushing Gawker, Thiel spoke for Trump at the Republican National Convention, and poured money into Rebekah Mercer’s PAC to rescue the campaign as Trump revealed increasing madness on the stump. He also urged voters to take Trump “seriously, but not literally.” Simultaneously, at Thiel’s recommendation, Chafkin suggests, Zuckerberg continued to allow popular content, including potentially misleading far-right articles, to stay at the top of Facebook’s trending stories, where they could attract more clicks and spike more get-out-the-vote cortisol.

    Why did Thiel go to such lengths for Trump? Chafkin quotes an anonymous longtime investor in Thiel’s firms: “He wanted to watch Rome burn.” Trump won, which meant that Thiel’s money and his burn-it-down ideology also won.

    Chafkin recounts that some of Thiel’s friends found this concretization of his cosmology too much to bear, and turned on him. But most did what most Trump opponents did for four years: waited it out, tried to wish away the erosion of American democracy and turned to their affairs.

    For his part, Thiel embraced the role of kingmaker, and Palantir benefited handsomely from contracts the Trump administration sent its way. Thiel found another winning sponsee: Josh Hawley, then Missouri’s attorney general, with whom he fought Google, which threatened the stability of many Thiel-backed companies, and which Hawley saw as communist, or something.

    Chafkin, a writer and editor at Bloomberg Businessweek, is especially interested in the friction between Zuckerberg and Thiel, who drifted apart for a time as Thiel became more involved in conservative politics. The words spent on discord in this relationship — and on tension between Thiel and other tech titans — distract from the more urgent chronicle of Thiel’s rise as one of the pre-eminent authors of the contemporary far-right movement.

    “The Contrarian” is chilling — literally chilling. As I read it, I grew colder and colder, until I found myself curled up under a blanket on a sunny day, icy and anxious. Scared people are scary, and Chafkin’s masterly evocation of his subject’s galactic fear — of liberals, of the U.S. government, of death — turns Thiel himself into a threat. I tried to tell myself that Thiel is just another rapacious solipsist, in it for the money, but I used to tell myself that about another rapacious solipsist, and he became president.

    By way of conclusion, Chafkin reports that Thiel rode out much of the pandemic in Maui, losing faith in Trump. Evidently Thiel considers the devastating coronavirus both an economic opportunity for Palantir, which went public in 2020 and has benefited from Covid-related government contracts, and a vindication of his predictions that the world as we know it is finished.

  • Facebook Said to Consider Forming an Election Commission

    The social network has contacted academics to create a group to advise it on thorny election-related decisions, said people with knowledge of the matter.

    Facebook has approached academics and policy experts about forming a commission to advise it on global election-related matters, said five people with knowledge of the discussions, a move that would allow the social network to shift some of its political decision-making to an advisory body.

    The proposed commission could decide on matters such as the viability of political ads and what to do about election-related misinformation, said the people, who spoke on the condition of anonymity because the discussions were confidential. Facebook is expected to announce the commission this fall in preparation for the 2022 midterm elections, they said, though the effort is preliminary and could still fall apart.

    Outsourcing election matters to a panel of experts could help Facebook sidestep criticism of bias by political groups, two of the people said. The company has been blasted in recent years by conservatives, who have accused Facebook of suppressing their voices, as well as by civil rights groups and Democrats for allowing political misinformation to fester and spread online. Mark Zuckerberg, Facebook’s chief executive, does not want to be seen as the sole decision maker on political content, two of the people said.

    [Photo: Mark Zuckerberg, Facebook’s chief executive, testified remotely in April about social media’s role in extremism and misinformation. Via Reuters]

    Facebook declined to comment.

    If an election commission is formed, it would emulate the step Facebook took in 2018 when it created what it calls the Oversight Board, a collection of journalism, legal and policy experts who adjudicate whether the company was correct to remove certain posts from its platforms. Facebook has pushed some content decisions to the Oversight Board for review, allowing it to show that it does not make determinations on its own.

    Facebook, which has positioned the Oversight Board as independent, appointed the people on the panel and pays them through a trust.

    The Oversight Board’s highest-profile decision was reviewing Facebook’s suspension of former President Donald J. Trump after the Jan. 6 storming of the U.S. Capitol. At the time, Facebook opted to ban Mr. Trump’s account indefinitely, a penalty that the Oversight Board later deemed “not appropriate” because the time frame was not based on any of the company’s rules. The board asked Facebook to try again.

    In June, Facebook responded by saying that it would bar Mr. Trump from the platform for at least two years. The Oversight Board has separately weighed in on more than a dozen other content cases that it calls “highly emblematic” of broader themes that Facebook grapples with regularly, including whether certain Covid-related posts should remain up on the network and hate speech issues in Myanmar.

    A spokesman for the Oversight Board declined to comment.

    Facebook has had a spotty track record on election-related issues, going back to Russian manipulation of the platform’s advertising and posts in the 2016 presidential election.

    Lawmakers and political ad buyers also criticized Facebook for changing the rules around political ads before the 2020 presidential election. Last year, the company said it would bar the purchase of new political ads the week before the election, then later decided to temporarily ban all U.S. political advertising after the polls closed on Election Day, causing an uproar among candidates and ad-buying firms.

    The company has struggled with how to handle lies and hate speech around elections. During his last year in office, Mr. Trump used Facebook to suggest he would use state violence against protesters in Minneapolis ahead of the 2020 election, while casting doubt on the electoral process as votes were tallied in November. Facebook initially said that what political leaders posted was newsworthy and should not be touched, before later reversing course.

    The social network has also faced difficulties in elections elsewhere, including the proliferation of targeted disinformation across its WhatsApp messaging service during the Brazilian presidential election in 2018. In 2019, Facebook removed hundreds of misleading pages and accounts associated with political parties in India ahead of the country’s national elections.

    Facebook has tried various methods to stem the criticisms. It established a political ads library to increase transparency around buyers of those promotions. It also has set up war rooms to monitor elections for disinformation to prevent interference.

    There are several elections in the coming year in countries such as Hungary, Germany, Brazil and the Philippines where Facebook’s actions will be closely scrutinized. Voter fraud misinformation has already begun spreading ahead of German elections in September. In the Philippines, Facebook has removed networks of fake accounts that support President Rodrigo Duterte, who used the social network to gain power in 2016.

    “There is already this perception that Facebook, an American social media company, is going in and tilting elections of other countries through its platform,” said Nathaniel Persily, a law professor at Stanford University. “Whatever decisions Facebook makes have global implications.”

    Internal conversations around an election commission date back to at least a few months ago, said three people with knowledge of the matter. An election commission would differ from the Oversight Board in one key way, the people said. While the Oversight Board waits for Facebook to remove a post or an account and then reviews that action, the election commission would proactively provide guidance without the company having made an earlier call, they said.

    Tatenda Musapatike, who previously worked on elections at Facebook and now runs a nonprofit voter registration organization, said that many have lost faith in the company’s abilities to work with political campaigns. But the election commission proposal was “a good step,” she said, because “they’re doing something and they’re not saying we alone can handle it.”

  • Here's a Look Inside Facebook's Data Wars

    Executives at the social network have clashed over CrowdTangle, a Facebook-owned data tool that revealed users’ high engagement levels with right-wing media sources.

    One day in April, the people behind CrowdTangle, a data analytics tool owned by Facebook, learned that transparency had limits.

    Brandon Silverman, CrowdTangle’s co-founder and chief executive, assembled dozens of employees on a video call to tell them that they were being broken up. CrowdTangle, which had been running quasi-independently inside Facebook since being acquired in 2016, was being moved under the social network’s integrity team, the group trying to rid the platform of misinformation and hate speech. Some CrowdTangle employees were being reassigned to other divisions, and Mr. Silverman would no longer be managing the team day to day.

    The announcement, which left CrowdTangle’s employees in stunned silence, was the result of a yearlong battle among Facebook executives over data transparency, and how much the social network should reveal about its inner workings.

    On one side were executives, including Mr. Silverman and Brian Boland, a Facebook vice president in charge of partnerships strategy, who argued that Facebook should publicly share as much information as possible about what happens on its platform — good, bad or ugly.

    On the other side were executives, including the company’s chief marketing officer and vice president of analytics, Alex Schultz, who worried that Facebook was already giving away too much.

    They argued that journalists and researchers were using CrowdTangle, a kind of turbocharged search engine that allows users to analyze Facebook trends and measure post performance, to dig up information they considered unhelpful — showing, for example, that right-wing commentators like Ben Shapiro and Dan Bongino were getting much more engagement on their Facebook pages than mainstream news outlets.

    These executives argued that Facebook should selectively disclose its own data in the form of carefully curated reports, rather than handing outsiders the tools to discover it themselves.

    Team Selective Disclosure won, and CrowdTangle and its supporters lost.

    An internal battle over data transparency might seem low on the list of worthy Facebook investigations. And it’s a column I’ve hesitated to write for months, in part because I’m uncomfortably close to the action. (More on that in a minute.)

    But the CrowdTangle story is important, because it illustrates the way that Facebook’s obsession with managing its reputation often gets in the way of its attempts to clean up its platform. And it gets to the heart of one of the central tensions confronting Facebook in the post-Trump era. The company, blamed for everything from election interference to vaccine hesitancy, badly wants to rebuild trust with a skeptical public. But the more it shares about what happens on its platform, the more it risks exposing uncomfortable truths that could further damage its image.

    The question of what to do about CrowdTangle has vexed some of Facebook’s top executives for months, according to interviews with more than a dozen current and former Facebook employees, as well as internal emails and posts.

    These people, most of whom would speak only anonymously because they were not authorized to discuss internal conversations, said Facebook’s executives were more worried about fixing the perception that Facebook was amplifying harmful content than figuring out whether it actually was amplifying harmful content. Transparency, they said, ultimately took a back seat to image management.

    Facebook disputes this characterization. It says that the CrowdTangle reorganization was meant to integrate the service with its other transparency tools, not weaken it, and that top executives are still committed to increasing transparency.

    “CrowdTangle is part of a growing suite of transparency resources we’ve made available for people, including academics and journalists,” said Joe Osborne, a Facebook spokesman. “With CrowdTangle moving into our integrity team, we’re developing a more comprehensive strategy for how we build on some of these transparency efforts moving forward.”

    But the executives who pushed hardest for transparency appear to have been sidelined. Mr. Silverman, CrowdTangle’s co-founder and chief executive, has been taking time off and no longer has a clearly defined role at the company, several people with knowledge of the situation said. (Mr. Silverman declined to comment about his status.) And Mr. Boland, who spent 11 years at Facebook, left the company in November.

    “One of the main reasons that I left Facebook is that the most senior leadership in the company does not want to invest in understanding the impact of its core products,” Mr. Boland said, in his first interview since departing. “And it doesn’t want to make the data available for others to do the hard work and hold them accountable.”

    Mr. Boland, who oversaw CrowdTangle as well as other Facebook transparency efforts, said the tool fell out of favor with influential Facebook executives around the time of last year’s presidential election, when journalists and researchers used it to show that pro-Trump commentators were spreading misinformation and hyperpartisan commentary with stunning success.

    “People were enthusiastic about the transparency CrowdTangle provided until it became a problem and created press cycles Facebook didn’t like,” he said. “Then, the tone at the executive level changed.”

    [Photo: Brian Boland, a former vice president in charge of partnerships strategy and an advocate for more transparency, left Facebook in November. Christian Sorensen Hansen for The New York Times]

    The Twitter Account That Launched 1,000 Meetings

    Here’s where I, somewhat reluctantly, come in.

    I started using CrowdTangle a few years ago. I’d been looking for a way to see which news stories gained the most traction on Facebook, and CrowdTangle — a tool used mainly by audience teams at news publishers and marketers who want to track the performance of their posts — filled the bill. I figured out that through a kludgey workaround, I could use its search feature to rank Facebook link posts — that is, posts that include a link to a non-Facebook site — in order of the number of reactions, shares and comments they got. Link posts weren’t a perfect proxy for news, engagement wasn’t a perfect proxy for popularity and CrowdTangle’s data was limited in other ways, but it was the closest I’d come to finding a kind of cross-Facebook news leaderboard, so I ran with it.

    At first, Facebook was happy that I and other journalists were finding its tool useful. With only about 25,000 users, CrowdTangle is one of Facebook’s smallest products, but it has become a valuable resource for power users including global health organizations, election officials and digital marketers, and it has made Facebook look transparent compared with rival platforms like YouTube and TikTok, which don’t release nearly as much data.

    But the mood shifted last year when I started a Twitter account called @FacebooksTop10, on which I posted a daily leaderboard showing the sources of the most-engaged link posts by U.S. pages, based on CrowdTangle data.

    Last fall, the leaderboard was full of posts by Mr. Trump and pro-Trump media personalities. Since Mr. Trump was barred from Facebook in January, it has been dominated by a handful of right-wing polemicists like Mr. Shapiro, Mr. Bongino and Sean Hannity, with the occasional mainstream news article, cute animal story or K-pop fan blog sprinkled in.

    The account went semi-viral, racking up more than 35,000 followers. Thousands of people retweeted the lists, including conservatives who were happy to see pro-Trump pundits beating the mainstream media and liberals who shared them with jokes like “Look at all this conservative censorship!” (If you’ve been under a rock for the past two years, conservatives in the United States frequently complain that Facebook is censoring them.)

    The lists also attracted plenty of Facebook haters. Liberals shared them as evidence that the company was a swamp of toxicity that needed to be broken up; progressive advertisers bristled at the idea that their content was appearing next to pro-Trump propaganda. The account was even cited at a congressional hearing on tech and antitrust by Representative Jamie Raskin, Democrat of Maryland, who said it proved that “if Facebook is out there trying to suppress conservative speech, they’re doing a terrible job at it.”

    Inside Facebook, the account drove executives crazy. Some believed that the data was being misconstrued and worried that it was painting Facebook as a far-right echo chamber. Others worried that the lists might spook investors by suggesting that Facebook’s U.S. user base was getting older and more conservative. Every time a tweet went viral, I got grumpy calls from Facebook executives who were embarrassed by the disparity between what they thought Facebook was — a clean, well-lit public square where civility and tolerance reign — and the image they saw reflected in the Twitter lists.

    As the election approached last year, Facebook executives held meetings to figure out what to do, according to three people who attended them. They set out to determine whether the information on @FacebooksTop10 was accurate (it was), and discussed starting a competing Twitter account that would post more balanced lists based on Facebook’s internal data.

    They never did that, but several executives — including John Hegeman, the head of Facebook’s news feed — were dispatched to argue with me on Twitter. These executives argued that my Top 10 lists were misleading. They said CrowdTangle measured only “engagement,” while the true measure of Facebook popularity would be based on “reach,” or the number of people who actually see a given post. (With the exception of video views, reach data isn’t public, and only Facebook employees have access to it.)

    Last September, Mark Zuckerberg, Facebook’s chief executive, told Axios that while right-wing content garnered a lot of engagement, the idea that Facebook was a right-wing echo chamber was “just wrong.”

    “I think it’s important to differentiate that from, broadly, what people are seeing and reading and learning about on our service,” Mr. Zuckerberg said.

    But Mr. Boland, the former Facebook vice president, said that was a convenient deflection. He said that in internal discussions, Facebook executives were less concerned about the accuracy of the data than about the image of Facebook it presented.

    “It told a story they didn’t like,” he said of the Twitter account, “and frankly didn’t want to admit was true.”

    The Trouble With CrowdTangle

    Around the same time that Mr. Zuckerberg made his comments to Axios, the tensions came to a head. The Economist had just published an article claiming that Facebook “offers a distorted view of American news.”

    The article, which cited CrowdTangle data, showed that the most-engaged American news sites on Facebook were Fox News and Breitbart, and claimed that Facebook’s overall news ecosystem skewed right wing. John Pinette, Facebook’s vice president of global communications, emailed a link to the article to a group of executives with the subject line “The trouble with CrowdTangle.”

    “The Economist steps onto the Kevin Roose bandwagon,” Mr. Pinette wrote. (See? Told you it was uncomfortably close to home.)

    Nick Clegg, Facebook’s vice president of global affairs, replied, lamenting that “our own tools are helping journos to consolidate the wrong narrative.”

    Other executives chimed in, adding their worries that CrowdTangle data was being used to paint Facebook as a right-wing echo chamber.

    David Ginsberg, Facebook’s vice president of choice and competition, wrote that if Mr. Trump won re-election in November, “the media and our critics will quickly point to this ‘echo chamber’ as a prime driver of the outcome.”

    Fidji Simo, the head of the Facebook app at the time, agreed.

    “I really worry that this could be one of the worst narratives for us,” she wrote.

    Several executives proposed making reach data public on CrowdTangle, in hopes that reporters would cite that data instead of the engagement data they thought made Facebook look bad.

    But Mr. Silverman, CrowdTangle’s chief executive, replied in an email that the CrowdTangle team had already tested a feature to do that and found problems with it. One issue was that false and misleading news stories also rose to the top of those lists.

    “Reach leaderboard isn’t a total win from a comms point of view,” Mr. Silverman wrote.

    Mr. Schultz, Facebook’s chief marketing officer, had the dimmest view of CrowdTangle. He wrote that he thought “the only way to avoid stories like this” would be for Facebook to publish its own reports about the most popular content on its platform, rather than releasing data through CrowdTangle.

    “If we go down the route of just offering more self-service data you will get different, exciting, negative stories in my opinion,” he wrote.

    Mr. Osborne, the Facebook spokesman, said Mr. Schultz and the other executives were discussing how to correct misrepresentations of CrowdTangle data, not strategizing about killing off the tool.

    A few days after the election in November, Mr. Schultz wrote a post for the company blog, called “What Do People Actually See on Facebook in the U.S.?” He explained that if you ranked Facebook posts based on which got the most reach, rather than the most engagement — his preferred method of slicing the data — you’d end up with a more mainstream, less sharply partisan list of sources.

    “We believe this paints a more complete picture than the CrowdTangle data alone,” he wrote.

    That may be true, but there’s a problem with reach data: Most of it is inaccessible and can’t be vetted or fact-checked by outsiders. We simply have to trust that Facebook’s own, private data tells a story that’s very different from the data it shares with the public.

    Tweaking Variables

    Mr. Zuckerberg is right about one thing: Facebook is not a giant right-wing echo chamber.

    But it does contain a giant right-wing echo chamber — a kind of AM talk radio built into the heart of Facebook’s news ecosystem, with a hyper-engaged audience of loyal partisans who love liking, sharing and clicking on posts from right-wing pages, many of which have gotten good at serving up Facebook-optimized outrage bait at a consistent clip.

    CrowdTangle’s data made this echo chamber easier for outsiders to see and quantify. But it didn’t create it, or give it the tools it needed to grow — Facebook did — and blaming a data tool for these revelations makes no more sense than blaming a thermometer for bad weather.

    It’s worth noting that these transparency efforts are voluntary, and could disappear at any time. There are no regulations that require Facebook or any other social media companies to reveal what content performs well on their platforms, and American politicians appear to be more interested in fighting over claims of censorship than getting access to better data.

    It’s also worth noting that Facebook can turn down the outrage dials and show its users calmer, less divisive news any time it wants. (In fact, it briefly did so after the 2020 election, when it worried that election-related misinformation could spiral into mass violence.) And there is some evidence that it is at least considering more permanent changes.

    This year, Mr. Hegeman, the executive in charge of Facebook’s news feed, asked a team to figure out how tweaking certain variables in the core news feed ranking algorithm would change the resulting Top 10 lists, according to two people with knowledge of the project.

    The project, which some employees refer to as the “Top 10” project, is still underway, the people said, and it’s unclear whether its findings have been put in place. Mr. Osborne, the Facebook spokesman, said that the team looks at a variety of ranking changes, and that the experiment wasn’t driven by a desire to change the Top 10 lists.

    As for CrowdTangle, the tool is still available, and Facebook is not expected to cut off access to journalists and researchers in the short term, according to two people with knowledge of the company’s plans.

    Mr. Boland, however, said he wouldn’t be surprised if Facebook executives decided to kill off CrowdTangle entirely or starve it of resources, rather than dealing with the headaches its data creates.

    “Facebook would love full transparency if there was a guarantee of positive stories and outcomes,” Mr. Boland said. “But when transparency creates uncomfortable moments, their reaction is often to shut down the transparency.”

  • The Facebook Oversight Board's Verdict on the Trump Ban

    In the end, they passed the buck.A year ago, Facebook introduced an oversight board that it said would help it answer difficult moderation questions — that is, who is allowed to use the social media site to amplify his voice and who is not.Yet when presented with its most consequential issue — whether to uphold the site’s indefinite suspension of Donald Trump — the board on Wednesday said Facebook should make the ultimate decision.The whole farce highlights the fatuousness of having a quasi-court assist a multinational corporation in making business decisions. Its members may be deliberative, earnest and thoughtful, but the oversight board cannot compel Facebook to make underlying policy changes nor set meaningful precedent about moderation. Its remit is only to decide whether specific posts should remain on the site or be removed.Helle Thorning-Schmidt, an oversight board co-chair and former prime minister of Demark, sought to bolster the body’s importance. “Anyone who is concerned about Facebook’s excessive concentration of power should welcome the oversight board clearly telling Facebook that they cannot invent new unwritten rules when it suits them,” she said in a call with media outlets.Michael McConnell, another co-chair and a Stanford Law School professor, said Facebook was “open to the suggestions of the board” in an interview. “The immediate holding of our decision is binding and I do think that they are going to set precedent.” He added, “The analogy to the Supreme Court is not bad.”But Facebook is no public entity and the board’s policy rulings have no legal standing beyond co-opting the language of the legal system. The company, meaning its chief executive, Mark Zuckerberg, will act in its best interests as a business.(Twitter, Mr. Trump’s favored platform, shut down his account two days after the Capitol riot on Jan. 6 and has announced no plans to restore it, nor has the company farmed out the decision to a third party.)Declining to amplify Mr. Trump’s lies on Facebook as the country was reeling from the Capitol attack was a good business decision for Facebook at the time, but restoring his account, with its some 35 million followers, may also eventually be a good business decision.The board, made up of 20 handpicked scholars, lawyers, politicians and other heavyweights, said Donald Trump’s use of Facebook to spur on the Jan. 6 attack on the Capitol was worthy of an account ban, but that Facebook needed to clarify the duration. The board said that Facebook must decide within six months on a lifetime ban or one of a specific duration.The issue could drag on, however. The board said it could very well have to rule again on Mr. Trump’s status after Facebook makes its decision.Beyond the specifics of Mr. Trump’s use of Facebook and Instagram, the oversight board requested the social media company better explain how its rules apply to public figures and more clearly enumerate its strikes and penalties processes, which can appear opaque, particularly when users are suspended or barred with little warning.Facebook allows an exemption for politicians to lie or break other of its rules in what the company says is the interest of newsworthiness. This is the opposite of how it should be: Politicians are more likely to be believed than regular folks, who are held to a higher standard on the site.Mr. Trump repeatedly violated Facebook’s community standards, including by threatening other world leaders and pushing conspiracy theories about his enemies. 
    Nearly a quarter of his roughly 6,000 posts last year featured extremist rhetoric or misinformation about the election, his critics or the coronavirus.

    And he made it clear on Monday, as the oversight board's public relations team began publicizing the imminent decision, that his time out of office has not chastened him. Regarding the decisive and fairly run November election, Mr. Trump wrote: "The Fraudulent Presidential Election of 2020 will be, from this day forth, known as THE BIG LIE!"

    Ms. Thorning-Schmidt chastised Facebook for what she said were arbitrary rule-making procedures. "The oversight board is clearly telling Facebook that they can't just invent new, unwritten rules when it suits them and for special uses," she said. "They have to have a transparent way of doing this."

    But therein lies the unresolvable contradiction. Facebook's rules, and its oversight board, are constructs of a private entity whose only real accountability is to its founder and chief executive. The board is good government theater. Until Facebook gives the board a much stronger mandate, it will remain just that.


    Trump Ban From Facebook Upheld by Oversight Board

    A company-appointed panel ruled that the ban was justified at the time but added that the company should reassess its action and make a final decision in six months.

    SAN FRANCISCO — A Facebook-appointed panel of journalists, activists and lawyers on Wednesday upheld the social network's ban of former President Donald J. Trump, ending any immediate return by Mr. Trump to mainstream social media and renewing a debate about tech power over online speech.

    Facebook's Oversight Board, which acts as a quasi-court over the company's content decisions, said the social network was right to bar Mr. Trump after he used the site to foment an insurrection in Washington in January. The panel said the ongoing risk of violence "justified" the move.

    But the board also said that an indefinite suspension was "not appropriate," and that the company should apply a "defined penalty." The board gave Facebook six months to make its final decision on Mr. Trump's account status.

    "Our sole job is to hold this extremely powerful organization, Facebook, accountable," Michael McConnell, a co-chair of the Oversight Board, said on a call with reporters. The indefinite ban on Mr. Trump "did not meet these standards," he said.

    The decision complicates any return by Mr. Trump to mainstream social media, which he had used during his White House years to cajole, set policy, criticize opponents and rile up his tens of millions of followers. Twitter and YouTube had also cut off Mr. Trump in January after the insurrection at the Capitol building, saying the risk of harm and the potential for violence that he created were too great.

    But while Mr. Trump's Facebook account remains suspended for now, he may be able to return to the social network once the company reviews its action. Mr. Trump still holds tremendous sway over Republicans, with his false claims of a stolen election continuing to reverberate. On Wednesday, House Republican leaders moved to expel Representative Liz Cheney of Wyoming from her leadership post for criticizing Mr. Trump and his election lies.

    Representatives for Mr. Trump did not immediately return requests for comment. On Tuesday, he unveiled a new site, "From the desk of Donald J. Trump," with a Twitter-like feed, to communicate with his supporters.

    Mr. Trump's continued Facebook suspension gave Republicans, who have long accused social media companies of suppressing conservative voices, new fuel against the platforms. Mark Zuckerberg, Facebook's chief executive, has testified in Congress several times in recent years about whether the social network has shown bias against conservative political views. He has denied it.

    Senator Marsha Blackburn, Republican of Tennessee, said the Facebook board's decision was "extremely disappointing" and that it was "clear that Mark Zuckerberg views himself as the arbiter of free speech." And Representative Jim Jordan, Republican of Ohio, said Facebook, which faces antitrust scrutiny, should be broken up.

    Democrats were also unhappy. Frank Pallone, the chairman of the House Energy and Commerce Committee, tweeted, "Donald Trump has played a big role in helping Facebook spread disinformation, but whether he's on the platform or not, Facebook and other social media platforms with the same business model will find ways to highlight divisive content to drive advertising revenues."

    The decision underlined the power of tech companies in determining who gets to say what online. While Mr. Zuckerberg has said that he does not wish his company to be "the arbiter of truth" in social discourse, Facebook has become increasingly active about the kinds of content it allows. To prevent the spread of misinformation, the company has cracked down on QAnon conspiracy theory groups, election falsehoods and anti-vaccination content in recent months, culminating in the blocking of Mr. Trump in January.

    "This case has dramatic implications for the future of speech online because the public and other platforms are looking at how the oversight board will handle what is a difficult controversy that will arise again around the world," said Nate Persily, a professor at Stanford University's law school. He added, "President Trump has pushed the envelope about what is permissible speech on these platforms and he has set the outer limits such that if you are unwilling to go after him, you are allowing a large amount of incitement and hate speech and disinformation online that others are going to propagate."

    In a statement, Facebook said it was "pleased" that the board recognized that its barring of Mr. Trump in January was justified. The company added that it would consider the ruling and "determine an action that is clear and proportionate."

    Mr. Trump's case is the most prominent that the Facebook Oversight Board, which was conceived in 2018, has handled. The board, made up of 20 journalists, activists and former politicians, reviews and adjudicates the company's most contested content moderation decisions. Mr. Zuckerberg has repeatedly referred to it as the "Facebook Supreme Court."

    But while the panel is positioned as independent, it was founded and funded by Facebook and has no legal or enforcement authority. Critics have been skeptical of the board's autonomy and have said it gives Facebook the ability to punt on difficult decisions.

    Each of its cases is decided by a five-person panel selected from among the board's 20 members, one of whom must be from the country in which the case originated. The panel reviews comments on the case and makes recommendations to the full board, which decides through a majority vote. After a ruling, Facebook has seven days to act on the board's decision.

    Mark Zuckerberg, the Facebook chief executive, testified before the Senate Judiciary Committee last year. He has denied that the platform has shown political bias. (Pool photo by Hannah McKay/EPA, via Shutterstock)

    Since the board began issuing rulings in January, it has overturned Facebook's decisions in four of the five cases it has reviewed. In one case, the board asked Facebook to restore a post that used Joseph Goebbels, the Nazi propaganda chief, to make a point about the Trump presidency. Facebook had earlier removed the post because it "promoted dangerous individuals," but complied with the board's decision.

    In another case, the board ruled that Facebook had overreached by taking down a French user's post that erroneously suggested the drug hydroxychloroquine could be used to cure Covid-19. Facebook restored the post but said it would keep removing false information about the drug, following guidance from the Centers for Disease Control and Prevention and the World Health Organization.

    In Mr. Trump's case, Facebook also asked the board to make recommendations on how to handle the accounts of political leaders. On Wednesday, the board suggested that the company publicly explain when it applies special rules to influential figures, though it should impose definite time limits when doing so.
    The board also said Facebook should more clearly explain its strikes and penalties process, and should develop and publish a policy governing its responses to crises or novel situations where its regular processes would not prevent imminent harm.

    "Facebook has been clearly abused by influential users," said Helle Thorning-Schmidt, a co-chair of the Oversight Board.

    Facebook does not have to adopt these recommendations but said it "will carefully review" them.

    For Mr. Trump, Facebook was long a place to rally his digital base and support other Republicans. More than 32 million people followed him on Facebook, though that was far fewer than the more than 88 million followers he had on Twitter.

    Over the years, Mr. Trump and Mr. Zuckerberg shared a testy relationship. Mr. Trump regularly assailed Silicon Valley executives for what he perceived to be their suppression of conservative speech. He also threatened to revoke Section 230, a legal shield that protects companies like Facebook from liability for what users post.

    Mr. Zuckerberg occasionally criticized some of Mr. Trump's policies, including his handling of the pandemic and immigration. But as calls from lawmakers, civil rights leaders and even Facebook's own employees grew to rein in Mr. Trump on social media, Mr. Zuckerberg declined to act. He said speech by political leaders — even if they spread lies — was newsworthy and in the public interest.

    The two men also appeared cordial during occasional meetings in Washington. Mr. Zuckerberg visited the White House more than once, dining privately with Mr. Trump.

    The politeness ended on Jan. 6. Hours before his supporters stormed the Capitol, Mr. Trump used Facebook and other social media to try to cast doubt on the results of the presidential election, which he had lost to Joseph R. Biden Jr. Mr. Trump wrote on Facebook, "Our Country has had enough, they won't take it anymore!"

    Less than 24 hours later, Mr. Trump was barred from the platform indefinitely. While his Facebook page has remained up, it has been dormant. His last Facebook post, on Jan. 6, read, "I am asking for everyone at the U.S. Capitol to remain peaceful. No violence!"

    Cecilia Kang