More stories

  • Here's a Look Inside Facebook's Data Wars

    Executives at the social network have clashed over CrowdTangle, a Facebook-owned data tool that revealed users’ high engagement levels with right-wing media sources.

    One day in April, the people behind CrowdTangle, a data analytics tool owned by Facebook, learned that transparency had limits.

    Brandon Silverman, CrowdTangle’s co-founder and chief executive, assembled dozens of employees on a video call to tell them that they were being broken up. CrowdTangle, which had been running quasi-independently inside Facebook since being acquired in 2016, was being moved under the social network’s integrity team, the group trying to rid the platform of misinformation and hate speech. Some CrowdTangle employees were being reassigned to other divisions, and Mr. Silverman would no longer be managing the team day to day.

    The announcement, which left CrowdTangle’s employees in stunned silence, was the result of a yearlong battle among Facebook executives over data transparency, and how much the social network should reveal about its inner workings.

    On one side were executives, including Mr. Silverman and Brian Boland, a Facebook vice president in charge of partnerships strategy, who argued that Facebook should publicly share as much information as possible about what happens on its platform — good, bad or ugly.

    On the other side were executives, including the company’s chief marketing officer and vice president of analytics, Alex Schultz, who worried that Facebook was already giving away too much. They argued that journalists and researchers were using CrowdTangle, a kind of turbocharged search engine that allows users to analyze Facebook trends and measure post performance, to dig up information they considered unhelpful — showing, for example, that right-wing commentators like Ben Shapiro and Dan Bongino were getting much more engagement on their Facebook pages than mainstream news outlets. These executives argued that Facebook should selectively disclose its own data in the form of carefully curated reports, rather than handing outsiders the tools to discover it themselves.

    Team Selective Disclosure won, and CrowdTangle and its supporters lost.

    An internal battle over data transparency might seem low on the list of worthy Facebook investigations. And it’s a column I’ve hesitated to write for months, in part because I’m uncomfortably close to the action. (More on that in a minute.)

    But the CrowdTangle story is important, because it illustrates the way that Facebook’s obsession with managing its reputation often gets in the way of its attempts to clean up its platform. And it gets to the heart of one of the central tensions confronting Facebook in the post-Trump era. The company, blamed for everything from election interference to vaccine hesitancy, badly wants to rebuild trust with a skeptical public. But the more it shares about what happens on its platform, the more it risks exposing uncomfortable truths that could further damage its image.
    The question of what to do about CrowdTangle has vexed some of Facebook’s top executives for months, according to interviews with more than a dozen current and former Facebook employees, as well as internal emails and posts.

    These people, most of whom would speak only anonymously because they were not authorized to discuss internal conversations, said Facebook’s executives were more worried about fixing the perception that Facebook was amplifying harmful content than figuring out whether it actually was amplifying harmful content. Transparency, they said, ultimately took a back seat to image management.

    Facebook disputes this characterization. It says that the CrowdTangle reorganization was meant to integrate the service with its other transparency tools, not weaken it, and that top executives are still committed to increasing transparency.

    “CrowdTangle is part of a growing suite of transparency resources we’ve made available for people, including academics and journalists,” said Joe Osborne, a Facebook spokesman. “With CrowdTangle moving into our integrity team, we’re developing a more comprehensive strategy for how we build on some of these transparency efforts moving forward.”

    But the executives who pushed hardest for transparency appear to have been sidelined. Mr. Silverman, CrowdTangle’s co-founder and chief executive, has been taking time off and no longer has a clearly defined role at the company, several people with knowledge of the situation said. (Mr. Silverman declined to comment about his status.) And Mr. Boland, who spent 11 years at Facebook, left the company in November.

    “One of the main reasons that I left Facebook is that the most senior leadership in the company does not want to invest in understanding the impact of its core products,” Mr. Boland said, in his first interview since departing. “And it doesn’t want to make the data available for others to do the hard work and hold them accountable.”

    Mr. Boland, who oversaw CrowdTangle as well as other Facebook transparency efforts, said the tool fell out of favor with influential Facebook executives around the time of last year’s presidential election, when journalists and researchers used it to show that pro-Trump commentators were spreading misinformation and hyperpartisan commentary with stunning success.

    “People were enthusiastic about the transparency CrowdTangle provided until it became a problem and created press cycles Facebook didn’t like,” he said. “Then, the tone at the executive level changed.”

    [Photo: Brian Boland, a former vice president in charge of partnerships strategy and an advocate for more transparency, left Facebook in November. Christian Sorensen Hansen for The New York Times]

    The Twitter Account That Launched 1,000 Meetings

    Here’s where I, somewhat reluctantly, come in.

    I started using CrowdTangle a few years ago. I’d been looking for a way to see which news stories gained the most traction on Facebook, and CrowdTangle — a tool used mainly by audience teams at news publishers and marketers who want to track the performance of their posts — filled the bill. I figured out that through a kludgey workaround, I could use its search feature to rank Facebook link posts — that is, posts that include a link to a non-Facebook site — in order of the number of reactions, shares and comments they got. Link posts weren’t a perfect proxy for news, engagement wasn’t a perfect proxy for popularity and CrowdTangle’s data was limited in other ways, but it was the closest I’d come to finding a kind of cross-Facebook news leaderboard, so I ran with it.

    At first, Facebook was happy that I and other journalists were finding its tool useful.
    With only about 25,000 users, CrowdTangle is one of Facebook’s smallest products, but it has become a valuable resource for power users including global health organizations, election officials and digital marketers, and it has made Facebook look transparent compared with rival platforms like YouTube and TikTok, which don’t release nearly as much data.

    But the mood shifted last year when I started a Twitter account called @FacebooksTop10, on which I posted a daily leaderboard showing the sources of the most-engaged link posts by U.S. pages, based on CrowdTangle data.

    Last fall, the leaderboard was full of posts by Mr. Trump and pro-Trump media personalities. Since Mr. Trump was barred from Facebook in January, it has been dominated by a handful of right-wing polemicists like Mr. Shapiro, Mr. Bongino and Sean Hannity, with the occasional mainstream news article, cute animal story or K-pop fan blog sprinkled in.

    The account went semi-viral, racking up more than 35,000 followers. Thousands of people retweeted the lists, including conservatives who were happy to see pro-Trump pundits beating the mainstream media and liberals who shared them with jokes like “Look at all this conservative censorship!” (If you’ve been under a rock for the past two years, conservatives in the United States frequently complain that Facebook is censoring them.)

    The lists also attracted plenty of Facebook haters. Liberals shared them as evidence that the company was a swamp of toxicity that needed to be broken up; progressive advertisers bristled at the idea that their content was appearing next to pro-Trump propaganda. The account was even cited at a congressional hearing on tech and antitrust by Representative Jamie Raskin, Democrat of Maryland, who said it proved that “if Facebook is out there trying to suppress conservative speech, they’re doing a terrible job at it.”

    Inside Facebook, the account drove executives crazy.
    Some believed that the data was being misconstrued and worried that it was painting Facebook as a far-right echo chamber. Others worried that the lists might spook investors by suggesting that Facebook’s U.S. user base was getting older and more conservative. Every time a tweet went viral, I got grumpy calls from Facebook executives who were embarrassed by the disparity between what they thought Facebook was — a clean, well-lit public square where civility and tolerance reign — and the image they saw reflected in the Twitter lists.

    As the election approached last year, Facebook executives held meetings to figure out what to do, according to three people who attended them. They set out to determine whether the information on @FacebooksTop10 was accurate (it was), and discussed starting a competing Twitter account that would post more balanced lists based on Facebook’s internal data.

    They never did that, but several executives — including John Hegeman, the head of Facebook’s news feed — were dispatched to argue with me on Twitter. These executives argued that my Top 10 lists were misleading. They said CrowdTangle measured only “engagement,” while the true measure of Facebook popularity would be based on “reach,” or the number of people who actually see a given post. (With the exception of video views, reach data isn’t public, and only Facebook employees have access to it.)

    Last September, Mark Zuckerberg, Facebook’s chief executive, told Axios that while right-wing content garnered a lot of engagement, the idea that Facebook was a right-wing echo chamber was “just wrong.”

    “I think it’s important to differentiate that from, broadly, what people are seeing and reading and learning about on our service,” Mr. Zuckerberg said.

    But Mr. Boland, the former Facebook vice president, said that was a convenient deflection.
    He said that in internal discussions, Facebook executives were less concerned about the accuracy of the data than about the image of Facebook it presented. “It told a story they didn’t like,” he said of the Twitter account, “and frankly didn’t want to admit was true.”

    The Trouble With CrowdTangle

    Around the same time that Mr. Zuckerberg made his comments to Axios, the tensions came to a head. The Economist had just published an article claiming that Facebook “offers a distorted view of American news.”

    The article, which cited CrowdTangle data, showed that the most-engaged American news sites on Facebook were Fox News and Breitbart, and claimed that Facebook’s overall news ecosystem skewed right wing. John Pinette, Facebook’s vice president of global communications, emailed a link to the article to a group of executives with the subject line “The trouble with CrowdTangle.”

    “The Economist steps onto the Kevin Roose bandwagon,” Mr. Pinette wrote. (See? Told you it was uncomfortably close to home.)

    Nick Clegg, Facebook’s vice president of global affairs, replied, lamenting that “our own tools are helping journos to consolidate the wrong narrative.” Other executives chimed in, adding their worries that CrowdTangle data was being used to paint Facebook as a right-wing echo chamber.

    David Ginsberg, Facebook’s vice president of choice and competition, wrote that if Mr. Trump won re-election in November, “the media and our critics will quickly point to this ‘echo chamber’ as a prime driver of the outcome.”

    Fidji Simo, the head of the Facebook app at the time, agreed. “I really worry that this could be one of the worst narratives for us,” she wrote.

    Several executives proposed making reach data public on CrowdTangle, in hopes that reporters would cite that data instead of the engagement data they thought made Facebook look bad.

    But Mr. Silverman, CrowdTangle’s chief executive, replied in an email that the CrowdTangle team had already tested a feature to do that and found problems with it. One issue was that false and misleading news stories also rose to the top of those lists. “Reach leaderboard isn’t a total win from a comms point of view,” Mr. Silverman wrote.

    Mr. Schultz, Facebook’s chief marketing officer, had the dimmest view of CrowdTangle. He wrote that he thought “the only way to avoid stories like this” would be for Facebook to publish its own reports about the most popular content on its platform, rather than releasing data through CrowdTangle. “If we go down the route of just offering more self-service data you will get different, exciting, negative stories in my opinion,” he wrote.

    Mr. Osborne, the Facebook spokesman, said Mr. Schultz and the other executives were discussing how to correct misrepresentations of CrowdTangle data, not strategizing about killing off the tool.

    A few days after the election in November, Mr. Schultz wrote a post for the company blog, called “What Do People Actually See on Facebook in the U.S.?” He explained that if you ranked Facebook posts based on which got the most reach, rather than the most engagement — his preferred method of slicing the data — you’d end up with a more mainstream, less sharply partisan list of sources. “We believe this paints a more complete picture than the CrowdTangle data alone,” he wrote.

    That may be true, but there’s a problem with reach data: Most of it is inaccessible and can’t be vetted or fact-checked by outsiders. We simply have to trust that Facebook’s own, private data tells a story that’s very different from the data it shares with the public.

    Tweaking Variables

    Mr. Zuckerberg is right about one thing: Facebook is not a giant right-wing echo chamber. But it does contain a giant right-wing echo chamber — a kind of AM talk radio built into the heart of Facebook’s news ecosystem, with a hyper-engaged audience of loyal partisans who love liking, sharing and clicking on posts from right-wing pages, many of which have gotten good at serving up Facebook-optimized outrage bait at a consistent clip.

    CrowdTangle’s data made this echo chamber easier for outsiders to see and quantify. But it didn’t create it, or give it the tools it needed to grow — Facebook did — and blaming a data tool for these revelations makes no more sense than blaming a thermometer for bad weather.

    It’s worth noting that these transparency efforts are voluntary, and could disappear at any time. There are no regulations that require Facebook or any other social media companies to reveal what content performs well on their platforms, and American politicians appear to be more interested in fighting over claims of censorship than getting access to better data.

    It’s also worth noting that Facebook can turn down the outrage dials and show its users calmer, less divisive news any time it wants. (In fact, it briefly did so after the 2020 election, when it worried that election-related misinformation could spiral into mass violence.) And there is some evidence that it is at least considering more permanent changes.

    This year, Mr. Hegeman, the executive in charge of Facebook’s news feed, asked a team to figure out how tweaking certain variables in the core news feed ranking algorithm would change the resulting Top 10 lists, according to two people with knowledge of the project. The project, which some employees refer to as the “Top 10” project, is still underway, the people said, and it’s unclear whether its findings have been put in place. Mr. Osborne, the Facebook spokesman, said that the team looks at a variety of ranking changes, and that the experiment wasn’t driven by a desire to change the Top 10 lists.

    As for CrowdTangle, the tool is still available, and Facebook is not expected to cut off access to journalists and researchers in the short term, according to two people with knowledge of the company’s plans. Mr. Boland, however, said he wouldn’t be surprised if Facebook executives decided to kill off CrowdTangle entirely or starve it of resources, rather than dealing with the headaches its data creates.

    “Facebook would love full transparency if there was a guarantee of positive stories and outcomes,” Mr. Boland said. “But when transparency creates uncomfortable moments, their reaction is often to shut down the transparency.”

  • Facebook Ban Hits Trump Where It Hurts: Messaging and Money

    Facebook has increasingly become one of the most vital weapons in a political campaign’s arsenal, and few had tapped into its potential for advertising and fund-raising as aggressively as Mr. Trump’s.

    The decision by Facebook on Wednesday to keep former President Donald J. Trump off its platform could have significant consequences for his political operation as he tries to remain the leader of the Republican Party, thwarting his ability to amplify his message to tens of millions of followers and hampering his fund-raising ability.

    Facebook has increasingly become one of the most vital weapons in a political campaign’s arsenal, with its ability to juice small-dollar online fund-raising numbers into the millions, expand and acquire contact information, help build out data on a campaign’s voter file and provide the most sophisticated advertising platform available.

    Few campaigns had tapped into Facebook’s potential for advertising and fund-raising as aggressively as Mr. Trump’s. His successful 2016 campaign said its prolific use of Facebook had allowed it to send millions of different, hyper-targeted political ads to small slices of the population.

    “Facebook was the method,” Brad Parscale, the Trump campaign manager in 2020 and digital director in 2016, told “60 Minutes” in 2017. “It was the highway which his car drove on.”

    That continued in 2020, as his re-election operation devoted a nine-figure budget to Facebook advertising. And much as he did with his Twitter account, Mr. Trump often turned to Facebook’s advertising platform in times of political crisis. During Mr. Trump’s first impeachment inquiry, which began in September 2019, his campaign flooded Facebook with ads criticizing the impeachment as a hoax and a subversive effort by far-left Democrats.

    Though Mr. Trump is out of office and living at his resort in Florida, he retains broad influence over the Republican Party.
    But his platform for reaching Americans has diminished greatly without access to big social media sites like Facebook and Twitter, which has permanently suspended the former president. Some Trump aides think that the absence of Facebook, which was crucial to his success in 2016, will hinder him if he decides to run again in 2024, which he has told several advisers is his plan.

    Facebook’s ruling was delivered by an oversight board, which also said the company’s indefinite suspension was “not appropriate” and gave Facebook six months to come up with a final decision on whether Mr. Trump would regain access.

    His Facebook ads proved a useful tool to draw out big crowds to his signature rallies. Days before the president was scheduled to arrive in a given city, Facebook users around the region would begin seeing ads about the rally, with a link to sign up for a free ticket.

    The decision by Facebook does not immediately hamper Mr. Trump’s fund-raising ability — he still maintains control of a large number of supporter email addresses and phone numbers. But fund-raising lists must be continually refreshed, and Facebook has proved a crucial place for Mr. Trump to do so.

    “He has the best fund-raising list, but that decays over time if you’re not adding back into it,” said Eric Wilson, a Republican digital strategist. “So because they don’t have the ability to run ads on Facebook, they’re losing out on petitions to grow their email list, surveys, things like that — the tactics that every campaign has to be doing 365 to really maintain their fund-raising.”

    Throughout 2020, the Trump campaign would run ads asking users to “take this SOCIALISM poll” or “Wish Melania a Happy Birthday,” which helped keep its lists current while occasionally adding new names or drawing a direct donation from the ad.

    In recent days, Mr. Trump’s operation has begun to more aggressively solicit supporters for cash via text message — including one message reacting to the Facebook decision on Wednesday. On Tuesday, Mr. Trump’s team announced he would begin posting his thoughts on political developments to his own website, trying to brand it as “From the Desk of Donald J. Trump.” But the power of Mr. Trump’s pronouncements on social media had been their ability to ricochet quickly across the web and into the streams of his supporters — something far harder to achieve while deplatformed.

    Even without Facebook, some Republican strategists note, Mr. Trump still has one of the largest megaphones in the world, simply because of the public interest in his plans, which might lessen the impact of Facebook’s ban.

    “I compare it to somebody who has a sprained ankle,” said Tim Cameron, a Republican digital strategist. “It’s kind of hobbling for a little bit, and he’s not going to be at the strength that he would be with the ability to reach people on Facebook and other social platforms, but it’s certainly not something that’s going to stop him.”

    Even with the Facebook spigot turned off since January, Mr. Trump began the spring with more than $85 million in his various political committees, according to an adviser, after banking tens of millions of dollars that he raised after the election.

    But perhaps most immediately, the ban against running any political ads hampers one of Mr. Trump’s most prized current roles: Republican primary kingmaker.

    “He’s really committed to settling scores and making sure his allies get boosted,” Mr. Wilson said. “They won’t have access to Facebook to help the candidates he wants to support in the primaries in 2022.”

  • The Facebook Oversight Board's Verdict on the Trump Ban

    In the end, they passed the buck.

    A year ago, Facebook introduced an oversight board that it said would help it answer difficult moderation questions — that is, who is allowed to use the social media site to amplify his voice and who is not. Yet when presented with its most consequential issue — whether to uphold the site’s indefinite suspension of Donald Trump — the board on Wednesday said Facebook should make the ultimate decision.

    The whole farce highlights the fatuousness of having a quasi-court assist a multinational corporation in making business decisions. Its members may be deliberative, earnest and thoughtful, but the oversight board can neither compel Facebook to make underlying policy changes nor set meaningful precedent about moderation. Its remit is only to decide whether specific posts should remain on the site or be removed.

    Helle Thorning-Schmidt, an oversight board co-chair and former prime minister of Denmark, sought to bolster the body’s importance. “Anyone who is concerned about Facebook’s excessive concentration of power should welcome the oversight board clearly telling Facebook that they cannot invent new unwritten rules when it suits them,” she said in a call with media outlets.

    Michael McConnell, another co-chair and a Stanford Law School professor, said in an interview that Facebook was “open to the suggestions of the board.” “The immediate holding of our decision is binding and I do think that they are going to set precedent,” he said, adding, “The analogy to the Supreme Court is not bad.”

    But Facebook is no public entity, and the board’s policy rulings have no legal standing beyond co-opting the language of the legal system. The company, meaning its chief executive, Mark Zuckerberg, will act in its best interests as a business.

    (Twitter, Mr. Trump’s favored platform, shut down his account two days after the Capitol riot on Jan. 6 and has announced no plans to restore it, nor has the company farmed out the decision to a third party.)

    Declining to amplify Mr. Trump’s lies on Facebook as the country was reeling from the Capitol attack was a good business decision for Facebook at the time, but restoring his account, with its some 35 million followers, may also eventually be a good business decision.

    The board, made up of 20 handpicked scholars, lawyers, politicians and other heavyweights, said Donald Trump’s use of Facebook to spur on the Jan. 6 attack on the Capitol was worthy of an account ban, but that Facebook needed to clarify the duration. The board said that Facebook must decide within six months on a lifetime ban or one of a specific duration. The issue could drag on, however: the board said it could very well have to rule again on Mr. Trump’s status after Facebook makes its decision.

    Beyond the specifics of Mr. Trump’s use of Facebook and Instagram, the oversight board asked the social media company to better explain how its rules apply to public figures and to more clearly enumerate its strikes and penalties processes, which can appear opaque, particularly when users are suspended or barred with little warning.

    Facebook allows an exemption for politicians to lie or otherwise break its rules in what the company says is the interest of newsworthiness. This is the opposite of how it should be: Politicians are more likely to be believed than regular folks, who are held to a higher standard on the site.

    Mr. Trump repeatedly violated Facebook’s community standards, including by threatening other world leaders and pushing conspiracy theories about his enemies. Nearly a quarter of his roughly 6,000 posts last year featured extremist rhetoric or misinformation about the election, his critics or the coronavirus.

    And he made it clear on Monday, as the oversight board’s public relations team began publicizing the imminent decision, that his time out of office has not chastened him. Regarding the decisive and fairly run November election, Mr. Trump wrote: “The Fraudulent Presidential Election of 2020 will be, from this day forth, known as THE BIG LIE!”

    Ms. Thorning-Schmidt chastised Facebook for what she said were arbitrary rule-making procedures. “The oversight board is clearly telling Facebook that they can’t just invent new, unwritten rules when it suits them and for special uses,” she said. “They have to have a transparent way of doing this.”

    But therein lies the unresolvable contradiction. Facebook’s rules, and its oversight board, are constructs of a private entity whose only real accountability is to its founder and chief executive. The board is good government theater. Until Facebook gives the board a much stronger mandate, it will remain just that.

  • Trump Ban From Facebook Upheld by Oversight Board

    A company-appointed panel ruled that the ban was justified at the time but added that the company should reassess its action and make a final decision in six months.

    SAN FRANCISCO — A Facebook-appointed panel of journalists, activists and lawyers on Wednesday upheld the social network’s ban of former President Donald J. Trump, ending any immediate return by Mr. Trump to mainstream social media and renewing a debate about tech power over online speech.

    Facebook’s Oversight Board, which acts as a quasi-court over the company’s content decisions, said the social network was right to bar Mr. Trump after he used the site to foment an insurrection in Washington in January. The panel said the ongoing risk of violence “justified” the move. But the board also said that an indefinite suspension was “not appropriate,” and that the company should apply a “defined penalty.” The board gave Facebook six months to make its final decision on Mr. Trump’s account status.

    “Our sole job is to hold this extremely powerful organization, Facebook, to be held accountable,” Michael McConnell, co-chair of the Oversight Board, said on a call with reporters. The ban on Mr. Trump “did not meet these standards,” he said.

    The decision adds difficulties to Mr. Trump’s rejoining mainstream social media, which he had used during his White House years to cajole, set policy, criticize opponents and rile up his tens of millions of followers. Twitter and YouTube had also cut off Mr. Trump in January after the insurrection at the Capitol building, saying the risk of harm and the potential for violence that he created were too great.

    But while Mr. Trump’s Facebook account remains suspended for now, he may be able to return to the social network once the company reviews its action. Mr. Trump still holds tremendous sway over Republicans, with his false claims of a stolen election continuing to reverberate.
    On Wednesday, House Republican leaders moved to expel Representative Liz Cheney of Wyoming from her leadership post for criticizing Mr. Trump and his election lies.

    Representatives for Mr. Trump did not immediately return requests for comment. On Tuesday, he unveiled a new site, “From the desk of Donald J. Trump,” with a Twitter-like feed, to communicate with his supporters.

    Mr. Trump’s continued Facebook suspension gave Republicans, who have long accused social media companies of suppressing conservative voices, new fuel against the platforms. Mark Zuckerberg, Facebook’s chief executive, has testified in Congress several times in recent years about whether the social network has shown bias against conservative political views. He has denied it.

    Senator Marsha Blackburn, Republican of Tennessee, said the Facebook board’s decision was “extremely disappointing” and that it was “clear that Mark Zuckerberg views himself as the arbiter of free speech.” And Representative Jim Jordan, Republican of Ohio, said Facebook, which faces antitrust scrutiny, should be broken up.

    Democrats were also unhappy. Frank Pallone, the chairman of the House Energy and Commerce Committee, tweeted, “Donald Trump has played a big role in helping Facebook spread disinformation, but whether he’s on the platform or not, Facebook and other social media platforms with the same business model will find ways to highlight divisive content to drive advertising revenues.”

    The decision underlined the power of tech companies in determining who gets to say what online. While Mr. Zuckerberg has said that he does not wish his company to be “the arbiter of truth” in social discourse, Facebook has become increasingly active about the kinds of content it allows. To prevent the spread of misinformation, the company has cracked down on QAnon conspiracy theory groups, election falsehoods and anti-vaccination content in recent months, culminating in the blocking of Mr. Trump in January.

    “This case has dramatic implications for the future of speech online because the public and other platforms are looking at how the oversight board will handle what is a difficult controversy that will arise again around the world,” said Nate Persily, a professor at Stanford University’s law school.

    He added, “President Trump has pushed the envelope about what is permissible speech on these platforms and he has set the outer limits such that if you are unwilling to go after him, you are allowing a large amount of incitement and hate speech and disinformation online that others are going to propagate.”

    In a statement, Facebook said it was “pleased” that the board recognized that its barring of Mr. Trump in January was justified. The company added that it would consider the ruling and “determine an action that is clear and proportionate.”

    Mr. Trump’s case is the most prominent that the Facebook Oversight Board, which was conceived in 2018, has handled. The board, which is made up of 20 journalists, activists and former politicians, reviews and adjudicates the company’s most contested content moderation decisions. Mr. Zuckerberg has repeatedly referred to it as the “Facebook Supreme Court.” But while the panel is positioned as independent, it was founded and funded by Facebook and has no legal or enforcement authority. Critics have been skeptical of the board’s autonomy and have said it gives Facebook the ability to punt on difficult decisions.

    Each of its cases is decided by a five-person panel selected from among the board’s 20 members, one of whom must be from the country in which the case originated. The panel reviews the comments on the case and makes recommendations to the full board, which decides through a majority vote. After a ruling, Facebook has seven days to act on the board’s decision.

    [Photo: Mark Zuckerberg, the Facebook chief executive, testified before the Senate Judiciary Committee last year. He has denied that the platform showed political bias. Pool photo by Hannah Mckay/EPA, via Shutterstock]

    Since the board began issuing rulings in January, it has overturned Facebook’s decisions in four of the five cases it has reviewed. In one case, the board asked Facebook to restore a post that used Joseph Goebbels, the Nazi propaganda chief, to make a point about the Trump presidency. Facebook had earlier removed the post because it “promoted dangerous individuals,” but complied with the board’s decision.

    In another case, the board ruled that Facebook had overreached by taking down a French user’s post that erroneously suggested the drug hydroxychloroquine could be used to cure Covid-19. Facebook restored the post but also said it would keep removing false information, following guidance from the Centers for Disease Control and Prevention and the World Health Organization.

    In Mr. Trump’s case, Facebook also asked the board to make recommendations on how to handle the accounts of political leaders. On Wednesday, the board suggested the company should publicly explain when it was applying special rules to influential figures, though it should impose definite time limits when doing so. The board also said Facebook should more clearly explain its strikes and penalties process, and develop and publish a policy that governs responses to crises or novel situations where its regular processes would not prevent imminent harm.

    “Facebook has been clearly abused by influential users,” said Helle Thorning-Schmidt, a co-chair of the Oversight Board. Facebook does not have to adopt these recommendations but said it “will carefully review” them.

    For Mr. Trump, Facebook was long a place to rally his digital base and support other Republicans. More than 32 million people followed him on Facebook, though that was far fewer than the more than 88 million followers he had on Twitter.

    Over the years, Mr. Trump and Mr. Zuckerberg also shared a testy relationship. Mr. Trump regularly assailed Silicon Valley executives for what he perceived to be their suppression of conservative speech. He also threatened to revoke Section 230, a legal shield that protects companies like Facebook from liability for what users post.

    Mr. Zuckerberg occasionally criticized some of Mr. Trump’s policies, including his handling of the pandemic and immigration. But as calls from lawmakers, civil rights leaders and even Facebook’s own employees grew to rein in Mr. Trump on social media, Mr. Zuckerberg declined to act. He said speech by political leaders — even if they spread lies — was newsworthy and in the public interest.

    The two men also appeared cordial during occasional meetings in Washington. Mr. Zuckerberg visited the White House more than once, dining privately with Mr. Trump.

    The politeness ended on Jan. 6. Hours before his supporters stormed the Capitol, Mr. Trump used Facebook and other social media to try to cast doubt on the results of the presidential election, which he had lost to Joseph R. Biden Jr. Mr. Trump wrote on Facebook, “Our Country has had enough, they won’t take it anymore!”

    Less than 24 hours later, Mr. Trump was barred from the platform indefinitely. While his Facebook page has remained up, it has been dormant. His last Facebook post, on Jan. 6, read, “I am asking for everyone at the U.S. Capitol to remain peaceful. No violence!”

    Cecilia Kang


    A Facebook panel will reveal on Wednesday whether Trump will regain his megaphone.

    Facebook’s Oversight Board, an independent and international panel that was created and funded by the social network, plans to announce on Wednesday whether former President Donald J. Trump will be able to return to the platform that has been a critical megaphone for him and his tens of millions of followers.

    The decision will be closely watched as a template for how private companies that run social networks handle political speech, including the misinformation spread by political leaders.

    Mr. Trump was indefinitely locked out of Facebook on Jan. 7 after he used his social media accounts to incite a mob of his supporters to storm the Capitol a day earlier. Mr. Trump had declined to accept his election defeat, saying the election had been stolen from him.

    At the time that Facebook barred Mr. Trump, the company’s chief executive, Mark Zuckerberg, wrote in a post: “We believe the risks of allowing the president to continue to use our service during this period are simply too great.”

    Two weeks later, the company referred the case of Mr. Trump to Facebook’s Oversight Board for a final decision on whether the ban should be permanent. Facebook and the board’s members have said the panel’s decisions are binding, but critics are skeptical of the board’s independence. The panel, critics said, is a first-of-its-kind, Supreme Court-like entity on online speech, funded by a private company with a poor track record of enforcing its own rules.

    Facebook’s approach to political speech has been inconsistent. In October 2019, Mr. Zuckerberg declared that the company would not fact-check political speech and said that even lies by politicians deserved a place on the social network because it was in the public’s interest to hear all ideas from political leaders. But Mr. Trump’s comments on Jan. 6 were different, the company has said, because they incited violence and threatened the peaceful transition of power in elections.

    On Monday, Mr. Trump continued to deny the election results. “The Fraudulent Presidential Election of 2020 will be, from this day forth, known as THE BIG LIE!” he said in an emailed statement.


    Facebook, Preparing for Chauvin Verdict, Will Limit Posts That Might Incite Violence

    Facebook on Monday said it planned to limit posts that contain misinformation and hate speech related to the trial of Derek Chauvin, the former Minneapolis police officer charged with the murder of George Floyd, to keep them from spilling over into real-world harm.

    As closing arguments began in the trial and Minneapolis braced for a verdict, Facebook said it would identify and remove posts on the social network that urged people to bring arms to the city. It also said it would protect members of Mr. Floyd’s family from harassment and take down content that praised, celebrated or mocked his death.

    “We know this trial has been painful for many people,” Monika Bickert, Facebook’s vice president of content policy, wrote in a blog post. “We want to strike the right balance between allowing people to speak about the trial and what the verdict means, while still doing our part to protect everyone’s safety.”

    Facebook, which has long positioned itself as a site for free speech, has become increasingly proactive in policing content that might lead to real-world violence. The Silicon Valley company has been under fire for years over the way it has handled sensitive news events, including last year’s presidential election, when online misinformation about voter fraud galvanized supporters of former President Donald J. Trump. Believing the election to have been stolen from Mr. Trump, some supporters stormed the Capitol building on Jan. 6.

    Leading up to the election, Facebook took steps to fight misinformation, foreign interference and voter suppression. The company displayed warnings on more than 150 million posts with election misinformation, removed more than 120,000 posts for violating its voter interference policies and took down 30 networks that posted false messages about the election.

    But critics said Facebook and other social media platforms did not do enough. After the storming of the Capitol, the social network stopped Mr. Trump from being able to post on the site. The company’s independent oversight board is now debating whether the former president will be allowed back on Facebook and has said it plans to issue its decision “in the coming weeks,” without giving a definite date.

    The death of Mr. Floyd, who was Black, led to a wave of Black Lives Matter protests across the nation last year. Mr. Chauvin, who is white, faces charges of manslaughter, second-degree murder and third-degree murder in Mr. Floyd’s death. The trial began in late March. Mr. Chauvin did not testify.

    Facebook said on Monday that it had determined that Minneapolis was, at least temporarily, “a high-risk location.” It said it would remove pages, groups, events and Instagram accounts that violated its violence and incitement policy; take down attacks against Mr. Chauvin and Mr. Floyd; and label misinformation and graphic content as sensitive. The company did not have any further comment.

    “As the trial comes to a close, we will continue doing our part to help people safely connect and share what they are experiencing,” Ms. Bickert said in the blog post.


    I Used to Think the Remedy for Bad Speech Was More Speech. Not Anymore.

    I used to believe that the remedy for bad speech is more speech. Now that seems archaic. Just as the founders never envisioned how the right of a well-regulated militia to own slow-loading muskets could apply to mass murderers with bullet-spewing military-style semiautomatic rifles, they could not have foreseen speech so twisted to malevolent intent as it is now.

    Cyber-libertarianism, the ethos of the internet with roots in 18th-century debate about the free market of ideas, has failed us miserably. Well after the pandemic is over, the infodemic will rage on, so long as it pays to lie, distort and misinform.

    Just recently, we saw the malignancies of our premier freedoms on display in the mass shooting in Boulder, Colo. At the center of the horror was a deeply disturbed man with a gun created for war, with the capacity to kill large numbers of humans, quickly. Within hours of the slaughter at the supermarket, a Facebook account with about 60,000 followers wrote that the shooting was fake, a so-called false flag meant to cast blame on the wrong person.

    So it goes. Toxic misinformation, like AR-15-style weapons in the hands of men bent on murder, is just something we’re supposed to live with in a free society. But there are three things we could do now to clean up the river of falsities poisoning our democracy.

    First, teach your parents well. Facebook users over the age of 65 are far more likely to post articles from fake news sites than people under the age of 30, according to multiple studies. Certainly, the “I don’t know it for a fact, I just know it’s true” sentiment, as the Bill Maher segment has it, is not limited to seniors. But too many older people lack the skills to detect a viral falsity.

    That’s where the kids come in. March 18 was “MisinfoDay” in many Washington State high schools. On that day, students were taught how to spot a lie, training they could share with their parents and grandparents.

    Media literacy classes have been around for a while. No one should graduate from high school without being equipped with the tools to recognize bogus information. It’s like elementary civics. By extension, we should encourage the informed young to pass this on to their misinformed elders.

    Second, sue. What finally made the misinformation merchants on television and the web close the spigot on the Big Lie about the election were lawsuits seeking billions. Dominion Voting Systems and Smartmatic, two election technology companies, sued Fox News and others, claiming defamation.

    “Lies have consequences,” Dominion’s lawyers wrote in their complaint. “Fox sold a false story of election fraud in order to serve its own commercial purposes, severely injuring Dominion in the process.”

    In response to the Smartmatic suit, Fox said, “This lawsuit strikes at the heart of the news media’s First Amendment mission to inform on matters of public concern.” No, it doesn’t. There is no “mission” to misinform.

    The fraudsters didn’t even pretend they weren’t peddling lies. Sidney Powell, the lawyer who was one of the loudest promoters of the falsehood that Donald Trump won the election, was named in a Dominion lawsuit. “No reasonable person would conclude that the statements were truly statements of fact,” her lawyers wrote, absurdly, of her deception.

    Tell that to the majority of Republican voters who said they believed the election was stolen. They didn’t see the wink when Powell went on Fox and Newsmax to claim a massive voter fraud scheme.

    Dominion should sue Trump, the man at the top of the falsity food chain. The ex-president has shown he will repeat a lie over and over until it hurts him financially. That’s how the system works. And the bar for a successful libel suit, it should be noted, is very high.

    Finally, we need to dis-incentivize social media giants from spreading misinformation. This means striking at the algorithms that drive traffic, the lines of code that push people down rabbit holes of unreality. The Capitol Hill riot on Jan. 6 might not have happened without the platforms that spread false information while fattening the fortunes of social media giants.

    “The last few years have proven that the more outrageous and extremist content social media platforms promote, the more engagement and advertising dollars they rake in,” said Representative Frank Pallone Jr., chairman of the House committee that recently questioned big tech chief executives.

    Taking away their legal shield, Section 230 of the Communications Decency Act, is the strongest threat out there. Sure, removing social media’s immunity from the untruthful things said on their platforms could mean the end of the internet as we know it. But that’s not necessarily a bad thing.

    So far, the threat has been mostly idle, all talk. At the least, lawmakers could more effectively use this leverage to force social media giants to redo their recommendation algorithms, making bogus information less likely to spread. When YouTube took such a step, promotion of conspiracy theories decreased significantly, according to researchers at the University of California, Berkeley, who published their findings in March 2020.

    Republicans may resist most of the above. Lies help them stay in power, and a misinformed public is good for their legislative agenda. They’re currently pushing a wave of voter suppression laws to fix a problem that doesn’t exist.

    I still believe the truth may set us free. But it has little chance of surviving amid the babble of orchestrated mendacity.

    Timothy Egan (@nytegan) is a contributing opinion writer who covers the environment, the American West and politics. He is a winner of the National Book Award and author, most recently, of “A Pilgrimage to Eternity.”


    Zuckerberg, Dorsey and Pichai testify about disinformation.

    The chief executives of Google, Facebook and Twitter are testifying before the House on Thursday about how disinformation spreads across their platforms, an issue the tech companies were scrutinized for during the presidential election and after the Jan. 6 riot at the Capitol.

    The hearing, held by the House Energy and Commerce Committee, is the first time that Mark Zuckerberg of Facebook, Jack Dorsey of Twitter and Sundar Pichai of Google are appearing before Congress during the Biden administration. President Biden has indicated that he is likely to be tough on the tech industry. That position, coupled with Democratic control of Congress, has raised liberal hopes that Washington will take steps to rein in Big Tech’s power and reach over the next few years.

    The hearing is also the first opportunity since the Jan. 6 Capitol riot for lawmakers to question the three men about the role their companies played in the event. The attack has made the issue of disinformation intensely personal for the lawmakers, since those who participated in the riot have been linked to online conspiracy theories like QAnon.

    Before the hearing, Democrats signaled in a memo that they were interested in questioning the executives about the Jan. 6 attacks, efforts by the right to undermine the results of the 2020 election and misinformation related to the Covid-19 pandemic.

    Republicans sent the executives letters this month asking them about the decisions to remove conservative personalities and stories from their platforms, including an October article in The New York Post about President Biden’s son Hunter.

    Lawmakers have debated whether social media platforms’ business models encourage the spread of hate and disinformation by prioritizing content that will elicit user engagement, often by emphasizing salacious or divisive posts.

    Some lawmakers will push for changes to Section 230 of the Communications Decency Act, a 1996 law that shields the platforms from lawsuits over their users’ posts. Lawmakers are trying to strip the protections in cases where the companies’ algorithms amplified certain illegal content. Others believe that the spread of disinformation could be stemmed with stronger antitrust laws, since the platforms are by far the major outlets for communicating publicly online.

    “By now it’s painfully clear that neither the market nor public pressure will stop social media companies from elevating disinformation and extremism, so we have no choice but to legislate, and now it’s a question of how best to do it,” said Representative Frank Pallone, the New Jersey Democrat who is chairman of the committee.

    The tech executives are expected to play up their efforts to limit misinformation and redirect users to more reliable sources of information. They may also entertain the possibility of more regulation, in an effort to shape increasingly likely legislative changes rather than resist them outright.