More stories


    Here's a Look Inside Facebook's Data Wars

    Executives at the social network have clashed over CrowdTangle, a Facebook-owned data tool that revealed users’ high engagement levels with right-wing media sources.

    One day in April, the people behind CrowdTangle, a data analytics tool owned by Facebook, learned that transparency had limits.

    Brandon Silverman, CrowdTangle’s co-founder and chief executive, assembled dozens of employees on a video call to tell them that they were being broken up. CrowdTangle, which had been running quasi-independently inside Facebook since being acquired in 2016, was being moved under the social network’s integrity team, the group trying to rid the platform of misinformation and hate speech. Some CrowdTangle employees were being reassigned to other divisions, and Mr. Silverman would no longer be managing the team day to day.

    The announcement, which left CrowdTangle’s employees in stunned silence, was the result of a yearlong battle among Facebook executives over data transparency, and how much the social network should reveal about its inner workings.

    On one side were executives, including Mr. Silverman and Brian Boland, a Facebook vice president in charge of partnerships strategy, who argued that Facebook should publicly share as much information as possible about what happens on its platform — good, bad or ugly.

    On the other side were executives, including the company’s chief marketing officer and vice president of analytics, Alex Schultz, who worried that Facebook was already giving away too much. They argued that journalists and researchers were using CrowdTangle, a kind of turbocharged search engine that allows users to analyze Facebook trends and measure post performance, to dig up information they considered unhelpful — showing, for example, that right-wing commentators like Ben Shapiro and Dan Bongino were getting much more engagement on their Facebook pages than mainstream news outlets.

    These executives argued that Facebook should selectively disclose its own data in the form of carefully curated reports, rather than handing outsiders the tools to discover it themselves.

    Team Selective Disclosure won, and CrowdTangle and its supporters lost.

    An internal battle over data transparency might seem low on the list of worthy Facebook investigations. And it’s a column I’ve hesitated to write for months, in part because I’m uncomfortably close to the action. (More on that in a minute.)

    But the CrowdTangle story is important, because it illustrates the way that Facebook’s obsession with managing its reputation often gets in the way of its attempts to clean up its platform. And it gets to the heart of one of the central tensions confronting Facebook in the post-Trump era. The company, blamed for everything from election interference to vaccine hesitancy, badly wants to rebuild trust with a skeptical public. But the more it shares about what happens on its platform, the more it risks exposing uncomfortable truths that could further damage its image.
    The question of what to do about CrowdTangle has vexed some of Facebook’s top executives for months, according to interviews with more than a dozen current and former Facebook employees, as well as internal emails and posts.

    These people, most of whom would speak only anonymously because they were not authorized to discuss internal conversations, said Facebook’s executives were more worried about fixing the perception that Facebook was amplifying harmful content than figuring out whether it actually was amplifying harmful content. Transparency, they said, ultimately took a back seat to image management.

    Facebook disputes this characterization. It says that the CrowdTangle reorganization was meant to integrate the service with its other transparency tools, not weaken it, and that top executives are still committed to increasing transparency.

    “CrowdTangle is part of a growing suite of transparency resources we’ve made available for people, including academics and journalists,” said Joe Osborne, a Facebook spokesman. “With CrowdTangle moving into our integrity team, we’re developing a more comprehensive strategy for how we build on some of these transparency efforts moving forward.”

    But the executives who pushed hardest for transparency appear to have been sidelined. Mr. Silverman, CrowdTangle’s co-founder and chief executive, has been taking time off and no longer has a clearly defined role at the company, several people with knowledge of the situation said. (Mr. Silverman declined to comment about his status.) And Mr. Boland, who spent 11 years at Facebook, left the company in November.

    “One of the main reasons that I left Facebook is that the most senior leadership in the company does not want to invest in understanding the impact of its core products,” Mr. Boland said, in his first interview since departing. “And it doesn’t want to make the data available for others to do the hard work and hold them accountable.”

    Mr. Boland, who oversaw CrowdTangle as well as other Facebook transparency efforts, said the tool fell out of favor with influential Facebook executives around the time of last year’s presidential election, when journalists and researchers used it to show that pro-Trump commentators were spreading misinformation and hyperpartisan commentary with stunning success.

    “People were enthusiastic about the transparency CrowdTangle provided until it became a problem and created press cycles Facebook didn’t like,” he said. “Then, the tone at the executive level changed.”

    (Photo caption: Brian Boland, a former vice president in charge of partnerships strategy and an advocate for more transparency, left Facebook in November. Christian Sorensen Hansen for The New York Times)

    The Twitter Account That Launched 1,000 Meetings

    Here’s where I, somewhat reluctantly, come in.

    I started using CrowdTangle a few years ago. I’d been looking for a way to see which news stories gained the most traction on Facebook, and CrowdTangle — a tool used mainly by audience teams at news publishers and marketers who want to track the performance of their posts — filled the bill. I figured out that through a kludgey workaround, I could use its search feature to rank Facebook link posts — that is, posts that include a link to a non-Facebook site — in order of the number of reactions, shares and comments they got. Link posts weren’t a perfect proxy for news, engagement wasn’t a perfect proxy for popularity and CrowdTangle’s data was limited in other ways, but it was the closest I’d come to finding a kind of cross-Facebook news leaderboard, so I ran with it.

    At first, Facebook was happy that I and other journalists were finding its tool useful.
    With only about 25,000 users, CrowdTangle is one of Facebook’s smallest products, but it has become a valuable resource for power users including global health organizations, election officials and digital marketers, and it has made Facebook look transparent compared with rival platforms like YouTube and TikTok, which don’t release nearly as much data.

    But the mood shifted last year when I started a Twitter account called @FacebooksTop10, on which I posted a daily leaderboard showing the sources of the most-engaged link posts by U.S. pages, based on CrowdTangle data.

    Last fall, the leaderboard was full of posts by Mr. Trump and pro-Trump media personalities. Since Mr. Trump was barred from Facebook in January, it has been dominated by a handful of right-wing polemicists like Mr. Shapiro, Mr. Bongino and Sean Hannity, with the occasional mainstream news article, cute animal story or K-pop fan blog sprinkled in.

    The account went semi-viral, racking up more than 35,000 followers. Thousands of people retweeted the lists, including conservatives who were happy to see pro-Trump pundits beating the mainstream media and liberals who shared them with jokes like “Look at all this conservative censorship!” (If you’ve been under a rock for the past two years, conservatives in the United States frequently complain that Facebook is censoring them.)

    The lists also attracted plenty of Facebook haters. Liberals shared them as evidence that the company was a swamp of toxicity that needed to be broken up; progressive advertisers bristled at the idea that their content was appearing next to pro-Trump propaganda. The account was even cited at a congressional hearing on tech and antitrust by Representative Jamie Raskin, Democrat of Maryland, who said it proved that “if Facebook is out there trying to suppress conservative speech, they’re doing a terrible job at it.”

    Inside Facebook, the account drove executives crazy.
    Some believed that the data was being misconstrued and worried that it was painting Facebook as a far-right echo chamber. Others worried that the lists might spook investors by suggesting that Facebook’s U.S. user base was getting older and more conservative. Every time a tweet went viral, I got grumpy calls from Facebook executives who were embarrassed by the disparity between what they thought Facebook was — a clean, well-lit public square where civility and tolerance reign — and the image they saw reflected in the Twitter lists.

    As the election approached last year, Facebook executives held meetings to figure out what to do, according to three people who attended them. They set out to determine whether the information on @FacebooksTop10 was accurate (it was), and discussed starting a competing Twitter account that would post more balanced lists based on Facebook’s internal data.

    They never did that, but several executives — including John Hegeman, the head of Facebook’s news feed — were dispatched to argue with me on Twitter. These executives argued that my Top 10 lists were misleading. They said CrowdTangle measured only “engagement,” while the true measure of Facebook popularity would be based on “reach,” or the number of people who actually see a given post. (With the exception of video views, reach data isn’t public, and only Facebook employees have access to it.)

    Last September, Mark Zuckerberg, Facebook’s chief executive, told Axios that while right-wing content garnered a lot of engagement, the idea that Facebook was a right-wing echo chamber was “just wrong.”

    “I think it’s important to differentiate that from, broadly, what people are seeing and reading and learning about on our service,” Mr. Zuckerberg said.

    But Mr. Boland, the former Facebook vice president, said that was a convenient deflection.
    He said that in internal discussions, Facebook executives were less concerned about the accuracy of the data than about the image of Facebook it presented.

    “It told a story they didn’t like,” he said of the Twitter account, “and frankly didn’t want to admit was true.”

    The Trouble With CrowdTangle

    Around the same time that Mr. Zuckerberg made his comments to Axios, the tensions came to a head. The Economist had just published an article claiming that Facebook “offers a distorted view of American news.”

    The article, which cited CrowdTangle data, showed that the most-engaged American news sites on Facebook were Fox News and Breitbart, and claimed that Facebook’s overall news ecosystem skewed right wing. John Pinette, Facebook’s vice president of global communications, emailed a link to the article to a group of executives with the subject line “The trouble with CrowdTangle.”

    “The Economist steps onto the Kevin Roose bandwagon,” Mr. Pinette wrote. (See? Told you it was uncomfortably close to home.)

    Nick Clegg, Facebook’s vice president of global affairs, replied, lamenting that “our own tools are helping journos to consolidate the wrong narrative.”

    Other executives chimed in, adding their worries that CrowdTangle data was being used to paint Facebook as a right-wing echo chamber.

    David Ginsberg, Facebook’s vice president of choice and competition, wrote that if Mr. Trump won re-election in November, “the media and our critics will quickly point to this ‘echo chamber’ as a prime driver of the outcome.”

    Fidji Simo, the head of the Facebook app at the time, agreed. “I really worry that this could be one of the worst narratives for us,” she wrote.

    Several executives proposed making reach data public on CrowdTangle, in hopes that reporters would cite that data instead of the engagement data they thought made Facebook look bad.

    But Mr. Silverman, CrowdTangle’s chief executive, replied in an email that the CrowdTangle team had already tested a feature to do that and found problems with it. One issue was that false and misleading news stories also rose to the top of those lists.

    “Reach leaderboard isn’t a total win from a comms point of view,” Mr. Silverman wrote.

    Mr. Schultz, Facebook’s chief marketing officer, had the dimmest view of CrowdTangle. He wrote that he thought “the only way to avoid stories like this” would be for Facebook to publish its own reports about the most popular content on its platform, rather than releasing data through CrowdTangle.

    “If we go down the route of just offering more self-service data you will get different, exciting, negative stories in my opinion,” he wrote.

    Mr. Osborne, the Facebook spokesman, said Mr. Schultz and the other executives were discussing how to correct misrepresentations of CrowdTangle data, not strategizing about killing off the tool.

    A few days after the election in November, Mr. Schultz wrote a post for the company blog, called “What Do People Actually See on Facebook in the U.S.?” He explained that if you ranked Facebook posts based on which got the most reach, rather than the most engagement — his preferred method of slicing the data — you’d end up with a more mainstream, less sharply partisan list of sources.

    “We believe this paints a more complete picture than the CrowdTangle data alone,” he wrote.

    That may be true, but there’s a problem with reach data: Most of it is inaccessible and can’t be vetted or fact-checked by outsiders. We simply have to trust that Facebook’s own, private data tells a story that’s very different from the data it shares with the public.

    Tweaking Variables

    Mr. Zuckerberg is right about one thing: Facebook is not a giant right-wing echo chamber.

    But it does contain a giant right-wing echo chamber — a kind of AM talk radio built into the heart of Facebook’s news ecosystem, with a hyper-engaged audience of loyal partisans who love liking, sharing and clicking on posts from right-wing pages, many of which have gotten good at serving up Facebook-optimized outrage bait at a consistent clip.

    CrowdTangle’s data made this echo chamber easier for outsiders to see and quantify. But it didn’t create it, or give it the tools it needed to grow — Facebook did — and blaming a data tool for these revelations makes no more sense than blaming a thermometer for bad weather.

    It’s worth noting that these transparency efforts are voluntary, and could disappear at any time. There are no regulations that require Facebook or any other social media companies to reveal what content performs well on their platforms, and American politicians appear to be more interested in fighting over claims of censorship than getting access to better data.

    It’s also worth noting that Facebook can turn down the outrage dials and show its users calmer, less divisive news any time it wants. (In fact, it briefly did so after the 2020 election, when it worried that election-related misinformation could spiral into mass violence.) And there is some evidence that it is at least considering more permanent changes.

    This year, Mr. Hegeman, the executive in charge of Facebook’s news feed, asked a team to figure out how tweaking certain variables in the core news feed ranking algorithm would change the resulting Top 10 lists, according to two people with knowledge of the project.

    The project, which some employees refer to as the “Top 10” project, is still underway, the people said, and it’s unclear whether its findings have been put in place.

    Mr. Osborne, the Facebook spokesman, said that the team looks at a variety of ranking changes, and that the experiment wasn’t driven by a desire to change the Top 10 lists.

    As for CrowdTangle, the tool is still available, and Facebook is not expected to cut off access to journalists and researchers in the short term, according to two people with knowledge of the company’s plans.

    Mr. Boland, however, said he wouldn’t be surprised if Facebook executives decided to kill off CrowdTangle entirely or starve it of resources, rather than dealing with the headaches its data creates.

    “Facebook would love full transparency if there was a guarantee of positive stories and outcomes,” Mr. Boland said. “But when transparency creates uncomfortable moments, their reaction is often to shut down the transparency.”


    The Facebook Oversight Board's Verdict on the Trump Ban

    In the end, they passed the buck.

    A year ago, Facebook introduced an oversight board that it said would help it answer difficult moderation questions — that is, who is allowed to use the social media site to amplify his voice and who is not.

    Yet when presented with its most consequential issue — whether to uphold the site’s indefinite suspension of Donald Trump — the board on Wednesday said Facebook should make the ultimate decision.

    The whole farce highlights the fatuousness of having a quasi-court assist a multinational corporation in making business decisions. Its members may be deliberative, earnest and thoughtful, but the oversight board cannot compel Facebook to make underlying policy changes nor set meaningful precedent about moderation. Its remit is only to decide whether specific posts should remain on the site or be removed.

    Helle Thorning-Schmidt, an oversight board co-chair and former prime minister of Denmark, sought to bolster the body’s importance. “Anyone who is concerned about Facebook’s excessive concentration of power should welcome the oversight board clearly telling Facebook that they cannot invent new unwritten rules when it suits them,” she said in a call with media outlets.

    Michael McConnell, another co-chair and a Stanford Law School professor, said in an interview that Facebook was “open to the suggestions of the board.” “The immediate holding of our decision is binding and I do think that they are going to set precedent.” He added, “The analogy to the Supreme Court is not bad.”

    But Facebook is no public entity, and the board’s policy rulings have no legal standing beyond co-opting the language of the legal system. The company, meaning its chief executive, Mark Zuckerberg, will act in its best interests as a business.

    (Twitter, Mr. Trump’s favored platform, shut down his account two days after the Capitol riot on Jan. 6 and has announced no plans to restore it, nor has the company farmed out the decision to a third party.)

    Declining to amplify Mr. Trump’s lies on Facebook as the country was reeling from the Capitol attack was a good business decision for Facebook at the time, but restoring his account, with its roughly 35 million followers, may also eventually be a good business decision.

    The board, made up of 20 handpicked scholars, lawyers, politicians and other heavyweights, said Donald Trump’s use of Facebook to spur on the Jan. 6 attack on the Capitol was worthy of an account ban, but that Facebook needed to clarify the duration. The board said that Facebook must decide within six months on a lifetime ban or one of a specific duration.

    The issue could drag on, however. The board said it could very well have to rule again on Mr. Trump’s status after Facebook makes its decision.

    Beyond the specifics of Mr. Trump’s use of Facebook and Instagram, the oversight board requested that the social media company better explain how its rules apply to public figures and more clearly enumerate its strikes and penalties processes, which can appear opaque, particularly when users are suspended or barred with little warning.

    Facebook allows an exemption for politicians to lie or break other of its rules in what the company says is the interest of newsworthiness. This is the opposite of how it should be: Politicians are more likely to be believed than regular folks, who are held to a higher standard on the site.

    Mr. Trump repeatedly violated Facebook’s community standards, including by threatening other world leaders and pushing conspiracy theories about his enemies. Nearly a quarter of his roughly 6,000 posts last year featured extremist rhetoric or misinformation about the election, his critics or the coronavirus.

    And he made it clear on Monday, as the oversight board’s public relations team began publicizing the imminent decision, that his time out of office has not chastened him. Regarding the decisive and fairly run November election, Mr. Trump wrote: “The Fraudulent Presidential Election of 2020 will be, from this day forth, known as THE BIG LIE!”

    Ms. Thorning-Schmidt chastised Facebook for what she said were arbitrary rule-making procedures. “The oversight board is clearly telling Facebook that they can’t just invent new, unwritten rules when it suits them and for special uses,” she said. “They have to have a transparent way of doing this.”

    But therein lies the unresolvable contradiction. Facebook’s rules, and its oversight board, are constructs of a private entity whose only real accountability is to its founder and chief executive.

    The board is good government theater. Until Facebook gives the board a much stronger mandate, it will remain just that.


    Trump Ban From Facebook Upheld by Oversight Board

    A company-appointed panel ruled that the ban was justified at the time but added that the company should reassess its action and make a final decision in six months.

    SAN FRANCISCO — A Facebook-appointed panel of journalists, activists and lawyers on Wednesday upheld the social network’s ban of former President Donald J. Trump, ending any immediate return by Mr. Trump to mainstream social media and renewing a debate about tech power over online speech.

    Facebook’s Oversight Board, which acts as a quasi-court over the company’s content decisions, said the social network was right to bar Mr. Trump after he used the site to foment an insurrection in Washington in January. The panel said the ongoing risk of violence “justified” the move.

    But the board also said that an indefinite suspension was “not appropriate,” and that the company should apply a “defined penalty.” The board gave Facebook six months to make its final decision on Mr. Trump’s account status.

    “Our sole job is to hold this extremely powerful organization, Facebook, accountable,” Michael McConnell, co-chair of the Oversight Board, said on a call with reporters. The ban on Mr. Trump “did not meet these standards,” he said.

    The decision adds difficulties to Mr. Trump rejoining mainstream social media, which he had used during his White House years to cajole, set policy, criticize opponents and rile up his tens of millions of followers. Twitter and YouTube had also cut off Mr. Trump in January after the insurrection at the Capitol building, saying the risk of harm and the potential for violence that he created were too great.

    But while Mr. Trump’s Facebook account remains suspended for now, he may be able to return to the social network once the company reviews its action. Mr. Trump still holds tremendous sway over Republicans, with his false claims of a stolen election continuing to reverberate.
    On Wednesday, House Republican leaders moved to expel Representative Liz Cheney of Wyoming from her leadership post for criticizing Mr. Trump and his election lies.

    Representatives for Mr. Trump did not immediately return requests for comment. On Tuesday, he unveiled a new site, “From the desk of Donald J. Trump,” with a Twitter-like feed, to communicate with his supporters.

    Mr. Trump’s continued Facebook suspension gave Republicans, who have long accused social media companies of suppressing conservative voices, new fuel against the platforms. Mark Zuckerberg, Facebook’s chief executive, has testified in Congress several times in recent years about whether the social network has shown bias against conservative political views. He has denied it.

    Senator Marsha Blackburn, Republican of Tennessee, said the Facebook board’s decision was “extremely disappointing” and that it was “clear that Mark Zuckerberg views himself as the arbiter of free speech.” And Representative Jim Jordan, Republican of Ohio, said Facebook, which faces antitrust scrutiny, should be broken up.

    Democrats were also unhappy. Frank Pallone, the chairman of the House energy and commerce committee, tweeted, “Donald Trump has played a big role in helping Facebook spread disinformation, but whether he’s on the platform or not, Facebook and other social media platforms with the same business model will find ways to highlight divisive content to drive advertising revenues.”

    The decision underlined the power of tech companies in determining who gets to say what online. While Mr. Zuckerberg has said that he does not wish his company to be “the arbiter of truth” in social discourse, Facebook has become increasingly active about the kinds of content it allows. To prevent the spread of misinformation, the company has cracked down on QAnon conspiracy theory groups, election falsehoods and anti-vaccination content in recent months, culminating in the blocking of Mr. Trump in January.

    “This case has dramatic implications for the future of speech online because the public and other platforms are looking at how the oversight board will handle what is a difficult controversy that will arise again around the world,” said Nate Persily, a professor at Stanford University’s law school.

    He added, “President Trump has pushed the envelope about what is permissible speech on these platforms and he has set the outer limits such that if you are unwilling to go after him, you are allowing a large amount of incitement and hate speech and disinformation online that others are going to propagate.”

    In a statement, Facebook said it was “pleased” that the board recognized that its barring of Mr. Trump in January was justified. The company added that it would consider the ruling and “determine an action that is clear and proportionate.”

    Mr. Trump’s case is the most prominent that the Facebook Oversight Board, which was conceived in 2018, has handled. The board, which is made up of 20 journalists, activists and former politicians, reviews and adjudicates the company’s most contested content moderation decisions. Mr. Zuckerberg has repeatedly referred to it as the “Facebook Supreme Court.”

    But while the panel is positioned as independent, it was founded and funded by Facebook and has no legal or enforcement authority. Critics have been skeptical of the board’s autonomy and have said it gives Facebook the ability to punt on difficult decisions.

    Each of its cases is decided by a five-person panel selected from among the board’s 20 members, one of whom must be from the country in which the case originated. The panel reviews the comments on the case and makes recommendations to the full board, which decides through a majority vote. After a ruling, Facebook has seven days to act on the board’s decision.

    (Photo caption: Mark Zuckerberg, the Facebook chief executive, testified before the Senate Judiciary Committee last year. He has denied that the platform showed political bias. Pool photo by Hannah McKay/EPA, via Shutterstock)

    Since the board began issuing rulings in January, it has overturned Facebook’s decisions in four out of the five cases it has reviewed. In one case, the board asked Facebook to restore a post that used Joseph Goebbels, the Nazi propaganda chief, to make a point about the Trump presidency. Facebook had earlier removed the post because it “promoted dangerous individuals,” but complied with the board’s decision.

    In another case, the board ruled that Facebook had overreached by taking down a French user’s post that erroneously suggested the drug hydroxychloroquine could be used to cure Covid-19. Facebook restored the post but also said it would keep removing the false information, following guidance by the Centers for Disease Control and Prevention and the World Health Organization.

    In Mr. Trump’s case, Facebook also asked the board to make recommendations on how to handle the accounts of political leaders. On Wednesday, the board suggested the company should publicly explain when it was applying special rules to influential figures, though it should impose definite time limits when doing so. The board also said Facebook should more clearly explain its strikes and penalties process, and develop and publish a policy that governs responses to crises or novel situations where its regular processes would not prevent imminent harm.

    “Facebook has been clearly abused by influential users,” said Helle Thorning-Schmidt, a co-chair of the Oversight Board.

    Facebook does not have to adopt these recommendations but said it “will carefully review” them.

    For Mr. Trump, Facebook was long a place to rally his digital base and support other Republicans. More than 32 million people followed him on Facebook, though that was far fewer than the more than 88 million followers he had on Twitter.

    Over the years, Mr. Trump and Mr. Zuckerberg also shared a testy relationship. Mr. Trump regularly assailed Silicon Valley executives for what he perceived to be their suppression of conservative speech. He also threatened to revoke Section 230, a legal shield that protects companies like Facebook from liability for what users post.

    Mr. Zuckerberg occasionally criticized some of Mr. Trump’s policies, including the handling of the pandemic and immigration. But as calls from lawmakers, civil rights leaders and even Facebook’s own employees grew to rein in Mr. Trump on social media, Mr. Zuckerberg declined to act. He said speech by political leaders — even if they spread lies — was newsworthy and in the public interest.

    The two men also appeared cordial during occasional meetings in Washington. Mr. Zuckerberg visited the White House more than once, dining privately with Mr. Trump.

    The politeness ended on Jan. 6. Hours before his supporters stormed the Capitol, Mr. Trump used Facebook and other social media to try to cast doubt on the results of the presidential election, which he had lost to Joseph R. Biden Jr. Mr. Trump wrote on Facebook, “Our Country has had enough, they won’t take it anymore!”

    Less than 24 hours later, Mr. Trump was barred from the platform indefinitely. While his Facebook page has remained up, it has been dormant. His last Facebook post, on Jan. 6, read, “I am asking for everyone at the U.S. Capitol to remain peaceful. No violence!”

    Cecilia Kang


    A Facebook panel will reveal on Wednesday whether Trump will regain his megaphone.

    Facebook’s Oversight Board, an independent and international panel that was created and funded by the social network, plans to announce on Wednesday whether former President Donald J. Trump will be able to return to the platform that has been a critical megaphone for him and his tens of millions of followers.

    The decision will be closely watched as a template for how private companies that run social networks handle political speech, including the misinformation spread by political leaders.

    Mr. Trump was indefinitely locked out of Facebook on Jan. 7 after he used his social media accounts to incite a mob of his supporters to storm the Capitol a day earlier. Mr. Trump had declined to accept his election defeat, saying the election had been stolen from him.

    At the time that Facebook barred Mr. Trump, the company’s chief executive, Mark Zuckerberg, wrote in a post: “We believe the risks of allowing the president to continue to use our service during this period are simply too great.”

    Two weeks later, the company referred the case of Mr. Trump to Facebook’s Oversight Board for a final decision on whether the ban should be permanent. Facebook and the board’s members have said the panel’s decisions are binding, but critics are skeptical of the board’s independence. The panel, critics said, is a first-of-its-kind Supreme Court-like entity on online speech, funded by a private company with a poor track record of enforcing its own rules.

    Facebook’s approach to political speech has been inconsistent. In October 2019, Mr. Zuckerberg declared the company would not fact-check political speech and said that even lies by politicians deserved a place on the social network because it was in the public’s interest to hear all ideas by political leaders. But Mr. Trump’s comments on Jan. 6 were different, the company has said, because they incited violence and threatened the peaceful transition of power in elections.

    On Monday, Mr. Trump continued to deny the election results. “The Fraudulent Presidential Election of 2020 will be, from this day forth, known as THE BIG LIE!” he said in an emailed statement.


    Zuckerberg, Dorsey and Pichai testify about disinformation.

    The chief executives of Google, Facebook and Twitter are testifying before the House on Thursday about how disinformation spreads across their platforms, an issue for which the tech companies were scrutinized during the presidential election and after the Jan. 6 riot at the Capitol.

    The hearing, held by the House Energy and Commerce Committee, is the first time that Mark Zuckerberg of Facebook, Jack Dorsey of Twitter and Sundar Pichai of Google are appearing before Congress during the Biden administration. President Biden has indicated that he is likely to be tough on the tech industry. That position, coupled with Democratic control of Congress, has raised liberal hopes that Washington will take steps to rein in Big Tech’s power and reach over the next few years.

    The hearing is also the first opportunity since the Jan. 6 Capitol riot for lawmakers to question the three men about the role their companies played in the event. The attack has made the issue of disinformation intensely personal for lawmakers, since those who participated in the riot have been linked to online conspiracy theories like QAnon.

    Before the hearing, Democrats signaled in a memo that they were interested in questioning the executives about the Jan. 6 attack, efforts by the right to undermine the results of the 2020 election, and misinformation related to the Covid-19 pandemic. Republicans sent the executives letters this month asking about their decisions to remove conservative personalities and stories from their platforms, including an October article in The New York Post about President Biden’s son Hunter.

    Lawmakers have debated whether social media platforms’ business models encourage the spread of hate and disinformation by prioritizing content that will elicit user engagement, often by emphasizing salacious or divisive posts.

    Some lawmakers will push for changes to Section 230 of the Communications Decency Act, a 1996 law that shields the platforms from lawsuits over their users’ posts. They are trying to strip those protections in cases where the companies’ algorithms amplified certain illegal content. Others believe that the spread of disinformation could be stemmed with stronger antitrust laws, since the platforms are by far the major outlets for communicating publicly online.

    “By now it’s painfully clear that neither the market nor public pressure will stop social media companies from elevating disinformation and extremism, so we have no choice but to legislate, and now it’s a question of how best to do it,” said Representative Frank Pallone, the New Jersey Democrat who is chairman of the committee.

    The tech executives are expected to play up their efforts to limit misinformation and redirect users to more reliable sources of information. They may also entertain the possibility of more regulation, in an effort to shape increasingly likely legislative changes rather than resist them outright.


    Facebook Ends Ban on Political Advertising

    The social network had prohibited political ads on its site indefinitely after the November election. Such ads have been criticized for spreading misinformation.

    [Photo: Mark Zuckerberg, the Facebook chief executive, testifying in October. Before the ban on political ads, he had said he wanted to maintain a hands-off approach toward speech on Facebook. Credit: Pool photo by Michael Reynolds]

    March 3, 2021, updated 6:16 p.m. ET

    SAN FRANCISCO — Facebook said on Wednesday that it planned to lift its ban on political advertising across its network, resuming a form of digital promotion that has been criticized for spreading misinformation and falsehoods and for inflaming voters.

    The social network said it would allow advertisers to buy new ads about “social issues, elections or politics” beginning on Thursday, according to a copy of an email sent to political advertisers and viewed by The New York Times. Those advertisers must complete a series of identity checks before being authorized to place the ads, the company said.

    “We put this temporary ban in place after the November 2020 election to avoid confusion or abuse following Election Day,” Facebook said in a blog post. “We’ve heard a lot of feedback about this and learned more about political and electoral ads during this election cycle. As a result, we plan to use the coming months to take a closer look at how these ads work on our service to see where further changes may be merited.”

    Political advertising on Facebook has long faced questions. Mark Zuckerberg, Facebook’s chief executive, has said he wished to maintain a largely hands-off stance toward speech on the site, including political ads, unless it posed an immediate harm to the public or individuals, saying that he did not want to be “the arbiter of truth.” But after the 2016 presidential election, the company and intelligence officials discovered that Russians had used Facebook ads to sow discontent among Americans. Former President Donald J. Trump also used Facebook’s political ads to amplify claims of an “invasion” at the Mexican border in 2019, among other incidents.

    Facebook banned political ads late last year as a way to choke off misinformation and threats of violence around the November presidential election. In September, the company said it planned to forbid new political ads for the week before Election Day and would act swiftly against posts that tried to dissuade people from voting. In October, Facebook expanded that action, declaring that it would prohibit all political and issue-based advertising after the polls closed on Nov. 3 for an undetermined length of time.

    The company eventually clamped down on groups and pages that spread certain kinds of misinformation, such as discouraging people from voting or registering to vote. It has spent billions of dollars to root out foreign influence campaigns and other types of meddling from malicious state agencies and other bad actors.

    In December, Facebook lifted the ban to allow some advertisers to run political issue and candidacy ads in Georgia for the state’s January runoff Senate election. But the ban otherwise remained in effect in the other 49 states.

    Attitudes about how political advertising should be treated on Facebook are decidedly mixed. Politicians who are not well known can often raise their profiles and awareness of their campaigns by using Facebook.

    “Political ads are not bad things in and of themselves,” said Siva Vaidhyanathan, a media studies professor and the author of a book studying Facebook’s effects on democracy. “They perform an essential service, in the act of directly representing the candidate’s concerns or positions.” He added, “When you ban all campaign ads on the most accessible and affordable platform out there, you tilt the balance toward the candidates who can afford radio and television.”

    Representative Alexandria Ocasio-Cortez, Democrat of New York, has also said that political advertising on Facebook can be a crucial component of Democratic digital campaign strategies.

    Some political ad buyers applauded the lifting of the ban. “The ad ban was something that Facebook did to appease the public for the misinformation that spread across the platform,” said Eileen Pollet, a digital campaign strategist and founder of Ravenna Strategies. “But it really ended up hurting good actors while bad actors had total free rein. And now, especially since the election is over, the ban had really been hurting nonprofits and local organizations.”

    Facebook has long sought to thread the needle between forceful enforcement of its policies and a lighter touch. For years, Mr. Zuckerberg defended politicians’ right to say what they wanted on Facebook, but that changed last year amid rising alarm over potential violence around the November election. In January, Facebook barred Mr. Trump from using his account and posting on the platform after he took to social media to delegitimize the election results and incited a violent uprising among his supporters, who stormed the U.S. Capitol.

    Facebook said Mr. Trump’s suspension was “indefinite.” The decision is now under review by the Facebook Oversight Board, a third-party entity created by the company and composed of journalists, academics and others, which adjudicates some of the company’s thorny content policy enforcement decisions. A decision is expected within the next few months.

    On Thursday, political advertisers on Facebook will be able to submit new ads or turn on existing political ads that have already been approved, the company said. Each ad will appear with a small disclaimer stating that it has been “paid for by” a political organization. For those buying new ads, Facebook said it could take up to a week to clear the identity authorization and advertising review process.


    Congressional Committee Presses Cable Providers on Election Fraud Claims

    Before a hearing scheduled for Wednesday, Democratic members of the House Energy and Commerce Committee asked cable companies what they did to combat “the spread of misinformation.”

    [Photo: President Trump’s supporters approach the Capitol on Jan. 6. Credit: Kenny Holston for The New York Times]

    Feb. 22, 2021, 9:14 a.m. ET

    Three months ago, federal lawmakers grilled Mark Zuckerberg, Facebook’s chief executive, and Jack Dorsey, Twitter’s chief, about the misinformation that had appeared on their platforms. Now a congressional committee has scheduled a hearing focused on the role that cable television providers play in the spread of falsehoods about the 2020 election.

    In advance of the Wednesday hearing, called “Fanning the Flames: Disinformation and Extremism in the Media,” members of the House Energy and Commerce Committee sent a letter on Monday to Comcast, AT&T, Spectrum, Dish, Verizon, Cox and Altice, asking about their role in “the spread of dangerous misinformation.” The committee members also sent the letter to Roku, Amazon, Apple, Google and Hulu, digital companies that distribute cable programming.

    The scrutiny of cable providers took on new urgency after supporters of former President Donald J. Trump, who repeatedly promoted the debunked claim that the election was rigged, stormed the Capitol on Jan. 6.

    “To our knowledge, the cable, satellite and over-the-top companies that disseminate these media outlets to American viewers have done nothing in response to the misinformation aired by these outlets,” two Democratic representatives from California, Anna G. Eshoo and Jerry McNerney, wrote in the letter, which was reviewed by The New York Times. None of the companies that received the letter immediately replied to requests for comment.

    Newsmax, a right-wing cable channel carried by AT&T, CenturyLink, Charter, Comcast, Dish and Verizon, had a surge in ratings in November because of programs that embraced the former president’s claims of voter fraud. One America News Network, a right-wing outlet carried by AT&T, CenturyLink and Verizon, also promoted the false theory.

    Fox News, the most-watched cable news network, which is available from all major carriers, was one of five defendants in a $2.7 billion defamation lawsuit filed this month by the election technology company Smartmatic. In the suit, the company accused Fox News, its parent company Fox Corporation, three Fox anchors and two frequent Fox guests of promoting false claims about the election and Smartmatic’s role in it. (Fox has denied the claims and filed a motion to dismiss the suit.)

    Congress can raise the question of whether cable providers bear responsibility for the programs they deliver to millions of Americans, but it may have no way to force them to drop networks that have spread misinformation. And unlike broadcast stations, cable channels do not hold licenses regulated by the Federal Communications Commission.

    The lawmakers’ letter asks the companies, “What steps did you take prior to, on, and following the November 3, 2020 elections and the January 6, 2021 attacks to monitor, respond to, and reduce the spread of disinformation, including encouragement or incitement of violence by channels your company disseminates to millions of Americans?”

    “Are you planning to continue carrying Fox News, OANN, and Newsmax on your platform both now and beyond the renewal date?” the letter continues. “If so, why?”

    Blair Levin, who served as the F.C.C.’s chief of staff under President Bill Clinton, said a hearing could be a first step toward meaningful action. “You have to establish a factual record that on both the election and Covid, tens of millions of Americans believe things that are just factually not true, and then try to figure out: ‘What are the appropriate roles for the government in changing that dynamic?’” Mr. Levin said.

    Harold Feld, the senior vice president at Public Knowledge, a nonprofit group focused on telecommunications and digital rights, suggested that legislators might not have easy options for exerting influence over Fox, Newsmax or OAN. “You have a lot of people who are very angry about it, you have a lot of people who want to show that they’re very angry about it, but you don’t have a lot of good ideas yet about what you ought to be doing about it,” he said.

    For now, defamation lawsuits filed by private companies have taken the lead in the fight against disinformation promoted on some cable channels. Last month, Dominion Voting Systems, another election technology company that has figured prominently in conspiracy theories about the 2020 vote, sued two of Mr. Trump’s legal representatives, Rudolph W. Giuliani and Sidney Powell, in separate lawsuits, each seeking more than $1 billion in damages. Both appeared as guests on Fox News, Fox Business, Newsmax and OAN in the weeks after the election. On Monday, Dominion sued Mike Lindell, the chief executive of MyPillow, alleging that he defamed Dominion with baseless claims of election fraud involving its voting machines.