The social network has contacted academics to create a group to advise it on thorny election-related decisions, said people with knowledge of the matter.
Facebook has approached academics and policy experts about forming a commission to advise it on global election-related matters, said five people with knowledge of the discussions, a move that would allow the social network to shift some of its political decision-making to an advisory body.
The proposed commission could decide on matters such as the viability of political ads and what to do about election-related misinformation, said the people, who spoke on the condition of anonymity because the discussions were confidential. Facebook is expected to announce the commission this fall in preparation for the 2022 midterm elections, they said, though the effort is preliminary and could still fall apart.
Outsourcing election matters to a panel of experts could help Facebook sidestep criticism of bias by political groups, two of the people said. The company has been blasted in recent years by conservatives, who have accused Facebook of suppressing their voices, as well as by civil rights groups and Democrats for allowing political misinformation to fester and spread online. Mark Zuckerberg, Facebook’s chief executive, does not want to be seen as the sole decision maker on political content, two of the people said.
Facebook declined to comment.
If an election commission is formed, it would emulate the step Facebook took in 2018 when it created what it calls the Oversight Board, a collection of journalism, legal and policy experts who adjudicate whether the company was correct to remove certain posts from its platforms. Facebook has pushed some content decisions to the Oversight Board for review, allowing the company to show that it does not make such determinations on its own.
Facebook, which has positioned the Oversight Board as independent, appointed the people on the panel and pays them through a trust.
The Oversight Board’s highest-profile decision was reviewing Facebook’s suspension of former President Donald J. Trump after the Jan. 6 storming of the U.S. Capitol. At the time, Facebook opted to ban Mr. Trump’s account indefinitely, a penalty that the Oversight Board later deemed “not appropriate” because the time frame was not based on any of the company’s rules. The board asked Facebook to try again.
In June, Facebook responded by saying that it would bar Mr. Trump from the platform for at least two years. The Oversight Board has separately weighed in on more than a dozen other content cases that it calls “highly emblematic” of broader themes that Facebook grapples with regularly, including whether certain Covid-related posts should remain up on the network and hate speech issues in Myanmar.
A spokesman for the Oversight Board declined to comment.
Facebook has had a spotty track record on election-related issues, going back to Russian manipulation of the platform’s advertising and posts in the 2016 presidential election.
Lawmakers and political ad buyers also criticized Facebook for changing the rules around political ads before the 2020 presidential election. Last year, the company said it would bar the purchase of new political ads the week before the election, then later decided to temporarily ban all U.S. political advertising after the polls closed on Election Day, causing an uproar among candidates and ad-buying firms.
The company has struggled with how to handle lies and hate speech around elections. During his last year in office, Mr. Trump used Facebook to suggest he would use state violence against protesters in Minneapolis ahead of the 2020 election, while casting doubt on the electoral process as votes were tallied in November. Facebook initially said that what political leaders posted was newsworthy and should not be touched, before later reversing course.
The social network has also faced difficulties in elections elsewhere, including the proliferation of targeted disinformation across its WhatsApp messaging service during the Brazilian presidential election in 2018. In 2019, Facebook removed hundreds of misleading pages and accounts associated with political parties in India ahead of the country’s national elections.
Facebook has tried various methods to stem the criticism. It established a political ads library to increase transparency around buyers of those promotions. It has also set up war rooms to monitor elections for disinformation and prevent interference.
There are several elections in the coming year in countries such as Hungary, Germany, Brazil and the Philippines where Facebook’s actions will be closely scrutinized. Voter fraud misinformation has already begun spreading ahead of German elections in September. In the Philippines, Facebook has removed networks of fake accounts that support President Rodrigo Duterte, who used the social network to gain power in 2016.
“There is already this perception that Facebook, an American social media company, is going in and tilting elections of other countries through its platform,” said Nathaniel Persily, a law professor at Stanford University. “Whatever decisions Facebook makes have global implications.”
Internal conversations around an election commission date back to at least a few months ago, said three people with knowledge of the matter.
An election commission would differ from the Oversight Board in one key way, the people said. While the Oversight Board waits for Facebook to remove a post or an account and then reviews that action, the election commission would proactively provide guidance without the company having made an earlier call, they said.
Tatenda Musapatike, who previously worked on elections at Facebook and now runs a nonprofit voter registration organization, said that many have lost faith in the company’s ability to work with political campaigns. But the election commission proposal was “a good step,” she said, because “they’re doing something and they’re not saying we alone can handle it.”