More stories

  • Facebook Said to Consider Forming an Election Commission

    The social network has contacted academics to create a group to advise it on thorny election-related decisions, said people with knowledge of the matter.

    Facebook has approached academics and policy experts about forming a commission to advise it on global election-related matters, said five people with knowledge of the discussions, a move that would allow the social network to shift some of its political decision-making to an advisory body.

    The proposed commission could decide on matters such as the viability of political ads and what to do about election-related misinformation, said the people, who spoke on the condition of anonymity because the discussions were confidential. Facebook is expected to announce the commission this fall in preparation for the 2022 midterm elections, they said, though the effort is preliminary and could still fall apart.

    Outsourcing election matters to a panel of experts could help Facebook sidestep criticism of bias by political groups, two of the people said. The company has been blasted in recent years by conservatives, who have accused Facebook of suppressing their voices, as well as by civil rights groups and Democrats for allowing political misinformation to fester and spread online. Mark Zuckerberg, Facebook’s chief executive, does not want to be seen as the sole decision maker on political content, two of the people said.

    Facebook declined to comment.

    If an election commission is formed, it would emulate the step Facebook took in 2018 when it created what it calls the Oversight Board, a collection of journalism, legal and policy experts who adjudicate whether the company was correct to remove certain posts from its platforms. Facebook has pushed some content decisions to the Oversight Board for review, allowing it to show that it does not make determinations on its own.

    Facebook, which has positioned the Oversight Board as independent, appointed the people on the panel and pays them through a trust.

    The Oversight Board’s highest-profile decision was reviewing Facebook’s suspension of former President Donald J. Trump after the Jan. 6 storming of the U.S. Capitol. At the time, Facebook opted to ban Mr. Trump’s account indefinitely, a penalty that the Oversight Board later deemed “not appropriate” because the time frame was not based on any of the company’s rules. The board asked Facebook to try again.

    In June, Facebook responded by saying that it would bar Mr. Trump from the platform for at least two years. The Oversight Board has separately weighed in on more than a dozen other content cases that it calls “highly emblematic” of broader themes that Facebook grapples with regularly, including whether certain Covid-related posts should remain up on the network and hate speech issues in Myanmar.

    A spokesman for the Oversight Board declined to comment.

    Facebook has had a spotty track record on election-related issues, going back to Russian manipulation of the platform’s advertising and posts in the 2016 presidential election.

    Lawmakers and political ad buyers also criticized Facebook for changing the rules around political ads before the 2020 presidential election. Last year, the company said it would bar the purchase of new political ads the week before the election, then later decided to temporarily ban all U.S. political advertising after the polls closed on Election Day, causing an uproar among candidates and ad-buying firms.

    The company has struggled with how to handle lies and hate speech around elections. During his last year in office, Mr. Trump used Facebook to suggest he would use state violence against protesters in Minneapolis ahead of the 2020 election, while casting doubt on the electoral process as votes were tallied in November. Facebook initially said that what political leaders posted was newsworthy and should not be touched, before later reversing course.

    The social network has also faced difficulties in elections elsewhere, including the proliferation of targeted disinformation across its WhatsApp messaging service during the Brazilian presidential election in 2018. In 2019, Facebook removed hundreds of misleading pages and accounts associated with political parties in India ahead of the country’s national elections.

    Facebook has tried various methods to stem the criticisms. It established a political ads library to increase transparency around buyers of those promotions. It also has set up war rooms to monitor elections for disinformation to prevent interference.

    There are several elections in the coming year in countries such as Hungary, Germany, Brazil and the Philippines where Facebook’s actions will be closely scrutinized. Voter fraud misinformation has already begun spreading ahead of German elections in September. In the Philippines, Facebook has removed networks of fake accounts that support President Rodrigo Duterte, who used the social network to gain power in 2016.

    “There is already this perception that Facebook, an American social media company, is going in and tilting elections of other countries through its platform,” said Nathaniel Persily, a law professor at Stanford University. “Whatever decisions Facebook makes have global implications.”

    Internal conversations around an election commission date back to at least a few months ago, said three people with knowledge of the matter. An election commission would differ from the Oversight Board in one key way, the people said. While the Oversight Board waits for Facebook to remove a post or an account and then reviews that action, the election commission would proactively provide guidance without the company having made an earlier call, they said.

    Tatenda Musapatike, who previously worked on elections at Facebook and now runs a nonprofit voter registration organization, said that many have lost faith in the company’s abilities to work with political campaigns. But the election commission proposal was “a good step,” she said, because “they’re doing something and they’re not saying we alone can handle it.”

  • How Jason Miller Is Trying to Get Trump Back on the Internet

    Social media has felt quieter without the constant ALL CAPS fury of Donald Trump, but Jason Miller is trying to change that.

    Miller, who was the former president’s longtime aide and spokesman, recently took a new gig running a social media platform called Gettr, which claims to be a haven from censorship and cancel culture. It may sound a little like Parler 2.0, but the game-changer for Gettr — which has a little under two million users — would be if Miller can get Trump to create an account and get back online.

    [You can listen to this episode of “Sway” on Apple, Spotify, Google or wherever you get your podcasts.]

    In this conversation, Kara Swisher asks Miller how he intends to get Trump to log on. She challenges him on his claims that Twitter and Facebook are out to censor conservatives and presses him about how content moderation works on his platform. They also discuss the question on everyone’s mind: Is Trump likely to run again in 2024?

    (A full transcript of the episode will be available midday on the Times website.)

    Thoughts? Email us at sway@nytimes.com.

    “Sway” is produced by Nayeema Raza, Blakeney Schick, Matt Kwong, Daphne Chen and Caitlin O’Keefe, and edited by Nayeema Raza; fact-checking by Kate Sinclair, Michelle Harris and Kristin Lin; music and sound design by Isaac Jones; mixing by Carole Sabouraud and Sonia Herrero; audience strategy by Shannon Busta. Special thanks to Kristin Lin and Liriel Higa.

  • Trump, Covid and the Loneliness Breaking America

    I wasn’t planning on reading any of the new batch of Donald Trump books. His vampiric hold on the nation’s attention for five years was nightmarish enough; one of the small joys of the post-Trump era is that it’s become possible to ignore him for days at a time.

  • Here's a Look Inside Facebook's Data Wars

    Executives at the social network have clashed over CrowdTangle, a Facebook-owned data tool that revealed users’ high engagement levels with right-wing media sources.

    One day in April, the people behind CrowdTangle, a data analytics tool owned by Facebook, learned that transparency had limits.

    Brandon Silverman, CrowdTangle’s co-founder and chief executive, assembled dozens of employees on a video call to tell them that they were being broken up. CrowdTangle, which had been running quasi-independently inside Facebook since being acquired in 2016, was being moved under the social network’s integrity team, the group trying to rid the platform of misinformation and hate speech. Some CrowdTangle employees were being reassigned to other divisions, and Mr. Silverman would no longer be managing the team day to day.

    The announcement, which left CrowdTangle’s employees in stunned silence, was the result of a yearlong battle among Facebook executives over data transparency, and how much the social network should reveal about its inner workings.

    On one side were executives, including Mr. Silverman and Brian Boland, a Facebook vice president in charge of partnerships strategy, who argued that Facebook should publicly share as much information as possible about what happens on its platform — good, bad or ugly.

    On the other side were executives, including the company’s chief marketing officer and vice president of analytics, Alex Schultz, who worried that Facebook was already giving away too much.

    They argued that journalists and researchers were using CrowdTangle, a kind of turbocharged search engine that allows users to analyze Facebook trends and measure post performance, to dig up information they considered unhelpful — showing, for example, that right-wing commentators like Ben Shapiro and Dan Bongino were getting much more engagement on their Facebook pages than mainstream news outlets.

    These executives argued that Facebook should selectively disclose its own data in the form of carefully curated reports, rather than handing outsiders the tools to discover it themselves.

    Team Selective Disclosure won, and CrowdTangle and its supporters lost.

    An internal battle over data transparency might seem low on the list of worthy Facebook investigations. And it’s a column I’ve hesitated to write for months, in part because I’m uncomfortably close to the action. (More on that in a minute.)

    But the CrowdTangle story is important, because it illustrates the way that Facebook’s obsession with managing its reputation often gets in the way of its attempts to clean up its platform. And it gets to the heart of one of the central tensions confronting Facebook in the post-Trump era. The company, blamed for everything from election interference to vaccine hesitancy, badly wants to rebuild trust with a skeptical public. But the more it shares about what happens on its platform, the more it risks exposing uncomfortable truths that could further damage its image. The question of what to do about CrowdTangle has vexed some of Facebook’s top executives for months, according to interviews with more than a dozen current and former Facebook employees, as well as internal emails and posts.

    These people, most of whom would speak only anonymously because they were not authorized to discuss internal conversations, said Facebook’s executives were more worried about fixing the perception that Facebook was amplifying harmful content than figuring out whether it actually was amplifying harmful content. Transparency, they said, ultimately took a back seat to image management.

    Facebook disputes this characterization. It says that the CrowdTangle reorganization was meant to integrate the service with its other transparency tools, not weaken it, and that top executives are still committed to increasing transparency.

    “CrowdTangle is part of a growing suite of transparency resources we’ve made available for people, including academics and journalists,” said Joe Osborne, a Facebook spokesman. “With CrowdTangle moving into our integrity team, we’re developing a more comprehensive strategy for how we build on some of these transparency efforts moving forward.”

    But the executives who pushed hardest for transparency appear to have been sidelined. Mr. Silverman, CrowdTangle’s co-founder and chief executive, has been taking time off and no longer has a clearly defined role at the company, several people with knowledge of the situation said. (Mr. Silverman declined to comment about his status.) And Mr. Boland, who spent 11 years at Facebook, left the company in November.

    “One of the main reasons that I left Facebook is that the most senior leadership in the company does not want to invest in understanding the impact of its core products,” Mr. Boland said, in his first interview since departing. “And it doesn’t want to make the data available for others to do the hard work and hold them accountable.”

    Mr. Boland, who oversaw CrowdTangle as well as other Facebook transparency efforts, said the tool fell out of favor with influential Facebook executives around the time of last year’s presidential election, when journalists and researchers used it to show that pro-Trump commentators were spreading misinformation and hyperpartisan commentary with stunning success.

    “People were enthusiastic about the transparency CrowdTangle provided until it became a problem and created press cycles Facebook didn’t like,” he said. “Then, the tone at the executive level changed.”

    The Twitter Account That Launched 1,000 Meetings

    Here’s where I, somewhat reluctantly, come in.

    I started using CrowdTangle a few years ago. I’d been looking for a way to see which news stories gained the most traction on Facebook, and CrowdTangle — a tool used mainly by audience teams at news publishers and marketers who want to track the performance of their posts — filled the bill. I figured out that through a kludgey workaround, I could use its search feature to rank Facebook link posts — that is, posts that include a link to a non-Facebook site — in order of the number of reactions, shares and comments they got. Link posts weren’t a perfect proxy for news, engagement wasn’t a perfect proxy for popularity and CrowdTangle’s data was limited in other ways, but it was the closest I’d come to finding a kind of cross-Facebook news leaderboard, so I ran with it.

    At first, Facebook was happy that I and other journalists were finding its tool useful. With only about 25,000 users, CrowdTangle is one of Facebook’s smallest products, but it has become a valuable resource for power users including global health organizations, election officials and digital marketers, and it has made Facebook look transparent compared with rival platforms like YouTube and TikTok, which don’t release nearly as much data.

    But the mood shifted last year when I started a Twitter account called @FacebooksTop10, on which I posted a daily leaderboard showing the sources of the most-engaged link posts by U.S. pages, based on CrowdTangle data.

    Last fall, the leaderboard was full of posts by Mr. Trump and pro-Trump media personalities. Since Mr. Trump was barred from Facebook in January, it has been dominated by a handful of right-wing polemicists like Mr. Shapiro, Mr. Bongino and Sean Hannity, with the occasional mainstream news article, cute animal story or K-pop fan blog sprinkled in.

    The account went semi-viral, racking up more than 35,000 followers. Thousands of people retweeted the lists, including conservatives who were happy to see pro-Trump pundits beating the mainstream media and liberals who shared them with jokes like “Look at all this conservative censorship!” (If you’ve been under a rock for the past two years, conservatives in the United States frequently complain that Facebook is censoring them.)

    The lists also attracted plenty of Facebook haters. Liberals shared them as evidence that the company was a swamp of toxicity that needed to be broken up; progressive advertisers bristled at the idea that their content was appearing next to pro-Trump propaganda. The account was even cited at a congressional hearing on tech and antitrust by Representative Jamie Raskin, Democrat of Maryland, who said it proved that “if Facebook is out there trying to suppress conservative speech, they’re doing a terrible job at it.”

    Inside Facebook, the account drove executives crazy. Some believed that the data was being misconstrued and worried that it was painting Facebook as a far-right echo chamber. Others worried that the lists might spook investors by suggesting that Facebook’s U.S. user base was getting older and more conservative. Every time a tweet went viral, I got grumpy calls from Facebook executives who were embarrassed by the disparity between what they thought Facebook was — a clean, well-lit public square where civility and tolerance reign — and the image they saw reflected in the Twitter lists.

    As the election approached last year, Facebook executives held meetings to figure out what to do, according to three people who attended them. They set out to determine whether the information on @FacebooksTop10 was accurate (it was), and discussed starting a competing Twitter account that would post more balanced lists based on Facebook’s internal data.

    They never did that, but several executives — including John Hegeman, the head of Facebook’s news feed — were dispatched to argue with me on Twitter. These executives argued that my Top 10 lists were misleading. They said CrowdTangle measured only “engagement,” while the true measure of Facebook popularity would be based on “reach,” or the number of people who actually see a given post. (With the exception of video views, reach data isn’t public, and only Facebook employees have access to it.)

    Last September, Mark Zuckerberg, Facebook’s chief executive, told Axios that while right-wing content garnered a lot of engagement, the idea that Facebook was a right-wing echo chamber was “just wrong.”

    “I think it’s important to differentiate that from, broadly, what people are seeing and reading and learning about on our service,” Mr. Zuckerberg said.

    But Mr. Boland, the former Facebook vice president, said that was a convenient deflection. He said that in internal discussions, Facebook executives were less concerned about the accuracy of the data than about the image of Facebook it presented.

    “It told a story they didn’t like,” he said of the Twitter account, “and frankly didn’t want to admit was true.”

    The Trouble With CrowdTangle

    Around the same time that Mr. Zuckerberg made his comments to Axios, the tensions came to a head. The Economist had just published an article claiming that Facebook “offers a distorted view of American news.”

    The article, which cited CrowdTangle data, showed that the most-engaged American news sites on Facebook were Fox News and Breitbart, and claimed that Facebook’s overall news ecosystem skewed right wing. John Pinette, Facebook’s vice president of global communications, emailed a link to the article to a group of executives with the subject line “The trouble with CrowdTangle.”

    “The Economist steps onto the Kevin Roose bandwagon,” Mr. Pinette wrote. (See? Told you it was uncomfortably close to home.)

    Nick Clegg, Facebook’s vice president of global affairs, replied, lamenting that “our own tools are helping journos to consolidate the wrong narrative.”

    Other executives chimed in, adding their worries that CrowdTangle data was being used to paint Facebook as a right-wing echo chamber.

    David Ginsberg, Facebook’s vice president of choice and competition, wrote that if Mr. Trump won re-election in November, “the media and our critics will quickly point to this ‘echo chamber’ as a prime driver of the outcome.”

    Fidji Simo, the head of the Facebook app at the time, agreed.

    “I really worry that this could be one of the worst narratives for us,” she wrote.

    Several executives proposed making reach data public on CrowdTangle, in hopes that reporters would cite that data instead of the engagement data they thought made Facebook look bad.

    But Mr. Silverman, CrowdTangle’s chief executive, replied in an email that the CrowdTangle team had already tested a feature to do that and found problems with it. One issue was that false and misleading news stories also rose to the top of those lists.

    “Reach leaderboard isn’t a total win from a comms point of view,” Mr. Silverman wrote.

    Mr. Schultz, Facebook’s chief marketing officer, had the dimmest view of CrowdTangle. He wrote that he thought “the only way to avoid stories like this” would be for Facebook to publish its own reports about the most popular content on its platform, rather than releasing data through CrowdTangle.

    “If we go down the route of just offering more self-service data you will get different, exciting, negative stories in my opinion,” he wrote.

    Mr. Osborne, the Facebook spokesman, said Mr. Schultz and the other executives were discussing how to correct misrepresentations of CrowdTangle data, not strategizing about killing off the tool.

    A few days after the election in November, Mr. Schultz wrote a post for the company blog, called “What Do People Actually See on Facebook in the U.S.?” He explained that if you ranked Facebook posts based on which got the most reach, rather than the most engagement — his preferred method of slicing the data — you’d end up with a more mainstream, less sharply partisan list of sources.

    “We believe this paints a more complete picture than the CrowdTangle data alone,” he wrote.

    That may be true, but there’s a problem with reach data: Most of it is inaccessible and can’t be vetted or fact-checked by outsiders. We simply have to trust that Facebook’s own, private data tells a story that’s very different from the data it shares with the public.

    Tweaking Variables

    Mr. Zuckerberg is right about one thing: Facebook is not a giant right-wing echo chamber.

    But it does contain a giant right-wing echo chamber — a kind of AM talk radio built into the heart of Facebook’s news ecosystem, with a hyper-engaged audience of loyal partisans who love liking, sharing and clicking on posts from right-wing pages, many of which have gotten good at serving up Facebook-optimized outrage bait at a consistent clip.

    CrowdTangle’s data made this echo chamber easier for outsiders to see and quantify. But it didn’t create it, or give it the tools it needed to grow — Facebook did — and blaming a data tool for these revelations makes no more sense than blaming a thermometer for bad weather.

    It’s worth noting that these transparency efforts are voluntary, and could disappear at any time. There are no regulations that require Facebook or any other social media companies to reveal what content performs well on their platforms, and American politicians appear to be more interested in fighting over claims of censorship than getting access to better data.

    It’s also worth noting that Facebook can turn down the outrage dials and show its users calmer, less divisive news any time it wants. (In fact, it briefly did so after the 2020 election, when it worried that election-related misinformation could spiral into mass violence.) And there is some evidence that it is at least considering more permanent changes.

    This year, Mr. Hegeman, the executive in charge of Facebook’s news feed, asked a team to figure out how tweaking certain variables in the core news feed ranking algorithm would change the resulting Top 10 lists, according to two people with knowledge of the project.

    The project, which some employees refer to as the “Top 10” project, is still underway, the people said, and it’s unclear whether its findings have been put in place. Mr. Osborne, the Facebook spokesman, said that the team looks at a variety of ranking changes, and that the experiment wasn’t driven by a desire to change the Top 10 lists.

    As for CrowdTangle, the tool is still available, and Facebook is not expected to cut off access to journalists and researchers in the short term, according to two people with knowledge of the company’s plans.

    Mr. Boland, however, said he wouldn’t be surprised if Facebook executives decided to kill off CrowdTangle entirely or starve it of resources, rather than dealing with the headaches its data creates.

    “Facebook would love full transparency if there was a guarantee of positive stories and outcomes,” Mr. Boland said. “But when transparency creates uncomfortable moments, their reaction is often to shut down the transparency.”

  • The Constitution of Knowledge review: defending truth from Trump

    Jonathan Rauch is among America’s more thoughtful and rigorously honest public intellectuals. In his new book, he addresses the rise of disinformation and its pernicious effects on democratic culture.

    Through an analogy to the US constitution, he posits that the “values and rules and institutions” of “liberal science” effectively serve as “a governing structure, forcing social contestation onto peaceful and productive pathways. And so I call them, collectively, the Constitution of Knowledge.”

    What he calls the “reality based community [is] the social network which adheres to liberal science’s rules and norms … objectivity, factuality, rationality: they live not just within individuals’ minds and practices but on the network”. This community includes not only the hard sciences but also such fields as scholarship, journalism, government and law, in a “marketplace of persuasion” driven by pursuit of truth under clear standards of objectivity.

    Rauch puts the Trump era at the heart of the challenge, as Trump felt no “accountability to truth”, telling reporter Lesley Stahl that he attacked the press to “demean you all, so when you write negative stories about me, no one will believe you”.

    To Rauch, “Trump and his media echo chambers [lied] because their goal was to denude the public’s capacity to make any distinctions.” Thus “every truth was met not just with denial but with inversion … to convey … that the leader was the supreme authority”.

    The result is a crisis of democracy. As Senator Ben Sasse warned, “A republic will not work if we don’t have shared facts.” What emerges, in Rauch’s term, as “epistemic tribalism” effectively denies “the concept of objective knowledge [which] is inherently social”.

    There is much here and the diagnosis is superb, with clear explanations of how and why disinformation spreads. Rauch finds glimmers of hope and positive change, as digital media act “more like publishers … crafting epistemic standards and norms”. Solutions, though, involve self-regulation rather than government action. Rauch cites Twitter’s Jack Dorsey in noting “that the battle against misinformation and abusive online behavior would be won more by product design than by policy design”.

    True, yet Rauch admits there are “no comprehensive solutions to the disinformation threat”, instead hoping for reactions that will promote “something like a stronger immune system … less vulnerable”.

    Rauch is a “radical incrementalist”. Hearty praise for John Stuart Mill makes clear that he seeks solutions principally from within classical liberalism, which the analogy to a social network reinforces: individuals working as a community, not a collective. Thus he shies from using government to enforce adherence to the Constitution of Knowledge.

    “Cause for alarm, yes,” Rauch writes. “Cause for fatalism – no.”

    Surely there is a third perspective. Referring to a director of the National Institutes of Health known for rigorous science and deep faith, Rauch states that “a person who applied the Constitution of Knowledge to every daily situation would be Sherlock Holmes or Mr Spock: an otherworldly fictional character. In fact, when I compare Francis Collins’s worldview with my own, I think mine is the more impoverished. He has access to two epistemic realms; I, only one.”

    It shows Rauch’s generosity of spirit and intellectual integrity that he recognizes the validity and worth of other epistemic realms. They may, in fact, be a clue to solving the broader problem.

    Does the constitutional order contain sufficient self-correcting mechanisms? Rauch’s response, in a forceful and heartfelt final chapter, is to renew engagement in defence of truth. This is right so far as it goes. Waving the white flag, or silence (as Mill, not Burke, noted) enables and ensures defeat in the face of attacks on the concept of truth. It’s good that “Wikipedia figured out how to bring the Constitution of Knowledge online”, but that only works with a presumed universal acceptance of truth, a challenge in an era where a 2018 MIT study found falsehoods were 70% more likely to be retweeted than truths.

    One may heartily agree with Solzhenitsyn (whom Rauch quotes) that “one word of truth outweighs the world” and yet note with horror (as Rauch does) that a few powerful algorithms can overwhelm it in the heat of a tech-amplified campaign.

    Does individual action mean fervent defence in response to every inaccurate social media post? How to judge? (The individual rational response is generally to ignore the false post, hence the collective action issue.) And would that defence guarantee success when disinformation has destroyed trust in institutions and in the concept of truth itself?

    Rauch’s optimism is infectious but it may fall short. The reality-based community seems no longer to have a hold on the common mind, weakening its power in the face of organized and mechanical opposition.

    Shorn of an appeal to what Lincoln termed the “mystic chords of memory” – itself not subject to empirical verification – how does the reality-based community avoid consignment to the margins? If a pandemic hasn’t convinced many people of the truth of science, what will?

    Calling for “more truth” may not be enough when people don’t want to know the truth or cannot tell what it is. If “Trump was waging warfare against the American body politic”, why should that body not respond collectively? Lincoln’s appeal – a strong use of political savvy and rhetoric to call his hearers to something beyond ourselves – can help.

    That is Rauch’s challenge – and ours. One may agree with him about the progress of science and support its extension to fields of social and political science. But the very urgency of the situation demands wise prediction on whether that will be sufficient. Rauch states his case as well as possible, but repairing the breaches in the body politic may require more than he is willing to endorse.

    Rauch begins with Socrates (“the sense of wonder is the mark of the philosopher”) and describes continued debate towards truth in Socrates’ words: “Let us meet here again.”

    Indeed, and with Rauch, we will. In the meanwhile, less Mill, more Lincoln.

  • A leaked S&M video won’t keep Zack Weiner out of politics – and nor should it | Arwa Mahdawi

    You have to be something of a masochist to want to get into politics – and Zack Weiner is an unapologetic masochist. Last week, the 26-year-old, who is running for a place on the city council in New York, was something of a nonentity: he had zero name recognition and his campaign had raised just over $10,000 (£7,200), most of which he had donated himself.

    Perhaps the most notable thing about Weiner was the fact his dad is the co-creator of the kids’ TV show Dora the Explorer. But that changed when a video of a man engaged in consensual sadomasochism was posted on Twitter by an anonymous account that claimed the man was Weiner. On Saturday, the New York Post ran a story about the video, complete with salacious screengrab. Pretty soon it made international headlines.

    Why would anyone care about the sex life of an unknown twentysomething running for local office? Well, because a lot of people are pervs, for one thing. But the main reason the story has become so popular is because of how Weiner responded. Instead of going on the defensive, he owned it. His own campaign manager was the one who tipped off the New York Post about the video and Weiner told the paper that he is a “proud BDSMer”, who has nothing to be ashamed of.

    “Whoops. I didn’t want anyone to see that, but here we are,” Weiner later wrote on Twitter. “Like many young people, I have grown into a world where some of our most private moments have been documented online. While a few loud voices on Twitter might chastise me for the video, most people see the video for what it is: a distraction.”

    The frank and dignified way in which Weiner handled this episode has, quite rightly, earned him a lot of praise. It is, in many ways, a masterclass in how to respond to revenge porn.

    There was some speculation that the video was a publicity stunt. Releasing a sex tape of yourself in order to kickstart a political career might once have been unthinkable, but in today’s attention economy it is all too plausible. Donald Trump taught the world that any idiot can get into politics as long as you find a way to keep your name in the headlines.

    Then there’s the fact that Weiner’s response to the video is almost identical to a plotline from the TV show Billions. “I’m a masochist,” the character Chuck Rhoades announces in a press conference after a political rival threatens to leak pictures of him enjoying sadomasochistic sex in an attempt to derail his campaign for state attorney general for New York. Rhoades’s speech is a huge success: he goes on to win the election.

    So is it possible that Weiner’s campaign, inspired by Billions, might have leaked the video itself? Absolutely not, Joseph Gallagher, Weiner’s campaign manager, told me. He added, for good measure, that neither he nor Weiner, who is also an actor and screenwriter, had ever watched the TV show. The reason he flagged the video to the Post, he clarified, was in order to control the narrative and get ahead of the story. Which makes sense.

    Ultimately, what’s important is the fact that, as Weiner pointed out, a generation of young people who have documented every part of their lives are starting to enter politics. Revenge porn, which has already helped derail the political career of the former congresswoman Katie Hill, is going to become a common political weapon. And I suspect female politicians will have a far harder time surviving the weaponisation of their personal lives than men.

  • Iran Activists Urge Election Boycott. Raisi Likely Winner.

    In a soft, pleading voice, the white-haired woman in the video implores, “For the sake of my son, Pouya Bakhtiari, don’t vote.” She holds the young man’s photo, and continues, “Because of the bullet they shot at his head and shattered his dreams, don’t vote.” In a second video, another mother, sitting next to a gravestone, echoed the same message: “At 30, my son lies under a huge pile of dirt.” A third woman described her 18-year-old son as full of hope, until Nov. 17, 2019, when a bullet pierced his heart.

    “Voting means betrayal,” she added.

    Videos like these, circulating on Iranian activists’ social media accounts with the hashtag that in Persian means #notovoting, have been appearing in increasing numbers in the weeks and months leading up to Iran’s presidential election on Friday. Some of the videos have been made by parents who say their children were shot dead during antigovernment protests over the last few years. Others are by the parents of political prisoners who were executed by the regime in the 1980s, as well as by the families of those who died in the Ukrainian passenger plane that crashed last year shortly after takeoff from Tehran. (Iran’s military said it mistakenly shot down the plane.)

    What’s remarkable about the videos is their audacity: that Iranians are speaking up, seemingly without fear, about boycotting an election in an authoritarian country whose leaders rarely tolerate open displays of dissent. Iranians have had enough. And besides, what’s the point of voting when the result is predetermined?

    The call for an election boycott seems to be resonating: a recent poll by the state-run Iranian Student Polling Agency predicts that turnout will be as low as 40 percent — the lowest since the 1979 revolution.

    A low turnout in Friday’s election would certainly signal a rejection of the Islamic regime. But not voting will also give the regime exactly what it wants: a near-certain assurance that its handpicked candidate, Ebrahim Raisi, a cleric who is close to Iran’s Islamic Revolutionary Guards Corps, will win.

    Of course, the regime has done its part as well for Mr. Raisi. Last month, the Guardian Council, the body that vets election candidates, rejected all the potential candidates except for Mr. Raisi and six relatively unknown figures.

    Even insiders to the regime were reportedly stunned that the council had gone so far as to bar a current vice president, Eshaq Jahangiri, and a former speaker of Parliament, Ali Larijani.

    To be sure, the Guardian Council has rejected other presidential hopefuls over the past four decades. But this time it’s especially significant because the supreme leader, Ayatollah Ali Khamenei, is 82, which raises the issue of succession. Hard-liners within the Revolutionary Guards are grooming Mr. Raisi to take his place, making his election into office even more important. The ayatollah’s support for Mr. Raisi is no secret. After Mr. Raisi lost a bid for the presidency in 2017, Ayatollah Khamenei made him head of the judiciary two years later.

    The tightly controlled process has led many Iranians to question the entire exercise. And institutions such as the Guardian Council, which is controlled by Ayatollah Khamenei, have stymied any democratic change and crippled the efforts of presidents who have tried to introduce political and social freedoms. (Two presidential candidates during the 2009 race, Mehdi Karroubi and the former Prime Minister Mir Hossein Mousavi, who campaigned on a platform of delivering democratic reforms, remain under house arrest. The regime at the time suppressed massive protests in the aftermath of what was seen as a widely disputed election.)

    The campaign to boycott the election highlights the rising levels of both anger and apathy toward the regime, at a time when the economy has been suffering under the weight of U.S. sanctions, as well as mismanagement and corruption by Iranian officials. The government also badly botched the Covid-19 pandemic, leaving more than 82,000 dead so far. In addition, the regime has brutally cracked down on protests that have erupted since 2009, mostly over worsening economic conditions.

    Those boycotting the vote include a wide group of people inside and outside Iran, including many who were formerly sympathetic to the regime, such as the former President Mahmoud Ahmadinejad, Mr. Mousavi and Faezeh Hashemi, the daughter of former President Akbar Hashemi Rafsanjani. Last month, over 230 prominent activists signed an open letter calling for an election boycott and stated that their goal is to bring “nonviolent transition from the Islamic Republic to the rule of the people.”

    Unsurprisingly, Ayatollah Khamenei has branded those pushing for a boycott as enemies and has urged Iranians to go to the polls. Here lies the regime’s dilemma: Iran’s leadership wants just enough turnout to legitimize Mr. Raisi’s victory, but not so much that the result might demonstrate how unpopular he really is.

    During his campaign trips in recent weeks, Mr. Raisi has sought to cast himself as a man of the people and has promised to fight corruption. He talked to people who approached him about pending court cases, depicting himself as an accessible man. But his past as head of the judiciary is testament to what may lie ahead under his rule. Young activists were tried behind closed doors and executed. As a young cleric, he signed off on the executions of thousands of political prisoners in 1988.

    Boycotting the elections, for a population that is deeply scarred, is understandable. But sadly, a boycott this time may cement the hard-liners’ grip on power for many years to come.

    Nazila Fathi is the author of “The Lonely War: One Woman’s Account of the Struggle for Modern Iran.” She is a fellow at the Washington-based Middle East Institute. She lives in Maryland.