More stories

  •

    Jeffrey Katzenberg Talks About His Billion-Dollar Flop

    The public failure of his start-up Quibi hasn’t stopped Jeffrey Katzenberg from doubling down on tech. A Hollywood power broker, he headed up Disney in the 1980s and ’90s and co-founded a rival studio, DreamWorks, before finding a puzzle he could not yet solve: getting people to pay for short-format content. Investors gave him and the former Hewlett-Packard C.E.O. and California gubernatorial candidate Meg Whitman $1.75 billion to build a video platform, but at $4.99 a month, too few customers opened their wallets, and Quibi folded within a year of its launch. Katzenberg says the problems were product-market fit and the Covid pandemic, not competition from TikTok or YouTube.

    [You can listen to this episode of “Sway” on Apple, Spotify, Google or wherever you get your podcasts.]

    In this conversation, Kara Swisher and Katzenberg delve into Quibi’s demise, the shifting power dynamics in Hollywood and his pivot to Silicon Valley. They also discuss his influence in another sphere: politics. And the former Hollywood executive, who co-chaired a fund-raiser to help fend off California’s recent recall effort, offers some advice to Gov. Gavin Newsom.

    (A full transcript of the episode will be available midday on the Times website.)

    Photograph by WndrCo

    Thoughts? Email us at sway@nytimes.com.

    “Sway” is produced by Nayeema Raza, Blakeney Schick, Matt Kwong, Daphne Chen and Caitlin O’Keefe and edited by Nayeema Raza; fact-checking by Kate Sinclair; music and sound design by Isaac Jones; mixing by Carole Sabouraud and Sonia Herrero; audience strategy by Shannon Busta. Special thanks to Kristin Lin and Liriel Higa.

  •

    YouTube Suspends Trump’s Channel for at Least Seven Days

    YouTube is the latest tech company to bar the president from posting online, following Twitter, Facebook and others.

    YouTube headquarters in San Bruno, Calif. Credit…Jim Wilson/The New York Times

  •

    From Voter Fraud to Vaccine Lies: Misinformation Peddlers Shift Gears

    Election-related falsehoods have subsided, but misleading claims about the coronavirus vaccines are surging — often spread by the same people.

    Sidney Powell, who was a member of President Trump’s legal team, on Capitol Hill last month. She has started posting inaccurate claims about the coronavirus vaccines online. Credit…Jonathan Ernst/Reuters

    Davey Alba and Dec. 16, 2020, 5:00 a.m. ET

    Sidney Powell, a lawyer who was part of President Trump’s legal team, spread a conspiracy theory last month about election fraud. For days, she claimed that she would “release the Kraken” by showing voluminous evidence that Mr. Trump had won the election by a landslide.

    But after her assertions were widely derided and failed to gain legal traction, Ms. Powell started talking about a new topic. On Dec. 4, she posted a link on Twitter with misinformation that said that the population would be split into the vaccinated and the unvaccinated and that “big government” could surveil those who were unvaccinated.

    “NO WAY #America,” Ms. Powell wrote in the tweet, which collected 22,600 shares and 51,000 likes. “This is more authoritarian communist control imported straight from #China.” She then tagged Mr. Trump and the former national security adviser Michael T. Flynn — both of whom she had represented — and other prominent right-wing figures to highlight the post.

    Ms. Powell’s changing tune was part of a broader shift in online misinformation. As Mr. Trump’s challenges to the election’s results have been knocked down and the Electoral College has affirmed President-elect Joseph R. Biden Jr.’s win, voter fraud misinformation has subsided. Instead, peddlers of online falsehoods are ramping up lies about the Covid-19 vaccines, which were administered to Americans for the first time this week.

    Apart from Ms. Powell, others who have spread political misinformation, such as Rep. Marjorie Taylor Greene, a Republican of Georgia, as well as far-right websites like ZeroHedge, have begun pushing false vaccine narratives, researchers said. Their efforts have been amplified by a robust network of anti-vaccination activists like Robert F. Kennedy Jr. on platforms including Facebook, YouTube and Twitter.

    Among their misleading notions is the idea that the vaccines are delivered with a microchip or bar code to keep track of people, as well as a lie that the vaccines will hurt everyone’s health (the vaccines from Pfizer and Moderna have been proved to be more than 94 percent effective in trials, with minimal side effects). Falsehoods about Bill Gates, the Microsoft co-founder and philanthropist who supports vaccines, have also increased, with rumors that he is responsible for the coronavirus and that he stands to profit from a vaccine, according to data from the media insights company Zignal Labs.

    The shift shows how political misinformation purveyors are hopping from topic to topic to maintain attention and influence, said Melissa Ryan, chief executive of Card Strategies, a consulting firm that researches disinformation.

    It is “an easy pivot,” she said. “Disinformation about vaccines and the pandemic have long been staples of the pro-Trump disinformation playbook.”

    The change has been particularly evident over the last six weeks. Election misinformation peaked on Nov. 4 at 375,000 mentions across cable television, social media, print and online news outlets, according to an analysis by Zignal. By Dec. 3, that had fallen to 60,000 mentions. But coronavirus misinformation steadily increased over that period, rising to 46,100 mentions on Dec. 3, from 17,900 mentions on Nov. 8.

    NewsGuard, a start-up that fights false stories, said that of the 145 websites in its Election Misinformation Tracking Center, a database of sites that publish false election information, 60 percent have also published misinformation about the coronavirus pandemic. That includes right-wing outlets such as Breitbart, Newsmax and One America News Network, which distributed inaccurate articles about the election and are now also running misleading articles about the vaccines.

    John Gregory, the deputy health editor for NewsGuard, said the shift was not to be taken lightly because false information about vaccines leads to real-world harm. In Britain in the early 2000s, he said, a baseless link between the measles vaccine and autism spooked people into not taking that vaccine. That led to deaths and serious permanent injuries, he said.

    “Misinformation creates fear and uncertainty around the vaccine and can reduce the number of people willing to take it,” said Carl Bergstrom, a University of Washington evolutionary biologist who has been tracking the pandemic.

    Dr. Shira Doron, an epidemiologist at Tufts Medical Center, said the consequences of people not taking the Covid-19 vaccines because of misinformation would be catastrophic. The vaccines are “the key piece to ending the pandemic,” she said. “We are not getting there any other way.”

    Ms. Powell did not respond to a request for comment.

    To deal with vaccine misinformation, Facebook, Twitter, YouTube and other social media sites have expanded their policies to fact-check and demote such posts. Facebook and YouTube said they would remove false claims about the vaccines, while Twitter said it pointed people to credible public health sources.

    The flow of vaccine falsehoods began rising in recent weeks as it became clear that the coronavirus vaccines would soon be approved and available. Misinformation spreaders glommed onto interviews by health experts and began twisting them.

    On Dec. 3, for example, Dr. Kelly Moore, the associate director for immunization education at the nonprofit Immunization Action Coalition, said in an interview with CNN that when people receive the vaccine, “everyone will be issued a written card” that would “tell them what vaccine they had and when their next dose is due.”

    Dr. Moore was referring to a standard appointment reminder card that could also be used as a backup vaccine record. But skeptics quickly started saying online that the card was evidence that the U.S. government intended to surveil the population and limit the activities of people who were unvaccinated.

    That unfounded idea was further fueled by people like Ms. Powell and her Dec. 4 tweet. Her post pushed the narrative to 47,025 misinformation mentions that week, according to Zignal, making it the No. 1 vaccine misinformation story at the time.

    To give more credence to the idea, Ms. Powell also appended a link to an article from ZeroHedge, which claimed that immunity cards would “enable CDC to track Covid-19 vaxx status in database.” On Facebook, that article was liked and commented on 24,600 times, according to data from CrowdTangle, a Facebook-owned social media analytics tool. It also reached up to one million people.

    ZeroHedge did not respond to a request for comment.

    In an interview, Dr. Moore said she could not believe how her words had been distorted to seem as if she was supporting surveillance and restrictions on unvaccinated members of the public. “In fact, I was simply describing an ordinary appointment reminder card,” she said. “This is an old-school practice that goes on around the world.”

    Angela Stanton-King, a Republican candidate for Congress in Georgia, in Atlanta last month. Credit…Megan Varner/Getty Images

    Other supporters of Mr. Trump who said the election had been stolen from him also began posting vaccine falsehoods. One was Angela Stanton-King, a former Republican candidate for Congress from Georgia and a former reality TV star. On Dec. 5, she tweeted that her father would be forced to take the coronavirus vaccine, even though in reality the government has not made it mandatory.

    “My 78 yr old father tested positive for COVID before Thanksgiving he was told to go home and quarantine with no prescribed medication,” Ms. Stanton-King wrote in her tweet, which was liked and shared 13,200 times. “He had zero symptoms and is perfectly fine. Help me understand why we need a mandatory vaccine for a virus that heals itself…”

    Ms. Stanton-King declined to comment.

    Anti-vaccination activists have also jumped in. When two people in Britain had an adverse reaction to Pfizer’s Covid-19 vaccine this month, Mr. Kennedy, a son of former Senator Robert F. Kennedy who campaigns against vaccines as chairman of the anti-vaccination group Children’s Health Defense, pushed the unproven notion on Facebook that ingredients in the vaccine led to the reactions. He stripped out the context that such reactions are usually very rare and that it is not yet known whether the vaccines caused them.

    His Facebook post was shared 556 times and reached nearly a million people, according to CrowdTangle data. In an email, Mr. Kennedy said the Food and Drug Administration should “require pre-screening” of vaccine recipients and “monitor allergic and autoimmune reactions,” without acknowledging that regulators have already said they would do so.

    Ms. Ryan, the disinformation researcher, said that as long as there were loopholes for misinformation to stay up on social media platforms, purveyors would continue pushing falsehoods about the news topic of the day. It could be QAnon today, the election tomorrow, Covid-19 vaccines after that, she said.

    “They need to stay relevant,” she said. “Without Trump, they’re going to need new hobbies.”

  •

    YouTube to Forbid Videos Claiming Widespread Election Fraud

    Dec. 9, 2020, 12:25 p.m. ET

    YouTube’s announcement is a reversal of a much-criticized company policy on election videos. Credit…Dado Ruvic/Reuters

    YouTube on Wednesday announced changes to how it handles videos about the 2020 presidential election, saying it would remove new videos that mislead people by claiming that widespread fraud or errors influenced the outcome of the election.

    The company said it was making the change because Tuesday was the so-called safe harbor deadline — the date by which all state-level election challenges, such as recounts and audits, are supposed to be completed. YouTube said that enough states have certified their election results to determine that Joseph R. Biden Jr. is the president-elect.

    YouTube’s announcement is a reversal of a much-criticized company policy on election videos. Throughout the election cycle, YouTube, which is owned by Google, has allowed videos spreading false claims of widespread election fraud under a policy that permits videos that comment on the outcome of an election. Under the new policy, videos about the election uploaded before the safe harbor deadline would remain on the platform, with YouTube appending an information panel linking to the Office of the Federal Register’s election results certification notice.

    In a blog post on Wednesday, YouTube pushed back on the idea that it had allowed harmful and misleading election-related videos to spread unfettered on its site. The company said that since September, it had shut down over 8,000 channels and “thousands” of election videos that violated its policies. Since Election Day, the company said, it had also shown fact-check panels over 200,000 times above relevant election-related search results on voter fraud narratives such as “Dominion voting machines” and “Michigan recount.”

  •

    YouTube, under pressure over election falsehoods, suspends OAN for Covid-19 misinformation.

    YouTube suspended One America News Network, one of the right-wing channels aggressively pushing false claims about widespread election fraud, for violating its policies on misinformation.

    But the misinformation that got OAN in trouble on Tuesday had nothing to do with the election. YouTube removed a video that violated its policies against content claiming that there is a guaranteed cure for Covid-19. YouTube said it issued a strike against the channel as part of its three-strike policy. That meant OAN was not permitted to upload new videos or livestream on the platform for one week.

    The move came on the same day that a group of Democratic senators urged YouTube to reverse its policy of allowing videos containing election outcome misinformation and pushed the company to adopt more aggressive steps to curb the spread of false content and manipulated media ahead of crucial runoff elections for Georgia’s two Senate seats in January.

    In the weeks after the election, OAN published articles challenging the integrity of the vote and pushing President Trump’s false claims that he won the election.

    YouTube has said OAN is not an authoritative news source and stripped advertising from a few of its videos for undermining confidence in elections with “demonstrably false” information. However, the videos remained available on the platform, helping OAN to gain share among right-wing channels.

    In addition to the one-week suspension, YouTube said it kicked OAN out of a program that allows partner channels to generate advertising revenue from videos, for repeated violations of its Covid-19 misinformation policy and other infractions. One America News’s YouTube channel will remain up during the suspension.

    OAN representatives could not immediately be reached for comment on Tuesday.

    YouTube, which is owned by Google, has come under criticism for allowing videos spreading false claims of widespread election fraud under a policy that permits videos that comment on the outcome of an election.

    “Like other companies, we allow discussions of this election’s results and the process of counting votes, and are continuing to closely monitor new developments,” Ivy Choi, a YouTube spokeswoman, said in a statement. “Our teams are working around the clock to quickly remove content that violates our policies and ensure that we are connecting people with authoritative information about elections.”

    YouTube said it had surfaced videos from what it deemed to be authoritative news sources in search results and recommendations, while affixing a label to videos discussing election results. That label states that The Associated Press has called the election for Joseph R. Biden Jr., with a link to a results page on Google.

    In a letter sent Tuesday to Susan Wojcicki, YouTube’s chief executive, four Democratic senators — Robert Menendez of New Jersey, Mazie Hirono of Hawaii, Gary Peters of Michigan and Amy Klobuchar of Minnesota — said they had “deep concern with the proliferation of misinformation” on the platform. The letter pointed to how one YouTube video with the baseless claim of voter fraud in Michigan had five million views.

    “These videos seek to undermine our democracy and cast doubt on the legitimacy of President-elect Biden’s incoming administration,” the senators wrote. “Moreover, because the current president has not committed to a peaceful transition of power, misinformation and manipulated media content on your platform may fuel civil unrest.”

    The senators also expressed concern about the runoff elections for the two Georgia Senate seats, because those races will garner “significant national interest.” In a series of questions to Ms. Wojcicki, the senators asked if YouTube would commit to removing false or misleading information about the 2020 election and the Georgia races. They asked the company to respond by Dec. 8.