More stories

  • Facebook's internal investigations: documents show warning signs about misinformation

    Company documents reveal that on several occasions employees of the social network warned about the spread of misinformation and conspiracy theories before and after the US presidential election.

    Sixteen months before the presidential election held in November of last year, a Facebook researcher described an alarming development. A week after opening an experimental account, she was already receiving content about the QAnon conspiracy theory, she wrote in an internal report.

    On 5 November, two days after the election, another Facebook employee wrote a message alerting colleagues to comments containing "combustible election misinformation" that could be seen below many posts.

    Four days after that, a company data scientist wrote a note to his co-workers saying that 10 percent of all views of political material in the United States (a startlingly high figure) were of posts alleging electoral fraud.

    In each case, Facebook's employees sounded an alarm about misinformation and inflammatory content on the platform and urged action, but the company failed to address the problems or struggled to do so. The internal communications were part of a set of Facebook documents obtained by The New York Times that offer new insight into what happened inside the social network before and after the November election, when the company was caught off guard by users who weaponized the platform to spread lies about the vote.

  • Facebook revelations: what is in cache of internal documents?

    Roundup of what we have learned after release of papers and whistleblower's testimony to MPs
    Dan Milmo, Global technology editor | Mon 25 Oct 2021

    Facebook has been at the centre of a wave of damaging revelations after a whistleblower released tens of thousands of internal documents and testified about the company's inner workings to US senators.

    Frances Haugen left Facebook in May with a cache of memos and research that have exposed the inner workings of the company and the impact its platforms have on users. The first stories based on those documents were published by the Wall Street Journal in September.

    Haugen gave further evidence about Facebook's failure to act on harmful content in testimony to US senators on 5 October, in which she accused the company of putting "astronomical profits before people". She also testified to MPs and peers in the UK on Monday, as a fresh wave of stories based on the documents was published by a consortium of news organisations.

    Facebook's products – the eponymous platform, the Instagram photo-sharing app, Facebook Messenger and the WhatsApp messaging service – are used by 2.8 billion people a day, and the company generated a net income – a US measure of profit – of $29bn (£21bn) last year.

    Here is what we have learned from the documents, and Haugen, since the revelations first broke last month.

    Teenage mental health
    The most damaging revelations focused on Instagram's impact on the mental health and wellbeing of teenage girls. One piece of internal research showed that for teenage girls already having "hard moments", one in three found Instagram made body issues worse. A further slide shows that one in three people who were finding social media use problematic found Instagram made it worse, with one in four saying it made issues with social comparison worse.

    Facebook described reports on the research, by the WSJ in September, as a "mischaracterisation" of its internal work. Nonetheless, the Instagram research has galvanised politicians on both sides of the Atlantic seeking to rein in Facebook.

    Violence in developing countries
    Haugen has warned that Facebook is fanning ethnic violence in countries including Ethiopia and is not doing enough to stop it. She said that 87% of Facebook's spending on combating misinformation goes to English-language content, even though only 9% of users are English speakers. According to the news site Politico on Monday, just 6% of Arabic-language hate content was detected on Instagram before it made its way on to the platform.

    Haugen told Congress on 5 October that Facebook's use of engagement-based ranking – where the platform ranks a piece of content, and decides whether to put it in front of users, based on the amount of interactions it gets from people – was endangering lives. "Facebook … knows, they have admitted in public, that engagement-based ranking is dangerous without integrity and security systems, but then not rolled out those integrity and security systems to most of the languages in the world. And that's what is causing things like ethnic violence in Ethiopia," she said.

    Divisive algorithm changes
    In 2018 Facebook changed the way it tailored content for users of its news feed feature, a key part of people's experience of the platform. The emphasis on boosting "meaningful social interactions" between friends and family meant that the feed leant towards reshared material, which was often misinformed and toxic. "Misinformation, toxicity and violent content are inordinately prevalent among reshares," said internal research. Facebook said it had an integrity team that was tackling the problematic content "as efficiently as possible".

    Tackling falsehoods about the US presidential election
    The New York Times reported that internal research showed how, at one point after the US presidential election last year, 10% of all US views of political material on Facebook – a very high proportion for the platform – were of posts alleging that Joe Biden's victory was fraudulent. One internal review criticised attempts to tackle "Stop the Steal" groups spreading claims that the election was rigged. "Enforcement was piecemeal," said the research. The revelations have reignited concerns about Facebook's role in the 6 January riots.

    Facebook said: "The responsibility for the violence that occurred … lies with those who attacked our Capitol and those who encouraged them." However, the WSJ has also reported that Facebook's automated systems were taking down posts generating only an estimated 3-5% of total views of hate speech.

    Disgruntled Facebook staff
    Within the files disclosed by Haugen are testimonies from dozens of Facebook employees frustrated by the company's failure either to acknowledge the harms it generates or to properly support efforts to mitigate or prevent those harms. "We are FB, not some naive startup. With the unprecedented resources we have, we should do better," wrote one employee quoted by Politico in the wake of the 6 January attack on the US Capitol.

    "Never forget the day Trump rode down the escalator in 2015, called for a ban on Muslims entering the US, we determined that it violated our policies, and yet we explicitly overrode the policy and didn't take the video down," wrote another. "There is a straight line that can be drawn from that day to today, one of the darkest days in the history of democracy … History will not judge us kindly."

    Facebook is struggling to recruit young users
    A section of a complaint filed by Haugen's lawyers with the US financial watchdog refers to young users in "more developed economies" using Facebook less. This is a problem for a company that relies on advertising for its income, because young users, with unformed spending habits, can be lucrative to marketers. The complaint quotes an internal document stating that Facebook's daily teenage and young adult (18-24) users have "been in decline since 2012-13" and that "only users 25 and above are increasing their use of Facebook". Further research reveals "engagement is declining for teens in most western, and several non-western, countries".

    Haugen said engagement was a key metric for Facebook because it meant users spent longer on the platform, which in turn appealed to advertisers, whose targeted adverts accounted for $84bn (£62bn) of the company's $86bn annual revenue. On Monday, Bloomberg said "time spent" for US teenagers on Facebook was down 16% year on year, and that young adults in the US were also spending 5% less time on the platform.

    Facebook is built for divisive content
    On Monday the NYT reported an internal memo warning that Facebook's "core product mechanics", or its basic workings, had let hate speech and misinformation grow on the platform. The memo added that the basic functions of Facebook were "not neutral". "We also have compelling evidence that our core product mechanics, such as virality, recommendations and optimising for engagement, are a significant part of why these types of speech flourish on the platform," said the 2019 memo.

    A Facebook spokesperson said: "At the heart of these stories is a premise which is false. Yes, we are a business and we make profit, but the idea that we do so at the expense of people's safety or wellbeing misunderstands where our own commercial interests lie. The truth is we have invested $13bn and have over 40,000 people to do one job: keep people safe on Facebook."

    Facebook avoids confrontations with US politicians and rightwing news organisations
    A document seen by the Financial Times showed a Facebook employee claiming that Facebook's public policy team blocked decisions to take down posts "when they see that they could harm powerful political actors". The document said: "In multiple cases the final judgment about whether a prominent post violates a certain written policy are made by senior executives, sometimes Mark Zuckerberg." The memo said moves to take down content posted by repeat offenders against Facebook's guidelines, such as rightwing publishers, were often reversed because the publishers might retaliate.

    The wave of stories on Monday was based on disclosures made to the Securities and Exchange Commission – the US financial watchdog – and provided to Congress in redacted form by Haugen's legal counsel. The redacted versions were obtained by a consortium of news organisations including the NYT, Politico and Bloomberg.

  • The Observer view on Donald Trump's Truth Social | Observer editorial

    Aided by his app, the great liar could yet return as the Republicans' next presidential nominee
    Observer editorial | Sun 24 Oct 2021

    In the life story of Donald Trump, to his mind an epic saga of unrivalled achievement, these are the wilderness years. After the US electoral college confirmed his 2020 defeat, an outcome he still mendaciously disputes, Trump plunged into despair. He sulked, he raged, he conspired. Yet the 6 January coup plot was an egregious step too far. He was cast into outer darkness.

    Trump lost the White House bully pulpit and a US president's ability to command instant global attention. Personally wounding was the ban imposed by Twitter, Facebook and Instagram, which belatedly agreed he posed a threat to democracy. Trump was cut off from social media and his supporter base. He was all but silenced.

    What worse fate could there be for a narcissist who craves constant attention and approval? Exiled to his luxury Florida estate, the Elba of the Everglades, Trump has struggled since to regain his voice. Last week, he made his move. The result: the so-called Truth Social media app, launching next year.

    The newly formed company behind the app, Trump Media and Technology Group, plans to disseminate what it calls "anti-woke" news, debate and entertainment to Americans deprived of honest, impartial media outlets. This is total drivel, of course, coming from the mouth of the most shameless liar in modern US history.

    Abusing truth as only Trump can, Truth Social will more likely prove both false and antisocial. It's his way of regaining lost ground, prior to a wished-for presidential comeback in 2024. It's a political propaganda platform intended to magnify and exploit the hate, ignorance and prejudice on which he feeds. MPs please note: Trump is the ultimate definition of "online harms".

    This self-serving bid to defeat "the tyranny of big tech" is a commercial long shot. The new app looks remarkably similar to Twitter, which has more than 200m users. Previous US attempts to grow alternative "conservative social space" have failed. Although shares in the new company initially soared, its USP is overly dependent on Trump's continuing appeal.

    That appeal looks increasingly fractured. Trump is under fire from Mitch McConnell, the Senate minority leader, and other Republicans who fear his obsession with overturning the 2020 result is deflecting attention from Joe Biden's mistakes ahead of next year's midterm congressional elections.

    An early test will come on 2 November, when Democrat-leaning Virginia elects a governor. Polls there currently suggest a dead heat. Trump, meanwhile, is taking legal heat, too. His family business faces a fraud investigation. He was recently questioned under oath for more than four hours in a civil lawsuit in New York.

    Steve Bannon, one of his best-known former aides, has been found in contempt of Congress for refusing to testify to the 6 January inquiry and faces possible criminal prosecution. Since Trump ordered all his minions to act similarly, the legal bull's-eye pinned to his back grows ever more unmissable.

    Yet for all that, Trump remains first choice among Republican voters for the party's presidential nomination. His average "favourable/unfavourable" rating is almost identical to Biden's among the electorate as a whole. And he has shown how dangerous he can be when he reaches a wide audience, which is why Truth Social is worrying.

    Will Trump rise again from the depths, like the "shapeless monsters" imagined by the great 19th-century Russian novelist Ivan Turgenev? Life is akin to an unsuspecting man sitting in a small boat on a calm, limitless ocean, he wrote. "Then one of the monsters begins to emerge from the murk, rising higher and higher, becoming ever more repellently, clearly discernible… Another minute and its impact will overturn the boat."

    For now, Trump's monstrous outline is blurred, his voice muted. He awaits Turgenev's "destined day", when he plans, once again, to capsize the ship of state. To which we say: all hands on deck!

  • Facebook missed weeks of warning signs over Capitol attack, documents suggest

    Materials provided by Frances Haugen to media outlets shine light on how the company apparently stumbled into 6 January
    Guardian staff and agencies | Sat 23 Oct 2021

    As extremist supporters of Donald Trump stormed the US Capitol on 6 January, battling police and forcing lawmakers into hiding, an insurrection of a different kind was taking place inside the world's largest social media company.

    Thousands of miles away, in California, Facebook engineers were racing to tweak internal controls to slow the spread of misinformation and content likely to incite further violence.

    Emergency actions – some of which were rolled back after the 2020 election – included banning Trump, freezing comments in groups with records of hate speech and filtering out the "Stop the Steal" rallying cry of Trump's campaign to overturn his electoral loss, falsely citing widespread fraud. Officials have called it the most secure election in US history.

    Actions also included empowering Facebook content moderators to act more assertively by labeling the US a "temporary high risk location" for political violence.

    At the same time, frustration inside Facebook erupted over what some saw as the company's halting and inconsistent response to rising extremism in the US.

    "Haven't we had enough time to figure out how to manage discourse without enabling violence?" one employee wrote on an internal message board at the height of the 6 January turmoil. "We've been fueling this fire for a long time and we shouldn't be surprised it's now out of control."

    It's a question that still hangs over the company today, as Congress and regulators investigate Facebook's role in the events.

    New internal documents have been provided to a number of media outlets in recent days by the former Facebook employee turned whistleblower Frances Haugen, following her initial disclosures and claims that the platform puts profits before public good, and her testimony to Congress.

    The outlets, including the New York Times, the Washington Post and NBC, published reports based on those documents, which offer a deeper look into the spread of misinformation and conspiracy theories on the platform, particularly related to the 2020 US presidential election.

    They show that Facebook employees repeatedly flagged concerns before and after the election, when Trump tried to falsely overturn Joe Biden's victory. According to the New York Times, a company data scientist told co-workers a week after the election that 10% of all US views of political content were of posts that falsely claimed the vote was fraudulent. But as workers flagged these issues and urged the company to act, the company failed or struggled to address the problems, the Times reported.

    The internal documents also show Facebook researchers have found the platform's recommendation tools repeatedly pushed users to extremist groups, prompting internal warnings that some managers and executives ignored, NBC News reported.

    In one striking internal study, a Facebook researcher created a fake profile for "Carol Smith", a conservative female user whose interests included Fox News and Donald Trump. The experiment showed that within two days, Facebook's algorithm was recommending "Carol" join groups dedicated to QAnon, a baseless internet conspiracy theory.

    The documents also provide a rare glimpse into how the company appears to have simply stumbled into the events of 6 January. It quickly became clear that even after years under the microscope for insufficiently policing its platform, the social network had missed how riot participants spent weeks vowing – by posting on Facebook itself – to stop Congress from certifying Joe Biden's election victory.

    This story is based in part on disclosures Haugen made to the Securities and Exchange Commission (SEC), the US agency that handles regulation to protect investors in publicly traded companies, provided to Congress in redacted form by her legal counsel. The redacted versions received by Congress were obtained by a consortium of news organizations, including the Associated Press.

    What Facebook called "Break the Glass" emergency measures put in place on 6 January were essentially a toolkit of options designed to stem the spread of dangerous or violent content. The social network had first used the system in the run-up to the bitter 2020 election.

    As many as 22 of those measures were rolled back at some point after the election, according to an internal spreadsheet analyzing the company's response. "As soon as the election was over, they turned them back off or they changed the settings back to what they were before, to prioritize growth over safety," Haugen has said.

    An internal Facebook report following 6 January, previously reported by BuzzFeed, faulted the company for a "piecemeal" approach to the rapid growth of "Stop the Steal" pages.

    Facebook said the situation was more nuanced and that it carefully calibrates its controls to react quickly to spikes in hateful and violent content. The company said it was not responsible for the actions of the rioters, and that having stricter controls in place prior to that day wouldn't have helped.

    Facebook's decisions to phase certain safety measures in or out had taken into account signals from the Facebook platform as well as information from law enforcement, said a spokesperson, Dani Lever: "When those signals changed, so did the measures." Lever added that some of the measures had stayed in place well into February and others remained active today.

    Meanwhile, Facebook is facing mounting pressure after a new whistleblower on Friday accused it of knowingly hosting hate speech and illegal activity. Allegations by the new whistleblower, who spoke to the Washington Post, were reportedly contained in a complaint to the SEC.

    In the complaint, which echoes Haugen's disclosures, the former employee detailed how Facebook officials frequently declined to enforce safety rules for fear of angering Donald Trump and his allies or offsetting the company's huge growth. In one alleged incident, Tucker Bounds, a Facebook communications official, dismissed concerns about the platform's role in 2016 election manipulation.

    "It will be a flash in the pan," Bounds said, according to the affidavit, as reported by the Post. "Some legislators will get pissy. And then in a few weeks they will move on to something else. Meanwhile, we are printing money in the basement, and we are fine."

  • What Happened When Facebook Employees Warned About Election Misinformation

    Company documents show that the social network's employees repeatedly raised red flags about the spread of misinformation and conspiracies before and after the contested November vote.

    Sixteen months before last November's presidential election, a researcher at Facebook described an alarming development. She was getting content about the conspiracy theory QAnon within a week of opening an experimental account, she wrote in an internal report.

    On Nov. 5, two days after the election, another Facebook employee posted a message alerting colleagues that comments with "combustible election misinformation" were visible below many posts.

    Four days after that, a company data scientist wrote in a note to his co-workers that 10 percent of all U.S. views of political material — a startlingly high figure — were of posts that alleged the vote was fraudulent.

    In each case, Facebook's employees sounded an alarm about misinformation and inflammatory content on the platform and urged action — but the company failed or struggled to address the issues. The internal dispatches were among a set of Facebook documents obtained by The New York Times that give new insight into what happened inside the social network before and after the November election, when the company was caught flat-footed as users weaponized its platform to spread lies about the vote.

  • Twitter admits bias in algorithm for rightwing politicians and news outlets

    Home feed promotes rightwing tweets over those from the left, internal research finds
    Dan Milmo, Global technology editor | Fri 22 Oct 2021

    Twitter has admitted it amplifies more tweets from rightwing politicians and news outlets than content from leftwing sources.

    The social media platform examined tweets from elected officials in seven countries – the UK, US, Canada, France, Germany, Spain and Japan. It also studied whether political content from news organisations was amplified on Twitter, focusing primarily on US news sources such as Fox News, the New York Times and BuzzFeed.

    The study compared Twitter's "Home" timeline – the default way its 200 million users are served tweets, in which an algorithm tailors what users see – with the traditional chronological timeline, where the most recent tweets are ranked first.

    The research found that in six out of seven countries, apart from Germany, tweets from rightwing politicians received more amplification from the algorithm than those from the left; right-leaning news organisations were more amplified than those on the left; and generally politicians' tweets were more amplified by an algorithmic timeline than by the chronological timeline.

    According to a 27-page research document, Twitter found a "statistically significant difference favouring the political right wing" in all the countries except Germany. Under the research, a value of 0% meant tweets reached the same number of users on the algorithm-tailored timeline as on its chronological counterpart, whereas a value of 100% meant tweets achieved double the reach. On this basis, the most powerful discrepancy between right and left was in Canada (Liberals 43%; Conservatives 167%), followed by the UK (Labour 112%; Conservatives 176%). Even excluding top government officials, the results were similar, the document said.

    Twitter said it wasn't clear why its Home timeline produced these results and indicated that it may now need to change its algorithm. A blog post by Rumman Chowdhury, Twitter's director of software engineering, and Luca Belli, a Twitter researcher, said the findings could be "problematic" and that more study needed to be done. The post acknowledged that it was concerning if certain tweets received preferential treatment as a result of the way in which users interacted with the algorithm tailoring their timeline.

    "Algorithmic amplification is problematic if there is preferential treatment as a function of how the algorithm is constructed versus the interactions people have with it. Further root cause analysis is required in order to determine what, if any, changes are required to reduce adverse impacts by our Home timeline algorithm," the post said.

    Twitter said it would make its research available to outsiders such as academics, and it is preparing to let third parties have wider access to its data, in a move likely to put further pressure on Facebook to do the same. Facebook is being urged by politicians on both sides of the Atlantic to distribute its research to third parties after tens of thousands of internal documents – which included revelations that the company knew its Instagram app damaged teenage mental health – were leaked by the whistleblower Frances Haugen.

    The Twitter study compared the two ways in which a user can view their timeline: the first uses an algorithm to provide a tailored view of tweets that the user might be interested in, based on the accounts they interact with most and other factors; the other is the more traditional timeline, in which the user reads the most recent posts in reverse chronological order.

    The study compared the two types of timeline by considering whether some politicians, political parties or news outlets were more amplified than others. The study analysed millions of tweets from elected officials between 1 April and 15 August 2020 and hundreds of millions of tweets from news organisations, largely in the US, over the same period.

    Twitter said it would make its research available to third parties but said privacy concerns prevented it from making available the "raw data". The post said: "We are making aggregated datasets available for third party researchers who wish to reproduce our main findings and validate our methodology, upon request."

    Twitter added that it was preparing to make internal data available to external sources on a regular basis. The company said its machine-learning ethics, transparency and accountability team was finalising plans in a way that would protect user privacy. "This approach is new and hasn't been used at this scale, but we are optimistic that it will address the privacy-vs-accountability tradeoffs that can hinder algorithmic transparency," said Twitter. "We're excited about the opportunities this work may unlock for future collaboration with external researchers looking to reproduce, validate and extend our internal research."
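    The amplification percentages quoted above can be read as a simple relative-reach figure. The sketch below is only an illustration of that arithmetic as the article defines it (0% means equal reach on both timelines, 100% means double the reach on the algorithmic one); the function name and the example audience counts are hypothetical, and the actual methodology in Twitter's 27-page research document may differ.

```python
def amplification_pct(algorithmic_reach: float, chronological_reach: float) -> float:
    """Relative amplification as defined in the study described above:
    0% means a tweet reached the same audience on the algorithm-tailored
    timeline as on the chronological one; 100% means double the reach."""
    return (algorithmic_reach / chronological_reach - 1.0) * 100.0

# Example: tweets reaching 2.67x as many users via the algorithmic timeline
# score ~167% (the figure reported for Canadian Conservatives), while 1.43x
# scores ~43% (the figure reported for the Liberals).
print(round(amplification_pct(2_670_000, 1_000_000)))  # 167
print(round(amplification_pct(1_430_000, 1_000_000)))  # 43
```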

  • Trump Finds Backing for His Own Media Venture

    A merger could give the former president access to nearly $300 million in cash — and perhaps a new platform.

    Former President Donald J. Trump said on Wednesday that he had lined up the investment money to create his own publicly traded media company, an attempt to reinsert himself into the online public conversation from which he has largely been absent since Twitter and Facebook banned him after the Jan. 6 insurrection. If finalized, the deal could give the new Trump company access to nearly $300 million in spending money.

    In a statement announcing the new venture, Mr. Trump and his investors said that the new company would be called Trump Media & Technology Group and that they would create a new social network called Truth Social. Its purpose, according to the statement, is "to create a rival to the liberal media consortium and fight back against the 'Big Tech' companies of Silicon Valley."

    Since he left office and became the only American president to be impeached twice, Mr. Trump has had an active presence in conservative media. But he lacks the ability he once had to sway news cycles and dominate the national political debate. He filed a lawsuit this month asking Twitter to reinstate his account.

    The announcement on Wednesday also pointed to a promised new app listed for pre-sale on the App Store, with mock-up illustrations bearing more than a passing resemblance to Twitter.

    The details of Mr. Trump's latest partnership were vague. The statement he issued was reminiscent of the kind of claims he made about his business dealings in New York as a real estate developer: replete with high-dollar amounts and superlatives that could not be verified.

    Rumors of Mr. Trump's interest in starting his own media businesses have circulated since he was defeated in the November 2020 election. None materialized. Despite early reports that he was interested in starting his own cable channel to rival Fox News, the idea never got far, given the immense cost and time it would have required. A close adviser, Jason Miller, started a rival social media platform for Trump supporters called Gettr, but Mr. Trump never signed on. In a statement on Wednesday night, Mr. Miller said of his and Mr. Trump's negotiations, "We just couldn't come to terms on a deal."

    Mr. Trump's partner is Digital World Acquisition, a special purpose acquisition company, or SPAC. These so-called blank-check companies are an increasingly popular type of investment vehicle that sells shares to the public with the intention of using the proceeds to buy private businesses. Digital World was incorporated in Miami a month after Mr. Trump lost the 2020 election.

    The company filed for an initial public stock offering this spring, and it sold shares to the public on the Nasdaq stock exchange last month. The I.P.O. raised about $283 million, and Digital World drummed up another $11 million by selling shares to investors through a so-called private placement.

    Digital World is backed by some marquee Wall Street names and others with high-powered connections. In regulatory filings after the I.P.O., major hedge funds including D.E. Shaw, Highbridge Capital Management, Lighthouse Partners and Saba Capital Management reported owning substantial percentages of Digital World.

    Digital World's chief executive is Patrick F. Orlando, a former employee of investment banks including the German bank Deutsche Bank, where he specialized in the trading of financial instruments known as derivatives. He created his own investment bank, Benessere Capital, in 2012, according to a recent regulatory filing. Digital World's chief financial officer, Luis Orleans-Braganza, is a member of Brazil's National Congress.

    Mr. Orlando disclosed in a recent filing that he owned nearly 18 percent of the company's outstanding stock. Mr. Orlando and representatives for Digital World did not immediately respond to requests for comment.

    This is not Mr. Orlando's first blank-check company. He has created at least two others, including one, Yunhong International, that is incorporated in the offshore tax haven of the Cayman Islands.

    At the time that investors bought shares in Digital World, it had not disclosed what, if any, companies it planned to acquire. On its website, Digital World said that its goal was "to focus on combining with a leading tech company." At least one of the investors, Saba Capital Management, did not know at the time of the initial public offering that Digital World would be doing a transaction with Mr. Trump, according to a person familiar with the matter.

    Mr. Trump, who has repeatedly lied about the results of the 2020 election while accusing the mainstream news media of publishing "fake" stories to discredit him, leaned hard into the notion of truth as his new company's governing ethos. "We live in a world where the Taliban has a huge presence on Twitter, yet your favorite American president has been silenced," Mr. Trump said in his written statement, vowing to publish his first item soon. "This is unacceptable."

  • Hey Parler, Nashville Isn't Turning Red

    NASHVILLE — When NPR's tech reporter, Bobby Allyn, tweeted last week that the social media site Parler was moving its headquarters from Nevada to Nashville, a single word came to my mind — a word this newspaper will not publish, no matter that it is the only word in the English language truly appropriate to the situation.

    Parler's chief executive, George Farmer, offered some reasons for moving the company. "Tennessee has great weather, an abundance of Southern hospitality, wonderful music and barbecue," he wrote in an email announcement. "Even more than that, though, Tennessee shares Parler's vision of individual liberty and free expression."

    Founded in 2018 as a less regulated alternative to Facebook and Twitter, Parler is an online place where high-profile right-wing commentators and political figures can promulgate lies and conspiracy theories without interference. Though the company notified the F.B.I. about threats of violence in advance of the insurrection on Jan. 6, and has since added algorithms to detect posts calling directly for violence, it was nonetheless Parler's vision of "free expression" that helped bring about the invasion of the U.S. Capitol by homegrown terrorists.

    The craven Republicans running Tennessee might share that vision of liberty, but Nashville definitely does not. Nashville, according to NBC News, is "a big blue dot in a deep red state." That fact should tell you all you need to know about the relationship between this city and our state government. You likely know this dynamic already because it exists in virtually every major city or college town in every gerrymandered state governed by Republicans: think Oxford, Miss.; Atlanta, Ga.; Birmingham, Ala.; Lexington, Ky.; Austin, Texas; Chapel Hill, N.C.

    What you might not know is that Nashville is also in the midst of a convulsive identity crisis, unsure whether it wants to remain Music City or become something more like a tech incubator or a health care center or a financial services hub. Or maybe just the place where bridesmaids come to get drunk in the street.

    A midsize city on its way to becoming a big city can be all these things at once, of course, especially if it is a midsize city that is growing deliberately, in ways that do not displace its low-income residents or its work force. Especially if it is a midsize city that is investing in its public schools and building out its infrastructure to accommodate its meteoric growth.

    Nashville is doing those things poorly, if at all, and some of the blame for this paralysis can be laid at the feet of state government, which frequently passes pre-emptive laws or issues pre-emptive executive orders designed to tie the hands of Nashville leaders. The very last thing this city needs is to become the headquarters of a social media site favored by the right-wingers who are most poisoned by lies and hatred and fear.

    The truth is that high-profile members of the far right have been moving to Middle Tennessee since long before Parler announced its impending relocation. As the Nashville Scene's Steven Hale noted when the conservative media celebrity Ben Shapiro decided to move the headquarters of The Daily Wire, the media company he co-founded, from Los Angeles to Nashville: "Look, we try hard to ignore these people." Nevertheless, here they are.

    And it's not just celebrities who are moving to town. The coronavirus pandemic taught a lot of people that they can work wherever they want to work, and increasingly where people seem to want to work is in a state with no income tax. In my neighborhood alone, we have newcomers from Chicago, Houston, Los Angeles and a bunch of other places I can't name because I haven't met the new people yet. A few weeks ago, I overheard a conversation between two new neighborhood children on bicycles. "Are you from Nashville?" the first child asked. "I'm from Des Moines," the other kid said.

    We are hospitable people here in Tennessee, it's true, and we do have great music and barbecue. But Mr. Farmer should know that Tennessee's "great weather" includes six of the 18 billion-dollar weather disasters to hit the U.S. this year — catastrophic weather events triggered by a changing climate that many on his site deny exists.