More stories

  • Fake news alert! Donald Trump’s new social media app is a triumph | Arwa Mahdawi

    The former president’s media venture, Truth Social, has got off to a rocky start – with technical problems and potential legal issues to boot.

    Truth hurts, everyone knows that. Nevertheless, I wasn’t expecting my experience with Truth Social, Donald Trump’s new social media venture, to be quite so painful. After months of fanfare, the former president’s new app, which is essentially a Twitter clone, was opened to the US public on Sunday night. Obviously, I signed up straight away – or at least I tried to.

    I spent 20 frustrating minutes attempting to create a new account, getting error message after error message. Eventually, I managed to sign up with the username @stormyd, only to be told that I had been put on a waiting list “due to massive demand”. I was number 194,276 in line, apparently. Which, I’m sure, is a very precise number and not something they just pulled out of the air.

    It is unclear how many people actually succeeded in getting on Truth Social – although the Guardian has reported that at least one Catholic priest managed to join. The fact that you apparently needed God on your side to secure an account wasn’t the only issue with the launch: the app has also run into potential legal trouble. It turns out Truth Social may not have taken inspiration only from Twitter; the app’s logo looks suspiciously like that of a British solar power startup called Trailar. “Great to see Donald Trump supporting a growing sustainability business!” Trailar tweeted on Monday. “Maybe ask next time?”

    If Trump’s new app failed to launch successfully on time, it would hardly be the surprise of the century. The last time he made a lot of noise about launching a new media platform, it turned out to be an underwhelming blog, which shuttered after just a few weeks.

    It’s not as if Trump put a technological genius in charge of Truth Social: Devin Nunes, head honcho at the app’s parent company, Trump Media & Technology Group (TMTG), may be most famous for the fact that he once unsuccessfully sued a cow. In 2019 Nunes, who used to be a Republican congressman, filed a $250m lawsuit against Twitter and two parody Twitter accounts: one called “Devin Nunes’ Mom” and one called “Devin Nunes’ Cow”. This is no laughing matter, I’ll have you know. The cow was very mean to him: it called the politician a “treasonous cowpoke” whose “boots are full of manure”. It was all very hard for the poor man, whose lawsuit claimed that the parody accounts subjected him to a “defamation campaign of stunning breadth and scope, one that no human being should ever have to bear and suffer in their whole life”.

    Nunes doesn’t just have beef with cows, by the way. He’s a big fan of suing anyone who says anything mean about him, and has launched defamation lawsuits against a number of journalists. He managed to juggle all these lawsuits with his political career for a while but, in December, announced he was leaving Congress to join TMTG. “The time has come to reopen the internet and allow for the free flow of ideas and expression without censorship,” he proclaimed. Unless cows are involved, obviously. No free speech or free flow of ideas for cows! Or pesky journalists. Or anyone who says anything unflattering, if we’re being honest.

    Truth Social’s marketing material talks about welcoming diverse opinions, but the app’s terms and conditions are rather more restrictive. Under “prohibited activities”, the rules state that users of the site agree not to “disparage, tarnish, or otherwise harm, in our opinion, us and/or the Site”.

    A cynic might wonder whether the fact that you are not allowed to say mean things about Trump on his app factors into why Melania doesn’t appear to be a big fan of her husband’s latest venture. A couple of weeks ago, you see, the former first lady entered into a “special arrangement” to share “exclusive communications” with the conservative social media app Parler. Why would she announce an exclusive relationship with a direct competitor to Truth Social shortly before it launched? I’m not even going to begin to speculate. The truth is out there, but there’s a very long waiting list to get to it.
    Arwa Mahdawi is a Guardian columnist

  • Trump Truth Social app will be fully operational by end of March, Nunes says

    Apple App Store lists the rightwing Twitter alternative, but the ex-congressman tapped to lead the company indicates a slow rollout.

    Donald Trump’s rightwing riposte to Twitter – his new social media app Truth Social – is supposed to launch on Monday. But the rollout of what the former president hopes will be the start of a new media empire continues to be shrouded in confusion and secrecy.

    Devin Nunes, the former Republican congressman and Trump loyalist who heads Trump Media & Technology Group (TMTG), told Fox News on Sunday that Truth Social would make its debut on the Apple App Store this week. The app is featured on the store, with the notice “Expected Feb 21”.

    But the launch has been beset by delays. On the Fox News show Sunday Morning Futures, Nunes indicated that a full service was still weeks away. “Our goal is, I think we’re going to hit it, I think by the end of March we’re going to be fully operational at least within the United States,” he said.

    Truth Social is Trump’s answer to having been permanently thrown off Twitter after the company ruled that the then president’s tweets leading up to the US Capitol attack on 6 January 2021 violated its policy against glorification of violence. The decision cut Trump off from direct contact with almost 90m followers. Facebook has also suspended Trump for comments inciting violence at the Capitol, but has left open the possibility of a return.

    Glimpses of what Truth Social will look like have been given in the past few days, prompting the observation that it looks remarkably similar to Twitter. Instead of blue ticks to denote verified accounts, it will use red ticks. Trump’s eldest son, Donald Jr, tweeted a screenshot of his father’s first post on Truth Social, which said: “Get ready! Your favorite President will see you soon!” The remark was much less memorable than the fact that the Truth Social screenshot and Donald Jr’s actual tweet looked virtually identical.

    Truth Social describes itself as a “big tent” social media platform “that encourages an open, free, and honest global conversation without discriminating against political ideology”. But given the initial teething problems of the launch, the former president could find it difficult to fill the hole in his public profile left by his banishment from established social media.

    Twitter records more than 200 million daily active users and Facebook almost 2 billion. By contrast Gettr, a social media outlet set up by Jason Miller, a former Trump adviser, claims 4 million users on average per month. Gettr is part of a growing number of social media startups vying to take on tech giants they accuse of censoring rightwing ideology. Gettr, Parler and Gab all present themselves as rightwing alternatives to Twitter. Rumble is a video platform that sets itself up as conservative competition to YouTube; the company has said it will be providing video on the Truth Social app.

    The proliferation of rightwing social media sites, despite their relatively small reach compared with the Silicon Valley giants, is prompting concern about their political impact. Observers have questioned whether the startups, which present themselves as forums for open, untrammelled discussion, will act as breeding grounds for misinformation on subjects such as vaccinations, the climate crisis and election integrity.

    Truth Social has promised to ensure that its content is “family friendly” and has reportedly entered a partnership with a San Francisco company, Hive, which will moderate posts using cloud-based artificial intelligence.

    Even the new app’s name is likely to be controversial, given Trump’s legendary struggles with veracity. The Washington Post calculated that in the four years of his presidency, the man now behind Truth Social made 30,573 false or misleading claims.

  • Trump’s social media platform hits roadblocks as major political battle looms

    ‘Truth Social’ purportedly plans to challenge Twitter and Facebook, platforms that have banned or curbed the ex-president.

    Donald Trump’s plan to launch “Truth Social”, a social media company backed by a special purpose acquisition company (SPAC), early next year may have hit a roadblock after US regulators issued a request for information on the deal on Monday.

    The request from the SEC and the Financial Industry Regulatory Authority for information from Digital World Acquisition Corp (DWAC), a blank-check SPAC that is set to merge with Trump Media & Technology Group, comes as a powerful Republican congressman, Devin Nunes, announced he was stepping out of politics to join the Trump media venture as CEO.

    The twin developments set the stage for a major political battle over Truth Social, a platform that purportedly plans to challenge Twitter and Facebook, social platforms that have banned or curbed the former president over his involvement in stoking the 6 January Capitol riot.

    The request for information relates to DWAC board meetings, policies about stock trading, the identities of certain investors and details of communications between DWAC and Trump’s social media firm. It comes three weeks after the Democratic senator Elizabeth Warren asked the SEC to investigate possible securities violations at the company. Warren quoted news reports that said DWAC “may have committed securities violations by holding private and undisclosed discussions about the merger as early as May 2021, while omitting this information in [SEC] filing and other public statements”. But investigations into the Trump project appear to predate Warren’s request.

    “According to the SEC’s request, the investigation does not mean that the SEC has concluded that anyone violated the law or that the SEC has a negative opinion of DWAC or any person, event, or security,” DWAC said in a statement.

    Last week, Reuters reported that Trump’s new company is trying to raise up to $1bn by selling shares to hedge funds and family offices at a price higher than the SPAC’s pre-merger valuation of $10 a share. The launch of the Trump media venture has also failed to meet a November deadline to release an invitation-only beta version of the platform.

    In October, soon after the deal was announced, shares in DWAC soared by more than 1,200%, suggesting the implied value of the enterprise could reach $8.2bn. Trading in the company was halted 12 times as Trump fans pumped the stock on Reddit and StockTwits, pushing Trump’s 58% stake in the combined TMTG-DWAC company to $4.8bn. DWAC shares were trading at $43.19 on Monday morning, down almost 3% on news of the filing, even as equity markets broadly were higher.

    According to a press release from Trump Media & Technology Group, the media operation will begin operations in the first quarter of next year, with Truth Social launching ahead of the 2022 midterm elections and a potential subscription video-on-demand service coming later.

    Milos Vulanovic, an expert in SPAC deals at the EDHEC Business School in Nice, France, told the Guardian that Trump’s politically oriented media venture could bring “new investors who may not fully understand how SPACs work” into the market. “I don’t see why Trump-sponsored media couldn’t take 10% of the social media market and make huge money for Trump and his investors.”

  • The big idea: are we really so polarised? | Dominic Packer and Jay Van Bavel

    In many democracies the political chasm seems wider than ever. But emotion, not policies, may be what actually divides us.

    In 2020, the match-making website OkCupid asked 5 million hopeful daters around the world: “Could you date someone who has strong political opinions that are the opposite of yours?” Sixty per cent said no, up from 53% a year before.

    Scholars used to worry that societies might not be polarised enough. Without clear differences between political parties, they thought, citizens lack choices, and important issues don’t get deeply debated. Now this notion seems rather quaint, as countries have fractured along political lines, reflected in everything from dating preferences to where people choose to live.

    Just how stark has political polarisation become? Well, it depends on where you live and how you look at it. When social psychologists study relations between groups, they often find that whereas people like their own groups a great deal, they have fairly neutral feelings towards out-groups: “They’re fine, but we’re great!” This pattern used to describe relations between Democrats and Republicans in the US. In 1980, partisans reported feeling warm towards members of their own party and neutral towards people on the other side. However, while levels of in-party warmth have remained stable since then, feelings towards the out-party have plummeted.

    The dynamics are similar in the UK, where the Brexit vote was deeply divisive. A 2019 study revealed that while UK citizens did not identify particularly strongly with political parties, they held strong identities as remainers or leavers. Their perceptions were sharply partisan, with each side regarding its own supporters as intelligent and honest, while viewing the other side as selfish and close-minded.

    The consequences of hating political out-groups are many and varied. It can lead people to support corrupt politicians, because losing to the other side seems unbearable. It can make compromise impossible even where common political ground exists. In a pandemic, it can even lead people to disregard advice from health experts if those experts are embraced by opposing partisans.

    The negativity that people feel towards political opponents is known to scientists as affective polarisation. It is emotional and identity-driven – “us” versus “them”. Importantly, this is distinct from another form of division known as ideological polarisation, which refers to differences in policy preferences. So do we disagree about the actual issues as much as our feelings about each other suggest?

    Despite large differences in opinion between politicians and activists from different parties, there is often less polarisation among regular voters on matters of policy. When pushed for their thoughts about specific ideas or initiatives, citizens with different political affiliations often turn out to agree more than they disagree (or at least the differences are not as stark as they imagine). More in Common, a research consortium that explores the drivers of social fracturing and polarisation, reports on areas of agreement between groups in societies. In the UK, for example, it has found that majorities of people across the political spectrum view hate speech as a problem, are proud of the NHS, and are concerned about climate change and inequality.

    As the psychologist Anne Wilson and her colleagues put it in a recent paper: “Partisans often oppose one another vehemently even when there is little actual daylight between their policy preferences, which are often tenuously held and contextually malleable.”

    This relative lack of divergence would, of course, come as a surprise to partisans themselves. This is the phenomenon of false polarisation, whereby there is widespread misperception of how much people on the left and the right are divided, not only on issues but also in their respective ways of life. When asked to estimate how many Republicans earn more than $250,000 a year, for example, Democrats guessed 38%. In reality it is 2%. Conversely, while about 6% of Democrats self-identify as members of the LGBT community, Republicans believed it was 32%. New research from Victoria Parker and her colleagues finds that partisans are especially likely to overestimate how many of their political opponents hold extreme opinions. Those overestimates, in turn, are associated with a disinclination to talk or socially engage with out-party members – avoidance that is likely to prevent people from forming more accurate impressions of the other side.

    What drives these misperceptions? And why do citizens so dislike one another if they aren’t necessarily deeply divided on policy matters? Politicians certainly have incentives to sharpen differences in order to motivate and mobilise voters, rallying support by portraying themselves as bulwarks against the barbarians on the other side. Divisiveness also plays well on social media, where extreme voices are amplified. Moral outrage is particularly likely to go viral. In a recent project led by Steve Rathje and Sander van der Linden at Cambridge University, we examined more than 2.5m posts on Twitter and Facebook. We found that posts were significantly more likely to be shared or retweeted if they referenced political opponents. Every word about the out-group increased the odds of a post being shared by 67% – and these posts were, in turn, met with anger and mockery.

    In this increasingly toxic environment, reducing false polarisation and affective polarisation are major challenges. It is often suggested, for example, that if people were only to expose themselves to perspectives from the other side, it would breed greater understanding and cooperation. Yet this intuition turns out to be flawed. The sociologist Christopher Bail and his colleagues offered sets of Democrats and Republicans money to follow a bot that would retweet messages from politicians, media companies and pundits every day for a month. Importantly, the messages always came from the other side of the political spectrum. Far from promoting harmony, it backfired. After a month of being exposed to conservative talking points, Democrats’ attitudes had become, if anything, marginally more liberal. And Republicans became more conservative following their diet of liberal views. When what you see from the other side strikes you as biased or obnoxious, it doesn’t endear you to their perspectives.

    In this regard, the behaviour of elites matters. The political scientist Rasmus Skytte showed people messages from politicians that were either civil or rude. Interestingly, aggressive and unkind messages didn’t reduce trust in politicians or increase affective polarisation. It seems that incivility is what people have come to expect. But when they saw polite and respectful messages, they subsequently felt more trust towards politicians and became less affectively polarised.

    These results suggest that we should expect better from our leaders and those with large platforms. Don’t reward divisive rhetoric with “likes”. Instead, follow politicians and pundits who embody norms of respect and civility, even when they disagree on policy matters. In fact, many of us might be better off if we took a break from social media altogether. Economists found that people who were encouraged to disconnect from Facebook for a month spent less time online and were less politically polarised. They also experienced improved psychological wellbeing.

    No one these days is worried that our societies are insufficiently polarised. But because so much of the polarisation is about emotions and identities rather than issues, it is still not clear that citizens are presented with good choices or that important issues are being deeply debated. Here again, we must expect better. Demand that politicians and pundits get into policy specifics. Let’s focus more on actual ideas for solving actual problems, where we, as citizens, may well turn out to agree on more than we realise.

    Dominic Packer and Jay Van Bavel are psychologists and the authors of The Power of Us.

    Further reading:
    Uncivil Agreement: How Politics Became Our Identity by Lilliana Mason (Chicago, £19)
    Breaking the Social Media Prism: How to Make Our Platforms Less Polarizing by Chris Bail (Princeton, £20)
    The Wealth Paradox: Economic Prosperity and the Hardening of Attitudes by Frank Mols and Jolanda Jetten (Cambridge, £19.99)

  • Facebook revelations: what is in cache of internal documents?

    Roundup of what we have learned after the release of the papers and the whistleblower’s testimony to MPs.

    Dan Milmo, Global technology editor – Mon 25 Oct 2021

    Facebook has been at the centre of a wave of damaging revelations after a whistleblower released tens of thousands of internal documents and testified about the company’s inner workings to US senators.

    Frances Haugen left Facebook in May with a cache of memos and research that have exposed the inner workings of the company and the impact its platforms have on users. The first stories based on those documents were published by the Wall Street Journal in September.

    Haugen gave further evidence about Facebook’s failure to act on harmful content in testimony to US senators on 5 October, in which she accused the company of putting “astronomical profits before people”. She also testified to MPs and peers in the UK on Monday, as a fresh wave of stories based on the documents was published by a consortium of news organisations.

    Facebook’s products – the eponymous platform, the Instagram photo-sharing app, Facebook Messenger and the WhatsApp messaging service – are used by 2.8 billion people a day, and the company generated a net income – a US measure of profit – of $29bn (£21bn) last year. Here is what we have learned from the documents, and Haugen, since the revelations first broke last month.

    Teenage mental health
    The most damaging revelations focused on Instagram’s impact on the mental health and wellbeing of teenage girls. One piece of internal research showed that for teenage girls already having “hard moments”, one in three found Instagram made body issues worse. A further slide shows that one in three people who were finding social media use problematic found Instagram made it worse, with one in four saying it made issues with social comparison worse. Facebook described reports on the research, by the WSJ in September, as a “mischaracterisation” of its internal work. Nonetheless, the Instagram research has galvanised politicians on both sides of the Atlantic seeking to rein in Facebook.

    Violence in developing countries
    Haugen has warned that Facebook is fanning ethnic violence in countries including Ethiopia and is not doing enough to stop it. She said that 87% of Facebook’s spending on combating misinformation goes to English-language content, when only 9% of users are English speakers. According to the news site Politico on Monday, just 6% of Arabic-language hate content was detected on Instagram before it made its way on to the platform. Haugen told Congress on 5 October that Facebook’s use of engagement-based ranking – where the platform ranks a piece of content, and decides whether to put it in front of users, based on the number of interactions it gets from people – was endangering lives. “Facebook … knows, they have admitted in public, that engagement-based ranking is dangerous without integrity and security systems, but then not rolled out those integrity and security systems to most of the languages in the world. And that’s what is causing things like ethnic violence in Ethiopia,” she said.

    Divisive algorithm changes
    In 2018 Facebook changed the way it tailored content for users of its news feed feature, a key part of people’s experience of the platform. The emphasis on boosting “meaningful social interactions” between friends and family meant that the feed leant towards reshared material, which was often misinformed and toxic. “Misinformation, toxicity and violent content are inordinately prevalent among reshares,” said internal research. Facebook said it had an integrity team that was tackling the problematic content “as efficiently as possible”.

    Tackling falsehoods about the US presidential election
    The New York Times reported that internal research showed how, at one point after the US presidential election last year, 10% of all US views of political material on Facebook – a very high proportion for the platform – were of posts alleging that Joe Biden’s victory was fraudulent. One internal review criticised attempts to tackle “Stop the Steal” groups spreading claims that the election was rigged. “Enforcement was piecemeal,” said the research. The revelations have reignited concerns about Facebook’s role in the 6 January riots. Facebook said: “The responsibility for the violence that occurred … lies with those who attacked our Capitol and those who encouraged them.” However, the WSJ has also reported that Facebook’s automated systems were taking down posts generating only an estimated 3-5% of total views of hate speech.

    Disgruntled Facebook staff
    Within the files disclosed by Haugen are testimonies from dozens of Facebook employees frustrated by the company’s failure either to acknowledge the harms it generates or to properly support efforts to mitigate or prevent those harms. “We are FB, not some naive startup. With the unprecedented resources we have, we should do better,” wrote one employee quoted by Politico in the wake of the 6 January attack on the US Capitol. “Never forget the day Trump rode down the escalator in 2015, called for a ban on Muslims entering the US, we determined that it violated our policies, and yet we explicitly overrode the policy and didn’t take the video down,” wrote another. “There is a straight line that can be drawn from that day to today, one of the darkest days in the history of democracy … History will not judge us kindly.”

    Facebook is struggling to recruit young users
    A section of a complaint filed by Haugen’s lawyers with the US financial watchdog refers to young users in “more developed economies” using Facebook less. This is a problem for a company that relies on advertising for its income, because young users, with unformed spending habits, can be lucrative to marketers. The complaint quotes an internal document stating that Facebook’s daily teenage and young adult (18-24) users have “been in decline since 2012-13” and that “only users 25 and above are increasing their use of Facebook”. Further research reveals that “engagement is declining for teens in most western, and several non-western, countries”. Haugen said engagement was a key metric for Facebook, because it meant users spent longer on the platform, which in turn appealed to advertisers, who targeted users with adverts that accounted for $84bn (£62bn) of the company’s $86bn annual revenue. On Monday, Bloomberg said “time spent” for US teenagers on Facebook was down 16% year on year, and that young adults in the US were also spending 5% less time on the platform.

    Facebook is built for divisive content
    On Monday the NYT reported an internal memo warning that Facebook’s “core product mechanics”, or its basic workings, had let hate speech and misinformation grow on the platform. The memo added that the basic functions of Facebook were “not neutral”. “We also have compelling evidence that our core product mechanics, such as virality, recommendations and optimising for engagement, are a significant part of why these types of speech flourish on the platform,” said the 2019 memo. A Facebook spokesperson said: “At the heart of these stories is a premise which is false. Yes, we are a business and we make profit, but the idea that we do so at the expense of people’s safety or wellbeing misunderstands where our own commercial interests lie. The truth is we have invested $13bn and have over 40,000 people to do one job: keep people safe on Facebook.”

    Facebook avoids confrontations with US politicians and rightwing news organisations
    A document seen by the Financial Times showed a Facebook employee claiming that Facebook’s public policy team blocked decisions to take down posts “when they see that they could harm powerful political actors”. The document said: “In multiple cases the final judgment about whether a prominent post violates a certain written policy are made by senior executives, sometimes Mark Zuckerberg.” The memo said moves to take down content posted by repeat offenders against Facebook’s guidelines, such as rightwing publishers, were often reversed because the publishers might retaliate.

    The wave of stories on Monday was based on disclosures made to the Securities and Exchange Commission – the US financial watchdog – and provided to Congress in redacted form by Haugen’s legal counsel. The redacted versions were obtained by a consortium of news organisations including the NYT, Politico and Bloomberg.

  • Twitter admits bias in algorithm for rightwing politicians and news outlets

    TwitterTwitter admits bias in algorithm for rightwing politicians and news outletsHome feed promotes rightwing tweets over those from the left, internal research finds Dan Milmo Global technology editorFri 22 Oct 2021 08.04 EDTLast modified on Fri 22 Oct 2021 10.59 EDTTwitter has admitted it amplifies more tweets from rightwing politicians and news outlets than content from leftwing sources.The social media platform examined tweets from elected officials in seven countries – the UK, US, Canada, France, Germany, Spain and Japan. It also studied whether political content from news organisations was amplified on Twitter, focusing primarily on US news sources such as Fox News, the New York Times and BuzzFeed.The study compared Twitter’s “Home” timeline – the default way its 200 million users are served tweets, in which an algorithm tailors what users see – with the traditional chronological timeline where the most recent tweets are ranked first.The research found that in six out of seven countries, apart from Germany, tweets from rightwing politicians received more amplification from the algorithm than those from the left; right-leaning news organisations were more amplified than those on the left; and generally politicians’ tweets were more amplified by an algorithmic timeline than by the chronological timeline.According to a 27-page research document, Twitter found a “statistically significant difference favouring the political right wing” in all the countries except Germany. Under the research, a value of 0% meant tweets reached the same number of users on the algorithm-tailored timeline as on its chronological counterpart, whereas a value of 100% meant tweets achieved double the reach. On this basis, the most powerful discrepancy between right and left was in Canada (Liberals 43%; Conservatives 167%), followed by the UK (Labour 112%; Conservatives 176%). 
Even excluding top government officials, the results were similar, the document said.Twitter said it wasn’t clear why its Home timeline produced these results and indicated that it may now need to change its algorithm. A blog post by Rumman Chowdhury, Twitter’s director of software engineering, and Luca Belli, a Twitter researcher, said the findings could be “problematic” and that more study needed to be done. The post acknowledged that it was concerning if certain tweets received preferential treatment as a result of the way in which users interacted with the algorithm tailoring their timeline.“Algorithmic amplification is problematic if there is preferential treatment as a function of how the algorithm is constructed versus the interactions people have with it. Further root cause analysis is required in order to determine what, if any, changes are required to reduce adverse impacts by our Home timeline algorithm,” the post said.Twitter said it would make its research available to outsiders such as academics and it is preparing to let third parties have wider access to its data, in a move likely to put further pressure on Facebook to do the same. Facebook is being urged by politicians on both sides of the Atlantic to distribute its research to third parties after tens of thousands of internal documents – which included revelations that the company knew its Instagram app damaged teenage mental health – were leaked by the whistleblower Frances Haugen.The Twitter study compared the two ways in which a user can view their timeline: the first uses an algorithm to provide a tailored view of tweets that the user might be interested in based on the accounts they interact with most and other factors; the other is the more traditional timeline in which the user reads the most recent posts in reverse chronological order.The study compared the two types of timeline by considering whether some politicians, political parties or news outlets were more amplified than others. 
The study analysed millions of tweets from elected officials between 1 April and 15 August 2020, and hundreds of millions of tweets from news organisations, largely in the US, over the same period.

Twitter said it would make its research available to third parties but said privacy concerns prevented it from making available the “raw data”. The post said: “We are making aggregated datasets available for third party researchers who wish to reproduce our main findings and validate our methodology, upon request.”

Twitter added that it was preparing to make internal data available to external sources on a regular basis. The company said its machine-learning ethics, transparency and accountability team was finalising plans in a way that would protect user privacy.

“This approach is new and hasn’t been used at this scale, but we are optimistic that it will address the privacy-vs-accountability tradeoffs that can hinder algorithmic transparency,” said Twitter. “We’re excited about the opportunities this work may unlock for future collaboration with external researchers looking to reproduce, validate and extend our internal research.”


    Facebook whistleblower accuses firm of serially misleading over safety

Frances Haugen filed at least eight complaints against the company regarding its approach to safety

Dan Milmo, Global technology editor
Tue 5 Oct 2021 07.50 EDT. Last modified on Tue 5 Oct 2021 10.23 EDT

The Facebook whistleblower Frances Haugen, who testifies before the US Congress on Tuesday, has filed at least eight complaints with the US financial watchdog accusing the social media company of serially misleading investors about its approach to safety and the size of its audience.

The complaints, published online by the news programme 60 Minutes late on Monday, hours before Haugen’s testimony to US senators at 10am EDT (3pm BST), are based on tens of thousands of internal documents that Haugen copied shortly before she quit Facebook in May.

The complaints and testimony from Haugen, who stepped forward on Sunday as the source of a damning series of revelations in the Wall Street Journal, come against a backdrop of operational chaos for Facebook, whose platforms, including Instagram and WhatsApp, went offline around the world for nearly six hours on Monday.

The first whistleblower complaint filed to the US Securities and Exchange Commission relates to the 6 January riots in Washington, when crowds of protesters stormed the Capitol. It alleges that Facebook knowingly chose to permit political misinformation, and contests statements to the contrary made by its chief executive, Mark Zuckerberg.

“Our anonymous client is disclosing original evidence showing that Facebook … has, for years past and ongoing, violated US securities laws by making material misrepresentations and omissions in statements to investors and prospective investors,” the sweeping opening statement reads, “including, inter alia, through filings with the SEC, testimony to Congress, online statements and media stories.”

The complaints against Facebook, which reflect a series of reports in the Wall Street Journal in recent weeks, also cover:
    The company’s approach to hate speech.
    Its approach to teenage mental health.
    Its monitoring of human trafficking.
    How the company’s algorithms promoted hate speech.
    Preferential disciplinary treatment for VIP users.
    Promoting ethnic violence.
    Failing to inform investors about a shrinking user base in certain demographics.
The first complaint, regarding 6 January, contests testimony given to Congress in March by Facebook’s founder and chief executive, Mark Zuckerberg, in which he stated: “We remove language that incites or facilitates violence, and we ban groups that proclaim a hateful and violent mission.”

The complaint rebuts this, claiming that the company’s own records show it “knowingly chose to permit political misinformation and violent content/groups and failed to adopt or continue measures to combat these issues, including as related to the 2020 US election and the 6 January insurrection, in order to promote virality and growth on its platforms.”

According to one internal Facebook document quoted in the complaints, the company admits: “For example, we estimate that we may action as little as 3-5% of hate [speech] and ~0.6% of V&V [violent and inciting content] on Facebook.”

A complaint also alleges that Facebook misrepresented its “reach and frequency”, which are key metrics for the advertisers who provide the majority of Facebook’s revenue. That included concealing a decline in the key demographic of young users, the complaint stated. “During Covid, every cohort’s use of Facebook increased, except for those 23 and under, which continued to decline,” the complaint said.

“For years, Facebook has misrepresented core metrics to investors and advertisers including the amount of content produced on its platforms and growth in individual users,” it said, adding that this applied particularly in “high-value demographics” such as US teenagers.

Facebook has been approached for comment.

The human trafficking complaint alleges that Facebook and its photo-sharing app, Instagram, were aware in 2019 that the platforms were being used to “promote human trafficking and domestic servitude”.
The hate speech complaint quotes another internal document that states: “We only take action against approximately 2% of the hate speech on the platform.” The teen health complaint focuses on the most damaging allegation from the WSJ series: that Instagram knew the app caused anxiety about body image among teenage girls.

A complaint about Facebook’s approach to algorithms alleges that a tweak to the app’s News Feed product – a key part of users’ interaction with the app – led to the prioritisation of divisive content, while the complaint about ethnic violence contains an excerpt from an internal study that claims “in the Afghanistan market, the action rate for hate speech is worryingly low”.

Facebook has issued a series of statements downplaying Haugen’s document leaks, saying: its Instagram research showed that many teens found the app helpful; it was investing heavily in security at the expense of its bottom line; polarisation had been growing in the US for decades before Facebook appeared; and the company had “made fighting misinformation and providing authoritative information a priority”.

Responding to accusations that Facebook had misled the public and regulators, the company said: “We stand by our public statements and are ready to answer any questions regulators may have about our work.”


    Facebook ‘tearing our societies apart’: key excerpts from a whistleblower

Frances Haugen tells US news show why she decided to reveal inside story about social networking firm

Dan Milmo, Global technology editor
Mon 4 Oct 2021 08.33 EDT. Last modified on Mon 4 Oct 2021 10.30 EDT

Frances Haugen’s interview with the US news programme 60 Minutes contained a litany of damning statements about Facebook. Haugen, a former Facebook employee who had joined the company to help it combat misinformation, told the CBS show the tech firm prioritised profit over safety and was “tearing our societies apart”.

Haugen will testify in Washington on Tuesday, as political pressure builds on Facebook. Here are some of the key excerpts from Haugen’s interview.

Choosing profit over the public good

Haugen’s most cutting words echoed what is becoming a regular refrain from politicians on both sides of the Atlantic: that Facebook puts profit above the wellbeing of its users and the public. “The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook. And Facebook, over and over again, chose to optimise for its own interests, like making more money.”

She also accused Facebook of endangering public safety by reversing changes to its algorithm once the 2020 presidential election was over, allowing misinformation to spread on the platform again. “And as soon as the election was over, they turned them [the safety systems] back off or they changed the settings back to what they were before, to prioritise growth over safety. And that really feels like a betrayal of democracy to me.”

Facebook’s approach to safety compared with others

In a 15-year career as a tech professional, Haugen, 37, has worked for companies including Google and Pinterest, but she said Facebook had the worst approach to restricting harmful content.
She said: “I’ve seen a bunch of social networks and it was substantially worse at Facebook than anything I’d seen before.” Referring to Mark Zuckerberg, Facebook’s founder and chief executive, she said: “I have a lot of empathy for Mark. And Mark has never set out to make a hateful platform. But he has allowed choices to be made where the side-effects of those choices are that hateful, polarising content gets more distribution and more reach.”

Instagram and mental health

The document leak that had the greatest impact was a series of research slides that showed Facebook’s Instagram app was damaging the mental health and wellbeing of some teenage users, with 30% of teenage girls feeling that it made dissatisfaction with their body worse.

She said: “And what’s super tragic is Facebook’s own research says, as these young women begin to consume this eating disorder content, they get more and more depressed. And it actually makes them use the app more. And so, they end up in this feedback cycle where they hate their bodies more and more. Facebook’s own research says it is not just that Instagram is dangerous for teenagers, that it harms teenagers, it’s that it is distinctly worse than other forms of social media.”

Facebook has described the Wall Street Journal’s reporting on the slides as a “mischaracterisation” of its research.

Why Haugen leaked the documents

Haugen said “person after person” had attempted to tackle Facebook’s problems but had been ground down. “Imagine you know what’s going on inside of Facebook and you know no one on the outside knows.
I knew what my future looked like if I continued to stay inside of Facebook, which is person after person after person has tackled this inside of Facebook and ground themselves to the ground.”

Having joined the company in 2019, Haugen said she decided to act this year and started copying tens of thousands of documents from Facebook’s internal system, which she believed showed that Facebook was not, despite public comments to the contrary, making significant progress in combating online hate and misinformation. “At some point in 2021, I realised, ‘OK, I’m gonna have to do this in a systemic way, and I have to get out enough that no one can question that this is real.’”

Facebook and violence

Haugen said the company had contributed to ethnic violence, a reference to Burma. In 2018, following the massacre of Rohingya Muslims by the military, Facebook admitted that its platform had been used to “foment division and incite offline violence” relating to the country. Speaking on 60 Minutes, Haugen said: “When we live in an information environment that is full of angry, hateful, polarising content it erodes our civic trust, it erodes our faith in each other, it erodes our ability to want to care for each other. The version of Facebook that exists today is tearing our societies apart and causing ethnic violence around the world.”

Facebook and the Washington riot

The 6 January riot, when crowds of rightwing protesters stormed the Capitol, came after Facebook disbanded the Civic Integrity team of which Haugen was a member. The team, which focused on issues linked to elections around the world, was dispersed to other Facebook units following the US presidential election. “They told us: ‘We’re dissolving Civic Integrity.’ Like, they basically said: ‘Oh good, we made it through the election. There wasn’t riots. We can get rid of Civic Integrity now.’ Fast-forward a couple months, we got the insurrection.
And when they got rid of Civic Integrity, it was the moment where I was like, ‘I don’t trust that they’re willing to actually invest what needs to be invested to keep Facebook from being dangerous.’”

The 2018 algorithm change

Facebook changed the algorithm on its news feed – Facebook’s central feature, which supplies users with a customised feed of content such as friends’ photos and news stories – to prioritise content that increased user engagement. Haugen said this made divisive content more prominent.

“One of the consequences of how Facebook is picking out that content today is it is optimising for content that gets engagement, or reaction. But its own research is showing that content that is hateful, that is divisive, that is polarising – it’s easier to inspire people to anger than it is to other emotions.” She added: “Facebook has realised that if they change the algorithm to be safer, people will spend less time on the site, they’ll click on less ads, they’ll make less money.”

Haugen said European political parties had contacted Facebook to say that the news feed change was forcing them to take more extreme political positions in order to win users’ attention. Describing politicians’ concerns, she said: “You are forcing us to take positions that we don’t like, that we know are bad for society. We know if we don’t take those positions, we won’t win in the marketplace of social media.”

In a statement to 60 Minutes, Facebook said: “Every day our teams have to balance protecting the right of billions of people to express themselves openly with the need to keep our platform a safe and positive place. We continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true.
If any research had identified an exact solution to these complex challenges, the tech industry, governments, and society would have solved them a long time ago.”