Social media companies are putting lives at risk by failing to “detox” their platforms of misinformation about public health issues such as coronavirus, NHS staff have warned.
Some Covid-19 patients have been rushed to intensive care after delaying seeking medical help for symptoms because of fake news about the disease, a doctor told a parliamentary inquiry.
The NHS 111 helpline has been flooded with questions about false rumours callers had read on the internet, MPs on the Digital, Culture, Media and Sport subcommittee on online harms and disinformation heard.
The committee also grilled representatives of Twitter, Facebook and Google for the second time on Thursday following criticism by chair Julian Knight about “a lack of clarity” in evidence to an earlier hearing and “failures to provide adequate answers to follow-up correspondence”.
The three executives, as well as a fourth from YouTube, appeared before MPs over video after research showed social media firms were removing less than one in 10 posts spreading “dangerous” coronavirus fake news.
The Centre for Countering Digital Hate, which published the research, accused the platforms of “shirking their responsibility” to stop the spread of “falsehoods”.
Giving evidence to the committee, Dr Megan Emma Smith, a consultant anaesthetist at Royal Free London Hospital, said “doctors across the board” were “deeply concerned” about misinformation.
She said: “What I’ve seen is a lot of patients who aren’t presenting to hospital — they’re presenting very, very late on in the illness — because, in some of their cases, they have been afraid to come to hospital or they’ve believed online messaging that the illness isn’t as serious as it really is.
“By the time they come to me… they are unbelievably sick and they have required intubation.”
Thomas Knowles, an advanced paramedic practitioner for NHS 111, said at the height of the coronavirus crisis he dealt with “multiple calls a day” involving misinformation, ranging from the use of certain medications to do-not-resuscitate orders.
He recalled one woman who he believed to be suffering a heart attack who refused medical attention “because she’d read on Facebook that [coronavirus] meant she’d definitely die if she went to hospital”.
He also warned the spread of anti-vaccination conspiracy theories could potentially undermine “one of our ways out of this pandemic”.
Mr Knowles accused social media firms of “profiting off of a system which places everyone at increased risk of harm” and called for regulation to prevent platforms “removing themselves from that social responsibility”.
The committee was also sent submissions from healthcare workers on the frontline of the coronavirus pandemic who signed an open letter urging social media firms to “correct the record” on misinformation by alerting all users who encounter it. One doctor in New York said his neighbours had died “because of a delayed federal government response informed by online conspiracy theories”.
The letter, signed by the medics, called for platforms to “detox the algorithms that decide what people see” to prevent “harmful lies” being amplified.
Questioning Leslie Miller, YouTube’s vice-president of government affairs and public policy, the Labour MP Yvette Cooper asked why the video-streaming website had promoted “shocking” anti-vaccination and 5G conspiracy theories on its home page.
“Surely that is utterly irresponsible of YouTube, and I have been raising this issue with you and your colleagues repeatedly,” said Ms Cooper, who as Home Affairs Committee chair had joined the session as a guest.
Ms Miller said YouTube had expanded its policies on harmful and dangerous content to include “content that contradicts medical or scientific facts”, but acknowledged there was “always more to do in this area”. She noted the platform had removed conspiracy theorist David Icke’s channel after it linked coronavirus to 5G and “Jewish cults”.
However, the Scottish National Party MP John Nicolson said Icke was still “spreading lies” on monetised videos on other YouTube channels.
“You’re doing nothing about it. You know exactly what you’re doing and I think it’s enormously cynical,” he told Ms Miller. “It suits your purposes to have David Icke on because he’s clickbait.”
Monika Bickert, Facebook’s head of product policy, said millions of users had viewed official coronavirus health information which the platform had been promoting during the pandemic.
But Facebook faced criticism from the committee over its decision not to take action over an inflammatory post in which Donald Trump threatened to shoot “looters” following violent protests over the death of George Floyd.
“It looks to me like something is rotten in the state of Facebook,” Mr Nicolson said.
Company founder Mark Zuckerberg’s defence of the decision not to remove the post this week prompted staff walkouts and resignations, as well as condemnation from civil rights leaders.
Ms Bickert admitted Facebook’s processes for removing content were “not perfect” but said Mr Trump’s post had not violated its policies.
Twitter faced Mr Trump’s wrath after it concealed the same post, shared by the US president and the White House, behind a warning about “promoting violence”.
The committee asked Twitter’s director of public policy, Nick Pickles, whether Mr Trump’s account could be suspended if he continued to violate the platform’s rules.
He did not rule it out, replying: “Every Twitter account is subject to the rules.”