Parents and children can expect to “experience a different internet for the first time”, according to the Technology Secretary, as new safety measures came into effect.
Peter Kyle said he had “high expectations” for the changes, as the head of the regulator in charge of enforcement against social media platforms which do not comply urged the public to “judge us by the impact we secure”.
While some campaigners have welcomed the new protections – which include age checks to prevent children accessing pornography and other harmful content – others have branded them a “sticking plaster”.
Charities and other organisations working in the sector of children’s safety have agreed the key will be ensuring the measures are enforced, urging Ofcom to “show its teeth”.
The changes also require platforms to ensure algorithms do not work to harm children by, for example, pushing content relating to self-harm and eating disorders towards them.
Actions which could be taken against firms which fail to adhere to the new codes include fines of up to £18 million or 10% of qualifying worldwide revenue, whichever is greater, and court orders potentially blocking access in the UK.
Mr Kyle has said a generation of children will not be allowed to grow up “at the mercy of toxic algorithms” as he pledged the Government is laying the foundations for a safer, healthier, more humane online world and warned tech firms “will be held to account” if they fail to act in line with the changes.
He told Sky News: “I have very high expectations of the change that children will experience.
“And let me just say this to parents and children: you will experience a different internet really, for the first time from today, moving forward, than you’ve had in the past. And that is a big step forward.”
The measures, part of the Online Safety Act and set to be enforced by regulator Ofcom, require online platforms to have age checks – using facial age estimation or credit card checks – if they host pornography or other harmful content relating to self-harm, suicide or eating disorders.
Ofcom chief executive Dame Melanie Dawes said the regulator’s research had shown half a million eight- to 14-year-olds have come across pornography online in the last month alone.
When it was put to her by the BBC that one of its staff members testing the new measures had been able to sign up to a well-known porn site on Friday using just an email address, she said sites will be “checking patterns of email use” behind the scenes to verify users are adults.
She told Radio 4’s Today programme: “We’ve shown that we’ve got teeth and that we’re prepared to use them at Ofcom. And we have secured commitments across the porn industry and from the likes of X that no other country has secured. These things can work.
“Judge us by the impact we secure. And absolutely, please do tell us if you think there’s something we need to know about that isn’t working because the law is very clear now.”
She also said the Government is right to be considering limits on the amount of time children can spend on social media apps.
Earlier this week, Mr Kyle said he wanted to tackle “compulsive behaviour” and ministers are reportedly considering a two-hour limit, with curfews also under discussion.
Dame Melanie told LBC: “I think the Government is right to be opening up this question. I think we’re all a bit addicted to our phones, adults and children, obviously particularly a concern for young people. So, I think it’s a good thing to be moving on to.”
Children’s charities the NSPCC and Barnardo’s are among those who have welcomed the new checks in place from Friday, as well as the Internet Watch Foundation (IWF).
The IWF warned the “safeguards put in place need to be robust and meaningful” and said there is “still more to be done”, as it urged tech platforms to build in safeguards rather than having them as “an afterthought”.
The Molly Rose Foundation – set up by bereaved father Ian Russell after his 14-year-old daughter Molly took her own life having viewed harmful content on social media – said there is a “lack of ambition and accountability” in the measures, and accused the regulator of choosing to “prioritise the business needs of big tech over children’s safety”.
Andy Burrows, chief executive of the foundation, told Sky News: “We’ve always had a very simple test for the Online Safety Act: will it stop further young people like Molly from dying because of the harmful design of social media platforms?
“And regrettably, we just don’t think it passes that test. This is a sticking plaster, not the comprehensive solution that we really need.”
Ofcom said it has also launched a monitoring and impact programme focused on some of the platforms where children spend most time, including social media sites Facebook, Instagram and TikTok, gaming site Roblox and video clip website YouTube.
The sites are among those which have been asked to submit, by August 7, a review of their efforts to assess risks to children and, by September 30, scrutiny of the practical actions they are taking to keep children safe.