
Comply with child age checks or face consequences, Ofcom tells tech firms

Tech firms have been warned to act now or face the consequences, as new online safety protections for children come into force.

From Friday, so-called “risky” sites and apps will be expected to use what the regulator has described as “highly effective” age checks to identify which users are children and prevent them from accessing pornography, as well as other harmful content relating to self-harm, suicide, eating disorders and extreme violence.

But some online safety campaigners said that while the new measures should have been a “watershed moment for young people”, regulator Ofcom has instead “let down” parents, accusing it of choosing to “prioritise the business needs of big tech over children’s safety”.

The Molly Rose Foundation, founded by bereaved father Ian Russell after his 14-year-old daughter Molly took her own life having viewed harmful content on social media, said the changes lack ambition and accountability and warned that big tech will have taken note.

In the face of campaigners’ criticism, Ofcom chief executive Dame Melanie Dawes has previously defended the reforms, insisting that tech firms have not been handed power over the new measures, which are coming into effect as part of the Online Safety Act.

The changes, which will apply across the UK, include age checks on pornography websites, as well as others such as dating app Grindr, which Ofcom said will ensure it is more difficult for children in the UK to access online porn than in many other countries.

The regulator said sites such as X, formerly Twitter, and others including Bluesky and Reddit have also committed to age assurances.

Ofcom said its safety codes also demand that algorithms “must be tamed and configured for children so that the most harmful material is blocked”.

It said it has launched a monitoring and impact programme focused on some of the platforms where children spend the most time, including the social media sites Facebook, Instagram and TikTok, the gaming platform Roblox and the video-sharing site YouTube.

The sites are among those asked to submit, by August 7, a review of their efforts to assess risks to children and, by September 30, details of the practical actions they are taking to keep children safe.

Firms that fail to comply with the new codes could face fines of up to £18 million or 10% of qualifying worldwide revenue, whichever is greater, as well as court orders potentially blocking access in the UK.

The NSPCC has warned that Ofcom must “show its teeth and fully enforce the new codes”.

Dame Melanie said: “Prioritising clicks and engagement over children’s online safety will no longer be tolerated in the UK.

“Our message to tech firms is clear – comply with age checks and other protection measures set out in our codes, or face the consequences of enforcement action from Ofcom.”

But Andy Burrows, chief executive of the Molly Rose Foundation, said: “This should be a watershed moment for young people but instead we’ve been let down by a regulator that has chosen to prioritise the business needs of big tech over children’s safety.”

He said the “lack of ambition and accountability will have been heard loud and clear in Silicon Valley”.

He added: “We now need a clear reset and leadership from the Prime Minister. That means nothing less than a new Online Safety Act that fixes this broken regime and firmly puts the balance back in favour of children.”

Chris Sherwood, chief executive at the NSPCC, said: “Children, and their parents, must not solely bear the responsibility of keeping themselves safe online. It’s high time for tech companies to step up.”

He said if enforcement is “strong”, the codes should offer a “vital layer of protection” for children and young people when they go online, adding: “If tech companies fail to comply, Ofcom must show its teeth and fully enforce the new codes”.

England’s Children’s Commissioner, Dame Rachel de Souza, said Friday “marks a new era of change in how children can be protected online, with tech companies now needing to identify and tackle the risks to children on their platforms or face consequences”, and said the measures must keep pace with emerging technology to remain effective in the future.

She added: “I will continue to reflect the views of children in the work I do with Ofcom to make the online world a safer place for all children. Protection must always come before profit.”


Source: UK Politics - www.independent.co.uk
