Ofcom unveils new rules to protect children online

Ofcom has today finalised new protections intended to make children's lives online safer.

More than 40 measures have been laid down for tech firms to meet their duties under the Online Safety Act. These will apply to sites and apps used by UK children in areas such as social media, search and gaming. This follows consultation and research involving tens of thousands of children, parents, companies and experts.

The steps include preventing minors from encountering the most harmful content relating to suicide, self-harm, eating disorders and pornography. Online services must also act to protect children from misogynistic, violent, hateful or abusive material, online bullying and dangerous challenges.

Dame Melanie Dawes, Ofcom chief executive, said: “These changes are a reset for children online. They will mean safer social media feeds with less harmful and dangerous content, protections from being contacted by strangers and effective age checks on adult content. Ofcom has been tasked with bringing about a safer generation of children online, and if companies fail to act they will face enforcement.”

Providers of services likely to be accessed by UK children now have until 24 July to finalise and record their assessment of the risk their service poses to children, which Ofcom may request. From 25 July 2025, they should apply the safety measures set out in Ofcom's Codes to mitigate those risks.

If companies fail to comply with their new duties, Ofcom has the power to impose fines and – in very serious cases – apply for a court order to prevent the site or app from being available in the UK.
