Ofcom Issues 40-Point Mandate for Tech Firms
The UK’s communications regulator, Ofcom, has unveiled stringent new guidelines requiring technology companies to implement sweeping child-protection measures or face severe penalties. Under the Online Safety Act, platforms must adopt 40 practical steps by 24 July 2025, including content-filtering algorithms, robust age verification, and enhanced reporting systems, or risk fines of up to 10% of global revenue and, in the most serious cases, being shut out of the UK market.
Melanie Dawes, Ofcom chief executive, said: “These changes are a reset for children online. They will mean safer social media feeds with less harmful and dangerous content, protections from being contacted by strangers and effective age checks on adult content. Ofcom has been tasked with bringing about a safer generation of children online, and if companies fail to act they will face enforcement.”
Key Requirements for Compliance
Algorithmic Safeguards: Tech firms must deploy tools to filter harmful content from children’s feeds and limit exposure to dangerous material.
Age Verification: Platforms must introduce “effective” checks to prevent under-18s from accessing adult content, with Apple and Google urged to flag downloads of age-restricted apps.
Reporting and Support: Companies must simplify user reporting processes and provide guidance to those encountering harmful content.
Governance: Firms must appoint a child safety lead and establish senior oversight bodies to review risks annually.
Stakeholder Input and Enforcement
The guidelines follow a consultation that drew tens of thousands of responses from parents, children, and experts, who demanded stronger controls over group chats, content moderation, and feed customisation. Dawes framed the rules as a “reset” for children’s online experiences, warning that platforms that fail to act will face strict enforcement.
Labour MP Gregor Poynton, chair of the All-Party Parliamentary Group on Children’s Online Safety, advocated for deeper collaboration with tech giants to improve age-checking systems, suggesting app stores could redirect underage users to more rigorous verification processes.
“Apple and Google should look at whether, when someone downloads an app that is for 18 and over…could they flag that that person might be under 18? Therefore, they could then get sent on a different journey for age verification, which is perhaps longer and more in-depth.”
With Ofcom securing additional funding and staff to enforce compliance, the message to the industry is clear: prioritise child safety or risk existential consequences.
At Big Sister, we advocate for children's safety online. Our app marks a positive change in how children are protected online, using flags and alerts to warn parents of dangerous content without eroding the trust and privacy between children and parents.
Find out more about how to protect your children online without breaking their trust in our latest blog here.
Or sign up to our waitlist to be the first to know when the app launches and get access to our early bird discount.