LONDON, March 12 — Media and privacy regulators in the United Kingdom (UK) have demanded that major social media platforms do more to keep children off their services, warning that companies are failing to enforce their own minimum-age rules.
The UK has been weighing tougher curbs on children's access to social media, with the government considering barring under-16s from such platforms, mirroring a move by Australia.
Ofcom and the Information Commissioner's Office said they had grown increasingly concerned about algorithmic feeds that expose children to harmful or addictive content.
"These online services are household names, but they are failing to put children's safety at the heart of their products. That must now change quickly, or Ofcom will act," said Ofcom's chief executive Melanie Dawes on Thursday.

Use 'modern' tech, companies told
In the latest implementation phase of the UK's Online Safety Act, Ofcom told Facebook and Instagram — both owned by Meta — as well as Roblox, Snapchat, ByteDance's TikTok, and Alphabet's YouTube to show by April 30 how they would tighten age checks, restrict strangers from contacting children, make feeds safer, and stop testing new products on minors.
The ICO separately issued an open letter to the same platforms, calling on them to adopt "modern, viable" age-assurance tools to stop those under 13 accessing services not designed for them.
"There is now modern technology at your fingertips, so there is no excuse," said its chief executive Paul Arnold.
A Meta spokesman said the company already uses AI-based age detection and age-estimation tools and places teens in accounts with built-in protections, adding that age should be verified "centrally at the app store level" so families do not have to provide personal information multiple times.
A YouTube spokesman said the platform also offered age-appropriate experiences and was "surprised to see Ofcom move away from a risk-based approach", urging the regulator to focus on "high-risk services" that were failing to comply with the law.
A Roblox spokesman said the company had launched more than 140 new safety features in the past year, including mandatory age checks for chat, designed to prevent adults from communicating with children.
"While no system is ever perfect, we continue to strengthen protections designed to keep players safe," the spokesman said.
Snapchat did not respond to a request for comment. TikTok declined to comment.
Ofcom can fine companies up to 10 per cent of their qualifying global revenue, while the ICO can issue fines of up to four per cent of a company's global annual turnover.
Last month, the privacy watchdog fined Reddit nearly £14.5 million (RM76.16 million) for failing to introduce meaningful age checks and for processing children's data unlawfully.