Pinnacle Gazette

UK Regulators Demand Stricter Age Checks on Social Media Platforms

Ofcom and ICO call for enhanced protections for children online amid rising concerns about safety.

Category: Technology

In a significant move aimed at bolstering online safety for children, UK regulators Ofcom and the Information Commissioner's Office (ICO) have urged major technology companies to implement more robust age checks to keep users under 13 off their services. The call comes in response to growing concern about how easily younger users can access social media platforms and how inadequate current age verification processes have proved.

The platforms under scrutiny include household names such as Facebook, Instagram, Snapchat, TikTok, YouTube, Roblox, and X. Ofcom's chief executive, Dame Melanie Dawes, expressed her discontent with the current state of child safety measures, stating that these services are "failing to put children's safety at the heart of their products." This assertion highlights a critical gap between what tech companies promise and their actual practices in safeguarding children online.

Ofcom and the ICO's joint letter sets out a series of demands aimed at protecting children from inappropriate content and interactions. These include effective minimum age policies that actually keep under-13s out, robust protections against grooming, safer content feeds for children, and an end to product testing on minors. The regulators have set a deadline of April 30, 2026, for the platforms to report back on the measures they intend to implement.

Despite the urgency of these demands, some companies have responded defensively. YouTube's parent company, Google, expressed surprise at Ofcom's approach, suggesting that the regulator should focus on higher-risk services instead. "We urge them to focus instead on high-risk services that are failing to comply with the codes set out in the Online Safety Act," a spokesperson stated. This reflects a broader trend of resistance from the industry to increased regulatory scrutiny.

Currently, most social media platforms enforce a minimum age limit of 13, but Ofcom's research indicates that a staggering 86% of children aged 10 to 12 have their own social media profiles. This alarming statistic underscores the inadequacy of self-reported age declarations, which many platforms rely on. As the ICO noted in its letter, "self-declaration is easily circumvented, meaning underage children can easily access services that have not been designed for them." This loophole raises serious questions about the effectiveness of existing safeguards.

Dame Melanie Dawes emphasized the need for substantial action, stating, "We're not saying this is a completely blank sheet of paper they need to address, but they have not gone far enough." The push for enhanced age verification measures is not merely a regulatory formality; it represents a growing recognition of the need to prioritize children's safety in an increasingly digital world.

The ICO is particularly concerned with how young children's data is handled. In the letter, ICO chief executive Paul Arnold pointed out that services with a minimum age of 13 often lack a lawful basis for processing the personal data of children under that age. This gap in legal protections has raised alarms about the potential misuse of sensitive information.

Technology Secretary Liz Kendall voiced her support for Ofcom's initiatives, asserting that no platform should receive a "free pass" when it comes to protecting children online. "No company should need a court order to act responsibly to protect children," she added, reinforcing the notion that accountability in the tech industry is paramount.

As regulators shift their focus from fringe sites to mainstream social media giants, the landscape of online safety enforcement is changing. Ofcom has investigated nearly 100 services since the UK's online safety laws took effect in 2025, making progress in areas such as curbing the sharing of child sexual abuse material and requiring age checks for pornography sites. However, Ofcom has made it clear that the industry has "not done enough," prompting the current demands for stricter measures.

Experts in digital mental health have welcomed these regulatory actions but caution that they must be just the beginning. Professor Amy Orben from Cambridge University remarked that safety must be integrated into digital products by design, rather than treated as an afterthought. She stated, "Regulators must show more strength in holding companies to account."

Social media analyst Matt Navarra echoed this sentiment, arguing that identifying child users is only the start. "Knowing a user is a child is step one," he noted, "but designing a platform that doesn't exploit their attention is the next step—and that step is actually much harder."

As the deadline approaches, the pressure is mounting on these platforms to take meaningful action. The expectation is that they will not only comply with the regulators' demands but also demonstrate a genuine commitment to creating a safer online environment for children. The current situation serves as a crucial reminder of the responsibilities that come with operating in the digital space, particularly when it concerns the most vulnerable users.

The UK regulators' call for stricter age checks marks a pivotal moment in the ongoing debate about child safety in the digital realm. As technology continues to evolve, so too must the frameworks that govern its use, ensuring that children's rights and safety remain at the forefront.