As reported by the BBC, UK regulators have told major technology firms they should introduce stronger systems to stop children under 13 from accessing social media platforms.
The request was made jointly by Ofcom and the Information Commissioner’s Office, which have written to companies including Facebook, Instagram, Snapchat, TikTok, YouTube, Roblox and X about tightening protections for younger users.
Most of these services set 13 as their minimum age, yet regulators say large numbers of younger children are still signing up. According to Ofcom research, around 86% of 10- to 12-year-olds already have their own social media accounts.
Officials say this suggests current safeguards are not working effectively, and that companies must take stronger action to prevent underage access and reduce potential online harms.
Many platforms currently depend on users simply entering their date of birth when creating an account, a system critics say is easy for children to bypass.
Regulators are now encouraging firms to introduce far more reliable age-checking tools. Technology of this kind is already required for certain online services that host adult material, including pornography.
While such checks are not yet mandatory for mainstream social media, Ofcom has said companies should voluntarily adopt similar protections to ensure younger children cannot easily access platforms designed for teenagers and adults.