Meta Unveils Parent-Managed WhatsApp Accounts for Children Under 13 to Address Global Child Safety Concerns
WhatsApp introduces parent-managed accounts for children under 13, featuring restricted messaging and contact controls to improve digital safety for pre-teens.
By: AXL Media
Published: Mar 11, 2026, 11:31 AM EDT
Source: CNA

New Safeguards for Early Messaging Experiences
WhatsApp has introduced a specialized account tier for children under 13, with parental oversight as its core safety feature. According to a statement from the Meta Platforms subsidiary, these accounts are restricted to essential messaging and calling functions, stripping out the broader social features that have drawn regulatory fire. The move follows sustained feedback from guardians seeking a controlled environment in which younger users can communicate with family and peers without exposure to the risks common on unrestricted platforms.
Heightened Oversight and Contact Management Tools
Under the new framework, parents or legal guardians retain full control over the pre-teen user's digital interactions. Reports indicate that the system requires parental approval for every new contact request and determines which group chats a child may join. Guardians can also review incoming message requests from unknown parties and adjust privacy settings in real time. This architecture is intended to create a shielded communication loop, ensuring that a child's first experience with digital messaging remains supervised and limited to verified individuals.
The Context of Rising Global Regulatory Scrutiny
This strategic rollout arrives as governments worldwide intensify their efforts to curb the impact of digital platforms on youth mental health. Australia recently led the way by adopting a landmark social media ban for teenagers, a move that has prompted other nations to consider similar legislative measures. According to industry observers, Meta’s decision to launch parent-managed tools is a preemptive attempt to satisfy international regulators who are increasingly wary of the psychological and social pressures exerted by unregulated chat applications on vulnerable age groups.
Related Coverage
- Roblox Agrees to $12 Million Nevada Settlement Following Intensive Child Safety and Predator Access Allegations
- Meta to discontinue end-to-end encryption for Instagram Direct Messages starting May 2026
- Russian State Hackers Target Dutch Officials in Sophisticated Signal and WhatsApp Account Hijacking Campaign
- Dutch Intelligence Warns of Russian Hackers Targeting Diplomats and Military via Signal and WhatsApp