WhatsApp Launches Parent-Managed Accounts for Kids Under 13

WhatsApp is rolling out parent-managed accounts for children under 13. It’s a direct response to growing global pressure — countries like Denmark, Germany, Spain, and the UK are either banning or heavily restricting social media for kids. Rather than wait to be regulated out of the market, Meta is building its own version of kid-safe messaging. The question is whether this is genuine child protection or a clever way to get 3 billion users’ kids onto the platform before competitors do.
What Are Parent-Managed Accounts?
Setup requires both the parent’s and the child’s devices. Parents scan a QR code to link the accounts, and a six-digit PIN protects all parental settings — meaning the child can’t change them without the parent’s code.
Key restrictions on managed accounts:
- Messaging and calling only — no Meta AI, no Channels, no Status updates
- Only saved contacts can message the child by default
- Group invites locked behind the parent’s PIN
- Images from unknown contacts are blurred automatically
- Unknown senders show context cards with details like common groups and country
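Taken together, these restrictions amount to a policy layer sitting on top of a normal account. Here is a minimal Python sketch of how such a policy might be modeled — all names and structures are hypothetical, since WhatsApp’s actual implementation is not public:

```python
from dataclasses import dataclass, field

@dataclass
class ManagedAccountPolicy:
    """Hypothetical policy object for a parent-managed account."""
    parent_pin: str                        # six-digit PIN guarding all settings
    saved_contacts: set = field(default_factory=set)
    meta_ai_enabled: bool = False          # locked off: messaging/calling only
    channels_enabled: bool = False
    status_enabled: bool = False

    def can_message_child(self, sender: str) -> bool:
        # Only saved contacts may message the child by default.
        return sender in self.saved_contacts

    def should_blur_image(self, sender: str) -> bool:
        # Images from unknown contacts are blurred automatically.
        return sender not in self.saved_contacts

    def accept_group_invite(self, pin_entered: str) -> bool:
        # Group invites are locked behind the parent's PIN.
        return pin_entered == self.parent_pin

# Illustrative usage with made-up numbers:
policy = ManagedAccountPolicy(parent_pin="123456",
                              saved_contacts={"+4512345678"})
print(policy.can_message_child("+4512345678"))  # True: saved contact
print(policy.should_blur_image("+4599999999"))  # True: unknown sender
```

The key design point the sketch captures is that every permission check runs against parent-controlled state, and the only write path to that state is the PIN.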
What Parents Can Monitor
By default, parents are alerted when their pre-teen adds, blocks, or reports a contact. Beyond that, parents can opt into additional notifications:
- Name or profile picture changes
- New chat requests from unknown contacts
- Group activity and invitations
- When disappearing messages are enabled
- When chats are deleted
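The alert scheme above is effectively two tiers of event subscriptions: a mandatory default set plus parent-selected opt-ins. A small sketch, with illustrative event names (not WhatsApp’s actual identifiers):

```python
# Alerts every parent receives by default.
DEFAULT_ALERTS = {"contact_added", "contact_blocked", "contact_reported"}

# Additional alerts parents can opt into. Note these are event types
# (metadata), never message content -- chats stay end-to-end encrypted.
OPT_IN_ALERTS = {
    "profile_changed",                 # name or profile picture changes
    "new_chat_request",                # requests from unknown contacts
    "group_activity",                  # group activity and invitations
    "disappearing_messages_enabled",
    "chat_deleted",
}

def alerts_for(opted_in: set) -> set:
    """Alerts a parent receives: the defaults plus any valid opt-ins."""
    return DEFAULT_ALERTS | (opted_in & OPT_IN_ALERTS)

print(sorted(alerts_for({"chat_deleted"})))
```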
Importantly, all chats remain end-to-end encrypted. Parents cannot read their child’s messages — they only see metadata about activity, not content.
The Privacy Trade-Off
This is where it gets interesting. WhatsApp is threading a needle between parental oversight and child privacy. The encryption stays intact, which means WhatsApp itself can’t read the messages either. Parents get activity alerts but not message content. It’s a compromise that will satisfy neither the “parents should see everything” crowd nor the “kids deserve full privacy” advocates.
When the child turns 13, they get a notification to convert to a standard account — with an option to delay the transition by up to 12 months. Smart move: it gives families a gradual off-ramp rather than an abrupt switch.
The Bigger Picture
Let’s be honest about what’s happening here. With 3 billion+ users, WhatsApp is the world’s dominant messaging platform. Kids under 13 were technically banned from using it, but enforcement was essentially nonexistent — any child could sign up with a phone number and lie about their age.
By creating official managed accounts, WhatsApp accomplishes two things simultaneously: it addresses regulatory pressure from governments cracking down on kids’ social media use, and it creates a legitimate pathway for under-13 users onto the platform. Instead of losing these users to competitors or regulation, WhatsApp gets to onboard them — with parental consent as the fig leaf.
The Bottom Line
WhatsApp’s parent-managed accounts are a pragmatic response to a real problem. Kids are already on messaging platforms, and giving parents actual controls is better than the current honor system of age verification. But don’t mistake this for pure altruism — Meta is building a pipeline to convert every managed account into a full WhatsApp user the moment they age up. The parental controls are real, but so is the business strategy behind them. The feature is rolling out in select geographies now, with global expansion planned over the next few months.