POSH
Safer Defaults for Child Accounts
Child safety should not depend on parents finding every setting manually.
Child accounts should start safer — not require parents to fix them after the fact.
A systemic problem
SAFETY SHOULD BE BUILT IN — NOT HIDDEN IN SETTINGS
Most child accounts begin with open messaging, stranger access, and high-risk features already enabled.
Parents are expected to find and fix these manually — often after exposure has already begun.
If safety is optional, risk becomes normal.
Why this matters
Most parents do not know every setting across every device, app, and platform.
Children are often exposed simply because defaults were never designed with safety first.
Open by default = exposed by default
POSH position:
Child accounts should start with the safest reasonable settings enabled automatically — with parents choosing to loosen them later if needed.
Why defaults matter
Most harm does not start with one big mistake. It starts with small openings.
DMs open
↓
Stranger contact begins
↓
Trust builds
↓
Private communication grows
↓
Secrecy increases
↓
Risk escalates
Safer defaults close the first openings — before escalation begins
What safer defaults should look like
- Direct messages OFF by default
- Stranger contact restricted
- Location sharing OFF
- Live streaming restricted or disabled
- Digital gifting blocked unless parent-approved
- Friend requests limited or reviewed
- Private group access restricted
- Clear activity visibility for parents
High-risk features should be opt-in — not automatically open.
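The list above can be sketched in code. This is a hypothetical illustration only; none of these field or method names belong to any real platform. It shows the POSH principle that a child account should start locked down, with features loosened only by explicit parent opt-in:

```python
from dataclasses import dataclass

# Hypothetical sketch of "safe by default" child-account settings.
# Every high-risk feature starts closed; gifting and friend requests
# start gated behind parent review.
@dataclass
class ChildAccountSettings:
    direct_messages_enabled: bool = False       # DMs OFF by default
    stranger_contact_allowed: bool = False      # stranger contact restricted
    location_sharing_enabled: bool = False      # location sharing OFF
    live_streaming_enabled: bool = False        # live streaming disabled
    private_groups_allowed: bool = False        # private group access restricted
    gifting_requires_parent_approval: bool = True
    friend_requests_reviewed: bool = True
    parent_activity_visibility: bool = True     # clear visibility for parents

    def enable_feature(self, feature: str, parent_approved: bool) -> None:
        """Loosen a default only with explicit parent approval (opt-in)."""
        if not parent_approved:
            raise PermissionError(f"{feature} requires parent approval")
        setattr(self, feature, True)

# A new child account starts with every high-risk feature closed:
settings = ChildAccountSettings()
assert not settings.direct_messages_enabled
assert not settings.location_sharing_enabled
```

The design choice the sketch encodes is the whole argument: the safe state is the zero-effort state, and every opening requires a deliberate, recorded parental decision.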
High-risk features that should never be open by default
- Open direct messaging
- Voice chat with strangers
- Live video interaction
- Location sharing
- Anonymous gifting
- Unrestricted friend requests
- Public discoverability by unknown adults
If a feature creates direct stranger access, it should not be wide open for children.
What POSH is calling out
- Parents are expected to manage complex systems manually
- Platforms already understand these risks
- Yet high-risk features remain widely accessible by default
This is not a knowledge gap — it is a design decision
Why this is practical
Platforms already build systems for:
- privacy settings
- purchase controls
- screen time
- content filtering
- account security
The same level of design can be applied to child safety defaults.
This is not about banning technology — it is about designing it responsibly for children.
What parents can do right now
- Assume defaults are not safe enough
- Check messaging, friend, location, and gifting settings
- Use both device-level and app-level controls
- Set clear rules about private chats and about moving conversations off-platform
- Keep conversations open with your child
How this connects to digital gifting
Gifting is one example of a wider issue:
High-risk features should be opt-in and parent-visible — not silently active.
What platforms and lawmakers should hear clearly
- Parents should not carry the full burden of child safety
- Known risk patterns should be designed against — not ignored
- Safer defaults are a responsibility, not an optional feature
Safer defaults are not overreach — they are overdue
Help push safer defaults
- Most parents assume safer defaults already exist
- Raising awareness turns assumptions into pressure
- Better defaults can protect millions of children before harm starts