POSH
How Algorithms Increase Risk For Kids
Your child is not just choosing content.
Content is being chosen for them — based on what keeps them engaged.
Use this page if you’ve ever wondered why your child keeps seeing the same types of content, why things escalate, or why it feels hard to “just stop.”
System awareness page
ENGAGEMENT OVER SAFETY
Algorithms are designed to keep attention — not to protect children.
Important:
The system rewards what your child watches, reacts to, or spends time on — not what is safe.
How algorithms work (the short version)
Watch something → more of it appears
Like something → more of it appears
Pause on something → more of it appears
Comment or share → more of it appears
The system learns fast — even from small actions.
What this means for your child
- Content becomes more personalised over time
- Exposure can escalate without the child noticing
- Risky or emotional content can be repeated
- It can feel like “this is normal” because it appears often
- It becomes harder to break away from the content loop
What your child sees next is based on what they did before.
The content loop
Watch / interact
↓
Algorithm learns
↓
More similar content
↓
Stronger engagement
↓
Deeper exposure
The more engagement, the stronger the loop becomes.
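For technically minded readers, the loop above can be sketched as a toy simulation. This is a hedged illustration only, not any real platform's system: the theme names, the watch-time values, and the reinforcement rule are assumptions made up for the example.

```python
import random

def recommend(weights):
    """Pick a theme with probability proportional to its learned weight."""
    total = sum(weights.values())
    r = random.uniform(0, total)
    for theme, weight in weights.items():
        r -= weight
        if r <= 0:
            return theme
    return theme  # fallback for floating-point rounding

def simulate(steps=200, seed=0):
    """Run the watch -> learn -> show-more loop for a number of steps."""
    random.seed(seed)
    # Every theme starts equal; "intense" content simply holds attention longer.
    weights = {"sports": 1.0, "crafts": 1.0, "intense": 1.0}
    watch_time = {"sports": 1.0, "crafts": 1.0, "intense": 3.0}
    for _ in range(steps):
        shown = recommend(weights)
        # Engagement is the only feedback signal: more watch time, more weight.
        weights[shown] += watch_time[shown]
    return weights
```

In runs of this sketch, the theme that holds attention longest tends to end up dominating the feed, even though every theme started with the same weight. Nothing in the loop asks whether the content is safe: engagement is the only signal it sees.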
How risk increases
- Content becomes more extreme or intense over time
- Similar themes are repeated again and again
- Emotional content (fear, anger, excitement) is prioritised
- Children may be exposed to things they did not search for
- Algorithm pathways can lead to unsafe communities or contacts
Escalation can happen quietly and quickly.
Why it feels normal to kids
- They see the same type of content repeatedly
- It appears in a constant stream
- Peers may be seeing similar content
- The system reinforces interest without challenge
Repeated exposure can make risky content feel normal.
High-risk patterns
Rapid content escalation
Obsession with specific themes
Emotional dependency on content
Moving from public content to private interaction
Content leading to direct messaging or contact
Algorithms can move children from content → contact.
What parents often think (but isn’t true)
- “They searched for it.”
- “They chose to watch that.”
- “They can just stop watching.”
In reality, the system is actively shaping what they see.
What parents can do
- Be aware of what platforms your child uses
- Regularly check feeds and recommendations
- Reset or refresh feeds where possible
- Encourage a mix of content, not one repeated theme
- Set time limits to break content loops
- Talk to your child about how algorithms work
Understanding the system reduces its control.
What to say to your child
“The app learns from what you watch.”
“What you stop on tells it what to show you next.”
“Not everything you see is random.”
“You can reset what the app shows you.”
“You’re allowed to step away from it.”
Final POSH reminder
Algorithms learn fast.
Exposure builds quickly.
Engagement drives content.
Safety is not the priority.
The more your child engages, the deeper the system pulls them in.