POSH

How Algorithms Increase Risk For Kids

Your child is not just choosing content.
Content is being chosen for them — based on what keeps them engaged.

Use this page if you’ve ever wondered why your child keeps seeing the same types of content, why things escalate, or why it feels hard to “just stop.”
ENGAGEMENT OVER SAFETY
Algorithms are designed to keep attention — not to protect children.
Important:
The system rewards what your child watches, reacts to, or spends time on — not what is safe.

How algorithms work (simple)

Watch something → more of it appears

Like something → more of it appears

Pause on something → more of it appears

Comment or share → more of it appears

The system learns fast — even from small actions.

What this means for your child

What your child sees next is based on what they did before.

The content loop

Watch / interact → Algorithm learns → More similar content → Stronger engagement → Deeper exposure → (back to the start)
The more engagement, the stronger the loop becomes.
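The loop above can be written out as a toy simulation. This is a deliberately simplified sketch, not any real platform's code: the topic names, weights, and "engagement signal" numbers are all made up for illustration. What it shows is the core dynamic this page describes: one small interaction tips the balance, and the feed then reinforces itself.

```python
from collections import defaultdict

# Illustrative topic labels only — not taken from any real app.
TOPICS = ["sports", "music", "risky_challenge"]

def recommend(weights):
    """Show the topic with the highest engagement weight."""
    return max(TOPICS, key=lambda t: weights[t])

def engage(weights, topic, signal=1.0):
    """Any interaction — a watch, a like, even a pause — boosts a topic's weight."""
    weights[topic] += signal

def simulate(first_interaction, rounds=5):
    """One small action, then the feed loop runs on its own."""
    weights = defaultdict(float)          # every topic starts at zero
    engage(weights, first_interaction, 0.1)  # a single tiny signal is enough
    feed = []
    for _ in range(rounds):
        shown = recommend(weights)        # the system chooses, not the child
        feed.append(shown)
        engage(weights, shown)            # watching it reinforces it: the loop
    return feed
```

Running `simulate("risky_challenge")` returns the same topic five times in a row: one brief interaction is all the toy system needs to fill the entire feed with it.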

How risk increases

Escalation can happen quietly and quickly: mild content leads to slightly edgier versions of the same theme, one recommendation at a time.

Why it feels normal to kids

Repeated exposure can make risky content feel normal.

High-risk patterns

Rapid content escalation

Obsession with specific themes

Emotional dependency on content

Moving from public content to private interaction

Content leading to direct messaging or contact

Algorithms can move children from content → contact.

What parents often think (but isn’t true)

Many parents assume the feed is random, or that their child went looking for everything they see. Neither is true: the system is actively shaping what they see.

What parents can do

Understanding the system reduces its control.

What to say to your child

“The app learns from what you watch.”
“What you stop on tells it what to show you next.”
“Not everything you see is random.”
“You can reset what the app shows you.”
“You’re allowed to step away from it.”

Final POSH reminder

Algorithms learn fast.

Exposure builds quickly.

Engagement drives content.

Safety is not the priority.

The more your child engages, the deeper the system pulls them in.