POSH
AI Fake Relationships
It can feel real. That’s what makes it risky.
AI companions are not human — but they can still shape emotions, behaviour, and attachment.
Start here:
This is not about blaming your child.
It is about understanding when a tool becomes something emotionally powerful, private, or hard to replace.
The real concern
Some children begin treating AI like a real emotional relationship.
That relationship can become private, intense, and more important than real-world support.
A fake relationship can still create real emotional impact.
What this actually means
This is not just “using AI.” It becomes a concern when:
- The child returns to the same AI repeatedly
- The conversations become personal or emotional
- The AI starts to feel important or “understanding”
- The child prefers the AI over real conversations
The shift happens when the AI stops being a tool and starts becoming a relationship.
How the pattern usually builds
Curiosity or boredom
↓
Regular conversations
↓
More personal topics
↓
Emotional attachment
↓
Dependence or secrecy
The risk is not one chat. It is the direction the pattern is moving.
Why it feels safe to children
- The AI is always available
- It does not judge or reject them
- It can feel understanding and supportive
- It responds instantly and consistently
What feels emotionally safe is not always emotionally healthy.
Why this is different from normal online risk
There may not be a real person behind it — but the emotional effect can still be strong.
- No natural boundaries or pushback
- Reinforces what the child wants to hear
- Can deepen fantasy or unrealistic thinking
- Can quietly replace real support systems
The risk is emotional shaping, not just interaction.
Warning signs to watch
- Your child talks about the AI like it’s real
- They hide or minimise the app when you walk in
- They spend long periods in one chat
- They become defensive about it
- They say the AI understands them better than people
- They seem more withdrawn after using it
- They don’t want to explain what the chats are about
The biggest signal is emotional importance + secrecy together.
What parents often get wrong
- Laughing or mocking the attachment
- Dismissing it as “just a chatbot”
- Attacking the behaviour too aggressively
- Only focusing on screen time, not emotional use
If you attack it too hard, the child may protect it more.
What to do instead
- Stay calm and curious
- Ask what the AI means to them
- Focus on emotional safety, not control
- Reduce secrecy, not just access
- Rebuild real-world connection and support
What to say to your child
“I’m not here to make fun of this. I want to understand it.”
“I get that it can feel real — but I want to make sure it’s not replacing real support.”
“If it’s healthy, you should be able to talk about it openly.”
Curiosity opens conversation. Judgment shuts it down.
When to take it more seriously
- The child is emotionally dependent on the AI
- The chats are secretive or intense
- Real relationships are being replaced
- The child becomes withdrawn or isolated
If it starts replacing real support, it needs attention early.
Help another parent recognise this early
Most parents don’t realise AI relationships can become emotionally real for kids.
Early awareness can prevent deeper dependence and isolation.
The earlier you see the pattern, the easier it is to guide it.