POSH

AI Nudify & Deepfake Risks

Fake images can still cause real harm.
This page helps parents understand AI-generated sexualised images, deepfakes, image-based abuse, and what to do if a child is targeted.

Important:
This guide is intentionally non-graphic. It is about protecting children, preserving evidence safely, reporting correctly, and reducing further harm.
When fake content becomes real pressure
FAKE IMAGE. REAL FEAR. REAL RESPONSE.

AI-generated intimate images, nudify apps, and deepfakes can be used to shame, bully, threaten, blackmail, or isolate a child. Even when the image is fake, the fear and damage can feel real.

The child is not responsible for someone creating or spreading fake sexualised content.
The adult response should reduce shame, not increase it.

What parents need to understand

AI abuse can involve real photos being altered, fake nude images being generated, faces being placed onto sexual content, or threats to create or share fake intimate images.

It can be fake

The image may not be real, but the humiliation, fear, bullying, and blackmail can still be very serious.

It can spread fast

Images, screenshots, links, and rumours can move quickly through group chats, school networks, gaming spaces, or social platforms.

It can silence children

Children may hide the situation because they fear blame, punishment, embarrassment, or losing their device.

If this is happening now

Tell the child they are not in trouble.

Do not repost, forward, or publicly share the image.

Preserve evidence without spreading the content further.

Record usernames, profile links, group names, app names, dates, and threats.

Move into official reporting and removal pathways quickly.

Do not make the image travel further.

Warning signs this may be happening

Do not wait for the child to explain it perfectly. Many children only disclose a small part at first.

What not to do

The response should protect the child and reduce exposure — not accidentally spread the content further.

What to save safely

Avoid creating unnecessary copies of sexualised images involving a child; in many jurisdictions, saving or sharing such images can itself be unlawful, even when intended as evidence. Focus on safe evidence capture and official reporting.

What to say to the child

Remove blame

“This is not your fault. Someone else chose to create, threaten, or share harmful content.”

Reduce panic

“We are going to slow this down, save what matters, and report it properly.”

Stop isolation

“You do not have to deal with this alone. I am glad you told me.”

Keep them talking

“You do not have to explain everything perfectly. Start with what you can.”

Reporting and removal pathways

Use the right pathway depending on where you live, where the content is hosted, and whether the child is at immediate risk.

Platform reporting can help remove content, but serious cases involving child exploitation or threats may also need to be reported to law enforcement.

Final POSH reminder

Fake sexualised images can still cause real fear and real harm.

The child needs calm protection, not shame.

Do not spread the image further. Preserve what matters. Report through the right pathway.

Do not amplify the harm while trying to prove it.