ChatGPT driving rise in reports of ‘satanic’ organised ritual abuse, UK experts say
TL;DR
UK experts warn that ChatGPT is driving a rise in reports of organised ritual abuse, as survivors use the AI tool as a substitute for therapy.
Key Points
- British police classify 'witchcraft, spirit possession and ritual abuse' (WSPRA) against children as a massively under-reported phenomenon.
- Such offences involve sexual abuse, violence and neglect with ritualistic elements, sometimes inspired by Satanism, fascism or esoteric beliefs.
- The UK currently has no specific criminal charge that directly covers this form of abuse.
- Survivors appear to turn to ChatGPT because professional trauma therapy is inaccessible or unaffordable.
Nauti's Take
The headline sounds like moral panic, but the actual finding is sober: people who feel unheard everywhere else are telling a chatbot about their worst experiences. That is not a ChatGPT problem – it is a therapist shortage problem.
Anyone who wants to regulate AI here instead of funding trauma centres has their priorities backwards. And yes, it remains an open question whether more reports mean more incidents or simply greater visibility of suffering that was always there but never surfaced.
Context
When abuse survivors turn to AI chatbots as their primary outlet for processing trauma, it exposes a systemic failure in healthcare and social services – not a problem with AI itself. At the same time, these conversations may produce structured first-person accounts that could give investigators and researchers new insights. The absence of a specific UK criminal charge highlights how far legislation lags behind the reality of specialised forms of abuse.