ChatGPT as a First Step in Reporting Organised and Ritual Abuse
Something unusual is happening in UK support services. Survivors of organised and ritual abuse are picking up the phone and opening with something few people expected to hear: “I was referred to you by ChatGPT.”
That does not mean an AI chatbot has become a therapist, or that it is replacing trained support workers. But support groups say it is becoming a first stop for some survivors who have struggled for years, and sometimes decades, to say what happened out loud.
It points to a new role for AI in one of the hardest parts of trauma response: disclosure. In the UK, charities and police experts say some survivors of organised and ritual abuse are using ChatGPT as a low-pressure way to put their experiences into words before speaking to a real person.
Why ChatGPT keeps coming up
According to The Guardian, Gabrielle Shaw, chief executive of the National Association for People Abused in Childhood (NAPAC), said the charity has seen a sustained rise over the past 18 months in reports involving ritual abuse. She also said that over the past six to 12 months, some callers have explicitly said ChatGPT directed them to seek help after using it for “therapy and exploration.”
The same report said NAPAC recorded 36,700 calls over nine years, with 1,310 mentioning organised and ritual abuse. Shaw said the recent increase does not look like the shorter spikes the charity has seen around certain religious or supernatural dates. Instead, it appears to be a steadier pattern.
This suggests a broader change in how some survivors are reaching support, not just a temporary jump in attention. For people who expect disbelief or do not yet feel ready to speak to another human, an AI chatbot may offer a less intimidating place to begin.
Why these cases are so hard to report
The disclosure gap has been a long-running problem. The Guardian reported that there have been only 14 UK criminal cases since 1982 in which ritualistic practices were acknowledged as part of sexual abuse. Richard Fewkes, director of the Hydrant Programme, said the ritual elements can sound “fantastical,” which has made such accounts easier to dismiss.
That is also the backdrop for more recent efforts to improve professional response. In July 2025, NAPAC and the National Police Chiefs’ Council’s Hydrant Programme published a research review and operational guidance aimed at helping police and other professionals better understand and investigate these cases.
The clearest takeaway is not that AI has solved anything. It is that some survivors appear to be using ChatGPT as a bridge to human support, while charities and police try to improve what happens after that first disclosure. In a story shaped for years by silence, disbelief, and investigative blind spots, that first step may be what makes later support possible.
The post ChatGPT as a First Step in Reporting Organised and Ritual Abuse appeared first on eWEEK.