Inside sick world of paedo deepfake ‘collectors’ who trade ‘sets of real kids like football cards’ as gangs make a mint
“WHAT does your room look like?” seemed an innocent question to teenager Thomas… but within minutes of his reply he was being ‘sextorted’ and feared his life was over.
The 14-year-old, who we have given a false name, had been talking to a ‘girl’ online and sent pictures showing his surroundings – fatefully they included his face.
Moments later, those images were twisted into a deepfake child sexual abuse video generated by Artificial Intelligence (AI), and the stranger was threatening to make it public.
In a harrowing plea for help to Childline, Thomas wrote: “Now they’re demanding money from me and they said if I don’t pay my life will be over!
“I know it’s not me in the video but it looks so real! I’m worried what will happen if my friends find out. I can’t believe I got myself in this situation, I’m so scared. I don’t know what to do.”
This is just one of a number of terrifying tricks used by paedophiles and their enablers to ‘sextort’ children and teenagers online – and tech is at the heart of it.
The Internet Watch Foundation (IWF) found 2,562 images of child sexual abuse material (CSAM) that had used AI on just one dark web forum in one month last year… some of the content showed kids as young as two.
Among the twisted creators is Hugh Nelson, from Bolton, who was locked up for 18 years on Monday after he turned ordinary images of children into AI-generated explicit material.
The 27-year-old sicko charged paedophiles £80 for a new “character” – depicting a real-life kid – and would then flog them £10 images of the child in different repulsive positions.
The harrowing case comes as The Sun today reveals that vile paedophiles are swapping and selling images of kids “like trading cards” and try to “collect every single abuse image of a child that exists”.
IWF senior analyst Zara, whose name we have changed for security reasons, explains in an exclusive interview that those who run sites are considered “like gods” to the perverts they serve.
She says: “The way children are exploited has diversified and we’ve seen what we call ‘commercial’ sites online with criminal gangs trying to make money from images, selling them in packs online.
“They get a collection of pictures of the same child or group of children and then sell them as a group…
“Offenders can then buy different online memberships to the site up to VIP level.
“The people who run these sites have a sort of god-like status among offenders, which is very strange.
“We see offenders asking, ‘Do you have any pictures of X because I don’t have my full collection’, asking for a particular victim.”
Zara’s boss Tamsin McNally, Hotline Manager at the IWF, said: “People tell us about ‘collectors’.
“It blows my mind that there’s even a phrase like that but it’s kind of like trading cards, football cards.
“There are people out there who want to have every single abuse image of a child that exists out there.
“When it comes to child sexual abuse, it’s not just one image being taken.
“Quite often there will be hundreds of images or videos or both and there are people out there who want to collect each and every single one of that specific child.
“They will try and collect them en masse or trade off for other images as well.”
‘Made to order’
The crimes of twisted Hugh Nelson, heard on Monday at Bolton Crown Court, were described as “deeply horrifying” by police and have chilled the nation to its core.
He was sentenced to 18 years imprisonment after pleading guilty to 11 chilling charges related to the production of CSAM and manipulating underage children to commit sex acts.
People tell us about ‘collectors’. It blows my mind that there’s even a phrase like that but it’s kind of like trading cards, football cards.
Tamsin McNally, Hotline Manager at the IWF
According to CPS special prosecutor Jeanette Smith, it’s “one of the first cases” that links people like Nelson using tech to create images and “the real offending that goes on behind that”.
During an 18-month period, the 27-year-old made around £5,000 from his abhorrent activities – but his actions went “far beyond… a ‘business opportunity’” according to police.
Detective Constable Carly Baines, of Greater Manchester Police, said: “Not only was he creating and selling these images, but he was engaging in depraved sexualised chat online about children and going as far as to encourage people interested in his online content to commit contact offences such as rape against children they knew or were related to.”
Some of the images Nelson received were of children who live as far away as France, Italy and the United States, which has led to more arrests and investigations overseas.
In a police interview, the paedophile admitted creating the photos had “taken over my life” and he had fallen into a worsening “pit of despair and absolutely grotesque behaviour”.
Nelson continued: “I’ve probably been doing it for about two years now and I could probably say that they have got worse in nature as I’ve continued with them.
“It’s sick how much it affects your mind, especially when you have no job, you sit at home, you play games, you watch porn and you make these stupid goddam images.
“My mind is very corrupted and warped. It can just be images of them posing, fully clothed, to hardcore rape images. So everything really.”
The twisted crimes of Hugh Nelson
A TWISTED paedophile has been sentenced to 18 years for using AI technology and images of real children to make sex abuse material.
Hugh Nelson, 27, from Bolton, made thousands of pounds from the sickening trade and was imprisoned after previously pleading guilty to 11 offences.
The charges included three counts of encouraging the rape of a child under 13 and one count of attempting to incite a boy under 16 to engage in a sexual act.
Nelson also pleaded guilty to three counts each of the distribution and making of indecent images, and one count of possessing prohibited images.
At an earlier court appearance, he also admitted to four counts of distributing indecent pseudo photographs of children and one of publishing an obscene article.
The landmark case is one of the first in which AI technology and the real-life victims of child sexual abuse have been linked in a prosecution.
The Internet Watch Foundation said there has been an explosion in AI being used to create abhorrent images – in the last six months, they have found more examples than they did during the whole of 2023.
Dan Sexton, the charity’s chief technical officer, said: “Our work has always been difficult anyway.
“[But] we’ve never had to deal with the possibility that someone could download some software on their computer and create an infinite amount of new images.
“They use as many as they can until the hard drives fill up. That’s a new type of harm which we have not been prepared for.”
Derek Ray-Hill, interim CEO at the IWF, described Nelson’s crimes as “horrifying” – all the more so because the images appeared to be “made to order”.
He said: “Technology is now enabling previously unthought of violations of innocent children.
“We are discovering more and more synthetic and AI images of child sexual abuse and they can be disturbingly life-like.
“That Nelson profited from making this material to order after clients sent him images to manipulate is on another horrifying level.”
‘Defiling’ apps
The Internet Watch Foundation runs a report and remove service through which kids can report images, and the organisation will take them down again and again if they resurface.
It’s a never-ending task as predators find new ways to create sexual abuse imagery, to target the vulnerable and distribute it online.
You could tell she was essentially growing up, she had been abused for a number of years – not only sexually abused but a number of photos and videos were taken and they were shared online.
Tamsin McNally, IWF
IWF’s Tamsin McNally said: “Technology has changed the way that children are sexually exploited.
“We are seeing defiling apps which ‘remove’ clothes from victims. We know through Childline that reports in school are rife.”
She explained that many would be surprised to know that the majority of the indecent images are “not all hidden away on the dark web” but discoverable on normal internet pages.
McNally told us there have been suicides linked to these cases and sextortion, describing it as “absolutely a global problem” with children being “exploited in every single country”.
Kids constantly ‘re-victimised’
Recalling one horrific case, she told us about seeing a girl “grow up in images and videos” as she was sexually abused by her own father over a number of years.
The images and videos of the unnamed child, who was under 10 when it began, were Category A, the most severe form of abuse under UK law.
McNally told us: “When I first saw her, she was between the ages of seven and 10, then subsequently, as my time went on, I saw more and more images of her.
“You could tell she was essentially growing up, she had been abused for a number of years – not only sexually abused but a number of photos and videos were taken and they were shared online.
“I don’t know what happened to her. I’d really like to think she’s being safeguarded and looked after but 10 years later, I’m still seeing images of her being sold online by collectors.”
McNally tells us that “the girl is being re-victimised” every time content she features in is shared and explains that cases like this highlight why the IWF’s work removing CSAM is vital.
Thankfully, due to their hard work, they have been able to remove the content and prevent it from being uploaded and shared again.
While that case was a success for the foundation, tragically the unnamed girl is just one of a tsunami of victims whose images continue to be circulated by predators on the net.
If you need help with sextortion or removing online images, contact: www.iwf.org.uk/our-technology/report-remove/.