Sexual predators' new business model is spreading - and humans are needed to catch them

Wednesday, 22 April 2026 18:47

By Mickey Carroll, science and technology reporter

For a long time, images and videos of child abuse were hidden away in the darkest corners of the internet, intentionally hard to find.

That's changing. Now, fully commercial sites are springing up all over the open web.

The Internet Watch Foundation, the organisation charged with removing child sexual abuse material (CSAM) from the internet, has seen the number of commercial sites double in the past year alone.

Some are hidden behind apparently innocent website fronts; others sit in the open, just a few clicks away from your social media feeds.

The criminals running these sites aren't selling access to one or two videos of 'category A' material - the most severe classification of abuse content assigned by police.

They're encouraging users to download - and pay for - terabytes of content at a time. But like any business, they need a marketing strategy. They've chosen word of mouth.

"[They're using] 'refer-a-friend' schemes whereby if you view the content and you want more, you can spread that link around your social media accounts, and then the more clicks that content gets," according to Mabel, an anonymous analyst at the IWF.

"That's new. We never used to see that at all."

Mabel is one of the few people in the world who are legally allowed to hunt down and remove CSAM from the internet. She's also a grandmother.

She added: "I worry that my grandchildren will be presented with these sites in their feeds on their social media, not realise what they are and click on them."

Nearly every refer-a-friend scheme was reported to the IWF by a member of the public, rather than a trained analyst.

That worries analysts like Mabel because it suggests ordinary people are now stumbling across this extreme abuse material in a way they never have before.

"I come into work every day and I know what I'm going to see. I'm expecting to see the content that I see on the internet," she said.

"But can you imagine if you turned on your phone, turned on the computer, and within a few clicks you saw category A content? You can't unsee that once you've seen it."

Read more from Sky News:
Investigation into child sex abuse on Telegram
Survivor of online child abuse shares story
Sex offenders exposed to abuse as children

Many tech firms, including social media companies, have recognised the harm that viewing such extreme content can do to their employees. Social media moderators are routinely exposed to CSAM, extreme violence and death. It takes a toll.

Two years ago, moderators from Meta began legal action against the company after more than 140 of them were diagnosed with severe PTSD.

Other major social media sites, such as TikTok, are also facing legal action over their treatment of moderators. As a result, many companies are turning to AI to handle the majority of extreme content.

They say it will help ease the severe mental load for their human workers.

Even the Metropolitan Police announced last week that it will begin exploring how AI could help the force analyse large volumes of CSAM, leaving officers free to "focus human expertise where it is needed most".

So what about the IWF, where analysts are dealing with more content than ever before? They've seen a 6% increase in the amount of CSAM online in the last year alone.

"Artificial intelligence tools are a supplement, right?" IWF chief executive Kerry Smith said.

"They're a supplement to human intelligence. They aren't a replacement."

She believes her human analysts are worth the cost of mandatory monthly counselling, a stringent recruitment process and ongoing psychological care, because of their "offline understanding" of the internet's underbelly.

"[They have an] understanding of how abuse occurs, what exploitation looks like, how you find particular indicators within those images and within those videos that can help identify an individual," Ms Smith said.

"So I think artificial intelligence is a weapon that we could use to prevent online child sexual abuse and exploitation, but it's not a replacement for human intelligence and human insight."

Sky News

(c) Sky News 2026: Sexual predators' new business model is spreading - and humans are needed to catch them
