Pedophiles on dark web turning to AI programs to generate sexual abuse content

An internet watchdog is sounding the alarm over the growing trend of sex offenders collaborating online to use open source artificial intelligence to generate child sexual abuse material.

“There’s a technical community within the offender space, particularly dark web forums, where they are discussing this technology,” Dan Sexton, the chief technology officer at the Internet Watch Foundation (IWF), told The Guardian in a report last week. “They are sharing imagery, they’re sharing [AI] models. They’re sharing guides and tips.”

Sexton’s organization has found that offenders are increasingly turning to open source AI models to create illegal child sexual abuse material (CSAM) and distribute it online. Unlike closed AI models such as OpenAI’s Dall-E or Google’s Imagen, open source AI technology can be downloaded and adjusted by users, according to the report. Sexton said the ability to use such technology has spread among offenders, who take to the dark web to create and distribute realistic images.


“The content that we’ve seen, we believe is actually being generated using open source software, which has been downloaded and run locally on people’s computers and then modified. And that is a much harder problem to fix,” Sexton said. “It’s been taught what child sexual abuse material is, and it’s been taught how to create it.”

Sexton said the online discussions that take place on the dark web include images of celebrity children and publicly available images of children. In some cases, images of child abuse victims are used to create brand-new content.

“All of these ideas are concerns, and we have seen discussions about them,” Sexton said.



Christopher Alexander, the chief analytics officer of Pioneer Development Group, told Fox News Digital that one of the new dangers of this technology is that it could introduce more people to CSAM. On the other hand, he said, AI could also be used to help scan the web for missing people, even using “age progressions and other factors that could help locate trafficked children.”

“So, generative AI is a problem, AI and machine learning is a tool to combat it, even just by doing detection,” Alexander said.

“The extreme dangers created by this technology will have massive implications on the well-being of the internet. Where these companies fail, Congress must aggressively step up to the plate and act to protect both children and the internet as a whole.”
