
Alaskan man tipped off police about someone else's AI CSAM, then got arrested for the same thing


If you’re going to call the police and rat someone out over child sexual abuse material (CSAM), it’s probably not the best idea to have the same material on your own devices, or to consent to a search so law enforcement can gather more information. But that’s allegedly what an Alaskan man did, and it landed him in police custody.

404 Media reports on Anthaney O’Connor, who was arrested earlier this week after police found AI-generated child sexual abuse material (CSAM) during a search of his devices.

From 404:

According to newly filed charging documents, Anthaney O’Connor contacted law enforcement in August to report an unidentified airman who had shared child sexual abuse material (CSAM) with O’Connor. While investigating the crime, and with O’Connor’s consent, federal authorities searched his phone for additional information. A review of the electronics revealed that O’Connor offered to create virtual reality CSAM for the airman, according to the criminal complaint.

Police said the unidentified airman shared an image of a child at a grocery store with O’Connor, and the two discussed how they could place the child into an explicit virtual reality world.

Law enforcement claims it found at least six explicit, AI-generated CSAM images on O’Connor’s devices, which it says were deliberately downloaded, mixed in with several “real” images. During a search of his home, law enforcement found a computer along with multiple hard drives hidden in a vent; a review of the computer turned up a 41-second video of child rape.

In an interview with authorities, O’Connor said he regularly reported CSAM to internet service providers “but was still sexually gratified by the images and videos.” It’s not known why he decided to report the airman to law enforcement. Perhaps he had a guilty conscience, or perhaps he truly believed AI-generated CSAM didn’t break the law.

AI image generators are typically trained on real photos, meaning that images of children “generated” by AI are fundamentally based on real images. There is no way to separate the two. AI-generated CSAM is not a victimless crime in that sense.

The first known arrest of someone for possessing AI-generated CSAM came just back in May, when the FBI arrested a man for using Stable Diffusion to create “thousands of realistic images of minors.”

Proponents of AI will say that it has always been possible to create explicit images of minors using Photoshop, but AI tools make it exponentially easier for anyone to do so. A recent report found that one in six congresswomen has been targeted by AI-generated deepfake porn. Many products include guardrails to prevent the worst uses, the same way printers refuse to copy currency. Implementing such barriers prevents at least some of this behavior.



