Deepfake AI Child Porn Wrecks NJ School…Is Your Kid’s Next?

Jzea / shutterstock.com

Artificial Intelligence (AI) is a masterful tool. From music to works of literary genius, AI is quickly replacing authentic human work and undercutting artists. As debatable as that loss may be to some, one incredibly destructive area it's becoming increasingly known for is pornography. Now Westfield High School in Westfield, NJ, is dealing with a massive scandal over AI-generated child porn.

The deeply respected and esteemed school suddenly found itself at the center of a story it never expected to be part of.

A group of sophomore boys suddenly began acting very suspicious, and their whispering among themselves had multiple people at the school concerned. It was eventually revealed that one student had fed online photos into an AI tool to create deepfake nudes of female classmates. Naturally, the other boys quickly started sharing them around. School administrators identified multiple students whose likenesses had been used in the photos.

One victim’s mother said, “I am terrified by how this is going to surface and when. My daughter has a bright future, and no one can guarantee this won’t impact her professionally, academically, or socially.”

These kinds of fakes have been circulating on social media in multiple forms over the last few years, becoming significantly more accurate and realistic as time has gone on. While there are still details, like hands, that these AI engines often get wrong, most people don't notice them at a glance, especially when it's an explicit photo.

Snapchat has been one of the few companies to take the situation seriously, reporting suspected child porn to the National Center for Missing and Exploited Children. Its dedication is something no other social media platform can claim, and it is proud of the steps it has taken to get these sickos off our streets.

The kid who generated these images and those who shared them could face charges for the creation of child pornography. Possession and distribution charges are likely as well for those who received the images and passed them on. Many parents will be backed into a corner, forced to do business with sleazeball lawyers. You know…the ones who make their living defending pedophiles.

As ugly as it might sound, that's the reality of the situation these youngsters find themselves in. While much of the blame will likely fall upon the head of the individual who created the images, and rightfully so, blame will also rest on the heads of those who simply received them. This shared blame isn't uncommon, either. Likely, these wily lawyers will argue that the First Amendment protects their clients and insist that because the images are fake, they aren't child porn. Under that argument, no crime was committed; this was simply artistic expression.

This was the kind of argument the former King of Smut, Larry Flynt, made when he was charged with distributing illicit material back in the 1970s. While much has certainly changed since then, attitudes toward protecting children have not grown any friendlier to pedophiles. If anything, many now see the problems with the underage relationships that were allowed to slide through the mid-1990s.

Many parents feel they have little recourse against these images, and their kids end up paying the price. While legislation is pending in many states, President Biden has taken executive action against using AI to make child porn or non-consensual images. State officials in California, Minnesota, New York, and Virginia have also given victims the right to sue over the creation of these images.

This kind of development was only natural for AI. Many are surprised it took this long to advance in this direction. Much like the Betamax vs. VHS days, porn tends to drive much of the advancement in consumer entertainment, and AI is no different. This is a divisive issue, and it has the potential to impact our basic freedoms.