Teen boys use AI to make fake nudes of classmates, sparking police probe

Key Takeaways:

– Boys at Westfield High School in New Jersey used AI image generators to create and share fake nude photos of female classmates.
– The school initially believed that the images had been deleted and were no longer in circulation.
– It remains unclear how many students were involved or disciplined, and whether faculty had reviewed the images.
– There is currently no federal law restricting the creation of faked sexual images of real people.
– President Joe Biden issued an executive order urging lawmakers to pass protections against generative AI producing child sexual abuse material or non-consensual intimate imagery.
– Some states, including Virginia, California, Minnesota, and New York, have passed laws outlawing the distribution of faked porn.
– A New Jersey state senator plans to look for existing state laws that would criminalize such images and, if none exist, draft a new law.
– Students targeted by the fake images are uncomfortable attending school with the boys who created them and fear potential future damage.
– AI image generators have become sophisticated, making it easier to create realistic deepfakes.
– Image-detection firm Sensity AI reported that more than 90% of fake images online are porn.
– The incident at Westfield High School highlights the need for awareness and responsible use of new technologies.

Ars Technica:

Westfield High School in Westfield, NJ, in 2020.

This October, boys at Westfield High School in New Jersey started acting “weird,” the Wall Street Journal reported. It took four days before the school found out that the boys had been using AI image generators to create and share fake nude photos of female classmates. Now, police are investigating the incident, but they’re apparently working in the dark, because they currently have no access to the images to help them trace the source.

According to an email that the WSJ reviewed from Westfield High School principal Mary Asfendis, the school “believed” that the images had been deleted and were no longer in circulation among students.

It remains unclear how many students were harmed. A Westfield Public Schools spokesperson cited student confidentiality when declining to tell the WSJ the total number of students involved or how many students, if any, had been disciplined. The school had not confirmed whether faculty had reviewed the images, and it seemingly notified the targeted female students only after boys who claimed to have seen the images identified them.

It’s also unclear if what the boys did was illegal. There is currently no federal law restricting the creation of faked sexual images of real people, the WSJ reported, and in June, child safety experts reported that there was seemingly no way to stop thousands of realistic but fake AI child sex images from being shared online.

This week, President Joe Biden issued an executive order urging lawmakers to pass protections to prevent a wide range of harms, including stopping “generative AI from producing child sexual abuse material or producing non-consensual intimate imagery of real individuals.” Biden asked the secretary of Commerce, the secretary of Homeland Security, and the heads of other appropriate agencies to provide recommendations regarding “testing and safeguards against” producing “child sexual abuse material” and “non-consensual intimate imagery of real individuals (including intimate digital depictions of the body or body parts of an identifiable individual), for generative AI.” But it could take years before those protections are ultimately introduced, if ever.

Some states have stepped in where federal law is lagging, with Virginia, California, Minnesota, and New York passing laws to outlaw the distribution of faked porn, the WSJ reported. And New Jersey might be next, according to Jon Bramnick, a New Jersey state senator who told the WSJ that he would be “looking into whether there are any existing state laws or pending bills that would criminalize the creation and sharing of” AI-faked nudes. And if he fails to find any such laws, Bramnick said he planned to draft a new law.

It’s possible that other New Jersey laws, like those prohibiting harassment or the distribution of child sexual abuse materials, could apply in this case. In April, a New York court sentenced a 22-year-old man, Patrick Carey, to six months in jail and 10 years of probation “for sharing sexually explicit ‘deepfaked’ images of more than a dozen underage women on a pornographic website and posting personal identifying information of many of the women, encouraging website users to harass and threaten them with sexual violence.” Carey was found to have violated several laws prohibiting harassment, stalking, child endangerment, and “promotion of a child sexual performance,” but at the time, the county district attorney, Anne T. Donnelly, recognized that laws were still lacking to truly protect victims of deepfake porn.

“New York State currently lacks the adequate criminal statutes to protect victims of ‘deepfake’ pornography, both adults and children,” Donnelly said.

Remarkably, New York moved quickly to close that gap, passing a law last month that banned AI-generated revenge porn, and it appears that Bramnick this week agreed that New Jersey should be next to strengthen its laws.

“This has to be a serious crime in New Jersey,” Bramnick said.

Until laws are strengthened, Bramnick has asked the Union County prosecutor to find out what happened at Westfield High School, and state police are still investigating. Westfield Mayor Shelley Brindle has encouraged more victims to speak up and submit reports to the police.

Students targeted remain creeped out

Some of the girls targeted told the WSJ that they were not comfortable attending school with the boys who created the images. They also fear that the images may resurface in the future and cause further damage, whether professionally, academically, or socially. Others have said the experience has changed how they think about posting online.

Last year, Ars warned that AI image generators have become so sophisticated that training AI to create realistic deepfakes is now easier than ever. Some image tools, like OpenAI’s DALL-E or Adobe’s Firefly, the WSJ report noted, have moderation settings to stop users from creating pornographic images. However, even the best filters are difficult, if not “impossible,” to enforce, experts told the WSJ, and technology exists to face-swap or remove clothing if someone seeking to create deepfakes is motivated and savvy enough to combine different tools.

Image-detection firm Sensity AI told the WSJ that more than 90 percent of fake images online are porn. As image generators become more commonplace, the risk of more fake images spreading seems to rise.

For the female students at Westfield High School, the idea that their classmates would target them is more “creepy” than the vague thought that “there are creepy guys out there,” the WSJ reported. Until the matter is settled in the New Jersey town, the girls plan to keep advocating for victims, and their principal, Asfendis, has vowed to raise awareness on campus of how to use new technologies responsibly.

“This is a very serious incident,” Asfendis wrote in an email to parents. “New technologies have made it possible to falsify images, and students need to know the impact and damage those actions can cause to others.”

