When news broke in February that artificial intelligence-generated nude images of students were circulating at a Beverly Hills middle school, many district officials and parents were horrified.
But others said no one should have been surprised by the spread of AI-powered “stripping” apps. “The only shocking thing about this story,” one Carlsbad father said his 14-year-old son told him, “is that people are shocked.”
Now, a newly released report from Thorn, a nonprofit that develops technology to combat the spread of child sexual abuse material, shows just how common deepfake abuse has become. The proliferation coincides with the wide availability of cheap “undressing” apps and other easy-to-use, AI-powered programs for creating deepfake nudes.
But the report also shows that other forms of abuse involving digital images remain bigger problems for school-aged children.
To measure middle and high school students’ experiences and attitudes regarding sexual material online, Thorn surveyed 1,040 young people ages 9 to 17 across the country between Nov. 3 and Dec. 1, 2023. Well over half of the respondents were Black, Latino, Asian or Native American; Thorn said the resulting data was weighted to make the sample representative of U.S. schoolchildren.
According to Thorn, 11% of students surveyed said they knew of friends or classmates who had used artificial intelligence to generate nudes of other students; an additional 10% declined to say. About 80% said they didn’t know anyone who had done that.
In other words, at least 1 in 9 students, and as many as 1 in 5, knew of classmates using AI to create fake nudes of people without their consent.
Stefan Turkheimer, vice president of public policy for the Rape, Abuse & Incest National Network, the nation’s largest anti-sexual violence organization, said Thorn’s findings are consistent with anecdotal evidence from RAINN’s online hotline. Many more children have been contacting the hotline to report being victims of fake nudes, as well as the nonconsensual dissemination of real images, he said.
Compared to a year ago or even six months ago, he said, “the numbers have certainly increased, and significantly.”
Technology is amplifying both types of abuse, Turkheimer said. Not only is image quality improving, but “the distribution of video has really expanded.”
Thorn’s survey found that nearly 1 in 4 students ages 13 to 17 said they had been sent or shown an actual nude photo or video of a classmate or peer without that person’s knowledge. But that figure, at least, is down from 2022 and 2019, when 29% of students surveyed in that age group said they had seen nudes shared without consent.
Not surprisingly, only 7% of students surveyed admitted to having personally shared a nude photo or video of someone without that person’s knowledge.
The study also found that many students see sharing real nudes as unremarkable, with 31% of 13- to 17-year-olds agreeing with the statement that “it’s normal for people my age to share nudes with each other.” That’s about the same overall level as in 2022, the report says, though it’s notably lower than in 2019, when nearly 40% agreed with that statement.
Only 17% of that age group admitted to sharing nudes of themselves. An additional 15% of 9- to 17-year-olds said they had considered sharing a nude photo of themselves but decided against it.
Turkheimer wondered if some of the perceived decline in online sexual interactions was due to last year’s shutdown of Omegle, a site where people could have video calls with random strangers. Although Omegle’s rules prohibited nudity and sharing explicit content, more than a third of students who reported using Omegle said they had experienced some kind of sexual interaction there.
He also noted that the survey did not explore how often students experienced the interactions it tracked, such as sharing nudes with an adult.
According to Thorn, 6% of students surveyed said they had been victims of sextortion: someone had threatened to reveal a sexual image of them unless they paid money, sent more sexual photos or took some other action. And when asked who is to blame when a nude selfie is made public, 28% said it was solely the victim’s fault, compared with 51% who blamed the person who leaked it.