Alarming New Study Reveals Over Half of U.S. Teens Use Generative AI Tools to Create Non-Consensual Sexualized Images
New study finds 55% of US teens use AI to create sexualized images, with one-third reporting they were victims of non-consensual AI image distribution.
By: AXL Media
Published: Mar 18, 2026, 2:50 PM EDT
Source: PLOS

The Normalization of AI Sexual Exploitation Among Youth
New research led by Chad Steel of George Mason University has uncovered a pervasive and troubling trend in the digital behavior of American teenagers. According to the study published in the open-access journal PLOS One, more than half of U.S. adolescents have engaged with "nudification" tools, which use generative artificial intelligence to strip clothing from images of individuals. While prior research noted the normalization of sexualized image sharing, the advent of GenAI has added a new layer of exploitation. These findings suggest that the ease of creating high-fidelity, non-consensual imagery is fundamentally altering the landscape of adolescent social interaction and digital safety.
Quantifying the Prevalence of Nudification Tool Usage
To understand the scope of this issue, researchers analyzed anonymous survey data from 557 U.S. residents between the ages of 13 and 17. The results were startling: 55.3% of participants admitted to using AI tools to create at least one sexualized image of themselves or another person. Furthermore, 54.4% of respondents reported receiving such images. This data provides a concrete benchmark for a phenomenon that was previously only understood through anecdotal reports and rising complaints to law enforcement. The study indicates that these tools are no longer niche applications but have become a mainstream component of the teenage digital experience.
A Crisis of Non-Consensual Image Distribution
Perhaps the most distressing aspect of the research is the high rate of victimization reported by the participants. Approximately 36.3% of the surveyed teens stated that someone else had created a sexualized AI image of them without their consent. Additionally, 33.2% reported that such an image of them had been distributed non-consensually. The psychological impact of these actions is severe, with victims reporting a profound sense of dehumanization and permanent disruption to their personal lives. Unlike traditional "sexting," where a person typically has some initial control over the image, GenAI allows for the total fabrication of sexual content without the victim ever being involved in the process.