FBI Subpoenas X to Retrieve Grok AI Prompts Used in Nonconsensual Pornography Case

Court records reveal the FBI obtained Grok AI prompts used by Simon Tuck to create over 200 nonconsensual sexual videos in an extreme harassment case.

By: AXL Media

Published: Feb 25, 2026, 11:11 AM EST

Source: 404 Media


Details of the Harassment and Stalking Allegations

The investigation into Simon Tuck uncovered a series of extreme actions targeting a woman and her husband. Tuck, who reportedly worked out with the woman and exchanged texts with her, is accused of secretly filming her in his garage while she exercised. Over several months, the affidavit alleges, Tuck "swatted" the couple's home, filed false reports with the husband's employer claiming drug addiction and child abuse, and even contacted a funeral home stating that the husband would soon be dead. He also allegedly posed as a member of a Russian hacking crew to send threats to the victims.

The Use of Grok AI to Generate Explicit Content

In January, the FBI subpoenaed X for Tuck's interaction logs with the Grok AI bot. The resulting data included detailed prompts used to generate approximately 200 pornographic videos of a woman closely resembling the female victim. One specific prompt obtained by investigators described a blonde woman in a "sensual sports style" performing explicit acts on a tennis court. Tuck also allegedly used Grok to draft a formal complaint against the victim's husband, which was then sent to the husband's place of employment.

Content Moderation and Platform Responsibility

The case highlights ongoing concerns about the content moderation standards of the Grok AI platform. According to court records, the nonconsensual sexual material was created during a period when Grok was already facing heavy criticism over its ability to generate child sexual abuse material and other harmful content. The incident also fits the broader "undress her" phenomenon, which has raised alarms about how easily AI tools can be weaponized for nonconsensual sexual imagery and real-world harm.
