## Why officials are targeting platforms rather than individual users
The press release said officials “want to introduce a new ban on so-called ‘nudifier’ systems that use AI to create or manipulate images that are sexually explicit or intimate and resemble an identifiable real person without that person’s consent.” It added that “the ban would not apply to AI systems with effective safety measures preventing users from creating such images.”
As Bloomberg reported, the proposed ban would mark a major shift in the EU’s handling of explicit deepfakes, moving beyond simply prosecuting users to holding platforms accountable. Bloomberg said the Grok scandal “epitomized” why regulators needed to change course, noting that “this amendment is the first” EU move “to specifically target AI platforms” that generate and permit distribution of “sexual material without the subject’s consent.”
Though EU officials didn’t name Grok in the press release, regulators had already been examining the AI system while considering what xAI’s controversy might mean for other, less prominent nudify tools. When submitting questions to the European Commission earlier this year, lawmakers warned:
> Recent shocking reports of AI-powered nudity applications, such as Grok on X, but also other tools that are freely available online, highlight an increase in AI-driven tools that allow users to generate manipulated intimate images of individuals without their consent, facilitating gender-based cyberviolence and the creation of child sexual abuse material.
Lawmakers urged that “these systems should be banned from the EU market,” stressing that “individual perpetrators”—who “can often be punished under national criminal law”—“are often hard to find.” They argued a better approach would be to act earlier to “prevent widespread image-based sexual violence from the outset.”
With apparent support from Members of the European Parliament, the amendment's likely approval is certain to annoy Musk, who is also confronting US lawsuits seeking injunctions over Grok's nudify outputs. In January, Ashley St. Clair, a mother of one of Musk's children, was among the first victims to sue. More recently, three young girls in Tennessee filed a proposed class action on behalf of all children allegedly harmed by Grok's CSAM outputs.
In the EU, public calls for regulatory action are growing as xAI appears unwilling to stop Grok from digitally undressing real people. Michael McNamara, a member of the civil liberties committee, said in the press release that he believes the ban on nudify apps “is something that our citizens expect.”