
How a ‘nudify’ website transformed a circle of friends into pivotal players in the battle against AI-produced pornography


In June of the previous year, Jessica Guistolise got a text message that would alter her life.

While she was at dinner with coworkers on a business trip in Oregon, her phone lit up with a message from an acquaintance named Jenny, who had urgent information about her own estranged husband, Ben.

Following a nearly two-hour conversation with Jenny that evening, Guistolise recalled feeling stunned and frantic. Jenny told her she had discovered images on Ben’s computer of more than 80 women whose social media photos had been manipulated to produce deepfake pornography — AI-generated videos and images depicting sexual acts, made by merging real people’s photos with pornographic material. Most of the women in Ben’s pictures lived in the Minneapolis area.

Jenny captured images of what she found on Ben’s computer using her phone, Guistolise recounted. The screenshots, some of which CNBC reviewed, indicated that Ben utilized a site known as DeepSwap to generate the deepfakes. DeepSwap belongs to a category of “nudify” sites that have surged since the rise of generative AI less than three years ago. 

CNBC opted not to disclose Jenny’s last name to safeguard her privacy and withheld Ben’s last name due to his claims of mental health issues. They are now divorced.

Guistolise indicated that after her conversation with Jenny, she was eager to cut her trip short and hurry back home.

In Minneapolis, the women’s experiences would soon ignite a rising resistance to AI deepfake tools and the individuals who exploit them.

One of the altered images Guistolise encountered upon her return was created using a photo from a family vacation. Another depicted her goddaughter’s college commencement. Both were sourced from her Facebook account. 

“The first time I laid eyes on the explicit images, I felt something shift within me, as if I had fundamentally changed,” expressed Guistolise, 42.

CNBC spoke with over two dozen individuals — including victims, their relatives, attorneys, sexual abuse specialists, AI and cybersecurity scholars, trust and safety personnel in the tech sector, and lawmakers — to understand how nudify websites and applications operate and to comprehend their real-world repercussions on people.

“It’s not something I would wish upon anyone,” Guistolise remarked.

Jessica Guistolise, Megan Hurley, and Molly Kelley speak with CNBC in Minneapolis, Minnesota, on July 11, 2025, discussing fake pornographic images and videos featuring their faces created by their mutual friend Ben using the AI site DeepSwap.
Jordan Wyatt | CNBC

Nudify applications represent a minor but swiftly expanding segment of the new AI landscape, which surged after the launch of OpenAI’s ChatGPT in late 2022. Since that time, Meta, Alphabet, Microsoft, Amazon and others have collectively allocated hundreds of billions of dollars towards AI development and the pursuit of artificial general intelligence, or AGI — technology that may rival or exceed human capabilities. 

For consumers, much of the excitement so far has centered around chatbots and image generators that enable users to accomplish intricate tasks with simple text commands. There’s also the emerging market of AI companions, alongside a range of agents designed to boost productivity. 

However, those affected by nudify applications are experiencing the darker side of the AI surge. Due to generative AI, platforms like DeepSwap are so straightforward to use — requiring no coding skills or technical know-how — that they can be utilized by virtually anyone. Guistolise and others expressed concern that it’s only a matter of time before the technology becomes widespread, subjecting many more individuals to potential harm.

Guistolise filed a report with the police regarding the incident and obtained a restraining order against Ben. Yet she and her friends quickly discovered flaws in that strategy.

Ben’s actions might have been within legal bounds. 

The women involved were all of legal age. And as far as they knew, the deepfakes had not been circulated, residing solely on Ben’s computer. Although they feared the videos and images could be on a server somewhere, potentially within reach of malicious actors, they had no evidence of distribution to use against Ben.

One of the other women involved was Molly Kelley, a law student who dedicated the next year to guiding the group through the uncharted legal complexities of AI. 

“He did not breach any laws that we’re aware of,” Kelley noted, referencing Ben’s actions. “And that is concerning.”

Ben acknowledged creating the deepfakes and told CNBC via email that he feels guilty and ashamed of his behavior.

Jenny described Ben’s actions as “horrific, inexcusable, and unforgivable” in an email statement.

“Since the moment I learned the reality, my loyalty has been to the affected women, and my focus has remained on how to best assist them as they navigate their new circumstances,” she stated. “This is not a problem that will resolve itself. We require stronger laws to ensure accountability — not only for the individuals who misuse this technology but also for the companies that facilitate its use on their platforms.”

Easily accessible

Experts say that, like other new and user-friendly AI tools, many apps providing nudify services advertise on Facebook and can be downloaded from the Apple App Store and Google Play Store.

Haley McNamara, senior vice president at the National Center on Sexual Exploitation, stated that nudify apps and sites have simplified the process of creating realistic sexually explicit deepfake imagery of a person based on a single photo in less time than it takes to brew a cup of coffee.

Two images of Molly Kelley’s face and one of Megan Hurley’s are visible in a screenshot, shown to CNBC on July 11, 2025, taken from a computer belonging to their mutual friend Ben, who used the women’s Facebook photos without their consent to create fake pornographic images and videos via the AI site DeepSwap.

A spokesperson from Meta, the parent company of Facebook, stated that the organization has stringent rules prohibiting advertisements that feature nudity and sexual activity, and that it shares information acquired about nudify services with other firms through a child-safety industry initiative. Meta described the nudify ecosystem as adversarial and said it is enhancing its technology to prevent malicious actors from advertising.

Apple informed CNBC that it routinely removes and rejects applications that breach its app store standards regarding content considered offensive, misleading, or overtly sexual and pornographic. 

Google chose not to comment.

The issue transcends U.S. borders.

In June 2024, around the same time the Minnesota women made their discovery, an Australian man was sentenced to nine years in prison for producing deepfake content of 26 women. That same month, media reports detailed an investigation by Australian authorities into a school incident in which a teenager allegedly produced and shared deepfake content of nearly 50 female classmates.

“Whatever the worst potential of any technology is, it’s almost always directed against women and girls first,” remarked Mary Anne Franks, a professor at the George Washington University Law School.

Security researchers from the University of Florida and Georgetown University indicated in a research publication presented in August that nudify tools are drawing design influence from popular consumer applications and employing familiar subscription models. DeepSwap charges users $19.99 monthly for “premium” benefits, which include credits for AI video generation, expedited processing, and higher-quality images.

The researchers noted that “nudification platforms have fully integrated into mainstream consciousness” and are “advertised on Instagram and available in app stores.”

Guistolise shared that while she was aware that individuals could employ AI to fabricate nonconsensual pornography, she hadn’t realized how straightforward and accessible the apps were until she encountered a synthetic version of herself engaged in explicit activities. 

According to the screenshots from Ben’s DeepSwap page, the faces of Guistolise and the other women from Minnesota were neatly arranged in rows of eight, reminiscent of a school yearbook. Clicking on the images led to a series of computer-generated replicas involved in various sexual acts. The women’s faces had been overlaid with the nude bodies of others.

DeepSwap’s privacy policy indicates that users have seven days to view the content from the moment it is uploaded to the site, and that the data is stored for that duration on servers in Ireland. DeepSwap’s platform asserts it deletes the data after that time, yet users have the option to download it to their own computers in the interim. 

The site also features a terms of service page that states users are prohibited from uploading any content that “contains any private or personal information of a third party without such third party’s consent.” Based on the experiences of the Minnesota women, who did not provide any consent, it’s uncertain whether DeepSwap possesses any enforcement capabilities. 

DeepSwap offers little in the way of publicly available contact information and did not respond to multiple requests for comment from CNBC.

CNBC’s reporting found that a Minneapolis man used the AI site DeepSwap to create fake pornographic images and videos featuring the faces of more than 80 of his acquaintances.

In a press release issued in July, DeepSwap employed a Hong Kong dateline and included a quote attributed to a person identified as CEO and co-founder Penyne Wu. The media contact for the release was listed as marketing manager Shawn Banks. 

CNBC was unable to uncover online information about Wu and sent various emails to the address provided for Banks but received no response. 

DeepSwap’s website presently lists “MINDSPARK AI LIMITED” as its company name, provides an address in Dublin, and claims that its terms of service are “governed by and construed in accordance with the laws of Ireland.”

However, in July, the same DeepSwap page contained no mention of Mindspark, and references to Ireland instead indicated Hong Kong. 

Emotional impact

Kelley, 42, discovered her inclusion in Ben’s AI collection after receiving a text from Jenny. She invited Jenny to her home that same afternoon.

Kelley, who was six months pregnant at the time, said that after learning about the situation, it took her hours to gather the courage to look at the pictures on Jenny’s phone. She saw her face “very realistically on someone else’s body, in images and videos.”

Kelley stated her stress levels elevated to an extent that it began to jeopardize her health. Her doctor cautioned her that excessive cortisol levels induced by stress would hinder her body’s insulin production, Kelley recalled. 

“I was not enjoying life at all like this,” said Kelley, who, like Guistolise, filed a police report about the situation.

Kelley noted that in Jenny’s images, she recognized several of her close friends, including many from the service industry in Minneapolis. Following this, she informed the women and purchased facial-recognition software to assist in identifying other victims so they could be notified. About half a dozen victims still remain unidentified, she stated.

“It was incredibly time-consuming and extremely stressful because I was trying to work,” she explained. 

Victims of nudify tools may experience severe trauma, resulting in suicidal ideations, self-harm, and trust issues, according to Ari Ezra Waldman, a law professor at the University of California, Irvine, who testified at a 2024 House committee session addressing the harms of deepfakes.

Waldman indicated even when nudified images haven’t been shared publicly, subjects may fear that such images could eventually be leaked, causing distress as “now someone has this hanging over their head like a sword of Damocles.” 

“Everyone is susceptible to being objectified or pornographed by others,” he noted. 

Three victims showed CNBC explicit AI-generated deepfake images featuring their faces alongside those of other women during an interview in Minneapolis, Minnesota, on July 11, 2025.

Megan Hurley, 42, recounted trying to enjoy a cruise last summer off the western coast of Canada when an urgent text from Kelley disrupted her trip. Her vacation was spoiled. 

Hurley expressed immediate feelings of deep paranoia upon returning home to Minneapolis. She engaged in uncomfortable discussions with an ex-boyfriend and other male friends, requesting that they capture screenshots if they ever encountered AI-generated pornography resembling her. 

“I don’t know what your porn habits are like, but if you ever see anything featuring me, could you please screen capture and inform me where it is?” Hurley described the sort of messages she sent at that time. “Because we’d be able to confirm dissemination at that stage.”

Hurley reached out to the FBI but never received a response. She also completed an online FBI crime report, which she forwarded to CNBC. The FBI acknowledged CNBC’s request for comment but did not provide a reply.

The group of women began seeking assistance from lawmakers. They were directed to Minnesota state Senator Erin Maye Quade, a Democrat who had previously sponsored a bill that became law, criminalizing the “nonconsensual distribution of a deep fake depicting intimate parts or sexual acts.” 

Kelley secured a video call with the senator in early August 2024. 

During the virtual meeting, several women from the group shared their stories and expressed their frustrations regarding the limited legal options available. Maye Quade initiated work on a new bill, which she announced in February, aiming to compel AI companies to disable apps utilizing their technology to create nudify services. 

The bill, currently under consideration, would impose fines of $500,000 on tech companies offering nudify services for each nonconsensual explicit deepfake generated within the state of Minnesota.

Maye Quade told CNBC in an interview that the bill is the modern equivalent of long-standing laws making it illegal for someone to peek into another’s window and capture explicit photos without consent. 

“We have simply not addressed the rise of AI technology in the same manner,” Maye Quade stated.

Minnesota state Sen. Erin Maye Quade, on the left, converses with CNBC’s Jonathan Vanian and Katie Tarasov in Minneapolis on July 11, 2025, regarding her efforts to introduce state legislation imposing $500,000 fines on tech companies providing nudify services for every nonconsensual explicit deepfake image they generate in her state.
Jordan Wyatt | CNBC

However, Maye Quade acknowledged that enforcing the law against overseas companies represents a considerable challenge. 

“This is why I believe a federal response is more suitable,” she remarked. “Because a federal government has the capacity to take far greater actions against companies based in other nations.”

Kelley, who welcomed her son in September 2024, described one of her late October meetings with Maye Quade and the group as a “blur,” explaining that she was “mentally and physically unwell from lack of sleep and stress.”

She revealed that she now steers clear of social media. 

“I never shared the news of my second child’s birth,” Kelley stated. “Many are unaware that I had a baby. I simply didn’t want to put it online.”

The early days of deepfake pornography

The emergence of deepfakes can be traced back to 2018. That year, videos featuring former President Barack Obama delivering non-existent speeches and actor Jim Carrey appearing in place of Jack Nicholson in “The Shining” went viral. 

Lawmakers sounded the alarm. Platforms such as Pornhub and Reddit banned nonconsensual deepfake content from their sites. Reddit said at the time that it had eliminated a large subreddit devoted to deepfakes as part of enforcing a policy against “involuntary pornography.”

The community migrated elsewhere. One popular platform was MrDeepFakes, which featured explicit AI-crafted videos and served as an online discussion hub. 

By 2023, MrDeepFakes had emerged as the leading deepfake site on the internet, hosting 43,000 sexualized videos featuring nearly 4,000 individuals, according to a 2025 analysis of the site conducted by researchers from Stanford University and the University of California San Diego.

While MrDeepFakes asserted it only housed “celebrity” deepfakes, the researchers discovered “that hundreds of targeted individuals possess minimal online or public visibility.” The researchers also noted the emergence of a developing market, with some users agreeing to create personalized deepfakes for others at an average price of $87.50 per video, according to the paper.

Some ads for nudify services have surfaced in more mainstream venues. Alexios Mantzarlis, an AI security expert at Cornell Tech, revealed earlier this year that he identified over 8,000 ads on the Meta ad library across Facebook and Instagram for a nudify service named CrushAI. 

AI applications and websites such as Undress, DeepNude, and CrushAI exemplify some of the “nudify” tools that can be utilized to fabricate fake pornographic images and videos depicting real individuals’ faces extracted from innocuous online photographs.
Emily Park | CNBC

At least one DeepSwap advertisement appeared on Instagram in October, as indicated by the social media company’s ad library. The account associated with the ad does not seem to be officially linked to DeepSwap, but Mantzarlis suspects it may have been an affiliate partner of the nudify service.

Meta stated it reviewed ads linked to the Instagram account in question and found no infractions.

Prominent nudify services often appear on third-party affiliate platforms such as ThePornDude that monetize through mentions, according to Mantzarlis. 

In July, Mantzarlis co-authored a report examining 85 nudify services. The report discovered these services attract a total of 18.6 million unique monthly visitors, though Mantzarlis mentioned that figure does not encompass individuals sharing the content on platforms like Discord and Telegram.

In terms of business, nudify services constitute a minor portion of the generative AI market. Mantzarlis estimates their annual revenue to be around $36 million, but he clarified that this projection is conservative and incorporates only AI-generated content from platforms explicitly promoting nudify services. 

MrDeepFakes abruptly ceased operations in May, shortly after its primary operator was identified in a joint investigative report by Canada’s CBC News, Danish news outlets Politiken and Tjekdet, and the online investigative entity Bellingcat.

CNBC made several attempts to reach out via email to the address associated with the individual mentioned in the CBC materials but received no response. 

With MrDeepFakes shutting down, Discord has reportedly become an increasingly popular meeting place, according to experts. Primarily known within the online gaming community, Discord has approximately 200 million active users globally each month who access its servers to engage in discussions on shared interests. 

CNBC identified multiple public Discord servers, including one associated with DeepSwap, where users seemed to solicit others in the forum to create sexualized deepfakes based on the photos they provided. 

Leigh Cassidy Gibson, a University of Florida researcher, co-authored the 2025 paper that examined “20 popular and easily accessible nudification websites.” She confirmed to CNBC that while DeepSwap was not explicitly named, it was among the sites she and her colleagues investigated to understand the market. More recently, she said, they have turned their attention to various Discord servers where users seek guidance and instructions on creating AI-generated sexual content.

Discord declined to comment.

‘It’s absurd to me that this is legal right now’

At the federal level, the government has at least acknowledged the issue. 

In May, President Donald Trump signed the “Take It Down Act” into law. The legislation prohibits the online publication of nonconsensual sexual images and videos, including inauthentic ones produced by AI.

“An individual who breaches one of the publication offenses regarding depictions of adults is subject to criminal fines, imprisonment of up to two years, or both,” according to the text of the law.

Experts shared with CNBC that the law still does not address the primary issue confronting the Minnesota women, since there is no evidence that the material was circulated online. 

Maye Quade’s legislation in Minnesota underscores that the production of the material is the core crisis and necessitates a legal remedy. 

Some experts are apprehensive that Trump’s initiatives to strengthen the AI sector may undermine states’ efforts. In late July, Trump signed executive orders as part of the White House’s AI Action Plan, labeling AI development as a “national security necessity.”

As part of Trump’s proposed budget earlier in the year, states would have been discouraged from regulating AI for a decade or risked losing certain federal subsidies related to AI infrastructure. The Senate stripped that provision in July, leaving it out of the bill Trump ultimately signed.

“I wouldn’t put it past them to attempt to revive the moratorium,” commented Waldman from UC Irvine regarding the tech sector’s continuing sway over AI policies.

A White House representative informed CNBC that the Take It Down Act, supported by the Trump administration and enacted months prior to the AI Action Plan, criminalizes nonconsensual deepfakes. The representative noted that the AI Action Plan encourages states to permit federal laws to supersede individual state regulations.

In San Francisco, home to OpenAI and other highly valued AI startups, the city can pursue civil cases against nudify services under California consumer protection statutes. Last year, San Francisco sued 16 companies associated with nudify applications.

The San Francisco City Attorney’s office announced in June that an investigation related to the lawsuits had led to the deactivation of 10 of the most-visited nudify websites, rendering them inaccessible in California. One of the companies that faced action, Briver LLC, settled with the city and agreed to pay $100,000 in civil fines. Furthermore, Briver no longer operates websites capable of creating nonconsensual deepfake pornography, as stated by the city attorney’s office.


Further south, in Silicon Valley, Meta in June sued Hong Kong-based Joy Timeline HK, the entity behind CrushAI. Meta maintained that Joy Timeline attempted to “bypass Meta’s ad review process and continue placing these ads, after they were repeatedly removed for violating our policies.”

Nonetheless, Mantzarlis, who has been publishing his research on Indicator, stated that he continues to encounter nudify-related advertisements on Meta’s platforms. 

Mantzarlis and a colleague from the American Sunlight Project discovered 4,215 ads for 15 AI nudifier services that had appeared on Facebook and Instagram since June 11, as detailed in a joint report published on September 10. Mantzarlis said Meta eventually removed those ads, some of which only hinted at their nudifying capabilities.

Meta informed CNBC that earlier this month, it removed thousands of ads linked to enterprises offering nudify services and issued cease-and-desist notices to those entities for breaking the company’s ad standards.

In Minnesota, the group of friends is attempting to move forward with their lives while continuing to push for reforms. 

Guistolise expressed her desire for individuals to understand that AI could be misused against them in unimaginable ways.

“It’s crucial that people recognize this is genuinely happening, it’s truly accessible, it’s surprisingly easy to execute, and it unequivocally needs to cease,” Guistolise stated. “So here we are.”

Survivors of sexual violence can confidentially seek assistance from the National Sexual Assault Hotline at 1-800-656-4673.
