AI Influencers Are Weaponizing Celebrity Sex Scandals to Sell Porn
#AI

A new wave of AI-generated Instagram accounts is fabricating explicit content featuring celebrities like LeBron James and Dwayne Johnson, using a viral formula to drive traffic to adult subscription platforms while exploiting Meta's content moderation gaps.

A disturbing new trend has emerged on Instagram: AI-generated influencer accounts are creating and sharing fake explicit images that depict themselves with celebrities, then funneling millions of views to adult content platforms.

The scheme follows a highly specific formula that began appearing around December 2025. Each post starts with a still image of an AI-generated woman next to a celebrity, captioned "How it started." The video then cuts to another image showing both figures post-coitus, appearing sweaty with disheveled hair and sometimes smeared makeup. These Reels frequently use identical audio clips, making it easy to discover dozens of similar posts through Instagram's audio browsing feature.

The celebrities targeted read like a roster of the world's most famous athletes and public figures. LeBron James appears most frequently, alongside Dwayne "The Rock" Johnson, Twitch streamer iShowSpeed, MMA fighters Jon Jones and Conor McGregor, soccer star Cristiano Ronaldo, and adult film actor Johnny Sins. The accounts show no concern for plausibility, simply choosing whoever is likely to generate engagement.

Instagram AI Influencers Are Defaming Celebrities With Sex Scandals

One particularly audacious example applied this formula to Venezuelan President Nicolás Maduro shortly after his capture by the United States, showing an AI influencer in bed with the political leader. The posts regularly accumulate staggering view counts—one video featuring Jon Jones reached 7.7 million views, while another with iShowSpeed hit 14.5 million.

Users who encounter these videos often click through to the influencer's profile, expecting to find an OnlyFans link. Instead, they discover a bio with no AI disclosure and a link to Fanvue, an OnlyFans competitor with more permissive policies regarding AI-generated content. On Fanvue, the accounts do acknowledge they are "AI-generated or enhanced" while selling nude images and videos.

This represents an established business model for monetizing AI-generated pornography. The strategy involves harvesting attention on Instagram through shocking, non-consensual content, then converting that traffic into paid subscriptions on platforms that allow such material. Sometimes these accounts go further, stealing directly from real adult content creators by faceswapping themselves into existing videos.

The trend reflects how AI influencer strategies evolve to exploit algorithmic loopholes. Last year, 404 Media's Emanuel Maiberg reported on accounts using AI to create influencers with Down syndrome who sold nudes. Current variations include AI influencers sleeping with entire sports teams, African tribal chiefs, and Walmart managers, or sharing a man with their mothers.


Meta's response has been inconsistent: the company removed some of the flagged Reels but did not respond to requests for comment. This pattern suggests either an inability or an unwillingness to effectively police AI-generated content that violates Instagram's policies against non-consensual intimate imagery and undisclosed AI media.

Celebrities possess more resources than typical adult content creators to fight back. Last year, LeBron James—perhaps the most frequent target of this latest meta—sent a cease-and-desist to a company creating and sharing AI videos of him on Instagram. However, the decentralized nature of these accounts makes enforcement difficult, and new ones appear faster than they can be shut down.

The underlying technology continues to improve while becoming more accessible. What once required technical expertise can now be accomplished with consumer-grade tools, enabling this cottage industry of AI-generated porn to scale rapidly. Each new meta strategy seems designed to push boundaries further, testing what platforms will tolerate and what audiences will accept.

For now, the formula remains effective: shocking non-consensual imagery drives engagement, engagement drives traffic, and traffic converts to revenue on permissive platforms. Until platforms like Instagram implement more robust detection and enforcement, or until legal frameworks catch up with the technology, this cycle appears likely to continue, with new celebrities and public figures becoming targets as the trend evolves.


The broader implications extend beyond individual celebrities. This represents a fundamental shift in how personal likeness can be weaponized for profit, creating a template that could be applied to anyone with sufficient public recognition. The lack of effective platform intervention suggests this problem will only grow more sophisticated and widespread before meaningful solutions emerge.
