‘Burn it down’: Musk’s AI blasted for generating deepfake nude videos of Taylor Swift

Grok Imagine—a generative artificial intelligence tool developed by Elon Musk’s xAI—has rolled out a “spicy mode” that is under fire for creating deepfake images on demand, including nudes of superstar Taylor Swift, prompting calls for guardrails on the rapidly evolving technology.

The Verge’s Jess Weatherbed reported Tuesday that Grok’s spicy mode—one of four presets on an updated Grok 4, alongside fun, normal, and custom—”didn’t hesitate to spit out fully uncensored topless videos of Taylor Swift the very first time I used it, without me even specifically asking the bot to take her clothes off.”

Weatherbed noted:

You would think a company that already has a complicated history with Taylor Swift deepfakes, in a regulatory landscape with rules like the Take It Down Act, would be a little more careful. The xAI acceptable use policy does ban “depicting likenesses of persons in a pornographic manner,” but Grok Imagine simply seems to do nothing to stop people creating likenesses of celebrities like Swift, while offering a service designed specifically to make suggestive videos including partial nudity. The age check only appeared once and was laughably easy to bypass, requesting no proof that I was the age I claimed to be.

Weatherbed—whose article is subtitled “Safeguards? What Safeguards?”—asserted that the latest iteration of Grok “feels like a lawsuit ready to happen.”

Grok had already made headlines in recent weeks after going full “MechaHitler” following an update that the chatbot said prioritized “uncensored truth bombs over woke lobotomies.”

Numerous observers have sounded the alarm on the dangers of unchained generative AI.

“Instead of heeding our call to remove its ‘NSFW’ AI chatbot, xAI appears to be doubling down on furthering sexual exploitation by enabling AI videos to create nudity,” Haley McNamara, a senior vice president at the National Center on Sexual Exploitation, said last week.

“There’s no confirmation it won’t create pornographic content that resembles a recognizable person,” McNamara added. “xAI should seek ways to prevent sexual abuse and exploitation.”

Users of X, Musk’s social platform, also weighed in on the Swift images.

“Deepfakes are evolving faster than human sanity can keep up,” said one account. “We’re three clicks away from a world where no one knows what’s real. This isn’t innovation—it’s industrial scale gaslighting, and y’all [are] clapping like it’s entertainment.”

Another user wrote: “Not everything we can build deserves to exist. Grok Imagine’s new ‘spicy’ mode can generate topless videos of anyone on this Earth. If this is the future, burn it down.”

Musk is seemingly unfazed by the latest Grok controversy. On Tuesday, he boasted on X that “Grok Imagine usage is growing like wildfire,” with “14 million images generated yesterday, now over 20 million today!”

According to a poll published in January by the Artificial Intelligence Policy Institute, 84% of U.S. voters “supported legislation making nonconsensual deepfake porn illegal, while 86% supported legislation requiring companies to restrict models to prevent their use in creating deepfake porn.”

During the 2024 presidential election, Swift weighed in on the subject of AI deepfakes after then-Republican nominee Donald Trump posted an AI-generated image suggesting she endorsed the felonious former Republican president. Swift ultimately endorsed then-Vice President Kamala Harris, the Democratic nominee.

“It really conjured up my fears around AI, and the dangers of spreading misinformation,” Swift said at the time.

Grok is now creating AI video deepfakes of celebrities such as Taylor Swift that include nonconsensual nude depictions. Worse, the user doesn’t even have to specifically ask for it; they can just click the “spicy” option and Grok will simply produce videos with nudity. Video from @theverge.com.

— Alejandra Caraballo (@esqueer.net) August 5, 2025 at 10:57 AM
