Engineers create AI version of Joe Rogan's voice

It's "The Joe Rogan Experience," literally like you've never heard it before.

Researchers at Dessa’s Meta Labs announced Friday that they had created an AI voice-mimicry simulation they are calling the most realistic to date, demonstrating the technology with a shockingly lifelike but fake audio clip mimicking Rogan's popular podcast.

In the short clip, the fake Rogan voice extols the virtues of chimpanzee athletes and explains that he has invested money in an all-chimpanzee hockey team he hopes will soon play in the NHL.

Rogan reacted to the clip Friday on Instagram, calling the voice mimicry "terrifyingly accurate."

"I just listened to an AI generated audio recording of me talking about chimp hockey teams and it’s terrifyingly accurate. At this point I’ve long ago left enough content out there that they could basically have me saying anything they want, so my position is to shrug my shoulders and shake my head in awe, and just accept it," he wrote on Instagram.

"The future is gonna be really f---ing weird, kids," Rogan added.

Researchers behind the software warned that as AI technology improved, such software could be used to mimic celebrities and politicians for devious purposes.

"Not to mention the fact that the model would be capable of producing a replica of anyone’s voice, provided that sufficient data is available," they wrote on Medium. "As AI practitioners building real-world applications, we’re especially cognizant of the fact that we need to be talking about the implications of this."

"But in the next few years (or even sooner), we’ll see the technology advance to the point where only a few seconds of audio are needed to create a life-like replica of anyone’s voice on the planet," they continued. "It’s pretty f---ing scary."

Among the nefarious uses the researchers cited were fake audio clips of politicians created to incite social unrest, and scammers mimicking the voices of victims' family members to trick them into handing over cash.