It was a perfect crime.
AI slop is taking over the internet, and it’s making it increasingly difficult for web users to know what is real, what is fake, and what is some weird, synthetic mixture between the two. 404 Media reports on the newest frontier in this bizarre, dystopic trend, revealing the rise of the AI-generated “True Crime” podcast which, as you might imagine, isn’t based in truth at all.
The YouTube channel True Crime Case Files has racked up millions of views by spitting out long, drawn-out yarns of murder and mayhem that are complete and utter bullshit. Some of the channel’s videos have only a few hundred views; others have tens or even hundreds of thousands.
“It needs to be called ‘true crime,’ because true crime is a genre,” the channel’s owner told 404. “I wanted [the audience] to think about why […] they care so much that it was true, why it matters so much to them that real people are being murdered.”
The owner apparently added: “True crime, it’s entertainment masquerading as news […] that’s all there is to it.”
The YouTuber’s AI-generated plotlines are described as “disturbing, often hypersexual” exercises in luridness, apparently designed to draw in viewers with their over-the-top craziness. The YouTuber said he was inspired to create the channel after spending a lot of time watching Dateline with his family. He said he realized such shows were driven by a highly formulaic structure, and that the formula was easy to regurgitate. From there, he began experimenting with content creation via ChatGPT. Not long afterward, he started the channel.
“I labeled it [as] AI parody, and it didn’t do well […] I think part of it is people are just hostile towards AI. So when they see the word AI, they’re just freaked out by it,” the YouTuber told 404. He subsequently took down the “parody” disclaimer, and his channel’s views began to take off.
Having gone hunting for the YouTube videos that 404’s article mentions, I stumbled across a number of other channels doing the exact same thing. It’s clearly an appealing racket for someone looking to make a quick buck: web hustlers figure they can use automated content-creation platforms like ChatGPT to tap into the attention economy and eke out a small living.
Sometimes it doesn’t pay off, however. Last year, a YouTube show attempted to use an AI-generated “George Carlin comedy special” to drum up attention. The stunt ended in litigation from Carlin’s family, and the video was permanently pulled from the web.