DEAD RINGERS
AI is giving life to a resurrection economy
As AI tools become more sophisticated, they’re increasingly being used to imitate the voices and likenesses of the dead, with and without permission. It can be big business.
When singer-songwriter Blaze Foley was shot and killed in Austin, Texas, in 1989, he had released only two singles and one little-known studio album.
In the decades following his death, friends and fans continued to champion Foley’s legacy, elevating him to the status of a cult icon — a musician’s musician — with songs like “Clay Pigeons” covered by artists including Willie Nelson, John Prine, Michael Cera, and the Avett Brothers. In 2018, a biopic titled “Blaze,” directed by Ethan Hawke, premiered at Sundance.
That’s how the Blaze Foley canon continued — until sometime last year, when a new song appeared on the artist’s Spotify page. The three-minute single, titled “Together,” sat atop Foley’s profile, featuring a clearly AI-generated image — a sort of Jared Leto Joker-esque man singing into a microphone. This, in case you don’t know, is not what Blaze Foley looked like.
It’s also not what he sounded like.
As reported by 404 Media, it turns out “Together” — which Spotify eventually removed — was an AI-generated song posted to Foley’s Spotify page without the permission of his estate or label. 404 also noted at least one other instance of a new, seemingly AI-generated song on another deceased artist’s profile. More examples of this trend have popped up since.
Though the Foley song was unauthorized and not even a clear attempt at impersonation, “resurrecting” deceased public figures is becoming a growing use case for AI, particularly as consumer-facing tools become more sophisticated.
Last month, Variety reported that Val Kilmer — who died in 2025 — will appear via AI in the upcoming historical drama “As Deep as the Grave.” Kilmer signed onto the film years ago but never made it onto set; per the report, his AI-generated likeness will appear in more than an hour of the finished film and was used with permission from his daughter and estate, which was compensated for his appearance.
At least online, the public reaction to this use of AI leans heavily negative, far from the realm of “technological awe” that the companies behind it might be imagining or hoping for. In a recent survey on preferred AI use in films and television conducted by The Wrap and NRG, 51% of respondents said that replicating the voice of a deceased actor or celebrity was “not acceptable,” compared to just 35% who supported it. Still, the number of AI deals that involve “bringing back” dead celebrities continues to grow every day.
And those AI-estate pacts can mean big money. According to an industry source with knowledge of deals between estates and AI companies that reproduce voices, such deals typically pay estates cash sums ranging from five to seven figures, depending on the commercial scale and duration of the partnership.
In 2022, James Earl Jones — the iconic voice behind Darth Vader — reached an agreement with Disney’s Lucasfilm to recreate his voice using AI for future “Star Wars” projects. Jones died in 2024, but his AI-generated voice continues to be put to use in deals like Disney’s “Fortnite” partnership with Epic Games. Jones’ voice launched on “Fortnite” in May 2025, and users immediately pushed it beyond its intended bounds, having Darth Vader say “skibidi toilet” and various curse words, among other things. Epic Games reportedly patched the issue “within 30 minutes.”
Voice cloning, which has been technically feasible for longer than other generative-AI uses, has proven itself a major market for AI estate deals. In 2023, DeepZen reached a deal with the estate of “Gilmore Girls” actor Edward Herrmann (who died in 2014) to use Herrmann’s cloned voice to narrate several audiobooks.
In 2024, ElevenLabs launched its ElevenReader app globally. For $11 a month, users can have links, audiobooks, PDFs, and written text read aloud by cloned voices of dead celebrities including Judy Garland, Burt Reynolds, John Wayne, and Albert Einstein. (Estates representing more celebrities, including Jerry Garcia and Matthew McConaughey — who is alive — have made deals for more restricted uses.)
“Our family believes that this will bring new fans to Mama, and be exciting to those who already cherish the unparalleled legacy that Mama gave and continues to give to the world,” said Liza Minnelli (Garland’s daughter) in a 2024 statement about the partnership. In the same statement, ElevenLabs said its “iconic voices” are exclusively for individual streaming and not meant for creating content to be shared (and possibly monetized).
AI’s role as a resurrector dovetails with the existential marketing that has been used to hype the tech over the past few years. That AI models may be able to “bring back” beloved dead celebrities fits neatly into the narrative that the tech’s powers are “limitless” or that certain models are “too dangerous to release.” While ElevenLabs’ celebrity voice deals are a large part of its external marketing, the company’s own financial announcements imply that most of its recent revenue growth comes from enterprise clients using ElevenLabs for their customer service departments.
“This whole industry is based on evangelism,” said Jamie Cohen, a professor of media studies at CUNY Queens College. “Companies are projecting their own use cases onto the consumer. What they’re doing is assuming a market exists that is like, ‘I don’t have to choose this new voice actor; I can have Jerry Garcia’s voice for my book.’”
Cohen says we’re seeing more micro-markets emerge as AI struggles to find reliable revenue. “You’re also really screwing with the way people interpret and trust in the media too, so I’m not sure if they’re aware of the glaring problems that they’re potentially creating with this.”
If these sorts of lucrative AI-estate deals do have sticking power, they stand to massively boost the “digital legacy market,” which was valued at $22.46 billion in 2024 and could triple by 2034, The Atlantic reported.
Commercial AI deals are increasingly being brought to estates, says Tina Xavie, who runs the brand and icon management firm Xavie Agency. The company’s client list includes Jerry Garcia, Marvin Gaye, Jonas Salk, and Liberace.
“There are so many small [AI] companies that kind of fly under the radar that are doing AI voice, AI images,” said Xavie. “So yes, the volume of inquiries has increased, but also the number of infringements has increased.”
Xavie says clients are mostly interested in making deals in which the celebrity’s voice or likeness is used in a way not entirely related to what the deceased was best known for, e.g., a famous musician’s voice narrating audiobooks.
“The estate will approve the final work, so it has to sound like them, look like them, talk like them,” Xavie said. “The clients that are embracing this technology want this to be authentically done.”
As Xavie pointed out, AI is far from a pure boon for estates. CMG Worldwide, a legacy management firm with a client list including Garland, Jim Henson, Andre the Giant, and James Dean, refers to AI as an “unprecedented threat” on its website. Legacy management companies have had to become significantly more dynamic to respond to AI tools like the now-defunct Sora, which, immediately upon launch, was flooded with videos featuring dead public figures like Martin Luther King Jr., Robin Williams, John F. Kennedy, and Bob Ross. (A few months before it shuttered the app, OpenAI said it would allow representatives of “recently deceased” public figures to request their likeness be blocked from use on Sora.)
In October, Zelda Williams, film director and daughter of Robin Williams, posted a story on Instagram urging fans to stop sending her AI-generated videos of her father. Williams wrote:
“To watch the legacies of real people be condensed down to ‘this vaguely looks and sounds like them so that’s enough’, just so other people can churn out horrible TikTok slop puppeteering them is maddening...
AI is just badly recycling and regurgitating the past to be re-consumed. You are taking in the Human Centipede of content, and from the very very end of the line, all while the folks at the front laugh and laugh, consume and consume.”
“I concur concerning my father,” Bernice King, daughter of Martin Luther King Jr., wrote in a post on X.
The proliferation of unauthorized AI-generated deepfakes and songs has also expanded a secondary market for third-party companies to track and identify that content across the internet, for both platforms and estates. According to deepfake detection company Reality Defender, large online platforms have been slow to protect artists, both living and dead, from unwanted AI imitation.
“Which social platforms have robust protection against deepfakes, AI, and the misuse of copyright? The answer is almost none of them,” a Reality Defender spokesperson said in an interview with Sherwood News. “Because the platforms aren’t doing anything about this — they’re doing watermarking checks and that’s about it — it’s going to continue to be Whac-A-Mole as more and more content becomes AI-generated.”
Reality Defender says the cost to protect brands and likenesses against AI-generated infringement could be astronomical, and that the burden is very much on estates and brands.
Beatdapp, a music streaming fraud detection company, found that fraudulent streams — including streams generated by AI bots and streams of AI-made songs — make up between 5% and 10% of all music streams online. That, cofounder Morgan Hayduk says, adds up to $1 billion to $2 billion every year.
“There is no doubt in my mind: this is not a 1% problem. We have seen platforms where it’s a 20% problem. We’ve seen platforms where it’s a 40% problem for brief periods of time,” said Hayduk, who added that while platforms are getting better at detection, the flood of AI content is making the situation more difficult. “About a third of all the uploads every day are AI-generative, and that number is growing week over week, month over month.”
Hayduk thinks platforms should be more restrictive about what content gets uploaded. If a platform is going to allow anything to be uploaded, he says, they should at least have a better understanding of what is and what isn’t AI, so it can be treated appropriately before it has real consequences on royalty pools.
Platforms have taken a variety of positions on this topic. Earlier this year, Bandcamp announced a ban on AI music on its platform, writing, “Any use of AI tools to impersonate other artists or styles is strictly prohibited.” In April, French music streamer Deezer, which says it’s the “only streaming platform in the world transparently tagging AI-generated music,” announced that 44% of all new music uploaded to its platform was AI-generated. Spotify last year said it removed 75 million spam tracks, which have become easier to produce as consumer-facing AI tools scale. At the time, the company said AI music wasn’t “impacting streams or revenue distribution for human artists in any meaningful way.”
Shortly after 404’s report on the unauthorized song uploaded to Blaze Foley’s Spotify page, the platform removed “Together.” In March, Spotify released a limited beta of a feature called “Artist Profile Protection,” which allows artists to review songs before they’re released on their profiles. In its launch post, Spotify cited the proliferation of AI-generated music as the reason for the feature:
“Music has been landing on the wrong artist pages across streaming services, and the rise of easy-to-produce AI tracks has made the problem worse. That’s not the experience we want artists to have on Spotify, and that’s why we’ve made protecting artist identity a top priority for 2026.”
In late April, Spotify launched a new verification badge that signals an artist is human and has consistent listener activity. As of launch, profiles representing AI-generated or AI-persona artists aren’t eligible for the badge. Despite these updates, a Spotify spokesperson told Sherwood that the platform is “still seeing very low levels of interest in primarily prompt generated music.” On the topic of resurrection AI, the spokesperson said the following:
“Unauthorized vocal impersonation and other deceptive uses of artist identity have no place on Spotify. We employ a range of safeguards to protect artists, including systems designed to detect and prevent unauthorized content, human review, and reporting and takedown processes. Spotify is also the only streaming service to offer Artist Profile Protection, which lets artists, or the estates and rights holders managing their catalogs, opt in to review and approve or decline incoming releases before they go live on their profile.”
YouTube last month expanded its “likeness detection” tech to the entertainment industry, with support from big agencies and management companies including CAA, UTA, WME, and Untitled Management. The tool “helps creators find content on YouTube where their face appears to be altered or generated by AI” and request its removal, according to YouTube. In a support page about the tool, YouTube says it will not remove content it deems to be parody or satire. YouTube declined to comment for this story on how it determines what constitutes parody and satire within AI-generated content or how it reviews that content.
These sorts of updates from major platforms serve both to protect their reputations and to shield them from potential liability, says Santa Clara University law professor and right of publicity expert Tyler Ochoa.
“The point of doing [that], from a legal point of view, is that they are making reasonable efforts to ensure that their tool isn’t being used in ways that would violate rights of publicity,” Ochoa said. The right of publicity, a matter of state law, is central to the controversy surrounding deepfakes and the AI tools that let users publish content imitating celebrities.
Generally, the law grants people control over the use of their name, image, and likeness — subject to First Amendment protections like parody, news, and political speech. According to Ochoa, imitating a (living or dead) celebrity’s voice or likeness without consent for commercial purposes would violate the right of publicity in most states. Instances that aren’t overtly commercial and are more about showing off the technology would be harder to win for a rights holder.
Adding complexity to the situation is the fact that AI users violating the law are routinely doing so with tools offered by large platforms. While many major AI generators are designed to block prompts that explicitly ask to imitate real artists or intellectual property, most of those blocks can be bypassed with simple prompt workarounds. (For example, rather than asking a bot to generate a new Beatles song, one could ask for ’60s rock with a Liverpool vibe.)
“Many of the model builders have completely gone against the sum total of copyright law in the United States to build these models,” said a Reality Defender spokesperson. “And so you have these keywords of which you cannot build things on, but that’s because they’ve already likely taken the copyrighted materials which feature those likenesses. And also, there are so many ways to get past that.”
Still, Ochoa says, the Supreme Court recently shielded platforms from secondary liability (liability for enabling an act rather than committing one) with a major decision in the Cox v. Sony Music copyright case.
“If we take that decision seriously, and we use it in right of publicity law, the platform can’t be held liable just because it can be used to infringe. It can only be [held liable] if it intended users to infringe,” said Ochoa. “So if they’re taking reasonable measures to try to prevent people from infringing, the fact that people can get around those measures shouldn’t matter. It’ll be the user who’s liable.”
Since it’s not economically viable to sue every user posting this sort of content, rights holders and estates have to choose their targets carefully, Ochoa said.
Imitating the dead for financial gain is nothing new, but what’s clear is that AI is making it easier — and harder to distinguish from the real thing than, say, a person in an Elvis jumpsuit singing in a casino. Beatdapp’s Hayduk predicts that more unauthorized commercial uses are on the way.
“I do think there is a market for the unreleased mixtapes of the world, and pretending that you have some valuable IP that’s been locked in a bunker somewhere and suddenly is being unearthed,” Hayduk said. “I don’t think we’re so far away now.”
