
Raising the Dead with AI

Using artificial intelligence to create lifelike avatars of people who have passed.


It is now possible to use technology to raise the dead.

Well, kind of.

We haven’t cracked the code on how to live forever, or discovered how to bring someone back to biological life. (Although there are plenty of startups that aspire to solve those challenges.)

Instead, it has become much easier and more common to “resurrect” the dead by creating lifelike artificial intelligence (AI) avatars of them.

Thanks to advancements in generative AI (artificial intelligence that can generate language, imagery, and audio, among other media), users are now able to speak with “ghostbots” that mimic people who have passed away.

Think of it as ChatGPT for the dearly departed.

The ubiquity of powerful AI language models, combined with the increasingly large digital footprints left by the dead, has made it easier than ever to simulate the persona of someone after they have died, and to converse with it.

As a result, the practice of holding a real, live conversation with an AI representation of a dead person is growing in popularity.

Character.ai, a website that hosts a range of lifelike chatbots of people living, dead, and fictional, is one of the top-trafficked generative AI sites in the world—second only to ChatGPT. The site provides the ability to chat with notable dead people like William Shakespeare, Queen Elizabeth II, and J.R.R. Tolkien.

Another site, Hello History, allows you to have conversations with dozens of other historical figures.

However, it’s not the ability to trade barbs with the Bard or to have virtual tea with a deceased monarch that has experts worried. It’s the fact that individuals are also using such ghostbots to resurrect dead family members and friends to find solace—a path that, they say, is replete with emotional, legal, and ethical pitfalls.

Speaker for the dead

At the technological level, ghostbots today are not magic, but math. They consist of AI language models that are trained on the data generated by a dead person. The models use this data to make mathematical predictions about what the person would be likely to say and how they would be likely to sound.
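The “math, not magic” idea can be illustrated with a deliberately tiny toy. Real ghostbots use large neural language models, but the underlying principle is the same as in the bigram model sketched below: count what a person actually wrote, then predict the statistically most likely next word. The training messages here are invented placeholders, standing in for a real person’s message archive.

```python
import random
from collections import defaultdict

def train(messages):
    """Count word-pair frequencies in a person's messages.

    The 'model' is just statistics: for each word, how often
    each following word appeared after it in the training text.
    """
    counts = defaultdict(lambda: defaultdict(int))
    for msg in messages:
        words = msg.split()
        for cur, nxt in zip(words, words[1:]):
            counts[cur][nxt] += 1
    return counts

def most_likely_next(counts, word):
    """Predict the next word the person would most likely have used."""
    followers = counts.get(word)
    if not followers:
        return None
    return max(followers, key=followers.get)

# Hypothetical stand-in for a deceased person's message archive.
messages = [
    "good morning love you",
    "good morning see you tonight",
    "good night love you",
]
model = train(messages)
print(most_likely_next(model, "good"))  # "morning" (seen 2 of 3 times)
print(most_likely_next(model, "love"))  # "you"
```

A production system replaces the word counts with a neural network trained on billions of sentences and fine-tuned or prompted with the individual’s data, but the output is still a prediction of what the person would be likely to say, not a recovered memory.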

In one high-profile example of how they work, from 2021, a man named Joshua Barbeau used a paid AI tool to recreate his dead girlfriend by feeding the system her text messages and Facebook account. Barbeau was then able to text with a simulacrum of the woman, who had suddenly died eight years before.

Thanks to recent advancements in AI language models, these types of tools don’t just text back canned responses; they are able to hold up their end of dynamic, longform, natural-language conversations on many subjects.

Many ghostbot tools do still suffer from problems associated with generative AI tools like ChatGPT; namely, they hallucinate, or make up convincing, but incorrect, information. After a while, a ghostbot may produce answers or stories that sound nothing like a specific deceased person. They may even invent stories or memories that never happened. They also regularly produce outdated information or misinformation, and degrade in performance over longer conversations.

Despite their flaws, ghostbots—and their descendants—are getting very good, very fast. And some of them are taking things way further than just text.

Thanks to the rapid pace of generative AI innovation, we are quickly gaining the ability to create convincing deepfakes of voices, photos, and videos depicting real people, fakes that are increasingly indistinguishable from reality.

One popular example is ancestry company MyHeritage’s “Deep Nostalgia” platform, which lets you upload images of dead relatives and convincingly animate them.

In fact, AI for recreating the dead is going to undergo a “transformational” shift in the coming years, says James Hutson, a professor at Lindenwood University in Missouri who studies ghostbots. As we layer more powerful extended reality onto powerful video and voice AI avatars, Hutson says, we’re going to create even deeper, more realistic, and more immersive experiences with the deceased.

We very likely will soon interact not only with two-dimensional chatbots that represent the dead, but also with lifelike three-dimensional representations of them overlaid on the real world through augmented reality, or within entirely virtual environments through VR, Hutson says.

“These recreations would not merely serve as a nostalgic reminder, but offer a multi-sensory connection, blurring the boundaries between the corporeal and the digital.”

Mortal consequences

These advancements in AI for simulating the dead have experts concerned from an emotional perspective.

On one hand, AI companions for processing grief do serve a purpose, says Emmanuelle Vaast, a professor of information systems at Canada’s McGill University. She points to their use in successfully alleviating loneliness and dealing with grief after losing a loved one. However, overreliance on increasingly realistic ghostbots presents huge problems, she says. They can stall the grieving process and erode a person’s grip on reality.

Not to mention, they are software—and software changes all the time.

“Changes in the design of an AI bot with pre-existing relations to humans could be emotionally devastating to people dealing with grief,” Vaast says.

But the problem goes even deeper, given how intimately we’re all affected by death.

It is one thing to use a static digital memorial to remember someone online, and quite another to interact with a sophisticated copy of them constantly, says Leah Henrickson, a lecturer at Australia’s University of Queensland who has done work on ghostbots. “This technology doesn’t allow the dead to die.”

Even if ghostbots do become widely accepted, a whole new set of problems arises.

Ghostbots themselves are “mortal,” says Edina Harbinja, a media and privacy law researcher at Aston University in the U.K. They are AI-based services that require investment and maintenance to keep running. If a company running a popular ghostbot with millions of users fails, the collapse is not just a financial catastrophe but a traumatic, emotional one, as users lose access to virtual loved ones.

Also, if resurrecting the dead in this manner becomes common, what does that mean for people without strong digital presences in life? Will some people stay “more dead” than others because they can’t easily be resurrected? Not to mention, what happens to the living in a world where it’s normal to converse with the dead?

“If we keep reanimating our dead acquaintances, never letting them actually be dead, how does that change how we feel about our own lives and deaths?” asks Henrickson.

“The digital shadow of a loved one might inadvertently become an emotional crutch,” warns Hutson, especially as ghostbots become more and more lifelike.

A ghost in the machine

Aside from the emotional considerations, resurrecting the dead with AI is a legal and ethical minefield, too.

Today, the personal data of deceased people is either not protected at all from use in AI systems, or it is protected to some limited extent, as is the case in some European Union countries, says Harbinja. “This means that their data can be used quite freely by ghostbot service providers to feed the algorithms and resurrect the dead.”

Family and friends can also use an individual’s data to train their own ghostbots with relative freedom. That means the best you can do is clearly express your wishes regarding ghostbots to family and friends—and hope they follow them after you die.

Because of that, Henrickson sees it as highly likely that in the future, we will all become much more conscious of how our digital assets are handled, and how they figure into our death-related plans.

The creation of ghostbots quickly sinks into murky ethical territory. Even if you’re technically allowed to create a ghostbot of someone, should you?

Recreating someone via ghostbot is a kind of forced immortality. At best, it may come off as strange or creepy to insist you have the right to converse with a digital clone of a dead person. At worst, it may be seen as a non-consensual violation of an individual’s dignity as a human being.

In short, the problem of ghostbots isn’t just a technological one. It’s also a deeply cultural one whose severity depends on who is being resurrected. Many people feel very differently about a ghostbot of a famous historical figure than about one of a close personal relation.

“The challenge lies not just in harnessing AI’s potential, but in ensuring its application respects the sanctity of individual legacies, the intricacies of human relationships, and the inherent dignity of the bereavement process,” says Hutson.

The only problem—and it’s a big one—is that the technology is advancing way faster than our ability to cope with the changes in behaviors and norms it causes.

Tal Morse, a researcher on media and death at Israel’s Hadassah Academic College, says how people feel about ghostbot technology matters—and among many people there is still massive reluctance to adopt these technologies, no matter how rapidly they develop.

One stark example of this is the reporting around the Joshua Barbeau story cited earlier. Most reports described his decision to resurrect his dead partner with a tinge of revulsion. Many people are still nowhere near comfortable using, or being around, technologies that respond and behave as a dead person once did.

So just because the technology continues to improve and can produce increasingly convincing replicas of the deceased does not mean it will become more accepted, or be seen as more desirable, by society at large.

“Despite the promise of this industry, people do not perceive its products as an authentic, reliable replacement of the person that was lost,” says Morse.

