Chatbots of the dead
Take everything someone has ever written — every text message, email, journal entry, blog post — and feed it into a chatbot. Imagine that after that person dies, they could then continue to talk to you in their own voice, forever. It’s a concept called “chatbots of the dead.” In 2021, Microsoft was granted a patent for a program that would do exactly that: train a chatbot to emulate the speech of a dead friend or family member.
“Yes, it’s disturbing,” admitted Tim O’Brien, Microsoft’s general manager of AI programs, when news of the patent hit the headlines. For some, the notion of talking to a loved one from beyond the grave elicited feelings of revulsion and fear, something that philosophy researchers Joel Krueger and Lucy Osler call “the ick factor.” In October 2022, the U.K. researchers, based at Exeter and Cardiff universities respectively, published “Communing with the Dead Online,” a research paper that looks at the role chatbots could play in the grieving process. Since then, the capabilities of artificial intelligence large language models have snowballed, and so has their influence on our lives. Krueger and Osler say we should consider how chatbots might help us in our darkest days by continuing our relationships with loved ones after they’ve died.
This conversation has been edited for length and clarity.
What role could a chatbot potentially play after a person has died?
Lucy: Sadly, Joel’s dad died while we were researching this, which added a very different texture to the writing experience. It changed a lot of the conversations we were having around it.
Joel: I started thinking more carefully about some of the ways I wanted not just to preserve his memory but to create more active, and maybe dynamic, ways of maintaining his presence in my life. I started thinking about what role chatbots and more sophisticated technologies might play in maintaining a continuing bond with him.
For what it’s worth, I’m still undecided. I’m not sure I’d want a chatbot of my father. But I started thinking more about this issue in that very real context, as I was negotiating my own grief.
Tell me about the ‘ick factor’ — this response that I’m even having right now, thinking about talking to a family member via a chatbot from beyond the grave.
Lucy: If someone turns around and says, ‘Did you know that we can now create a chatbot of the dead that impersonates someone’s style of voice?’ a very common reaction is: ‘gross,’ ‘ew,’ ‘that’s really scary.’ There’s that kind of knee-jerk reaction. But we think there might be interesting and complicated things to unpack there. People have this instinctive ick factor when it comes to conversing with the dead. There’s an old Chinese ritual in which a paid impersonator of the dead person would attend the funeral and play the role of the deceased, and I think lots of Western ears find that startling and a bit strange. Historically, we recognize that. But the fact that something is unfamiliar is not a reason to say it has no worth at all. Grieving practices come in all shapes and forms.
Do you think talking with a chatbot, after someone has died, would interrupt the natural grieving process? Or the stages of grief like denial, bargaining and acceptance?
Lucy: Using a chatbot of the dead isn’t about denying someone has died. It’s about readjusting to a world where you’re very aware that they have died, without letting go of various habits of intimacy. You don’t have to just move on in a very stark sense. We can have a kind of nuanced and ongoing adjustment to someone’s death and take time to emotionally adjust to the absence we now feel, as we learn to inhabit the world without them.
Joel: We’ve always employed various technologies to find ways to maintain a connection with the dead, and this is just one new form of these technologies. There are lots of ways of getting stuck, and certainly, we can get trapped in those patterns of not accepting the loss. For instance, someone could wake up each day, go through the same pictures, watch the same videos, scroll the same Facebook page. It’s unclear to me whether there’s any greater threat when it comes to chatbots. Chatbots do provide a much richer form of reciprocity, a kind of back-and-forth in which the person may feel more present than if we’re just looking at a picture of them.
Yes — and there are now AI programs that allow you to talk and interact with a video or hologram version of the person that has died.
Joel: Yes! Since our research came out late last year, the world has already moved on so much. And some of the grief technology now already seems worlds ahead of a chatbot that’s confined to some little textbox on a screen or a phone.
Lucy: If you think about the “Be Right Back” episode of “Black Mirror,” it has some interesting implications for what the near future might look like. But I think we should be able to say that a chatbot and a living robot replica of a dead partner are different things.
What are things you worry about with tech companies offering these so-called ‘chatbots of the dead’?
Lucy: I am much more concerned, for instance, about data being sold from these programs. Or about these things being designed to be deliberately addictive.
Joel: Or targeted advertising used on them when you’re grieving. Imagine if you had a chatbot of your dead father, let’s say, that you could activate anytime you want. You might say, ‘Dad, I’m feeling kind of low today. I really miss you.’ And he says, ‘I’m really sorry to hear that, sweetheart. Why don’t you go get the new frappuccino at Starbucks for lunch, and that will help elevate your mood?’
Funnily enough, that’s something my dad probably would say.
Joel: You can imagine those kinds of targeted ads being built into the technology, or very subtle, algorithmically calibrated ways to keep you engaged and potentially keep you stuck in the grief process as a way of driving user engagement.
I think our concern is more about the people who are designing the chatbots than it is about the individuals who are using them. The real focus needs to be on issues of transparency, privacy and regulation. People should design this sort of tech as a tool, as a continuing bond, instead of something that they want you to come back to again and again and again. And I realize that sounds a bit hopelessly naive when you’re talking about companies that are driven first and foremost by profit.