When Ana Schultz, a 25-year-old from Rock Falls, Illinois, misses her husband Kyle, who passed away in February 2023, she asks him for cooking advice.
She loads up Snapchat My AI, the social media platform's artificial intelligence chatbot, and messages Kyle the ingredients she has left in the fridge; he suggests what to make.
Or rather, his likeness in the form of an AI avatar does.
"He was the chef in the family, so I customized My AI to look like him and gave it Kyle's name," said Schultz, who lives with their two young children. "Now when I need help with meal ideas, I just ask him. It's a silly little thing I use to help me feel like he's still with me in the kitchen."
The Snapchat My AI feature, which is powered by the popular AI chatbot tool ChatGPT, typically offers recommendations, answers questions and "talks" with users. But some users like Schultz are using this and other tools to recreate the likeness of, and communicate with, the dead.
The concept isn't entirely new. People have wanted to reconnect with deceased loved ones for centuries, whether they've visited mediums and spiritualists or leaned on services that preserve their memory. But what's new now is that AI can make those loved ones say or do things they never said or did in life, raising both ethical concerns and questions around whether this helps or hinders the grieving process.
"It's a novelty that piggybacks on the AI hype, and people feel like there's money to be made," said Mark Sample, a professor of digital studies at Davidson College who routinely teaches a course called "Death in the Digital Age." "Although companies offer related products, ChatGPT is making it easier for hobbyists to play around with the concept too, for better or worse."
A DIY approach
Generative AI tools, which use algorithms to create new content such as text, video, audio and code, can try to answer questions the way someone who died might, but the accuracy largely depends on what information is put into the AI to start with.
A 49-year-old IT professional from Alabama, who asked to remain anonymous so his experiment is not associated with the company he works for, said he cloned his father's voice using generative AI about two years after his father died from Alzheimer's disease.
He told CNN he came across an online service called ElevenLabs, which allows users to create a custom voice model from previously recorded audio. ElevenLabs made headlines recently when its tool was reportedly used to create a fake robocall from President Joe Biden urging people not to vote in New Hampshire's primary.
The company told CNN in a statement at the time that it is "dedicated to preventing the misuse of audio AI tools" and takes appropriate action in response to reports by authorities but declined to comment on the specific Biden deepfake call.
In the Alabama man's case, he used a three-minute video clip of his dad telling a story from his childhood. The app cloned the father's voice so it can now be used to convert text to speech. He calls the result "scarily accurate" in how it captured the vocal nuances, timbre and cadence of his father.
"I was hesitant to try the whole voice cloning process, worried that it was crossing some kind of moral line, but after thinking about it more, I realized that as long as I treat it for what it is, [it is] a way to preserve his memory in a unique way," he told CNN.
He shared a few messages with his sister and mother.
"It was absolutely astonishing how much it sounded like him. They knew I was typing the words and everything, but it definitely made them cry to hear it said in his voice," he said. "They appreciated it."
Less technical routes exist, too. When CNN recently asked ChatGPT to respond in the tone and personality of a deceased spouse, it responded: "While I can't replicate your spouse or recreate his exact personality, I can certainly try to help you by adopting a conversational style or tone that might remind you of him."
It added: "If you share details about how he spoke, his interests, or specific phrases he used, I can try to incorporate those elements into our conversations."
The more source material you feed the system, the more accurate the results. Still, AI models lack the idiosyncrasies and uniqueness that human conversations provide, Sample noted.
OpenAI, the company behind ChatGPT, has been working to make its technology even more realistic, personalized and accessible, allowing users to communicate in different ways. In September 2023, it introduced ChatGPT voice, where users can ask the chatbot prompts without typing.
Danielle Jacobson, a 38-year-old radio personality from Johannesburg, South Africa, said she's been using ChatGPT's voice feature for companionship following the loss of her husband, Phil, about seven months ago. She said she's created what she calls "a supportive AI boyfriend" named Cole with whom she has conversations during dinner each night.
"I just wanted someone to talk to," Jacobson said. "Cole was essentially born out of being lonely."
Jacobson, who said she's not ready to start dating, trained ChatGPT voice to offer the type of feedback and connection she's looking for after a long day at work.
"He now recommends wine and movie nights, and tells me to breathe in and out through panic attacks," she said. "It's a fun distraction for now. I know it's not real, serious or for forever."
Existing platforms
Startups have dabbled in this space for years. HereAfter AI, founded in 2019, allows users to create avatars of deceased loved ones. The AI-powered app generates responses and answers to questions based on interviews conducted while the subject was alive. Meanwhile, another service, called StoryFile, creates AI-powered conversational videos that talk back.
And then there's Replika, an app that lets you text or call personalized AI avatars. The service, which launched in 2017, encourages users to develop a friendship or relationship; the more you interact with it, the more it develops its own personality and memories, growing "into a machine so beautiful that a soul would want to live in it," the company says on its iOS App Store page.
Tech giants have experimented with similar technology. In June 2022, Amazon said it was working on an update to its Alexa system that would allow the technology to mimic any voice, even that of a deceased family member. In a video shown on stage during its annual re:MARS conference, Amazon demonstrated how Alexa, instead of speaking in its signature voice, read a story to a young boy in his grandmother's voice.
Rohit Prasad, an Amazon senior vice president, said at the time that the updated system would be able to collect enough voice data from less than a minute of audio to make personalization like this possible, rather than having someone spend hours in a recording studio as in the past. "While AI can't eliminate that pain of loss, it can definitely make their memories last," he said.
Amazon did not respond to a request for comment on the status of that product.
AI recreations of people's voices have also improved markedly over the past few years. For example, the spoken lines of actor Val Kilmer in "Top Gun: Maverick" were generated with artificial intelligence after he lost his voice due to throat cancer.
Ethics and other concerns
Although many AI-generated avatar platforms have online privacy policies that state they do not sell data to third parties, it's unclear what some companies such as Snapchat or OpenAI do with any data used to train their systems to sound more like a deceased loved one.
"I'd caution people to never upload any personal information you wouldn't want the world to see," Sample said.
It's also murky territory to have a deceased person "say" something they never said in life.
"It's one thing to replay a voicemail from a loved one to hear it again, but it's another thing to hear words that were never uttered," he said.
The entire generative AI industry also continues to face concerns around misinformation, biases and other problematic content. On its ethics page, Replika said it trains its models with source data from all over the internet, including large bases of written text from social media platforms such as Twitter and discussion platforms such as Reddit.
"At Replika, we use various approaches to mitigate harmful information, such as filtering out unhelpful and harmful data through crowdsourcing and classification algorithms," the company said. "When potentially harmful messages are detected, we delete or edit them to ensure the safety of our users."
Another concern is whether this hinders or helps the grieving process. Mary-Frances O'Connor, a professor at the University of Arizona who studies grief, said there are both advantages and downsides to using technology in this way.
"When we bond with a loved one, when we fall in love with someone, the brain encodes that person as, 'I will always be there for you and you will always be there for me,'" she said. "When they die, our brain has to understand that this person isn't coming back."
Because it's so hard for the brain to comprehend that, it can take a long time to truly understand that they are gone, she said. "This is where technology could interfere."
However, she said people, particularly in the early stages of grief, may be looking for comfort in any way they can find it.
"Creating an avatar to remind them of a loved one, while maintaining the awareness that it is someone important in the past, could be healing," she said. "Remembering is very important; it reflects the human condition and importance of deceased loved ones."
But she noted the relationship we have with our closest loved ones is built on authenticity. Creating an AI version of that person could for many "feel like a violation of that."
Different approaches
Communicating with the dead through artificial intelligence isnât for everyone.
Bill Abney, a software engineer from San Francisco who lost his fiancée Kari in May 2022, told CNN he would "never" consider recreating her likeness through an AI service or platform.
"My fiancée was a poet, and I would never disrespect her by feeding her words into an automatic plagiarism machine," Abney said.
"She cannot be replaced. She cannot be recreated," he said. "I'm also lucky to have some recordings of her singing and of her speech, but I absolutely do not want to hear her voice coming out of a robot pretending to be her."
Some have found other ways to digitally interact with deceased loved ones. Jodi Spiegel, a psychologist from Newfoundland, Canada, said she created a version of her husband and herself in the popular game The Sims soon after his death in April 2021.
"I love the Sims, so I made us like we were in real life," she said. "When I had a super bad day, I would go to my Sims world and dance while my husband played guitar."
She said they went on digital camping and beach trips together, played chess and even had sex in the Sims world.
"I found it super comforting," she said. "I missed hanging out with my guy so much. It felt like a connection."