Using AI to talk to the dead

Dr. Stephenie Lucas Oney is 75, but she still turns to her father for advice. How did he deal with racism, she wonders. How did he succeed when the odds were against him?

The answers are rooted in William Lucas’ experience as a black man from Harlem who made a living as a police officer, FBI agent and judge. But Dr. Oney does not receive the guidance personally. Her father has been dead for over a year.

Instead, she listens on her phone to answers, given in her father’s voice, through HereAfter AI, an app powered by artificial intelligence that generates responses based on hours of interviews conducted with him before his death in May 2022.

His voice comforts her, but she said she created the profile more for her four children and eight grandchildren.

“I want the children to hear all these things in his voice,” said Dr. Oney, an endocrinologist, from her home in Grosse Pointe, Michigan, “and not from me trying to paraphrase, but to hear it from his vantage point, his time and his perspective.”

Some people are turning to AI technology as a way to communicate with the dead, but its use as part of the grieving process has raised ethical questions while unsettling some who have experimented with it.

HereAfter AI was introduced in 2019, two years after the debut of StoryFile, which produces interactive videos in which subjects appear to make eye contact, breathe and blink while responding to questions. Both generate answers based on responses users gave to questions like “Tell me about your childhood” and “What’s the biggest challenge you’ve faced?”

Their appeal comes as no surprise to Mark Sample, a professor of digital studies at Davidson College who teaches a course titled Death in the Digital Age.

“Every time there is a new form of technology, there is always the urge to use it to connect with the dead,” Mr. Sample said. He noted Thomas Edison’s failed attempt to create a “spirit phone.”

StoryFile offers a “high-fidelity” version in which someone is interviewed in a studio by a historian, but there is also a version that requires only a laptop and webcam to get started. Stephen Smith, a co-founder, let his mother, Marina Smith, a Holocaust educator, try it out. Her StoryFile avatar answered questions at her funeral in July.

According to StoryFile, about 5,000 people have created profiles. Among them was actor Ed Asner, who was interviewed eight weeks before his death in 2021.

The company sent Mr. Asner’s StoryFile to his son Matt Asner, who was surprised to see his father looking at him and appearing to answer questions.

“I was blown away by it,” Matt Asner said. “I thought it was incredible how I could have this interaction with my father that was relevant and meaningful, and it was his personality. This man who I really missed, my best friend, was there.”

He played the file during his father’s memorial service. Some people were moved, he said, but others felt uncomfortable.

“There were people who thought it was morbid and were afraid of it,” Mr. Asner said. “I don’t share that view,” he added, “but I can understand why they would say that.”

Lynne Nieto gets it too. She and her husband, Augie, founder of Life Fitness, which makes exercise equipment, created a StoryFile before his death in February from amyotrophic lateral sclerosis, or ALS. They thought they could use it on the website of Augie’s Quest, the nonprofit organization they founded to raise money for ALS research. Maybe someday his young grandchildren would want to watch it.

Mrs. Nieto first reviewed his file about six months after his death.

“I’m not going to lie, it was a little hard to watch,” she said, adding that it reminded her of their Saturday morning conversations and felt a little too “raw.”

Those feelings are not uncommon. These products force consumers to face the one thing they are programmed not to think about: mortality.

“People are squeamish about death and loss,” James Vlahos, co-founder of HereAfter AI, said in an interview. “It can be difficult to sell because people are forced to confront a reality they would rather not deal with.”

HereAfter AI grew out of a chatbot that Mr. Vlahos created for his father before his death from lung cancer in 2017. Mr. Vlahos, a conversational AI specialist and journalist who has contributed to The New York Times Magazine, wrote about the experience for Wired and soon heard from people asking whether he could create a mombot, a wifebot and so on.

“I didn’t think about it in any commercial way,” Mr. Vlahos said. “And then it became crystal clear: This should be a business.”

As with other AI innovations, chatbots created in the likeness of someone who has died raise ethical questions.

Ultimately, it is a matter of consent, said Alex Connock, a senior fellow at the University of Oxford’s Saïd Business School and author of “The Media Business and Artificial Intelligence.”

“Like all ethics rules in AI, it will come down to consent,” he said. “If you did it knowingly, I think most ethical problems can be solved quite easily.”

The consequences for survivors are less clear.

Dr. David Spiegel, associate professor of psychiatry and behavioral sciences at the Stanford School of Medicine, said programs like StoryFile and HereAfter AI can help people grieve, much like looking through an old photo album.

“The critical thing is that you keep a realistic perspective on what you’re investigating — that it’s not that this person is still alive and communicating with you,” he said, “but that you’re reexamining what he or she left behind.”
