“A lot of people think I’m batshit crazy,” says Justin Harrison of Grieftech.
I don’t. I think he’s a delusional ghoul.
Harrison has cobbled together a chatbot that uses an imitation of his late mother’s voice and predictive text built from her online communications, and he thinks that it is a cure for grief, because it enables him to talk to his “mother”. It doesn’t. There is no person there. It’s a kind of selfish version of grief, where he can deny her death and pretend it’s OK because his superficial, fake emulation of his mother can pay attention to him. It’s gross and creepy.
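To be clear about just how shallow this kind of "predictive text" emulation is: at its simplest it amounts to tabulating which word followed which in a pile of old messages, then chaining statistically plausible words together. This is a hypothetical minimal sketch, not Grieftech's actual code, which is surely fancier but no less mindless:

```python
import random
from collections import defaultdict

def train(corpus):
    """Build a word-bigram table from a pile of messages."""
    model = defaultdict(list)
    for message in corpus:
        words = message.split()
        for prev, nxt in zip(words, words[1:]):
            model[prev].append(nxt)
    return model

def babble(model, seed, length=10):
    """Chain plausible next words together. No person, no mind, involved."""
    out = [seed]
    for _ in range(length):
        choices = model.get(out[-1])
        if not choices:
            break
        out.append(random.choice(choices))
    return " ".join(out)

# A made-up "training set" of someone's texts
corpus = [
    "love you sweetie see you soon",
    "see you soon love you",
]
model = train(corpus)
print(babble(model, "see"))
```

The output will occasionally land on a familiar turn of phrase, because familiar turns of phrase are literally all that went in. That's the whole trick.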
In the last few years, I’ve lost my mother and a brother; in years gone by, I’ve lost my father and a sister. They’re dead. The grief comes from the loss of living, thinking, behaving human beings who can’t be resurrected by some fraud with a collection of words they may have uttered. But this shallow idiot thinks a chatbot is a substitute.
Harrison is being interviewed, and he thinks he’s being clever by throwing some publicly recorded videos of his interviewer into the chatbot’s database, and then conversing with the computer. The interviewer is not impressed. So Harrison and another team member argue that the computer used a spot-on turn of phrase. I guess if all we are is a series of turns of phrase, then the simulacrum is perfect. Except we aren’t. There’s no person, no thinking mind, behind the chatbot.
Then the interviewer goes off to talk to a series of people: one who imagined seeing a dead person after taking drugs, another who dreamed that they were visited by the ghost of their father, a medium who claims, with many weird jerky expressions, that they can communicate psychically with a friend. They’re all the same thing: frauds, liars, or deluded people who have convinced themselves that their loved ones are nothing but superficial reflections of their own minds. Justin Harrison is just more of the same, a phony like all the other phonies who have leeched off other people’s honest grief for profit.
After I’m dead, at least I’m reassured that no ghoul is going to be tormenting me with banalities; I’ll be gone. Don’t be fooled that my chatbot copy’s banalities are coming from me, though.
kome says
Psychics and mediums have been pulling this grift for centuries, and now techbros have found a way to automate it. Say what you will about that ghoulish piece of garbage Sylvia Browne, but at least she had to interact directly with the people she was exploiting.
StevoR says
There was an episode of Star Trek: TNG that already discussed pretty much this exact scenario. It didn’t end well.
Source : https://en.wikipedia.org/wiki/Silicon_Avatar
cag says
Billions of people believe that they are talking to god/allah/jesus. What a business opportunity.
profpedant says
After my father died there were several times that I ‘saw my Dad’ in my peripheral vision. I knew perfectly well that his ‘being there’ was my brain misinterpreting what was actually in my peripheral vision, but it was nice to be able to (sort of) see my father again. No magic involved, just a pleasant misinterpretation of reality.
HidariMak says
I imagine that the porn industry will be early adopters of this. Pay by the minute to talk with a “Debbie Does A.I.” chatbot, for starters.
acroyear says
Once again, “Max Headroom” (the series itself, not the character) was hugely ahead of its time.
An episode in season 2 was called “Deities” and it involved an ex- of Edison Carter who ran a ‘religion’ based on a simpler idea of the tech that created Max Headroom from Carter’s memories and personality. Of course, they couldn’t scale up, so the whole thing (like all religions) was something of a money scam preying on the vulnerability of the elderly.
Well, here we are – using tech to give a very rough (and poor) simulation of a loved one to scam money from the elderly.
“Oh, that’s WONDERFUL, isn’t it?”
Raging Bee says
Billions of people believe that they are talking to god/allah/jesus. What a business opportunity.
Coming up next: Churches offering Zoom or MS Teams meetings with Jesus Christ, who will of course assure everyone listening that [church organizing the meeting] is the true representative of his Word and Will.
Larry says
cag @ #3
Have you been trying to talk with Jesus, but you get nothing back? God’s not returning your calls? Well, suffer no more! ChristOnLine®, a new app from Ronco, maker of Mr. Microphone, uses AI on your phone or device to allow you to actually speak to and interact with your deity of choice. Available in the app store now for only $29.99/month. Install it today!
AstrySol says
I think the need for psychological support is real for a non-trivial number of people, so if there is a way to honestly provide some kind of relief there, IMO it is not necessarily bad on its own.
However, tech bros tend to be cheapskates: they make half-assed products, skirt regulations, lie about them, and charge exorbitant prices. And I think that is the problem.
It’s like quality magic shows vs whatever is done by those con artists.
Akira MacKenzie says
Didn’t Kanye West do something like this a couple of years back as a birthday present to Kim Kardashian? He had an AI “hologram” of her late father made to wish her a happy birthday and tell her what a great guy West was.
(Checks Google)
Yup, he did.
https://www.bbc.com/news/entertainment-arts-54731382
Tethys says
It’s slightly less creepy than the Norman Bates method of keeping his mom around. It’s hardly surprising that a chatbot parrots her with a “spot-on turn of phrase”, given that the bot was trained on his mother’s texts.
Pretending your mother hasn’t died is not a healthy way of dealing with grief.
Jean says
That’s similar to the other deluded people who want to “upload” themselves into a computer. Just because you may at some point be able to create a simulacrum of a person that could fool others does not mean you have a real, self-conscious individual.
tacitus says
Everyone processes their grief in their own way. My mom lost her husband of 66 years last year — they were very close — and continued to “feel” he was alive for several months afterward even though she was well aware that he was dead.
As long as you’re not duping anyone into believing their loved one is still alive, I suspect an AI simulacrum of their lost loved one will be quite popular, and significantly beneficial to some. I doubt my mom would have wanted one, but her sister-in-law lost her husband a few years ago and her life essentially ended the day he died since she never got over his loss. All she ever wanted to do was be with him, and she became a recluse after he died.
Ideally, robust mental health care is the answer of course, but we’ve never had that level of care in the US or UK in all the decades I’ve been around, so if affordable AI companions (simulating loved ones or not) become a reality, there are millions of elderly, lonely, and housebound individuals who could stand to benefit from the company.
No doubt there will be plenty of research on their potential uses in healthcare in the years ahead, but not everyone is as fortunate as my parents have been to have access to friends and family to help them and encourage them through an increasingly complex world as their faculties start to decline.
spiderj says
It starts as science fiction.
https://youtu.be/LU6U2B4VBqQ?si=LapChc7g3eisWoCn
Jazzlet says
I have had many dreams about deceased relatives over the years, but that’s all they were: dreams. Which is fine by me, as a lot of them were not pleasant dreams. The one where my mother was eating and talking, missed her mouth with the fork, which then hit her cheek, was particularly disturbing, I mean it: her cheek fell apart like a slow-braised cut of beef while she continued to eat and talk. That dream happened something like forty years ago, but it still disturbs me when I remember it.
seachange says
Universities are going from tenured professors to adjunct professors to TAs all while charging more to fill their proliferating administration’s already fat pockets.
You already have some recorded genetics lessons. We here have been privileged to access some of them.
…
…
…
LykeX says
Does he? Or is he just so emotionally stunted that he can’t tell the difference? Is he so lacking in humanity that, to him, it IS a substitute?
nomdeplume says
Every day, new evidence of a world gone mad.
KG says
London Standard’s AI imitation of Brian Sewell proves art critics cannot be easily replaced. Sewell was a rather nasty piece of work IMO, but he didn’t deserve this.
John Morales says
[OT]
In the news: https://www.theguardian.com/world/2024/sep/26/nazca-lines-peru-new-geoglyphs
gijoel says
Blasphemy!!??!?! Repent now or Roko’s basilisk will torture your chatbot at some indeterminate point in the far future. :)
On a personal note when I was a teenager my mother floated the idea of freeze drying after her death. The plan was that my brother and I would share her corpse and keep her in our homes for six months a year. She got really angry when we both said no.