Friday, June 6, 2025

I, for one, welcome our hallucinatory AI therapist overlords ...

I've done a fair amount of research into griefbots (or thanabots, or "restoration" systems), primarily those that try to "replace" your dead person.
As you may suspect, I'm not an avid fan.

Out of the blue, I came across a site from David Kessler, an otherwise unobjectionable disciple of Elisabeth Kübler-Ross.


I'm still not sure how I feel about AI "therapists."  I know that they can be quite effective: they are polite, persistent, and never argue.  At the very least they never start any sentence with "At least ... ," which all too many *people* do.  With something this potentially dangerous, I'd like to see at least a *tiny* bit of research into who ends up committing suicide before rolling this stuff out.  The grief.com site does, at least, lead off with "This discussion is solely educational, not medical or mental health advice. This does not constitute an ongoing or therapeutic relationship. Please consult a healthcare and/or mental health professional for care. For medical or mental health emergencies reach out to emergency services right away. This is only for 18 years and older."

And today I found this article about Replika sexually harassing users, some of them minors.  Replika is one of the griefbots and, although I have no actual experience with it, is the one I've known about the longest.  (It started becoming public around the time that Gloria died, and originally used email from your "loved one" as source material, and, of course, I've got tons of email from Gloria.)

The article notes that Replika's business model puts romantic or sexual roleplay behind a paywall, which can, in a variety of ways, promote sexual content in the public, free, or lower-cost versions of the system.  It is *extremely* likely that the system is "trained" to encourage extended, continuing use and to upsell clients to the paid tiers.


Grief table of contents and resources
