

noname223

Archangel
Aug 18, 2020
6,184
Some call it perversion.

In my favorite philosophy show, they talked about AI. They showed another TV program in which a mother wanted to talk with her dead child one more time to make peace with him. Through virtual-reality glasses, she could speak to an AI avatar of her dead child. She described it as healing. Viewers said it was an insult to the mother. Personally, I find it more questionable for her child.

There are other instances, such as a father who wanted to revive his son, who had died in a shooting. The AI avatar was used to teach other people why gun reform is so important in the US.


It is ethically questionable when AI is used in that way. It might take away the dignity of those who have passed. Some of them might have hated AI, and they never consented to anything like this. They should be the ones determining their own legacy. In some ways it's disrespectful.

I am not fully sure where I position myself. But I don't see this as the end of Western civilization. It shows me that funerals and mourning are primarily a way for loved ones to cope with the loss and move on. Can they actually move on with AI avatars? I am not sure. It should not become a delusion or an addiction, where they think their loved one is still alive. It might depend on how the technology is used. For one last conversation with a loved one, I can see how it could have a healing effect; the mother said it could ease her guilty conscience. Maybe it is a little bit creepy. It might depend on the circumstances.
 
Cosmophobic

Member
Aug 10, 2025
79
I wasn't aware of it. Now that I am... I don't like it. These people might actually think they're talking to some version of their deceased loved one, in some small sense? But they're not talking to them in any sense whatsoever. It's not too different from mediums and the scams they pull on the vulnerable.
 
EternalHunger

Starved & Lonely
Sep 3, 2025
84
It can help with the healing process, but it quickly turns toxic when the individual doesn't grow past it: acknowledging that it isn't really them, and moving on to heal further.

If someone hasn't had time to process their loss, they'll only be grieving inward when they try to see their loved one through the AI. Deep down they still know it isn't truly the person it represents, which makes them far more frantic and mentally unstable. There is a cognitive dissonance, extremely dangerous for those in low spirits or poor mental states, between the side that adamantly wants to view the AI as their loved one and the small voice that knows it will never be them, and that dissonance can trigger underlying mental disorders and serious general mental-health risks. Both the deceased and the person left behind are robbed of the closure they deserve, and the deceased certainly wouldn't have wanted to watch these people mentally destroy themselves. It also opens the door to unethical practices: companies exploiting services specifically designed for this, and the dependency that grows with them, as well as the AI itself distorting the image of the deceased by inventing new information, wants, views, and needs that are utterly false to who the person truly was or would want to be seen as.

People generally skip over the idea that it can ruin someone's mental state entirely. They assume the grief will heal on its own, but that requires the person to stop seeing the AI as their loved one and move on; not just subconsciously, but consciously admitting to themselves that it isn't them, and fully believing it. That's a high bar for someone so utterly wrecked by a loss that they felt they had to rely on an AI in the first place, even if only out of curiosity, since it can quickly bring back the old emotions they must have felt for the deceased it is imitating.
 
HortEr162

Member
Feb 12, 2025
9
There's literally a Black Mirror episode about this. No need to say anything else.
 
chudeatte

fml
Aug 5, 2025
43
I think it is odd. Sure, it can help some people cope, but I think it will lead to more harm than good. It isn't healthy to unload such heavy feelings onto AI, because it will reinforce them and trap a person in a cycle of coming back to it and ruminating on their trauma. A person won't actually heal; the AI is built to agree with them, and it probably won't tell the user that what they're doing isn't healthy, which a real human would. It doesn't allow people to actually heal and can create more problems once they grow dependent on it, which is disrespectful to the person who passed, in my opinion, because I'm sure they wouldn't want that kind of thing to come from their passing. The best thing to do is deal with the emotions in a healthy way and make peace with the situation, as hard as that may be.
 
NonEssential

Hanging in there
Jan 15, 2025
496
Doesn't sound like it would really help with acceptance and letting go.
 
Dejected 55

Enlightened
May 7, 2025
1,379
I mean... people already create AI avatars of people they know in real life, to talk to them in a way they can't talk to that person in real life... this isn't a lot different.

If anyone thinks it is "real" then that is a problem for them... but otherwise I kind of have no opinion of it.
 
MissAbyss

"I gazed for too long.."
Jul 20, 2025
34
If, in the future, there is a register stating that the deceased gave permission, then I personally have no problem with it.