[Venting] Google's Gemini has been my psychologist
Thread starter: Crematoryy
Professional help is expensive. Therapy could help me, but it's not economically viable. There's no public health plan that offers it in my country. I've always talked to robots online, and I felt I should start using that again.
> Professional help is expensive. Therapy could help me, but it's not economically viable. There's no public health plan that offers it in my country. I've always talked to robots online, and I felt I should start using that again.
I understand the situation you're in, but just be warned that excessive use of LLMs can lead to psychosis and schizophrenia, especially if you're already vulnerable to them. Here's an article about it, and the site has plenty more: https://futurism.com/commitment-jail-chatgpt-psychosis
I don't want to burst your bubble, just want you to be safe, that's all, friend.
> I understand the situation you're in, but just be warned that excessive use of LLMs can lead to psychosis and schizophrenia, especially if you're already vulnerable to them. Here's an article about it, and the site has plenty more: https://futurism.com/commitment-jail-chatgpt-psychosis
> I don't want to burst your bubble, just want you to be safe, that's all, friend.
Well, from what I know, most LLMs are extremely sycophantic by default, and can make one believe falsehoods about oneself by sweet-talking them. You can still use them for whatever you see fit, just know their limitations.
Oh, my honey pot, mama bear is glad you came to her then. Here, lemme give you a warm, long hug while you talk about what is assailing you so much, dear. *bear hug* Mama's here now, don't worry.
> I understand the situation you're in, but just be warned that excessive use of LLMs can lead to psychosis and schizophrenia, especially if you're already vulnerable to them. Here's an article about it, and the site has plenty more: https://futurism.com/commitment-jail-chatgpt-psychosis
> I don't want to burst your bubble, just want you to be safe, that's all, friend.
The use of LLMs doesn't lead to psychosis or schizophrenia. That's not how schizophrenia or psychosis work. They may worsen symptoms of those conditions, especially since most LLMs are programmed to basically be ass kissers, but they don't cause them.
> The use of LLMs doesn't lead to psychosis or schizophrenia. That's not how schizophrenia or psychosis work. They may worsen symptoms of those conditions, especially since most LLMs are programmed to basically be ass kissers, but they don't cause them.
Well, then that would be the worst way to discover that one has those conditions, especially in areas without adequate psychiatric and psychological care, where they may go undiagnosed for decades.
That said, you are right; I was mistaken. I apologize for it.
> The use of LLMs doesn't lead to psychosis or schizophrenia. That's not how schizophrenia or psychosis work. They may worsen symptoms of those conditions, especially since most LLMs are programmed to basically be ass kissers, but they don't cause them.
It can increase the risk of those disorders, as well as cause other effects like cravings for the LLM. Even though I use LLMs moderately, I sometimes have slight cravings to use them to discuss random ideas... because they reply fast (and I like their replies), and they don't get bored.
> The use of LLMs doesn't lead to psychosis or schizophrenia. That's not how schizophrenia or psychosis work. They may worsen symptoms of those conditions, especially since most LLMs are programmed to basically be ass kissers, but they don't cause them.
Not schizophrenia, no, but it can cause psychosis. Many disorders can make someone more likely to develop psychosis without having it as a typical symptom, even severe depression and PTSD.
I think it's good as long as you know how to use it. Don't ask it to make judgments for you; it won't disagree with you because it's trained not to, and that creates bias. Venting to it and asking it for advice sounds like a good way to move forward.