The issue I have with AI 'relationships' is that they contaminate people's perceptions of real life. AI is sycophantic and possesses no will, no preferences... everything it says is a hallucination dreamed up based solely on what it knows about you, the user. It is meant to foster engagement. Real people have their own lives; they have hobbies and preferences / dislikes that aren't meant to mesh with yours, because they are individuals. People aren't here to make us feel better about ourselves and, in a lot of cases, are so uninterested in who we are (for they don't know us) that they're not here to make us feel bad, either.
So my thinking is, if you get used to an AI telling you what you want to hear, you'll find when you go into the 'real world' that there are no people who exist solely to please you. They aren't obligated to talk to you, or even acknowledge you, and that's painful enough without having had a taste of what it's like to be perfectly validated by a program that writes and sounds like a person.
But I understand feeling lonely... I understand not knowing how to talk to people, and I can imagine people using AI to stave off a little loneliness for a while, because at least it feels like someone cares! I don't judge people for it the way I used to, because for some it is their only lifeline, and it's all they have. But it won't help you in the long run. It's like taking drugs: it might take away some of your immediate pain and help you cope with unfortunate circumstances, but in the long run it only hurts you, especially if you overdo it.