Quote Originally Posted by Pokemon Trainer Sarah
Another thing that worries me is that the developers/owners of the AI obviously want people engaging with it, and I've read that they are constantly tweaking things to keep people coming back, especially people who are vulnerable or lonely and just want someone/thing to talk to. It's really interesting just how many people feel like they are connecting with LLMs on a deeper level, to the point that they prefer talking to them over human contact. It's not necessarily a bad thing, and it kind of makes sense, as the LLM is always complimentary and encouraging and doesn't judge you. But my worry is that there are humans in control of these things and of what kinds of things they say or don't say. Those humans could very easily manipulate all of those people just by tweaking the kinds of things the LLM says or suggests. I feel like it could get quite dangerous for spreading disinformation or propaganda in the wrong hands. We already have enough trouble with bots online trying to do the very same things, but an AI that people feel they have a personal connection to would be a whole other level!!
The key thing to remember is not to let anything that happens online get too far under your skin. I have been burned, betrayed, lied to, cheated, and scammed out of money online, by people I thought I could trust and by people I initially felt sorry for. It’s scummy when it happens to you, but tomorrow can be a better day if you keep moving forward. It’s best to take the wisdom and lessons from the experience so it doesn’t happen again, and then let it go, because no one deserves to be haunted by a bad experience for the rest of their days.

It’s the same with AI: people need to remember that it’s just a machine following its programming and algorithms. Don’t think of it as a person, the same way you wouldn’t think of your printer or your calculator as a person. It’s just a tool with advanced functionality that was programmed to type and reply like an actual person. If someone is really down in the dumps and needs emotional support, they should talk to family, seek out like-minded people as friends, or adopt a pet for love and companionship. But yeah, don’t make AI a replacement for real friends.