Are LLMs our new life coaches?

It happened on a trip to Paris with my friend. We were talking about life as usual when she mentioned she had broken up with her boyfriend. Their values in the relationship had diverged, and she was the one who decided to end it, despite still having feelings for him. What surprised me as I listened was that she had consulted Gemini about her concerns. She was the first person I had met (though I knew her from my home university) who actually had daily conversations with an AI. Although I didn’t share my honest opinion with her at that moment, I felt disappointed.

Now that AI systems are widely integrated into our lives, the systems people most often turn to for asking questions and getting answers in text form are Large Language Models (LLMs). Models such as ChatGPT, Gemini, and Claude are trained on enormous amounts of internet text, learning to predict which word is most likely to come next. In other words, they don’t truly understand what they are saying; by recombining patterns from their training data, they merely sound as if they understand us. While we know that artificial intelligence (at least for now) has no emotions or human heart, its ability to mimic human empathy is astonishing. Today, I’d like to discuss some episodes that feel distinctly modern, given how prevalent LLMs have become in our lives.
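To make the idea of "predicting the next word" concrete, here is a deliberately tiny sketch, not a real LLM. It counts which word most often follows another in a toy "training" text and then predicts accordingly; real models do something vastly more sophisticated, but the core task is the same: continue the text plausibly, with no understanding attached.

```python
from collections import defaultdict

# Toy "training data": a real model would see trillions of words.
training_text = "the cat sat on the mat the cat ate the fish".split()

# Count how often each word follows each other word.
follower_counts = defaultdict(lambda: defaultdict(int))
for prev_word, next_word in zip(training_text, training_text[1:]):
    follower_counts[prev_word][next_word] += 1

def predict_next(word):
    """Return the word that most often followed `word` in training."""
    followers = follower_counts[word]
    return max(followers, key=followers.get) if followers else None

print(predict_next("the"))  # "cat" follows "the" most often here
```

The point of the sketch is that nothing in it "knows" what a cat is; it only knows which word tends to come next, which is why fluent output can coexist with zero understanding.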

Why do we rely on them so much?

First of all, I would like to ask why people decide to consult AI about their lives in the first place, especially when they have people around them to talk to. The primary reason is the absence of judgment. Research from the University of Kansas (2025) suggests that when individuals are dealing with embarrassing or sensitive information, they actively prefer the anonymity of AI over human interaction. AI has no ego, and it doesn’t get bored with your repetitive stories. It gives you a sense of freedom from social stigma, similar to what you might feel in therapy. Furthermore, AI can be your mentor 24/7: it is free to be beside you anytime, even at 3:00 AM, when your best human supporters are asleep. For many, AI serves as a partner, helping them express thoughts that are too heavy to carry alone.

People also use AI to verbalize their feelings. Like my friend, many struggle to put their emotions into words, as it is a complex process. AI acts as a mentor that helps organize your thoughts and shows sympathy for your feelings, which helps people calm down and sort through what they are experiencing. This can be linked to a psychological phenomenon known as mirroring, or the chameleon effect, in which people subconsciously imitate another person’s speech or behavior, thereby building comfort and closeness. When an AI reflects your own words and manner back at you, you can start to feel closer to it.

What’s the problem?

What is concerning about using these tools in the ways described above is that some people do not understand how LLMs actually work. They believe the AI knows what they are talking about, when in reality it has no grasp of their history and does not understand them at all. There is also the risk of an echo chamber: AI models adapt to your word choices and ways of thinking. The more you use one, the more it learns the words you use frequently and the way you reason, and it imitates them to personalize its responses to fit you. This pulls users into a world where they can ask anything and get an answer, without ever pausing to consider whether it is the right answer for them.

As AI becomes more embedded in our daily lives, we must remember that these systems are tools, not things to depend on for our decisions. Real conversations should remain rooted in human interaction, even when the topic is sensitive. It is essential that we cultivate relationships with people throughout our lives, so we are not left to rely solely on our AI friends.

Just remember, someone is always there for you. Let’s lift our heads and enjoy the ups and downs of our human life 🙂

https://news.ku.edu/news/article/study-finds-people-prefer-ai-chatbots-when-discussing-embarrassing-health-info-but-humans-when-they-are-angry