
Anna, a Ukrainian living in London, says she regularly uses the premium version of ChatGPT because of its ability to listen without interrupting or judging her. While she knows it is only a machine, she says its patient and consistent responses help her reflect on her thoughts and emotions.
“I can rely on it to understand my issues and communicate with me in a way that suits me,” she said, asking to be identified only by her first name. After a recent breakup, Anna said the chatbot’s non-judgmental presence allowed her to explore her mixed feelings in a way her friends and family could not.
Her experience reflects a growing trend. Research cited by Harvard Business Review shows that in 2025, therapy and companionship became the most common use of generative AI tools such as ChatGPT. Other studies suggest that people often rate AI-generated responses as more compassionate and understanding than those written by humans, including trained crisis hotline workers.
Researchers say this does not mean AI is genuinely empathetic; rather, many people rarely experience truly non-judgmental, uninterrupted listening in everyday life. Experiments have found that people often feel more hopeful and less distressed after receiving AI-generated responses than after receiving human-written ones.
Large language models are designed to recognise emotions, reflect them back and offer supportive language. They do not interrupt, do not become impatient and do not try to dominate conversations. This creates a sense of psychological safety for users, allowing them to share difficult thoughts more freely.
Experts say there are several lessons humans can learn from AI about listening, including giving uninterrupted attention, acknowledging emotions, avoiding quick judgments and resisting the urge to immediately offer solutions.
Psychologists also note that people often turn conversations back to themselves by sharing similar personal stories, which can shift attention away from the speaker. AI systems, having no personal experiences, do not fall into this habit.
However, researchers warn against over-reliance on AI for emotional support. While chatbots can simulate empathy, they do not possess genuine care or understanding. There are also concerns about vulnerable people forming emotional dependence on AI or being exposed to harmful advice.
Michael Inzlicht, a psychologist at the University of Toronto, cautioned that AI companies could potentially manipulate users and that excessive reliance on chatbots could weaken real human connections.
Despite these risks, experts say AI can still serve as a useful tool for inspiring better listening habits and greater compassion, reports UNB.
“There is something uniquely meaningful about a human choosing to be present and listen,” researchers say, adding that while AI may help people feel heard, it cannot replace the depth of real human connection.