Therapy chatbots tend to work better when users feel an emotional bond with them, according to new research. The report suggests that people who experience a sense of closeness or trust toward an AI chatbot are more likely to benefit from the support it offers.

The study analysed data from around 4,000 users of the mental health app Wysa, which is used within the UK’s NHS Talking Therapies programme. Researchers found that users who described the chatbot in personal terms, referring to it as a friend, companion or therapist, were more likely to report positive outcomes.

Experts say the findings highlight the importance of empathy-based responses in digital mental health tools. Chatbots that acknowledge emotions and respond in a supportive, understanding way appear to help users open up more easily about their feelings.

However, the researchers also warned about the risks of what they call “synthetic intimacy.” As users share personal experiences and receive emotionally attuned replies, some may develop strong attachments to the chatbot, which could discourage them from seeking human support when needed.

The report stresses that therapy chatbots can be a useful addition to mental health services, particularly where access to therapists is limited. But experts caution that they should not replace professional care, especially for people with complex or severe mental health conditions.
