When AI becomes your therapist
When people feel overwhelmed, anxious or uncertain, they often look for someone to talk to. More and more, that someone is not a person. Artificial intelligence chatbots such as ChatGPT are becoming places where users share worries about relationships, work, stress or personal matters, often seeking advice or reassurance in moments of emotional distress.
Psychologists say this appeal is understandable. AI offers instant responses, anonymity and no judgement, qualities that can make opening up feel easier than speaking to another person.
Recent research suggests that many people are already using AI for emotional support. A 2025 study in the Journal of Medical Internet Research documented the trend, and other surveys report that younger users in particular turn to generative AI tools when they feel anxious, sad or overwhelmed.
“One important factor is convenience and immediacy,” says licensed counseling psychologist Evie Michailidou. “Individuals can type their thoughts whenever and wherever a concern arises and receive an instant response, which can feel reassuring and containing in the moment.
“Interacting with AI feels less intimidating than speaking to a therapist. There is no perceived judgement, no fear of being misunderstood and no exposure of vulnerability in front of another person,” she adds.
Some find that talking to AI feels supportive, especially when they need quick reassurance. Chatbots are available at any time and reply immediately, which is not always true of traditional support systems. However, psychologists point out that the same qualities that make AI appealing can also make it hard to distinguish short-term comfort from genuine psychological support.
Some researchers caution that increasing reliance on AI for emotional advice may affect how individuals interpret guidance, especially in serious mental health situations.
“Therapy is a process that requires time, emotional effort and a willingness to engage in often uncomfortable self-reflection and personal growth,” Michailidou explains. “It is not always pleasant, and it does not provide instant relief. In contrast, AI can offer quick comfort or structured feedback without requiring the deeper emotional work that psychological change typically involves.”
For some, AI is a harmless way to process thoughts or get advice. But experts warn that relying on it instead of seeking professional help carries significant risks.
“One of the main risks of relying on AI instead of a trained mental health professional is that AI is a machine-operated system; there is no human being on the other end,” Michailidou says. “While it can generate data-driven suggestions or coping tips, it cannot replace the depth, attunement and responsiveness of human-to-human interaction.”
Previous studies have already established the importance of the relationship between client and therapist, something AI cannot provide.
“Research consistently shows that one of the strongest predictors of positive therapeutic outcomes is the quality of the relationship between therapist and client,” Michailidou says. “This relational bond, built on trust, empathy and emotional attunement, cannot truly be replicated by an AI tool.”
Another limitation of AI is the way it responds. Although a chatbot may appear to empathise with the user, it is generating replies from patterns learned across the large volume of data it was trained on.
“AI systems often generate responses based on patterns in data and the way a question is phrased,” Michailidou explains. “This means the guidance provided can be somewhat leading or shaped by the user’s words, rather than emerging from a nuanced understanding of the person’s history, emotional state or non-verbal communication.”
For those dealing with complex psychological issues, this lack of deeper understanding can be a serious limitation.
“AI lacks clinical judgement, ethical responsibility and the ability to assess risk in complex situations, which are essential components of professional mental health care,” she says.

Although there is little data yet on how widely people in Cyprus are using AI in this way, Michailidou says she has begun to notice the trend developing locally.
“Although I do not have empirical data to present, based on my experience and observations from social networks, I have noticed a growing tendency for some individuals in Cyprus to use AI tools such as ChatGPT when dealing with personal or emotional concerns,” she says.
Despite these concerns, psychologists see some room for constructive use. “AI can potentially be used in limited and more constructive ways in relation to mental wellbeing,” Michailidou says. “For example, it may be helpful for accessing general information or practical tips for common difficulties such as insomnia, stress management, relaxation techniques or time management strategies.”
In this sense, using AI can be compared to searching the internet for mental health information and education. However, it should not be equated with any form of psychological support.
“Each person’s psychological experience is unique, and meaningful therapeutic change requires depth, personal exploration and individualised understanding,” she says.
As technology continues to reshape how people seek advice, psychologists say that emotional wellbeing still depends on something at the core of humanity: connection.
“AI can offer temporary reassurance or general guidance,” Michailidou says. “But it cannot replace the depth, relational safety and professional responsibility that come with human psychological support.”