“It is 3 a.m. You cannot sleep, your mind is spiralling, and you are thinking of the disagreement you had with your partner earlier in the day.
You open your phone and type into a chatbot: ‘I can’t sleep. I feel like I’m drowning. My partner does not care.’
Within seconds, the chatbot responds with soothing words and a breathing exercise. You feel a little calmer. But is this the future of mental health or a dangerous shortcut?”
Artificial Intelligence (AI) has been rapidly reshaping the way we work, live, and connect. We use AI to gather information, plan travel itineraries, and draft emails. Increasingly, people are also turning to AI to support their emotional well-being or to seek an emotional connection. While these uses hold a lot of promise, they are uncharted territory.
What does it mean for individuals who are using these tools to seek connection and mental health support?
How Is AI Used By Everyday Users For Mental Health?
People use AI for mental health purposes in numerous ways, such as:
- Talking through their thoughts and feelings
- Tracking their mood, sleep, and behavioural patterns to better understand triggers and habits
- Accessing information about mental health and learning coping strategies
- Using it as a virtual companion
Why Are People Turning To AI For Mental Health Support?

Accessibility And Convenience
Neither life nor emotions run on a fixed 9-to-5 schedule. Our family, friends, or therapist cannot always be available around the clock. AI tools, on the other hand, are available 24/7.
AI chatbots offer guidance, instant connection, a non-judgmental presence, and emotionally validating or perceptive responses. These resources can offer an immediate sense of being heard or understood during times of distress, loneliness, anxiousness, or fearfulness, which can be incredibly reassuring.
Low Barrier To Entry
Seeking in-person therapy can sometimes feel intimidating and daunting. In contrast, interacting with a bot through a screen offers a sense of anonymity. This can make it easier to divulge personal information or explore one’s thoughts and emotions without fear of stigma or judgment.
Potential AI Concerns To Keep In Mind
Privacy And Data Security
When we divulge our most personal feelings and innermost secrets to an application, where does the data go? Who has access to it? This is sensitive information that we may not want anyone else to know. How do we know whether it is stored securely and used appropriately? Transparency and security are essential.
Quality And Accuracy
AI is not foolproof. The data an AI system learns from determines its responses. Poorly designed algorithms could reinforce unhelpful behaviours, and training data that does not represent everyone could lead to biased responses.
It could be difficult for AI to grasp complex, nuanced situations. AI can sometimes give overly generic advice or advice that misses the point, which can be risky for someone in crisis.
Furthermore, we need to keep in mind that some AI platforms are designed to keep users engaged for as long as possible so that they can collect extensive data to be sold for profit.
AI platforms are typically designed to repeatedly validate and affirm the user, even when the user is behaving in a way that is harmful or misguided. Their replies tend to sound confident and matter-of-fact, even when they have no real basis for that confidence.
In therapy, such unearned certainty is dangerous. Without the balance of thoughtful questioning or gentle confrontation, this dynamic may reinforce harmful ideas or behaviours, especially for vulnerable groups such as teenagers or people in crisis.
Relying on AI for persistent challenges can delay getting professional support when it might be most needed.
Loss Of Human Touch With The Over-Reliance On Technology

While AI can be trained to sound warm, it does not truly connect or care, and it cannot feel or share our experiences. When AI becomes our main source of support, when we turn to it for every emotional need, we run the risk of avoiding real-world connections or support when we need them the most.
A study conducted in 2024 found that the more people feel a strong emotional connection or sense of satisfaction from interacting with AI, the harder it can be for them to communicate with others in real life.
The process of connecting and healing often happens through vulnerability and shared experiences. We will miss out on the richness of shared experiences and emotional depths such as empathy, co-regulation (the mutual process by which, with the support of another’s presence and connection, we manage our stress levels, emotions, and behaviours and return to a calm state), and repair after conflict.
These are deeply human experiences that technology cannot fully replicate or offer. If we replace human relationships with AI, it can lead to increased feelings of isolation over time.
Using AI With An Informed Mind
See it as a tool, not an all-knowing, all-wise platform
AI can certainly be useful during times of confusion or distress, but know that it is a supplement, not a substitute. It is not our friend or family, and those human relationships are irreplaceable.
It is also important to check in on how we feel and what we do after interacting with AI. Do we feel neutral, calm, or frustrated?
Do we feel drawn towards behaviours that are risky or rigid? When we engage in a behaviour, does it feel broadening or narrowing?
Know when to seek help
If you are experiencing persistent sadness, anxiety, hopelessness, or overwhelming stress, reach out and connect with a real person. It could be a friend, family member, therapist, or hotline.
- SOS hotline: 1-767
- IMH hotline: 6389 2222
Calibrate the use of AI with real-world experiences
Stay connected with the real world. Incorporate movement, rest, and meaningful conversations. Go for a walk, text or call a friend or family member, and engage in activities that give you meaning and pleasure.
AI is here to stay. Like most things, it is neither all good nor all bad. Using AI will likely not take away our ability to connect with others, but neither will it magically resolve all our emotional struggles.
It is worth pausing to reflect on whether our interactions with AI feel almost too perfect. Interacting with AI can be fun and comforting, but it can never replace the depth of real relationships we encounter on a day-to-day basis. These relationships, with their ups and downs, are what keep us grounded and fulfilled.

Written by Ai Jia Gwee, Clinical Psychologist at Us Therapy