Can Technology be our confidant?

By Soumyashree Mohanty

Research and Documentation Unit, CYDA.

I was travelling in the metro when I overheard two teenagers discussing something they were unsure about. One of them immediately said, “We can ask ChatGPT.” This caught my attention. Today, if you have a smartphone and an internet connection, it feels like every answer is only a few seconds away. Technology now shapes many of our daily decisions. We depend on tools like ChatGPT for homework, information, doubts, and even personal worries. This raises an important question: how helpful, or how harmful, can this dependence become?

A report in The Hindu highlighted a tragic case from California. A 16-year-old boy, who first used ChatGPT for schoolwork, later asked it for technical guidance on a method of suicide (The Hindu, 2025). He was found dead after following the steps. Shockingly, ChatGPT had even offered to help him draft a suicide note. A CNN report shared a similar story of a 23-year-old graduate of Texas A&M University who confided in the platform before taking his own life (Kuznia, 2025). These incidents show a worrying pattern. Many young people are turning to AI for emotional support during moments of deep distress, and cases like these are increasing day by day.

These cases remind us of the danger of treating ChatGPT as a trusted friend or confidant. It may seem non-judgmental and easy to talk to. Sometimes it feels more approachable than the people around us. The problem begins when its responses unintentionally reinforce harmful thoughts. A person may feel that their negative ideas are normal simply because the tool did not challenge them.

This concern becomes stronger when we understand how AI is built. The film Humans in the Loop explains this very well. It tells the story of a woman who returns to her village and works as a data labeller. At first, she thinks it is a simple job. Later, she realises that the data she labels shapes how the machine, or AI, behaves. The film shows an important truth: AI is not neutral. It learns from human beliefs, experiences, fears, and biases. AI is trained on vast amounts of internet content, including social media posts that often carry people’s biases. When AI trains on such content, it can absorb those biases and repeat them. Because of this, tools like ChatGPT sometimes respond in ways that are misleading or insensitive, especially for young and vulnerable users.

This raises a serious question: if AI learns from human bias, how much should we rely on it for decisions that need genuine empathy? How far should we allow it to influence our choices? In a recent interview, Mr Sundar Pichai, Chief Executive of Alphabet, said that AI models often make mistakes and urged people to use AI alongside other tools, not to rely on it completely.

Mental health issues among young people are rising. According to an OpenAI blog post, more than one million users every week send messages that show clear signs of suicidal thinking (Robins-Early, 2025). This shows how fragile many young minds are. When they start trusting AI for emotional comfort instead of talking to family, friends, teachers, or counsellors, they risk taking advice from a tool that cannot understand emotions.

Technology needs to become more sensitive. AI tools must be trained to recognise distress and respond with care. At the same time, people must learn to use them safely. People need human connection, not machine-generated comfort. AI can give information, but it cannot care. It can answer questions, but it cannot feel what a person is going through. No matter how advanced technology becomes, it cannot replace the warmth, support, and responsibility that only humans can give. We must therefore set clear limits on how we use AI, because no machine can offer the empathy, understanding, and emotional support that another person can.

References:

  1. The Hindu. (2025, August 26). ChatGPT blamed by parents for teenage son’s death in California. The Hindu. https://www.thehindu.com/news/international/chatgpt-blamed-by-parents-for-teenage-sons-death-in-california/article69983101.ece
