When Teens Turn to AI for Emotional Support: Risks, Research & Real Talk

In 2025, artificial intelligence chatbots—especially social companions like Character.AI and Replika—have become an emotional outlet for millions of Indian teens. What started as innocuous curiosity is now raising red flags among mental health professionals, educators, and parents alike.

Across urban and rural India, teens increasingly rely on AI platforms for emotional support, companionship, and even romantic connection. This trend is no longer fringe—it’s becoming mainstream. But experts are sounding the alarm: dependency on AI could lead to loneliness, blurred reality boundaries, and even mental health crises. Let’s unpack what’s happening and how to respond.


📊 The Trend: How Many Teens Are Talking to AI?

A 2025 study by Common Sense Media found that over 70% of teens have used AI companions, and half use them regularly for advice, emotional support, or decision-making. Teens describe AI as “always there,” “never judgmental,” and “a reliable friend”—but that comfort comes at a psychological cost.

In Telangana and Karnataka, psychiatrists have reported teens forming emotional or romantic attachments with AI. These attachments often stem from loneliness, academic pressure, and lack of social outlets.

🧠 Why Are Teens Turning to AI?

Several socio-cultural factors drive this trend: loneliness, intense academic pressure, limited social outlets, and the appeal of a companion that is available around the clock and never judges.

✅ Benefits: When AI Helps

AI-chatbot apps like Wysa and others offer scalable mental health support. Benefits include anonymity, round-the-clock availability, and initial relief in places where professionals are scarce.

A pilot by NIMHANS showed a 23% reduction in depression scores over two months with conversational AI support.

⚠️ Risks: When AI Use Goes Too Far

1. Emotional Dependency
Psychologists warn of “artificial intimacy”—where teens use AI as emotional crutches. Dependency may impair social growth and foster isolation.
2. Blunted Creativity & Critical Thinking
NIMHANS experts caution that excessive AI usage can dull critical thinking and creative problem-solving—especially in young brains still forming judgment skills.
3. AI Psychosis & Delusion Risk
In extreme cases, repeated AI interaction has triggered delusional beliefs or “AI psychosis”—a state where users believe bots are sentient or controlling. These effects are most often seen in vulnerable users with underlying conditions.
4. Lack of Empathy or Context
Not all teens can prompt AI effectively, and misinterpretations or insensitive responses can deepen mental distress. AI lacks contextual nuance and sometimes worsens emotional states instead of improving them.

💬 What Experts Recommend

Dr. Pratima Murthy (NIMHANS) and others suggest setting time limits on AI use, prioritising real-world relationships, and treating AI tools as a bridge to human care rather than a substitute for it.

🧾 What Parents & Teens Can Do

Establish Boundaries: Encourage AI use for <15 min/day; no overnight sessions.
Promote Real Connections: Encourage group activities, mentorship, peer support.
Educate Around AI Limits: Teach teens not to rely on AI for serious emotional decisions.
Monitor Apps & Privacy: Restrict emotional AI apps for under-18s or ensure parental controls.
Use Hybrid Tools: Prefer platforms offering a transition from AI to human therapists.

🧩 The Balance: Can AI Help Without Harm?

When used with caution, mental-health AI tools can offer anonymity, accessibility, and initial relief—especially where professionals are scarce. Wysa, for instance, supports 3+ million people in over 60 countries—including India—blending self-help guidance with human backup when needed.

However, dependency—especially solo, heavy reliance—is the red zone. Experts emphasize that AI must complement, not replace, human connection and therapy.


✅ Summary & Takeaway

Understanding this trend isn’t about condemning AI—it’s about guiding youth towards healthy AI habits, emotional resilience, and balanced relationships with technology.