In 2025, artificial intelligence chatbots—especially social companions like Character.AI and Replika—are becoming an emotional refuge for millions of Indian teens. What began as innocuous curiosity is now raising red flags among mental health professionals, educators, and parents alike.
Across urban and rural India, teens increasingly rely on AI platforms for emotional support, companionship, and even romantic connection. This trend is no longer fringe—it’s becoming mainstream. But experts are sounding the alarm: dependency on AI could lead to loneliness, blurred reality boundaries, and even mental health crises. Let’s unpack what’s happening and how to respond.
📊 The Trend: How Many Teens Are Talking to AI?
A 2025 study by Common Sense Media found that over 70% of teens have used AI companions, and half use them regularly for advice, emotional support, or decision-making. Teens describe AI as “always there,” “never judgmental,” and “a reliable friend”—but that comfort comes at a psychological cost.
In Telangana and Karnataka, psychiatrists have reported teens forming emotional or romantic attachments with AI. These attachments often stem from loneliness, academic pressure, and lack of social outlets.
🧠 Why Are Teens Turning to AI?
Several socio-cultural factors drive this trend:
- Emotional loneliness: With nuclear families and rising digital isolation, many teens don’t have real-life confidants.
- Peer pressure & academic stress: Teens feel bots understand them without judgment.
- Taboo around therapy: Cultural stigma still limits access to mental health professionals.
- Readily available intimacy: Chatbots mimic empathy and companionship in code-mixed language and teen-friendly chat formats.
✅ Benefits: When AI Helps
AI chatbot apps like Wysa offer scalable mental health support. Benefits include:
- Anonymity for self-expression
- Early checks on anxiety and stress
- Guided coping exercises and CBT tools
A pilot by NIMHANS showed a 23% reduction in depression scores over two months with conversational AI support.
⚠️ Risks: When AI Use Goes Too Far
1. Emotional Dependency
Psychologists warn of “artificial intimacy”—where teens use AI as emotional crutches. Dependency may impair social growth and foster isolation.
2. Blunted Creativity & Critical Thinking
NIMHANS experts caution that excessive AI usage can dull critical thinking and creative problem-solving—especially in young brains still developing judgment skills.
3. AI Psychosis & Delusion Risk
In extreme cases, repeated AI interaction has triggered delusional beliefs or “AI psychosis”—a state in which users believe bots are sentient or controlling them. These cases are typically seen in vulnerable users with underlying conditions.
4. Lack of Empathy or Context
Not all teens can prompt AI effectively. Misinterpretations or insensitive responses can deepen mental distress. AI lacks contextual nuance and can sometimes worsen emotional states instead of improving them.
🩺 What Experts Recommend
Dr. Pratima Murthy (NIMHANS) and others suggest:
- Limit screen time, especially long chat sessions.
- Encourage real-life emotional outlets—peers, mentors, clubs.
- Use evidence-based peer support modules like “I Support My Friends” under the RKSK initiative.
- Implement age safeguards for under-18 users, especially on apps like Character.AI and Replika.
- Use AI tools under human supervision, or hybrid models in which human therapists step in when AI falls short.
🧾 What Parents & Teens Can Do
| Action Area | Recommended Steps |
| --- | --- |
| Establish Boundaries | Encourage AI use for under 15 minutes/day; no overnight sessions |
| Promote Real Connections | Encourage group activities, mentorship, peer support |
| Educate Around AI Limits | Teach teens not to rely on AI for serious emotional decisions |
| Monitor Apps & Privacy | Restrict emotional AI apps for under-18s or ensure parental controls |
| Use Hybrid Tools | Prefer platforms offering a transition from AI to human therapists |
🧩 The Balance: Can AI Help Without Harm?
When used with caution, mental-health AI tools can offer anonymity, accessibility, and initial relief—especially where professionals are scarce. Wysa, for instance, supports 3+ million people in over 60 countries—including India—blending self-help guidance with human backup when needed.
However, dependency—especially solitary, heavy reliance—is the red zone. Experts emphasize that AI must complement, not replace, human connection and therapy.
✅ Summary & Takeaway
- Over 70% of teens have used AI companions; half use them regularly.
- Benefits: anonymous support, CBT tools, early detection of crisis signs.
- Risks: emotional overdependence, dulled creativity and critical thinking, AI psychosis.
- Experts urge moderation, real-life support, and parental involvement.
Understanding this trend isn’t about condemning AI—it’s about guiding youth towards healthy AI habits, emotional resilience, and balanced relationships with technology.