Background
Research suggests that roughly 72% of U.S. teens have interacted with AI companions, and about a third use them primarily for social or emotional connection rather than for information or tasks. These tools are designed to be available, nonjudgmental, and endlessly patient in ways humans can't be. For many kids, that's genuinely helpful. But the line between helpful tool and problematic dependency isn't always obvious, and the shift from one to the other rarely happens all at once. This checklist is for spotting the pattern early.
Observable behavioral signals
Check any behavior you've observed consistently over the past two weeks:
Emotional dependency
Sleep and routine disruption
Social substitution
School and performance
What to do if you're seeing these signs
1. Don't take the app away immediately.
Sudden removal of an attachment figure — even an artificial one — can cause real distress. If your child has formed an emotional bond with an AI, yanking access without conversation can backfire and damage trust.
2. Have a conversation first.
Lead with curiosity, not judgment. Try: "I've noticed you spend a lot of time with [app name]. What do you like about it?" Listen. You're trying to understand what it's providing — because understanding that tells you what's actually missing.
3. Gradually reintroduce alternatives.
Not as a directive ("you need to hang out with real friends"), but as expansion: "I was thinking we could [do an activity] this weekend." The goal is to add, not subtract.
4. If distress is significant, get professional support.
Dependency on AI companions is an emerging concern that clinicians and school counselors are increasingly encountering. A therapist who works with adolescents can provide structured support. You don't have to handle this alone.
Crisis resource
If your child is expressing thoughts of self-harm or not wanting to be here, call or text 988 to reach the Suicide & Crisis Lifeline, or text HOME to 741741 to reach the Crisis Text Line.