When to use this: When you've noticed your child talking about an AI companion in emotional terms — "it really gets me," "we talk every day" — or when the warning signs checklist suggests over-reliance. You don't have to wait for a crisis.
Goal: Understand what the AI is providing for your child — not to eliminate it. Curiosity-first. The conversation that starts with genuine interest in their experience goes somewhere. The conversation that starts with "that's not a real relationship" ends immediately.
The most important thing: Don't open with skepticism or dismissal. The things an AI companion provides — consistent availability, nonjudgmental responses, patient listening — feel real and valuable to a kid who's found them. Lead with genuine curiosity about what they're getting from it. You can share your perspective after you understand theirs.
Setup: Casual setting. Not an interrogation. No visible agenda. This can start as a regular conversation: "I was curious about that app you use — what's it like?"
The Script
Opening
Parent: "I've noticed you spend a fair amount of time with [app name]. I'm genuinely curious — what do you like about it?"
[Wait. Don't fill the silence. This is a real question, not a setup. Let them answer.]
[If they're defensive: "I'm not asking because you're in trouble. I just realized I don't know much about it and I want to understand what it's like for you."]
Discovery questions
Let the conversation develop naturally. These are prompts, not a checklist:
- "What do you usually talk about with it?"
- "Does it feel different from talking to a person? How?"
- "Is there stuff you'd tell it that you wouldn't tell a friend?"
- "What do you think it actually knows about you?"
[The answers will tell you a lot. Listen for: whether the relationship feels exclusive or secret, whether your child is sharing things with the AI that they're actively withholding from people, and whether they distinguish clearly between the AI and a real relationship.]
Sharing your perspective
Acknowledge what's real before sharing your concern. This is the step most parents skip — and it's the one that makes everything else land.
Parent: "I get why it's appealing. It listens without judgment, it's always available, it never gets tired of your problems or changes the subject. Those things feel really good — for a reason. You want someone who's just there, without conditions. That's a genuinely human need."
[Pause. Let that land.]
Parent: "Here's what I think about, though. Real relationships — with friends, with people who love you — require something different. They require both people to show up even when it's hard. To be vulnerable with someone who can also be hurt. To navigate conflict and misunderstanding and make it through anyway. That's the friction that actually builds you. An AI can listen. It can't be changed by what you share with it. And the relationships that change you are the ones that matter most."
[Don't turn this into a lecture. Share it once, then stop. Ask: "Does any of that make sense to you? What do you think?"]
A collaborative next step
Don't end with a restriction. End with an expansion.
Parent: "I'm not saying stop using it. I'm saying let's also make sure we're investing in the real relationships — including the one between us. What's something we could do together this week?"
[Or: "Is there something going on that made the AI feel more useful than talking to people? Because I'd rather you came to me — or to a friend — with that."]
After the conversation
This is a starting point, not a resolution. Check in again in a week. The goal isn't to eliminate the AI companion — it's to stay visible as someone your child can bring things to, and to make sure the AI isn't doing work that real relationships should be doing.
If you're seeing significant signs of dependency, work through the "Warning Signs: AI Companion Dependency" checklist and consider talking with a counselor.