When to use this: When your teen is using AI tools regularly — ChatGPT, AI companions, AI tutors — and the question of data privacy hasn't come up. Ideally before something happens, but any time works.
Goal: Help your teen understand what AI systems actually collect and store — framed around their autonomy and right to control their own story, not parental surveillance. This is not about what they're hiding. It's about who gets to know them.
Frame this around them, not around you. Teens respond to privacy arguments that land on self-interest, not parental authority. The frame is: you get to decide who knows what about you. Right now, these tools are making those decisions for you, without you realizing it.
Setup: This works well as a demonstration, not a lecture. Have a laptop or phone available. You'll do something together at the end.
The Script
Opening
Parent: "Can I show you something? It'll take maybe ten minutes and I think you'll find it actually kind of interesting."
[Low-stakes opener. No alarm. Pure curiosity.]
Parent: "Did you know that ChatGPT saves your conversations and, by default, can use them to train future models? Unless you turn that setting off, anything you type in may become training data."
[Let that sink in. Ask: "Did you know that?"]
Discovery questions
- "What kinds of things do you usually ask ChatGPT about?"
- "Is there anything you've asked about that you'd rather not have stored somewhere?"
- "Do you use any AI apps where you talk about personal stuff — feelings, relationships, that kind of thing?"
[These questions are data-gathering for you, but also thought-prompting for them. Most teens haven't considered that their AI conversations are stored at all.]
Sharing the key points
Keep this tight. Hit the most important points without turning it into a data privacy seminar.
Parent: "Here's what I want you to know. One: by default, ChatGPT saves your conversations and can use them for training. You can turn that off, and we're going to do that right now. Two: AI companion apps — the ones that feel like talking to a friend — are designed to read emotional signals in your messages. They're learning what makes you feel things. That data doesn't disappear. Three: the data you share now can exist for years, in systems you have no visibility into. And four — this is the important one — this isn't about what you're hiding. It's about who controls your story."
[The last line is the one that lands with teens. Say it again if it doesn't register: "This isn't about hiding anything. It's about the fact that you get to decide who knows you. Right now, you're giving that away without knowing it."]
The concrete action
Parent: "Let's go turn off training data right now. I'll show you how."
[Walk through it together. For ChatGPT, as of this writing: Settings → Data controls → turn off "Improve the model for everyone." App menus change, so if the option isn't there, search "ChatGPT opt out training data." For other apps, search "[App name] privacy settings" or "[App name] opt out training data" together.]
Parent: "Now you know it's off and you know how to check it. That took three minutes. It's that easy to take back some control."
Close
Parent: "I'm not trying to make you paranoid about AI — I use these tools too. I just want you to use them with your eyes open. You get to decide what you share and with whom. That's worth protecting."
[If they push back — "I don't care if they have my data" — try: "You might care in five years. And the decision you're making now affects future you."]
After the conversation
Check back in a week or two: "Have you thought any more about the privacy stuff? Any questions?" Keep the door open without making it a recurring interrogation.
If your child uses AI companion apps heavily, consider also reviewing the Warning Signs: AI Companion Dependency checklist.