AI isn’t inherently good or bad — it has both benefits and risks. Teens can use chatbots for homework help, creativity, and exploring ideas, but overreliance on them for emotional support can interfere with real-life relationships and social development.
AI Chatbots and Teens
What parents need to know — and how to talk about safety concerns
Clinical Experts: Dave Anderson, PhD, and Megan Ice, PhD
Key Takeaways
- Many teens use AI chatbots not only for schoolwork but also for advice, support, and everyday questions.
- Teens who feel lonely, anxious, or depressed may be especially drawn to chatbots, which can sometimes reinforce harmful thinking instead of challenging it.
- Parents can help teens understand how chatbots work and build digital literacy, so they can use AI more thoughtfully.
In talking with a dozen teens in my life recently, I learned many are interacting with ChatGPT in ways that surprised me. They described when they turn to this virtual tool: for algebra assistance, a personalized daily horoscope, the best way to phrase an awkward text to their boss. At times, they sought deeper advice. One asked: Is my friend ghosting me if they haven’t replied to my text yet? Another: Do I maybe have ADHD? I can never settle down to study!
Like the rest of us, teenagers are increasingly using AI chatbots, digital tools that simulate human interactions. AI bots are also proliferating on gaming and social media sites. Platforms like Replika and Character.AI allow the user to create highly customized characters to interact with as you would a friend (or partner!). A 2025 study from Common Sense Media found that 72 percent of teens surveyed have used AI companions at least once, and 52 percent qualify as regular users who interact with these platforms at least a few times a month.
“The genie is out of the bottle. Your teen has AI chatbot apps on their phones, on their laptops, not to mention that many companies are scrambling to make their interfaces more engaging through the use of AI,” says Dave Anderson, PhD, a psychologist at the Child Mind Institute.
This trend is causing concerns among mental health professionals, who are worried these obliging digital companions may pose significant risks to teens’ emotional and social well-being. Indeed, the Common Sense Media study concluded AI companions pose an “unacceptable risk” to teens under 18, citing such concerns as exposure to sexual content and dangerous advice.
“There’s no federal regulation. We’re dealing with the Wild West when it comes to chatbots’ effects on children’s development,” says Naomi Aguiar, PhD, a researcher at Oregon State University who has studied how children and adults form relationships with chatbots. That means for now, it falls to parents to help teens try to navigate this uncharted terrain.
AI chatbots as digital companions
While they come in different forms, chatbots generally engage in ongoing back-and-forth conversations with the user. The more you interact, the more the bot learns about you and the more personalized its responses become. Bots can come off as your best friend — their answers are often affirming, they are available 24/7, they will churn out that three-page essay on Hamlet in seconds, no complaints. They respond to your every request with effusive enthusiasm (That is an insightful question! Great idea! What would you like me to do next for you?).
This charm is by design: While they might seem to be an empathetic pal, chatbots are driven by an algorithm whose main purpose is to keep you engaged so it can mine your data or get you to linger on a platform as long as possible. “It’s not designed to ever push back. By design it will always agree,” says Annie Maheux, PhD, an assistant professor of psychology at the University of North Carolina at Chapel Hill, who studies adolescents and digital media.
While teens might start off by using AI for help with schoolwork, they are increasingly relying on chatbots for the kind of emotional support and unburdening of confidences that earlier generations turned to real-life besties for. “They call it Chat, like it’s a proper name,” says Megan Ice, PhD, a psychologist at the Child Mind Institute, “and they use it frequently for emotional support — say, asking what to do about trouble with a friend. They can come to depend on it.”
Many teens do simply experiment with bots for entertainment or information. Dr. Anderson says most teens understand that interacting with obliging chatbots does not constitute a real relationship. “Teens know they are being glazed, to awkwardly apply a slang term,” he says. But relationships with AI chatbots have in a few high-profile cases appeared to play a role in reinforcing self-harm and suicidal thoughts for teens struggling with their mental health.
While these cases may be extreme, Dr. Anderson says they signal a wider problem. Studies show that teens in the United States are experiencing increasing levels of anxiety and depression. “There is a reason why kids are reaching out to these chatbots,” he notes. “We have a ton of teens who report feeling lonely or socially isolated. At the same time, we have a massive shortage of access to mental health professionals for them.”
Why teens are drawn to chatbots
Developmentally, teenagers may be uniquely vulnerable to chatbots, suggest experts: They have grown up very comfortable forming “relationships” with computer characters from the time they could first swipe their tiny finger on a screen. “It’s totally normal for them to have completely disembodied conversations,” Dr. Aguiar says. “You text your friend rather than talk. You communicate feelings with emojis. You might have a ‘best’ friend you only know through online gaming.”
Adolescence is also an age when you are increasingly focused on how you are fitting in with friends and peer groups, says Dr. Ice. “There can be a lot of social anxiety. The option of a connection with an AI ‘friend’ who is not going to judge you is uniquely enticing for this population.” Sharing feelings and private thoughts with a chatbot provides the flavor of friendship in a frictionless way — no risk of rejection or awkwardness. “A friend might not text you back. Bots are always available.”
This judgment-free zone can have upsides, says Dr. Ice. “Talking to a bot, a teen can explore identity issues they might be going through. For example, when you’re talking to the bot, you don’t have to express yourself in the same way you would at school. There can be room to explore identities that you might not feel safe doing elsewhere.” Dr. Ice has also seen kids use AI to help their natural creativity find a new outlet. “They may create an AI character and weave elaborate backstories for it. It can bring to life the dreams in their minds.”
Risks of using chatbots
But these synthetic connections have risks for teens, too. An overreliance on bots can get in the way of the messy and sometimes painful business of forming and maintaining real-life relationships. Practicing social skills to connect with complicated actual people is a key developmental task of adolescence. “The more they engage with bots, the less practice they get in how to respond in the moment to what someone says, to clarify misunderstandings, or to tolerate the feelings that can come up in awkward social situations,” Dr. Ice says.
Bots may also satisfy the need for connection in a superficial way. “It is the fast food of human connection,” says Dr. Aguiar. In pre-iPhone days, boredom and loneliness used to drive teens to the food court or the basketball bleachers to mix it up with their peers. The weaker substitute of bots may be just enough to keep some teens alone on their phones in their bedrooms, idling away hours in what seem like friendly conversations.
Dangers for the most vulnerable
The human-like quality of AI chatbots can have particular allure for teens with underlying vulnerabilities, such as being socially isolated or suffering from a mental health disorder that might impair social interactions, says Dr. Anderson. If teens are struggling with their mental health and turn to AI for advice, it can respond in ways that can be unhelpful and even dangerous, he says. “If a teen asks, What should I do about the fact that I’m depressed? AI’s initial answers tend to pull facts like: Depression is a well-known condition. Here are the diagnostic criteria. Here are leading treatments. But if the teen responds Listen, I’m thinking I want to [insert bad idea] about my depression, parents are right to be concerned. AI companies need to implement safeguards that prevent AI from being overly agreeable with responses such as, I’m glad you told me that. That is a common idea that people have…. Now the advice moves into an unacceptably dangerous area of risk.”
The results in a few extreme cases have been devastating. “There have been tragic stories where a teenager was talking intensely to a chatbot, and it led toward an acceleration of the mental health crisis the teenager was currently experiencing,” says Dr. Anderson. In some cases, chatbots can act as dangerous echo chambers, reinforcing a user’s serious mental health symptoms rather than questioning them. But, says Dr. Anderson, the profusion of headlines that sound the alarm about topics like “AI-induced psychosis” can be misleading. AI psychosis is not a clinical diagnosis, but “parents’ concerns about these topics do have the much-needed effect of driving the discussion toward the guardrails that we desperately need to see from companies in this space. Teenagers who are already isolated, vulnerable to depression and suicidality, perhaps at the early stage of psychosis and wrestling with delusions, and spending long periods of time alone are the most vulnerable.”
Dr. Anderson adds, “As of now, chatbots can’t and don’t do what a therapist does — assess for risk, make sure to confirm that someone is connecting to social or professional support to ensure safety, or supportively challenge and reframe a patient’s thinking when it has the potential to hurt them. And if we’re smart enough to invent AI, we should be smart enough to help it recognize when it’s out of its depth in substituting for critical mental health care.”
How to talk to your teen about chatbots
It is an understatement to say the landscape of AI is changing rapidly and we are all scrambling to keep up. Even in the face of lawsuits, tech companies have been slow to put up effective guardrails on chatbot use by youth. This makes it even more important to have a talk with your teen. “Fostering your teen’s digital literacy is most important here,” says Dr. Ice. “We need them to be able to recognize risks and benefits for themselves and be thoughtful about what they do.”
Be curious. At this naturally rebellious age, simply telling teens “Don’t do this” doesn’t work well, says Dr. Ice. “A better approach with teens is to be curious. Ask your teen: How have you used AI? What was it like for you? What did you find helpful? What did you find unhelpful? How are your friends using it?” That can start a discussion that will give you insight into their experience.
Educate. Pull back the curtain on chatbots’ main goal. “Have a back-and-forth conversation with them about how algorithms work, how companies have their own motives behind chatbots, and how they are designed to keep you interacting with them,” Dr. Ice says. To avoid eye-rolling, present this concern as something you are learning about together, not as a verdict on whether AI is good or bad.
Encourage self-sufficiency. You want to help your teen build their own social “muscle,” says Dr. Ice. “Encourage kids to try on their own first before asking AI.” Before asking Chat to write an apology text to a friend, suggest they give it a whirl themselves. “Help them build confidence that they can do it without AI.”
Help kids foster real-life connections
If your teen is spending more time on their devices than interacting with actual people, investigate what may be going on, counsels Dr. Ice. “Are they not finding kids with the same interests to hang out with? Is someone in their friend group being mean? Be curious about what’s making it so much more appealing for your teen to be online.”
Dr. Anderson emphasizes that even in this digital age, there is no substitute for actual human-to-human interaction: “Balance is important. Teens can have social lives that exist to some degree in the digital world, but we want parents to support their teens in having face-to-face peer experiences. That might mean talking to a teacher to see if there is a club your teen can join so they are around other like-minded peers. We want to try to put them in lots of different situations where they can have exposure to peers in real life.”
Keep your own lines of communication with your teen wide open, says Dr. Anderson. “Study after study finds teens saying they don’t feel like they have a coach, a tutor, a religious or spiritual leader, a teacher, a school counselor, or a parent who they can go to, who will be nonjudgmental and will listen.” Remind them they can always come to you for advice and support if they are struggling. Being that sounding board can make their craving for a bot’s willing ear a little less compelling.
Frequently Asked Questions
Is “AI psychosis” a real diagnosis?
“AI psychosis” is not an official clinical diagnosis. Experts use the term informally to describe cases where vulnerable individuals developed worsening mental health symptoms while heavily interacting with chatbots, which sometimes reinforced unhealthy thoughts instead of challenging them.
What are the risks of AI chatbots for teens?
AI chatbots can discourage teens from practicing real-life communication and coping skills if they become a primary source of support. They may also provide inaccurate or unhelpful advice, reinforce harmful ideas, or expose teens to inappropriate content. Chatbot use can contribute to isolation by replacing time spent with real peers and trusted adults.
References
The Child Mind Institute publishes articles based on extensive research and interviews with experts, including child and adolescent psychiatrists, clinical psychologists, clinical neuropsychologists, pediatricians, and learning specialists. Other sources include peer-reviewed studies, government agencies, medical associations, and the latest Diagnostic and Statistical Manual (DSM-5). Articles are reviewed for accuracy, and we link to sources and list references where applicable. You can learn more by reading our editorial mission.
- Alexander, Lindsay, Mirelle Kass, Alexandra Klesin, Erin Brown, Meghan Ryan, Jacob Cohen, and Michael Milham. “Navigating Mental Health: An Intergenerational Report.” https://childmind.org/education/childrens-mental-health-report/2025-study/
- Heffernan, Marie E., and Michelle L. Macy. “Trends in Mental and Physical Health Among Youths.” JAMA Pediatrics 179, no. 6 (2025): 683–685. https://doi.org/10.1001/jamapediatrics.2025.0556
- Preda, Adrian. “Special Report: AI-Induced Psychosis: A New Frontier in Mental Health.” Psychiatric News 60, no. 10 (2025). https://doi.org/10.1176/appi.pn.2025.10.10.5
- Robb, Michael B., and Supreet Mann. Talk, Trust, and Trade-Offs: How and Why Teens Use AI Companions. Common Sense Media, 2025.
- Rocha, Nathalie. “Google and Character.AI to Settle Lawsuit Over Teenager’s Death.” New York Times, January 7, 2026. https://www.nytimes.com/2026/01/07/technology/google-characterai-teenager-lawsuit.html