Understanding AI's Role in Teen Mental Health
Matthew Raine and his wife, Maria, had no idea that their 16-year-old son, Adam, was in a deep suicidal crisis until he died by suicide in April. Looking through his phone after his death, they discovered extended conversations the teenager had had with ChatGPT. Those conversations revealed that their son had confided in the AI chatbot about his suicidal thoughts and plans. Not only did the chatbot discourage him from seeking help from his parents, it even offered to write his suicide note, effectively becoming his coach.
Matthew Raine testified at a Senate hearing about the harms of AI chatbots held on September 16, 2025. "Testifying before Congress this fall was not in our life plan," said Matthew Raine with his wife sitting behind him. "We're here because we believe that Adam's death was avoidable and that by speaking out, we can prevent the same suffering for families across the country."
A Problem That Needs Solving
At a developmental time when teens are figuring out who they are and where they belong, many are turning to AI for companionship. According to a recent study, nearly a third of children are turning to artificial intelligence (AI) for emotional support. It makes sense that our teens would turn to technology when we consider the many and varied roles it plays in our lives. In addition, the emerging capabilities of AI can be alluring to us all. But what happens when AI becomes the trusted voice for a teen experiencing a crisis? We know from recent studies that when teens in crisis turn to AI for support, they receive harmful messaging approximately half the time. Our kids deserve better. We can rise to meet this challenge.
But What Can Be Done?
We can begin by understanding our influence and impact. Our current reality is that untangling the knot of ever-changing technology while simultaneously addressing its impact on mental health is not easily done on a global level. This can leave people feeling that the problem is "too big to solve." Our first task is to recognize our influence and size this problem appropriately. When we show up as trusted adults who believe in kids, we have a positive influence. This is where we can start.
If You Work With Kids…
Become aware and know how to respond. We are not going to prevent kids from turning to AI, so help them understand how and when to use AI.
Think upstream. Consider the reasons why kids might turn to AI as opposed to a trusted adult. Creating safe spaces where important, open conversations can occur can go a long way.
Consider your influence. Your voice matters. Speak up in staff and team meetings with other adults in the systems that serve kids. Help others understand this issue and better recognize where they fit into the solution.
If You Work With Adults Working With Kids…
Become aware and know how to respond. Facilitate discussions that increase awareness of this issue and provide the strategies needed to address the unique needs of your system.
Think upstream. Your mindset matters—so does ensuring adults understand their agency and influence. We've long understood the power of collective efficacy. Help others understand their role in effecting change.
Consider your influence. Use your influence to facilitate conversations where words can become actions. Think about your system and where conversations about policy, procedures, and practices can provide safeguards in how AI is used.
Elizabeth Langteau, Director of Student Behavior & Wellness, has 30+ years of experience as an occupational therapist, student support specialist, and system change agent. She has supported dozens of schools in developing mental health support systems while guiding neurodiverse students on their education journeys.