Declining mental health among American youth is no secret. It’s the subject of frequent news reports and a fixture of pop culture, and bestselling authors link the growing problem to smartphones and screen time. Meanwhile, school officials are worried about their students, burdened by parents’ expectations, and struggling to meet the rising demand for counselors. What if technology could solve this problem of its own making?
Enter Sonny, a school counseling chatbot. In “When There’s No School Counselor, There’s a Bot,” Wall Street Journal columnist Julie Jargon describes the part human, part AI hybrid making its way into schools across the country. In the words of Sonar Mental Health, Sonny’s developers, it’s a “personal wellbeing companion” that K-12 students can chat with “about literally anything.”
What sets this chatbot apart from pure AI, says Jargon, is the human element. She reports, “humans with backgrounds in psychology, social work and crisis-line support are always in the mix, reviewing the chats and taking cues from AI to inform their own replies to students.”
But a closer look at Sonar’s own webpage and its “Human-In-The-Loop model” is less reassuring. Who are the humans listening in on conversations “about anything” with school children? They’re 20-somethings. Sonar says it hires young people because they’re “closer in age” with “similar backgrounds” and better able to “empathize and connect with the students.” It also makes Sonny “more affordable.” Indeed.
At Sonar, youth input is central. “Teens know what works best for them,” its website says; “that’s why student voices shape everything we do,” including how Sonny sounds. Sonar’s co-founder Drew Barvir told the Journal that the AI speaks like a cool older sibling.
Are these cool older siblings the mental health professionals who are “always in the mix”? Hardly. The website’s disclaimer says the 20-somethings are “not licensed mental health professionals.” Licensed experts are a step further removed from the chats themselves, providing support to the staff who do the monitoring.
But even that isn’t what it appears to be. According to Jargon’s reporting, a staff of six people, working across shifts, can monitor 15 to 25 chats at a time. But Sonny, who is available to kids from 8 a.m. to 2 a.m., is being used by 4,500 middle and high school students across nine school districts. That math doesn’t add up. With only a handful of people on duty at any given hour, most of those conversations can’t be receiving real human attention. Sonny must be much more AI than human.
Sonar’s use of AI to “catch problems early” is no solution. Meeting kids on their phones, the very place responsible for so much of their mental anguish, and engaging them with AI chats that are monitored (loosely defined) by 20-somethings is deeply concerning. Far from solving school officials’ problems, Sonny is likely to make things worse. Children will think Sonny is a trusted friend, but they’ll be deceived. It’s a Frankenstein’s monster: part computer program, part 20-something, a mash-up without wisdom. And it will likely draw children deeper into their devices, cutting them off from the real people who can help them.
Even if Sonar could fully staff Sonny with trained mental health professionals, it would still be a bad idea. Psychoanalyst Erica Komisar, writing for the Institute for Family Studies, points to what children need most in this mental health crisis, and it’s not AI chatbots. It’s their parents. Parents, she says, “are the lenses that help children see just far enough into the future to understand the impact of their choices. They are the moral shelter children need to grow into emotionally and ethically-grounded adults.” To truly help children, Komisar calls for “ending the outsourcing of parenting to schools, therapists, and social media influencers.”
As parents, we feel the limits of our human intelligence. How much greater, then, are the limits of 20-something wellbeing companions who have never met the children they’re chatting with? Assurances that they’re guided by AI running on “machine learning” and “social media insights” fall flat. Nor is it any comfort that the youth on the other end of the chats are “safe and judgment-free.” What schoolchildren often need to hear is that they’re thinking wrongly, that their plans are foolish, or that a course correction is overdue. They need to hear this from their parents, pastors, and other sources of wisdom.
From the beginning, God made parents to love their children, with all of their unique needs, and bring them up in the instruction of His Word (see Deuteronomy 6:6-7 and Ephesians 6:4). This is where wisdom is found. Parents may occasionally need to enlist the help of a trusted pastor or biblical counselor, but never a computer-generated conversation overseen by a kid barely older than their own children.
AI-generated “counselors” can’t help children or teens in need of real support. They will only intensify the pathologies emerging from screen time and scrolling. Children in crisis need adults who know them personally and are committed to their good, adults who can help them see the truth about God, the brokenness of the world, and their own need for redemption and grace. They need this apart from the screens that are so often the platform of their suffering. Parents, your kids may need counsel, but they don’t need hybrid counseling chatbots. They need you.