
At a time when students are lonelier than ever (spending 66% less time in person with friends than a decade ago, with 1 in 3 saying they have no one to turn to when they're struggling), many are turning to AI for advice, comfort, and even companionship. In Clayful's recent webinar, educators, mental health experts, and district leaders came together to explore a critical question: how do we guide kids through the age of AI without replacing the human connection they need to thrive? What followed was an honest, nuanced conversation about loneliness, emotional atrophy, student agency, and why empathy, not algorithms, must remain at the center of how we support young people.
The conversation featured a diverse panel of voices who are living this work every day:
You can watch the full recording here:
We are living at the intersection of a loneliness crisis and a rapid rise in AI companionship tools—and students are caught in the middle.
Clayful founder and CEO Maria Barrera opened the webinar with this framing:
“We’re more connected than ever—and yet our kids have never felt more alone. AI is stepping into that gap. The question isn’t if they’ll use it—it’s what role we want it to play in their lives.”
Panelists agreed: students aren't just using AI to cheat or finish homework. They're using it to feel heard.
Alison Lee shared research from The Rithm Project:
She added:
“AI is good at easing loneliness—but it’s also quietly scaling isolation.”
Maria shared Clayful's real-world experience supporting thousands of students with human coaches, not bots.
Top reasons students reach out to Clayful coaches:
And what do students say afterward?
“They talked to me like a real person.”
“You made my lonely night a little less cold.”
“They didn’t write like a robot.”
“Thank you for spending time with me.”
Maria noted:
“Students can tell the difference. They might use AI for quick answers—but they crave real empathy when it matters.”
Joni Stamford, therapist and author of The AI Antidote, warned of emotional skill “atrophy”:
“If kids outsource hard conversations and uncomfortable feelings to AI, their emotional muscles weaken—just like skipping the gym.”
Morgan Nugent, superintendent, added:
“The real danger is when a student starts to believe: the only thing that has time for me is a bot. If we send that message, what are we telling them about their worth?”
Banning AI isn’t realistic or helpful. Instead, students need skills and values to decide:
Resources shared to help:
AI access, support, and risks don’t look the same in every community.
Morgan explained:
Alison added:
“Whether a student has an adult who understands AI—and can guide them—is an equity issue. Access to mentorship is just as important as access to technology.”
Maria closed the session with this reflection:
“AI isn’t going away. But neither is the power of a human who listens—who holds space, without judgment, when a student needs it most. Our job is to make sure our kids never have to choose between the two.”
It was a rich conversation, so we'll be creating a series of blog posts on each topic. We'll link them below as they're published.
