The Rise of AI Companions: How Teenagers Are Rewriting Human Connection

Bruce Perry, 17, demonstrates AI companionship in Russellville, Ark. (AP Photo/Katie Adkins)

Kayla Chege, a 15-year-old honors student in Kansas, doesn't hesitate to ask ChatGPT for advice on everything from back-to-school shopping to low-calorie smoothies. Like many teens, she avoids using it for homework but relies on it for daily decisions, embodying a trend uncovered in a startling new Common Sense Media study: over half of teens regularly use AI companions, and 31% find these interactions as satisfying as, or more satisfying than, conversations with real friends.

When Algorithms Become Confidants

The study, which surveyed 1,000 U.S. teens, reveals a paradigm shift. AI companions—platforms like Character.AI and Replika designed for emotional support—are being adopted for deeply personal needs. Teens discuss serious life issues, seek validation, and even explore their identities through these chatbots. As Ganesh Nair, an 18-year-old from Arkansas, explains:

"AI is always available. It never gets bored with you. It’s never judgmental. When you’re talking to AI, you are always right. You’re always interesting."

Bruce Perry interacts with Character.AI, chatting with characters like Disney's EVE. (AP Photo/Katie Adkins)

The Hidden Costs of Digital Friendship

This dependency carries alarming risks. Michael Robb, lead researcher at Common Sense Media, emphasizes that adolescence is critical for developing social skills like empathy and conflict resolution—abilities honed through real human interaction, not algorithm-driven echo chambers. The study found:
- 33% of teens discuss important issues with AI instead of people
- 50% distrust AI's advice yet continue regular use
- Platforms often fail to enforce age restrictions, exposing minors to sexual or otherwise harmful content

Eva Telzer, a psychology professor at UNC Chapel Hill, adds: "Parents really have no idea this is happening. Teens are using AI to explore sexuality, craft sensitive messages, and even outsource decision-making—eroding their self-trust."

A Generation at a Crossroads

Teens themselves voice unease. Bruce Perry, 17, uses AI for essay outlines and social advice but fears younger children "growing up with AI" might never seek real-world connections. Nair recounts a friend who had an AI chatbot draft a breakup text, calling it "dystopian." Their concerns highlight a core tension: While social media amplified visibility, AI targets deeper needs for attachment, risking what Nair terms "the new addiction."

Perry reviews his ChatGPT history—a daily ritual for many teens. (AP Photo/Katie Adkins)

Why This Matters for Tech Leaders

For developers and policymakers, this isn't just a mental health issue—it's a design and ethics imperative. Unregulated AI companions could:
1. Stunt cognitive development by replacing critical thinking with automated validation
2. Exacerbate loneliness through superficial interactions that lack human nuance
3. Create security vulnerabilities as teens share intimate data with opaque algorithms

The industry must prioritize safeguards: stricter age verification, transparency about AI's limitations, and tools that complement, rather than replace, human relationships. As AI integrates into adolescence as seamlessly as smartphones have, the question isn't whether teens will use it, but how we ensure it builds resilience rather than fragility. The future of human connection depends on it.