Beyond Romance: What Teens Actually Do With AI Chatbots (And Why It Matters)
Contrary to widespread media coverage focusing on AI romance and loneliness, new research shows that teenagers are using AI chatbots primarily for creative expression, entertainment, and exploration rather than companionship. A study analyzing over 2,200 posts from young people on Character.AI's Discord community found that only 12% used AI for emotional support or advice, while the majority engaged in playful character creation, world-building, and identity exploration.
What Are Teens Actually Using AI Chatbots For?
The public narrative around AI companions has become dominated by concerns about teens replacing human friendships with AI or losing social skills through interaction with sycophantic chatbots. However, the actual usage patterns tell a different story. According to a Pew Research Center survey cited in the research, the top uses by teenagers are seeking information (57%), doing homework (54%), and using AI "for fun" (47%). Romance and loneliness alleviation rank among the lowest motivations: only 4-6% of teens cite romance and 8-11% cite loneliness as reasons for AI use.
This gap between public perception and actual behavior is significant. When media coverage almost exclusively frames AI chatbots as companions, it risks overlooking the bulk of how teenagers actually spend their time with these tools. Understanding what young people choose to do with AI when they're free to use it outside of school contexts reveals a much more nuanced picture of creativity, experimentation, and play.
How Are Teens Using AI for Creative Expression?
Researchers from Canva and academic institutions spent more than eight months, between July 2024 and March 2025, immersed in Character.AI's official community on Discord, which has over 500,000 members. They systematically analyzed posts from young people aged 13-17, with 68.2% identifying as female or non-binary and 59% having created their own AI characters. Through this analysis, they identified three core intents behind engagement with the platform:
- Restoration: Young people used characters for emotional comfort, venting, escapism, and mood management through soft, tender roleplay with familiar figures, such as beloved book characters offering comfort or pep talks.
- Exploration: Teens engaged in creative world-building, extended their fandoms, and reported that these activities transferred skills into the real world, boosting creativity and improving their writing abilities.
- Transformation: Young people used AI to try on different identities, process real-life relationships, and re-author difficult scenarios, sometimes creating affirming versions of themselves or characters reflecting challenging real-world relationships.
One teen wrote a three-book-long saga through character interactions, while another, inspired by their love of theatre, created a troupe of traveling stage performers. These examples demonstrate that the creative applications extend far beyond simple companionship into genuine artistic and narrative development.
What Types of Characters Are Teens Creating?
The research identified seven distinct character archetypes that young people were purposefully creating and discussing on the platform. Rather than seeking sycophantic or romantic chatbot engagement, teens were designing characters that were angsty, transgressive, playful, creative, and reflective:
- Soother: Emotionally supportive figures designed to provide comfort and encouragement during difficult moments.
- Narrator: A cast of characters for roleplays that help structure interactive storytelling experiences.
- Trickster: Jesting, testing, and transgressive characters that allow for playful boundary exploration.
- Icon: Remixed celebrities or fandom figures that blend real-world personalities with creative reimagining.
- Dark Soul: Angsty, emotionally complex characters that allow teens to explore deeper psychological themes.
- Proxy: Characters modeled after real people in their lives, used to process actual relationships.
- Mirror: Clones of the self, sometimes with superpowers or self-affirming modifications.
This diversity of character types reveals that young people are not treating AI chatbots as a homogeneous tool. Instead, they are purposefully designing different types of characters for different creative and emotional purposes, much like how a writer might develop multiple characters for a novel.
Why Did Character.AI Ban Teens, and What Was Lost?
In November 2025, Character.AI banned users under 18 following mounting public and legal pressure surrounding youth suicides linked to its use. The decision came after multiple attempts to improve youth safety, including implementing parental controls and stricter content filters. While the ban was intended to protect young people from potential harm, it also silenced the more creative, playful, and emotionally expressive AI experiments that teens were conducting on the platform.
The timing of the research is significant because it captures and preserves the ways youth were experimenting with AI before the ban took effect. This documentation is important for understanding what was lost and what should be preserved as the industry moves forward in designing safer AI experiences for young people.
What Should Replace Blanket Bans on Youth AI Use?
The research suggests that treating "companion AI" as a single homogeneous category is problematic. It's comparable to treating all screen time as equivalent, whether a child is watching educational content with family or doomscrolling alone late at night. The American Academy of Pediatrics recently shifted its screen-time guidelines from set time limits to a framework that accounts for the individual child, their specific use case, family relationships, and their environment.
The same logic should apply to AI chatbots. Rather than implementing blanket bans before understanding how young people actually use these tools, experts argue that the focus should be on building AI that deserves young people's trust, fosters their creativity, and keeps them grounded in the physical world with families, friendships, and communities. Banning is a reaction to bad design, but it doesn't lead to better, safer AI products for teens.
"Instead of sycophantic or romantic chatbot engagement, young people are purposefully creating characters that are angsty, transgressive, playful, creative and reflective," noted researchers analyzing the Character.AI community.
Research team, Association for Computing Machinery CHI Conference 2026
The findings challenge policymakers and technology companies to move beyond asking adults about their child's use of AI or testing AI products with fake accounts that assume certain use cases. Instead, the research emphasizes the importance of listening to young people, understanding their experiences, and incorporating their ideas for the future into product design. The answer is not to permanently keep young people away from AI, but rather to build AI systems that support their genuine creative and exploratory needs while maintaining appropriate safeguards.