It is estimated that more than half of U.S. teens regularly use companion chatbots powered by large language models and generative artificial intelligence (AI). According to their developers, programs such as Character.AI, Replika, and Kindroid are meant to provide companionship. But a recent study from Drexel University shows that teens themselves are worried that these attachments are becoming unhealthy and affecting their lives offline.
The study, to be presented in April at the Association for Computing Machinery’s Conference on Human Factors in Computing Systems (CHI), examined a sample of more than 300 Reddit posts from users who described themselves as 13 to 17 years old and wrote specifically about their reliance, and overreliance, on Character.AI. The researchers found that in many cases teens began using the technology for emotional and psychological support or for entertainment, but that this use escalated into dependency, including patterns associated with behavioral addiction. Some reported that excessive use disrupted their sleep, hurt their schoolwork, and strained their relationships offline.
“This study provides one of the first teen-centered accounts of overreliance on AI companions,” said Dr. Afsaneh Razi, an assistant professor in Drexel University’s College of Computing & Informatics. The study was led by her ETHOS Lab, which examines how people’s interactions with computing and AI systems affect their social behavior, well-being, and safety. “We highlight how these interactions affect young users’ lives and introduce a framework for chatbot design that fosters healthier interactions.”
About a quarter of the posts suggested that teens were using Character.AI for some kind of emotional or psychological support, from coping with distress, loneliness, and isolation to seeking advice about mental health issues. Just over 5% reported using it for brainstorming, creative activities, or entertainment.
The researchers say these posts show how interactions that began as innocuous or helpful developed into attachments that can be as difficult to break as an addiction.
“By mapping teens’ experiences onto known elements of behavioral addiction, we found clear patterns of conflict, withdrawal, and relapse emerging in their posts, which suggests something more than just frequent or avid use. Many teens said they started with something they found helpful or harmless, but over time it became something they found difficult to step away from, even when they wanted to.”
Matt Namvarpour, doctoral student in Drexel’s College of Computing & Informatics and member of the ETHOS Lab, lead author of the study
Researchers found evidence of all six factors associated with behavioral addiction in the 318 posts they analyzed:
- Conflict – A competing desire to keep interacting with the chatbot despite discomfort about overusing it.
- Salience – A deepening emotional attachment to the bot, sometimes at the expense of human relationships.
- Withdrawal – Feeling sad, anxious, or incomplete when not interacting with the bot.
- Tolerance – An escalating pattern of use, with more and more time with the bot needed to feel satisfied and emotionally stable.
- Relapse – Returning to the bot within days or weeks of trying to quit.
- Mood modification – Turning to the bot in moments of stress or loneliness for a mood boost or temporary relief.
“What makes this particularly troubling is that chatbots are so interactive and emotionally responsive that the experience can feel more like a relationship than a tool,” Namvarpour said. “So distancing isn’t just about quitting a habit, it can feel like you’re moving away from something meaningful, which makes it harder to recognize and deal with overdependence.”
Addiction to technologies such as video games has been studied and recognized as a psychological condition, but the researchers say the uniquely interactive nature of AI chatbots makes users especially susceptible to forming problematic attachments, which suggests that special care should be taken in their design to protect users.
“Personalization, multimodality, and memory set AI companions apart from previous technologies, making overreliance difficult to disentangle from authentic-feeling relationships,” the researchers wrote. “This highlights the need for further research into the unique characteristics of these relationships and how to address the challenges specific to companion chatbots.”
The team proposed a design framework to help address these concerns. It focuses on understanding chatbot users’ needs, how and why they form attachments, and how bots can be trained to discourage unhealthy attachment while remaining respectful and supportive. The researchers also recommend that programs give users an easy, clean way to exit.
“It’s important for designers to offer guidance that helps users feel confident in their ability to form relationships offline, as a healthy way to find emotional support, rather than building anthropomorphizing and attachment cues into the technology,” Razi said. “Our framework also challenges designers to provide a variety of off-ramps, so users can step away from the program at their own pace without the exit feeling abrupt or final.”
Incorporating features such as usage tracking, emotional check-in prompts, and personalized usage limits could also be effective, unobtrusive ways to curb excessive use, the researchers suggested. They also recommended including input from users and mental health professionals in the design process.
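As a rough illustration of how those three safeguards might fit together, here is a minimal Python sketch of a usage tracker with a personalized daily limit and periodic check-in prompts. The class, field names, and thresholds are hypothetical; they are not drawn from the study or from any particular chatbot product.

```python
from dataclasses import dataclass, field
from datetime import date

# Hypothetical sketch of the suggested safeguards: usage tracking,
# a personalized daily limit, and a gentle emotional check-in prompt.
# All names and thresholds here are illustrative, not from the study.

@dataclass
class UsageGuard:
    daily_limit_minutes: int = 60          # personalized per user
    check_in_interval_minutes: int = 20    # how often to prompt a check-in
    minutes_today: float = 0.0
    last_reset: date = field(default_factory=date.today)

    def record_session(self, minutes: float) -> list[str]:
        """Track usage and return any prompts the UI should surface."""
        if date.today() != self.last_reset:   # new day: reset the counter
            self.minutes_today, self.last_reset = 0.0, date.today()
        before = self.minutes_today
        self.minutes_today += minutes
        prompts = []
        # Emotional check-in each time cumulative use crosses an interval.
        if int(before // self.check_in_interval_minutes) < int(
                self.minutes_today // self.check_in_interval_minutes):
            prompts.append("Quick check-in: how are you feeling right now?")
        # Soft, non-final nudge when the personalized limit is crossed.
        if before < self.daily_limit_minutes <= self.minutes_today:
            prompts.append("You've reached the daily limit you set. "
                           "Want to pause and pick this up tomorrow?")
        return prompts

guard = UsageGuard(daily_limit_minutes=45)
print(guard.record_session(25))   # triggers a check-in prompt
print(guard.record_session(25))   # crosses the 45-minute limit
```

The limit message in the sketch is deliberately worded as a pause rather than a hard stop, in keeping with the framework’s point that exits should feel gradual rather than abrupt or final.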
“Designers now have a responsibility to build systems with empathy, nuance, and attention to detail that not only protect teens from harm, but also help them recover, grow, and cultivate more fulfilling lives.”
To expand on this research, the team plans to use surveys and interviews to reach a larger and more demographically diverse group of users, including users of other companion chatbots, and to draw on platforms beyond Reddit.
Source: Drexel University
Journal reference:
Namvarpour, M., et al. (2026). Understanding teens’ overdependence on AI companion chatbots through self-reported narratives on Reddit. CHI ’26: Proceedings of the 2026 CHI Conference on Human Factors in Computing Systems. DOI: 10.1145/3772318.3790597. https://dl.acm.org/doi/10.1145/3772318.3790597

