The teen rebellion over disappearing AI chatbots has officially begun. After months of warnings, the popular chatbot platform Character.AI has started cutting off access for users under 18, and the Wall Street Journal reports that teens have been taking it... not well. Posts of grief and frustration are popping up across Reddit, where one teen wrote, "I cried over it for days." The company had already imposed a two-hour daily limit in November as a first step toward restricting underage users, but the policy has now escalated into a full ban following the deaths of two teen users and mounting scrutiny from regulators and mental health professionals. For many young users, the shift doesn't just mean losing an app; it means losing relationships they felt were meaningful. "I'm losing the memories I had with these bots," says 13-year-old Olga Lopez.
Several teens told the Journal they relied on chatbots for comfort when human conversations felt too hard or scarce. "I use this app for comfort when I can't talk to my friends or therapist," says one teen. Dr. Nina Vasan of Stanford Medicine says AI companions can feel like a blend of friend, performer, and mirror. "The difficulty logging off doesn't mean something is wrong with the teen," she says. "It means the tech worked exactly as designed." Character.AI's chief executive, Karandeep Anand, says the company felt compelled to intervene as it observed teens using bots for hours at a time or veering toward restricted topics. "This wasn't a very hard decision," he says.
To guide the rollout, Character.AI consulted with teens through the nonprofit ConnectSafely, aiming to communicate the ban clearly, avoid condescension, and give young users time to download their chat histories. "We wanted to make sure teens didn't feel abandoned," says ConnectSafely's Julianna Bryant. Character.AI apologized for the ban, telling teens in a letter: "We are deeply sorry." Still, it insists the restriction is necessary. Meanwhile, mental health professionals tell CNBC that abruptly cutting off access to chatbots may itself be stressful for dependent users. Character.AI says it is aware of that risk: the platform is adding emotional support tools and partnering with Koko and ThroughLine to help at-risk users find real-world help. Yet when the cutoff message finally appears, the app offers only a quiet goodbye: "A new chapter begins."