After experiencing the intense academic culture of elite private schools, Alex Mathew, a 16-year-old based in Austin, Texas, saw a new use for large language model (LLM) technology. Rather than applying LLMs to cancer detection or essay writing, Mathew seeks to improve teenage mental health with Berry AI.
Berry AI, Mathew’s new company, aims to produce a mass-market AI companion in the form of an interactive plushie. Armed with ChatGPT and a voice chip, Berry is slated to go on sale this summer for $100, plus a monthly subscription fee of $10 to $20.
Mathew said Berry is meant to help users reframe their situations. “You have to know when to prioritize what you’re working on, who to hang out with, that kind of thing,” he said. “And so just getting clear over time on what your priorities are, what’s important to you, and how [do you] really take action on that?”
Berry is not meant to replace traditional mental health resources, but to provide an outlet for conversation. “It’s not supposed to be high pressure, like a therapy session,” Mathew said in an interview with The Urban Legend. “It’s just supposed to be a literal conversation with your friend [where you] just talk about the things you’ve done that day.”
Berry uses a modified version of ChatGPT that Mathew created and named Serenity. Mathew designed Serenity to dispense mental health advice and build what the company calls an accountability framework. “We created a framework for you to build amazing habits and live the life you want to live. A huge gap right now is in taking action on advice you get, so we make it easy to do so,” Berry’s website reads.
There are concerns about Berry’s business model and the wider ethics of for-profit companies in the mental health space. “I think being a for-profit business is kind of antithetical to the idea of empathy. … It is an AI that can reflect your own thinking towards you, but it can’t show you compassion,” Harper Lind ’27 said.
The Urban Legend asked Serenity about concerns around the use of AI in mental health. The LLM described itself as a first step, supplementing more traditional mental health services.
“For some people, especially those who feel isolated, judged or not yet ready to open up, talking to a chatbot can be a first safe step. It’s always available, doesn’t interrupt and never shames,” Serenity wrote.
Serenity made it clear, however, that it is not intended to be a stand-in for any sort of human contact. “The ultimate goal should be more human connection, not less,” Serenity wrote. “AI should help open the door to real-life conversations, not replace them.”
The future of Berry and similar products is unclear, due both to their novelty and to questions about their effectiveness. “I think it’s an interesting idea. … I’m just afraid of how something sets out to be good, but then is misused,” Director of Counseling Services Amina Samake said.
Berry has implemented safeguards to prevent users from becoming overly dependent on its plushies. “You can’t talk to Berry for more than 60 minutes a day, because I don’t want this to become a character AI — or like a friend,” Mathew said.
Mathew also described how the company handles mentions of self-harm or suicide. “Whenever you talk about [self-harm], it will ping [an] emergency contact … [over] call or text,” Mathew said. “[This] state of mind … [is] not something that AI should be handling. It should be handled by another person.”
Samake raised concerns about AI’s storage of private information and the risk of data breaches, especially in a mental health context. “Those systems could be hacked and people could get information that is not appropriate,” she said. Berry’s Frequently Asked Questions (FAQ) page notes that the company uses industry-standard encryption and cannot view any conversations users have with the plushie.
Many also worry about what connecting with a chatbot will mean for relationships with other people. “When you have this [bot] that is solely focused on you, it is taking your information and then figuring out ways to say it back to you that keeps you connected to it,” Samake said. “I just wonder what that means for the human bond in general, for that person and how they then connect with actual human beings.”