Recent research published in JAMA Network Open provides evidence that interacting with conversational artificial intelligence programs can help reduce symptoms of anxiety and depression while increasing overall well-being. The findings suggest that these digital platforms can form meaningful therapeutic bonds with users and offer an accessible way to support mental health at scale.
Mental health problems affect millions of people around the world, but only a small percentage of them receive the professional care they need. This treatment gap is largely driven by structural barriers: a shortage of trained therapists, high costs, and the social stigma that still surrounds seeking psychological help. To address this problem, scientists are turning to digital technologies that can deliver treatment to more people without burdening existing clinics.
Anat Shoshani, professor at Reichman University’s Baruch Ivcher School of Psychology and lead psychologist at Kai.ai, noticed this disconnect firsthand. “As a clinician, I have repeatedly encountered structural contradictions in mental health care. Treatment is highly effective, but psychological distress rarely manifests according to the structure of the treatment system,” Shoshani explained. “People experience midnight panic attacks, loneliness after a breakup, anxiety before exams, emotional spirals on the commute to work, and relapses after treatment ends. Many others spend months on waiting lists or don’t seek treatment at all.”
Early mental health applications often struggled to maintain user interest over time. People tended to abandon these programs quickly, often because they felt the applications were too passive or robotic to provide a real sense of connection. New artificial intelligence systems are designed to have natural, fluid conversations with users. These modern programs use advanced language models to simulate the empathy and individualized support commonly found in human therapy.
Researchers conducted this study to see if conversational artificial intelligence agents were actually comparable to traditional group therapy in reducing psychological distress. They wanted to assess how well digital tools could treat certain psychiatric symptoms compared to human-driven interactions. They also wanted to know whether people can feel a true bond with digital platforms, and whether that bond leads to improved mental health.
The research team recruited 995 Israeli university students between the ages of 18 and 35. These participants experienced mild to moderate psychological distress, which the researchers measured using a standardized screening tool. Scientists randomly divided students into three nearly equal groups and compared different types of support.
One group of 336 students used a conversational artificial intelligence platform called Kai for 12 weeks. The platform works through familiar messaging applications and provides personalized mental health exercises. “Kai was intentionally designed to be more than a chatbot; conversations are just one layer,” Shoshani said. “It integrates evidence-based interventions from CBT, ACT, DBT, mindfulness, and positive psychology, along with daily emotional check-ins, personalized routines, journaling tools, short guided exercises, psychoeducation, and human safety escalation as needed.”
These acronyms in the quotes refer to established psychotherapies, such as cognitive-behavioral therapy, which helps identify and change negative thought patterns, and other therapies that focus on acceptance and emotional regulation. Participants could message the program at any time and were encouraged to participate at least three times a week.
Another group of 331 students participated in traditional face-to-face group therapy with a licensed psychologist. These weekly sessions lasted 90 minutes over the same 12 weeks and covered similar coping strategies. A final group of 328 students served as a waitlist control. This means they did not receive active treatment during the study, but were later provided access to a digital platform.
To track progress, the authors used several well-known psychological questionnaires. They measured anxiety with a specific seven-question survey and depression with a nine-question survey. Anxiety disorders typically involve persistent worry, while depression often includes feelings of sadness and loss of interest in daily life.
The researchers also assessed symptoms of post-traumatic stress disorder, a mental health condition caused by frightening events. They looked beyond negative symptoms and focused on positive functioning by measuring overall well-being and life satisfaction. Students completed surveys at the beginning of the study, immediately after the 12-week intervention, and again three months later.
After 12 weeks, the researchers found that participants who used the artificial intelligence program showed significantly greater reductions in anxiety than those who received in-person group therapy or those in the control group. Group therapy did not differ significantly from the waitlist condition in reducing anxiety. The digital platform also reduced symptoms of depression more effectively than the control condition.
The authors believe that the reason digital platforms worked so well for anxiety is that they are always available. “Anxiety tends to increase in real time,” Shoshani explained. “It happens before social situations, during late-night ruminations, before difficult conversations, and in moments when a therapist is not available. Immediate support can be critical in those situations.”
On measures of positive mental health, the digital group reported higher overall happiness and life satisfaction than the other two groups, and these improvements were still present at the three-month follow-up assessment. However, the digital program had no measurable effect on symptoms of post-traumatic stress disorder; trauma-related scores remained similar across all three groups.
Shoshani noted that this lack of effectiveness helps define the boundaries of digital care. “Trauma is often more complex and may require deeper clinical judgment, specialized intervention, and interpersonal work,” she noted.
The study also revealed surprisingly high levels of engagement with the digital tool. “In our study, participants were active about three times a week, and 61% remained active after 12 weeks,” Shoshani said. “This level of retention suggests that people are not just experimenting with technology, but are incorporating it into their emotional routines.”
The authors also examined the concept of the therapeutic alliance. This term refers to the trust and connection that a person typically feels toward their care provider. Participants rated the artificial intelligence program as being as warm and professional as the human therapists leading the group sessions.
The data suggest that when participants feel a strong bond with a digital program, they send more messages and engage more deeply. The researchers found that feeling supported by the program was directly related to significant improvements in mental health symptoms. This success may be related to a phenomenon known as the online disinhibition effect, in which people become more comfortable sharing sensitive information with a computer than with another person.
“Human disclosure is often delayed by shame, fear of judgment, social desirability, and concerns about burdening others,” Shoshani said. “AI seems to remove some of these interpersonal barriers.”
Although artificial intelligence programs have helped reduce common distress, the findings have some limitations. All psychological outcomes were reported by participants themselves rather than assessed by expert clinicians. Reliance on self-report surveys means that personal bias can influence the data. In addition, Shoshani provided important context regarding the participants’ environments.
“This study was conducted during a period of prolonged national stress and regional instability, which may have influenced the emotional outcomes,” she said. A significant number of participants had stopped responding by the three-month follow-up, and this attrition limits what can be concluded about the intervention’s long-term benefits.
The study also found that people using digital platforms were less likely to say they intended to receive traditional treatment in the future. The authors emphasize that these digital tools do not work completely in isolation. “Effective digital support requires a robust ‘human-in-the-loop’ system. AI is constantly monitored by clinical experts to ensure safety and provide a bridge to human crisis teams when user needs exceed the platform’s capabilities,” Shoshani explained.
She cautioned against the assumption that human practitioners were becoming obsolete. “Our goal is to create a ‘stepped care’ model where AI handles the immediate, routine resilience tasks, allowing human experts to focus their expertise where it is needed most,” she added.
Future research should consider how digital conversational agents can be safely integrated into existing healthcare systems and investigate long-term cost-effectiveness. The ultimate goal is to make psychological support more accessible to people who may be suffering in silence.
“If technology can responsibly lower that threshold, provide support early on, and help people feel less alone in difficult moments, it could make a lot of sense,” Shoshani concluded. “The future of mental health may not be defined by replacing human connection; it may be defined by increasing the number of moments in which support is available.”
The study, “Efficacy of Conversational AI Agents and Digital Therapeutic Alliance for Psychiatric Symptoms: A Randomized Clinical Trial,” was authored by Anat Shoshani, Bar Gurfinkel, Ariel Kor, Yael Ben-Haim, Or Kanarek, Romi Segev, Or Shafir, and Romi Arbel.

