Your future therapist may be a chatbot, and you may even see positive results. But that doesn't mean you should start telling ChatGPT how you feel just yet.
A new study by Dartmouth researchers found that a generative AI tool designed to act as a therapist helped people with depression, anxiety and eating disorders, but human experts still needed to keep a close eye on the tool.
The study, published in March in the journal NEJM AI, tested Therabot, a smartphone app developed at Dartmouth over the past few years, with 106 people.
That's a small sample, but the researchers say it's the first clinical trial of an AI therapy chatbot. The results show significant promise, in part because the bot is available 24 hours a day, bridging gaps patients face with traditional therapy. But the researchers warn that generative AI-assisted therapy, done incorrectly, can be dangerous.
"I think there's a lot more evolution to come in this space," said Nick Jacobson, the study's senior author and an associate professor of biomedical data science and psychiatry at Dartmouth. "The personalized, scalable impact you can achieve is remarkable."
Read more: Apple’s AI Doctors May See You in 2026
Therabot Research
The 210 participants were divided into two groups: one group of 106 people was allowed to use the chatbot, while a control group was placed on a "waiting list." Participants' symptoms of anxiety, depression or eating disorders were assessed during and after the trial. For the first four weeks, the app prompted its users to engage with it every day. For the second four weeks, the prompts stopped, but people could still engage on their own.
The study participants actually used the app, and the researchers said they were surprised by how much people communicated with the bot, and how. Afterward, participants reported a level of "therapeutic alliance" — the trust and collaboration between patient and therapist — similar to what they would have with an in-person therapist.
The timing of the interactions is also notable: usage peaked late at night, when patients often feel anxious. Those are especially difficult hours in which to reach a human therapist.
"With Therabot, people had access throughout the trial, in their daily lives, and could really use it," Jacobson said. That includes moments like struggling to fall asleep at 2 a.m. because of anxiety.
https://www.youtube.com/watch?v=fduhq6_fe9i
By the end of the trial, patients' assessments showed a 51% drop in symptoms of major depressive disorder, a 31% drop in symptoms of generalized anxiety disorder and a 19% drop in symptoms of eating disorders among patients at risk for those specific conditions.
"The people who entered the trial weren't just mild cases," Jacobson said. "In depression, for example, the people in the group were moderately to severely depressed. But on average, their symptoms dropped by about 50%, going from severe to mild, or from moderate to nearly none."
What makes Therabot unique
The research team didn't simply take more than 100 people who needed support and turn them loose on a large language model, such as OpenAI's ChatGPT, to see what would happen. Therabot is custom fine-tuned to follow specific therapeutic procedures. It's built to watch for serious problems, such as potential signs of self-harm, and to flag them so a human can intervene when needed. Humans also monitored the bot's conversations to step in when the bot said something it shouldn't have.
Jacobson said that during the first four weeks of the study, given the uncertainty about how the bot would perform, he read every message it sent as soon as he could. "I did not get a lot of sleep in the first part of the trial," he said.
Jacobson said human intervention was rare. Testing of earlier models two years ago showed that more than 90% of responses were consistent with best practices. When researchers did intervene, it was usually because the bot offered advice outside a therapist's scope, such as when it tried to give more general medical advice, like how to treat a sexually transmitted disease, instead of referring the patient to a healthcare provider. "The actual advice it gave was reasonable, but it was outside the scope of care we were offering," he said.
Therabot is not your typical large language model; it's essentially hand-trained. Jacobson said a team of more than 100 people created a dataset using best practices for how a therapist should respond to real human experiences. "Only the highest-quality data went into it," he said. Similar general-purpose models, such as Google's Gemini or Anthropic's Claude, are trained on far more data than just the medical literature and may respond incorrectly.
Can generative AI be your therapist?
The Dartmouth study is an early sign that purpose-built tools using generative AI may be helpful in some cases, but it doesn't mean any AI chatbot can be your therapist. This was a controlled study with human experts monitoring it; trying the same thing on your own carries real risks.
Remember that most general-purpose large language models are trained on oceans of data scraped from the internet. So while they can sometimes offer good mental health guidance, they've also absorbed bad information, such as the behavior of fictional therapists or things people post about mental health on online forums.
"In a mental health context, there are a lot of ways they can act problematically," he said.
Even a chatbot that provides useful suggestions can be harmful in the wrong context. Jacobson said that if you tell a chatbot you're trying to lose weight, it will come up with ways to help you. But if you're dealing with an eating disorder, that same advice could be harmful.
Many people are already using chatbots to perform tasks that approximate a therapist's work. Jacobson said you should be careful.
"The quality is very close to that of the internet it was trained on," he said. "Is there great content there? Yes. Is there dangerous content there? Yes."
Jacobson said to treat anything you get from a chatbot with the same skepticism you would bring to a random website. Even if it looks more polished coming from a gen AI tool, it may still be unreliable.
If you or someone you love is struggling with an eating disorder, contact the National Eating Disorders Association for resources that can help. If you feel like you or someone you know is in immediate danger, dial 988 or text "NEDA" to 741741 to reach the Crisis Text Line.