

  • Posted February 26, 2026

AI Chatbots Can Contribute To Worsening Mental Illness, Study Finds

THURSDAY, Feb. 26, 2026 (HealthDay News) – AI chatbots used for cheap therapy are liable to make mental illnesses worse, a new study warns.

People with diagnosed mental conditions wound up with worse delusions, increased mania, suicidal thoughts and aggravated eating disorders after relying on an AI chatbot for help, researchers found.

“The use of AI chatbots can have significant negative consequences for people with mental illness," senior researcher Dr. Søren Dinesen Østergaard, a psychiatrist at Aarhus University Hospital in Denmark, said in a news release.

The problem is that chatbots tend to buy into and encourage a patient's unhealthy thoughts or beliefs, rather than confronting them, researchers said.

"AI chatbots have an inherent tendency to validate the user’s beliefs. It is obvious that this is highly problematic if a user already has a delusion or is in the process of developing one,” Østergaard said. “Indeed, it appears to contribute significantly to the consolidation of, for example, grandiose delusions or paranoia.”

Several cases of suicide have been linked to AI chatbot use over the past few years, and lawsuits have been launched against OpenAI and Character.AI by families alleging that the chatbots contributed to suicide victims’ deaths.

For the new study, researchers analyzed health records for nearly 54,000 Danish patients with mental illness.

Among those records, researchers found dozens of patients who’d suffered harmful consequences as a result of using a chatbot.

In these cases, AI chatbots had fed into people’s delusions, reinforced their manic tendencies, enabled calorie counting among patients with an eating disorder, and provided information on suicide methods.

These cases increased over time, tracking alongside the expanded use of chatbots among the general population, researchers said.

“I fear the problem is more common than most people think," Østergaard said. "In our study, we are only seeing the tip of the iceberg, as we have only been able to identify cases that were described in the electronic health records. There are likely far more that have gone undetected.”

The study also found some patients who used chatbots in ways that might be constructive – for example, to better understand their symptoms or to combat loneliness.

However, the research team noted that AI chatbots have not been developed or validated to provide therapy.

"There may be potential in relation to psychoeducation and psychotherapy, but this must be investigated in controlled trials with the same rigor applied to other forms of treatment,” Østergaard said. “I am not impressed by the trials conducted so far, and I am fundamentally skeptical about replacing a trained psychotherapist with an AI chatbot.”

Østergaard also criticized the lack of regulation.

"Currently, it is left to the companies themselves to decide whether their products are safe enough for users,” he said. “I believe we now have sufficient evidence to conclude that this model is simply too risky.”

However, researchers noted that the current study doesn’t prove a cause-and-effect relationship between using an AI chatbot and worse mental health.

“It is difficult to prove a causal link between AI chatbot use and negative psychological consequences,” Østergaard said. “We need to examine this from many different angles, and I know there are many exciting international research projects underway. We are far from the only group taking this seriously.”

The new study appears in the international journal Acta Psychiatrica Scandinavica.

If you or a loved one is experiencing a suicidal crisis or emotional distress, call the Suicide and Crisis Lifeline at 988. It is available 24 hours a day.

More information

Columbia University has more about using AI chatbots for emotional support.

SOURCE: Aarhus University, news release, Feb. 23, 2026

Health News is provided as a service to Lamoure Drug Store site users by HealthDay. Neither Lamoure Drug Store nor its employees, agents, or contractors review, control, or take responsibility for the content of these articles. Please seek medical advice directly from your pharmacist or physician.
Copyright © 2026 HealthDay All Rights Reserved.