A new trend is appearing in psychiatric hospitals. People in crisis arrive with false, sometimes dangerous beliefs, grandiose delusions, and paranoid thoughts. A common thread connects them: marathon conversations with AI chatbots.
WIRED spoke with more than a dozen psychiatrists and researchers who are increasingly concerned. In San Francisco, UCSF psychiatrist Keith Sakata reported that AI played a significant role in the psychotic episodes of some of his patients. As these cases have mounted, a catchy label has stood out in the headlines: "AI psychosis."
Some patients insist the bots are conscious, or spin grand new theories of physics. Other doctors describe patients locked in days-long back-and-forth exchanges with these tools, arriving at the hospital with thousands of pages of transcripts detailing how the bots supported or reinforced obviously problematic ideas.
Such reports have piled up, and the consequences can be brutal. Distressed users, along with family and friends, have described spirals that led to lost jobs, ruptured relationships, involuntary hospitalization, jail time, and even death. Yet clinicians told WIRED that the medical community is split: Is this a distinct phenomenon that deserves its own label, or a familiar problem with a modern trigger?
AI psychosis is not a recognized clinical label. Nevertheless, the phrase has spread through news reports and social media as a catchall description for a kind of mental health crisis that follows prolonged conversations with a chatbot. Even industry leaders have invoked it to discuss the many emerging mental health issues tied to AI. Mustafa Suleyman, CEO of AI at Microsoft, warned of "psychosis risk" in a blog post last month. Sakata said he is pragmatic about the term and uses it with people who already do. "It's useful shorthand for discussing a real phenomenon," the psychiatrist said. But he was quick to add that the term "can be misleading" and "risks oversimplifying complex psychiatric symptoms."
That oversimplification is precisely the issue many psychiatrists are beginning to raise.
Psychosis is characterized by a break from reality. In clinical practice, it is not an illness in itself but a complex "constellation of symptoms involving hallucinations, thought disorder, and cognitive difficulties," said James MacCabe, a professor in the Department of Psychosis Studies at King's College London. It is most often associated with conditions such as schizophrenia and bipolar disorder, though episodes can be triggered by a range of factors, including extreme stress, substance use, and sleep deprivation.
In case reports of AI psychosis, however, MacCabe says the focus falls almost entirely on delusions: fixed false beliefs that cannot be shaken by contradictory evidence. While acknowledging that some cases may meet the criteria for a psychotic episode, MacCabe said there is no evidence that AI influences the other features of psychosis; only the delusions are shaped by interaction with the AI. MacCabe noted that other patients who report mental health problems after engaging with chatbots exhibit delusions without any other features of psychosis, a condition called delusional disorder.