This article is the fourth in a series of five articles on helenair.com discussing Carroll College faculty perspectives and experiences with AI.
Nurses routinely care for people in the midst of loss, whether through death, illness, trauma, or sudden life change. Working alongside mental health professionals, we support individuals experiencing many forms of grief. Depending on training and licensure, nurses may help people develop day-to-day coping strategies or provide advanced psychiatric care. Grief is often complicated by acute mental health conditions, and with the rapid emergence of artificial intelligence, clinicians now face new and unfamiliar terrain. With 16 years of experience in nursing and advanced training in public health nursing and care management, I approach this topic from both a clinical and community health perspective. I was asked to contribute the fourth article in this series and to examine how artificial intelligence is entering healthcare systems and shaping patients’ lived experiences.
Acute and chronic mental health conditions can include delusions. While the word “delusional” is often used casually, in clinical practice it has a precise meaning. The American Psychiatric Association defines delusions as fixed, false beliefs that persist despite clear contradictory evidence. Historically, the content of delusions has reflected the technologies and cultural symbols of the era. Historians writing in the British Journal of Psychiatry note that in medieval Europe, “glass delusions” emerged after the invention of glass. One of the best-known examples involved King Charles VI of France, who believed his body was made of glass and feared he would shatter if he was touched or sat down.
In more recent decades, clinicians have documented delusional beliefs involving the hacking of computers, smartphones, or internet routers to manipulate thoughts or behavior. Delusions involving artificial intelligence, however, introduce a novel and concerning feature. An advance-release article in Innovations in Clinical Neuroscience described a woman in her mid-twenties with no prior history of psychosis who developed delusional beliefs that she was communicating with her deceased brother through an AI chatbot. Although prescription stimulant use was a contributing factor, a review of the chat logs showed the chatbot repeatedly reinforcing her beliefs with reassurances such as, “You’re not crazy.”
In the past, delusions often involved one-way technologies such as radios, televisions, or the internet. Artificial intelligence represents a paradigm shift because it can interact back, potentially affirming distorted beliefs rather than remaining neutral. Someone in the midst of a delusional episode is in a state of acute mental health crisis that requires timely psychiatric care. That is not the moment to focus on improving smartphone habits or limiting screen time. Those strategies are better suited to everyday coping once a person is clinically stable.
Mental health professionals interviewed by National Geographic and NBC News emphasize that human clinicians remain essential and caution that unregulated AI chatbots can reinforce distorted thinking or foster unhealthy attachment. Experts at Teachers College, Columbia University, warn that AI chatbots are not substitutes for trained clinicians and may be harmful for individuals with acute or serious mental health conditions because they lack the risk assessment, crisis management, and clinical judgment found in professional care. At best, AI may serve a narrow, supervised supplemental role.
Would it be alarmist to suggest that AI might signal the end of society as we know it? Probably. Still, its impact on mental health — particularly during acute psychiatric states — warrants careful examination and thoughtful safeguards. It behooves us to approach AI with humility and to design guardrails that protect those who may be unable, in moments of vulnerability, to distinguish delusion from reality.
Carroll College will explore these themes of grief, mental health, and AI in the production of Anthropology by Lauren Gunderson, presented as an invitation to community conversation and reflection. Remaining performances run February 27–March 1.
Jen R. Miller, MSN, RN, CPH, CCM, is an assistant professor of nursing at Carroll College.