
AI Enhances Suicide Risk Detection in Medical Clinics

by Swethapriya Sampath on Jan 4 2025 3:00 PM

AI improves suicide risk screening when combined with interruptive alerts to doctors.

Artificial Intelligence (AI) helps doctors identify patients who are at risk of suicide, enhancing clinical alerts and improving prevention measures (1).
A new study from Vanderbilt University Medical Center (VUMC), led by Colin Walsh, MD, MA, associate professor of Biomedical Informatics, Medicine and Psychiatry, tested the center's AI system, called the Vanderbilt Suicide Attempt and Ideation Likelihood model (VSAIL).

The AI system was tested in three neurology clinics at VUMC to see whether it could help doctors identify patients at risk of suicide during their regular clinic visits. The study, published in JAMA Network Open, compared automatic pop-up alerts that interrupt the doctor during a consultation with a passive system that displays the risk information in the patient's electronic chart.



AI Detects Suicide Risk in Patients

The study found that the interruptive alerts were far more effective, leading doctors to conduct suicide risk assessments in connection with 42% of screening alerts, compared to just 4% with the passive system.

"Most people who die by suicide have seen a health care provider in the year before their death, often for reasons unrelated to mental health," Walsh said. "But universal screening isn't practical in every setting. We developed VSAIL to help identify high-risk patients and prompt focused screening conversations."

Suicide has been on the rise in the U.S. for a generation and is estimated to claim the lives of 14.2 in 100,000 Americans each year, making it the nation’s 11th leading cause of death. Studies have shown that 77% of people who die by suicide have contact with primary care providers in the year before their death.

Calls to improve risk screening have led researchers to explore ways to identify patients most in need of assessment. The VSAIL model, which Walsh's team developed at Vanderbilt, analyzes routine information from electronic health records to calculate a patient's 30-day risk of suicide attempt. In earlier prospective testing, where VUMC patient records were flagged but no alerts were fired, the model proved effective at identifying high-risk patients, with one in 23 individuals flagged by the system later reporting suicidal thoughts.



Clinical Alerts Improve Suicide Risk Screening

In the new study, when patients identified as high risk by VSAIL came for appointments at Vanderbilt's neurology clinics, their doctors were randomly assigned to receive either interruptive or non-interruptive alerts. The research focused on neurology clinics because certain neurological conditions are associated with increased suicide risk.

The researchers suggested that similar systems could be tested in other medical settings. "The automated system flagged only about 8% of all patient visits for screening," Walsh said. "This selective approach makes it more feasible for busy clinics to implement suicide prevention efforts."

The study involved 7,732 patient visits over six months, prompting 596 screening alerts in total. During the 30-day follow-up period, a review of VUMC health records found that no patients in either randomized alert group had experienced episodes of suicidal ideation or attempted suicide. While the interruptive alerts were more effective at prompting screenings, they could potentially contribute to "alert fatigue", in which doctors become overwhelmed by frequent automated notifications. The researchers noted that future studies should examine this concern.



AI Alerts for Suicide Prevention

"Healthcare systems need to balance the effectiveness of interruptive alerts against their potential downsides," Walsh said. "But these results suggest that automated risk detection combined with well-designed alerts could help us identify more patients who need suicide prevention services."

Others on the study from VUMC included Michael Ripperger, BS, Laurie Novak, PhD, Carrie Reale, MSN, Shilo Anders, PhD, Ashley Spann, MD, Jhansi Kolli, BS, Katelyn Robinson, BA, Qingxia Chen, PhD, David Isaacs, MD, Lealani Mae Acosta, MD, Fenna Phibbs, MD, Elliot Fielstein, PhD, Drew Wilimitis, BS, Katherine Musacchio Schafer, PhD, Dan Albert, MS, Jill Shelton, BSN, Jessica Stroh, BSN, and co-senior authors William Stead, MD, and Kevin Johnson, MD, MS.

The study was supported by the Evelyn Selby Stead Fund for Innovation (VUMC), FDA Sentinel, Wellcome Leap MCPsych, and the National Institutes of Health (grants MH118233, MH12145, MH116269, MH120122, HG009034, HG012510).

Reference:
  1. Risk Model–Guided Clinical Decision Support for Suicide Screening - (https://jamanetwork.com/journals/jamanetworkopen/fullarticle/2828654)


Source: Eurekalert


