
Artificial Intelligence poised to intervene in potential crises by identifying individuals showing signs of suicidal ideation

AI System at Carnegie Mellon University Successfully Distinguishes People with Suicidal Ideation from Controls in Preliminary Tests

Researchers at Carnegie Mellon University have developed an innovative AI system that could revolutionise mental health assessments. This groundbreaking technology reads patterns of brain activity, potentially detecting subtle shifts in how someone's brain responds to words like "carefree" or "praise" before they even recognise they're at risk.

The AI system, which combines psychology, neuroscience, and machine learning, is currently being tested on a larger scale. Researchers are seeking funding and partners to expand their studies, aiming to make this technology widely available for mental health patients in the near future.

The research fits into a broader trend in neuroscience: decoding how the brain represents thoughts, including emotionally loaded concepts. In this case, the AI system identified six key words and five brain regions that consistently distinguished suicidal participants from neurotypical controls.

The standout terms that the AI system used to identify suicidal individuals were "death", "cruelty", "trouble", "carefree", "good", and "praise". In a second analysis, the AI system was able to distinguish, with 94% accuracy, between those who had previously attempted suicide and those who hadn't.
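To make the analysis more concrete, here is a minimal sketch of the kind of classification pipeline such a study might use. The classifier choice (Gaussian Naive Bayes), the leave-one-out cross-validation scheme, and the feature layout (one activation value per word-region pair) are illustrative assumptions, not details confirmed by the article, and the data below are random placeholders standing in for real fMRI measurements.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(0)

n_participants = 34      # 17 with suicidal ideation, 17 controls (from the article)
n_features = 6 * 5       # six key words x five brain regions (from the article)

# Placeholder activation values standing in for real per-word, per-region fMRI data.
X = rng.normal(size=(n_participants, n_features))
y = np.array([1] * 17 + [0] * 17)   # 1 = suicidal-ideation group, 0 = control group

# Leave-one-out cross-validation: train on 33 participants, test on the held-out one,
# and repeat for each participant; the mean score is the overall accuracy.
clf = GaussianNB()
scores = cross_val_score(clf, X, y, cv=LeaveOneOut())
print(f"Leave-one-out accuracy: {scores.mean():.2f}")
```

In the actual study, each feature would come from measured neural responses rather than random numbers, and the cross-validated accuracy is what a reported figure such as 94% would correspond to.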

The ultimate goal is to develop a scalable, affordable way to detect suicidal ideation in everyday clinical settings. If successful, this could pave the way for wearable screening tools, such as smartwatch-based monitors.

However, ethicists, clinicians, technologists, and patients will need to work together to establish boundaries, best practices, and compassionate implementation for this technology. As with any advancement in healthcare, the benefits must be weighed against potential risks and privacy concerns.

In the quest for a more accessible tool for early suicide detection, the research team is exploring alternatives to fMRI machines, such as electroencephalography (EEG). EEG is a cheaper, portable method that could form the basis of practical screening tools.

The AI system was initially tested on 34 young adults: 17 with a history of suicidal thoughts and 17 neurotypical controls. The researchers plan to expand their participant pool, test new imaging technologies, and refine the machine-learning algorithms.

This breakthrough could mark a significant step forward in suicide prevention and mental health assessment. By combining these emotional fingerprints with AI pattern recognition, researchers are approaching a new frontier: the ability to "see" suffering before it becomes unmanageable.
