Teenage suicide is extremely difficult to predict. That’s why some experts are turning to machines for help.
By Peter Holley
Researchers from Cincinnati Children’s Hospital Medical Center are testing an app in schools that analyzes language to determine whether teens are at risk for suicide. They call it Spreading Activation Mobile or “SAM.” (John Pestian)
In any given week, Ben Crotte, a behavioral health therapist at Children’s Home of Cincinnati, speaks to dozens of students in need of an outlet.
Their challenges run the adolescent gamut, from minor stress about an upcoming test to severe depression, social isolation and bullying.
Amid the flood of conversations, meetings and paperwork, the challenge for Crotte — and mental health professionals everywhere — is separating hopeless expressions of pain and suffering from crucial warning signs that suggest a student is at risk for committing suicide.
It’s a daunting, high-pressure task, which explains why Crotte was willing to add another potentially useful tool to his diagnostic kit: an app that uses an algorithm to analyze speech and determine whether someone is likely to take their own life.
Its name: “Spreading Activation Mobile,” or “SAM.”
“Losing a child is my worst nightmare, and we all live with the fear that we might miss something,” Crotte said, referring to mental health professionals who work in education. “Sometimes we have to go with our gut to make a decision, so this is one more tool to help me make a final determination about someone’s health.”
SAM is being tested in a handful of Cincinnati schools this year and arrives at a time when researchers across the country are developing new forms of artificial intelligence that may forever change the way mental health issues are diagnosed and treated.
John Pestian — a professor in the divisions of Biomedical Informatics and Psychiatry at Cincinnati Children’s Hospital Medical Center — has spent years refining SAM. (John Pestian)
Rates of teen suicide, in particular, are on the rise, with the rate among teen girls hitting a 40-year high in 2015, according to the Centers for Disease Control and Prevention. Over the past decade, the CDC reports, suicide rates doubled among teen girls and jumped by more than 30 percent among teen boys.
Despite being the 10th leading cause of death in the United States, suicide remains extremely difficult to predict. Experts say that’s because many people’s risk for self-harm is paired with another mental illness and fluctuates according to various stressors in their lives, all of which interact uniquely within each individual. Complicating matters is that suicidal ideation — which can signal a growing risk for self-harm — is far more common than actual suicide. To assess risk, mental health professionals have long relied on timeworn tools — notepads, conversation and well-honed intuition. Now artificial intelligence — combined with the widespread use of smartphones — is beginning to change the way experts interpret human behavior and predict self-harm.
“Technology is here to stay, and if we can use it to prevent suicide, we should do that,” said physician Jill Harkavy-Friedman, vice president of research at the American Foundation for Suicide Prevention. “But we’re in the very early stages of learning how to use technology in this space.”
There are thousands of apps dedicated to improving mental health, but experts say the most promising will, like SAM, begin to incorporate predictive machine-learning algorithms into their design. By analyzing a patient’s language, emotional state and social media footprint, these algorithms will be able to assemble increasingly accurate predictive portraits of patients, drawing on data that is far beyond the reach of even the most experienced clinicians.
“A machine will find 100 other pieces of data that your phone has access to that you wouldn’t be able to measure as a psychiatrist or general practitioner who sees someone for a half-hour a few times a year,” said Chris Danforth, a University of Vermont researcher who helped develop an algorithm that can spot signs of depression by analyzing social media posts.
Using data from more than 5,000 adult patients with a potential for self-harm, Colin Walsh, a data scientist at Vanderbilt University Medical Center, also created machine-learning algorithms that predict — with more than 90 percent accuracy — the likelihood that someone will attempt suicide within the next week. The risk detection is based on such information as the patient’s age, gender, Zip code, medications and prior diagnoses.
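To make that approach concrete, here is a minimal sketch of a structured-feature risk model in Python with scikit-learn. The synthetic data, feature encodings and choice of logistic regression are illustrative assumptions for this article, not a description of Walsh’s actual system.

```python
# A minimal sketch of a structured-feature risk model, in the spirit of the
# approach described above. All data here is synthetic and all feature
# choices are assumptions, not Walsh's actual pipeline.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000

# Synthetic stand-ins for the record fields mentioned in the article:
# age, gender, region (coarse Zip-code bucket), medications, prior diagnoses.
X = np.column_stack([
    rng.integers(12, 90, n),   # age
    rng.integers(0, 2, n),     # gender (encoded)
    rng.integers(0, 50, n),    # coarse Zip-code bucket
    rng.poisson(2, n),         # number of active medications
    rng.integers(0, 2, n),     # prior self-harm diagnosis flag
])
# Synthetic labels: attempt within the next week (a rare outcome).
y = (rng.random(n) < 0.02 + 0.05 * X[:, 4]).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

model = LogisticRegression(max_iter=1000, class_weight="balanced")
model.fit(X_train, y_train)

# Report AUC, a ranking measure, rather than raw accuracy, since rare
# outcomes make accuracy look deceptively high on imbalanced data.
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```

Because suicide attempts are rare events, the sketch scores the model by how well it ranks patients by risk rather than by raw accuracy, which is one reason headline accuracy figures for such systems should be read carefully.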
Danforth’s algorithm — which he developed with Harvard researcher Andrew Reece — can spot signs of depression by analyzing the tone of a patient’s Instagram feed. The pair created a second algorithm that pinpoints the rise and fall of someone’s mental illness by scanning the language, word count, speech patterns and degree of activity on their Twitter feed. A task that would require days of research for a clinician was accomplished by the machine in a matter of seconds.
“The dominant contributor to the difference between depressed and healthy classes was an increase in usage of negative words by the depressed class, including ‘don’t,’ ‘no,’ ‘not,’ ‘murder,’ ‘death,’ ‘never’ and ‘sad,’ ” the researchers wrote in their latest study identifying mental illness on Twitter. “The second largest contributor was a decrease in positive language by the depressed class, relative to the healthy class, including fewer appearances of ‘photo,’ ‘happy,’ ‘love,’ and ‘fun.’ ”
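To illustrate the word-frequency signal the researchers describe, here is a toy sketch in Python. The marker-word lists come from the quote above; the example posts, tokenizer and scoring are invented for illustration and are far simpler than the published models.

```python
# A toy illustration of the word-frequency signal described in the study
# quote: compare how often negative and positive marker words appear in
# two sets of posts. The example posts below are invented.
import re
from collections import Counter

NEGATIVE = {"don't", "no", "not", "murder", "death", "never", "sad"}
POSITIVE = {"photo", "happy", "love", "fun"}

def marker_rates(posts):
    """Return the per-word rate of negative and positive marker words."""
    words = re.findall(r"[a-z']+", " ".join(posts).lower())
    counts = Counter(words)
    total = max(len(words), 1)
    neg = sum(counts[w] for w in NEGATIVE) / total
    pos = sum(counts[w] for w in POSITIVE) / total
    return neg, pos

depressed_posts = ["never going to feel okay", "so sad, no point, not today"]
healthy_posts = ["love this photo of us", "what a fun day, happy to be here"]

print("depressed posts:", marker_rates(depressed_posts))
print("healthy posts:  ", marker_rates(healthy_posts))
```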
Danforth said mental health professionals are still dependent on the Diagnostic and Statistical Manual of Mental Disorders (the DSM) and one-on-one interviews, but he believes the data being amassed by smartphones means mental health is on the verge of a “digital revolution.”
“We’re already talking with doctors at the University of Vermont who want to build a screening tool for the emergency room that would ask people whether they’d be willing to have an algorithm look at their social media history,” Danforth noted.
The tools, however, will only be as good as the data used to train the machine-learning algorithms, Harkavy-Friedman said, noting that there is a lack of longitudinal studies on suicide. Social media will offer important information, she said, but any population being studied will always include false positives — people who exhibit suicidal behaviors but don’t go on to end their own lives.
“The more we can learn about the factors leading to suicide, the better off we’ll be,” she said. “We need a huge number of people to study.”
Experts said it could take another five to 10 years to create algorithms predictive enough to be reliably deployed inside hospitals, schools and therapists’ offices. Other questions will have to be resolved as well, experts said, such as whether predictive algorithms will affect health insurance premiums, and what happens if drug companies manage to access people’s predictive data.
John Pestian, a clinical scientist and professor in the divisions of Biomedical Informatics and Psychiatry at Cincinnati Children’s Hospital Medical Center within the University of Cincinnati, said it’s too early to answer those questions. When he created SAM — the app now being tested in Cincinnati schools — he was focused on one thing: alleviating suffering with technology.
“You go into the emergency department and you go to the intensive care unit and you see technology everywhere, but you go into a psychiatrist’s office and you see a couch,” Pestian said.