Published: Nov. 12, 2019


Thanks to advances in artificial intelligence, computers can now assist doctors in diagnosing disease and help monitor patient sleep patterns and vital signs from hundreds of miles away.

Now, CU Boulder researchers are working to apply machine learning to psychiatry, with a speech-based mobile app that can categorize a patient’s mental health status as well as or better than a human can.

“We are not in any way trying to replace clinicians,” says Peter Foltz, a research professor at the Institute of Cognitive Science and co-author of a paper in Schizophrenia Bulletin that lays out the promise and potential pitfalls of AI in psychiatry. “But we do believe we can create tools that will allow them to better monitor their patients.”

Nearly one in five U.S. adults lives with a mental illness, many in remote areas where access to psychiatrists or psychologists is scarce. Others can’t afford to see a clinician frequently, don’t have time or can’t get in to see one.

Even when a patient does make it in for an occasional visit, therapists base their diagnosis and treatment plan largely on listening to a patient talk, an age-old method that can be subjective and unreliable, notes paper co-author Brita Elvevåg, a cognitive neuroscientist at the University of Tromsø, Norway.

“Humans are not perfect. They can get distracted and sometimes miss out on subtle speech cues and warning signs,” Elvevåg says. “Unfortunately, there is no objective blood test for mental health.”

Language: a window into mental health

In pursuit of an AI version of that blood test, Elvevåg and Foltz teamed up to develop machine learning technology able to detect day-to-day changes in speech that hint at mental health decline.

For instance, disjointed speech, in which sentences don’t follow a logical pattern, can be a critical symptom in schizophrenia. Shifts in tone or pace can hint at mania or depression. And memory loss can be a sign of both cognitive and mental health problems.

“Language is a critical pathway to detecting patient mental states,” says Foltz. “Using mobile devices and AI, we are able to track patients daily and monitor these subtle changes.”
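The paper itself does not publish code, but a rough sketch can illustrate how “disjointed” speech might be quantified: scoring the semantic similarity between consecutive sentences of a transcript using sentence embeddings. The encoder choice, threshold-free interpretation and sample transcript below are illustrative assumptions, not details from the study.

    # Illustrative sketch only: one way to quantify sentence-to-sentence
    # coherence in a transcript. The encoder is a hypothetical choice,
    # not a detail drawn from the Foltz/Elvevåg study.
    from numpy import dot
    from numpy.linalg import norm
    from sentence_transformers import SentenceTransformer

    model = SentenceTransformer("all-MiniLM-L6-v2")  # hypothetical encoder

    def coherence_score(sentences):
        """Average cosine similarity between consecutive sentences.

        Low values suggest the speaker jumps between unrelated ideas,
        one possible marker of disorganized speech.
        """
        vectors = model.encode(sentences)
        sims = [dot(a, b) / (norm(a) * norm(b))
                for a, b in zip(vectors, vectors[1:])]
        return sum(sims) / len(sims)

    transcript = [
        "I went to the store this morning.",
        "The bus was late, so I walked instead.",
        "Walking usually helps me clear my head.",
    ]
    print(round(coherence_score(transcript), 3))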

The new mobile app asks patients to answer a 5- to 10-minute series of questions by talking into their phone.

Among other tasks, they’re asked about their emotional state, asked to tell a short story, asked to listen to a story and repeat it, and given a series of touch-and-swipe motor skills tests.

In collaboration with Chelsea Chandler, a computer science graduate student at CU Boulder, and other colleagues, they developed an AI system that assesses those speech samples, compares them to previous samples from the same patient as well as to the broader population, and rates the patient’s mental state.
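The article describes that comparison only at a high level. As a minimal sketch, assuming it works roughly like a per-feature check of a new sample against the patient’s own history and a population norm (the feature names, z-score logic and all numbers below are hypothetical, not the study’s actual model), it might look like this:

    # Minimal sketch: score a new speech sample against the patient's own
    # history and a population norm. Feature names, the z-score logic and
    # all numbers are illustrative assumptions.
    from dataclasses import dataclass
    from statistics import mean, stdev

    @dataclass
    class SpeechFeatures:
        coherence: float        # e.g., sentence-to-sentence similarity
        speech_rate: float      # words per minute
        recall_accuracy: float  # fraction of story details repeated correctly

    def z(value, reference):
        """Standard score of a value against a list of reference values."""
        mu, sigma = mean(reference), stdev(reference)
        return (value - mu) / sigma if sigma else 0.0

    def rate_sample(sample, patient_history, population):
        """Per-feature z-scores vs. the patient's own baseline and the population."""
        report = {}
        for name in ("coherence", "speech_rate", "recall_accuracy"):
            value = getattr(sample, name)
            report[name] = {
                "vs_self": z(value, [getattr(s, name) for s in patient_history]),
                "vs_population": z(value, [getattr(s, name) for s in population]),
            }
        return report

    history = [SpeechFeatures(0.78, 148, 0.90), SpeechFeatures(0.74, 151, 0.85)]
    norms = [SpeechFeatures(0.80, 150, 0.88), SpeechFeatures(0.70, 140, 0.80),
             SpeechFeatures(0.75, 145, 0.84)]
    today = SpeechFeatures(coherence=0.55, speech_rate=120, recall_accuracy=0.60)
    print(rate_sample(today, history, norms))

Large negative scores against the patient’s own baseline would be the kind of “subtle change” the researchers describe flagging for a clinician.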

In one recent study, the team asked human clinicians to listen to and assess speech samples from 225 participants (half with severe psychiatric issues, half healthy volunteers) in rural Louisiana and Northern Norway. They then compared those results to those of the machine learning system.

“We found that the computer’s AI models can be at least as accurate as clinicians,” says Foltz.
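The article doesn’t report the study’s exact metrics. As a hedged illustration only, agreement between clinician and machine ratings of the same samples could be summarized with a correlation and an exact-agreement rate; the ratings below are invented for the example and require Python 3.10+ for statistics.correlation.

    # Hedged sketch of comparing machine ratings with clinician ratings on
    # the same speech samples. Metric choice and numbers are assumptions,
    # not how the study reported its results.
    from statistics import correlation

    clinician = [4, 2, 5, 3, 1, 4]  # hypothetical symptom-severity ratings
    machine = [4, 3, 5, 3, 1, 5]    # hypothetical model ratings of the same samples

    r = correlation(clinician, machine)
    exact = sum(c == m for c, m in zip(clinician, machine)) / len(clinician)
    print(f"Pearson r = {r:.2f}, exact agreement = {exact:.0%}")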

Their technology is not commercially available yet. But he and his colleagues envision a day when such AI systems could be in the room with a therapist and a patient to provide additional data-driven insight, or serve as a remote-monitoring system for the severely mentally ill.

If the app detected a worrisome change, it could notify the patient鈥檚 doctor to check in.

“Patients often need to be monitored with frequent clinical interviews by trained professionals to avoid costly emergency care and unfortunate events,” says Foltz. “But there are simply not enough clinicians for that.”

Research call to action


Foltz previously helped develop and commercialize an AI-based essay-grading technology, which is now broadly used to help educators do their job.

In their new paper, the researchers lay out a call to action for larger studies to prove efficacy and earn public trust before AI technology can be broadly brought into clinical practice for psychiatry.

“The mystery around AI does not nurture trustworthiness, which is critical when applying medical technology,” they write. “Rather than looking for machine learning models to become the ultimate decision-maker in medicine, we should leverage the things that machines do well that are distinct from what humans do well.”