A new study finds that an interactive voice application using artificial intelligence is an effective way to monitor the well-being of people being treated for serious mental illness.
Researchers from UCLA followed 47 people for up to 14 months using an application called MyCoachConnect. The data were collected between 2013 and 2015. All of the patients were being treated by physicians for serious mental illnesses, including bipolar disorder, schizophrenia and major depressive disorder.
For the study, published in PLOS ONE, participants called a toll-free number once or twice a week and answered three open-ended questions when prompted by a computer-generated voice. The questions were: How have you been over the past few days?; What’s been troubling or challenging over the past few days?; and What’s been particularly good or positive?
MyCoachConnect was designed to collect personalized patient responses, said lead author Dr. Armen Arevian, director of the Innovation Lab at the Jane and Terry Semel Institute for Neuroscience and Human Behavior. Specifically, the AI was trained to use an individual’s own words to offer a personalized analysis for each person. The application focused primarily on the words patients chose in their responses and on how those responses changed over time, with a smaller emphasis on audio features such as tone of voice.
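The paper’s exact models are not detailed here, but the core idea of tracking a patient’s own word choice across calls can be sketched roughly as follows. This is an illustrative assumption, not the researchers’ implementation; the example transcripts, the TF-IDF features and the similarity comparison are all hypothetical stand-ins.

# Illustrative sketch only: this is not the study's actual model. It shows one
# simple way word-choice features from transcribed calls could be compared over
# time; the example transcripts below are hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical transcribed answers from one patient's weekly calls, in order.
calls = [
    "I have been sleeping okay and spending time with my sister",
    "work has been stressful but I am managing and sleeping fine",
    "I feel tired and alone lately and I am not sleeping much",
]

# Represent each call by its word-choice profile (TF-IDF over the patient's own words).
vectorizer = TfidfVectorizer(stop_words="english")
vectors = vectorizer.fit_transform(calls)

# Compare each later call with the first call; a falling similarity score flags
# a shift in the patient's language that a clinician might want to review.
for week in range(1, vectors.shape[0]):
    similarity = cosine_similarity(vectors[0], vectors[week])[0, 0]
    print(f"Call {week + 1}: similarity to first call = {similarity:.2f}")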
The analysis of the data, conducted in collaboration with researchers from USC’s Signal Analysis and Interpretation Laboratory, or SAIL, found that the application’s assessments closely matched the physicians’ tracking of the patients’ well-being during the study period.
“The way people answer questions and the way they change their answers over time is unique to each patient,” Arevian said. “We were looking at a person as a person and not as a diagnosis.”
For the study, patients made calls from either a mobile phone or a landline, including pay phones, and were asked to speak for two to three minutes on each question.
“Technology doesn’t have to be complicated,” Arevian said. “In this study, patients didn’t need a smartphone. It could be simple and low tech on the patient end, and high tech on the back end.”
Researchers hope that artificial intelligence that can analyze data collected from apps such as MyCoachConnect will enable more proactive and personalized care for individuals. The application, for example, may help improve treatment by alerting clinicians early when someone is experiencing more symptoms.
“Artificial intelligence allowed us to illuminate the various clinically meaningful dimensions of language use and vocal patterns of the patients over time and personalized at each individual level,” said senior author Dr. Shri Narayanan, the Niki and Max Nikias Chair in Engineering and director of SAIL at the USC Viterbi School of Engineering.
Some participants were interviewed after the study ended, and said they found the system easy and enjoyable to use, Arevian said.
“They said speaking to a computer-generated voice allowed them to speak more freely,” Arevian said. “They also said it helped them feel less lonely because they knew that someone would be listening to it, and to them that meant that someone cared.”
MyCoachConnect was built and hosted on the Chorus platform, which Arevian developed at UCLA. Chorus allows people to visually create mobile and other computer applications without computer programming, in as little as a few minutes. Clinical sites that are interested in using the app with their patients may contact the Semel Innovation Lab to discuss implementation.