News
Computing Profession

Mental Health Apps Listen and Guide

Artist's representation of a mental health app.
Developments in portable devices and affective computing could help detect mental health issues earlier.

Most computer scientists know about ELIZA, the 50-year-old program that mimics a psychotherapist's responses to user input. It uses a crude script, often spouting non-sequiturs or falling back on well-worn phrases like "How do you feel about that?" Regardless, first-time users often pour their hearts out to it—and walk away changed.

ELIZA addresses our very human need to vent, but does not offer any real analysis. The rise of portable devices, along with developments in affective computing, promises to give us new ways to monitor, detect, and warn of mental health issues.

Off the couch and into your pocket

In diagnosing mental illness, "There are only three things you can measure," according to Albert ("Skip") Rizzo, director for Medical Virtual Reality at the University of Southern California's Institute for Creative Technologies. "What they tell you, self-reporting; what their inner body tells you, psychophysiology; and what their behavior shows, behavior analysis." Past apps for mental health issues such as depression relied heavily on self-reporting, whether in the digital form of classical tools like the nine-question Patient Health Questionnaire (PHQ-9), or as mood journals.
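As a concrete example of digital self-reporting, the PHQ-9 is straightforward to implement: nine items are each scored 0 ("not at all") to 3 ("nearly every day"), and the 0-to-27 total maps to standard severity bands. The sketch below is a minimal, hypothetical implementation, not the code of any app discussed here:

```python
# Hypothetical sketch of PHQ-9 scoring as a digital self-report tool.
# Nine items scored 0-3; the total maps to standard severity bands.

SEVERITY_BANDS = [
    (4, "minimal"),
    (9, "mild"),
    (14, "moderate"),
    (19, "moderately severe"),
    (27, "severe"),
]

def phq9_score(answers):
    """Sum nine item scores (each 0-3); return (total, severity label)."""
    if len(answers) != 9 or any(a not in (0, 1, 2, 3) for a in answers):
        raise ValueError("PHQ-9 expects nine answers, each scored 0-3")
    total = sum(answers)
    severity = next(label for upper, label in SEVERITY_BANDS if total <= upper)
    return total, severity

# Example: a respondent answering mostly "several days"
print(phq9_score([1, 1, 2, 1, 0, 1, 1, 0, 0]))  # (7, 'mild')
```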

One newer breed of evidence-based apps takes advantage of the many sensors in our smartphones, and of our reliance on them. One such app is Purple Robot, used by a team at Northwestern University's Center for Behavioral Intervention Technologies to test the correlation between depression (as measured by the PHQ-9) and such personal factors as location (measured via GPS). Explaining how the data revealed surprising patterns, lead researcher Sohrob Saeb said, "There was a variable that we named as 'entropy,' that measured how participants spread their time across different locations. People who were more depressed had lower entropy, meaning they spent most of their time in fewer locations—at work and home. As far as I know, no one had previously looked into this exact variable."
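Saeb's "entropy" variable can be understood as the Shannon entropy of the fraction of time a participant spends at each location cluster: time spread evenly across many places yields high entropy, while time concentrated at home and work yields low entropy. The sketch below is a hypothetical illustration of the idea, not the Purple Robot code; the place labels stand in for GPS location clusters:

```python
import math
from collections import Counter

def location_entropy(location_samples):
    """Shannon entropy (in bits) of the fraction of time spent at each
    location cluster. Lower entropy means time is concentrated in fewer
    places, the pattern Saeb's team linked to higher depression scores."""
    counts = Counter(location_samples)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Time split evenly across four places scores higher than time spent
# almost entirely at home:
spread = ["home", "work", "gym", "cafe"] * 6
concentrated = ["home"] * 22 + ["work"] * 2
print(location_entropy(spread))        # 2.0 bits
print(location_entropy(concentrated))  # about 0.41 bits
```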

A similar approach is found in Cogito Companion, an app used as part of the U.S. Defense Advanced Research Projects Agency (DARPA)-funded Detection and Computational Analysis of Psychological Signals (DCAPS) program. Like Purple Robot, it correlates smartphone sensor measurements with a traditional tool—clinical assessments, in this case. Companion, however, also examines audio for signs of engagement and stress via "audio check-ins," brief 30-second recordings individuals can choose to make on their phones.

"Our initial focus was analyzing how people speak," said Skyler Place, vice president and general manager of Boston-based Cogito Corp., "looking at tonality, energy, and speaking rate. Our mobile app was also focused on depression and PTSD (post-traumatic stress disorder) assessment, and is now being used for a wider variety of patients."
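Two of the vocal features Place mentions, energy and speaking rate, can be roughly illustrated with short-time signal processing. The sketch below is purely illustrative and is not Cogito's pipeline; the frame length, the threshold, and the use of voiced-frame fraction as a proxy for speaking rate are all assumptions made for the example:

```python
import math

def short_time_energy(signal, frame_len):
    """RMS energy of each non-overlapping frame of a mono signal
    (a list of floats in the range -1.0 to 1.0)."""
    return [
        math.sqrt(sum(s * s for s in signal[i:i + frame_len]) / frame_len)
        for i in range(0, len(signal) - frame_len + 1, frame_len)
    ]

def speaking_rate_proxy(energies, threshold):
    """Crude proxy: the fraction of frames whose energy exceeds a
    threshold. A real system would count syllables or phonemes."""
    voiced = sum(1 for e in energies if e > threshold)
    return voiced / len(energies)

# Toy signal: a stretch of silence followed by an equal stretch of sound.
clip = [0.0] * 160 + [1.0] * 160
energies = short_time_energy(clip, frame_len=160)
print(energies)                            # [0.0, 1.0]
print(speaking_rate_proxy(energies, 0.5))  # 0.5
```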

Cogito Companion received an unexpected real-world test for PTSD when the 2013 Boston Marathon bombing occurred nearby during clinical trials; the product later got a boost in the form of a grant from the U.S. National Institute of Mental Health to study the mental health of patients via a mobile platform.

Another part of the DCAPS project is SimSensei, which hearkens back to ELIZA by placing the user in conversation with a virtual therapist shown on a desktop computer. A prototype utilizing advances in the artificial intelligence fields of machine learning, natural language processing, and computer vision consists of two modules. MultiSense tracks and analyzes facial expressions, body posture, acoustic features, linguistic patterns, and higher-level behavior descriptors (such as attention and fidgeting) in real time, inferring from them indications of psychological distress. Those inferences directly inform SimSensei, the virtual therapist, which might then choose to nod, pause, or ask a follow-up question.

Watch, wait, and warn

PTSD is a reaction to trauma; depression sometimes is, too. Bipolar disorder—also known as manic-depressive illness—is generally a chronic mood condition, which makes it an especially good fit for smartphone-based monitoring, according to Melvin McInnis, professor of Psychiatry at the University of Michigan and Principal Investigator of the Heinz C. Prechter Bipolar Research Fund. "We’re not talking about monitoring for a week, or a month, or four months. We’re talking about a lifetime of monitoring," he said. "To do that, we need a way to monitor individuals in a manner that is integrated into their daily routine, integrated into what they’re doing at any given time, and not obvious to the average person. If you walk around with any kind of wearable device like a Fitbit, you stand out, but a mobile phone looks like a normal device."

McInnis joined others at the University of Michigan to create PRIORI, a pilot project in which a modified smartphone analyzes voice patterns in outgoing calls to detect changes in each user's bipolar disorder. One symptom of the illness made smartphone-based monitoring especially useful, according to team member Emily Mower Provost, assistant professor of computer science and engineering at the University of Michigan: "When people start to transition (between phases in their disorder), the thing that their family members tell us again and again is that they can hear it in the voice of the individual, that something was starting to go wrong. If we could identify that such a transition is likely, we could make sure that this person saw a clinical care provider, rather than actually going through a transition that's costly in terms of time lost, finances, friendships, and relationships."

Said McInnis, "The ideal situation would be that the patient, or a family member or designated clinic would see a warning signal that indicates there are problems brewing and that it would be good to check in with the clinic."

Rizzo believes such apps have great potential, yet warns they are no panacea. "There's no shortcut in mental health," he said. "Automatic behavior analysis is all about sensing: looking at large amounts of data to get insight into a person. We can use that insight not just to sell them something, but to help them when they're in distress or need better mental health services."

Tom Geller is an Oberlin, OH-based technology and business writer.
