EmotionAlly: The Mental Health Ally for Patients and Clinicians
by Rachel Lee
The full abstract (including citations, graphics, and acknowledgments) is available at: https://docs.google.com/document/d/1L4trBwqc_yxb9-E50OuoTBl6aTDJsO5miHHRMlmM3CQ/edit?usp=sharing
Medical Devices & Digital Health
Background:
Psychotherapy is most commonly conducted once a week in a time-limited fashion, with each session lasting between forty-five minutes and an hour. However, patients living with mental health conditions often need support between therapy sessions; a patient may experience a mental health crisis during the week without being able to access compassionate care at the moment they need it most. While facing these challenges, individuals may also feel intense loneliness, further compounding their distress. Moreover, if therapy is the sole form of support available, patients may feel pressured during sessions themselves, which can inhibit them from fully divulging their emotions to their therapist. Patients would benefit immensely if they could contact their psychologist and receive immediate help when mental health challenges arise outside the therapy room. However, psychologists are limited by availability and growing caseloads, even as many remain concerned about the safety and well-being of their patients after each appointment ends.
Methods:
EmotionAlly is a digital therapeutic app that provides on-demand mental health assistance. The app features Ally [al-lee], a chatbot available 24/7 to help patients process difficult emotions using active listening techniques. In particular, deep learning and natural language processing allow Ally to ask reflective questions and validate the user’s affective experiences, deepening the patient’s understanding of their feelings. Ally also conducts daily check-ins, asking the user questions such as “How are you feeling right now?” Because research shows that higher “psychotherapeutic doses” result in clinically significant recovery, Ally’s check-in notifications are designed to improve patient outcomes by ensuring that patients receive meaningful support between therapy sessions. Additionally, users can track their mood through the app’s mood tracker, and Ally will converse with the patient in an individualized manner based on trends in their mood and progress. This will empower patients to recognize signs, patterns, and triggers, helping to prevent mental health crises, including self-injurious behavior and suicide. Given that people are often more willing to share their innermost struggles with a virtual agent than with another human, the chatbot will also be able to detect language indicative of a mental health emergency. In the event of a crisis, Ally will kindly provide the user with a list of local and national helplines and resources (such as the National Suicide Prevention Lifeline) as well as directions to the nearest emergency room. If immediate danger is detected after a thorough assessment, the chatbot will notify the patient’s therapist and fulfill its duties as a mandated reporter.
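To make the check-in and crisis-escalation flow described above concrete, the following is a minimal Python sketch, not EmotionAlly’s actual implementation. The `estimate_risk` keyword heuristic, the resource list, and the `notify_therapist` hook are hypothetical placeholders standing in for the deep-learning crisis-detection model and the clinical escalation pathway.

```python
"""Minimal sketch of a check-in and crisis-escalation flow.

Hypothetical placeholders: estimate_risk() stands in for the NLP
crisis-detection model; notify_therapist() for the clinical escalation
pathway. Neither reflects EmotionAlly's actual implementation.
"""
from dataclasses import dataclass, field
from datetime import datetime

# Simple keyword heuristic standing in for a trained risk model.
CRISIS_TERMS = {"suicide", "kill myself", "end it all", "self-harm", "hurt myself"}

CRISIS_RESOURCES = [
    "National Suicide Prevention Lifeline: 988",
    "Crisis Text Line: text HOME to 741741",
]

@dataclass
class MoodEntry:
    timestamp: datetime
    rating: int          # 1 (very low) to 10 (very good)
    note: str = ""

@dataclass
class PatientSession:
    patient_id: str
    mood_log: list[MoodEntry] = field(default_factory=list)

    def record_mood(self, rating: int, note: str = "") -> None:
        """Store a mood-tracker entry for later trend analysis."""
        self.mood_log.append(MoodEntry(datetime.now(), rating, note))

def estimate_risk(message: str) -> float:
    """Toy risk score in [0, 1]; a real system would use a trained NLP model."""
    text = message.lower()
    hits = sum(term in text for term in CRISIS_TERMS)
    return min(1.0, hits / 2)

def notify_therapist(patient_id: str, message: str) -> None:
    """Placeholder for the escalation pathway to the patient's provider."""
    print(f"[ALERT] Therapist notified for patient {patient_id}: {message!r}")

def respond_to_checkin(session: PatientSession, message: str) -> str:
    """Route a check-in reply: reflective listening vs. crisis escalation."""
    if estimate_risk(message) >= 0.5:
        notify_therapist(session.patient_id, message)
        resources = "\n".join(CRISIS_RESOURCES)
        return ("I'm really glad you told me. You deserve immediate support.\n"
                f"{resources}\n"
                "Would you like directions to the nearest emergency room?")
    # Reflective, validating response for non-crisis messages.
    return ("Thank you for sharing that with me. It sounds like today has been "
            "heavy. What do you think is weighing on you most right now?")

if __name__ == "__main__":
    session = PatientSession(patient_id="demo-001")
    session.record_mood(rating=3, note="tired and anxious")
    print(respond_to_checkin(session, "I've been feeling really overwhelmed lately."))
```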
EmotionAlly builds upon existing mental health chatbot services, such as Woebot, by functioning not only as a therapeutic aid but also as a means of strengthening provider-patient rapport. The app differs from current technologies in that it is designed for both patients and their therapists. Data collected from the patient outside the therapy room can inform the way clinicians approach future therapy sessions. For instance, providers will be able to monitor their patient’s mood tracker as well as conversations with Ally, giving them ongoing insight into their patient’s experiences instead of being limited to what can be said within the short span of a single therapy session. Furthermore, after each interaction with Ally, the patient will be asked to complete a brief survey providing feedback on the conversation. Machine learning recommender systems analyze the survey results and Ally conversations to present the therapist with focal themes raised in engagements with the chatbot (e.g., grief/loss, anxiety, self-esteem, financial hardship, relationships) as well as suggestions for theoretical approaches and techniques to use in therapy sessions (e.g., CBT, DBT, IPT, ACT). In this manner, EmotionAlly will help clinicians decide which topics might be most helpful to address during their time with the patient. By improving a provider’s understanding of their patient’s needs, EmotionAlly can help create a strong therapist-patient bond, which is a critical predictor of positive outcomes.
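The clinician-facing summary could be organized roughly as in the sketch below. The theme keyword lexicon and the theme-to-modality mapping are hypothetical stand-ins for EmotionAlly’s ML recommender system, shown only to illustrate the data flow from chatbot conversations to a pre-session digest.

```python
"""Illustrative sketch of a clinician-facing pre-session summary.

The theme keywords and theme-to-approach mapping are hypothetical
placeholders for EmotionAlly's recommender system.
"""
from collections import Counter

# Hypothetical keyword lexicon standing in for a learned topic model.
THEME_KEYWORDS = {
    "grief/loss": {"loss", "grief", "miss", "died", "funeral"},
    "anxiety": {"anxious", "panic", "worry", "overwhelmed"},
    "self-esteem": {"worthless", "failure", "not good enough"},
    "financial hardship": {"rent", "debt", "bills", "money"},
    "relationships": {"partner", "lonely", "argument", "family"},
}

# Illustrative mapping from focal themes to candidate therapeutic approaches.
THEME_TO_APPROACHES = {
    "grief/loss": ["IPT", "ACT"],
    "anxiety": ["CBT", "ACT"],
    "self-esteem": ["CBT"],
    "financial hardship": ["CBT", "problem-solving therapy"],
    "relationships": ["IPT", "DBT"],
}

def tag_themes(transcript: list[str]) -> Counter:
    """Count how often each focal theme surfaces across chatbot messages."""
    counts: Counter = Counter()
    for message in transcript:
        text = message.lower()
        for theme, keywords in THEME_KEYWORDS.items():
            if any(word in text for word in keywords):
                counts[theme] += 1
    return counts

def clinician_summary(transcript: list[str], top_n: int = 3) -> dict:
    """Build the pre-session digest: top themes plus suggested approaches."""
    themes = tag_themes(transcript)
    top_themes = [theme for theme, _ in themes.most_common(top_n)]
    suggestions = sorted({a for t in top_themes for a in THEME_TO_APPROACHES.get(t, [])})
    return {"focal_themes": top_themes, "suggested_approaches": suggestions}

if __name__ == "__main__":
    conversation = [
        "I keep worrying about money and whether I can pay rent this month.",
        "Honestly I feel like a failure, like I'm not good enough at anything.",
        "I had another panic attack before work today.",
    ]
    print(clinician_summary(conversation))
```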
Conclusion:
Mental health challenges and crises are continuous processes, not discrete events, and they often occur at times when immediate support is unavailable. While it is not realistically possible for mental health providers to intervene during every tribulation, AI chatbots are powerful tools for accessible mental health care delivery. In addition to bolstering patient mental health and offering on-demand help, EmotionAlly is designed to fortify the bond between clinicians and their patients. Regular engagement with Ally can reduce the risk of crisis between therapy sessions, and integrating information from the app into therapy allows clinicians to maximize the time spent helping their patients. Patients will no longer have to suffer in silence while waiting for their next session, and providers can nurture healing and growth in safer, more productive spaces.
With EmotionAlly, reliable, compassionate care will be the norm, not the exception.