AI Companion HAILEY Boosts Empathy in Mental Health Support

It can be difficult to find the right words to comfort someone struggling with mental illness. Researchers from the University of Washington and Stanford University have devised an innovative approach to this challenge. A team led by Allen School professor Tim Althoff developed a novel AI agent dubbed HAILEY (Human-AI collaboration approach for EmpathY) to improve empathy during online mental health support interactions.

HAILEY is a collaborative agent designed to assist peer supporters, who typically lack professional training in therapy, and it represents a substantial departure from conventional chatbots. Peer-to-peer support networks such as Reddit and TalkLife are invaluable resources for individuals who cannot find a therapist, cannot afford therapy, or are hesitant to seek help for mental health issues.

HAILEY was developed and evaluated in a controlled setting rather than learning from live user interactions, which helps it avoid the kinds of errors that can afflict open-ended AI systems. According to David Atkins, co-founder and CEO of Lyssn.io, Inc., HAILEY takes a different approach than broad, general-purpose generative AI models, underscoring the importance of purpose-built deployment of technology for mental health.

HAILEY builds on and extends the team’s earlier work on PARTNER, a deep reinforcement learning model trained for empathic rewriting. In a non-clinical controlled experiment, 300 peer supporters recruited from the TalkLife platform tried HAILEY’s just-in-time feedback. HAILEY-enhanced responses were preferred over purely human-crafted replies about 47% of the time, indicating a considerable improvement in empathy.

According to co-author Inna Lin, peer supporters’ confidence in offering help grew after interacting with HAILEY, with a 27% boost among individuals who reported struggling to express empathy. More than two-thirds of users considered HAILEY’s suggestions useful, and a clear majority (77%) would like to see a similar feedback feature integrated into real support platforms.

Adam Miner, a licensed clinical psychologist and clinical assistant professor at Stanford University, has hailed HAILEY as an example of how artificial intelligence can be applied to mental health care in a humane and low-risk manner. According to Miner, AI can considerably improve mental health support services if it is designed with users’ consent, respect, and autonomy in mind.

HAILEY has the potential to be a game changer in mental health assistance, but the team recognizes that further study is required before it can be deployed. Challenges include building a system that can provide real-time feedback on hundreds of conversations at once while quickly filtering out unsuitable content. Ethical issues, such as transparency about the use of AI in responses to help seekers, must also be carefully examined.

In an interview with UW News, Tim Althoff makes clear that HAILEY is not meant to replace human connection. Instead, it is intended to enhance it, suggesting that AI could play a critical role in fostering more empathetic support for people experiencing mental health concerns.

The team’s approach prioritizes patient safety and high-quality mental health care. When traditional forms of therapy are not an option, online peer-to-peer support forums such as Reddit and TalkLife can be a lifeline. Despite their widespread use, the paper’s lead author, Ashish Sharma, concedes that the average empathy expressed in conversations on existing platforms is low, scoring only about one on the scale used in the study.

To address the impersonal nature of much online mental health support, the team created HAILEY, a collaborative AI agent that works alongside supporters in online conversations. Its unique value lies in its ability to assist “peer supporters,” people who provide critical help on these platforms without professional training in therapy. According to Ashish Sharma, the team collaborated with mental health practitioners to translate the complex concept of empathy into computational methods.

Unlike general-purpose chatbots such as ChatGPT, HAILEY is designed for human-AI collaboration. Its primary purpose is to help users respond more empathetically, in the moment, to people seeking help. Unlike chatbots that learn continuously from open-ended online interactions, HAILEY’s closed design lets it sidestep the difficulties that open-ended queries and complex human emotions pose during training.
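The collaborative workflow can be pictured as a simple human-in-the-loop cycle: the peer supporter drafts a reply, the agent proposes an empathic insertion or replacement, and the supporter decides whether to accept it. The Python sketch below only illustrates this idea; the `suggest_empathic_edit` helper and its rule-based suggestion are hypothetical placeholders, not HAILEY’s actual model.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Suggestion:
    """A proposed edit to the supporter's draft reply."""
    kind: str        # "insert" or "replace"
    text: str        # suggested empathic sentence
    rationale: str   # short explanation shown to the supporter

def suggest_empathic_edit(seeker_post: str, draft_reply: str) -> Optional[Suggestion]:
    """Hypothetical stand-in for the model: the real system would use a trained
    empathic-rewriting model; here a crude heuristic is used for illustration."""
    if "feel" not in draft_reply.lower():
        return Suggestion(
            kind="insert",
            text="It sounds like this has been really hard on you.",
            rationale="Acknowledge the help seeker's emotions before giving advice.",
        )
    return None  # the draft already reflects the seeker's feelings

def collaborate(seeker_post: str, draft_reply: str, accept: bool) -> str:
    """Human-in-the-loop step: the supporter always has the final say."""
    suggestion = suggest_empathic_edit(seeker_post, draft_reply)
    if suggestion is None or not accept:
        return draft_reply                         # send the human-only reply
    if suggestion.kind == "insert":
        return f"{suggestion.text} {draft_reply}"  # prepend the empathic sentence
    return suggestion.text                         # full replacement

# Example: the supporter reviews and accepts the suggested insertion.
post = "I failed my exam again and I feel like giving up."
draft = "You should just study harder next time."
print(collaborate(post, draft, accept=True))
```

The key design point, consistent with the article, is that the agent never replies to help seekers on its own; it only surfaces a suggestion that the human supporter can accept, edit, or ignore.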

David Atkins, CEO of Lyssn.io, Inc. and adjunct professor in the UW Department of Psychiatry and Behavioral Sciences, emphasized the need for caution when employing technology to improve mental health. According to him, the team focused on designing and carefully testing an empathy model before deploying it in a controlled setting. This careful design makes HAILEY a more ethically sound option for mental health assistance than off-the-shelf AI models.

To create HAILEY, the researchers first built PARTNER, a deep reinforcement learning model trained for empathic rewriting. A controlled experiment involving 300 peer supporters on the TalkLife platform yielded positive results: in nearly 47% of cases, participants in the “treatment” group who interacted with HAILEY preferred the replies that benefited from human-AI collaboration. According to the study’s authors, the human-AI combination produced responses that were about 20% more empathetic than those written by people alone.

Inna Lin, an Allen School graduate student and co-author, was pleased with the results. The vast majority of peer supporters reported feeling more confident in their abilities after using HAILEY. Those who reported difficulty expressing empathy in their replies improved 27% more than those who did not, suggesting that HAILEY’s assistance was especially beneficial for this group.

The fact that more than 60% of participants found its recommendations valuable and actionable speaks volumes about how well HAILEY works. Encouragingly, 77% of respondents said they would use such a feedback system if it were made available on real platforms. Adam Miner, a clinical assistant professor at Stanford University and licensed clinical psychologist, praises HAILEY as a safe and human-centered application of AI for mental health assistance.

He emphasizes that user consent, respect, and autonomy must be honored consistently in order to fully realize AI’s promise in advancing mental health care. Despite the positive findings, the team acknowledges that HAILEY requires considerable work before it can be used in a production setting. Significant logistical challenges must be addressed before the system can deliver real-time feedback on hundreds of conversations simultaneously. The use of AI to assist people in need also raises ethical questions, such as whether its involvement should be disclosed.
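One way to picture the scaling challenge is a service that watches many conversations concurrently and screens out unsuitable suggestions before they reach supporters. The asyncio sketch below is only an illustration of that architecture under assumed names (`generate_suggestion`, `BLOCKLIST`); it is not part of the published system, and a production deployment would rely on a trained safety classifier rather than a keyword list.

```python
import asyncio
from typing import Optional, Tuple

# Hypothetical keyword blocklist; a real system would use a safety classifier.
BLOCKLIST = {"harm yourself", "give up"}

async def generate_suggestion(draft_reply: str) -> str:
    """Stand-in for the model call that proposes an empathic rewrite."""
    await asyncio.sleep(0.01)  # simulate model latency
    return f"I'm sorry you're going through this. {draft_reply}"

def is_safe(text: str) -> bool:
    """Filter out clearly unsuitable suggestions before showing them."""
    lowered = text.lower()
    return not any(phrase in lowered for phrase in BLOCKLIST)

async def feedback_for_conversation(conv_id: int, draft: str) -> Tuple[int, Optional[str]]:
    """Produce a suggestion for one conversation, or None if it fails the filter."""
    suggestion = await generate_suggestion(draft)
    return conv_id, suggestion if is_safe(suggestion) else None

async def main() -> None:
    # Hundreds of conversations would be handled concurrently in practice.
    drafts = {i: "You should just try to relax." for i in range(5)}
    tasks = [feedback_for_conversation(cid, d) for cid, d in drafts.items()]
    for cid, suggestion in await asyncio.gather(*tasks):
        print(cid, suggestion or "(suggestion withheld by safety filter)")

asyncio.run(main())
```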

Tim Althoff closes by emphasizing that HAILEY cannot replace the personal connection that is so important in mental health care. Instead, it represents a watershed moment in the strategic deployment of AI to improve human interactions. By showing how AI can help people respond more compassionately, this study demonstrates the possibility of a healthy collaboration between technology and human empathy in the challenging field of mental health care.

Journal Reference  

Sharma, A., Lin, I. W., Miner, A. S., Atkins, D. C., & Althoff, T. (2023). Human–AI collaboration enables more empathic conversations in text-based peer-to-peer mental health support. Nature Machine Intelligence. https://www.nature.com/articles/s42256-022-00593-2
