Finding the right words to comfort someone struggling with mental illness can be difficult. Researchers from the University of Washington and Stanford University have developed an innovative approach to this challenge. A team led by Allen School professor Tim Althoff built a novel AI agent dubbed Human-AI collaboration approach for EmpathY (HAILEY) to improve empathy in online mental health support conversations.
HAILEY is a collaborative agent designed to assist peer supporters who lack professional training in therapy, and it represents a substantial departure from conventional chatbots. Peer-to-peer support platforms such as Reddit and TalkLife are invaluable resources for people who cannot find a therapist, cannot afford therapy, or are hesitant to seek help for mental health issues.
HAILEY was trained offline, in isolation from live interactions, to avoid the types of errors that can afflict AI systems deployed in sensitive settings. According to David Atkins, co-founder and CEO of Lyssn.io, Inc., HAILEY takes a different approach than broad, general-purpose generative AI models, underscoring the importance of carefully tailored deployments of technology for mental health.
HAILEY builds on and expands the team's earlier work on PARTNER, a deep reinforcement learning model trained for empathic rewriting. In a non-clinical controlled experiment, 300 peer supporters from the TalkLife platform received HAILEY's just-in-time feedback. When compared with purely human-crafted replies, HAILEY-assisted responses were preferred 47% of the time, indicating a considerable improvement in empathy.
According to co-author Inna Lin, participants' confidence in providing support grew after interacting with HAILEY, with a 27% boost among individuals who struggle to express empathy. More than two-thirds of users found HAILEY's suggestions useful, and a clear majority (77%) would like to see a feedback feature similar to HAILEY integrated into the platform.
Adam Miner, a licensed clinical psychologist and clinical assistant professor at Stanford University, has praised HAILEY as an example of how to apply artificial intelligence to mental health support in a humane and safe manner. According to Miner, AI may considerably improve mental health support services if it is built with the consent, respect, and autonomy of its users in mind.
HAILEY has the potential to be a game changer in the field of mental health assistance, but the team recognizes that further study is required before it can be deployed. Challenges include building a system that provides real-time feedback on hundreds of conversations at once while quickly filtering out unsuitable content. Ethical issues, such as transparency around the use of AI when responding to help seekers, must also be carefully examined.
In an interview with UW News, Tim Althoff makes it clear that HAILEY is not meant to replace human connection. Instead, it is intended to enhance interpersonal interactions, suggesting that AI might play a critical role in fostering more empathetic support for people experiencing mental health concerns.
The team's approach prioritizes patient safety and high-quality mental health care. When traditional forms of therapy are not an option, online peer-to-peer support forums such as Reddit and TalkLife can be a lifeline. Yet despite their widespread use, the paper's lead author, Ashish Sharma, concedes that the average level of empathy expressed in conversations on existing platforms remains low.
To address the issue of impersonal online mental health support, the team created HAILEY, a collaborative AI agent that integrates into online conversations. Its unique value stems from its capacity to assist "peer supporters," people who provide critical help on these platforms without professional therapeutic training. According to Sharma, the team collaborated with mental health practitioners to translate the complex concept of empathy into computational methods.
Unlike conventional chatbots such as ChatGPT, HAILEY is designed for human-AI collaboration. Its primary purpose is to empower users to respond more empathetically, in the moment, to people requesting help. And unlike chatbots that actively learn from online interactions, HAILEY's closed training allows it to avoid the pitfalls of open-ended queries and unpredictable human emotions.
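The collaboration pattern described above can be sketched in a few lines of code. This is a minimal, purely illustrative mock-up of a just-in-time feedback loop: the scoring and rewriting functions below are hypothetical keyword-based stand-ins, not the team's PARTNER model or HAILEY itself, and the threshold is an arbitrary assumption. The key design point it illustrates is that the AI only suggests; the human supporter always decides what gets sent.

```python
# Illustrative sketch of a human-in-the-loop feedback flow.
# The models here are hypothetical stubs, NOT the PARTNER/HAILEY systems.

def empathy_score(reply: str) -> float:
    """Hypothetical empathy classifier (stub). A real system would use a
    trained model; here we just count a few supportive phrases."""
    cues = ("i hear you", "that sounds", "i'm sorry", "you're not alone")
    text = reply.lower()
    return min(1.0, sum(cue in text for cue in cues) / 2)

def suggest_rewrite(reply: str) -> str:
    """Hypothetical empathic rewriter (stub). A real system would generate
    a context-aware transformation of the draft."""
    return "That sounds really hard. " + reply

def collaborate(draft: str, accept) -> str:
    """Just-in-time feedback loop: if the draft scores low, show a
    suggestion, but let the human supporter accept or reject it."""
    if empathy_score(draft) >= 0.5:
        return draft  # draft already reads as empathic; no feedback shown
    suggestion = suggest_rewrite(draft)
    return suggestion if accept(suggestion) else draft

# Usage: the supporter chooses to accept the suggested rewrite.
final = collaborate("Have you tried going for a walk?", accept=lambda s: True)
print(final)  # prints "That sounds really hard. Have you tried going for a walk?"
```

Keeping the accept/reject decision with the human mirrors the article's point that HAILEY augments, rather than replaces, the peer supporter.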
David Atkins, CEO of Lyssn.io, Inc. and adjunct professor in the UW Department of Psychiatry and Behavioral Sciences, emphasized the need for caution when employing technology to improve mental health. According to him, the team concentrated on designing and carefully testing an empathy model before deploying it in a controlled setting. This careful design makes HAILEY a more ethically grounded approach to mental health assistance than general-purpose AI models.
To create HAILEY, the researchers first built PARTNER, a deep reinforcement learning model trained for empathic rewriting. A controlled experiment involving 300 peer supporters on the TalkLife platform yielded positive results: nearly 47% of participants in the "treatment" group preferred replies produced through human-AI collaboration. According to the study's authors, the human-AI combination produced responses that were 20% more empathetic than those written by people alone.
Inna Lin, an Allen School graduate student and co-author, was pleased with the results. The vast majority of peer supporters reported feeling more confident in their abilities after using HAILEY. Users who reported difficulty expressing empathy in their responses improved 27% more than those who did not, suggesting that HAILEY's assistance was especially beneficial for individuals who struggle in this area.
The fact that more than 60% of participants found its recommendations valuable and actionable speaks volumes about how well HAILEY works. In another good sign, 77% of respondents said that if such a feedback system were made available on real platforms, they would use it. Adam Miner, a clinical assistant professor at Stanford University and a licensed clinical psychologist, praises HAILEY as a safe and human-centered AI application for mental health assistance.
He emphasizes the need to consistently honor user consent, respect, and autonomy in order to fully realize AI's promise in advancing mental health care. Despite the positive findings, the team recognizes that HAILEY requires considerable work before it can be used in a production setting. Significant logistical challenges must be addressed before the system can deliver real-time feedback on hundreds of conversations at once. Using AI to assist people in need also raises ethical concerns, such as whether its use should be disclosed.
Tim Althoff closes by emphasizing that HAILEY cannot replace the personal connection that is so important in mental health care. Instead, it marks a watershed moment in the strategic deployment of AI to improve human interactions. By showing how AI can help people become more compassionate in the difficult domain of mental health care, this study demonstrates the possibility of a healthy collaboration between technology and human compassion.
Journal Reference
Sharma, A., Lin, I. W., Miner, A. S., Atkins, D. C., & Althoff, T. (2023). Human-AI collaboration enables more empathic conversations in text-based peer-to-peer mental health support. Nature Machine Intelligence. Retrieved from https://www.nature.com/articles/s42256-022-00593-2


