Rapid Growth In AI Mental Health Technology Prompts Optimism & Concerns
How AI could support mental health care
Amid escalating demand for mental health services, artificial intelligence (AI) technology could help ease access shortages. From therapy chatbots to clinical decision support tools, these new technologies also have the potential to lower costs, improve care, and boost provider efficiency.
As technology accelerates faster than research and regulation, AI applications are generating both excitement and controversy. Before fully embracing these new healthcare tools, experts caution that risks accompany these benefits and more research is necessary. Even advocates for the new technology point out that AI tools are not a replacement for clinical therapy and are not intended for use in acute crisis intervention or management.
Chatbot holds potential to improve patient wellbeing
In a recent NPR story, a single mother experiencing depression shared how a mental health chatbot lifted her spirits and improved her wellbeing. The woman, who did not have health insurance, turned to the chatbot as an affordable, judgment-free way to share her experiences at any time, day or night. In these sessions, she received practical suggestions for addressing depression, such as listening to calming music or practicing deep breathing.
This anecdote offers a powerful and poignant window into the potential of AI to improve access to and affordability of beneficial mental health tools.
Could AI ease soaring demand for mental health professionals?
New AI tools hold tremendous potential for addressing the country's escalating mental health needs. In February 2023, more than 30% of US adults reported symptoms of anxiety or depression, according to the Kaiser Family Foundation. At the same time, nearly half of the US population lives in a mental health workforce shortage area, a problem that’s heightened in rural areas.
“Lackluster enforcement” of the Mental Health Parity and Addiction Equity Act, a 2008 law designed to offer equal benefits for mental and physical health needs, is adding to what the American Medical Association calls “a grave threat” to patient care access.
New tools unlock huge potential for AI mental health technology
Clinicians and patients alike are putting AI applications to work. From improving diagnosis and treatment plans to assisting beleaguered therapists, AI is already making important strides in improving mental health, says a publication from the World Economic Forum.
Therapists can use AI to improve care and efficiency, enhancing the overall experience for both providers and patients. For example, AI technology can help providers wade through data on patient behavior and symptoms as well as their responses to prior treatment. This analysis could lead to more precise diagnoses and therapies. Improved efficiency could also free up time-strapped providers.
Myriad applications for natural language processing
AI technology can also be used to train therapists and improve therapy techniques. For example, some mental health clinics are already using natural language processing (NLP), or AI that helps computers understand and process human language, to create transcripts of therapy sessions. These can be used to offer therapists insights into their work and improve trainees’ approaches.
This data could help researchers pinpoint why some psychotherapists are more effective than others, explains an article from the Stanford Institute for Human-Centered Artificial Intelligence. It could also map out the content of sessions, such as evaluating the ratio of cognitive behavioral therapy to regular conversation. It may even note subtle nuances in a patient’s speech pattern that could help diagnose conditions, such as depression and psychosis.
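To make the session-content mapping above concrete, here is a toy, hypothetical sketch: it flags therapist turns containing CBT-associated keywords and computes their share of a transcript. The keyword list and sample transcript are invented for illustration; real systems use trained NLP models rather than keyword matching.

```python
# Toy sketch: estimating how much of a therapy transcript reflects
# CBT-style language versus ordinary conversation. The marker list
# and transcript are illustrative, not from any real system.

CBT_MARKERS = {"thought", "belief", "reframe", "evidence", "homework", "behavior"}

def cbt_ratio(transcript):
    """Return the fraction of turns containing at least one CBT marker."""
    turns = [t.lower() for t in transcript]
    flagged = sum(1 for t in turns if any(m in t for m in CBT_MARKERS))
    return flagged / len(turns) if turns else 0.0

session = [
    "What evidence supports that thought?",
    "How was your weekend?",
    "Let's try to reframe that belief.",
    "Did you finish the homework exercise?",
]
print(cbt_ratio(session))  # 3 of the 4 turns contain a CBT marker
```

A production tool would classify each utterance with a supervised language model and aggregate scores across many sessions, but the underlying idea, turning raw transcripts into measurable ratios, is the same.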
This detection capability has expanded beyond the clinician's office, according to research published in Digital Medicine. From social media to the narrative section of electronic health records, NLP is making strides in mental health detection.
Chatbots are on call 24/7 for mental healthcare
Along with improving access and quality of care, these new tools offer other advantages, notes this article from the American Psychological Association. Most notably, mental health chatbots don’t have office hours and can be reached any time, day or night. When late-night anxieties emerge, the chatbot is readily available. In addition, the technology can pull from a vast trove of psychological literature and can recall every client interaction.
Other pros include the ability to adjust preferences and approaches for individuals, such as customizing interactions to better reflect cultural competency. Plus, chatbots can offer a sense of privacy for people in tight-knit rural communities who might fear stigma and hesitate to seek help.
Do some therapy interactions require a human?
Popular apps, such as Woebot, use NLP and the principles of cognitive behavioral therapy, or CBT, to directly interact with users. Other types of therapy, though, might pose challenges if delivered via a bot. For example, it’s unclear how effective psychodynamic, relational or humanistic therapies would be without a human element, notes the author of a Psychology Today article.
Another challenge arises from accountability. It could be easier to ignore an app than an appointment with a live person waiting. Even if people have a positive user experience with a mental health app, that doesn’t necessarily predict whether they will continue to use it, according to findings described in JMIR Human Factors.
As interest in AI grows, skepticism persists
With so many new and emerging tools, the jury's still out on consumer willingness to embrace AI in mental healthcare. In a 2023 survey, the Pew Research Center found nearly 8 in 10 US adults reported they would not want to use an AI chatbot for mental health support. About 28% said such chatbots shouldn’t be available at all.
At the same time, others are so eager to try out the new tech that they're turning to AI chatbots not designed for therapy to discuss their mental health needs, raising privacy and quality of care concerns.
What about bias in AI?
Prior research has found evidence of embedded bias in AI applications. That's because algorithms are created by people, who may exhibit explicit and implicit biases. A well-documented example comes from biased AI predictions that were used to make medical care decisions. The tool used healthcare costs as a proxy for health needs. But the measure did not account for the fact that differences in spending originated from a biased healthcare system with unequal access to care.
Another study demonstrated the risk of using potentially biased AI models in mental health. Researchers evaluated the impact of biased AI recommendations on mental health crisis decision making. In a web-based experiment, they found that participants could be strongly influenced by advice from a biased model, according to findings published in November in Communications Medicine.
Exploring privacy challenges
In a recent webinar from Harvard Law School, experts explored the implications—and ethics—of using AI in mental healthcare. Speakers described fears they'd heard that personal data might be shared with the company providing the platform, as well as local law enforcement. While AI mental health apps and chatbots are promising, there are also ethical and legal challenges that must be addressed before they are released in an uncontrolled way, a speaker said.
On a similar note, several US senators have also expressed privacy and data sharing concerns about mental health apps that operate in a “regulatory gray area.”
More research necessary for AI mental health technology
As the technology accelerates, observers are calling for more detailed research and investigation into the benefits and challenges of these emerging technologies. In a 2023 review published in the International Journal of Mental Health Nursing, authors highlighted the need for more research to better understand the role of AI in mental healthcare.
In the meantime, organizations such as the World Health Organization offer guiding principles for using AI in health. New tools should be evaluated against the following principles:

- Protect autonomy
- Promote human wellbeing, safety, and the public interest
- Ensure transparency
- Foster responsibility and accountability
- Ensure inclusiveness and equity
- Promote AI that is responsive and sustainable
Technology won't wait for mental healthcare research
Even as researchers evaluate the impact of AI on patient wellbeing and outcomes, the technology is charging ahead. Already, there are thousands of mental health apps available, which makes it challenging for consumers to identify the most effective, evidence-based options.
Meanwhile, new ways to harness AI's potential are emerging at a rapid pace. Just this March, healthcare startup Aiberry raised $8 million for its AI-powered mental health software that can be used for clinical mental health screening questionnaires.
RTI Health Advance’s digital health experts can help
With technology accelerating faster than research and regulations, RTI Health Advance experts can help you decipher the smartest digital tools and best practices to improve patient care and bolster clinical outcomes. With a team of experts, we can guide you through the promises and pitfalls of a rapidly changing landscape.