How AI can be a powerful tool in mental healthcare
What are some of the ways that AI can be a powerful tool in the mental healthcare industry? How is AI being used today in mental healthcare? Is this technology a useful tool for psychotherapists? And if so, how? Are you using AI as part of your patients’ treatment?
“AI can certainly be a tool for those providing care in the field of mental health. There is more documented communication (e.g., emails, messaging within a patient portal) happening between patients and providers than ever before, and oftentimes the intended tone of a message may be misconstrued or misinterpreted, which can fracture rapport between the provider and the patient. Advances in the language capabilities of AI models can help convey a message as intended, via prompts such as, “Can you rephrase this message to have more empathy?” or “Can you edit this message to remove any potentially misgendering verbiage?”
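As a concrete illustration of the rephrasing prompts described above, here is a minimal sketch using a general-purpose LLM API. The OpenAI Python SDK is used purely as an example; the model name, draft message, and prompt wording are assumptions for illustration, not tools Dr. Wade describes using:

```python
from openai import OpenAI

client = OpenAI()  # assumes an OPENAI_API_KEY environment variable is set

# A hypothetical draft message from a provider to a patient.
draft = (
    "Your lab results are back. You missed your last two appointments, "
    "so we cannot adjust your medication until you reschedule."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder; any capable chat model would do
    messages=[
        {
            "role": "system",
            "content": "Rewrite clinical messages to be warmer and more "
                       "empathetic without changing their clinical content.",
        },
        {
            "role": "user",
            "content": f"Can you rephrase this message to have more empathy?\n\n{draft}",
        },
    ],
)

print(response.choices[0].message.content)
```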
“Additionally, recent research suggests that AI can adequately complete some of the more “manualised” aspects of therapy, with minimal (if any) reduction in the formation of a therapeutic alliance. This could allow a patient to start a treatment such as Cognitive Behavioural Therapy (CBT) at their convenience, completing some of the first steps before transitioning to a human therapist. In theory, this could increase the number of patients a therapist is able to see: if some of the sessions in a typical course of CBT were completed by AI before the handoff to the psychotherapist, the therapist would not have to see the patient for as long as they otherwise would, creating openings in their schedule sooner without sacrificing the length or structure of the treatment. To my knowledge, this use of AI is not yet commonplace, and I have not personally been utilising AI in my current practice, as my role involves more focus on medication management and administrative work, as well as insight-oriented and supportive therapies that are less structured than CBT.”
While AI cannot replace human interaction and professional support, can this technology truly analyse what a person is experiencing and deliver supportive advice? And if so, how effective can it be? What are some of the risks of chatbot therapy?
“I think there is utility in this technology being able to assess words and phrases that indicate a person is likely in distress, which would prompt supportive statements from the chatbot and recommendations for mental health treatment, similar to how major search engines such as Google will provide links to the suicide hotline and crisis resources when someone searches words or phrases suggestive of researching methods to attempt suicide. The more people who use these technologies, and the more input the AI receives to learn from, the more robust its ability to recognise these words and phrases becomes; it therefore promises to be a useful tool for early intervention.
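To make that mechanism concrete, below is a minimal sketch of keyword-based distress screening. The patterns and the wording of the response are illustrative assumptions only; production systems rely on trained classifiers and clinically validated escalation protocols rather than a hand-written list:

```python
import re

# Illustrative patterns only (an assumption for this sketch); real systems
# use trained classifiers, not a hand-written keyword list.
DISTRESS_PATTERNS = [
    r"\bhopeless\b",
    r"\bcan'?t go on\b",
    r"\bhurt(ing)? myself\b",
    r"\bend (it all|my life)\b",
]

CRISIS_RESPONSE = (
    "It sounds like you are going through a lot right now. You don't have "
    "to face this alone: the 988 Suicide & Crisis Lifeline (call or text "
    "988 in the US) is available 24/7, and speaking with a mental health "
    "professional can help."
)

def screen_message(text: str) -> str | None:
    """Return a supportive message with crisis resources if the text
    matches any distress pattern; otherwise return None."""
    lowered = text.lower()
    if any(re.search(pattern, lowered) for pattern in DISTRESS_PATTERNS):
        return CRISIS_RESPONSE
    return None

print(screen_message("I feel hopeless, like I can't go on"))
```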
“As with all types of therapy, the efficacy will depend largely on the individual’s follow-through with recommendations, though most of these technologies seem relatively quick to recommend connecting with an in-person therapist, which may help to limit any potential pitfalls in the therapeutic performance of AI. It’s worth noting that when therapists reviewed AI performance in responding appropriately with behavioural interventions and “homework assignments” within the context of CBT-based treatment, they agreed with the AI 90% of the time. Thus, it would stand to reason that the AI’s responses would often be similar to those of a typical therapist, and AI can continue to learn, further refining and improving the consistency of appropriate responses.”
There’s less of a stigma in asking for help with AI – right? These chatbots can provide a non-judgmental and confidential space where individuals feel safe to express their feelings. So, do you think people who would not have pursued therapy are more likely to use AI for help?
“Whether it is related to less stigma, or a byproduct of the accessibility and ease of typing a message in an app, it seems as if there is less of a barrier to taking that first step in pursuing help. The non-judgmental nature of AI’s responses was described as one of the key factors in users finding the AI intervention effective, though many users also said their initial thought was that using AI for this purpose would be “silly”, or that they were hesitant to do so. Thus, I’m not certain it’s a lack of stigma with AI that leads people to attempt it initially, though it certainly seems to be a factor that keeps them engaged with the technology as a form of treatment. I received direct communication from Sarah Baldry, the Vice President of Global Marketing for Wysa (the app utilised in the article linked above), who indicated that Wysa had, up to that point, received reports from 352 individuals who claimed the app saved their lives. Whether any of these individuals would have pursued therapy elsewhere had they not been able to use this app is difficult to say, though it seems they were at least willing to use an AI-based app first and found it to be effective.”
Can a chatbot lift someone’s spirits? Offer emotional support?
“Certainly. One of the most commonly reported factors in the efficacy of AI-based mental health apps (in addition to the interaction being non-judgmental) is the ease of conversation people find in using the app. Generally, people find it quite similar to conversation with a live person, so the responses sent to the patient carry similar weight to validation or supportive statements from another human. We know that words matter, and they can be used to provide emotional support or to lift someone’s spirits. From what we can tell, there doesn’t seem to be much drop-off in impact when these words originate from an AI chatbot rather than from a human therapist.”
While AI chatbots have been programmed to recognise certain keywords or phrases that indicate a crisis (self-harm, suicidal thoughts), then offer appropriate resources and in many cases connect the user to a human crisis intervention specialist, how confident are you in allowing these chatbots to make a diagnosis?
“I would say that we’re still pretty far from accepting AI being able to diagnose people consistently. While much of the diagnostic criteria detailed in the DSM-5 is made up of conjunctive and disjunctive lists, something that AI would theoretically be able to “learn” pretty handily, there is often nuance and clarification of symptoms required, which can take a good amount of time and may still not be definitive. For example, someone may indicate they have racing thoughts, which could be attributable to anxiety, Bipolar Disorder, ADHD, intoxication on a substance, hypervigilance related to a prior traumatic experience and so on.
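As a sketch of what “conjunctive and disjunctive lists” look like in code, the snippet below encodes the well-known symptom-count rule for Major Depressive Disorder in DSM-5 (at least five of nine symptoms, with at least one being depressed mood or loss of interest). The symptom names are illustrative shorthand; the point is that this list logic is trivially machine-checkable, while the nuance described above, such as what “racing thoughts” actually signify for a given patient, is not:

```python
# Disjunctive part: at least one of the two "core" symptoms must be present.
CORE_SYMPTOMS = {"depressed_mood", "loss_of_interest"}

# Conjunctive part: five or more symptoms overall from the combined list.
OTHER_SYMPTOMS = {
    "significant_weight_change", "sleep_disturbance", "psychomotor_change",
    "fatigue", "worthlessness_or_guilt", "poor_concentration",
    "recurrent_thoughts_of_death",
}

def meets_mdd_symptom_count(reported: set[str]) -> bool:
    """Check only the list logic of the MDD symptom-count rule; this
    captures none of the clinical nuance needed to interpret symptoms."""
    has_core = bool(reported & CORE_SYMPTOMS)
    total = len(reported & (CORE_SYMPTOMS | OTHER_SYMPTOMS))
    return has_core and total >= 5

print(meets_mdd_symptom_count({
    "depressed_mood", "fatigue", "sleep_disturbance",
    "poor_concentration", "worthlessness_or_guilt",
}))  # True: one core symptom and five symptoms in total
```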
“It is worth noting that there are already aspects of some neuropsychological assessments that are scored via algorithms, and AI may be able to help facilitate interpretation of those findings, though the utility of this is likely somewhat limited: the psychologist administering the testing also has to weigh in on the patient’s effort level and engagement as part of the evaluation, meaning AI cannot yet perform this autonomously.”
Let’s discuss the one-on-one relationship patients have with their therapists and answer the question: will AI someday be analogous to the face-to-face patient-therapist connection? Is it safe to say this technology will never replace the human therapist?
“I would hesitate to use the word ‘never’, given we have seen AI advance so profoundly in recent years; at one point I wouldn’t have imagined we could reach the level of sophistication we currently have. Despite this, I would say that we are nowhere near the point of replacing human therapists entirely. We would need further advancement in AI’s ability to comprehend psychodynamics, to appreciate the subtlety of non-verbal communication from patients, to utilise discretion effectively as to when to confront or make interpretations of information that patients provide, and so on. Thus far, we have seen good efficacy of AI completing some of the standardised aspects of certain types of therapy, though there are treatment approaches beyond CBT that can be helpful to people.
“Beyond even what AI would have to learn, there are substantial logistical barriers to AI effectively replacing therapists, namely public acceptance.
“Even for jobs that don’t require extensive training or education to perform, and for which technology already exists to do the job effectively (e.g., cashiers and self-checkout stations, bank tellers and ATMs), there are still humans performing these roles in many settings. Societal acceptance and widespread adoption of technologies tend to occur very slowly; I would anticipate this would be similar, or even slower, when considering the replacement of a human performing work as intricate, and navigating information as intimate, as a therapist does.”
Byline by Dr. Ryan Wade, a board-certified, Yale-educated psychiatrist, Director of Addiction Services at Silver Hill Hospital in New Canaan, Connecticut, and Medical Director at Freedom Institute in midtown Manhattan.