Are we ready for mental health AI?

By: Anna Cristina Tuazon - @inquirerdotnet - Columnist/Philippine Daily Inquirer | May 11, 2023 - 08:00 AM

I must confess, I am not an early adopter of technology. This quirk of mine has saved me from headaches. I refuse to install TikTok on my phone and would delete my Facebook if I could. Cryptocurrencies and nonfungible tokens do not appeal to me one bit. At the same time, my risk-averse nature kept me from considering telepsychotherapy for many years, until the pandemic forced us all to migrate online. I had to eat my words after railing against online therapy; it has now changed my practice forever.

The next technology we must contend with is artificial intelligence (AI). ChatGPT has been a revelation for many, and writers before me have already expressed their thoughts on it. I held my tongue, owing to my reckoning with my online therapy skepticism, as I wanted to approach it with an open mind. I have encountered mental health chatbots and apps before, and it was clear that the technology still had a long way to go before it could provide the same nuanced benefit as a human therapist. ChatGPT and its AI peers seem to hold the potential to bridge that gap by providing a “human” experience of the conversation.

Is that enough?

Our psychology department has long been inundated with requests to collaborate from businessmen and startups who want to produce app-based mental health care aimed at accessibility and affordability. Indeed, psychotherapy is very expensive in this country, with only expat-level private insurance offering limited coverage and PhilHealth not covering it at all.

There is a reason for the high cost: training to be a psychotherapist is time- and resource-intensive, with few scholarships available in the field. The work itself involves long, sustained emotional labor. It is impossible for a therapist to provide eight hours of therapy per day, five days a week, as this is a sure-fire way to burn out. As such, with relatively few billable hours per week, therapists have to make sure their fees cover their living costs. They also have to invest in continuing education units to renew their licenses. The only way to lower the overall cost of psychotherapy is for insurers and governments to comprehensively cover mental health care.

Given the lack of insurance coverage, I notice that startups that highlight affordability when offering mental health services severely underpay their therapists. When businesses inquire with us, usually to recruit our graduates, we are appalled at the rates they offer. I kindly explain to them that our graduates can command better rates in their own private practice, and that if they do want to offer low-cost services, they can do so with nonprofits instead.

It makes sense that these startups are now heavily investing in chatbots and AI-esque technologies to keep costs low. If the technology were good enough to provide high-quality psychotherapy, why not? But therein lies the problem. The one thing I have not heard from such business proposals is any concern with quality. They have been so focused on keeping costs low that they have grossly underestimated what it takes to provide good quality therapy. Most of them don’t even know what therapy looks like and how it works to effect beneficial change for clients.

They also do not realize that conversations have the power to harm people. The first thing we always ask these inquiring entrepreneurs is how they deal with the ethics: privacy, confidentiality, and ensuring that no harm is done. In clinical trials, the earlier step is to prove that a drug causes no harm before proving that it can provide significant benefit. Why is it not the same process with AI technology? News outlets recently reported on a man in Belgium who died by suicide after being urged to do so by the chatbot Eliza. That particular chatbot was not designed for mental health, but why wasn’t it? If a person encouraged someone else to commit suicide, that person could be held liable. Where is the accountability when it comes to chatbots?

I believe it is possible to reach a state of mental health care that is assisted by AI. I also believe the technology isn’t there yet. It would be much easier to develop medical AI than mental health AI, because mental health care deals with a much greater plurality of approaches than medicine does. We also do not have the benefit of definitive imaging or assessment tools that would guarantee the best course of action. A big part of therapy is co-exploration and collaboration with the client. The therapist makes use of their own inner reactions to generate hypotheses and test them with the client. We would need AI technology with enough capacity for emotion and self-reflection, and the ability to check against its own biases, to effectively provide psychotherapy. We are not there yet.

----

Erratum: In last week’s article “Curriculum fatigue,” I erroneously attributed the Department of Education’s (DepEd) draft curricula to the Second Congressional Commission on Education (EdCom II). EdCom II did not participate in the curriculum review effort and this was done instead by DepEd and the Assessment, Curriculum, and Technology Research Centre. Thank you to the EdCom II office for the clarification.


