AI and Mental Health

We’ve all faced spells of mental health challenges; I certainly have. There have been a couple of distinct periods in my life when I felt incredibly anxious, with a sense of inevitable, impending doom. Through these tribulations I’ve become much stronger and now face adversity, both external and internal, head on. Looking back, I wonder if AI could have helped me as much as hypnotherapy and personal counseling did, and I wonder what that would even look like. I wonder if AI can frame the way I’m feeling in such a way that I’m better able to process my emotions. And I wonder if AI can come up with solutions that not even a therapist could ascertain.

When ChatGPT was first released, it came as a shock (although it really shouldn’t have) that a vast number of users were using it as a therapist. In fact, that’s how I first started using ChatGPT. I think the reasoning is twofold. First, I wanted to test the large language model to see if it could sympathize with my state of being in a way that could potentially replace a human. Second, I found it comforting to tell my problems to a computer, where my anonymity felt guaranteed and where I didn’t have to experience the guilt, shame, disgust, or any other feeling that comes with exposing myself to the outside world. I found that ChatGPT not only “understood” my point of view but elaborated on it and offered some distinct, creative solutions to problems I faced. This raises the question: is AI good enough to replace traditional therapists, and what are the implications of AI chatbots as they become intertwined with personal AI assistants, counselors, consultants, and advisors?

Pros of AI in Mental Health

Imagine a therapist who could sympathize with you 24/7, provide creative solutions almost instantaneously, listen to your problems without judging, guarantee your anonymity, and be relied upon consistently. That’s what AI chatbots promise. We’ve already seen AI reconstruct what people see by interpreting their brain scans, and we’ve seen how accurate AI can be at flagging early-stage medical abnormalities in body scans. AI chatbots can already personalize therapeutic recommendations and listen to your concerns every single day, even though they weren’t created specifically for that purpose. So why do so many people use these chatbots for therapeutic support? We use them because they’re:

  • Accessible – available 24/7 if you have internet
  • Anonymous – it’s a robot; an objective being
  • Consistent – responds to any concern in the same steady, even manner
  • Efficient – incredibly quick to answer and reply
  • Personalized – custom curated based on the content/prompt provided

Cons of AI in Mental Health

Based on the pros above, how could AI possibly have negative implications for mental health treatment? Well, there are a few. For one, humans are social creatures; we crave empathy. An AI model isn’t going to empathize or take on the same emotions as a human, because it isn’t human. It can sympathize, but sympathy only goes so far when it comes to solving real, hard problems. And sympathetic reply after sympathetic reply can make the AI chatbot seem even less invested in personal problems (sort of like a parent who babies their kid; it doesn’t actually solve anything).

Additionally, AI has limited scope and capability for complex care, such as bipolar disorder, schizophrenia, dissociative identity disorder, and the like, and it certainly isn’t able to diagnose someone with these conditions, because there isn’t a simple checklist of symptoms that, if a person has them, means they’re bipolar. These conditions are highly circumstantial. For example, you can hallucinate when you’re highly stressed, but that doesn’t mean you’re psychotic or schizophrenic. AI can help trained medical professionals diagnose complex mental conditions and treat facets of those conditions (like anxiety, depression, etc.), but it can’t holistically solve for something so subjective.

Lastly, as mentioned before, AI is only as good as the data it’s trained on. If fed outdated information on how best to treat specific mental conditions or states of being, it’s going to provide age-old “solutions” that may no longer be relevant. For example, it might recommend electroshock therapy for depression, because that’s what they did in the “good ole days.” It might also interpret anxiety or depression poorly if the data it’s trained on contains little or no information about them; it might simply not know what to do.

Conclusion

I am all for AI in mental health treatment and as an assist in diagnosing mental conditions. However, I’m against AI diagnosing on its own without a human professional present. I think that AI chatbots, specifically, are incredibly useful when someone needs immediate assistance for anxiety, panic, depression, etc. In fact, I think there should be an AI therapy bot whose specific purpose is to provide immediate relief to distressed individuals; this should be the first course of action, rather than racking up obscene medical bills at the ER or waiting roughly three months for a therapy appointment. When it comes to complex mental health conditions, a human should always be involved to provide human empathy and to create an environment conducive to recovery. An AI model can help analyze objective indicators of mental distress (fidgeting, facial expressions, eye movement, etc.), but AI should not be the sole agent, because it can’t empathize with humans; it’s a damn computer model. I think AI is here to help, and I think it can help solve the mental health crisis we all face every day.
