CHALLENGES & OPPORTUNITIES IN THE MENTAL HEALTH CHATBOT SPACE

GIMBHI
Aug 13, 2022


Data Input: One limitation of chatbots is access to certain types of data. When a chatbot interacts with you, it could plausibly draw on an entire database of your texts, physical activity patterns, and social media usage. And it is not just the data that matters, but the analysis and interpretation of that data. For example, based on texting frequency, social media usage, sentiment analysis of language, physical activity, location tracking, medical history, and other signals, the chatbot may infer that you went through a depressive episode last week and adjust its interactions accordingly. All of this assumes the chatbot has access to passive data collected through a mobile device. Even then, a chatbot could plausibly be missing much of the data a human therapist would typically consider and integrate into care: complex social and cultural context, body language, and other social and physical cues that chatbots are not yet equipped to analyze and integrate.
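To make this concrete, here is a minimal sketch in Python of how passive signals might be combined to flag a possible low-mood period. Every threshold, field name, and the two-of-three scoring rule is an assumption invented for illustration; none of it is clinically validated, and it does not describe how any particular chatbot works.

```python
from dataclasses import dataclass

# Illustrative thresholds -- assumptions for this sketch,
# not clinically validated cutoffs.
LOW_ACTIVITY_STEPS = 3000     # daily step count considered "low"
NEGATIVE_SENTIMENT = -0.3     # mean sentiment score considered "negative"
LOW_TEXTING_RATIO = 0.5       # this week's texting volume vs. personal baseline

@dataclass
class WeeklySignals:
    avg_daily_steps: float        # from the phone's activity sensors
    mean_text_sentiment: float    # e.g., -1.0 (negative) to 1.0 (positive)
    texting_vs_baseline: float    # this week's message count / usual count

def flag_possible_low_mood(s: WeeklySignals) -> bool:
    """Return True if several passive signals point the same way.

    A real system would use validated instruments and a clinician
    in the loop; this only shows how passive streams could be fused.
    """
    indicators = [
        s.avg_daily_steps < LOW_ACTIVITY_STEPS,
        s.mean_text_sentiment < NEGATIVE_SENTIMENT,
        s.texting_vs_baseline < LOW_TEXTING_RATIO,
    ]
    return sum(indicators) >= 2  # require agreement across signals

week = WeeklySignals(avg_daily_steps=2100,
                     mean_text_sentiment=-0.45,
                     texting_vs_baseline=0.4)
if flag_possible_low_mood(week):
    print("Soften tone and check in before the scheduled exercise.")
```

The point of the sketch is the fusion step: no single stream is diagnostic on its own, so the rule requires multiple passive signals to agree before the chatbot changes its behavior.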

Trust & Credibility: Even more importantly, a chatbot may not be able to build the rapport and trust with a human patient needed to elicit important patient-reported information. Many factors affect trust and credibility with a patient, and human therapists may be better at building both than a chatbot is. This matters because, not surprisingly, studies show that trust and credibility are integral to driving positive outcomes.

Currently, we are speaking with researchers about leveraging AI for hyper-personalization of chatbots & avatars to optimize outcomes. Studies show that voice, attire, disposition, personality type, environment, gender, age, attractiveness, appearance, and many other factors affect patients’ perceptions of healthcare providers, and ultimately affect adherence and outcomes. At GIMBHI, we believe the future of AI chatbots in mental health ultimately includes the development & refinement of personalized 3D avatars. Synthetic media technology has advanced to the point that it is possible to generate realistic video and speech for avatars ranging from cartoon-like robots to figures that look and sound like real humans. We believe 3D avatars will achieve trust with users beyond what a text-only AI chatbot can, and in the healthcare setting, trust and credibility are integral to adherence and achieving outcomes.

Psychotherapeutic Approach: Currently, chatbots are somewhat limited in the psychotherapeutic approaches they employ. Most employ a variant of cognitive behavioral therapy (CBT), whose objective is to reinforce positive behaviors and reduce negative ones. Psychoanalytic therapy, by contrast, deals with the complexities of the unconscious mind, memory, and personality, and we have yet to see chatbots that “practice” psychoanalytic therapy.
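As a toy illustration of the rule-based, CBT-style pattern that many of today’s chatbots follow, consider the sketch below. The distortion categories come from CBT, but the keyword patterns and reframing prompts are invented for illustration and are far simpler than what production systems use.

```python
import re

# Toy keyword patterns for two common cognitive distortions.
# The categories come from CBT; the patterns themselves are
# invented for this illustration.
DISTORTION_PATTERNS = {
    "all-or-nothing thinking": re.compile(r"\b(always|never|everyone|no one)\b", re.I),
    "catastrophizing": re.compile(r"\b(ruined|disaster|terrible|worst)\b", re.I),
}

REFRAME_PROMPTS = {
    "all-or-nothing thinking":
        "It sounds like this feels absolute. Can you think of one exception?",
    "catastrophizing":
        "That sounds overwhelming. What is the most likely outcome, realistically?",
}

def respond(user_message: str) -> str:
    """Match the message against distortion patterns and return a
    CBT-style reframing prompt, or a neutral follow-up question."""
    for label, pattern in DISTORTION_PATTERNS.items():
        if pattern.search(user_message):
            return REFRAME_PROMPTS[label]
    return "Tell me more about what happened."

print(respond("I always mess everything up."))
# -> "It sounds like this feels absolute. Can you think of one exception?"
```

The contrast with psychoanalytic therapy is visible even in this toy: the rules operate only on the surface of what the user types, with no model of memory, personality, or the unconscious.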

NLP Models: Foundation models, typically large pretrained language models, largely define the limits of what chatbots can do, unless a chatbot developer builds their own NLP model. Examples include OpenAI’s GPT-3 and Google’s LaMDA. Researchers at UPenn published findings on the biases of language-based models in AI mental health applications: “NLP models may entirely miss out cultures who express their suffering through vocabulary that differs from the existing standards found in the medical literature.”
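As a rough illustration of what building on a foundation model means in practice, the sketch below generates a reply with a small open pretrained model via the Hugging Face transformers library. GPT-3 and LaMDA are not open models, so distilgpt2 stands in here purely for illustration; a real mental health chatbot would add safety filtering, dialogue management, and clinical oversight on top of the raw model.

```python
# A minimal sketch of building on a pretrained language model rather
# than training one from scratch. Uses Hugging Face `transformers`
# with a small open model (distilgpt2) as a stand-in.
from transformers import pipeline

generator = pipeline("text-generation", model="distilgpt2")

prompt = (
    "The user says: 'I've been feeling really anxious about work.'\n"
    "A supportive reply:"
)
# The model continues the prompt; everything it knows comes from
# pretraining, which is exactly where the biases noted above live.
reply = generator(prompt, max_new_tokens=40, num_return_sequences=1)
print(reply[0]["generated_text"])
```

Whatever vocabulary of suffering the pretraining data did or did not contain is inherited wholesale by any chatbot built this way, which is the crux of the UPenn critique.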

The Future: As chatbots become more sophisticated with advances in NLP technology and language models, they will likely compete primarily with digital health apps and interactive software, which offer less interactivity. Digital health apps have notoriously low adherence rates, but we believe chatbots, which offer on-demand personalized interaction, will become preferable to users. In the long run, the question is whether AI chatbots can seamlessly mimic real human interaction and relationships. If the answer is yes, that will certainly extend to the patient-therapist relationship as well.

Subscribe to our research & market intelligence, and join our network. Feel free to reach out to our founder, Shivan Bhavnani (shiv@gimbhi.com) with any questions, comments, or ideas.

gimbhi.com

research@gimbhi.com

@gimbhi1

meetup.com/Mental-Health-Startups-Investors

Subscribe to news and reports about mental health startups & mental health investors at gimbhi.substack.com


Written by GIMBHI

GIMBHI is an independent institute that aims to accelerate the growth of investment in mental & behavioral healthcare worldwide. www.gimbhi.com
