Talking about one’s emotions and thoughts is crucial for maintaining good mental health. Unfortunately, that is often not easy to do. There is not always someone around to talk to, and some people are hesitant, or even unable, to ask their healthcare system for help. Only a small proportion of people with mental health issues contact the healthcare system to address their problems or take advantage of the available services [1].
However, we live in the age of digitalization in which intelligent machines can make our everyday and private lives easier. Digitalization trends have affected clinical psychology and psychotherapy as well and have led, among other things, to the creation of chatbots – programs that can hold conversations with users, usually based on scripts created by psychotherapists. Some chatbots include psychotherapeutic interventions (like cognitive behavioral therapy techniques) offered in real-time, and may have a role to play in patient care – in a sense, offering some therapeutic help without a (human) therapist.
Such chatbots can talk about various topics, including mental health, and show promising potential for the future of psychological interventions. Several studies have already demonstrated their efficacy in treating symptoms of panic disorder, social anxiety disorder, generalized anxiety disorder, posttraumatic stress disorder, attention deficits, and major depression [2]. Despite these successes, there is still much room for improvement.
The Development of AI Assistants
The first conversational AI application concerned with mental health was a program called ELIZA, which was also one of the first chatbots in existence. It was created between 1964 and 1966 by Joseph Weizenbaum at the MIT Artificial Intelligence Laboratory. At that early stage of artificial intelligence, it used a technique known as “pattern matching,” which searches the user’s text for keywords. ELIZA produced responses based on the results of that search and a set of rules designed by a programmer. ELIZA consisted of several scripts, one of which, called DOCTOR, simulated a Rogerian psychotherapist; its technique was largely based on reflecting back what the patient had just said. It was believed that such a program could aid doctors in treating patients, but those hopes did not materialize.
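To make the mechanism concrete, here is a minimal sketch of ELIZA-style pattern matching in Python. It illustrates the general idea only and is not Weizenbaum’s original implementation; the patterns, reflections, and response templates are invented for this example.

```python
# Minimal ELIZA-style responder: keyword patterns mapped to response templates,
# with pronoun "reflection" so the user's words can be echoed back to them.
import random
import re

REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you", "you": "I"}

RULES = [
    (r"i feel (.*)", ["Why do you feel {0}?", "How long have you felt {0}?"]),
    (r"i am (.*)",   ["Why do you say you are {0}?", "Do you believe you are {0}?"]),
    (r"(.*) mother (.*)", ["Tell me more about your mother."]),
    (r"(.*)", ["Please, go on.", "Can you elaborate on that?"]),  # fallback rule
]

def reflect(fragment: str) -> str:
    """Swap first and second person so 'I am sad' echoes back as 'you are sad'."""
    return " ".join(REFLECTIONS.get(word, word) for word in fragment.lower().split())

def respond(utterance: str) -> str:
    """Return a response from the first rule whose pattern matches the input."""
    for pattern, templates in RULES:
        match = re.match(pattern, utterance.lower())
        if match:
            groups = [reflect(g) for g in match.groups()]
            return random.choice(templates).format(*groups)

print(respond("I feel anxious about my exams"))
# e.g. "Why do you feel anxious about your exams?"
```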
Dialogue systems have come a long way since then, but the idea is older still. The first person to conceive of conversational AI was the father of modern computing, Alan Turing. He designed a test meant to evaluate a computer’s ability to communicate: a person converses with both another person and a computer placed in a different room, and if the questioner cannot tell which replies come from the human and which from the computer, the computer passes the Turing test [3]. Although Searle’s “Chinese room” argument [4] later contended that passing such a test does not demonstrate genuine understanding, this kind of evaluation is still used today. Modern dialogue systems like DialoGPT are reaching the point at which their responses are hard to distinguish from those of humans [5].
However, saying that a computer’s responses are indistinguishable from a human’s is not enough. What humans say does not always make sense either, and you certainly would not accept advice on your medical treatment from your barista, or guidance on running a café from your doctor. Even the most human-like responses can be of questionable usefulness. The same holds for the latest text generation models like GPT-3: they can generate coherent, human-like text, but their truthfulness and usefulness are never guaranteed.
Such weaknesses are critical in medical applications, and there have been concerns about bias as well [6]. Further difficulties may arise when a system trained on one demographic is applied to patients from another, leading to problematic results. Health applications therefore require more careful approaches than simply deploying state-of-the-art AI such as large generative models trained on huge text corpora.
Applications Supporting Mental Health
Despite these concerns, utilizing artificial intelligence in the medical domain can open new possibilities for healthcare. Mental health is a particularly interesting area, since many treatment techniques rely on dialogue between therapist and patient. AI technologies can make treatment available to a larger group of people in need; often the people who need help the most are the most vulnerable and cannot afford costly therapy. An example of a successful AI application is Karim, a chatbot developed by the Silicon Valley startup X2AI. It was designed to help Syrian refugees in Lebanon cope with fear, anxiety, and other emotional problems: it analyses a person’s emotional state using natural language understanding (NLU) algorithms and responds with comments, questions, and recommendations [7]. And it is not just about refugees. Roughly one in four Americans suffers from the consequences of mental health issues [8], and 60% of those in need have not received treatment [9].
Computer-assisted and web-based therapy was pioneered in the 1990s and 2000s in the US by Roger Gould and his team, whose online therapeutic programs Mastering Stress and Shrink Yourself were based on highly personalized interactions between patients and the software. The programs focused on changing the user’s behavior and reducing stress: they helped users recognize stress, take care of themselves, and resolve problems and tensions in relationships, and they also targeted eating disorders, emotional eating, and overeating [10].
Problems connected to stress, eating habits, and digestion are often related: stress, depression, anxiety, poor coping strategies, and certain personality traits are commonly associated with inflammatory bowel diseases such as Crohn’s disease and ulcerative colitis [11]. Interventions delivered by bots and apps that focus on psychosocial support, wellbeing, emotions, and negative thoughts may therefore also help protect people from the immunologic consequences of chronically stressful experiences, alleviate stress, and improve mental health.
Rapid progress has recently been achieved in mental health apps thanks to breakthroughs in algorithms for natural language understanding (NLU). These algorithms allow a much better understanding of what people are saying and have thus opened the door to even more intelligent mental health care assistants.
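To illustrate what such an NLU component might look like in its simplest form, the sketch below uses the Hugging Face transformers library’s default sentiment-analysis pipeline to score a user’s message and pick a response strategy. Real assistants use far richer emotion and intent models plus dialogue context; the threshold and strategy names here are purely illustrative assumptions.

```python
# Illustrative sketch: classify the sentiment of a user's message and choose a
# response strategy. Real mental health assistants use much richer NLU
# (emotion, intent, risk detection) and take dialogue history into account.
from transformers import pipeline

# Default sentiment model; weights are downloaded on first use.
classifier = pipeline("sentiment-analysis")

def choose_strategy(utterance: str) -> str:
    result = classifier(utterance)[0]          # e.g. {'label': 'NEGATIVE', 'score': 0.98}
    if result["label"] == "NEGATIVE" and result["score"] > 0.9:
        return "empathic_listening"            # acknowledge the feeling first
    return "open_question"                     # otherwise keep the user talking

print(choose_strategy("I haven't been able to sleep and everything feels hopeless."))
```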
One such example is the chatbot Florence, which addresses the incorrect taking of prescriptions. According to Cutler and Everett, around 3.8 billion prescriptions are written in the U.S. every year, and over 50% of them are taken incorrectly [12]. Florence prompts people to take their medication through reminders and gamification while tracking their health.
An example of a chatbot app focused specifically on mental health is Woebot, which aims to alleviate depression and anxiety using cognitive behavioral therapy (CBT). CBT lends itself well to delivery by chatbots; notably, Andersson et al. found guided Internet-based CBT and face-to-face CBT to be similarly effective [13]. A great advantage of such chatbots is that they are available 24/7. Another app is Wysa, which offers both chatting and connection to a real therapist.
The Rise of Voice-First Technology
Most chatbot apps today are text-based, and the natural next step in their development is voice-based interaction. That is where the future of conversational AI lies; after all, speech is the most natural means of communication for humans. Progress in AI and conversational technologies has already made this shift possible, leading to the creation of voice-based mental health assistants. One of the first such applications is “TalkToPoppy,” which has its own “virtual persona” named Poppy.
Poppy’s ambition is not to replace human therapists, but to offer a solution to people who find it difficult to contact one. Like other chatbot apps, Poppy’s conversations are primarily based on cognitive behavioral therapy approaches for handling various negative thoughts and emotions, sometimes enriched with humanistic psychotherapy. The main difference, and the main advance, lies in the means of interaction: you talk to Poppy not by texting but primarily with your own voice, which makes the communication much more natural and human-like.
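As a rough illustration of how a voice-first loop differs from a text chatbot, the sketch below wraps a placeholder dialogue function with off-the-shelf speech recognition and speech synthesis. It is a generic example using the SpeechRecognition and pyttsx3 Python packages, not TalkToPoppy’s or Flowstorm’s actual stack, and the dialogue logic is a stand-in.

```python
# Generic voice interaction loop: speech-to-text -> dialogue logic -> text-to-speech.
# Illustrative sketch only; not the TalkToPoppy/Flowstorm implementation.
import speech_recognition as sr
import pyttsx3

recognizer = sr.Recognizer()
tts = pyttsx3.init()

def dialogue_reply(text: str) -> str:
    # Placeholder for the assistant's dialogue logic (scripts, NLU, etc.).
    return "I hear you. Tell me more about how that made you feel."

with sr.Microphone() as source:
    print("Listening...")
    audio = recognizer.listen(source)

try:
    user_text = recognizer.recognize_google(audio)   # cloud speech-to-text
    tts.say(dialogue_reply(user_text))
    tts.runAndWait()
except sr.UnknownValueError:
    tts.say("Sorry, I didn't catch that.")
    tts.runAndWait()
```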
This technology is available thanks to a revolution in conversation design instigated by the startup platform Flowstorm, its democratization of conversational AI, and especially its introduction of a new concept of virtual personas. Flowstorm’s personas are complex virtual beings with distinct character traits and their own experiences. They can talk about emotions and feelings with an understanding that remains out of reach for existing voice assistants like Alexa or Siri.
How to Train a Virtual Mental Health Assistant?
There have always been serious ethical concerns about using AI in the domain of mental health: how do we prevent intelligent computers from telling people who may need medical treatment things that could make their condition even worse? In one notorious case, a bot designed to give medical assistance encouraged a test user to commit suicide. To avoid such scenarios, Flowstorm has developed a so-called hybrid model, whose principle is synergy: automated processes, including NLP and generative models, are only one part of the system, while the other part is controlled by creative designers. Their role is to instill in the system the patterns of behaving and speaking needed to successfully handle different “language games.”

We may imagine the process of creating an intelligent agent in Flowstorm as similar to teaching a child to behave and speak appropriately in different situations. On one hand, a child has certain mental and physical predispositions, just as a virtual persona has components running automated algorithms. On the other hand, how the child behaves and speaks depends on continuous training, in which we patiently interact with it and provide models and patterns that lead to the successful handling of a wide range of situations. In this sense, conversation designers are the teachers of the virtual personas, whose parents are Flowstorm’s engineers.
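One way to picture the hybrid idea is as a simple routing rule: designer-authored content handles the situations it was scripted for, a safety layer can override everything, and an automated generative component only fills the remaining gaps. The sketch below is a schematic illustration of that division of labor under those assumptions, not Flowstorm’s actual architecture; the intents, keywords, scripted responses, and fallback are hypothetical placeholders.

```python
# Schematic hybrid dialogue system: designer-authored scripts take priority,
# a generative model is only a constrained fallback, and a safety layer can
# override both. Illustrative only; not Flowstorm's real architecture.
from typing import Optional

SCRIPTED_FLOWS = {
    # Hypothetical designer-written responses crafted by therapists.
    "cant_sleep": "Sleep troubles are exhausting. What usually goes through your mind at night?",
    "feeling_anxious": "That sounds hard. Would you like to try a short breathing exercise with me?",
}

CRISIS_KEYWORDS = {"suicide", "kill myself", "end my life"}  # illustrative, not exhaustive

def detect_intent(utterance: str) -> Optional[str]:
    """Stand-in for NLU intent detection (simple keyword matching for brevity)."""
    text = utterance.lower()
    if "sleep" in text:
        return "cant_sleep"
    if "anxious" in text or "anxiety" in text:
        return "feeling_anxious"
    return None

def generative_fallback(utterance: str) -> str:
    """Stand-in for a constrained generative model, used only when no script applies."""
    return "I'm here with you. Can you tell me a bit more about that?"

def respond(utterance: str) -> str:
    text = utterance.lower()
    if any(keyword in text for keyword in CRISIS_KEYWORDS):
        # The safety layer always wins over both scripts and the generative model.
        return ("I'm really sorry you're feeling this way. "
                "Please reach out to a crisis line or emergency services right now.")
    intent = detect_intent(utterance)
    if intent in SCRIPTED_FLOWS:
        return SCRIPTED_FLOWS[intent]          # designer-controlled path
    return generative_fallback(utterance)      # automated path, only as a fallback

print(respond("I've been anxious all week"))
```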
The patterns that virtual personas in the field of mental health need to master come mostly from the domain of psychology and therapy, which is why the core members of the TalkToPoppy team are professional psychologists and therapists. The recipe behind Flowstorm’s success is the recognition that only professionals from the fields of communication and psychology can create personas with a human-like character, and that they need tools that let them pass as much of their professional experience as possible on to the virtual personas with a minimum of technical IT skills.
This cooperation between humanistic and technical disciplines enabled the creation of the virtual persona Poppy. As a result, Poppy is always there for anybody who needs someone to share their troubles with: she is ready to listen, trained to give fitting advice, or simply to keep you company.
The current advancement of conversational AI and virtual mental health assistants holds exciting promise for the future of mental healthcare and psychotherapy. AI assistants may become an integral part of our everyday lives. Mental health assistants can make a valuable contribution to psychotherapy by making treatment easier and more effective, and by providing help to those unable to undergo proper treatment. There is still a long way to go, but the technology is improving every day, and it is already clear that AI has the potential to help millions of people struggling with their mental wellbeing, and thus to make the world a better, happier, and more peaceful place.
Radek Heissler, Anna Cranfordová, Vít Jakimiv, Petr Marek
[1] Bendig, E., Erb, B., Schulze-Thuesing, L., & Baumeister, H. (2019). The Next Generation: Chatbots in Clinical Psychology and Psychotherapy to Foster Mental Health – A Scoping Review. Verhaltenstherapie, 1–13. https://doi.org/10.1159/000501812
[2] Gratzer, D., & Goldbloom, D. (2020). Therapy and E-therapy—Preparing Future Psychiatrists in the Era of Apps and Chatbots. Academic Psychiatry, 44(2), 231–234. https://doi.org/10.1007/s40596-019-01170-3
Jang, S., Kim, J.-J., Kim, S.-J., Hong, J., Kim, S., & Kim, E. (2021). Mobile app-based chatbot to deliver cognitive behavioral therapy and psychoeducation for adults with attention deficit: A development and feasibility/usability study. International Journal of Medical Informatics, 150, 104440. https://doi.org/10.1016/j.ijmedinf.2021.104440
Oh, J., Jang, S., Kim, H., & Kim, J.-J. (2020). Efficacy of mobile app-based interactive cognitive behavioral therapy using a chatbot for panic disorder. International Journal of Medical Informatics, 140, 104171. https://doi.org/10.1016/j.ijmedinf.2020.104171
[3] Turing, Alan M. “Computing machinery and intelligence.” Parsing the Turing Test. Springer, Dordrecht, 2009. 23-65.
[4] Searle, John R. “Minds, brains, and programs.” The Turing Test: Verbal Behaviour as the Hallmark of Intelligence (1980): 201-224.
[5] Zhang, Yizhe, et al. “DialoGPT: Large-scale generative pre-training for conversational response generation.” arXiv preprint arXiv:1911.00536 (2019).
[6] Obermeyer, Ziad, et al. “Dissecting racial bias in an algorithm used to manage the health of populations.” Science 366.6464 (2019): 447-453.
[8] National Institutes of Health, National Institute of Mental Health. (n.d.). Statistics: Any Disorder Among Adults. Retrieved March 5, 2013, from http://www.nimh.nih.gov/statistics/1ANYDIS_ADULT.shtml
[9] Substance Abuse and Mental Health Services Administration. (2012). Results from the 2010 National Survey on Drug Use and Health: Mental Health Findings (NSDUH Series H-42, HHS Publication No. (SMA) 11-4667). Rockville, MD: Substance Abuse and Mental Health Services Administration.
[10] Jacobs, M. K., Christensen, A., Snibbe, J. R., Dolezal-Wood, S., Huber, A., & Polterok, A. (2001). A comparison of computer-based versus traditional individual psychotherapy. Professional Psychology: Research and Practice, 32(1), 92–96. https://doi.org/10.1037/0735-7028.32.1.92
[11] Sajadinejad, M. S., Asgari, K., Molavi, H., Kalantari, M., & Adibi, P. (2012). Psychological Issues in Inflammatory Bowel Disease: An Overview. Gastroenterology Research and Practice, 2012, 1–11. https://doi.org/10.1155/2012/106502
[12] Cutler, David M., and Wendy Everett. “Thinking outside the pillbox—medication adherence as a priority for health care reform.” New England Journal of Medicine (2010).
[13] Andersson, Gerhard, et al. “Guided Internet‐based vs. face‐to‐face cognitive behavior therapy for psychiatric and somatic disorders: a systematic review and meta‐analysis.” World Psychiatry 13.3 (2014): 288-295.