Mental health cases have risen sharply following the crisis that accompanied the Covid-19 pandemic. According to a 2021 Lancet publication, the pandemic triggered an additional 76 million cases of anxiety disorders and 53 million cases of major depressive disorder worldwide.

Therapy chatbots are filling in the gaps

In a world where mental health resources are limited, therapy chatbots are increasingly addressing the shortfall. The mental health chatbot app Wysa, launched in 2016, has been described as an 'emotionally intelligent' artificial intelligence (AI) therapy chatbot and currently has three million users. It is being used in some London schools, while the UK's National Health Service is conducting randomised controlled trials (RCTs) to see if the app could support those on NHS mental health waiting lists.

In Singapore, the government licensed Wysa during the peak of the Covid-19 pandemic in 2020. In June this year, the app received Breakthrough Device designation from the US Food and Drug Administration (FDA) for the treatment of anxiety and depressive disorders.

The market is currently unregulated

How exactly a mental health chatbot can help a patient is still unclear, and research into them is limited and often conducted by the very companies that created them. The therapy chatbot market is unregulated and may only be creating the illusion of help. Most therapy chatbots are not required to obtain governmental approval, and the FDA even loosened its rules on mental health apps during the pandemic to expand remote mental health support.

Clearly, there need to be stricter regulations and rules governing what these bots can and cannot say. The chatbot app Woebot, built on both clinical research and AI, is one of the most controversial launches. In 2018, when a user stated that they were a minor being forced into sex and asked for help, the app simply responded: "It shows me how much you care about connection and that's beautiful."

The launch of mental health chatbots should be limited until there is empirical evidence to support their use and efficacy. At present, these platforms may be causing more harm than good. But with better research and regulation, mental health chatbots could play a stronger role in the mental health care system. Perhaps their efficacy could be bolstered by pairing AI with human intelligence.
