Mindset

Business psychology services

Our understanding of the opportunities and risks presented by Artificial Intelligence (AI) is in its infancy. However, some AI tools are already helping people to better manage their mental health. Is AI the answer to the UK’s mental health crisis or could it worsen an already challenging situation?

Waiting Times for Mental Health Patients

The news is full of reports about the benefits and risks of adopting AI solutions. One story I picked up on focuses on the use of AI to assist individuals, particularly young people, with their mental health. That grabbed my attention! So, what do we need to know?

Firstly, we must acknowledge that NHS England data* shows that 7.4 million people are waiting for mental health appointments. While the majority of patients wait up to 46 weeks for treatment, 481,246 have waited for over a year. This is despite dedicated teams and Government investment in mental health services.

What happens if you are in poor mental health and are not seen for months? One option is to contact charitable organisations that provide free resources, helplines or drop-in support. However, a new option is gaining interest online. It’s an AI character called Psychologist.

What is Psychologist AI?

Psychologist is a beta AI character that received millions of messages in its first year. It was developed by a psychology student, who trained it on the principles covered in his degree and taught the bot to shape its answers to a wide range of the most common mental health questions and conditions. So, how does it work?

People in need of mental health support access the character online, type their questions and Psychologist responds. Each conversation is said to be confidential.

The text format and the fact that this isn't a real person seem to appeal, especially to the 16-30 age group. For them, anonymity and convenience win out over arranging an appointment with a specialist. Equally, no one knows you are asking questions, so there is no stigma attached.

The Pros & Cons of Using AI for Mental Health

The clear advantage of Psychologist is that communication with the bot can take place at any time of day or night, and the response is instant. There is no waiting list, and you can communicate with it for as long as you need; it is at your beck and call. For some individuals, this on-tap interaction offers help when they need it most. As such, it is beneficial and may prevent some from reaching a point of crisis.

Additionally, this is far from the only use of AI in healthcare. Limbic Access, for example, is a mental health chatbot that has secured medical device certification in the UK and is used by NHS trusts as a triage tool. The future of patient care will increasingly depend on technology that helps individuals to better manage their own health.

On the flip side, these AI tools are unable to provide advice or support tailored to the needs of the individual. Responses are generic and limited by the information the tool has been trained on. The advice offered by such AI solutions is not monitored or quality-checked. Equally, a bot can't pick up on the nuances that might indicate heightened risk in a person.

We all know where an online search of symptoms can lead. Without the human perspective, could AI responses lead people to a self-diagnosis that doesn't reflect the true nature of their situation? Is there a risk that it could heighten a person's struggles?

My Thoughts on AI for Mental Health Support

The fact is that many people are already turning to AI for support with their mental health. Therefore, rather than fight it, we need to focus on ensuring it is fit for purpose. We should embrace the technology, yet find ways to monitor the advice. In my mind, the priority is to reduce the risk of harm.

Solutions that secure medical device certification are great, yet we need to be aware of every option available to people. The questions I want answered are:

  • Who is programming them, and how do we monitor whether the advice is sound?
  • Are there trigger words or phrases that prompt a crisis response, such as an A&E call?
  • Who has the power to shut down AI solutions that aren’t helpful and may even prey on people’s vulnerabilities?
  • How can we learn about approved AI solutions, so that these can be actively promoted?

In addition, I’m well aware that the criteria for accessing support and treatment vary from region to region. Therefore, until AI can provide regionally specific advice, there is scope for misinformation to be shared. As a thought, Psychologist was programmed in Australia, so is any signposting going to be relevant to UK users?

Addressing the Demand for Mental Health Support

What we do know is that the NHS, Government and private specialists have yet to find a viable solution to the high demand for mental health support. Therefore, we need to be open to fresh options, especially if they provide a means of early intervention. If people can easily access support and coping techniques, issues can be prevented from escalating. As such, it empowers individuals in need to care for themselves.

I’m unsure about AI in psychology at present, but I am open to learning more and to reviewing the research that I hope will be carried out. For me, though, nothing beats the human connection and personalised solutions that specialists can provide.

What are your thoughts on AI being part of a mental health strategy?

 

* https://www.england.nhs.uk/statistics/wp-content/uploads/sites/2/2023/06/Apr23-RTT-SPN-publication-version-PDF-427K.pdf
