Artificial intelligence is constantly at the centre of media scrutiny and debate, with persistent concerns about unethical usage and infringements of privacy and data protection. However, whilst measures must be taken to address these worries, we also need to open up the conversation about how AI, when used ethically, can be a significant force for good. Mental health care is a key example of an area where AI has huge potential to improve outcomes: it offers an opportunity to unlock a more objective and safer approach to assessing and diagnosing mental health conditions.
Currently, mental health diagnoses are largely subjective, which risks key signs and symptoms being missed. Sadly, like so many people, I have witnessed first-hand the tragedy that can unfold when mental illness goes unrecognised – sometimes even by patients themselves. Even when patients are assessed by a clinician, a condition or its severity may be missed, because assessments rely on observations and on patients' answers to subjective questions, such as rating their mood or emotions on a scale of 1-10. While such questions can be helpful, they leave huge potential for patients in need of mental health support to slip through the net.
However, AI presents the opportunity to introduce a more objective layer of assessment. A growing number of AI tools – including AI therapists and wearables – seek to improve how we diagnose mental health conditions. My team at thymia has created a tool, in partnership with institutions such as UCL and KCL, to help build more objective ways of assessing mental health. The tech works by identifying invisible 'biomarkers' in users' eye movements, voice or behaviour, much as a blood test might be used to detect or monitor a physical condition.
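To make the idea of a vocal 'biomarker' concrete, here is a deliberately simplified sketch – not thymia's actual method – of extracting two basic acoustic features from a voice signal. Real clinical tools use far richer features and validated models; the function name, features and the synthetic tone standing in for a recording are all illustrative assumptions.

```python
import numpy as np

def extract_voice_features(signal: np.ndarray, sample_rate: int) -> dict:
    """Compute two simple acoustic features from a mono audio signal."""
    # Short-term energy: a crude measure of how loudly the person is speaking.
    energy = float(np.mean(signal ** 2))
    # Zero-crossing rate: how often the waveform crosses zero,
    # a rough proxy for pitch and voicing.
    crossings = np.sum(np.abs(np.diff(np.sign(signal)))) / 2
    zcr = float(crossings / len(signal))
    return {"energy": energy, "zero_crossing_rate": zcr}

# Toy input: a one-second 220 Hz sine tone standing in for a voice recording.
sr = 16_000
t = np.linspace(0, 1, sr, endpoint=False)
tone = 0.5 * np.sin(2 * np.pi * 220 * t)
features = extract_voice_features(tone, sr)
```

The point is simply that a recording can be reduced to numbers a model can track over time – the kind of objective signal the article describes, even though features this simple would never suffice on their own.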
Adding AI to wider mental health assessments means clinicians can compare the verbal information gained from speaking to patients with data on physical and behavioural reactions they might have missed during the consultation. This can help them determine the cause of a patient's distress more accurately, allowing a more definitive diagnosis to be made, and a relevant treatment plan arranged, much sooner. It is devastating to see how the stigma surrounding mental illness persists even today, with sufferers still feeling shamed by a diagnosis and reluctant to seek help. By bringing objective, data-driven insights into mental health, we can improve its parity of esteem with physical health and level the playing field in access to care and support, thereby starting to reduce the stigma and empowering patients to seek help.
By using tech to spot mental illness sooner and more accurately, we can also alleviate some of the burden on the healthcare system. With waiting lists for mental health treatment already extensive, AI-driven tools that detect patients' conditions enable earlier intervention and prevent those conditions from worsening – which in turn means less demand on in-patient wards and less pressure on emergency care.
To guarantee that AI systems assess patients fairly, it is vital that machine learning models are trained on clean, debiased datasets that are representative of all groups who might seek mental health support, so that existing biases are not baked into the tool. The model needs to be able to spot signs of illness, and how they present in different people, if clear and accurate diagnoses are to be achieved.
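One basic form such a check can take is auditing how well each demographic group is represented before training. The sketch below is a minimal illustration of that idea; the function, threshold and age bands are hypothetical, and real bias audits go well beyond simple headcounts.

```python
from collections import Counter

def representation_report(records: list[dict], attribute: str,
                          threshold: float = 0.1) -> dict:
    """Flag demographic groups that fall below a minimum share of a dataset."""
    counts = Counter(r[attribute] for r in records)
    total = sum(counts.values())
    return {
        group: {"share": n / total, "under_represented": n / total < threshold}
        for group, n in counts.items()
    }

# Toy training set: each record carries a demographic label alongside its features.
data = (
    [{"age_band": "18-30"}] * 60
    + [{"age_band": "31-60"}] * 35
    + [{"age_band": "60+"}] * 5
)
report = representation_report(data, "age_band")
```

Here the over-60s make up only 5% of the toy dataset and would be flagged, prompting the team to gather more data before training – the practical meaning of keeping a model's training set representative.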
The use of AI in mental healthcare is the way forward, ensuring that patients' conversations with clinicians can be cross-referenced with data to better understand their symptoms. The objectivity AI brings can help destigmatise mental illness and improve access by putting it on a more equal footing with physical healthcare. With more objective methods of diagnosis, patients can feel comfortable seeking support sooner, allowing earlier intervention. This enables a culture shift towards preventative mental health care and empowers patients to feel in control of their diagnoses.