Where AI And Psychiatry Meet – Gurkiret Kaur

Artificial intelligence (AI) is increasingly utilised in health care, particularly in dermatology, oncology, and radiology.

However, AI has triggered feelings of concern, joy, excitement, fear, anticipation, and innumerable other emotions in individuals across the globe.

Concerns regarding AI often centre on machines becoming better at being proactive partners that think, learn, and make decisions when presented with a challenge. The fear is that AI could eventually take over human roles.

Proponents see AI as a way to reduce human workload while preserving empathy and a personal touch in service delivery. It has the potential to help us complete tasks more efficiently, boost our creativity, and broaden our understanding of the world.

It is a waltz between algorithm and data: machine learning models are fed data continuously and, through repetition, learn to anticipate events.

The use of AI is gradually permeating the fabric of our reality. Voice command assistants, chatbots, and self-driving cars are just a few common examples. One area where AI is now being considered is psychiatry.

Dr Paz Garcia-Portilla, a professor of psychiatry at the University of Oviedo in Spain, and her colleagues used AI to assess contextualised illness severity by combining clinical data, psychometry and cognition, functioning, and biomarkers, providing a point of comparison with standard clinical ratings. The trained AI staging model demonstrated reproducibility and was easy to translate into daily use.

At the European Psychiatric Association (EPA) Congress 2023, she discussed a study by Dunlop et al. (2017) that ranked the elements contributing to the CGI-S rating.

The Clinical Global Impressions Scale – Severity (CGI-S), a common standard of assessment, is administered by mental health researchers and physicians worldwide, including in Malaysia. This scale assesses the severity of symptoms and overall illness in psychiatric disorders such as schizophrenia.

Schizophrenia is a persistent mental illness that causes people to lose touch with reality by altering perception, thought, behaviour, and affect. These patients experience a wide range of symptoms. Each case is unique.

In 2016, schizophrenia was ranked as the world’s 12th leading cause of disability. It is also one of the less common mental disorders, affecting about 1 per cent of the global population; patients and carers often lack knowledge and awareness of the condition, which negatively impacts the success of disease management.

Because the CGI-S is subjective in nature, relying heavily on the clinician’s evaluation of a patient’s symptoms, behaviour, functional impairment, and clinical presentation, its validity has come under scrutiny. The scale generates a standardised rating based on the clinician’s observations and interactions, and it serves as a reference for tracking disease progression and treatment response.

Discordance was seen between the CGI-S ratings and the AI model’s classification of patients into the seven stages of disease severity, raising the possibility that patients had been misclassified under the CGI-S standard.
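To make the idea of discordance concrete: agreement between two sets of seven-level severity labels can be quantified by cross-tabulating them and computing a weighted agreement statistic. A minimal sketch in Python, using invented ratings rather than the study’s data:

```python
# Illustrative sketch only: quantifying discordance between clinician
# CGI-S ratings and an AI staging model's labels on the same 1-7 scale.
# The ratings below are invented; they are not data from the study.
from sklearn.metrics import cohen_kappa_score, confusion_matrix

cgi_s = [4, 3, 5, 6, 2, 4, 7, 3, 5, 4]  # clinician-assigned severity
ai    = [5, 3, 4, 6, 2, 5, 6, 3, 5, 5]  # hypothetical AI-assigned stage

# Quadratic weighting penalises large disagreements (stage 2 vs 6) more
# heavily than adjacent ones (stage 4 vs 5), suiting an ordinal scale.
kappa = cohen_kappa_score(cgi_s, ai, weights="quadratic")
print(f"Quadratic-weighted kappa: {kappa:.2f}")

# The confusion matrix shows exactly where the two ratings diverge.
print(confusion_matrix(cgi_s, ai, labels=list(range(1, 8))))
```

A kappa well below 1 would flag systematic disagreement worth investigating, whichever rating turns out to be at fault.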

The use of AI in this case could prompt reviews of existing accepted practices, standards, and guidelines, improving their effectiveness and accuracy.

Another area where AI is gaining acceptance is suicide prevention. There is often a disparity between self-reported and clinician-reported cases of suicidal ideation.

Professor Philippe Courtet of the University of Montpellier in France discussed the use of AI to address the problem of under-reporting. Studies have found that doctors identify only 25 per cent of cases, leaving the other 75 per cent unreported.

“We have identified many, many thousands of risk factors for suicide,” Courtet pointed out, “but they are no more accurate in predicting suicide risk than flipping a coin.”

He envisioned AI as a future therapeutic decision support tool in suicide prevention, using a stratified approach to identify those who are most vulnerable and in need of immediate support and treatment.

In stratified psychiatry, patients are divided into smaller, homogeneous groups with comparable characteristics. Each patient has a unique biosignature, which can be compared against a larger database of historical data for risk assessment, supporting clinicians’ decision making.
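As a toy illustration of this stratified logic (not the actual method Courtet described; the features and outcomes below are randomly generated stand-ins for real biosignatures), patients can be clustered into homogeneous strata, and a new patient’s risk read off from the historical outcomes of their assigned stratum:

```python
# Toy sketch of stratified risk assessment; illustrative only.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
features = rng.normal(size=(500, 6))     # 500 patients, 6 hypothetical measures
outcomes = rng.integers(0, 2, size=500)  # 1 = adverse outcome during follow-up

# Standardise the features, then partition patients into homogeneous strata.
scaler = StandardScaler()
X = scaler.fit_transform(features)
strata = KMeans(n_clusters=5, n_init=10, random_state=0).fit(X)

# A new patient inherits the historical risk of their assigned stratum.
new_patient = scaler.transform(rng.normal(size=(1, 6)))
stratum = strata.predict(new_patient)[0]
risk = outcomes[strata.labels_ == stratum].mean()
print(f"Stratum {stratum}: historical adverse-outcome rate {risk:.0%}")
```

In practice, the strata, features, and outcome definitions would come from clinical evidence, and the risk estimate would support, not replace, the clinician’s judgement.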

The constant updating of AI models to produce new forecasts offers a paradigm for psychiatry and the wider medical professions, moving away from a one-size-fits-all approach. It requires greater collaboration, openness to new possibilities, and the removal of the fear of failure.

In Malaysia, an opportunity exists to apply AI modelling to the National Mental Health Registry (NMHR). The registry was established in 2003 to collect information on schizophrenia patients for planning and evaluating mental health treatment practices.

However, the registry lacks critical follow-up data, such as treatment adherence and patient relapse rates. Infrequent updates have also resulted in discontinuous and fragmented evidence.

Versions of the National Health and Morbidity Survey did not report on schizophrenia, instead focusing on depression and anxiety disorders.

Persons with psychiatric disorders suffer high rates of morbidity and mortality. Compounded by a shortage of mental health professionals, this creates an urgent need for AI to help identify vulnerable and high-risk individuals, so that interventions to treat mental illness can be provided as early as possible.

With the assistance of AI, it may be possible to define mental illnesses more objectively and to diagnose them earlier, even at the prodromal stage.

Could a deep learning model built on registry data such as the NMHR’s predict both a mental disorder diagnosis and the disorder’s progression during the clinical assessment of a patient? Could it also predict severity?
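Purely as a sketch of what such a model could look like (the fields and labels below are random placeholders, not NMHR records), a minimal deep learning classifier mapping registry-style features to a seven-level severity class:

```python
# Minimal sketch of a deep learning severity classifier; toy data only.
import torch
import torch.nn as nn

torch.manual_seed(0)
X = torch.randn(200, 12)         # 12 hypothetical registry fields per patient
y = torch.randint(0, 7, (200,))  # severity classes (0-6, standing in for 1-7)

model = nn.Sequential(
    nn.Linear(12, 32), nn.ReLU(),
    nn.Linear(32, 7),            # one logit per severity class
)
optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(50):          # toy training loop
    optimiser.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimiser.step()

print(f"Final training loss: {loss.item():.3f}")
```

A real system would need far richer longitudinal data, careful validation, and external review before informing any clinical decision.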

With AI, everything from treatment plans to therapist training can potentially be personalised. Patients can even be empowered to take charge of their own care.

However, before such potential can be realised, countries such as Malaysia must improve fundamentals such as electronic health records, which can be costly and consume scarce resources.

Such investment in health care is crucial before health systems can take advantage of the promises of AI technology.

Gurkiret Kaur is a research officer at the Galen Centre for Health and Social Policy.

  • This is the personal opinion of the writer or publication and does not necessarily represent the views of CodeBlue.
