
### Is Artificial Intelligence Prepared to Serve as Your Therapist?

AI falls short as a convincing psychotherapist.

This month, Microsoft unveiled the forthcoming inclusion of a dedicated Copilot AI key on new Windows 11 PCs. Described by company executives as “transformative,” the new key gives users convenient, one-press access to generative artificial intelligence. The potential applications are vast; in mental health, much of the discussion has centered on how AI could improve the treatment of psychiatric conditions, for example by refining the selection of antidepressants and other psychiatric medications. The full extent of AI’s transformative impact, however, is less clear and remains subject to debate, particularly for individuals undergoing psychotherapy.

AI’s integration into the healthcare sector is steadily expanding. In a 2023 survey, more than 10% of healthcare providers reported already using chatbots such as ChatGPT in their daily practice, and nearly half expressed interest in applying AI to tasks such as data entry, scheduling, and clinical research. Beyond administrative functions, some experts suggest that AI could play a pivotal role in delivering care to patients with highly stigmatized psychiatric disorders who might otherwise avoid seeking treatment.

#### Can AI Demonstrate Empathy Towards Human Suffering?

But can AI effectively function as a psychotherapist? Psychiatrists and other mental health professionals train in various forms of psychotherapy, including cognitive behavioral therapy, supportive psychotherapy, and psychodynamic psychotherapy. Each approach has its own nuanced processes and techniques, but all share a common emphasis on _empathy_, the ability to understand and share the feelings of another person.

Empathy requires a certain level of imagination. AI shows something like this capacity when it generates responses from incomplete information, a phenomenon colloquially known as “hallucination.” Psychotherapists do something comparable when trying to grasp a patient’s distress, offering interpretations and analyses where a patient’s narrative leaves gaps. But an empathetic connection is rooted in shared human experiences, such as fear, social injustice, or joy, that remain beyond AI’s reach. These intangible shared experiences are what bond therapist and patient.

AI’s limited ability to improvise keeps it from engaging with the nuances of genuine human suffering. Unlike AI, a trained psychotherapist can read complex thoughts and behavioral patterns and respond accordingly, even in moments of therapeutic silence. AI, by contrast, generates responses by using past data to predict what should come next, a method that can fail badly: one report describes a Belgian man who tragically took his own life after a series of conversations with a chatbot about eco-anxiety.

In my role as an emergency psychiatrist, I encounter individuals facing various crises. Recently, I met a young father from South America who undertook a perilous journey to the United States via Central America. Through tears and moments of contemplative silence, he shared his aspirations to earn a living and reunite with his family in the United States to escape the turmoil in his home country. His anguish was palpable and deeply rooted in the human experience. Can AI truly empathize with the struggles of a migrant father separated from his child while crossing the Darien Gap?

#### AI’s Propensity for Bias Raises Concerns

Artificial intelligence is susceptible to biases, including gender and racial biases, which it absorbs from the data and algorithms it is built on. That predisposition poses significant challenges in psychotherapy, where racial concordance between patient and therapist can matter a great deal. A 2022 study published in the Journal of Racial and Ethnic Health Disparities found that 83% of Black caregivers considered it important to have a mental health provider who shared their racial and ethnic background. Preferences for racial concordance stem from shared experiences, cultural understanding, and stronger rapport with the therapist, qualities that currently exceed AI’s capabilities.

While AI has the potential to advance psychiatry and improve the delivery of mental healthcare, its role in clinical psychiatry remains ambiguous. Research suggests, for example, that AI can analyze human circadian rhythms and detect early signs of a major depressive episode before patients themselves recognize symptoms of low mood, enabling timelier intervention by healthcare professionals.

Ignoring AI’s impact in this domain would be unwise, yet the precise place of generative AI in clinical psychiatry is still undetermined: the possibilities it presents are enticing, but questions about its capabilities remain unresolved, and a sense of balance is crucial. As Sam Altman of OpenAI noted at the World Economic Forum in Davos, Switzerland, AI technology is not adept at handling life-or-death scenarios.

For now, the nuanced understanding of human experiences that defines our humanity is best left to humans themselves.
