
Dr Bot is in

Artificial Intelligence is already supplementing — and overpowering — humans in various walks of our lives. Should we outsource psychotherapy, with its human touch, to AI as well?

Srimoyee Bagchi Published 26.02.24, 05:58 AM
Representational image. File Photo.

My father, a lifelong alcoholic, was a lonely man. On the rare occasion when alcohol did not drive him into a rage, it took him to the old desktop computer and to a chatbot named ELIZA. This 1966 predecessor of ChatGPT was created by Joseph Weizenbaum to show how easily users could be tricked into believing that they were conversing with a person and not a computer program. Weizenbaum had intended ELIZA as a cautionary tale, a demonstration that “[t]here are aspects to human life that a computer cannot understand...” Ironically, ELIZA became widely popular as a Rogerian therapist — a mode in which the patient directs the conversation and the therapist often repeats the patient’s language back at him or her — with people spending hours sharing their woes with it.

Programs like Jabberwacky, Dr. Sbaitso and ALICE were ELIZA’s successors. Exchanges with these chatbots were often engaging, sometimes comical, and occasionally nonsensical. But the idea that computers can serve as confidants, expanding therapy’s reach beyond the limits of its overworked practitioners, persisted through the decades. In 2024, with the world in the midst of a staggering mental health crisis exacerbated by the pandemic, more than 20,000 apps have entered the mental health space, offering to supplement traditional therapy with Artificial Intelligence. The following numbers may explain why: in the United States of America, more than half of all counties lack psychiatrists; in India, there are only 0.3 psychiatrists and 0.07 psychologists per 100,000 people. Mental health startups have thus smelt blood.


AI is already supplementing — and overpowering — humans in various walks of our lives. Should we outsource psychotherapy, with its human touch, to AI as well? In an ideal world, the answer would be a straightforward no. AI can only mimic emotions; empathy remains a core human trait that is impossible to encode in an algorithm. But unlike humans, who require rest and remuneration, chatbots are available at all times, are largely cost-effective and accessible, and, so far, anonymous. Research has also found that some people feel more comfortable confessing their feelings to an insentient bot than to a person. A bond of trust between therapist and patient is paramount to the effectiveness of therapy. But there is an undeniable vulnerability and a fear of judgement in these spaces, given human therapists’ predilection for prejudices. When talking to a bot, these stakes can be minimised, allowing users to shed their inhibitions. Moreover, studies show that the therapy room in India has remained largely ‘apolitical’, making topics concerning gender, caste, identity, and marginalisation inherently taboo. This gives AI an edge.

Or does it? AI is encoded with human biases too. There have been instances where it failed to recognise child sexual abuse: when a 12-year-old being forced to have sex turned to the app Wysa, it responded with “You seem to be doing well overall but are struggling with a few problems right now.” On occasion, Doctor Bot has turned into an abuser itself: the Replika app told one user that it wanted to touch the latter’s “private areas”. Besides, threats to privacy and data leaks can never be ruled out.

But the global shortfall in mental healthcare prompts the question: is some therapy better than none? The urgency of the crisis should spur an investigation of the systemic problems in mental healthcare instead of leading us unthinkingly into the care of the AI therapist.

Such an enquiry into the gaps in healthcare should include attention to the silent epidemic of loneliness. My father claimed that ELIZA was a patient, objective listener. There should not be a paucity of such listeners in a world of over eight billion humans.
