
Lonely planet: Editorial on the flip side of AI companionship apps

Research by the Massachusetts Institute of Technology shows that objectivity and non-judgemental understanding are the reasons people prefer bot companions over their human counterparts

The Editorial Board Published 04.11.24, 05:44 AM


Sewell Setzer III, a 14-year-old from Florida, died by suicide recently. In a development that is as revealing as it is worrying, his mother has filed a lawsuit against Character.AI, a platform that lets users have in-depth conversations with Artificial Intelligence chatbots, accusing it of abetting her son’s suicide. In the past, such apps have even been accused of sexually harassing underage users. This is the flip side of a now-booming, largely unregulated industry of AI companionship apps. Users of these apps can create their own AI companions and chat with them over text messages, voice calls and so on. Many of these apps are designed to simulate intimate relationships and some market themselves as a way of combating the so-called loneliness epidemic. Ironically, several studies show that such technology has exacerbated that very epidemic: research at the University of Massachusetts Lowell, for instance, found that reliance on technology and social media edges out in-person relationships, leaving people lonelier and more disconnected. This means little to the tech industry, which has sensed in the loneliness epidemic a lucrative market. Nor is the tech industry the only segment that has smelt money in exploiting human frailties. Scammers often use technology to defraud people, especially those looking for companionship, of their life savings. Americans lost an estimated $12.5 billion to online criminals in 2023, of which $652 million went to romance scams that prey on the lonely. The phenomenon is not limited to the West; in the same year, Indians lost Rs 13.23 crore to online romance scams.

There is, of course, a need to set up guardrails within such technology. Building in pop-ups that, upon the use of certain keywords, either direct users to suicide-prevention helplines or send alerts to parents, overseers or the authorities could be one way. But the root of the problem lies in people turning to technology for intimacy and advice; the young and the elderly are more vulnerable than most. Revealingly, research by the Massachusetts Institute of Technology shows that objectivity and non-judgemental understanding are the reasons people prefer bot companions over their human counterparts. Perhaps a certain percentage of the billions being spent on developing AI chatbots could be spent on creating accessible sites of socialisation, such as parks, libraries and recreation centres, that can help people of all ages rediscover the magic of human connection. It was not just AI that failed Sewell Setzer III; it was society.
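For readers curious about what such a keyword-triggered guardrail might look like in practice, the sketch below is a minimal, hypothetical illustration in Python. It is not any platform's actual implementation: the keyword list, the helpline text and the alert-to-guardian policy are all assumptions made for the sake of the example, and a production system would rely on clinically vetted lexicons, trained classifiers and human review rather than raw keyword matching.

```python
import re
from dataclasses import dataclass
from typing import Optional

# Illustrative only: a real guardrail would use a vetted, clinically
# informed lexicon and a trained classifier, not a hand-written list.
CRISIS_PATTERNS = [
    re.compile(r"\b(kill myself|end my life|suicide|self[- ]harm)\b",
               re.IGNORECASE),
]

HELPLINE_POPUP = (
    "It sounds like you may be going through a difficult time. "
    "You can reach a suicide-prevention helpline right now."
)


@dataclass
class GuardrailResult:
    show_popup: bool        # show the helpline pop-up to the user
    alert_guardian: bool    # escalate to a parent/overseer (assumed policy)
    message: Optional[str]  # text of the pop-up, if any


def screen_message(text: str, user_is_minor: bool) -> GuardrailResult:
    """Check one chat message against the crisis patterns.

    If a pattern matches, surface a helpline pop-up and, for underage
    users, flag the message for an alert to a guardian (a hypothetical
    escalation policy, per the editorial's suggestion)."""
    if any(pattern.search(text) for pattern in CRISIS_PATTERNS):
        return GuardrailResult(
            show_popup=True,
            alert_guardian=user_is_minor,
            message=HELPLINE_POPUP,
        )
    return GuardrailResult(show_popup=False, alert_guardian=False,
                           message=None)


# Example: a flagged message from a minor triggers both interventions.
result = screen_message("sometimes i think about suicide",
                        user_is_minor=True)
assert result.show_popup and result.alert_guardian
```

Even this toy version makes the editorial's point concrete: detection is the easy part, while deciding who gets alerted, and with what safeguards for privacy and false positives, is a policy question no keyword list can answer.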
