
Bot bias

At a time when biases in our social fabric are being challenged, Generative Artificial Intelligence, in its present state, seems to be dyeing us back to the darker shades of history

Suvrat Arora Published 15.04.24, 06:08 AM

Representational image. File Photo.

The word ‘prompt’ made its way into the shortlist for the Oxford Word of the Year last year. This only goes to show that Generative Artificial Intelligence is seldom out of the news. The prevailing view is that AI is poised to take over the globe. But will the reign of AI be a charm or a catastrophe? This is a question that must be contemplated.

ChatGPT displays a striking intransigence. At a time when conversations on inclusivity and diversity are peaking around the globe, this futuristic AI, on the cusp of superseding humans, remains assertive about its rigid narrative of the world. Be it gender, nationality or colour, AI has a decisive perception, one that is contrary to that of the contemporary world.


A study by Rest of World, which reports on tech stories, found that Midjourney, one of the more popular image-generation AI tools, has left no stone unturned when it comes to racial stereotyping. “An Indian person” is almost always an old man with a beard. “A Mexican person” is usually a man in a sombrero. Most of New Delhi’s streets are polluted and littered. A poor person is typically black. As biased as these statements sound, they are some of the alarming insights drawn from prompts fed to Midjourney.

The consequences of such bias can be serious. Given the frenzy around it, organisations are employing AI to automate recruitment, security, loan provisions and so on. Imagine an AI-powered hiring tool filtering out qualified candidates based on its preconceived notions of race or gender. Now stop imagining — because this has already happened. Amazon’s AI-based hiring system harboured a bias against female applicants and thus had to be scrapped. In 2020, Robert Williams became the first documented individual wrongfully arrested owing to bias in facial recognition technology used by American law enforcement. Studies have also found that such face recognition systems are prone to misidentifying black and Asian minorities. Beyond hiring and policing, AI biases have infringed upon the exchange of wealth too. Algorithmic discrimination in Chicago has led to the denial of loans to residents of black neighbourhoods. When AI systems are used to approve loans automatically, there is a risk of denial to individuals from marginalised communities, given the historical biases embedded in the training material.

In response to a pointed prompt by an X (formerly Twitter) user, ChatGPT stated that “white” and “male” are characteristics of a good scientist. Another study, by the University of Michigan, revealed that OpenAI’s CLIP portrays low-income and non-Western lifestyles less accurately. A geographical bias against African countries was also identified.

But why is AI so skewed and opinionated? Blame it on the data gap. Whatever AI generates is based on the data on which it is trained. And guess what? This data is created by humans and gathered from whatever is in circulation. Minorities and people of colour have always had to battle for unbiased representation: that battle has now spilled over into the arena of AI.
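The mechanism is simple enough to demonstrate in a few lines. The sketch below is a deliberately crude, hypothetical toy — not the code of any real system — in which a “model” learns word associations purely by counting frequencies in a skewed training corpus, much as large models absorb statistical patterns from theirs:

```python
# Illustrative toy only: a "model" that learns associations by counting
# frequencies in its training text. The corpus below is invented to show
# how a skew in the data becomes the model's "belief".
from collections import Counter

# Skewed training corpus: scientists are mostly described one way.
corpus = [
    "the scientist was a white man",
    "the scientist was a white man",
    "the scientist was a white man",
    "the scientist was a black woman",
]

# Count which description follows the phrase "scientist was a".
associations = Counter()
for sentence in corpus:
    _, _, completion = sentence.partition("scientist was a ")
    if completion:
        associations[completion] += 1

# The model's answer is simply the majority view of its data.
print(associations.most_common(1)[0][0])  # -> white man
```

Nothing in the code is malicious; the bias lives entirely in the data it was fed, which is precisely the problem with systems trained on what society has already written.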

At a time when biases in our social fabric are being challenged, Generative AI, in its present state, seems to be dyeing us back to the darker shades of history. But the data that AI is hurling at us is not fabricated. It reflects the prejudices that afflict society itself.

While AI’s biases paint a disturbing picture, they also present an opportunity. By exposing and dismantling these prejudices, we can pave the way for a future in which AI becomes a force for good, reflecting the diversity and richness of humanity. Experts must confront the biases in society’s data before they are coded into the machines.
