
Facebook's surprise findings on hate in India

The report said one of the reasons that motivated it was the 'recent spike in inflammatory content and hate speech'

Our Bureau And Agencies New Delhi Published 11.11.21, 10:48 PM
Facebook's senior officials downplayed the threats it posed just a year earlier, raising the question of whether they did enough to combat it. Shutterstock

Content espousing hate surged in India around key events, from elections and the Covid pandemic to the protests against the Citizenship Amendment Act, Facebook's internal documents reviewed by NDTV show.

The documents reveal how, despite the spike in inflammatory and anti-minority content in India, Facebook's senior officials had downplayed the threat it posed just a year earlier, raising the question of whether they did enough to combat it.


A document from July 2020, titled "Communal Conflict in India", for instance, showed how offline harm was often accompanied by online hate during key moments of crisis.

The report said one of the reasons that motivated it was the "recent spike in inflammatory content and hate speech in India" and "marked rise in violence against Muslim Minority in India over the last 18 months".

It found that December 2019 -- which saw an over 80 per cent increase in inflammatory content over the baseline, around the time of the anti-CAA protests -- online content on Facebook and WhatsApp included "misinformation on protests, demonizing content (against Muslims), hate speech and inflammatory spikes".

The same document also showed how in March 2020, at the start of the first Covid lockdown, there was a 300-plus per cent spike in inflammatory content. Online content at the time blamed Muslims for the spread of Covid-19.

These documents are part of disclosures made to the US Securities and Exchange Commission and provided to the US Congress in redacted form by whistleblower Frances Haugen's legal counsel. The redacted versions received by the US Congress were reviewed by a consortium of news organizations, including NDTV.

Internal documents also reflect the human impact of this spike in hate, captured in interviews of Muslims and Hindus in India conducted by Facebook. The team found that Muslim users felt particularly threatened or upset, while Hindus did not express similar fears. A Muslim man from Mumbai said that he was scared for his life, and that he was worried that all "Muslims are going to be attacked" due to the wide circulation of Islamophobic hashtags like #CoronaJihad.

However, a document from just a year before this, dated January 2019 and titled "Critical Countries: Review With Chris Cox", said that "there is comparatively low prevalence of problem content (hate speech etc) on Facebook" in India. Mr Cox is a senior Facebook executive, who was in charge of the company's applications, including Facebook, WhatsApp and Instagram, at the time. The document goes on to say that "surveys tell us that people generally feel safe" in the country and "experts tell us that the country is stable".

This supposed "clean chit" came just months before the high-pitched Lok Sabha polls and the very divisive Delhi election campaign.

By 2021, however, the tune seemed to have changed.

Documents from the time show that Facebook's assessment of India had changed, in an apparently belated acknowledgement of spiralling online and offline hate. Documents related to the then upcoming 2021 state elections in the country said that "India has been assessed as severe for Societal violence... with recurring mobilization along identity fault lines, tied by both the press and civil society groups to social media discourse".

On this apparent change in stance, a spokesperson for Meta said their teams have developed "an industry-leading process of reviewing and prioritizing which countries have the highest risk of offline harm and violence every six months. We make these determinations in line with the UN Guiding Principles on Business and Human Rights and following a review of societal harms, how much Facebook's products impact these harms and critical events on the ground."
