
Facebook's design hinders efforts to combat misinformation, reveals study by GWU scientists

Facebook's design favours community connection over misinformation control, say scientists

PTI New Delhi Published 16.09.23, 02:20 PM
Representational image. Image courtesy: Facebook

Facebook's core design undermined the social media giant's own efforts to combat misinformation running rife on the platform, scientists analysing its misinformation policies have said.

The platform's architecture pushed back even when Facebook tweaked its algorithms and removed content and accounts to combat vaccine misinformation, the researchers at the George Washington University, US, found.


Engagement with anti-vaccine content did not decline, despite Facebook's significant efforts to remove such content during the COVID-19 pandemic, according to their study published in the journal Science Advances.

The scientists say that these consequences resulted from what the platform is designed to do - enabling community members to connect over common interests, which include both pro- and anti-vaccine persuasions.

"(Facebook) is designed to allow motivated people to build communities and easily exchange information around any topic," said David Broniatowski, lead study author and an associate professor of engineering management and systems engineering.

"Individuals highly motivated to find and share anti-vaccine content are just using the system the way it's designed to be used, which makes it hard to balance those behaviours against public health or other public safety concerns," said Broniatowski.

In the anti-vaccine content that remained on the platform, links to off-platform, low-credibility sites and "alternative" social media platforms increased in number, the researchers said.

This remaining content also became more misinformative, containing sensationalist false claims about vaccine side effects which were often too new to be fact-checked in real time, they found.

Further, anti-vaccine content producers were found to leverage the platform more efficiently than pro-vaccine content producers, effectively coordinating content delivery across pages, groups, and users' news feeds, even though both groups had large page networks.

"Collateral damage" in the form of some pro-vaccine content being removed as a result of the platform's policies and the overall vaccine-related discourse becoming politically charged and polarised could also have contributed, the study said.

Broniatowski pointed out that the discussion about social media platforms and artificial intelligence governance largely revolves around either content or algorithms.

"To effectively tackle misinformation and other online harms, we need to move beyond content and algorithms to also focus on design and architecture.

"Removing content or changing algorithms can be ineffective if it doesn't change what the platform is designed to do. You have to change the architecture if you want to balance (anti-vaccine behaviours against public health concerns)," said Broniatowski.

Social media platform designers could develop a set of "building codes" for their platforms informed by scientific evidence to reduce online harms and ensure users' protection, the researchers said.

Except for the headline, this story has not been edited by The Telegraph Online staff and has been published from a syndicated feed.
