
Malady and immunisation

Psychologically ‘inoculating’ against misinformation helps blunt its power

Nico Grant And Tiffany Hsu Published 19.09.22, 04:07 AM

In the fight against online misinformation, falsehoods have key advantages — they crop up fast and spread at the speed of electrons, and there is a lag period before fact-checkers can debunk them.

So researchers at Google and the UK universities of Cambridge and Bristol tested a different approach that tries to undermine misinformation before people see it. They call it “pre-bunking”.


The researchers found that psychologically “inoculating” Internet users against lies and conspiracy theories — by preemptively showing them videos about the tactics behind misinformation — made people more sceptical of falsehoods, according to an academic paper published in Science Advances. But effective educational tools may still not be enough to reach people with hardened political beliefs, the researchers found.

Since Russia spread disinformation on Facebook during the 2016 US presidential election, tech companies have struggled to balance concerns about censorship with fighting online lies and conspiracy theories.

The strategies being deployed during the midterm vote in the US this year by Facebook, TikTok and other companies often resemble tactics developed to deal with misinformation in past elections: partnerships with fact-checking groups, warning labels, portals with vetted explainers, as well as post removals and user bans.

Social media platforms have made attempts to pre-bunk before, though those efforts have done little to slow the spread of false information. Most have also not been as detailed — or as entertaining — as the videos used in the studies by the researchers.

Twitter said it would try to “enable healthy civic conversation” during the midterm elections in part by reviving pop-up warnings, which it used during the 2020 election. Warnings, written in multiple languages, will appear as prompts placed atop users’ feeds and in searches for certain topics.

The paper details seven experiments with 30,000 participants. The researchers bought YouTube ad space to show users in the US 90-second videos aiming to teach them about propaganda tropes and manipulation techniques.

One million adults watched one of the ads for 30 seconds or longer.

The users were taught about tactics such as scapegoating and deliberate incoherence (the use of conflicting explanations to assert that something is true) so that they could spot lies.

Researchers tested some participants within 24 hours of seeing a pre-bunk video, and found a 5 per cent increase in their ability to recognise misinformation techniques.

One video opens with a mournful piano tune and a little girl grasping a teddy bear, as a narrator says, “what happens next will make you tear up”. Then the narrator explains that emotional content compels people to pay more attention than they otherwise would, and that fearmongering and appeals to outrage are keys to spreading moral and political ideas on social media.

The video offers examples, such as headlines that describe a “horrific” accident instead of a “serious” one, before reminding viewers that if something they see makes them angry, “someone may be pulling your strings”.

Beth Goldberg, one of the authors and the head of research and development at Jigsaw, a technology incubator within Google, said that pre-bunking leans into people’s innate desire to not be duped. “This is one of the few misinformation interventions that I’ve seen at least that has worked not just across the conspiratorial spectrum, but across the political spectrum,” she said.

Tech companies, academics and nongovernmental organisations fighting misinformation have the disadvantage of never knowing what lie will spread next. But Professor Stephan Lewandowsky of the University of Bristol, a co-author of the paper, said propaganda and lies were predictable, nearly always drawn from the same playbook.

“Fact-checkers can only rebut a fraction of the falsehoods circulating online,” Lewandowsky said. “We need to teach people to recognise the misinformation playbook, so they understand when they are being misled.”

NYTNS
