
US presidential election falsehoods take off on YouTube as it looks the other way

Within months, the largest video platform became a home for election conspiracy theories, half-truths and lies

Nico Grant, San Francisco | Published 02.11.24, 06:50 AM

In June 2023, YouTube decided to stop fighting the most persistent strain of election misinformation in the US: the falsehood that President Joe Biden stole the 2020 election from Donald J. Trump.

Within months, the largest video platform became a home for election conspiracy theories, half-truths and lies. They in turn became a source of revenue for YouTube, which announced growing quarterly ad sales on Tuesday.


During four tumultuous months of this year’s presidential campaign, researchers from Media Matters for America, a group that monitors information from conservative sources, examined the consequences of YouTube’s about-face.

While Media Matters is a progressive organisation that regularly criticises conservatives, reporters and academics frequently cite it as a source on YouTube misinformation because it devotes significant resources to tracking the vast platform.

The New York Times independently verified the research, examining all of the videos identified by Media Matters and determining whether YouTube placed ads or fact-check labels on them.

From May to August, researchers at Media Matters tracked 30 of the most popular YouTube channels they identified as persistently spreading election misinformation, to analyse the narratives they shared in the run-up to November’s election.

The 30 conservative channels posted 286 videos containing election misinformation, which racked up more than 47 million views. YouTube generated revenue from more than a third of those videos by placing ads before or during them, researchers found. Some commentators also made money from those videos and other monetised features available to members of the YouTube Partner Programme.

Rudolph Giuliani, the former New York mayor, posted more false electoral claims to YouTube than any other major commentator in the research group, the analysis concluded. He said in May, for example, that if he did not rehash the 2020 election, the 2024 election would be stolen.

Journalist Tucker Carlson and pundit Ben Shapiro did not directly respond to a series of questions, but attacked reporting from The New York Times. Giuliani said in an X post that he was “proud to be included with Ben and Tucker — two GREAT Patriots!” He added that “we are particularly honoured by the designation as #1 among ‘major YouTube creators’.”

YouTube, which is owned by Google, has prided itself on connecting viewers with “authoritative information” about elections. But in this presidential contest, it acted as a megaphone for conspiracy theories. The commentators used false narratives about 2020 as a foundation for elaborate claims that the 2024 presidential contest was also rigged — all while YouTube made money from them.

Kayla Gogarty, a research director at Media Matters who led the analysis, said that “YouTube is allowing these right-wing accounts and channels to undermine the 2024 results.”

A YouTube spokeswoman said that the company reviewed eight videos, identified by The Times, and that those did not violate its community guidelines.

“The ability to openly debate political ideas, even those that are controversial, is an important value — especially in the midst of election season,” she said in a statement.

“Most” of the 30 tracked channels are “ineligible for advertising”, and some had previously violated the company’s content policies, the spokeswoman added. “This report demonstrates our consistent approach to enforcing our policies.”

YouTube said it removes videos that mislead voters on how to vote, encourage election interference or make violent threats.

Mary Ellen Coe, YouTube’s chief business officer, described the platform’s approach to the election as “cautious and vigilant” in a September interview.

“We have significant investment in this area in terms of making sure that first and foremost we’re raising authoritative content, and then we are removing or reducing things that might represent misinformation,” said Coe, who was not directly responding to the Media Matters research.

In December 2020, YouTube banned content that “misleads people by alleging that widespread fraud or errors changed the outcome of the 2020 US presidential election”, the same policy it had applied to past presidential races.

But in June 2023, the platform reversed course, saying creators were allowed to dispute the outcome of any past presidential race as YouTube tried to offer “a home for open discussion and debate during the ongoing election season”. YouTube declined to comment on whether it would reimpose a ban on misinformation about the outcome of the race after states certified the votes, as it did in 2020.

YouTube removed only three of the videos that Media Matters found and placed labels linking to factual information on 21 of them, though most of those election labels were later removed.

Some of the commentators seized on news events.

Those moments included Trump’s conviction on May 30 on 34 felony counts related to hush-money payments. Kash Patel, who served in Trump’s administration, said in a video posted by NTD, an independent television network, that the justice system was “rigged” against the former President to interfere with the election.

YouTube’s approach to the presidential race garnered attention in September when two Russians, working for the state-run media outlet RT, were indicted in connection with an effort to spread disinformation on the video platform.

RT allegedly funnelled money through Tenet Media, which operated a YouTube channel with more than 16 million views. Tenet then paid popular pundits to create content, including Benny Johnson and Tim Pool, whose channels were monitored by the Media Matters researchers.

Johnson, Pool and others have said they had no knowledge about the true source of the payments. YouTube deleted Tenet’s channel but did not terminate Johnson’s and Pool’s accounts.

Johnson said his channel provided “a valuable access point for independent thought and journalism”, and reiterated his stance that the 2020 election was stolen. He attacked reporting from The Times in an online video. Pool did not respond to a request for comment.

YouTube made money from more of Johnson’s videos than from those of any other creator the researchers tracked, placing ads on 35 of his 39 videos. He shared conspiracy theories that have been debunked, including that Democrats “broke the machines” to deny Republicans a victory in Arizona on election night in 2020.

Political commentators on YouTube used Trump’s conviction to claim that his legal troubles were a concerted effort to hamper his electoral prospects.

Mike Davis, a former Senate aide who runs a judicial advocacy group called the Article III Project, claimed that Biden, his allies and his aides were behind Trump’s conviction. (Trump was charged by state prosecutors, who are not under the control of the President, and convicted by a jury in New York.)

Davis said, “Biden is behind the unprecedented indictments of Trump.” He responded to questions from The Times with an attack on The Times’s reporting in a post on X.

When YouTube last barred creators from uploading misinformation about the presidential election, the platform said it worried the content would mislead users.

Since then, the company’s public stance has shifted. It said banning this type of election misinformation did not reduce real-world harm and has focused instead on its benefits.

“What’s important to us,” Coe, the YouTube executive, said, “is that we’re representing a broad spectrum of views.”

New York Times News Service
