
YouTube May Have Misinformation Blind Spots, Researchers Say

The video platform said it had limited the spread of misinformation ahead of Election Day, but new research showed that false narratives continued to slip through.

The nonprofit Media Matters identified a variety of videos on YouTube that contained misinformation. Credit: YouTube

Nico Grant, based in San Francisco, writes about Google and the other pieces of its parent company, Alphabet.

No, China has not worked with Democrats to steal the midterm elections, as some people on YouTube have claimed. Nor has Saudi Arabia.

And there is no evidence that an “overwhelming amount of fraud” tipped Pennsylvania in 2020, or that electronic voting machines will manipulate results next week, as one conservative activist has claimed in a video.

Ahead of the midterm elections, disinformation watchdogs say they are concerned that what has been described as an aggressive effort by YouTube to confront misinformation on the Google-owned platform has developed blind spots. In particular, they are worried about YouTube’s TikTok-like service that offers very short videos, and about the platform’s Spanish-language videos.

But the situation is difficult to understand clearly, more than a dozen researchers said in interviews with The New York Times, because they have limited access to data and because examining videos is time-intensive work.

Jiore Craig, the head of digital integrity at the Institute for Strategic Dialogue, said it could be difficult to monitor videos for misinformation. Credit: Tag Christof for The New York Times

“It’s easier to do research with other forms of content,” such as text found on Facebook or Twitter, said Jiore Craig, the head of digital integrity for the Institute for Strategic Dialogue, or I.S.D., a nonprofit that counters extremism and disinformation. “That puts YouTube in a situation where they get off easier.”

While Facebook and Twitter are closely scrutinized for misinformation, YouTube has often flown under the radar, despite the broad influence of the video platform. It reaches more than two billion people and houses the web’s second-most popular search engine.

YouTube banned videos that claimed widespread fraud in the 2020 presidential election, but it has not established a comparable policy for the midterms, a move that prompted criticism from some watchdogs.

“You don’t build a sprinkler system after the building is on fire,” said Angelo Carusone, the president of Media Matters for America, a nonprofit that monitors conservative misinformation.

A YouTube spokeswoman said the company disagreed with some of the criticism of its work fighting misinformation. “We’ve heavily invested in our policies and systems to make sure we’re successfully combating election-related misinformation with a multilayered approach,” the spokeswoman, Ivy Choi, said in a statement.

YouTube said that it removed a number of videos that The New York Times flagged for violating its policies on spam and election integrity and that it determined that other content did not violate its policies. The company also said that from April to June it took down 122,000 videos that contained misinformation.

“Our community guidelines prohibit misleading voters on how to vote, encouraging interference in the democratic process and falsely claiming that the 2020 U.S. election was rigged or stolen,” Ms. Choi said. “These policies apply globally, regardless of language.”

YouTube intensified its stance against political disinformation after the 2020 presidential election. Some YouTube creators took to the platform and livestreamed the Jan. 6, 2021, attack on the Capitol. Within 24 hours, the company began punishing people who spread the lie that the 2020 election was stolen and revoked President Donald J. Trump’s uploading privileges.

YouTube committed $15 million to hire more than 100 additional content moderators to help with the midterm elections and the presidential election in Brazil, and the company has more than 10,000 moderators stationed around the world, according to a person familiar with the matter who was not authorized to discuss staffing decisions.

The company has fine-tuned its recommendation algorithm so that the platform does not suggest political videos from unverified sources to viewers, according to another person familiar with the matter. YouTube also created an election war room involving dozens of officials and has been preparing to quickly remove videos and livestreams that violate its policies on Election Day, the person said.

Still, researchers argued that YouTube could have been even more proactive in clamping down on false narratives that might continue to reverberate after the election.

The most prominent election-related conspiracy theory on YouTube has been the unfounded assertion that some Americans have cheated by stuffing drop boxes with multiple ballots. The idea came from a discredited, conspiracy-laden documentary titled “2000 Mules,” which said Mr. Trump had lost re-election because of illegal votes cast at drop boxes.

The discredited documentary “2000 Mules” contains false election narratives that have spread on YouTube. Credit: YouTube

I.S.D. examined YouTube Shorts and found at least a dozen examples of short videos that echoed the ballot-trafficking allegations of “2000 Mules,” without warning labels that would counter the misinformation or provide authoritative election information, according to links shared with The New York Times. The viewership of the videos varied widely, from a few dozen views to tens of thousands. Two of the videos featured links to the film itself.

I.S.D. found the videos through keyword searches. Its list was not meant to be exhaustive, but “these Shorts were identified with relative ease, demonstrating that they remain easily accessible,” three I.S.D. researchers wrote in a report. Some of the videos feature men addressing the camera, in a car or a home, promoting their strong belief in the film. Other videos promote the documentary without personal commentary.

The nonprofit group also examined TikTok and Instagram Reels, rivals of YouTube Shorts, and found that both platforms had spread similar misinformation.

Ms. Craig, of I.S.D., said nonprofit groups like hers were working hard ahead of Election Day to catch and counter misinformation that remained on the social media platforms of tech giants, even though those companies have billions of dollars and thousands of content moderators.

“Our teams are strung out picking up the slack of the well-resourced entities that could be doing this kind of work,” she said.

Even though videos on YouTube Shorts are not longer than one minute, they are more difficult to review than longer videos, according to two people familiar with the matter.

The company relies on artificial intelligence to scan what people have uploaded to its platform. Some of the A.I. systems work in minutes, and others in hours, looking for signs that something is wrong with the content, one of the people said. Shorter videos give off fewer signals than longer ones, the person said, so YouTube has begun developing a system better suited to the short format.
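The reporting describes these systems only at a high level. Purely as an illustrative sketch, with every name, field and flagged term below an assumption rather than anything YouTube has disclosed, the underlying idea is that automated review scores a video on whatever text signals it exposes, and a one-minute Short simply exposes fewer of them:

```python
from dataclasses import dataclass


@dataclass
class Video:
    """Hypothetical video record; the fields are illustrative, not YouTube's schema."""
    duration_seconds: int
    transcript: str = ""
    title: str = ""
    description: str = ""


def extract_signals(video: Video) -> list[str]:
    """Collect the text-based signals available for automated review.

    Longer videos tend to produce longer transcripts and richer metadata,
    so they expose more signals to a classifier than a sub-60-second Short.
    """
    signals: list[str] = []
    for text in (video.title, video.description, video.transcript):
        signals.extend(text.lower().split())
    return signals


# Placeholder phrase list; a real system would use trained models, not keywords.
SUSPECT_TERMS = {"rigged", "stolen", "mules"}


def review_confidence(video: Video) -> float:
    """Toy score: fraction of signals that match flagged terms.

    With very few signals, any automated score is noisy, which is one way to
    read the reporting that short videos "give off fewer signals."
    """
    signals = extract_signals(video)
    if not signals:
        return 0.0
    hits = sum(1 for token in signals if token in SUSPECT_TERMS)
    return hits / len(signals)


if __name__ == "__main__":
    short = Video(duration_seconds=45, transcript="they say the election was rigged")
    longer = Video(duration_seconds=900,
                   transcript="a long policy discussion about the election " * 100)
    for video in (short, longer):
        print(len(extract_signals(video)), review_confidence(video))
```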

YouTube has also struggled to rein in Spanish-language misinformation, according to research and analysis from Media Matters and Equis, a nonprofit focused on the Latino community.

Almost half of Latinos have turned to YouTube weekly for news, more than to any other social media platform, said Jacobo Licona, a researcher at Equis. And those viewers have access to a profusion of misinformation and one-sided political propaganda on the platform, he said, with Latin American influencers based in countries like Mexico, Colombia and Venezuela wading into U.S. politics.

Many of them have co-opted familiar narratives, such as false claims about dead people voting in the United States, and translated them into Spanish.

In October, YouTube asked a group that tracks Spanish-language misinformation on the site for access to its monitoring data, according to two people familiar with the request. The company was looking for outside help in policing its platform, and the group worried that YouTube had not made necessary investments in Spanish content, they said.

YouTube said it communicated with subject-matter experts to gain further insight ahead of the midterms. It also said it had made significant investments in combating harmful misinformation across languages, including Spanish.

Kayla Gogarty, the deputy research director at Media Matters, worries about the real-world impact of misinformation on YouTube. Credit: Octavio Jones for The New York Times

YouTube has several laborious processes for moderating Spanish-language videos. The company has Spanish-speaking human moderators, who help teach A.I. systems that also vet content. One A.I. method has involved transcribing videos and reviewing the text, an employee said. Another path has been to use Google Translate to convert the transcript from Spanish to English. These methods have not always proven to be accurate because of idioms and slang, the person said.
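As a rough, hypothetical sketch rather than YouTube's actual pipeline, the transcribe-then-translate-then-review approach described above could be outlined as follows, with the transcription and translation steps reduced to stand-in stubs and a simple phrase list standing in for the classifier that human moderators help train:

```python
def transcribe_audio(video_path: str, language: str = "es") -> str:
    """Stub for a speech-to-text step (hypothetical; a real system would call
    an ASR model here rather than return canned text)."""
    return "dicen que los muertos votaron en las elecciones"


def translate_to_english(text: str) -> str:
    """Stub for machine translation (hypothetical). Idioms and slang are the
    weak point of this step, per the reporting."""
    return "they say the dead voted in the elections"


# Illustrative only; not a real moderation list.
FLAGGED_PHRASES = {"dead voted", "stolen election"}


def flag_for_review(video_path: str) -> bool:
    """Transcribe, translate, then check the English text against a phrase list.

    A miss here, such as slang the translator mangles, is exactly the failure
    mode the article describes for Spanish-language content.
    """
    spanish_text = transcribe_audio(video_path)
    english_text = translate_to_english(spanish_text)
    return any(phrase in english_text for phrase in FLAGGED_PHRASES)


if __name__ == "__main__":
    print(flag_for_review("example_upload.mp4"))  # True for this stubbed input
```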

YouTube said that its systems had also evaluated visual signals, metadata and on-screen text in Spanish-language videos and that its A.I. had been able to learn new trends, such as evolving idioms and slang.

In English, researchers have found electoral fraud claims from famous personalities with big followings, including Charlie Kirk, Dinesh D’Souza (who created “2000 Mules”) and Tim Pool, a YouTube personality with 1.3 million followers known for sowing doubt about the results of the 2020 election and questioning the use of ballot boxes.

“One of the most disturbing things to me is people watching ballot boxes is being praised and encouraged on YouTube,” Kayla Gogarty, the deputy research director at Media Matters, said in an interview. “That’s a very clear example of something going from online to offline, which could cause real-world harm.”

Nico Grant is a technology reporter covering Google from San Francisco. Previously, he spent five years at Bloomberg News, where he focused on Google and cloud computing.

A version of this article appears in print in Section B, Page 1 of the New York edition with the headline: Blind Spots Are Feared At YouTube.
