Study: More than a quarter of most-viewed COVID-19 videos on YouTube contain ‘misleading information’

Amid a rise in misinformation about the coronavirus pandemic, YouTube, which sees billions of views per day, hosts a large number of videos with millions of views that contain “misleading information” about the COVID-19 crisis, according to a study published Wednesday in the journal BMJ Global Health.

Researchers searched YouTube on March 21, 2020, for the most-viewed coronavirus-related videos, using keywords such as “coronavirus” and “COVID-19,” which ultimately left them with 69 videos to analyze for reliability and usability. 

Of the 69 videos analyzed, which had cumulatively racked up more than 200 million views, 19 of them presented false and misleading information specifically about the novel coronavirus, the researchers said. 


“Of the 19 non-factual videos, six were from entertainment news (32%), five were from network news (26%), five were from internet news (26%) and three were from consumer videos (13%),” the authors wrote. 

The 19 “non-factual videos” had accumulated more than 62 million views. 

The authors of the study went on to suggest that YouTube, which boasts more than 2 billion monthly users, can be either a powerful educational tool for health care providers amid the coronavirus pandemic or a hindrance. 

FILE - A smartphone screen displaying the streaming app YouTube on May 11, 2020 in Bochum, Germany. (Photo by Mario Hommes/DeFodi Images via Getty Images)

The pandemic has triggered a rise in conspiracy theories and widespread misinformation across social media platforms and the internet at large. 

With millions of videos uploaded to YouTube every week, filtering out misleading content is an uphill battle for the company. 


YouTube recently removed videos featuring two California doctors who made headlines for downplaying the seriousness of the COVID-19 pandemic. The doctors called for an end to social distancing orders and have since been criticized by medical experts for spreading misinformation. 

The two garnered even more attention after Tesla and SpaceX CEO Elon Musk tweeted a video clip of the doctors with the caption, “Docs make good points.”

One YouTube video of the doctors that Musk shared on Twitter has since been removed after YouTube said in a statement that it violated the platform’s guidelines.


“We quickly remove flagged content that violate our Community Guidelines, including content that explicitly disputes the efficacy of local health authority recommended guidance on social distancing that may lead others to act against that guidance,” a spokesperson for YouTube said. “However, content that provides sufficient educational, documentary, scientific or artistic (EDSA) context is allowed -- for example, news coverage of this interview with additional context. From the very beginning of the pandemic, we’ve had clear policies against COVID-19 misinformation and are committed to continue providing timely and helpful information at this critical time.”

Other social media companies have rallied to fight against the deluge of misinformation on their platforms. 

Twitter announced Monday it will start alerting users when a tweet makes disputed or misleading claims about the coronavirus.

The announcement signals that Twitter is taking its role in amplifying misinformation more seriously. But how the platform enforces its new policy will be the real test, with company leaders already tamping down expectations.

The Associated Press contributed to this report. 
 
