Story at a glance
- YouTube, like other Internet platforms, has been criticized for allowing and even promoting misinformation.
- A new study finds that while this is true for several subjects, anti-vaccine videos are somewhat suppressed.
- Still, the presence of vaccine misinformation is dangerous considering the ongoing coronavirus pandemic.
If you’re familiar with YouTube, you’ve probably been down a rabbit hole of “related videos” and search results before. But while some searches — like cute bunny videos — are relatively harmless, others can immerse viewers in disinformation.
While other major Internet platforms, including Facebook and Twitter, were pushed to act ahead of the presidential election, YouTube has in recent weeks defended its decision to leave up videos containing misinformation.
"Like other companies, we're allowing these videos because discussion of election results & the process of counting votes is allowed on YT. These videos are not being surfaced or recommended in any prominent way." — YouTubeInsider (@YouTubeInsider) November 12, 2020
A new study looked deeper into YouTube search results for controversial and disputed subjects and found one that the platform might be burying: anti-vaccination videos.
"We saw all these media reports and opinion pieces talking about how YouTube is driving people down the rabbit hole," Tanushree Mitra, a researcher at Virginia Tech and author of the study, told Business Insider. "But I was like: 'All these reports are talking without any empirical evidence. Is this actually happening?'"
The study found that while a user's watch history can create a "filter bubble effect" in the "Top 5" and "Up Next" recommendations YouTube serves, it won't necessarily do so for searches about vaccine controversies. The search results themselves, however, can still surface vaccine misinformation, an area the study identifies as needing improvement.
"We're committed to providing timely and helpful information, including raising authoritative content, reducing the spread of harmful misinformation and showing information panels, to help combat misinformation," a YouTube spokesperson told Business Insider. "We also have clear policies that prohibit videos that encourage harmful or dangerous content, impersonation or hate speech. When videos are flagged to us that break our policies, we quickly remove them."
But that’s not always the case when it comes to other misinformation. The study compared anti-vaccine misinformation to other demonstrably false misinformation, including conspiracy theories about the 9/11 attacks, chemtrails, the Earth being flat and the moon landing.
YouTube searches for the chemtrail conspiracy theory, which claims (despite scientific evidence to the contrary) that aircraft and rockets spray trails of harmful chemicals, turned up "significantly more" misinformative results than all other topics, the study found. Videos about 9/11 conspiracy theories, however, presented the most misinformation in the "Up Next" and "Top 5" recommendations.
"No matter how much you search for anti-vaccines, or if a user goes and searches for anti-vaccine videos, the resulting recommendations from the algorithm would still be pointing them to debunking videos, or pro-vaccine videos," Mitra told Business Insider. "That's not the case for other ones, which potentially proves it'll push you down the rabbit hole if you're looking for chem trails, but not for vaccines."
Of course, the subject of vaccination is more significant than ever during the coronavirus pandemic. A recent poll suggests Americans are more willing to receive a coronavirus vaccine than they were just two months ago, but 42 percent still say they would not get a COVID-19 vaccine when one becomes available. And while top public health officials are optimistic about life returning to normal in 2021, the herd immunity required to make that happen depends on Americans receiving accurate information about vaccines.