YouTube is facilitating the spread of misinformation
In July 2018 New Zealand was hit by a storm of debate about ‘hate speech vs free speech’ created by Lauren Southern and Stefan Molyneux, two controversial YouTubers best known for their anti-Muslim, anti-women and anti-multiculturalism views. The pair had planned a visit to Auckland for a speaking event.
The controversy arose after Auckland Council denied them a council venue to speak at and members of the public organised a well-attended peaceful protest against their visit.
But for anyone paying attention to the online movements that have grown in the wake of the toxic Gamergate campaign, the pair’s views and actions were likely unsurprising.
A combination of ease of production, algorithms that promote shocking and controversial content, and a lack of moderation have made YouTube in particular a breeding ground for extremist views and racist misinformation.
New Zealand is well placed to fall victim to the hoaxes, pseudoscience and radicalisation that thrive on the Google-owned platform.
New Zealanders now spend more time watching video online on YouTube and Facebook than reading news sites. While Facebook and Twitter have earned the majority of criticism for misinformation, YouTube is likely a more influential source of mis/dis/mal-information.
YouTube has pledged to do more to limit the reach of conspiracy theory videos: “We’ll begin reducing recommendations of borderline content and content that could misinform users in harmful ways — like videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11.” But the company has been vague about where it will draw the line, and the changes have already been implemented inconsistently. Without a clear policy on what is and is not “borderline content”, harmful content will likely continue to be promoted.
The new policy currently applies only to a small number of videos in the United States, though a wider rollout is planned.
Much of the New Zealand-based misinformation on YouTube spreads mistruths about New Zealand history to hundreds of thousands of people
Searching for something as innocuous as ‘New Zealand History’ on YouTube (in a private browser) returns New Zealand Skeletons in the Cupboard, a discredited racist ‘documentary’ that was pulled from TVNZ for being inaccurate, as the top result. It currently has over half a million views on YouTube.
The YouTube recommendations from that video include a lecture by Jordan Peterson, known for his anti-women views, and other videos about an earlier race of people in New Zealand (6 Reasons New Zealand Was Settled by an Ancient Unknown People, 45,000 views, and Brien Foerster: Crimson Horizon - The Mysterious Red-Haired Sea Kings of the Pacific FULL LECTURE, 38,000 views) as well as videos about Bigfoot.
A video by a New Zealand conspiracy theory channel titled Evidence of Pre-Māori Civilisation has 114,000 views. The same channel has four videos about Southern and Molyneux's visit to New Zealand.
Within a handful of recommended videos you can find your way from a search for New Zealand history to videos by American far-right figures arguing for white supremacy.
Zeynep Tufekci, a New York Times columnist and professor at the University of North Carolina, notes that:
“For all its lofty rhetoric, Google [which owns YouTube] is an advertising broker, selling our attention to companies that will pay for it. The longer people stay on YouTube, the more money Google makes.”
“What keeps people glued to YouTube? Its algorithm seems to have concluded that people are drawn to content that is more extreme than what they started with — or to incendiary content in general.”
The Wall Street Journal conducted an investigation of YouTube content and found that YouTube often “fed far-right or far-left videos to users who watched relatively mainstream news sources”. If you searched for information on the flu vaccine, you were recommended anti-vaccination conspiracy videos.
Author J.M. Berger, a scholar with the EU research group VOX-Pol, took a quantitative look at users who identify with or are connected to the alt-right movement on Twitter. He conducted a detailed analysis of nearly 30,000 Twitter accounts that either tweeted white supremacist content or followed others who did. The goal was to document the habits and practices of the alt-right on Twitter in a kind of overall “census” of the movement. Berger found that YouTube was, by far, the most common external site white supremacists linked to. There were about 74,000 tweets linking to YouTube videos in his sample; the next-closest external site was Facebook, with only 23,000.
This algorithmic rabbit hole plays a big role in emboldening people to spread race-based online abuse and harassment. Misinformation can create a pretext for racist beliefs and views, and foster a toxic online space where racist language is common and accepted. Online hate speech can lead to the shaping of hateful views and behaviours and, as has been illustrated around the world, violent hate crimes.