
YouTube's 'Conspiracy Effect'

November 2018

Does YouTube Promote Conspiracy Theories?


I’m sure many of you can relate when I say I have found myself watching video clips on YouTube (of cute kittens) when, all of a sudden, it’s two hours later. I had lost all sense of time and realised that, yet again, I had been sucked into a 'YouTube binge': the videos I had watched were recommended to me on the right side of the screen by the site’s helpful “up next” feature. Two hours of watching video clips, and I’d gone from kittens to something totally different, with no idea how I got there.

There are currently over 1.5 billion YouTube users in the world, and just like mine, what they see is largely shaped by the recommendation, or “up next”, algorithm. Algorithms function much as a newspaper editor does, making decisions about what information is relevant and in what context. It has been said that algorithms are the new ‘gatekeepers’ of information (Caplan & Boyd, 2016).


But what information is YouTube making visible to its users?


YouTube’s role in boosting conspiracy theories


Recent research at Harvard studied 13,529 YouTube channels and analysed the videos YouTube recommended. The finding: the majority of YouTube’s recommendations lead to far-right content and conspiracy theories.

“You’re only one or two clicks away from extreme far-right channels, conspiracy theories, and radicalizing content.” (Kaiser & Rauchfleisch, 2018).


This idea of YouTube promoting conspiracy theories gained further support when a former YouTube engineer, Guillaume Chaslot, who worked on the “up next” algorithm, stated that the way it was built gives a boost to fake news and conspiracy theories. Chaslot documents YouTube’s lean toward conspiracies on his website Algotransparency.org. For example, when searching for “Is the earth flat or round?”, you are more likely to be recommended videos in favour of the flat earth theory. Similarly, when searching for “Who is the Pope?”, videos stating that the Pope is ‘evil’ or ‘satanic’ are 80% more likely to be found on YouTube than in a normal Google search. These findings are alarming, but the point here is not to assign blame to YouTube, but to understand how the high visibility of conspiracy theories on YouTube could lead to bigger problems.


The Conspiracy Effect


Van der Linden (2015) coined the term ‘conspiracy effect’. His study found that simply watching a two-minute clip about a conspiracy theory is enough to compel people to dismiss the science. The research also showed that the ‘conspiracy effect’ applies to anyone, believer or non-believer, after mere exposure to a conspiracy theory video. This is a troubling finding considering the number of conspiracy theory videos being made highly visible by YouTube’s algorithm.


The Dangerous Ripple Effect


So, what we know so far is that YouTube is watched by billions of people every day, and the majority of the videos viewed come from its recommendation algorithm. We also know that the algorithm is biased towards conspiracy theories, which means there is a high chance that billions of YouTube users are exposed to conspiracy videos every day. And we know that a two-minute conspiracy video is enough to compel people to dismiss the science surrounding the conspiracy.


Okay, so the world will have a few more conspiracy theory believers and the tinfoil industry will grow due to more people wrapping their heads in foil. That doesn’t seem like that big of a deal, right? Well, it turns out that belief in conspiracy theories can have very dangerous ripple effects. Studies have indicated that climate change conspiracy theories discourage people from trying to reduce their carbon footprint, anti-vaccine conspiracy theories discourage people from vaccinating their children against diseases, and political conspiracy theories make people less likely to engage in the political system (Jolley & Douglas, 2013; Jolley & Douglas, 2014). Barkun (2016, p. 4) stated that "once conspiracy theories become commonly discussed forms of explanation, their movement into political discourse is only a matter of time". The dangerous ripple effects could be endless, and the issue of YouTube’s conspiracy effect should be taken seriously.


What can be done?


YouTube’s CEO, Susan Wojcicki, has recently been vocal about the way its recommendation algorithm promotes conspiracies: “When there are videos that are focused around something that’s a conspiracy—we will show a companion unit of information from Wikipedia showing that here is information about the event”. This is a nice gesture, but anyone who knows Wikipedia also knows that literally anyone can edit its archive, so this solution doesn’t seem fitting of the problem. Blocking or deleting users’ videos also seems like a solution that may have unforeseen consequences. Research suggests that a greater knowledge of the news media reduces the likelihood of believing in a conspiracy theory (Craft, Ashley & Maksl, 2017), and this could be one way to go: educating people in news media literacy would give them the knowledge they need to think critically about the media they see. However, the practicalities of this may be quite difficult. How do you educate over a billion people in news media literacy? I have a feeling you will need more than a simple two-minute video.


Should anything be done?


Former YouTube engineer Chaslot states that the algorithm does favour conspiracy theories, but suggests that this is because they draw high traffic: “YouTube’s algorithm doesn’t seek out extreme videos, but looks for clips that are already drawing high traffic and keeping people on the site” (Chaslot, 2017). It could be that this is what users want to see. Should we simply give the people what they want? This may sound controversial, but I believe it is a relevant aspect to consider. The world is evolving, and technology can be a great opportunity for sharing and getting more diverse voices heard. As Palfrey & Gasser (2008) suggest, the internet as a mass medium has brought about a ‘semiotic democracy’ in which “a greater number of people are able to tell the stories of their times”. Yet one could also view this in a negative light, as did Fenster (2008) when he claimed that the internet was “the Petri dish for paranoids”.
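Chaslot’s point can be made concrete with a toy sketch. This is not YouTube’s actual system; the field names and numbers below are entirely made up for illustration. It simply ranks candidate videos by expected engagement (click rate times average watch time), which is enough to show why sensational content that keeps people watching floats to the top without the algorithm ever “seeking out” extreme videos:

```python
# Toy illustration (hypothetical data, not YouTube's real code): rank
# candidate videos purely by expected engagement. Whatever keeps viewers
# clicking and watching longest wins the "up next" slot.

def rank_by_engagement(candidates):
    """Sort candidate videos by expected watch time per impression, highest first."""
    return sorted(
        candidates,
        key=lambda v: v["click_rate"] * v["avg_watch_minutes"],
        reverse=True,
    )

candidates = [
    {"title": "Cute kittens compilation",  "click_rate": 0.10, "avg_watch_minutes": 3.0},
    {"title": "Balanced news explainer",   "click_rate": 0.05, "avg_watch_minutes": 4.0},
    {"title": "SHOCKING flat-earth proof", "click_rate": 0.12, "avg_watch_minutes": 9.0},
]

for video in rank_by_engagement(candidates):
    print(video["title"])
```

With these invented numbers, the sensational video scores highest (0.12 × 9.0 = 1.08) and is recommended first, even though no rule anywhere mentions its content. That is the structural bias Chaslot describes: optimise for traffic, and traffic-hungry content wins.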


Food for thought


Should YouTube decrease the number of conspiracy videos recommended to users? Or would that silence voices that might not be heard otherwise? The idea of stopping alternative truths from being heard rings alarm bells for me. Look at Galileo: he had a different view of the world, and his beliefs and opinions were shot down and silenced. Yes, his view of how the world works is now the main narrative, but there was a time when the most common view was that the sun revolved around the Earth. If Galileo had access to YouTube in his time, I am certain he would have been the main YouTuber with ‘conspiracy’ videos on “The Earth Moves Around the Sun!”.


Many scientists have been called ‘quacks’ before, but today we believe in the narrative of science. Yet science can be wrong and can change. Think of poor old Pluto, once a planet and then demoted. Should we ignore an idea simply because it goes against science? To silence anyone's beliefs, whether they are ‘scientific’ or not, is a fundamentally flawed approach in my opinion. Yes, I think there are effects and even dangers when it comes to belief in conspiracy theories. Yet I also think there can be an even bigger danger when we believe it is okay to disregard, silence or filter out other people's beliefs. It is also important to remember that some conspiracy theories, such as Watergate and MK-Ultra, turned out to be true. If we stop people voicing their concerns about an issue simply because it comes under the label of 'conspiracy theory', then the few that are true may slip under the radar.


Personally, I think it is good to have a diverse account of the world around me. Let’s be honest, no one really knows what is going on, and the more views we have on reality, the deeper our understanding of ourselves, others and the world grows. Do you think conspiracy theories on YouTube should be monitored, accompanied by third-party information, or taken down? Or do you think YouTube should stay true to its slogan, ‘Broadcast Yourself’, and let people share their beliefs regardless of ‘evidence’? Comment now, before your opinion gets a Wikipedia link attached to it, or worse, is silenced and blocked.


Original article on tgrassowtruthbetold.wordpress.com








