The Changing Face of Radicalization: Online Rabbit Holes and Echo Chambers

In recent years, far-right extremism has been on the rise in various corners of the globe, most notably in the United States and across Europe. The internet has proven to be an effective tool for spreading information as well as disinformation, and children and young adults are often at higher risk of radicalization into extremist ideologies on online platforms such as YouTube and TikTok.

The internet is home to many platforms where people can meet, share ideas, and build networks. It is also a place where echo chambers form, largely because social media platforms are difficult to regulate. There is another aspect to this issue: as social media advances, so do the recommendation algorithms curated for each user. These algorithms work by tracking the content a user interacts with and then suggesting similar content. While this is usually harmless and does not always lead directly to radicalization, it can create an echo chamber for extremist ideologies. Both YouTube and TikTok have drawn criticism for creating a pipeline to right-wing extremism by allowing these ideologies to spread. Many of the creators whose content sends users down this pipeline rely on dog whistles. A dog whistle is defined as “a coded message communicated through words or phrases commonly understood by a particular group of people, but not by others.”
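To make this feedback loop concrete, the minimal Python sketch below simulates an engagement-driven recommender narrowing a feed over time. The topic labels, scoring, and update rule are hypothetical simplifications for illustration only, not any platform's actual algorithm.

```python
import random

# Hypothetical topic labels; real platforms model content far more richly.
TOPICS = ["cooking", "gaming", "politics", "fitness", "music"]

def recommend(interest_weights, n=5):
    """Sample n videos, favoring topics the user has engaged with before."""
    topics = list(interest_weights)
    weights = [interest_weights[t] for t in topics]
    return random.choices(topics, weights=weights, k=n)

def watch_and_update(interest_weights, feed):
    """Simulate engagement: each watched topic gains weight, so the
    next feed skews further toward what was already watched."""
    for topic in feed:
        interest_weights[topic] += 1.0
    return interest_weights

# Start with a uniform interest profile, then let the loop run.
profile = {t: 1.0 for t in TOPICS}
for _ in range(20):
    feed = recommend(profile)
    profile = watch_and_update(profile, feed)

# After a few rounds, one or two topics dominate the profile.
total = sum(profile.values())
for topic, w in sorted(profile.items(), key=lambda kv: -kv[1]):
    print(f"{topic:10s} {w / total:.0%} of recommendations")
```

Nothing in this toy loop targets extremism; the narrowing emerges purely from rewarding past engagement, which is why curation alone can produce an echo chamber.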

The Case for YouTube

A report by Data & Society analyzes this very phenomenon of the YouTube-to-alt-right pipeline. The report describes a group of influencers as an “alternative influence network.” The network is made up of political pundits and internet celebrities who share views that are reactionary to feminism, left-leaning politics, and various other social justice issues. One of the things the report touches on is that right-wing extremism has become more mainstream and easily accessible. It describes talking heads such as Ben Shapiro and Jordan Peterson, who may not hold outright extremist views themselves but do host more dangerous figures like Richard Spencer; thus begins the pipeline.

In June of 2019, the New York Times published an article titled The Making of a YouTube Radical, which follows a young man named Caleb Cain on his journey down, and eventually out of, the YouTube-to-alt-right pipeline. The article describes Cain as a broke dropout looking for a sense of direction. He found identity and community with a content creator, Stefan Molyneux, who shared his hardships about his difficult upbringing, something Cain related to deeply. However, Molyneux also had a political agenda, and much of his content touched on themes of how feminism is setting young men back. This would eventually send Cain down a path of binging far-right extremist content from many creators and channels, some of which have since been banned. Cain's story is not a unique one, and while the ins and outs of radicalization on online platforms are not fully understood, it is important to keep a watchful eye on the youth and the media they are consuming.

The Case for TikTok

There is a case to be made that an alt-right pipeline also exists on TikTok. In recent years, the Institute for Strategic Dialogue released a research report examining how extremists, including neo-Nazis and other white supremacist groups, are easily able to spread hate on the platform. The group found that TikTok content leaning toward white supremacist ideas was also layered with misogynistic and anti-LGBTQ+ sentiments. Some videos praised incel mass shooter Elliot Rodger, who killed six people in 2014 after writing a manifesto detailing his hatred for women. While this is only one example of many, it paints a very bleak picture. Not only are researchers finding that children and young adults are exposed to this content, but Islamist militant groups are also using the platform to spread their message. While a spokesperson for the app has stated that violent extremism is strictly prohibited, there remains a gap through which these ideologies spread very quickly. The speed of this spread is largely due to specific sounds, hashtags, and keywords. While bans on these specific items have been moderately effective, the platform needs to develop more nuanced policies.
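As a purely illustrative sketch of why exact-match bans fall short, consider the toy filter below. The banned term and the variant spellings are hypothetical placeholders, and real moderation systems are far more sophisticated than this.

```python
# A toy moderation filter illustrating why exact-match hashtag bans
# are easy to evade. All terms here are hypothetical placeholders.
BANNED_HASHTAGS = {"#extremisttag"}

def naive_filter(caption: str) -> bool:
    """Block a post only if it contains a banned hashtag verbatim."""
    return any(tag in caption.lower() for tag in BANNED_HASHTAGS)

posts = [
    "check this out #extremisttag",   # caught: exact match
    "check this out #extrem1sttag",   # slips through: leetspeak swap
    "check this out #extremist.tag",  # slips through: inserted punctuation
    "check this out #3xtremisttag",   # slips through: character substitution
]

for post in posts:
    verdict = "BLOCKED" if naive_filter(post) else "allowed"
    print(f"{verdict}: {post}")
```

Each trivial spelling change defeats the filter, which is one reason researchers argue for policies built around behavior and context rather than fixed keyword lists.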

Conclusion 

The issue of online safety, especially when it comes to extremist and hateful content, is multifaceted. Social media is a tool to connect, create, and share ideas, and an avenue for entertainment. It has also become a space for curated content, which raises the question of whether social media companies should be held responsible for the content published on their platforms, even when that content promotes extremist views. As it stands, more and more social media platforms, not just YouTube and TikTok, are coming under scrutiny for their lack of action in combating hateful ideologies. Parents, for their part, should keep a watchful eye on their children and the content they see and interact with.

Image Credits: Jason Howie — Edited by GorStra team
