
Silencing or Strengthening? The Ongoing Debate Over Deplatforming Extremists 

Deplatforming involves permanently removing controversial figures from social media sites to reduce the spread of harmful or offensive content. This approach has been increasingly adopted by platforms like Facebook, Twitter, and YouTube, targeting numerous high-profile influencers (Jhaver et al., 2021). Despite these intentions, the effectiveness of deplatforming remains hotly debated, particularly after Twitter’s 2016 ban of several alt-right accounts led to a surge in users on Gab, a ‘free speech’ alternative to Twitter known for its lax moderation (Rogers, 2020). Among Gab’s new users were figures like Robert Bowers, the perpetrator of the 2018 Pittsburgh synagogue shooting, and Milo Yiannopoulos, a right-wing provocateur banned from Twitter for targeted harassment. Many extremists have also migrated to Telegram, which offers secure messaging and has been criticized for its lenient stance on extremist content, allowing such material to persist longer than it might on more mainstream platforms (Shehabat et al., 2017). Telegram’s features, such as public channels and private chats, make it a potent tool for extremist groups, enabling them to broadcast to followers and organize through secure chats. Notably, the platform’s confidence in its security measures led it to twice offer a $300,000 prize to anyone who could break its encryption (Weimann, 2016).

This backdrop sets the stage for a broader critique. Critics point out that deplatforming simply relocates extremists to other online spaces, displacing the problem rather than solving it, and potentially strengthening their followers’ convictions and distrust of mainstream society and information sources (Rogers, 2020). Another significant concern is the role of social media companies as arbiters of speech. By assuming the power to deplatform, these companies take on a quasi-judicial role in determining what speech is acceptable. This raises questions about the concentration of power in the hands of private entities, the potential for biased enforcement of rules, and the impact on freedom of expression and democratic discourse. The fear is that such power could be misused to silence legitimate dissent or to favor certain political viewpoints. Critics also argue that deplatforming may inadvertently draw more attention to the suppressed content, a phenomenon known as the Streisand Effect. The term stems from a 2003 incident in which Barbra Streisand unsuccessfully sued photographer Kenneth Adelman and Pictopia.com for privacy violation over an aerial photograph of her house, a lawsuit that vastly increased public interest in the photo.

In contrast, supporters argue that deplatforming cleanses online spaces and limits the reach of extremist content creators. While these individuals can easily find alternative online spaces to share their ideologies, their overall impact is arguably reduced on less popular platforms. Several studies lend support to this view. Jhaver et al. (2021) found that deplatforming can decrease activity levels and toxicity among supporters of deplatformed figures. Rogers (2020) observed that banned celebrities who migrated to Telegram experienced reduced audience engagement and used milder language. Conversely, Ali and colleagues (2021), who analyzed Gab accounts that had been suspended from Twitter and Reddit, noted increased activity and toxicity but, in line with the other studies, a decrease in potential audience size.

Given these mixed outcomes, there’s a clear need for further research to assess deplatforming’s effectiveness comprehensively. A systematic analysis across various platforms could provide a clearer understanding of deplatforming’s consequences, informing future strategies for managing online extremism.

Ali, S., Saeed, M. H., Aldreabi, E., Blackburn, J., De Cristofaro, E., Zannettou, S., & Stringhini, G. (2021). Understanding the effect of deplatforming on social networks. Proceedings of the 13th ACM Web Science Conference (WebSci ’21).

Jhaver, S., Boylston, C., Yang, D., & Bruckman, A. (2021). Evaluating the effectiveness of deplatforming as a moderation strategy on Twitter. Proceedings of the ACM on Human-Computer Interaction, 5(CSCW2), 1-30.

Rogers, R. (2020). Deplatforming: Following extreme Internet celebrities to Telegram and alternative social media. European Journal of Communication, 35(3), 213-229.

Shehabat, A., Mitew, T., & Alzoubi, Y. (2017). Encrypted jihad: Investigating the role of Telegram App in lone wolf attacks in the West. Journal of Strategic Security, 10(3), 27-53.

Weimann, G. (2016). Terrorist migration to the dark web. Perspectives on Terrorism, 10(3), 40-44.