
Presentation at the 3rd International Conference of PACT (Populism and Conspiracy Theory Project)

INTRODUCTION  

Essentially, medical conspiracy theories “depict medical, science or technology-related issues as under the control of secretive and sinister organisations” (Lahrach, Furnham 2017: 89), advocating that malevolent “motivations underpin everything from vaccination campaigns to cancer treatment” (Grimes 2021: 1). Although medical conspiracy theories have been “a problem since before the dawn of social media” (Ibid), the Internet has unquestionably amplified the issue. Even before the pandemic, when the gravity of this problem became most evident (Ibid), the digital spread of disinformation had already shown alarming consequences for the acceptance of medical science, especially when it comes to anti-vax propaganda.

As early as 2019, the WHO declared “vaccine hesitancy” one of the top ten threats to global health (WHO 2019). Since “Medical conspiracy theories directly contradict evidence-based scientific research” (Lahrach, Furnham 2017: 89), belief in this type of conspiracy theory leads people to reject modern mainstream medicine (Ibid; Douglas et al. 2019: 3), the consequences of which can be severely life-limiting and harmful (Grimes 2021: 2). Under these circumstances, the case of the anti-vax movement is especially concerning, seeing how the online spread of disinformation has contributed to the worldwide decrease in vaccine uptake, consequently leading to the comeback of diseases that had been virtually eliminated in the past (Douglas et al. 2019: 4; Grimes 2021: 2).

Many controversies have fuelled the spread of anti-vax conspiracy theories, ever since vaccines were first developed. What eventually became one of the main pillars of the anti-vax movement was the publication of a 1998 article by gastroenterologist Andrew Wakefield that suggested a link between the measles, mumps, and rubella vaccine and the development of autism (Stano 2020: 488; Sherwin 2021: 559). Even though Wakefield’s research was investigated in the following years and found to be irresponsible, dishonest, and fraudulent (in the words of the UK General Medical Council), the anti-vax movement had already gained traction, so much so that by 2002 “immunisation rates dropped below 85 per cent” (Stano 2020: 489). Progressively, the phenomenon “extended beyond Wakefield’s case, making social networks key actors in the rise and spread of forms of anti-vaccine conspiracionism online” (Ibid, 491). Social media has thus become, as frequently noted in academic studies, a “source of vaccine controversy” (Grant et al. 2015: 2). Thriving in this environment, anti-vax conspiracy theories have become resilient, persisting despite all efforts to eradicate them and even progressively gaining more support.

Given this relevance, this brief presentation aims to analyse social media posts with the help of digital humanities methodologies, seeking language patterns that can potentially assist in the codification of cultural meanings and in tracing the formation of ideological clusters of anti-vaccine conspiracy theorists online.

MATERIALS AND METHODS

Unfortunately, I am unable to share the name of the Telegram group from which I obtained my data, as it is sensitive information protected by the GDPR. What I can say is that the group’s description states that it is an “Anti-New World Order” channel. The “New World Order” is a term common to many conspiracy theories, which describe a secretly emerging authoritarian or totalitarian political elite that seeks to replace all sovereign nation states with a one-world government.

The data that I obtained from the group was the textual (non-pictorial) content of messages sent by its administrator to the channel’s subscribers (a total of 25.1 thousand accounts). Only messages containing the string of characters ‘vacc’ somewhere in their text were collected (thus including words such as ‘vaccine’, ‘vaccines’, ‘vaccination’, ‘anti-vaccine’, etc.). Messages were collected manually from 1st July 2023 to 1st June 2024, totalling 9 messages. My intention is to automate this process in the future, so that a larger amount of text may be collected more easily.
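Purely as an illustration of what such automation could look like, below is a minimal sketch using the Telethon library. The API credentials, channel handle, and output filename are placeholders (the actual channel remains undisclosed), and the date handling should be double-checked against Telethon’s documentation before use.

```python
# Hypothetical sketch: collect channel messages containing 'vacc' into a .txt file.
# Credentials and channel name are placeholders, not the actual source.
import asyncio
from datetime import datetime, timezone

from telethon import TelegramClient

API_ID = 12345                      # placeholder, from my.telegram.org
API_HASH = "0123456789abcdef"       # placeholder
CHANNEL = "example_channel"         # placeholder for the undisclosed channel

START = datetime(2023, 7, 1, tzinfo=timezone.utc)
END = datetime(2024, 6, 1, tzinfo=timezone.utc)

async def collect() -> None:
    async with TelegramClient("session", API_ID, API_HASH) as client:
        with open("vacc_messages.txt", "w", encoding="utf-8") as out:
            # reverse=True iterates oldest-to-newest, so the file keeps posting order
            async for msg in client.iter_messages(CHANNEL, reverse=True, offset_date=START):
                if msg.date > END:
                    break
                text = msg.text or ""          # skip media-only posts
                if "vacc" in text.lower():     # 'vaccine', 'vaccination', 'anti-vaccine', ...
                    out.write(text.strip() + "\n\n")

asyncio.run(collect())
```

In this sketch each collected message is separated by a blank line, which also makes the resulting file easy to reuse in the small analysis sketches further below.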

The data was compiled into a .txt file, which was then uploaded to Voyant – an open-source, web-based text reading and analysis environment designed to facilitate reading and interpretive practices for digital humanities students and scholars.

After uploading the dataset to Voyant, this is the panel I was working with:

PRELIMINARY RESULTS

I explored some of Voyant’s available tools that could help me in identifying language patterns, starting with the ‘TermsBerry’, which shows the most common terms of the text and their closeness to each other:

By hovering the mouse over a term, the words that are closely related to it in the text light up. The stronger the colour, the more times these two terms appear together. For example, the strongest correlatives of ‘vaccines’ (figure on the bottom left) are: ‘containment’, ‘covid’, and ‘measures’, while weaker (but still relevant) correlatives are: ‘immune’, ‘excess’, ‘deaths’, and ‘trend’. The relevant correlatives for ‘vaccine’ (singular) (top left figure) are: ‘camps’ (alluding to the idea of ‘vaccination camps’), ‘banned’ and ‘people’ (connected to the victimisation of unvaccinated individuals), ‘covid’, ‘linked’, ‘theories’, ‘warned’ (related to how conspiracy theories seek to warn people of dangers that only those capable of observing hidden connections can see), and ‘swabs’ (code for the act of ‘getting vaccinated’).

Interestingly, the strongest correlatives of ‘vaccination’ (top right) are: ‘covid’, ‘response’, ‘lockdown’, ‘true’, ‘motivation’, ‘saving’, and ‘lives’. Interpreting these results requires caution. Do they somehow relate to the idea of vaccines saving people’s lives? Looking at the correlatives of ‘destroyed’, it is possible to see: ‘businesses’, ‘white’, ‘people’, and ‘lives’. This tool does not account for negation, which means that correlatives will appear together even when the meaning of the sentence is negative.
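Voyant computes these co-occurrences internally; purely to make the idea concrete, a rough message-level co-occurrence count could be sketched as follows (assuming the messages are separated by blank lines in the .txt file; Voyant’s actual window size and stopword list will differ, so the numbers would not match its output exactly):

```python
# Toy co-occurrence count in the spirit of the TermsBerry view: for one keyword,
# count how often other words appear in the same message.
import re
from collections import Counter

KEYWORD = "vaccines"
STOPWORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "is", "are", "that", "it"}

with open("vacc_messages.txt", encoding="utf-8") as f:
    messages = [m for m in f.read().split("\n\n") if m.strip()]

co_counts = Counter()
for message in messages:
    words = re.findall(r"[a-z']+", message.lower())
    if KEYWORD in words:
        co_counts.update(w for w in words if w != KEYWORD and w not in STOPWORDS)

for term, n in co_counts.most_common(10):
    print(f"{term}\t{n}")
```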

To investigate this further, we may take a look at another tool, called ‘Contexts’:

Here it is possible to see all occurrences of terms containing the string ‘vacc’ in the dataset, as well as what precedes and what follows each occurrence in the text. Reading the context allows one to confirm (or disprove) the interpretation of the correlatives; in this case, it becomes clear that the discourse in the texts does not frame vaccines or the lockdown as measures taken to save lives, focusing instead on side-effects and on the notion that these measures are harmful. It is important to note that reading the context of each occurrence is only possible while dealing with such a small dataset (comprising only 11 occurrences across 9 messages). The bigger the dataset, the more difficult it becomes to check the context of each analysed word and meaning.
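The same keyword-in-context idea can be sketched in a few lines; this is only an approximation of what the ‘Contexts’ tool displays, with an arbitrary 60-character window on each side of the match:

```python
# Minimal keyword-in-context (KWIC) listing for every word containing 'vacc'.
import re

WINDOW = 60  # characters of context on each side

with open("vacc_messages.txt", encoding="utf-8") as f:
    text = f.read()

for match in re.finditer(r"\w*vacc\w*", text, flags=re.IGNORECASE):
    left = text[max(0, match.start() - WINDOW):match.start()].replace("\n", " ")
    right = text[match.end():match.end() + WINDOW].replace("\n", " ")
    print(f"...{left}[{match.group(0)}]{right}...")
```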

Finally, it is worth mentioning the tool ‘Bubblelines’:

This graph shows the occurrence of selected terms (in this case, ‘vacc*’, ‘covid’, ‘scandemic’, ‘lives’, ‘saving’, and ‘people’) over the course of the .txt file – and, since the file contains the messages in posting order, it also reflects the passage of time. We can see that ‘covid’ (dark green) and ‘vacc*’ (light green) appear together most of the time, suggesting that the discourse surrounding vaccination in the channel mostly concerns the covid vaccine and not other kinds. Considering the messages were collected between 2023 and 2024, one might have supposed this would not necessarily be the case, and yet it appears so. Another interesting result concerns the occurrences of the term ‘scandemic’, which are spread across the timeline; I had expected them to coincide with the occurrences of ‘covid’, but they did not. Rather, the graph suggests the two terms are used almost interchangeably, which may indicate that ‘scandemic’ serves as code for the ‘covid pandemic’.
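A rough analogue of the ‘Bubblelines’ view can be computed by recording the relative position of each occurrence of the selected terms in the file (0.0 being the first message, 1.0 the last); the sketch below reuses the hypothetical vacc_messages.txt from above:

```python
# Relative positions of selected terms across the file, approximating Bubblelines.
import re

TERMS = ["vacc", "covid", "scandemic", "lives", "saving", "people"]

with open("vacc_messages.txt", encoding="utf-8") as f:
    text = f.read().lower()

length = len(text)
for term in TERMS:
    positions = [round(m.start() / length, 2) for m in re.finditer(term, text)]
    print(f"{term:<10} {positions}")
```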

DISCUSSION & FINAL REMARKS

One notion is commonly echoed in the literature: conspiracy theories are strongly related to the complexities of living under conditions of uncertainty (mainly around values, morals, and identity), as well as to the fear and confusion that accompany contemporary crisis-filled periods of socio-cultural upheaval, when epistemic conventions erode in the risk-saturated, hyper-connected, globalised world of late capitalism (Douglas et al. 2019; Harambam 2020; Lee 2020; Butter & Knight 2020; Leone et al. 2020).

Medical conspiracy theories “are widely known, broadly endorsed, and highly predictive of many common health behaviours”, such that belief in them “arises from common attribution processes” rather than from psychopathological conditions (Oliver, Wood 2014: 818). The anti-vax movement, more specifically, is not restricted to any single political inclination (Avramov et al. 2020: 521). Moreover, belief in medical conspiracy theories and vaccine hesitancy are not likely to be binary but rather, much like radicalisation, exist “on a spectrum, which can be readily influenced by several mechanisms” (Grimes 2021: 2).

As meaning-making mechanisms, conspiracy theories reduce complexity, suggesting “simplistic and opaque relationships between causes and effects or inputs and outputs” (Önnerfors & Krouwel 2021: 254). This may seem paradoxical, since “some conspiracy theories appear complex on the surface”, possessing layers of interconnected elements and assumptions; however, “in the end most conspiracy theories make a relatively black-and-white assumption of an all-evil conspiracy stopping at nothing to pursue malevolent goals” (Krouwel & van Prooijen 2021: 29). Producing their own evidence, they bring coherence to a disordered social reality (Amlinger 2022: 262), establishing “a pseudo-rationality (particularly related to presumed causalities) while addressing emotions such as fear and blame within a simplified ethics of good and evil” (Önnerfors & Krouwel 2021: 254).

Therefore, it is possible to say that conspiracy theories carry out an “epistemic search for hidden realities”, aiming “to give meaning to the gaps in perception” through causal determination that is, however, incongruent with reality (Amlinger 2022: 264). In this way, sense is “created in a situation of existential fragility”, where feelings of powerlessness are warded off by the idea of taking back control (Steiner & Önnerfors 2018: 33), since “simple and straightforward beliefs about society foster people’s sense that they understand the world, which helps them regulate such negative feelings” (Krouwel & van Prooijen 2021: 29).

The dichotomic processing style characteristic of conspiracy theories manifests the same inflexible convictions that are innate to extreme political ideologies, leading to “a pessimistic view about the functioning of society, independent of whether it is extremism on the right or on the left” (Thórisdóttir et al. 2020: 307). According to Önnerfors and Krouwel (2021: 263), it is the “omnipresence of doom scenarios” and the “absence of a positive political project for the future” that provide fertile ground for conspiracy belief.

By way of conclusion, and considering that this is still work in progress, I can state that methodological issues remain, namely that as the amount of data increases, it becomes more difficult to avoid loss of context, opening the analysis to misinterpretation. This is a challenge I am not yet sure how to resolve; however, I still believe there is much need for the development of such a methodology, since when it comes to social media, scholars need to work with increasingly large amounts of text.

REFERENCES

Amlinger, C. (2021). Men make their own history: Conspiracy as counter-narrative in the German political field. In: Hristov, T., Carver, B., & Craciun, D. (Eds.), Plots: Literary Form and Conspiracy Culture. Routledge, 179-199.

Avramov, K., Gatov, V., & Yablokov, I. (2020). Conspiracy theories and fake news In: Butter, M., Knight, P. (Eds.), Routledge Handbook of Conspiracy Theories. Routledge, 512-524.

Butter, M. & Knight, P. (2020). Introduction. In: Butter, M., Knight, P. (Eds.), Routledge Handbook of Conspiracy Theories. Routledge, 304-316.

Douglas, K., Uscinski, J., Sutton, R., Cichocka, A., Nefes, T., Ang, C. S., & Deravi, F. (2019). Understanding Conspiracy Theories. Political Psychology, 40, 3-35.

Grant, L., Hausman, B. L., Cashion, M., Lucchesi, N., Patel, K., & Roberts, J. (2015). Vaccination persuasion online: a qualitative study of two provaccine and two vaccine-skeptical websites. Journal of medical Internet research, 17(5), e133.

Grimes, D. (2021). Medical Disinformation and the Unviable Nature of COVID-19 Conspiracy Theories. PLoS ONE, 16(3), e0245900. DOI: 10.1371/journal.pone.0245900

Harambam, J. (2020). Conspiracy Theory Entrepreneurs, Movements and Individuals. In: Butter, M., Knight, P. (Eds.), Routledge Handbook of Conspiracy Theories. Routledge, 278-291.

Krouwel, A., & van Prooijen, J. W. (2021). The new European order? Euroscepticism and conspiracy belief. In: Önnerfors, A., & Krouwel, A. (Eds.), Europe: Continent of Conspiracies: Conspiracy Theories in and about Europe. Routledge, 22-35.

Lahrach, Y., & Furnham, A. (2017). Are modern health worries associated with medical conspiracy theories? Journal of Psychosomatic Research, 99, 89-94. DOI: 10.1016/j.jpsychores.2017.06.004

Lee, B. (2020). Radicalisation and conspiracy theories. In: Butter, M., Knight, P. (Eds.), Routledge Handbook of Conspiracy Theories. Routledge, 304-316.

Leone, M., Madisson, M., & Ventsel, A. (2020). Semiotic Approaches to Conspiracy Theories. In: Butter, M., Knight, P. (Eds.), Routledge Handbook of Conspiracy Theories. Routledge, 43-55.

Oliver, E. & Wood, T. (2014). Medical Conspiracy Theories and Health Behaviors in The United States. JAMA Internal Medicine, 174(5), 817-818.

Önnerfors, A., & Krouwel, A. (2021). Between Internal Enemies and External Threats: How conspiracy theories have shaped Europe – an introduction. In: Önnerfors, A., & Krouwel, A. (Eds.), Europe: Continent of Conspiracies: Conspiracy Theories in and about Europe. Routledge, 1-21.

Sherwin, B. D. (2020). Anatomy of a conspiracy theory: Law, politics, and science denialism in the era of COVID-19. Texas A&M Law Review, 8, 537.

Stano, S. (2020). The Internet and The Spread of Conspiracy Content. In: Butter, M., Knight, P. (Eds.), Routledge Handbook of Conspiracy Theories. Routledge, 483-496.

Steiner, K., & Önnerfors, A. (Eds.) (2018). Expressions of Radicalization: Global Politics, Processes and Practices. Palgrave Macmillan.

Thórisdóttir, H., Mari, S., & Krouwel, A. (2020). Conspiracy theories, political ideology and political behaviour. In: Butter, M., Knight, P. (Eds.), Routledge Handbook of Conspiracy Theories. Routledge, 304-316.


Presentation at the II International Semiotics Congress of ASIA (Asian Semiotics International Association)

INTRODUCTION

In 2005, the author Gisèle Littman published, under the pseudonym Bat Ye’or, a book entitled Eurabia: The Euro-Arab Axis. The text states that “ever since the early 1970s, the European Union was secretly conspiring with the Arab League to bring about a ‘Eurabia’ on the continent” (Bergmann 2021: 39). In 2011, the French author Renaud Camus published a book entitled The Great Replacement, which “argued that European civilisation and identity was at risk of being subsumed by mass migration, especially from Muslim countries, and because of low birth rates among the native French people” (Ibid, 37). Even though these books may have introduced the “fear of cultural subversion”, the full conspiracy theory “usually also takes the form of accusing a domestic elite of betraying the ‘good ordinary people’ into the hands of the external evil” (Ibid, 38).

How, then, can we define the Eurabia conspiracy theory in concise terms? First, let us take a step back and look at the definition of conspiracy theory in a more general sense: a conspiracy theory can be defined as a representation, in the form of a narrative, that explains an event or circumstance as the result of the actions of a group of people with covert and malicious intentions (adapted from Leone et al. 2020: 44 and Birchall 2006: 34). From this, the Eurabia conspiracy theory may be defined as follows: the European continent is being transformed into an Islamic society through the destruction of white Christian civilisation, brought about by a secret alliance between Muslims, the domestic elites of Europe, and left-wing cultural Marxists (adapted from Bergmann 2021 and Gualda 2021). This conspiracy theory in particular “has been one of the most fast-growing amongst Neo-Nationalists, rooting in countries like Austria, Denmark, Germany, Italy”, the UK, the Netherlands, and Belgium (Bergmann 2021: 37).

Given the relevance of this topic, this short exploratory presentation aims to semiotically analyse the messages of a white supremacist Telegram group, with the help of digital humanities methodologies, seeking language patterns that can potentially assist in the codification of cultural meanings and in tracing the formation of these anti-Muslim ideological clusters on Telegram. This presentation concerns work that is still in progress, as I am nearing the end of the first year of my PhD.

MATERIALS AND METHODS

Unfortunately, I am unable to share the name of the Telegram group from which I obtained my data, as it is sensitive information protected by the GDPR.

The data that I obtained from the group was the textual (non-pictorial) content of messages sent by its administrator to the channel’s subscribers (a total of 12.5 thousand accounts). The messages were collected manually (one by one) from October 1st to December 31st, 2023, totalling 168 messages. My intention is to automate this process in the future; since this was my first test, collecting this amount manually seemed sufficient for now.

The data was compiled into a .txt file, which was then uploaded to Voyant – an open-source, web-based text reading and analysis environment designed to facilitate reading and interpretive practices for digital humanities students and scholars.

After uploading the file to Voyant, this is the control panel that I was working with:

PRELIMINARY RESULTS

I explored some of Voyant’s available tools that could help me in identifying language patterns, starting with the ‘Collocates’:

The ‘Collocates’ list provides the terms that occur near certain keywords. The highlighted words – for example “genocide”, “work”, “victory”, “run” – belong to Voyant’s built-in categories, which classify words as “positive” (green) or “negative” (red). Voyant allows you to edit those categories, but since I am not doing sentiment analysis, there was no need to consider them for now. Even so, it may be interesting to see how “immigrants” is mostly associated with words such as “genocide”, “threats”, and “run”, while the word “white” appears together with “work” and “victory”. Other relevant associations may be the collocates: “white” + “replacement”; “genocide” + “Europeans” + “Europe”; “immigrants” + “tax” + “payer”; and “immigrants” + “illegals”.
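To make the notion of a collocate concrete, the following sketch counts words occurring within an arbitrary five-token window around a keyword. The filename is a placeholder, and Voyant’s ‘Collocates’ tool uses its own window size and stopword handling, so its numbers would differ:

```python
# Window-based collocate counting for a single keyword.
import re
from collections import Counter

KEYWORD = "immigrants"
WINDOW = 5  # tokens on each side

with open("channel_messages.txt", encoding="utf-8") as f:  # placeholder filename
    tokens = re.findall(r"[a-z']+", f.read().lower())

collocates = Counter()
for i, tok in enumerate(tokens):
    if tok == KEYWORD:
        neighbours = tokens[max(0, i - WINDOW):i] + tokens[i + 1:i + 1 + WINDOW]
        collocates.update(neighbours)

print(collocates.most_common(15))
```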

From this first list alone, it is already possible to see that one does not need to read all 168 messages in order to get a picture of the discourse contained in this Telegram group, which I believe is the point of such tools – to facilitate the analysis of large datasets.


Moving on to the next panel (below), it is possible to see the most common words in the file, and if one hovers the mouse over a term, the terms that occur near that word are highlighted. This provides better visualisation, since it relates each keyword to several other terms at once, rather than to a single collocate as in the previous table. In turn, each of these terms is further related to other collocates, forming a web of the most common keywords and the most common terms found near them in the text.

I highlighted a few segments that seemed most relevant:

The first one surrounds the word “immigrants”, which is linked to, again, ‘tax’ and ‘payer’, but also to ‘living’, ‘quietly’, and ‘numbers’. The word ‘quietly’ points to the conspiratorial nature of the immigration phenomenon, implying that there is a secret agenda behind it.

The next image centres around the word ‘muslim’ (in singular), which is here linked to ‘germany’, ‘team’, ‘police’, ‘post’, and ‘world’. This data is a bit harder to interpret. We know from the image in the bottom left corner that ‘hitler’ is also one of the most popular terms used in the Telegram group, and considering that this is a white supremacist group, it is unsurprising that Germany gets many mentions, given the country’s history with such movements. Yet, terms like ‘team’, ‘post’, and ‘world’ do not yield clear analytical results.

The third image (on the upper right), centres around the term ‘immigration’, which is linked to ‘scale’, ‘life’, ‘reported’, ‘invaders’, and ‘start’. Here, we have a clearer picture of the discourse, especially with the word ‘invaders’, which is also connected, in its turn, to ‘jewish’ and to ‘knife’.

On the bottom left corner, we have the web surrounding the word ‘muslims’ (in plural), linked to ‘christmas’ (I collected the messages during the month of December, so it makes sense), ‘ww2’, ‘war’, ‘settlers’, and again ‘germany’. In this case, perhaps ‘settlers’ is the most significant meaning-making term.

Finally, regarding the word ‘european’, we may see ‘genocide’, ‘happening’, ‘world’, ‘police’, and again ‘scale’. It is important to point out that ‘genocide’ is here linked to ‘european’, not to ‘muslim’. However, we saw from the collocate list that it can also be found near the word ‘immigrant’, even though this does not show in this form of visualisation.


Lastly, I would also like to share results obtained from the ‘Trends’ tool, which offered me the following graph:

It measures the occurrence of these selected terms over the course of the document, and since the file contains the messages in posting order, it also reflects the passage of time. The far left represents the start of October, while the far right represents the end of December. Here, it is interesting to note how the term ‘muslim’ only appears towards the end, and in a couple of curves (around segments 38 to 47 – probably around November) it coincides with occurrences of the terms ‘immigrants’ and ‘genocide’. However, from the previous analysis, we know that ‘genocide’ is not a collocate of ‘muslim’, though it may be of ‘immigrant’ and certainly is of ‘european’. This graph therefore indicates that the terms appear roughly in the same segment of the document, but not necessarily in the same sentences. Besides, the fact that the term ‘europa’ appears throughout the document is also important to consider, as it makes it harder to interpret these curves as meaningful.
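The computation behind the ‘Trends’ view can be approximated by splitting the token stream into equal segments and counting the selected terms in each; the sketch below uses ten segments and the same placeholder filename, whereas Voyant’s default segmentation is finer:

```python
# Term frequencies per document segment, approximating the Trends view.
import re
from collections import Counter

TERMS = ["muslim", "immigrants", "genocide", "european", "europa"]
SEGMENTS = 10

with open("channel_messages.txt", encoding="utf-8") as f:  # placeholder filename
    tokens = re.findall(r"[a-z']+", f.read().lower())

size = max(1, len(tokens) // SEGMENTS)
for i in range(SEGMENTS):
    segment = tokens[i * size:(i + 1) * size]
    counts = Counter(t for t in segment if t in TERMS)
    print(i, {t: counts.get(t, 0) for t in TERMS})
```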

DISCUSSION

According to the literature, in Eurabia and Great Replacement discourses, ‘Islam’ is associated with “evil, crime and barbarism”, as well as other “harmful characteristics and ideological markers that enhance polarised, emotional and simplifying visions of social reality” (Gualda 2021: 57). It is “typically represented as backwards, fanatic and violent”, as well as a totalitarian political doctrine (Dyrendal 2020: 374), while Muslims themselves “are generally portrayed as a homogeneous group of violent and authoritative religious fundamentalists” (Bergmann 2021: 42). Muslim individuals are seen as “mere executors of a religiously based, collective will” and, consequently, since Islam is itself seen as fundamentalist in nature, “every believer will be made to follow its radical version” (Dyrendal 2020: 374). In this sense, ‘Islam’ is seen as a uniting factor for all Muslims, joining them “in a common plan for domination” (Ibid).

Accordingly, the “Eurabia conspiracy theory has often become entangled in a more general opposition to immigration” (Bergmann 2021: 40), connected to political statements about failed integration (Ekman 2022: 1128) – based on the notion that Western societies are homogeneous and that Muslims and other migrants are unable to integrate into them (Gualda 2021; Ekman 2022) – or to the notion that the “incorporation of diversity, multiculturalism or other elements of Islam or the Muslim world into [Western] culture” will mean the total collapse of society, which will become a colony of Islam (Gualda 2021: 61-62). In other words, the arrival of “new norms, habits and customs brought by the foreign population […] could influence the disappearance of one’s own culture” (Ibid), turning immigration into an ‘invasion’ that threatens people’s culture and identity – as could be seen from the results of the quantitative analysis, which pointed to how ‘muslims’ and ‘immigrants’ are often linked to terms such as ‘invaders’ and ‘settlers’.

In general, the Eurabia conspiracy theory was brought firmly into the political mainstream by the financial crisis of 2008 and later the refugee crisis of 2015 (Bergmann 2021: 48-49). Nowadays, it is common to find political leaders who propagate Great Replacement and/or Eurabia conspiracy theories (Ekman 2022: 1127). As we see such Islamophobic and anti-immigration radical discourses become more popular, we also see them become normalized, especially across new media platforms such as Telegram.

CONCLUSION

By way of conclusion, and considering that this work is still in progress, I can point to how automation is sorely needed for such research – the more data, the more accurate the analysis. Another issue is that there are limits to how much semiotic analysis can be done on top of these quantitative results: how much can actually be accurately interpreted from these lists, graphs, and flowcharts? So much of semiotic analysis depends on context, and it is therefore still hard to see how we can carry out analysis at a large scale without losing said context. Nevertheless, I still believe there is much need for the development of such a methodology, since when it comes to social media, scholars need to work with increasingly large amounts of text.

REFERENCES

Bergmann, E. (2021). The Eurabia conspiracy theory. In: Önnerfors, A., & Krouwel, A. (Eds.), Europe: Continent of Conspiracies: Conspiracy Theories in and about Europe. Routledge, 36-53.

Birchall, C. (2006). Knowledge goes pop: From conspiracy theory to gossip. Berg Publishers.

Dyrendal, A. (2020). Conspiracy Theory and Religion. In: Butter, M., Knight, P. (Eds.), Routledge Handbook of Conspiracy Theories. Routledge, 304-316.

Ekman, M. (2022). The great replacement: Strategic mainstreaming of far-right conspiracy claims. Convergence, 28(4), 1127-1143.

Gualda, E. (2021). Metaphors of Invasion, Imagining Europe as Endangered by Islamisation. In: Önnerfors, A., & Krouwel, A. (Eds.), Europe: Continent of Conspiracies: Conspiracy Theories in and about Europe. Routledge, 54-75.

Leone, M., Madisson, M., & Ventsel, A. (2020). Semiotic Approaches to Conspiracy Theories. In: Butter, M., Knight, P. (Eds.), Routledge Handbook of Conspiracy Theories. Routledge, 43-55.


Influence of social network processes on radical conspiracy theories (PART 2) 

PART 1 proposed that the Internet’s horizontal distribution of power and equal opportunity are illusory or lost to the past, having been replaced with the logics of algorithms, which merely increase the influence of those who already possess it, as described by the ‘Matthew effect of accumulated advantage’.

Beyond that, it is also interesting to consider social media influencers and their role not only in disseminating (mis/dis)information, but also in guiding interpretation (telling people how to understand a piece of news, for example) and in providing others with reasons for action (for instance, boycotting a particular business, purchasing products from a particular company, contacting representatives with certain demands, donating to certain organisations, voting for a particular party, organising protests, etc.).

Such influencers can participate in deliberately coordinated networks of influencing activities and be engaged in the dissemination of strategic talking points […], but their posting activities can also be self-started, aiming at the advancement of personal brands, i.e. increasing personal popularity. Often, they are doing both. (Madisson, Ventsel 2021: 17)

The crux of the matter is that there are different possible ways to configure knowledge or to encode information, and “those that gain precedence will influence what it means to know; what kind of knowledge is culturally valued; how we learn; and who will have access to knowledge and power” (Birchall 2006: 8). 

Therefore, in the same way that one may speak of an economic elite (a minority group that holds wealth) or a political elite (a minority group that holds decision-making power), it may also be possible to speak of a “semiotic elite” – those nodes and actors who possess the power to influence interpretation for larger groups of people. This semiotic elite, created by the cost of visibility, decides how the world should be interpreted. Naturally, this is not a product of new media or technology: since at least the printing press there has been a group of people who more or less guided (or sought to guide) how information was interpreted. The issue, however, is that nowadays we are leaving the formation of this powerful group in the hands of social media algorithms and their logics.

Other than the previously explained ‘Matthew effect’ (see PART 1), the second network process described by Leal (2020: 500) is called ‘clustering’. In essence, clustering “reveals our tendency to connect with people who are similar or close to us”. This, in turn, leads to segregation, that is, the so-called social media bubbles or echo-chambers (Leal 2020; Stano 2020).

Echo-chambers are densely connected network clusters with few, if any, links to other groupings. They are bounded spaces marked by the internal reproduction of ideas rather than the external production of knowledge. […] As clustered, tight-knit communities, echo-chambers forge particular beliefs and shield their members from outside influences. With little or no exchange of information with opposing or even different groupings, the adopted viral narratives will reverberate in feedback-loops leading to persistent worldviews and resistance to change. (Leal 2020: 507)

Echo-chambers thus thwart the development of people’s critical and analytical faculties, a configuration that “feeds propaganda and extremism and reduces democracy and critical debate” (Stano 2020: 487). And the

ideological effect of the echo-chamber is based on both homophily (similarity) and heterophily (dissimilarity) if we think of the two extremes of a social structure (the complete community or a complete network). In this way, if ideological discourses or orientations are repeated inside each echo-chamber (or subcommunity), the discourses – being ideologically similar – resonate, and the discourses are reproduced and reinforced within each online community, resulting in a nondialogue between communities. (Caballero 2020: 140)
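Purely as an illustration of clustering as a measurable network property (on a synthetic toy graph, not on real platform data), the sketch below builds two dense communities joined by a single bridge and shows that they are easily detected as separate ‘chambers’:

```python
# Toy illustration of clustering and echo-chamber-like structure with networkx.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

G = nx.Graph()
G.add_edges_from(nx.complete_graph(range(0, 6)).edges())   # dense community A
G.add_edges_from(nx.complete_graph(range(6, 12)).edges())  # dense community B
G.add_edge(5, 6)                                           # single bridge between them

print("average clustering coefficient:", round(nx.average_clustering(G), 3))
print("detected communities:", [sorted(c) for c in greedy_modularity_communities(G)])
```

In such a structure, almost all of a node’s connections stay inside its own community, which is the formal counterpart of the feedback-loops described above.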

In conclusion, the architecture of social media platforms itself provides fertile ground for this reproduction of segregation, which is central to the process of radicalisation and conspiracy thinking (‘us-vs.-them’ logic). As such,

Proliferation of conspiracist discourse in a society creates micro-areas of shared meaning that are impermeable and in conflict with each other. (Leone, Madisson, Ventsel 2020: 47)

In this way, social media platforms create isolated and conflicting worldviews – ways of interpreting events and circumstances that are particular, almost tailored to each individual, rather than shared (Leone, Madisson, Ventsel 2020).

The fact of the matter is that, despite it being a space for politics and community-creation, the internet is “first and foremost, a place of business in which most of the information and data produced are employed to make a profit” (Puumeister 2020: 519). In concise terms, “the hunt for profit and the transformation of behaviour and experience into data and information can be said to constitute the underlying logic for constructing the affordances of new media environments” (Ibid) – social media works the way it works because it is profitable that way.

In this sense, it seems naïve to point to community-building and dialogue efforts as solutions to the problem of conspiracy thinking when these efforts are obstructed by the very structure of the online networks that host most of present-day human communication, built on the notion that division and conflict generate profit.

It appears that some fundamental change is needed in the underlying logics upon which social media environments are constructed. Hence, “understanding the way networks operate and how they can be manipulated by actors with vested political or economic interests is key” (Leal 2020: 499) to thinking of solutions to this issue.

References

Birchall, Clare 2006. Knowledge Goes Pop: From Conspiracy Theory to Gossip. Oxford: Berg Publishers.

Caballero, Estrella 2020. Social Network Analysis, Social Big Data and Conspiracy Theories. In: Butter, Michael; Knight, Peter (eds.), Routledge Handbook of Conspiracy Theories. New York: Routledge, 135-147.

Leal, Hugo 2020. Networked Disinformation and The Lifecycle of Online Conspiracy Theories. In: Butter, Michael; Knight, Peter (eds.), Routledge Handbook of Conspiracy Theories. New York: Routledge, 497-511. 

Leone, Massimo; Madisson, Mari-Liis; Ventsel, Andreas 2020. Semiotic Approaches to Conspiracy Theories. In: Butter, Michael; Knight, Peter (eds.), Routledge Handbook of Conspiracy Theories. New York: Routledge, 43-55.

Madisson, Mari-Liis; Ventsel, Andreas 2021. Strategic Conspiracy Narratives: A Semiotic Approach. New York: Routledge.

Puumeister, Ott 2020. Conspiratorial Rationality. Sign Systems Studies, 48(2-4): 519-528.

Stano, Simona 2020. The Internet and The Spread of Conspiracy Content. In: Butter, Michael; Knight, Peter (eds.), Routledge Handbook of Conspiracy Theories. New York: Routledge, 483-496.


Influence of social network processes on radical conspiracy theories

Editor’s note: This post is part one of a two-part series that the author intends to publish over the next two months.

There is a structural tendency for extremists to uphold conspiracy theories, which is reflected in their dichotomic thinking style aimed at making sense of societal events by providing oversimplified explanations (van Prooijen et al. 2015). More specifically, “tragedies caused by extremism are rooted substantially in a tendency to be distrustful and paranoid toward groups of other-minded individuals” (Ibid), which is also a characteristic of conspiracy thinking, known as ‘us-vs.-them’ logic. In a general manner, conspiracy theories and radicalization are both fundamentally related to meaning-making processes that “may compensate for personal uncertainties by providing self-regulatory clarity, and by imbuing the world with meaning and purpose” (Ibid, 571), thus, meaning is “created in a situation of existential fragility” (Önnerfors, Steiner 2018: 33). Accordingly, conspiracy thinking may be seen as a core component of the process of radicalization into extremism.

Recent studies have been moving away “from debunking conspiracy theories towards exploring their meaning for those involved” (Harambam 2020: 280). A possible approach regards how “conspiracy theories serve as a way to express distrust and discontent with authorities, and perhaps even distrust towards society more generally” (Thórisdóttir et al. 2020: 313), showcasing how, in a more general manner, “relationships in public based automatically upon authority are in decline” (Fairclough 1995: 137).

In general, the last few decades have shown how epistemic authorities and distributions of power have changed (Lorusso 2022). More people are now able to intervene in the public sphere, feeling empowered by new media and its logics to act as reliable information sources (Ibid). In this context, the basis of conspiracy narratives becomes what Mari-Liis Madisson (2014) calls ‘social trust’ – that is, the verification of these narratives transcends any reference to proven facts; instead, they rely on other narratives for support (Madisson 2014; Stano 2020), creating an interdependent web of conspiracy narratives.

Further, it is crucial to consider that the circulation of information – and consequently of mis/disinformation – is regulated by network processes rather than being driven by chance (Leal 2020: 499). This means that, when speaking about online communication, it is also important to consider some of the socio-technical affordances of social media and how they impact communication. Leal (2020) describes two basic processes by which online networks generally operate: “The ‘Matthew effect’ (related to centrality and power) and clustering (related to homophily and transitivity)”.

The ‘Matthew effect of accumulated advantage’ is “determinant of hierarchies in online social networks” (Leal 2020: 499), describing the emergence of interconnected hubs, actors, or nodes, whose path is “dependent and favours those who are already central, powerful and influential” (Ibid).
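This dynamic can be illustrated with a small preferential-attachment simulation (a sketch with networkx; the parameters are arbitrary), in which newly arriving nodes link preferentially to nodes that are already well connected:

```python
# Toy Matthew-effect simulation: preferential attachment concentrates connections in a few hubs.
import networkx as nx

G = nx.barabasi_albert_graph(n=1000, m=2, seed=42)
degrees = sorted((d for _, d in G.degree()), reverse=True)

top_share = sum(degrees[:10]) / sum(degrees)
print(f"top 10 of 1000 nodes hold {top_share:.1%} of all connections")
print("five largest degrees:", degrees[:5])
```

Even in this toy graph, a handful of early and already central nodes end up holding a disproportionate share of all links.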

A couple of years ago, the Director-General of the World Health Organisation declared: “we’re not just fighting an epidemic; we’re fighting an infodemic” (WHO 2020: vii). In the face of this infodemic, the expansion of social media has reached a paradox:

The number of opinion holders and discussion platforms has multiplied to such a degree that it is quite likely that any particular posting will not be noticed by nearly anyone against the background of a general flood of information. This information overload has increased the relevance of focusers or filters of attention, which can be institutions, individual mediators (e.g. social media micro-celebrities) or algorithms (e.g. those that mark trending themes), that can bring attention to a certain topic or an event. (Madisson, Ventsel 2021: 17)

Visibility has thus become one of the most desirable resources in the landscape of online media, and yet, it largely tends to be obtained by those who are already in possession of it. Therefore,

While the scale, speed and reach are generally conceived as providing equal opportunities for communication between people and the spread of narratives, the reality shows that this horizontality is illusory. […] In social networks, especially, it is not the equal distribution of interconnections, but the fact that some nodes are more well-connected than others that makes an idea or a virus circulate faster and more efficiently. (Leal 2020: 499)

As a consequence, the “turn-of-the-century utopian dream of the internet as a space of liberation and as a birthplace of new democratic communities has vanished” (Puumeister 2020: 520). Horizontal distribution of power and equal opportunity are now illusions, or lost to the past, having been replaced with the logics of algorithms, which merely increase the influence of those who already possess it. The effects of this can be seen in many spheres of human life, especially in processes of meaning-making.

Follow the continuation of this discussion (PART 2) here on the VORTEX blog, coming June 2024.

References

Fairclough, Norman 1995. Critical Discourse Analysis: The Critical Study of Language. London: Longman.

Harambam, Jaron 2020. Conspiracy Theory Entrepreneurs, Movements and Individuals. In: Butter, Michael; Knight, Peter (eds.), Routledge Handbook of Conspiracy Theories. New York: Routledge, 278-291.

Leal, Hugo 2020. Networked Disinformation and The Lifecycle of Online Conspiracy Theories. In: Butter, Michael; Knight, Peter (eds.), Routledge Handbook of Conspiracy Theories. New York: Routledge, 497-511.

Lorusso, Anna M. 2022. Fake News as Discursive Genre: Between Hermetic Semiosis and Gossip. Social Epistemology, 37(17): 1-13, DOI: 10.1080/02691728.2021.2001604.

Madisson, Mari-Liis 2014. The Semiotic Logic of Signification of Conspiracy Theories. Semiotica 202 (2014): 273-300.

Madisson, Mari-Liis; Ventsel, Andreas 2021. Strategic Conspiracy Narratives: A Semiotic Approach. New York: Routledge.

Önnerfors, Andreas; Steiner, Kristian 2018. Expressions of Radicalization: Global Politics, Processes and Practices. London: Palgrave Macmillan.

Puumeister, Ott 2020. Conspiratorial Rationality. Sign Systems Studies, 48(2-4): 519-528.

Stano, Simona 2020. The Internet and The Spread of Conspiracy Content. In: Butter, Michael; Knight, Peter (eds.), Routledge Handbook of Conspiracy Theories. New York: Routledge, 483-496.

Thórisdóttir, Hulda; Mari, Silvia; Krouwel, André 2020. Conspiracy Theories, Political Ideology and Political Behaviour. In: Butter, Michael; Knight, Peter (eds.), Routledge Handbook of Conspiracy Theories. New York: Routledge, 304-316.

van Prooijen, Jan-Willem; Krouwel, André P.; Pollet, Thomas V. (2015). Political extremism predicts belief in conspiracy theories. Social Psychological and Personality Science, 6 (5), 570-578. DOI: 10.1177/19485506145673

WHO 2020. An ad hoc WHO technical consultation managing the COVID-19 infodemic: call for action. Geneva: World Health Organization. Licence: CC BY-NC-SA 3.0 IGO. Retrieved from: https://www.who.int/publications/i/item/9789240010314, 06.03.23.