
Influence of social network processes on radical conspiracy theories (PART 2) 

PART 1 proposed that the Internet’s horizontal distribution of power and equal opportunity are illusory or lost to the past, having been replaced by the logics of algorithms, which merely amplify the influence of those who already possess it, as described by the ‘Matthew effect of accumulated advantage’.

Beyond that, it is also interesting to consider social media influencers and their role not only in disseminating (mis/dis)information, but also in guiding interpretation (telling people how to understand a piece of news, for example) and providing others with reasons for action (for instance, boycotting a particular business, purchasing products from a particular company, contacting representatives with certain demands, donating to certain organizations, voting for a particular party, organizing protests, etc.).

Such influencers can participate in deliberately coordinated networks of influencing activities and be engaged in the dissemination of strategic talking points […], but their posting activities can also be self-started, aiming at the advancement of personal brands, i.e. increasing personal popularity. Often, they are doing both. (Madisson, Ventsel 2021: 17)

The crux of the matter is that there are different possible ways to configure knowledge or to encode information, and “those that gain precedence will influence what it means to know; what kind of knowledge is culturally valued; how we learn; and who will have access to knowledge and power” (Birchall 2006: 8). 

Therefore, in the same way that one may speak of an economic elite (a minority group that holds wealth) or a political elite (a minority group that holds decision-making power), it may also be possible to speak of a “semiotic elite”: those nodes and actors who possess the power to influence interpretation for larger groups of people. This semiotic elite, created by the cost of visibility, decides how the world should be interpreted. Naturally, this is not a product of new media or technology: at least since the printing press, there has always been a group of people who guided (or sought to guide) how information was interpreted. The issue is that nowadays we are leaving the formation of this powerful group in the hands of social media algorithms and their logics.

Besides the ‘Matthew effect’ discussed in PART 1, the second network process described by Leal (2020: 500) is ‘clustering’. In essence, clustering “reveals our tendency to connect with people who are similar or close to us”. This, in turn, leads to segregation, that is, to the so-called social media bubbles or echo-chambers (Leal 2020; Stano 2020).

Echo-chambers are densely connected network clusters with few, if any, links to other groupings. They are bounded spaces marked by the internal reproduction of ideas rather than the external production of knowledge. […] As clustered, tight-knit communities, echo-chambers forge particular beliefs and shield their members from outside influences. With little or no exchange of information with opposing or even different groupings, the adopted viral narratives will reverberate in feedback-loops leading to persistent worldviews and resistance to change. (Leal 2020: 507)

Echo-chambers thus thwart the development of people’s critical faculties and analytical skills, a configuration that “feeds propaganda and extremism and reduces democracy and critical debate” (Stano 2020: 487). And the

ideological effect of the echo-chamber is based on both homophily (similarity) and heterophily (dissimilarity) if we think of the two extremes of a social structure (the complete community or a complete network). In this way, if ideological discourses or orientations are repeated inside each echo-chamber (or subcommunity), the discourses – being ideologically similar – resonate, and the discourses are reproduced and reinforced within each online community, resulting in a nondialogue between communities. (Caballero 2020: 140)

In conclusion, the architecture of social media platforms itself provides fertile ground for this reproduction of segregation, which is central to the process of radicalization and conspiracy thinking (‘us-vs.-them’ logic). As such,

Proliferation of conspiracist discourse in a society creates micro-areas of shared meaning that are impermeable and in conflict with each other. (Leone, Madisson, Ventsel 2020: 47)

In this way, social media platforms create isolated and conflicting worldviews, ways of interpreting events and circumstances that are particular, almost tailored to each individual, rather than shared (Leone, Madisson, Ventsel 2020).

The fact of the matter is that, despite being a space for politics and community-creation, the internet is “first and foremost, a place of business in which most of the information and data produced are employed to make a profit” (Puumeister 2020: 519). In concise terms, “the hunt for profit and the transformation of behaviour and experience into data and information can be said to constitute the underlying logic for constructing the affordances of new media environments” (Ibid) – social media works the way it does because it is profitable that way.

In this sense, it seems naïve to point to community-building and dialogue efforts as solutions to the problem of conspiracy thinking when these efforts are obstructed by the very structure of the online networks that host most present-day human communication, a structure built on the notion that division and conflict generate profit.

It appears that some fundamental change is needed in the underlying logics upon which social media environments are constructed. Hence, “understanding the way networks operate and how they can be manipulated by actors with vested political or economic interests is key” (Leal 2020: 499) to thinking of solutions to this issue.

References

Birchall, Clare 2006. Knowledge Goes Pop: From Conspiracy Theory to Gossip. Oxford: Berg Publishers.

Caballero, Estrella 2020. Social Network Analysis, Social Big Data and Conspiracy Theories. In: Butter, Michael; Knight, Peter (eds.), Routledge Handbook of Conspiracy Theories. New York: Routledge, 135-147.

Leal, Hugo 2020. Networked Disinformation and The Lifecycle of Online Conspiracy Theories. In: Butter, Michael; Knight, Peter (eds.), Routledge Handbook of Conspiracy Theories. New York: Routledge, 497-511. 

Leone, Massimo; Madisson, Mari-Liis; Ventsel, Andreas 2020. Semiotic Approaches to Conspiracy Theories. In: Butter, Michael; Knight, Peter (eds.), Routledge Handbook of Conspiracy Theories. New York: Routledge, 43-55.

Madisson, Mari-Liis; Ventsel, Andreas 2021. Strategic Conspiracy Narratives: A Semiotic Approach. New York: Routledge.

Puumeister, Ott 2020. Conspiratorial Rationality. Sign Systems Studies, 48(2-4): 519-528.

Stano, Simona 2020. The Internet and The Spread of Conspiracy Content. In: Butter, Michael; Knight, Peter (eds.), Routledge Handbook of Conspiracy Theories. New York: Routledge, 483-496.


Breakdown of knowledge authority: semiotic analysis of an anti-vax conspiracy theory influencer on Twitter

Abstract

Anti-vax conspiracy theories are major drivers of “vaccine hesitancy”, a top-10 threat to global health according to the WHO. This paper investigates the interpretative mechanisms and discursive conditions of anti-vax discourse on Twitter (X), through the analysis of seven tweets posted by an anti-vax influencer. Mixed methods of discourse analysis are employed, focusing on the strategic character and potential social effects of discourse. As a set of relations, the code-text of anti-vax conspiracy theories is characterized by a conflict between authority and freedom. The archetype of the enemy is diffuse and composed of different elements (government, mainstream media, medical/scientific community) that are all totalized into one-and-the-same evil: “the authorities.” Overall, when facing the increasing deconstruction of epistemic authority on social media, the form (independently from content) with which anti-vax discourse seeks to provide argumentation (by framing identities and social relations in the shape of dichotomic oppositions) is fundamentally undesirable.

Full Citation:

Piva, H. C. (2024). Breakdown of knowledge authority: Semiotic analysis of an anti-vax conspiracy theory influencer on Twitter. Social Semiotics, 34(1), 1–22. https://doi.org/10.1080/10350330.2024.2341398



Influence of social network processes on radical conspiracy theories

Editor’s note: This post is part one of a two-part series that the author intends to publish over the next two months.

There is a structural tendency for extremists to uphold conspiracy theories, which is reflected in their dichotomic thinking style aimed at making sense of societal events by providing oversimplified explanations (van Prooijen et al. 2015). More specifically, “tragedies caused by extremism are rooted substantially in a tendency to be distrustful and paranoid toward groups of other-minded individuals” (Ibid), which is also a characteristic of conspiracy thinking, known as ‘us-vs.-them’ logic. In a general manner, conspiracy theories and radicalization are both fundamentally related to meaning-making processes that “may compensate for personal uncertainties by providing self-regulatory clarity, and by imbuing the world with meaning and purpose” (Ibid, 571); thus, meaning is “created in a situation of existential fragility” (Önnerfors, Steiner 2018: 33). Accordingly, conspiracy thinking may be seen as a core component of the process of radicalization into extremism.

Recent studies have been moving away “from debunking conspiracy theories towards exploring their meaning for those involved” (Harambam 2020: 280). A possible approach regards how “conspiracy theories serve as a way to express distrust and discontent with authorities, and perhaps even distrust towards society more generally” (Thórisdóttir et al. 2020: 313), showcasing how, in a more general manner, “relationships in public based automatically upon authority are in decline” (Fairclough 1995: 137).

In general, the last few decades have shown how epistemic authorities and distributions of power have changed (Lorusso 2022). More people are now able to intervene in the public sphere, feeling empowered by new media and its logics to act as reliable information sources (Ibid). In this context, the base of conspiracy narratives becomes what Mari-Liis Madisson (2014) calls ‘social trust’: these narratives are verified not by reference to proven facts but through other narratives that support them (Madisson 2014; Stano 2020), creating an interdependent web of conspiracy narratives.

Further, it is crucial to consider that the circulation of information – and consequently of mis/disinformation – is regulated by network processes rather than being driven by chance (Leal 2020: 499). This means that, when speaking about online communication, it is also important to consider some of the socio-technical affordances of social media and how they impact communication. Leal (2020) describes two basic processes by which online networks generally operate: “The ‘Matthew effect’ (related to centrality and power) and clustering (related to homophily and transitivity)”.

The ‘Matthew effect of accumulated advantage’ is “determinant of hierarchies in online social networks” (Leal 2020: 499), describing the emergence of interconnected hubs, actors, or nodes, whose path is “dependent and favours those who are already central, powerful and influential” (Ibid).

A couple of years ago, the Director-General of the World Health Organization declared: “we’re not just fighting an epidemic; we’re fighting an infodemic” (WHO 2020: vii). In the face of this infodemic, the expansion of social media has reached a paradox:

The number of opinion holders and discussion platforms has multiplied to such a degree that it is quite likely that any particular posting will not be noticed by nearly anyone against the background of a general flood of information. This information overload has increased the relevance of focusers or filters of attention, which can be institutions, individual mediators (e.g. social media micro-celebrities) or algorithms (e.g. those that mark trending themes), that can bring attention to a certain topic or an event. (Madisson, Ventsel 2021: 17)

Visibility has thus become one of the most desirable resources in the landscape of online media, and yet, it largely tends to be obtained by those who are already in possession of it. Therefore,

While the scale, speed and reach are generally conceived as providing equal opportunities for communication between people and the spread of narratives, the reality shows that this horizontality is illusory. […] In social networks, especially, it is not the equal distribution of interconnections, but the fact that some nodes are more well-connected than others that makes an idea or a virus circulate faster and more efficiently. (Leal 2020: 499)

As a consequence, the “turn-of-the-century utopian dream of the internet as a space of liberation and as a birthplace of new democratic communities has vanished” (Puumeister 2020: 520). Horizontal distribution of power and equal opportunity are now illusions or lost to the past, having been replaced by the logics of algorithms, which merely amplify the influence of those who already possess it. The effects of this can be seen in many spheres of human life, especially in processes of meaning-making.

Follow the continuation of this discussion (PART 2) here on the VORTEX blog, coming June 2024.

References

Fairclough, Norman 1995. Critical Discourse Analysis: The Critical Study of Language. London: Longman.

Harambam, Jaron 2020. Conspiracy Theory Entrepreneurs, Movements and Individuals. In: Butter, Michael; Knight, Peter (eds.), Routledge Handbook of Conspiracy Theories. New York: Routledge, 278-291.

Leal, Hugo 2020. Networked Disinformation and The Lifecycle of Online Conspiracy Theories. In: Butter, Michael; Knight, Peter (eds.), Routledge Handbook of Conspiracy Theories. New York: Routledge, 497-511.

Lorusso, Anna M. 2022. Fake News as Discursive Genre: Between Hermetic Semiosis and Gossip. Social Epistemology 37(17): 1-13. DOI: 10.1080/02691728.2021.2001604.

Madisson, Mari-Liis 2014. The Semiotic Logic of Signification of Conspiracy Theories. Semiotica 202: 273-300.

Madisson, Mari-Liis; Ventsel, Andreas 2021. Strategic Conspiracy Narratives: A Semiotic Approach. New York: Routledge.

Önnerfors, Andreas; Steiner, Kristian 2018. Expressions of Radicalization: Global Politics, Processes and Practices. London: Palgrave Macmillan.

Puumeister, Ott 2020. Conspiratorial Rationality. Sign Systems Studies, 48(2-4): 519-528.

Stano, Simona 2020. The Internet and The Spread of Conspiracy Content. In: Butter, Michael; Knight, Peter (eds.), Routledge Handbook of Conspiracy Theories. New York: Routledge, 483-496.

Thórisdóttir, Hulda; Mari, Silvia; Krouwel, André 2020. Conspiracy Theories, Political Ideology and Political Behaviour. In: Butter, Michael; Knight, Peter (eds.), Routledge Handbook of Conspiracy Theories. New York: Routledge, 304-316.

van Prooijen, Jan-Willem; Krouwel, André P.; Pollet, Thomas V. 2015. Political Extremism Predicts Belief in Conspiracy Theories. Social Psychological and Personality Science 6(5): 570-578. DOI: 10.1177/19485506145673.

WHO 2020. An ad hoc WHO technical consultation managing the COVID-19 infodemic: call for action. Geneva: World Health Organization. Licence: CC BY-NC-SA 3.0 IGO. Retrieved from: https://www.who.int/publications/i/item/9789240010314, 06.03.23.