Influence of social network processes on radical conspiracy theories (PART 2) 

PART 1 proposed that the Internet’s horizontal distribution of power and equal opportunity are either illusory or lost to the past, replaced by algorithmic logics that merely amplify the influence of those who already possess it, as described by the ‘Matthew effect of accumulated advantage’.
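To make that mechanism a little more concrete, below is a minimal sketch, not drawn from any of the cited sources, of preferential attachment, the network-growth model most commonly associated with the Matthew effect. It assumes Python with the networkx library; the graph size and attachment parameter are arbitrary illustrative choices.

```python
# A toy illustration (editor's sketch, not from the cited literature) of
# preferential attachment: newcomers link preferentially to nodes that are
# already well connected, so early advantage compounds over time.
import networkx as nx

# Grow a network of 1,000 nodes; each new node attaches to 2 existing nodes,
# chosen with probability proportional to their current degree.
G = nx.barabasi_albert_graph(n=1000, m=2, seed=42)

degrees = sorted((d for _, d in G.degree()), reverse=True)
top_share = sum(degrees[:10]) / sum(degrees)

print(f"median degree: {degrees[len(degrees) // 2]}")
print(f"highest degree: {degrees[0]}")
print(f"share of all links held by the top 10 nodes: {top_share:.1%}")
```

Even in this toy network, a handful of well-connected nodes ends up holding a disproportionate share of all links: the structural counterpart of ‘the rich get richer’.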

Beyond that, it is also interesting to consider social media influencers and their role not only in disseminating (mis/dis)information, but also in guiding interpretation (telling people how to understand a piece of news, for example) and in providing others with reasons for action (for instance, boycotting a particular business, purchasing products from a particular company, contacting representatives with certain demands, donating to certain organizations, voting for a particular party, organizing protests, etc.).

Such influencers can participate in deliberately coordinated networks of influencing activities and be engaged in the dissemination of strategic talking points […], but their posting activities can also be self-started, aiming at the advancement of personal brands, i.e. increasing personal popularity. Often, they are doing both. (Madisson, Ventsel 2021: 17)

The crux of the matter is that there are different possible ways to configure knowledge or to encode information, and “those that gain precedence will influence what it means to know; what kind of knowledge is culturally valued; how we learn; and who will have access to knowledge and power” (Birchall 2006: 8). 

Therefore, in the same way that one may speak of an economic elite (a minority group that holds wealth) or a political elite (a minority group that holds decision-making power), it may also be possible to speak of a “semiotic elite” – those nodes and actors who possess the power to influence interpretation for larger groups of people. This semiotic elite, created by the cost of visibility, decides how the world should be interpreted. Naturally, this is not a product of new media or technology: at least since the printing press, there has always been a group of people who more or less guided (or sought to guide) how information was interpreted. The issue is that nowadays we are leaving the formation of this powerful group in the hands of social media algorithms and their logics.

In addition to the previously explained ‘Matthew effect’ (see PART 1), the second network process described by Leal (2020: 500) is ‘clustering’. In essence, clustering “reveals our tendency to connect with people who are similar or close to us”. This, in turn, leads to segregation, that is, to the so-called social media bubbles or echo-chambers (Leal 2020; Stano 2020).

Echo-chambers are densely connected network clusters with few, if any, links to other groupings. They are bounded spaces marked by the internal reproduction of ideas rather than the external production of knowledge. […] As clustered, tight-knit communities, echo-chambers forge particular beliefs and shield their members from outside influences. With little or no exchange of information with opposing or even different groupings, the adopted viral narratives will reverberate in feedback-loops leading to persistent worldviews and resistance to change. (Leal 2020: 507)
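As a rough illustration of what such a structure looks like in network terms, here is a toy sketch, my own construction rather than anything from Leal (2020), again assuming Python with networkx: two densely connected groups joined by a single bridge edge, with a standard community-detection routine showing how few links cross between them.

```python
# A toy echo-chamber-like structure (editor's illustration): two complete
# subgraphs of 10 nodes each stand in for tight-knit communities, joined by
# a single bridge edge.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

club_a = nx.complete_graph(range(0, 10))
club_b = nx.complete_graph(range(10, 20))
G = nx.union(club_a, club_b)
G.add_edge(0, 10)  # the only link between the two groupings

# High clustering inside each group, almost no edges across: "densely
# connected network clusters with few, if any, links to other groupings".
print(f"average clustering coefficient: {nx.average_clustering(G):.2f}")

communities = greedy_modularity_communities(G)
cross_edges = [(u, v) for u, v in G.edges()
               if not any(u in c and v in c for c in communities)]
print(f"detected communities: {len(communities)}")
print(f"edges between communities: {len(cross_edges)}")
```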

Echo-chambers thus thwart the development of people’s critical faculties and analytical skills, a configuration that “feeds propaganda and extremism and reduces democracy and critical debate” (Stano 2020: 487). And the

ideological effect of the echo-chamber is based on both homophily (similarity) and heterophily (dissimilarity) if we think of the two extremes of a social structure (the complete community or a complete network). In this way, if ideological discourses or orientations are repeated inside each echo-chamber (or subcommunity), the discourses – being ideologically similar – resonate, and the discourses are reproduced and reinforced within each online community, resulting in a nondialogue between communities. (Caballero 2020: 140)

In conclusion, the very architecture of social media platforms provides fertile ground for this reproduction of segregation, which is central to the process of radicalization and conspiracy thinking (‘us-vs.-them’ logic). As such,

Proliferation of conspiracist discourse in a society creates micro-areas of shared meaning that are impermeable and in conflict with each other. (Leone, Madisson, Ventsel 2020: 47)

In this way, social media platforms create isolated and conflicting worldviews, ways of interpreting events and circumstances that are particular, almost tailored to each individual, rather than shared (Leone, Madisson, Ventsel 2020).

The fact of the matter is that, despite it being a space for politics and community-creation, the internet is “first and foremost, a place of business in which most of the information and data produced are employed to make a profit” (Puumeister 2020: 519). In concise terms, “the hunt for profit and the transformation of behaviour and experience into data and information can be said to constitute the underlying logic for constructing the affordances of new media environments” (ibid.) – social media works the way it works because it is profitable that way.

In this sense, it seems naïve to point to community-building and dialogue efforts as solutions to the problem of conspiracy thinking when these efforts are obstructed by the very structure of the online networks that host most present-day human communication, networks built on the notion that division and conflict generate profit.

It appears that some fundamental change is needed in the underlying logics upon which social media environments are constructed. Hence, “understanding the way networks operate and how they can be manipulated by actors with vested political or economic interests is key” (Leal 2020: 499) to thinking of solutions to this issue.

References

Birchall, Clare 2006. Knowledge Goes Pop: From Conspiracy Theory to Gossip. Oxford: Berg Publishers.

Caballero, Estrella 2020. Social Network Analysis, Social Big Data and Conspiracy Theories. In: Butter, Michael; Knight, Peter (eds.), Routledge Handbook of Conspiracy Theories. New York: Routledge, 135-147.

Leal, Hugo 2020. Networked Disinformation and The Lifecycle of Online Conspiracy Theories. In: Butter, Michael; Knight, Peter (eds.), Routledge Handbook of Conspiracy Theories. New York: Routledge, 497-511. 

Leone, Massimo; Madisson, Mari-Liis; Ventsel, Andreas 2020. Semiotic Approaches to Conspiracy Theories. In: Butter, Michael; Knight, Peter (eds.), Routledge Handbook of Conspiracy Theories. New York: Routledge, 43-55.

Madisson, Mari-Liis; Ventsel, Andreas 2021. Strategic Conspiracy Narratives: A Semiotic Approach. New York: Routledge.

Puumeister, Ott 2020. Conspiratorial Rationality. Sign Systems Studies, 48(2-4): 519-528.

Stano, Simona 2020. The Internet and The Spread of Conspiracy Content. In: Butter, Michael; Knight, Peter (eds.), Routledge Handbook of Conspiracy Theories. New York: Routledge, 483-496.