Are social media culprits in spreading conspiracy theories?

Anyone talking about conspiracy thinking or conspiracy theories quickly comes across words like "echo chambers" and "filter bubbles." People are said to end up in a rabbit hole in which they see their own rightness confirmed over and over again, and are therefore slowly sucked into ever more extreme thinking. A major culprit is said to be social media, whose algorithms pull people into this rabbit hole. But is it really that simple? What role do social media actually play in conspiracy theories?

What are we actually talking about?

Social media, social networks; previously a distinction was sometimes made between the two. Social media would primarily be a way to share or convey information online to a wide audience, while social networks - according to Wikipedia, a type of online social media platform - primarily have the function of bringing people (with similar interests) together. This requires a certain dedication or activity ("engagement") from users. The classic example of such a network is Facebook.

By now, this distinction has become quite blurred. Virtually all social media have proven to also have a community or networking function, even if - initially - that was not their intended purpose. Consider, for example, YouTube, which started as a platform for people to share videos they had made themselves. Today, there are entire YouTube communities whose revenue model is built on interaction with viewers who feel part of the community. Conversely, one could argue that Facebook, initially aimed at connecting with friends and acquaintances, has also become an important source of information for a group of people who use it to keep abreast of what is going on in the world.

In short, whoever talks about social media is usually also talking about social networks and vice versa.

The use of social media

How a person uses social media varies. Last year I did a Twitter analysis looking at how people were tweeting - in Dutch - about a particular alleged conspiracy. The analysis showed that supporters of the conspiracy theory used Twitter in different ways. Both as a way to bring like-minded people together, and as a way to refer people to semi-open or closed platforms, where the discussion continued. Think of Telegram channels. Twitter was also used to seek interaction with prominent people or (mainstream) media outlets that could increase attention to the alleged conspiracy.

The way social media can be used varies. Similarly, the type of user also varies. There are those who are the "face" of a particular conspiracy theory, there are those who stay more in the background but are crucial in creating engaging content, there are those who take on the management of a Facebook page or Telegram channel. There are people who consume content and leave it at that, and people who then go on to actively distribute the content.

All of these people likely have a variety of reasons for doing what they do, motivations that, moreover, may change over time. Consider concern about the state of the Netherlands or the world, idealism, opportunism, financial gain, prestige, disappointment or pleasure, or a mix of these. All this makes it complex to speak of "the user" of "social media."

The connectors: perhaps most interesting

What struck me most about the results of the network analysis? The assumption of a closed environment in which the same ideas are constantly repeated without exposure to dissent was only partially true. Yes, there were some Twitterers on the edges of the network who interacted mostly with each other, but there were also Twitterers in the middle of the network who actively sought interaction with people not (yet) convinced by the theory. These users are not the most visible to the outside world, but they play an important role in gaining attention and spreading the thinking. Such "connective leaders" (see Poell et al., 2015) are the connectors who create the right conditions in which a theory or community can flourish.

Interestingly, researcher Marijn Keizer (Institute for Advanced Study in Toulouse) recently stated in the Volkskrant that he is more concerned about filter bubbles with dissenters than like-minded people. His research found that people who already hold extreme views actually see these views reinforced by coming into contact with dissenters (De Volkskrant, July 14, 2023, "Is polarization increasing in the Netherlands due to the advent of social media?").

Who is the boss?

An important note, however, is that we still know very little about the algorithms used by social media platforms. The assumption is that these are generally aimed at enticing users to linger on a platform as long as possible. So the question is to what extent users, including disseminators of conspiracy narratives, can "steer" their information, or whether for them, too, the algorithm is in charge. Moreover, it remains to be seen whether that leads to other, more extreme views among the audience that sees the content, and whether that in turn leads to more polarization. Spoiler: it's not that simple. For an insight into the considerations, I recommend reading the aforementioned article in De Volkskrant, or Emma Ooms' chapter in the volume Political Polarization in the Netherlands (see the link at the bottom of this article).

What is certain, moreover, is that consumers of content on social media have a certain freedom of choice. Not necessarily always whether or not to be introduced to something - because sometimes that simply happens - but to do something with it. To go along with it or not, to want to know more about it or not. Algorithms may play a part in this, but I wouldn't want to overestimate that role. Whether something resonates with you as a user will also have to do with your personal circumstances and the society in which you find yourself.

As Tom Dobber states in the volume Political Polarization in the Netherlands, where he explains the operation of algorithms: "It is more likely that algorithms reinforce polarization than that they cause polarization. Structural problems such as inequality and identity can be magnified and reinforced through algorithms, but algorithms were applied in a world already rife with inequality and different perceptions of identity" (p. 144).

Importance of research

Social media have become an integral part of our existence. For many people, besides being a fun pastime, they are also a way to gather information and consume news. That social media play a role in spreading more extreme ideas or conspiracies can no longer be denied. But exactly what role and what that then does to the ideas and opinions of the recipient is a more complex issue.

Research is gradually teaching us more. Personally, I am very curious about the results of the large-scale study of 208 million Facebook accounts (the first results can now be read in Science and Nature) and the study on the influence of search engines by the University of Twente.

Read/see/listen 

  • Research from the University of Twente on the influence of search engines;
  • Colleague Lidwien van de Wijngaert previously wrote an article on filter bubbles and echo chambers in the Zwarte Piet debate;
  • The volume Political Polarization in the Netherlands, including a contribution by colleagues Marianne van Bochove, Tess Schijvenaars and Hans Moors. Free download via this link;
  • Thomas Poell, Rasha Abdulla, Bernhard Rieder, Robbert Woltering & Liesbeth Zack (2015): Protest leadership in the age of social media, Information, Communication & Society, DOI: 10.1080/1369118X.2015.1088049;
  • The New York Times podcast Rabbit Hole. From 2020, but still relevant.