What makes people share misinformation on social media?

Given the information environment we find ourselves in, keeping track of what is true and what is misleading is not just a matter of integrity and morality, but a matter of survival. Disinformation kills people. Bad information prevents smart decisions, leading to the spread of preventable diseases, undermining public health initiatives, and preventing us from recognizing and responding to escalating threats including climate change, COVID-19, human migration, and many others.

People tend to believe falsehoods despite obvious evidence to the contrary when something feels like it could be true (its "gist" rather than its literal truth) and aligns with their values and desires. What's more, they are likely to share it without pausing to reflect. However, research is less clear about which individual factors shape people's decisions to engage with or share potentially misleading social media posts.

Examining the drivers of disinformation

Source: Fauxels/Pexels

To understand the drivers of disinformation, Morosoli and colleagues conducted research published in the journal American Behavioral Scientist (2022). They looked at sociodemographic factors such as gender, age, and level of education; dispositional factors related to social media behavior, including political beliefs, congruence between a post and one's attitudes toward a given issue ("attitude congruence," e.g., if I believe climate change is a hoax and I see news stories that match that idea, there is more congruence), and the perceived importance of the topic ("topic salience"); as well as relevant personality traits, in particular the dark triad of narcissism, psychopathy, and Machiavellianism.

The researchers surveyed over 7,000 people in six different countries, all Western democracies (Switzerland, Belgium, France, Germany, Great Britain, and the USA). They focused on information about climate change, immigration, and COVID-19, all divisive social issues notoriously riddled with disinformation.

The subjects were shown three sample social media posts, one on each topic, which for the purposes of the study were designed to contain misleading information while reflecting actual posts. They were told that their opinions would be sought to help an online news service verify the posts before publication. They were asked how likely they would be to engage with a particular post and how motivated they would be to share it. Various demographic factors, attitudes, and political beliefs were assessed using validated rating scales.

Correlates of interacting with and sharing deceptive social media messages

The researchers found that men, older people, and those with a lower level of education were significantly more likely to engage with the sample posts. Two of the dark triad personality traits, narcissism and psychopathy, were associated with greater engagement with the posts. Additionally, people with conservative (as opposed to liberal) political leanings were more likely to engage with the posts.

Subjects indicated that they were more likely to share sample posts consistent with their attitudes and beliefs; for example, a fake news post claiming climate change isn't real appealed most to those who already doubted climate change. The more salient a topic was perceived to be, the more motivated participants said they would be to share a given post. Participants were most likely to engage with the climate change post, followed by immigration and, last, coronavirus.

Perhaps unsurprisingly, people who used social media more in general were more likely to be willing to engage with posts. Those with greater baseline trust in social media messages reported greater motivation to share posts. Individuals who tended to engage with posts from friends and family were also more likely to share posts, suggesting a synergistic social effect.

There were some nuanced results that warrant further investigation. For example, while right-leaning individuals were generally more receptive to disinformation, the effect was stronger for the immigration-related post. Likewise, attitude congruence played a greater role for immigration: engagement with misinformation was more likely when the information in the post matched one's own perspective.

Notably, for the climate change post, blaming protesters for leaving litter behind drove engagement more strongly than the issue of climate change itself, underscoring how much the details matter in understanding the spread of disinformation on social media. This suggests that inserting inflammatory material into social media posts, even material irrelevant to the main point, could be used manipulatively to encourage engagement and sharing.

Finding meaning in an increasingly opaque and fast-paced information environment

More work is needed to fully understand how disinformation spreads through social media engagement and sharing. Careful research examining each individual issue along with relevant sub-factors would be required to reveal all the different pathways and factors that predict both belief in falsehoods and engagement with and sharing of them. Research would also need to tap into psychological factors such as the "illusory truth effect"1 and how beliefs and upbringing can lead some to blindly follow authoritarian leaders.

Because research of this type can be used both in the service of truth and in support of disinformation efforts, an escalating information arms race is looming, with various interest groups able to leverage research on the psychology of disinformation for their own goals.

Where it will lead is difficult to predict. Hopefully, research like this can be used to build consensus on how information can be managed as the great social experiment of technology unfolds and shapes our world.
