People regularly encounter false political information on social media, and perhaps one in ten go on to share it. Online misinformation spreads far and fast, with potential consequences for attitudes, beliefs, and actions.

Considerable effort has been invested in countering the spread of false information, with mixed success. Alongside fact-checking and debunking, psychological interventions have been developed, including inoculation, gamified interventions, and approaches that focus attention on accuracy. All of these approaches can work. However, some interventions are difficult to scale, while others rely on cooperation from social media platforms. Furthermore, a recent re-analysis of data from gamified interventions suggests they may reduce trust in information overall rather than enhance our ability to tell truth from falsehood, and a re-analysis of data from interventions prompting people to consider accuracy suggests that approach is ineffective for politically conservative people.

Our work focuses on why people share false information. Some individuals share misinformation because they genuinely believe it to be true, while others knowingly share false content, and some personality types may be more likely to engage with false information than others. Because motivations influence the effectiveness of interventions, understanding them tells us whether an intervention that helps people recognise false information is likely to work only for those who believe the information to be true, or also for those who will share it anyway to achieve some desired outcome.

Debunking false information is unlikely to be effective for those who will share it anyway. 

Motives for Sharing  

Based on social media users’ own accounts, we identified six distinct sets of motives for sharing political information and misinformation. Three sets — prosocial activism, awareness, and fighting false information — demonstrate a desire to ‘make things better’, benefiting other individuals and society as a whole. The other three sets reflect motives relating to attack or manipulation of others, political self-expression, and entertainment. Sharing misinformation can therefore be driven by destructive motives, but also be seen as a strategy to enhance social cohesion.  

  • Prosocial activism: A desire to educate, inform, or mobilise other people in ways intended to benefit them or society. These motives include sentiments about driving social change, critical thinking, morality, and political accountability, as well as informing people. They reflect proactive use of social media to achieve political or social goals the individual regards as positive, without resorting to tactics such as attacking others.

  • Awareness: These motives seemed to revolve around transparency or making people aware of information, and reflect ‘good’ reasons for sharing information. However, themes appear to be tinged with suspicion, and may be indicative of conspiracist ideation. 

  • Fighting False Information: Combating misinformation and minimising its harm, generally reflecting social responsibility in the political misinformation domain. Individuals endorsing these items might try to debunk false information (even if inadvertently spreading it further while doing so). 

  • Attack or manipulation of others: Cynical, antisocial, and manipulative use of social media. A desire to achieve one’s own ends with a disregard for the truth or the welfare of others. Some of these motives dealt with self-enhancement; others with actively doing harm. Overall, these sentiments either directly opposed the ‘prosocial activism’ motives or treated them as irrelevant.

  • Political self-expression: Expression of political views and participation in political debate. People endorsing these motives want to talk about politics, not necessarily to bring about political change. 

  • Entertainment: A desire to entertain oneself or others, be funny, or alleviate boredom.   

Individual Differences  

It is important to consider the characteristics of people who engage with false information online. Some research suggests people with particular profiles are more likely to share misinformation (for example, people who are politically conservative and have low levels of conscientiousness). It has also been suggested that some interventions may only work for particular types of people, such as accuracy nudges only being effective for politically liberal individuals. However, research on personality and demographic variables has produced conflicting results, making it challenging to draw definitive conclusions.

Findings from several of our studies suggest that schizotypy, a multidimensional set of characteristics associated with disordered thinking, may be important. Positive schizotypy is associated with suspicion, disordered perception, and belief in the paranormal, and we have found that people with higher levels of positive schizotypy are more likely to report sharing false information. These findings are based on self-report data, and we need to extend them with behavioural evidence.

Going ‘all in’ on one specific type of intervention may be unwise. 

Practical Implications  

While effective interventions have been developed, none is likely to work for everyone. For example, a truth-discernment protocol might work for those motivated by prosocial activism, but it is unlikely to be effective for those who share political information with the intention of attacking or manipulating others. Additionally, certain people may be more vulnerable to misinformation, or more resistant to particular interventions. This means going ‘all in’ on one specific type of intervention may be unwise. Further research is needed on the individual characteristics that influence engagement with misinformation, considered within the wider picture of general vulnerability to online influence. Finally, more work is needed to evaluate the actual effects of exposure to false information online.

Tom Buchanan is a Professor of Psychology at the University of Westminster. Dr Rotem Perach is a Research Fellow at the University of Westminster. Dr Deborah Husbands is a Reader in Psychology at the University of Westminster. Some of the research described in this article was supported by The Leverhulme Trust, Research Project Grant RPG-2021-163. 

Read more

Buchanan, T. (2020) Why Do People Share Disinformation On Social Media? 

Buchanan, T. (2020) Why do people spread false information online? The effects of message and viewer characteristics on self-reported likelihood of sharing social media disinformation. PLoS One, 15(10).    

Buchanan, T., & Kempley, J. (2021) Individual differences in sharing false political information on social media: Direct and indirect effects of cognitive-perceptual schizotypy and psychopathy. Personality and Individual Differences, 182.    


Ecker, U. K. H., Lewandowsky, S., Cook, J., Schmid, P., Fazio, L. K., Brashier, N., Kendeou, P., Vraga, E. K., & Amazeen, M. A. (2022) The psychological drivers of misinformation belief and its resistance to correction. Nature Reviews Psychology, 1, 13-29.     


Lewandowsky, S., Van der Linden, S., & Cook, J. (2018) Can We Inoculate Against Fake News?    

Modirrousta-Galian, A. & Higham, P. A. (2023) Gamified inoculation interventions do not improve discrimination between true and fake news: Reanalyzing existing research with receiver operating characteristic analysis. Journal of Experimental Psychology General, 152(9), 2411–2437.     

Pennycook, G., Epstein, Z., Mosleh, M., Arechar, A. A., Eckles, D., & Rand, D. G. (2021) Shifting attention to accuracy can reduce misinformation online. Nature, 592, 590–595.

Perach, R., Joyner, L., Husbands, D., & Buchanan, T. (2023) Why Do People Share Political Information and Misinformation Online? Developing a Bottom-Up Descriptive Framework. Social Media + Society, 9(3).