Soft Facts and Digital Behavioural Influencing After the 2017 Terror Attacks (Full Report)

This CREST report examines how ‘soft facts’ influence individual and collective behaviours, and identifies the most effective counter-measures for managing their consequences.

The report, by Martin Innes, presents findings from the research project Soft Facts And Digital Behavioural Influencing. It examines how a series of soft facts communicated on social media in the aftermath of four terrorist attacks in 2017 functioned to influence public perceptions and understandings of the causes and consequences of these events.

The research was designed to generate evidence and insights about three areas:

  1. To provide an empirically led dissection of how social reactions to terrorism are organised and how they are being transformed by emerging patterns of social communication. Whereas previous research has been predicated upon single case study designs, the current project compared social media reactions across four different events.
  2. To examine how disinformation and misinformation communicated via social media platforms influence public definitions of the post-terror event situation, and the ways people think, feel, and behave in relation to it.
  3. To make a more general conceptual and methodological contribution to the rapidly growing literature on disinformation and its effects.

For the purposes of this analysis, the report collectively defines these informational forms as soft facts. Where hard facts are objective and stable, soft facts are malleable and contested.

They are an important feature of the contemporary media ecosystem, especially in moments of emergency and crisis when people are particularly susceptible to influence.

The principal output of this analysis is the conceptualisation of eight ‘techniques of disinformation’:

  1. Seeding
  2. Denial of Credibility
  3. Event Ghosting
  4. Emulsifying
  5. Infiltrating and Inciting
  6. Spoofing
  7. Truthing
  8. Social Proofing

Taken together, these techniques of disinformation illuminate some of the workings of digital influence engineering in the contemporary information environment.

Read, download and share the full report: ‘Soft Facts’ (Full Report)

Read More

The Executive Summary of this report can be found at crestresearch.ac.uk/resources/soft-facts-summary

For further reading on this topic, see Martin Innes’ article ‘Russian Influence And Interference On Twitter Following The 2017 UK Terrorist Attacks’ in CREST Security Review, Issue 7: Transitions.

There is also a Policy Brief detailing how independent analysis has identified the systematic use of fake social media accounts, linked to Russia, to amplify the public impacts of four terrorist attacks that took place in the UK in 2017. You can download, read and share the four-page brief at crestresearch.ac.uk/resources/russian-influence-uk-terrorist-attacks

You can also read his published journal article: Innes, M. (2020). Techniques of disinformation: Constructing and communicating ‘soft facts’ after terrorism. British Journal of Sociology (DOI: 10.1111/1468-4446.12735)

These resources are produced from the ‘Soft Facts And Digital Behavioural Influencing’ project, funded by CREST. To find out more about the project, and to see other outputs from the team, visit the Soft Facts And Digital Behavioural Influencing project page.


As part of CREST’s commitment to open access research, these resources are available under a Creative Commons BY-NC-SA 4.0 licence. For more details on how you can use our content, see here.