This CREST report examines how ‘soft facts’ influence individual and collective behaviours, and what counter-measures are most effective for managing their consequences.

This project analyses social media data collected in the aftermath of four terror attacks that took place in the UK in 2017, to explore how various rumours, conspiracy theories, propaganda and fake news shaped social reactions to these incidents, and the ways they came to be defined and understood. 

For the purposes of the analysis we collectively define these informational forms as ‘soft facts’. Where ‘hard facts’ are objective and stable, soft facts are malleable and contested. They are an important feature of the contemporary media ecosystem, especially in moments of emergency and crisis when people are highly susceptible to influence.

Techniques of disinformation

The principal output of this analysis is the conceptualisation of eight ‘techniques of disinformation’. Individually and collectively, these are designed to capture the key methods by which misleadingly influential communications are constructed and communicated:

Seeding

Seeding involves using misinformation to create an element of doubt in audience members’ minds about what to believe about an occurrence. In effect, communicating misinformation creates the conditions for disinformation, shaping the thoughts, feelings and behaviour of the audience.

Denial of credibility

Denial of credibility is where an attempt to undermine belief or trust in a specific unit of information is predicated upon attacking or undermining its source in some way. This often involves impugning the source’s motives or past behaviour.

Event ghosting

‘Event ghosting’ is a way of changing the meaning of an event or episode for an audience, via the insertion of made-up features into narratives about it. Importantly, most of the time this is not accomplished by devising an alternative narrative, but by revising and editing components of one that is already established.

Emulsifying 

Emulsifying is based upon blending two separate event narratives together in order to misdirect audience attention. Typically, this works in one of two ways: either by ‘loading up’ the level of complexity, rendering the narrative so difficult to understand that most people do not try; or by drastically simplifying things, for instance by suggesting that one event is just like another (when they are not actually alike).

Infiltrating and inciting

Infiltrating and inciting is a specific technique where an agent of influence deliberately enters an established thought community by mimicking its social identities and interests, with the intent of messaging provocatively to inflame members’ emotions.

Spoofing

Spoofing involves imitating an established digital social identity, often by co-opting linguistic tropes and visible symbols of a group.

Truthing

‘Truthing’ is where support for an idea or position is built by manipulating images, statistics, or other evidence. This can include conspiratorial ‘truth claims’ as well as claims more limited in their purview.

Social proofing

Social proofing uses affordances designed into social media technologies to create an aura or illusion of support or consensus around a controversial issue.

This can, for example, be done by artificially inflating the number of ‘likes’ or supportive comments attached to a message, on the premise that such displays of consensus might modify the behaviours of other users.

Taken together, these techniques of disinformation illuminate some of the workings of digital influence engineering in the contemporary information environment.

Types of soft facts

There is increasing political and public consternation about how the communication of misinformation and disinformation within and across media platforms is corroding public trust in key institutions, and democratic processes and values.

The value of adopting a digital behavioural analytics approach to this problem is in determining how these kinds of influence are being accomplished and by whom.

A key aspect of the analysis lies in identifying a range of online actors engaged in constructing and communicating different kinds of soft fact. This includes:

  • Citizens at the scene who misinterpret things that they see or hear, but are able to communicate these to large numbers of followers via social media without validating the provenance of the information they are sharing.
  • Other citizens who, driven by personal social-psychological needs that are not well understood, seek to insert themselves into the story, in ways that do not necessarily reflect what actually happened.
  • Journalists who, under intense pressure to break stories before their competitors, amplify false or misleading information in ways that can have long-term consequences in terms of how an event is publicly defined and understood.
  • Groups with strong ideological agendas who want to interpret occurrences in such a way that they can be seen to support their political values and perspectives.
  • Hostile states who, by manipulating and amplifying particular messages, seek to exacerbate social tensions between existing groups.

This latter dimension is an especially important finding of the work for policy and practice.

Unexpectedly, when analysing the empirical data collected following the four terror attacks, the researchers identified and attributed a number of Russian-linked social media accounts authoring and amplifying provocative and highly antagonistic messages.

Collectively, the accounts concerned adopted a spread of different political standpoints, with messaging consistent with each position.

As such, the study has identified a new and troubling dimension to what happens in the aftermath of terror attacks, and to what needs to be done to manage and mitigate the public impacts of such events.

In documenting the social dynamics and mechanics of how soft fact communications can shape and steer the ways terror events come to be interpreted and defined, the analysis makes a distinctive contribution to a growing body of research interested in understanding processes of social reaction to terrorism.

Social media are very important to such efforts, because they fundamentally alter these processes while simultaneously affording digital traces that enable them to be studied in high resolution, in ways that were not previously possible.

Adopting this approach, the study documents how the communication of misinformation and disinformation in the wake of a terror attack can influence the overall level of social harm it induces.

The implications for policy and practice that flow from this insight concern the importance of actively managing the information environment and being willing to disrupt and counter any soft facts communicated following an attack.

Read more

A Policy Brief from this project is also available, detailing how the systematic use of fake social media accounts linked to Russia amplified the public impact of the four terrorist attacks that took place in the UK in 2017.
You can download, read and share the four-page brief at crestresearch.ac.uk/resources/russian-influence-uk-terrorist-attacks/

For further reading on this topic, see Martin Innes' article 'Russian Influence And Interference On Twitter Following The 2017 UK Terrorist Attacks' in CREST Security Review, Issue 7: Transitions.