Final two projects addressing security threats announced today

New research projects announced

The final two projects to come out of CREST’s 2019 commissioning call have been announced today.

Addressing extremist Islamist disinformation and understanding the adoption of smart home technology are the subjects of two research projects (both based at the University of Bristol) which have been awarded funding today.

In addition to long-term research projects, CREST commissions six- and twelve-month projects to react to new and emerging requirements of its funders. CREST offered £1.12m to fund innovative proposals within this latest round of commissioning.

After a rigorous and independent review process, the successful projects were selected from more than 80 applications. The final two projects to be announced are:

Inoculating against the spread of Islamophobic and extremist Islamist disinformation

Extremist Islamophobia and Islamist extremism do not arise on their own. They arise when vulnerable people are exposed to increasingly extreme disinformation, either about the expansionist tendencies of Islam or about Western hostility towards Muslims.

Containment of extremism therefore demands the development of tools to control the impact and spread of disinformation.

This is particularly challenging because existing counter-extremist narratives have often turned out to be inadvertently counter-productive and because it is notoriously difficult to correct misinformation once it has been accepted as true by the recipients.

This project, led by Stephan Lewandowsky (University of Bristol), takes a different approach by examining the efficacy of inoculation against misinformation. It examines two related research questions:

1) What is the nature of the misinformation currently available in the UK cultural context that supports Islamophobia on the one hand, and Islamist extremism on the other? What is a likely path through the misinformation landscape that a vulnerable person might follow during self-radicalisation? Can reliable rhetorical markers be identified that reveal information to be false and extremist?

2) Can these markers of misinformation be “reverse engineered” to create inoculating tools that can protect vulnerable people against misinformation and potential radicalisation?

The first research question will be answered by a content analysis of the online misinformation landscape to identify representative pathways that a consumer might follow during self-radicalisation.

The second research question will be answered by an experimental study that seeks to inoculate participants (young Muslim and non-Muslim UK residents) against misinformation by training them to recognise the misleading rhetorical tools.

You can read more about this project here.

Understanding the role of individual differences in the adoption, use and exploitation of smart home technology

The advent of the Internet of Things (IoT) has raised the prospect of increasingly connected devices within the home. From smart locks and home surveillance systems to connected appliances, light bulbs, and heating systems, such technology presents both multiple advantages and potential risks related to data security and personal safety.

Current research has yet to address how differences in the adoption and use of new technology by different consumer groups can influence the potential for emergent vulnerabilities that are likely to be exploited in the criminal sphere. For instance, to what extent are different users likely to consider, understand and mitigate the potential security and personal safety risks associated with the technology that they purchase? And how might this influence how they choose to use such devices? How might criminal groups and others with malevolent intent exploit and react to potential vulnerabilities? And how might consumers themselves respond to these evolving risks?

This project, led by Emma Williams (University of Bristol), uses home-based IoT technology and cyber-enabled crime as a basis to explore the relationship between individual differences in the adoption and use of new technology, and the exploitation of such technologies for nefarious purposes.

The project sets out to:

  • Use home-based IoT technology as a framework to identify the potential of these devices to be exploited for criminal purposes via the ‘cyber-enablement’ of particular crimes;
  • Investigate the influence of individual differences in psychological characteristics and socio-demographic factors on the adoption and use of these devices at the consumer level, including how this may differ in response to perceived vulnerabilities of devices.

You can read more about this project here.

These are the final two projects to be announced, making a total of 11 commissioned projects. The research will start in October this year and is due for completion in September 2020.

The other nine commissioned projects are:

Rapport building: Culture and online vs. in-person interviews

Dr Ewout Meijer at Maastricht University

Examining how rapport is built between people of different cultural backgrounds, across online and in-person interactions, is the focus of this project led by Dr Ewout Meijer. Using Hall’s (1976) theory on low- and high-context communication cultures, this project will examine the effect of culture on rapport-building in investigative interviewing scenarios. Link to project.

Human Engagement Through Artificial / Augmented Intelligence

Professor Chris Baber at the University of Birmingham

Baber’s project looks at ‘Augmented intelligence’ and how it can extend human cognitive ability. The capabilities of Artificial Intelligence / Machine Learning (AI/ML) for exploring vast data resources and discovering patterns exceed those of the human. However, human expertise allows insight into unusual or unfamiliar patterns. The project therefore focuses on the need to ensure collaboration in pursuit of sense-making. Link to project.

The adaptable law enforcement officer: Developing a measure of adaptive effectiveness

Dr Simon Oleszkiewicz at the University of Twente, Netherlands

The ability to adapt to changing situations is vital for law enforcement officers who are charged with establishing contact and building relationships with sources in criminal environments (i.e., covert law enforcement). Not only do these officers have to maintain a guise of adhering to criminal conduct, they also have to react fittingly to novel and uncertain situational demands. This project aims to develop a behavioural measure of adaptability relevant for police contexts. Link to project.

Understanding Twenty-First Century Militant Anti-Fascism: An Analytical Framework And Matrix

Professor Nigel Copsey at Teesside University

Anti-fascist militancy has existed for as long as fascism has, but as a form of contentious politics, militant anti-fascism is still largely neglected across both academic and policy-practitioner communities. Since the societal conditions behind the current right-wing populist surge are unlikely to disappear anytime soon, there is a pressing need for research that addresses the security implications of radical extra-parliamentary groups who hold that violent confrontation is essential to effective anti-fascist opposition. Link to project.

Simulated phishing and employee cybersecurity behaviour (SPEC)

Dr John Blythe at CybSafe

This project will conduct two studies with differing approaches to investigate (i) how policies on simulated phishing emails are currently implemented in organisations, using a cross-sectional survey, and (ii) the impact of simulated phishing email policies on employees’ cyber security awareness and their perceptions of key factors (organisational trust, procedural fairness, stress and perceived monitoring), through an experimental study. Link to project.

Why do people spread disinformation on social media?

Professor Tom Buchanan at the University of Westminster

Individual social media users are key to the spread of disinformation online. By interacting with disinformation, they share it with their own social networks. This can greatly increase its reach and its potential impact on society. Why do people do this? Are they fooled by the disinformation, and spread it because they believe it is true? Do they know the information is fake, but spread it anyway? How does the way a disinformation message is presented influence our likelihood of sharing it? Are some people more likely to share disinformation than others? This project will address those questions. Link to project.

Collecting and Leveraging Identity Cues with Keystroke Analysis (CLICKA)

Dr Oliver Buckley at the University of East Anglia

The project is based on the idea of ‘motor learning’, which suggests that a task becomes more automatic and requires less conscious thought the more it is repeated. In the first instance, the project will develop an experimental framework to capture a user’s typing behaviours. This will then be used to create a predictive model, built with state-of-the-art machine learning techniques, capable of inferring all or part of an anonymous individual’s name. Link to project.

‘Hot periods’ of anti-minority activism and the threat of violent domestic extremism: Towards an assessment framework

Dr Joel Busher at Coventry University

The aim of this project is to develop a stronger understanding of the dynamics of violent escalation, non-escalation and de-escalation during periods of intense anti-minority activism, and in doing so enhance the ability of state and civil society actors to (a) assess the threat of violent escalation during and in the aftermath of such ‘hot periods’, and (b) more accurately anticipate how planned interventions are likely to play out on the ground. Link to project.

Understanding moral injury and belief change in the experiences of police online child sex crime investigators

Dr Peter Lee at the University of Portsmouth and Dr Mark Doyle at Solent University

This project will start by analysing and exploiting primary data from moral injury-related findings. These will subsequently be used to inform a focused enquiry into the causes of moral injury, and consequences such as changes in attitudes, beliefs and behaviour, among police internet child abuse investigators and relevant forensic teams. Link to project.

For more information about the successful applicants please visit the CREST website at: www.crestresearch.ac.uk/projects/.
