Two new projects addressing security threats faced by the UK have been announced:
The Adaptable Law Enforcement Officer: Developing a Measure of Adaptive Effectiveness
This project, led by Dr Simon Oleszkiewicz (University of Twente, Netherlands) in collaboration with Erik Mac Giolla (University of Gothenburg, Sweden), aims to develop a behavioural measure of adaptability relevant to police contexts.
To examine adaptive behaviour, Oleszkiewicz’s team have developed a novel experimental set-up inspired by observations of training at the Los Angeles Police Department.
In Experiment 1, university students will take the role of an ‘agent’ who must complete three ‘undercover missions’. Adaptive behaviour will be elicited by three features: a goal, an expectation, and a violation of that expectation. The violation creates the novel or unexpected situation that participants must adapt to in order to attain their mission objective. Adaptability will be measured as the adjustments made in response to the changed situational demands.
Experiment 2 will be a vignette study examining perceptions of the adaptive responses. A sample of practitioners with relevant experience will watch video recordings of adaptive responses from Experiment 1 and rate their efficacy in attaining mission objectives.
You can read more about Oleszkiewicz’s project here.
Human Engagement Through Artificial / Augmented Intelligence
This project is led by Professor Chris Baber (School of Computer Science, University of Birmingham), working with Ian Apperly (School of Psychology, University of Birmingham), and is the latest project to be awarded funding from CREST’s recent commissioning call.
Baber’s project looks at ‘augmented intelligence’ and how it can extend human cognitive ability. The capability of Artificial Intelligence / Machine Learning (AI/ML) to explore vast data resources and discover patterns exceeds that of humans. However, human expertise allows insight into unusual or unfamiliar patterns. The project therefore focuses on the need to ensure collaboration between human and machine in pursuit of sense-making.
This requires that the AI is able to explain itself to the human, that the human can provide explanation to the AI, and that human-AI engagement progresses through the establishment and maintenance of common ground.
Not only is it important that humans and automation establish and use common ground, but also that humans who communicate through the automation maintain it with each other. The project asks how common ground might break down, in order to explore the consequences and possible mitigations.
You can read more about Baber’s project here.
To see all the 2019 CREST commissioned projects go to: www.crestresearch.ac.uk/projects/.
More projects to come…
Watch this space: the other successful projects (subject to contracts being finalised) will be announced very soon. Follow us on Twitter, Facebook and LinkedIn, or sign up to our newsletter, to stay updated.