- The overwhelming majority of P/CVE programmes have not been subject to formal evaluation. Where evaluations have taken place, they often fall short of the standards of transparency, independence, and rigour typical of related fields.
- Greater sharing of internal evaluations would strengthen the field, accelerate progress, and avoid the development of parallel research cultures in the open and closed-source literature. Pooling expertise on evaluation methods across the different bodies involved in delivering P/CVE interventions, together with more publicly accessible evaluations, would also strengthen work in this area.
- Key challenges facing P/CVE interventions include the absence of an appropriate counterfactual (an understanding of what would have happened in the absence of an intervention) and the small number of people supported through these programmes. Quasi-experimental designs have been used in comparable fields, such as gang-related interventions, and have the potential to overcome these challenges.
- There are ethical and security challenges in selecting an appropriate control group against which to evaluate the impact of P/CVE programmes. It is relatively straightforward to identify a control group for primary prevention methods aimed at larger populations; it is far harder to generate control groups for those at risk of, or involved in, extremism, as this would typically involve denying individuals access to support in order to determine whether an intervention was effective. A switching replications design, in which the control group receives the intervention at a later stage, can potentially overcome this issue.
- There is an absence of robust data against which to triangulate the findings of P/CVE evaluations. However, lessons can be learned from evaluations of gang-related interventions, which commonly use more than one evaluation method to triangulate their findings.
This report provides an overview of the types of P/CVE programmes that have been developed, reviews the methods used to evaluate them, and outlines the challenges facing evaluation efforts alongside a review of how research has sought to overcome them. The most significant limitation is the lack of evaluative work carried out to date. Most research is descriptive, and although there is a good understanding of the challenges facing the field, few studies have successfully addressed them.
This report is one of a series exploring Knowledge Management Across the Four Counter-Terrorism ‘Ps’, which looks at areas of policy and practice that fall within the four pillars of CONTEST.
As part of CREST’s commitment to open access research, this report is available under a Creative Commons BY-NC-SA 4.0 licence.
IMAGE CREDITS: Copyright ©2020 R. Stevens / CREST (CC BY-SA 4.0)