The importance of evaluation in preventing and countering violent extremism (P/CVE) practice is widely recognised. While things are moving in the right direction, more work is needed to make evidence-based evaluation an integral and standard part of the P/CVE field. The Horizon 2020-funded INDEED project (2021-2024) sought to address this need.

One objective of INDEED was to outline what evidence-based evaluation means in P/CVE. Numerous focus groups, interviews, stakeholder meetings, and events demonstrated great interest in evaluation but also a lack of knowledge and experience.

The project found that the key to successful and productive evaluations is not to see evaluation as a one-time event but rather as a systemic part of organisational processes. Evaluations are most useful and reliable when they are supported by a well-developed evaluation culture in which evaluation is seen as an instrument for learning and is actively used to improve processes and practices.

Developing this kind of evaluation culture can alleviate fears around evaluation, increase knowledge and motivation to design more evidence-based programmes, and nurture a willingness to improve one’s own work.

The results of the INDEED project suggest five steps that can be taken to foster evaluation culture in the P/CVE field.

1. Build a shared understanding of what evaluation means

Practitioners and policymakers have varied understandings of what evaluation and evidence mean and what evaluation involves. In the law enforcement sector, for example, “evidence” is mainly understood in the context of forensics, and evaluation is associated with top-down discussions about assessing operations and interventions. Practitioners can also conflate evaluation with the risk assessment of individuals rather than the assessment of P/CVE programmes.

The key to successful and productive evaluations is not to see evaluation as a one-time event but rather as a systemic part of organisational processes.

To create a shared understanding of the principles and processes of evaluation, INDEED developed the INDEED evidence-based evaluation model, which provides a good starting point for practitioners and policymakers from all sectors. The model captures the main principles behind evidence-based evaluation, such as stakeholder and community orientation, evidence collection and professional judgement. It also covers the main iterative steps of the evaluation process defined by the INDEED project: preparation, design, execution and utilisation.

2. Plan early and do it together

It is never too early to think about evaluation. Evidence-based evaluation often requires data to be collected at different stages, and it benefits from an evaluation strategy developed as part of the programme’s implementation plan. This plan should not be dictated top-down but developed with key stakeholders, especially those implementing the initiative. Doing so usually increases their commitment to evaluation significantly and makes the results more useful for their work.

Evaluation is almost always possible, but its depth and rigour will depend on organisational resources, time and willingness to find solutions to problems that emerge. Advance planning is particularly important if the evaluation requires handing over sensitive data. This can be addressed by, for example, appointing an independent internal evaluator or using a traffic light system to indicate what data can be shared with an external evaluator.

3. Build knowledge about evaluation

Evidence-based evaluation should involve those who plan and implement the P/CVE initiative and not just external evaluators. Staff should have a basic understanding of evaluation so they can contribute to building a productive evaluation culture.

To support practitioners with evaluations, INDEED developed, tested, and validated an evidence-based evaluation tool to give practitioners the information necessary to plan and implement their evaluations, and design new programmes with evaluation in mind.

Increasing and upgrading knowledge of evaluation processes is key to successful P/CVE practice. The P/CVE field needs more systematic, robust and affordable training. The INDEED project created self-paced online training and a training curriculum for P/CVE practitioners, but much more needs to be done, including better opportunities for in-person training, which allow more effective experimental simulation methods to be used.

4. Policymakers and funders should provide structures and resources to support evaluation

A common reason for resisting evaluation is the fear of negative consequences. Evaluations tend to be more successful and useful when they are planned and implemented with learning rather than control in mind, and when they are not tied to funding decisions.

Policymakers and funders can help overcome practical obstacles to evaluation, both by allocating funding for evaluation when financing projects and by making funding available to evaluate existing P/CVE initiatives that lack an evaluation budget.

Some funders have recommended that evaluation be carried out in the middle of a project so that results can be incorporated into practice more quickly. Bringing in knowledgeable evaluators early on can provide support for programme implementers. Academic researchers can also support practitioners with the development of an evaluation plan by bringing state-of-the-art perspectives from P/CVE research and recommending evaluation designs.

5. Share stories, encourage and motivate others

It became clear from the start of INDEED just how important sharing evaluation results is for building an evaluation culture and applying evidence-based approaches in the P/CVE field. Planning evaluations and evidence-based initiatives is much easier when it is possible to learn from what others have done.

There are good reasons why not all results can be shared, but it is highly recommended to do so whenever possible. Extracting lessons learnt and analysing their transferability to P/CVE in other contexts is crucial for strengthening the evidence base. Sharing evaluation results and experiences with planning and implementing evaluations will motivate and encourage other P/CVE stakeholders to design better programmes and perform more rigorous evaluations.


The INDEED project has received funding from the European Union’s Horizon 2020 Research and Innovation Programme under grant agreement No. 101021701.

Read more

Malkki, L., van der Vet, I. & Prokic, M. (2023a). Evidence-based evaluation of P/CVE and de-radicalisation initiatives. INDEED e-guidebook 1. https://bit.ly/43cLlbS

Malkki, L., van der Vet, I. & Prokic, M. (2023b). How to design PVE/CVE and de-radicalisation initiatives and evaluations according to the principles of evidence-based practice. INDEED e-guidebook 2. https://bit.ly/43cLlbS