Evaluation fulfils a variety of roles in the field of preventing and countering violent extremism (P/CVE), which often involves a range of non-state and civil society actors in primary, secondary and tertiary prevention activities. Against the backdrop of evaluation’s historical role in public policy, where it frequently informed budget decisions (cf. Derlien 1998), evaluation is often associated with a legitimising function. This is particularly true in a funding landscape in which projects compete for limited funds (Malet 2021; KN:IX 2020). When state funding bodies request evaluations, this can create the impression that evaluations are primarily instruments of control (Sivenbring and Andersson Malmros 2019; cf. Treischl and Wolbring 2020).
However, evaluation can also be emancipatory. Self-evaluation in particular has the potential to professionalise the practice of preventing and countering violent extremism and to establish a stronger professional self-image (von Berg et al. 2024, 216). This opportunity contrasts with a reality in which the topic of evaluation has come into focus only in recent years (cf. Uhl and Kattein 2024; Bressan et al. 2024) and is still often criticised as inadequate in scope and quality (cf. Feddes and Gallucci 2015; van Hemert et al. 2014). There are many reasons for this: on the one hand, evaluation skills are frequently lacking at the individual level; on the other hand, institutionalised procedures and conditions are often missing at the structural level. The latter is currently being investigated as part of a country-comparative dissertation within the VORTEX doctoral network under the provisional title ‘Evaluation and Quality Management in the Field of Preventing and Countering Violent Extremism – Twelve European Countries in Comparison’. However, there are also starting points at the organisational level to make evaluations more practicable: appropriate structures and processes can, for example, demystify external evaluations and make the potential of self-evaluations easier to realise. This blog post outlines three such levers at the organisational level: a systematic and impact-oriented practice, a monitoring system, and internal preparation for the event of an evaluation.
Systematic and impact-oriented practice
A P/CVE practice that is systematically structured and impact-oriented is much easier to evaluate than a practice in which impact assumptions, intervention goals and success criteria have not been defined in advance. Developing a theory of change for the overall strategy is a fundamental first step, as it creates a common understanding of how and why certain measures should contribute to the achievement of objectives (INDEED 2023, 16).
However, it is also crucial to address the level of specific measures: impact assumptions and intervention objectives should be precisely formulated and operationalised with measurable success criteria. In their evaluation of federally funded counselling centres in Germany, Karliczek et al. (2023) propose a model that differentiates objectives in disengagement work. This nine-field matrix structures goals along two dimensions: the reference level – i.e. whether a goal relates to the radicalised person themselves, to their interfaces with the environment or to the social environment – and the level of disengagement work, which distinguishes between pragmatic aspects, socio-affective-emotional dynamics and ideological-normative convictions.
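To make this more concrete, the following Python sketch shows, purely as an illustration, how such a goal matrix could be represented in an organisation’s case documentation. The two dimensions follow the description above; the example goal and its success criteria are hypothetical and not taken from Karliczek et al. (2023).

```python
from dataclasses import dataclass, field

# The two dimensions of the nine-field matrix as described above.
REFERENCE_LEVELS = ["radicalised person", "interfaces with the environment", "social environment"]
WORK_LEVELS = ["pragmatic", "socio-affective-emotional", "ideological-normative"]

@dataclass
class InterventionGoal:
    """An intervention goal placed in one cell of the matrix,
    operationalised with measurable success criteria."""
    description: str
    reference_level: str                     # one of REFERENCE_LEVELS
    work_level: str                          # one of WORK_LEVELS
    success_criteria: list[str] = field(default_factory=list)

# Hypothetical example goal; the wording is illustrative only.
goal = InterventionGoal(
    description="Client builds a stable daily routine",
    reference_level="radicalised person",
    work_level="pragmatic",
    success_criteria=[
        "regular attendance at work or training",
        "self-reported structure in everyday life",
    ],
)
```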
If such intervention goals have been systematically developed and concretised using success criteria, counsellors can use them to reflect on the results of their work with clients, for example by measuring how a client fares on each goal before and after an intervention (Karliczek et al. 2023, 106). A practice that works on this basis makes it much easier to document the impact of disengagement work in a comparable way across many cases. This is not about rigid standardisation, however, but about structured flexibility: a framework concept that guides professional action without restricting individual casework. One example of such a systematic procedure is the social diagnostic model adapted for disengagement work by Violence Prevention Network e. V. It combines a holistic analysis of cases with needs-oriented interventions and creates a sound basis for making impact both visible and assessable (von Berg et al. 2024). Importantly, the model leaves counsellors the flexibility to decide which measures are necessary and appropriate in a given context.
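As a minimal sketch of what such comparable documentation could look like, the snippet below aggregates hypothetical pre/post ratings per intervention goal across several cases. The goal names, the 1-to-5 scale and all values are assumptions chosen for illustration, not part of the model described above.

```python
from statistics import mean

# Hypothetical pre/post ratings per intervention goal
# (1 = not achieved, 5 = fully achieved); values are illustrative only.
cases = {
    "case_01": {"stable daily routine": (2, 4), "distance from extremist contacts": (1, 3)},
    "case_02": {"stable daily routine": (3, 3), "distance from extremist contacts": (2, 4)},
}

# Collect the change (post minus pre) per goal across all cases,
# so that results can be documented in a comparable way.
changes: dict[str, list[int]] = {}
for ratings in cases.values():
    for goal_name, (pre, post) in ratings.items():
        changes.setdefault(goal_name, []).append(post - pre)

for goal_name, deltas in changes.items():
    print(f"{goal_name}: mean change {mean(deltas):+.1f} across {len(deltas)} cases")
```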
Establishing a monitoring system
Such a systematic approach naturally lends itself to monitoring, i.e. the ongoing, structured recording of data. Applied to the example of measuring the results of disengagement work above, this means regularly documenting, for each specified intervention goal, whether a positive change has taken place for a client. Moving from the counselling example to trainings for multipliers on how to handle situations related to violent extremism, a key question might be whether participants feel more knowledgeable after the training; tracking such self-assessments over time is essential for meaningful evaluation. While monitoring focuses on ongoing data collection, evaluation can go a step further: it asks how effective the intervention is in achieving its intended goals and makes a judgement about its impact. In doing so, evaluation can build on the data collected through monitoring and analyse it to assess overall effectiveness (Junk 2021).
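A minimal sketch of such a monitoring routine, using the training example, might look as follows. The CSV format, the field names and the 1-to-5 knowledge scale are assumptions made for illustration rather than a prescribed template.

```python
import csv
from datetime import date

# Minimal monitoring log for multiplier trainings: each row documents one
# participant's self-assessed knowledge (1-5) before and after a session.
FIELDS = ["date", "training_id", "participant_id", "knowledge_pre", "knowledge_post"]

def log_entry(path: str, training_id: str, participant_id: str, pre: int, post: int) -> None:
    """Append one monitoring record; continuous recording over time is what
    distinguishes monitoring from a one-off data collection."""
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if f.tell() == 0:  # write the header only for a new, empty file
            writer.writeheader()
        writer.writerow({
            "date": date.today().isoformat(),
            "training_id": training_id,
            "participant_id": participant_id,
            "knowledge_pre": pre,
            "knowledge_post": post,
        })

# An evaluation can later read the accumulated log and judge overall
# effectiveness, e.g. by comparing average pre/post scores across trainings.
log_entry("monitoring_log.csv", "training_2025_03", "p_017", pre=2, post=4)
```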
Being prepared for evaluation
Even if an organisation structures its work systematically and monitoring is firmly anchored in practice, conducting a self-evaluation – and especially undergoing an external evaluation – can feel overwhelming. If the organisation is not sufficiently prepared, the potential of an impact-oriented approach to monitoring risks remaining unused. It is therefore advisable to think about responsibilities and coordination processes within the organisation at an early stage. Klöckner et al. (2021), for example, describe the benefits of an evaluation working group within civil society organisations that support clients in disengaging from violent extremist groups. Such a group can serve as a recurring exchange format that prepares the organisation for an external evaluation: it offers a protected framework in which expectations, fears and specific needs can be openly addressed. In this way, the organisation can collect and communicate its needs and objectives at an early stage (expectation management), discuss practical feasibility and consider from the outset how evaluation results can be transferred into practice (cf. Klöckner et al. 2021, 7). The overarching goal of the evaluation working group is to strengthen the organisation’s identification with the evaluation process and thus ensure the greatest possible benefit of the evaluation for the organisation (INDEED 2023, 28). Such a working group is equally suitable for preparing and coordinating self-evaluations; here, too, it can be a recurring structure that accompanies the process of harnessing monitoring data for evaluation.
This blog post has explored how P/CVE organisations can better harness the potential of evaluation by embedding three structural levers into their work: first, developing a systematic, impact-oriented practice; second, establishing a monitoring system to track progress over time; and third, preparing for evaluation through internal coordination processes such as an evaluation working group. These levers help make both self-evaluation and external evaluation more meaningful and manageable, transforming evaluation from a perceived burden into a valuable learning opportunity.
Sources
Bressan, Sarah, Sophie Ebbecke, and Lotta Rahlf. 2024. ‘How Do We Know What Works in Preventing Violent Extremism? Evidence and Trends in Evaluation from 14 Countries’. Berlin: GPPi; PrEval (PRIF). https://gppi.net/assets/BressanEbbeckeRahlf_How-Do-We-Know-What-Works-in-Preventing-Violent-Extremism_2024_final.pdf.
Derlien, Hans-Ulrich. 1998. ‘Le Développement Des Évaluations Dans Un Contexte International’. In Politiques Publiques: Évaluation, 7–11. Paris: Economica.
Feddes, Allard R., and Marcello Gallucci. 2015. ‘A Literature Review on Methodology Used in Evaluating Effects of Preventive and De-Radicalisation Interventions’. Journal for Deradicalization, no. 5 (December), 1–27.
Hemert, Dianne van, Helma van den Berg, Tony van Vliet, Maaike Roelofs, and Mirjam Huis in ’t Veld. 2014. ‘Synthesis Report on the State-of-the-Art in Evaluating the Effectiveness of Counter-Violent Extremism Interventions’. Deliverable 2.2. IMPACT Europe.
INDEED. 2023. ‘How to Design PVE/CVE and De-Radicalisation Initiatives and Evaluations According to the Principles of Evidence-Based Practice’. INDEED Consortium. https://home-affairs.ec.europa.eu/networks/eu-knowledge-hub-prevention-radicalisation/welcome-package/learning-resources/indeed-e-guidebook-2-how-design-pvecve-and-de-radicalisation-initiatives-and-evaluations-according_en.
Junk, Julian. 2021. Quality Management of P/CVE Interventions in Secondary and Tertiary Prevention: Overview and First Steps in Implementing Monitoring and Reporting. Radicalisation Awareness Network. https://home-affairs.ec.europa.eu/system/files/2021-12/ran_ad-hoc_quality_management_of_p-cve_interventions_122021_en.pdf.
Karliczek, Kari-Maria, Vivienne Ohlenforst, Dorte Schaffranke, Dennis Walkenhorst, and Juliane Kanitz. 2023. Evaluation bundesfinanzierter Beratungsstellen: Abschlussbericht der Evaluation der Beratungsstellen zur Distanzierung und Deradikalisierung vom islamistischen Extremismus. Beiträge zu Migration und Integration, Band 12. Nürnberg: Bundesamt für Migration und Flüchtlinge.
Klöckner, Mona, Svetla Koynova, Johanna Liebich, and Lisa Neef. 2021. ‘Erfahrungen aus der Evaluationsplanung eines Aussteigerprogramms. Voraussetzungen für Wirksamkeitserfassung in der tertiären Extremismusprävention’. PRIF Report 6. Frankfurt am Main: PrEval Consortium. https://www.prif.org/fileadmin/Daten/Publikationen/Prif_Reports/2021/PRIF0621_barrierefrei.pdf.
KN:IX. 2020. ‘Kompetenznetzwerk “Islamistischer Extremismus” (KN:IX) – Herausforderungen, Bedarfe und Trends im Themenfeld’. Berlin: Kompetenznetzwerk Islamistischer Extremismus. https://kn-ix.de/wp-content/uploads/2021/02/KNIX-Report-2020.pdf.
Malet, David. 2021. ‘Countering Violent Extremism: Assessment in Theory and Practice’. Journal of Policing, Intelligence and Counter Terrorism 16 (1): 58–74. https://doi.org/10.1080/18335330.2021.1889017.
Sivenbring, Jennie, and Robin Andersson Malmros. 2019. Mixing Logics: Multiagency Approaches for Countering Violent Extremism. Göteborg: Segerstedinstitutet, Göteborgs Universitet. https://www.gu.se/sites/default/files/2020-03/1764750_korrekt-versionmixing-logics_digital_korrekt.pdf.
Treischl, Edgar, and Tobias Wolbring. 2020. Wirkungsevaluation: Grundlagen, Standards, Beispiele. Weinheim/Basel: Beltz.
Uhl, Andreas, and Ian Kattein. 2024. ‘Monitoring von Evaluationskapazitäten in der Extremismusprävention, Demokratieförderung und Politischen Bildung’. In PrEval Monitor: PrEval Zukunftswerkstätten. https://preval.hsfk.de/fileadmin/PrEval/PrEval_Monitor_2024_engl..pdf.
Von Berg, Annika, Dennis Walkenhorst, Gloriett Kargl, and Maximilian Ruf. 2024. Soziale Diagnostik in der Extremismusprävention – Diagnose, Fallverstehen, Intervention und Wirkungsmessung. Ideologie und Gewalt – Schriften zur Deradikalisierung. Wiesbaden: Springer Fachmedien Wiesbaden. https://doi.org/10.1007/978-3-658-42427-5.