Designing and delivering evaluation of complex policies

De-risking programmes 

Increasingly, evaluative thinking is being used at the front end of programme and policy development. I use a range of tools to help policy and service designers 'de-risk' programmes ahead of committing to delivery. Existing theory can be used to frame and model complex behaviours. Participatory systems mapping can be used to understand the causal drivers of a policy issue and to identify ideal points of intervention. Theory of Change workshops can be used to ensure there is a good understanding of how a policy or programme will work, and to surface risks to delivery and results. Participatory workshops and research can be used to clarify the perspectives and values of the stakeholders who may be key to policy success. Social research and behavioural insights can be used to gain a deep understanding of what activities might trigger and sustain change over time. Finally, evidence maps can be produced to illustrate the knowns and unknowns of a problem and highlight where more evidence is needed. Taken together, this body of evidence and insight can help forge a more realistic route to intended results. Building on prior developmental work during my stint as Principal Evaluation Officer at the Department of Energy and Climate Change, I recently worked with CECAN and Technopolis to develop the Defra Theory of Change toolkit. This is now the Departmental tool for use in the early stages of policy and evaluation design.

Supporting commissioners and evaluation teams 

I work with both clients and evaluators in the early stages of evaluation. Client-side, I support commissioners to scope and clarify evaluation requirements, set up effective evaluation systems, and manage evaluators. Over 2022-23 I worked extensively with the Youth Endowment Fund to help them design a rigorous two-year evaluation of the Ministry of Justice's Turnaround Programme. I also work with evaluation agencies, advising on the design and delivery of more complex evaluations. Since 2019 I have worked extensively with ICF Ltd on its portfolio of evaluations for Defra and its agencies, and with Technopolis on the design and delivery of evaluations for DESNZ and DLUHC (such as the ongoing Freeports Programme Evaluation).

I also provide both clients and evaluators with ongoing advice, support and QA to optimise evaluation and to help steer evaluations over time, especially when new or innovative methods are required, or for particularly complex programmes. And I get involved in the detailed delivery of workstreams where appropriate - for example, leading on realist evaluation, contribution analysis, process tracing, and/or qualitative and quantitative analysis and subsequent sense-making.

Navigating complexity

Complex programmes tend to work in an emergent way towards multiple outcomes. They may have to overcome many barriers and obstacles to achieve their goals, over multiple 'stages'. Success may depend on the behaviours of multiple stakeholders who can only be indirectly influenced or incentivised to behave differently. In this context, the evidence base for how to achieve these goals is uncertain, and clients need to actively manage delivery, changing activities over time to adapt to a changing environment. Actively monitoring and evaluating programme delivery and outcomes over time is critical in these contexts, and can provide regular, timely, actionable learning and evidence that can be used to strengthen programmes and policies. These kinds of evaluation frameworks need to be flexible, and will need to be reviewed regularly to ensure they are tailored to emergent requirements. I help make sure these kinds of evaluations are deliverable, and produce actionable learning and evidence.
