Matthew Baumann Associates
Helping research firms and consultants to design and deliver evaluations of complex policies
Helping consulting teams to apply evaluative thinking in their work with Government during the early stages of policy development
Increasingly, evaluative thinking is being used at the front end of programme and policy development. I work with consultants and research firms to apply a range of tools that help policy and service designers 'de-risk' programmes ahead of committing to delivery.
- Existing theory can be used to frame and model complex behaviours.
- Participatory systems mapping can be used to understand the causal drivers of a policy issue and to identify ideal points of intervention.
- Theory of Change workshops can be used to ensure there is a good understanding of how a policy or programme will work, and to understand risks to delivery and results.
- Participatory workshops and research can be used to clarify the perspectives and values of the various stakeholders who might be key to policy success.
- Social research and behavioural insights can be used to gain a deep understanding of what activities might trigger and sustain change over time.
- Evidence maps can be produced to illustrate the knowns and unknowns of a problem and highlight where more evidence is needed.
Taken together, this body of evidence and insight can help forge a more realistic route to intended results. My expertise comes from a deep understanding of designing and scoping evaluations in the early stages of policy making. Whilst in the Central Evaluation team at the Department of Energy and Climate Change (2012-2016) I developed a range of tools to underpin evaluative thinking, and more recently I worked with CECAN and Technopolis to develop the Defra Theory of Change toolkit. This is now used extensively by Defra in the early stages of policy and evaluation design.
Supporting evaluation commissioners and research and evaluation firms to design, commission, manage and deliver evaluations
I work with both clients and evaluators in the early stages of evaluation.
- Client-side, I support commissioners to scope and clarify evaluation requirements, to set up effective evaluation systems and to manage evaluators. For example, over 2022-25 I have worked extensively with the Youth Endowment Fund to help design and manage a range of evaluations (MOJ's Turnaround Programme, Neighbourhood Fund, Your Choice Evaluation, Deferred Prosecutions) as well as to review and optimise its internal evaluation systems.
- I also work with evaluation agencies and consultancies, advising on the design and delivery of more complex evaluations. Since 2019 I have worked extensively with ICF Ltd on its portfolio of evaluations of Farming and Nature programmes for Defra and its agencies, and with Technopolis on the design and delivery of evaluations for DESNZ and MHCLG (e.g. the ongoing Freeports Programme Evaluation).
- I also provide both government departments and evaluation teams with ongoing advice, support, peer review and QA to optimise their evaluations. Where appropriate I get involved in the detailed delivery of more complex evaluations.
Helping research and evaluation consultants to navigate complexity in their evaluations and ensure government receives valuable and timely evidence.
I help make sure more complex evaluations remain deliverable and produce actionable learning and evidence.
Examples of how evaluations of complex programmes might need to be tailored (in no particular order):
- Complex programmes tend to work towards multiple outcomes - determining focal areas or causal hotspots is a way of managing the scope during an evaluation.
- The likelihood of emergence and change in what is being delivered or implemented needs to be understood, and evaluators need to anticipate policy or programme evolution. Implementers have to overcome many barriers and obstacles, over multiple 'stages', to achieve their goals - evaluation can play a crucial role in facilitating this and in generating evidence for national and local actors.
- In most cases policy is innovative. Initial theories of change are likely to be 'best guesses' - they are not 'testable' as such but need to be refined over time as better knowledge about actions and cause and effect is established.
- Success may depend on the behaviours of multiple stakeholders at different levels, and there may be many stakeholders who play a part in the change and need to be engaged. Evaluation helps to ensure that stakeholders' values, stakes and roles are understood, and that their experiences of and responses to policies are captured and fed into an overall analysis.
- In complex settings, programme evaluators and policy makers should start from the assumption that policies and programmes generally contribute to change rather than 'cause' it - evaluation needs to be tailored to distinguish between elements of a programme that have a direct and attributable effect and those that are expected only to contribute.
- Evaluators need to think in terms both of the big picture (the system - its actors, actions and relationships) and of the small picture (particular nodes or relationships within the system that might be in focus for an evaluation), and be able to clarify what is working within the wider system so that policy systems can be optimised.
- Actively monitoring and evaluating programmes' delivery and outcomes over time is critical in these contexts and, when guided to do so, can provide regular, timely, actionable learning and evidence that can be used to strengthen programmes and policies.
- Data about inputs, activity, performance and outcomes may be needed from multiple stakeholders; data access, sharing and validation will require mapping and upfront agreement.
- Systems for analysing, aggregating and synthesising evidence from mixed methods and data sources need to be clarified and delivered.
- A transdisciplinary approach might be needed within the evaluation team to enable mutual understanding and effective communication across different analytical or sectoral areas.
- Evaluation learning and reporting systems need to be designed and actively implemented, minimising burden and maximising value through careful tailoring.
- As programmes are delivered, evaluation frameworks need to be flexible, and reviewed and updated regularly to ensure they continue to be fit for purpose.