
Itad’s Realist Evaluation week – Three Lessons from the BRACED session

Dave Wilson reflects on three key lessons the BRACED evaluation team learnt with Gill Westhorp during Itad's Realist Evaluation week.

20/05/2016

One of the evaluation questions Itad is seeking to answer for DFID’s BRACED programme is: how and why have different ‘packages’ of activities strengthened resilience in particular contexts?

The evaluation team are applying a Realist Evaluation (RE) approach to answer this question, but face a particular challenge: they are not responsible for gathering the data themselves. Instead, they support implementing partners (15 INGO-led consortia) to collect it. Not all of the Monitoring and Evaluation (M&E) leads in these organisations are trained in RE (for a more comprehensive overview of RE, see the recent CDI practice paper), so to ensure data is gathered in a way that is consistent with RE and aligned across projects to enable synthesis, the Itad team supported them in creating evaluation matrices and terms of reference. Meeting with Gill Westhorp in April 2016, we were able to explore some of the challenges the team have been facing and identify some key learning points:

Realist evaluation can help to deal with complexity

RE should lend itself well to a complex programme like BRACED. Not only is the programme complicated in the way it has been set up, but resilience as a concept is rooted in complexity. Many of the 15 projects, managed by over 100 different organisations, have designed their interventions as ‘packages’ which interact with one another to strengthen the resilience of participants. Resilience may therefore be considered an emergent property, often unobservable, arising from the interactions between the individual interventions, how people engage with and react to them, and what changes as a result. RE should be able to handle this complexity through its focus on understanding mechanisms and contexts, and its requirement for iteration – checking and re-checking understanding and exploring how and why observable changes are happening.

Iteration is king, rediscover the toddler in you

The fundamental analytical unit in realist evaluation is the Context-Mechanism-Outcome Configuration (CMOC). A CMOC can be presented as a narrative statement about how the context in which an intervention is offered ‘fires’ a mechanism (such as a change in preference or behaviour) which in turn leads to an outcome. CMOCs should change as the evaluation progresses. Based on the initial theories of change, Gill recommends crafting rough-cut, first-iteration CMOCs. These could be sequenced according to a hierarchy of different outcomes, depending on the project. Once the first round of data collection is complete, the CMOCs can be revisited and refined, and the process repeated after the second round.

In addition to the two rounds of secondary data, the BRACED evaluation team also have some capacity and resources for supplementary primary data gathering, to triangulate and further contextualise the secondary data. One useful tool to consider, suggests Gill, is Outcome Harvesting, in which project stakeholders are asked during interviews to identify the key outcomes and to suggest others from outside the project who may be able to corroborate them – a form of purposive, ‘snowball’ sampling. During interviews, the team were encouraged to rediscover the toddler in themselves: engage their naturally inquisitive minds and ask not only what the outcomes are but why they were achieved, repeatedly and persistently… Yes, but why? OK, so why? Right… why? Until patience runs thin or the only answer is ‘Hmmm… just because’.

Presenting findings

Gill emphasised the importance of the presentation format in reporting evidence from realist evaluations. Understanding your audience and their needs is critical here, and in this way RE links to Utilization-Focused Evaluation principles. Different audience groups may be interested in different levels of analysis – policy makers may want ‘nuggets’ of evidence that inform their investment decisions, whereas practitioners will probably be more interested in the details which sit behind these nuggets, in the form of the CMOCs or propositions (narrative statements). Crafting such a report becomes something of an art form, but Gill offered a useful rule of thumb: organise the document around the evaluation questions being answered.

Armed with these useful insights from one of the world’s leading proponents and practitioners of Realist Evaluation, the evaluation team should be well equipped to navigate the complexity within the BRACED programme and offer real insight into which resilience-strengthening interventions work, in which contexts, for whom and why.