In a political climate dominated by stories of ‘fake news’, ‘alternative facts’ and ‘not trusting experts’, the conference presents an opportunity to show how good quality evidence does make a difference. So, I’m really interested in learning from, and challenging, experts on how they use evidence in their analysis and findings to inform public debate and influence those in power and authority.
That’s my expectation… I’ll let you know if it’s met when I come back from the conference! Watch this space!
Evaluations – are they of any use? Reflections after the UKES conference
So, did it meet my expectations? To an extent. There was a strong sense of what makes for a quality evaluation and good quality evidence. This came through from the various presentations, publications and posters on show. But are evaluations of any use beyond delivering a well thought out process and an eye-catching report? There is a debate around this – I still want to know how an evaluation is not only useful but that the evidence and findings will be used.
From the discussions, it was clear that evaluations have the potential to be useful if they are truly wanted by commissioners to contribute to results and learning, if they are strategic and forward-thinking, and if there is clarity about the utility of the evaluation and its target audience. For the evaluators, understanding the context and building a relationship with the client are also critical.
A session on the ‘Politics of utilisation focused evaluation’ raised questions about where the drivers for an evaluation come from, how likely it is to be useful and how it will be used. A presentation on ‘lessons learned from an evaluation of DFID’s What Works to Prevent Violence Programme’ demonstrated the complex relationships an evaluator needs to understand and manage to get the most out of an evaluation. One of the critical factors to get right is building and maintaining a constructive relationship between the client and the evaluator. Understanding needs and expectations at the outset, while being prepared to be flexible and adapt during the evaluation, is seen as equally important.
Looking to the future, Itad’s presence at the UKES conference puts us in a good position to be at the forefront of demonstrating how evaluations can make a significant difference to programming and policy-making. On the day I attended, Itad’s presentation on ‘How accountability trumps learning: Lessons from evaluating the Tilitonse programme’ was well received. And people were still discussing issues around measuring and evaluating adaptive programmes, which was reflected in Itad’s presentation of the day before, ‘Adaptive learning on PERL Nigeria: How do you measure change in changing interventions?‘
My headline takeaway: at the outset of any evaluation you need to ask the question, ‘how will this evaluation make a difference – and how would we know?’