
The road to impact: how do we ensure that evidence is both useful and used?

How do we use evidence to influence? How can we broker a dialogue with new audiences and create spaces where we can learn from each other?

19/06/2017

At the March 2017 convening of the Resilience Measurement Community of Practice in Uganda, two questions resonated as common challenges across diverse communities. How do we use evidence to influence? How can we broker a dialogue with new audiences and create spaces where we can learn from each other? The discussions centred both on reaching wider audiences to increase uptake of findings and on bringing learning from other fields and disciplines into the resilience measurement community. This isn’t the first time I’ve come across these questions, and for me, they centre on two processes: knowledge exchange and increasing impact.

For too long, valuable evidence-based findings have sat on shelves or been hidden behind paywalls. In my former knowledge exchange role within academia, I was involved in the research impact agenda, where there is a similar move towards planning for the learning and uptake of findings beyond the field of research, known as ‘pathways to impact’. It strikes me that both communities – and probably many more besides – want to ensure that their findings inform policy and practice, but are grappling with how to engage a wider audience so that findings are useful, used and, ultimately, of benefit to the economy and society.

There is no silver bullet to support learning and uptake processes – no single approach ‘works well’ in all situations, not least because programmes have different aims and unique contexts. We work with many stakeholders, from international to community level, across diverse cultures, with variable resources and different starting points. Despite increasing awareness that evidence needs to be widely shared and embedded to be useful, the focus tends to fall on improving communications and pushing out products. This is unsurprising, given that many evaluations and research outputs are produced at the end of a project or programme and compile the knowledge developed in a summative way. While this is often too late to inform the development programme itself and enable course-correction, it can show whether the programme was effective in contributing to its goals. Theory-based evaluations, in particular, can go further, helping to explain what worked well, what didn’t, how, why, and for whom. This is useful evidence that can be used to generate new knowledge, build on what already exists and inform the design of future programmes.

In recognition that much learning comes too late to course-correct within the lifetime of a programme, the development field is moving increasingly towards adaptive programming. As we learn more about how to ‘do’ adaptive programming, we are starting to uncover some of the challenges of feeding in learning in real time to inform decision-making and (hopefully) increase impact. For example, the qualitative learning-based monitoring systems used in BRACED facilitate valuable internal learning, but require more complex and heavier data collection and reporting, placing a burden on scarce resources. The time needed to properly analyse and synthesise more complex, context-specific evidence can also limit real-time learning. Recognising these limitations informs adaptive responses: optimising data collection and improving sequencing to facilitate better feedback processes. Testing approaches in different contexts will reveal new challenges, as well as opportunities to learn and to develop solutions that support better adaptive programming.

Feeding in learning has the potential to do more than inform adaptive programming. Learning cycles can also offer opportunities to increase stakeholder participation, learn together and foster shared ownership. Inputs throughout the programme lifecycle can lead to outcomes that are more locally meaningful, beneficial and sustainable. To do this well, we need to become more than gatherers and disseminators of knowledge. Sharing knowledge is a two-way process, and we need to become knowledge brokers, boundary spanners and intermediaries. We need to create spaces and cultivate practices that are sensitive to context and culture, in which we can share our knowledge and learn from others. Representing diverse groups and different levels of governance is important to ensure that the outcomes reflect the interests of many and that those involved are accountable to each other.

Early and continued engagement and collaboration with a variety of stakeholders throughout the lifetime of a programme can help to share emerging insights while building relationships and trust. This process takes time and money – often more than the resource and capacity constraints of programmes allow. In my experience, though, it can pay off when built into the budget: genuinely ‘doing the job together’, reflecting on progress throughout the programme lifecycle, and course-correcting in response to stakeholder needs and evolving priorities and interests. I have found that this process fosters shared ownership and increases the impact of findings within the local context. In this way, early and continued engagement of stakeholders in learning processes can help to ensure that evidence is useful and used, and can help to create demand for evidence.

This is not an easy journey; it often involves difficult negotiations, conflicting interests and trade-offs. You won’t always get it right: priorities shift and understanding of the context develops, providing plenty of opportunities to ‘learn from failure’. Ultimately, though, if the engagement and commitment to share and learn together are genuine and inclusive, participation in adaptive decision-making can improve the sustainability of development outcomes for the local community. Embedding learning processes can empower a wide range of stakeholders to apply the lessons learned and to use evidence to influence. It’s not a silver bullet, but it’s a pretty good start.