Adaptive management requires learning as you go. Decisions need to be taken to support course-correction – and they need to be based on evidence.
We often think about monitoring for adaptive management, making sure we are aware of the changes in a particular metric so that we can base decisions on whether something has improved, worsened, or stayed the same.
But monitoring only focuses on whether something has changed, and doesn’t get to the deeper understanding of why and how this change has happened. This is the role of evaluation. In this respect, evaluation can play an important supplementary role to monitoring and professional intuition in informing adaptations to programme design and implementation.
But how can evaluation feed into decisions that need to be made in a few months’ time?
Using evaluation in this way requires a different way of working. Evaluations that support adaptive management are undertaken in a much shorter time frame. Some recent evaluations we have conducted have been done and dusted within two months.
So, what does this mean for both commissioners and implementers of evaluations?
In many ways, rapid evaluations don’t require evaluators to do anything new; they just require that we follow the good practices we already know are central to delivering high-quality, useful evaluations. But it’s always good to have a reminder, and even more so when time is tight and the pressure is on.
Here are our top five takeaways from conducting evaluations on the fly. These are aimed mostly at evaluation implementers but are also relevant for commissioners given their role in shaping evaluation design and implementation.
1. Be clear on scope. Evaluation management – like any other project management – must consider four central dimensions: timing, quality, budget and scope.
Timing: In a rapid evaluation, the timing is often set – we want the findings yesterday. Or in two months, perhaps.
Quality: This is a given. There’s no point going ahead with an evaluation unless it can deliver high-quality findings and conclusions.
Budget: There may be some wriggle room here, but let’s be honest, it won’t be much. This is especially true given that these types of evaluations are likely to have been designed, commissioned, and completed well within a single financial year.
Scope: This is where there is most room for flex – both in terms of the age-old ‘scope creep’, but also for setting boundaries. This is the space in which expectations must be calibrated. To deliver a high-quality evaluation within a short timeframe and with a limited budget, there needs to be a trade-off, and it usually comes down to this: breadth or depth. Does the evaluation team cover lots of issues at a fairly superficial level, or go deep on a limited number? They’re unlikely to be able to do both in these circumstances, so it’s important to get the client to articulate what will be most useful to them.
2. Make sure your client is committed to engaging in the evaluation process or don’t bother. In any evaluation, utility is maximised where the client is engaged, and in a rapid evaluation this is of heightened importance. The client needs the insights for a particular purpose, and there is extra pressure to deliver.
Clients need to make the time to set the scope, make themselves available to hear emerging findings, provide feedback and help shape recommendations. In a rapid evaluation, this engagement needs to happen within a very compressed timeframe, so it’s important that key stakeholders in the client organisation are aware and able to provide this level of involvement.
Active engagement is also a risk mitigation strategy. Rapid evaluations don’t have the benefit of time, so a small mistake can be amplified very quickly, and there are few opportunities for course-correction when things are moving fast. Regular engagement from the client is essential.
3. Don’t skip inception. There can sometimes be pressure to just get on with the evaluation and skip the inception phase. Don’t do this. It’s your only chance to really ensure everyone is on the same page. Even if inception lasts just a week or two, it is a vital step for setting the scene for the evaluation, as well as signing off on that all-important question of scope.
4. Remain flexible and don’t over-design the evaluation. As with any evaluation, you need to have a clear way forward, but in a rapid evaluation it is highly likely that the commissioner hasn’t had a chance to clearly think through the scope of work. They are also working to a rapid timeline, remember. Expect that as things evolve, priorities will change, and you will need to adapt. This means keeping the evaluation design light and sufficiently flexible for new questions to be added, or lines of enquiry to be pursued*.
(You may be thinking: doesn’t this contradict the earlier point on the importance of pinning down scope? How can you remain flexible under a fixed scope? The point we want to reinforce on flexibility is on the implementation side. Be flexible in how you will conduct and present the work, but stay laser-focused on the questions you’re answering.)
5. Keep the team small and ensure good availability. Coordinating between team members takes time, and that’s something you just don’t have when conducting a rapid evaluation. Handovers between team members covering for one another are another potential time-suck, but can be avoided by keeping the same people on board for the duration of the evaluation. It might even be helpful to appoint a small ‘core team’ to manage the exchange between the client and the wider team, and to keep roles and responsibilities crystal clear.
So, in sum, rapid evaluations are perfectly feasible. If you have a small team with good availability and keep your scope locked down, you might even enjoy them. We certainly have!
*But mind the scope creep! There might be trade-offs in delivering x instead of y. It’s your job to anticipate these and communicate them to the client so they can make an informed decision.
What’s your experience of conducting rapid evaluations? Comment below – we’d love to hear your thoughts!
Contributors: Rob Lloyd; Stefanie Wallach; Mary Lagaay