Blog

Design for Health: the role of data and evidence

In this blog, Emma Mulhern describes her initial impressions of Design for Health, a joint venture between the Bill and Melinda Gates Foundation and USAID’s Center for Innovation and Impact. Discover 'the challenge' and recommendations for 'the response'.

3/08/2018

In Design for Health, I’m really excited about the opportunities presented by using data and evidence to progress towards an end vision: to define ‘a set of principles for how to collect a body of evidence outlining how the data and results of an effective design process can contribute to health outcomes’. The platform has identified five archetypes to describe global health specialists with divergent views on design – the Newbie, the Curious, the Trialist, the Believer and the Pioneer – all of whom Design for Health seeks to influence and support with various tools and resources. The plan? To encourage the appropriate use of design principles and an appreciation of the role that design can play in the public health space.

The challenge

In public health, we love data, randomised controlled trials and quasi-experimental designs. We love best practices and vast bodies of evidence. Data – did I mention data? So, the inclusion of the Newbie as an archetype really resonated with me: ‘Until I see hard numbers and facts about its impact, I cannot even start talking about design.’ Of course, evidence is critical in the public health sector: when it is used correctly, it enables policy makers to make better decisions, learn from past mistakes and challenges, and design and implement better interventions. In this context, how do Newbies get taken on the journey of an emerging methodology in this field?

Those behind Design for Health have recognised this challenge and are endeavouring to meet the Newbies in their comfort zone with data and evidence. Data and evidence are crucial to pushing the vision for Design for Health forward: not only for bringing the Newbie along with that vision, but also for supporting learning between the Pioneers and the Believers to develop best practice and evidence in this area.

The response

Continued investment in evaluation is key to building up this evidence base and developing best practice. In the same way that using Human-Centred Design (HCD) to develop health programming is considered doing business differently, monitoring and evaluating HCD also has to be done differently.

At Itad, over the last couple of years, we have led on evaluating programmes that involve HCD, namely the Hewlett Foundation’s strategy to apply HCD to improve family planning and reproductive health services in Sub-Saharan Africa, and Adolescents 360. We also attended the HCD Exchange in Tanzania in January 2018 to share the lessons we’ve learnt, especially from a methodological perspective, such as what we have found challenging and what has worked. We’re certain that our work on these programmes will contribute to the growing evidence base.

Personally, through my day-to-day work, I’ve become passionate about identifying and applying evaluation methods that are appropriate for design thinking – so much so that I am writing my MSc in Global Health Policy dissertation on this exact topic, looking at lessons from evaluating interventions and programmes that utilise HCD. I’m now speaking with a range of people commissioning, designing and implementing evaluations in this field, and I’m really excited to write up and publish my findings over the coming months. The application of design in public health throws up challenges for more traditional evaluation approaches, and we need to meet these challenges in order to present the data and evidence on the potential of design for better, more innovative public health solutions.