Renaisi’s research and evaluation team exists to help others generate and use evidence. We are commissioned to provide ‘insight’ and ‘analysis’, to make sense of complexity. For us and others in our profession, there is no greater joy than seeing data and information start to fall into place, and a picture emerge of what an organisation is changing, how, and why it matters. But too often in our sector we neglect to interrogate why generating evidence is valuable in the first place, and what happens to all that information and insight in the medium and longer term.

That is why I wanted to comment on Bethia McNeil’s recent blogpost on this topic, which begins to unpick the assumed links between ‘evidence’, ‘learning’ and ‘improved provision’. As Director of the Centre for Youth Impact, Bethia focuses on organisations supporting young people, but I think her core message applies more widely than that.

There is a pervasive assumption in our sector that improved evidence leads to learning, which in turn leads to improvements in how activities are delivered, and eventually to improved outcomes for people and communities. That is the ideal - if you like, it is the ‘Theory of Change’ of evaluation and impact practice. But the process is not automatic. For evidence to lead to improvements in delivery, we have to take active steps to make that happen, and organisational mechanisms need to be in place to support it. It is a process that can be done badly, or not at all. [1] Too often evaluation is judged on whether the evidence generated is ‘robust’, rather than whether it leads to any useful changes in practice. [2]

So what factors need to be in place to ensure that we can learn from evidence, and apply that learning to improve provision? This question should be the focus of attention and debate amongst people who care about evaluation and evidence across the sector. For evidence to be valuable, it has to be used and applied in practice.

[1] www.renaisi.com/impact-measurement-must-work-better-services-communities/

[2] A similar logic applies to the relationship between evidence and funding: http://insights.renaisi.com/post/102dsj0/more-impact-measurement-does-not-lead-to-more-funding