It is always a delight to read a well-designed, simple, yet insightful and thought-provoking report. 'Benchmarking Foundation Evaluation Practices', recently published by the Center for Effective Philanthropy and the Center for Evaluation Innovation, is certainly that. Not the most accessible topic, perhaps, but a very important one for the social sector: how do major Foundations evaluate their grantmaking programmes, and what do they do with the results?
The report is based on evidence from Canadian and US-based organisations, but contains some sobering findings for the UK sector too. What struck me was an apparent lack of clarity about what Foundation evaluation is for. Why evaluate grants? Is it about improving a Foundation's overarching strategy, or about generating evidence about 'what works' that can be used by other organisations and individuals working in the field? Is it about demonstrating that the impact of a particular fund was worth the investment? Or is it about comparing the social outcomes created by different grantees, to inform which types of organisations should be funded in future?
As an evaluator, understanding the purpose of a Foundation evaluation is crucially important, because otherwise, it is all too easy to set up a brilliantly robust, thorough methodology that answers the 'wrong' question - wrong in the sense that it doesn't help to push our understanding further or to generate insight on what should be done differently. The best Foundation evaluations should have a relevance and applicability well beyond the organisation itself, informing the developing practice of charities and other community organisations, and providing insight that can be used by other grantmakers too.
Easier said than done, of course. But framing the evaluation as an opportunity to research a question of wider significance is a good place to start. In our evaluation of Paul Hamlyn Foundation's Youth Fund, a major element of the evaluation is to research the effectiveness of asset-based approaches to working with young people - an area informed by comparatively little evidence to date, and relevant to any organisation that engages young people. We have recently been appointed as evaluators for Power to Change's Community Business Fund, and again, the purpose is not just to evaluate what the fund has achieved, but to explore how different types of community business models create impact for communities, and to generate evidence about the effectiveness of this approach. Our ambition in both these studies is to produce findings that are useful to a wider audience, and can inform change, development, and more effective action.
I would encourage you to read the Benchmarking Foundation Evaluation Practices report and reflect on its findings. Do share your thoughts: @AliceHMThornton, or find me on LinkedIn.