e-MOPs

Ethics & Engagement across the Wellcome Trust Major Overseas Programmes

Resources for Evaluation of Public and Community Engagement

These resources pages are currently under development so that, over time, they provide introductions, summaries and links to a range of evaluation approaches that are useful for assessing public and community engagement.

Evaluation of engagement - addressing complexity

Public and community engagement initiatives take place in settings with multiple stakeholders, contextual factors that may have an unforeseen influence, and dynamic circumstances that may lead to unexpected change.

In evaluation jargon, capturing the ‘contribution’ made by your project in such a complex and fluid context may be more realistic, and more scientifically valid, than looking for ‘attribution’, that is, aiming to prove definitively that your project led to the changes observed.

Either way, useful monitoring and evaluation depends on being clear about the changes that your public engagement initiative is aiming for, so that these changes can be assessed. This, in turn, depends on understandings and assumptions about how change happens: the ‘theory of change’ for the engagement work being undertaken, and how your initiative contributes to these changes.

Making the ‘theory of change’ for your engagement activities explicit means it can be tested against evidence of what actually happened, with the potential to learn and further improve subsequent initiatives.

Some evaluation frameworks are better suited than others to addressing such complexity, and it is mainly these frameworks, and their related evaluation approaches and tools, that are featured in these resources pages.

In addition, there are short descriptions and links to existing guides and toolkits for planning and conducting monitoring and evaluation of public engagement. These guides tend not to address complexity as explicitly as those highlighted in the rest of this resources section, but they do provide introductions to evaluation and some useful tools.

Useful evaluation approaches

Outcome Mapping
Outcome Mapping is a planning, monitoring and evaluation approach that is particularly helpful for evaluating multi-stakeholder projects. It seeks to assess the contribution of a project to changes in the relationships and behaviour of those it comes into direct contact with and has influence over.

See some introductory presentations, short films, case studies and the manual here 

 

Most Significant Change - a story-based approach to assessing impact
Most Significant Change is a qualitative and participatory monitoring and evaluation approach that gathers stories of change from a range of stakeholders, which are then discussed and analysed together to assess the impact of projects. It is particularly useful where there is not yet agreement on which outcomes are the most important, or where they are difficult to predict in advance.

See some introductory presentations, short films, case studies, and the manual and quick start guide here

 

Realist evaluation
Realist evaluation is a method that is useful for evaluating complex social programmes implemented across a variety of contexts. It integrates qualitative and quantitative analysis and seeks to build an understanding of ‘what works for whom in what circumstances’. A simple, illustrative sketch of this kind of question follows the link below.

See some introductory presentations, short films, case studies and books and papers on the approach here
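As a loose, purely illustrative sketch of the quantitative side of ‘what works for whom in what circumstances’, the short Python snippet below tabulates outcome rates for hypothetical context-mechanism groupings. The field names and data are invented rather than drawn from any of the resources above, and a real realist evaluation would pair this kind of tabulation with qualitative evidence about why each mechanism did or did not fire.

```python
from collections import defaultdict

# Hypothetical monitoring records: each dict is one engagement site, coded
# for a context factor, the mechanism thought to be triggered, and whether
# the intended outcome was observed. All values are invented.
records = [
    {"context": "rural", "mechanism": "trust_in_fieldworkers", "outcome": 1},
    {"context": "rural", "mechanism": "trust_in_fieldworkers", "outcome": 1},
    {"context": "urban", "mechanism": "trust_in_fieldworkers", "outcome": 0},
    {"context": "urban", "mechanism": "peer_influence",        "outcome": 1},
    {"context": "rural", "mechanism": "peer_influence",        "outcome": 0},
]

# Group outcomes by context-mechanism pair and report the proportion of sites
# where the outcome occurred: a rough quantitative cut at 'what works for
# whom in what circumstances'.
groups = defaultdict(list)
for r in records:
    groups[(r["context"], r["mechanism"])].append(r["outcome"])

for (context, mechanism), outcomes in sorted(groups.items()):
    rate = sum(outcomes) / len(outcomes)
    print(f"{context:6s} + {mechanism:22s}: outcome in {rate:.0%} of {len(outcomes)} site(s)")
```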

Participatory Statistics for evaluation
Participatory statistics is a set of methods that enable local people to generate statistics for local-level planning, learning and reflection, but which can also be aggregated at wider levels for statistical analysis and the generation of 'representative' evaluation findings. A brief sketch of this kind of aggregation follows the link below.

See some introductory papers, short films, case studies and guides here
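To illustrate what aggregation at wider levels can mean in practice, here is a minimal Python sketch using invented community scorecard counts; the community names and figures are hypothetical and not taken from any guide linked above. Each community can read its own proportion locally, while the same raw counts pool into a single wider estimate.

```python
# Hypothetical community scorecard results: each community reports how many
# households were surveyed by local enumerators and how many said they felt
# well informed about the research. Names and numbers are invented.
scorecards = [
    {"community": "A", "households": 120, "well_informed": 84},
    {"community": "B", "households": 200, "well_informed": 110},
    {"community": "C", "households": 80,  "well_informed": 64},
]

# Local level: each community sees its own proportion for planning and reflection.
for s in scorecards:
    print(f"Community {s['community']}: {s['well_informed'] / s['households']:.0%} well informed")

# Wider level: the same counts aggregate into a single pooled estimate,
# weighted naturally by the number of households each community surveyed.
total_hh = sum(s["households"] for s in scorecards)
total_wi = sum(s["well_informed"] for s in scorecards)
print(f"Pooled estimate across {len(scorecards)} communities: {total_wi / total_hh:.0%} well informed")
```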

Case studies for evaluation
Case studies are often misunderstood, but they can be a valuable and rigorous way to answer 'how' and 'why' questions and to understand projects in their real-life contexts.

See some introductory papers, presentations, guides and examples here

Qualitative Comparative Analysis (QCA)
QCA is a research and data analysis method for comparing complex cases. It was explicitly developed to tackle complexity in social programmes and combines quantitative and qualitative analysis. It uses set theory to understand how causes and conditions combine differently in different cases to explain outcomes of interest. A brief sketch of the 'truth table' at the heart of the method follows the link below.

See some introductory papers, presentations, guides and examples here
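As a minimal sketch of the truth-table step that crisp-set QCA starts from, the Python snippet below codes a handful of hypothetical cases as 1/0 on two invented conditions, groups them by configuration, and reports each configuration's 'consistency' with the outcome. Real QCA would go on to Boolean minimisation of the consistent configurations, usually with dedicated QCA software, and would combine this with qualitative knowledge of the cases.

```python
from collections import defaultdict

# Hypothetical crisp-set data: each case (an engagement project) is coded 1/0
# on two conditions and on the outcome of interest. Condition names and
# codings are invented purely to illustrate the truth-table step of QCA.
cases = [
    {"name": "P1", "community_led": 1, "long_term_funding": 1, "outcome": 1},
    {"name": "P2", "community_led": 1, "long_term_funding": 0, "outcome": 1},
    {"name": "P3", "community_led": 0, "long_term_funding": 1, "outcome": 0},
    {"name": "P4", "community_led": 0, "long_term_funding": 0, "outcome": 0},
    {"name": "P5", "community_led": 1, "long_term_funding": 0, "outcome": 0},
]

# Build the truth table: group cases by their configuration of conditions and
# compute 'consistency', the share of cases with that configuration that
# show the outcome.
conditions = ["community_led", "long_term_funding"]
table = defaultdict(list)
for c in cases:
    config = tuple(c[k] for k in conditions)
    table[config].append(c["outcome"])

for config, outcomes in sorted(table.items(), reverse=True):
    consistency = sum(outcomes) / len(outcomes)
    row = ", ".join(f"{k}={v}" for k, v in zip(conditions, config))
    print(f"{row}: {len(outcomes)} case(s), consistency {consistency:.2f}")
```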

Guides to evaluation of public engagement

A number of UK-focused guides to evaluating public engagement tend to concentrate on the evaluation of events and research dissemination rather than more concerted community engagement, but they contain clear introductions to monitoring and evaluation and some useful approaches, tools and templates. An annotated list and links to the resources can be found here

Evaluation of community engagement resources

A number of guides and toolkits are useful for evaluation of community capacity and engagement, even if none are specifically focused on engagement with research. An annotated list and links can be found here

 

General evaluation resources

A number of useful websites, discussion forums and online resources provide information on evaluation approaches, tools and guidance, much of it from the international development sector, that is relevant and valuable for the evaluation of engagement with research. An annotated list and links can be found here

Insights from complexity theory for evaluation

A range of emerging evaluation approaches, a few of which are introduced above, are more or less explicitly informed by developments in complexity theory.

A range of background papers outline some of the useful contributions of complexity theory and their relevance for evaluation of complex social change. 

Comment by Mary Chambers on November 13, 2014 at 4:26

Does anyone have examples of using a logframe analysis for PE? It still seems to be the main tool for development agencies. I'd be interested to hear comments on strengths and weaknesses. We're teaching about it in a group session tomorrow, so I'd love to see a real-life example if anyone could email it to me.

thanks Mary
