People at the Centre: an evolving approach to participatory evaluation

From KM4Dev Wiki

Note: The content of this section of the site is currently a structured brain dump by Hannah Beardon - with experience in evaluation of rights and empowerment programmes and participatory methods - and Amy Barbor, an experienced participatory video professional. With the aid of the Innovation Fund grant, we have been able to spend a few days thinking through what we know, how to structure it, and writing up some of our thoughts and experiences. We very much hope that this will inspire other members of the community to add to, adapt and challenge it, and to build a useful resource for our community - and, if necessary, for our battles with others in our fields!

Participation in evaluation: whose perspectives count?

Evaluation is increasingly recognised as an essential element of programme learning and quality, as well as traditional accountability. By stepping out of the daily routine, possibly with an external facilitator, staff and participants in a project can reflect on what they have been doing, and where that is leading them - and the project. Not just "are we doing our activities well?" but also "are we getting where we want to go?", "are those the right kinds of activities, with the right kinds of people, to achieve our objectives?", and "what do we know about what works well?".

Evaluation methods and skills have been developing to meet this need, from overall 'downward accountability' structures to ensure that poor and marginalised stakeholders are able to influence and comment on the organisation's work (such as ActionAid's ALPS, developed around 10 years ago), to tools and approaches that can be used within available spaces such as evaluation processes - whether most significant change, participatory numbers, process tracing and critical stories of change, or outcome harvesting. Increasingly, projects and evaluations are thinking about and articulating the underlying theory of change - that is, how they think the kind of change they are working towards happens - to provide a flexible shared framework from which to make decisions and assess progress. This enables measures of success to be related to more specific and immediate outcomes, but always also in relation to the wider goals and direction set out in the theory. It accepts and acknowledges the challenges of complexity - the project does not operate in a vacuum - and allows evaluators to sidestep the issue of attribution, or at least not to focus exclusively on it, and instead to gather evidence which validates or improves the theory to create a stronger basis for future work.

Things are definitely moving in the right direction. And with donors talking about complexity and about evidence-based interventions, there is a real opportunity for these more nuanced, learning-focused processes to be adopted wholesale. It is happening in pockets, but at the same time there is plenty of talk about 'results', which is often interpreted as 'measurable results', and there is a risk that instead of opening up evaluation methods to acknowledge complexity, the sector is moving towards funding interventions which are simple and easy to isolate. Still, many calls for proposals and terms of reference for evaluations prioritise assessment of what has been achieved - and measuring it - rather than how it has been achieved, and the quality of the assumptions, relationships, skills and strategies. There is a culture of 'rigour' which tends to favour information which is reliable, even if not useful. In my own experience, many INGOs reporting to donors knowingly and intentionally deny themselves the opportunity to focus on learning, in order to fulfil what they understand to be the donor's focus on (upwards) accountability.

So there are opportunities, movements, and challenges. I (Hannah) have spent much of my own career carving out little niche spaces where we can actually ask people what they want and what they think - facilitating reflection in trusted spaces, where people can tell you things that might seem obvious, but can really make the difference between an intervention that works and one that doesn't. Then I was given the opportunity to work with IKM Emergent, exploring with other development sector practitioners how knowledge flows within INGOs - how and whether the voices of the poor and marginalised for whom they work are heard, listened to and responded to - and at what levels of the organisation. We saw there the difference between getting someone to validate your theory or speak to your message, and really listening and negotiating. Appropriate techniques, skills and spaces are needed to allow that kind of conversation to take place. The same thing happens with 'evidence'. Are you finding evidence to confirm your theory, or building your theory and learning on the evidence available? Thinking through these issues, and considering that in most cases INGOs now recognise that their role is to contribute to social change processes, we wanted to focus more on how the 'owners' of that social change are included - not only in data collection, but in making sense of that data. We have experimented with bringing in participatory video techniques and approaches, and this is part of our push to improve the methods, but also to change the definition of rigour to include useful, as well as reliable. We hope that this attempt to write up our thinking and experience will be a basis for sharing experiences, moving this on further, and supporting and building advocacy to move 'participatory' out of the niche pockets and into the mainstream.

An evolving approach combining participatory video and evaluation

The approach we are sharing here is based on our own response to this problem. We are not claiming it to be new, just evolving. That is why we want to share it on a platform like this. It focuses on facilitating and supporting different stakeholders to reflect on the project in relation to their own context, experiences and aspirations of change, and to communicate this effectively to the project staff at different levels, to improve relevance, equity and quality. The approach rests on the creation of safe spaces for reflection with groups, good listening and questioning, and facilitation of effective communication between the groups. The evaluator is facilitator and interpreter - sharing information and stories between the groups. The evaluator's analysis is informed by, and as far as possible triangulated through, interaction with the different groups.

We have found that this approach delivers rigorous quantitative and qualitative information and analysis, with the emphasis on ethics and inclusion. It is an evaluation, rather than a case study, including a full desk review and interviews and focus groups with a range of stakeholders. It starts with time to gather a sample of beneficiaries' perspectives on the project. Any suggestions of impact and change can be triangulated through further statistics and interviews. Relevant issues relating to organisational dynamics and management arise when staff and partners have a chance to reflect on, and explain, the description of the project that the work has unearthed. In our experience, this is a rich learning experience for project beneficiaries and staff alike.

This approach forms the skeleton of the evaluation, as represented here. Different tools and exercises can be used to open up and direct discussions and to build skills and awareness for effective communication. Here we share some that we have used, and leave space for others to add their own. We are only covering the 'field trip' part of the evaluation process, although this approach implies the application of participatory principles to the needs assessment and design, and to the development of preliminary findings and write-up. In particular, we find it more appropriate to write the final report in an engaging, narrative style, using much of the material prepared by participants in the process. However, we have also found that, through this approach, the process is as useful a learning experience for those involved as the final report.

What this is and what's on this site

This site, or section of the wiki, is just a start. The Innovation Fund has allowed us to think about what we know, what we have tried to do in response to some of these challenges, and write up some practical examples and tools to start the ball rolling. We hope and expect that others in the community might add to, or challenge, these to build a good resource and community. This will work best with some active promotion and facilitation, which is not officially in place. It is a useful resource for those of us wishing to make the arguments, and we hope that there are others out there, like us, who want to clarify and articulate the arguments, ideas and methods in order to push the sector to consider these types of techniques 'rigorous', and essential for honest, equitable development.

This 'guide' is focused on the direct engagement part of the data collection process, for an evaluation process which puts the voices of the 'people with and for whom we work' (formerly known as 'beneficiaries') at the heart of understanding programme relevance and impact. It builds on experience of facilitating workshops with these stakeholders to understand the context and the programme, identify data to be collected and validated, and then feed that back to programme staff at different levels to inform reflection and further analysis. We have used versions of this approach in several different evaluations, with fairly standard TORs, but always of the traditional programme/project type. We have not yet thought through how it could be adapted to advocacy and campaign networks, for example, so this is an early stage of thinking.

The site is structured around the phases of this direct engagement - after needs assessment and evaluation questions have been developed, desk review done, stakeholders mapped and methodology designed - and before the sharing and write up of findings:

  1. The approach: Setting the ground for meaningful participation in cross-cultural settings. This section covers some of the initial groundwork and setting the atmosphere, developing trust and agreeing expectations. Facilitation is a fundamental aspect of this type of work, to allow people to reflect on and share their perspectives, and collaborate in interpretation and analysis.
  2. Understanding the context: Looking at the project in relation to people's lives and wider social changes. The workshop, or engagement, begins with an exploration of the context, rather than the project itself. This helps the evaluator/ facilitator to understand the cultural, political, social and economic realities which influence the way the project plays out, and encourages the workshop participants to talk about relevant issues (and the project itself) more freely, not limited to what they think is directly relevant to the project.
  3. Talking about the project: Facilitating direct discussion of the project's aims, activities and contribution to change. At this stage, there is a good grounding for participants to describe and share their perspectives on the project chronology, effectiveness, relevance and impact.
  4. Validating through wider data collection: Involving stakeholders in the collection of further data. As with any other evaluation, there will be interviews and primary data collection during the field trip. Some of these will be identified and organised in advance. However, others may be identified during the workshop. Ideally, there is enough time for participants to plan and conduct research within their own communities and environments, based on the discussions at the workshop, whether individual interviews or documentaries/ stories of change.
  5. Meta-analysis and feedback: Sharing the findings and making sense with different project stakeholders. It is useful - and ethical - for the facilitator/ evaluator to feed back to the workshop group what s/he has understood about the project and context. This (and the participants' stories or research) can then be the basis of feedback and food for thought for follow-up meetings with the programme/ NGO staff at different levels.