This article is by Ashley Kuchanny, IDS alumni ambassador for the UK, and was originally posted on the IDS Alumni Blog as “Remembering whose reality counts in IDS’ Impact Evaluation Short course”.
Comfort and excitement on returning to IDS
The familiar hallways, familiar faces and familiar library combined to make me feel like I was coming back home. My time studying at IDS four years ago was a period of immense excitement for me. My understanding of development was ripped apart and put back together again. Every week brought new challenges to my experiences and preconceptions, and as I walked back into the building four years later for a week-long Impact Evaluation Design course, I knew the same was about to happen again.
The five-day course was led by Edoardo Masset, Robert Chambers and Dee Jupp. In the first half of the week we looked at quantitative methods; in the second, at qualitative ones. Having spent some of the most fascinating days of my master’s degree at IDS crawling around on the floor with multi-coloured seeds, discussing participatory methods, I found much of this familiar territory. But it was not familiar to everyone. Asking the beneficiaries, or participants, was sometimes perceived as the ‘poor relative’ of ‘robust’ and ‘scientific’ approaches.
The last day of the course allayed any fears about this by addressing the robustness of qualitative approaches directly. By documenting the process thoroughly and triangulating across different methods so that findings support one another, we can have equal confidence in this kind of research.
Challenging my preconceptions
Since leaving IDS, I have worked for both BRAC and Children on the Edge. Several times I have encountered M&E consultants who advised a sample size of between five and ten per cent of the beneficiary population, and I had come to assume that this was best practice. The course quickly showed me that this is a dangerous way of setting sample sizes: because it ignores statistical power and the size of the effect you hope to detect, it can lead either to missing an impact that is really there or to finding an impact where there is none.
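To illustrate why a flat percentage is the wrong starting point, here is a minimal sketch of a conventional power calculation for comparing two group means. It is not taken from the course materials, and the significance level (0.05), power target (80 per cent) and effect sizes below are illustrative assumptions only:

```python
import math
from statistics import NormalDist

def sample_size_per_arm(effect_size, alpha=0.05, power=0.8):
    """Approximate sample size per arm for a two-sided comparison of two
    group means, where effect_size is the standardised difference (Cohen's d)."""
    z = NormalDist().inv_cdf
    z_alpha = z(1 - alpha / 2)   # critical value for the chosen significance level
    z_beta = z(power)            # value needed to reach the desired power
    return math.ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

# The required number depends on the effect you want to detect,
# not on what percentage of the beneficiary population it represents.
for d in (0.2, 0.5, 0.8):
    print(f"standardised effect {d}: ~{sample_size_per_arm(d)} people per arm")
```

For a ‘small’ standardised effect of 0.2, this gives roughly 390 people in each of the treatment and control groups, whether the programme reaches a thousand households or a hundred thousand; a five-to-ten per cent sample can therefore be far too small in one setting and far larger than needed in another.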
The power of participatory approaches has always amazed and excited me, and I have thoroughly enjoyed using and experimenting with different approaches I learned at IDS. Last year I facilitated a participatory mapping exercise in a Ugandan slum with severe issues of alcohol abuse, child exploitation and child sacrifice. It led to fascinating insights and to new understandings for many parents about how to protect and keep their children safe.
Disseminating the findings back to the participants was key. Parents gained a new understanding of their children’s behaviour and local partners told us how they would use the information: “we have learnt that children have a great capacity to be change makers if provided with opportunities to feel empowered.”
This process was an important part of a chain of events in the slum that has driven local people to form Community Child Protection Committees, which have since transformed the community.
Bringing it back to work
Children on the Edge is currently in the design phase of an exciting new education project for Musahar children (the lowest of the Dalits) in Bihar State, India. I can’t wait to return to my team and start exploring new avenues for including children in the evaluation of the project. One new method I am keen to explore further is photo elicitation. My hope is that by giving children disposable cameras, asking them to take pictures that represent their lives, and then discussing those pictures with the children, we can gain a deeper understanding of what is important to them.
I learned as much from my course colleagues as I did from the front of the room, since participants came from a wide spectrum of organisations. Their combined knowledge and expertise brought enormous diversity and helped contextualise the learning.
I am challenged to ensure that our quantitative work is sufficiently rigorous and that our control groups are secure. Children on the Edge often works in extreme environments such as makeshift refugee camps or informal slums of displaced people. It was also important to realise that it is better not to run randomised controlled trials (RCTs) at all than to run them without sufficient sample sizes and robust control groups. While rigorous RCTs are not possible with such transient and mobile communities, there is a myriad of other options that could be used to understand whether these are the right projects for these groups, and what change, if any, can be attributed to our interventions.
Exploring participatory methods with practical experience behind me meant I saw the sessions from a very different perspective than I did as a student. I am now less idealistic and very aware of constraints, but newly encouraged to continue seeking and using methods that are inclusive and empower the participants, and reminded to challenge my preconceptions of whose reality counts.