The story behind the numbers

‘Making’, analysing & using qualitative evidence

Christine Fenning

On 11 January 2019 Mokoro was delighted to host a seminar on qualitative evidence-gathering methods. Following our previous seminar on collecting evidence for development, this seminar looked further into some of the qualitative approaches available and their use by, and usefulness for, policy makers and practitioners.

Our panel of four was chaired by Catherine Dom (Mokoro Principal Consultant). The speakers explored the value of qualitative/alternative evidence, discussed various methods of gathering it, spoke about the issues around synthesising diverse data, and considered how quantitative and qualitative approaches to research can be used together, drawing on concrete examples from Plan’s Real Choices, Real Lives study, from HelpAge International’s Moldova programme, and from the UK Foreign and Commonwealth Office in North Africa.

Lilli Loveday from Plan International UK introduced the Real Choices, Real Lives project, a longitudinal study which follows 142 girls living in poverty in nine countries (Brazil, El Salvador, Dominican Republic, Togo, Benin, Uganda, Cambodia, The Philippines, Vietnam) across three continents for the first 18 years of their lives. All the girls were born in 2006, and Plan International produced the first reports in 2007. The study explores the detailed experiences of the girls and their families, while also looking at the importance of the girls’ broader social networks. Plan’s research aims to put a human face on the available statistics, theories and academic discussions, and to enable a better understanding of the lives of these girls. The evidence generated is used to support Plan’s policy and advocacy work and to inform the organisation’s future programming.

Lilli spoke about the opportunities and the challenges a longitudinal study like this presents. Real Choices, Real Lives is an incredible asset for evidence generation and reflects a deep commitment to hearing the voices of girls. The evidence generated contributes to the understanding of approaches that might work in development interventions. The study generates rich and nuanced data and an understanding of social norms by taking an in-depth, focused look at the girls’ experiences in a broad range of areas over a number of years (12 years at this point).

However, there are some challenges: a) the numbers are relatively small, so the study does not necessarily carry enough weight with policy makers; b) the scope of the study, geographically and in terms of its broad themes, makes it more difficult to extract comparative evidence across all countries and regions; and c) the ever-changing priorities in the aid/development sector do not necessarily align with long-term projects like this, which can make the research difficult to fund and justify if an immediate need is not obvious.

Lilli concluded her presentation by outlining the way forward. Plan International is investigating the best way to use the remaining six years of the study, for example, by looking into case study approaches or by doing more thematic studies.

Bev Jones spoke about how qualitative Monitoring, Evaluation and Learning (MEL) approaches can help reconstruct longitudinal stories of change. Bev drew on examples from HelpAge International’s Moldova programme and from work by the UK Foreign and Commonwealth Office in North Africa using adapted contribution analysis. Essentially, the question Bev discussed was how to tell a story about change when there is no MEL system.

HelpAge International works with older women and men in low- and middle-income countries for better services and policies, and for changes in the behaviours and attitudes of individuals and societies towards old age. For its Moldova programme, HelpAge International wanted to know what difference its core funding was making, so Bev and her colleagues tested assumptions through discussions, interviews with other organisations, and conversations at community and household levels. They then worked with the HelpAge Moldova team to reconstruct the organisation’s causal pathway for the intervention over the previous 13 years and to reach a consensus on what the team considered to have been the most important changes.

Focussed interviews and field visits were conducted to test the evidence base for each step in the causal pathway, and additional or alternative factors contributing to the change were considered. Together with the HelpAge Moldova team, the research team then went on to identify some of the strategic issues and lessons for the future. These included regional strengthening; business models to sustain and increase the impact of ageing-related services; how to respond to the government’s request for co-delivery of services, and how this might affect HelpAge International Moldova’s identity; and which models would work best.

Bev presented a second, current example in which she and colleagues are applying the same approach in a very different setting: work for the Foreign and Commonwealth Office in North Africa to find out what the UK’s contribution has been, for example to anti-radicalisation efforts in Morocco. The aim is to produce approximately ten case studies there. Bev emphasised the importance and usefulness of qualitative approaches: they generate rich evidence not available through quantitative approaches. Bev concluded her presentation by noting that approaches such as contribution analysis should constantly be adapted to meet the needs of research and evaluation teams.

James Copestake from Bath University spoke about the use of qualitative evidence through the experience of testing the QuIP (Qualitative Impact Assessment Protocol). The purpose of the QuIP is a) to integrate quantitative and qualitative approaches to impact evaluation, and b) to extend the role of impact evaluation as a deliberative response to complexity. James briefly set out the history of the QuIP, with the design and piloting phase having taken place between 2012 and 2015. Eight pilot studies were conducted in Ethiopia and Malawi in a collaboration between Farm Africa, Self Help Africa, Evidence for Development and universities in Malawi, Ethiopia and the UK. Commercial testing then took place between 2016 and 2018, with Bath Social and Development Research Ltd being set up as a social enterprise to deliver more QuIPs; 25 QuIP evaluations have been commissioned in 14 countries so far, covering a wide range of subject areas, including child nutrition, climate change adaptation, housing improvement, microfinance, and rural livelihoods.

How does a QuIP work? Essentially, it provides a flexible standard for qualitative social research into causal mechanisms of change, with the organisation commissioning each QuIP involved in adapting it to the needs of the particular case. Interviews and focus groups are used to obtain information about changes seen in relation to selected outcomes of an intervention, and this information is then linked to various change mechanisms to try to establish causal links. The methodology relies on self-reported attribution. Case/sample selection is mostly done from monitoring data which shows that certain changes are taking place but not why they have happened. Data is collected by independent, local and so-called blindfolded[1] field researchers to reduce confirmation bias. Every study is an action research project which starts with the outputs of an intervention – the changes that people experience in their lives – and asks why they have taken place. The researchers work back from the outputs using a semi-structured questionnaire. The field notes are written up as text in spreadsheets and are coded. The data is then analysed using frequency counts, dashboards, tables and charts to inform interactive thematic analysis of the causal claims embedded in the text; the process essentially combines qualitative and quantitative methods. This allows for flexible integration with wider processes of evaluation, sense-making and deliberation.
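To make the coding-and-counting step more concrete, here is a minimal sketch in Python of how coded causal claims might be tabulated into frequency counts. Everything in it – the respondent IDs, drivers, outcomes and attribution codes – is a hypothetical illustration of the general idea, not the actual QuIP codebook or the software Bath SDR uses.

```python
# Illustrative sketch of QuIP-style tabulation of coded causal claims.
# All data, codes and labels below are hypothetical examples.
from collections import Counter

# Each row is one causal claim coded from an interview transcript:
# (respondent_id, driver_of_change, outcome, attribution)
# "explicit" = respondent named the intervention as the cause;
# "implicit" = the mechanism matches the theory of change but the
# intervention was not named (field researchers were blindfolded).
coded_claims = [
    ("HH01", "new irrigation", "higher crop yield", "implicit"),
    ("HH01", "higher crop yield", "more income", "implicit"),
    ("HH02", "drought", "lower crop yield", "other"),
    ("HH03", "new irrigation", "higher crop yield", "explicit"),
    ("HH03", "more income", "children in school", "implicit"),
]

# Frequency count of each driver -> outcome link: the raw material
# for the dashboards, tables and charts that feed thematic analysis.
link_counts = Counter((driver, outcome) for _, driver, outcome, _ in coded_claims)
for (driver, outcome), n in link_counts.most_common():
    print(f"{driver} -> {outcome}: {n} claim(s)")

# Breadth as well as volume: how many respondents mention each link.
respondents_per_link = {}
for rid, driver, outcome, _ in coded_claims:
    respondents_per_link.setdefault((driver, outcome), set()).add(rid)
for (driver, outcome), rids in respondents_per_link.items():
    print(f"{driver} -> {outcome}: mentioned by {len(rids)} respondent(s)")

# Split by attribution, to see how much reported change is
# self-attributed to the intervention versus other factors.
print(Counter(attr for *_, attr in coded_claims))
```

In a real QuIP the coded spreadsheet would be far larger and the counts would feed interactive dashboards rather than print statements, but the principle is the same: qualitative claims are coded first, and only then counted.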

The QuIP research team are now trying to generalise from the individual case studies and learn lessons from the process. The aspirations for the QuIP are to deepen the dialogue over qualitative/quantitative integration in evaluation and research, to reframe impact evaluation as part of local/national civil society deliberation, and to explore integrated approaches to impact evaluation of multi-component interventions in complex contexts.

The final presentation was by Gary Goertz from the University of Notre Dame, who spoke about a research triad of multi-method (mixed-methods) research, causal mechanisms and case studies – an integrated approach, responding to a clear demand to move away from strictly randomised controlled trials towards approaches involving both qualitative and quantitative methods.

A simplified version of the research triad asks the following three questions about an intervention: 1) How does it work? (causal mechanism or theory of change); 2) Does it work? (internal validity); and 3) How generalisable is it? (external validity). In brief, researchers have to know how an intervention is expected to work in individual cases, they have to challenge people’s attributions, and they then need to ask how generalisable the intervention/programme is and think about testing it in multiple settings. Gary said that this was too complex for quantitative methods and could only be done through qualitative research.

The reasoning behind the research triad is that the inputs and activities of an intervention are known and the outputs/outcomes can be identified, but there is a gap in understanding the causal links between them, so case studies are often done to fill that gap. An example concerning the mentoring of women entrepreneurs in Kenya illustrated the value of integrating qualitative and quantitative aspects in research and stressed the importance of consulting beneficiaries of an intervention and asking them directly why they think it worked (or did not work): if it had not been for the intervention, what would have happened? Gary said that it was interesting to consider how qualitative studies provide causal inferences, and accessible and plausible evidence, for policy makers.

It was fascinating to listen to our speakers, who approached the subject from different angles, and to see how it all connected around the question of how case studies can be used to provide evidence of what works, either in a given context or more generally. The presentations provided much food for thought and stimulating discussion among our 50+ participants, who came from a range of policy and practice perspectives.

You can access video recordings and PowerPoint slides of the four presentations here.


[1] Researchers are not aware of what they are evaluating until after the interviews have been conducted.
