Blog: Process Evaluation skills-building workshop, Zambia

The Centre for Evaluation is launching a series of skills-building workshops around the world, organised in partnership with our collaborators and held in locations where LSHTM staff are based or regularly visit. In the first instance, the thematic focus is Conducting Process Evaluations. The Centre has developed a short curriculum comprising three PowerPoint presentations, three case studies and related group work exercises – these can be adapted to fit local contexts and interests, and it is expected that local researchers will also present their work and facilitate discussion. We have two sample agendas to help structure 1- or 2-day workshops. Process evaluation theme lead Bernadette Hensen recently delivered a workshop in collaboration with Zambart in Lusaka, Zambia.

The first Centre for Evaluation-supported skills-building workshop on Process Evaluation was held on 6 December 2018 in Lusaka, Zambia. The workshop was run in collaboration with Zambart, a research organisation established in 2004 and an LSHTM collaborating partner. To keep costs to a minimum, the workshop capitalised on my being in Lusaka to work with colleagues on a formative research study.

Process Evaluation Zambia

Participants discussing how PE can be integrated into their work.

The workshop brought together 12 participants from Zambart, the University of Zambia, and other stakeholder and research organisations to build new skills and share experiences of process evaluation. Musonda Simwinga and I ran the workshop, with support from Ginny Bond. In the morning, participants discussed what process evaluation is, why process evaluations are useful and when they can be used. We also discussed evaluation frameworks, including logic models, as useful tools to map intervention pathways and guide process evaluation design.

Zambart has extensive experience running large cluster-randomised trials, including ZAMSTAR (Zambia and South Africa TB Reduction Study) and the HPTN 071 (PopART; Population Effects of Antiretroviral Therapy to Reduce HIV Transmission) trial. In these trials, rich quantitative and qualitative data were collected alongside the data used to evaluate the impact of the interventions on primary outcomes. These data were collected to understand, for example, the effect stigma had on participants' engagement with the intervention, intervention acceptability, and the important relationships lay counsellors established with participants, including the influence these relationships have on introducing innovations such as HIV self-testing. We discussed how much of the research Zambart conducts within trials is implicitly about process, and how making this more explicit would provide a guiding framework for exploring assumptions about how interventions work, as well as valuable information for scaling up successful interventions and for modifying interventions where there is no impact. It would also ensure a more genuinely trans-disciplinary engagement. Two participants had experience in realist evaluation: Chama Mulubwa, manager for a series of HIV self-testing case studies called STAR and a research degree student at Umeå University, and Dr Joseph Zulu, Assistant Dean at the School of Public Health, University of Zambia. The participants discussed the overlap between realist and process evaluation, and where the two approaches are distinct.

After discussing process evaluation more generally, we turned to indicators and tools. In this discussion, process evaluation was seen as heavily quantitative, particularly when measuring intervention implementation and reach. We discussed where along a logic model qualitative concepts and data collection tools can complement quantitative data collection, and how. In this session, Mwelwa Phiri, a prospective LSHTM research degree student, presented her ideas for a process evaluation to be embedded within a trial of sexual and reproductive health services for adolescents and young people in Lusaka. The workshop ended with a discussion of when and how to analyse data arising from a process evaluation, and how process evaluation could be embedded more explicitly in our work.

Although I have worked with Zambart since 2009, this was the first workshop I have organised with them. It was a great opportunity to discuss thinking related to process evaluation and to debate how it aligns with work already ongoing at Zambart, as well as with related concepts such as Monitoring & Evaluation and Realist Evaluation, and whether qualitative data can be considered "routine". The workshop was stimulating and engaging, albeit a bit rushed given that it was held over a single day. We also wished that more colleagues from the School of Public Health at the University of Zambia could have attended. We are hopeful that the workshop will lead to a similar seminar or workshop being held at the University of Zambia, and that this will expand to include a workshop on quantifying impact using quasi-experimental designs. We also hope to carry out a process evaluation together through a joint proposal and grant.

Similar workshops are being planned for Ethiopia, South Africa, Zimbabwe and beyond. These are designed to be low-cost, taking advantage of existing LSHTM presence and travel, and sharing logistical costs with local partners. If you are interested in organising a Process Evaluation workshop (or designing a different skills-building event), please get in touch (evaluation@lshtm.ac.uk)!
