Bulletin of the World Health Organization

Evaluating large-scale health programmes at a district level in resource-limited countries

Theodore Svoronos & Kedar S Mate

Volume 89, Number 11, November 2011, 831–837

Table 1. Overview of context-specific evaluation designs

Evaluation design: Key components

Alternative randomized controlled trials22,26: Randomized design with flexible protocols to allow for variation, real-world complications and greater external validity.

Realist evaluation27–29: Approach designed to understand the interaction between the intervention in question and the context in which it is introduced. Theory building and case study methods are emphasized, though realist evaluation does not rely on a fixed methodology.

Evaluation platform design7: Aims to assess effectiveness at scale and the contribution of a large-scale programme towards achieving broad health goals. Takes the district as the primary unit of analysis and relies on continuous monitoring of multiple levels of indicators.

Process evaluation16,30,31: Assesses the actual implementation of a programme by describing the process of implementation and assessing fidelity to programme design. Relies on various tools to map processes, including logic models and programme impact pathways.

Multiple case study design32–34: Applies case-study methodology to several subjects with the goal of understanding the complexities of a programme from multiple perspectives. Information is gathered through direct (e.g. interviews and observations) and indirect (e.g. documentation and archival records) means.

Interrupted time series design35,36: Uses multiple data points over time, both before and after an intervention, to understand whether an intervention's effect is significantly different from existing secular trends.
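The interrupted time series design in the last row is commonly implemented as a segmented regression: the outcome is modelled with a secular trend, a level change at the intervention point, and a slope change thereafter. A minimal sketch, using simulated monthly data with entirely hypothetical effect sizes (not drawn from the article), might look like this:

```python
import numpy as np

# Hypothetical monthly counts of a district health indicator over 24 months,
# with an intervention introduced at month 12. All values are simulated.
np.random.seed(0)
t = np.arange(24, dtype=float)
intervention = (t >= 12).astype(float)      # level-change indicator
time_since = np.where(t >= 12, t - 12, 0)   # slope-change term

# Simulated outcome: baseline trend + level change + slope change + noise.
y = 50 + 0.5 * t + 8 * intervention + 1.0 * time_since + np.random.normal(0, 1, 24)

# Segmented regression design matrix:
# columns = intercept, secular trend, level change, post-intervention slope change.
X = np.column_stack([np.ones_like(t), t, intervention, time_since])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
b0, b1, b2, b3 = beta

print(f"secular trend={b1:.2f}, level change={b2:.2f}, slope change={b3:.2f}")
```

If the level-change and slope-change coefficients are distinguishable from zero, the intervention's effect departs from the pre-existing secular trend, which is exactly the comparison the design is built to make.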