Bulletin of the World Health Organization

Evaluation of the influenza sentinel surveillance system in Madagascar, 2009–2014

Alain Rakotoarisoa a, Laurence Randrianasolo b, Stefano Tempia c, Julia Guillebaud d, Norosoa Razanajatovo d, Lea Randriamampionona a, Patrice Piola b, Ariane Halm e & Jean-Michel Heraud d

a. Direction de la Veille Sanitaire et de la Surveillance Epidémiologique, Ministry of Public Health, Antananarivo, Madagascar.
b. Epidemiology Unit, Institut Pasteur de Madagascar, Antananarivo, Madagascar.
c. Influenza Division, Centers for Disease Control and Prevention, Atlanta, United States of America.
d. National Influenza Centre, Virology Unit, Institut Pasteur de Madagascar, Ambatofotsikely, BP 1274, Antananarivo, Madagascar.
e. Epidemiology and Surveillance Unit, Indian Ocean Commission, Ebène, Mauritius.

Correspondence to Jean-Michel Heraud (email: jmheraud@pasteur.mg).

(Submitted: 08 February 2016 – Revised version received: 21 December 2016 – Accepted: 23 December 2016 – Published online: 21 February 2017.)

Bulletin of the World Health Organization 2017;95:375-381. doi: http://dx.doi.org/10.2471/BLT.16.171280

Introduction

The World Health Organization (WHO) recommends that influenza surveillance systems be evaluated periodically and comprehensively, beginning no more than two years after implementation.1 Such evaluations may enable shortfalls to be identified, performance to be improved and data reliability to be assessed. Although several influenza surveillance systems have been established in Africa,2,3 data on their performance are scarce.

Local setting

Madagascar is a low-income country with a health system that faces numerous challenges – including problems in the timely detection of disease outbreaks and the mounting of effective responses to such outbreaks. Although there has been an influenza surveillance system in Madagascar since 1972, in 2007 this system covered only six primary health centres – all located in the capital city of Antananarivo. Between 2002 and 2006, each of the six health centres collected up to five specimens weekly from patients presenting with influenza-like illness (ILI). Staff from the national influenza centre in Antananarivo collected these specimens twice a week. Only one centre reported weekly aggregated data on the numbers of ILI cases recorded among all consultations. The pre-2007 system could monitor influenza activity only in the capital city. Thus, for influenza pandemic preparedness and to satisfy the 2005 International Health Regulations,4 it became important to implement influenza surveillance throughout Madagascar.

Approach

In 2007, in collaboration with the Malagasy Ministry of Public Health, the Institut Pasteur de Madagascar initiated a countrywide system for the prospective syndromic and virological surveillance of fever.3,5 The system was designed to enable the daily collection of data on ILI, the daily reporting of the data to staff at the Institut Pasteur de Madagascar – via a short message service-based system – and the collection of samples to be tested for influenza virus. The main aim of the syndromic surveillance, which was integrated into the routine practice of clinicians at the sentinel sites, was the prompt detection of any unusual influenza-related event, outbreak or seasonal epidemic, especially in areas where laboratory-confirmed diagnoses were difficult to obtain.

To check that the reliable data needed for effective public health interventions were being generated, we evaluated the influenza surveillance component of the fever surveillance system between January 2009 and December 2014. During the study period, influenza surveillance – nested within the fever surveillance – was implemented in 34 public or private health-care facilities spread across Madagascar (available from the corresponding author). Each day, trained staff at each of these sentinel sites were supposed to report, via text messages to the Institut Pasteur de Madagascar, the age-stratified numbers of outpatients who had presented with fever, i.e. a temperature of at least 38 °C (Fig. 1). For each person with fever who gave verbal informed consent, a standardized paper-based case report form should have been used to record demographic characteristics, clinical symptoms and date of illness onset. Case report forms should have been sent to the Institut Pasteur de Madagascar weekly, by express courier. All the data sent were entered into a central electronic database. If incomplete or inconsistent data were detected, queries were sent to the corresponding sentinel sites. Each day, a time-trend analysis of the syndromic surveillance data was performed so that any peaks in ILI incidence – above a pre-established threshold – could be detected rapidly. Clinicians at the sentinel sites identified cases of ILI, among the fever cases, using standard WHO case definitions.6,7 Daily, weekly and monthly reports were generated at the Institut Pasteur de Madagascar and shared with the sentinel sites and other key stakeholders.
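As an illustration of the daily alert step described above, the sketch below flags any day on which the reported ILI count exceeds a threshold derived from recent history. The article does not specify the threshold method actually used; the moving-baseline rule (trailing mean plus two standard deviations), the 28-day window and all function and variable names are assumptions made for illustration only.

# Illustrative sketch only: daily ILI counts are compared against a
# pre-established threshold; the rule below (trailing mean + 2 standard
# deviations over a 28-day window) is an assumed, not documented, choice.
from statistics import mean, stdev
from typing import List


def flag_ili_peaks(daily_ili_counts: List[int],
                   baseline_days: int = 28,
                   n_sd: float = 2.0) -> List[bool]:
    """Return one flag per day indicating whether the ILI count exceeds
    a threshold built from the preceding `baseline_days` of data."""
    flags = []
    for day, count in enumerate(daily_ili_counts):
        history = daily_ili_counts[max(0, day - baseline_days):day]
        if len(history) < 7:          # too little history to set a threshold
            flags.append(False)
            continue
        threshold = mean(history) + n_sd * stdev(history)
        flags.append(count > threshold)
    return flags


# Example: a run of ordinary days followed by a sharp rise in ILI counts.
counts = [4, 5, 3, 6, 4, 5, 4, 5, 6, 4, 5, 3, 4, 18, 22]
print(flag_ili_peaks(counts))   # only the last two days are flagged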

Fig. 1. Flowchart showing the implementation of the national system for the surveillance of influenza-like and other febrile illnesses, Madagascar, 2009–2014
SMS: short message service.

Weekly, at 12 of the sentinel sites, nasopharyngeal and/or oropharyngeal samples were collected from up to five patients with ILI and shipped to the national influenza centre for influenza testing, as previously described.8,9

No financial incentives were provided to the health centre staff for their surveillance-related activities, but medical equipment, stationery and training were supplied to support these activities.

To evaluate the influenza surveillance system, we followed the relevant guidelines of the United States Centers for Disease Control and Prevention10,11 and considered eight key attributes. For each attribute, specific quantitative and/or qualitative indicators were developed and scored (Table 1).

Data quality, stability and timeliness were evaluated using the central database at the Institut Pasteur de Madagascar. To evaluate the other five attributes, semi-structured interviews and standardized self-administered questionnaires were used to collect relevant data from 85 individuals involved in the surveillance system: 68 from the sentinel sites and 17 from the Institut Pasteur de Madagascar or the Malagasy Ministry of Public Health. However, 18 of the staff members from the sentinel sites failed to respond.
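For concreteness, a hypothetical sketch of how two of the database-derived indicators, completeness and timeliness, might be computed is given below. The record fields, the seven-day timeliness target and the function names are illustrative assumptions; they are not taken from the system's database schema or from Table 1.

# Hypothetical sketch of completeness and timeliness indicators derived from
# the central database; field names and the 7-day target are assumptions.
from dataclasses import dataclass
from datetime import date
from typing import List


@dataclass
class CaseReport:
    site: str
    onset_date: date          # date of illness onset recorded on the form
    received_date: date       # date the form reached the central database
    complete: bool            # all mandatory fields filled in


def completeness(reports: List[CaseReport]) -> float:
    """Proportion of case report forms with all mandatory fields completed."""
    return sum(r.complete for r in reports) / len(reports)


def timeliness(reports: List[CaseReport], target_days: int = 7) -> float:
    """Proportion of forms received within `target_days` of illness onset."""
    on_time = sum((r.received_date - r.onset_date).days <= target_days
                  for r in reports)
    return on_time / len(reports)


reports = [
    CaseReport("site_A", date(2014, 3, 1), date(2014, 3, 5), True),
    CaseReport("site_B", date(2014, 3, 2), date(2014, 3, 12), True),
    CaseReport("site_A", date(2014, 3, 3), date(2014, 3, 8), False),
]
print(f"completeness: {completeness(reports):.2f}")   # 0.67
print(f"timeliness:   {timeliness(reports):.2f}")     # 0.67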

Relevant changes

Between January 2009 and December 2014, 177 718 fever cases were reported from the 34 sentinel sites. Overall, 25 809 (14.5%) of these fever cases were classified as ILI. Samples were collected from 9192 (35.6%) of the ILI cases and tested for influenza; 3573 (38.9%) tested positive. Table 1 summarizes the results of our evaluation of the influenza surveillance component of the fever surveillance system. The data collected on ILI appeared to be of good quality. Full data on most of the cases observed at the sentinel sites were sent in a timely manner. The case definition of ILI and the sampling criteria also appeared to be respected. However, less than 50% (4265/9293) of the samples collected reached the laboratory within 48 hours of their collection. In terms of representativeness, it seems likely that the low median age of the ILI cases observed at the sentinel sites – i.e. four years – reflects a reluctance of adolescents and adults with fever to seek care. More than 80% (47/50) of the staff interviewed stated that the implementation of their surveillance activities was easy and that the time they devoted to such activities was acceptable. Although none of the interviewees reported delays in the collection of samples from patients, 36 (54%) reported regular delays in the collection of case report forms by the express couriers. Over our study period, the mean annual costs of the entire fever surveillance system and the laboratory testing of samples were estimated to be 94 364 and 44 588 United States dollars, respectively.

The fever surveillance system appeared capable of monitoring trends in several fever-associated illnesses under a unified platform and appeared to be quite stable, at least in terms of reporting frequency. Each year the national influenza centre shared the isolates of influenza virus that it had recovered with the WHO Collaborating Centre for Reference and Research on Influenza, London, United Kingdom of Great Britain and Northern Ireland.

Lessons learnt

The influenza surveillance system showed good performance in terms of most of the indicators and attributes that we evaluated. One of the system's main strengths was its data quality – including adherence to the case definition and sampling criteria. The use of mobile phones and texting for the transmission of daily aggregated data, the follow-up and the relative simplicity of the system contributed to improving the completeness, quality and timeliness of the data and the acceptability of the system to sentinel site staff. The main weaknesses that we observed were the frequent shortages of blank case report forms and an insufficient number of trained staff. Although half of the surveillance staff interviewed reported that the associated workload was the main challenge in the implementation of surveillance activities, all of them stated that – given the probable benefits to public health – the time they spent on such activities remained reasonable. The delays between the collection of samples and their receipt at the virological laboratory were another issue.

During our evaluation, we used scores based on an arbitrary scale to estimate the quality of the surveillance system in terms of each of the indicators we evaluated. We decided not to give an overall score for each of the eight attributes we evaluated because the indicators for each attribute are unlikely to have equal importance.

Although the annual costs of the system appeared moderate, the system, at the time of writing, remains entirely supported by external funding. To improve the system's sustainability, advocacy is needed to promote financial support from the Malagasy Ministry of Public Health and other national stakeholders. Ideally, the influenza surveillance system should be nested within an integrated system of disease surveillance based on a syndromic approach. If such a system can be kept simple, its acceptability to surveillance staff and its data quality and timeliness are more likely to be good (Box 1). If such a system is to be sustainable in the long term, the number of sentinel sites and the tests used need to be tailored to the funds available.

Box 1. Summary of main lessons learnt

  • During 2009–2014, the influenza surveillance system in Madagascar appears to have performed well.
  • The system apparently provided reliable and timely data.
  • Given its flexibility and overall moderate cost, the system may become a useful model for syndromic and laboratory-based surveillance in other low-resource settings.

Given its flexibility and moderate costs, Madagascar’s influenza surveillance system may be a useful model for syndromic and laboratory-based surveillance in other resource-constrained settings.


Acknowledgements

We thank the staff of the Malagasy Ministry of Public Health and the sentinel sites.

Funding:

This publication was supported by the United States Centers for Disease Control and Prevention (cooperative agreements 5U51IP000812-02) and the Office of the Assistant Secretary for Preparedness and Response (cooperative agreement IDESP060001-01-01). AR was supported by the Indian Ocean Field Epidemiology Training Programme.

Competing interests:

None declared.

References