Malaria control: the power of integrated action


Case studies/experiences with IVM

Environmental management and modification strategies were commonly used a century ago to control vectors, before chemical insecticides became cheaply available. Some of those early experiences failed due to a lack of careful implementation or knowledge about vector ecology. But others registered striking successes that may be highly relevant today. A few examples of successful IVM experiences, new and old, are presented here to illustrate the types of integrated approaches that policy-makers may want to examine today in their own locale. Since much more attention has already been given to chemical approaches involving insecticide-treated nets (ITNs) and indoor residual sprays, the focus here will be on experiences with environmental management and biological controls, per se, as well as the social and economic benefits gained from such integrated approaches.

China

Wet/dry irrigation cycles

Prior to the mid-1960s, Sichuan Province had the fourth highest malaria rate in China. Indoor residual spraying, together with improved case detection, treatment, and surveillance, gradually reduced the severity of epidemics in the 1970s and early 1980s. When ITNs were introduced in 1986, disease rates dropped even more dramatically. By 1993, malaria incidence had declined from the former 10–20 cases per 10 000 people to fewer than five cases per 10 000. The success of the bednet approach in China also played a role in its worldwide adoption as a key component of ‘Roll Back Malaria’. Despite this overall success, however, the limitations of bednets used as a stand-alone measure became apparent over time.

  • Cost and sustainability. There was no lasting impact on malaria vector densities – bednets had to be re-treated with insecticide at regular intervals, indefinitely, requiring substantial commitments of time and money. In Sichuan, owing to changes in the cost and availability of insecticides, community-wide treatment of bednets was suspended in 1993.
  • Vector ecology. House-based control measures were not always effective in Sichuan Province because, owing to local vector behaviour, some disease transmission occurred outdoors, e.g. when farmers slept in their fields.

Over the past two decades, however, a major new farm-irrigation scheme has led to the virtual eradication of malaria in some areas of Sichuan Province – even after the discontinuation of systematic ITN programmes. The key to success has been the expansion of irrigation schemes that assure farmers of year-round availability of water, and the consequent decline in the traditional practice of permanently flooding rice fields. As constant flooding of rice paddies has been replaced by a cycle of intermittent wet/dry irrigation, breeding habitats for malaria vectors have been reduced below the critical threshold level that would trigger disease outbreaks (35).

As malaria has declined, public health has improved and agricultural productivity has also increased. Today, farmers cultivate a second field crop during the cold season when flooded rice paddies previously remained fallow. As a result, the annual income of farmers in Sichuan Province increased by more than 60% between 1995 and 2000. With this increased wealth, local residents can better afford antimalarial measures and curative treatment when necessary. Sustainable agriculture based on wet/dry crop rotation (WDCR) has thus fundamentally changed the pattern of malaria in Sichuan at virtually no cost to traditional malaria control programmes and with other significant economic payoffs.

The Chinese experience with intermittent wet/dry crop rotation could potentially be replicated in comparable areas of Eurasia, such as south-eastern Turkey, where there have been malaria epidemics associated with irrigation practices. Overall, the Chinese experience underlines the importance of coordinated planning among land-use, water, agriculture, and health policy-makers for more efficient water use, water security, and malaria control.

Zambia

Cost-effectiveness of large-scale environmental management

The experience with environmental management of malaria in the copper-mining communities of Zambia (formerly Northern Rhodesia), between 1929 and 1949, has provided a historical example of how integrated malaria control strategies may yield substantial economic benefits, as well as public health gains (25,28).

Since the Northern Rhodesian programme was undertaken in an area with high rates of malaria, it also illustrates that integrated management may be relevant in areas of endemic disease – as well as in areas of more marginal and epidemic transmission.

Fig. 3 - Use of maps to predict current and future disease distributions

In the Northern Rhodesian programme, a package of integrated control measures reduced the malaria incidence rate by 50–75% in the first 3–5 years of programme operation. Between 1930 and 1936, malaria incidence within the four mining communities involved in the programme declined from 457–514 cases to 135–251 cases per 1000 people per year. When indoor residual spraying with DDT was introduced in 1946, supplementing but not replacing the environmental management measures, there was another sharp decline in malaria incidence to just 21–30 cases per 1000 people per year.
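As a purely illustrative check, the relative reductions implied by the incidence figures above can be computed directly (all numbers are taken from the text; nothing else is assumed):

```python
# Illustrative check of the incidence reductions cited in the text
# (malaria cases per 1000 people per year, Northern Rhodesia).

baseline = (457, 514)   # 1930, before environmental management
after_env = (135, 251)  # 1936, after environmental management
after_ddt = (21, 30)    # 1946, after adding DDT indoor spraying

def reduction(before, after):
    """Percentage reduction from 'before' to 'after'."""
    return 100 * (before - after) / before

# Environmental management phase: consistent with the 50-75% range
low = reduction(baseline[1], after_env[1])   # 514 -> 251
high = reduction(baseline[0], after_env[0])  # 457 -> 135
print(f"Environmental management: {low:.0f}-{high:.0f}% reduction")
# -> Environmental management: 51-70% reduction

# Cumulative decline once DDT spraying was added
overall = reduction(baseline[1], after_ddt[1])  # 514 -> 30
print(f"By 1946: at least {overall:.0f}% below the 1930 baseline")
# -> By 1946: at least 94% below the 1930 baseline
```

The computed 51–70% range for the environmental-management phase is consistent with the 50–75% reduction quoted above.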

Northern Rhodesia’s programme was a comprehensive approach, first including improvements in housing, water, sanitation, medical treatment and facilities, and bednets. When these measures alone proved insufficient to substantially reduce disease incidence, surveys of local malarial vector habitats were conducted and environmental management strategies were designed to reduce breeding habitats.

The measures that were designed included vegetation clearance, modification of river boundaries, increasing velocity of the river flow to interrupt larval development, and swamp drainage. An additional component – albeit not one that would be considered environmentally sound today – was the application of oil to open water bodies, which also interrupted larval development.

Strategies were tuned to the local ecology and behaviour of the malaria vectors, some of which preferred shady habitats, and some of which thrived in sunshine. The measures were implemented in parallel by careful cooperation among health, water management, and planning authorities. Monthly malaria-incidence rates and vector-density surveys provided a constant stream of updated information on the effectiveness of the measures taken, so that they could be fine-tuned, and performance improved.

The detailed records kept of programme costs and procedures – together with health, employment, and revenue data from the mining company operations – have facilitated cost-effectiveness analysis. Experts conducting retrospective analysis have estimated that the malaria-control effort may have averted over 14 000 deaths and over 517 000 malaria attacks, in a mining community population that swelled from 11 000 employees and their families to over 140 000 people over a period of 20 years. Over the same 20 years, integrated malaria-control costs were estimated to total about US$ 11 million, while nearly US$ 6.5 million in direct medical costs and indirect costs of lost worker productivity were estimated to have been averted (in 1995 US$ terms) (28).
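To make these estimates concrete, a rough back-of-envelope calculation (using only the figures quoted above, in 1995 US$; the resulting ratios are illustrative, not from the cited analysis) gives the cost per death and per malaria attack averted:

```python
# Back-of-envelope cost-effectiveness ratios, using the estimates
# quoted in the text (1995 US$; all figures are retrospective
# estimates, and the averted counts are stated as lower bounds).

total_cost = 11_000_000    # integrated control costs over 20 years
deaths_averted = 14_000    # "over 14 000 deaths"
attacks_averted = 517_000  # "over 517 000 malaria attacks"
costs_averted = 6_500_000  # direct medical + lost-productivity costs

cost_per_death = total_cost / deaths_averted    # ~US$ 786
cost_per_attack = total_cost / attacks_averted  # ~US$ 21

print(f"Cost per death averted:  US$ {cost_per_death:,.0f}")
print(f"Cost per attack averted: US$ {cost_per_attack:,.2f}")
print(f"Costs averted offset {100 * costs_averted / total_cost:.0f}% "
      f"of programme spending, before macroeconomic gains")
```

Because the averted counts are lower-bound estimates, the true cost per outcome averted would, if anything, be lower than these figures.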

Important economic development and macroeconomic benefits may also have been generated by malaria control. Over the 20-year period the programme was in operation, Northern Rhodesia was transformed from an insignificant player in copper mining to the third most important copper ore producer worldwide; in 1938, copper represented 55% of taxable national income (25,28).

Prior to 1929, unsuccessful malaria control efforts had resulted in migrant workers abandoning some of the same copper mine sites, and rumours flourished along the labour routes of the malaria dangers associated with copper mining in the area. Integrated malaria management both dissipated these fears and stimulated unprecedented in-migration to the mining communities – essential ingredients in the rapid expansion and sustainability of the mining operations.

While this programme occurred under a colonial regime that is now a historical artefact, such historical experiences with malaria control have been recognized by experts in Africa and elsewhere as potentially relevant to the struggle of modern and independent African nations to address issues of disease, environment and development. In particular, the experience illustrates how concerted intersectoral action between health and economic sectors can yield public health benefits as well as economic payoffs.

Kenya

Livestock and settlement patterns in the control of malaria

A new study, conducted within four villages of the Mwea Division of Kenya under the auspices of the Systemwide Initiative on Malaria and Agriculture (SIMA) and the Canadian-based International Development Research Centre, found that malaria prevalence was significantly lower in the two villages with rice irrigation schemes than in the villages with no irrigated agriculture (0–9% versus 17–54%). The lower prevalence occurred despite a 30- to 300-fold increase in the number of local malaria vectors in the irrigated locales. A possible explanation for the so-called ‘paddies paradox’ may lie in the tendency of the prevalent Anopheles arabiensis vector to prefer feeding on cattle, rather than on humans (19). Ongoing research into the feeding preferences and patterns of mosquitoes may lend insights into how malaria control may make use of livestock as ‘diversionary’ hosts for some malaria vectors.

Larvivorous fish in Africa and Asia

Predatory fish that eat mosquito larvae have been used for mosquito control for at least a century. In both Africa and Asia, native fish species have been identified that may act as biological control agents. Caution is required here, however, in that the introduction of exotic larvivorous fish species can also have negative consequences upon local ecosystems.

  • Fish may be particularly useful in controlling malaria vectors associated with rice fields (16). In Asia this practice has spurred development of pisciculture, which in turn generates additional agricultural, economic, and nutritional benefits (36). In China, stocking rice paddies with edible fish, such as carp, decreased larval numbers of malaria vectors in comparison to control areas. Expansion of fish stocking in rice fields on a large scale over several years correlated with a marked decrease in malaria transmission, as well as improved rice yields and significant fish production (37). In Ethiopia, researchers have identified indigenous fish species that suppressed malaria vectors breeding in urban wells and water containers (38,39).
  • In India, both introduced and native fish species have been effective in suppressing the breeding of the Anopheles stephensi vector that thrives in urban water containers. Interest in the use of biological predators is rising due to the growing resistance of the vector to DDT (15,36,40). Since water containers are also habitats for the Aedes aegypti vector responsible for dengue fever, good vector control practice can have a simultaneous and positive impact on both malaria and dengue disease control. In Goa, the introduction of native fish species in wells and water tanks – with the use of Bti in smaller habitats – led to a significant decline in the annual parasite index (API) among local residents. Significantly, the combined use of these two biological tools was more effective than conventional vector controls, including indoor residual spraying with DDT and pyrethrum fogging (41).