Date Published: November 22, 2016
Publisher: Public Library of Science
Author(s): Robert W. Snow
Abstract: Robert W. Snow discusses the importance of empirical evidence, such as that provided in the trial published this week by Milligan and colleagues, in guiding malaria control in Africa.
Partial Text: The use of antimalarial drugs to prevent malaria in Africa is not new [2,3]. At the turn of the last century, quininisation was widely promoted among residents of colonial administrative centres. During the “eradication” projects of the 1950s and 1960s, when indoor residual spraying failed to reach expected targets, mass drug administration (MDA) was included, resulting in huge reductions in infection incidence but never quite reaching elimination. The seasonal use of “chemoprophylaxis” was undertaken as part of pilot trials or elimination campaigns in Kenya, Burkina Faso, Senegal, Réunion, and Tunisia, again with dramatic impacts on parasite transmission and disease incidence [2,3]. Approaches to the seasonal use of drugs to prevent infection were resurrected during the 1980s with trials of fortnightly distribution by village health workers of pyrimethamine-dapsone to young children in The Gambia, resulting in an 80% reduction in clinical events and a 34% reduction in all-cause childhood mortality. Trials during the early 2000s of intermittent presumptive sulphadoxine-pyrimethamine (SP) treatment of infants (IPTi) attending routine vaccine visits showed, on average, 30% protection against morbid events due to malaria. However, with the exception of presumptive SP treatment of malaria in pregnancy, drug-based interventions have received less attention than vector control throughout the history of malaria control in Africa. Since the earliest use of quinine and chloroquine prophylaxis, commentators have highlighted concerns about adherence, adequate coverage, sustainability, delayed acquisition of immunity, and resistance, all of which are relevant to both vector- and drug-based control.
Trial results of seasonal malaria chemoprevention (SMC) between 2006 and 2011 showed dramatic impacts of the intermittent, presumptive use of combinations of antimalarial drugs on the incidence of clinical malaria in children aged between two months and five years. A meta-analysis in 2012 found strong evidence that the periodic, presumptive use of SP in combination with amodiaquine (AQ) in areas of acute seasonal transmission could reduce malaria morbidity in young children by 75%. This evidence led to policy statements by WHO the same year and to the development of regional and national plans for the implementation of SMC. Donor agencies provided funding to operational plans through consortia of national malaria control programmes, nongovernmental organisations, UN agencies, and monitoring and evaluation partners. Within a year, 3.2 million children aged less than five years were protected by SMC in seven countries. This history provides an exemplary illustration of how field research evidence can lead to early policy adoption and immediate donor assistance. Importantly, previous reservations about the use of drugs for malaria control seemed less of a concern for SMC than, say, for IPTi or MDA.
The operational costs of reaching households with children under the age of five would be similar if one aimed to reach these children’s older siblings at the same time. The only additional costs would be the increased use of comparatively cheap, well-tolerated drug combinations, while the benefits could be great if disease burdens were significant in children above five years of age. The epidemiological associations among parasite exposure, age, and clinical burden are complex but, in broad terms, as malaria transmission intensity declines, the age at which functional clinical immunity is acquired increases (Fig 1) [9,10]. West Africa and the Sahel, where current SMC efforts, including Milligan and colleagues’ trial, are focussed, encompass a wide range of intrinsic transmission characteristics, which are important for predicting the impact of SMC. Similarly varied has been the ability to reduce transmission potential through vector control. Countries such as Senegal, where Milligan and colleagues conducted their trial, and The Gambia have witnessed massive reductions in malaria transmission intensity over the last decade, to the extent that the phenotype of clinical malaria has transitioned from a disease concentrated in young children to one that affects an older childhood population [12,13].
Parasites, vectors, and humans adapt in the face of intervention. Models might predict what could happen, but they do not tell us what does happen. We have become too comfortable with modelled predictions of impact, both present and future. After 20 years of scaling up access to insecticide-treated nets (ITNs), we still depend on models to estimate their likely contribution to changing disease burdens. Empirical evidence is scanty, the impact of ITNs on the disease phenotype and on acquired immunity is poorly described, and pyrethroid resistance has emerged, though its public health impact remains unclear.