Chapter 3: Infectious Diseases Related to Travel
For the Record: A History of Malaria Chemoprophylaxis
Alan J. Magill
Among the many dreaded fevers that plague man, “marsh fever” was distinguished by its periodicity, enlarged spleen, and pattern of attacking those exposed to wet, warm, boggy areas. In the mid-1700s, it became known by its Italian name, mal’aria (“bad air”), referring to the foul air emanating from swampy lands. The discovery and development of drugs to treat and prevent malaria have been driven for centuries by the need to protect travelers, military personnel, and explorers, and by the commercial interests of imperial powers venturing into the malaria-endemic tropics.
HISTORICAL MILESTONES IN ANTIMALARIAL CHEMOPROPHYLAXIS
- 1620s: Jesuit missionaries living in Peru learned the healing power of powdered bark, often known as “Jesuits’ bark,” from the cinchona trees growing in the high forests of Peru and Bolivia.
- 1768: James Lind, a British naval surgeon, recommended that “every man receive a daily ration of cinchona powder” as long as a ship lay at anchor in any tropical port. His advice, however, was not widely accepted, and it would be almost 100 years before the concept of chemoprophylaxis would be accepted.
- Early 1800s: Crews of the British Royal Navy manning the blockade of the West African coast to suppress the Atlantic slave trade were decimated by malaria.
- 1820: French chemists isolated quinine, the most active compound in cinchona bark, which allowed for quinine’s expanded availability and use.
- 1854: The Scottish surgeon William Balfour Baikie gave 6–8 grains (1 grain = 65 mg) of quinine, half in the morning and half in the evening, dissolved in sherry, to all the ship’s crew during a 118-day expedition up the river Benue in modern-day Nigeria. No men died. This unprecedented accomplishment gradually led to acceptance that malaria could be prevented by chemoprophylaxis.
- 1861: Quinine first saw widespread use as a prophylactic agent in the American Civil War when both the Union and Confederate Armies, plagued with malaria, used massive quantities to prevent the disease.
- 1880: A French military physician working in Algeria, Alphonse Laveran, discovered parasites in the blood of a French soldier. The identification of the infectious agent and its life cycle was an essential step in opening the way to the development of effective prophylactic agents.
- 1914–1918: During World War I, militaries on both sides—the Allied and Central Powers—suffered terribly from malaria. German armies were denied access to quinine, as most was held by a monopoly of Dutch growers on the island of Java.
- 1920s: German chemists pursued a synthetic route to new antimalarial drugs to circumvent the Dutch monopoly, achieving spectacular success with the introduction of pamaquine and mepacrine during the 1930s.
- 1941: With America’s entry into World War II, trade between the United States and Germany ceased, and mepacrine was no longer available to the Allies.
- 1942: Japanese armies overran the Dutch cinchona plantations on Java, cutting off the supply of quinine and leaving the Allies with no antimalarial drugs.
- 1943: American scientists quickly devised a manufacturing process for mepacrine, and the drug was given as a treatment and as a prophylactic under the name Atabrine. Meanwhile, the Allies, especially the Americans, launched the largest antimalarial drug discovery and development program the world had seen. By 1945, several new antimalarial drugs had been introduced, including chloroquine and proguanil. Chloroquine went on to assume a role both in prophylaxis for travelers and as a treatment drug for endemic areas.
- 1959: Chloroquine-resistant Plasmodium falciparum malaria was first reported.
- 1965: Large-scale American military involvement began in South Vietnam, reaching almost 200,000 soldiers by the end of 1965. Chloroquine-resistant malaria caused illness and death in US forces.
- 1967: A second massive US government–sponsored antimalarial drug discovery effort centered at the Walter Reed Army Institute of Research began and led to the discovery of mefloquine.
- 1971: During studies of volunteers with experimentally induced malaria infections, it was observed that tetracycline, administered to treat intercurrent bacterial infections, appeared to exert blood schizontocidal activity against chloroquine-resistant P. falciparum. A few additional studies were performed with tetracycline, doxycycline, and minocycline, but no formal development of the drug as an antimalarial was pursued.
- Early 1980s: The Wellcome Research Laboratories developed atovaquone as an antimalarial drug. Clinical trials in uncomplicated P. falciparum malaria as monotherapy were disappointing, with early treatment failures due to the emergence of atovaquone-resistant parasites. However, combining atovaquone with proguanil led to an efficacious combination therapy.
- 1982: The fixed combination of pyrimethamine and sulfadoxine (Fansidar) became available in the United States and was recommended by CDC for prophylactic use in travelers at risk of acquiring chloroquine-resistant P. falciparum.
- 1985: Pyrimethamine-sulfadoxine as weekly malaria chemoprophylaxis was abruptly withdrawn because of fatal cases of Stevens-Johnson syndrome. Since no alternative drugs were licensed to prevent chloroquine-resistant P. falciparum malaria, CDC recommended daily doxycycline for antimalarial prophylaxis.
- Late 1980s: The US Army conducted several field and human challenge clinical trials demonstrating the efficacy of doxycycline as malaria prophylaxis.
- 1989: The US Food and Drug Administration (FDA) approved mefloquine (Lariam).
- 1992: Pfizer, at the request of FDA, submitted a supplemental new drug application for doxycycline for malaria chemoprophylaxis.
- 2000: The fixed-dose combination of atovaquone-proguanil (Malarone) was approved by FDA.
The 3 largest antimalarial drug development efforts of modern times occurred during and after major conflicts of the 20th century: World War I, World War II, and the Vietnam War. The modern history of developing drugs to prevent malaria grew almost entirely from the need to protect military personnel and keep them healthy to conduct combat operations in malaria-endemic areas. Those massive government-funded efforts produced the drugs now used to keep modern civilian travelers safe from the sickness and death that mark malaria’s march through time.