Bitten for a Cause: The Volunteers Who Risked Malaria Infection for Lifesaving Research

How scientific breakthroughs and historical events have shaped the global fight against one of humanity’s oldest killers.

Dr Tim Crocker-Buque PhD MRCP
Microbial Instincts

--

A female Anopheles mosquito taking a blood meal. Source: CDC

The first time I watched someone die of malaria was in a small, old missionary hospital in rural Ethiopia. As soon as the young woman walked into the dilapidated admissions building, we all noticed her, face drawn and desperate, eyes searching for help. Yet, she wasn’t the sick one.

Carefully untying the white tallit shawl tightly wrapped around her torso, she took out the tiny limp body of her four-year-old daughter Faayo* and handed her over, arms outstretched, staring at us imploringly. After we had carried the girl to the assessment bay, her mother slumped heavily into a cheap plastic chair, exhausted, having walked across rugged, rocky terrain for many hours to reach us.

The team of compassionate, knowledgeable and tremendously skilled Ethiopian doctors I was working with quickly got down to business: one went to talk to the mother, while another began trying to find a vein to draw blood and start some intravenous fluids.

I went to find the hospital’s only pediatrician. When he arrived he looked down at Faayo, stony-faced. He delicately stroked her hot forehead then pulled up her eyelids to look into her unseeing eyes, the whites now yellow. “Probably malaria,” he said, “probably too late”. Looking for malaria parasites in a fresh blood sample takes minutes rather than hours, but by the time the result had come back positive, Faayo had already died.

So, a decade later, how did I end up deliberately infecting healthy volunteers with this deadly disease?

The Child Killer

Malaria causes around 250 million episodes of illness globally each year, more than 9 in 10 of them in sub-Saharan Africa, with Nigeria and the Democratic Republic of the Congo particularly afflicted. Although the proportion of episodes that end in death is relatively low (around 600,000 deaths from 250 million episodes, a fatality rate of roughly 0.2%), malaria mainly kills young children: more than 450,000 children under the age of 5 die each year, alongside around 150,000 older children and adults.

These numbers are lower than they used to be (in 2000, there were more than 850,000 deaths) thanks to intensive malaria control initiatives, such as insecticide-treated bed nets. However, there has been no real progress in reducing the number of cases or deaths for nearly 10 years. So, what causes this devastating illness?

Manson and Ross

The diary of Ronald Ross showing the complete lifecycle of the malaria parasite for the first time in 1897. Source: LSHTM Library

Identifying the cause of malaria had stumped doctors for centuries. Prior to the 18th Century, the classic symptoms of relapsing fever were known as “ague,” from the Old French for “acute fever.” As its symptoms became better categorised and differentiated from other fever-causing illnesses, it was renamed mal’aria by physicians in Italy. At the time, it was thought to be caused by something bad in the air, especially around swampy land: mala (bad) + aria (air).

It took the combination of microscopy and imperialism to make the next leap. Ronald Ross had not wanted to become a doctor, but under pressure from his father, a British colonial general in the Indian Army, he enrolled at Barts medical school in London in 1874. He went on to pass the examinations of the Royal College of Surgeons and, later, the Society of Apothecaries, which allowed him to secure his father’s preferred job in the Indian Medical Service.

In 1878, while Ross was still at medical school, Patrick Manson published his ground-breaking discovery that the parasites causing lymphatic filariasis were spread by mosquito bites. Noting that malarial parasites in the blood bore a similar appearance, in 1894 he published an article in the British Medical Journal calling on physicians working in tropical countries, especially India, to investigate whether these, too, were spread by mosquitoes. Ross took up the challenge and, in 1898, fully described the lifecycle of the Plasmodium parasite for the first time (pictured above), for which he won the Nobel Prize in 1902.

Murderous Mothers

What Ross identified was that when a female mosquito of the genus Anopheles takes a blood meal from a person infected with malaria, she also ingests Plasmodium parasites. These migrate through her gut and mature in her salivary glands, so when she next bites, parasites are injected into an uninfected human. From there, they travel via the blood to the liver**, where they replicate and burst the liver cells open, throwing huge numbers of parasites into the blood. The parasites then invade red blood cells and replicate further, bursting these cells open and releasing parasites that infect yet more red blood cells. This cycle of cellular infection and rupture causes the characteristic cyclical fever, as well as anaemia from red blood cell damage and jaundice from liver cell damage.

But why only females? Because they need the extra protein to nourish their developing eggs. It’s one of nature’s deep ironies that, in their quest to feed their own offspring, maternal mosquitoes are responsible for the deaths of hundreds of thousands of human children each year.

Empire and War

An embellished sign encouraging members of the American military to take atabrine to prevent malaria. Source: Otis Historical Archives of the National Museum of Health & Medicine

Once the lifecycle had been described, the next mission was to find treatments and ways of preventing transmission.

Historically, malaria was much more widespread than it is today, with malarial regions extending into southern and central Europe as well as the southern and eastern United States. As European colonists arrived and settled in tropical lands, not only did they become more exposed to malaria, but they also exported it to places like South America. The high death rate from relapsing fever was a huge problem for colonial forces.

The indigenous populations of Andean countries had long used bark from the Cinchona tree in traditional medicine. Missionaries noted its apparent effectiveness at curing malaria, and samples were sent back to Europe in the 17th Century. In 1820, scientists in France identified quinine as the curative compound and successfully purified it from the bark.

As powerful states mustered vast armies to defend their imperial territories in the wars of the 19th and 20th Centuries, demand for quinine to prevent malaria among troops soared. European explorers stole seeds from Peru and Ecuador, and, with much difficulty, the Dutch used them to establish Cinchona plantations in Java, eventually resulting in quinine tonic wine being widely distributed across imperial forces.

As global conflicts escalated through the first half of the 20th Century, the laborious production of quinine from tree bark could not keep up with the massive demand from military forces across Africa and Asia. During WW2 in the Pacific, far more soldiers were lost to malaria than to combat. Fortunately, research was already underway to find an easy-to-produce synthetic antimalarial drug that could be made at scale. This work yielded the new drugs atabrine and the more effective chloroquine, developed between 1934 and 1946.

Tropical Challenges

“This will be a long war if for every division I have facing the enemy I must count on a second division in the hospital with malaria and a third division convalescing from this debilitating disease.”
General Douglas MacArthur

Despite these advances, many mysteries around malaria remained, including why soldiers infected with P. falciparum parasites were cured, yet those with P. vivax parasites relapsed.

In the mid-1940s, at the height of the conflict in the Pacific, Brigadier Neil Hamilton Fairley was the Director of Medicine for the Australian Military Forces and a global expert in tropical medicine. He became determined to solve the problem of relapse and stem the ongoing waves of deaths among his troops. To do so, he designed a seriously radical and wildly complex clinical trial using deliberate infections of military “volunteers”, one of the earliest examples of a human infection challenge study.

The study divided the soldiers into three groups: 1) those taking prophylactic drugs (including atebrin and sulphadiazine); 2) controls taking no antimalarials; and 3) a second group not on antimalarials. Groups 1 and 2 were exposed to the bites of infected mosquitoes. Group 3 was then administered blood taken from Group 2 participants at different times following the mosquito bites, to see if and when the parasites in their blood were infectious.

To get enough mosquitoes, the study team arranged the air transport of 20,000 larvae per week from New Guinea and carefully reared them to maturity in the lab. They then needed to infect the mosquitoes by feeding them on infected humans. To do this, they flew malaria patients from New Guinea and Queensland to Cairns to supply meals of infected blood.

For many ethical and safety reasons, this study could never be conducted today. However, it revolutionised the understanding of malaria transmission and of the effectiveness of the new medicines. It provided detailed evidence on the relationship between blood parasites and infectiousness, optimised the dosing regimen for atebrin prophylaxis, confirmed that these drugs could not prevent P. vivax relapse, and demonstrated a lack of immunity after repeat exposures. Implementing these findings radically reduced malaria cases and deaths among soldiers fighting in the tropics:

A graph showing the dramatic reduction in malaria following the introduction of effective prophylaxis, from Hamilton Fairley’s 1945 study.

Lab Blood Food

In the aftermath of WW2 and the horrors inflicted by doctors in the Nazi concentration camps, studies deliberately infecting research participants were largely abandoned. However, malaria was still causing havoc for military personnel, including in the US forces in Vietnam.

Drug development had continued apace, but the Plasmodium parasites were developing resistance nearly as fast as new drugs became available; chloroquine resistance, for example, had emerged by 1957, barely a decade after the drug’s introduction in 1945. With improvements in vaccine technology, efforts increasingly focused on preventing malaria through immunization rather than prophylaxis.

In 1986, doctors at the Walter Reed Army Institute of Research (WRAIR) in the USA, named after the physician who demonstrated that mosquitoes transmit yellow fever, reinvigorated the volunteer infection model for malaria. The first study was small, with only 6 participants, but revolutionary: the mosquitoes were fed not on a human with malaria, but on parasites grown in a lab. Since then, thousands of volunteers have been infected with malaria using this method at sites in the USA, UK, the Netherlands, and Australia. The P. falciparum strain selected is known to be extremely sensitive to chloroquine, making infections easy to treat. These studies have proved very safe, with few serious side effects reported among participants, and they continue to this day.

Vaccine Hope

The journey of the first licensed malaria vaccine (called RTS,S) from lab to widespread use was significantly sped up by human challenge studies. The first of these was safely conducted at WRAIR in 1996, with 6 of the 7 participants avoiding infection following a mosquito bite. The vaccine was then evaluated in several further human challenge studies to refine the dose and formulation before field trials in The Gambia in 2001 and Kenya in 2009.

Although not perfect (it prevents about 30% of malaria infections and around 55% of severe cases), the WHO’s approval of the vaccine in 2021 has led to more than a million children being immunized to date. Fast on its heels is the R21 vaccine, which appears to prevent around 77% of malaria cases in children and is also in the process of being rolled out following WHO approval.

Despite this progress, many children like Faayo living in malarial regions do not have access to tests and treatment to cure their malaria. The arrival of effective vaccines to stop young children from dying is a glimmer of hope that soon some of this suffering will be prevented. Yet, the effectiveness of these vaccines needs to improve, so more volunteers will be needed to uncover their arms and be bitten for science.

*Name and some details changed to preserve anonymity.

** The liver was only identified later as the second site of replication (beyond blood cells), although such a site had already been hypothesised.
