Scientists debate how global warming will affect the spread of malaria to patients like this one, in
a Taiwanese clinic. [Credit: Ashley Jonathan Clements, flickr.com.]
New York City summers are known for their heat and humidity, but even so, the summer of 2006 was surprisingly scorching. In the first days of August, temperatures hit triple digits for several days, followed by almost a week of relentless 90-degree weather. And with the humidity, it felt even hotter—akin to the climate of tropical countries. Thousands of people lost power, over 100 people died, and residents were generally uncomfortable. But some might wonder if there could be another danger lurking in these baking temperatures—could they facilitate the spread of tropical infectious diseases, such as malaria?
Scientists agree that our planet is warming and that human activities have contributed to climate change. Since researchers started recording climate data in 1850, they have seen a rise in global average temperatures. And in recent years people have been feeling the heat—the decade of 1998 to 2007 was the warmest ever recorded. In the future, scientists believe we will see not only an increase in temperatures, but also an increase in extreme weather events, such as droughts in some areas and torrential storms in others.
All these changing weather patterns are likely to influence human health. Droughts, floods and heat waves could cause injury, sickness and death. One specific health threat that has been linked to climate change is the increased spread of infectious disease. Certain vector-borne diseases—illnesses in which a host organism, such as an insect, carries and transmits a disease-causing pathogen—are particularly affected by varying weather and hotter temperatures. Because these vectors are cold-blooded, they depend on the surrounding environment to regulate their body temperature. Thus, an increase in temperature would theoretically favor insect life, and possibly allow the spread of certain diseases into new areas.
When discussing future disease spread, people often cite malaria as an illness that will be influenced by climate change. In An Inconvenient Truth, Al Gore links malaria expansion to global warming. And many popular press articles have also mentioned the possibility of the disease spreading to new areas, including the United States and Europe. So should we brace for a malaria surge as our planet warms? Will New York City see malaria epidemics in future summer heat waves? While malaria expansion could hypothetically occur, in reality, how much the disease spreads depends on many factors, some of which may outweigh temperature change. To assess the threat, it’s helpful to look at the whole picture, which not only includes the climate, but also the parasite, the vector, the human host and our society.
Malaria today has a huge public health impact. Worldwide there are about 350 million to 500 million malaria cases each year and more than one million deaths, mostly in sub-Saharan Africa, according to the Centers for Disease Control and Prevention, or CDC.
The disease is caused by a protozoan parasite, which is transmitted by the Anopheles mosquito. While there are four species of human malaria parasite, the two most serious forms are Plasmodium falciparum and Plasmodium vivax.
When an infected mosquito draws blood from a human, the parasite is injected into the victim’s bloodstream. The parasite multiplies inside liver cells and red blood cells and causes, among other symptoms, fever, chills, anemia, vomiting and convulsions. The immune system has trouble detecting and fighting the disease-causing pathogen; if it is not treated early and with the right medicine, malaria can be fatal.
Since malaria parasites cannot develop properly within the mosquito below about 60° F (16° C), the disease is found mostly in tropical areas where temperatures stay relatively warm all year round. This climate restriction is what has people concerned that a warming world could mean more malaria. But temperature alone is not completely responsible for disease spread.
“The thing that people usually hone in on is that the parasite develops more quickly at higher temperatures, which is undoubtedly true,” says Paul Reiter, a professor of medical entomology at the Pasteur Institute in Paris, France. “But whether that really makes a difference to the epidemiology is dependent on a lot of other factors.”
Malaria’s Northern Past
Malaria historically had a larger global reach, with the disease stretching much farther north than it does today.
“One hundred years ago there was a lot of malaria in [the United States], and there was a lot all the way up into Scandinavia and Europe,” says Thomas Burkot, a malaria researcher in the CDC’s Division of Parasitic Diseases. “So the potential range for malaria transmission, at least seasonally, is quite a bit larger than where we actually find malaria today.”
Indeed, in the mid-19th century, malaria epidemics occurred in parts of England, Denmark and Sweden, but the disease gradually disappeared from Europe around the beginning of the 20th century. The United States, on the other hand, did not see malaria eradicated until a few years after World War II, in the early 1950s. Before that time, the disease was endemic, or found locally, in several parts of the country, and especially prevalent across the southeastern states. In fact, the CDC, which originally stood for the Communicable Disease Center, was established in Atlanta, Georgia in 1946 primarily to eradicate malaria. Even today, the Anopheles mosquito is widely present in the US and on all continents except Antarctica.
So why did the disease disappear from the western world? The answer reveals the complex factors behind malaria transmission, and also suggests why the disease probably won’t stretch back to Europe or the US, even in the midst of temperature changes.
“The reason why [malaria disappeared], we can’t really say for sure, but a lot of it has to do with our lifestyle,” says Burkot.
In Europe, a combination of factors probably contributed to malaria’s decline. One cause may have been increased urbanization. Malaria is generally a rural disease, and when industrialization took hold and people migrated to cities, the disease did not follow them there. “Malaria tends to be a rural disease because the mosquitoes need clean water sources,” says Burkot. “Generally in urban communities, you really don’t see a large population of mosquitoes that transmit malaria because the water’s just too polluted.” Also, malaria mosquitoes feed only at night, and improved building materials reduced the mosquito’s access to people.
Increased animal agriculture may have helped as well. Farmers began to breed larger populations of cattle, and when humans and cattle live close together, malaria mosquitoes can be deflected to feeding on cattle rather than people, according to Reiter, the entomologist at the Pasteur Institute.
In the United States, the story was different. The CDC organized a campaign to get rid of malaria in the 1940s and 50s, which mainly involved spraying the pesticide DDT in areas with lots of malaria cases. Draining swamps and wetlands—mosquito breeding grounds—also helped in the effort. Malaria was considered eradicated from the country in 1951, according to the CDC.
These actions, along with the malaria medications available today in developed countries, keep malaria at bay in areas outside the tropics.