
Human life span: from brevity to longevity

by LAURA HELMUTH SLATE | January 13, 2014 at 2:41 a.m.

The most important difference between the world today and 150 years ago isn’t nuclear weapons or the Internet. It’s life span.

Americans in the early 1800s lived 35 or 40 years on average, but now we live to almost 80.

Why has average life span increased so much?

Was it a stepwise series of advances that each added a few years, such as clean water, sewage treatment, vaccines, modern medicine?

As it turns out, the question of who or what gets credit for the doubling of life expectancy in the past few centuries is contentious. Statistics are sparse before 1900, and rivalries abound: between biomedicine and public health, between obstetricians and midwives, and between people who say life expectancy will rise indefinitely and those who say it’s starting to plateau.

In much of the developing world, life span hasn’t increased nearly as dramatically as in the United States and the rest of the developed world. (The United States has a lousy life expectancy compared with other wealthy nations.) Even within our country, there are huge differences among races, regions and social classes.

To understand why people live longer today, start with how people died in the past. They died young and painfully of consumption (tuberculosis), quinsy (tonsil abscess), fever, childbirth and worms. There’s nothing like looking back at the history of death to dispel any notions that people used to live in harmony with the land or were more in touch with their bodies. Life was full of contagious disease, spoiled food, malnutrition, exposure and injuries.

But disease was the worst.

The vast majority of deaths before the mid-20th century were caused by microbes - bacteria, amoebas, protozoans or viruses that ruled the earth then and, to a lesser extent, still do.

Lists of deaths by cause were kept in London starting in the 1600s and in some North American cities and parishes starting in the 1700s. People thought fevers were spread by miasmas (bad air), and the treatment of choice for pretty much everything was bloodletting. It’s not clear which microbes were involved, so we don’t necessarily know what it meant to die of “dropsy” (swelling) or whether “ague” referred to typhoid fever, malaria or another disease.

Interpreting these records has become a subfield of history. But overall, death was mysterious, capricious and ever-present.


The first European settlers mostly died of starvation, with, some historians say, a side order of stupidity. They picked unnecessary fights with native peoples and sought gold and silver rather than planting food. They drank foul water.

As Charles Mann notes in his book 1493: Uncovering the New World Columbus Created, one-third of the first three waves of colonists were gentlemen, meaning they didn’t do manual labor. During the winter of 1609-10, almost everyone died; those who survived engaged in cannibalism.

Deadly diseases infiltrated North America faster than Europeans did. American Indians had no history of exposure to common European childhood diseases, and thus no resistance. Unimaginable pandemics of smallpox, measles, typhus and other diseases ultimately reduced the population by as much as 95 percent.

The slave trade killed more than 1 million Africans who were kidnapped, shackled and shipped across the Atlantic. Those who survived the journey were at risk of dying from European diseases, starvation and abuse.

The slave trade also introduced African microbes to North America; malaria and yellow fever killed the most.

One of the best reviews of death’s history is The Deadly Truth: A History of Disease in America by Gerald Grob. Pioneers in wagon trains had barely enough food, much of it spoiled. Their water came from larvae-infested ponds. They died in droves of dysentery.

Poorly sealed, damp log houses were teeming with mosquitoes and vermin. Because of settlement patterns along waterways, some of the most notorious hot zones for malaria in the mid-1800s were in Ohio and Michigan.

Faster transportation in the 1800s brought wave after wave of disease outbreaks to new cities and the interior. Urbanization brought people into ideal proximity from a germ’s point of view. So did factory work and public schools. Children who might have toiled in relative epidemiological isolation on farms were suddenly coughing all over one another in enclosed schoolrooms.


How did we go from past miseries to an expectation of long and healthy lives? “Most people credit medical advances,” says Harvard medical historian David Jones - “but most historians would not.”

Most effective medical treatments that save lives today have been available only since World War II: antibiotics, chemotherapy, drugs for high blood pressure. But the steepest increase in life expectancy occurred from the late 1800s to the mid-1900s.

Some successful treatments such as insulin for diabetics have kept individuals alive but haven’t necessarily had an effect on average life span.

Mathematically, the interventions that saved infants and children from dying of communicable disease had the greatest impact. Until the early 20th century, the most common age of death was in infancy.

Clean water could be the biggest lifesaver in history. Some historians attribute one-half of the overall reduction in mortality, two-thirds of the reduction in child mortality and three-fourths of the reduction in infant mortality to clean water.

In 1854, John Snow traced a cholera outbreak in London to a water pump next to a leaky sewer, and after that some of the big public works projects of the late 1800s involved separating clean water from dirty. Cities ran water through sand and gravel to physically trap filth. When that didn’t work, they started chlorinating water.


Closely related were technologies to move wastewater away from cities, but, as Grob explains in The Deadly Truth, the first sewage systems made the transmission of fecal-borne diseases worse. Lacking an understanding of germs, people thought that dilution was the best solution and just piped their sewage into nearby waterways. Unfortunately, sewage outlets were often near water system inlets.

Once the germ theory of disease caught on in the late 1800s, people started washing their hands. Soap stops deadly and lingering infections. Even today, kids who don’t have access to soap and clean water have stunted growth.

Housing, especially in cities, was crowded, filthy, poorly ventilated, stinky, hot in the summer and cold in the winter. These were terrible conditions for human beings - but a great place to be an infectious microbe.

Pretty much everyone was infected with tuberculosis, the leading killer for most of the 19th century. It was predominantly a disease of poverty. As economic conditions started improving in the 19th century, more housing was built, and it was airier, brighter (sunlight kills tuberculosis bacteria), more weather-resistant and less hospitable to vermin and germs.


A longevity gap between the rich and the poor developed slowly with the introduction of effective health measures that only the rich could afford: Ipecac as a purgative, condoms to prevent the transmission of syphilis, quinine from the bark of the cinchona tree to treat malaria.

Once people realized citrus fruits could prevent scurvy, the wealthy built greenhouses where they grew the lifesaving fruit.

Improved nutrition extends life. The earliest European settlers in North America suffered from starvation, but once the British Colonies were established, they had more food and better nutrition than people in England.

In Europe, the wealthy were taller than the poor, but there were no such class-related differences in America. This changed during the 1800s, when the population expanded and immigrants moved to urban areas. Average height declined, but farmers were taller than laborers.

People in rural areas outlived those in cities by about 10 years, largely due to less exposure to contagious disease and better nutrition. Diseases of malnutrition were common among the urban poor: scurvy (vitamin C deficiency), rickets (vitamin D deficiency) and pellagra (a niacin deficiency). Improved nutrition at the end of the 1800s made people taller, healthier and longer-lived; fortified foods reduced vitamin-deficiency disorders.

Contaminated food once ranked among the greatest killers, especially of weaned infants. Refrigeration, public health drives for pure and pasteurized milk, and an understanding of germ theory helped people keep food safe.

The Pure Food and Drug Act of 1906 made it a crime to sell adulterated food, introduced labeling laws, and led to government meat inspection and the creation of the Food and Drug Administration.


People started finding ways to fight epidemics in the early 1700s, mostly by isolating the sick and inoculating the healthy.

The United States suffered fewer massive epidemics than Europe, where bubonic plague periodically burned through the continent and killed one-third of the population. Low population density prevented most epidemics from becoming widespread early in U.S. history, but epidemics did cause mass deaths, especially in crowded cities.

Yellow fever killed hundreds of people in Savannah, Ga., in 1820 and 1854; the first devastating cholera epidemic hit Europe and North America in 1832. Port cities suffered some of the worst outbreaks because sailors brought new disease strains from all over the world. Starting in the 19th century, port cities instituted quarantines, preventing sailors from disembarking if there was any evidence of disease; on land, quarantines isolated contagious people.


In the early 1900s, antitoxins to treat diphtheria and vaccines against diphtheria, tetanus and whooping cough helped stop these deadly diseases, followed by vaccines for mumps, measles, rubella (German measles) and polio.

Anne Schuchat is assistant surgeon general and the acting director of the Centers for Disease Control and Prevention’s Center for Global Health. It’s not just the scientific invention of vaccines that saved lives, she says: the “huge social effort to deliver them to people improved health, extended life and kept children alive.”

Some credit for the historical decrease in deadly diseases may go to the disease agents themselves. The microbes that cause rheumatic fever, scarlet fever and a few other diseases may have evolved to become less deadly. That makes sense - it’s no advantage to a parasite to kill its own host, and less-deadly strains may have spread more readily in the human population.

Sudden evolutionary change in microbes can go the other way, too: The pandemic influenza of 1918-19 was a new strain that killed more people than any disease outbreak in history - around 50 million.

In any battle between microbes and mammals, the smart money is on the microbes.

ActiveStyle, Pages 27 on 01/13/2014

Print Headline: From brevity to longevity/Human life span has increased dramatically since the early 1800s, but why?

