By Scott Miners in Vol. 17, No. 2.
In his book Searching for Safety (pages 67-68), Aaron Wildavsky quotes the respected physician and researcher Lewis Thomas, M.D., to substantiate the view that human health and longevity improved not because of medical interventions, such as vaccines, but because of better sanitation and living conditions:
“There is no question that our collective health has improved spectacularly in the past century, but there is a running argument over how this came to be. One thing seems certain: it did not happen because of medicine, or medical science, or even the presence of doctors…. Medical care itself—the visits by doctors in the homes of the sick and the transport of patients to hospitals—could have had no more than marginal effects on either the prevention or reversal of disease during all the nineteenth century and the first third of the twentieth. Indeed, during most of the centuries before this one [the twentieth], doctors often made things worse whenever they did anything to treat disease.
“It was not until the early twentieth century that anything approaching rational therapy emerged for human disease, and it was not until the middle of the century that we came into possession of rational and powerful technologies for the treatment and prevention of infection on a large scale.”
Thomas then reasons that if medical practice does not deserve the credit, it must go elsewhere. "Much of the credit should go to the plumbers and engineers of the western world. The contamination of drinking water by human feces was at one time the single greatest cause of human disease and death for us; it remains so, along with starvation and malaria, for the Third World." Thomas also credits "improvement in agriculture and then in human nutrition, especially in the quantity and quality of food available to young children. As our standard of living improved, we built better shelters, with less crowding and more protection from the cold."
Greg Beattie substantiates Thomas's view in Vaccination: A Parent's Dilemma (Oracle Press, Queensland, Australia, pp. 36-57). He writes that the major diseases on the planet declined significantly from the time the link between sanitation and disease was discovered in the mid-1800s. By the mid-1900s, when vaccines for these diseases were introduced, the diseases were already almost non-existent. Many vaccine advocates thereafter claimed that credit for the decline should be given to vaccines.
It is striking, when viewing the charts in Beattie's book, to see the decline in mortality rates for these diseases since the mid-1800s, when sanitation and nutrition measures were introduced in Europe, Australia, and North America. Generally speaking, by the 1960s, mortality rates had reached very low points. One can see in each mortality graph, such as the one for measles, that the rate of decline shows no significant change after vaccinations were introduced in the late 1950s and early 1960s compared to before.
Measles vaccine was introduced in 1963 in the United States. Prior to 1963, according to the Centers for Disease Control's own records, deaths per year from measles, which had reached a high of over 10,000 in the early 1920s, had already declined to almost none. Neil Miller writes, in Vaccines: Are They Really Safe and Effective? 2nd ed. (p. 27), "In 1957 the number of children who died with measles was about 93 or 94. Children may die 'with measles' but not 'from measles.' The use of fever suppressing compounds, such as aspirin, raised the death rate 'with measles.'" Miller states that the proximate cause of measles deaths was really one or more of the following: unsanitary conditions, improper nutrition or malnutrition, and mistreatment. He also notes that if those patients who died with measles had been given better hygiene, proper nutrition, including vitamin A, and normal parental care, their "complication rates and chances of death" would have been significantly reduced.
Scarlet fever may show the best evidence of being contained by better sanitation, nutrition, and living conditions rather than by a vaccine: deaths from scarlet fever in 1900 numbered over 180, whereas since 1950 there have been fewer than ten per year. No vaccine has ever been developed for scarlet fever.
Deaths from typhoid fever are also virtually non-existent today, although in 1910 there were over 500 in the U.S. No widespread vaccination program has ever been in place for typhoid fever.
Beattie's charts show that there were over 1500 deaths from diphtheria in 1880; by 1920 there were close to 600, and by 1935 the number had declined to 400. This decline continued until the first vaccine was introduced in the late 1930s. By the time the second diphtheria vaccine was introduced, about 1950, the death rate was fewer than 100.
There were close to 1000 deaths from whooping cough in 1880; by 1920 the number had declined to fewer than 400, and it continued to decline through 1940 (to fewer than 200), when the first vaccine was introduced. By the time the second vaccine was introduced in the late 1950s, the death rate was already less than 50.
Polio vaccines were introduced in 1957. By then, deaths from polio had already declined from over 40 in 1951 to almost none. Death rates actually climbed in the early 1960s, after the first polio vaccine was introduced.
This article is from "A Special Section on Vaccines: Questioning Their Necessity and Governmental Mandates," Well Being Journal, Vol. 17, No. 2.