Saturday, March 31, 2007

THE SIXTH SENSE MYTH 10

With the recent tristful and unfortunate (what were the chances of that stingray fatally striking him in the heart with its barbed tail?) death of 44-year-old Australian crocodile hunter and animal naturalist Steve Irwin, I heard several commentators talking about Irwin's "sixth sense" when dealing with wild and funest animals. Quite apart from the obvious failure of his "sixth" or any other sense of danger in this instance, the concept of a sixth sense is what is called a vulgar error. Not vulgar as in obscene, but vulgar in the original Latin sense (vulgus, the masses): an error of the common people as distinguished from informed people.

When speaking of a "sixth sense" the assumption is that there are five other senses – sight, hearing, smell, taste, and touch. In fact, the number of senses is a matter of definition; one can make a case for only one sense or for as many as 13 or more. The sense of touch could encompass all of the senses (sound waves impinging on the eardrum or light waves on the retina, etc.), or, in addition to the five already mentioned, there are the senses of balance, orientation, heat, cold, pain, movement, pressure or resistance, as well as hunger and thirst.

This word misuse reminds me of the current, seemingly ubiquitous expression, especially by television talk show hosts or anchors: "You will not believe this next story." If the assumption is that the audience (more precisely the viewers, since television involves sight as well as sound) will not believe what they are about to be told, then what is the purpose of telling them? It is a rhetorical statement completely devoid of originality and steeped in clichéd repetition. More to the point is the question of why these people, lemming-like, keep making this completely inane statement.

When you next hear someone unthinkingly repeating the cliché of a "sixth sense" you can decide whether to let them continue in their blissful ignorance or risk offending them by pointing out their error. My predilection is always to educate and risk offense.

Friday, March 23, 2007

The Great Mortality & Immorality 9

The Black Death, or as it was more commonly called at the time, the Great Mortality, invaded Europe in 1347-52, apparently from the steppes of Central Asia, becoming the most devastating plague of all time. The population of Europe has been estimated at 75 million at that time, and fully 1/3, at least 25 million people, died from the plague. Even where the mortality rate was not iatrogenically accelerated, effective medical intervention was practically nonexistent. There have been many accounts of this plague, but one of the best is the 2005 book The Great Mortality by John Kelly.

The disease was caused by the bacillus, Yersinia pestis, with the transmission vector mainly being the rat flea, Xenopsylla cheopis, and also likely the human flea, Pulex irritans. The fleas in turn were carried by the black rat, Rattus rattus, although other varmints such as marmots may have also been involved.

There are additional aspects to this story besides the massive deaths and horrendous human suffering caused by the disease, and one is implied by the second part of this article's title.

Often a kernel of truth is contained in some of the most hackneyed sayings, and this episode of disease is no exception. The old proverb "it is an ill wind that blows nobody good" illustrates the point. In the years after the Black Death abated, farm land became more plentiful (there were many fewer people) and the cost of labor in Europe increased for the same reason. Thus economic advantage accrued to the peasantry versus the landed gentry. Even in the cities laborers, especially skilled laborers, saw their wages rise dramatically, along with prices of course. There was an attempt by the ruling classes to impose wage and price controls, but then as now this never works. As economists such as Milton Friedman, Thomas Sowell, and Walter Williams, among many others, have frequently stated, when wage and price controls are imposed below free market levels, shortages result; when wage and price supports are instituted above free market levels, oversupply and overproduction result, which is wasteful and inefficient in a capitalistic economy because it takes resources in labor or goods away from other, more desired production and consumption. This is such a simple concept, yet it is either not understood or not accepted by socialists and other hard left types, to the detriment of economic prosperity.
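For readers who want the proposition made concrete, here is a minimal sketch in Python of a deliberately simple linear supply-and-demand market; every curve and number in it is invented purely for illustration, not drawn from any of the economists cited:

```python
# A toy linear market, purely illustrative: all slopes, intercepts, and
# prices below are invented so the arithmetic is easy to follow.

def quantity_demanded(price: float) -> float:
    """Buyers want less as the price rises."""
    return max(0.0, 100.0 - 2.0 * price)

def quantity_supplied(price: float) -> float:
    """Sellers offer more as the price rises."""
    return max(0.0, 3.0 * price - 25.0)

# Free-market equilibrium: 100 - 2p = 3p - 25  =>  p = 25, q = 50.
for label, price in [("ceiling below market", 15.0),
                     ("free market", 25.0),
                     ("floor above market", 35.0)]:
    d, s = quantity_demanded(price), quantity_supplied(price)
    gap = d - s
    kind = "shortage" if gap > 0 else ("surplus" if gap < 0 else "cleared")
    print(f"{label}: demanded={d:.0f}, supplied={s:.0f}, {kind} ({abs(gap):.0f})")
```

A ceiling set below the market price leaves buyers demanding more than sellers will supply (a shortage of 50 units in this toy example); a floor set above it produces the mirror-image surplus – exactly the post-plague dynamic the ruling classes ran into.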

Book making in the Middle Ages was a labor intensive undertaking which required several copyists. In the high wage post-plague years this cost became prohibitively expensive except for the very richest nobility. Enter Johann Gutenberg with his invention of the movable type printing press around 1450. One could say that the advent of the Black Death led to the mass production of printed material, including books, and a much higher literacy rate in Europe. "Necessity is the mother of invention" might be the watchword.

Once again we have the proposition that "It is not that the primitive mind can not link cause and effect, but that it does so where none exists." In this case it took an ugly and deadly turn. When the Black Death hit Europe in 1347-48, people, especially Christians, looked for more tangible agents, perhaps understandably given that the discovery of microorganisms was centuries in the future. As in the previous, but much less severe, plagues of the 11th, 12th, 13th, and early 14th centuries, the candidates were lepers, beggars, vagrants, and Jews. By far the most popular of these were the Jews. And Christians, by and large, showed little mercy in dealing with what they perceived was the source of their misery by inculpating Jews. One exception was Pope Clement VI, who issued a papal bull on July 6, 1348 stating "it can not be true that the Jews…are the cause…of the plague…for [it] afflicts the Jews themselves." Little attention was paid to this sensible analysis.

Unlike previous campaigns against the non-culpable Jews in Europe, this time banishment and prohibition from entry were not the main actions taken. Real pogroms, including massacres and attempted forced conversion to Christianity, were undertaken. Jews in what are now Austria, England, France, Germany, Italy, Spain, and Switzerland, among other European countries, were targets of indiscriminate killing – the favorite mode being burning at the stake. In at least one documented incident all the Jews in one town – men, women, and children – were crowded into one building which was locked and set on fire. Nobody survived. Apparently the syllogism was: The Great Mortality was bad; Jews do bad things; therefore Jews caused the deadly disease.

Naturally this horrendously deadly torment had to be justified – and so it was. You see, the Jews caused the plague by poisoning water wells all over Europe. There were even a few Jews who confessed to this – after, of course, being put on the rack or subjected to other unbearable torture.

The 20th century was thus not the first in which compurgation was not afforded the Jews; during the Holocaust they were massacred in great numbers. Whether the Nazis slaughtered 5 million, 6 million, or 7 million is of little import – it was undeniably millions. Given their history of persecution it should be natural to cut them a little slack in their dispute with the Arabs over Palestine. The Europeans especially, with their historically deplorable treatment of Jews, should reasonably be less biased against the Israelis vs. the Palestinians than they are.

I am not claiming that the Israelis are completely blameless in the contretemps with the Arabs, although the Arabs did start the 1948 war because they saw the Jews getting stronger with immigration and aid from around the world. When I was in Libya in the late 1950's I met several young Palestinian Arabs who were given 24 hours by the Jews to leave their homes in 1948. They could take with them as much of their personal property as they could manage, but they had to get out. The Israeli ambassador to the U.N., Dan Gillerman, was recently asked in a TV interview in the United States whether the Jews had not forced some Arabs out of Palestine during the 1948 war and he said "no, absolutely not." Ambassador Gillerman is a smart, educated, well informed politician, so he had to know what he said was not true. Yes, I am saying he lied.

Regardless of which side has the more legitimate claim on Palestine, a proposition which could be debated endlessly, it is abundantly clear to all sane, reasonable people that Israel wants to live in peace (and do business, wouldn't you know?) in their portion of the Holy Land, while the Palestinians and other Muslims in the Middle East want the Jews either exterminated or exiled from all Arab lands.

It is singular that the funest crimes committed by Christians against Jews and other perceived non-believers hundreds of years ago in Europe would be absolutely unthinkable today. Can the same be said about Muslims?

Christians in the Middle Ages burned their enemies, real or imagined, at the stake. That does not differ materially, and certainly not in outcome, from Islamo-fascists blowing up innocent people with IED's today. At times Medieval Christians gave Jews the option of converting to Christianity on pain of death; some refused and paid with their lives. One of the stated objectives of current Islamic fanatics is to convert all non-adherents to their brand of Islam and Sharia (Arabic for "well-trodden path") law, on pain of death if they refuse. Sound familiar?

One gets the impression that the madman in Tehran (President Mahmoud Ahmadinejad) would rather forgo any conversion option when it comes to the "Great Satans" of the United States and Israel and proceed directly to termination. If the Iranian regime manufactures nuclear bombs and acquires additional help from North Korea in developing long range missile technology, then they will be in a position to carry out their intimidation if not downright elimination strategy against the United States and Israel. The Iranians would find out too late the meaning of a Pyrrhic victory – no matter how much damage they inflicted they would be completely incinerated in retaliation. Sane people would realize this, but can fanatics, religious or otherwise, be counted on to be rational?

My conclusion is that Christianity has advanced immeasurably in the last few hundred years in terms of compassion, morality, and rationality, while Islam remains stuck where Christianity stood centuries ago.

I would enjoy entertaining dissenting opinions on my theses if they are supported by facts (fons et origo) and apodictic reasoning. Even any lapsus calami pointed out would be appreciated.

Friday, March 16, 2007

MISCELLANEOUS RUMINATIONS 8

Americans spend multi-billions of dollars annually on foods, fads, books, treatments, surgery, séances, and God knows what else to lose or otherwise keep their weight under control. Why not the peripeteia of that? That is to say, spending time and treasure to keep one's weight up. Sounds ridiculous, you think? Then consider: the 18th- and 19th-century English cleric and economist Thomas Malthus (1766-1834), observing that the population was increasing geometrically and the food supply arithmetically, declared in his An Essay on the Principle of Population (1798) that absent deadly pandemics or world-wide wars the world was in for massive starvation. What has happened to the world's population in the centuries since?

The world population has increased approximately as follows: 600 million in 1700; 900 million in 1800; 1.6 billion in 1900; 2.5 billion in 1950; 6.5 billion today. This is certainly a geometric increase, especially in the 20th and 21st centuries. Why then has there not been catastrophic starvation? In fact, this year the World Health Organization declared there are more people at risk for health problems from excessive weight (eating too much food) than from starvation. True enough, millions of people in the third world suffer from a deficiency of nutritious food, yet a shortage of food is not at fault; rather it is a delivery problem coupled with corruption and inefficiency of United Nations and government officials on the receiving end of foodstuffs given by the generous and compassionate West.
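As a quick check on the "geometric increase" claim, the sketch below converts the figures just quoted into compound annual growth rates (taking "today" as 2007, my assumption):

```python
# Population figures as quoted above; "today" is assumed to be 2007.
data = [(1700, 0.6e9), (1800, 0.9e9), (1900, 1.6e9), (1950, 2.5e9), (2007, 6.5e9)]

for (y0, p0), (y1, p1) in zip(data, data[1:]):
    # Compound annual growth rate r solves p1 = p0 * (1 + r)**(y1 - y0).
    r = (p1 / p0) ** (1.0 / (y1 - y0)) - 1.0
    print(f"{y0}-{y1}: {100 * r:.2f}% per year")
# Output: roughly 0.41%, 0.58%, 0.90%, 1.69% -- the rate itself keeps rising.
```

Not only is the population compounding, the compounding rate itself roughly quadrupled over the period, which is why Malthus's geometric-versus-arithmetic framing, whatever its predictive failures, fits the raw numbers.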

It turns out that the supply of food has increased even more than the population in the past 300 years. The term used for this is the "Green Revolution," meaning advanced mechanized farming, packaging, storing, and delivery methods for foodstuffs. Malthus had no way of predicting or even imagining this true revolution.

More recently, from the late 1960's to the 1980's, a screwball named Paul Ehrlich (see my essay Fools, Frauds & Fakes) in his 1968 book The Population Bomb predicted that ½ billion people would die of starvation in the next decade and that there would be food riots in the United States in the 1980's. He opined that the world's population would be down to 1.5 billion by 1985. This Ehrlich book (it sold 20 million copies) and his later book The Population Explosion were best sellers, and he made numerous appearances on national TV programs. Ehrlich was the darling of the left in his day, much as the equally loony and labile Al Gore, who is clearly afflicted with amentia, is today. The next time you hear people such as the egregious Scott Pelley (see my essay Global Warming or Cooling: Which the Hell is it?) and ABC News reporter Bill Blakemore, who stated this past week that he rejects balance in reporting Global Warming as he considers critics of GW merely hacks and lackeys, keep in mind the record of the we-are-all-going-to-starve crowd. Journalistic objectivity is not a strong suit of the paralogistic press (main stream news media).


The latest affright from the wacky left, in the same mode as Global Warming, is that the world's seas will be depleted of harvestable seafood by 2048. These kvetching people apparently will never be satisfied until depression and despair afflict every living soul on earth. A 14 member group of scientists (?) from Dalhousie U. in Halifax, Nova Scotia, Stanford, Scripps Inst. of Oceanography, Stockholm U., and British Columbia U. came up with this carking scenario. University of Washington professor of aquatic & fishery sciences Ray Hilborn responded, "It's just mind-boggling stupid." Indeed.

Even though, as the 16th- and 17th-century English poet and preacher John Donne (1571? – 1631) wrote in his Devotions Upon Emergent Occasions (1624), "any man's death diminishes me, because I am involved in mankind; and therefore never send to know for whom the bell tolls; it tolls for thee," that does not mean all sacrifice of life is futile. In defense of their homelands various countries have paid heavily. Great Britain, France, and the United States collectively suffered approx. 750,000 military deaths in World War II. In the three-month Battle of Moscow in the late fall and winter of 1941, after Hitler decided to invade the Soviet Union, the Russians calamitously had 150,000 more military and civilian deaths than that – a total of 900,000 in just three months. In fact 80% of the German casualties in the war occurred on the Eastern front fighting the Soviets. It is conceivable that without Germany's war with the Soviet Union on their Eastern front, the Allies' invasion in June 1944 may not have succeeded. At the very least the Allies' effort to defeat the Nazis would have been much more costly in time, blood, and treasure.

The United States had 405,000 military deaths during WWII. For every US death Japan had 5; Germany 20; and the USSR 75. The total military and civilian deaths during WWII for the Soviet Union were circa 30 million. There are those who have said that the United States should not have allowed the Soviet Union hegemony over Eastern Europe after WWII. But the Red Army occupied the Eastern European countries at the finish of the war – it was a fait accompli. For the United States to have intervened would have meant another war. Wasn't there killing enough already? Eastern Europeans suffered under the Soviet yoke from the rest of the 1940's through the 1980's, but they are largely free now. Incidentally, when Boris Yeltsin declared the end of the Soviet Union in December 1991, it was 50 years almost to the day since the Soviets counterattacked in the Battle of Moscow.

The second biggest consumer-spending holiday in the United States after Christmas is now Halloween. And this holiday has spread to other countries, especially in Europe. In what may fairly be characterized as continuing anti-American xenophobia, the French government is attempting to minimize if not completely eliminate this, horreur, imported American celebration. However, in an irenic spirit we can reasonably empathize with the Frogs on this one. After all, what children would want to go around on Halloween saying "bonbon ou baton (trick or treat)?" Literally this means "candy or a baton (club)," presumably applied to the noggin of recalcitrant or parsimonious treat givers. The phrase in French is "treat or trick" – butt-backwards to the English expression, but then what would you expect from the Frenchies? Even French children sensibly do not want to make themselves sound ridiculous by repeatedly yelling "bonbon ou baton."

Friday, March 9, 2007

THE FIRST MIRACLE DRUG 7

Before reading the book The Demon Under the Microscope by Thomas Hager it was my impression, and likely that of others as well, erroneous as it turned out, that penicillin was the first 'miracle' anti-bacterial drug. It was not. The following is a brief recounting of what that drug really was.

In the summer of 1924 the teenage son of President Calvin Coolidge developed a blister on his big toe while playing tennis. The wound became infected with what looked like Streptococcus pyogenes, and despite the best medical treatment available at the time he was dead within two weeks. In November 1936 Harvard senior Franklin Delano Roosevelt Jr. was undergoing a round of cocktail and dinner parties and press conferences owing to his engagement to be married. Perhaps due to the stress and tiring activities of all this, young Roosevelt began feeling unwell, with a sore throat and a head which felt like it was full of concrete. Five days later he was admitted to Massachusetts General Hospital. Again, despite the best medical treatment available at that time, Roosevelt's condition steadily deteriorated over the next three weeks as he was diagnosed with some strain of a streptococcus infection. In desperation he was given a new and still unproven drug. In two days there was a complete turnaround and Roosevelt soon left the hospital. The completely different outcome of these two cases was entirely due to this new drug.

Streptococcus infections, or strep for short, were estimated to kill 1.5 million victims per year in Europe and North America alone during the 1920’s. That same number adjusted for today’s population would total more than the current worldwide annual deaths from cholera, dysentery, typhoid, and AIDS combined. Such diseases as scarlet fever, rheumatic fever, and strep throat are caused by various species of streptococcus. With enough magnification all strains of strep look like twisted strings of beads (Latin streptococcus from Greek streptos twisted chain + kokkos berry or seed).

After the horrendous number of deaths of soldiers from infected wounds in WW1, the English, the French, and especially the Germans concentrated on finding medications which would cure these infections. A young German medic during WW1, Gerhard Domagk, experienced firsthand the futility of the medical treatment given to wounded soldiers in fighting infection. After the war he determined to combat this scourge of deadly infection: he graduated from medical school in 1921 and devoted himself to finding cures. After first working at a medical school he went to work in the pharmaceutical department of Friedrich Bayer & Company.

German companies were the world leaders in the manufacturing of dyes for cloth and other materials. A Jewish German chemist, Paul Ehrlich (no relation to the population alarmist of the same name mentioned above), conceived the idea and developed a method of using dyes to stain bacteria so they could be studied under a microscope. He had received a Nobel Prize in medicine for his work on immunity and serum therapy in 1908. Domagk and other German researchers started using dye molecules, especially what was called an azo dye, in attempting to develop an anti-bacterial medication composed of the dye molecule coupled with other molecules. Domagk had six laboratory assistants – all of them women. How is that for diversity in the workplace in the 1920's? After many trials and failures, Domagk tried combining the azo dye molecule with a molecule called sulfanilamide. This medication showed great promise treating some types of strep, and after much further experimentation and refinement a drug called Prontosil was marketed by Bayer. Dr. Gerhard Domagk was named a Nobel Prize recipient by the Swedish committee in 1939. However, Adolf Hitler himself prohibited Domagk from accepting the prize because a couple of years earlier a German opponent of the Nazi regime had been awarded the Nobel Peace Prize by the Norwegian Nobel committee. It made no difference to Hitler that the Swedish committee was different from the Norwegian committee – he wasn't going to have anything to do with the Nobel, period. Domagk was finally invited to Stockholm to receive his Nobel Prize in 1947, when Hitler was history.

This new anti-bacterial drug, Prontosil, and further similar drugs developed by the Germans, French, and English worked almost miraculously in laboratory tests on mice and rabbits and then on human patients. There was no such thing as a clinical trial in the 1930's, so the patients became guinea pigs by default. Still, the cure rates were like nothing seen in medicine before. The one puzzling aspect was that while the results in vivo, in the laboratory and with human patients, were spectacular, in vitro the drug was a dismal failure – the pathogens in test tubes were not affected by it.

The Germans, and as a result the French and English as well, were so hung up on using dye molecules that it was practically by accident that a group of test mice were treated with pure sulfanilamide in a French lab. When the sulfanilamide alone worked as effectively as the previous drugs linked to the azo dye, there was shock and, as expected, doubt in laboratories around Europe. After the results were confirmed by others, the rush was on to turn out variations of sulfa-based medications. One mystery was solved by the discovery that sulfa was the active anti-bacterial agent: sulfa had to be released from the rest of the Prontosil molecule in order to become active. This could happen in the body of an animal, where enzymes could split the Prontosil molecule into two pieces, releasing pure sulfa as the medicine, but not in a test tube, which contained no enzymes.

By 1937 sulfa drugs were being widely used in the United States, and therein lies a cautionary tale. In Tulsa, Oklahoma, Dr. James Stephenson observed that many sick patients, a majority of them children, began showing up in Tulsa hospitals, and later in hospitals in other cities around the country. A number of them began dying of renal failure. Sulfa pills, capsules, injectable solutions, and powders had recently hit the market from a variety of pharmaceutical manufacturers. Every drug store in Tulsa was selling sulfa to anyone who asked for it. Dr. Stephenson traced the problem to a proprietary drug called Elixir Sulfanilamide produced by the Massengill Company. Sulfa is difficult to dissolve; alcohol does not work well, nor do many of the common medicinal solvents. The head chemist for Massengill, Harold Watkins, found that diethylene glycol, an industrial solvent, worked very well to dissolve the sulfa. As it turned out, it also worked very well in bringing on renal failure and death – 67 deaths were confirmed. As difficult as it is to believe today, the Massengill Company ran no tests on laboratory animals, nor did they do any clinical tests with their medicine. The FDA had been established in 1906, but was of minimal effect in regulating the medical field. As with other issues, perhaps the pendulum has swung too much the other way today. Although Watkins was slow in admitting blame for the sickness and death of the people who took his medicine, he clearly realized his culpability – he committed suicide eight months later by putting a loaded handgun to his head and squeezing the trigger.

Sulfa drugs generally had no serious side effects (except when mixed with poison, e.g. diethylene glycol), though I can attest there were some relatively minor ones. When I was a teenager I was given a sulfa drug for some forgotten illness, and when I got up from bed to go to the bathroom I felt so dizzy that I almost, but not quite, fell to the floor. The medication was discontinued and, I believe logic would dictate, I recovered anyway.

In 1939 a drug called sulfapyridine was developed in the United States as a cure for pneumonia. Within a few years this drug was saving the lives (why not say postponing the deaths? Isn't that more accurate?) of 33,000 pneumonia patients each year in the United States alone. By 1942 at least 3600 sulfa derivatives had been synthesized and studied, and more than 30 of them had been sold in the United States. Sulfapyridine was, gram for gram, 18 times more powerful than Prontosil; sulfathiazole was 54 times more effective; sulfadiazine was 100 times more effective. The number of diseases that could be treated with the new sulfa drugs was also growing. Sulfathiazole worked as well as the other drugs against strep and was also effective against staphylococcal infections.

There seemed to be an ever increasing spectrum of diseases which could be treated successfully with sulfa drugs. The one thing unexplained was how the medicine worked, but that was discovered eight years after Prontosil was formulated. Sulfa was less a magic bullet than a clever imposter. It was observed that sulfa never worked as well when there was dead tissue around or a lot of pus, as in uncleaned wounds. A yeast extract called para-aminobenzoic acid (PABA), a chemical involved in bacterial metabolism, turned out to have astonishing anti-sulfa abilities. Some bacteria can make their own PABA; others cannot and have to find it in their environment. For these bacteria PABA is an essential metabolite; if they cannot find it, they starve. The sulfa and PABA molecules are so similar that an enzyme critical in keeping the bacteria healthy mistakes one for the other, binding sulfa instead of PABA. But since sulfa cannot be metabolized, the enzyme, with sulfa stuck to it, becomes useless. The bacteria, denied a nutrient they need, eventually starve to death.
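What Hager describes is what biochemists call competitive inhibition, and a minimal sketch makes the pus observation above quantitative. The standard Michaelis-Menten form and all the numbers below are textbook illustration, my assumptions rather than anything taken from the book:

```python
# Competitive inhibition sketch: the inhibitor (sulfa) competes with the
# substrate (PABA) for the enzyme.  All constants are invented for clarity.

def rate(paba: float, sulfa: float, vmax: float = 1.0,
         km: float = 1.0, ki: float = 1.0) -> float:
    """Michaelis-Menten rate with a competitive inhibitor:
    v = Vmax*[S] / (Km*(1 + [I]/Ki) + [S])."""
    return vmax * paba / (km * (1.0 + sulfa / ki) + paba)

print(rate(paba=1.0, sulfa=0.0))   # no sulfa: 0.50 -- enzyme works normally
print(rate(paba=1.0, sulfa=9.0))   # sulfa flood: 0.09 -- bacterium starves
print(rate(paba=50.0, sulfa=9.0))  # PABA flood (pus, dead tissue): 0.83
```

The third line is the clinical observation in miniature: dead tissue and pus are rich in PABA, the genuine metabolite outcompetes the imposter, and the drug loses its effect.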

Sulfa drugs of course had not yet been discovered by WW1, so acute respiratory diseases, including influenza, pneumonia, and bronchitis, killed circa 50,000 U.S. soldiers. In WWII, with twice as many men and women in uniform, only 1265 died from these diseases. The main difference was the widespread use of sulfa drugs in WWII. Wartime sulfa production was more than 4500 tons in 1943 alone, enough to treat more than 100 million patients. In December 1943 British Prime Minister Winston Churchill undertook a trip that was long and exhausting for a 69-year-old heavy smoker and legendary drinker: to Cairo, then to Teheran to meet with wartime allies Roosevelt and Stalin, back to Cairo to meet again with Roosevelt, and finally on to Tunis for a couple of days' rest at Eisenhower's villa. When he arrived he had a sore throat and a fever of 101ºF. X-rays revealed that he had contracted pneumonia. He was given sulfa drugs, but suffered two bouts of atrial fibrillation and an episode of cardiac failure. Finally the medication kicked in and Churchill's temperature returned to normal. Two weeks after he became sick he flew to Marrakech, Morocco and then home. For a while it was touch and go for one of the two leaders of the free world, with sulfa drugs being the determinant.

I would be remiss if I did not mention what everyone reading this must be thinking about by now: what about the problem of drug resistance with sulfa? The answer is what you might expect. With the massive overuse of sulfa drugs, as there was and is with penicillin and other modern antibiotics, resistance became an increasing problem, lowering their effectiveness. Killing off non-resistant bacteria and thereby leaving behind the resistant strains is still an unsolved problem for these disease fighting medications.

Incidentally, in 1928 British medical researcher Alexander Fleming noticed that a mold which had contaminated one of his bacterial plates had the odd effect of clearing a zone around itself in which the bacteria did not grow. Fleming logically thought that the mold was releasing some sort of substance that hindered the growth of the bacteria. Purifying an amount of the active substance sufficient to conduct in vivo tests proved so difficult that he largely dropped the research and turned to – sulfa. He returned to penicillin years later and, along with two other British medical researchers, received the Nobel Prize for medicine in 1945.

Friday, March 2, 2007

BEER CONSUMPTION & OTHER LITTLE ICE AGE PHENOMENA 6

In a 2006 History Channel TV program, food & wine expert Joseph H. Coulombe gave the following statistics:

23 gallons of beer; 1 gallon of hard liquor; and 2 gallons of wine are consumed yearly on average by each adult American.

88% of wine is drunk by 11% of Americans, most on the East or West Coasts.

Various wine, beer, or alcohol producing institutes give the same figures of approximately two gallons of wine and 1 gallon of liquor and from just over 20 gallons of beer, to 21 gallons, to 22 gallons depending upon the year cited and the particular survey quoted. One survey estimated that 88% of wine is consumed by 16% of Americans.

Wine drinking is increasing in the United States - 231 million cases of wine were consumed in 2001 with an increase of 5½% in 2004. The projected increased total wine consumption by 2010 is 300 million cases.

In 1997 adult Americans consumed an average of 22 gallons of beer, 23½ gallons of coffee, 53 gallons of soft drinks, and 2 gallons of wine. The French consumed an average of 64 liters (17 gallons) of wine while Germans drank an average of 120 liters (about 32 gallons) of beer.

Beer accounts for approx. 88% of all alcohol consumed in the United States – 11 times as much beer is consumed as wine.


Now to the reasons America is primarily a beer drinking nation:

The period called The Little Ice Age occurred from the early 14th to the middle of the 19th centuries. It was universal, especially in the northern hemisphere, but not uniformly cold decade after decade and century after century. Instead it was on average colder than the centuries preceding and following it, with some years, including consecutive ones, of bone-chilling cold followed by a few temperate years. Still, there were long periods of cold and often wet weather, such that starting in the 14th century vineyards in Northern Europe died out, to the extent that grains were substituted for grapes to brew beer and distill hard alcohol. By the time of the large immigration of Europeans to America in the 17th and 18th centuries, Northern Europeans had been brewing beer and distilling liquor instead of making wine for two centuries. Thus they took their beer drinking culture and their beer and alcohol making techniques with them to America. And during this embryonic habit forming period on the new continent the overwhelming majority of immigrants were from Northern Europe – England, Scotland, Ireland, Germany, Holland, Poland, and Sweden, primarily. Some, but relatively few, came from the wine drinking Mediterranean Basin.

Although not a few of the Founding Fathers (possibly even the Founding Mothers) enjoyed drinking imported wine, Thomas Jefferson brewed beer at Monticello and George Washington was the largest distiller of rye whiskey in the young country.

There were many other historical events which were caused or enhanced by the Little Ice Age. Napoleon Bonaparte invaded Russia in the fall of 1812 with a force of circa 600,000. The winter was so severe that by the time he retreated his army was reduced to 130,000, due mostly to starvation and disease. Temperatures dropped to the minus 30's ºF, where snow crystals floated in the air instead of falling to the ground due to the high density of the cold air. It was described by one of the officers as a surrealistic world. There are accounts of soldiers cutting meat off the flanks of the wagon-pulling horses. The horses' skins were frozen and they were so numb that they did not feel it. The blood from the wounds froze so quickly there was little blood loss. In 2006 the U.S. Congress passed a law prohibiting the slaughter of horses in this country for the export of horse meat. Two of these plants are near here in North Texas. I don't know how the slicing off of pieces of meat from live working horses would be perceived today, but I suppose allowances should be made for starving soldiers.

When Napoleon's severely depleted army entered Vilnius, Lithuania, it was down to approximately 40,000. In 2001 a mass grave containing 3000 skeletons was discovered in Vilnius; the remains were determined to be almost two hundred years old – all from Napoleon's army. Starvation, typhus, and gangrene were the main causes of death of thousands of soldiers in the city. Only 3000 to 4000 of the original 600,000-man army eventually made it back to France. One of those who made it back was Napoleon himself. On June 18, 1815 the Duke of Wellington settled Napoleon's hash, so to speak, at the Battle of Waterloo in Belgium.

In the year 1815, when James Madison was president of the United States (Indiana would be admitted as the 19th state the following year), the single most powerful volcanic eruption ever recorded occurred on April 10th at Mt. Tambora on the Indonesian island of Sumbawa. Mt. Tambora was a 13,000 ft. volcano, long thought extinct, until that fateful day when 4200 ft. of the top blew off, throwing 36 cubic miles of ash, sulfur gas, and debris into the atmosphere to a height of 15½ miles. Immediate fatalities were 70,000 people on that and adjacent islands, with the total going to 90,000 not long after. This eruption put 100 times more ash in the atmosphere than the 9000 ft. Mt. St. Helens eruption in Washington State in 1980 and had four times the energy of Krakatoa in 1883. Worldwide climate was affected for years.

That winter Hungary had brown-colored snow, and in Puglia (a region in southern Italy), where it seldom snows, the snow was red-tinted. Of course these oddities were caused by the volcanic eruption of the previous spring, thousands of miles away.

The New World did not escape the debilitating effects of the Little Ice Age. During what was called the Year Without a Summer, New England had 5 consecutive days of snow in June 1816. There was also snow during July and August; in fact 75% of the corn crop was ruined by the cold temperatures and excessive moisture. There were reports of birds falling out of the sky dead from the cold. Finally many New Englanders had had enough of the continuing cold and inclement weather, and they greatly accelerated the country's westward migration. They did not realize their problems originated half a world away.

Also in the summer of 1816, the British poets Percy Bysshe Shelley and Lord Byron, along with Shelley's young bride-to-be Mary (the future Mary Shelley), vacationed along Lake Geneva in Switzerland. The weather was so wet and cold that they stayed indoors most of the time and decided to see who could write the most terrifying and gripping tale. Mary won with her novel Frankenstein.

The Black Death which hit Europe in 1347 to 1351 was spread by rats carrying fleas infected with bubonic plague (the Yersinia pestis bacillus). At least 25 million people, 1/3 of the European population, died during this period. This high fatality rate was greatly increased by the population having been left in a weakened state owing to crop failures brought on by the cold and wet weather at the start of the Little Ice Age.

There is a famous painting by artist Emanuel Leutze of George Washington crossing the Delaware River on the way to the Battle of Trenton, New Jersey on Christmas night of 1776. This painting shows the river choked full of blocks of ice. Unlike some renditions of historical events, this one is an accurate depiction of the weather conditions at that time. Today the Delaware River seldom if ever freezes, but then the world was in the grip of the Little Ice Age.

The Medieval Climate Optimum, aka the Medieval Warm Period, occurred from the 10th to the 14th centuries and was 4º - 7º F warmer than the period preceding it. During this period wine grapes were grown in profusion as far north as southern Britain. It was also when the Vikings settled in Greenland. Today Greenland and Iceland could more accurately have their names reversed, though Iceland has the advantage of being centered over a volcanic "hot spot" and therefore has many hot springs and geysers to provide geothermal heating.

However, around 1000 A.D., when the Vikings immigrated to Greenland, it was properly named. The Vikings subsisted on a diet of 80% land-grazing animals (mostly sheep, some cattle) and 20% cod from the sea. After the advent of the Little Ice Age in the 14th century, as the climate grew progressively colder, the Vikings' diet shifted to 20% land animals and 80% fish as the sheep and cattle died out from a lack of grass. Eventually even the temperature-sensitive cod stopped coming to the North Atlantic. The Vikings, unlike the Inuit natives, were unable to adapt to the changing climate, and coupled with their essential trade with Northern Europe being interrupted by impassable pack ice, they simply died out. The Vikings in Greenland were done in by The Little Ice Age.

What then caused The Little Ice Age? First, some perspective: the last glaciation period, which ended about 12,000 years ago, was a more extreme period of cold, with an average drop in temperature of 9ºF. The Little Ice Age had an estimated average temperature 4º - 5º F cooler than today. This does not seem like much, but it is enough to cause major climate changes. There have also been periods of glaciation with much colder temperatures going back millions of years, when glaciers covered the midsection of America as far south as Indiana, Illinois, and Ohio, as well as Northern Europe.

The short answer to what caused The Little Ice Age is that nobody knows. Naturally there are a number of theories. One is that, for reasons not understood, the sun undergoes periodic episodes of changing energy output; there is a period of low sunspot activity between 1645 and 1715, called the Maunder Minimum, which correlates with low solar energy output. Another is that there was an unusually active period of volcanism (an average of five major eruptions per century vs. normally one or less) during The Little Ice Age which blocked heat from the sun. Still another is the Thermohaline Circulation, or Atlantic Conveyor Belt, theory. It holds that currents carry water from the tropical regions northward, transferring heat to northern latitudes. These waters cool as they reach northern latitudes, becoming denser, and therefore sink, effectively forming a conveyor belt: the warmer water of the Gulf Stream flows northward at the surface while the cooler water below travels southward, as equilibrium tends to balance the northern and equatorial water levels. During the relatively warm period preceding The Little Ice Age the northern glaciers melted, and with the mixing in of less dense fresh water from the glaciers, the near-surface waters did not sink, thereby shutting down the conveyor effect. Without the heat transfer from the tropics to the northern latitudes, The Little Ice Age was initiated. Perhaps there was a combination of these effects, but whatever it was, neither The Little Ice Age nor the preceding warm period was due to individual or industrial pollution.
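The density claim at the heart of the conveyor belt theory is easy to make concrete. The sketch below uses a linearized equation of state for seawater; the coefficients are rough textbook magnitudes and the three water samples are my own invented examples, so take it as an illustration of the mechanism, not oceanography:

```python
# Linearized seawater density: colder and saltier water is denser.
# Coefficients are rough textbook magnitudes, used only for illustration.
RHO0 = 1027.0        # reference density, kg/m^3
ALPHA = 2.0e-4       # thermal expansion per degC (approximate)
BETA = 8.0e-4        # haline contraction per g/kg of salinity (approximate)
T_REF, S_REF = 10.0, 35.0

def density(temp_c: float, salinity: float) -> float:
    return RHO0 * (1.0 - ALPHA * (temp_c - T_REF) + BETA * (salinity - S_REF))

print(density(20.0, 36.0))  # warm salty Gulf Stream surface water: ~1025.8
print(density(2.0, 36.0))   # same water cooled in the north: ~1029.5 -> sinks
print(density(2.0, 32.0))   # cooled but freshened by glacial melt: ~1026.2
```

Cooling the salty water makes it the densest of the three, so it sinks and keeps the belt turning; dilute it with glacial meltwater and it ends up barely denser than the warm tropical surface water, stays on top, and the conveyor stalls, which is the proposed trigger for the cooling.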