Friday, March 23, 2007

The Great Mortality & Immorality 9

The Black Death, or as it was more commonly called at the time, the Great Mortality, invaded Europe in 1347-52, apparently from the steppes of Central Asia, becoming the most devastating plague of all time. The population of Europe at that time has been estimated at 75 million, and fully 1/3, at least 25 million people, died from the plague. Even if the mortality rate was not greatly iatrogenically accelerated, effective medical intervention was practically nonexistent. There have been many accounts of this plague, but one of the best is the 2005 book The Great Mortality by John Kelly.

The disease was caused by the bacillus Yersinia pestis, with the transmission vector mainly being the rat flea, Xenopsylla cheopis, and also likely the human flea, Pulex irritans. The fleas in turn were carried by the black rat, Rattus rattus, although other varmints such as marmots may also have been involved.

There are additional aspects to this story besides the massive death toll and horrendous human suffering caused by the disease; one is implied by the second part of this article's title.

Often a kernel of truth is contained in some of the most hackneyed sayings, and this episode of disease is no exception. The old proverb "it is an ill wind that blows nobody good" illustrates the point. In the years after the Black Death abated, farm land became more plentiful (there were many fewer people) and the cost of labor in Europe increased for the same reason. Thus economic advantage accrued to the peasantry versus the landed gentry. Even in the cities laborers, especially skilled ones, saw their wages rise dramatically, along with prices of course. There was an attempt by the ruling classes to impose wage and price controls, but then as now this never works. As frequently stated by economists such as Milton Friedman, Thomas Sowell, and Walter Williams, among many others, the proposition is this: when wage and price controls are imposed below free market levels, shortages result; when wage and price supports are instituted above free market levels, oversupply and overproduction result, which is wasteful and inefficient in a capitalistic economy because it takes resources in labor or goods away from other more desired production and consumption. This is such a simple concept, yet it is either not understood or not accepted by socialists and other hard left types, to the detriment of economic prosperity.
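For readers who like to see the proposition worked out, here is a minimal sketch in Python of a toy linear supply and demand market. All the numbers are illustrative assumptions of mine, not estimates of any real market, medieval or modern.

# A toy linear market illustrating the economists' proposition above.
# All coefficients are illustrative assumptions, not real-market data.

def quantity_demanded(price):
    return max(0, 100 - 2 * price)   # buyers want less as price rises

def quantity_supplied(price):
    return max(0, 4 * price - 20)    # sellers offer more as price rises

# Free market: price adjusts until supply meets demand (here, price = 20).
free_price = 20
assert quantity_demanded(free_price) == quantity_supplied(free_price) == 60

# A ceiling set below the market price (a post-plague wage cap): shortage.
capped = 15
print("shortage:", quantity_demanded(capped) - quantity_supplied(capped))   # 70 - 40 = 30

# A floor set above the market price (a support): surplus / overproduction.
floor = 25
print("surplus:", quantity_supplied(floor) - quantity_demanded(floor))      # 80 - 50 = 30

A cap below the market-clearing price leaves buyers wanting 30 more units than sellers will provide; a floor above it does the reverse, exactly the shortage-and-surplus pattern described above.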

Book making in the Middle Ages was a labor-intensive undertaking which required several copyists. In the high-wage post-plague years this cost became prohibitively expensive except for the very richest nobility. Enter Johann Gutenberg with his invention of the moveable-type printing press in 1453. One could say that the advent of the Black Death led to mass production of printed material, including books, and a much higher literacy rate in Europe. "Necessity became the mother of invention" might be the watchword.

Once again we have the proposition that "It is not that the primitive mind cannot link cause and effect, but does so where none exists." In this case it took an ugly and deadly turn. When the Black Death hit Europe in 1347-48, perhaps understandably with the discovery of microorganisms being centuries in the future, people, especially Christians, looked for more tangible agents. As in the previous, but much less severe, plagues of the 11th, 12th, 13th, and early 14th centuries, the candidates were lepers, beggars, vagrants, and Jews. By far the most popular of these were the Jews. And Christians, by and large, showed little mercy in dealing with what they perceived was the source of their misery by inculpating Jews. One exception was Pope Clement VI, who issued a Papal Bull on July 6, 1348 stating "it cannot be true that the Jews…are the cause…of the plague…for [it] afflicts the Jews themselves." Little attention was paid to this sensible analysis.

Unlike previous campaigns against the non-culpable Jews in Europe, this time banishment and prohibition from entry were not the main actions taken. Real pogroms, including massacres and attempted forced conversion to Christianity, were undertaken. Jews in what are now Austria, England, France, Germany, Italy, Spain, and Switzerland, among other European countries, were targets of indiscriminate killing, the favorite mode being burning at the stake. In at least one documented incident all the Jews in one town, men, women, and children, were crowded into one building which was locked and set on fire. Nobody survived. Apparently the syllogism was: the Great Mortality was bad; Jews do bad things; therefore Jews caused the deadly disease.

Naturally this horrendously deadly torment had to be justified – and so it was. You see the Jews caused the plague by poisoning water wells all over Europe. There were even a few Jews who confessed to this. After, of course, they were put on the rack or subjected to other unbearable torture.

Nor was it only in the Middle Ages that compurgation was not afforded the Jews; in the 20th century they were massacred in great numbers during the Holocaust. Whether the number the Nazis slaughtered was 6 million or 5 million or 7 million is of little import; it was undeniably millions. Given their history of persecution it should be natural to cut them a little slack in their dispute with the Arabs over Palestine. The Europeans especially, with their historically deplorable treatment of Jews, should reasonably be less biased against the Israelis vis-à-vis the Palestinians than they are.

I am not claiming that the Israelis are completely blameless in the conflict with the Arabs, although the Arabs did start the 1948 war because they saw the Jews getting stronger with immigration and aid from around the world. When I was in Libya in the late 1950's I met several young Palestinian Arabs who were given 24 hours by the Jews to leave their homes in 1948. They could take with them as much of their personal property as they could manage, but they had to get out. The Israeli ambassador to the U.N., Dan Gillerman, was recently asked in a TV interview in the United States whether the Jews had not forced some Arabs out of Palestine during the 1948 war and he said "no, absolutely not." Ambassador Gillerman is a smart, educated, well informed politician, so he had to know what he said was not true. Yes, I am saying he lied.

Regardless of which side has the more legitimate claim on Palestine, a proposition which could be debated endlessly, it is abundantly clear to all sane, reasonable people that Israel wants to live in peace (and do business, wouldn't you know?) in its portion of the Holy Land, while the Palestinians and other Muslims in the Middle East want the Jews either exterminated or exiled from all Arab lands.

It is singular that the funest crimes committed by Christians against Jews and other perceived non-believers hundreds of years ago in Europe would be absolutely unthinkable today. Can the same be said about Muslims?

Christians in the Middle Ages burned their enemies, real or imagined, at the stake. That does not differ materially, and certainly not in outcome, from Islamo-fascists blowing up innocent people with IED's today. At times Medieval Christians gave Jews the option of converting to Christianity on pain of death if they refused, which some did. One of the stated objectives of current Islamic fanatics is to convert all non-adherents to their brand of Islam and Sharia (Arabic for "well-trodden path") law, on pain of death if they refuse. Sound familiar?

One gets the impression that the madman in Tehran (President Mahmoud Ahmadinejad) would rather forgo any conversion option when it comes to the "Great Satans" of the United States and Israel and proceed directly to termination. If the Iranian regime manufactures nuclear bombs and acquires additional help from North Korea in developing long range missile technology, then they will be in a position to carry out their intimidation if not downright elimination strategy against the United States and Israel. The Iranians would find out too late the meaning of a Pyrrhic victory: no matter how much damage they inflicted, they would be completely incinerated in retaliation. Sane people would realize this, but can fanatics, religious or otherwise, be counted on to be rational?

My conclusion is that Christianity has advanced immeasurably in the last few hundred years in terms of compassion, morality, and rationality, while Islam remains stuck where Christianity stood centuries ago.

I would enjoy entertaining dissenting opinions on my theses if they are supported by facts (fons et origo) and apodictic reasoning. Even any lapsus calami pointed out would be appreciated.

Friday, March 16, 2007

MISCELLANEOUS RUMINATIONS 8

Americans spend multi-billions of dollars annually on foods, fads, books, treatments, surgery, séances, and God knows what else to lose or otherwise keep their weight under control. Why not the peripeteia of that? That is to say, spending time and treasure to keep one's weight up. Sounds ridiculous, you think? Then consider: the 18th- and 19th-century English cleric and economist Thomas Malthus (1766-1834), observing that the population was increasing geometrically and the food supply arithmetically, declared in his Essay on the Principle of Population (1798) that absent deadly pandemics or worldwide wars the world was in for massive starvation. What has happened to the world's population in the centuries since?

The world population has increased approximately as follows: 600 million in 1700; 900 million in 1800; 1.6 billion in 1900; 2.5 billion in 1950; 6.5 billion today. This is certainly a geometric increase, especially in the 20th and 21st centuries. Why then has there not been catastrophic starvation? In fact this year the World Health Organization declared there are more people at risk for health problems from excessive weight (eating too much food) than from starvation. True enough, millions of people in the third world suffer from a deficiency of nutritious food, yet a shortage of food is not at fault; rather it is a delivery problem coupled with corruption and inefficiency of United Nations and government officials on the receiving end of foodstuffs given by the generous and compassionate West.
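For the numerically inclined, here is a back-of-the-envelope sketch in Python using only the population figures quoted above. It computes the implied average annual growth rate in each interval and shows that the rate itself keeps rising, which is what makes the increase geometric (Malthus expected food to grow only arithmetically, by a fixed amount per period).

# Implied annual growth rates from the population figures quoted above.
population = {1700: 0.6e9, 1800: 0.9e9, 1900: 1.6e9, 1950: 2.5e9, 2007: 6.5e9}

years = sorted(population)
for start, end in zip(years, years[1:]):
    ratio = population[end] / population[start]
    rate = ratio ** (1.0 / (end - start)) - 1   # average annual growth rate
    print(f"{start}-{end}: {rate * 100:.2f}% per year")

# Prints roughly 0.41%, 0.58%, 0.90%, and 1.69% per year: not merely
# growth, but accelerating growth, as the essay says.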

It turns out that the supply of food has increased even more than the population in the past 300 years. The term used for it is the "Green Revolution," meaning advanced mechanized farming, packaging, storing, and delivery methods for foodstuffs. Malthus had no way of predicting or even imagining this true revolution.

More recently, in the late 1960's to the 1980's, a screwball named Paul Ehrlich (see my essay Fools, Frauds & Fakes) in his 1968 book The Population Bomb predicted that ½ billion people would die of starvation in the next decade and that there would be food riots in the United States in the 1980's. He opined that the world's population would be 1.5 billion by 1985. This Ehrlich book (it sold 20 million copies) and his 1991 book The Population Explosion were best sellers and he made numerous appearances on national TV programs. Ehrlich was the darling of the left in his day much as the equally loony and labile Al Gore, who is clearly afflicted with amentia, is today. The next time you hear people such as the egregious Scott Pelley (see my essay Global Warming or Cooling: Which the Hell is it?) and ABC News reporter Bill Blakemore, who stated this past week that he rejects balance in reporting on Global Warming as he considers critics of GW merely hacks and lackeys, keep in mind the record of the we-are-all-going-to-starve crowd. Journalistic objectivity is not a strong suit of the paralogistic press (mainstream news media).


The latest affright by the wacky left, quo modo, as with Global Warming, is that the world's seas will be depleted of harvestable seafood by 2050. These kvetching people apparently will never be satisfied until depression and despair afflict every living soul on earth. A 14-member group of scientists (?) from Halifax U. in Nova Scotia, Stanford, Scripps Inst. of Oceanography, Stockholm U., and British Columbia U. came up with this carking scenario. University of Washington professor of aquatic & fishery sciences Ray Hilborn responded, "It's just mind-boggling stupid." Indeed.

Even though, as the 16th- and 17th-century English poet and preacher John Donne (1571?-1631) wrote in his Devotions Upon Emergent Occasions (1624), "Every man's death diminishes me because I am involved in mankind, therefore never send to know for whom the bell tolls; it tolls for thee," that does not mean all sacrifice of life is futile. In defense of their homeland various countries have paid heavily. Great Britain, France, and the United States collectively suffered approx. 750,000 military deaths in World War II. In the three-month Battle of Moscow in the late fall and winter of 1941, after Hitler decided to invade the Soviet Union, the Russians calamitously had 150,000 more military and civilian deaths than that: a total of 900,000 in just three months. In fact 80% of the German casualties in the war occurred on the Eastern front fighting the Soviets. It is conceivable that without Germany's war with the Soviet Union on their Eastern front, the Allies' invasion in June 1944 may not have succeeded. At the very least the Allies' effort to defeat the Nazis would have been much more costly in time, blood, and treasure.

The United States had 405,000 military deaths during WWII. For every US death Japan had 5; Germany 20; and the USSR 75. The total military and civilian deaths during WWII for the Soviet Union were circa 30 million. There are those who have said that the United States should not have allowed the Soviet Union hegemony over Eastern Europe after WWII. But the Red Army occupied the Eastern European countries at the finish of the war; it was a fait accompli. For the United States to have intervened would have meant another war. Wasn't there killing enough already? Eastern Europeans suffered under the Soviet yoke from the rest of the 1940's through the 1980's, but they are largely free now. Incidentally, when Boris Yeltsin declared the end of the Soviet Union on 12/06/1991, the date was 50 years to the day after the Soviets counterattacked in the Battle of Moscow.

The second biggest consumer spending holiday in the United States after Christmas is now Halloween. And this holiday has spread to other countries, especially in Europe. In what may fairly be characterized as continuing anti-American xenophobia, the French government is attempting to minimize if not completely eliminate this, horreur, imported American celebration. However, in an irenic spirit we can reasonably empathize with the Frogs on this one. After all, what children would want to go around on Halloween saying "bonbon ou baton (trick or treat)?" Literally this means "candy or a baton (club)," presumably applied to the noggin of recalcitrant or parsimonious treat givers. The phrase in French is "treat or trick," butt-backwards to the English expression, but then what would you expect from the Frenchies? Even French children sensibly do not want to make themselves sound ridiculous by repeatedly yelling "bonbon ou baton."

Friday, March 9, 2007

THE FIRST MIRACLE DRUG 7

Before reading the book The Demon Under the Microscope by Thomas Hager it was my impression, and likely that of others as well, erroneous as it turned out, that penicillin was the first 'miracle' anti-bacterial drug. It was not. The following is a brief recounting of what really was that drug.

In the summer of 1924 the teenage son of President Calvin Coolidge developed a blister on his big toe while playing tennis. The wound became infected with what looked like Streptococcus pyogenes and despite the best medical treatment available at the time he was dead within two weeks. In November 1936 Harvard senior Franklin Delano Roosevelt Jr. was undergoing a round of cocktail and dinner parties and press conferences owing to his engagement to be married. Perhaps due to the stress and tiring activities of all this, young Roosevelt began feeling unwell with a sore throat and a head which felt like it was full of concrete. Five days later he was admitted to Massachusetts General Hospital. Again despite the best medical treatment available at that time, Roosevelt's condition steadily deteriorated over the next three weeks as he was diagnosed with some strain of streptococcus infection. In desperation he was given a new and still unproven drug. In two days there was a complete turnaround and Roosevelt soon left the hospital. The completely different outcome of these two cases was entirely due to this new drug.

Streptococcus infections, or strep for short, were estimated to kill 1.5 million victims per year in Europe and North America alone during the 1920’s. That same number adjusted for today’s population would total more than the current worldwide annual deaths from cholera, dysentery, typhoid, and AIDS combined. Such diseases as scarlet fever, rheumatic fever, and strep throat are caused by various species of streptococcus. With enough magnification all strains of strep look like twisted strings of beads (Latin streptococcus from Greek streptos twisted chain + kokkos berry or seed).

After the horrendous number of deaths of soldiers from infected wounds in WWI, the English, the French, and especially the Germans concentrated on finding medications which would cure these infections. A young German medic during WWI, Gerhard Domagk, experienced firsthand the futility of the medical treatment given to wounded soldiers in fighting infection. After the war he determined to combat this scourge of deadly infection, graduating from medical school in 1921 and devoting himself to finding cures. After first working at a medical school he went to work in the pharmaceutical department of Friedrich Bayer & Company.

German companies were the world leaders in the manufacture of dyes for cloth and other materials. A Jewish German chemist, Paul Ehrlich, conceived the idea and developed a method of using dyes to stain bacteria so they could be studied under a microscope. He had received a Nobel Prize in medicine for his work on immunity and serum therapy in 1908. Domagk and other German researchers started using dye molecules, especially what were called azo dyes, in attempting to develop an anti-bacterial medication composed of the dye molecule coupled with other molecules. Domagk had six laboratory assistants, all of them women. How is that for diversity in the workplace in the 1920's? After many trials and failures, Domagk tried combining the azo dye molecule with a molecule called sulfanilamide. This medication showed great promise treating some types of strep, and after much further experimentation and refinement a drug called Prontosil was marketed by Bayer. Dr. Gerhard Domagk was named a Nobel Prize recipient by the Swedish committee in 1939. However, Adolf Hitler himself prohibited Domagk from accepting the prize because a couple of years earlier a German opponent of the Nazi regime had been awarded the Nobel Peace Prize by the Norwegian Nobel committee. It made no difference to Hitler that the Swedish committee was different from the Norwegian committee; he wasn't going to have anything to do with Nobel, period. Domagk was finally invited to Stockholm to receive his Nobel Prize in 1947, when Hitler was history.

This new anti-bacterial drug, Prontosil, and further similar drugs developed by the Germans, French, and English worked almost miraculously in laboratory tests on mice and rabbits and then on human patients. There was no such thing as a clinical trial in the 1930's, so the patients became by default guinea pigs. Still, the cure rates were unlike anything seen in medicine before. The one puzzling aspect was that while the results in vivo, in laboratory animals and human patients, were spectacular, in vitro the drug was a dismal failure: the pathogens in test tubes were not affected by it.

The Germans, and as a result the French and English as well, were so hung up on using dye molecules that it was practically by accident that a group of test mice were treated with pure sulfanilamide in a French lab. When the sulfanilamide alone worked as effectively as the previous drugs linked to the azo dye, there was shock and, as expected, doubt in laboratories around Europe. After the results were confirmed by others, the rush was on to turn out variations of sulfa based medications. One mystery was solved by the discovery that sulfa was the active anti-bacterial agent. Sulfa had to be released from the rest of the Prontosil molecule in order to become active. This could happen in the body of an animal, where enzymes could split the Prontosil molecule into two pieces, releasing pure sulfa as the medicine, but not in a test tube, which contained no enzymes.

By 1937 sulfa drugs were being widely used in the United States, and therein lies a cautionary tale. In Tulsa, Oklahoma, Dr. James Stephenson observed that many sick patients, a majority of them children, began showing up in Tulsa hospitals and later in hospitals in other cities around the country. A number of them began dying of renal failure. Sulfa pills, capsules, injectable solutions, and powders had recently hit the market from a variety of pharmaceutical manufacturers. Every drug store in Tulsa was selling sulfa to anyone who asked for it. Dr. Stephenson traced the problem to a proprietary drug called Elixir Sulfanilamide produced by the Massengill Company. Sulfa is difficult to dissolve; alcohol does not work well, nor do many of the common medicinal solvents. The head chemist for Massengill, Harold Watkins, found that diethylene glycol, an industrial solvent, worked very well to dissolve the sulfa. As it turned out, it also worked very well in bringing on renal failure and death; the confirmed death toll reached 67. As difficult as it is to believe today, the Massengill Company ran no tests on laboratory animals nor did they do any clinical tests with their medicine. The FDA had been established in 1906, but was of minimal effect in regulating the medical field. As with other issues, perhaps the pendulum has swung too far the other way today. Although Watkins was slow in admitting blame for the sickness and death of the people who took his medicine, he clearly realized his culpability: he committed suicide eight months later by putting a loaded handgun to his head and squeezing the trigger.

Sulfa drugs generally had no serious side effects (except when mixed with poison, e.g. diethylene glycol). I can attest there were some relatively minor ones however. When I was a teenager I was given a sulfa drug for a forgotten illness and when I got up from bed to go to the bathroom I felt so dizzy that I almost, but not quite, fell to the floor. The medication was discontinued and I believe logic would dictate that I recovered anyway.

In 1939 a drug called sulfapyridine was developed in the United States as a cure for pneumonia. Within a few years this drug was saving the lives (why not say postponing the deaths? Isn’t that more accurate?) of 33 thousand pneumonia patients each year in the United States alone. By 1942 at least 3600 sulfa derivatives had been synthesized and studied and more than 30 of them had been sold in the United States. Sulfapyridine was, gram for gram, 18 times more powerful than Prontosil; sulfathiazole was 54 times more effective; sulfadiazine was 100 times more effective. The number of diseases that could be treated with the new sulfa drugs was also growing. Sulfathiazole worked as well as the other drugs against strep and also was effective against staphylococcal infections.

There seemed to be an ever increasing spectrum of diseases which could be treated successfully with sulfa drugs. The one thing unexplained was how the medicine worked, and that was not discovered until eight years after Prontosil was formulated. Sulfa was less a magic bullet than a clever imposter. It was observed that sulfa never worked as well when there was dead tissue around or a lot of pus, as in uncleaned wounds. A chemical isolated from yeast extract, para-aminobenzoic acid (PABA), which is involved in bacterial metabolism, had astonishing anti-sulfa abilities. Some bacteria can make their own PABA; others cannot and have to find it in their environment. For these bacteria PABA is an essential metabolite. If they cannot find it they starve. The sulfa and PABA molecules are so similar that an enzyme critical in keeping the bacteria healthy mistakes one for the other, binding sulfa instead of PABA; but since sulfa cannot be metabolized, the enzyme, with sulfa stuck to it, becomes useless. The bacteria, denied a nutrient they need, eventually starve to death.
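This is the textbook mechanism of competitive inhibition, and it can be put in rough numbers. Here is a minimal sketch in Python using the standard Michaelis-Menten rate law with a competitive inhibitor; the constants are illustrative assumptions of mine, not measured values for any real bacterial enzyme.

# Competitive inhibition: sulfa competes with PABA for the same enzyme.
# Michaelis-Menten with a competitive inhibitor; constants are illustrative.

def rate(paba, sulfa, vmax=1.0, km=1.0, ki=1.0):
    """Reaction rate: sulfa raises the apparent Km, starving the bacterium."""
    return vmax * paba / (km * (1 + sulfa / ki) + paba)

for sulfa in (0.0, 1.0, 10.0, 100.0):
    print(f"sulfa = {sulfa:6.1f}  ->  relative PABA turnover = {rate(1.0, sulfa):.3f}")

# As sulfa floods the system the enzyme spends more of its time bound to
# the look-alike molecule, and PABA turnover falls toward zero: the
# starvation described above.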

Sulfa drugs of course had not yet been discovered by WWI, so acute respiratory diseases, including influenza, pneumonia, bronchitis, and other diseases, killed circa 50,000 U.S. soldiers. In WWII, with twice as many men and women in uniform, only 1265 died from these diseases. The main difference was the widespread use of sulfa drugs in WWII. Wartime sulfa production was more than 4500 tons in 1943 alone, enough to treat more than 100 million patients. In December 1943 British Prime Minister Winston Churchill undertook a trip that was long and exhausting for a 69-year-old heavy smoker and legendary drinker: to Cairo, then to Teheran to meet with wartime allies Roosevelt and Stalin, back to Cairo to meet again with Roosevelt, and finally on to Tunis for a couple of days' rest at Eisenhower's villa. When he arrived he had a sore throat and a fever of 101ºF. X-rays revealed that he had contracted pneumonia. He was given sulfa drugs, but suffered two bouts of atrial fibrillation and an episode of cardiac failure. Finally the medication kicked in and Churchill's temperature returned to normal. Two weeks after he became sick he flew to Marrakech, Morocco and then home. For a while it was touch and go for one of the two leaders of the free world, with sulfa drugs being the determinant.

I would be remiss if I did not mention what everyone reading this must be thinking about by now: what about the problem of drug resistance with sulfa? The answer is what you might expect. With the massive overuse of sulfa drugs, as there was and is with penicillin and other modern antibiotics, resistance to the drugs became an increasing problem, lowering their effectiveness. Killing off non-resistant bacteria and thereby leaving a few resistant strains is still an unsolved problem for these disease fighting medications.

Incidentally, in 1928 British medical researcher Alexander Fleming noticed that a mold contaminating one of his bacteria plates had the odd effect of clearing a zone around itself in which the bacteria did not grow. Fleming logically thought that the mold was releasing some sort of substance that hindered the growth of the bacteria. Purifying an amount of the active substance sufficient to conduct in vivo tests proved so difficult that he largely dropped the research and turned to sulfa. He returned to penicillin years later and along with two other British medical researchers received the Nobel Prize for medicine in 1945.

Friday, March 2, 2007

BEER CONSUMPTION & OTHER LITTLE ICE AGE PHENOMENA 6

In a 2006 History Channel TV program, food & wine expert Joseph H. Coulombe gave the following statistics:

23 gallons of beer; 1 gallon of hard liquor; and 2 gallons of wine are consumed yearly on average by each adult American.

88% of wine is drunk by 11% of Americans, most on the East or West Coasts.

Various wine, beer, and alcohol producing institutes give similar figures: approximately two gallons of wine, 1 gallon of liquor, and from just over 20 to 22 gallons of beer, depending upon the year cited and the particular survey quoted. One survey estimated that 88% of wine is consumed by 16% of Americans.

Wine drinking is increasing in the United States - 231 million cases of wine were consumed in 2001 with an increase of 5½% in 2004. The projected increased total wine consumption by 2010 is 300 million cases.

In 1997 adult Americans consumed an average of 22 gallons of beer, 23½ gallons of coffee, 53 gallons of soft drinks, and 2 gallons of wine. The French consumed an average of 64 liters (17 gallons) of wine while Germans drank an average of 120 liters (30 gallons) of beer.

Beer accounts for approx. 88% of all alcohol consumed in the United States – 11 times as much beer is consumed as wine.


Now to the reasons America is primarily a beer drinking nation:

The period called The Little Ice Age occurred from the early 14th to the middle of the 19th centuries. It was widespread, especially in the northern hemisphere, but it was not uniformly cold decade after decade and century after century. Instead it was on average colder than the centuries preceding and following it, with some years, including consecutive ones, of bone chilling cold followed by a few temperate years. Still, there were long periods of cold and often wet weather, such that starting in the 14th century vineyards in Northern Europe died to the extent that grains were substituted to brew beer and distill hard alcohol. By the time of the large immigration of Europeans to America in the 17th and 18th centuries, Northern Europeans had been brewing beer and distilling liquor instead of making wine for two centuries. Thus they took their beer drinking culture and their beer and alcohol making techniques with them to America. And during this embryonic habit forming period on the new continent, overwhelmingly most of the immigrants were from Northern Europe: England, Scotland, Ireland, Germany, Holland, Poland, and Sweden, primarily. Some, but relatively few, came from the wine drinking Mediterranean Basin.

Although not a few of the Founding Fathers (possibly even the Founding Mothers) enjoyed drinking imported wine, Thomas Jefferson brewed beer at Monticello and George Washington was the largest distiller of rye whiskey in the Colonies.

There were many other historical events which were caused or enhanced by the Little Ice Age. Napoleon Bonaparte invaded Russia in the fall of 1812 with a force of circa 600,000. The winter was so severe that by the time he retreated his army was reduced to 130,000, due mostly to starvation and disease. Temperatures dropped to the minus 30's ºF, where snow crystals floated in the air instead of falling to the ground due to the high density of the cold air. It was described by one of the officers as a surrealistic world. There are accounts of soldiers cutting meat off the flanks of the wagon-pulling horses. The horses' skins were frozen and they were so numb that they did not feel it. The blood from the wounds froze so quickly there was little blood loss. In 2006 the U.S. Congress passed a law prohibiting the slaughter of horses in this country for the export of horse meat. Two of these plants are near here in North Texas. I don't know how the slicing off of pieces of meat from live working horses would be perceived today, but I suppose allowances should be made for starving soldiers.

When Napoleon's severely depleted army entered Vilnius, Lithuania, it was down to approximately 40,000. In 2001 a mass grave was discovered in Vilnius containing 3000 skeletons which were determined to be almost two hundred years old, all from Napoleon's army. Starvation, typhus, and gangrene were the main causes of the deaths of thousands of soldiers in the city. Only 3000 to 4000 of the original 600,000-man army eventually made it back to France. One of those who made it back was Napoleon himself. On June 18, 1815 the Duke of Wellington settled Napoleon's hash, so to speak, at the Battle of Waterloo in Belgium.

In the year 1815, when James Madison was president of the United States and Indiana was admitted as the 19th state, the single most powerful volcanic eruption ever recorded occurred on April 10th at Mt. Tambora on the Indonesian island of Sumbawa. Mt. Tambora was a 13,000 ft. dormant volcano until that fateful day, when 4200 ft. of the top blew off, throwing 36 cubic miles of ash, sulfur gas, and debris into the atmosphere to a height of 15½ miles. Immediate fatalities were 70,000 people on that and adjacent islands, with the total going to 90,000 not long after. This eruption put 100 times more ash in the atmosphere than the 9000 ft. Mt. St. Helens eruption in Washington State in 1980 and had four times the energy of Krakatoa in 1883. Worldwide climate was affected for years.

That winter Hungary had brown colored snow and in Puglia (a region in southern Italy) where it seldom snows, the snow was red tinted. Of course these were caused by the volcanic eruption that spring thousands of miles away.

The New World did not escape the debilitating effects of the Little Ice Age. During what was called the Year Without a Summer, New England had 5 consecutive days of snow in June 1816. There was also snow during July and August; in fact 75% of the corn crop was ruined by the cold temperatures and excessive moisture. There were reports of birds falling out of the sky dead due to the cold. Finally many New Englanders had had enough of the continuing cold and inclement weather, and they greatly accelerated the country's westward migration. They did not realize their problems originated half a world away.

Also in the summer of 1816, the British poets Percy Bysshe Shelley and Lord Byron, along with Shelley's 19-year-old wife Mary Shelley, vacationed along Lake Geneva in Switzerland. The weather was so wet and cold that they stayed indoors most of the time and decided to see who could write the most terrifying and gripping tale. Mary Shelley won with her novel Frankenstein.

The Black Death, which hit Europe in 1347 to 1351, was spread by rats carrying fleas infected with the bubonic plague bacillus (Yersinia pestis). At least 25 million people, 1/3 of the European population, died during this period. This high fatality rate was greatly enhanced by the population being left in a weakened state owing to crop failures brought on by the cold and wet weather at the start of the Little Ice Age.

There is a famous painting by artist Emanuel Leutze of George Washington crossing the Delaware River on the way to the Battle of Trenton, New Jersey on Christmas night of 1776. This painting shows the river choked full of blocks of ice. Unlike some renditions of historical events, this is an accurate depiction of the weather conditions at that time. Today the Delaware River seldom if ever freezes, but then the world was in the grip of the Little Ice Age.

The Medieval Climate Optimum, aka the Medieval Warm Period, occurred from the 10th to the 14th centuries and was 4º-7º F warmer than previously. During this period wine grapes were grown in profusion as far north as southern Britain. It was also when the Vikings settled in Greenland. Today Greenland and Iceland should more accurately have their names reversed. Iceland has the advantage of being centered over a volcanic "hot spot" and therefore has many hot water geysers to provide geothermal heating.

However, around 1000 A.D., when the Vikings immigrated to Greenland, it was properly named. The Vikings subsisted on a diet of 80% land grazing animals (mostly sheep, some cattle) and 20% cod from the sea. After the advent of the Little Ice Age in the 14th century, as the climate grew progressively colder, the Vikings' diet shifted to 20% land animals and 80% fish as the sheep and cattle died out from a lack of grass. Eventually even the temperature sensitive cod stopped coming to the North Atlantic. The Vikings, unlike the Inuit natives, were unable to adapt to the changing climate, and coupled with their essential barter with Northern Europe being interrupted by impassable pack ice, they simply died out. The Vikings in Greenland were done in by The Little Ice Age.

What then caused The Little Ice Age? First off, the last glaciation period, which ended about 12,000 years ago, was a more extreme period of cold, with an average drop in temperature of 9ºF. The Little Ice Age had an estimated average temperature 4º-5º F cooler than today's. This does not seem like much, but it is enough to cause major climate changes. There have also been periods of glaciation with much colder temperatures going back millions of years, when glaciers covered the midsection of America as far south as Indiana, Illinois, and Ohio, and Northern Europe as well.

The short answer to what caused The Little Ice Age is that nobody knows. Naturally there are a number of theories. One is that, for reasons not understood, the sun undergoes periodic episodes of changing energy output. There is a phenomenon called the Maunder Minimum, a period of low sunspot activity between 1645 and 1715 that correlates with low solar energy output. Another is that there was an unusually active period of volcanism (an average of five major eruptions per century vs. normally one or less) during The Little Ice Age which blocked heat from the sun. Still another is the Thermohaline Circulation or Mid-Atlantic Conveyor Belt theory. This theory holds that currents carry water from the tropical regions northward, transferring heat to northern latitudes. These waters cool as they reach northern latitudes, becoming denser, and therefore sink, effectively forming a conveyor belt: the warmer water of the Gulf Stream flows northward at the surface while cooler water below travels southward, as equilibrium tends to even out the northern and equatorial water levels. During the relatively warm period preceding The Little Ice Age the northern glaciers melted, and with the mixing in of less dense fresh water from the glaciers the near surface waters did not sink, thereby shutting down the conveyor effect. Without the heat transfer from the tropics to the northern latitudes, The Little Ice Age was initiated. Perhaps there was a combination of these effects, but whatever it was, neither The Little Ice Age nor the preceding warm period was due to individual or industrial pollution.
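The density argument at the heart of the conveyor belt theory can be made concrete with a little arithmetic. Below is a minimal sketch in Python using a simple linearized equation of state for seawater; the coefficients and the water properties are rough, assumed values for illustration only, not oceanographic measurements.

# Linearized seawater density: colder and saltier water is denser.
# Coefficients are rough, assumed values for illustration only.
RHO0 = 1027.0   # kg/m^3, reference density at 10 C and 35 psu
ALPHA = 0.2     # kg/m^3 decrease per degree C of warming
BETA = 0.8      # kg/m^3 increase per psu (g/kg) of salinity

def density(temp_c, salinity_psu, t0=10.0, s0=35.0):
    return RHO0 - ALPHA * (temp_c - t0) + BETA * (salinity_psu - s0)

deep_water = density(3.0, 35.0)     # resident deep water, ~1028.4
cooled_salty = density(5.0, 36.0)   # Gulf Stream water after cooling, ~1028.8
cooled_fresh = density(5.0, 33.0)   # same, diluted by glacial meltwater, ~1026.4

print(cooled_salty > deep_water)    # True: it sinks, driving the conveyor
print(cooled_fresh > deep_water)    # False: it stays afloat; the conveyor stalls

The point of the toy numbers is that a modest freshening from meltwater can offset the density gained by cooling, which is exactly the proposed shutdown mechanism.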

Friday, February 16, 2007

GLOBAL COOLING OR WARMING? 5

Given the frequency and ferocity of hurricanes last year, the prediction for the number of alleged Global Warming induced Atlantic hurricanes this season was originally 9, then 7, then 5. What did we end up with? One or two, which alternated between being tropical storms and hurricanes and did no damage to the USA except for some flooding. Seems to me this is a prime example of the futility of predicting natural phenomena and the questionable, if not downright erroneous, assignment of causative factors. It is not that the primitive mind cannot link cause (Global Warming) and effect (hurricanes), but that it often does so where none exists.

The following is a summary of an 11-page speech about Global Warming given on the US Senate floor on September 25, 2006 by Sen. James Inhofe (R-Okla.), with some of my comments and news items added:

Quote from Newsweek magazine: “There are ominous signs that the earth’s weather patterns have begun to change dramatically and that these changes may portend a drastic decline in food production – with serious political implications for just about every nation on earth.”

Quote from Time magazine: “As they review the bizarre and unpredictable weather pattern of the past several years, a growing number of scientists are beginning to suspect that many seemingly contradictory meteorological fluctuations are actually part of a global climatic upheaval.”

A New York Times headline: “Climate Changes Endanger World’s Food Output”

Sounds contemporary and parlous, does it not? It does until one realizes these Brummagem opinions were published in 1974 and '75 by the fourth estate's climate doom myrmidons, and they were forecasting not global warming but a coming ice age.

But wait, there is more:

Again a headline from the New York Times: "Geologists Think the World May Be Frozen Up Again." And when was this, you may ask? Why, more than 100 years ago, in 1895. The NYT has had myriad decades to practice its Chicken Little we-are-all-doomed philosophy.

A front page article in the 10/07/1912, yes again, New York Times, just a few months after the Titanic struck an iceberg and sank, declared that a prominent professor “Warns us of an encroaching ice age.”

The very same day in 1912 the Los Angeles Times ran an article warning that the "Human race will have to fight for its existence against cold."

An 8/09/1923 front page article in the Chicago Tribune declared: “Scientist says Arctic ice will wipe out Canada.” The article quoted a Yale University professor who predicted that large parts of Europe and Asia would be “wiped out” and Switzerland would be “entirely obliterated.”

On 8/10/1923 a Washington Post article declared: “Ice age coming here.”

By the 1930’s the media took a break from reporting on the coming ice age and changed 180 degrees to promote global warming.

Time magazine (1939): "[Those] who claim that winters were harder when they were boys are quite right… weathermen have no doubt that the world at least for the time being is growing warmer."

Time magazine in 1951 pointed to receding permafrost in Russia as proof that the planet was warming.

In 1952 the New York Times noted that the “trump card” of global warming “has been the melting of glaciers.”

By the 1970's the media had done another 180 degree turn, back to their previous we-are-all-going-to-freeze position. The news media have never been lacking in paralogism.

A December 29, 1974 New York Times article on global cooling reported that climatologists believed “the facts of the present climate change are such that the most optimistic experts would assign near certainty to major crop failure in a decade.” The article warned that unless government officials reacted to the coming catastrophe, “mass deaths by starvation and probably anarchy and violence would result.”

In 1975 the New York Times reported that "a major cooling [was] widely considered to be inevitable." These past predictions of gloom and doom have a familiar ring, do they not? They sound strikingly similar to our modern media's promotion of Al Gore's brand of climate alarmism.

Of course the current major media’s hysterical position on climate is back to the, we-are-all-going-to-burn-to-death syndrome. Their abrupt changes could induce whiplash for those foolish enough to follow them.

In April of this year, Time magazine devoted an issue to Global Warming alarmism titled "Be Worried, Be Very Worried." This is the same magazine which first warned of a coming ice age in the 1920's, before switching to predicting dire consequences of Global Warming in the 1930's, before switching yet again to promoting the 1970's coming ice age scare. Consistency has never been a hallmark of the left.

U.S. News & World Report carried a special report in its 4/10/2006 issue titled "The Truth About Global Warming." And no, it was not pitching the concept of Global Warming as a hoax. I read the article while waiting in a dentist's office. Among other inanities it contained a graph connecting data labeled "Northern Hemisphere estimated average temperatures," 1000 A.D.-1850 A.D., to "Actual recorded temperatures," 1850-2000. What is wrong with that? First, splicing dubious estimated data onto actual measurements is problematical enough, but even then, how comparable are measurements made 150 years ago with those of today? Second, many of the early data estimates cover the years of the Little Ice Age, so those temperatures should by natural processes be lower than today's. Third, the scale of the graph greatly exaggerates total temperature differences which are less than 1º C.
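On the third point, the effect of axis scaling is easy to demonstrate for yourself. Here is a minimal sketch in Python with matplotlib; the temperature series is synthetic, assumed purely for illustration, and is not the magazine's actual data.

# The same sub-1-degree series, plotted on two different y-axis ranges.
import matplotlib.pyplot as plt
import numpy as np

years = np.arange(1850, 2001)
temps = 14.0 + 0.8 * (years - 1850) / 150.0   # an assumed 0.8 C rise over 150 years

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 3))
ax1.plot(years, temps)
ax1.set_ylim(13.9, 14.9)    # axis spans 1 C: the rise looks dramatic
ax1.set_title("y-axis spans 1 C")
ax1.set_ylabel("deg C")
ax2.plot(years, temps)
ax2.set_ylim(0.0, 30.0)     # axis spans 30 C: the same rise nearly vanishes
ax2.set_title("y-axis spans 30 C")
plt.tight_layout()
plt.show()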

On 2/19/2006 CBS's 60 Minutes produced a segment about the North Pole. This was a completely one-sided report, alleging rapid and unprecedented melting at the polar cap. It even featured correspondent Scott Pelley claiming that the ice in Greenland was melting so fast that he barely got off an iceberg before it collapsed into the water. 60 Minutes failed to inform its viewers that a 2005 study by Ola Johannessen and his colleagues showed that the interior of Greenland is gaining ice and mass, and that according to scientists the Arctic was warmer in the 1930's than today.

One of the experts featured by 60 Minutes was NASA scientist and alarmist James Hansen. Hansen has partisan ties to the seemingly labile Al Gore and was the recipient of a $250,000 grant from the left-wing Heinz Foundation. When the egregious Scott Pelley was asked how he justified excluding scientists skeptical of Global Warming alarmism from his story, he responded that he considers skeptics the equivalent of "holocaust deniers." That is what passes for an objective journalist in the news media.

It is interesting that Monday (9/25/2006) on the CBS evening news I watched a segment about vineyards in southern Britain being cultivated for wine production, made possible by claimed man-induced, ever increasing warmer weather. For people such as the "perky one," Katie Couric, history began today ("Those who do not know history are doomed to repeat its mistakes" – philosopher George Santayana). In the Medieval Warm Period (circa 1000 A.D.-1400 A.D.), British wine was produced in such quantity and quality that French wine producers wanted to impose tariffs to limit the import of British wines because they did not want the competition. Odd, isn't it, that climate warming at that time cannot reasonably be attributed to human folly?

I believe that in the past several years, perhaps since the 1990's, global temperatures have increased a bit. What does this mean in the earth's long climate trends of millennia? Nothing. In the intermediate term of centuries? Nothing. And in the short term of decades? Not much. The contentious issue of whether this thus far short warming trend is caused by man made pollution is simply not rationally solvable at this time. What is indisputably known is that the earth has gone through cycles of warming and cooling going back more than a million years, as recorded in the geologic record. And whatever the causes (variable sun energy output, volcanic activity, or the Thermohaline Circulation effect), they were not human profligacy.

Just this week the top climate Cassandra, Al Gore (who wrote in his 1992 book Earth in the Lurch, or was that Earth in the Balance?, that the internal combustion engine was the greatest threat to mankind), said that cigarette smoke is a "significant contributor to Global Warming." Also this week a report stated that it would cost $1,000,000,000,000, yes one trillion dollars, to cap greenhouse gas emissions. Yet another story said that Global Warming may drive the lemurs of Madagascar to extinction. My God, woe is us, we are all going to die before sundown! Steve Irwin turned out to be the lucky one.

The corybantic Global Warming fanatics fit the description of 1950's & 60's author Eric Hoffer's "true believers" as much as or more than any archetypes Hoffer actually wrote about.

The speech by Sen. Inhofe, which was exhaustively researched and footnoted, was not even mentioned in passing by the mainstream print or electronic media. Why do you suppose that is? Could it be that it does not fit the hard left, non-objective climate opinions of the New York Times, Los Angeles Times, Washington Post, ABC, CBS, CNN, NBC, etc.? I was fortunate to have read it on the Drudge Report. For those who would like balance on the Global Warming opinions and propaganda, yes propaganda, of the news media and Hollywood elites, I highly recommend reading the Inhofe speech.

Friday, February 9, 2007

LINCOLN STORIES 4

Having just finished reading the book Herndon's Lincoln by William H. Herndon, Lincoln's friend for 20 years and law partner for 16 years, and a then young writer named Jesse W. Weik, I thought it time for me to weigh in with a short essay on Abraham Lincoln. Herndon's book, which was first published in 1888 and updated in 1892 (Herndon died in 1891), has a modern incarnation in 2006, edited by Douglas L. Wilson and Rodney O. Davis.

The Herndon book is arguably, in my opinion undoubtedly, the best book on Lincoln excluding Lincoln's presidential years. It is not only Herndon's close association with Lincoln that makes his a stellar book, but also that Herndon got the idea to write it within months of Lincoln's assassination and immediately began to assemble letters and documents to, from, and about Lincoln. Additionally Herndon interviewed many people who knew Lincoln before Herndon had met him, as well as Herndon's own contemporaries, in some cases traveling to places within and without the state of Illinois. The recounting of why it took decades for Herndon to write and publish his book is too long a tale to tell here.

It was Herndon, for example, who established beyond any doubt that Lincoln's first love was a young woman named Ann Rutledge, who tragically died of disease at the age of 19. Herndon knew Ann Rutledge as well as the Rutledge family and other people who knew her. It is because the Herndon book was out of print (it was never a popular seller) and forgotten for so long that the story of Abraham Lincoln and Ann Rutledge was doubted and even declared a myth (e.g., the 1979 edition of the World Book Encyclopedia called the story a myth). Herndon's biography of Lincoln was criticized at the time because Herndon, even though he was an ardent admirer and greatly respected Lincoln, was determined to present his subject in as complete and totally human a light as possible, figuratively warts and all. Herndon described Lincoln's political ambition as "a little engine that knew no rest," which did not sit well with Herndon's critics. Yet it was demonstrably true. A couple of months after Lincoln lost the Illinois senatorial race to Stephen Douglas in 1858, Herndon asked Lincoln how he was doing. Lincoln replied that it reminded him of the overgrown boy who painfully stubbed his toe: it hurt too much to laugh and he was too big to cry. Not only does this reflect on Lincoln's ambition, but it shows how long these types of expressions have been around.

There is some confusion about how many political elections Lincoln lost. I remember one historian saying Lincoln lost many elections. Lincoln himself said he was proud that he only lost one election that was put directly to the people. Lincoln was right. Here are the facts: Lincoln first ran for the Illinois state legislature as a Whig in 1832 at the age of 23, two weeks after he was discharged from the militia in the Black Hawk War, where, as he said, he fought only mosquitoes. Of the eight unsuccessful candidates from his district he came in third. In 1834 he ran again and was elected, receiving the second highest vote total in his district. Lincoln was reelected three more times to the legislature with the most votes of all the candidates from his district, and in his third and fourth terms he was runner-up for Speaker of the legislature.

In 1846 Lincoln ran successfully for the U.S. Congress, again as a member of the Whig party. Even though he wanted to, he did not run for reelection in 1848 because of a somewhat vague understanding among members of the Whig party that various people would take turns. Owing to his vocal opposition to President Polk's generally popular Mexican-American War (1846-48), it is doubtful that Lincoln would have won reelection had he run in 1848.

Lincoln ran for the U.S. Senate in 1854 and again in 1858, the latter time against incumbent senator Stephen Douglas, losing both times. However, until the XVII Amendment to the U.S. Constitution was ratified on April 8, 1913, senators were chosen by the state legislatures, so those two losses were not in elections put directly to the people; his only such loss remained his first race in 1832.

Lincoln was noted as a great story teller and there was always a point, or as he said a "nub," to his stories. He used stories to gently put off suggestions from his friends he did not want to pursue; to deflect annoying questions from strangers; and to tell off his personal and political enemies when they became unbearable. The following stories of his are illustrative. I do not remember where I read, or possibly saw on television documentaries, these stories about Lincoln, although I am sure there were several sources.

When Lincoln engaged Stephen Douglas in a series of seven debates for the position of U.S. Senator from Illinois in 1858 Douglas noticed that what Lincoln said when they were in the northern part of the state was not completely consistent with what Lincoln said in Southern Illinois. At the start of their next debate Douglas accused Lincoln of being two-faced. When it was Lincoln’s turn to speak he turned to the audience and said, “Now my opponent says I am two-faced, but I leave it to you, if I had two faces would I keep this one?” Douglas was right, but it was impossible to pursue a serious point when the crowd was convulsed with laughter. Getting the best of Lincoln was something few people managed.

A stranger asked Lincoln how many soldiers the South had in the field. Lincoln said he had it on good authority that the number was twelve hundred thousand. The man was appalled by the answer and said, “Good heavens, that many?” Lincoln responded, “Yes, after being whipped by the Confederates our generals tell me they were out numbered by three to one, and I must believe them. Now I know that we have four hundred thousand troops in the field so three times four is twelve. Don’t you see it?” After that what could one do but resolve to not ask the president any more impertinent and silly questions.

Senator Benjamin Wade (R-Ohio), who was chairman of the Committee on the Conduct of the War, went to the White House to lobby Lincoln to remove Gen. Grant as commander of the army. Lincoln refused. Whereupon the vainglorious and overbearing Wade launched into an attack on Lincoln, saying, "You have put this government on the road to Hell by your obstinacy, and in fact you are only a mile away from Hell at this minute!" Lincoln quietly asked Wade a question: "The distance from here to the Capitol building is about one mile, is it not?" Wade grabbed up his hat and cane and stalked out of the White House. At least he was intelligent enough to realize he had been well and truly told off for his hubris and bad manners.

Before Lincoln issued the presidential directive called the Emancipation Proclamation on September 22, 1862, to be effective on January 1, 1863, he was pestered, not to say harassed, especially by three abolitionist members of Congress: Senators Charles Sumner and Henry Wilson of Massachusetts & Representative Schuyler Colfax of Indiana. One day Lincoln was speaking to a long time acquaintance from Illinois in the White House and as he was looking out of a window he saw the three men coming up the front door path. Lincoln sadly told his visitor that it reminded him of the boy in Sunday school reading aloud about the three Hebrew children in the fiery furnace. The boy stumbled badly when pronouncing their names, Shadrach, Meshach, and Abednego, and as he continued reading in acute embarrassment his eyes, traveling ahead in the text, spotted their names again. "Look, look there," he exclaimed with much agitation, "It's those same damn three fellows again!"

Two-time Pulitzer Prize winning author, Lincoln biographer, and Harvard history professor emeritus David Herbert Donald (1920- ) told the following story during a C-SPAN Book-TV interview: A few years ago he was getting his annual physical examination from a long time friend, a world famous Harvard medical school internal medicine specialist, when the doctor asked him what book he was working on. Donald replied that it was a biography of Abraham Lincoln, except this time it was about Lincoln's personal relationships.

Professor Donald explained that Aristotle described three types of friendships: enjoyable friendships, useful friendships, and close friendships. Lincoln had many enjoyable friendships, as he enjoyed people's company and, being a very entertaining person, they enjoyed being around him. And Lincoln certainly had many useful political friendships with the politicians in Washington D.C. and Illinois. But when it came to close, intimate friendships, Lincoln had none. Professor Donald postulated that for an adult to form close friendships it was necessary for him to have had close friendships as a child. Lincoln had a step-brother nearly his same age, but they had completely different personalities and therefore did not get along well. Being on the American frontier, there were few other children of Lincoln's age that he came in contact with long enough to become firm friends or chums. As Donald was telling his doctor friend this, he noticed a tear coursing down this renowned and highly respected doctor's cheek as he said, "I never had a chum when I was growing up." Donald said, "I didn't know what to say; I still don't know what to say."

According to William Herndon, or “Billy” as Lincoln called him, and other lawyers in Springfield, Illinois, Lincoln was adequately familiar with the law, but not exceptionally so. Rather it was in nisi prius work (presenting a case before a judge and jury) that Lincoln excelled. Yet even here, if he was not completely convinced of the rightness of the case, he was less than overwhelming. There were more than a few cases Lincoln refused to accept because he thought the prospective client was in the wrong. When his client was on firm legal and moral ground, however, Lincoln was not excelled by any lawyer in the state and by few if any in the country. Lincoln could and did argue the opposing party’s points better than its own lawyer could, yet his counter-arguments were more convincing still, so he almost always won those cases.

Abraham Lincoln’s virtues were myriad and substantial and his faults minor, yet like all mortals he did possess them. Between his heartbreaking loss of Ann Rutledge and his marriage to Mary Todd he had a girlfriend named Mary Owens. Like Mary Todd she tended to be a bit on the stocky side, or pleasingly plump if you prefer (Ann Rutledge’s dimensions were given as 5 ft. 4 in. and 120 lbs., but she died still in her teens). Mary Owens later explained to William Herndon that she turned down Lincoln’s offer of marriage because he was “deficient in those little links which make up the chain of women’s happiness – at least it was so in my case.” There was an occasion when several men and lady friends were horseback riding and came to a stream that was a bit difficult to cross. The men, who were riding a little ahead, were solicitous of the ladies, but Lincoln, who was accompanying Mary Owens, did not look back to see how she was doing. After she caught up she upbraided him, saying, “You are a nice fellow! I suppose you did not care whether my neck was broken or not.” Mary Owens said that Lincoln laughingly replied (she supposed by way of compliment) that he knew she was plenty smart enough to take care of herself. Mary Todd was born into an upper-class Kentucky family and might have had a happier life had she been as perceptive as Mary Owens, although had she not married Abraham Lincoln she would never have lived in the White House.

Historians widely agree that Lincoln had at most, cumulatively, one year of primary school education. This was not nothing, as Lincoln was an avid student who was eager to learn, yet how could anyone with such a meager formal education write some of the greatest prose in the American language? The Gettysburg Address, delivered on November 19, 1863, is world famous and is his most quoted oration, but consider also these excerpts from his 1st & 2nd presidential inaugural addresses.

At the end of his first inaugural speech he turned directly to the South and said: “In your hands, my dissatisfied fellow-countrymen, and not in mine, is the momentous issue of civil war. The government will not assail you. You can have no conflict without being yourselves the aggressors. You have no oath registered in Heaven to destroy the government, while I shall have the most solemn one to ‘preserve, protect, and defend’ it.

I am loath to close. We are not enemies, but friends. We must not be enemies. Though passion may have strained, it must not break our bonds of affection. The mystic chords of memory, stretching from every battlefield, and patriot grave, to every living heart and hearth-stone, all over this broad land, will yet swell the chorus of the Union, when again touched, as surely they will be, by the better angels of our nature.”

In his 2nd inaugural address on March 4, 1865 (his 1st was March 4, 1861) he concluded with: “Fondly do we hope – fervently do we pray – that this mighty scourge of war may speedily pass away. Yet, if God wills that it continue until all the wealth piled by the bondsman’s two hundred and fifty years of unrequited toil shall be sunk, and until every drop of blood drawn with the lash shall be paid by another drawn with the sword, as was said three thousand years ago, so still it must be said ‘the judgments of the Lord are true and righteous altogether.’

With malice toward none, with charity for all, with firmness in the right as God gives us to see the right, let us strive on to finish the work we are in; to bind up the nation’s wounds; to care for him who shall have borne the battle, and for his widow, and his orphan – to do all which may achieve and cherish a just and lasting peace, among ourselves, and with all nations.”

Upon the death of Abraham Lincoln, Secretary of War Edwin Stanton is alleged to have said, “Now he belongs to the Ages.” Truly he does.

Friday, February 2, 2007

HORMESIS AND OTHER MATTERS 3

Hormesis – what is it? First I will tell you what it is not. It has nothing to do with homeopathy, which is a method of treating disease by giving patients highly diluted doses of drugs that, if given to healthy people in much more concentrated form, would cause the same symptoms as the disease. Sometimes these doses are diluted to less than one part in one billion. The idea is to simulate the symptoms of a disease rather than attack the cause, yet at such dilutions the patient is in effect given a placebo. Not only is the idea nuts, but the method makes even less sense.
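
To see just how empty those dilutions are, here is a minimal back-of-the-envelope sketch in Python. The starting quantity (0.1 mol of active substance) and the 1:100 “C”-scale dilution steps are illustrative assumptions on my part, not figures from any particular remedy:

# Serial 1:100 ("C") dilutions of a hypothetical remedy starting
# from 0.1 mol of active substance. Counts the expected number of
# original molecules remaining after each dilution step.
N_AVOGADRO = 6.022e23
molecules = 0.1 * N_AVOGADRO  # illustrative starting amount

for step in range(1, 16):
    molecules /= 100  # each "C" step is a 1:100 dilution
    if molecules < 1:
        print(f"After {step}C dilutions: on average not a single molecule left")
        break
    print(f"After {step}C dilutions: ~{molecules:.1e} molecules remain")

After roughly a dozen such steps the “medicine” is, statistically, pure solvent – which is the point of the placebo remark above.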

Hormesis also is not Scientology, which is a screwball philosophy/religion/cult based on the inane rants of a pulp and science fiction writer named L. (Lafayette) Ron Hubbard (1911-1976). Hubbard attended George Washington University, where he received poor grades, was on academic probation, and dropped out after two years. He took a course called Atomic and Molecular Physics, and on that basis he later claimed he was a nuclear physicist. He received an F in the course. His military record in WWII and his domestic life are a sorry story which, if interested, I invite you to consult on the internet. Scientology does affect people though – it can cause a person to dementedly jump up and down on a couch in front of television cameras and to undertake a mission impossible by dispensing medical advice to women even if he does not know any more about medicine than a frog.

In his book The Politically Incorrect Guide to Science, Tom Bethell explains that hormesis is the theory that whereas large amounts of certain substances are toxic and even deadly, small amounts of these same substances are not only not harmful but actually beneficial. This is not only true, but reasonable and demonstrable when you think about it. Consider alcohol. Taken in large amounts over years, it would likely send you shopping for a new liver, yet the public has been inundated in the past few years with news reports that alcohol taken in small amounts daily – say a 12 oz. bottle of beer, a mixed drink, or a 5 or 6 oz. glass of wine – will keep a person healthy, if not happy and wise.

Hormesis contradicts the widely held assumption in public health that substances toxic at large doses remain toxic, proportionately, at smaller and smaller doses – this is called the linear, no-threshold theory. Plotted with “harm” on the ordinate and “dose” on the abscissa, it is a straight line through the origin. A second theory is the linear, threshold theory: here the straight line intersects the “dose” axis at some positive threshold dose and, below that threshold, runs flat along the “dose” axis back to the origin – no harm at all until the threshold is reached. The third theory is hormesis, which plots as a curve that likewise crosses the “dose” axis at a positive dose, but below that point dips beneath the axis into a region labeled “benefit” before returning to the origin at zero dose.
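
For readers who prefer formulas to word pictures, here is a minimal sketch of the three curves in Python. The slope, the threshold value, and the quadratic shape chosen for the hormesis curve are illustrative assumptions, not fitted models:

# Toy dose-response curves for the three theories described above.
# Positive values mean harm, negative values mean benefit.

def harm_lnt(dose, slope=1.0):
    # Linear, no-threshold: any dose, however small, does proportional harm.
    return slope * dose

def harm_threshold(dose, threshold=2.0, slope=1.0):
    # Linear, threshold: zero harm below the threshold dose.
    return max(0.0, slope * (dose - threshold))

def harm_hormesis(dose, threshold=2.0, slope=1.0):
    # Hormesis (J-shaped): small doses dip below zero (a net benefit),
    # the curve crosses zero at the threshold, then harm grows.
    return slope * dose * (dose - threshold) / threshold

for d in [0.0, 0.5, 1.0, 2.0, 4.0, 8.0]:
    print(f"dose={d:>4}: LNT={harm_lnt(d):6.2f}  "
          f"threshold={harm_threshold(d):6.2f}  "
          f"hormesis={harm_hormesis(d):6.2f}")

Below the threshold the hormesis curve is the only one of the three that goes negative – exactly the low-dose benefit described next.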

So frequently has hormesis been observed that it is not difficult to find substances that are beneficial at low doses. Consider vitamin and mineral supplements. The ingredients include boron, chromium, copper, iodine, magnesium, molybdenum, nickel, phosphorus, potassium, selenium, vanadium, and zinc – all toxic at high doses. The widespread use of multi-vitamins shows that people do not subscribe to the linear, no-threshold theory when it comes to ingested substances. How about externally applied substances and processes? Is long-time exposure to x-rays and radiation bad and short-time exposure still detrimental, only less so? Many people would say so, but what is the evidence?

The unfortunate people at the centers of the nuclear bomb blasts in Hiroshima and Nagasaki naturally were incinerated. In fact the blasts were so powerful and bright that permanent shadows of objects were cast on concrete structures that were not completely destroyed. The people farther from the blasts later died in great numbers of various cancers, and as expected, as the distance from the center of the blast increased, fewer and fewer people died from cancer. Then, however, an unexpected phenomenon appeared. At still greater distances from the blast site, where people received some radiation but only small amounts, the later incidence of cancer was less than that for people farther away who had not been subjected to any radiation at all. Interesting to say the least, is it not?

Anti-nuclear zealots assert, ipse dixit, or seemingly deliberately misstate, the dangers of long half-life radioactive materials from nuclear power plants, claiming that these sites will be rendered uninhabitable for thousands of years. This is completely backwards: it is the radioactive materials with short half-lives that are dangerous. Author Tom Bethell quotes a former acting Secretary of Energy posing the question: “Would you rather sit on a box of firecrackers if half will go off in the next week or a box in which half will go off in the next 24,000 years?”

Plutonium is deadly because it can be used to make atomic bombs; it is not injurious to handle because its half-life is 24,000 years. Bernard Cohen of the University of Pittsburgh once offered to eat some plutonium if Ralph Nader would eat a like amount of caffeine. Nader refused. Plutonium is not to be confused with polonium-210, which was discovered in 1898 by double Nobel Prize winner Marie Curie and her husband, single Nobel Prize winner Pierre Curie. Polonium-210 has a half-life of 138 days and is so rare that only about 100 grams/yr. are made, mostly as a byproduct of nuclear reactions in power plants. It is thought to be the substance used to poison the ex-KGB officer and vehement critic of Russian President Vladimir Putin, Alexander Litvinenko, in London recently. Milligrams or even micrograms of the stuff, if ingested, would be enough to fatally destroy internal organs.
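
The firecracker point can be put in rough numbers. For a fixed number of atoms, activity (decays per second) is proportional to ln 2 divided by the half-life, so per gram the short-lived polonium-210 is enormously “hotter” than plutonium-239. A minimal sketch in Python, using only the half-lives quoted above:

import math

def decays_per_second_per_gram(half_life_seconds, molar_mass_grams):
    # Activity per gram: A = (ln 2 / half-life) * (Avogadro's number / molar mass)
    N_AVOGADRO = 6.022e23
    return math.log(2) / half_life_seconds * N_AVOGADRO / molar_mass_grams

YEAR = 365.25 * 24 * 3600
DAY = 24 * 3600

pu239 = decays_per_second_per_gram(24_000 * YEAR, 239)  # half-life ~24,000 years
po210 = decays_per_second_per_gram(138 * DAY, 210)      # half-life 138 days

print(f"Pu-239: {pu239:.1e} decays/s per gram")
print(f"Po-210: {po210:.1e} decays/s per gram")
print(f"Po-210 is roughly {po210 / pu239:.0f} times more radioactive per gram")

That factor of tens of thousands is the quantitative version of the firecracker analogy, and it is why mere micrograms of polonium-210 can be lethal.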

A congressional resolution declared in 1970 that “The conquest of cancer is a national crusade to be accomplished by 1976.” In 1971, when the National Cancer Act became law, 330,000 Americans died of cancer. In 2004 about 560,000 Americans died of cancer. Even when adjusted for an aging population, the percentage of Americans dying of cancer is about the same as in 1970, and in 1950 as well. In contrast, age-adjusted deaths from heart disease have declined by 59% and from stroke by 69%.

A decline in lung cancer in recent years is attributable to a decline in cigarette smoking among men. Lung cancer in women has increased as a proportion of total lung cancers for both sexes. When the stigma against cigarette smoking by women abated many years ago, little did they realize what a Pyrrhic victory it would become – another unintended consequence of an otherwise admirable equality goal. Cures for some types of leukemia (the four main types are acute lymphocytic leukemia [most common in children], acute myelogenous leukemia, chronic lymphocytic leukemia, and chronic myelogenous leukemia) and for testicular cancer have met with striking success. But improvements in five-year cancer survival rates are largely a function of better surveillance and earlier diagnosis, when surgery and chemotherapy have a better chance of success. Survival gains for the most common forms of cancer are measured in additional months of life, not years. Compared to the cure rates for infectious diseases, especially through most of the 20th century, the record on cancer has lagged woefully.

In part, the answer to the slow progress in curing and successfully treating cancer may be linked to its cause. What does cause cancer? In the 1920s researchers bombarded fruit flies, Drosophila melanogaster (the black-bodied dew-lover), with x-rays, producing mutant flies. Humans exposed to large doses of x-rays proved to be at high risk for skin cancer and leukemia. Thus x-rays were clearly shown to produce both mutations and cancers.

In the 1960s researchers at the National Institutes of Health used fast-growing bacteria to detect the mutagenic properties of various substances. Some carcinogens proved to be mutagenic, advancing the gene-mutation theory of cancer. In the 1970s Dr. Robert Weinberg at an MIT cancer research lab concluded that carcinogens act by damaging DNA (deoxyribonucleic acid), thereby creating mutations in the genes of the targeted cells. Radiation is an example of a carcinogen that is also a mutagen; others are not. The tar found in cigarettes, and asbestos, are carcinogens but are not mutagenic – that is, they do not affect DNA.

Viruses were once thought to play an important role in causing cancer; however, since only a couple of types of cancer are definitely connected to viruses, it is now thought that viruses play a minor role. In 1970 two researchers at the NIH, Robert Huebner and George Todaro, put forth the “oncogene hypothesis,” which attributed all cancer to the activation of certain genes intrinsic to cells – not transported there by viruses. These hypothetical cancer genes, called proto-oncogenes, are activated by being mutated.

The oncogene theory at the outset posited that a single gene mutation was enough to turn a normal cell into a cancer cell. Although a few types of cancer are now considered to be caused by a single gene mutation, these are the exceptions. It was always unlikely that a single mutation was the major factor: mutations occur at a predictable rate in the body, and since the cells in the body number in the tens, if not hundreds, of trillions, we would all have cancer if a single hit were sufficient to transform a cell. In time it was proposed that two genes, then six or seven genes, would have to mutate in the same cell during its lifetime to make it cancerous. That was just guesswork based on gene-mutation theory. Researchers have never been able to show, except in a few isolated cases, that any gene, mutated or not, could transform a normal cell into a cancer cell, nor that it could start a tumor in any animal. Further, they have never been able to show that any combination of genes taken from a cancer cell can transform normal cells, whether tested in vitro or in vivo. The theory has not been confirmed by any functional test. Mutated genes can be transported into test cells, but when these genes are integrated into the cell’s DNA the recipient cells do not turn into cancer cells, and when injected into experimental animals they do not cause tumors. Genetically engineered mice have mutated oncogenes in every cell of their bodies, so one would think they would die immediately of cancer. In fact their offspring grow up and live long enough to pass on their supposedly deadly genes to the next generation.
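
The “we would all have cancer” argument can be made concrete with a back-of-the-envelope calculation. The mutation rate and division count below are rough order-of-magnitude assumptions of mine for illustration only, not figures from the book:

# Rough check on the single-hit version of the gene-mutation theory.
# Both numbers below are illustrative order-of-magnitude assumptions.
mutation_rate_per_gene = 1e-7   # chance a given gene mutates in one cell division (assumed)
lifetime_cell_divisions = 1e16  # total cell divisions in a human lifetime (assumed)

expected_mutant_cells = mutation_rate_per_gene * lifetime_cell_divisions
print(f"Expected cells carrying a mutation in any one particular gene: {expected_mutant_cells:.0e}")
# ~1e9 such cells per person; if one hit in one gene sufficed,
# cancer would be a certainty for everyone, which it is not.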

Cells divide in the course of life, sometimes frequently, as in the gut or skin, and sometimes infrequently, as in bone or muscle. This cell division is called mitosis (from the Greek mitos, meaning thread). The chromosomes (from the Greek chrōma, color, plus sōma, body) double up and then separate, and after mitosis a full complement of chromosomes ends up in each of the two resultant cells. Normal cells in the body divide only a certain number of times, called the Hayflick limit, which is roughly 50, before expiring. This limit is thought to be a cause or a concomitant of aging.

Normal mouse cells have 40 chromosomes; humans have 46 – 23 from each parent. Such cells are called diploid because they have two of each chromosome. Genes are segments of DNA strung along these chromosomes, and the largest chromosomes incorporate several thousand genes each. It may not be flattering to the male ego, but the “Y” chromosome, which defines a male, is by far the smallest of all the human chromosomes.

Sometimes there is an error in the anaphase or telophase stage of cell division such that the chromosomes do not separate properly but end up unequally in the daughter cells. One cell comes up a chromosome short while the other has an extra one. Such pathological cells will usually die, and then there is no problem. But sometimes the error persists, more likely in the cell with the extra chromosome. The cell just keeps on dividing, ignoring the Hayflick limit and its control mechanisms. This is dangerous because a tumor forms in that part of the body – and that is cancer.

This abnormality in the number of chromosomes in a cell is called aneuploidy. Some human cancer cells may have as many as 80 chromosomes instead of the normal 46, giving them nearly double the right number of genes. It is not just that a cell has two or three genes that malfunction; it is far worse than that. And the more aneuploid the cells become, the more likely they are to metastasize.

A leading cancer researcher, Bert Vogelstein of Johns Hopkins, accepted a few years ago that “at least 90% of human cancer cells are aneuploid.” More recently his laboratory reported that aneuploidy “is consistently shown in virtually all cancers.” More and more researchers accept that cancer cells are aneuploid. Some insist that this is a consequence, not a cause, of cancer, but others are beginning to say that perhaps both gene mutation and aneuploidy play a role. Christoph Lengauer, a researcher in the famed Vogelstein/Kinzler lab at Johns Hopkins University, said that “with our experiments, we found that aneuploidy is a very early event in tumorigenesis.”

What is the answer then? Does aneuploidy have no connection with cancer; is it the main cause of cancer; or is it a cause in addition to gene mutation? At this point the last possibility seems to be the most likely.