Let’s see at the outset if we can agree on something: energy cannot be created or destroyed – the 1st Law of Thermodynamics. If that is true, and there is enough potential energy extant on earth for human use, then we cannot run out of energy. It is like our use of water. All of the water used by humans, other animals, insects, arachnids, plants, trees, etc. does not disappear out into space. True, water conservationists may be considering only potable water, but that is just a matter of technology to convert or recapture water to make it useful. And if the argument then shifts to the expense of reclaiming water, read on, for the same reasoning applies to energy.
The Greens and even other people, especially, but not only, on the left, believe that energy use (in particular in the USA) should be curtailed and perhaps rationed; the world is rapidly running out of useful energy; and fossil fuel derived energy sources are noxious, are ruining the environment, and therefore should be quickly phased out. None of these assertions is true as I shall endeavor to illustrate.
At this point it is important to define what is being discussed. Considering the possible sources of energy – fossil fuels, wind, the sun, biomass, and nuclear – there is no shortage of raw energy. But what exactly is energy (see list of definitions)? Those raw energy sources are potential energy capable of being converted to do useful things. The process of making energy useful is to apply order to it, that is to say, to decrease its entropy. Sunshine, wind, fossil fuels, nuclear fuel – all have to be ordered to coax useful work out of them. In fact, by far, most energy is used to extract, refine, transport, and apply energy to do what we consider useful. And the total process is exceedingly wasteful and inefficient. As recounted by Peter W. Huber and Mark P. Mills in their 2005 book, The Bottomless Well: The Twilight of Fuel, the Virtue of Waste, and Why We Will Never Run Out of Energy, Thomas Edison’s first light bulb converted 4% of the energy input to the bulb into light – 96% was dissipated as heat. It gets worse than that, much worse. The power plant Edison built to light his bulbs did not even convert 10% of its heat into electricity. The total energy conversion from power plant to light bulb output was less than ½ of one percent.
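The arithmetic behind that last figure is worth making explicit: stage efficiencies multiply, so even two modest losses compound into a tiny end-to-end number. A minimal sketch (the 10% plant and 4% bulb figures are the ones Huber & Mills cite above; the helper function is mine, for illustration only):

```python
# Stage efficiencies in an energy chain multiply together, so even a
# short chain of lossy stages yields a tiny overall conversion figure.

def end_to_end_efficiency(stage_efficiencies):
    """Multiply per-stage efficiencies to get the overall fraction of
    input energy that survives to the final useful output."""
    total = 1.0
    for eff in stage_efficiencies:
        total *= eff
    return total

# Edison's plant: under 10% of its heat became electricity.
# Edison's bulb: 4% of its input became light.
overall = end_to_end_efficiency([0.10, 0.04])
print(f"{overall:.2%}")  # 0.40% -- indeed less than half of one percent
```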
Efficiency of raw energy conversion has gotten better over the years and keeps getting better; however, the fraction of a resource’s potential energy that gets converted into useful work is still small. Most of the wasted energy is dissipated as heat. Fortunately this dissipated heat is a small percentage of the total heat which reaches the earth from the sun during daylight and is then radiated back into space at night. According to Huber & Mills, an estimated 5 million Quads (see definitions) of solar energy reach the earth per year – 10,000 times as much as humans consume in the form of fossil fuels, crops, and wood. So no, in case you were worried, we are not turning the earth into a gigantic hot forge by our inefficient use of energy.
Over the past 30 years, appliances, air-conditioners, refrigerators, light bulbs, electronics, etc. became 30-50% more efficient, yet in the USA we burn an additional 400 million tons of coal per year. Increasing the efficiency of energy generation increases, not decreases, the use of energy. Tremendous improvements in efficiency create more demand, not less. Our main use of energy is not lighting, locomotion, or cooling; it is to extract, refine, process, and purify energy itself. And the more efficient we become at refining energy, the more of it we want to use. Say’s Law [Jean-Baptiste Say (1767-1832)] states that additional supply creates additional demand. Collective wants are insatiable. In order to lower consumption it would be necessary to lower efficiency. I do not know anyone who is in favor of that.
In the United States 40% of total raw energy is supplied by oil and 60% by coal, gas, uranium, and hydroelectric. For electricity, 50% is supplied by coal; 20% by uranium; 18% by natural gas; 0.27% by wind; 0.013% by solar; and the rest primarily by hydro and oil. After the Three Mile Island meltdown in 1979, pundits declared that was the end of civilian nuclear power. The Chernobyl accident seven years later confirmed it in their minds. Yet since 1979 the amount of electricity generated by uranium has increased from 11% to 20% of the total today. No new reactors were built in the intervening years, except the ones already under construction; however, both these new ones and the existing ones were made more efficient and were run more hours per day.
Electricity is responsible for 60% of our GDP and 85% of our growth in energy demand since 1980. In the next several decades, hybrid automobiles combining battery-electric propulsion with gasoline engines, and eventually fully electric drive trains, will increase the relative utilization of electricity. As it is now, less than 20% of the cost of owning and operating an automobile is due to the cost of gasoline (so why all the grumbling and whining over the price of gas?), and with the future hybrids the cost of powering these vehicles will be even less. The cost of electricity for automobile propulsion, versus the cost of gasoline, is 1/3 for electricity generated by coal and 1/10 if generated by uranium.
If the Greens and their sympathizers do not give up their unrelenting and unrealistic opposition to oil and gas exploration and refining, and especially to uranium utilization, then this is going to happen: as the ever more efficient and widespread use of electricity eventuates, the United States will burn more and more coal. No politician, either Democrat or Republican (no others count), will be responsible for the lights going out as was experienced in California under Governor “Gray Skies” Davis (Davis did not cause the problem, which was building for years before he became governor, but he was in office at the critical time and contributed to the problem rather than the solution, so he paid the price). Uranium is by far the densest (most potential energy per unit mass and volume) and most economic form of raw energy, is readily available, and can be made completely safe. Let the fanatics explain why it should not be more widely used as a raw energy source.
Is the United States an energy consumption hog? We consume about 100 Quads of raw energy per year today, up from 7 in 1910 and 35 in 1950. The USA consumes 43% of the world’s gasoline, 26% of the electricity, 25% of the petroleum, 25% of the natural gas, and 23% of the hard coal. We also produce 25% of the world’s goods and services, so our consumption is not out of line with our productivity. Despite what the effete Europeans might think, America spreads democracy in the world (a good thing according to sensible people), provides military defense (also a good thing), feeds people around the world out of proportion to America’s population, and pioneers many key inventions and engineering technologies. This would not be possible if we were burning dung as a primary fuel. As countries become more industrialized and more productive they consume more raw energy. From 2002 to 2004 the USA used 700,000 more bbls. of oil per day; during this same time period China’s increase was 1.47 million bbls. per day. In contrast to the 2 gigawatt Hoover Dam on the Colorado River, the Chinese built an 18 gigawatt dam on the Yangtze River.
Is it true that the more fossil fuels are used up, the less there are available and the more expensive they become? Intuitively it would seem so, yet this has not been true historically. Global oil production has increased from 66.8 million bbls./day in 2002 to 69.2 in 2003, 72.5 in 2004, and is at a still higher rate in 2005. In 1980 some experts predicted that the price of oil would be $200/bbl. by 2003. There were only about 30 billion bbls. of proven reserves in the USA in 1979; since then 67 billion bbls. of crude have been produced. Recently what was described as the largest onshore oil field found in the lower 48 states in the last 30 years was discovered in Utah. The history of such discoveries is that reserves are inevitably overestimated at first, then underestimated as more data come in, and finally, with still more data, more accurately estimated; in any case, such discoveries are typical when the incentives are there.
The price of crude oil has remained remarkably stable over the years despite increased costs of extraction. Wells which are drilled through 10,000 ft. of water, 20,000 ft. of rock vertically, and 30,000 ft. of rock horizontally have not materially raised the cost of crude, inflation adjusted, over wells drilled in 100 ft. of water in 1954, and unit costs are less than those of the 69 ft. well drilled in Pennsylvania in 1859. According to Huber & Mills, production costs of the Statfjord oil field in the North Sea are not much higher than those of the Spindletop field of southeast Texas discovered in 1901. Even though the popular perception is that recent gasoline prices in the USA were at historic high levels, the inflation adjusted prices in 1981 were at equal or higher levels. And to promote cleaner burning, the various gasoline additives mandated by different states have increased gasoline costs in the interim.
On the order of 5 million Quads per year of solar energy reach the earth. Worldwide coal reserves are 200,000 Quads and oil shale at least 10,000 Quads. There are 3.5 trillion bbls. of recoverable heavy oil in tar sands in Canada and Venezuela – at 80 million bbls./day consumption this would represent a 100 year supply. Global consumption is 345 Quads of fossil fuels (100 in the USA) per year. Given these huge reserves of raw energy along with renewable sources such as wind and hydroelectric as well as nuclear, one could realistically conclude that energy sources are essentially infinite.
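The tar-sands figure can be checked with simple division. The barrel counts are the ones quoted above, and the arithmetic is the only thing this sketch demonstrates (it actually comes out a bit above the round 100-year figure):

```python
# Years of supply = recoverable reserves / (daily consumption * 365).

def years_of_supply(reserves_bbl, consumption_bbl_per_day):
    """How long a reserve lasts at a constant daily draw-down rate."""
    return reserves_bbl / consumption_bbl_per_day / 365

# 3.5 trillion bbls. of recoverable heavy oil at 80 million bbls./day:
print(round(years_of_supply(3.5e12, 80e6)))  # about 120 years
```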
Is the earth’s environment slowly (some would say not so slowly) being degraded by humanity’s inexorable drive to extract and consume vast amounts of energy? Well, what has happened in the past? London, England has more hours of clear skies than it did a century and more ago, when great quantities of soot were spewed into the atmosphere by coal burning fireplaces and furnaces; Los Angeles is less polluted now than it was in the past few decades; and Pittsburgh used to be called the “Smoky City” – no more.
According to Huber & Mills, in 1840 it required 6000 cords of wood to produce 1000 tons of iron. As late as 1910 some 27% of all U.S. farmland was devoted to feeding horses used for transportation. Feeding the organic transportation system in 1910 required far more land than we have since seized for oil pipelines, refineries, and wells. The lower 48 states had a bit over 1 billion acres of forests when the first Europeans settled in New England. By 1920 that figure had shrunk to about ¾ of a billion acres. In recent years trees have been replanted at a rate of 3 million acres per year. Growing up on a farm in Michigan in the 1940’s, I planted thousands of various pine seedlings myself. For the first time in history a Western nation has reversed the decline of its forests. If current trends continue, America could eventually return to the levels of forestation last seen by the Pilgrims.
The British Antarctic Survey out of Cambridge, England, in a recently published report based on satellite images taken between 1992 and 2003, stated that the East Antarctic ice sheet gained about 45 billion tons of ice and thickened at an average rate of 1.8 cm. per year – enough to reduce the ocean’s rise by 0.12 mm. per year. This region comprises circa 85% of Antarctica’s total ice volume. Does this prove “Global Cooling” is the order of the day? No, but it does not exactly support the theory that the earth is becoming an oven either.
What could reverse this inexorable drive to consume more and more energy? Not an exhaustion of raw energy as has been discussed in detail, but rather a drop in world population – something that most assuredly will happen, although not soon.
Currently the world population is approximately 6½ billion. Even though the worldwide fertility rate has been declining for the last 50 years, the forecast for the world population is 8 to 9 billion by 2050. How can there be an increasing population with a decreasing fertility rate? The answers are: 1.) Even if the fertility rate is falling, as long as it remains above the replacement rate the population will increase. 2.) There is a delay of a quarter of a century or so (called momentum): today’s unusually large cohorts of children have not yet reached childbearing age, so the number of births stays high for a generation even after fertility falls. 3.) Although fewer children are being born, the ones that are born are living longer.
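Demographic momentum can be made concrete with a toy cohort model. This is an assumption-laden sketch, not a real projection: three 25-year age bands, all childbearing concentrated in the middle band, no mortality before old age, and fertility rates chosen only to mirror the declining-but-above-replacement pattern described above:

```python
# Toy cohort model: even as fertility declines toward the replacement
# level, a youthful age structure keeps total population rising,
# because ever-larger cohorts keep moving into the childbearing band.

def project(cohorts, fertility_per_step):
    """cohorts: people per 25-year age band, youngest first.
    Each step the middle (childbearing) band produces the new youngest
    cohort, everyone ages one band, and the oldest band dies off."""
    totals = [sum(cohorts)]
    for fertility in fertility_per_step:
        births = cohorts[1] * fertility / 2  # /2: roughly half are mothers
        cohorts = [births] + cohorts[:-1]
        totals.append(sum(cohorts))
    return totals

# A young population, with fertility falling 2.7 -> 2.4 -> 2.1:
print(project([100, 80, 60], [2.7, 2.4, 2.1]))
# [240, 288.0, 328.0, 341.4] -- still growing even at replacement fertility
```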
As explained by Ben J. Wattenberg in his 2004 book, Fewer: How the New Demography of Depopulation Will Shape Our Future, an ongoing United Nations study on demographics as well as the U.S. Census Bureau postulate that the world population will be 2 to 3 billion by the year 2300 – almost back to what it was in 1950, at a little over 2½ billion. That is a long time period for making projections, yet the trends are there. Of course conditions could change and therefore change the outcome, but it seems to me more likely that some catastrophic event such as a deadly pandemic disease or a nuclear holocaust would lower populations even further rather than raising population numbers by God knows what.
The replacement fertility rate is 2.1 children per woman. In 1950 the world rate was 5.0; in 2000 it was 2.7 and was previously predicted to be 2.0 by 2050. The new number is 1.85 for 2050 and that number may drop to 1.7 when still newer estimates come out in 2008 which may mean that the world’s population will peak before 2050 and not reach the 8 to 9 billion figure.
The fertility rate in the less developed world was 2.9 in 2000 and is forecast to be 2.0 by 2050. It is a commonplace that the fertility rate drops as a population gains economically. In a wonderfully sardonic commentary in the 1950’s, Bergen Evans told the story of the man who was worried that the black population in the USA was increasing faster than the white. Evans told him that yes, the black birth rate was higher than the white, but as blacks gained in economic and social status their birth rate would decline – thus mixed are all blessings! The current decline in third world countries is largely unrelated to economic development, but may be due to what is called diffusion (the spread of communications). People may not yet have experienced much in the way of increased economic prosperity, but they are not dumb; they know what is happening in the developed world, partly as the result of smaller families. Indonesia, Egypt, Iran, Brazil, Cuba, and Mexico, even though they currently have above-replacement rates of population increase, all have declining fertility rates. Wouldn’t it be the supreme irony if, at some future time, the United States offered incentives for Mexicans to come here while the Mexican government tried mightily to keep them home!
Many European countries are undergoing population declines now as are Japan and South Korea. Europe as a whole, which has a fertility rate of 1.4, down from 2.6 in 1950, is predicted to decline by 100 million by 2050. Russia alone is suffering an 800,000 per year decline in population. Japan’s fertility rate was 2.8 in 1950 and 1.3 in 2005 with a projected 17 million population decline by 2050. South Korea’s rate was 1.1 in 2005. The South Koreans have a real and growing problem with population decline, but are not expected to drop to zero by 2050 – unless the N. K. Commies launch nukes.
The fertility rate of the USA was 3.5 in 1960 at the height of the “baby boom” and is 2.0 today. Because of robust immigration and momentum, by contrast with Europe, the United States, which has a current population of almost 300 million, is projected to have a population of 400 million by 2050. Compared to historical standards we are not currently being overrun with immigrants. Early in the 20th century our population was 14% foreign born, 5% in 1970, 10% in 2005, and a projected 13% in 2050. By the 2nd half of the twenty-first century there are apt to be three great powers in the world: the United States with a 400+ million population and China and India with circa 1 billion each. Given their population declines and their quasi socialistic societies, the European countries will have to be content learning to be 3rd rate powers. A caveat to the prediction about India taking her place as one of the three great powers comes from Adlai E. Stevenson, U.S. presidential candidate in the 1950’s, who predicted that it takes 50 years for a country to industrialize. India gained independence in 1947, so by 1997 she should have been industrialized. As of 2005 she still is not, although she has made big strides – especially in the last 15 years – so perhaps his time frame needs to be stretched a mite.
DEFINITIONS
1st Law of Thermodynamics – The energy going into a system, minus the energy coming out of a system, equals the change in the energy stored in the system. Stated more simply the Law says that energy can neither be created nor destroyed.
2nd Law of Thermodynamics – Heat will, on its own, flow from a hot object to a cold object. An alternative definition is that the state of any closed system inevitably decays from more order to less order.
Energy - The name given to the ability to do work. Potential Energy is possessed by a body due to its position or form. Kinetic Energy is energy possessed by a body because of its motion.
Entropy – A measure of the disorder of a substance or system – the greater the disorder, the greater the entropy. The entropy of a system increases as its energy is dissipated into less ordered forms such as waste heat.
BTU (British Thermal Unit) – Heat required to raise one lb. of water one degree Fahrenheit.
Quad – One quadrillion (10¹⁵) BTU’s.
Horsepower – The power required to raise 550 lbs. 1 ft. in 1 sec.
Force - Force is a push or pull on an object or body.
Work - The amount of work is determined by the strength of the force used and the distance through which it moved.
Power - Power measures the rate at which work is done.
Watt – A unit used to measure power. An electric device uses 1 watt when 1 volt of electric potential drives 1 ampere of electric current through it.
Gigawatt – One billion watts.
Calorie – Quantity of heat required to raise the temperature of 1 gram of water 1 degree centigrade.
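The units defined above can be tied together numerically. The conversion factors below are standard physical constants, not figures from this essay (1 BTU ≈ 1055.06 J, 1 hp = 550 ft·lbf/s ≈ 745.7 W, 1 calorie = 4.184 J):

```python
# Numeric cross-checks of the definitions above, expressed in SI units.

BTU_J = 1055.06                          # joules per BTU
QUAD_BTU = 1e15                          # BTUs per Quad (one quadrillion)
CAL_J = 4.184                            # joules per calorie
FT_M = 0.3048                            # meters per foot
LBF_N = 4.4482216                        # newtons per pound-force
HP_W = 550 * FT_M * LBF_N                # 550 ft-lbf/s expressed in watts

print(f"1 Quad = {QUAD_BTU * BTU_J:.3e} J")   # ~1.055e+18 J
print(f"1 hp   = {HP_W:.1f} W")               # ~745.7 W
print(f"1 BTU  = {BTU_J / CAL_J:.0f} cal")    # ~252 cal
```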
Friday, April 6, 2007
POLITICALLY INCORRECT SCIENCE 11
Science journalist, senior editor of the Weekly Standard, and columnist for National Review, Tom Bethell had a paperback book titled The Politically Incorrect Guide to Science published in November 2005. Bethell grew up in Great Britain and graduated from Oxford University. Some of what follows is taken from his book.
It is interesting that Charles Darwin said his theory of evolution – natural selection and the survival of the fittest – was based on the work of Thomas Malthus (1766-1834). Malthus was a free market economist who postulated that worldwide famine would ensue because the population was increasing geometrically while the food supply was increasing only arithmetically. He was wrong because his assumptions about these increases were incorrect. This idea was taken to a ridiculous level in the 1960’s by Paul Ehrlich (see my essay Fools, Frauds & Fakes), who said that 65 million Americans would die of starvation by the 1980’s. Sacré bleu! What idiocy. Had he meant obesity, 20 years later he would have looked less ridiculous. British mathematician and philosopher Bertrand Russell (1872-1970) made the wry comment that the theory of a laissez-faire economist was applied to the animal and vegetable kingdoms and embraced by the left.
The view of many people, both in and out of science, is that if a process or substance, such as radiation, lead, mercury, etc., is toxic, then even the minutest amount is detrimental. Linus Pauling, who won the Nobel prize for chemistry in 1954 (he was working furiously on the structure of DNA [deoxyribonucleic acid] and might have won the Nobel prize for medicine except that he was beaten to the solution by the American James Watson and the Brits, Francis Crick and Maurice Wilkins), was a believer in this Linear No Threshold Theory of toxicity. Pauling and his wife ingested massive quantities of vitamin C – as much as a couple of orders of magnitude more than is recommended – as a preventative for colds and cancer. Although they both lived into their early 90’s (Pauling died at the age of 93 of prostate cancer), millions of other people lived that long or longer without taking supplemental vitamin C. Double blind studies have failed to show any correlation between large doses of vitamin C and prevention of colds and cancer. Because of his tireless advocacy of banning all nuclear testing, Pauling won the Nobel Peace prize for 1962. He was also awarded the Lenin Peace Prize, so one can see who was pushing his agenda.
There is a theory called hormesis which holds that while large doses of these materials are toxic, low doses are in fact beneficial.
Johns Hopkins researchers found that Eastern USA shipyard employees who worked on nuclear reactors for ships and submarines were exposed to 10 times the amount of radiation compared to workers who were not exposed, yet they had a 25% lower incidence of cancer than the national average.
Radon is a colorless, inert, radioactive gaseous element given off in the radioactive disintegration of radium and is found in the basements of houses all around the country in widely varying amounts. A comprehensive study at the University of Pittsburgh yielded an almost inverse relation between radon exposure and lung cancer.
Because of uranium bearing rocks and the higher elevations, radiation levels in the Rocky Mountain area are greater than in the Mississippi Valley area. However, the incidence of cancer is measurably lower in the Rocky Mts. than in the Mississippi Valley.
Although it is controversial, there is a Japanese study which indicates that survivors of Hiroshima and Nagasaki on average live longer than Japanese of the same age group in other cities. Certainly there is no clear indication that the life spans of H & N survivors have been shortened relative to other Japanese.
It was predicted that the fatalities of Chernobyl would be on the order of 150,000. The New York Times and Washington Post reported that to date 50 deaths are attributable to the Chernobyl nuclear accident (1986). And even if that total is increased owing to the longer term effects of nuclear radiation as a causative agent of cancer the original estimate will not be approached.
Then President Richard Nixon declared war on cancer in 1971 with the prediction that the war would be largely won by 1976. I do not recall there was widespread laughter by the public at that extremely silly forecast. Admittedly Nixon was not someone who inspired merriment – contempt would be more like it. Still how could he or anyone have been so naïve?
Whacko environmentalists like to claim we are all slowly, and in some cases not so slowly, being poisoned by chemicals released into the air, ground, and water by perfidious and profligate man. In fact low levels of many chemicals are beneficial to good health. Consider such chemical elements as phosphorus, magnesium, molybdenum, nickel, copper, zinc, chromium, and selenium, which are trace elements necessary for good health and are contained in many vitamin and mineral supplements. It has even been argued that mercury and lead belong in this category: large amounts are plainly toxic, while trace quantities may be harmless or even beneficial.
Is global warming occurring and if so is it caused by human activity? The “Greens” and other head cases insist the answers are yes and yes and all dissenters are right wing environmental evil despoilers who should be suppressed. As for myself, I have a few questions. Not only is there a problem of obtaining accurate and representative temperatures now, given the “heat island” effect of cities, lack of proper temperature recording equipment in third world countries, and considering the earth is 2/3 covered with oceans, but comparing these with past proxy temperatures derived from tree rings and ice cores is problematic at best.
The environmentalists estimate that global temperature has increased by 1 ºC over the past 100 years, yet there was a circa 30 year period in the mid century where average temperatures declined and most of the gain is attributed to the first half of the 20th century. If industrialization is the cause then how is this explained?
Vikings settled in Greenland about 1000 A.D. The name was not selected cynically to entice additional settlers, but was an accurate description unlike now where the names of Iceland and Greenland should more descriptively be interchanged. Iceland has the advantage of being centered over a volcanic “hot spot” and therefore has many hot water geysers to promote thermal heating.
The diet of the Vikings was 80% derived from grazing land animals and 20% from fishing in the sea. This was prior to the “Little Ice Age” from approximately 1400 to 1850. As the global cold weather persisted the diet of the Vikings changed to 80% from fishing and 20% from the diminishing herds of sheep and cattle. Eventually all the sheep and cattle died and even the fish stopped coming into the cold northern waters. The Vikings did not want to emulate the native Greenlanders because they considered them barbarians so they all starved. Hubris will get you every time.
There were many other effects of the Little Ice Age. New England was so persistently cold that the western migration of Americans was greatly accelerated. The severity of the “Irish Potato Famine” is attributable to the prolonged cold and damp weather. Question: What caused what was a relatively global warm period prior to the LIA, what caused the LIA, and what caused the warm up after?
There are two current theories as to the mechanism of the temperature change during the Little Ice Age and neither, naturally, has anything to do with the alleged irresponsible actions of man. One theory is that the sun on a still unpredictable and not understood basis periodically outputs a changing and significant amount of solar energy. Another theory is that currents carry waters from the tropical regions northward transferring heat to northern latitudes. These waters cool as they reach northern latitudes becoming denser and therefore sink, effectively forming a conveyor belt of warmer water flowing northward at the surface and cooler water below traveling southward as equilibrium tends to distribute the northern and equatorial water levels. During the relative warm period preceding the Little Ice Age the northern glaciers melted and with the mixing of less dense fresh water from the glaciers the near surface waters did not sink thereby shutting down the conveyor effect. Without the heat transfer from the tropics to the northern latitudes the Little Ice Age was initiated. Or so goes the theory.
The United States has not signed on to the Kyoto Treaty and has therefore been pummeled by the hard left in this country and around the world. However, the purpose of the treaty is clear: it is to cripple the US economy. Consider: major polluting countries – China, India, Brazil, and the third world generally – are exempt from emission standards for the undisputed reason that their economies would be hurt. The chosen base year of 1990 for reductions favors Russia and Germany, since their emissions dropped sharply after that date as inefficient Soviet-era industry shut down following the overthrow of Communism in Russia and the unification of East and West Germany, making 1990-based targets easy to meet. According to a just released Institute for Public Policy Research report, only Britain and Sweden are hono(u)ring their commitments to cut greenhouse gasses, and 10 out of 15 European signatories will miss their treaty targets without taking urgent action.
There is no doubt that AIDS (Acquired Immune Deficiency Syndrome) is a huge health problem in Africa. The question is, is it as widespread as advertised? Whether people in Africa have AIDS is largely determined not by tests for HIV (Human Immunodeficiency Virus), but by whether hospital patients have several of such symptoms as a 10% weight loss, cough, fever, and diarrhea – all symptoms of AIDS, but symptoms of other diseases as well. Additionally, false positives for HIV are produced by such conditions as malaria and pregnancy, common in Africa. Despite what is called the AIDS epidemic, the population of Africa is continually increasing, contrary to the forecasts of the most pessimistic alarmists.
The effect of a diminution in smoking and earlier detection of some cancers has lowered cancer mortality rates. However, even though there has been a significant decrease in the mortality of some forms of cancer, there are other types which have not had any meaningful decrease in the mortality rate in the past 50 years. Cancer is a serious but interesting disease – or more correctly, a series of various diseases. It is widely accepted that gene mutation is the cause of most cancers. There is an alternate explanation, the aneuploidy theory, which holds that mistakes made during cell division cause cancer: for example, when a somatic cell divides, instead of producing a cell with the normal 46 chromosomes it erroneously makes one with, say, as many as 80. University of California molecular biologist Peter Duesberg has been proposing this theory for 25 years. He is better known for his claim that HIV is not the cause of AIDS, so perhaps his explanation for the cause of cancer should be taken with a large, even very large, dose of skepticism. The complexity of cellular biology is what makes an understanding not only of cancer, but of human biology generally, so frustrating. The Human Genome Project was first conceived in 1975 with great promise for human cellular engineering advancement. Thirty years later, understanding of human cellular biology has advanced frustratingly slowly. There is no rational reason to believe that stem cell research will advance any faster.
It is interesting that Charles Darwin said his theory of evolution – national selection and the survival of the fittest - was based on the work of Thomas Malthus (1766-1834). Malthus was a free market economist who postulated that worldwide famine would ensue because the population was increasing geometrically and the food supply was increasing arithmetically. He was wrong because his assumptions about these increases were incorrect. This idea was taken to a ridiculous level in the 1960’s by Paul Ehrlich (see my essay Fools, Frauds & Fakes) who said that 65 million Americans would die of starvation by the 1980’s. Sacre´ bleu! What idiocy. If he had meant obesity 20 years later he would have looked less ridiculous. British mathematician and philosopher Bertrand Russell (1872-1970) made the wry comment that the theory of a laissez-faire economist was applied to the animal and vegetable kingdoms and embraced by the left.
The view of many people, both in and out of science, is that if a process or substance, such as radiation, lead, mercury, etc., is toxic then even the minutest amount is detrimental. Linus Pauling who won the Nobel prize for chemistry in 1954 (he was working furiously on the structure of DNA [deoxyribonucleic acid] and might have won the Nobel prize for medicine except he was beaten to the solution by American James Watson and the Brits, Francis Crick and Maurice Wilkins) was a believer in this Linear No Threshold Theory of toxicity. Pauling and his wife ingested massive quantities, as much as a couple of orders of magnitude more than is recommended, of vitamin C as a preventative for colds and cancer. Although they both lived into their early 90’s (Pauling died at the age of 93 of prostate cancer) millions of other people lived that long or longer without taking supplemental vitamin C. Double blind studies have failed to show any correlation between large doses of vitamin C and prevention of colds and cancer. Because of his tireless advocacy of banning all nuclear testing he won the Nobel Peace prize in 1962. He also won the Lenin Peace Prize in the same year so one can see who was pushing his agenda.
There is a theory called hormesis which holds that while large doses of these materials are toxic, low doses are in fact beneficial.
Johns Hopkins researchers found that Eastern USA shipyard employees who worked on nuclear reactors for ships and submarines were exposed to 10 times the radiation of workers who were not so exposed, yet they had a 25% lower incidence of cancer than the national average.
Radon is a colorless, inert, radioactive gaseous element given off in the radioactive disintegration of radium; it is found in the basements of houses all around the country in widely varying amounts. A comprehensive study at the University of Pittsburgh yielded an almost inverse relation between radon exposure and lung cancer.
Radiation from uranium-bearing rocks, plus cosmic radiation owing to the higher elevations, makes background radiation in the Rocky Mountain area greater than in the Mississippi Valley area. However, the incidence of cancer is measurably lower in the Rocky Mountains than in the Mississippi Valley.
Although it is controversial, there is a Japanese study which indicates that survivors of Hiroshima and Nagasaki on average live longer than Japanese of the same age group in other cities. Certainly there is no clear indication that the life spans of Hiroshima and Nagasaki survivors have been shortened relative to those of other Japanese.
It was predicted that the fatalities of Chernobyl would be on the order of 150,000. The New York Times and Washington Post reported that, to date, 50 deaths are attributable to the Chernobyl nuclear accident (1986). And even if that total increases owing to the longer-term effects of nuclear radiation as a causative agent of cancer, the original estimate will not be approached.
Then-President Richard Nixon declared war on cancer in 1971 with the prediction that the war would be largely won by 1976. I do not recall there being widespread laughter by the public at that extremely silly forecast. Admittedly Nixon was not someone who inspired merriment – contempt would be more like it. Still, how could he or anyone have been so naïve?
Whacko environmentalists like to claim we are all slowly, and in some cases not so slowly, being poisoned by chemicals introduced into the air, ground, and water by perfidious and profligate man. In fact low levels of many chemicals are beneficial to good health. Consider such chemical elements as phosphorus, magnesium, molybdenum, nickel, copper, zinc, chromium, and selenium, which are trace elements necessary for good health and are contained in many vitamin and mineral supplements. Under the hormesis theory even mercury and lead may belong in this category: large amounts are unquestionably toxic, but minute quantities appear to do no measurable harm.
Is global warming occurring, and if so is it caused by human activity? The “Greens” and other head cases insist the answers are yes and yes, and that all dissenters are right-wing evil despoilers of the environment who should be suppressed. As for myself, I have a few questions. Not only is there a problem obtaining accurate and representative temperatures now, given the “heat island” effect of cities, the lack of proper temperature recording equipment in third world countries, and the fact that the earth is roughly 2/3 covered by oceans, but comparing these with past proxy temperatures derived from tree rings and ice cores is problematic at best.
The environmentalists estimate that the global temperature has increased by 1 °C over the past 100 years, yet there was a circa 30 year period in mid-century when average temperatures declined, and most of the gain is attributed to the first half of the 20th century. If industrialization is the cause, then how is this explained?
Vikings settled in Greenland about 1000 A.D. The name was not selected cynically to entice additional settlers, but was an accurate description, unlike now, when the names of Iceland and Greenland could more descriptively be interchanged. Iceland has the advantage of being centered over a volcanic “hot spot” and therefore has many hot springs and geysers to provide geothermal heating.
The diet of the Vikings was 80% derived from grazing land animals and 20% from fishing in the sea. This was prior to the “Little Ice Age,” which lasted from approximately 1400 to 1850. As the global cold persisted, the Vikings’ diet shifted to 80% from fishing and 20% from the diminishing herds of sheep and cattle. Eventually all the sheep and cattle died, and even the fish stopped coming into the cold northern waters. The Vikings did not want to emulate the native Greenlanders, whom they considered barbarians, so they all starved. Hubris will get you every time.
There were many other effects of the Little Ice Age. New England was so persistently cold that the westward migration of Americans was greatly accelerated. The severity of the “Irish Potato Famine” is attributable in part to the prolonged cold and damp weather. Question: What caused the relatively warm global period prior to the LIA, what caused the LIA itself, and what caused the warm-up after?
There are two current theories as to the mechanism of the temperature change during the Little Ice Age, and neither, naturally, has anything to do with the alleged irresponsible actions of man. One theory is that the sun, on a still unpredictable and poorly understood basis, periodically varies its output of solar energy by a significant amount. The other is that ocean currents carry water from the tropical regions northward, transferring heat to the northern latitudes. These waters cool as they travel north, become denser, and therefore sink, effectively forming a conveyor belt: warmer water flows northward at the surface while cooler water below travels southward as equilibrium redistributes water between northern and equatorial latitudes. During the relatively warm period preceding the Little Ice Age the northern glaciers melted, and with the mixing in of less dense fresh meltwater the near-surface waters no longer sank, shutting down the conveyor. Without the heat transfer from the tropics to the northern latitudes, the Little Ice Age was initiated. Or so goes the theory.
The United States has not signed on to the Kyoto Treaty and has therefore been pummeled by the hard left in this country and around the world. However, the purpose of the treaty is clear: it is to cripple the US economy. Consider: major polluting countries such as China, India, and Brazil, along with third world countries, are exempt from emission standards for the undisputed reason that their economies would be hurt. The chosen base year of 1990 for reductions favors Russia and Germany, as it was subsequent to then that their heavily polluting industries collapsed owing to the overthrow of Communism in Russia and the unification of East and West Germany. According to a just released European Institute for Public Policy Research report, only Britain and Sweden are hono(u)ring their commitments to cut greenhouse gases, and 10 out of 15 European signatories will miss their treaty targets without taking urgent action.
There is no doubt that AIDS (Acquired Immune Deficiency Syndrome) is a huge health problem in Africa. The question is whether it is as widespread as advertised. Whether people in Africa have AIDS is largely determined not by tests for HIV (Human Immunodeficiency Virus), but by whether hospital patients have several of such symptoms as a 10% weight loss, cough, fever, and diarrhea, which are all symptoms of AIDS and symptoms of other diseases as well. Additionally, false positives for HIV are produced by such maladies as malaria and pregnancy, both common in Africa. Despite what is called the AIDS epidemic, the population of Africa is continually increasing, contrary to the forecasts of the most pessimistic alarmists.
The effect of a diminution in smoking and earlier detection of some cancers has lowered cancer mortality rates. However, even though there has been a significant decrease in mortality for some forms of cancer, there are other types which have not had any meaningful decrease in mortality rate in the past 50 years. Cancer is a serious but interesting disease, or more correctly a series of various diseases. It is widely accepted that gene mutation is the cause of most cancers. There is an alternate explanation, called the theory of cell duplication or the Aneuploidy Theory, which holds that mistakes made during cell division cause cancer. For example, when a somatic cell divides, instead of producing a cell with the normal 46 chromosomes it erroneously makes, say, a cell with as many as 80 chromosomes. University of California molecular biologist Peter Duesberg has been proposing this theory for 25 years. He is better known for his claim that HIV is not the cause of AIDS, so perhaps his explanation for the cause of cancer should be taken with a large, even very large, dose of skepticism. The complexity of cellular biology is what makes an understanding not only of cancer, but of human biology generally, so frustrating. The Human Genome Project was first conceived in 1975 with great promise for advances in human cellular engineering. Thirty years later, understanding of human cellular biology has come frustratingly slowly. There is no rational reason to believe that stem cell research will advance any faster.
Saturday, March 31, 2007
THE SIXTH SENSE MYTH 10
With the recent tristful and unfortunate (what were the chances of that stingray fatally striking him in the heart with its barbed tail?) death of 44 year old Australian crocodile hunter and naturalist Steve Irwin, I heard several commentators talking about Irwin’s “sixth sense” when dealing with wild and funest animals. Quite apart from the obvious failure of his “sixth” or any other sense of danger in this instance, the concept of a sixth sense is what is called a vulgar error. Not vulgar as in obscene, but in the original Latin (vulgus, the masses) sense of the common people as distinguished from informed people.
When speaking of a “sixth sense” the assumption is that there are five other senses – sight, hearing, smell, taste, and touch. The number of senses is a matter of definition. In fact one can make a case for only one sense or as many as 13 or more. The sense of touch could encompass all of the senses (sound waves impinging on the eardrum or light waves on the retina, etc.) or, in addition to the five already mentioned, there are the senses of balance, orientation, heat, cold, pain, movement, pressure or resistance, as well as hunger and thirst.
This word misuse reminds me of the now seemingly ubiquitous expression, especially from television talk show hosts and anchors: “You will not believe this next story.” If the assumption is that the audience (more precisely, viewers, since television involves sight as well as sound) will not believe what they are about to be told, then what is the purpose of telling them? It is a rhetorical statement completely devoid of originality and steeped in clichéd repetition. More to the point is the question of why these people lemming-like keep making this completely inane statement.
When you next hear someone unthinkingly repeating the cliché of a “sixth sense” you can make a decision whether you want to let them continue in their blissful ignorance or risk offending them by pointing out their error. My predilection is always to educate and risk offense.
Friday, March 23, 2007
The Great Mortality & Immorality 9
The Black Death, or as it was more commonly called at the time, the Great Mortality, invaded Europe in 1347-52, apparently from the steppes of Central Asia, becoming the most devastating plague of all time. The population of Europe at that time has been estimated at 75 million, and fully 1/3, at least 25 million people, died from the plague. Even if the mortality rate was not greatly iatrogenically accelerated, effective medical intervention was practically nonexistent. There have been many accounts of this plague, but one of the best is the 2005 book The Great Mortality by John Kelly.
The disease was caused by the bacillus, Yersinia pestis, with the transmission vector mainly being the rat flea, Xenopsylla cheopis, and also likely the human flea, Pulex irritans. The fleas in turn were carried by the black rat, Rattus rattus, although other varmints such as marmots may have also been involved.
There are additional aspects to this story besides the massive deaths and horrendous human suffering caused by the disease, and one is implied by the second part of this article’s title.
Often a kernel of truth is contained in some of the most hackneyed sayings, and this episode of disease is no exception. The old proverb “it is an ill wind that blows nobody good” illustrates the point. In the years after the Black Death abated, farm land became more plentiful (there were many fewer people) and the cost of labour in Europe increased for the same reason. Thus economic advantage accrued to the peasantry versus the landed gentry. Even in the cities labourers, especially skilled labourers, saw their wages rise dramatically, along with prices of course. There was an attempt by the ruling classes to impose wage and price controls, but then as now this never works. As frequently stated by economists such as Milton Friedman, Thomas Sowell, and Walter Williams, among many others: when wage and price ceilings are imposed below free market levels, shortages result; when wage and price supports are instituted above free market levels, oversupply and overproduction result, which is wasteful and inefficient in a capitalistic economy because it takes resources in labor or goods away from other more desired production and consumption. This is such a simple concept, yet it is either not understood or not accepted by socialists and other hard left types, to the detriment of economic prosperity.
Book making in the Middle Ages was a labor intensive undertaking that required several copyists. In the high-wage post-plague years this cost became prohibitively expensive for all but the very richest nobility. Enter Johann Gutenberg with his invention of the movable type printing press around 1450. One could say that the advent of the Black Death led to the mass production of printed material, including books, and a much higher literacy rate in Europe. “Necessity became the mother of invention” might be the watchword.
Once again we have the proposition that “It is not that the primitive mind can not link cause and effect, but does so where none exists.” In this case it took an ugly and deadly turn. When the Black Death hit Europe in 1347-48, perhaps understandably with the discovery of microorganisms being centuries in the future, people, especially Christians, looked for more tangible agents. As in the previous, but much less severe, plagues of the 11th, 12th, 13th, and early 14th centuries, the candidates were lepers, beggars, vagrants, and Jews. By far the most popular of these were the Jews. And Christians, by and large, showed little mercy in dealing with what they perceived was the source of their misery by inculpating Jews. One exception was Pope Clement VI, who issued a Papal Bull on July 6, 1348 stating “it can not be true that the Jews…are the cause…of the plague…for [it] afflicts the Jews themselves.” Little attention was paid to this sensible analysis.
Unlike previous campaigns against the non-culpable Jews in Europe, this time banishment and prohibition from entry were not the main actions taken. Real pogroms, including massacres and attempted forced conversion to Christianity, were undertaken. Jews in what are now Austria, England, France, Germany, Italy, Spain, and Switzerland, among other European countries, were targets of indiscriminate killing – the favorite mode being burning at the stake. In at least one documented incident all the Jews in one town – men, women, and children – were crowded into one building which was locked and set on fire. Nobody survived. Apparently the syllogism was: The Great Mortality was bad; Jews do bad things; therefore Jews caused the deadly disease.
Naturally this horrendously deadly torment had to be justified – and so it was. You see, the Jews caused the plague by poisoning water wells all over Europe. There were even a few Jews who confessed to this – after, of course, they had been put on the rack or subjected to other unbearable torture.
Compurgation was not afforded the Jews in the 20th century either, when they were massacred in great numbers during the Holocaust. Whether the number the Nazis slaughtered was 6 million or 5 million or 7 million is of little import – it was undeniably millions. Given their history of persecution it should be natural to cut them a little slack in their dispute with the Arabs over Palestine. The Europeans especially, with their historically deplorable treatment of Jews, should reasonably be less biased against the Israelis vis-à-vis the Palestinians than they are.
I am not claiming that the Israelis are completely blameless in the conflict with the Arabs, although the Arabs did start the 1948 war because they saw the Jews getting stronger with immigration and aid from around the world. When I was in Libya in the late 1950’s I met several young Palestinian Arabs who had been given 24 hours by the Jews to leave their homes in 1948. They could take with them as much of their personal property as they could manage, but they had to get out. The Israeli ambassador to the U.N., Dan Gillerman, was recently asked in a TV interview in the United States whether the Jews had not forced some Arabs out of Palestine during the 1948 war, and he said “no, absolutely not.” Ambassador Gillerman is a smart, educated, well informed politician, so he had to know what he said was not true. Yes, I am saying he lied.
Regardless of which side has the more legitimate claim on Palestine, a proposition which could be debated endlessly, it is abundantly clear to all sane, reasonable people that Israel wants to live in peace (and do business, wouldn’t you know?) with its portion of the Holy Land, while the Palestinians and other Muslims in the Middle East want the Jews either exterminated or exiled from all Arab lands.
It is singular that the funest crimes committed by Christians against Jews and other perceived non-believers hundreds of years ago in Europe would be absolutely unthinkable today. Can the same be said about Muslims?
Christians in the Middle Ages burned their enemies, real or imagined, at the stake. That does not differ materially, and certainly not in outcome, from Islamo-fascists blowing up innocent people with IED’s today. At times Medieval Christians gave Jews the option of converting to Christianity on pain of death if they refused, and the sentence was sometimes carried out. One of the stated objectives of current Islamic fanatics is to convert all non-adherents to their brand of Islam and Sharia (Arabic for "well-trodden path") law, on pain of death if they refuse. Sound familiar?
One gets the impression that the madman in Tehran (President Mahmoud Ahmadinejad) would rather forgo any conversion option when it comes to the “Great Satans” of the United States and Israel and proceed directly to termination. If the Iranian regime manufactures nuclear bombs and acquires additional help from North Korea in developing long range missile technology, then it will be in a position to carry out its intimidation, if not downright elimination, strategy against the United States and Israel. The Iranians would find out too late the meaning of a Pyrrhic victory – no matter how much damage they inflicted, they would be completely incinerated in retaliation. Sane people would realize this, but can fanatics, religious or otherwise, be counted on to be rational?
My conclusion is that Christianity has advanced immeasurably in the last few hundred years in terms of compassion, morality, and rationality, while Islam remains stuck where Christianity was centuries ago.
I would enjoy entertaining dissenting opinions on my theses if they are supported by facts (fons et origo) and apodictic reasoning. Even any lapsus calami pointed out would be appreciated.
Friday, March 16, 2007
MISCELLANEOUS RUMINATIONS 8
Americans spend multi-billions of dollars annually on foods, fads, books, treatments, surgery, séances, and God knows what else to lose weight or otherwise keep it under control. Why not the peripeteia of that, that is to say, spending time and treasure to keep one’s weight up? Sounds ridiculous, you think? Then consider: the 18th- and 19th-century English cleric and economist Thomas Malthus (1766-1834), observing that the population was increasing geometrically and the food supply arithmetically, declared in his An Essay on the Principle of Population (1798) that, absent deadly pandemics or worldwide wars, the world was in for massive starvation. What has happened to the world’s population in the centuries since?
The world population has increased approximately as follows: 600 million in 1700; 900 million in 1800; 1.6 billion in 1900; 2.5 billion in 1950; 6.5 billion today. This is certainly a geometric increase, especially in the 20th and 21st centuries. Why then has there not been catastrophic starvation? In fact this year the World Health Organization declared that more people are at risk of health problems from excessive weight (eating too much food) than from starvation. True enough, millions of people in the third world suffer from a deficiency of nutritious food, yet a shortage of food is not at fault; rather it is a delivery problem coupled with the corruption and inefficiency of United Nations and government officials on the receiving end of foodstuffs given by the generous and compassionate West.
It turns out that the supply of food has increased even more than the population over the past 300 years. The term used for this is the “Green Revolution,” meaning advanced mechanized farming, packaging, storing, and delivery methods for foodstuffs. Malthus had no way of predicting or even imagining this true revolution.
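The geometric character of the population increase can be seen by computing the compound annual growth rate implied by each pair of figures cited above. The sketch below uses only those figures and assumes "today" means roughly 2005, around when this was written:

```python
# Population figures as cited in the text; "today" assumed to be 2005.
populations = [
    (1700, 600e6),
    (1800, 900e6),
    (1900, 1.6e9),
    (1950, 2.5e9),
    (2005, 6.5e9),
]

def annual_growth_rate(p0, p1, years):
    """Compound annual growth rate implied by growth from p0 to p1."""
    return (p1 / p0) ** (1.0 / years) - 1.0

# Compute the implied rate for each successive interval.
rates = []
for (y0, p0), (y1, p1) in zip(populations, populations[1:]):
    r = annual_growth_rate(p0, p1, y1 - y0)
    rates.append(r)
    print(f"{y0}-{y1}: {100 * r:.2f}% per year")
```

The implied annual rate roughly quadruples between the 18th century and the second half of the 20th (from about 0.4% to about 1.75% per year), which is what "increasing geometrically" looks like in practice.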
More recently, from the late 1960’s to the 1980’s, a screwball named Paul Ehrlich (see my essay Fools, Frauds & Fakes) in his 1968 book The Population Bomb predicted that ½ billion people would die of starvation in the next decade and that there would be food riots in the United States in the 1980’s. He opined that the world’s population would be 1.5 billion by 1985. This Ehrlich book (it sold 20 million copies) and his 1991 book The Population Explosion were best sellers, and he made numerous appearances on national TV programs. Ehrlich was the darling of the left in his day, much as the equally loony and labile Al Gore, who is clearly afflicted with amentia, is today. The next time you hear people such as the egregious Scott Pelley (see my essay Global Warming or Cooling: Which the Hell is it?) and ABC News reporter Bill Blakemore, who stated this past week that he rejects balance in reporting on Global Warming because he considers critics of GW merely hacks and lackeys, keep in mind the record of the we-are-all-going-to-starve crowd. Objectivity is not a strong suit of the paralogistic press (mainstream news media).
The latest affright by the wacky left, quo modo, as with Global Warming, is that the world’s seas will be depleted of harvestable seafood by 2050. These kvetching people apparently will never be satisfied until depression and despair afflict every living soul on earth. A 14-member group of scientists (?) from Dalhousie U. in Halifax, Nova Scotia, Stanford, Scripps Inst. of Oceanography, Stockholm U., and British Columbia U. came up with this carking scenario. University of Washington professor of aquatic & fishery sciences Ray Hilborn responded, “It’s just mind-boggling stupid.” Indeed.
Even though, as the 16th- and 17th-century English poet and preacher John Donne (1571? – 1631) wrote in his Devotions Upon Emergent Occasions (1624), “Every man’s death diminishes me because I am involved in mankind, therefore never send to know for whom the bell tolls; it tolls for thee,” that does not mean all sacrifice of life is futile. In defense of their homeland various countries have paid heavily. Great Britain, France, and the United States collectively suffered approx. 750,000 military deaths in World War II. In the three-month Battle of Moscow in the late fall and winter of 1941, after Hitler decided to invade the Soviet Union, the Russians calamitously had 150,000 more military and civilian deaths than that - a total of 900,000 in just three months. In fact 80% of the German casualties in the war occurred on the Eastern front fighting the Soviets. It is conceivable that without Germany’s war with the Soviet Union on their Eastern front, the Allies’ invasion in June 1944 may not have succeeded. At the very least the Allies’ effort to defeat the Nazis would have been much more costly in time, blood, and treasure.
The United States had 405,000 military deaths during WWII. For every US death Japan had 5; Germany 20; and the USSR 75. The total military and civilian deaths during WWII for the Soviet Union were circa 30 million. There are those who have said that the United States should not have allowed the Soviet Union hegemony over Eastern Europe after WWII. The Red Army occupied the Eastern European countries at the finish of the war – it was a fait accompli. For the United States to have intervened would have meant another war. Wasn’t there killing enough already? Eastern Europeans suffered under the Soviet yoke for the rest of the 1940’s through the 1980’s, but they are largely free now. Incidentally, when Boris Yeltsin declared the end of the Soviet Union on 12/06/1991, the date was 50 years to the day after the Soviets counterattacked in the Battle of Moscow.
The second-biggest consumer-spending holiday in the United States after Christmas is now Halloween. And this holiday has spread to other countries, especially in Europe. In what may fairly be characterized as continuing anti-American xenophobia, the French government is attempting to minimize if not completely eliminate this, horreur, imported American celebration. However, in an irenic spirit we can reasonably empathize with the Frogs on this one. After all, what children would want to go around on Halloween saying, “bonbon ou baton (trick or treat)?” Literally this means “candy or a baton (club),” presumably applied to the noggin of recalcitrant or parsimonious treat givers. The phrase in French is “treat or trick” - butt-backwards to the English expression, but then what would you expect from the Frenchies? Even French children sensibly do not want to make themselves sound ridiculous by repeatedly yelling “bonbon ou baton.”
Friday, March 9, 2007
THE FIRST MIRACLE DRUG 7
Before reading the book The Demon Under the Microscope by Thomas Hager it was my impression, and likely that of others as well, erroneous as it turned out, that penicillin was the first ‘miracle’ anti-bacterial drug. It was not. The following is a brief recounting of what really was that drug.
In the summer of 1924 the teenage son of President Calvin Coolidge developed a blister on his big toe while playing tennis. The wound became infected with what looked like Streptococcus pyogenes, and despite the best medical treatment available at the time he was dead within two weeks. In November 1936 Harvard senior Franklin Delano Roosevelt Jr. was undergoing a round of cocktail and dinner parties and press conferences owing to his engagement to be married. Perhaps due to the stress and tiring activities of all this, young Roosevelt began feeling unwell, with a sore throat and a head that felt as if it were full of concrete. Five days later he was admitted to Massachusetts General Hospital. Again, despite the best medical treatment available at that time, Roosevelt’s condition steadily deteriorated over the next three weeks as he was diagnosed with some strain of a streptococcus infection. In desperation he was given a new and still unproven drug. In two days there was a complete turnaround, and Roosevelt soon left the hospital. The completely different outcome of these two cases was entirely due to this new drug.
Streptococcus infections, or strep for short, were estimated to kill 1.5 million victims per year in Europe and North America alone during the 1920’s. That same number adjusted for today’s population would total more than the current worldwide annual deaths from cholera, dysentery, typhoid, and AIDS combined. Such diseases as scarlet fever, rheumatic fever, and strep throat are caused by various species of streptococcus. With enough magnification all strains of strep look like twisted strings of beads (Latin streptococcus from Greek streptos twisted chain + kokkos berry or seed).
After the horrendous number of deaths of soldiers from infected wounds in WW1, the Germans, English, and French, especially the Germans, concentrated on finding medications which would cure these infections. A young German medic during WW1, Gerhard Domagk, experienced firsthand the futility of the medical treatment given to wounded soldiers in fighting infection. After the war he resolved to combat this scourge of deadly infection, graduating from medical school in 1921 and devoting himself to finding cures. After first working at a medical school he went to work in the pharmaceutical department of Friedrich Bayer & Company.
German companies were the world leaders in the manufacturing of dyes for cloth and other materials. A Jewish German chemist, Paul Ehrlich, conceived the idea and developed a method of using dyes to stain bacteria so they could be studied under a microscope. He had received a Nobel Prize in medicine for his work on immunity and serum therapy in 1908. Domagk and other German researchers started using dye molecules, especially what was called an azo dye, in attempting to develop an anti-bacterial medication composed of the dye molecule coupled with other molecules. Domagk had six laboratory assistants - all of them women. How is that for diversity in the workplace in the 1920’s? After many trials and failures, Domagk tried combining the azo dye molecule with a molecule called sulfanilamide. This medication showed great promise treating some types of strep, and after much further experimentation and refinement a drug called Prontosil was marketed by Bayer. Dr. Gerhard Domagk was named a Nobel Prize recipient by the Swedish committee in 1939. However, Adolf Hitler himself prohibited Domagk from accepting the prize because a couple of years earlier a German opponent of the Nazi regime had been awarded the Nobel Peace Prize by the Norwegian Nobel committee. It made no difference to Hitler that the Swedish committee was different from the Norwegian committee – he wasn’t going to have anything to do with the Nobel, period. Domagk was invited to Stockholm to receive his Nobel Prize in 1947, when Hitler was history.
This new anti-bacterial drug, Prontosil, and further similar drugs developed by the Germans, French, and English worked almost miraculously in laboratory tests on mice and rabbits and then on human patients. There was no such thing as a clinical trial in the 1930’s, so the patients became by default guinea pigs. Still, the cure rates were like nothing seen in medicine before. The one puzzling aspect was that while the results in vivo in the laboratory and with human patients were spectacular, in vitro the drug was a dismal failure – the pathogens in test tubes were not affected by it.
The Germans, and as a result the French and English as well, were so hung up on using dye molecules that it was practically by accident that a group of test mice were treated with pure sulfanilamide in a French lab. When the sulfanilamide alone worked as effectively as the previous drugs linked to the azo dye there was shock and, as expected, doubt in laboratories around Europe. After the results were confirmed by others, the rush was on to turn out variations of sulfa-based medications. One mystery was solved by the discovery that sulfa was the active anti-bacterial agent. Sulfa had to be released from the rest of the Prontosil molecule in order to become active. This could happen in the body of an animal, where enzymes could split the Prontosil molecule into two pieces, releasing pure sulfa as the medicine, but not in a test tube, which contained no enzymes.
By 1937 sulfa drugs were being widely used in the United States, and therein lay a cautionary tale. In Tulsa, Oklahoma, Dr. James Stephenson observed that many sick patients, a majority of them children, began showing up in Tulsa hospitals and later in hospitals in other cities around the country. A number of them began dying of renal failure. Sulfa pills, capsules, injectable solutions, and powders had recently hit the market from a variety of pharmaceutical manufacturers. Every drug store in Tulsa was selling sulfa to anyone who asked for it. Dr. Stephenson traced the problem to a proprietary drug called Elixir Sulfanilamide produced by the Massengill Company. Sulfa is difficult to dissolve; alcohol does not work well, nor do many of the common medicinal solvents. The head chemist for Massengill, Harold Watkins, found that diethylene glycol, an industrial solvent, worked very well to dissolve the sulfa. As it turned out, the problem was it also worked very well in bringing on renal failure and death - the confirmed deaths reached 67. As difficult as it is to believe today, the Massengill Company ran no tests on laboratory animals, nor did they do any clinical tests with their medicine. The FDA had been established in 1906, but was of minimal effect in regulating the medical field. As with other issues, perhaps the pendulum has swung too much the other way today. Although Watkins was slow in admitting blame for the sickness and death of the people who took his medicine, he clearly realized his culpability – he committed suicide eight months later by putting a loaded handgun to his head and squeezing the trigger.
Sulfa drugs generally had no serious side effects (except when mixed with poison, e.g. diethylene glycol). There were some relatively minor ones, however, as I can attest. When I was a teenager I was given a sulfa drug for a forgotten illness, and when I got up from bed to go to the bathroom I felt so dizzy that I almost, but not quite, fell to the floor. The medication was discontinued and, logic would dictate, I recovered anyway.
In 1939 a drug called sulfapyridine was developed in the United States as a cure for pneumonia. Within a few years this drug was saving the lives (why not say postponing the deaths? Isn’t that more accurate?) of 33 thousand pneumonia patients each year in the United States alone. By 1942 at least 3600 sulfa derivatives had been synthesized and studied and more than 30 of them had been sold in the United States. Sulfapyridine was, gram for gram, 18 times more powerful than Prontosil; sulfathiazole was 54 times more effective; sulfadiazine was 100 times more effective. The number of diseases that could be treated with the new sulfa drugs was also growing. Sulfathiazole worked as well as the other drugs against strep and also was effective against staphylococcal infections.
There seemed to be an ever-increasing spectrum of diseases which could be treated successfully with sulfa drugs. The one thing unexplained was how the medicine worked, but that was discovered eight years after Prontosil was formulated. Sulfa was less of a magic bullet than a clever imposter. It was observed that sulfa never worked as well when there was dead tissue around or a lot of pus, as in uncleaned wounds. There was a yeast extract called para-aminobenzoic acid (PABA), a chemical involved in bacterial metabolism, which had astonishing anti-sulfa abilities. Some bacteria can make their own PABA; others cannot and have to find it in their environment. For these bacteria PABA is an essential metabolite. If they cannot find it they starve. The sulfa and PABA molecules are so similar that an enzyme critical in keeping the bacteria healthy mistakes one for the other, binding sulfa instead of PABA; but since sulfa cannot be metabolized, the enzyme, with sulfa stuck to it, becomes useless. The bacteria, denied a nutrient they need, eventually starve to death.
Sulfa drugs of course had not yet been discovered by WW1, so acute respiratory diseases, including influenza, pneumonia, and bronchitis, killed circa 50,000 U.S. soldiers. In WWII, with twice as many men and women in uniform, only 1,265 died from these diseases. The main difference was the widespread use of sulfa drugs in WWII. Wartime sulfa production was more than 4,500 tons in 1943 alone, enough to treat more than 100 million patients. In December 1943 British Prime Minister Winston Churchill undertook a trip that was long and exhausting for a 69-year-old heavy smoker and legendary drinker: to Cairo, then to Teheran to meet with wartime allies Roosevelt and Stalin, back to Cairo to meet again with Roosevelt, and finally on to Tunis for a couple of days’ rest at Eisenhower’s villa. When he arrived he had a sore throat and a fever of 101ºF. X-rays revealed that he had contracted pneumonia. He was given sulfa drugs, but suffered two bouts of atrial fibrillation and an episode of cardiac failure. Finally the medication kicked in and Churchill’s temperature returned to normal. Two weeks after he became sick he flew to Marrakech, Morocco, and then home. For a while it was touch and go for one of the two leaders of the free world, with sulfa drugs being the determinant.
I would be remiss if I did not mention what everyone reading this must be thinking about now: how about the problem of drug resistance with sulfa? The answer is what you might expect. With the massive overuse of sulfa drugs, as there was and is with penicillin and other modern antibiotics, resistance to the drugs became an increasing problem, lowering their effectiveness. Killing off non-resistant bacteria and thereby leaving a few resistant strains is still an unsolved problem for these disease-fighting medications.
Incidentally, in 1928 British medical researcher Alexander Fleming noticed that a mold which had contaminated one of his bacteria plates had the odd effect of clearing a zone around itself in which the bacteria did not grow. Fleming logically thought that the mold was releasing some sort of substance that hindered the growth of the bacteria. Purifying an amount of that substance sufficient to conduct in vivo tests proved so difficult that he largely dropped the research and turned to – sulfa. He returned to penicillin years later and, along with two other British medical researchers, received the Nobel Prize for medicine in 1945.
In the summer of 1924 the teen age son of President Calvin Coolidge developed a blister on his big toe while playing tennis. The wound became infected with what looked like Streptococcus pyogenes and despite the best medical treatment available at the time he was dead within two weeks. In November 1936 Harvard senior, Franklin Delano Roosevelt Jr., was undergoing a round of cocktail and dinner parties and press conferences owing to his engagement to be married. Perhaps due to the stress and tiring activities of all this, young Roosevelt began feeling unwell with a sore throat and head which felt like it was full of concrete. Five days later he was admitted to Massachusetts General Hospital. Again despite the best medical treatment available at that time Roosevelt’s condition steadily deteriorated over the next three weeks as he was diagnosed with some strain of a streptococcus infection. In desperation he was given a new and still unproven drug. In two days there was a complete turnaround and Roosevelt soon left the hospital. The completely different outcome of these two cases was entirely due to this new drug.
Streptococcus infections, or strep for short, were estimated to kill 1.5 million victims per year in Europe and North America alone during the 1920’s. That same number adjusted for today’s population would total more than the current worldwide annual deaths from cholera, dysentery, typhoid, and AIDS combined. Such diseases as scarlet fever, rheumatic fever, and strep throat are caused by various species of streptococcus. With enough magnification all strains of strep look like twisted strings of beads (Latin streptococcus from Greek streptos twisted chain + kokkos berry or seed).
After the horrendous number of deaths of soldiers from infected wounds in WW1 the Germans, English, and French, and especially the Germans concentrated on finding medications which would cure these infections. A young German medic during WW1, Gerhard Domagk, experienced firsthand the futility of the medical treatment given to wounded soldiers in fighting infection. After the war he determined to combat this scourge of deadly infection by graduating from medical school in 1921 and devoting himself to finding cures. After first working at a medical school he went to work in the pharmaceutical department of Friedrich Bayer & Company.
German companies were the world leaders in the manufacturing of dyes for cloth and other materials. A Jewish German chemist, Paul Ehrlich, conceived the idea and developed a method of using dyes to stain bacteria so they could be studied under a microscope. He had received a Nobel Prize in medicine for his work on immunity and serum therapy in 1908. Domagk, and other German researchers started using dye molecules, especially what was called an azo dye in attempting to develop an anti-bacterial medication composed of the dye molecule coupled with other molecules. Domagk had six laboratory assistants - all of them women. How is that for diversity in the workplace for the 1920’s? After many trials and failures, Domagk tried combining the azo dye molecule with a molecule called sulfanilamide. This medication showed great promise treating some types of strep and after much further experimentation and refinement a drug called Prontosil was marketed by Bayer. Dr. Gerhard Domagk was named a Nobel Prize recipient by the Swedish committee in 1939. However Adolph Hitler himself prohibited Domagk from accepting the prize because a couple of years early a German opponent of the Nazi regime had been awarded the Nobel Peace Prize by the Norwegian Nobel committee. It made no difference to Hitler that the Swedish committee was different from the Norwegian committee – he wasn’t going to have anything to do with Nobel period. Domagk was invited to Stockholm to receive his Nobel Prize in 1947 when Hitler was history.
This new anti-bacterial drug, Prontosil, and further similar drugs developed by the Germans, French, and English worked almost miraculously in laboratory tests on mice and rabbits and then on human patients. There was no such thing as a clinical trial in the 1930’s so the patients became by default guinea pigs. Still the cure rates were as nothing seen in medicine before. The one puzzling aspect was that while the results in vivo in the laboratory and with human patients were spectacular, in vitro it was a dismal failure – the pathogens in test tubes were not affected by the drug.
The Germans and as a result the French and English as well were so hung up on using dye molecules that it was practically by accident that a group of test mice were treated with pure sulfanilamide in a French lab. When the sulfanilamide alone worked as effectively as the previous drugs linked to the azo dye there was shock and as expected, doubt in laboratories around Europe. After the results were confirmed by others the rush was on to turn out variations of sulfa based medications. One mystery was solved by the discovery of sulfa being the active anti-bacterial agent. Sulfa had to be released from the rest of the Prontosil molecule in order to become active. This could happen in the body of an animal, where enzymes in the body could split the Prontosil molecule into two pieces, releasing pure sulfa as the medicine, but not in a test tube which contained no enzymes.
By 1937 sulfa drugs were being widely used in the United States and therein lay a cautionary tale. In Tulsa, Oklahoma Dr. James Stephenson observed that many sick patients including a majority of children began showing up in Tulsa hospitals and later in hospitals in other cities around the country. A number of them began dying with renal failure. Sulfa pills, capsules, injectable solutions, and powders had recently hit the market from a variety of pharmaceutical manufacturers. Every drug store in Tulsa was selling sulfa to anyone who asked for it. Dr. Stephenson traced the problem to a proprietary drug called Elixir Sulfanilamide produced by the Massengill Company. Sulfa is difficult to dissolve, alcohol does not work well nor does many of the common medicinal solvents. The head chemist for Massengill, Harold Watkins, found that diethylene glycol, an industrial solvent, worked very well to dissolve the sulfa. As it turned out, the problem was it also worked very well in bring on renal failure and death which reached 67 confirmed. As difficult as it is to believe today, the Massengill Company ran no tests on laboratory animals nor did they do any clinical tests with their medicine. The FDA had been established in 1906, but was of minimal effect in regulating the medical field. Like other issues perhaps the pendulum has swung too much the other way today. Although Watkins was slow in admitting blame for the sickness and death of the people who took his medicine he clearly realized his culpability – he committed suicide eight months later by putting a loaded hand gun to his head and squeezing the trigger.
Sulfa drugs generally had no serious side effects (except when mixed with poison, e.g. diethylene glycol). I can attest there were some relatively minor ones however. When I was a teenager I was given a sulfa drug for a forgotten illness and when I got up from bed to go to the bathroom I felt so dizzy that I almost, but not quite, fell to the floor. The medication was discontinued and I believe logic would dictate that I recovered anyway.
In 1939 a drug called sulfapyridine was developed in the United States as a cure for pneumonia. Within a few years this drug was saving the lives (why not say postponing the deaths? Isn’t that more accurate?) of 33 thousand pneumonia patients each year in the United States alone. By 1942 at least 3600 sulfa derivatives had been synthesized and studied and more than 30 of them had been sold in the United States. Sulfapyridine was, gram for gram, 18 times more powerful than Prontosil; sulfathiazole was 54 times more effective; sulfadiazine was 100 times more effective. The number of diseases that could be treated with the new sulfa drugs was also growing. Sulfathiazole worked as well as the other drugs against strep and also was effective against staphylococcal infections.
There seemed to be an ever increasing spectrum of diseases which could be treated successfully with sulfa drugs. The one thing unexplained was how the medicine worked, but that was discovered eight years after Prontosil was formulated. Sulfa was less of a magic bullet than a clever imposter. It was observed that sulfa never worked as well when there was dead tissue around or a lot of pus as in uncleaned wounds. There was a yeast extract called para-aminobenzoic acid (PABA) a chemical involved in bacterial metabolism which had astonishing anti-sulfa abilities. Some bacteria can make their own PABA; others can not and have to find it in their own environment. For these bacteria PABA is an essential metabolite. If they can not find it they starve. The sulfa and PABA molecules are so similar that an enzyme critical in keeping the bacteria healthy mistakes one for the other, binding sulfa instead of PABA, but since sulfa could not be metabolized, the enzyme, with sulfa stuck to it, becomes useless. The bacteria, denied a nutrient they need, eventually starve to death.
Sulfa drugs of course had not yet been discovered by WW1 so that acute respiratory diseases, including influenza, pneumonia, bronchitis, and other diseases killed circa 50,000 U.S. soldiers. In WW11, with twice as many men and women in uniform, only 1265 died from these diseases. The main difference was the widespread use of sulfa drugs in WW11. The wartime sulfa production was more than 4500 tons in 1943 alone, enough to treat more than 100 million patients. In December 1943 British Prime Minister Winston Churchill undertook a long and exhausting, for a 69 year old heavy smoker and legendary drinker, trip to Cairo, then to Teheran to meet with wartime allies Roosevelt and Stalin, back Cairo to meet again with Roosevelt, and finally on to Tunis for a couple of days rest at Eisenhower’s villa. When he arrived he had a sore throat and a fever of 101ºF. X-rays revealed that he had contacted pneumonia. He was given sulfa drugs, but suffered two bouts of atrial fibrillation and an episode of cardiac failure. Finally the medication kicked in and Churchill’s temperature returned to normal. Two weeks after he became sick he flew to Marrakech, Morocco and then home. For a while it was touch and go for one of the two leaders of the free world with sulfa drugs being the determinant.
I would be remiss if I did not mention what everyone reading this must be thinking about now. How about the problem of drug resistance with sulfa? The answer is what you might expect. With the massive overuse of sulfa drugs, as there was and is with penicillin and other modern antibiotics, resistance to the drugs became an increasing problem in lowering their effectiveness. Killing off non-resistant bacteria and thereby leaving a few resistant strains is a still an unsolved problem for these disease fighting medications.
Incidentally, in 1928 British medical researcher Alexander Fleming noticed that on one of his bacterial plates a contaminating mold had the odd effect of clearing a zone around itself in which the bacteria did not grow. Fleming logically thought that the mold was releasing some sort of substance that hindered the growth of the bacteria. Purifying an amount of the mold's substance sufficient to conduct in vivo tests proved so difficult that he largely dropped the research and turned to – sulfa. He returned to penicillin years later, and along with two other British medical researchers received the Nobel Prize in medicine in 1945.
Friday, March 2, 2007
BEER CONSUMPTION & OTHER LITTLE ICE AGE PHENOMENA 6
In a 2006 History Channel TV program, food & wine expert Joseph H. Coulombe gave the following statistics:
23 gallons of beer; 1 gallon of hard liquor; and 2 gallons of wine are consumed yearly on average by each adult American.
88% of wine is drunk by 11% of Americans, most on the East or West Coasts.
Various wine, beer, and alcohol producing institutes give roughly the same figures: approximately two gallons of wine, one gallon of liquor, and from just over 20 to 22 gallons of beer, depending upon the year cited and the particular survey quoted. One survey estimated that 88% of wine is consumed by 16% of Americans.
Wine drinking is increasing in the United States - 231 million cases of wine were consumed in 2001 with an increase of 5½% in 2004. The projected increased total wine consumption by 2010 is 300 million cases.
In 1997 adult Americans consumed an average of 22 gallons of beer, 23 ½ gallons of coffee, 53 gallons of soft drinks, and 2 gallons of wine. The French consumed an average of 64 liters (17 gallons) of wine, while Germans drank an average of 120 liters (30 gallons) of beer.
Beer accounts for approx. 88% of all alcohol consumed in the United States – 11 times as much beer is consumed as wine.
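The mix of units and ratios in the figures above is easy to check. A quick sketch – my own arithmetic, not from the program cited, using the standard 3.785 liters per U.S. gallon, and noting that the source's conversions are rounded:

```python
# Sanity-check the beverage figures quoted above.
LITERS_PER_US_GALLON = 3.785

def liters_to_gallons(liters):
    return liters / LITERS_PER_US_GALLON

print(round(liters_to_gallons(64), 1))    # French wine: 16.9 gal, matching "17 gallons"
print(round(liters_to_gallons(120), 1))   # German beer: 31.7 gal, vs. the rounded "30 gallons"
print(22 / 2)                             # beer-to-wine ratio: 11.0, matching "11 times as much"
```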
Now to the reasons America is primarily a beer drinking nation:
The period called The Little Ice Age occurred from the early 14th to the middle of the 19th centuries. It was widespread, especially in the Northern Hemisphere, but it was not uniformly cold decade after decade and century after century. Instead it was on average colder than the centuries preceding and following it, with some years, including consecutive ones, of bone-chilling cold followed by a few temperate years. Still, there were long periods of cold and often wet weather, such that starting in the 14th century vineyards in Northern Europe died off, to the extent that grains were substituted to brew beer and distill hard liquor. By the time of the large immigration of Europeans to America in the 17th and 18th centuries, Northern Europeans had been brewing beer and distilling liquor instead of making wine for two centuries. Thus they took their beer-drinking culture and their beer- and alcohol-making techniques with them to America. And during this embryonic, habit-forming period on the new continent, the immigrants were overwhelmingly from Northern Europe – England, Scotland, Ireland, Germany, Holland, Poland, and Sweden, primarily. Some, but relatively few, came from the wine-drinking Mediterranean Basin.
Although not a few of the Founding Fathers (possibly even the Founding Mothers) enjoyed drinking imported wine, Thomas Jefferson brewed beer at Monticello and George Washington was the largest distiller of rye whiskey in the Colonies.
There were many other historical events which were caused or intensified by the Little Ice Age. Napoleon Bonaparte invaded Russia in the fall of 1812 with a force of circa 600,000. The winter was so severe that by the time he retreated his army was reduced to 130,000, due mostly to starvation and disease. Temperatures dropped to the minus 30s ºF, where snow crystals floated in the air instead of falling to the ground due to the high density of the cold air. It was described by one of the officers as a surrealistic world. There are accounts of soldiers cutting meat off the flanks of the wagon-pulling horses. The horses' skins were frozen and they were so numb that they did not feel it; the blood from the wounds froze so quickly there was little blood loss. In 2006 the U.S. Congress passed a law prohibiting the slaughter of horses in this country for the export of horse meat. Two of these plants are near here in North Texas. I don't know how the slicing off of pieces of meat from live working horses would be perceived today, but I suppose allowances should be made for starving soldiers.
When Napoleon’s severely depleted army entered Vilnius, Lithuania it was reduced to approximately 40,000. In 2001 a mass grave in Vilnius was discovered containing 3000 skeletons which were determined to be almost two hundred years old - all from Napoleon’s army. Starvation, typhus, and gangrene were the main causes of the death of thousands of soldiers in the city. Only 3000 to 4000 of the original 600,000 man army eventually made it back to France. One of those who made it back was Napoleon himself. On June 18, 1815 the Duke of Wellington settled Napoleon’s hash, so to speak, at the Battle of Waterloo in Belgium.
In the year 1815, when James Madison was president of the United States and Indiana was admitted as the 19th state, the single most powerful volcanic eruption ever recorded occurred on April 10th at Mt. Tambora on the Indonesian island of Sumbawa. Mt. Tambora was a 13,000 ft. dormant volcano until that fateful day, when 4,200 ft. of the top blew off, throwing 36 cubic miles of ash, sulfur gas, and debris into the atmosphere to a height of 15 ½ miles. Immediate fatalities were 70,000 people on that and adjacent islands, with the total going to 90,000 not long after. This eruption put 100 times more ash in the atmosphere than the 9,000 ft. Mt. St. Helens eruption in Washington State in 1980 and had four times the energy of Krakatoa in 1883. Worldwide climate was affected for years.
That winter Hungary had brown colored snow and in Puglia (a region in southern Italy) where it seldom snows, the snow was red tinted. Of course these were caused by the volcanic eruption that spring thousands of miles away.
The New World did not escape the debilitating effects of the Little Ice Age. During what was called the Year Without a Summer, New England had 5 consecutive days of snow in June 1816. There was also snow during July and August; in fact 75% of the corn crop was ruined by the cold temperatures and excessive moisture. There were reports of birds falling out of the sky dead from the cold. Finally many New Englanders had had enough of the continuing cold and inclement weather, and they greatly accelerated the country's westward migration. They did not realize their problems originated half a world away.
Also in the summer of 1816 the British poets Percy Bysshe Shelley and Lord Byron, along with Shelley's 18-year-old future wife Mary, vacationed along Lake Geneva in Switzerland. The weather was so wet and cold that they stayed indoors most of the time and decided to see who could write the most terrifying and gripping tale. Mary Shelley won with her novel Frankenstein.
The Black Death, which hit Europe from 1347 to 1351, was spread by rats carrying fleas infected with bubonic plague (the Yersinia pestis bacillus). At least 25 million people, 1/3 of the European population, died during this period. This high fatality rate was greatly enhanced by the population being left in a weakened state owing to crop failures brought on by the cold and wet weather at the start of the Little Ice Age.
There is a famous painting by artist Emanuel Leutze of George Washington crossing the Delaware River on the way to the Battle of Trenton, New Jersey on Christmas night of 1776. This painting shows the river choked full of blocks of ice. Unlike some renditions of historical events, this is an accurate depiction of the weather conditions at that time. Today the Delaware River seldom if ever freezes, but then the world was in the grip of the Little Ice Age.
The Medieval Climate Optimum, aka the Medieval Warm Period, occurred from the 10th to the 14th centuries and was 4º - 7º F warmer than the period preceding it. During this time wine grapes were grown in profusion as far north as southern Britain. It was also when the Vikings settled in Greenland. Today Greenland and Iceland could more accurately have their names reversed. Iceland has the advantage of being centered over a volcanic “hot spot” and therefore has many hot water geysers to provide geothermal heating.
However, around 1000 A.D., when the Vikings emigrated to Greenland, it was properly named. The Vikings subsisted on a diet of 80% land-grazing animals (mostly sheep, some cattle) and 20% cod from the sea. After the advent of the Little Ice Age in the 14th century, as the climate grew progressively colder, the Vikings' diet shifted to 20% land animals and 80% fish as the sheep and cattle died out from a lack of grass. Eventually even the temperature-sensitive cod stopped coming to the North Atlantic. The Vikings, unlike the Inuit natives, were unable to adapt to the changing climate, and coupled with their essential barter with Northern Europe being interrupted by impassable pack ice, they simply died out. The Vikings in Greenland were done in by The Little Ice Age.
What then caused The Little Ice Age? First off, the last glaciation period, which ended about 12,000 years ago, was a far more extreme period of cold, with an average drop in temperature of 9 ºF. The Little Ice Age was an estimated 4º - 5º F cooler on average than today. That does not seem like much, but it is enough to cause major climate changes. There have also been periods of glaciation with even much colder temperatures going back millions of years, when glaciers covered the midsection of America as far south as Indiana, Illinois, and Ohio, and much of Northern Europe.
The short answer to what caused The Little Ice Age is that nobody knows. Naturally there are a number of theories. One is that, for reasons not understood, the sun undergoes periodic episodes of changing energy output; the Maunder Minimum, a period of low sunspot activity between 1645 and 1715, correlates with a period of low solar energy output. Another is that there was an unusually active period of volcanism (an average of five major eruptions per century vs. normally one or less) during The Little Ice Age, which blocked heat from the sun. Still another is the Thermohaline Circulation, or Mid-Atlantic Conveyor Belt, theory. This theory holds that currents carry water from the tropical regions northward, transferring heat to northern latitudes. These waters cool as they reach northern latitudes, becoming denser, and therefore sink, effectively forming a conveyor belt: the warmer water of the Gulf Stream flows northward at the surface while cooler water below travels southward, as equilibrium tends to balance the northern and equatorial water levels. During the relatively warm period preceding The Little Ice Age the northern glaciers melted, and with the mixing in of less dense fresh water from the glaciers the near-surface waters did not sink, thereby shutting down the conveyor effect. Without the heat transfer from the tropics to the northern latitudes, The Little Ice Age was initiated. Perhaps there was a combination of these effects, but whatever it was, neither The Little Ice Age nor the preceding warm period was due to individual or industrial pollution.
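The conveyor-belt mechanism hinges on density: cold, salty water sinks, while the same water freshened by glacial meltwater does not. Here is a minimal sketch using a linearized equation of state for seawater; the coefficients are rough textbook approximations I am supplying for illustration, not figures from this article:

```python
# Linearized equation of state for seawater density (kg/m^3).
# RHO0, ALPHA (thermal expansion), and BETA (haline contraction) are
# rough textbook values, adequate only for a qualitative illustration.
RHO0, T0, S0 = 1027.0, 10.0, 35.0   # reference density, temperature (C), salinity (psu)
ALPHA, BETA = 1.7e-4, 7.6e-4

def density(temp_c, salinity_psu):
    return RHO0 * (1 - ALPHA * (temp_c - T0) + BETA * (salinity_psu - S0))

cold_salty = density(2.0, 35.0)   # surface water cooled on its way north
cold_fresh = density(2.0, 30.0)   # the same water diluted by glacial meltwater

print(round(cold_salty, 1), round(cold_fresh, 1))   # 1028.4 1024.5
print(cold_fresh < cold_salty)   # True: freshened water is lighter and stays at the surface
```

Even a 5 psu freshening outweighs the cooling in this toy model, which is the essence of the shutdown argument: the diluted surface water never gets dense enough to sink, so the belt stalls.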