Tuesday, September 19, 2017

SURGEON’S GLOVES



     The craft of surgery is an ancient one, reaching back to well before the days of Hippocrates. But of quite recent origin is one of the most important aids to the surgeon – his gloves.
     The first use of gloves in medicine, in fact, seems to have been to protect the doctor, not the patient. One of the fathers of dermatology, the Viennese physician Josef Jacob Plenck, possibly remembering the surgeon John Hunter’s 1767 inoculation experiment with gonorrhea/syphilis, recommended in 1808 that midwives with a cut or sore on the hand wear gloves while examining a patient with venereal disease. Others occasionally recommended the same. Early gloves were made from animal bladder or colon. Postmortem exams were also dangerous for physicians, and crude gloves were sometimes used in that setting. At Hopkins, William Welch used a pair imported from Germany for autopsies before gloves were used in surgery.
     Protecting the patient began with hand washing. Oliver Wendell Holmes in the U.S., Robert Storrs in England (both in 1843), and Semmelweis in Vienna (1847-8) all recommended hand disinfection to protect birthing mothers from puerperal fever. Thomas Watson at King’s College (1840-43) first suggested rubber gloves for this purpose, saying, “a glove…might be devised which should be impervious to fluids, and yet so thin and pliant as not to interfere materially with the delicate sense of touch required…” Gloves were only used sporadically, however, until germ theory came of age.
Joseph Lister (Wikipedia)
     That age dawned as Joseph Lister developed a system of “antiseptic surgery”, first described in 1867, employing large amounts of carbolic acid as the antiseptic. Surgeons took up antisepsis but soon considered “asepsis” a preferred approach, using antiseptic washes to clear the surgical field and instruments of bacteria before operating. Wound infection rates fell, but not to zero. The impossibility of sterilizing hands was a stumbling block.
     Typical was the clinic of Ernst von Bergmann at the University of Berlin. One of his staff, Kurt Schimmelbusch, published an authoritative “Anleitung zur aseptischen Wundbehandlung” (“Guide to Aseptic Wound Treatment”) in 1892. Instruments were sterilized and personnel wore sterile gowns, but no masks. For the hands: one minute of brushing with soap and hot water, wiping dry with a sterile towel, cleaning under the nails, rubbing with gauze soaked in 80% alcohol for one minute, then rinsing with dilute mercuric chloride solution and rubbing it off. Gloves were not mentioned, but an antiseptic paste (especially to cover the nail ends) was discussed, only to be dismissed as impractical. Ironically, Schimmelbusch died of sepsis from a hand infection acquired during surgery.
Johannes von Mikulicz (Wikipedia)
     Johannes von Mikulicz, at the University of Breslau, worked with bacteriologist Carl Flügge to solve the problem. First masks were introduced. Then gloves. Mikulicz used sterile finely woven white cotton gloves, sold by the dozen as “fine servant’s gloves” (presumably those that butlers wore). Knowing that when the gloves were wet bacteria could get through from the skin, he changed gloves at intervals during longer operations. If delicate tactile sense was needed at a certain point, he simply took off the gloves for the maneuver, then put on a fresh pair. After three months he asserted that he had no surgical infections (1897).
     Georg Perthes (also in 1897) in Leipzig used finely woven silk gloves that reached to the elbow, admitting that bacteria got through when wet. Anton Wölfler in Prague used military leather gloves. The first to use rubber gloves in Europe seems to have been Werner Zoege von Manteuffel in Dorpat (now in Estonia), publishing also in 1897. (Vulcanized rubber, lending flexibility and better temperature tolerance, was invented in 1845.) Working from a city hospital where contaminated cases were common, he admitted that rubber gloves were not comfortable and that he lost some tactile sensitivity, but he felt the lower infection rate was worth it. Arguments raged over how much importance to attach to bacteriologic counts from hands or gloves during operations. Did the counts really predict infection rates? The subject was thrashed out in 1898 at an important surgical congress in Germany, with no new conclusions.
William Halsted (Wikipedia)
     In the U.S., rubber was used from the start. William Halsted at Johns Hopkins had a pair of rubber gloves made (by the Goodyear Rubber Company) for his nurse and wife-to-be in the winter of 1889-90, to protect her skin against irritating antiseptics. Next the assistants who handled the instruments wore them for the same reason. Halsted later indicated that wearing rubber gloves was not a uniform practice in the hospital until late 1893 or early 1894, saying only that, considering the barrier to bacteria they offered, he had no explanation for the delay. The Hopkins gynecology surgeon, Hunter Robb, in a text of 1894, recommended routine use of rubber gloves. Halsted’s house surgeon, Dr. Joseph Bloodgood (called “Bloodclot” by the staff), said he was the first to use gloves routinely in clean cases, in 1896. Halsted, after he converted, had gloves specially made over a mold of his hands (not unlike his tailored suits and shirts). He filled his sterilized gloves with 1:1000 mercuric chloride before putting them on, a practice not adopted by most others. Mikulicz was aware of the use of rubber gloves at Hopkins, but the ones he tried were too cumbersome compared to the cotton. Of interest is a photograph, taken at Johns Hopkins in 1893, of what is believed to be the first operation in which rubber gloves were worn by the operator (see Mitchell, below). Masks, hats, and covering gowns are conspicuously absent.
Charles McBurney (Wikipedia)
     Despite initial resistance by some, rubber gloves caught on, helped by an enthusiastic article by Charles McBurney in 1898. Rubber gloves do seem to be a primarily American innovation.
    
SOURCES:

Randers-Pehrson, R. The Surgeon’s Glove. Charles Thomas Pub., 1960.
Imber, Gerald. Genius on the Edge: The Bizarre Double Life of Dr. William Halsted. Kaplan Pub., 2010.
Schlich, Thomas. “Negotiating Technologies in Surgery: The Controversy about Surgical Gloves in the 1890s”. Bull Hist Med 87: 170, 2013.
Halsted, W. “Ligature and Suture Material… Also an Account of the Introduction of Gloves, Gutta-Percha Tissue and Silver Foil”. JAMA 60: 1119, 1913.
McBurney, C. “The Use of Rubber Gloves in Operative Surgery”. Ann Surg 28: 108, 1898.
Bloodgood, J C. “Operations on 459 cases of hernia in the Johns Hopkins Hospital from June 1889 to January 1899”. Johns Hopkins Hosp Repts 7: 223, 1899.
Schimmelbusch, K. The Aseptic Treatment of Wounds (Eng trans from 2nd edition), 1894. p 54.
Brieger, G H. “American Surgery and the Germ Theory of Disease”. Bull Hist Med 40: 135, 1966.
Holmes, O W. “The Contagiousness of Puerperal Fever”. N Eng Quart J Med Surg, April 1843, p 503.
Mikulicz, J. “Über Versuche, die ‘aseptische’ Wundbehandlung zu einer wirklich keimfreien Methode zu vervollkommnen”. Deutsche Medicinische Wochenschrift 1897, v 23, p 409.
Watson, Thomas. Lectures on the Principles and Practice of Physic; Delivered at King’s College, London. 1845, V23, p 349.
Proskauer, C. “Development and Use of the Rubber Glove in Surgery and Gynecology”. J Hist Med All Sci 1958, 13: 373-381.
Mitchell, J. “The Introduction of Rubber Gloves for Use in Surgical Operations”. Ann Surg 1945, 122: 902.


Wednesday, August 9, 2017

CHIMBORAZO HOSPITAL
THE LARGEST IN THE WORLD

      Medicine on the Union side during the Civil War has been well described. What about military medicine in the South? A note on the South’s largest Civil War hospital can serve as a starter.
Samuel Preston Moore (from National Library of Medicine)
     When Fort Sumter was fired upon, the Union Army already had medical services in place, complete with a Surgeon General, career medical officers, and hospitals. The South, by contrast, had to put together a new government, an army, and an army medical service from scratch. The Confederate Congress created the Medical Department in February 1861, and President Jefferson Davis appointed Dr. Samuel Preston Moore as Surgeon General (replacing an earlier, brief appointee), a wise choice. After attending the South Carolina Medical College, Moore had joined the U.S. Army Medical Department, working in various western posts. He served in the Mexican-American War, where he met Jefferson Davis, who was impressed with his organizational skills. When South Carolina seceded, Moore resigned his Army commission and was later appointed by Davis to run the Army Medical Department. He was a stern but efficient administrator.
James McCaw (from National Library of Medicine)
     Moore centered the Medical Department in Richmond, as it was the Confederate capital, the largest city in the area, a hub for railroads, roads, and shipping, and near the fighting. When the shooting started, sick and wounded poured into the city, overwhelming the hastily established hospitals. Dr. James McCaw, a professor at the Medical College of Virginia, advised Moore to utilize a hilltop near the city (where a brewery had once stood) for a new and larger hospital. Moore authorized its construction and put McCaw in charge. Constructed with slave labor, Chimborazo was the first pavilion-style hospital in the U.S., composed of separate wooden buildings, each with its own ward (for maximum ventilation), a design suggested by Florence Nightingale after the Crimean War.
     Eventually 150 buildings went up in a gridded arrangement, most of them 30-bed wards. There was a centrally located storage building, along with repair shops, an apothecary, a kitchen, a bakery (which could turn out 10,000 loaves a day), stables, grazing cattle, and vegetable gardens. Tents were erected around the periphery to house convalescents (who were given hospital duties later in the war). It was the largest hospital in the world when completed, and at its peak it held 3,000 patients, sometimes more.
      Dr. McCaw, the Surgeon-in-Chief, was organized and knowledgeable. He employed resourcefulness, tact, and a knack for skirting restrictions to keep the hospital going until the end of the war. As the Union Army closed in, supplies grew ever scarcer, forcing doctors to improvise and experiment with whatever was at hand – turpentine instead of quinine for malaria, for instance.
Model of Chimborazo Hospital, without tents (National Park Service, through Wikipedia)
     A major advantage of the pavilion system was the ability to assign patients to groups. At first patients from the same state were grouped together. Later they were sorted by disorder: febrile diseases in one ward, other medical diagnoses in another, certain wounds in others, etc. Specialized care naturally developed, a trend seen in both North and South that continued after the war. Tents were used to isolate those with smallpox, rubella, etc.
Phoebe Pember, Chief Matron (from A Southern Woman's Story, at Hathi Trust)
     The hospital comprised five “divisions”, each with its own staff. The “matron” took care of food and cleaning, the nurses (mainly convalescent soldiers and volunteer women) nursed the sick, and the stewards handled procurement of supplies. The matron held the key to the monthly “whiskey barrel”, which officers frequently tried to requisition “for patients”. One chief matron, Phoebe Pember, relates this struggle in a memoir. Most of the kitchen personnel and orderlies (and some nurses) were African-Americans, impressed into service from both slave and free status. All blacks were paid (not to exceed a soldier’s pay) and received extra pay on major holidays.
     Altogether, 77,889 patients were admitted to Chimborazo during the war. Reflecting the era’s ignorance of germ theory, 50,350 admissions were for medical illness (mainly infectious and “camp” diseases – diarrhea, dysentery, typhoid, etc.), 14,661 for wounds and injuries, 12,000 with no recorded diagnosis, and a few others. Pneumonia, tuberculosis, malaria, and skin diseases were also common. Scurvy broke out later as food supplies dwindled. Nineteen other new hospitals went up in Richmond, replacing the original makeshift ones.
     “Hospital gangrene” was a feared complication of wounds. Though it was caused by various mixtures of bacteria, surgeons of the day knew only that it seemed to be contagious and was most common where tissue was devitalized. Swift isolation of fresh cases, irrigation, debridement, covering wounds with clean dressings, and the use of antiseptics, especially nitric acid (which had to be applied under anesthesia), seemed to control its spread – techniques used in both the North and the South, all before germ theory was known. Patient records indicate that stethoscopes were used with some frequency. Rats and maggots were frequent pests.
     The hospital staff maintained a close relationship with the nearby Medical College of Virginia (the only southern medical school that remained open during the war). Doctors attended lectures to keep abreast of new developments. Medical students gained experience on the wards as stewards. The Association of Army and Navy Surgeons of the Confederate States was organized in mid-war to read and discuss papers, many published in the newly created Confederate States Medical and Surgical Journal. Access to outside literature was limited.
     After the war Chimborazo was used as a school for freed blacks, holding day and night classes, and as a refuge where destitute ex-slaves could be fed and clothed. It later was replaced by a brewery (that failed) and eventually was turned into a National Park and memorial to a unique hospital.


SOURCES:

Cunningham, H H. Doctors in Gray: The Confederate Medical Service. 1958. Louisiana State Univ Press.
Green, C C. Chimborazo: The Confederacy’s Largest Hospital. 2004. Univ of Tennessee Press.
Pember, P Y. A Southern Woman’s Story: Life in Confederate Richmond. 1959. McCowat-Mercer Press.

Friday, July 14, 2017

OUR “NORMAL” TEMPERATURE 

     No one today thinks twice about taking the temperature of someone presenting with illness, but that wasn’t always so. Physicians in the U.S. rarely recorded temperatures before, and even during, the Civil War. Worldwide the practice was much the same. What brought about the change?
Sanctorio Sanctorius (from Wikipedia)
     Human temperature seems to have been first measured by the remarkable Sanctorio Sanctorius, professor at the University of Padua in the early 1600s. In addition to weighing intake and output to measure insensible losses, he measured temperatures using a graduated thermoscope, an open tube sensitive to atmospheric pressure as well as temperature. Closed tubes, which eliminated atmospheric pressure effects, were soon invented. Then in 1714 Daniel Gabriel Fahrenheit, a skilled instrument maker working in Holland, fashioned a mercury-filled closed-tube thermometer with a new scale that proved quite accurate. He picked zero as the temperature of a mixture of ice, water, and sea salt; a mixture of water and ice alone measured 32; and boiling water measured 212. The astronomer Anders Celsius, in 1742, brought back the centigrade scale first promulgated by Christiaan Huygens in the previous century. Both scales continued in use.
Daniel Fahrenheit (from Wikipedia)
Anton de Haen (Wellcome Library)
       Thermometry still had no sex appeal, however. Of the few prominent medical men who adopted it in their practice, one stands out: Anton de Haen, a former pupil of Hermann Boerhaave. As professor of medicine at the University of Vienna from 1754 to 1776, de Haen recorded the levels and diurnal variations in temperatures of normal and diseased subjects, correlated high temperatures with high pulse rates, and noted the association of rising temperatures with chills. But most clinicians saw no practical value in measuring temperatures. Concepts of the mechanisms of heat production and regulation, both in physics and in biology, were not developed until the early 19th century, and fever was still a disease, not a symptom.
     The person most influential in bringing the medical world around to using the thermometer was Carl Wunderlich. Wunderlich studied medicine at the University of Tübingen, graduating in 1837. Grumbling about the backward state of German medicine at the time, he teamed up with two of his schoolmates, Wilhelm Griesinger and Wilhelm Roser, to usher in a new era. The three founded a new journal, the “Archiv für Physiologische Heilkunde” (Archives of Physiologic Medicine, or Medical Science), to emphasize the importance of physiologic investigation in addition to clinical observation. It was the beginning of a golden age in German medicine, assisted by government subsidies and attracting students from around the world.
Carl Wunderlich (National Library of Medicine)
     Wunderlich ended up as professor of medicine at the Univ. of Leipzig, where he conducted his temperature studies (and where clinical physiology achieved world renown). With Germanic thoroughness he recorded around-the-clock temperatures of some 25,000 patients, collecting over a million measurements. Both normal subjects and the sick of all ages were included. The work culminated in a book, Das Verhalten der Eigenwärme in Krankheiten (On the Temperature in Diseases), that exerted wide influence.
    Wunderlich used a mercury bulb thermometer and preferred temperatures taken in “a well-closed axilla” over oral or rectal temperatures. His work established the now familiar “normal” average temperature of 37°C (98.6°F), with oscillations seldom exceeding 0.5°C each way. He demonstrated rather typical fever patterns for various diseases, such as typhoid, typhus, relapsing fever, smallpox, measles, etc., thus using the thermometer as a diagnostic aid. Fever patterns, especially high temperatures, could also help with prognosis.
English version of Wunderlich's book, second edition (Hathi Trust)
     In spite of the influence of the work, there were problems. Statistical methods were not well developed at the time and Wunderlich’s measurements are not tabulated, but rather summarized. More important, how accurate were the measurements? Wunderlich himself states, “Errors that do not exceed half a degree Centigrade are hardly worth mention”, suggesting that high precision was not a priority.
     And another question: is 37°C really the “normal” average temperature? In 1992 a group of 148 normal subjects aged 18 to 40 was studied over three days with modern Diatek electronic oral thermometers (700 readings). Their mean (and median) temperature turned out to be 36.8°C ± 0.4°C (98.2°F ± 0.7°F).
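     (A brief note on the arithmetic, using only the standard Celsius-to-Fahrenheit conversion rather than anything from the sources: the Fahrenheit figures follow directly from the Celsius ones.)

\[
^{\circ}F = \tfrac{9}{5}\,^{\circ}C + 32, \qquad
36.8 \times \tfrac{9}{5} + 32 \approx 98.2\,^{\circ}F, \qquad
\Delta^{\circ}F = 1.8\,\Delta^{\circ}C \;\Rightarrow\; \pm 0.4\,^{\circ}C \approx \pm 0.7\,^{\circ}F.
\]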
     Why the difference? Wunderlich preferred axillary temperatures, known to be lower than oral ones. He used glass thermometers with a mercury column, most about 12” long, that could take 15 to 20 minutes to equilibrate – frustrating to a busy nurse. Another possible source of error is calibration of the thermometers. The first internationally accepted temperature scale was not established until 1897, ten years after Wunderlich’s death. We cannot test Wunderlich’s own instrument, but a thermometer belonging to one of his students is available (now in the Mütter Museum in Philadelphia). It is 22.5 cm long (about 9”), and in a water bath it gave readings 1.6°C to 1.8°C higher than modern electronic thermometers, partially offsetting the lower readings formerly obtained in the axilla. Physical changes from long storage, such as change in bulb size, were thought to play a minor role in such a difference. Finally, as mentioned, Wunderlich did not worry about less than a half degree variance in measurement.

     So we are “cooler” than we thought, though not by much. But Wunderlich’s work, in spite of some inaccuracies and fuzzy statistical methods, established thermometry as a permanent clinical practice.

SOURCES:

Wunderlich, C A. On the Temperature in Diseases: A Manual of Medical Thermometry. Eng trans by W B Woodman, London, 1871.
Mackowiak, P and Worden, G. “Carl Reinhold August Wunderlich and the Evolution of Clinical Thermometry”. 1994. Clin Infect Dis v18(3): 458-67.
Dominguez, A, et al. “Adoption of Thermometry into Clinical Practice in the United States”. 1987. Rev Infect Dis v9: 1193-1201.
Mackowiak, P A, et al. “A Critical Appraisal of 98.6°F, the Upper Limit of the Normal Body Temperature, and Other Legacies of Carl August Wunderlich”. 1992. JAMA v268: 1578-80.
Gershon-Cohen, J. “A Short History of Medical Thermometry”. 1964. Ann N Y Acad Sci v121: 4-11.

Sunday, June 11, 2017

THE RORSCHACH TEST


     Anybody remember the Rorschach test? It was all the rage in the fifties but has since dwindled in the public eye. Its origins are reviewed in a recent biography of its inventor, Hermann Rorschach.
Hermann Rorschach (from Wikipedia)
     Rorschach was born in 1884 in Zurich and grew up in the nearby town of Schaffhausen, where his father was an art teacher. His father died as he finished high school (gymnasium), leaving him short of money, but he managed to afford medical school at the University of Zurich. Connected to the Zurich medical school was a large psychiatric hospital, the Burghölzli, whose director, Eugen Bleuler, was already known for his commitment to mentally ill patients, probably related to the fact that his own sister had catatonic schizophrenia. Bleuler invented the name schizophrenia to replace Emil Kraepelin’s dementia praecox (premature dementia). Joining him in 1900 was a young assistant, Carl Jung.
     Rorschach did not work at the Burghölzli, but Bleuler was one of his professors, influencing him to take up psychiatry. In Zurich Rorschach met numerous Russian women (who were unable to study medicine in Russia), learned Russian himself, and fell in love with a student named Olga, six years his senior. He completed his medical studies in Berlin in 1909. An attempt to practice in Russia failed, and he took a job as a psychiatrist in a mental hospital in Münsterlingen, near Lake Constance, where he and Olga were married.
Psychiatric clinic, Münsterlingen (photo by Dominic Venezia, on Wikipedia)
     He engaged intimately with his patients, introduced art therapy, and arranged entertainments, slide shows, and dances for the patients. He and a friend, a nearby schoolteacher named Konrad Gehring, began to experiment with inkblots. While Gehring showed them to schoolchildren, Rorschach showed them to his patients. The results encouraged him to explore further.
     Inkblots were not new. Probably the first to use them was Justinus Kerner, a German romantic poet and doctor (and the first to describe botulism and document the poison’s interruption of motor nerve transmission), who printed them as an accompaniment to his poems. Playing with inkblots soon became a child’s game. The psychologist Alfred Binet (of the Stanford-Binet test) used them as a measure of the level of imagination in a child. Some say that the whole idea of conjuring up realities from abstract designs originated with Leonardo da Vinci, who had written, “By looking attentively at old and smeared walls, or stones and veined marble of various colors, you may fancy that you see in them several compositions, landscapes, battles, figures in quick motion, strange countenances…”.[i]
Rorschach card II (Wikipedia)

     At the same time, the psychoanalytic theories of Freud and Jung were coming into vogue, and Jung and Franz Riklin, both working for Bleuler at the Burghölzli, had introduced word-association tests.
     Rorschach’s work was interrupted when he took a job in a private psychiatric clinic near Moscow. Psychoanalysis was already popular in Russia; the first journal of psychoanalysis in the world was published in Russia. The technique meshed with the Russian proclivity for introspection and probing one’s “inner world”. Coincidentally, Russian “futurism” was exploding. New ideas on poetry, color perception and its relation to music, abstract forms (Kandinsky and Malevich are examples), and other themes permeated artistic and psychology circles. Rorschach, educated in art by his father, took it in.
     His next stop was a mental hospital in Herisau, in northeast Switzerland, where he returned to making blots. He used brushes and ink to create many, probably hundreds, of images, working to achieve something halfway between recognizable and unrecognizable shapes, introducing color, and opting for horizontal symmetry (rather than vertical). To allay patient anxiety the blots could be presented as a game or a test. He tried many samples on patients and eventually narrowed the selection to ten “blots” that became the standard. They helped him distinguish between certain diagnoses and provided insights into psychological types, such as introverted or extroverted, as advanced by Carl Jung. He separated patients’ answers into three categories: Form, Movement, and Color. Did an answer reflect a predominant focus on shape or form, was color important, or was something seen that moved or implied movement? His preliminary results were published in 1921 in a book entitled Psychodiagnostics.
Rorschach card IV (Wikipedia)
     But Rorschach’s research came to a tragic end. He contracted appendicitis and, because of a delay in calling a doctor and the lack of a surgeon in the small town of Herisau, died of peritonitis in April 1922 at the age of 37.
     The inkblots lived on, however. They were taken up in various countries but reached their greatest renown in America. Promoters of the blots ranged from objective researchers, who sorted answers into measurable categories to help define personality types, to those who used them as a psychoanalytic tool to probe the unconscious. Anthropologists administered them to remote tribes, the more remote the better. They were given (unofficially) to the top criminals awaiting trial at Nuremberg, by both the Army psychiatrist and the “morale officer” (who had written a book on Rorschach tests). None were “insane”, and the overall results were not outside the range found in the general population. The military experimented with using them on recruits to weed out those unfit for combat (this was dropped). Eventually controlled trials were undertaken to evaluate their accuracy, casting doubt on their value in diagnosis. Positive findings in one study were often absent in another. The problems were that the test giver could influence the results and that classifying the individualized responses to images was subjective.
     The Rorschach test, using the original ten images, is still in use, though with diminished frequency. Controversy over its usefulness remains. Results are admissible as evidence in court and the test is reimbursed by insurance companies.
Rorschach card IX (Wikipedia)

SOURCES

[i] Da Vinci. A Treatise on Painting. Trans by J F Rigaud, 1835. J B Nichols & Son, London. p 89.
Searls, Damion. The Inkblots: Hermann Rorschach, His Iconic Test, and the Power of Seeing. 2017. Crown Publishing Co., New York.
Wood, J M, et al. “The Rorschach Test in Clinical Diagnosis: A Critical Review, with a Backward Look at Garfield (1947)”. Clin Psych 2000, 56(3): 395-430.


Thursday, May 11, 2017

SCURVY and the GOLD RUSH

     On January 24, 1848, James Marshall, an employee of John Sutter, spotted gold in the American River bed. News leaked out quickly, setting off one of the great mass migrations of history. In the first year about 90,000-100,000 immigrants arrived from all over the world, a number that eventually surged to about a quarter of a million. Americans reached California by overland routes, by boat around Cape Horn, or across the Isthmus of Panama (then part of Colombia), a part-sea, part-land odyssey.
     The overland journey from Missouri to California covered 1800 miles and took about six months to complete. Food storage was almost impossible, and fresh fruit and vegetables were difficult to obtain. Assorted diseases plagued the immigrants, including cholera, other diarrheas, various fevers, and – importantly – scurvy.
     Fort Laramie in Wyoming was a stopping point on the trail. By August 1850, almost 45,000 people had passed through to rest and buy supplies, which included canned or dried fruits (prevention of food spoilage using sealed cans or jars after cooking was discovered in 1810, long before the role of bacteria was appreciated). However, the Fort’s supplies were depleted and even the soldiers were scorbutic. Further on, immigrants crossed desert country in Utah and Nevada, moving up into Oregon or California. The scurvy rate rose with the distance traveled and claimed many victims. In southern Utah a town called Pickleville sprang up, selling pickles to the Argonauts to ward off the malady. Most people knew by this time that deprivation of fresh fruit and vegetables caused scurvy, though some felt that an excess of salt in preserved meat was the culprit.
From The Lectures of Bret Harte by C M Kozlay, 1909 (Internet Archives)

     Sailing around South America took up to six months. Fresh produce was available at Rio but going around the Horn or through the Straits of Magellan could take six weeks or more, followed by a stretch to Valparaiso. Scurvy was common on these routes. Those who went through the Isthmus were seldom affected since they could eat well on the land portion of the trip. 
From The Lectures of Bret Harte by C M Kozlay, 1909 (Internet Archives)
     Dr. Thomas M Logan, who sailed from New Orleans to California, reported on scurvy in the miners. He described the hemorrhages around hair follicles that progressed to ecchymoses, the muscle and joint pains, fatigue, swollen and painful gums, and the tendency of old scars or ulcers to break down again. Heavy physical labor seemed to precipitate symptoms. “Land scurvy” sufferers (i.e., the miners), as opposed to those with “ship scurvy”, often had diarrhea, though he attributed this to bad food or water. (James Lind had long ago said there was no difference between land and sea scurvy). Logan went on to become the first secretary of the California State Board of Health, formed in 1870, and was president of the AMA in 1872. Physicians chasing after gold were surprisingly common, and those who wrote left similar descriptions.
James Lind (Wellcome Library)
     Scurvy was rife in the gold mining areas. The winters of 1848-9 and 1849-50 were particularly rainy, and the roads from Sacramento, rudimentary or nonexistent, were quagmires of mud. Getting winter provisions to the mines became almost impossible. The miners’ diet was reduced to bread, salted meat, and pork fat, and the physical labor was hard. The wife of one of Sutter’s employees had planted some pear trees, and the demand for the fruit was such that individual pears were sold before ripening, marked with tags carrying the purchaser’s name.
     Scurvy, in fact, precipitated the incorporation of the town of Sonora. In the Sonora camp the winter of 1849-50 brought so much scurvy that the inhabitants, led by the alcalde, a butcher named Charles Dodge, decided to incorporate as a city in order to build a hospital for the care of scurvy victims. Money was raised by personal subscription and the sale of vacant land. It was the first, and possibly the only, hospital devoted to treating scurvy, and it earned the nickname of the “California Haslar” (the Haslar Royal Hospital was the naval hospital where James Lind, the 18th-century investigator of antiscorbutic agents, was physician). Most of the money went to purchasing lime juice, potatoes, canned fruit, and other items at vastly inflated prices, but to good effect.
Stamp commemorating Gold Rush
     Mexicans were generally considered fairly resistant to scurvy due to their habit of eating raw onions, the outbreak in Sonora apparently being an exception. The Chinese and German camps saw less scurvy, the first allegedly because of eating undercooked vegetables and sprouting legumes, and the second because of the German fondness for potato salad.    
     Doctors profited from the plethora of diseases – scurvy, cholera, fevers, etc. – partly by selling medicines themselves. For scurvy they sold pickles and other antiscorbutics at marked-up prices. Payment was in cash, gold, or provisions. Another treatment, apparently not rare, and not on medical advice, did not work so well: burial up to the neck in the ground. Since sailors had been cured after reaching land, the idea was that being placed deep in “land” would cure rapidly. Quacks, of course, peddled many useless nostrums.
     The need for fruits and vegetables created new businesses. A young entrepreneur sailed to Tahiti in 1849 and brought back a load of potatoes, squashes, and fruits, including 46,000 oranges. After almost a two-month sail about half the produce was still edible and he still profited. Locally farmed potatoes and green vegetables showed up by 1851-2. Ranchers in southern California who had citrus trees for their own use sold fruit to the north and converted grazing land into orchards, creating a citrus industry that persists today.
     After 1850 the “epidemics” of scurvy were gone, and only scattered cases appeared. Anthony Lorenz, who supplied most of the above information, conservatively estimates that 10,000 men died of scurvy or its sequelae in the first two years of the Gold Rush, more than were claimed by cholera. Tragically, this was at a time when preventive measures were generally known.

SOURCES
Lorenz, A J. “Scurvy in the Gold Rush”. J Hist Med 1957, 12: 473-510.
Lorenz, A J. “The Conquest of Scurvy”. J Amer Dietetic Assoc 1954, 30: 665-70.
Logan, T M. “Land Scurvy: Its Pathology, Causes, Symptoms, and Treatment”. Southern Med Reports 1851, 2: 468-80.
Carpenter, K J. The History of Scurvy and Vitamin C. 1986. Cambridge Univ Press.




Friday, April 7, 2017

CHERRY TREES AND ADRENALIN

     Washington’s gorgeous cherry trees were in full bloom recently, a beautiful welcome to spring in our capital city.
Cherry trees, Washington (National Park Service photo)
     Ever wonder where the trees came from? The answer has a lot to do with adrenalin and progress in endocrinology and pharmacology.
Edward Schafer (left) and George Oliver (right) (from Wikipedia)
     In the winter of 1893-4 George Oliver and Edward Schäfer in London showed that extracts of adrenal medulla tissue injected into dogs produced sudden hypertension and arterial constriction. A few years later Bayliss and Starling demonstrated secretin, a substance from the intestine that stimulated pancreatic secretion. And something in thyroid tissue produced metabolic effects. Starling, in 1905, introduced the word “hormone”, from the Greek for “I excite”, as a label for these secretions.
     What was the blood-pressure-active adrenal hormone? The search was on in several laboratories, but only two “benchmen” will be highlighted here. The first is John J. Abel.
     Born near Cleveland in 1857, Abel studied mainly chemistry and physiology at the University of Michigan and Johns Hopkins.
John J Abel (Wikipedia)
Then he made the pilgrimage to Europe, where he studied under the physiologist Karl Ludwig (who trained a legion of Americans, including William Welch, a founder of Johns Hopkins) and under Oswald Schmiedeberg. Schmiedeberg was a founding father of a new discipline, experimental pharmacology, and his students eventually occupied chairs of some forty different pharmacology departments. While in Europe Abel obtained an MD degree and accepted the chair of a new pharmacology department at the University of Michigan Medical School, where he built a laboratory and introduced hands-on German teaching methods. It was the first such department in the U.S. (as distinct from the old, didactic departments of “Materia Medica”). Two years later, in 1893, William Osler enticed him to Johns Hopkins to head its new department.
     At Hopkins, after considerable hard work, Abel obtained crystals of a physiologically active substance from adrenals. He named it “epinephrine” in a publication in 1897, believing it to be the natural compound. Unfortunately, though he was close, he had actually isolated a benzoyl derivative. Along the way he lost an eye in a laboratory explosion, a loss he never complained about.
     Abel made many contributions, including the isolation of poisons from Amanita phalloides mushrooms and, most famously, the crystallization of pure insulin. He helped found three journals. He invented a dialysis procedure to remove toxins from the blood of animals and developed a plasmapheresis process. Both ideas, ahead of their time, gained clinical use later. He did not believe in patenting his discoveries.
Jokichi Takamine (Wikipedia)
     The second protagonist is Jokichi Takamine, born in Japan in 1857 to a physician father. Jokichi started in medical school but switched to chemistry, in which he excelled. He worked for the Japanese government on industrial projects and was sent in 1884 to represent Japan at the New Orleans World Exposition. There he met, and later married, his landlord’s daughter and eventually took up residence in the U.S. Before long Jokichi patented the first digestive enzyme for human use (an amylase), which Parke-Davis marketed under the name Taka-Diastase. Since Jokichi had set up his own lab, Parke-Davis soon asked him to try to purify the active substance in adrenals. He visited Abel’s lab and, using a different chemical approach, proceeded to isolate the pure compound. He named it “adrenalin” in 1902 – after taking out a patent. Careful review indicates that he did not “steal” any ideas from Abel, nor did Abel accuse him of it.
     Adrenalin was a “blockbuster drug” in today’s parlance, used topically for all sorts of bleeding or inflammatory lesions, asthma, hay fever, and systemically for anaphylaxis. Quacks peddled it for cancer, etc. Royalties from sales of Adrenalin (the trade name) and enzymes made Takamine a rich man. He was one of the first “biotech entrepreneurs”. 
     Takamine’s wealth and business accomplishments helped him establish important social and political contacts and he became a sort of unofficial Japanese ambassador. In 1909, hearing that President Taft’s wife was interested in planting cherry trees in Washington, Takamine secured her acceptance of 2,000 trees as a gift from the mayor of Tokyo, though Takamine quietly paid for them. Unfortunately they were diseased and had to be destroyed, but Takamine financed, behind the scenes, a second lot of 3020 carefully grown trees. They have persisted, been added to, and are in bloom today.

    The contrast between Abel, a dedicated academic researcher and teacher who refused to patent ideas, and Takamine, an equally brilliant laboratory man who used his wealth from patents to foster cross-cultural amity, is striking.

Addenda:
     1. The name “epinephrine” is the approved one in the U.S. in recognition of Abel. “Adrenaline” is approved in the U.K., and in Japan a change was made in 2006 from “epinephrine” to “adrenaline” in honor of Takamine.
     2. The H K Mulford drug company marketed its own adrenalin, arguing that since it was a natural substance it was not subject to patent. Parke-Davis sued them. The Parke-Davis v Mulford case ended in a decision by Learned Hand in 1911 holding that the isolation/purification of a natural substance rendered it patentable. The decision was cited often in the recent Myriad Genetics case about the patentability of naturally occurring genes.
     3. The study of adrenalin effects opened the way to the discovery of chemical neurotransmitters.


SOURCES CONSULTED:

Hoffman, B B. Adrenaline. 2013. Harvard Univ Press.
Kawakami, K K. Jokichi Takamine: A Record of his American Achievements. 1928. W E Rudge, NY.
Parascandola, J. The Development of American Pharmacology: John J Abel and the Shaping of a Discipline. 1992. Johns Hopkins Press.
Voegtlin, C. “John Jacob Abel”. 1939. Journal of Pharmacology and Experimental Therapeutics 67: 373-406.
