
The 200-year history of medicine and the delivery of health care.

A montage* in three acts.

Section One: people, insights, drugs, devices, money.  Section Two: the struggle to deliver care.  Section Three: problems looming on the horizon.

Section I: man discovers how the body works and learns how to heal.   

  1. The 1800s: the discovery of germs, sanitation, mosquitoes, and the first war on drugs
  2. The transformative effect of anesthesia and transfusions
  3. Tetanus, parasites, ticks, cortisone, and insulin
  4. Penicillin, TB, syphilis, and the planet-wide eradication of smallpox
  5. The mid-20th century and vaccines
  6. HIV
  7. Understanding and controlling the immune system

  • Cytokines and monoclonal antibodies
  • Transplantation
  • T cells and cancer
  • CAR-T
  • Gene therapy
  • Medical devices
  • Surgery
  • Childbirth
  • Vision

Section II: the delivery of health care

  1. The right to affordable, quality care is given by the government to people:
     • over 65 and the very poor (Medicare and Medicaid)
     • with advanced renal disease
     • in distress who arrive in an emergency room
     • Native Americans, veterans, employees of the federal government, and others
  2. Private insurers take control and operate health care as a for-profit business.
  3. The FDA and the government create the rules of the road.
  4. Hospitals and the care they provide.
  5. Obamacare (the Affordable Care Act): the attempt to provide care to most.

Section III:  looming problems

CHAPTER 1. UNDERSTANDING WHY AND HOW DRUGS BECAME EXPENSIVE.

  1. SETTING THE PRICE
  2. DONATED AND TAX DOLLARS
  3. SWISS MOVE IN
  4. DOMINATING THE MARKET
  5. SENATE HEARINGS

CHAPTER 2   ARE GENERIC DRUGS SAFE AND EFFECTIVE?

CHAPTER 3   SHORTAGES

CHAPTER 4   NEGOTIATING

CHAPTER 5   CANADIAN PHARMACIES

CHAPTER 6   MALPRACTICE

CHAPTER 7   GOUGING

In our 200-year journey to the world of healing that our ancestors dreamed of and prayed for, we have also created a land where health care, like any other private industry, has in large part become a means of creating profit for the few.

*Montage:  the process of selecting, editing, and piecing together events and people and forming a continuous whole.

Chapter 1: The 1800s

“It always seems impossible until it’s done.” – Nelson Mandela   

Prior to the last two centuries, physicians in the West often relied on the teachings of the ancients: the Greek physician Hippocrates, who concluded that illness was "due to an imbalance of blood, phlegm, black bile, and yellow bile,"2 and the Roman Galen, who dissected monkeys and wrote about their anatomy.

Over millions of years the creatures that populate our planet developed defenses against microscopic invaders.  Our immune systems became powerful, and our bodies learned how to heal fractures, limit blood loss, and much, much more.  

Gurus and religious leaders employed prayer, meditation, and positive thinking to improve the lot of sufferers.  People with trained hands detected and corrected displaced bones.  Experts knew how to turn certain plants and herbs into balms and how certain foods affected some ailments.  Some healers could therapeutically massage muscles or stimulate acupressure points.  And there have always been individuals with a seemingly lethal disease who inexplicably got well.

For most of man’s time on earth, our doctors often served the ill best when they stepped aside or heeded the dictum:  “first do no harm.”  Even today they aren’t needed very often. 

In the late 1700s most of the treatments used by the doctors of the day were pretty awful.  Consider the December morning in 1799 when 67-year-old George Washington awoke desperately ill.  He was retired, living at Mt. Vernon.  The previous day Washington had felt well and went out in the snow to "mark trees that were to be cut down."  Upon awakening on the day in question he couldn't talk and had trouble breathing.  His wife sent for one doctor, then another.  George and his wife Martha were two of the country's richest people.  They didn't need subsidized care.

During the day three prominent physicians came to their home and plied their trade.  The doctors were among the country's best, and they worked hard.  On four occasions they bled the sick man and removed a lot of blood.  His throat was swabbed, he gargled, his feet were covered with wheat bran, and he was given an emetic to induce vomiting.  Nothing worked and Washington's breathing got worse, so he dressed, thanked his three doctors, and made arrangements for his burial.  That night he died.  (As related by his secretary Tobias Lear1)

At the time of his death, 1799, we didn’t know bacteria and viruses caused infections.  Antibiotics didn’t exist.  There were no blood banks.  Doctors (and barbers) were skilled at cutting off wounded or infected limbs, but surgery was risky and painful. 

The development of the health care that people in all corners of the world (to a greater or lesser extent) currently enjoy was triggered by discoveries made by a Dutchman named Leeuwenhoek.  A contemporary of Rembrandt and Vermeer, he was born in 17th century Delft, a town in western Holland known for cool, foggy summer mornings, numerous boat-filled canals, and wide streets connected by wooden bridges.  In his day horses and carts clattered across the stones in front of a large open air market.  Food and wood were weighed before they were sold.  Narrow rows of houses surrounded the town square.  Leeuwenhoek's mother came from a well-to-do brewer's family, and he started his work life as a draper's apprentice.  He used lenses, such as they were, to check the quality of a fabric's thread.  Over time he became politically active, and he spent 40 years as the chamberlain of a city hall assembly chamber.

At age 40 he invented a totally new technique for making lenses.  His microscopes were tiny and powerful, and they revealed a world man had never before seen or suspected.  He was the first person to observe bacteria and the first to visualize and describe red blood cells.  As he watched this previously unknown world, he drew pictures and sent them to the Royal Society.  They published his letters.4  He kept the process for making his lenses a secret, and when he wrote about the microscopic world, some who worked with ordinary polished and ground glass didn't believe him.

In 1796 Edward Jenner, a British doctor, proved an old wives' tale true.  He took material from the pus-filled scabs on a milkmaid's hand and injected it into the skin of another person.  The illness it caused was mild, and the "treated" person was now immune to the highly lethal viral disease smallpox.  Thomas Jefferson and James Madison read about Jenner's findings, and Congress passed the Vaccine Act of 1813.

During the centuries that preceded Jenner's revelations, some people in parts of Africa, Asia, and later Europe were intentionally infected with live smallpox (technically, the variola virus).  Material from scabs "was blown into a person's nose," or variola-rich pus was dried and scratched into the skin.  When done right, the procedure caused a milder form of the disease.

Twenty years before Jenner published his cowpox observations, George Washington watched his defeated troops enter the high, flat ground south of the Schuylkill River at Valley Forge.  It was six days before Christmas.  The skies were grey and foreboding, and a cold wind was most likely blowing off the river.  Washington worried about a smallpox outbreak.  The disease had recently "raged through Boston."  Most of the British troops were city boys who had been exposed when they were young, and they were immune.  Washington's soldiers were commonly farmers and had not been present during the smallpox outbreaks that periodically played havoc with the colonial cities.  Battlefield flare-ups had caused deaths and had contributed to at least one defeat: the Continental forces besieging Quebec in 1775 were weakened by a smallpox outbreak and had to withdraw.

The retreating men that Washington observed entering Valley Forge were “without clothes to cover their nakedness, without blankets to lay on, and without shoes.” They walked “through frost and snow without a murmur.” Shelters were just being erected.  They had been defeated, would have to fight again, and Washington decided to act.  On his orders medical personnel gathered pus from active smallpox lesions and rubbed it into freshly created wounds in each soldier.  As Washington explained in his journal, “The need for secrecy was great, as the British would have had a significant advantage had they known of the debilitated condition of the American troops who were recovering from induced smallpox.”  The inoculations are said to have caused the deaths of 4 of every 500 soldiers.9

I find the fatality numbers a little hard to believe.  Twenty-five percent of the men who spent the winter in the valley died.  Before they were immunized they were already in a weakened state.  People with smallpox are really sick.  In epidemics the disease often has a thirty percent mortality rate.

In 1848, more than a century after Leeuwenhoek demonstrated the existence of microscopic creatures, no one (best I can tell) connected "germs" and sanitation with infectious illnesses.  Washing hands was little more than a cultural or religious ritual.  That year a Hungarian physician, Ignaz Semmelweis, was working at a hospital in Vienna and was troubled because the women whose babies were delivered by doctors and medical students developed fevers and died four times as often as the women whose babies were delivered by midwives.47

Semmelweis investigated and learned that the medical students who delivered babies came from the dissecting room to the maternity ward without washing their hands.  He introduced hand washing, and the death rate plummeted.  Unfortunately his fellow physicians continued to believe that the high rate of childbed fever was due to "miasmas," clouds of invisible matter, and Semmelweis lost his job.  The son of a prosperous grocer, he returned to his hometown, Budapest, and in 1861 published a book on childbed fever.  In his late 40s, overcome by paranoia and dementia, he was committed to a psychiatric institution.  It took a generation before his teachings were widely accepted.11

1848 was also the year that Louis Pasteur, the Frenchman who would connect germs and disease, graduated and became a chemistry researcher.  An average student as a youth, Louis loved to draw and paint.  As a teenager he "won first prize in physics," and he went on to study chemistry and physics at the prestigious Ecole Normale.

Pasteur was 26 when he married 23 year old Marie Laurent. “According to legend he spent the morning of his wedding day in the lab and became so wrapped up in what he was doing that he had to be reminded to go to church.23”

At 32 Pasteur became a professor of chemistry at the university in Lille, a market city near the Belgian border whose streets were paved with stones and whose skies were often grey and rainy.  In Lille, and three years later in Paris, Pasteur showed his fellow scientists that living organisms called bacteria caused fermentation.  We call it the germ theory.  In 1863, working for the French emperor, Napoleon III, Pasteur learned that the contamination of wine could be prevented by heating fermented grape juice to 50–60 °C (122–140 °F).  The process of eradicating harmful bacteria at temperatures well below boiling bears the scientist's name: pasteurization.
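As a quick check of the temperatures above, the Celsius-to-Fahrenheit conversion can be sketched in a few lines (a minimal illustration; the function name is mine):

```python
# Convert the pasteurization temperatures from Celsius to Fahrenheit.
def c_to_f(celsius):
    """Fahrenheit = Celsius * 9/5 + 32."""
    return celsius * 9 / 5 + 32

# Pasteur heated fermented grape juice to roughly 50-60 degrees C.
low, high = c_to_f(50), c_to_f(60)
print(f"{low:.0f}-{high:.0f} degrees F")  # 122-140 degrees F
```

Both figures sit comfortably below water's boiling point of 100 °C (212 °F), which is the whole point of the technique: the heat kills the spoilage organisms without boiling the wine.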

In 1879 he and his assistants discovered that a culture of bacteria that had been sitting around for a month lost most of its virulence.  When injected into a chicken it created a mild infection, and the chicken was subsequently resistant to illness caused by the bug.  In the following years his group learned how to weaken a pathogen to the point where it wasn't harmful but still triggered an immune response.  Pasteur exploited the phenomenon and developed vaccines for chicken cholera and anthrax.

In 1885 a rabid dog bit a 9-year-old French boy.  After it enters the body, the rabies virus infects an axon, one of the "long slender projections of nerve cells that conduct electrical impulses."  It then travels up the axon to the brain, and the infection is uniformly lethal.

The oft-repeated story says the boy was bitten 15 times and his mother was knocking at Pasteur's door two days later.  Prior to this incident Pasteur had, for some time, been injecting the agent that caused rabies into a series of animals.  When the first infected animal died, dried extracts of its spinal cord were harvested and injected into a second, and later a third, animal.  With each passage the agent became less virulent.  Using this technique Pasteur injected the boy with a series of 14 increasingly virulent extracts of dried, homogenized rabbit spinal cord.  The boy survived, and Pasteur's fame grew.  For decades thereafter doctors used similar extracts to treat people who had been bitten by a rabid creature.

          The rabies virus is still responsible for the deaths of 59,000 humans a year.  Ninety percent of the cases in Africa and Asia are caused by dogs.  In this country we worry about bats and wild animals, but the U.S. has fewer than five confirmed cases a year.  In his later years Pasteur had a series of strokes, and he died when he was in his 70s.1

Born 21 years after Pasteur, the other “father” of the germ theory, Robert Koch, seemed destined for greatness since his childhood.  He was a star student, and it is said he was reading newspapers at the age of 5.  In Germany he simultaneously ran a medical practice, was a district medical officer, and spent hours peering into a microscope. 

As a district medical officer he investigated a pasture where the cows that ate the grass got sick and died.  He found rod-shaped microscopic creatures in the soil and in the blood of a sheep that grazed in the pasture.  When the sheep died he collected some of the dead animal's blood and injected it into a mouse.  The rodent promptly died, and its blood killed a second mouse.  Koch was later able to grow the lethal bacilli in a rabbit's eye.  He dried the rod-shaped bacilli and they appeared to be innocuous, but they weren't.  They still caused a lethal disease.  In the current century anthrax spores have been used as agents of bioterrorism.

Once doctors had good microscopes they learned that when they dried and dyed objects before viewing them, it was easier to identify bacteria.  Koch colored tissue infected with human and bovine (cow) tuberculosis with his special stains and was able to identify the bacillus that caused the disease.  A plodding worker and a careful seeker of facts, Koch dazzled a group of colleagues one Friday evening in 1882 by proving that the tubercle bacillus was transmissible and that it was the cause of TB in man.  Later he isolated a glycerine extract of the bacillus that, when injected into the skin of a person with active disease, caused chills, fever, and an aggressive skin reaction.  Koch believed the byproduct that we now call tuberculin would "slow or halt" the deadly disease.  He announced his "cure" in a Berlin auditorium that seated 8,000.  Some of the rich and famous sought the treatment.  The mystery writer Conan Doyle noted "a pile of letters four feet wide and two feet deep on the floor of Koch's office.  All contained pleas for the miracle cure."

When his "cure" failed and he faced public scorn, he wrote his young lover: "as long as you love me I cannot be beaten down."56  At age 47 he left his wife and married the 18-year-old woman who was fascinated with his studies.  (In the German television drama Charite, Koch's lover, Hedwig, is a "vivacious debutante actress" who comes on to the shy scientist, and he falls in love with her.) https://www.sciencedirect.com/science/article/pii/S1201971210023143#bib2

“When Koch inoculated himself with tuberculin, she volunteered to be injected too.” 

Before 1870 Germany was a group of kingdoms that had been humiliated by Napoleon.  In 1870–71 the German states invaded and defeated the once proud French, and Germany became a nation.  A few years later the two "fathers" of the germ theory, Koch and Pasteur, were introduced to one another in London.  Koch was 47 years old; Pasteur was 68 and partially paralyzed.  The meeting was cordial, tense, and controversial.  Later, in part due to a mistranslation of something one of them said or wrote, each started criticizing the work of the other.24

In 1848, the year Pasteur was a novice chemist, Joseph Lister, the father of modern antisepsis, began his medical studies.  A humble Englishman with an athletic build, he went to Scotland and became the surgical apprentice of James Syme, "the greatest surgical teacher of the day."  Lister later married Syme's eldest daughter and adopted her religion: born a Quaker, he became a Scottish Episcopalian.  A few years after he completed his training, he was the surgeon for the Glasgow infirmary, and he noticed that half the people who had a limb amputated became septic and died.  He'd been reading about Pasteur and germs, and he started treating raw wounds with carbolic acid, a foul-smelling antiseptic that was used to clean sewers.  The infection rate dropped to 15 percent, and the Scots were impressed.  They started cleaning and sterilizing the tools they used at the time of surgery.  English doctors weren't convinced until Lister went to London and operated on a fractured kneecap.  He wired the bone together, closed the incision, and the wound didn't become infected.29  At age 56 Lister was made a baronet.20

Florence Nightingale took our awareness of cleanliness up a notch.  Born in Florence, Italy (hence her name), she was the rebellious daughter of wealthy Brits who didn't want her to become a nurse.  During the disastrous Crimean War between Britain and Russia (1853–56), she worked at a small hospital in London.  A world away, in Turkey, a muckraking reporter visiting the front lines stopped at a British military hospital.  He found the conditions "appalling," which no doubt meant poor sanitation, gaping wounds, and bad smells.  His newspaper articles detailed what he saw, and his fellow countrymen were incensed.  Then a high official made it possible for Florence to get involved, and she and 38 other nurses sailed to Turkey.

At the military hospital in Scutari, sanitation was "neglected and infections were rampant."  There was no clean linen.  The clothes of the soldiers were swarming with bugs, lice, and fleas.  The floors, walls, and ceilings were filthy, and rats were hiding under the beds.  There was no soap, and no towels or basins, and there were only 14 baths for approximately 2,000 soldiers.  Nightingale purchased towels and provided clean shirts and plenty of soap.  She brought food from England, scoured the kitchens, and set her nurses to cleaning up the hospital wards.  A sanitary commission, set up by the British government, arrived to flush out the sewers.  She may not have had the drugs, blood, or modern-day "tools" that can turn an illness around, but she showed that diseased bodies have a remarkable ability to mend themselves.  As Florence saw it, sufferings were the result of too little "fresh air, light, warmth, quiet, or cleanliness."1

In the early part of the 19th century objects seen through the microscopes of the day were sometimes blurry or distorted.  A century earlier Leeuwenhoek had learned how to magnify objects 250-fold with a single tiny lens.  He produced 500 devices but didn't teach others how to make them, and no one knew how to make "his kind" of lens.  Robert Hooke, his British contemporary, used a different type of magnifying device when he studied plants.  Looking through a pair of aligned lenses, his ability to see tiny objects depended on the quality of the glass.  A jack of many trades, Hooke was also a surveyor and, as an architect, helped design the buildings that replaced those destroyed by the Great Fire of London in 1666.

The quality of the microscopes produced during the early 1800s was variable.  Then two German mechanics independently started to manufacture really good lenses and microscopes.  One of them, Carl Zeiss, came from a German family of artisans and apprenticed with a maker of fine tools.  In 1846 he opened a mechanical workshop in Jena, a river valley town in the "green heart" region of eastern Germany.  His microscopes were simple.  Lenses were the result of trial and error, and poor-quality scopes were destroyed.  Twenty years later a mathematics professor, Ernst Abbe, joined the company.  He introduced mathematical modeling of lens design, and the company started regularly producing microscopes that allowed observers to see tiny objects clearly.  A number of parasites that had not previously been visible were identified.  After Zeiss died in 1888, Abbe became the head of the company and, ahead of his time, introduced an 8-hour work day, pensions, and holiday and sick pay.

In 1880 Charles Laveran was gazing through a powerful microscope when he saw pigment and motionless bodies in the red blood cells of some people with malaria.  A French military doctor, Laveran was the son of a physician and spent part of his childhood in Algeria.  During the 1870 Franco-Prussian War he was in charge of a number of ambulances and was trapped, along with his fellow French troops, in the besieged fortress of Metz.  At age 40 he married, and he was stationed in Bône, Algeria, a colonial city on the Mediterranean coast, when he saw the red cell abnormalities.  Realizing what he saw might be significant, he "meticulously examined the blood of 200 patients and observed crescent bodies in all cases of malaria" but never in the blood of people who didn't have malaria.  Laveran presented his findings to his fellow doctors, and many thought he was seeing disintegrating red blood cells.  By 1884 Italian researchers looking through advanced microscopes were also able to "observe amoeboid movement of the organisms," and Laveran's findings were validated.

The parasite that causes malaria lives in the liver and attacks oxygen-carrying red blood cells.  Historically a major cause of death and disability, the disease has been around for hundreds of years.  In 1898 the Italian Giovanni Grassi proved that "mosquitoes that fed on infected patients transmitted the parasite to uninfected individuals."  About the same time Ronald Ross performed similar studies in India.  A British physician who preferred to spend his time writing novels and poetry, Ross didn't initially believe the mosquito theory, but he kept receiving bothersome letters promoting the hypothesis from Patrick Manson, a Scottish physician he had met in London.  When Ross finally did the research he at first failed because he used the wrong mosquito.  Eventually Ross proved that mosquitoes carry the infection from one person to another, and he received the Nobel Prize.  The Italians who made a similar finding weren't honored.  And there were doubters.  To convince nonbelievers Manson shipped a number of infected mosquitoes from Rome to the middle of London.  The trip by ship took 48 hours, and some of the mosquitoes died en route.  When they arrived, Manson allowed one of the mosquitoes to infect his son.  The young man got sick but was cured with quinine.

Malaria remains a huge worldwide problem.  In 2019, 228 million people were infected with the parasite.  Some 400,000 of them died, two-thirds of them children.  That year there were only 2,000 cases in the U.S.  Most occurred in immigrants or people returning from an at-risk area.

Over the last century and a half we’ve learned a lot about the illness.  Many generations of drugs have been developed.  But the disease is mainly absent from countries that are able to control the vectors that spread it.   

When the French tried to build a canal between the Atlantic and Pacific Oceans at Panama in the late 1800s, no one knew that malaria and another tropical disease, Yellow Fever, were transmitted from one person to another by mosquitoes.  A few years later an American military doctor had a theory and performed a study.

The research was conducted in 1900 in Cuba. At the time the island was occupied by U.S. soldiers.  America had won a war with Spain in 1898, and had freed the island from the European colonizer.  Cuba was scheduled to become independent in 1902.   In 1898 there was an outbreak of Yellow Fever among American soldiers who were stationed on the island.

          Doctors currently believe the virus that causes the infection originally came from the rain forests of Africa, was around for centuries, and was brought to the New World by slave traders. Infected people develop headaches, fever, and bleed easily.  Many develop shock and die.  The doctors who cared for the sick soldiers in Cuba apparently thought the illness was caused by mysterious “fomites.” 

The U.S. government sent Major Walter Reed to the island of rum and cigars to investigate.  A military physician who had spent 17 years in the arid American West, Reed suspected Yellow Fever was transmitted by mosquitos.  He met with the doctors caring for the ill, and they were willing to check out his theory.  Volunteers, some of whom were ill and others who were well, were housed in a barracks. Mosquitos were collected and the insects were allowed to use the tip of their straw-like mouth to pierce each person’s skin.

One of the apparently skeptical doctors conducting the investigation "submitted to the bite of a creature that had fed on an infected soldier."  He joked, "if there is anything to the mosquito theory, I should have had a good dose."  A week later his life hung in the balance for three days.

A fellow doctor who brought the mosquitoes from one soldier to another noticed a bug on his hand.  He allowed it to suck his blood.  A week later he became febrile, delirious, started vomiting, had seizures and died.  As one of the infected who survived put it: “such is yellow fever.” 

We now know the illness is caused by a virus and there is an effective vaccine.  But Yellow Fever still annually causes 200,000 infections and 30,000 deaths in central Africa and South America. 

As mentioned earlier, the role of mosquitoes was unknown when the French decided to build a canal at Panama.  They were encouraged when the Suez Canal opened in 1869, and they believed an ocean-to-ocean passageway in Panama wouldn't be very expensive or difficult.  They raised money from over 200,000 investors.  Twenty thousand workers from the West Indies were recruited, and a Frenchman named Jules Dingler was put in charge.  When he went to Panama, Dingler brought his family.  He is credited with having said that "only drunkards and the dissipated contract yellow fever and die."  During his early years on the job Dingler's son, daughter, and wife each contracted the infection.  None of them survived.  Dingler returned to France a man broken in mind and body.  Unlike Suez, the Panama terrain turned out to be challenging.  Between 1881 and 1889 over 20,000 laborers who toiled in the thick rain forests died.  In 1889 the company funding the project declared bankruptcy, and the French abandoned the project.42

In 1904, when the U.S. started building the Panama Canal, the people in charge knew that mosquitoes transmitted malaria and Yellow Fever.  Engineers drained pools of water, cut the brush and grass near villages, and constructed drainage ditches.  Larvae were oiled and killed with an insecticide.  Screens were placed on windows and doors, and "collectors were hired to gather the adult mosquitoes that remained in the houses and tents during the daytime."

It took ten years to build the canal that linked the two oceans.

Cholera has been a problem in the Western world for only a little over 200 years.  It first came to our attention in 1817, when an epidemic started in the Ganges Delta of India.  It was spread by ships trading goods in Thailand, China, and Japan.  Millions got sick.

Three decades later, when the third worldwide cholera pandemic struck, there were outbreaks in Europe and the Americas.  The epidemic of 1854 started in the Soho district of London.  Cities in Western Europe and the U.S. didn't have running water until sometime in the 1800s.  Initially there was only one tap per neighborhood or large apartment building.22  Most people got their water from communal pumps.  Sewage and untreated human waste were dumped in the Thames, and the river smelled.

John Snow, a physician living in London, suspected the source of the infection was contaminated water.  A vegetarian and teetotaler who liked to swim, Snow mapped the locations of the people who got sick and convinced town officials that the Broad Street pump was the source of the epidemic.  They removed the pump handle, and the epidemic "almost immediately" trickled to a stop.  Snow, not surprisingly, was subsequently disbelieved and denounced.

Some claim that the chlorination of our drinking water may be "the most significant public health advance of the millennium."  The practice was prompted by the realization, in the late 1800s, that bacteria were the cause of waterborne infections like typhoid fever, dysentery, and cholera.  About the same time scientists learned that the element chlorine kills the bugs.  When a municipal sand water filter failed in an English city in 1905, there was a typhoid outbreak, and the city started adding chlorine to its water.  Three years later Jersey City, New Jersey began chlorinating city water.  Chlorine is currently added to "over 98 percent of all U.S. municipal water," and outbreaks of typhoid fever have been virtually eliminated.  (Chlorine does not kill parasites like Giardia and Cryptosporidium.)

In the last century we've learned a lot about diarrheal diseases and their treatment.  But the slums of many third world cities don't have good plumbing.  Much of the world's fresh water is contaminated.  For half of the world's people, governments haven't been able to solve the problem.44  One in nine childhood deaths is the result of diarrhea, and the culprits responsible include bacteria like E. coli, salmonella, shigella, and cholera.

When I was a freshman med student, a biochemistry professor who looked like the movie star Robert Wagner was trying to prove that water didn't just passively flow into our bodies through the wall of the small intestine.  Salt and water are sometimes pushed in by a glucose-fueled biochemical pump.63

I took notes when he lectured and I passed the exam, but I didn't know why it mattered.  Why would a person spend years of his life learning how salt and water move across a membrane?

          I later learned that when cholera bacteria infect the small bowel, they cause a large amount of liquid to leak into the intestines, and those infected develop watery diarrhea.  At the same time their small bowels don't absorb much fluid, and many die of dehydration.

Our lecturer, Robert Crane, showed that if a teaspoon of salt and two tablespoons of sugar are added to a liter of clean water, a biochemical "force" drives the liquid into the body and the person can be rehydrated.  His simple formula has saved more lives than most of the $100,000-a-year drugs being advertised on television.  Somebody bottled it and called it Pedialyte.
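Out of curiosity, the rough chemistry of that recipe can be sketched.  The gram weights below are my own approximations (a level teaspoon of table salt is about 5.7 g; two tablespoons of sugar about 25 g), not figures from Crane's work:

```python
# Back-of-the-envelope concentrations for the home rehydration recipe:
# the salt supplies the sodium, the sugar supplies the glucose that
# fuels the cotransport pump Crane described.
NACL_GRAMS = 5.7       # ~1 level teaspoon of table salt (assumed)
SUCROSE_GRAMS = 25.0   # ~2 tablespoons of sugar (assumed)
WATER_LITERS = 1.0

NACL_MOLAR_MASS = 58.44     # g/mol
SUCROSE_MOLAR_MASS = 342.3  # g/mol

sodium_mmol_per_l = NACL_GRAMS / NACL_MOLAR_MASS * 1000 / WATER_LITERS
sucrose_mmol_per_l = SUCROSE_GRAMS / SUCROSE_MOLAR_MASS * 1000 / WATER_LITERS

print(f"sodium:  ~{sodium_mmol_per_l:.0f} mmol/L")
print(f"sucrose: ~{sucrose_mmol_per_l:.0f} mmol/L")
```

The numbers land in the same neighborhood as commercial oral rehydration salts, whose sodium content is on the order of 75–90 mmol/L; the point is simply that kitchen measures get close enough to drive salt and water across the gut wall.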

Self-described as a problem solver, Crane was born in New Jersey and didn't develop an interest in science until he was a freshman in college.  During the Second World War he was a deck officer on a navy destroyer that "took a bomb" at the battle of Leyte Gulf and was later part of the screen of ships that carried Douglas MacArthur back to the Philippines.  Crane got his PhD in chemistry at Harvard, married a biochemist, and spent more than a decade performing research in St. Louis.  In 1978 he was awarded the Nobel Prize.

In 2010 a massive earthquake devastated Haiti, which shares the Caribbean island of Hispaniola with the Dominican Republic.  At the time there hadn't been a case of cholera in Haiti for more than a hundred years.  The U.N. brought in peacekeepers from Nepal, and they lived among the people.  70 percent of Haitian households had either rudimentary toilets or none at all.  A few months after the outsiders arrived, the water people drank was spreading the strain of cholera that infects people in South Asia.  665,000 Haitians were infected and over 8,000 died.  Today a non-profit called SOIL is supplying composting toilets and trying to help solve the island's waste problem.62

Drinking water isn’t our only exposure; we’re sometimes affected by the water that flows from our cattle feedlots.  E. coli was recently found in romaine lettuce grown downstream from an area where cattle were confined.  In 1987 sewage from farm animals infected with the parasite cryptosporidium contaminated the water supply of a Georgia town, and 13,000 people developed diarrhea.43

Hormones are chemical messengers. They are created in an “endocrine gland” then travel by blood and other mechanisms to parts of the body where they do their thing.  If a man is given estrogens he grows breasts.  If a woman is given male hormones she grows a beard. 

One of the endocrine glands, the thyroid, is located in the middle of the neck, just below the voice box and above the chest.  The story of the gland and the hormones it produces is both old and new.  It’s complex enough to deserve a chapter or a book, and that’s not what I’m currently writing.

Mankind has dealt with the thyroid gland for thousands of years, in large part because when it gets big it’s hard to ignore.  In parts of the world, benign growths called goiters once commonly protruded from the front of many necks.  We now know that the condition is caused by a deficiency of iodide.  Thousands of years ago Chinese doctors knew that seaweed helped.

Goiters are now gone from much of the world because most countries require salt to contain iodide.  In the U.S., the land of freedom of speech and religion, our salt is often free of it.  Fortification—putting iodide into salt—is voluntary and often not done.  Companies don’t have to mention the iodide content of a product on the label, and processed foods commonly use non-iodized salt.64,65

In the absence of enough iodide, newborns don’t make enough thyroid hormone, their growth can be stunted, and they can develop physical deformities and neurologic impairment.  The condition is called cretinism and it was recognized over the years by a number of physicians.

Once we had adequate anesthesia, a few surgeons started removing thyroid glands that were too big or were cancerous.  The operation was bloody, we didn’t yet know how to transfuse blood, and before Theodor Kocher started performing the surgery many died.

One of the brightest students at his Swiss university, Kocher was not, best I can tell, formally trained as a surgeon.  One of the talented and intuitive few, he apparently developed his skills by observing a few British surgeons and by trial and error.  Born, raised, and a lifelong resident of Bern, a centuries-old Swiss city encircled on three sides by the Aare River, he was once described as “a slight, rather cadaverous little man” whose “close cropped beard and very large, prominent upper teeth made his smile rather ghostly.”  Aside from work he had few interests.  His vacations were mostly trips to medical meetings.  Deeply religious and adamant about sterility, he told students whose patients had developed an infection to stand, beat their breasts, and say “I sinned.”

He learned that the thyroid was a vital organ the hard way.  In 1882 a physician in Geneva reported on the changes that had occurred in 10 patients whose thyroids were totally removed a few years earlier.  Kocher heard the paper and asked a few people he had operated on to come in for an evaluation.  One woman in particular bothered him.  She “had changed from a pretty young girl to a small woman who was overweight, and slow of intellect and speech. She had lost hair and had a thick tongue.”  Realizing that the lack of thyroid was the cause of her changes, Kocher vowed “never again to remove the entire gland.”

He was a colonel in the Swiss militia and spent some of his non-medical time trying to get the makers of weapons to create missiles that didn’t deform and were “intentionally less lethal.”

His operations were meticulous.  Large goiters and cancer of the thyroid were removed slowly and with little or no blood loss.  Surgery took a long time and the slow pace irritated many of the spectators who gathered for the show. 

By 1912 Kocher had removed approximately 5,000 thyroid glands, and the mortality rate in his hands was less than one percent.  For this accomplishment, and for his studies of the gland and how its hormones function, he had received the Nobel Prize in 1909.

I don’t know when or if Kocher started treating people with animal extract after surgery.  The books give the credit to George R. Murray, a 26-year-old physician who didn’t have a hospital or medical school appointment.  In 1891 he obtained fresh sheep’s thyroid from a slaughterhouse, prepared an extract, and injected it under the skin of a woman who appeared markedly deficient in thyroid hormone.  When she improved he published his experience.  Though others had probably used similar treatments for decades or centuries, Murray was the first rooster to crow, and the books claim he was the first doctor in recent centuries to extract and administer the hormone.60,61


By the time I became a physician, narcotics had long been used for pain control and had an age-old, disreputable history.  Mentioned more than two thousand years ago by Aristotle’s buddy Theophrastus, the extract of the poppy plant was long known as a drug that relieved pain, affected moods, and created craving and addiction.

          In the 1700s the British developed a love for tea grown in China and the UK was shelling out a lot of silver for the leaves.  Needing a commodity the Chinese would buy, the Brits grew poppies in India and peddled its extract, opium, in China.  It sold well.  Many got addicted and silver started flowing back to England.   By the 1800s Chinese leaders, troubled by the way the drug affected the people and upset because so much silver was leaving the country, started the world’s first war on drugs.  They forbade the British from trafficking opium.  The Brits responded by sending 16 warships.  They “arrived at Guangzhou, bombarded forts, fought battles,” and won the first opium war.  As spoils of their victory, the English gained control of the island of Hong Kong and access to 5 Chinese ports.

In the early 1800s, at a time when purified plant extracts were used as medicines, Wilhelm Sertürner was an apprentice apothecary in Paderborn, a town on Germany’s shortest river that was founded by Charlemagne in the eighth century.  After separating “morphine crystals from tarry poppy seed juice,” Sertürner learned the crystals put stray dogs and rats to sleep.  Taking “a small quantity” for a toothache, he “experienced tremendous relief.”  A low dose made three volunteers “happy and light headed.”  A higher dose caused “confusion.”58  Over time Sertürner became an addict.  He was once described as a person with “aggravated hypochondriacal alterations in his frame of mind and quiet disturbances of mood.”59

In 1844 Francis Rynd, a Dublin physician who hunted foxes and was “in much demand at fashionable dinner parties,” invented the hollow metal needle.  A decade later a physician in Edinburgh and another in France independently invented the syringe.  In the process they helped people in pain and created an instrument that would lead to countless infections and the addiction of many.

In 1897 two traveling salesmen from North Carolina met at a train station in Texas and learned they had the same birthday.  A decade later, now entrepreneurs, Becton and Dickinson started the first U.S. facility that produced hypodermic needles and syringes.

Chapter 2 Anesthesia and transfusions

In 1900, 76 million people lived in the U.S., 200,000 miles of railroad tracks criss-crossed the continent, and people traveled in horse-drawn carriages and wagons on narrow dirt and gravel roads.  Trains dominated commerce, the modern internal combustion engine was 15 years old, and about 1,575 electric and 900 gasoline-powered vehicles were produced each year.

Electricity was new and scarce.  Thomas Edison’s incandescent light bulb was 20 years old, and one American city, Cleveland, had started using electric lamps to illuminate some of its streets.

Sixty-seven years had passed since drinking water was first pumped into the White House from a nearby reservoir; Chicago’s “comprehensive sewer system” had been up and running for 15 years, and a revolutionary toilet made by Thomas Crapper was 9 years old.8

House calls were common.  People who had suffered heart attacks, strokes, or major trauma were often cared for at home.  Adequate anesthesia had made elective surgery painless.  But in the decades that preceded blood banking and antibiotics, cutting a body open was risky.

My wife had two uncles who died in the 1930s of strep throat, an infection that’s now rapidly cured with a few doses of an antibiotic.  A family member who had a severe asthma attack and was getting exhausted responded to medications and didn’t need to be intubated and placed on a breathing machine.  Another, who was stung by a bee and developed anaphylactic shock, was treated aggressively and was well enough to go home the next day.  Either relative could have died a century ago.

The 20th and 21st centuries were filled with transformative medical advances and I chose a few that I think were game changers. 

  1.  Anesthesia, the first “game changer” got its start in 1846.  A monument in the Boston Public Garden commemorates the day that William Morton proved to the world that “the inhalation of ether causes insensibility to pain.”
  2.  The second:  Early in the 20th century we learned how to collect, store, and “safely” transfuse blood.

ANESTHESIA

Morton’s tombstone, in Mount Auburn Cemetery outside Boston, honors the “Inventor and Revealer of Inhalation Anesthesia: Before Whom, in All Time, Surgery was Agony; By Whom Pain in Surgery was Averted and Annulled; Since Whom, Science has Control of Pain.”

The demonstration of ether’s effect occurred in a Boston hospital in October 1846.4  One of the nation’s most active, Boston’s Mass General Hospital was at the time hosting up to two surgeries a week.  They were performed in an operating room that was, in essence, a stage surrounded by a steep amphitheater.  People filled the seats and watched.  One October day the doctor in charge, Dr. Warren, told the onlookers: “There is a gentleman who claims his inhalation will make a person insensitive to pain.  I decided to permit him to perform his experiment.”

The dentist who administered the anesthetic, Dr. Morton, was described as strikingly attractive and “alternately optimistic and pessimistic.”  He arrived 25 minutes late, took out his narrow-necked flask, and filled its bottom with two liquids: sulfuric ether and oil of orange.  The second chemical was supposed to mask the ether odor.

The man who was about to undergo surgery inhaled the “gas” through a mouthpiece, and in 3 to 4 minutes he became “insensible and fell into a deep sleep.”  He had a mass in his neck and the doctor quickly cut it out.  When the operation ended Dr. Warren, the man’s surgeon, spoke to the rapt onlookers: “Gentlemen, this is no humbug.”  People cheered, and the public took notice.26

During the Civil War battle of Fredericksburg, Morton decided to help.  When a wounded soldier was about to undergo a limb amputation, Morton “prepared the man for the knife, producing perfect anesthesia in an average time of three minutes.”9

Once operations could be performed without pain, surgeons started performing them in numbers, and hospitals were built or expanded.  Elective surgery became the cash cow that supported one institution after another.

In July 1868 William Morton was agitated because another doctor was trying to claim credit for his invention.  He was in New York, the city was in the midst of a grueling heat wave, and Morton took his wife on a wild carriage ride through Central Park.  Then he abruptly stopped the buggy, got out, and died. He was 48 years old.5

During the next century doctors and dentists used diethyl ether, and later chloroform, to put people to sleep.  Over the decades the drugs changed, but the overall effect has largely remained the same.  One of the current anesthetics of choice is propofol, the drug that killed Michael Jackson.  Administered as an intravenous drip, it starts and stops working rapidly and has a “low incidence of side effects like postoperative nausea and vomiting and cognitive impairment.”1  Operations are also performed using spinal anesthesia, a nerve block, or a local infusion of lidocaine.

More than a hundred years after Morton made his presentation, I was asked to see a sick patient with jaundice.  At the time I was the assistant chief medical resident at the San Francisco VA hospital, a collection of buildings on the edge of the Pacific that were usually blanketed in fog and cooled by an ocean breeze.  The man I examined had yellow eyes, was weak, had no appetite, and was lying in his bed.  He had recently had a hair transplant and had otherwise been well.  He’d been put to sleep with halothane, the anesthetic of the day, plugs of his hair were harvested from the back of his head, and they were planted up front.

The patient told me this had been his second transplant.  He had turned yellow the first time and thought he knew what was happening.  He had seen fellow servicemen come down with hepatitis when he was stationed in the South Pacific.  He decided not to tell his doctor because he was afraid the physician wouldn’t perform the second set of hair transfers.

This was a few years before liver transplants were being done, so people with failing livers could not be rescued.  The patient’s condition got worse, his abdomen filled with fluid, he sank into a coma, and he died, all because of a hair transplant.

Most anesthesiologists back then didn’t believe “so-called” halothane hepatitis was a real entity and wouldn’t accept that the anesthetic they used could cause the problem.  They had “never seen” a case of liver failure that they couldn’t pin on one of their patient’s underlying medical conditions.  Halothane was a smooth, well-tolerated anesthetic.  I was a budding gastroenterologist and I knew they were wrong.  It turns out that halothane causes liver failure and death in one of every 35,000 patients.

Anesthesiologists needed proof, and they got it in 1969 after an anesthesiologist visited the Yale hepatologist Gerald Klatskin.  The visitor said he turned yellow every time he administered halothane to a patient.  Klatskin decided to test the theory.  He biopsied the man’s liver; it was normal.  Then the anesthesiologist inhaled halothane, and his eyes and skin turned yellow.  A second biopsy showed an injured liver.  Klatskin published a report of the case, and U.S. anesthesiologists stopped administering halothane.  It is still “widely used in developing countries.”3

Fast forward 40 years and I’m interviewing the chief of anesthesia at a local hospital.  He’s telling me that nowadays general anesthesia is safer than crossing the street.  During the 50 years when the rest of medicine was inventing new operations and trying to cure more diseases, the top anesthesia thinkers were obsessed with safety.

They had long since learned how to put a person into a state where the patient heard and saw nothing, was impervious to pain, and had muscles that were totally relaxed.  When aroused, some people had painful wounds, sensitive areas, inactive bowels, and body parts that didn’t function normally.  Grogginess could last a while.  But the recipients of general anesthesia had no memory of the trauma their body had endured.

The anesthesiologist whose insights I’m channeling credits the emphasis on safety to the skyrocketing cost of malpractice insurance.  It became the focus of a number of the physicians who “passed gas” for a living in the 1980s.  I’m sure doctors in the field thought their care was excellent and wondered why they were being singled out.  But the numbers said it all.  In 1974 three percent of all American doctors who bought malpractice insurance were anesthesiologists, and these very doctors were responsible for 10 percent of all malpractice payouts.  Outsiders concluded that the care they provided was “below the standard.”

Malpractice is not a good way to judge medical quality.  Doctors are sued when something major goes wrong and when the responsible physician is arrogant or seems to be hiding something.  It’s also easier to sue someone you have never consciously spoken to or interacted with, someone who has never become a real person with feelings and regrets.

Nonetheless rates were rising and something had to be done.  The anesthesia societies embarked on something they called the “closed claim project.”  They reviewed malpractice suits that had run their course, that had been litigated, settled or just dropped by the plaintiff.  Discovering what went wrong did not create a legal or other risk for the involved doctor.

Data for events prior to 1990 revealed that in a third of the cases, the person whose family sued had died or suffered brain damage.  In 45 percent of these people the harm was caused by a “respiratory event.”  When anesthesiologists induce coma they become responsible for the movement of air into and out of the lungs.  They slide a tube through the mouth and pharynx, between the vocal cords, and into the trachea.  Then they aerate the lungs and the body.  In 7 percent of the respiratory cases the anesthesiologist mistakenly slipped the breathing tube into the esophagus, the tube that transports food and drink to the intestinal system.  Its opening sits behind the voice box at the lower end of the pharynx.  A sphincter at its top end keeps air from entering the gut and helps prevent regurgitation of esophageal contents.

In 12 percent intubation was difficult, and the body was deprived of air for a period of time.  In another 7 percent the doctor got the tube in the right place but didn’t ventilate the lungs adequately.

Twenty-five percent of the lawsuits were the result of cardiovascular events: heart arrhythmias, a drop in blood pressure, and heart attacks.

Nerve damage due to poor positioning and compression of nerves caused 21 percent of the problems.   

Anesthesiologists sometimes instill Novocain or alcohol into nerves in an attempt to mitigate chronic pain.  If they injected a person who was taking blood thinners they sometimes precipitated bleeding; damage caused by the leaking blood prompted some of the legal action. 

Six percent of the cases were prompted by burns caused by electrical cautery or by IV bags of fluid that were overly warmed.   

There were people whose blood pressure dropped and they lost vision, individuals whose airways had been damaged during a difficult intubation, and a few who had back pain, emotional distress, or eye injuries. (Anesthesiologists work close to the eyes.)

Seventy-nine percent of the problems were attributed to lack of vigilance.  The specialty has an old saying: putting someone to sleep starts with seconds of panic (intubation) and is followed by hours of boredom.

 After the anesthesiologists learned what they were doing wrong they disseminated their findings, made recommendations, and general anesthesia became safer.

Anesthesiologists now have tools that make it possible to intubate almost everyone.  Small flexible instruments containing long fiberoptic bundles allow the anesthesiologist to see into dark corners.  Some scopes have chips on their tips and send images to a TV screen.  Anesthesiologists and anesthetists confirm the endotracheal tube is in the right place with a bedside ultrasound examination and by measuring and monitoring the carbon dioxide level of the air that exits the lungs.  If the level gets too high, ventilation may be inadequate.  Because the blood of anesthetized people is enriched with oxygen, a rising carbon dioxide concentration detects air movement problems sooner than a falling oxygen level does.  Complex machines that ventilate the patient regulate and monitor the movement of the gases.  The sophisticated gear has valves and gauges that are routinely checked, and bells ring and beeps sound when something is amiss.

In addition to physicians, 43,000 nurses administer much of the anesthesia in this country.  These nurses are educated, trained, licensed, and competent. In all but 15 states they are required to “work under a physician’s supervision”.

TRANSFUSIONS

Blood, the liquid that carries nutrition and oxygen to every corner of our body, is a mixture of cells and protein-rich fluid.  Most of the cells are “erythrocytes,” or red cells.  As they flow through the arteries of the lung, the tiny discs discard carbon dioxide and acquire oxygen.  When they are propelled through the rest of the body, the cells deliver oxygen and collect carbon dioxide.

We didn’t really start to understand the value and danger of transfusing the red solution until 1900, when a Viennese physician and researcher, Karl Landsteiner, separated blood into its two components, cells and serum, and observed that “many but not all normal sera will agglutinate the red cells of other normal persons.”

Landsteiner separated human beings into groups.  People in the same group could safely give blood to one another.8

A physician researcher, Landsteiner was six when his father died.  Raised by his mother Fanny, he was “so devoted to the woman that her death mask hung on his wall until he died.”  An esteemed professor, Karl was living in Vienna, the capital of the vast Austro-Hungarian Empire, when the First World War ended.  His country had been on the losing side and the imperial lands were carved into many of the nations of modern-day Europe.  That winter there were shortages and Landsteiner’s laboratory wasn’t heated.  One day “the Viennese poor cut down the trees around his house for firewood and tore away his fences.”  Feeling personally threatened, Landsteiner moved with his wife and children to Holland.  During the next three years he lived and performed experiments in a “little cottage with a rose garden” in the seaside town of Scheveningen.  He was assisted by a man-servant and a nun who was “very devout and frequently quit the lab for prayers or to serve as an organist in the chapel.”  After accepting a position at the Rockefeller Institute, Landsteiner moved his family to New York and lived “on the floor above a butcher shop on a street with trolley cars.”  Avoiding social activities, he spent his days in the lab, and read and thought at night “until the late hour.”  “His energy was continuous and compelling, and no moment of idleness in the lab was tolerable to him.”8  He was living in New York in 1930 when he received the Nobel Prize.6

In the early 1900s, while still in Vienna, Landsteiner identified three blood types: A, B, and O.  They were based on the antigens on the surface of red blood cells.  We now, of course, know that:

  • A person with Type B blood has “B” antigens on the surface of his or her red cells.  Their serum contains antibodies to type A blood, antibodies that cause type A red cells to stick together.
  • A person with Type A blood has “A” antigens on the surface of his or her red cells, and antibodies to type B red cells in their serum.
  • If a type B person is transfused with type A blood, the infused red cells will agglutinate, sticking to one another and forming clumps.
  • If type B cells are transfused into someone with type A blood, the new cells will likewise adhere to one another.
  • People who are transfused with incompatible blood get quite ill and can die.
  • Individuals whose red cells carry both A and B surface antigens (type AB) don’t have serum antibodies to either A or B.  They can safely receive A, B, AB, or O blood.
  • Type O red cells have neither A nor B antigens on their surface.  The serum of individuals with type O blood contains both anti-A and anti-B antibodies, so they can’t safely receive A, B, or AB cells, but their red cells can be instilled into people with any blood type.
  • In 1937 Alexander Wiener added another red cell surface antigen, the Rh (Rhesus) factor, to the equation.
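The rules above reduce to a single test: a transfusion of red cells is safe when the donor’s cells carry no ABO antigen that the recipient’s serum has antibodies against.  Here is a minimal sketch of that logic, ignoring the Rh factor and the many minor antigen systems real blood banks also consider:

```python
# ABO red-cell compatibility as a subset test.  A recipient's serum
# carries antibodies against whichever ABO antigens the recipient's own
# cells lack, so donor cells are safe only if every antigen they carry
# also appears on the recipient's cells.

RED_CELL_ANTIGENS = {"O": set(), "A": {"A"}, "B": {"B"}, "AB": {"A", "B"}}

def abo_compatible(donor: str, recipient: str) -> bool:
    """True if donor red cells won't be agglutinated by recipient serum."""
    return RED_CELL_ANTIGENS[donor] <= RED_CELL_ANTIGENS[recipient]

assert abo_compatible("O", "B")       # type O: universal red-cell donor
assert abo_compatible("A", "AB")      # type AB: universal recipient
assert not abo_compatible("A", "B")   # anti-A antibodies clump the cells
assert not abo_compatible("AB", "O")  # type O serum has anti-A and anti-B
```

The subset operator does all the work: type O donates to everyone because the empty set is a subset of every set, and type AB receives from everyone because every antigen set is a subset of {A, B}.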

During the First World War years (1914-1918) a series of doctors learned that when they added sodium citrate to blood it didn’t clot.  With additional additives it could be stored for 2 weeks. 

The notion that blood circulates and that it can be transfused was “based” on the observations of a Brit named William Harvey.  In the 1600s he cut open a few living fish and snakes and learned (and wrote) that the ancients were wrong.  Blood didn’t come from the liver and slowly ebb through the body; its flow was “propelled by the heart,” and the red stuff traveled through tubes called arteries.  In the 1800s a few doctors used a syringe to remove blood from one person and directly inject it into the vein of another.1  It helped some and harmed others.

The first blood bank was set up by the Russians in 1932. Doctors at Chicago’s Cook County Hospital are given the credit for opening the first American facility. It started to “save and store” donated blood in 1937.  San Francisco’s Irwin Memorial blood bank was established 3 years later.  

When blood mixed with a chemical that prevents clotting is put in a test tube, the cells settle to the bottom and the plasma floats to the top.  For blood loss or significant anemia, packed red cells (erythrocytes) can be transfused.  Each tiny disc lives about 120 days.  The cells in a donated unit range from newly made to nearly expired, so the average red cell in a unit of blood should last about another 60 days.

In the test tube full of blood, immediately above the red cells, there’s a thin layer of white cells and a stratum of platelets, both of which live only a few days.  White cells are an important contributor to our defense against infection, but in transfused blood they can cause adverse reactions.  When chemotherapy suppresses the bone marrow, white cell levels can get quite low.

Platelets are particles that plug holes and help stop bleeding. Some chemotherapy drugs can significantly suppress their blood levels for a number of days.  When the risk of bleeding is high enough platelets are collected from others and infused.   Blood banks have machines that separate and collect platelets, then reinfuse the platelet poor blood back into the donor.  

In 1940, a year and a half before the U.S. became a combatant in the Second World War, London was being bombarded by Nazi planes.  Many in the U.S. wanted to aid the wounded, and an effort to provide the cell-free portion of the blood, the plasma, to the Brits was started in New York.  Called Blood for Britain, it attempted to collect thousands of units of blood, separate the cells from the plasma, and, under sterile conditions, ship the fluid across the Atlantic.  It was a huge undertaking, and the man in charge had only organized a group of people once before: as a young man, he had coordinated the paper routes of ten childhood friends who were delivering 2,000 newspapers a day.

While still a trainee, the doctor chosen to coordinate the effort, Charles Drew, had studied the preservation of blood products.  He wrote a doctoral thesis titled “Banked Blood,” and he knew how to produce plasma that had a two-month shelf life.  Gathering, transporting, and processing thousands of units of blood was a complex undertaking, but Drew pulled it off and was able to send close to 15,000 pints of the precious fluid to the Brits.  A black man, Drew was born in Washington D.C. and was an outstanding high school athlete.  He was Amherst College’s most valuable football player in 1926, went to medical school at McGill University in Canada, and graduated in 1933.  In 1941 he became the director of the first U.S. Red Cross blood bank.  It was a big honor but he didn’t stay very long.  He resigned because the organization labeled each unit of blood with the donor’s race and didn’t give blood from a black donor to a white patient.  He’s credited with saying “No official department of the Federal Government should willfully humiliate its citizens; there is no scientific basis for the practice; and people need the blood.”  Drew returned to Howard University and became the chief surgeon at Freedmen’s Hospital.7

By the time I entered med school (1958), blood drives came to my campus annually, and I had been a donor twice.  The Red Cross proudly boasted that it saved the lives of wounded servicemen and women.  People who were hemorrhaging or very anemic often needed transfusions.  When I graduated in 1962 there were already 4,400 hospital blood banks and 178 Red Cross and community facilities.  I never knew what medicine was like before transfusable blood was readily available.

In 1997 several San Francisco Bay Area blood banks merged and called themselves Blood Centers of the Pacific.  The non-profit corporation collected huge amounts of blood (200,000 units a year) from willing voluntary donors, checked it for blood type and for disease, fractionated the fluid into its various components, and supplied it, for a fee, to more than 60 hospitals.  Its annual budget exceeded $40 million.

The blood supply is relatively safe in part because of the outrage of an angry man.  In the 70s a California activist named Paul Gann capped our property taxes, and that made him famous.  But the legislation that bears his name, the Gann Act, has nothing to do with taxes; it deals with transfusions.  Around 1982 Gann had heart surgery and was transfused.  Five years later he discovered he had HIV.  The blood he received had come from someone infected with the AIDS virus; either the donor had not been adequately screened or the blood was not tested carefully enough.  Gann was furious and apparently felt “there oughta be a law.”  So he wrote one.

Prior to elective surgery, California doctors must tell patients that they can store their own blood and have it available should they require a transfusion.  Stockpiling blood before planned surgery can be tedious and costly, but it’s intuitively better to get your own blood back than to receive someone else’s.  It’s also the law, so if the patient wants it, we do it.  The act also says people can refuse blood from the “bank” and instead get it from a donor they designate.  The idea makes sense, but blood from a friend or loved one is no longer safer than banked fluid.  Before a unit of blood is given it must be tested for the usual suspects, and it’s logistically near impossible to collect, check, and process designated blood in an acute or urgent situation.

Before Gann’s outrage, some blood bank executives argued that if they looked at blood too carefully they would have to reject many donors and throw away too many units.  Doctors wouldn’t be able to treat the ill.  People would die.  After the Gann incident, blood banks (which were already pretty good at questioning people about risk factors) got serious about screening blood for HIV, HTLV, hepatitis B, and hepatitis C, and for a few other illnesses: mosquito-borne West Nile virus, Zika, cytomegalovirus, Chagas (a parasitic disease whose normal habitat is Central America and parts of South America), and Babesia (a tick-borne parasite found in New England).

We’re apparently NOT yet testing the 11 million units of blood Americans use each year for dengue, a mosquito-borne disease that’s common in Southeast Asia, or for chikungunya, an African-origin viral disease that was responsible (between 2014 and 2016) for the fever and joint pain of 4,000 American travelers, most of whom had recently visited a Caribbean island.5  And we don’t test for hepatitis E, the most common type of hepatitis in India and parts of Asia.2

Before 1996 blood banks identified viral diseases by checking for the presence or absence of specific antibodies in the serum.  When a virus invades a body, the immune system reacts and makes detectable antibodies, so it was believed that blood lacking certain antibodies should not be infectious.  To prove their blood was safe, blood banks participated in studies of people who were transfused with antibody-tested blood.  Some 2.3 million transfusions were given during the study period, and recipients were subsequently evaluated to see if they remained disease free.  One in every 493,000 infused units caused HIV; hepatitis C was seen after one in a hundred thousand transfusions, and hepatitis B after one in 63,000.3  Screening was good but imperfect.  During the early weeks after a person is infected, the virus incubates and its numbers grow.  It takes a while before measurable antibodies develop, so blood can be contagious when the antibody tests are negative.

Over time PCR technology improved and we were able to directly detect and measure minuscule amounts of virus. (PCR is like a Xerox machine for DNA.  It allows technicians to make millions of copies of the original, to turn a tiny amount of genetic material into a quantity large enough to analyze and identify.)   In 1999 blood banks started using the technique to screen all 66 million units of blood that were transfused.  Between 2006 and 2008, with PCR testing in use, the recipients of 3.5 million units of blood were checked to see if they had been infected with any of three common chronic viral diseases.  One in 1.85 million units of blood that were free of “measurable” viral particles caused an HIV infection; one in 246,000 transmitted Hepatitis C, and one in 410,000 gave the recipient Hepatitis B. We’re not perfect yet.4

While blood is donated freely, screening the donor and acquiring, testing, and distributing the red stuff is expensive.  A recent survey put the cost of a unit of transfused blood at $522 to $1,183.  In most hospitals much of the blood is used at the time of surgery.   Hospitals vary in size and in the number and types of operations performed.  So it’s not surprising that, in the same survey, acquired blood cost $1.6 million to $6 million per hospital annually.

Chapter 3 – The 20th century: tetanus, parasites, hormones

Early in the First World War, 8 of every thousand wounded British troops in France developed tetanus, also known as lockjaw.  When dirt got into a wound and the skin healed over it, bacteria were trapped, and germs that didn’t require oxygen thrived.  One of the organisms that commonly live in the soil, Clostridium tetani, doesn’t usually cause a terrible infection, but the germ produces a lethal toxin.  An average of eight days after a soldier was wounded, enough of the poison had often entered his body to cause symptoms.  Jaws tightened, muscles became rigid, and breathing and swallowing became difficult.

The antitoxin, an antibody that blocks the poison and prevents “lockjaw,” was the result of research performed by two men.  One, a physician named Shibasaburo Kitasato, was born in a mountainous village on the southern Japanese island of Kyushu.  He became a microbiologist, and the Japanese government sent him to Berlin, to Koch’s lab.  He was the first to grow a pure culture of the tetanus bacillus.  In 1890 he and another researcher, Emil von Behring, “injected sub-lethal doses of tetanus toxin into rabbits.” They produced an antitoxin that, when injected, blocked the poison.  They did not, however, develop the vaccine that prevents the disease.

The vaccine was created in 1924 by a French veterinarian, Gaston Ramon. He inactivated the deadly poison with formaldehyde, creating a “toxoid.”  Then he added an “adjuvant,” a chemical that boosts the immune response to the now weakened, no longer lethal toxin.15 His creation, which also led to the development of the vaccine that prevents diphtheria, saved countless lives, and he was nominated for the Nobel Prize 155 times.  He never won.

By the time the U.S. entered the Second World War the army was routinely using the tetanus vaccine.  There were over 2.5 million wounded soldiers during the conflict, but only 12 developed lockjaw.  Four of those individuals probably didn’t get all three shots.

PARASITES

As a freshman med student, I attended a weekly class and was introduced to a large number of bizarre-looking microscopic creatures. I don’t recall many of the details I crammed into my brain for the final exam, but I’ll never forget the take-home message.  It was written on the chalkboard by the good-natured Japanese professor and is probably the only thing that most of the giggling medical students remember to this day: DES.  Don’t Eat Shit.

By the time I entered school, the discoveries of Pasteur, Koch, and Lister were ancient history, and a whole new class of invaders called parasites had been identified.   Dozens of scientists had isolated one creature after another and studied its life cycle.  We learned that when people defecate in the open (and half a billion people in India still do), bacteria and parasites are deposited in the dirt and get into our rivers.  Some of the organisms can enter the bodies of the animals and fish that we sometimes eat raw.  When we walk barefoot on ground where someone previously defecated, parasites can penetrate the soles of our feet.

Humans currently co-exist with 90 common species of parasites.  Most are mainly found in the tropics.  Some were apparently inherited from our primate ancestors in Africa.  The tapeworms and roundworms that live in our intestines, and thus share our meals, were visible to the naked eye of Hippocrates and of physicians in Rome, China, and the Arab Empire during the first millennium.

Most of the parasite life cycles were worked out by researchers all over the globe before the Second World War.  A number of drugs, with a variety of toxicities, have been developed that can kill the creatures. We don’t need them much in the developed world because parasites are not a major problem in most places that have good sewage, vector control, and clean water.  But a problem may be brewing.  Close to 200,000 Americans sleep outside most nights, and the government doesn’t provide adequate facilities to protect them and, in turn, protect us from the microscopic creatures.

The chief drug that’s used to kill intestinal parasites was patented in 1975 by Smith Kline.  It was created by Robert J. Gyurik and Vassilios J. Theodorides after Vassilios read an article, had a sudden insight, and sketched the chemical structure of the future medication.  Raised in a small Greek village near the Macedonian border, the drug’s inventor starts the tale of his youth by telling of the morning in 1941 when 400 German soldiers surrounded his town and marched its 1600 occupants to the village center.  He was 10 years old.  The soldiers told everyone to bring out their guns.  Then they searched the houses.  Finding shotguns in two homes, they tied the owners of the weapons to a tree and publicly shot them.  Later that day a German soldier searching Vassilios’s house saw a shotgun behind a door and motioned to Vassilios: hide it. When the soldier left, Vassilios threw the gun in the bushes and was grateful.

To attend school the young man had to walk an hour and a half to another village.  He was in that nearby town the day in 1947 when Communist soldiers burned his village and killed 48 people.  Some were relatives. A good student, Vassilios found school easy and interesting.  After he finished high school, planning to become a mathematician, he visited the university office and asked where the mathematics school was.  The clerk asked: what’s wrong with the veterinary school?  And Vassilios decided to give it a shot.  As a veterinary student he developed an interest in research and decided he needed a PhD.  His future wife’s family had emigrated to Boston, and Vassilios decided to follow her.  He came to the U.S., married the girl, earned a PhD, and worked for Pfizer in Terre Haute, Indiana for two years.  There were only two Greek families in town.  The other Greek was the mayor.  Vassilios’s wife was unhappy, and they moved to Pennsylvania, where he got a research job at Smith Kline and French Laboratories.

A few years later he read an article and had an “aha” moment.  Inexplicably, he somehow “knew” the steps he would have to take to create the chemical that became Albendazole. (Quoting Pasteur, he explained, “God helps the minds that are prepared.”)  Introduced in 1977, the medication was initially given to animals in Australia and New Zealand, but it was not approved for people in the U.S.  Someone at the FDA decided it was carcinogenic.  Vassilios met with scientists at the agency and showed them that their “mathematical approach in evaluating the drug’s potential carcinogenicity was incorrect.”  The officials at the FDA agreed, and humans started using it in 1982.

Twenty-eight years later the company that owned the drug had merged with two other pharmaceutical giants and was called GlaxoSmithKline (GSK).  It had offices in over 100 countries and was headquartered in Brentford, a “town” in greater London where Julius Caesar crossed the Thames during his 54 BC invasion of Britain.  By October 2010 the drug had become a financial loser, and the company sold it.  The marketing rights for the major anti-parasitic drug were picked up by Amedra Pharmaceuticals, a small American drug company.  The details of the deal were not disclosed (or at least I couldn’t find them on the web).  As part of the agreement, GSK agreed to continue manufacturing the medication for Amedra in the short run.   They also renewed their pledge to the World Health Organization: they would continue to give the organization 600 million tablets per year as their contribution to the struggle to free the world from lymphatic filariasis.

As explained by the CDC, lymphatic filariasis (LF) is a mosquito-borne parasitic disease caused by microscopic, thread-like worms that “inhabit the lymphatic and subcutaneous tissues” and prevent liquid from flowing through the lymphatic vessels.  The country roads of the body, lymphatic vessels flow upward and empty into the large subclavian (under the collarbone) veins.  On their way they pass through lymph nodes that filter toxins.  The fluid within them transports infection-fighting white cells.

Filariasis affects 120 million people in 80 countries.  People with the condition develop swollen limbs, breasts, and scrotums, and their skin becomes thick and hard.  The Global Alliance is trying to rid the planet of the parasitic disease by annually giving albendazole and ivermectin to all of the occupants of communities that are at risk.3

The year after Amedra bought Albendazole, the generic medication manufacturer Teva stopped manufacturing the drug’s only U.S. competitor, mebendazole (brand name Vermox), and Amedra became the only U.S. player in the intestinal parasite business. With the U.S. rights in its pocket, the drug’s new owner dramatically increased the price.   In late 2010, the average wholesale price was “$5.92 per typical daily dose.”  By 2013 it had jumped to $119.58.

In India Albendazole commonly sells for $18.  According to Wikipedia, in some countries it costs a penny to 6 cents a dose.

The parasites albendazole targets, helminths, live in the intestines of a billion people. Usually acquired in childhood, the creatures are sometimes ingested with tainted food and water.  When mature, some worms can penetrate the skin of a child or adult who walks “barefoot on contaminated soil.”  Much as a butterfly spends part of its existence as a caterpillar, the parasites that camp in our bodies have a life cycle.  Some spend part of their existence in a cow, pig, or fish and gain entrance to a person’s body when people eat raw meat or uncooked fish.  As Giovanni Grassi, an almost world-famous Italian researcher (he should have shared the Nobel Prize for malaria), demonstrated in 1881, the worms can go directly from man to man. He started his experiment by examining his feces to prove he wasn’t infected. Then he ate ascaris eggs.  (Ascaris is a roundworm that travels through the lungs on its way to its home in the intestines.) A bit later he found the parasite in his feces. In 1922 a Japanese pediatrician, Shimesu Koino, one-upped Grassi when he ingested ascaris eggs and found worms in his sputum.  Such, as they say, is how research on parasites was performed.

Medicaid spent less than $100,000 per year on Albendazole in 2008, and more than $7.5 million in 2013.  Doctors in this country are prescribing it more often because the CDC (Centers for Disease Control and Prevention) thinks we should presume that refugees who come here from poor countries have parasites in their intestines, and we should treat them.

One day when my grandson, who had never lived in a third world country, was 3 or 4, he passed a worm.  My daughter checked the internet, identified the creature, and put it in a jar.  When she showed it to her doctor he was taken aback, amazed.  His nurse, on the other hand, merely shrugged. No big deal.   She had grown up in a village in the Philippines.  We still have no clue as to where or how the worm got into the kid’s body.

A totally different kind of parasite lives in the bodies of millions of people in sub-Saharan Africa, some South American countries, and a few countries in Asia.  Called schistosomiasis, or bilharzia, the disease affects close to 240 million people in the world, and the parasite is found in the fresh water ponds of Puerto Rico.  In 1960 the U.S. Army had a research facility there that studied the creature.  It was situated on a bluff overlooking the balmy Caribbean, and I spent one med school summer working in the lab. A fellow student and I would drive to an inland body of water, put on protective hip boots, and trudge into the ponds. The snails of Puerto Rico are part of the parasite’s life cycle.  An intermediate life form of the creature exits a snail, enters the water, and penetrates the skin of people who wade in the island’s fresh water ponds.  Once inside a person’s body the parasites make their way to the liver or urinary tract and lay eggs.  Our body’s reaction to the eggs can cause significant damage.

We scooped the snails up with a net, carried them back to the lab, and placed them in sunlight.  The snails discharged, “shed,” minuscule worms.  We gathered the tiny parasites, which are called cercariae, and put them into test tubes.  Then we dipped the tails of mice into the liquid.  Weeks later we studied the infected mice.

In the 20th century we learned of at least one malaria-like parasite that is transmitted by a tick.  The parasite, Babesia, enters a person’s body when an infected tick bites, and it invades red cells, causing the chills and fever of babesiosis.  It can be lethal.

The most common infection that ticks spread is caused by a bacterium that’s technically a spirochete.  Called Lyme disease, it is named for the town in Connecticut where it was identified.  In the mid 1970s Mrs. Murray, an artist who had been living for 15 years in a house near a picture-postcard rural country road, developed rashes, painful swollen joints, and numbness and weakness.  Over the years she got worse.  She was hospitalized three times and was unable to paint.  Some thought she was a hypochondriac.  Then her son developed joint pain and couldn’t smile.  He had Bell’s palsy, and his facial muscles didn’t work.  Another son developed a rash behind his knee, and others in the town got ill.  It took a while before a young Yale physician and others discovered that some ticks carry a bacterium, a spirochete.  When the ticks bite, the spirochetes are injected into a person’s body.  It can take years before the creatures causing Lyme have done their worst.

During the last century we’ve learned, one by one, about the diseases that are transmitted by the bites of ticks hiding in the long grass we hike through, in nearby woodlands, or in the fur of a four-legged companion.  The brown dog tick can carry the bacterium responsible for Rocky Mountain spotted fever, an infection that makes people quite ill but, thanks to doxycycline, is lethal only one time in 200. The bite of the black-legged ticks of the Northeast and Great Lakes area can introduce our bodies to anaplasmosis, a bacterial infection that causes chills and fever and responds to antibiotics.  And there are others.

In recent decades the Centers for Disease Control and Prevention has been informing those who care about common and rare conditions with concise, disease-specific posts on the internet.  It has a free newsletter, the Morbidity and Mortality Weekly Report. Located in Atlanta, Georgia, the federal agency was created a little over 70 years ago.  In its early days it concentrated on mini epidemics.  I recall a hepatitis outbreak in 1963.  It affected a number of the doctors and nurses who worked at the hospital where I was an intern.  The CDC sent a young sleuth.  He checked our plumbing, drew our blood, and inspected our kitchens.  It took a month, but he figured out where and how the illness started.  When I was in the military, a friend who was sent to help immunize a tribe of Native Americans in the state of Washington came through town.  My brother-in-law was sent overseas to help determine the dose of gamma globulin that would keep soldiers in Vietnam from getting hepatitis.  Currently a $6.6 billion a year department of the Public Health Service, the agency collects data on acute and chronic diseases and makes recommendations.  Its epidemiologists investigate outbreaks, and it provides the laboratory and pharmacy of last resort for rare infectious diseases.  According to Kaiser Health News, some experts are saying that “under President Donald Trump, the CDC has become a non-entity in the battle against the coronavirus.”59

In the late 19th and early 20th centuries a number of hormones were discovered and isolated.  We learned what endocrine glands do and how they go wrong. 

Hormones are chemical messengers. They are created in an “endocrine gland,” then travel, mostly through the blood, to the parts of the body where they act.  If a man is given estrogens he grows breasts.  If a woman is given male hormones she grows a beard.

Thyroid and adrenal hormones help regulate many of the body’s metabolic and other functions. When either is totally absent people wither and die. They are “vital.”

Adrenal glands sit atop the kidneys and produce cortisone and a number of other hormones.  In 1940, war seemed likely, and the government funded cortisone research, allegedly because officials believed the hormone would allow pilots to fly up to 40,000 feet without oxygen.

Three people got the Nobel Prize for cortisone.  The first, Philip Hench, was a Mayo Clinic rheumatologist who loved Sherlock Holmes and had “one of the more remarkable Sherlockian libraries ever assembled.”  He liked mysteries and wanted to understand why a patient of his with jaundice suddenly got well, or why a woman he was caring for who had severe rheumatoid arthritis improved dramatically when she was pregnant.  Some hormone had to be responsible.

His “Watson” was Edward Kendall, Mayo’s chief of biochemistry.  Described by one writer as a charismatic man with a generous personality, and by another as a person who “often didn’t get along well with his colleagues,” he spent the greater part of his professional life standing before a lab bench.  Working with 150 tons of adrenal glands, he extracted $9 million worth of epinephrine for Parke-Davis, and he kept the material the company didn’t want.  Later he separated 5 cortical compounds from the organs.13

Collaborating with Hench, Kendall checked the urine of the people who unexpectedly got well and found adrenal compounds in both patients.   Kendall gave Hench minute amounts of one of his adrenal extracts, and Hench gave it to volunteers with arthritis.  It didn’t help.

About that time they met Tadeus Reichstein, a Swiss researcher who was born in Poland and had been named for Tadeusz Kościuszko, the Polish national hero who fought in the Continental Army during the American Revolution.  Reichstein had learned how to extract adrenal hormones from bile.  I’m not sure what happened next, but Dr. Kendall and biochemists from Merck & Company produced 9 grams of adrenal “extract E.”  In 1948 it was given to a patient with rheumatoid arthritis, “the resulting improvement was amazing,” and all three researchers were given the Nobel Prize.

Before Merck Sharp & Dohme learned how to manufacture commercial quantities of the hormone turned medication, “the New York Times published sensational stories and pictures,” and many wanted to try it.  Scientists using cattle and sheep bile were only able to produce a limited quantity, so for a few years there was a cortisone black market.  Doctors didn’t know how to use the drug, and when it was used there were serious side effects.


Cortisone and hydrocortisone became available after the Second World War.  The hormone saved Jack Kennedy’s life.7 Kennedy, the 35th U.S. president and the son of a wealthy Irish immigrant, was diagnosed in the 1940s as having Addison’s disease: his adrenal glands didn’t make enough cortisone.  People with the condition are anemic, have abdominal pain, and lose weight.  Some have weak muscles and get dizzy when they stand.  The disease is usually the result of an immune system disorder; 20% of the time tuberculosis is the offender. When young Jack fought with his older brother he was always on the short end. As an adult, Kennedy collapsed twice: once on a congressional visit to Britain and a second time at the end of an election campaign parade. That time the diagnosing physician told one of Kennedy’s friends: “That young American friend of yours, he hasn’t got a year to live.”35

Over the last century pharmaceutical researchers have developed many oral, injectable, and topical cortisone derivatives.  Some are a little more potent or have fewer side effects, but they’re basically the same hormone. Most doctors who care for patients have used them for conditions as unrelated as swelling of the brain, joint inflammation, severe asthma, and poison oak.  I, like most doctors, developed an introductory pitch: this is a medicine, but it’s also a hormone.  That means that, depending on the dose and the amount of time it’s taken, everyone who uses it has side effects.  They range from feeling good, being hungry, and having trouble sleeping to decreased immunity, thin bones, muscle weakness, and diabetes.

The body makes insulin in the islets of Langerhans, groups of cells that are scattered throughout the pancreas.  The hormone allows sugar to enter muscle and fat cells.  When sugar can’t get into the cells, the level of glucose in the blood rises.  People with juvenile diabetes have, over months to years, lost the ability to make insulin because their immune system has destroyed their insulin-producing beta cells.

Prior to 1920 scientists knew where the hormone was made, but they didn’t know how to extract it intact.  Most of the pancreas makes digestive enzymes that enter the small intestine through the pancreatic duct.  The activated enzymes break food into small absorbable pieces; they divide the bonds that link amino acids together to form proteins.  The insulin inside the pancreas couldn’t be extracted because it is “digested” when the pancreatic enzymes become functional.

By the early 1920s it was known that when the pancreatic duct of a dog was tied off, the activated enzymes digested the dog’s pancreas, but the animal didn’t develop diabetes.

Two Canadian scientists, Banting and Best, took that knowledge and ran with it.  Fred Banting, a Canadian surgeon, had been a battalion medical officer during the First World War, and his arm had been wounded by shrapnel.  After the war he had trouble finding work and needed the research job. His colleague, Best, had just graduated from college.

They tied off a dog’s pancreatic ducts and managed to keep the dog alive long enough for the pancreas to destroy itself. Then they removed the organ, chopped it into pieces, mixed it with saline, and filtered an extract.  As they had hoped, the activated enzymes had digested most of the pancreas, but they hadn’t destroyed the insulin.3

The discovery was momentous: a fatal disease could be controlled. For complex reasons, half of the Nobel Prize for the breakthrough was awarded to Fred Banting; the other half was given to the head of the department, J.R. Macleod.  Charles Best was left out, though according to Banting he “was never credited with proposals that advanced the research.”

That bothered Best.  He had done most of the work, and he “developed a deep psychological hunger for recognition as a discoverer of insulin.” During the subsequent decades he and Banting worked together, and “Banting developed an intense dislike of Best.” In 1940 Best was invited to come to London.  When he suddenly backed out, Banting, now the head of the department, decided he would go.  Before boarding the plane for war-torn London, Banting’s last words were: “If they ever give that chair of mine to that son of a bitch, Best, I’ll roll over in my grave.” His plane crashed in Newfoundland, and he died.25