Experimenting On Humans Controversial But Useful
April 2, 2018 at 1:54 p.m.
By Max Sherman
The parasite is a tiny waterborne worm that causes schistosomiasis, a disease that sickens millions of people in Africa, the Middle East and Latin America and kills thousands each year. In the experiment, volunteers are deliberately exposed to the parasite to help develop a vaccine to prevent the disease.
Such research is not uncommon; similar studies have been done with malaria, cholera, influenza, shigella, dengue, norovirus, tuberculosis, rhinovirus, typhoid and giardia. Human challenge studies, which typically involve only a few dozen volunteers, speed the decision of whether a promising treatment is worth pursuing, saving time and money. Moreover, tests that intentionally infect people can quickly and efficiently flag potential side effects.
Human challenges date back to the 18th century and the first vaccine, when English physician Edward Jenner attempted to persuade the world that infecting a person with harmless cowpox could prevent the more dreaded disease, smallpox.
Self-experimentation with drugs began as early as the 19th century with Friedrich Wilhelm Serturner, a German scientist. In 1803, he had numerous brushes with death as he experimented on himself to produce a drug that remains the ultimate painkiller to this day: morphine. (Morphine is derived from opium.) His research was in fact a turning point in the history of drug studies.
Serturner observed that some samples of opium could rapidly and completely dull pain, and he theorized that opium contained a specific active ingredient responsible for the effect. In his quest to find that component, he was the first to apply basic chemical analysis to the drug. He isolated the specific narcotic substance in opium and named it "morphine" after Morpheus, the Greek god of dreams.
Once he had isolated morphine, Serturner began testing it with cautious experiments on animals. Then he tested it on himself and three young friends. They each ingested a small dose, then kept taking it until each experienced sharp stomach pain and a feeling of being about to faint. Alarmed by these symptoms, Serturner recognized that the drug was a poison; it later became clear that they had swallowed about 10 times the dose now recommended.
One other incident is worth mentioning. Max von Pettenkofer, who lived in Munich, Germany, in the late 19th century, was convinced that it took more than cholera bacteria to cause the disease. To prove it, he swallowed 1 cubic centimeter of bouillon laced with the organism, taken from a patient who had died of cholera. The next day he began to experience abdominal colic with extensive gas pains and diarrhea that lasted almost a week. Fortunately he did not die, likely because he contracted only a mild case of the disease.
Intentionally infecting a human being with a deadly disease would not pass ethical standards in the civilized world today, but as recently as the early 20th century, Austrian psychiatrist Julius Wagner-Jauregg won the 1927 Nobel Prize in Medicine for injecting blood from people with malaria into patients with syphilis, in an effort to cure their insanity and paralysis. The treatment was quite successful, and many thousands of victims were spared an almost certain lingering death from the disease. However, with the advent of penicillin in the mid-1940s, it was obvious that malaria therapy would slowly but inevitably be replaced.
Many doctors, like the aforementioned Pettenkofer, have challenged themselves with pathogens to prove the worth of their own experimental medicines or theories. Another example is Dr. Barry J. Marshall, who, along with Dr. J. Robin Warren, identified a bacterium now known as Helicobacter pylori in patients with inflamed stomachs and ulcers. As part of his research, Marshall first swallowed a tube so that tests could document that he had neither gastritis nor an ulcer and was not silently harboring H. pylori. Then he swallowed the bacteria, which gave him gastritis. Although he did not continue the self-experiment long enough to produce an ulcer, it provided strong indirect evidence that many ulcers result from H. pylori infection rather than from stress, as was previously thought.
Marshall's self-experiment was one part of continuing research that led to antibiotic therapies that can cure gastritis and ulcers, as well as to the strong suspicion that chronic H. pylori infection can cause stomach cancer. (Anyone interested in the history of self-experimentation should read "Who Goes First?" by Lawrence K. Altman, an excellent book published by the University of California Press.)
Today, of course, federal laws and regulations require that new treatments be tested in large numbers of patients before they reach the market. Before a drug can be sold in the United States it must pass through three phases of clinical trials. The first tests toxicity and safety in just a small number of volunteers; the later phases are progressively larger in scope and participation.
Max Sherman is a medical writer and pharmacist retired from the medical device industry. He has taught college courses on regulatory and compliance issues at Ivy Tech, Grace College and Butler University. Sherman has an unquenchable thirst for knowledge on all levels. Eclectic Science, the title of his column, will touch on famed doctors and scientists, human senses, aging, various diseases, and little-known facts about many species, including their contributions to scientific research. He can be reached by email at [email protected].