Fall 2021 - Innovation

A Brief History of Immunity and Immunoglobulins

Discoveries about the types of immunity eventually led to the use of human antibodies to treat disease.

About three thousand years ago, Mesopotamians attributed disease to the celestial cycles of the planets, stars and moon (astrology). Astrologists posited these movements affected every aspect of life, including the weather, natural disasters and even human disease. In fact, astrology remained a legitimate scholarly subject pertaining to medicine until the scientific study of medicine began in the 17th century.

One thousand years later, miasma was thought to be the likely cause of disease. According to the miasma theory, disease is caused by poisonous vapors or mists arising from decaying organic matter, contaminated water, foul air or sick patients. These vapors could affect individual patients or whole communities, thus explaining the spread of disease among close contacts.

Another early theory, known as theurgy, held that disease was caused by angry gods punishing individuals or communities for their bad behavior, a divine punishment that led to suffering, disease and even death.

Greek physician Aelius Galen (129-219 A.D.) believed disease was due to an imbalance of the four vital humors — blood, phlegm, yellow bile and black bile — which gave rise to the four temperaments: sanguine, phlegmatic, choleric and melancholic, then thought to be inborn psychic qualities. Treatments for these temperaments included cupping, bleeding, leeches, purgatives and expectorants to restore humoral balance.

Yet, most of these theories were abandoned when the germ theory of disease became established in the 19th century.1

Early Theories About Immunity and Contagion 

It was long known, as recorded by the Athenian historian Thucydides (460-400 B.C.), that persons who survived the plague of Athens would not suffer a second attack. Thus, survivors were able to care for plague patients without becoming ill. This concept of immunity was substantiated by the Persian physician Rhazes, who in 900 A.D. recorded that survivors of smallpox were immune to a second attack.1

But, acquired immunity was not limited to disease. In 100 B.C., Mithridates, the King of Pontus (a small kingdom on the Black Sea), was concerned about attempts on his life by poisoning. So he made a concoction of the 12 known poisons (called mithridaticum) and took small and increasing doses of the mixture so he could survive deliberate poisoning. It worked so well that when he tried to commit suicide by self-poisoning, he was unable to do so!

In 60 A.D., the Roman statesman and poet Marcus Lucanus (39-65 A.D.) described the resistance to lethal snake bites developed by the snake charmers of the Psylli tribes of North Africa, for which Lucanus coined the Latin term “immunis evasi.”

During the periodic epidemics of smallpox and bubonic plague, it was realized that some illnesses were contagious. In 1546 A.D., the Italian physician Girolamo Fracastoro suggested contagion was caused by invisible seeds (seminaria) in the air, earth or water, arising from an infected person or from decaying organic matter. Seminaria had an affinity for certain organs or tissues or for one of the four humors proposed by Galen. It was believed that when all of the seeds were expelled, recovery ensued and the patient was resistant to a second attack.1

Smallpox Immunity: The Royal Experiment of 1721

When Lady Mary Wortley Montagu, the wife of the British ambassador to Constantinople, became concerned her young children would contract smallpox2 (variola), she heard that inoculation (aka variolation, the first method used to immunize individuals against smallpox) would protect against the dread disease. Variolation involved taking the powdered crusts of the sores of a patient recovering from smallpox and placing them under the skin of a second person, producing a mild case of smallpox that rendered the recipient immune. Its slight mortality of less than 2 percent was much lower than the 30 percent to 50 percent mortality of natural smallpox, which also left many survivors scarred or blind.

After the procedure was performed on several prisoners and orphans, who were subsequently exposed to smallpox and found to be protected, Lady Mary had her children variolated. When she returned to London, she convinced the leaders of the British army to variolate all of its soldiers. News of variolation also reached the firebrand American preacher Cotton Mather, who championed it in Boston to halt the smallpox epidemic of 1721. Later, during the Revolutionary War, George Washington had his surgeon general, Benjamin Rush, inoculate the entire Continental Army, preventing the already-inoculated British army from gaining a biological advantage.

Edward Jenner and Vaccination

According to legend, English physician Edward Jenner (1746-1823) heard a Bristol milkmaid exclaim, “I shall never have smallpox for I have had cowpox; I shall never have an ugly pockmarked face.” After confirming this observation, in 1796 Dr. Jenner took material from a cowpox sore on the finger of milkmaid Sarah Nelmes and inoculated it into both arms of 8-year-old James Phipps, the son of his gardener. James developed a slight fever but quickly recovered. He was then challenged with an exposure to smallpox, and he indeed was immune. (But the causative agent of smallpox remained a mystery.)

Dr. Jenner’s friend, Richard Dunning, coined the term “vaccination” for this procedure, which was quickly adopted. Indeed, Dr. Jenner was awarded a medal by the French statesman and military leader Napoleon Bonaparte, who at Jenner’s request released two British prisoners of war. Vaccination then spread worldwide, leading to the complete eradication of smallpox in 1980.1,2

The Germ Theory of Disease

Bacteria’s role in disease was first suggested in 1656 A.D. by German Jesuit priest and scholar Athanasius Kircher, who observed tiny worms in the blood of plague patients during an epidemic in Rome. Better known are the 1670s studies of Dutch microscopist Anton van Leeuwenhoek, who observed motile particles of many shapes in swamp water that he called animalcules; these spherical, rod-shaped and spiral forms were later renamed bacteria.

Using the new microscope, English Catholic priest John Turberville Needham (1713-1781) noted that freshly boiled mutton gravy placed in a corked bottle was soon swarming with live animalcules. He attributed this to spontaneous generation, or a “vegetative force.” But Lazzaro Spallanzani (1729-1799), an Italian priest from Modena, was skeptical. He placed boiled mutton gravy in a sealed glass flask that, unlike the corked bottle, was airtight, and its gravy remained free of animalcules. He also observed that a single cultured animalcule transformed from a spherical shape to that of a dumbbell before dividing into two identical spherical animalcules. Spallanzani concluded life arises only from other life, whether bacteria, plants or animals, thus disproving spontaneous generation.3

In 1847, Hungarian physician Ignaz Semmelweis noted a high incidence of puerperal (postpartum) fever in women whose deliveries were assisted by doctors returning from the autopsy room. He instituted handwashing with chlorinated lime, reducing the incidence of puerperal fever from 18 percent to 2 percent.

In 1854, London physician John Snow investigated an epidemic of cholera in a district of London supplied with water pumped from the lower Thames River next to a sewage outlet. However, cholera was not occurring in districts supplied with water from farther up the Thames. So, he recommended boiling the water and later had the pump handle removed, thus interrupting the epidemic. This is regarded as the first epidemiologic study!

In 1860, French microbiologist Louis Pasteur (1822-1895) was able to culture bacteria from several patients with severe infections, including puerperal fever. In 1861, he proposed the germ theory of disease,4 a theory supported by German scientist Robert Koch (1843-1910), who proposed four postulates that must be met before a specific bacterium is proven to cause a specific disease:

1) The bacterium must be present in every case of the disease.

2) The bacterium causing the disease must be grown in a pure culture.

3) The disease must be reproduced by the cultured bacterium in a previously healthy host (e.g., an animal).

4) The bacterium must be recoverable from the experimentally infected animal.

These conditions are not always possible when the bacteria cannot be cultured (e.g., leprosy) or if there is no animal model (e.g., smallpox). 

Pasteur is regarded as the Father of Microbiology. Born to humble parents in rural France, he was an average student, more interested in fishing and sketching. He failed his first college exam but was eventually admitted to the École Normale Supérieure in Paris, graduating in 1846. After several appointments at various French institutions, he became the director of his own laboratory in Paris, the future Pasteur Institute.

Pasteur is best known for heating milk and wine to inhibit bacterial contamination (pasteurization), but that was just one of his multiple accomplishments. He saved the French silk industry by developing a method to screen silkworm eggs for those not infected (a method still used today), and using killed bacteria from pure cultures, he developed vaccines for chicken cholera, cattle anthrax and swine erysipelas. 

Rabies and the Rabies Vaccine

Pasteur’s most innovative accomplishment was the development of a rabies vaccine, despite not knowing the cause of rabies and being unable to see or grow the virus.

Indeed, viruses had not yet been discovered. In 1886, German agriculturist Adolf Mayer (1843-1942) showed tobacco mosaic disease was contagious by making an extract of the affected tobacco leaves and using it to transfer the disease to healthy tobacco plants. In 1892, Russian biologist Dmitry Ivanovsky (1864-1920) showed that these extracts were still contagious when passed through a bacterial filter, and thus free of bacteria, but when the filtrate was boiled, it was no longer infectious! In 1935, virologist Wendell Meredith Stanley (1904-1971) of the University of California, Berkeley, crystallized the tobacco mosaic virus, an RNA virus whose coat protein has a molecular weight of about 18,000. He was awarded a Nobel Prize for this work in 1946.

Rabies is usually acquired by the bite of a rabid animal, typically a dog or bat, and is almost always fatal. Pasteur’s rabies vaccine was derived from the dried nerve tissue of rabies-infected rabbits. His first patient was 9-year-old Joseph Meister who, on July 6, 1885, was badly mauled by a rabid dog. Pasteur gave the boy 13 injections of the vaccine over 11 days, at some personal risk since he was not a licensed physician. And, the boy did not get rabies. Pasteur tested his vaccine in 350 bitten patients with only one failure. The Pasteur Institute in Paris was then established to produce the vaccine. 

Pasteur died in 1895 without ever knowing the cause of rabies. He achieved worldwide acclaim, resulting in multiple statues, streets and buildings in his honor. After a funeral service in Notre Dame Cathedral, his body was interred in a vault at the Pasteur Institute, covered with Byzantine mosaics depicting his many achievements.

Bacterial Toxins and Antitoxins

Pasteur’s liquid broth cultures were a mixture of many bacteria. In 1881, Koch added gelatin to a broth culture and poured it on a glass plate with the intent to grow colonies of a single bacterium. This effort was greatly improved when the wife of his assistant suggested adding agar to the broth instead of gelatin. When poured on the glass plate, the cooled agar broth adhered to the glass as a gel and promoted the growth of a pure bacterial colony. The isolated bacteria could then be transferred to liquid broth to grow large amounts of a single bacterium used to isolate its toxin and develop a vaccine. 

In 1884, German bacteriologist Friedrich Loeffler (1852-1915) identified and cultured Corynebacterium diphtheriae, the cause of diphtheria (known as “The Strangling Angel of Children” because it killed thousands of children every year). In 1888, Pierre Roux (1853-1933) and Alexandre Yersin (1863-1943) isolated and concentrated the diphtheria toxin while working in Pasteur’s laboratory.

In 1890, Kitasato Shibasaburo and Emil von Behring gave guinea pigs small injections of heat-weakened diphtheria toxin and then used their serum to protect other guinea pigs from lethal injections of the toxin. They called this “antitoxin activity.” Von Behring then used horse (equine) diphtheria antitoxin to successfully treat patients dying of diphtheria. This treatment was widely adopted, earning him the first Nobel Prize in Medicine in 1901.

German immunologist Paul Ehrlich (1854-1915) subsequently called antitoxin an “Antikörper” (antibody), a term now used for all types of antibodies (e.g., antitoxins, agglutinins, precipitins, bacteriolysins, opsonins and neutralizing antibodies).

Von Behring’s work soon led to the development of tetanus antitoxin in horses, since they are large, calm and plentiful. Other animal antibodies for specific diseases were also developed, including those to Haemophilus influenzae, pneumococci and snake venoms.5 These animal antibodies are antigenic, so they must be used with caution since they sometimes cause anaphylaxis.

In 1914, von Behring used a mixture of toxin and antitoxin to minimize the toxic effect of immunizing animals for antitoxin production. In the 1920s, French veterinarian Gaston Ramon (1886-1963) made vaccines even safer by treating toxins with heat and formaldehyde, rendering the toxin nonreactive but maintaining its ability to provoke protective antibodies. These altered vaccines are termed “toxoids” and are still in use today.

Isolation of Human Immunoglobulin

In 1933, Charles F. McKhann, MD, and Fu Tang Chu, MD, noted that serum from the placental blood of newborn infants had the same antibodies as their mothers’ blood, indicating the placental transfer of maternal antibodies. Using ammonium sulfate precipitation, they showed these antibodies were in the globulin fraction of the serum. These placental antibodies were first used in children to prevent or modify measles.

In 1937, Arne Tiselius, PhD, of Sweden, using a new technique termed electrophoresis, showed serum contains five distinct fractions based on their mobility in an electric field: albumin, alpha-1 globulin, alpha-2 globulin, beta globulin and gamma globulin. The gamma globulin fraction contains most of the antibodies of the serum.

In 1942, during World War II, the U.S. Army commissioned Edwin Cohn, PhD, at Harvard to isolate human albumin from the blood of donors to treat shock in soldiers wounded on the battlefield. Dr. Cohn and his team developed a plasma fractionation procedure using cold ethanol at different concentrations and acidities (pH) to isolate several plasma fractions, including albumin (fraction V) and gamma globulin (fraction II), also known as immune globulin (IG).

Therapeutic Human Immunoglobulins 

These days, plasma from several thousand adult donors is pooled and fractionated to obtain Cohn fraction II, the material used to manufacture the IG preparations in use today. This fraction is treated with stabilizers, filtered to remove large complexes, tested for sterility and assayed for antibody content.5 These IG products contain antibodies to multiple bacteria and viruses.

Some IG products are derived from individuals with high levels of antibodies against specific pathogens. These high-titer products include cytomegalovirus IG, tetanus IG, hepatitis B IG, hepatitis C IG, varicella-zoster IG, rabies IG, botulism IG and Rh IG, the last of which is given to Rh-negative pregnant mothers to prevent Rh hemolytic disease in their newborns. COVID-19 IG is now being used to treat SARS-CoV-2 infection.

Monoclonal Antibodies

In 1975, Georges Kohler and Cesar Milstein of the University of Cambridge described a method to obtain large amounts of pure antibody of known specificity (a monoclonal antibody).6 They first isolated a single B cell making a single type of antibody from the spleen of an immunized mouse. This B cell was expanded to produce a short-lived cell line, which they fused with a malignant B cell line that was not secreting antibody. The resulting cell, termed a hybridoma, combined the specific antibody of the mouse B cell with the immortality of the cancer cell line. These hybridomas can be grown in large quantities to produce an unlimited supply of one specific antibody. Kohler and Milstein received the 1984 Nobel Prize for this discovery.

Most monoclonal antibodies are used in the laboratory to identify different types of normal and abnormal cells in the blood or tissues. Today, several hundred monoclonal antibodies are also used in the practice of medicine.4 Four are directed against microbes: respiratory syncytial virus, the anthrax bacterium, the C. difficile bacterium and the HIV gp120 receptor on CD4 T cells. Most recently, antiviral monoclonal antibody cocktails have been developed to treat SARS-CoV-2 infection. Many more are in the pipeline for therapeutic use.

A Long Line of Medical Discoveries 

Early theories about disease and immunity led to our current understanding of how human antibodies can treat disease. It began about three thousand years ago, when the ancient Mesopotamians believed disease was caused by the movement of the planets and stars (astrology). One thousand years later, miasma theory suggested disease was caused by poisonous mists or vapors. Other cultures attributed disease to angry gods. And in the second century A.D., Galen taught that disease was due to an imbalance of the four humors: blood, phlegm, yellow bile and black bile.

Recurrent epidemics such as the bubonic plague and smallpox led to the realization that some illnesses are contagious, and some survivors developed immunity to a second case of the disease. 

The discovery of bacteria by the Dutch microscopist van Leeuwenhoek prompted Pasteur and Koch to culture bacteria and thus develop the germ theory of disease. Bacterial cultures were used to develop vaccines against an organism or its toxin. Von Behring first used horse antitoxin to save a child with diphtheria, earning him the first Nobel Prize in Medicine. About the same time, Pasteur developed a vaccine against rabies, although he could not see or culture the rabies virus.

In the late 1930s, antibody activity was shown to be present in the gamma globulin fraction of serum. This fraction, termed Cohn fraction II or IG, was first isolated in the 1940s and used to prevent hepatitis, poliomyelitis and measles. In 1952, IG was first used to treat a boy with agammaglobulinemia. High-titer IGs were also developed using donors with elevated levels of antibody to a specific pathogen.

In 1975, Kohler and Milstein isolated, cultured and immortalized a B cell making a single antibody by fusing it with a malignant B cell line. This created an immortal cell, called a hybridoma, which can be expanded and cultured indefinitely to provide an unlimited supply of a specific antibody.4 There are now more than 600 monoclonal antibodies used in the diagnosis and treatment of human disease.5

References

1. Silverstein, AM. A History of Immunology. Academic Press, San Diego, CA, 1989. 

2. Stiehm, ER, and Johnston Jr, RB. A History of Pediatric Immunology. Pediatric Research, 2005. 57:458-467.

3. De Kruif, P. Microbe Hunters. Harcourt, San Diego, CA, 1926.

4. Kaufmann, SHE. Immunology’s Coming of Age. Frontiers in Immunology, 2019. 10:1-13.

5. Lu, R, Hwang, Y, Liu I, et al. Development of Therapeutic Antibodies for the Treatment of Diseases. Journal of Biomedical Science, 2020. 27:1-30.
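
6. Kohler, G, and Milstein, C. Continuous Cultures of Fused Cells Secreting Antibody of Predefined Specificity. Nature, 1975. 256:495-497.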

E. Richard Stiehm, MD
E. Richard Stiehm, MD, is professor of pediatrics at the David Geffen School of Medicine at the University of California, Los Angeles.