
A vaccine tricks the body’s immune system into producing antibodies to fight a form of the virus that is not harmful. Then, if the person ever encounters the real and dangerous virus, the body is ready to prevent it from harming any cells.
An Idea in Search of a Method
If everyone in a room suddenly became exposed to the same disease in the same way at the same time, not everyone would be equally affected. One of the most important factors in determining how or whether a person gets sick is immunity—the human body’s own ability to prevent disease.
It has been recognized for centuries that some diseases never reinfect a person after recovery. Smallpox was the first disease people tried to prevent by intentionally inoculating themselves with infected matter. Inoculation originated in India or China some time before 200 BC.

The concept of immunization, or how to artificially induce the body to resist infection, received a big boost in 1796, when physician Edward Jenner inoculated a young boy in England and successfully prevented him from getting smallpox. Jenner used a lancet to scratch some infected material from a woman with cowpox (similar to smallpox) under the boy’s skin.

These smallpox inoculation devices illustrate both the simplicity of the idea and the complexity of the task. Left to right, from upper left corner: three examples of scab protectors (used after inoculation; early 20th century); two types of current disposable devices; bifurcated needles (a significant invention in 1968 because they used less vaccine and could be sterilized and reused); ivory vaccination points in glass carrier with wood shell (1900); vaccinator with metal carrying tube (19th century); spring lancet (1930s); glass and ivory points; round cowpox scab carrier (1860s, to transport vaccinating material); folding vaccinator (early 19th century); trigger vaccinator (1866); ivory-handled lancets with box (18th century); and drum vaccinator (19th century). The photograph shows a man with the distinctive smallpox blisters that often left permanent scars. Hugh Talman, photographer.
Can’t Catch This: Immunity and Immunization
Lack of immunity to disease has helped to decide the fate of entire communities, from smallpox among the Indians in the New World to syphilitic soldiers in the Old. Most people have some amount of natural immunity. The human body can take care of itself in many circumstances—cuts, colds, and minor infections disappear without major upheaval. In other cases, the body has little or no naturally occurring immunity, so if you are exposed to a disease such as polio, influenza, smallpox, hepatitis, diphtheria, measles, or whooping cough, you will probably get sick, unless you have been immunized.

Immunization refers to the artificial creation of immunity by deliberately infecting someone so that the body learns to protect itself. An important part of the history of immunization has been determining how to get the immunizing agent into the body. The skin, which keeps germs and other harmful substances out, is also a barrier to getting medicines and vaccines into the tissue where they can work. Physicians have used varying methods to create immunity where there is none.
The Skin Factor

While some scientists and physicians studied how the body worked and how to persuade it to fend off diseases, others puzzled over how to insert medicines and other substances such as vaccines. Having an effective vaccine that could produce sufficient immunity was useless without being able to get it into the body in a harmless way. Edward Jenner used a lancet and scratched two lines on James Phipps’s arm. Fifty years after Jenner, the hypodermic syringe became available. In 1885, scientist Louis Pasteur used one to vaccinate a young boy who had been bitten by a mad dog and was sure to die of rabies—the boy lived, and immunization took another giant step forward.
As more immunizing agents became available, people saw the benefit of immunizing large groups, such as soldiers. During World War I, they were vaccinated against diphtheria; during World War II, typhus and tetanus.
The Future Has a Past
In the 19th century, use of the hypodermic syringe was limited by dependence on large needles that could rust or snap in two, glass barrels that cracked, and tips that leaked. Before disposable needles in the 1960s, needles needed to be sharpened and sterilized. Since then, technological improvements include sharper, thinner needles and safety features. Still, more than a few people would like to avoid a shot in the arm.

Hypodermic injection remains the most common method of getting through the skin. But it is not the only technology for immunization. Engineers and scientists continue to search for alternative routes into the body, such as through the mouth or nose. And continuing to solve the technological problems is critical for countries in which illness and death rates are high as a result of measles, maternal tetanus, and other preventable diseases.
A successful instrument or system must get the vaccine into the body with minimal disruption, and be cost-effective for use with billions of people. And perhaps the most important problem today—preventing reuse of syringes to avoid cross-contamination—was not even imagined in the 19th century.
Virus, Vaccines, Verification

World War II accelerated vaccine development. Fear of a repetition of the 1918–19 world epidemic of influenza focused urgent attention on all viral diseases, while commercial production of antibiotics taught researchers to grow viruses with less microbe contamination.

Also, investigators paid closer attention to vaccine safety and effectiveness through clinical studies before release of a vaccine to the public, especially after the yellow fever vaccine apparently caused hepatitis B in many U.S. soldiers in 1942.
Early Research

Polio vaccine is made from the actual virus. For both research and production, vaccine makers needed to grow large quantities of virus. Influenza virus had been grown in chicken eggs, but this method did not work for poliovirus. So researchers sought other materials in which to grow poliovirus.

In 1936, Albert Sabin and Peter Olitsky at the Rockefeller Institute demonstrated that poliovirus could grow in human embryonic brain tissue, but they feared that this method might risk central nervous system damage in those who received the vaccine. The advantage of embryonic tissue, however, was that it grew quickly.
A Nobel Prize

In March 1948, John Enders, Thomas Weller, and Frederick Robbins used human embryonic skin and muscle tissue, grown in a nutrient mix with antibiotics, to prove poliovirus could infect tissue other than nerve cells. Their confirmation meant that researchers could now grow enough poliovirus to create large quantities of vaccine.
The three scientists won the Nobel Prize in Physiology or Medicine in 1954, the year polio vaccine had its first large clinical trial. Neither Jonas Salk nor Albert Sabin received a Nobel Prize for their work in creating vaccines.
Two Vaccines

From the early 1900s, researchers pursued two different kinds of polio vaccine. One used inactivated (killed) viruses. The other kind used live but attenuated, or weakened, virus. Jonas Salk was the leading proponent of the killed virus and Albert Sabin became the foremost proponent of the attenuated virus approach.
- At its peak incidence in the early 1950s, poliomyelitis occurred at a rate of 13.6 cases per 100,000 population. The incidence of cancer today, by comparison, is 566.1 per 100,000.
- Edward Jenner created the first successful vaccination for a disease—smallpox—in 1796. At the time of the polio clinical trials, there were three widely used vaccines: for yellow fever (1937), rabies (1885), and smallpox. Today there are over 300 vaccines for about thirty different diseases.
- There are two kinds of polio vaccine. IPV (Salk’s) is an injected shot used today primarily in the United States and Europe. OPV (Sabin’s) is given orally in drop form and used in global efforts to stop polio transmission.
The Salk Vaccine


The chief advantage of Salk’s killed virus vaccine was safety. If made properly, it could not cause disease. Its chief disadvantage was that the formaldehyde used in its manufacture caused the immune system to recognize killed virus differently from live virus, possibly risking a shortened period of immunity.



Results of trials with small numbers of children in 1952 encouraged the National Foundation for Infantile Paralysis to adopt Salk’s vaccine for a large-scale trial in 1954. Salk called his vaccine “Pittsburgh vaccine,” but reporters named it “Salk.”
Sabin and Salk

While the large-scale clinical trial with Salk vaccine went ahead in 1954, Albert Sabin continued developing his live-virus vaccine.

Like many researchers of the day, Sabin strongly disagreed with Salk’s approach of using injected, “killed” virus. He believed that long-term immunity could only be achieved with a live, attenuated—or weakened—virus.

In the race to develop a safe and effective polio vaccine, accidents occurred with both types. In 1955, for instance, insufficiently killed virus in the vaccine from Cutter Laboratories in Berkeley, California, infected some 200 children; many were paralyzed and several died. But the global end to polio transmission would have been inconceivable without both the “killed” (Salk) and “live” (Sabin) vaccines. Neither Jonas Salk nor Albert Sabin patented their vaccines; they donated the rights as gifts to humanity.
The Sabin Vaccine
An important feature of Sabin’s oral polio vaccine was that immediately after vaccination, people shed weakened virus in their fecal waste. This boosted immunity for others in the community and gradually reduced the number of people susceptible to poliomyelitis.
Between 1963 and 1999, Sabin live vaccine largely replaced Salk killed vaccine everywhere in the world. However, because the live virus in the vaccine occasionally became strong enough to cause actual disease, Salk killed-type vaccine has replaced the live type in the United States.
“I have studied the effects of our new lots of polio vaccine in 100 adult volunteers and during the next few days shall give it to my wife and 2 children as well as to our neighbors and their children.”
Albert Sabin, 1957
Sabin and the Cold War

Because Salk vaccine was used so extensively in the United States, Sabin had to go overseas in the late 1950s to find people for his clinical trials, in the Belgian Congo and, on a massive scale, in the Soviet Union. An American was able to conduct an extensive polio vaccine trial in the Soviet Union at the height of the cold war because the fear of polio was stronger than political differences.
“After getting satisfactory results of tests of your vaccine in 20,000 children we are going to prepare from your strains (1956) material for vaccination of 2–3 million people more, and after thorough laboratory tests of this vaccine, to use it in our country in 1959.”
Dr. Mikhail Chumakov to Albert Sabin, letter of December 26, 1958

In the first five months of 1959, ten million children in the Soviet Union received the Sabin oral vaccine. Albert Sabin received a medal in gratitude from the Russian government during the height of the cold war.
Killed or Live Vaccine?

Albert Sabin and other researchers, including John Enders and Hilary Koprowski, had argued that long-term immunity to polio could only be achieved with a live though greatly weakened virus, and that it must follow the same route of infection as wild-type poliovirus—through the mouth, and infecting intestinal tissue. Weakening the virus required passing it through a succession of animals—rats, mice, or monkeys. This allowed it to become more virulent for these hosts, and less so for humans. Hilary Koprowski carried out the first successful trial of weakened virus vaccine in February 1950.
Clinical Trials

The National Foundation for Infantile Paralysis chose Dr. Thomas Francis Jr. at the University of Michigan to implement the first mass polio vaccine trial in 1954. More than 300,000 people, mostly volunteers, including physicians, nurses, schoolteachers, public health officials, and community members, carried out the work.
Polio Pioneers

In 1954, almost 75 percent of reported poliomyelitis cases occurred in people under twenty years of age, and 50 percent in children under ten. The trial’s study population, then, targeted some 1.8 million children in the first three grades of elementary school at 215 test sites. In the double-blind experiment, 650,000 children received vaccine, 750,000 received a placebo (a solution made to look like vaccine, but containing no virus), and 430,000 served as controls and had neither. All were “Polio Pioneers.”

The study called for all children receiving vaccine or placebo to have three intramuscular injections over a five-week period. About 2 percent of the children also gave blood samples to verify their immune response.
Data from all 1,829,916 clinical trial participants were entered on IBM punch cards and tabulated. The study evaluated every scrap of evidence, from the registration methods of the participants to laboratory procedures to statistical analysis.
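The rounded group sizes given above can be checked against the exact tabulated total. A minimal sketch in Python, using only the approximate figures stated in the text (the variable and group names are illustrative, not from the original study):

```python
# Approximate "Polio Pioneer" group sizes from the 1954 Salk vaccine
# field trial, as rounded in the text above. The exact tabulated total
# was 1,829,916 participants.
groups = {
    "vaccinated": 650_000,         # received three injections of vaccine
    "placebo": 750_000,            # received a look-alike solution, no virus
    "observed_controls": 430_000,  # received neither; observed only
}

approx_total = sum(groups.values())
exact_total = 1_829_916

print(f"approximate total: {approx_total:,}")  # 1,830,000
print(f"rounding gap: {approx_total - exact_total}")  # 84
```

The rounded figures sum to 1,830,000, within a hundred or so of the exact count of 1,829,916, which confirms the three groups account for essentially the entire study population.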
“The news that began to pour out over the radio in the gym on April 12, 1955, the tenth anniversary of Roosevelt’s death, was news only in detail. That the field trials of the Salk vaccine would prove in some measure successful had been anticipated. Indeed, the assistant to the director at the previous hospital had remarked to me in November, ‘Too bad you didn’t wait a year. The vaccine sure looks good.’”
Edward LeComte, 1957
Medical Philanthropy

Three private organizations figured prominently in the history of poliomyelitis in the United States and worldwide: the Rockefeller Institute, the National Foundation for Infantile Paralysis (March of Dimes), and Rotary International.
Rockefeller University

Industrialist John D. Rockefeller founded the Rockefeller Institute for Medical Research in New York in 1901. Karl Landsteiner, who identified polio as a virus in 1908, joined the institute’s faculty in 1922, and studied human blood groups (for which he won a Nobel Prize in 1930).
Much of modern virology derives from the work of Rockefeller Institute investigators, including Simon Flexner, Thomas Rivers, and Peter Olitsky. Albert Sabin arrived in 1935 and joined them in poliomyelitis research. The institute became “Rockefeller University” in 1965 and continues to be a leading research center for the molecular biology of human diseases.
The March of Dimes

President Franklin Roosevelt established the National Foundation for Infantile Paralysis in 1938. Its hugely successful fund-raising campaigns collected enough money to fund John Enders’s laboratory, where poliovirus was first grown in nonneural tissue; both Jonas Salk’s and Albert Sabin’s vaccine development; the 1954–55 field trial of Salk vaccine; and the supply of free vaccine to thousands of children afterward.
In 1958, the foundation changed its focus to premature birth and the prevention of birth defects. In 1979, the organization officially changed its name to the March of Dimes. Its work continues today, under the slogan “Saving babies, together.”
Rotary International

Chicago lawyer Paul Harris called together a group of civic-minded professionals in February 1905 to found the first “Rotary” Club—taking its name from rotating meetings in members’ homes and offices. By 1922, Rotary Clubs existed around the world, prompting the name change to Rotary International.

Rotarians were well-represented at the United Nations Charter Conference and have maintained their UN ties ever since. In 1985, Rotary International committed itself to immunizing all children against poliomyelitis. This organization, with 1.2 million members in 166 countries, has been the largest private-sector contributor to the polio eradication campaign worldwide.
Originally published by Smithsonian Institution, reprinted with permission for educational, non-commercial purposes.