Host Defense and Parasite Evasion
Published in Eric S. Loker, Bruce V. Hofkin, Parasitology, 2023
The parasite avoidance behaviors of social insects such as ants and bees are legendary, perhaps explaining why they seem to have lost immune genes as compared to other insects. One example of how behavior can facilitate a more effective colony-wide defense against parasites is provided by the observation that uninfected ants (Lasius neglectus) frequently rub against colony members infected with the fungus Metarhizium anisopliae, in the process becoming exposed to a low level of infection. The exposure stimulates up-regulation of a set of immune genes that has the overall effect of boosting anti-fungal immunity specifically. Interestingly, the ants do not show evidence of boosted antibacterial responses. By virtue of this controlled, low exposure to the fungal pathogen, the ants are usually not killed and exhibit an enhanced ability to inhibit fungal growth. This process has been referred to as social immunization. It bears a striking similarity to a tactic taken by us humans in our historic fight against smallpox, namely deliberately exposing susceptible individuals to a low pathogen dose to induce immunity, a practice called variolation.
An Introduction to the Immune System and Vaccines
Published in Patricia G. Melloy, Viruses and Society, 2023
Many good reviews are available that discuss the history of vaccines and how they work (Graham and Sullivan 2018; Piot et al. 2019; Plotkin 2005). However, to understand the history of vaccines, one must go back to an earlier process known as variolation. In variolation, pus or scabs from a person who had smallpox (variola) were ground up and placed under the skin of, or inhaled by, another individual, thus protecting that individual from smallpox. Variolation has been documented as early as AD 1000 in India, while additional data suggest the practice could have originated in Central Asia or China. Legend has it that Lady Mary Montague witnessed the practice of variolation while living in Turkey and brought it to England (Pemberton 2014; Plotkin 2005; Zimmer 2011; Plotkin and Plotkin 2017). It is also worth noting that George Washington had his Continental Army undergo variolation against smallpox in the 1770s (Plotkin and Plotkin 2017; NPS 2022). However, there were risks associated with variolation, and it is estimated that 1%–2% of people who underwent variolation died (Medicine 2021).
The Early Middle Ages
Published in Scott M. Jackson, Skin Disease and the History of Dermatology, 2023
The Chinese were among the first peoples to record information about history's most significant skin diseases: leprosy and smallpox. As was discussed in Interlude 1, leprosy was first written about in the Feng zhen shi 封診式, "Models for sealing and investigating" (266–246 BCE), under a generic term for skin disease called li 癘. The Chinese correctly recognized anesthesia as a feature of that disease. The earliest description of smallpox is from the fourth century CE in China, but there is evidence that the disease was present in China some 600 years before that time.68 The version of the virus faced by the Chinese in the first millennium must have been rather virulent and lethal. It is likely that trade between China and Japan led to the introduction of smallpox into Japan and the subsequent epidemic of 735–737 CE, which killed a third of the Japanese people. Variolation, which the West learned about from the Ottoman Empire, first appears in a Chinese text written in 1549, although less convincing reports suggest that the practice dates back to 1000 CE.69 The Chinese method involved blowing smallpox material, kept at body temperature for one month, up the nose.70 There are no reports of a demography-altering pandemic of bubonic plague in China in the fourteenth century, and the first reports of syphilis did not occur in China until the sixteenth century. In addition to using mercury for various medical indications since 500 BCE, the Chinese wrote about scabies in ancient times and may have been the first to use sulfur to treat it.
Using Individuals as (Mere) Means in Management of Infectious Diseases without Vaccines. Should We Purposely Infect Young People with Coronavirus?
Published in The American Journal of Bioethics, 2020
One alternative strategy is to actually infect individuals with the live virus, which means making the individuals very likely to become sick with the associated disease. This is known as "variolation" and was widely used against smallpox in the 1700s, before Edward Jenner introduced the smallpox vaccine, in the form of an attenuated cowpox virus, in 1796 (Fenner et al. 1988). Giving limited doses of the virus, and giving them in certain ways (in the case of smallpox, via skin tissue), was very likely to cause a milder version of the disease, which was nonetheless enough to trigger the desired reaction by the immune system and confer protection against future smallpox infections. Variolation does entail some of the risks and discomfort of the disease for the individual who is infected, but there might be significant individual and public health benefits, especially if enough individuals get infected and create herd immunity, or at least contribute to speeding it up. When there is no vaccine (as was the case with smallpox before 1796), and assuming there is not enough cross-immunity from other diseases caused by similar viruses, people getting infected and then becoming immune is the only way to create herd immunity. If the public health benefit is significant enough, a cost-benefit analysis of this strategy at the collective level does not always rule out variolation.
The potential role of using vaccine patches to induce immunity: platform and pathways to innovation and commercialization
Published in Expert Review of Vaccines, 2020
Kamran Badizadegan, James L. Goodson, Paul A. Rota, Kimberly M. Thompson
The history of vaccine development includes exploration of vaccine delivery to humans through all possible routes of entry into the body using a wide range of strategies [1]. The earliest vaccination technique involved applying virus particles directly to disrupted skin (i.e. variolation with smallpox). Currently, although a limited number of licensed oral vaccines (e.g. oral poliovirus vaccine, oral rotavirus vaccine) and aerosol vaccines (e.g. FluMist™) exist [2], the use of a syringe and needle that carries the vaccine through the skin barrier represents the dominant vaccine delivery strategy. Delivery of vaccines by syringe and needle is generally well accepted by vaccine recipients, even though they may experience some fear (i.e. needle phobia), pain associated with receiving injections, and/or, in rare instances, injuries such as shoulder injuries related to syringe and needle vaccine administration [3] or adverse events from lyophilized vaccine reconstitution errors. Health systems also broadly accept syringe and needle delivery of vaccines and benefit from the interchangeability and stability of a supply chain supported by multiple suppliers. However, syringe and needle vaccine delivery requires the use of trained, skilled health-care workers to administer the vaccines, and even with sufficient training these workers face risks of needle-related occupational injuries. In addition, the disposal of used syringes and needles leads to system costs and risks.
Epicutaneous peanut patch device for the treatment of peanut allergy
Published in Expert Review of Clinical Immunology, 2019
Alexandra Langlois, François Graham, Philippe Bégin
While often perceived as new, the epicutaneous route is, in fact, one of the oldest approaches to immunotherapy. When Edward Jenner first tested his smallpox vaccine through abraded skin in 1796, he was himself replicating the process of variolation (inoculation with actual smallpox), which had been practiced in various forms in China since the sixteenth century, if not earlier [15,16]. The first use of the epicutaneous route to treat environmental allergies dates back to 1921, when Vallery-Radot conducted EPIT for horse allergy in asthmatic patients [17]. Further work in the 1950s–1960s in France and other European countries used a scarification technique called 'quadrille ruling' described by Blamoutier [18,19]. It showed superiority over subcutaneous immunotherapy in the treatment of pollen allergy [20]. After a long period of absence from the scientific literature, the approach was 'rediscovered' in the early 2000s for the treatment of environmental and food allergies [21,22]. At that time, one project aimed to develop a ready-to-use atopy patch test for delayed milk hypersensitivity [23,24]. This led to the development of what is now used as an EPIT patch device for food allergens [25–27].