A Brief History of Allergies: From Ancient Times to Today

An allergy is fundamentally an immune system overreaction to a substance that is typically harmless to most people, such as pollen or peanuts. This physiological misfire, known as a hypersensitivity reaction, causes a range of symptoms from mild irritation to life-threatening collapse. The journey to understanding this phenomenon has taken millennia, moving from simple ancient observations of unusual reactions to a modern, highly specialized field of immunology. Tracing this history reveals how humanity struggled to name, identify, and manage a condition that has become increasingly common in the modern world.

Ancient Observations of Hypersensitivity

Descriptions of adverse reactions to certain foods or environmental factors appear in medical texts dating back thousands of years. These early records acknowledged the existence of “idiosyncrasies” as part of the human condition, even though the underlying mechanism was unknown. For example, the death of the Egyptian Pharaoh Menes in approximately 2641 BC, reportedly after a wasp sting, may be the earliest recorded instance of a fatal anaphylactic reaction.

The Ebers Papyrus, an Egyptian medical text dating to around 1550 BC, contains descriptions of what are believed to be treatments for asthma, suggesting respiratory hypersensitivity was already a recognized ailment. Centuries later, the Greek physician Hippocrates noted that certain foods could cause digestive upset or hives in specific individuals. These observations were understood as peculiar individual susceptibilities rather than as expressions of a shared underlying mechanism.

The Coining of “Allergy” and Foundational Theories

The shift from simple observation to scientific classification began in the early 20th century. In 1902, French scientists Charles Richet and Paul Portier studied sea anemone venom and found that dogs injected with a small dose experienced a severe, often fatal, reaction upon receiving a second, even smaller dose weeks later. They named this unexpected phenomenon “anaphylaxis,” meaning “against protection,” the opposite of the protective state implied by prophylaxis. Their work established that a second exposure to a foreign substance could lead to a catastrophic systemic reaction, though they initially viewed it as a failure of immunity rather than an overzealous immune response.

Just four years later, in 1906, Austrian pediatrician Clemens von Pirquet introduced the term “allergy.” Derived from the Greek words allos (“other”) and ergon (“work,” or activity), the word denoted “altered reactivity,” and his definition was initially much broader than the one used today. Von Pirquet used “allergy” to describe any altered reactivity of the organism to a foreign substance, encompassing both the protective immunity conferred by a vaccine and the hypersensitivity seen in hay fever or serum sickness.

Von Pirquet’s conceptual framework unified a range of formerly disparate conditions, from asthma and eczema to delayed reactions like the tuberculosis skin test, under the umbrella of altered immune responsiveness. This conceptualization provided the theoretical underpinning for the decades of research that followed, linking immunity and hypersensitivity as two sides of the same biological coin.

Development of Modern Diagnostic Tools and Immunotherapy

The conceptual breakthroughs of the early 1900s paved the way for practical medical tools to diagnose and treat allergic disease. The earliest forms of diagnostic testing involved applying an allergen directly to the skin to provoke a localized reaction. In 1873, British physician Charles Blackley performed the first known scratch test on himself using pollen, demonstrating that pollen itself, rather than other suspected causes, triggered the symptoms of hay fever.

The modern skin prick test, a quick and reliable diagnostic method, evolved from earlier techniques like the scratch test and more invasive injections. Concurrent with diagnostic advances, the first attempts at treatment aimed to desensitize the patient to the offending substance. In 1911, British physicians Leonard Noon and John Freeman pioneered allergen immunotherapy, or “allergy shots,” by injecting patients with increasing concentrations of grass pollen extract. Their work established the foundational principle of desensitization, which remains the basis for subcutaneous immunotherapy today.

A major mechanistic leap occurred in the late 1960s with the simultaneous discovery of Immunoglobulin E (IgE) by two independent research teams. IgE was identified as the specific antibody responsible for immediate hypersensitivity reactions. This discovery provided a biochemical explanation for the mysterious “reagins” that had been shown to transfer allergy via serum since the 1920s. The identification of IgE led directly to the development of blood tests, such as the radioallergosorbent test (RAST), which allowed clinicians to accurately measure specific IgE antibodies in a patient’s blood, revolutionizing diagnosis in the 1970s.

Understanding the 20th Century Allergy Epidemic

The latter half of the 20th century saw a dramatic increase in the prevalence of allergic diseases across industrialized nations, a trend often termed the “allergy epidemic.” This rise in asthma, hay fever, and food allergies was particularly noticeable in countries that had undergone rapid “Westernization” following World War II. Epidemiologists began searching for environmental and lifestyle factors that could explain this sudden shift in population health.

One of the most influential theories to emerge was the Hygiene Hypothesis, proposed by David Strachan in 1989. Strachan observed that children from larger families, who had greater exposure to infections from older siblings, had a lower risk of developing hay fever. The hypothesis suggested that reduced exposure to common childhood infections and environmental microbes in early life prevented the immune system from developing tolerance, causing it to overreact to harmless substances.

The theory was later refined and expanded into concepts like the “Old Friends Hypothesis,” which focuses on the lack of exposure to ancient microorganisms that co-evolved with humans. Modern lifestyle changes, including widespread antibiotic use, smaller family sizes, changes in diet, and a shift toward indoor living, are all seen as factors that have decreased microbial diversity early in life. These findings highlight that the increased incidence of allergies is a public health phenomenon linked to the historical changes in our environment and way of life.