Allergies were virtually unknown until the beginning of the twentieth century, yet allergic diseases have skyrocketed in the last twenty years, and well over half of all Americans are now allergic to at least one airborne substance. What's going on here?
At first glance, that big spike in allergies might seem a bit unlikely — particularly to people like me, who are lucky enough to not have to deal with allergies. What could cause such a massive uptick in the incidence of allergies? Is this a natural byproduct of the modern world we live in, or is this allergy scare all in our heads? The answer, it seems, might actually be somewhere in the middle.
Allergies are essentially the result of the immune system getting its wires crossed. While our antibodies are meant to spend their time killing any viruses or bacteria that enter the body, one particular type of antibody instead attaches itself to a harmless substance such as, say, pollen. This substance is known as an allergen, and the antibody is called immunoglobulin E, or IgE. Once IgE has formed in the bloodstream, any subsequent exposure to that particular harmless substance will cause an allergic reaction.
This phenomenon is an example of what's known as hypersensitivity, in which the immune system launches an inflammatory response to harmless stimuli based on this erroneous information from IgE that the allergens are in some way dangerous. The reaction can take a number of forms and be centered on a variety of different organs — allergic reactions in the lungs take the form of asthma, for instance, while eczema occurs when the allergic reaction is centered on the skin.
The most severe allergic reactions are known as anaphylaxis, and these occur when an allergen triggers the massive activation of white blood cells throughout the body. In extreme cases, the immune system's response can actually prove fatal if not properly controlled. But all allergies, whether they are to dust or food or dander, have their root in each person's particular IgE antibodies. Exactly why some people's IgE is so prone to reacting to allergens while others' is not isn't well understood — there's likely a genetic component to it all, but the precise mechanisms remain elusive.
The term allergy comes from the Greek, "allos" meaning an altered state and "ergon" meaning reaction — so, literally, an altered reaction. The term was coined in 1906 by the Austrian pediatrician Clemens Peter Pirquet von Cesenatico, who noticed that many of the patients to whom he administered horse serum and smallpox injections had much more severe reactions the second time around. This led him to coin the term "allergy" for a whole body of conditions and illnesses that had already been observed throughout the 19th century, as we will discuss in a moment.
There isn't much question that there indeed has been an increase in the incidence of allergies. As Dr. Jonathan Corren notes in 100 Questions & Answers About Allergies, 3% of all Americans had asthma in 1990, while the number is now 7%. Nasal allergies have jumped from 10% to 20%, skin-based allergies have gone from 5% to 8%, and anaphylaxis has increased in incidence from 1% to 3%. A 2008 study by the National Health and Nutrition Examination Survey showed that at least 58% of Americans are allergic to at least one airborne allergen. Similar increases can be seen throughout the developed world.
The most common explanation, not to mention the most compelling, is called the hygiene hypothesis. Children are born with an immune system that is unable to readily distinguish harmless substances from potentially deadly threats. Initially, the best way for the still-developing immune system to cope with this is through allergic reactions, which treat most substances entering the body as potentially hazardous and so reject them.
Throughout most of history, children grew up in an agricultural environment in which they were surrounded by a lot of, well, shit. Feces are the perfect breeding ground for microorganisms, and exposure to these viruses and bacteria forces the immune system to learn how to differentiate between real threats and imaginary ones. The immune system abandons its earlier allergic approach and focuses on repelling only genuine threats.
Now, because young people in the developed world grow up in such overwhelmingly sanitary conditions, they aren't being exposed to the sorts of mild infections and microbes that are vital to the development of a fully functioning immune system. This is causing allergic reactions that children used to grow out of at a very young age to endure right into adulthood. This likely explains the bulk of the increase in allergies, although nutrition, particularly a lack of vitamin D, and air pollution also seem to play a role.
In a weird way, allergies were once seen as a status symbol. As University of Wisconsin historian Gregg Mitman explains in his book Breathing Space, allergies first attracted serious attention around 1870 with the explosion of hay fever. This was a disease almost exclusively of the wealthy - indeed, it was actually thought that only the best and brightest had noses sensitive enough to detect all the irritants created by the Industrial Revolution, and thus only they could contract this new disease.
Yes, a runny nose was seen as the ultimate proof of a wealthy person's innate superiority over the lower orders. Of course, the hygiene hypothesis would suggest these were simply the only people in the 19th century who weren't surrounded by hideously unsanitary conditions, and so they were the lucky few whose immune systems did not have to develop quickly just for them to survive.
There was another reason why the hay fever sufferers of the 1800s were generally upper middle class or wealthier — they were the only ones who could afford the disease. Resorts sprang up in the country geared to the treatment of hay fever, where patients were meant to spend a month or so sequestered from the rigors of modern civilization and its filthy air. A physician or a lawyer could afford such an expensive cure, but poorer people had to silently endure their condition while continuing to work.
Still, the rich men and women of the Gilded Age and the generations that followed weren't really wrong in considering hay fever and other allergic reactions a byproduct of a more civilized world, even if they insisted upon throwing in the rather bizarre value judgment. They were indeed on the vanguard of a new breed of civilized human, albeit one defined less by a refined palate and more by an immature immune system. Its ranks would only grow over the next century.
Part of the problem with allergies is that they can essentially be a self-fulfilling prophecy. Let's think about this for a second — we know that, according to the hygiene hypothesis, people who aren't exposed to a lot of microorganisms and other immune-building stimuli are likely to retain their allergic reactions past the point where they would normally grow out of them. Let's imagine there are two children — one who is shielded from all such stimuli by parents concerned about the spread of allergies, and another who comes into contact with a more normal range of stimuli. Which child do you think is more likely to develop a lot of allergies?
This is particularly problematic with peanut allergies, and likely helps explain the explosion in incidence of the allergy over the past few decades. It isn't just that children in the developed world are now protected from microorganisms - concern over peanut allergies has led many parents to keep young children away from the foodstuff altogether, which doesn't give these kids a chance to grow out of the allergy, as most likely would with a normal amount of exposure to peanuts. Indeed, a British study of 10,000 children in 2008 showed early exposure to nuts lowers the risk of allergy rather than increasing it.
None of this is meant to suggest that severe peanut allergies don't exist, and this sure as hell isn't meant as medical advice. But, looking at general trends, it does seem that the concern over peanut allergies far outstrips the real threat, and, rather maddeningly, that very fear is actually making the allergy far more prevalent and dangerous than it would otherwise be. As Tara Parker-Pope wrote in a 2008 New York Times article, 3.3 million Americans are allergic to nuts, but there are only 2,000 hospitalizations per year for all food allergies, and just 150 deaths.
And those numbers probably aren't being held down by the emphasis on "peanut-free zones." According to Dr. Nicholas A. Christakis of Harvard Medical School, while basic precautions do make sense to protect those with genuinely serious allergy risks, there's no real evidence that attempts to completely prevent exposure to peanuts are actually effective in protecting children; in fact, this is exactly the sort of widespread avoidance that causes the allergy to become more prevalent.
Throw in the psychological impact of loudly advertising these "nut-free zones", which suggests that peanuts are a constantly lurking danger, and you have a legitimate medical phenomenon that is being fueled and expanded beyond reason by mass anxiety.
When considering the psychological (or indeed, psychosomatic) side of allergies, we should also consider how allergies are diagnosed in the first place. A particularly intriguing study on just that subject came out in 2010, when Stanford researchers released an analysis of 72 different food allergy studies. They found not only that there is no clear, agreed-upon definition of what constitutes a food allergy, but also that, in all likelihood, allergies were being misdiagnosed.
Food allergies are particularly tricky to diagnose because many apparent instances are actually examples of food intolerance. The most famous of these is lactose intolerance, which is not an allergic reaction because the immune system isn't involved - instead, lactose intolerant people lack the necessary enzymes in their gastrointestinal tract to properly digest milk and other dairy products.
The problem, according to the Stanford researchers, comes in when doctors administer allergy tests, which typically involve exposing the skin to various allergens to chart possible reactions. While these tests typically work decently well with non-food types of allergies, the researchers found that fewer than half of all positive tests in people with unclear symptoms had actually accurately diagnosed a food allergy. In every other instance, the allergy test had generated a false positive.
There are a couple of problems with this. For one, on an individual level, food allergies aren't a fun thing to live with. They often require strict monitoring of one's diet and even one's surroundings to ensure there's no exposure to allergens, so a false diagnosis imposes that burden needlessly. On a broader level, the researchers argue this overdiagnosis of allergies could cause people to see food allergies as something trivial, on the premise that if seemingly everyone has at least a mild food allergy, then they can't be anything too serious. And, conversely, this perceived pervasiveness of food allergies could lead to the sort of preemptive avoidance that helps children develop their allergies in the first place.
So what are we left with here? On balance, allergies certainly are very much real, they probably really are on the rise in the developed world, and, what's more, there are clear, logical explanations for this increase that fit with historical trends going right back to the 19th century. Still, that hardly precludes the possibility that the threat of allergies is somewhat overblown, and indeed that increased anxiety only serves to make the problem worse in the long run. So it might be best for us, as a society, to chill out a bit about the whole thing. More often than not, they really are just allergies - and our immune systems are built to grow out of them anyway.
Breathing Space: How Allergies Shape Our Lives and Landscapes by Gregg Mitman
Allergy: The History of a Modern Malady by Mark Jackson
100 Questions & Answers About Allergies by Jonathan Corren, MD
"Diagnosing and Managing Common Food Allergies: A Systematic Review" by Jennifer J. Schneider et al.