By Dr. Matthew Smith (bio below)
In 1933, the story of Marie, an allergic walrus, was published in the Journal of the Medical Veterinary Association. Marie had been brought to a California zoo as a young pup and, lacking walrus breast-milk, her keepers fed her condensed cow’s milk instead. Marie reacted badly, suffering from asthma, eczema, conjunctivitis, hair loss and “gastric distress.” The symptoms continued until her teeth came in, allowing her to eat crustaceans and other suitable food, and soon she became a beloved zoo performer.
Marie’s story is a fascinating example of how, by the early 1930s, even veterinarians were aware of the role food could play in triggering allergic symptoms. This should not be terribly surprising. Long before the term allergy was coined in 1906 by the Austrian pediatrician Clemens von Pirquet (1875–1929), physicians recognized that food could cause strange reactions in their patients. None other than Hippocrates wrote 2,500 years ago that while cheese was an excellent food for most, others, when they ate it, “came off badly.”
Knowing that people have always suffered puzzling reactions to food may be reassuring for both food allergy patients and the parents of children who suffer from them. But the history of food allergy can go much further than that in helping us understand – or at least giving us the foundation to understand – some of the deeper issues inherent in this “strangest of all maladies.” In what follows, I list three “lessons” from my research on the history of food allergy, recently published by Columbia University Press as Another Person’s Poison: A History of Food Allergy.
1) Food allergy is (and always has been) controversial.
Marie’s vets do not appear to have given a second thought to the cause of her ailments. The reasons why they turned to allergy as an explanation are, sadly, not recorded. Perhaps one of her vets suffered from allergies, or maybe one of them knew an allergist – California was a hotbed of food allergy research and clinical practice during the 1930s. Such nonchalance belies how very controversial food allergy was throughout the twentieth century. While some physicians argued that food allergy was the hidden cause of much chronic suffering, ranging from asthma and eczema to migraine headaches and behavioral problems, others considered it to be “witchcraft, a fad, or a racket.”
At the heart of many of these disputes were disagreements about how to define food allergy. Some allergists preferred von Pirquet’s definition of allergy as “any form of altered biological reactivity,” an open-ended definition that allowed almost any symptom to be described as allergic. Others, however, demanded proof of immune system dysfunction before calling a reaction an allergy. This was difficult in cases of food allergy because skin testing was not as reliable a test for food allergies as it was for other allergies, such as hay fever or allergies to animal dander.
As such, food allergists turned to elimination diets to diagnose food allergies, a method that relied on a great deal of patient testimony. This was not scientific enough for many other allergists, and splits emerged between food allergists and their more conservative colleagues. Such gaps widened after the Second World War, when environmental theories of allergy emerged and vied with psychosomatic explanations for the phenomenon. Although the emergence of peanut allergy has distracted from these basic debates in recent decades, disagreement remains about what food allergies are, who actually has them, and how they should be prevented or treated. Knowing that food allergies have always been controversial might not help patients get better treatment, but it can be empowering: there have always been two sides to the story in food allergy.
2) Although food allergy has always been controversial, previous medical paradigms have been more willing to accept the possibility that food could cause bizarre symptoms.
Before the term allergy was coined in 1906, many physicians referred to strange reactions to foods as “idiosyncrasies.” And, while there were debates about the role of idiosyncrasies in triggering such reactions, on the whole they were generally accepted as a potential cause. One of the reasons for this was the overarching paradigm that had dominated medicine since the time of Ancient Greece and Rome: humoralism.
Humoral medicine rested on the idea that the body was kept healthy by balancing the four humors: blood, phlegm, black bile and yellow bile. These substances were characterized by their qualities: blood was hot and wet; phlegm was cold and wet; black bile was cold and dry; and yellow bile was warm and dry. Since each person naturally had different quantities of the humors, balancing them was a decidedly individualistic process. And one of the ways people could balance their humors was by adjusting their diet, since different foods also had different humoral qualities: cucumbers, for instance, were cold and wet, while dried pepper was hot and dry. Therefore, physicians prior to the twentieth century would have taken it for granted that people could react very differently to different foods.
This changed during the nineteenth century when medicine abandoned humoralism and began the quest to find specific causes for specific diseases and, to a degree, took the individual differences out of the equation. Also making a difference was the emergence of the medical laboratory as the source of medical knowledge. Physicians were no longer supposed to trust the knowledge they acquired in the clinic, but rather turn to evidence formulated in the laboratory to treat their patients effectively. Since such evidence was not always forthcoming with food allergy, food allergists continued to rely on their clinical experiences, and this was perceived as being old-fashioned. All this is to say that the context in which medicine operates makes a big difference with respect to how conditions like food allergy are understood and experienced.
3) In order to unravel the mysteries of food allergy, we have to be open-minded and imaginative.
Because of points 1 and 2, research into the ultimate causes of food allergy has been rather disappointing. Not enough researchers try to answer these “why” questions, leaving a vacuum for untested theories to flourish. The fact that some people have always reacted strangely to foods suggests that there is a timeless quality to food allergy. But, on the other hand, the rapidly rising rates of food allergy, and especially of dangerous allergies such as those to peanuts, indicate that some changes in the human environment, broadly speaking, must be responsible. Unfortunately, many of the explanations that have been put forth are even more controversial than food allergy itself: the hygiene hypothesis, breastfeeding practices, vaccination, chemicals in the environment – all of these are sources of enormous debate on their own. Linking them to food allergy makes for a toxic combination that a medical researcher would not want to touch with a ten-foot pole.
Yet, if we are to understand what is behind food allergy, we must begin to wrestle with such possibilities in an earnest, creative way. Peanut allergies were breathtakingly rare 30 years ago; now they are commonplace. Why? Few seem interested in answering these questions, despite what the answers might offer those suffering from allergies and despite what they might say about our immune system and its relation to the environment. Historians are always asking these kinds of “why” questions; why won’t medicine?
Bio: Dr Matthew Smith is a medical historian at the University of Strathclyde, where he is a member of the Centre for the Social History of Health and Healthcare. His previous books include ‘Hyperactive: The Controversial History of ADHD’ (Reaktion, 2012) and ‘An Alternative History of Hyperactivity: Food Additives and the Feingold Diet’ (Rutgers University Press, 2011). His research on the history of food allergy was funded by the Wellcome Trust.