Credits
Theresa MacPhail is a medical anthropologist, former journalist and associate professor of science and technology studies. She is the author of “Allergic: Our Irritated Bodies In a Changing World” (Random House, 2023), from which this piece has been adapted.
Elizabeth, an engineer in her late 30s, has three children, all with some form of allergy. Her eldest daughter, Viola, 12, had eczema as a baby and now has environmental allergies to pollen as well as food allergies to corn, tree nuts and peanuts.
Her youngest son, Brian, 3, also had eczema as a baby and subsequently developed allergies to peanuts and barley, though Elizabeth fears there could be more. Her middle daughter, five-year-old Amelia, had a dairy allergy as an infant, but is now just lactose intolerant. She’s the easiest of the three, at least in terms of allergy.
By the time I hear her story, Elizabeth is already a veteran at dealing with her children’s irritated immune systems. She began a support group for parents of children with corn allergies and is heavily involved in trying to educate other parents about food allergies.
The parents in the group share their theories about why their children have allergies. Elizabeth’s own theory is that Viola and Brian both went to the emergency room with high fevers as babies and were given precautionary antibiotics. She blames the antibiotics for altering her children’s gut microbiomes and herself for agreeing to the treatment in the first place.
Part of Elizabeth’s rationale is that no one else in her family has allergies. In fact, it’s so rare that her parents initially didn’t believe the diagnoses. They argued that “back in their day,” everyone ate everything and was fine; food allergy was made-up nonsense. But when both Viola and Brian landed in the ER repeatedly for food-related anaphylaxis, her parents realized these allergies were indeed “real.”
Elizabeth’s family’s routines have been upended. “My life revolves around cooking for them,” she explains. “We don’t eat out. We don’t trust people preparing their food.” Instead, Elizabeth gets up daily at 6:30 a.m. to cook a breakfast that avoids allergens for all three kids. Then she cooks and packs their lunches, preparing everything from scratch because most packaged foods contain at least one ingredient that one of her children will react to.
On a recent vacation with four other families, Brian ended up in the ER with anaphylaxis due to cross-contamination. Elizabeth says she will never share a house again if she’s not the “cleaning boss,” which means not only being continuously vigilant about the foods prepared in the kitchen but also thoroughly cleaning and wiping down anything that an allergy-inducing food touches. It’s a labor of love and worry in equal measure.
Brian’s allergies are the most severe. Though only a toddler, he knows some foods are dangerous. “I’ll ask him, ‘Do you know why you can’t have that?’” Elizabeth said. “And he’ll say, ‘Yes, Brian allergic. Makes me owie. Mommy give me shot and we go to hospital.’ He remembers the EpiPen. He remembers it because those things hurt. It’s an inch and a half needle jabbed into you.”
Brian runs away whenever he sees Elizabeth pack an EpiPen, a potentially life-saving device for allergy sufferers, into one of their bags. She says it makes her feel like she’s the biggest monster in the world. Not only because of her son’s reaction, but also because she ultimately feels responsible for his allergy.
Although allergy researchers may disagree on definitions, symptoms and methodology, all agree on one thing: Allergies have grown worse over the last few decades, and the staggering number of allergy sufferers worldwide is likely to continue growing. An estimated 235 million people worldwide have asthma, and anywhere from 240 to 550 million people globally may suffer from food allergies. Drug allergy may affect up to 10% of the world’s population.
There’s a consensus, looking at the last century’s data, that U.S. hay fever rates increased in the mid-20th century. Data suggests that the incidence of asthma increased beginning in the 1960s, peaking sometime in the 1990s. Since then, asthma rates have remained fairly constant. Respiratory allergic diseases and atopic sensitization (or skin allergy) have likely increased over the last few decades. But the most dramatic and visible increase has been the rise in global incidence rates for food allergies, which began in earnest in the 1990s and has grown steadily ever since.
There are, unsurprisingly, multiple theories about the cause. The hygiene hypothesis is one front-runner, positing that people who are “too clean” develop allergies. Many others think it’s our diet, that changes in the way we grow and prepare food have altered our gut microbiome, fueling allergies. Still others argue that manmade chemicals and plastics we encounter daily are making our immune systems more irritable.
“Although allergy researchers may disagree on definitions, symptoms and methodology, all agree on one thing: Allergies have grown worse over the last few decades, and the staggering number of allergy sufferers worldwide is likely to continue growing.”
What everyone agrees on is that the environment’s influence on our genes, or epigenetics, has played a large role in the rise of allergies, as has the makeup of our nose, gut and skin microbiomes. In the end, it appears, we are at least partially doing this to ourselves. Modern living is likely at the root of the recent rise in allergies.
Our Changing Microbiomes
If you want to better understand how our modern lifestyles might be behind some of our biggest problems with allergy, you will end up talking to a diminutive, deeply intelligent, empathetic woman named Cathryn Nagler, who is one of the best immunologists in the world. Her decades of research have focused primarily on the role our gut microbiome plays in the development of children’s food allergies. She remembers when food allergy rates first began climbing in the late 1980s.
“I saw it myself,” Nagler said, loading graphs onto her computer in her University of Chicago office on a sunny spring afternoon. “I have kids that are 23 and 27, so I followed this in real-time because cupcakes were excluded from the classrooms as my kids went through school. Right around the late ‘80s and early ‘90s, when food allergy rates were starting to increase, the American Academy of Pediatrics said to withhold peanuts and allergenic foods from pregnant mothers, from nursing mothers and from children with risk of allergy until they’re four years old. That was exactly the wrong advice, and that fueled the fire and caused even more increase. Now all of the push is for early introduction.”
Nagler is referencing the now-famous Learning Early About Peanut Allergy (LEAP) study, conducted by researchers in the United Kingdom and the United States, led by Dr. Gideon Lack at King’s College London and published in the New England Journal of Medicine in 2015. The study found that decades of erroneous advice to parents to avoid giving children younger than three years old anything containing peanuts had likely led to a massive increase in the incidence and severity of peanut allergy.
Infants enrolled in the study (four to 11 months old) were randomly assigned to two groups: parents in one group would continue to avoid peanuts; parents in the other would introduce peanuts to their children right away. Infants in both groups were given skin-prick tests for peanut sensitivities. Among those who tested negative, the prevalence of peanut allergy at 60 months of age was 13.7% in the peanut avoidance group and merely 1.9% in the peanut consumption group. Among those who had tested positive for sensitivity to peanuts, the prevalence of peanut allergy was 35.3% in the avoidance group and 10.6% in the consumption group.
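For readers who want to see what those percentages imply, here is a minimal, purely illustrative sketch in Python. The group prevalences are the LEAP figures quoted above; the risk-reduction arithmetic is my own back-of-the-envelope calculation, not the study’s analysis code.
    # Illustrative arithmetic only: group prevalences are the LEAP figures
    # quoted above; this is not the study's own analysis.
    def risk_reduction(avoidance_pct, consumption_pct):
        """Return (absolute, relative) reduction in peanut-allergy prevalence."""
        absolute = avoidance_pct - consumption_pct       # percentage points
        relative = 100 * absolute / avoidance_pct        # percent of baseline risk
        return absolute, relative
    for label, avoid, consume in [
        ("skin-prick negative at enrollment", 13.7, 1.9),
        ("skin-prick positive at enrollment", 35.3, 10.6),
    ]:
        abs_red, rel_red = risk_reduction(avoid, consume)
        print(f"{label}: {abs_red:.1f} percentage points lower prevalence; "
              f"about {rel_red:.0f}% lower relative risk")
By that arithmetic, early introduction was associated with roughly an 86% lower relative risk of peanut allergy in the sensitivity-negative group and roughly a 70% lower relative risk in the sensitivity-positive group.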
A recent study in Melbourne, Australia, found that changes to dietary advice on peanuts in 2016, following the success of the initial LEAP study, had led to a 16% decrease in peanut allergy among infants. The evidence strongly suggests that introducing peanuts to infants early has a protective effect.
Nagler understands why parents might be hesitant, however, to introduce allergens into the diet early. After all, why would anyone trust the same people who gave them incorrect advice just a few years prior? Plus, she doesn’t think there’s definitive evidence that early introduction is good.
“You can be sensitized even before the first introduction of solid food,” Nagler explains. “Kids get allergic responses within the first month of life. That means they could have been sensitized by breast milk or by the skin. If you give early introduction to a kid like that, that kid is going to have an allergic response. So early introduction is risky, but now we know withholding is not good either.”
So how does the body learn to tolerate some foods and begin to react negatively to others? Nagler is convinced that food allergy as a phenomenon is part of a generational change.
“People will tell you that there is no history in their family of this,” she explains. “From parents with no family history of allergy to kids that have life-threatening responses to a crumb. You can be allergic to any food. Your allergy can develop at any point in your life. It used to appear between the ages of two and five. Now we’re getting a lot more adult-onset food allergy. It used to be that milk, eggs, wheat allergy were outgrown. Now they’re lasting into adulthood.”
In other words, food allergies signal a larger problem.
Nagler stops on a slide listing what is likely contributing to our immune system’s malaise: diet, C-section births, changes in food production and breastfeeding practices.
“In the end, it appears, we are at least partially doing this to ourselves. Modern living is likely at the root of the recent rise in allergies.”
“The idea is that modern industrialized lifestyle factors have triggered shifts in the commensal bacteria,” Nagler says, referencing the so-called friendly bacteria that exist within and alongside us. “Inflammatory bowel disease, allergies, obesity, autism — all non-communicable chronic diseases. They’ve all been linked to the microbiome.”
And there it is: Nagler’s answer to the all-important question of why allergies are rising. Changes to the makeup of our gut microbiome — or all the bacteria, fungi and viruses that help process our food into usable fuel for our cells — are driving changes in immune function.
Recent studies have highlighted the connection between our diet, our use of antibiotics and our gut bacteria in the development of allergies. A 2019 study led by Nagler showed that the guts of healthy infants harbored a specific class of allergy-protective bacteria not found in infants with cow’s milk allergy. This was followed by a study at Brigham and Women’s Hospital that found that five or six specific strains of gut bacteria in infants seem to protect against developing food allergies. A lead researcher on that study, Dr. Lynn Bry, surmised that our lifestyles are, for better or for worse, capable of “resetting the immune system.”
Another study found that higher levels of cheese in our diet may inadvertently worsen allergy symptoms because bacteria in some cheeses produce histamine — the naturally occurring compound that helps trigger inflammatory immune responses. University of California, San Francisco researchers led a study that discovered a link between three species of gut bacteria and the production of a fatty molecule called 12,13-diHOME. That molecule lowers the number of regulatory T cells in our gut, cells crucial for keeping inflammation at bay. The researchers found that babies with higher levels of these three bacteria had an elevated risk for developing allergy and asthma.
Ultimately, most of us living in the 21st century have changed our microbiome makeup. Our diets are the real culprit, according to Nagler. When we go from eating foods with lots of fiber to highly processed foods loaded with sugar and fat, we end up starving beneficial bacteria in our gut.
“We’ve co-evolved with our microbes,” Nagler says. But “they can’t live without their food.”
There’s also our use of antibiotics, which kill off not only the bacteria that cause strep throat and sinus infections but also our gut bacteria. And we eat meat from animals that have been given low-dose antibiotics to fatten them. We’re experimenting on ourselves and our microbiomes, Nagler says, to deleterious effect.
Nagler has developed the “barrier regulation” hypothesis, which theorizes that our gut and skin microbiomes regulate what is and isn’t allowed into the body. Commensal bacteria on the skin and in the gut are integral to maintaining barrier function. Nagler explains that a single layer of epithelial cells is all that stands between our bodies and everything we inhale or ingest.
Indeed, in 2018 researchers discovered a link in mice between a gene coding for an antiviral protein in the gut, changes in the gut microbiota, greater intestinal permeability and severe allergic skin reactions. Gut microbiomes are an intricately balanced mix of different species of bacteria, viruses and fungi, and mice lacking the gene for the antiviral protein had a changed microbiome — that is, altered amounts and types of those microbes.
This suggests that our immune systems have developed ways of coping with the microbes in our gut and maintaining balance. When the composition of the microbiota changes, the responses of those immune components shift, making us more miserable in the process. This is evidence of how our genes and the environment (changes to the gut microbiota) interact to produce allergy; it also supports Nagler’s larger point that an altered gut microbiota can have a direct effect on allergy.
Avery August, an immunologist at Cornell University, describes human immune cells as curators of the human body — constantly sensing everything we encounter and making millions of micro-decisions about what should become a part of the human body or coexist with us and what should not.
The barrier regulation theory dovetails nicely with the conception of our entire immune system — microbiome included — as curators of what can and cannot be part of us. Without the regulation that those barrier cells provide, entire proteins can pass through our skin or gut into the bloodstream, where they encounter our immune cells.
“When we go from eating foods with lots of fiber to highly processed foods loaded with sugar and fat, we end up starving beneficial bacteria in our gut.”
The allergic person’s immune system is wholly functional; it is doing what it was meant to do. For Nagler, the problem is that it’s performing a job different from the one it was initially trained to do. From this perspective, allergic disease is a barrier problem, not necessarily an immune system problem.
All creatures, even invertebrates, have an associated microbiota, Nagler explains, which performs vital physiological functions. Without microbiota, there would be no life at all. The human gut encounters antigens from a hundred trillion — or 100,000,000,000,000 — commensal microbes and from more than 30 kilograms — or 66 pounds — of food proteins per year. The cells making up the gut barrier must distinguish between what is harmful — pathogens like invading bacteria or viruses — and what is harmless, like the antigens in food.
To Nagler, Elizabeth’s theory blaming antibiotics for her children’s food allergies isn’t so far-fetched. Changes in the gut microbiome in infants and children can lead to a greater risk of developing allergic responses as children age. And our children’s earliest environments are likely the most crucial.
The microbiome has been shown to be remarkably stable by age three; alterations before this age seem to be critical to whether allergies ultimately develop. A study led by France’s Pasteur Institute found evidence in mouse models that the gut microbiota helps build a healthy immune system during a window corresponding to roughly three to six months of age in humans, around when most babies are first introduced to solid foods.
Bacteria in the gut increased 10- to 100-fold after solid foods were introduced. This stage of rapid microbiome growth and development seems to determine one’s susceptibility to inflammatory disorders like allergy and autoimmune disease in adulthood; when the process goes awry, the researchers call the result “pathological imprinting.” Antibiotics could theoretically disrupt this developmental stage, producing a greater risk of allergic diseases.
So far, scientific evidence appears to back this up. Research by Rutgers University and the Mayo Clinic found that children under age two who are given antibiotics are at greater risk for asthma, respiratory allergies, eczema, celiac disease, obesity and ADHD. The study looked at 14,572 children born in Olmsted County, Minnesota, between 2003 and 2011. If antibiotics were given in their first six months, the risk increased dramatically.
Researchers found that 70% of the children in the study had been prescribed at least one antibiotic in the first 48 months of their lives (typically for respiratory or ear infections). Another study found that antibiotics can allow for the growth of non-pathogenic fungus in the human gut, which may make respiratory allergies more severe. Finally, related studies of Finnish and New York babies found that C-sections and antibiotics correlated with altered gut microbiomes and a greater risk of allergies in childhood.
These findings don’t surprise Nagler. Vaginal births give infants what are known as “founder bacteria,” she tells me. As the baby moves through the vaginal canal, it is exposed to its mother’s friendly bacteria. Breastfeeding introduces more helpful bacteria into the infant’s gut.
“If you skip over both of those processes, which many people have done, you’ve disordered the microbiome,” Nagler explains. “The first 100 to 1,000 days of life are absolutely critical for the development of the immune system.”
Research has shown that babies born by C-section not only miss exposure to the harmless vaginal founder bacteria but are also exposed to potentially harmful hospital bacteria. One recent review found that lactobacillus-containing probiotics — the same bacilli found in breast milk — lowered SCORAD (Scoring Atopic Dermatitis) scores for children under age three with moderate to severe atopic dermatitis, the more severe form of eczema.
Breastfeeding for the first three months of life has also been linked with a lower risk of respiratory allergies and asthma. In a study of 1,177 mother-child pairs, breastfed babies had a 23% lower relative risk of allergies by age six and a 34% lower relative risk of asthma if there was no family history of asthma.
But for children whose mothers supplemented breast milk with formula, the protective effect seemed to mostly disappear. (Important aside: If you’re a mother and you’re panicking a bit right now, please don’t. There are many valid reasons to have C-sections and to choose formula over breast milk. A lot of this is complicated, and there’s a lot we still don’t know about these interactions.)
“Changes in the gut microbiome in infants and children can lead to a greater risk of developing allergic responses as children age. And our children’s earliest environments are likely the most crucial.”
Nagler reminds me that the cattle industry has been giving cows low doses of antibiotics for years to make them fatter and more commercially viable. We also eat highly processed food that’s low in fiber, with added sugars and fats. That means that the food we’re introducing into our guts is different from what our ancestors ate for millennia. That, of course, affects the types of bacteria that can flourish inside us.
Even simply changing bedsheets can change our microbiomes. Researchers in Denmark and the U.K. looked at dust samples from 577 six-month-old infants’ beds and compared them to respiratory samples taken from 542 of those infants at three months of age. They found 930 different types of bacteria and 103 genera of fungi.
There was a correlation between the bacteria in the bed dust and those found in the infants’ airway samples; while the two populations were not identical, they did seem to directly affect each other. An increase or decrease in respiratory bacteria mirrored an increase or decrease in the bacteria in the infants’ beds. The research suggests that less frequent changing of bed linens may benefit our nasal and airway microbiomes.
Many of the researchers I spoke to longed for a return to a simpler, less technologically driven way of life, mostly centered on the foods we consume and how we produce them. One top allergist dreamt of performing the ultimate controlled study to prove that our modern lifestyle and habits negatively affect our immune systems.
“Imagine,” he said, “if we could get a group of people to revert back to a much older way of life. Eat foods grown without pesticides. Eat whole foods and a wide variety. Don’t use dishwashers or detergents. Do you know what would happen? No more allergies. I just wish I could prove it.”
The Canaries In Our Coal Mines
The most compelling evidence that our 21st-century lifestyles and manmade environmental changes have spurred our allergies is this: Our companion species of thousands of years — dogs, cats, birds and horses — all get allergies regularly. Other species — those that do not live in our homes or alongside us — do not.
Our pets’ symptoms are very similar to ours: sneezing, snoring, asthma, vomiting and over-grooming in cats; skin eruptions, persistent scratching and grooming in dogs; coughing and wheezing in horses. And they likely have allergies for the same reasons we do. After all, their immune systems are exposed to the same panoply of natural and chemical substances. The top allergen in dogs? Dust mites. The top allergen in horses? Their human-packaged feed. Cats are often allergic to grass, tree and weed pollen. Cats and dogs can also be allergic to human dander, since we shed skin, too. Sound familiar?
Many owners spend a lot of time and money trying to eradicate allergy symptoms in their pets. The most common methods are the same as for humans: pets either take antihistamines and steroids or undergo immunotherapy shots. The trouble is that we don’t know just how big the problem is, because we don’t have good data on pet allergies or their incidence. We don’t know if rates are increasing or if vets and pet owners are just getting better at recognizing the signs.
To better understand how and why allergies affect our pets, I traveled to Cornell University’s College of Veterinary Medicine to speak with Elia Tait Wojno, who started her career doing work on parasitic worms and immune responses.
She explains that the immune response to parasitic worms resembles the immune response at work in allergies, in both humans and dogs. (Of course, in the case of worms, those responses are protective; in the case of allergies, they are what cause the miserable symptoms.) By studying the immune response to helminths (a type of parasitic worm) in dogs, we can learn a lot about the basic immune functions involved in allergy.
Working with dogs allows us to observe how allergies function in something other than mouse models. For decades, mice have been the dominant research organism in the field of immunology. But mice aren’t humans, and mouse models aren’t always the best predictors of what will happen in a human body. That is why there’s growing interest among allergy researchers in moving beyond mouse models. Since some larger animals, like cats and dogs, develop allergic disease naturally, they might be good models for learning about basic immunology across species as well as for testing drugs for allergic conditions.
“As the baby moves through the vaginal canal, it is exposed to its mother’s friendly bacteria. Breastfeeding introduces more helpful bacteria into the infant’s gut.”
Unlike mice, which are confined to the lab, usually inbred and raised in very controlled environments, the dogs that Tait Wojno works with are born the old-fashioned way. She works with breeders to enroll dogs in her studies, and the dogs are treated like pets because they are pets; these aren’t lab animals, and they live at home with their owners. This is an important detail, since it allows researchers to consider which components of our shared environments, habits and medical practices might be affecting our companion species as well as us.
Allergies in our pets offer potential clues to solving the mystery of allergies. If we can understand the early immune response in animals, then we might be able to better understand it in humans. And that’s one of the things we really don’t understand in any mammal — the immune system’s initial reaction to something it encounters and the cascade of events that follows. Ultimately, my visit to Cornell convinced me of one thing: Our pets are the literal canaries in our figurative allergy coal mines. The fact that our intimate companions have allergies is a sign that something humans are doing is irritating the immune systems of us all.
The Dirt On Cleanliness
You’re likely already familiar with the most widely espoused theory of allergy causation: the idea that being “too clean,” or overly hygienic, is bad for the childhood development of a properly functioning immune system. Maybe you’ve heard that it’s good to let children play in the dirt, get a little messy and slobber on each other. So goes the basic idea behind the hygiene hypothesis, first posited to explain the explosion of asthma, eczema and food allergies in the last half of the 20th century.
In 1989, epidemiologist David Strachan published a short article in the British Medical Journal (BMJ) entitled “Hay fever, hygiene, and household size.” Using data from a national sample of over 17,000 British children born during the same week in March 1958, he looked at three things: how many of the study participants self-reported symptoms of hay fever at age 23; how many had hay fever reported by their parents at age 11; and whether, when participants were seven, their parents recalled the child having eczema in the first year of life.
Strachan found that younger siblings seemed most protected from developing hay fever or eczema, despite differences in socioeconomic class. Strachan posited that the lowered allergy rates might be explained, “if allergic diseases were prevented by infection in early childhood, transmitted by unhygienic contact with older siblings, or acquired prenatally from a mother infected by contact with her older children.”
Smaller family sizes, improvements in housing, and higher cleanliness standards might have combined to reduce the opportunity for children’s exposure to a wide variety of microbes. In other words, Strachan’s findings suggested that mild childhood infections might be beneficial to a developing immune system.
At first the idea was rejected; many immunologists still believed that bad infections could trigger allergy, especially asthma. But Strachan’s ideas were eventually adopted and popularized after researchers discovered that IgE-mediated (antibody-driven) allergic immune responses were driving many allergic conditions. It seemed plausible that a lack of early exposure to certain germs was the underlying problem, leaving the immune system “untrained” and hyper-responsive later in life.
Early work on the microbiome and friendly commensal bacteria, Nagler co-wrote in a 2019 review, “led to a reformulation of the hygiene hypothesis as the ‘old friends’ or the ‘biodiversity’ hypotheses of allergy, which proposed that changes in the environment, diet and lifestyle associated with Westernized, industrialized countries have altered the diversity of the gut and skin microbiomes.”
The “old friends” hypothesis posits that humans are more at risk of chronic inflammatory diseases, like allergies or autoimmune disorders, because we no longer regularly encounter some of the microorganisms that we evolved alongside for millennia. These “old friends,” the theory goes, helped regulate our immune function. Their risk to human health was minimal, and a healthy immune system could easily keep them in check. This trained the developing human immune system, making it more robust and adaptive to its normal environment.
In the absence of these old friends, our immune system lacks the early training it needs to better self-regulate and overreacts to otherwise harmless stimuli, like pollen or dust mites.
Nagler explained to me how the hygiene hypothesis and the idea of microbes as old friends combine to produce an almost idyllic conception of farm life and the “farmhouse effect.” Farmhouses, with their tilled soil, muddy barns and stables, and fertile fields, come with a lot of bacteria, viruses and parasites.
“The most compelling evidence that our 21st-century lifestyles and manmade environmental changes have spurred our allergies is this: Our companion species of thousands of years — dogs, cats, birds and horses — all get allergies regularly. Other species — those that do not live in our homes or alongside us — do not.”
But if you alter the environment, you alter the microbiota. If you have better sanitation, move away from farms and have fewer children, then you cut off your supply of a richly diverse microbiota. You become, in essence, less intimate with microbes in your day-to-day life. And intimacy with friendly germs, especially in the first few years of life, does seem to protect against a wide variety of immune disorders — but not all of them.
Recent studies have suggested that there is a measurable “farmhouse effect,” but researchers are uncertain about which exposures are protective and what mechanisms they might be triggering to produce that protective effect. What seems certain is that exposure to livestock from early childhood dramatically lowers the risk of developing allergic conditions later in life.
In particular, exposure to stable dust seems to prevent most allergic responses. Something in “farm dust” is effective — bacteria, viruses, fungi or even the allergens themselves — but it’s not entirely clear which components of the dust are protective and which aren’t. Another study, of rural areas in Austria, Germany and Switzerland, showed that a farming environment protected against hay fever, atopic sensitization and asthma.
If infants spent a lot of time in stables and drank cow’s milk in the first year of their lives, their rates of allergic disease were dramatically lower, even if their IgE results showed sensitization. In other words, they might have an underlying sensitivity to some allergens, but that sensitivity did not develop into full-blown allergic responses.
A different study, which compared the immune function of lab-raised mice with that of mice living in a farm’s barn, strongly supported the “farmhouse effect.” The results of studies in mice are, in fact, one of the key supports for this theory. August explained to me that pathogen-free mice bred for laboratory studies have dramatically different immune systems from their “unclean” peers; the lab mice have immune systems that resemble a human newborn’s. When you place those “clean” mice in a “dirty” environment — as that study did to simulate farm life — their immune systems change to look more like an adult human’s.
This tracks with research in humans suggesting that germ-ridden environments can also protect against allergies. Children and adults who live with dogs have lower rates of asthma and obesity, in part due to indirect exposure to the bacteria dogs carry and track into the home. A 2017 NIH-sponsored study showed that exposing children in the first three years of life to high indoor levels of pet and pest allergens, like cockroach, mouse and cat allergens, lowers their risk of developing asthma by age seven. But whether exposure is protective depends on the bacteria.
Enter the fascinating case of Helicobacter pylori, or H. pylori, a common gut microbe that’s the culprit behind gastrointestinal ulcers, chronic gastritis and even some forms of cancer.
Although scientists discovered the species H. pylori in 1982, there is speculation that our colonization by the bacteria took place circa 60,000 years ago and that it depended on repeated contact in small, close-knit groups, which is how humans typically lived until fairly recently. There are many different strains of H. pylori, and their prevalence in humans was estimated to hover around 80% until after World War II, when the introduction of antibiotics like penicillin to treat common infections led H. pylori to begin to disappear from the human gut. Today, an estimated 50% of humans are infected with H. pylori, with rates as high as 88% in one African nation and as low as 19% in a European one.
This is in line with the hygiene hypothesis, since transmission of microbes is far easier in large, crowded households with many siblings. H. pylori is usually acquired in early childhood, after the first year, and is transmitted via the ingestion of feces, saliva or vomit (and if you just physically recoiled, I apologize). In the absence of antibiotics, H. pylori, once acquired, can persist in the gut for decades, often for the entire life of its human host. Most people living with H. pylori have no symptoms or ill effects.
The stomachs of people with and without H. pylori are immunologically different and there is speculation that people with H. pylori have a larger gut population of regulatory T cells (Tregs). That’s important because Tregs are crucial for tamping down inflammatory immune responses. Although infection with H. pylori is associated with having more immune cells in the gut, some researchers have proposed that it may be a normal, rather than a pathological, response to the bacteria.
“There’s growing interest among allergy researchers to move beyond mouse models. Since some larger animals have natural allergic disease, like cats and dogs, they might be good models for learning about basic immunology across species as well as doing drug testing for allergic conditions.”
In other words, H. pylori may be beneficial in some situations. In fact, people who lack the bacteria are much more likely to suffer from gastroesophageal reflux disease (GERD), or acid reflux, and there is evidence that H. pylori plays a protective role against childhood-onset asthma.
All this gives credence to the basic premise of the hygiene hypothesis: We need regular exposure to friendly bacteria to train our immune systems. At the same time, simply living with more diverse microbial populations does not automatically produce a better-functioning immune system. Dr. Thomas Platts-Mills, director of the University of Virginia School of Medicine’s Division of Allergy and Clinical Immunology, believes the hygiene hypothesis cannot possibly explain the rise of allergies, at least not by itself. His argument relies on our more recent history of “cleanliness.”
Throughout the 20th century, hygiene standards were adopted more widely. Improved sewage systems and potable drinking water meant human exposure to microbes through ingestion was far less frequent. Regular infection by helminths, or intestinal parasites, had decreased due to food and water quality controls and the increased use of shoes.
During this time, people moved from rural farms into urban centers, so the general population also had less exposure to farm animals and encountered less diverse bacterial populations. Family size also decreased, perhaps exposing children to fewer germs. Platts-Mills notes, however, that all these changes were complete by the 1920s, which leaves the dramatic rise of asthma and allergic rhinitis from the 1940s into the 1950s unexplained.
Platts-Mills argues that the best explanation for the rise of hay fever and asthma is more likely “an increase in sensitization to indoor allergens and the loss of a lung-specific protective effect of regular deep inspiration.” In other words, outdoor play and recreation were likely more protective against allergies than spending hours playing Minecraft or Fortnite.
If the hygiene hypothesis or farmhouse effect were correct, one would also expect to see a marked decrease in allergy rates in rural, farming communities. Yet Dr. Jill Poole, chief of the allergy and immunology division at the University of Nebraska Medical Center, found that around 30% of Midwestern farmers suffer from allergic disease directly linked to their agricultural lifestyle. Dust from grain elevators and animal barns, pesticide exposures and grain rot from flooding cause so-called farmer’s lung. So while some farm exposures seem beneficial, others clearly are not.
And if family size, rural life and socioeconomic status are linked in the original hypothesis, then one might expect countries with larger families, more rural populations and lower socioeconomic status to have fewer allergic diseases. Yet allergies there are also steadily increasing.
A 2019 study found that half of Ugandans living in the capital, Kampala, have some form of allergy. It also found that allergies are on the rise in rural areas, although urban dwellers are more likely to have access to hospitals where they can report symptoms of asthma, nasal congestion or skin rash. Many Ugandans self-treat with over-the-counter antihistamines, steroids and antibiotics. Dr. Bruce Kirenga, a Ugandan allergy expert, said he thinks environmental pressures like air pollution are to blame.
These findings suggest that the farmhouse effect or hygiene hypothesis might not be the smoking gun we’re searching for. The theory makes intuitive sense, but we don’t have enough scientific evidence to definitively say that rural life, with its “dirty” or microbially rich environments, can fully protect us from allergic disease.
And yet, the basic idea that something about our interactions with the microbial world around us has shifted because of our lifestyles and daily habits is compelling. The hygiene hypothesis, then, is likely partially correct. There is growing evidence that some of our habits (particularly in relation to our diets and food production) might be behind the recent rise of allergies — especially food allergy.
Barrier Warfare
In a 1950s pamphlet on allergy, Dr. Samuel Feinberg, a leading allergist and the first president of what is now the American Academy of Allergy, Asthma and Immunology, wrote: “Man’s progress creates problems.” Feinberg pointed the finger at human ingenuity as a significant cause of the developed world’s increasing allergies. All our tinctures and dyes, our synthetic fabrics and new plastics, our lotions and eyeliner and lipsticks and shampoos were beginning to wreak havoc on our immune systems.
Dr. Donald Leung, an immunologist who is head of allergy and immunology at Denver’s National Jewish Health, is also one of the world’s leading researchers on atopic dermatitis. Leung told me that we overuse soap, detergents and products containing alcohol.
“Outdoor play and recreation were likely more protective against allergies than spending hours playing Minecraft or Fortnite.”
We routinely use harsh antimicrobial products, instead of soap and water, to clean our hands and homes, a habit the Covid pandemic only exacerbated. All of this can negatively affect our skin barrier, making it more likely we’ll develop an allergic condition.
Furthermore, exposure to food proteins through a weakened skin barrier, along with early ingestion of higher doses of food proteins, can lead to full-blown food allergy. In layman’s terms, that means if you make a peanut butter sandwich, don’t wash your hands and then pick up your baby, you may be depositing trace amounts of peanut protein onto their skin. If their skin is “leaky,” or more permeable than normal skin due to a possible genetic mutation or a disruption (or irritation) of the skin’s normal microbiome, then that protein could seep into the baby’s skin and sensitize their immune system, so that when the baby later eats peanuts, it can trigger a peanut allergy.
“All the things we’re putting on our skin, or the things we’re putting on our babies’ butts, are probably not good for our barriers,” Robert Schleimer, former chief of allergy and immunology at Northwestern’s Feinberg School of Medicine, told me.
Schleimer said his first job in the 1960s was collecting used cotton diapers for the Tidee Didee Diaper Service for $1.70 an hour. He would bring them back to the laundry facility to be cleaned and repackaged for delivery. As he reflected on the barrier hypothesis, he noted that cotton is a natural fabric. Now we use plastic diapers with antimicrobial properties and apply creams to babies’ skin to prevent rashes from those materials. And that’s just one of the changes that might be exposing our children to more irritants.
“You have these very tough detergents made of rough chemicals that break things down,” said Dr. Kari Nadeau, director of Stanford University’s Sean N. Parker Center for Allergy & Asthma Research. “And initially that was seen as positive. Then they started to see that, wait a minute, all these people working in the plants that are making those detergents have breathing issues.”
In our discussion, Nadeau is adamant about the downsides of modern living, especially when it comes to daily chemical exposure. She points to the recent rise in severe eczema. In the 1940s and 1950s, the image of a “squeaky clean” home was heavily promoted by the same companies making these new detergents (like Dow Chemical).
“It turns out that the way my grandmother lived on the farm was probably the right way to do things: not using a lot of detergents, not bathing every day, making sure you were exposed to a little bit of dirt, being exposed to the outdoors,” Nadeau said.
In one recent study, Canadian university researchers found that infants younger than four months old who lived in homes where household cleaning products were used more frequently were more likely to develop wheezing and asthma by age three. The researchers noted that most of the infants spent 80% to 90% of their time indoors — heavily increasing their exposure to these products.
Study co-author Dr. Anne Ellis noted that children take more frequent breaths than adults and, unlike adults, breathe mostly through their mouths — bypassing the nose’s natural filtration system and allowing anything in the air to more easily penetrate the lungs. The researchers hypothesized that fumes from cleaning products inflamed their respiratory tracts, activating the babies’ innate immune systems. The frequent use of household products such as air fresheners, deodorizers, antimicrobial hand sanitizers, oven cleaners and dusting sprays seemed particularly harmful.
Exposure to problematic chemicals during gestation can be equally harmful to developing immune systems. One study found that higher concentrations of plasticizers, or additives used to make materials more flexible, equated to a greater risk of developing allergies. Researchers measured levels of benzyl butyl phthalate (BBP), a common plasticizer used in polyvinyl chloride (PVC, or vinyl), in the urine of pregnant women and new mothers. They found that exposure to these phthalates during pregnancy and breastfeeding caused epigenetic changes to specific repressors of the Th2 immune cells responsible for generating inflammation.
Our Dark Sedentary Lives
Changes to our work and leisure habits may also be contributing to the rise in allergies. We live much of our lives in the shade, Dr. Pamela Guerrerio at the NIH told me. With less of the sun’s UVB light reaching our skin, our cells produce less vitamin D. It’s unclear how protective vitamin D is against allergies, but the ongoing debate over the evidence shows how little we know about the ill effects of our move indoors.
“All our tinctures and dyes, our synthetic fabrics and new plastics, our lotions and eyeliner and lipsticks and shampoos … wreak havoc on our immune systems.”
Dr. Scott Sicherer, director of Mount Sinai’s Elliot and Roslyn Jaffe Food Allergy Institute in New York, told me that both autoimmune and allergic diseases tend to occur at higher rates the farther a person lives from the equator. That fact made immunologists consider whether vitamin D was involved in immune disorders, since people are exposed to less sunlight at higher latitudes.
But Sicherer also noted: “There might be fewer people engaged in farming lifestyles at those latitudes. There might be different exposures to different things in different regions of the globe. It’s so complex that we just don’t know.”
Guerrerio agreed, remarking that people around the globe have different diets, which, along with less sunlight, might compound immune system impacts. Guerrerio said it’s likely several factors cause allergies — including our indoor-prone lifestyles — and that several interventions will be necessary to reverse the effects on our immune systems.
As for Elizabeth and her children, Elizabeth’s misplaced sense of guilt is intimately tied to her desire to provide them with the best care. But her decision to allow her very ill babies to receive antibiotics was almost certainly the correct one, given the likelihood of more dangerous outcomes without treatment. Still, her sense of regret lingers — and she is most certainly not alone.
We are regularly bombarded with advice about how to keep ourselves and our children healthy and happy. You can try to “game the immune system,” but I don’t recommend it. As we learn more from experts about how our immune systems react and respond to our ever-changing environments, it’s best to come to terms with the fact that it’s highly unlikely that we can directly cause our own or someone else’s allergies. Reality is almost always more complicated than that.