Friday, November 1, 2013

What can evolution tell us about iron fortification of infant formula?



I usually avoid blogging about my own work here. Mostly, this is a space for me to explore new topics, or share my excitement over shiny new and cool breastfeeding science, or force my students to show off their own work. However, I recently wrote a paper I think is worth discussing. The paper is this: “Too much of a good thing: evolutionary perspectives on infant formula fortification in the United States and its effects on infant health,” soon to be published in the American Journal of Human Biology and currently available in Early View. 

Unlike most of my work, which is centered on human milk, this paper focused instead on infant formula, specifically the iron fortification of infant formula. I applied concepts from evolutionary medicine to fortification practices and suggested that the current practice of fortifying infant formula with 12 mg/L of iron is excessive. I stand by this, even though I know many clinicians may challenge it; just last year the Section on Nutrition at the American Academy of Pediatrics recommended universal iron supplementation for breastfed infants out of concern that they may be at risk of developing iron deficiency anemia. This viewpoint was immediately challenged internally by the American Academy of Pediatrics Section on Breastfeeding and by scholars who study infant nutrition. You can read the responses here, and here.

Figure 1: Me, and a wall of infant formula in Cebu, Philippines. I'm five feet one inch tall if you need a scale. Photo by Chris Kuzawa.

Iron deficiency anemia (IDA) is a global problem, with approximately 2 billion people (yes, with a b) suffering from some form of anemia based on estimates from the World Health Organization. IDA during development is associated with increased infection, mortality, delayed cognitive development, and impaired growth in weight and length. It is a terrible nutritional deficiency, and it makes total sense that we would want to prevent and treat IDA as much as possible. What I suggest in this paper is that, in pursuit of that noble goal, we may have gone too far, and commercial infant formula may contain an excess of iron. 

Most of us living in the United States live in high-resource, low-pathogen environments. Iron-depleting infections, especially those caused by intestinal helminths, are rare. And iron is quite plentiful for formula-fed infants – formulas are typically fortified with 10-12 mg/L of iron, and low-iron formulas (4 mg/L) are actually quite hard to find. Breastfed babies receive milk with much lower levels of iron – about 0.2-0.5 mg/L. While the differences seem huge on paper, not all the iron in milk and formula is bioavailable – about 15-50% of human milk iron is bioavailable, compared with about 7-14% of infant formula iron. The differences actually look like this across infancy, as shown here for an “average” female infant. I have defined average as growing along the 50th percentile of weight-for-age and consuming the standard recommended amount of formula (ounces per pound) or an equivalent amount of human milk. As you can see, the differences in intake are striking. Recommended daily intakes (FDA) are 0.27 mg/day for infants less than 6 months; breastfed infants are meeting these requirements, while formula-fed infants are consuming vastly more. 
Figure 2: Iron intake of a typical female infant (assume she is at the 50th percentile of weight-for-age and drinks the volume of formula or human milk recommended for that weight). The dietary iron she actually absorbs assumes a 7% absorption rate from formula and a 50% absorption rate from human milk; both may be underestimates. The reference line is the recommended infant intake. Note, this graph does not show the leftover iron – the iron that may be an all-you-can-eat buffet for gut bacteria.
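
If you want to play with the arithmetic behind this figure yourself, here is a minimal Python sketch of the same kind of comparison. The infant weight, the flat 150 mL/kg/day feeding volume, and the particular concentration and absorption values are all illustrative assumptions pulled from the ranges quoted above, not the numbers from the published model.

```python
# Back-of-the-envelope version of the comparison plotted above.
# All numbers are illustrative assumptions drawn from the ranges in the post,
# not the values used in the published model.

FORMULA_IRON_MG_PER_L = 12.0   # typical US fortification (10-12 mg/L)
MILK_IRON_MG_PER_L = 0.35      # human milk, midpoint of ~0.2-0.5 mg/L
FORMULA_ABSORPTION = 0.07      # low end of ~7-14% bioavailability
MILK_ABSORPTION = 0.50         # high end of ~15-50% bioavailability
RECOMMENDED_MG_PER_DAY = 0.27  # recommended intake, infants under 6 months

def daily_iron(weight_kg, iron_mg_per_l, absorption, intake_l_per_kg=0.15):
    """Return (ingested, absorbed) iron in mg/day for a milk-fed infant.

    Uses a flat ~150 mL per kg per day feeding volume as a crude stand-in
    for the 'ounces per pound' feeding guideline mentioned in the post.
    """
    volume_l = weight_kg * intake_l_per_kg
    ingested = volume_l * iron_mg_per_l
    return ingested, ingested * absorption

weight_kg = 6.0  # a hypothetical infant a few months old
for label, conc, absorb in [("formula", FORMULA_IRON_MG_PER_L, FORMULA_ABSORPTION),
                            ("human milk", MILK_IRON_MG_PER_L, MILK_ABSORPTION)]:
    ingested, absorbed = daily_iron(weight_kg, conc, absorb)
    print(f"{label:>10}: {ingested:5.2f} mg/day ingested, "
          f"{absorbed:5.3f} mg/day absorbed "
          f"(recommended intake: {RECOMMENDED_MG_PER_DAY} mg/day)")
```

Even under these rough assumptions, the breastfed infant ingests an amount of iron in the neighborhood of the recommended 0.27 mg/day, while the formula-fed infant ingests many times that.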

I hypothesized that this increased dietary iron would be mismatched to infant needs and may result in an excess of iron. While adults can downregulate iron absorption when they are iron replete, infants do not have the same capacity and will continue to absorb dietary iron. This excess iron may increase the concentration of free radicals, lead to oxidative damage in cells, and, most importantly, serve as an iron source for pathogens, increasing the risk of infection. The iron that is not absorbed by the infant (that other 86-93%) will spend some time in the infant’s digestive system before being excreted in feces, and may provide an iron source for pathogenic, iron-requiring bacteria such as E. coli. By comparison, the common intestinal microflora of breastfed babies, Lactobacillus and Bifidobacterium, are either iron independent (Lactobacillus) or require minimal iron (Bifidobacterium). These bacteria even contribute to immune responses in breastfed infants AND competitively inhibit E. coli. Everything may shift with too much iron: iron-requiring bacteria, including pathogens, may increase in number, and the pH of the intestines may change in ways that support additional pathogenic bacteria, increasing the risk of GI infections and diarrhea. Too much iron – absorbed or not – can have consequences for infant health. 
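
To put a rough number on that leftover iron, here is a tiny extension of the sketch above; again, the ingested amounts and absorption rates are illustrative assumptions, not measured values.

```python
def leftover_iron(ingested_mg_per_day, absorption):
    """Unabsorbed iron (mg/day) left in the gut lumen for bacteria to use."""
    return ingested_mg_per_day * (1.0 - absorption)

# Using the illustrative numbers from the sketch above:
print(leftover_iron(10.8, 0.07))    # formula-fed: roughly 10 mg/day unabsorbed
print(leftover_iron(0.315, 0.50))   # breastfed: roughly 0.16 mg/day unabsorbed
```

Under these assumptions, the gut of a formula-fed infant sees dozens of times more unabsorbed iron each day than the gut of a breastfed infant.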

Elsewhere, it has been argued that maintaining lower (but not anemic) levels of bodily iron may be protective against infection and may be an evolved response to minimize infection risk. This actually makes a lot of sense – limiting iron puts the brakes on pathogen growth and replication and may reduce infection risk. 
In infants, transplacental iron, especially with delayed cord clamping, is sufficient to meet iron requirements for the first several months of life. Iron levels in unsupplemented infants are quite low at 6 months of life, although few will develop full-blown anemia. I argue that these low levels at 6 months may be adaptive – this is the time period when infants are introduced to foods besides breast milk, and consequently their exposure to pathogens increases greatly (it is also the time when they become more mobile, which may also contribute). Having low levels of bodily iron may, as suggested for adults decades ago (Bullen et al., 1976), be protective against infection. Infants with lower levels of bodily iron may have been less likely to contract infections or die from them, leading to gradual evolutionary change in how human infants handle iron – and possibly in the iron content of human milk. 

Commercial infant formula with really high concentrations of iron undermines this normal biological rhythm, and in our important attempts to prevent IDA in infants, we may have overshot the mark. In Europe, the ESPGHAN Global Standards recommend fortification at 4-8 mg/L (Koletzko et al., 2005), and guess what – the incidence of IDA in infants is not higher than in the United States. Several randomized controlled trials, the gold standard of clinical investigation, have found the same thing – infants receiving formula with 4-8 mg/L of iron do not have increased risks of IDA compared to infants receiving 12 mg/L. This has been interpreted as evidence that higher fortification levels are safe, but it also demonstrates that lower levels of iron fortification are sufficient to meet infant needs. Too much iron, I suggest, may promote the growth of pathogenic bacteria, alter the composition of the microbiome, and may even increase long-term risks of Parkinson’s disease. 

Infant formula clearly needs iron fortification. But the current levels of fortification used in the United States may be a case of too much of a good thing. And, as suggested below in the comments, the needs of premature babies will be very different; the model above is for full-term infants who are appropriate for gestational age (not premature or small for gestational age). 

Author's note: The Alpha Parent has recently discussed a similar topic, and I learned, after the paper had been published, that the Science of Mom had made similar points in 2011. This project was originally presented as a conference talk at the American Association of Physical Anthropologists meetings in April 2007.