Iron deficiency anaemia: the scale of the problem
Iron deficiency remains the most prevalent nutritional deficiency worldwide, affecting an estimated 4–6 billion people. Iron deficiency is also the main cause of anaemia in children and women in both high and low infection burden settings(Reference Wirth, Woodruff and Engle-Stone1, Reference Engle-Stone, Aaron and Huang2). Two billion people, about 30 % of the world's population, and 43 % of children aged 6–59 months are anaemic, and the prevalence of anaemia is five times higher in low- and middle-income countries (LMIC) than in high-income countries(3, Reference Miller4). As such, iron deficiency anaemia (IDA) is the largest nutritional deficiency disorder in the world and one of the five leading causes of global disease burden(3). At any given moment, more individuals suffer from IDA than from any other health problem, with a staggering 1·24 billion affected individuals worldwide(3) (see Fig. 1).
Iron deficiency is associated with multiple pathologies, including anaemia and defective organ function and formation(Reference Prentice, Mendoza and Pereira5). Iron deficiency in the absence of anaemia is somewhat subtler in its manifestations than other micronutrient deficiencies, despite being a major contributor to ill health, premature death and lost earnings in developing countries. Even mild iron deficiency appears to impair intellectual development in young children and to lower national intelligence quotients(Reference Grantham-McGregor and Ani6–Reference Halterman, Kaczorowski and Aligne10), while overt IDA is associated with an increased risk of serious morbidity, poor motor and mental development in children, reduced work capacity in adults, poor pregnancy outcomes and impaired immunity: all of which make it one of the most expensive diseases in the world according to the WHO, accounting for an estimated 4·05 % of lost gross domestic product globally(Reference Horton and Ross11). Reducing the prevalence of IDA is one of the WHO's six priorities, and it is estimated that appropriate treatment of IDA could not only restore individuals' health but also raise national productivity levels by as much as 20 %(Reference Horton and Ross11–14).
Nevertheless, in the past 25 years, little has changed in LMIC, where IDA retains its position as the leading cause of years lived with disability(3) and is responsible for over 50 000 deaths annually(Reference Wang, Naghavi and Allen15), with iron deficiency contributing to 120 000 maternal deaths per year(16). This is perhaps not surprising, since eliminating IDA in these countries is particularly challenging due to the double burden of IDA and infectious diseases: malaria, HIV/AIDS, tuberculosis, hookworm and other intestinal parasitic or bacterial infections all contribute to the anaemia burden.
Iron deficiency in the UK
Recently, Public Health England and the Food Standards Agency have published the combined results from years 7 and 8 (2014/2015–2015/2016) of the National Diet and Nutrition Survey(17), providing an update on the results from years 5 and 6 published in 2016. Data on dietary iron intake, as well as on biochemical body iron status, indicated an increased risk of iron deficiency in girls aged 11–18 years and women aged 19–64 years. When dietary iron intake was compared with government recommendations, the National Diet and Nutrition Survey reported that 54 % of girls and 27 % of women had iron intakes below the lower reference nutrient intake (LRNI), which is considered inadequate for most individuals, as it represents the level of intake likely to be sufficient to meet the needs of only 2·5 % of the population. Interestingly, the proportion of children aged 1·5–3 years with dietary iron intake below the LRNI was greater than in the previous report (10 % compared with 6 % previously)(17, Reference Geissler and Singh18).
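To make the meaning of this threshold concrete, the LRNI is conventionally set two notional standard deviations below the estimated average requirement (EAR) under the UK Dietary Reference Values framework, so an intake at the LRNI meets the requirements of only the small minority of individuals with the lowest needs. The minimal sketch below illustrates this, assuming normally distributed requirements; the EAR and standard deviation values used are illustrative assumptions, not survey data.

```python
# Minimal illustration of why an intake at the LRNI is adequate for only ~2.5 % of
# people. Assumption: individual iron requirements are approximately normally
# distributed and the LRNI sits two notional standard deviations below the EAR
# (the UK Dietary Reference Values convention). Numbers are illustrative only.
from scipy.stats import norm

ear_mg_per_day = 11.4   # illustrative EAR
sd_mg_per_day = 1.7     # illustrative notional SD
lrni = ear_mg_per_day - 2 * sd_mg_per_day

# Proportion of the population whose requirement would be met by an intake at the LRNI
fraction_met = norm.cdf(lrni, loc=ear_mg_per_day, scale=sd_mg_per_day)
print(f"LRNI = {lrni:.1f} mg/d; adequate for {fraction_met:.1%} of individuals")
# -> about 2.3 %, conventionally quoted as 2.5 %
```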
Analysis of blood samples from 704 adults and 329 children in the same UK survey provided evidence of IDA (as indicated by low Hb levels) and low iron stores (low plasma ferritin) in all age/sex groups in the population, with a higher prevalence in females. Based on the WHO criteria for the definition of iron deficiency and anaemia, the prevalence of IDA in the UK was 9 % for girls aged 11–18 years (v. 5 % in the previous report), while low iron stores were evident in 24 % of adolescent girls, 5 % of adult women and 1 % of older women(17). It is important to note that pregnant or breast-feeding women, who may have different requirements, were among the population groups excluded from the survey.
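For readers unfamiliar with how such prevalence figures are derived, the sketch below shows schematically how individuals can be classified against haemoglobin and ferritin cut-offs of the kind used by the WHO. The thresholds are commonly cited values included purely for illustration; the survey's exact criteria, and any adjustments for inflammation, altitude or smoking, should be taken from the report itself.

```python
# Schematic classification of iron status from haemoglobin (Hb) and plasma ferritin.
# Cut-offs are commonly cited WHO-style values, used here for illustration only;
# they are not necessarily the exact criteria applied in the survey.

def classify_iron_status(hb_g_per_l: float, ferritin_ug_per_l: float,
                         group: str = "non_pregnant_woman") -> str:
    hb_cutoffs = {                      # g/l, illustrative
        "child_6_59_months": 110,
        "child_5_11_years": 115,
        "non_pregnant_woman": 120,
        "man": 130,
    }
    ferritin_cutoff = 12 if group == "child_6_59_months" else 15   # ug/l, illustrative

    anaemic = hb_g_per_l < hb_cutoffs[group]
    low_stores = ferritin_ug_per_l < ferritin_cutoff

    if anaemic and low_stores:
        return "iron deficiency anaemia"
    if low_stores:
        return "low iron stores without anaemia"
    if anaemic:
        return "anaemia, other causes to be investigated"
    return "iron replete"

# Example: an adolescent girl with Hb 112 g/l and ferritin 9 ug/l
print(classify_iron_status(112, 9, group="non_pregnant_woman"))
# -> iron deficiency anaemia
```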
These data indicate that iron deficiency and IDA remain significant nutritional deficiency disorders in the UK, despite the wide availability of fortified foods and iron supplements.
Strategies to prevent and treat iron deficiency
The most commonly used strategies to control iron deficiency are supplementation (including multi-micronutrient powders), food fortification, dietary diversification and control of parasitic and other infections. Supplementation or fortification programmes are often the selected short-term approaches because they are cost-effective and relatively easy to implement.
The current WHO recommendation for the prevention of IDA is daily iron supplementation for 3 consecutive months each year for all pre-menopausal women, adolescent girls and young children in countries with anaemia prevalence above 40 % (i.e. the majority of countries in sub-Saharan Africa and South East Asia), and intermittent supplementation in settings with lower prevalence(19, 20).
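Expressed schematically, this recommendation amounts to a simple prevalence-based decision rule, restated in the sketch below for clarity. The 40 % threshold and the regimens follow the text above; the handling of lower-prevalence settings is simplified, and the cited WHO guidelines should be consulted for exact doses and eligibility criteria.

```python
# Schematic restatement of the prevalence-based recommendation described above.
# Only the 40 % cut-off and the broad regimens come from the text; doses and the
# exact treatment of lower-prevalence settings are simplified assumptions.

def recommended_regimen(anaemia_prevalence_pct: float) -> str:
    if anaemia_prevalence_pct >= 40:
        # high-prevalence settings, e.g. much of sub-Saharan Africa and South East Asia
        return ("daily iron supplementation for 3 consecutive months per year "
                "for pre-menopausal women, adolescent girls and young children")
    # lower-prevalence settings
    return "intermittent (e.g. weekly) iron supplementation"

print(recommended_regimen(48))   # high-prevalence setting -> daily regimen
print(recommended_regimen(25))   # lower-prevalence setting -> intermittent regimen
```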
Iron compounds are widely available but typically generate a non-physiological bolus of bio-accessible and reactive ionic iron that can cause significant adverse effects, either in the colon (i.e. the unabsorbed iron fraction) or in the circulation (i.e. the absorbed iron fraction)(Reference Sazawal, Black and Ramsan21–Reference Prentice24). Meta-analyses of trials involving nearly 10 000 young children in developing countries have consistently shown that the conventional soluble oral iron supplementation used to treat IDA is associated with increased infection, including bloody diarrhoea(Reference Soofi, Cousens and Iqbal22, Reference Mayo-Wilson, Imdad and Junior25–Reference Jaeggi, Kortman and Moretti27), and with detrimental changes to the gut microbiome and gut inflammation(Reference Jaeggi, Kortman and Moretti27, Reference Zimmermann, Chassard and Rohner28), further increasing the burden of enteric infection and environmental enteropathy (i.e. persistent gut damage and inflammation that leads to malabsorption), which is a major cause of growth failure in children in resource-poor environments(Reference Naylor, Lu and Haque29, Reference Lin, Arnold and Afreen30). Wide-scale home fortification programmes using multi-micronutrient powders hold promise for reaching at-risk populations(Reference De-Regil, Suchdev and Vist31). Nonetheless, progress has been painstakingly slow in spite of the scale of the problem, and strategies to control iron deficiency have failed to decrease the global burden of this deficiency. This is particularly the case for young children in sub-Saharan Africa, where iron supplementation has consistently shown limited efficacy and a potential for increasing infection risk(Reference Sazawal, Black and Ramsan21, Reference Zlotkin, Newton and Aimone26, Reference Mwangi, Phiri and Abkari32).
Our biggest challenge: iron bioavailability in high-infection settings
Infection
Many LMIC have a high prevalence of malnutrition and infection, which reinforce each other in a vicious cycle(Reference Muller and Krawinkel33). First, malnutrition weakens barrier and immune functions, allowing pathogens easier access and impairing the host's ability to fight them(Reference Calder and Jackson34). Secondly, damage to the intestinal mucosal lining caused by enteric infection and inflammation decreases nutrient status through impaired absorption and diarrhoea, thereby predisposing to further infection, worsening nutritional status and perpetuating a cycle of chronic infection and malnutrition(Reference Calder and Jackson34). This cycle of infection, malnutrition and persistent inflammation leads to malabsorption and, in the long term, to growth failure and mortality(Reference Calder and Jackson34–Reference Moore, Cole and Poskitt36).
Chronic infection leads to high hepcidin levels and anaemia. Elevated hepcidin can decrease iron absorption, irrespective of the iron source, and also leads to low circulating serum iron and low transferrin saturation (hypoferraemia), through the mobilisation of circulating iron to the iron stores in macrophages and the liver(Reference Atkinson, Armitage and Khandwala37–Reference Nemeth and Ganz40).
Conversely, iron supplementation regimens can release a high bolus of iron into the blood, which may transiently overwhelm transferrin's iron-binding capacity and generate redox-reactive non-transferrin-bound iron, which could be available to extra-cellular pathogenic organisms and so promote infection(Reference Brittenham, Andersson and Egli41–Reference van der, Marx and Grobbee45). Indeed, a committee convened by the WHO concluded that these non-physiological high doses of highly absorbable iron may bypass the naturally evolved systems that safely chaperone iron and may be the cause of the excess non-transferrin-bound iron, thereby producing the afore-mentioned harmful effects(46). This conclusion is supported by a recent study showing that, 3 h after consumption of a standard iron supplement dose, human serum supports markedly enhanced rates of replication of pathogenic bacteria(Reference Cross, Bradbury and Fulford47).
Nutritional immunity
Iron is an essential nutrient for virtually all human pathogens, and the most virulent and invasive strains are those with multiple iron acquisition and utilisation mechanisms(Reference Subashchandrabose and Mobley48–Reference Parkhill, Wren and Thomson51). Furthermore, iron can also regulate evolutionary transitions between commensal and pathogenic states in microbes(Reference Barber and Elde52). Nutritional immunity is an innate defence mechanism characterised by the sequestration of iron and other essential trace elements from the circulation to reduce iron availability and limit pathogen growth and virulence(Reference Calder and Jackson34, Reference Barber and Elde53, Reference Bullen, Rogers and Spalding54).
There is constant competition for iron between the human host and invasive pathogens(Reference Drakesmith and Prentice39). Anaemia is protective against malaria, and iron supplementation removes this protective effect(Reference Goheen, Wegmuller and Bah55, Reference Clark, Goheen and Fulford56). Iron supplementation may also increase the risk of respiratory and other systemic infections, owing to iron's ability to increase the virulence and multiplication of pathogens(Reference Cross, Bradbury and Fulford47, Reference Armitage, Stacey and Giannoulatou57, Reference McDermid, Hennig and van der Sande58). Our body has evolved very effective ways to intentionally reduce circulating iron when challenged by infection, in an attempt to ‘starve’ the pathogen. Central to nutritional immunity are high-affinity iron transport proteins, such as transferrin and lactoferrin, which keep free iron scarce, at concentrations of approximately 10⁻¹⁸ M(Reference Bullen, Rogers and Spalding54, Reference Bilitewski, Blodgett and Duhme-Klair59, Reference Weinberg60). As aforementioned, a key aspect of our response to infection is the more recently discovered hormone hepcidin, the master regulator of body iron homeostasis, which allows our body to reduce circulating iron by suppressing intestinal iron absorption and decreasing its release from body iron stores. Any iron supplementation regimen that aims to be effective in a high-infection burden setting needs to acknowledge the close interaction between iron and infection; this key aspect of our natural immunity cannot be overlooked.
Hepcidin/inflammation
Our body has no means of excreting excess iron, and the control of body iron levels occurs through the regulation of iron absorption. Absorption of iron from the gastrointestinal tract is tightly regulated according to the systemic need for iron through the action of the hormone hepcidin(Reference Drakesmith and Prentice39, Reference Ganz61, Reference Nemeth, Tuttle and Powelson62). Hepcidin responds to changes in body iron stores, tissue hypoxia and demand for iron, and alters absorption accordingly. Hepcidin, as the master regulator of iron homeostasis, is the main inhibitor of iron export from cells (including enterocytes, hepatocytes and macrophages) into the blood circulation(Reference Prentice, Mendoza and Pereira5). When hepcidin is elevated, it binds to ferroportin (the only known cellular iron exporter) and causes its internalisation and degradation, so that cellular iron is locked inside cells and cannot (1) be absorbed into the bloodstream or (2) be released from body iron stores(Reference Ganz and Nemeth63, Reference Deschemin and Vaulont64).
In IDA, hepcidin levels tend to be low to allow iron absorption, but in the presence of inflammation or in iron-replete individuals, hepcidin levels are usually high(Reference Jaeggi, Moretti and Kvalsvig38, Reference Bregman, Morris and Koch65, Reference Prentice, Doherty and Abrams66). Anaemia of inflammation, also called anaemia of chronic disease, is characterised by high hepcidin, high iron stores and low circulating iron(Reference Pasricha, Atkinson and Armitage67).
In anaemia of inflammation, when hepcidin levels are high, non-physiological bolus doses of highly absorbable iron supplements are poorly absorbed, and the unabsorbed iron is simply excreted in the stools(Reference Prentice, Doherty and Abrams66). Iron absorption can be as high as 50 % when hepcidin is switched off and virtually zero when hepcidin is high(Reference Prentice, Doherty and Abrams66).
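To visualise the span of this effect, the sketch below gives a purely illustrative toy model anchored only on the two endpoints quoted above (roughly 50 % absorption when hepcidin is switched off and close to zero when hepcidin is high). The exponential shape and the arbitrary hepcidin scale are assumptions made for illustration; this is not a validated dose-response relationship.

```python
# Purely illustrative toy model: fractional oral iron absorption falls from ~50 %
# when hepcidin is absent towards ~0 when hepcidin is high. Only the two endpoints
# come from the text; the curve shape and the hepcidin scale (arbitrary units) are
# assumptions for illustration.
import math

def fractional_absorption(hepcidin_au: float,
                          max_absorption: float = 0.50,
                          half_suppression_au: float = 1.0) -> float:
    """Modelled fraction of an oral iron dose absorbed at a given hepcidin level."""
    return max_absorption * math.exp(-math.log(2) * hepcidin_au / half_suppression_au)

for hepcidin in (0.0, 1.0, 4.0):
    print(f"hepcidin {hepcidin:.1f} au -> "
          f"{fractional_absorption(hepcidin):.0%} of the dose absorbed")
# -> 50 %, 25 %, 3 %
```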
Microbiome
Low ‘free’ iron availability in the colon is a major modulator of our gut microbiome, particularly in early life or in disease. Both iron deficiency and iron supplementation have an impact on the gut microbiome(Reference Kortman, Dutilh and Maathuis68, Reference Pereira, Aslam and Frazer69). The establishment of an adult-like microbiome requires bio-accessible iron to maintain diversity(Reference Pereira, Aslam and Frazer69, Reference Timmerman, Rutten and Boekhorst70), but in certain population groups, iron supplementation could cause or contribute to microbiome dysbiosis (see Fig. 2).
Excess unabsorbed dietary iron from fortification or supplementation can modify the equilibrium of the gut microbiota and favour the growth of pathogenic strains over barrier strains. Iron is essential for the virulence and colonisation of most enteropathogenic strains(Reference Kortman, Mulder and Richters71–Reference Jadhav, Hussain and Devi74). However, the gut bacterial species considered most beneficial to the host, i.e. barrier function bacteria such as members of the Lactobacillaceae and Bifidobacteriaceae families, have either no or a very low requirement for iron(Reference Weinberg75, Reference Paganini, Uyoga and Kortman76). Indeed, studies in children have shown the detrimental effects of the currently used soluble forms of iron on the gut microbiome, suggesting that iron supplements decrease the abundance of beneficial bacterial groups such as Lactobacillus and Bifidobacterium, increase the relative abundance of potentially enteropathogenic bacteria belonging to the Enterobacteriaceae family, and increase intestinal inflammation(Reference Jaeggi, Kortman and Moretti27, Reference Zimmermann, Chassard and Rohner28, Reference Paganini, Uyoga and Kortman76). This gut microbiome dysbiosis, combined with mucosal inflammation, can lead to reduced resistance to infection, an increased risk of diarrhoea and increased gut permeability leading to endotoxaemia(Reference Soofi, Cousens and Iqbal22, Reference Jaeggi, Kortman and Moretti27, Reference Paganini, Uyoga and Kortman76–Reference Cani, Bibiloni and Knauf79). In this way, current iron supplementation could contribute to the vicious cycle of infection and diarrhoea, eventually causing further anaemia. Use of prebiotics together with iron supplements may alleviate some of the gut-related adverse effects of the iron supplements, but further studies are needed(Reference Paganini, Uyoga and Kortman76).
Future outlook: science and beyond
Micronutrient nutrition is a key aspect of the international development agenda and is integral to the UN sustainable development goals. Recognising the importance of micronutrient nutrition is also paramount in defining strategies to address the double burden of undernutrition and overnutrition and the double burden of malnutrition and infection, both widely prevalent across most countries in the world. The WHO recently recognised that considerable investment in building public health nutrition capacity is required, particularly in LMIC, alongside a persistent effort to obtain scientific evidence for the effectiveness of existing and new nutrition programmes and interventions(Reference Delisle, Shrimpton and Blaney80). Additionally, obtaining scientific evidence through clinical trials and nutrition intervention studies in ‘virgin’ rural settings in LMIC carries the ethical responsibility to add social value, through community engagement and capacity building, which goes beyond the scientific discovery itself. Traditionally, research communication has been reserved for scientific publications and news pieces on research websites. However, this approach isolates science in a way that is not effective in today's world. Therefore, we must train researchers not only in their ability to conduct nutrition science research but also in their ability to communicate science to the wider community and general public. Equally, we must equip public health professionals, nurses and clinicians in LMIC with the ability to engage with local communities as much as with the knowledge of how to manage the prevention and treatment of the various forms of malnutrition. A practical example of how we can combine scientific discovery, capacity building and community engagement is outlined below with the IHAT-GUT iron supplementation clinical trial in The Gambia.
Case study: IHAT-GUT trial in The Gambia
Since iron delivered naturally through foods may be a safer alternative(Reference Prentice, Mendoza and Pereira5), there is growing interest in developing novel nano iron compounds or delivery systems for use in fortification and supplementation(Reference Powell, Bruggraber and Faria81–Reference Pisani, Riccio and Sabbatini84). One of the strategies proposed is to use a mimetic of the iron core of ferritin, namely iron hydroxide adipate tartrate (IHAT), to alleviate the adverse effects associated with reactive ionic iron(Reference Pereira, Bruggraber and Faria85). The hypothesis is that supplementation with IHAT will cause a slow release of iron into the blood and a lower rise in transferrin saturation, resembling that of natural food iron(Reference Pereira, Bruggraber and Faria85–Reference Aslam, Frazer and Faria87), and may therefore constitute a safer option for supplementation, particularly in population settings with a high infection burden. Pre-clinical data indicate that IHAT does not accumulate in the intestinal mucosa and can promote a beneficial microbiota after dysbiosis(Reference Pereira, Bruggraber and Faria85, Reference Aslam, Frazer and Faria87). IHAT and standard-of-care ferrous sulphate are currently being tested in a randomised placebo-controlled double-blind clinical trial taking place at the MRC Unit The Gambia at the London School of Hygiene & Tropical Medicine. The IHAT-GUT trial (NCT02941081) is enrolling about 700 young children living in some of the poorest and most rural communities in The Gambia, where the risks of infection and diarrhoea are very high.
As add-on projects to the IHAT-GUT trial, we have conducted nutrition training workshops and we are also conducting a social science project in which we document the ‘ins’ and ‘outs’ of setting up a nutrition intervention clinical trial, aimed at translating novel chemistry into public benefit, in a rural setting in The Gambia (see Fig. 3). We believe this social science aspect of research training is as important to global health and nutrition science researchers as the training in nutrition science itself. The social science project is gathering the personal views of clinical and field study staff, all of whom were based in The Gambia before the trial commenced, and will, hopefully, demonstrate that it is possible to translate international science to rural Africa and to conduct nutrition clinical trials in these rural settings to ICH-GCP international standards. Likewise, it will illustrate the many challenges faced by the researchers, for example how to implement clinical studies in new settings and how to attract motivated and skilled staff. This project will help staff and communities to embrace these studies and to acknowledge the added value of being part of nutrition research. We envisage that such information will inspire the next generation of researchers in global health and international nutrition, from academia or industry, to translate more international science to rural resource-poor settings.
Conclusion
Anaemia is multifactorial, caused by both poor diet and high levels of infection and inflammation. Nowhere do these causes coexist more than in sub-Saharan Africa, and eliminating iron deficiency and anaemia in African children and women remains one of the main nutritional challenges of the twenty-first century. Iron supplementation in malaria-endemic areas or in regions with a high infection burden needs to be implemented with caution, should target those who are anaemic or at high risk of deficiency, and should be accompanied by strategies to prevent and treat malaria, hookworm and schistosomiasis and to address other nutrient deficiencies, such as those of vitamin B12, folate and vitamin A(Reference Mwangi, Phiri and Abkari32, 46).
In spite of many successes, it is clear that more needs to be done to eliminate iron deficiency in LMIC. Today, public health interventions that can correct or prevent iron and other micronutrient deficiencies merit the highest priority for national and international organisations. Routine monitoring of compliance, research into the best delivery vehicles, ensuring the continuous availability of supplements or fortified foods at the community level, engaging local communities and community leaders, and capacity building are the key factors for sustaining achievements and increasing the success of such programmes. Only then will we fully live up to the World Bank's observation: ‘the control of vitamin and mineral deficiencies is one of the most extraordinary development-based scientific advances of recent years. Probably no other technology available today offers as large an opportunity to improve lives and accelerate development at such low cost in such short time.’
Acknowledgements
The authors are very grateful to all staff and participants of the IHAT-GUT trial for their continued support and inspiration.
Financial Support
D. I. A. P. and the IHAT-GUT trial are supported by a Bill and Melinda Gates Foundation Grand Challenges New Interventions for Global Health award. Funding for the add-on social value projects was received from the Cambridge-Africa Alborada Research Fund and University College London.
Conflicts of Interest
D. I. A. P. is one of the inventors of the IHAT iron supplementation technology for which she could receive future awards to inventors through the MRC Awards to Inventor scheme. Notwithstanding, the authors declare no conflict of interest.
Authorship
All authors contributed to the writing of this manuscript. The opinions expressed in this article are the authors' own and do not necessarily reflect the views of their affiliated institutions.