In recent years there has been a growing interest in folate and the metabolically related B-vitamins (vitamin B12, vitamin B6 and riboflavin). Apart from a well established role in the prevention of neural tube defects (NTDs), an optimal status of folate and related B-vitamins may protect against cardiovascular disease (particularly stroke) and certain cancers, and may have other important roles in cognitive function and bone health. These effects may or may not be mediated via homocysteine, the metabolism of which requires an adequate status of all four relevant B-vitamins. However, in the case of folate, vitamin B12 and riboflavin, the achievement of an optimal status presents particular challenges. This review will consider these nutrients, their potential roles in disease prevention, the assessment of status and the challenges involved in achieving an optimal nutritional status.
Folate
Folate has a major role in one-carbon metabolism, involving the transfer and utilisation of one-carbon units in DNA and RNA biosynthesis, methylation reactions and amino acid metabolism. Folate deficiency leads to megaloblastic anaemia, characterised by larger than normal red cell precursors (megaloblasts) in bone marrow, macrocytes in the peripheral blood and giantism in the morphology of proliferating cells.
Folate status is routinely assessed by measurement of folate concentrations in plasma/serum or red cells. Red cell folate is considered to be the best index of longer term status (i.e. over the previous months), while serum folate reflects more recent dietary intake. The measurement of plasma total homocysteine concentration provides a reliable functional marker of folate status, on the basis that normal homocysteine metabolism requires an adequate supply of folate. When folate status is low or deficient, elevated plasma homocysteine is invariably observed.
Typical folate intakes in many populations are sub-optimal when judged against the goal of achieving an optimal folate status, i.e. a folate level associated with the lowest risk of folate-related disease (e.g. NTDs), rather than merely preventing overt folate deficiency (i.e. megaloblastic anaemia). This widespread under-provision of folate is generally attributed to the poor bioavailability of natural food folates. Although there is general agreement among experts that the bioavailability of natural food folates is incomplete when compared with the synthetic vitamin folic acid (FA), reported estimates of relative bioavailability vary greatly among human studies(Reference McNulty and Pentieva1).
With such uncertainty regarding folate bioavailability from food, setting dietary folate recommendations is problematic for policy makers. The approach adopted in the United States(2) is based on an adjustment for the differences in bioavailability between natural food folates and the synthetic vitamin, with the introduction of ‘Dietary Folate Equivalents’ (DFE). The DFE is defined as the quantity of natural food folate plus 1·7 times the quantity of FA in the diet, a definition based on the assumption that the bioavailability of FA added to food is greater than that of natural food folate by a factor of 1·7. This estimate, however, is highly dependent on one metabolic study in non-pregnant women which estimated the bioavailability of food folates to be no more than 50 % relative to that of FA(Reference Sauberlich, Kretsch, Skala, Johnson and Taylor3), plus other evidence showing that FA added to food had about 85 % of the bioavailability of free FA(Reference Pfeiffer, Rogers, Bailey and Gregory4). Subsequent bioavailability studies have addressed the robustness of the US DFE value, with the findings of some(Reference Hannon-Fletcher, Armstrong and Scott5) being generally supportive of the estimates used, whereas others have concluded that food folate bioavailability is far greater than that assumed in generating the US DFE(Reference Winkels, Brouwer, Siebelink, Katan and Verhoef6). Emerging dietary folate recommendations in other countries may adopt a similar approach to that used in the United States, in which case the folate bioavailability literature will need to be carefully considered. Apart from their poor bioavailability, natural food folates (particularly those in green vegetables) can be unstable under typical cooking conditions, and this can substantially reduce the vitamin content before it is even ingested(Reference McNulty and Pentieva1). This is an important (but often overlooked) additional factor limiting the ability of natural food folates to enhance folate status. In contrast to food folates, FA (as found in supplements and fortified foods) provides a very stable and highly bioavailable form of the vitamin.
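As a worked illustration of the DFE convention (the intake figures used here are hypothetical and serve only to show the arithmetic; they are not drawn from the studies cited above):

\[ \mathrm{DFE}\ (\mu\text{g}) = \mu\text{g natural food folate} + 1.7 \times \mu\text{g folic acid} \]

so a daily diet providing 200 μg of natural food folate together with 100 μg of FA from fortified foods would be counted as 200 + (1·7 × 100) = 370 μg DFE.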
Vitamin B12
Vitamin B12 is required as a cofactor for two mammalian enzymes: methionine synthase and methylmalonyl-CoA mutase. In vitamin B12 deficiency, reduced function of the latter enzyme causes the substrate methylmalonyl-CoA to accumulate and form the by-product methylmalonic acid (MMA), which is then excreted into the circulation; plasma MMA is thus a functional marker that is specific to vitamin B12 and is not affected by the intake or status of other B-vitamins. In the other reaction, vitamin B12 (as methylcobalamin) acts as a cofactor for methionine synthase, which catalyses the remethylation of homocysteine to methionine. However, this metabolic step is dependent not only on vitamin B12 but also on folate. Measurement of plasma homocysteine can therefore provide a functional biomarker of vitamin B12 status, but it is not specific for vitamin B12 and is principally affected by changes in folate status (and also by vitamin B6).
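The two cobalamin-dependent reactions can be summarised in simplified form as follows (the cofactor form of vitamin B12 is shown above each arrow):

\[ \text{L-methylmalonyl-CoA} \xrightarrow{\text{methylmalonyl-CoA mutase (adenosylcobalamin)}} \text{succinyl-CoA} \]
\[ \text{homocysteine} + \text{5-methyltetrahydrofolate} \xrightarrow{\text{methionine synthase (methylcobalamin)}} \text{methionine} + \text{tetrahydrofolate} \]

When the mutase reaction is impaired, methylmalonyl-CoA accumulates and gives rise to MMA; when the synthase reaction is impaired, homocysteine accumulates, but because 5-methyltetrahydrofolate supplies the methyl group, plasma homocysteine also reflects folate status (and, via the separate transsulfuration pathway, vitamin B6).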
Measurement of serum/plasma total vitamin B12 concentration is the standard screening test for vitamin B12 deficiency, but it may not be the most sensitive marker of B12 status(Reference Miller, Garrod, Rockwood, Kushnir, Allen, Haan and Green7). It has been proposed that the most relevant measure of vitamin B12 status is holotranscobalamin (holoTC), as this represents the fraction of circulating B12 that is attached to transcobalamin, the protein that delivers vitamin B12 to cells. In recent years, a reliable radioimmunoassay (RIA) for holoTC was developed and validated(Reference Ueland, Eilertsen, Quadros, Rothenberg, Fedosov, Sundrehagen and Orning8). Numerous studies have since been published, strongly suggesting that holoTC may be a more reliable marker of B12 status than the typically used marker, serum total B12 concentration. The method formed the basis of an EU Framework 5 project (QLRT-2001-01775, "Demonstration of the clinical utility of holoTC as an early marker of vitamin B12 deficiency") aimed at determining the clinical utility of holoTC as a marker of vitamin B12 deficiency(Reference Morkbak, Heimdal, Emmens, Molloy, Hvas, Schneede, Clarke, Scott, Ueland and Nexo9). Some preliminary data from this project have been published, confirming that holoTC may be a better marker of B12 status than serum total B12 concentration(Reference Hvas and Nexo10).
There is a high prevalence of low/deficient vitamin B12 status among apparently healthy elderly people, and it becomes more common with increasing age. One recent population-based study estimated that as many as 10 % and 20 % of British people aged 65–74 years and over 75 years, respectively, were at high risk of vitamin B12 deficiency(Reference Clarke, Refsum and Birks11). In other studies the prevalence of mild vitamin B12 deficiency is reported to be as high as 45 %, depending on the diagnostic criteria used(Reference Wolters, Strohle and Hahn12). However, the low B12 status found in older people is not explained by inadequate dietary intakes of vitamin B12. In fact, among older adults living in the UK, the average daily intake of vitamin B12 was found to be 5·2 μg/d, an intake which is at least as high as that found in younger adults(Reference Henderson, Irving, Gregory, Bates, Prentice, Perks, Swan and Farron13) and markedly higher than the reference nutrient intake (RNI) of 1·5 μg/d. Thus, the average diet of older people provides vitamin B12 at levels which appear to be well in excess of needs.
The high prevalence of vitamin B12 deficiency among older people, despite adequate intakes, relates to changes in the gut with age that may prevent its release from natural food sources. Absorption of vitamin B12 from food requires the action of the stomach, pancreas, and small intestine and is therefore dependent upon normal functioning throughout the gastrointestinal tract. Less than 2 % of cases of B12 deficiency can be explained by pernicious anaemia, a condition characterised by loss of secretion of gastric intrinsic factor (GIF) leading to very marked B12 depletion as the vitamin must bind to GIF in order to be absorbed in the terminal ileum(Reference Carmel14). To effectively treat pernicious anaemia, therefore, patients must be given regular B12 injections which are continued for life. However, a much more prevalent but largely unrecognised cause of mild B12 deficiency is considered to be food-bound malabsorption. This is mainly caused by atrophic gastritis, an age-related condition resulting in decreased gastric acid production (hypochlorhydria). Because gastric acid is required for the release of vitamin B12 from proteins in food, vitamin B12 absorption from foods is diminished in older people with hypochlorhydria(Reference Carmel14). In theory, older adults with mild (pre-clinical) B12 deficiency due to food-bound malabsorption should be able to absorb free (crystalline) vitamin B12 because it is not bound to protein and because intrinsic factor is still secreted. In fact, on the basis of this assumption, the Institute of Medicine in the US(2) recommends that people aged 50 years and over consume most of their vitamin B12 from crystalline B12 found in fortified foods and supplements. No such recommendation exists in the UK and no study to date has directly addressed this issue in order to provide a firm basis for introducing one.
The issue of low/deficient B12 status among older people is also relevant to emerging policy on mandatory fortification of food with FA. A number of European countries have opted not to proceed with mandatory fortification or have delayed a decision on it because of potential health risks to the elderly, primarily the concern that high dose FA might mask the anaemia of vitamin B12 deficiency in older people, while allowing the associated irreversible neurological symptoms to progress(Reference Dickinson15). Although some reports concluded that mandatory FA fortification has not had any adverse effect with respect to the diagnosis of vitamin B12 deficiency since its introduction over 10 years ago in the US(Reference Mills, Von Kohorn, Conley, Zeller, Cox, Williamson and Dufour16, Reference Liu, West and Randell17), one recent study from the post-fortification era showed that among older American adults with a low vitamin B12 status, a higher serum folate concentration was associated with an increased risk of cognitive impairment(Reference Morris, Jacques, Rosenberg and Selhub18). Regardless of whether mandatory fortification with FA goes ahead in European countries, however, preventing the high prevalence of low vitamin B12 status due to food-bound malabsorption among older adults is an important public health issue that needs to be addressed.
Riboflavin
Riboflavin is the precursor of the coenzymes flavin mononucleotide (FMN) and flavin adenine dinucleotide (FAD), both of which are essential for numerous oxidation-reduction enzymes. Classical riboflavin deficiency (presenting as angular stomatitis, cheilosis and glossitis) rarely occurs in isolation and usually occurs in association with other vitamin deficiencies(Reference Gibson19). Riboflavin status is assessed by the erythrocyte glutathione reductase activation coefficient (EGRac). This is a functional assay which measures the activity of glutathione reductase before and after in vitro reactivation with its prosthetic group FAD(Reference Powers, Bates, Prentice, Lamb, Jepson and Bowman20). EGRac is calculated as the ratio of FAD-stimulated to unstimulated enzyme activity, with values of 1·3 or more generally indicative of suboptimal riboflavin status. Advantages of EGRac include its stability and its high sensitivity to small degrees of cofactor desaturation. The UK COMA Panel on Dietary Reference Values(21) decided to base RNI values for riboflavin (1·3 mg/d for men and 1·1 mg/d for women) on those intakes at which all but two corresponding EGRac results were less than 1·3 in the British adult survey data available at that time(Reference Gregory, Foster, Tyler and Wiseman22). Similarly, in the US, the relevant committee(2) used evidence from intake studies showing normal EGRac values when setting its reference values.
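To illustrate how the coefficient is derived (the activity values below are hypothetical and serve only to show the calculation):

\[ \mathrm{EGRac} = \frac{\text{glutathione reductase activity after in vitro addition of FAD}}{\text{basal (unstimulated) activity}} \]

For example, a sample with a basal activity of 2·0 units and an FAD-stimulated activity of 2·8 units gives an EGRac of 1·4, which would be interpreted as sub-optimal riboflavin status, whereas a stimulated activity of 2·2 units would give a ratio of 1·1, indicating that the enzyme is already largely saturated with its cofactor.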
More recently, however, the Scientific Advisory Committee on Nutrition has expressed particular concern about the high proportion of the UK adult population with apparently poor riboflavin status as determined by EGRac values. In the National Diet & Nutrition Survey (NDNS), a major proportion of adults were found to have abnormal EGRac values (>1·3): 82 % of men aged 19 to 24 years, decreasing to 54 % in those aged 50 to 64 years, and 77 % of women in the youngest age group, decreasing to 50 % in the oldest age group(Reference Ruston, Hoare, Henderson, Gregory, Bates, Prentice, Birch, Swan and Farron23). Some would argue that these abnormal EGRac values simply reflect the sensitivity of the assay procedure used because, in general, they did not correspond with the dietary data, which indicated that only 3 % of men and 8 % of women had a riboflavin intake below the lower reference nutrient intake (LRNI) of 0·8 mg/d(Reference Henderson, Irving, Gregory, Bates, Prentice, Perks, Swan and Farron13). However, the NDNS does show some evidence of low intakes in young adults, particularly young women, with some 15 % of women aged 19 to 24 years having an average daily riboflavin intake below the LRNI(Reference Henderson, Irving, Gregory, Bates, Prentice, Perks, Swan and Farron13). Consistent with the NDNS findings, a recent study in Northern Ireland (Table 1) showed generally poor riboflavin status, and again particularly so among younger women.
Values are presented as mean ± SD.
Abbreviations: tHcy, plasma total homocysteine; SF, serum folate; RCF, red cell folate; B12, serum vitamin B12; and EGRac, erythrocyte glutathione reductase activation coefficient, a functional indicator of riboflavin status. A higher EGRac ratio indicates lower riboflavin status (values ≥ 1·3 are considered to indicate sub-optimal status).
* Adapted from Hoey et al. (Reference Hoey, McNulty, Askin, Dunne, Ward, Pentieva, Strain, Molloy, Flynn and Scott54).
Whether there is a general problem of poor riboflavin status in the UK population, as indicated by the large proportion with high EGRac values, is unclear and requires further investigation. Although EGRac is well established as a valid (and often considered gold-standard) marker of riboflavin status, its measurement requires very specific sample treatment at the time of collection and is therefore limited to studies which have set out to measure it. It does not allow riboflavin status to be determined retrospectively in large clinical studies that have only a stored plasma sample available, on which, for example, B2 vitamers can be measured directly(Reference Hustad, McKinley, McNulty, Schneede, Strain, Scott and Ueland24).
B-vitamin status and disease prevention
Neural tube defects (NTDs)
NTDs are major malformations in which the neural tube fails to close properly. Importantly, the neural tube normally closes in the first few weeks of pregnancy, and therefore an NTD malformation may have occurred before a woman even knows that she is pregnant. Folate has a well established protective role against both the first occurrence and the recurrence of NTDs, resulting in clear recommendations in the UK and elsewhere which have been in place for over 15 years. For the prevention of a first occurrence of NTD (i.e. primary prevention), official bodies worldwide recommend that women take an additional 400 μg/d of folate/FA from before conception up to the 12th gestational week. However, achieving this recommendation is proving very problematic in practice. Compliance with FA supplements is poor(Reference Eichholzer, Tonz and Zimmermann25) and, consequently, supplement-based recommendations are generally ineffective. Recent evidence shows that current recommendations have had no measurable impact on the rates of NTDs in some 13 European countries (including the UK) over a 10-year period spanning the introduction of the recommendations(Reference Botto, Lisi and Robert-Gnansia26). In contrast, the population-based policy of mandatory FA fortification of cereal grains implemented in North America and some non-European countries has substantially increased folate status(Reference Pfeiffer, Caudill, Gunter, Osterloh and Sampson27) and reduced the occurrence of NTDs(Reference Honein, Paulozzi, Mathews, Erickson and Wong28, Reference De Wals, Tairou and Van Allen29). Despite this evidence, many governments elsewhere have decided against implementing similar population-based policies because of safety concerns regarding chronic exposure to FA.
Cardiovascular disease
There is considerable interest in plasma homocysteine as a cardiovascular disease (CVD) risk factor, with meta-analyses of prospective studies predicting that lowering concentrations by 3 μmol/l (i.e. a 25 % reduction in current levels) would reduce the risk of heart disease by up to 16 % and stroke by up to 24 %(30, Reference Wald, Law and Morris31). Good evidence for a causal relationship between elevated homocysteine and CVD comes from genetic studies. The most important genetic determinant of homocysteine in the general population is the common 677C → T variant in the gene coding for the folate-metabolising enzyme methylenetetrahydrofolate reductase (MTHFR), which results in impaired folate metabolism and higher homocysteine concentrations in vivo (Reference Frosst, Blom and Milos32). Importantly, people with the homozygous mutant (TT) genotype (typically 10 % of Western populations)(Reference Wilcken, Bamforth and Li33) are found to have a significantly higher CVD risk (by 14 to 21 %) compared with those without this genetic factor(Reference Klerk, Verhoef, Clarke, Blom, Kok and Schouten34, Reference Lewis, Ebrahim and Davey Smith35).
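To make the arithmetic behind these projections explicit (assuming, purely for illustration, a typical population mean homocysteine concentration of about 12 μmol/l, a figure consistent with the 25 % quoted above):

\[ \frac{3\ \mu\text{mol/l}}{12\ \mu\text{mol/l}} = 0.25 \]

i.e. a reduction of 3 μmol/l corresponds to roughly one-quarter of the prevailing population level, and it is a lowering of this magnitude to which the projected risk reductions of up to 16 % for heart disease and up to 24 % for stroke refer.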
A recent population-based study showed that the decline in stroke-related mortality observed in recent years in the US and Canada was related to the introduction of mandatory FA fortification(Reference Yang, Botto, Erickson, Berry, Sambell, Johansen and Friedman36). Although the secondary prevention trials published to date have failed to confirm a benefit of homocysteine-lowering therapy on CVD events generally, they are now widely recognised as being substantially underpowered to detect a significant effect on heart disease risk, and therefore cannot be interpreted as evidence that no relationship exists. In support of this, recently published evidence from a meta-analysis of clinical trials showed that homocysteine-lowering (by FA) reduced the risk of stroke by 18 % overall, and by 25 % in trials in people without a history of stroke(Reference Wang, Qin, Demirtas, Li, Mao, Huo, Sun, Liu and Xu37). Thus, although primarily aimed at reducing NTDs, FA fortification may have an important role in the primary prevention of CVD, in particular stroke, via homocysteine-lowering.
Cancer
Folate is thought to play an important role in cancer prevention, with epidemiological evidence consistently showing inverse relationships between folate intake and the risk of cancer, most notably colorectal, but also pancreatic and breast cancer(Reference Giovannucci38, Reference Ericson39). These observational findings in humans are supported by animal and experimental studies, the latter focusing on the mechanisms that might explain the cancer-preventative properties of folate. These have been attributed to its function in the de novo synthesis of thymidylate and purines, the nucleotides required for DNA replication and repair. Folate is also required for the production of S-adenosylmethionine, the universal donor of methyl groups for numerous methylation reactions, including the methylation of DNA, which is central to gene silencing. In addition, genetic studies show that polymorphisms in folate metabolism are important in influencing folate status, related one-carbon metabolism and cancer risk(Reference Giovannucci38).
Of note, the issue of folate and cancer has received much recent public health and scientific attention as a result of the publication of a randomised controlled trial which provided preliminary evidence, though as yet unconfirmed, suggesting that exposure to high dose FA (1 mg/d for 5 years) could promote colorectal tumorigenesis specifically in patients with pre-existing neoplasms(Reference Cole, Baron and Sandler40). This in turn supports the view that FA might have a dual role in carcinogenesis: it might be protective before any precancerous lesions have developed and in populations with low folate status, whereas at high doses it might stimulate the further development of existing lesions in populations already exposed to high FA intakes (through fortification and supplementation). Thus the role of folate in carcinogenesis may be more complicated than the observational studies suggest; the timing, and particularly the dose of FA, appear to be highly relevant.
Cognitive impairment
Impaired cognitive function is a common problem of ageing, ranging in severity from mild memory loss to dementia, the latter characterised by a decline in memory and thinking to a level that impairs activities of daily living. Recent interest has focussed on the potential role of the homocysteine-related B-vitamins in maintaining cognitive function in ageing. Observational studies in both healthy and cognitively impaired older adults show that lower B-vitamin status and/or higher homocysteine is associated with poorer cognitive function. However, it is difficult to interpret much of the evidence from observational studies, since poor diet may be both a cause and a consequence of poor cognitive function. Evidence from randomised trials of homocysteine-lowering with B-vitamins is limited and inconclusive, with many of the studies in this area being of insufficient duration or sample size. Two well designed, randomised placebo-controlled trials of relatively long duration in older adults were recently published, but their findings conflict. One trial reported no benefit of B-vitamins (FA, vitamin B12 and vitamin B6 for two years) on cognitive performance(Reference McMahon, Green, Skeaff, Knight, Mann and Williams41), whereas the other concluded that supplementation with FA for three years significantly improved cognitive function(Reference Durga, van Boxtel, Schouten, Kok, Jolles, Katan and Verhoef42). Further evidence is required to confirm whether optimal B-vitamin status has a role in preventing cognitive impairment or dementia.
Osteoporosis
In recent years, evidence has emerged from large cohort studies in the USA, the Netherlands and Norway linking elevated homocysteine levels (and/or lower status of the related B-vitamins) to lower bone mineral density and a higher rate of osteoporotic fracture. More convincing evidence is provided by a randomised trial showing that combined treatment with FA and vitamin B12 was effective in reducing the risk of hip fracture following stroke(Reference Sato, Honda, Iwamoto, Kanoko and Satoh43), offering considerable support for the hypothesis that folate and/or vitamin B12 may delay the progression of osteoporosis, or that high homocysteine levels may promote it. If the link between elevated homocysteine and osteoporosis is confirmed to be causal, there is considerable scope for prevention of bone disease by lowering homocysteine through enhanced B-vitamin status.
Homocysteine as a functional biomarker of B-vitamin status
Apart from its potential role as a risk factor for CVD and other diseases, plasma homocysteine is very responsive to intervention with the B-vitamins required for its metabolism: folate(44) and, to a lesser extent, vitamin B12(Reference Eussen, de Groot and Clarke45), vitamin B6(Reference McKinley, McNulty and McPartlin46) and riboflavin(Reference McNulty, Dowey, Strain, Dunne, Ward, Molloy, McAnena, Hughes, Hannon-Fletcher and Scott47). A recent meta-analysis of intervention studies examining the effect of B-vitamins on homocysteine-lowering showed that vitamin B12 produces a further 7 % lowering of homocysteine (roughly an additional one-third) over and above the 20–25 % lowering typically achieved with FA alone(48). Thus vitamin B12 is generally considered a far less effective determinant of homocysteine concentrations than folate. However, evidence from a study in healthy subjects supplemented with low-dose FA(Reference Quinlivan, McPartlin, McNulty, Ward, Strain, Weir and Scott49), and from studies in the era of mandatory FA fortification of cereal grains in the US(Reference Liaugaudas, Jacques, Selhub, Rosenberg and Boston50), shows that vitamin B12 becomes the main nutritional determinant of homocysteine once folate status is optimised.
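A simple worked example may help to put these relative effects into perspective (the baseline concentration of 12 μmol/l is hypothetical and used only for illustration, with both percentage reductions expressed relative to that baseline):

\[ 12\ \mu\text{mol/l} \times 0.25 = 3.0\ \mu\text{mol/l} \quad \text{(lowering attributable to FA)} \]
\[ 12\ \mu\text{mol/l} \times 0.07 \approx 0.8\ \mu\text{mol/l} \quad \text{(additional lowering attributable to vitamin B12)} \]

On these figures the contribution of vitamin B12 is indeed only about one-third of that of FA, yet it becomes proportionately much more important once folate status has been optimised and the folate-responsive fraction of homocysteine has already been removed.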
The typical phenotype associated with homozygosity (TT genotype) for the MTHFR 677C → T polymorphism is elevated homocysteine levels(Reference Frosst, Blom and Milos32). Individuals with the TT genotype are considered to have increased dietary folate requirements on the basis that they have lower red cell folate levels compared to those without this genetic variant(Reference Molloy, Daly, Mills, Kirke, Whitehead, Ramsbottom, Conley, Weir and Scott51), and the expected increase in homocysteine is found to be most marked among those with lower folate status(Reference Jacques, Bostom, Williams, Ellison, Eckfeldt, Rosenberg, Selhub and Rozen52). Apart from folate, however, riboflavin (FAD) is required as a cofactor for the MTHFR enzyme. The reduced activity of the variant form of MTHFR has been shown in vitro to result from the inappropriate loss of its FAD cofactor(Reference Yamada, Chen, Rozen and Matthews53). Recent results, showing a genotype-specific response of homocysteine to riboflavin supplementation, now confirm that riboflavin is an independent modifier of homocysteine in people with the TT genotype. Significant lowering of homocysteine in response to riboflavin supplementation was observed in healthy individuals with the TT genotype, with levels decreasing by as much as 22 % overall, and markedly so (by 40 %) in those with lower riboflavin status at baseline(Reference McNulty, Dowey, Strain, Dunne, Ward, Molloy, McAnena, Hughes, Hannon-Fletcher and Scott47). No homocysteine response to intervention was observed in non-homozygous individuals (CC or CT genotypes), despite a significant improvement in riboflavin status and the pre-selection of subjects with sub-optimal riboflavin status at baseline. The responsiveness of homocysteine to riboflavin is therefore specific to individuals with the MTHFR 677 TT genotype and represents a new gene-nutrient interaction.
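The underlying biochemistry, shown here in simplified form, helps to explain this gene-nutrient interaction: MTHFR is a flavoprotein that uses FAD, derived from riboflavin, to catalyse the reduction of 5,10-methylenetetrahydrofolate to 5-methyltetrahydrofolate, the methyl donor for the remethylation of homocysteine to methionine:

\[ \text{5,10-methylenetetrahydrofolate} + \text{NADPH} + \text{H}^{+} \xrightarrow{\text{MTHFR (FAD)}} \text{5-methyltetrahydrofolate} + \text{NADP}^{+} \]

Because the variant enzyme encoded by the 677TT genotype loses its FAD cofactor more readily, an improved riboflavin supply would be expected to stabilise the enzyme and thereby support homocysteine remethylation, consistent with the genotype-specific responses described above.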
Thus homocysteine can be considered a reliable functional marker of the status of folate in particular, of vitamin B12 in those optimised in folate, and of riboflavin specifically in those with the MTHFR 677 TT genotype.
Public health challenges in achieving optimal B-vitamin status
Optimal folate status has an established role in preventing NTDs and possible preventative roles in stroke and other diseases. Achieving optimal folate status should therefore be a public health priority, but it will require intakes of the vitamin greater than those currently provided by the typical diet eaten in most European countries. Strategies to increase folate status are generally ineffective if based on health promotion and educational campaigns, and are controversial if based on mandatory fortification of foods with FA. There is a particular challenge for some European and other populations without access to fortified foods (not even on a voluntary basis in some cases), who are therefore dependent on natural food folate sources. In the case of vitamin B12, the achievement of an optimal status may present a particular difficulty for many older people because of the common problem of age-related food-bound B12 malabsorption. Finally, the evidence that riboflavin status is generally low in the UK population, and particularly so in younger women, warrants further investigation.