Date Published: June 5, 2017
Publisher: American Society for Nutrition
Author(s): Susan J Fairweather-Tait, Amy Jennings, Linda J Harvey, Rachel Berry, Janette Walton, Jack R Dainty.
Background: Values for dietary iron bioavailability are required for setting dietary reference values. These are estimated from predictive algorithms, nonheme iron absorption from meals, and models of iron intake, serum ferritin concentration, and iron requirements.
The bioavailability of dietary iron can be defined as the proportion (or percentage) of ingested iron that is absorbed and used within the body. A value for dietary iron bioavailability (sometimes referred to as the bioavailability factor) is required to transform physiologic requirements (i.e., absorbed iron) into dietary intakes and, hence, to derive dietary reference values (DRVs) and to develop dietary recommendations and public health policies. Initially, bioavailability factors were derived from predictive algorithms on the basis of intakes of heme iron and enhancers of nonheme-iron absorption (1). This method was followed by more complex algorithms that included inhibitors as well as enhancers of nonheme-iron absorption (2, 3), whereby the magnitude of the effect of modifiers of nonheme-iron absorption was determined from single-meal studies. Because the effect of enhancers and inhibitors may be exaggerated in single-meal studies (4), the mean absorption of nonheme iron from >1 meal was used to more closely reflect the whole diet (5, 6). However, this assessment does not reflect the diet that is consumed over time, and, in addition, an adjustment has to be made to take into account the heme content of the diet with an assumed absorption value.
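The conversion described above, from a physiologic (absorbed) requirement to a dietary intake via a bioavailability factor, can be sketched as a minimal calculation. The function name and the example values (an absorbed requirement of 1.5 mg/d at 15% bioavailability) are illustrative assumptions, not figures from this study:

```python
def required_intake(absorbed_mg_per_day: float, bioavailability: float) -> float:
    """Convert a physiologic (absorbed) iron requirement into a dietary intake.

    bioavailability is expressed as a fraction (e.g., 0.15 for 15%).
    A lower bioavailability factor implies a higher dietary intake is needed
    to meet the same absorbed requirement.
    """
    if not 0 < bioavailability <= 1:
        raise ValueError("bioavailability must be a fraction in (0, 1]")
    return absorbed_mg_per_day / bioavailability

# Illustrative values only: 1.5 mg/d absorbed requirement, 15% bioavailability
print(required_intake(1.5, 0.15))  # 10.0 mg/d dietary iron
```

This makes explicit why the choice of bioavailability factor matters for DRVs: halving the assumed bioavailability doubles the derived dietary reference value.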
Data were used from the following 3 studies: the National Diet and Nutrition Survey (NDNS), the National Adult Nutrition Survey (NANS), and the New Dietary Strategies Addressing the Specific Needs of Elderly Population for a Healthy Ageing in Europe (NU-AGE). Briefly, the NDNS (9) and NANS (10) used nationally representative samples of adults (with the exclusion of pregnant women and breastfeeding women) in the United Kingdom (aged 19–64 y) and Republic of Ireland (aged ≥19 y), respectively. The NU-AGE study was a randomized, controlled, multicenter trial of healthy, independent older people (without frailty, heart failure, or serious chronic illness) aged 65–79 y with the aim of assessing the effects of a 1-y dietary intervention on markers of inflammation and health (11, 12). We used baseline data from United Kingdom participants only because their dietary patterns were likely to be similar to those in the other UK surveys; the data were collected between September 2012 and January 2014. Detailed methods of the data collection have been previously published (9–12), but the information that is pertinent to this article (dietary assessment and analytic methods) is summarized as follows.
A flowchart of the numbers of participants who were recruited and excluded at different stages of the 3 studies is shown in Supplemental Figure 1. Details of the 3 studies, including study subjects, exclusion criteria, analytic methods, and dietary assessment, are summarized in Supplemental Table 1. The characteristics of participants from the 3 studies are presented in Table 1, and individual data are given in Supplemental File 1. The percentages of individuals with acute-phase reactant values that were indicative of inflammation or infection (hs-CRP concentration >5 mg/L or α-1-antichymotrypsin concentration >0.65 g/L) were 0% in the NDNS, 15% in the NANS, and 5% in the NU-AGE study. These individuals were excluded from the analysis because their SF concentrations may have been elevated and, therefore, might not have reflected iron stores accurately.
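The exclusion rule above can be expressed as a small screening function. The cutoffs (hs-CRP >5 mg/L or α-1-antichymotrypsin >0.65 g/L) come from the text; the function name, the participant records, and the handling of missing markers are illustrative assumptions:

```python
def flagged_for_inflammation(hs_crp_mg_l=None, act_g_l=None) -> bool:
    """Return True if acute-phase reactants suggest inflammation or infection,
    using the cutoffs stated in the text: hs-CRP > 5 mg/L or
    alpha-1-antichymotrypsin (ACT) > 0.65 g/L.
    A missing marker (None) does not, on its own, flag the participant.
    """
    crp_high = hs_crp_mg_l is not None and hs_crp_mg_l > 5.0
    act_high = act_g_l is not None and act_g_l > 0.65
    return crp_high or act_high

# Hypothetical participant records for illustration
participants = [
    {"id": 1, "hs_crp": 1.2, "act": 0.40},  # retained
    {"id": 2, "hs_crp": 7.8, "act": 0.50},  # excluded: hs-CRP > 5 mg/L
    {"id": 3, "hs_crp": 2.0, "act": 0.70},  # excluded: ACT > 0.65 g/L
]
retained = [p for p in participants
            if not flagged_for_inflammation(p["hs_crp"], p["act"])]
print([p["id"] for p in retained])  # [1]
```

Screening on both markers with an OR rule is conservative: a participant is removed if either marker suggests an acute-phase response that could inflate SF independently of iron stores.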
In our model, differences in iron status between the 3 study population groups were partly explained by age (compared with premenopausal women, postmenopausal women had lower iron status because of their lower iron requirements) and also by diet (i.e., higher intake of meat in the NANS groups was associated with higher SF concentrations). When adequate body iron stores were present at an SF concentration of 60 μg/L, the efficiency of iron absorption was no longer upregulated (16), and the computed differences in dietary iron absorption were minimal, but with a lower SF concentration, the effect of the diet became more marked, thereby illustrating the importance of applying iron intake and SF data that are collected in populations with different dietary patterns. In particular, it appears that meat consumption is a key determinant of body iron status.
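The regulatory behavior described above, with absorption efficiency upregulated at low iron stores and no longer upregulated once SF reaches about 60 μg/L, can be illustrated with a toy function. The inverse shape and the normalization are assumptions chosen for illustration only; this is not the study's fitted model, and only the 60-μg/L plateau comes from the text:

```python
def relative_absorption(sf_ug_l: float, plateau_sf: float = 60.0) -> float:
    """Toy relative absorption efficiency vs. serum ferritin (SF),
    normalized to 1.0 once stores are adequate (SF >= plateau_sf ug/L).
    The inverse relation below plateau is an illustrative assumption.
    """
    sf = max(sf_ug_l, 1.0)  # guard against nonphysiologic values
    if sf >= plateau_sf:
        return 1.0              # adequate stores: absorption not upregulated
    return plateau_sf / sf      # low stores: absorption upregulated

for sf in (15, 30, 60, 100):
    print(sf, relative_absorption(sf))
```

The plateau is what drives the observation in the text: above ~60 μg/L the computed differences in dietary iron absorption are minimal, whereas at lower SF the dietary composition (e.g., meat intake) has a progressively larger effect on absorbed iron.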