Implications of food fortification

March 15, 2014


Johanna T. Dwyer, D.Sc., R.D., Tufts Medical Center, Boston, Massachusetts, USA.

“Fortification or enrichment means the addition of one or more essential nutrients to a food, regardless of whether it is normally contained in the food or not, for the purposes of preventing or correcting a demonstrated deficiency of one or more nutrients in the population or specific population groups. In the first half of the 20th century, fortification was used to address classical nutrient deficiencies throughout the world. In the United States, iodine was added to salt to reduce the risk of goiter; vitamin D was added to milk to reduce the risk of rickets; and iron, thiamin, niacin, and riboflavin were added to wheat flour and other cereal products to replace nutrients lost during the milling process and to reduce the risk of iron-deficiency anemia, beriberi, pellagra, and riboflavin deficiency, respectively. For cereal grains in the United States, although the levels of nutrients originally mandated were set to replace losses incurred during processing rather than to increase nutrient levels, the intent was the same: to add specific nutrients to frequently consumed foods to ensure nutrient adequacy in the American population. Folic acid was incorporated into the US Food and Drug Administration (FDA) standards of identity for enriched grain-based foods in the 1990s to reduce the risk of neural tube defects. For certain foods, such as enriched flour and bread, standards of identity specify the nutrients contained therein and the levels of nutrients that must be added for the products to be marketed as such. Fortified and enriched foods help to improve the overall nutritional quality of the food supply and address a demonstrated public health need.

The nutrients added to foods must be approved food additives or must be generally recognized as safe under the conditions of their intended use. It is important that nutrient fortification is appropriate and necessary. The fortification of fresh produce, meat, poultry or fish products, sugars, or certain snack foods (e.g., candy or carbonated beverages) and the indiscriminate addition of nutrients to foods are all deemed to be inappropriate. Some nutrients, such as folic acid and vitamin D, are specifically limited by regulations regarding which foods can be fortified, and at what levels, to avoid overconsumption. In contrast, vitamin A can be added to any food without limitation, other than restrictions imposed by good manufacturing practices. Margarine is required to contain vitamin A and may contain vitamin D. Whole milk may be fortified with vitamin A at a level not less than 500 IU per 8 oz serving and vitamin D at a level of 100 IU per 8 oz serving, reflecting 10% and 25% of the respective daily values (DV) of these vitamins. Reduced-fat milks must be fortified with vitamin A to avoid nutritional inferiority. Although fortification of milk with vitamin D is optional in the United States, nearly all pasteurized fluid milk sold there contains added vitamin D.
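As a quick check of the daily-value arithmetic quoted above, the following sketch assumes the pre-2016 US daily values of 5,000 IU for vitamin A and 400 IU for vitamin D (an assumption, since the text does not state them):

```python
# Check of the %DV arithmetic for fortified whole milk.
# Assumed (pre-2016) US daily values: vitamin A 5,000 IU, vitamin D 400 IU.
DAILY_VALUES_IU = {"vitamin A": 5000, "vitamin D": 400}
PER_SERVING_IU = {"vitamin A": 500, "vitamin D": 100}  # added per 8 oz serving

for nutrient, added in PER_SERVING_IU.items():
    pct_dv = 100 * added / DAILY_VALUES_IU[nutrient]
    print(f"{nutrient}: {added} IU per 8 oz = {pct_dv:.0f}% DV")
# vitamin A: 500/5000 = 10% DV; vitamin D: 100/400 = 25% DV
```

Under those assumed daily values, the stated 10% and 25% figures check out.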

The most recent example of an addition to the FDA’s fortification policy, based on public health concerns, was the FDA’s final ruling in 1996 on the addition of folic acid to enriched grain products in order to reduce the risk of neural tube defects (NTDs). This change in policy was initiated in response to the 1992 recommendation from the US Department of Health and Human Services and Centers for Disease Control that all women capable of becoming pregnant should consume 400 micrograms of folic acid daily. The rationale for fortifying cereal grains was that they are consumed by 90% of women, so their fortification would increase the folic acid intake of most women of childbearing age without requiring a change in the dietary patterns of the target population. Folic acid fortification is also mandatory in Canada. As reported in 2011, although less than 1% of Canadians are folate deficient and 40% show high red blood cell folate concentrations (above 1,360 nmol/L), almost a quarter of women of childbearing age have suboptimal folate concentrations for maximal NTD risk reduction. As a result of the change in folic acid fortification policy, there has been an increase in folic acid intake and an improvement in folate status (serum and red blood cell folate levels) as well as a reduction in the prevalence of NTDs in both the United States and Canada. Folic acid fortification is carried out in more than 60 countries today.

Although the nutritional fortification of foods clearly helps to alleviate some nutrient deficiencies in the short term, it is not a panacea. Appropriate modeling, testing, and monitoring must be undertaken before fortification can be implemented, and the underlying causes of specific nutrient deficiencies must ultimately be addressed. An analysis of National Health and Nutrition Examination Survey (NHANES) data from 2005 to 2008, examining nutritional intake from food alone, revealed multiple shortfalls in the US population: approximately 94% of the population did not meet the estimated average requirement (EAR) for vitamin D, 90% for vitamin E, 50% for magnesium, 46% for calcium, 41% for vitamin A, and 37% for vitamin C (1, 2). The intake spread for each nutrient varied by subject age and supplement use. For some nutrients, dietary intake was inadequate only for certain groups, e.g., vitamin B6 and folate among adult females, phosphorus among teenage girls, zinc among adults aged over 70 years and teenage girls aged 14–18 years, and iron among pregnant females.
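The shortfall percentages above reflect the EAR cut-point method, in which the prevalence of inadequacy is estimated as the share of the population whose usual intake falls below the EAR. A minimal sketch of that calculation, using a simulated intake distribution with illustrative numbers (not NHANES data):

```python
# EAR cut-point method: prevalence of inadequacy is approximated by the
# fraction of the population whose usual intake falls below the EAR.
# The intake distribution and EAR value below are illustrative only.
import random

random.seed(0)

EAR_MG = 350  # illustrative EAR for a single nutrient, mg/day

# Simulated usual-intake distribution (right-skewed, as intakes often are)
usual_intakes = [random.lognormvariate(5.8, 0.35) for _ in range(10_000)]

below_ear = sum(1 for intake in usual_intakes if intake < EAR_MG)
prevalence = below_ear / len(usual_intakes)
print(f"Estimated prevalence of inadequacy: {prevalence:.0%}")
```

In a real analysis, the usual-intake distribution would be estimated from repeated 24-hour recalls with measurement-error adjustment rather than simulated.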

Fortification of the food supply is an alternative to relying on supplement use. According to an analysis of NHANES data from 2003 to 2006, many Americans would not have achieved the micronutrient intake levels recommended in the Dietary Reference Intakes (DRIs) without one or a combination of the following: food fortification, food enrichment, and the use of dietary supplements (3). Enriched and/or fortified foods contribute substantially to intakes of vitamins A, C, and D, as well as thiamin, iron, and folate, although intakes of some of these nutrients are still below the EARs for a significant proportion of the population. Most of the water-soluble vitamins in US diets come from enriched/fortified foods and/or supplements, whereas the major sources of most minerals, with the exception of iron, are foods that are neither enriched nor fortified. It should be noted that vitamin D represents a special case because much of the body’s supply comes from exposure to sunlight rather than dietary sources, making diet a poor proxy for nutritional adequacy. An analysis has also been conducted on the impacts of enrichment/fortification in children (4).

There are many challenges associated with the assessment of nutrient intake from foods and supplements in order to predict or monitor the impacts of fortification. It is important to deal with these challenges because changes in fortification and in FDA policy could have effects on the nutrition of the population. These effects would likely vary by nutrient, food vehicle, and target group. The choice of foods selected for fortification is critical because these foods must reach the high-risk groups of the population in order to enhance their nutritional intake without creating excessive intake for the rest of the population. If a single food is fortified, it is not certain that everyone in a particular high-risk group will benefit because some may not consume that food for one reason or another; this is true of vitamin D-fortified milk, which is often avoided by those who believe they are allergic or intolerant to milk. In addition, it is possible that fortification may miss the intended population because individuals with the poorest diets may be less likely to eat the fortified foods. Fortification of a staple food or food component will increase consumption of a nutrient, but at a differential rate for low and high consumers of that item. Fortifying several foods, or ingredients used in many foods, with a single nutrient may benefit a larger number of people, but it becomes more difficult to assess the impact of fortification on the population and to avoid excessive intake among generally heavy consumers of fortified foods. However, with respect to the United States, a recent population-based survey found that the percentage of individuals exceeding the tolerable upper intake level (UL) for most nutrients (calcium, iron, zinc, and vitamins A, C, and E) via food and supplements was relatively small – approximately 3–8%, niacin being the sole exception with approximately 10% above the UL (3).
Adults aged over 18 who exceeded the UL for folic acid, for example, consumed all of the following: enriched grain products, ready-to-eat cereals, and supplements containing folic acid.

Only certain members of the population take dietary supplements, and so their impact on nutrient intake is limited to that subgroup, whereas most of the population are exposed to fortification and enrichment. Although supplement use increases the percentage of people meeting EARs, it also increases the percentage that may exceed ULs. People who regularly opt for enriched/fortified foods and take high-dose dietary supplements may exceed ULs. Roughly half of the adult US population and 70% of US adults aged over 71 take dietary supplements. The majority of people report taking only one or two dietary supplements on a regular basis. A multivitamin/multimineral supplement is the most common type of dietary supplement taken in the United States. However, dietary intakes alone are not necessarily representative of nutritional status, or of the impacts of inadequacies or excesses. Some researchers have suggested that ULs may be set too low, partly as a result of excessively large safety factors: factors that account for extrapolations from experimental animals to humans, for extrapolations across age groups, and for interpersonal differences within a life-stage group (5). In addition, the shape of the dose-response curve, which is unknown for most nutrients, is needed to carry out a risk assessment. Without that information, it is impossible to estimate the proportion of individuals within a group that are at risk from high nutrient intake (6). If current ULs are set too low, the proportion of people with intakes deemed to be above the UL according to the current methodology is likely to be an overestimation of the proportion of individuals at risk of harm from excess nutrient intake from all sources.
Rather than using ULs, the risk of excess could be approached in the same way as the risk of inadequacy, that is, by estimating an average tolerance of population subgroups for each nutrient and formulating a distribution of tolerances (estimated average tolerance) for the group, with a variance that reflects person-to-person differences.
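The two approaches to estimating risk of excess can be contrasted in a small simulation: a fixed-UL cut-point versus person-specific tolerances drawn around an estimated average tolerance. All distributions and values below are illustrative assumptions, not DRI figures:

```python
# (a) UL cut-point: share of intakes above one fixed UL.
# (b) Tolerance distribution: each person has an individual tolerance drawn
#     around an "estimated average tolerance"; risk of excess is the share
#     whose intake exceeds their own tolerance. All numbers are illustrative.
import random

random.seed(1)
N = 10_000

intakes = [random.gauss(600, 150) for _ in range(N)]       # µg/day, illustrative
UL = 1000                                                   # fixed cut-point, µg/day
tolerances = [random.gauss(1200, 200) for _ in range(N)]    # person-specific tolerances

share_above_ul = sum(x > UL for x in intakes) / N
share_above_tolerance = sum(x > t for x, t in zip(intakes, tolerances)) / N

print(f"Above fixed UL:           {share_above_ul:.1%}")
print(f"Above personal tolerance: {share_above_tolerance:.1%}")
```

With the average tolerance set above the fixed UL, as in this sketch, the tolerance-distribution estimate still flags some individuals (those with below-average tolerance), illustrating how the two approaches can diverge.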

In addition to increasing nutrient intakes within the population to varying degrees, fortification appears to have an impact on consumers’ purchasing decisions, which can ultimately affect their health and well-being. According to findings from the International Food Information Council’s Food & Health Survey 2011 (7), four out of five Americans purchase foods and beverages specifically because they have been fortified or offer an additional benefit. About one third believe that fortification has a moderate or great impact on their overall health. A little more than one quarter indicated that fortified foods have a great or moderate impact on their food purchasing decisions.

Food vehicles that have been employed successfully in fortification programs around the world or that show promise for fortification include wheat products (flour, bread, and pasta), maize products (corn grits, cornmeal, and corn porridges), milled rice, ready-to-eat breakfast cereals, infant formulas, infant cereals, milk and other dairy products, margarines, vegetable oils, salt, sugar, soy sauce, and fish sauce. Technologies for manufacturing vitamin and mineral fortificants are also well-developed. Synthetic forms of most vitamins that are identical to naturally occurring forms are available in high purity. In recent years, manufacturers have developed encapsulated forms to stabilize vitamins and minerals from degradation during processing and storage. Vitamin and mineral premixes formulated to the specifications of the food manufacturer are available from reputable sources at reasonable costs. Benefit/cost calculations take into account the prevalence and degree of nutrient inadequacy, costs of treating nutrient deficiency diseases, assumptions on the value of a human life saved or improved, and the impact of nutrient deficiencies on worker productivity, among other factors. Benefit/cost ratios have been computed for several fortification regimens, and while the estimates are not precise, most appear to be greater than 1. However, even with the fortification of several food products in the United States, intakes of some nutrients still fall short of recommendations, and a gap may sometimes exist between nutrient intakes and nutrient status.”
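A benefit/cost calculation of the kind described above can be sketched from the listed factors. Every figure below is a hypothetical placeholder, not a value from the literature:

```python
# Illustrative benefit/cost sketch for a fortification program.
# All figures are hypothetical placeholders for the factors named in the text.
population = 10_000_000
cost_per_person_per_year = 0.25        # fortificant + processing cost, USD

cases_averted_per_year = 5_000         # deficiency cases prevented
cost_per_case_treated = 300.0          # avoided treatment cost per case, USD
productivity_gain_per_case = 500.0     # avoided productivity loss per case, USD

total_cost = population * cost_per_person_per_year
total_benefit = cases_averted_per_year * (cost_per_case_treated
                                          + productivity_gain_per_case)

ratio = total_benefit / total_cost
print(f"Benefit/cost ratio: {ratio:.2f}")  # 4,000,000 / 2,500,000 = 1.60
```

With these placeholder inputs the ratio exceeds 1, consistent with the text's observation that most computed ratios, though imprecise, appear favorable; real analyses would also monetize lives saved or improved.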

Based on: Dwyer J. T. et al. Fortification: new findings and implications. Nutrition Reviews. Published online January 2014.


  1. US Department of Agriculture, Agricultural Research Service. What We Eat in America, NHANES 2005–2006. Washington, DC: USDA; 2009.
  2. US Department of Agriculture, Agricultural Research Service. What We Eat in America, NHANES 2007–2008. Washington, DC: USDA; 2010.
  3. Fulgoni V. L. 3rd et al. Foods, fortificants, and supplements: Where do Americans get their nutrients? J Nutr. 2011; 141:1847–1854.
  4. Bailey R. L. et al. Do dietary supplements improve micronutrient sufficiency in children and adolescents? J Pediatr. 2012; 161:837–842.
  5. Zlotkin S. A critical assessment of the upper intake levels for infants and children. J Nutr. 2006; 136(Suppl):502S–506S.
  6. Carriquiry A. L. and Camaño-Garcia G. Evaluation of dietary intake data using the tolerable upper intake levels. J Nutr. 2006; 136(Suppl):507S–523S.
  7. International Food Information Council. IFIC Food & Health Survey. 2011.