The European Food Information Council (EUFIC), Brussels, Belgium
“Food fortification is defined as the practice of deliberately increasing the content of an essential micronutrient, i.e. vitamins and minerals (including trace elements), in a food, so as to improve the nutritional quality of the food supply and provide a public health benefit with minimal risk to health (1). With the vast array of food now on offer to most Europeans, dietary deficiencies should be a thing of the past. However, the intake of certain micronutrients remains a concern. Some of these, such as iodine, folic acid, calcium, and vitamin D, have been used to fortify staple foods.
Historically, iodine and vitamin D deficiencies were widespread throughout Europe, making diseases like goiter and rickets commonplace. Goiter indicates severe iodine deficiency, yet mental impairment can occur even when deficiency is mild. An effective strategy for managing iodine deficiency has been the iodization of salt (2). Since the introduction of iodized salt (e.g. in 1922 in Switzerland), European countries have dramatically reduced their goiter rates, and these populations now maintain an adequate level of iodine intake. Similar success has been seen in countries that fortify their milk with vitamin D, practically eliminating childhood rickets (3). Likewise, the mandatory fortification of margarine with vitamins A and D, adjusted to mirror the amounts of these vitamins contained in butter (for which margarine is commonly used as a substitute), has helped establish what has been termed “nutritional equivalence” (1). In other words, people switching from butter to margarine will maintain their intake of these essential nutrients. Despite these efforts, vitamin D has re-emerged as a major public health issue over the last five years. Aside from its well-known role in the mineralization of bones and teeth, vitamin D appears relevant to a range of other health outcomes, and there is ongoing debate about the appropriate recommendations (3, 4).
A more recent fortification strategy has been the addition of folic acid to flour, primarily to reduce neural tube defects. This became mandatory in the US in 1998 but remains voluntary in Europe; mandatory fortification is controversial due to concerns over a potential increased risk of bowel cancer (5). One of the disadvantages of food fortification is the possibility of particular groups over-consuming nutrients (6). A key feature of fortification, therefore, is calculating the optimum amount of the nutrient to be used: it needs to be effective but safe. A recent study has shown that the intake of nutrients via supplements and fortified foods varies considerably from country to country within Europe, and that nutritional inadequacy in European children merits further fortification of foods with selected micronutrients (7, 8). Strict regulations within the European Union control not only the amount of nutrients added to food, but also their use as fortificants in general (9).
Fortificants need to be in a form the body can utilize easily. Iron is a good example. It comes in two forms: heme iron (found only in animal food sources) and non-heme iron (found in both animal and non-animal food sources). Iron from animal foods, such as meat, fish, and poultry, is much better absorbed than iron from non-animal sources like vegetables. Iron added as a fortificant is in the non-heme form, but its absorption can be improved: vitamin C (e.g. from citrus fruits) and animal proteins (meat, poultry, and fish) enhance the absorption of non-heme iron.
Fortified foods can fill certain nutritional gaps, yet they do not replace the need for a healthy, balanced diet comprising a variety of foods. Fortification can be self-limiting, because high levels of added nutrients can alter the taste and appearance of a food; a diet providing the optimal level and balance of nutrients is potentially worthless if it does not look or taste good enough to eat. In general, however, users of fortified foods achieve better nutrient adequacy through commonly consumed foods than non-users do. This effect may be related to a higher nutritional awareness among users of fortified foods. While widespread fortification programs have proven successful at a population level, a targeted approach for individuals with specific nutrient requirements can be useful and reduces the risk of over-supplying nutrients to those without increased needs. Nutrition labels can provide guidance on the amount of specific nutrients contained in a given food.
There are ongoing efforts to decipher the relationship between dietary needs and genetic make-up so that one day nutrient recommendations may be made on an individual basis. Furthermore, nutrient stability and absorption within fortified foods are continuously being improved. Together with refined and standardized methods to accurately assess dietary supply, fortified foods pave the way for a personal approach to optimizing nutrient intake.”
Brussels, November 2011