By Julia Bird
What is fortification and enrichment?
Food fortification and enrichment are two population-level strategies for improving micronutrient status. Both approaches use the same technique of adding vitamins or minerals to staple foods such as flour, sugar, oil or salt. They differ in their intent: enrichment restores nutrients that are lost during processing or cooking, while fortification adds nutrients that are not normally present in the food in nutritionally relevant quantities.
Why fortify or enrich foods?
Fortification and enrichment have both been used successfully over many decades to reduce the prevalence of certain nutritional deficiencies, leading to widespread improvements in health and quality of life for millions of people. Their main advantage over other ways of improving the micronutrient content of the diet is that they do not require behaviour change. Because the vehicle is a staple food that is virtually universally consumed, intakes of the micronutrient increase without the general population having to modify their normal diet. This means that all sectors of the population, including those living in resource-poor environments, benefit from fortification and enrichment.
Examples of successful fortification policies
There are many examples of fortification and enrichment policies that have significantly reduced deficiency diseases in populations. The first step in the process is to identify nutrients that are lacking in diets. Once these are known, the root cause of the poor intakes can be determined, enabling the selection of a staple food that makes a good candidate for fortification or enrichment. After the micronutrient has been added to the food, the incidence of deficiency and its repercussions, as well as toxicity concerns, can be monitored to make sure the fortification is effective and to find areas for improvement.
Iodine and iodine deficiency
The reduction in iodine deficiency in several countries is the first micronutrient fortification success story. Iodine is a mineral required to produce thyroid hormones, which regulate growth and metabolism. Deficiency leads to intellectual disability and congenital hypothyroidism in infants and children, and a disfiguring swelling of the neck (goiter) in adults, as the thyroid gland at the front of the neck expands to increase its production of thyroid hormones when iodine intakes are far too low.
Iodine is found in many foods from the sea, including fish, shellfish and seaweed, so for populations who live near the sea, iodine deficiency is seldom a problem. In areas far from the sea, however, the amount of iodine consumed depends heavily on the iodine content of the soil. It is therefore not surprising that the first two examples of a successful iodine program were found in land-locked Switzerland and in Michigan, in the middle of North America. In these areas, cretinism and goiter were serious problems: in the mountainous Swiss canton of Valais, around 6% of the population had congenital hypothyroidism, and the iodine content of the soil was so poor in the Great Lakes region of North America that even trout in the rivers showed signs of deficiency. In the early 1920s, iodized salt programs were introduced in both countries. Salt is a good medium for fortification with iodine, as the fortificant is similar in appearance to table salt, and salt meets the criteria of being used almost universally and in fairly similar amounts by everyone. These fortification programs, initially started in specific areas, were followed by such dramatic reductions in iodine deficiency disease that they encouraged widespread implementation throughout the rest of these two countries, and indeed the rest of the world. There is still work to do, though: iodized salt is available in 70% of global households, and programs need to expand to reach the remaining 30%, who are still at risk of deficiency.
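The dosing logic behind salt iodization is simple arithmetic: given a target daily iodine intake and typical per-capita salt consumption, the required fortification level in mg of iodine per kg of salt follows directly. The figures below (a 150 µg/day adult requirement, 10 g/day salt intake, 30% iodine lost during processing and storage) are illustrative assumptions, not values from this article, though the result lands in the range commonly used in national programs.

```python
def iodization_level_mg_per_kg(target_ug_per_day, salt_g_per_day, loss_fraction):
    """Iodine concentration (mg iodine per kg salt) needed so that typical
    salt consumption still delivers the target intake after losses."""
    delivered_fraction = 1.0 - loss_fraction
    # µg iodine needed per g of salt consumed; µg/g is numerically equal to mg/kg
    return target_ug_per_day / (salt_g_per_day * delivered_fraction)

# Illustrative assumptions: 150 µg/day requirement, 10 g/day salt,
# 30% of the iodine lost between factory and plate.
level = iodization_level_mg_per_kg(150, 10, 0.30)
print(round(level, 1))  # → 21.4
```

Because everyone eats salt in roughly similar amounts, a single fortification level like this reaches the whole population without anyone changing their diet.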
Folic acid and neural tube defects
In the early 1970s, diet had already been identified as potentially linked to the incidence of neural tube defects, a type of birth defect in which the brain, spinal cord or spine does not develop correctly in the embryo. By the mid-1970s, an association between folate status and birth defects was used as the basis for the first clinical studies showing that folic acid supplementation was an effective way to reduce the incidence of neural tube defects. Once this had been established, various ways of increasing the supply of folic acid to women of childbearing potential were assessed. The main approaches are via:
· Food: Women who are planning a pregnancy should ensure that they consume enough folate-rich food.
· Supplements: Women planning a pregnancy should take a dietary supplement containing 400 µg of folic acid.
· Contraceptives: Women should take oral contraceptives containing folic acid.
· Staple food fortification: Increase the supply of folic acid to the entire population.
Each approach has its advantages and disadvantages; however, the big plus for folic acid fortification is that folate intakes are raised in women who have an unplanned pregnancy, the group most at risk of a neural tube defect.
In the US and Canada, the fortification of wheat flour with folic acid was mandated in 1998. This has led to a reduction in the incidence of neural tube defects in the range of 1,000 to 1,500 cases per year. Due to the program's success, other countries have also mandated folic acid fortification of staple foods, resulting in declines in the rate of neural tube defects in many countries around the world.
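The contribution of fortified flour to daily folate intake is easy to estimate. The US mandate sets the level at 140 µg of folic acid per 100 g of enriched cereal grain product; the daily consumption figure used below is an illustrative assumption, not a value from this article.

```python
def folic_acid_from_flour(flour_g_per_day, fortification_ug_per_100g=140):
    """Daily folic acid (µg) contributed by fortified flour.
    140 µg/100 g is the US mandated level; the consumption
    figure passed in is an assumption for illustration."""
    return flour_g_per_day * fortification_ug_per_100g / 100

# Someone eating 200 g of enriched grain products per day:
print(folic_acid_from_flour(200))  # → 280.0
```

Even though this falls short of the 400 µg recommended as a supplement for women planning a pregnancy, raising background intakes across the whole population is what protects against defects in unplanned pregnancies.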
Flour fortification with B-vitamins
In the late 1800s, there was a gradual change in the type of bread consumed in Europe and the US due to the introduction of steel roller mills. Until this time, bread was dark in colour, coarse, fibrous and dense, made of grains such as barley, rye and buckwheat. Roller milling meant that the endosperm of hard wheat grains could be separated from the germ and bran with high efficiency, and the resulting "strong" flour produced a highly desirable white, risen loaf. Nutrition science was in its infancy, however, and the problems caused by removing the most micronutrient-rich parts of the grain took decades to be recognised [13, 14]. The amounts of many vitamins and minerals are considerably lower in white than in whole wheat flour, as they are concentrated in the parts of the grain removed by milling. The most nutritionally important nutrients found at lower quantities in white bread are niacin, riboflavin, thiamine, and iron.
In the US, after the discovery of the structures of various vitamins and the development of production methods for them in the 1930s, a movement was started to enrich white flour with some of the nutrients removed by modern milling. By the end of the 1930s, some bread flours were enriched, although a small price differential in the final product made unenriched white flour a little cheaper for consumers. It wasn't until a Second World War-related mandate in 1943, and a subsequent promotional and educational campaign, that flour enriched with thiamine, riboflavin, niacin and iron became almost universally used in the US. Enriched flour proved an effective tool for reducing the prevalence of the deficiency diseases pellagra and beriberi in the US and other countries worldwide [15, 16].
What can happen when fortification policies are rolled back?
Sometimes food fortification programs are stopped. This can happen for a host of reasons, but it should only be done carefully, once the causes of low micronutrient intakes have been resolved, to ensure that deficiency diseases do not reappear. Below are three examples of the consequences of rolling back food fortification policies.
Vitamin D fortification in Canada
The discovery that rickets, a childhood bone disease, could be cured by either cod liver oil or exposure to sunlight, combined with the isolation of vitamin D in the 1930s and its industrial synthesis in 1959, led to the implementation of food fortification with vitamin D in various countries including the US, Canada, the UK and Switzerland. In Canada, many foods were fortified at levels intended to provide a minimum of 400 IU and a maximum of 800 IU per day. As a result of this loosely controlled policy, vitamin D intakes exceeded upper thresholds for some children, while others remained at considerable risk of rickets. In 1964, because of concerns about toxicity, regulations restricted vitamin D fortification to a much narrower range of foods: evaporated milk, margarine and infant foods, with fluid milk added a year later. Although these limitations were set in an attempt to prevent excessive intakes, they brought about an increase in rickets in infants and young children through the latter half of the 1960s. Subsequent regulations made the addition of vitamin D to all milk permissible; however, it wasn't until vitamin D fortification of milk and margarine was made mandatory in 1975 that rickets was finally virtually eliminated in Canada.
Vitamin A fortification of sugar in Guatemala
In the 1970s, vitamin A deficiency was highly prevalent in Guatemala, with an estimated 20% of young children showing frank deficiency. Sugar was selected as the most appropriate staple food for fortification: unlike the staple maize, sugar is processed centrally; unlike wheat, then a luxury cereal, it was consumed by a broad sector of the population; and unlike salt, it was technically feasible to add vitamin A to sugar at levels that make a significant contribution to intakes. After several regulatory setbacks, a fortification law was enacted in 1974 and fortification started in 1975. Monitoring in 1977 showed that the prevalence of vitamin A deficiency in preschool children had dropped dramatically to 5% within one year, due to increased vitamin A intakes from fortified sugar.
Unfortunately, the program did not last. Sugar producers complained about government interference in free enterprise, banks would not release foreign currency to allow producers to purchase vitamin A internationally, and there was no local supplier. Sugar could no longer be fortified, and by the mid-1980s vitamin A deficiency rates had returned to pre-fortification levels. The program was reintroduced in 1987, this time after actively involving the sugar industry in its implementation, and fortified sugar reached 95% of households. A survey conducted in 2009 found the prevalence of vitamin A deficiency to be less than 3%, highlighting the sustained effectiveness of the program's reintroduction.
Iodine in Germany
Naturally occurring iodine levels in German soils are low, and the normal iodine content of food in Germany is too low to prevent deficiency in the population. Surveys taken in 1981, before any measures to improve the iodine supply were in place, found that over one third of children, adolescents and pregnant women had goiter.
Prior to 1991, salt iodization programs differed between East and West Germany. In East Germany, an iodized salt program introduced in 1985/1986 led to a considerable increase in iodine intakes in the late 1980s, whereas West Germany had no program. With the reunification of Germany in 1990, salt iodization projects were no longer supported; a "principle of voluntary action" was adopted, leading to declines in iodine intakes [25, 28]. A national program was not legislated until 1993 [29, 30]. Iodine levels in the population stagnated or declined between 1992 and 1994, but after the salt iodization program was reintroduced at the national level, they began to increase again in all sectors of the population [30, 31]. The iodine supply was increased gradually: the most significant gains have come from the iodization of salt used in baking and meat curing, as well as from increasing the iodine content of animal feed. In 2005, researchers found that the iodine status of Germans was sufficient.
Global prevalence of micronutrient deficiencies
Micronutrient deficiencies are highly prevalent and a significant contributor to poor health worldwide, in both the developing and industrialized world.
The micronutrient deficiency with the largest global impact is iron deficiency. Anemia affects one in three people globally; iron deficiency is the cause in around half of these cases, with most of the rest due to parasitic infections. Anemia results in fatigue, low work performance and reduced cognitive performance, and is a risk factor for maternal and child mortality. Good management of anemia requires controlling intestinal worms and malaria and improving access to bioavailable iron through dietary diversification. Iron fortification of food staples can be an important part of improving a population's iron intake.
One in three people have low iodine intakes, although significant progress in improving iodine nutrition through salt fortification is having a considerable impact. Iodine deficiency diminishes the cognitive potential of populations and increases the rates of stillbirth and infant mortality. The main cause is low iodine intakes from foods grown in iodine-poor soils, although low protein intakes and consumption of poorly processed cassava also contribute. Salt iodization is considered the most effective way to improve national iodine intakes.
Deficiency in vitamin A is one of the leading causes of blindness in the world, and poor status increases the risk of certain infectious diseases in children. While the incidence of vitamin A deficiency has been decreasing over the past decades, it remains a serious problem for children up to 5 years of age in South Asia and sub-Saharan Africa, where it affects 30% of young children.
The two most important sources of vitamin A are meat (particularly organ meats such as liver) and pro-vitamin A beta-carotene in orange and dark green fruits and vegetables. A lack of dietary diversity is the most prevalent cause of deficiency. Various short- and long-term strategies can improve vitamin A status in deficient populations. Supplementing high-risk groups such as young children with two high doses per year, often administered as part of health care visits, has been effective in reducing rates of blindness and death. Fortification of foods such as milk, margarine, flour and sugar has been effective in various countries at improving vitamin A intakes. Promotion of breastfeeding until infants reach two years of age is also an effective strategy to prevent vitamin A deficiency. The best long-term approach is dietary diversification to improve access to vitamin A-rich foods; however, this can only be accomplished alongside poverty reduction.
Deficiencies in the B-vitamins thiamine (B1), riboflavin (B2), niacin (B3), B6, B12 and folate are widespread in many low-income countries, although reliable estimates of the proportions affected are lacking. Low intakes of animal products and dairy, and a reliance on cereal products for the bulk of energy intakes, are the main causes of the various B-vitamin deficiencies.
Although the curative effects of vitamin C on scurvy, the sailor's scourge, have been known for hundreds of years, and it was the first vitamin to be chemically synthesized, in 1933, vitamin C deficiency remains a considerable problem worldwide. Vitamin C is obtained in the diet from fruit and vegetables. In well-nourished countries such as the US, the prevalence of deficiency is around 7% in the general population, but it can be higher in people with a poor diet, such as individuals on a low income. In developing countries such as Brazil, the prevalence of vitamin C deficiency is around 30%, while in India the majority of the population is deficient. Improving access to and increasing consumption of fresh fruits and vegetables through greater dietary diversity is seen as the best way to reduce deficiency rates.
Vitamin D can be obtained from sunlight as well as the diet, so in theory the effects of poor dietary diversity that cause many other micronutrient deficiencies can be avoided. Even so, very few foods are good sources of vitamin D, and sun exposure must be sufficient to enable production in the skin, yet not so excessive that it causes sunburn. Carefully implemented fortification programs have reduced the incidence of vitamin D deficiency in some countries. Globally, however, vitamin D deficiency remains common; worldwide estimates find that around half the population is at risk.