Tags

  • TOPIC OF THE MONTH
  • 2016
  • VITAMINS

The Downsides of Nutritional Defortification

Published

18 July 2016

By Julia Bird

What is fortification and enrichment?

Food fortification and enrichment are two population-level strategies that can improve micronutrient status [1]. Both approaches use the same technique of adding vitamins or minerals to staple foods such as flour, sugar, oil or salt. They differ in their intent: enrichment restores nutrients that are lost from a food during processing or cooking, while fortification adds nutrients that are not normally present in the food in nutritionally relevant quantities.

Why fortify or enrich foods?

Fortification and enrichment have both been used successfully over many decades to reduce the prevalence of certain nutritional deficiencies. This has led to widespread improvements in health and quality of life for millions of people. The main advantage of fortification and enrichment over other ways of improving the micronutrient content of the diet is that they do not require behaviour change. Because the vehicle is a staple food that is consumed almost universally, intakes of the micronutrient rise without the general population having to modify its normal diet. This means that all sectors of the population, including those living in resource-poor environments, benefit from fortification and enrichment.
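For readers who like to see the numbers, the dosing logic behind fortification can be reduced to simple arithmetic: divide the target additional daily intake by the typical daily consumption of the staple, allowing for processing and storage losses. The short sketch below illustrates this; the function and every figure in it are illustrative assumptions for this article, not the parameters of any actual national program.

```python
# Minimal sketch of the dosing arithmetic behind staple-food fortification.
# All figures below are illustrative assumptions, not real program values.

def fortificant_mg_per_kg(target_intake_mg_day: float,
                          staple_intake_g_day: float,
                          loss_fraction: float = 0.0) -> float:
    """Fortificant to add per kg of staple so that a person eating
    staple_intake_g_day grams per day receives target_intake_mg_day
    milligrams, after allowing for processing and storage losses."""
    retained = 1.0 - loss_fraction
    return target_intake_mg_day / (staple_intake_g_day / 1000.0) / retained

# Example: deliver an extra 0.14 mg of folic acid per day via wheat flour,
# assuming 100 g of flour eaten per person per day and 10% losses in baking.
print(round(fortificant_mg_per_kg(0.14, 100.0, 0.10), 2))  # -> 1.56 mg/kg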

Examples of successful fortification policies

There are many examples of fortification and enrichment policies that have significantly reduced deficiency diseases in populations. The first step in the process was to identify nutrients that were lacking in diets. Once a lack had been identified, the root cause of the poor intakes could be determined, enabling the selection of a staple food that would make a good candidate for fortification or enrichment. After the micronutrient had been added to the food, the incidence of deficiency and its repercussions, as well as toxicity concerns, could be monitored to confirm that the fortification was effective and to find areas for improvement.

Iodine and iodine deficiency

The reduction in iodine deficiency in several countries is the first micronutrient fortification success story [2]. Iodine is a mineral required to produce thyroid hormones, which regulate growth and metabolism. Deficiency leads to intellectual disability and congenital hypothyroidism in infants and children [3], and to a disfiguring swelling of the neck (goiter) in adults [4], as the thyroid gland expands in an attempt to increase its production of thyroid hormones when iodine intakes are far too low.

Iodine is found in many foods from the sea, including fish, shellfish and seaweed. For populations living near the sea, iodine deficiency is seldom a problem. In areas far from the sea, however, the amount of iodine consumed depends heavily on the iodine content of the soil. It is not surprising that the first two successful iodine programs were found in land-locked Switzerland and in Michigan, in the middle of North America. In these areas, cretinism and goiter were serious problems: in the mountainous Swiss canton of Valais, around 6% of the population had congenital hypothyroidism, and the iodine content of the soil was so poor in the Great Lakes region of North America that even trout in the rivers showed signs of deficiency [5]. In the early 1920s, iodized salt programs were introduced in both countries. Salt is a good vehicle for fortification with iodine because the fortificant is similar in appearance to table salt, and because salt is consumed almost universally and in fairly similar amounts by everyone. These fortification programs, initially started in specific areas, were followed by such dramatic reductions in iodine deficiency disease that they encouraged implementation throughout the rest of these two countries, and indeed the rest of the world. There is still work to do, though: iodized salt is available in 70% of households globally, and programs need to expand to reach the remaining 30% of the world's population, who are still at risk of deficiency [6].

Folic acid and neural tube defects

In the early 1970s, diet had already been identified as potentially linked to the incidence of neural tube defects [7], a type of birth defect in which the brain, spinal cord or spine does not develop correctly in the embryo. By the mid-1970s, an association between folate status and birth defects [8] formed the basis for the first clinical studies showing that folic acid supplementation was an effective way to reduce the incidence of neural tube defects [9]. Once this had been established, various ways of increasing the supply of folic acid to women of childbearing potential were assessed [10]. The main approaches were:

  • Food: women planning a pregnancy should ensure that they consume enough folate-rich foods.
  • Supplements: women planning a pregnancy should take a dietary supplement containing 400 µg of folic acid.
  • Contraceptives: women should take oral contraceptives containing folic acid.
  • Staple food fortification: increase the supply of folic acid to the entire population.

Each approach has its advantages and disadvantages; the big plus for folic acid fortification, however, is that it raises folate intakes in women who have an unplanned pregnancy, the group most at risk of a pregnancy affected by a neural tube defect.

In the US and Canada, the fortification of wheat flour with folic acid was mandated in 1998. This has led to a reduction in the incidence of neural tube defects in the range of 1000 to 1500 cases per year [11]. Due to the program’s success, other countries have also mandated folic acid fortification of staple foods, which has resulted in declines in the rate of neural tube defects in many countries around the world [12].
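The order of magnitude of this estimate can be checked with back-of-the-envelope arithmetic: multiply the annual number of births by the fall in prevalence. The figures below are illustrative assumptions chosen to be broadly in line with the estimates in [11], not exact surveillance data.

```python
# Back-of-the-envelope check on the prevented-cases estimate.
# Illustrative assumptions: about 4 million US births per year, and a fall
# in neural tube defect prevalence from ~10 to ~7 per 10,000 live births.
births_per_year = 4_000_000
rate_before = 10.0 / 10_000   # assumed pre-fortification prevalence
rate_after = 7.0 / 10_000     # assumed post-fortification prevalence
prevented = births_per_year * (rate_before - rate_after)
print(f"~{prevented:.0f} cases prevented per year")  # -> ~1200
```

A difference of three cases per 10,000 births looks tiny, but applied to millions of births per year it lands squarely in the 1000 to 1500 range reported above.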

Flour fortification with B-vitamins

In the late 1800s, the type of bread consumed in Europe and the US gradually changed with the introduction of steel roller mills. Until that time, bread was dark in colour, coarse, fibrous and dense, made of grains such as barley, rye and buckwheat. Roller milling meant that the endosperm of hard wheat grains could be separated from the germ and bran with high efficiency. The resulting “strong” flour produced a highly desirable white, risen loaf [13]. Nutrition science was in its infancy, however, and the problems caused by removing the most micronutrient-rich parts of the grain took decades to be recognised [13, 14]. The amounts of many vitamins and minerals are considerably lower in white than in whole wheat flour, as they are concentrated in the parts of the grain removed by milling. The most nutritionally important nutrients found at lower levels in white flour are niacin, riboflavin, thiamine, and iron.

In the US, after the discovery of the structures of various vitamins and the development of production methods for them in the 1930s, a movement started to enrich white flour with some of the nutrients removed by modern milling. By the end of the 1930s, some bread flours were enriched, although a small price differential made unenriched white flour a little cheaper for consumers. It wasn’t until a Second World War-related mandate in 1943, and a subsequent promotional and educational campaign, that flour enriched with thiamine, riboflavin, niacin and iron became almost universally used in the US. Enriched flour proved an effective tool for reducing the prevalence of the deficiency diseases pellagra and beriberi in the US and other countries worldwide [15, 16].

What can happen when fortification policies are rolled back?

Sometimes food fortification programs are stopped. This can happen for a host of reasons, but it should only be done carefully, once the causes of low micronutrient intakes have been resolved, to ensure that deficiency diseases do not reappear. Below are three examples of the consequences of rolling back food fortification policies.

Vitamin D fortification in Canada

The discovery that rickets, a childhood bone disease, could be cured by either cod liver oil or exposure to sunlight [17], combined with the isolation of vitamin D in the 1930s and its industrial synthesis in 1959, led to the implementation of food fortification with vitamin D in various countries including the US, Canada, the UK and Switzerland [18]. In Canada, many foods were fortified, at levels intended to provide daily intakes of between 400 IU (10 µg) and 800 IU (20 µg). As a result of this uncontrolled policy, vitamin D intakes from fortified foods exceeded upper thresholds for some children, while others remained at considerable risk of rickets. In 1964, because of concerns about toxicity, regulations restricted vitamin D fortification to a much narrower range of foods: only evaporated milk, margarine and infant foods, with fluid milk added a year later [19]. Although these limitations were set in an attempt to prevent excessive intakes, they brought about an increase in rickets in infants and young children through the latter half of the 1960s [20]. Subsequent regulations made the addition of vitamin D to all milk permissible; however, it wasn’t until vitamin D fortification of milk and margarine was made mandatory in 1975 that rickets was finally virtually eliminated in Canada [21].

Vitamin A fortification of sugar in Guatemala

In the 1970s, vitamin A deficiency was highly prevalent in Guatemala, with an estimated 20% of young children showing frank vitamin A deficiency. Sugar was selected as the most appropriate staple food for fortification [22]: unlike maize, sugar is processed centrally; unlike wheat, then a luxury cereal, it was consumed by a broad sector of the population; and unlike salt, it was technically possible to add vitamin A to sugar at levels that make a significant contribution to intakes [23]. After several regulatory setbacks, a fortification law was enacted in 1974 and fortification started in 1975 [24]. Monitoring in 1977 showed that the prevalence of vitamin A deficiency in preschool children had dropped dramatically to 5%, due to increased vitamin A intakes from fortified sugar.

Unfortunately, the program did not last. Sugar producers complained about government interference in free enterprise, and banks would not release foreign currency to allow producers to purchase vitamin A internationally; there was no local supplier [24]. Sugar could no longer be fortified, and by the mid-1980s, vitamin A deficiency rates had returned to pre-fortification levels. The program was introduced again in 1987, this time after actively involving the sugar industry in its establishment. Fortified sugar reached 95% of households after this second introduction [24]. A survey conducted in 2009 found a vitamin A deficiency prevalence of less than 3%, highlighting the sustained effectiveness of the program’s reintroduction [22].

Iodine in Germany

Naturally occurring iodine levels in German soils are low, and the iodine content of a normal German diet is too low to prevent deficiency in the population. Surveys taken in 1981, before any measures had been introduced to improve the population’s iodine supply, found that over one third of children, adolescents and pregnant women had goiter [25].

Prior to reunification, salt iodization programs differed between East and West Germany [26]. In East Germany, an iodized salt program was introduced in 1985/1986 [27], whereas West Germany had no program; iodine intakes in East Germany increased considerably in the late 1980s [28]. With the reunification of Germany in 1990, salt iodization projects were no longer supported. A “principle of voluntary action” was adopted, leading to declines in iodine intakes [25, 28], and a national program was not legislated until 1993 [29, 30]. Iodine levels in the population stagnated or declined between 1992 and 1994; however, once the salt iodization program was reintroduced at the national level, they began to rise again in all sectors of the population [30, 31]. The iodine supply was increased gradually, with the most significant gains coming from the iodization of salt used by bakers and meat curers, and from raising the iodine content of animal feed. By 2005, researchers found the iodine status of Germans to be sufficient [30].

Global prevalence of micronutrient deficiencies

Micronutrient deficiencies are highly prevalent and a significant contributor to poor health worldwide, both in the developing [1] and industrialized world [32].

Iron

The micronutrient deficiency with the largest global impact is iron deficiency. Anemia affects one in three people globally, and iron deficiency is the cause in around half of these cases, with most of the rest due to parasitic infections; in other words, roughly one person in six worldwide has iron deficiency anemia. Anemia results in fatigue, low work performance and reduced cognitive performance, and is a risk factor for maternal and child mortality. Good management of anemia requires controlling intestinal worms and malaria, and improving access to bioavailable iron through dietary diversification. Iron fortification of food staples can be an important part of improving a population’s iron intake [33].

Iodine

One in three people worldwide has low iodine intakes, although significant progress in improving iodine nutrition through salt fortification is having a considerable impact [4]. Iodine deficiency diminishes the cognitive potential of populations and increases the rates of stillbirth and infant mortality. The main cause of iodine deficiency is low intake due to iodine-poor soils, although low protein intakes and the consumption of poorly processed cassava also contribute [34]. Salt iodization is considered the most effective way to improve national iodine intakes.

Vitamin A

Vitamin A deficiency is one of the leading causes of blindness in the world, and poor status increases the risk of certain infectious diseases in children. While the incidence of vitamin A deficiency has been decreasing over the past decades, it remains a serious problem for children up to 5 years of age in South Asia and sub-Saharan Africa, where it affects 30% of young children [35].

The two most important sources of vitamin A are meat (particularly organ meats such as liver) and pro-vitamin A beta-carotene from orange and dark green fruits and vegetables. A lack of dietary diversity is the most common cause of deficiency. Various short- and long-term strategies can improve vitamin A status in deficient populations. Supplementing high-risk groups such as young children with two high doses per year, often administered as part of health care visits, has been effective in reducing rates of blindness and death. Fortification of foods such as milk, margarine, flour and sugar has improved vitamin A intakes in various countries [36]. Promotion of breastfeeding until infants reach two years of age is also an effective strategy to prevent vitamin A deficiency [37]. The best long-term approach to prevent deficiency is dietary diversification to improve access to vitamin A-rich foods [35]; however, this can only be accomplished alongside poverty reduction.

B-Vitamins

Deficiencies in the B-vitamins thiamine (B1), riboflavin (B2), niacin (B3), B6, B12 and folate are widespread in many low-income countries, although reliable estimates of the proportions affected are lacking [1]. Low intakes of animal products and dairy, and a reliance on cereal products for the bulk of energy intakes, are the main causes of the various B-vitamin deficiencies.

Vitamin C

Although the curative effect of vitamin C on scurvy, the sailor’s scourge, has been known for hundreds of years [38], and vitamin C was the first vitamin to be chemically synthesized, in 1933 [39], vitamin C deficiency remains a considerable problem worldwide. Vitamin C is obtained in the diet from fruit and vegetables. In well-nourished countries such as the US, the prevalence of deficiency is around 7% in the general population [40], but it can be higher in people with a poor diet, such as individuals on low incomes [41]. In developing countries such as Brazil, the prevalence of vitamin C deficiency is around 30%, while in India the majority of the population is deficient [42]. Improving access to and increasing consumption of fresh fruits and vegetables through greater dietary diversity is seen as the best way to reduce deficiency rates.

Vitamin D

Vitamin D can be obtained from sunlight as well as the diet, so in theory the poor dietary diversity behind many other micronutrient deficiencies can be circumvented. Even so, very few foods are a good source of vitamin D, and sun exposure must be sufficient to enable adequate production in the skin, yet not so great as to cause sunburn. Carefully implemented fortification programs have reduced the incidence of vitamin D deficiency in some countries. Globally, however, vitamin D deficiency remains common: worldwide estimates find that around half the population is at risk [43].

REFERENCES

1. Allen, L., et al., Guidelines on food fortification with micronutrients. 2006, World Health Organization and Food and Agriculture Organization of the United Nations.

2. Zimmermann, M.B., Research on Iodine Deficiency and Goiter in the 19th and Early 20th Centuries. The Journal of Nutrition, 2008. 138(11): p. 2060-2063.

3. American Academy of Pediatrics, et al., Update of newborn screening and therapy for congenital hypothyroidism. Pediatrics, 2006. 117(6): p. 2290-303.

4. Pearce, E.N., M. Andersson, and M.B. Zimmermann, Global iodine nutrition: Where do we stand in 2013? Thyroid, 2013. 23(5): p. 523-8.

5. Markel, H., "When it rains it pours": endemic goiter, iodized salt, and David Murray Cowie, MD. Am J Public Health, 1987. 77(2): p. 219-29.

6. Zimmermann, M.B. and M. Andersson, Update on iodine status worldwide. Curr Opin Endocrinol Diabetes Obes, 2012. 19(5): p. 382-7.

7. Knox, E.G., Anencephalus and dietary intakes. Br J Prev Soc Med, 1972. 26(4): p. 219-23.

8. Smithells, R.W., S. Sheppard, and C.J. Schorah, Vitamin deficiencies and neural tube defects. Arch Dis Child, 1976. 51(12): p. 944-50.

9. Smithells, R.W., et al., Prevention of neural tube defect recurrences in Yorkshire: final report. Lancet, 1989. 2(8661): p. 498-9.

10. Czeizel, A.E., et al., Prevention of neural-tube defects with periconceptional folic acid, methylfolate, or multivitamins? Ann Nutr Metab, 2011. 58(4): p. 263-71.

11. Williams, J., et al., Updated estimates of neural tube defects prevented by mandatory folic acid fortification - United States, 1995-2011. MMWR Morb Mortal Wkly Rep, 2015. 64(1): p. 1-5.

12. Atta, C.A., et al., Global Birth Prevalence of Spina Bifida by Folic Acid Fortification Status: A Systematic Review and Meta-Analysis. Am J Public Health, 2016. 106(1): p. e24-34.

13. Fouser, D., A Much Better Article is the Old-Fashioned Loaf: Bread and Crisis in Britain's Country, City, and Empire, 1870-1914, in American Society for Environmental History annual meeting. 2014: San Francisco, California.

14. Bishai, D. and R. Nalubola, The History of Food Fortification in the United States: Its Relevance for Current Fortification Efforts in Developing Countries. Economic Development and Cultural Change, 2002. 51(1): p. 37-53.

15. Harper, C.G., et al., Prevalence of Wernicke-Korsakoff syndrome in Australia: has thiamine fortification made a difference? Med J Aust, 1998. 168(11): p. 542-5.

16. Park, Y.K., et al., Effectiveness of food fortification in the United States: the case of pellagra. Am J Public Health, 2000. 90(5): p. 727-38.

17. Rajakumar, K., Vitamin D, cod-liver oil, sunlight, and rickets: a historical perspective. Pediatrics, 2003. 112(2): p. e132-5.

18. Sacco, J., Food Fortification Policy in Canada, in Handbook of Food Fortification and Health: From Concepts to Public Health Applications, V.R. Preedy, R. Srirajaskanthan, and V.B. Patel, Editors. 2013, Humana Press.

19. Sacco, J., An Examination of the Population Health Implications of Voluntary Food Fortification and Nutrition-Related Marketing Practices in Canada, in Department of Nutritional Sciences. 2012, University of Toronto.

20. Barsky, P., Rickets: Canada: 1968. Canadian Journal of Public Health / Revue Canadienne de Santé Publique, 1969. 60(1): p. 29-31.

21. Health Canada, The Addition of Vitamins and Minerals to Foods: Proposed Policy Recommendations. 1999, Bureau of Nutritional Sciences, Food Directorate, Health Protection Branch: Ottawa.

22. Bielderman, I., et al., Symposium Report: Effective and Safe Micronutrient Interventions, Weighing the Risks against the Benefits. European Journal of Nutrition & Food Safety, 2015. 5(4): p. 202-228.

23. Pineda, O., Fortification of sugar with vitamin A. United Nations University.

24. Semba, R.D., The Vitamin A Story: Lifting the Shadow of Death. World Review of Nutrition and Dietetics, ed. B. Koletzko. Vol. 104. 2012: Karger.

25. Meng, W. and A. Schindler, Iodine supply in Germany, in Elimination of iodine deficiency disorders (IDD) in central and eastern Europe, the Commonwealth of Independent States and the Baltic States. 1998: Munich, Germany.

26. Meng, W. and A. Schindler, [Nutritional iodine supply in Germany. Results of preventive measures]. Z Arztl Fortbild Qualitatssich, 1997. 91(8): p. 751-6.

27. Bauch, K., et al., [Dietary iodine deficiency in East Germany following introduction of interdisciplinary preventive use of iodine]. Z Gesamte Inn Med, 1990. 45(1): p. 8-11.

28. Willgerodt, H., et al., The status of iodine nutrition in newborn infants, schoolchildren, adolescents and adults in former East Germany. Exp Clin Endocrinol Diabetes, 1997. 105 Suppl 4: p. 38-42.

29. Bauch, K.H., et al., [Interdisciplinary preventive use of iodine in former East Germany following reunification and the status of packaged iodized table salt for improving alimentary iodine supply. Retrospect and prospects]. Z Gesamte Inn Med, 1991. 46(16): p. 615-20.

30. Hampel, R., et al., [Urinary iodine excretion in German adults in 2005 meets WHO target]. Med Klin (Munich), 2009. 104(6): p. 425-8.

31. Hampel, R., et al., Continuous rise of urinary iodine excretion and drop in thyroid gland size among adolescents in Mecklenburg-West-Pomerania from 1993 to 1997. Exp Clin Endocrinol Diabetes, 2000. 108(3): p. 197-201.

32. Troesch, B., et al., Dietary surveys indicate vitamin intakes below recommendations are common in representative Western countries. Br J Nutr, 2012. 108(4): p. 692-8.

33. Uauy, R., E. Hertrampf, and M. Reddy, Iron fortification of foods: overcoming technical and practical barriers. J Nutr, 2002. 132(4 Suppl): p. 849S-52S.

34. Teles, F.F., Chronic poisoning by hydrogen cyanide in cassava and its prevention in Africa and Latin America. Food Nutr Bull, 2002. 23(4): p. 407-12.

35. Stevens, G.A., et al., Trends and mortality effects of vitamin A deficiency in children in 138 low-income and middle-income countries between 1991 and 2013: a pooled analysis of population-based surveys. Lancet Glob Health, 2015. 3(9): p. e528-36.

36. Dary, O., J.O. Mora, and the International Vitamin A Consultative Group, Food fortification to reduce vitamin A deficiency: International Vitamin A Consultative Group recommendations. J Nutr, 2002. 132(9 Suppl): p. 2927S-2933S.

37. Allen, L.H. and M. Haskell, Vitamin A Requirements of Infants under Six Months of Age. Food and Nutrition Bulletin, 2001. 22(3): p. 214-234.

38. Bhatt, A., Evolution of Clinical Research: A History Before and Beyond James Lind. Perspectives in Clinical Research, 2010. 1(1): p. 6-10.

39. Zilva, S.S., The isolation and identification of vitamin C. Archives of Disease in Childhood, 1935. 10(58): p. 253-264.

40. Schleicher, R.L., et al., Serum vitamin C and the prevalence of vitamin C deficiency in the United States: 2003-2004 National Health and Nutrition Examination Survey (NHANES). Am J Clin Nutr, 2009. 90(5): p. 1252-63.

41. Mosdol, A., B. Erens, and E.J. Brunner, Estimated prevalence and predictors of vitamin C deficiency within UK's low-income population. J Public Health (Oxf), 2008. 30(4): p. 456-60.

42. Ravindran, R.D., et al., Prevalence and risk factors for vitamin C deficiency in north and south India: a two centre population based study in people aged 60 years and over. PLoS One, 2011. 6(12): p. e28588.

43. Wacker, M. and M.F. Holick, Sunlight and Vitamin D: A global perspective for health. Dermatoendocrinol, 2013. 5(1): p. 51-108.
