High-income countries have low rates of food insecurity. In general, there is plenty of food for everyone and rates of undernourishment are insignificant (1). However, deficiencies in vitamins and minerals can still exist at concerning rates despite good overall food security. For example, approximately one in three Americans have a vitamin deficiency or anemia (2). Vitamin D deficiency and insufficiency are common in many European countries (3). In addition, within a well-nourished population, individuals may still be at risk of deficiency due to nutrient-poor diets, illness or increased needs. This can have important consequences for public health. How to identify and target interventions to groups vulnerable to vitamin and mineral deficiencies was the subject of a recent review article by Bruins et al. (4).
Preventing Vitamin and Mineral Deficiencies
We, as individuals, want to avoid micronutrient deficiencies. Pick up any nutrition textbook to read about the symptoms: fatigue and reduced cognitive abilities from iron deficiency, an increase in infections with vitamin A deficiency, dermatitis and confusion with vitamin B6 deficiency, and poor wound healing with vitamin C deficiency (5). Governments, public health departments, health organizations and NGOs likewise want to prevent micronutrient deficiencies in the population. Vitamin and mineral deficiencies can be treated easily, yet left untreated they can lead to increased health care costs and reduced workforce productivity.
In well-nourished populations, it is important to identify who is at risk of a deficiency, so that these people can be targeted. This has two aims: first, to direct scarce resources to the people who would benefit most, and second, to avoid over-supplementing well-nourished groups whose diets are already adequate. Nutrient needs vary through the lifecycle, and vitamin and mineral intakes are affected by economic status, nutrition knowledge, and cultural factors. Populations commonly found to have higher rates of deficiency include pregnant women and young children, due to their relatively high nutrient needs, and the elderly, due to reduced food intake and illness. Anemia is often found in women of childbearing age (2).
Different approaches can be taken to address nutrient deficiencies. Many poor diets can be improved by nutrition education (6, 7). Another approach is to provide food packages to vulnerable groups to improve their diets (8). Taxes and subsidies can help consumers make more nutritious food choices (9). Food fortification can increase the intake of certain micronutrients, but it is hard to target specifically to vulnerable groups (10). Dietary supplements are another strategy: in many countries, women planning a pregnancy are advised to take a supplement containing folic acid to help prevent neural tube defects (11).
Is It Worth It? Calculating Costs and Benefits
Before an intervention is started in a vulnerable population, a cost-benefit analysis is a useful way to work out whether it will be worthwhile. Costs include those needed to implement the program, plus the resources used for identifying vulnerable groups and for testing. Adoption and adherence affect how successful the program will be in reducing rates of deficiency, and therefore its benefits. Cost savings depend on both direct reductions in health care costs and indirect improvements in productivity. Measures such as the Disability-Adjusted Life Year (DALY) or Quality-Adjusted Life Year (QALY) can help here (12). Cost-benefit analyses have identified potential programs to increase the supply of vitamins and minerals to certain vulnerable population groups that result in net cost savings (13-15).
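For readers who want to see how such an analysis fits together, the arithmetic can be sketched in a few lines. All figures below are illustrative assumptions invented for the example, not data from the review or the cited analyses:

```python
# Hypothetical cost-benefit sketch for a targeted supplementation program.
# Every number here is an illustrative assumption, not real program data.

program_cost = 500_000.0        # implementation, screening, and testing costs
eligible_people = 100_000       # size of the identified vulnerable group
adherence = 0.6                 # fraction who adopt and stick with the program
resolution_rate = 0.8           # of adherent participants, share whose deficiency resolves

people_helped = eligible_people * adherence * resolution_rate

healthcare_saving_per_person = 5.0   # direct health care savings per person helped
productivity_gain_per_person = 7.0   # indirect productivity gains per person helped

total_benefit = people_helped * (healthcare_saving_per_person + productivity_gain_per_person)
net_saving = total_benefit - program_cost

# A DALY-style summary: cost of the program per disability-adjusted life year averted.
dalys_averted_per_person = 0.002
cost_per_daly_averted = program_cost / (people_helped * dalys_averted_per_person)

print(f"People helped: {people_helped:.0f}")
print(f"Net saving: {net_saving:.0f}")
print(f"Cost per DALY averted: {cost_per_daly_averted:.0f}")
```

Note how adherence enters the calculation twice over: poor adherence shrinks the benefits while the program costs remain, which is why targeting groups likely to adopt the intervention matters as much as identifying who is deficient.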
Read more about how to do this in “Considerations for Secondary Prevention of Nutritional Deficiencies in High-Risk Groups in High-Income Countries.”