How Much is Too Much?
Excess Vitamins and Minerals in Food Can Harm Kids’ Health
June 19, 2014
Adequate vitamin and mineral intake from a balanced diet is essential for maintaining health and preventing diseases caused by dietary deficiencies, such as pellagra (triggered by a shortage of niacin) or night blindness in children who lack vitamin A. But at excessive levels, some nutrients can be toxic. Ingesting too much vitamin A during pregnancy can cause severe developmental abnormalities in the fetus, for example. Excessive zinc can suppress rather than stimulate the immune system. Large doses of niacin can produce symptoms ranging from nausea and blurred vision to liver toxicity.
A number of vitamins and minerals have been tested in clinical trials to investigate whether taking large amounts could prevent cancer and other diseases. Generally, excessive vitamin levels had no preventive effect and in some cases were associated with increased cancer deaths. The research highlights the importance of sufficient but not excessive intake of vitamins and minerals.
People need adequate vitamin A to maintain normal immune function, eyesight, the reproductive system and many other aspects of health (Health Canada 2010). However, vitamin A deficiency is uncommon in the United States today, according to both the Office of Dietary Supplements of the National Institutes of Health and the FDA (ODS 2013a; FDA 2014b). As the FDA stated, “Vitamin A deficiency based on an assessment of vitamin A status is rare in the U.S. population” (FDA 2014b). In contrast, deficiencies remain a problem in some developing countries where diets do not provide sufficient amounts of vitamin A-rich animal foods. Insufficiency leads to vision problems, such as the inability to see in low light or darkness.
A diet that includes five servings a day of carotenoid-rich fruits and vegetables as well as milk and meat products generally provides enough vitamin A without food fortification or supplementation (IOM 2001). Even in the U.S., however, some groups may not eat a sufficiently varied diet. American teenagers are a prime example; fewer than half get adequate vitamin A (Berner 2014).
With widespread vitamin A-fortified food and increasing use of dietary supplements, however, many Americans, especially younger children, have the opposite problem: consuming more vitamin A than the Institute of Medicine considers safe (IOM 2001; IOM 2003; IOM 2005; ODS 2013a).
Numerous case studies have shown the risks of excessive intake of vitamin A for infants, toddlers and children. Infants getting very high amounts can develop intracranial and skeletal abnormalities as well as increased cranial pressure. Among the more common signs of vitamin A toxicity are brittle nails, hair loss, fever, headaches and weight loss (IOM 2001). At high doses, vitamin A is also toxic to the liver, the body’s main storage site for vitamin A.
Due to the particular risks of vitamin A to young children, the German Federal Institute for Risk Assessment and the Dutch National Institute for Public Health and the Environment have recommended against vitamin A fortification of foods in general (BfR 2005; Kloosterman 2007).
High intakes of preformed vitamin A may also be a health risk for older adults, particularly post-menopausal women at risk of osteoporosis and hip fractures (Tanumihardjo 2013; UK EVM 2003). Preformed vitamin A compounds, whether naturally occurring in foods such as liver or taken as dietary supplements, have been shown to alter bone metabolism and lead to bone loss and osteoporosis in humans and laboratory animals (Penniston 2006; Walker 1982).
Five large-scale population studies in the United States and Sweden found that high dietary intake of vitamin A decreased bone density and increased the risk of hip fracture in both older women and men (Feskanich 2002; Lim 2004; Melhus 1998; Michaëlsson 2003; Promislow 2002). These studies prompted the United Kingdom Expert Group on Vitamins and Minerals, an independent committee that advises the British government’s Food Standards Agency, to set a Guidance Level for vitamin A in adults of 1,500 μg preformed vitamin A per day, half the Tolerable Upper Intake Level for adults set by the U.S. Institute of Medicine (UK EVM 2003). Eating just two snack bars, each providing 50 percent of the adult Daily Value of preformed vitamin A, would reach this level.
In the long-running, prospective Nurses’ Health Study of 72,337 postmenopausal women aged 34 to 77 years, those who ingested 2,000 micrograms or more of preformed vitamin A a day had nearly twice the rate of hip fractures of those who took less than 500 micrograms a day. Ingesting beta-carotene, a naturally occurring vitamin A precursor, did not contribute significantly to fracture risk (Feskanich 2002).
In animal studies, high dietary intake of vitamin A has been shown to decrease bone mass and lead to bone thinning and spontaneous fractures. Retinoic acid, the active form of vitamin A, inhibits bone formation (Lind 2013). Getting enough calcium and vitamin D is important for bone health, but vitamin A counteracts the positive effects of vitamin D (Johansson and Melhus 2001).
Zinc is involved in many aspects of cellular metabolism and normal growth and development. It also plays a role in the immune system. Severe zinc deficiency in malnourished people can suppress immune function and wound healing. Overt zinc deficiency is uncommon in North America, according to the National Institutes of Health’s Office of Dietary Supplements, but low levels can occur in vulnerable populations, such as people with gastrointestinal or sickle cell disease (ODS 2013b).
Zinc taken within 24 hours of developing a common cold, as in zinc lozenges, can shorten the duration of cold symptoms, even though it may be associated with unpleasant symptoms such as nausea (Science 2012; Singh 2013). But in healthy adults, routinely adding supplemental zinc beyond the amounts in the food supply has few, if any, long-term benefits for immunity (Hodkinson 2007). One 2011 study of healthy 7-to-13-year-old children found that eating breakfast cereal fortified with 25 mg of zinc per 100 gram serving (four times the recommended dietary allowance for 9-to-13-year-olds) had no influence on immune function (Nieman 2011).
Although no adverse effects have been found from consuming naturally occurring zinc in food, excessive supplementation has been shown to suppress immune function. That’s because zinc interferes with copper absorption, leading to copper deficiency, anemia, changes in red and white blood cells and lowered immunity (IOM 2001). In clinical studies, high zinc intake has been associated with a significant increase in hospitalization for genitourinary causes (ODS 2013b).
Because of such concerns, the German Federal Institute for Risk Assessment has recommended against fortifying food with zinc (BfR 2006).
Meanwhile, the zinc intake of US children has been increasing over the past two decades (Arsenault 2003; Butte 2010). In 2003, that led researchers at the University of California, Davis to warn that, “if zinc intake continues to increase because of the greater availability of fortified foods in the US food supply, the amount of zinc consumed by children may become excessive” (Arsenault 2003). Two years later, the Institute of Medicine said high intake of fortified zinc was a growing concern for young children (IOM 2005).
Today, 72 percent of 1-to-3-year-old children get too much zinc from diet and supplements (Butte 2010). The problem is particularly acute in families participating in the federal Women, Infants and Children (WIC) nutrition program and those in the lowest income category, because these groups often eat diets with limited fresh food and more processed, fortified foods (IOM 2005).
Niacin plays a role in a variety of metabolic reactions and is necessary for the activity of many enzymes. Niacin deficiency results in a disease called pellagra that affects skin, the digestive tract and the nervous system. The disease was common in the United States and parts of Europe in the early 20th century in areas where corn, a cereal low in both niacin and the amino acid tryptophan, was a dietary staple (IOM 1998b), but today it has virtually disappeared in the developed world. Alcoholism is currently the main cause of niacin deficiency in the United States, according to the University of Maryland Medical Center (UMM 2013).
Excess niacin can lead to flushing reactions, tingling, itching and reddening of the skin, rashes and nausea. High doses can cause liver toxicity, with symptoms such as jaundice, glucose intolerance and blurred vision (IOM 1998).
In February 2014, niacin toxicity was reported after a shipment of enriched rice was apparently over-fortified by the manufacturer, Mars Foodservices. The company recalled its Uncle Ben’s Infused Rice products after students and teachers in some Texas public schools experienced burning, itching rashes, headaches and nausea 30 to 90 minutes after eating the rice. Similar cases have occurred in Illinois and North Dakota (FDA 2014d). Mars Foodservices acknowledged in a statement that the illnesses might have been related to high levels of niacin in its enriched rice (Sun 2014).