
Chapter 114 - Iron

Principles of Nutrigenetics and Nutrigenomics

Fundamentals of Individualized Nutrition

2020, Pages 317-320

Chapter 42 - Iron

Martin Kohlmeier


https://doi.org/10.1016/B978-0-12-804572-5.00042-2

Abstract

Iron is both an essential nutrient and a very reactive chemical. Its uptake, transport to tissues, and removal from the body are normally tightly controlled. Rare variants of virtually all of the genes involved disrupt this finely tuned balance and cause severe health consequences, usually due to slow, unchecked iron accumulation in sensitive tissues, including the liver, pancreas, heart, and brain. Relatively common gene variants, most often in the HFE gene, increase the risk of excessive iron accumulation in conjunction with alcohol excess, high iron intake, and other dietary risk factors. Accumulation of iron in tissues can be prevented or slowed with dietary adjustments, bloodletting, and sometimes liver transplantation.

Some of these polymorphisms, particularly in the HFE, TF, TNF-α, and TMPRSS6 genes, slightly decrease the risk of iron deficiency. These polymorphisms have probably become common because they conferred a selective survival advantage in an iron-poor nutritional environment (nutritope).


Pathogenesis of ring sideroblast formation

Iron accumulation in mitochondria is unique to sideroblastic anemia and is not seen in primary or secondary iron overload, which is characterized by cytoplasmic iron accumulation. This may be because of the unique regulation of heme biosynthesis in erythrocytes, in which excess iron is taken up by the mitochondria from transferrin. Such a transfer takes place via the transferrin–transferrin receptor pathway and is unavailable to cytoplasmic chelators. Studies also revealed that cytoplasmic iron not bound to transferrin is inefficiently utilized for heme biosynthesis and that the endosome–mitochondria interaction increases chelatable mitochondrial iron.

In nonerythroid cells, excess heme provides negative feedback and inhibits further iron acquisition. However, this mechanism is absent in erythroid cells, in which excess iron is avidly taken up by the mitochondria even when protoporphyrin IX is suppressed (protoporphyrin IX is the substrate for the synthesis of heme; see Figure 3). This acquisition of iron from transferrin even in the presence of excess heme is an important distinction between erythroid and nonerythroid cells and plays an important role in mitochondrial iron accumulation. It also explains the accumulation of mitochondrial iron in disorders of porphyrin biosynthesis such as ALAS2 deficiency.



Cirrhosis

S. Honigbaum, ... K.B. Schwarz, in Encyclopedia of Food and Health, 2016

Minerals

Iron losses may result from gastrointestinal bleeding, and iron stores may be depleted by repeated episodes of bleeding. Zinc and magnesium losses may affect patients who are prescribed diuretics. Zinc deficiency may also result from poor intake and decreased absorption, and it may cause altered taste perception and anorexia, which further contribute to poor intake. Low calcium levels can occur as a result of steatorrhea and the 'hungry bone' phenomenon in vitamin D deficiency. At the other end of the spectrum, copper and manganese may accumulate in the liver to the point of hepatotoxicity, and susceptible patients should be monitored.


Hyperkinetic Movement Disorders

Alisdair Mcneill, Patrick F. Chinnery, in Handbook of Clinical Neurology, 2011

Idiopathic NBIA

Idiopathic NBIA is an umbrella term used to describe all cases where imaging or autopsy shows high brain iron but in which a mutation in one of the known genes is not identified. There is evidence that idiopathic NBIA has both genetic and acquired etiologies. A general description of inherited idiopathic NBIA can be deduced from case series of PKAN (Hayflick et al., 2003) and INAD (Gregory et al., 2008). In cases of idiopathic NBIA from within a cohort investigated for PLA2G6 mutations, average age of onset in mutation-negative cases was 6.8 years (range 1–31 years) with a heterogeneous clinical presentation; optic atrophy was seen in 25% of cases; MRI revealed cerebellar atrophy in 12% and high brain iron in the globus pallidus in all (Gregory et al., 2008). In mutation-negative cases from within a series investigated for PKAN, the presentation was in keeping with the atypical PKAN phenotype, but neither speech nor psychiatric problems were observed. Age at onset was from 6 months to 38 years and inheritance was likely recessive. No mutation-negative patient had an "eye of the tiger" sign (Hayflick et al., 2003).

Several studies have described MRI features consistent with brain iron accumulation in a variety of nongenetic neurological and systemic disorders. Grossly elevated levels of iron are present in the basal ganglia and cerebral cortex of patients with multiple sclerosis, as measured by MRI techniques (Ge et al., 2007). In patients with human immunodeficiency virus (HIV) infection elevated levels of basal ganglia iron have been demonstrated by MRI (Miszkiel et al., 1997). There is also MRI evidence of elevated iron deposition in the putamen and caudate of beta-thalassemia patients (Metafratzi et al., 2001). Iron deposition in the substantia nigra of Parkinson's disease patients (Gerlach et al., 2006) and the temporal lobe of Alzheimer's disease patients (House et al., 2007) has also been shown on MRI scanning. The mechanism leading to iron accumulation in these conditions is unclear, though iron present in inflammatory cells and iron released from degenerating neurons is likely to play a role. When interpreting MRI scans in clinical practice, it must be borne in mind that there are nongenetic causes of brain iron accumulation.

There are also detailed reports of individual cases of idiopathic NBIA. Forni et al. (2008) described a 61-year-old woman who presented with chorea and high basal ganglia iron on her MRI. Ceruloplasmin was normal, and genetic testing for neuroferritinopathy and PKAN was negative. It seems most unlikely that this patient would harbor a PLA2G6 mutation. Her movement disorder responded to treatment with the hydroxypyridinone deferiprone (DFP), an iron chelator that crosses the blood–brain barrier. DFP may warrant investigation as a therapy for neuroferritinopathy and aceruloplasminemia.

Tofaris et al. (2007) described a 27-year-old woman with a progressive depressive disorder, extrapyramidal syndrome, and seizures with high iron in the globus pallidus and substantia nigra on MRI. Gene testing for PANK2 and neuroferritinopathy was negative. Cortical biopsy demonstrated alpha-synuclein-positive Lewy bodies and tau-positive neurites but no spheroids. These cases clearly show that there are other genetic and acquired causes of NBIA awaiting discovery.


Magnetic resonance imaging of the liver, biliary tract, and pancreas

Lawrence H. Schwartz, in Blumgart's Surgery of the Liver, Pancreas and Biliary Tract (Fifth Edition), 2012

Iron Deposition Disease

Iron accumulation within the liver has two main causes: hemochromatosis and hemosiderosis. Hemochromatosis is characterized by abnormal intestinal absorption of iron, which accumulates predominantly in hepatocytes until late in the disease, when there is "spillover" into the pancreatic parenchyma. The liver shows abnormally low signal intensity compared with the spleen on T1-weighted sequences. Gradient-echo sequences are the most sensitive for detecting the presence of iron within the hepatic parenchyma. Primary, or genetic, hemochromatosis is important to diagnose, because this entity may go unnoticed until late in the disease process, and its long-term sequelae include fibrosis, cirrhosis, and HCC, which may also be imaged with MRI. In addition, screening of family members is important in primary hemochromatosis, because it is an autosomal recessive trait.

Hemochromatosis must be distinguished from hemosiderosis, which is not genetically linked but is associated with multiple blood transfusions; in contrast to hemochromatosis, it has a benign course, with accumulation of hepatic iron in the reticuloendothelial system. MRI is excellent for identifying these entities, because iron changes the expected signal intensities of abdominal organs (Pomerantz & Siegelman, 2002; Queiroz-Andrade et al, 2009). In patients with hemochromatosis, the normal liver–spleen pattern is reversed on T1-weighted images. In more advanced stages, iron is also deposited in the pancreatic parenchyma. Hemosiderosis affects the spleen and bone marrow early in the disease process, with the liver being affected later. These distinguishing characteristics, together with the patient's history, allow for correct differentiation; gradient-echo imaging is sensitive for the detection and characterization of these two processes (Fig. 17.5; Kim et al, 2002a; Rofsky & Fleishaker, 1995).


Hereditary Hemochromatosis

David J. Brandhagen, in Encyclopedia of Gastroenterology, 2004

Clinical Features

Iron accumulation in hereditary hemochromatosis is a slow, insidious process; only a few milligrams of excess iron may be absorbed from the duodenum each day. Clinical manifestations frequently do not occur until at least the fifth decade, when 15–40 g of iron have accumulated (normal body iron stores are approximately 3–4 g). Disease expression may occur earlier in some persons and may not occur at all in others. Clinical expression is influenced by age, sex, iron content of the diet, blood loss as occurs in menstruation and pregnancy, and other unknown factors. Although the gene frequency is similar in males and females, the disease is expressed less frequently in women than in men. Factors such as alcohol and hepatitis C may also influence disease expression. Several recent studies have concluded that clinically significant disease expression may not occur in a majority of individuals who are homozygous for the C282Y mutation, even if undiagnosed and untreated.
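The timeline implied by these figures can be checked with simple arithmetic. A minimal back-of-envelope sketch, using only the ranges quoted above (a few milligrams of excess absorption per day, 15–40 g accumulated, against normal stores of 3–4 g); the 2 and 4 mg/day rates are illustrative assumptions, not values stated in the chapter:

```python
# Back-of-envelope: years of steady excess absorption needed to reach
# a clinically significant iron burden. Rates of 2-4 mg/day are
# illustrative assumptions consistent with "a few milligrams" per day.

MG_PER_G = 1000
DAYS_PER_YEAR = 365

def years_to_accumulate(target_g, excess_mg_per_day):
    """Years needed to accumulate target_g of excess iron at a constant rate."""
    return (target_g * MG_PER_G) / excess_mg_per_day / DAYS_PER_YEAR

# 15-40 g at 2-4 mg/day takes roughly one to five decades, which is
# consistent with clinical onset around the fifth decade of life.
for target in (15, 40):
    for rate in (2, 4):
        print(f"{target} g at {rate} mg/day: ~{years_to_accumulate(target, rate):.0f} years")
```

Even at the fast end of the assumed range, decades pass before symptomatic iron loads are reached, which is why early detection matters.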

The classic description of HH is cutaneous hyperpigmentation, diabetes mellitus, and cirrhosis of the liver (bronze diabetes). Other clinical manifestations include fatigue, abdominal pain, hepatomegaly, abnormal liver tests, hepatocellular carcinoma, cardiomyopathy, cardiac conduction disorders, hypothyroidism, hypogonadism, impotence, and arthropathy.

In the past, HH was usually diagnosed at an advanced stage; currently, most patients with newly diagnosed HH are asymptomatic. This shift toward earlier diagnosis may be due in part to increased physician awareness. Among HH patients who are symptomatic, fatigue, arthralgias, and impotence are the most common complaints. Most if not all clinical manifestations are preventable if HH is diagnosed early and treated appropriately. Some of the disease manifestations, such as skin bronzing, cardiomyopathy, cardiac conduction disorders, hepatomegaly, and abnormal liver tests, are frequently reversible once excess iron stores are removed. Most of the other clinical manifestations, however, are not reversible.


The Bone–FGF23–Klotho Axis and Associated Diseases

Seiji Fukumoto, in Reference Module in Biomedical Sciences, 2019

Hypophosphatemia by Intravenous Iron Preparations

Some intravenous iron preparations, such as iron polymaltose, saccharated ferric oxide, and ferric carboxymaltose, can cause hypophosphatemic osteomalacia with high FGF23 levels in patients with iron-deficiency anemia (Schouten et al., 2009; Shimizu et al., 2009). Iron deficiency enhances FGF23 transcription, and conversely, iron administration decreases FGF23 levels measured by C-terminal assay (Wolf et al., 2013). While ferric carboxymaltose transiently increased FGF23 measured by full-length assay, iron dextran did not increase full-length FGF23 (Wolf et al., 2013). These results suggest that some intravenous iron preparations suppress the processing of full-length FGF23 and thereby increase full-length FGF23 levels. However, it is unknown how iron preparations affect the processing of the FGF23 protein.


Practical Uses of Nutrigenetics

Martin Kohlmeier, in Nutrigenetics, 2013

7.2.4.1 Hemochromatosis

This iron retention disease is very common and important for health in many countries, accounting, for instance, for nearly 1% of all mortality in Denmark [7]. First, family members of individuals with known familial hemochromatosis must be educated about screening, because early lifestyle adjustment can reduce excess risk. Helpful measures include bloodletting (blood donation) several times a year, tempering alcohol consumption, doing away with iron-containing dietary supplements and iron-fortified cereals, and avoiding eating or handling raw seafood. Regular medical check-ups monitor iron accumulation but also need to ensure that iron levels do not fall below desirable levels in young women, particularly during pregnancy.

While most experts will agree that these simple preventive measures would reduce disease burden at modest cost (mainly arising from more frequent check-ups), there is no concerted effort to detect carriers at the earliest possible stage. When it comes to screening, most discussion focuses on whether to use transferrin saturation (the percentage of the transferrin molecule's binding capacity filled by iron) or ferritin concentration. Both tests become abnormal only after iron has accumulated to an unhealthy level. Transferrin saturation values often yield false positives due to diurnal variation, inflammation, recent food intake, or unrelated liver disease, which means suboptimal specificity. Both measurements have limited sensitivity, missing 15–30% of patients at risk, depending on cut-offs. Specificity is also imperfect, which means that a much larger number of false positives (typically 5–20%) than true positives (about 0.4%) are identified. This is less concerning, since most of the false positives are classified properly by later confirmatory tests.
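Why false positives so heavily outnumber true positives follows directly from Bayes' rule. A minimal sketch, using the figures quoted above (prevalence of about 0.4%, a test that misses 15–30% of at-risk patients, i.e. sensitivity of roughly 70–85%, and a false-positive rate of 5–20%); these are the text's illustrative ranges, not properties of any specific assay:

```python
# Positive predictive value (PPV) of a screening test:
# the fraction of positive results that are true positives.

def ppv(prevalence, sensitivity, false_positive_rate):
    """PPV via Bayes' rule: P(disease | positive test)."""
    true_pos = prevalence * sensitivity                  # diseased and flagged
    false_pos = (1 - prevalence) * false_positive_rate   # healthy but flagged
    return true_pos / (true_pos + false_pos)

# Even at the favorable end of the quoted ranges (85% sensitivity,
# 5% false positives), the 0.4% prevalence keeps the PPV low:
print(f"PPV: {ppv(0.004, 0.85, 0.05):.1%}")
```

With these inputs fewer than 1 in 15 positive screens is a true case, which is why the confirmatory testing mentioned above is an essential part of any screening program.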

The use of a comprehensive genetic scan and assessment of all potential gene variants that could predispose to iron retention is rapidly coming within our grasp. Such a scan, most likely distilled from whole-genome sequence data, will provide very early information that can be used for preventive measures before significant iron accumulation has occurred. The costs would be modest if sequence data are already available.


Drug-induced ocular side effects

In Clinical Ocular Toxicology, 2008

Clinical significance

Systemically administered iron preparations seldom cause ocular side effects. Adverse ocular reactions have been reported after multiple blood transfusions (over 100), with unusually large amounts of iron in the diet or markedly prolonged iron therapy. A few cases of retinitis pigmentosa-like fundal degeneration have been reported. Hodgkins et al (1992) described a case of pigment epitheliopathy with an overlying serous retinal detachment following an infusion of iron dextran. Newer iron preparations make retinal degenerations less likely. Kawada et al (1996) described photosensitivity reactions due to sodium ferrous citrate.

Direct ocular exposure to acidic ferrous salts can cause ocular irritation, but significant ocular side effects rarely occur.


A worldwide yearly survey of new data in adverse drug reactions

N.H. Choulis, in Side Effects of Drugs Annual, 2014

Iron

Intravenous iron preparations are used when oral iron supplementation is not feasible or may not work. All intravenous iron products have a small risk of causing allergic reactions that can be life threatening if not treated promptly. A review of intravenous iron-containing medicines used to treat iron deficiency and anaemia associated with low iron levels concluded that the benefits of these medicines are greater than their risks, provided that adequate measures are taken to minimise the risk of allergic reactions [38S].

It is therefore suggested that measures should be put in place to ensure the early detection and effective management of allergic reactions that may occur with intravenous iron preparations; hence, these should only be administered in an environment where resuscitation facilities are available so that patients who develop an allergic reaction can be treated immediately. The current practice of first giving the patient a small test dose is not a reliable way to predict how the patient will respond when the full dose is given. Therefore, a test dose is no longer recommended but instead, caution is warranted with every dose of intravenous iron that is given, even if previous administration has been uneventful. Moreover, during pregnancy, allergic reactions are of particular concern as they can put both the mother and unborn child at risk. Intravenous iron medicines should not be used during pregnancy unless clearly necessary. Treatment should be confined to the second or third trimester, provided the benefits of treatment clearly outweigh the risks to the unborn baby.


Neuroimaging of Brain Iron Deposition in Mild Cognitive Impairment and Dementia

Sven Haller MD, PD, CC, MSc, ... Panteleimon Giannakopoulos MD, in Diet and Nutrition in Dementia and Cognitive Decline, 2015

Global Iron Deposition

Brain iron deposition takes place in a small number of genetically determined young cases and also occurs as a secondary event of neurodegeneration in normal aging [31], MCI, and AD [25,32,33]. As discussed above, SWI images are obtained by calculations based on both magnitude and phase images [6]. The majority of previous neuroimaging investigations of global brain iron deposition have focused on parameters derived from phase images, as phase shift correlates with iron concentration [34–36]. These phase images, however, are not used for film reading in clinical neuroradiology and are oftentimes not stored on image servers. Although less accurate, even "standard" SWI images correlate well with brain iron content [37], allowing for a wider application of SWI to assess global degenerative brain iron deposition (Figure 53.5).


Figure 53.5. SWI.

Magnitude (A) and phase (B) images, which are combined to calculate the SWI image (C). Oftentimes, a minimal intensity projection (MinIP) is calculated from the SWI images for clinical use, highlighting in particular the vascular structures (D). In most centers, only the clinically used SWI image is stored on the image server. The phase image, while not useful for clinical film reading, allows for the most accurate assessment of brain iron deposition.

Global Degenerative Iron Deposition in Normal Aging

Brain iron accumulation during normal aging follows a clear anatomic pattern. For instance, in the globus pallidus, iron deposition begins in the posterior segment and later spreads to the anterior segment [31]. In the putamen, iron accumulation takes place initially in the lateral portion before spreading to the medial portion [31].

Global Degenerative Iron Deposition in MCI and AD

Iron content increases in the pallidum and substantia nigra in MCI, AD, and other types of dementias [33,38]. Two recent studies used phase-corrected MRI to detect brain iron accumulation in AD cases [35,36]. In the first, a significant increase in iron concentration was mainly observed in the hippocampus, parietal cortex, putamen, caudate nucleus, and dentate nucleus of AD patients. The parietal cortex was the only region with a significant difference between mild AD and healthy controls. Similarly, Ding et al. [36] reported increased iron content in the basal ganglia and the right hippocampus but only weak correlations between dementia severity and iron content in the latter area. A recent SWI study assessed iron concentration in 14 manually drawn regions of interest (ROIs) in healthy control (HC), stable MCI (sMCI), and progressive MCI (pMCI) subjects [24]. The left putamen was the only region with a significant increase of iron levels over time in pMCI compared to the sMCI and HC groups. Another recent study of MCI demonstrated that the most prominent changes in brain iron content occur in the pallidum, substantia nigra, and red nucleus [25], three subcortical areas known to have the highest iron concentration in control brains [39,40]. In summary, these results suggest that the accumulation of brain iron in the basal ganglia is associated with cognitive deterioration. This accumulation was traditionally considered an epiphenomenon of cell degeneration in subcortical nuclei. However, a disturbed balance of iron homeostasis has recently been suggested as a potential precursor of the neurodegenerative processes, at least in some AD cases [32]. Brain iron deposition is not limited to AD, and it may play a key role in the pathogenesis of other neurodegenerative diseases, notably PD [37,41–43].

Dietary and nutritional aspects of iron

The majority of previous investigations regarding dietary and nutritional aspects of iron and cognition have focused on iron deficiency in childhood. It is generally accepted that iron deficiency, the most prevalent nutritional disorder in the world, is associated with cognitive deficits in children, while the extent to which these deficits are reversible with iron repletion remains controversial [44]. The association more relevant to the current review, namely the correlation between iron intake during aging and AD, remains very controversial. A systematic review assessing the dietary implications of iron and copper concluded that neither increased nor decreased intake of iron or copper had a clear beneficial effect on cognitive performance in patients with AD [45]. An animal model demonstrated that iron deficiency in rat pups altered the regulation of six of seven Alzheimer-related genes, while iron treatment had only minimal effects, indicating that neonatal iron deficiency might induce a dysregulation of Alzheimer-related genes that predisposes individuals to long-term neurodegenerative disease [46]. Another animal model, of transgenic AbetaPP/PS1 mice, showed that although iron administration in the postnatal period was associated with cellular and metabolic imprinting in the brain in adult life, there was no effect on the appearance of amyloid plaques [47]. In summary, the available evidence on iron intake and risk of AD remains conflicting, with both increased and decreased dietary iron having been associated with AD.

Economic losses due to soiled eggs have been reduced through the use of improved methods of cleaning egg shells. Funk (1948) and Winter et al. (1958) reported that wet cleaning methods were more desirable than dry ones. However, certain minerals present in the water used for washing eggs can hasten the deterioration of egg quality. Iron has been reported to hasten bacterial spoilage both in experimentally inoculated eggs and in eggs washed under actual farm conditions (Garibaldi and Bayne, 1960, 1962a, b). Since iron imparts a brownish color to laundered goods and affects the taste of beverages, it is considered an objectionable constituent of water supplies (Riddick et al., 1958). It would be reasonable to expect that iron present in egg wash water could stain the shells of eggs.