Anemia is a hallmark complication of advanced chronic kidney disease (CKD) and is of multifactorial origin, including reduced production and effectiveness of erythropoietin as well as reduced availability of and access to ingested and stored iron. In a nationally representative survey of people in the United States, among individuals with a creatinine clearance of 30 mL/min or less who were relatively iron replete (ferritin ≥100 ng/mL and transferrin saturation ≥20%), 46% of men had a hemoglobin concentration <12 g/dL and 21% of women had a hemoglobin concentration <11 g/dL.1 Iron deficiency was rather common in this survey of the general U.S. population. Specifically, among participants with a creatinine clearance of 30 mL/min or less, approximately half of women and one fifth of men had a transferrin saturation of <20%, and approximately 47% of women and 44% of men had ferritin concentrations of <100 ng/mL. These data are from an era (1988-1994) when perhaps little attention was paid to managing iron status in patients with CKD; normalizing iron stores in a large proportion of the population would clearly have been an important first anemia treatment goal. I am unaware of any update of these findings using more recent population-wide data.

Iron therapy is an essential component of anemia treatment in hemodialysis patients. Considerable ongoing blood losses through the discarded extracorporeal circuit, frequent blood draws, recurrent exposure to heparin, and a propensity to bleed from the gastrointestinal tract or the vascular access insertion site all contribute to continued iron losses and depletion of iron stores. In addition, the presence of a chronic inflammatory state, also reflected in high hepcidin levels, further impairs access to and use of both orally ingested iron and already stored iron.

Treatment of iron deficiency in CKD not requiring dialysis

Historically, relatively little attention was paid to the treatment of iron deficiency. In the CKD population not on dialysis, oral iron supplementation has been the mainstay of therapy; it is only modestly effective but remains a reasonable option for most individuals. Administration of intravenous (IV) iron is certainly more effective, but it was initially unappealing since only relatively small doses of iron could be given during a single infusion session. Moreover, in the earlier days, the only formulations available were iron dextrans, first approved in 1957, which carried a substantial and much-dreaded risk of severe anaphylactic reactions.

The introduction of sodium ferric gluconate complex (1999) and iron sucrose (2000) provided presumably safer alternatives, which may have contributed to increased adoption of IV iron administration in patients with CKD. More recently, ferumoxytol, introduced in 2009, provided a treatment option for the administration of larger doses of iron (510 mg) in a single push. While billed as a safe treatment option, ferumoxytol found itself on the list of drugs with the highest numbers of spontaneously reported adverse drug events in the second quarter of 2010. Little to nothing is known about the relative effectiveness and safety of the various IV iron formulations in patients with CKD, or about different approaches to administering prescribed iron doses (e.g., dose size, frequency of administration). Registrational trials are not much help, either, as new IV iron compounds were usually compared with oral iron supplementation, an approval standard that I have always found odd, to say the least. I am unaware of any randomized trials in patients with CKD comparing one IV iron formulation to another, with the exception of a large registrational trial for a new IV iron compound, ferric carboxymaltose, which was approved in the United States for use in patients with CKD not requiring dialysis in mid-2013.2 In any case, the practice of IV iron supplementation in patients with advanced CKD seems to remain uncommon. In our own preliminary analyses of U.S. Medicare data, we found that only approximately 10% of older patients (aged 67 years or older) approaching end-stage renal disease (ESRD) received any IV iron supplementation in the 2 years before initiating dialysis in 2009.3

Treatment of iron deficiency in patients undergoing dialysis

In contrast to patients with CKD not requiring dialysis, IV iron administration in patients undergoing maintenance hemodialysis has become standard practice, especially since alternatives to iron dextrans became available in the early 2000s. In any given month in 2012, more than two thirds of hemodialysis patients received at least one dose of IV iron.4 The high prevalence of IV iron use is mainly a consequence of treatment strategies that aimed to minimize the use of erythropoiesis-stimulating agents, first in response to concerning safety signals from three large anemia trials in patients with non-dialysis CKD published in 2006 and 2009, and then in response to the implementation of the ESRD Prospective Payment System in 2011. Facilitated by the findings of the DRIVE trial and its extension (DRIVE 2), increasingly aggressive iron administration was pursued even in patients whose laboratory indicators may have suggested replete iron stores.5, 6

Relatively little attention has been paid to the specifics of iron administration. First, it has been assumed, in the absence of conclusive data to the contrary, that the available iron formulations are more or less equally effective and safe (a "class effect"). In addition, and independent of formulation, two roughly distinguishable practices have been employed in most patients:

  • “Bolus” dosing regimens, in which no iron is administered until a patient's iron parameters fall below certain critical values. The patient then receives a large amount of iron, divided into smaller individual doses, over the course of, e.g., 8-10 hemodialysis sessions. Thereafter, no further iron is administered until another routine iron status assessment indicates the need for additional IV iron.
  • “Maintenance” dosing regimens, in which relatively smaller doses of IV iron, e.g., 50-100 mg, are administered once weekly or every other week, and the dose is titrated by adjusting the frequency or the individual dose.

Surprisingly little evidence has been available on the comparative effectiveness and safety of the various IV iron formulations or of the bolus versus maintenance approaches to IV iron treatment. Through its Developing Evidence to Inform Decisions about Effectiveness (DEcIDE) program, the Agency for Healthcare Research and Quality has solicited and funded two large and comprehensive investigations (led by research teams at the University of North Carolina and Johns Hopkins University, respectively) to elucidate the comparative effectiveness and safety of various IV iron treatment strategies in ESRD.7

The individual studies from one of the study sites have begun to yield tangible results, and more evidence can be expected soon. One of the research teams used a large dataset comprising Medicare claims data and other information available through the U.S. Renal Data System, as well as detailed information from the electronic health records of a large national dialysis provider. Following an index transferrin saturation (TSAT) measurement, there was a six-week exposure ascertainment window during which IV iron dosing and formulation were defined; outcomes were then captured during a relatively short-term (4-12 weeks) follow-up window. Approximately 36% of patients did not receive any IV iron, whereas 16% exhibited an IV iron treatment pattern compatible with bolus dosing and 47% with maintenance dosing. Among those receiving a single formulation, 80% received iron sucrose and 20% ferric gluconate. Focusing first on effectiveness (hemoglobin response), Kshirsagar et al. found that bolus (versus maintenance) dosing, larger monthly doses (>200 vs. ≤200 mg/month), and iron sucrose (vs. ferric gluconate) were all associated with greater hemoglobin responses.8

[Table 1]

Focusing next on safety, the investigative team studied infectious outcomes and found that bolus iron administration was associated with a higher short-term infectious risk; this safety signal was robust across a variety of different endpoint definitions (including diagnosis codes from hospitalizations, IV antibiotic use, and infectious death).9 The excess infectious risk was particularly pronounced in patients with a central venous catheter. These findings may confirm longstanding concerns that administration of large IV iron doses increases the availability of free iron to pathogens. No associations were found between iron dosing regimens and clinically meaningful differences in health-related quality of life.10 Selected quantitative results are summarized in Table 1.

Little is known about the comparative safety of the various IV iron formulations. Several recent examples have highlighted that the default assumption of a class effect, especially with regard to drug safety, may be a flawed starting point (e.g., celecoxib, rosiglitazone, peginesatide). Among IV iron formulations, ferric gluconate has a considerably shorter plasma half-life than iron sucrose, which may have contributed to the stronger hemoglobin response in patients receiving iron sucrose in the report mentioned above.8 However, a longer plasma half-life may also affect drug safety. Indeed, Sirken et al. found an increased risk of bacteremia in patients receiving iron sucrose compared with those receiving ferric gluconate.11 No larger-scale studies of the comparative safety of IV iron formulations are currently available. Several such studies are being conducted, and results can be expected later this year.

In conclusion, IV iron administration is common among patients with ESRD and is conducted using a variety of dosing algorithms and formulations. Since the assumption of a class effect (i.e., similar effectiveness and safety among available treatment strategies) may not hold, large-scale comparative effectiveness studies are being conducted; it is hoped that they will soon enable a more evidence-based approach to iron treatment.
