A recent study in Ontario, Canada, revealed iron deficiency (ID) to be common in pregnancy. The odds of being screened for ID during pregnancy also differed by socioeconomic status. Study results were reported in the journal Blood Advances.

In this retrospective cohort study, researchers evaluated pregnant patients who had prenatal testing in community laboratories throughout Ontario, with the goal of estimating the prevalence of ferritin testing. Other goals included estimating the prevalence and severity of ID and evaluating patient-related factors associated with the likelihood of ID screening.

A total of 44,552 pregnant patients were included in the analysis. Overall, 59.4% had ferritin measured during pregnancy. Of those tested, 62% had a single ferritin test, and 38% had 2 or more tests. The majority (71.4%) of ferritin tests were ordered at the index date, defined as the date of pregnancy diagnosis.

In an analysis that excluded patients with ferritin above the upper limit of normal, 25.2% of remaining patients showed iron insufficiency (ferritin, 30-44 µg/L) and 52.8% had ID (ferritin <30 µg/L) at some point during pregnancy. Severe ID (ferritin <15 µg/L) was seen in 23.8% of patients, and anemia was found in 8.3%. Patients found to have anemia were not always followed with subsequent ferritin testing during pregnancy, although follow-up testing appeared more likely with more severe anemia.

Patients were also evaluated for the likelihood of ferritin testing by income status. Patients in the highest annual household income quintile appeared generally more likely to receive ferritin testing. With the highest quintile as the reference, the odds ratio (OR) for ferritin testing was 0.83 (95% CI, 0.74-0.91) in the lowest income quintile, 0.82 (95% CI, 0.74-0.91) in the second-lowest quintile, 0.91 (95% CI, 0.82-1.01) in the middle quintile, and 0.86 (95% CI, 0.77-0.97) in the second-highest quintile.

While most ferritin tests were ordered at the time of pregnancy diagnosis, the study authors noted that the first trimester is the period of pregnancy when ID risk is lowest. They indicated that first-trimester screening may allow sufficient time for iron supplementation, but that omitting ferritin testing in later trimesters can miss cases of ID that arise during periods of greater risk, when intravenous iron supplementation may be warranted.

“In conclusion, among pregnant patients tested at a nonhospital-based Ontario laboratory, ID affected more than one-half of pregnancies, with 1 in 4 complicated by severe ID,” the researchers wrote in their report. Even so, only 59.4% of pregnant patients underwent ferritin testing for ID screening, and the study revealed differences in the odds of screening by income. The researchers recommended that gaps in care related to ID testing and management be addressed in guidelines.

Disclosures: Some authors have declared affiliations with or received grant support from the pharmaceutical industry. Please refer to the original study for a full list of disclosures.

Reference

Teichman J, Nisenbaum R, Lausman A, Sholzberg M. Suboptimal iron deficiency screening in pregnancy and the impact of socioeconomic status in a high-resource setting. Blood Adv. 2021;5(22):4666-4673. doi:10.1182/bloodadvances.2021004352