Older women are at higher risk of developing osteoporosis and bone loss, which can lead to potentially debilitating bone fractures. To gauge bone strength in these patients, many doctors order bone mineral density tests every two years, the frequency at which Medicare reimburses the test. But a new study finds that for many women, the interval between screenings can safely be much longer.
The latest research, published in the New England Journal of Medicine, suggests that most women with normal or near-normal bone density scores on an initial test may not need another one for up to 15 years.
The study addresses a difficult question that many doctors caring for older patients face. Bone mineral density readings, or T scores, measure bone density at certain spots, usually the hip and spine, and express it relative to that of a healthy young adult. Patients with a T score of -2.5 or lower, which qualifies as osteoporosis, are advised to continue testing regularly and begin drug treatments to strengthen their bones. But what about women with slightly higher readings? Do they need to be monitored as often?
In the study, lead author Dr. Margaret Gourlay of the University of North Carolina at Chapel Hill School of Medicine sought to stratify these lower-risk women in order to determine how often screening would be necessary to catch the first signs of bone disease while avoiding overtesting. After analyzing other screening tests, such as mammograms and screens for colon and cervical cancer, she and her team settled on a 10% rule: for each risk group, find the longest screening interval during which fewer than 10% of women would develop osteoporosis. The threshold was purposely conservative, says Gourlay, to ensure that cases of the bone-weakening disease weren't missed.
In applying this criterion to data from nearly 5,000 women aged 67 or older, Gourlay and her colleagues excluded women who already had osteoporosis, that is, T scores of -2.5 or lower. The remaining women were divided into three risk groups: a high-risk group whose T scores ranged from -2.49 to -2.0, a moderate-risk group with scores from -1.99 to -1.5, and a low-risk contingent with scores of -1.49 and higher.
The high-risk women, not surprisingly, required the most frequent screening to adhere to the 10% rule. For them, a bone density screening interval of only one year provided the optimal chances of intervening to prevent progression of the disease. For the moderate-risk women, that interval increased to five years, and for the normal- to low-risk women, the gap between screenings could be as long as 15 years.
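The thresholds and intervals described above can be summarized in a short sketch. This is purely illustrative, not a clinical tool: the function name and return format are this sketch's own inventions, and only the cutoffs and intervals come from the study as reported here.

```python
# Illustrative sketch of the risk groups and estimated screening
# intervals described in the article. Not medical advice; the
# function name and labels are invented for this example.

def screening_interval(t_score: float) -> tuple[str, str]:
    """Return (risk group, suggested rescreening interval) for a T score."""
    if t_score <= -2.5:
        # Qualifies as osteoporosis: regular testing and drug treatment
        return ("osteoporosis", "treat and monitor regularly")
    if t_score <= -2.0:
        # High risk: T scores from -2.49 to -2.0
        return ("high risk", "about 1 year")
    if t_score <= -1.5:
        # Moderate risk: T scores from -1.99 to -1.5
        return ("moderate risk", "about 5 years")
    # Low risk: T scores of -1.49 and higher
    return ("low risk", "up to 15 years")

print(screening_interval(-2.2))  # ('high risk', 'about 1 year')
print(screening_interval(-1.2))  # ('low risk', 'up to 15 years')
```

The ordering of the checks matters: each `if` implicitly assumes the previous ones failed, so the ranges do not overlap.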
“We didn’t expect such a difference,” says Gourlay. “That tells us that we need to individualize the decisions, and that women with high scores are so different from those with low scores that they can be screened less often.”
The data should help in establishing guidelines where there currently are none, and may go a long way toward reducing health care costs associated with fractures. Gourlay notes that tracking declining bone density is a more accurate and faster way of detecting osteoporosis than waiting for fractures to occur, at which point bone disease is already well-entrenched and beyond reversal.
Still, even if guidelines based on these results are adopted, doctors need to remain flexible in advising women about when to get tested. A patient who has a normal T score but then develops cancer and loses a lot of weight, for example, may be more vulnerable to developing osteoporosis and therefore may need to get screened before the 15-year interval. And women on the borderline of normal may also need to be screened more frequently, since they may cross over into moderate risk well before their next scheduled test.
But given how incapacitating hip and spine fractures can be, providing more science-based data to help doctors and patients decide how often they need to have bone density tests is an important first step.