Radiation May Be a Greater Cancer Risk for Adults Than Doctors Thought

From the sun’s ultraviolet rays to the weak cosmic radiation we absorb on plane flights to the screening tests that doctors recommend, our bodies are constantly bombarded with small but relatively consistent doses of potentially cancer-triggering radiation. And although doctors had thought that the cancer risks posed by such exposure declined with age, a new study reveals that the hazard may be greater in adults than previously believed.

In a study combining analysis of cancer rates among U.S. adults with cancer data from survivors of the 1945 atomic bombings of Japan, researchers led by David Brenner at the Center for Radiological Research at Columbia University Medical Center found that in middle age the risk of developing cancer due to radiation exposure is actually twice as high as earlier models had predicted.

Previous estimates had assumed that the danger of radiation gradually declined with age, and that young children were the most vulnerable to radiation-induced cancers. That’s because radiation appears to target quickly dividing, still-developing cells, which are more plentiful in children, making them more susceptible to the genetic changes that radiation can cause.

But in recent years, cancer researchers have also come to appreciate that tumors can be triggered by cancer cells-in-waiting: cells that are ready to divide abnormally at the slightest provocation, such as exposure to X-rays. With age, adults accumulate more of these primed cells, and Brenner’s group wanted to find out how much they contribute to radiation-induced cancers later in life.

He created a model that accounted for both of these cancer-causing pathways and found that radiation can indeed still be an important cancer trigger for middle-aged adults. The finding, he says, may have an impact on the popularity of radiation-based screens, such as whole-body CT scans, which are increasingly being used for preventive health reasons. “The benefit from screening is that we potentially find cancers earlier or whatever else the patient might be at risk for,” says Brenner. “But there is a risk from the screen itself, from the radiation exposure, and we have typically said that they are age dependent and that age balances in favor of screening for middle-aged adults. But our results [now] show that for a middle-aged person around 50 or so, that risk came out to be twice what the standard models had predicted.”

Brenner stresses that CT scans are a medical necessity in many cases, such as after an accident to assess damage to the head or to detect broken bones, or when doctors are trying to pinpoint the location of a tumor. In those situations, he says, the benefits of scanning outweigh the risks posed by the radiation, and the scans are therefore justified. But in circumstances where the benefit is less clear, such as with routine screening, the risks may now tip the balance in favor of avoiding that exposure.

Brenner’s study, published online today by the Journal of the National Cancer Institute, is likely only the first in a series that will further clarify the relationship between radiation exposure and cancer. “The more understanding we have of how radiation-induced cancer actually works is a good thing,” he says.
