By Karen Pallarito
WEDNESDAY, July 27, 2011 (Health.com) — Many radiologists rely on specialized computer software to pinpoint suspicious areas in routine mammograms. But in a large new study, the technology failed to improve breast cancer detection and also increased a woman's risk of being told she had an abnormal mammogram when she was, in fact, cancer free.
The study analyzed 1.6 million mammograms taken at 90 radiology facilities in seven states between 1998 and 2006. The findings, which appear online in the Journal of the National Cancer Institute, extend and confirm the results of a controversial 2007 study from the same research team that cast doubt on the value of the technology, known as computer-aided detection (CAD).
"Women should probably understand that CAD is probably being used to interpret their mammogram and that it's probably not helping detect breast cancer earlier," says Joshua J. Fenton, MD, the lead author of both studies and an assistant professor of family and community medicine at the University of California at Davis School of Medicine, in Sacramento.
Routine annual mammograms are widely recommended for women 40 and older. The breast X-rays can detect cancer at an early stage, when it's most treatable, but they aren't perfect; they miss up to 20 percent of breast cancers, according to the National Cancer Institute.
Having two radiologists interpret a mammogram has been shown to improve the detection rate. In recent years, studies have found CAD software—which scans mammogram images and highlights areas that may require a closer look—to be just as effective in detecting cancers as a second pair of eyes.
Vijay M. Rao, MD, chair of the radiology department at Jefferson Medical College, in Philadelphia, says that in light of the new evidence, radiologists should use more discretion in interpreting CAD results. "Is it a legitimate pair of eyes? Is it really doing the job that the radiologists want it to do?"
The Food and Drug Administration approved the first CAD software in 1998, after a series of small clinical studies found that CAD could boost breast cancer detection without causing an unacceptable number of false-positives, cases in which doctors mistakenly identify benign abnormalities as cancerous.
CAD is now used in roughly three of every four screening mammograms, according to a 2010 analysis of Medicare data published in the Journal of the American College of Radiology. Medicare pays doctors an additional $12 per mammogram for CAD over and above the reimbursement for the mammogram itself, which ranges from about $80 (for conventional film-screen mammograms) to $130 (for newer digital mammograms).
In the new study, which included more than 680,000 women, the researchers examined film mammograms because too few digital mammograms were conducted during the study period to allow a thorough analysis. (Breast X-rays produced on film must be converted to digital images before they can be analyzed by the computer software.)
The detection rate for noninvasive breast abnormalities improved at radiology facilities that adopted CAD technology, but, crucially, the rate did not improve for invasive breast cancers, the dangerous type that invades healthy tissue in the breast or other parts of the body.
Moreover, in facilities that began using CAD, the percentage of women with abnormal mammograms who were accurately diagnosed (a measure known as "positive predictive value") dropped from 4.3% to 3.6%.
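For readers unfamiliar with the term, positive predictive value is simply the fraction of positive (abnormal) results that turn out to be true cancers. The sketch below illustrates the calculation; the counts are hypothetical, chosen only so the result matches the 4.3% pre-CAD figure reported in the study.

```python
def positive_predictive_value(true_positives: int, false_positives: int) -> float:
    """PPV = true positives / all positive (abnormal) results."""
    return true_positives / (true_positives + false_positives)

# Hypothetical example: suppose 1,000 women receive abnormal mammograms
# and 43 are later confirmed to have cancer. The PPV is 4.3%, matching
# the pre-CAD figure in the study.
ppv = positive_predictive_value(43, 957)
print(f"{ppv:.1%}")  # 4.3%
```

A drop in PPV from 4.3% to 3.6% means that, after CAD was adopted, a smaller share of the women called back for an abnormal result actually had cancer.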
Rates of false-positives and "recalls"—being called back for further testing—increased slightly after facilities implemented CAD. However, the biopsy rate declined over time regardless of whether CAD was used.
In a similar study published in the New England Journal of Medicine in 2007, Fenton and his colleagues reported that CAD reduced the accuracy of mammograms and led to a higher rate of false-positives.
The new investigation shared the same overall design, but the researchers addressed criticisms directed at the earlier study by including a larger number of CAD screens and by excluding mammograms interpreted in the first three months after a facility adopted CAD, when radiologists were becoming accustomed to using the technology.
Despite these improvements, it's still unclear from the new study whether individual radiologists used CAD correctly, or even whether they used it at all, says Carol H. Lee, MD, chair of the American College of Radiology's Breast Imaging Commission and a New York City–based breast-imaging specialist.
"It makes me think that we as a medical community need to further evaluate the use of CAD," says Lee, who does not use CAD in her own practice. "But I don't know that just based on this study that we should abandon this technology."
The Medical Imaging & Technology Alliance, an Arlington, Va.–based trade association that represents medical imaging developers and manufacturers, said in a prepared statement that other recent studies have demonstrated the benefits of CAD. Women should have access to the "right scan at the right time," the association said, whether it's mammography with CAD or another imaging technique.
In an editorial accompanying Fenton's study, Donald A. Berry, PhD, chair of the department of biostatistics at MD Anderson Cancer Center, in Houston, says economic incentives—including the use of CAD as a defense in malpractice suits—"may stoke its continued proliferation."
Researchers and device companies should work to make the software better, but in an experimental setting and not while exposing millions of women to a technology that may do more harm than good, Berry writes.