This news release describes the results of a large population-based study comparing several screening outcomes (including recall rates and cancer detection rates) between digital mammography and digital mammography plus tomosynthesis (i.e., Hologic Genius 3D mammography). Although the published study did find significant differences between the two screening modalities across breast tissue density categories, the differences were not significant in women with extremely dense breasts, a point the news release failed to mention.
Another caveat is that the original research did not examine healthcare savings or any clinical outcomes. Yet the manufacturer of the 3D mammography touts the potential savings and improved outcomes in the news release, suggesting to readers that these outcomes are to be expected when its technology is used.
Better detection of breast cancer through newer technology is not the same as improving clinical outcomes. Failing to mention that women with extremely dense breasts did not benefit from the newer technology can lead to more confusion for women who may falsely assume the new technology will help them.
This release also rides the coattails of new legislation, adopted by about half of U.S. states, requiring that women receive notification of their breast density along with their screening mammogram results. These mandated breast density notifications are not well regarded by a number of experts, since their value to patients is questionable. What are women to do with this information?
No discussion of costs or whether the addition of 3D mammography is covered by insurance plans. This is a major oversight considering the release touts the potential for significant cost savings. For example, the release states:
“This has the potential to provide health systems and insurance companies with significant cost savings, reduce patient stress and expenses, and alleviate challenges for referring physicians who are tasked with relaying mammography results to their patients.”
A quick online scan of the cost of this newer technology finds that although the 3D test is FDA-approved, it still isn’t covered by most insurance plans. At least one large clinic using the 3D technology asks patients for an out-of-pocket payment of $60, and then the 2D portion of the imaging test is billed to insurance. The patient is liable for any outstanding balance after insurance has paid.
The news release mentions a 50% increase in invasive cancer detection, which on the surface sounds impressive, but the release doesn’t provide enough detail to understand what that 50% increase really means. According to the original study, digital mammography plus tomosynthesis increased detection of invasive cancers from 3 per 1,000 screens to 4.5 per 1,000 screens, and this high percentage increase was seen in only one of the four breast tissue density groups. Two of the other groups had smaller increases. (“Mammograms are categorized into four density groups: almost entirely fat, scattered fibroglandular densities, heterogeneously dense, and extremely dense,” according to the release.)
The news release failed to mention that in women with extremely dense breasts, the 3D mammography was not significantly better at detecting invasive cancer than digital mammography alone. The published study called for caution “in drawing conclusions regarding the performance of tomosynthesis [3D mammography] for the small proportion of women with extremely dense breasts.”
We also question the release’s portrayal of 3D mammography providing a “high percentage increase” in detection over conventional mammography. Our calculations show an increase of only 0.15 percentage points in detection rates (a 0.3% detection rate for conventional vs. 0.45% for 3D).
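To make the distinction concrete, here is the arithmetic behind the two framings, using the detection rates reported in the study: the relative increase is (4.5 − 3.0) / 3.0 = 50%, while the absolute increase is 4.5 per 1,000 minus 3.0 per 1,000, or 1.5 additional invasive cancers detected per 1,000 screens (0.45% vs. 0.30%, a difference of 0.15 percentage points). Both figures describe the same data; the relative figure simply sounds far more impressive.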
Finally, we need to address outcomes. There is no evidence from the study that finding these additional cancers leads to better outcomes. Finding more cancers with a more sensitive test could be leading to over-diagnosis. Furthermore, finding these cancers earlier is of uncertain value in the absence of outcomes data, which ideally should come from a randomized trial to avoid lead-time bias.
The many broad claims in the release are not fully supported by the evidence.
There was no explanation of the potential harms of integrating 3D mammography into current breast cancer detection, or of the fact that this technology does not appear to benefit women with extremely dense breasts. It is unclear whether this technology could result in more false positives or over-diagnosis, or what the negative predictive value of the test is (the probability that people with a negative screening test don’t actually have the disease).
If these tests lead to over-diagnosis then women face the harms (and costs) of unnecessary treatment.
Although the research letter in JAMA noted that adding tomosynthesis did not produce a significant improvement in breast cancer detection for women with extremely dense breasts (one of the four categories of breast density), the news release does not mention this limitation of the study. The news release also takes the evidence from the study and extrapolates to health care cost savings and increased quality of care for women, neither of which was examined in the study. In fact, the original study did not look at clinical outcomes and noted this as a limitation, but the news release does not discuss this caveat.
Furthermore, we do not know whether the apparent benefits — in terms of increased cancer detection and a decrease in recalls — would be mirrored in routine practice. The results were based on mammograms performed at high-volume sites in a clinical trial. The published research also noted that the study included “insufficient follow-up to determine if increased invasive cancer detection improved clinical outcomes.”
The report did not appear to engage in disease mongering.
The news release is distributed by the company that makes the 3D mammography so we can assume it played a role in funding the study. What’s troubling is that the release gives no mention of potential conflicts of interest — or the absence of conflict — among the many investigators involved with the study. The published research notes that none of the researchers reported a potential conflict but that five of the researchers received research grant funding from Hologic. Also concerning is a note included with the disclosure section of the study: “The study contracts and data sharing agreements specified that Hologic had the right to review all publications prior to submission, but could not mandate any revision of the manuscript or prevent submission for publication.”
At a bare minimum, the release should have noted that at least some of the researchers received grant funding from the manufacturer.
The study compared digital mammography alone with digital mammography plus 3D mammography.
The news release does mention that 3D mammography has been available since 2011 and includes a link to a website where people can enter their zip code to find locations that offer the additional screening test.
The claim to novelty is found in this statement in the release:
“This new analysis confirms that Genius™ 3D MAMMOGRAPHY™ exams reduce unnecessary follow-up exams in dense-breasted women. This has the potential to provide health systems and insurance companies with significant cost savings, reduce patient stress and expenses, and alleviate challenges for referring physicians who are tasked with relaying mammography results to their patients.”
For the reasons stated above, we think the release makes questionable claims and therefore fails the novelty test, so we’re rating it not satisfactory. In addition, the 3D screening technology has been available since 2011.
While the release generally uses appropriate language, we’re flagging it for the claim that “an estimated 10 million women in the U.S. benefited from a Genius exam.” Even if 10 million women received exams, not all of them benefited. We think the release should have included some backup for a claim this bold.