Having acknowledged the overall quality of the coverage, we’d note that our wish list for this story includes a couple of contextual details. Since the story advocates for a second opinion from a pathologist, it would’ve been helpful to explain how easy it might be to get that opinion and whether insurance would cover it. We also wish the story had discussed the factors associated with worse performance in diagnosing difficult cases. Specifically, the study found that pathologists who interpret fewer breast biopsies, who practice in nonacademic settings, and who work in small practices showed greater disagreement with the consensus diagnosis.
Each year nearly 1.6 million women in the United States undergo a breast biopsy. Clinicians and patients depend on an accurate pathology report to decide whether treatment is necessary, determine the future frequency of screening, and choose the right screening methods.
Since the first sentence of the piece focuses on “another reason for getting a second medical opinion,” readers would have benefited from some discussion of the costs of that opinion. The issue is a little murky. While many insurance policies cover a second opinion, this is traditionally understood to mean a second surgeon reading the original pathology report. Getting a second read on the pathology itself is less well established, and coverage is not as clear.
The article discussed the rates at which average pathologists misdiagnosed the presence of DCIS and the presence of abnormal, precancerous cells compared with a panel of leading experts. We’ll call this good enough for a satisfactory rating. However, the piece should also have noted that these abnormal, precancerous cells are termed atypia, since this is the term women would see in a pathology report. In addition, the story failed to quantify the rate at which pathologists correctly diagnosed invasive breast cancer (96%). It says merely that the results show pathologists “are very good at determining when invasive cancer is present in breast tissue.” By contrast, rates of misdiagnosis are precisely quantified. Quantifying the reassuring rates of correct diagnosis would have provided more balance to the story.
The piece explains that accurate diagnosis of both DCIS and abnormal, precancerous cells (atypia) is important. Up to 160,000 American women are diagnosed with atypia each year. Because the study determined that less than half of these diagnoses are accurate, tens of thousands of women may be getting inappropriate treatment: over-treatment or under-treatment.
The story does enough to earn a satisfactory rating here. It notes that the study was an experiment and that pathologists were not permitted to consult with other pathologists when determining a diagnosis. As a result, the results do not necessarily reflect real-world practice. The story also quotes a JAMA editorial that accompanied the study, which observed that “the study lacks information on patient outcomes, so there’s no proof that the experts made the correct diagnosis.”
The article could have provided some more details about the expert panel, who unanimously agreed on a diagnosis only 75% of the time. Also, study participants reviewed only one slide, not the multiple slides that pathologists usually review when determining a diagnosis.
While this story does not disease monger in the classic sense, we wish that it had placed more emphasis on the rate of correct diagnosis for invasive cancers, which is somewhat reassuring. The uncertainty here surrounds the cases that fall into a gray area and don’t as clearly require immediate treatment. In addition, the article states that DCIS “can sometimes spread so usual treatment is surgery and radiation.” Readers should also have been told that watchful waiting is an option with a DCIS diagnosis. The cumulative effect of these omissions throws the story off balance, a problem we’ll address here with a Not Satisfactory rating.
The piece includes a quote from Dr. David Rimm, a Yale University pathology professor and co-author of the accompanying JAMA editorial.
There are no proven alternatives to a pathologist’s interpretation of biopsy slides.
The article includes the information that 1.6 million breast biopsies are performed in the United States each year, suggesting that access to the procedure is widespread. As noted above, it would have been helpful to have more information about the availability of a second pathologist’s interpretation of tissue samples. In addition, it would have been very helpful to include information from the study about the factors associated with pathologists who showed greater disagreement with the consensus-derived reference diagnosis. Specifically, the study found that pathologists who interpret fewer breast biopsies, who practice in nonacademic settings, and who work in small practices showed greater disagreement with the consensus diagnosis.
The study notes, “Previous research has shown that interpreting mammograms can also be tricky and lead to under- or over-treatment.”
The story includes original reporting and quotes from the lead researcher, so we can be sure it’s not based on a news release.