This news release focuses on a recent study that evaluates the effectiveness of a 2014 Centers for Disease Control and Prevention (CDC) educational advertising campaign called “Tips From Former Smokers” (Tips) that aimed to encourage and help cigarette smokers quit smoking. The release highlights the health risks associated with smoking and the estimated effectiveness of the ad campaign. The release also argues that the Tips campaign was a cost-effective means of helping people quit smoking. It provides some good background data on the health-related costs of smoking as well as estimates on how many people were reached through the ad campaign. However, the release does not give readers a hard number on the campaign’s cost. It would also have been good to know how the effectiveness of the Tips campaign compared to other “quit smoking” efforts.
There are few activities as closely tied to adverse health outcomes as cigarette smoking. As the CDC notes (on its site, not in the release), “cigarette smoking harms nearly every organ of the body.” Smoking is linked to significantly increased risks of stroke, heart disease and lung cancer, with close to 500,000 deaths each year attributed to smoking in the United States. Smoking is not only a national health issue, but a global health issue. A 2015 study estimated that, globally, cigarette smoking causes approximately $500 billion in economic damage (largely through health costs) each year. For these reasons, finding effective means to help cigarette users stop smoking is clearly of widespread interest. However, that means taking a critical look at just how well anti-smoking campaigns actually work. One way to do that is through a cost-benefit analysis and a comparison of outcomes across multiple campaigns. This release would have been stronger if it had helped readers make those comparisons.
We give the release credit for providing some detail on the costs of smoking-related illnesses. However, the intervention here was a national anti-smoking campaign and the cost of that campaign was described in obscure terms:
“The Tips campaign is an important counter measure to the $1 million that the tobacco industry spends each hour on cigarette advertising and promotion,” said Corinne Graffunder, Dr.P.H., director of CDC’s Office on Smoking and Health. “The money spent in one year on Tips is less than the amount the tobacco industry spends on advertising and promotion in just 3 days.”
One can deduce from the numbers given that the campaign cost less than $72 million ($1 million an hour for three days, or 72 hours), but why spin it to this extent when it would have been easy to give us an actual number?
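For readers who want to verify the deduction, the arithmetic is simple; this sketch uses only the $1-million-per-hour figure quoted in the release:

```python
# Upper bound on the Tips campaign's cost, deduced from the release:
# the tobacco industry spends $1 million per hour on advertising, and
# Tips reportedly cost less than the industry spends in 3 days.
industry_spend_per_hour = 1_000_000      # dollars per hour
hours_in_three_days = 3 * 24             # 72 hours
tips_cost_upper_bound = industry_spend_per_hour * hours_in_three_days
print(f"${tips_cost_upper_bound:,}")     # -> $72,000,000
```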
The release is focused on a study that used survey data to estimate the impact of the Tips campaign. And the release does a good job of articulating those estimated benefits. The most concrete benefit was the estimate that 104,000 people “quit smoking for good” as a result of the Tips campaign. That’s enough to earn a “satisfactory” rating. But the release doesn’t tell readers what quitting smoking “for good” means. It appears to mean, based on the study itself, that smokers had quit for six months, which is good to know. The release also doesn’t place those 104,000 quitters in context. Based on the information in the release itself, there are 40 million adult smokers in the U.S., and approximately 80 percent of those smokers saw at least one of the Tips ads. That comes to 32 million smokers. If 104,000 of those smokers quit, then the 2014 Tips campaign helped approximately 0.3 percent of the smokers who saw it to quit. Is that good? Bad? It’s hard to tell without comparing the 2014 Tips campaign outcome to the outcomes of other campaigns, a comparison the release doesn’t make. It’s also important to note that the 104,000 figure is an extrapolation based on surveying only 4,428 subjects. Given that about 3,500 of those subjects saw the ads, the 0.3 percent quit rate corresponds to only about 10 study subjects.
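The extrapolation above can be reproduced with a few lines of arithmetic; every input figure comes from the release, and the script simply makes the implied quit rate explicit:

```python
# All inputs are figures cited in the release.
adult_smokers = 40_000_000   # U.S. adult cigarette smokers
reach = 0.80                 # share who saw at least one Tips ad
quitters = 104_000           # estimated number who "quit for good"
survey_n = 4_428             # subjects actually surveyed

smokers_reached = adult_smokers * reach           # ~32 million smokers
quit_rate = quitters / smokers_reached            # ~0.3 percent
subjects_who_saw_ads = survey_n * reach           # ~3,500 subjects
implied_quitters_in_sample = subjects_who_saw_ads * quit_rate  # ~10-12

print(f"{quit_rate:.1%}")    # -> 0.3%
print(round(implied_quitters_in_sample))
```

Running the exact figures yields roughly a dozen implied quitters in the sample, consistent with the release's own rounding to "about 10."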
Intuitively, it may be hard to consider the possibility that there could be harms attributable to an anti-smoking campaign. However, some assessments of other public health education programs have found that to be the case, and we wish this summary had at least addressed the possibility.
In 2008, for example, ABC News reported on a large Congress-mandated study of the effectiveness of a $1 billion anti-drug campaign that launched in 1998. The report was published in the American Journal of Public Health.
“The study’s authors,” ABC reported, “assert that anti-drug ads may have unwittingly delivered the message that other kids were doing drugs, inadvertently slowing measured progress that was being made to curb marijuana use among teenagers.”
“Overall, the campaign was successful in achieving a high level of exposure to its messages; however, there is no evidence to support the claim that this exposure affected youths’ marijuana use as desired,” the report said.
The release does an excellent job of explaining the 2014 Tips campaign, and of explaining why smoking is an important public health issue. However, the release offers virtually no information about how the CDC was able to estimate the impact of the Tips campaign. The only clue is this sentence: “The survey results are published in the March 24 release of the journal Preventing Chronic Disease.” If a release is focused on the findings of a study or analysis, it needs to offer some explanation of the study or analysis.
Given that results were based on a relatively small sample, all results (attempted quits, quitters) should be prefaced with “estimated.” Additionally, all results were self-reported and thus not validated. Known biases in reporting health behaviors would suggest that these estimates could be inflated. Finally, extrapolating results from an online survey to the population level is susceptible to selection bias — survey respondents may not be representative of the general population. The study acknowledges this limitation.
Normally, we ding releases or news stories that equate risk factors and health effects. There is, after all, a significant difference between the two. This release emphasizes the costs, both human and economic, of smoking, nearly to the point of disease mongering. However, given the overwhelming abundance of research linking cigarette smoking to a wide array of severe health problems, we think this may be the exception to the rule. What keeps it on the satisfactory side is the language in the release highlighting smoking as a risk factor for cancer, stroke and other health problems.
The release offers very little information about the study — readers aren’t told who performed the study or who paid for it. Two of the authors are CDC researchers, and the study itself was funded by the CDC. To be clear: neither of those facts is necessarily problematic; what’s problematic is that the release doesn’t share that information with readers.
However, since the study is evaluating a CDC initiative and being published in the CDC’s journal, savvy readers would most likely deduce CDC funding.
There are a number of other campaigns out there that aim to help smokers kick the habit, such as the FDA’s “The Real Cost” campaign. How effective have those other campaigns been? How does that compare to the 2014 Tips campaign? The release offers no information related to those questions.
The news release refers readers to the quit line and a web site containing the Tips ads.
The release clearly articulates how the 2014 Tips campaign differed from an earlier Tips campaign, and notes that Tips was the “first federally funded anti-smoking paid media campaign.” That’s enough to earn it a pass here. However, again, it would have been great to draw comparisons to other major anti-smoking campaigns. What makes Tips different? Special? More effective?
This one is borderline. We understand what the CDC is trying to accomplish, and we sympathize with its goals. However, if a release is going to include a phrase like the one in a quote that says Tips is “extremely cost-effective and a best buy,” it should provide the numbers to back that up. That said, there is nothing completely over-the-top here, and the vast bulk of the language, including the headline and lead paragraph, is written responsibly. Hence the satisfactory rating. The co-author’s quote is the element that most misrepresents the actual study findings.