
Issue 23 Interview Transcript

Adam Weiss: Welcome to the JNCI Podcast, a production of the National Cancer Institute. I'm Adam Weiss.


Adam Weiss: Next time you turn on the TV or open the newspaper, stop to think about what an amazing and scary medical world they're reporting on. Almost every week there's a new breakthrough in cancer treatment and a new food or behavior you'd better avoid if you want to live a long and healthy life. Of course, the medical world you see on TV is different from the real world, whether you're watching House or the news, and that realization can be scary too. To try to address this problem in health and science reporting, Steve Woloshin, Lisa Schwartz, and JNCI Editor-in-Chief Barnett Kramer wrote an editorial published online on November 20th called "Promoting Healthy Skepticism in the News: Helping Journalists Get It Right." Doctors Woloshin and Schwartz are professors of medicine at the Dartmouth Institute for Health Policy and Clinical Practice and general internists at the VA Hospital in White River Junction, Vermont. They've worked with the National Institutes of Health for a number of years on a Medicine in the Media workshop, and they join me now from their offices at the VA Hospital. Welcome.

Steven Woloshin: Hi.

Lisa Schwartz: Thanks for having us.

Adam Weiss: So, you wrote about how what we see in the news can be way off in terms of the degree to which a medical discovery is good or bad for you. And I'm not just talking about the "how your hair dye could kill you" stories here, but also these breakthrough cures that come up all the time. You wrote about how we could improve some of this, and how some of that improvement would come through changes in both the journalists and the scientific journals.

Steven Woloshin: That's right. We think there are a few things that need to be done to improve the quality of the news reporting. One thing is to make sure that the materials that journalists are working with, the source materials, are as good as they can be. So part of that means hopefully journal editors will do, you know, a really good job of making sure that all the numbers are there in a really accessible format and that they highlight cautions. And then the same thing about journal press releases, because that's a direct way that the journals communicate with journalists. And then finally, of course, the journalists themselves need to be really sensitized to the importance of using numbers and highlighting cautions; otherwise, you know, there's this real big danger of a lot of exaggeration.

Adam Weiss: Now in the editorial you said that we know that some of these stories are exaggerated. And I know that when I watch TV or read the news, and you know that as doctors when you see these things. But do you think the general public does? Do you think they know that something that's billed as a breakthrough in cancer research maybe isn't?

Lisa Schwartz: No, I think that's the problem. I mean especially on TV news because so little time is given to each of these stories and they're said without any real explicit warning about maybe the very preliminary nature of these studies and I think that that's why it's such a problem because it gives the public a false sense. And, you know, we don't have a good place where people can go to sort of get the truth behind the headline to have that other take because there isn't a consistent place where someone is interpreting this information for them except for the media.

Steven Woloshin: I think-- I mean, I agree with you, Lisa. But I think that a lot of people don't know, and I think there's also an effect from that sort of conflicting exaggeration. One week, you know, something is great for you and the next week it's terrible for you, and so on. I think that may also induce a lot of cynicism, and people just stop believing health news altogether. So I think both of those things happen.

Adam Weiss: And you had a great example of something just like that that you wrote in the editorial. You talked about some coverage of whether alcohol is good or bad for you. And it seems like it goes back and forth every few months when you hear this. And this was the case of reporting on alcohol raising the risk of cancer among women, but not how much of a risk and not how much alcohol in ways that you thought were well explained.

Steven Woloshin: Yeah, I mean the problem here was that when it was covered in the news, CNN for example said that, you know, there's no level of alcohol consumption that can be considered safe when it comes to cancer. And so that means something like three quarters of the women in the United States should be worrying, because they've consumed some alcohol. The problem is they never reported on how much the risk of cancer was increased by alcohol exposure. And it turns out in the study the investigators found that the risk of a breast cancer diagnosis increased from about 2 percent for the people who had the lowest amount of drinking to 2.6 percent for the people who had the highest amount of drinking, which was more than 15 drinks a week, over the course of seven years. So that 0.6 percent increase, or 6 more cancer diagnoses out of every 1,000 women, may seem much smaller to people than when they just hear, oh, no amount of alcohol is safe.

Adam Weiss: And one thing that you also said in the article was that putting things in absolute terms like that, saying there's a half-percent increase overall, is something that has more of a grounding effect on people than saying, in that case, "Oh, 25 percent more cancer."

Lisa Schwartz: Right, because I think in general the bigger relative changes seem much more impressive than the small absolute changes, and fortunately in this study the rate of breast cancer diagnosis wasn't that high, so even a 25 percent increase isn't a very big absolute difference. And of course the most fundamental thing is whether that's really even true. Because, you know, these come from studies where we observe people and just see what happens, and whether there's something really different about the people who drink in a certain way, whether it's something about those people rather than the alcohol itself that explains differences in breast cancer risk, is always open to question when you have an observational study like that. So, I mean, part of why it's always flip-flopping is that these are imperfect kinds of studies, and we have to have some humility about the fact that, you know, they're estimates, they're not truths. By helping people to routinely understand that these studies can't tell us anything definitive, I think we might help people to understand why things are changing all the time.
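
[Editor's note: The absolute-versus-relative distinction discussed above can be made concrete with a quick calculation. This is an illustrative sketch using only the rounded rates quoted in the interview (2.0 percent and 2.6 percent), not the study's own figures; on these rounded numbers the relative increase works out to 30 percent rather than the roughly 25 percent cited in the coverage, which itself shows how sensitive relative figures are to small rounding in the underlying rates.]

```python
def risk_summary(baseline, exposed):
    """Compare two event rates in absolute and relative terms."""
    abs_increase = exposed - baseline            # percentage-point difference
    rel_increase = abs_increase / baseline       # fractional (relative) change
    extra_per_1000 = abs_increase * 1000         # extra cases per 1,000 people
    return abs_increase, rel_increase, extra_per_1000

# Rounded rates quoted in the interview: about 2.0% for the lightest drinkers,
# 2.6% for the heaviest (more than 15 drinks a week) over seven years.
abs_inc, rel_inc, per_1000 = risk_summary(0.020, 0.026)
print(f"Absolute increase: {abs_inc:.1%}")             # 0.6 percentage points
print(f"Extra diagnoses per 1,000 women: {per_1000:.0f}")
print(f"Relative increase: {rel_inc:.0%}")             # sounds far more dramatic
```

The same 6-in-1,000 difference can honestly be described as either a 0.6 percentage-point increase or a roughly 30 percent relative increase, which is exactly why the guests argue for reporting both numbers.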

Adam Weiss: Now some of the stories you talked about in the editorial were done by very senior reporters, and some of them are well-respected doctors. Shouldn't they know better than to report something that's really early-stage?

Steven Woloshin: You know, just because someone is a doctor doesn't mean that you're an expert in understanding research or communicating research. It doesn't mean that you're going to be a good journalist. The set of skills that make you a good doctor, a good researcher, or a good journalist are not necessarily the same. So just because someone has a certain title doesn't mean that, you know, you turn off your skepticism. The other thing is that TV journalists work under a lot of constraints. They have very little time. Some of the culture of TV reporting is such that they often don't use a lot of numbers, so they're not giving the kind of data that we think is important. Although I think it's ironic because, you know, they do use numbers in other contexts. For example, when they're reporting sports, they'll give you the whole score. They won't just say the Yankees won; they'll tell you by how much they won. They won't give a relative score; they'll give the absolute score. So it's possible, and they seem to do it better there, but for some reason that doesn't really seem to be the culture in health reporting.

Adam Weiss: But it's not just the journalists. In your editorial you said that there's plenty of blame to go around. And it seems like you're also saying that the information they're drawing from may be leading them to report in certain ways that aren't really accurate.

Steven Woloshin: Well, that's right. I mean, there's a lot of self-interest involved that all sort of reinforces exaggeration. So, you know, researchers want their studies published by the journals, so there's an incentive to make results look as strong as you can. And that's probably one reason why you often see relative results only, without absolute risks, in journal articles. And then the journals want their stories to get picked up by the newspapers, so the journal press releases are often hyping things. You know, we've done a study showing that they're quite bad in that respect. And then individual journalists want their stories to be on the front page and be prominent. So, you know, there are reasons they want their stories to have real bite, or screaming headlines, or great sound bites. So there are all sorts of levels that contribute to hype.

Adam Weiss: That's a pretty intimidating chain of people to have to fight against to get change in this area. But you do have some ideas. What do you think we should do about it?

Lisa Schwartz: Well, I mean, what the journals can do, as the Journal of the National Cancer Institute is starting to do, is to try to make it easier for journalists to get it right, and that's by making sure that the journal article and/or the press release consistently make it easy to get the right numbers, which, you know, we think are like the scores in the baseball game. And to be able to see the limitations of the study in an obvious way, so that at the same time that they're considering reporting it they actually know what the journal's take on this article is. Because all studies have limitations, and the question is how to figure out how important those limitations are and to cast the results in light of those limitations. And so the Annals of Internal Medicine, in their abstracts, have a header called Limitations to help make it more obvious to everybody, rather than having to read through the whole discussion of the article to find out what the problems are with the study. And the British Medical Journal is going to move to these short summaries, which also will highlight the numbers and the cautions. And I think those are really important efforts to make it easy, you know, not just for journalists but for doctors who are making clinical decisions, to have that information easily accessible.

Adam Weiss: And you're actually working with JNCI to try to make this even easier, to provide resources for journalists.

Steven Woloshin: That's right. We've posted these materials that we've developed over a number of years in our work with the Medicine in the Media Symposium. And there are a couple of different tip sheets. One of them, which emphasizes the need for highlighting cautions in reporting about research, tries to make it easy for journalists by providing them with language that they should, you know, feel free to lift and use, or edit as they see necessary, for different scenarios. So for example, for a randomized trial there's a certain set of cautions that come up. For observational research, like the study that we were talking about before, the alcohol study, there's language so they don't have to reinvent the wheel but can clearly draw the reader's attention to the fact that they have to be very careful because of the possibility of confounding: other things might explain why these women had higher rates of breast cancer diagnosis. And the same for, you know, animal studies, or lab studies, or surrogate outcome studies, where you just don't know if these things will translate into meaningful patient outcomes. So that's the first of the tip sheets. The other tip sheet is about numbers, and it really has two purposes. One is to educate journalists. There are a lot of statistics that they see all the time: relative risks, absolute risks, relative risk reductions, odds ratios, numbers needed to treat, and so on. So it has simple definitions.

Adam Weiss: And one of the things that I thought was most useful about the other tip sheet was that you actually have some questions-- you call them questions to guide your reporting. And you've actually said that if you can't get answers to these questions maybe you shouldn't write the article.

Lisa Schwartz: Right, because I think that's what we want to encourage. I mean, sometimes, you know, we'll have reporters call us; after our years of teaching, they'll call us and ask us about things. And the researchers aren't always as forthcoming as they should be with the answers to some of these questions. And so I think by journalists sort of saying, well, if you're not going to tell me that, then maybe I'm not going to do this story, maybe that will set the bar higher for providing that information.

Adam Weiss: Well, from what I see, and I'm sure from what you see, the bar needs to be higher. So I'm glad to know that reporters, press officers, scientists, and of course doctors will have the opportunity to communicate these studies better, whether it's to their patients or the public. Thanks for telling me about the project.

Lisa Schwartz: Well thank you so much for having us.

Steven Woloshin: Thank you.


Adam Weiss: And thank you for listening to the JNCI Podcast. For more interviews, audio summaries of JNCI issues and more information about today's topic, visit jnci.oxfordjournals.org. To get in touch with us, send an email to podcasts@oxfordjournals.org or follow us on Twitter. We're at @JNCI_Now. If you liked this episode, please share it with your colleagues and friends. I'm Adam Weiss. Thanks again for listening.

