Journalists spread bad science all the time. In fact, a science journalist once called attention to the problem by duping media organizations worldwide into running bad reporting on the most clickbait-y of all health and nutrition subjects: chocolate and weight loss.


Chocolate and weight loss might not seem like that big of a deal, but by being uncritical about the quality of scientific research and disseminating bad information, misinformation, or even pseudoscience, journalists are doing everyone a disservice.


We asked Dr. Rebecca Goldin, director of STATS at Sense About Science USA (an organization dedicated to helping journalists understand statistics) and a professor at George Mason University, how to read scientific studies with a skeptical eye.


1. Evaluate the study for yourself

Don’t rely on the press release, and if another news outlet has covered the same study, don’t trust that the reporter got it right. When you’re looking at the study, you’re trying to understand the study design.


The study design refers to the methodology: what question was the study trying to answer, how did the researchers collect and evaluate the data to answer it, and do the data ultimately answer the question?


This can reveal whether there are shortcomings or limitations in the way the study was conducted, and it can help you understand whether a link between two things (like eating chocolate and weight loss) shows causation or only correlation.


Dr. Goldin says you shouldn’t just presume that the study was well-designed. “A lot of times [researchers] haven’t designed [studies] well, and other times they’ve designed them as well as they could, but there are limitations, and it’s important to wrap our heads around those limitations,” she says.


2. Ask an independent expert

Independent experts with no stake in the study can help you understand its design, limitations, and biases; explain what the numbers mean and help you do the math; and, finally, help you assess whether or not the study is significant. And as in all journalism, you want the credibility of an independent expert’s voice in your reporting.


3. Share details on the study design in your story

An important number you’re looking for is the sample size — simply the number of subjects involved in the experiment. If the sample size is small, you should be wary.
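One way to see why small samples deserve caution is a quick simulation. This sketch uses made-up numbers (a hypothetical treatment to which exactly half the population responds) and is only an illustration, not anything from the study discussed here:

```python
import random

random.seed(1)  # fixed seed so the illustration is reproducible

# Hypothetical: 50% of the population responds to some treatment.
TRUE_RATE = 0.5

def observed_rate(n):
    """Fraction of responders in one random sample of n subjects."""
    return sum(random.random() < TRUE_RATE for _ in range(n)) / n

# Ten small studies (10 subjects each) vs. ten large studies (1,000 each).
small = [observed_rate(10) for _ in range(10)]
large = [observed_rate(1000) for _ in range(10)]

print("n=10   :", [round(r, 2) for r in small])
print("n=1000 :", [round(r, 2) for r in large])
# The small studies swing wildly around 0.5; the large ones cluster tightly.
```

Any single small study can land far from the truth purely by chance, which is why a striking result from a handful of subjects is weak evidence on its own.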


Researchers will often phrase the findings of the study in a more measured way than the press release does. And as Dr. Goldin said, you’re trying to figure out what the study’s limitations are.


Researchers will disclose these limitations in the study itself, and you should try to understand them and explain them to your audience. Finally, don’t just rely on results that are statistically significant (meaning it is very unlikely they occurred by chance); ask whether the results are clinically significant.


“Statistical significance is a technical word,” Dr. Goldin says. “Clinical significance is something we can all understand: Is the work important? Is the outcome something we really care about?”
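The gap between statistical and clinical significance is easy to see with numbers. Below is a minimal sketch using a pooled two-proportion z-test on invented data: the same tiny half-point difference (50.0% vs. 50.5%) is "statistically significant" in a huge sample but not in a small one — and in neither case does the p-value tell you whether half a point matters to anyone:

```python
import math

def two_proportion_pvalue(x1, n1, x2, n2):
    """Two-sided p-value for the difference between two proportions,
    using a pooled z-test (a standard textbook approximation)."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                      # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))  # standard error
    z = (p1 - p2) / se
    return math.erfc(abs(z) / math.sqrt(2))        # two-sided tail probability

# Hypothetical numbers: a 0.5-point difference in two samples of a million...
print(two_proportion_pvalue(500_000, 1_000_000, 505_000, 1_000_000))  # far below 0.05
# ...and the exact same difference in two samples of a hundred.
print(two_proportion_pvalue(50, 100, 51, 100))                        # far above 0.05
```

With enough subjects, almost any difference becomes statistically significant; clinical significance — whether the difference is big enough to care about — is a separate judgment the p-value cannot make.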


4. Put percentages in context

Numbers can be misleading if they’re not put in context.


Here is a simplified version of a hypothetical scenario Dr. Goldin used to explain the concept to me:


Suppose a study compares the risk of colon cancer in vegetarians and meat-eaters. Its conclusion: Eating meat increases the risk of developing colon cancer by 20 percent. That seems significant.


When you look at the actual numbers in this hypothetical study, you see that 5 in 100 vegetarians developed colon cancer, and 6 in 100 meat eaters did. That’s a 20 percent relative increase in risk for meat eaters — but only a one percentage point difference in absolute risk (5 percent vs. 6 percent) between the two groups, which isn’t nearly as striking.
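The arithmetic behind Dr. Goldin's hypothetical is worth making explicit — the same two numbers produce a dramatic relative figure and a modest absolute one:

```python
# Hypothetical rates from the example above.
veg_risk = 5 / 100   # vegetarians who developed colon cancer
meat_risk = 6 / 100  # meat eaters who developed colon cancer

# Relative increase: change expressed as a fraction of the baseline risk.
relative_increase = (meat_risk - veg_risk) / veg_risk   # -> "20 percent higher risk"
# Absolute increase: the plain difference between the two risks.
absolute_increase = meat_risk - veg_risk                # -> one percentage point

print(f"Relative increase: {relative_increase:.0%}")
print(f"Absolute increase: {absolute_increase:.1%} points")
```

A headline built on the relative figure ("20 percent higher risk!") is technically true but hides how small the underlying risks are; reporting both numbers gives readers the context.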


Keep that in mind when writing your story — be wary of percentage increases or decreases in risk and try to put them in the context of the actual risk.


5. Disclose conflicts of interest

Finally, sometimes studies are sponsored by industries that might want a certain outcome in the research. For instance, a pharmaceutical company sponsored this research about whether an ADHD drug could help reduce smoking.


If you find a conflict of interest, it doesn’t mean you shouldn’t report on the study. If you’ve done your due diligence, and it seems like the study is legitimate and newsworthy, you should report on it. But you should definitely disclose the conflict in your story.


Resources

Health News Review’s Toolkit

STATS at Sense About Science USA
