How some journalists got hooked by fish oil and vitamin D spin

November 19, 2018

Results of a much-anticipated trial on fish oil and vitamin D generated conflicting headlines last weekend.

Some stories declared good news about the popular supplements:

Reuters wrote that fish oil “can dramatically reduce the odds of a heart attack while vitamin D’s benefits seem to come from lowering the risk of death from cancer.”

The Washington Post reported fish-oil medications were found “effective” in protecting against heart events while vitamin D was linked to “a decline in cancer deaths.”

HealthDay said fish oil “may lower your heart attack risk.”

But others reported the opposite: Fish oil and vitamin D actually don’t protect against those major diseases. CNN said they “do not prevent cancer or heart disease.” The Associated Press reported “no clear ability to lower heart or cancer risks.” Kaiser Health News called them “no guard against serious heart trouble.”

The New York Times labeled them “ineffective.” NPR said they “Mostly Disappoint.”

On Twitter, cardiologist Eric Topol, MD, jokingly pointed out the “slight” contrast between the two sets of headlines.

What the research showed

So, who got it right?

The stories all covered the Vitamin D and Omega-3 Trial (VITAL), which enrolled 25,871 healthy people 50 and older and randomized them to prescription omega-3 fish oil capsules, vitamin D supplements, both, or placebos only. Participants were followed for an average of 5.3 years.

Two papers were published, one with results for fish oil, the other with results for vitamin D.

Here is the conclusion of the fish oil study:

“Supplementation with n−3 fatty acids did not result in a lower incidence of major cardiovascular events or cancer than placebo.”

Here is the conclusion of the vitamin D study:

“Supplementation with vitamin D did not result in a lower incidence of invasive cancer or cardiovascular events than placebo.”

The statements seem clear: no benefit.

So why did some news organizations proclaim otherwise?


A problematic news release

The answer may lie with a news release issued by Brigham and Women’s Hospital, where the trial was conducted.

The release focused on a handful of secondary findings, which don’t address the primary questions the researchers set out to answer.

For example, researchers observed 28% fewer heart attacks among people taking fish oil. That rose to 40% in people who took fish oil but ate little fish. The apparent effect was largest among African-Americans, who saw a 77% reduction.

Heart attacks were just one component of a composite “cardiovascular events” measure, which also included strokes and deaths from cardiovascular causes. Researchers often combine outcomes into a composite result to make it easier to show a statistically significant difference between a treatment group and controls, facilitating faster trials with fewer patients. See our primer on composite outcomes.
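For readers who want to see why pooling outcomes helps, here is a quick back-of-the-envelope simulation. It is only a sketch: the trial size, event rates, and treatment effect are invented for illustration and are not taken from VITAL.

```python
# Sketch: why trialists combine endpoints into a composite outcome.
# The numbers below (2,000 per arm, 2% vs. 6% event rates, 20% risk
# reduction) are assumptions for illustration, not VITAL's actual data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_per_arm = 2_000          # hypothetical patients in each arm
relative_risk = 0.80       # hypothetical 20% reduction from treatment

def simulated_power(control_rate, n_sims=2_000, alpha=0.05):
    """Fraction of simulated trials where a chi-square test reaches p < alpha."""
    hits = 0
    for _ in range(n_sims):
        control_events = rng.binomial(n_per_arm, control_rate)
        treated_events = rng.binomial(n_per_arm, control_rate * relative_risk)
        # 2x2 table of events vs. non-events in each arm
        table = [[treated_events, n_per_arm - treated_events],
                 [control_events, n_per_arm - control_events]]
        _, p, _, _ = stats.chi2_contingency(table)
        hits += p < alpha
    return hits / n_sims

# A single rare outcome (e.g., heart attack alone) vs. a composite that
# pools several outcomes and therefore accumulates more events.
print("power, single rare outcome (2% control event rate):", simulated_power(0.02))
print("power, composite outcome   (6% control event rate):", simulated_power(0.06))
```

With the same number of patients and the same relative benefit, the composite outcome produces more events and so detects the effect far more often, which is exactly why composites let researchers run faster trials with fewer patients.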

Similarly, researchers also noticed fewer cancer deaths among people who took vitamin D for at least two years.

As we’ve written, such secondary findings need to be reported cautiously. They do not have the same statistical authority as primary findings, and are more likely to be due to chance. At most, secondary findings should be used to help interpret the primary result of a trial or to suggest avenues for further research.
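The “more likely to be due to chance” point is easy to demonstrate. The simulation below assumes a treatment that truly does nothing and a trial that reports one primary endpoint plus 20 secondary or subgroup comparisons; the count of 20 is an assumption for illustration, not the number VITAL reported.

```python
# Sketch: multiplicity. When a treatment has no effect at all, every p-value
# is uniform on [0, 1], so we can draw p-values directly instead of
# simulating raw patient data.
import numpy as np

rng = np.random.default_rng(1)
n_trials = 5_000        # simulated null trials
n_secondary = 20        # assumed number of secondary/subgroup comparisons
alpha = 0.05

false_primary = 0
false_any_secondary = 0
for _ in range(n_trials):
    p_primary = rng.uniform()
    p_secondary = rng.uniform(size=n_secondary)
    false_primary += p_primary < alpha
    false_any_secondary += (p_secondary < alpha).any()

print(f"spurious 'significant' primary result:     {false_primary / n_trials:.2f}")
print(f"at least one spurious secondary 'finding': {false_any_secondary / n_trials:.2f}")
```

The single pre-specified primary endpoint comes up falsely “significant” about 5% of the time, as designed; with 20 extra comparisons, the chance of at least one spurious “finding” climbs to roughly 64%, which is why secondary results carry less weight.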

Yet the news release featured those rosy-sounding secondary findings at the top, with wording that made them sound like proven benefits. The subhead says omega-3 “reduced heart attacks” and that vitamin D “reduced cancer deaths,” echoing the misleading causal language that appeared in some news stories.

Earle Holland, one of our contributors and a retired senior science and medical communications officer at Ohio State University, said news releases need to be “extra careful” about reporting the limitations of secondary findings because “readers will often assume causation where there is none.”

“Building false hopes is fundamentally cruel,” he said.


Primary findings were obscured

The news release downplayed the primary findings, describing them on the third page and not flagging them as the main result.

“You can talk about the secondary endpoints if you want, but not to the point where you obscure that the overall study was negative,” said cardiologist Christopher Labos, MD, one of our reviewers, who cited a “disconnect” between the news release and the study.

Matt Shipman, a public information officer at North Carolina State University and a HealthNewsReview.org contributor, said “it does feel as though the release somewhat buries the lede.”

Apparently, the study’s main finding was buried so deep that some journalists missed it.

Defending the news release

We asked Brigham and Women’s, which received about $40 million from the National Institutes of Health to run the trial, why its news release downplayed the main findings, elevated secondary findings, and didn’t explain the varying levels of evidence.

The hospital responded with a statement defending how the data were presented. It said all of the information in the release is accurate and “supported by the findings.”

Lead researcher JoAnn Manson, MD, the hospital’s chief of preventive medicine, noted the secondary endpoints had been planned before the trial was conducted and were contained in the published research.

“We don’t think this information should be withheld from the public,” Manson said in an email.

Note that no one has said anything about withholding those secondary findings. We’re talking about providing adequate explanations and context.

Some news organizations that led with the primary result also reported the secondary findings, but wisely included caveats.

For example, Kaiser explained how parsing data into smaller chunks for analysis can produce “unreliable” results, and that “links between fish oil and heart attacks — and vitamin D and cancer death — could be due to chance.”


Other ways reporting went wrong

Such context was missing in the stories from Reuters, The Post and HealthDay.

Besides appearing to rely heavily on the news release, those news outlets didn’t include much input from independent sources. HealthDay quoted one, who hyped the secondary finding on heart attacks as “profound.” Reuters and The Post quoted none.

The Post and HealthDay further confused things by covering a different trial, REDUCE-IT, in the same story. REDUCE-IT showed a benefit for a specific population of people with heart disease who took a fish-oil-derived drug called Vascepa (icosapent ethyl).

Both VITAL and REDUCE-IT involved fish oil, were presented on the same day at an American Heart Association meeting, were conducted at Brigham and Women’s, and were published simultaneously in the New England Journal of Medicine, making it easy for readers to conflate their very different results.

Findings “didn’t need to be spun”

We’ve often noted that negative or null results have difficulty attracting news coverage.

The sweeping coverage of VITAL shows it can be otherwise. Its results challenge common beliefs, which is the essence of a good news story.

Yet Brigham and Women’s shrouded those taxpayer-funded insights in dubious, positive-sounding secondary findings, and some news organizations were hooked into going along.

“This may be a function of the bias in academic research against negative results, which is unfortunate,” Shipman said. “Negative results can have tremendous value. In my opinion, the negative results are the larger story in this case.”

Martha Gulati, MD, chief of cardiology at the University of Arizona, agreed the findings “didn’t need to be spun,” and will help people make more informed choices.

“As a health consumer myself, I wouldn’t want to be spending my hard-earned money on something that doesn’t give a benefit,” she said.
