Researchers found a lot to be dissatisfied with in a review of nearly 2,000 stories about “new medical treatments, tests, products, and procedures.” Most stories were “unsatisfactory on 5 of 10 review criteria: costs, benefits, harms, quality of the evidence, and comparison of the new approach with alternatives,” Gary Schwitzer writes in a report published by JAMA Internal Medicine.
Some of the problems researchers from HealthNewsReview.org found in the study, which examined reports in print, Web and broadcast media:
- Stories “often framed benefits in the most positive light”
It’s important to report on absolute risk, not just relative risk, the study warns. Here’s a guide to understanding the difference.
- Reports rarely explain the limitations of observational studies
Lots of news outlets reported on a Mayo Clinic study published last summer about the effects of coffee on mortality, and “Each story used language suggesting cause and effect had been established, although it had not,” Schwitzer writes. (“Heavy coffee consumption linked to higher death risk,” USA Today wrote, and it was far from alone).
The research “reported a ‘positive (statistical) association,’” Schwitzer wrote last year. “That’s not causation.”
- Stories based on press releases or one interview
Eight percent of the stories studied “apparently relied solely or largely on news releases as the source of information.” Another problem the study identified: coverage of new technology is often “fawning.”
- Stories “often provide cheerleading for local researchers and businesses”
A Los Angeles Times story about a drug intended to ease pain from menstrual cramps “provided no data but quoted a company vice president, the only person quoted, who said that the drug could be a ‘breakthrough,’” the study says. Journalists, it adds, “should be more skeptical of what they are told by representatives of the health care industry.”
Related: Why journalists drive scientists crazy, in graphs | The 10 biggest science-reporting mistakes (and how to avoid them)