Fact-checking has become one of the buzziest buzzwords in journalism. More than 100 fact-checking projects operate around the world. The practice has been trotted out to counter alleged “fake” news and to monitor the accuracy of political leaders who stretch the boundaries of believability. April 2 has even been proclaimed International Fact-Checking Day.
Indeed, genuine fact-checking may play an important role in political news coverage. FactCheck.org has been working in this space for 15 years. The current PolitiFact/Kaiser Health News project to “truth-squad” claims in the run-up to the 2020 elections is another noteworthy example. However, partisan criticism of alleged partisan fact-checking has sprung up all over the web. Just do a search on the term “partisan fact-checking” to start your head spinning.
But the genre faces steeper challenges when applied to health care news – which is often based on the results of studies published in medical or scientific journals.
Factual but unhelpful and misleading
When communicating about biomedical research, you can be 100 percent factually correct while being 100 percent unhelpful to your audience. I offer examples later. In this arena, fact-checking by itself is often too limited in scope to be useful for general news consumers and health care consumers. Fact-checking alone often fails to capture the questions of nuance and context that arise – or should arise – whenever medical evidence is evaluated.
Fourteen years ago, John Ioannidis, MD, wrote the important paper, “Why Most Published Research Findings Are False.” More recently, he wrote about what he called evidence-based hearsay:
Evidence is possible to subvert at all stages of its generation and dissemination, while fierce marketing, rumors, and beliefs become more powerful than data.
The work of researchers like Ioannidis reminds us how murky the claims about evidence, data and facts can be in the current science communication environment.
David vs. Goliath: one-man-band website vs. Facebook
Technology giants like Facebook and Google have promoted their fact-checking projects. But the devil is in the details. Last month, the Columbia Journalism Review published a piece headlined: “Facebook’s fact-checking program falls short.”
And Facebook’s health-related fact-checking got another black eye last week when the Popular Information website published, “Facebook giving massive distribution to dangerous misinformation about diabetes.” Excerpt:
Facebook is giving a page featuring incendiary right-wing memes and dangerous misinformation about diabetes massive distribution — reach that exceeds some of the nation’s largest news outlets.
The Rowdy Republican page, which has over 780,000 followers, is run by an affiliate marketer with a history of legal problems and deceptive practices. He is seeking to drive people to a site about “The Big Diabetes Lie,” which tries to convince people to purchase a $55 paperback book.
You need to read the article in its entirety in order to evaluate the depth of the concerns it raises.
The “diabetes lie” website uses familiar language: “scientifically proven … achieve the impossible … dramatic results.” One excerpt:
It doesn’t matter if you follow your doctors (sic) recommendations and dosages exactly as prescribed. This isn’t a question of IF, but WHEN. Your health will get worse. The drugs you take will fail. The insulin injections you take will also fail.
But Popular Information followed up, reporting:
Less than 24 hours later, all the links to the diabetes scam have been removed from the Rowdy Republican page. Facebook is finally enforcing its own rules.
Questions remain, such as why it took the efforts of a one-man-band website to reveal this ugly episode. And that one-man-band website lists other unanswered questions:
1. Why is a company with vast resources not able to do a better job of rooting out obvious scams and misinformation? This was not a case of a single random link. This was a systematic campaign, over many months, by a page with an enormous reach to push dangerous health misinformation to Facebook users.
2. Why is The Daily Caller, a right-wing site with a history of false reports, still an official Facebook fact-checking partner? The Daily Caller reviewed a post from Rowdy Republican that included the diabetes scam and rated the post “true.” The Daily Caller cannot be part of a legitimate effort to root out disinformation. It is actively making things worse.
3. Will Facebook commit itself to transparency and accountability? Despite taking action after the Popular Information report, Facebook has not responded to multiple inquiries about the Rowdy Republican page. Facebook says that it wants to earn the trust of the public, but it’s hard to take that effort seriously if the company doesn’t explain how it enforces its rules.
Nice try, but…
Some efforts to help consumers identify reliable health care news are presumably well-intentioned. But some have clear shortcomings.
Healthline says its stories are “fact checked by our panel of experts.” But earlier on HealthNewsReview.org, we blogged about one example – “When ‘fact-checked’ health news doesn’t tell the whole story” – that demonstrates how meaningless and useless such alleged health care news fact-checking can be. The Healthline story was about the promise of a new drug against a form of multiple sclerosis. An excerpt of our review:
The story didn’t include key details like side effects, and it used a quote lifted directly from the drug company news release, among other problems. These red flags raise an important question: Who is controlling the “facts” on this story? Healthline’s fact-checkers or the drug company that funded the study?
On a broader scale, NewsGuard – a plugin that calls itself “the Internet trust tool” – gives green checkmarks to websites that are “trying to do legitimate journalism.” Its criteria include not repeatedly publishing false content and not publishing deceptive headlines.
Recently, NewsGuard looked at several news organizations that blatantly misled readers about a study that showed a statistical association between consuming soft drinks and premature death. Not cause-and-effect, but a statistical association.
The stories’ lead paragraphs disregarded the important caveat that the study couldn’t prove cause and effect. Reuters reported that soft drinks “may raise the risk of premature death.” CNN said it was time to “consider ditching your favorite soda.” Even drinkers of diet soda should “beware,” warned the Atlanta Journal-Constitution.
But NewsGuard gave all of these news organizations its green checkmark of legitimacy.
These news organizations may be “trying to get it right” in the verbiage of NewsGuard’s rating system, but their hyperbolic coverage failed miserably with this story, raising questions about the value of the green check of legitimacy.
Laudably, the New York Times followed up with a story that pointed out the soda study’s limitations, but consumers can’t rely on a helpful post-hoc critique of every health care claim that gets reported.
What HealthNewsReview did that no one else is doing
When HealthNewsReview.org lost its funding at the end of 2018, some people speculated that new fact-checking projects would fill the void. But that void hasn’t been filled because HealthNewsReview.org was far more than a fact-checking project. Our team applied 10 standardized criteria to the review of 3,216 news stories and public relations news releases that included claims about health care interventions. We employed reviewers who signed statements that they did not have financial conflicts of interest in the health care industry.
The review criteria focused on the misleading elements of much health care reporting that may be missed in a purely fact-checking approach. Our reviewers helped readers understand the use of statistics that, while not factually incorrect, frame research results in the most favorable light possible – in ways that mislead and arguably misinform readers. The reviews helped consumers evaluate the quality of evidence.
So, while it may be factually correct to report on the associations that researchers report in observational studies, we taught readers that it was a woefully incomplete message if it didn’t include the limitations of observational research – and simply wrong if it made cause-and-effect statements about what was merely a statistical association. A news story or news release may be judged to be factually correct when describing the results of a study in which the outcomes were surrogate endpoints or markers. But we educated consumers about the limitations of such surrogates and always urged discussions about what those findings may not mean.
An analysis based only on “Are the facts correct?” may make a message look more informative than it actually is.
The approach we took with our systematic reviews of media messages – always using the same 10 standardized criteria – and reviewing only leading mass media news organizations – has not been duplicated by anyone else in the U.S. at this time.
The Poynter Institute, a journalism training center, pointed to Metafact as one possible “solution to health misinformation.” The Metafact solution is “to empower anyone to directly ask experts to verify a claim they have read and for a consensus score to quickly aggregate and spread, allowing people to make better judgments on questions important in their lives.” One recent example demonstrates the huge difference between what Metafact is doing and what HealthNewsReview.org did every day for 13 years.
“Does CBD (cannabidiol oil) help with anxiety?” was the title of a recent Metafact entry. Five experts responded with their opinions. No systematic criteria were used. All five simply responded with whatever opinion came to mind. Two of the five acknowledged clear financial conflicts of interest; one worked for a commercial cannabinoid-based drug company and one worked for the International Cannabis and Cannabinoids Institute, which serves clients engaged in cannabis commerce. Two of the five open-ended responses were 25 words or fewer.
This approach may help inform some readers. But I’m not a fan of publishing opinions from conflicted sources and then touting, as Metafact does, “We don’t take sides – ever. The only thing we don’t compromise are the facts.” I don’t think this approach ensures that readers get the facts. So, in this form, I don’t see it as a solution to health misinformation.
You’re entitled to your own opinions but not your own facts
I’ve collected the following quotes, each of which captures some aspect of the limitations of fact-checking.
“Facts are stubborn things, but statistics are pliable.” – Mark Twain
“The truth is more important than the facts.” – Frank Lloyd Wright
“Facts may be colored by the personalities of the people who present them.” – Reginald Rose, playwright of Twelve Angry Men.
Fact-checking projects may be colored by the people who publish them. Indeed, such efforts come in many forms with varying degrees of quality and usefulness. Careful consumers will need to hone their critical thinking and analytical skills in order to glean the most from any fact-checking project.
As the Facebook example above demonstrates, consumers can’t rely on technology behemoths to police health care claims. In this crazy environment, even fact-checkers need to be fact-checked.
Health care news covers an industry fraught with conflicts of interest and financial pressures that may influence the integrity of research and clinical care, so patients and consumers need to be especially wary of what they read. That includes what they read in health care fact-checking articles. If the fact-checkers don’t disclose who does the fact-checking, how they do it, and whether any potential conflicts of interest are involved, look elsewhere.