November 13, 2023

If politicians utter dubious statements on the campaign trail and fact-checkers are around to hear them, will those fact-checkers rate the statements with matching skepticism?

If the checkers are Snopes and PolitiFact, and they are examining the same claim, the answer is nearly always yes, Penn State University researchers found. That consistency helps build public trust in fact-checking and fact-checkers, the researchers said.

“‘Fact-checking’ fact checkers: A data-driven approach,” a 22-page research article published in October in the Harvard Kennedy School Misinformation Review, examined the practices of U.S. fact-checking organizations Snopes, PolitiFact and Logically, along with The Australian Associated Press.

Sian Lee, Aiping Xiong, Harseung Seo and Dongwon Lee of Penn State University’s College of Information Sciences and Technology did the peer-reviewed research.

The Penn State researchers found that U.S. fact-checking spikes during major news events; in recent years, those spikes came during the COVID-19 pandemic and the 2020 presidential election. Misinformation's spread, the researchers added, can mislead and harm people and society.

The researchers examined 11,639 fact-checking articles from Snopes and 10,710 from PolitiFact published from Jan. 1, 2016, to Aug. 31, 2022. They found Snopes checked a larger share of “real claims” (claims rated true or mostly true): 28.7% of its checks, versus 11% for PolitiFact.

Looking more broadly, the researchers found high agreement when Snopes and PolitiFact probed the same information. Of 749 matching claims, 521 (69.6%) received identical ratings and 228 (30.4%) diverged. But the researchers found that nuances explained nearly all of those divergent verdicts: the granularity of the ratings (Snopes' and PolitiFact's scales differ slightly), differences in focus, differences in the information being fact-checked and differences in the timing of the fact-checks.

Adjusting for these systematic discrepancies, Penn State’s researchers found just one conflicting rating among the 749 matching claims.
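
How can two outlets that use different rating labels “agree” at all? Below is a minimal, hypothetical sketch in Python of the kind of normalization such a comparison requires: each outlet's labels are mapped onto a shared truthfulness scale before matched claims are compared. The label sets and numeric mapping here are illustrative assumptions, not the researchers' actual coding scheme.

```python
# Illustrative sketch only -- not the researchers' code. Each outlet's rating
# labels are mapped onto a shared 0-4 truthfulness scale so that matched
# claims can be compared even though the outlets use different label sets.
SNOPES_SCALE = {
    "False": 0, "Mostly False": 1, "Mixture": 2, "Mostly True": 3, "True": 4,
}
POLITIFACT_SCALE = {
    "Pants on Fire": 0, "False": 0, "Mostly False": 1,
    "Half True": 2, "Mostly True": 3, "True": 4,
}

def agreement_rate(matched_claims):
    """matched_claims: (snopes_label, politifact_label) pairs for fact-checks
    judged to cover the same underlying claim."""
    identical = sum(
        1 for snopes, politifact in matched_claims
        if SNOPES_SCALE[snopes] == POLITIFACT_SCALE[politifact]
    )
    return identical / len(matched_claims)

# Toy example: two of three matched claims land on the same point of the scale.
pairs = [
    ("Mostly True", "Mostly True"),
    ("False", "Pants on Fire"),
    ("Mostly True", "Mostly False"),
]
print(f"Agreement: {agreement_rate(pairs):.1%}")  # Agreement: 66.7%
```

Exactly where the labels line up is itself a judgment call, which is why the researchers' adjustments for rating granularity matter to the headline agreement figure.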

The single diverging decision for Snopes and PolitiFact centered on whether 2016 Republican presidential candidate Ben Carson said, “Anyone caught involved in voter fraud should be immediately deported and have his citizenship revoked.”

The researchers said Snopes interpreted “anyone” to mean “illegal immigrants,” rating the claim as Mostly True. PolitiFact, by contrast, concluded that “anyone” could encompass any American, and rated the claim Mostly False.

That Snopes and PolitiFact agree so consistently is notable, the researchers suggested, because fact-checking is complex and multifaceted, involving numerous variables, and because checkers select and verify claims in unique ways. (All true.)

“When multiple fact-checking organizations consistently agree on the accuracy of a statement, the public is more likely to trust their assessments,” the researchers wrote.

The high agreement suggests Snopes and PolitiFact have established consistent and reliable fact-checking practices, they said, which “enhances the credibility of fact checkers in the eyes of the public.”

“The consistency of fact-checking among major organizations is crucial for mitigating misinformation online, especially as the evaluations of these organizations are increasingly being used by social media outlets,” the researchers wrote, later adding, “Therefore, the findings of this study can inform and improve the fact-checking practices of social media platforms, ultimately contributing to the promotion of truth and the prevention of the spread of misinformation on social media.”

Matthew Crowley is PolitiFact’s copy chief. His journalism career spans more than three decades and includes stints at newspapers in Nevada, Arizona and New York…

Comments

  • The article should also disclose that the author works for PolitiFact.

  • “521 received identical ratings”

    Sorry, have to call BS on this one.

    When Chloe Lim published her paper looking for fact-checker agreement (and finding a low level of agreement), she was criticized for trying to directly compare the ratings for the Washington Post Fact Checker with those of PolitiFact. The problem? The two organizations do not use the same rating system. The same appears true of Snopes and PolitiFact. Any article trumpeting the similarity of ratings should address that problem. The failure to mention it in this article counts as a whopping hole in the story.