Milwaukee Magazine's Erick Gunn ruminates on the inconsistency of PolitiFact and other fact-checking organizations. His point of departure is the apparent inconsistency between its ratings of the claim that Paul Ryan's "Roadmap" would end Medicare ("pants on fire," said PolitiFact) and Tommy Thompson's claim to have ended welfare (rated "true").
I am sympathetic. "PolitiFact" was developed by the St. Petersburg Times and is operated in partnership with other media outlets, including the Milwaukee Journal Sentinel. Its innovation - if that's what it is - is a "Truth-O-Meter" resembling what the Weekly Standard's Mark Hemingway calls an "old school instrument gauge," complete with red, yellow and green lights as well as flames for those statements rated "pants on fire."
I think it's crazy to say that Ryan's plan ends Medicare (the ad showing a Ryan-like figure pushing grandma off a cliff was reprehensible misrepresentation on a par with the worst of our political advertising) and an overstatement to say that welfare reform "ended" welfare. So while I don't agree that the two statements are equally accurate, I do think that the characterizations of them represented by PolitiFact's cute little graphics are further apart than they ought to be.
Gunn refers to Hemingway's recent fisking of "fact checking" in the Weekly Standard. I just got my copy on dead tree and read it. Hemingway makes a convincing case that media fact checking succumbs to the lack of ideological diversity in the traditional media and an unwillingness to treat as opinions those things that are opinion, or to allow for the complexity of issues and the limitations of human language and everyday discourse. Its simplistic set of conclusions - reflected in PolitiFact as "pants on fire," etc. - itself requires distortion.
I have been concerned about this for a while but was moved to write by PolitiFact's contorted treatment of a statement by Media Trackers that was clearly "true" as "false."
Hemingway cites a University of Minnesota study that provides overwhelming evidence that PolitiFact has a pro-Democratic bias. My impression is that the local operation is more even-handed, but my problem is independent of any claim of bias. It's one thing to vet a statement and let readers know about its strengths and weaknesses. It's quite another to reduce that analysis to a simple set of conclusions.
The former might be useful. The latter is not. Doing the former does not require the PolitiFact brand. The Journal Sentinel - assuming it has the reporters - could avoid paying whatever skim it owes to the folks in St. Petersburg.
I understand that railing against the simple mindedness of PolitiFact is like holding back the wind. We like simple little graphics. But I think the problem is deeper than that.
While the customary thing is to rail against the "unwashed" who supposedly want simple answers and others to think for them, I think that PolitiFact is most misused by opinion leaders who are fully aware of its limitations. It is a source of "gotchas" that is (back to bias) distorted by the ideological proclivities of the journalists who write it - a distortion reflected not only in the analysis of particular questions but in the selection of which questions to analyze.
I'd prefer the paper limit itself to straight reporting. As Hemingway points out, that wouldn't solve the problems with "fact checking," but it would make them more manageable.
As for PolitiFact's rating system, I rate it "Useless."