In other words, the cute little emoticon (an obfuscatory bit of "entertainment" fluff which I desperately wish the paper would get rid of) is pretty much backward.
Neither report attempts to measure whether students who move to a choice school improve relative to comparable children who remain in a public school.
(In fact, at least for PPF, there would be no way to do so with the data available to them.)
We could go on. It is nonsense, for example, to compare the test scores of public and voucher students in Racine in 2014 because the choice program had only just begun in Racine and the voucher students had been in their new schools for only a few weeks.
Some folks argue, moreover, that DPI treats students who have opted out of tests in a way that disfavors voucher schools. There are other issues as well.
I would add that it is meaningless to treat voucher schools as a group since they are not subject to common control and differ dramatically from one another. If, in fact, well-run choice schools improve student performance then it would also be wrong to say that there is “no evidence” the program works.
In fact, a quick look at the 2013 PPF report reveals that a majority of choice schools appear to have matched or exceeded the MPS average for at least one subject during the years in question and that a substantial majority of Lutheran or Catholic schools did so. The 2014 results, while not quite as clear, are comparable. Perhaps certain types of choice schools do improve student learning.
So the numbers that Politifact says give public school students a “clear edge” do no such thing.
And then there’s the University of Arkansas School Choice Demonstration Project study – the only real study of whether the choice program “improved student learning.” The SCDP study found that choice students were significantly more likely to graduate and go on to college. The study found that voucher students showed a greater increase in reading scores than MPS students. (That this happened in the last year of the study and the first year in which voucher schools had to, like MPS, publicize their scores doesn’t change that.) There were some other advantages for voucher students, but these were not statistically significant, i.e., the researchers could not say that they were not a product of chance.
You can dismiss the statistically significant improvements as “too small,” although they are at least as robust as the evidence for a variety of educational nostrums that are commonly touted. And these gains are accomplished at a bit over half the cost of a public school education. You can quibble over what caused the measured improvements. But the fact remains that they are “evidence” of improvement that Mary Burke falsely said does not exist.
How Burke’s statement can possibly be called “mostly true” is beyond me. The Politifact author regards the evidence for improvement as "thin." I'm not sure that statistically significant findings of improvement in the only properly designed study that exists can be so readily dismissed. But let's grant him this. Concerns about the extent of the improvements or what really caused them are worth discussing.
But she said there was no evidence. That is false. It cannot be mostly true. This Politifact item was not fact checking. It was an imposition of opinion.