When it comes to accountability, voucher schools aren’t special

Last week, I went after an actual columnist at the Milwaukee Journal Sentinel, so this week it may not seem fair that I go after a letter to the editor.

But the author of the letter is Joy Pullmann, an education research fellow at the conservative Heartland Institute in Chicago, rather than one of the random cranks usually populating the letters column (disclosure: I have sometimes been one of those cranks). The Heartland Institute, despite its homey name, is a “free-market think tank,” which is code for conservative; you might remember them from “Operation Angry Badger” during the 2012 recall of Wisconsin Gov. Scott Walker. They have a long history of fighting anti-smoking regulations, claiming that climate change is not real, opposing Obamacare and – importantly to our purpose here – promoting school vouchers over public education.

Pullmann popped into our local letters page because of the current Wisconsin legislative debate over a "school accountability" bill.

She opens her letter by drawing a false analogy between school vouchers and college scholarships, then suggests that since the state "does not oversee Marquette [University], and no one complains about that," there is no need for state supervision of voucher schools. It's not true, though, that the state has no oversight of Marquette. In fact, the university holds state accreditation in a number of fields and must follow all kinds of state laws and regulations. But that's a minor quibble compared to what comes later.

One fairly consistent theme in the debate over school accountability is that voucher schools in the state should be required to give the same tests, and report the same data about their voucher students, that public schools do. This would allow – to use the clichéd phrase that came up over and over in testimony last week – an apples-to-apples comparison.

This makes voucher advocates nervous. It should. As I’ve written here before, voucher schools in Milwaukee (where we have a history of data, as opposed to the new statewide program) do not, by and large, do better than Milwaukee Public Schools on the state test. Matt Kussow of the Wisconsin Council of Religious and Independent Schools (WCRIS) – an organization that supports and represents voucher schools – argued strenuously against the most recent version of the bill last week, for example, on the grounds that any measure limited to a school’s voucher students doesn’t give a complete picture of the whole school.

Pullmann also doesn’t want the apples-to-apples comparison, though for a different reason. Here’s what she wrote that attracted my attention:

“[M]aking the exact same demands of private voucher schools that the state makes of public schools is counterproductive, when a main reason for vouchers in the first place is to provide something different from public schools. The point is not to destroy education diversity, but to cultivate it, because different kinds of schools suit different kinds of families and children.”

While I am loath to agree with Pullmann, she has a point: Schools are different! Kids are not widgets stamped out of the same mold! And the students attending schools in Milwaukee are decidedly different from those attending schools in Brookfield or Middleton or Lake Geneva or on the Menominee Indian Reservation.

But can you imagine if, for example, MPS superintendent Gregory Thornton made that argument in front of the legislature? Please, he might say, don’t judge my schools the same way you judge suburban schools. Our families are different and are seeking different kinds of school experiences. Even within MPS, you could see the argument being made. The families choosing high-performing schools are seeking different experiences than those at the low-performing schools, so you can’t say one is worse than the other! The outrage would be so palpable you could shovel it like this winter’s snow.

In testimony before the assembly education committee last week, WCRIS’s Kussow complained that holding voucher schools to the state test didn’t make sense. He said, “You’re mandating that we use a test that has zero to very little value in our classroom.”

Shouldn’t Dr. Thornton be able to make the same argument? The high school test starting next school year, for example, is the ACT. But what about students who have no interest in going to college? Is it fair to hold them accountable – sorry, hold their teachers accountable – for a test that has zero to very little value in their classrooms?

Look at the highest-performing schools in MPS, and even they have students who clearly aren’t going to college and clearly aren’t doing well on the ACT. Reagan, arguably one of the state’s best schools, had only a quarter of its students meet the ACT math readiness benchmark on the most recent report card, for example. So don’t be surprised that a school with few college-bound students, like Bradley Tech – by definition (and name) a trade school whose student body may have no interest in college – had just a single student (0.7% of its graduates) meet that math readiness standard.

Let’s be honest for a second, though: The “school accountability” bills under discussion already have some ways built in to account for differences among schools, through a “value-added” formula developed by the University of Wisconsin’s Value-Added Research Center. This formula, according to UWVARC’s Bradley Carl, who testified last week before the assembly’s education committee, accounts for students’ prior achievement and for demographic characteristics that might affect achievement when evaluating new scores. In other words, value-added is supposed to control for every factor other than the quality of the school and its teachers when reporting a student’s – or a whole school’s – achievement level.
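To make the mechanics concrete, here is a minimal sketch of the value-added idea as a plain least-squares regression with invented numbers. The data, the variable names, and the bare-bones model are my own illustrative assumptions; the actual UWVARC formula is far more sophisticated and is not reproduced here.

```python
# A minimal sketch of the value-added idea: predict this year's score from
# prior achievement and a demographic control, then credit (or charge) each
# school with whatever the prediction cannot explain. All numbers are invented.
import numpy as np

# Hypothetical students at two schools (school A = 0, school B = 1).
school        = np.array([0, 0, 0, 0, 1, 1, 1, 1])
prior_score   = np.array([420, 480, 390, 510, 430, 470, 400, 500], float)
low_income    = np.array([1,   0,   1,   0,   1,   0,   1,   0  ], float)
current_score = np.array([435, 500, 402, 525, 455, 515, 425, 540], float)

# Ordinary least squares: current_score ~ intercept + prior_score + low_income.
X = np.column_stack([np.ones(len(prior_score)), prior_score, low_income])
coef, *_ = np.linalg.lstsq(X, current_score, rcond=None)
residual = current_score - X @ coef

# The average leftover gain per school is the "value added" in this toy model.
for s, name in [(0, "School A"), (1, "School B")]:
    print(f"{name} value-added: {residual[school == s].mean():+.1f} points")
```

The leftover gain the controls can’t explain is what gets attributed to the school and its teachers – which is exactly why the method’s credibility depends on those controls capturing everything that isn’t the school.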

However, critics of education reform – and even some scholars – have been criticizing value-added for some time. (You might recall the New York City middle school teacher whose students scored better than high school students, but who was deemed the worst eighth-grade math teacher in the city by a value-added formula.)

The current issue of the journal Education Next carries a strong critique of value-added methods now in use, including those like UWVARC’s, because they still show an effect of outside factors on school achievement (that is, the poorer a school’s students, the worse the school’s achievement appears). The authors instead recommend going a step further: completely eliminating the effects of socioeconomic status and other demographic features, to produce a more honest and accurate measure of a school’s impact on student achievement.

The way they recommend doing it? Comparing like schools, rather than unlike schools – something akin to what Joy Pullmann might be asking for. Using a large sample of schools in Missouri, the authors found that by controlling not just for student characteristics but also for school characteristics, it was much easier to see which schools were really succeeding at educating the students they have. Sure, poor schools had lower achievement than wealthy schools, but differences as large or larger were visible among just the poor schools and among just the wealthy schools.
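Purely for illustration – and again with made-up data rather than anything from the Education Next study or the Missouri sample – here is what adding a school-level control looks like in the same kind of sketch. With a school’s poverty rate in the regression, each school’s leftover “effect” is effectively measured against demographically similar schools.

```python
# A sketch of the "compare like schools" adjustment with simulated data:
# the regression controls for a school-level poverty rate in addition to
# prior achievement, so the ranking reflects how a school does relative to
# schools serving similar students. Everything here is invented.
import numpy as np

rng = np.random.default_rng(0)
n_schools, per_school = 20, 50

poverty = rng.uniform(0.1, 0.9, n_schools)        # share of low-income students
true_effect = rng.normal(0, 5, n_schools)          # each school's "real" contribution
school = np.repeat(np.arange(n_schools), per_school)
prior = rng.normal(450, 40, n_schools * per_school)
current = (prior + 20 - 30 * poverty[school]       # poverty drags scores down
           + true_effect[school] + rng.normal(0, 10, n_schools * per_school))

# Regress current scores on prior scores AND school poverty; average the
# unexplained remainder by school to get an adjusted school effect.
X = np.column_stack([np.ones_like(prior), prior, poverty[school]])
coef, *_ = np.linalg.lstsq(X, current, rcond=None)
residual = current - X @ coef
school_effect = np.array([residual[school == s].mean() for s in range(n_schools)])

# High-poverty and low-poverty schools can both land near the top of this list.
ranking = np.argsort(school_effect)[::-1]
print("Top 3 schools:", ranking[:3], "poverty rates:", poverty[ranking[:3]].round(2))
```

The point of the extra control is simply that a high-poverty school can rise to the top of the adjusted list if it outperforms other high-poverty schools, which is the kind of comparison the Education Next authors argue is more honest.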

The problem is there is no way the current legislature, if it even manages to put together and pass some version of the school accountability bill this session, is going to allow anything like my imaginary begging from Dr. Thornton. Public schools are public schools, the legislators said over and over at the hearing last week.

Pullmann’s special pleading for voucher schools isn’t persuasive in the least when no such leeway could ever possibly exist for the public schools. MPS, which will bear the brunt of the consequences meted out in these proposed “accountability” bills, has even been singled out for stricter punishments in some versions of the bill.

Why, then, should publicly funded voucher schools get a pass? Sorry, Joy Pullmann of the Heartland Institute, they should not.
