Yesterday, I broke down the numbers: Milwaukee Public Schools students who change schools within a year before they take the state’s standardized test (WKCE) perform worse on the test than students who don’t.
Today I want to game out some of the practical implications of that. If we know that there’s a group of MPS students, up to a quarter of them, whose achievement is not well-tracked but is awful compared to the rest of MPS, what does this mean?
First, you may want to try to figure out where these students are coming from. The answer: mostly from MPS itself, until middle and high school, where they mostly come from outside of MPS.
When you compare data from DPI and MPS, you can gauge how many students are actually coming in from outside of the district by comparing the district’s enrollment data to the FAY results released by DPI for the district as a whole. A Full Academic Year (FAY) student, recall, is defined as a student “continuously enrolled in a school or district” for a year before the WKCE is taken (my emphasis). So on a district-wide basis, only those students who were outside of MPS for all or part of the year before the test count as non-FAY.
Comparing FAY to enrollment, MPS non-FAY enrollment varies by grade, starting at about 10% in the low grades, rising into middle school (13% in 8th grade) and peaking in high school, where nearly 17% of 10th graders were enrolled outside of the district for all or part of the year before they tested.
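In code, that comparison amounts to one division per grade. Here is a minimal sketch; the enrollment and FAY counts are made-up placeholders chosen only to reproduce the percentages above, not actual DPI figures:

```python
# Gauge the district-wide non-FAY share by comparing total enrollment
# to the district FAY count released by DPI.
# NOTE: these counts are hypothetical, picked to match the article's rates.
enrollment = {"4th": 5200, "8th": 4800, "10th": 4500}  # grade enrollment
fay_count  = {"4th": 4680, "8th": 4176, "10th": 3735}  # district FAY counts

for grade in enrollment:
    non_fay_rate = 1 - fay_count[grade] / enrollment[grade]
    print(f"{grade} grade: {non_fay_rate:.0%} non-FAY for the district")
```

With real DPI spreadsheets in hand, the same one-liner gives the actual district non-FAY rate for any tested grade.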
In the Venn diagram that is FAY and non-FAY, all the students who are non-FAY for the district lie within the larger circle of students who are non-FAY for their schools – anyone who changed districts necessarily changed schools, too. If 10% of 4th graders are non-FAY for the district, and 25% are non-FAY for the school, that means 15% of those 4th graders were in MPS the year before the 4th grade test, just at a different school. Put another way, 60% of an average MPS school’s non-FAY 4th-graders were at some other MPS school for all or part of 3rd grade, and 40% were outside of MPS.
Running those numbers for 8th and 10th grade, I found the opposite ratio – only 35% of an average middle or high school’s non-FAY students were at a different MPS school for all or part of the previous year, while 65% came from somewhere outside of the district.
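The decomposition in the last two paragraphs is simple subtraction: the school-level non-FAY share minus the district-level non-FAY share gives the movers who stayed within MPS. A quick sketch (the helper `split_non_fay` is my own illustrative function, and the 8th-grade school-level share of 20% is an assumed value consistent with the 35/65 split above, not a figure pulled from DPI data):

```python
def split_non_fay(district_non_fay, school_non_fay):
    """Decompose a school-level non-FAY share into movers from within
    MPS (non-FAY for the school but FAY for the district) and movers
    from outside the district (non-FAY for both)."""
    within_mps = school_non_fay - district_non_fay
    return within_mps / school_non_fay, district_non_fay / school_non_fay

# 4th grade: 10% non-FAY for the district, 25% non-FAY for the school
within, outside = split_non_fay(0.10, 0.25)
print(f"4th grade movers: {within:.0%} within MPS, {outside:.0%} from outside")

# 8th grade: 13% non-FAY for the district; 20% school-level share assumed
within8, outside8 = split_non_fay(0.13, 0.20)
print(f"8th grade movers: {within8:.0%} within MPS, {outside8:.0%} from outside")
```

The 4th-grade call reproduces the 60/40 split above, and the 8th-grade call the 35/65 flip.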
This is in the aggregate, so it won’t necessarily hold true for any one school. Besides, like the position of electrons in an atom, there is no good way to tell where any one non-FAY student came from, because the specific data on students is hard to come by. But the safe bet is that, for those 4th-graders, the ones who didn’t come from another MPS school came from the Milwaukee Parental Choice (or “voucher”) Program or from city charters not affiliated with MPS.
Anneliese Dickman, of the Public Policy Forum, explained that until recently, the voucher application for students asked about where the students had been enrolled before they applied for the voucher. And even though there wasn’t equivalent exit data, PPF could still make reasonable inferences by knowing where students were coming from. If, for example, the applications showed 2,000 new students to MPCP, but total enrollment in the program only increased by a few hundred, that meant there was a lot of churn back into the public schools – something PPF’s voucher program reports noted year after year after year.
Recent legislative changes to the voucher program eliminated that question on the application. “All we can see now is aggregate growth, not the churn,” Dickman explained, “but we assume the pattern didn’t dramatically change in subsequent years. Most of the district non-FAY is likely from vouchers and non-MPS charters.”
Phillip Cranley of the Wisconsin DPI, too, told me there wasn’t enough information to be sure, but “it makes sense to me that [the voucher program] would increase mobility rates even further, but I cannot say with certainty that it does.”
In 2010, the School Choice Demonstration Project tried to put a little more academic structure to this very question (pdf). They were looking at matched samples of students from both MPS and the MPCP, and reported about the 2008-2009 school year, which in Wisconsin edupolitics was, like, eleventy hundred years ago.
But here’s what they found: Students in the voucher program were less likely to stay in the same school from year to year, even counting school changes that come with promotion or graduation (56% of MPS students stayed, versus 44% of MPCP students). At the time, the SCDP had lost track of about 25% of its voucher-student sample, so caution is warranted about these numbers.
But the SCDP also confirmed what Dickman suggested – the bulk of students entering MPS were coming from voucher schools. “[W]ithin-sector school mobility is considerably more prevalent in MPS,” they write (meaning that moving MPS students tend to switch to other MPS schools), and add, importantly, “students are more likely to move from MPCP to MPS than they are to move from MPS to MPCP.”
Further, the SCDP researchers noticed that the students who do change schools – the ones we identify as non-FAY for achievement-testing purposes – had lower scores than their counterparts. They were unequivocal, even more so than I was in yesterday’s post looking at sampled MPS data: school “switchers always have lower average scores,” they write.
This is as true before the change in school as it is after (remember, I looked at the after data – how students scored when they were labeled as non-FAY, i.e., that they had been in a different school for part or all of the year before the test). The SCDP researchers write, “There are not large disparities, but [the data] generally indicate that students who moved to a new school were doing somewhat worse in the year prior to switching than students who remained in place.”
So non-FAY students’ low scores in the testing year are not necessarily a function of having switched schools – they came into the new school with deficient skills. (This is a good reason for keeping non-FAY achievement results out of the public eye.) Further, the SCDP authors note that, in particular, students who leave the voucher program for MPS (or vice versa) have vastly lower scores than students who stay with their voucher schools.
Taken together with what yesterday’s post showed – that non-FAY students in any given MPS school are likely to be scoring way below the rest of the students in that school – the implication is pretty clear: a student’s switching schools should be seen as a giant red flag. More on that tomorrow.
By now, you have probably also started to wonder: Which came first, the chicken or the egg? If students who switch schools have low test results before and after their move, can we say that mobility itself causes lower test scores, or is it just that students with low test scores tend to be the ones who move?
Dickman told me, “The act of switching itself creates the deficiency.” She noted that Milwaukee’s “marketplace” of schools and culture of choice among its schools is at least partially at fault. “We have done a disservice to parents by telling them to vote with their feet,” she said. Instead, we should reinforce that parental research needs to be done before schooling starts, not after. “If we tell [parents] they have a choice,” she said, “we need to say it had better be a good one because you have to stick with it.”
MPS doesn’t seem to have plans to tell parents to “stick with it.” The district, as I noted yesterday, acknowledges its students’ tendency to move and has tried to counter it, not with a less-permissive school selection system that would keep kids in place more, but with uniformity of curriculum and expectations. The district’s new Comprehensive Literacy Plan and newer Comprehensive Math & Science Plan aim to standardize what, when, and how students learn in the core subject areas at all grades and all schools.
Even at this very moment, MPS administration and the school board are considering further clarification and standardization of graduation requirements because, as Superintendent Gregory Thornton pointed out at a board committee meeting last week, MPS students change schools so often that there has to be uniformity. To MPS, it seems to me, switching schools within the district (or even between voucher schools and MPS) does not itself create the difference in scores. Rather, it’s what students find at the new school.
So that’s one vote for the chicken, and one, kind of, for the egg. Can the academics at the School Choice Demonstration Project break the tie? As it turns out, no:
Beyond the imprecision of these estimates, as with other results presented in the preceding sections, we caution readers against drawing a causal interpretation of [the data]; these differences are unadjusted for other factors, and we cannot say that lower prior achievement caused students to switch. It may simply be that students who move are inherently lower performing, or that other factors unaddressed here were causing both low achievement and the decision to move. (their emphasis)
The 2010 study’s authors promise that “[f]uture analyses will consider these implications further,” but the remaining reports the SCDP provided in 2011 and 2012 don’t dig much deeper into the issue, unfortunately leaving the chicken-and-egg question tantalizingly unresolved.
Tomorrow’s post will offer some recommendations for what to do next.