Wisconsin’s state achievement test, the Wisconsin Knowledge and Concepts Exam (WKCE), is given every November to public school students in grades 3-8 and 10.
At the high school where I teach, we English teachers were emailed a list of selected 10th-grade students in early fall. These are your FAY students, we were told. These are the ones who count.
Hang on, you might be saying, you lost me – FAY?
Yes, FAY. That stands for Full Academic Year. In eduspeak, “An FAY student is one who has been continuously enrolled in a school or district for 9.25 months, not including time that the student is not in school during summer.”
Or as Philip L. Cranley, an Education Data Consultant with Wisconsin’s Department of Public Instruction, told me, “It works out to be about equal to the full prior academic year. In the case of [the WKCE], it’s about test window to test window.”
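For the programmatically inclined, here is a minimal sketch of that rule, taking Cranley’s “test window to test window” shorthand literally. The function and the dates are hypothetical; DPI’s actual calculation handles enrollment gaps and the 9.25-month count more carefully.

```python
from datetime import date

def is_fay(enrolled_since: date, prior_test_window: date) -> bool:
    """Rough FAY check: treat a student as FAY if she has been
    continuously enrolled since on or before the prior year's test
    window. (DPI's real rule counts 9.25 months of continuous
    enrollment, excluding summer.)"""
    return enrolled_since <= prior_test_window

# A student who arrived in mid-October 2011 is not FAY for the
# November 2011 WKCE (dates are illustrative):
print(is_fay(date(2011, 10, 15), date(2010, 11, 15)))  # False
```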
It’s easy to see why FAY is an important part of accountability discussions. Imagine a student who arrives at a school mid-October – maybe her family just moved, or maybe he’s one of those who ride the CO merry-go-round. Is it fair to judge this student’s new school based on the test she or he takes after just two weeks there?
So the US Department of Education and DPI report the test scores of FAY students – students in a school or district for a full year before taking the test. Non-FAY students are still tested and scored, but their scores are not counted toward things like meeting “No Child Left Behind” standards for achievement or progress.
Which brings me back to that email. To be clear, there was nothing in the email telling us to ignore the non-FAY students (and, credit to my colleagues – I didn’t teach 10th grade last year – they did not). But we were asked to look the list over, to leverage any positive relationships we had with those FAY students, and to make sure they did their best on the test.
And it worked! On the 10th-grade WKCE in reading and math, my school’s FAY students scored basically at the district average.
A couple of caveats at this point: One, we’re talking about the Milwaukee Public Schools, so being at the district average is still kind of sad – even if it was a pretty big deal to us at the time considering how far below it we’d fallen in recent years.
And two, especially given my last post here at SchoolMattersMKE, I need to make it clear that I do not believe a single standardized test is the only or even the best measure of any given district’s, school’s, teacher’s, or student’s achievement or ability. However, it is a measure, and we would be fools to ignore it completely, particularly when other people and institutions give it such weight. Also note that when I refer to “scores” on the test, even without qualification, I mean the percentage of students who rate as proficient or advanced on the state test.
Also, three, the email did not come from MPS itself. Rather, it came from our “vendor” – the private-sector partner my school used as a reform consultant as part of our “School Improvement Grant.” They worked with my school and quite a few others in MPS last year, including one honored by the US DOE for doing so well at a “turnaround” (as measured by test scores). In general, they were very good, and I liked them all, personally and professionally, quite a bit more than I expected to. And what happened here – a nudge to get us to focus on the students that count – should not be read as any kind of condemnation.
“It doesn’t surprise me,” said Anneliese Dickman, Research Director at Milwaukee’s Public Policy Forum, when I told her about that email. “You can find examples of principals, superintendents, and consultants all over the country who figure out how to maximize scores without breaking any rules.”
Dr. Heidi Ramírez, MPS’s outgoing Chief Academic Officer, made it clear to me that this sort of thing is not MPS policy. “[I]t’s been our belief that we need to hold schools accountable for the learning of all kids, not just the ones they think will ‘count’ in an external accountability system,” she wrote in a statement to me via MPS Spokeswoman Roseann St. Aubin. Citing Deb Lindsey, the district’s innovative, long-serving Director of Research and Evaluation, Ramírez added, “[I]f we didn’t hold this belief, then [a school] could choose to NOT address the needs of non-FAY kids, or not treat them equitably. It’s our job to prevent that kind of thing from happening.”
So my school’s FAY students scored proficient or advanced in reading and math at about the district average. But what about our non-FAY students, who make up about a third of our 10th graders? Were they achieving at a higher or lower rate? And if my school showed a disparity between FAY and non-FAY, would other MPS schools, too? And thus began a dive into the data.
I couldn’t look at every school for every tested grade – I’m just one guy, and SchoolMattersMKE.com has sadly offered me no interns – so I sampled Milwaukee Public Schools data from the November 2011 test. I picked grades 4, 8, and 10 (the ones that really matter in state and federal achievement reporting) and looked only at reading and math. I looked at over 60 schools and data for more than 8,600 students in those grades, covering about two-thirds of MPS’s 10th-grade enrollment, more than half of its 8th-grade enrollment, and about 40% of its 4th-grade enrollment.
I tried to pick schools dispersed geographically around the city, with varied achievement and demographics. I wanted a wide sample, and I’m going to ask you to mostly trust me on that: out of privacy concerns for the students involved, I will not print the scores for individual schools and grades. I will name some specific schools to make particular points, but I will not say what their FAY or non-FAY scores were individually, only what the schools in my sample show in aggregate. (FAY results are available from DPI’s WINSS site, if you want to have at them yourself.)
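For the curious, the aggregation itself is simple. Here is roughly what it looks like in Python with pandas, with a file name and column layout I have invented for illustration; the real DPI and MPS extracts are shaped differently and would need reshaping first.

```python
import pandas as pd

# Hypothetical layout: one row per tested student. Columns assumed:
# school, grade, subject, fay (bool), proficient (bool, True meaning
# the student scored proficient or advanced).
df = pd.read_csv("wkce_2011_sample.csv")

summary = (
    df.groupby(["grade", "subject", "fay"])["proficient"]
      .mean()               # share scoring proficient or advanced
      .mul(100).round(1)    # as a percentage
      .unstack("fay")       # FAY and non-FAY side by side
)
print(summary)
```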
Here are the results, in three graphs, comparing percentages of FAY and non-FAY students who scored proficient or advanced in reading and math in my sample:
You don’t have to be some kind of statistics genius to see that there’s a stark difference here, notably in reading at all grades. In 8th grade, for example, the difference between FAY and non-FAY students is 24 percentage points in reading – a pretty stunning number.
But maybe, you are thinking, the non-FAY sample is small, so this big difference has little bearing on overall MPS achievement. In fact, in these grades the non-FAY students make up a pretty consistent quarter of the sample (to be precise, 25.6%, 22.5%, and 25.8% in grades 4, 8, and 10, respectively).
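To put a number on why that quarter matters, here is a quick back-of-the-envelope calculation. The 24-point gap and the 22.5% share come from my 8th-grade reading sample; the 60% FAY proficiency rate is invented purely for illustration.

```python
# Hypothetical 8th-grade reading example: suppose a school's FAY
# proficiency is 60%, so its non-FAY proficiency is 36% (the 24-point
# gap from my sample), with non-FAY students at 22.5% of its testers
# (the 8th-grade share from my sample).
fay_rate, nonfay_rate, nonfay_share = 0.60, 0.36, 0.225

# The published score counts only FAY students: 60%. The score for
# ALL students is the weighted average of the two groups:
all_students = (1 - nonfay_share) * fay_rate + nonfay_share * nonfay_rate
print(f"{all_students:.1%}")  # 54.6% -- 5.4 points below what's published
```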
One thing to keep in mind about the number of non-FAY students is how variable it is from school to school. Some schools had zero or a mere handful of non-FAY students, while others had scads of them. For example, Craig, Maryland Avenue, and Fernwood Montessori schools had just four non-FAY students total among them at grades 4 and 8, while schools like Metcalfe, Pierce, and Maple Tree all had non-FAY numbers over 40%.
You see similar variation among high schools, too: Milwaukee School of Languages had less than 3% non-FAY students, while Vincent and Madison had 40% and 48%, respectively. At alternative high schools, the share of non-FAY students can reach the 60% range. The winner was Washington, at 71.5%: because the Washington High School of Information Technology absorbed students from a school in the same building that closed at the end of the 2010-11 school year, all of those students were labeled non-FAY.
I asked DPI’s Cranley whether MPS’s high non-FAY numbers were unique in Wisconsin. “MPS has a great deal of mobility,” he agreed, “but due to its size and number of schools there is more opportunity for mobility than in other districts.” Not exactly the “Yes, MPS is a special case” I was hoping for, but I think it’s clear MPS’s situation is, as usual, very different from the rest of Wisconsin.
MPS doesn’t hide its high non-FAY number. In its 2010-11 district report card (very large pdf), the most recent one available, MPS claims a “stability rate” of just over 70% – meaning that, all things considered, only about 70% of the students who could return to the same school the following year actually do, excluding graduates and those finishing a school’s terminal grade (5th grade, for a common example). MPS also claims a 15% “mobility rate,” the share of students who switch schools during the school year, not just between years. More on that in tomorrow’s post, where I get to some of the implications of these data.
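The two rates are easy to confuse, so here is how I read the definitions, with made-up enrollment counts chosen to land on MPS’s published figures; the exact denominators are my assumption.

```python
# Stability rate: of the students who COULD have returned to the same
# school the next year (excluding graduates and terminal-grade
# finishers), what share actually did?
could_return, did_return = 1000, 705
stability_rate = did_return / could_return    # ~70%, like MPS's figure

# Mobility rate: what share of students switched schools DURING the
# school year, not just between years?
enrolled, switched_midyear = 1000, 150
mobility_rate = switched_midyear / enrolled   # 15%, like MPS's figure

print(f"stability: {stability_rate:.0%}, mobility: {mobility_rate:.0%}")
```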
Despite the differing ratios of FAY to non-FAY, one thing remains pretty consistent across all the schools: non-FAY students score lower than a school’s FAY students. It’s not a case of a few bad apples in the aggregate spoiling the bunch; rather, almost without exception, school by school, non-FAY scores are lower.
Of the 16 high schools I looked at, 12 had non-FAY scores lower than FAY scores in math, and 13 had them lower in reading. (The exceptions were generally small, a couple of percentage points, compared with gaps of up to nearly 40 percentage points at schools where non-FAY students scored lower.) Middle and elementary schools followed a similar pattern: about 80% of the 8th grades and about 60% of the 4th grades I sampled had lower non-FAY scores than FAY scores.
This matters because, as I said above, non-FAY students don’t count. When you look at a school’s standardized results as published by DPI or MPS or the press, non-FAY student scores are stripped out. Many of those students were in MPS for the year before they tested, so they count in the district’s FAY results (available here), but not when it comes to school-level results. They are, in a way, invisible.
And they are failing at a far greater rate than MPS’s FAY students. If there’s a crisis in achievement at MPS – and I think we can all agree that despite some general moves in the right direction in recent years, there sure is – then how MPS deals with its non-FAY student achievement failures is key to addressing that crisis. More on that tomorrow.