Why NOT School Performance Scores?

This blog post by Gary Rubinstein at "TEACHFORUS" is a great example and explanation of the "inaccurate statistically meaningless scoring process" in New York City (that's where our unqualified candidate for State Superintendent previously worked his magic). Surprisingly, the post shows how the process hurts GOOD schools as well as "BAD" ones. Louisiana's School Performance Score metric is no better!


When Bad Progress Reports Happen To Good Schools — Change ‘em!
by Gary Rubinstein

New York City’s Department of Education recently released their ‘progress reports’ for all the middle and elementary schools for the 2010-2011 school year.

For each school they have a complicated formula that assigns up to 60 points for ‘progress’, up to 25 points for ‘achievement’, and up to 15 points for ‘school environment’. The scores are tallied, and out of the 1,100 schools, the bottom 3%, which is around 33 schools, are labeled with an ‘F.’ When a school gets an F, it is put on probation and could be shut down and turned into a charter school, or face other sanctions. Even if it doesn’t get shut down, it is pretty embarrassing for a school to get this grade, particularly when they know they don’t deserve the label.
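The arithmetic of that formula and the bottom-3% cutoff can be sketched in a few lines of Python. This is a minimal illustration, not the DOE's actual code: the 60/25/15 point caps and the 3% figure come from the post, while the function names and sample data are my own assumptions.

```python
# Minimal sketch of the progress-report scoring as described in the post.
# Point caps (60/25/15) and the bottom-3% 'F' rule come from the post;
# everything else (names, sample data) is an illustrative assumption.

def total_score(progress, achievement, environment):
    """Combine the three capped components into an overall 0-100 score."""
    assert 0 <= progress <= 60 and 0 <= achievement <= 25 and 0 <= environment <= 15
    return progress + achievement + environment

def flag_bottom_three_percent(scores):
    """Mark the bottom 3% of schools 'F'; the other letter grades are omitted here."""
    ranked = sorted(scores.values())
    cutoff_count = max(1, round(len(scores) * 0.03))  # ~33 schools out of 1,100
    cutoff = ranked[cutoff_count - 1]
    return {school: ("F" if s <= cutoff else "not F") for school, s in scores.items()}

# Hypothetical data: 100 schools with evenly spread overall scores.
schools = {f"school_{i}": float(i) for i in range(100)}
grades = flag_bottom_three_percent(schools)
```

Note that a rank-based cutoff like this guarantees some schools an F every year, no matter how well the city as a whole performs.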

Well, I’ve finally waded through all the instructions about how the grade is calculated. I always figured that the stat was not statistically meaningful, but what I learned about the system surprised even me. Things are so much worse than I figured, and I plan to write in extreme detail in the coming weeks about this rating system and reveal all of the many flaws. For this post, though, I will concentrate on one.

What would the Department of Ed think if, among the 33 schools that got an F, there were several of the best-performing schools in the city? That would be pretty embarrassing, wouldn’t it? It would make one question whether the scoring process was very accurate. Fortunately for them, when I looked at the list of Fs there were no schools with English and Math scores exceeding those of most of the schools in the city — or were there?

So I did a little experiment. Instead of sorting the schools by their letter grades, I sorted them by their final score, which was a number from 0 to 100. The schools with Fs, the bottom 3%, were schools with a final score of 18.2 or less. But what I found was that among the schools with scores of 18.2 or less, there were some schools that did not have Fs. There were three Cs! How could this be, I wondered.

So I downloaded the progress report for one of these schools, P.S. 56 The Louis Desario School. And, as the top of the progress report shows, this school clearly got a C, even though its overall score was 14.9, which put it in the bottom percentile of all schools.

Well, something was clearly up. Then I figured out why, by looking more closely at the ‘fine print’ at the bottom right of the report:

Mystery solved. Schools with average English and Math performance in the top third citywide cannot receive a grade lower than a C. How’s that for a self-fulfilling prophecy?
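The rule in that fine print amounts to a one-line grade floor. Here is a hedged sketch of how such a rule operates, assuming a simple letter ordering; the function name and the boolean flag for "top third citywide on ELA/Math" are my own illustrative choices, not anything from the DOE's actual methodology.

```python
# Sketch of the fine-print grade floor: schools whose average English and
# Math performance is in the top third citywide cannot receive lower than
# a C, whatever their overall score. Names here are illustrative assumptions.

GRADE_ORDER = ["F", "D", "C", "B", "A"]

def apply_grade_floor(raw_grade, top_third_ela_math):
    """Lift an F or D up to a C for schools in the top third on ELA/Math."""
    if top_third_ela_math and GRADE_ORDER.index(raw_grade) < GRADE_ORDER.index("C"):
        return "C"
    return raw_grade

# P.S. 56's situation as the post describes it: an overall score of 14.9,
# below the 18.2 'F' line, yet the published grade is a C.
print(apply_grade_floor("F", top_third_ela_math=True))   # C
print(apply_grade_floor("F", top_third_ela_math=False))  # F
```

The floor only ever fires for high-scoring schools, which is exactly the asymmetry the post objects to: the raw formula's verdict is trusted for everyone else.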

The message is that this inaccurate, statistically meaningless scoring process is good enough for low-performing kids, but not good enough for high-performing kids (who, in this school's case, are only 9% Black and Hispanic, I might add).

I find this loophole offensive. It is just a way to hide the fact that they have developed a horrible grading system. Stay tuned for more posts about the details behind the progress report grading system. When I am through exposing all the flaws, the only appropriate thing for the DOE to do would be to fire whoever came up with the system, have him/her apologize to all the teachers, administrators, and students he/she slandered with this failing label, and re-open all the schools that were closed based on this phony metric.
