John White and LDE Break the Law Again!

Reposted from Mike Deshotel's Blog, Louisiana Educator:

All of this test score manipulation is a problem for Louisiana students and schools, but the most significant problem for John White is that federal funding is based in part on these test scores and school performance scores are used to allocate monetary awards of TAXPAYER money to schools and districts.  Eventually someone will respond to our complaints and hold these people LEGALLY responsible.  

White Refuses to Release Raw LEAP Cut Scores
A press statement accompanying the release of the Spring 2014 LEAP and iLEAP testing results announced that the percentage of students receiving a rating of "mastery" on the LEAP had improved and the percentage of students rated "basic" had remained steady this year, despite the inclusion of more "rigorous" Common Core aligned questions on this year's tests. The press release from the LDOE stated:

"The Department of Education today announced that on LEAP and iLEAP tests aligned to more challenging learning standards, the percentage of students performing at the state’s 2025 expectation of “mastery” (level 4 out of 5) increased in both English Language Arts and math, while the percentage of students performing at the state’s expectation level established in 1999, “basic” (level 3 out of 5), remained steady."

Using critical thinking skills to decipher the real meaning of the above statement, I began to ask myself: "Does this press release mean that more students got a higher percentage of answers correct on this year's test than they did last year, even though the test was supposed to be more difficult?" I also wondered: "Does performance at a level of 4 out of 5 mean that students got 80% of the questions on the test correct? Does a rating of 'basic' mean that a student got at least 60% (3 out of 5) of the questions right?" But after studying the technical explanations on the LDOE website, I concluded that the press release tells us nothing about what percentage of correct answers is represented by the ratings of "basic" and "mastery". It also tells us nothing about whether students got more or fewer right answers on this year's test compared to last year's. To figure that out, we would have to know the raw scores equivalent to those ratings . . . and John White is not telling us the raw scores: the percentage of correct answers required to earn a rating of basic or mastery.

You see, it turns out that the raw cut scores, or the percentages of correct answers required for ratings of "basic" and "mastery", can be changed from year to year based upon judgments made by the LDOE and the testing company the Department employs to design and grade the tests. Department policy is that if the test for a particular year contains more difficult questions (in the opinion of the DOE and the testing company), the raw cut score (the percentage of correct answers) for a rating of either "basic" or "mastery" can be lowered to "adjust" for the greater difficulty of the new test. The following is the technical explanation the LDOE gives for adjusting or resetting the raw cut scores from one test form to the next (from page 6 of the DOE Technical Summary Report):

"Equivalency is established by first building the forms to be equated according to tight content specifications. Then the form scores are placed on the same scale, such that students performing on an assessment at the same level of (underlying) achievement should receive the same scaled-score, although they may not receive the same number-correct score (or raw score).(emphasis added) The raw-to-scaled-score relationship performs this leveling function based on form equating studies. Theoretically, differences in the raw-to-scaled-score relationship between the two forms can be partially due to differences in the samples utilized for calibration and the differences in item difficulty."
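To see what that equating language means in practice, here is a minimal sketch with entirely made-up numbers. It assumes a simple linear raw-to-scaled-score relationship (real equating studies are more elaborate); the forms, slopes, and the scaled cut score of 300 are all hypothetical, not LDOE figures. The point it illustrates is the one at issue: two students at the "same" scaled cut score can need very different percentages of correct answers, depending on how a given year's form was equated.

```python
# Hypothetical illustration of test-form equating (NOT actual LDOE data).
# Two forms of the "same" test are placed on a common scaled-score metric.
# Because Form B is judged harder, each raw point is "worth" more, so the
# same scaled cut score maps to a LOWER raw (percent-correct) cut score.

def raw_to_scaled(raw_pct, slope, intercept):
    """Map a raw percent-correct score onto the common scaled-score metric."""
    return slope * raw_pct + intercept

# Made-up equating constants for two forms of the same test.
FORM_A = {"slope": 5.0, "intercept": 0.0}    # easier form
FORM_B = {"slope": 6.25, "intercept": 0.0}   # harder form

SCALED_CUT_BASIC = 300  # hypothetical scaled cut score for a "basic" rating

def raw_cut_for(form, scaled_cut):
    """Invert the raw-to-scaled relationship to find the raw cut score."""
    return (scaled_cut - form["intercept"]) / form["slope"]

print(raw_cut_for(FORM_A, SCALED_CUT_BASIC))  # 60.0 -> 60% correct on Form A
print(raw_cut_for(FORM_B, SCALED_CUT_BASIC))  # 48.0 -> 48% correct on Form B
```

This is exactly why the raw cut scores matter: without them, "remained steady" on the scaled metric tells the public nothing about whether students actually answered more or fewer questions correctly.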

So until we know how this year's raw cut scores compare with previous years', we really don't know whether or not student performance at the basic level "remained steady" or whether the percentage of students performing at the mastery level has improved.

That's why on June 10th I made a public records request of John White, as the custodian of public records for the LDOE, asking for the percentage of correct answers needed for students to receive a rating of basic or mastery this year compared to the previous year. I also asked for copies of any communications between the LDOE and the contracted testing company concerning any adjustments in test scores from last year to this year. But after more than a month of wrangling with the attorney representing John White and the DOE, I was informed Friday that the Department is not in possession of the information I requested. How can the DOE not have the information it used to assign LEAP and iLEAP ratings to approximately 500,000 Louisiana students?

What is the definition of a public record anyway? According to the Public Affairs Research Council, which for years has advised the Louisiana public on the meaning of the public records laws, the definition is: "Generally anything having been used, being in use or prepared for use in the conduct of public business is a public record, regardless of physical form." Based on this definition, I believe that the raw cut scores for a rating of basic or mastery on the LEAP tests are public records and should be provided to any citizen requesting them.

Our state superintendent, John White, worked in the New York State education system before coming to Louisiana. The New York state agency was notorious for manipulating the cut scores used to determine the performance of students and schools. It has recently been revealed that the raw cut scores there were changed drastically over a 10-year period: first to make it seem that student performance had improved dramatically, and then, last year, to show a drop in performance when probably no real change had occurred. Is something similar now happening in Louisiana? We won't know unless John White provides us with the raw percentage scores for the ratings of basic and mastery over a period of years. We have a right to know whether data is being improperly manipulated. We need to know whether the move to Common Core testing is going to cause our students to perform higher or lower on the state tests.

I have offered to meet with White or his staff to resolve this matter amicably, but that is not happening, and it seems my only option now is, once again, legal action simply to get White to follow state law.
