Has Accountability Become a Victim of Testing or has Testing Become a Victim of Accountability?

Please read this "Washington Post" article about Pearson Publishing Company and its tests.
 
I am sorry to admit that teacher use of prepared publisher tests along with easy-to-grade bubble sheets is all too prevalent and has been since NCLB and the standardization mania. Assessments are an important tool for teaching, and just as important a tool for student learning. But testing has become a victim of accountability, and accountability has become a victim of testing. The public has been convinced that a test, even a BAD test, can measure anything.
 
Do not think that the adoption of ANY curriculum/textbook, whether or not it received a bogus evaluation by John White, will solve the problems of Common Core. Our best hope is in our teachers. QUALITY professional development by QUALIFIED educators whose butts have not become petrified by sitting in central office has always been an essential component of QUALITY education.
 
The following anecdote is an unfortunate result of testing, testing, testing and the great white hope of standardization. Teachers can't even look at the high-stakes tests they administer, much less create them.
 
 
October 8
Sarah Blaine is a mother, former teacher and full-time practicing attorney in New Jersey who writes at her own parentingthecore blog. Earlier this year, I published a post of hers under the headline, “You think you know what teachers do. Right? Wrong,” that was extremely popular with readers. Here’s a new post by Blaine from her blog about what happened when her fourth-grade child came home with some school work — and why it affects far more than her family.

By Sarah Blaine

Last Friday morning, my fourth grader handed me her “Thursday folder” shortly before we needed to head to the bus stop. I was glad to see a perfect spelling test, and a bunch of excellent math assignments and math tests. Time was short, however, so I flipped to the wrong answers. And sprinkled among the math tests, I came across two wrong answers that caused me concern.
The first problem was this:

Now, I looked at this problem before I’d had my morning coffee, and I wasn’t sure at first that I wasn’t just missing something. So I posted this picture to my Facebook feed, and asked my friends to confirm that I wasn’t crazy.

But my daughter was right: if Curtis walked three miles a day for 26 weeks, Curtis did in fact walk 546 miles.

3 miles/day x 7 days/week = 21 miles/week
21 miles/week x 26 weeks = 546 miles

I double, triple, and quadruple checked myself.  I pulled out a calculator.
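The check really is just two multiplications; a short Python sketch (mine, not part of the original post) confirms the daughter's answer:

```python
# Curtis walks 3 miles a day, every day of the week, for 26 weeks.
miles_per_day = 3
days_per_week = 7
weeks = 26

miles_per_week = miles_per_day * days_per_week  # 3 * 7 = 21 miles per week
total_miles = miles_per_week * weeks            # 21 * 26

print(total_miles)  # 546
```

Any fourth grader who sets it up this way gets 546 miles, exactly as the test answer should have read.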

My friends agreed: my initial reaction to this question wasn’t nuts. My daughter’s answer was correct. And they came up with some good theories for why the answer might have been marked wrong.
Perhaps the teacher was trying to teach children, especially girls, to be confident in their answers, and she’d been marked wrong due to the question mark.
Perhaps she’d been marked wrong because she failed to indicate the units.
Perhaps she’d been marked wrong because she hadn’t provided every step of her work (i.e., she’d figured out the first step (3 miles/day x 7 days/week = 21 miles/week) in her head, and therefore had paid what one of my friends memorably described as a “smart kid penalty”).
 
But they were all wrong.

My daughter is fortunate enough to attend an excellent public school and her responsive teacher both sent a note home and called me that afternoon to discuss (I’d scribbled a quick note asking what the deal was along with my required signature on the front of the paper).

It turned out that my daughter had been marked wrong for a very simple reason: the Pearson answer key was wrong.

Let me say that again: Pearson was wrong.

Pearson listed some totally different — and wrong — number as the answer. The teacher had missed it when reviewing the test with the morning class, but in the afternoon class she’d realized the problem. My daughter’s teacher apologized for forgetting to mention it again to the morning class (and for not having previously changed their grades, but to be honest, I really could not care less if my kid scored a 95 percent or 100 percent on a 4th grade in-class math test).

In the olden days, I’d have laughed it off. Once in a while, the textbook publisher screws up. In the olden days, that screw-up was no big deal: it is mildly annoying to those of us who pay the taxes to buy the books, but it’s a pretty minor annoyance in the grand scheme of things.
However, these are not the olden days. These are the days of high-stakes testing. These are the days in which our kids’ high school graduations hinge on tests created by the very same company — Pearson — that messed up the answer to this question.

Tests we parents will never get to see.
Tests we parents will never get to review.
Tests we parents will never get to question.

So Pearson’s mistake on its fourth-grade answer key doesn’t exactly inspire confidence.
Presumably, before the enVisions curriculum was published, Pearson checked and rechecked it. Presumably, its editors were well-paid to review problems and answer keys.

After all, Pearson itself describes this math curriculum as:
Written specifically to address the Common Core State Standards, enVisionMATH Common Core is based on critical foundational research and proven classroom results.
And yet… it was still dead wrong.

It seems that all of Pearson’s critical foundational research and proven classroom results in the world couldn’t get the question 3 x 7 x 26 correct.

To the uninitiated, I bet I sound nuts.  Who cares, right?  It’s just a question on a math test.  But if we are going to trust this company to get it right on high-stakes tests (where there is no public accountability), then the company better get it right all the time when it is operating within the public eye.  So this isn’t just about a fourth grade math test.  It’s all of the other Pearson-created tests my daughter is scheduled to take: in particular, the new Common Core PARCC tests this spring, which are the ones that come with no public review, and no public accountability.

Here, the test came home in my daughter’s backpack. As a result, there was an opportunity for public review and public accountability because I could review the test and question the wrong answer. The teacher could check the question and realize that the book was wrong, and substitute her own professional judgment for that of the textbook publisher.

And most importantly, the mistake was not a big deal, because the outcome of this test would not determine my daughter’s placement into an advanced math class or a particular school or even prevent her from graduating from the fourth grade. The outcome of this test would not determine her teacher’s future salary or employment. This test was nothing more than the kind of test our nine and ten year olds should be taking: a fourth grade in-class, teacher-graded chapter test. At most, this test will determine a small portion of my daughter’s report card grade.

But what about those tests that Pearson will be administering to our students this spring? We won’t be able to review the test questions, the answer keys, or our children’s answer sheets. We won’t be able to catch Pearson’s mistakes.

This spring, even if the answer really is 546 miles, Pearson will be able to write that Curtis traveled 1024 miles, or 678 miles, or 235 miles, or any other distance it wants. And we’ll never know that our kids weren’t wrong: Pearson was. But our kids’ futures — and their teachers’ careers — will be riding on the outcomes of those tests.

There has to be a better way.

In a low-stakes world, Pearson’s screw up was a low-stakes mistake. But now we’re forcing our kids — our eight, nine, and ten year olds — to live in a high-stakes world.

And in a high-stakes world, Pearson’s screw ups are high-stakes. So shame on you, Pearson, for undermining my daughter’s hard-earned (and easily eroded) math confidence with your careless error. I will parent my kid so that she learns not to second-guess herself with question marks after her answers.

But Pearson, I will be second-guessing you. As publicly as possible.

Here is a follow-up to this post, with Pearson apologizing to Sarah Blaine.

Here’s the author’s first post on The Answer Sheet, “You think you know what teachers do. Right? Wrong.”

*******
Here is a reader's comment that is quite relevant:

lindentreeislander
9:13 AM CDT
 
 
I've looked at the sample Common Core language arts and math tests for 3rd graders. The language arts test was filled with reading passages followed by questions and multiple choice answers about those passages. I found many of the so-called "right" answers to be either plain errors or arguably not the best answer. In other cases, the information given in the passage was not sufficient to know or infer any answer to the question. Frankly, the language arts test seemed to have been made up by computer (IT) specialists or statisticians who just are not highly intelligent at using spoken/written language. They are instead pretty smart at producing collections of test questions that will yield a "normal" distribution of scores, a "hill" on a graph. I would have scored poorly, and this was the third grade test!
 
As far as the math test, I only gave it a cursory look but it seemed to me to require conceptual thinking beyond the reach of many children in the 3rd grade (above the level of the simple arithmetic problem discussed in this column). If it additionally contains frank errors, or suffers from so-called "right" or "best" answers that really are arguable, the potential harm to the confidence of children and to their school trajectories is considerable. (Arguable answers can exist in math questions about a hypothetical real-life situation described in a reading passage if the passage is subject to more than one interpretation or if insufficient information is given.) 
 
What I looked at were only sample tests put out by the company. Not only are parents prevented from seeing the actual tests and answer keys. Teachers and principals are not allowed to see them either! They are proprietary tests owned by a corporation and do not belong to the school district. How convenient that no review of the corporation's product is permitted.