Test foul-ups

Status: Archived
Subject: K-12 Testing

While the federal government demands ever more testing with ever higher stakes, more states are encountering difficulties with their existing testing systems or with efforts to expand them.


Nevada officials reported in July that 736 sophomores and juniors who had been told they failed the math portion of the state test had in fact passed it. While none had been denied graduation, some may have dropped out of school. Test maker Harcourt Educational Measurement said the passing score was 41 correct answers, but a company programmer had miscalculated it as 42. State Board of Education members, who had approved the high-stakes use of the test, expressed outrage, and some called for canceling the contract with Harcourt.


North Carolina has suffered its second major testing foul-up in two years. The state Board of Education voted in early July to throw out the results of this year's fourth- and seventh-grade writing exams after scores diverged wildly from the previous year's: 69% of fourth graders passed in 2001 but only 47% this year, and the seventh-grade pass rate declined 10%. In 2001, the state's Department of Public Instruction had overestimated the difficulty of the math test and set the passing score so low that almost every student passed. In both cases, outside experts concluded that the state was developing and implementing its exams too quickly; this year's writing test was deemed poorly designed and its prompt unclear.


In Georgia, an equating error forced the Department of Education to indefinitely postpone release of results from the Stanford 9 achievement test given in grades 3, 5 and 8. (Equating is the procedure test makers use to ensure that two forms of a test are of equal difficulty.) The problem began when the state had the publisher, Harcourt, produce a special SAT-9 form for Georgia to administer in 2001-02; a nationally normed form had been used the previous year. In about one third of the subject areas and grades, scores diverged from one year to the next by 15 to 24 percentile points, far more than the typical one- to three-point differences between forms and years.


The state told districts not to use the results. Some districts misuse the scores to make student class assignments or to determine educator bonus awards. State officials acknowledged that the problem may never be fixed, and the Board has voted to make the exam optional next year.


Release of New Mexico's test results was delayed for months after about 70% of the state's school superintendents reported errors, including missing data and unusually low scores. The state uses CTB/McGraw-Hill's norm-referenced TerraNova achievement test to rank schools. The lower scores turned out to reflect differences between the previous test form and the new one, and results were adjusted to match the old norms.


Minnesota postponed scoring its state exams simply for lack of money. Results of the March tests, which are used for school accountability, will not be known until October. One state representative asked, "If the results are going to be delayed, why give the tests?" Many states face budget shortfalls at the same time that federal law requires them to implement more testing.