NAEP Levels Found To Be Flawed
Once again, a major study has concluded that the process used to categorize scores on the National Assessment of Educational Progress (NAEP) is "fundamentally flawed," resulting in levels that are "not believable" and typically too difficult. A report by the National Research Council (NRC), Grading the Nation's Report Card, called on the National Assessment Governing Board (NAGB) to revise the levels for NAEP studies starting after the year 2002.
NAGB itself had revised the levels set by its independent panels for the 1996 NAEP science exam, an action that exemplifies the problems with the procedures the panels use. NAEP levels ("basic," "proficient," and "advanced") are used to characterize student achievement and the quality of U.S. education.
Earlier reviews of the levels-setting process, including one from the National Academy of Education (NAE), had similarly criticized the process (see Examiner, Fall 1991, Spring 1992). At that time, NAGB simply attacked its critics and defended the levels-setting process. This time, NAGB requested that the NRC offer advice on how to improve the levels setting, which the NRC has agreed to do.
According to the report, the levels are inadequately defined, and the process of setting the cut scores between levels is not well designed. The report added, "NAEP's consensus-based frameworks and the assessments based on those frameworks...do not lead to portrayals of student performance that deeply and accurately reflect student performance."
The NRC panel recommended that rather than rely on only one NAEP exam, student learning be described using "multiple methods for measuring achievement that go beyond large-scale assessment formats." It also called on NAGB to clarify the criteria for including students with disabilities and limited English proficiency in the NAEP exams.
Unfortunately, the report appears to support NAGB's plan to gather less of the in-depth background information that has been used to correlate students' achievement on the test with other factors, such as the types of instruction they receive (see Examiner, Summer 1996). These data are useful for, among other things, assessing the relative effectiveness of different teaching practices.
The NRC and NAE both concluded that NAEP levels often are set unreasonably high. For example, the 1994 NAEP reading test results reported that 40 percent of fourth grade students were unable to read at even a "basic" level. However, just a few years earlier, an international reading test found that U.S. third graders scored in a statistical tie for second among participating nations, following only Finland.
The setting of unreasonably high levels has become something of a trend. For example, the Massachusetts state exam appears to include grade four reading passages that are mostly at a grade five or six level, according to readability formulas. An advisor to the test's development noted that items on the tenth grade Language Arts test would be appropriate for sophomores at elite colleges. Such high cut scores on state exams could soon cause many students to be denied a high school diploma.
The NAEP levels also have helped to create an atmosphere in which, for example, a New York education official could denounce a test simply on the grounds that too many students pass it.
-- Available from National Academy Press, 2101 Constitution Ave., NW, Washington, DC 20418; (800) 624-6242; $47.95 + shipping; or see website: www.nap.edu