PLR/CLR as Large-Scale Assessment
Can classroom-based assessments provide information that could be used effectively for accountability? Statewide use of portfolios in Vermont (Examiner, Winter 1993-94) and Kentucky (Examiner, Spring 1993) is beginning to show that this might be possible. In both these states, the content of the portfolios is largely specified by the state. For example, Vermont requires particular types of work in its math and writing portfolios.
The Primary Language Record (PLR) and its adaptation, the California Learning Record (CLR), are beginning to demonstrate a somewhat different approach to using classroom-based information for large-scale purposes. With the PLR/CLR, teachers observe and document student learning, interview students and parents, and collect samples from student portfolios, focusing on reading, writing, listening and speaking. The content of the samples from the portfolios is not specified in advance; it is selected by teachers. Teachers sum up each student's progress several times a year in narrative form on the student's record. Research in the US and England has consistently shown the PLR/CLR to be valuable for instruction and for making classrooms more learner-centered (see sidebar and Examiner, Summer 1992).
The PLR/CLR also utilizes a series of five-point reading and writing scales that correspond to the development of learning in these areas. In reading, the scales are for "Becoming a Reader" (mostly for K-3) and for "Becoming an Experienced Reader" (grades 4-6 or higher). The CLR is in the process of validating a third scale for high school students, as well as learning records and scales in other subjects. Using classroom information, teachers can note student progress in terms of the scales. These scales could serve as the basis for accountability reporting.
The question is whether teachers can accurately place their students on the scales. To answer this question, the Center for Language in Learning (CLL), which administers the CLR, has conducted scoring sessions in California. A sample of 20% of a classroom's records is rescored. If the sample was scored accurately by the teacher, it is reasonable to conclude that the rest of the records were as well. And if teachers' placements are correct, then public reporting based on the scales will convey accurate information.
At regional meetings in 1994, each sampled CLR was read by a team of two reviewers. In this process, the teacher submitted the Record along with supporting evidence, which the reviewers examined to determine whether the teacher had accurately placed the student. CLL explained that having a team of readers helps reduce bias and ensure reliability. The readers, all of them teachers, spent about 15 minutes on each record.
Fifty-five reviewers scored 156 records. Each reviewer's score was then compared with the teacher's original score. In 134 cases (86%), there was exact agreement between the teachers and the second readers. Third readings were done when the second readers did not agree with the teacher, and two-thirds of the third readings confirmed the teacher's placement. In only two percent of the readings was there no agreement among the three readers.
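The 1994 agreement figures follow from simple arithmetic; a minimal sketch using only the counts reported above (variable names are illustrative, not from the study):

```python
# Illustrative check of the 1994 scoring-session figures reported in the article.
records_scored = 156      # sampled CLRs read by two-reviewer teams
exact_agreements = 134    # cases where the second reader matched the teacher

# Exact-agreement rate between teachers and second readers.
agreement_rate = exact_agreements / records_scored
print(f"Exact agreement: {agreement_rate:.0%}")    # about 86%

# Records that went to a third reading (second reader disagreed).
third_readings = records_scored - exact_agreements
print(f"Records needing a third reading: {third_readings}")
```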
In 1995, the process was altered to create a new level of review. Teachers scored their own students. Then first reviews were done at the school site, using a sample of records from each participating teacher. At a regional meeting, the records were read a second and if necessary a third time. After all scoring was complete, the results were compared to the scores given by the students' own teachers.
Of 174 scored records, 79% of the second readings corroborated the teacher's placement. Thirty of 35 third readings agreed with the second readers rather than the initial readers, and in most of these cases the second and third readers' score was lower than that of the student's own teacher. An additional 24 records could not be scored because of inadequate supporting material.
The two years' work shows that even relatively inexperienced teachers can make accurate placements. CLL points out that most teachers were just beginning to use the CLR and lacked experience in selecting material to support their placement decisions. During interviews by CLL, these teachers often commented that the scoring sessions made them more aware of the kinds of supporting evidence they needed to supply to help subsequent readers understand the teacher's reasoning.
The studies provide early evidence that group scoring processes can be used successfully to demonstrate student progress in reading to the public by using a classroom instrument such as the PLR/CLR. The Center points out that if scoring sessions are viewed as professional development -- which many teachers have said they are -- then the assessment share of the costs is not excessive. Scoring sessions and studies will continue to be conducted in California and London.
Since there have only been two small trials, it would be premature to conclude the PLR/CLR can be used for accountability purposes. Larger studies and more rigorous technical analyses are needed. Classroom use of the PLR/CLR also requires extensive professional development for teachers. Unless teachers and schools have or want to have learner-centered classrooms, they may view using the Record as being very time-consuming. If classrooms are not organized to support use of the PLR/CLR, if teachers have too many students, or if teachers are not supported in this work by their schools, then it will be difficult to maintain its use.
Finally, if states or districts decide to use the PLR/CLR for accountability purposes and then attach high stakes to the assessment, such as sanctions when students make slow progress on the scales, teachers' responses to those pressures may undermine the important instructional benefits of the PLR/CLR. However, using separate accountability assessments that are not compatible with PLR/CLR-supported classroom practices is also likely to undermine instruction and learning. Thus, the accountability dilemma remains unresolved.
CLL, 10610 Quail Canyon Rd., El Cajon, CA 92021; (619) 443-6320.