August 7, 2007

ED Daily - NAGB to Determine Future of Reading Trend Line

By Stephen Sawchuk, Staff Writer

With a new National Assessment of Educational Progress reading test scheduled to debut in 2009, the board that oversees NAEP policy must determine whether to attempt to continue the existing reading "trend line" -- used to track student performance on the test over time -- or end it and begin a new one.

The National Assessment Governing Board periodically revises its assessments to reflect advancements in the fields being assessed and in teachers' instructional practices.
The new reading test contains some significant alterations, including a heavier emphasis than the previous test, which has been used since 1992, on understanding nonfiction and informational texts.

Because of the changes in construct, NAGB frequently restarts the trend line when it institutes a new test.

But the stakes are higher in this instance because policymakers use NAEP as an objective measure of whether the No Child Left Behind Act (NCLB) actually has produced gains in students' reading and math achievement over time. A new trend line would prevent those types of analyses for the time being.

"We have a key subject here that's part of NCLB," said Peggy Carr, associate commissioner for the National Center for Education Statistics, while briefing the board on its options at its quarterly board meeting last week. "There is a desire to link the trend lines."

The NCLB issue has already proved serious enough for the board to modify its policy for the reading exam once. NAGB pushed back implementation of the new test from 2007 to 2009 to allow for an additional data reporting cycle using the old test.

Moving forward, NAGB is considering three main options for preserving the trend line. The most conservative option would give a portion of students the old test and a second portion the new test in 2009 and would report scores from both assessments on two "overlapping" trend lines.

Two other options involve augmenting the new test with a series of questions from the old test so that results would remain comparable and continue to be reported on the existing trend line.
NAGB heard the first feedback at its quarterly board meeting last week from a variety of stakeholders. At least two groups backed the more conservative option.

"If the [test] constructs [are] different, it would be most accurate to represent them as two separate scales," said Theresa Siskind, chair of a Council of Chief State School Officers subcommittee on education information.

Her opinion was seconded by George Bohrnstedt, chairman of the NAEP Validity Studies Panel.
But Skip Kifer, a member of the NAEP Design and Analysis Committee, supported the integrated approach, calling it more "forward looking." He cautioned, however, that the approach should allow the board to fall back on reporting the results on two separate scales if it ultimately proves technically infeasible to link the two exams.

"No matter what decision you make, there are going to be quite a number of issues you will have to address," he cautioned the board.

NAGB's decision also could hinge on the results of two outstanding reports commissioned by NCES. One of them will attempt to establish statistical correlations between the scales on the old and new reading tests. The other will analyze the similarities between the two tests' reading passages and questions.

Those reports are due out by next May, Carr said, which means that NAGB will likely have to make a final decision by next summer in order to give officials enough time to prepare for the 2009 administration.

Though the matter is far from settled, some NAGB members predicted challenges in bridging the two exams while maintaining valid and reliable results.

"As we did our work, our assumption was that this was indeed a very different test from the one that preceded it," said Amanda Avallone, a principal and teacher who served on the committee that developed the new test framework. "They are very, very different in what they are asking of students."
