In Two Major Studies on Academic Standards, Colorado is Statistical Oddball
How did Colorado get to be the oddball? There’s got to be more to it than just giving me something to tell you about. Oddball at what? you ask. Okay, let me back up and give you a little context.
Yesterday Harvard professor Paul Peterson wrote on Education Next about a new U.S. Department of Education report rating state math and reading standards for 4th and 8th grade. Though USDOE’s report didn’t acknowledge it, Dr. Peterson and his team had published very similar research — comparing state standards to the “gold standard” National Assessment of Educational Progress (NAEP) — just a year ago:
Every state, for both reading and math (with the exception of Massachusetts for math), deems more students “proficient” on its own assessments than NAEP does. The average difference is a startling 37 percentage points.
Interestingly, the new USDOE report concludes:
All NAEP scale equivalents of states’ reading standards were below NAEP’s Proficient range; in mathematics, only one state’s NAEP scale equivalent was in the NAEP Proficient range (Massachusetts in grades 4 and 8).
A case of deja vu? Though the Education Next and USDOE studies used somewhat different methods to compare the data, they came up with almost the exact same answers. Peterson noted yesterday that the correlations between the findings for all states — not just Massachusetts — were statistically very high, with one notable exception:
Colorado is the one state where we provide substantially different rankings. Ednext ranked it 4th; the Department says it is 45th. I suspect the difference is due to a change in standards in Colorado, but I invite readers to throw light on the discrepancy.
Why did last year’s survey find our state among those with the highest math and reading standards, while the new government study places us among the lowest? I’m not sure how to explain that away. Colorado’s State Board of Education adopted new academic standards in 2009, but the state’s assessments have yet to make the transition. Is there something significantly flawed in either the state-reported data (Ed Next) or the school sample data (USDOE), but only for Colorado? Could the feds be relying on old data? Frankly, I’m baffled.
And I don’t like being in the oddball state. But I would be glad if someone got to the bottom of it so we could know the truth. As for whether it was worthwhile for the USDOE to reproduce very similar work already done by private researchers, only to find a major discrepancy for Colorado… well, that’s the million-dollar question.