New School Year, New Assessment Data
As I mentioned last week, it’s back-to-school season in Colorado. As it turns out, it’s also get-your-test-scores-back season. Yes, that’s right. We have a whole raft of new data to dissect and discuss. Hooray!
I see you looking at your calendar, and I know what you’re thinking: Didn’t students take these tests, like, last spring? Well, yes. Yes, they did. And you’re not the only one who finds the delay perplexing. Unfortunately, that reporting lag causes some major problems for local school and district leaders looking to make adjustments for the new academic year. To make matters worse, the recently released PARCC scores only cover state-level data. That means district- and school-level data in English language arts and mathematics won’t be available until later this month.
In fairness, releasing the scores in August is significantly better than releasing them in, say, November. And I should mention that scores from the older TCAP tests were also released in August. Still, one of the promises of computer-based online testing was that it would get valuable data into the hands of educators faster. That simply hasn’t happened. Maybe the delay has something to do with the fact that the 2015 testing compromise legislation added the option of taking a paper-and-pencil version of the test, but I’m not sure that explanation will do much to bolster support for the faltering PARCC assessments.
But we can save a discussion on the merits of PARCC itself for another time. We have data to look at, including data from Colorado’s own CMAS tests in social studies and science and from the venerable ACT. We’ll hit the highlights, but you should feel free to dig in on your own if you’d like. You can find PARCC English and math data here, CMAS social studies and science data here, and ACT data here.
I’m sure most of you remember the performance plummet Colorado experienced on the 2015 PARCC assessments, on which fewer than half of Colorado’s students “met” or “exceeded” expectations. You probably also remember the various possible explanations for the drop—a temporary “implementation dip,” harder standards and tests, or problems with the testing instrument itself. If you’re hoping I’m going to tell you definitively which of those explanations is correct today, you’re going to be sorely disappointed. Still, how 2016 scores look in comparison to 2015 scores is instructive, and I can hazard a guess.
So, how did we do in 2016? Unfortunately, not much better than we did in 2015 overall. Here are some handy comparative graphs from Chalkbeat Colorado’s report on PARCC math and ELA scores:
As you can see, things are looking pretty flat. There are even a few areas where students did worse in 2016 than they did in 2015—including in third-grade English, which is a critical milestone. And no matter how you slice the data, woefully low numbers of Colorado students—one-third to one-half in most grade levels—are where they should be in these critical subjects.
Things don’t look much better on the science and social studies CMAS tests, which are notably distinct from PARCC. In fact, they look a lot worse. Sadly, the low numbers come as no surprise in social studies; I’ve been harping for some time about the frighteningly inadequate education our students receive on that side of the academic spectrum. Don’t worry, we’ll beat that particular dead horse again some other time. For now, here are some graphs to help you visualize the CMAS results:
The short version of all of this is that things didn’t look great for Colorado in 2016. But there are a few bright spots buried in the gloom:
- Students did better in math in grades three, four, and five. They also made small gains in English Language Arts (ELA) in grades four, five, and eight. Science scores also improved in grade five. None of these improvements were spectacular, but we should always remember that improvement in education tends to be incremental. We’ll need to wait and see if the positive trends continue.
- High school performance in math looks a little more promising when broken out by subject, with more than 70 percent of students making the grade in Algebra II and about 60 percent doing so in Integrated Mathematics III. Unfortunately, I can’t compare these scores to last year’s because the populations of students tested changed after testing was eliminated in 10th and 11th grades.
- Statewide ACT scores were up almost across the board. Only math scores showed no improvement this year. Whether this trend will (or can) continue once our state finishes the somewhat baffling move to the SAT next year remains to be seen.
- Achievement gaps remain distressingly wide, but there was significant positive movement among minority groups in elementary school math. Things look considerably less rosy in mathematics in later grades and in reading, but there does appear to have been at least some positive movement among some minority groups in some instances.
I’m not sure any of this information really answers the question of what’s going on in Colorado. Scores certainly haven’t shot up, which pours some cold water on the “implementation dip” theory. Then again, we shouldn’t discount the fact that there has been a little positive movement in some areas.
It’s also hard to say with any degree of confidence whether the continued low performance is due to a problem with the testing instruments themselves, though I will point out that this argument is somewhat undermined by the fact that these results broadly track with Colorado’s performance on the gold-standard NAEP assessment. That’s not to say that there aren’t serious issues with the PARCC exam—there most certainly are, and I wouldn’t lose a wink of sleep if the test were replaced tomorrow—but I’m not sure we can fairly blame our poor performance on the test itself.
Overall, I think what we’re seeing is further evidence of something we’ve known all along: There is a lot of work to do when it comes to improving Colorado education. Brilliant analysis, I know. That’s why they pay me the big bucks.
See you next time!