Trying to Measure "Non-Cognitive Skills" Beats "Deja Vu All Over Again"
An old baseball player once famously said, “It’s deja vu all over again” (or so my Education Policy Center friends would have me believe). Little voices have been asking me when I’m going to write something about the latest round of TCAP results — Colorado’s annual state testing across grade levels in math, reading, writing, and science. But first, I had to figure out what year it was.
Wednesday’s headline at Ed News Colorado started out “State TCAP scores mostly flat….” In August 2012, the same publication reported the release of state test results under the headline “State scores mostly flat….” So I didn’t know how worthwhile it would be to write about last year’s news on a blog that’s already two days behind the curve.
There are a couple of interesting notes. Denver Public Schools continues to lead the way in academic growth, but still has a healthy distance to travel. In most subject areas, another reform-minded district with similar student demographics — Harrison 2 — made some bright marks of progress as well. But rather than rehash the limited insights TCAP gives us, I want to bring your attention to another possible way to understand student learning success.
Jay Greene (himself brilliant) introduced me to this “brilliant new measure of non-cognitive skills.” Demonstrating proficiency in math and language arts certainly is important, but what about other factors that may indicate how well students might achieve in life? More and more researchers believe non-cognitive skills are important but haven’t figured out how to measure them effectively:
[Greene’s student Collin] Hitt and [Julie] Trivitt have taken an enormous step forward to solve this problem. They have discovered that student non-response on surveys (not answering questions or saying they don’t know) is an excellent measure of non-cognitive skills that are strongly predictive of later life outcomes. In particular they examined survey response rates from the National Longitudinal Study of Youth (NLSY) given to students ages 13 to 17 in 1997. The number of items that students answered was predictive of the highest level of education students attained by 2010, controlling for a host of factors including measures of their cognitive ability. If students care enough to answer questions on a survey they are more likely to care enough to pursue their education further.
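The core of the measure is refreshingly simple: count how many survey items a student skipped or answered “don’t know.” Here’s a minimal sketch of what that counting might look like — the student IDs, answer values, and records below are all made up for illustration, not taken from the NLSY or the Hitt/Trivitt study:

```python
# Hypothetical sketch: tally survey non-response per student.
# All names and data here are invented for illustration.

NON_RESPONSES = {None, "don't know", "refused"}

def non_response_count(answers):
    """Count items left blank or answered 'don't know'/'refused'."""
    return sum(1 for a in answers if a in NON_RESPONSES)

# Toy survey records: student id -> list of item answers
surveys = {
    "s1": ["agree", "disagree", None, "don't know", "agree"],
    "s2": ["agree", "agree", "agree", "disagree", "agree"],
}

scores = {sid: non_response_count(ans) for sid, ans in surveys.items()}
# s1 skipped two items; s2 answered everything.
```

The researchers’ actual analysis then treats a count like this as a predictor of later attainment while controlling for cognitive ability and other factors — the hard statistical work the toy tally above obviously doesn’t capture.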
As Greene goes on to explain (and to really grasp it you need to read the whole thing), this approach might help us bridge the gap between modest improvements on test scores and strong gains in student attainment (e.g., graduation rates) — like we’ve seen in the Milwaukee school choice program. What else is going on there? A lot of work needs to be done, but they’ve already helped to expand the thinking in my little brain.
One thing is for certain. There’s always something new and interesting to write about in the world of K-12 education, so less “deja vu all over again” is just fine with me!