Kevin Drum seems to swing and miss: Everybody gets to be wrong once in a while.
In our view, Kevin Drum exercised his option today. We refer to his post about the new NAEP scores for twelfth-graders:
DRUM (5/7/14): I periodically try to remind everyone that test scores for American students have not, in fact, plummeted over the past few decades. In fact, they're up. To the extent that standardized tests can measure learning, American kids simply aren't doing any worse than kids in the past. They're doing better.

But there's always been a caveat: this is only for grade school and middle school kids. All those test score gains wash out in high school, and today brings the latest evidence of this. Scores from the 2013 NAEP—widely considered the most reliable national measure of student achievement—are now available for 12th graders, and they confirm what we've known for a while. In reading, scores have been basically flat since 1992, and the scores for every racial subgroup have been pretty flat too. Math has only been tested since 2005, and scores have risen a few points since then. But not enough to demonstrate any kind of trend.

As he continues, Drum notes that changing drop-out rates make it hard to compare twelfth-grade scores over time. But what he says and implies about twelfth-grade math seems to be basically wrong on its face.
For starters, it isn’t true that math has only been tested since 2005. In this, the study called “The Main NAEP,” twelfth-grade math has been tested since 1990.
(All data can be accessed through the NAEP Data Explorer. Click here, then click on MAIN NDE. From there, you’re on your own.)
For some technical reason we don’t understand, the NAEP Data Explorer only lets you compare twelfth-grade math scores over two time periods. First, you can compare math scores from 1990 through 2000. Then, you can compare math scores from 2005 through 2013.
(Reading scores can be compared all through the 21-year period starting in 1992.)
Over each of these two time periods, all four major student groups have shown fairly solid score gains in math. According to the NAEP Data Explorer, these are the gains recorded in math from 1990 through 2000:
Score gains in math, Grade 12 NAEP, 1990-2000
White students: 7.10 points
Black students: 6.11 points
Hispanic students: 6.61 points
Asian students: 6.86 points

By a very rough rule of thumb, ten points on the NAEP scale has often been compared to one academic year. Applying that very rough rule of thumb, those were fairly decent score gains over that ten-year period.
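For what it's worth, the arithmetic behind that rule of thumb is simple to sketch. The figures below are the gains quoted in this post; the ten-points-per-year conversion is only the rough heuristic described above, not an official NAEP metric:

```python
# Rough sketch: convert NAEP score gains to approximate "academic years"
# using the very rough rule of thumb (ten points ~ one academic year).
POINTS_PER_ACADEMIC_YEAR = 10.0  # heuristic only, not an official conversion

# Grade 12 math gains, 1990-2000, as reported by the NAEP Data Explorer
gains_1990_2000 = {
    "White": 7.10,
    "Black": 6.11,
    "Hispanic": 6.61,
    "Asian": 6.86,
}

for group, gain in gains_1990_2000.items():
    years = gain / POINTS_PER_ACADEMIC_YEAR
    print(f"{group} students: +{gain:.2f} points, roughly {years:.2f} academic years")
```

On that rough reading, each group gained something like two-thirds of an academic year over the decade.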
Similar score gains were recorded from 2005 through 2013, although the NAEP was now working with a completely different numerical scale:
Score gains in math, Grade 12 NAEP, 2005-2013
White students: 4.32 points
Black students: 5.24 points
Hispanic students: 7.67 points
Asian students: 11.08 points

The NAEP Data Explorer provides no way to compare scores from 1990 with scores from 2013. It provides no way to measure score changes from 2000 to 2005.
That said, it looks to us like score gains were being recorded by all four demographic groups during both these periods.
Does the “ten point” rule still apply to these twelfth-grade scores, even under the new scoring scale which has been in use since 2005? We can’t answer that question. We don’t normally work with twelfth-grade scores, given the interpretive difficulties introduced by the drop-out question.
But Drum was wrong when he said that math has only been tested since 2005. And overall, we would say there has been a definite upward trend.
This confusion illustrates a few basic points about the way our society treats this general topic:
How should we sensibly interpret these twelfth-grade math scores? In an even slightly rational world, the mainstream press corps would have pursued that question long ago.
Journalists would have asked NAEP officials about the interpretive problems surrounding the change in drop-out rate. They would have asked about the ten-point rule. They would have asked NAEP officials about the change in the NAEP scoring scale which occurred in 2005.
Journalists haven’t asked these questions, and they never will. Let’s state what is blindingly obvious:
Aside from their utility in promoting elite propaganda about public school failure, no one gives a flying fig about the nation’s test scores. Your “journalists” simply don’t care. Their coverage is largely a sham.
The most basic questions have gone unasked and unanswered for many years. As we’ve noted again and again, the most basic facts get completely misstated all over the press corps. And alas! When Drum decided to get something wrong, he decided to pick this topic.
Out of all the gin joints in all the world, he had to walk into the NAEP!