Where that rough rule of thumb comes from: Yesterday, we received an e-mail concerning the origin of that very rough rule of thumb, the rule which says that ten points on the NAEP scale is roughly equal to one academic year.
At one point, our mailer linked to this academic paper. On page 5, the authors describe a 10-11 point rule of thumb, and give a sense of its origin:
LUBIENSKI AND LUBIENSKI (2006): NAEP mathematics results are reported on a 0-500 scale, with the 2003 mean being 235 at grade 4 and 278 at grade 8. The NAEP mathematics scale was originally designed to allow cross-grade comparisons, indicating that the 43-point difference between grade 4 and grade 8 would mean that a gap of 10 or 11 points represents roughly one grade level. NAEP no longer maintains that the scale is consistent across grades. However the scales have not changed markedly, and therefore the idea that “10-11 points is approximately 1 grade level” is still often used as a helpful, albeit rough, guide for interpreting score differences.

That has always been our general understanding of where this rough rule comes from. It seems to us that there was once a pristine 40-point difference between NAEP cut-off scores for “proficiency” at the Grade 4 and Grade 8 levels.
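The arithmetic behind that passage is simple enough to sketch. Using only the figures the Lubienskis cite (the 2003 grade 4 and grade 8 means, four grade levels apart), a rough points-per-year figure falls out directly:

```python
# Illustrative arithmetic only; both means come from the
# Lubienski and Lubienski (2006) passage quoted above.
grade4_mean = 235  # 2003 NAEP math mean, grade 4
grade8_mean = 278  # 2003 NAEP math mean, grade 8

gap = grade8_mean - grade4_mean   # 43 points between the two grades
points_per_year = gap / 4         # the grades are four school years apart

print(gap)              # 43
print(points_per_year)  # 10.75 -- hence "10 or 11 points per grade level"
```

That 10.75 figure is where the "10-11 points" phrasing comes from; rounding it to an even ten gives the cruder version of the rule.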
Still, our mailer wanted more. He started like this:
EMAILER: This email in-part pertains to your often quoted rule of thumb regarding interpreting score gains on the NAEP: that 10 points gives a very rough approximation of one academic year. I've been looking around for other instances of this rule of thumb being used and have not been able to find very many. I'm not presenting this as a criticism, rather I'm looking for some means of defending the use of this rule of thumb should I ever be asked about it while discussing student score gains on the NAEP...

Where can he find this rule of thumb being used? Offhand, we can’t give specific citations, and it isn’t easy to search for. But education reporters and education “experts” will often apply this ten-point rule when they discuss achievement gaps on the NAEP. They won’t specifically declare that they are applying a ten-point rule. They will simply (for example) report a 20-point achievement gap, then say it corresponds to roughly two years of schooling.
You can see a similar performance in Amanda Ripley’s new book, this time with respect to the PISA. At several points in the book, Ripley converts score differences on the PISA into a rough number of academic years. In this example, she compares an 84-point gap on the PISA scale to two academic years:
RIPLEY (page 158): African-American students did poorly on PISA, heartbreakingly so. On average, they scored eighty-four points below white students in reading in 2009. It was as if the white kids had been going to school two extra years. The gap between white and African-American students showed itself in dozens of other ways too, from graduation rates to SAT scores. Generally speaking, up to half the gap could be explained by economics; black students were more likely to come from lower-income families with less-educated parents.

In her text, Ripley doesn’t state her basis for that conversion, in which 84 points is compared to two school years. In an endnote, she says this: “In general, thirty-nine score points in PISA is considered the equivalent of one year of formal schooling.” She doesn’t state her source for that rule of thumb.
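Whatever its source, the endnote figure makes Ripley's conversion easy to check. Dividing the gap she reports by her own points-per-year figure does indeed land near two:

```python
# Checking Ripley's conversion against her own endnote figure.
pisa_gap = 84         # white/black PISA reading gap, 2009, per Ripley
points_per_year = 39  # Ripley's endnote: 39 PISA points ~ one school year

years = pisa_gap / points_per_year
print(round(years, 2))  # 2.15 -- roughly the "two extra years" in her text
```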
In a similar way, we have often seen the ten-point rule applied in practice—always when discussing the gaps, never when discussing the large score gains which have been achieved on the NAEP.
Alert newspapers would have done the following long ago. First, they would have reported the score gains which have been achieved by different groups on the NAEP. This is virtually never done. Except in aggregate form, the score gains are essentially never reported.
Second, they would have interviewed NAEP officials, looking for ways to interpret the score gains. At the Grade 4 level, black kids gained 37 points in math between 1992 and 2013. How much academic gain might we associate with a score gain of that size?
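If the same rough ten-point rule were applied to that gain (a big if, since NAEP no longer claims a cross-grade scale), the conversion would look like this:

```python
# Applying the rough ten-point rule of thumb to the gain cited above.
# This is illustrative arithmetic, not an official NAEP interpretation.
gain = 37             # grade 4 math gain by black students, 1992-2013 (NAEP)
points_per_year = 10  # the rough rule: ten points ~ one academic year

print(round(gain / points_per_year, 1))  # 3.7 -- nearly four academic years
```

That is the sort of striking figure the rule of thumb produces when it is pointed at the gains rather than the gaps.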
Newspapers should have pursued that question long ago. In truth, there’s no sign that they ever will. Our big newspapers won’t even report these large score gains, let alone try to interpret them.
We live inside a badly broken intellectual culture. Our reporters know how to do one thing. They know how to memorize and repeat the scripts preferred by elites.
How much better are black kids doing? What might explain their apparent progress? The New York Times doesn’t care. Neither do the millionaires who entertain you night after night on The One True Channel.
They enjoy the antics of Rob Ford. Black kids can go straight to Hell—or to Toronto, whichever comes first. These are the actual values of our millionaire pseudo-press.
We know—they seem like lovely people. We’re simply telling you what they do—whose lives they stoop to discuss.