What we were talking about: Yesterday, new reading and math scores were released from the National Assessment of Educational Progress (NAEP), the widely praised “gold standard” of domestic testing.
We refer to Grade 12 scores from the 2013 testing. Yesterday, we discussed the new math scores (click here).
This morning, the New York Times and the Washington Post tried to report the new data.
At the Washington Post, education reporter Emma Brown scattered errors, omissions and imponderables all through her report. Consider this early passage:
BROWN (5/8/14): Also called the nation’s report card, NAEP is widely regarded as the most consistent measure of U.S. student achievement over time. Since the 1990s, it has been administered every four years to high school students and every two years to students in the fourth and eighth grades.

In fact, the NAEP conducts two parallel studies, each of which is widely discussed. The so-called “Main NAEP” did start in “the 1990s.” But the so-called “Long Term Trend” assessment started in 1971.
Younger students’ results on the 2013 NAEP were released in November and showed incremental progress, continuing a slow but upward long-term trend. Twelfth-grade performance, by contrast, has been stagnant in recent years, and senior achievement in reading has declined since the early 1990s.
Charitably, you can write that off as a minor type of omission. But how about the highlighted statement? Is it true that “achievement in reading has declined since the early 1990s” on the twelfth-grade level?
We’re not sure. Here’s why:
First, it doesn’t really make sense to consider these scores without engaging in “disaggregation”—without checking the scores which have been attained by various student groups.
How have black kids done? How about Hispanics? Brown doesn’t bother with this.
Second, twelfth-grade comparisons are complicated by changes in the drop-out rate. Later in her report, Brown notes that Arne Duncan announced this week that last year’s graduation rate was the highest ever.

It’s good when fewer kids drop out, but it tends to weaken the overall twelfth-grade population pool. In theory, this complicates comparisons over time.
Brown doesn’t mention this either. Meanwhile, note the way the New York Times’ Al Baker blows right past this point:
BAKER (5/8/14): In reading, 38 percent of seniors across the country achieved proficiency last year—compared with 40 percent in 1992, the first year for which data on seniors was available. The lack of progress was striking since elementary- and middle-school students have shown some growth during that time, as have graduation rates, suggesting that learning gains were wearing off in the high school years even as more students were earning diplomas.

In that passage, Baker puzzles over the drop in proficiency rate even as he notes that graduation rates have improved. It doesn’t occur to him, even theoretically, that the two factors could be related.
Are the two factors related? Given the way our press corps works, we don’t know the answer to that, and we never will.
As we look at the disaggregated data, another point jumps out from the reading scores, which start in 1992. Blacks, Hispanics, and Asians all showed weirdly large drops in average scores from 1992 to 1994. This suggests the possibility that sampling errors may have occurred in the first year of testing.
Adopting 1992 as the starting point, Brown reports gloomy news about reading. By way of contrast, if we take 2002 as our starting point, these score gains have occurred:
Score gains in reading, Grade 12 NAEP, 2002-2013

White students: 5.48 points
Black students: 0.71 points
Hispanic students: 3.58 points
Asian students: 11.45 points

Have those score gains been retarded by declines in the drop-out rate? We don’t know, and we never will, given the norms of the press corps.
If you want to evaluate progress in schools, you pretty much have to disaggregate scores. On the Grade 12 level, the drop-out rate is a complicating factor.
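The composition point is easy to sketch with a toy calculation. The numbers below are invented for illustration and are not NAEP data:

```python
# Hypothetical numbers, not actual NAEP data: a sketch of the
# composition effect described above. Every subgroup's average score
# rises between two test years, yet the overall average falls, because
# a lower drop-out rate keeps more lower-scoring students in the
# twelfth-grade pool.

def overall_average(groups):
    """Pooled average score; groups is a list of (count, avg_score)."""
    total = sum(count for count, _ in groups)
    return sum(count * score for count, score in groups) / total

# Earlier year: a smaller, more selective senior class (made-up numbers).
earlier = [(70, 300.0), (30, 260.0)]   # (students, average score)

# Later year: both groups gain 3 points, but the lower-scoring group's
# share of the pool grows as fewer students drop out.
later = [(50, 303.0), (50, 263.0)]

print(overall_average(earlier))  # 288.0
print(overall_average(later))    # 283.0, lower despite both groups gaining
```

This is the familiar Simpson's-paradox pattern: the pooled average can move against every subgroup's trend when the groups' shares shift, which is exactly why disaggregation matters when the drop-out rate is changing.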
It’s also true that initial scores in the 1992 reading testing seem to have been peculiarly high. If we were running the Post or the Times, we’d want our reporters to question NAEP officials on all these points.
Alas! At the Post and the Times, education reporters mainly seem to be trying to make it through the night. As a general matter, we stopped having a “press corps” long ago. Basic competence is especially low in the reporting of test scores.
How do Grade 12 math scores look? Pretty good! See yesterday’s post.
First, you cry: For unknown reasons, Brown did include the following point from one of our “education experts”:
BROWN: Some analysts contend that the 12th-grade scores are evidence only of an unsurprising truth: that high school seniors are not motivated to try their hardest on tests in which they have no real stake.

“We all remember exactly how engaged your 17-year-old high school senior is,” said Frederick Hess of the conservative American Enterprise Institute. Hess said skepticism about high school results should serve as a reminder not to read too much into younger students’ scores, as well.

According to Hess, “high school seniors are not motivated to try their hardest on tests in which they have no real stake.”
That may be true, of course. But presumably, it has always been true. That tends to wash things out.
Theoretically, if today’s unmotivated seniors know more than earlier unmotivated seniors did, they would tend to perform better than the earlier group, despite their ongoing lack of motivation. This undermines the attempt at a point by Hess, who may be experiencing a delayed onset of spring fever.
Yesterday, we saw substantial score gains in Grade 12 math recorded by all four major demographic groups. The lack of motivation presumed by Hess didn’t stop these gains from occurring.
“Nothing gold can stay,” Frost insisted. When journalists like Brown talk to experts like Hess, few good things can result.