Which study is correct: Are high school seniors doing better in math? We have no way of knowing!
We do know where the data are. The best data, from the so-called “Main NAEP,” seem to suggest that high school seniors are doing better in math—have been recording score gains in math over the past twenty-five years.
“Hold on!” the dead-enders will cry. What about the other NAEP study? What about the smaller score gains on the so-called Long-Term Trend Assessment?
The question is perfectly valid. In the study that’s called the Main NAEP, the NAEP tests high school seniors in reading and math. In the study that’s called the Long-Term Trend Assessment, the NAEP tests 17-year-old students, no matter what grade they’re in. (Most are juniors or sophomores.)
The two NAEP studies use different math tests. Because we read American newspapers, we’ve never encountered a real attempt to explain the difference between the math tests.
It’s our impression that the Main NAEP test is geared to the math curriculum as it currently exists, while the Long-Term Trend math test has largely stayed the same over its forty-plus years of use.
Warning! That’s our impression, but it could be totally wrong. As noted, we read the American press corps, which uses the NAEP, almost exclusively, as a vehicle for driving narratives about our failing schools, which clearly ought to be charters.
(For the record, we aren’t opposed to charter schools in any way at all. We are opposed to ludicrous claims which get made on their behalf.)
Whatever! The Main NAEP tests students in Grade 12. The Long-Term Trend Assessment tests 17-year-old students—mostly juniors and sophomores.
Here’s the conundrum—score gains have been larger in math on the Main NAEP. Indeed, 17-year-old black students have recorded no score gains in math at all on the Long-Term Trend Assessment.
Adjusting for a procedural change which occurred in 2004, this is what score gains look like on the Long-Term Trend Assessment:
Gains in average math scores, 1990-2012
Long-Term Trend Assessment, 17-year-old students
National public schools
White students: 6.74 points
Black students: -0.32 points
Hispanic students: 8.45 points
Asian-American students: 19.84 points
For whites and Hispanics, those score gains aren’t huge, though they also aren’t non-existent. For black kids, there are no score gains at all.
A different pattern seems to obtain for black kids on the Grade 12 Main NAEP. And just for the record, these are the score gains recorded by 13-year-old and 9-year-old students on the Long-Term Trend Assessment:
Gains in average math scores, 1990-2012
Long-Term Trend Assessment, 13-year-old students
National public schools
White students: 18.33 points
Black students: 19.17 points
Hispanic students: 18.22 points
Asian-American students: 41.43 points
Judged by normal rules of thumb, those are large score gains. (Warning! There’s a very large Standard Error for Asian-American scores in 1990, when the “N” for that group was small.)
Gains in average math scores, 1990-2012
Long-Term Trend Assessment, 9-year-old students
National public schools
White students: 19.31 points
Black students: 22.25 points
Hispanic students: 20.95 points
Asian-American students: 30.75 points
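The three blocks of gains above can be put side by side in a short script. This is just a sketch for comparison—the numbers are exactly the gains quoted above, and the layout is our own:

```python
# Math score gains on the Long-Term Trend Assessment, 1990-2012,
# national public schools, in scale-score points (figures as quoted above).
gains = {
    "17-year-olds": {"White": 6.74, "Black": -0.32,
                     "Hispanic": 8.45, "Asian-American": 19.84},
    "13-year-olds": {"White": 18.33, "Black": 19.17,
                     "Hispanic": 18.22, "Asian-American": 41.43},
    "9-year-olds":  {"White": 19.31, "Black": 22.25,
                     "Hispanic": 20.95, "Asian-American": 30.75},
}

# Print one row per age group and student group, so the anomaly
# stands out: every gain is large except among 17-year-olds,
# where black students show a (statistically negligible) decline.
for age, by_group in gains.items():
    for group, gain in by_group.items():
        print(f"{age:13s} {group:15s} {gain:+7.2f}")
```

Laid out that way, the oddity is hard to miss: the 17-year-old rows are the only ones that fail to show double-digit gains.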
Thirteen-year-old students have recorded large gains on the Long-Term Trend math test. The same is true for 9-year-old students.
That's true of black students in each group. This leads to some obvious questions:
Why have score gains on the Long-Term Trend been smaller among 17-year-old students? Why have 17-year-old black students shown no gains at all?
Black kids have shown substantial gains everywhere else, including at all grade levels on the Main NAEP. Why are scores for 17-year-old black students flat? Why are gains by white and Hispanic students relatively small at that age?
We don’t know how to answer those questions. We’ll stick with our standard excuse:
Such questions are never asked by the American press, which never presents a serious discussion of any such topics at all. Quite plainly, the American press corps uses the NAEP in exactly one way—to cherry-pick data in support of the claim that our schools are stagnant or failing.
Some journalists may be doing this deliberately. Almost surely, most of our “education reporters” have never looked at a set of NAEP scores in their lives. They simply channel the standard claims which come to them from “educational experts,” almost all of whom are on the dole from Bill Gates and the rest of the billionaire “education reform” funder class.
As far as we know, Gates is fully sincere in his efforts. But on the whole, the narratives which control our discourse about public schools serve corporate and political interests which are blatantly obvious. It’s all about spreading gloom and doom and refusing to report the overall, rather large gains in test scores.
Why are math scores somewhat flatter among high school students on the Long-Term Trend? We have no idea! It may be because of the nature of the math test used in that study. If we had an actual press corps, somebody would have asked!
Why have black students recorded large score gains in math at ages 9 and 13, but not at age 17? Just a guess—declining drop-out rates would show up at age 17, since lower-scoring students who once left school now remain in the tested pool, holding averages down even as instruction improves. For various reasons, we’ll guess that the black student population might have been more affected by that (desirable) change than the other three groups. Why haven’t reporters asked?
We’ll finish with a basic point, the most basic point of all:
Our mainstream “education reporting” is largely propaganda. It has been that way for a very long time. Whatever anyone’s intentions may be, it’s clear what interests are served.
You’re gloomily told, again and again, that our schools are stagnant or failing. You’re told that miraculous Finland is just sooo much better.
You’re told that nothing has worked in our schools. You’re told that we need more charter schools and a whole lot more “reform!”
Overwhelmingly, the pattern on our “federal tests” is hard to square with that narrative. For that reason, you’re never told about the score gains which predominate in both NAEP testing programs.
Routinely, you’re told about the gaps; the gains go unreported. When reporters do talk about gains, they turn to the Long-Term Trend Assessment, 17-year-olds only.
That’s what Michael Petrilli did in that horrible blog post. And by the way, note this:
Above, we said we were “adjusting for a procedural change which occurred in 2004.” At that time, the NAEP introduced “accommodations” which let them test kids with certain types of challenges—a relatively small group of kids whose counterparts hadn’t been tested before that.
Even with “accommodations,” including these kids in the NAEP tends to drive average scores down a few points. If you forget to make the basic adjustments, you’re absent-mindedly making test scores look “flatter” over the years.
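The arithmetic of that flattening is simple enough to sketch. The numbers below are purely hypothetical—they are not actual NAEP scores, and the size of the accommodations offset is our invented placeholder—but they show why skipping the adjustment understates a gain:

```python
# Hypothetical illustration of the 2004 accommodations adjustment.
# All numbers here are invented for the sake of the arithmetic.
score_1990 = 305.0   # average score, no accommodations offered (hypothetical)
score_2012 = 309.0   # average score, accommodations offered (hypothetical)
offset = 2.5         # points by which including accommodated students
                     # lowers the average (hypothetical)

# Naive comparison: mixes two differently-defined testing populations.
naive_gain = score_2012 - score_1990

# Adjusted comparison: put the later score back on the pre-2004 basis
# before subtracting, so both years cover a comparable population.
adjusted_gain = (score_2012 + offset) - score_1990

print(naive_gain)     # smaller: the "flattened" gain
print(adjusted_gain)  # larger: the like-for-like gain
```

In this toy example the unadjusted comparison shaves the gain by exactly the size of the offset—which is the flattening described above.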
Michael Petrilli forgot! If you look at his graphic for 17-year-old students, you’ll see a superscript 1 (essentially, an asterisk) appended to the scores for years before 2004.
But he never explains what the asterisk means! What kind of “expert,” in any field, conducts his business like that?
The asterisk meant he was comparing pre- and post-accommodations scores without adjustment—flattening the apparent gains, in line with standard procedure. For many years, that has been the primary business of our “education experts” and the reporters who peddle their scripts.
They won’t tell you about score gains. Instead, they happily say that we need charter schools.
For ourselves, we aren’t opposed to charter schools. We are opposed to scams.
Where do test scores come from: For all data from the Long-Term Trend Assessment, just follow these easy steps:
First, click this. Then, click on LTT NDE (Long-Term Trend NAEP Data Explorer).
Click on “I agree to the terms above.” From there, you’re on your own! It’s just like Dylan said!