**WEDNESDAY, MAY 7, 2014**

**Kevin Drum seems to swing and miss:** Everybody gets to be wrong once in a while.

In our view, Kevin Drum exercised his option today. We refer to his post about the new NAEP scores for twelfth-graders:

DRUM (5/7/14): I periodically try to remind everyone that test scores for American students have not, in fact, plummeted over the past few decades. In fact, they're up. To the extent that standardized tests can measure learning, American kids simply aren't doing any worse than kids in the past. They're doing better.

But there's always been a caveat: this is only for grade school and middle school kids. All those test score gains wash out in high school, and today brings the latest evidence of this. Scores from the 2013 NAEP—widely considered the most reliable national measure of student achievement—are now available for 12th graders, and they confirm what we've known for a while. In reading, scores have been basically flat since 1992, and the scores for every racial subgroup have been pretty flat too. Math has only been tested since 2005, and scores have risen a few points since then. But not enough to demonstrate any kind of trend.

As he continues, Drum notes that changing drop-out rates make it hard to compare twelfth-grade scores over time. But what he says and implies about twelfth-grade math seems to be basically wrong on its face.

For starters, it isn’t true that math has only been tested since 2005. In this, the study called “The Main NAEP,” twelfth-grade math has been tested since 1990.

(All data can be accessed through the NAEP Data Explorer. Click here, then click on MAIN NDE. From there, you’re on your own.)

For some technical reason we don’t understand, the NAEP Data Explorer only lets you compare twelfth-grade math scores over two time periods. First, you can compare math scores from 1990 through 2000. Then, you can compare math scores from 2005 through 2013.

(Reading scores can be compared all through the 21-year period starting in 1992.)

Over each of these two time periods, all four major student groups have shown fairly solid score gains in math. According to the NAEP Data Explorer, these are the gains recorded in math from 1990 through 2000:

Score gains in math, Grade 12 NAEP, 1990-2000

White students: 7.10 points

Black students: 6.11 points

Hispanic students: 6.61 points

Asian students: 6.86 points

By a very rough rule of thumb, ten points on the NAEP scale has often been compared to one academic year. Applying that very rough rule of thumb, those were fairly decent score gains over that ten-year period.
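Applying the ten-point rule of thumb is simple division: points gained, divided by ten. A minimal sketch in Python, using the 1990-2000 gains listed here (the ten-points-per-academic-year conversion is the informal rule of thumb described in this post, not an official NAEP metric):

```python
# Rough rule of thumb from the post: ten points on the NAEP scale
# has often been compared to one academic year. This is an informal
# heuristic, not an official NAEP conversion.
POINTS_PER_ACADEMIC_YEAR = 10.0

gains_1990_2000 = {
    "White": 7.10,
    "Black": 6.11,
    "Hispanic": 6.61,
    "Asian": 6.86,
}

for group, gain in gains_1990_2000.items():
    years = gain / POINTS_PER_ACADEMIC_YEAR
    print(f"{group}: +{gain:.2f} points, roughly {years:.2f} academic years")
```

On that crude reading, each group gained roughly two-thirds of an academic year over the decade.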

Similar score gains were recorded from 2005 through 2013, although the NAEP was now working with a completely different numerical scale:

Score gains in math, Grade 12 NAEP, 2005-2013

White students: 4.32 points

Black students: 5.24 points

Hispanic students: 7.67 points

Asian students: 11.08 points

The NAEP Data Explorer provides no way to compare scores from 1990 with scores from 2013. It provides no way to measure score changes from 2000 to 2005.

That said, it looks to us like score gains were being recorded by all four demographic groups during both these periods.

Does the “ten point” rule still apply to these twelfth-grade scores, even under the new scoring scale which has been in use since 2005? We can’t answer that question. We don’t normally work with twelfth-grade scores, given the interpretive difficulties introduced by the drop-out question.

But Drum was wrong when he said that math has only been tested since 2005. And overall, we would say there has been a definite upward trend.
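One crude way to illustrate that overall trend is to sum each group's gains across the two reporting periods. This is only a back-of-the-envelope gesture; as noted above, the grade-12 scale changed with the 2005 framework, so the two trend lines are not strictly comparable:

```python
# Back-of-the-envelope sum of the gains reported for the two periods.
# CAVEAT: the grade-12 math scale changed with the 2005 framework, so
# adding gains across the break is illustrative only, not a valid
# cross-scale measurement.
gains_1990_2000 = {"White": 7.10, "Black": 6.11, "Hispanic": 6.61, "Asian": 6.86}
gains_2005_2013 = {"White": 4.32, "Black": 5.24, "Hispanic": 7.67, "Asian": 11.08}

for group in gains_1990_2000:
    combined = gains_1990_2000[group] + gains_2005_2013[group]
    print(f"{group}: roughly +{combined:.2f} points across both periods")
```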

This confusion illustrates a few basic points about the way our society treats this general topic:

How should we sensibly interpret these twelfth-grade math scores? In an even slightly rational world, the mainstream press corps would have pursued that question long ago.

Journalists would have asked NAEP officials about the interpretive problems surrounding the change in drop-out rate. They would have asked about the ten-point rule. They would have asked NAEP officials about the change in the NAEP scoring scale which occurred in 2005.

Journalists *haven’t* asked these questions, and they never will. Let’s state what is blindingly obvious:

Aside from their utility in promoting elite propaganda about public school failure, no one gives a flying fig about the nation’s test scores. Your “journalists” simply don’t care. Their coverage is largely a sham.

The most basic questions have gone unasked and unanswered for many years. As we’ve noted again and again, the most basic facts get completely misstated all over the press corps. And alas! When Drum decided to get something wrong, he decided to pick this topic.

Out of all the gin joints in all the world, he had to walk into the NAEP!

That's a big mistake by Drum. He should have noted that twelfth-grade scores rose because leaded fuel was phased out in the seventies.

I wonder if anyone has ever done a study to estimate the impact on intelligence of the phase-out of leaded fuel. Presumably the phase-out caused IQs to increase by some amount, as well as SATs, NAEPs, etc. Based on any of these tests, how much benefit was produced by the phase out of leaded fuel?

Even more important than the increase in test scores is the decrease in crime rates.

David, if you were a regular reader of Kevin Drum, you wouldn't be wondering this. He has been reporting about it for a long time.

It matters whether scores have risen or not because these facts support or refute the narrative about whether teachers are doing their job, whether our schools are failing miserably, whether we need a new common core, and so on. A public that does not know whether kids are learning or not will buy anything anyone wants to say to them about education. It may be that journalists (and those they serve) prefer to keep readers ignorant for that reason, but perhaps I am too cynical.

The LA Times today reported that learning is flat and 12th graders have shown no improvement on NAEP for decades. Woe is us.

The ten point rule is bogus.


"The most basic questions have gone unasked and unanswered for many years." Not any more. Thanks, Anonymous @5:50P.

Anytime, 6:13P.

OMB (Data Mixing in the Gin Joint...Cherries Optional)

ReplyDelete"Out of all the gin joints in all the world, he had to walk into the NAEP!"

"That said, it looks to us like score gains were being recorded by all four demographic groups during both these periods." The OTB

Let's have a double math, because we can't handle our reading!

Reading Scores 1992-2013

White 297.....297

Black 273.....268

Hispanic 279.....276

And please, don't serve us any of that Long Term Assessment.

It gives us Gap since 1988.

KZ

Allow me to ask the dumb question : Given that kids are, from year to year, presumably similarly competent at a given age in dealing with learning skills due to developmental parallels...why would anyone expect significant variation in results ?

The value added that comes from teaching.

Our kids now know they are being clobbered in the PISA derby by the sorry-ass Poles. They are giving it the extra effort now that they know they aren't automatically considered Numero Uno.

In an earlier post, the blogger says a college freshman is a "fuzzy thinker and writer." The jokes write themselves.

Have you heard the one about the blogger, the drop out, and an inexperienced Ivy League lass who walk into a bar and order a sloe gin fuzz?

It’s the dirty non-secret of public education: Public ed is largely a second-tier discipline, presided over by second-tier watchdogs, and second-tier minds are always eager to seize on the newest sensation. Everyone knows—and jokes about—the discipline’s endlessly faddish ways, and the illogical excesses of the “standards movement” are the latest cosmic example. Where else would people think that a bunch of kids who still count on their fingers should be placed in Algebra 1—indeed, that they should take the course six separate times, long after it becomes perfectly obvious that they aren’t gonna pass it? But then, major parts of the “standards movement” never made much sense for these struggling kids—for kids who couldn’t even come close to meeting the standards we already had. What did we really expect to gain when we decided that we’d “raise standards” for them? What exactly was our thinking?

Regarding Drum's mistake and the 10 point rule as it pertains to math.

"Average mathematics scale score results are based on the NAEP mathematics scale, which ranges from 0 to 500 for grades 4 and 8 and 0 to 300 for grade 12. The 2005 mathematics framework initiated minor changes at grades 4 and 8 and more substantial changes at grade 12. This meant that the current trend line could be maintained at the lower grades but a new trend line needed to be established at grade 12.

-----

At grade 12, new achievement levels were established in 2005 based on the revised content framework. As provided by law, NCES has determined that the achievement levels are to be considered on a trial basis and should be interpreted and used with caution."

Wonder why BOB presents score gains and not scores for the two different periods? Because the old scores for 12th grade are based on the 500 point scale and the current ones use a 300 point scale.

Drum said "Math has only been tested since 2005, and scores have risen a few points since then." Somerby had a field day with his error.

If Drum had said "Math has only been tested *with the current exam* since 2005," he would have been accurate.

Sorry. Forgot to sign my comment at 10:01.

And for the benefit of the Troll Patrol, the quotes are from the NCES own description of how to interpret their NAEP scores.

http://nces.ed.gov/nationsreportcard/mathematics/interpret_results.aspx

KZ

OMB (Blindingly Drunk on Flying Fig Gin)

"For some technical reason we don’t understand, the NAEP Data Explorer only lets you compare twelfth-grade math scores over two time periods. First, you can compare math scores from 1990 through 2000. Then, you can compare math scores from 2005 through 2013."

BOB

"The 2005 mathematics framework initiated minor changes at grades 4 and 8 and more substantial changes at grade 12. This meant that the current trend line could be maintained at the lower grades but a new trend line needed to be established at grade 12."

NCES

"Let’s state what is blindingly obvious:

Aside from their utility in promoting elite propaganda about *media* failure, *neither BOB nor BOBfans* gives a flying fig about the nation’s test scores. Your “blogger” simply don't care. Their coverage is largely a sham."

KZ badly paraphrasing, cutting, pasting and distorting.

"Your “journalists” simply don’t care. Their coverage is largely a sham."

I'll ask this then: does it matter if they "care"? Should that matter? Can we make the assumption that they cover what they "care" about? What if they cover what they believe other people care about? What if they cover what they're told to cover?

Could they do a good job covering test scores and education WITHOUT caring about kids in public schools and how they test?

Is there some example of something they "care" about where they do a careful and thorough analysis of numbers where one could compare and say "oh, they obviously CARE about that, given their careful and thorough analysis of numbers!"

Watch the coverage of charter schools. It's all emotional appeals and the same sloppy number analysis, and they "care" about those schools.

So is "caring" what we need or want, or something else? Something like "professionalism" or "dogged pursuit of meaning behind numbers"? I get that "caring" may lead to more solid work, but is it a necessary condition FOR solid work?

I agree with Mr. Somerby that there is an elite consensus that public schools suck which is mostly nonsense and spin, but what kind of reporting would fix this? What attributes would reporters need? Are we sure it's "caring" reporting?

So I take it you disagree that Public ed is largely a second-tier discipline, presided over by second-tier watchdogs, and second-tier minds are always eager to seize on the newest sensation?

Bob, you do know Drum corrected his error? Well, of course you do, since you comment regularly on his blog. Did you note his error in commenting? No, I suppose not. Did you note his correction on your blog? No. I wonder why. Could it be because his update included this?

"Actually, math has been tested since 1990, but the test was revised in 2005 and scores before then aren't comparable to current scores. A crude comparison suggests that scores actually have increased for 12th graders since 1990, perhaps by as much as ten points, though this is in direct contradiction to the long-term NAEP, which shows no gains at all for 17-year-olds. My own guess, based on both of these results, is that math scores have increased slightly since 1990, but probably not enough to really be noticeable."