BLACK KIDS IN PUBLIC SCHOOLS: A fuller set of Grade 12 scores!

FRIDAY, SEPTEMBER 25, 2015

Part 5—In our view, black gains matter:
For the past several weeks, we’ve been reviewing the annual back-to-school gong-show performed by the mainstream press corps.

It happens every autumn! Incompetent, semi-literate journalists roll their eyes and wring their hands about the alleged semi-literacy of the nation’s 9-year-old public school kids.

Those scribes! They cherry-pick their statistics and bungle their logic—and they stick to approved elite scripts. This year, we the people have been encouraged to enjoy these familiar old tales:
This year’s Dick-and-Jane narratives:
1) Our high school seniors are dumber than ever. We need more charter schools!
2) Black kids get suspended from school way too much. It’s mainly because of you-know-who in thirteen Southern states!
3) The switch to charter schools in New Orleans has produced amazing results! My wife works for a charter school org. We need more charter schools!
In fairness, it isn’t just the journalists who have produced these novelized tales:

In the past few weeks, we’ve visited two professors at Penn who produced a horrific study. We’ve reviewed a blog post by an “educational expert” whose expertise is largely confined to slippery sleights of hand.

That said, the journalists always seem to rise to the challenges posed by professors and experts. With regard to that bungled study of those thirteen Southern states, a range of journos at major news orgs leaped to advance the professors’ bungled suggestions.

In some cases, the journalists introduced stunning new errors into the pre-bungled work. But it occurred to none of these journalists to note one basic fact:

More than half the nation’s black kids attend school in those thirteen states! We saw no one cite that fact, which destroys the professors’ familiar and pleasing old tale.

Have worms been eating our journalists’ brains? If so, they’re skinny, wan.

At any rate, the astounding illiteracy of our press corps can’t be discussed in the press. For years, they’ve simply continued to cherry-pick facts to advance their controlling scripts:

Nothing is working in our schools! It’s all because of our public school teachers with their infernal unions!

Why can’t we be more like Finland? We need a lot more charter schools and a whole lot more “reform!”


Journalists love these memorized tales! At the Washington Post, Nick Anderson and his unnamed editors pimped this general script with a September 3 front-page report about the dumbness and decline of our high school seniors.

It wasn’t just those average SAT scores, Anderson said—failing to note an obvious reason why those scores have shown a small decline in recent years. It’s also the “generally stagnant results from high schools on federal tests.”

Or so Anderson wrote. Good God, but these high school seniors are dumb! What the heck explains the mess in These High Schools Today?

As noted, Anderson was explicitly writing about high school seniors. The “federal tests” to which he referred were, one might thereby assume, the National Assessment of Educational Progress (NAEP), which tests twelfth-graders as part of the study which bears this name:

“Main NAEP.”

Have Grade 12 scores been stagnant on the Main NAEP? Last week, we showed you the score gains in Grade 12 math over the past decade. We also explained a basic shortcoming of the Grade 12 Main NAEP in math:

The math test was changed in 2005, creating a statistical gap. Changes in average scores can be observed from 2005 forward. But the NAEP provides no way to compare those scores to scores obtained on the previous math test, which was last given in the year 2000.

That said, we thought we’d give you a larger look at Grade 12 scores down through the years. As Ed McMahon might have said to Johnny:

How stagnant have they been?

Below, you see the score gains recorded from 1990 through 2000 on the original Main NAEP math test. Then, you see the score gains recorded from 2005 through 2013 on the new math test.

We then total the score gains for the eighteen years, over two time spans, which we can review. Important context follows:
Gains in average scores, 1990-2000
Main NAEP, Grade 12 math
National public schools

White students: 7.10 points
Black students: 6.11 points
Hispanic students: 6.61 points
Asian-American students: 6.86 points

Gains in average scores, 2005-2013
Main NAEP, Grade 12 math
National public schools

White students: 4.32 points
Black students: 5.24 points
Hispanic students: 7.67 points
Asian-American students: 11.08 points

Overall gains in average scores, two time spans
Main NAEP, Grade 12 math
National public schools

White students: 11.42 points
Black students: 11.34 points
Hispanic students: 14.28 points
Asian-American students: 17.94 points
A basic bit of context:

Ten points on the NAEP scale is often compared to one academic year. We regard that as a very rough rule of thumb. But it provides a very rough way to imagine what score gains of this size might suggest.
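For readers who want to check the arithmetic, the totaling in the tables above and the very rough rule-of-thumb conversion can be sketched in a few lines of Python. This is an illustration only: the figures are the ones quoted above, and totals may differ from the third table by a hundredth of a point owing to rounding of the underlying NAEP scores.

```python
# Score gains as quoted above (Main NAEP, Grade 12 math, national public schools).
gains_1990_2000 = {
    "White": 7.10, "Black": 6.11, "Hispanic": 6.61, "Asian-American": 6.86,
}
gains_2005_2013 = {
    "White": 4.32, "Black": 5.24, "Hispanic": 7.67, "Asian-American": 11.08,
}

# "Ten points equals one academic year" -- as stressed above, only a very
# rough rule of thumb, not an official NAEP conversion.
POINTS_PER_YEAR = 10

for group in gains_1990_2000:
    total = gains_1990_2000[group] + gains_2005_2013[group]
    years = total / POINTS_PER_YEAR
    print(f"{group}: {total:.2f} points over both spans "
          f"(~{years:.1f} 'academic years', very roughly)")
```

Note that summing the two rounded spans for black students gives 11.35, while the table above shows 11.34; the published totals presumably reflect rounding of the unrounded underlying scores.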

(If some newspaper would break the code of silence and report and discuss the NAEP score gains, we might end up learning more about all such questions.)

A second important point:

These score gains were being recorded as the nation’s drop-out rate was declining.

In the blog post to which we referred, Michael Petrilli explained how this desirable trend might tend to depress average scores over time:
PETRILLI (9/3/15): One explanation [for allegedly stagnant high school achievement] could be America’s rising graduation rate. Students who would have previously dropped out are now staying in school and remaining in the NAEP sample, thereby dragging down the scores. That sounds plausible to me...
Our view? Those Grade 12 score gains look substantial to us as they are. Beyond that, the actual size of the academic achievement may have been disguised by the improved graduation rate, which tends to “drag down” average scores.

Please note: These are the only “federal tests” which measure the achievement of high school seniors. Interpretation of these data has always been made complex by the likely effect of the changing drop-out rate. But let’s note two points about these data:

First point: Even in Grade 12, the Main NAEP seems to record substantial score gains in math. In Grade 4 and Grade 8, the recorded score gains have been larger.

Second point: The public isn’t allowed to know that these score gains exist. Very few people have ever heard about these large score gains.

What follows is an important fact—an important fact you’ve virtually never seen in the national press:

In the past several decades, black kids have shown large score gains in reading and math. (Hispanic kids have shown large score gains too.) Our “achievement gaps” still exist because white kids have also shown significant score gains, although their gains haven’t been as large.

That almost sounds like important good news—but the public isn’t allowed to hear it. Mainstream reporting on public schools is driven by mandated narratives:

Nothing is working in our schools! High school achievement is pitifully stagnant!

It’s all the fault of our public school teachers! Charter orgs need to expand!


Is achievement by high school seniors stagnant? That’s what Anderson gloomily said, on the front page of the Post.

Before long, Kevin Drum was saying it too. For reasons the analysts can’t understand, their Uncle Drum always seems to revert to that line, which would be a decent point to examine as part of a well-balanced meal.

As the analysts wail, they choke out a claim: He’s cherry-picking his federal tests!

There’s a bit of truth to what they say, we’re glumly forced to admit. Why run straight to the Long-Term Trend Assessment when the Main NAEP is right there for review?

Do America’s high school seniors know more math than their counterparts did in the past?

We don't have the slightest idea, but the Grade 12 score gains seem to suggest that they do. And good God! At Grades 4 and 8, the score gains have been larger!

Do black kids’ score gains actually matter? Aside from the need to make lots of money while busting up the last major unions, why can’t the public be told about what these great kids have achieved?

Black kids are doing better in school. Do facts about black kids matter?

30 comments:

  1. I suspect people run to the long-term trend assessment because its name makes it sound like the right instrument to be measuring long term changes in performance. You have to dig into the descriptions of the two tests to understand which might be right for your purposes. Generally, you are supposed to pick the measurement tool before looking at the results, not pick the one whose results best fit your idea of what is happening -- the latter approach is cherry-picking (I say this because KZ seems to think that cherry-picking is just selectivity, not selectivity for the purpose of supporting a preconceived hypothesis). The key is to select your test, then look at the results not vice versa. Note that the honor system assumes you will do this in the right order, as part of integrity in research or journalism (or blogging).

    ReplyDelete
    Replies
    1. Your key may be "select your test" but other people might think it might be a good idea to consider all available data and tests, instead of the part of one test that just so happens to fit the narrative you are selling.

      In other words stop pretending that selected data from selected years from one selected test represents a "gold standard" that with the use of very rough thumbs tells us things that the test was never designed to tell us.

      Delete
    2. If you think the long-term test is important, you need to make a case for it, not simply look at the results and conclude that because they are flat, kids didn't improve.

      Somerby has explained why he doesn't use the long-term test. One of his reasons is that it includes kids who are a particular age instead of a particular grade level. Kids who are different ages can be in different grades, depending on when they entered kindergarten. If they are in different grades they will have been exposed to different math curricula and perhaps, if in a higher grade, even had a year more of math, compared to kids in a lower grade but of the same age. That makes comparisons more problematic. A few days back, I also explained the difference in emphasis of the questions between the two tests (citing the NAEP website for a comparison of them). Because the long-term test emphasizes computation whereas the Main NAEP emphasizes reasoning (especially after the 2005 revision), and many schools have deemphasized practice in computation in their curricula, that could be an explanation for the flatness of the curve over time (see Drum's graph). Kids can thus be getting better at reasoning and producing higher scores on the Main NAEP while also producing the same scores as in the past on the long-term NAEP, because they are still taught to compute and apply formulas but are getting less practice at it, especially in later grades. So the results on the two tests may be consistent even though it looks like one is producing improvement while the other is not (improvement is obscured by the change in curriculum).

      NAEP is a gold standard because it is harder to cheat on than state tests and because there is less incentive to cheat on it because it isn't used to evaluate teachers or schools. The test is obviously designed to tell us whether kids are improving over time or not. It cannot tell us what the score changes mean in terms of grade levels because schools vary. But that doesn't mean the score gains tell us nothing at all. You may object to the words "very rough" and "rule of thumb" (which implies a rough not precise measurement), but there does need to be some way to understand what the gains mean. These kinds of rough interpretations are used all over the place in research and they are both appropriate and important because producing results without saying what they mean makes research useless for any practical purpose, impossible to apply to real life.

      The narrative most people are selling is that there is no improvement because they only look at the long-term NAEP. That is just as wrong as only looking at the Main NAEP. However, Somerby has discussed the long-term NAEP, explained why he doesn't use it and has always dealt fairly with the data relevant to whatever question he is focusing on. He has been criticized for not looking at other questions that he has not considered important to discuss (but that trolls try to distract him with).

      Main NAEP scores cannot improve without reflecting real improvement in their academic achievement, no matter what the results of the long-term NAEP. Anyone wishing to argue that the Main NAEP scores are meaningless needs to provide evidence, because kids don't score high on tests by accident. The proper question is then why they don't show similar improvement on other tests. I have never heard anyone here make a convincing argument about that. You are welcome to try.

      Delete
    3. "If you think the long-term test is important, you need to make a case for it . . ."

      Now why am I required to do that when I just said you can't take any ONE set of data, proclaim it the "gold standard" and ignore absolutely everything else?

      You know, there's a wealth of standardized testing out there beyond the NAEP. main and long term. But those can be disappeared with a mere wave of the hand, can't they?

      Yes, Somerby has "explained" why he doesn't even use the NAEP long-term. And of course, "it doesn't fit my narrative" isn't it. Funny how that works.


      But judging from the utter lack of interest in his latest in-depth look at the small piece of NAEP Main scores he wants to examine (to the exclusion of the rest of NAEP Main data), it seems like most thinking people have tumbled onto the sad fact that Bob is full of shit.

      But then again, Somerby has been warning us for years about charlatans who fudge and cherry-pick data.

      Delete
    4. 1. Somerby has not "ignored" everything else. He has explained why he thinks the gains on the Main NAEP are important.
      2. He has explained why he uses the term "gold standard." So did I above. It wasn't by proclamation.
      3. Irrelevant data isn't "disappeared" just because it isn't mentioned. It is not discussed because it is irrelevant. If you find it relevant, draw some connection. Explain why it should be used, especially in preference to what Somerby chose to describe. You can't do that because the data you want to consider is usually not relevant to whatever Somerby has been discussing.
      4. When people don't reply to a Somerby post, it isn't necessarily because they lack interest or disagree. It is often because they do agree and there isn't anything more to be said on the topic. When people here think Somerby is "full of shit" they are not shy about saying so. Yet you are the only one. Why do you think that is?
      5. You still haven't explained how kids who are making no progress can show gains on the Main NAEP and you have not explained why any of my explanations about the long-term NAEP are incorrect. I think you are full of shit.

      Delete
    5. You'd have to totally ignore the sordid history of cheating on state tests to think state data should be used in preference to something like the NAEP. Maybe this commenter is unfamiliar with that history?

      Delete
    6. Yes, a familiar Somerby line that seeks to explain why he uses just one set of data, and not even all of that, to the exclusion of all others. Not just "in preference" but to the exclusion of all others.

      Do you also know that there is a vast array of national standardized testing out there besides the NAEP and "state data"? Probably not because Bob hasn't told you about them.

      What Bob is doing here is like evaluating a major league hitter not just by choosing one particular statistic, such as batting average, and discounting all the rest, but by looking only at that one statistic from July.

      Not hard to get data to tell the story you want them to tell if you slice, dice, exclude and fudge.

      Delete
    7. And what has prevented you from showing us some of that other national data with its trends that contradict Somerby? What alternative national math tests are you talking about? You are full of shit.

      Delete
    8. If KZ is talking about the TIMSS or the PIRLS (tests used for international comparisons), Somerby has discussed them here frequently, and scores have been increasing on them too. Eagerly waiting to hear what tests he thinks Somerby should have been discussing, what tests show flat performance in math.

      Delete
    9. Somerby writes criticism of press coverage of student progress. When did it become his obligation to review all national math tests? He usually mentions the same tests that the reporters have discussed.

      Delete
    10. Anon. 9/25@ 7:52

      "1. Somerby has not "ignored" everything else. He has explained why he thinks the gains on the Main NAEP are important."

      Somerby has "ignored" (your word) or "disappeared" everything but the Math Main NAEP.

      Delete
    11. Anon. 9/25@ 7:52

      "2. He has explained why he uses the term "gold standard." So did I above. It wasn't by proclamation."

      Yes, Bob has adopted a favorite phrase of the Washington Post, the paper where you never find positive test scores discussed and where other guild rules are followed.

      "The NAEP is considered the gold standard of testing:"

      10/15/09 Editorial Praising Michelle Rhee

      "The NAEP is the gold standard of student assessments, and its officials have held out the District as an example of a place where things are going right. Credit also goes to the system's hardworking teachers as well as to the students themselves."

      3/27/10 Editorial Praising Michelle Rhee and furthering the meme of blaming ratty teachers and their infernal unions.

      Of course, if the NAEP is the gold standard, then all its tests, including Reading and its Long Term Trend Test are also "gold"?

      Delete
    12. You have changed the subject. What are all those other tests you were talking about? You are still talking about the NAEP when you mention Reading and Long-term Trends.

      Why do you keep ignoring the explanations provided to you previously about why Reading test scores are more difficult to change than Math scores? Why do you keep insisting that Somerby consider the Long Term Trend Test when it has deficiencies compared to the Main NAEP, which have been repeatedly explained to you?

      Do you think the criticisms of teachers are all negated by a few words praising those teachers who have followed Rhee's reform initiatives. The first thing she usually does is fire a bunch of teachers. The ones who remain are no doubt highly motivated to do things her way, since others have been shown the highway. Not a wonderful example of praise for those ratty teachers, in my opinion.

      Delete
    13. Anon. 9/25@ 7:52

      "3. Irrelevant data isn't "disappeared" just because it isn't mentioned. It is not discussed because it is irrelevant."

      This statement could and should be remembered and applied regularly when Somerby makes claims that others have "disappeared" things. He who controls the narrative gets to decide the relevancy. Or should one say the novelist, while glumly suppressing a mordant chuckle?

      Delete
    14. Anon. 9/25@ 7:52

      "4. When people don't reply to a Somerby post, it isn't necessarily because they lack interest or disagree. It is often because they do agree and there isn't anything more to be said on the topic."

      The first sentence suggests something that seems to be sufficiently qualified by the word "necessarily" to be impossible to refute. The second sentence is you quantifying thoughts you cannot read and speaking for those you imagine have those thoughts.

      Delete
    15. Anon. 9/25@ 7:52

      "5. You still haven't explained how kids who are making no progress can show gains on the Main NAEP and you have not explained why any of my explanations about the long-term NAEP are incorrect. I think you are full of shit."

      I am not the person with whom you have been trading comments, but in reading and rereading that person's comments I never heard it stated that kids were making no progress so it is irrelevant for the commenter to explain anything (See your point 3 above).

      Your opinion about the person being full of shit is based on you critically demanding of him/her what you justify being absent from Somerby.

      Delete
    16. Anon. 9/26 @ 11:48

      "You have changed the subject. What are all those other tests you were talking about?"

      I changed no subject. I am a new commenter reacting to one of your comments. You chose the subject. All of my comments are based on quotes of yours.

      Delete
    17. If you don't want to be confused with KZ when mimicking his complaints, try using a screen name.

      Delete
    18. If you don't want to be confused by reality keep staying home with your head in your Howler.

      The real KZ, not some figment of 6:20's imagination.

      KZ

      Delete
  2. "Have worms been eating our journalists’ brains? If so, they’re skinny, wan."

    Mordant chuckle

    ReplyDelete
    Replies
    1. An excellent humorous interlude by the novelist.

      Delete
  3. As I understand it, high schoolers these days are required to take at least one more year of math than the class of 1980 had. When I went to school, there were two tracks. A select group of about 60 students took Algebra 1 in the 8th grade; the rest took 8th grade math. Then we took Algebra 2 in the 9th grade while the rest took Algebra 1. Nothing else was required.

    Most of the Algebra 2 kids kept going and took geometry in 10th grade and then trig in 11th grade. Meanwhile, some of the other students took Algebra 2 in 10th grade, geometry in 11th, and trig as seniors. Maybe 50 kids did that out of a class of 250. Ultimately there were fewer than 12 of us in the senior math class. For some reason it was not calc or pre-calc; we actually had something like sophomore college-level matrix algebra, plus statistics.

    But near as I can tell more math is required of all students now. Which would also help to explain rising math scores.

    ReplyDelete
    Replies
    1. According to the NAEP developers, their intent for the 12th grade test is not to assess what kids have learned from their math curriculum, but to see whether they are properly prepared to do college-level math. So the questions sample what they think a well-prepared entering freshman should know, not what has been taught in various high schools across the country. The more kids have the opportunity to learn, the better prepared they should be.

      Those who have been asking why progress isn't being maintained at the high school level might take into account the shift in the rationale for the questions on the test. I doubt the 4th grade NAEP is testing whether fourth graders are properly prepared for college, so there seems to be a disconnect across the years tested that makes expecting continuous improvement or steady progress nonsensical. I'm not sure why Kevin Drum thinks the slope of the lines matters, but I haven't been impressed with his statistics or math reasoning on other occasions either.

      Delete
    2. 8:15 I was in HS in the early 80s. Most of us took algebra in 9th, then geometry, then trig. I'd say at least a quarter of us took calc as seniors. So I'm not sure your experience was typical.

      Delete
    3. Math scores, the point seems to be, are easier to raise because more students are taking the subjects being tested due to an increasing emphasis on math.

      Rather than offer empirical evidence, I would point you to the long Howler series from long ago bemoaning the Los Angeles School District requiring students to take and pass Algebra as a graduation requirement. Presumably those students being required to do an additional year or more of math are going to be doing a bit better on NAEP math tests.

      That said, I can't pinpoint an explanation.

      Special Report: Farewell Gabriela

      http://www.dailyhowler.com/dh022206.shtml

      Series began 2/22/2006 ended 3/3/2006

      Delete
    4. Reading scores are hard to raise because they depend on literacy behaviors of parents during the first two years of life. The scores also include non-native English speakers. Math involves a symbol system independent of these language issues. That makes it easier for schools to make progress in math compared to Reading.

      Delete
  4. From the Commenter Who Told Someone They Were Full of Shit:

    "According to the NAEP developers, their intent for the 12th grade test is not to assess what kids have learned from their math curriculum...."

    From the NAEP Developers Themselves:

    "The 2005 mathematics framework for grade 12 introduced changes from the previous framework in order to reflect adjustments in curricular emphases and to ensure an appropriate balance of content."

    https://nces.ed.gov/nationsreportcard/mathematics/frameworkcomparison.aspx

    Or, for those roughly a few thumbs lighter on the reading level:

    "Main NAEP assessments change about every decade to reflect changes in curriculum in the nation’s schools; new frameworks reflect these changes.

    Continuity of assessment content was sufficient not to require a break in trends, except in grade 12 mathematics in 2005."

    https://nces.ed.gov/nationsreportcard/about/ltt_main_diff.aspx


    ReplyDelete
    Replies
    1. Read what it says in the NAEP report about the difference between a curriculum and an assessment framework.

      Delete
    2. Pardon us for speaking out of turn Anon. @ 6:18, but we read "the Main NAEP test is geared to the math curriculum as it currently exists."

      KZ

      Delete
  5. OMB (Raising Our Thumbs to the Scoring Heavens With the OTB)

    We love the heading for this hymn title as BOB finishes the Sermon on the Mounting Test Scores:

    Overall gains in average scores, two time spans
    Main NAEP, Grade 12 math
    National public schools


    If ever there was a better representation of BOB's manipulation of his dwindling flock we haven't seen it.

    BOB adds the gains on the 1990-2000 test scores to the gains on the 2005-2013 test scores on the 12th Grade NAEP math scores and applies his rule of thumb in a rousing chorus right up your wazoo.

    Problem is, as we have gone to great lengths to explain, the NAEP Validation Panel said Bob's rough opposing digits might have some utility when assessing reading progress between grades 4 and 8. They are of no use in math. And Lordy, they are certainly of no use when applied to 12th grade NAEP math results SINCE THEY CHANGED THE TEST SCORE SCALE IN 2005.

    Has BOB mentioned that yet? We don't think so, but if we err, we'll use our standard excuse: BOB repeats himself so often we glaze over and may miss an insinuated new fact or two BOB's more devoted followers can read in his literary entrails.

    The rule of thumb (ten points equal one year), in order to have any validity, requires a cross grade score scale that roughly speaking has your average randomly sampled gold standard tested kiddo score 40 points higher each time the test is administered at a higher level. 8th graders should score 40 points above 4th graders and 12th graders 40 points above 8th graders. That was possible with a 500 point scale and a test which is consistent over time, but the math test has not been and beginning in 2005 they even threw out the 500 point cross grade scoring scale and replaced it with a 300 point scale that has no relation whatsoever to the previous 500 point scale.

    That means you cannot add the total for the two differently scored tests. And the only use for your thumb is as a stopper for leakage of what you blow out your backside.

    Here is how to do the same thing BOB has done with the same degree of dishonesty.

    Overall Average Score Drop Two Time Spans
    12th Grade NAEP Math

    1990-2000: 305
    2005-2013: 151

    Average Score Drop: 154 points

    Using the BOB Very Rough Rule of Thumb, today's 12th graders are doing pre-K math!

    ReplyDelete