Part 2—Big-ass confusion in comments: It isn’t Julia Ryan’s fault that she got hired by the Atlantic to write about an important subject she seems to know nothing about.
Julia Ryan, Harvard ’13, didn’t hire herself! As for those who performed the hire, you can hardly blame them.
Darlings! Ryan prepped at Exeter, then moved on to Harvard, from which she graduated in June! To the withered beings who run what’s left of an historic American journal, those credentials made Ryan the perfect choice to write about low-income kids!
For whatever reason, and it isn’t her fault, Ryan is a type. Increasingly, our finer post-journalistic news orgs are hiring young women who went to the finest schools to work this degraded beat.
(We don’t know why they hire young women for this beat. But the pattern is rather clear.)
They went to Harvard, Yale, Princeton and Brown. In the case of Amanda Ripley (Cornell ’96), they even prepped at Lawrenceville! The fact that they don’t know squat about the world on which they’re being asked to report—well, that no longer seems to count for much in our post-journalistic world.
In our post-journalistic world, low-income kids can go hang in the yard. Increasingly, our finer news orgs serve as employment agencies for children of the elite, high-ranking if clueless college grads who need good jobs at good wages.
As the week proceeds, we’ll look at others in this faux education reporter class. For today, let’s marvel at the sheer confusion which resulted from Ryan’s latest attempt at a news report.
Let’s consider the massive confusion which can be found in comments.
Back to Ryan’s attempt at a news report—a news report concerning a subject she seems to know nothing about. When we left off yesterday, Ryan was trying to describe the nation’s new batch of NAEP scores.
As Ryan started her report, her text raised an obvious question: have scores and skills increased a little, or a lot, over the past twenty years?
That is a very basic question. This is the way she began:
RYAN (11/7/13): Every two years, hundreds of thousands of American fourth and eighth grade students take a test called the National Assessment of Educational Progress. The test evaluates students’ reading and math abilities through reading comprehension questions and grade-appropriate math problems.

The results of the test have provided a snapshot of American education since 1990. Over the last two decades, scores have been rising, but slowly. The 2013 results are out, and the national average scores have increased—just barely—since 2011. Here's what this year's score report says about the state of American education today.

Math and reading skills are improving—slowly

Math and reading skills haven’t changed much in the last two years, according to new National Assessment of Educational Progress scores. Fourth and eighth grade students averaged one point higher on math than they did in 2011 on tests that are scored out of 500 points. Eighth grade students scored two points higher on average on the reading test, and fourth grade students showed no change in their average reading scores since 2011.

Over the last two decades, “scores have been rising, but slowly.” At this point, Ryan presented a graphic. It showed the score gains in reading and math on the NAEP over the twenty-plus years since 1990.
Hang on now! For reasons we will explain tomorrow, we would regard Ryan’s graphic as highly misleading. That said, its data are technically accurate. Ryan’s graphic showed score gains of this size:
Score gains on the NAEP since 1990:

Grade 4 reading: 5 points
Grade 8 reading: 8 points
Grade 4 math: 28 points
Grade 8 math: 22 points

Although we’d regard them as misleading, those data are technically accurate. Having said that, our question remains:
Are those score gains a lot or a little? How much academic progress have you made if you gain 5 points, or even 28 points, on the NAEP scale?
People! A reporter has to explain that! In theory, five points can mean a lot or a little, depending on the scale in use. Example:
On the 2400-point SAT scale, a gain of 5 or even 28 points means very little, next to nothing. But on the NAEP scale, gains of those sizes suggest something quite different.
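That scale dependence can be sketched in a few lines. The helper function and the framing here are ours, not anything in Ryan's report; it simply shows that the same five-point gain is a far larger slice of the NAEP's 500-point scale than of the 2400-point SAT scale.

```python
# Rough illustration (our framing, not Ryan's): the same point gain
# is a very different share of different score scales.
def share_of_scale(gain, scale_points):
    """Return a score gain as a percentage of the full scale."""
    return 100 * gain / scale_points

sat_share = share_of_scale(5, 2400)   # 5 points on the 2400-point SAT scale
naep_share = share_of_scale(5, 500)   # 5 points on the 500-point NAEP scale

print(f"SAT:  {sat_share:.2f}% of the scale")   # 0.21%
print(f"NAEP: {naep_share:.2f}% of the scale")  # 1.00%
```

Even this is only a crude comparison. What a NAEP point actually means for student progress depends on how the scale is conventionally interpreted, which is exactly what Ryan never explains.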
That said, Ryan knows nothing about this work, as Marshall McLuhan once said. She offered no way to estimate the significance of those score gains. Beneath her graphic, she wrote this, then moved to another topic which she failed to explain:
RYAN (continuing directly): But steady increases of one or two points every other year on NAEP math tests have added up to big changes since 1990. Fourth grade math averages have increased by 28 points and eighth grade math averages by 22 points over the last two decades.

In that passage, Ryan seems to say that the score gains in math represent “big changes.” She seems to be saying that American students have gained a lot in math since 1990, only a little in reading.
That said, Ryan never made any attempt to explain why she would make those judgments. Predictably, this cluelessness on Ryan’s part led to massive confusion in comments.
Over the last two decades, have students improved a lot or a little? Ryan seems to voice a judgment, but she offers no explanation for her judgment. She gives her readers no basis on which to judge her assessment.
This leaves readers barefoot and clueless. In comments, one under-informed but observant reader was unconvinced.
This commenter makes a perfectly good observation. That said, the commenter is a million miles off in the weeds, which is Ryan’s fault:
COMMENTER (11/7/13): Between 1992 and 2013 the reading scores at the 8th grade increased by 3% (8/260). This seems insufficient given the big increases in per child costs, all those computers, and all that NCLB money, changes and tests. For 4th grade reading, a 2.3% (5/217) improvement also seems quite inadequate.

Another commenter tried to semi-challenge this observation. Thanks to Ryan’s ineptitude, he too seemed to have no idea what he was talking about.
That first commenter made a sensible observation. Looking at Ryan’s graphic, she saw that eighth graders averaged 260 in reading in 1992. This year, in 2013, they averaged 268.
From a starting point of 260, that didn’t seem like much of a gain. Later, this commenter extended her point, this time regarding an unexplained claim by Ryan concerning the gaps between states.
In this comment, the commenter refers to the Grade 4 reading scores of Massachusetts and Mississippi:
COMMENTER: The article oddly states, “There’s a huge difference between the strongest and weakest states." Yet the numbers provided do not seem to support the claim. For example, 253 vs. 231…So Mississippi is only 8.7% worse than Mass., which doesn't seem "huge." And what if you factor in the expenditures per child and the socioeconomic status of the family? Even for the raw numbers you'd expect Mass. to beat Miss. by a much larger margin than a pathetic 8.7%. The real question is why isn't Massachusetts at least 50% better?

That is a perfectly sensible observation. In her report, Ryan asserted that the 22-point difference between Massachusetts and Mississippi is “huge.” But she made no attempt to explain that characterization.
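For what it's worth, the commenters' percentages can be verified in a few lines; the helper function is ours, and the figures are the ones quoted from Ryan's graphic and the comments.

```python
# Checking the commenters' arithmetic against the figures quoted
# from Ryan's graphic (the helper name is ours, not the article's).
def percent_change(change, baseline):
    """Return a score change as a percentage of the baseline average."""
    return 100 * change / baseline

g8_reading = percent_change(8, 260)          # grade 8 reading, 1992 -> 2013
g4_reading = percent_change(5, 217)          # grade 4 reading, 1992 -> 2013
state_gap = percent_change(253 - 231, 253)   # Massachusetts vs. Mississippi

print(f"Grade 8 reading gain: {g8_reading:.1f}%")  # 3.1%
print(f"Grade 4 reading gain: {g4_reading:.1f}%")  # 2.3%
print(f"MA-MS gap: {state_gap:.1f}%")              # 8.7%
```

The arithmetic checks out; the deeper problem is that Ryan gave readers no better yardstick than these raw percentages of scale scores, which is why the commenters ended up in the weeds.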
On the NAEP scale, is a gain of 22 points a lot or a little? Is a 22-point difference “huge”? Throughout her report, Ryan rattles off judgments about such matters without ever explaining the basis for her claims.
Question: Was there any basis for Ryan’s claims? Did Julia Ryan have any idea what she was talking about? You can count us among the skeptics. But if she did have a basis for her judgments, she never explained what it was.
Atlantic readers were left in the dark. Massive confusion invaded the comments, shared by the commenter we have quoted and the second reader who tried to respond to her claims.
Is 22 points a lot or a little? It didn’t seem to occur to Ryan that she had to explain! That said, we doubt that Ryan even knew the way such score gains are commonly limned. And alas! This was hardly the only problem with the way she presented these test scores.
Question: When Ryan presented that graphic, did she know that she was presenting the average scores of public and private school students combined? We will guess she did not.
Question: Did Ryan know how the gain in Grade 4 reading would look if she chose to “disaggregate” public school scores? Again, we’ll tilt toward no.
Tomorrow, we’ll show you the size of the score gains if you look at public schools only, and if you disaggregate. We’ll even repeat that rough rule of thumb with which folk approach NAEP scores.
Warning! Tomorrow, our score gains will look very different, and very large—much larger than the gains Ryan posted. Did she know she was tilting the scale a bit when she posted that graphic, whose data are technically accurate?
Almost surely, she did not. To all appearances, Ryan was hired because of her pleasing diplomas. There is no sign that she knows anything at all about low-income kids, on whom she has been asked, through no fault of her own, to pretend to report.
Based upon her reports to date, there is no sign that she knows squat about SAT, TIMSS or NAEP scores.
In the post-journalistic world, low-income kids don’t count for much. The careers of those from the finest schools are seen as more important.
Our finest young grads need good jobs at good pay! Post-journalistic news orgs exist to provide that essential service.
Tomorrow: Regarding those score gains, the rest of the story
Coming: Amanda Ripley (Lawrenceville ’92) and others from the finest schools regarding “tracking” and “ability grouping”