Part 3—The Post refuses to ask: It’s fascinating to see the way the Washington Post reported the story in question.
The famous newspaper’s bungled report ran 1,800 words. Beyond that, the report included a pair of graphics, one of which seems to be irrelevant to the questions at hand.
The lengthy report appeared on the front page of Sunday’s editions—the highest-profile spot of the week. The report concerns a type of event our big newspapers have bungled for at least forty years—“huge gains” on a standardized test at a local, low-income grade school.
By now, it’s hard to say that this journalistic bungling isn’t deliberate and willful. In this case, the procedure starts with youngish, inexperienced scribes being assigned to report the story.
In this case, their bungling starts in their opening paragraph, then proceeds onward from there. Assuming that they worked in good faith, these young reporters were unprepared to report a story of this type.
What are the basic facts of the case? How “huge” were the score gains at the local, low-income school?
Once again, this is the way the front-page report started. Basic information is missing right there in paragraph one:
BALINGIT AND SHAPIRO (1/11/15): A grim picture of academic performance was emerging at Carlin Springs Elementary. Fewer than half of the school’s third-graders had passed the reading and math portions of the Virginia Standards of Learning exam, and numbers for history and science weren’t much better.
Teachers pored over the data, dumbfounded.
“To get information like that back can be like a shock to your system,” said Mary Clare Moller, a literacy teacher at the Arlington, Va., school, reflecting on test results that came in after the 2012-2013 school year. “You’re just thinking, like, ‘But I taught this information. I don’t understand why the kids didn’t get it.’ ”
Moller and other third-grade teachers devised a strategy for the following fall: They led six weeks of daily test preparation lessons, tracked students’ progress with a new computer program and provided extra tutoring for students who seemed at risk of missing the mark.
Teaching to the test had remarkable results: While the rest of the school continued to flounder under Virginia’s tougher testing standards, Carlin Springs’ third-graders saw double-digit gains across the board, with passage rates between 70 percent and 79 percent in every subject.
As told by the Washington Post, the story goes like this:
In the spring of 2013, fewer than half the school’s third-graders passed the statewide math and reading tests.
The next year, dumbfounded teachers provided six weeks of “test prep” lessons for their third-grade students. Because of this “teaching to the test,” third-grade passing rates jumped by roughly thirty points.
That’s the way the story is told by the pair of young Post reporters. For our money, their inexperience was already showing right there in paragraph one.
This is why we say that:
In paragraph one, we’re told that fewer than half of this school’s third-graders passed the statewide tests in the spring of 2013. We’re given to think that this was a horrible performance.
But alas! We’re never told what the passing rates were for the state of Virginia as a whole. Depending on the “difficulty” of the tests, those passing rates could conceivably have been in line with statewide performance. Or those passing rates might have come close to matching statewide performance on the tests, which are called the SOLs.
And not only that! Later, we’re told that students at the school in question are “largely poor and Hispanic.” But we’re never told how their passing rates compared to those of other low-income Hispanic kids in the state.
How bad were those passing rates on those statewide tests? It’s hard to answer such a question in the absence of statewide comparative data, but such data never appear in this lengthy report. Nor are we told what the third-grade passing rates were like at this school in the spring of 2012.
At one point, we’re told that third-grade passing rates at this school “plummeted” in the spring of 2013. But we’re never told what the passing rates were in the spring of 2012—not even in the graphics which accompany the text.
(Warning! You may think that the Post’s first graphic provides this information, at least “for all four SOL subjects combined.” Careful! Based on data from the Post’s report, it seems fairly clear that this graphic must present passing rates for all the grades in the school combined, not just for third graders. You’ll note that the title of that graphic doesn’t claim that the data are for the third grade.)
Starting right in paragraph one, a lot of info is missing. Most strikingly, we never learn how well this school’s low-income Hispanic third-graders were performing compared to their counterparts around the state. And we’re never given a clear account of the drop in passing rates which occurred in 2013.
By how much did third-grade passing rates “plummet” that year? We’re never given that information! We’re simply told that teachers at the school were “dumbfounded” by that year’s passing rates—and that they devised a strategy to bring the passing rates up the next year.
At this point, the inexperience of the Post journalists showed in another way. We were struck by how much shaky behavior they described without ever seeming to notice it.
Good God! After those passing rates “plummeted” in 2013, third-grade teachers at this school swung into action. As described in the passage above, they “devised a strategy for the following fall” which included extensive “test preparation.”
To us, the problems with that behavior start right at the top. Let’s consider the role of the state of Virginia.
In theory, if a state offers statewide standardized tests, the state should also prescribe standardized test preparation. Here’s why:
Does extensive “test prep” of the type described really change scores on the SOLs? We can’t answer that question.
In theory, if test formats are so confusing that “test prep” substantially changes scores, then those formats are too confusing. You need better tests.
Let’s suppose that such test prep does change test scores on the SOLs. If so, the value of the SOLs is severely undermined if School A provides extensive test prep and School B does not. If passing rates can virtually double based on test prep lessons, then it’s pointless to compare passing rates of various schools from around the state, and it’s pointless to compare some school’s passing rates to those of the state as a whole.
Beyond that, if test prep lessons change scores that much, it makes no sense to use the tests to determine “proficiency” at reading or math. Student A may know just as much math as Student B. But Student A has not been provided the test prep lessons, so his math score may be much lower.
One is “proficient,” the other is not, even though they both know the same math!
At the Washington Post, the young, inexperienced reporters completely accepted the idea that this school’s test prep lessons produced those jumps in passing rates.
“Teaching to the test had remarkable results,” the young, largely clueless reporters wrote at the start of their piece. As they continued, they betrayed no sense that, in passages like these, they are describing a form of anarchy, chaos, bedlam:
BALINGIT AND SHAPIRO: As the teachers celebrated the gains, some soul searching began: They felt uncertain about the accomplishment and its educational value.
“I just knew it’s a part of the game,” said Carissa Krane, who taught third grade during the two years the test scores plummeted and then soared at Carlin Springs; she has since moved to California. “There has to be a way to be accountable, and this is the way that our country’s decided we’re going to hold kids accountable and the teachers accountable.”
Arlington Superintendent Patrick K. Murphy said that Carlin Springs’ gains showed that educators were able to tailor their instruction to fix a problem, acknowledging that part of their strategy involved intensive test preparation, a practice he said is common in schools everywhere. He said he thinks SOLs are not a comprehensive measure of a student’s knowledge and that teachers sometimes put too much emphasis on the results.
“We fall into the trap of wanting to be able to quantify things with a single number because it makes us feel good,” Murphy said.
For educators at Carlin Springs, the tests became a frustrating reality of modern teaching. They know that year-long classroom efforts will be reduced to scores and pass rates.
“It’s just really hard to look at just the numbers, and that’s what we’re judged by,” Moller said. “We have to try to do what we can to get the kids to pass.”
In recent years, such statements have come from schools in cities like Atlanta and Washington, in the wake of cheating scandals that may yet send people to prison. But so what? The clueless young scribes at the Washington Post betray no sense of the problems which may lurk in such comments. They betray no sense of the chaos which is unleashed when every group of third-grade teachers feels they get to decide the proper way to prepare their students to take statewide standardized tests.
In that passage, Superintendent Murphy fails to see the obvious problem with his quoted statements. Is “intensive test preparation” really “a practice [which] is common in schools everywhere?”
Obviously, no—it is not. According to this report, there was no such practice at this school until the 2013-14 school year.
Presumably, a crazy-quilt pattern of test prep obtained in the schools of Arlington, and across the state of Virginia. But if test prep affects SOL scores this much, that means that Virginia’s testing program has virtually no meaning or utility. The teachers and the superintendent don’t seem to see this obvious problem. Neither do the incompetent, young, inexperienced scribes who did this lengthy front-page report for the Washington Post.
If test prep really affects scores that much, then (at the very least) test prep needs to be standardized. Having said that, a very basic question must be asked:
Does test prep affect test scores that much? Does that six weeks of test prep lessons really explain the way passing rates “soared” at this low-income school?
We don’t have the slightest idea why passing rates went up at that school. We do understand the possible scam the Washington Post still refuses to discuss, even after Rhee and Atlanta.
Did test prep really produce those “huge gains?” Tomorrow, we’ll look at the rest of the (possible) story—a possible story the Post has worked to avoid down through these long pitiful years.
Even after the scandals under Rhee—even after the scandals in Atlanta—the Washington Post still refuses to go there! Could it be that something other than test prep lessons produced those soaring passing rates?
Even after Rhee and Atlanta, the Post refuses to ask!
Do we care about low-income kids? If we cared about such kids, would we engage in relentless journalistic scams about their important young lives?
Tomorrow: Even after Rhee, even after Atlanta, the Post refuses to ask