WEDNESDAY, MAY 31, 2023
The history of that search: In February 2006, one of our favorite journalists—a person we flatly admire—made a substantial mistake.
As you know, we human beings sometimes make mistakes. In this case, the mistake was bannered across the top of the front page of the Washington Post, where a banner headline said this:
A Study in Pride, Progress
The front-page report told a (very) familiar story about a (nearly miraculous) elementary school in Alexandria, Virginia.
The front-page report told a highly novelized story about the progress and pride now on display at one low-income school. Under the lead of a new, "energetic principal," the school had shown remarkable one-year growth in its scores on Virginia's statewide tests.
Unfortunately, a mistake lay at the heart of that front-page report. A few days later, the Washington Post editorial board got taken in by the mistake:
WASHINGTON POST EDITORIAL (2/5/06): A profile of once-disastrous, now-successful Maury Elementary School in Alexandria by The Post's Jay Mathews last week showed what can be achieved if teachers and administrators use the law well. It's an odd idea, getting the Democrats to embrace a Republican project. But if they are brave enough to do it, thousands of inner-city children will be better off.
As you can see, the editors were thrilled by the large score gains at the "once-disastrous, now-successful school."
As you can see, the editors attributed the score gains to President George W. Bush's No Child Left Behind law. According to the editors, if other teachers would employ the law the way the staff at this one school had, "thousands of inner-city children will be better off."
Unfortunately, this upbeat editorial was based on a large mistake. Everything was not as it seemed with this school's improved test scores.
In yesterday morning's report, we reminded you of the way impressive score gains can sometimes result from outright cheating—from deliberate, fraudulent conduct by the staff of a school or school system.
We first became aware of this problem over dinner one night, with three other Baltimore City teachers, in 1971 or maybe 1972. Finally, almost forty years later, USA Today and the Atlanta Journal-Constitution researched this phenomenon to devastating effect.
For one brief shining moment, that reporting—and the subsequent criminal trials in Georgia—created a world in which mainstream journalists were willing to acknowledge the existence of this long-running phenomenon.
That said, we humans love the familiar, highly novelized tale of The Little Low-Income Elementary School That Could. Sometimes, major journalists may love that novelized story so much that they fail to check the small print in the data sets which accompany the upbeat report on the miraculous scores at the school.
Did the AP fail to check the small print concerning the "Mississippi miracle," the topic it explored in this lengthy, upbeat report back on May 17?
We're going to examine that question before our current series is done. But for today, we thought we'd review the remarkable error which led to that front-page report in the Washington Post—that front-page report about a small "inner city" elementary school which had previously been disastrous but was now successful.
As far as we know, there was no "cheating"—none at all—on the part of the Maury staff. As far as we know, that school's apparent test score gains did not result from fraudulent conduct on the part of Maury's teachers or its "energetic" new principal.
Instead, the test score gains, which turned out to be bogus, had resulted from a bizarre policy move on the part of the state of Virginia's Department of Education. That said, please understand this:
There are different ways in which apparent test score gains can turn out to be bogus. Flagrant cheating can produce such gains, but so can policy decisions made at a higher level.
Briefly, this is the remarkable story concerning what happened at Maury:
For starters, our search began with this. Here at THE HOWLER, we didn't believe that front-page report about those heartening score gains.
We didn't necessarily disbelieve that news report. But through long experience, we knew that upbeat news reports of that type often turn out to be bogus.
In the case of that front-page report, our search went on for several months. By the time we were done, everyone from the chairman of the Virginia Board of Education on down had agreed that a deeply ridiculous policy provision had led to the misimpression conveyed in that front-page report.
What actually happened at Maury Elementary? Crazy as it's going to seem, the story turned out to be this:
At that time, the state of Virginia was testing Grades 3, 5 and 8 in its statewide testing program. At some point, the state board had come up with a truly crazy idea—a crazy idea which would vastly inflate the official passing rates at large numbers of schools on the statewide tests.
Crazy as it seems—and no, we really aren't making this up—the official procedure was this:
Kids would take the statewide reading and math tests when they were in Grade 3. If they failed to achieve a passing grade, they would be promoted to Grade 4—and at the end of their Grade 4 year, those kids would take the Grade 3 test again!
Depending on how this was reported, this could have made perfect sense. But what follows is the way it was actually reported—and no, we aren't making this up:
Consider Maury Elementary, a small, low-income school. In the school year under review, there were only 19 third graders in the school.
Only five of them passed the state's reading test. This created a dismal passing rate of 26.3 percent.
Statewide, the passing rate had been 77 percent! With that in mind, how did a school with a 26.3 percent passing rate qualify as "now successful?"
The answer to your question is this, and no, we aren't making this up:
At Maury that year, there were a bunch of kids in the fourth grade who had failed the Grade 3 reading test the year before. In accord with official state policy, those kids took the Grade 3 test again at the end of the year—and 12 of those kids now attained a passing grade on the Grade 3 test.
So far, this still could have made a type of sense. But here's what happened next:
In accord with official state policy, the twelve fourth graders who passed the test were lumped in with the five third graders who passed it. This meant that seventeen kids had passed the Grade 3 reading test, in a school with only 19 third graders!
By now, you'll surely think we're kidding. Sadly enough, we aren't! In accord with official state procedure, the state reported that 17 kids had passed the Grade 3 test at Maury—17 kids, in a school with only 19 third graders.
On that basis, the state reported that Maury Elementary had an 89% passing rate on the Grade 3 reading test! That was the official report, despite the fact that only five of the school's 19 third graders had actually passed the test.
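For readers who want to see the arithmetic laid out, here is a minimal sketch of the calculation described above. The numbers are the ones reported for Maury; the variable names are ours:

```python
# Virginia's "Remediation Recovery" passing-rate arithmetic, as described
# above: fourth graders who passed the Grade 3 retake were added to the
# numerator, but the denominator remained the count of third graders tested.
third_graders_tested = 19
third_graders_passed = 5
recovery_passers = 12  # fourth graders passing the Grade 3 test on retake

unadjusted_rate = 100 * third_graders_passed / third_graders_tested
reported_rate = 100 * (third_graders_passed + recovery_passers) / third_graders_tested

print(f"Unadjusted: {unadjusted_rate:.1f}%")  # 26.3%
print(f"Reported:   {reported_rate:.0f}%")    # 89%
```

Under any ordinary definition, five passers out of 19 test-takers is a 26.3 percent rate; only the recovery bonus turns it into the 89 percent the state reported.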
We know, we know, we know, we know—it sounds like we're making that up. It's hard to believe that something so crazy could have been happening as a matter of official state policy—but that is the way it was working in Virginia at that point in time.
By the time we spoke with the chairman of the state board, everyone had come to agree that this had been a crazy procedure which shouldn't have been adopted. But on the basis of that crazy procedure, the state had reported that Maury had an 89% passing rate on the Grade 3 reading test, and the Washington Post had accepted that claim at face value.
The Post got fooled atop its front page, then in an upbeat editorial. Based on long experience with bogus test scores, we didn't (necessarily) believe what we read that first day, and we decided to check it out.
It took several months of effort to get clear on what actually happened. You can review the search in our 2006 archives, running from February 6 through March 23 on a nearly daily basis, and then occasionally after that.
Just for the record, here's how crazy it was! Below, you see one torrent of official language explaining the way this policy worked.
We highlight the last part only. In that part of this torrent of language, the state explains what to do if more than 100 percent of your school's kids pass a given test!
Remediation Recovery, which started with the 2001 SOLs, is a third reason for apparent score disparities. Students in grades 4, 6, or 9 may retake failed English: Reading/Literature and Research or mathematics tests for grades 3, 5, or 8, respectively, following a Remediation Recovery program. Additionally, students who failed Algebra I, Geometry, or Algebra II and who are enrolled in a Remediation Recovery program may retake a given EOC mathematics test. Tables 6, 7, 15, 17, and 20 display the number of students who retook the failed SOL, the percentage who passed, the number who passed (Bonus number), and the potential benefit to the school (Recovery Bonus or Unadjusted + Recovery score). In the State's calculations to determine accreditation, the number of students who pass the targeted test following a Remediation Recovery program will be added to the number of students who passed the SOLs in the same content area. For example, a fourth grader’s passing grade 3 mathematics score will be added to that school’s grade 3 mathematics passing scores. At other grade levels, the passing mathematics score will be added to the school’s “collapsed” SOL mathematics scores (for accreditation calculations, all mathematics scores are collapsed or averaged together to create one passing percentage). Remediation Recovery students will be included in the unadjusted number of students who passed, but not in the number of students tested, hence the term Recovery Bonus. Said another way, passing Remediation Recovery students are added to the numerator, but not to the denominator. What this means is that a passing percentage exceeding 100 percent is possible (Note: while this overview reports percentages more than 100 percent, the State caps pass rates at 100 percent).
Good God! Under the state's official policy, "a passing percentage exceeding 100 percent is possible!"
Also, note the boondoggle about adding the number of "Remediation Recovery students" to the numerator, but not to the denominator. Under this absurd procedure, it could turn out that more than 100 percent of your school's kids had passed some particular test!
At that point, the state would step in! When schools ended up with passing rates which exceeded 100 percent, the state of Virginia had decided to "cap" that (impossible) passing rate at a mere 100 percent!
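The cap described above can be sketched in a few lines. The school's numbers here are hypothetical; only the numerator-but-not-denominator rule and the 100 percent cap come from the state's own description:

```python
# Sketch of the capped passing-rate calculation described in the state's
# overview: recovery passers inflate only the numerator, so the raw rate
# can exceed 100 percent, at which point the state caps it at 100.
def reported_pass_rate(passed, recovery_passers, tested):
    raw_rate = 100 * (passed + recovery_passers) / tested
    return min(raw_rate, 100.0)  # the state's cap

# Hypothetical small school: 10 tested, 8 passed, plus 4 recovery passers.
# The raw rate would be 120 percent; the state reports 100.
print(reported_pass_rate(8, 4, 10))  # 100.0
```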
Was this policy adopted in good faith? We aren't mind-readers here.
That said, the Post believed what it was told about Maury's passing rates. It didn't examine the small print in the state's reports on individual schools like Maury, where puzzling statistical contradictions immediately began to turn up.
Based on long experience with fraudulent test scores, we pretty much didn't believe what we read in the Post about those high scores. We spent several months conducting a search.
Back to the question which has triggered the present search:
Everything was not as it seemed with the passing rates at Maury. Is it possible that things are not completely as they seem with those improved Mississippi test scores?
Tomorrow, we'll start to examine that question. We'll offer a bit more of the history of bogus test scores in this week's afternoon submissions.
Tomorrow: The AP's explanations