Part 2—A look at the Post's reporters: How poorly do our big newspapers cover our low-income schools?
Consider the background of the people who wrote the Washington Post’s front-page report about one such school, a report which appeared this past Sunday.
The report ran 1,800 words; it featured a pair of graphics. As mentioned, it sat on Sunday morning’s front page, the most high-profile spot of the week.
The report concerned a public elementary school in the Washington area—a school which serves a population which the Post describes as “largely poor and Hispanic.” According to the Post’s report, the school achieved “huge gains” on Virginia’s annual statewide tests due to six weeks of “test prep” sessions.
Did these things really happen? Here’s the way the Post’s report began. Moriah Balingit and T. Rees Shapiro are the reporters in question:
BALINGIT AND SHAPIRO (1/11/15): A grim picture of academic performance was emerging at Carlin Springs Elementary. Fewer than half of the school’s third-graders had passed the reading and math portions of the Virginia Standards of Learning exam, and numbers for history and science weren’t much better.

Teachers pored over the data, dumbfounded.

“To get information like that back can be like a shock to your system,” said Mary Clare Moller, a literacy teacher at the Arlington, Va., school, reflecting on test results that came in after the 2012-2013 school year. “You’re just thinking, like, ‘But I taught this information. I don’t understand why the kids didn’t get it.’ ”

Moller and other third-grade teachers devised a strategy for the following fall: They led six weeks of daily test preparation lessons, tracked students’ progress with a new computer program and provided extra tutoring for students who seemed at risk of missing the mark.

Teaching to the test had remarkable results: While the rest of the school continued to flounder under Virginia’s tougher testing standards, Carlin Springs’ third-graders saw double-digit gains across the board, with passage rates between 70 percent and 79 percent in every subject.

Wow! According to the Post’s reporters, this is what happened:
In the spring of 2013, fewer than half of this school’s third-graders recorded passing scores on statewide reading and math tests. The next year, third-grade teachers devised and taught six weeks of test prep lessons.
In the spring of 2014, more than seventy percent of third-graders passed the statewide tests.
Third-grade students did much better—but in other grades, students continued to flounder. “Teaching to the test had remarkable results,” the Post’s journalists declare.
Did those things actually happen? Did six weeks of test prep sessions really produce the “huge gains” cited in the Post’s headline?
Truth to tell, we have no idea, so incompetent is the Post’s journalism. These are just a few of the problems with the Post’s reporting:
The most basic kinds of information are missing from the lengthy report. One of the graphics the Post provides seems to be irrelevant to the issue at hand.
The Post’s reporters made little attempt to explore the plausibility of the explanation they give for the “huge gains.” Most strikingly, they didn’t consider other possible explanations for the change in passing rates. In the process, they turned a blind eye to a major testing scandal which recently rocked the D.C. area and the nation as a whole.
Tomorrow, we’ll examine a few of these journalistic fails—failures which infect this report from its first paragraph on. Before we do, let’s ask a different set of questions:
How experienced were the reporters the Post assigned to this task? Was there any reason to expect that they would do competent work as they examined this school’s score gains and tried to explain their provenance?
Alas! The reporters who wrote this front-page report are young and inexperienced. They’ve spent little time on the public school beat. They have little background in the widely discussed topic of standardized testing.
They may turn out to be superb reporters in the future. At this time, there was little reason to think they would know what questions to ask about the “huge gains” they were asked to examine.
How inexperienced are these scribes? Let’s take a look at the record:
According to this LinkedIn profile, Balingit graduated from the University of Oregon in 2007. Starting in January 2008, she worked various beats at the Pittsburgh Post-Gazette. But until she came to the Post in September 2014, she had never covered education.
According to Middleburg Life, Shapiro graduated from Virginia Tech in 2009. He went directly to the Post, writing obituaries for several years. He has covered various sorts of education topics in recent years, many on the university level, but he seems to have no particular background covering public school testing.
Obviously, there’s nothing “wrong” with being youngish reporters. Recently, Shapiro did some good front-page reporting on the alleged UVa gang rape case.
There is something wrong, though, with assigning reporters to cover a topic for which they lack technical background and understanding. We see no sign that Balingit and Shapiro brought the requisite background knowledge to the question of this school’s score gains.
What was “wrong” with the pair’s reporting? Tomorrow, we’ll explore that topic in some detail.
That said, the Washington Post has a very poor record covering public school testing. How much do we care about low-income schools? In the work routinely done by the Post, we may get a hint of an answer.
Did six weeks of test prep sessions really produce “huge gains” at this low-income school? We don’t have the slightest idea, in large part thanks to the Post.
Then too, when topics get covered in slipshod ways, we may learn who we actually care about. Or perhaps we learn about the intellectual capital our big newspapers possess.
What was wrong with this front-page report? Tomorrow, in Part 3, we’ll start with the Post’s first paragraph.
Tomorrow: Just how huge were the gains in question? And what actually caused them?