Some clarity regarding those PISA scores!

TUESDAY, DECEMBER 3, 2013

Regarding Finland and Massachusetts, watch the Atlantic move script: We hadn’t planned to comment further today about the reporting of the new PISA scores.

That said, Motoko Rich’s report in the New York Times turns out to be utterly woeful. Since we quoted so much of her report in our earlier post, we thought we ought to provide a bit of clarity.

At the Atlantic, Julia Ryan makes her usual succession of errors concerning the new PISA scores. That said, she offers a more coherent framework than Rich for discussing the scores.

Ryan reports, or attempts to report, where the United States ranked among the 34 OECD nations which took the test. Her focus on those 34 nations makes much more sense than Rich’s framework in the Times.

Thirty-one other nations and education systems took part in the PISA testing, but many of them are Third World nations. There is little point in reporting the fact that the U.S. surpassed their performance.

That said, that’s the approach Rich took in today’s New York Times. Earlier, we posted her findings, which were accurate but largely unhelpful.

The simplest way to summarize the U.S. performance involves that core 34. This is where the U.S. ranked among the 34 OECD nations:
United States rank among OECD nations, 2012 PISA
In reading: 17th out of 34
In math: 26th out of 34
In science: 21st out of 34
That represents the U.S. rank among those 34 nations. In terms of average scores, the U.S. essentially scored right at the OECD average in science and reading, somewhat below the average in math.

Again, that represents the American standing among the 34 OECD nations. That said, several of the highest scorers come from outside the OECD fold.

In math, the four highest scorers were non-OECD entities—Shanghai, Singapore, Hong Kong, Taiwan. These entities all outscored the U.S. by large margins.

As we noted earlier today, the U.S. tends to do better on the TIMSS, a conventional test of curricular knowledge. The U.S. tends to do less well on the PISA, which brands itself as a measure of “critical thinking.”

Different people have different ideas about which test may be more significant. We’d be inclined to lean toward the TIMSS, but we offer no firm preference.

In this morning’s New York Times, Rich’s report was woeful. Ryan’s report in the Atlantic is almost as bad.

This is the way Ryan starts. The errors start right away:
RYAN (12/3/13): The U.S. education system is mediocre compared to the rest of the world, according to an international ranking of OECD countries.

More than half a million 15-year-olds around the world took the Program for International Student Assessment in 2012. The test, which is administered every three years and focuses largely on math, but includes minor sections in science and reading, is often used as a snapshot of the global state of education. The results, published today, show the U.S. trailing behind educational powerhouses like Korea and Finland.
Does the PISA “focus largely on math”? Ryan doesn’t seem to know how the PISA works.

Every three years, the PISA tests students in all three subjects—reading, math and science. That said, the point of emphasis rotates every three years.

The focus was on math last year, in 2012. But the focus was on reading in 2009, on science in 2006.
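
For readers who want the pattern spelled out, the rotation is regular enough to state as a rule. Here’s a minimal sketch in Python, assuming the three-year cycle that began with reading in 2000 simply repeats (the helper name pisa_focus is ours, purely for illustration):

    # PISA's "major domain" rotates reading -> math -> science,
    # starting with reading in the first assessment year, 2000.
    def pisa_focus(year: int) -> str:
        """Return the subject emphasized in a given PISA year."""
        if year < 2000 or (year - 2000) % 3 != 0:
            raise ValueError("PISA is administered every three years, starting in 2000")
        return ("reading", "math", "science")[((year - 2000) // 3) % 3]

    assert pisa_focus(2006) == "science"
    assert pisa_focus(2009) == "reading"
    assert pisa_focus(2012) == "math"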

Ryan is straight out of college, with no particular background in education. She has bungled a succession of reports about testing in the past few months. (This report is her best.)

A publication like the Atlantic shouldn’t have such an inexperienced education reporter—but we rather plainly live in a post-journalistic world.

We note that Ryan still refers to Finland as an “educational powerhouse.” In math, Finland finished sixth among the 34 OECD nations, but its average score isn’t especially close to those of the aforementioned Asian tigers.

We’ll return to Finland’s performance below. As Ryan continues, a second error occurs. Note the self-contradiction:
RYAN (continuing directly): Not much has changed since 2000, when the U.S. scored along the OECD average in every subject: This year, the U.S. scores below average in math and ranks 17th among the 34 OECD countries. It scores close to the OECD average in science and reading and ranks 21st in science and 17th in reading.

Here are some other takeaways from the report:

America Is Struggling at Math

The U.S. scored below the PISA math mean and ranks 26th out of the 34 OECD countries. The U.S. math score is not statistically different than the following countries: Norway, Portugal, Italy, Spain, Russian Federation, Slovak Republic, Lithuania, Sweden, and Hungary.
In math, does the U.S. rank 17th or 26th among the 34 OECD nations? Within the space of three paragraphs, Ryan states it both ways.

Alas, the answer is 26th. A full eight hours after the report was posted, no one at the Atlantic has noticed the contradiction.

(For all PISA scores, start here.)

Might we note one further passage? Massachusetts took part in the PISA as an independent entity. Ryan scalds the state for its woeful performance:
RYAN: Even the top students in the United States are behind: This year, the PISA report offered regional scores for Massachusetts, Connecticut, and Florida. Massachusetts, which is a high-achieving U.S. state and which averaged above the national PISA score, is still two years of formal schooling behind Shanghai.
Massachusetts is two years behind Shanghai! That sounds extremely gloomy. Aside from the fact that Shanghai remains a special case within China, this is what has been omitted:

That statement is only true about math, a point Ryan doesn’t make clear—and Massachusetts’ score in math is indistinguishable from Finland’s! That score bestowed “powerhouse” status on the Finns—and it earned our Bay State kids a good swift kick in the keister!

Regarding Finland and Massachusetts, their scores are indistinguishable in two of the three subjects tested. Massachusetts was a few points ahead of Finland in reading, a few points behind in math.

Finland outscored Massachusetts by 18 points in science, less than half a year according to the PISA's 39-point rule of thumb. Shanghai outscored Finland by a larger margin, some 35 points.
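
To make that rule of thumb concrete, here is the back-of-the-envelope arithmetic on the two science gaps just cited, treating 39 points as one year of schooling:

    18 / 39 ≈ 0.46 school years (Finland over Massachusetts)
    35 / 39 ≈ 0.90 school years (Shanghai over Finland)

By the same rule, the “two years of formal schooling” Ryan cites corresponds to a math gap of roughly 78 points.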

Let’s note the power of script:

Massachusetts matched Finland in reading and math, was somewhat behind in science. But on the basis of those three scores, Finland was called an “educational powerhouse,” Massachusetts was sent to the doghouse. That’s what happens when under-qualified reporters try to maintain well-known scripts.

Ryan shouldn’t be working this beat for a journal like the Atlantic; she simply doesn’t have the background. But, as we have often told you, we are plainly living in a post-journalistic world.

6 comments:

  1. This is important. I am going through the report thoroughly and awaiting your comments.

  2. "That statement is only true about math, a point Ryan doesn’t make clear—and Massachusetts’ score in math is indistinguishable from Finland’s! That score bestowed “powerhouse” status on the Finns—and it earned our Bay State kids a good swift kick in the keister!"

    Hah!

    Oh, we're DONE with Finland. Bunch of losers, those kids. This week we want to be JUST LIKE Shanghai.

    When does the TIMSS come out? I need a heads up for when we again find and crown The Smartest Kids In The World. Speaking of smart kids, Amanda Ripley was the celebrity edu-expert who opined on the scores at the USDOE meeting. That's truly frightening. Can they not find someone who actually knows something?

  3. If the PISA tests a different curriculum than the TIMSS, it should be no surprise that the scores are not the same. Nor should the various countries score the same on either test if their local curriculums vary in content. If our students were going to work in one of these other countries, it might matter whether they are ahead or behind the students there, but they are going to work in the US and in US companies. Presumably our curriculum has been designed to meet the needs of our society and to prepare students for lives here. I'm not sure what the value is of making these kinds of comparisons with other countries.

  4. Bob, thanks so much for teaching me how to read this data.

  5. There are at least two important questions here: (1) How are the PISA or other test scores to be interpreted? (2) Would it be wise to design curriculum so as to maximize our students' scores in, say, the PISA testing? Mr. Somerby has been focusing on the first question. Diane Ravitch's blog has been focusing on the second as well. The MSM, predictably, comes off badly on both counts.

  6. The next TIMSS assessment is in 2015. The results will be released in 2016.
