On to the PISA next week: Are U.S. public schools an embarrassing mess as compared to schools in the rest of the world?
How about as compared to schools in the rest of the developed world? Are American schools an embarrassing mess as compared to them?
That impression is constantly given within the American discourse. It's done through comparisons to the miracle schools of miraculous Finland, a small, middle-class, unicultural nation whose kids tend to outscore their American peers on international tests.
It's also done through comparisons to the world's other nations in general. That said, it's hard to conclude that American schools are an embarrassing mess by looking at scores from the TIMSS and the PIRLS, which constitute one of the two international testing programs from which all our data flow.
On the TIMSS (math and science) and the PIRLS (reading), American students tend to score rather well, except when compared to a handful of Asian nations and city-states which tend to outscore everyone else in the world. If we look at results from the TIMSS and the PIRLS, American students and American schools don't seem to be a horrid, embarrassing mess.
Perhaps for that reason, we rarely hear about results from the TIMSS and the PIRLS. When American "opinion leaders" talk about the public schools, they tend to discuss the wonders of Finland—and they tend to discuss test scores from the PISA.
(Note: By all accounts, Finland is an admirable nation which functions extremely well. We would only criticize the misleading way its education ministry tends to promote itself.)
The PISA (the Program for International Student Assessment) is the other major international testing program. On the one hand, we have the TIMSS and the PIRLS. On the other hand, we have the PISA, which conducts tests in reading, math and science.
Compared to kids from other nations, American students haven't ranked as well on the PISA as they have on the TIMSS/PIRLS. Perhaps for that reason, advocates of certain types of education reform tend to discuss results from the PISA while completely ignoring results from the TIMSS and the PIRLS.
We'd planned to discuss "achievement gaps" on the PISA today, but we've decided it can't be done in a single report. We want to show the size of the gaps between the U.S. and other nations. But we also want to "disaggregate" those gaps. That is, we want to show what the gaps look like on the PISA (and on the TIMSS) when you report the divergent average scores of American students from our major population groups.
In Part 3 of our current report, we recorded the large achievement gaps of that type which obtain on our major domestic testing program, the NAEP. Despite large score gains by all major groups, large achievement gaps persist on the NAEP.
In our view, you can't fully understand American scores on international tests unless you see what scores look like for those different major groups. In our view, the data are painful, but instructive. That "disaggregation" makes our nation's educational challenges that much more stark and clear.
It doesn't make sense to try to do all that in one post. Instead, we'll reshape our ongoing series, which was originally planned to last four weeks in all.
Next Monday, we'll start a new four-part series, Where the PISA Gaps Are. We'll look at the gaps between the world's nations, and between different groups of American kids.
We could try to cram it all in today. We could do it, but it would be wrong.
For today, you might want to look at Kevin Drum's capsule summary of these matters. In this new post, Drum offers an overview of some of the data we've been discussing.
That said, we do have a question about this highlighted point:
DRUM (10/3/16): [A]s Somerby points out, one of the striking things about these tests is that a small clutch of Asian countries do far better than us. In fact, they do far better than everyone, something they accomplish through a combination of cherry picking the students who take the test and a monomaniac culture of test prep. So let's take that as given, and just look at the rankings outside of the Asian tigers...

We'll have occasion to discuss that "monomaniac culture of test prep" again next week. That said, we don't think we've ever seen anyone claim that the high-scoring Asian nations "cherry-pick the students who take" the TIMSS and the PISA.
Drum may be referring to the limited case of Shanghai, which took the PISA as an independent entity in 2009, producing extremely high scores which set off paroxysms in the American press. It was almost like Shanghai's kids had put a new Sputnik in orbit.
As it turned out, Shanghai's student population, and its public schools, are highly unrepresentative of the Chinese system in general. It was a bit like testing kids from Groton and Choate and acting like you'd tested a representative sample of American students.
Once the gorilla dust had settled, the PISA was criticized for letting China participate in the program in such a misleading way. That may be the situation to which Drum referred. Aside from that, we're not aware of claims that South Korea, Japan or Taiwan arrange for an unrepresentative sample of students to be tested. Perhaps some such allegation exists, but we don't think we've ever heard about it.
Asian tigers to the side, American students score pretty well on the TIMSS and the PIRLS. They seem to stack up less well on the PISA.
Next week, we'll do a four-part report about American scores on the PISA. Along the way, we'll break American scores down by groups, and we'll look at how certain states performed on the PISA as independent entities.
In our view, these practices give us a clearer picture of the educational challenges we face. Aside from standard assertions of script and cant, these topics are very rarely discussed. We want to end up with a detailed report which makes lots of basic information available all in one place.
We're sorry for the delay. But we've come to believe that the PISA's gaps deserve a week of their own.