Wednesday, October 5, 2011

Yes, but how about the international ranking of education researchers

Civitas Review touts a recent study that allegedly "exposes the myth of suburban schools." But it seems that the study actually exposes the myth of competent George W. Bush Presidential Center education researchers.

The study in question, the Global Report Card, sponsored by the George W. Bush Presidential Center, compares the test score distributions of individual schools and school districts to international distributions. However, because states don't all use the same tests, the study uses a normalizing procedure. The study's website describes the procedure:
The calculations begin by evaluating the distributions of student achievement at the state, national, and international level. To allow for direct comparisons across state and national borders, and thus testing instruments, we map all testing data to the standard normal curve using the appropriate student level mean and standard deviation. We then calculate at the lowest level of aggregation by estimating average district quality within each state. Each state's average quality is evaluated then using national testing data. And finally, the average national quality is determined using international testing data. Essentially, this re-centers our distribution of district quality based upon the relative performance of the individual state when compared to the nation as a whole as well as the relative performance of the nation when compared to our economic competitors.
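In other words, the index is just a sum of three z-scores: district versus state, state versus nation, and nation versus the world. A minimal sketch of the arithmetic, using made-up reference means and standard deviations (none of these numbers come from the report's actual data):

```python
def z(score, mean, sd):
    """Standardize a score against a reference mean and standard deviation."""
    return (score - mean) / sd

# Hypothetical numbers, chosen only to mirror the Scarsdale example below.
district_mean, state_mean, state_sd = 650.0, 600.0, 50.0      # state math exam
state_naep, nat_naep_mean, nat_naep_sd = 242.0, 240.0, 33.3   # NAEP
us_pisa, intl_pisa_mean, intl_pisa_sd = 489.0, 494.5, 100.0   # PISA

index = (
    z(district_mean, state_mean, state_sd)        # district vs. state: 1.0
    + z(state_naep, nat_naep_mean, nat_naep_sd)   # state vs. nation: ~.06
    + z(us_pisa, intl_pisa_mean, intl_pisa_sd)    # nation vs. world: -.055
)
print(round(index, 3))  # -> 1.005
```

Note that summing z-scores computed against three different distributions only makes sense if you're willing to treat their standard deviations as interchangeable units, which is exactly the assumption questioned below.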

For example, the average student in Scarsdale School District in Westchester County, New York scored nearly one standard deviation above the mean for New York on the state's math exam. The average student in New York scored six hundredths of a standard deviation above the national average on the NAEP exam given in the same year, and the average student in the United States scored about as far in the negative direction (-.055) from the international average on PISA. Our final index score for Scarsdale in 2007 is equal to the sum of the district, state, and national estimates (1+.06+ -.055 = 1.055). Since the final index score is expressed in standard deviation units, it can easily be converted to a percentile for easy interpretation. In our example, Scarsdale would rank at the seventy-seventh percentile internationally in math.
This may be an example of the "new math," but 1 + .06 + (-.055) = 1.005, so the index number should be 1.005, not 1.055.
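The percentile conversion itself is a one-line normal-CDF lookup, sketched here with Python's statistics.NormalDist under the assumption of a standard normal reference distribution (the report may use a different one):

```python
from statistics import NormalDist

def to_percentile(index_sd):
    """Percentile under a standard normal for a score in SD units."""
    return NormalDist().cdf(index_sd) * 100

print(round(to_percentile(1.005)))  # -> 84
```

Under that assumption, an index of 1.005 lands around the 84th percentile, not the seventy-seventh; presumably the report's percentile figure reflects its own estimated distributions.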

There are other problems with the methodology. A big one is that it uses each distribution's own standard deviation as its unit of measure when normalizing test scores across countries and states. Consider a hypothetical state that succeeded in improving test scores and in "closing the achievement gap," that is, it raised scores for all students but raised them more for students at the bottom of the test score distribution than for students at the top. The standard deviation (the measure of dispersion) of its test scores would fall. Because the Global Report Card measures everything in standard deviation units, the effect would be that school districts within this state would be evaluated on a different standard than school districts in other states.
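A toy numerical example makes the distortion concrete. Suppose a state's five districts all improve, with the biggest gains at the bottom (all numbers invented for illustration):

```python
import statistics

# Invented district mean scores for one state, before and after a reform
# that lifts low scorers more than high scorers.
before = [40, 50, 60, 70, 80]
after = [55, 60, 66, 73, 82]  # every district improves; the gap narrows

for label, scores in (("before", before), ("after", after)):
    mean = statistics.mean(scores)
    sd = statistics.pstdev(scores)
    top_z = (max(scores) - mean) / sd
    print(f"{label}: sd = {sd:.2f}, top district z = {top_z:.2f}")
```

The top district gains the least in raw points, yet its z-score against its own state rises from about 1.41 to 1.55 because the state's standard deviation shrank, so it would look better internationally than an identical district in a state whose distribution hadn't been compressed.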

Another problem is that the methodology does not account for the characteristics of entering students, such as the number who arrive with limited English proficiency.

So, beyond the obvious goof on the website, there's a lot about this report (and Civitas's reporting on it) that doesn't add up.