Monday

 

Glory Days

The newest edition of the list of the top high schools in the United States is out. I've always thought the number of Advanced Placement/International Baccalaureate exams administered per graduating senior is an overly simplistic metric, though I suppose it's hard to come up with an academic standard that's usable for so many different schools in so many different states. Even accepting the idea of ranking high schools solely on year-end subject tests, wouldn't it make sense to factor in the results of those tests? I forget whether AP test grades are curved to a bell curve, and I don't know the IB system at all, but it couldn't be too difficult to normalize the results, aggregate them, and then weight participation by results. For instance, let's say 20% of students on the AP American History exam get a 5 (the top grade, for those not familiar with APs), while only 8% of test takers on the AP Physics C exam* receive the same grade. A 5 in Physics C should then count for more than a 5 in American History: normalize each exam's results, average the normalized results across all tests at a given high school, and then maybe weight the rankings 70% on participation and 30% on results.
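To make the idea a little more concrete, here's a rough back-of-the-envelope sketch in Python of what that normalize-then-weight scheme might look like. Everything in it is made up for illustration: the school names, the exam counts, and the national grade distributions (per the footnote below); the 70/30 combination also glosses over the fact that the two indices aren't on quite the same scale.

    # Fraction of test takers nationwide earning the top grade (5) on each
    # exam. Numbers are invented, per the footnote.
    national_top_rate = {
        "AP American History": 0.20,
        "AP Physics C": 0.08,
    }

    # Hypothetical schools: graduating class size plus, for each exam,
    # (tests administered, number of 5s earned).
    schools = {
        "Alma Mater High": {
            "seniors": 400,
            "exams": {"AP American History": (250, 60), "AP Physics C": (80, 10)},
        },
        "Rival High": {
            "seniors": 350,
            "exams": {"AP American History": (300, 55), "AP Physics C": (40, 2)},
        },
    }

    def school_score(info, participation_weight=0.7, results_weight=0.3):
        # Participation: the familiar tests-per-graduating-senior ratio.
        total_tests = sum(taken for taken, _ in info["exams"].values())
        participation = total_tests / info["seniors"]

        # Results: compare the school's rate of top grades on each exam to the
        # national rate, so a 5 on a stingier exam counts for more, then average.
        normalized = [
            (fives / taken) / national_top_rate[exam]
            for exam, (taken, fives) in info["exams"].items()
        ]
        results = sum(normalized) / len(normalized)

        # A real ranking would first put the two indices on a common scale
        # (z-scores, percentiles); this just applies the 70/30 weights directly.
        return participation_weight * participation + results_weight * results

    for name, info in schools.items():
        print(f"{name}: {school_score(info):.3f}")

Only a sketch, obviously, and it only credits 5s rather than the full grade distribution, but it shows how little extra work the results-weighted version would take.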

My high school alma mater has fallen fourteen places since the 2003 rankings, despite administering 0.279 more tests per graduating senior. I wanted to take a broader look at the results over time (I know we were in the top ten in the early days of the rankings; oh how the mighty have fallen, etc.), but I can't find any results prior to last year's online.

*Percentages are, of course, made up.

Update: While looking for historical (historical meaning seven- or eight-year-old) results, I did find an old Slate article in which James Fallows criticizes the rankings, Jay Mathews (the Newsweek reporter who's been the bylined writer on the rankings articles since at least 2000, if not earlier) responds, and Fallows sums things up. Unfortunately, the link to Mathews' response is broken, so it's a rather one-sided debate.
