Saturday, January 21, 2006

How (One Part) of the "System" Works

Like most academics, I think very little of the U.S. News ranking of colleges. The formula used to calculate the ranks contains (in common with far too many other things in social science, sadly) a whole pile of constants that can charitably be described as "numbers we pulled out of our butts." For example, let's say, arguendo, that freshman retention rate is somewhat important (I don't actually think so, but bear with me). Then we can assign it some percentage of the total score for the school. What percentage? Well, we can construct elaborate post facto rationalizations, but really it's a number they pulled out of their butts (how could you tell whether 7% was more accurate than 15% or 2% as a reflection of the importance of this piece of data?).
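
To make the weighting problem concrete, here is a minimal sketch, with hypothetical schools and made-up scores (this is not U.S. News's actual formula or data), of how the choice among equally defensible weights can reshuffle a ranking:

    # A minimal sketch, assuming made-up schools and scores (this is not
    # U.S. News's actual formula), of how an arbitrary weight choice can
    # reshuffle a ranking.
    schools = {
        # name: (freshman retention score, everything-else score), each 0-100
        "School A": (95, 70),
        "School B": (60, 75),
        "School C": (85, 72),
    }

    for retention_weight in (0.02, 0.07, 0.15):
        other_weight = 1.0 - retention_weight
        totals = {
            name: retention_weight * retention + other_weight * rest
            for name, (retention, rest) in schools.items()
        }
        # Sort schools by weighted total, best score first.
        ranked = sorted(totals, key=totals.get, reverse=True)
        print(f"retention weight {retention_weight:.0%}: {ranked}")

With these invented numbers, School B comes out on top at a 2% or 7% retention weight and dead last at 15%; nothing in the data itself tells you which weight is the "right" one.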

But even people like me who think that the U.S. News rankings are crap (sorry, but anyone who thinks a general undergraduate degree at Michigan is more intellectually rigorous than one at Carnegie Mellon hasn't spent any time at both schools) often pay attention to the "academic ranking" part of the formula. For most of us in academia, that's all that matters in terms of the "rank" of the institution.

Well, even that piece of data should now be called into question, because I've just learned how it is calculated. The "academic ranking" is determined by a survey of college Presidents and Provosts about their opinions of schools in the same category as the schools they preside over (i.e., they use the survey of liberal arts college Presidents and Provosts to rank those colleges). That's it. That's all the data.

So it's no wonder that the older and more established a school is, the higher it ranks (a point made by many critics of the U.S. News rankings: there's an almost perfect correlation between age of school and rank -- with a major exception being Brandeis, which ranks very high despite being very young, a point about which they should be justifiably proud).
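
For what it's worth, the critics' age-and-rank claim is just a correlation, and checking it is simple arithmetic. Here is a sketch, with invented founding years and ranks (not real data), of how one might test it:

    # A sketch, using invented founding years and ranks (not real data),
    # of how one might test the critics' age-vs-rank claim.
    from statistics import mean, stdev

    # (founding year, rank); the 1948 entry stands in for a young outlier
    # like Brandeis.
    data = [(1636, 1), (1701, 2), (1746, 4), (1754, 5), (1948, 3)]
    years = [y for y, _ in data]
    ranks = [r for _, r in data]

    # Pearson correlation: a strong positive value would mean older schools
    # (smaller founding year) tend to hold better (smaller) rank numbers.
    my, mr = mean(years), mean(ranks)
    n = len(data)
    cov = sum((y - my) * (r - mr) for y, r in data) / (n - 1)
    print(f"year-rank correlation: {cov / (stdev(years) * stdev(ranks)):.2f}")

Notice that the one invented young outlier drags the correlation well down from "almost perfect" -- which is the Brandeis effect in miniature.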

But can a survey of Presidents and Provosts, who are very, very busy people, really show anything about what's actually going on at so many schools? Unlikely. I doubt that the President of, say, Pomona, has a clue about how good (or bad) the English department is at Wheaton.

On the other hand, the system does potentially generate some good results: Presidents and Provosts come from a huge variety of academic specialities, and they are so busy that they probably read only the leading journals in their own specialities, so the only way to get your college's name in front of these people is to scatter your seed very widely. If each department in your school (and each subspeciality in your department) publishes in leading journals, you'll be more likely to catch the attention of those Presidents and Provosts. So, if you're a President or a Provost, a good strategy for making the ranking system work for your school is to encourage your faculty to do a lot of research and publish widely. That should translate into support across the curriculum, or at least far enough across the curriculum to have work appear in the major journals read by Presidents and Provosts (my guess is that there are relatively few Presidents and Provosts from the Fine and Performing Arts -- though ours is one; go Wheaton! -- and so working the system this way might be more difficult).

You can also see why research that gets picked up in the mainstream media is so valuable for colleges: the people doing the ranking, Presidents and Provosts, are far more likely to read about your research if it gets picked up in the New York Times than if it appears in the Journal of English and Germanic Philology (despite the fact that the intellectual standards and likely accuracy of JEGP are about 1000 times those of the NYT).

My conclusion: in many ways the system is bogus: it almost certainly entrenches an outdated picture of the system (i.e., which departments were strong when Presidents and Provosts were mere faculty a decade or two ago), and it almost certainly perpetuates a rich-get-richer dynamic (i.e., the English department at William and Mary could be a bunch of useless old fossils who haven't published anything good since 1973 -- n.b., I do not know if this is true -- but the school is so old that it picks up a good ranking anyway, because everyone has heard of it, and so the higher rank attracts better faculty and more students and more money, etc.). But the good news is that one good way to "game" the system and raise your ranking is to support your faculty and help disseminate their research.

1 comment:

The Donegalitarian said...

You have an excellent website and I also enjoy your books on Tolkien.

www.escapingthedarkagesofmodernity.blogspot.com