When the Times Higher Education Rankings Fail the Fall-Down-Laughing Test

You may have noted the gradual proliferation of rankings at the Times Higher Education over the last few years. First the World University Rankings; then the World Reputation Rankings (a recycling of reputation-survey data from the World Rankings); then the "100 Under 50" (the World Rankings restricted to institutions founded since the early 1960s, with a methodological twist to make the results less ridiculous); then the BRICS Rankings (World Rankings results with developed countries excluded, and similar methodological twists).

Between actual rankings, the Times Higher staff can pull material out of the database and turn small bits of analysis into stories. For instance, last week the THE came out with a list of the "100 most international" universities in the world. You can see the results here. Harmless stuff, in a sense: all they've done is take the data from the World University Rankings on international students, international faculty, and international research collaborations, and turn it into a standalone list. And of course, using those kinds of metrics, geographic and political realities mean that European universities, especially those from the really tiny countries, always come out on top (Singapore and Hong Kong do well too, for similar reasons).
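For the curious, here is roughly what that repackaging amounts to in practice. This is a minimal sketch in Python; the column names, the 0-100 scaling, and the equal weighting of the three metrics are my own assumptions for illustration, not THE's published methodology:

```python
# Illustrative sketch: build a standalone "most international" list from
# three metrics already collected for a world ranking. Column names,
# scaling, and equal weights are assumptions, not THE's actual method.
import pandas as pd

def international_score(df: pd.DataFrame) -> pd.Series:
    """Average three internationalization metrics, each rescaled to 0-100."""
    metrics = ["intl_students_pct", "intl_faculty_pct", "intl_coauthorship_pct"]
    scaled = df[metrics].apply(lambda col: 100 * col / col.max())
    return scaled.mean(axis=1)

universities = pd.DataFrame({
    "name": ["Tiny European U", "Large American U"],
    "intl_students_pct": [45.0, 22.0],
    "intl_faculty_pct": [60.0, 35.0],
    "intl_coauthorship_pct": [80.0, 40.0],
})
universities["score"] = international_score(universities)
print(universities.sort_values("score", ascending=False)[["name", "score"]])
```

Note how the arithmetic itself guarantees the result: a small institution in a small country can max out all three percentages in ways a large American university structurally cannot.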

But when their editors start tweeting – presumably as clickbait – about how shocking it is that only ONE American university (MIT, if it matters to you) makes the top 100, you have to wonder if they've started drinking their own Kool-Aid. Read that list of 100 again; take a look at who's on it, and think about who's not. Taken literally, the THE is saying that places like the National University of Ireland, Maynooth, the University of Tasmania, and King Abdulaziz University are more international than Harvard, Yale, and Stanford.

Here's the thing about rankings: there's no way to do validity testing other than what I call the "fall-down-laughing test". Like all indicator systems, rankings are meant to proxy reality rather than represent it absolutely. But since there's no independent standard of "excellence" or "internationalization" in universities, the only way you can determine whether the indicators and their associated weights actually "work" is by testing them in the real world and seeing if they look "mostly right" to the people who will use them. In most international ranking systems (including the THE's), this means ensuring that either Harvard or Stanford comes first: if your rankings put, say, Tufts or Oslo at #1, they fail the fall-down-laughing test, because "everybody knows" Harvard and Stanford are 1-2.
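If you wanted to be cheeky about it, the test is simple enough to write down. A sketch, not a serious validity standard, with Harvard and Stanford as the expected front-runners purely because "everybody knows" they belong there:

```python
# A tongue-in-cheek sketch of the "fall-down-laughing test": a ranking
# passes if its top spot goes to an institution "everybody knows" belongs
# there. The expected leaders are the post's example, nothing more formal.
def fall_down_laughing_test(ranking, expected_leaders=("Harvard", "Stanford")):
    """Return True if the ranking's #1 looks 'mostly right'."""
    return bool(ranking) and ranking[0] in expected_leaders

assert fall_down_laughing_test(["Harvard", "Stanford", "MIT"])      # passes
assert not fall_down_laughing_test(["Tufts", "Oslo", "Cambridge"])  # fails
```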

The THE's ranking of "international" schools comprehensively fails the fall-down-laughing test. In no world would sane academics agree that King Abdulaziz and Maynooth are more international than Harvard. The only way one could possibly believe this is if one has reached the point of believing that specifically chosen indicators actually *are* reality, rather than proxies for it. The Times Higher has apparently now gone down that particular rabbit hole.


4 responses to "When the Times Higher Education Rankings Fail the Fall-Down-Laughing Test"

  1. Still, though, laughter probably isn't going to stop them, any more than it stops people reading top-whatever lists on Twitter (the Buzzfeed-ification of higher ed?). But one possible antidote (more of a kludge, really) is to find a way to interact with the data and to nuance it more. I spent some time in Maynooth (and I take your point!), but one possible fix might be to add further measurements: the size of the country (which would bump Ireland down the list), students from out of state, national population, etc.
    There are tools being developed (I am thinking of http://sgratzl.github.io/paper-2013-lineup/) that would facilitate this. It seems more likely that the rankings will be developed further than that they will stop being used. The THE at least has the option to climb out of the rabbit hole by claiming to be more open and nuanced than the alternatives out there, adding more and more options. Until U-Multirank takes over the planet, at least.

    1. Hi Andrew. Thanks for writing in. I hadn’t seen that before – it’s very cool. Thanks for the pointer.
