Posted tagged ‘university league tables’

Profiling (or ranking?) universities

January 14, 2014

Right at the end of 2013, while most were still digesting their Christmas dinners and Ireland was more or less closed down, the Higher Education Authority – the funding council of the Irish higher education sector – published a report entitled Towards a Performance Evaluation Framework: Profiling Irish Higher Education.  In his introduction to the report, the HEA’s chief executive, Tom Boland, describes its purpose as follows:

‘The development by the HEA of the institutional profiles presented in this report is intended to support higher education institutions in their strategic performance management in order to maximise the contribution of each both to the formation of a coherent higher education system and to national development. This on-going work is therefore fundamental to the implementation of the national strategy, particularly in respect of the imperative to align institutional strategies and national priorities, and to foster and clarify mission-diversity. Rather than reflecting any desire to instigate a ranking system, this report signals the HEA’s intention to work in partnership with all higher education institutions to ensure that the system as a whole advances the national priorities set out by the Government—for economic renewal, social cohesion and cultural development, public sector reform, and for the restoration and enhancement of Ireland’s international reputation.’

The bulk of the report then contains metrics for each institution, including student data, research performance and financial information. So for example we learn that it costs, on average, €10,243 p.a. to educate a student in an Irish university, with the cost ranging in individual institutions from €8,765 in NUI Maynooth to €11,872 in University College Cork. We also find out that the student/staff ratio in Irish institutions ranges from 19.5:1 in Dublin City University to 30.1:1 in NUI Maynooth. In research terms the institutions’ citation impact ranges from 0.6 in the University of Limerick to 1.7 in Trinity College Dublin, with most other universities clustering around the world average of 1.0.

What does this kind of information tell us? Or more particularly, to what use will it be put? Tom Boland emphasises in the passage quoted above that the intention is not to ‘instigate a ranking system’, though others could of course use the metrics to do just that. The data can, as the HEA suggests, be used by institutions themselves ‘in their strategic performance management’ (presumably in setting and assessing key performance indicators), or, as the HEA also suggests, to assess whether institutions are advancing government priorities.

In fact, university ‘profiling’ is all the rage, and not just in Ireland. The European Union’s ‘U-Multirank’ project, which is supposed to go live early this year, is something similar:

‘Based on empirical data U-Multirank will compare institutions with similar institutional profiles and allow users to develop personalised rankings by selecting indicators in terms of their own preferences.’

This too will, or so it seems to me, be an exercise in institutional profiling, presenting metrics that can be used to generate comparisons, i.e. rankings.

I don’t really doubt that, as recipients of public money, universities should present transparent data showing how it is being spent and what value it is generating. But comparisons between institutions based on such data always carry some risk. So for example, DCU’s student/staff ratio looks more favourable because the university has a much larger focus on science and engineering than other Irish universities, and laboratory work requires more staff input. NUI Maynooth is ‘cheap’ because the bulk of its teaching is in the humanities, which are less resource-intensive. This context may not be immediately obvious to the casual observer, who may therefore be driven to questionable conclusions. Ironically some of these risks are not so prominent in the more mature league tables, such as the Times Higher Education global rankings, which will already have allowed for such factors in their weightings. The raw data are more easily misunderstood.
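The point about subject mix can be made with simple arithmetic: before comparing raw student/staff ratios, one would ask what ratio each institution would be expected to have given the disciplines it teaches. The sketch below uses entirely invented benchmark figures and subject mixes; it illustrates the adjustment, not any real university’s data.

```python
# Toy illustration (all numbers invented) of why raw student/staff
# ratios mislead: compute the ratio a university would be *expected*
# to have given its subject mix, then compare with the actual figure.

# Hypothetical benchmark ratios per discipline group: lab-based
# subjects need more staff per student than classroom-based ones.
BENCHMARK = {"science_engineering": 16.0, "humanities": 28.0}

def expected_ratio(mix):
    """Weighted average of benchmark ratios for a given subject mix."""
    return sum(share * BENCHMARK[subject] for subject, share in mix.items())

# Invented subject mixes for two stylised universities.
lab_heavy = {"science_engineering": 0.7, "humanities": 0.3}
arts_heavy = {"science_engineering": 0.2, "humanities": 0.8}

# The lab-heavy institution "should" run a lower ratio anyway, so a
# favourable raw figure may say little about relative resourcing.
print(round(expected_ratio(lab_heavy), 1))   # 19.6
print(round(expected_ratio(arts_heavy), 1))  # 25.6
```

On this toy calculation, a lab-heavy institution reporting 19.5:1 is merely at its expected level, while an arts-heavy one at the same figure would be unusually well staffed.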

It seems to me that institutional profiling is not necessarily preferable to rankings. And it could be open to misuse.


Global rankings: the awe, the envy, the hate, the fear

June 18, 2011

If university rankings have become something of an industry, it is almost as nothing compared with the industry of assessing and (usually) criticising rankings. League tables are everywhere, but they are not as ubiquitous as the critiques of them. In fact if I were compiling a new table I would be pretty distraught if I couldn’t attract a heap of robust denunciations.

Some of these critiques are exercises in hyperbole, and some are more measured. The latest in the latter category is an assessment published by the European Universities Association, Global University Rankings and their Impact. The author, Latvian professor Andrejs Rauhvargers, argues that all the better known league tables really only measure research output, that they pretty much ignore most of the world’s 17,000 or so universities because they use criteria that are irrelevant to them, but that on the other hand they cause universities to try to behave like Harvard or Cambridge (however hopeless that objective might be), thereby destroying diversity of mission.

It is not difficult to sympathise with this perspective, not least because some of his conclusions are correct. But then again, as the author also acknowledges, rankings are popular, and they have focused attention not just on the performance of individual institutions, but perhaps more importantly on national higher education policies and priorities. Countries wanting to be recognised as knowledge societies need their universities to perform well in global rankings.

The truth is, really, that global rankings are here to stay, and that they will continue to recognise scholarly output above anything else, though with some modifiers. Personally I have no great problem with that. I would also be quite relaxed about the impact of global rankings on individual universities. I don’t actually see any destruction of diversity: Harvard is totally different from Caltech, and both are in the top 10 in the key league tables. I would also suggest that aiming for a particular range within the rankings – say, an institutional aim to be in the global top 100, or to be ranked at all – is not sensible. It is far better to pursue an institutional mission in the most excellent way available, and let league table position be a by-product rather than a strategic aim; it is not in any case an aim that institutional strategy alone can deliver. I was of course pleased when my last university, DCU, entered the global top 300 in the Times Higher Education rankings, but it was never one of our strategic aims.

And in the meantime, if rankings increase interest in higher education and encourage the provision or facilitation of resources, then that’s good. For those who don’t make it into the top rankings, that isn’t the end of the world either; many are highly successful and generate considerable income without ever being ranked.

So maybe a good idea for the rankings sceptics is to let go of this obsession and just move on.

Taking charge of your own university rankings

April 15, 2011

Whenever I raise the topic of university rankings, I always get readers who, either in comments made here or in emails sent offline, will suggest that I really shouldn’t be paying so much attention to them or encouraging their authors. I know very well that many academics are very sceptical about league tables and don’t believe that they reflect any sort of reality; or they suspect that rankings prompt inappropriate behaviour by university managers, or in some other way undermine academic integrity.

In reality, however, league tables are part of the landscape, and this is so in part because those who want to enter into any kind of relationship with universities – whether as students or as faculty or as business partners or as donors – take them seriously and want to have them as a guide. We may wish that it were otherwise, but it isn’t. This being so, we need to engage with them, and in that way help to ensure that they are reasonable and accurate and transparent. So for example, the transformation over the past year or two of the Times Higher Education world rankings owes a lot to academic interaction with the journal and with the company they selected to manage the project.

The best known world rankings – those run by Times Higher Education and by Shanghai Jiao Tong University – have one important thing in common: the global top 10 universities are exclusively American and British. This is tolerated by Asian institutions that believe they are rising up the tables and are biding their time, but it disturbs the European Union and its member states.  In both rankings the top non-British EU university only comes in at number 39 (French in each table, but not the same university).

Because of this the EU has set out to design its own rankings, to be known as U-Multirank. The thinking behind this is that the established league tables are too much focused on research outputs, and in particular on science research; they neglect teaching and don’t encourage diversity of mission, and they drive universities into strategies that they don’t have the means to deliver. So the new rankings are to be weighted differently, so that the resulting table would be more balanced; and moreover they are to allow users to design and weight their own criteria, so that students (say) can create their own league table that more accurately reflects the strengths they are looking for in considering universities.
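The user-weighted idea can be sketched in a few lines: normalise each indicator across institutions, let the user supply the weights, and sort by the weighted sum. Different weights then yield different league tables. The institutions, indicators, and figures below are all invented for illustration; this is my sketch of the general mechanism, not U-Multirank’s actual methodology.

```python
# Sketch of a user-weighted ranking: each user's choice of weights
# over a set of indicators produces a different league table.
# All institution names and indicator values are made up.

def rank(institutions, weights):
    """Order institutions by a weighted sum of min-max normalised indicators."""
    indicators = weights.keys()
    lo = {k: min(inst[k] for inst in institutions.values()) for k in indicators}
    hi = {k: max(inst[k] for inst in institutions.values()) for k in indicators}
    scores = {
        name: sum(
            weights[k] * (vals[k] - lo[k]) / (hi[k] - lo[k])
            for k in indicators
        )
        for name, vals in institutions.items()
    }
    return sorted(scores, key=scores.get, reverse=True)

data = {  # invented figures, for illustration only
    "Univ A": {"teaching": 70, "research": 90, "employability": 60},
    "Univ B": {"teaching": 85, "research": 55, "employability": 80},
}

# A research-focused user and a teaching-focused user get opposite tables.
print(rank(data, {"teaching": 0.2, "research": 0.7, "employability": 0.1}))
print(rank(data, {"teaching": 0.7, "research": 0.1, "employability": 0.2}))
```

The first call puts the research-strong institution on top, the second the teaching-strong one, which is exactly the property that makes a single authoritative table impossible under this design.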

Can this work? In my view, no – probably not. Rankings are not really meant to provide a method of institutional profiling, but rather are designed to set out a kind of reputational gold standard. They are not the answer to the question ‘what kind of institution is this?’ – rather, they answer the question ‘what does the world think of this institution?’ This may not be a scientific answer, or else all rankings would give us the same results, but it is an attempt at standardising external evaluation. Also, too many people will think of U-Multirank as an attempt to support the somewhat lesser-known European universities by designing the rules to suit them.

Still, if you’re interested, the U-Multirank project is coming to the end of a feasibility evaluation and, if this supports the idea (as it will), it will be rolled out some time over the next year or two. It will be interesting to see whether it attracts support. I suspect that it will not displace the pre-eminence of Times Higher.

And now, the Times Higher Education rankings

September 16, 2010

Well, I did say that it was the rankings season. And here’s another, and perhaps the one most likely to be seen as definitive: the World University Rankings issued by the journal Times Higher Education. It’s hard to know whether to describe these as new or the latest in a series. Times Higher have been publishing rankings for a few years now, but previously these were prepared in collaboration with Quacquarelli Symonds. In 2009 they parted company with QS, and this year the Times Higher rankings have been prepared in collaboration with Thomson Reuters. The methodology used is different, and so it may be better to see this as a completely new exercise, rather than a new edition of the old one. In any case, QS have continued with their own rankings, as we noted last week.

Before getting to the actual outcomes, the following rather eccentric element should be noted. It had been announced before today that the Times Higher rankings would only list the top 200 global universities. And so indeed it is, and you can find the table here. However, they are also selling an iPhone app through iTunes, and with this (and only this) you can access the top 400. This allows us to list more of the Irish universities, because only two appear in the top 200, while a further three are in the 200-400 range.

Unlike the QS rankings last week, Times Higher has the United States leading the field. The top university is Harvard, and it is followed by four other US universities (though not Yale, which comes in at number 10). The top non-US universities are Cambridge and Oxford, tied at number 6. The top non-US/UK university is the Swiss Federal Institute of Technology Zürich at number 15. The top non-UK European Union university is France’s École Polytechnique at number 39. Overall, while there are a good few European universities in the rankings, they are not challenging the US/UK dominance. Some Asian universities make an appearance, as do Canadian and Australian ones.

And what about the Irish institutions? Here is the table:

TCD          76
UCD          94
UCC         243
NUI Galway  299
DCU         313
DIT         347

NUI Maynooth and the University of Limerick are not in the top 400.

So while Trinity College still leads the field, UCD is catching up. No Irish university has made it into the top 50. And as I have noted before, these positions are likely to slip further as the funding cuts start to bite even more.

And what’s next? The Sunday Times league table for Irish universities is due out next weekend.

The month of league tables

September 3, 2010

In the course of this month (September) two league tables will be published that, whatever we may think of them, will determine some people’s views of Irish higher education. They are the Times Higher Education world rankings (now compiled with Thomson Reuters), which will be issued on September 16, and the Sunday Times University Guide, which is typically published around September 20. Just to complicate the picture, Quacquarelli Symonds Limited (QS), who until 2009 were the Times Higher partners for the world rankings, will continue to publish their own league table, expected to be out in October.

What can we expect to see this year? The Sunday Times rankings are extremely unpredictable; the only university to have maintained the same position, as Ireland’s number 1, in every year is Trinity College. All other universities have jumped around the table; DCU, for example, has over the years occupied every position except top and bottom. But if you look at the precise points scored, you will see that the Irish universities are not far apart, which may explain the erratic positioning.

The Times Higher rankings may look completely different this year. For a start, they will only publish the list up to number 200. On present performance, that would mean that only three universities (TCD, UCD and UCC) will appear. But in fact they have also completely changed the performance indicators used and their weightings, with more emphasis on teaching than before and less on stakeholder and peer opinions; over 60 per cent of the score will however derive in one way or another from research. In the light of the tricky funding issues facing Irish higher education and their impact on quality, we must in any case expect Ireland’s universities to begin slipping in these rankings, though it is possible that this will not become fully visible until 2011.

The QS rankings may be the ones that work best for Irish universities, though at this stage it is difficult to know what standing they will have globally in the light of the parting of ways between QS and Times Higher.

One way or another, we will hear a lot more about league tables over the weeks ahead.

World rankings

October 8, 2009

Today sees the publication of the QS/Times Higher Education world university rankings, and Irish universities have done well. Five of the seven are in the top global 300, ranked as follows (last year’s position in brackets): TCD 43 (48); UCD 89 (108); UCC 207 (226); NUI Galway 243 (368); DCU 279 (302). Though some academics are sceptical about any league tables, nevertheless the improvement in the rankings for all the Irish institutions is welcome.

However, the reality is that in the current funding environment it is unlikely that these positions can be sustained, let alone improved further. The cuts currently being applied are likely to have a significant impact on the relevant metrics, so that the international competitiveness of Irish universities will be compromised. It may of course be that this is a price we simply have to accept. But given the known impact that the quality of the universities has on decisions on foreign direct investment, there is something dangerous about the risk we may be taking.

The resourcing and funding of universities in times of recession is a topic that needs to be discussed more urgently. It is to be hoped that the Minister for Education, Batt O’Keeffe TD, will now move swiftly to disclose his proposals for tuition fees and related matters and open them to analysis.