Posts Tagged ‘U-Multirank’

Regarding rank, again

September 5, 2016

While most university heads will at some point declare that they dislike and are suspicious of global rankings, in reality they do pay significant attention to them. This week sees the publication of the QS World University Rankings. One of the trends apparently captured by QS is an interesting one: almost without exception, English universities have slipped in the rankings since 2015. This trend appears to be particularly English, as Scottish universities have a more varied performance.

Globally, Asian universities are on the rise, while European ones (including Ireland’s) are in apparent decline. United States universities are, on the whole, doing well.

Some may see the performance of English universities as surprising, given that the fees regime of the UK government has given many of them access to more cash and resources; but this does not seem to translate into higher places in the rankings.

Can we actually conclude anything of use from all this? Do these rankings provide potential users (students, industry partners, others) with any worthwhile information? Is a place in the rankings a valid strategic objective? These questions are hardly addressed now in the major higher education debates: many hate the league tables but feel they have no choice but to play the game. That this game has to be played competitively in order to matter at all is shown by the failure of the EU’s U-Multirank project to make any real impression.

Rankings are in fact now a lucrative business. That does not of itself make them bad; indeed, they may tell us things that could have a useful influence on policy. But my advice to universities is not to build strategy around them. Our mission is to provide high quality teaching, valuable research and effective outreach and knowledge leadership. Our strategy must be to succeed in those objectives and to be excellent in communicating that success.


Where will the world’s leading universities be?

May 20, 2014

How countries and regions respond to dramatic economic circumstances can have significant longer term effects on the global balance of power. Two historical developments, for example, shaped the world’s political make-up for the later 20th century: the financial fall-out from the First World War, when US dollars moved in to bankroll some of the key European combatants, including Britain; and the Roosevelt administration’s New Deal response to the Great Depression. The Second World War, while significant in that its outcome temporarily side-lined Germany as a major power, merely reinforced what was already a fact in international relations: the supremacy of the United States. Furthermore, the decline of Britain in the 1940s and 1950s, and later the collapse of the Soviet Union in the late 1980s, demonstrated that military muscle not supported by economic power was actually a handicap rather than a support, a point underscored also by the rise of Japan and (West) Germany in the 1960s.

The recent recession, which may now at last be coming to an end in global markets, will probably also leave a significant legacy, and this time it is higher education that may see some of the major changes. In itself that is not new. The ability of the United States to consolidate its global economic dominance in the 1950s was hugely supported by major investment in higher education, and by the tendency of the US to attract and retain talented scientists and academics from across the world to add excellence to its universities. When we see the global university rankings, we don’t just discover where to find higher education excellence, we observe the world’s power structures.

The question now is whether those rankings will still look the same in 10 years’ time. Many presume that the position of Asian universities will have improved dramatically, as the key countries there are channelling big investments into their higher education systems right now. Not just China (which has been investing huge sums in its universities), but also Japan, Singapore, Malaysia and Thailand are taking aggressive steps to give their universities a chance of global recognition. But this is coming at a time when the major western countries in North America and Europe talk the language of higher education development while simultaneously withdrawing the resources. For some time now the University of California system, containing arguably the best cluster of public universities in the world, has been under serious threat due to funding cutbacks. In Europe the rankings show no sign that any national sector other than the British is on the rise.

However, I believe that the US will turn itself around and continue to drive global excellence in its higher education, even if it may find itself sharing the limelight a little more with universities from Asia. But in Europe? The signs are not necessarily that great. Even the new U-Multirank ranking system that has been devised in Europe (with the hope held in some quarters that it would return more European universities in the top places) still shows American universities leading the field. To change this, countries in this part of the world need to show ambition and vision in their higher education policies. If they don’t, we in Europe are a community of nations doomed to slip into the second tier and stay there. It’s not too late to correct this, but there isn’t much time.

Profiling (or ranking?) universities

January 14, 2014

Right at the end of 2013, while most were still digesting their Christmas dinners and Ireland was more or less closed down, the Higher Education Authority – the funding council of the Irish higher education sector – published a report entitled Towards a Performance Evaluation Framework: Profiling Irish Higher Education.  In his introduction to the report, the HEA’s chief executive, Tom Boland, describes its purpose as follows:

‘The development by the HEA of the institutional profiles presented in this report is intended to support higher education institutions in their strategic performance management in order to maximise the contribution of each both to the formation of a coherent higher education system and to national development. This on-going work is therefore fundamental to the implementation of the national strategy, particularly in respect of the imperative to align institutional strategies and national priorities, and to foster and clarify mission-diversity. Rather than reflecting any desire to instigate a ranking system, this report signals the HEA’s intention to work in partnership with all higher education institutions to ensure that the system as a whole advances the national priorities set out by the Government—for economic renewal, social cohesion and cultural development, public sector reform, and for the restoration and enhancement of Ireland’s international reputation.’

The bulk of the report then contains metrics for each institution, including student data, research performance and financial information. So for example we learn that it costs, on average, €10,243 p.a. to educate a student in an Irish university, with the cost ranging in individual institutions from €8,765 in NUI Maynooth to €11,872 in University College Cork. We also find out that the student/staff ratio in Irish institutions ranges from 19.5:1 in Dublin City University to 30.1:1 in NUI Maynooth. In research terms the institutions’ citation impact ranges from 0.6 in the University of Limerick to 1.7 in Trinity College Dublin, with most other universities clustering around the world average of 1.0.

What does this kind of information tell us? Or more particularly, to what use will it be put? Tom Boland emphasises in the passage quoted above that the intention is not to ‘instigate a ranking system’, though others could of course use the metrics to do just that. The data can be used, as the HEA suggests, by institutions themselves ‘in their strategic performance management’ (presumably in setting and assessing key performance indicators), or, as the HEA also suggests, to assess whether institutions are advancing government priorities.

In fact, university ‘profiling’ is all the rage, and not just in Ireland. The European Union’s ‘U-Multirank’ project, which is supposed to go live early this year, is something similar:

‘Based on empirical data U-Multirank will compare institutions with similar institutional profiles and allow users to develop personalised rankings by selecting indicators in terms of their own preferences.’

This too will, or so it seems to me, be an exercise in institutional profiling, presenting metrics that can be used to generate comparisons, i.e. rankings.
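The mechanics of such a personalised ranking are simple enough. Here is a minimal sketch in Python of the kind of calculation U-Multirank describes: the user selects indicators and weights, and institutions are ordered by the weighted score. The institution names, indicator names and scores are all hypothetical, invented purely for illustration.

```python
# Hypothetical institutional profiles: each value is an indicator score
# assumed to be already normalised to a 0-1 scale.
profiles = {
    "University A": {"teaching": 0.9, "research": 0.5, "outreach": 0.7},
    "University B": {"teaching": 0.6, "research": 0.9, "outreach": 0.4},
    "University C": {"teaching": 0.7, "research": 0.7, "outreach": 0.8},
}

def personalised_ranking(profiles, weights):
    """Rank institutions by the weighted average of user-selected indicators.

    Indicators absent from `weights` are simply ignored, which is how a
    user 'selects' the indicators that matter to them.
    """
    total = sum(weights.values())
    scores = {
        name: sum(indicators[k] * w for k, w in weights.items()) / total
        for name, indicators in profiles.items()
    }
    # Highest weighted score first
    return sorted(scores.items(), key=lambda item: item[1], reverse=True)

# A prospective student who cares mostly about teaching quality:
ranking = personalised_ranking(profiles, {"teaching": 3, "research": 1})
```

With these invented numbers, the teaching-weighted user sees University A at the top, while a research-weighted user would see a different order from the very same data, which is precisely why such profiles double so easily as rankings.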

I don’t really doubt that as recipients of public money universities should present transparent data as to how this is being spent and what value is being generated by it. But comparisons between institutions based on such data always carry some risk. So for example, DCU’s student/staff ratio looks more favourable because the university has a much larger focus on science and engineering than other Irish universities, and laboratory work requires more staff input. NUI Maynooth is ‘cheap’ because the bulk of its teaching is in the humanities, which are less resource-intensive. This information may not be immediately obvious to the casual observer, who may therefore be driven to questionable conclusions. Ironically, some of these risks are not so prominent in the more mature league tables, such as the Times Higher Education global rankings, which will already have allowed for such factors in their weightings. The raw data are more easily misunderstood.

It seems to me that institutional profiling is not necessarily preferable to rankings. And it could be open to misuse.

Taking charge of your own university rankings

April 15, 2011

Whenever I raise the topic of university rankings, I always get readers who, either in comments made here or in emails sent offline, will suggest that I really shouldn’t be paying so much attention to them or encouraging their authors. I know very well that many academics are very sceptical about league tables and don’t believe that they reflect any sort of reality; or they suspect that rankings prompt inappropriate behaviour by university managers, or in some other way undermine academic integrity.

In reality, however, league tables are part of the landscape, and this is so in part because those who want to enter into any kind of relationship with universities – whether as students or as faculty or as business partners or as donors – take them seriously and want to have them as a guide. We may wish that it were otherwise, but it isn’t. This being so, we need to engage with them, and in that way help to ensure that they are reasonable and accurate and transparent. So for example, the transformation over the past year or two of the Times Higher Education world rankings owes a lot to academic interaction with the journal and with the company it selected to manage the project.

The best known world rankings – those run by Times Higher Education and by Shanghai Jiao Tong University – have one important thing in common: the global top 10 universities are exclusively American and British. This is tolerated by Asian institutions that believe they are rising up the tables and are biding their time, but it disturbs the European Union and its member states.  In both rankings the top non-British EU university only comes in at number 39 (French in each table, but not the same university).

Because of this the EU has set out to design its own rankings, to be known as U-Multirank. The thinking behind this is that the established league tables are too much focused on research outputs, and in particular on science research; they neglect teaching and don’t encourage diversity of mission, and they drive universities into strategies that they don’t have the means to deliver. So the new rankings are to be weighted differently, so that the resulting table would be more balanced; and moreover they are to allow users to design and weight their own criteria, so that students (say) can create their own league table that more accurately reflects the strengths they are looking for in considering universities.

Can this work? In my view, no – probably not. Rankings are not really meant to provide a method of institutional profiling, but rather are designed to set out a kind of reputational gold standard. They are not the answer to the question ‘what kind of institution is this?’ – rather, they answer the question ‘what does the world think of this institution?’ This may not be a scientific answer, or else all rankings would give us the same results, but it is an attempt at standardising external evaluation. Also, too many people will think of U-Multirank as an attempt to support the somewhat lesser known European universities and design the rules to suit them.

Still, if you’re interested, the U-Multirank project is coming to the end of a feasibility evaluation and, if this supports the idea (as it will), it will be rolled out some time over the next year or two. It will be interesting to see whether it attracts support. I suspect that it will not displace the pre-eminence of Times Higher.