Profiling (or ranking?) universities

Right at the end of 2013, while most were still digesting their Christmas dinners and Ireland was more or less closed down, the Higher Education Authority – the funding council of the Irish higher education sector – published a report entitled Towards a Performance Evaluation Framework: Profiling Irish Higher Education.  In his introduction to the report, the HEA’s chief executive, Tom Boland, describes its purpose as follows:

‘The development by the HEA of the institutional profiles presented in this report is intended to support higher education institutions in their strategic performance management in order to maximise the contribution of each both to the formation of a coherent higher education system and to national development. This on-going work is therefore fundamental to the implementation of the national strategy, particularly in respect of the imperative to align institutional strategies and national priorities, and to foster and clarify mission-diversity. Rather than reflecting any desire to instigate a ranking system, this report signals the HEA’s intention to work in partnership with all higher education institutions to ensure that the system as a whole advances the national priorities set out by the Government—for economic renewal, social cohesion and cultural development, public sector reform, and for the restoration and enhancement of Ireland’s international reputation.’

The bulk of the report then contains metrics for each institution, including student data, research performance and financial information. So for example we learn that it costs, on average, €10,243 p.a. to educate a student in an Irish university, with the cost ranging in individual institutions from €8,765 in NUI Maynooth to €11,872 in University College Cork. We also find out that the student/staff ratio in Irish institutions ranges from 19.5:1 in Dublin City University to 30.1:1 in NUI Maynooth. In research terms the institutions’ citation impact ranges from 0.6 in the University of Limerick to 1.7 in Trinity College Dublin, with most other universities clustering around the world average of 1.0.

What does this kind of information tell us? Or more particularly, to what use will it be put? Tom Boland emphasises in the passage quoted above that the intention is not to ‘instigate a ranking system’, though others could of course use the metrics to do just that. The data can also be used, as the HEA suggests, by institutions themselves ‘in their strategic performance management’ (presumably in setting and assessing key performance indicators), or, as the HEA also suggests, to assess whether institutions are advancing government priorities.

In fact, university ‘profiling’ is all the rage, and not just in Ireland. The European Union’s ‘U-Multirank’ project, which is supposed to go live early this year, is something similar:

‘Based on empirical data U-Multirank will compare institutions with similar institutional profiles and allow users to develop personalised rankings by selecting indicators in terms of their own preferences.’

This too will, or so it seems to me, be an exercise in institutional profiling, presenting metrics that can be used to generate comparisons, i.e. rankings.

I don’t really doubt that as recipients of public money universities should present transparent data as to how this is being spent and what value is being generated by it. But comparisons between institutions based on such data always carry some risk. So for example, DCU’s student/staff ratio looks more favourable because the university has a much larger focus on science and engineering than other Irish universities, and laboratory work requires more staff input. NUI Maynooth is ‘cheap’ because the main bulk of its teaching is in the humanities, which are less resource-intensive. This information may not be immediately obvious to the casual observer, who may therefore be driven to questionable conclusions. Ironically some of these risks are not so prominent in the more mature league tables, such as the Times Higher Education global rankings, which will already have allowed for such factors in their weightings. The raw data are more easily misunderstood.

It seems to me that institutional profiling is not necessarily preferable to rankings. And it could be open to mis-use.

7 Comments on “Profiling (or ranking?) universities”

  1. Al Says:

    I had to tip my hat to it.
    We need to be developing some internal evaluation to reflect on rather than the usual theatrical reaction to international rankings.
    The morning news reports, with the political reaction on the 10 am and drive-time shows…
    Now if they could include a metric outlining wage levels over 100k..

  2. V.H Says:

    If the HEA is generating all this data and drawing up targets and long-term strategies, why exactly have we your grade and all the admin and management people in the universities themselves? I thought (well, perhaps that’s too strong) – I was under the impression that each university designed its own future, if for no other reason than to make certain we don’t put all the eggs in one basket.
    To go a tad wide-angle on this. If the State administration is determined to steer education in a certain way, surely it would be vastly easier to adjust 1st, 2nd levels and create a K1-2-3 since they control these themselves.

  3. The ‘comparisons = rankings’ conflation seems a bit premature, specifically regarding U-Multirank. It has been specifically designed not to give a one-number ranking, and instead to provide the kind of information to those interested that they are looking for, across various variables and indicators. There are always dangers with any numbers that attempt to quantify the qualitative, which is why they require interpretation and clarification. The handwringing here just seems performative, as though one should be seen to have an issue with all attempts at benchmarking/measurement. Rankings have been around for a decade at this stage, and so our attitudes towards them should move beyond reactions that they are simply ideologically suspect, that they are to be rejected accordingly.

    I would ask, that if this new attempt at improving the situation of rankings is “not necessarily preferable”, then what is? Or what could be added to the U-Multirank system? It includes teaching (which most other rankings don’t), as well as regional engagement, and technology transfer. It is going to be available to use by any number of stakeholders (rather than bureaucrats and administrators, as was previously the case), and this is the expectation today – transparency. Finally, it will include subject/discipline specific indicators for those often-overlooked areas of study (arts, humanities, and social sciences).

    Websites like Eurostat are also open to “mis-use” and mis-interpretation, but is this a good reason to reject them? Open datasets, which can be interrogated and reconfigured in novel ways by the curious, are becoming the norm (data journalism, or groups such as Open Data Dublin, etc.). For a change there is an opportunity for real benchmarking, for real transparency, which no doubt will have room for improvement, and yet it seems that some still want to protect their data fiefdom from the uninitiated, the great unwashed.

  4. foleyg Says:

    There seems to me to be a fundamental contradiction in the way the HEA are going about this. As I understand it, universities are required to produce strategic plans and these, naturally enough, tend to focus on developing each institution in a way that builds on their existing strengths, while also ensuring that the various institutions have unique and complementary identities. There is absolutely no point in DCU trying to emulate Trinity, for example, either for DCU or for the country. Yet the HEA is proposing to evaluate university performance on the basis of a one-size-fits-all set of metrics. I would have thought that each university should be asked to prepare an agreed strategic plan and then assessed on whether or not they achieve their individual targets. It seems to me that what the HEA is proposing is the antithesis of a strategic approach.

    • The fourteen indicators in the HEA report suggest that it isn’t quite ‘one-size-fits-all’. The “top university” represented by the diagrams accompanying the information for each HEI is the ‘ideal’ university, and no one university represents this ideal; indeed the “average” university for each HEI is also represented. Those factors which the HEA has selected (student/staff ratio, international enrolment, mature entrants, etc.) don’t really fall within the realm of strategic plans, which tend to be in the realm of fields/disciplines.
      These indicators are more in line with ascertaining where the Irish HE landscape stands with respect to the international situation. A similar document to the HEA’s report is the LH Martin Institute’s Research Briefing document, “Profiling diversity of Australian universities” from 2013.

      • foleyg Says:

        Need to think about that! Those diagrams are cool though.

      • foleyg Says:

        I can see why this kind of data is useful to the HEA – it does give them a good birds-eye view of the entire system. But if you want to evaluate the performance of any given institution, you really have to look at that institution in terms of what its particular mission is and what goals it has set for itself following discussion with the HEA. To be honest though, I need to read the preamble in the long document and see what exactly they are saying about what they want to get from all of this.
