Posts tagged ‘performance indicators’

Higher education performance

December 13, 2011

The statement usually attributed to the author and management consultant Peter Drucker – that ‘what gets measured gets done’ – has nowhere been adopted as enthusiastically as in higher education over recent decades. Anyone working in a university across much of the world will be aware of performance criteria that govern everything from institutional funding to personal career development. So we assess the student’s ‘learning outcomes’ and examination results, the professor’s publications, the university’s attrition rate; in fact, anything we believe we can measure. The statistical outputs from all this, unmediated by any coherent analysis, are then published in some table or other that, in turn, will determine resources.

It is hard to argue against league tables, because they present an assessment of performance, however imperfect, and thereby allow interested onlookers to form a judgement about institutional quality. Those arguing that universities should be left alone to find their own way of delivering good quality without external interference are not going to find a sympathetic audience: it is not the spirit of the age. Nevertheless, there is also evidence that the search for performance indicators has distorted strategy and sometimes incentivised very questionable policies. For example, it has led to a serious downgrading of teaching as against research. So what should be done?

First, if we are to have performance indicators, we should have fewer of them. In a recent presentation to a meeting on higher education strategy, the chief executive of Ireland’s Higher Education Authority, Tom Boland, listed 32 key performance indicators that could be used to inform strategy. Another example is the list recently proposed for Portuguese universities (in this document, at page 10). The effect of such lists, however, is to reduce strategy to ticking boxes (to use that rather annoying expression); it is no longer strategy but risk management, where the key risk to be managed is the loss of public money.

Secondly, if you are setting up performance indicators, keep them consistent. In the Boland list, inputs are mixed with outputs in a way that is unlikely to produce anything coherent, and relatively trivial indicators compete with more fundamental ones. Looking at them all together, you get no sense of mission or direction; you just have a list.

Thirdly, keep them relevant. Just because something can be measured does not mean that measuring it tells us anything useful. Yet there is ample evidence that reporting on certain aspects has been required not because it is useful but because it is possible.

Overall, it is hard to resist the suspicion that the whole culture of performance indicators has been one of bureaucratisation more than of transparency. And yet clarity about purpose, mission and priorities is important, as is the capacity to report on how successfully these have been pursued. It’s not that we shouldn’t do this; we just need to do it better.


Universities: the need to present better information

February 25, 2010

I recently attended a meeting of people, some of whom worked in universities and some of whom did not. We were discussing the future of the Irish university system, and a number of questions were asked about key performance indicators showing activities and outputs from the sector. Nobody had any of the figures to hand, nor could anyone offer even a rough estimate. Since then I have been trawling the websites of the universities to see whether I could assemble the data that way, and I have had to conclude that I cannot. Some metrics for the sector as a whole are published by the Higher Education Authority (HEA), but others are not. Some details that are published are fairly successfully hidden away on the institutions’ websites: for example, I wanted to find the most recent audited accounts for one particular university, and by searching persistently for about 30 minutes I did eventually find them – but most people would have given up long before then.

One of the charges levelled at universities in recent years is that they do not release, publish and draw attention to key information in a timely manner, or indeed at all. Taking the same university again, I tried to see whether I could find on its website information on student progression and retention, or on the gender and racial/ethnic breakdown of staff, both overall and by grade. Well, if this information is available there, it is so well hidden that I could not find it.

It is fair to say that we have probably all been bad at maintaining openness and transparency in these ways. Getting key information to a wider audience is part of the confidence building we need to do if we are to have broader public support. We must not see the disclosure of information as a threat, or something to be defensive about, but rather as an opportunity to engage our stakeholders. Doing so also avoids the perhaps even worse suspicion or fear that we do not publish such information because, in fact, we do not have it ourselves – in which case one might wonder how we are able to devise our strategies.

Higher education: measuring success

February 10, 2010

On June 11, 2009, the Oireachtas Public Accounts Committee assessed (amongst other things) the allocation of money to and by Science Foundation Ireland. At the end of the session the Comptroller and Auditor General, Mr John Buckley, reflected briefly on how one might assess whether the state is getting an adequate return from the investment in research carried out in universities. This is how he summarised the issue:

The challenge is to manage commercialisation such that there is a return on the State investment. This applies to the university sector where there is a need to ensure the process of protecting intellectual property rights and licensing and so on are in place and exploiting research outputs to ensure they are optimised. Then there is the level of industry or industrial promotion. The challenge here is to ensure there is a pay-off for State assistance and investment in research and development. The point is that this pay-off will be increased to the degree that it is linked with commercialisation and market awareness.

There is very little in that statement to argue with, but it needs to be said that in political debates in particular there is often impatience with what some might regard as the rather vague metrics offered as justification for research investment. The politician’s instinct is always to look for jobs, and to look for them in the here and now. Having got used to sums invested in the IDA producing employment within a short time frame, politicians expect university research to do the same. But as I have noted previously in this blog, the economic impact of research often lies in the job creation it encourages third parties to undertake in the shorter term, and in its commercialisation impact over the much longer term. It is therefore dangerous to look for a direct ‘pay-off’ in the short term, and universities should not encourage such misleading analysis by themselves holding out the prospect of large numbers of jobs directly created by university research, as this promotes a misunderstanding of why research should be funded.

But even when we assess university degree programmes, how do we measure success? The number of students admitted into higher education? The number of successful graduates? The number of access students? The ‘value added’ of improving results (which, perhaps perversely, are sometimes taken as evidence of ‘dumbing down’)? Student satisfaction? Foreign direct investment in areas where graduates provide skills?

It seems clear to me that we are living in an age in which everything that is funded is assessed on whether the funded activities satisfy key performance indicators and can therefore be seen to provide value for money. This is an understandable approach, but its application to higher education is complex. We probably cannot – and probably should not – avoid this trend, but we need to develop a much clearer sense of what indicates that investment in higher education has provided an appropriate return to the taxpayer. We may want to say that the return is vital but literally immeasurable: that it is demonstrated, for example, in an enlightened, skilled, intelligent, entrepreneurial, cultured and adaptable population. But that may not satisfy the spirit of the age, which demands accountability in a more direct sense. So as a sector we should take the lead in developing an understanding of how what we do can be justified in that spirit. This may now be one of our most urgent tasks.