When rank is a question of a balanced fruit diet

April 14, 2000

League tables have come under fire but they are hard to argue against. Mayfield University Consultants consider how to build the best

Despite the criticisms, university league tables are proliferating. Since their beginning in the United States in 1983 and their introduction to the United Kingdom by The Times in 1992, they have spread to other newspapers. In 1999, the Higher Education Funding Council for England got in on the act with its own set of performance indicators. There has also been an increase in the use of league tables in other sectors such as the social services.

One reason for the spread of league tables is that it is hard to argue against them. Universities differ in all sorts of ways, and data about those differences are publicly available; a league table simply uses a measure of the difference as its sort order. So the issue is not whether we should have league tables, but how they might best be constructed.

There are difficulties in using raw data to compile a league table. Judgements have to be made, so the quality of the table depends to some extent on the judgement of the compilers. This does not render them meaningless; there is evidence that the impact of such judgements is small.

The league tables published in The Times, The Sunday Times and The Financial Times in 1999 were compiled with considerable differences in how they handled the data. Despite this, a comparison between them demonstrates that the median difference in the rank of individual universities between any two of the tables is only four or five places. More than 50 per cent differ by five places or fewer between any two of the tables and, depending on which tables are compared, 75-90 per cent by ten places or fewer. The average rank of the pre-1992 universities is 29 in all three tables and that of the post-1992 universities is 77 in all three tables.
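
For readers who want to see what such a comparison involves, the sketch below (in Python) computes a median rank difference between two tables. The universities and ranks are invented for illustration, and the code is not the method used by any of the compilers.

    from statistics import median

    def median_rank_difference(table_a, table_b):
        # Median absolute difference in rank across universities in both tables.
        common = table_a.keys() & table_b.keys()
        return median(abs(table_a[u] - table_b[u]) for u in common)

    # Invented ranks for three universities in two hypothetical tables.
    times_ranks = {"University A": 12, "University B": 40, "University C": 77}
    sunday_times_ranks = {"University A": 15, "University B": 38, "University C": 70}
    print(median_rank_difference(times_ranks, sunday_times_ranks))  # prints 3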

Much of the data seem superficially straightforward, but frequently they turn out to be complex. For example, teaching quality assessments appear to be a simple score out of 24 that can be averaged to create a measure of teaching quality. However, some way of dealing with old-style (excellent, satisfactory, unsatisfactory) and new-style (1-24) scores must be found, as well as an approach to the outcome "highly satisfactory", which occurs only in Scotland.
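
One possible approach, sketched below purely for illustration, is to map the old verbal outcomes onto the 24-point scale before averaging. The particular values in the mapping are assumptions made for the example, not those used in any published table.

    # Illustrative mapping of old-style verbal outcomes onto the 24-point
    # scale; the numbers are assumptions, not values from any published table.
    OLD_STYLE = {
        "excellent": 22,
        "highly satisfactory": 19,
        "satisfactory": 16,
        "unsatisfactory": 8,
    }

    def harmonise(score):
        # Return a value on the 1-24 scale for either style of result.
        if isinstance(score, str):
            return OLD_STYLE[score.lower()]
        return float(score)

    results = ["excellent", 21, "highly satisfactory", 18]
    print(sum(harmonise(s) for s in results) / len(results))  # 20.0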

In addition, it has been suggested that old scores should have less weight than new ones, on the grounds that quality is improving. There is evidence that subject review scores are increasing, but it is not clear whether or not this is due to improving quality. It might be that universities are getting better at convincing subject reviewers, or that their own objectives, against which they are assessed, are getting less ambitious. Even if the increasing scores were due to improving quality, it could be argued that most of the change occurred in the few months leading up to a subject review.

Measures of spending can also be problematic. Separation of library and computer spending is becoming increasingly hard to justify, and spending in these areas can be inconsistent. A major upgrade of, say, a university network could cause a significant increase in spending in one year and so create spurious instability in a league table. Thus there is an argument for taking the average of two or more years in these areas.
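
The sketch below illustrates the idea with invented figures: averaging the two or three most recent years stops a single spike, such as the cost of a network upgrade, from dominating the measure.

    def smoothed_spend(annual_spend, years=2):
        # Average of the most recent `years` figures, e.g. spend per student.
        recent = annual_spend[-years:]
        return sum(recent) / len(recent)

    computing_spend = [310, 295, 560]  # invented figures; the spike is a major upgrade
    print(smoothed_spend(computing_spend, years=2))  # 427.5
    print(smoothed_spend(computing_spend, years=3))  # roughly 388.3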

It is argued that a fundamental flaw in league tables occurs when there is an aggregation of scores on a number of measures into a single overall score. This is not a trivial point, but it may not have the force that is often assumed. It is not unreasonable to suggest that a university that is good at many things should rank above one that is only good at some things. It is not the principle of a fruit salad that is the problem, but the balance between the different fruits and the way they are sliced up.
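
To continue the metaphor, one common way of making the fruit salad is sketched below: each measure is first standardised (the slicing), then combined using weights (the balance). The scores and weights are invented, and this illustrates the general approach rather than the recipe used for any particular table.

    from statistics import mean, stdev

    def z_scores(values):
        # Standardise a measure so that its scale does not dominate the total.
        m, s = mean(values), stdev(values)
        return [(v - m) / s for v in values]

    def overall_scores(scores_by_measure, weights):
        # Weighted sum of standardised scores: one total per university.
        standardised = {k: z_scores(v) for k, v in scores_by_measure.items()}
        n = len(next(iter(scores_by_measure.values())))
        return [sum(weights[k] * standardised[k][i] for k in weights)
                for i in range(n)]

    # Invented scores for three universities on two measures, with invented weights.
    scores = {"teaching": [21.5, 19.0, 22.0], "spend": [430, 610, 380]}
    weights = {"teaching": 2.0, "spend": 1.0}
    print(overall_scores(scores, weights))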

There is a good precedent for aggregation from universities themselves. An unseen three-hour paper, an assessed essay and a practical examination are all different methods of assessment that measure very different skills. Yet universities seem very comfortable with aggregating marks from some or all of them to create an overall mark that determines a degree classification.

For The THES this year we have, as usual, prepared tables of the underlying data used to compile The Times league table. We have also prepared tables of some other measures of universities, relating to their patterns of employment, that are relevant to prospective staff.

League tables (or at least some of them) are getting better. The Times in particular has been open about the issues and has sought to maintain a constructive dialogue with universities to help the process of improvement.

Andrew Hindmarsh, Bernard Kingston, and Robert Loynes are partners in Mayfield University Consultants, which compiled the league tables for The Times and The THES.

League tables, pages i-iv.
