U-Multirank may be a waste of taxpayers’ money


22 March 2012

U-Multirank was conceived by the European Commission as an alternative to the established research-led university rankings, offering a user-driven “multi-dimensional” ranking with a wider range of indicators.

But today, an influential House of Lords committee in the UK has suggested that, more than two years into the multi-million-euro project, U-Multirank may be a waste of taxpayers’ money.

The House of Lords European Union Committee recommends in its report, Modernisation of Higher Education in Europe, that unless the Commission can overcome the perceived deficiencies inherent in all ranking systems, it “should prioritise other activities”.

“In the meantime, rankings such as the Times Higher Education World University Rankings may have a valuable contribution to make,” it adds.


The committee looked at many aspects of the modernisation agenda in European higher education, but the section on U-Multirank stands out.

It says that “most” of the witnesses who gave evidence to the committee “were not convinced by the merits of yet another league table”.


“No one really came out and said this is a good idea,” said Baroness Young of Hornsey, chair of the subcommittee that carried out the inquiry.

The report lists a “series of concerns” raised by witnesses: “about the proposal’s lack of clarity as to whether it would be a ranking or transparency tool; that the league tables market was already too crowded, with each ranking deploying its own methodologies; that it would confuse applicants and be incapable of responding to rapidly changing circumstances in institutional profiles; that it could become a ‘blunt instrument’ which would ‘not allow different strengths across diverse institutions to be recognised and utilised’ and end up being used as the basis for future funding decisions; on the grounds of quality, accuracy and lack of data; and the EU funds could be better spent on other EU priorities.”

Phew! That is quite a list. But there was also concern about whether the project could deliver what it promised.

“The UK Bologna experts…considered that U-Multirank’s success was ‘highly dependent on the extent of institutional engagement, coverage, and accuracy of data used to compile rankings’,” the report says.

Others were concerned at the notion that such a ranking, with EC backing, would be afforded “official” status.

“The Higher Education Funding Council for England acknowledged the Commission’s efforts to overcome some of the limitations of traditional league tables and to render it more objective but advised ‘caution in providing any form of official sanction to any one form of ranking tool’,” the report says.


The committee notes that the UK government believed that U-Multirank “might be useful if it genuinely provided a transparent source of information for students wanting to study abroad”, but it was “not convinced that it would add value if it simply resulted in an additional European rankings system alongside the existing international ranking systems”.

“However, the [universities and science] minister [David Willetts] struck a less positive tone,” the report says, “when he told us that it could be viewed as ‘an attempt by the EU Commission to fix a set of rankings in which [European universities] do better than they appear to do in the conventional rankings’.”


The committee notes the steps Times Higher Education took to improve its World University Rankings during 2010 “in order to apply a different methodology and include a wider range of performance indicators (up from six to 13)”.

“They [THE] told us that their approach seeks to achieve more objectivity by capturing the full range of a global university’s activities – research, teaching, knowledge transfer and internationalisation – and allows users to rank institutions (including 178 in Europe) against five separate criteria: teaching (the learning environment rather than quality); international outlook (staff, students and research); industry income (innovation); research (volume, income and reputation); and citations (research influence).

“In order to inform the revision of their rankings, their data supplier, Thomson Reuters, conducted a global survey which found that many users distrusted the methodology of the existing world rankings. While THE considered that rankings were ‘relatively crude’ and could never be properly objective, they nevertheless considered that, if used appropriately, they could still play a useful role in providing information”.

The Lords committee concludes that the provision of clear information and guidance to students is important in order to assist them in making an informed choice of university.

“However, we also appreciate how difficult it can be to evaluate a wider range of university performance indicators in an objective manner, noting the limitations inherent in many of the existing ranking systems. Therefore it is important that the Commission is clear about the purpose of U-Multirank, what information will be provided and what methodology will be used.”

But until the deficiencies can be overcome, the report adds, “we consider that the Commission should prioritise other activities. In the meantime, rankings such as the Times Higher Education World University Rankings may have a valuable contribution to make”.


Phil Baty is editor, Times Higher Education Rankings


