Rebecca Attwood reports on the latest rival methodologies and on concerns about the tables' effects on institutions.
Universities should boycott league tables, argued one educationist this week as the latest set of rankings was announced.
League tables have always been the subject of heated debate in the sector. But with the announcement by the Higher Education Funding Council for England (Hefce) that it is investigating their impact on institutions' reputations and behaviour, and the emergence of yet another new table earlier this month, scrutiny of university rankings is on the rise.
This week The Times published its annual tables, and it ran straight into a fresh row over its choice of criteria. This year, the newspaper, to which The Times Higher is no longer linked, decided to halve the number of its subject-specific tables, from 64 to 32.
Michael Harloe, vice-chancellor of Salford University, said that the strongest subjects of many universities, including his own, had been excluded.
"The University of Salford, for example, is a world leader in building and built environments," he said. "This will not feature at all in The Times university guide this year. Successful Salford subjects such as aeronautical engineering, social policy and subjects allied to medicine will similarly not appear."
John O'Leary, former editor of The Times Higher and editor of The Times guide, said the smaller number of subjects would be for this year only and was due to "a late change", which did not allow time for more subjects to be properly checked.
"Half had to go and it was always going to be a difficult choice," he said. But he emphasised that no university would suffer in the overall rankings because the main table still covers the whole range.
According to Roger Brown, the former vice-chancellor of Southampton Solent University, tables cannot measure quality and it would be far more honest if newspapers were simply to publish a table of income per student.
"I think universities should decline to collaborate with commercial league tables because the damage they do to higher education isn't justified by the benefits," Professor Brown said.
"I bet you right now most vice-chancellors are sitting down thinking about ways in which they can collect information to improve their league-table score."
Even those who agree on the principle of tables can disagree on methodology, including fundamental elements such as whether to include performance in the research assessment exercise. This can lead to dramatically different placings for institutions.
The Guardian chooses not to include research in its annual tables, instead focusing solely on teaching.
But Bernard Kingston, who previously helped produce The Times' annual tables but who broke away this year to compile a rival good university guide published in The Telegraph, said: "I take a view that, amongst other things, teaching and research are central to what we call a university."
Donald MacLeod, higher education editor at The Guardian, said that if research had a positive impact on teaching, this should come through in the ratings given for teaching.
"If we added it in, we felt it would bias the table towards older, research rich universities," he said.
Those responsible for compiling league tables freely admit that rankings reflect the value judgments of those who put them together.
But Dr Kingston is keen to emphasise that the flexibility brought about by the internet has mitigated this effect.
The Good University Guide and The Guardian websites encourage users to choose their own weightings for the various measures and design their own "personalised" league table. Dr Kingston describes this as a "breakthrough".
"Now readers can home in on the measures that are important to them," he said.
Hefce's project will examine the methods and data behind national newspaper rankings, as well as the world rankings produced by The Times Higher and Shanghai Jiao Tong University.
From this autumn, it will undertake case-study research looking at how universities are responding to league tables, and gather data about their impact via an online survey.
THE RIVAL DATA
Name: The Times Good University Guide
Published in: The Times
Top 10:
Oxford
Cambridge
Imperial College
London School of Economics
St Andrews
University College London
Warwick
Bristol
Durham
King's College London
Bottom 10:
Anglia Ruskin
Southampton Solent
Edge Hill
Cumbria
Middlesex
Greenwich
Lincoln
Thames Valley
Wolverhampton
Liverpool Hope
Who's behind it?
The table was compiled by Bernard Kingston from its inception in 1993 until 2006, but he parted company with The Times and last month launched his own table, The Good University Guide. The Times' guide is edited by John O'Leary, the former editor of The Times Higher Education Supplement. This year's tables were compiled by Exeter Enterprises Ltd, an Exeter University spin-off company.
Website: www.timesonline.co.uk/gooduniversityguide
Methodology:
This year there are eight measures, rather than last year's nine, because of changes to the way data is collected. Student satisfaction, using the results of all 22 questions in the National Student Survey, and research quality, as measured in the 2001 research assessment exercise, are given the heaviest weighting, at 1.5, while other indicators weigh in at 1. They are: entry standards, student-staff ratio, services and facilities spend, completion rate, good honours and graduate prospects. The table ranks 113 institutions, based on Universities UK membership. The number of subjects covered in the subject tables has been reduced from 64 to 32. Subject tables are based on research quality, entry standards and graduate prospects. The majority of the Higher Education Statistics Agency data used dates from 2004-05.
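To make the weighting arithmetic concrete, here is a minimal sketch of how a composite score of this kind could be computed. It assumes each indicator is first normalised as a z-score across institutions (the guide's own normalisation method is not described here), and all names and data are illustrative rather than taken from the published table.

    from statistics import mean, pstdev

    # Indicator weights as described above: student satisfaction and research
    # quality at 1.5, the remaining six measures at 1.
    WEIGHTS = {
        "student_satisfaction": 1.5,
        "research_quality": 1.5,
        "entry_standards": 1.0,
        "student_staff_ratio": 1.0,   # lower is better; invert before normalising
        "services_facilities_spend": 1.0,
        "completion_rate": 1.0,
        "good_honours": 1.0,
        "graduate_prospects": 1.0,
    }

    def rank(raw: dict[str, dict[str, float]]) -> list[tuple[str, float]]:
        """Rank institutions by a weighted sum of z-score-normalised indicators.

        raw maps institution name -> {indicator name: raw value}. The z-score
        normalisation is an assumption made for illustration only.
        """
        normalised: dict[str, dict[str, float]] = {}
        for indicator in WEIGHTS:
            values = [scores[indicator] for scores in raw.values()]
            mu, sigma = mean(values), pstdev(values) or 1.0
            for inst, scores in raw.items():
                normalised.setdefault(inst, {})[indicator] = (scores[indicator] - mu) / sigma
        totals = {
            inst: sum(WEIGHTS[i] * vals[i] for i in WEIGHTS)
            for inst, vals in normalised.items()
        }
        return sorted(totals.items(), key=lambda item: item[1], reverse=True)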
Name: The Good University Guide
Published in: The Telegraph
Top 10:
Cambridge
Oxford
Imperial College
London School of Economics
St Andrews
University College London
Bristol
Warwick
Bath
Durham
Bottom 10:
Thames Valley
Cumbria
Southampton Solent
Middlesex
Wolverhampton
Lincoln
Liverpool Hope
London South Bank
Greenwich
Edge Hill
Who's behind it?
Bernard Kingston, former director of the careers service at Sheffield University, and his company, Mayfield University Consultants, and Miguel Nunes from the department of information studies at Sheffield University.
Website: www.thegooduniversityguide.org.uk/
Methodology:
As with The Times' table, student satisfaction and the results of the RAE are given the greatest weighting, at 1.5. The remaining seven measures, each weighted at 1, are similar to those used by The Times - entry standards, student-staff ratio, academic services spending, facilities spending, good honours, graduate prospects and completion rates. But different calculations are used - for example, student satisfaction is measured using the results of only the first four questions in the National Student Survey. All the Hesa data dates from 2004-05 because the compilers decided to use data from a single year, and full 2005-06 data was not available. The table ranks 113 institutions. The website features 61 subject tables, which judge institutions on research quality, entry standards and graduate prospects.
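To illustrate how a single headline measure can differ between tables, the sketch below contrasts the two ways of deriving a satisfaction score from per-question NSS results. A simple mean of the question scores is assumed purely for illustration; the compilers' actual aggregation is not described in either guide.

    def satisfaction_all_questions(question_scores: list[float]) -> float:
        """Satisfaction based on all 22 NSS questions (simple mean assumed)."""
        assert len(question_scores) == 22
        return sum(question_scores) / len(question_scores)

    def satisfaction_first_four(question_scores: list[float]) -> float:
        """Satisfaction based on the first four NSS questions only (simple mean assumed)."""
        return sum(question_scores[:4]) / 4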
Name: Guardian University Guide 2008
Published in: The Guardian
Top 10:
Oxford
Cambridge
Imperial College
St Andrews
London School of Economics
Edinburgh
Warwick
Loughborough
Bath
Soas
Bottom 10:
Wolverhampton
Swansea Institute
Southampton Solent
Greenwich
Trinity College, Carmarthen
Liverpool Hope
Lincoln
North East Wales Institute
Middlesex
Who's behind it?
The tables are compiled in association with EducationGuardian.co.uk by Campus Pi, an applied research department at Brunel University. Edited by The Guardian's higher education editor, Donald MacLeod.
Website: education.guardian.co.uk/universityguide2008
Methodology:
In contrast to the compilers of the other two tables, The Guardian does not consider research ratings or research funding, but focuses on teaching instead. Its subject tables cover 46 different areas and are based on seven statistical measures. Five of these - spending per student, staff/student ratio, job prospects, a measure comparing students' degree results with their entry qualifications, and entry score - each weigh in at 17 per cent. Two measures use data from the NSS: the student verdict on teaching quality counts for 10 per cent, and students' opinion of the feedback they receive is weighted at 5 per cent. For a university's overall score, the subject scores are weighted according to the number of first-degree students enrolled on each course. A total of 124 institutions are included. The tables are compiled using the most up-to-date data from Hesa, resulting in a mix of 2004-05 and 2005-06 data.
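As a rough illustration of these percentage weightings and the enrolment weighting, the sketch below combines indicator scores for one subject and then aggregates subject scores into an institutional score. It assumes the indicators have already been standardised to a common scale; the field and function names are hypothetical, not The Guardian's own.

    # Percentage weights as described above: five measures at 17 per cent,
    # the NSS teaching verdict at 10 per cent and the NSS feedback verdict at 5 per cent.
    GUARDIAN_WEIGHTS = {
        "spend_per_student": 0.17,
        "staff_student_ratio": 0.17,
        "job_prospects": 0.17,
        "value_added": 0.17,   # degree results compared with entry qualifications
        "entry_score": 0.17,
        "nss_teaching": 0.10,
        "nss_feedback": 0.05,
    }

    def subject_score(indicators: dict[str, float]) -> float:
        """Combine standardised indicator scores for a single subject area."""
        return sum(GUARDIAN_WEIGHTS[name] * indicators[name] for name in GUARDIAN_WEIGHTS)

    def overall_score(subjects: dict[str, tuple[float, int]]) -> float:
        """Institution-wide score: each subject's score is weighted by the number
        of first-degree students enrolled on that course."""
        total_enrolled = sum(enrolled for _, enrolled in subjects.values())
        return sum(score * enrolled for score, enrolled in subjects.values()) / total_enrolled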
THE CASE FOR AND AGAINST RANKINGS
FOR
* League tables aid student choice.
"Young people have to choose which universities to apply to, and if you are not in the system it can be a very difficult area," says Bernard Kingston, who is responsible for The Good University Guide.
* They are independent and objective because they are constructed outside the sector.
* They are an antidote to the more extreme claims made in university prospectuses.
The student complaints ombudsman has warned universities against making promises in their prospectuses that they cannot keep, and exercises conducted at the Institute of Education have shown that assertions made in university prospectuses do not always stand up to scrutiny.
* They increase transparency and encourage universities to improve.
AGAINST
* League tables are biased towards older, wealthier universities.
Research by David Watson and Rachel Bowden has shown a strong correlation between a university's league-table position and its income per student.
* League tables are coloured by the values of those who compile them.
Eyebrows were raised this year as Exeter University rose from 28th place to 17th in The Times' table in the first year that its spin-off firm Exeter Enterprises helped to compile the table. The Times said that the company was at "arm's length" from the university and added that the results were independently verified by Durham University.
* League tables distort university behaviour.
There are examples in the US of higher-education institutions manipulating data, as well as processes and practices, in order to improve their league table positions.
In the UK, The Times Higher revealed in December that Bangor University changed the way it calculated students' degree classifications so that it could award more firsts and upper seconds and improve its league-table position.
* League tables are trying to measure something that cannot be measured.
Roger Brown, the former vice-chancellor of Southampton Solent University, argues: "There can never be perfect, or even adequate, information about quality, mainly because there is not agreement across higher education either about what is meant by quality or about how it should be measured."
* Despite high-ranking universities' claims, league tables do not really aid student choice.
Prospective students from families with no history of higher education can be put off applying to universities by the positions they occupy in league tables, according to research such as Higher Education and Social Class: Issues of Exclusion and Inclusion by Louise Archer, Alistair Ross and Merryn Hutchings.