A new order for research excellence has been established across institutions and disciplines for the first time in seven years, as the results of the 2008 research assessment exercise (RAE) are made public this week.
And while there are no massive changes to the overall research landscape - the biggest research-intensive universities are still clustered at the top of the table of excellence, followed by the smaller research-intensive institutions - there is certainly some significant individual movement, according to a ranking devised by Times Higher Education.
Outside the specialist institutions of the Institute of Cancer Research and the London School of Hygiene and Tropical Medicine (which submitted research in only a few disciplines and gained first and third place respectively), it was the University of Cambridge that ranked highest in the Times Higher Education Table of Excellence (see over), recording the best average score for the quality of its research. Cambridge also came top overall in the previous RAE, in 2001.
Our excellence table presents a quality profile for each institution showing the percentage of staff submitted to the RAE who fall within each of the four RAE research grades (4* for “world leading” down to 1* for “nationally recognised”). Institutions are ranked on a “grade-point average” (a weighted average) of their quality profile using a scale from 0 to 4 (for a detailed explanation of the methodology, see page 30).
Cambridge was followed closely by the London School of Economics and the University of Oxford, which tied for fourth place. But the LSE had more of its research rated 4* (35 per cent) than either Oxford or Cambridge, so it may argue that it put in a stronger performance than its ancient rivals.
Also featuring in the top ten are (in order) Imperial College London, University College London (which held on to seventh place), and the universities of Manchester, Warwick (which fell from sixth to ninth place) and York (which rose from 18th in 2001 to claim tenth place this year). York, with its rise of eight places, is the only institution from the 1994 Group of research-intensive universities to make the top ten.
The research-intensive university that shot highest up the table was Queen Mary, University of London (from 48th in 2001 to 13th). It was followed by the University of Nottingham (37th to 24th). The universities of Loughborough, Leeds and Exeter also put in strong performances as rising stars among research-intensive institutions.
But there were also significant falls among the research-intensive universities.
Cardiff University dropped out of the top ten altogether after coming eighth in 2001. It is now 22nd. Cardiff appears to have sacrificed its quality rating by submitting a high volume of staff. This may help when it comes to its research funding allocation in March, which is based on a combination of research quality and volume of staff.
The University of Southampton fell from 11th to a tie for 14th spot, and the universities of Lancaster and Glasgow slipped as well.
Among the traditionally non-research-intensive universities, there have been some strong performances.
The University of Hertfordshire challenged the middle-ranking universities, rising from 93rd to 58th. The University of Brighton rose from 80th to 59th. Lower down the table, big gains were also made by Anglia Ruskin, Bournemouth and Derby, although it is not clear how far these improvements will translate into funding.
Contextual information on the indicative proportion of RAE-eligible staff submitted for assessment by institutions suggests that some universities were particularly selective, while others submitted virtually all their staff.
Ultimately, however, it is research power (a mixture of excellence and volume) that will determine funding. A final formula simply multiplying average research scores by the volume of staff submitted would see the most cash flow to Oxford (which submitted the most staff), followed by Cambridge, Manchester, UCL, Edinburgh, Nottingham and Imperial College.
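In symbols, that measure amounts to something like the following (our notation; it is a simple formalisation of the calculation described above, not Hefce's published funding formula):

```latex
\text{research power} = \text{GPA} \times \text{FTE staff submitted}
```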
David Sweeney, director of research at the Higher Education Funding Council for England, said that across all the types of league table that could be constructed, “no single institution” had come out on top.
He said: “I don’t think you can say there is one institution that has clearly come top. There are many measures of performance, the choice of which will vary the order. This reflects a fine-grained analysis that gives a lot of detail to institutions. There is no institution that has come top on every measure.”
He said there were “few” examples of institutions that had put in a very small number of people and done well. “Institutions have not bought success by entering a handful of staff,” he said.
Mr Sweeney said that in developing the 2008 RAE, Hefce had taken an “overall view” that it was not appropriate to include the percentage of RAE-eligible staff that a university had excluded from assessment - because it said little about research quality and it penalised those institutions that put a premium on employer engagement or employer co-funded skills and knowledge transfer, rather than on research.
Rankings by subject
While the overall league table may reveal a relatively mature research landscape, there are many more movements within individual subjects. For the first time, it is possible to see who is, in effect, top and bottom in a subject.
A second table compiled by Times Higher Education (page 30) compares the performance of institutions in each of 67 research disciplines. Departments are ranked within each subject based on the “grade-point average” of the quality profile they achieved for that subject, with a scale from 0 to 4 (for detailed methodology, see page 41).
Cambridge came top in 18 of the 50 subjects it entered, beating its nearest rival Oxford, which came top in eight of the 50 categories it entered. Manchester came top in six of the 53 subjects it entered, and the LSE came top in four of its 14 entries.
The results show that some departments that had the highest ratings in certain subjects in 2001 have not repeated their performances. Within physics, Oxford and Southampton (rated in 2001 at the highest level, 5*) ranked 17th and 18th in 2008. Lancaster and Cambridge (also previously 5*) took first and second positions, although Lancaster submitted fewer researchers.
Within history, Imperial College played to its strengths in the study of the history of science. It entered five academics and came top of the history rankings.
The much-touted example from RAE 2001 of Oxford Brookes University’s history department trumping Oxford’s by scoring a 5* to Oxford’s 5 was not repeated in RAE 2008. This time, Oxford took fifth place in history, compared with Oxford Brookes’ 17th. Some 35 per cent of Oxford’s history research activity was graded at the highest level (4*) compared with 25 per cent at Oxford Brookes.
But Oxford may be rueing the day it decided to enter its researchers into the “communication, cultural and media studies” category for the first time. The institution was beaten by a large number of post-1992 universities, including De Montfort, Sunderland, Lincoln, Nottingham Trent, East London and Westminster (which comes second in the RAE Subject Ratings table for the discipline, after the University of Leicester).
Of all the subjects assessed, it is “economics and econometrics” that shines as the UK’s top scorer, nationally averaging more than three (on a scale of 0-4). The lowest-scoring subject overall was “allied health professions and studies”, which scored only just above two on average. “Business and management studies” had the most researchers submitted.
In total, the equivalent of about 8,900 staff had their work rated at the highest grade (4*). In 2001, 25,000 could claim to be in departments that scored the top grade (some 55 per cent of those submitted were in 5* or 5 departments). This time, 16 universities made submissions of just one person.
Mr Sweeney added: “That there is excellence throughout the sector is clear. It was there in 2001, but it is now much more visible.” He said most institutions would have “good stories to tell”.
How you were judged
RAE 2008 differs from previous exercises in that single summative ratings for each university in each discipline have been replaced by “quality profiles” of research activity.
These show in finer detail the quality of the research activity within departments, revealing pockets of excellence wherever they may be as well as reducing the problem of departments falling on the cusp of a grade boundary, which could have a significant impact on funding.
Times Higher Education’s Table of Excellence (pages 28-30) and RAE Subject Ratings table (pages 31-41) are derived from the quality profiles (see methodology, pages 30 and 41).
RAE 2008 was, like its predecessors, a peer-review exercise. It employed a two-tiered panel structure: some 15 main panels oversaw the work of 67 subpanels, each made up of about a dozen academics and users of research, who assessed the quality of work submitted by institutions to 67 discipline-based units of assessment (UoAs).
The profiles show the percentage of research activity in each department judged to fall within each of four quality grades “in terms of originality, significance and rigour”.
The grades are:
- 4* world-leading
- 3* internationally excellent … “but which nonetheless falls short of the highest standards of excellence”
- 2* recognised internationally
- 1* recognised nationally
There is an “unclassified” (U/C) grade for research that falls below the standard of nationally recognised work or does not meet the RAE’s published definition of research.
The profiles are built from three components of each institution’s submission - “research outputs” (with up to four pieces of work submitted per researcher entered), “research environment” and “indicators of esteem”.
Behind the numbers: the methodology used to make the tables
How to read the Table of Excellence
Institutions are ranked according to the average score of the staff they submitted.
For each institution, a quality profile shows the percentage of staff submitted by the institution receiving the RAE grades 4*, 3*, 2*, 1* and unclassified.
The overall average score is the “grade-point average” (GPA) of the institution’s quality profile. To find the GPA, the percentage of staff receiving a 4* grade is multiplied by 4, the percentage receiving a 3* by 3, the percentage receiving a 2* by 2 and the percentage receiving a 1* by 1; the results are added together and divided by 100 to give an average score of between 0 and 4. Unclassified staff are weighted at 0, so they add nothing to the total but still count towards the percentages.
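As a worked illustration of this arithmetic, here is a minimal sketch; the quality profile below is invented for the example and is not taken from the RAE results:

```python
# Minimal sketch of the grade-point average (GPA) described above.
# The example profile is invented for illustration; it is not RAE data.

def gpa(profile):
    """Weighted average of a quality profile whose percentages sum to 100."""
    weights = {"4*": 4, "3*": 3, "2*": 2, "1*": 1, "u/c": 0}
    return sum(weights[grade] * pct for grade, pct in profile.items()) / 100

example = {"4*": 35, "3*": 40, "2*": 20, "1*": 5, "u/c": 0}
print(gpa(example))  # (4x35 + 3x40 + 2x20 + 1x5) / 100 = 3.05
```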
The 2001 rank order is taken from Times Higher Education’s table for RAE 2001. This ranked institutions according to the average score per member of staff submitted, with a score from 1 to 7 representing the seven previous RAE grades (5*, 5, 4, 3a, 3b, 2 and 1); the RAE 2008 table uses an equivalent approach.
The contextual column listing the indicative proportion of RAE-eligible staff submitted is an attempt to show how selective universities have been in choosing which academics to enter into the RAE. It is calculated by dividing the total number of staff that an institution submits to the RAE by the number of academic staff at the institution within the grades “professors”, “senior lecturers and researchers” and “lecturers”, according to the latest published data available from the Higher Education Statistics Agency (Resources of Higher Education Institutions, 2006-07).
There are known to be differences in how institutions classify staff, but the three grades are selected because nearly all staff within them will technically be eligible for the RAE. Two further grades, “researchers” and “other”, are excluded because the former includes many research assistants who are not RAE eligible and the latter generally refers to non-standard academic staff who tend not to be RAE eligible.
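A rough sketch of how that indicative proportion is derived is shown below; the figures are invented, with the numerator in practice coming from the RAE returns and the denominator from the Hesa staff record:

```python
# Illustrative only: invented figures, not RAE or Hesa data.
submitted_fte = 450.0                  # FTE staff returned to the RAE
hesa_staff = 300.0 + 150.0 + 180.0     # professors + senior lecturers and researchers + lecturers

indicative_proportion = submitted_fte / hesa_staff
print(f"{indicative_proportion:.0%}")  # 71% (values above 100% are shown capped in the table)
```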
The information on the indicative proportion of RAE-eligible staff entered should be read with caution and with the following caveats in mind:
- Not all RAE-eligible staff will be captured. At least some staff graded as “researcher” or “other” will be RAE eligible;
- some non-RAE-eligible staff will also be captured;
- there are differences in how Hesa and Hefce calculate full-time equivalent staff numbers;
- the data cover the period 1 August 2006 to 31 July 2007, and therefore predate the RAE 2008 census date of 31 October 2007 for staff in post (Hesa could not provide the 2007-08 data before Times Higher Education went to press);
- the approach is not applicable for a number of universities, including, for example, the University of Huddersfield, which entered all its staff in the “other” category;
- institutions that appear to have returned more than 100 per cent of their staff are capped at 100 per cent with a “greater than” sign (>) to indicate this;
- Hesa has requested the following disclaimer accompany the use of the data: “Hesa holds no data specifying which or how many staff have been regarded by each institution as eligible for inclusion in RAE 2008, and no data on the assignment to units of assessment of those eligible staff not included. Further, the data that Hesa does hold is not an adequate alternative basis on which to estimate eligible staff numbers.”
Additional notes
- n/a indicates that no information is available, or that comparison with 2001 is not possible because the subject categories are not comparable or because the institution did not enter before.
- The number of staff submitted for assessment is the number of full-time equivalent (FTE) staff.
- Hefce’s quality profiles within individual units of assessment (UoAs), from which the average score is calculated, show the percentage of “research activity” falling within the grades rather than the percentage of staff. Times Higher Education’s approach to calculating institutional profiles assumes that research activity is equivalent to staff activity (a sketch of this aggregation follows below). Hefce’s original quality profiles are rounded to the nearest 5 per cent.
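One reading of the note above is that each institution’s overall profile is a weighted average of its UoA profiles, with the weights given by the FTE staff submitted to each UoA. A minimal sketch of that aggregation, using invented figures:

```python
# Sketch of aggregating UoA quality profiles into an institutional profile,
# weighting each UoA by the FTE staff submitted to it. Figures are invented;
# this follows our reading of the assumption stated above.

GRADES = ["4*", "3*", "2*", "1*", "u/c"]

# (FTE staff submitted, quality profile in per cent) for each unit of assessment
uoas = [
    (40.0, {"4*": 25, "3*": 45, "2*": 25, "1*": 5, "u/c": 0}),
    (15.0, {"4*": 10, "3*": 30, "2*": 40, "1*": 15, "u/c": 5}),
]

total_fte = sum(fte for fte, _ in uoas)
institution = {g: sum(fte * p[g] for fte, p in uoas) / total_fte for g in GRADES}
print({g: round(v, 1) for g, v in institution.items()})
```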
For information on how to read the RAE Subject Ratings table, see below.
Data analysis was undertaken by Education Data Surveys (EDS), which is part of TSL Education. Special thanks to Almut Sprigade.
The official RAE data are at: www.rae.ac.uk
Subject by subject: methodology
The RAE Subject Ratings table shows institutions’ performance in each of the 67 units of assessment (UoAs).
Within each UoA, departments are ranked by their 2008 average, which is the “grade-point average” of their Hefce quality profile. To obtain this GPA, the percentage of research activity receiving a 4* grade is multiplied by 4, the percentage receiving a 3* by 3, the percentage receiving a 2* by 2, and the percentage receiving a 1* by 1; the results are added together and divided by 100 to give a score of between 0 and 4.
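Written out as a formula, with p denoting the percentage of research activity awarded each grade (our notation):

```latex
\mathrm{GPA} = \frac{4\,p_{4^{*}} + 3\,p_{3^{*}} + 2\,p_{2^{*}} + 1\,p_{1^{*}}}{100}
```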
The percentage of 4* research activity and the number of staff submitted are presented as contextual information. Each UoA also has a national profile showing the total number of staff submitted for the entire UoA, the percentage of 4* research activity within the UoA and the overall average.