Talking totals

February 21, 1997

David Law asks whether quality assessment results could ever be used to produce a league table of the best teaching universities

League tables are a national preoccupation in education. The research assessment exercise results have produced various league tables but, so far, the assessment of teaching (Teaching Quality Assessment, TQA) has escaped this approach. Last November, Brian Fender, chief executive of the Higher Education Funding Council for England, circulated the latest batch of quality assessment reports and warned "it would be wrong to construct league tables or to use overall totals for comparative purposes". How long will this warning be heeded?

The graded profile now used in the TQA exercise reports judgements using numerical values (1 to 4). HEFCE itself makes numerical comparisons between performance in the six "aspects of provision" and between groups of institutions and will soon be operating a methodology for funding teaching that asks institutions to present evidence of quality. So why do we not "talk totals" when, in private, we already compare scores?

First, there is less than absolute confidence that two different teams sent into a university to assess teaching would arrive at the same figures. The approach is subjective and carried out under tight time constraints.

Second, league tables depend on comparing "like with like" and higher education has become increasingly diverse in order to offer choice. The RAE claims to be assessing against standards: we might claim international reputations but it is for others to decide. TQA allows the assessed, at least in theory, to decide the rules. "Each judgement in the profiles is made against the specific aims and learning objectives of the subject provider" (HEFCE), so if I claim that my nakedness is really "new clothes" who is to say I am wrong? At some time the sector will need to grapple with the legitimacy of self-referenced aims.

And last, well, we do not like league tables. At least, not unless we are "there or thereabouts". (The possibility that Oxford and Cambridge are only in a mid-table position may bring more supporters for the idea.) In January HEFCE's Report on Quality Assessment 1995-1996 was published. Its basic message is that the new system works better than the old because "the graded profile is a more constructive tool . . . than the previous single grade". In the past we sent in our self-assessments claiming excellence, then received a visit from the inspectors which ended with most of us being told we were not excellent. Unsurprisingly, many colleagues found this dispiriting. Now, by contrast, there are very few losers because most visits produce at least one maximum grade. Much better to feel that something is going well, and then pay attention to the "could be better" comments.

There were 272 visits in eight subjects during 1995/96, with all but two producing "quality approved" verdicts. The two "subject to re-assessment" outcomes were in modern languages; both arose from deficient resources.

Most universities and colleges got a mix of 3s and 4s. Forty-two per cent of all grades were maximum scores and 50 per cent were 3s. There were only 124 grades of 2 awarded - out of 1,632 gradings - approximately 7.6 per cent. The assessors tell us that in only three cases did they find room for significant improvement in the guidance of students. This is a little hard to reconcile with the rapid rise in student complaints.

The sector has given itself a collective pat on the back, although budgets are being cut. Constructing a league table is not easy, but it is likely to be attempted and justified on the basis of "consumer information".

However the rules for the TQA league might be written, a maximum score in all aspects of provision must put the provider at the top of the table. Last year there were seven assessments in which all grades were 4s. Four of these were in sociology: Birmingham, Sussex, and Warwick universities and the Open University. The other three maximum scores were achieved in languages: Exeter (German), Hull (Iberian languages/ studies), and Sheffield (Russian). Less than 3 per cent of all assessments produced straight 4s - Oxford and Cambridge are conspicuous by their absence.

Having defined the top of the league, I considered which institutions would be added to the list if the qualification became five grade 4s and one aspect graded as making a "substantial" rather than a "full" contribution to "the attainment of stated objectives". This produced a further 11 institutions for this TQA premiership, including Cambridge and four University of London colleges: King's, Queen Mary and Westfield, the School of Slavonic and East European Studies, and University College. Three ex-Polytechnics and Colleges Funding Council universities are included: Greenwich, Northumbria and Portsmouth. The other three are: Lancaster, Loughborough, and York.

However, although it may have some merit, this method is flawed. Some institutions had more opportunity than others to join the league because they were assessed in more subjects. Moreover, membership is determined by one good result: Wimbledon beat Manchester United in the FA Cup replay but did not thereby become the better side.

Time-series results have to be checked. With 272 assessments it could be a lengthy process to establish any sort of rational order. All that the numbers mean is that, in the judgement of a small group of peers, the provider, in relation to the provider's intentions, was doing an outstanding to indifferent job during a particular week in a particular subject. If a university scores consistently highly it is fair to presume that it has demonstrated the capacity to formulate aims and objectives appropriately. At best, its students can be confident of high-quality education.

It seemed logical to check the other assessments of the top performers. At the Open University there was only one assessment. The other universities each had at least four assessments, including the top-scoring 24-point profile.

Birmingham has been visited by six assessment teams in addition to the sociology assessment. In two of these assessments, two aspects were graded 2, ie, "significant improvement could be made". Exeter, with the same number of additional assessments as Birmingham, similarly had two that between them produced three grades of 2. One of the three additional assessments at Sussex produced two grade 2s.

Sheffield had six assessments in addition to Russian. None of these produced a grade 2; the profiles were a mix of 4s and 3s, with one "excellent". Hull had five assessments in addition to Iberian languages/studies. In all of these there were some grade 3s but no 2s. Warwick had three assessments in addition to sociology/social policy. Again, there were grade 3s but no 2s.

So, can any conclusions be drawn? Constructing a meaningful league table across subjects is only possible if questionable assumptions are made. The lesson is that few institutions achieve consistently high grades in TQA. Everyone's objective is to improve the quality of the student experience. There is room for improvement, and becoming more aware of good practice in other places is part of that.

David Law, director of programmes at Keele University and assistant director at the HEQC, writes in a personal capacity.
