National Student Survey 2024: which university performed best?

With students studying in England no longer quizzed on their overall satisfaction, THE crunches the numbers to calculate which UK university has the most satisfied students

July 12, 2024
Source: iStock/ Pavel Muravev

A former Anglican teacher training college received a more positive response from its undergraduates than any other UK university in the National Student Survey, according to analysis.

The 2024 edition of the poll, which at a sector-wide level showed improvements across every measure, also suggests that students at the most prestigious institutions tend to be less satisfied with their experience.

With the overall satisfaction question once again omitted for English providers by the Office for Students, the sector regulator that runs the UK-wide survey, Times Higher Education has analysed responses to the 26 questions asked of all UK undergraduates, attributed to the provider where they are taught.

This shows that 89.4 per cent of students’ responses across all subjects at Bishop Grosseteste University in Lincoln were positive – the highest rate of all 149 universities in the analysis.

It was followed by Arts University Plymouth, which had a positivity rating of 86.7 per cent – and also scored highest in a controversial question on freedom of expression.

The University of Wales Trinity Saint David and the University of St Andrews, last year’s winner, finished in joint third on 86.5 per cent, and were the top-ranked institutions from Wales and Scotland, respectively.

The questions cover seven different themes, with Bishop Grosseteste scoring highest in four – learning opportunities, assessment and feedback, learning resources, and student voice.

The University of Oxford came top for teaching, Plymouth Marjon University for academic support and St Andrews for organisation and management.


THE’s analysis includes full-time and part-time students at English providers registered with the OfS as universities, plus University of London constituent colleges and universities in Scotland, Wales and Northern Ireland. A small number of institutions were removed because of a low response rate.

Ulster University, the top-ranked institution in Northern Ireland, had an overall positive score of 82.6 per cent.

Although it had just two universities included in the analysis, Northern Ireland had the highest overall positive score of the four nations – 81.5 per cent. It was followed by Wales (81.4 per cent), England (80.5 per cent) and Scotland (79.6 per cent).

The OfS said the data shows that the student experience has not been affected by current financial pressures. However, THE’s analysis suggests those at elite institutions are not as satisfied as others.

The overall positive score across the Russell Group was 78.3 per cent, slightly up on the 2023 results but well behind the 81.4 per cent achieved by the rest of the sector.

The best-placed Russell Group institution was the University of Sheffield, which finished 17th on 82.2 per cent. At the bottom of the mission group was the University of Edinburgh, with a score of just 73.5 per cent.

The OfS flagged that there were minor data quality issues associated with the University of Chester, the University of Hertfordshire and the University of Bedfordshire, but these are not expected to have affected the overall scores.

patrick.jack@timeshighereducation.com


Reader's comments (9)

Yep the top 30 nearly all second/third rate institutions (with the odd exception) tells you everything you need to know about how useless the NSS is.
That's a bit rude :O. Not sure what criteria Maverick2 is using to determine second/third rate.
First of all, your spelling is bad. Secondly, can you explain what a second/third rate institution is? And finally, please stop stealing my name.
Please stop using my call sign thanks.
There are so many reasons for not using NSS data in this way to compare universities, but just to mention one: where are the error bars in the above data? Many of those institutions look as though they may have a relatively small number of students. Is it correct to compare averages from small institutions with those from much larger ones? To me, the above data analysis is meaningless.
Doesn't the inverse correlation between the top universities and NSS scores tell you that brilliant researchers at the top universities don't care too much about teaching undergraduates, whereas less illustrious institutions put the needs of their undergraduates first?
No, it tells you that students who go to higher ranked universities are more likely to have higher expectations. This measurement is completely useless because nearly every respondent has only ever been to one university.
To present the mean score (composed of all questions*) as a proxy for "overall quality" doesn't demonstrate where pockets of excellent practice might exist, e.g. in feedback practices (Q10-Q14 / Theme 3), a much more "meaningful" measure of excellence. Q27 was removed from the survey precisely because of the harm done by single word/number judgements. We have seen how damaging these are to schools and staff (see Ofsted), so coming up with your own measure in the absence of Q27 is really poor journalism. *In the absence of a published methodology
The rankings are incredibly misleading - the differences between some institutions are so small that they fall within standard errors and can be attributed to measurement error. For example, the gap between the 2nd and 3rd ranked universities is just *0.2%* - how certain are we that the data permits the detection of such a minute difference? Laughable ranking.
