The changes to the National Student Survey could prove disastrous
The removal of well-being-centric questions and the ability to compare with previous years will impede universities’ ability to offer adequate support
“How do you solve a problem like Maria?” It’s a good question, from a song in quite a good film (The Sound of Music), which you may have watched over the Christmas break. How do you solve a problem? Well, in my world I gather data, analyse it and take action based on it.
In universities across the UK, data are being gathered from final-year undergraduate students on their whole university experience. This is the first time the new National Student Survey (NSS) is being run, and it’s all change this year.
For example, gone is the consistent Likert scale from “strongly agree” to “strongly disagree” for every question. Instead, we have different answer options for different questions. Gone, too, is the consistent question approach about student perception, replaced by direct questions such as how frequently something is the case or how well something is done.
Out (for England at least; the other UK nations have been allowed to keep it) is the “overall satisfaction” question. Some other questions have disappeared completely, too (more on that later), and those that remain have been reworded so they are no longer comparable.
So, no more properly defined and tested scales, which we and our course leaders could historically use with confidence to evaluate our initiatives to enhance the student experience and monitor progress over time.
The real value of the NSS comes at course level within institutions. So, if we, in the words of the famous song, “wanted to solve a problem like Maria” (taking Maria in this case to be the course leader), we would look at what the students said about Maria’s course, ask her to address the feedback as appropriate and monitor the change over time.
Simply put, if you had a student satisfaction problem pre-2022, you can no longer use the NSS to monitor how well you are solving it.
As a mixed-methods researcher, I’m frustrated that these changes were seemingly made without significant input from the expertise available in the sector and without any data on the development of the questionnaire being shared.
Yes, there was a consultation. We gave feedback: what we thought would and wouldn’t work; what would hinder us from using the data to solve a problem with our teaching; what should stop, start and continue. It was very much the approach we take to our own course and module surveys: working out how best to do this.
Our feedback, and most likely yours too, was completely ignored. I wonder what the Office for Students (OfS), or indeed the Office of the Independent Adjudicator for Higher Education (OIA), would say if we took our students’ feedback as seriously as our feedback on the NSS was taken.
The NSS was developed and introduced back in 2005. It was never intended to be used in the way it is. It was a survey designed to provide information to individual institutions to allow them to make local changes, as highlighted in the first evaluation of the NSS by Paula Surridge.
Unsurprisingly, both Surridge and the further evaluation of the NSS undertaken by the Institute of Education in 2008 found that one should take account of sources of variation, such as subject mix and student characteristics, if comparing institutions.
Thus was born the benchmarked NSS we know today: the one we asked, in phase one of the recent OfS consultation, to be kept, so that we could gather key information, see how we compared with institutions that have similar students and subjects, and decide where to focus our enhancement efforts.
At Portsmouth, we’ve been working on developing our students’ Being, Belonging and Becoming (BBB), so the removal of a key question – “I feel part of a community of staff and students” – is quite a problem for us, as it was an important and useful one. And we’re not the only ones concerned about the removal of this question.
I was recently invited to speak about our BBB work and survey to a group of principal fellows from across the sector who are concerned with developing students’ well-being and taking a whole-institution approach while doing so. The use of a common survey to investigate this is one approach we could adopt.
If the OfS will not facilitate benchmarking for us in this important area (the only well-being question remaining on the NSS asks how well the well-being services were communicated), then perhaps we need to do it for ourselves.
There are a number of questions we could ask in a sector-designed survey that we would find useful when responding to the next Teaching Excellence Framework (TEF) exercise (or equivalent for those not required to take part in the UK). For example, our BBB survey has scales related to all three of the Bs, notably in terms of belonging questions about peer relationships and relationships with university staff.
The “lonely goatherd” in The Sound of Music felt isolated. In the current Covid-disrupted context, it’s critical that we know if our students are feeling isolated, too, whatever their educational, ethnic or social background and whichever delivery mode is being used. Then we can address their feedback via enhancements to our learning, teaching and student experience.
If we want useful student feedback this year, it seems we’ll have to make our own song and dance about it.
Harriet Dunbar-Morris is dean of learning and teaching and reader in higher education at the University of Portsmouth.