So this year there was a change to the National Student Survey (NSS). This time it was not a change to the survey itself (last year the questions were revised and new themes introduced) but rather to its custodianship: the survey now sits with the Office for Students (OfS), which has replaced HEFCE.
The changes were not dramatic but represented a subtle shift towards students’ interests. The data was not shared with Higher Education Institutions (HEIs) before publication, so students got to see how their institution had performed at the same time as the institution itself.
The other change is that the data is now presented in a slightly different, more accessible format. It is a long way from dashboards with visualisations, but the data is there for people to review easily.
So we at #VizHE decided to pick up from where we left off last year (NSS 2017: One data set, many dataviz approaches) and pull out a few interesting nuggets for you.
The starting point was to see how students had responded at a national level, looking at the themes. This also tied into the monthly storytelling with data challenge, which was on dot plots.
The interactive viz: is here!
The main takeaway is that not much has really changed, and where changes have occurred, they have been negative. The only exception is Wales, where students responded much more positively this year than last.
Adam decided to reviz the headline theme performance overview originally posted on the OfS pages because he wanted to make it easier to compare satisfaction across the themes, rather than having themes and years mixed up together.
Following on from revizzing the headline NSS theme results, Adam wanted to dig a little deeper into student satisfaction with the separately reported question 26 (‘Students’ union’), part of the ‘Student Voice’ theme, highlighted above as lagging well behind the other sector theme scores on student satisfaction.
Which unions are getting it right?
The interactive viz is here!
The main takeaway is that Alternative Providers (APs) appear to be getting it right more often than Higher Education Providers and Further Education Colleges, taking the median score for each provider type as the reference point. However, the AP populations are small, which could explain the volatility and wide distribution of scores seen in the figures. Hover over the box plot distributions to see which providers are getting a thumbs up from the students and which are in the dog house!
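As a rough illustration of the comparison above (this is not our actual workflow, and the provider names and scores below are entirely made up), taking the median Q26 score for each provider type can be sketched like this:

```python
from statistics import median

# Hypothetical Q26 agreement scores (%) grouped by provider type.
# Illustrative only -- not the real NSS data.
scores = {
    "Alternative Provider": [82.0, 91.5, 74.0, 88.0],
    "Higher Education Provider": [68.0, 71.5, 69.0, 73.0, 70.5],
    "Further Education College": [66.0, 72.0, 70.0],
}

# The median of each group serves as the reference point, as in the viz.
medians = {ptype: median(vals) for ptype, vals in scores.items()}

for ptype, m in sorted(medians.items(), key=lambda kv: -kv[1]):
    print(f"{ptype}: {m:.1f}")
```

The median is a sensible reference here because the small AP populations make the mean sensitive to a single outlying provider, whereas the median reflects the typical score in each group.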
So that’s all from us for now. Thank you for reading, and we hope you enjoyed exploring our visualisations!
Dave and Adam