The first article in this series looked at overall satisfaction scores from the results of the recent National Student Survey (NSS). To recap, a simplistic league table based on ranking overall satisfaction scores was criticised in a recent review commissioned by HEFCE, which suggested that more complete analyses should include the computed benchmarks, and that comparisons of HE providers should be based on specific subjects or courses. For details see the ‘Review of the National Student Survey’ (Institute of Education, NatCen Social Research and Institute for Employment Studies, 2014).
Please send any comments or questions on the articles to Paresh Shah.
Comparing HE providers using benchmarks
Benchmarks indicate the scores HE providers would be expected to achieve for Question 22, which asks “Overall, I am satisfied with the quality of the course”. Generally, league tables are constructed without reference to whether providers’ overall scores differ significantly from their calculated benchmarks.
For the analyses of benchmarks the source file was NSS2015_summary_data, which collated the scores on overall satisfaction (Question 22) for students grouped as either ‘Taught’ or ‘Registered’. A total of 159 HE providers were included in the analyses, excluding The Open University.
Results were largely the same for Taught and Registered students. The exceptions were three of the 30 HE providers which significantly exceeded their benchmarks, and four of the 19 HE providers whose satisfaction scores were significantly below their benchmarks.
In the seven cases where Taught and Registered results differed, the direction of significance was maintained, except for one HE provider which showed positive significance for Taught students (exceeded benchmark) but no significance for Registered students. This HE provider (Oxford Brookes) has a large number of overseas registered students on ACCA courses (see HESA Data Intelligence Notes).
HE Providers with scores exceeding benchmarks
Thirty of the 159 HE providers significantly exceeded their benchmarks (Overall satisfaction score higher than the upper confidence interval). Six of 18 HE providers in the South East had satisfaction scores significantly higher than their allocated benchmarks, while two of 36 HE providers in London exceeded their benchmarks (Table 1 in PDF of Tables and Charts).
HE Providers with scores lower than benchmarks
Nineteen HE providers had overall satisfaction scores significantly lower than their benchmarks, and eight of these were in London (Table 2).
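The classification used above (a score significantly above or below benchmark when it falls outside the confidence interval around that benchmark) can be expressed as a simple rule. A minimal sketch in Python follows; the provider names and figures are invented for illustration, not taken from the published NSS data:

```python
def classify(score, ci_lower, ci_upper):
    """Classify an overall satisfaction score against the confidence
    interval around a provider's benchmark."""
    if score > ci_upper:
        return "above benchmark"   # significantly exceeded benchmark
    if score < ci_lower:
        return "below benchmark"   # significantly below benchmark
    return "at benchmark"          # no significant difference

# Illustrative figures only (not real NSS data).
providers = [
    ("Provider A", 90.0, 84.0, 88.0),
    ("Provider B", 80.0, 82.0, 86.0),
    ("Provider C", 85.0, 83.0, 87.0),
]
for name, score, lo, hi in providers:
    print(f"{name}: {classify(score, lo, hi)}")
```

Counting the “above benchmark” and “below benchmark” cases over all 159 providers would reproduce the two groupings reported in Tables 1 and 2.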
Overall satisfaction scores and student:staff ratios
A scatter plot of satisfaction scores versus student:staff ratios‡ indicated there was no simple linear relationship between the two untransformed variables (Chart 1).
‡ based on HESA numbers for Total undergraduate students and Academic staff for 2013/14.
Three specialist HE Providers in London featured in the Times Higher rankings for Overall satisfaction scores, namely The Courtauld Institute of Art, The Conservatoire for Dance and Drama (CDD) and The Royal Veterinary College (RVC). The two specialist HE providers which significantly exceeded their benchmarks for overall satisfaction were CDD and Rose Bruford College of Theatre and Performance.
It may seem that for a London-based HE provider to ‘do well’ in the NSS, it should be a relatively small, possibly single-site, institution. However, CDD is formed of eight schools, two of which are based outside London, and RVC has two campuses, one in central London and one in Hertfordshire.
The eight London HE providers which had overall satisfaction scores significantly lower than their benchmarks consisted of two outer London institutions, five in central London and one in east London. In 2014 four of the eight had significantly lower scores as well, but the other four achieved overall scores approximating their individual benchmarks. All except one have multi-site campuses and/or separate specialist colleges and institutes, which may support the view that, among larger London-based HE providers, student satisfaction is higher at closely bounded, campus-based institutions.
For the future, the consultation on changes to the NSS, announced on 1 October by HEFCE, may see the omission of the question on overall satisfaction.
Comparing HE providers by subjects
Data on scores for the 22 questions in the survey were selected and sorted using the data file NSS_taught_all15. This file also supplies Confidence Intervals (CI) calculated for overall satisfaction scores at subject level for each HE provider. Analyses and summaries were focused on Medicine & Dentistry and Creative Arts & Design, as these are two important subject groups for the HE sector in London.
Medicine & Dentistry
Thirty-six HE providers in the UK offer Medicine & Dentistry at First degree level, including five in London. Overall satisfaction scores varied from 67 to 98% but, given the wide variation in confidence intervals, it is difficult to ascertain visually whether there are significant differences between providers (Chart 2 in PDF of Tables and Charts).
Creative Arts & Design
Of the 130 HE providers offering First degree courses in Creative Arts & Design, 28 are in London. Overall satisfaction scores varied from 63 to 100% in London (Chart 3), similar to scores for the rest of the UK.
Possible factors affecting overall satisfaction scores for subject groups
For Medicine & Dentistry and Creative Arts & Design, individual scatter plots were constructed relating overall satisfaction scores to the scores for the six groups of questions (Teaching on my course; Assessment & feedback; Academic support; Organisation & management; Learning resources; and Personal development). R² values for simple linear trend lines were computed in Excel. High R² values (>0.60) may indicate groups of questions which positively affect overall satisfaction scores in each of the two subject groups.
For Medicine & Dentistry, the scores for questions on Academic support seem to be strongly related to higher satisfaction scores (Chart 4), while for Creative Arts & Design questions on Teaching appear to have high R² values (Chart 5). The influence of the different groups of questions on overall satisfaction scores could be further investigated using regression-based techniques such as generalized linear models.
A recent statistical analysis questioned the validity of using NSS scores as a proxy measure to assess teaching quality and student outcomes in medicine. This was based on comparisons of NSS scores with relevant professional exam results. Similar studies with other subject groups may support the call for alternative metrics to the NSS as part of the Teaching Excellence Framework.