Official statistics

National Student Survey data: provider-level


This page contains a visualisation of data from the latest National Student Survey (NSS) for each participating higher education provider.

Results are shown for NSS 2024 and for NSS 2023. You can use the dashboard to visualise the data filtered by year, registering provider (the provider that has overall control of the higher education content, delivery, assessment and quality assurance arrangements), mode of study, subject level (CAH1 and CAH3) and subject.

Get these results as spreadsheets

Data quality notes

2024

Read the NSS 2024 quality report for full details

There are some issues with the completeness of some of the data collected by Jisc, which has an impact on our benchmarking. To mitigate this, we have used student data for ethnicity, sex, care experienced status and socioeconomic classification (SEC) from the same student in the previous year. These variables were chosen for backfilling because they are subject to more missingness than usual and are unlikely to change between years. While some missingness will remain in these variables, the backfilling reduces the need to suppress benchmarked results due to missing data.
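As a rough illustration of the backfilling approach described above, the sketch below carries a missing characteristic forward from the previous year's record for the same student. The DataFrames, column names and values are hypothetical; this is not the OfS's actual implementation.

```python
import pandas as pd

# Hypothetical current-year and previous-year student records, keyed on
# a student identifier. Column names and values are illustrative only.
current = pd.DataFrame({
    "student_id": [1, 2, 3],
    "ethnicity": [None, None, "White"],
    "sex": ["Female", "Male", None],
})
previous = pd.DataFrame({
    "student_id": [2, 3, 4],
    "ethnicity": ["Asian", "Black", "White"],
    "sex": ["Male", "Female", "Male"],
})

# Backfill: where a characteristic is missing this year, take the value
# recorded for the same student in the previous year.
merged = current.merge(previous, on="student_id", how="left",
                       suffixes=("", "_prev"))
for col in ["ethnicity", "sex"]:
    merged[col] = merged[col].fillna(merged[f"{col}_prev"])
backfilled = merged[current.columns]

# Student 1 has no previous-year record, so their ethnicity stays
# missing: this is the residual missingness mentioned above.
print(backfilled)
```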

There are specific data quality issues relating to five providers:

University of Chester

There is a data quality issue known to affect partnership arrangements data for this provider, where some students who were taught at a different provider in 2022-23 were instead recorded as both registered and taught at the University of Chester. This may mean that a small number of students are allocated to Chester rather than the appropriate teaching provider.

University of Hertfordshire

The 2022-23 student data return submitted by the University of Hertfordshire did not meet our data quality requirements in full, and there remain outstanding issues with its data that could affect its onward use. We are publishing indicators for this provider because these impacts are expected to be marginal, but users should be aware that some students with disabilities may not have had any disability recorded, which may make the benchmarked results for this provider less reliable.

University of Bedfordshire

The 2022-23 student data return submitted by the University of Bedfordshire did not meet our data quality requirements in full, and there remain outstanding issues with its data that have the potential to affect its onward use. These impacts are expected to be marginal, but could mean that some students on two-year courses have been wrongly recorded as full-time.

Navitas UK Holdings Limited

There is a data quality issue which may affect the population of taught students for this provider, where some students who were registered at a different provider but taught by Navitas UK Holdings Limited may have been erroneously reported as being taught by a different provider. This may lead to some NSS respondents being allocated to the wrong provider.

University of Edinburgh

There is a data quality issue which may affect the population of students for this provider, where some students were not included in the 2022-23 HESA student data return. This affects intercalating students on medical courses. This may lead to the NSS population being lower than the actual number of students studying on medical courses at this provider.

2023

Due to a survey administration error, question 28 ('overall satisfaction') was not asked of the majority of survey respondents from Glasgow Caledonian University (GCU). We therefore replaced the GCU results with the Scottish sector average and did not publish results for subject-level GCU populations for this question in NSS 2023. The error was not the fault of the provider and does not reflect on its quality.

Using the data and dashboard

The data dashboard allows you to select NSS results using the filters at the top of the dashboard. You can use the filters to focus on a particular population. For example, when viewing the NSS results for the UK, you can use the filters to focus on students who are studying full-time, or who are studying a particular subject.

In all views, you can use the “slider” in the centre of the filter bar to choose whether to view the positivity measure or the difference from benchmark.

Question 28 and the healthcare questions are shown separately from the other questions, due to their different format and coverage.

This dashboard contains NSS results presented by registering provider (the provider that has overall control of the higher education content, delivery, assessment and quality assurance arrangements). Users are able to visualise the NSS data using the following filters:

  • Year (NSS 2024 and/or NSS 2023 data)
  • Provider
  • Mode of study
  • Subject level (CAH1 and CAH3)
  • Subject

In order to maximise the usability of the dashboard, we have restricted the filters available so that users can access headline results easily. Provider-level data with all available splits (such as teaching institution, level of study, subject at CAH2, and the results for the healthcare, allied health, and clinical practice placement questions) are available as spreadsheets for both NSS 2024 and NSS 2023.

The dashboard contains multiple filters, and it takes a moment to reload each time a filter is changed. The loading time varies with internet speed and the processing power of the device used. We are aware that the dashboard can be particularly slow on the day of the annual NSS publication, due to the high number of users, and we are actively looking for solutions to this.

All the data available on the dashboards can be accessed using our data downloads, which we provide in CSV and Excel format.
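As an illustration, the data downloads can be filtered in the same way as the dashboard. The sketch below assumes a local copy of one of the CSV downloads; the filename and column names are hypothetical, so check the actual download for its layout.

```python
import pandas as pd

# Hypothetical local copy of an NSS data download; the real filename
# and column names may differ.
df = pd.read_csv("nss_2024_provider_level.csv")

# Reproduce a dashboard-style selection: one year, one registering
# provider, full-time students only.
subset = df[
    (df["year"] == "NSS 2024")
    & (df["registering_provider"] == "Example University")
    & (df["mode_of_study"] == "Full-time")
]
print(subset[["question", "positivity_measure", "responses"]])
```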

In our data dashboards, the positivity measure and the difference from benchmark are surrounded by shaded bars. These indicate that the values are estimates, and may be affected by random variation. For example, the NSS results are a measurement at a point in time – it is possible that some respondents would respond differently on another day, and that this would lead to a different estimate. We refer to this as statistical uncertainty. 

The shaded bars show a range within which we can be confident that the true value lies. The depth of the shading indicates likelihood: the true value is more likely to fall within the heavily shaded areas close to the estimate, and less likely to fall within the lightly shaded areas at the edges. Put another way, we can be very confident that the true value lies somewhere within the entire shaded range, and less confident that it lies within the narrower, heavily shaded area around the estimate.

The shaded bars are constructed from confidence intervals ranging from 75 per cent to 99.7 per cent, with the shading changing in 2.5 percentage point increments.

Uncertainty depends heavily on population size. You will typically see that measures based on small populations are surrounded by wider shaded bars.

Find out more about the statistical methods used to create the shaded bars.
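As a rough illustration only: the sketch below computes nested intervals at levels from 75 per cent to 99.7 per cent using a simple normal approximation for a proportion. The actual method used to construct the shaded bars is described in the methodology linked above and may differ.

```python
from scipy.stats import norm

def interval(p_hat: float, n: int, level: float) -> tuple[float, float]:
    """Normal-approximation confidence interval for a proportion
    (an assumption for illustration; the published method may differ)."""
    z = norm.ppf(0.5 + level / 2)  # two-sided critical value
    half_width = z * (p_hat * (1 - p_hat) / n) ** 0.5
    return max(0.0, p_hat - half_width), min(1.0, p_hat + half_width)

p_hat, n = 0.82, 120  # illustrative positivity measure and response count

# Levels from 75 per cent rising in 2.5 percentage point steps, with
# 99.7 per cent as the outermost band, as described above.
levels = [0.75 + 0.025 * i for i in range(10)] + [0.997]
for level in levels:
    lo, hi = interval(p_hat, n, level)
    print(f"{level:.1%}: [{lo:.3f}, {hi:.3f}]")

# A smaller population gives a wider band at every level:
print(interval(0.82, 30, 0.997))
```

Note how the interval for 30 responses is much wider than the corresponding one for 120 responses, reflecting the point above about population size.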

Question 28 ("Overall, I am satisfied with the quality of the course”) has a different format from the other survey questions, and five rather than four response options. Because of this, the positivity measure for question 28 cannot be straightforwardly concerned with the positivity measure for the other questions. In particular, we would typically expect the positivity measure for question 28 to be lower than the positivity measure for the other questions, because we allow respondents to take a neutral stance (“Neither agree nor disagree”).

To make it clear that the results for question 28 cannot be compared with the results for the other questions, we have designed the dashboards so that question 28 always appears separately from the other results.

Question 28 is only asked of students studying in providers in Northern Ireland, Scotland and Wales. Therefore, country-level statistics for question 28 are not provided for England, or for the UK.  

The data dashboard shows, for each question, the number of responses. This is the total number of students who responded to the question with a response other than “This does not apply to me”, reported as a full-person equivalent (FPE). This number is the denominator for the positivity measure.

The number of responses sometimes varies by question. This is mainly because differing numbers of students respond “This does not apply to me” to each question. It is also because some students start but do not complete the questionnaire.   
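To make the denominator concrete: the sketch below computes a positivity measure from illustrative response counts for a single question. The response labels and counts are made up, and which options count as positive is an assumption for illustration.

```python
# Illustrative response counts for one question; labels and numbers are
# made up for this example.
responses = {
    "Very good": 55,
    "Good": 30,
    "Not very good": 10,
    "Not at all good": 3,
    "This does not apply to me": 7,
}

# Denominator: everyone who gave a substantive answer, i.e. anything
# other than "This does not apply to me".
applicable = sum(count for option, count in responses.items()
                 if option != "This does not apply to me")

# Positivity measure: share of applicable respondents choosing one of
# the positive options (assumed here to be the top two).
positive = responses["Very good"] + responses["Good"]
print(f"responses = {applicable}, positivity = {positive / applicable:.1%}")
```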

Through changing the filters on the data dashboard, you may arrive at a selection for which no data is available. This would happen, for instance, if you selected the filters “Apprenticeship” and “All undergraduates” for a provider that did not offer apprenticeships. When this happens, you will be presented with the message “Data unavailable”. You can leave this page by choosing a different combination of filters, or by refreshing your browser.

The ‘Sup’ column on the dashboard is used to indicate where data has been suppressed for the following reasons:

Data Protection Low (DPL): When the response rate for a publication unit is 100 per cent, and all, or nearly all, the students responded negatively to a particular question. This is to ensure that students feel able to honestly report poor quality without risk of being identified. This suppression is very rare; when it occurs, we indicate that the positivity measure for the question is very low using the marker ‘DPL’, but otherwise provide minimal information.

Data Protection (DP): When, for a publication unit, a theme includes a question that is DPL suppressed. In this case, publishing the theme measure could allow data users to infer information about the suppressed DPL measure. We therefore suppress the theme measure too and mark it as ‘DP’.

Benchmark (BK) suppression: Rarely, we are unable to calculate accurate benchmarks. In some cases, one or more of the benchmarking factors are unknown for most students in a provider/subject/mode split. In these cases the results are shown as ‘N/A’. Benchmarks are also unavailable in some cases when the provider/subject/mode split makes an extremely large contribution to its own benchmark. Again, these results are shown as ‘N/A’. We also do not provide benchmarks when results are suppressed as ‘DPL’ or ‘DP’.
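The sketch below shows schematically how markers like those above might be assigned. The thresholds and inputs are hypothetical: in particular, the cut-off for ‘all, or nearly all, students responding negatively’ is not published here, so the 5 per cent figure is purely illustrative.

```python
from typing import Optional

def suppression_marker(response_rate: float,
                       positivity: float,
                       theme_contains_dpl: bool,
                       benchmark_available: bool) -> Optional[str]:
    """Return a 'Sup' column marker, or None if no suppression applies.

    Schematic only: the thresholds and the order of checks are
    assumptions, not the OfS's published rules.
    """
    if response_rate == 1.0 and positivity <= 0.05:  # illustrative cut-off
        return "DPL"  # near-universal negative response, 100% response rate
    if theme_contains_dpl:
        return "DP"   # theme includes a DPL-suppressed question
    if not benchmark_available:
        return "BK"   # an accurate benchmark could not be calculated
    return None

print(suppression_marker(1.0, 0.02, False, True))   # DPL
print(suppression_marker(0.8, 0.90, True, True))    # DP
print(suppression_marker(0.8, 0.90, False, False))  # BK
```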

If you receive an error message beginning with '{"result":' when trying to view a dashboard on our website, try closing and restarting your Internet browser. This issue can occur if you have viewed a dashboard on the OfS portal since opening your current browser session.

It is not possible to view our dashboards on the portal and on our website concurrently in the same browser. To switch between them you will need to restart your browser, delete your browser cookies, or use two different browsers.

When NSS results are aggregated at the level of country (UK, England, Northern Ireland, Scotland, Wales) we do not calculate benchmarks. This is because the benchmarking method does not create meaningful results in many of these cases. The “Difference from benchmark” view will show “N/A” in these cases.

Rarely, we are unable to calculate benchmarks for provider-level data. In some cases, one or more of the benchmarking factors are unknown for most students at a provider. In these cases the results are shown as “BK”. Benchmarks are also unavailable in some cases when the provider makes an extremely large contribution to its own benchmark. Again, these results are shown as “N/A”. Similarly, we do not provide benchmarks when results are suppressed as “DPL” or “DP”.

Find out more about the NSS data
Published 10 August 2023
Last updated 13 September 2024

13 September 2024: Added data quality note for University of Edinburgh.
10 July 2024: Published NSS 2024 data.
