The National Student Survey 2005
As part of a revised Quality Assurance Framework for higher education, the first full National Student Survey (NSS) in England, Wales and Northern Ireland was conducted in 2005. HEFCE commissioned Paula Surridge at the University of Bristol to conduct research into the first NSS and explore its findings.
The two main reports cover the survey findings, and the responses and survey methodology. A summary report gives an overview of the key points.
The summary concludes that the NSS offers an extremely rich resource for understanding student experiences in higher education, in particular students' assessments of the quality of different aspects of their courses. The data support both global analyses of the entire higher education sector and analyses that help individual institutions understand the responses of their own students.
The NSS showed that students had very high levels of satisfaction with their courses. All aspects of teaching quality measured were rated positively by students. However, not all aspects were rated equally positively. In particular, students were less positive about assessment and feedback than about other aspects of their course experiences.
A foreword gives HEFCE's response to the review and the next steps.
The National Student Survey 2005: Summary report
The National Student Survey 2005: Response and survey methodology
The National Student Survey 2005: Findings (main report)
Review of the NSS: next steps
Foreword by Professor David Eastwood
Chief Executive, HEFCE
The National Student Survey (NSS) is increasingly valuable for understanding student experiences in higher education. It is proving its value as an aid to prospective students in choosing where to study; as a resource enabling universities and colleges to revise and improve the quality of their provision; and as an increasingly widely used source of evidence in wider debates and discussions about higher education policy.
Given the survey's use by applicants, within institutions, and in policy discussion, it is important that we have a robust understanding of how and why students respond as they do, of appropriate methodologies for analysing the data, and of the relationship between the survey's outcomes and underlying realities, both in individual institutions and in the higher education sector more widely. The analysis presented here is a contribution to that wide and critically informed understanding of the NSS.
These reports, based on the first NSS in 2005, build on earlier work and extend our understanding of all aspects of the survey. Some of the findings are straightforward, but others relate to particular groups of students and particular sets of questions. The full implications of this study may therefore take time to digest, but we hope that the findings will support institutions in carrying out further analysis and help inform policy development on the student learning and teaching experience.
A key priority for us in the coming months will be to work with the Higher Education Academy and other partners to support institutions in further enhancing their provision. We also anticipate that the findings will be useful to other users of the survey.
Though the analysis reported here is extensive, it is far from providing the last word, and is therefore best viewed as work in progress.
We look forward to the analysis of the 2006 NSS results in spring 2007, which will identify and report on changes between the two years, providing a still richer source of information to help further enhance the student experience.