Meeting Notes

NSS Institutional Working Group

15 April 2011, The Higher Education Academy, York

Present: Catherine Rendell (University of Hertfordshire), Clare Milsom (Liverpool John Moores University), Julian Martin (University of Worcester), Nicola Poole (UWIC), Jenny Lyon (University of Leeds), Anna Round (University of Sunderland), Jason Leman (Sheffield Hallam University), Andrew Turner (Coventry University), Adam Child (Lancaster University), Clare Cousins (Goldsmiths College, London), Richard Harrison (Durham University), Alison Jones (Leeds Metropolitan University), Nuala Toman (Queen's University Belfast), Jackie Potter (HEA), Alex Buckley (HEA), Katherine Gent (HEA)

Introduction

·  Jackie Potter welcomed everyone to the meeting, and gave an update on the HEA’s NSS work, focusing on the revised approach to considering subject level NSS data, through the proposed State of the Subject Reports.

Presentations and discussion

Presentation 1: “Developing a revised approach to considering subject level NSS data – a Durham example”

Richard Harrison, Durham University

·  The Academic Support Office at Durham is where all organisational analysis is carried out.

·  The approach used has been innovative for Durham; the presentation outlines why they have changed their approach and what is being done now.

·  Weaker areas are assessment and feedback, with room for improvement in personal development.

·  Senior managers have a KPI to be in the top 5 nationally, and they seek to make progress towards that target.

·  For the past five years there has been a fairly traditional approach:

o  Summarising results in comparison to the Russell Group / 1994 Group.

o  From 2007 the statistics unit in the Maths department was employed to analyse the quantitative data, and social science programme co-ordinators looked at the qualitative data.

o  Review of student support mechanisms etc. as a result of the data analysis.

o  Departmental action plans developed and submitted to faculty level for overview.

·  It was not felt that this approach was working because:

o  Increased response rates and levelling off of institutional data

o  Departments scoring above the trigger for developing action plans did not engage with the data

·  Change in approach to look at data more holistically. This also involved a change in ethos to make students feel valued members of the academic community – a name not a number. Failure to do basic things right, simple things like being nice to students when they come to enquire at the departmental office, can affect scores. Communication with students is not just about emails, it is also the day to day contact including with administrative staff.

·  This began with a faculty away day for all faculties this year. Rather than paper plans, there is ongoing dialogue and reflection on what has been done in 2010-11. A Faculty NSS Manifesto has been developed, and discussion has continued throughout the year. There has been a move towards a faculty teaching and learning strategy.

Points of discussion

·  How does the NSS manifesto relate to the student charter? It has been used as background to inform the development of the student charter. Durham has been more open about the development of the student charter and has sought to connect with more academics.

·  Why the NSS? It was felt that the NSS carried more weight.

·  The flexibility of a departmental approach is helpful, as it gives people power to develop from the bottom up and gives them increased ownership.

·  The movement towards increasing the sense of an academic community chimes with the university and education strategy. It ties in with the qualitative analysis, which indicates that the quality of contact is as important as the number of contact hours.

Presentation 2: “Like it? Student Survey Season”

Alison Jones, Leeds Metropolitan University

·  The whole process of running the NSS and analysing the results takes place in one department. This department is also responsible for a programme of surveys including PTES and ISB.

·  The main aim has been to increase response rates to the NSS and other surveys.

·  In the past there has been a range of approaches depending on the survey, including “shout outs” before lectures in areas with low response rates, direct emails, free sports passes for completing the survey, and a viral approach where students talk to one another and spread the message.

·  The student survey season aims to bring all the surveys together and use ideas from all of the surveys.

·  There is a single point of delivery through the student portal, which all students need to access. It is also tailored to the type of student, so when they log in they see only the surveys that they are eligible for.

·  The promotion in posters etc. has taken a more positive approach than previously. In other years the tagline “You said… We did…” was used. This year the motto “Like it?” has been used widely, alongside a text comment and score drawn from previous surveys together with an example of what the university is doing.

·  Training for both academic and administrative staff has been extended to increase awareness.

·  Incentives have included a voucher for a free coffee, though the number claiming these has been a lot lower than the number who have completed the surveys.

·  So far this seems to be having a fairly good response: for the annual survey and the NSS, the response rate has been higher at this stage than in previous years. However, PTES has been lower, though this may be because the postgraduates working in the office, who had previously raised awareness virally, have been lost.

·  There have been some difficulties, for example some students “not seeing” the advert on the portal when they log on.

·  Academic staff often promote the NSS above other surveys, but it is hoped that focusing attention on one survey season will address this.

Points of discussion

·  The advantage is that a target list of students feeds into the portal, so time is not wasted targeting ineligible students through wide publicity.

·  Training has been provided to explain who is eligible and a list of students is sent to course leaders in advance.

·  How do you get staff to take part in training? It is optional, and there are smaller faculty-level meetings. It is a technical briefing rather than a review of results, and the aim has been to target course leaders and postgraduate tutors.

·  Better response may be gained if students had to click through before they get to another part of the portal.

·  It was noted that increasing response rates are not always good: Liverpool John Moores University had its best ever response rate of 78%, but this was accompanied by its lowest scores in many cases.

·  To get a competitive edge, one institution had a response-o-meter by faculty, which generated greater academic engagement and allowed individuals to take action in time.

Information gathering and evaluation

The group discussed the ways in which the insights and good practice raised in meetings might be gathered for wider dissemination.

Main points:

The Academy has a more political voice than individual institutions. If there are consultations relevant to the working group, the group would welcome being asked for its views.

The group provides a snapshot of good practice, though it was questioned whether the members are always the right people to carry forward actions within their institutions.

It was felt that there was value for members in working with this group to inform their own institution's strategy.

Institutions are at different stages and have different experiences, though there are similarities in the NSS lifecycle and the challenges faced. It would be beneficial for the HEA to synthesise materials on topics, perhaps in a discussion board format, that can be passed on within an institution and capture voices across institutions.

It was suggested that there may be scope for surveying delegates from the Surveys For Enhancement conference to see what they feel the working group should be discussing.

What can you share?

·  How to increase response rates

·  How to use quantitative data to enhance student experience

·  How to maximise JACS and course level results

·  How do you promote, analyse, disseminate and address NSS scores whilst maintaining institutional dialogue?

·  Why is the NSS important at all?

·  How to ensure that an institution has a co-ordinated approach to NSS

·  How to introduce a new paper-based NSS aligned module level survey

What do you want to know?

·  How do you manage expectations associated with the NSS without undermining the survey?

·  What are the best strategies for improving scores?

·  What is the most useful form of analysis for an HEI when (human) resources are limited, i.e. one that provides meaningful information?

·  Is there a definitive answer to increasing response rates, esp. from smaller subject areas?

·  How the approach of other institutions is changing and evolving (including good ideas that I can take back to base)

·  Impact of working group on institutional policy

·  How to develop an NSS “surplus” model, as opposed to a deficit “you said, we did” model that identifies failing programmes?

·  How do I get academic staff to care and then do something about it?

·  Evidence of the impact of the NSS on enhancing the student experience

·  How do other people’s enhancement activities for pedagogy/student experience relate to NSS findings?

·  I want to know the evidence for showing that the NSS is actually improving the student experience. I want to evaluate the NSS

·  How do you present the NSS data to academic staff and engage them to want to use it?

·  How to act upon the analysed data and produce outcomes?

·  How do we address low NSS scores in areas where staff have already tried everything?

·  How can we ensure that the effort put into the NSS in my institution does actually improve student experiences?

·  How can NSS articulate with other (more meaningful) evaluative mechanisms?

·  What is the value (from your perspective) of the NSS in relation to learning and teaching and quality enhancement? Are there other means which might have stronger relations to such ideas?

·  Why do Russell Group universities usually have lower feedback scores than post-92 universities?

Sharing Practice

Alex Buckley outlined some of the analysis done by the HEA on the national aggregate data in relation to part time students and international students.

Topic 1 - Using the NSS to address the needs of non-traditional students

Part Time Students

·  Some analysis is done based on widening participation (WP) demographics

·  For small institutions there are not enough part-time students to make this sort of analysis helpful

·  Part-time students are often on the same course as full-time students but over a longer period. They have a different path/story (e.g. mature students, childcare responsibilities), and so have different requirements and expectations; these differences make it difficult to analyse for change.

·  It is more common to look at data other than the NSS in relation to part-time students, e.g. progression data and dropout rates.

·  In many cases there is not a firm line between full-time and part-time.

·  It is possible to break down the data further than with international students

·  The qualitative comments are particularly useful

·  The usefulness of national analysis was noted

International Students

·  One institution has dropped the ISB because the response rate was too low

·  The ISB is less useful, as you can't drill down; you can only look at institutional-level results

·  Some HEIs don’t have enough international students

·  The NSS can be used effectively in specific departments with high numbers of international students, such as the Business School. These departments often do their own disaggregation of the data

·  Not much work is done at an institutional level to look at the NSS results of international students in comparison to home students.

·  This will become more important with changing times

Topic 2 - Effective ways of closing the feedback loop

The idea of saying “thank you”: this should be the first message students receive as alumni, but as yet this is not being done. Alumni donations are a very important source of income for institutions, so it is important to get this right.

There should be a move away from measuring and towards a more transactional approach.

There remains a lack of understanding of what the NSS reveals.

Quality enhancement v league tables

The NSS is public information and can be used for recruitment and marketing rather than at the chalk face to improve the student experience.

The involvement of the students' union depends upon the discipline. Around 46% of ethnic minority and part-time students do not engage with students' unions.

There is still room for student involvement in the process via student representation. The SU meets with representatives and passes on information from meetings and working groups. Paid representatives within faculties can also feed information up.
