Guidance for Annual Programme Evaluation (Undergraduate and Postgraduate Taught Programmes)

Contents

Guidance on Annual Programme Evaluation
Overview
Can more than one programme be included in an APE?
Planning for the next Periodic Review
Management Information
Timetable
Undergraduate Timetable
Postgraduate Taught Timetable
Completing the form
Front sheet
Milestones, targets and statistical data
Summary Evaluation
Good Practice/ Supporting the Education & Student Strategy
Thematic element
Student feedback
Action Planning
Appendix 1: Partnership provision

Guidance on Annual Programme Evaluation

The Policy on Annual Programme Evaluation (APE) sets out the purpose of APEs and the principles that guide their development and use.

This guidance should be used in conjunction with the policy and provides additional information to support the development and use of APEs. This guidance is specifically for internal and partnership provision. Guidance for validated provision can be found at: http://www.city.ac.uk/__data/assets/word_doc/0008/69254/ape_guidance_vip.doc.

Overview

APEs are designed to support reflection on the past year’s activity and to capture and monitor actions being taken to enhance programmes across the course of the present academic year.

They are the primary way in which programme teams plan and report on performance against key metrics for student satisfaction, progression and employability.

They are ‘living documents’, updated throughout the year to reflect on-going feedback, developments and new data. Programme Committees monitor progress at every meeting and Boards of Studies consider APEs twice each year. The APE (and particularly action planning) should be developed in consultation with Staff Student Liaison Committees (SSLCs).

Education & Student Committee and its substructure consider the contents of APEs to inform strategic decision-making. Senate receives an annual report on quality and standards and best practice as part of the APE cycle.

APEs should be based on evidence from a variety of sources including (but not limited to):


·  data on recruitment, progression, performance, retention and graduate destinations
·  NSS/PTES/PRES/Your Voice surveys
·  module evaluation
·  SSLC minutes
·  External Examiner feedback
·  professional body reviews or audits
·  periodic reviews


APEs are also designed to capture actions being taken by Professional Services and by management colleagues within Schools.

Can more than one programme be included in an APE?

Some Schools/departments choose to cluster a number of programmes into a single APE. This is very helpful where a number of modules are shared or the broad provision is comparable.

Advice should be sought from the Associate Dean (Education) and the School Professional Services lead for Quality Standards and Enhancement on how best to group programmes. It is vital that differences across the provision are adequately reflected; feedback and action plans may relate specifically to a particular route or cohort. The summary evaluation is also an excellent place to draw out any key differences between the programmes covered.

Consideration should also be given to whether some or all of the Management Information tables and survey targets should be separated out so that differences in admissions, progression, destinations and the student experience can be identified and addressed.

A balance must be struck between covering a very small programme and clustering too many programmes together; both approaches present risks for meaningful evaluation and action planning. It is therefore vital that, where a cluster of programmes is covered by a single APE, each programme has a separate and targeted action plan.

Planning for the next Periodic Review

Periodic Reviews take place every six years. If you do not already know the year of your programme’s next Periodic Review please check with your Associate Dean (Education), School Quality and Standards lead or Student & Academic Services (please see key contacts).

The APE action plan should also be used to identify actions that could be taken forward through the Periodic Review process. Such actions can be quite broad-ranging and should support the development of action plans for Periodic Review.

Management Information

Staff responsible for preparing the draft of the APE should reference the technical annex.

If a programme is required to provide statistical data on an annual basis to an accrediting or professional body, and if this data covers the above areas, the programme team may be able to append that data rather than duplicating the information using the tables in the APE. Programme teams should discuss these options with Student & Academic Services.


Timetable

Throughout the lifecycle of the APE, on-going monitoring and development takes place via a standing item on Programme Committee and SSLC agendas. Both Committees should receive a full copy of the APE as early as possible in the APE lifecycle. Thereafter, the Committees will focus on progress against the action plan and on any developments required as a result of feedback and new data, including matters arising during the year.


Undergraduate Timetable

Monitoring and development of the previous year’s APE continues via Programme Committee and SSLC, overseen by the ADE, PD and HoD, and feeds into the cycle below.

June-July: Review and planning
Identify broad issues to be covered in the new APE.
Input: previous year’s APE; School and University strategic developments.
Responsibility: ADE, PD, HoD. Committee(s): Programme Committee.

June-Sept: Draft APE
Incorporate inputs as they become available. Identify and incorporate any development support needs for each action (insert under ‘Support needs and status’). Where possible, early consultation with students is also helpful (e.g. on internal survey results).
Input: Your Voice survey data; reflection on the operation of the programme/modules, student feedback and the direction of enhancement activity; admissions and destinations data; Assessment Board data/External Examiner comments and any actions arising; NSS results; External Examiner report(s); resit Assessment Board data; Partnership Co-ordinator’s statement.
Responsibility: PD in liaison with HoD and ADE as appropriate. Committee(s): Programme Committee.

According to School planning cycle: APE and School Plan
Meeting to discuss the APE’s interaction with the School plan.
Input: School plan/impact of any strategic developments; resource implications.
Responsibility: ADE, Dean, HoD, PD. Committee(s): School Exec.

Sept: Programme Committee sign-off
Confirmation that the APE is ready for receipt by the Board of Studies (send a copy to Student & Academic Services).
Responsibility: PD, ADE. Committee(s): Programme Committee.

Oct: Board of Studies approval
First-stage consideration and approval of the APE.
Responsibility: Dean. Committee(s): Board of Studies.

First SSLC of the year: Student input
Discuss and update the APE.
Input: targeted focus groups and other forms of feedback might usefully be used here in addition to the SSLC.
Responsibility: PD, ADE. Committee(s): SSLC, Programme Committee.

Each Programme Committee and SSLC: On-going updates
Monitoring and updates.
Input: matters arising; additional feedback; additional data.
Responsibility: PD. Committee(s): Programme Committee, SSLC.

Nov: Quality and thematic review
Report to University-level committees.
Responsibility: S&AS, DVC. Committee(s): E&S Committee/Partnerships WG.

As needed: Issues for institutional consideration
Discuss University-level issues.
Responsibility: DVC. Committee(s): Senate, UET/ExCo.

By the last Board of Studies meeting: Board of Studies receives APE update
Actions updated and monitored.
Responsibility: PD, Dean. Committee(s): Board of Studies.

The completed APE is then used to commence the development cycle for the new APE.

Postgraduate Taught Timetable

Monitoring and development of the previous year’s APE continues via Programme Committee and SSLC, overseen by the ADE, PD and HoD, and feeds into the cycle below.

July-August: Review and planning
Identify broad issues to be covered in the new APE.
Input: previous year’s APE; School and University strategic developments.
Responsibility: ADE, PD, HoD. Committee(s): Programme Committee.

July-November: Draft APE
Incorporate inputs as they become available. Identify and incorporate any development support needs for each action (insert under ‘Support needs and status’). Where possible, early consultation with students is also helpful (e.g. on internal survey results).
Input: PTES survey data; reflection on the operation of the programme/modules, student feedback and the direction of enhancement activity; admissions and destinations data; Assessment Board data/External Examiner comments and any actions arising; External Examiner report(s); resit Assessment Board data; Partnership Co-ordinator’s statement.
Responsibility: PD in liaison with HoD and ADE as appropriate. Committee(s): Programme Committee.

According to School planning cycle: APE and School Plan
Meeting to discuss the APE’s interaction with the School plan.
Input: School plan/impact of any strategic developments; resource implications.
Responsibility: ADE, Dean, HoD, PD. Committee(s): School Exec.

November-December: Programme Committee sign-off
Confirmation that the APE is ready for receipt by the Board of Studies (send a copy to Student & Academic Services).
Responsibility: PD, ADE. Committee(s): Programme Committee.

December-January: Board of Studies approval
First-stage consideration and approval of the APE.
Responsibility: Dean. Committee(s): Board of Studies.

First SSLC: Student input
Discuss and update the APE.
Input: targeted focus groups and other forms of feedback might usefully be used here in addition to the SSLC.
Responsibility: PD, ADE. Committee(s): SSLC, Programme Committee.

Each Programme Committee and SSLC: On-going updates
Monitoring and updates.
Input: matters arising; additional feedback; additional data.
Responsibility: PD. Committee(s): Programme Committee, SSLC.

February: Quality and thematic review
Report to University-level committees.
Responsibility: S&AS, DVC. Committee(s): E&S Committee/Partnerships WG.

As needed: Issues for institutional consideration
Discuss University-level issues.
Responsibility: DVC. Committee(s): Senate, UET/ExCo.

By the last Board of Studies meeting: Board of Studies receives APE update
Actions updated and monitored.
Responsibility: PD, Dean. Committee(s): Board of Studies.

The completed APE is then used to commence the development cycle for the new APE.


Completing the form

Front sheet

Purpose: To provide the School Board of Studies with an overview of responsibilities and progress

Completion:

1.  Complete all sections of the first box except ‘key contact’, which should only be completed where the APE covers more than one programme (see ‘Can more than one programme be included in an APE?’ above for guidance).

2.  For all partnership provision, complete the Partnership section and ensure the Academic Partnership Co-ordinator Annual Report is appended to the APE. More information is available in Appendix 1.

3.  Complete the ‘Progress Tracking’ section as the APE progresses through the year.

Milestones, targets and statistical data

Statistical data

The APE is accompanied by a series of tables which are completed throughout the year as data becomes available. Programmes may also wish to add data available to them related to the specific discipline, demographic information or other areas which will help with evaluation and planning. Commentary is likely to focus on:

·  achievements against milestones and targets (see below)
·  trends seen year on year
·  particularly weak or strong indicators, together with any contextual information that explains weak/strong performance
·  what the data indicates for the overall strategy of the programme moving forward
·  performance in relation to equivalent programmes/cohorts in the University and to sector competitors

Milestones

You will find the milestones for your subject area under your School section of the Performance Indicators pages on the Strategic Performance and Planning Unit website: http://www.city.ac.uk/intranet/strategy-and-planning/performance.

Milestones monitored through annual programme evaluation are for student satisfaction, progression, achievement, and employability as follows:

1.  Student satisfaction: NSS – average of questions 1-21
2.  Student satisfaction: PTES – question 13g (this is no longer available via PTES, so local targets should be set)
3.  Student satisfaction: Your Voice (the milestones are the same as for the NSS)
4.  Achievement: good honours degree (1st or 2:1)
5.  Progression & completion: first-year, full-time, first-degree students progressing into the second year
6.  Employability: graduate prospects

The milestones take into account disciplinary differences (e.g. for student satisfaction some disciplines consistently score higher than others).
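
As a purely illustrative sketch (not part of the APE form or any University system), the arithmetic behind milestones 1, 4 and 5 could be expressed as follows; all figures, counts and variable names below are hypothetical and chosen only to show how the indicators are calculated.

```python
# Illustrative only: hypothetical cohort figures, not real University data.

# Milestone 1 - student satisfaction: average NSS agreement score across questions 1-21
nss_scores = [82, 79, 85, 77, 80, 84, 78, 81, 83, 76,
              79, 82, 80, 77, 84, 81, 79, 83, 80, 78, 82]  # % agree for Q1-Q21 (hypothetical)
nss_average = sum(nss_scores) / len(nss_scores)

# Milestone 4 - achievement: proportion of graduates with a good honours degree (1st or 2:1)
awards = {"first": 28, "upper_second": 61, "lower_second": 24, "third": 5}  # hypothetical counts
good_honours = (awards["first"] + awards["upper_second"]) / sum(awards.values()) * 100

# Milestone 5 - progression: first-year, full-time, first-degree students progressing into year two
entrants, progressed = 118, 104  # hypothetical headcounts
progression_rate = progressed / entrants * 100

print(f"NSS average (Q1-21): {nss_average:.1f}%")
print(f"Good honours: {good_honours:.1f}%")
print(f"Progression to year 2: {progression_rate:.1f}%")
```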

Targets

Targets are set for individual questions within the NSS, Your Voice and PTES by Schools, departments and programme teams to help them break down the steps towards achieving milestones.

Targets should be stretching yet achievable and support the management of staff and students’ expectations. Setting target scores also provides a valuable opportunity to discuss areas for improvement with students. This process should not be completed solely by the Programme Director as there will be many factors outside his/her control that will contribute to achieving change (e.g. organisation and management matters such as administrative support). Engagement from Heads of Department or equivalent and Deans is an important part of this management process.

Targets should be realistic, focus on the areas of greatest concern, and may be set for banks of questions rather than individual questions. Some programmes have chosen to set a target range rather than a specific number (e.g. 80-85), as small fluctuations may not be statistically significant. In setting target scores it is important to take into account the overall student experience of the particular cohort, and how different aspects of the student experience can impact positively or negatively on satisfaction levels overall.
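
To make the target-range idea concrete, the following sketch (illustrative only) compares a hypothetical survey score with a target range and with the previous year's result; the function name, the example scores and the three-point threshold for what counts as a meaningful year-on-year movement are all assumptions made for illustration, not University rules.

```python
# Illustrative only: hypothetical scores and an assumed 3-point threshold for "meaningful" change.

def assess_score(current, previous, target_range=(80, 85), min_change=3.0):
    """Report whether a survey score sits within its target range and whether the
    year-on-year movement is large enough to be worth commenting on."""
    low, high = target_range
    if current < low:
        position = f"below the {low}-{high} target range"
    elif current > high:
        position = f"above the {low}-{high} target range"
    else:
        position = f"within the {low}-{high} target range"

    change = current - previous
    if abs(change) < min_change:
        movement = "a small fluctuation, unlikely to be meaningful on its own"
    else:
        movement = f"a change of {change:+.1f} points on last year"

    return f"Score {current:.1f}: {position}; {movement}."

print(assess_score(current=83.0, previous=81.5))   # within range, small fluctuation
print(assess_score(current=76.0, previous=82.0))   # below range, notable drop
```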

Some programmes will not achieve publication thresholds for data because of small student groups or low response rates. Where this is the case, aggregate data can be requested from Shereen Sally in Student & Academic Services; the aggregate data will group a programme with other equivalent programmes to provide the most meaningful data possible to support the programme in setting and working towards targets.

Summary Evaluation

This section is likely to provide useful context for discussion of the APE at Boards of Studies and with students. It provides an opportunity to reflect on the overall health of the programme over the past academic year, assessing the cumulative impact of enhancement activity and any relevant wider changes within the programme, discipline area, School, University and/or sector. It can also be used to reflect on future direction, relevant opportunities and challenges as well as to highlight any issues and contextual factors the programme team has not raised elsewhere on the form. Some programme teams have usefully employed a SWOT (strengths, weaknesses, opportunities, threats) analysis as part of the summary evaluation.

Good Practice/ Supporting the Education & Student Strategy

This section is an opportunity for programme teams to reflect on general good practice. It also allows for identification of where activities and initiatives, either planned for or already embedded in the programme, support implementation of the Education and Student Strategy. This is particularly useful for programme teams in identifying where specific activities align with the five areas (hubs) identified by the strategy.