7 / Monitoring and Evaluation
7.1 / Introduction
7.1.1 / This chapter is concerned with the procedures by which the University evaluates the continuing effectiveness of its curriculum and of assessment in relation to the intended learning outcomes[1]. In developing these processes, the University’s Strategic Plan[2] and the QAA’s UK Quality Code[3] (the expectations outlined in Part A[4] and Chapter B8) have been taken into account.
7.1.2 / In order to discharge its academic functions effectively, the University must be in a position to assess its performance at various interlocking levels:
  1. it must display a capacity to carry out an all-institution audit of the performance of its taught provision
  2. there must be capacity for analysing the performance of taught provision within faculties
  3. taught courses must be susceptible to individual and collective review, on an annual (immediately after delivery and when full results are known) and periodic basis. Here, design and delivery issues (input and throughput standards) meet output standards in terms of student awards (and their regulatory frameworks) and employment outcomes. Taught modules, where the student experience becomes very manifest, must be analysable immediately after delivery and again when full results are known.

7.1.3 / These levels of interlocking scrutiny will enable the University to investigate quality assurance, enhancement and standards issues ranging from the individual unit (module) to overall University performance.
7.1.4 / With this in mind, the University’s Monitoring and Evaluation procedures have been developed to provide Module and Course Leaders, Performance Enhancement Meeting (PEM) Chairs, faculty managers and University level committees with a range of evidence with which to evaluate the quality and standards of the University’s teaching, assessments, marking and awards. Evidence includes data on student progression and achievement, the views of External Examiners and students, and nationally available data such as NSS and DLHE outcomes. Monitoring and Evaluation procedures also provide the means by which both faculties and the University can evidence the quality of teaching and learning and the student experience overall. In addition, these outcomes provide an evidential tool for identifying and sharing good practice and promoting consistency in quality assurance across the University[5].
7.2
7.2.1
7.2.2
7.2.3
7.2.4
7.2.5
7.2.6
7.3 / Reporting and Analysis Structure
Reporting levels and sources
Reporting is based on quantitative data and on commentary and evaluation by a range of people, including University staff, students and External Examiners.
All data used in monitoring are drawn from the University’s Student Records System (SRS), where performance data are updated to reflect the decisions taken or approved by relevant University committees or authorised officers.
The key decisions concerning student academic achievement will have been taken:
  • For modules, at Subject Standards Board (SSB) level, with quality assurance provided by the algorithms in the SRS and by the internal double-marking and external Subject Standards Examiner moderation systems
  • For progression, at Course level, by the University’s progression process, with quality assurance provided by the algorithms in the SRS and nominated senior Faculty staff
  • For awards, at the University’s Awards Board, with quality assurance provided by the algorithms in the SRS and by the auditing carried out by External Awards Examiners.
Performance is therefore analysed at module and course level and consolidated at SSB/PEM level. Faculty and University level performance is analysed at the Faculty and University Undergraduate and Postgraduate Committees.
Module Level
Module level data focus on pass rates and mark averages for overall module results. Average module mark and component level data are also available to facilitate consideration of individual assessments.
Course Level
Course level data are based on the credits and completions achieved by students and the RPA[6] consequences: for instance, whether a student has achieved sufficient credit to progress to the next level (Progression); whether a student has completed the requirements for an award (Achievement); and whether a student may stay with the course they joined, fail, or withdraw (Retention).
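By way of illustration only, this categorisation can be sketched in code. The function, field names and credit thresholds below are hypothetical; the algorithms in the SRS remain the authoritative implementation.

# Minimal sketch of course level RPA categorisation. All names and
# thresholds here are hypothetical; the SRS algorithms are authoritative.

def rpa_status(credits_earned: int, progression_credits: int,
               award_credits: int, withdrawn: bool = False) -> str:
    """Classify a student's retention/progression/achievement outcome."""
    if withdrawn:
        return "Retention: withdrawn"
    if credits_earned >= award_credits:
        return "Achievement: requirements for an award completed"
    if credits_earned >= progression_credits:
        return "Progression: sufficient credit for the next level"
    return "Retention: continuing on course without progressing"

# Example: 100 credits earned against a 120-credit progression requirement
print(rpa_status(100, progression_credits=120, award_credits=360))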
Consolidated Reporting Levels
Consolidated data at levels above course level use the same categorisation/analysis as at course level, thereby enabling comparison between the performance of different subject areas (SSB/PEM level) and different faculties, as well as year on year comparison.
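Purely as an illustration of this roll-up, consolidated pass rates could be derived from course level counts along the following lines. The faculty, SSB and course names and all figures are invented; OIE data are the authoritative source.

from collections import defaultdict

# (faculty, SSB, course) -> (passes, enrolled); invented figures
course_counts = {
    ("Faculty A", "Accounting SSB", "BA Accounting"): (80, 100),
    ("Faculty A", "Accounting SSB", "BA Finance"): (65, 90),
    ("Faculty B", "Biology SSB", "BSc Biology"): (110, 130),
}

for depth in (2, 1):  # consolidate to SSB level, then to faculty level
    totals = defaultdict(lambda: [0, 0])
    for key, (passes, enrolled) in course_counts.items():
        totals[key[:depth]][0] += passes
        totals[key[:depth]][1] += enrolled
    for key, (passes, enrolled) in sorted(totals.items()):
        print(" / ".join(key), f"pass rate {passes / enrolled:.0%}")

The same pass/enrolled counts simply accumulate at each higher level, which is what makes like-for-like comparison between subject areas and faculties possible.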
Module and Course Log Templates
The University provides standard Module and Course Log templates for monitoring purposes.[7] Prior to the July PEM, Academic Registry produces and circulates module data for Module Leaders to incorporate in their Module Logs, and at course level the Office of Institutional Effectiveness (OIE) pre-populates logs with interim performance data and distributes them to faculties. Course and Module Leaders are then required to evaluate the modules/courses for which they are responsible and propose any action required.
Course and Module Log Data
Retention, progression and achievement (RPA) data
7.3.1 / The Office of Institutional Effectiveness (OIE)[8] is the University’s authorised source for monitoring and evaluation data. Course and module data are extracted from the University’s Student Records System (SITS) and entered onto Module and Course Logs. The OIE also provides data on the destination of leavers from higher education (DLHE) outcomes and student feedback collected through the National Student Survey (NSS).
7.3.2 / Course and Module Leaders are expected to analyse and comment on data in the Course and Module Logs at the July census point, when all teaching and all initial assessments for taught provision have been completed and results (and, for undergraduates, awards) published. Whilst these interim data do not deliver a complete picture of module and/or course performance, as they precede resits and postgraduate dissertations, they provide Course and Module Leaders with an important early indicator of performance and student behaviour[9], sufficient to enable key issues to be identified and action planned.
7.4 / External Examiner Commentary
7.4.1 / In a system where universities have degree-awarding powers and are responsible for the quality and standards of their own taught provision, the External Examiner arrangement is the principal means of benchmarking academic standards nationally.
7.4.2
7.4.3 / Subject Standards Examiners provide, at module level, advice on the suitability of draft assessment tasks and on the marking standards of students’ completed assessments. Subject Standards Examiners are asked to attend PEMs[10], where they have the opportunity to comment more generally (including above module level) on the basis of performance information supplied to them. Subject Standards Examiner Annual Reports provide institutions with independent feedback at module, course and SSB level. In line with the UK Quality Code’s[11] expectation and indicators on external examining, the University expects its External Examiners to comment on good practice and opportunities for improvement in addition to confirming standards. Module and Course Leaders are expected to reflect on External Examiner comments in Module and Course Logs.
Awards Examiners are appointed to the University Awards Board, and provide advice at institutional level via the three meetings of the Board each year and also via their Annual Reports. Awards Examiners comment and advise on the probity and management of the University’s awards processes and regulations, on academic standards, on the effectiveness of the University’s monitoring and improvement processes (with emphasis on RPA), and on strategic level initiatives affecting the student experience. This is in addition to their fundamental role of auditing and verifying proposed awards. The Board works on the basis of Board papers and presentations, mostly based on University level data (normally broken down to faculty or school level and, where appropriate, also down to SSB level) provided by the OIE or Academic Registry. Occasionally Awards Examiners will also be asked to conduct audits of subject areas (typically at Faculty/School level) or of a University-wide change process.
7.5 / Course Committees
7.5.1 / Course Committees[12] are an important element of the quality assurance and enhancement cycle; their remit includes reviewing Course and Module Logs and considering External Examiner reports. Course Leaders are expected to reflect on feedback from Course Committees in Course Logs. For further information on Course Committees, see also the chapter on Student Engagement.
7.6 / Course and Module Questionnaires
7.6.1 / The outcomes from Course and Module questionnaires, which are completed in May each year for undergraduate courses, and January and May for postgraduate, are available at the end of the semester or academic year, as appropriate. Course and Module Leaders are expected to reflect on these outcomes in Course and Module Logs.
7.7 / Performance Enhancement Meetings (PEMs)
7.7.1
7.7.2 / Performance Enhancement Meetings (PEMs) are designed to provide faculties with the opportunity to assess the academic health of modules and courses, monitor the performance of students and engage with Subject Standards Examiners regarding the fitness for purpose of the course or module and ways of enhancing it.
Performance Enhancement Meetings (PEMs) take place following the Subject Standards Board (SSB) mark confirmation meetings and subsequent publication of marks to students, and the early July University Awards Board and publication to students of progression and awards. The PEMs are scheduled at the earliest possible time, bearing in mind the need to produce and consider module and course data. All examiners, both external and internal, are invited to attend PEMs.
7.7.3 / The PEM and the equivalent SSB normally share a common membership and Chair. PEMs are scheduled by the Academic Registry but operate under the aegis of the Faculty Undergraduate & Postgraduate Committee, with Secretaries being provided by the appropriate faculty. Information on the arrangements for PEMs is available through QEU’s Monitoring and Evaluation webpage[13]. The Academic Registry also provides operational guidance to External Examiners, PEM Chairs, Vice Chairs, Deans and others in the run up to meetings.
7.7.4
7.7.5
7.7.6 / The PEM focuses primarily on course level performance. Module results will have been considered at an earlier point by Course and Module Leaders, with Module Logs and draft Course Logs provided to the PEM Chair. Modules with a 60% or lower pass rate at July (Modules of Concern) are also considered at the PEM, along with any evidence of good practice that is worth sharing. The PEM Chair should identify prior to the meeting which modules need to be considered, but will also ensure the meeting addresses any issues raised by External Examiners.
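Purely as an illustration, the Modules of Concern threshold could be applied to July module results as follows. The module codes, marks and the pass mark of 40 are invented; only the 60% threshold comes from the paragraph above.

# Illustrative flagging of Modules of Concern at the July census point.
PASS_MARK = 40            # assumed undergraduate pass mark
CONCERN_THRESHOLD = 0.60  # July pass-rate threshold from this chapter

modules = {
    "ABC101": [35, 52, 61, 38, 39, 70],  # invented marks
    "ABC102": [55, 62, 71, 58, 49, 66],
}

for code, marks in modules.items():
    pass_rate = sum(mark >= PASS_MARK for mark in marks) / len(marks)
    average = sum(marks) / len(marks)
    flag = "  <- Module of Concern" if pass_rate <= CONCERN_THRESHOLD else ""
    print(f"{code}: pass rate {pass_rate:.0%}, average mark {average:.1f}{flag}")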
Prior to the meeting, data consolidated at SSB level will be provided to the Chair by the OIE. The Chair will share these data and encourage the meeting to reflect on them if, and to the extent that, he/she considers this useful. The meeting, as the primary face-to-face meeting of the year concerning performance enhancement, will be formally minuted. It is important that documentation is finalised promptly after the meeting. The intention is that actions arising can, where appropriate, be instigated in time to affect provision for the following academic year. In particular, actions, which may commonly need to be approved via the formal modification process, may impact on curriculum and/or assessment and where possible need to be reflected in changes to specifications and handbooks.
Information and/or decisions from PEMs flow into other areas of quality assurance and enhancement: to Faculty committees such as the Faculty Undergraduate & Postgraduate Committee, and into central University committees, such as the Undergraduate & Postgraduate Committee. The diagram below illustrates these connections.

[Diagram: Monitoring and evaluation process, showing for each level the information and source, core documentation and scrutiny forum]
7.8 / Timing of PEMs
7.8.1 / Performance Enhancement Meetings (PEMs), for all taught provision, are scheduled to take place in July, approximately five weeks after the completion of mark entry and four weeks after the consideration of student work and the confirmation of marking standards by External Examiners. This allows for essential decisions affecting courses and modules to be taken at the earliest possible stage following the PEM. Where issues have been identified by a PEM, Module and Course Teams have up to two weeks to complete the process required for modification (see the chapter on Modifications) in order to effect improvements to the course/module for the following academic year.
7.8.2 / The PEM Chair’s Report is based on updated, aggregated module, course and subject level data from the OIE, and takes into account re-sit and postgraduate dissertation results. The PEM Chair’s Report is expected to be completed in November and should draw heavily on the sources that were considered at the July PEM.
7.8.4 / In summary, the two phases of the PEM process are:
  1. Performance Enhancement Meeting (PEM), held as early as feasible in the latter half of July. Data are provided in advance by the Office of Institutional Effectiveness (OIE)[14] (and, in the case of supplementary module information, by Academic Registry) to faculty participants and managers to aid their preparation for the meeting. Academic Registry provides for distribution to Subject Standards Examiners.
  2. PEM Chair’s Report, taking into account relevant Course Logs, completed in November when full year data (re-sit and postgraduate dissertation outcomes) are available.
There are separate templates for undergraduate and postgraduate provision.
The standard documents for the process are as follows:
Module Log
  • QEU0023a PG Module log form
  • QEU0023b UG Module log form
Course Log
  • QEU0024a PG Course Log form
  • QEU0024b UG Course log form
PEM Chair’s Report (PCR)
  • QEU00XX UG PEM Chair Report Template
  • QEU00XX PG PEM Chair Report Template

7.8.5
7.8.6
7.8.7
7.8.8
7.8.9
7.8.10
7.8.11 / Updated data
For phase two, updated data are produced by the OIE at module and course level. Module/Course Leaders will associate these data with the phase one Logs produced in July and consider whether the actions contained in their phase one commentary need to be updated.
New data and reports
In the case of modules not considered at the July PEM, namely postgraduate dissertations and other exceptional summer modules, full Module Logs will be completed at this stage.
In addition, full PG Course Logs should be produced at this second phase, as awards and completion/non-completion information was not available for the first phase of the PG Course Logs.
Thresholds and targets
It should be noted that different threshold expectations will apply at phase two, e.g.
  • the UG threshold module pass rate would be 70% after resits,
and similarly for other key indicators such as
  • percentage of Year 4 and 5 students progressing to the next year,
  • percentage of final year students achieving target award,
  • percentage of final year students achieving a “good degree” (1st + 2.1 or Distinction + Merit).
Given acknowledged sector-wide variations by subject area / discipline, for some indicators faculties will set their own targets.
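As an illustration only, the phase two indicators could be computed along the following lines. The cohort data are invented; the 70% resit-adjusted pass threshold and the good degree definition are taken from the list above, and any faculty-set targets would replace the defaults shown.

# Sketch of phase two key indicators using invented cohort data.
GOOD_UG = {"1st", "2.1"}             # good degree (UG): 1st + 2.1
GOOD_PG = {"Distinction", "Merit"}   # good degree (PG): Distinction + Merit

def good_degree_rate(classifications, postgraduate=False):
    good = GOOD_PG if postgraduate else GOOD_UG
    return sum(c in good for c in classifications) / len(classifications)

# Hypothetical final year undergraduate cohort
ug_awards = ["1st", "2.1", "2.1", "2.2", "3rd", "2.1", "2.2", "1st"]
print(f"Good degrees: {good_degree_rate(ug_awards):.0%}")

# Resit-adjusted module pass rate against the 70% phase two threshold
passes_after_resits, enrolled = 58, 80
pass_rate = passes_after_resits / enrolled
print(f"Pass rate {pass_rate:.0%}; meets 70% threshold: {pass_rate >= 0.70}")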
New documentation
The new documentation which is produced only at phase two is the
PEM Chair’s Report (PCR)
  • QEU00XX UG PEM Chair Report Template
  • QEU00XX PG PEM Chair Report Template
Following on from the above, faculties produce a Faculty Annual Monitoring Statement (FAMS), which is considered first at the faculty Undergraduate and Postgraduate Committees and thereafter at the University’s Undergraduate and Postgraduate Committee (see 7.9 below).
7.9 / Faculty Annual Monitoring Statement (FAMS)
7.9.1 / The FAMS is intended to provide the University’s Undergraduate and Postgraduate Committee with the following:
  • an evaluation of Faculty performance at undergraduate and postgraduate level
  • confirmation of the implementation of the University’s quality assurance procedures
  • information on validation/review/modification/deletion activity
  • update on enhancement within the Faculty
  • an action plan

7.9.2 / The PEM Chairs’ Reports and Course Logs should be used to inform the Faculty Annual Monitoring Statement (FAMS). Other evidence should include External Examiner reports, course committee minutes, student evaluation questionnaire results and PEM outcomes.
7.9.3 / The scrutiny of the Undergraduate and Postgraduate Committee is essential to provide the University with assurance that, through its processes, the institution is managing risk appropriately and proportionately for its portfolio of programmes.
7.9.4 / The Undergraduate and Postgraduate Committee’s consideration of the FAMS also provides an institutional level platform for the identification and dissemination of effective practice (both internally and externally) and a means of promoting the benefits gained by the institution as a whole (staff, students and other stakeholders) from the annual round of monitoring and review activities.
7.10 / Annual Monitoring and Periodic Review
7.10.1 / The University’s taught provision is reviewed every five years; the process is described more fully in the chapter covering validation and review. Unlike the day-to-day quality management of taught provision, Periodic Review takes into account a changing environment, longitudinal data, market trends, current research and so on. To do this, the review process draws heavily on the Module and Course Logs, External Examiner comments and PEM minutes, amongst other things.
7.11 / Monitoring and Evaluation for Collaborative Partners
7.11.1 / Collaborative partners delivering University courses are required to keep Module and Course Logs for each module and course delivered by them. Collaborative Module and Course Logs are adapted versions of the on-campus templates and are employed in the same way at Performance Enhancement Meetings (PEMs).
7.11.2 / In addition to the above, partners are required to complete a Collaborative Annual Monitoring Statement (CAMS) summarising the performance of all courses delivered by the partner and updating the University on any major developments.
7.11.3 / Module and Course Logs are submitted to the dedicated Academic Liaison Tutor for their commentary on the academic health of provision delivered by the partner. Once completed, Course Logs are submitted to the faculty’s Quality Representative and to the QEU.
7.11.4 / The Collaborative Annual Monitoring Statements are submitted to the faculty manager responsible for collaborative provision and to the QEU for consideration at faculty level. The QEU evaluates all Course Logs and CAMS and submits an annual collaborative provision report to the University’s Undergraduate and Postgraduate Committee for consideration and, if required, action.
7.11.5 / Having taken into consideration the Course Logs and CAMS, the faculty manager (or nominee) responsible for collaborative provision submits a Partnership Annual Monitoring Statement (PAMS) as an accompaniment to the QEU’s annual collaborative provision report to the University’s Undergraduate and Postgraduate Committee for consideration and, if required, action. The diagram below depicts the process.
7.11.6 / Timings will of necessity depend upon the teaching and assessment cycles of each partner, but performance statistics will be generated and formally considered only once, when the annual cycle including the calculation of awards is complete.
7.11.7 / It will commonly be convenient for the PEM to take place on the same day as and immediately after (or incorporated within) the final SSB of the year. In this case the PEM will use partner-provided or SSB Secretary-provided data (assuming proposed module marks will be confirmed and award proposal calculations will be agreed) as the basis for PEM discussion. Records including draft Module Logs and draft Course Logs will be revised as necessary subsequently, to reflect final authorised data entered in the University’s SRS. The SSB Secretary will normally also be the PEM Secretary.
7.11.8 / In the few remaining cases where there is no dedicated SSB for the partnership and module results are instead considered and confirmed at one or more SSBs designed for London Met taught provision, then a separate PEM for the partnership will normally need to be scheduled.
7.11.9 / Generally for undergraduate level collaborative provision, the PEM will follow the final SSB and therefore be between June and September, while for postgraduate level collaborative provision the PEM will follow the final SSB and therefore be between October and December.
7.11.10 / CAMS should normally be completed by the end of January and discussed at the next Faculty UG/PG Committee.
7.11.11 / PAMS should be completed in early February and the annual collaborative provision report (ACPR) in mid-February. Both will be discussed at the next University UG/PG Committee.

[Diagram: Annual Monitoring for Collaborative Partners]