TFAS Project – Examination Feedback: DRAFT TOOLKIT
This toolkit is designed to offer the opportunity to:
- Explore the significance and purpose of offering feedback to examinations;
- Identify various models of examination feedback;
- Assess key factors that may influence the choice of examination feedback model;
- Develop a programme strategy for examination feedback; and
- In light of the programme strategy, develop models of examination feedback at the modular level.
The Significance and Purpose of Examination Feedback
There has been considerable focus within the HE sector on both formative and summative feedback mechanisms, with feedback and feedforward cycles highlighted as examples of good practice that enhance student learning and motivation. Assessment feedback is a valuable part of the student learning cycle, and the quality of feedback within a module can motivate a student and enhance their performance and future employability (Nicol and Macfarlane-Dick, 2006).
Written, oral and practical examinations assessing specific learning outcomes can occur mid-cycle and/or at the end of a module or period of study. Traditionally (particularly for end-of-module, time-limited and unseen examinations) a sole final grade, with little or no feedback, is returned to the student. Indeed, much focus has been placed on feedback for coursework-based assessment and “seldom on …exams which may also have a significant effect on …overall marks or grade” (Hounsell et al., 2007b). For example, whilst Plymouth University, like many institutions, permits a student to make an appointment with a tutor to talk through an exam script, this is not widely promoted and staff report little take-up by students of this opportunity.
In addition, given concerns that marks for exams are generally lower than those for coursework, and the implications this may have in determining degree classification, exam feedback becomes critically important (Bridges et al., 2002; Simonite, 2003).
The general lack of feedback on examinations has prompted student concerns (NUS, 2009) – it creates uncertainty and limited opportunities to improve and learn, particularly for any future examination-based assessment. As Scoles et al. (2012) argue: ‘…while exams have long been regarded as different from other forms of assessment it is not justifiable to exempt them from the good practice that can, and does, inform other types of assessment and other areas of teaching and learning. This need is reinforced by the continuing use of exams as a significant part of course assessment strategies in many subject areas.’
Recognising that examination feedback has the potential to make an important contribution to enhancing both the learning experience and student performance (Muldoon, 2012), many institutions have amended their assessment policies to specifically extend the expectation of feedback to examinations.
This toolkit offers a means by which programmes and modules can reflect on current practice and enhance their approach to providing feedback to examination based assessment.
Models of Examination Feedback
Analysis of practice across HEIs and the literature identifies a variety of models capable of offering effective and meaningful exam feedback. These recognise a need to offer formative assessment opportunities with feedback, as well as summative (Springer Sargent and Curcio, 2012). Some examples of feedback mechanisms are illustrated below:
Examples of feedback to formative and summative examinations and tests, with suggestions for how each can be delivered, are grouped below.

Generic
- Written: summary covering minimum requirements and linked to learning outcomes – could be based on distinguishing between classifications. Delivery: email; Moodle; timetabled session (note the potential of content capture).
- Written: ‘model’/outline answers and/or an exemplars bank that could be sent post-exam and/or accessed by students on demand.
- Verbal: Q&A session.
- Verbal: peer discussion/feedback (e.g. of individual formative work and/or exemplars).
- Audio: podcast/video.
- Combination: PowerPoint with narration.

Individual written
- Template/summary feedback form covering minimum requirements and linked to learning outcomes, with specific reference to how to improve/what to do next. Delivery: hard copy (collected from/distributed by tutor/support office); Moodle; PebblePad.

Individual verbal/audio
- 1:1 appointment with the exam marker using the script (formative mock and/or summative). Delivery: generally promoted with on-demand appointments; blocked-out periods for appointments (‘exam open day’/‘feedback week’ etc.); Moodle; PebblePad.
- Meeting with the Personal Tutor, who has received reports from markers as part of the PDP process.
- Examination ‘open day’ to look through papers with tutors.
- Individual audio recording such as a podcast.
Some things to note:
- Using exemplars may involve a combination of the above, e.g. a written exemplar with written feedback on its strengths and weaknesses, or a written exemplar discussed in session with peer feedback and/or verbal feedback from the tutor – see Scoles et al. (2012) and Handley and Williams (2009) for examples.
- Mock exams may offer the opportunity to provide a number of different types of formative feedback – see the case study by Copestake (2006) for an example.
- Good practice indicates that combinations of these methods across formative and summative examination-based assessment can be particularly effective.
For example:
- Summative generic written + offer of 1:1 appointment for additional verbal feedback
- Summative generic written + generic group session for Q&A
- Individual formative written + summative generic written
- Formative using exemplars bank and class discussion + summative generic or individual written
- 1:1 appointment with exam marker using script + written individual summary to take away
- Range of formative types + examination ‘open day’
Key Factors in Choosing a Model of Examination Feedback
Whilst the underpinning principles of providing meaningful and timely feedback apply to exams as they do to other forms of assessment, a ‘one size fits all’ model for exam feedback can be difficult to achieve in light of key factors affecting what is both practicable for academic staff and useful for students.
The starting point should therefore be the development of underpinning principles for the design of the programme’s examination feedback strategy. To begin this process, complete a table similar to the one below for a standard three-year UG programme to identify key information (add rows for any additional levels you wish to incorporate):
Level | Semester | Module Code/Title | Exam (Y/N) | Cohort Size | Exam Weighting | Nature/Format of the Exam | Purpose (see assessed Learning Outcomes) | Location and Type of Student (campus-based; FT; PT; DL etc.) | Access to Exam Script | Professional Body Requirements
4 | 1 | List core and elective modules
4 | 2 | List core and elective modules
5 | 1 | List core and elective modules
5 | 2 | List core and elective modules
6 | 1 | List core and elective modules
6 | 2 | List core and elective modules
As a programme team, discuss the following key factors in developing underpinning principles for an examination feedback strategy:
Level
- Do you have examinations across all levels of the programme?
Comments: Programmes may wish to distinguish, in their policy/strategy, between feedback methods applicable to modules at different levels, depending on the significance of examinations at each level, and thus ‘sequence’ or ‘scaffold’ feedback requirements accordingly.
Size of the Cohort
- What is the overall size of the cohort?
- What is the size of the cohort at each Level?
- If you have core modules with examinations, what is the average cohort size at each Level?
- For electives with examinations, what is the average cohort size at each Level?
Comments: The size of any individual cohort is an influential factor in what is practicable in terms of feedback. With small cohorts, for example, individual 1:1 verbal feedback by way of appointment may be practicable in a way that is not possible with large cohorts.
However, in programmes where examination is a key method of assessment, its significance, particularly at Level 4 (see the factor above) or in core modules, may mean that even with larger cohorts individual 1:1 feedback should be provided, because of the significant impact it can have in improving understanding of feedback, performance and satisfaction.
Weighting
- In the modules with examinations, what is the weighting?
- Do you have any modules that are 100% examination?
Comments: The extent of both formative and summative feedback should relate to the weighting of examination as a component of assessment. For example, modules with 50–100% examination should contain both formative and summative assessment and feedback opportunities, and may want to offer individual feedback, particularly if future modules also have a high weighting for examination-based assessment.
Nature and purpose of the exam
- What type of examination is used? (For a range of examples see Appendix 1.)
- Is there any particular type of examination format used at any level (e.g. MCQ)?
- What is the purpose of the examination? Is it, for example, to identify knowledge or to test practical skills? (Check programme and module learning outcomes.)
Comments: The nature/type of exam may in turn drive the nature of the feedback provided. For example, MCQ-based exam assessment may lend itself to immediate, generic feedback, which can be individualised if IT-based. The same may apply to seen exams/questions, which could also utilise an exemplar bank.
In contrast, individualised feedback may be more appropriate for unseen exams with a choice of exam questions across the syllabus. The same may apply if an exam is intended to assess skills such as the ability to critically analyse or apply, particularly if these skills are key to improvement and enhanced success at a subsequent level.
Location and type of student
- Are your students on or off campus?
- Are your students full-time, part-time or distance learners?
Comments: The location of students can be a significant factor in determining the type of feedback. For example, offering 1:1 verbal feedback may be difficult if students are not located on campus, although written or audio-recorded feedback may be feasible. Similar considerations may apply if students are PT and/or DL; in such cases, solutions through the DLE can be explored to provide an experience equivalent to that of FT students.
Timing
- When do summative examinations take place, e.g. Semester 1 or Semester 2?
- When does formative examination assessment and feedback take place?
Comments: Semester 1 summative examination feedback may be both easier to provide (in a variety of ways, because students are in second-semester classes) and of significant benefit if exams follow at the end of Semester 2.
The types of Semester 2 examination feedback may be limited by students leaving the campus or ending their year of studies. In such cases, consideration may be given to offering feedback at the beginning of the next level of study, particularly if examination is a form of assessment at that level.
Similar considerations may apply to the types of feedback offered for formative exam assessment opportunities.
Timescales
- What is your absolute marking timescale?
- Are there any examples of where this should vary?
Comments: 20 working days is the maximum timescale for providing both grades and feedback.
Factors that may prompt a shorter period include, for example:
- when an exam follows soon after a preceding one, so that feedback from the first may be useful, or when another module developing the same knowledge follows immediately, making quick feedback beneficial;
- at the end of the year, when marks may need to be available more quickly because of panels and boards and students leaving. In such circumstances, as noted above, generic feedback supplemented with additional feedback at the start of the next Level may be considered.
Exam scripts
- What is your access to exam scripts?
Comments: One effective type of exam feedback is to provide individual 1:1 appointments to work through an exam script. Doing this requires access to that script. Making arrangements to facilitate this may be particularly important for exam feedback ‘open days’.
In light of the above discussion, confirm a programme strategy for examination feedback that offers sequencing and scaffolding; ensures consistency; and generates clarity of expectation for both students and staff. In addition to the above key factors, there may be some discussion of whether the strategy should offer specific support for some students, for example those who are unsuccessful in examination assessment and those with DAS requirements.
Examples of potential principles underpinning a programme strategy could include the following. Discuss as a team whether any of these, or indeed any others you can think of, may be appropriate and relevant to your strategy:
- Individual feedback to be provided for examinations at Level 4; generic exam feedback to be provided as a minimum, with individual feedback offered ‘on demand’, for examinations at Level 5 and Semester 1 of Level 6; generic feedback only to be provided for Semester 2 Level 6 examinations
- Where assessment is 50% or less exam-based, a minimum of generic summative feedback in conjunction with formative feedback throughout the delivery of the module will be provided. Where assessment is more than 50% examination, individual summative feedback in conjunction with formative feedback throughout the delivery of the module will be provided
- All forms of feedback for Semester 1 examinations must be provided within 20 working days; feedback for Semester 2 examinations can include generic feedback within 20 working days with further feedback provided at the start of the next Level
- 1:1 feedback to be provided for failed students prior to any reassessment opportunity
- Students with a DAS assessment to be provided with 1:1 feedback for all examination based assessment
- For immersive modules with examination, 1:1 feedback to be provided with bespoke support sessions offered for those required to complete the in-year resit
Confirmed Programme Examination Feedback Strategy:
Selecting Examination Feedback at Modular Level
In light of the confirmed programme strategy, Module Leaders with examination based assessment can now select feedback mechanisms.
This should include both formative and summative feedback to offer a combination of methods which offer inclusive, timely and clear feed-on, feedforward and feedback advice for students in line with the Assessment Policy 2014-2020.
The table below enables the programme to map the distinct module examination feedback mechanisms adopted, which in turn can inform peer review discussion on best practice and sequencing across the programme. (This is based on a standard 3 year UG programme – please add rows if additional levels need to be incorporated.)
Level | Semester | Module Code/Title | Methods of Examination Feedback: Formative | Methods of Examination Feedback: Summative
4 | 1 | List of core and/or elective modules with exam-based assessment
4 | 2 | List of core and/or elective modules with exam-based assessment
5 | 1 | List of core and/or elective modules with exam-based assessment
5 | 2 | List of core and/or elective modules with exam-based assessment
6 | 1 | List of core and/or elective modules with exam-based assessment
6 | 2 | List of core and/or elective modules with exam-based assessment
Appendix 1
Unseen/closed book: Students have no sight of the paper’s content prior to the start of the examination.
Open book: Participants can take a supporting text into the exam hall, but have not seen the questions in advance. Open-book exams are designed to test students’ understanding of a subject and how well they can make an argument, more than their ability to memorise facts.
Seen: A 'seen' examination is one where the examination questions are released to the students before the examination date. Students then prepare their answers before writing them in a formal invigilated examination environment.
Multiple choice questions (MCQ): Multiple choice exams are often designed to test how quickly students can answer questions, as well as what they know. A twist on this would be to get students to write their own MCQs throughout the term and submit them to a question bank, in the knowledge that the MCQ exam will include a proportion of questions from the student MCQ bank mixed with MCQs written by the tutor. MCQs can easily be delivered as an online exam through ICT software.
Short answer questions: As the name suggests, this type of exam consists of a series of questions that only require concise answers, usually in the form of a definition.
Problem- or case- based scenarios: Problem-based exams can take a variety of forms, such as mathematical problems that require the use of equations, formulae, or the application of scientific theories (such as those used in the disciplines of statistics, chemistry, engineering and physics).
Case-based exams involve the presentation of hypothetical case studies that require the identification of problem/s and solutions. This can be done in writing or in an oral exam.
Practical examinations: In science disciplines, these aim to examine students’ ability to perform specific tasks, applying their knowledge of the subject to solve specific practical problems. Examples include observations, such as evaluating teacher performance in a school classroom, or the practical demonstration of skills, e.g. social work students in a mock counselling session.
Objective Structured Clinical Examinations (OSCEs): Take place in the health disciplines to assess clinical competence.
Integrated Structured Clinical Examinations (ISCEs): Used in health disciplines to assess both clinical competence and professional skills.
Computer Aided Assessment (CAA): Computer aided assessment utilises software such as Moodle, Questionmark Perception QMP (an online assessment and reporting tool) or simulated environments. CAA can include electronic marking and reduce tutor workload, although the set up time must be taken into account.
Group exams: These could use a number of the formats already listed, but in a group context: for example, a same-day take-home exam where students are given a project/problem at 9am and must hand in a group report/solutions by 4.30pm. This type of exam is authentic and can assess teamwork skills.
Individual oral exams: Oral exams test knowledge and capabilities through spoken interaction between the student and the examiners. They range from a straightforward question and answer format, to problem-based or hypothetical scenarios that may evaluate a student’s interpersonal communication, diagnostic or creative abilities. Typical formats:
- Viva voce: a panel of experts questioning a student about how they would deal with a particular patient or case, used in medicine and some other health disciplines. The viva may also be used as a verbal defence of a written research project or dissertation (commonly used in science disciplines). A viva is often paired with a substantial written or visual project which the student is invited to elaborate on or ‘defend’.
- In modern foreign language programmes students deliver a presentation or participate in a conversation
- Auditions or performances often used in creative and performing arts
Group oral exams: A group is usually comprised of three to five students. The exam might include a group presentation, a group discussion with examiners or a group audition or performance.
Note: the following should be recorded for MR purposes as coursework, since they are not formally invigilated:
Take-home - same day: A one-day take-home examination is handed out and returned on the same day, typically beginning at 8:30 a.m. and ending at 4:30 p.m. In the case of exams done in an online format, via computer labs or web browsers, the time limit may be a matter of hours from the moment of logging-in. Students need plenty of notice of the date.
Take-home – extended: An extended take-home examination allows a period of time to answer the questions, which may vary from 24 hours to a week. Extended take-home examinations are open-book and allow for full discussion among students. Students need plenty of notice of the dates to organise any personal commitments.
(DRAFT) Bibliography and Further Reading