Assessment Careers

Pilot Report

Assessment careers: Masters in Education, Health Promotion and International Development (MA EHPID)

Author: Dr Ian Warwick

October 2013

Institute of Education, London

JISC Assessment and Feedback Strand A Project

Pilot Project Aims and objectives

The five pilot projects each aim to explore the potential and practicality of a longitudinal ('assessment careers') approach to assessment, with the following objectives:

a) promote a longitudinal approach to writing feedback (tutors)

b) promote a longitudinal approach to acting on feedback (students)

c) encourage reflection and dialogue over student progress both within modules and between modules

d) develop a set of Assessment Career principles to be used to scale up and embed the longitudinal approach to feedback

e) assess the role of technology in meeting the above objectives.

The intervention made to achieve these aims is different in each pilot. However, all pilots will use the same action research methodology and evaluation questions, and will undertake the same evaluation process using the same tools, to ensure that findings are generalisable at least across the IOE.

  1. Summary

The pilot took place within two online modules of the MA in Education, Health Promotion and International Development (MA EHPID) at the IOE, and aimed to improve the feedback provided on those modules by the module leader. In particular, involvement in the pilot enabled the module leader to identify how new activities might best encourage students to revisit past feedback and consider its use to promote learning, and to explore ways of further building assessment for learning into current online module designs.

  2. What are the headline achievements of your pilot?

Participating in the study has enabled the member of staff providing feedback on the two modules to initiate conversations with students about the value of feedback – and to tailor feedback to student concerns and interests. The addition of specific comments about how best to strengthen essays was noted by the external examiner as a valuable addition to feedback to students. Commenting on coursework feedback, for example, the external examiner noted:

On [some] modules there are comments like, ‘this essay could be strengthened by...’ type comments … I think this is useful for students to take forward into other assessments.

  3. What were the key drivers for undertaking the pilot?

The aim was to improve formative and summative feedback to students – in particular, to make it explicit how students could strengthen their written work and to 'feed forward' into future academic writing. This works towards the programme aims of reflecting on research, analysis, policy and practice in relation to education and health promotion in low- and middle-income countries, and of critically reviewing arguments, advancing analysis and undertaking research on education, international development and the promotion of health and well-being.

  4. What was the educational/organisational context in which you undertook your pilot?

The two modules selected for inclusion in the study were attached to the MA Education, Health Promotion and International Development, which forms part of a cluster of MA programmes in a suite of International Development programmes offered through the Department of Humanities and Social Sciences at the Institute of Education, University of London.

The two modules are both offered online only. In 2012, the module included in the study was Developing and Promoting Health and Wellbeing (DPHW); in 2013, it was Introduction to Social Research (ISR). The modules are taught by one member of staff, who provides feedback on the formative assignment. Summative assignments are parallel marked, with the module tutor providing written feedback to students.

The modules attract students interested in international development, a number of whom study while living in low- and middle-income contexts. As such, some have limited access to technology (particularly up-to-date hardware), some have limited internet connections, and some also have limited access to electricity.

Modules were designed so that students could download readings and learning activities and work on them offline if needed.

Feedback was provided individually on formative and summative work. Group feedback was also provided, taking the form of general points drawn from across the essays, which students could use as a checklist to ensure that their work was aligned with the requirements of the essay and the assessment criteria.

For both modules – and based on teaching the modules in previous years – detailed guidance was provided on the structure and the areas of content the essay should address. This was provided to illustrate how the general IOE assessment criteria would be used for each of the modules.

  5. What was the technology context?

The modules were taught using the Virtual Learning Environment (VLE) Moodle.

As noted, some students had limited access to technology.

Draft and final assignments were submitted electronically by students. For draft essays, feedback was provided directly on essays (using MSWord comments and track changes).

Final essays were printed and two staff marked hard copies in parallel, with feedback being provided to students by the first marker on marking/feedback sheets, which were sent to students electronically.

  6. How did you approach the pilot?

Given that some students had limited access to technology, it was decided to invite students to reflect on past feedback and identify potentially helpful current feedback using a similar approach to the learning activities used on the module. A cover sheet was therefore prepared which students could download and attach to their assignment cover sheet. This invited students to identify what past feedback they had received: for formative work, this was past feedback on another module; for summative feedback (on a final module assignment), this was a summary of the formative feedback on their draft assignment for the module. Examples of the cover sheets are included as Appendix 1.

Students were asked to note what use they had made of feedback and what feedback they would value with regard to their current submission (whether to improve the current draft piece of work, or related to summative feedback).

  7. What benefits has your pilot delivered and who are the beneficiaries?

For feedback on drafts (formative feedback), 22 students were involved in Introduction to Social Research during 2012–13. For feedback on final work (summative feedback), 25 students were involved. Ten students provided qualitative feedback via email on taking part in the process, and the quotes that appear below are taken from their comments.

The main benefit for students appeared to be the dialogue about feedback which was begun using the Student Feedback Response Sheet. One student, for example, noted,

Took it [feedback] on board, summative and formative so tried to weave it into the ISR essay and am bearing in mind more general feedback e.g. be more critical for future work

With regard to the various assessment categories, another student noted that information about progress enabled them to ‘…know whether I am on the right track or not and therefore act accordingly’; and ‘Advice for future assignments help to make a good reminder so that the next assignments is an improvement from the previous one’.

Although giving praise was not intended to be a focus of this pilot project, when asked whether it was of value, two students in particular noted,

Praise gives me the satisfaction that I was able to do something that was recognised and thus the confidence and even the feeling to do more

I found this [praise] extremely helpful – often we concentrate on what is wrong with work at MA level (both our own and that of others). Always looking with a critical eye – so some time-out to reflect on the positives of the essay was a timely confidence boost.

One student, who had been away from academic study for some time, noted that detailed feedback was particularly helpful on a distance learning module:

The feedback I received was critically important in the development of my paper, and as a learning tool in general. This was my first module in the MA course, and as a student who is returning to university after many years, and coming from an American academic background, I needed the guidance help from the instructor on writing styles, and for getting a feel of where I stood with my work. I had no way of gauging from a distance whether I was on track or not. The feedback itself was detailed, and offered both positive, reinforcing feedback for the things that worked; and constructive criticism for the things that did not.

However, not all the comments received were positive. One student noted that summative feedback had 'taken quite a long time' on the social research module, and they had hoped for swifter feedback as they had intended to use it to inform the development of their dissertation. This same student, and one other, also commented on the detailed guidance for the assessment:

I went through the feedback I received for my draft essay in some detail to ensure that I'd covered all the points raised before submitting the final essay. I felt that the feedback on my own essay was useful and I was glad to be prompted to ask my tutor to respond to specific queries (e.g. is the balance/weighting of the issues ok? have I brought in enough of my key learning points?). However, I felt quite mixed about the general (whole group) feedback we received. Although this was useful, and I did use it as a checklist for my essay as suggested, I couldn't help but feel that it was a bit too prescriptive and that we were being too spoon-fed (…) which was quite at odds with the teaching, discussion and material covered throughout the module, which I had found both interesting and challenging.

The general feedback to the whole group was a summary of the key issues arising out of the module, and perhaps for this more academically able student (who was awarded a grade 'A' in the final essay) some of the basic points highlighted – while drawn from across the essays – appeared rather too rudimentary. Students more generally, however, appeared to value the detailed general guidance. This does suggest that a more nuanced approach to guidance and feedback, or clearer guidance on the nature and purpose of the group feedback, would be useful.

This raised a somewhat unanticipated outcome of the project. Although an evaluation sheet is completed after the teaching of each module at the IOE, this tends to omit feedback from students about feedback and assessment, as it is generally administered before students have received feedback on their summative assessment. By inviting feedback on these elements of their study, the module leader was able to find out more about whether, and in what ways, assessment itself was thought to be useful with regard to student learning, and about the perceived value of different types of assessment feedback. This suggests that feedback on assessment should form one element of the module evaluation.

However, even those students who were somewhat critical of some of the feedback each valued the addition of the feedback sheet to the module assessment process. When asked whether the feedback sheet helped them to focus on the value of feedback, the following responses were provided:

Yes – I liked the idea of the feedback sheet and would welcome its inclusion in all module feedback.

Yes it did. It helped me reflect on areas I needed help from the tutor.

Yes it does. Reading the feedback showed me the weakness and strengths in my essay. I believe this will help for future essays and how to phase the essay.

Yes, particularly as I'd taken a long break between this and the previous module so it actually made me go back and consider feedback from earlier assignments.

Absolutely. It was my guide.

We also compared feedback given on assignments in previous years of the module with feedback given after the intervention by the same teaching team, using a standardised feedback analysis tool. The tool was redesigned to distinguish the different purposes of feedback, drawing on simple feedback categories developed by Orsmond & Merry (2011), with an additional category of ipsative feedback.

The feedback categories were:

P1 Praise for good work

P2 Recognising progress, or ipsative feedback

C Critical feedback. This was subdivided to distinguish error correction (C1), critique of content (C2) and critique of structure and argument (C3).

A Giving advice. This was also subdivided to distinguish specific content advice for the current assignment (A1), general advice for the current assignment (A2) and advice for future assignments (A3).

Q Questions addressed to learners to request clarification of their ideas.

O Unclassified statements. Neutral comments, for example those that describe the piece of work but do not make any judgement, were unclassified.

A feedback profile for an individual assessor, a module or a programme could be compiled by looking at the balance between the categories.
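As an illustration of how such a profile can be compiled, the short Python sketch below tallies coded comments into the categories above, ranks the categories by frequency and computes the average number of comments per script. The coded comments here are hypothetical examples for illustration only; the report does not describe any software tooling for this step.

```python
from collections import Counter

# Hypothetical coded comments for three scripts. In practice each tutor
# comment is assigned one of the report's categories (P1, P2, C1-C3,
# A1-A3, Q, O) by a human coder.
coded_comments = {
    "script_1": ["A1", "A1", "C1", "P1", "Q"],
    "script_2": ["C1", "C1", "A1", "C2", "P1", "A3"],
    "script_3": ["A1", "Q", "C3", "P1"],
}

# Tally comments per category across all scripts to build the profile.
profile = Counter(code for codes in coded_comments.values() for code in codes)

# Rank categories by frequency, most common first.
ranking = [code for code, _ in profile.most_common()]

# Average number of comments per script, as reported in the tables below.
average = sum(profile.values()) / len(coded_comments)
```

The resulting `profile` and `ranking` correspond to the 'No. of comments' and 'Rank' columns used in the tables that follow.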

The results from the feedback analysis tool (Table 1, below) did not show much difference between the types of feedback provided across the two modules for summative feedback. For example, specific advice on the content of formative essays was a key element of feedback across the first and second modules.

However, for formative feedback there were several changes in the pattern of feedback. While specific advice (A1) and correction of errors (C1) remained dominant, the feedback for 2013 showed a large decrease in the use of questions. This appeared to be a response to making greater use of statements which were explicit about how the writing might best be strengthened. One change for future feedback would be to invite students themselves to consider whether addressing one or another issue in the essay might strengthen their writing.

Table 1 Health and Development Feedback on drafts

| Category | 2013: no. of comments (n=18) | 2013: rank | 2012: no. of comments (n=22) | 2012: rank |
| --- | --- | --- | --- | --- |
| P1 | 29 | 5 | 29 | 5 |
| P2 | 0 | – | 0 | – |
| C1 | 136 | 2 | 230 | 1 |
| C2 | 49 | 3 | 92 | 4 |
| C3 | 28 | 6 | 4 | – |
| A1 | 204 | 1 | 192 | 2 |
| A2 | 22 | – | 15 | 6 |
| A3 | 0 | – | 1 | – |
| Q | 35 | 4 | 160 | 3 |
| O | 14 | – | 0 | – |
| Average no. of comments per script | 28.72 | | 32.86 | |

Health and Development Feedback on final work

| Category | 2013: no. of comments (n=28) | 2013: rank | 2012: no. of comments (n=25) | 2012: rank |
| --- | --- | --- | --- | --- |
| P1 | 296 | 1 | 143 | 1 |
| P2 | 0 | – | 0 | – |
| C1 | 43 | 3 | 7 | – |
| C2 | 43 | 3 | 48 | 2 |
| C3 | 30 | 6 | 32 | 4 |
| A1 | 148 | 2 | 41 | 3 |
| A2 | 24 | 7 | 13 | – |
| A3 | 2 | – | 6 | – |
| Q | 13 | – | 3 | – |
| O | 39 | 5 | 0 | – |
| Average no. of comments per script | 22.79 | | 11.7 | |
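The per-script averages reported in the tables can be recovered directly from the category counts. A quick Python check (with the counts transcribed from the tables above) reproduces the reported figures:

```python
# Category counts transcribed from the two tables above.
drafts_2013 = {"P1": 29, "P2": 0, "C1": 136, "C2": 49, "C3": 28,
               "A1": 204, "A2": 22, "A3": 0, "Q": 35, "O": 14}   # n=18 scripts
drafts_2012 = {"P1": 29, "P2": 0, "C1": 230, "C2": 92, "C3": 4,
               "A1": 192, "A2": 15, "A3": 1, "Q": 160, "O": 0}   # n=22 scripts
final_2013 = {"P1": 296, "P2": 0, "C1": 43, "C2": 43, "C3": 30,
              "A1": 148, "A2": 24, "A3": 2, "Q": 13, "O": 39}    # n=28 scripts
final_2012 = {"P1": 143, "P2": 0, "C1": 7, "C2": 48, "C3": 32,
              "A1": 41, "A2": 13, "A3": 6, "Q": 3, "O": 0}       # n=25 scripts

def average_per_script(counts, n_scripts):
    """Average number of comments per script for a cohort."""
    return round(sum(counts.values()) / n_scripts, 2)

print(average_per_script(drafts_2013, 18))  # 28.72
print(average_per_script(drafts_2012, 22))  # 32.86
print(average_per_script(final_2013, 28))   # 22.79
print(average_per_script(final_2012, 25))   # 11.72 (reported as 11.7)
```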

For summative feedback, praise was the main type of feedback, with less in the way of advice or critique of the work itself. There may be scope to consider how the social research assignment could be better aligned with students' small-scale research projects (such as those for a report or dissertation), so that guidance could 'feed forward' into that work – although this may be easier to accomplish if the tutor for the modules were the same as that for the report or dissertation.

However, using phrasing such as '…this essay could be strengthened by…' enabled the external examiner for the programme as a whole to identify specific advice being given to students.

With regard to workload, apart from developing the feedback sheets, little if any extra time was needed for feedback. In fact, although providing feedback was not done more quickly, the module leader perceived it to be easier to focus and tailor comments to student requests.

For future modules, it is likely that consideration of feedback on past work could be built into initial module activities. For example, towards the beginning of a module, students could be invited to identify issues which have been raised by past feedback and to consider whether these could inform their learning on their current module of study and, if so, what action they might take.

It is likely that this would be a relatively straightforward set of activities to scale up for other modules. One key issue is the extent to which other module leaders would perceive value in encouraging students to consider past feedback as one element of engaging with the new topics and issues addressed through other modules. This would, however, require a reconceptualisation of the programme, at least to a degree, to enable a greater focus across modules on supporting and developing students' academic and assessment literacies.

  8. What outputs has your pilot produced?

The feedback sheets used in the study have already been shared with other members of the Assessment Careers project. These are relatively straightforward to adapt for other modules. However, it is likely that one or two activities will now need to be developed to explore how students can begin to think about assessment at the start of their learning on modules. For example, for the latest iteration of the social research module (offered during Autumn term 2013), a question about whether and how feedback has contributed to a student’s own learning has been included as part of an introductory activity with other students. The module leader will summarise the key themes outlined, present these back to students, and, towards the end of the module (when students are focusing on preparing their draft assignment), will use the themes to prompt students to consider what sort of feedback they might value.

  9. Has delivering the pilot brought about any unexpected consequences?

As noted earlier, one unexpected consequence was the apparent value in developing dialogue with students about assessment more generally – such as that provided throughout a module – as well as asking students to consider in what ways the assessment activity itself (such as an assignment) might be better aligned with their own learning across a programme.