Assessment and feedback practice in Higher Education has long been a major source of student dissatisfaction (National Student Survey). While technologies are increasingly being used to improve the learning experience for students and staff, their use in improving the assessment experience is still patchy. Indeed, the JISC call for proposals for the assessment and feedback programme suggested that ‘many institutions have yet to put in place well-embedded, supported and sustainable technology-enhanced assessment and feedback practices.’
Background of the Project
The University of Glamorgan has long identified in its strategic vision the key role of assessment as the crucial element in improving the student learning experience. Technology has played a significant part in developing and implementing a revised approach to curriculum design at the University by placing assessment at the heart of learning. Two major assessment and feedback innovations, Assessment Diaries and GradeMark, were introduced as part of the institution-wide Change Academy Project, “Putting assessment at the heart of learning”, supported by the Higher Education Academy and the Leadership Foundation in 2009-2010. The two innovations have since been gradually introduced and implemented across the institution. As is the case with many innovations in Technology Enhanced Learning (TEL), they were interpreted and adopted at different levels across the University.
As part of the wider JISC assessment and feedback programme aimed at improving the assessment experience via the use of technologies, the project aims to evaluate both the Assessment Diaries and the use of GradeMark. These tools were designed to address the issues of assessment bunching and poor feedback to students.
Assessment bunching is a term used to describe what happens when several assessment deadlines fall on the same date. For example, a student studying six modules with varying assessment tasks finds that four of the modules have submission deadlines on the same date. The issues associated with assessment bunching are commonly identified as less time being spent on each of the assessments, thus impacting on the quality of the submissions, and lower attendance in lectures and seminars whilst students are concentrating on the multiple assessments to be submitted. It is self-evident that poor-quality submissions will affect individual student performance but they may also lead to academic staff being obliged to explain poor performance in a module during their annual review. Similarly, poor attendance can have a detrimental impact on the whole class as it limits the opportunities for collaborative learning and may result in a more tutor-centred learning experience.
The quality of feedback on assessments is an integral part of the learning process and can enable students to develop their approach to assessments as well as enhance their knowledge acquisition, communication and critical thinking skills. Feedback that comprises only ticks and a grade allows students to understand they did something correctly in that piece of work, but does not engage them in a dialogue about what was right, or how they could transfer that approach to future work.
Both assessment bunching and poor feedback have been frequently highlighted in the literature as major challenges to improving student and staff assessment and learning experiences. For example, the NUS (National Union of Students), Bloxham and Boyd, and the REAP (Re-engineering Assessment Practices) Project have all stressed the need for time on task and for speedy and detailed feedback. The University of Glamorgan project team believes that the key to resolving these two issues lies in using technologies to support and enhance dialogues not only between students and staff, but also amongst staff.
The Assessment Diary is essentially a simple list of module codes and titles, dates for assessment submission and return of feedback. This information is posted on the institution’s VLE, Blackboard. The diary uses an in-house Web-based front end which is provided within Blackboard. The assessment diaries are populated with assessment data taken from the centralised database of validated modules. The type of assessment is automatically populated and cannot be changed. This provides an additional quality assurance mechanism in that if tutors wish to change the type of assessment – say from essay to a time-constrained test – they will be required to process that change through the University’s quality assurance procedures. Module tutors are able to view their own modules and a course overview that enables them to see when other deadlines have been set. The tutors then follow a series of simple steps to create new deadlines and also input the date on which feedback to students will be provided.
Students can view the assessment diaries in three formats – by module, by course or by calendar view – enabling them to see the detail of each module’s assessment deadlines and feedback due dates (in module view), all their assessment deadlines and feedback dates (in course view), or an electronic calendar allowing them to plan their time (in calendar view). Tutors can view all the assessment and feedback dates for all assessments in a simple list. A series of automated reminders to students is generated two weeks before submission, one week before, and on the day of the submission deadline. Tutors receive automated reminders requesting them to populate the assessment diaries if assessment and feedback dates have not been provided.
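The reminder schedule described above (two weeks before, one week before, and on the day of the deadline) can be sketched in a few lines. This is a hypothetical illustration of the scheduling logic only, not the University's actual implementation; the function names and data shapes are assumptions.

```python
from datetime import date, timedelta

# Days before the deadline at which a reminder is generated,
# matching the schedule described above: two weeks, one week, day of.
REMINDER_OFFSETS = [14, 7, 0]

def reminder_dates(deadline: date) -> list[date]:
    """Return the dates on which student reminders would be sent for a deadline."""
    return [deadline - timedelta(days=offset) for offset in REMINDER_OFFSETS]

def reminders_due_today(deadlines: list[date], today: date) -> list[date]:
    """Pick out the submission deadlines that trigger a reminder today."""
    return [d for d in deadlines if today in reminder_dates(d)]
```

For a deadline of 18 May, for example, reminders would fall on 4 May, 11 May and 18 May; a daily job could call `reminders_due_today` to decide which students to notify.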
GradeMark is an online marking tool that is part of the Turnitin plagiarism software and it provides tutors with a facility to annotate comments directly onto the student’s work. Tutors can also set up comment banks of commonly used phrases and record up to three minutes of oral feedback. Students are able to access their marked work via the University’s VLE. Having marked work available online has significantly reduced the number of uncollected scripts. Moreover, tutors are also able to see when students have read their feedback - meaning that it is possible to identify the extent to which students are engaging with feedback.
The project team view GradeMark as more than a tool to improve the quality of feedback. Rather, it is seen as a tool that could significantly improve student and staff experience of assessment by better managing the feedback process, and offering an opportunity to maintain a meaningful dialogue with students.
An online questionnaire was used to gather baseline information on the student experience of Assessment Diaries and GradeMark. Over 220 students completed the questionnaire. This was followed by five focus groups with students to explore key themes identified from the questionnaire. Interviews were used to capture the views and experiences of academic, technical and administrative staff on the two innovations; in particular, the interviews focused not only on pedagogical impact but also on workload impact. The evaluation of the two innovations, focusing on tackling these two issues, should be of great benefit to the wider sector in improving the assessment and feedback experience for both staff and students.
The team have explored the use of Assessment Diaries and GradeMark with both users and non-users. Some findings to date are presented below.
Major Findings to Date
Pedagogical Impact of Assessment Diaries
A common theme emerging from the analysis of interviews so far was the expected comment that use of the Assessment Diaries helps to prevent bunching of assessment deadlines for students. Some staff expected the diaries to enable students to make ‘submissions that were better planned and executed.’ The course overview of the Assessment Diaries enabled staff to see the balance between types of assessment methods and ensure appropriate variety of approaches.
One interviewee in particular felt the diaries made a significant contribution to quality assurance in highlighting where assessment details on validated module specifications did not match actual practice. In this particular example the interviewee had responsibility for an entire course of study and was able to see where there were mismatches between the type of assessment noted on the validated module documents compared to the type of assessment the tutors were seeking to enter in the assessment diary. This led to a discussion among all module tutors to ensure that assessment practice matched the validated course specifications. Interestingly it also led to some agreed changes in the validated specification and a better balance of assessments in terms of the mix of presentations, essays, reports and examinations.
Pedagogical Impact of GradeMark
A number of interviewees felt that GradeMark enabled them to improve their feedback to students. For example, one interviewee felt able to provide more detailed feedback, and another commented that when students queried their grade he was able to go back, review all his comments and see how the grade had been awarded. Other staff felt that GradeMark simply enhanced the assessment and feedback processes they had always used. Some staff felt that the comment bank could usefully build up over time for matters such as referencing and directing students to further sources of support. However, they were cautious about over-using the same comments because they wanted students to feel they had received personal feedback.
Workload Impact of Assessment Diaries
Initially staff thought that where large groups were involved, it was going to be difficult meeting the handback dates. However, experience showed that ‘as long as you communicate to students and explain why it might be a bit late, students are fine’. The students commented that they just need to know when the feedback is due and at the same time, staff felt that ‘the diary opens a line of communication.’ Some staff felt that publishing the handback dates was a good way to ‘stop emails from students 3 days after the submission deadline asking for feedback.’ However, an unexpected outcome was the number of staff who referred to the Assessment Diaries as a mechanism to manage their own marking deadlines to ‘avoid personal bunching’ and manage their marking loads. One interviewee mentioned that as staff know when they plan to set assessment deadlines, the Assessment Diaries present no extra work. However, as the co-ordinator she felt that the ‘workload comes from chasing staff to give me the dates.’
Workload Impact of GradeMark
Interview respondents were divided on whether GradeMark reduced or increased marking time. Several mentioned the elimination of paper handling as a positive aspect of online marking, noting that the process was quite intuitive and offered them flexibility in terms of location for marking, i.e. home or work. Interestingly, one interviewee felt that marking on GradeMark took longer ‘but it was easier than marking on paper and there were no worries about losing scripts - no trudging around with piles of scripts.’ However, some respondents took the opposite view, finding it much more difficult to mark on screen than on paper. Of those who found it easier, the use of the comment bank was often mentioned as a benefit in terms of marking speed. One respondent no longer asks students to submit a hard copy of their essays alongside their electronic submission (the previous convention) on any module he teaches - ‘better for them and better for me.’
Student Experiences of the Tools
The most frequent comment to emerge from the questionnaire was that students found the diary enabled them to plan their time better. However, this was not always the consensus view expressed in the student focus groups, where some students used the diaries to refer to dates and information but did not mention using them to plan their time more effectively. This is interesting when compared with the staff expectation that students would use the diary to improve their time and task management of assessment.
It is worrying to note that students stated that the Assessment Diaries are often incomplete. This occurred in various ways, including inaccurate assessment information and missing deadlines and/or feedback dates. This comment from the student focus groups expresses the point well:
Whether students use the diaries depends on the module that we are doing, if we are confident in the lecturers and we have been discussing when the assessment is due, rather than just checking on the diaries. Depends on the modules.
From the questionnaire, only two students reported both that the information was incomplete and that the diary made no difference to their time management, suggesting that even where information was incomplete, the tool remained effective in aiding most students’ time management.
Overall, judging from the students’ experience of using GradeMark, the picture remains somewhat unclear – with almost the same numbers of students saying they were more engaged with their feedback online as those who found it more difficult to engage with feedback provided online. When asked what they saw as the main disadvantage of electronic feedback, 38% of students said that they were limited to online access, 33% reported reading feedback on-screen as the main disadvantage, with a large proportion of the sample (29%) saying they saw no disadvantages in using online feedback.
The main benefits students saw with using online feedback were that it was easily stored and re-accessed (64%) and that they were spared illegible handwriting on the part of tutors (60%), while 52% stated the main advantage was easy access, which somewhat counterbalances the 38% who saw online access as the main disadvantage.
The key advantages were related to the technical/online/system/process aspects of GradeMark whereas students’ suggestions as to improvements derived more from a pedagogical perspective and related to the quality of feedback and their engagement with that feedback.
Observations on Potential Future Developments
One major future development for Assessment Diaries is to provide an overview of all assessment dates for all staff to view which would be similar to the student view currently provided. At present, staff can see the diary for their own module but only certain staff roles such as Award Tutors or Programme Leaders have an overview of the entire course. A further need is to provide different submission dates for distinct tutorial groups of students who are registered on the same module.
For GradeMark users, it has become apparent that some staff are willing to forgive the limitations of the technology – even when GradeMark freezes in IE browsers. This tolerance stems from the fact that they like the underpinning pedagogy and processes involved in online marking. Most have been willing to share their opinions about how the tools could be enhanced. The most significant improvement for these colleagues would be to link GradeMark with the University of Glamorgan student record system (Quercus Plus) to remove the current duplication of the mark entry process.
For non-users of GradeMark, there was some way to go in terms of engaging with the practice of online marking. It was apparent that the non-users were committed to their existing methods of providing feedback, and that these were often very successful. It was interesting for the project team to reflect on the reasons tutors gave for not engaging with online marking as it became clear that those in the technical disciplines felt it would work well in the social sciences, and vice versa. The project team formed the view that if tutors did not wish to use online marking, they would suggest a discipline-based reason why they should not – even if other colleagues in their discipline had used it successfully. Clearly further dissemination of practice by staff who have found online marking to be effective may inform colleagues’ future decisions about marking online.
One theme the project team did not expect to emerge is how some staff are using the Assessment Diaries and GradeMark to support student retention. Innovative examples include tracking student progression through engagement with Assessment Diaries and facilitating personal tutor/student conversations. One interviewee noted that the diaries had been:
[u]seful as a prompt for Personal Tutors who can initiate conversations saying 'how did the assessment go on Friday?' etc.
The interviewee was a Personal Tutor and had accessed student assessment diaries with her tutees to engage them in a dialogue about their learning from feedback in different modules on the same course. She felt that this approach promoted a much more meaningful dialogue with her tutees than the generic ‘how are things going?’ approach she had used previously.
The project team has gathered insights into the barriers preventing staff from engaging with the tools and have begun to see a clearer picture of activities that the University can put into place to support some of these staff as they begin to use these technologies.
The project interviews were stimulating and highlighted some fresh perspectives concerning the tools in relation to pedagogy and workload. Overall, it is fair to say that whilst staff may share divergent views about the tools, they are agreed on the principles behind these developments. The project team are confident that the project will produce valuable outcomes for the institution and the sector.
- The National Student Survey (NSS) http://www.thestudentsurvey.com/
- Higher Education Academy Change Academy Project: Putting assessment at the heart of learning http://www.jisc.ac.uk/media/documents/programmes/elearning/digiassess_puttingassess.pdf
- JISC Assessment and Feedback Programme http://www.jisc.ac.uk/whatwedo/programmes/elearning/assessmentandfeedback.aspx
- The NUS assessment and feedback campaign http://www.nus.org.uk/en/campaigns/higher-education/assessment-feedback-/
- Bloxham, S. & Boyd, P. (2007) Developing Effective Assessment in Higher Education: A Practical Guide. Maidenhead: Open University Press/McGraw-Hill
- The Re-engineering Assessment Practices in Higher Education (REAP) Project http://www.reap.ac.uk/
Karen is currently the Deputy Head of the Centre for Excellence in Learning and Teaching at the University of Glamorgan. She was involved in the first phase of the Change Academy Project and became the project manager for the second phase of the Change Academy project 'Putting Assessment at the Heart of Learning'. She is also leading the Higher Education Academy First Year Experience Group which shares good practice on themed first-year student experience initiatives throughout institutions in Wales.
Alice is currently working as a research fellow in the Centre for Excellence in Learning and Teaching at the University of Glamorgan. She is responsible for research on assessment at Glamorgan and supports the implementation of the assessment policy. She seeks to make assessment a support for the student learning experience rather than simply a measure of it. Alice was heavily involved in the research for the Change Academy projects. She is currently working on a part-time PhD entitled ‘Assessment for Learning in Higher Education’.
This article has been published under Creative Commons Attribution 3.0 Unported (CC BY 3.0) licence. Please note this CC BY licence applies to textual content of this article, and that some images or other non-textual elements may be covered by special copyright arrangements. For guidance on citing this article (giving attribution as required by the CC BY licence), please see below our recommendation of 'How to cite this article'.