After going through the first seven phases of the assessment cycle, you as a lecturer (or team of lecturers) evaluate the entire assessment cycle: what went well, and what could have been done better? The evaluation provides input for the (re)design of the assessment. Based on it, you can for example decide to adjust some assessment questions, take another look at the overall level of the assessment, or replace it with a different assessment format altogether. The questions below can guide the evaluation of each phase.
Phase 1 Determining the learning objectives and the function of the assessment
- Are the learning objectives still up-to-date?
- What was the function of the assessment, and do you still support it?
Phase 2 Designing an assessment matrix
- Are the learning objectives and assessment methods still aligned with each other?
Phase 3 Constructing an assessment
- Do the items or assignments fit the learning objectives?
- Do students’ answers actually give an impression of how well they master the learning objective in question?
- Was the assessment as a whole suitable for appraising the intended knowledge and skills of students?
- How was the work distributed among the creators of the assessment? Did it go according to plan? Was there enough time for construction, or did they experience time constraints?
Phase 4 Taking an assessment
- Were students well-prepared for the assessment(s)? Did they know what was expected of them well ahead of time?
- Did students have enough time while taking the assessment? How quickly were they finished?
- Was the instruction clear?
- Were there problems while taking the assessment?
- What feedback did the invigilator or E-support employee give?
Phase 5 Appraising assessment results
- Were the answer models and scoring rules clear? Could assessors work well with the assessment criteria/rubric?
- Were you satisfied with the way of grading? Was grading finished on time? If not, what was the cause?
Phase 6 Analysing an assessment
- Was an assessment and item analysis performed?
- Is the analysis in Cirrus sufficient or is more information needed (for example via Evaluation Service)?
- Do the lecturers have sufficient knowledge to use the analyses, or could an educational advisor be consulted to help?
- Can you rule out that wrong answers were due to the way the question was phrased?
- Which questions functioned well? Can they be used as inspiration for new questions?
- Which questions did not function well? Do they need to be taken out completely or can they be adjusted?
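Where scores can be exported from Cirrus (or any gradebook) as a 0/1 matrix, a basic item analysis can also be run outside the tool. The sketch below is a minimal, hypothetical example (the `item_analysis` helper and the score-matrix format are assumptions, not part of any official workflow): it computes each item's difficulty (p-value, the proportion of correct answers) and a simple discrimination index (the item-rest correlation).

```python
# Minimal item-analysis sketch (assumed workflow, not an official Cirrus export).
# Input: one list of 0/1 item scores per student (1 = correct, 0 = incorrect).
from statistics import mean, stdev

def item_analysis(scores):
    """Return per-item difficulty (p) and item-rest correlation (rit)."""
    n_items = len(scores[0])
    results = []
    for i in range(n_items):
        item = [row[i] for row in scores]                # this item's scores
        rest = [sum(row) - row[i] for row in scores]     # total minus this item
        p = mean(item)                                   # difficulty: proportion correct
        if stdev(item) == 0 or stdev(rest) == 0:
            r = 0.0                                      # no variance: correlation undefined
        else:
            mi, mr = mean(item), mean(rest)
            cov = sum((a - mi) * (b - mr) for a, b in zip(item, rest)) / (len(item) - 1)
            r = cov / (stdev(item) * stdev(rest))        # item-rest correlation
        results.append({"item": i, "p": round(p, 2), "rit": round(r, 2)})
    return results
```

Items with a very high or very low p-value, or a discrimination near zero or negative, are candidates for the "did not function well" list above.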
Phase 7 Inspection and feedback
- How did the inspection go?
- What information emerged during the inspection?
- How did students evaluate the assessment?
- What was done with the evaluation data? (Plan Do Check Act cycle?)
- Which form of feedback was given to the students?
- How did they react to this?