
This is an account of our approach to online exams, blended with recommendations for others.

Choose the format. Please agree on the general format of your assessment within your programme. Essays, especially when run through Turnitin, are widely regarded as suitable for open-book and online exams. They are, however, not very practical for larger student cohorts. I will come back to essays at the end of this text.

Multiple-choice questions (MCQs) are very attractive for assessing larger cohorts because they allow automated marking. Please do consider using newer question formats. We have had good experiences with hot-spot questions (available in New Quizzes), where the student has to click on a particular part of a graphic to answer the question. Other formats, such as fill-in-the-gap questions or very-short-answer questions, are equally promising routes to better online exams. They offer the advantages of short-answer questions while promising to be easier to mark.

Please check with your local HEFi team that the question formats you select are supported for post-exam analysis.

Work on your questions. Whether we actually assess knowledge depends strongly on the quality of the multiple-choice questions themselves. Do consider longer introductions to your questions to move them into an applied context. Why not display an image or a graphic that needs interpretation before the student gets to the answer options? Providing a high-resolution version of that graphic and a generic alternative text is good practice to meet the access needs of some students (e.g. those with visual impairments).

All this makes googling the answers harder (Google-proof), as copy-and-pasting the question into a search engine is no longer possible, or at least becomes more time-consuming. It also aims to assess higher levels of Bloom’s taxonomy of learning and hence makes the questions better in general … and, yes, you can refurbish your old question bank and improve it with this approach.

Use available Canvas quiz options. By choosing to shuffle questions, you certainly make it harder for students to “collaborate” during the exam. If your questions follow a logical order, such as the sequence of the individual learning activities of your module, you might want to refrain from this option. Regardless, I would always recommend shuffling answers. It makes the questions easier to write in the first place, and you are less likely to give the students any cues about the right answer.

Displaying one question at a time and locking each question once it is answered can be a powerful combination of options within Canvas. It prevents students from browsing through the exam at the end, making it harder to copy content or to answer requests from other students. Please tell the students about this beforehand, however, as this option increases stress for some of them.
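If you script your module set-up, these options can also be applied programmatically. Below is a minimal sketch against the classic Canvas Quizzes REST API (New Quizzes is configured differently); the base URL, IDs and token are placeholders, while the parameter names follow Canvas's documented quiz-update endpoint.

```python
import requests

# Placeholders: substitute your own Canvas instance, IDs and API token.
BASE = "https://canvas.example.edu/api/v1"
COURSE_ID, QUIZ_ID = 1234, 5678
HEADERS = {"Authorization": "Bearer <YOUR_API_TOKEN>"}

# Settings matching the advice above: shuffle the answer options,
# show one question at a time, and lock each question once answered
# ("cant_go_back" is only valid together with "one_question_at_a_time").
settings = {
    "quiz": {
        "shuffle_answers": True,
        "one_question_at_a_time": True,
        "cant_go_back": True,
    }
}

resp = requests.put(
    f"{BASE}/courses/{COURSE_ID}/quizzes/{QUIZ_ID}",
    headers=HEADERS,
    json=settings,
    timeout=30,
)
resp.raise_for_status()
```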

Time is a big factor. From the point of view of assessment security (peer-proofing), one might aim to give as little time as possible, squeezed into as short a flexible time window as possible. As an international university, however, we need to give our students enough time to sit the assessment comfortably, even in different time zones. We also do not want to add to their stress.
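Timing can be set on the same endpoint: a hard time limit for the attempt itself, placed inside a generous availability window so that students in any time zone can pick a comfortable slot. Continuing the sketch above, with example dates only:

```python
# A 60-minute attempt inside a 24-hour availability window (example dates).
timing = {
    "quiz": {
        "time_limit": 60,                     # minutes allowed per attempt
        "unlock_at": "2024-05-20T00:00:00Z",  # window opens
        "lock_at": "2024-05-21T00:00:00Z",    # window closes
        "due_at": "2024-05-21T00:00:00Z",
    }
}
requests.put(
    f"{BASE}/courses/{COURSE_ID}/quizzes/{QUIZ_ID}",
    headers=HEADERS, json=timing, timeout=30,
).raise_for_status()
```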

To be tech-proof, please set up your exam properly and build in quality checks by others. People from HEFi are really helpful here. During the assessment, please be approachable for your students: in case of technical failure or other emergencies, they may be highly stressed and require an immediate response via email. Please also have back-up strategies in place for complete tech failures, such as offering a re-sit of some sort. It is here that essays might come into play: with the smaller number of students taking a re-sit, a timed essay becomes more feasible.

Please treat each of these points as a recommendation only. From this toolbox, you may pick whatever fits the needs of your specific assessment and/or the requirements of your external accrediting body. Whatever you choose, do involve your students in a discussion about the format and give them training in that kind of assessment. This is to make sure that we are genuinely testing their knowledge rather than their ability to cope with style, input format or other technicalities.

After the exam, please consider comparing your outcomes to previous assessments, or even to SAQ- or essay-style assessments from the same year. Finally, Canvas offers great analysis tools to measure the discriminating power of individual questions. Taken together, these recommendations may help you to create better online exams, and maybe even better exams in general.
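To give a flavour of what those analysis tools measure: a common index of discriminating power is the corrected item-total (point-biserial) correlation, i.e. how strongly success on a question correlates with performance on the rest of the paper. Canvas reports such statistics for you; the sketch below, on made-up responses, merely illustrates the calculation.

```python
import numpy as np

# Hypothetical response matrix: rows = students, columns = questions,
# 1 = correct, 0 = incorrect.
responses = np.array([
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 1, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 0, 1, 0],
])

totals = responses.sum(axis=1)
for item in range(responses.shape[1]):
    rest = totals - responses[:, item]      # total score excluding this item
    r = np.corrcoef(responses[:, item], rest)[0, 1]
    p = responses[:, item].mean()           # difficulty: proportion correct
    print(f"Q{item + 1}: difficulty={p:.2f}, discrimination={r:.2f}")
```

Questions with a discrimination close to zero (or negative) deserve a second look before they are reused.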

If you want to discuss these points further, please feel free to contact me (j.w.mueller@bham.ac.uk) or my colleague Dr Sarah Pontefract (s.k.pontefract@bham.ac.uk).

A solid review of best-practice MCQ writing from pre-COVID times:
Rudolph et al., Best Practices Related to Examination Item Construction and Post-hoc Review. Am J Pharm Educ. 2019;83(7):7204. doi: 10.5688/ajpe7204. PMID: 31619832.

A very useful guide to better electronic exams from George Washington University: https://smhs.gwu.edu/impact/sites/impact/files/Firmani_OCEPs.pdf

Another collection of tips for better online exams:
Smith Budhai, Fourteen Simple Strategies to Reduce Cheating on Online Examinations, 2020.