Practising putting 110,000 examinations online
Two weeks ago I described how Covid-19 prompted us to set up an Assessment Transformation project to deliver alternatives to 110,000 examinations for University of London students in just three months. Now, in early June 2020, the Transformation is well under way, with practice runs to test the new systems.
In summary, the challenge is to move our 110,000 paper-based examinations online over the summer. That is 600 examination papers, delivered to approximately 35,000 students in 20+ time zones.
We have conducted a background review that explores approaches to planning and undertaking assessment when conventional examinations are not possible; considers how close we can get to conventional examinations under current conditions; and weighs some possible adaptations. It deals with some of the main, often inter-related, features of conventional University examinations under the current changed circumstances:
- Seen or unseen examinations
- Distribution / release and receipt of examination papers
- Open or closed book examinations when students are taking the examination alone and with Internet access
- Typed or handwritten answers, and issues raised in each case
- Timing of examinations under changed conditions, including time to upload scripts
- Candidate identification
- Issues in student language
And it offers thoughts and suggestions on:
- Fairness and student and staff concerns under changed examination conditions
- Implications for and working with policy
- Doing without examinations altogether
- Analysing conventional examinations in changed times
- Using technology: opportunities and difficulties
- Ethical behaviour under changed examination conditions
- Revisiting and extending the idea of ‘reasonable accommodations’
- Commercial examination, invigilation / proctoring and other services.
Senior managers are working closely with Programme Directors, Chairs of Boards of Examiners, External Examiners, the Student Voice Group, and our student support, comms, technical, QA and development teams. The assessment team, with up to 40 members, has moved from meeting daily to twice per week. In addition, many small groups are meeting to take forward particular operational aspects. We are opting for three approaches, depending on circumstances:
- Online, time-limited, proctored (invigilated) examinations where this is essential
- Online, time-limited, unseen, closed book examinations
- Online, longer-duration (several days to one week) assessments of an open book nature
How to stop cheating
Like all institutions across England, we looked to our regulator, the OfS, for guidance and requirements in relation to assessment in the time of COVID-19. We welcomed and adopted the No Detriment policy. Regular meetings of the QA teams allowed the required new assessment regulations to be developed, reviewed by stakeholder groups and passed through our Academic Committee. Where we would previously have regulations covering what equipment a person can take into a physical exam room, we had to be explicit about what behaviour we expected of those taking an exam on their own device in their own home.
Also, like other universities, we use live invigilators or proctors to monitor students during conventional exams. Moving online we have explored three proctoring options:
- Live proctoring, which involves a person viewing exam candidates through the candidate’s webcam;
- AI video proctoring, where changes in the camera image that might indicate a student is leaving the room, or another is joining them, are picked up by the software; or
- Video proctoring, which is just a recording of the candidate via their webcam for later viewing and checking.
In addition to un-invigilated fixed-time and longer open book assessments, we have selected options 2 and 3, depending on the requirements of the specific programme.
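The idea behind AI video proctoring, as described above, can be illustrated with a toy sketch: flag a frame for human review when it differs sharply from the previous one, which might indicate a candidate leaving the shot or another person entering. This is a hypothetical illustration only (the threshold, function name and flat-list frame format are assumptions, not the vendor's actual method, which will be far more sophisticated):

```python
THRESHOLD = 25.0  # hypothetical trigger level for a "sharp change"

def flag_frame_change(prev_frame, frame):
    """Toy illustration of AI video proctoring: flag a frame for human
    review when its pixels differ sharply from the previous frame
    (e.g. the candidate leaving the shot, or someone else entering).
    Frames are flat lists of greyscale pixel intensities (0-255)."""
    diff = sum(abs(a - b) for a, b in zip(prev_frame, frame)) / len(frame)
    return diff > THRESHOLD

# An unchanged scene is not flagged; a large change is.
print(flag_frame_change([0] * 16, [0] * 16))    # False
print(flag_frame_change([0] * 16, [200] * 16))  # True
```

Real systems work on motion, face detection and audio rather than raw pixel differences, but the principle is the same: the software surfaces moments of change so that a human only reviews a fraction of the footage.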
Finding a commercial partner
To meet professional body requirements, about 25,000 examinations require proctoring (invigilation). This large number and the short lead time (3 months) have nudged us in the direction of finding a partner with established systems capable of managing data protection issues, procurement, stability and scalability, as well as interfacing to UoL systems and data management, at an acceptable cost. With support from UoL's own Digital Services provider CoSector, we have teamed up with existing partner Janison to help us deliver proctoring.
Making sure it works
To help prepare students, and to test our new online assessment systems, we have been running a variety of practice tests with our students. In some cases we have issued prompt questions such as ‘tell us about your day’, and asked them to submit a short response. Tasks like this make it plain that we are testing the system, not the students! For many of our level four students, we have provided a low-stakes short assessment paper for one of their core modules, giving them both the opportunity to familiarise themselves with the system and to achieve an academic outcome in a low risk, no-detriment environment.
Most students engaged with the practice tests, for which we are grateful. The great majority of those who took part in the practice tests were able to access the practice test papers, or test area, and to upload their responses.
We have learned a lot through this process. There have been particular in-country challenges related to access and broadband. This trial has given us the opportunity to fine-tune, and to provide more detailed support to the students, including country-specific follow-on practice tests.
Students communicated with us very fully, through our student enquiry portal and via email. (A backup email address was provided so that issues could be logged.) In our practice tests the use of email contact varied widely. In many cases students used email to give us feedback about the process. Sometimes students told us they thought they had uploaded their papers successfully, but just wanted to send a copy by email to make sure. In other cases, students turned to the enquiry system, email or phone line for help with the ID codes they needed to enter.
Achieving consistent terminology for codes across UoL and third-party providers has been an issue: log-in codes to access the third-party proctoring platform securely; codes labelling the student's submitted work, which must be anonymous yet linked to their candidate number; and codes matching the correct one of our 2,000-plus examiners to the right submitted script for marking.
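The anonymous-yet-linked requirement for submission labels can be sketched with a keyed hash: markers see only an opaque code, while the exams office, which holds the key, can regenerate the same code from the candidate number to re-link marks afterwards. This is a minimal sketch of one way to meet that requirement, not UoL's actual scheme; the key, function name and code format are all assumptions:

```python
import hashlib
import hmac

SECRET_KEY = b"registry-only-secret"  # hypothetical key, held by the exams office only

def submission_code(candidate_number: str, exam_code: str) -> str:
    """Derive a stable, anonymous label for a submitted script.

    Markers see only this code; the exams office can regenerate it
    from the candidate number and exam code to re-link marks after grading.
    """
    msg = f"{candidate_number}:{exam_code}".encode()
    return hmac.new(SECRET_KEY, msg, hashlib.sha256).hexdigest()[:10].upper()

# Deterministic: the same candidate and exam always yield the same label,
# so scripts, marks and candidates re-link reliably at the end.
print(submission_code("UOL123456", "EC1002"))
```

The design choice worth noting is determinism: because the code is derived rather than randomly assigned, no separate lookup table needs to be distributed, which reduces the chance of mismatches between platforms.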
With thousands of students undertaking practice exams on very many different programmes and in our different platforms, the demands on our staff have been great. Colleagues have been available on our student enquiry system from very early in the morning until very late at night to cover the time zones of our students.
From these experiences across the different assessment platforms and assessment formats, we have been able to develop very detailed sets of FAQs tailored to assessment formats and levels of proctoring and to revise the joining instructions provided for students for the real assessment events.
The real assessments have started now for some programmes. Staggered start and end times for exams, and making sure that several large exams are not taking place at the same time, should ensure that our VLEs (we use more than one) can cope with the uploads. We are still developing our late submission policy to assure quality and ensure fairness within a no-detriment system. For example, if the submission window has closed on the VLE before a student has managed to submit their response, and they submit their work via email, we need to make sure that the time stamp on the email is reasonable and that their work is correctly allocated to their exam code. And we are still monitoring everything, of course.
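The "reasonable time stamp" check for an emailed fallback submission could be sketched as follows: parse the email's Date header and accept it if the script was sent no later than the deadline plus a short grace window. The 30-minute grace period and function name are assumptions for illustration; the actual policy is, as noted, still being developed:

```python
from datetime import datetime, timedelta, timezone
from email.utils import parsedate_to_datetime

GRACE = timedelta(minutes=30)  # assumed allowance; actual policy under development

def accept_email_fallback(email_date_header: str, deadline: datetime) -> bool:
    """Decide whether an emailed script's time stamp is 'reasonable':
    sent no later than the deadline plus a short grace window."""
    sent = parsedate_to_datetime(email_date_header)
    return sent <= deadline + GRACE

deadline = datetime(2020, 6, 15, 12, 0, tzinfo=timezone.utc)
print(accept_email_fallback("Mon, 15 Jun 2020 12:10:00 +0000", deadline))  # True: within grace
print(accept_email_fallback("Mon, 15 Jun 2020 14:00:00 +0000", deadline))  # False: too late
```

Parsing the RFC 2822 Date header rather than the time of receipt has one useful property for students in 20+ time zones: the comparison is timezone-aware, so a script sent at the local equivalent of the deadline is judged consistently wherever it came from.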
Next I shall be reporting on the ways in which we are evaluating this rapid move to online assessment and describing some of the challenges of moving forward from digital assessment to digital marking.