Centre for Online and Distance Education

Putting 110,000 examinations online – how are we doing?

Written by
Dr Linda Amrane-Cooper, Director of Strategic Projects, Head of the Centre for Distance Education, Programme Director PG Cert in Learning and Teaching in Higher Education, University of London

The Covid-19 pandemic made it impossible for University of London students to take their summer 2020 examinations in local examination centres. This required a large-scale and rapid response from the University, entailing the provision of over 100,000 exam opportunities online to some 35,000 students. Previous blogs explain the background to this assessment transformation. Here we report on how we are evaluating the experience for students, staff and our systems, and reveal some early insights.

93% take-up

Exams finished in the first week of August. We are still processing the final data, but at this point it seems that some 93% of our students participated in online assessment with us. On average, this is a higher rate of exam engagement than in our traditional model of unseen exams taken at our 600 examination centres across the world. So we are delighted to have provided students with the opportunity to demonstrate their learning despite the global pandemic.

Measuring up

The Head of the Centre for Distance Education, Linda Amrane-Cooper, together with CODE Fellows Professors Alan Tait and Stylianos Hatzipanagos, is working with University teams to lead an extensive evaluation of the online assessment experience. We are looking at four key areas.
 

Four evaluation aspects

The data are being considered in a number of ways. For example, given that we have students in over 20 time zones, analysis by student location will help us to understand how local Wi-Fi availability and bandwidth affected access and outcomes.

We used three types of online examination: proctored (invigilated) exams, fixed-time unseen closed-book exams, and unseen open-book exams with a longer response time (24 hours or several days). We will thus be able to consider the effects of differences in examination format.
Gender, age, programme of study and declared disability requiring special examination arrangements are further parameters we will explore. 
Student behaviour data are readily accessible via our virtual learning environments, where students accessed their exam papers. As well as noting that take-up of exams has been strong, we have identified some more detailed initial findings.

For example, we operate zonal exam papers for our larger programmes, to accommodate the need for security of exam papers across global time zones. We use different examination papers in different time zones, although we ensure that these different papers assess the same learning outcomes. Initial analysis indicates that, with open-book exams, students still predominantly accessed their paper as soon as their zonal exam became live in their location. Some students then submitted their paper within a few hours of receiving it, rather than taking the extra time available to reflect on their work. We suspect, but cannot confirm, that this shows students adopting what we may call the exam habit: working under (in this case self-imposed) time pressure, a habit learned over many years of conventional examinations. This observation has implications for how we help students prepare for future assessment, supporting them to develop strategies for open-book assessment and, more broadly, for examination under these new conditions.

As the examiners complete their marking online, rather than on paper exam scripts couriered to them, we have the opportunity to explore the experience of academic staff in this new normal. Exam boards are meeting in the next weeks to confirm marks and awards. We can then start our detailed analysis of student outcomes, comparing average marks and pass rates with previous years’ performances. We suspect that exam format may be an important factor in the analysis of outcomes for students. Gender may be important here too, if childcare demands and household work under lockdown are disproportionately affecting women students.

Student voice 

Early response to the student sentiment survey has been strong, with an expected completion rate of 30%. As the survey goes out two weeks after each student’s programme exams have finished, we won’t have the final responses until 25 August. Analysis of the open-text responses will be as important as analysis of the structured survey responses.

At this stage, with our first groups of responses received and analysed, we are seeing that students clearly understand why we moved to online assessment this year. In addition to the bulk of students confirming that they completed online assessment, we received responses from students who did not complete some or all of their online exams. For this early sample, the reasons for lack of engagement with online assessment are generally Covid-related rather than issues of access to Wi-Fi or suitable computer equipment. (We appreciate that Wi-Fi or computing problems will also mean some students are less likely to respond to the survey.) Illness, disruption and mental health issues resulting from the pandemic have meant these students were not able to engage with the assessment.

Feedback from the survey is helping us to understand the ways in which communication with students worked best; how students undertook their assessment, whether by handwriting or by typing; and whether they encountered issues uploading their answers to the VLE. These are all areas that map onto the operational and logistical lessons learned that we are reviewing.

Looking forwards

In the final part of the student experience survey we ask students whether they would like online assessment to continue. Data so far show strong support for future online assessment, with only 12% of returns against it. Of course we will seek to understand the reasons for these objections to online assessment, as they may have implications for equity in the University’s offer. When presented with a range of possible future approaches to assessment, including online assessment taken at home, online assessment taken in an exam centre, and paper-based exams in an examination centre, students also tell us which approach they would currently like to see in future assessment. From the returns so far, online assessment taken at home is the first choice for approximately 55% of respondents, followed by our traditional model of handwriting an exam in an exam hall (the first choice for about 20% of respondents). However, with about half of our students not yet surveyed, any conclusions must remain tentative.

We expect to have completed this large study by the end of October. We look forward to sharing our findings with the sector, as well as using the results to inform our developments in assessment over the coming year. The experience of, and learning from, swiftly moving 110,000 examinations online will inform our understanding and practice of assessment, for our students and for the sector.

This work will have implications for assessment well beyond the current pandemic.