What our school learned from running digital exams

With the switch from paper to on-screen exams looking increasingly likely, this teacher explains the insights that his school gleaned from experimenting with digital assessment
9th August 2022, 8:00am


Earlier this year, when he was education secretary, Nadhim Zahawi highlighted that the Department for Education was considering the potential of online/on-screen exams as part of the assessment system in education.

This is a move that is also being touted by other organisations, such as the International Baccalaureate, and exam boards are trialling digital exams, too. Clearly, there is a growing sense that education cannot ignore the rise of digital technologies that could modernise assessment.

Yet what would the impact of such a change be - on everything from teaching and learning practices and outcomes to teacher workload and marking?

To evaluate this from a frontline school perspective, I undertook a practitioner inquiry in my own school last year.

The aim was to assess these impacts and provide insight into the benefits and challenges that may arise when formal assessments move from paper-based exams to online/on-screen ones.

The rise of digital exams?

To do this I enlisted six teachers (three computer science teachers and three non-computer science teachers) and 43 students in Years 10 and 12 sitting GCSE and A-level exams in 2023.

The computer science teachers had some experience of using online/on-screen assessments (three to five times over the course of the year), while the other teachers, who taught maths, chemistry and religious studies, described their experience as limited (once or twice over the same period).

Two of the less experienced teachers highlighted that more time was needed to create digital versions of test papers.

The students in my sample sat five online/on-screen tests using Microsoft Assignments and two traditional paper-based exams. The digital assessments were created as Microsoft Word documents and set on Microsoft Assignments for students to sit.

I also compared the results of students who took the online/on-screen exams with those of a group that sat the same test on paper, to measure any difference in learning outcomes.

So what did we learn?

Benefits to teachers

For teachers, there appear to be many benefits with digital exams.

Typically, teachers spend a lot of time creating test papers, printing them out, distributing and collecting them during tests, marking through pages one by one, and making copies of each marked paper, before returning them to students.

But using auto-marking in Microsoft Forms, whereby student answers can be compared with a set of accepted responses, could reduce marking time significantly.

This also reduces the possibility of making errors during marking. Furthermore, having scripts supplied digitally improves record-keeping and reduces the risk of papers going missing.

Teachers also said it was easy to administer the tests via email rather than having to carry heavy piles of paper to and from school and around the site.

Overall, half of the teachers interviewed said the process was quicker to implement, saving them at least 30 minutes, while teachers with more experience of online/on-screen assessments said they were able to save up to 100 minutes per class for each assessment cycle.

What’s more, once a test is written, it can be reused in future without being printed again - saving time and resources, such as paper and ink.

Teachers also found that retaining copies of tests digitally meant that they were easily accessible for both teachers and students.

They also welcomed the possibility of having an overview of responses for question-by-question analysis. Teachers said the marking and feedback stages of the assessment process were quicker and easier to do than with a paper-based assessment.  

Microsoft Forms, for example, allows teachers to create short-response questions, which can then be set using the auto-mark tool. There is also a response tool that provides an instant visual overview of how each student responded to each question in the test.

Benefits to students

There were several benefits for students, too.

When students were asked how easy they found the online/on-screen assessments on a scale of one to 10 (10 being very easy and one being very difficult), 39 out of 43 students selected seven or higher, with an average score of 7.98.

Furthermore, 44 per cent of my sample found online/on-screen assessments on Microsoft Assignments easier and quicker to complete: typing removed the anxiety caused by having to write responses by hand, made their answers easier to check, and made it easier to navigate between questions and pages in the test.

What was even more compelling was the impact that online/on-screen assessments had on the learning outcomes of students.

In almost all of the five separate assessments over the course of the year, the classes that sat the online/on-screen exams on Microsoft Assignments (with the same questions but typed responses) outperformed the class that sat the paper-based equivalent by an average of 5 per cent against the year-group average.

By contrast, when all students sat paper-based exams on both occasions, all groups were within 1 to 2 per cent of the year-group average.

The challenges - and how we can overcome them

Despite these benefits, there were, of course, challenges that arose, too.

Perhaps some of the most expected included ensuring that there was the necessary access to computers (and even types of computers) to sit the test.

For example, the non-computer science teachers had to book a computer room and/or a laptop trolley that was not readily available to the class.

The differences in current provision and in a department’s ability to prepare itself, at pace, could lead to unfairness to students.

Different departments also use a range of devices, including desktop computers, laptops and tablets, with different browsers and operating systems.

There are also likely to be some compatibility issues between different hardware, and the lack of a sufficient number of devices of a consistent specification for whole cohorts (or substantial sub-cohorts) to sit examinations at the same time.

Issues of insufficient or unreliable internet connections also came up, with teachers noting that, in the past, having every laptop in a classroom in use at once could affect connectivity. However, this did not occur during the experiment.

Nevertheless, a move to full digital exams would clearly require sufficient and reliable internet connections to ensure that students can access their assessment and lesson resources.

Finally, the need for staff and student training in using online/on-screen assessments was also highlighted, as not everyone will be immediately familiar with how to run, or sit, a digital assessment.

As such, technical support must also be readily available so that internet connections are not lost during assessments - or, if they are, so that problems are resolved promptly and students can resume answering questions.

Schools would need to be offered adequate training and time to practise administering online/on-screen assessments.

Furthermore, half of the teachers identified limitations in the tools available in Microsoft Assignments for students to draw diagrams or show their working without a stylus (which is not available on desktop computers).

Malpractice might be more difficult to identify and challenge, so consideration over how best to monitor this would be crucial for the reliability of assessments and to ensure that no student is ever disadvantaged.

Next steps

It is clear that technology could have a positive impact on assessments - reducing workload for teachers and improving engagement and, potentially, even outcomes.

However, it should be noted that while the evidence of student performance improving through online/on-screen assessments is positive, it is still too early to draw conclusions, at least until more data is gathered from other departments and across different schools.

Furthermore, there are unquestionable issues that exist with the idea of digital assessments that would need serious consideration - from access to devices to robust internet connections.

This is why practitioner inquiries like this are so crucial: they give schools the evidence to help inform how digital assessments are designed in future.

I would encourage school leaders across the country to begin collecting data from their own settings, to better prepare their staff and students for the transition of formal assessments from paper to online/on-screen exams.

A full report into the findings will be available in the Camtree digital library in autumn 2022, including the full methodology and research template.

Monir El Moudden has been teaching computer science for over 13 years and currently works at an independent school in London, UK. He tweets @monirelmoudden
