Online exams: is technology or authentic assessment the answer?

Academics debate how best to test students’ learning at a distance and reduce cheating: remote invigilation or exercises that ask students to apply their knowledge

January 28, 2021
Close-up of an eye. Source: Reuters

With the coronavirus pandemic still raging, distance learning looks to remain a key part of universities’ activities for some time. And given the success they have had with experiments in online tuition, many institutions may well choose to continue it as an option into the long term.

But teaching at a distance means assessment at a distance, and finding the fairest and most secure way to do this has not been straightforward. Nevertheless, the surge in cheating cases reported by universities since the switch to online examinations makes the need for a solution urgent.

Online “closed-book” exams have proved comparatively easy to cheat on, while open-book exams with a time limit do not fix the problem because contract cheating websites, which write student essays or answer questions for a fee, now operate around the clock.

Sarah Eaton, an associate professor of education at the University of Calgary and a programme organiser with the International Center for Academic Integrity, said the sector’s response to the problem could be simplified into two options: the use of online proctoring technology or “authentic assessment”.

Authentic assessment asks students to apply what they have learned in different circumstances, rather than relying on what knowledge they have memorised, and is therefore considered difficult to cheat on. Online proctoring uses technology to act as the “invigilator” on a student’s computer, checking that students remain in the room or tracking their eye movements, analysing their keystrokes or blocking them from accessing other websites.

“We’re starting to see a divide in the academy between those who favour authentic assessment and new ways of doing things, and those in positivist or STEM disciplines where testing has been the norm and continues to be important,” Dr Eaton said.


Technology seems like an obvious answer, particularly for those courses accredited by professional, statutory and regulatory bodies, which require students to meet certain professional standards and call for closed-book exams.

Many institutions are taking the technological route. Scott McFarland, chief executive of US-based ProctorU, which offers remote oversight by invigilators, said he had had to double his staff of about 500 human proctors since the lockdowns began. Another US company, Proctorio, said it had experienced 900 per cent year-on-year growth in proctored exams completed on its platform.

In the UK, uptake has been more cautious. The Quality Assurance Agency (QAA) conducted a small survey of members about their use of proctoring services since the pandemic, and Simon Bullock, the organisation’s quality and standards specialist, said that of the 50 who responded, about half had used some form of online proctoring. This ranged from simple identity checking or invigilators monitoring students via Zoom to the kind of software that tracked eye movements or keystrokes, but “very few used the most advanced software”, he said.

This was likely because of the high cost of the more advanced software and the resources needed for Zoom invigilation, he explained.

Mike Draper, professor of legal studies at Swansea University and a member of the QAA’s academic integrity advisory group, explained that additional resources were necessary to use the most advanced software. Before students sit an assessment, they have to do a “pre-assessment” to check that the software works, and there have been instances where the technology failed and students have had to resit the whole exam.

Credibility, and the need for students preparing for certain professions to be able to demonstrate that they have the right competencies, were important considerations, Professor Draper said. However, online proctoring was not "entirely foolproof", he continued. "You still find that there are ways to cheat it."

Aside from resources, online proctoring throws up ethical issues, Professor Draper said. Over the summer, a number of legal news outlets reported that students taking the Bar Standards Board's exam online resorted to relieving themselves in bottles because they feared that the online proctoring system would mark them down for leaving the room to use the toilet.


Privacy issues are a particular concern, with many students baulking at the idea of having someone take over their computer or gaining access to their camera. “Finding a quiet and private space to take an exam at home means using their bedroom for most students. That can be very intrusive, particularly for young women,” said Cath Ellis, associate dean of education at UNSW Sydney.


Last summer, students at the Australian National University launched a petition against the use of the proctoring software Proctorio at the university, describing it as a "gross invasion of our privacy". The issue was compounded in August 2020, when the universities of Sydney and Melbourne confirmed that they were investigating an alleged data breach at ProctorU, the online examination tool they were using.

Some companies say they have recognised the issue. Keith Straughan, chief executive of Axiologs and one of the developers of UK-based online testing platform Tenjin, said they took these “concerns very seriously indeed”. Tenjin does not “take control” of a candidate’s computer or use continuous video monitoring; instead it takes frequent photos, biometrically matched against identity documents, and supplements this with voice recognition, he said.

This also lowers the bandwidth needed, making the platform easy to use internationally, even in technically challenging geographies, which is particularly useful for English-language assessment, Professor Straughan added.

Some universities have opted against using any kind of proctoring software. Dr Eaton said Calgary had commissioned a group, of which she was a member, to look into whether the university should invest in such technology. It concluded that there was “not enough research done by independent and objective third parties to support its use”, she said.


“It didn’t justify the cost, not at a time when we’re cutting jobs,” she explained.

This was echoed by Phillip Dawson, associate director of the Centre for Research in Assessment and Digital Learning at Deakin University. “Proctoring companies claim that their tools can reduce cheating, but I’m yet to see any robust, independent peer-reviewed evidence in support of that,” he said. “I’ve approached several remote proctoring companies over the past couple of years to arrange a study where I try out a range of cheating approaches in their exams, but none have said ‘yes’ so far.

“To me, the question is: how much does it reduce cheating, and is that reduction worth the trade-offs that are made in terms of privacy?” he said.

Jesse Stommel, senior lecturer in digital studies at the University of Mary Washington, warned that some companies “are taking advantage of the crisis”.

The use of online proctoring software “creates a culture of anxiety and stress”, he said. Rather than focusing on tales of an increase in cheating, which, Dr Stommel said, was negligible, institutions should be making decisions about what was best for their students. “Students don’t need cameras in their homes, watching their every move. Students need support,” he said.

Irene Glendinning, academic manager for student experience at Coventry University, advised that “if you are going to [use online proctoring], you need to have the consent of the students and ensure that they understand what it will do and why it is needed. Overall, authentic assessment is a better way of assessing: you are asking students to demonstrate their understanding, rather than regurgitate facts, and learn more in the process.”

Dr Ellis agreed. “People need to look at what is the purpose of that assessment. What are we asking students to demonstrate? If we go back, a lot of it is about checking that students can do something themselves without outside assistance. So we need to find a better way of doing that,” she said.

Dr Ellis suggested that even subjects where high-stakes exams seem necessary, such as languages or medicine, could move towards more authentic assessment through mini-vivas, in which students are questioned about their body of work or asked to demonstrate their overall understanding.

Neil Morris, dean of digital education at the University of Leeds, explained that addressing assessment takes time. “We need to remember that everyone is still in emergency mode. To do this properly, you start with curriculum design and learning outcomes…people are doing their very best, but there hasn’t been the time to go back and rethink their curriculum from the very beginning,” he said.

Professor Morris believes that what is required is a move towards assessing students’ ability to solve problems and to apply knowledge to contextual situations through authentic assessment. Creating such a portfolio attesting to a student’s learning and demonstrating their abilities would be more valuable to future employers, he said.

“Looking ahead, online proctoring would likely be used only where it’s absolutely necessary for professional exams, but at a minimum level,” Professor Morris said.

The problem is that assessment is often designed with efficiency in mind rather than educational effectiveness, argued Janice Orrell, emeritus professor of higher education and assessment at Flinders University. Institutions were failing to provide sufficient professional education or supportive mentoring for staff to take on this high-stakes aspect of university education, she said.

This was echoed by Dr Stommel, who said institutions should be investing in their teachers and their development. “Every dollar that is spent towards remote proctoring, towards new learning management systems, should go towards faculty development and student support,” he said. This would allow teachers to develop assessment methods that help learning. “It’s the most pedagogically sound thing to do,” he said.

For Professor Dawson, proctored exams have made it “easy to keep doing roughly the same exams we’ve always done. I worry this might get in the way of reimagining what assessment should be for the future.” They must not be “an artificial life-support system for a dying type of assessment: the invigilated exam,” he said.


However, Dr Glendinning was hopeful that the sector was starting to change. While the pandemic has posed many challenges, it has “had a really positive impact on people’s understanding of what makes good assessment, what works well and what not to do. It has shone a light on the problems with the old way of doing things, ways that have been done for centuries but not changed,” she said. “The genie is out of the bottle.”

anna.mckie@timeshighereducation.com

POSTSCRIPT:

Print headline: Universities seek fair and square solution for online exams


Reader's comments (8)

This conversation is long overdue. In fact, when Covid disrupted the norms of education, the unwise starting point was the immediate transfer of the physical classroom to the virtual one. It should have begun with outcomes and assessments under the New Normal, and then developed the pedagogy that would support those outcomes and new assessment approaches. Traditional assessment assumes a sustained and/or significant amount of classroom time where students come together under the eyes of a supervising teacher or faculty member, and where students meet others. These assumptions were compromised. It's like training a water polo team, but due to the shortage of water, they get to compete in soccer.
I solved this problem back in the 1980s by creating a scenario pertaining to the course being studied, whereby the student had to elaborate on the information to provide a suitable action to create the next step. The courses I was teaching were Small Business (undergraduate) and Entrepreneurship (postgraduate), which encouraged students to "think outside the square". My teaching methodology encouraged students to ask questions and provide their own viewpoints on the topic of the day, as in Real Life there is NO correct answer in running one's own business. I encouraged all students to ask questions in class (Who, What, When, Where, How, Why) - this generated a great deal of discussion and ideas flowed freely - there were NO right answers, but they were assessed on their ideas and methods. QED.
Similarly - I ran a module on food retailing; the assessment was that students had to visit our local supermarket, or another of their choice, and recommend improvements. Again, there were no correct answers; it was judged on their initiative, reasoning and analysis of current trends, and of course this was different from year to year.
Suppose we had 'testing centres' with safe individual booths with terminals etc. which can be fairly easily monitored. The booths and tests could be booked at any time of the year so that we can actually do university courses sensibly on a continuing basis rather than the batch (semester) system we have now. A good potential private business there for someone - but as this is so obvious I suspect there are companies doing it already. Certainly universities should be doing this for their courses.
An online "closed-book" multiple-choice exam is not fool-proof at all. Online proctoring platforms could possibly curb the cheating to some extent, but they are definitely not 100 per cent fool-proof. There is no doubt that Covid-19 has accelerated our education system's move online, but we have not thought much about the authenticity of the assessment process and its ultimate repercussions for society.
We've been running digital examinations for a few years now, and the analytics have taught us that arbitrary recommended durations often need to be revised, particularly where MCQs are in use - the overwhelming majority of students complete exams much more quickly than we expect. A few things that I believe would help:
- Create a pool of varied but equally meritable questions on any material on which the students can be examined, and allow the platform to randomly generate unique exam papers based on a few rules (e.g. each exam should contain 10 "easy" questions, 6 "medium" questions and 4 "hard" questions, and no two questions should be the same; the qualifiers don't necessarily have to be difficulty level and could include other parameters).
- Further, randomise the order in which the questions are presented in the exam paper, and randomise the order of the answers where appropriate. This way, if several students are given the same questions from the pool, it's highly unlikely they will encounter the same content at the same time.
- Consider presenting one question at a time and preventing students from navigating back to previous questions. This one needs to be approached with some caution, as there is value in allowing students to return to, review and modify their answers; if the timings of the exam are kept tight and the other two strategies are adopted, this may not be necessary.
There are definitely obstacles which need to be considered here (e.g. how to communicate corrections or clarifications to questions on unique exams), but the best way to discourage cheating is simply not to make it worth a student's time or effort. If timings are tight, it's not worth wasting precious seconds trying to google answers or ask for them on a WhatsApp group. If each student's exam asks different questions, there's no incentive to help another student out when you stand to gain nothing (and again, you actively lose precious time in the process!).
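[Editor's note: the pooling-and-shuffling strategy described in the comment above can be sketched in a few lines of Python. This is a minimal illustration under assumed data structures - the pool layout, field names and rules are all hypothetical, not any particular platform's implementation.]

```python
import random

def build_exam(pool, rules, student_id):
    """Assemble a unique, reproducible exam paper for one student.

    pool  : dict mapping difficulty -> list of question dicts, each of the form
            {"text": ..., "options": [...], "answer": index_of_correct_option}
    rules : dict mapping difficulty -> number of questions to draw,
            e.g. {"easy": 10, "medium": 6, "hard": 4}
    """
    rng = random.Random(student_id)  # per-student seed, so the paper can be regenerated for marking
    exam = []
    for difficulty, count in rules.items():
        # sample() draws without replacement, so no two questions are the same
        exam.extend(rng.sample(pool[difficulty], count))
    rng.shuffle(exam)  # randomise the order in which questions are presented
    # Randomise the order of the answers, tracking where the correct one lands
    shuffled = []
    for q in exam:
        order = list(range(len(q["options"])))
        rng.shuffle(order)
        shuffled.append({
            "text": q["text"],
            "options": [q["options"][i] for i in order],
            "answer": order.index(q["answer"]),
        })
    return shuffled
```

Because the generator is seeded per student, two candidates almost certainly see different papers, yet each paper can be deterministically rebuilt from the student's identifier when corrections or clarifications need to be applied.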
Why not scrap exams altogether at the university level? Let those who want to do the selection from the pool of candidates for whatever purpose devise their own tests for selection.
"...the use of online proctoring technology or authentic assessment..." = false dichotomy.
