AI ‘could reinforce bias’ in university admissions

Most disadvantaged students likely to be hardest hit if human element of selection is replaced by technology, conference hears

November 28, 2023

University admissions staff considering using artificial intelligence for selecting student applicants should make sure that the technology does not reinforce pre-existing biases, a major global conference has heard.

Jennifer Dwyer, technical senior manager at the Al Fakhoora scholarship programme, which is part of Qatar’s Education Above All foundation, said some universities insisted on using AI to filter out candidates in the selection process because it is faster and cheaper.

But that does not make it better than people-led efforts, she warned delegates at the World Innovation Summit for Education (Wise) in Doha, Qatar, because it can lead to many “gems” being missed along the way.

“Everyone keeps saying that AI is levelling the playing field, and everybody has a better opportunity so anybody around the world is going to be able to fill out applications…but that’s not true,” she said.

Ms Dwyer said the most marginalised youth in post-conflict areas would be particularly affected because they do not have access to career counselling or many other support mechanisms that those in the West take for granted.

And it is those young people who stand to benefit most if they are able to obtain a higher education and then return to their communities, she added.

“AI doesn’t care, it’s artificial intelligence. Our biggest concern…is that it starts taking away humans who will actually work with you,” Ms Dwyer said.

Also speaking at the 11th edition of Wise, which is organised by the Qatar Foundation, Lauren Goodlad, distinguished professor of humanities at Rutgers, the State University of New Jersey, said human beings have biases and can make mistakes.

“Datasets that are trained on those [human] decisions simply amplify those biases, but do so in a way that falsely provides the impression of being more objective when they may actually be less objective,” she said.

“That’s what happens when you use poor datasets and end up amplifying stereotypes at scale, and it certainly has happened in several kinds of decision-making systems.”

Professor Goodlad said it was theoretically possible to devise selection processes that would be helpful – for example, by providing certain data points to focus on – but this was not happening in practice.

“People need to make sure decision-making processes are constantly audited to make sure that they match the very best human decision-makers – not reproduce the biases of the worst,” she said.

Jonah Kokodyniak, senior vice-president of the Institute of International Education (IIE), outlined the “enormous impact” AI was already having – including helping students to discover scholarships, universities to manage their application selection processes, and applicants to write their essays.

“[It is] making many of us reconsider – given the rise of ChatGPT – which selection process or mechanism might actually not have been that effective in the first place,” he added.

Mr Kokodyniak said there were many ways that AI could be transformative for administrative staff – particularly in how it could free them up to focus on the vital elements of their work that demand more intensive, personal effort.

Although most university selection processes have been refined over decades of trying to reach new demographics or incorporate diverse perspectives, he said, some parts of the job should remain untouched by AI.

“I’m most reluctant to integrate [AI] fully into the actual selection process because it’s very hard to see how AI and datasets can replace the understanding of a human’s lived experience.

“We need to be very cautious before we eliminate parts of that process.”

patrick.jack@timeshighereducation.com
