‘We’re on the wrong side of AI partnership’: technology and the future of learning

If AI is to transform education, assessment must be rebooted and developers must make leaps in understanding the learning process, experts say. Rosa Ellis reports

September 26, 2024
A visitor participates in 'AI Dream', an artificial intelligence immersive experience project, at Shanghai Tower in Shanghai, China, on April 12, 2024
Source: Tang Yanjun/Getty Images

If you ask ChatGPT how generative AI is going to impact higher education over the coming years, it responds with 10 positive outcomes plus a few words of caution. Human experts in technology and education are, on the whole, less optimistic. Times Higher Education spoke to several of them about how new technologies could affect learning and where the opportunities and dangers may lie.

Short-circuiting the learning process is a common concern. “The important thing about learning is never to let somebody else or something else do your thinking for you,” says Christopher Dede, senior research fellow at the Harvard Graduate School of Education. “I find it very dismaying when people say, ‘Well, nobody needs to write any more because ChatGPT will do their writing for them. Or maybe if they really want to learn a little bit about writing, they could edit something that ChatGPT does.’ Editing a ChatGPT product is not the same as having your own individual ideas.”

Diana Laurillard, emeritus professor of learning and digital technology at UCL, agrees. ChatGPT and similar technologies “should only be used by teachers as part of a design to help students develop their own ability to search and process texts or interpret and summarise them. If students use them to challenge and enhance their own thinking, then they will assist learning, but they need the teacher to guide this process,” she says.

Dede warns against thinking that AI is smarter than humans. People are wired to have an emotional reaction to language, he explains, while AI’s language is unemotional and based on maths and statistics. AI’s strength, he continues, is “reckoning”, making calculated predictions. It cannot make judgements like humans can. But when the two work together, the outcome could be hugely positive.

“You can imagine a cancer doctor working with an AI partner. The AI is doing reckoning. It’s scanning 1,500 medical journals every morning, and if anything relates to new information about helping this patient, it tells the doctor. But you would never ever want the AI counselling the patient in the way the doctor does. The AI doesn’t understand pain, doesn’t understand death, doesn’t understand having a family and the financial and emotional consequences of different kinds of treatment decisions,” Dede says.

The problem universities face is that many tests assess students on their reckoning ability, not their judgement, he says. “We judge the quality of our educational institutions by the ability of students to do tests. We’re on the wrong side of AI partnership.”

Bryan Alexander, an expert at Georgetown University on how technology can transform education, agrees that making the most of AI requires a “reboot and redesign of assessment from top to bottom”. He says most universities around the world are not making strategic decisions about AI and do not have policies on it: “Most have skipped making a decision and left it up to the individual faculty.” He suggests that universities set up generative AI committees, with representation from not only computer science but also other disciplines such as literature and economics, to monitor the technology and its impacts and help shape institutional policy.

Interestingly, Alexander does not believe that AI is necessarily here to stay. It is facing several existential threats, he says, including the possibility of governments banning it, judges outlawing it, people shunning it and companies dropping it because they cannot find a sustainable business model for it. “We’re not quite sure if generative AI will continue to exist over the next year or if it will change drastically.”

If it does stick around, the potential outcomes are not all gloom and doom. Some believe that AI could have a big part to play in the future of personalised learning at scale.

“We’re trying to get to that magic relationship where you as a learner get the content and the instruction that matters for who you are and what you know,” says George Siemens, director of the Centre for Change and Complexity in Learning at the University of South Australia.

There are differing views on whether personalised learning at scale already exists, depending on the definition being used. Siemens says there is renewed hype around AI and personalised learning because AI chatbots can simulate an authentic human response. However, he continues, the current technology is lacking. “It has a big missing piece right now, which is it doesn’t really have a significant memory of you as a learner,” he says. “I cannot think of an organisation that’s doing [personalised learning] well.”

The first step towards effective large-scale, technology-mediated personalised learning, Laurillard believes, is for developers to understand what it takes to acquire high-level skills and knowledge. “It is not a matter of working out which bit of knowledge they [students] lack and telling it to them again, as so many ‘personalised learning’ programs do,” she says. Developers must grasp that learning results from a student “doing something, not just reading, watching and listening…AI could support this, but programmers have chosen not to experiment with personalised learning in this way.”

Despite the challenges, many organisations are working on using AI to develop personalised learning that could be transformational.

Combining AI with virtual reality technology has the potential to enhance the teaching of performance-related skills such as presenting and negotiating, says Dede, who is the associate director for research at the National AI Center for Adult Learning and Online Education in the US.

“What we can now do is create authentic, immersive simulations. So let’s say that you’re going to buy a used car and you want to be able to negotiate well with a used-car dealer to get a fair price. You can practise in an authentic immersive simulation with a chatbot,” he says. “That’s in a virtual environment that’s been trained to behave like a used-car dealer. And you can practise and get feedback and practise and get feedback.”

rosa.ellis@timeshighereducation.com

