The rise of AI doesn’t mean we should forget about memorisation

Graduates who can’t think critically without electronic assistance will be at a distinct disadvantage in the workplace, says Loïc Plé 

May 5, 2024
[Image: a robot points to its head, symbolising memory. Source: iStock/iLexx]

One of the primary challenges facing higher education is the growing disconnect between students’ expectations and traditional teaching methods.

A large proportion of Gen Z students, and of the incoming Generation Alpha, are accustomed to instant access to knowledge (or what they believe to be knowledge). They already question the purpose of learning that requires significant effort, and they struggle to see the value of theoretical content, preferring to focus on practical applications, especially in business schools.

Now, with machines capable of generating content (text, audio, video, images) that they might be asked to create as students or as future practitioners, students are also questioning the relevance of their learning experiences. This likely nurtures the disengagement that is a growing concern for professors worldwide and requires a proactive response from universities.

But we should not retreat too quickly from our learning outcomes. Learning remains crucial for students to perform effectively in their future careers. If individuals spend their time searching for information or ways to complete tasks, their efficiency and adaptability in the workplace will be very limited and they will be highly replaceable by generative AI.

Furthermore, understanding and remembering theory is likely to remain essential to act knowledgeably and efficiently in professional contexts. For instance, consider a business consultant who fails to remember the strategic analysis models they learned at university (such as the BCG matrix or Porter’s five forces) or the underlying assumptions of these models. If they constantly need to refer to online resources when working with colleagues and clients, they might inadequately identify problems, formulate suboptimal strategic recommendations and be at a competitive disadvantage compared with others who have assimilated this knowledge and can use it without external support.

Similarly, a data analyst who struggles to recall statistical models and techniques without technological assistance might identify patterns more slowly than others, draw flawed conclusions or offer suboptimal advice. They might also be less able to communicate confidently with other stakeholders.

Learning, understanding and remembering theory and basic concepts is also fundamental to developing critical thinking skills, which, among other skills, are vital for interacting with generative AI and for analysing and verifying the results it generates (some of which are likely to continue to be hallucinations that do not match reality or describe things that do not even exist).

To address these issues, universities must rethink the role of pure memorisation in their curricula. There has been a longstanding belief that students primarily need to understand concepts rather than memorise raw knowledge, but the latter remains important, too. Memorisation of foundational concepts and principles provides a solid framework on which students can build their critical understanding and effectively apply their knowledge in practical situations.

Universities must adopt more innovative and engaging teaching strategies that motivate students and help them to see why learning and memorising are essential. These techniques must bridge the gap between theory and practice so students better understand the complementarity between the two. To that end, universities should consider adopting what are known as inductive teaching approaches more broadly.

By exposing students to real-world scenarios and practical applications, such as through gamified learning experiences, before delving into theoretical concepts, professors can help them identify the specific knowledge gaps they need to fill to become effective professionals. If students acquire a deeper understanding of what they need before they learn it, they will be more likely to engage with that learning.

Implementing inductive teaching methods will require a significant shift in how professors design and deliver their courses, as it strongly differs from the more prevalent theory-first deductive method. Universities must, therefore, invest in professional development programmes. But the cost will be worth it. By rebalancing the emphasis on memorisation and comprehension, embracing inductive teaching methods and prioritising practical application, institutions can increase students’ engagement and equip them with the knowledge and skills needed to thrive in a world of generative AI.

Loïc Plé is director of teaching and learning and a full professor in strategic management at IÉSEG School of Management, France.


Reader's comments (2)

I get what you are saying, and I am less hostile to memorisation than I used to be, but I still think the emphasis in university is on thinking, not remembering. I'm a moderately successful molecular geneticist, but I still don't remember which base is a purine and which a pyrimidine, nor which amino acid belongs to which category. That said, I do have a lot of knowledge at my fingertips. But this is not because I learnt it at university; rather, the things I have repeatedly used in 20 years of professional practice have imprinted themselves. It's a set of facts that is almost entirely mutually exclusive from those my university teachers thought were important. Perhaps it is the fast-moving nature of my field, but any fact we teach our students now will be irrelevant in five years' time. Better to equip them to move rapidly with the times than to excel at practices and with knowledge that will be deprecated almost before they've even graduated.
Thank you very much for your comment, Ian. I fully agree that (critical) thinking prevails over remembering. One of the points I wanted to emphasise in the piece is that the imbalance between the two has become (too) great and should be reconsidered, because thinking requires learning and memorising first. I know this is obvious, but it really feels as if it has been somewhat put aside. Mastering and memorising the basics (and even a bit more) is essential to keep building on them and developing new knowledge and skills. I am not a molecular geneticist, but I would guess that you would struggle to master new practices and knowledge without being able to build on the old ones (even though I also guess that some of the old ones may sometimes hinder the development of new ones). Once again, the point was really about re-establishing some balance ;)