
What can module leaders learn from Toyota?
How to use the plan-do-check-act cycle to improve student success, satisfaction and grades
The concept of continuous improvement, rooted in the lean manufacturing techniques developed by Toyota in the 1950s, has been a fundamental component of the manufacturing industry for decades. Although these practices are now popular across many sectors, higher education has been slow to adopt them. As a continuous improvement manager, I have seen firsthand how they can transform teaching and learning in higher education.
I have been helping colleagues apply continuous improvement principles to module enhancement in order to increase engagement, grades and student satisfaction. While several relevant methodologies exist, the most useful and most easily applied to higher education is the plan-do-check-act (PDCA) cycle. In essence, this is a four-step model for implementing change and a rigorous framework for designing meaningful, data-driven interventions.
Plan
The planning stage is the most important part of the PDCA cycle. Here, the focus is on identifying the problem(s). Ask yourself:
- What are the main problems with my module?
- Are my students happy with the way I teach?
- Is the assessment well suited for the module?
When I led a postgraduate module in 2021, the first thing I did was review the data. It quickly became clear that both student success and student satisfaction had room for improvement, so changes were needed.
After identifying the problems, the next step is understanding the root causes. In my department, we used the five whys (5 Whys) technique, a problem-solving method that involves asking “why?” five times until you reach the root of the problem. The reasons we discovered were multifaceted. First, the reading list needed to better reflect the student population, as nearly 80 per cent of students on the module were international and from culturally diverse backgrounds. Second, international students often face challenges navigating the UK university system and adapting to its academic expectations, which can put them at an automatic disadvantage. Finally, qualitative comments from the student satisfaction survey suggested that students needed more support for the summative assessment. After identifying the root causes, we decided to make several interventions.
Do
With these insights, we moved to the “do” stage. This is where you implement the planned interventions based on the problems identified. In our case, to support students with assessments, we developed an assessment checklist that acted like a roadmap, including hyperlinks to referencing, plagiarism, and academic writing resources created by the university’s library. We also decolonised the reading list to represent a broader range of voices and perspectives. Additionally, we integrated digital tools such as Padlet and Poll Everywhere to collect responses to questions and quizzes during seminars, increasing student engagement.
Check
During this stage, we analysed whether the interventions were successful, using several performance indicators including assessment scores and student feedback. The results showed that the number of students achieving an upper award increased, the failure rate dropped to only 3 per cent and student feedback scores significantly improved. It was clear that the interventions had been successful.
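If you want to make the “check” stage repeatable, the comparison of performance indicators can be scripted against anonymised marks from each delivery. The short Python sketch below is purely illustrative: the marks, the pass mark of 40 and the upper-award threshold of 70 are hypothetical placeholders rather than figures from our module; it simply shows the kind of before-and-after summary a module leader might compute.

```python
# Illustrative sketch only: hypothetical anonymised marks for two deliveries
# of a module, before and after a set of interventions.
before = [38, 45, 52, 55, 58, 61, 63, 67, 70, 72]
after = [42, 55, 58, 62, 64, 66, 69, 71, 74, 78]

PASS_MARK = 40      # assumed pass threshold
UPPER_AWARD = 70    # assumed threshold for an "upper" award


def summarise(marks: list[int]) -> dict[str, float]:
    """Return the indicators used in the 'check' stage: mean mark,
    failure rate and share of students reaching an upper award."""
    n = len(marks)
    return {
        "mean": sum(marks) / n,
        "failure_rate": sum(m < PASS_MARK for m in marks) / n,
        "upper_award_rate": sum(m >= UPPER_AWARD for m in marks) / n,
    }


for label, cohort in [("before", before), ("after", after)]:
    stats = summarise(cohort)
    print(f"{label:>6}: mean={stats['mean']:.1f}, "
          f"fail={stats['failure_rate']:.0%}, "
          f"upper={stats['upper_award_rate']:.0%}")
```

Keeping a small script like this alongside the module data makes it easy to recompute the same indicators after every delivery, which matters because the cycle does not end after one iteration.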
Act
This stage can be considered the standardisation stage. In the manufacturing and service sectors, the interventions made in the “do” stage can first be trialled on a small scale and in a controlled environment before being rolled out. However, this approach may not always be applicable in education, where a change to a module typically reaches the whole cohort at once. In our case, following the success of our interventions, we standardised them for subsequent deliveries of the module.
The module has continued to perform exceptionally well since we introduced these interventions, and it even received the department’s “Most Improved Module” award. If the interventions had been unsuccessful, we would have analysed why they failed and what could be improved. Either way, we should not rely solely on one iteration, as the PDCA cycle is inherently iterative. We refine the process each time we run the module.
Integrating continuous improvement techniques into higher education teaching and learning can greatly improve overall student outcomes. However, implementing them is not without challenges. One significant hurdle is the perception that such techniques are for businesses, not for universities; our success in my department shows that this is clearly not the case. Another challenge is faculty resistance to change, which is to be expected. Despite these challenges, the effort is well worthwhile. The results speak for themselves.
Sercan Demiralay is a continuous improvement manager and principal lecturer in accounting and finance at Nottingham Business School, Nottingham Trent University.