What can human behaviour analytics tell us about student learning?
Human behaviour analytics could be the answer to enhanced student engagement and better learning experiences in computer-aided learning environments
Computer-aided teaching and learning underpins many aspects of modern education. Yet, while potentially transformative, the wholesale integration of computing systems into contemporary schooling and university teaching has not yet delivered solutions to enduring problems in the education sector. The wide adoption of online curricula during the Covid-19 pandemic further revealed that lowered student engagement and the lack of reliable methods for assessing learning progress were among the primary challenges of online learning.
What are the problems in online learning?
- Although it is widely known that students learn in different ways and at different paces, it remains difficult to identify optimal engagement and content delivery systems, particularly at the individual student level.
- Teachers face challenges in identifying student learning difficulties and lack information that they can draw on to tune their real-time response.
- Learning outcome assessment still hinges on sporadic milestone events (such as exams and assignment delivery) rather than deeper ongoing assessment of learning, understanding and progress across the student education experience.
Behaviour analytics as a solution
We believe human behaviour, in particular real-time interaction while learning, is the key to resolving these problems. In our research at the University of Technology, Sydney, we have observed students using the mouse to click text segments to assist reading, using the keyboard to search for keywords, and using digital pens to take notes, all of which are typical learning behaviours.
Students’ eye movements and facial expressions are also important channels of interaction that will help us to understand their focus of interest and whether they encounter points of learning difficulty. In fact, several typical patterns could be seen in these behaviours if the signals generated by digital tools such as the mouse, keyboard or camera could be captured, processed and analysed using signal-processing and machine-learning methods.
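To make this concrete, the sketch below shows how raw interaction events might be summarised into simple engagement features (click rate, keystroke rate, mouse travel distance) over a time window. This is an illustrative example only, not the actual pipeline used in our platform; the event schema and feature names are hypothetical.

```python
from dataclasses import dataclass

# Hypothetical event record: timestamp in seconds, event type, screen position.
@dataclass
class InteractionEvent:
    t: float
    kind: str   # "click", "move" or "key"
    x: float = 0.0
    y: float = 0.0

def engagement_features(events, window=60.0):
    """Summarise one window of interaction into simple features:
    click rate, keystroke rate and total mouse travel distance."""
    clicks = sum(1 for e in events if e.kind == "click")
    keys = sum(1 for e in events if e.kind == "key")
    moves = [e for e in events if e.kind == "move"]
    travel = sum(
        ((b.x - a.x) ** 2 + (b.y - a.y) ** 2) ** 0.5
        for a, b in zip(moves, moves[1:])
    )
    return {
        "click_rate": clicks / window,
        "key_rate": keys / window,
        "mouse_travel": travel,
    }

sample = [
    InteractionEvent(0.5, "click", 10, 10),
    InteractionEvent(1.0, "move", 10, 10),
    InteractionEvent(2.0, "move", 13, 14),
    InteractionEvent(3.0, "key"),
]
print(engagement_features(sample))
```

Feature vectors of this kind, computed over successive windows, are the sort of input a machine-learning model could use to flag changes in attention or points of difficulty.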
Working collaboratively with Acer Australia and local schools, we have developed a multimodal learning analytics platform to address these problems and have tested it in real-world classrooms. It can record natural learning behaviour data, provide real-time engagement tracking, and assess the learning progress of individual students.
These data could support teaching and learning in many ways, allowing users to assess the fit of educational materials for students and track changes in engagement patterns, and may even act as an input for further downstream data science processing.
Flexible, low-cost and non-intrusive solution
A typical commercial eye-tracker suite involves expensive hardware with specially designed cameras or glasses, and a complicated software package. A user calibration procedure is usually needed before the eye tracker can be used. Such devices and procedures are ill-suited to quick, large-scale deployment in schools, universities and homes.
Unlike most commercial products, our solution doesn’t require extra hardware – the mouse, keyboard and embedded camera on a laptop suffice. The software, once copied to a computer and launched, can collect and synchronise user-interaction data from different modalities according to the preference of the user. It is a customisable, platform-independent and installation-free learning analytics solution, adaptable to any computer-based learning context. This distinguishes it from other built-in learning analytics solutions.
Ethics of research, and student acceptance
Privacy concerns are always a consideration when eye-tracking technology is used and behavioural data are collected. In our work, we have gone through all the necessary ethics approval processes. Strict rules apply to the data collection and management strategy: we record only extracted data features, never re-identifiable data or facial images, and we anonymise all students in the learning records. The data recording period is strictly limited to the course of learning.
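One common way to keep records anonymous while still allowing a student's progress to be tracked over time is to store a salted one-way hash in place of the identifier. The sketch below illustrates this general technique; the field names and salt are hypothetical, not drawn from our platform.

```python
import hashlib

def pseudonymise(student_id: str, salt: str) -> str:
    """Replace a student identifier with a salted one-way hash so stored
    learning records cannot be traced back to an individual."""
    return hashlib.sha256((salt + student_id).encode()).hexdigest()[:12]

record = {
    "student": pseudonymise("s1234567", salt="course-salt"),
    # Only derived features are kept -- no raw video or facial images.
    "features": {"attention_score": 0.82, "reading_speed_wpm": 210},
}
```

The same student always maps to the same pseudonym within a course, so learning curves can still be plotted, while the original identity is not recoverable from the stored data.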
“What does this software do?” and “How would it help me?” are the typical questions asked when the software is first used. Although we can promise no immediate benefit in the initial phase of deployment, most students and teachers are open to trialling our learning analytics software. Once sufficient data have been collected and a learning curve is plotted, they are invariably delighted to see their learning progress over time.
Enhanced learning, better experience
Drawing on hundreds of hours of computer interaction data and leveraging cutting-edge machine-learning approaches and behavioural theory, our solution can deliver real-time student engagement analytics to teachers – describing where and when a student is captivated by the material and when they may be somewhat less interested. The outcome supports teachers in understanding what works best in their specific classrooms and for each specific student – both in the moment and across the long term. Moreover, when linked with traditional performance evaluation, the data provide a new way to understand the link between student behaviour and student understanding.
Fang Chen is distinguished professor and executive director of the UTS Data Science Institute, and Kun Yu is lecturer and leader of the Learning Analytics project at the Data Science Institute, both at the University of Technology, Sydney.
If you found this interesting and want advice and insight from academics and university staff delivered direct to your inbox each week, sign up for the THE Campus newsletter.