
How universal design for learning can build AI proficiency
If we want to boost critical artificial intelligence proficiency across our institutions, we should start with course design. Shifting attention away from what students produce with generative AI – essays, code, music – and away from faculty debates about academic integrity policies and detection software does not diminish faculty expertise. It makes that expertise more visible and more necessary.
In my undergraduate music theory teaching, the most impactful uses of AI have to do with how I structure learning – and how accessible that structure is to the full range of students I encounter.
Using AI to boost classroom inclusion
For years, my discipline relied heavily on high-stakes, exam-based assessment, where students demonstrated mastery in a single written test. Universal design for learning (UDL) challenges us to rethink that model. It asks us to provide multiple means of engagement, representation and expression – that is, more than one pathway to understanding and to demonstrating learning.
Many educators support this in principle, but few have the time to redesign their courses accordingly. This is where AI can help, not as a shortcut for thinking but as a catalyst for it. If you are unsure where to begin, take one learning outcome from your course and ask an AI system to generate alternative ways students might demonstrate proficiency. You may receive suggestions for projects, presentations, reflective analyses or creative applications. You are unlikely to be fully satisfied with the initial output – and that is the point. The value lies in the disciplinary judgement required to evaluate, refine and reshape it, and the process can expand your pedagogical imagination.
In my courses, this led me to build structured choice into assessment. Instead of one cumulative exam, students can choose how they demonstrate understanding: a traditional test, a musical composition applying key concepts, a music analysis paper or a teaching demonstration. Not every student chooses differently – some still prefer the exam – but the message is clear: the ability to demonstrate understanding is not confined to a single format.
We can also use AI-assisted UDL to reduce unnecessary cognitive load. Ask students what most improves their experience in a course, and the answer is often knowing what is expected and when. Using AI, I developed a detailed weekly template in our learning management system that includes short-term learning goals, a day-by-day overview of activities, itemised assignments and consistent links. I prompted the system to keep UDL principles in mind, then substantially edited the output to fit my disciplinary context and standards. The result was no loss of academic rigour – just less unnecessary confusion.
If you would like to try something similar, ask AI to draft a weekly course outline that includes:
- concise learning goals
- a table of daily topics and tasks
- clearly formatted assignment lists with due dates
- links and resources.
Then adapt it to your voice and institutional platform. If a weekly breakdown doesn’t appeal to you, consider formatting by unit or test. Consistency alone can dramatically improve students’ experience and performance.
Transparency is another powerful lever for equity. Many of us inherited assignment prompts that are brief, high level and unintentionally opaque. Consider feeding an assignment into an AI system with instructions such as: “Rewrite this as student-facing. Provide step-by-step guidance. Reduce jargon. Include checkpoints. Keep accessibility in mind.” Review the output carefully, and you may find clearer sequencing, more explicit criteria and built-in scaffolding that strengthens, rather than dilutes, academic expectations.
In my experience, adding this structure improves the quality of student responses, particularly in short writing tasks. Students are not guessing what I value; they can see it.
Guiding students in AI use
Of course, students need guided opportunities to experiment with AI tools as well. Early in the semester, I invite students to use an AI system as a “tutor” for a defined portion of our course material. In the AI prompt, I provide parameters: use a particular web-based, open access source; adapt to my responses; if I provide verbatim responses, ask me to explain concepts in my own words; continue until I demonstrate proficiency. I also encourage students to test the system’s limits – to see how easily it can be prompted into error, for example. And I share my personal experiences with AI’s inadequacies, especially as they relate to the very human-centred practice of music-making and learning. Students experience one way to engage with AI, seeing both pros and cons in the same short assignment. Importantly, detecting those limits depends on the disciplinary knowledge they are developing.
This dual approach – using AI to design more accessible courses and helping students engage with AI critically – cultivates a more meaningful form of proficiency. It is about exercising judgement, knowing when to use the tool, how to question it and how to design around its limits.
Why this matters
Across disciplines and national contexts, graduates are entering workplaces where AI tools are already embedded. In some sectors, their use is expected. Whether our students become musicians, engineers, teachers or public servants, they will need to navigate AI systems thoughtfully. That means understanding what these systems can do, where they fall short and how to interrogate their outputs.
As educators, we can model the stance that AI is neither a threat to be banned nor a magic solution to be embraced uncritically. It is a tool. When used intentionally, it can clarify expectations, scaffold complex learning and expand meaningful choice. It begins with small, deliberate design decisions in our own courses: take one assignment and clarify it; structure one week transparently; offer more than one path to mastery.
If AI helps us do those things more effectively and efficiently, its most significant contribution to higher education may not be the content it generates, but the barriers it quietly removes.
Kim Loeffert is an assistant professor of music theory in the School of Performing Arts at Virginia Tech.

