
Seven steps to make an effective course quality evaluation instrument

Tools such as rubrics and checklists are increasingly common for monitoring the quality of courses, so how do we choose the best one for our purposes?

Richard McInnes
24 Mar 2023

Created in partnership with The University of Adelaide


Course quality evaluation instruments (CQEIs) package objective criteria to practicably and consistently determine the relative strengths and weaknesses of online courses – and thus indicate “quality”. Indicative of their growing popularity, there are now well over 100 such instruments, from the well-established Quality Matters to national, local and institution-specific instruments. CQEIs generally list groups of assessable criteria (for example, learning design, assessment, technology, social interaction and accessibility) against which quality is assessed; they can also include elaboration and resources illustrating best practices.  

But why do so many different instruments exist? Well, the most effective instruments are tailored to a specific context, delivery mode and cohort of learners. This leads many institutions to create or modify an instrument to suit their specific needs. This can enable them to set their own benchmark for quality or target areas where they seek to improve. By contrast, when instruments are not contextualised, their validity for specific individuals and tasks is reduced or, worse, they could be used inappropriately to satisfy performance and quality agendas.

We believe high-quality instruments, put to use well, can deliver real benefits for students and their teachers. So how do we assess the quality of the instrument itself? This article provides practical guidance on selecting or modifying the best instrument for your purposes, whether you are an individual teaching academic or an educational leader.

We have just completed a scoping review examining 75 course quality evaluation instruments, and we like to think we have learned a thing or two along the way. So, what advice would we give to those looking to choose, develop or update their own course quality evaluation instruments? Well, here are our seven top tips:

1. Contextualise the instrument

The reason that so many of these instruments exist is that each of our contexts is different. Therefore, try not to use an “off-the-shelf” instrument as is, and instead tailor the best one to your unique circumstances. For example, adopt or create criteria that are specific to your institution or student cohort, or you could trial an instrument as part of your usual evaluation processes, enabling you to make more targeted assessments and interventions (and, over time, improve your instrument).

2. Make it research-based

Try to avoid simply identifying things you personally like. Instead, use scholarly literature to confirm best practices and indicators of “quality”. See, for example, how Quality Matters blends the literature with best practices from educators and makes this visible to users.

3. Clearly articulate criteria

A key challenge for an effective CQEI is making sure it is interpreted consistently. Workshop your criteria with critical friends until you are confident your wording is clear. You might also want to provide more in-depth explanations of the criteria with examples. Have a look at OSCQR to see how this can be done effectively, or alternatively, look at the TELAS reviewer-training programme.

4. Include ways to improve

It is all very well for an instrument to tell someone that an element of their course is not of the requisite quality, but what can they do about it? Choose an instrument that provides improvement strategies – or write your own. Include examples or strategies that can help someone improve specific elements of their course. These strategies should ideally be simple and straightforward, to increase the chances of their being implemented. Again, OSCQR is a publicly available exemplar of practice in this area.

5. Give opportunities for progression

To ensure your instrument is inspiring and authentic (and not just a tick-box exercise), make sure it shows the difference between minimum standards and more exemplary practices. This can help users track their progress as they embark on the process of course improvement. For example, at the University of Central Florida, the instrument designates different levels of quality for courses.

6. Integrate the instrument into professional development

No one likes it when the flaws in their course are exposed, especially when there isn’t help on hand. So, where possible, try to integrate the instrument into existing professional development programmes, thus upskilling educators in a more holistic manner (see, for example, the supporting course enhancement sessions at California State University, Long Beach). Whether you are using this instrument to inform your own teaching practices, or you are in a leadership position, simply completing the checklist or rubric is not really sufficient. Consider how the instrument can be used to advance skills as well as reflective and evaluative judgements – for example including its use in teacher peer review practices.

7. Treat your CQEI as a living document

The criteria you select for your instrument represent a point in time. To be truly useful, CQEIs need to be responsive to change – we need only consider the unfolding impact of artificial intelligence to recognise this. Be open to reviewing your criteria, engage representatives from all your stakeholder groups and update your document regularly.

In conclusion, CQEIs are here to stay. They provide consistent and reliable measures of quality that can suit the needs of individual users, institutions and national bodies. To make sure these tools work for you – not against you – we recommend you become involved in creating and/or modifying the tools and their use.

Richard McInnes is a learning designer and product lead at the University of Adelaide, Australia.

Joshua Cramp is an online educational designer, Claire Aitchison is an academic developer, Katherine Baldock is associate director: teaching innovation unit, Kerry Johnson is an online educational designer and James Hobson is an online educational designer, all at the University of South Australia.

If you found this interesting and want advice and insight from academics and university staff delivered direct to your inbox each week, sign up for the THE Campus newsletter.

