‘Small changes in assessment design can make thinking visible’

Many concerns about the impact of artificial intelligence on academic integrity and authorship have focused on controlling the tool rather than reconsidering the task, writes Nicole Brownlie. Here, she offers a shift to assess process, not just answers
7 May 2026
Image credit: jacquesdurocher/iStock.


When assessment focuses only on a final submission, educators must infer how that work was produced. Generative artificial intelligence (GenAI) has made this flaw in how we measure learning visible. The fundamental issue is not that students can use chatbots to generate answers. It is that we have designed assessments in which answers are the only evidence we examine.

Detection does not solve the underlying problem either, despite universities’ heavy investment. If we want to maintain academic standards in an AI-influenced environment, the solution is not tighter policing. It is better assessment design.

Make thinking the evidence

So, instead of treating AI use as something to avoid or detect, I redesigned a major assessment task in a first-year teacher education course to require it. Students used GenAI to explore how learning takes place and submitted the full transcript of one continuous interaction alongside a short written response. The transcript was not supplementary. It was the assessment.

Students were not rewarded for producing a polished answer in one attempt. They were assessed on how they worked with the tool:

  • how they questioned weak explanations 
  • how they refined ideas using learning theory 
  • how they recognised limitations and gaps 
  • how they moved towards a more integrated understanding. 

The final written response mattered, but it was secondary. The primary evidence of learning was the process.

This design shift changed the nature of both the task and how I assessed it. Instead of asking: “Did the student write this?”, the question became: “How was the student thinking?”

Shift from performance to reasoning

This approach can be extended. In another first-year assessment, students engage in short teaching interactions with a chatbot and annotate key moments to explain their decisions. The interactions are not graded. Instead, students are assessed on how they justify their choices using developmental theory, what learner characteristics they are responding to, and why their approach supports learning at that stage. The focus is not on producing a perfect performance but on making professional reasoning explicit.

Across both tasks, the same principle holds: we are not assessing what students produce in isolation. We are assessing the decisions that led to it.

This matters, because judgement is at the core of academic work. As I have argued previously, GenAI does not remove the need for judgement, authorship or care. It makes them all more visible and necessary. 

Why this approach to assessment works

When thinking is visible, several things change:

  • Academic integrity becomes easier to judge. Instead of trying to detect whether a final product is authentic, we can see how ideas were developed and refined.
  • The role of AI becomes clearer. It is no longer a shortcut to an answer, but a tool within a broader process of reasoning. 
  • The quality of learning improves. Students are required to engage, question and make decisions. They cannot rely on a single prompt or response.

This does not eliminate every grading challenge. But it shifts the focus to the quality of thinking, not the appearance of independence.

What about marking time?

A common concern is that process-based assessment will take too long to mark. If students submit transcripts or annotations, it can seem like there is more to read.

In practice, this has not been the case. You are no longer searching for evidence of learning. It is already visible. Instead of inferring how a final product was produced, you see where ideas were challenged and refined. You are not reading everything in detail. You are scanning for moments of decision-making: where a student questions an explanation, draws on theory or revises their thinking. These are easier to identify in a transcript or annotated task than in a polished submission.

Clear criteria also matter. When assessment focuses on reasoning and judgement, markers can make faster, more confident decisions. The work either shows evidence of thinking, or it does not.

This does not remove workload pressures. But it shifts effort towards what we value and makes that effort more efficient and defensible.

What this looks like in practice

Redesigning assessment does not require a complete overhaul. Small changes can make thinking visible. You might:

  • require submission of interaction histories or drafts 
  • ask students to annotate key decisions
  • assess how ideas were developed, not just the final claim 
  • explicitly integrate AI into the task. 

The key is to ensure that students must show how they arrived at their work, not just present it.

Keep judgement at the centre

The rapid rise of GenAI has prompted concern about standards, integrity and authorship. But many responses have focused on controlling the tool rather than reconsidering the task. We cannot return to a world before AI tools were readily available. Nor can we rely on detection to preserve standards. We need to design assessment that reflects what we value.

If we value critical thinking, thinking must be visible.

If we value judgement, judgement must be evidenced.

If we value learning, the process must count.

The future of assessment is not AI-proof. It is thinking-visible.

Nicole Brownlie is a lecturer in teacher education at the University of Southern Queensland.

