Confronting challenges to academic integrity in the age of AI
Institutions must strike a balance between using AI to enhance education and preserving confidence in the value of academic qualifications
University staff and students have had around 18 months to get used to the rise of generative AI and its impact on teaching and assessment. A Times Higher Education webinar, held in partnership with Turnitin, brought together experts who shared their insights on navigating challenges to academic integrity in the age of breakthrough technologies such as AI.
Patti West-Smith, senior director of customer engagement at Turnitin, explained that the space is evolving so quickly that it’s crucial for educators and technology companies alike to dedicate time to keep up with technological advancements. “The best thing is to have educators who are using these tools themselves so they understand better what they can and cannot do,” said West-Smith. “We cannot assume that people are on the same page. In particular, we cannot assume that students are on the same page as educators.” Setting out clear expectations of what is and isn’t acceptable sits at the heart of these issues, West-Smith advised.
The Rennes School of Business updated its academic integrity policy in 2023 and the first step in the process was to define the responsible use of any tool in assignments and assessments, said Viatcheslav Dmitriev, associate dean for faculty relations at the institution. “What is most relevant is that it does not undermine the goals of education. You need to define what the responsibilities are at the school level, staff and department level, and student level,” he explained.
However, addressing the use and misuse of generative AI by students can be overwhelming. Staff members at many institutions lack training, meaning they are ill-equipped to handle incidents in which a detection tool suggests AI may have been used. “There is some training and education needed for faculty who feel scared or threatened,” said Leslie Layne, coordinator of college writing at the University of Lynchburg. “I think this will take some investment from universities.”
The academic community should share its lessons learned to help educators during this period of change, suggested Mary Davis, professor of education and student experience at Oxford Brookes Business School. The university runs drop-in events and peer mentoring opportunities around this issue. “The guidance we need to give students is about their practice. It’s not about the tools. Responsible use is also about ethical decision-making,” said Davis.
Tricia Bertram Gallant, director of the Academic Integrity Office at the University of California San Diego, said that institutions sometimes prioritise identifying the use of AI in assessments over assessing whether learning outcomes have been met. “We need to think about what human, durable skills students will need and what they need to learn before we allow them to offload to AI.” For many faculties, this would require redesigning assessments to focus on different competencies, which “ensure we graduate human beings and not cyborgs”, she added.
Davis concluded that navigating new norms could be a chance for educators to enrich the tutor-student relationship and learn more about where AI can add value. Educators can use the time freed up by using AI tools for face-to-face interaction with students. “We can turn AI on its head and value the human more. [This applies to] the student-tutor relationship too – this could be an opportunity for us to reprioritise that relationship,” Davis said.
The panel:
- Mary Davis, professor of education and student experience, Oxford Brookes Business School
- Viatcheslav Dmitriev, associate dean for faculty relations, Rennes School of Business
- Tricia Bertram Gallant, director of the Academic Integrity Office, University of California San Diego
- Alistair Lawrence, head of branded content, Times Higher Education (chair)
- Leslie Layne, coordinator of college writing, University of Lynchburg
- Patti West-Smith, senior director of customer engagement, Turnitin
Resources:
- The UK Quality Assurance Agency for Higher Education's advice and resources on generative AI
- Jisc resource page on AI
- UNESCO's guidance for generative AI in education and research
- UK's AI Standards Hub
- Download “Crafting your GenAI and AI policy: a guide for instructors” by Tricia Bertram Gallant from UC San Diego at the top of the article
Explore more insights from events held in partnership with Turnitin here.
Find out more about Turnitin.