
Teaching AI literacy: how to begin

Students urgently need to develop their AI literacy skills if they are to gain graduate-level jobs and help society tame the perils of the technology, write Christine O’Dea and Mike O’Dea


University of Huddersfield, York St John University
9 Jun 2023


To say artificial intelligence (AI) is developing quickly is an understatement. And its inexorable march is dramatically changing industry. BMW, for example, recently announced it will use generative AI to create the new design for the 8 Series Gran Coupe line; meanwhile Snapchat has just released a generative AI chatbot called My AI to provide more tailored recommendations to its users; and the financial company Bloomberg has developed BloombergGPT, a financial-focused AI model. Not only is AI influencing our daily lives, it’s also transforming the workplace our students will enter. For this reason, our graduates cannot fear AI but should understand it, so that we can, as a society, contain the worst excesses of the technology and harness its abilities for the benefit of all. Universities must provide AI literacy training to their students.

The shift from digital literacy to AI literacy

While the requirements for digital literacy are well established and comprehensive digital literacy frameworks, notably from Jisc, have been adopted, there is no dedicated AI literacy framework. Nor is the higher education sector in agreement on what one would encompass. However, the skills needed to deal with even the most common and widely used generative AI tools are underdeveloped in our current generation of graduates, and they urgently need to learn them.

You’ll probably be aware of the gender and health biases embedded in the datasets that generative AI tools have been trained on. But higher education must also consider ownership of generative AI outputs. Who owns the copyright of a generative AI response? And are such productions protected under intellectual property law? There are no simple answers to these questions yet, and no universal rule on intellectual property rights has emerged. Many countries are revising their IPR laws and regulations in response to generative AI, meaning users will need to consult the rules in each jurisdiction.

Graduates also need to be able to identify, and where possible correct or counteract, AI-generated falsehoods such as deepfakes, hallucinations and misinformation. Recent examples include an image of the pope in a Balenciaga-style puffa jacket and a false claim that an Australian mayor had been jailed for bribery.

Where to begin teaching AI literacy skills

Fact-checking 

Fact-checking AI outputs should become second nature to our graduates. It is important that graduates, and indeed society as a whole, understand that the output of AI must be authenticated.

  • Use case studies. High-profile instances of hallucinated or simply incorrect output being relied upon (for example, the US lawyer who used ChatGPT to prepare a filing against the Colombian airline Avianca, unaware that the cases it cited were not real) can be adopted in training or teaching. These real-life examples are key to showing the consequences of failing to fact-check outputs.
  • Use a hands-on approach. Students should be given opportunities to learn and apply fact-checking principles, such as Google’s expertise, authoritativeness and trustworthiness (E-A-T) principles, to AI-generated outputs. Students should also practise, and develop a habit of, using a fact-checking service such as Google Fact Check Explorer or ClaimBuster (for text), or Illuminarty or AI or Not (for images), to check content produced by generative AI; a minimal programmatic sketch follows this list.
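Fact-checking can also be demonstrated programmatically. The Python sketch below queries Google’s Fact Check Tools API (the public claims:search method) for published fact-checks matching a claim; the endpoint and response fields reflect our understanding of that API and should be verified against current documentation, and the GOOGLE_API_KEY environment variable is an assumed setup step.

```python
# Minimal sketch: look up published fact-checks for a claim via
# Google's Fact Check Tools API. Endpoint and fields are stated to
# the best of our knowledge; verify against current documentation.
import os
import requests

def check_claim(claim: str) -> list[dict]:
    """Return published fact-checks matching the claim text."""
    resp = requests.get(
        "https://factchecktools.googleapis.com/v1alpha1/claims:search",
        params={"query": claim, "key": os.environ["GOOGLE_API_KEY"]},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json().get("claims", [])

# Example: a claim students could test in class.
for result in check_claim("An Australian mayor was jailed for bribery"):
    for review in result.get("claimReview", []):
        print(review.get("publisher", {}).get("name"),
              "-", review.get("textualRating"))
```

In a hands-on session, students could run a handful of claims (one true, one hallucinated) through a script like this and compare the ratings with what a generative AI tool told them.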

Verification of outputs

Verification of outputs is closely related to fact-checking. Training and support can be provided via either curricular or co-curricular activities, such as tutor-led tutorials, in-class discussions, quizzes, debates and student-created newsletters or reports. Whichever options educators use, it is important to incorporate hands-on practice for students. The activities should focus on:

  1. Training students to become familiar with AI detectors, such as ZeroGPT, Copyleaks or Content at Scale, so they understand how these tools work and how to use them to verify AI-generated outputs.
  2. Supporting students to develop awareness that AI-generated text tends to be uniform and grammatically correct, with few, if any, typos, and that it rarely uses slang. The concept of “perplexity” can be applied here: perplexity measures how predictable a text is to a language model, so text with a low perplexity rating uses the most statistically likely word at each step and is more likely to be AI-generated (see the sketch after this list).
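To make “perplexity” concrete, the sketch below scores text with the openly available GPT-2 model via the Hugging Face transformers library. The model choice is an illustrative assumption, and no single perplexity threshold reliably separates human from AI text; this is a teaching aid, not a detector.

```python
# Minimal sketch: perplexity of a text under GPT-2. Lower perplexity
# means more predictable text, which *may* indicate AI generation;
# model choice is illustrative, and no threshold is authoritative.
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

def perplexity(text: str) -> float:
    """Perplexity = exp(mean negative log-likelihood per token)."""
    enc = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        # Passing labels makes the model return mean cross-entropy loss.
        loss = model(enc.input_ids, labels=enc.input_ids).loss
    return torch.exp(loss).item()

print(perplexity("The cat sat on the mat."))        # highly predictable
print(perplexity("Zebra quantum marmalade sings."))  # far less predictable
```

A useful classroom exercise is to have students score a paragraph they wrote themselves against one produced by a chatbot and discuss why the scores differ.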

Creating effective prompts

Creating effective prompts is equally important for students to learn. They must recognise that the quality of AI output depends on the input. One way to teach this is a flipped-classroom approach in which students watch a recorded lecture explaining the TAP (topic, action and parameters) principles for effective prompts, drawing on good examples and explanations provided by the educator. Students should be encouraged to read additional materials, such as blogs and textbooks, before class; a worked example of a TAP-structured prompt follows.
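As an illustration of the TAP structure, the short Python helper below assembles a prompt from a topic, an action and parameters. The template wording and example values are our own hypothetical teaching aid, not a prescribed format.

```python
# Illustrative sketch: build a prompt using the TAP structure
# (topic, action, parameters). The template and example values are
# hypothetical teaching aids, not a prescribed format.
def tap_prompt(topic: str, action: str, parameters: str) -> str:
    return f"Topic: {topic}\nAction: {action}\nParameters: {parameters}"

print(tap_prompt(
    topic="UK rail freight logistics",
    action="Summarise the three biggest operational challenges",
    parameters="Under 200 words, plain English, flag any claim "
               "that would need fact-checking",
))
```

In class, students can vary one element at a time (say, tightening the parameters) and compare how the AI’s output changes, which makes the input-output dependency tangible.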

During the timetabled sessions, students can be given hands-on opportunities to try out prompts and analyse the outputs, either individually or in groups of three or four.

The ability to evaluate AI outputs is extremely relevant to 21st-century living. We already teach our students how to think critically and be sceptical, but these skills need to expand to include how to verify, validate and cross-check data and “facts”. These capacities should become second nature to all graduates. So too should the mindset of continually monitoring and updating their skills, in recognition that AI is and will be developing constantly. Not keeping on top of AI literacy risks disadvantaging both our graduates and our society.

Xianghan (Christine) O’Dea is subject group leader (teaching and learning) in logistics, transport, operations and analytics at Huddersfield Business School, University of Huddersfield.

Mike O’Dea is senior lecturer in computer science at the School of Science, Technology and Health at York St John University.

