
Original essays written in seconds: how ‘transformers’ will change assessment

Artificial intelligence that generates original text will be both empowering and disruptive for students and higher education at large, say Mike Sharples and Rafael Pérez y Pérez


The Open University, Metropolitan Autonomous University
4 Jul 2022

Every few years a technology comes along that its proponents claim will transform education. Now there is another one, handily called a “transformer”. Unlike systems to personalise teaching or deliver content, this is a tool for students rather than educators or administrators. It costs pennies to use, is available online now, and it generates essays and assignments.

The most widely known artificial intelligence (AI) transformer is GPT-3 (Generative Pre-trained Transformer 3) from OpenAI. GPT-3 works like a superpowered text completer of the kind you have on your mobile phone. Instead of looking back at the last few characters and using these to predict the next words, it can output an entire short story, blog or student essay. It can also summarise a scientific article in simpler language, write a review, translate languages and answer general questions. In short, a transformer is a general-purpose language machine. But what will be its impact on education?

Generating student essays 

The most immediate impact is that students will use AI transformer software to generate their written assignments. The student registers at the OpenAI site, goes to its “playground”, types the title and opening few words, presses “submit”, and in a few seconds the program has produced the essay, at a cost of about 6 US cents (5p in the UK) for 750 words. All that is possible now. Some companies are already promoting their own AI writing tools to students based on GPT-3.

As a test, we gave GPT-3 the prompt “The construct of ‘learning styles’ is problematic because” and it produced 400 words on the topic of learning styles, in grammatical English, including a citation and reference. Here is a short extract from the text it generated:

“In a review of this research, Dunn and Dunn (1997) found that the evidence does not support the idea that people have a particular learning mode that is dominant or that they process information in a particular way. In their study, the authors found that when students were given a particular instructional treatment, they could be classified in a particular learning modality category. But when they were given a different instructional treatment, they were classified in a different category. In a follow-up study in which they gave students a number of instructional treatments, Dunn and Dunn found that students could be placed in a particular category in some treatments, but not others.”

There are a few points to note. The complete essay is well structured, with an introduction, background, example and conclusion. The text is original, not copied, so plagiarism detectors will not flag it. The example includes a citation along with a reference at the end to a journal paper by Dunn & Dunn. These are entirely false. Dunn and Dunn are real researchers who devised a learning style inventory, but they published no study of learning styles in 1997.
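The same test can be run through OpenAI’s API rather than the playground. Here is a minimal sketch, assuming the openai Python package as it stood in 2022 and a GPT-3 completion model such as text-davinci-002; the model name, token limit and temperature are illustrative choices, not a prescription:

```python
# Sketch: generating an essay-style completion from a prompt via the OpenAI
# Completion endpoint (2022-era openai package). Model and parameters are
# illustrative; the playground exposes the same settings.
import openai

openai.api_key = "YOUR_API_KEY"  # or set the OPENAI_API_KEY environment variable

response = openai.Completion.create(
    model="text-davinci-002",
    prompt="The construct of 'learning styles' is problematic because",
    max_tokens=600,       # roughly the 400 words of output we received
    temperature=0.7,      # moderate randomness in word choice
)

essay_text = response.choices[0].text
print(essay_text)
```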

Amoral systems 

This raises a fundamental issue with AI transformer programs. They are hugely proficient wordsmiths but are essentially amoral. They give the appearance of knowledge but have no ability to reflect on what they have written or check whether their output is decent, accurate and honest. In every sense, they do not care. In the hands of students such programs will be both empowering and dangerous.

If AI generators make composing easier and let students focus more on structure and argument, that may be to the good. But the danger is that students will just let the AI take over composing, churning out plausible nonsense that they submit without scrutiny.

Just as a student can generate an essay in seconds with a transformer, so a teacher can assess it. We added the prompt “Here is a short assessment of this student essay:”, pressed “submit” and it wrote a 100-word academic response. This conjures a nightmare vision of students generating assignments with the aid of AI, which teachers then assess through AI. Nobody learns, nobody gains.
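Scripted, that marking step is just one more call to the same endpoint, with the generated essay prepended to the assessment prompt. Again, a sketch under the same assumptions as before:

```python
# Sketch of the assessment step: the generated essay (essay_text, from the
# earlier sketch) is fed back to the model with the marking prompt appended.
# Assumes openai.api_key is already set as above.
import openai

assessment = openai.Completion.create(
    model="text-davinci-002",
    prompt=essay_text + "\n\nHere is a short assessment of this student essay:",
    max_tokens=150,       # roughly the 100-word response we received
    temperature=0.7,
)
print(assessment.choices[0].text)
```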

Response from higher education 

As with other technologies that empower students, such as mobile phones, Wikipedia and YouTube, the first response from higher education to AI transformers will be to ignore them – transformers have been around since 2019, making scarcely a dent in university education. Then, as students submit assignments generated by AI, universities will set up countermeasures. These are unlikely to be successful. Any software sufficiently powerful to detect computer-generated text will also be powerful enough to evade detection, especially if the student does some light editing to improve the text’s meaning and coherence. Universities may then be forced to adapt by running costly invigilated and oral exams.

A few institutions may reach beyond such knee-jerk reactions to rethink assessment and harness creative AI for learning. As educators, if we are setting students assignments that can be answered by AI transformers that lack self-awareness, are we really helping students learn?

There are many better ways to assess for learning, such as constructive feedback, peer assessment, reflective practice and teachback. In a class on academic writing, transformers could show students different ways to express ideas and structure assignments. A teacher can run a classroom exercise to generate a few assignments on a topic, then get students to critique them and write their own better versions. Writing with creative machines may soon become as natural to students as processing words on a screen.

Transformer technology is a step towards a new kind of human-machine creativity where writers and artists collaborate with machines to produce interactive stories, images, films and games. How higher education manages this transition will show how it prospers in a hybrid world of real and artificial experience.

Mike Sharples is emeritus professor of educational technology at the Open University.

Rafael Pérez y Pérez is a full professor at Metropolitan Autonomous University at Cuajimalpa, Mexico City.

They are authors of Story Machines: How Computers Have Become Creative Writers, to be published by Routledge on 5 July 2022.

