Why ChatGPT is an opportunity for schools


Date published: 28 February 2023

Dr Matthew Glanville, Head of Assessment Principles and Practice

Source: Why ChatGPT is an opportunity for schools | The Times

Those of us who work in the school or exam sector should not be terrified by ChatGPT and the rise of AI software – we should be excited. We should embrace it as an extraordinary opportunity.

Contrary to some stark warnings, it is not the end of exams, nor even a huge threat to coursework, but it does bring into very sharp focus the impact that artificial intelligence software that can write sophisticated responses could have on the way we think about teaching, learning and assessment.

We should not think of this extraordinary new technology as a threat. Like spell-checkers, translation software and calculators, we must accept that it is going to become part of our everyday lives, and so we must adapt and transform education so students can use these new AI tools ethically and effectively.

The International Baccalaureate, where I am head of assessment principles and practice, has decided it is not “banning” the use of ChatGPT or any similar AI software as has been seen elsewhere; this is the wrong way to deal with innovation.

Students will, however, need to be made aware that we do not regard work written by such tools to be their own. To submit AI-generated work as their own is an act of academic misconduct and would have consequences. But that is not the same as banning its use.

In truth, many of the issues thrown up by ChatGPT are extensions or variations of current issues that the IB is familiar with managing, even if these technologies are significantly different in terms of speed, ease of access and scale.

For example, the risk of students getting someone else to write their work for them is familiar. For many years teachers and the IB have been dealing with essays bought from the internet (from so-called “essay mills”), completed by external tutors or even by family members. We counter this in many ways, including the fact that all IB coursework requires regular check-in meetings between students and teachers. These give teachers an opportunity to ask students about their ideas and to have them expand on their arguments, ensuring that the submitted work is a true reflection of what they understand.

But we are also hugely excited by the prospect of exploring the enormous educational opportunities that this software has created.

If AI – in the form of ChatGPT and its inevitably more powerful descendants – is indeed to be routinely used in everyday life around the world, then it will raise a series of fascinating questions about what essential skills and knowledge students will need that we cannot afford to ignore.

These include the ability to evaluate AI-produced essays and the ability to refine the questions being asked of the bot. A common theme in current commentary on ChatGPT is the need to learn how to ask the right question to get the answer you want.

Students must also be able to identify and address bias. All AI-produced work is based on the information the system has “learnt”, particularly from today’s internet, which is heavily shaped by the biases of its human authors. Students need to understand that AI will inherit the biases and blind spots of its programmers or, in the case of self-learning systems, of its source material.

Finally, in an AI-informed world, young people will need to be able to think around problems and be creative rather than seeking simple answers or following a routine process. The AI tools will do the latter quicker and more effectively, while the former is where humans can excel.

Let’s imagine a real-life scenario of how AI might immediately be used in the classroom. How about using AI to provide example work for students to evaluate and criticise? Many teachers find asking students to mark examples of work an effective teaching technique. Using AI addresses many of the ethical and practical problems associated with gathering example work for this classroom activity.

The teacher would, of course, need to explain why it is ethical for them to use the AI tools in this way but not for the student to use them to write their work – but this in itself could be an interesting lesson in ethics. The point is this: if an AI program can indeed convincingly answer an exam question in the style of an 18-year-old student, why not take advantage of that fact in today’s teaching and learning?

Ultimately what AI is likely to mean in the longer term is that we spend less time teaching the mechanics of essay-writing or communication and more on understanding, describing and analysing problems. This is something we can celebrate rather than fear.

Matt Glanville is head of assessment principles and practice at the International Baccalaureate.