Lecturers hold different perspectives on generative AI, ranging from ‘what great possibilities’ to ‘I don't like it and I want nothing to do with it’. What are these perspectives based on? Is it fear of the unknown, or at least discomfort with letting go of the familiar, as when the online search engine, Wikipedia, or the mobile phone first appeared? Or is it that the pride of doing research, thinking critically, and carefully choosing words to convey a message, a craft in itself, can now be outsourced to an algorithm, undermining the craftsmanship of the researcher? Are we anxiously trying not to become redundant?
Many lecturers and researchers are searching for a way to deal with this on a daily basis. Students use GenAI as a search engine, as a second opinion, or to ask questions when we are not available. And sometimes they outsource writing assignments or project work. The lecturer sees this, or in many cases suspects it, during lectures and in assessments.
At our university, the assumption is that the current EER is sufficient, because using GenAI essentially amounts to presenting someone else's work as one's own. So the lecturer is caught in a bind: more teaching, less time, and meanwhile trying to deliver ‘good’ education; having to justify assessment with rubrics, assessment matrices, and so on, while at the same time sensing that a student has not quite mastered what the programme requires. And I think most, if not all, lecturers sometimes wonder: if I give a pass, am I rewarding the student's knowledge and skills in the subject, or their effective prompting of a generative language model?
An example, probably recognisable to many: when reviewing products, there is doubt as to whether a student actually understands what has been written. After some questioning, that doubt seems justified. Still, the use of GenAI is difficult to prove, because GenAI does not produce the same piece of text twice. Moreover, with the right prompt, it can deliberately produce highly credible errors at the student's level. Non-existent references used to be a clear sign of GenAI, but that is no longer the case today.
So, for now, there is no central policy. We do have an educational vision, which describes teachers as experts, as coaches, and as team members. That expert role and coach role we will presumably share more and more with GenAI, at least in the eyes of students, who often use GenAI while studying. We are simply not available 24/7, and even if we were, we have so many other tasks that giving individual attention to every student in every course would be impossible. So, somewhat ironically, we have a new colleague who brings a great deal of expertise. Unfortunately, this new ‘colleague’ is also sometimes a bit confused, and we do have to ask exactly the right questions to benefit from that expertise.
As coaches, we guide students in asking the right questions and looking critically at the answers. But we also guide them in taking a different perspective on all kinds of ethical and moral issues around AI. At first glance it seems free, but if we zoom out, AI is very expensive: the climate impact, the self-reinforcing inequality through the continuous reproduction of biases such as gender or skin colour, and the untraceable theft of intellectual property, to name a few. If we don't want to accept that price blindly, we should at least be aware of it.
Then there are reflexive challenges. As with the rise of the calculator and the online search engine, we as teachers need to come together to have the conversation about learning outcomes and the function of the study programme. What does our degree stand for? What do we consider important, and why do we want a student to learn certain things? And in doing so, let us also be critical of ourselves. Are we looking through the lens of our own student days, in a world without AI, perhaps even without the internet? How does that compare with what working life and society will demand in five to ten years' time? That, after all, is where our students will end up.
Ultimately, this leads back to questions of meaning. What will our role be, and how do we make a meaningful impact when AI keeps our economy running and we only have to monitor it, or not even that anymore? I was talking to a student about this last week. Without hesitation came the answer: ‘Love and compassion, that's what we are all looking for anyway.’ That's a silver lining. Lecturers are certainly not becoming redundant. They are only becoming more important.