Imagine dedicating years of your life to a course that promises to launch your dream career, only to discover it’s largely taught by an AI. That’s the harsh reality for students at the University of Staffordshire, who feel betrayed and shortchanged after enrolling in a government-funded apprenticeship program. James and Owen, two of the 41 students in the coding module, had hoped to transition into cybersecurity or software engineering. Instead, they found themselves staring at AI-generated slides, sometimes read aloud by an AI voiceover. The irony is hard to miss: while students are strictly prohibited from using AI in their work, the university seems to have no qualms about relying on it to teach them.
James, frustrated and disillusioned, expressed his anger during a recorded confrontation with his lecturer in October 2024. ‘If we handed in AI-generated work, we’d be kicked out,’ he said. ‘But we’re being taught by an AI.’ His sentiment echoes that of many students who feel ‘robbed of knowledge and enjoyment.’ Despite repeated complaints, the university appears unmoved, even publishing a policy statement justifying AI use in teaching. The same institution that restricts students’ AI use is actively promoting it as a tool for academics, a glaring double standard.
The issue isn’t isolated to Staffordshire. Across the UK and beyond, universities are increasingly turning to AI for teaching, course material creation, and personalized feedback. A Department for Education policy paper hailed generative AI as a ‘transformative’ force in education, and a Jisc survey found nearly a quarter of UK educators already using AI tools. But for students, the reality is far less inspiring. In the US, students are slamming professors who rely on AI, while in the UK, undergraduates are taking to Reddit to vent about lecturers copying ChatGPT feedback or using AI-generated images. One student summed it up: ‘I understand the pressures on lecturers, but it just feels disheartening.’
James and Owen spotted the AI red flags early on. During their first class, the lecturer played a PowerPoint presentation with an AI-generated version of his voice. Soon, they noticed other telltale signs: inconsistent shifts between American and British English, suspicious file names, and generic content that occasionally referenced US legislation. Even this year, the AI blunders continued: in one course video, the voiceover inexplicably switched to a Spanish accent for 30 seconds before reverting to British English. And when The Guardian ran the course materials through AI detectors, the tools flagged a ‘very high likelihood’ of AI-generated content.
James didn’t stay silent. He raised his concerns with the student representative and later confronted the lecturer during a recorded session. ‘I know these slides are AI-generated,’ he said. ‘I’d rather you just scrap them. I don’t want to be taught by GPT.’ Another student chimed in, pointing out the lack of substance: ‘There’s maybe 5% useful information, and a lot of repetition. We could get the same stuff by asking ChatGPT.’ The lecturer’s response? An uncomfortable laugh and a quick change of subject to another tutorial he’d created, also using ChatGPT.
While the university eventually brought in a human lecturer for the final session, James and Owen felt it was too little, too late. ‘I feel like a bit of my life was stolen,’ James said. Owen, in the midst of a career change, added, ‘This material isn’t worth anyone’s time. It’s frustrating when you could be engaging with something worthwhile.’
The University of Staffordshire defended its practices, claiming ‘academic standards and learning outcomes were maintained.’ But the question remains: Is it ethical for universities to rely on AI for teaching while penalizing students for the same behavior? And more importantly, what does this mean for the future of education? Are we sacrificing quality and human connection for efficiency? Let us know your thoughts in the comments—this is a debate that’s far from over.