Students at the University of Staffordshire have accused the institution of depriving them of a meaningful learning experience after discovering that large parts of their course had been taught using AI-generated material. They say the discovery left them feeling frustrated, misled and worried about their future careers.
The concerns centre on a coding module taken last year by 41 students enrolled on a government-funded apprenticeship scheme. Many had joined the programme to start new careers in cybersecurity or software engineering, expecting hands-on teaching and expert guidance. Instead, they found slides written by AI, voiceovers produced by synthetic voices and lessons that felt thin on substance.
Signs something was not right
Students James and Owen say they noticed the issues almost immediately. During the first lesson, the lecturer played a PowerPoint presentation narrated by what appeared to be an AI version of his own voice. Over time, other clues emerged, including American spellings awkwardly changed to British English, references to US laws in a UK course and file names that looked auto-generated.
In one video uploaded this year, the voiceover suddenly switched into a Spanish accent for half a minute before reverting to a British one. When two separate AI-detection tools were used to scan course documents, they flagged several assignments and presentations as very likely to have been machine-generated.
Students repeatedly raised their concerns with lecturers and university officials. In a recorded session, James asked the lecturer to abandon the slides entirely, saying everyone knew they were AI-generated. Another student remarked that only a small proportion of the content was actually useful, adding that they could have found the same information by asking ChatGPT themselves.
University defends use of AI
Despite the complaints, the university continued to use AI-produced materials. Earlier this year it posted a new policy statement to the course website that appeared to justify its approach, describing a framework for academics using AI automation in teaching and research.
This sat uncomfortably alongside the university’s rules for students, which warn that passing off AI-generated work as one’s own breaches academic integrity and may lead to penalties.
The university has insisted that academic standards were maintained. It said AI tools may support the preparation of materials but do not replace expert teaching and must be used responsibly.
Frustration and a sense of lost time
For many students, the reassurances came too late. James said he feared he had wasted two years of his life and no longer trusted those running the programme. Owen, who is changing careers, said he joined to gain real knowledge rather than just a qualification, and found the experience demoralising.
While the university did arrange for two human lecturers to deliver the final session, students felt this small gesture did not address the wider issue. They say they have not received compensation or meaningful redress for the quality of the teaching.
As AI spreads through education, the Staffordshire case highlights the tension between institutional innovation and student expectations. For these students, the promise of a new career has been overshadowed by the feeling that their education was delivered on the cheap.