AI has made its way into the classroom — and with it, concerns from teachers about student apathy.
“Some of the ones that I see using it all the time — I think if it wasn’t there, they would just sit there looking blindly into space,” Gary Ward, a physics, economics, and business teacher at Brookes Westshore High School in Victoria, British Columbia, told Business Insider.
Since the release of ChatGPT in 2022 and the mass adoption of generative AI tools, concerns about academic plagiarism have multiplied. Educators have found themselves needing to react quickly, adapting their curricula to embrace or counter a technology with tremendous potential to be both a teaching aid and a “homework cheating machine.”
Ward, who’s been a teacher for about thirty years, said that he’s noticed student usage of AI increase in increments — until this year, when it just “exploded.”
“Literally, all students are using it this year,” he said. To prevent students from gaming his assignments with artificial intelligence, Ward said he’s begun to use it defensively. He’s asked ChatGPT to help him develop work that would be harder for anyone completing it to feed back into an LLM.
“I just started it with a conversation in ChatGPT, and sort of iteratively went through — explained in my prompt what was happening, and said, ‘This is what I want,'” Ward said. “It told me, ‘These are things you can do to make it harder for students to be able to just answer with some large language model.’ And typically, it’s making it more personalized.”
At Manchester Metropolitan University in Manchester, England, Richard Griffin — a lecturer in the business faculty specializing in project management and portfolio development — says a similar strategy is underway.
The university has developed an in-house system that educators can feed their assignments into. It then assesses how easy each assignment would be to complete with AI and offers recommendations for making it more difficult.
“The IT department have done their own tool which assesses how AI safe it is, or AI savvy it is, and will give you a bit of a grade to say, ‘Well, really, you will need to adjust some of this,'” Griffin said. “It doesn’t give us specific information, but it does give you a bit of a scroll to say, No, this isn’t very safe. You need to add some deeper challenges here, or you need to make this more personal, etcetera.”
A shift back toward analog assignments
The best defense against AI so far, according to Ward, is to spin back the clock a couple of decades.
“I’ve tried to sort of shift back toward some handwritten assignments, instead of having them do it on the computer,” Ward said. “That way, I can tell this is how they’re writing. I know it’s theirs.”
Even if Ward can’t go analog for all the coursework he assigns, handwritten work at least gives him a baseline for each student’s writing, making it easier to tell when future assignments are produced synthetically.
“Now, yeah, it’s expensive and it takes a lot of time to grade them, but I think that needs to continue,” he added.
The goal of a classroom is generally to empower students with foundational skills — proficiency in research, deep thought, and comprehension, to name a few. By substituting the typical processes of studying with seeking out AI answers, many students are no longer meeting those benchmarks, said Paul Shockley, an assistant professor at Stephen F. Austin State University in Nacogdoches, Texas.
“Many students today are using AI as a way of fulfilling their assignments, and it is creating a loss of critical thinking, a loss of originality, a loss of discernment, a loss of personal reflection, and so on,” said Shockley, who primarily teaches courses in philosophy and religious studies.
Shockley, for his part, was an early adopter of AI and was experimenting with the capabilities of LLMs soon after the launch of ChatGPT. He expects the technology not only to endure long-term but to improve exponentially, and he came to believe it was crucial to help students build a healthy relationship with it.
“My mindset on the topic, since AI has emerged, has shifted, moved like a pendulum from fascination to fear, given how it may be used,” Shockley said. “But my fascination with AI is rooted in what it may be able to ameliorate, ameliorate things in the energy sector, industry, natural environment, medicine, science, person-centered care, but I decided that I would be open to using AI and my pedagogy in a Socratic approach.”
Originally, he developed an assignment for his undergraduate courses in philosophy and religious studies that encouraged students to converse with an LLM and analyze its output. He hoped that students would not only learn how to ask smarter questions but also develop a healthy skepticism of artificial intelligence.
He has since discontinued the assignment and no longer allows any use of AI in lower-level classes. Too many students, he found, used it to outright cheat — including one instance in which he said a student submitted a paper that cited a hallucinated quote from a book Shockley co-authored.
“The use of AI in the classroom for me as a philosopher is limited to inquiry among senior-level students doing research where they have maturity,” Shockley said. “They have the chance to grow and so, and become equipped with critical thinking skills for themselves.”
Some assignments are naturally more AI-resistant
Though Shockley still assigns research papers, he also tries to deploy “experiential” assignments whenever possible. For instance, in undergraduate environmental ethics and religious studies courses, Shockley has sent students out to visit local nature spots or religious sites.
He hopes to engage students, he added, by “hooking” them — connecting them more personally to the subject matter that they’ll eventually interact with in more traditional ways. That way, they may be less likely to turn to AI to complete their work. Additionally, he’s begun to attach reflective components to any assignments that could likely be gamed by AI on their own.
“What is it that students want? What is it that people want to experience these days?” he said. “What is it that young people want to experience these days, right? They want to have phenomenal experiences, you know, transformative experiences, cool experiences, and so, how can I harmonize those things together?”
Generally, certain disciplines are more insulated against AI cheating because they lend themselves better to project-based assignments. In Griffin’s case, many of the business courses he teaches require interaction with a real-world client.
“We’re challenging them with quite difficult tasks out in the real world to deliver projects for clients, you know, and there’s a huge variety of expectation and understanding, both from the clients’ perspective, but also from our sort of undergrads as well,” Griffin said.
Much like Shockley, Griffin is focusing on incorporating reflection into his curriculum, hoping that the layered steps will prompt deeper thinking.
“I’m using projects and portfolios, so people are out in the real world. We’re also relying very much on reflective aspects of that,” Griffin said. “So they’ll deliver a project with a client. If you’re going to use AI and tell the client some really tough information, they’re not going to be particularly happy.
“And then that reflective element means that they really have to delve deeper and give us some honesty, which wouldn’t normally be there in normal sort of assignments or assessments,” he added.
A shift toward oral assessments and discussion-based assignments, Griffin said, is also likely as AI continues to develop.
“So assessments, I don’t know whether I’d say they’re going to become harder,” he said. “They’ll certainly become more focused. I think we need to accept that. We maybe can’t teach as broad a topic as we’d like to, but we can certainly teach criticality.”