I’m a Professor. A.I. Has Changed My Classroom, but Not for the Worse.

November 25, 2025

At the end of a class in mid-September, as everyone was gathering their things, a student named Tyler approached me. “Can we talk sometime about how we can ask the questions on our own?” he said. “We always have you to ask the questions and set up how we’re going to discuss and analyze, but I’d like to know how to do that for myself, for when we don’t have someone else to do it for us.”

I assured Tyler, whose open face under his ball cap bore the look of a guy who hoped he had not somehow offended, that this was a question he absolutely should be asking and that the short answer was yes. I told him to bring it up again at the beginning of our next class, when we could talk about it as a group. This put us a month ahead of schedule in achieving one of my goals in any class I teach: gradually turning over to students the responsibility for defining problems and deciding how to solve them.

I teach English courses at Boston College. I don’t lecture much. Mostly, we engage in conversation, paying attention to one another and to the book we have all read. I don’t teach content so much as a way of coming at things — tools and moves we can use to extract meaning from the world around us and make well-supported arguments about what we find. Class as workshop, not factory. As in a band or team practice, everyone in the room is simultaneously developing their individual chops and participating as a member of a problem-solving community. We practice on novels, poems and other literary artifacts, but the set of skills we’re trying out is basic equipment for living for any citizen or worker, any thinking person. It’s the same tool kit you would use to make sense of a State of the Union address, the state of your neighborhood or the sequence of events that has led you to be sitting in a cubicle or a jail cell.

Tyler and his peers want to be capable humans, independent thinkers. They are not, contrary to widespread belief, universally eager to turn over every last shred of their intellectual and community-building capacity to robot servants/overlords. And as Josie, another student in the class, told me, at least some of them “are not that OK” with peers who rely on A.I. to do their work. This all runs counter to the doomy picture of helplessness painted last spring and summer by well-circulated stories about generative A.I.’s effect on higher education. Everybody’s cheating their way through college. No student is going to read a book or write a paper on their own ever again. It’s the end of the essay, of reading, of thinking.

As professors in the humanities, who were already feeling besieged by a sustained chorus that questions the value of their disciplines, considered these stories and their own encounters with A.I., their unease sharpened into a specific concern about the fall semester. If they kept doing what they normally do, would they be sleepwalking into a minefield? A significant number of them did more thinking about teaching than usual over the summer, when professional scholars typically concentrate on research. As a result, they came into the fall with retooled courses featuring more purposeful approaches to writing and reading, less reliance on technology and a renewed focus on face-to-face community.

An A.I.-resistant English course has three main elements: pen-and-paper and oral testing; teaching the process of writing rather than just assigning papers; and greater emphasis on what happens in the classroom. Such a course, which can’t be A.I.-proof because that would mean students do no writing or reading except under a teacher’s direct supervision, also obliges us to make the case to students that it’s in their self-interest to do their own work. Colleagues I’ve talked to around the country do discuss A.I. with their students, and some have found creative ways to bring it into the curriculum, but few have turned to high-tech solutions like A.I.-detecting software or monitoring keystroke histories.

Now, well into the semester, it has come as a pleasant surprise to me and to many of my fellow teachers that the A.I. apocalypse that was expected to arrive in full force in higher education this fall, bringing an end to reading and writing and school as we know them, has not come to pass just yet. Responding to the rise of A.I. by doubling down on the humanity of the humanities appears to be working, at least for the moment. When I suggested as much to Stuart Selber, who directs the program that provides required writing courses to 18,000 students at Penn State, he said: “That is a fair statement in our case. It’s hard to know how much better things might be, because of all the complexity and moving parts, but we have not imploded. It feels like the panic is settling down, too.” Some colleagues even report a notable uptick in students’ engagement.

Many teachers have changed the way they test students. That often means returning to old-school pen-and-paper exams — thereby contributing to the comeback of the blue book, a horseshoe-crab-like relic of primordial ed tech that A.I. has saved from extinction. Assessments like tests and quizzes should reinforce the behavior you want, so mine consist of mini-essays that gauge command of the reading and of the ideas and analytical methods we’ve talked about in class, with most of the points awarded for interpretive substance. As a way to sum up in the last few minutes of a class discussion, we sometimes knock around students’ suggestions for a fair exam question about what we covered that day. For example, at the end of one class in the course that Tyler and Josie are taking — on the city in literature and film — we came up with something like this: Compare how Nathaniel Hawthorne’s “My Kinsman, Major Molineux” and the first chapter of Theodore Dreiser’s “Sister Carrie” represent the process of becoming a city person.

You can’t address A.I.’s effect on writing without considering its invitation to settle for a summary of content instead of actually reading. Exams are a delayed, high-stakes way to hold students accountable for reading; quizzes are a lower-stakes, day-to-day way to do that.

Scott Saul, at the University of California, Berkeley, told me that he has started giving five-minute pen-and-paper quizzes that (unlike a midterm or a final) ask for details from the text and no interpretation at all, an idea he got from Nabokov. Saul told me, “Nabokov would give these impossible quizzes in his Cornell classes” — like in “Anna Karenina,” what did Anna’s son receive as a birthday present in 1875? “And I’d be like, ‘That is so nitpicky, this guy is insane.’” But Saul came to see the point. “All of the bigger forces in our culture are pulling people away from deeply attentive reading,” he said. “People lack patience with the detail that makes an essay or a story or a novel have texture and depth,” especially when they’re used to having everything reduced to bullet-point summaries or skipping over large chunks of prose as they scan on a screen. Such tech-assisted breezing through a text is to reading a novel roughly as listening to a podcast about marriage is to being married. Nitpicking Nabokovian quizzes are one way to reinforce getting down into the details where meaning resides. I also have students scan and turn in their mark-ups — underlinings, marginal notes, highlighting — of the hard copy they’re reading, which is as close as I can get to watching them think as they read.

A second response to A.I. for teachers who still want to assign take-home papers, rather than settle for having students now do all their writing by hand in class: emphasize teaching the process of writing — breaking it down into a series of steps that a teacher can see and respond to — rather than simply grading the product.

Scaffolding is the term of art in the English biz. In the case of a paper analyzing a work of literature, the steps might include noticing patterns, building them into interpretive insights, coming up with a thesis, finding evidence to support it, drafting and revising. My students this semester are writing brief weekly exercises in response to the reading that can serve as the germ of a paper, then writing drafts of papers, then revising those into final drafts. It’s more work for me to read all that, but as my Boston College colleague Maia McAleavey says, “Pay them attention that counts while they’re still working on their papers, rather than when the paper’s done and they don’t care anymore or even remember.”

You can add peer reviews to extend such relationships to other students, who don’t appreciate being obliged to respond to A.I. slop. Getting involved at various stages of a paper’s development also makes it easier for a teacher to spot a clanking disjunction between a student’s thought process and the final product. To that end, I’ve also added a conference in which the student tells me about conceiving and writing a paper. Assigning a separate grade for the conference is another way to reward students who did their own work or penalize those who skimped on process by using A.I. Just as the point of a gym is not for the weights to go up and down but for you to move them, the objective of assigning papers is for students to think, not to fill the world with papers.

Another approach to adjusting paper assignments is to make them feel more authentic by making them more personal or creative, more essayistic. Mark Edmundson offers an example from a poetry course he teaches at the University of Virginia. “Are you a Whitmanian?” is the paper topic, he told me. “So we blend some analytical work on poetry with some reflective work on the student’s part. How much of Whitman’s vision do they buy? How much do they find troublesome? How much are they just wanting to dismiss out of hand?”

A student can feed this prompt to a bot, Edmundson acknowledges, “but first you have to tell ChatGPT who you are, give it a lot of details about yourself.” Despite their heavily digital existences and their willingness to share their data, he says, his students are still “reluctant to have their personality, character, identity usurped by a machine” in the intimate ways this kind of assignment would require. Also, students are more likely to get into writing — and therefore less likely to outsource — more personal papers like these, which can feel like a project one might plausibly engage in for reasons other than to gain course credit. And the chance to take some stylistic and intellectual risks with course content offers a contrast to the more mechanical writing that students do in blue-book exams, where the point is mostly to demonstrate that they have been keeping up with the reading and class discussions.

Judging by the first round of papers in my classes, the results so far appear to be OK. I can hear the students’ individual voices in their papers, which grow plausibly from exercises and drafts, for the most part without sudden bends toward the robotic. If I detect the ultraprocessed flavor of A.I. in a paper, which is trickier to do as bots get better at mimicking human writing, I’ll meet with that student to discuss their priorities. Rachel Trousdale, at Framingham State University, who treats the use of A.I. as plagiarism, says that this kind of conversation is among the most difficult she has with students. “So far that hasn’t happened much this semester, but I hate doing it every time,” she says. “It can be hard to explain the difference between ‘This sounds like A.I.’ and ‘I don’t think you’re able to do good work.’”

Reports of the demise of the essay may be exaggerated, but A.I. could well hasten the end of teachers just telling their students to write an essay and putting a grade on the result. For decades now, teachers of composition and creative writing have been showing the rest of the profession that the path, not the destination, should be the goal. Stuart Selber, of Penn State, makes a distinction between teaching writing as a craft and simply assigning writing as a product that can be cranked out by A.I. “We don’t just assign and collect writing, which is what most of the university does,” he told me. “If you don’t know what students are doing, you’re not involved in the work flow or the process, and you’re also not teaching writing.”

The third and most important pillar of an A.I.-resistant course: stress the value of what happens in the classroom. Students in a humanities class are paying for the admissions policy and the hiring policy that produced the other people in the classroom for them to engage with. Everything else — reading, being talked at by talking heads, firing hot takes into the electronic void — they can do online, alone at home in their underwear. They are probably never again going to be regularly spending undistracted blocks of time in a room with other people who have all read the same thing and are committed to getting meaning from it, so we should treat our time together like the special occasion it is — about 2,000 very expensive minutes over the course of a semester.

The dawning of the A.I. age hasn’t drastically reshaped the exchange of ideas in an English class, but it alters the meaning of such exchanges. When we’re in the groove in the classroom, I’m occupied with familiar discussion-orchestrating tasks: urging Charlie to turn her violent antipathy to the top-5-list-making music snobs in “High Fidelity” into the germ of an analysis; reminding myself to be patient and let Andrew and Kyla cook up their points at the deliberate speed they favor; encouraging all to see how the observations that Samantha and Dylan and Yasmine have made add up to an interpretation we can evaluate and refine. We can see and hear one another thinking. Some like to ask questions, some to answer them; some prefer to go first, some to lie back and counter; some want to build something, some to break it down.

These classroom dynamics feel timeless. But our ever-greater reliance on nonhuman interlocutors and assistants has given new value to the very fact of face-to-face exchange between humans. Class discussions offer increasingly rare opportunities to practice talking about ideas with other people, being a fully present member of a community pursuing a shared purpose, understanding others’ views. College graduates will need such competences to make their way in the world, even in the age of A.I., and they have fewer opportunities to practice them than they did in the past.

Like tests that focus on our discussions of the reading, course policy should reinforce what matters most. To qualify for any grade at all in my courses you have to actively participate in those discussions — which means, at minimum, speaking up at every class meeting. The quality of your contributions constitutes a significant part of your grade; papers, other writing exercises, tests and quizzes also count, but you can’t just turn in the deliverables and sit quietly. You have to pull your weight.

Another crucial policy, one that the rise of A.I. has inspired some teachers to finally adopt after contemplating it for years, is banning laptops and phones from class, which significantly improves classroom chemistry. Because I regularly observe the teaching of colleagues who don’t enforce such bans, I’ve had ample opportunity to see how every student who’s shopping or watching sports highlights creates a vortex of engagement-killing inattention that sucks in surrounding classmates. Students and colleagues report that device-free classrooms feel like a respite from all-tech-all-the-time, a place where they can leave off the endless clicking and scrolling, slow down and do some thinking for themselves and with others.

Building classroom community has become much more important to me over the years. The cellphone and now A.I. have made college a lonelier experience than it used to be, and the pandemic accelerated the long-term waning of community on campus. While my students are generally more professional and accomplished than my generation of college students was, they are also more anxious and isolated. So I try to make the classroom a place where they feel not just free to speak but expected to speak, responsible for doing their part as a citizen. This priority imparts outsize importance to mundane habits like banning screens, getting everyone involved, making sure we know everybody’s name, discussing what we’re going to do and why, and finding opportunities to joke around a bit while doing our serious business. I also make clear my expectation that when I begin class, I should have to quell a hubbub of neighbors catching up, not break the funereal hush of isolatoes bent in solitude over their phones. People usually have something to say, and they’re more likely to say it when they feel they’re part of a community.

I don’t have an absolute ban on using A.I. in my courses because I don’t like to make rules I can’t enforce, and I can’t prove that any particular piece of work has been done by A.I. Instead, in addition to making sure I place enough grading emphasis on tasks students can’t use A.I. to accomplish, I ask them not to use any nonhuman assistance and to tell me if they do, and I explain why it’s in their interest to do their own work. I’m asking them not only to do their own writing and reading but to also do it the old-fashioned way: turn off spell-check and grammar-check, read a book with pen in hand — and refrain from playing Monster Hunter at the same time.

Maryanne Wolf, a cognitive neuroscientist at U.C.L.A. who studies reading and writing, helped me make this case to students. She told me I could say to them, “Let me show you what you are short-circuiting if you don’t expend the effort in giving enough time to thinking and really focusing and immersing yourself in the material.” She was referring to our ability to draw inferences, take on others’ perspectives, evaluate truth, discern an underlying message — sophisticated cognitive and affective processes that lead to our best insights and can atrophy from disuse. “The oldest platitude in neuroscience is ‘Use it or lose it.’”

I told my students all this at the beginning of the semester. It was a pretty good speech, I thought. I delivered versions of it in both of my classes, growing a little impassioned as I asked them to stand up for themselves against powerful forces urging them toward frictionless passivity. Reading is thinking and writing is thinking, I said, and using A.I. to do your thinking for you is like joining the track team and doing your laps on an electric scooter. You’re paying $5 a minute for college classes; don’t spend your time here practicing to be replaceable by A.I. Use it or lose it, people.

When I finished, the students looked back at me with the steady earnest-ironic gaze of the young, as if to say: You seem relieved to have gotten that off your chest, Gramps, but that’s the third A.I. speech I’ve heard from a professor this week. Can we just get back to doing school?

The A.I.-resistant approach that teachers have collectively roughed out for the moment has major limitations. As was true before A.I., students hell-bent on not doing their own work can still get away with it some of the time in courses, like mine, that still require writing outside the classroom. Also, because my enrollments top out around 30, I can count on giving individual attention to my students, but that becomes impractical at larger scales. And I can’t be sure that anything that might work now will continue to work in the face of A.I.’s relentless advance, especially the increased use of pendants and glasses, invisible earpieces and other such wearable tech that could encroach on the face-to-face classroom.

But I do have confidence in my students and in the value of what we’re doing. In a humanities course, classically, you assemble equipment for living while considering lessons in how to live. These days, humanists also find themselves teaching doing your own work, sustained attention, intellectual stamina, reasoned disagreement and other meta-skills in ever-shorter supply in the culture at large.

This tech-obsessed moment may have finally pushed humanists to identify the value of their disciplines and play to their strengths, after first going fetal for far too long when assailed by return-on-investment literalists who willfully misunderstand the fit between school and work (“There is no job called English or history”). Students feel pressure to use college to prepare intensively for a particular career, like training full-time as an amateur to play a sport professionally, but they will graduate into a working world where the games and rules keep changing.

A more accurate analogy for school-to-work, instead of playing in the majors after majoring in baseball, would be having to play dodgeball with bales of data one day and ballroom dance with clients the next. So students need to train to be all-around athletes, ready for everything and anything. As finance and computer science majors are now finding out, the sudden but uneven intrusion of A.I. into various industries makes it even harder to time the entry-level labor market just a couple of years out from picking a major. This strengthens the argument for getting yourself a well-rounded liberal arts education heavy on meta-skills like adaptability, being good at learning fast and developing the critical acuity to separate the wheat from the ever-multiplying chaff in the increasingly A.I.-generated cascade of information coming at us from our screens.

Ohio State University, Colby College and some other schools have committed to cultivating A.I. fluency by seeking ways to integrate it across the curriculum. It’s fair to ask why a teacher would do the opposite, not even trying to take advantage of some of the most celebrated (and hyped) technological innovations of our time. In response, I’d point to the gym as a useful analogy for what we’re doing in school. (After all, the original Greek gymnasion, or place for naked exercise, was dedicated to intellectual as well as physical development.) For centuries now we’ve had machines that can lift more weight and move faster than human beings, and yet if you want to get stronger and faster you still have to lift the weight and run the distance yourself. Students should have teachers who show them how to use machines properly and teachers who show them how to do the work themselves. The stakes are high. Maryanne Wolf’s rundown of the mental fitness that comes of doing your own effortful reading and writing includes crucial tasks that we as citizens of an increasingly digital democratic society do very poorly these days: evaluating the truth of what we read, for instance, or comprehending a perspective that’s not our own.

That’s all in the back of my mind when we get down to business in my classroom with paperbacks and notepaper and pens and chalk ready to hand. We often start by first figuring out how we’re going to come at Ann Petry’s “The Street,” China Miéville’s “The City and the City” or whatever book we’re discussing — practicing the kind of interpretive self-sufficiency that Tyler sought. Do we start by simply noticing how the story is told: word choice, imagery, structure? Or this time do we start with theme, organizing logic, what we think the story is about? We’re practicing not only the craft of analysis but the work of being a thinking person alert to the flow of meaning through the world around us, sometimes on the surface of things and sometimes below it. That’s an essential part of the work of being human, and I don’t think we’re willing to outsource it all to machines just yet.


Carlo Rotella is the author, most recently, of “What Can I Get Out of This?: Teaching and Learning in a Classroom Full of Skeptics.”

