Commencement season invites self-reflection. Honorary speakers ask it of half-awake, partied-out, graduating seniors. But colleges and schools, too, are gazing in the mirror: What’s the point—the medium, more precisely—of education nowadays amidst tech upheaval?
As a writer and teacher, I’m admittedly baffled by this question. (But I suspect the textile weavers of early 19th-century Britain were as well, when they saw industrial factories spring to life.)
What’s clear to me is that generative AI has begotten Generation AI.
Since debuting as the fastest-growing platform in internet history, ChatGPT has become a metonym for all that ails us about AI. Within two months of its launch, a survey found that 90% of college students were using it for homework. More recently, a massive study by the AI company Anthropic confirmed that students outsource “higher-order cognitive functions” like creativity and analysis to it. We’ve seen obituaries penned for the out-of-class essay, writing teachers throwing up their hands and quitting, and the humanities enduring yet another existential punch.
As one undergrad devastatingly summarized, “College is just how well I can use ChatGPT at this point.” Such sarcasm ridicules six-figure tuition, room, and board. But pedagogically, it presents a false fork in the road: education surely has to work both with and against artificial intelligence, even as vexing questions remain on either path.
The latter calls for an analog retreat: blue books, oral exams, and the like. It clings to the calculator comparison: that the calculator’s invention didn’t invalidate the need to learn the logic of basic math by hand. And it (correctly) places faith in the cultivation of intellectual muscles rather than accepting their supposedly inevitable AI-augmented atrophy.
Some of our students falsely assume that the product, a final paper, is what we seek, because high-stakes testing has trained them to think transactionally, and the product is what grading tallies. But, of course, the process is what we ultimately aim to sharpen: the steps taken and the lessons learned along the way. AI rewires that relationship, short-circuiting the path from effort to output.
How might we better measure process? Must it involve students, anxious about accusations of AI use, absurdly uploading hours-long screen recordings that self-surveil their compositions?
This is the ask of the “work against AI” caucus, one that looks to tradition for answers: not just the wisdom of the ancients, but the constraints they worked under, the conditions of their time that gave rise to that wisdom. Yet we’ve lived with and benefited from the calculator, much as we have from other disruptive, suspicion-inducing technologies of knowledge before it (writing) and after it (the internet).
Thus, for the “work with AI” evangelists, a different counterpoint: If reading and writing are now to be outsourced to the machines for the sake of efficiency, what are educators supposed to assign, assess, and, indeed, idealize as the proxy for thinking?
Literacy had a pretty good 5,000-year run, if you’re a fan of human advancement. And it may well be that Generation AI is on the cutting edge of some radical new means of thinking, rather than its lazy replacement. But that’s for ed-tech to prove, not for us to take on blind faith.
Educators’ suspicion of artificial intelligence is well-warranted because AI is an epistemological provocation wrapped in a tech advance. It’s also a massive, ongoing sales pitch. Unholy sums of capital are being sunk into Silicon Valley. Those bets only pay out if we all adopt AI in our daily lives, both personally and professionally.
That training must start early: hence OpenAI offering its “Plus” tier free through finals season. It’s perpetuated by the pesky swarm of AI assistants that offer to read and write for you across apps and platforms. Outsourcing nurtures dependency and deskilling.
Tech companies are racing to establish first-mover advantage in this space, battling to become monopolistically synonymous with AI the way Google is with search, Amazon with online retail, and Meta with social networking. For OpenAI, valued at $300 billion but burning through $5 billion a year, it’s a bet the company can’t afford to lose.
Hence the cliché increasingly heard (certainly by our over-indebted undergrads): “AI won’t take your job; the person who knows how to use AI will.” But that sounds more like a corporate cost-cutting threat masquerading as historical inevitability. No technology is inevitable, however faithfully its evangelists sermonize to make it seem so.
Intellectual humility demands that education hedge both “with” and “against” AI, because we can’t know which technologies will triumph and which will collect dust. Some become Facebook; others, the Metaverse.
While colleges sort out ChatGPT’s precise place in matters curricular, we can double down on delivering what Generation AI equally needs: the experience of humanity, a quality the machines can never know and must never supplant. This includes the experiential learning that accompanies volunteer service, immersing students, three-dimensionally, in the lives and worlds of society’s marginalized.
It also includes the social and communal dimensions of campus life that might offset our crippling national crisis of alienation and loneliness. Yet, here, we’ll need to guard against college becoming even more of a lifestyle project, reducible to “day in the life” TikToks.
And that, in the end, might be exactly what Generation AI needs the most, beyond large language models: a space to unplug; a space to think, to find oneself; a space to strengthen those muscles of focus.
Your attention is the most valuable thing you’ll ever have, college graduate. Know that it’s your superpower, the purest and most generous expression of love. Never squander it on anyone or anything that doesn’t deserve it, least of all a machine.