DNYUZ
This Is What Fully Automated School Looks Like

April 10, 2026

William Liu is grateful that he finished high school when he did. If the latest AI tools had been around then, he told me, he might have been tempted to use them to do his homework. Liu, now a sophomore at Stanford, finished high school all the way back in 2024. “I have a younger sibling who is just graduating high school,” he said. “Our educational experience has been vastly different, even though we’re just two years apart.”

By the time Liu graduated, ChatGPT was already causing chaos in the classroom. But the automation of school is intensifying. If at first teachers worried about students using chatbots to write essays, now new agentic tools such as Claude Code are allowing students to outsource even more of their work to the machines. Need to take an online math quiz? Write a biology-lab report? Create a PowerPoint presentation for history class? AI can do all of this and more. One high schooler recently told me that he struggles to think of a single assignment that AI wouldn’t be able to do for him.

As a measure of just how good AI has become at schoolwork, consider a new bot called Einstein. Several weeks ago, the tool went viral with big claims: “Einstein checks for new assignments and knocks them out before the deadline,” a website advertising the bot explained. All that a student had to do was hand over their credentials for Canvas, the popular learning-management platform, and Einstein promised to do the rest. No matter the task, the bot was game: Einstein boasted that it could watch lectures, complete readings, write papers, participate in discussion forums, and automatically submit homework assignments. If a quiz or a final exam was administered online, Einstein was happy to do that too.

When I first came across Einstein, I was skeptical: Flashy AI demos have a way of overpromising and under-delivering. So I decided to test the tool out for myself. Because I’m not a college student, I enrolled in a free online introductory-statistics class. The course website explained that the class was self-paced and that it could help undergraduates, postgraduates, medical students, and even lecturers build up basic statistical knowledge. I set the bot loose, and in less than an hour, Einstein had worked through all eight modules and seven quizzes. There were some hiccups—the bot took one quiz 15 times—but it ultimately earned a perfect score in the class. As for me? I hardly so much as read the course website.

[Read: AI agents are taking America by storm]

Einstein was designed to provoke. Its creator, Advait Paliwal, a 22-year-old tech entrepreneur, told me that he’d released the bot as a way of alerting educators to just how good AI is at schoolwork. “You can blame me,” he said. “But this is happening right now, and more people need to know about what’s to come.” (He has previously said that he designed Einstein’s landing page by prompting AI to make a website “that people would get angry over.”) Almost immediately after releasing Einstein, Paliwal started receiving emails from professors chastising him for creating a tool seemingly designed to perpetuate academic fraud. He took down the bot after he received multiple cease-and-desist letters, including one from Canvas’s parent company.

To Paliwal, the backlash missed the point: “If I didn’t post about this, someone would have used the same technology and hidden it from the professors,” he said. “It’s actually better that they know that this exists, and they can correctly prepare for what’s to come.” The tool also, of course, gave Paliwal a moment of viral fame. Nevertheless, Einstein does seem to be an indicator of where AI in the classroom is headed. The latest bots have massive context windows, meaning that students can feed in mountains of course content such as syllabi, lecture slides, and practice exams. Today’s agentic tools can complete all kinds of tasks, such as participating in online discussion forums and taking notes on recorded lectures without student intervention. According to one analysis, the percentage of students middle-school age or older who self-reported using AI for help with homework climbed by 14 points from May to December of last year.

Amid all of this, Silicon Valley is doubling down on its push to integrate AI into schools. In the lead-up to final exams last spring, nearly every major AI firm offered college students free (or heavily discounted) access to its paid chatbots. Now the tech industry is offering students cheap access to its agentic tools. Last summer, Anthropic announced “Claude Builder Clubs”—an initiative in which students paid by the AI company host workshops and hackathons on their campuses. In exchange for membership in those clubs, students are given free access to Claude Code. A few weeks ago, OpenAI announced that it would be offering college students $100 worth of credits for Codex, its agentic coding tool.

The students affiliated with the AI companies, at least, say that the more powerful bots are helping them with their studies. Thor Warnken, an Anthropic ambassador and a biology major at the University of Florida, told me that he has designed what is effectively a personalized Khan Academy. When he takes a practice test—say, in organic chemistry—he feeds his completed work into Claude. He then asks the bot to find patterns in his errors and make new practice problems based on them. “The first practice question will be super easy, and the next one will get a little harder and a little harder, until it gets super hard,” he explained. Liu, who also serves as an ambassador for Anthropic, similarly said that the bot has made for a “fantastic” study partner. When he has questions during large lectures, he asks Claude, which has access to his course materials, and the bot explains concepts in real time; previously, those questions might have gone unanswered.

[Read: The AI takeover of education is just getting started]

Instructors, as I have previously written, are also using plenty of AI. Canvas recently introduced a new AI teaching agent designed to save instructors time on “low educational value tasks” such as organizing online-course modules and adjusting assignment due dates. “Faculty are using AI tools both for instructional purposes, for building course materials, but they’re also starting to play around with generative AI to actually grade and assess the learning,” Marc Watkins, a researcher at the University of Mississippi who studies AI and education, told me. He gave a hypothetical: “I could set my agent up, open it up in my course, go out to walk across campus to get a cup of coffee at Starbucks,” he said. By the time he returned, 15 minutes later, all of the essays would be graded, and “bespoke personal feedback” would be sent out to each student. AI can save teachers time—that same grading takes him 10 or 12 hours, Watkins estimated—but in the process, the technology threatens the relationship between students and teachers that is core to education. “That’s really scary,” he said.

Most people I spoke with seemed unhappy with the current trajectory of bots in the classroom. Even as growing numbers of students are using the technology, a majority believe that the more they use AI for classwork, the more it will harm their critical-thinking skills. Natalie Lahr, a Barnard sophomore studying history and political science, doesn’t use the technology “unless it’s something that’s asked of me by a professor,” she told me, “and even in that case, I’m generally quite opposed.” In one particularly “anti-AI radicalizing” experience, Lahr met with a tutor at the college’s writing center to get help on an essay. According to Lahr, that tutor copy-pasted her essay prompt into the popular AI tool Perplexity and gave Lahr the AI-generated outline. “That was basically the end of our session,” Lahr said. “I had a crashout about that afterwards because I was like, Why am I even here?”

Some educators are worried about “a fully automated loop”—as the Modern Language Association put it last fall—in which AI-generated assignments are completed and graded by AI agents. Instructors have taken to analyzing students’ Google Docs history to make sure they are typing responses live instead of pasting in text from a bot. But of course, an AI work-around exists for that too: A new suite of human-typing simulators promises to generate text to make it look as if a student is writing in real time when, really, the work is being done by AI.

The post This Is What Fully Automated School Looks Like appeared first on The Atlantic.
