Covid-19 gave everyone a harsh lesson in the power of exponentials, and that memory haunts any analysis of artificial intelligence. Sure, everything looks fine — now. But then, everything also looked fine in early March 2020. By the end of the month, we were locked in our houses with our strategic reserves of toilet paper.
In a viral essay on X this week, Otherside AI founder Matt Shumer draws the parallel explicitly. “I think we’re in the ‘this seems overblown’ phase of something much, much bigger than Covid,” he writes, before launching into a description of what’s already here for coders: AI agents building “usually perfect” software from a plain-English description. He’s predicting a world soon in which AI blows up software development and moves on to every other profession.
“I know the next two to five years are going to be disorienting in ways most people aren’t prepared for,” he writes. “This is already happening in my world. It’s coming to yours.”
By Friday, the post had 80 million views, and X had been divided into two warring camps, each astounded by the other’s naiveté: skeptics who saw this as more false hype, and AI boomers and doomers who think we’re on the cusp of the biggest social and economic transformation since at least the Industrial Revolution, and possibly the taming of fire.
Is it time to freak out? Well, don’t panic, but you should be concerned. Though not because the economy as we know it will end in two years, or five.
As readers of this column know, I’m closer to a boomer than a skeptic. I’ve watched AI get steadily better at doing parts of my job (though not the writing, every word of which has been lovingly handcrafted by a human). I’m also paying attention to what people from AI World are saying — and not just the executives, who can be suspected of hyping their product as they raise vast sums of capital to build more data centers.
Dismiss them if you will, but pay attention to the people who are leaving the major AI platforms, declaring we’re on the verge of recursive self-improvement (machines building better and better versions of themselves). Or else murmuring about finding something else to do in the brave new world, like studying poetry. All this makes me inclined to believe that Shumer is directionally correct. Even if the improvement stalls well short of superintelligence, a world of merely very intelligent machines is apt to get really weird for a good long while. Though probably not as soon as AI World thinks. It often seems to extrapolate from the pace of change in the software industry, which is undergoing a staggering transformation.
But most of the economy is not the software industry. Tech firms are best positioned to innovate in the business they understand best. As AI spreads beyond those borders, the pace of advancement should slow.
Electricity, chips and growing political pushback will become problems as AI expands. But leaving those constraints aside, AI will face steeper challenges in industries that work with people, or physical objects, rather than electrons.
What percentage of jobs can be automated by AI? Hard to say, but take the maximalist case: every job that was done over Zoom in 2021. In that year, according to the Census Bureau, 17.9 percent of workers were working primarily from home. That means more than 80 percent of jobs required someone’s physical presence, which implies those workers were doing something that cannot easily be done by a virtual worker.
Yet even that 17.9 percent probably overstates the potential, at least in the near term. Having spent five years working in IT, I can attest that software engineers adopt new technical tools far more quickly, and with considerably less pain, than any other users.
Many other constraints don’t exist in the software industry but abound outside it. Take drug discovery, which has captured a lot of imaginations — cures for cancer, on demand! Even if every other part of the process were turbocharged by AI, drug companies would still be required by law to test inventions in thousands of human subjects. However much AI improves that process, it will not enable you to administer a 12-week course of a new drug to fewer than the required number of subjects, or in less than 12 weeks.
Almost every sector outside of software has many such constraints — cultural, physical and regulatory. Maybe one day we’ll get so good at modeling biological processes that we can skip the clinical trials. But probably not in five years, and given how glacially bureaucracies move, maybe not in 50. Likewise, we might get robots that translate AI into the physical world, but we won’t scale them at AI speeds, because building robots will require extracting huge volumes of raw material and moving them slowly on trucks and container ships to places where they can be turned into machine parts.
So while there are a few industries where everything might go sideways in the next five years (journalism, alas, is one of them), in most jobs, you should expect things to be mostly business as usual come 2030. That said, remember Covid, and don’t let the apparent normalcy blind you to what’s coming. If you’re in a white-collar job, you’ve probably got time. But it won’t do you much good unless you use it to prepare.
The post The covid reality check for AI hype appeared first on Washington Post.