In 2005, there was no inkling of the artificial intelligence boom that would come years later. But directors at Intel, whose chips served as electronic brains in most computers, faced a decision that might have altered how that transformative technology evolved.
Paul Otellini, Intel’s chief executive at the time, presented the board with a startling idea: Buy Nvidia, a Silicon Valley upstart known for chips used for computer graphics. The price tag: as much as $20 billion.
Some Intel executives believed that the underlying design of graphics chips could eventually take on important new jobs in data centers, an approach that would later come to dominate A.I. systems.
But the board resisted, according to two people familiar with the boardroom discussion who spoke only on the condition of anonymity because the meeting was confidential. Intel had a poor record of absorbing companies. And the deal would have been by far Intel’s most expensive acquisition.
Confronting skepticism from the board, Mr. Otellini, who died in 2017, backed away and his proposal went no further. In hindsight, one person who attended the meeting said, it was “a fateful moment.”
Today Nvidia is the unrivaled A.I. chip king and one of the most valuable corporations in the world, while Intel, once the semiconductor superpower, is reeling and getting no lift from the A.I. gold rush. Nvidia’s stock market value, for years a fraction of Intel’s, is now more than $3 trillion, roughly 30 times that of the struggling Silicon Valley icon, which has fallen below $100 billion.
As the company’s valuation has sunk, some big tech companies and investment bankers have been considering what was once unthinkable: that Intel could be a potential acquisition target.
Such scenarios add to the pressures facing Patrick Gelsinger, appointed in 2021 as Intel’s chief executive. He has focused on restoring the company’s onetime lead in chip manufacturing technology, but longtime company watchers say Intel badly needs popular products — such as A.I. chips — to bolster revenue that declined by more than 30 percent from 2021 through 2023.
“Pat Gelsinger is very much focused on the manufacturing side,” said Robert Burgelman, a professor at the Stanford Graduate School of Business. “But they missed A.I., and that has been catching up to them now.”
The story of how Intel, which recently cut 15,000 jobs, got left behind in A.I. is representative of the broader challenges the company now faces. There were opportunities missed, wayward decisions and poor execution, according to interviews with more than two dozen former Intel managers, board directors and industry analysts.
The trail of missteps was a byproduct of a corporate culture born of decades of success and high profits, going back to the 1980s, when Intel’s chips and Microsoft’s software became the twin engines of the fast-growing personal computer industry.
That culture was hard-charging and focused on the company’s franchise in personal computers and, later, in data centers. Intel executives, only half-jokingly, described the company as “the largest single-cell organism on the planet,” an insular, self-contained world.
It was a corporate ethos that worked against the company as Intel tried and failed, repeatedly, to become a leader in chips for artificial intelligence. Projects were created, pursued for years and then abruptly shut down, either because Intel leadership lost patience or the technology fell short. Investments in newer chip designs invariably took a back seat to protecting and expanding the company’s money-spinning mainstay — generations of chips based on Intel’s PC-era blueprint, called the x86 architecture.
“That technology was Intel’s crown jewel — proprietary and very profitable — and they would do anything in their power to maintain that,” said James D. Plummer, a professor of electrical engineering at Stanford University and a former Intel director.
Intel’s leaders, at times, acknowledged the issue. Craig Barrett, a former Intel chief executive, once compared the x86 chip business to a creosote bush — a plant that poisons competing plants around it. Still, the profits were so high for so long that Intel never really shifted course.
At the time Intel considered a bid for Nvidia, the smaller company was widely viewed as a niche player. Its specialized chips were mostly used in machines for computer gamers, but Nvidia had started adapting its chips for other number-crunching fields, such as oil and gas discovery.
Where Intel’s microprocessor chips excelled in rapidly executing calculations one after another, Nvidia’s chips delivered superior performance in graphics by breaking tasks up and spreading them across hundreds or thousands of processors working in parallel — an approach that would pay off years later in artificial intelligence.
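To make that contrast concrete, here is a minimal, illustrative sketch — not code from either company — showing the same vector addition written first as a serial CPU loop that handles one element at a time, and then as a CUDA kernel that spreads the elements across thousands of GPU threads working in parallel.

```cuda
// Illustrative only: serial CPU addition vs. the same work spread across GPU threads.
#include <cstdio>

// Serial version: one processor walks the array element by element.
void add_serial(const float *a, const float *b, float *out, int n) {
    for (int i = 0; i < n; ++i) out[i] = a[i] + b[i];
}

// Parallel version: each GPU thread computes exactly one element.
__global__ void add_parallel(const float *a, const float *b, float *out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) out[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;                    // about a million elements
    const size_t bytes = n * sizeof(float);
    float *a, *b, *out;
    cudaMallocManaged(&a, bytes);             // memory visible to both CPU and GPU
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&out, bytes);
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    // Launch enough 256-thread blocks to cover every element at once.
    add_parallel<<<(n + 255) / 256, 256>>>(a, b, out, n);
    cudaDeviceSynchronize();

    printf("out[0] = %.1f\n", out[0]);        // expect 3.0
    cudaFree(a); cudaFree(b); cudaFree(out);
    return 0;
}
```

The parallel version pays off when a workload consists of many small, independent, identical operations, which is exactly the shape of the pixel and matrix arithmetic behind graphics and, later, neural networks.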
After the Nvidia idea was rejected, Intel, with the board’s backing, focused on an in-house project, code-named Larrabee, to jump ahead of competitors in graphics. The project was led by Mr. Gelsinger, who had joined Intel in 1979 and risen steadily to become a senior executive.
The Larrabee effort consumed four years and hundreds of millions of dollars. Intel was confident, perhaps arrogant, that it could transform the field. In 2008, speaking at a conference in Shanghai, Mr. Gelsinger predicted, “Today’s graphics architectures are coming to an end.” Larrabee would be the new thing.
Larrabee was a hybrid, combining graphics with Intel’s PC-style chip design. It was a bold plan to meld the two, with Intel’s linchpin technology at the core. And it didn’t work. Larrabee fell behind schedule and its graphics performance lagged.
In 2009, Intel pulled the plug on the project, a few months after Mr. Gelsinger announced he was departing to become president and chief operating officer of EMC, a maker of data storage gear.
A decade after leaving Intel, Mr. Gelsinger still believed Larrabee was on the right track. In a 2019 oral history interview with the Computer History Museum, he recalled that people were beginning to use Nvidia chips and software for things beyond graphics. That was before the A.I. boom, but the direction was clear, Mr. Gelsinger said.
Larrabee’s progress was halting but, he insisted, it could have proved a winner with more corporate patience and investment. “Nvidia would be a fourth the size they are today as a company because I think Intel really had a shot right in that space,” he said.
Now, three years after he was wooed back to take over Intel, Mr. Gelsinger still holds that view. But in a brief interview with The New York Times recently, he also emphasized the long-term commitment that would have been needed. “I believed in it,” he said. Had Intel kept at it, “I think the world would be very different today,” Mr. Gelsinger said. “But you can’t replay history on these things.”
Some of the Larrabee technology was used in specialized chips for scientific supercomputing. But Intel’s graphics push was curtailed. Nvidia continued investing for years not only in its chip designs but also in the crucial software that enabled programmers to write a wider range of applications on its hardware.
In later years, Intel continued to stumble in the A.I. market. In 2016, the company paid $400 million for Nervana Systems, one of a new crop of A.I. chip companies. Its chief executive, Naveen Rao, was named head of Intel’s fledgling A.I. products unit.
Mr. Rao recounted a litany of problems he encountered at Intel, including corporate curbs on hiring engineers, manufacturing troubles and fierce competition from Nvidia, which was steadily improving its offerings. Still, his team managed to introduce two new chips, one of which was used by Facebook, he said.
But in December 2019, Mr. Rao said he was stunned when, over his objections, Intel bought another A.I. chip start-up, Habana Labs, for $2 billion. That deal came just as Mr. Rao’s team was close to completing a new chip. Mr. Rao’s feelings are still raw about that move.
“You had a product that was ready to go and you shot it — and you bought this company for $2 billion that set you back two years,” said Mr. Rao, who resigned shortly afterward and is now vice president of A.I. at Databricks, a software company.
Robert Swan, who was Intel’s chief executive at the time, declined to comment.
Intel spread its A.I. efforts thin, developing multiple graphics-style chips — products now discontinued — while taking years to offer credible chips from the Habana Labs lineage. The latest version, called Gaudi 3, has attracted interest from companies like Inflection AI, a high-profile start-up, as a lower-cost alternative to Nvidia.
Under Mr. Gelsinger, Intel has made some progress catching up to Asian rivals in chip manufacturing technology. The company has also persuaded Washington to pledge billions of dollars in federal funding — under the CHIPS and Science Act — to help revive its fortunes. Still, it will be a steep climb.
Intel has lately designed new chips that have impressed industry analysts, including an A.I. chip for laptop PCs. Yet it is a measure of Intel’s troubles that these new chips are being produced not in Intel factories, but by Taiwan Semiconductor Manufacturing Company — a decision, made to exploit that company’s more advanced production technology, that tends to reduce Intel’s profit on the chips.
Today, Intel sees its A.I. opportunity emerging as the technology is increasingly used by mainstream businesses. Most corporate data resides in data centers still populated mainly by Intel servers. As more A.I. software is created for businesses, more conventional computer processing will be needed to run those new applications.
But Intel is not at the forefront of building big A.I. systems. That is Nvidia’s stronghold.
“In that race, they are so far ahead,” Mr. Gelsinger said at a recent Deutsche Bank conference. “Given the other challenges that we have, we’re just not going to be competing anytime soon.”