Nvidia’s Deal With Meta Signals a New Era in Computing Power

February 18, 2026

Ask anyone what Nvidia makes, and they’re likely to first say “GPUs.” For decades, the chipmaker has been defined by advanced parallel computing, and the emergence of generative AI and the resulting surge in demand for GPUs has been a boon for the company.

But Nvidia’s recent moves signal that it’s looking to lock in more customers at the less compute-intensive end of the AI market—customers who don’t necessarily need the beefiest, most powerful GPUs to train AI models, but instead are looking for the most efficient ways to run agentic AI software. Nvidia recently spent billions to license technology from a chip startup focused on low-latency AI computing, and also started selling standalone CPUs as part of its latest superchip system.

And yesterday, Nvidia and Meta announced that the social media giant had agreed to buy billions of dollars worth of Nvidia chips to provide computing power for its massive infrastructure projects—with Nvidia’s CPUs as part of the deal.

The multi-year deal is an expansion of a cozy ongoing partnership between the two companies. Meta previously estimated that by the end of 2024, it would have purchased 350,000 H100 chips from Nvidia, and that by the end of 2025 the company would have access to 1.3 million GPUs in total (though it wasn’t clear whether those would all be Nvidia chips).

As part of the latest announcement, Nvidia said that Meta would “build hyperscale data centers optimized for both training and inference in support of the company’s long-term AI infrastructure roadmap.” This includes a “large-scale deployment” of Nvidia’s CPUs and “millions of Nvidia Blackwell and Rubin GPUs.”

Notably, Meta is the first tech giant to announce it was making a large-scale purchase of Nvidia’s Grace CPU as a standalone chip, something Nvidia said would be an option when it revealed the full specs of its new Vera Rubin superchip in January. Nvidia has also been emphasizing that it offers technology that connects various chips, as part of its “soup-to-nuts approach” to compute power, as one analyst puts it.

Ben Bajarin, CEO and principal analyst at the tech market research firm Creative Strategies, says the move signaled that Nvidia recognizes that a growing range of AI software now needs to run on CPUs, much in the same way that conventional cloud applications do. “The reason why the industry is so bullish on CPUs within data centers right now is agentic AI, which puts new demands on general-purpose CPU architectures,” he says.

A recent report from the chip newsletter Semianalysis underscored this point. Analysts noted that CPU usage is accelerating to support AI training and inference, citing one of Microsoft’s data centers for OpenAI as an example, where “tens of thousands of CPUs are now needed to process and manage the petabytes of data generated by the GPUs, a use case that wouldn’t have otherwise been required without AI.”

Bajarin notes, though, that CPUs are still just one component of the most advanced AI hardware systems. Meta is still buying far more GPUs than CPUs from Nvidia.

“If you’re one of the hyperscalers, you’re not going to be running all of your inference computing on CPUs,” Bajarin says. “You just need whatever software you’re running to be fast enough on the CPU to interact with the GPU architecture that’s actually the driving force of that computing. Otherwise, the CPU becomes a bottleneck.”

Meta declined to comment on its expanded deal with Nvidia. During a recent earnings call, the social media giant said that it planned to dramatically increase its spending on AI infrastructure this year to between $115 billion and $135 billion, up from $72.2 billion last year.

Nvidia, which also declined to comment on the new deal, has for years now said that its hardware can be used for inference computing needs in addition to frontier AI training. Two years ago, in a sit-down interview with WIRED, Nvidia founder and chief executive Jensen Huang estimated that Nvidia’s business was likely “40 percent inference, 60 percent training.”

In December, Nvidia announced it was spending $20 billion to license technology from the chip startup Groq and bring some of Groq’s top talent, including CEO Jonathan Ross, into the fold at Nvidia. According to a statement from Groq, the deal reflected a “shared focus on expanding access to high-performance, low cost inference.” It was Nvidia’s largest acquisition to date.

Competition Heats Up

Nvidia’s deal with Meta comes as the most prominent AI labs and multi-trillion-dollar software companies are looking to diversify their sources of compute power. OpenAI, Anthropic, Meta, xAI, and many others have relied on Nvidia hardware to do the heavy lifting as they’ve trained and deployed generative AI models over the past several years.

Now, in many cases, they’re building or customizing their own chips, putting pressure on Nvidia to offer more services.

Microsoft relies on a mixture of Nvidia GPUs and custom-designed chips for its AI cloud services. Google also uses Nvidia chips for cloud services, but primarily relies on its own, homegrown Tensor Processing Units (TPUs) to train and deploy its advanced AI models. Google has also reportedly considered selling its TPUs to Meta.

Anthropic uses a combination of Nvidia GPUs, Google’s TPUs, and chips from Amazon—one of its most significant minority stakeholders—for its Claude AI models. (Anthropic cofounder and CEO Dario Amodei is also one of the rare executives who has publicly criticized Huang’s lobbying efforts to persuade the US government to allow Nvidia to sell advanced chips to China.)

OpenAI, which last year signed a massive AI infrastructure deal with Nvidia that could be worth as much as $100 billion, has openly said that it’s working with Broadcom to create its own AI chip hardware and network systems.

OpenAI chief executive Sam Altman joined AMD chief Lisa Su on stage at the chipmaker’s annual AI conference in June of last year; the two later struck a deal for OpenAI to buy up to 6 gigawatts of AMD chips over the next several years, in exchange for potentially acquiring 10 percent of AMD. AMD is known for its CPUs, though it also has been gaining share in the AI accelerator and AI GPU market.

And just two weeks ago, OpenAI announced that it planned to use technology from Cerebras to add “750MW of ultra low-latency AI compute” to its platforms. The deal was valued at $10 billion—not as much as its deals with Nvidia, but not insignificant either.

“The AI labs are looking to diversify because the needs are changing, yes, but it’s still mostly that they just can’t access enough GPUs,” Bajarin says. “They’re going to look wherever they can get the chips.”

The post Nvidia’s Deal With Meta Signals a New Era in Computing Power appeared first on Wired.
