DNYUZ
Nvidia’s Groq bet shows that the economics of AI chip-building are still unsettled

December 30, 2025

Nvidia built its AI empire on GPUs. But its $20 billion bet on Groq suggests the company isn’t convinced GPUs alone will dominate the most important phase of AI yet: running models at scale, known as inference.

The battle over AI inference is, at its core, a battle over economics. Once a model is trained, every useful thing it does—answering a query, generating code, recommending a product, summarizing a document, powering a chatbot, or analyzing an image—happens during inference. That’s the moment AI turns from a sunk cost into a revenue-generating service, with all the accompanying pressure to reduce costs, shrink latency (how long you have to wait for an AI to answer), and improve efficiency.

That pressure is exactly why inference has become the industry’s next battleground for potential profits—and why Nvidia, in a deal announced just before the Christmas holiday, licensed technology from Groq, a startup building chips designed specifically for fast, low-latency AI inference, and hired most of its team, including CEO and founder Jonathan Ross.

Inference is AI’s ‘industrial revolution’

Nvidia CEO Jensen Huang has been explicit about the challenge of inference. While he says Nvidia is “excellent at every phase of AI,” he told analysts at the company’s Q3 earnings call in November that inference is “really, really hard.” Far from a simple case of one prompt in and one answer out, modern inference must support ongoing reasoning, millions of concurrent users, guaranteed low latency, and relentless cost constraints. And AI agents, which have to handle multiple steps, will dramatically increase inference demand and complexity—and raise the stakes of getting it wrong. 

“People think that inference is one shot, and therefore it’s easy. Anybody could approach the market that way,” Huang said. “But it turns out to be the hardest of all, because thinking, as it turns out, is quite hard.”

Nvidia’s support of Groq underscores that belief, and signals that even the company that dominates AI training is hedging on how inference economics will ultimately shake out. 

Huang has also been blunt about how central inference will become to AI’s growth. In a recent conversation on the BG2 Podcast, Huang said inference already accounts for more than 40% of AI-related revenue—and predicted that it is “about to go up by a billion times.”

“That’s the part that most people haven’t completely internalized,” Huang said. “This is the industry we were talking about. This is the industrial revolution.”

The CEO’s confidence helps explain why Nvidia is willing to hedge aggressively on how inference will be delivered, even as the underlying economics remain unsettled.

Nvidia wants to corner the inference market

Nvidia is hedging its bets to make sure it has a hand in every part of the market, said Karl Freund, founder and principal analyst at Cambrian-AI Research. “It’s a little bit like Meta acquiring Instagram,” he explained. “It’s not that they thought Facebook was bad; they just knew that there was an alternative that they wanted to make sure wasn’t competing with them.”

That’s the case even though Huang had made strong claims about the economics of Nvidia’s existing inference platform. “I suspect they found that it either wasn’t resonating as well with clients as they’d hoped, or perhaps they saw something in the chip memory-based approach that Groq and another company called D-Matrix have,” said Freund, referring to another fast, low-latency AI chip startup, backed by Microsoft, that recently raised $275 million at a $2 billion valuation.

Freund said Nvidia’s move into Groq could lift the entire category. “I’m sure D-Matrix is a pretty happy startup right now, because I suspect their next round will go at a much higher valuation thanks to the [Nvidia-Groq deal],” he said. 

Other industry executives say the economics of AI inference are shifting as AI moves beyond chatbots into real-time systems like robots, drones, and security tools. Those systems can’t afford the delays that come with sending data back and forth to the cloud, or the risk that computing power won’t always be available. Instead, they favor specialized chips like Groq’s over centralized clusters of GPUs.

Behnam Bastani, CEO and founder of OpenInfer, which focuses on running AI inference close to where data is generated—on devices, sensors, or local servers rather than in distant cloud data centers—said his startup is targeting these kinds of applications at the “edge.”

The inference market, he emphasized, is still nascent, and Nvidia is looking to corner it with the Groq deal. With the economics still unsettled, he said, Nvidia is trying to position itself as the company that spans the entire inference hardware stack rather than betting on a single architecture.

“It positions Nvidia as a bigger umbrella,” he said.

The post Nvidia’s Groq bet shows that the economics of AI chip-building are still unsettled appeared first on Fortune.

DNYUZ © 2025
