IBM sees enterprise customers using ‘everything’ when it comes to AI; the challenge is matching the LLM to the right use case

June 25, 2025

Over the last 100 years, IBM has seen many different tech trends rise and fall. The technologies that tend to win out are the ones that offer customers choice.

At VB Transform 2025 today, Armand Ruiz, VP of AI Platform at IBM, detailed how Big Blue is thinking about generative AI and how its enterprise users are actually deploying the technology. A key theme Ruiz emphasized is that, at this point, it’s not about choosing a single large language model (LLM) provider or technology. Increasingly, enterprise customers are rejecting single-vendor AI strategies in favor of multi-model approaches that match specific LLMs to targeted use cases.

IBM has its own open-source AI models with the Granite family, but it is not positioning that technology as the only choice, or even the right choice for all workloads. This enterprise behavior is driving IBM to position itself not as a foundation model competitor, but as what Ruiz referred to as a control tower for AI workloads.

“When I sit in front of a customer, they’re using everything they have access to, everything,” Ruiz explained. “For coding, they love Anthropic, and for some other use cases, like for reasoning, they like o3, and then for LLM customization, with their own data and fine tuning, they like either our Granite series or Mistral with their small models, or even Llama…it’s just matching the LLM to the right use case. And then we help them as well to make recommendations.”

The multi-LLM gateway strategy

IBM’s response to this market reality is a newly released model gateway that provides enterprises with a single API to switch between different LLMs while maintaining observability and governance across all deployments. 

The technical architecture allows customers to run open-source models on their own inference stack for sensitive use cases while simultaneously accessing public APIs like AWS Bedrock or Google Cloud’s Gemini for less critical applications.

“That gateway is providing our customers a single layer with a single API to switch from one LLM to another LLM and add observability and governance all throughout,” Ruiz said.

The approach directly contradicts the common vendor strategy of locking customers into proprietary ecosystems. IBM is not alone in taking a multi-vendor approach to model selection: multiple model-routing tools have emerged in recent months, all aiming to direct each workload to the most appropriate model.
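To make the pattern concrete, here is a minimal sketch of what a single-API gateway of this kind could look like. It is purely illustrative and assumes nothing about IBM’s actual gateway: the `ModelGateway` class, the backend names, and the stub model calls are all hypothetical stand-ins.

```python
# Hypothetical sketch of a multi-LLM gateway: one entry point that routes
# requests to different model backends while logging every call for
# observability. Backend names are illustrative stubs, not IBM's API.
import logging
from dataclasses import dataclass
from typing import Callable, Dict

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("llm-gateway")


@dataclass
class ModelBackend:
    name: str
    call: Callable[[str], str]  # prompt -> completion


class ModelGateway:
    """Single API surface; swaps the backing LLM per use case."""

    def __init__(self) -> None:
        self._routes: Dict[str, ModelBackend] = {}

    def register(self, use_case: str, backend: ModelBackend) -> None:
        self._routes[use_case] = backend

    def complete(self, use_case: str, prompt: str) -> str:
        backend = self._routes[use_case]
        # Observability hook: every request records which model served it.
        log.info("routing use_case=%s to model=%s", use_case, backend.name)
        return backend.call(prompt)


# Usage: coding prompts go to one provider, reasoning to another, and
# sensitive data stays on a self-hosted model (all stubbed here).
gateway = ModelGateway()
gateway.register("coding", ModelBackend("coding-model-stub", lambda p: f"[coding model] {p}"))
gateway.register("reasoning", ModelBackend("reasoning-model-stub", lambda p: f"[reasoning model] {p}"))
gateway.register("sensitive", ModelBackend("onprem-model-stub", lambda p: f"[on-prem model] {p}"))

print(gateway.complete("coding", "Write a unit test for a date parser."))
```

The point of the abstraction is that swapping a backend changes one registration call, not every application that consumes the API.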

Agent orchestration protocols emerge as critical infrastructure

Beyond multi-model management, IBM is tackling the emerging challenge of agent-to-agent communication through open protocols.

The company has developed ACP (Agent Communication Protocol) and contributed it to the Linux Foundation. ACP competes with Google’s Agent2Agent (A2A) protocol, which Google contributed to the Linux Foundation just this week.

Ruiz noted that both protocols aim to facilitate communication between agents and reduce custom development work. He expects the different approaches to converge eventually; for now, the differences between A2A and ACP are mostly technical.

The agent orchestration protocols provide standardized ways for AI systems to interact across different platforms and vendors.

The technical significance becomes clear when considering enterprise scale: some IBM customers already have over 100 agents in pilot programs. Without standardized communication protocols, each agent-to-agent interaction requires custom development, creating an unsustainable integration burden.
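The sketch below illustrates the underlying idea of a shared message envelope between agents. It is not the ACP or A2A wire format; every field name is hypothetical, and it only shows why a common schema removes the need for a bespoke adapter between each pair of agents.

```python
# Illustrative only: a generic agent-to-agent message envelope. This is NOT
# the actual ACP or A2A format; field names are hypothetical.
import json
import uuid
from dataclasses import asdict, dataclass, field


@dataclass
class AgentMessage:
    sender: str        # id of the agent issuing the request
    recipient: str     # id of the agent expected to handle it
    intent: str        # what the sender wants done, e.g. "get_salary_band"
    payload: dict      # task-specific arguments
    message_id: str = field(default_factory=lambda: str(uuid.uuid4()))

    def to_json(self) -> str:
        return json.dumps(asdict(self))


# Any agent that speaks this envelope can talk to any other agent without a
# custom integration per pair -- which is the scaling problem a shared
# protocol is meant to solve.
msg = AgentMessage(
    sender="hr-frontdoor",
    recipient="compensation-agent",
    intent="get_salary_band",
    payload={"employee_id": "E-1234"},
)
print(msg.to_json())
```

With 100 agents and no shared envelope, integrations grow roughly with the number of agent pairs; with one, they grow with the number of agents.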

AI is about transforming workflows and the way work is done

As for how AI is affecting enterprises today, Ruiz suggests it needs to be about more than just chatbots.

“If you are just doing chatbots, or you’re only trying to do cost savings with AI, you are not doing AI,” Ruiz said. “I think AI is really about completely transforming the workflow and the way work is done.”

The distinction between AI implementation and AI transformation centers on how deeply the technology integrates into existing business processes. IBM’s internal HR example illustrates this shift: instead of employees asking chatbots for HR information, specialized agents now handle routine queries about compensation, hiring, and promotions, automatically routing to appropriate systems and escalating to humans only when necessary.

“I used to spend a lot of time talking to my HR partners for a lot of things. I handle most of it now with an HR agent,” Ruiz explained. “Depending on the question, if it’s something about compensation or it’s something about just handling separation, or hiring someone, or doing a promotion, all these things will connect with different HR internal systems, and those will be like separate agents.”

This represents a fundamental architectural shift from human-computer interaction patterns to computer-mediated workflow automation. Rather than employees learning to interact with AI tools, the AI learns to execute complete business processes end-to-end.

The technical implication: enterprises need to move beyond API integrations and prompt engineering toward deep process instrumentation that allows AI agents to execute multi-step workflows autonomously.
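A minimal sketch of that pattern, loosely modeled on the HR example above: a front-door handler classifies a request, dispatches it to a specialized agent, and escalates to a human only when nothing matches. The agent names and the keyword-based classifier are hypothetical simplifications standing in for whatever intent detection and back-end integration a real deployment would use.

```python
# Hypothetical workflow router in the spirit of the HR example: specialized
# agents handle routine requests end-to-end; humans get only the leftovers.
from typing import Callable, Dict, Optional


def compensation_agent(request: str) -> str:
    return f"[compensation system] handled: {request}"


def hiring_agent(request: str) -> str:
    return f"[hiring system] handled: {request}"


def promotion_agent(request: str) -> str:
    return f"[promotion system] handled: {request}"


# Crude keyword routing stands in for an LLM-based intent classifier.
ROUTES: Dict[str, Callable[[str], str]] = {
    "salary": compensation_agent,
    "hire": hiring_agent,
    "promotion": promotion_agent,
}


def handle_hr_request(request: str) -> str:
    agent: Optional[Callable[[str], str]] = None
    for keyword, candidate in ROUTES.items():
        if keyword in request.lower():
            agent = candidate
            break
    if agent is None:
        # Escalate to a person only when no agent can take the request.
        return "escalated to human HR partner"
    return agent(request)


print(handle_hr_request("What is my salary band this year?"))
print(handle_hr_request("I need help with a visa question."))
```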

Strategic implications for enterprise AI investment

IBM’s real-world deployment data suggests several critical shifts for enterprise AI strategy:

Abandon chatbot-first thinking: Organizations should identify complete workflows for transformation rather than adding conversational interfaces to existing systems. The goal is to eliminate human steps, not improve human-computer interaction.

Architect for multi-model flexibility: Rather than committing to single AI providers, enterprises need integration platforms that enable switching between models based on use case requirements while maintaining governance standards.

Invest in communication standards: Organizations should prioritize AI tools that support emerging protocols like MCP, ACP, and A2A rather than proprietary integration approaches that create vendor lock-in.

“There is so much to build, and I keep saying everyone needs to learn AI and especially business leaders need to be AI first leaders and understand the concepts,” Ruiz said.

This article appeared first on VentureBeat.
