Cracking AI’s storage bottleneck and supercharging inference at the edge

July 7, 2025

As AI applications increasingly permeate enterprise operations, from enhancing patient care through advanced medical imaging to powering complex fraud detection models and even aiding wildlife conservation, a critical bottleneck often emerges: data storage.

During VentureBeat’s Transform 2025, Greg Matson, head of products and marketing at Solidigm, and Roger Cummings, CEO of PEAK:AIO, spoke with Michael Stewart, managing partner at M12, about how innovations in storage technology enable enterprise AI use cases in healthcare.

The MONAI framework is a breakthrough in medical imaging, letting researchers build imaging AI faster, more safely, and more securely. Advances in storage technology are what enable researchers to build on top of this framework and iterate and innovate quickly. PEAK:AIO partnered with Solidigm to integrate power-efficient, performant, high-capacity storage, which enabled MONAI to store more than two million full-body CT scans on a single node within its IT environment.
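
As an illustration of how researchers iterate on top of MONAI, here is a minimal sketch (ours, not code from the article) that loads full-body CT volumes from a local high-capacity node and runs them through a standard preprocessing pipeline; the directory path, file format, and transform settings are assumptions.

```python
# Minimal sketch: iterate over CT volumes stored on a local high-capacity node
# using MONAI's dictionary-based transforms. Path and settings are hypothetical.
from glob import glob

import torch
from monai.data import Dataset
from monai.transforms import (
    Compose,
    EnsureChannelFirstd,
    LoadImaged,
    ScaleIntensityd,
    Spacingd,
)

# Records pointing at full-body CT volumes assumed to live on local NVMe.
files = [{"image": path} for path in sorted(glob("/data/ct_scans/*.nii.gz"))]

# Standard MONAI preprocessing chain: load from disk, fix channel layout,
# resample to a common voxel spacing, and normalize intensities.
transforms = Compose([
    LoadImaged(keys=["image"]),
    EnsureChannelFirstd(keys=["image"]),
    Spacingd(keys=["image"], pixdim=(1.5, 1.5, 2.0)),
    ScaleIntensityd(keys=["image"]),
])

dataset = Dataset(data=files, transform=transforms)
loader = torch.utils.data.DataLoader(dataset, batch_size=1, num_workers=4)

for batch in loader:
    volume = batch["image"]  # shape: (1, 1, D, H, W) after preprocessing
    # ... run training or inference on the volume here ...
    break
```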

“As enterprise AI infrastructure evolves rapidly, storage hardware increasingly needs to be tailored to specific use cases, depending on where they are in the AI data pipeline,” Matson said. “The type of use case we talked about with MONAI, an edge-use case, as well as the feeding of a training cluster, are well served by very high-capacity solid-state storage solutions, but the actual inference and model training need something different. That’s a very high-performance, very high I/O-per-second requirement from the SSD. For us, RAG is bifurcating the types of products that we make and the types of integrations we have to make with the software.”

Improving AI inference at the edge

For peak performance at the edge, it’s critical to scale storage down to a single node in order to bring inference closer to the data. Equally important is removing memory bottlenecks, which can be done by making memory a part of the AI infrastructure so that it scales along with data and metadata. Keeping data close to compute dramatically shortens the time to insight.
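
To make the single-node idea concrete, here is an illustrative sketch (our assumption, not the PEAK:AIO or Solidigm design) of one way an edge node can treat high-capacity NVMe as an extension of memory: a large embedding store is memory-mapped from the SSD and searched in place, block by block, so only a small working set ever occupies RAM. The file path, array shape, and scoring below are hypothetical.

```python
# Illustrative sketch only: memory-map a large embedding store on local NVMe so
# the SSD effectively extends memory on a single edge node. Path, shape, and
# dtype are hypothetical assumptions.
import numpy as np

DIM = 768                # embedding width (assumed)
N_VECTORS = 50_000_000   # far larger than RAM; resident on the SSD

# np.memmap pages vectors in on demand instead of loading the whole file.
store = np.memmap("/nvme/embeddings.f16", dtype=np.float16,
                  mode="r", shape=(N_VECTORS, DIM))

def top_k(query: np.ndarray, k: int = 5, block: int = 1_000_000):
    """Brute-force dot-product search in blocks so only one block is in RAM."""
    best_scores = np.full(k, -np.inf, dtype=np.float32)
    best_ids = np.zeros(k, dtype=np.int64)
    q = query.astype(np.float32)
    for start in range(0, N_VECTORS, block):
        chunk = store[start:start + block].astype(np.float32)
        scores = chunk @ q
        idx = np.argpartition(scores, -k)[-k:]
        # Merge this block's candidates with the running best-k.
        cand_scores = np.concatenate([best_scores, scores[idx]])
        cand_ids = np.concatenate([best_ids, idx + start])
        order = np.argsort(cand_scores)[-k:]
        best_scores, best_ids = cand_scores[order], cand_ids[order]
    return best_ids, best_scores
```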

“You see all the huge deployments, the big green field data centers for AI, using very specific hardware designs to be able to bring the data as close as possible to the GPUs,” Matson said. “They’ve been building out their data centers with very high-capacity solid-state storage, to bring petabyte-level storage, very accessible at very high speeds, to the GPUs. Now, that same technology is happening in a microcosm at the edge and in the enterprise.”

For purchasers of AI systems, it’s becoming critical to ensure they get the most performance out of their systems by running them on all solid-state storage. That allows them to bring in huge amounts of data and enables incredible processing power in a small system at the edge.

The future of AI hardware

“It’s imperative that we provide solutions that are open, scalable, and at memory speed, using some of the latest and greatest technology out there to do that,” Cummings said. “That’s our goal as a company, to provide that openness, that speed, and the scale that organizations need. I think you’re going to see the economies match that as well.”

Across the overall training and inference data pipeline, and within inference itself, hardware needs will keep increasing, whether for a very high-speed SSD or a very high-capacity solution that’s power efficient.

“I would say it’s going to move even further toward very high-capacity, whether it’s a one-petabyte SSD out a couple of years from now that runs at very low power and that can basically replace four times as many hard drives, or a very high-performance product that’s almost near memory speeds,” Matson said. “You’ll see that the big GPU vendors are looking at how to define the next storage architecture, so that it can help augment, very closely, the HBM in the system. What was a general-purpose SSD in cloud computing is now bifurcating into capacity and performance. We’ll keep doing that further out in both directions over the next five or 10 years.”

The post Cracking AI’s storage bottleneck and supercharging inference at the edge appeared first on VentureBeat.
