**Dell and Nvidia Highlight 'Massive' Storage Opportunity from GenAI Data Byproducts**

 "If I give you a paragraph of text and convert it to embeddings that then get stored, the size of the embeddings is much bigger than the original text," said Manuvir Das, vice president of enterprise computing at Nvidia, in an interview with CRN. "It can be 10 times bigger. This represents a massive data and storage opportunity that people haven’t fully grasped yet."



Speaking at Dell Technologies World 2024, Dell Technologies Vice Chairman and COO Jeff Clarke likened the AI system to a body in which GPUs are the brain, networking is the heart, and storage is the lungs. "I’d argue storage is the lungs pumping the data," Clarke said at the event.


The new compute stacks are capable of consuming more data than ever before, making storage systems critical for overall performance, according to CR Howdyshell, CEO of Dell Titanium partner Advizex. He emphasized the need for high-quality storage to support these advanced systems. "We get enthralled by and begin focusing on all the sizing for compute and AI requirements. You have to make sure you’re getting ahead of the storage conversation," Howdyshell told CRN. "The opportunity is just huge."


Last August, Dell launched its Partner First for Storage program, which incentivizes core sellers to close storage deals through channel partners. The initiative has brought new customers to partners such as Advizex and VirtuIT as Dell works to capture share of the $13 billion mid-range storage market.


John Lee, CTO of VirtuIT, sees a significant opportunity in refreshing old infrastructure for SMEs, preparing them for the widespread adoption of generative AI. "Let’s get you set up so when you are ready to really start implementing generative AI, you have the infrastructure you need from a storage point of view, ready to catch up with your compute," Lee said.


Arthur Lewis, Dell’s President of Infrastructure Solutions Group, underscored the vast storage opportunity, stating, "AI workloads drive 300 times the amount of data throughput that we see in traditional compute." He predicts AI machines will require 27 quettaflops of computing power by the end of the decade.


Manuvir Das, Nvidia’s Vice President of Enterprise Computing, pointed out that AI models are now ingesting more complex forms of data and generating byproducts, such as embeddings, that demand new storage capacity. "If I give you a paragraph of text and convert it to embeddings that then get stored, the size of the embeddings is much bigger than the original text. It can be 10 times bigger. This represents a massive data and storage opportunity that people haven’t fully grasped yet," Das explained.
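
To put the quote's arithmetic in perspective, here is a minimal back-of-the-envelope sketch, not drawn from Dell or Nvidia material, comparing the size of a UTF-8 paragraph with the size of a single embedding vector. The 768-dimension float32 vector is an assumed, typical model output; real ratios depend on the embedding model, the chunking strategy, and any quantization applied.

```python
# Rough size comparison: raw UTF-8 text vs. one dense embedding vector.
# All numbers here are illustrative assumptions, not vendor figures.

paragraph = (
    "Enterprises are converting large volumes of documents, support tickets, "
    "and product manuals into vector embeddings so generative AI systems can "
    "search and reason over them. Every paragraph of source text becomes one "
    "or more dense numeric vectors that must be stored alongside the original data."
)

text_bytes = len(paragraph.encode("utf-8"))

embedding_dim = 768        # assumed embedding dimensionality
bytes_per_value = 4        # float32
embedding_bytes = embedding_dim * bytes_per_value

print(f"Original text:   {text_bytes} bytes")
print(f"One embedding:   {embedding_bytes} bytes")
print(f"Expansion ratio: {embedding_bytes / text_bytes:.1f}x")
```

With a roughly 300-byte paragraph and a single 3 KB vector, the sketch lands near the 10x figure Das cites; splitting text into multiple overlapping chunks, keeping higher-dimensional vectors, or storing several model variants pushes the multiple higher still.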


Dell is preparing for this demand with products like the software-defined PowerScale, designed with AI in mind, and the PowerScale F910, a high-performance file solution for unstructured data. The company also announced Project Lightning, a parallel file system for AI, promising performance increases of up to 20x over competitors.


As AI models become more tailored and their compute requirements shrink, the quality of attached storage becomes even more important: smaller models lean more heavily on storage for quick data reference. Howdyshell is particularly excited about Dell’s new PowerStore with 5:1 compression, noting, "In the deals we have done and in the opportunities we are quoting, typically, the customer wants to move fast with compute, and then secondary is storage. It’s a big opportunity."
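
For a sense of what a 5:1 ratio means for sizing, the short sketch below, a hypothetical illustration rather than Dell sizing guidance, converts raw capacity and an assumed data reduction ratio into effective capacity and back. Actual reduction varies widely with the data being stored.

```python
# Hypothetical capacity arithmetic for a 5:1 data reduction ratio.
# Illustrative only; real-world reduction depends on the workload.

def effective_capacity_tb(raw_tb: float, reduction_ratio: float) -> float:
    """Logical capacity available given raw capacity and a reduction ratio."""
    return raw_tb * reduction_ratio

def raw_capacity_needed_tb(logical_tb: float, reduction_ratio: float) -> float:
    """Raw capacity required to hold a given logical data set."""
    return logical_tb / reduction_ratio

ratio = 5.0  # the quoted 5:1 figure

print(effective_capacity_tb(100, ratio))    # 100 TB raw  -> 500 TB effective
print(raw_capacity_needed_tb(250, ratio))   # 250 TB data ->  50 TB raw
```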


Ultimately, as AI technology advances, the importance of robust storage solutions will only increase, driving significant opportunities for Dell and its partners.
