NetApp has been providing its storage and data management systems for AI work since 2018 or earlier, and says it is well positioned as large language model workloads transition to general, across-the-board enterprise use.
The company may be looking to correct a perception that it is lagging in the AI field, because it cannot point to marquee AI engagements such as those promoted by, for example, DDN and VAST Data, nor to product changes like those promoted by Dell Technologies and HPE.
But the company told financial analysts at its Insight 2023 conference that, with the surge in AI interest, it is focused on the full lifecycle of AI purchases, not just the speeds and feeds of pushing data to GPUs. It is focusing on where data is generated, how it is organized, unified, and prepared, and how it is then fed into GPUs for training – a very important part of the lifecycle, but only one small piece of it.
On that point, it said it had introduced ONTAP support for Nvidia GPUDirect Storage (GDS) in April.
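For readers unfamiliar with GDS: it lets an application move file data from storage directly into GPU memory, bypassing the host CPU bounce buffer. The sketch below shows what that path looks like from the application side using Nvidia's cuFile API; the NFS mount path and transfer size are illustrative assumptions, not details from NetApp's announcement.

```c
// Minimal GPUDirect Storage read via Nvidia's cuFile API (illustrative sketch).
// The NFS path below is a hypothetical ONTAP mount, not from NetApp's announcement.
#include <cufile.h>
#include <cuda_runtime.h>
#include <fcntl.h>
#include <unistd.h>

int main(void) {
    const size_t size = 1 << 20;                      // 1 MiB transfer, illustrative
    int fd = open("/mnt/ontap_nfs/shard.bin", O_RDONLY | O_DIRECT);

    cuFileDriverOpen();                               // initialize the GDS driver

    CUfileDescr_t descr = {0};
    descr.handle.fd = fd;
    descr.type = CU_FILE_HANDLE_TYPE_OPAQUE_FD;
    CUfileHandle_t handle;
    cuFileHandleRegister(&handle, &descr);            // register the file for GDS I/O

    void *dev_buf = NULL;
    cudaMalloc(&dev_buf, size);
    cuFileBufRegister(dev_buf, size, 0);              // pin the GPU buffer for DMA

    // Data lands in GPU memory directly, with no staging copy through host RAM.
    ssize_t n = cuFileRead(handle, dev_buf, size, 0 /*file offset*/, 0 /*buffer offset*/);

    cuFileBufDeregister(dev_buf);
    cuFileHandleDeregister(handle);
    cuFileDriverClose();
    cudaFree(dev_buf);
    close(fd);
    return n < 0;
}
```

On systems without a GDS-capable path, the cuFile library can fall back to a compatibility mode that stages reads through host memory, so the same calls work either way.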
NetApp is also looking at what happens to data after that, when results are produced and the AI model is validated. It sees the entirety of the AI data lifecycle, rather than training alone, as what will drive revenue for NetApp.
CEO George Kurian said customers need “an intelligent data infrastructure that combines hybrid multi-cloud data storage with integrated data services and AI powered cloud operations, monitoring, optimization, and automation.” He claimed: “AI is powered by data and data runs on NetApp,” because many enterprises are NetApp customers. NetApp thinks customers are sitting on data whose value they cannot unlock without AI, and that it can help them do so.
Director of Product Management Russ Fishman said: “AI isn’t really focused on any one particular deployment methodology, so it’s not really about on-prem, it’s not really about cloud, it’s really about all of it. And NetApp has this very unique position in the market, which is that we have a leading storage operating system, which is available on all the clouds and on-prem.”
NetApp said it is working with partners such as Nvidia, with whom it has co-developed and brought to market five or six unique solutions that have been adopted by hundreds of customers.
Andy Sayare, director of Global Strategic Alliances for AI, said: “We paired our storage technology with [Nvidia] server technology, the DGX platform, and their Mellanox switching … So together that made what we call ONTAP AI, which is essentially our operating system managing AI workloads in this converged infrastructure stack.” Some customers don’t want this on-premises. “One alternative is what’s now called DGX Cloud that Nvidia offers. It’s currently available in OCI, but it will be moving to Azure and GCP and other clouds in coming months.”
He added: “We work with Nvidia on their SuperPOD, which is of course large-scale AI training, often focused on large language models and other very large training situations.”
Other partners include Domino Data Lab (MLOps), Run.AI, and computer system vendors Cisco, Lenovo, and Fujitsu.
The NetApp pitch is that virtually all of its products are AI ready and that it has hundreds of customers using NetApp in AI workloads already.
Wells Fargo analyst Aaron Rakers asked: “Does the DGX Cloud Solution utilize NetApp ONTAP AI as [its] primary storage backend versus other alternatives?”
Sayare replied: “So currently, no … DGX cloud is leveraging an alternative storage for its scratch space. So this is the area that the GPUs use as extra space from their internal storage. What NetApp is doing is working adjacent to that to help connect the data from multiple sources, multiple clouds and on-prem, to be able to bring that data together, to be able to train those models.”
He explained: “Scratch space in general is just that – it’s really the last mile, right? It’s ephemeral data, which means that it’s not stored long term, it’s not protected. The data management capabilities are generally not included there. What we’re finding from customers is that that is not [enough] – that’s not a complete solution.”
Evercore analyst Irvin Liu asked: “Do you see a share gain potential or opportunity presented by AI, or are most organizations going to stick with their incumbent vendors and avoid a major upgrade or a major transformation prior to jumping into the AI journey?”
Sayare said: “I think we’re extremely well positioned … We’ve been building for AI for five years, five and a half years. So I think we have a portfolio that makes us competitive in non-traditional NetApp and non-NetApp customers … I think there’s always the opportunity.”
NetApp is not actually predicting it will make share gains through AI.
Comment
Both HPE and Dell are making big marketing pushes around AI. NetApp has been more muted in its approach, saying its Data Fabric – spanning the on-premises world, co-lo services with Equinix, and first-party ONTAP-based AWS, Azure, and GCP offerings – enables its customers to access and use data anywhere in their enterprises’ reach, and to do so better than any other supplier.
This is an incumbent’s pitch, and analysts recognized this.
Rakers told subscribers: “NetApp … emphasized that [it] is increasingly pulled into discussions following GPU purchases as customers look to maintain high GPU compute utilization. AI is expected to drive more capacity-optimized flash solutions and consumption of its data management software.”
William Blair’s Jason Ader noted: “On the AI side, NetApp sees the ongoing hype around generative AI as highlighting the importance of enterprise data management (particularly related to unstructured data).” He told subscribers: “Management sees more of a revenue opportunity on the data management/inference side for generative AI models, since inferencing accounts for 85 percent of run time with generative AI models (though not necessarily 85 percent of the spending).”
He pointed out: “Within AI, there is competition ramping up from high-performance storage vendors like Pure and VAST Data.”
In effect, having made its product offerings AI-ready, NetApp is talking to its existing customers rather than treating AI as an opportunity to gain new ones.