Nutanix provides cloud-native AI stack

Nutanix has built Nutanix Enterprise AI (NAI), a cloud-native software stack that it says can support the deployment of generative AI (GenAI) apps “in minutes, not days or weeks” by helping customers deploy, run, and scale inference endpoints for large language models (LLMs).

NAI can run on-premises, at the edge, or in datacenters, and on the three main public clouds’ Kubernetes offerings – AWS EKS, Azure AKS, and Google GKE – as well as other Kubernetes runtime environments. This multi-cloud operating software can run LLMs using Nvidia NIM-optimized inference microservices as well as open source foundation models from Hugging Face. The LLMs operate atop the NAI platform and can access NAI-stored data.
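For a sense of what “inference endpoints” means in practice: Nvidia NIM microservices expose an OpenAI-compatible HTTP API, so an application would typically call a deployed model over REST. The sketch below is purely illustrative – the endpoint URL, API key, and model name are assumptions, not Nutanix-documented values or its actual client workflow.

```python
import requests

# Illustrative placeholders, not Nutanix-documented values.
ENDPOINT = "https://nai.example.internal/v1/chat/completions"  # hypothetical NAI-hosted endpoint
API_KEY = "replace-with-endpoint-api-key"                      # issued by the platform admin

# NIM-style endpoints follow the OpenAI chat completions request shape.
payload = {
    "model": "meta/llama-3.1-8b-instruct",  # example model name; actual names depend on what is deployed
    "messages": [
        {"role": "user", "content": "Summarize last quarter's customer feedback themes."}
    ],
    "max_tokens": 256,
    "temperature": 0.2,
}

resp = requests.post(
    ENDPOINT,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```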

Thomas Cornely, Nutanix

Thomas Cornely, SVP for Product Management at Nutanix, stated: “With Nutanix Enterprise AI, we’re helping our customers simply and securely run GenAI applications on-premises or in public clouds. Nutanix Enterprise AI can run on any Kubernetes platform and allows their AI applications to run in their secure location, with a predictable cost model.”

NAI is a component of Nutanix GPT-in-a-Box 2.0, which also includes Nutanix Cloud Infrastructure, Nutanix Kubernetes Platform, and Nutanix Unified Storage, plus services to support customer configuration and sizing needs for on-premises training and inferencing.

Nutanix sees AI training – particularly large-scale generalized LLM training – taking place in specialized mass GPU server facilities. New GenAI apps are often built in the public cloud, with fine-tuning of models using private data occurring on-premises. Inferencing is deployed closest to the business logic, which could be at the edge, in datacenters, or in the public cloud. NAI supports these inferencing workloads and app locations.

Nutanix says NAI has a transparent and predictable pricing model based on infrastructure resources, in contrast to most cloud services, which come with complex metering and unpredictable usage-based pricing.

Nutanix Enterprise AI deployment scenarios

Usability and security both play a role here. NAI provides a dashboard for troubleshooting, observability, and monitoring utilization of the resources backing LLMs, along with role-based access controls (RBAC) so that access to LLMs can be governed and audited. Organizations requiring hardened security can also deploy in air-gapped or dark-site environments.

Nutanix suggests NAI can be used to enhance customer experience with GenAI through improved analysis of customer feedback and documents. It can accelerate code and content creation using copilots and intelligent document processing, as well as fine-tuning of models on domain-specific data. It can also strengthen security by applying AI models to fraud detection, threat detection, alert enrichment, and automatic policy creation.

We see NAI presented as an LLM fine-tuning and inference alternative to offerings from Microsoft, Red Hat, and VMware that should appeal to Nutanix’s 25,000-plus customers. Coincidentally, Red Hat has just announced updates to its enterprise offerings including OpenShift AI, OpenShift, and Developer Hub:

  • Developer Hub – Includes tools for leveraging AI to build smarter applications, including new software templates and expanded catalog options.
  • OpenShift AI 2.15 – Offers enhanced flexibility, optimization, and tracking, empowering businesses to accelerate AI/ML innovation and maintain secure operations at scale across cloud and edge environments.
  • OpenShift 4.17 – Streamlines application development and integrates new security features, helping businesses to tackle complex challenges.

On top of that, Red Hat has signed a definitive agreement to acquire Neural Magic – a pioneer in software and algorithms that accelerate GenAI inference workloads.

SUSE also presented SUSE AI, described as “a secure, trusted platform to deploy and run GenAI applications,” at KubeCon North America.

NAI and GPT-in-a-Box 2.0 are currently available to customers. More information here.