Storage suppliers' predictions generally assume that the supplier's own products will succeed, because they represent a view of the market seen through the supplier's own lens. No supplier is going to predict that its product strategy is misplaced or will fail.
Successful suppliers, though, will have a view of the market that matches what their customers are thinking. See if you agree with these forecasts from Backblaze (cloud storage), Lightbits (cloud block storage) and Phison (SSD controllers).
Backblaze
Nilay Patel, VP of Sales at Backblaze, predicts:
1: Cloud operations budgets get grilled. We’re at the end of a cycle that started in March 2020, when IT departments had to make work from home (WFH) a thing that could work practically overnight. There was a lot of overspending that folks just accepted. And of course, that emergency spending continued on for a couple of years. … Organizations will have to decide whether to solidify this spending as part of their everyday budget or start cost cutting in other areas.
Vendors that sell collaboration storage such as Box and Microsoft are starting to cut unlimited plans to save on their own operational costs. … The combination of cost-cutting at both ends will help start an exodus from those traditional vendors as anxious and budget-strapped organizations search for less expensive options.
2: Ransomware protection increases. Organizations have moved past panicking that cyberattacks are inevitable and are realizing that protection is neither as complicated nor as expensive as they thought it would be. … IT departments will be able to justify the investment as far less expensive than the average downtime and mitigation costs that result from a ransomware attack.
3: Easier object storage options will fuel AI innovation. Recently, Amazon introduced S3 Express One Zone, object storage designed for massive AI applications with higher speed, higher scalability, and lower latency. This will enable new applications that developers previously would not have been able to write without spending a huge amount of money, and not just for AWS customers.
Whenever new innovation comes out of AWS, ecosystems are created around it and, typically (thanks to AWS's complexity and high prices), alternative approaches from other companies bring more value to businesses. The expectation is a broader impact, where the tooling, model training, AI inference, and other AI-oriented workflows will support data stored in object storage as a matter of course. Organizations that are competing with AWS, or looking for a less expensive approach, will be able to unlock these new performance capabilities by using object storage from other providers.
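Because most object storage tooling speaks the S3 API, moving to an alternative provider is often just a matter of pointing the client at a different endpoint. The snippet below is a minimal illustrative sketch using the Python boto3 SDK; the endpoint URL, bucket name, key, and credentials are placeholders rather than values from any particular provider.

```python
# Sketch: using a standard S3 client against a non-AWS, S3-compatible
# object store. Endpoint, bucket and credentials are placeholders.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://s3.example-provider.com",  # provider's S3-compatible endpoint (placeholder)
    aws_access_key_id="APP_KEY_ID",                  # placeholder credentials
    aws_secret_access_key="APP_KEY_SECRET",
)

# Write a training artifact, then read it back; the same calls work
# against AWS S3 or any S3-compatible provider.
s3.put_object(Bucket="training-data", Key="shards/shard-0001.tar", Body=b"...")
obj = s3.get_object(Bucket="training-data", Key="shards/shard-0001.tar")
payload = obj["Body"].read()
```

The point of the sketch is simply that AI pipelines written against the S3 API are not tied to AWS; only the endpoint and credentials change.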
Lightbits
Eran Kirzner, co-founder and CEO, thinks:
1. NVMe over Fabrics (NVMe-oF) will gain more momentum as the main Tier-1 storage connectivity. Kirzner predicts NVMe-oF will continue to replace iSCSI and Fibre Channel (FC) and become the de facto standard for cloud storage technology and the underlying storage access protocol supporting modern applications with a thirst for higher performance.
Industry tech leaders like Microsoft have recognized the convergence of enterprise and modern cloud storage platforms by jumping into the mix and democratizing the NVMe protocol: at Microsoft Ignite 2023 the company announced inbox NVMe/TCP support, making it available now across its data center operating systems. (A sketch of the host-side NVMe/TCP connection flow appears after this list.)
2. The number and market size of AI cloud providers will grow exponentially. We'll see more native AI services offered by the hyperscalers and a proliferation of AI cloud service providers offering specialized services. Specialized AI cloud providers, such as Crusoe Cloud, are being launched, and the speed and scalability of their GPUs, compute, and storage will play a key role in enterprise organizations' successful AI initiatives.
3. Hybrid cloud is here to stay, enabled by software-defined cloud architectures. Hybrid cloud implementations are becoming universal, with organizations using multiple clouds to support diverse application workloads. … Business leaders with a hybrid- or multi-cloud strategy want the flexibility of moving workloads to the cloud platform that offers the best cost-efficiencies, without compromising on performance, scalability, or data services.
They want the same look, feel, and capabilities from their cloud storage across any deployment platform, whether on-premises or public cloud, plus the fast provisioning of storage resources to where and when they are needed. … They should prioritize a software-defined architecture that offers license portability across cloud platforms.
4. Data security will be a critical capability of cloud storage systems. Organizations will require detection, protection, and prediction from advanced AI-driven monitoring systems (AIOps) integrated into their cloud storage systems. Encryption at rest and in flight is table stakes for any cloud storage supplier. Many business leaders are moving away from hardware-based encryption with self-encrypting drives (SEDs) and shifting to software-based encryption to reduce storage costs and lead time, eliminate hardware vendor lock-in, and enable cloud and hybrid cloud portability. (A sketch of this software-based approach also appears after this list.)
5. Legacy storage appliances with their monolithic architectures are going the way of the dinosaur, incapable of keeping pace with performance-sensitive, cloud-native applications or enabling organizations’ cloud-first, hybrid cloud strategies. Public cloud usage is ubiquitous; cloud storage is no longer a barrier to migrating legacy performance-sensitive applications. In tandem, Kirzner sees many organizations building cloud-native infrastructure within their on-premises data centers and the continued proliferation of CSPs offering specialized platforms.
The common thread for successfully building a cloud service is a modern storage system that is software-defined and NVMe-based. This combination delivers fast, simple storage provisioning, with the flexibility to move storage services where and when they are needed, as well as lower storage TCO.
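To make item 1 above more concrete, here is a minimal sketch of what the host-side NVMe/TCP flow looks like on Linux, wrapping the standard nvme-cli tool from Python. The target address and subsystem NQN are placeholders, it assumes nvme-cli is installed and run with root privileges, and it is an illustration rather than any vendor's documented procedure.

```python
# Sketch: discover and connect to an NVMe/TCP target with nvme-cli.
import subprocess

TARGET_ADDR = "192.0.2.10"                      # placeholder target IP
TARGET_PORT = "4420"                            # conventional NVMe/TCP port
SUBSYS_NQN = "nqn.2024-01.example:subsystem1"   # placeholder subsystem NQN

def run(cmd):
    """Run a command and return its stdout, raising on failure."""
    return subprocess.run(cmd, check=True, capture_output=True, text=True).stdout

# 1. Discover the subsystems the target exports over TCP.
print(run(["nvme", "discover", "-t", "tcp", "-a", TARGET_ADDR, "-s", TARGET_PORT]))

# 2. Connect to one subsystem; its namespaces then appear as local
#    /dev/nvmeXnY block devices, with no iSCSI or FC stack involved.
run(["nvme", "connect", "-t", "tcp", "-a", TARGET_ADDR, "-s", TARGET_PORT, "-n", SUBSYS_NQN])
```

The appeal of NVMe/TCP is visible even in this toy example: it runs over ordinary Ethernet and standard OS tooling, with no dedicated FC fabric or iSCSI initiator configuration.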
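And as a rough illustration of the software-based encryption mentioned in item 4, the sketch below encrypts data in the application layer with AES-GCM (using the third-party Python cryptography package) before it ever reaches a drive or bucket. Key handling is deliberately simplified; a real deployment would fetch keys from a KMS rather than generating them inline.

```python
# Sketch: application-level encryption, independent of self-encrypting drives.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # in practice, retrieved from a KMS
aesgcm = AESGCM(key)

plaintext = b"customer record"
nonce = os.urandom(12)                      # must be unique per encryption
ciphertext = aesgcm.encrypt(nonce, plaintext, None)

# Store nonce + ciphertext on any backend (local NVMe, object storage, ...);
# decryption works wherever the key is available, regardless of the hardware.
assert aesgcm.decrypt(nonce, ciphertext, None) == plaintext
```

Because the data is already ciphertext when it hits storage, the approach is portable across drives, clouds, and hybrid deployments, which is the lock-in argument Kirzner makes.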
Phison
SSD controller supplier Phison’s 2024 predictions:
- SSD, GPU, DRAM and other essential data center components will increasingly include device-level cryptographic identification, attestation, and data encryption to help better guard data against attack as AI deployments expose new digital threats.
- Private, on-premises deployment of LLM infrastructure will let organizations run AI model training on proprietary data without exposure to cloud security vulnerabilities.
- Ultra-rapid advancements in AI and LLMs will challenge AI infrastructure reliance on GPU and DRAM, resulting in new approaches to architecture that take greater advantage of high-capacity NAND flash.
- In these systems, PCIe 5.0 NAND flash will gain wider adoption to power applications in production environments at top speed and efficiency, freeing GPU and DRAM to separately run AI inference models, maximizing resource efficiency and productivity.
- Private LLMs will focus initially on essential activities that are not held to strict time-to-market deadlines, such as improved chatbot interactions for professionals and incremental advancements for patented products.
- As these private deployments accrue positive results, applications will be adapted for adjacent operations and procedures, furthering the proliferation of these everyday infrastructural solutions for AI.
Dr. Wei Lin, CTO, Phison HQ, Head of Phison AI R&D and Assistant Professor at the College of Artificial Intelligence, National Yang Ming Chiao Tung University, said in a statement: “As critical infrastructure evolves to support rapid advancements in AI, NAND flash storage solutions will take a central role, enabling greater architectural balance against GPU and DRAM for balanced systems built to maximize the benefits of ongoing, long-term AI deployment.”