
Data protection company pays $47M to stick Clumio in its Commvault

Data protection company Commvault is buying AWS cloud data protector Clumio for $47 million – significantly less than Clumio’s total funding.

Clumio was founded in 2017 and provides SaaS data protection services for Amazon’s S3, EC2, EBS, RDS, SQL on EC2, DynamoDB, VMware on AWS, and Microsoft 365, storing its backups in virtual air-gapped AWS repositories. Its CEO is Rick Underwood, who has held the role for three months since his promotion from CRO in June. Co-founder and board chair Poojan Kumar was the previous CEO.

Sanjay Mirchandani, Commvault

Commvault CEO Sanjay Mirchandani said: “Combining Commvault’s industry-leading cyber resilience capabilities with Clumio’s exceptional talent, technology, and AWS expertise advances our recovery offerings, strengthens our platform, and reinforces our position as a leading SaaS provider for cyber resilience.”

Kumar added: “At Clumio, our vision was to build a platform that could scale quickly to protect the world’s largest and most complex data sets, including data lakes, warehouses, and other business-critical data. Joining hands with Commvault allows us to get our cloud-native offerings to AWS customers on a global scale.” 

Clumio has raised $262 million in funding, most recently a $75 million D-round in February this year, four months before Underwood’s promotion. A Commvault SEC filing puts the acquisition price at approximately $47 million – less than that February D-round alone, and a fraction of Clumio’s total funding.

Clumio customers include Atlassian, Cox Automotive, Duolingo, and LexisNexis. Clumio says it has experienced a 4x year-over-year growth in annual recurring revenue (ARR).

Poojan Kumar, Clumio

Commvault historically focused on on-premises data protection before moving into SaaS-based protection with its Metallic offering, launched in 2019. Metallic has become a revenue growth driver for Commvault, along with its cyber-resilience product features, and the company reached a quarterly revenue high of $224.7 million in the second quarter of 2024.

Clumio appointed Carol Hague as VP of Marketing in July as Rick Underwood set about growing the company.

A period of consolidation is underway in the data protection world. It started with Cohesity buying Veritas in February, and now Commvault has bought Clumio.

The asset acquisition, as Commvault calls it, is expected to close in early October 2024, and be immediately accretive to ARR and revenue, and accretive to free cash flow within the next three quarters. The $47 million cost will be funded with cash on hand. This acquisition follows Commvault’s purchase of cloud resiliency supplier Appranix in April this year.

The “asset acquisition” term indicates that Commvault is not buying the entire Clumio company and its liabilities, only specific assets, which have not been identified.

Commvault has reiterated its fiscal second quarter 2025 earnings guidance previously announced on July 30, 2024. The company reported fiscal 2024 revenues of $839.2 million. In its latest quarter it reported cash and cash equivalents of $287.9 million. Its FY Q2 period ends in September and the earnings guidance is for revenues of $220 million +/-$2 million. Clumio won’t contribute much to this; the quarter is nearly over. 

Kumar founded Clumio with CTO Woon Ho Jung and engineering VP Kaustubh Patil. The three previously founded PernixData, which Nutanix bought in August 2016 for 528,517 shares and $1.2 million in cash.

Huawei unveils next-gen OceanStor Dorado all-flash array

Huawei has launched a seventh generation of all-flash OceanStor Dorado storage array products to support mission-critical AI workloads with more performance and resilience.

There are three Dorado product groups: the high-end 8000 and 18000, the mid-range 5000 and 6000, and the entry-level 3000. The v7 Dorado 18000 has three times more performance, Huawei says, than the previous generation, helped by CPU offload hardware including DPU-based SmartNICs separating data flows from control flows. An upgraded FLASHLINK intelligent collaboration algorithm between disk controllers and DPUs enables over 100 million IOPS and “extremely low, 0.03 ms latency” from an up to 32-controller system with 500 PB of capacity.

More offload engines are used for deduplication and compression, and ransomware detection. 

The Dorado features 99.99999 percent single system reliability. Its SmartMatrix full-mesh architecture tolerates multi-layer faults, including those of controller enclosures, disk enclosures, and rack cabinets. It can survive the failure of up to seven out of eight controller enclosures without service interruption. Huawei says it supports ransomware protection for SAN and NAS across all zones, achieving a claimed ransomware detection rate of up to 99.99 percent. Intelligent snapshot correlation analysis and snapshot synthesis technologies ensure 100 percent data availability after recovery. Integrated IO-level continuous data protection means data can be recovered to any point in time.
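IO-level continuous data protection of this kind is conceptually an append-only journal of timestamped writes that can be replayed up to any moment. A minimal sketch of the idea (an illustration only, not Huawei's implementation):

```python
class CDPJournal:
    """Toy IO-level continuous data protection journal.

    Every write is recorded with a timestamp; recovery replays all
    writes up to (and including) the requested point in time.
    """

    def __init__(self):
        self._entries = []  # (timestamp, block_address, data), ascending time

    def record_write(self, ts, block, data):
        # Entries must arrive in time order for replay to be a simple scan.
        assert not self._entries or ts >= self._entries[-1][0]
        self._entries.append((ts, block, data))

    def recover_to(self, ts):
        """Rebuild the block map as it existed at time `ts`."""
        state = {}
        for t, block, data in self._entries:
            if t > ts:
                break
            state[block] = data  # later writes overwrite earlier ones
        return state
```

Real CDP engines add journal truncation, consistency markers, and crash-safe persistence; the replay-to-a-timestamp principle is the same.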

Huawei OceanStor Dorado

Huawei says it has a unique active-active system for SAN, NAS, and S3 workloads providing load-balancing and failover.

A Data Management Engine (DME) allows for dialog-based O&M and can proactively detect exceptions through AI large language model technologies, improving O&M efficiency five-fold. It provides 30 percent higher IOPS per watt and TB per watt compared to the best unnamed competitor product, according to a Datacenter Dynamics report.

Huawei says the Dorado 18000 has a native parallel architecture for blocks, files, and objects. It can be placed in existing v6 clusters and will be compatible with future v8 clusters.

Yang Chaobin, Huawei’s president of ICT products and solutions, said at the Huawei Connect 2024 Shanghai event: “We have a lot of v6 customers that have been looking forward to seeing our next generation. Because of the American sanctions, we have a lot of limitations politically, so now we are gradually trying to recover from a lot of those difficulties, and our customers are looking for that.”

Huawei has more info on next-gen OceanStor Dorado All-Flash Storage here.

The role of data governance in AI

COMMISSIONED: Robust data governance is key to ethical, compliant, and efficient AI projects. Here’s why the balance between innovation and responsibility is delicate, but crucial.

In a bustling New York office, a data scientist named Emily is racing against the clock. Her team is developing an AI algorithm intended to revolutionize personalized customer experiences. The project is ambitious and promising, with the potential to drive unprecedented business growth. However, Emily has one lingering concern: data governance. Despite her excitement, she knows that without robust data governance, the project could face ethical dilemmas, compliance issues, and even data breaches. Emily’s story is not unique; it’s a reflection of the broader challenges faced by organizations today as they balance the pursuit of innovation with the responsibility of data stewardship.

Artificial Intelligence (AI) has become the cornerstone of modern innovation, driving advancements in various fields such as healthcare, finance, and entertainment. AI’s ability to process and analyze massive amounts of data allows businesses to uncover insights and make decisions that were previously unimaginable. Yet, with great power comes great responsibility. The same data that fuels AI’s capabilities also poses significant challenges in terms of governance, privacy, and ethical use.

Data governance is the framework that ensures data is managed properly throughout its lifecycle. It involves policies, procedures, and technologies that maintain data quality, security, and compliance. For AI to be truly transformative, organizations must prioritize data governance as much as they prioritize AI development.

The Importance of data governance in AI

As organizations increasingly adopt AI technologies, the need for strong data governance becomes essential. Robust data governance ensures that AI systems are not only efficient and accurate but also aligned with legal and ethical standards. Here are four crucial aspects through which data governance enhances AI projects:

– 1) Ensuring data quality: AI algorithms are only as good as the data on which they are trained. Poor-quality data leads to inaccurate models, which can result in flawed business decisions. Data governance ensures that data is accurate, complete, and reliable, providing a solid foundation for AI initiatives.

– 2) Compliance and privacy: With stringent regulations like GDPR and CCPA, compliance is a critical aspect of data governance. AI projects must adhere to these regulations to avoid hefty fines and legal repercussions. Data governance frameworks help organizations manage consent, anonymize data, and implement robust security measures to protect sensitive information.

– 3) Ethical AI: As AI systems become more integrated into decision-making processes, ensuring ethical use of data is paramount. Data governance provides guidelines to prevent biases, ensure fairness, and maintain transparency in AI algorithms. This not only builds trust with customers but also mitigates risks associated with unethical AI practices.

– 4) Operational efficiency: Effective data governance streamlines data management processes, reducing redundancy and improving efficiency. This enables data scientists and analysts to focus on extracting value from data rather than dealing with data quality issues or compliance roadblocks.
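The anonymization mentioned in point 2 is often implemented as keyed pseudonymization, which replaces direct identifiers with stable tokens so records remain joinable for analytics while identities stay protected. A minimal sketch (the key and field names are illustrative, not from any specific product):

```python
import hashlib
import hmac

SECRET_KEY = b"rotate-me-regularly"  # illustrative; keep real keys in a secrets manager
PII_FIELDS = {"name", "email"}       # illustrative list of direct identifiers

def pseudonymize(record, key=SECRET_KEY, pii_fields=PII_FIELDS):
    """Replace direct identifiers with keyed hashes.

    The same input always maps to the same token, so records can still
    be joined for analytics, but identities cannot be recovered
    without the key.
    """
    out = {}
    for field, value in record.items():
        if field in pii_fields:
            digest = hmac.new(key, str(value).encode(), hashlib.sha256)
            out[field] = digest.hexdigest()[:16]
        else:
            out[field] = value
    return out
```

Note that pseudonymized data is still personal data under GDPR if the key exists; full anonymization additionally requires removing indirect identifiers.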

PowerScale is a storage solution designed to handle massive amounts of unstructured data, making it an ideal fit for AI applications. It is also a prime example of how technology can drive and reinforce strong data governance practices, with features such as:

Scalability and performance

Achieving operational efficiency includes maximizing scalability and performance. PowerScale is designed to scale seamlessly to meet the expanding data demands of AI applications while maintaining top-tier performance. Based on internal testing that compared the streaming write of the PowerScale F910 running the OneFS 9.8 distributed file system against the PowerScale F900 running OneFS 9.5, the new F910 delivers faster time to AI insights with up to 127 percent improved streaming performance (actual results may vary). It accelerates the model checkpointing and training phases of the AI pipeline, keeping GPUs fully utilized with up to 300 PB of storage per cluster. This ensures uninterrupted model training and prevents GPU idling, effectively accelerating the AI pipeline.

Additionally, PowerScale supports GPU Direct and RDMA (Remote Direct Memory Access) technologies, further optimizing data transfer between storage and GPUs. GPU Direct enables direct communication between GPUs and the storage system, bypassing the CPU, which reduces latency and improves throughput. RDMA enhances this by allowing data to be transferred directly between storage and GPU memory over the network, minimizing CPU involvement and further reducing bottlenecks. Together, these technologies ensure that large datasets are managed efficiently, and that data remains accessible and manageable, fostering high-quality AI development on our AI-ready data platform.

Data security and compliance

PowerScale integrates advanced security features, including encryption, access controls, and audit trails, to protect sensitive data and ensure regulatory compliance. With federal-grade embedded security and real-time API-integrated ransomware detection, it safeguards the entire AI process from attacks and protects your intellectual property from unauthorized access.

PowerScale also supports air-gapped environments, providing an extra layer of security by isolating critical systems from unsecured networks. This ensures that your most sensitive data is kept out of reach from external threats, significantly reducing the risk of cyberattacks. The air-gapped configuration is particularly crucial for industries with stringent compliance requirements, such as finance, healthcare, and government, where the integrity and confidentiality of data are paramount. By combining air-gapped protection with comprehensive security measures, PowerScale offers a robust solution that meets the highest standards of data security and regulatory compliance.

Data lifecycle management

PowerScale provides tools for managing data throughout its entire lifecycle, from creation to archiving, ensuring that data is treated according to governance policies at every stage. This includes not just storage, but also classification, retention, and deletion, which helps organizations maintain compliance with regulatory requirements. By automating these processes, PowerScale reduces the risk of human error, ensuring that data governance is applied consistently. Furthermore, it supports tiering strategies, allowing organizations to move less frequently used data to lower-cost storage while keeping critical data accessible, optimizing both cost and performance as AI workloads evolve.

Flexibility and integration

PowerScale offers the flexibility to build your infrastructure when, where, and how you need it. Its variety of node types and software services enable right-sizing and scaling of infrastructure to match diverse workload requirements. Additionally, PowerScale seamlessly integrates with existing data management tools and workflows, including Hadoop Distributed File System (HDFS), NFS, and SMB protocols. For AI-driven workflows, it supports popular data pipeline tools like Apache Spark and TensorFlow. This broad integration capability makes it easy to fit PowerScale into existing environments, allowing data teams to leverage their current tools while gaining the scalability and performance advantages PowerScale offers.

The balance between innovation and responsibility is delicate but crucial. Organizations must foster a culture that values data governance as much as technological advancement. This involves:

– 1) Leadership commitment: Leaders must prioritize data governance and allocate resources to develop and maintain robust frameworks. This commitment sets the tone for the entire organization and emphasizes the importance of responsible data management.

– 2) Cross-functional collaboration: Data governance is not solely the responsibility of IT departments. It requires collaboration across all functions, including legal, compliance, and business units. This ensures that data governance policies are comprehensive and aligned with organizational goals.

– 3) Continuous improvement: Data governance is an ongoing process that must evolve with changing regulations, technologies, and business needs. Regular reviews and updates to governance policies ensure that they remain effective and relevant.

The journey of balancing innovation and responsibility is ongoing. As AI continues to evolve and integrate into various aspects of our lives, the role of data governance becomes increasingly critical. PowerScale exemplifies how technological solutions can support this balance, providing the tools necessary to manage data effectively and responsibly.

Ultimately, it’s not just about what AI can achieve, but how it’s implemented. Organizations prioritizing data governance will be better positioned to leverage AI’s full potential while maintaining the trust and confidence of their stakeholders. Just like the example given of Emily, businesses must recognize that innovation and responsibility go hand in hand, ensuring a future where AI advancements are achieved with integrity and accountability.

Learn how Dell solutions can help you transform with AI.

Brought to you by Dell Technologies.

Pure Storage extends FlashBlade with file services, capacity boost

Pure Storage is providing a set of fleet and cluster level FlashBlade file services that make it easier to operate and manage large-scale file environments in a cloud-like way, plus an entry-level FlashBlade//S100 system, and doubled Direct Flash Module (DFM) maximum capacity to 150 TB.

FlashBlade is Pure’s all-flash, unified file and object storage system, operated by the Purity OS. Pure is adding two software capabilities for multiple storage classes – Fusion for Files and Zero Move Tiering – plus Real-time Enterprise File services with always-on multi-protocol support, auditing, QoS, SMT for file, and the AI Copilot. The company is also adding universal (cross-product) licensing credits and a VM Assessment service to help optimize Pure customers’ on-prem and public cloud storage for virtual machine (VM) workloads in light of Broadcom’s changes to VMware licensing conditions and costs. Overall, it claims that legacy file storage is restricting customers’ ability to meet modern file workload requirements.

Chief product officer Ajay Singh said in a statement: “For years, outdated and rigid legacy file storage has done customers a huge disservice by holding them back and forcing them towards frequent technology refresh cycles. Through the Pure Storage platform, Real-time Enterprise File along with the new VM assessment and Universal Credits empowers our customers to navigate today’s fast-moving, complex business environment with confidence.”

Pure graphic

Pure says its Real-time Enterprise File offering enables file services to dynamically change, adapt, and reconfigure in real time. With Fusion, there are now fleet-level global storage pools – unlimited storage pools without any fixed allocation. Pure arrays can join a global storage pool while keeping data in place.

The Purity OS supports both SMB and NFS file access protocols by default, and logs write requests for subsequent auditing once logging is configured. It solves noisy-neighbor workload problems with always-on quality of service that prevents any one workload from using excessive resources and bottlenecking performance.
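Always-on QoS of this sort typically rests on a per-workload rate limiter such as a token bucket, which caps a noisy neighbor's sustained IO rate while still allowing short bursts. A toy sketch of the mechanism (an illustration of the concept, not Pure's implementation):

```python
class TokenBucket:
    """Toy per-workload rate limiter, illustrating how always-on QoS
    can cap a noisy neighbor without starving it entirely."""

    def __init__(self, rate_iops, burst):
        self.rate = rate_iops   # tokens refilled per second
        self.capacity = burst   # maximum burst size
        self.tokens = burst
        self.last = 0.0

    def allow(self, now):
        """Return True if one IO may proceed at time `now` (seconds)."""
        # Refill tokens in proportion to elapsed time, capped at burst size.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False  # IO is queued/throttled instead of hogging the array
```

One bucket per workload means a misbehaving tenant exhausts only its own tokens, leaving headroom for everyone else.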

Purity can now virtualize a FlashBlade system into so-called servers. These, Pure claims, “provide data access isolation for file workloads enabling customers to service multiple untrusted domains, present multiple share or export namespaces, or share data between different untrusted domains.”

The Zero Move Tiering idea enables different storage performance classes by having a single QLC DFM storage base and applying faster or slower compute and network resources to data accesses for hot or cold data respectively. It’s quasi-storage tiering in that data does not move between storage tiers, as is usually the case, to save costs by putting cold data on slower and cheaper disk tiers. Instead compute performance classes provide tiered levels of storage access.

Pure graphic

The FlashBlade//S100 fits under the existing performance-centric FlashBlade//S200 and capacity-centric FlashBlade//S500 systems as an entry-level product. Both the //S200 and //S500 have one to four DFMs with QLC (4bits/cell) flash media and up to 3,000 TB (3 PB) raw capacity. There are up to four DFMs per blade carrier, and systems start with seven blades, scaling up to ten blades in a chassis. There can then be up to ten chassis in a cluster.

The //S100 has four QLC-based DFMs (18/37/75 TB) per blade and starts at 126 TB with seven blades, each carrying a single 18 TB DFM. It can scale up to 3 PB of raw capacity. At launch Pure will support 37 TB DFMs, with 18 TB and 75 TB DFMs to follow. The //S100 can be non-disruptively upgraded to either the //S200 or //S500.

By introducing its 150 TB DFMs, Pure has doubled the maximum raw capacity of its FlashBlade systems. They can now support up to 4 PB in a 3RU chassis and 6 PB (40 x 150 TB DFMs) in a 5RU chassis, and 60 PB in a cluster. Pure customers can non-disruptively double the capacity of their current 75 TB DFM FlashBlade deployments. The 150 TB DFMs are more than twice the capacity of current 61.44 TB SSDs, such as those supplied by Solidigm, and will enable Pure’s arrays to occupy less datacenter space than competing all-flash arrays using off-the-shelf SSDs.

The Universal Credits scheme lets customers purchase a pool of credits and use them across various services without being locked into specific subscriptions, while providing predictable billing. Customers can gain volume discounts by purchasing Universal Credits and applying them across the Evergreen//One consumption-based storage service model, Pure Cloud Block Store, and Portworx services.

The VM Assessment service is included in a Pure1 cloud-based management and monitoring platform subscription. It provides VM performance monitoring, utilization, and rightsizing recommendations with potential subscription impacts for scenario planning.

Shawn Hansen, Pure Storage

The already-announced AI Copilot is presented here as a new way to manage file services using natural language. Users get a quick, comprehensive view of file services, from performance to capacity, can pinpoint specific user activity, and receive proactive recommendations to optimize their environment before issues arise.

We asked Shawn Hansen, general manager of Pure’s core platforms unit, if he sees a time when the differences between the FlashArray and FlashBlade on-premises product wither away as they effectively become a single storage resource. “We see that time emerging very quickly,” he said. “Basically, customers will zoom out and they’ll see an SLA, and they’ll see a class of business processes, and then they’ll just change the SLA of what they need, and we will deploy the node. The node will be FlashArray or FlashBlade, but the customer won’t care. That’s the vision that we have. Exactly as you said. In the end, you’re deploying a cloud service. You don’t care about what’s underneath the covers.”

A blog, Leave Legacy Behind with the Pure Storage Platform, provides background information.

Universal Credits are available now. FlashBlade//S100, FlashBlade Zero Move Tiering, and VM Assessment will be generally available in the fourth quarter of Pure’s fiscal 2025, meaning by the end of January. The 150 TB DFMs should arrive by the end of December this year.

Gartner moves Magic Quadrant goalposts for primary storage

Gartner has updated its primary storage platform Magic Quadrant ratings, resulting in three of last year’s Leaders being displaced into the Challengers box.

Update: Huawei comment added – 24 September 2024

Gartner analysts Jeff Vogel and Chandra Mukhyala have redefined a primary storage platform (PSP) as providing “standardized enterprise storage products, along with platform-native service capabilities to support structured data applications. PSP products like primary enterprise storage arrays provide mandatory and common enterprise-class primary storage features and capabilities needed to support the platform. Platform-native services like storage as a service (STaaS) and ransomware protection, with PSP product capabilities, are required to support platform-native services.”

They say the PSP market has evolved “in conjunction with the demand for hybrid, multi-domain platform-native storage services, extending on-premises services to public cloud, edge, and colocation environments.” In effect, it’s no longer enough to provide on-premises block storage array hardware and software. That software has to run in the main public clouds, provide a cloud consumption model and cyber-resiliency in the cloud, on-premises, and in hybrid environments.

Their strategic planning assumptions are:

  • By 2027, consumption-based platform SLA guarantees will replace over 50 percent of product feature requirements in storage selection decisions, up from less than 5 percent in 2024.
  • By 2028, consumption-based storage as a service (STaaS) will replace over 33 percent of enterprise storage capital expenditure (capex), up from less than 15 percent in 2024.
  • By 2028, more than two-thirds of critical application primary storage infrastructure will employ cyber liability detection and protection capabilities, up from less than 5 percent in 2024.

Here’s the 2024 PSP Magic Quadrant (MQ) diagram:

Because of this redefinition, the suppliers in 2023 MQ edition’s Leaders box, shown below, have mostly received lower ratings on the horizontal Completeness of Vision axis and moved to the left, with three – Hitachi Vantara, Huawei, and Infinidat – crossing the Leaders box boundary to become Challengers.

Huawei’s Michael Fan, Marketing VP of Data Storage Product Line, tells us: “Gartner’s decision to place so much emphasis on North America does not give an accurate picture of the global market and risks misleading customers. Huawei remains a world-leader in data storage products and solutions, and is trusted by customers in over 150 countries and regions.”

IEIT Systems was a Challenger last year and has moved down the vertical Ability to Execute axis to become a Niche Player. DDN, a Niche Player last year, has exited the MQ “because it did not meet the minimum requirements and inclusion criteria for platform-native services,” while Zadara has entered for the first time, as a Niche Player.

This year, Pure Storage has the highest Completeness of Vision and Ability to Execute ratings in the Leaders box, followed by HPE and NetApp, then IBM and Dell.

The Gartner analysts promote the concept of an on-premises software-defined storage system that separates compute and storage resources, which helps compute and capacity resources scale independently and cost-effectively. They note that Pure Storage does not offer this capability, nor do Hitachi Vantara and NetApp.

The analysts also note that high-capacity (60 TB or more) QLC flash drives are being offered by some suppliers, but not all, as an alternative to hard disk drive storage. HPE has made the 2024 PSP MQ report available here.

Bootnote

The “Magic Quadrant” is a 2D space defined by axes labeled “Ability to Execute” and “Completeness of Vision,” split into four squares tagged “Visionaries” and “Niche Players” at the bottom, and “Challengers” and “Leaders” at the top. The best-placed vendors sit in the top-right Leaders box, balancing ability to execute with completeness of vision; the nearer they are to the top-right corner of that box, the better.
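The quadrant assignment itself reduces to a threshold rule on the two axes. A sketch with an illustrative midpoint (Gartner's actual axis scaling and weighting are not public):

```python
def quadrant(vision, execute, midpoint=0.5):
    """Map (Completeness of Vision, Ability to Execute) scores in [0, 1]
    to a Magic Quadrant label.

    The 0.5 midpoint is illustrative only; Gartner does not publish
    how its axis positions are computed.
    """
    if execute >= midpoint:
        return "Leaders" if vision >= midpoint else "Challengers"
    return "Visionaries" if vision >= midpoint else "Niche Players"
```

Under this rule, a vendor whose Completeness of Vision score drops below the midpoint slides from Leaders to Challengers even if its Ability to Execute is unchanged, which is what happened to Hitachi Vantara, Huawei, and Infinidat this year.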

Storage news ticker – September 24

Cohesity-commissioned research shows organizations are fueling ransomware attacks through their readiness to pay. Over half of all UK companies surveyed had been attacked so far in 2024, with three in four willing to pay a ransom and some admitting having paid up to £20 million. One of the most striking findings is the correlation between countries where people are most likely to pay a ransom and those reporting the highest incidents of ransomware attacks.

In detail, 95 percent of UK respondents said cyber attacks were on the rise – a fact supported by more than half (53 percent) having fallen victim to a ransomware attack in 2023. This is a stark rise from the 38 percent that reported a ransomware attack in the previous year. Some 74 percent said they would pay a ransom to recover their data after an attack, and 59 percent had indeed paid a ransom in the previous year. Only 7 percent ruled it out, despite two in three (66 percent) having clear rules not to pay. Get a copy of the report here.

Data streamer Confluent has a suite of product updates:

  1. New support of Table API, which makes Apache Flink available to Java and Python developers – helping them easily create data streaming applications using familiar tools.
  2. Private networking for Flink, providing a critical layer of security for businesses that need to process data within strict regulatory environments.
  3. Confluent Extension for Visual Studio Code, which accelerates the development of real-time use cases.
  4. Client-Side Field Level Encryption encrypts sensitive data for stronger security and privacy.

Check out the new Confluent Cloud features here.

Wikimedia Deutschland has launched a semantic search concept in collaboration with search experts from DataStax and Jina AI to make Wikidata’s openly licensed data available in an easier-to-use format for AI app developers. Wikidata’s data will be transformed and made more convenient for AI developers as semantic vectors in a vector database. DataStax provides the vector database while Jina AI provides the open source embedding model for vectorizing the text data. The vectorization will enable direct semantic analysis and could help facilitate the detection of vandalism in the knowledge graph. Vectorization also simplifies the process of using Wikidata in RAG applications in the future. Wikimedia Deutschland started creating the concept in December 2023. The first beta tests of a prototype are planned for 2025.

Firebolt announced a next-generation Cloud Data Warehouse (CDW) that delivers low latency analytics with drastic efficiency gains. Data engineers can now deliver customer-facing analytics and data-intensive applications (data apps) more cost-effectively and with greater simplicity. It’s a modern cloud data warehouse that combines the ultra-low latency of top query accelerators with the ability to scale and transform massive, diverse datasets from any source, while using standard SQL to handle any query complexity at scale. Read a launch blog for more information.

Hitachi Vantara has redesigned its Hitachi EverFlex infrastructure-as-a-service (IaaS) portfolio as a scalable, cost-efficient hybrid cloud as-a-service offering for modern enterprises. Hitachi EverFlex enables customers to transition to hybrid cloud environments by offering a consumption-based model that aligns costs with business usage. EverFlex Control leverages AI to automate routine tasks, reduce human error, and optimize resource allocation. Users can scale resources up or down to meet fluctuating business demands. Check out an EverFlex blog here.

Neil DiMartinis, Index Engines

Malware threat scanning firm Index Engines announced its new chief revenue officer, Neil DiMartinis, previously president of Cutting Edge Technologies, and a new advisory board member, Jim Clancy, formerly president of Global Storage Sales at Dell Technologies. Both will be involved in creating new relationships and spearheading the company’s channel expansion with new strategic partners.

Micron announced the availability of the Crucial P310 M.2 2280 PCIe 4 NVMe SSD, which offers twice the performance of Gen 3 SSDs and 40 percent faster performance than Crucial’s P3 Plus. It comes in capacities from 500 GB up to 2 TB, with sequential read and write speeds of 7,100 and 6,000 MB/sec respectively, random reads of up to 1 million IOPS, and random writes of up to 1.2 million IOPS.

Micron Crucial P310 storage

Distributor TD SYNNEX France has signed a distribution agreement with Object First for its Ootbi (Out-of-the-Box-Immutability) ransomware-proof backup storage appliance purpose-built for Veeam.

Percona announced that its new database platform, Percona Everest, is GA. Percona Everest is an open source, cloud-native database platform designed to deliver similar core capabilities and conveniences provided by database-as-a-service (DBaaS) offerings but without the burden of vendor lock-in and associated challenges.


Quantum has launched an improved channel partner portal and program “to make it easier and more lucrative for partners to sell Quantum’s comprehensive data management solutions for AI and unstructured data.” Quantum Alliance Program enhancements:

  • “Expert” level partners can now earn double the rebate previously available while “Premier” level partners can earn up to three times the rebate. 
  • Automated lead generation tools using social media and email drip campaigns with full analytics, reporting, and built-in dashboards. 
  • Campaign-in-a-Box marketing programs on trending topics including AI, ransomware and data protection, cloud repatriation, Life Sciences data management, and VFX and animation. 
  • New Quantum GO subscription models to meet customers’ growing data demands and budgetary requirements.

Learn more here.

Rubrik Cyber Recovery capabilities are now available for Nutanix AHV. Rubrik Cyber Recovery enables administrators to plan, test, and validate cyber recovery plans regularly and recover quickly in the event of a cyberattack. AHV customers can now: 

  • Test cyber recovery readiness in clean rooms – Create pre-defined recovery plans and automate recovery validation and testing, ensuring recovery contains necessary dependencies and prerequisites.
  • Orchestrate rapid recovery to production – Identify clean point-in-time snapshots, reducing the time required to restore business operations.

Check out the details here.

SMART Modular Technologies announced a proprietary technology to mitigate the adverse impact of single event upsets (SEUs) in SSDs. Its MP3000 NVMe SSD products with SEU mitigation reduce annual failure rates from as high as 17.5k/Mu (million units) to less than 10/Mu and can save hundreds of thousands of dollars in potential service costs by helping to ensure hundreds of hours of uninterrupted uptime – especially important for tough-to-repair remote deployments. 

SEUs are inadvertent changes in “bit status” that occur in digital systems when high-energy neutrons or alpha particles randomly strike memory or logic components and literally flip the state of bits. Addressing these errors or upsets within the SSD enables recovery without the need for a full system reboot. SMART claims its SATA and PCIe NVMe boot drives can slash annual failure rates by as much as 99.7 percent by recovering from soft errors caused by single event upsets. Because the drive can gracefully reboot itself without a host system reboot, it also handles possible flipped bits in other components within the SSD, which might account for an additional 10 percent of failures.
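SMART’s recovery mechanism is proprietary, but the underlying idea of detecting and correcting a single flipped bit in place can be illustrated with a textbook error-correcting code. The sketch below uses a standard Hamming(7,4) code purely as an illustration of single-bit upset recovery; it is not SMART’s implementation:

```python
# Illustrative single-error correction -- the general class of technique
# that lets a device recover from a single event upset (SEU) in place.
# Textbook Hamming(7,4) code, not SMART's proprietary scheme.

def encode(nibble):
    """Encode 4 data bits (a list of 0/1) into a 7-bit Hamming codeword."""
    d = nibble
    p1 = d[0] ^ d[1] ^ d[3]
    p2 = d[0] ^ d[2] ^ d[3]
    p3 = d[1] ^ d[2] ^ d[3]
    # Codeword positions 1..7: p1 p2 d0 p3 d1 d2 d3
    return [p1, p2, d[0], p3, d[1], d[2], d[3]]

def correct(word):
    """Locate a single upset bit via the syndrome and flip it back."""
    c1 = word[0] ^ word[2] ^ word[4] ^ word[6]  # checks positions 1,3,5,7
    c2 = word[1] ^ word[2] ^ word[5] ^ word[6]  # checks positions 2,3,6,7
    c3 = word[3] ^ word[4] ^ word[5] ^ word[6]  # checks positions 4,5,6,7
    syndrome = c1 + (c2 << 1) + (c3 << 2)       # 1-based error position; 0 = clean
    if syndrome:
        word[syndrome - 1] ^= 1
    return word

data = [1, 0, 1, 1]
codeword = encode(data)
codeword[4] ^= 1                    # simulate a particle strike flipping one bit
assert correct(codeword) == encode(data)   # single upset recovered in place
```

The same principle, applied internally across a drive’s memory and logic, is why a soft error can be corrected without involving the host at all.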

Smart Modular Technologies ME2 storage

The ME2 SATA M.2 and mSATA drives with SEU mitigation provide 60GB to 1.92TB of storage. The MP3000 NVMe PCIe drives provide 80GB to 1.92TB of storage in M.2 2280, M.2 22110 and E1.S form factors. Both products are available in commercial grades (operating temperature: 0 to 70°C) and industrial grades (operating temperature: -40 to 85°C). The M.2 2280 also supports SafeDATA power loss data protection.

StarTree, a cloud-based real-time analytics company, showcased new observability capabilities at Current 2024 in Austin, Texas. It highlighted how StarTree Cloud, powered by Apache Pinot, can now be used as a Prometheus-compatible time series database to drive real-time Grafana observability dashboards.

Distributed cloud storage company Storj introduced two new tools at IBC to simplify remote media workflows:

  • NebulaNAS (from GB Labs) – A cloud storage solution that delivers cloud flexibility with on-premises-like performance, enabling global access, collaboration, and enterprise security.
  • Beam Transfer – A breakthrough data transfer solution built on Storj’s distributed cloud, offering speeds of up to 1.17GB/sec, designed for fast, global collaboration in media production.

Titan Data Solutions, a specialist distributor for edge to cloud services, has become Vawlt’s first distribution partner in the UK. Titan will drive channel recruitment and engagement to accelerate go-to-market momentum and customer adoption of Vawlt’s distributed hybrid multi-cloud storage platform across the region.


VergeIO has partnered with TechAccelerator to deliver a suite of hands-on labs designed for IT professionals looking to migrate from VMware to VergeOS. These self-paced VMware Migration Labs provide a comprehensive, interactive experience to help organizations transition smoothly to VergeOS, offering a learning opportunity without needing hardware. The VergeIO Labs are available for potential customers to try and they can register here.

WTW, a global advisory, broking, and solutions company, has launched Indigo Vault, claiming it’s a first-to-market document protection platform that provides advanced cybersecurity for sharing and storing business-sensitive files. Using WTW-patented, end-to-end quantum-resistant security, Indigo Vault allows assigned users to decide where and how documents are stored, who can access them, for what length of time and on which device, and how documents are used, to prevent them from being saved, seen, or shared outside of specifically defined parameters. WTW says Indigo Vault encryption uses NIST-certified algorithms that cannot be cracked by standard computers and are resistant to quantum computer attacks. Find out more here.

Software RAID supplier Xinnor has announced a strategic partnership with HighPoint Technologies, combining HighPoint’s PCIe-to-NVMe Switch AIC and Adapter families with Xinnor’s xiRAID for Linux. A single Rocket series Adapter running xiRAID can accommodate nearly 2PB of U.2, U.3, or E3.S NVMe storage configured into a fault-tolerant RAID array, and can saturate a full 16 lanes of PCIe host transfer bandwidth. This enables the system to deliver up to 60GB/sec of real-world transfer performance and up to 7.5 million IOPS. In RAID 5 configurations, xiRAID outperformed the standard mdraid utility by a significant margin, demonstrating over 10x improvement in both random and sequential write performance.

Veeam acquires SaaS backup outfit Alcion

Veeam has acquired SaaS backup startup Alcion, appointing its co-founder and CEO Niraj Tolia as CTO.

Alcion was founded in 2022 by Tolia and Vaibhav Kamra, its VP of Engineering, to back up and protect Microsoft 365 with AI-powered threat detection. It raised $8 million in seed funding in May last year, and Veeam participated in a $21 million A-round a year ago. Now Veeam has bought the whole company, including IP, product, and workforce, marking a second buyout exit for Tolia and Kamra. The acquisition cost has not been revealed.

Veeam CEO Anand Eswaran stated: “Niraj is one of those rare individuals who not only understands where the market is headed but also possesses the skills and vision to bring that future to life for our customers.”

Niraj Tolia, Alcion
Niraj Tolia

Tolia and Kamra had previously co-founded Kasten, a Kubernetes container backup company. Veeam paid $150 million in stock and cash for Kasten and its K10 software in October 2020. At that time Danny Allan was CTO. He resigned in January this year to become Snyk’s CTO. Now Tolia takes up that position, with Kamra appointed Veeam’s VP for Technology.

Vaibhav Kamra, Alcion
Vaibhav Kamra

Eswaran said Kasten “has become the #1 solution for Kubernetes data resilience since being acquired by Veeam.”

The data protection leader says Tolia will work closely with chief product officer Anton Gostev. A Tolia blog says: “Our charter is to continue to invest in, and build out, the Veeam Data Cloud platform. Combining the AI and Security capabilities of the Alcion platform with the scale of VDC – the fastest-growing Veeam product ever – is going to deliver great value to every Alcion and Veeam customer, current and future.” The intent, Veeam states, is to develop the Veeam Data Cloud (VDC) into a “powerful, flexible suite of services.”

At Alcion, Tolia and Kamra’s software “leveraged powerful AI techniques to learn user behavior, schedule backups intelligently, remove malware, detect ransomware, and proactively schedule backups when threat signals are detected.” We can envisage this approach being added to VDC. Protecting customer data in SaaS apps, and making customer data more cyber-resilient both on-premises and in public cloud/SaaS environments, are two of the main trends in the data protection market.

Before founding Kasten, Tolia led product development at Maginatics, a startup acquired by Dell EMC’s Data Protection Group.

Ugreen flies the flag for AI entry-level network-attached storage

Ugreen NASync Series: Next-level Professional Data Storage Solutions

Network-attached storage supplier Ugreen, which targets small firms, home offices, and consumers, has launched a new range of its NASync Series devices, bundling new AI and management features.

The devices function as smart data management hubs, allowing data storage and access across desktops, PCs, laptops, smartphones, tablets, and other devices through network connectivity.

Featuring up to Intel Core i5 processors inside, and dual 10GbE network ports, the NASync devices use Ugreen’s proprietary operating system, which includes an all-in-one app and an intuitive interface. Already available in the US, the devices are expected to be sold through partners and retailers in Northern Europe, including Germany and the Netherlands, next month.

“Our new AI-empowered NAS models are set to be the world’s first AI NAS equipped with a Large Language Model (LLM),” said the provider, “offering natural language processing and AI chatting capabilities, which are all running locally.”

“I congratulate Ugreen on their exciting product announcements. Their introduction of Intel-based AI NAS systems, now shipping with Intel Core i5 processors, represents a new era of intelligent storage solutions, bringing groundbreaking capabilities to the market,” said Jason Ziller, vice president and general manager for the client connectivity division at Intel.

There are six different systems, starting with the DXP2800 and going up to the DXP8800 Plus. They range in price from $400 to $1,500 each.

Established in 2012, Ugreen provides a variety of digital devices and claims to have over 40 million users worldwide. Last December, the firm launched its Revodok Series Hubs and Docking Stations aimed at the media and entertainment industry.

The Revodok Max 213, the premier product of the Revodok Max Series, is tailored for professionals in fields including media, data and financial analysis, photography, audio and video production, engineering, and design. With its Thunderbolt 4 interface, the Revodok Max 213 offers a 40 Gbps transmission speed, facilitating rapid file transfers.

Cleondris unboxes three-in-one data protection for NetApp

NetApp add-on services firm Cleondris has just publicly revealed its Cleondris ONE data protection offering, integrating backup, security, and compliance for NetApp systems into a unified platform.

Initially revealed at a private showing to press and analysts at the IT Press Tour in Istanbul, Turkey at the beginning of this month, Cleondris told us the new service “redefined” cyber resilience for NetApp environments.

Cleondris’ products integrate with existing NetApp setups, and the NetApp partner is now selling a unified protection product covering three areas, instead of selling three separate products to users.

Christian Plattner, CEO of Cleondris, claimed the firm was “maximizing security without adding complexity.”

“The cybersecurity world is changing fast. We’re facing sophisticated, multi-vector attacks that can paralyze operations in minutes. AI-powered threats and even nation-state actors are targeting businesses of all sizes,” said Plattner. “This new landscape calls for a fresh approach to data protection.”

He said organizations need “cyber resilience,” rather than single anti-ransomware tools, for instance. “We need a holistic approach to data security and recovery. It’s not just about prevention – it’s about quick detection, response, and keeping the business running when attacks happen,” said Plattner.

Cleondris ONE, designed to protect NetApp ONTAP environments, promises to make sure its integrated backup, security, and compliance technologies provide a coordinated response to threats.

First – as the sales pitch goes – it offers proactive threat detection and prevention, using “advanced AI” to spot potential threats before they cause damage. Second, rapid recovery and business continuity are promised, minimizing downtime and data loss if an attack occurs. Finally, it automates compliance processes and is said to reduce complexity for IT teams. “This frees up your IT teams to focus on strategic work instead of routine data management,” Plattner said.

Cleondris ONE works with all versions of NetApp ONTAP from 9.10 and up. This includes on-site systems and cloud solutions, like Amazon FSx for NetApp ONTAP and Cloud Volumes ONTAP.

“This means you get consistent, reliable data protection across your entire NetApp infrastructure, whether in your datacenter or the cloud. We’ve designed our system to work hand-in-hand with NetApp’s built-in features, while adding our own layer of advanced protection,” Plattner said. “It’s like upgrading your car with a high-tech security system – you keep all the original features you love, but now with an extra layer of protection.”

For intelligent data recovery, the Precision Data Restore feature allows users to roll back to previous points in time, so they can recover their data from just before an attack happened.

In addition, Granular Cyber Restore is a recovery tool that combines forensic analysis with data restoration. This means users can “quickly” identify which files were affected by an attack and restore only those files. This is done while gathering attack evidence and maintaining a detailed audit trail to support compliance.

Cleondris ONE uses blockchain tech to create tamper-proof audit logs. This means every file access and change is recorded securely, with each action creating a new “block” in the chain, making it impossible to alter records. This gives firms a “reliable” trail for audits and investigations, says Cleondris.

The new product is said to install in “less than an hour,” and is scalable as organizations’ data footprints grow.

Sigma tweaks cloud data management console with some new tools

Cloud data management firm Sigma has taken the wraps off various new AI technologies and integrations, in the hopes of pitching them to organizations looking to protect their data and squeeze operational results from it.

The intelligent tools are meant to clean and model data, with the aim of delivering consistency in calculations.

Two new AI features in Sigma are Explain Viz and Formula Assistant. Explain Viz uses the customer’s connected AI to automatically generate a “clear and concise” description of any chart, highlighting key insights, observations, and data summaries. The feature helps users quickly understand the “story” behind their data, saving time and effort in interpreting complex visualizations.

Formula Assistant leverages the customer’s AI models to help users create new formulas, correct errors in existing ones, and provide explanations for formulas used in workbooks and data models, making it easier to work with complex data. The feature streamlines the formula-building process, reduces errors, and increases productivity for users of all skill levels, says Sigma.

The firm has also announced a new integration with Glean, an enterprise search tool that connects an organisation’s entire workflow. With “just two clicks,” says Sigma, users can search Glean within Sigma, instantly accessing unstructured data like Slack threads, Google Docs, Jira tickets, and more.

The integration eliminates the need to jump between systems, empowering users to find solutions faster and work more efficiently by connecting Sigma’s structured insights with the context that lives elsewhere.

Sigma is also announcing the expansion of OAuth coverage with write access for Snowflake and other platforms. This makes Sigma the first data analytics platform to offer OAuth for write access to Snowflake and Databricks, we are told. It expands OAuth support to enable secure write-back from a Sigma input table to a Snowflake data warehouse.

In addition, customers can use OAuth with Databricks connections for centralized user access management between Sigma and Databricks, to provide greater security and decrease administrator time investment. OAuth support is provided for input tables, write-back, warehouse views, materializations, and CSV uploads.

On top of these enhancements, it has revealed Sigma BI Analyst by Hakkoda, a Snowflake native app that will assess a company’s usage of a legacy BI tool on Snowflake, and show them potential cost savings with Sigma.

Currently “shared via private listings,” Sigma BI Analyst can show users information like workbook statistics, data source patterns in workbooks, and license utilization. Select users will be able to input how much they’re paying for viewer licenses and compare that to free Sigma Lite licenses.

The app also uses Snowflake Cortex AI to recommend formula syntax changes between BI tools.

“We’re partnering with Sigma to simplify the migration process for enterprises looking to modernize their business intelligence and sunset legacy BI tools,” said Erik Duffield, CEO of Hakkoda.

Finally, Sigma has released a data connector to Microsoft Azure, to enable secure communications between the Sigma platform and an Azure cloud data warehouse.

“With our latest enhancements, we’ve made it easier than ever for business users to define their metrics without writing a single line of code. By delivering flexibility, speed, and seamless integration with the broader data ecosystem, Sigma allows users to trust both the data and the process,” added Mike Palmer, Sigma CEO.

NetApp and Aruba.it to launch data solutions across Europe

NetApp
NetApp

NetApp has become the preferred data infrastructure provider of Italian datacenter services provider Aruba.it, which offers web hosting, domain registration, and secure email account services across its European footprint.

The two companies will now offer new data management and storage solutions labeled “Powered by NetApp.”

Aruba is one of Italy’s main providers of cloud, datacenter, hosting, domain registration, and PEC (certified email) services. It was already using NetApp products behind the scenes, but is now expanding the relationship into a formal partnership as part of its wider go-to-market strategy.

“By combining forces, the companies will be able to collaborate on strategic goals and synergistic initiatives, to be able to provide optimized datacenter solutions, both from the data server and data management side,” the pair said.

Aruba, founded in 1994, claims to have 16 million users and operates a data management and storage infrastructure distributed across seven datacenters that support 2.7 million registered domains and thousands of customer systems.

The provider is accredited by Italy’s Agenzia per l’Italia Digitale for the provision of qualified services, and its infrastructure is approved by the country’s National Cybersecurity Authority to handle critical and strategic data.

As well as facilities in Italy, Aruba runs its own datacenter in the Czech Republic, as well as shared datacenter locations in France, Germany, Poland, and the UK.

“Aruba is dedicated to planning, implementing and managing highly customized technology solutions to support our customers across Europe,” said Fabrizio Garrone, enterprise solution director at Aruba. “This partnership shows our customers the high quality of the infrastructure we provide, and we look forward to developing new, innovative solutions that will serve our customers into the future.”

Gabie Boko, chief marketing officer at NetApp, added: “The alliance creates new opportunities for joint innovation and development to meet the specific needs of customers across Europe, expanding the reach of both companies.”

Veritas reveals clutch of updates to remove data recovery ‘uncertainty’

Veritas
Veritas

Most of the Veritas Technologies business is in the process of merging with Cohesity, but that hasn’t stopped product updates. Veritas says it now has new AI-driven capabilities and user interface enhancements that it claims will “remove uncertainty” around data recovery.

The company says cyber recovery will be made “simpler, smarter and faster” through streamlined navigation and operations management with the Veritas Alta View platform. The dashboard integrates AI-driven insights and a cyber risk score for real-time, actionable analytics.

“Enhanced visualisation tools allow users to monitor their entire data estate, proactively manage risks and expedite cyber recovery,” claimed Veritas.

In addition, the new Veritas Alta Copilot automatically scans and identifies unprotected assets, recommends and applies tailored protection, and “instantly” integrates with existing protection policies to ensure all critical data is covered.

Also, enhanced security, “accelerated” threat detection, and a “more rapid” ransomware response are being promised, through hash-based tracking of malware in backup data and blast radius analysis. Once malware is identified, new functionality reduces the time to scan and assess its spread across the entire estate by “up to 93 percent,” the company claimed.
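Veritas has not published the details of its hash-based tracking, but the general technique is straightforward: if backup catalogs record a content hash per file at backup time, the spread of a known-bad file can be assessed by comparing hashes across snapshots instead of rescanning file contents. A hypothetical Python sketch, with invented paths and snapshot names:

```python
import hashlib

def file_hash(data: bytes) -> str:
    """Content hash, as a backup catalog might record per file at backup time."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical snapshot catalogs: snapshot id -> {file path: content hash}.
snapshots = {
    "2024-09-10": {"/srv/app.exe": file_hash(b"clean build"),
                   "/srv/readme.txt": file_hash(b"docs")},
    "2024-09-11": {"/srv/app.exe": file_hash(b"TROJAN PAYLOAD"),
                   "/srv/readme.txt": file_hash(b"docs")},
    "2024-09-12": {"/srv/app.exe": file_hash(b"TROJAN PAYLOAD"),
                   "/srv/tool.exe": file_hash(b"TROJAN PAYLOAD")},
}

def blast_radius(snapshots, bad_hashes):
    """Map each snapshot to its infected paths using catalog hashes only --
    no file contents are re-read, which is what makes the assessment fast."""
    hits = {}
    for snap_id, catalog in snapshots.items():
        infected = [path for path, h in catalog.items() if h in bad_hashes]
        if infected:
            hits[snap_id] = infected
    return hits

bad = {file_hash(b"TROJAN PAYLOAD")}
print(blast_radius(snapshots, bad))
# {'2024-09-11': ['/srv/app.exe'], '2024-09-12': ['/srv/app.exe', '/srv/tool.exe']}
```

The same lookup also yields a clean candidate snapshot for recovery: the latest one with no hash hits.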

It also asserts that a new interactive guide can help with proactive disaster management and cyber recovery by letting IT teams create, automate, test, and edit workflow plans. Blueprints can be customised at a relatively granular level across multiple domains, including hybrid, platform-as-a-service, and container environments, “ensuring tailored and effective” risk management, Veritas said.

And optimized recovery is now possible through proactive, in-depth analysis that provides recommended recovery points. This, we’re told, reduces recovery time and potential data loss by eliminating the need to manually identify the “last known good copy,” relying instead on risk engine analysis to minimise dependence on costly malware scans.

“We’re focused on making recovery simpler, smarter and faster. With expanded AI assistance and intuitive management, we are eliminating the guesswork and trial and error from the recovery process,” said Deepak Mohan, executive vice president of engineering at Veritas. “Organizations can now bounce back from ransomware attacks quickly and confidently, minimising business disruption.”

The Veritas Copilot features will be available next month. All the other bits and pieces will be provided through updates to Veritas NetBackup, Veritas Alta Data Protection, and Veritas Alta View this month.