
Storage news ticker – September 24

Cohesity-commissioned research shows organizations are fueling ransomware attacks through their readiness to pay. Over half of all UK companies surveyed had been attacked in the past year, with three in four willing to pay a ransom and some admitting to having paid up to £20 million. One of the most striking findings is the correlation between the countries where people are most likely to pay a ransom and those reporting the highest incidence of ransomware attacks.

In detail, 95 percent of UK respondents said cyber attacks were on the rise – a fact supported by more than half (53 percent) having fallen victim to a ransomware attack in 2023. This is a stark rise from the 38 percent that reported a ransomware attack in the previous year. Some 74 percent said they would pay a ransom to recover their data after an attack, and 59 percent had indeed paid a ransom in the previous year. Only 7 percent ruled it out, despite two in three (66 percent) having clear rules not to pay. Get a copy of the report here.

Data streamer Confluent has a suite of product updates:

  1. New support of Table API, which makes Apache Flink available to Java and Python developers – helping them easily create data streaming applications using familiar tools.
  2. Private networking for Flink, providing a critical layer of security for businesses that need to process data within strict regulatory environments.
  3. Confluent Extension for Visual Studio Code, which accelerates the development of real-time use cases.
  4. Client-Side Field Level Encryption encrypts sensitive data for stronger security and privacy.

Check out the new Confluent Cloud features here.
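To illustrate what the Table API support means in practice, here is a minimal, hypothetical PyFlink sketch of a streaming aggregation expressed with Table API calls rather than raw SQL; the demo source, table, and column names are illustrative placeholders, not Confluent Cloud specifics.

```python
# Minimal sketch: a streaming aggregation written with the Apache Flink Table API in Python.
# The datagen source and column names are illustrative, not tied to Confluent Cloud.
from pyflink.table import EnvironmentSettings, TableEnvironment
from pyflink.table.expressions import col

t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

# Register a demo source that generates random order events.
t_env.execute_sql("""
    CREATE TABLE orders (
        product STRING,
        amount  INT
    ) WITH (
        'connector' = 'datagen',
        'rows-per-second' = '5'
    )
""")

# Express the running per-product total with Table API calls instead of a SQL string.
totals = (
    t_env.from_path("orders")
         .group_by(col("product"))
         .select(col("product"), col("amount").sum.alias("total_amount"))
)

totals.execute().print()  # Streams results to stdout until interrupted.
```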

Wikimedia Deutschland has launched a semantic search concept in collaboration with search experts from DataStax and Jina AI to make Wikidata’s openly licensed data available in an easier-to-use format for AI app developers. Wikidata’s data will be transformed and made more convenient for AI developers as semantic vectors in a vector database. DataStax provides the vector database while Jina AI provides the open source embedding model for vectorizing the text data. The vectorization will enable direct semantic analysis and could help facilitate the detection of vandalism in the knowledge graph. Vectorization also simplifies the process of using Wikidata in RAG applications in the future. Wikimedia Deutschland started creating the concept in December 2023. The first beta tests of a prototype are planned for 2025.
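As a rough illustration of the underlying idea, the hedged Python sketch below embeds a few short item descriptions and answers a natural-language query by vector similarity; the model and sample data are placeholders, not the actual Jina AI embedding model or DataStax vector database the project will use.

```python
# Hypothetical sketch of semantic (vector) search over short item descriptions.
# Model name and sample data are illustrative only.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # stand-in for any text-embedding model

items = [
    "Douglas Adams: English writer and humourist",
    "Marie Curie: physicist and chemist, pioneer of radioactivity research",
    "Kilimanjaro: highest mountain in Africa",
]
item_vectors = model.encode(items, convert_to_tensor=True)   # these vectors would live in a vector database

query_vector = model.encode("famous female scientist", convert_to_tensor=True)
best = util.semantic_search(query_vector, item_vectors, top_k=1)[0][0]
print(items[best["corpus_id"]])  # -> the Marie Curie entry, matched by meaning rather than keywords
```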

Firebolt announced a next-generation Cloud Data Warehouse (CDW) that delivers low latency analytics with drastic efficiency gains. Data engineers can now deliver customer-facing analytics and data-intensive applications (data apps) more cost-effectively and with greater simplicity. It’s a modern cloud data warehouse that combines the ultra-low latency of top query accelerators with the ability to scale and transform massive, diverse datasets from any source, while using standard SQL to handle any query complexity at scale. Read a launch blog for more information.

Hitachi Vantara has redesigned its Hitachi EverFlex infrastructure-as-a-service (IaaS) portfolio as a scalable, cost-efficient hybrid cloud as-a-service offering for modern enterprises. Hitachi EverFlex enables customers to transition to hybrid cloud environments by offering a consumption-based model that aligns costs with business usage. EverFlex Control leverages AI to automate routine tasks, reduce human error, and optimize resource allocation. Users can scale resources up or down to meet fluctuating business demands. Check out an EverFlex blog here.

Storage exec Neil DiMartinis, Index Engines
Neil DiMartinis

Malware threat scanning firm Index Engines announced its new chief revenue officer, Neil DiMartinis, previously president of Cutting Edge Technologies, and advisory board member Jim Clancy, formerly president of Global Storage Sales at Dell Technologies. Both will be involved in creating new relationships and spearheading the company’s channel expansion with new strategic partners.

Micron announced the availability of the Crucial P310 M.2 2280 PCIe 4 NVMe SSD, which offers performance twice as fast as Gen 3 SSDs and 40 percent faster than Crucial’s P3 Plus. It has capacities from 540GB up to 2TB, and read and write speeds of 7,100MB/sec and 6,000MB/sec respectively. The drive features random reads up to 1 million IOPS and random writes up to 1.2 million IOPS.

Micron Crucial P310 storage

Distributor TD SYNNEX France has signed a distribution agreement with Object First for its Ootbi (Out-of-the-Box-Immutability) ransomware-proof backup storage appliance purpose-built for Veeam.

Percona announced that its new database platform, Percona Everest, is GA. Percona Everest is an open source, cloud-native database platform designed to deliver similar core capabilities and conveniences provided by database-as-a-service (DBaaS) offerings but without the burden of vendor lock-in and associated challenges.

……

Quantum has launched an improved channel partner portal and program “to make it easier and more lucrative for partners to sell Quantum’s comprehensive data management solutions for AI and unstructured data.” Quantum Alliance Program enhancements:

  • “Expert” level partners can now earn double the rebate previously available while “Premier” level partners can earn up to three times the rebate. 
  • Automated lead generation tools using social media and email drip campaigns with full analytics, reporting, and built-in dashboards. 
  • Campaign-in-a-Box marketing programs on trending topics including AI, ransomware and data protection, cloud repatriation, Life Sciences data management, and VFX and animation. 
  • New Quantum GO subscription models to meet customers’ growing data demands and budgetary requirements.

Learn more here.

Rubrik Cyber Recovery capabilities are now available for Nutanix AHV. Rubrik Cyber Recovery enables administrators to plan, test, and validate cyber recovery plans regularly and recover quickly in the event of a cyberattack. AHV customers can now: 

  • Test cyber recovery readiness in clean rooms – Create pre-defined recovery plans and automate recovery validation and testing, ensuring recovery contains necessary dependencies and prerequisites.
  • Orchestrate rapid recovery to production – Identify clean point-in-time snapshots, reducing the time required to restore business operations.

Check out the details here.

SMART Modular Technologies announced a proprietary technology to mitigate the adverse impact of single event upsets (SEUs) in SSDs. Its MP3000 NVMe SSD products with SEU mitigation reduce annual failure rates from as high as 17.5k/Mu (million units) to less than 10/Mu and can save hundreds of thousands of dollars in potential service costs by helping to ensure hundreds of hours of uninterrupted uptime – especially important for tough-to-repair remote deployments. 

SEUs are inadvertent changes in “bit status” that occur in digital systems when high-energy neutrons or alpha particles randomly strike memory or logic components and cause bits to literally flip their state. Addressing these errors or upsets within the SSD enables recovery without the need for a full system reboot. SMART claims its SATA and PCIe NVMe boot drives can slash annual failure rates by as much as 99.7 percent by recovering from soft errors due to single event upsets. By being able to gracefully reboot itself without a host system reboot, the SSD also handles possible flipped bits in other components within the drive, which might account for an additional 10 percent of failures.

Smart Modular Technologies ME2 storage

The ME2 SATA M.2 and mSATA drives with SEU mitigation provide 60GB to 1.92TB of storage. The MP3000 NVMe PCIe drives provide 80GB to 1.92TB of storage in M.2 2280, M.2 22110 and E1.S form factors. Both products are available in commercial grades (operating temperature: 0 to 70°C) and industrial grades (operating temperature: -40 to 85°C). The M.2 2280 also supports SafeDATA power loss data protection.

StarTree, a cloud-based real-time analytics company, showcased new observability capabilities at Current 2024 in Austin, Texas. It highlighted how StarTree Cloud, powered by Apache Pinot, can now be used as a Prometheus-compatible time series database to drive real-time Grafana observability dashboards.
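In practice, “Prometheus-compatible” means any client that speaks the standard Prometheus HTTP query API (Grafana included) can query the store; the hedged sketch below shows the general pattern against a placeholder endpoint and metric, not StarTree’s actual URLs.

```python
# Hypothetical sketch: querying a Prometheus-compatible endpoint over the standard HTTP API.
# The endpoint and metric name are placeholders.
import requests

resp = requests.get(
    "https://metrics.example.com/api/v1/query",          # placeholder Prometheus-compatible endpoint
    params={"query": "rate(http_requests_total[5m])"},   # standard PromQL
    timeout=10,
)
resp.raise_for_status()
for series in resp.json()["data"]["result"]:
    print(series["metric"], series["value"])             # label set and [timestamp, value] pair
```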

Distributed cloud storage company Storj introduced two new tools at IBC to simplify remote media workflows:

  • NebulaNAS (from GB Labs) – A cloud storage solution that delivers cloud flexibility with on-premises-like performance, enabling global access, collaboration, and enterprise security.
  • Beam Transfer – A breakthrough data transfer solution built on Storj’s distributed cloud, offering speeds of up to 1.17GB/sec, designed for fast, global collaboration in media production.

Titan Data Solutions, a specialist distributor for edge to cloud services, has become Vawlt’s first distribution partner in the UK. Titan will drive channel recruitment and engagement to accelerate go-to-market momentum and customer adoption of Vawlt’s distributed hybrid multi-cloud storage platform across the region.

….

VergeIO has partnered with TechAccelerator to deliver a suite of hands-on labs designed for IT professionals looking to migrate from VMware to VergeOS. These self-paced VMware Migration Labs provide a comprehensive, interactive experience to help organizations transition smoothly to VergeOS, offering a learning opportunity without needing hardware. The VergeIO Labs are available for potential customers to try and they can register here.

WTW, a global advisory, broking, and solutions company, has launched Indigo Vault, claiming it’s a first-to-market document protection platform that provides advanced cyber security for sharing and storage of business-sensitive files. Using WTW-patented, end-to-end quantum-resistant security, Indigo Vault allows assigned users to decide where and how documents are stored, who can access them and for how long on a specific device, and how documents are used, to prevent them from being saved, seen, or shared outside of specifically defined parameters. Indigo Vault encryption uses NIST-certified algorithms designed to resist attack by both conventional and quantum computers. Find out more here.

Software RAID supplier Xinnor has announced a strategic partnership with HighPoint Technologies, pairing HighPoint’s PCIe-to-NVMe Switch AIC and Adapter families with Xinnor’s xiRAID for Linux. A single Rocket series adapter running xiRAID can accommodate nearly 2PB of U.2, U.3, or E3.S NVMe storage configured into a fault-tolerant RAID array, and is capable of maximizing a full 16 lanes of PCIe host transfer bandwidth. This enables the system to deliver up to 60GB/sec of real-world transfer performance and up to 7.5 million IOPS. In RAID 5 configurations, xiRAID outperformed the standard mdraid utility by a significant margin, demonstrating over 10x improvement in both random and sequential write performance.

Veeam acquires SaaS backup outfit Alcion

Veeam has acquired SaaS backup firm Alcion, appointing its co-founding CEO Niraj Tolia as CTO.

Alcion was founded in 2022 by Tolia and Vaibhav Kamra, VP Engineering, to back up and protect Microsoft 365 with AI-powered threat detection. It raised $8 million in seed funding in May last year, and Veeam participated in a $21 million A-round a year ago. Now Veeam has bought the whole company, IP, product, and workforce, marking a second buyout exit for Tolia and Kamra. The acquisition cost has not been revealed.

Veeam CEO Anand Eswaran stated: “Niraj is one of those rare individuals who not only understands where the market is headed but also possesses the skills and vision to bring that future to life for our customers.”

Niraj Tolia, Alcion
Niraj Tolia

Tolia and Kamra had previously co-founded Kasten, a Kubernetes container backup company. Veeam paid $150 million in stock and cash for Kasten and its K10 software in October 2020. At that time Danny Allan was CTO. He resigned in January this year to become Snyk’s CTO. Now Tolia takes up that position, with Kamra appointed Veeam’s VP for Technology.

Vaibhav Kamra, Alcion
Vaibhav Kamra

Eswaran said Kasten “has become the #1 solution for Kubernetes data resilience since being acquired by Veeam.”

The data protection leader says Tolia will work closely with chief product officer Anton Gostev. A Tolia blog says: “Our charter is to continue to invest in, and build out, the Veeam Data Cloud platform. Combining the AI and Security capabilities of the Alcion platform with the scale of VDC – the fastest-growing Veeam product ever – is going to deliver great value to every Alcion and Veeam customer, current and future.” The intent, Veeam states, is to develop the Veeam Data Cloud (VDC) into a “powerful, flexible suite of services.”

At Alcion, Tolia and Kamra’s software “leveraged powerful AI techniques to learn user behavior, schedule backups intelligently, remove malware, detect ransomware, and proactively schedule backups when threat signals are detected.” We can envisage this approach being added to VDC. Protecting customer data in SaaS apps, and making customer data more cyber-resilient both on-premises and in the public cloud/SaaS environment, are two of the main developments in the data protection market.

Before founding Kasten, Tolia led product development at Maginatics, a startup acquired by Dell EMC’s Data Protection Group.

Ugreen flies the flag for AI entry-level network-attached storage

Ugreen NASync Series: Next-level Professional Data Storage Solutions

Network-attached storage supplier Ugreen, which targets small firms, home offices, and consumers, has launched a new range of its NASync Series devices, bundling new AI and management features.

The devices function as smart data management hubs, allowing data storage and access across desktops, PCs, laptops, smartphones, tablets, and other devices through network connectivity.

Featuring up to Intel Core i5 processors inside, and dual 10GbE network ports, the NASync devices use Ugreen’s proprietary operating system, which includes an all-in-one app and an intuitive interface. Already available in the US, the devices are expected to be sold through partners and retailers in Northern Europe, including Germany and the Netherlands, next month.

“Our new AI-empowered NAS models are set to be the world’s first AI NAS equipped with a Large Language Model (LLM),” said the provider, “offering natural language processing and AI chatting capabilities, which are all running locally.”

“I congratulate Ugreen on their exciting product announcements. Their introduction of Intel-based AI NAS systems, now shipping with Intel Core i5 processors, represents a new era of intelligent storage solutions, bringing groundbreaking capabilities to the market,” said Jason Ziller, vice president and general manager for the client connectivity division at Intel.

There are six different systems, starting with the DXP2800 and going up to the DXP8800 Plus. They range in price from $400 to $1,500 each.

Established in 2012, Ugreen provides a variety of digital devices and claims to have over 40 million users worldwide. Last December, the firm launched its Revodok Series Hubs and Docking Stations aimed at the media and entertainment industry.

The Revodok Max 213, the premier product of the Revodok Max Series, is tailored for professionals in fields including media, data and financial analysis, photography, audio and video production, engineering, and design. With its Thunderbolt 4 interface, the Revodok Max 213 offers a 40 Gbps transmission speed, facilitating rapid file transfers.

Cleondris unboxes three-in-one data protection for NetApp

NetApp add-on services firm Cleondris has just publicly revealed its Cleondris ONE data protection offering, integrating backup, security, and compliance for NetApp systems into a unified platform.

Initially revealed at a private showing to press and analysts at the IT Press Tour in Istanbul, Turkey at the beginning of this month, Cleondris told us the new service “redefined” cyber resilience for NetApp environments.

NetApp partner Cleondris’ solutions integrate with existing NetApp setups, and now it is selling a unified protection product covering three areas, instead of separately selling three products to users.

Christian Plattner, CEO of Cleondris, claimed the firm was “maximizing security without adding complexity.”

“The cybersecurity world is changing fast. We’re facing sophisticated, multi-vector attacks that can paralyze operations in minutes. AI-powered threats and even nation-state actors are targeting businesses of all sizes,” said Plattner. “This new landscape calls for a fresh approach to data protection.”

He said organizations need “cyber resilience,” rather than single anti-ransomware tools, for instance. “We need a holistic approach to data security and recovery. It’s not just about prevention – it’s about quick detection, response, and keeping the business running when attacks happen,” said Plattner.

Cleondris ONE, designed to protect NetApp ONTAP environments, promises to make sure its integrated backup, security, and compliance technologies provide a coordinated response to threats.

First – as the sales pitch goes – it offers proactive threat detection and prevention, using “advanced AI” to spot potential threats before they cause damage. Second, rapid recovery and business continuity are promised, minimizing downtime and data loss if an attack occurs. Finally, it automates compliance processes and is said to reduce complexity for IT teams. “This frees up your IT teams to focus on strategic work instead of routine data management,” Plattner said.

Cleondris ONE works with all versions of NetApp ONTAP from 9.10 and up. This includes on-site systems and cloud solutions, like Amazon FSx for NetApp ONTAP and Cloud Volumes ONTAP.

“This means you get consistent, reliable data protection across your entire NetApp infrastructure, whether in your datacenter or the cloud. We’ve designed our system to work hand-in-hand with NetApp’s built-in features, while adding our own layer of advanced protection,” Plattner said. “It’s like upgrading your car with a high-tech security system – you keep all the original features you love, but now with an extra layer of protection.”

For intelligent data recovery, the Precision Data Restore feature allows users to roll back to previous points in time, so they can recover their data from just before an attack happened.

In addition, Granular Cyber Restore is a recovery tool that combines forensic analysis with data restoration. This means users can “quickly” identify which files were affected by an attack and restore only those files. This is done while gathering attack evidence and maintaining a detailed audit trail to support compliance.

Cleondris ONE uses blockchain tech to create tamper-proof audit logs. This means every file access and change is recorded securely, with each action creating a new “block” in the chain, so records cannot be altered without detection. This gives firms a “reliable” trail for audits and investigations, says Cleondris.
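To make the idea concrete, here is a minimal sketch of a hash-chained audit log, in which each entry embeds the hash of its predecessor so any retroactive edit breaks the chain; it illustrates the general technique, not Cleondris’s implementation.

```python
# Minimal sketch of a hash-chained ("blockchain-style") tamper-evident audit log.
# Entry fields and actions are illustrative.
import hashlib, json, time

def _digest(entry: dict) -> str:
    return hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()

class AuditLog:
    def __init__(self):
        self.chain = []

    def record(self, action: str, path: str) -> None:
        prev_hash = self.chain[-1]["hash"] if self.chain else "0" * 64
        entry = {"ts": time.time(), "action": action, "path": path, "prev": prev_hash}
        entry["hash"] = _digest(entry)          # seal the entry, including the previous hash
        self.chain.append(entry)

    def verify(self) -> bool:
        prev = "0" * 64
        for e in self.chain:
            body = {k: v for k, v in e.items() if k != "hash"}
            if e["prev"] != prev or e["hash"] != _digest(body):
                return False                     # any edited entry breaks the chain from here on
            prev = e["hash"]
        return True

log = AuditLog()
log.record("read", "/finance/q3.xlsx")
log.record("modify", "/finance/q3.xlsx")
assert log.verify()                              # tampering with any earlier entry would fail this check
```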

The new product is said to install in “less than an hour,” and is scalable as organizations’ data footprints grow.

Sigma tweaks cloud data management console with some new tools

Cloud data management firm Sigma has taken the wraps off various new AI technologies and integrations, in the hopes of pitching them to organizations looking to protect their data and squeeze operational results from it.

The intelligent tools are meant to clean and model data, with the aim of delivering consistency in calculations.

Two new AI features in Sigma are Explain Viz and Formula Assistant. Explain Viz uses the customer’s connected AI to automatically generate a “clear and concise” description of any chart, highlighting key insights, observations, and data summaries. The feature helps users quickly understand the “story” behind their data, saving time and effort in interpreting complex visualizations.

Formula Assistant leverages the customer’s AI models to help users create new formulas, correct errors in existing ones, and provide explanations for formulas used in workbooks and data models, making it easier to work with complex data. The feature streamlines the formula-building process, reduces errors, and increases productivity for users of all skill levels, says Sigma.

The firm has also announced a new integration with Glean, an enterprise search tool that connects an organisation’s entire workflow. With “just two clicks,” says Sigma, users can search Glean within Sigma, instantly accessing unstructured data like Slack threads, Google Docs, Jira tickets, and more.

The integration eliminates the need to jump between systems, empowering users to find solutions faster and work more efficiently by connecting Sigma’s structured insights with the context that lives elsewhere.

Sigma is also announcing the expansion of OAuth coverage with write access for Snowflake and other platforms. This makes Sigma the first data analytics platform to offer OAuth for write access to Snowflake and Databricks, we are told. It expands OAuth support to enable secure write-back from a Sigma input table to a Snowflake data warehouse.

In addition, customers can use OAuth with Databricks connections for centralized user access management between Sigma and Databricks, to provide greater security and decrease administrator time investment. OAuth support is provided for input tables, write-back, warehouse views, materializations, and CSV uploads.
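For context, the sketch below shows the general pattern of an OAuth-authenticated write-back to Snowflake using the standard Python connector, assuming an access token has already been issued by the identity provider; the account, warehouse, table, and token are hypothetical, and this is not Sigma’s internal integration.

```python
# Minimal sketch, assuming an OAuth access token has already been issued:
# writing rows back to Snowflake over an OAuth-authenticated connection.
import snowflake.connector

access_token = "<oauth-access-token-from-identity-provider>"  # placeholder

conn = snowflake.connector.connect(
    account="my_account",            # hypothetical account identifier
    user="analyst@example.com",
    authenticator="oauth",           # authenticate with the OAuth token instead of a password
    token=access_token,
    warehouse="ANALYTICS_WH",
    database="SALES",
    schema="PUBLIC",
)

# Write back a row captured from an input table.
conn.cursor().execute(
    "INSERT INTO forecast_overrides (region, quarter, value) VALUES (%s, %s, %s)",
    ("EMEA", "2024-Q4", 1250000),
)
conn.close()
```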

On top of these enhancements, it has revealed Sigma BI Analyst by Hakkoda, a Snowflake native app that will assess a company’s usage of a legacy BI tool on Snowflake, and show them potential cost savings with Sigma.

Currently “shared via private listings,” Sigma BI Analyst can show users information like workbook statistics, data source patterns in workbooks, and license utilization. Select users will be able to input how much they’re paying for viewer licenses and compare that to free Sigma Lite licenses.

The app also uses Snowflake Cortex AI to recommend formula syntax changes between BI tools.

“We’re partnering with Sigma to simplify the migration process for enterprises looking to modernize their business intelligence and sunset legacy BI tools,” said Erik Duffield, CEO of Hakkoda.

Finally, Sigma has released a data connector to Microsoft Azure, to enable secure communications between the Sigma platform and an Azure cloud data warehouse.

“With our latest enhancements, we’ve made it easier than ever for business users to define their metrics without writing a single line of code. By delivering flexibility, speed, and seamless integration with the broader data ecosystem, Sigma allows users to trust both the data and the process,” added Mike Palmer, Sigma CEO.

NetApp and Aruba.it to launch data solutions across Europe


NetApp has become the preferred data infrastructure provider of Italian datacenter services provider Aruba.it, which offers web hosting, domain registration, and secure email account services across its European footprint.

The two companies will now offer new data management and storage solutions labeled “Powered by NetApp.”

Aruba is one of Italy’s providers of cloud, datacenter, hosting, domain registration, and PEC (certified email) services. Aruba was already a user of NetApp products behind the scenes, but is now expanding the relationship into a formal partnership as part of its wider go-to-market strategy.

“By combining forces, the companies will be able to collaborate on strategic goals and synergistic initiatives, to be able to provide optimized datacenter solutions, both from the data server and data management side,” the pair said.

Aruba, founded in 1994, claims to have 16 million users and operates a data management and storage infrastructure distributed across seven datacenters that support 2.7 million registered domains and thousands of customer systems.

The provider is accredited by Italy’s Agenzia per l’Italia Digitale for the provision of qualified services, and its infrastructure is approved by the country’s National Cybersecurity Authority to handle critical and strategic data.

As well as facilities in Italy, Aruba runs its own datacenter in the Czech Republic, as well as shared datacenter locations in France, Germany, Poland, and the UK.

“Aruba is dedicated to planning, implementing and managing highly customized technology solutions to support our customers across Europe,” said Fabrizio Garrone, enterprise solution director at Aruba. “This partnership shows our customers the high quality of the infrastructure we provide, and we look forward to developing new, innovative solutions that will serve our customers into the future.”

Gabie Boko, chief marketing officer at NetApp, added: “The alliance creates new opportunities for joint innovation and development to meet the specific needs of customers across Europe, expanding the reach of both companies.”

Veritas reveals clutch of updates to remove data recovery ‘uncertainty’


Most of the Veritas Technologies business is in the process of merging with Cohesity, but that hasn’t stopped product updates. Veritas says it now has new AI-driven capabilities and user interface enhancements that it claims will “remove uncertainty” around data recovery.

The company says cyber recovery will be made “simpler, smarter and faster” through streamlined navigation and operations management with the Veritas Alta View platform. The dashboard integrates AI-driven insights and a cyber risk score for real-time, actionable analytics.

“Enhanced visualisation tools allow users to monitor their entire data estate, proactively manage risks and expedite cyber recovery,” claimed Veritas.

In addition, the new Veritas Alta Copilot automatically scans and identifies unprotected assets, recommends and applies tailored protection, and “instantly” integrates with existing protection policies to ensure all critical data is covered.

Also, enhanced security, “accelerated” threat detection and a “more rapid” ransomware response is being promised, through hash-based tracking of malware in backup data and blast radius analysis. Once malware is identified, new functionality reduces the time to scan and assess the spread across the entire estate by “up to 93 percent”, the company claimed.

It also asserts the new interactive guide could aid proactive disaster management and cyber recovery, by letting IT teams create, automate, test and edit workflow plans. Blueprints can be customised at a relatively granular level across multiple domains, including hybrid, platform-as-a-service and container environments, “ensuring tailored and effective” risk management, Veritas said.

And optimized recovery is now possible through proactive, in-depth analysis, that provides recommended recovery points. This, we’re told, reduces recovery time and potential data loss by eliminating the need to manually identify the “last known good copy,” relying instead on risk engine analysis to minimise the dependence on costly malware scans.

“We’re focused on making recovery simpler, smarter and faster. With expanded AI assistance and intuitive management, we are eliminating the guesswork and trial and error from the recovery process,” said Deepak Mohan, executive vice president of engineering at Veritas. “Organizations can now bounce back from ransomware attacks quickly and confidently, minimising business disruption.”

The Veritas Alta Copilot features will be available next month. All the other bits and pieces will be provided through updates to Veritas NetBackup, Veritas Alta Data Protection, and Veritas Alta View this month.

Panzura unveils Symphony to make data operations sound easier


Cloud data management firm Panzura has hit go on its Symphony product, promising support for various file system deployments and protocols under a single pane of glass.

It supports on-premises, private, public, and hybrid cloud object storage workloads with Amazon Web Services, Microsoft Azure, Google Cloud Platform, and Wasabi Hot Cloud Storage, among others.

With a unified dashboard and user interface, the platform is built to enable ITOps and line-of-business leaders to perform automated, exabyte-scale data discovery and assessment, along with risk and compliance analysis, and dynamic data movement orchestration.

Governance and management policies can be applied for file and object data inspection, we are told, to provide greater insights and reporting, data mobilization and archiving, and workflow and AI pipeline automation integration for DevOps support.

Transparent to users and applications, original namespace, security, and file metadata is preserved, supporting autonomous streaming of content on demand. The result is “dramatically reduced storage costs”, claims Panzura, along with efficient workflows, and a more proactive and compliant data posture.

“Panzura Symphony brings together proven technology and Panzura’s future vision where data management meets strategy for business success,” said Dan Waldschmidt, CEO of Panzura, in a statement.

The platform announcement follows Panzura’s recent buy of Moonwalk Universal, intended to help provide an expanding tool set. Panzura bought Moonwalk for its hybrid cloud unstructured data estate scanning, migration and tiering technology.

Symphony is designed to augment the features of Panzura Data Services, which provides data visibility, governance, and real-time access across ecosystems, and sit alongside the functionality of Panzura CloudFS, a hybrid cloud file services platform that supports large-scale multi-site workflows for the most active data.

It is also said to extend Panzura’s hybrid cloud portfolio to the next “ring” of unstructured data – the other 90 percent of file data for which organizations need synchronous visibility and control to make use of digital transformation in their own contexts. Symphony therefore changes the way businesses gain insights into their entire data estate, allowing teams to conduct comprehensive data discovery and assessments, providing a detailed overview of data including its structure, content, and relationships.

For instance, Symphony lets businesses identify cost optimization possibilities based on data volatility, temperature, file type, file size, ownership, and metadata tags. Additionally, it provides more granular control at the folder, share, entire file system or bucket level, and can aggregate data across multiple file systems and object stores.

Symphony also helps streamline data placement, transformation, and restructuring to support artificial intelligence workflows and common formats for faster data analysis. And it lets teams aggregate storage consumption for chargebacks within their organization, to ensure proper cost control by department.

On top of that, teams can extend Symphony’s compliance capabilities with key integrations. For example, integration with IBM Fusion allows them to take automated, appropriate policy-driven action for handling personally identifiable information (PII) and other sensitive data. This helps organizations prepare for regulatory audits and demonstrate adherence to data protection standards.

Lakehouse startup promises big data processing savings after $10M funding round

e6data has announced a $10 million Series A funding round as it seeks to tap into firms looking to trim the cost of analyzing their own data.

The data lakehouse startup says it is aiming to “level the playing field” for customers by negating the “immense pricing power” a handful of vendors in the space currently enjoy, due to what it characterizes as various “new forms of compute ecosystem lock-ins” at different layers of the data stack.

In today’s digital-first landscape, said e6data, enterprises rely on powerful data and AI capabilities to fuel innovation, enhance customer experiences, and optimize operations. However, they are set to spend an estimated $100 billion in 2024 on data intelligence platforms to derive value from their own data.

Focused on this data compute spend, e6data says it aims to halve that bill. Its Series A funding round was led by Accel Partners, with participation from Beenext and others.

“The rapid increase in spending has made data intelligence platforms the second largest IT spending category, behind only cloud spend, for operational systems and application infrastructure,” said Vishnu Vasanth, co-founder and CEO of e6data. “It’s fuelling the meteoric rise of data warehouse and data lakehouse companies such as Snowflake and Databricks, and the rapid growth of corresponding offerings from AWS, Azure, and Google Cloud.”

However, as the spending grows, concerns over whether businesses are achieving a return on investment (RoI) are reaching “boiling point,” he adds. “Legitimate RoI concerns stand in the way of enterprises realizing the full potential of data and AI.” He says organizations cannot freely move lakehouse table formats, data catalogs, compute providers, and cloud providers without “adverse” price-performance impacts, the need for data movement, and “cumbersome” application migrations.

To address the challenges, e6data has developed a “compute engine” for data intelligence platforms that helps enterprises amplify RoI on their existing platforms and architectures, and “escape” ecosystem lock-ins. And it is promised there will be “zero friction” to adoption, with zero data movement, zero application migration, and zero downtime.

e6data is offering its Lighthouse Customer Program as a managed service to enterprise customers, complete with production support and professional services.

e6data says data lakehouses and warehouses use, at their core, distributed compute engines, whether open source or vendor-backed, for every form of processing spanning ingestion, transformation, dashboards, reports, ML model training and inference, as well as RAG-based generative AI applications.

However, it maintains that existing compute engines are built on “monolithic architectures” with centralized components for most aspects of a query or job’s life cycle. This creates challenges with respect to cost, performance, concurrency handling, and uptime, says e6data, particularly on compute-intensive heavy workloads that enterprises increasingly encounter as they operate at production scale.

e6data promises it can address these challenges with a new engine architecture and distributed processing model that is disaggregated, decentralized, and Kubernetes-native. It claims its engine outperforms leading commercial and open source solutions across “real-world heavy workloads” and popular benchmarks, using a “truly format-neutral approach” that negates ecosystem lock-in. e6data says it has already signed up publicly listed Fortune 500 enterprises, as well as “high growth” companies as customers. But the proof of the pudding is in the eating, as they say.

Odaseva widens data management, security support for Salesforce

Odaseva, the enterprise data security platform for Salesforce, has expanded its product line after a $54 million Series C funding round last quarter.

The company, which launched its Data History module for Salesforce earlier this year, has now enhanced its technology across the areas of data governance, zero trust, and ransomware.

With Governance Center, enterprises with multiple Salesforce organizations can monitor and govern data across all of them through a single system, which the vendor claims will allow them to gain actionable insights as well as monitor critical business metrics like Salesforce adoption and spend, security scores, and carbon footprints.

Another fresh addition is Zero Trust Connect, which allows enterprises with ultra-sensitive data to extend their Salesforce Shield tech by bringing end-to-end encryption to data leaving Salesforce. This feature aligns with the zero trust security model preferred by large, intensely regulated global enterprises, by ensuring that data remains encrypted throughout its entire lifecycle, even when interacting with external applications.

Odaseva has also now introduced ransomware prevention, with customers able to enforce immutability on backed-up files held on storage systems such as Azure and AWS, so that no one can remove them until a set retention period has expired.

“This is especially important for customers in highly-regulated industries that must meet regulatory requirements, and it aids in preventing ransomware and insider attacks,” the company said.
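As a general illustration of enforcing immutability on cloud object storage, the sketch below writes a backup object to an S3 bucket (assumed to have Object Lock enabled) with a compliance-mode retention date; the bucket and key names are hypothetical, and this is not Odaseva’s own API.

```python
# Sketch of the general mechanism, assuming an S3 bucket created with Object Lock enabled:
# the object cannot be deleted or overwritten until the retention date passes.
from datetime import datetime, timedelta, timezone
import boto3

s3 = boto3.client("s3")
retain_until = datetime.now(timezone.utc) + timedelta(days=30)

s3.put_object(
    Bucket="salesforce-backups",                   # hypothetical bucket with Object Lock enabled
    Key="backups/2024-09-16/accounts.json.gz",
    Body=open("accounts.json.gz", "rb"),
    ObjectLockMode="COMPLIANCE",                   # compliance mode: no user can shorten or remove the lock
    ObjectLockRetainUntilDate=retain_until,
)
```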

“Odaseva’s new product enhancements strengthen our position as the leader in the enterprise data security space for Salesforce,” claimed Sovan Bin, CEO and founder of Odaseva, although his company faces stiff competition in the Salesforce data protection space, from the likes of Veeam, Keepit, IBM, Kaseya, Varonis, Druva, HYCU, Cohesity, and many others. And, of course, these named competitors protect multiple data workloads, not just Salesforce.

The company certainly now has deeper pockets to try to make an impression on the market. This June, Odaseva sealed a $54 million Series C equity financing round, bringing its total funding to more than $90 million to date.

Storage News Ticker – September 16

Khara, a Japanese motion picture production company best known for creating the popular Evangelion anime series, has selected Wasabi as its cloud storage provider, moving its operations and 500TB of archival data from on-premises systems to the cloud.

The files in its 500TB archive make up the intellectual property assets of the company, with every new production adding more data to its archive. In an effort to avoid hardware failures, Khara turned to Wasabi to keep its data secure and control costs.

“A major advantage of Wasabi cloud storage is that it frees us from frequent HDD failures and hardware replacements, allowing us to concentrate on system administration work,” said Kazuki Misawa, chief system engineer at Khara. “Repeated failures with our hardware increased the risk of losing irreplaceable data and the fear of this happening kept me awake at night as the large amount of data in each hard disk made replacement a nerve-wracking chore – replacing and rebuilding data on a single disk would often take one or two weeks.”

With Wasabi’s cloud storage, the infrastructure cost is “comparable” to the existing system, and the management and operation cost will be reduced by about 80 percent, is the claim. “We’re working with the innovative team at Khara to help keep their data safe and eliminate the headaches associated with replacing and rebuilding their on-premises solutions,” added Aki Wakimoto, Japan country manager at Wasabi. “We will enable Khara’s IT team to work more efficiently and help drive creative ideating without being burdened with worries of data loss.”

———-

Data streaming firm Confluent has acquired WarpStream and its bring-your-own-cloud (BYOC) data streaming architecture.

With the acquisition, Confluent claims it now has a data streaming offering “for every company” – which includes fully managed with Confluent Cloud, self-managed with Confluent Platform, or BYOC with WarpStream.

“Confluent wants to offer data streaming to all customers with all requirements and workloads. I’ve been deeply impressed with WarpStream as it’s BYOC done right,” said Jay Kreps, CEO of Confluent.

Richard Artoul, CEO of WarpStream, added: “The leader in the data streaming space has acquired WarpStream to offer next-gen BYOC. Together, we will continue to ensure that Apache Kafka-compatible data streaming is accessible to every organization.”

WarpStream’s BYOC approach is built directly on object storage, just like Confluent’s Kora engine, and brings managed data streaming benefits into the customer’s cloud. “In time,” said Confluent, features like processing and governance will be added to WarpStream BYOC, to provide a “complete” data streaming platform solution for high-volume logging and observability workloads.

———-

Privately held cloud computing platform Vultr is working with GPU-accelerated analytics platform provider HEAVY.AI. Integrating Vultr’s global Nvidia GPU cloud infrastructure into its operations, HEAVY.AI says it can interactively query and visualize massive datasets, enabling faster, more efficient decision-making for customers across diverse sectors.

“Partnering with Vultr has allowed us to leverage their highly performant, global Nvidia GPU cloud infrastructure to provide our customers with better access to unparalleled speed and efficiency,” said Jon Kondo, CEO of HEAVY.AI. “This integration ensures that our platform continues to deliver the rapid insights and cost savings that are critical for our customers’ success.”

Nvidia GH200 Grace Hopper Superchips are combined with Nvidia A100 Tensor Core GPUs and Vultr Bare Metal instances to drive faster insights via HEAVY.AI’s platform. It says it can deliver “5x or greater” price/performance when compared to 8x A100 instances, completing industry-standard analytic SQL benchmarks such as TPC-H100 in less than 4.5 seconds, and executing 11 of the 22 queries in less than 100 milliseconds.

“Vultr is one of the first cloud providers to offer the revolutionary GH200 Grace Hopper Superchip,” said Todd Mostak, CTO of HEAVY.AI.

———-

Enterprise storage firm Infinidat says its InfiniBox technology has been tested to work with Red Hat OpenShift Virtualization. The technical validation opens new possibilities for enterprise customers and channel partners to deploy, migrate, and manage new and existing virtual machine (VM) workloads and virtualized applications using Red Hat OpenShift Virtualization.

Infinidat’s InfiniBox Container Storage Interface (CSI) driver for petabyte-scale Kubernetes deployments was previously certified for Red Hat OpenShift in hybrid and multi-cloud environments, for both high performance enterprise primary storage and data protection/backup needs.

“InfiniBox enabled on Red Hat OpenShift Virtualization is a key step forward for enterprises and service providers to achieve real-world application performance, cyber storage resilience, reduced storage Capex and Opex, and greater simplicity,” said Erik Kaulberg, VP of strategy and alliances at Infinidat. “We support virtualized and containerized applications with an integrated set of trusted tools that maximize the advantages of Red Hat OpenShift Virtualization on a unified platform.”
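For reference, consuming CSI-provisioned storage from Kubernetes or OpenShift typically comes down to requesting a PersistentVolumeClaim against a StorageClass backed by the driver; the hedged sketch below shows that pattern with a hypothetical StorageClass name, not Infinidat’s documented configuration.

```python
# Minimal sketch: requesting a volume from a CSI-backed StorageClass via the Kubernetes Python client.
# The StorageClass name "infinibox-csi" is a hypothetical placeholder.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() when running inside a pod

pvc = client.V1PersistentVolumeClaim(
    metadata=client.V1ObjectMeta(name="vm-disk-01"),
    spec=client.V1PersistentVolumeClaimSpec(
        access_modes=["ReadWriteMany"],                      # shared access suits live-migratable VM disks
        storage_class_name="infinibox-csi",                  # hypothetical CSI-backed StorageClass
        resources=client.V1ResourceRequirements(requests={"storage": "100Gi"}),
    ),
)

client.CoreV1Api().create_namespaced_persistent_volume_claim(namespace="default", body=pvc)
```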

———-

Object First, which provides “ransomware-proof” backup storage appliances purpose-built for Veeam, has appointed Pete Hannah as vice president of sales, Western Europe. He will focus on channel development, the recruitment of partners, and hiring to grow the sales team.

In Q2 2024, Object First says it achieved a 300 percent year-over-year increase in transacting partners, built through the supplier’s channel-only model.

“Pete’s extensive experience in the IT channel, combined with his proven track record of driving growth and building teams, makes him the perfect fit to lead sales in Western Europe,” said David Bennett, CEO of Object First. “As we continue to expand our partner network with the best storage for Veeam, his leadership will allow us to build on our strong momentum.”

Hannah has over 25 years of experience in the IT industry, recruiting and leading teams for growth, through Noima Consultancy, Netgear, TD Synnex, and BT.

———-

Ocient, a data analytics biz, has launched its Data Retention and Disclosure System, a “high-performance”, modular solution for telcos and communications service providers (CSPs). The system promises enhanced data ingestion, retention, and analysis capabilities for network analytics and compliance in a more “cost-, storage-, and energy-efficient footprint.”

Customers can scale their data collection, retention, and analysis capabilities for a growing network of metadata stores, while enabling “rapid” search and retrieval of records “within seconds,” we are told.

“Telcos and CSPs have historically been challenged to effectively retain, search, and analyze their growing and often overwhelming volume of network and communications data,” said Chris Gladwin, CEO of Ocient. “With our Data Retention and Disclosure System we’re equipping customers with the capabilities they need to address the technical burden posed by this data, while helping them avoid fines and penalties associated with not analyzing this data fast enough. We are also enabling them to reduce the cost, size, and energy consumption of these high-volume network data repositories.”

———-

Cloud storage firm Qumulo has announced the general availability of Cloud Native Qumulo (CNQ) on Amazon Web Services (AWS). The release is an enterprise multi-protocol system designed to manage unstructured data, seeking to address scalability, flexibility, and cost efficiency demands.

CNQ is designed to cater to various industries, including healthcare and life sciences, media and entertainment, higher education, financial services, and the energy sector. For customers in regulated industries or those supporting federal entities, CNQ can be deployed in AWS GovCloud, offering a secure, compliant option that meets stringent government and regulatory requirements.

“For the first time, customers can use CNQ on AWS to elastically burst performance or capacity at will, customizing the storage instance to meet dynamic workload demands,” said Kiran Bhageshpur, chief technology officer at Qumulo. “CNQ can be deployed within minutes and updated within seconds, enabling customers to run enterprise file workloads within their virtual private clouds (VPCs), with optional use of S3 intelligent tiering to reduce infrastructure costs further.”

CNQ offers a “transformative” approach to cloud economics, priced “up to 80 percent less” than legacy file offerings. By using AWS S3 for long-term data persistence, CNQ “significantly” reduces costs without sacrificing performance.

———-

Data unification and management firm Reltio has introduced its Data Pipeline for Databricks. We’re told users will no longer have to spend time and money creating their own custom integrations between the two products.

The Pipeline pushes real-time, “insight-ready” data from Reltio’s offerings – Reltio Customer 360 Data Product, Reltio Multidomain Master Data Management, and Reltio Entity Resolution – to the Databricks Data Intelligence Platform.

Powering Reltio’s offerings, the Reltio Connected Data Platform unifies, standardizes, and enriches data continuously as it is ingested from various systems. This data is then made available downstream into Delta Lake tables through the Reltio Data Pipeline for Databricks, enabling various operational and strategic analytics and AI use cases “with minimal effort”, said the firm. Trusted, real-time data from Reltio then provides time-sensitive and strategic insights, while reducing data preparation efforts around the customer’s Databricks environment.
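Once the pipeline has landed unified records in Delta Lake tables, consuming them from Databricks is plain PySpark; the sketch below is a minimal, hypothetical example with made-up table and column names, not Reltio’s documented schema.

```python
# Minimal sketch: reading a Delta table kept current by an upstream pipeline and aggregating it.
# Table and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("reltio-consumer").getOrCreate()

customers = spark.read.table("reltio_sync.customer_360")   # Delta table populated by the pipeline

(customers
    .groupBy("country")
    .agg(F.countDistinct("customer_id").alias("unified_customers"))
    .show())
```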

“By working closely with Databricks and building a pipeline to its popular Data Intelligence Platform, our customers can more easily support their strategic and operational use cases that require delivery of trusted data in real time,” said Venki Subramanian, senior vice president of product management at Reltio. “This is especially timely as businesses want to easily take their AI/ML, business intelligence, and business analytics initiatives to the next level, and poor data quality would hinder their efforts.”

Roger Murff, VP of technology partners at Databricks, added: “The Reltio Data Pipeline for Databricks significantly enhances our customers’ ability to access high-quality, real-time data for their AI/ML initiatives. Together, we are empowering organizations to get more from their data and accelerate their paths toward becoming truly data-driven organizations.”