Scality: there are degrees of immutability

Object storage supplier Scality released the results of a survey of US and European decision-makers about immutable storage, saying that not all immutability is created equal – some forms still leave a window of exposure. Chief marketing officer Paul Speciale blogged about true immutability, and this sparked our curiosity. We asked him some questions about this via email.

Blocks & Files: Are there degrees of immutability? If so, what are they and how do they matter?


Paul Speciale: Yes, there are degrees. To first provide some context, we view immutability as a cornerstone of an organization’s data cyber resiliency strategy, especially for backup storage in the face of today’s rising ransomware threats. Cyber criminals understand that as long as you can recover (restore), you are less likely to pay their ransom, so attacks on backups are much more common now. Immutability puts up significant barriers to common attack threats that modify (encrypt) or delete backup data. 

That said, there are indeed different forms of immutability offered in different storage solutions that offer varying degrees of protection. The key factors to consider here include:

  • Does the underlying storage have true immutable properties, or can data essentially be edited or modified in place, making the underlying storage effectively “mutable”?
  • Is the data immutable at the moment it is stored, or is there a delay that creates a window of exposure to ransomware? How long is that delay in practice? (One way to check this via the S3 API is sketched after this list.)
  • How tightly integrated is the immutability capability with the writing application (is there an API that can be used to enable and configure immutability directly in the backup storage system)?
  • Are immutable backups kept online for fast restore in the event one is needed?
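On the second point, one way to verify there is no exposure window is to write an object with a retention lock and immediately read that lock back. Below is a minimal sketch using Python and boto3 against an S3-compatible, Object Lock-enabled bucket; the bucket and key names are hypothetical and not from the interview.

```python
import boto3
from datetime import datetime, timedelta, timezone

s3 = boto3.client("s3")

BUCKET = "backup-bucket"      # hypothetical bucket with S3 Object Lock enabled
KEY = "backups/job-0001.vbk"  # hypothetical backup object

# Write the backup object with a retention lock applied at write time.
resp = s3.put_object(
    Bucket=BUCKET,
    Key=KEY,
    Body=b"...backup data...",
    ObjectLockMode="COMPLIANCE",
    ObjectLockRetainUntilDate=datetime.now(timezone.utc) + timedelta(days=30),
)

# Immediately read the retention settings back. If the lock is already in
# force, there is no window in which this object version is still mutable.
retention = s3.get_object_retention(
    Bucket=BUCKET, Key=KEY, VersionId=resp["VersionId"]
)["Retention"]
print(retention["Mode"], retention["RetainUntilDate"])
```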

True immutability provides a critical last line of defence against ransomware attacks and data loss, by preventing any alteration or deletion of data. Other forms of immutability are more focused on preserving data states at certain points in time, which doesn’t provide the same cyber resilience benefits as those preventing the alteration or deletion of the actual data.

Blocks & Files: How much do the degrees of immutability cost?

Paul Speciale: Ransomware protection is really what imposes longer retention periods and higher costs – immutability by itself does not. By increasing the retention period of the backups, organizations are giving themselves more time to detect that an attack has occurred, and they’re also increasing the probability that a “clean” (non-infected) backup will be available to restore in case of an attack.

Regarding cost, the impact is dependent on the specific immutability mechanism in the backup solution implementing it. For example, data reduction mechanisms such as deduplication or incremental forever backups will reuse unique data blocks (S3 objects) at different restore points.

To manage different immutability periods from restore points using the same unique S3 objects in an object store, backup software vendors may implement different strategies with different cost implications. One strategy can be to have multiple copies of these unique S3 objects, each retaining a different immutability period. In that case, the cost of immutability translates directly into additional capacity requirements.

Another strategy is to extend the immutability period of these unique S3 objects using put-object-retention operations. In that case, the cost of immutability translates into additional S3 operations. This approach incurs more costs on public clouds charging for S3 operations, but it comes at no cost for on-premises object storage such as Scality provides in RING and ARTESCA. 
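As an illustration of that second strategy, the sketch below extends the lock on an existing unique object by issuing a new put-object-retention call with a later retain-until date (compliance-mode retention can only be lengthened, never shortened). It is a minimal example in Python with boto3; the bucket, key and date are hypothetical.

```python
import boto3
from datetime import datetime, timezone

s3 = boto3.client("s3")

BUCKET = "backup-bucket"        # hypothetical object-lock-enabled bucket
KEY = "blocks/unique-block-42"  # hypothetical unique block reused by several restore points

# A newer restore point still references this block, so push its
# retain-until date further out: one S3 operation per object, no extra copy.
s3.put_object_retention(
    Bucket=BUCKET,
    Key=KEY,
    Retention={
        "Mode": "COMPLIANCE",
        "RetainUntilDate": datetime(2026, 1, 31, tzinfo=timezone.utc),
    },
)
```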

Blocks & Files: Are there different restoration regimes for immutable data stores? How should we categorize them?

Paul Speciale: Yes, restoration regimes will depend on the type of immutable storage that is employed – this can range from backups on tape, on cloud storage such as AWS S3 or Glacier, backup appliances, NAS/file system snapshots and modern object storage. Each of these options presents its own tradeoffs in how data is restored and, moreover, how quickly it can be restored.

Corporate backup strategies should always include multiple copies of backup data. The famous 3-2-1 rule dictates at least three copies, but we often see organizations maintain more. In the case of on-premises application data, a multi-tiered backup policy that places strong emphasis on true immutability and rapid (online/disk-based) restore for performance and capacity tiers, and uses tape or cloud storage for longer-term retention can be an effective strategy.

Scality supports all-flash and hybrid RING and ARTESCA configurations to protect the most restore-time-sensitive workloads. Both solutions combine the main virtues of true immutability, high restore performance and low TCO for long-term backup retention, making them an optimal choice for multiple tiers of a backup strategy.

Blocks & Files: Can you tell me about the immutability exposure window?

Paul Speciale: Many traditional immutable solutions leave gaps between writes. Some solutions may stage backups on standard storage before moving them to immutable media. File systems typically make data immutable via scheduled, periodic snapshots. Hours or more may pass since the last snapshot was taken, and more frequent snapshot intervals become increasingly costly in terms of capacity overhead and storage performance (especially when extensive snapshot histories are maintained).

In addition, perhaps the most important issue with exposure windows is that when they exist, backups may be deleted or corrupted, requiring additional operations and complexity to restore from snapshots. This can result in data retention gaps, leaving openings for data loss from malicious internal or external actors.

Blocks & Files: How can it be reduced?

Paul Speciale: The only way to eliminate this exposure window is to make data immutable the instant it is stored, each and every time it is written. Doing so enables data to be restored from any point in time. The emergence of an immutability API in the popular AWS S3 API (S3 object locking) has created a de facto standard interface for data protection applications to directly enable, configure, and manage immutability in an optimal manner. Veeam Data Platform, for example, can enable object locking, set retention policies, and also set compliance mode enforcement directly in the object store through the APIs.
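To illustrate what enabling and configuring immutability directly through the API can look like, here is a minimal Python/boto3 sketch with hypothetical names: a bucket is created with Object Lock enabled and given a default compliance-mode retention rule, so every object written to it is locked from the moment it lands. This is a generic illustration of the S3 calls involved, not a description of any particular vendor's implementation.

```python
import boto3

s3 = boto3.client("s3")
BUCKET = "backup-repo-bucket"  # hypothetical backup repository bucket

# Object Lock must be enabled when the bucket is created; it cannot simply be
# switched on later for an ordinary bucket. (Depending on region or endpoint,
# a CreateBucketConfiguration/LocationConstraint may also be required.)
s3.create_bucket(Bucket=BUCKET, ObjectLockEnabledForBucket=True)

# Apply a default retention rule: every new object version is locked in
# compliance mode for 30 days, with no delay and no per-object call needed.
s3.put_object_lock_configuration(
    Bucket=BUCKET,
    ObjectLockConfiguration={
        "ObjectLockEnabled": "Enabled",
        "Rule": {"DefaultRetention": {"Mode": "COMPLIANCE", "Days": 30}},
    },
)
```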

Blocks & Files: Should organizations have a formal and known immutability exposure time objective, set like a policy and variable across files, folders and objects?

Paul Speciale: Organizations should define and track their recovery point objective (RPO) and schedule their backups accordingly. Most workloads will tolerate a one-day RPO, and traditional daily backups are typically scheduled for that purpose.

As discussed, backups remain exposed before they are made immutable, so those backups can be deleted or altered. If the most recent backups are deleted, the organization can only restore older immutable backups, thus increasing the RPO.

But immutability exposure time has the most impact on the recovery time objective (RTO). Exposure means you cannot trust your latest backups, and you will potentially need to perform additional operations (e.g. roll back a snapshot) before you can restore. Having to connect to multiple administrative interfaces to perform manual actions will impose additional delays in recovery.

To restore at a higher throughput, having a very high-performance storage system for backups is certainly ideal. But are you really going to restore faster if you first need to spend your time on manual rollback operations before you can start the actual restore process? Admins must carefully consider the impact of these manual operations, plus the “immutability exposure time”, when setting their organization’s RPO and RTO objectives.

Blocks & Files: What can you tell me about flaws in an organization’s immutability armor?

Paul Speciale: Immutability as it is normally implemented by most storage solutions protects against attacks that go through the data path – for example, from malware instructing the storage to delete its data. But ransomware actors have become increasingly sophisticated in their attack methods to increase their chances of forcing ransom payments – and a newer breed of AI-powered cyber attacks is raising the stakes even higher. 

It’s important that organizations are aware of (and address) advanced attack vectors that can make storage systems vulnerable:

  • The first attack vector is the operating system, which needs to be hardened to prevent any root access.
  • Server consoles such as IPMI and iDRAC are also vulnerable and can provide full access to the underlying storage. They should either be placed on a dedicated secure network or disconnected entirely.
  • External protocols such as NTP and DNS also need to be protected to avoid time-shifting attacks or DNS hijacking.

Immutable object storage solutions are now available that provide the strongest form of data immutability plus end-to-end cyber resilience capabilities to protect against the threat vectors described above.

Blocks & Files: How can they be detected and fixed? 

Paul Speciale: We can refer back to our principles to find solutions that combine true immutability at the storage layer, API-based control to allow applications to enable and configure immutability directly in the storage system, and fast/online restore capabilities. 

Moreover, we should mention that the latest data protection solutions from vendors like Veeam are able to leverage object storage directly for backups (direct-to-object over S3), high performance (multi-bucket SOBR), instant restore (VM Instant Recovery), backup integrity validation (VM SureBackup), and more.

There are several elements that we recommend organizations ensure are integrated into their immutable storage:

  1. Instantaneous data lock: The second you store object-locked data, it is immutable, with zero time delay.
  2. No deletes or overwrites: Make sure your data is never deleted or overwritten. This blocks ransomware attackers from encrypting or deleting data to prevent you from restoring. If any changes are made, a new version of the object is created, leaving the original data intact (sketched after this list).
  3. Support for AWS-compatible S3 Object Locking APIs: Enabling immutability at the API level prevents user or application attempts to overwrite data when issuing S3 commands against a data set.
  4. Configurable retention policies: These are important because organizations are required to keep data for varying time periods. The ability to customize such retention policies means they can keep their data protected and fully immutable during this time.
  5. Compliance mode: This capability prevents anyone (including the system superadmin) from changing immutability configurations.
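To make the "no deletes or overwrites" behavior in item 2 concrete, here is a minimal Python/boto3 sketch with hypothetical names showing what happens when an attacker, or a misbehaving application, tries to tamper with a compliance-locked object: an overwrite only creates a new version, and a delete of the locked version is refused.

```python
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")
BUCKET = "backup-bucket"                   # hypothetical object-lock-enabled bucket
KEY = "backups/job-0001.vbk"               # hypothetical locked backup object
LOCKED_VERSION_ID = "example-version-id"   # hypothetical version id of the locked copy

# An overwrite does not touch the locked data: versioning (required by
# Object Lock) simply creates a new object version, leaving the original intact.
s3.put_object(Bucket=BUCKET, Key=KEY, Body=b"attacker-encrypted garbage")

# Deleting the locked version itself is rejected while the compliance-mode
# retention period is in force.
try:
    s3.delete_object(Bucket=BUCKET, Key=KEY, VersionId=LOCKED_VERSION_ID)
except ClientError as err:
    print("Delete refused:", err.response["Error"]["Code"])  # e.g. AccessDenied
```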

Blocks & Files: Is there a way to scan an organization’s data environment for immutability weaknesses?

Paul Speciale: Yes. First, the organization needs to understand the capabilities and potential limitations of immutable storage. Start by ensuring that immutable backups can be swiftly restored in case of an attack. Periodic and regular restore testing in disaster-like scenarios is strongly encouraged. Backup vendors have implemented tools to make these types of real-world restore validations more practical, and having a fast and reliable backup storage solution is mandatory to cope with this additional workload.

Vendor capabilities and immutability implementations can also be validated by third-party agencies. Scality, for example, has worked with the independent validation firm Cohasset Associates, and has published SEC 17a-4(f) and FINRA 4511(c) compliance assessments.

And look for storage solutions that offer multiple internal layers of security to avoid the novel threat vectors mentioned above. To address the need for end-to-end security, Scality is setting a new standard of cyber resilient ransomware protection with its comprehensive CORE5 approach.