HPE fluffs up on-prem GreenLake model, adds new services, aims to give users the ‘cloud experience’

Beachy Head lighthouse

HPE has crammed new offerings — spanning silicon, software and services, with security additions — into its on-premises GreenLake model, both to extend its scope and to make its products easier for customers to use.

GreenLake is HPE’s services and subscription business model for using its products as services in on-premises and hybrid cloud data centres, colocation centres, remote offices and other edge locations. Its aim is to make using HPE’s hardware, software and services more like a public cloud operating model.

Antonio Neri, HPE CEO

Antonio Neri, HPE president and CEO, said in a public statement: “Organisations today know that to succeed in their industries, they must pursue a cloud everywhere mandate, which enables them to collect, analyse, and act on data, wherever it resides. … [GreenLake] empowers organisations to harness the power of all their data, regardless of location.”

At its virtual Discover event, HPE announced it was adding services to GreenLake to extend its usability.

Lighthouse is cloud-native code that enables customers to add and run HW+SW system configurations for new cloud services with a few clicks in the GreenLake Central front-end management facility. It is built on HPE’s Ezmeral container building and orchestration platform. HPE claims this autonomously optimises different cloud services and workloads by composing resources to deliver the best performance, lowest cost, or a balance of both, depending on business priorities.

Project Aurora delivers cloud-native, zero-trust security and will be embedded within the HPE GreenLake cloud platform building blocks to automatically and continuously verify the integrity of the hardware, firmware, operating systems, platforms, and workloads, including security workloads. 

HPE claimed this integrity verification or continuous attestation could be used to “automatically detect threats from silicon to cloud, in seconds compared to today’s average of 28 days”. It can minimise data loss, unauthorised encryption — HPE does not use the term ransomware — and data and intellectual property corruption.
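HPE has not published how Aurora’s continuous attestation is implemented. As a loose illustration of the general idea only, a verifier can hold known-good digests of firmware and system software, hash the currently running components, and flag any drift; the component names and contents below are invented for the sketch.

```python
import hashlib

# Known-good digests a verifier would hold (contents invented for illustration).
GOLDEN = {
    "firmware": hashlib.sha256(b"firmware-image-v1").hexdigest(),
    "os-kernel": hashlib.sha256(b"kernel-image-v1").hexdigest(),
}

def measure(blob: bytes) -> str:
    """Hash a component's current contents, as a measurement agent might."""
    return hashlib.sha256(blob).hexdigest()

def attest(measurements: dict) -> list:
    """Return the components whose measurement no longer matches known-good."""
    return [name for name, digest in measurements.items()
            if GOLDEN.get(name) != digest]

# An untampered system attests cleanly...
ok = attest({"firmware": measure(b"firmware-image-v1"),
             "os-kernel": measure(b"kernel-image-v1")})

# ...while a modified kernel image is flagged on the next check.
bad = attest({"firmware": measure(b"firmware-image-v1"),
              "os-kernel": measure(b"kernel-image-TAMPERED")})
```

Running such checks continuously, rather than only at boot, is what would shrink detection time from weeks to seconds.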

Aurora uses silicon root of trust technology and, HPE suggested, when combined with open-source technologies like SPIFFE and SPIRE, could “enable DevOps and SecDevOps engineering teams to deliver workload identities rooted in continuously verified hardware”.
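For context, SPIFFE names each workload identity as a URI of the form `spiffe://<trust-domain>/<workload-path>`, which SPIRE issues to workloads as short-lived SVID credentials rooted in node attestation. A minimal, illustrative parser for that naming scheme (not HPE or SPIRE code):

```python
from urllib.parse import urlparse

def parse_spiffe_id(spiffe_id: str):
    """Split a SPIFFE ID into its trust domain and workload path.

    SPIFFE IDs take the form spiffe://<trust-domain>/<workload-path>.
    """
    parts = urlparse(spiffe_id)
    if parts.scheme != "spiffe" or not parts.netloc:
        raise ValueError(f"not a SPIFFE ID: {spiffe_id}")
    return parts.netloc, parts.path

# e.g. a hypothetical identity for a payments workload in the
# example.org trust domain
domain, path = parse_spiffe_id("spiffe://example.org/payments/api")
```

Aurora’s pitch, in these terms, is that the hardware root of trust vouches for the node on which such identities are issued.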

The Aurora code will be initially embedded into GreenLake Lighthouse and then into GreenLake cloud services and HPE’s Ezmeral software.

Silicon on demand involves HPE offering flexible consumption capabilities for Xeon CPUs and Optane persistent memory. It has been developed with Intel. Customers can instantly activate and pay for more CPU/Optane capacity with just a click, removing the need to order and install new processors and Optane PMem modules. 
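HPE has not published how silicon on demand is metered or priced. Purely as a hypothetical sketch of a pay-per-activated-capacity model, charges might accrue per core-hour of activated capacity; the rate and billing unit below are invented.

```python
from dataclasses import dataclass

@dataclass
class Activation:
    """One click-to-activate event: extra cores turned on for some hours."""
    cores: int
    hours: float

def metered_charge(events, rate_per_core_hour: float) -> float:
    """Sum pay-per-use charges across activation events.

    The core-hour billing unit and rate are invented for this sketch;
    HPE has not disclosed its actual metering formula.
    """
    return sum(e.cores * e.hours * rate_per_core_hour for e in events)

# A quarter-close burst: 16 extra cores for 72 hours, then 8 more for
# 24 hours, at a notional $0.05 per core-hour.
bill = metered_charge([Activation(16, 72), Activation(8, 24)], 0.05)
# 16*72 + 8*24 = 1,344 core-hours -> $67.20
```

The point of the model is that this capacity is already physically installed and merely latent until activated, so the burst needs no hardware order or installation visit.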

The cloud-based Compute Cloud Console (CCC) provides unified compute operations as a service and builds upon HPE’s Data Services Cloud Console for Alletra array management announced in May. It monitors and manages compute (server) facilities, automating compute operations across an organisation’s entire fleet, such as provisioning and lifecycle management.

We have not seen any datasheets or tech backgrounders for these new GreenLake Lighthouse, Aurora, silicon on demand or Compute Cloud Console offerings and have no detailed commentary to offer.

Vertical market service offerings

There are new vertical market-optimised on-premises GreenLake services covering applications for 5G, electronic medical records, financial services, data and risk analytics, high performance computing (HPC), artificial intelligence (AI) and more.

Keith White, SVP and GM of GreenLake Cloud Services, positioned them thus: “For the industry-critical workloads that power their business, organisations shouldn’t have to choose between the cloud experience, or the security and control of keeping their apps and data on-premises.”

In his view, GreenLake delivers the cloud experience on-premises.

Here is a list of the new services:

  • GreenLake for Electronic Medical Records (EMR) for Epic applications with validated configurations, management services and cloud experience;
  • GreenLake for Core Payment Systems in partnership with Lusis for the financial services market;
  • GreenLake for Splunk, using HPE’s Ezmeral Container Platform, to collect, analyse, and act upon the data generated by an organisation’s technology infrastructure, security systems, and business applications;
  • GreenLake for 5G core deploys HPE’s 5G Core Stack and Telco Core Blueprints so carriers can get a purpose-built, open, cloud-native 5G core with lower up-front investment;
  • GreenLake for Azure Stack HCI and SQL Server allows joint customers to consolidate virtualised Windows and Linux workloads; 
  • General availability of GreenLake for High Performance Computing, GreenLake for Machine Learning Operations, GreenLake for SAP and GreenLake for Virtual Desktop Infrastructure. 


GreenLake is both a business model and a set of application-focused services that run on HPE hardware and software. 

HPE has said it has over 1,200 GreenLake customers representing $4.8 billion in total contract value, a 95 per cent customer renewal rate, and more than 900 partners worldwide actively selling GreenLake. In its most recent quarter, GreenLake grew annual recurring revenue 30 per cent and orders 41 per cent year on year.

The Project Aurora code can verify the integrity of the hardware and system software used in GreenLake services. But it cannot verify users themselves, separating genuine users from attackers who use stolen identities to mount ransomware attacks on an organisation’s data.

Putting better locks on doors and windows and checking their status is all very well, but it does not prevent phoney users, with apparently valid identities and access status, from penetrating a network. That is the huge security problem that still needs to be addressed.

The Compute Cloud Console looks like a natural partner for the already-announced Data Services Cloud Console, and B&F wouldn’t be surprised to see the two integrated in some way, such that users eventually have a single console for monitoring and managing both compute and shared storage.

HPE is erecting a GreenLake operations management superstructure over its hardware and software products that makes them easier to consume and manage. We could imagine this being extended to cover HPE-supplied services in the public clouds with customers operating within a GreenLake environment wherever their IT facilities are located.