09 February 2021
4 min. read

"Right-Size" or "Lift & Shift"? Worthy considerations when moving your SAP HANA BI to the cloud

Leverage your Move to the Cloud

When making the business case to move your on-premise SAP HANA BI landscape to the cloud, savings on run & maintain costs in the order of 25% are not uncommon.

Moving your SAP HANA to a major cloud provider like AWS brings further advantages such as agility and flexibility, and can reduce operating costs by up to 50% when using the Lemongrass Cloud Platform (LCP), which is fully dedicated to SAP on AWS.

The true opportunity that is often overlooked, however, is optimising your HANA BI data storage as an integral part of your migration to the AWS cloud. The following “right-sizing” case study illustrates this best practice and its returns.

Case Study Background

In partnership with Lemongrass, ONE labs recently engaged with the world’s largest toy manufacturer to sharpen their cloud business case and to help them size and prepare a potential move to the AWS cloud in an optimal way.

This toy manufacturer runs a 14 TB BW on HANA production instance that holds more than 14 TB of data, roughly 200% of the data volume SAP would typically advise for that memory footprint. Their operations team frequently offloads data manually to keep sufficient HANA memory free for processing and to avoid adverse impact on business users.
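The 200% figure follows from the commonly cited SAP sizing rule of thumb that the data footprint of a HANA system should stay at roughly half of its memory, with the rest reserved as working space for query processing. A minimal back-of-the-envelope check using the figures from this case (the 50% rule is our assumption about typical SAP guidance, not a number from the engagement):

```python
# Back-of-the-envelope check, assuming the rule of thumb that the data
# footprint of a HANA system should stay at roughly 50% of its memory
# (the remainder is working space for query processing and deltas).

total_memory_tb   = 7 * 2    # 7 x 2 TB on-premise nodes -> 14 TB of memory
data_footprint_tb = 14       # actual BW on HANA data volume (14+ TB)

advised_data_tb = 0.5 * total_memory_tb       # ~7 TB under the 50% rule
ratio = data_footprint_tb / advised_data_tb   # 2.0, i.e. the "200%" in the text

print(f"Advised data footprint: {advised_data_tb:.0f} TB")
print(f"Actual data footprint:  {data_footprint_tb} TB ({ratio:.0%} of advised)")
```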

Instead of “lifting and shifting” this 14 TB production instance “1:1” to the cloud, the toy manufacturer’s team engaged Lemongrass and ONE labs to optimise their existing data footprint upfront and “right size” the target configuration in the AWS Cloud.

“…the internal due diligence work combined with truly knowledgeable sparring partners at ONE labs and Lemongrass were instrumental in achieving a confident SAP on AWS target sizing…” T. V., Senior BI Manager, Denmark

Planning your AWS Move

When planning an SAP HANA move to the AWS cloud, the challenge lies in striking a balance between identifying the most cost-effective node configuration and defining which data belongs in the ‘hot’, ‘warm’ and ‘cold’ tiers. That requires insight into the aging of data and into how that data is actually used on your analytics platform, insights that standard SAP tools do not provide.

To that end, ONE labs, the laboratory of SAP HANA experts ONE Consultants, has developed Oxygen for HANA. Oxygen scans all data loaded into HANA memory and identifies aged and less frequently used data in a transparent way, showing business users and technical staff which data can safely be offloaded to cheaper warm or cold tiers without compromising the business user’s experience.
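Oxygen itself is proprietary, but the raw memory inventory such a scan starts from can be illustrated with SAP HANA’s standard monitoring views. Below is a minimal sketch using SAP’s hdbcli Python driver and the SYS.M_CS_TABLES view; the connection details and the 10 GB threshold are placeholders. Note that this only shows what is loaded and how large it is, not how often the business actually uses it, which is the insight a tool like Oxygen adds on top.

```python
# Illustrative only: list the largest column-store tables currently held in
# HANA memory, the raw inventory a tiering scan builds on. Connection details
# are placeholders; SYS.M_CS_TABLES is HANA's standard monitoring view for
# column-store tables.
from hdbcli import dbapi  # SAP HANA Python client (pip install hdbcli)

conn = dbapi.connect(address="hana-host", port=30015,
                     user="MONITORING_USER", password="***")
cur = conn.cursor()
cur.execute("""
    SELECT schema_name,
           table_name,
           loaded,                    -- fully / partially / not loaded
           record_count,
           ROUND(memory_size_in_total / (1024 * 1024 * 1024.0), 1) AS size_gb
    FROM   sys.m_cs_tables
    WHERE  memory_size_in_total > 10737418240   -- only tables > 10 GB
    ORDER  BY memory_size_in_total DESC
""")
for schema, table, loaded, records, size_gb in cur.fetchall():
    print(f"{schema}.{table:<40} {size_gb:>8.1f} GB  {records:>14,} rows  ({loaded})")
cur.close()
conn.close()
```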

Having seen how Oxygen had been instrumental in identifying data aging and usage patterns for another global player in the energy sector, the toy manufacturer understood that running a similar exercise on its BW on HANA production instance would be pivotal before determining its AWS node configuration for production.

ONE labs were hired to run multiple memory scans to identify data optimisation opportunities by assigning a hot, warm or cold classification to the existing BW on HANA data. Hot is defined as high-frequency usage from a business end-user perspective (or high-frequency loading). Warm covers data that is queried infrequently by the business or used in a single daily loading activity. Cold refers to data that has been removed from daily load cycles or reporting models but can still be accessed ad hoc as required.
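As an illustration of how these definitions translate into a classification rule, here is a small sketch; the data structure and field values are hypothetical and not Oxygen’s actual model:

```python
from dataclasses import dataclass

@dataclass
class TableUsage:
    # Per-object usage facts; names and values are illustrative only.
    name: str
    query_frequency: str  # "high" | "low" | "none"  (business end-user queries)
    load_frequency: str   # "high" | "daily" | "none" (participation in load cycles)

def classify(t: TableUsage) -> str:
    """Apply the hot/warm/cold definitions used in the case study."""
    if t.query_frequency == "high" or t.load_frequency == "high":
        return "hot"    # frequent business usage, or high-frequency loading
    if t.query_frequency == "low" or t.load_frequency == "daily":
        return "warm"   # infrequent queries, or used in one daily load activity
    return "cold"       # out of daily loads and reporting models; ad-hoc access only

print(classify(TableUsage("0FIGL_C10",   "high", "daily")))  # -> hot
print(classify(TableUsage("ZSALES_2015", "none", "none")))   # -> cold
```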

“The Oxygen scan at this toy manufacturer revealed at peak that of 14 TB of data, only 3 TB was read into memory for the purpose of frequent business reporting.” Mogens Madsen, HANA Expert, ONE-Consultants

The scan further revealed that, within the top 16 tables, 50% of the data was aged and could be moved to warm or in some cases even cold, leaving a need for just 1.5 TB of hot data for frequent reporting. Another 6.7 TB was read into memory for daily load cycles (of which 3 TB was aged data), whilst the remaining 4.3 TB was data that was already deemed cold (1.3 TB) or tables with partially loaded columns (1.2 TB).

Right-Sizing Saves

With the above information in mind, Lemongrass could confidently move forward in sizing the SAP HANA landscape on AWS. Assuming the data clean-up and the move to ‘warm’ could happen prior to locking in cloud capacity, the toy manufacturer would target a 2 x 4 TB scale-up configuration instead of its existing 7 x 2 TB nodes on premise (sized for peak performance), saving 6 TB in the process. In a “Lift & Shift” scenario the sizing would have been (close to) the original 14 TB footprint.

Facts that influenced that decision:

  • AWS currently offers SAP HANA nodes as virtual instances up to 4 TB in size; anything larger is currently available as bare metal only, which is substantially more expensive per TB.
  • Customers also lose the option to subscribe to on-demand disaster recovery when going beyond 4 TB.
  • To materialise a ‘warm’ tier on SAP HANA for BW, one of the 4 TB nodes is dedicated as a warm “extension node”, which is allowed to hold data up to 200% of its memory capacity (see the sizing sketch after the summary below).

In summary, in a semi-flexible contract, the new target landscape of 2 x 4 TB of SAP HANA on AWS will provide:

  • Capacity of 2 TB of hot data (allowing 25% growth) and 8 TB of warm data for a total of 10 TB data capacity
  • The ability to run DR on demand (prior to the AWS move, a dedicated instance was locked in and paid for on same hardware terms as the PRD instance)
  • The ability to swap to a larger 6 TB virtual node, if and when that would be required and offered by AWS
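The capacity figures above can be reproduced from two assumptions stated earlier: roughly 50% of the hot node’s memory is usable for data, and the warm extension node may be allocated at 200% of its capacity. A minimal sketch of that arithmetic:

```python
# Reproduce the target-sizing arithmetic for the 2 x 4 TB landscape, assuming
# the ~50% data-to-memory guideline for the hot node and the 200% allocation
# permitted on a BW warm "extension node".

NODE_MEMORY_TB = 4
hot_capacity_tb  = 0.5 * NODE_MEMORY_TB   # hot node: ~50% of memory for data -> 2 TB
warm_capacity_tb = 2.0 * NODE_MEMORY_TB   # extension node: up to 200% of memory -> 8 TB

hot_identified_tb = 1.5                   # frequent-reporting data found by the scan

print(f"Hot capacity:   {hot_capacity_tb:.0f} TB "
      f"(vs. {hot_identified_tb} TB of hot data identified, leaving room to grow)")
print(f"Warm capacity:  {warm_capacity_tb:.0f} TB")
print(f"Total capacity: {hot_capacity_tb + warm_capacity_tb:.0f} TB "
      f"on 2 x {NODE_MEMORY_TB} TB virtual nodes")
```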

Move with Confidence

T. V., Senior BI Manager at our customer, notes that the internal due diligence work, combined with truly knowledgeable sparring partners at ONE-Consultants and Lemongrass, has been instrumental. Not only in delivering vital insights, but more importantly in building confidence: confidence that the reduced sizing will serve the business without compromising user experience, and confidence that the toy manufacturer will not be exposed to unforeseen costs from having to re-size its SAP HANA BW landscape after its migration to AWS.

 

Are you moving your SAP HANA BI landscape to the cloud?
Let us clean up your BW on HANA or HANA native data footprint. Right-sizing beforehand, or even after a Lift & Shift, will help save considerably on TCO.

With the right insights and expertise, ONE labs and Lemongrass will help your business optimise its TCO with leaner SAP HANA operations on AWS. Reach out to Matt Nys @ONE labs or Rob Antens @Lemongrass Consulting, and we’ll help you move with confidence!

(Image source: https://unsplash.com)


Comment by Henrique Pinto, 16 Feb 2021

I really like the article as it starts the important conversation on how to modernize SAP BW in the cloud, however I would have liked to see a more “aggressive” approach leveraging more cloud native capabilities. You mention they were able to identify hot, warm and cold data with the Oxygen for HANA product, however in the proposed solution you only addressed the hot and warm data, with 2x4TB HANA nodes, 1 for hot and 1 for warm (extension node). In all fairness, this approach could also be done On Premise, there is nothing cloud specific about it. I would have liked to see an approach where the proposed solution leveraged either S3+Athena or Redshift as a place to displace cold BW data, either via NLS or a more custom approach (e.g. push data out and virtualize into HANA and union via Composite Providers). Happy to discuss in more detail.
