By linking SAP and BigQuery, the SAP database can be streamlined through targeted data management.
SAP HANA is an in-memory database. As a result, the RAM requirements for SAP installations with HANA DB are exceptionally high and increase as the data volume grows. To reduce RAM requirements and thus also save costs, it makes sense to think about targeted data management. BigQuery can provide support here and simplify the handling of warm and cold data.
Data that is up to one or two years old is referred to as hot data. It is accessed daily and is therefore active data. In HANA, this data is held in main memory and is thus quickly available; offloading it to BigQuery makes little or no sense.
Warm data, on the other hand, is roughly three to five years old and is accessed far less frequently, typically for annual reports or comparative figures. For warm data, SAP HANA Dynamic Tiering can move the inactive data out of main memory into the file system, reducing the memory footprint. However, if the data is needed again, it must first be loaded back into the database, so a RAM buffer has to be reserved for this. Slimming down via Dynamic Tiering is therefore not ideal. BigQuery, by contrast, can be integrated into the SAP system: warm data can be stored in BigQuery and made available again in SAP via corresponding views.
Data that is more than five years old, the cold data, usually exists only for archiving purposes. This data in particular should be loaded into BigQuery, because it otherwise takes up storage space in the conventional database and inflates it unnecessarily.
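The three tiers described above can be sketched as a simple classification by record age. This is a minimal illustration of the tiering rule, not SAP or BigQuery functionality; the function name and the exact year thresholds are assumptions taken from the figures in this article.

```python
from datetime import date

# Assumed thresholds from the article: hot = up to ~2 years,
# warm = ~3-5 years, cold = older than 5 years.
HOT_MAX_YEARS = 2
WARM_MAX_YEARS = 5

def classify_tier(record_date: date, today: date) -> str:
    """Return 'hot', 'warm', or 'cold' for a record based on its age."""
    age_years = (today - record_date).days / 365.25
    if age_years <= HOT_MAX_YEARS:
        return "hot"    # keep in HANA main memory
    if age_years <= WARM_MAX_YEARS:
        return "warm"   # candidate for BigQuery, exposed via views
    return "cold"       # archive in BigQuery, delete from HANA
```

A record dated one year ago would be classified as `"hot"`, a four-year-old record as `"warm"`, and anything older than five years as `"cold"`.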
With SAP Data Services, this data management can be automated: warm and cold data is moved to BigQuery and then deleted from the SAP database. In this way, storage capacity is managed automatically and the SAP database is kept lean.
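The move-then-delete workflow behind such an automated job can be sketched as follows. This is only an illustration of the ordering logic, under stated assumptions: the two callables stand in for the actual transfer steps (e.g. an SAP Data Services job or the BigQuery client), and the function and parameter names are hypothetical.

```python
from datetime import date
from typing import Callable, Iterable

def archive_old_rows(
    rows: Iterable[tuple[str, date]],
    cutoff: date,
    load_to_bigquery: Callable[[list[tuple[str, date]]], None],
    delete_from_hana: Callable[[list[str]], None],
) -> list[str]:
    """Move rows dated on or before `cutoff` out of the SAP database.

    The rows are copied to BigQuery first and deleted from HANA only
    afterwards, so a failed load never destroys data.
    """
    old = [(rid, d) for rid, d in rows if d <= cutoff]
    if old:
        load_to_bigquery(old)                       # 1) copy to BigQuery
        delete_from_hana([rid for rid, _ in old])   # 2) then remove from HANA
    return [rid for rid, _ in old]
```

The key design point is the ordering: deletion from the SAP database happens only after the copy to BigQuery has succeeded.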