A fully automated data ingestion process to transfer data from SAP to Azure Data Lake
Velocity is a reliable data ingestion solution on Microsoft Azure that helps businesses quickly and efficiently ingest large volumes of data from diverse sources. It is certified by SAP and compatible with sources like Oracle and SQL. Using Velocity, organizations can effortlessly move and integrate their data into Azure and accelerate their journey towards a Modern Intelligent Data Platform (MIDP), which is essential for a successful digital transformation strategy.
- Transfer data from any type of SAP, Oracle and other sources to Azure rapidly and securely
- Fully managed change data capture (CDC) built in
- Batch data ingestion that minimizes source system impact
- 100% PaaS
- Full-GUI, no-code solution that empowers citizen data scientists and engineers without technical coding skills
- Unlimited data ingestion/users and number of sources/targets included within the license
Supported Data Sources:
- SAP: SAP ECC, SAP S4 HANA, SAP BW, SAP BW4HANA, SAP BW on HANA, HANA Sidecar, SuccessFactors, CAR, CRM
- Oracle: Oracle ERP, Oracle E-Business Suite, Oracle Sales Cloud, Oracle Demantra, Oracle Cloud
Supported Target:
- Azure Data Lake Storage Gen2
Velocity syndicates the source data schema definition alongside the extracted data. Processes that consume the data from storage can use this metadata to identify the structure definition when required, and the schema definition can drive further downstream transformations.
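To illustrate how downstream consumers might use a syndicated schema, the sketch below parses a delimited row using a schema document. The JSON layout, field names, and SAP table used here are illustrative assumptions, not Velocity's actual metadata format.

```python
import json

# Hypothetical syndicated schema; the JSON shape and the sample SAP table
# (KNA1, customer master) are illustrative, not Velocity's actual format.
schema_json = """
{
  "table": "KNA1",
  "fields": [
    {"name": "KUNNR", "type": "string"},
    {"name": "NAME1", "type": "string"},
    {"name": "UMSAT", "type": "decimal"}
  ]
}
"""

def parse_row(raw_line: str, schema: dict) -> dict:
    """Convert one delimited row into typed fields using the schema."""
    casts = {"string": str, "decimal": float, "integer": int}
    values = raw_line.rstrip("\n").split("|")
    return {
        f["name"]: casts[f["type"]](v)
        for f, v in zip(schema["fields"], values)
    }

schema = json.loads(schema_json)
row = parse_row("0000100001|Acme GmbH|1250.75", schema)
print(row)  # {'KUNNR': '0000100001', 'NAME1': 'Acme GmbH', 'UMSAT': 1250.75}
```

Because the schema travels with the data, a consumer never has to hard-code column positions or types.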
Change Data Capture (CDC) is a technique that captures changes in the source system and updates them in the target system. Velocity uses a sophisticated CDC solution that works at either the application or database log level to identify changes in the source system. This technique is effective with all source table types, including pool and cluster tables, and ensures that the target system always reflects the data in the source system.
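Conceptually, applying captured changes to a target looks like the sketch below: each change record carries an operation code and a key, and the target is upserted or pruned accordingly. The record shape (`op`, `key`, `data`) is an illustrative assumption, not Velocity's wire format.

```python
# Minimal CDC-apply sketch: insert/update/delete change records are merged
# into a key -> row mapping so the target mirrors the source.
def apply_changes(target: dict, changes: list) -> dict:
    """Apply I/U/D change records to a keyed target table."""
    for change in changes:
        op, key = change["op"], change["key"]
        if op in ("I", "U"):      # insert or update: upsert the row
            target[key] = change["data"]
        elif op == "D":           # delete: drop the key if present
            target.pop(key, None)
    return target

target = {"100": {"name": "Acme", "city": "Berlin"}}
changes = [
    {"op": "U", "key": "100", "data": {"name": "Acme", "city": "Munich"}},
    {"op": "I", "key": "200", "data": {"name": "Globex", "city": "Hamburg"}},
    {"op": "D", "key": "100"},
]
print(apply_changes(target, changes))
# {'200': {'name': 'Globex', 'city': 'Hamburg'}}
```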
Velocity integration has only two endpoints, the source and Azure, eliminating the need for intermediary servers and reducing security vulnerabilities. Data is encrypted in transit and rests only within the security of the data lake, with no intermediate databases. Velocity uses source-system authorizations and Active Directory permissions in Azure to control data access and configuration.
Velocity optimizes the data extraction & replication process with numerous adjustable parameters to tune your extractions (in source and in Azure).
Velocity is a serverless solution, removing the need for additional servers to deploy and operationalize the source-to-Azure integration. Velocity connects directly between the source systems and the target Azure system(s), reducing infrastructure, support costs, and solution complexity while increasing security and deployment speed.
Velocity keeps the storage in sync with your source data, without the need for any additional ETL tools, jobs, or batch file generation processes. This simplifies the process of capturing changes using CDC and reduces the time required to process deltas. Velocity provides near real-time data to the target storage, enabling the generation of actionable insights and quick reactions to limit potential damage.
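The delta processing that Velocity automates can be sketched with a simple watermark pattern: only rows changed since the last successful sync are extracted on each cycle. The row shape and timestamp field are illustrative assumptions.

```python
from datetime import datetime, timezone

# Watermark-driven delta extraction sketch: each cycle pulls only rows
# modified after the previous sync's high-water mark.
def extract_deltas(rows: list, watermark: datetime) -> list:
    """Return rows whose change timestamp is newer than the watermark."""
    return [r for r in rows if r["changed_at"] > watermark]

rows = [
    {"id": 1, "changed_at": datetime(2024, 1, 1, tzinfo=timezone.utc)},
    {"id": 2, "changed_at": datetime(2024, 1, 3, tzinfo=timezone.utc)},
]
watermark = datetime(2024, 1, 2, tzinfo=timezone.utc)
deltas = extract_deltas(rows, watermark)
print([r["id"] for r in deltas])  # [2]
```

Because only deltas cross the wire, each cycle stays small and the target can be refreshed in near real time.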
Deployment & Knowledge Transfer
Deployment consists of source-system transports and Azure code deployments. A Proof of Value or full implementation occurs alongside your team to ensure knowledge of the solution is transferred in-house. An installation guide specific to your source add-on is provided to help ensure the solution meets your security standards.
Velocity aligns with best practices for data management by allowing users to define and manage storage areas based on the type of data being stored. For example, finance and customer data can be automatically stored in separate areas within the storage target. Velocity also supports multiple storage accounts as targets, allowing for the segregation of data as needed.
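One way such segregation could work is a mapping from table to data domain that drives the target path. The domain mapping, storage account name, and path layout below are illustrative assumptions, not Velocity's actual defaults.

```python
# Illustrative routing of SAP tables to segregated ADLS Gen2 storage areas;
# the table-to-domain mapping and account name are hypothetical.
DOMAIN_BY_TABLE = {
    "BSEG": "finance",
    "BKPF": "finance",
    "KNA1": "customer",
}

def target_path(table: str, container: str = "raw") -> str:
    """Build an ADLS Gen2-style abfss path segregated by data domain."""
    domain = DOMAIN_BY_TABLE.get(table, "general")
    return f"abfss://{container}@mydatalake.dfs.core.windows.net/{domain}/{table}/"

print(target_path("BSEG"))
# abfss://raw@mydatalake.dfs.core.windows.net/finance/BSEG/
```

Routing by domain keeps finance and customer data in separate areas, and pointing domains at different containers or storage accounts extends the same idea to multi-account targets.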
Users can add source systems, table extractions, and custom logical views in the Velocity portal. Administrative users can configure the following:
- Objects to ingest
- Change Data Capture (CDC) method
- Data compression
- Maximum batch size
- Destination file format
- Target data encryption
- Limit on job concurrency on source systems
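A configuration covering these administrative options might look like the sketch below. The key names and allowed values are illustrative assumptions, not Velocity's actual configuration schema.

```python
# Hypothetical extraction configuration mirroring the admin options above;
# key names and values are illustrative, not Velocity's actual schema.
extraction_config = {
    "objects_to_ingest": ["KNA1", "BSEG"],
    "cdc_method": "log_based",             # or "application_level"
    "data_compression": True,
    "max_batch_size_mb": 512,
    "destination_file_format": "parquet",  # e.g. parquet, csv, avro
    "target_data_encryption": True,
    "max_concurrent_jobs_on_source": 4,    # limit load on the source system
}

def validate(config: dict) -> list:
    """Return a list of validation errors (empty if the config is sound)."""
    errors = []
    if not config.get("objects_to_ingest"):
        errors.append("at least one object must be selected")
    if config.get("max_batch_size_mb", 0) <= 0:
        errors.append("max batch size must be positive")
    if config.get("max_concurrent_jobs_on_source", 0) < 1:
        errors.append("job concurrency limit must be at least 1")
    return errors

print(validate(extraction_config))  # []
```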