Automated Data Pipeline Framework: 2-4 hr Workshop
Learn how to apply CI/CD best practices, including IaC, automated versioning, and artifact signing, to your existing and planned Azure data pipelines during our free workshop.
Modern systems involve a great deal of data processing, which usually means an ever-growing number of pipelines for ETL feeds, analytics data collection, and similar workloads. Creating new pipelines is usually relatively easy, but over time the technical debt of maintaining them can become a major problem.
Our solution is to apply the same treatment usually reserved for application code: CI/CD, templating, and infrastructure automation frameworks bring order, proper QA and code review practices, and release management procedures to data pipelines in Azure. This greatly reduces maintenance overhead, allowing a relatively small team to support hundreds of pipelines in solutions such as Databricks. It also avoids cloning boilerplate configuration when new projects are onboarded into the framework, typically reducing onboarding time from weeks to days or even hours.
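As a purely illustrative sketch (the concrete tooling is agreed during the engagement), the templating idea can be pictured as rendering per-project pipeline definitions from one shared template instead of cloning them. The project names, paths, and job fields below are hypothetical placeholders, loosely modelled on a Databricks job definition:

```python
# Minimal sketch: render per-project job definitions from one shared template
# instead of copying the full definition for every new pipeline.
# All names, paths, and tags here are hypothetical placeholders.
import json
from string import Template

PIPELINE_TEMPLATE = Template(json.dumps({
    "name": "${project}-ingest",
    "tasks": [{
        "task_key": "etl",
        "notebook_task": {"notebook_path": "/Shared/${project}/etl"},
        "job_cluster_key": "default",
    }],
    "tags": {"owner": "${team}", "managed_by": "pipeline-framework"},
}))

def render_pipeline(project: str, team: str) -> dict:
    """Fill the shared template with project-specific parameters."""
    return json.loads(PIPELINE_TEMPLATE.substitute(project=project, team=team))

if __name__ == "__main__":
    # Onboarding a new project becomes a small parameter change,
    # not a copy of the whole pipeline definition.
    print(json.dumps(render_pipeline("sales-analytics", "data-eng"), indent=2))
```

In practice, the rendered definition would then flow through the same code review and CI/CD release process as application code.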
The GD Team will hold a discovery and pre-assessment workshop to give an overview of our pipeline migration framework and to identify how it can be applied to address the customer's needs.
Workshop agenda:
- Introductions
- How to apply the ideas behind CI/CD practices to Data/ETL pipelines in Azure
- Leveraging the best practices and technologies offered by the Azure cloud
- Deep dive into the customer's current areas of concern and drafting a resolution plan
- Closure and discussion of next steps