The Data Lake in a Box is a template for the automated deployment of a data platform, connecting its users to a host of Azure services, fast!
Combine all your structured, semi-structured and unstructured data (logs, files and media) by ingesting it into Azure Data Lake Store with Azure Data Factory.
Leverage your cleansed and transformed data in Azure Data Lake Store and perform scalable analytics with Azure Databricks.
Seamlessly and securely access and move your data at scale between Azure services to combine and create one hub for all your data.
Build operational reports and analytical dashboards on top of Azure Synapse to derive insights from the data and serve thousands of end users.
The Data Lake in a Box is a single deployment methodology based on Microsoft's Enterprise Data Warehouse Architecture. A CI/CD pipeline will be set up for you and your team to rapidly accelerate Data Platform development, eliminate the need for a separate Data Platform "Proof of Concept" and drive business value from your data as soon as possible.
The Platform Deploys:
- Azure Data Factory
- Azure Synapse
- Azure Data Lake Store
- Azure Analysis Services
- Azure Databricks
All of these services are connected together in Azure Data Factory, with authentication managed for you, so you get a fully deployed, secured and connected environment in a fraction of the time it would take to do it manually or to build the Infrastructure as Code yourself.
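As an illustration of what "managed authentication" can look like, an Azure Data Factory linked service to Azure Data Lake Store can authenticate with the factory's managed identity, so no secrets are stored in the definition. This is a minimal sketch; the linked service name and account placeholder are hypothetical, not part of the template:

```json
{
  "name": "LS_AzureDataLakeStore",
  "properties": {
    "type": "AzureDataLakeStore",
    "typeProperties": {
      "dataLakeStoreUri": "https://<your-account>.azuredatalakestore.net/webhdfs/v1"
    }
  }
}
```

Because no service principal credentials are supplied, Data Factory falls back to its managed identity, which is granted access to the lake during deployment.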
During the 5-day phase, we will work with your team to fully deploy the Data Lake in a Box to your environment, using multistage YAML pipelines to define how and when you want environments to be deployed. We will also create a sample pipeline in Azure Data Factory to show the solution in action, loading one of your datasets from ingestion through to Analysis Services.
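The multistage YAML approach can be sketched as an Azure Pipelines definition with one stage per environment. This is a minimal, hedged example, not the template's actual pipeline; the stage, environment and step-template names are hypothetical:

```yaml
trigger:
  branches:
    include:
      - main

stages:
  - stage: Dev
    jobs:
      - deployment: DeployDataPlatform
        environment: dev                      # hypothetical Azure DevOps environment
        pool:
          vmImage: ubuntu-latest
        strategy:
          runOnce:
            deploy:
              steps:
                # hypothetical template that deploys the platform services
                - template: deploy-platform.yml

  - stage: Prod
    dependsOn: Dev
    condition: succeeded()
    jobs:
      - deployment: DeployDataPlatform
        environment: prod
        pool:
          vmImage: ubuntu-latest
        strategy:
          runOnce:
            deploy:
              steps:
                - template: deploy-platform.yml
```

Using `deployment` jobs tied to Azure DevOps environments gives you approval gates and deployment history per environment, which is how "how and when" each environment is deployed can be controlled.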