Low Code No Code Ingestion Framework: 6-Wk Implementation

Sigmoid

The framework enables rapid development and democratization of data integration: users simply select the desired data source and configure the connection parameters.

Traditionally, data ingestion involves complex coding and scripting, which often requires specialized skills and a significant amount of time. There is also a need for separate, specialized frameworks for ingesting data from dissimilar sources. The orchestration and transformation of these engineering pipelines also lead to high cost, high effort, and inefficient use of cloud-native technologies.

Sigmoid’s LCNC data ingestion framework follows a metadata-driven architecture that offers rapid development and democratization of data, enabling domain leaders to automate ingestion and giving business users robust, accurate, and timely data access.

The framework provides a wide range of pre-built connectors and adapters, enabling seamless integration with various data sources such as databases, APIs, cloud services, and file systems. Users can simply select the desired data source, configure the connection parameters, and visually map the data fields to their destination. The LCNC Ingestion Framework takes care of generating the necessary code and executing the ingestion process, abstracting the technical complexities and allowing users to focus on the data itself.
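To make the metadata-driven approach concrete, the sketch below shows what a source definition might look like, together with a simple validation step. All field names (`source_type`, `field_map`, and so on) are illustrative assumptions, not the framework's actual schema.

```python
# Hypothetical metadata record driving one ingestion run.
# Field names here are illustrative, not the framework's real schema.
SOURCE_CONFIG = {
    "name": "orders_ingest",
    "source_type": "postgres",          # database, API, cloud service, file system...
    "connection": {
        "host": "db.example.internal",
        "port": 5432,
        "database": "sales",
    },
    "field_map": {                      # source column -> destination column
        "order_id": "OrderId",
        "order_ts": "OrderTimestamp",
        "amount_usd": "AmountUSD",
    },
    "destination": "adls://raw/orders/",
}

def validate_config(cfg: dict) -> list:
    """Return a list of human-readable problems; an empty list means valid."""
    problems = []
    for key in ("name", "source_type", "connection", "field_map", "destination"):
        if key not in cfg:
            problems.append("missing required key: " + key)
    if not cfg.get("field_map"):
        problems.append("field_map must map at least one source field")
    return problems
```

In a metadata-driven design like this, adding a new source is a configuration change rather than a coding task, which is what allows the framework to generate the ingestion code on the user's behalf.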

In accordance with the Azure Well-Architected Framework, the LCNC framework, when integrated with Azure workloads, streamlines data flow through several key stages:

  1. Data Collection: Data is gathered from various sources using connectors or APIs provided by the LCNC platform, or custom-built connectors, with Azure Event Hubs serving as the messaging service.
  2. Data Ingestion: Collected data is ingested into a processing pipeline using Azure Data Factory, which supports both batch and streaming data processing.
  3. Data Transformation and Enrichment: Data Factory enables transformations and enrichments, with additional data retrieved from Azure SQL Database or Cosmos DB.
  4. Data Storage: Transformed data is stored in Azure Data Lake Storage or Azure Blob Storage for unstructured data.
  5. Data Analysis and Visualization: Insights are gained using Azure Synapse Analytics, Azure Databricks, and visualization tools like Power BI.
  6. Data Monitoring and Alerting: System performance is monitored using Azure Monitor and Azure Log Analytics, with custom metrics and alerts.
  7. Data Retention and Archiving: Long-term storage is managed using the Azure Blob Storage Archive tier and Azure Data Lake Storage access tiers, with automated tiering workflows.
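The ordering of the first four stages above can be sketched as plain Python functions chained together. This is only an illustration of the flow; in production each step would be handled by the corresponding Azure service (Event Hubs, Data Factory, Azure SQL/Cosmos DB, ADLS), not local code, and the record shapes are invented for the example.

```python
def collect(raw_sources):
    # Stage 1 (Event Hubs role): gather events from every source.
    return [event for source in raw_sources for event in source]

def ingest(events):
    # Stage 2 (Data Factory role): admit events into the pipeline,
    # dropping malformed records that lack an "id".
    return [e for e in events if "id" in e]

def transform(events, reference):
    # Stage 3 (SQL/Cosmos DB lookup role): enrich each event
    # with additional data from a reference store.
    return [{**e, "region": reference.get(e["id"], "unknown")} for e in events]

def store(events, sink):
    # Stage 4 (ADLS / Blob Storage role): persist the transformed events.
    sink.extend(events)
    return sink

# Toy run: two sources, one malformed record, a tiny reference table.
sources = [[{"id": 1}, {"bad": True}], [{"id": 2}]]
reference = {1: "emea", 2: "apac"}
lake = []
store(transform(ingest(collect(sources)), reference), lake)
```

The point of the sketch is the separation of concerns: each stage has one responsibility and a well-defined input/output contract, which is what lets the framework swap the underlying Azure service per stage without touching the others.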

The engagement model for the implementation of this framework is as follows:

  1. Scope Alignment and Data Acquisition - Agreeing on the Data Sources, granularity, frequency, cardinality; Understanding existing datasets and schema; Understanding business requirements; Agreeing on the Success Criteria
  2. Engineering Pipeline Orchestration - Customizing the LCNC Framework for meeting the specific use-case
  3. Testing and Deployment - Testing pipelines for integration, data validation & frequency; Performing Stress testing, site testing
  4. Publishing the Output - Publishing the final outputs to business & IT/Data team and aligning on solution roadmap
https://store-images.s-microsoft.com/image/apps.46944.a6eca7d2-ac30-4ac1-9e7c-3c10e3e7b20d.677bd2ee-5873-4280-a025-4acad1d09e51.d28f1386-046b-4b5c-9d46-920df80bcc1c