SharePoint List to Staging Pipeline

Purpose: Transfer SharePoint List data through Azure Blob Storage and into the data warehouse. Requires: Prerequisites: Process Steps: 4. Data Pipeline. 5. Create pipeline parameters: create two pipeline parameters. 5. Activities Tab. 6. Lookup Activity: click on the Lookup activity and go to Settings; from there, if one is not already set up, create a new connection …
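As a rough illustration of the staging step the excerpt describes, the Python sketch below pulls SharePoint list items over the REST API and lands them in Azure Blob Storage. It assumes an access token has already been obtained, and the function and parameter names (stage_sharepoint_list, list_title, target_container) are illustrative stand-ins for the pipeline's real parameters, not the actual configuration.

```python
# Minimal sketch, not the production pipeline: fetch SharePoint list items and
# stage them in Azure Blob Storage. All names here are assumptions.
import json
import requests
from azure.storage.blob import BlobServiceClient

def stage_sharepoint_list(site_url: str, list_title: str, access_token: str,
                          blob_conn_str: str, target_container: str) -> None:
    # Look up the list items (roughly what the Lookup activity's connection does).
    resp = requests.get(
        f"{site_url}/_api/web/lists/getbytitle('{list_title}')/items",
        headers={
            "Authorization": f"Bearer {access_token}",
            "Accept": "application/json;odata=nometadata",
        },
        timeout=30,
    )
    resp.raise_for_status()
    items = resp.json().get("value", [])

    # Land the raw items in blob storage so the warehouse load can pick them up.
    service = BlobServiceClient.from_connection_string(blob_conn_str)
    blob = service.get_blob_client(container=target_container,
                                   blob=f"{list_title}.json")
    blob.upload_blob(json.dumps(items), overwrite=True)
```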

An Overview of the Snapshot Pipeline

Objective: The Snapshot pipeline aims to streamline data analysis and management by creating optimised views from raw data files. Process Overview: loading Parquet files and Delta Lake table creation; view creation; API access and loading into the data warehouse. Conclusion: The Snapshot pipeline offers an efficient solution for transforming raw data into actionable insights. By …
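A hedged PySpark sketch of the steps listed in the process overview: read raw Parquet files, persist them as a Delta Lake table, and create a view over it for downstream loading. The path, database, table, and view names (raw_path, snapshot_db, raw_snapshot, v_snapshot_latest) are assumptions, and Delta Lake support is assumed to be configured in the Spark environment.

```python
# Illustrative sketch only; names and paths are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("snapshot-sketch").getOrCreate()

# Assumed landing location for the raw parquet files.
raw_path = "abfss://raw@storageaccount.dfs.core.windows.net/snapshots/"

# Load the raw parquet files and write them out as a Delta Lake table.
raw_df = spark.read.parquet(raw_path)
raw_df.write.format("delta").mode("overwrite").saveAsTable("snapshot_db.raw_snapshot")

# Expose an optimised view over the Delta table for analysis / warehouse loading.
spark.sql("""
    CREATE OR REPLACE VIEW snapshot_db.v_snapshot_latest AS
    SELECT *
    FROM snapshot_db.raw_snapshot
""")
```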

An Overview of the Bronze Pipeline

Data pipeline name: My Bronze Pipeline. Owner: Data Engineering Team. Used since: July 2023. Purpose: The Bronze pipeline automates the loading of Parquet data files from an Azure Data Lake Storage container into tables in the staging database. Overview: This pipeline orchestrates the movement of new data that is continually appended to storage. …
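As a minimal sketch of the load the excerpt describes, the snippet below reads Parquet files from an ADLS container and appends them to a staging table. The container path and table name (staging.my_table) are assumptions; the real pipeline also tracks which files are new before appending, which this sketch does not attempt.

```python
# Rough sketch of the bronze load under assumed names; not the actual pipeline.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("bronze-load-sketch").getOrCreate()

# Assumed ADLS container where new parquet files continually land.
container_path = "abfss://bronze@storageaccount.dfs.core.windows.net/incoming/"

# Read the newly landed parquet files.
new_data = spark.read.parquet(container_path)

# Append into the staging database table (the staging database is assumed to exist).
new_data.write.mode("append").saveAsTable("staging.my_table")
```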

An Overview of the Medallion Pipeline

Data pipeline name: Medallion to Bronze ETL Pipeline. Owner: Data Engineering Team. Used since: July 2023. Purpose: The Medallion pipeline is an automated process that handles the extraction, loading, and processing of CSV data from the SATUK or GMISUK systems into bronze storage as Parquet files. Overview: The Medallion pipeline begins by setting source variables based …
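For illustration, the sketch below mirrors the pattern the excerpt describes: a source variable selects which system's CSV files to read, and the data is written to bronze storage as Parquet. The landing and bronze paths, the source_paths mapping, and the schema-inference options are assumptions rather than the pipeline's actual configuration.

```python
# Hedged sketch of the CSV-to-parquet step; paths and options are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("medallion-sketch").getOrCreate()

# Source variable driving which system's files are processed.
source_system = "SATUK"  # or "GMISUK"
source_paths = {
    "SATUK": "abfss://landing@storageaccount.dfs.core.windows.net/satuk/",
    "GMISUK": "abfss://landing@storageaccount.dfs.core.windows.net/gmisuk/",
}

# Extract: read the CSV files for the selected source system.
csv_df = (spark.read
          .option("header", "true")
          .option("inferSchema", "true")
          .csv(source_paths[source_system]))

# Load: write the data into bronze storage as parquet, split by source system.
(csv_df.write
    .mode("append")
    .parquet(f"abfss://bronze@storageaccount.dfs.core.windows.net/{source_system.lower()}/"))
```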