Data Integration Standard

Issued by: Data & Analytics Team
Applies to: All data integrations, pipelines, and views entering or leaving the Data Warehouse
Status: Approved Draft
Date: [Insert Date]
Version: 1.0

1. Purpose

This standard defines the structure and principles for integrating data from source systems, including ERP (IFS), Dynamics 365, Power Apps, and SharePoint, into the enterprise Data Warehouse. It ensures that data …

Architecture Rules of the Road

A calm, consistent, and clever approach to data engineering.

🔹 1. Look in the Box First
Before inventing a workaround, check what already exists. If Microsoft or IFS built it, use it; it is likely more robust, secure, and supported.

🔹 2. Easy Landings
Data should arrive safely and predictably. Keep import containers simple, standardised, and transformation-free. Bronze …
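The "Easy Landings" rule above (simple, standardised, transformation-free import containers) can be sketched as a minimal landing function. This is an illustrative sketch, not the team's actual implementation: the function name, the source-system/date partitioning scheme, and the use of local paths rather than a cloud storage SDK are all assumptions made for a self-contained example.

```python
import shutil
from datetime import datetime, timezone
from pathlib import Path


def land_to_bronze(source_file: Path, bronze_root: Path, source_system: str) -> Path:
    """Copy a raw extract into a date-partitioned bronze path, unchanged.

    No parsing, cleaning, or transformation happens here: the bronze layer
    holds a faithful, auditable copy of exactly what the source system sent.
    """
    load_date = datetime.now(timezone.utc).strftime("%Y/%m/%d")
    target_dir = bronze_root / source_system / load_date
    target_dir.mkdir(parents=True, exist_ok=True)
    target = target_dir / source_file.name
    shutil.copy2(source_file, target)  # byte-for-byte copy; file metadata preserved
    return target
```

Because the landing step does no transformation, any downstream cleansing can be re-run from bronze at any time without re-extracting from the source.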

Service Level Agreement (SLA) for DataMart availability

This draft framework provides a robust SLA for DataMart availability, balancing business needs, technical feasibility, and stakeholder communication.

1. Availability Target
Set a clear uptime percentage that aligns with the criticality of the DataMart to your business processes.

2. Operational Hours
Define the expected operational hours of the DataMart. Specify if it needs to be …
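To make the availability target concrete, it helps to translate an uptime percentage into a downtime budget. The helper below is a simple worked example (the 30-day default period is an assumption, not part of the SLA draft):

```python
def allowed_downtime_minutes(uptime_target: float, period_minutes: int = 30 * 24 * 60) -> float:
    """Downtime budget (in minutes) implied by an uptime target over a period.

    Defaults to a 30-day month (43,200 minutes).
    """
    return period_minutes * (1.0 - uptime_target)


# A 99.5% target over a 30-day month allows 216 minutes (3.6 hours) of downtime;
# tightening to 99.9% shrinks the budget to 43.2 minutes.
```

Quoting the target alongside its downtime budget keeps the conversation with stakeholders grounded in what the percentage actually permits.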

Data Engineering and Transition to IFS Cloud

What good looks like

Background: By September 2025, all IFS data currently residing in the Apps 10 solution will transition to the IFS Cloud environment.

Commitment: Data Engineering commits to ensuring uninterrupted delivery of DataMarts sourced exclusively from IFS Cloud, enabling the continuity of reporting assets.

Prerequisites for a Successful Transition
1. Reduction of Views …

QA Requirements

Quality standards that apply to the development:
- Software lifecycle used
- Software quality tool(s) and methods used

The overarching goal of the Data Warehouse Development Quality Plan is proactive problem-solving: anomalies and discrepancies are swiftly identified and rectified before they escalate into issues. Through continuous monitoring, analysis, and data observability, the reliability, accuracy, and accessibility …

Introduction

Purpose of the Software Quality Plan

The purpose of a Data Warehouse Development Quality Plan is to ensure that the development, deployment, and maintenance of the data warehouse, along with its associated components such as third-party data sources, Azure Data Factory (ADF) pipelines, and Data Marts, meet established quality standards. This plan serves as a comprehensive …

Development Programme

Project Plan or other project management document.

Data Discovery Stage
Objective: Identify data sources, understand data requirements, and define the scope of the DataMart project.
Key Activities: …
Deliverables: …

Design Stage
Objective: Design the architecture, schema, and data models for the DataMart.
Key Activities: …
Deliverables: …

MVP (Minimum Viable Product) Stage
Objective: Develop and deploy a minimum …

Support

Organisation/Team responsible for management of Service Desk requests/incidents.

Escalation Routes
All reports WILL provide a method of raising a Service Desk request for support. Once raised, the following escalation routes will be followed.

Issues Reporting (IT Service Desk)

Stage                  | Activity                                          | Comments
Call Received          | Greet & Verify Identity; Document Caller Details  |
Problem Identification | Listen & Clarify …                                |

Control of Data

Identification of controlled data, its classification, and associated processes.

Data Identification

Inventory of Data Assets
Data Cataloguing: Create a comprehensive inventory of all data assets across your organisation, including databases, data warehouses, data marts, and data lakes.
Data Source Mapping: Identify and map all data sources, including internal systems and third-party sources.

All Data Sources …
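The cataloguing and source-mapping activities above imply a simple record structure per asset. The sketch below is illustrative only: the field names, classification labels, and grouping helper are assumptions chosen to show the shape of an inventory, not a prescribed schema.

```python
from dataclasses import dataclass


@dataclass
class DataAsset:
    name: str
    kind: str             # e.g. "database", "data warehouse", "data mart", "data lake"
    source_system: str    # internal system or third-party source, e.g. "IFS"
    classification: str   # e.g. "public", "internal", "confidential"
    owner: str            # accountable team or person


def catalogue_by_source(assets: list[DataAsset]) -> dict[str, list[str]]:
    """Group asset names by source system: a minimal data-source mapping."""
    mapping: dict[str, list[str]] = {}
    for asset in assets:
        mapping.setdefault(asset.source_system, []).append(asset.name)
    return mapping
```

Even a flat list of such records makes the classification and ownership gaps visible, which is the point of the inventory exercise.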

Design Methodologies and Environment

Design methodology, development platform, hardware requirements, and database.

All Software Quality Procedures

Continuous Integration and Continuous Deployment (CI/CD)
Continuous Integration and Continuous Deployment (CI/CD) ensures that changes and updates to data pipelines are tested, integrated, and deployed to production, facilitating consistent and reliable data processing and delivery. In dynamic data environments where sources, formats, and requirements …
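One concrete form the "tested before deployed" principle can take is a validation check that runs in the CI stage against each pipeline definition. This is a hedged sketch, not the team's actual gate: the required fields and the `pl_` naming prefix are illustrative assumptions (the prefix mirrors a common ADF naming convention, but the standard here does not specify one).

```python
def validate_pipeline_config(config: dict) -> list[str]:
    """Return validation errors for a pipeline definition; an empty list means valid."""
    errors: list[str] = []
    # Required fields are assumed for illustration.
    for field in ("name", "source", "sink"):
        if field not in config:
            errors.append(f"missing required field: {field}")
    # Naming convention check; the 'pl_' prefix is an assumed convention.
    if "name" in config and not str(config["name"]).startswith("pl_"):
        errors.append("pipeline name should use the 'pl_' prefix")
    return errors
```

Run in CI against every changed pipeline definition, a check like this fails the build before a malformed pipeline can reach production, which is exactly the consistency the paragraph above describes.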