Strategic & Technical Blueprint

Manufacturing Data Activation Blueprint

Activating SAP and Salesforce Data in Databricks for Manufacturing Enterprises: Governance-First, BI-Ready, CI/CD-Managed

Prepared by

Customertimes Data Practice

Audience

CIO / VP Data / Enterprise Architect / Finance Leader

Industry

Manufacturing

Version

1.0 — 2026

Section 1

Executive Summary

Manufacturing enterprises are operating with a data paradox: they generate more operational data than ever before, from SAP production orders to Salesforce pipeline activity, yet the teams who need that data most still rely on manually reconciled spreadsheets, shadow BI tools, and periodic extracts that are outdated before they're shared.

The root cause is not a shortage of data. It is a shortage of trustworthy, governed, BI-ready data: data that is current, reconciled across systems, quality-validated, and accessible for self-service through a governed platform.

This blueprint addresses that gap by laying out a proven approach to activating SAP and Salesforce data in Databricks using a medallion architecture, Unity Catalog governance, and CI/CD-managed data products.

Offer table: What it solves and timeline

| Offer | What It Solves | Timeline |
|---|---|---|
| Quick Starter: SAP + Salesforce Data Activation | Data is trapped, BI-inaccessible, or untrusted | 4–6 weeks |
| ABAP Logic Migration & Decommission | Custom ABAP code creates ECC exit risk and maintenance debt | 6–10 weeks |
| Finance 360 (Manufacturing) | Finance lacks a governed, reconciled view of procurement, AR, and commercial data | 8–12 weeks |

Why Customertimes

  • Senior-heavy delivery teams with deep SAP, Salesforce, and Databricks expertise
  • Proven extraction patterns for SAP-to-Databricks (SLT, ODP, CDS Views, RFC)
  • Governance-first approach: Unity Catalog, lineage, DQ validation, and CI/CD from day one
  • Manufacturing context: we understand OTIF, GR/IR, DSO, backlog, and pricing leakage
  • Composable offers: start with Quick Starter, expand to Finance 360 on the same architecture


Section 2

The Manufacturing Data Problem


2.1 Structural Challenges

System fragmentation at scale. SAP is typically the system of record for order-to-cash, procure-to-pay, finance, and production. Salesforce manages the commercial pipeline, account relationships, and pricing agreements. These two systems are rarely integrated in a way that supports analytical workloads.

Complex, time-sensitive metrics. OTIF, production backlog, GR/IR accruals, and pricing leakage require joining multiple SAP modules (SD, MM, PP, FI) with Salesforce objects, applying business rules that often exist only in ABAP custom code, and refreshing frequently enough to be actionable.

ABAP technical debt. Over years of SAP customisation, manufacturers accumulate ABAP reporting programs, user exits, and enhancement spots that encode critical business logic. This logic is rarely documented, is difficult to test, and becomes a significant risk as ECC decommission deadlines approach.

Shadow analytics proliferation. When the official data platform cannot meet business demand, teams build local solutions: Excel extracts, Power BI reports connected directly to SAP via RFC, Salesforce reports that don't reconcile with finance. The result is contradictory numbers and eroded trust.

2.2 The Cost of Inaction
Common consequences in manufacturing organisations without a governed data activation programme:
  • Month-end close delays caused by manual GR/IR reconciliation and AR aging preparation
  • Pricing leakage that goes undetected because net billing price is never compared to quoted or list price at scale
  • OTIF reporting inaccuracies because SAP delivery data and Salesforce commitment data are never joined reliably
  • Audit risk from ungoverned data access, undocumented transformations, and missing data lineage
  • ECC decommission risk from undocumented ABAP logic that cannot be safely migrated without inventory and testing

Section 3

Architecture Blueprint: Medallion on Databricks

3.1 Three-Zone Medallion Overview

Customertimes implements a three-zone medallion architecture on Databricks, aligned to Unity Catalog governance. Each zone has a distinct purpose, quality standard, and access policy.

Architecture flow: SAP ECC / S/4HANA and Salesforce CRM (sources) → Databricks Medallion Lakehouse → Power BI / Tableau (consumption)

| Zone | Purpose |
|---|---|
| Bronze | Raw ingestion, source-faithful Delta tables |
| Silver | Cleansed and conformed, business rules applied |
| Gold | Business-ready data products, governed and BI-ready |

3.2 SAP Extraction Patterns
Pattern table: Use case, latency, and complexity

| Pattern | Use Case | Latency | Complexity |
|---|---|---|---|
| SAP Landscape Transformation (SLT) | High-volume transactional tables (VBAK, EKKO, BSEG) | Near real-time | Medium |
| ODP (Operational Data Provisioning) | BW DataSources, LO and FI Extractors | Batch / incremental | Low |
| CDS View Extraction (S/4HANA) | S/4-native analytical views, clean semantics | Batch / delta | Low–Medium |
| RFC / BAPI Direct | Master data, reference data, small-volume tables | Batch | Low |
| ADF / Informatica Table Replication | Bulk historical load, broad table coverage | Batch | Low |
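
Whichever pattern delivers the change records, they land append-only in Bronze and are merged into current-state Silver tables. Below is a minimal PySpark sketch of that merge; the table names, the vbeln join key, and the change_type column are illustrative assumptions, not fixed parts of the blueprint.

from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Append-only Bronze feed of SAP change records (illustrative names).
# Assumes upstream dedup keeps only the latest record per sales order key.
changes = spark.table("raw_bronze.sap_ecc.vbak_changes")
cols = [c for c in changes.columns if c != "change_type"]  # project to target schema

silver = DeltaTable.forName(spark, "silver_conformed.commercial.sales_orders")

(silver.alias("t")
 .merge(changes.alias("s"), "t.vbeln = s.vbeln")
 .whenMatchedDelete(condition="s.change_type = 'D'")        # source deletions
 .whenMatchedUpdate(set={c: f"s.{c}" for c in cols})        # updates
 .whenNotMatchedInsert(values={c: f"s.{c}" for c in cols})  # new orders
 .execute())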

3.3 Salesforce Extraction Patterns
Pattern to Use Case mapping

| Pattern | Use Case |
|---|---|
| Salesforce Bulk API 2.0 | Full and incremental loads for large object volumes |
| Salesforce Change Data Capture (CDC) | Near-real-time event streaming for Opportunity, Account, Order |
| MuleSoft / Informatica IICS | Where existing middleware is in place and re-use is preferred |
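
For the Bulk API pattern, a minimal sketch of the Bronze landing step, assuming the extraction job drops CSV files in cloud storage; paths and table names are illustrative assumptions.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Incrementally pick up newly landed Bulk API extract files with Auto Loader.
(spark.readStream.format("cloudFiles")
 .option("cloudFiles.format", "csv")
 .option("cloudFiles.schemaLocation", "/checkpoints/salesforce/opportunity/schema")
 .option("header", "true")
 .load("/landing/salesforce/opportunity/")
 .withColumn("_ingested_at", F.current_timestamp())  # audit column
 .writeStream
 .option("checkpointLocation", "/checkpoints/salesforce/opportunity/")
 .trigger(availableNow=True)  # process what has landed, then stop
 .toTable("raw_bronze.salesforce.opportunity"))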

3.4 Technology Stack
Platform layers and components

| Layer | Component |
|---|---|
| Compute | Databricks (Unity Catalog workspace) |
| Storage | Azure Data Lake Storage Gen2 / AWS S3 (Delta Lake format) |
| Orchestration | Databricks Workflows, Apache Airflow |
| Transformation | PySpark, Databricks SQL, Delta Live Tables |
| CI/CD | Git (Azure DevOps / GitHub), Databricks Asset Bundles |
| Governance | Unity Catalog (metastore, lineage, tags, access policies) |
| BI Layer | Power BI, Tableau, SAP Analytics Cloud |
| Infrastructure-as-Code | Terraform + Databricks Terraform Provider |

Section 4

Governance Framework

4.1 Unity Catalog Structure

Customertimes implements a catalogue hierarchy aligned to manufacturing data domains:


Unity Catalog Metastore

├─ catalog: raw_bronze
│
│  ├─ schema: sap_ecc
│  │
│  │  ├─ mara, marc, mard -- Materials
│  │  ├─ vbak, vbap, likp, lips -- Sales & Deliveries
│  │  ├─ ekko, ekpo, ekes, ekbe -- Purchasing
│  │  └─ bseg, bkpf, bsid, bsad -- FI Documents & AR
│  │
│  └─ schema: salesforce
│     ├─ opportunity, account, order
│     └─ pricebook_entry, contract
│
├─ catalog: silver_conformed
│  ├─ schema: commercial
│  ├─ schema: procurement
│  └─ schema: finance
│
└─ catalog: gold_products
   ├─ schema: finance_mart
   ├─ schema: operations_mart
   └─ schema: commercial_mart
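
A minimal sketch of standing up this hierarchy from a Databricks notebook (where spark is the ambient session), assuming the workspace is attached to a Unity Catalog metastore and the caller has catalog-creation privileges.

# `spark` is the ambient Databricks notebook session.
for catalog, schemas in {
    "raw_bronze": ["sap_ecc", "salesforce"],
    "silver_conformed": ["commercial", "procurement", "finance"],
    "gold_products": ["finance_mart", "operations_mart", "commercial_mart"],
}.items():
    spark.sql(f"CREATE CATALOG IF NOT EXISTS {catalog}")
    for schema in schemas:
        spark.sql(f"CREATE SCHEMA IF NOT EXISTS {catalog}.{schema}")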

4.2 Data Access Control Model
Role-based access by data layer

| Role | Bronze Access | Silver Access | Gold Access |
|---|---|---|---|
| Data Engineering | Read / Write | Read / Write | Read / Write |
| Data Architecture | Read | Read / Write | Read / Write |
| Finance Analysts | None | None | Read (finance_mart) |
| Operations Analysts | None | None | Read (operations_mart) |
| Commercial / Sales Ops | None | None | Read (commercial_mart) |
| Auditor / Compliance | Read (audit log) | Read (DQ logs) | Read |
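
The Gold-layer rows of this model translate directly into Unity Catalog grants. A minimal sketch, assuming account-level groups such as finance_analysts already exist (group names are illustrative assumptions):

# `spark` is the ambient Databricks notebook session.
grants = {
    "gold_products.finance_mart": "finance_analysts",
    "gold_products.operations_mart": "operations_analysts",
    "gold_products.commercial_mart": "commercial_sales_ops",
}
for schema, group in grants.items():
    spark.sql(f"GRANT USE CATALOG ON CATALOG gold_products TO `{group}`")
    spark.sql(f"GRANT USE SCHEMA ON SCHEMA {schema} TO `{group}`")
    spark.sql(f"GRANT SELECT ON SCHEMA {schema} TO `{group}`")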

4.3 Data Ownership Model

Each Gold data product has a named data product owner (typically a business stakeholder), a data steward (governance team), and a data engineer (technical maintainer). This is documented in the Unity Catalog tag taxonomy and enforced through change management.

4.4 PII and Data Classification

All columns are tagged on ingestion in the Bronze zone using Unity Catalog system tags:

  • pii.classification: personal_identifiable — customer name, contact, bank details
  • pii.classification: sensitive_business — pricing, margin, rebate conditions
  • pii.classification: public — material descriptions, plant codes, cost centres

Column masking policies are applied automatically based on tag + role combination, with no manual intervention required on new tables that inherit the schema.
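
A minimal sketch of the underlying tag-plus-mask primitives, assuming a Bronze customer master table kna1 with a name1 column; the function and group names are illustrative, the automatic tag-driven application is typically a policy job that discovers tagged columns, and the exact SET MASK syntax should be verified against the current Databricks release.

# `spark` is the ambient Databricks notebook session.
spark.sql("""
    ALTER TABLE raw_bronze.sap_ecc.kna1 ALTER COLUMN name1
    SET TAGS ('pii.classification' = 'personal_identifiable')
""")

# Masking function: only members of an authorised group see the raw value.
spark.sql("""
    CREATE OR REPLACE FUNCTION raw_bronze.sap_ecc.mask_pii(val STRING)
    RETURNS STRING
    RETURN CASE WHEN is_account_group_member('pii_readers') THEN val
                ELSE '***REDACTED***' END
""")

spark.sql("""
    ALTER TABLE raw_bronze.sap_ecc.kna1 ALTER COLUMN name1
    SET MASK raw_bronze.sap_ecc.mask_pii
""")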

Section 5

Observability and Data Quality

5.1 DQ Check Framework

Every Silver and Gold table includes automated DQ checks executed as part of the pipeline run:

Data quality checks: examples and actions on failure

| Check Type | Example | Action on Failure |
|---|---|---|
| Completeness | order_value IS NOT NULL | Quarantine + alert |
| Referential Integrity | material_id EXISTS IN material_master | Log + alert |
| Uniqueness | No duplicate delivery_number per day | Deduplicate + alert |
| Business Rule | gr_quantity <= po_quantity | Flag for review |
| Freshness | Silver table updated within SLA window | Alert + SLA breach log |
| Volume | Row count within ±20% of expected range | Alert + hold for review |
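
Inside a Delta Live Tables pipeline, several of these checks map directly onto expectations. A minimal sketch with illustrative table and column names; a production pipeline would route dropped rows to a quarantine table and fire alerts rather than silently discarding them.

import dlt

@dlt.table(name="silver_goods_receipts")
@dlt.expect_or_drop("order_value_present", "order_value IS NOT NULL")  # completeness
@dlt.expect("gr_within_po", "gr_quantity <= po_quantity")              # business rule: flag only
def silver_goods_receipts():
    # `spark` is provided by the DLT runtime.
    return (
        spark.table("raw_bronze.sap_ecc.goods_receipts")
        .dropDuplicates(["delivery_number", "posting_date"])  # uniqueness rule
    )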

5.2 Reconciliation Reporting

For every incremental load, Customertimes generates a reconciliation report comparing:

  • Source record count vs. Bronze record count
  • Bronze record count vs. Silver (post-DQ) record count
  • Silver vs. Gold aggregate totals (e.g., total order value by plant)

These reports are stored in a dedicated audit schema and surfaced in the observability dashboard, with drill-down to the specific records that failed or were quarantined.
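
A minimal sketch of the count comparison for one table and load date, appending the result to an audit table; all table and column names are illustrative assumptions.

from pyspark.sql import Row, SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

def reconcile_counts(load_date: str) -> None:
    """Compare Bronze vs. post-DQ Silver row counts and log the result."""
    bronze_n = (spark.table("raw_bronze.sap_ecc.vbak")
                .where(F.col("_load_date") == load_date).count())
    silver_n = (spark.table("silver_conformed.commercial.sales_orders")
                .where(F.col("_load_date") == load_date).count())
    row = Row(load_date=load_date, bronze_rows=bronze_n,
              silver_rows=silver_n, dropped=bronze_n - silver_n)
    (spark.createDataFrame([row])
     .write.mode("append")
     .saveAsTable("silver_conformed.audit.reconciliation_log"))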

5.3 Observability Dashboard

The standard observability dashboard covers:

  • Pipeline run status (success / failure / partial) by table and date
  • DQ check pass/fail rates by table, with trend over time
  • Data freshness: last successful load timestamp vs. SLA
  • Row count trends and anomaly detection (volume spikes or drops)
  • Lineage map: source → Bronze → Silver → Gold, visualised

Section 6

Offer 1: Quick Starter (SAP + Salesforce Data Activation)

OFFER 1 OF 3

Quick Starter: SAP + Salesforce Data Activation

BI-ready in Databricks, in weeks, not quarters

6.1 Purpose

The Quick Starter is designed for manufacturing organisations that need to demonstrate value from their Databricks investment quickly, or those building a governed data foundation for the first time. It delivers BI-ready data from SAP and Salesforce in 4–6 weeks.

6.2 Scope and Deliverables

In Scope:

  • Source-to-Bronze ingestion for 3–5 priority SAP data domains (Sales Orders, Deliveries, Materials, FI Documents, Customer Master)
  • Source-to-Bronze ingestion for 2–3 Salesforce objects (Opportunity, Account, Order)
  • Silver layer: cleansed, conformed tables with DQ checks and reconciliation logs
  • Gold mart: 1–2 priority use cases defined in scoping session (e.g., OTIF or order-to-cash)
  • Unity Catalog registration, lineage tracking, and column-level tag taxonomy
  • CI/CD pipeline: Git-based, automated DQ tests, dev/staging/prod environment promotion
  • Data dictionary, runbook, and knowledge transfer sessions (2 × half-day)

6.3 Timeline
Project plan by week and activity

| Week | Activity |
|---|---|
| Week 1 | Scoping, data access setup, SAP + Salesforce connection validation, use-case sign-off |
| Week 2 | Bronze ingestion: SAP extraction patterns configured, Salesforce Bulk API connected |
| Week 3 | Silver layer: business rules, DQ checks, reconciliation framework established |
| Week 4 | Gold mart: first use case delivered, Unity Catalog governance applied |
| Week 5 | CI/CD pipeline hardening, testing, observability dashboard |
| Week 6 | Stakeholder review, knowledge transfer, runbook, go-live sign-off |

6.4 Risk Controls
Project risks and mitigations

| Risk | Mitigation |
|---|---|
| SAP access delays | Pre-engagement access checklist; parallel workstream on Salesforce while SAP access is provisioned |
| Schema drift from SAP changes | Schema drift detection in CI/CD; Bronze zone is append-only, so source changes are captured, not masked |
| DQ issues in source data | DQ quarantine zone; reconciliation reports surfaced to named business data owner for adjudication |
| Scope creep | Scoped use cases agreed in writing before Week 1 starts; formal change control for additions |
| Knowledge dependency | All logic documented in runbook; restart procedures tested before handover |

6.5 Success Metrics

  • SAP and Salesforce data available in Databricks Gold zone, passing all DQ checks
  • Pipeline SLA met: daily refresh by agreed time for 5 consecutive days without manual intervention
  • Business stakeholder signs off on output accuracy against source system reconciliation report
  • CI/CD pipeline operational: dev → staging → prod with automated test gate
  • Data dictionary and runbook delivered, reviewed, and accepted by client data engineering team

Section 7

Offer 2: ABAP Logic Migration & Decommission

OFFER 2 OF 3

ABAP Logic Migration & Decommission

De-risk your ECC exit with a clean path to SAP Datasphere

7.1 Purpose

Custom ABAP code is one of the most underestimated risks in SAP ECC decommission programmes. Business-critical calculations (pricing adjustments, allocation rules, exception classifications) are embedded in ABAP programs, user exits, and BAdIs that are not documented, not tested, and often known to only a small number of people.

This offer systematically inventories, documents, and migrates that logic to Databricks-native code, reducing ECC exit risk and creating a forward path to SAP Datasphere if required by the enterprise roadmap.

7.2 Three-Phase Delivery

Phase A: Inventory and Documentation (Weeks 1–2)

  • Structured interview programme with SAP functional and technical leads
  • ABAP program inventory: classified by type (report, user exit, BAdI, function module)
  • Business rule extraction: documented in structured format (input → logic → output)
  • Dependency mapping: which programs feed which reports or downstream systems
  • Complexity and risk scoring: which programs are highest-priority for migration

Phase B: Migration to Databricks (Weeks 3–6)

  • PySpark / Databricks SQL translation of inventoried ABAP logic (see the sketch after this list)
  • Parameterised, unit-testable functions registered in Unity Catalog
  • Documentation: code comments, function signatures, test cases
  • SAP Datasphere compatibility layer (if applicable): translated logic wrapped for Datasphere views
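
A minimal sketch of what a migrated rule looks like as a parameterised, unit-testable function; the surcharge rule itself is hypothetical, standing in for whatever logic the Phase A inventory surfaces.

from pyspark.sql import DataFrame, functions as F

def apply_surcharge(orders: DataFrame,
                    threshold: float = 10000.0,
                    surcharge_pct: float = 0.02) -> DataFrame:
    """Replicates hypothetical ABAP user-exit logic: surcharge orders below threshold."""
    return orders.withColumn(
        "net_value_adj",
        F.when(F.col("net_value") < threshold,
               F.col("net_value") * (1 + surcharge_pct))
         .otherwise(F.col("net_value")),
    )

# Regression-style unit test (pytest; assumes a `spark` session fixture).
def test_apply_surcharge(spark):
    df = spark.createDataFrame([(5000.0,), (20000.0,)], ["net_value"])
    out = {r["net_value"]: r["net_value_adj"]
           for r in apply_surcharge(df).collect()}
    assert out[5000.0] == 5100.0 and out[20000.0] == 20000.0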

Phase C: Validation and Sign-Off (Weeks 7–10)

  • Side-by-side reconciliation: ABAP output vs. Databricks equivalent on identical data slice
  • Regression test suite in Delta Live Tables: automated, re-runnable on each pipeline run
  • Stakeholder readout: reconciliation results, exception log, assumptions and deviations
  • Decommission sign-off checklist: conditions that must be met before SAP program retirement

7.3 Timeline
Plan by week range and activity

| Week | Activity |
|---|---|
| Weeks 1–2 | ABAP inventory, business rule documentation, complexity and risk scoring |
| Weeks 3–4 | PySpark/SQL translation: high-priority programs migrated and documented |
| Weeks 5–6 | Translation: medium-priority programs; Datasphere compatibility layer (if applicable) |
| Weeks 7–8 | Side-by-side reconciliation; regression test suite built in Delta Live Tables |
| Weeks 9–10 | Stakeholder readout, sign-off checklist, documentation handover, close-out |

7.4 Risk Controls
Risks and mitigations

| Risk | Mitigation |
|---|---|
| Undocumented business rules | Structured interview programme; access to ABAP source code; functional lead review of all documented logic before migration begins |
| Reconciliation discrepancies | Discrepancy log maintained; business owner adjudicates on intent vs. historical ABAP behaviour |
| SAP Datasphere roadmap uncertainty | Compatibility layer is optional; architecture decision documented and acknowledged regardless of choice |
| Key-person dependency on ABAP knowledge | Knowledge is externalised and documented as a primary deliverable of Phase A |
| ECC timeline pressure | High-priority programs identified in Week 1; migration sequenced by risk score, not alphabetically |

7.5 Success Metrics

  • ABAP inventory completed, reviewed, and signed off by SAP program lead
  • All high-priority programs migrated and reconciled at agreed pass rate
  • Regression test suite operational in Delta Live Tables
  • Decommission sign-off checklist completed and approved by programme sponsor
  • No critical reconciliation discrepancies unresolved at handover

Section 8

Offer 3: Finance 360 (Manufacturing)

OFFER 3 OF 3

Finance 360 (Manufacturing)

Sales + Procurement data products with governed CI/CD marts

8.1 Purpose

Finance leaders in manufacturing need a single, governed view of the business that reconciles procurement spend, accounts receivable, and commercial performance, without waiting on IT for every report or reconciling four spreadsheet versions at month-end.

Finance 360 delivers a purpose-built suite of data products for manufacturing finance, combining SAP FI, MM, and SD module data with Salesforce commercial data into governed, BI-ready Databricks Gold marts.

8.2 Data Products Included

Procurement Data Product

Source: SAP MM — EKKO, EKPO, EKES, EKET, EKBE, MARA, LFM1

Key metrics and their definitions

| Key Metric | Definition |
|---|---|
| GR/IR Accrual Balance | Goods received not yet invoiced, by cost centre and GL account (close-ready) |
| Open PO Value | Committed spend on open purchase orders not yet received |
| Price Variance % | (PO price – invoice price) / PO price, by vendor, material category, plant |
| Invoice Cycle Time | GR posting date → invoice receipt date → payment date, by vendor |

AR / DSO Data Product

Source: SAP FI-AR — BSID, BSAD, KNA1, VBAK

Key metrics and definitions

| Key Metric | Definition |
|---|---|
| DSO | Net receivables / daily revenue (rolling), by customer segment and business unit |
| AR Aging Distribution | % overdue by bucket: current, 30, 60, 90, 90+ days |
| Dispute Value | Total open items in dispute, by dispute reason code |
| Credit Utilisation | Customers approaching or exceeding SAP credit limit |
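
A minimal sketch of the DSO metric in Databricks SQL, assuming Silver tables billing_revenue and ar_open_items (illustrative names) and a 90-day rolling revenue window; the actual calculation method is agreed in the Week 1 definition workshop (see Section 8.4).

# `spark` is the ambient Databricks notebook session.
dso = spark.sql("""
  WITH daily_rev AS (
    SELECT business_unit,
           SUM(net_revenue) / 90.0 AS avg_daily_revenue   -- rolling 90 days
    FROM silver_conformed.finance.billing_revenue
    WHERE billing_date >= date_sub(current_date(), 90)
    GROUP BY business_unit
  ),
  receivables AS (
    SELECT business_unit, SUM(open_amount) AS net_receivables
    FROM silver_conformed.finance.ar_open_items
    GROUP BY business_unit
  )
  SELECT r.business_unit,
         r.net_receivables / d.avg_daily_revenue AS dso_days
  FROM receivables r
  JOIN daily_rev d USING (business_unit)
""")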

Commercial Data Product

Source: SAP SD (VBAK, VBAP, LIPS, VBRK, VBRP) + Salesforce (Opportunity, Account, Order, PricebookEntry)

Key metrics and definitions

| Key Metric | Definition |
|---|---|
| Pricing Leakage % | (SF list price – SAP net billing price) / list price, by customer tier and product line |
| OTIF % | On-time and in-full delivery rate, by plant, customer, and SKU |
| Conversion Rate | SF Opportunity Closed Won → SAP Sales Order created within N days |
| Backlog Value | Open SAP sales orders not yet confirmed or shipped |
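
A minimal sketch of the pricing leakage join: Salesforce list price vs. SAP net billing price. Join keys and table names are illustrative assumptions; the real matching algorithm and fallback rules are agreed in scoping (see Section 8.4).

# `spark` is the ambient Databricks notebook session.
leakage = spark.sql("""
  SELECT b.customer_id,
         b.material_id,
         p.list_price,
         b.net_price,
         (p.list_price - b.net_price) / p.list_price AS leakage_pct
  FROM silver_conformed.commercial.sap_billing_items b
  JOIN silver_conformed.commercial.sf_pricebook_entries p
    ON b.material_id = p.product_code
   AND b.customer_tier = p.customer_tier
  WHERE p.list_price > 0
""")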

Governed CI/CD Marts

All Finance 360 data products are delivered as versioned, tested, CI/CD-managed Gold tables:

  • dbt-style transformation layers: modular SQL/PySpark with clear lineage and dependency graph
  • Automated DQ tests: every mart run includes completeness, business rule, and reconciliation checks (see the sketch after this list)
  • Environment promotion: dev → staging → prod with pull request review and automated test gate
  • Schema change management: additive changes auto-deployed; breaking changes require sign-off
  • Semantic layer: column-level business definitions, certified datasets in Power BI / Tableau
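
A minimal sketch of the automated test gate referenced above: pytest-style assertions executed in CI before promotion, failing the build on any breach. Table and column names are illustrative assumptions, and spark is assumed to be a pytest fixture providing a session.

import pyspark.sql.functions as F

def test_grir_accrual_not_null(spark):
    # Completeness gate: no mart row may have a missing accrual amount.
    nulls = (spark.table("gold_products.finance_mart.grir_accruals")
             .where(F.col("accrual_amount").isNull()).count())
    assert nulls == 0, f"{nulls} rows missing accrual_amount"

def test_ar_aging_buckets_sum_to_total(spark):
    # Reconciliation gate: aging buckets must sum to the total open amount.
    residual = (spark.table("gold_products.finance_mart.ar_aging")
                .agg((F.sum("bucket_current") + F.sum("bucket_30")
                      + F.sum("bucket_60") + F.sum("bucket_90")
                      + F.sum("bucket_90_plus")
                      - F.sum("total_open")).alias("residual"))
                .collect()[0]["residual"])
    assert abs(residual) < 0.01  # tolerance for rounding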

8.3 Timeline
Plan by week range and activity

| Week | Activity |
|---|---|
| Weeks 1–2 | Data discovery: SAP FI/MM/SD tables, Salesforce objects, business rule definition workshops |
| Weeks 3–4 | Bronze ingestion: all source tables connected, DQ baseline established |
| Weeks 5–6 | Silver layer: canonical entities, business rule application, reconciliation framework |
| Weeks 7–8 | Gold mart: Procurement and AR/DSO data products delivered and validated |
| Weeks 9–10 | Gold mart: Commercial product, pricing leakage detection, OTIF calculation |
| Weeks 11–12 | CI/CD hardening, semantic layer, stakeholder review, go-live |

8.4 Risk Controls
Risks and mitigations

| Risk | Mitigation |
|---|---|
| Business rule ambiguity (e.g., DSO calculation method) | Definition workshop in Week 1; agreed definitions documented before build begins |
| Salesforce–SAP join failure (no shared key) | Matching algorithm defined and validated in scoping; fallback rules agreed with business owner |
| Pricing leakage logic complexity | Pricing condition type mapping workshop; phased delivery (net price first, rebates second sprint) |
| Month-end close dependency | Go-live not scheduled within 2 weeks of month-end; hypercare window covers first live close |
| Data volume and performance | Partitioning strategy and Z-order optimisation applied in Gold layer; performance SLA defined upfront |

8.5 Success Metrics

  • All three Gold data products operational and passing DQ checks
  • Finance leader sign-off on GR/IR accrual output vs. SAP FBL3N / MB5S reconciliation
  • DSO output within agreed tolerance vs. SAP standard AR aging report
  • OTIF metric agreed and validated by Operations lead
  • CI/CD pipeline operational with automated test gate across all three marts
  • First live month-end close supported without escalation to engineering team

Section 9

TCO and ROI Logic

9.1 Framing

Quantifying the ROI of a data platform programme requires honesty: some benefits are directly measurable (reduced manual effort, avoided licence costs), others are indirect (better decisions, faster close, reduced audit risk). This section provides the logical framework and cost drivers without inventing specific figures.

How to use this section

We recommend using this model as a structured conversation with your Finance team during the assessment phase, substituting your organisation's actual operational numbers. A well-constructed, defensible business case built on actual data is more valuable, and more trusted internally, than an inflated projection.

9.2 Cost Reduction Drivers

LICENSE & INFRASTRUCTURE

| Driver | Logic |
|---|---|
| Shadow BI consolidation | How many separate BI or extract tools can be retired when Gold marts are self-service? |
| SAP BW / BEx rationalisation | How many BW InfoProviders can be retired if ABAP logic migrates to Databricks? |
| ECC dual-run cost | Does ABAP migration remove blockers to ECC retirement, reducing parallel running cost? |

OPERATIONAL EFFICIENCY

| Driver | Logic |
|---|---|
| Month-end close | How many FTE-days per period are spent on manual GR/IR, AR aging, and intercompany reconciliation? |
| Pipeline rework | What is the engineering cost of fixing broken pipelines caused by SAP schema changes or DQ failures? |
| Audit preparation | How much time is spent assembling lineage, access logs, and DQ evidence for internal or external auditors? |

9.3 Value Creation Drivers

REVENUE & MARGIN

| Driver | Logic |
|---|---|
| Pricing leakage recovery | If leakage detection identifies unrecognised discounts or rebate errors, even a small % of revenue can be material at manufacturing scale |
| OTIF penalty avoidance | If earlier OTIF visibility enables intervention before a customer penalty threshold is breached, what is the avoided cost? |

WORKING CAPITAL & SPEED

| Driver | Logic |
|---|---|
| DSO reduction | If earlier dispute identification accelerates cash collection by even a few days at scale, what is the working capital impact? |
| Decision cycle time | If Finance and Operations have daily-refreshed trusted data vs. weekly manual reports, what decisions can be made faster? |
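
A minimal sketch of the DSO working-capital question as arithmetic. The input values are placeholders only, to be replaced with your organisation's actual numbers in the Finance conversation described in Section 9.1.

annual_revenue = 500_000_000   # placeholder: substitute actuals
dso_days_reduced = 3           # placeholder: substitute actuals
cost_of_capital = 0.08         # placeholder: substitute actuals

daily_revenue = annual_revenue / 365
working_capital_freed = daily_revenue * dso_days_reduced
annual_carrying_benefit = working_capital_freed * cost_of_capital

print(f"Cash freed: {working_capital_freed:,.0f}")
print(f"Annual carrying-cost benefit: {annual_carrying_benefit:,.0f}")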

9.4 Time Horizon Structure

Horizon focus and value drivers

| Horizon | Focus | Typical Value Drivers |
|---|---|---|
| Year 1 (Activation) | Avoided cost and efficiency gains | Manual effort reduction, shadow tool licence consolidation, audit preparation time |
| Year 2 (Optimisation) | Revenue and margin impact | Pricing leakage recovery, OTIF penalty avoidance, DSO improvement |
| Year 3+ (Scale) | Platform compounding value | New data product velocity, self-service adoption, advanced analytics on clean Silver data |

Section 10

Phased Roadmap

Each row represents a workstream. Phases are designed to be composable: Phases 2A and 2B can run in sequence or overlap depending on team capacity and business priority.

| Phase | Timing | Workstream | Milestone |
|---|---|---|---|
| Phase 1 | Weeks 1–6 | Quick Starter: SAP + Salesforce Data Activation | Go-Live, Week 6 |
| Phase 2A | Weeks 7–18 | Finance 360 (Manufacturing): Procurement, AR/DSO, Commercial | Live Close, Week 17 |
| Phase 2B | Weeks 7–16 | ABAP Logic Migration & Decommission: Inventory → Migrate → Validate | Sign-Off, Week 16 |
| Phase 3 | Month 5+ | Scale: new data products, self-service enablement, advanced analytics | Ongoing |

Phase 1 Exit Criteria (before Phase 2 starts)
  • Business stakeholder validates output accuracy via reconciliation report
  • Pipeline runs without manual intervention for 5 consecutive days
  • Unity Catalog governance confirmed: lineage, tags, and access policies in place
  • Executive sponsor briefed and committed to Phase 2 scope and resource

Phase 3 Capabilities (Month 5+)
  • New data product velocity: additional use cases built using established patterns and CI/CD, without re-architecting
  • Self-service enablement: business analysts build on Gold marts without engineering intervention, using governed semantic layer
  • Advanced analytics: ML models for demand forecasting, churn prediction, anomaly detection — built on the clean Silver layer
  • Platform consolidation: retiring redundant tools — shadow BI, legacy BW queries, manual extracts
  • Data product marketplace: internal catalogue enabling cross-functional data sharing via Delta Sharing

Section 11

Assessment Checklist for Leaders

Use this checklist to evaluate your organisation's readiness and to structure your conversation before the assessment call with Customertimes.

Data Landscape Readiness

  • We have an active Databricks workspace, or have budget and approval to stand one up
  • We can identify the SAP modules most critical to our data needs (SD, MM, FI, PP, etc.)
  • We have a named SAP technical contact who can provide schema access and extraction credentials
  • We have a named Salesforce administrator who can configure API access and object permissions
  • We know which 2–3 business use cases would deliver the most immediate value if data were trusted and timely

Governance and Risk

  • We have (or are willing to establish) a data governance policy that defines data ownership
  • We understand our GDPR, SOX, or industry-specific data handling obligations for the data we want to activate
  • We have a view on which data assets contain PII and which are purely operational or financial
  • We know our ECC decommission timeline, or can find out within 2 weeks
  • We have a process for change management when data definitions or pipeline logic changes

Organisational Readiness

  • There is an executive sponsor (CIO, VP Data, or CFO) who can unblock access and resource requests
  • The Finance team is willing to validate output accuracy against source system reports
  • Our data engineering team (internal or outsourced) can own the platform post-handover
  • We are open to a time-boxed engagement with clear scope and written success criteria
  • We have capacity for 2–3 hours per week from a business stakeholder during the engagement

Architecture and Technology

  • We are committed to Databricks as our primary analytical platform, or are evaluating it seriously
  • We are willing to implement Unity Catalog governance (or already have it in place)
  • Our cloud environment (Azure, AWS, or GCP) is compatible with Databricks deployment requirements
  • We are open to implementing Git-based version control for data pipeline code
  • We understand the difference between ELT (transform in Databricks) and ETL (transform in middleware) and have a preference

Strategic Alignment

  • Leadership understands that this is a data platform programme, not a reporting tool purchase
  • We have a realistic timeline expectation: 4–6 weeks for Quick Starter, not 4–6 days
  • We are prepared to make decisions on data definitions, access policies, and use case priorities in a timely manner
  • We see this as the beginning of a data product discipline, not a one-time project
  • We have communicated to business stakeholders that initial outputs will be validated before replacing existing reports

Ready to take the next step?

Book a free 60-minute data landscape assessment with Customertimes. We'll map your SAP and Salesforce environment, identify your highest-value use cases, and recommend the right offer, with no commitment required.

Contact Our Experts