
How I Automated a $150M Expense Reporting Pipeline

March 15, 2026·8 min read

Every quarter, a team of analysts would spend weeks manually consolidating expense data from dozens of cost centres into a single Excel file — then pivot it, format it, and email it to 15 executives. It was painful, error-prone, and expensive.

I was asked to fix it. Here's exactly how I did it.

The Problem

The company managed $150M+ in annual operating expenses across 6 business units. The reporting process looked like this:

  1. Finance team pulls data from SAP manually each month
  2. Data lands in 12 different Excel files with different structures
  3. Analysts spend 2–3 weeks merging, cleaning, and formatting
  4. A static PDF report gets emailed to leadership
  5. Leadership asks follow-up questions → process repeats

The total cost: ~40 analyst-hours per month, plus constant firefighting when data was wrong.

The Solution Architecture

I designed a 3-layer automation stack:

Layer 1 — Data Extraction (n8n)

Built an n8n workflow that:

  • Connects to the SAP OData API on a scheduled trigger (1st of each month)
  • Pulls GL transactions, cost centre hierarchies, and budget data
  • Normalises the schema across all 6 business units
  • Loads clean data into Azure SQL Database

SAP OData → n8n workflow → Azure SQL

The workflow runs automatically. Zero analyst involvement.
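The normalisation step is the heart of this layer. A minimal sketch of the idea in Python, with hypothetical column mappings for two business units (the real workflow does this inside an n8n Code node, and the actual SAP field names differ):

```python
# Sketch of the schema-normalisation step. Column names per business
# unit are illustrative, not the actual SAP export fields.
COLUMN_MAPS = {
    "BU_RETAIL": {"Kostenstelle": "cost_centre", "Betrag": "amount", "Konto": "gl_account"},
    "BU_LOGISTICS": {"CostCtr": "cost_centre", "Amt_USD": "amount", "Account": "gl_account"},
}

def normalise(row: dict, business_unit: str) -> dict:
    """Rename unit-specific columns to the canonical schema."""
    mapping = COLUMN_MAPS[business_unit]
    out = {canonical: row[source] for source, canonical in mapping.items()}
    out["business_unit"] = business_unit
    return out

row = {"CostCtr": "CC-4410", "Amt_USD": 1250.0, "Account": "600100"}
print(normalise(row, "BU_LOGISTICS"))
```

Because every unit maps into the same canonical schema before load, the Azure SQL table never sees six different layouts.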

Layer 2 — Data Modelling (Power BI + Fabric)

Built a star schema data model in Microsoft Fabric:

  • Fact table: GL transactions (12M+ rows)
  • Dimensions: Cost centre, legal entity, account, time
  • Calculated measures: EBIT, EBITDA, OPEX variance, run-rate

Key DAX pattern I used for budget variance:

EBIT Variance % = 
DIVIDE(
    [EBIT Actual] - [EBIT Budget],
    ABS([EBIT Budget])
)

Layer 3 — Reporting & Distribution (Power BI + n8n)

The Power BI dashboard auto-refreshes daily. But I went further:

  • n8n workflow detects when the refresh completes
  • Automatically generates a PDF snapshot of the executive summary page
  • Emails it to the distribution list with dynamic commentary
  • Posts a summary card to the Finance Teams channel
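The refresh detection leans on the Power BI REST API's refresh-history endpoint (`GET /v1.0/myorg/datasets/{datasetId}/refreshes`), which returns recent refreshes newest-first, each with a `status` of Completed, Failed, or Unknown (in progress). A sketch of the decision logic, with the HTTP call and bearer token omitted:

```python
import json

def latest_refresh_status(refresh_history: dict) -> str:
    """Return the status of the most recent dataset refresh, given the
    parsed JSON body of the Power BI refresh-history endpoint (entries
    are newest first)."""
    entries = refresh_history.get("value", [])
    return entries[0]["status"] if entries else "NoHistory"

# Trimmed example payload; the real call needs an Azure AD bearer token.
payload = json.loads('{"value": [{"status": "Completed", "endTime": "2026-03-01T06:12:00Z"}]}')
if latest_refresh_status(payload) == "Completed":
    print("trigger PDF export and email")
```

n8n polls this endpoint on a schedule; only a `Completed` status releases the downstream PDF-and-email branch, so executives never receive a snapshot of a half-refreshed dataset.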

Results

| Metric | Before | After |
|---|---|---|
| Time to produce report | 2–3 weeks | < 4 hours (refresh time) |
| Manual analyst hours/month | 40+ hours | 2 hours (QA only) |
| Report accuracy | ~93% (human error) | 99.8% |
| Executive response time | Day after meeting | Real-time dashboard |

What Made This Work

1. Starting with the data contract
Before touching Power BI, I defined exactly what data format I needed. This forced the n8n pipeline to conform to a spec, not to SAP's messy output.
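In practice the contract can be as simple as a required-columns-and-types check that every row must pass before load. A minimal sketch (the column names and types here are illustrative, not the actual spec):

```python
# Minimal data-contract check applied before loading into Azure SQL.
CONTRACT = {
    "cost_centre": str,
    "gl_account": str,
    "amount": float,
    "posting_date": str,  # ISO 8601, validated downstream
}

def validate(row: dict) -> list[str]:
    """Return a list of contract violations for one row (empty = valid)."""
    errors = []
    for col, typ in CONTRACT.items():
        if col not in row:
            errors.append(f"missing column: {col}")
        elif not isinstance(row[col], typ):
            errors.append(f"{col}: expected {typ.__name__}, got {type(row[col]).__name__}")
    return errors

good = {"cost_centre": "CC-1", "gl_account": "600100", "amount": 12.5, "posting_date": "2026-03-01"}
bad = {"cost_centre": "CC-1", "amount": "12.5", "posting_date": "2026-03-01"}
print(validate(good))  # []
print(validate(bad))   # ['missing column: gl_account', 'amount: expected float, got str']
```

Rows that fail the check are quarantined and surfaced in the admin dashboard rather than silently loaded, which is what keeps the downstream model trustworthy.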

2. Treating automation as a product
I added monitoring, alerting, and a simple admin dashboard so the finance team can see pipeline status without calling me.

3. Change management
The hardest part wasn't the technology — it was getting buy-in from analysts who were used to "their" Excel files. I ran 3 workshops showing how the tool frees them for higher-value analysis instead of data wrangling.

What's Next

I'm now extending this pipeline to include:

  • Forecasting layer using Python (scikit-learn) integrated into Fabric
  • Drill-through capability to invoice level
  • Natural language query via Copilot integration

If you have a similar reporting challenge, I'd be happy to walk through the approach with you.

Facing a similar challenge?

📅 Book a Free Call