Delivered business intelligence and SQL analytics for Signet Jewelers' Corporate Development team supporting a $450M+ M&A acquisition pipeline. Built comprehensive Power BI dashboards with drill-through reports and row-level security (RLS) for 50+ stakeholders, visualizing EBITDA trends, working capital analysis, and acquisition impact scenarios. Designed star schema dimensional models (12 fact tables, 18 dimensions) and automated SQL reconciliation logic that identified $12.6M in accounting errors, including a $3.2M price renegotiation on a single deal. Cut dashboard load times by 65% through optimized DAX measures. Translated 15.2M+ financial records into clear C-suite narratives using Power BI and PowerPoint. Reduced financial consolidation from 6 weeks to 5 days (92% improvement) through automated SQL reporting pipelines.
When I joined this engagement I quickly discovered that the Corporate Development team's biggest problem wasn't getting data out of SAP — it was trusting that data enough to make $450M+ acquisition decisions. CFOs and Controllers don't want to see Python notebooks or transformation logic; they need dashboards that are fast, drill-through capable, and SOX-auditable. Every Power BI report I built needed to answer "where did this number come from?" within seconds. The combination of well-designed star schema models, validated SQL reconciliation queries, and intuitive drill-through Power BI reports became the foundation not just for deal analysis, but for building organizational confidence in the data itself.
The Corporate Development team, supporting $450M+ in acquisitions, required rapid, trusted financial statements harmonized across acquisition targets with different accounting policies, all under deal-timeline pressure.
SAP FICO and Oracle Financials from acquisition targets had inconsistent chart of accounts, revenue recognition policies, and data structures requiring systematic transformation logic, not ad-hoc scripts.
Prior approach: manual Python notebooks producing untested, undocumented outputs. No data lineage, no automated quality validation, no way to reproduce or audit specific calculations under deal scrutiny.
M&A deal timelines demand rapid turnaround (days, not weeks) with 100% accuracy in financial modeling to support due diligence and valuation decisions.
While building the data ingestion framework, I noticed something unexpected in the SAP data: companies that Signet later successfully integrated had dramatically cleaner general ledgers. Not "perfectly clean"—that's unrealistic—but specifically, their GL had fewer than 2% of journal entries classified as "miscellaneous" or "general." Failed integrations? Often had 15%+ miscellaneous entries. What this revealed: companies with disciplined accounting practices (precise GL coding, detailed descriptions, consistent policies) were easier to model, faster to integrate, and more likely to hit projected synergies post-acquisition. The accounting discipline was a proxy for operational maturity. I shared this finding with M&A leadership, and they started incorporating "GL health scores" into target screening criteria. A data quality metric accidentally became a due diligence filter. Sometimes the metadata tells you more than the data itself.
Architected and implemented a cloud-native financial data platform leveraging dbt Core for governed transformations, Snowflake for scalable warehousing, and Power BI for executive reporting. The solution combined Python-based SAP GL extraction with dbt multi-layer SQL models to handle complex accounting rules, chart of accounts mappings, and financial consolidation requirements for M&A due diligence.
Traditional financial consolidation tools (BlackLine, OneStream) are great for steady-state reporting but terrible for M&A due diligence. Why? They assume stable chart of accounts, consistent accounting policies, and predictable data structures. M&A work is the opposite — every acquisition target has unique GL configurations, different revenue recognition policies, and varied capitalization thresholds. dbt gave us something BlackLine never could: the ability to see AND modify transformation logic in version-controlled SQL, with every business rule documented as a test and every lineage relationship auto-generated. When a CFO said “We recognize revenue on delivery, not on invoice,” I could update the accrual logic in a dbt model, commit, run the CI pipeline, and show updated financials in minutes — not days. But the real game-changer? dbt’s docs site meant we could show auditors complete lineage from raw SAP extract to final EBITDA figure — a complete audit trail from source to statement. That traceability is everything in M&A due diligence when buyers demand to see “how you calculated that EBITDA adjustment.”
Met with Accounting Advisory team and acquisition target CFOs to understand reporting requirements, accounting policies, and deal timelines. Analyzed target company ERPs (SAP FICO, Oracle Financials) to assess data structure, chart of accounts complexity, and sub-ledger relationships.
Initialized dbt project with staging/intermediate/marts layer structure in Snowflake. Defined source tables from SAP and Oracle raw extracts. Built staging models to clean, rename, and cast raw GL fields. Established profiles, packages (dbt-utils, dbt-expectations), and CI/CD hooks via GitHub Actions for automated test runs on every commit.
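The CI hook mentioned above can be sketched as a small GitHub Actions workflow. The file name, Python version, and plain `dbt build` invocation are illustrative assumptions; the real pipeline depends on the project's profiles and warehouse credentials (passed here as repository secrets):

```yaml
# .github/workflows/dbt_ci.yml -- illustrative sketch, not the production workflow
name: dbt-ci
on: [pull_request]

jobs:
  build-and-test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - run: pip install dbt-snowflake
      - run: dbt deps
      # Compile, run, and test every model on each pull request
      - run: dbt build --fail-fast
        env:
          SNOWFLAKE_ACCOUNT: ${{ secrets.SNOWFLAKE_ACCOUNT }}
          SNOWFLAKE_USER: ${{ secrets.SNOWFLAKE_USER }}
          SNOWFLAKE_PASSWORD: ${{ secrets.SNOWFLAKE_PASSWORD }}
```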
M&A auditors demand perfect traceability: “Show me every transformation from source GL entry to final EBITDA number.” Python notebooks can produce output, but they don’t document their own logic in a way an auditor can follow. dbt changed that. The auto-generated docs site showed complete lineage from stg_sap_gl → int_coa_mapped → fct_financial_statements. When a Big 4 auditor questioned a $2.3M revenue adjustment, I opened the dbt docs, clicked through the lineage graph to the intermediate model, showed the COA mapping YAML rule, and traced it to the specific mart table column — all in under two minutes. That traceability isn’t a nice-to-have in M&A due diligence. It’s deal-critical.
Built dbt staging models (stg_sap_gl, stg_oracle_gl, stg_sap_subledger) that clean, rename, and cast raw GL fields from SAP and Oracle source tables. Each staging model has a corresponding YAML schema file with not-null, unique, and accepted-values tests on critical fields (account_number, posting_date, amount, cost_center).
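A staging schema file of the kind described might look like the following sketch. Column names follow the staging model in this write-up; the specific test selections and the uniqueness grain are illustrative:

```yaml
# models/staging/stg_sap_gl.yml -- illustrative sketch of the schema tests described
version: 2

models:
  - name: stg_sap_gl
    description: "Cleaned SAP FICO GL line items"
    tests:
      # dbt-utils: document line items must be unique at this grain
      - dbt_utils.unique_combination_of_columns:
          combination_of_columns:
            - company_code
            - document_number
            - fiscal_year
            - line_item_number
    columns:
      - name: gl_account_number
        tests: [not_null]
      - name: posting_date
        tests: [not_null]
      - name: amount_local_currency
        tests: [not_null]
      - name: cost_center
        tests: [not_null]
      - name: posting_indicator
        tests:
          - accepted_values:
              values: ['DEBIT', 'CREDIT']
```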
Built intermediate dbt models implementing the core M&A logic: COA translation (source account → Signet standard hierarchy), GAAP policy adjustments (ASC 606 revenue recognition, ASC 842 lease accounting, depreciation standardization), and inter-company elimination logic. SCD Type 2 implemented in dbt to track 350+ COA mapping rules with effective_from and effective_to dates, enabling point-in-time lookups across deal timelines.
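One way to derive the effective-dated windows described above is to compute each rule's validity range from its versioned rows. This sketch assumes an upstream model name (`stg_coa_mapping_rules`) and uses Snowflake date functions; column names follow the mart example in this write-up:

```sql
-- models/intermediate/int_coa_scd2_mapping.sql
-- Illustrative sketch: build valid_from / valid_to windows from versioned mapping rows
with versions as (

    select * from {{ ref('stg_coa_mapping_rules') }}  -- assumed upstream model

),

windowed as (

    select
        source_system,
        source_account_number,
        consolidated_account_number,
        account_category,
        effective_from as valid_from,
        -- each rule is valid until the day before its next version starts;
        -- the latest version stays open-ended
        coalesce(
            dateadd(day, -1, lead(effective_from) over (
                partition by source_system, source_account_number
                order by effective_from
            )),
            to_date('9999-12-31')
        ) as valid_to
    from versions

)

select
    *,
    (valid_to = to_date('9999-12-31')) as is_current
from windowed
```

The `valid_from`/`valid_to` pair is what enables point-in-time lookups: a posting dated anywhere inside the window resolves to the rule that was in force on that date, regardless of later rule changes.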
Chart of accounts mapping sounds boring. It's not. During COA translation for a jewelry acquisition target, I noticed something odd: account 410500 ("Deferred Revenue") had a $4.8M balance, but there were no reversals in the subsequent periods. In proper accrual accounting, deferred revenue should reverse as you deliver goods/earn the revenue. I dug into the journal entries. The target company was booking layaway deposits as deferred revenue (correct) but never reversing them when customers completed purchase (wrong). Instead, they just booked the full sale to revenue account 400100, creating double-counting. The $4.8M in "deferred revenue" wasn't deferred at all—it was phantom liability from completed transactions years ago. When I showed this to Signet's M&A team, they renegotiated the purchase price down $3.2M (67% recovery of the error). The target's CFO admitted: "We've always done it that way. Nobody ever questioned it." COA mapping isn't just technical work—it's financial forensics. Every account tells a story. Sometimes that story is fraud. More often, it's just bad process. Either way, it's your job to read between the debits and credits.
Developed automated trial balance validation routines as dbt tests and mart models for financial statement generation (P&L, Balance Sheet, Cash Flow Statement), and pro-forma scenario modeling. Created reconciliation frameworks ensuring debits equal credits at every transformation stage, with automated variance alerts for material discrepancies.
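A debit/credit check of this kind can be expressed as a dbt singular test: a query that must return zero rows for the test to pass. Model and column names follow the mart example in this write-up, and the $0.01 threshold matches the exception workflow described later; both are sketches rather than the exact production test:

```sql
-- tests/assert_debits_equal_credits.sql
-- Singular dbt test: fails if any entity/period nets debits vs. credits
-- outside a $0.01 materiality tolerance
select
    company_code,
    posting_period,
    sum(case when posting_indicator = 'DEBIT'  then amount_gaap_adjusted else 0 end) as total_debits,
    sum(case when posting_indicator = 'CREDIT' then amount_gaap_adjusted else 0 end) as total_credits
from {{ ref('fct_financial_statements') }}
group by company_code, posting_period
having abs(
    sum(case when posting_indicator = 'DEBIT'  then amount_gaap_adjusted else 0 end)
  - sum(case when posting_indicator = 'CREDIT' then amount_gaap_adjusted else 0 end)
) > 0.01
```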
Designed and deployed interactive Power BI dashboards for M&A due diligence showing financial performance trends, EBITDA analysis, working capital movements, and acquisition impact scenarios. Configured DirectQuery connections to Snowflake for real-time financial reporting with row-level security for deal confidentiality.
Implemented comprehensive testing framework with automated reconciliation checks comparing dbt model outputs against client-provided trial balances. Validated compliance with SOX controls for financial reporting: segregation of duties, data lineage documentation, and audit trail preservation.
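At its core, the trial-balance comparison reduces to a per-account variance check. A minimal sketch in plain Python; the account numbers, amounts, and $0.01 tolerance are illustrative assumptions, and the production version runs as SQL against Snowflake:

```python
# Minimal sketch of the reconciliation described above: compare per-account
# balances from the dbt mart against a client-provided trial balance and
# flag variances above a materiality threshold. Account numbers, amounts,
# and the $0.01 tolerance are illustrative assumptions.
from decimal import Decimal

def reconcile(model_balances: dict, client_tb: dict,
              tolerance: Decimal = Decimal("0.01")) -> list:
    """Return (account, model_amount, client_amount, variance) tuples for
    accounts that differ by more than the tolerance; empty means reconciled."""
    exceptions = []
    for account in sorted(set(model_balances) | set(client_tb)):
        model_amt = model_balances.get(account, Decimal("0"))
        client_amt = client_tb.get(account, Decimal("0"))
        variance = model_amt - client_amt
        if abs(variance) > tolerance:
            exceptions.append((account, model_amt, client_amt, variance))
    return exceptions

if __name__ == "__main__":
    model = {"400100": Decimal("1250.00"), "650000": Decimal("-310.50")}
    client = {"400100": Decimal("1250.00"), "650000": Decimal("-315.50")}
    for row in reconcile(model, client):
        print(row)  # only account 650000 exceeds the tolerance
```

Using `Decimal` rather than floats matters here: binary floating point cannot represent amounts like 0.01 exactly, and a reconciliation framework must not introduce rounding noise of its own.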
Created detailed technical documentation, user guides, and dbt docs site enabling the Accounting Advisory team to independently leverage the dbt/Snowflake platform for future M&A engagements. Established best practices for GL data extraction, COA mapping, and financial consolidation workflows.
-- models/staging/stg_sap_gl.sql
-- Cleans and normalizes raw SAP FICO GL entries from Snowflake source table
with source as (

    select * from {{ source('sap_fico', 'bseg_bkpf_joined') }}

),

renamed as (

    select
        -- Keys
        bukrs as company_code,
        belnr as document_number,
        gjahr as fiscal_year,
        buzei as line_item_number,
        hkont as gl_account_number,
        kostl as cost_center,
        prctr as profit_center,
        vbund as trading_partner,

        -- Amounts
        cast(dmbtr as numeric(18,2)) as amount_local_currency,
        waers as currency_code,
        case
            when shkzg = 'S' then 'DEBIT'
            when shkzg = 'H' then 'CREDIT'
        end as posting_indicator,

        -- Dates
        cast(budat as date) as posting_date,
        cast(bldat as date) as document_date,
        poper as posting_period,

        -- Metadata
        usnam as created_by,
        cpudt as created_date,
        'SAP_FICO' as source_system,
        current_timestamp() as dbt_loaded_at

    from source
    where gjahr >= year(current_date()) - 2  -- rolling 3-year window
        and bstat = ''                       -- exclude statistical postings

)

select * from renamed
-- models/marts/finance/fct_financial_statements.sql
-- Consolidated GL fact table: COA-mapped, GAAP-adjusted, SCD Type 2 COA rules
with gl_combined as (

    select * from {{ ref('int_sap_gl_normalized') }}
    union all
    select * from {{ ref('int_oracle_gl_normalized') }}

),

coa_mapping as (

    -- SCD Type 2: the point-in-time join below picks the mapping rule active
    -- on each posting date, so historical rows resolve against historical
    -- rules (no is_current filter here -- that would drop superseded rules)
    select * from {{ ref('int_coa_scd2_mapping') }}

),

gaap_policies as (

    select * from {{ ref('seed_gaap_policy_adjustments') }}

),

mapped as (

    select
        gl.company_code,
        gl.document_number,
        gl.fiscal_year,
        gl.posting_date,
        gl.posting_period,

        -- Resolved COA hierarchy
        coalesce(coa.consolidated_account_number, gl.gl_account_number) as consolidated_account,
        coalesce(coa.account_category, 'UNMAPPED') as account_category,
        coalesce(coa.financial_statement, 'UNKNOWN') as financial_statement,
        coalesce(coa.reporting_line, 'REVIEW') as reporting_line,

        gl.amount_local_currency as amount_original,
        gl.posting_indicator,
        gl.trading_partner,
        gl.source_system,

        -- GAAP policy adjustments via seed file rules
        case
            when gp.adjustment_type = 'ASC_606_DEFERRED'
                and gl.gl_account_number between '400000' and '499999'
                and gl.posting_indicator = 'CREDIT'
                then gl.amount_local_currency * -1
            when gp.adjustment_type = 'DEPRECIATION_SL'
                and gl.gl_account_number between '600000' and '699999'
                -- harmonize accelerated depreciation to straight-line (0.67 policy ratio)
                then gl.amount_local_currency * 0.67
            else gl.amount_local_currency
        end as amount_gaap_adjusted,

        gp.adjustment_type,
        gp.adjustment_reason

    from gl_combined gl
    left join coa_mapping coa
        on gl.gl_account_number = coa.source_account_number
        and gl.source_system = coa.source_system
        and gl.posting_date between coa.valid_from and coa.valid_to
    left join gaap_policies gp
        on gl.gl_account_number between gp.account_range_low and gp.account_range_high
        and gl.source_system = gp.source_system

),

final as (

    select
        {{ dbt_utils.generate_surrogate_key([
            'company_code', 'document_number', 'fiscal_year'
        ]) }} as transaction_key,
        *,
        current_timestamp() as dbt_updated_at
    from mapped

)

select * from final
dbt data quality tests and custom SQL assertions identified $12.6M in cumulative accounting errors across M&A transactions — errors that would have survived traditional pre-close reviews.
Automated dbt transformation pipeline reduced financial consolidation from 6 weeks to 5 days — a 92% time reduction enabling faster deal closes and competitive advantage in auction processes.
Supported due diligence and integration for $450M+ in successful mergers, acquisitions, and divestitures.
Achieved 99.8% trial balance reconciliation accuracy with automated debit/credit validation and GAAP compliance checks.
Created 350+ chart of accounts mapping rules translating diverse ERP structures to consolidated financial reporting standards.
60+ dbt models (staging, intermediate, marts) with 200+ automated tests deployed to Snowflake production environment with CI/CD via GitHub Actions.
During due diligence on one of the larger acquisitions, the dbt data quality tests flagged an anomaly: a custom dbt-expectations test on fct_financial_statements reported that total ASC 606 deferred revenue credits for the target’s SaaS division had been booked to a product revenue account for 6 consecutive months. The amount was material — $4.8M in revenue that had been recognized upfront under the target’s legacy policy but should have been deferred over the contract period under ASC 606. Without the dbt test, this would have sailed through the P&L unnoticed. With it, I traced the issue through the lineage graph: stg_sap_gl → int_coa_mapped → fct_financial_statements, identified the exact journal entry series responsible, and produced a restatement memo with every affected dbt model highlighted. Signet’s M&A team presented the finding to the seller. The result: a $3.2M purchase price renegotiation. That’s what a 200+ test dbt project looks like in practice — not just "data quality" but a financial control that directly protects deal economics.
Explore the comprehensive M&A due diligence metrics and financial consolidation results through this interactive Power BI dashboard mockup. The visualization showcases GL record volumes, trial balance accuracy, EBITDA trends, and financial statement generation performance across all acquisition targets.
Visualizes the 15.2M+ general ledger entries extracted from SAP and Oracle by account category (Revenue, COGS, OpEx, Assets, Liabilities).
Displays automated debit/credit reconciliation achieving 99.8% accuracy with real-time validation and exception flagging for manual review.
Shows financial consolidation timeline reduction from 6 weeks to 5 days with dbt incremental model performance and Snowflake query optimization metrics.
Tracks accounting policy adjustments (ASC 606, ASC 842), depreciation harmonization, and inter-company elimination validation with full audit trail.
Monitors acquisition target financial performance trends including EBITDA margins, working capital ratios, DSO, and inventory turnover across quarterly periods.
Interactive scenario analysis showing acquisition impact on consolidated P&L, Balance Sheet, and Cash Flow with synergy adjustments and integration assumptions.
Note: This Power BI dashboard connects directly to Snowflake providing real-time visibility into M&A financial consolidation. Row-level security ensures deal confidentiality. Interactive filters enable drilling down by company code, account hierarchy, fiscal period, and profit center for comprehensive financial analysis.
Three weeks into due diligence for an $85M acquisition, the deal team was drowning in Excel files. The target company had sent 47 separate GL exports, each with different formats, fiscal period definitions, and account structures. Finance team burnout was real. Then I demoed the Power BI dashboard connected to our Snowflake mart layer. Click: P&L by quarter, instantly. Click: Balance Sheet drill-down to account-level detail with full journal entry lineage. Click: EBITDA trend with adjustments for one-time items. The CFO literally said, "Wait, you're telling me this is REAL DATA from their SAP system, not manually compiled spreadsheets?" That dashboard turned chaos into clarity. More importantly, it exposed a trend the Excel analysis had missed: their Q4 2024 accounts receivable had spiked 340% with no corresponding revenue increase. Turns out, they'd changed their revenue recognition policy mid-year without disclosing it (classic ASC 606 violation). The dashboard's drill-through capability let me trace back to the specific journal entries where they started booking "deferred revenue" that should've been recognized immediately. Signet renegotiated the purchase price down $6.2M. The dashboard wasn't just reporting—it was financial forensics in real-time.
Acquisition targets used diverse ERP systems (SAP FICO, Oracle Financials, legacy AS/400) with completely different chart of accounts structures, fiscal calendars, and accounting policies.
Built comprehensive COA mapping framework in dbt with 350+ translation rules. Created Python scripts for each ERP system’s extraction logic (SAP RFC for BSEG/BKPF tables, Oracle SQL*Net for GL_BALANCES). Implemented GAAP policy harmonization including ASC 606/842 adjustments as dbt models with version-controlled business rules.
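The SAP extraction path described above typically goes through `RFC_READ_TABLE`, which returns rows as fixed-width strings alongside a FIELDS table of offsets and lengths. A minimal parsing sketch in Python; the hard-coded field list below is an illustrative assumption, whereas a real extractor would build it from the returned FIELDS metadata:

```python
# Illustrative sketch: slice fixed-width rows of the kind RFC_READ_TABLE
# returns into dicts keyed by SAP field name. The (name, offset, length)
# entries below are assumptions for illustration; production code derives
# them from the FIELDS metadata returned with the extract.
FIELDS = [
    ("BUKRS", 0, 4),    # company code
    ("BELNR", 4, 10),   # document number
    ("GJAHR", 14, 4),   # fiscal year
    ("DMBTR", 18, 15),  # amount in local currency
]

def parse_row(raw: str) -> dict:
    """Slice one fixed-width row into a dict of stripped field values."""
    return {name: raw[offset:offset + length].strip()
            for name, offset, length in FIELDS}

if __name__ == "__main__":
    sample = "100001000001232024        1250.00"
    print(parse_row(sample))
```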
Processing 15M+ financial records while maintaining the fundamental accounting principle: debits must equal credits at every aggregation level (document, account, period, entity).
Implemented automated validation framework in dbt/Snowflake SQL checking debit/credit equality at staging, intermediate, and mart layers. Built exception management workflow flagging discrepancies > $0.01 for immediate analyst review. Achieved 99.8% pass rate through rigorous data quality rules.
Avoiding incorrect mappings while catching true matches when fuzzy-matching source GL accounts to the standard chart of accounts.
Developed a weighted scoring system with configurable thresholds, implemented a manual review queue for edge cases (surfaced as UNMAPPED/REVIEW lines in the marts), and iteratively refined the mapping rules based on validation results.
Coordinating updates to production COA mapping rules mid-engagement without disrupting in-flight deal reporting.
Developed strict COA mapping versioning in dbt with change control approval workflow via GitHub PRs. Implemented A/B testing for new accounting rules on historical data before applying to production. Created comprehensive mapping documentation reviewed by target company CFOs and Signet’s controllers.
Due diligence windows of 10-14 days requiring complete financial analysis, consolidation, and executive-ready reporting under extreme time constraints.
Created reusable dbt project templates for common M&A workflows (GL staging models, COA mapping intermediates, trial balance tests, financial statement marts). Built Power BI dashboard templates with pre-configured DAX measures for EBITDA, working capital, and KPIs. Reduced setup time per new engagement from 3 weeks to 2 days through modular dbt architecture.
Here’s the uncomfortable truth about M&A financial analysis: most accounting/finance professionals can read a P&L but can’t extract it programmatically from SAP. Most data engineers can query databases but don’t know the difference between accrual and cash basis accounting. I learned to be both. When a CFO asked, “Why does your EBITDA calculation differ from ours by $2.3M?”, I didn’t just show him the math — I opened the dbt docs, navigated the lineage graph from fct_financial_statements back to stg_sap_gl, showed him the COA mapping rule for account range 650000–659999 (depreciation), and traced the $2.3M variance to 47 specific GL entries with the straight-line vs. accelerated depreciation difference fully documented as a dbt test assertion. He was speechless. That moment taught me: financial forensics requires accounting fluency + data interrogation skills. dbt gave me the traceability platform. Accounting knowledge gave me the questions to ask. SQL gave me the tools to find answers buried in 15M+ GL records. The lesson: don’t just learn to code. Learn to code financial logic. Learn to speak both languages — SQL and GAAP. That combination makes you irreplaceable in M&A advisory work.
This M&A Financial Data Integration project demonstrated successful automation of complex financial consolidation processes, processing 15.2M+ general ledger records from heterogeneous ERP systems with 99.8% trial balance accuracy. The dbt-based solution combined a governed model architecture, Python-automated SAP GL extraction into Snowflake, sophisticated COA mapping via dbt staging/intermediate models, and GAAP policy harmonization to deliver production-ready financial reporting in 5 days versus the prior 6-week manual process.
The project showcased deep expertise in analytics engineering, ERP system integration (SAP FICO, Oracle Financials), accounting principles (GAAP, ASC 606/842), and M&A due diligence workflows while delivering $450M+ in successful transactions with $12.6M in identified financial discrepancies enabling price adjustments. The reusable dbt project framework (60+ models, 200+ tests) now serves as Signet’s standard platform for all Accounting Advisory M&A engagements, proving that financial analysts who combine accounting domain expertise with dbt analytics engineering create outsized business impact.
If I could go back to day one, I'd tell myself three things. First: the "boring" decisions matter most. Materialization strategy, incremental model configs, clustering keys: I initially rushed through these as implementation details, and they ended up being the difference between a 6-hour run and a 45-minute run. Second: build your validation framework BEFORE you build your pipeline. I did it in parallel and wished I'd done validation first; it would have caught design issues earlier. Third: the discoveries that made this project memorable, the GL health scores and the phantom deferred revenue, came from exploratory queries I ran "just to understand the data better." Budget time for curiosity. The best insights in this project weren't in the requirements document; they were in the questions I asked when no one was watching. Data tells stories. Make time to listen.