Sub-Ledger Analysis in Due Diligence: Going Below the Trial Balance
Trial balance data provides the foundation for due diligence analysis. It shows account balances by period and enables the team to build the quality of earnings (QoE), net working capital (NWC), and net debt analyses. But trial balance data has inherent limitations. It shows balances, not transactions. It shows totals, not components. For many analytical questions in due diligence, the answer lies below the trial balance, in the sub-ledger detail.
Sub-ledger analysis extracts and analyzes transaction-level data from the target's ERP system. This level of detail is essential for several key due diligence workstreams.
When Sub-Ledger Analysis Is Necessary
Revenue Disaggregation
The trial balance shows total revenue by account code. To assess revenue quality, the team needs revenue disaggregated by customer, product, geography, or contract type. This data lives in the accounts receivable or sales sub-ledger (SAP SD module, Oracle AR, or the equivalent in the target's ERP system).
Specific questions that require sub-ledger data:
- What is the customer concentration? Do the top 10 customers represent 30 percent or 80 percent of revenue?
- How much revenue comes from recurring contracts versus one-time sales?
- What is the revenue trend by customer or product segment?
- Are there unusual transactions (related party sales, intercompany revenue) mixed into the reported revenue?
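The concentration question above can be answered in a few lines once the invoice-level data is extracted. A minimal sketch, assuming an extract with customer and net revenue columns (the column names and figures are illustrative, not from any specific ERP):

```python
import pandas as pd

# Hypothetical invoice-level extract: one row per sales invoice,
# with illustrative customer and net revenue columns.
invoices = pd.DataFrame({
    "customer": ["Alpha", "Alpha", "Beta", "Gamma", "Delta", "Beta"],
    "net_revenue": [500.0, 300.0, 400.0, 150.0, 50.0, 100.0],
})

# Aggregate to customer level and rank by revenue.
by_customer = (invoices.groupby("customer")["net_revenue"]
               .sum()
               .sort_values(ascending=False))

# Concentration: share of total revenue held by the top N customers.
top_n = 2
top_share = by_customer.head(top_n).sum() / by_customer.sum()
print(f"Top {top_n} customers: {top_share:.0%} of revenue")
# → Top 2 customers: 87% of revenue
```

The same customer-level aggregate feeds the trend and segment questions: once invoices are rolled up by customer (or product, or geography), year-over-year comparisons are straightforward.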
Receivables Analysis
Working capital analysis at the trial balance level shows total trade receivables. Sub-ledger analysis reveals the aging profile, customer concentration, and collectibility:
- Aging analysis: What proportion of receivables is current versus 30, 60, 90, or 120+ days past due?
- Concentration: Are receivables concentrated in a few large customers?
- Write-off history: What is the historical bad debt experience, and are provisions adequate?
- Unusual items: Are there credit balances, intercompany receivables, or non-trade items in the receivables ledger?
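The aging analysis above reduces to bucketing days past due from the open-item file. A sketch under assumed column names and an illustrative as-of date:

```python
import pandas as pd

# Hypothetical open-item AR extract: due date and open amount per item.
# Column names and the as-of date are illustrative assumptions.
as_of = pd.Timestamp("2024-12-31")
open_items = pd.DataFrame({
    "customer": ["Alpha", "Beta", "Gamma", "Alpha"],
    "due_date": pd.to_datetime(["2024-12-15", "2024-10-01",
                                "2025-01-10", "2024-08-20"]),
    "open_amount": [200.0, 350.0, 500.0, 120.0],
})

days_past_due = (as_of - open_items["due_date"]).dt.days

# Standard aging bands; anything not yet due falls in "current".
buckets = pd.cut(days_past_due,
                 bins=[-10_000, 0, 30, 60, 90, 120, 10_000],
                 labels=["current", "1-30", "31-60", "61-90",
                         "91-120", "120+"])

aging = open_items.groupby(buckets, observed=False)["open_amount"].sum()
print(aging)
```

Crossing the same buckets with the customer column answers the concentration question; credit balances show up naturally as negative open amounts in the extract.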
Payables Analysis
Similarly, the accounts payable sub-ledger enables:
- Aging analysis: What proportion of payables is within terms versus overdue?
- Supplier concentration: Dependency on key suppliers
- Payment pattern changes: Has the target been stretching payables to manage cash flow?
- Reverse factoring identification: Are certain suppliers being paid through supply chain finance programs?
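Payment-pattern stretching in particular is easy to surface from cleared AP items: compare actual payment date to due date per period. A sketch with illustrative column names and dates:

```python
import pandas as pd

# Hypothetical cleared AP items: invoice due date and actual payment date.
# Supplier names, columns, and dates are illustrative.
paid = pd.DataFrame({
    "supplier": ["S1", "S2", "S1", "S3", "S2", "S1"],
    "due_date": pd.to_datetime(["2023-03-10", "2023-06-01", "2023-09-15",
                                "2024-02-01", "2024-06-20", "2024-11-05"]),
    "paid_date": pd.to_datetime(["2023-03-12", "2023-06-05", "2023-09-30",
                                 "2024-03-01", "2024-08-01", "2024-12-20"]),
})

paid["days_late"] = (paid["paid_date"] - paid["due_date"]).dt.days
paid["year"] = paid["paid_date"].dt.year

# A rising average days-late figure across periods suggests the target
# has been stretching payables to manage cash.
print(paid.groupby("year")["days_late"].mean())
```

The same cleared-item data, grouped by supplier, also flags the reverse-factoring candidates: suppliers paid with unusually uniform timing through a single paying entity warrant follow-up questions.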
Inventory Analysis
The inventory sub-ledger provides item-level detail that supports:
- Slow-moving analysis: What proportion of inventory has not turned in 6, 12, or 24 months?
- Provision assessment: Is the target's obsolescence provision adequate?
- Cost structure: Are inventory cost records consistent with the target's stated valuation method (FIFO, weighted average, standard cost)?
- Intercompany margin: For groups with intercompany inventory transfers, what is the unrealized margin in closing stock?
Data Extraction Challenges
Sub-ledger data extraction is substantially more complex than trial balance extraction. The data volumes are larger (thousands or millions of transaction records versus hundreds of account balances), and the data structures are more varied.
ERP-Specific Structures
Each ERP system stores sub-ledger data differently:
- SAP: Sales data in VBRK/VBRP tables, receivables in BSEG/BSID/BSAD, inventory in MSEG/MARD
- Oracle: AR transactions in RA_CUSTOMER_TRX, AP invoices in AP_INVOICES, inventory in MTL_ONHAND_QUANTITIES
- NetSuite: Transaction lines with custom fields and segments
- Mid-market systems (Sage, Exact, Cegid): Varied proprietary structures
Extracting clean, usable data from these systems requires knowledge of the specific ERP data structures and the ability to handle large datasets efficiently.
Data Volume
A mid-market company processing 50,000 invoices per year generates 200,000+ AR transaction records over a four-year analysis period. Add inventory movements, AP transactions, and journal entries, and the total dataset can reach millions of records.
Processing this volume in Excel is impractical. Excel's worksheet limit is 1,048,576 rows, but the practical ceiling is far lower: formula recalculation on even a few hundred thousand rows becomes prohibitively slow, well before the row limit is reached.
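One common way around the memory and recalculation problem is to stream the extract in chunks and aggregate as you go, so the full dataset is never held at once. A minimal sketch; a small in-memory CSV stands in for a multi-million-row export, and the column names are illustrative:

```python
import io
import pandas as pd

# Stand-in for a large AR transaction export (illustrative columns).
csv_data = io.StringIO(
    "customer,amount\n"
    "Alpha,100\nBeta,200\nAlpha,50\nGamma,75\nBeta,25\n"
)

# Read the file in fixed-size chunks and fold each chunk's subtotals
# into a running total, so memory use stays bounded.
totals = {}
for chunk in pd.read_csv(csv_data, chunksize=2):  # tiny chunks for the demo
    for cust, amt in chunk.groupby("customer")["amount"].sum().items():
        totals[cust] = totals.get(cust, 0) + amt

print(totals)  # revenue per customer, computed chunk by chunk
```

In practice the `chunksize` would be in the hundreds of thousands of rows, and the same pattern works for any aggregation that can be computed incrementally.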
Data Quality
Sub-ledger data amplifies the data quality challenges present in trial balance data. Transaction-level records may have:
- Missing customer or product codes
- Inconsistent naming conventions (the same customer appears under multiple names)
- Memo fields with unstructured text that requires interpretation
- Currency and date format inconsistencies
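The duplicate-customer problem in particular usually yields to rule-based normalization before any manual mapping. A sketch; the suffix list and cleanup rules are illustrative assumptions, and real engagements typically layer a reviewed mapping table on top:

```python
import re

# Illustrative list of legal-form suffixes to strip when matching names.
LEGAL_SUFFIXES = {"LTD", "LIMITED", "INC", "LLC", "GMBH", "SA", "SARL", "BV"}

def normalize_name(raw: str) -> str:
    """Collapse variant spellings of the same entity to one key."""
    # Drop punctuation, uppercase, and remove legal-form suffixes.
    tokens = re.sub(r"[^A-Za-z0-9 ]", " ", raw).upper().split()
    tokens = [t for t in tokens if t not in LEGAL_SUFFIXES]
    return " ".join(tokens)

variants = ["Acme Ltd.", "ACME LIMITED", "Acme, Inc", "acme ltd"]
print({normalize_name(v) for v in variants})  # → {'ACME'}
```

Aggregating on the normalized key instead of the raw name prevents one customer's revenue from being split across several apparent counterparties in the concentration analysis.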
Integrating Sub-Ledger Analysis with the Broader Diligence
The sub-ledger analysis must reconcile to the trial balance. Total AR per the sub-ledger should match the trade receivables balance in the trial balance. Total revenue per the sales sub-ledger should reconcile to the revenue accounts in the GL.
When these reconciliations do not balance (which is common), the team must investigate the differences. Common causes include:
- Timing differences between sub-ledger postings and GL postings
- Manual journal entries posted directly to the GL that do not flow through the sub-ledger
- Consolidation adjustments or intercompany eliminations
- Reclassification entries that move balances between accounts
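The reconciliation itself is mechanical; the value is in quantifying the gap the team must then explain. A sketch with illustrative account codes and balances:

```python
import pandas as pd

# Illustrative GL balance and sub-ledger open items; in practice both
# come from the normalized extracts.
trial_balance = pd.Series({"1200_trade_AR": 1_170.0})  # per the GL
sub_ledger_ar = pd.DataFrame({
    "customer": ["Alpha", "Beta", "Gamma"],
    "open_amount": [500.0, 350.0, 300.0],
})

sl_total = sub_ledger_ar["open_amount"].sum()      # 1,150
gap = trial_balance["1200_trade_AR"] - sl_total    # 20 left to explain
print(f"GL {trial_balance['1200_trade_AR']:,.0f} vs sub-ledger "
      f"{sl_total:,.0f}: difference {gap:,.0f}")
```

Running this check per account and per period at the start of the analysis, rather than at the end, localizes the manual journals and timing differences before they contaminate downstream workpapers.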
These reconciliation gaps consume analyst time and delay the analysis. Teams that have a systematic approach to data normalization handle reconciliation more efficiently because the data is structured consistently from the outset.
Making Sub-Ledger Analysis Efficient
The key to efficient sub-ledger analysis is extracting the right data at the right level of granularity. Not every engagement requires full transaction-level detail for every sub-ledger. The team should assess early in the engagement which sub-ledger analyses will be material to the conclusions and extract accordingly.
For example, customer-level revenue data (aggregated from invoice-level records) may provide sufficient insight for the revenue quality assessment without requiring analysis of every individual invoice. Receivables aging at the customer level may be sufficient without line-item detail.
This targeted approach reduces data processing time while preserving the analytical value of sub-ledger analysis. It requires experienced judgment about which questions the sub-ledger data needs to answer, applied early enough in the engagement to guide the data extraction process.