How to Create Due Diligence Reports From Excel Data and General Ledger Exports
Financial due diligence reports are rarely built from a single pristine source. On most engagements, the team receives a mix of Excel workbooks, PDF management accounts, ERP exports, and general ledger (GL) detail that must be reconciled, normalized, and translated into a coherent story about earnings quality, working capital, and net debt.
This article treats Excel-based inputs and general ledger data as two common starting points, then describes how both paths converge on the same analytical spine and the same deliverable standards.
Start With the End State
Before touching data, define what the report must prove and how it will be reviewed. A strong financial due diligence report typically follows a predictable structure: executive summary, Quality of Earnings (QoE), net working capital (NWC), net debt, cash flow, and supporting appendices with schedules and limitations. Aligning your working files with that structure early prevents rework when partner review begins. For section-level guidance, see our thinking on due diligence report templates and the broader financial due diligence checklist.
The operational goal is simple: every headline number in the report should tie to validated source data, with an audit trail that a second reviewer can follow without opening ten unrelated files.
Path A: Creating the Report From Excel Data
Excel is often the container for diligence inputs even when the underlying system is an ERP. Targets send trial balances by period, management reporting packs, covenant reporting extracts, and analyst-built bridges. Treat Excel as a staging format, not the system of record.
1. Inventory what you actually received
Catalog every file by entity, period, currency, and purpose (statutory reporting, management reporting, tax basis, carve-out perimeter). Note whether numbers are rounded, calendarized, or mapped to a non-standard chart of accounts. This inventory becomes the scope appendix in the final report.
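To make the inventory step concrete, here is a minimal Python sketch of a machine-checkable catalog. The file names, entities, and periods are purely illustrative, and a real engagement would track more attributes (currency basis, rounding, carve-out perimeter); the point is that a structured inventory lets you detect coverage gaps mechanically rather than by eye.

```python
# Hypothetical inventory of received files; the fields mirror the catalog
# described above (entity, period, currency, purpose). All values are
# illustrative, not from a real engagement.
inventory = [
    {"file": "tb_opco_2022.xlsx", "entity": "OpCo", "period": "FY2022",
     "currency": "EUR", "purpose": "statutory"},
    {"file": "tb_opco_2023.xlsx", "entity": "OpCo", "period": "FY2023",
     "currency": "EUR", "purpose": "statutory"},
    {"file": "mgmt_pack_2023.xlsx", "entity": "HoldCo", "period": "FY2023",
     "currency": "EUR", "purpose": "management"},
]

def coverage_gaps(inventory, entities, periods):
    """Return (entity, period) pairs for which no file was received."""
    received = {(f["entity"], f["period"]) for f in inventory}
    return sorted((e, p) for e in entities for p in periods
                  if (e, p) not in received)

gaps = coverage_gaps(inventory, ["OpCo", "HoldCo"], ["FY2022", "FY2023"])
# In this example, HoldCo has no FY2022 file — a gap to raise with the target.
```

The list of gaps maps directly onto the scope appendix: anything you did not receive should appear there as a stated limitation.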
2. Normalize structure before you normalize numbers
Different tabs and workbooks rarely share column order, sign conventions, or date formats. Standardize to a single canonical layout: entity, account code, account description, period end date, debit, credit, net movement, and opening or closing balance as required by your model. Consistent structure is a prerequisite for reliable consolidation and for catching import errors early. Data quality issues caught in week one are cheap; issues caught in partner review are expensive.
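A minimal sketch of the standardization step, assuming each source tab has already been read into rows of dictionaries (the source column names and the example row are hypothetical). The idea is a per-source field map onto one canonical layout, with net movement always derived the same way:

```python
def to_canonical(row, field_map):
    """Rename source columns to the canonical layout and derive
    net movement as debit minus credit, regardless of the source
    workbook's own column names or order."""
    out = {canon: row[src] for src, canon in field_map.items()}
    out["net_movement"] = float(out["debit"]) - float(out["credit"])
    return out

# Example: one source tab with its own column names (illustrative).
source_row = {"Ent": "OpCo", "Acct": "4000", "Desc": "Revenue",
              "PerEnd": "2023-12-31", "DR": "0", "CR": "1250.00"}
field_map = {"Ent": "entity", "Acct": "account_code",
             "Desc": "account_description", "PerEnd": "period_end",
             "DR": "debit", "CR": "credit"}

canonical = to_canonical(source_row, field_map)
# canonical["net_movement"] == -1250.0: a credit-side movement under the
# "debit minus credit" sign convention.
```

Keeping one `field_map` per source file also documents, in one place, exactly how each workbook was interpreted — which is precisely the audit trail a second reviewer needs.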
3. Reconcile Excel to something authoritative
Excel inputs should reconcile upward to audited financial statements, filed statutory accounts, or bank covenant reporting where available. If reconciliation is impossible because the target only provided management Excel, document the gap explicitly in the report and tighten procedures elsewhere (for example, deeper GL testing on high-risk accounts).
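The upward reconciliation can be expressed as a simple tie-out with an explicit tolerance; the figures below are illustrative. A small tolerance absorbs rounding in management-prepared workbooks, while anything beyond it is logged as a difference to explain:

```python
def reconcile(detail_total, authoritative_total, tolerance=1.0):
    """Compare a summed Excel input to an audited or statutory figure.
    Returns (ties, difference); differences within tolerance are treated
    as rounding, anything larger must be investigated and documented."""
    diff = round(detail_total - authoritative_total, 2)
    return abs(diff) <= tolerance, diff

# Illustrative: three quarterly revenue figures from a management workbook
# tied to an audited full-period figure.
trial_balance_revenue = sum([410_250.00, 389_740.50, 402_110.25])
audited_revenue = 1_202_100.00

ties, diff = reconcile(trial_balance_revenue, audited_revenue)
# Here the inputs tie within tolerance, with a 0.75 rounding difference.
```

Recording the `diff` even when the tie passes preserves the audit trail: the report can state not just that numbers reconcile, but by how much they differ and why.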
4. Build analytical tables that map to report sections
Once trial balance or management data is clean, construct the analytical tables that feed each report chapter: revenue and margin trends for QoE, monthly working capital components for the NWC bridge, debt schedules for net debt. Keep calculation logic traceable. Hard-coded plugs are the fastest way to lose review credibility.
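As a sketch of how a report-facing table can be derived from the canonical data rather than hard-coded, the example below builds a simple revenue trend (the kind that feeds a QoE chapter) by aggregating net movement by period over an account-code prefix. The account codes, periods, and amounts are illustrative:

```python
from collections import defaultdict

def trend_table(rows, account_prefix):
    """Sum net movement by period for all accounts under a code prefix,
    e.g. every '4xxx' revenue account feeding the QoE revenue trend."""
    out = defaultdict(float)
    for r in rows:
        if r["account_code"].startswith(account_prefix):
            out[r["period_end"]] += r["net_movement"]
    return dict(out)

# Illustrative canonical rows (credits negative under debit-minus-credit).
rows = [
    {"account_code": "4000", "period_end": "2023-01-31", "net_movement": -100.0},
    {"account_code": "4100", "period_end": "2023-01-31", "net_movement": -50.0},
    {"account_code": "4000", "period_end": "2023-02-28", "net_movement": -120.0},
    {"account_code": "5000", "period_end": "2023-01-31", "net_movement": 80.0},
]
revenue = trend_table(rows, "4")
# {"2023-01-31": -150.0, "2023-02-28": -120.0} — the 5000-series cost
# account is excluded by the prefix filter.
```

Because the table is computed from the validated rows, every figure in it is traceable back to source — the opposite of a hard-coded plug.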
5. Know where Excel stops working
On multi-entity deals with long histories and granular schedules, spreadsheets become slow, fragile, and difficult to version. That is not a judgment on analysts; it is a property of the tool at scale. Understanding those limits helps you decide when to move ingestion, mapping, and validation into a more controlled environment while keeping Excel for judgment-driven analysis. For a fuller discussion, see Excel limitations in due diligence.
Path B: Creating the Report From General Ledger Detail
When you receive true GL detail (transaction-level or summarized journal activity), you gain depth: you can test revenue and expense seasonality, investigate account spikes, and support adjustments with journal references. You also inherit volume and complexity.
1. Confirm export completeness and period coverage
Validate that the GL extract spans the full analysis window for every in-scope entity, including opening balances if you are building movement-based bridges. Confirm that inactive accounts, system accounts, and year-end reclassifications are included or intentionally excluded, and that the exclusion rules are consistent across periods.
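Period coverage is one of the few checks that can be fully mechanized. A minimal sketch, assuming monthly periods keyed as "YYYY-MM" (the window and the set of observed periods below are illustrative):

```python
from datetime import date

def month_ends(start, end):
    """All monthly period keys from start to end inclusive, as 'YYYY-MM'."""
    months, y, m = [], start.year, start.month
    while (y, m) <= (end.year, end.month):
        months.append(f"{y:04d}-{m:02d}")
        m += 1
        if m > 12:
            y, m = y + 1, 1
    return months

def coverage_check(gl_periods, start, end):
    """Return months in the analysis window with no GL activity —
    a sign of an incomplete export, or of a dormant period that
    needs an explicit, documented exclusion rule."""
    observed = set(gl_periods)
    return [p for p in month_ends(start, end) if p not in observed]

gaps = coverage_check({"2023-01", "2023-02", "2023-04"},
                      date(2023, 1, 1), date(2023, 4, 30))
# 2023-03 is missing from this illustrative extract.
```

Running the same check per entity catches the common failure where one subsidiary's export silently stops a quarter early.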
2. Tie the GL to the trial balance and financial statements
Roll the GL forward to the trial balance, and tie the trial balance to the face of the financial statements. This is the same reconciliation discipline as in Path A, but with more moving parts. Discrepancies often reveal mapping errors, missing entities, or cutoff issues at period boundaries.
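The roll-forward itself is a per-account identity: opening balance plus GL movement should equal the trial-balance closing balance. A minimal sketch with illustrative account codes and balances:

```python
def rollforward_breaks(opening, movements, closing, tolerance=0.01):
    """Check, per account, that opening balance plus GL movement equals
    the trial-balance closing balance; return the accounts that break
    and the size of each break."""
    breaks = {}
    for acct, close in closing.items():
        expected = opening.get(acct, 0.0) + movements.get(acct, 0.0)
        if abs(expected - close) > tolerance:
            breaks[acct] = round(close - expected, 2)
    return breaks

# Illustrative balances (liabilities negative under a signed convention).
opening   = {"1200": 500.0, "2100": -300.0}
movements = {"1200": 150.0, "2100": -25.0}
closing   = {"1200": 650.0, "2100": -320.0}

breaks = rollforward_breaks(opening, movements, closing)
# Account 2100 is off by 5.00 — a mapping, missing-journal, or cutoff
# issue to chase before any analysis is built on it.
```

The size and sign of each break is itself diagnostic: a break equal to one journal's amount usually means a missing or duplicated posting; a break at a period boundary usually means cutoff.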
3. Map the chart of accounts to your analytical framework
GL data is only analytically useful once accounts roll up into categories that support QoE, NWC, and EBITDA adjustments. That mapping step is foundational and time-consuming when done manually across hundreds or thousands of accounts. For why this matters economically, read the cost of manual GL mapping; for methodology, see chart of accounts mapping.
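At its core, the mapping step is a lookup from account code to analytical category, applied as a rollup. The sketch below uses a hypothetical map and balances; the one design choice worth copying is the explicit UNMAPPED bucket, which makes mapping gaps surface in review instead of silently dropping out of the totals:

```python
# Hypothetical mapping from GL account codes to analytical categories.
COA_MAP = {
    "4000": "Revenue", "4100": "Revenue",
    "5000": "COGS",
    "1200": "Trade receivables",  # NWC component
}

def rollup(balances, coa_map, unmapped="UNMAPPED"):
    """Aggregate account-level balances into analytical categories,
    routing anything outside the map into an explicit UNMAPPED bucket."""
    out = {}
    for acct, amount in balances.items():
        cat = coa_map.get(acct, unmapped)
        out[cat] = out.get(cat, 0.0) + amount
    return out

rolled = rollup({"4000": -900.0, "4100": -100.0, "5000": 400.0, "9999": 7.0},
                COA_MAP)
# {"Revenue": -1000.0, "COGS": 400.0, "UNMAPPED": 7.0} — the 9999 balance
# is visible as unmapped rather than lost.
```

Versioning `COA_MAP` per period (and diffing versions) is also the cheapest defense against the silent mapping changes flagged under Common Failure Modes below.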
4. Use GL depth where it changes conclusions
Not every account needs transaction-level scrutiny. Prioritize: revenue recognition and cut-off, payroll and related costs, large and unusual journals, related-party activity, and balance sheet accounts that drive NWC. The report should show judgment: depth where risk is high, summary presentation where risk is low.
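Prioritization can be partly mechanized with a crude screen before any human review. The sketch below flags journals above a materiality threshold or whose descriptions contain review-worthy keywords; the journal data, threshold, and keyword list are illustrative, and a real screen would add criteria such as posting date and preparer:

```python
def screen_journals(journals, materiality,
                    keywords=("adjust", "reclass", "manual")):
    """Flag journals above a materiality threshold, or whose description
    contains a keyword suggesting a manual or corrective entry."""
    flagged = []
    for j in journals:
        big = abs(j["amount"]) >= materiality
        wordy = any(k in j["description"].lower() for k in keywords)
        if big or wordy:
            flagged.append({**j, "reason": "amount" if big else "description"})
    return flagged

journals = [
    {"ref": "J-101", "amount": 250.0, "description": "Monthly payroll"},
    {"ref": "J-102", "amount": 98_000.0, "description": "Revenue true-up"},
    {"ref": "J-103", "amount": 400.0, "description": "Manual reclass of freight"},
]
flags = screen_journals(journals, materiality=50_000)
# J-102 is flagged on amount, J-103 on its description; routine payroll
# passes through to summary-level treatment.
```

A screen like this does not replace judgment — it concentrates it, so transaction-level hours go to the journals most likely to change a conclusion.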
5. Preserve evidence next to the adjustment
When you propose an EBITDA adjustment or an NWC normalization backed by GL evidence, store the supporting journal references, descriptions, and amounts alongside the adjustment schedule. Partners and clients increasingly expect diligence to be defensible, not merely directional.
Where the Two Paths Meet
Whether you start from Excel schedules or GL exports, the same middle layer appears:
- Single analytical model of the business (consistent periods, currencies, and perimeter)
- Reconciled ties to authoritative financial statements where they exist
- Mapped accounts that roll into QoE, NWC, and net debt logic
- Documented adjustments with clear categorization and evidence
- A report narrative that matches the numbers in the workbook
GL-first work tends to produce stronger evidence for adjustments. Excel-first work tends to be faster when the target is small and data is clean. Most real engagements blend both: management reporting in Excel, substantiation in GL detail.
A Practical End-to-End Workflow
The sequence below works across both inputs:
- Scope and perimeter — entities, periods, GAAP basis, carve-out rules.
- Data acquisition and inventory — what arrived, from whom, in what format.
- Ingestion and standardization — consistent fields, signs, and period keys.
- Validation — trial balance ties, intercompany eliminations where relevant, FX logic.
- Mapping — COA to analytical categories; document mapping changes over time.
- Core analytics — QoE bridge, NWC monthly analysis, net debt identification.
- Cross-workstream consistency — the same event should not be treated differently in QoE and NWC unless the difference is intentional and explained.
- Drafting — write the report from validated exhibits, not from memory.
- Independent review — a second person retraces key ties and adjustments.
- Finalization — version control, limitations, and appendix completeness.
If your practice is standardizing this sequence across deals, the principles of standardizing deal workflows apply directly.
Common Failure Modes
- Mixing reporting bases without labeling (management vs statutory vs tax).
- Silent mapping changes between periods that break trend analysis.
- Consolidating before validating entity-level balances.
- Adjustment narratives disconnected from the supporting schedule rows.
- Over-reliance on Excel versioning on large teams; see workpaper management discipline.
Closing Thought
Excel data and general ledger exports are not competing approaches; they are different doors into the same room. The quality of the due diligence report depends less on which door you entered and more on whether reconciliation, mapping, and evidence were handled with consistent rigor. Where volume and complexity make spreadsheets fragile, financial due diligence software and automation are best used to protect the integrity of the data layer, while analysts remain responsible for judgment, client communication, and the final narrative.