QC Checklists Senior Consultants Use Before Client Delivery

2026-03-13 · by Poesius Team

The difference between a first-draft consulting deck and a client-ready one is almost never the analysis. It's the 47 small things that slip through when the team is racing to finish under deadline: the footnote that references the wrong data source, the chart whose axis doesn't start at zero, the executive summary that summarizes a finding that was revised two days ago.

Senior consultants at top firms don't trust memory to catch these things. They use checklists.

Not generic editorial checklists—structured, category-specific QC protocols that check the exact failure modes that consulting deliverables produce under deadline pressure. This guide documents those checklists in full.


Why Checklists Matter in Consulting

Airline pilots use pre-flight checklists not because they can't remember the steps—they've performed them hundreds of times—but because the cost of missing one step is catastrophic, and human memory under pressure is unreliable.

The same logic applies to consulting deliverables. An engagement manager who has built 50 decks knows instinctively how to build good slides. But at 11pm the night before a partner review, working through the 40th revision, their attention is not uniformly distributed across every slide. The checklist catches what exhausted attention misses.

The practical evidence: consulting firms with strong QC cultures consistently receive fewer partner review comments, fewer client revision requests, and fewer post-delivery corrections. In most teams' experience, the checklist investment pays for itself within the first few engagements.


The Four-Category QC Framework

Senior consultants typically organize their QC checks into four categories:

  1. Narrative coherence — Does the deck tell a clear, logical story?
  2. Visual consistency — Does the deck look like it was produced by one team?
  3. Analytical integrity — Are all numbers, sources, and claims correct and consistent?
  4. Logistics — Are all administrative elements correct?

Each category catches a different class of error. Running them in sequence—narrative first, logistics last—is more efficient than checking randomly, because narrative errors sometimes require slide-level changes that can reintroduce visual or analytical errors.


Category 1: Narrative Coherence Checklist

The title read-through. Read only the slide titles, in order, without looking at any slide content. Do the titles tell a coherent story? Does each title connect logically to the previous one? Can a reader understand the deck's central argument from titles alone?

This check is frequently skipped and consistently catches problems. Slide titles that sounded fine in isolation often reveal narrative gaps when read in sequence.

The governing message test. What is the single sentence that captures the deck's central finding or recommendation? Is this sentence stated explicitly in the executive summary slide? Is it reflected accurately in the section titles? Does every section contribute to proving this sentence?

The section transition check. At each section break, write one sentence: "This section showed X. The next section argues Y because Z." If you can't write that sentence, the transition is missing.

The executive summary accuracy check. The executive summary slide should accurately reflect the findings in the full deck. After the deck is complete, re-read the executive summary. Does it accurately summarize all key findings? Has any key finding changed since the executive summary was last updated?

The recommendation landing check. Does the recommendation section follow logically from the analysis? Does each recommendation connect explicitly to the analytical finding that supports it? Is there any recommendation that appears without analytical support in the main body?


Category 2: Visual Consistency Checklist

Font audit. Open each slide and check the font in: slide title, body text, chart labels, footnotes, callout boxes. Are all of these using the correct fonts and sizes as specified in the standards document? Flag any slide with a deviation.

The most common font deviations:

  • Text pasted from external sources (Word documents, emails, other PowerPoint files) that retains its original formatting
  • Resized title boxes where text has been reduced to fit
  • Charts imported from Excel with non-standard font formatting
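
A first pass at this audit can be scripted once the fonts have been read off each slide, for example by walking text runs with a library such as python-pptx. The sketch below works on already-extracted (slide, element, font) tuples; the standard fonts are hypothetical placeholders, not a real style guide.

```python
# Minimal font-audit sketch. STANDARD is an illustrative style guide;
# the run data is assumed to come from walking shapes/paragraphs/runs
# in a library such as python-pptx.
STANDARD = {"title": "Georgia", "body": "Arial", "footnote": "Arial"}

def font_audit(runs, standard=STANDARD):
    """Return every (slide_no, element, font) tuple whose font deviates
    from the standard for that element type. A font of None usually
    means the run inherits from the theme, which also deserves a look."""
    return [(slide, elem, font) for slide, elem, font in runs
            if elem in standard and font != standard[elem]]
```

For example, font_audit([(4, "body", "Calibri"), (4, "title", "Georgia")]) flags the pasted-in Calibri run on slide 4 and passes the compliant title.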

Color audit. Check that all charts, callout boxes, and highlighting elements use the specified color palette. Pay particular attention to:

  • Chart bar and line colors (especially multi-series charts where an analyst may have used default colors)
  • Highlight colors in comparison tables
  • Section header background colors

Slide element alignment. On five randomly selected slides, check that major elements (title box, content area, footnote area) are correctly aligned to the grid. Slides with content that has drifted off-grid are visually distinguishable from correctly aligned slides even when the viewer can't articulate why.

Title style consistency. Are all slide titles in the same format—either all argument sentences (what the slide proves) or all topic labels (what the slide covers)? Mixed title styles within a deck signal inconsistent authorship.

Bullet point formatting. Check that bullet formatting is consistent across sections. Common deviations:

  • Mixed use of full sentences and sentence fragments in the same deck
  • Inconsistent indentation of sub-bullets
  • Inconsistent bullet symbols (filled circles, dashes, squares mixed)

Logo and client name check. Verify that all logo instances are the current version and correct placement, and that the client name is spelled correctly and consistently throughout the deck. These errors are disproportionately embarrassing given how easy they are to prevent.


Category 3: Analytical Integrity Checklist

Numbers cross-reference. For every number that appears in more than one slide—particularly numbers in the executive summary that are also in the analysis sections—verify that the number is identical in all instances. Decimals rounded differently, currency symbols mixed (€ vs. EUR), and "~" used inconsistently are common culprits.
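
The mechanical half of this check, finding the same figure spelled two different ways, can be scripted. A minimal sketch, assuming slide text has already been extracted into a dict of slide number to text, with an illustrative currency pattern:

```python
import re
from collections import defaultdict

# Illustrative pattern: currency figures like "€40M", "EUR 40M", "$1.2B".
NUM_RE = re.compile(r"(?:€|\$|EUR|USD)\s?([\d.,]+)\s?([MBK])?", re.I)

def cross_reference(slide_texts):
    """Group currency figures by normalised value and report any value
    that appears with more than one raw spelling across the deck."""
    spellings = defaultdict(set)   # normalised value -> raw spellings seen
    slides = defaultdict(list)     # normalised value -> slide numbers
    for slide_no, text in slide_texts.items():
        for m in NUM_RE.finditer(text):
            norm = m.group(1) + (m.group(2) or "").upper()
            spellings[norm].add(m.group(0))
            slides[norm].append(slide_no)
    return {norm: (sorted(raws), slides[norm])
            for norm, raws in spellings.items() if len(raws) > 1}
```

Run on {1: "EBITDA impact of €40M", 12: "an impact of EUR 40M"}, the check reports the value 40M spelled two ways on slides 1 and 12. The flagged pairs still need human review: two spellings are not always an error.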

Financial assumption consistency. Identify the three or four key assumptions that drive the financial analysis (growth rate, margin assumption, discount rate, FX rate). Check that these assumptions are consistent across every slide that references them. In multi-analyst decks, assumption inconsistencies between sections are extremely common and extremely damaging in partner reviews.

Data source audit. Every chart and table should have a footnote indicating its data source. Check:

  • Is there a source footnote on every chart and table?
  • Are all footnoted sources accurate (the chart reflects the cited source)?
  • Is the citation format consistent?

Estimate labeling. Every estimated number should be clearly labeled as an estimate, not presented as a fact. Common consulting labels: "~€40M," "estimated," "based on team analysis," or "directional." Check that every estimate carries a label and that every unlabeled number presented as fact is actually backed by an external source.

Model check. If the deck includes financial model outputs (NPV, IRR, payback period), verify the inputs in the model match the assumptions stated in the deck. The most common disconnect: the model was updated with revised assumptions, but the deck still shows the prior version's assumptions.

Proofreading. Read every text element on every slide for spelling, grammar, and factual accuracy. Yes, every slide. The one you skip is the one with the client's name spelled wrong.


Category 4: Logistics Checklist

Page numbers. Are all slides numbered? Are the numbers correct and sequential? Do appendix slides continue from main deck slide numbers, or are they separately numbered (as indicated in the table of contents)?
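
The sequential-numbering part of this check is scriptable once the printed numbers have been read off each slide (the extraction itself, e.g. from footer placeholders, is assumed here):

```python
def check_page_numbers(printed, start=1):
    """Compare printed slide numbers against the expected sequence
    beginning at `start`; return (position, printed, expected) for each
    mismatch. Pass start=len(main_deck)+1 to verify an appendix that
    continues the main deck's numbering."""
    expected = range(start, start + len(printed))
    return [(pos, got, exp)
            for pos, (got, exp) in enumerate(zip(printed, expected), start=1)
            if got != exp]
```

For a deck printed [1, 2, 4, 4], the check reports slide 3 showing "4" where "3" was expected, which is exactly the duplicated-then-skipped pattern that slide insertions produce.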

Table of contents alignment. Does the table of contents accurately reflect the section names and slide numbers in the final deck? This is almost always slightly out of date in final deliverables—section names and slide numbers change throughout production.

Version and date. Does the deck footer or cover slide show the correct date and version number? Is the file name correct per the firm's naming convention?

Appendix references. Every appendix slide should be referenced from the main deck: "See Appendix A for full sensitivity analysis." Check that all appendix references are accurate and that all referenced appendices exist.
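
Both directions of this check can be sketched with a simple set comparison, assuming the main-deck text and appendix slide titles have been extracted, and assuming the "Appendix A/B/C" lettering convention shown above:

```python
import re

APPENDIX_RE = re.compile(r"Appendix ([A-Z])")

def audit_appendix_refs(main_text, appendix_titles):
    """Cross-check both directions: appendix letters referenced in the
    main deck but missing from the appendix, and appendix slides that
    are never referenced."""
    referenced = set(APPENDIX_RE.findall(main_text))
    existing = set()
    for title in appendix_titles:
        m = APPENDIX_RE.match(title)
        if m:
            existing.add(m.group(1))
    return {"missing": sorted(referenced - existing),
            "unreferenced": sorted(existing - referenced)}
```

An unreferenced appendix is sometimes intentional backup material, so that half of the output is a prompt for judgment rather than an automatic failure.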

Print formatting. Open the deck in print preview and check: Does all content fit on the slide without overflow? Do footnotes render at a readable size? Are there any formatting issues that appear in print but not on screen? (This is particularly important for decks that will be physically printed for the client.)

Hidden slides. Check for hidden slides that should not be in the final deck. Working drafts often accumulate hidden slides containing earlier versions, draft analyses, or internal notes that should be removed before client delivery.

File size. Large PowerPoint files (over 20MB) can cause issues when emailed or uploaded to client portals. If the file is large, check for high-resolution images that can be compressed without visible quality loss.
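
Because a .pptx file is a ZIP archive, the heaviest internal parts can be listed with the standard library alone; oversized images typically show up under ppt/media/. A minimal sketch:

```python
import zipfile

def largest_parts(pptx_path, top=5):
    """Return (name, size_in_bytes) for the largest internal parts of a
    .pptx, which is a ZIP archive; oversized media sits under ppt/media/."""
    with zipfile.ZipFile(pptx_path) as z:
        parts = sorted(z.infolist(), key=lambda i: i.file_size, reverse=True)
    return [(p.filename, p.file_size) for p in parts[:top]]
```

If one or two ppt/media/ entries dominate the total, compressing or downscaling those specific images usually brings the file under the threshold without touching anything else.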


The Senior Partner Pre-Meeting Check

In addition to the full QC checklist, senior partners typically run a focused pre-meeting check immediately before entering the presentation room:

  1. The executive summary read. Read the executive summary slide in full. Does it accurately state the central finding? Is it the first thing a skimming executive will see?
  2. The recommendation clarity check. Can you state the primary recommendation in one sentence? If not, the recommendation section may need strengthening.
  3. The "hard question" prep. Identify the two or three findings most likely to receive pushback from the client. Confirm the supporting analysis is ready in the appendix.
  4. The technical check. Confirm that the presentation file opens correctly on the presentation device, that slide animations work as intended, and that the presenter view is set up if needed.

This five-minute check has prevented more last-minute crises than any other preparation activity.


Building QC Into the Engagement Timeline

The best QC happens progressively throughout the engagement, not only at the end. Engagement managers who run mini-QC checks at each major milestone—after the ghost deck is built, after each work stream's first draft is complete, after the integration meeting—find fewer problems at the final review.

Milestones for structured QC checks:

  • Ghost deck review: Narrative coherence only—does the planned structure tell a coherent story?
  • Section draft reviews: Analytical integrity and visual consistency for each section as it's completed
  • Integration draft review: Full narrative coherence and cross-section analytical integrity
  • Pre-partner review QC: Full four-category checklist
  • Post-partner review final check: Confirm all partner comments have been addressed and no new errors have been introduced

Progressive QC distributes the review burden throughout the engagement and prevents the final QC session from becoming an all-night emergency. The final QC is a confirmation, not a discovery process.


The AI-Assisted QC Opportunity

The QC checks that require human judgment—narrative coherence, analytical integrity—cannot be fully automated. But several visual consistency and logistics checks are candidates for automation through AI tools.

Font consistency, color palette adherence, slide element alignment, and page number accuracy are rule-based checks that modern AI tools can perform faster and more reliably than manual review. Consulting teams that integrate automated visual consistency checking into their workflow free senior time for the judgment-intensive QC tasks.

Tools like Poesius, which enforce brand and style standards at the generation layer, reduce the visual consistency QC burden by preventing deviations from occurring in the first place—rather than catching them after the fact.


Get Poesius for Free

  • Create professional presentations 5x faster than manual formatting

  • Get custom-designed slides built from the ground up, not templates

  • Start free with no credit card required