
What Senior Partners Really Look for in Analyst-Level Slides
When a senior partner reviews an analyst's slides, they're not reading them the same way a thoughtful peer would. They're scanning—quickly, with pattern recognition built from hundreds of decks—for specific signals that tell them whether the analyst can be trusted to deliver.
This evaluation takes three to five minutes per section and produces a judgment that significantly affects how the analyst's career develops at the firm. Yet most analysts don't know exactly what signals the partner is reading—and as a result, they invest in the wrong things and miss the cues that matter.
This guide is the partner's evaluation decoded.
The First 10 Seconds: The Scan Test
Partners don't read consulting decks sequentially from slide 1 to slide 30. They scan. They look at slide titles, glance at charts, read the executive summary. The scan happens before the detailed review, and it establishes the partner's initial impression of the deck's quality.
What partners assess in the first 10 seconds per slide:
- Does the title tell me something or just describe what the slide covers?
- Does the visual make sense at a glance, or do I need to read the body text to understand it?
- Does the slide feel like it belongs to the same deck as the previous slides (consistent formatting, same visual style)?
Slides that pass the scan test (clear action titles, self-evident visuals, consistent formatting) create the initial impression that the analyst is in command of the material. Slides that fail it create the opposite impression, "this needs work," before the partner has read a single bullet.
Signal 1: The Quality of Slide Titles
Slide titles are the highest-visibility element of any consulting slide, and partners read them first. The quality of an analyst's titles reveals the clarity of their analytical thinking.
What partners are looking for:
A title that states the finding completely: "EMEA Procurement Costs Are 35% Above Industry Benchmarks Due to Fragmented Vendor Relationships" not "EMEA Procurement Analysis."
The signals partners draw:
- A strong action title signals: "This analyst has made an analytical judgment about what this slide proves. They understand the role of the slide in the deck's argument."
- A topic label title signals: "This analyst hasn't committed to a finding. Either the analysis is incomplete, or they're avoiding stating a conclusion they're not confident in."
- An over-qualified title ("Although data is limited, there appears to be some potential for…") signals: "This analyst is hedging. They haven't drawn a clear conclusion from the data."
Partners who see a section of topic-label titles will ask the engagement manager: "Have we actually made these analytical judgments, or are we still in research mode?" That's a damaging question at the review stage.
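The distinction between topic labels and action titles can even be approximated mechanically. The sketch below is a rough illustrative heuristic, not a firm standard: it assumes slide titles have already been extracted as plain strings, and it flags short noun phrases that lack a finite verb or a quantified claim.

```python
# Illustrative heuristic only: flag slide titles that read like topic
# labels ("EMEA Procurement Analysis") rather than action titles that
# state a finding. The verb list and thresholds are assumptions.

def looks_like_topic_label(title: str) -> bool:
    """Rough check: topic labels tend to be short noun phrases with
    no finite verb and no quantified claim."""
    words = title.split()
    if len(words) <= 4:  # e.g. "Market Overview", "Competitive Landscape"
        return True
    # Action titles usually carry a finite verb or a number.
    verb_markers = {"is", "are", "shows", "drives", "requires",
                    "creates", "costs", "exceeds", "remains", "lags"}
    has_verb = any(w.lower().strip(",.") in verb_markers for w in words)
    has_number = any(ch.isdigit() for ch in title)
    return not (has_verb or has_number)

titles = [
    "EMEA Procurement Analysis",
    "EMEA Procurement Costs Are 35% Above Industry Benchmarks",
]
flags = [looks_like_topic_label(t) for t in titles]  # [True, False]
```

No heuristic replaces analytical judgment, but running one over a draft deck is a quick way to spot sections still stuck in "research mode" before a partner does.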
Signal 2: The Chart-Title Alignment
Partners check whether the chart on the slide actually proves what the title claims. This is a quick check: does the visual evidence support the claim in the title?
Common misalignments that partners notice:
- Title claims X is "significantly higher" than benchmark; chart shows X is 5% higher (not significant)
- Title claims a "trend" is developing; chart shows two data points (not a trend)
- Title claims the finding is "driven by" a specific factor; the chart shows correlation, not causation
These misalignments tell the partner two things: the analyst has either drawn conclusions not supported by the data, or they've used imprecise language without understanding its implications. Both are analytically concerning.
The fix: Before finalizing any slide, read the title and ask: "Does the chart prove this exactly? Is the language precise—is 'higher' actually higher, is 'trend' actually a trend, is 'driven by' actually causal?"
Signal 3: Formatting Consistency and Attention to Detail
Partners use formatting quality as a proxy for broader attention to detail. They know this is an imperfect proxy, but it's the most visible signal available in a three-minute review.
What partners notice:
- Slides with inconsistent font sizes (some titles 18pt, some 16pt)
- Charts with non-standard colors (standard Excel colors instead of the firm's palette)
- Bullet points that mix sentence structures (full sentences and fragments in the same list)
- Numbers presented with inconsistent decimal precision (€42M on one slide, €41.7M on the adjacent slide)
Each of these is a small issue. But a section with five such issues signals a pattern of inattention—the analyst has not checked their work carefully before submission.
The standard: A partner who finds zero formatting inconsistencies in a section review concludes that the analyst checks their work. A partner who finds five concludes they don't. This perception matters at performance review time.
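Some of these consistency slips are mechanically checkable before submission. The sketch below is a minimal illustration of one of them, mixed decimal precision in euro figures (€42M vs €41.7M); the slide text list and the €…M pattern are assumptions for the example, not a real deck format.

```python
# Illustrative sketch: detect euro figures quoted with mixed decimal
# precision across slide text, one of the consistency slips above.
import re

def decimal_precisions(texts):
    """Return the set of decimal-place counts used in EUR ...M figures."""
    precisions = set()
    for text in texts:
        for figure in re.findall(r"€(\d+(?:\.\d+)?)M", text):
            _, _, frac = figure.partition(".")
            precisions.add(len(frac))  # 0 for "42", 1 for "41.7"
    return precisions

slides = ["Savings of €42M identified", "Baseline spend was €41.7M"]
inconsistent = len(decimal_precisions(slides)) > 1  # True: 0 and 1 decimals
```

A five-minute pass with checks like this will not catch every formatting issue, but it removes exactly the kind of visible slip a partner reads as inattention.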
Signal 4: The Logical Chain
Partners assess whether the slides in a section build a logical argument or merely present a collection of data points.
The test: Read only the slide titles in the section, in sequence. Do they tell a coherent story? Does each title build on the previous one, moving the argument forward?
What strong logical chains look like:
Title 1: "The Market Is Growing at 12% CAGR, Creating an Attractive Entry Opportunity"
Title 2: "Three Competitor Segments Are Competing for This Market, With Different Value Propositions"
Title 3: "Our Client's Current Offering Aligns With the Mid-Market Segment, Which Shows Fastest Growth"
Title 4: "Entering the Mid-Market Requires Three Specific Capability Investments"
Each title follows from the previous; the section is building an argument.
What weak logical chains look like:
Title 1: "Market Overview"
Title 2: "Competitive Landscape"
Title 3: "Capability Assessment"
Title 4: "Recommendations"
Each title is a topic, not a finding. The argument isn't built; it's implied.
Partners who see the first type think "this analyst understands how to build an argument." Partners who see the second type think "this analyst has organized information but hasn't analyzed it."
Signal 5: The Completeness of the Analytical Work
Partners assess whether the analysis is complete or whether the analyst has stopped before drawing the full conclusion.
The most common completeness failure: Slides that show data accurately but don't state the implication. A chart showing three years of declining customer retention is not a complete slide; the complete slide states: "Declining Retention Is Costing €8M Annually in Revenue—70% Concentrated in the SMB Segment."
Partners call this "so what-ing the data"—drawing the analytical implication of the finding, not just presenting the finding. It's the skill that separates analytical work from consulting work.
The signal it sends: An analyst who consistently "so whats" their data is thinking like a consultant. An analyst who consistently stops at the data is thinking like a researcher. The consulting firm needs consultants.
Signal 6: The Absence of Surprises in Partner Review
The meta-signal that partners interpret most positively: an analyst whose work produces no surprises in the partner review.
No structural surprises (the section does what the ghost deck promised). No analytical surprises (the findings are consistent with the engagement hypothesis, or if they're not, the analyst has flagged and explained the divergence). No formatting surprises (the slides meet standards without requiring correction).
An analyst who produces no surprises in their second partner review on an engagement is building a reputation as reliable. Partners promote reliable analysts; they coach unreliable ones.
The Evaluation Summary
| Signal | Strong indicator | Weak indicator |
|---|---|---|
| Slide titles | Action sentences stating the finding | Topic labels describing the content |
| Chart-title alignment | Visual precisely proves the title claim | Visual doesn't match the title claim |
| Formatting | Consistent, meets standards, no errors | Inconsistent, deviations from standards |
| Logical chain | Titles tell a coherent sequential argument | Titles are independent topic labels |
| Analytical completeness | Data + implication ("so what") | Data only (no implication stated) |
| Partner review surprises | No structural or content surprises | Structural or content surprises requiring revision |
Partners rarely articulate these signals explicitly—they experience them as a general sense of quality and trust. But the signals are real, and they compound over time into the reputation that drives promotion decisions.