1) Preparation (Pre-Workshop)

Goal: Ensure readiness and alignment before the session.

Inputs

  • Scope definition, stakeholder list, available documentation (org charts, process maps, KPIs, system architecture).

Activities

  • Define objectives and assessment criteria (strategy, process, technology, people, data, governance).
  • Select facilitation tools (Miro, Mural, whiteboards, templates).
  • Circulate pre-reads and survey key stakeholders.

Outputs

  • Agenda, participant roles, baseline knowledge pack.

Talking Points

“Before we get everyone in the room, we need to be clear on what success looks like. Preparation is about alignment, not overload.”

“Think of this as setting the stage: the right people, the right context, the right expectations.”

3 Questions to Ask

  • “Why is this assessment needed now?”
  • “Who must be present to make this successful?”
  • “What materials already exist that we can leverage?”

Common Mistakes

  • No stakeholder alignment before kickoff.
  • Overly complex pre-read materials.
  • Tech or logistics issues left unresolved until the day of.

Stratenity Guidance

Use Stratenity’s one-page “Purpose & Scope” template to align executives. Pre-reads should be lightweight, visual, and designed for clarity.

2) Kick-Off & Context Setting

Goal: Align on purpose, scope, and expected outcomes.

Activities

  • Introduce team and participants.
  • Present workshop objectives and structure.
  • Reconfirm assessment scope, success metrics, and boundaries.

Outputs

  • Shared understanding of goals, working norms, and focus areas.

Talking Points

“Let’s start by clarifying why we’re here today and what we need to achieve together.”

“Everyone in this room plays a role in shaping the picture of today’s state; this isn’t just observation, it’s co-creation.”

3 Questions to Ask

  • “How would you define success for this workshop?”
  • “What 2–3 challenges are most urgent from your perspective?”
  • “Are there any areas we should intentionally set aside?”

Common Mistakes

  • Skipping introductions to save time.
  • Not clearly restating scope and outcomes.
  • Allowing side discussions to dilute focus too early.

Stratenity Guidance

Use the Stratenity visual agenda format. Invite executives to restate goals in their own words; alignment is stronger when it’s spoken, not assumed.

3) Current State Discovery

Goal: Capture existing processes, systems, roles, and challenges.

Activities

  • Process Mapping: Break down end-to-end processes (e.g., procure-to-pay, hire-to-retire).
  • Systems & Data: Identify tools, platforms, integrations, and data quality.
  • Org & Roles: Clarify responsibilities, governance, and accountability.
  • Pain Points: Collect challenges, bottlenecks, and inefficiencies.

Outputs

  • Draft process maps, system landscape, stakeholder pain-point log.

Talking Points

“Right now, we’re not solving problems; we’re capturing them. Let’s focus on what’s really happening day-to-day.”

“This is the chance to surface the hidden workarounds, the handoffs that get messy, and the steps that consume time but add little value.”

3 Questions to Ask

  • “Where do you lose the most time in your process?”
  • “Which systems are most essential and which are least reliable?”
  • “What manual workarounds have become ‘normal’?”

Common Mistakes

  • Jumping to solutioning too early.
  • Capturing processes at either too high or too low a level.
  • Failing to involve cross-functional perspectives.

Stratenity Guidance

Use Stratenity’s process mapping toolkit: one swimlane per function, annotate with pain points, and capture shadow processes explicitly.
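
As a minimal sketch, the swimlane-plus-annotation structure can be captured as data before it is drawn; the step names, functions, and pain points below are purely illustrative and not part of the Stratenity toolkit itself.

  from dataclasses import dataclass, field

  @dataclass
  class Step:
      name: str
      function: str                          # the swimlane this step sits in
      pain_points: list[str] = field(default_factory=list)
      shadow_process: bool = False           # flag workarounds explicitly

  # One swimlane per function; annotate steps with pain points as they surface.
  procure_to_pay = [
      Step("Raise purchase request", "Operations"),
      Step("Approve request", "Finance",
           pain_points=["Approvals batched weekly, causing delays"]),
      Step("Re-key PO into legacy ERP", "Finance",
           pain_points=["Duplicate manual entry"], shadow_process=True),
      Step("Receive goods and match invoice", "Operations",
           pain_points=["Match failures escalated by email"]),
  ]

  # Draft view of the map: steps grouped by swimlane.
  for function in sorted({s.function for s in procure_to_pay}):
      print(function, [s.name for s in procure_to_pay if s.function == function])

Keeping shadow processes as an explicit flag makes workarounds harder to lose during later synthesis.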

4) Deep-Dive & Evidence Gathering

Goal: Validate current state with facts, not just perception.

Activities

  • Review KPIs and performance data.
  • Conduct breakout sessions by domain (finance, HR, IT, operations).
  • Surface “shadow processes” or workarounds.
  • Compare current practices against leading practices/benchmarks.

Outputs

  • Evidence pack (KPIs, performance gaps, benchmark comparison).

Talking Points

“Perceptions are important, but evidence is what builds credibility. Let’s anchor what we’ve heard in real numbers.”

“Our goal here is to identify gaps not just from stories, but from data and benchmarks that highlight where you stand.”

3 Questions to Ask

  • “Which metrics do you track today and which ones do you wish you had?”
  • “Where do current KPIs conflict with lived experience?”
  • “What external benchmarks do you compare yourselves against?”

Common Mistakes

  • Accepting anecdotal evidence without validation.
  • Using metrics without context or baselines.
  • Forgetting to segment data by function or geography.

Stratenity Guidance

Always produce a Stratenity “Evidence Pack”: one-page KPI dashboards plus benchmark comparatives, with clear caveats on data quality.
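
A small illustrative sketch of the benchmark comparison behind such a dashboard; the KPI names, current values, and benchmark figures here are hypothetical.

  # Hypothetical KPIs: (current value, benchmark value, whether higher is better).
  kpis = {
      "invoice_cycle_time_days": (14.0, 6.0, False),
      "first_time_match_rate_pct": (72.0, 90.0, True),
      "cost_per_invoice_usd": (11.50, 5.00, False),
  }

  # Flag KPIs that trail the benchmark, with the size of the gap.
  for name, (current, benchmark, higher_is_better) in kpis.items():
      gap = (benchmark - current) if higher_is_better else (current - benchmark)
      status = "GAP vs. benchmark" if gap > 0 else "at or above benchmark"
      print(f"{name}: current={current}, benchmark={benchmark} -> {status} ({gap:+.1f})")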

5) Synthesis & Thematic Analysis

Goal: Organize findings into coherent insights.

Activities

  • Cluster issues into themes (e.g., data quality, role clarity, tech debt, compliance).
  • Use MECE categorization (mutually exclusive, collectively exhaustive).
  • Map pain points to business impact (time, cost, risk, customer experience).

Outputs

  • Thematic summary, prioritized issue list.

Talking Points

“Now we connect the dots: instead of a laundry list of issues, let’s group them into themes that tell a clear story.”

“Themes help us move from isolated problems to systemic insights, and they make prioritization actionable.”

3 Questions to Ask

  • “Which of these themes resonates most with your experience?”
  • “What surprises you about how issues cluster together?”
  • “If you had to pick one theme to address first, which would it be?”

Common Mistakes

  • Overcomplicating categories beyond stakeholder understanding.
  • Failing to link issues to measurable impact.
  • Leaving themes too broad to act on.

Stratenity Guidance

Use Stratenity’s MECE clustering template. Each theme should have 3 elements: description, supporting evidence, quantified impact.
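
A minimal sketch of that three-element structure, assuming a plain dictionary per theme; the themes, evidence, and impact figures are illustrative only.

  # Each theme carries exactly three elements: description, evidence, quantified impact.
  themes = [
      {"theme": "Data quality",
       "description": "Inconsistent master data across ERP and CRM",
       "evidence": ["27% duplicate customer records", "Monthly manual reconciliation"],
       "impact_usd": 240_000},
      {"theme": "Role clarity",
       "description": "Approval accountability split across three teams",
       "evidence": ["Average of four handoffs per approval"],
       "impact_usd": 90_000},
  ]

  # Rank themes by quantified impact to make prioritization concrete.
  for t in sorted(themes, key=lambda t: t["impact_usd"], reverse=True):
      print(f"{t['theme']}: ${t['impact_usd']:,} - {t['description']}")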

6) Validation & Alignment

Goal: Confirm accuracy with stakeholders.

Activities

  • Share synthesized findings with the group.
  • Facilitate discussion to validate, refine, and fill gaps.
  • Align on top 3–5 priority issues to address in future state design.

Outputs

  • Validated current state picture, agreed priority focus areas.

Talking Points

“This is where we check ourselves: does this picture reflect your reality?”

“Alignment doesn’t mean everyone agrees on everything; it means we have a shared baseline to move forward from.”

3 Questions to Ask

  • “What feels most accurate in this synthesis?”
  • “Where do you see gaps or missing nuance?”
  • “Which issues would you prioritize tackling first?”

Common Mistakes

  • Not giving stakeholders space to challenge findings.
  • Forcing consensus instead of alignment.
  • Leaving priorities too vague or numerous.

Stratenity Guidance

Always validate with a “Top 5 Issues” short list. Use voting or dot prioritization to drive quick alignment without extended debate.
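
A minimal sketch of how dot votes might be tallied into that shortlist; the issue names and vote counts are hypothetical.

  from collections import Counter

  # Each participant places dots (votes) against the issues they would tackle first.
  votes = ["Data quality", "Tech debt", "Data quality", "Role clarity",
           "Tech debt", "Data quality", "Compliance gaps", "Role clarity"]

  # Count dots per issue and keep the top 5 as the agreed priority focus areas.
  for rank, (issue, dots) in enumerate(Counter(votes).most_common(5), start=1):
      print(f"{rank}. {issue} ({dots} dots)")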

7) Wrap-Up & Next Steps

Goal: Close with clarity and momentum.

Activities

  • Summarize key findings and themes.
  • Outline immediate next steps (future state design, quick wins).
  • Collect feedback on the workshop.

Outputs

  • Final workshop pack (process maps, issue log, thematic summary, action items).

Talking Points

“Let’s leave this room with clarity: here’s what we discovered, and here’s where we go next.”

“Momentum matters: the value of this session comes from translating insight into immediate action.”

3 Questions to Ask

  • “What quick wins can we act on immediately?”
  • “What support do you need to move into future state design?”
  • “How should we share these findings more broadly?”

Common Mistakes

  • Ending without clear next steps or ownership.
  • Failing to capture participant feedback.
  • Letting the energy drop at the end of the day.

Stratenity Guidance

Always close with a “One Page Next Steps” handout. Include ownership, timeline, and Stratenity’s recommended path into future-state design sessions.

8) Level of Effort & Pricing Guidance

Goal: Provide industry benchmarks for effort and pricing, and outline Stratenity’s guidance for conducting current state assessments with professionalism and efficiency.

Level of Effort (Benchmark)

  • Preparation: 2–3 days for scoping, stakeholder alignment, and pre-read development.
  • Workshop Facilitation: 1–2 full days (or equivalent half-day sessions) for group sessions, breakouts, and real-time synthesis.
  • Post-Workshop Deliverables: 3–5 days to produce process maps, issue logs, evidence packs, and the assessment report.
  • Total Effort: 6–10 consulting days per engagement, adjusted by complexity and breadth of scope.
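
A rough arithmetic sketch of how that total range is derived; the complexity multiplier is a hypothetical illustration of the adjustment, not a Stratenity standard.

  # Effort ranges in consulting days (low, high) for each phase.
  phases = {
      "Preparation": (2, 3),
      "Workshop Facilitation": (1, 2),
      "Post-Workshop Deliverables": (3, 5),
  }

  low = sum(lo for lo, _ in phases.values())    # 2 + 1 + 3 = 6
  high = sum(hi for _, hi in phases.values())   # 3 + 2 + 5 = 10
  complexity = 1.0    # hypothetical multiplier; >1.0 for broad or multi-region scope
  print(f"Total effort: {low * complexity:.0f}-{high * complexity:.0f} consulting days")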

Pricing Standards & Turnaround Benchmarks

  • Top-Tier Consulting Firms: $45,000–$70,000. Engagements typically run 4–6 weeks due to larger teams, extended stakeholder interviews, and layered governance. Premium pricing reflects brand, breadth, and global benchmarks, though delivery speed may be slower.
  • Midsize Consulting Firms: $25,000–$40,000. Typical turnaround is 3–4 weeks, with leaner teams and focused methodologies. Strong balance of rigor and efficiency, but documentation and benchmarking depth may vary.
  • With Stratenity AI Tools & Accelerators: $12,000–$20,000. Turnaround averages 1–2 weeks, as AI frameworks automate process mapping, pain-point logging, and benchmarking. Consultants pass efficiency gains to clients while maintaining industry-standard quality.

Commitment to Consulting Standards

Stratenity’s industry guidance emphasizes that every consulting entity, regardless of size, should uphold the following commitments:

  • Prepare: Enter every engagement with clarity of scope, aligned stakeholders, and curated pre-reads.
  • Conduct: Facilitate sessions with discipline, neutrality, and inclusive participation.
  • Deliver: Produce structured, evidence-based reports with clear priorities and next steps.
  • Uphold Standards: Ensure outputs are professional, actionable, and consistent with industry best practices.

Stratenity Guidance: Consultants who adopt structured playbooks and AI-powered accelerators can not only reduce costs but also elevate client trust by delivering faster, more consistent, and insight-rich assessments.

9) Risk & Mitigation Considerations

Goal: Surface key risks early, quantify impact/likelihood, and define practical mitigations that inform future-state design.

Activities

  • Establish a risk taxonomy (people, process, technology, data, governance, compliance).
  • Create a risk register with impact × likelihood ratings and owners.
  • Run a cross-functional review to stress-test assumptions and identify interdependencies.
  • Define mitigations: prevention controls, detective controls, contingency plans, decision gates.
  • Translate critical risks into design constraints and acceptance criteria for the future state.

Outputs

  • Risk register with ratings, owners, and target dates.
  • Mitigation plan mapped to each high-risk item.
  • Design constraints & guardrails feeding the next phase.

Talking Points

“Every assessment reveals risk. Our aim isn’t to eliminate all of it; it’s to make risk visible, owned, and managed.”

“Let’s separate ‘inherent’ risk from ‘residual’ risk so we’re honest about what changes and what remains.”

3 Questions to Ask

  • “Which risks could materially delay or derail operations in the next 6–12 months?”
  • “Where do we lack monitoring or early-warning signals today?”
  • “Who owns the decision to accept, mitigate, or retire each high-risk item?”

Common Mistakes

  • Treating the risk list as a formality rather than an input to design decisions.
  • Assigning no clear owner or due date to mitigations.
  • Overloading with low-impact risks while underplaying systemic ones.

Stratenity Guidance

Use the Stratenity risk register template with a simple 1–5 scale for impact and likelihood, plus a “control strength” score. For the top risks, require a one-paragraph mitigation narrative and a named accountable owner. Convert the top 3–5 risks into explicit future-state design guardrails.
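
One possible way to combine those scores into a ranked register is sketched below; the risks, owners, and the residual-exposure formula are illustrative assumptions, not the actual Stratenity template.

  # Register entries: 1-5 impact, 1-5 likelihood, 1-5 control strength (higher = stronger).
  risks = [
      {"risk": "Key-person dependency in AP team", "impact": 4, "likelihood": 4,
       "control": 2, "owner": "CFO"},
      {"risk": "Unpatched legacy integration layer", "impact": 5, "likelihood": 3,
       "control": 3, "owner": "CIO"},
      {"risk": "Manual compliance reporting", "impact": 3, "likelihood": 5,
       "control": 2, "owner": "Head of Risk"},
  ]

  # Residual exposure: impact x likelihood, discounted by control strength.
  for r in risks:
      r["score"] = r["impact"] * r["likelihood"] * (6 - r["control"]) / 5

  # The top-scored risks become explicit future-state design guardrails.
  for r in sorted(risks, key=lambda r: r["score"], reverse=True)[:5]:
      print(f"{r['risk']} (owner: {r['owner']}, residual score: {r['score']:.1f})")

Whatever formula is used, the point is the same: every high-scoring risk ends up with a named owner and a mitigation narrative.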

10) Knowledge Transfer & Capability Building

Goal: Ensure insights are adopted, maintained, and improved by the client team, so value persists beyond the assessment.

Activities

  • Host a debrief session with process owners and PMO to walk through findings and artifacts.
  • Deliver editable assets (maps, logs, KPI packs) with clear versioning and ownership.
  • Run “train-the-trainer” micro-sessions for updating process maps, risk logs, and KPI dashboards.
  • Define a cadence (monthly/quarterly) for refreshing KPIs and revalidating pain points.
  • Set up a lightweight governance loop (RACI + change log) for ongoing maintenance.

Outputs

  • Editable knowledge pack (source files, templates, and standards).
  • Operating playbook for updates (who updates what, when, and how).
  • Capability uplift plan: roles, skills, and enablement milestones.

Talking Points

“This work only sticks if your team can update it without us. Our goal is to make the living version of this assessment easy to maintain.”

“We’ll hand over editable files and a simple update rhythm, so insights don’t expire the moment the deck is shared.”

3 Questions to Ask

  • “Who is accountable for keeping each artifact current (process map, KPI pack, risk log)?”
  • “What’s the simplest cadence we can commit to for refresh and review?”
  • “Which skills or access do your team members need to maintain these effectively?”

Common Mistakes

  • Delivering static PDFs with no source files or editing guidance.
  • No defined ownership or cadence for updates.
  • Overly complex tooling that the client can’t realistically support.

Stratenity Guidance

Deliver all artifacts in a structured workspace with clear foldering, naming standards, and “how-to-update” notes. Include a one-page capability uplift plan (people, tools, cadence). Favor simple, widely adopted tools unless the client already has mature modeling platforms.
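
As one illustration of such a workspace (the folder names and cadence below are assumptions, not a mandated standard), a few lines of scripting can scaffold it consistently.

  from pathlib import Path

  # Illustrative folder structure and naming standard for the handover workspace.
  workspace = Path("current_state_assessment")
  folders = [
      "01_process_maps",
      "02_pain_point_log",
      "03_kpi_evidence_pack",
      "04_risk_register",
      "05_how_to_update_notes",
  ]

  for name in folders:
      folder = workspace / name
      folder.mkdir(parents=True, exist_ok=True)
      # Each folder records its owner and refresh cadence alongside the artifacts.
      (folder / "README.txt").write_text("Owner: TBD\nUpdate cadence: quarterly\n")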

11) From Strategy to Implementation

Goal: Bridge the gap between high-level findings and concrete execution, ensuring strategy translates into measurable results.

Activities

  • Translate prioritized issues into actionable initiatives with timelines and owners.
  • Develop a transition roadmap that sequences quick wins, foundational changes, and long-term transformations.
  • Define success metrics and link them directly to business outcomes (cost, efficiency, risk, customer experience).
  • Establish governance and accountability structures to drive follow-through.

Outputs

  • Implementation roadmap (phased plan with milestones).
  • Action tracker with owners, timelines, and success criteria.
  • Governance model to ensure oversight and accountability.

Talking Points

“Strategy without execution is just a slide deck. Our aim is to show the direct path from what we’ve diagnosed to how change gets implemented.”

“This is where priorities turn into actions, owners, and measurable outcomes.”

3 Questions to Ask

  • “Which initiatives can deliver impact within 90 days?”
  • “What dependencies could delay implementation?”
  • “Who owns delivery, and how will progress be tracked?”

Common Mistakes

  • Leaving the assessment at the “insight” stage with no next steps.
  • Failing to link initiatives to measurable business outcomes.
  • Not sequencing initiatives realistically (trying to do everything at once).

Stratenity Guidance

Always provide a 90-day roadmap plus a 12-month horizon. Use Stratenity’s phased playbook (Quick Wins → Core Fixes → Transformational Moves). Ensure every recommendation has an owner, a metric, and a time frame.
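
A minimal sketch of an action tracker that enforces the owner / metric / time-frame rule; the initiatives and field values are hypothetical.

  # Every recommendation carries an owner, a success metric, and a time frame.
  roadmap = [
      {"initiative": "Automate invoice matching", "phase": "Quick Wins",
       "owner": "AP Manager", "metric": "First-time match rate > 90%", "due": "Day 90"},
      {"initiative": "Consolidate master data", "phase": "Core Fixes",
       "owner": "Data Lead", "metric": "Duplicate records < 2%", "due": "Month 6"},
      {"initiative": "Replace legacy ERP module", "phase": "Transformational Moves",
       "owner": "CIO", "metric": "Legacy module decommissioned", "due": "Month 12"},
  ]

  # Flag any entry missing one of the three required fields before the roadmap is issued.
  incomplete = [item["initiative"] for item in roadmap
                if not all(item.get(f) for f in ("owner", "metric", "due"))]
  print("Incomplete items:", incomplete or "none")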

12) Client Guidance When Engaging Consulting

Goal: Help clients understand how to engage consultants effectively, set expectations, and maximize value from an assessment.

Activities

  • Clarify objectives and scope in writing before engagement begins.
  • Ensure the right stakeholders are available and committed to participate.
  • Request transparency on consultant methodology, deliverables, and timeline.
  • Define how findings will be used: strategy, transformation, compliance, or decision support.

Outputs

  • Shared expectations between client and consulting team.
  • Defined roles: client sponsors, stakeholders, and decision makers.
  • Clear understanding of how outputs will be consumed and acted upon.

Talking Points

“Consulting delivers the most value when clients are active participants, not passive observers.”

“Your role as a client is to provide access, context, and honest input; our role is to structure, analyze, and guide.”

3 Questions to Ask (Clients)

  • “What internal resources are we ready to dedicate to this assessment?”
  • “How will we use these findings to make concrete decisions?”
  • “What does success look like for us: quick insights, deep analysis, or a full roadmap?”

Common Mistakes

  • Clients expecting consultants to “own” all answers without providing internal context.
  • Unclear sponsorship leading to slow or diluted decisions.
  • Underestimating the time commitment needed from internal teams.

Stratenity Guidance

Encourage clients to prepare by designating a clear executive sponsor and empowered working team. Remind them that the assessment is co-created: insights are richer and more actionable when client teams engage actively. Provide clients with Stratenity’s “Engagement Readiness Checklist” to align expectations upfront.

Deliverables (Post-Workshop)

The Current State Assessment Deck/Report should include the following sections with their key components:

  1. Executive Summary
    • Purpose of the assessment
    • Scope and boundaries
    • Top findings in plain language
    • 3–5 priority issues or opportunities
    • High-level next steps
  2. Process Maps & System Landscape
    • End-to-end process diagrams (e.g., procure-to-pay, hire-to-retire)
    • System/application inventory
    • Integration and data flow diagrams
    • Ownership and responsibility mapping
  3. Stakeholder Pain Points
    • Collected challenges and bottlenecks
    • Role-specific issues (e.g., finance, HR, IT)
    • Shadow processes or workarounds identified
    • Impact rating (low/medium/high)
  4. Performance Data & Benchmarks
    • Baseline KPIs and metrics
    • Trends and variances vs. targets
    • Comparisons to industry benchmarks
    • Evidence pack (supporting data, logs)
  5. Thematic Issues & Prioritized List
    • Grouped findings by category (people, process, technology, governance, data)
    • Root causes and business impacts
    • Priority heatmap or matrix (impact vs. effort; see the sketch after this list)
    • Quick-win opportunities flagged
  6. Recommendations for Future State Phase
    • Immediate actions (quick wins)
    • Suggested focus areas for design workshops
    • High-level roadmap guidance
    • Additional assessments or deep-dives needed
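
A minimal sketch of how the impact-vs-effort heatmap referenced above can be derived from the prioritized issue list; the issues, scores, and quadrant thresholds are hypothetical assumptions.

  # Issues scored 1-5 on business impact and implementation effort.
  issues = [
      {"issue": "Manual invoice matching", "impact": 4, "effort": 2},
      {"issue": "Fragmented master data", "impact": 5, "effort": 4},
      {"issue": "Unclear approval ownership", "impact": 3, "effort": 1},
  ]

  def quadrant(impact: int, effort: int) -> str:
      # High impact and low effort marks a quick win; high impact and high effort a major project.
      if impact >= 3:
          return "Quick win" if effort <= 2 else "Major project"
      return "Fill-in" if effort <= 2 else "Deprioritize"

  for i in issues:
      print(f"{i['issue']}: {quadrant(i['impact'], i['effort'])}")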