Introduction to ISO 9001 Process KPIs and Design Principles
This section defines an end-to-end KPI architecture that aligns process performance with corporate strategy under ISO 9001:2015. The aim is to establish a cross-industry backbone so that the manufacturing, services, software, and logistics vertical templates in later sections plug in without rework. The approach spans process mapping, KPI alignment, target cascading, dashboard design, audit evidence, supplier management, CAPA integration, and Management Review (MR) outputs to form a closed loop.
Mandatory KPI metadata
A mature KPI system standardizes every metric’s definition, formula, and management cadence. Below are the mandatory metadata fields for deployment at enterprise scale; a minimal record sketch in code follows the table.
| Field | Definition | Recommendation |
|---|---|---|
| Name / Code | Unique metric name and short code | Corporate code standard like “FPY_MFG_01” |
| Purpose | Business outcome tied to quality objective | Customer satisfaction, cost, cycle time, compliance |
| Definition | What is measured and scope limits | Clarify process boundaries using SIPOC |
| Formula | Mathematical calculation method | Numerator–denominator universe match + validation rule |
| Target / Limits | Annual target, warning, and critical thresholds | MR-approved with quarterly cadence |
| Data Source | Operational system, record, or form | ERP/MES/CRM, QMS forms, supplier portal |
| Measurement Frequency | Collection and reporting period | Daily/weekly for ops, monthly feed to MR |
| Ownership | Single accountable owner and deputy | Process owner + data owner assigned via RACI |
| Strategy Link | Connection to strategic goal, OKR or policy | Map to corporate balanced scorecard layer |
| Visualization | How it is rendered on dashboards | Trend, target bands, traffic light, distribution |
| Risks & Assumptions | Factors that may impact data integrity | Source changes, sampling error, latency |
| Related CAPA | Action path when thresholds are breached | Auto-create CAPA and run root cause analysis |
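To make the metadata concrete, here is a minimal sketch of one dictionary row as a typed record, assuming a Python implementation; the field names and the FPY example values are illustrative, not a mandated schema.

```python
from dataclasses import dataclass, field

@dataclass
class KpiRecord:
    """One KPI dictionary row; field names are illustrative."""
    code: str            # unique short code, e.g. "FPY_MFG_01"
    purpose: str         # business outcome tied to a quality objective
    definition: str      # what is measured, with SIPOC scope limits
    formula: str         # human-readable calculation rule
    target: float        # annual target (green band)
    warning: float       # amber threshold
    critical: float      # red threshold
    data_source: str     # system of record, e.g. "MES"
    frequency: str       # collection cadence, e.g. "daily"
    owner: str           # single accountable owner (RACI "A")
    strategy_link: str   # OKR / balanced-scorecard reference
    visualization: str   # e.g. "trend + band + traffic light"
    risks: list[str] = field(default_factory=list)
    capa_rule: str = "auto-create CAPA on threshold breach"

# Hypothetical example entry
fpy = KpiRecord(
    code="FPY_MFG_01", purpose="Reduce rework cost",
    definition="First Pass Yield at final assembly",
    formula="first-pass good units / units started x 100",
    target=98.0, warning=96.0, critical=94.0,
    data_source="MES", frequency="daily",
    owner="Production Manager",
    strategy_link="BSC internal-process layer",
    visualization="trend + band + traffic light",
    risks=["sampling error", "source latency"],
)
```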
Process mapping and KPI alignment
KPIs attach to control points on the process map (SIPOC/Value Stream Map). Create families for input quality, transformation efficiency, and output conformity. The measurement point, source system, and frequency must match the process cycle. For high-volume flows use daily process KPIs; for low-volume project work use gate KPIs. Maintain one “result” KPI per step and as many “leading” KPIs as needed. This keeps dashboards lean and preserves causal readability.
Critical design tip
If a KPI value never moves on the dashboard, there are two likely causes: the measurement point is wrong, or the frequency is decoupled from the process cycle. Validate the cycle first, then standardize the data source.
Target/limit design and strategy alignment
Target cascading translates policy-level goals into process-level bands. Top-level result KPIs (e.g., customer satisfaction, DPMO, OTIF) decompose into leading indicators in sub-processes. Annual targets break down into quarterly and monthly bands using a traffic-light model. Link to OKRs: KRs bind to KPI bands and Management Review consolidates outcomes on one page.
Data source, frequency, and ownership
Select data sources by balancing measurement cost and accuracy. Prefer automated pulls (ERP/MES/CRM/QMS); enforce digital audit trails for manual forms. Frequency must be short enough to observe statistical variation yet long enough to avoid noise. Ownership follows the “single accountable” principle; define accountability in a RACI matrix and document deputies and continuity risks.
Visualization and dashboard design
Design dashboards to accelerate decisions. Render each KPI consistently: target bands, time trend, last value, and threshold alert. Traffic-light status must be unambiguous. Monthly executive views stay summary-level; process dashboards show detailed trends. Use distribution and box plots to read process stability. Manage dashboard versions under configuration control with a data dictionary.
Audit evidence for KPIs
Auditors look for three evidence types: (1) a defined KPI dictionary and approved targets, (2) data integrity and traceability (source screenshots, report exports, timestamps), and (3) CAPA records and effectiveness checks for threshold breaches. Close findings with corrective plans owned by the process owner.
CAPA and KPI interaction
Critical breaches auto-trigger CAPA. Root-cause analysis (5W1H, Fishbone, 5 Whys) must also test KPI formulas and data assumptions. Separate containment from permanent fixes; verify effectiveness by returning the KPI trend to its target band. Closeout ties back to Management Review.
Management Review outputs
MR is the official feedback loop of the KPI ecosystem. Agenda covers target attainment, customer feedback, nonconformity–CAPA status, supplier performance, and resource needs. Outcomes include target resets and investment/resource plans. In-year target changes are versioned and communicated.
KPI maturity assessment
Maturity spans five levels: Level 1, ad hoc measures; Level 2, defined but inconsistent data; Level 3, standardized dictionary and cadence; Level 4, predictive analytics with CAPA integration; Level 5, strategy-driven closed-loop optimization. Assess using data integrity, ownership clarity, target cascading, and CAPA effectiveness.
- High-leverage move: Lock KPI–process mapping in a single workshop.
- Risk: Bands set below or above real process capability generate false alarms.
- Control: Version the data dictionary and revisit targets in MR.
KPI Dictionary and Execution Discipline: Targets, Data Source, Cadence, Ownership
The KPI dictionary is the single source of truth for metric lifecycle management. It operationalizes ISO 9001:2015 clauses 6 (quality objectives), 7.5 (documented information), and 9.1 (monitoring, measurement, analysis, evaluation). This section provides a deployable template for target bands, authoritative data sources, measurement and reporting cadence, and RACI-based ownership.
Target Bands: Design Rules
Use a three-band model: Target (green), Warning (amber), Critical (red). Align bands to process capability and cascade annually → quarterly → monthly. Targets must be process-specific, time-bound, and numerator–denominator consistent. A band-classification sketch in code follows the criteria table below.
| Criterion | Description | Control Check |
|---|---|---|
| Strategy Fit | Traceability to policy objective or OKR | OKR code cross-referenced with KPI code |
| Process Capability | Cp/Cpk and last-12-period distribution | Band sits within capability envelope |
| Cascading | Year → Quarter → Month targets | MR approval and version record |
| Alarm Logic | Auto tasks at warning/critical | CAPA trigger rule documented |
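As referenced above, a minimal sketch of the three-band classification. The exact behavior at the warning threshold is an assumption, since the band model leaves it open; lower-is-better metrics (PPM, COPQ) invert the comparisons via a flag.

```python
def band_status(value: float, target: float, warning: float,
                critical: float, higher_is_better: bool = True) -> str:
    """Map a KPI value onto the green/amber/red traffic light."""
    if not higher_is_better:
        # Flip signs so the same comparisons serve PPM- or COPQ-style KPIs.
        value, target, warning, critical = -value, -target, -warning, -critical
    if value >= target:
        return "green"   # within target band
    if value > critical:
        # Warning envelope: below target but not yet critical; crossing
        # `warning` is where a monitoring event would normally open.
        return "amber"
    return "red"         # critical breach

assert band_status(97.4, 97, 95, 93) == "green"   # OTIF-style, higher is better
assert band_status(94.1, 97, 95, 93) == "amber"
assert band_status(650, 500, 800, 1200, higher_is_better=False) == "amber"  # PPM
```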
Authoritative Data Source Governance
Enforce single system of record. Preference order: primary operational system > integration feed > manual form. For manual entry, require digital audit trail. Store schema, version, and access controls in the dictionary.
- System: ERP/MES/CRM/QMS module and table name
- Fields: Columns feeding the formula
- Filters/Business Rules: Date, status, product family, region
- ETL/Refresh: Pull cadence, latency, failure handling
- Validation: Independent recalculation procedure
Data Integrity Tip
One KPI, one source. When multiple systems exist, label only one as authoritative; tag others as contextual reports.
Measurement and Reporting Cadence
Cadence must reflect process dynamics. For fast-cycle operations use daily measurement and weekly review; for project work use gate-based reporting. Avoid over-sampling noise or under-sampling lag.
| Process Type | Collection | Reporting | Note |
|---|---|---|---|
| High-volume manufacturing | Daily | Weekly + Monthly | SPC band control |
| Field service | Daily event logging | Weekly SLA | Seasonality segmented |
| Software/SRE | Real-time | Weekly postmortem | MTTR, change success |
| Logistics | Per shipment | Weekly OTIF | Carrier-level split |
Ownership: RACI and Deputization
Assign a single Accountable owner per KPI. Maintain a role-based RACI in the dictionary. Record deputy plan and continuity risk. Apply MR approval for ownership changes.
| Role | Responsibility | Success Metric |
|---|---|---|
| A – Accountable | Target setting, source approval, alarm rule | Target band coherence |
| R – Responsible | Data pull, checks, dashboard update | On-time reporting |
| C – Consulted | Analytics, methodology, model change | Formula validation |
| I – Informed | MR, process owners, suppliers | Decision traceability |
Template: KPI Dictionary Record
Use the template below for every metric. Treat the dictionary as a controlled document under the information security policy. A worked calculation for the example record follows the table.
| Field | Example Value |
|---|---|
| Code / Name | LOG_OTIF_01 – On Time In Full |
| Purpose | Customer delivery performance |
| Definition | Percentage of shipments delivered on time and in full |
| Formula | OTIF = (On-time & In-full / Total) × 100 |
| Target / Warn / Critical | 97% / 95% / 93% |
| Data Source | TMS delivery confirmation, e-sign POD |
| Measurement / Reporting | Per shipment / Weekly |
| Ownership (RACI) | A: Logistics Manager, R: Planning, C: Quality, I: Sales |
| Visualization | Trend + target band + traffic light |
| Risks/Assumptions | Address change delays, integration lag |
| CAPA Trigger | Two consecutive periods below band → root cause |
| Version | v1.3 – 2025Q3, MR approved |
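Here is that worked calculation for the LOG_OTIF_01 record; the Shipment fields are assumed stand-ins for TMS delivery-confirmation data, not a real TMS schema.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Shipment:
    """Illustrative TMS delivery-confirmation record (fields assumed)."""
    promised: date
    delivered: date
    qty_ordered: int
    qty_delivered: int

def otif_pct(shipments: list[Shipment]) -> float:
    """OTIF = (on-time AND in-full shipments / total shipments) x 100."""
    if not shipments:
        raise ValueError("empty universe: check the SIPOC scope filters")
    hits = sum(1 for s in shipments
               if s.delivered <= s.promised
               and s.qty_delivered >= s.qty_ordered)
    return round(100.0 * hits / len(shipments), 1)

week = [
    Shipment(date(2025, 9, 1), date(2025, 9, 1), 100, 100),  # on time, in full
    Shipment(date(2025, 9, 2), date(2025, 9, 3), 50, 50),    # one day late
    Shipment(date(2025, 9, 4), date(2025, 9, 4), 80, 75),    # short delivery
    Shipment(date(2025, 9, 5), date(2025, 9, 5), 60, 60),    # on time, in full
]
print(otif_pct(week))  # 50.0 -> far below the 97/95/93 bands
```

Note that the numerator and denominator share one universe (the same shipment list), which is exactly the scope discipline the dictionary’s filter field enforces.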
Audit Checklist
- Are bands capability-based and MR-approved?
- Is the data source single, traceable, and authorized?
- Does cadence match process cycle time?
- Is RACI clear with a deputy plan?
- Are alarm→CAPA triggers enforced?
- Is version control active across dictionary and dashboards?
Common Failure Modes and Corrective Actions
- Universe mismatch: Mixed filters in numerator/denominator. Action: Standardize scope.
- Over-sampling: Noise inflation. Action: Redesign cadence.
- Multi-source conflict: Divergent figures. Action: Declare the system of record.
- Owner gap: No accountability. Action: Assign “A” and record in MR.
Supplier KPIs: Scoring, Contract Clauses, and Governance
This section standardizes supplier performance monitoring under ISO 9001 clause 8.4. Objective: consolidate critical metrics into a single dashboard and a single contract appendix, tie penalty/bonus logic to numeric thresholds, and integrate CAPA.
Core Supplier KPI Family
| KPI Name/Code | Definition / Formula | Target / Limits | Data Source | Frequency | Ownership |
|---|---|---|---|---|---|
| Supplier OTIF (SUP_OTIF_01) | On-Time & In-Full POs / Total POs | ≥ 98% target, 96% warn, 94% critical | TMS/ERP receipt + e-POD | Monthly | Supply Chain (A), Logistics (R) |
| PPM – Defects per Million (SUP_PPM_02) | (Nonconforming Parts × 10⁶) / Total Parts | ≤ 500 target, 800 warn, 1200 critical | Incoming QC, 8D records | Monthly | Supplier Quality (A), Quality (R) |
| ASN/Label Compliance (SUP_ASN_03) | Accurate & Timely ASN / Total Shipments | ≥ 99% target, 97% warn, 95% critical | EDI/Portal, WMS gate | Monthly | Logistics Operations (A) |
| Lead Time Adherence (SUP_LT_04) | Delivered within Committed LT / Total Orders | ≥ 97% target, 95% warn, 93% critical | ERP orders & planning | Monthly | Planning (A), Procurement (R) |
| CAPA On-Time Closure (SUP_CAPA_05) | CAPA Closed on Time / Total CAPA | ≥ 95% target, 90% warn, 85% critical | QMS CAPA module | Monthly | Quality (A), Supplier (R) |
| Supplier COPQ (SUP_COPQ_06) | Supplier-caused COPQ / Total Procurement | ≤ 0.3% target, 0.5% warn, 0.8% critical | Finance, scrap/rework | Quarterly | Finance (A), Quality (C) |
Supplier Scorecard Template
Aggregate KPIs into one score using strategic weights. Review annually in MR.
| Dimension | KPIs | Weight | Scoring Logic |
|---|---|---|---|
| Logistics | SUP_OTIF_01, SUP_LT_04 | 40% | Band-normalized min(OTIF, LT) |
| Quality | SUP_PPM_02, SUP_CAPA_05 | 40% | Inverse PPM + on-time CAPA |
| Compliance/Data | SUP_ASN_03 | 10% | ASN accuracy |
| Financial | SUP_COPQ_06 | 10% | Within target band |
Grade thresholds: A≥90, B=80–89, C=70–79, D<70. A/B remain strategic. C requires improvement plan. D triggers exit plan.
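A sketch of the aggregation, assuming dimension scores have already been band-normalized to 0–100; the normalization itself varies per the scoring-logic column and is out of scope here.

```python
def weighted_score(dim_scores: dict[str, float],
                   weights: dict[str, float]) -> float:
    """Aggregate 0-100 dimension scores with the strategic weights."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(dim_scores[d] * w for d, w in weights.items())

def grade(score: float) -> str:
    """A>=90, B=80-89, C=70-79, D<70, per the thresholds above."""
    if score >= 90: return "A"
    if score >= 80: return "B"
    if score >= 70: return "C"
    return "D"

weights = {"logistics": 0.40, "quality": 0.40,
           "compliance": 0.10, "financial": 0.10}
supplier = {"logistics": 92.0, "quality": 85.0,
            "compliance": 99.0, "financial": 70.0}   # illustrative scores
total = round(weighted_score(supplier, weights), 1)
print(total, grade(total))  # 87.7 B -> remains strategic, no exit plan
```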
Contract Appendix – KPI/SLA Clauses
- KPI Definitions & Thresholds: SUP_OTIF_01 ≥ 98%, SUP_PPM_02 ≤ 500, SUP_ASN_03 ≥ 99%, SUP_CAPA_05 ≥ 95%, SUP_COPQ_06 ≤ 0.3%.
- Measurement & Verification: ERP/TMS/QMS as system of record. Monthly reconciliation with timestamps.
- Penalty/Bonus: Per-point deviation below/above threshold. Annual floor/ceiling applies.
- Mandatory CAPA: Two consecutive periods below band require 8D. Default closure target: 30 days.
- Audit Rights: Announced/unannounced process and record audits. Traceability sampling allowed.
- Data & EDI: ASN/label standards, EDI message set, barcode format. Mislabeling rework at supplier cost.
- Continuity & Risk: BCP/DR, alternative source disclosure, critical raw-material alerts.
- Confidentiality & IP: Classification of drawings/specs.
Dashboard and Visualization
Two levels: supplier-level and part/SKU-level. Default tiles: OTIF trend, PPM box plot, ASN heatmap, score distribution. Filters: supplier, part family, region, carrier. Show target bands and alarm log per metric.
Audit Evidence and CAPA Integration
- Dictionary and contract appendix versions with MR approval.
- Source screenshots, EDI logs, timestamps, and monthly reconciliations.
- 8D/CAPA files for band breaches with effectiveness proof on trends.
RACI Example
| Activity | A | R | C | I |
|---|---|---|---|---|
| Annual target setting | Procurement Manager | Supplier Quality | Finance, Production | MR |
| Monthly data reconciliation | Supplier Quality | Logistics | Supplier | Planning |
| Band-breach CAPA | Quality Director | Supplier | Engineering | Procurement |
Contract Appendix – Section Heads
- Scope and definitions (KPI/SLA dictionary)
- Measurement method and system of record
- Target/limit bands and periods
- Penalty/bonus and settlement method
- CAPA and audit procedures
- Confidentiality, security, BCP
- Effective date, revision, termination
Implementation Tip
Attach penalty/bonus to the weighted score, not a single KPI. This reduces financial volatility from one-off metric swings.
CAPA–KPI Integrated Management: Threshold Triggers, 8D Flow, Effectiveness Verification
This section operationalizes a closed loop between KPI deviations and CAPA per ISO 9001:2015 clauses 10.2 and 9.1. The objective is to auto-trigger problem solving from threshold breaches, expand root-cause analysis to validate metric assumptions and data lineage, and verify effectiveness on the KPI trend before formal closeout in Management Review (MR).
Trigger Logic: Band Breach → Event → CAPA
- Warning-band breach: Open a monitoring event. Apply short-cycle containment. CAPA not mandatory.
- Single critical breach: Open event + rapid 5W1H. Perform risk screening.
- Two consecutive periods below band: CAPA mandatory. Launch 8D (a trigger sketch in code follows this list).
- Strategic KPI deviation: Auto-add MR agenda item. Evaluate resource and investment decisions.
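A minimal sketch of the trigger ladder, assuming each period has already been classified green/amber/red against its bands; the tuple return (CAPA required, action) is illustrative.

```python
def capa_trigger(history: list[str]) -> tuple[bool, str]:
    """Apply the trigger ladder to a chronological list of band statuses."""
    if len(history) >= 2 and all(s != "green" for s in history[-2:]):
        return True, "two consecutive periods below band: launch 8D"
    if history and history[-1] == "red":
        return False, "single critical breach: open event + rapid 5W1H"
    if history and history[-1] == "amber":
        return False, "warning breach: monitoring event + containment"
    return False, "in band: no action"

print(capa_trigger(["green", "amber", "red"]))  # (True, 'two consecutive ...')
print(capa_trigger(["green", "green", "red"]))  # (False, 'single critical ...')
```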
Mapping the 8D Flow to KPIs
| 8D Step | KPI Link | Evidence/Output |
|---|---|---|
| D1 Team | Assign KPI owner and data owner via RACI | Tasks, authority matrix |
| D2 Problem Definition | Confirm band, period, segment, formula scope | Problem statement + dictionary reference |
| D3 Containment | Short-term recovery actions on the KPI | Timestamped change log |
| D4 Root Cause | Test formula, data source, sampling, process factors | Fishbone/5 Whys + independent recalculation |
| D5 Permanent Corrective Action | Parameters, training, supplier change, automation | Change plan, validation protocol |
| D6 Implementation & Verification | KPI returns to band and stabilizes | Before/after chart, control-plan update |
| D7 Prevention/Replication | Roll out to similar processes/KPIs | FMEA update, lessons learned |
| D8 Closure | MR decision record and version sync | Closure report, effectiveness sign-off |
Risk-Based CAPA Prioritization
Rank events by RPN (Severity × Occurrence × Detection) and include financial impact and compliance risk. Lower the CAPA trigger for high-RPN items. Link backlog priority to expected KPI uplift and time-to-impact.
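A short sketch of the ranking rule; the 1–10 severity/occurrence/detection scales are the conventional FMEA choice, and using expected KPI uplift as a tiebreak is an assumption drawn from the backlog criterion above.

```python
def rpn(severity: int, occurrence: int, detection: int) -> int:
    """Risk Priority Number on the usual 1-10 FMEA scales."""
    return severity * occurrence * detection

# Illustrative backlog: (event_id, S, O, D, expected KPI uplift in points)
backlog = [
    ("ALR-0912-004", 7, 7, 4, 1.5),
    ("ALR-0913-001", 9, 5, 6, 0.8),
    ("ALR-0914-002", 4, 3, 3, 2.0),
]
# Rank by RPN first, then by expected uplift.
for event_id, s, o, d, uplift in sorted(
        backlog, key=lambda e: (rpn(*e[1:4]), e[4]), reverse=True):
    print(event_id, rpn(s, o, d), uplift)
# ALR-0913-001 270 0.8 / ALR-0912-004 196 1.5 / ALR-0914-002 36 2.0
```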
Effectiveness Verification on the KPI
- Trend return: Three consecutive periods within target band.
- Statistical proof: SPC free of special-cause signals; capability Cpk ≥ 1.33 (a Cpk sketch follows this list).
- Side-effects: No degradation on related leading indicators.
- Control sustainment: Control plan and training records updated.
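As referenced in the statistical-proof item, a sketch of the Cpk check combined with the trend-return rule; it assumes approximate normality, known spec limits, and the sample standard deviation.

```python
from statistics import mean, stdev

def cpk(samples: list[float], lsl: float, usl: float) -> float:
    """Capability index: nearest spec-limit distance in 3-sigma units."""
    mu, sigma = mean(samples), stdev(samples)
    return min((usl - mu) / (3 * sigma), (mu - lsl) / (3 * sigma))

def effectiveness_verified(statuses: list[str], samples: list[float],
                           lsl: float, usl: float) -> bool:
    """Three consecutive in-band periods AND Cpk >= 1.33."""
    trend_ok = len(statuses) >= 3 and all(s == "green" for s in statuses[-3:])
    return trend_ok and cpk(samples, lsl, usl) >= 1.33

post_fix = [97.9, 98.1, 98.0, 98.2, 97.8, 98.1]   # illustrative measurements
print(effectiveness_verified(["green"] * 3, post_fix, lsl=97.0, usl=99.0))
# True -> eligible for D8 closure in MR
```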
Data Model for Event–CAPA–KPI Linkage
| Field | Description | Example |
|---|---|---|
| KPI_Code | Unique metric code from dictionary | MFG_OEE_03 |
| Band_Type | Target / Warning / Critical | Critical |
| Segment | Product/Customer/Region/Shift | Line-2, Shift-B |
| Event_ID | Threshold-breach event identifier | ALR-2025-0912-004 |
| CAPA_ID | QMS CAPA record number | CAPA-25-118 |
| RPN | Risk priority number | 196 |
| Verification_Date | Effectiveness verification timestamp | 2025-10-15 |
Operational Playbook
- Dashboard alarm fires → open event → reference KPI dictionary row.
- Compute risk score → decide CAPA requirement.
- Launch 8D → apply containment.
- Implement permanent fixes → update control plan and training.
- Track KPI trend → verify effectiveness for three periods.
- Replicate as needed → close in MR with decision record.
Audit Evidence Set
- Alarm log and band-breach screenshot.
- 8D file, root-cause proofs, change logs.
- Before/after KPI charts, SPC outputs.
- Updated control plan, training records, version notes.
Implementation Tip
Tie CAPA closure to metric stabilization, not calendar deadlines. Calendar-driven closures create false positives.
Visualization and Dashboard Design: Target Bands, Trend Literacy, and Segmentation
This section codifies decision-centric dashboard standards that satisfy ISO 9001 performance monitoring while enabling rapid executive and operational action. Objective: render KPI data with target bands, time-series trends, and an alarm log so status, variance, and causality are unambiguous. Information architecture spans three tiers: executive summary, process dashboards, and diagnostic analytics. Each visual element references the KPI dictionary entry for formula and bands.
Dashboard Hierarchy and Use Cases
- Executive Summary: 8–12 strategic KPIs, quarterly cascade, traffic-light status. Use in Management Review.
- Process Dashboards: Result KPIs (FPY/OEE, SLA, OTIF) with leading drivers. Use in weekly operations.
- Diagnostics: Distribution, root-cause slices, segment comparisons. Use in problem-solving sessions.
Visual Standards
- One KPI = One Chart: Time trend + target band + last-value card.
- Target Bands: Show target, warning, and critical envelopes explicitly.
- Time Base: Minimum 12 periods visible; expandable to 36.
- Traffic Light: Status computed from last value relative to bands; include timestamp and source tag.
- Scale Coherence: Align axes and periods when placing multiple KPI tiles on the same page.
Chart Selection Matrix
| KPI Type | Recommended Chart | Rationale | Augmentation |
|---|---|---|---|
| Ratio/Percent (FPY, OTIF, SLA) | Line trend + band | Immediate in/out-of-band visibility | Traffic light + last-value card |
| Counts/Frequency (defects) | Column trend | Volume dynamics clarity | 3-period moving average |
| Distribution/Quality (PPM, cycle time) | Box plot | Median, quartiles, outliers | SPC markers |
| Comparative (region/product) | Horizontal bar | Ranked segmentation | Target line overlay |
| Relationship | Scatter | Correlation readout | R² label |
Segmentation and Filters
Every KPI chart must be segmentable via top-level filters: product family, customer segment, region, shift, supplier. If segmentation materially shifts performance, render segment-specific targets and mark charts with a “segment target” badge. Persist the selected segment in the URL or dashboard state for auditability.
Alarm Log and Annotations
- For each out-of-band point, store timestamp, owner, and short analysis note.
- Link CAPA record and show status badge: Open / In Progress / Verified.
- If a band or formula changes, display a visible version tag on the chart.
Data Freshness and System-of-Record Tag
Expose a Freshness indicator per page: “Freshness: T−x hours”. Add a system-of-record badge from the dictionary. If ETL fails, gray out visuals and show last successful refresh timestamp with a link to the failure log.
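A small sketch of the indicator, assuming the last successful refresh timestamp is available from the ETL log; the rendering string follows the “T−x hours” convention above.

```python
from datetime import datetime, timezone
from typing import Optional

def freshness_badge(last_refresh: datetime,
                    now: Optional[datetime] = None) -> str:
    """Render the per-page indicator as 'Freshness: T-x hours'."""
    now = now or datetime.now(timezone.utc)
    hours = (now - last_refresh).total_seconds() / 3600
    return f"Freshness: T-{hours:.0f} hours"

stamp = datetime(2025, 10, 15, 6, 0, tzinfo=timezone.utc)
print(freshness_badge(stamp, now=datetime(2025, 10, 15, 14, 0,
                                          tzinfo=timezone.utc)))
# Freshness: T-8 hours  (on ETL failure: gray out visuals, link the log)
```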
Standard KPI Card Components
| Component | Content | Source |
|---|---|---|
| Title | Code + Name (e.g., LOG_OTIF_01) | KPI dictionary |
| Last Value | 97.4% (↑0.6) | Calc engine |
| Bands | 97% / 95% / 93% | Dictionary |
| Trend | 12-period line + band | Data warehouse |
| Alarms | Two critical in last 90 days | Alarm log |
| Note | “Carrier change improved OTIF” | Ops note |
Accessibility and Usability
- Color-blind friendly palettes and patterned bands.
- Alt-text summaries for each chart.
- Keyboard-accessible filters and date pickers.
Versioning and Change Control
Put dashboard configurations under version control. For every change, log the scope, the impacted KPIs, any band updates, and the approval owner. Do not alter executive visuals without MR authorization.
Implementation Checklist
- Are band values drawn for the correct period?
- Is freshness and system-of-record visible?
- Do segment selectors update targets dynamically?
- Are alarms and CAPA links working?
- Is navigation coherent across executive → process → diagnostic tiers?
Implementation Tip
Limit to four charts per page view. Excessive visuals dilute salience and mask alarms.
Internal Audit Evidence for KPIs: Evidence Chain, Traceability, and Nonconformity Closure
This section standardizes the evidence model for ISO 9001:2015 clause 9.2 internal audits across the KPI ecosystem. Audit scope spans three layers: (1) design (dictionary, targets/bands), (2) execution (data source, cadence, ownership, dashboard), and (3) results and improvement (CAPA linkage, effectiveness verification, Management Review decisions). Evidence must be timestamped, reproducible, and version-controlled.
Evidence Architecture and Traceability Chain
Each KPI must have a one-page evidence chain. Trace through coded references: KPI_Code → Formula_File → System_of_Record → ETL_Pipeline → Dashboard_Widget → Alarm_Log → CAPA_ID → MR_Decision. Auditors sample any period and reconcile end-to-end.
| Evidence Item | Content | Control Criterion |
|---|---|---|
| KPI Dictionary Row | Code, purpose, definition, formula, bands, ownership | Version-controlled, MR-approved, current |
| Source System Screen | ERP/MES/CRM/QMS raw data view | Timestamp, filter match, authoritative source |
| ETL/Integration Log | Pull time, row count, failure logs | Freshness T−x hours, error handling rule |
| Dashboard Snapshot | Trend, band overlay, last value | Bands identical to dictionary |
| Alarm Log | Out-of-band points, owner, quick analysis note | Matches CAPA trigger policy |
| CAPA File | 8D, root cause, permanent fix, verification | Effectiveness proven on KPI trend |
| MR Decision Record | Target revision, resource/investment approval | Decision → KPI target version sync |
Sampling Design and Independent Recalculation
Use risk-based sampling. For strategic KPIs apply larger samples and full recalculation. For medium risk use periodic samples and formula verification.
- Sampling Frame: Last 12 periods with segment splits (product, region, shift).
- Method: Extract raw data → reapply formula independently → compute delta to dashboard.
- Acceptance: Delta ≤ 0.2% or an explained rounding rule (a recalculation sketch follows this list).
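As referenced in the acceptance item, a sketch of the recalculation step; the on_time/in_full flags are assumed raw-data columns, and the 0.2 tolerance is read here as percentage points.

```python
def recalc_delta(raw_rows: list[dict],
                 dashboard_value: float) -> tuple[float, bool]:
    """Reapply the OTIF formula to sampled raw rows and compare the
    result against the published dashboard figure."""
    hits = sum(1 for r in raw_rows if r["on_time"] and r["in_full"])
    recalculated = 100.0 * hits / len(raw_rows)
    delta = abs(recalculated - dashboard_value)
    # Accept if within tolerance; otherwise require a documented
    # rounding rule or raise a finding.
    return round(delta, 3), delta <= 0.2

rows = ([{"on_time": True, "in_full": True}] * 487
        + [{"on_time": False, "in_full": True}] * 13)
print(recalc_delta(rows, dashboard_value=97.4))  # (0.0, True)
```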
Common Nonconformities and Fix Patterns
| Finding Type | Description | Corrective/Preventive Action |
|---|---|---|
| Band–Dictionary Mismatch | Dashboard bands differ from dictionary | Fix config, add release note, obtain MR approval |
| Unauthorized Data Source | Report pulls from non-authoritative system | Declare system of record, reclassify others as contextual |
| Formula Universe Error | Mixed filters in numerator/denominator | Standardize scope, add regression tests |
| Missing Alarm→CAPA Link | Out-of-band period without CAPA | Enforce trigger rule, open retroactive CAPA |
| Data Freshness Gap | ETL lag or stale freshness badge | Define ETL SLA, gray-out visuals on failure |
Rapid Audit Question Set
- Is the dictionary entry current with MR approval and correct bands?
- Do dashboard bands equal dictionary values with visible version tags?
- Is the raw-data source the system of record? Are ETL delay and error logs captured?
- For last 12 periods, do out-of-band points have alarms and CAPA links?
- Was effectiveness verified with three in-band periods and no adverse impact on leading KPIs?
- Do segment selections auto-refresh segment-specific targets?
Nonconformity Grading and Closure SLA
Classify findings as Critical, Major, or Minor. Critical items involve customer risk or regulatory breach and require containment within 7 days and a permanent fix within 30 days. Major: 30/60 days. Minor: 60/90 days. Assign a single owner and trace closure in the CAPA system with KPI-based verification.
Audit Package and Archiving
- Preparation: Audit plan, KPI inventory, risk map.
- Execution: Sample list, screenshots, recalculation files.
- Closeout: Nonconformity report, CAPA plan, owners and dates.
- Archive: Encrypted folder, role-based access, retention policy.
KPIs for the Audit Process Itself
Measure audit process performance with On-Time Audit Completion, NC Closure SLA Adherence, Repeat Findings Rate, and Post-Audit KPI Improvement. Review in MR and refine audit approach accordingly.
Process Mapping and KPI Alignment: SIPOC, Value Stream, and Control Point Design
This section integrates ISO 9001’s process approach with KPI architecture. Objective: standardize process boundaries and critical control points so each KPI has a clear where, how, and how often. The method runs on three layers: (1) SIPOC to define supplier–input–process–output–customer scope, (2) Value Stream Mapping (VSM) to visualize flow, wait, and inventory, (3) a Control-Point Matrix to place leading and lagging indicators.
Step 1 — SIPOC to Lock Scope and Accountability
SIPOC defines the measurement universe and prevents numerator–denominator mismatches. Create a one-page SIPOC per process and version it under the process code in the KPI dictionary.
| Component | Content | KPI Impact |
|---|---|---|
| Supplier (S) | Internal/external suppliers, carriers, IT services | Supplier KPIs (OTIF, PPM, ASN) |
| Input (I) | Material, data, demand, specification | Input conformity rate, data quality score |
| Process (P) | Step-by-step flow, waits, decision nodes | Cycle time, OEE/SMED, rework |
| Output (O) | Product/service, document, delivery | FPY, SLA compliance, OTIF |
| Customer (C) | Internal customer, end customer, regulator | NPS, complaint rate, nonconformities |
Rule: a KPI is calculated within one SIPOC universe. Flows outside the universe are contextual reports.
Step 2 — Value Stream Mapping to Expose Bottlenecks
VSM separates value-adding from non-value-adding time and informs control-point placement. Mark takt time, cycle time, wait, and WIP. At each handoff, specify data capture method (auto/manual) and timestamp.
- Flow metrics: Lead time, touch time, WIP, Rolled Throughput Yield (RTY).
- Quality metrics: DPMO, FPY, rework rate.
- Delivery metrics: OTIF, response SLA, MTTR.
- Cost metrics: Scrap, rework, COPQ.
Assign at least one leading KPI to each VSM bottleneck. Do not rely on lagging KPIs alone.
Step 3 — Control-Point Matrix and KPI Placement
Define for each step: measurement type, source, and cadence. Optimize the trade-off between data-collection cost and decision value.
| Flow Step | Control Point | Indicator Type | Formula/Example | Source/Frequency |
|---|---|---|---|---|
| Inbound Receiving | Lot acceptance | Leading | Conforming lots / total lots | QMS, daily |
| Processing | Station output | Leading | Station FPY | MES/PLC, hourly |
| Final Inspection | Sales-ready | Lagging | Total FPY | MES+QMS, daily |
| Shipment | Dock exit | Lagging | OTIF | TMS, weekly |
Leading–Lagging Balance and Target Cascading
For every result KPI attach at least two drivers. Example: for LOG_OTIF_01, use dock wait time and ASN accuracy as drivers. Cascade targets as strategy → process → station, and ensure each top-level target can be expressed as a convex combination (non-negative weights summing to one) of the lower-level targets, as the sketch below illustrates.
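To make “convex combination” concrete, here is a minimal sketch: the top-level target is reproduced as a weighted average of driver targets. The driver names and weights are illustrative; in practice the weights express target coherence across levels, not a causal formula.

```python
def cascaded_target(driver_targets: dict[str, float],
                    weights: dict[str, float]) -> float:
    """Top-level target as a convex combination of driver targets."""
    assert all(w >= 0 for w in weights.values()), "weights must be non-negative"
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(driver_targets[k] * weights[k] for k in weights)

drivers = {"dock_wait_ok_pct": 96.0, "asn_accuracy_pct": 99.0}  # assumed
weights = {"dock_wait_ok_pct": 0.67, "asn_accuracy_pct": 0.33}  # assumed
print(round(cascaded_target(drivers, weights), 1))  # 97.0 = LOG_OTIF_01 target
```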
Numerator–Denominator Consistency
Align filter scope for numerator and denominator: date, product family, region, shift. Store the filter set in the dictionary scope field and version changes. Show a visible version tag on dashboards when scope changes.
Cadence by Process Dynamics
Sample at least twice as fast as the main variation period. High-volume lines: hourly/daily. Project-based software: sprint/weekly. Services: daily demand-driven. Over-sampling creates noise; under-sampling slows decisions.
Data Capture Design and Cost–Impact Balance
- Automation-first: Pull from ERP/MES/CRM; for manual input enforce digital trace.
- Sampling: Use statistical sampling for high volume; full count for critical quality.
- Validation: Monthly random period for independent recalculation.
Cross-Process KPIs and Ownership
Define cross KPIs for Sales–Ops–Logistics chains. Assign a single Accountable owner; sub-process owners are “R”. Record in MR.
Standard Outputs and Documentation
- Process Code & Name (PRC_XXXX), version, date.
- One-page SIPOC linked from the KPI dictionary.
- VSM with takt, CT, WIP callouts.
- Control-Point Matrix and measurement plan.
- Risks & Assumptions list with change history.
Implementation Checklist
- Is each KPI bound to a single SIPOC universe?
- Do VSM bottlenecks have leading KPIs?
- Are source and cadence specified per control point?
- Is scope versioned and visible on dashboards?
- Is there a single “A” owner for cross KPIs?
Implementation Tip
Run a three-day workshop: Day 1 SIPOC, Day 2 VSM, Day 3 control-point matrix. By end of Day 3, deliver draft dictionary entries and dashboard templates for the first 10 KPIs.
Management Review (MR): Agenda, Inputs/Outputs Architecture, and Decision Traceability
This section standardizes how Management Review, per ISO 9001:2015 clause 9.3, integrates with the KPI ecosystem. Goal: run a closed loop across strategy, process KPIs, CAPA, supplier performance, risks, and resources. Decisions must be numeric, time-bound, and linked to specific KPI codes and target versions.
Standard MR Agenda
| Agenda Item | Input Pack | Decision/Output |
|---|---|---|
| Strategic KPI Summary | 8–12 KPIs with last value, 12-period trend, bands | Approve/update targets; segment-specific adjustments |
| Customer Feedback | NPS, complaints, critical incidents | Segment prioritization, service-level changes |
| Nonconformities & CAPA | Open 8D list, risk-ranked backlog | Resource allocation, closure SLAs |
| Supplier Performance | Score distribution, D-grade list | Exit/transition plans, contract clause updates |
| Risks & Opportunities | Risk register, FMEA highlights | Mitigations, investment decisions |
| Resources & Competence | Critical roles, training matrix | Hiring plan, upskilling actions |
| Compliance & Audits | Internal/external findings, law changes | Policy/procedure revisions |
MR Input Packages
- KPI Executive Pack: Code, last value, bands, trend, alarm count.
- Segment Analysis: Region/product/shift deltas with target alignment.
- CAPA Portfolio: Open items with RPN, due dates, blockers.
- Supplier Dashboard: A–D distribution, critical parts, penalties/bonuses.
- Resource & Budget Requests: FTE/CapEx/OpEx with expected KPI lift.
Decision Types and Traceability
Record decisions in a Decision Log tied to KPI codes and versions. Each entry lists rationale, expected effect size, owners, and due dates.
| Decision Type | Description | Trace Artifact |
|---|---|---|
| Target Revision | Change target/warn/critical bands | KPI_Dictionary v.X note, dashboard release note |
| Resource Allocation | FTE/equipment/software spend | Budget code → expected KPI uplift |
| Priority Shift | Reorder CAPA/project portfolio | Backlog rank + due dates |
| Policy/Procedure | New or revised rules | Document code, training plan |
Target Cascading and OKR Bridge
Map annual strategic objectives to quarterly KRs, then bind KRs to KPI bands. In dashboards, enable KR → KPI drill-through. Close a KR only when the linked KPI stays within band for three consecutive periods.
Performance Dialogue and Escalation
- Green: Inform only, continue plan.
- Amber: Operational correction, optional CAPA.
- Red: Mandatory CAPA, escalation at MR, resource assignment.
Post-MR Action Tracking
Track decisions on an Action Closure Board. For each action show owner, due date, expected KPI impact, risk, and status badge. Auto-alert on delays; escalate after three consecutive misses.
MR Pack Template
- Executive summary: 12 key KPIs and traffic lights.
- Critical variances with root-cause synopsis and proposed decisions.
- CAPA portfolio with RPN heatmap.
- Supplier and customer impact matrices.
- Resource/budget proposals with ROI estimate.
- Decision log with owners and dates.
Compliance and Auditability
Store MR minutes under version control. For any sampled decision, auditors must trace input → decision → implementation → KPI effect. Synchronize target versions between dictionary and dashboard.
Implementation Tip
Allocate MR time 70% to decisions and 30% to reporting. Distribute reports as pre-read; discuss only variances and proposed actions during the session.
KPI Maturity Assessment: Level Model, Diagnostic Criteria, and 90-Day Roadmap
This section provides an objective model to assess and improve the KPI ecosystem under ISO 9001. Scope covers strategy alignment, process–KPI mapping, data governance, visualization and decision cadence, CAPA integration, supplier ecosystem, internal-audit evidence, and MR traceability.
Maturity Levels (1–5)
| Dimension | Level 1 Ad hoc | Level 2 Defined | Level 3 Standardized | Level 4 Integrated | Level 5 Optimized |
|---|---|---|---|---|---|
| Strategy Alignment | No targets | Targets exist, weak linkage | OKR/BSC mapped | KR→KPI auto-trace | Predictive, strategy-driven |
| Process–KPI Mapping | Scattered metrics | Basic map | SIPOC/VSM coverage | Balanced lead/lag | Adaptive, bottleneck-focused |
| Data Governance | Multiple sources | Manual-heavy | System of record set | ETL SLA + data dictionary | Quality score + auto validation |
| Visualization | Static reports | Basic charts | Trend + bands | Segmented dashboards | Insight and guidance |
| Decision Cadence | Reactive | Month-end | Weekly rhythm | Closed loop via MR | Preventive, simulation-led |
| CAPA Integration | Manual | Partial | Band-triggered CAPA | 8D with trend verification | Risk-weighted, proactive |
| Supplier KPIs | Not tracked | Basic OTIF/PPM | Contract appendix SLA | Scorecard + bonus/penalty | Joint improvement portfolio |
| Internal-Audit Evidence | Fragmented | Sample screens | Evidence chain | Sampling + recalculation | Auto-built audit packs |
Scoring and Weights
Score each dimension 1–5. Recommended weights: Strategy 20%, Data 20%, Decision Cadence 15%, CAPA 15%, Process–KPI 10%, Visualization 10%, Supplier 5%, Internal Audit 5%. Classification: <2.0 = Low, 2.0–2.9 = Emerging, 3.0–3.7 = Standard, 3.8–4.4 = Advanced, ≥4.5 = Leading.
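A sketch of the scoring arithmetic with the recommended weights; the dimension keys are shorthand for the table rows above, and the example scores are an illustrative self-assessment.

```python
WEIGHTS = {
    "strategy": 0.20, "data": 0.20, "decision_cadence": 0.15, "capa": 0.15,
    "process_kpi": 0.10, "visualization": 0.10,
    "supplier": 0.05, "internal_audit": 0.05,
}

def maturity(scores: dict[str, float]) -> tuple[float, str]:
    """Weighted 1-5 maturity score and its classification band."""
    total = sum(scores[d] * w for d, w in WEIGHTS.items())
    if total < 2.0:   label = "Low"
    elif total < 3.0: label = "Emerging"
    elif total < 3.8: label = "Standard"
    elif total < 4.5: label = "Advanced"
    else:             label = "Leading"
    return round(total, 2), label

scores = {"strategy": 3, "data": 2, "decision_cadence": 3, "capa": 2,
          "process_kpi": 3, "visualization": 3, "supplier": 2,
          "internal_audit": 2}
print(maturity(scores))  # (2.55, 'Emerging')
```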
Diagnostic Checklist
- Does each KPI dictionary row include system of record, formula, bands, and RACI?
- Do dashboards show last value, 12+ periods, and bands by default?
- Are band breaches auto-creating CAPA with 8D linkage?
- Is OKR→KPI trace available one click away?
- Is ETL latency and data-quality score visible?
- Do MR decisions update target versions consistently?
90-Day Roadmap (Wave 1–3)
| Wave | Focus | Deliverable | Success Metric |
|---|---|---|---|
| W1 (0–30d) | Dictionary + source unification | KPI_Dictionary v1, system-of-record list | 100% coded KPIs, 90% SoR assigned |
| W2 (31–60d) | Dashboard standard + alarm→CAPA | Trend+band template, alarm log | Top 20 KPIs with auto CAPA trigger |
| W3 (61–90d) | Segmentation + MR integration | Executive pack, KR→KPI bridge | MR-visible trace: decision→KPI impact |
Risk–Priority Matrix
Rank backlog by impact (customer, quality, cost, compliance) and implementability (time, dependencies, data). Execute quick wins first. Queue high-impact but complex items for MR funding.
Change Control for KPIs
- Request → impact analysis → approval → build → test → release → training.
- Release notes must cover formula, bands, source, dashboard impact.
- Rollback plan and audit trail are mandatory.
Measurement Quality and Confidence Score
Assign a “measurement confidence score” per KPI, with components for data integrity, sampling adequacy, formula transparency, and on-time reporting. If the score falls below 0.8, down-weight that KPI as an input to MR decisions.
Audit Evidence Pack
- Maturity assessment form and scoring table.
- Dictionary samples with version history.
- Dashboard screenshots, freshness badges, alarm log.
- CAPA files with trend recovery graphs.
- MR minutes, decision log, target-version diffs.
Implementation Tip
Constrain scope for the first 90 days: drive Level 3 for the top 20 KPIs, then expand to Level 4 integration. Reserve Level 5 for strategic KPIs only.
