Rework Ratio Benchmarks 2026: Where Does Your Team Stand?
Updated 17 April 2026 -- incorporates DORA 2024, McKinsey DVI 2023, GitHub Octoverse 2024, and Capers Jones' most recent data
The most commonly cited benchmark pages for rework ratios are 2019-2022 articles that have not kept pace with DORA's 2024 data or the McKinsey Developer Velocity Index 2023. This page compiles the most recent published data from four of the most credible sources, presented side by side.
One important caveat before the data: all survey-based benchmarks have response bias. Organisations experiencing acute rework problems are less likely to complete surveys about engineering performance. The real median is probably worse than what these numbers show. Treat them as lower-bound estimates.
Multi-Study Benchmark Table
| Study | Elite | High | Medium | Low |
|---|---|---|---|---|
| DORA 2024 (change failure rate; 39,000+ respondents) | <5% | 5-10% | 10-15% | >30% |
| Capers Jones (defect removal efficiency; translates to escaped defect rates of <5%, 5-10%, 10-15%, >15%) | >95% DRE | 90-95% DRE | 85-90% DRE | <85% DRE |
| NIST 2002 (rework as % of dev effort; US software industry sample, 2002 baseline still widely cited) | <10% | 10-20% | 20-40% | >40% |
| McKinsey DVI 2023 (productivity percentile; rework reduction is one of five key velocity drivers) | Top quartile: 4-5x output | Second quartile | Third quartile | Bottom quartile: 1x |
DORA 2024: The Gold Standard for Rework Measurement
The DORA State of DevOps Report 2024 surveyed 39,000+ technology professionals globally, making it the largest annual study of software delivery performance. The most directly rework-relevant metric is change failure rate: the percentage of deployments that cause a service degradation requiring remediation.
| Tier | Change Failure Rate | Deployment Frequency |
|---|---|---|
| Elite | <5% | Multiple times per day |
| High | 5-10% | Between once per day and once per week |
| Medium | 10-15% | Between once per week and once per month |
| Low | >30% | Between once per month and once every 6 months |
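Change failure rate is simple to compute from deployment records. A minimal sketch, assuming a per-deployment `failed` flag in your own tracking data (the field name and the exact tier cutoffs are illustrative; DORA does not mandate a schema, and its published clusters leave a gap between 15% and 30%):

```python
# Change failure rate (CFR): share of deployments that caused a
# degradation requiring remediation (rollback, hotfix, patch).

def change_failure_rate(deployments):
    """deployments: list of dicts with a boolean 'failed' key."""
    if not deployments:
        raise ValueError("no deployments recorded")
    failures = sum(1 for d in deployments if d["failed"])
    return failures / len(deployments)

def dora_tier(cfr):
    """Map a CFR to the DORA 2024 tiers cited above. Boundary handling
    and the treatment of the 15-30% gap (lumped into 'Low' here) are
    this sketch's assumptions, not DORA's."""
    if cfr < 0.05:
        return "Elite"
    if cfr <= 0.10:
        return "High"
    if cfr <= 0.15:
        return "Medium"
    return "Low"

deploys = [{"failed": False}] * 18 + [{"failed": True}] * 2
cfr = change_failure_rate(deploys)  # 2 / 20 = 0.10
print(f"{cfr:.0%} -> {dora_tier(cfr)}")  # 10% -> High
```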
Capers Jones: Defect Removal Efficiency by Company Type
Capers Jones' Applied Software Measurement (2008) and subsequent updates track Defect Removal Efficiency (DRE) -- the percentage of all defects removed before release. DRE is the inverse of escape rate. A DRE of 85% means 15% of defects escape to production, each costing substantially more to fix than those caught earlier.
| Organisation Type | Typical DRE | Escape Rate | Notes |
|---|---|---|---|
| Aerospace / Defence (mandated reviews) | 95-99% | 1-5% | DO-178C compliance drives formal inspections |
| Enterprise software (best practices) | 90-95% | 5-10% | Strong QA function, phased testing |
| Average IT organisations | 82-88% | 12-18% | Most commercial software teams |
| Startups (pre-PMF stage) | 65-75% | 25-35% | Speed prioritised over defect elimination |
| Worst-quartile commercial | <65% | >35% | Typically no formal QA process |
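DRE and escape rate are complements, so both fall out of two defect counts. A sketch, under the assumption that post-release defect counts are eventually complete (in practice this requires months of production exposure):

```python
# Defect Removal Efficiency (DRE): share of all defects removed
# before release. Escape rate = 1 - DRE.

def defect_removal_efficiency(pre_release_defects, post_release_defects):
    total = pre_release_defects + post_release_defects
    if total == 0:
        raise ValueError("no defects recorded")
    return pre_release_defects / total

# Example: 170 defects caught pre-release, 30 escaped to production --
# the "average IT organisation" row above.
dre = defect_removal_efficiency(170, 30)
escape_rate = 1 - dre
print(f"DRE {dre:.0%}, escape rate {escape_rate:.0%}")  # DRE 85%, escape rate 15%
```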
Company-Size Adjustment
Larger engineering organisations have higher rework rates on average. The primary driver is coordination complexity: more teams means more handoffs, and more handoffs means more miscommunication-driven rework. The offset is that large organisations can invest in platform engineering, shared testing infrastructure, and formal processes that partially counteract this.
| Team Size | Typical Rework Rate | Elite Floor | Primary Driver |
|---|---|---|---|
| Small (<10 engineers) | 15-20% | 5-8% | Unclear requirements, insufficient review bandwidth |
| Medium (10-50 engineers) | 20-30% | 8-12% | Team handoff rework, async communication gaps |
| Large (50-200 engineers) | 25-35% | 10-15% | Coordination overhead, tech debt accumulation |
| Enterprise (200+) | 30-45% | 12-20% | Org complexity, legacy dependencies, process overhead |
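The size bands above can be encoded as a small lookup for self-assessment. This is an illustrative helper, not a published formula; the band edges are copied from the table, and the handling of exact boundaries (10, 50, 200 engineers) is this sketch's choice:

```python
# Map engineering headcount to the typical rework range and elite floor
# from the company-size table above. Rates are fractions (0.20 == 20%).

SIZE_BANDS = [
    # (exclusive upper bound, typical range, elite floor)
    (10,           (0.15, 0.20), (0.05, 0.08)),  # Small: <10
    (50,           (0.20, 0.30), (0.08, 0.12)),  # Medium: 10-50
    (200,          (0.25, 0.35), (0.10, 0.15)),  # Large: 50-200
    (float("inf"), (0.30, 0.45), (0.12, 0.20)),  # Enterprise: 200+
]

def rework_benchmark(engineers):
    for upper, typical, elite_floor in SIZE_BANDS:
        if engineers < upper:
            return {"typical": typical, "elite_floor": elite_floor}

print(rework_benchmark(35))  # medium band: typical 20-30%, elite floor 8-12%
```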
Industry Adjustment
Rework rates vary significantly by industry. The key drivers are regulatory compliance requirements (which force more rigorous defect prevention), deployment risk tolerance, and the cost structure of external failure (a bug in a banking system costs far more than a bug in a game).
| Industry | Typical Rework Rate | Notes |
|---|---|---|
| Fintech / Banking | 15-25% | Regulatory compliance drives formal testing; external failure cost is very high |
| SaaS / B2B Tools | 20-35% | Typical commercial range; quality varies widely by company maturity |
| Consumer Apps / Gaming | 25-40% | High feature velocity often means more rework; faster feedback loops help |
| Embedded / Safety-Critical | 5-15% | DO-178C and ISO 26262 compliance mandate formal defect prevention |
| Enterprise / Regulated | 25-45% | Legacy systems and coordination complexity increase rework rates |
| Open Source Projects | 10-20% | Distributed review culture provides strong defect detection, but asynchronous coordination adds rework |
The Self-Reporting Caveat
All benchmarks on this page derive from surveys and self-reported data. Teams with the highest rework rates are the least likely to track it formally and the least likely to complete industry surveys, which creates systematic underestimation in benchmark data. Capers Jones has noted this bias explicitly in several of his data sets. A reasonable heuristic: add 5-10 percentage points to any published benchmark average to get a realistic estimate of the true industry mean. The NIST 2002 figure (20-40% range) was derived from a mixed-method study that partially corrected for this bias, which is why it remains the most-cited estimate.
Frequently Asked Questions
What is the average rework rate in software development?
The most widely cited figure is 20-40% from NIST Planning Report 02-3 (2002). DORA 2024 medium-performer teams show change failure rates of 10-15% per deployment. Capers Jones data shows typical defect removal efficiency of 85%, meaning 15% of defects escape pre-release. All three figures are consistent with a 20-30% rework-time range for typical engineering organisations.
What is a good rework percentage for an engineering team?
Below 10% is DORA elite performance. Most teams that have invested seriously in testing infrastructure, spec quality, and code review achieve 10-15%. The 20-40% range is average for commercial software organisations. Teams above 40% are experiencing systemic process failures and should audit requirements, testing, and communication practices first.
How does team size affect rework rates?
Larger teams have higher rework rates due to coordination complexity. Capers Jones data shows small teams average 15-20% rework; large teams average 25-35%; enterprise teams can exceed 40%. Platform engineering investment and strong documentation practices partially offset this scaling penalty.
Sources
- Google DORA. State of DevOps Report 2024. DORA Research Program, 2024.
- Jones, C. Applied Software Measurement. 3rd ed. McGraw-Hill, 2008.
- Jones, C. Software Engineering Best Practices. McGraw-Hill, 2010.
- NIST Planning Report 02-3. The Economic Impacts of Inadequate Infrastructure for Software Testing. RTI International, 2002.
- McKinsey. Developer Velocity Index 2023. McKinsey Digital, 2023.
- GitHub. Octoverse 2024. GitHub, 2024.
- GitLab. Global DevSecOps Survey 2024. GitLab, 2024.