Current Defect Density Statistics

May 1, 2017 | Author: Elijah Simmons



Current Defect Density Statistics
Ann Marie Neufelder
Copyright SoftRel, LLC 2007. You may not reprint or copy any part of this presentation without express and written consent from Ann Marie Neufelder.

Actual fielded defect density from 90+ projects spanning nearly every industry

[Figure: Average delivered normalized defect density (0 to 3 defects/KSLOC) versus percentile group (0 to 1), showing seven defect density clusters: World class, Very good, Good, Average, Fair, Poor, and Ugly.]

This data is in terms of fielded (escaped) defects per 1,000 lines of effective code normalized to assembler. Seven "clusters" are visible. A method to predict which cluster your project will fall into was developed from this data.

How to determine normalized effective size
- Predict/count new and modified lines of code
- Predict/count deleted lines
- Multiply existing but unchanged code by 10%
- Entire functions deleted reduce the existing size
- Effective size = modified + new + deleted lines + (10% of existing unchanged code)

Multiply the effective size by a conversion ratio to assembler, using industry tables as summarized below:
- Second generation (C, Fortran): 3
- Object oriented (Java, C++, Ada 9x): 6
- Visual Basic: 10
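The sizing steps above can be sketched as a small calculation (a minimal illustration; the function name and call signature are ours, not SoftRel's):

```python
# Conversion ratios to assembler, summarized from the industry table above.
ASSEMBLER_RATIO = {
    "second_generation": 3,   # C, Fortran
    "object_oriented": 6,     # Java, C++, Ada 9x
    "visual_basic": 10,
}

def normalized_effective_size(new, modified, deleted, existing_unchanged, language):
    """Effective size = new + modified + deleted + 10% of existing unchanged code,
    then converted to assembler-equivalent lines."""
    effective = new + modified + deleted + 0.10 * existing_unchanged
    return effective * ASSEMBLER_RATIO[language]

# Example: 5,000 new + 2,000 modified + 500 deleted lines of C,
# on top of 30,000 existing but unchanged lines.
size = normalized_effective_size(5000, 2000, 500, 30000, "second_generation")
print(size)  # 31500.0 assembler-equivalent lines
```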

Use the SoftRel Survey to predict the best cluster. The appropriate "cluster" is determined by:
- Inherent stability of the existing design and code
- Methods and techniques used to prevent defects and develop software
- Application type
- Existence of major obstacles (new technology, new environments, etc.)
- Existence of major opportunities (end-user domain experts available to the project, etc.)
- Inherent stability of the development process

Process alone will not guarantee a world-class cluster! An SEI CMM level 1 organization can be world class, and an SEI CMM level 4 or 5 does not guarantee world class.


How the survey score maps to the clusters

[Figure: SoftRel survey score (0 to 60) versus percentile group (0 to 1) for the defect density clusters, from World class at the highest scores down through Very good, Good, Average, Fair, Poor, and Ugly.]

The most variation exists in the world-class cluster; however, this cluster is easily predictable because of the absence of major obstacles and the presence of major opportunities, as shown next.

The World Class Cluster had no major obstacles

[Figure: Number of major project obstacles (0 to 6) versus percentile group for each defect density cluster; the World class cluster sits at zero obstacles.]

Obstacles are defined specifically as: new technology, new operating system, new development environment, new compiler, new target hardware.

The “Ugly” group had no opportunities

[Figure: Number of major project opportunities (0 to 8) versus percentile group for each defect density cluster; the "Ugly" cluster sits at zero opportunities.]

Opportunity is explicitly defined as the degree to which end-user domain experts are available to the software engineers on the project.

Average defect density by system type

| System application type | Average fielded defect density | Average testing defect density | Ratio of test to field |
|---|---|---|---|
| Command and control | 0.106 | 0.180 | 1.7 |
| Command, control and communications | 0.011 | 0.366 | 33.9 |
| Military ground vehicle | 0.106 | | |
| Satellite | 0.087 | 0.358 | 4.1 |
| Large stationary capital equipment | 0.649 | 2.495 | 3.8 |
| Small devices | 0.202 | 4.787 | 23.7 |
| GPS | 0.134 | n/a | n/a |
| Power systems | n/a | 1.0925 | n/a |
| No special target hardware | 0.123 | 0.448 | 3.6 |
| Total/average | 0.414 | 2.104 | 5.1 |
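A density from the table above can be turned into an expected count of escaped defects by multiplying it by normalized size in KSLOC (a rough illustration; the function name is ours, and the densities are per 1,000 assembler-equivalent lines as defined earlier):

```python
def expected_fielded_defects(density_per_ksloc, normalized_lines):
    """Expected escaped defects = density (per 1,000 normalized lines) * size in KSLOC."""
    return density_per_ksloc * (normalized_lines / 1000.0)

# Example: a command-and-control system (0.106 fielded defects/KSLOC from the table)
# with 50,000 assembler-equivalent lines.
print(expected_fielded_defects(0.106, 50000))  # about 5.3 expected escaped defects
```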

Average defect density by software type

| Software application type | Average fielded defect density | Average testing defect density | Ratio of test to field |
|---|---|---|---|
| Wireless capabilities | 0.165 | 3.092 | 18.7 |
| Biometrics | 0.400 | 1.290 | 3.2 |
| Domain knowledge can be acquired via public domain in a short period of time | 0.068 | n/a | n/a |
| Client server | 0.108 | 0.434 | 4.0 |
| Real time | 0.476 | 2.172 | 4.6 |
| Multi-tasking | 0.449 | 2.104 | 4.7 |
| DB interfaces | 0.456 | 1.459 | 3.2 |
| Mathematically intensive | 0.430 | 1.513 | 3.5 |
| Web based | 0.008 | 0.091 | 11.1 |
| Target HW is new or evolving | 0.642 | 2.588 | 4.0 |
| Application process evolving | 0.378 | 0.278 | 0.7 |

This is the same set of data sliced a different way.

Average defect density by risk level

| Risk level | Average fielded defect density | Average testing defect density | Ratio of test to field |
|---|---|---|---|
| Safety risk (occupational, regional, national or global) | 0.509 | 2.608 | 5.1 |
| Legal risks (banking, etc.) | 0.169 | 4.539 | 26.9 |
| Monetary risks (loss of product with monetary value) | 0.476 | 1.535 | 3.2 |
| Recall risk | 0.230 | 3.158 | 13.7 |
| Government regulated | 0.141 | 3.165 | 22.5 |

This is the same set of data sliced a different way.

You can also predict the risk of a late delivery

| Percentile group | Ratio of testing to fielded defect density | Normalized fielded defect density (ave) | Min | Max | Std dev | Probability of a late delivery (%) | Margin of error when delivery is late (%) |
|---|---|---|---|---|---|---|---|
| World Class | 8.5 | 0.011 | 0.0055 | 0.0180 | 0.006 | 10 | 17.5 |
| Very Good | 12.4 | 0.060 | 0.0396 | 0.0756 | 0.0172 | 20 | 25 |
| Good | 10.7 | 0.112 | 0.0888 | 0.135 | 0.0169 | 25 | 25 |
| Average | 10.6 | 0.250 | 0.180 | 0.366 | 0.0590 | 36 | 41 |
| Fair | 2.1 | 0.618 | 0.400 | 0.835 | 0.177 | 85 | 125 |
| Poor | 16.1 | 1.111 | 1.0357 | 1.224 | 0.081 | 100 | 100 |
| Ugly | 0.5 | 2.069 | 1.743 | 2.674 | 0.524 | 83 | 75 |

Probability of late delivery: if your organization makes 10 releases and the probability of being late is 10%, then 1 out of 10 will be late. Margin of error: measured as a percentage of the original schedule prediction.
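The two definitions above can be combined into a quick expectation check (a minimal sketch; the function names and the 40-week plan are ours):

```python
def expected_late_releases(num_releases, prob_late_pct):
    """If each release independently has a prob_late_pct chance of being late,
    this is the expected number of late releases."""
    return num_releases * prob_late_pct / 100.0

def late_schedule_estimate(planned_weeks, margin_of_error_pct):
    """When a delivery is late, the margin of error is measured as a
    percentage of the original schedule prediction."""
    return planned_weeks * (1 + margin_of_error_pct / 100.0)

# Example: a 'World Class' project (10% probability late, 17.5% margin of error).
print(expected_late_releases(10, 10))    # 1 of 10 releases expected late
print(late_schedule_estimate(40, 17.5))  # a 40-week plan slips to about 47 weeks
```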


How to predict your cluster: answer a survey based on
- Risks
- Product characteristics
- Application type
- Resources
- Practices, techniques and methods
- Process stability

Determine a baseline cluster for your "typical" project. Each additional project-specific obstacle lowers the cluster, while adding domain expertise raises the cluster.
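The adjustment rule above can be sketched as index arithmetic over the ordered clusters (purely illustrative; SoftRel's actual survey scoring is not given here, and the one-step-per-obstacle weighting is our assumption):

```python
# Clusters ordered from worst to best, as in the charts above.
CLUSTERS = ["Ugly", "Poor", "Fair", "Average", "Good", "Very good", "World class"]

def adjust_cluster(baseline, obstacles, opportunities):
    """Each additional obstacle lowers the cluster one step; each added unit of
    domain expertise (opportunity) raises it one step, clamped to the scale."""
    i = CLUSTERS.index(baseline) - obstacles + opportunities
    return CLUSTERS[max(0, min(i, len(CLUSTERS) - 1))]

# Example: a typically 'Good' organization taking on two obstacles
# (new target hardware, new compiler) and adding one domain expert.
print(adjust_cluster("Good", obstacles=2, opportunities=1))  # Average
```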

Lessons Learned

Risks cannot be overcome by any of the following:
- New, expensive automated tools that theoretically speed up development (these actually increase the risk level for the first project that uses them)
- Wishful thinking

Risks can be minimized by:
- More granular milestones
- Addressing high-risk items before everything else in the schedule (software engineers tend to work on the low-risk tasks first)
- Design prototyping when the design is a risk
- Requirements prototyping when end-user requirements are volatile
- Defect prevention techniques such as formal unit testing
- Increasing the end-user domain knowledge of the team (this does not mean software experience; it means application experience)

