Sudipto Ghosh
CS 406 Fall 99
November 30, 1999
Learning objectives
Software metrics
Metrics for various phases
Why metrics are needed
How to collect metrics
How to use metrics
11/30/99
CS 406 Testing
Questions
How big is the program?
Huge!!
Can you, as a manager, make any useful decisions from such subjective information? You need information like cost, effort, and the size of the project.
Metrics
Quantifiable measures that can be used to measure characteristics of a software system or of the software development process
Required in all phases
Required for effective management
Managers need quantifiable information, not subjective information
(Subjective information goes against the fundamental goal of engineering)
Process metrics
Quantify characteristics of the process being used to develop the software
Example: efficiency of fault detection
CMM
Level 4: Managed level
Process measurement performed
Quality and productivity goals set
Continually measured, and corrective actions taken
Statistical quality controls in place
Issues [1]
Cost of collecting metrics
Automation is less costly than manual methods, but a CASE tool may not be free
Development cost of the tool
Extra execution time for collecting metrics
Validity of metrics
Does the metric really measure what it should? What exactly should be measured?
Issues [2]
Selection of metrics for measurement
Hundreds are available, each with some cost
Basic metrics
Size (like LOC)
Cost (in dollars)
Duration (months)
Effort (person-months)
Quality (number of faults detected)
Selection of metrics
Identify problems from the basic metrics
Example: high fault rates during the coding phase
Introduce a strategy to correct the problems
To monitor its success, collect more detailed metrics
Example: fault rates of individual programmers
Utility of metrics
LOC
Measure the size of the product at regular intervals to find out how fast the project is growing
Applicability of metrics
Throughout the software process, like
effort in person-months
staff turnover
cost
Specific to a phase
LOC
# defects detected per hour of reviewing specifications
Metrics: planning
When can we plan the entire software project?
At the very beginning?
After a rapid prototype is made?
After the requirements phase?
After the specifications are ready?
Metrics: planning
[Figure: graph of cost estimate]
Problems with:
underestimation (possible loss by the developer)
overestimation (the client may offer the bid to someone else)
Cost:
internal (salaries of personnel, overheads)
external (usually cost + profit)
Cost estimation
Other factors:
desperate for work: charge less
the client may think low cost => low quality, so raise the amount
Size of product
Problems
Creation of code is only a part of the total effort
Effect of using different languages on LOC
How should one count LOC?
Executable lines of code? Data definitions? Comments? What are the pros and cons?
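A small sketch of why the counting rules matter: a naive counter gives different answers depending on which lines it includes. The sample file below is made up for illustration.

```python
# Count physical lines, blank lines, and full-line comments in a
# made-up sample file. Note the trailing comment on the TAX_RATE
# line and the docstring are NOT counted as comments by this naive
# counter, which is exactly the kind of ambiguity the slide raises.
sample = '''\
# compute the total price
TAX_RATE = 0.08          # data definition

def total(price):
    """Return price including tax."""
    return price * (1 + TAX_RATE)
'''

lines = sample.splitlines()
physical = len(lines)
blank = sum(1 for l in lines if not l.strip())
comment = sum(1 for l in lines if l.strip().startswith("#"))
print(physical, blank, comment)  # prints: 6 1 1
```

Depending on which categories count as "LOC", the same file measures anywhere from 3 to 6 lines.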
What if you are using a code generator? Early on, you can only estimate the lines of code, so the cost estimate is based on another estimated quantity!
Based on statistics
a large number of projects are studied
Hybrid models
combine mathematical models, statistics, and expert judgement
COCOMO
COnstructive COst MOdel
A series of three models:
Basic: macro-estimation model
Intermediate COCOMO
Detailed: micro-estimation model
Estimates total effort in terms of person-months
Costs of development, management, and support tasks are included
Secretarial staff not included
Intermediate COCOMO
Obtain an initial estimate (nominal estimate) of the development effort from the estimated KDSI (thousands of delivered source instructions)

Nominal effort = a × (KDSI)^b person-months

System          a     b
Organic         3.2   1.05
Semi-detached   3.0   1.12
Embedded        2.8   1.20
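The nominal-effort formula and table above can be sketched directly; the 10 KDSI input below is an arbitrary illustration, not part of the slides.

```python
# Intermediate COCOMO nominal effort: a * (KDSI)^b person-months,
# with (a, b) taken from the table above.
COEFFS = {
    "organic":       (3.2, 1.05),
    "semi-detached": (3.0, 1.12),
    "embedded":      (2.8, 1.20),
}

def nominal_effort(kdsi, system):
    """Return the nominal effort in person-months for a system type."""
    a, b = COEFFS[system]
    return a * kdsi ** b

# Illustrative comparison for a hypothetical 10 KDSI product:
for kind in COEFFS:
    print(kind, round(nominal_effort(10.0, kind), 1))
```

Note how the embedded coefficients penalize the same size far more heavily than the organic ones.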
Kinds of systems
Organic
The organization has considerable experience in that area
Requirements are less stringent
Small teams
Examples: simple business systems, data-processing systems
Semi-detached
New operating system
Database management system
Complex inventory management system
Kinds of systems
Embedded
Ambitious, novel projects
The organization has little experience
Stringent requirements for interfacing and reliability
Tight constraints from the environment
Examples: embedded avionics systems, real-time command systems
Adjust the effort estimate by multiplying the initial estimate by all the multiplying factors
The model also gives a phase-wise distribution of effort
COCOMO example
System for office automation Four major modules
data entry:        0.6 KDSI
data update:       0.6 KDSI
data query:        0.8 KDSI
report generator:  1.0 KDSI
Total:             3.0 KDSI

Nominal effort = 3.2 × (3.0)^1.05 = 10.14 PM (organic system)
EAF = 1.15 × 1.06 × 1.13 × 1.17 = 1.61
Adjusted effort = 1.61 × 10.14 = 16.3 PM
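The worked example above can be checked numerically; the figures below come straight from the slide.

```python
# Recompute the office-automation example: a 3.0 KDSI organic system.
nominal = 3.2 * 3.0 ** 1.05        # nominal effort, person-months
eaf = 1.15 * 1.06 * 1.13 * 1.17    # effort adjustment factor
adjusted = eaf * nominal           # adjusted effort, person-months
print(round(nominal, 2), round(eaf, 2), round(adjusted, 1))
```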
Tentative information
a process in a DFD may be broken down later into different modules
a number of processes may constitute one module
Cyclomatic complexity
Number of binary decisions + 1 (the number of branches in a module)
Proposed by McCabe
The lower the value, the better
Measures only control complexity, not data complexity
For OO, cyclomatic complexity is usually low because methods are mostly small
Also, the data component is important for OO, but it is ignored by cyclomatic complexity
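A minimal sketch of computing McCabe's number for a single function, assuming it is counted as binary decisions + 1; the helper below is illustrative, not a full implementation.

```python
import ast

def cyclomatic_complexity(source: str) -> int:
    """Approximate McCabe's cyclomatic complexity of Python source.

    Each if/while/for (and conditional expression) counts as one
    binary decision; each extra operand of a boolean and/or chain
    adds one more. Complexity = decisions + 1.
    """
    decisions = 0
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, (ast.If, ast.While, ast.For, ast.IfExp)):
            decisions += 1
        elif isinstance(node, ast.BoolOp):
            decisions += len(node.values) - 1
    return decisions + 1

code = """
def classify(x):
    if x < 0:
        return "negative"
    if x == 0:
        return "zero"
    return "positive"
"""
print(cyclomatic_complexity(code))  # two if-statements -> 3
```

Note that the counter looks only at control flow, which is exactly the limitation the slide points out for OO code.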
Fan-out of a module:
number of flows out of the module plus the number of data structures updated by the module
Measure of complexity:
length × (fan-in × fan-out)²
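The measure above reduces to a one-line function; the module length and fan counts below are made-up illustrations.

```python
# Structural complexity = length * (fan_in * fan_out)^2,
# as quoted on the slide.
def structural_complexity(length, fan_in, fan_out):
    return length * (fan_in * fan_out) ** 2

# Hypothetical module: 100 lines, fan-in 2, fan-out 3.
print(structural_complexity(100, 2, 3))  # 100 * 6^2 = 3600
```

The squared term means connectivity dominates: doubling the fan counts quadruples the penalty, while doubling the length only doubles it.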
OO design metrics
Assumption: The effort in developing a class is determined by the number of methods. Hence the overall complexity of a class can be measured as a function of the complexity of its methods.
Proposal: Weighted Methods per class (WMC)
WMC
Let class C have methods M1, M2, ..., Mn. Let ci denote the complexity of method Mi.

WMC = Σ (i = 1 to n) ci
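The definition above can be sketched directly; the class and per-method weights below are made-up illustrations. Assigning every method complexity 1 reduces WMC to a simple method count.

```python
# WMC = sum of the complexities of a class's methods.
# Hypothetical class with illustrative per-method weights:
method_complexities = {"push": 1, "pop": 1, "resize": 3}
wmc = sum(method_complexities.values())
print(wmc)  # 1 + 1 + 3 = 5
```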
WMC validation
Most classes tend to have a small number of methods, are simple, and provide some specific abstraction and operations. The WMC metric has a reasonable correlation with the fault-proneness of a class.
DIT (depth of inheritance tree) evaluation
Basili et al. study, 1995. Chidamber and Kemerer study, 1994.
Most classes tend to be close to the root
The maximum DIT value found was 10
Most classes have DIT = 0
DIT is significant in predicting the error-proneness of a class: higher DIT leads to higher error-proneness
Statistics-based testing:
zero-failure technique
Metrics: inspections
Purpose: measure the effectiveness of inspections
The measurements may also reflect deficiencies of the development team or the quality of the code
References
Textbook
S. R. Schach, Classical and Object-Oriented Software Engineering (see "metrics" in the index)
Other books
P. Jalote, An Integrated Approach to Software Engineering (see "metrics" in the index)