• Agile-centric: metrics are derived with the tenets of the Agile Manifesto
in mind, instead of bending traditional metrics to fit an Agile framework.
• Meaningful: provide data and metrics that allow one to track projects,
measure the success of the process, and gauge the quality of the product.
– Process Metrics: How well the Agile process is being applied and
managed across projects.
Optional Metrics
• Facetime
• Customer Expectation Baseline
Agile Product Metrics
• Reusability Factor X
• Reusability Factor Y
Sprint Effort Factor
Sprint Effort Factor measures the effort required in a Sprint cycle,
as the percentage of features targeted in the current sprint relative to
the time allotted to accomplish them.
Sprint Effort Factor =
[ (# of features in current Sprint) / (# of features in Release) / (# of weeks in Sprint) ] * 100
where,
– (# of features in current Sprint) =
[ (# of features in previous Sprint) + (# of change requests) ]
– (# of change requests) =
[ (# of new features added) - (# of features deliberately or consequentially
eliminated) ]
• Deliberate elimination means that the client removes that feature from the release
list.
• Consequential elimination means that a feature becomes redundant or unnecessary
as a result of some new feature being added or changed.
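The Sprint Effort Factor formulas above can be sketched in Python; the function and parameter names below are illustrative, not part of the RAMM definition.

```python
def change_requests(new_added: int, eliminated: int) -> int:
    """Net change requests: new features added minus features
    deliberately or consequentially eliminated."""
    return new_added - eliminated

def features_in_current_sprint(prev_sprint_features: int,
                               new_added: int, eliminated: int) -> int:
    """Features in the current sprint = previous sprint's features
    plus net change requests."""
    return prev_sprint_features + change_requests(new_added, eliminated)

def sprint_effort_factor(current_features: int,
                         release_features: int,
                         sprint_weeks: int) -> float:
    """Sprint Effort Factor as a percentage: features targeted in the
    current sprint versus the release, normalized by sprint length."""
    return current_features / release_features / sprint_weeks * 100

# Example: 8 features this sprint out of a 40-feature release,
# in a 2-week sprint.
print(sprint_effort_factor(8, 40, 2))  # 10.0
```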
Sprint Complexity
Measures the complexity of the current sprint as a function of the
number of interface points with previously developed parts of the
system and a subjective Subject Complexity Factor.
Sprint Complexity =
[ (Reusability Factor X) + (Other interface points) ] * (Subject Complexity Factor)
• Note that the use of (Subject Complexity Factor) is optional and it may not be used
in some projects. It is purely a subjective value assigned by the SMEs and the
Project Manager to factor in the inherent complexity of the subject matter being
implemented.
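The Sprint Complexity formula can be sketched as follows; defaulting the optional Subject Complexity Factor to 1.0 is my assumption for projects that choose not to use it.

```python
def sprint_complexity(reusability_x: float,
                      other_interface_points: int,
                      subject_complexity: float = 1.0) -> float:
    """Sprint Complexity = (Reusability Factor X + other interface
    points) * Subject Complexity Factor. The factor defaults to 1.0,
    which leaves the sum unscaled when the optional factor is unused."""
    return (reusability_x + other_interface_points) * subject_complexity

print(sprint_complexity(3, 5))       # factor unused: 8.0
print(sprint_complexity(3, 5, 1.5))  # with an SME-assigned factor: 12.0
```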
Sprint Complexity-Effort Matrix
• The sprint complexity-effort matrix gives the project
manager a clear indication of whether a proposed sprint is
trying to accomplish too many items or too few items.
• The Y-axis of the matrix plots the following for each sprint
(Sprint Complexity Factor * Sprint Effort Factor).
Degree of Change =
[ (# of new features added) + (# of existing features refined) + (# of features
deliberately or consequentially removed) ] / (# of features in Release (adjusted))
• Too low a number indicates that the client has not changed features through the
sprint which might indicate that the end customer may not be giving sufficient
feedback.
• Too high a number indicates that the initial requirements need to be revisited
since they are continuously being changed.
The number should be neither too high nor too low.
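A minimal sketch of the Degree of Change ratio; the argument names are illustrative.

```python
def degree_of_change(added: int, refined: int, removed: int,
                     release_features_adjusted: int) -> float:
    """Degree of Change: features added, refined, or removed during the
    sprint, as a fraction of the adjusted release feature count."""
    return (added + refined + removed) / release_features_adjusted

# Example: 2 added, 3 refined, 1 removed, against a 30-feature
# adjusted release list.
print(degree_of_change(2, 3, 1, 30))  # 0.2
```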
Reusability Factor X
Reusability Factor X measures how well the product is being
developed.
A higher number indicates that the system is well designed and the
architect has identified patterns and functionalities that can be
reused. This also indicates a modular approach and development of
system libraries which in turn reduces the amount of rework that
would need to be done in the later sprints.
The number is typically on the higher side, especially during the initial
Sprint cycles of a Release, when the base modules of the system are being
developed and the base framework of reusable components is being
identified. The number should drop significantly during the later
Sprints of a release.
Reusability Factor Y
Reusability Factor Y complements Reusability
Factor X in that it measures how much of the
components from the libraries created in
previous Sprints are reused.
Facetime
A higher number indicates greater interaction between the associated members and
groups. The graph gives a clear indication of the cohesiveness of the various
stakeholders.
• This simple metric plots the difference between the minimal feature
set and the actual features delivered at the end of the sprint. Note
that it is imperative that all the minimal features be met; even if
additional features are completed in the sprint, if they are not part
of the identified minimal feature set, the metric goes into the
negative.
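One possible reading of this metric, sketched in Python: extra features outside the minimal set do not compensate for missed minimal features, so any missed minimal feature drives the value negative. The set-based encoding is an assumption for illustration.

```python
def minimal_feature_delta(minimal_set, delivered):
    """Delivered minimal features minus the size of the minimal set.
    Extras outside the minimal set are ignored, so the result is 0 when
    every minimal feature ships and negative when any is missed."""
    delivered_minimal = set(minimal_set) & set(delivered)
    return len(delivered_minimal) - len(set(minimal_set))

# One minimal feature missed ('c'), one extra delivered ('x'):
print(minimal_feature_delta({"a", "b", "c"}, {"a", "b", "x"}))  # -1
```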
• Burnup Chart
• Burndown Chart
Note that the Agile Project Metrics are adapted from the AgileEVM set of
metrics, currently the most widely used and recognized suite of
metrics for Agile projects.
Agile Project Metrics
AgileEV =
[ (Actual # of features completed in sprint)
+ ∑ (# of features completed in previous sprints – modified features) ]
/ [ Total # of features in the release ]
AgilePV =
[ (Planned # of features completed in sprint)
+ ∑ (# of features completed in previous sprints – modified features) ]
/ [ Total # of features in the release (adjusted) ]
AgileAC =
Actual Cost incurred at end of the Sprint.
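The AgileEV and AgilePV formulas can be sketched as below. Representing each previous sprint as a (completed, modified) pair is my own encoding for illustration, not part of the RAMM or AgileEVM definitions.

```python
def agile_ev(actual_completed: int, previous_sprints, release_total: int) -> float:
    """AgileEV: features actually completed this sprint, plus features
    completed in previous sprints net of later modifications, as a
    fraction of the total release.
    previous_sprints: iterable of (completed, modified) pairs."""
    carried = sum(done - modified for done, modified in previous_sprints)
    return (actual_completed + carried) / release_total

def agile_pv(planned_completed: int, previous_sprints, release_total_adjusted: int) -> float:
    """AgilePV: same carried total, but using the planned feature count
    for the current sprint and the adjusted release total."""
    carried = sum(done - modified for done, modified in previous_sprints)
    return (planned_completed + carried) / release_total_adjusted

# Example: 4 features done this sprint against a plan of 6; two earlier
# sprints delivered 5 and 6 features, of which 1 was later modified.
print(agile_ev(4, [(5, 1), (6, 0)], 40))  # 0.35
print(agile_pv(6, [(5, 1), (6, 0)], 40))  # 0.4
```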
Agile Project Metrics
• AgileSPI = AgileEV/AgilePV
– The Agile Schedule Performance Index is similar to the SPI
proposed in the AgileEVM metrics.
– It is meant to provide a clear snapshot of the Schedule
variance in Agile Projects.
– A value of < 1 indicates that the project is currently behind
schedule
• AgileCPI = AgileEV/AgileAC
– The Agile Cost Performance Index is similar to the CPI
proposed in the AgileEVM metrics.
– A value of < 1 indicates that the project is currently over
budget
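Following the index definitions above literally, both ratios are one-liners; the function names are illustrative.

```python
def agile_spi(agile_ev: float, agile_pv: float) -> float:
    """AgileSPI = AgileEV / AgilePV; a value below 1 signals that the
    project is behind schedule."""
    return agile_ev / agile_pv

def agile_cpi(agile_ev: float, agile_ac: float) -> float:
    """AgileCPI = AgileEV / AgileAC; a value below 1 signals that the
    project is over budget."""
    return agile_ev / agile_ac

print(agile_spi(3.0, 4.0))  # 0.75 -> behind schedule
print(agile_cpi(3.0, 4.0))  # 0.75 -> over budget
```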
Agile Project Metrics
• BurnUp Chart
A BurnUp chart is a simple project metric that plots
the number of features accomplished to date.
• BurnDown Chart
BurnDown charts are classic sprint-tracking
charts that provide a snapshot of the features
that have been completed or are being worked on
in the current sprint.
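Both chart series can be derived from per-sprint completion counts; this sketch assumes feature counts as the unit of work.

```python
from itertools import accumulate

def burnup(completed_per_sprint):
    """Cumulative features completed to date, one point per sprint."""
    return list(accumulate(completed_per_sprint))

def burndown(total_features, completed_per_sprint):
    """Features remaining after each sprint."""
    return [total_features - done for done in accumulate(completed_per_sprint)]

# Three sprints completing 3, 5, and 4 features of a 20-feature release.
print(burnup([3, 5, 4]))        # [3, 8, 12]
print(burndown(20, [3, 5, 4]))  # [17, 12, 8]
```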
Afterword
• The RAMM Agile Metrics have been used in over two
dozen projects within different organizations with a high
degree of success measured in terms of customer
satisfaction, product quality improvement and
organizational process improvement.