
Agile Metrics

A seminal approach for calculating metrics in Agile Projects

Overview, analysis and detailed description of a proposed set of comprehensive metrics for Agile Projects.

Ramm’s Agile Metrics
Rationale for Agile Metrics
• Organizations today increasingly recognize the advantages and benefits of agile project management.

• While most large corporations see definite advantages in using the agile approach to project development, organizations are lost when it comes to a well-defined set of metrics that can be applied to these Agile Projects.
Rationale for Agile Metrics
The agile metrics in use today suffer from a variety of shortcomings and drawbacks.

• A common issue with the current set of agile metrics is that they tend to mix up project and process metrics.

• Proposed metrics are fragmented across different authors, with no clear presentation of a comprehensive metrics suite.

• Most of these metrics are not agile-centric but adaptations of traditional metrics.
Criteria for Comprehensive Agile Metrics
The proposed set of Comprehensive Agile Metrics satisfies the following criteria:

• Practical: easy to use and implement in a project, and easy to understand and present to project sponsors and upper management.

• Relevant to Agile: not adapted or reworked from another method.

• Agile-centric: metrics are derived by keeping the tenets of the Agile Manifesto in mind, instead of bending traditional metrics to fit the Agile framework.

• Meaningful: provide data and metrics that allow one to track projects, measure the success of the process and the quality of the product.

• Comprehensive: addresses all aspects of agile methodology, covering project management, product and process.
The Agile Metrics
• RAMM’s Agile Metrics are built on the tenets of Agile project management principles.

• The proposed Agile metrics learn from the shortcomings of previous metrics, improve on their drawbacks, collate the best ideas from those already proposed, and combine them with an agile-centric approach.

• The aim is for them to become an industry standard for metrics in Agile projects.
The Agile Metrics
These metrics are divided into separate metrics for projects, process and product. They allow organizations not simply to measure snapshots and status of individual projects, but also to measure and improve their project processes and to satisfy customers and stakeholders by measurably improving the quality of the product.

– Process Metrics: how well the Agile process is being applied and managed across projects.

– Project Metrics: help in tracking the progress of the project; primarily indicators of project cost and schedule variance.

– Product Metrics: how good the inherent quality of the developed product is, and how the quality of the product can be improved.
The Agile Metrics
• The process metrics are meant for higher management and indicate how well the Agile processes are being run across the projects within an organization. They give clear indications of the pain points and track the areas of improvement within the Agile process.

• The process metrics are also helpful to the project manager in understanding what can be done better and in improving the agile processes applied to their project.

• Product metrics give a good indication of the quality of the system being developed.

Note that the concepts of Cyclomatic Complexity and KLOC are outdated and have not evolved over the last decade to incorporate the rapid evolution in project management methodologies.
Agile Process Metrics
• Sprint Effort Factor
• Sprint Complexity Factor
• Sprint Complexity-Effort Matrix
• Degree of Change

Optional Metrics
• Facetime
• Customer Expectation Baseline
Agile Product Metrics
• Reusability Factor X
• Reusability Factor Y
Sprint Effort Factor
Sprint Effort Factor measures the effort required in a Sprint cycle as the percentage of features targeted in the current sprint versus the time allotted to accomplish these items.

Sprint Effort Factor =
[ (# of features in current Sprint) / (# of features in Release) / (# of weeks in Sprint) ] * 100

where,
– (# of features in current Sprint) =
  [ (# of features in previous sprint) + (# of change requests) ]

– (# of change requests) =
  [ (# of new features added) - (# of features deliberately or consequentially eliminated) ]

• Deliberate elimination means that the client removes the feature from the release list.
• Consequential elimination means that a feature becomes redundant and unnecessary as a result of some new feature being added or changed.
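A minimal calculation sketch in Python, assuming plain integer feature counts; the function and argument names are illustrative and not part of the metric definition.

```python
def sprint_effort_factor(features_prev_sprint: int,
                         new_features_added: int,
                         features_eliminated: int,
                         features_in_release: int,
                         weeks_in_sprint: int) -> float:
    """Sprint Effort Factor as defined above, expressed as a percentage."""
    change_requests = new_features_added - features_eliminated
    features_current_sprint = features_prev_sprint + change_requests
    return (features_current_sprint / features_in_release / weeks_in_sprint) * 100

# Example: 8 features carried from the previous sprint, 3 added, 1 eliminated,
# 40 features in the release, 2-week sprint.
print(sprint_effort_factor(8, 3, 1, 40, 2))  # -> 12.5
```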
Sprint Complexity
Measures the complexity of the current sprint as a function of the number of interface points with parts of the system previously developed and a subjective Subject Complexity Factor.

Sprint Complexity =
[ (Reusability Factor X) + (# of other interface points) ] * (Subject Complexity Factor)

• The Subject Complexity Factor attempts to capture the inherent complexity of the subject matter being implemented. It may be assigned a value between 0 and 1, with 1 corresponding to a Sprint cycle that deals with the most complex subject matter.

• Note that the use of the Subject Complexity Factor is optional and it may not be used in some projects. It is a purely subjective value assigned by the SMEs and the Project Manager to factor in the inherent complexity of the subject matter being implemented.
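A short sketch of the calculation, assuming the factor defaults to 1.0 when the optional Subject Complexity Factor is not used; names are illustrative.

```python
def sprint_complexity(reusability_factor_x: int,
                      other_interface_points: int,
                      subject_complexity_factor: float = 1.0) -> float:
    """Sprint Complexity = (interface points) * optional Subject Complexity Factor."""
    if not 0 < subject_complexity_factor <= 1:
        raise ValueError("Subject Complexity Factor should lie between 0 and 1")
    return (reusability_factor_x + other_interface_points) * subject_complexity_factor

# Example: 4 system-library components are touched, plus 3 other interface
# points, with moderately complex subject matter.
print(sprint_complexity(4, 3, 0.6))  # -> 4.2
```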
Sprint Complexity-Effort Matrix
• The sprint complexity-effort matrix gives the project manager a clear indication of whether a proposed sprint is trying to accomplish too many items or too few items.

• The Y-axis of the matrix plots (Sprint Complexity Factor * Sprint Effort Factor) for each sprint.

• The X-axis of the matrix plots the durations of the sprints.

• The Project Manager’s objective should be to aim for an even distribution of the sprint points in the green area of the graph.
Sprint Complexity-Effort Matrix
The sprint complexity-effort matrix gives the project manager a clear indication of whether a proposed sprint is trying to accomplish too many items or too few items.

• If the sprint lies in the Red area, too much is being attempted in too short a period of time, which may lead to team stress and missed deadlines.

• If the sprint lies in the Yellow area, the sprint has a lot of lag built in and too few items are being accomplished.

• The objective of the project manager is to keep the complexity-effort value in the Green area, which indicates a good balance between the tasks intended in the Sprint and the time allotted to accomplish those tasks. (A plotting sketch follows below.)
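A minimal plotting sketch with matplotlib, assuming the per-sprint values have already been computed. The Red/Yellow/Green boundaries are not specified in the source, so the horizontal bands and all data points below are illustrative placeholders only.

```python
import matplotlib.pyplot as plt

# Illustrative per-sprint data: sprint duration (X) and
# Sprint Complexity Factor * Sprint Effort Factor (Y).
durations = [2, 2, 3, 2, 4]
complexity_effort = [35, 80, 50, 15, 45]

fig, ax = plt.subplots()
# Placeholder bands: Yellow below 20, Green from 20 to 60, Red above 60.
ax.axhspan(0, 20, color="yellow", alpha=0.3)
ax.axhspan(20, 60, color="green", alpha=0.3)
ax.axhspan(60, 100, color="red", alpha=0.3)
ax.scatter(durations, complexity_effort, color="black")
ax.set_xlabel("Sprint duration (weeks)")
ax.set_ylabel("Sprint Complexity Factor * Sprint Effort Factor")
ax.set_title("Sprint Complexity-Effort Matrix")
plt.show()
```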
Degree of Change
Measures and tracks requirements refinements made by the client at the end of each Sprint. It is an indicator and measure of fine-tuning of the requirements to create a product that better meets the customer's expectations.

Degree of Change =
[ (# of new features added) + (# of existing features refined) + (# of features deliberately or consequentially removed) ] / (# of features in Release (adjusted))

• Too low a number indicates that the client has not changed features through the sprint, which might mean that the end customer is not giving sufficient feedback.

• Too high a number indicates that the initial requirements need to be revisited, since they are continuously being changed.

The number should be neither too high nor too low.
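A minimal calculation sketch; the argument names are illustrative.

```python
def degree_of_change(new_features: int,
                     refined_features: int,
                     removed_features: int,
                     features_in_release_adjusted: int) -> float:
    """Degree of Change at the end of a sprint, as defined above."""
    return (new_features + refined_features + removed_features) / features_in_release_adjusted

# Example: 2 new, 3 refined and 1 removed feature against an adjusted
# release backlog of 40 features.
print(degree_of_change(2, 3, 1, 40))  # -> 0.15
```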
Reusability Factor X
Reusability Factor X measures how well the product is being developed.
A higher number indicates that the system is well designed and that the architect has identified patterns and functionalities that can be reused. This also indicates a modular approach and the development of system libraries, which in turn reduces the amount of rework that would need to be done in later sprints.

Reusability Factor X = (# of components added to the system library)

The number is typically on the higher side, especially during the initial Sprint cycles of a Release when the base modules of the system are being developed and the base framework of reusable components is being identified. The number should drop significantly during the later Sprints of a release.
Reusability Factor Y
Reusability Factor Y complements Reusability Factor X in that it measures how much of the component libraries created in previous Sprints is being reused.

Reusability Factor Y = (# of components reused from the system library)

The number will follow the reverse trend of its counterpart: it will be significantly lower during the initial Sprints of a Release, but will progressively rise in the later sprints as more and more components developed in the earlier sprints are reused.
Relationship between
Reusability Factor X and Reusability Factor Y
The following graph shows the relationship between Reusability Factor X and Reusability Factor Y. Note that, as previously mentioned, as Reusability Factor X drops its counterpart increases. Similarly, Reusability Factor Y is higher in the later sprints than in the initial Sprints.
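Since the original chart is not reproduced here, the sketch below plots illustrative (made-up) per-sprint counts to show the opposite trends described above.

```python
import matplotlib.pyplot as plt

# Illustrative per-sprint component-library activity for one release:
# components added to the library (Factor X) and reused from it (Factor Y).
sprints = [1, 2, 3, 4, 5]
reusability_x = [9, 7, 4, 2, 1]   # drops in the later sprints
reusability_y = [0, 2, 5, 8, 10]  # rises in the later sprints

plt.plot(sprints, reusability_x, marker="o", label="Reusability Factor X (added)")
plt.plot(sprints, reusability_y, marker="s", label="Reusability Factor Y (reused)")
plt.xlabel("Sprint")
plt.ylabel("# of components")
plt.title("Reusability Factor X vs Reusability Factor Y")
plt.legend()
plt.show()
```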
Facetime
This metric is a direct result of the Agile tenets “Business people and developers must work together daily throughout the project” and “Individuals and interactions over processes and tools”.

Facetime metric = f(Clique Density)

The facetime metric is intended as a process metric that captures the amount of interpersonal interaction between the team members, and between the team and the project stakeholders, over time.
Facetime
Team members, Project manager, SMEs and other project stake holders are seen
as individual points in a graph. The interaction time spent between the project
members is indicated using numbers on the graph edges.

Higher number indicates higher interaction between the associated members and
groups. The graph gives a clear indication of cohesiveness of the various
stakeholders.

As indicated in the graph above there should be a higher clique density


within the team and a sparse clique density between SMEs and individual
team members. This is an optional metric that may be used in projects
where the teams are co-located.
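The source defines Facetime only as a function of clique density, without fixing f. The sketch below shows one possible instantiation using networkx, where edge weights are weekly facetime hours and the metric is taken as the (weighted) density of the within-team subgraph; the choice of f, the node names and the numbers are assumptions for illustration.

```python
import networkx as nx

# Interaction graph: nodes are people, edge weights are weekly facetime hours.
G = nx.Graph()
G.add_edge("dev1", "dev2", weight=6)
G.add_edge("dev1", "dev3", weight=5)
G.add_edge("dev2", "dev3", weight=4)
G.add_edge("dev1", "sme", weight=1)
G.add_edge("pm", "dev2", weight=3)

team = ["dev1", "dev2", "dev3"]
team_graph = G.subgraph(team)

# Structural clique density of the team (1.0 means every pair interacts) ...
structural_density = nx.density(team_graph)
# ... and a weighted variant: average facetime hours per possible team pair.
possible_pairs = len(team) * (len(team) - 1) / 2
weighted_density = sum(d["weight"] for _, _, d in team_graph.edges(data=True)) / possible_pairs

print(structural_density)  # -> 1.0
print(weighted_density)    # -> 5.0
```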
Customer Expectation Baseline
Another optional process metric measures how well the sprints meet the customer's minimal expectations. It is indicative of how well the sprint meets and/or surpasses the base expectations of the customer in terms of the features promised for the sprint.

Customer Expectation Baseline =
(actual minimal features delivered) / (minimal features for the Sprint)

• This simple metric plots the difference between the minimal number of features and the actual features delivered at the end of the sprint. Note that it is imperative that all the minimal features be met: even if additional features are completed in the sprint, if those additional features are not part of the identified minimal feature set, the metric goes into the negative.

• The minimal feature set is identified at the beginning of the sprint. (A short calculation sketch follows below.)
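A minimal sketch of the ratio form of the metric; the names are illustrative.

```python
def customer_expectation_baseline(actual_minimal_delivered: int,
                                  minimal_for_sprint: int) -> float:
    """Ratio of minimal features actually delivered to those promised for the sprint."""
    return actual_minimal_delivered / minimal_for_sprint

# Example: 7 of the 8 features in the sprint's minimal feature set were delivered.
print(customer_expectation_baseline(7, 8))  # -> 0.875
```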


Agile Project Metrics
• Agile SPI (Schedule Performance Index)
• Agile CPI (Cost Performance Index)
• Agile EV (Earned Value)
• Agile PV (Planned Value)
• Agile AC (Actual Cost)
• Burnup Chart
• Burndown Chart

Note that the Agile Project Metrics are adapted from the AgileEVM set of metrics, which is currently the most widely used and recognized suite of metrics in Agile projects.
Agile Project Metrics
AgileEV =
[ (Actual # of features completed in sprint)
  + ∑ (# of features completed from previous sprints - modified features) ] /
[ Total # of features in the release ]

AgilePV =
[ (Planned # of features completed in sprint)
  + ∑ (# of features completed from previous sprints - modified features) ] /
[ Total # of features in the release (adjusted) ]

AgileAC =
Actual cost incurred at the end of the Sprint.
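A minimal calculation sketch for AgileEV and AgilePV, assuming plain feature counts; AgileAC is simply the cost figure taken from the project's cost tracking, so it is carried as a number. Function and parameter names are illustrative.

```python
def agile_ev(actual_completed_this_sprint: int,
             completed_prev_sprints_net_of_modified: int,
             total_features_in_release: int) -> float:
    """AgileEV: fraction of the release's features actually earned so far."""
    return (actual_completed_this_sprint
            + completed_prev_sprints_net_of_modified) / total_features_in_release


def agile_pv(planned_for_this_sprint: int,
             completed_prev_sprints_net_of_modified: int,
             total_features_in_release_adjusted: int) -> float:
    """AgilePV: fraction of the release's features planned to be done by now."""
    return (planned_for_this_sprint
            + completed_prev_sprints_net_of_modified) / total_features_in_release_adjusted


# Example: 6 features finished this sprint (8 were planned), 20 carried over
# from earlier sprints net of modified features, 50 features in the release.
ev = agile_ev(6, 20, 50)   # -> 0.52
pv = agile_pv(8, 20, 50)   # -> 0.56
ac = 120_000               # AgileAC: actual cost incurred at the end of the sprint
```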
Agile Project Metrics
• AgileSPI = AgileEV / AgilePV
– The Agile Schedule Performance Index is similar to the SPI proposed in the AgileEVM metrics.
– It is meant to provide a clear snapshot of the schedule variance in Agile Projects.
– A value of < 1 indicates that the project is currently behind schedule.

• AgileCPI = AgileEV / AgileAC
– The Agile Cost Performance Index is similar to the CPI proposed in the AgileEVM metrics.
– A value of < 1 indicates that the project is currently over budget. (A short calculation sketch follows below.)
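Continuing the illustrative numbers above, a sketch of the index calculations. Note that the EV fraction is scaled by a release budget before the cost comparison so that AgileCPI is a dimensionless ratio; this scaling is borrowed from standard EVM practice and is an assumption on top of the formulas given here.

```python
def agile_spi(ev: float, pv: float) -> float:
    """AgileSPI = AgileEV / AgilePV; a value below 1 means behind schedule."""
    return ev / pv


def agile_cpi(ev_fraction: float, budget_at_completion: float, actual_cost: float) -> float:
    """AgileCPI; a value below 1 means over budget.

    The EV fraction is multiplied by the release budget here so that it is
    comparable with the actual cost (an assumption borrowed from standard EVM).
    """
    return (ev_fraction * budget_at_completion) / actual_cost


print(agile_spi(0.52, 0.56))              # ~0.93 -> slightly behind schedule
print(agile_cpi(0.52, 250_000, 120_000))  # ~1.08 -> under budget so far
```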
Agile Project Metrics
• BurnUp Chart
A burnup chart is a simple project metric that plots the number of features accomplished to date.

• BurnDown Chart
Burndown charts are classic sprint tracking charts that provide a snapshot of the features that have been completed or are still being worked on in the current sprint.
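A short plotting sketch for both charts; all numbers are illustrative.

```python
import matplotlib.pyplot as plt

# Illustrative data: cumulative features completed per sprint (burnup) and
# features remaining per day of the current sprint (burndown).
sprints = [1, 2, 3, 4, 5]
completed_to_date = [4, 9, 15, 22, 30]
days = list(range(1, 11))
remaining = [10, 9, 9, 8, 6, 5, 4, 3, 1, 0]

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
ax1.plot(sprints, completed_to_date, marker="o")
ax1.set_title("Burnup chart")
ax1.set_xlabel("Sprint")
ax1.set_ylabel("Features completed to date")

ax2.plot(days, remaining, marker="o")
ax2.set_title("Burndown chart (current sprint)")
ax2.set_xlabel("Day of sprint")
ax2.set_ylabel("Features remaining")

plt.tight_layout()
plt.show()
```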
Afterword
• The RAMM Agile Metrics have been used in over two dozen projects within different organizations with a high degree of success, measured in terms of customer satisfaction, product quality improvement and organizational process improvement.

• Projects consisted of team sizes varying from 5-7 member teams to 10-15 member teams using the Agile Scrum methodology.

• Overall project timelines varied from projects having a 3-4 month deliverable schedule to projects well over the 12-14 month mark.
