Sampling Some Agile Transformation Metrics

When agile coaches get together, we often end up talking about how our agile transformations are progressing and which metrics we’re using. From what I’m seeing, there is an emerging focus on “actionable metrics”. I like actionable metrics because they focus on the work being delivered rather than on how the work is being delivered. But I feel we’re still missing a metric. Let’s first take a look at the metrics that are typically analyzed in agile transformation shops.

Checklists with Ratings

I’ve created and used checklists of practices, such as whether a team does daily standups, rating each item on a 1-5 scale:

A table of action items rated 1-5: a typical checklist to determine progress toward agility

These ratings can then be rolled up across a group of teams, and further up to VP-level or company-level metrics. Repeating the exercise over time also provides trending information.
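As a minimal sketch of what such a rollup might look like (assuming each team’s checklist is simply a mapping of practice names to 1-5 ratings; the team names and numbers here are purely illustrative):

```python
from statistics import mean

# Hypothetical 1-5 checklist ratings per team; names and numbers are illustrative.
team_ratings = {
    "Team A": {"daily standup": 4, "retrospectives": 3, "backlog refinement": 2},
    "Team B": {"daily standup": 5, "retrospectives": 4, "backlog refinement": 4},
    "Team C": {"daily standup": 3, "retrospectives": 2, "backlog refinement": 3},
}

# Roll each team's checklist up to a single score, then roll the teams
# up to a group-level (e.g. VP-level) score.
team_scores = {team: mean(items.values()) for team, items in team_ratings.items()}
group_score = mean(team_scores.values())

print(team_scores)            # per-team averages
print(round(group_score, 2))  # group-level rollup
```

Storing each quarter’s group score then gives you the trend line over time.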

Another popular tool is Spotify’s Team Health Check:

The Spotify team health check, focused more on the team’s own growth with agility

When this was first introduced to me, I was a bit skeptical. However, after using it and seeing the conversations it spawns, I was sold. I really love how it focuses on the team’s mental health, which helps ensure team members feel heard.

And, like most measurements, you can conduct this every quarter to provide trending data:

A spider graph that shows how the team has evolved over time

While these provide some insight into the mechanics of an agile transformation, they don’t provide any insight into the agility of the enterprise itself, such as how well it can respond to changes in the market.

Actionable Metrics

In contrast to velocity, burn-down charts, team health checks, and the like, actionable metrics focus on the continuous flow of work by measuring Work-In-Progress, Cycle Time, and Throughput (see the sketch after these definitions):

Work-In-Progress: This is the number of different items the teams are working on concurrently. It’s best to limit Work-In-Progress (WIP), as context switching reduces productivity and increases defects.

Cycle Time: The time between when a request is made and the solution is delivered.

Throughput: The number of items completed in a given length of time.
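Here’s a minimal sketch of how these three measures could be computed from a list of work items, assuming each item records only when it was requested and when it was delivered (the dates are purely illustrative):

```python
from datetime import date

# Hypothetical work items: (requested, delivered); delivered is None if still in progress.
items = [
    (date(2024, 1, 2), date(2024, 1, 20)),
    (date(2024, 1, 5), date(2024, 1, 25)),
    (date(2024, 1, 10), None),
    (date(2024, 1, 12), None),
]

today = date(2024, 1, 31)

# Work-In-Progress: items started but not yet delivered.
wip = sum(1 for _, delivered in items if delivered is None)

# Cycle Time: days from request to delivery, averaged over completed items.
completed = [(req, done) for req, done in items if done is not None]
avg_cycle_time = sum((done - req).days for req, done in completed) / len(completed)

# Throughput: completed items per week over the observed window.
window_weeks = (today - min(req for req, _ in items)).days / 7
throughput = len(completed) / window_weeks

print(f"WIP: {wip}")
print(f"Average cycle time: {avg_cycle_time:.1f} days")
print(f"Throughput: {throughput:.2f} items/week")
```

In practice these numbers would come from your work-tracking tool rather than a hard-coded list.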

Actionable metrics are advocated on the grounds that focusing on them will increase predictability and quality. Siemens is a commonly cited example of putting this into practice successfully, discussed in detail in a well-written article.

Essentially, Siemens saw significant improvement by applying Little’s Law:

A visual explanation of Little's Law
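In its simplest form (a rough statement of the relationship, not Siemens’ specific formulation), Little’s Law ties the three measures above together:

```latex
\[
  \text{Average Cycle Time} \;=\; \frac{\text{Average Work-In-Progress}}{\text{Average Throughput}}
\]
```

So, for example, if a team carries an average WIP of 20 items and completes 2 items per day, its average cycle time works out to 20 / 2 = 10 days; cutting WIP in half while throughput stays the same halves the cycle time.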

No measurement is perfect, and it’s common for people to respond to what they’re being measured against; if they’re told their cycle time should be 40 days or fewer, they may split work in less optimal ways just to hit that number. So if your company chooses to employ this measure, make sure teams still feel empowered to divide work in the way they feel is most optimal. This flexibility will decrease predictability, but it will increase the teams’ efficiency. As I’ve stated previously, metrics should be used to determine whether you should start a conversation, not to set blanket policies around an arbitrary value (e.g. cycle time must be less than 40 days).

However, while I find this method useful and recommend it, to me it is only part of the solution. It’s like determining whether a car is drivable by looking solely at the engine: a lot more has to come together to make a car drivable.

Reaction Time: Achieving The Goal Of Agile Transformation

At a large client, we believed we had achieved the long-sought agile transformation when an executive decided to respond to a competitive threat. Within a couple of days, we identified the initial steps to respond to that threat, and within six weeks we launched our first response. Before our agile transformation, the culture allowed for analysis paralysis, followed by a handoff to IT, and the development itself would easily have taken six months, if the response was ever released at all.

However, this response under agile was even more remarkable because parts of it ran counter to well-established business policies. So this change is not just about being nimble in developing software; it’s also about being able to make mission-critical decisions rapidly and having a culture that can take that initiative and drive it.

Reaction time measures the time from when a threat or opportunity emerges to when the response is delivered.
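Measuring it is straightforward once you record both dates; here’s a minimal sketch, with event names and dates that are purely illustrative:

```python
from datetime import date

# Hypothetical log of threats/opportunities and when each response shipped.
# All names and dates below are illustrative, not real data.
events = [
    {"name": "competitor launches feature X",
     "emerged": date(2024, 3, 1),
     "response_delivered": date(2024, 4, 12)},
    {"name": "new regulation announced",
     "emerged": date(2024, 5, 6),
     "response_delivered": date(2024, 6, 3)},
]

for event in events:
    reaction_days = (event["response_delivered"] - event["emerged"]).days
    print(f'{event["name"]}: reaction time of {reaction_days} days')
```

Tracked over successive initiatives, this gives you the trend in how quickly the enterprise responds.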

By focusing on reaction time, you put your eyes squarely on the desired outcome of the agile transformation.

Graph illustrating reaction time metric