Has anyone ever looked behind the curtain at their flow-time data and noticed how much variability is in it?
For most of my customers, flow time ranges from 0 days (yes, a lot of 0s exist in the data) to 600+ days. When you stare at this data long enough, you start to see what appear to be lanes in the traffic.
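A minimal sketch of what I mean by lanes, assuming illustrative thresholds (a few days for the fast lane, a year for the slow lane; your data will suggest its own cut points):

```python
from collections import Counter

def lane(flow_time_days):
    """Bucket a completed item's flow time (in days) into a traffic lane.
    The thresholds here are assumptions for illustration, not universal."""
    if flow_time_days <= 3:
        return "fast"    # transactional work, done in a few days or less
    if flow_time_days >= 365:
        return "slow"    # near-abandoned work, over a year old
    return "middle"

# Hypothetical flow times (days) for a batch of completed items
flow_times = [0, 0, 1, 2, 5, 14, 45, 90, 400, 612]
print(Counter(lane(t) for t in flow_times))
```

Bucketing completed work this way makes the lanes visible even without a chart: a big fast-lane count and a nonzero slow-lane count are exactly the pattern described above.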
The fast lane has work which completes in a few days or less. This is transactional work. On some teams there is a high volume of this work. The pro-tip to improve flow for the fast lane is to automate or eliminate the need to do this work at all.
There also appears to be a slow lane riding atop the data. This work has flow times exceeding a year in many cases. This is the realm of almost-abandoned work: at one point it sounded like a great idea, but something more urgent superseded it. The pro-tip here is to understand how work makes it into this lane, and then figure out how to prevent that from happening. One idea would be to establish aging policies that prevent work from aging beyond a given timeframe (no work allowed to age beyond 90 days). As the threshold approaches, complete or cancel the work.
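An aging policy like that is easy to automate. Here is a sketch, assuming a 90-day limit with a warning band starting at 75 days (both numbers, and the ticket IDs, are made up for illustration):

```python
from datetime import date

AGE_LIMIT_DAYS = 90   # policy: no work item ages past 90 days
WARN_AT_DAYS = 75     # start flagging as the threshold approaches

def aging_status(started, today):
    """Classify an in-progress item against the aging policy."""
    age = (today - started).days
    if age > AGE_LIMIT_DAYS:
        return "violation"  # past the limit: complete or cancel now
    if age >= WARN_AT_DAYS:
        return "warning"    # nearing the limit: decide soon
    return "ok"

today = date(2024, 6, 1)
items = {
    "PAY-101": date(2024, 5, 20),  # hypothetical ticket IDs and start dates
    "PAY-102": date(2024, 3, 10),
    "PAY-103": date(2024, 1, 15),
}
for key, started in items.items():
    print(key, aging_status(started, today))
```

Run daily against whatever your tracker exports, this produces the short list of items that need a finish-or-kill decision before they drift into the slow lane.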
I’m curious whether this resonates with anyone. Do you have any other ideas?