The improvement imperative is driven by the need to stay competitive at a holistic level. One would expect continuous process improvement to be discussed earnestly at every level of an organization and at every staff meeting. In practice, however, interest in process improvement waxes and wanes with pressure on budgets. This is a disconnect, given that almost every framework specifically highlights the need for a continuous focus on improvement; it is even called out in the principles behind the Agile Manifesto.
One cause of the disconnect is that frameworks tend to focus on a slice of the organization rather than the whole. Until recently, Agile was viewed as a software “thing,” lean as an approach for operational departments, and quality as a property of the product. Improving a slice of the business rarely improves the whole. To be effective, improvement has to be made in light of the overall “system” or value chain. A local optimization that does not translate into improved outcomes, a better product, or savings for the organization does no one much good.
Rick Hall, CEO at Aginity Corporation, interviewed in SPaMCAST 634, stated, “the art of analytics is based on innate curiosity and the ability to experiment.” Finding structure and information in big data has parallels to finding improvements in the flow of work. Improving how work is done is rarely just a matter of reorganizing boxes on a flow chart; it is a dance between people, finance, technology, and information. Even a simple improvement can require complicated negotiations between teams. A few years ago I watched a software team make changes that improved their cycle time, nearly halving the time needed to design, code, and test changes. The changes were then handed to an operations team for implementation, where they waited until that team had the bandwidth to “deal” with them. The local optimization did not make the overall system perform any better.
Another reason for the disconnect between a call for continual improvement and actually surfacing and fixing problems continually is corporate culture. Ryan Ripley and Todd Miller, in Fixing Your Scrum: Practical Solutions to Common Scrum Problems (the next book in our Re-read Saturday feature), note that a common issue troubled teams face is not addressing small problems until they become large problems. In some cultures, surfacing problems before they require heroics is branded as complaining or as detracting from the focus needed to meet team goals. Learn from the small problems and save the heroics for the movies.
A third cause of the disconnect between the stated need to continuously improve and reality is the fear of experimentation. The idea of trying something new, evaluating the results, and then adapting fits the agile mantra of inspect and adapt, but it only works if organizations allow failure. I often hear leaders suggest that a negative outcome is only a failure if “you fail to learn,” only to find out that the negative outcome influenced the learner’s performance evaluation. Most process improvements are experiments that begin with a team creating a hypothesis, implementing changes, collecting data, evaluating results, and gaining knowledge. Done correctly, the team will prove, or in many cases disprove, the hypothesis. If changes always work, something is fishy. Experimentation often disproves the hypothesis and sends teams back to the drawing board, hopefully wiser. If nothing else, they will understand what not to try again.
Continuous improvement has a long history, with a strong heritage in the lean and quality movements that have become part of the agile canon. However, just saying that improvement is needed does not always translate into action. Translating words into action requires a bit of theory, some techniques, courage, and elbow grease.
About the Author
Thomas Cagley has more than 20 years of experience in the software industry, serving as a consultant since 1997. He was previously the Metrics Practice Manager at Software Productivity Research. Earlier, he held technical and managerial positions in a range of industries as a leader in software methods, metrics, and QA. Thomas is a frequent speaker at metrics, quality, and project management conferences. His areas of expertise span a wide variety of methods and metrics: Lean software development, Agile software development, quality integration, quality assurance, and the application of the CMMI Institute’s Capability Maturity Model® Integration (CMMI) to achieve process improvements. Thomas blogs and podcasts on all things Agile at his blog, Software Process and Measurement.
Check out our schedule of live talks on EuroSTARHuddle.com or check out our on-demand section for a wide range of software testing talks.