Unconventional Wisdom V2–Traceability Truths

Unconventional Wisdom

“It isn’t what we don’t know that gives us trouble, it’s what we know that ain’t so.”

– Will Rogers

Welcome to the next issue of my Unconventional Wisdom blog. Much of conventional wisdom is valid and usefully time-saving.  However, too much of it is mistaken yet blindly accepted as truth, what I call “conventional we’s dumb.” Each month, I’ll share some alternative, possibly unconventional ideas and perspectives that I hope you’ll find wise and helpful.


Traceability Matrix Conventional Wisdom


Traceability and the traceability matrix demonstrating it are widely touted as essential for requirements and testing.  At its simplest, a traceability matrix traces forward from each requirement to the tests of that requirement’s implementation and traces backward from each test to the requirement whose implementation it tests.


Thus, gaps at either end indicate likely issues. An untested requirement usually needs tests to be written for it.  A test that doesn’t stem from a defined requirement generally means either the test is superfluous or a requirement is missing.
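As a minimal sketch of this two-way gap check (the requirement and test IDs here are invented purely for illustration), the forward and backward traces can be held as simple mappings and scanned for holes:

```python
# Minimal sketch of a traceability gap check; all IDs are invented.

# Forward trace: each requirement maps to the tests covering it.
req_to_tests = {
    "REQ-1": ["TEST-1", "TEST-2"],
    "REQ-2": [],                # untested requirement: a gap
}

# Backward trace: each test maps to the requirement it verifies.
test_to_req = {
    "TEST-1": "REQ-1",
    "TEST-2": "REQ-1",
    "TEST-3": None,             # orphan test: superfluous, or a missing requirement
}

untested_reqs = [req for req, tests in req_to_tests.items() if not tests]
orphan_tests = [test for test, req in test_to_req.items() if req is None]

print(untested_reqs)  # ['REQ-2']
print(orphan_tests)   # ['TEST-3']
```

Either list being non-empty is the signal the matrix exists to give: something to test, or something to question.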


Between the requirements and tests end points, most traceability matrices also identify a sequence of intermediate artifacts. For example, a business requirement would be traced to product/system/software feature requirements addressing it.  Product requirements in turn would be traced to design elements implementing each feature.  Design elements would trace to physical software modules and databases, which in turn would trace to the actual tests.  Depending on methodology used, different or additional artifacts may be included in the chain.
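Assuming a chain like the one just described (all artifact names below are invented), tracing a business requirement all the way to its tests amounts to following the links level by level:

```python
# Illustrative chain of intermediate artifacts; every ID is invented.
# Each dict links one artifact level to the next.
biz_to_feature = {"BIZ-1": ["FEAT-1"]}
feature_to_design = {"FEAT-1": ["DES-1", "DES-2"]}
design_to_module = {"DES-1": ["mod_a.py"], "DES-2": ["mod_b.py"]}
module_to_tests = {"mod_a.py": ["TEST-1"], "mod_b.py": ["TEST-2", "TEST-3"]}

def trace_forward(start, link_tables):
    """Follow the link tables from one starting artifact to the final level."""
    current = [start]
    for table in link_tables:
        current = [nxt for item in current for nxt in table.get(item, [])]
    return current

chain = [biz_to_feature, feature_to_design, design_to_module, module_to_tests]
print(trace_forward("BIZ-1", chain))  # ['TEST-1', 'TEST-2', 'TEST-3']
```

Note how even this toy chain fans out at every level; with realistic artifact counts, the number of cross-references to capture and maintain grows quickly, which is exactly the data-entry burden discussed below.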


Beyond simply revealing potentially missed requirements and tests, often the most valuable use of a traceability matrix is assisting decisions about whether to add, change, or delete particular requirements. Because requirements changes, frequently called “creep,” often incur significant unplanned development time and effort, many organizations have Change Control Boards decide whether a proposed change should be made now, later, or never.


Automated Tools’ Role


As can be imagined, capturing and tracing all these relationships can require a significant number of data entries. Similarly, it can be equally demanding to portray understandably the full chain of artifacts from business requirement to tests of the software addressing it, especially when trying to reflect several intermediate levels.  Spreadsheets are an obvious initial choice for capturing and portraying traceability but generally quickly prove inadequate for the task.


Spreadsheets’ inadequacy created a valuable opportunity, which initially was filled by automated requirements management tools. Originally, such tools primarily captured itemized requirements along with optional descriptive and categorization information and were especially helpful in enforcing control over changes to the requirements.  Frankly, such capabilities alone failed to generate much excitement or many sales for the requirements management tools.


Adding the ability to capture, relatively easily, cross-references from each artifact to individual requirements and then (re-)display and (re-)print traceability matrices on demand became a major selling point for such tools. Once requirements management tools also addressed additional development artifacts, it was logical for vendors to evolve the tools into more appealing full-featured Application Lifecycle Management (ALM) products.


Tool Truths


Some tool vendors characterize their traceability matrices as essentially a panacea, though exactly what for is not always clear. I’m not aware of any tool that points out that each cross-reference entry must be identified and entered manually, which can turn out to take a lot of effort the vendor ads don’t mention.  The tool won’t figure out the cross-references for you, so the traceability matrix it draws is only as accurate and complete as your identification and entry of the requirements and their often numerous cross-references.


At least one tool features a very simple traceability matrix with a handful of requirements down the left side and checkmarks next to them under column headings for perhaps half a dozen artifacts. How informative is it, really?  For example, let’s say the requirement is “end world hunger” and its traceability checkmark actually indicates merely that one person had breakfast once.  See any problems?


Traceability’s Dirty Secret


You may find it easier to understand the most important traceability matrix issue by seeing a concrete example in my SearchSoftwareQuality.com “What is a test case?” article picked as the top tip of the year at http://itknowledgeexchange.techtarget.com/software-quality/top-ten-software-quality-tips-of-2010/. (Full disclosure, when Daragh invited me to publish on TestHuddle.com, he had in mind simply reprinting my many SearchSoftwareQuality.com featured articles; but SearchSoftwareQuality.com didn’t go along, so these are all new posts.)


The example shows how the same test can be viewed by different people as at least five different-sized test cases. At the “A” level, it’s viewed as one big test case; whereas at the “E” level, it’s considered 24 test cases.  Although the example focuses only on test cases, the same concept applies to requirements.  That is, different people can use the same term “requirement” to mean widely-divergent-sized artifacts.  The article’s emphasis was on misunderstandings that result when people unknowingly use the same term but interpret it very differently.


However, the example also is instructive with regard to traceability. For practicality, a traceability matrix must have both requirements and tests at the “A” level.  That’s how you get the easy-to-read checkmark matrix: one big requirement and one test of it.  On the other hand, meaningfulness demands the greater granularity of the “E” level.  That is, checkmarks won’t tell us anything useful unless there are 24 test cases cross-referenced to each of 24 more detailed requirements.  That’s too much data entry and too much detail to read in the matrix.




Robin F. Goldsmith, JD helps organizations and business and systems professionals get the value they need through direct advisory assistance, training, writing, and speaking on risk-based Proactive Software Quality Assurance and Testing™, REAL requirements, REAL ROI™, metrics, outsourcing, project and process management. He is President of the Software Quality Group of New England and author of the book, Discovering REAL Business Requirements for Software Project Success, and the forthcoming book, Cut Creep: Put Business Back in Business Analysis to Discover REAL Business Requirements for Agile, ATDD, and Other Project Success.


About the Author


Consultant and trainer on quality and testing, requirements, process measurement and improvement, project management, return on investment, metrics
Find out more about @robingoldsmith