What parts of your testing would you get rid of, if you could? That was the gist of Daniel Maslyn’s unconventional session, “Boston T(ech) Party,” at the recent Agile Testing Days USA conference outside Boston. Before sharing answers to that question, it may help to give a bit of background.
Many readers may be familiar with Agile Testing Days, which has been held in Europe for ten years. US firm TechWell partnered with the conference organizers to create a US version. TechWell, originally called Software Quality Engineering (SQE), created and conducts the US and Canada STAR Conferences and was involved in founding EuroSTAR. Full disclosure: for many years, I was the only non-SQE employee on its conferences’ staffs.
The initial Agile Testing Days USA program consisted largely of presenters from the European version, many no doubt repeating their European presentations. Maslyn’s was new, however, and differed from the other sessions in several additional ways. The title is a takeoff on the Boston Tea Party, a 1773 protest against Britain’s taxation of tea in its American colonies. In one of the key events leading to the American Revolution, a number of colonists disguised themselves as Native Americans, boarded a merchant ship, and dumped its cargo of tea into Boston Harbor.
Maslyn asked session attendees to pretend they were participating in the Boston Tea Party. He had a number of large boxes on stage and asked each attendee to throw one overboard. Instead of containing tea, however, each attendee’s box was to hold aspects of their testing they’d like to get rid of. After explaining their box’s contents, each participant threw or kicked their box off the stage “into the harbor” with great energy. It was not your typical PowerPoint presentation, and Maslyn did an excellent job keeping the conceit going.
What Was Dumped
My apologies for not remembering all the specific things all seven or so participants dumped. My impression was that there was a lot of overlap, mainly falling into two categories: automated testing tools and various forms of documentation. The automated testing tool issues seemed to reflect a theme I noticed in several of the preceding sessions. Speakers, and presumably attendees, seemed to accept as (may I say) “conventional we’s dumb” the notion that traditional GUI/script-based automated test execution tools are not helpful and in fact may impede Agile testing, especially in continuous integration/continuous deployment environments.
Rationales included the excessive time and effort needed to create tests for the tools to execute, compounded by the large number of such tests created, and the extensive run time required for the (again, large number of) end-to-end regression tests.
Many of the participants also dumped various forms of documentation they found time-consuming to create and insufficiently valuable to their test efforts. One participant especially elaborated on their most despised documentation: test plans. Again, participants seemed to accept as (may I again suggest) “conventional we’s dumb” the concern voiced repeatedly by the exploratory testing community that creating documentation about testing takes time away from executing tests and therefore should be skipped.
By the way, many of the scheduled speakers voiced similar automated test tool issues, but I don’t recall any I heard raising the documentation issue. However, several did express what I’d call a “conventional we’s dumb”: that exploratory testing is the most useful type of testing for catching defects automated tools miss.
Some Other Perspectives
First, let me be clear that I don’t doubt, and do sympathize with, participants’ portrayals of the difficulties they encounter with their automated test tools and test documentation. Second, I was surprised to hear these issues from folks presumably already doing Agile, especially regarding the documentation, which I’d think Agile pretty much precludes.
Third, and what I want to discuss here, is my concern that the choices of what to dump in many ways said more about how widespread limited, and even mistaken, understandings of testing are than about the supposed bad practices themselves.
For more than 20 years, test automation authorities have advised first identifying what needs to be demonstrated to give confidence that systems work, then determining which of those tests are good candidates for automation, and finally expending effort to execute only the subset of such tests that makes sense for controlling relevant risks.
Everyone should know this, but these folks didn’t seem to; instead, they described trying to execute every test they could think of, without analysis, prioritization, or selectivity.
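To make that long-standing advice concrete, here is a minimal sketch of risk-based test selection. The scoring scale, thresholds, and test names are purely my own illustrative assumptions, not drawn from any particular tool or from the session itself:

```python
from dataclasses import dataclass

@dataclass
class CandidateTest:
    name: str
    failure_likelihood: int   # estimated, 1 (rare) to 5 (frequent)
    failure_impact: int       # estimated, 1 (trivial) to 5 (severe)
    automation_cost: int      # 1 (cheap) to 5 (expensive) to build and maintain

    @property
    def risk(self) -> int:
        # Classic risk exposure: likelihood times impact
        return self.failure_likelihood * self.failure_impact

def select_for_automation(tests, min_risk=9, max_cost=3):
    """Keep only tests whose risk justifies their automation cost."""
    return [t for t in tests if t.risk >= min_risk and t.automation_cost <= max_cost]

# Hypothetical candidates; real teams would derive these from risk analysis.
candidates = [
    CandidateTest("checkout_total_calculation", 4, 5, 2),
    CandidateTest("rarely_used_admin_report", 1, 2, 4),
    CandidateTest("login_session_timeout", 3, 4, 2),
]

for t in select_for_automation(candidates):
    print(f"Automate: {t.name} (risk={t.risk}, cost={t.automation_cost})")
```

The point is not the particular numbers but the discipline: every candidate test gets analyzed and prioritized, and only the worthwhile subset is automated, rather than automating everything anyone can think of.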
On the other hand, I’ll acknowledge that many, if not most, testers mean by “test plan” their typically voluminous set of detailed executable tests: inputs and/or conditions, expected results, and often procedural instructions, frequently in keystroke-level detail. I further acknowledge that such documents can take more time to create and maintain than they are worth.
However, as discussed in “Unconventional Wisdom V10: What Is A Test Case Anyway? No Test Cases, No Way” at
https://huddle.eurostarsoftwaretesting.com/unconventional-wisdom-v10-no-test-cases-no-way/, such documents are test cases, not test plans. And contrary to “conventional we’s dumb,” test cases don’t have to take such a counterproductive form. Instead, consider using highly effective but low-overhead formats to capture and use test cases, such as the one sketched below. See “Unconventional Wisdom V9: Proactive Testing™ Part 2: Drives Development” at
https://huddle.eurostarsoftwaretesting.com/unconventional-wisdom-v9-proactive-testing-part-2-drives-development/.
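As one possible illustration (my own sketch, not a format taken from the referenced articles), a table-driven, parameterized test captures each test case’s essentials, its inputs and expected result, in a single row, with no keystroke-level procedure script. The discount function and values here are hypothetical:

```python
import pytest

# Each row is one test case: inputs plus expected result.
# No procedural, keystroke-level script is needed.
DISCOUNT_CASES = [
    # (order_total, is_member, expected_discount)
    (100.00, True, 10.00),
    (100.00, False, 0.00),
    (0.00, True, 0.00),
]

def discount(order_total: float, is_member: bool) -> float:
    """Hypothetical function under test: members get 10% off."""
    return round(order_total * 0.10, 2) if is_member else 0.00

@pytest.mark.parametrize("order_total,is_member,expected", DISCOUNT_CASES)
def test_discount(order_total, is_member, expected):
    assert discount(order_total, is_member) == expected
```

Adding or revising a test case then means editing one row of the table rather than maintaining pages of procedural detail.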
As used in IEEE Std 829-2008, a test plan is the project plan for the testing project, which itself is a sub-project within the overall development project. Project plans are widely accepted as aiding project success and address things like tasks, resources, budgets, schedules, priorities, and strategies. Contact me at [email protected] to discuss how we can work with you to learn and apply these more useful Proactive Testing™ approaches and methods.
“It isn’t what we don’t know that gives us trouble, it’s what we know that ain’t so.”
– Will Rogers
Welcome to my Unconventional Wisdom blog. Much of conventional wisdom is valid and usefully time-saving. However, too much of it is mistaken and misleading, yet blindly accepted as truth; it’s what I call “conventional we’s dumb.” Each month, I’ll share some alternative, possibly unconventional, ideas and perspectives that I hope you’ll find wise and helpful.