
    Hi All

    Just want to see whether anyone has issues or opinions about feedback on testing documents that focuses on formatting, wording, tables, etc., instead of team members reviewing the important content, such as what areas are being tested (coverage) and how the product is being tested (strategy). I find that many of the people required to review the documents don't know enough about testing to provide feedback, so they focus on the formatting instead.

    And why do you think people outside of testing are required to sign off test documents, while testers never sign off other departments' documents?

    I'd just like to hear some opinions.



    I have experienced feedback that the heading order was incorrect, and similar formatting issues. I learned that to some people ANY ERROR is a sign of weakness that invites further doubt: "if you missed this, what else did you miss?" A BLAME culture, which I would not prefer to work in. That being said, in regulated environments strictness and compliance with the given rules is part of the game.

    There is a review technique you might want to try next time: give each reviewer something specific to look for (one gets coverage, another formatting, etc.), and then take turns.

    Another idea is to have every functional team sign off on everything: Dev Lead, Architect, Test Lead, etc., all sign off on all documents. But then again, why do you have the sign-off at all? Is it customer-required?

    I often wonder why development and outsourcing contracts have requirements for the headings and content of test plans, but rarely for the corresponding design and specification documents.


    Thanks Jesper.

    I have myself been on projects where no sign-off was required and stakeholders were simply more interested in the information gathered during testing and the perceived quality of each area.

    Sign-off from so many people who don't really care about the document, properly understand it, or understand what they should be getting from it takes a lot of time and money. I agree that in regulated projects sign-offs are required and everything is fairly strict.

    I find it amusing that not one person is concerned about how the product is being tested. All seem more interested in a strategy (which tends to be a product coverage outline) and schedules than anything else.

    One of the main issues I have with the testing industry here is the lack of real knowledge of why we test software and how we test software. Jesper, correct me if I'm mistaken, but you are a believer in CDT? If you have implemented it in a company before, how did you tackle it?




    Good to hear, Marco.

    On the reviews, my idea was that requesting everybody sign off on all documents either levels things out so that all documents get the same treatment, or it exposes how silly that much sign-off is 🙂

    On CDT, I have long tried to advocate for it. I have tried a few times, but never successfully with practices such as heuristics and mind maps. Usually I go with the principle of finding a testing approach that matches the context.


    Irrespective of whether the reviewer knows about testing, I believe they should not go into too much detail on formatting. Having said that, the documents go on for further review by clients and management, and it does not give a good impression if there are, say, spelling mistakes, or if the document is unreadable due to formatting issues. At least the basics should be taken care of.


    @archana – Thanks. The formatting was as per the template, but the person did not like it.

    @jesper – I understand; that's an idea I will use. I have used a number of other documents to learn about architecture, products, etc., and none of them had been signed off or had even moved past a draft version.

    Yes, trying to get people to adopt CDT is difficult. I guess one issue is that people who are not software testers have too much say in how we do our job; the other is that the test managers above testers are not skilled or knowledgeable enough to accept that the people under them have self-trained, are more skilled, and are better testers. They feel threatened.


    Peer review of artifacts supporting the development of a product is important, no matter the artifact, whether deliverable or non-deliverable.

    That said, is there much value in peer review of an artifact after the resources (e.g., budget, schedule) allocated to it have been consumed? I propose constant engagement by peers in shepherding artifacts to their completion. Remove defects at the earliest possible time. Do master craftspersons let their apprentices work without oversight? Or do they guide each step along the way?

    I concur with @jesper: peers should have assigned responsibilities for the review. That is their scope.

    And yes, someone needs to be pedantic and correct formatting and grammatical errors, but NOT everyone. We expect no less of the requirements engineers. Attention to detail is always important, but I do not propose being a slave to process. Of course, a mature organization is constantly improving and leaning out its processes for optimal ROI.

    Regarding the "format protestor", the acceptance criteria for an artifact need to be a matter of documented business process. Individuals who want to constantly argue with a business guideline need to be chided by their supervisor to cease such anti-productive behavior or find another job.


    When you’re in the business of quality, formatting matters.  Perception matters.  If the testing team is quality control and its deliverables are overrun with spelling/formatting errors (etc.), then I completely understand why reviewers would focus attention on these issues. It’s our (the testing team’s) responsibility to collaborate with stakeholders to define expectations and collective objectives.

    Testers must also show how activities add (or do not add) value. I’ve observed multiple reviews where product or business analysts nitpick formatting and gloss over the content describing the testing to occur.  These same analysts then expect to be given weeks of time for exploratory/acceptance testing.  (Weeks that were likely just wasted correcting table alignments and seeking additional approvals.) Personally, I believe there is more value in the groups collaborating early to set appropriate expectations that allow issues to be identified early.

    As a team, the stakeholders should decide how much focus will be given to formatting issues.  Does the formatting error have the potential to skew the final product?  Or cause interpretation error?  If not, focus on the value add components. If yes, clarify, fix, and move forward.  Endeavor not to repeat the same mistakes twice. Teams should play to one another’s strengths and grow together. Sharing knowledge and building trust are the keys to overcoming the blame game.

    I agree completely with the previous replies suggesting reviewers be given specific tasks. It reminds me of Edward de Bono's "Six Thinking Hats" and the need to focus on various aspects of the documentation under review. For example, venting about formatting and font color could be an emotional (red hat) moment. The more constructive conversation is the analysis of the facts (coverage and intent), which is the white hat perspective. I've used the Six Hats approach during review meetings, especially early on when team members are getting to know one another, with positive results.

    ~Marcia Buzzella


    Thanks All, you have some good points.

    I have no issue with documentation being reviewed by peers; I think it is required. In the context of formatting and so on, there isn't a problem with someone reviewing formatting, grammar, etc., until someone decides their personal preference is what is needed. For example, the table was formatted and easy to read, but the person wanted a colour change or a different font size or type.

    I don't believe testers are in the business of quality or quality assurance. I believe we are in the business of identifying risks that could de-value a product. If a test team cannot communicate the test project logistics and test strategy, they may be poor at creating documentation but still possibly great software testers. More often than not, testers create pointless documentation that has very little value, where templates are used and sections are copied and pasted. I agree that there tends to be more focus on aspects of documents that have little or no value while no one focuses on how the product is being tested; but why is this?

    Most people reviewing the documents don't really understand testing. In turn, they can only focus on formatting and product coverage outlines, yet they feel required to have input on the review. These reviewers generally want to see requirements = test cases, and they value every aspect of the document other than how the product is tested. This is why most testing strategies in companies will not tell a reader how the product is being tested, and why most test strategies are wasteful documents that are never read again.


    While I cannot agree with you that testers aren't in the business of quality (quality control is a major part of reducing risk), I do agree that quality assurance isn't in our scope. You hit the nail on the head with your comment on "personal preference". A lot of time is wasted conforming to personal preference. What's worse is that after you think you have all the bumps ironed out, the team changes and it's back to the drawing board. That's why I like to set team expectations so that newcomers can't derail progress with personal preference (at least not right out of the gate).

    The focus on documentation might be a direct result of a lack of understanding. Maybe we believe that if we write something down, it's more easily understood? But then we get lost in the formatting rather than learning to speak the same language: the language of which steps add value, increase quality, and reduce risk. If a document or activity does none of these things, then something needs to change. These are the hard conversations that test managers should be having with project stakeholders. When we perceive a lack of value, we should be able to back up that perspective with facts and course-correct through explanation and negotiation. In all too many cases, I see test managers get frustrated and give up instead.

    Why do we keep producing documents that add no value?  What aspects of creating the TS do add value (even if the document is never opened again)? Is there another way to achieve the same result? Test managers should help stakeholders understand the value of why we do what we do.  At the same time we must listen to understand and address their concerns.  When all players are working from the same expectations handbook the team will achieve much greater results.

    ~Marcia Buzzella


    Products these days are large, complex systems, and projects have many changing elements. As products are always subject to change, and people's ideas of value can also change so frequently, I believe that in most cases quality is very much taken out of our hands. Testers are there to gather information and build an inference to allow the business to make decisions.

    I agree that test managers who try to make a change end up giving up a lot of the time. As I previously stated, I think a lot of it has to do with people in higher positions who don't understand, or don't want to understand, new approaches or ideas. The purpose of any documentation is to communicate information. What information is of real value? Who is reading the document? More importantly, stakeholders need to understand that just because test plans or strategies are created, it does not guarantee skilled testing. Testing is and always will be a performance.

    I think you are spot on asking whether there is any other way to get the same or better results, as I believe there are many. Getting buy-in from stakeholders to work with change is the part most test managers have problems with. Keith Klain does a great talk on speaking to CEOs about testing.

