So you’ve collaborated with your team and created a feature file full of examples, analysed those features to identify potential risks to test for, and carried out your first exploratory testing session based on one of those risks. What happens next? Before you jump into your next exploratory testing session, you will need to complete the SBTM process with an exploratory testing session debrief.
What is a debrief?
A debrief is a discussion around a recently completed exploratory testing session between two people:
- Reporter – the person who ran the exploratory testing session
- Reviewer – the person who learns about what happened during the exploratory testing session
During the debrief, the reporter shares information such as what they did and didn’t test, what they learned during testing, what issues they faced and what bugs they raised. As the reporter shares this information, the reviewer asks questions based on the reporter’s feedback, which can help you:
- Make an informed decision – If you are a product owner or someone who is responsible for making decisions about the progress of your product, being informed of your product is vitally important. A debrief will give you confidence in the positives and negatives of the product to allow you to plan the next steps.
- Identify additional exploratory test sessions – It may be that the reporter shares that they couldn’t test everything in one session due to a wide range of influences such as time, environment or other distractions. Discussing what else is required may result in a new test session for the same charter being scheduled.
- Identify new risks – As new information is discovered, new risks might be found which in turn will mean new exploratory test sessions will need to be carried out. You may also discover that an exploratory test session actually covered multiple risks meaning other exploratory test sessions are no longer required.
- Discover opportunities for automation in testing – Repeated actions or activities that consume large amounts of time might be identified for automation, such as setting up an environment, creating data sets or running data-driven API checks against an HTTP endpoint.
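To make the last point concrete, here is a minimal sketch of the kind of data-driven check runner a team might build after a debrief reveals the same API checks being repeated by hand. Everything here is an illustrative assumption, not part of SBTM or any particular tool: in practice the `send` function would wrap a real HTTP client call to your endpoint, but it is injected here so the runner stays self-contained.

```python
# A minimal data-driven check runner (illustrative sketch, not a
# prescribed implementation). `send` is injected so the core logic
# can be exercised without a live HTTP endpoint.

def run_checks(cases, send):
    """Run each (payload, expected_status) case and collect failures."""
    failures = []
    for payload, expected_status in cases:
        actual_status = send(payload)
        if actual_status != expected_status:
            failures.append((payload, expected_status, actual_status))
    return failures

# A fake `send` standing in for a real HTTP POST to an endpoint.
def fake_send(payload):
    return 200 if payload.get("name") else 400

cases = [
    ({"name": "Mark"}, 200),   # valid payload should be accepted
    ({}, 400),                 # missing name should be rejected
]

print(run_checks(cases, fake_send))  # an empty list means all checks passed
```

Because the cases live in plain data, new examples discovered in future sessions can be added without touching the runner itself.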
Getting started with debriefs
Running debriefs is an activity that doesn’t come easily: it requires effort from your team to adopt debriefs and run them successfully. You and your team need to encourage an environment of collaboration and communication and ensure debriefs are run regularly after each test session. You may also want to consider adding a debrief step to your agile board or workflow.
Try to track when debriefs were and weren’t carried out and why. When you first get started with debriefs, you will want to regularly take a step back and assess how their adoption is going, and metrics might help the conversation. Use retrospectives and team gatherings to ask what is and isn’t working during your debriefs, what might be preventing you from carrying out effective debriefs, and how you can improve your debriefs to make them more valuable to the team.
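Tracking doesn’t need heavy tooling. As one possible sketch, a team could keep a lightweight record per session and tally skipped debriefs by reason; the field names below are assumptions for illustration, not a prescribed SBTM format.

```python
from collections import Counter

# Illustrative session log: whether a debrief followed each session,
# and why not when it didn't. Field names are assumed, not standard.
sessions = [
    {"charter": "Explore login error handling", "debriefed": True,  "skip_reason": None},
    {"charter": "Explore CSV import limits",    "debriefed": False, "skip_reason": "reviewer unavailable"},
    {"charter": "Explore password reset flow",  "debriefed": False, "skip_reason": "ran out of time"},
]

debriefed = sum(1 for s in sessions if s["debriefed"])
skip_reasons = Counter(s["skip_reason"] for s in sessions if not s["debriefed"])

print(f"{debriefed}/{len(sessions)} sessions debriefed")
print(skip_reasons.most_common())
```

A tally like this gives a retrospective something concrete to discuss: if “reviewer unavailable” dominates, the fix is scheduling, not skill.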
Running a successful debrief
From the perspective of the reporter
A debrief is not a critique of your skills as a tester, but an exercise in sharing what you’ve learnt, using that learning to identify other testing activities and enabling the reviewer to make an informed decision (if they are in a position to do so). The debrief is an opportunity to tell the story of your exploratory testing session.
Once you have finished your session you should review your test notes and session report to ensure that you are prepared for a debriefing. We’ve provided a checklist of things you will want to ask yourself about your notes and session report before heading into a debrief session:
- Have you compared your session to previous sessions that contain the same charter to determine differences in test coverage?
- Have you added in timings for setup, execution and investigation?
- Have you considered and recorded the quality and coverage of your session?
- Are your notes in a state that would enable you to retell your testing story in the future?
- Have you raised all the bugs you found during the session?
From the perspective of the reviewer
The goal of the reviewer is to learn about what was discovered during the exploratory testing session, help the reporter tell their story, ensure the exploratory testing session was a success and determine whether any further actions are required. This means using skills such as active listening and questioning techniques to get as much value out of the debrief as possible.
Asking the right questions is key to getting the most out of a debrief, so that you’re in a position to make informed decisions. However, it takes skill and practice to ask the right questions during a debrief. We’ve compiled a checklist of questions to ask yourself to ensure you’ve got everything you need from the reporter. These are not questions you would ask the reporter directly, but checks to confirm that each activity has been fulfilled:
- Have you been debriefed on positive information that was discovered during the session?
- Have you been debriefed on negative information that was discovered during the session?
- Have you been debriefed on any bugs that were found?
- Do you feel you have been given sufficient information related to the charter to make a decision on releasing?
- Have you discussed how the current session compared to previously executed sessions related to the same charter?
- Have you confirmed that all required data has been added to the session report?
- Have you discussed any issues that might have affected the setup, execution and investigation timings?
- Have you discussed any new risks that were discovered?
The checklist we’ve created is a great place to get started with debriefs but you might want to consider adding in your own checklist items. Ours were based on the structure of SBTM session reports as well as James Bach’s own checklist. Keep in mind, though, that choosing the right questions or creating a checklist can be tricky. There is an excellent book called The Checklist Manifesto by Atul Gawande that you might want to read to learn how to build a checklist that works for you and your team.
Debriefs in summary
Debriefs can be a very effective and valuable tool for a tester and their team, but they take time and experience to get right. If you find them hard to run at first, don’t be disheartened; just remember these summary points and you and your team will be expertly sharing experiences before you know it:
- Encourage regular debriefs after every exploratory testing session
- Foster a safe culture that promotes collaboration instead of criticism
- As the reporter, come prepared with good testing notes
- As the reviewer, use active listening and questioning techniques that open up the discussion
- Keep reviewing the quality of your sessions and work as a team to improve them and make them your own
About The Author
Mark is a tester, teacher, mentor, coach and international speaker, presenting workshops and talks on technical testing techniques. He has worked on award-winning projects across a wide variety of technology sectors, ranging from broadcast, digital and financial to the public sector, working with various web, mobile and desktop technologies.
Mark is part of the Hindsight Software team, sharing his expertise in technical testing and test automation and advocating for risk-based automation and automation in testing. He regularly blogs at mwtestconsultancy.co.uk and is also the co-founder of the Software Testing Clinic in London, a regular workshop for new and junior testers to receive free mentoring and lessons in software testing. Mark has a keen interest in various technologies, regularly developing new apps and devices for the Internet of Things. You can get in touch with Mark on Twitter: @2bittester
References
- Session Debrief Checklist – James Bach – http://www.satisfice.com/sbtm/debrief_checklist.htm
- The Checklist Manifesto – Atul Gawande – http://atulgawande.com/book/the-checklist-manifesto
- Reporting Session Based Testing – Katrina Clokie – http://katrinatester.blogspot.co.uk/2014/03/reporting-session-based-testing.html