Taxonomy of Software Testing Terms
August 18, 2014 at 3:30 pm #3412
Hi everyone, we're looking to create a taxonomy of terms which we will use to make resources on this website more easily discoverable by relevant subject matter. This taxonomy could also be used as a point of reference for areas members would like to see us focus on for new webinars, eBooks, presentations, etc.
The goal is to create a bank of content that can be efficiently searched by testers trying to solve a particular problem – whether directly related to testing or indirectly, e.g. advice from others on managing people, communicating with stakeholders, etc.
I’m not a tester, so I doubt this list is fully complete / logical / relevant (and maybe not even useful!) but who better to help improve it than a community of professional testers?
List to follow shortly, I look forward to your replies – any input welcome!
August 18, 2014 at 3:34 pm #3441
Testing through the SDLC
Requirements Analysis
- Requirements/Specification
Design
- Test Planning/Design
Development (Coding)
Testing
- Functional Testing (Correctness Testing)
- Unit Testing
- Smoke Testing
- Sanity Testing
- Regression Testing
- Integration Testing
- Component testing
- Back-to-back testing
- System Testing
- Model-Based Testing
Results Analysis
- Reliability Achievement and Evaluation by Testing
Deployment
Acceptance Testing
- Non-functional Testing
- Acceptance Testing (Functional)
- UAT
- Operational Acceptance Testing
- Interface Testing
- Contract & Regulation Acceptance Testing
- Installation Testing
- Alpha & Beta/Field Testing
- Availability Testing
- Baseline Testing
- Compatibility Testing
- Compliance Testing
- Documentation Testing
- Localization & Internationalization testing
- Performance Testing
- Load Testing
- Stress Testing
- Reliability
- Stability Testing
- Endurance
- Soak
- Spike
- Isolation
- Volume Testing
- Configuration
- Recovery Testing
- Reliability Testing
- Resilience Testing
- Scalability Testing
- Security Testing
- Penetration Testing
- Static Testing
Maintenance
- Non-functional Testing
Test Methods
- White Box Testing – a.k.a. logic-based testing; glass-box testing; design-based testing; structural testing
- Black-Box Testing – a.k.a. data-driven; input/output driven; requirements-based testing; functional testing
- Grey-box testing – a mix of white and black box methods
Test Techniques
- Tester Intuition/Experience
- Exploratory
- Ad hoc Testing
- Specification-based
- Equivalence partitioning (black-box)
- Boundary-value analysis (black-box)
- Decision table (black-box)
- Finite-state machine-based (black-box)
- Testing from formal specifications (black-box)
- Random testing (black-box)
- Code-based
- Reference models (white-box)
- Control flow-based criteria (white-box)
- Data flow-based criteria (white-box)
- Fault-based
- Error guessing (black-box)
- Mutation testing (white-box)
- Usage-based
- Operational profile
- Software Reliability Engineered Test (SRET)
- Nature of Application
- Object-oriented testing
- Component-based testing
- Web-based testing
- GUI Testing
- Testing of concurrent programs
- Protocol conformance testing
- Testing of distributed systems
- Testing of real-time systems
Test Process
- Waterfall
- Agile
- Test-Driven Development
- Behaviour-Driven Development
- Top-Down Testing
- Bottom-Up Testing
Test Management
- Attitudes/People
- Test Strategy
- Test Process
- Test Documentation
- Test Team
- Cost/Effort Estimation
- Test Reuse & Test Patterns
- Test Planning
- Test Case Generation
- Test Environment Development
- Problem Reporting/ Test Log
- Defect Tracking
Test Automation
- Code-driven Testing
- Graphical User Interface (GUI) Testing
- Software Testing Tools
- Metrics
Testing Artefacts
- Test Plan
- Test Design
- Traceability Matrix
- Test Case
- Test Procedure
- Test Script
- Test Schedule
- Test Suite
- Test Data
- Test Harness
People
- Behaviour
- Coaching / Training
- Certification
- Collaboration
- Communication
- Creativity
- Critical Thinking
- Diversity
- Heuristics
- Innovation
- Test Experiences
- Leadership
- Teamwork/Team Building
- Recruitment
- Skills
August 18, 2014 at 5:04 pm #3454
This is scary. It just goes to show that over time the industry has overthought everything and keeps adding to the list rather than rationalising it.
I think the idea of the list is excellent for several reasons, including the original intent of providing a reference sieve for definitions.
But also to highlight the plethora of terms and methods and concepts that are in use so that we can rationalise.
I bet ISO 29119 doesn't even cover this list 😉
How are you going to manage updates and suggestions for the taxonomy? A wiki or some other resource?
August 18, 2014 at 6:38 pm #3456
Hi Stephen, I think the forum would be a good starting point for new suggestions. Just list them right here; if there is enough interest to create a wiki then we will do that for sure, great idea!
The list itself seems unwieldy, so rationalising it makes sense. Where to from here?
August 18, 2014 at 7:37 pm #3457
Hi Paul, it's a good effort to bring these things together in a structured way. It would be better to set up a repository (e.g. MediaWiki) with the help of an admin, so everyone can contribute their knowledge to it in a more structured way.
August 19, 2014 at 3:27 am #3458
Nice initiative Paul! You could think about adding a few more terms, mentioned here:
– “Build” along with Deployment as we frequently use it.
– Test environments – QA, Development, Staging, Inactive, Production etc.
– Scrum as another test/ development process.
– Test Estimation and Test Management tool in Test Management.
– Automation tools in Test Automation.
I will get back to you if I remember anything else.
August 19, 2014 at 3:43 am #3459
Also, you can add Geo-beta along with alpha and beta.
August 19, 2014 at 5:42 am #3461Test Data: masking, anonymous,
Artefacts: Test Policy
Test Management: Risk analysis
August 19, 2014 at 7:27 am #3462
People: Motivation, Performance Management, Delegation of tasks, Self Initiation
Allocation of Resources: Time, Cost, Quality, monitor progress and correct
Consultancy Skills
Business Relationships: Analyse user requirements, Advise users on scope and recommended test approach, Discuss options for operational improvement, Develop client relationships
Test Management: Change Control
Industry Awareness
August 19, 2014 at 8:10 am #3464
I'm surprised no-one has mentioned the V-Model or W-Model yet. 🙂
@Afreen what’s Geo-beta? That’s a new one on me.
One of the problems with a taxonomy is its application. It's not a two-dimensional list, i.e. some items are related, but not relevant when used together with unrelated items.
e.g. talking about Shift left in an Agile/Scrum world is not appropriate as you can’t shift very far left in a 2 week sprint.
And we need to start defining the terms too, so that people know and understand what they mean. It starts to look like an ISTQB certification, but hopefully not an ISO standard 🙂
To add to the list:
Verification – Tests and checks against the requirements and specifications, aimed at confirming that we have done what we said we'd do.
Validation – Tests and checks against the customer's expectations, i.e. does the product satisfy the customer/end user's needs? Not the same as acceptance testing, although acceptance testing is part of validation.
VVRM (Verification and Validation Requirements Matrix) – Part of Requirements tracing. A matrix of tests/checks mapped to requirements; illustrating coverage of requirements by tests during design and completeness of requirements during execution.
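To picture that as data, here is a minimal sketch of a VVRM as a simple requirement-to-test mapping; the requirement IDs, test names and the coverage_report helper are made up for illustration and are not taken from any standard or tool:

```python
# A toy VVRM: each requirement mapped to the tests/checks intended to verify or validate it.
# Requirement IDs and test names are invented for illustration only.
vvrm = {
    "REQ-001": ["TC-LOGIN-01", "TC-LOGIN-02"],
    "REQ-002": ["TC-EXPORT-01"],
    "REQ-003": [],  # no mapped test yet -> a coverage gap visible at design time
}

def coverage_report(matrix):
    """Split requirement IDs into those with at least one mapped test and those without."""
    covered = [req for req, tests in matrix.items() if tests]
    uncovered = [req for req, tests in matrix.items() if not tests]
    return covered, uncovered

covered, uncovered = coverage_report(vvrm)
print("Covered requirements:", covered)
print("Requirements with no mapped tests:", uncovered)
```

During execution the same structure could also carry a pass/fail result per mapped test, which is roughly where the "completeness during execution" view would come from.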
Test – An exercise intended to investigate the target (Unit, Interface, Feature, Functionality, System, System of Systems, Process etc) to provide information on suitability to release to the next stage. This is an intelligent investigation guided by scripts but not limited by them.
Check – An inspection of the target (Unit, Interface, Feature, Functionality, System, System of Systems, Process etc) to assert a specific condition, either positive or negative. A check has a defined input and output.
System under Test – The system that is the target of the test, as opposed to any auxiliary or connected tools/systems used during the test.
It is amazing how many items you can add to a list when you get started. It's a bit like a crossword: the more you stare at it, the more answers you fill in. If I spent all day at this I would still think of more terms I've heard or used.
It’s funny really. I think of Testing as an easy job with lots of nuances and something new every day. But when you compile a list like this it’s not simple at all, it’s very complex. No wonder we have difficulty communicating when we have all of this running around in our head, just waiting to be expressed and used.
August 19, 2014 at 9:12 am #3482
Stephen, Geo-beta is not commonly used by all. It is a stage where the product is made live in some countries, or for a group with just enough users to collect data, but not the target market. Beta is released to a larger group of users, from which market size and revenue-related decisions can be made, but again not the actual target market. Beta and geo-beta can have the same meaning for many companies, though.
August 19, 2014 at 11:20 am #3488
So the list continues to grow! I might have opened a can of worms here!!
@Padmaraj – the wiki sounds like the way to go. We will factor that into the next round of changes to this site which will commence fairly soon.
@Vivi, @Afreen – thank you both, keep them coming!
What I would like to do is create an exhaustive (or near-exhaustive) list and then agree on the most common sub-groups by which material on this site would be categorised. The list itself could be updated/maintained as a wiki to include definitions as @Stephen mentioned.
August 19, 2014 at 11:41 am #3489
So @Stephen, where would your suggestions sit in the current list?
VVRM – in Requirements?
Verification? Functional Testing?
Validation? Is this a synonym for Non-Functional Testing or something else?
V-Model & W-Model – Test processes?
Tests – can apply to several steps along the SDLC? This is less likely to be used as a tag for website content, but another term to be defined.
Checks – similarly to tests, they are not confined to one particular step in the SDLC? This is less likely to be used as a tag for website content too, but another commonly used term.
System under test – this is the system being tested? Again, less likely to use as a tag but something that should be contained in a wiki of terms and definitions.
From the outside looking in, testing definitely appears complex and I think the terminology plays a part in that. I don't think it would be easy to get to grips with the coining of new phrases and acronyms that continue to appear. However, if there was a means of understanding the relevance and context of these terms then there's a greater likelihood of testers being able to share their experiences and seek out relevant knowledge that does exist (somewhere) rather than assuming their problem hasn't been solved before.
August 19, 2014 at 11:45 am #3490
Incidentally, @jane-moller-nash mentioned by email that the ISTQB may have a similar list to this – has anyone got a link to it? For some reason I'm having trouble accessing their website today!
August 19, 2014 at 12:12 pm #3491
VVRM is a test management tool. It should be built during the setup phase, updated and managed through the design phase, monitored and used for reporting during execution, and used in the final reports during the exit/closure phases.
The V-Model and W-Model are test processes.
Verification is done during unit, integration and system testing – functional testing.
Validation: non-functional for me means stress and performance testing. But as an antonym for the earlier phases known as functional, I suppose validation is a synonym for non-functional. This is where terms get confusing and need clarification.
Validation should be done at all phases of the project, i.e.:
Validate that the requirements meet the customer's needs, and not just the contract.
Validate that the integration meets the customer's needs, particularly if you're integrating with 3rd-party or customer systems, and that they don't just meet the requirements and specs.
Validate that the system supports the business needs, not just meets the requirements and specs.
Validation is about how the customer will use the product, not about how it was designed and built.
@Afreen: Thanks for the explanation of Geo-beta. It seems to me to be a name for a way of phasing a Beta roll out.
August 19, 2014 at 1:27 pm #3492
Some more terms:
Common terms used in testing: Attack(negative testing), bug, defect, Defect density, COTS, entry/exit criteria, incident management
Testing: Confirmation Testing, dynamic testing
Test Management: Configuration Management, Defect Management, Defect life cycle, Version Control
Tools & Techniques: Debugging (tool), Simulation, Decision Coverage
August 19, 2014 at 1:28 pm #3493
@Paul:
Check out this link: http://norwood-technologies.com/software-testing/istqb-foundation-level-glossary-terms/
August 19, 2014 at 3:51 pm #3496
Thanks @Jayanthi, that ISTQB list appears to be an alphabetical list of known terms. How they relate to each other is where it gets interesting.
@Stephen, thanks for the explanations!
I wonder, is it common for testers to use a large subset of this, or are there smaller subsets that would be commonplace in different organisations/regions?
For example, would everyone working in testing in a large multinational use the same terminology regardless of where their testing team members might be located?
August 19, 2014 at 4:04 pm #3498
@Paul lol, "use the same terminology regardless of where their testing team members might be located?"
I work in a multinational, multi-cultural, multi-phased company. And however much I try to promote and enforce a common taxonomy, people will interpret their work differently. I also find some terms are contextual, e.g. depending on who you are and what your role is in the whole SDLC.
e.g. Acceptance Testing and UAT.
For an engineer/developer, acceptance testing is done by the testing team and internal users. So their definition of UAT, and who does it, refers to internal users and the team/group receiving their output. They don't always have a concept or view of the end user. If they do, it's CAT, Customer Acceptance Testing, or Operational Testing.
For the system test team, UAT is conducted purely in the test environment by testers who try to simulate users.
Then there is the operational team and various delivery teams who are customer facing and refer to UAT as testing with the customer and end users.
At least three different definitions of UAT, depending on your position and outward view. I am constantly trying to educate all teams on a common use of the term, and the correct terms from a holistic perspective, so that everyone knows what is meant when each term is applied.
One problem is the sales cycle not engaging with the test practice early enough and inventing their own terms. Once a term is set with the customer, you have to maintain it for consistency and the appearance that your company knows what it's doing. Again, I'm fighting to get a common language established.
So any attempt to provide a central, industry standard taxonomy, glossary, dictionary, is good for me and the industry.
August 19, 2014 at 6:54 pm #3499
A "taxonomy" and a "glossary" are not birds of the same feather. It is my understanding that a "taxonomy" is a classification scheme, as in "Bloom's Taxonomy", whereas a "glossary" is employed to document the definitions of the terms of a domain of discourse. At least this is my understanding.
As a systems engineer I always press for the establishment of a project glossary at the outset, but as Stephen mentions, that is a challenge.
I would like to suggest that a standard wiki (i.e., Wikipedia) may not fully satisfy the requirements being expressed here, as I understand them. If we agree that the vocabulary of a domain of discourse possesses an architecture, then modeling may be the best manner to articulate and describe our vocabulary. The principal element of the model would be a "term". Suggested attributes of a term might be: its definition(s); its acronym [0..1]; the term's status as the preferred term (Boolean); if not preferred, the identity of the preferred term; source citation; and an optional comment. Relationships amongst the terms add additional understanding of the vocabulary. Uses of terms in a context can be represented. Just an idea to add to the discussion.
Modeling enables the capture of relationships between elements, which may be more likely to satisfy the requirements.
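To make the modeling idea a little more concrete, here is a minimal sketch of what such a "term" element and its relationships might look like as a simple data model; the class names, fields and example entries are illustrative assumptions only, not taken from any existing standard or glossary:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Term:
    """One entry in a hypothetical testing-vocabulary model."""
    name: str                                          # e.g. "User Acceptance Testing"
    definitions: List[str] = field(default_factory=list)  # one or more definitions
    acronym: Optional[str] = None                      # 0..1 acronym, e.g. "UAT"
    preferred: bool = True                             # is this the preferred term?
    preferred_term: Optional[str] = None               # if not preferred, the preferred term's name
    citation: Optional[str] = None                     # source citation for the definition(s)
    comment: Optional[str] = None                      # optional free-text note

@dataclass
class Relationship:
    """A named link between two terms, e.g. 'synonym-of', 'part-of', 'used-in-context'."""
    subject: str
    predicate: str
    target: str

# Illustrative entries only:
uat = Term(
    name="User Acceptance Testing",
    acronym="UAT",
    definitions=["Testing performed with the customer/end users to confirm the product meets their needs."],
)
cat = Term(
    name="Customer Acceptance Testing",
    acronym="CAT",
    preferred=False,
    preferred_term="User Acceptance Testing",
)
links = [
    Relationship("User Acceptance Testing", "part-of", "Acceptance Testing"),
    Relationship("Customer Acceptance Testing", "synonym-of", "User Acceptance Testing"),
]
```

A taxonomy for tagging site content could then be one view derived from the "part-of" relationships, while the glossary would simply be the flat list of terms and their definitions.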
August 20, 2014 at 9:31 am #3500
Thanks Geoff, a taxonomy is what we require in order to make material on this website more efficiently discoverable – a consistent classification of existing and new material.
The idea snowballed relatively quickly but if this leads to a collaborative effort to create a software testing vocabulary model, I’d be happy to help out wherever I could. I guess it would be relatively easy to derive a taxonomy from an established model but modeling a vocabulary will take a great deal longer (I assume, knowing little about data modelling).
August 20, 2014 at 9:58 am #3503
@Stephen, it seems crazy to me that you can have three (or more) interpretations of UAT – which effectively means that any of the terms listed above could have multiple interpretations. Scary!
What do you think of Geoff’s idea regarding a vocab model for testing?
August 20, 2014 at 3:22 pm #3538
I am surprised no one has mentioned ISO/IEC 29119 Software and Systems Engineering — Software Testing, Part 1: Concepts & Definitions. The Contents section pretty much identifies the testing through the SDLC you are trying to address. Plus, as the name suggests, it also includes a Terms & Definitions section.
Why re-invent the wheel? The drawback being that you would need to buy the standards document to have visibility of its contents.
August 20, 2014 at 3:46 pm #3539
@Paul
If I really understood how to model something, it would be a good idea. It seems we are looking at a vocab issue on the subject now.
Is it a taxonomy, a glossary, a dictionary, a wiki, a model or what? 🙂
As with many people, I use a glossary in a lot of the docs I produce so people know what I mean when I use a particular term or acronym.
The idea of a dictionary is really just an extended Glossary.
A wiki takes that further, e.g. from a short one-liner, to a short definition, to a short essay.
A taxonomy, surely, is one type of model, even if it's a simple one.
I think each of these has value to someone at some point. Like having a full text book and a pocket guide on the same subject.
A Glossary is a good starting point. Thrash out all of the terms and short definitions.
This can then be expanded along two routes: 1 – a wiki to take the definition and explanation further, and 2 – a model to show when and how to use each term.
I think in testing a taxonomy is limited as it's a bit two-dimensional. A full model could be multi-dimensional, showing all the contextual relationships and enabling the user to identify the appropriate terms and methods for each project and situation.
@Keith I've looked at ISO/IEC 29119, as far as you can without paying out, and at the moment I agree with a lot of the chatter on various social media sites: I don't think it's good for the industry. Saying that, I've not seen the definitions provided by the standard.
To be honest, most of the standards I've seen, trained on or used are really best used to standardise the language, attempting to ensure we all know what each other is talking about, but not for how to do the job. None of the standards I've worked with go all the way and include all the terms used; you still end up with people talking a different language at times.
An initiative like this is more likely to capture the vast range of terms used by all, and not just a select group of academics (I do realise that some of the committee that defined ISO/IEC 29119 are testing professionals as well).
Let's face it; the world today is a world of global collaboration and development by social media, not by committees and red-brick universities or institutional organisations.
Sorry, I seem to have slipped off subject.
August 20, 2014 at 6:02 pm #3541
@Paul, I concur that a taxonomy is required for discovery. Your original list confused me because from my perspective I recognized some things as classes of things whilst others appeared to be things. Other contributions were clearly directed towards a glossary vice a lexicon.
So then the question becomes how to group things. What is the basis of our classification scheme? Do we take a perspective aligned to the process classification schema employed by ISO/IEC 15288?
System Life Cycle Processes
Agreement
Acquisition
Supply
Enterprise
Life Cycle Model Management
Infrastructure Management
Project Portfolio Management
Human Resource Management
Quality Management
Project
Project Planning
Project Assessment and Control
Decision Management
Risk Management
Configuration Management
Information Management
Measurement
Technical
Stakeholder Requirements Definition
Requirements Analysis
Architectural Design
Implementation
Integration
Verification
Validation
Operation
Maintenance
Disposal
ISO/IEC 15289 has an excellent (IMHO) classification schema for information items:
Information Items
Description
Concept of operations
Database design description
Interface description
Proposal
Service catalog
Software architecture description
Software design description
Software unit description
System architecture description
System element description
Plan
Acceptance plan
Acquisition plan
Asset management plan
Audit plan
Capacity plan
Configuration management plan and policy
Development plan
Disposal plan
Documentation plan
Domain engineering plan
Improvement plan (process improvement plan, service improvement plan)
Information management plan
Information security plan
Installation plan
Integration plan (implementation plan)
Maintenance plan
Measurement plan
Project management plan
Quality management plan (quality assurance plan)
Release plan
Reuse plan
Risk management policy and plan
Service Availability and continuity plan
Service management plan
Training plan
Validation plan
Verification plan
Policy
Configuration management plan and policy (change management policy, release policy)
Improvement policy
Information security policy
Life cycle policy and procedure
Quality management policy and procedure
Risk management policy and plan
Procedure
Audit procedure
Capacity management procedure
Configuration management procedure (change management procedure, release management procedure)
Complaint procedure
Implementation procedure
Incident management procedure
Life cycle policy and procedure
Maintenance procedure
Operational test procedure
Problem management procedure
Process assessment procedure
Qualification test procedure
Quality management policy and procedure
Software unit test procedure
Supplier management procedure
Supplier selection procedure
Training documentation
User documentation
Report
Acceptance review and testing report
Audit acknowledgement report
Audit report
Configuration status report
Evaluation report
Incident report
Installation report
Integration and test report
Monitoring and control report
Problem report
Process improvement analysis report
Product need assessment
Progress report
Qualification test report
Review minutes
Service report
Software unit test report
User notification
Validation report
Verification report
Request
Change request
Customer satisfaction survey
Request for proposal
Resource request
Risk action request
Specification
Contract
Service level agreement
Test specification
Process Specification
Stakeholder Requirements Specification
System Requirements Specification
Software Requirements Specification
Do we look to the OMG's MetaObject Facility (MOF) and build a test domain business model? http://www.omg.org/mof/
The OMG's UML Testing Profile provides some useful ideas on the classification of test artefacts: behavioral vs. structural. It is currently under revision. The OMG has also recently released the TestIF Specification (Test Information Interchange Format) into beta – another source for classification concepts.
Mention has been made of ISO/IEC 29119 which I think is a good starting point (IMHO) for a glossary as well as providing food for thought on building the lexicon.
I'd like to see the ISTQB's glossary harmonized with the ISO and OMG glossaries; perhaps there are others I'm not aware of that provide value. I believe standardization to be a good thing, as it is necessary for effective communication. Standards are not necessarily without flaws. I feel strongly that the ISO does a disservice in charging license fees that individuals and small entities find to be a barrier. The fees are a clear inhibitor to the use and adoption of the standards, as exemplified in this thread.
As to “social media”, I question its ability to be as effective as a committee or any organized body bringing together committed individuals for the purpose of reaching a consensus to better a specific domain. Social media is too easily overwhelmed by bullies and crackpots. Apologies for the digression.
August 21, 2014 at 8:18 am #3559
@Keith – thanks for the suggestion. I agree, if a body of work already exists and it's something testers use, then well and good – let's not undertake work for the sake of it. However, if we're to use the ISO document and amend it, we run into copyright issues, I'm sure?
@Stephen taxonomy|glossary|model|dictionary|lexicon – just Google each of those and you’ll come across several (albeit closely related) definitions of each – we could start with a glossary of terms for our new taxonomy or is it the other way around?!
@Geoff thanks for the further references, the list goes on :-). Gathering and harmonising existing glossaries might be the starting point?
A summary:
From what I can see, there is interest in (and a need for) a standard language for testing (one that doesn't cost 178 Swiss francs for everyone to access). There is interest in this being a community project, but reluctance to allow the 'squeaky wheels to get the most oil' by making social media the basis for creating a language.
How about a community-elected committee of committed individuals who are prepared to take responsibility for taking this forward? A forum could continue to be a place where aspects of the project are discussed (proactively) by all and where progress and updates are shared. Ultimately it’s a committee who finalise definitions, concepts, relationships etc. Could this work and crucially, would people be willing to commit to what is a considerable body of work?
August 21, 2014 at 9:23 am #3561
Knowing how much effort went into producing the ISO/IEC 29119 standard, I think 178 Swiss Francs is a small price to pay to avoid the need to re-invent the wheel & put in significant effort from multiple individuals to effectively re-write the standard from scratch.
Assuming everyone had access to the standard and wanted changes made, I could act as liaison to get recommended changes reviewed for incorporation into subsequent revisions of the standard.
A standard is not a document that everyone has to adhere to in its entirety; rather, it is a series of guidelines and recommendations for what test professionals should consider in undertaking their role.
Accreditation to the standard is an option, but it is not mandatory (unless your customers require it of you).
August 22, 2014 at 2:14 pm #3585
Hi all, I wasn't aware of this website until now: http://testingstandards.co.uk/. I presume testers in the UK are aware of it. Are these standards (or parts of them) employed, and if not, why not?
August 22, 2014 at 2:22 pm #3586
I wasn't aware of that site, which is ironic, as I am aware of Paul Gerrard, who hosts it.
I’m going to have to explore that site now.
August 22, 2014 at 2:32 pm #3587