Within the ICT department of a large financial institution in the Netherlands a transition is taking place from traditional system development to DevOps. Key elements in the new way of working are continuous integration, test automation and thus continuous testing. A fully equipped and automated Test Data Management (TDM) process is a precondition for the overall test approach within DevOps. We know that implementing DevOps in general isn’t an easy ride. We experienced that the road to implementing the required TDM in particular is bumpy as well.
TDM consists of subsetting and data masking. Several tools are for sale in the marketplace to support these two base TDM functionalities. For us, the implementation of a TDM tool – including a Proof of Concept (PoC) – was challenging from both an organisational and a technical point of view.
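To illustrate the masking half, here is a minimal sketch in Python (the function, key and account numbers are invented for illustration, not a feature of any particular TDM tool) of deterministic pseudonymization: a keyed HMAC maps equal inputs to equal masked values, so relationships between tables survive masking.

```python
import hashlib
import hmac

# Hypothetical key; in practice it would come from a managed secret store.
SECRET_KEY = b"replace-with-a-managed-secret"

def mask_value(value: str, length: int = 12) -> str:
    """Deterministically pseudonymize a value: equal inputs always yield
    equal outputs, so relations between tables survive masking."""
    digest = hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:length]

# The same (fictitious) account number masks to the same token in every
# table, so joins on masked columns still work; different inputs diverge.
token = mask_value("NL00BANK0123456789")
assert token == mask_value("NL00BANK0123456789")
assert token != mask_value("NL00BANK9876543210")
```

Because the mapping is deterministic rather than random, the same customer stays recognisable as one customer across all masked tables, which is exactly what subsetting and testing need.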
Organisational Challenges for Test Data Management
From an organisational perspective, we had to deal with the following:
- It starts with “getting the TDM requirements straight”. The pitfalls of having expectations that are too high on the one hand and wanting to maintain the old way of working on the other are always close by. Examples of TDM requirements are: test environments can only be filled from a frozen and anonymized staging environment, or: after each test run the complete test environment will be emptied and refilled. Make people aware of the target (“SOLL”) situation.
- How do you define the scope of the PoC without turning it into a real implementation? Focus only on the technical challenges you want to tackle; the standard features can be trusted.
- When implementing a TDM tool, database administrators have to hand over DBA privileges to a tool that is used and maintained by the testing department. They are not always willing to do so, so some diplomacy is required.
Technical Challenges for Test Data Management
From a technical perspective, we had to overcome issues like:
- An incomplete data model: for subsetting it is essential to completely understand the data model. However, not all relations between the tables exist within the data model itself; some relations have only been created within the application software. Although a good TDM tool supports this phenomenon, thorough analysis is still needed to configure the tool correctly.
- Robustness of the TDM process itself: in the process of subsetting, things can go wrong (unexpected data model changes, hardware hiccups and interfering background processes), and the database can be left behind in an unusable state. Procedures and scripts have to be developed to restore the database and to make root cause analysis possible.
- Performance of TDM tools is normally quite good. However, there are always limitations, so choices will have to be made about how large you want your selections of test data to be.
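The implicit, application-level relations mentioned above can be handled by declaring them explicitly in the subsetting configuration. A minimal sketch in Python (tables, columns and data are invented for illustration) of how a subsetter might walk declared relations to a fixed point:

```python
# Invented schema for illustration. Every relation the subsetter must
# follow is declared here, including the application-level one that the
# database itself does not know about.
RELATIONS = [
    # (parent_table, parent_key, child_table, foreign_key)
    ("customers", "id",   "accounts", "customer_id"),   # real FK in the DB
    ("accounts",  "iban", "payments", "counter_iban"),  # implicit, app-level
]

def collect_subset(tables, root_table, root_rows):
    """From a set of root rows, pull in every related child row by
    repeatedly applying the declared relations until nothing changes."""
    selected = {name: [] for name in tables}
    selected[root_table] = list(root_rows)
    changed = True
    while changed:
        changed = False
        for parent, pkey, child, fkey in RELATIONS:
            parent_keys = {row[pkey] for row in selected[parent]}
            for row in tables[child]:
                if row[fkey] in parent_keys and row not in selected[child]:
                    selected[child].append(row)
                    changed = True
    return selected

tables = {
    "customers": [{"id": 1}, {"id": 2}],
    "accounts":  [{"customer_id": 1, "iban": "NL01"},
                  {"customer_id": 2, "iban": "NL02"}],
    "payments":  [{"counter_iban": "NL01", "amount": 10},
                  {"counter_iban": "NL02", "amount": 20}],
}
subset = collect_subset(tables, "customers", [{"id": 1}])
# Only customer 1's account and, via the app-level relation, its payment
# end up in the subset.
assert subset["payments"] == [{"counter_iban": "NL01", "amount": 10}]
```

If the second relation were left undeclared, the payments table would come out empty and the subset would be referentially broken – which is why the thorough analysis of application-level relations pays off.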
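The restore procedures mentioned above boil down to a guard pattern around every subsetting run: snapshot first, and on failure preserve the evidence and roll back. A generic sketch in Python (the steps are stand-in callables, not the institution’s actual scripts):

```python
def run_subset_job(snapshot, subset, restore, log):
    """Guard a subsetting run: snapshot first; if the run fails, record
    the error for root cause analysis and restore the database so it is
    never left behind in an unusable state."""
    snapshot()
    try:
        subset()
    except Exception as exc:
        log.append(repr(exc))  # preserve the evidence before rolling back
        restore()
        raise RuntimeError("subset job failed; database restored") from exc

# Simulated run: the subset step hits an unexpected data model change.
def failing_subset():
    raise ValueError("unexpected data model change")

events, log = [], []
try:
    run_subset_job(
        snapshot=lambda: events.append("snapshot"),
        subset=failing_subset,
        restore=lambda: events.append("restore"),
        log=log,
    )
except RuntimeError:
    events.append("raised")
assert events == ["snapshot", "restore", "raised"]
```

The key design choice is that the error is logged before the restore runs: rolling back first would wipe out exactly the state needed for root cause analysis.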
At the moment we have the TDM process fully automated with Runbook technology. The test specialist places a request at the push of a button, and half an hour later he or she is ready to test. However, this doesn’t mean that a TDM process can be put in place and maintained at the push of a button.