December 12, 2019 at 8:09 am · #23916
Liana (Participant) · @lianamelissa
Hello all, I work for a private company that offers a Linux appliance along with a custom suite of applications that runs on it. The suite consists of about 13 C++ applications: one is a Qt GUI application, and another runs as a service, controlling all the other components on behalf of the GUI via IPC. The Linux appliance itself is just a Debian install running on specialized hardware that meets the specific needs of our customers.
Currently the developers work in git on their own development machines, and when the application suite needs to be built for testing, we use Jenkins to clone the repo and build each application sequentially in a single pipeline. We are working on making the individual applications build in parallel, but first we have to resolve some dependency issues (for example, one application requiring that another already be built before it can build itself). The sequential pipeline gives us a build time of over an hour, and if the second-to-last application fails, we have wasted all that time only to uncover a problem that has to go back to the developers. Not ideal.

Once the suite is built, it is sent off to QA, who load it onto their own appliances and start executing their tests. There are no web applications, and the suite is not public or even downloadable; it is provided to the customer by production/sales as the Linux appliance with the application suite already installed.
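For what it’s worth, the parallel build we’re after can often be expressed directly in a declarative Jenkinsfile: applications that others depend on build in an early stage, and the independent ones build concurrently in a parallel block. A rough sketch — the stage names, directories, and make targets here are made up for illustration, not our real layout:

```groovy
pipeline {
    agent any
    stages {
        // Components that other applications link against must build first.
        stage('Build shared components') {
            steps {
                sh 'make -C common-lib'
            }
        }
        // Applications with no dependency on each other build concurrently,
        // so one slow or failing build no longer serializes the rest.
        stage('Build applications') {
            parallel {
                stage('GUI') {
                    steps { sh 'make -C gui-app' }
                }
                stage('Control service') {
                    steps { sh 'make -C control-service' }
                }
                // ...one stage per remaining application...
            }
        }
    }
}
```

A nice side effect is that a failure in any parallel stage surfaces as soon as that stage fails, rather than after everything before it has already built.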
Our company has started to take a more Agile approach, so we’re really trying to improve the way we do business. Among other things, there are a lot of rumblings about incorporating Docker into our workflow to make things easier. I have been reading the docker.io docs and googling non-web-app use cases, but I can’t see how Docker would realistically fit into our current environment.
There are a couple of common benefits I’ve read about that I don’t think apply to our organization.
Standardization. Removes the ‘works on my machine’ problem.
The developers use their own development environments on top of the Linux appliance. When the application suite is built and sent to QA, they load it onto their own appliances, simulating what the end user would have. I don’t think Docker would help here: the appliances would need Docker Engine running in order to use containers, and that is not how the appliance will be shipped to the customer.
Makes Deployment Faster/Easier
We don’t actually deploy anything externally. We ‘deploy’ the application suite internally via Ansible to whoever wants it on their appliance, but we’re not at such a scale that QA can’t just copy the suite and install it on their own machines. We don’t deploy to servers or the cloud; everything runs on the appliance itself.
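For context, that internal ‘deploy’ is nothing fancy — roughly a play like the one below, where the host group, paths, and service name are changed for illustration:

```yaml
# Illustrative play: push the built suite to internal appliances.
- hosts: internal_appliances
  become: true
  tasks:
    - name: Copy the application suite onto the appliance
      ansible.builtin.copy:
        src: build/suite/
        dest: /opt/suite/
        mode: preserve

    - name: Restart the controlling service
      ansible.builtin.service:
        name: control-service
        state: restarted
```

At our scale, QA copying the directory by hand accomplishes the same thing.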
The only real benefit I can see for Docker is during the build process, assuming we can resolve the dependency issues that affect some of the applications. I’m all for trying new things, but they usually have an obvious benefit/ROI. I just can’t see that with Docker. Is there anything else that I’m overlooking or just not seeing?
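That build-time benefit is real: Docker caches each build step as a layer, so when only one application’s source changes, the toolchain and already-built layers are reused rather than rebuilt. A minimal sketch, assuming a Debian-based build image — the Debian version, package list, and directory names are assumptions, not our actual setup:

```dockerfile
# Base chosen to match the appliance's Debian release (version is an assumption).
FROM debian:10 AS build

# Toolchain layer: re-runs only when this package list changes.
RUN apt-get update && apt-get install -y build-essential qtbase5-dev

WORKDIR /src

# Shared components first, so dependent applications can link against them;
# this layer is reused as long as common-lib/ is unchanged.
COPY common-lib/ common-lib/
RUN make -C common-lib

# Application sources change most often, so they are copied last.
COPY . .
RUN make all
```

The built suite can then be copied out of the image (e.g. `docker create` followed by `docker cp`), so nothing container-related ever has to run on the appliance itself.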
Thanks and regards.

June 8, 2020 at 10:12 am · #24822
Darwin (Moderator) · @darwin
Based on your application specifics, you could ask your team to set up a common test server as a QA environment where all developed features are deployed for you to test; that would save you from pulling the latest image to your local machine every single time you need to test. If your applications are communicating through a service, do they expose an API? If so, have you considered sending requests through that API?
Docker is not quite a virtual machine; it isolates environments at the operating-system level in the form of containers, as needed. My understanding is that developers could simulate the end user’s environment in a Docker image and then add the test suite on top of it to support testing. But if you feel you should test the application the way the end user would, and that serves your testing needs better, then you can come up with end-to-end use cases and try them out in a pilot run.
Yes, Docker does help remove the ‘works on my machine’ problem.
While Docker might not be strictly necessary in your case, I believe learning it alongside the developers would be a helpful addition to your skills.
On another note, I would suggest talking with your team to get a clear, shared understanding of your process: what you do, the hardships you face, the improvements you can suggest, and what others think and would suggest to improve the process. They should understand that testers and developers are working toward the common goal of shipping the product error-free. Then agree on terms that help you and the developers align on that goal.
Here are some resources I can recommend for understanding what Docker does and how it can be used: