Crowdsourced testing, the main subject of this article, has been growing as a recognised model for testing over the past couple of years. What was once perceived as a novelty has become a commonly recognised alternative approach to on-demand test resourcing, due in large part to large-scale marketing campaigns and high-visibility fundraising by a number of players in the marketplace.
At Centre4 Testing our Cloud Testers team has grown and evolved over the last couple of years, during which time we have encountered a number of challenges and tailored our service to accommodate them. The end result is an OnDemand service that benefits from a crowdsourced model while offering more scope and flexibility than most standard crowd testing models.
What are Crowdsourcing and Crowdsourced Testing?
Crowdsourcing, in its simplest form, is sourcing people to work together to achieve a common goal. That in its own right doesn’t sound any different from a project team or group of people working together, so why does it get such a buzz?
The buzz comes from the ability to undertake a mammoth task using a larger than normal team over a reduced period of time, generally for a lower cost than would otherwise be the case.
Some commonly adopted examples of crowdsourcing include:
Crowd Funding – a crowd of people contributing funds and investment towards a larger goal that individually would be too expensive.
Crowd Sourced Design – sending a design spec to a group of designers who all individually design something to meet the spec.
Crowd Wisdom – commonly seen in the form of online forums, enabling knowledge sharing between groups of people.
Crowdturfing – the malicious side of crowdsourcing, in which crowds are recruited to post fake reviews, inflate follower counts or spread spam as part of astroturfed marketing campaigns.
Crowdsourced Testing – using a large group of people to test a product or application.
The difference between these crowdsourcing models and a standard approach lies mainly in the scalability and variety of resource available. Spinning up a team or connecting with a globally distributed resource pool can be done as quickly and efficiently as today’s technology allows. If you have access to the individuals and they have made themselves available for contact, then whether you require contact with only one or two individuals or with hundreds, there is no additional work to implement.
Crowdsourced testing is software testing’s equivalent of crowdsourcing, applying these same principles of scale and flexibility to the testing of a product or application.
The Potential Pitfalls
Despite the interest around the phenomenon, crowdsourced testing is not well received by everyone. Some argue it has many pitfalls that can be hazardous to a project for a variety of reasons, including security, logistics and questions of experience.
A common concern with crowdsourced testing is security. Who is accessing your systems or testing your application? Is your data secure? Surely opening up your systems to unknown individuals is not a good thing?
Commercial sensitivity comes into question here – should the information in your systems be made available to the world before you are ready to release it? Is there intellectual property that you’re particularly sensitive about pre-releasing to the ‘market’? Is it a platform that’s not intended for public consumption, but rather a system for internal use?
Experience, or the lack of it
What experience do the individuals have? How good are these unknown resources at the job they are being paid to do? Examples of questionable quality sometimes surface in online crowdsourced projects, such as Wikipedia, until quality checks are undertaken to put them right.
How do we control costs and stop them spiralling out of control? Tens or hundreds of individuals working on a project can soon rack up costs if not kept in check.
These are all valid concerns, not to be taken lightly in today’s data-driven world of technology, where everything we do leaves a traceable digital footprint.
So the question is, how do we get that balance between a flexible crowdsourced model and the professionalism of a managed service without these concerns? It sounds like quite a tall order and it is.
How to Overcome the Pitfalls
Know your crowd
Ensuring that members of the crowd are vetted before engaging them is critical and whilst time consuming, this initial step provides assurance of expertise and work history. At this stage a benchmark can be established for the capabilities of the team, essential when promoting your service or entrusting them as a viable means to carry out the task at hand.
This provides the structure required in order to be able to use a crowd testing model where a higher level of security is required, whether that be a requirement for on-site co-location for aspects of the project or special levels of clearance.
Training becomes essential. The ability to grow your capabilities, retain quality resources, meet ever-changing technical requirements and ensure your ‘crowd’ is fit for purpose can only come when members are not only retained but also maintained.
Using a resource simply because they are on your books, rather than because they are the right resource for the job, is an easy way to get yourself into trouble, and it happens readily when you don’t know who your team is or what their abilities are.
The management and reporting of the resources and test effort is a real differentiator for a good solution and is often overlooked.
Planning and coordination, alongside clear guidelines on what is required from the team, are key. Problems experienced by co-located teams are magnified with remotely managed resources, especially when ground rules are loosely set.
The ultimate goal of a well-managed team is no more testers sitting around waiting for code to drop or environments to be ready, no intensive last-minute resourcing to meet urgent project demands, and no more retraining of new contractors or permanent employees due to staff attrition.
Cloud Testers is one such team, set up to meet these requirements. It provides a UK-based, professional resource capable of testing on real devices covering any mobile or desktop environment. Quality is maintained through our Private crowd strategy and secure Cisco infrastructure: for each project we generate a Private crowd that is vetted, managed and maintained by us directly.
The management layer that the Cloud Testers team has developed provides a single point of contact on all projects, removing the challenge of managing remote teams. The fact that we know who our testers are, their capabilities, their availability and their history, not to mention their security status, gives assurance that the concerns described earlier around crowdsourcing no longer apply.
So the question remains: what types of testing are the best fit for the crowdsourcing model, and does crowdsourced testing work? In my view, without the Private crowd model the risks are high. Using Private crowds, with known resources, the opportunities are endless. We can now look not only towards usability testing but also functional testing, exploratory testing, accessibility testing, automation and any other form of testing that would previously have been seen as suitable only for an on-site managed service.
About The Author
As the Head of the Cloud Testers practice at Centre4 Testing, Dan Curtis has grown the service to ensure that it offers the professionalism of a managed service with the flexibility and scalability of a crowdsourced solution.
Dan has extensive experience across a multitude of industry sectors including accountancy, publishing, media, space, defence, web performance and legal. His expertise covers setting up teams from scratch, developing established teams and reviewing strategies, as well as the more technical and traditional hands-on test management activities.
An advocate of bespoke best practice, as opposed to following “the book” and squeezing all scenarios into a single model, Dan is happy to reinvent the wheel to make the best-fit process and tool selection for any organisation he works with, ensuring the client comes first.