The Unconventional Wisdom series features posts from EuroSTAR Huddle member, Robin Goldsmith, with his opinions on different areas of software testing. You may not always agree with him or he may share truths that you otherwise might not realise. In the latest installment, Robin queries the usefulness of tool demos versus proof of concept. Explore more posts in the Unconventional Wisdom series here.
You’ve probably attended vendor demonstrations of automated tools, for testing and other purposes. At conferences and professional meetings, demos are usually displayed on medium-to-large front-of-the-room screens or on an oversized monitor in a vendor’s booth. Increasingly vendors demo via online webinars on your own computer.
Vendors rely on tool demos as perhaps their main sales method for showing off the wonders of their tools. Nothing conveys the tool’s capabilities and advantages better than a well-choreographed demo by a skilled demonstrator. The tool essentially sells itself.
In turn, prospective tool buyers rely largely on demos to learn about and determine whether to purchase particular products. Buyers generally recognise that watching a demo does not suffice to make them proficient with the tool; but it can give them a sense of understanding many find sufficient to place an order.
This is pretty much what I call "conventional we's dumb": mistaken conventional wisdom. Everybody knows that what looks so easy and good when the vendor demos it often does not match buyers' experiences trying to use the tool themselves.
Actually, vendor demos don’t always go so well either. Back in “the day” when Bill Gates appeared at various conferences on behalf of Microsoft, twice I attended his keynotes where the latest version of some product blew up mid-demo. He’d reboot and continue on; and Microsoft seems to have done okay in spite of it.
I frequently see less well-known, but frankly probably more proficient, users of a product encounter some error during a demo. They may get an error message or often end up somewhere they didn't intend. Usually they recover quickly enough to gloss over the error. On the other hand, those of us who don't know the product are left wondering what they did to cause the glitch, what they did to get out of their problem, and why we shouldn't expect the same issues if we try to use the product ourselves.
These are circumstances that vendors surely recognise reduce the effectiveness of typical tool demos, but which vendors seem content to live with. However, let me suggest there are some other common demo experiences that vendors seem unaware of, and which not only interfere with their intended messages but can even essentially unsell the product.
Most software tools have certain similarities to commercial jetliner cockpit instrument panels. Both jam into a small reachable space all manner of dials, displays, and other controls that are meant to maximise operational efficiency when used by highly trained and experienced operators.
A less-informed observer would have no idea what most of the controls are, let alone what they’re for, how they work, or how to use them. That’s literally why commercial pilots get the big bucks! Time after time they do the impossible by getting these way-heavier-than-I-am-and-I-can’t-fly behemoths off the ground, to their destination, and back on the ground safely.
Imagine an experienced commercial pilot is showing you what all the instruments are, quickly pointing at one and then another, maybe naming it or briefly saying something about it, and so on. You’re too far away to read the label under each instrument or to read what it says. You may not even be sure which one the pilot is talking about at any given moment. Yes, you’re probably impressed with the pilot’s knowledge; but do you really know any more than when the pilot started showing you the cockpit? I doubt it.
Vendor tool demos fall into the same trap. Tool screens are packed with small-point-size fields so their skilled users can enter and display lots of data that the tool works with. Even when the tool is on your own computer’s small screen, it can be next to impossible to read field labels and contents, especially as the demonstrator bounces from field to field at a typical demo pace, which is intended to show a lot of actions very quickly as evidence of how easy and fast the tool is to use.
Often you can read even less small-point data on a much larger screen that is fairly far away from you at the front of a conference or meeting room. What's the net sales benefit of a demo whose content you literally cannot read?
Then It Gets Worse
Even on those rare occasions when you can read field names along with what’s being entered and displayed, it’s pretty likely you have no idea why the demonstrator is doing what they’re doing. Often, the demo is using some sample case situation with transactions and rules that don’t mean anything to you.
Moreover, even when you understand the application, tool demos seldom explain the rationale or flow of how the tool works. Thus, you have little idea why a demonstrator clicks on this field, enters who-knows-what into that field, or sees some value, color, or meaningless icon in yet another place.
If you're not yet confused enough, try following what's happening when the demonstrator "goes off script," suddenly jumping to some other part of the tool than where the expected flow seemed to be leading. This is always the effect when the demonstrator makes a mistake and has to take extra actions to recover and return to the point of departure. It also happens frequently when the demonstrator thinks of something interesting to show you, such as bouncing to an inquiry to show data field contents, or perhaps leaving the tool entirely.
In my experience, only a handful of vendors demonstrate their tools in ways that actually serve their and their customers' needs. They first clearly explain what the tool does, how it does it, and why it's a good thing. Then they do it, explaining the case and its data step by step, with sufficient time to read clearly legible, relevant information. They probably show less than half as many actions as the typical demo; but the observers follow all of them instead of none.
“It isn’t what we don’t know that gives us trouble, it’s what we know that ain’t so.”
– Will Rogers
I hope you enjoyed this 'Unconventional Wisdom' blog. Much of conventional wisdom is valid and usefully time-saving. However, too much of it is mistaken and misleading, yet blindly accepted as truth, what I call "conventional we's dumb". Each month, I'll share some alternative, possibly unconventional ideas and perspectives I hope you'll find wise and helpful.
The opinions expressed in this blog are those of the authors. They do not purport to reflect the opinions or views of EuroSTAR Huddle or its other members.
See more software testing resources on EuroSTAR Huddle.
For more information on testing tool providers, see the EuroSTAR Software Testing Conference which brings together the top testing tool and service providers for Europe’s largest testing expo.