The RisingSTAR Award recognises innovation in the software testing and quality assurance industry. It is self-nominated, and the winner is chosen by the RisingSTAR supporter group, who provide mentorship and guidance as the winner develops their idea.
Kimberly Snoyl of the Netherlands was awarded the EuroSTAR 2022 RisingSTAR Award for her idea to create a HeuriChecker to automate the Usability Heuristics. Over the coming year, Kimberly will provide updates on her progress and experiences as the 2022 RisingSTAR. This is her first update. You can find out more about RisingSTAR on our main award page.
Beyond my wildest dreams, I actually won the RisingSTAR award 2022. In this first blog I would like to tell you more about the idea and what progress has been made since winning the award.
Why do we need UX Testing?
As a tester with a background in UX, I noticed that UX Testing is not very common among testers. In my experience, there is not a lot of time in a sprint to focus on important quality aspects other than “working functionality”, and UX findings are usually rated as less important by the business, so they are put at the bottom of the backlog and never picked up (if they even make the backlog). Since I have an MSc in Interaction Design, I know that UX can sometimes be more important than working functionality: the functionality may work, but when the UX is not considered, users may have a hard time reaching their goals on your website and get frustrated in the process. So, I took a Usability Testing course at Nielsen Norman Group (NNgroup), THE gurus in the UX field, and created my own course to share the knowledge with my colleagues. The feedback on my course was that testers don’t have the time for traditional Usability Testing, and they wondered what other techniques they could use that fit in a sprint. I wrote a blog about my research on this, but the most practical tip was using a UX checklist: Nielsen’s 10 Usability Heuristics.
I started presenting on this topic based on that blog, and during an interview at the Agile Online Summit, the host, Neil Killick, asked me whether these Heuristics could be automated. That’s when the lightbulb went on and I started researching it. The answer is not straightforward, since UX is seen as something subjective and therefore hard to automate. But since these Heuristics are the same for every website, it sounds like something that can be automated! And so, the idea was born.
In 2020 I asked for an intern to research whether my premises were correct and whether it is possible to automate these heuristics. The premises were confirmed with a small group:
- Testers don’t know how to test UX in a structured way
- Testers don’t know how to articulate their “feeling” that the usability can be improved
- Testers are always in a time squeeze, so the focus is on Functional testing
- Testers would rather focus on Automation than Usability (which is quite obvious, since it can save them time)
- There is not always UX expertise present (e.g., low code applications)
And so the idea of the UX-Tester was born*: a tool to automatically test any application against the 10 Heuristics. The research explored two directions: a library that testers could import when automating tests, or a website where you can enter any URL to have its UX checked. The latter was more appealing to me because I want the tool to be accessible to everyone, not only technical testers; this way anyone can check their website with it. It also eliminates the need to create a plugin.
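To give a feel for the website-scanner direction, here is a minimal sketch of my own (a hypothetical illustration, not the actual tool) that scans a page’s HTML for two easily automatable proxies for the Heuristics: images without alt text and form inputs without an associated label, both of which make it harder for users to recognise what they are looking at.

```python
from html.parser import HTMLParser

class HeuristicScanner(HTMLParser):
    """Hypothetical sketch: flag simple, automatable usability issues."""
    def __init__(self):
        super().__init__()
        self.findings = []
        self.label_for = set()   # ids referenced by <label for="...">
        self.input_ids = []      # id (or None) of each visible <input>

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "img" and not a.get("alt"):
            self.findings.append("img without alt text")
        elif tag == "label" and "for" in a:
            self.label_for.add(a["for"])
        elif tag == "input" and a.get("type") not in ("hidden", "submit"):
            self.input_ids.append(a.get("id"))

    def report(self):
        # An input is only labelled if some <label for="..."> points at its id
        for iid in self.input_ids:
            if iid is None or iid not in self.label_for:
                self.findings.append("input without associated label")
        return self.findings

html = """
<form>
  <img src="logo.png">
  <label for="email">Email</label>
  <input id="email" type="text">
  <input type="text">
</form>
"""
scanner = HeuristicScanner()
scanner.feed(html)
print(scanner.report())  # one alt-text finding, one unlabelled input
```

A real version would fetch the URL, run many more checks, and, crucially, explain each finding in terms of the Heuristic it relates to.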
*Still working on the name. Heuricheck app or Heurichecker are now off the table. If you can think of a catchier name, please let me know!
First Supporter call
As a RisingSTAR, I get access to 20 of the most well-known testers at EuroSTAR, so we had a first group call to get acquainted. In this call we discussed whether you would even want to automate UX Testing. Well, my answer is: yes. Automation saves time. But of course, it is true that not all the Heuristics can be tested automatically, so the most important part of the tool is to also teach testers HOW to test UX, giving them examples of best and worst practices.
I had some one-on-one calls and received advice from Kari Kakkonen to create a survey to find out if there is a need for this tool. I also wanted to verify my premises with a wider audience, so I created this survey. Feel free to spread it to every tester you know! After my call with Isabel Evans, I was inspired and made a presentation with all the information on the “UX-Tester”, including the next steps:
- Getting more expert advice on the possibilities of automation; we will most likely need AI, so the tool can learn from good examples
- Finding ways to give manual UX Testing a prominent spot on the “UX-Tester” website, so people do not get the wrong idea when they see the scores of what was checked automatically
- Making it clearer how much can and cannot be tested automatically
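One way to make that distinction explicit (again, purely my own illustrative sketch, not the project’s design) is a simple mapping of Nielsen’s 10 Heuristics to whether automated checks can at least partially cover them; the classification below is my assumption, and the website could use something like it to present scores honestly.

```python
# Hypothetical classification (my assumption): which of Nielsen's 10
# heuristics have at least partial automated proxies ("partial"), and
# which genuinely need a human evaluator ("manual").
HEURISTICS = {
    1: ("Visibility of system status", "partial"),
    2: ("Match between system and the real world", "manual"),
    3: ("User control and freedom", "manual"),
    4: ("Consistency and standards", "partial"),
    5: ("Error prevention", "partial"),
    6: ("Recognition rather than recall", "manual"),
    7: ("Flexibility and efficiency of use", "manual"),
    8: ("Aesthetic and minimalist design", "partial"),
    9: ("Help users recognize, diagnose, and recover from errors", "partial"),
    10: ("Help and documentation", "partial"),
}

def coverage_summary(heuristics):
    """Report how many heuristics an automated scan can (partly) cover."""
    auto = [name for name, kind in heuristics.values() if kind == "partial"]
    return f"{len(auto)} of {len(heuristics)} heuristics have automatable checks"

print(coverage_summary(HEURISTICS))
```

Showing this split alongside any automated score would make clear that the tool supplements, rather than replaces, manual UX evaluation.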
Isabel also introduced me to Chris Porter from the University of Malta, who has a student who is now also researching automatically testing aesthetics, which is Heuristic #8! I have a call planned with him to learn more about that.
I also had a call with Rik Marselis, who is involved in the T-MAP books, to check whether the information on UX Testing in them is still scarce. It turns out there is now an entire chapter on Usability Testing, but it still lacks practical examples. Since the practical examples live online rather than in the book, I might create content with practical examples for the T-MAP website.
I was also invited to speak at TestFlix, organised by The Test Tribe, India’s largest software testing community. In a 15-minute atomic talk, I spoke about this idea for the first time!
So, this is the update for now. I wish you all a great summer holiday, and I will get back to you in the autumn with the next update!
Check out all the software testing webinars and eBooks here on EuroSTARHuddle.com