A valuable addition to the tester’s toolbox. Interview with Zeger Van Hese. Part 1

12 February 2015
By a1qa

Zeger Van Hese has a background in Commercial Engineering and Cultural Science. He started his professional career in the motion picture industry but switched to IT in 1999. A year later he got bitten by the software testing bug (pun intended) and has never been cured since. He has a passion for exploratory testing, testing in agile projects and, above all, continuous learning from different perspectives. Zeger considers himself a lifelong student of the software testing craft. He was program chair of Eurostar 2012 and co-founder of the Dutch Exploratory Workshop on Testing (DEWT). He muses about testing on his TestSideStory blog and is a regular speaker at national and international conferences.

a1qa: You are known for relating outside things to testing. How can testers benefit from art or any outside perspectives?

Zeger Van Hese: Ah yes, test side stories… it’s a strange thing, really. It was never my explicit mission to bring seemingly unrelated areas into testing, but it is a pattern I first saw emerging in many stories on my blog. Later on, the same happened on my quest for interesting presentation topics: each time I followed my energy, I ended up in weird places – subject-wise, that is (although there have been times when I thought I had been beamed into a surrealist novel).

It started as a disposition to think about testing that way, but later on it dawned on me that testing also benefits from such outside perspectives.

I think the main thing that diverse perspectives can bring to quality assurance consultants is diversity itself – more specifically, team diversity and diversity of ideas. This diversity gives our testing a requisite variety that has the potential to make it more effective. If we build diversity into our teams, each team member will lack some skills, but the team as a whole will have them all. The broader the range of cultural and experiential backgrounds, the more diverse the ways in which people will analyze the software, and the more problems they will find. Diversity is a critical asset in testing, not something to be avoided.

Intellectual diversity has another great benefit: it is a major enabler of innovation. In November last year I presented a keynote at Eurostar, “Everything is connected: Exploring diversity, innovation, leadership”, in which I investigated the relationship between those three central concepts of the conference theme. A lot has been written about the link between diversity and innovation, but the contradictions are striking: some studies show that diversity is good for a team – that it leads to better performance, creativity and innovation – while there are equally compelling ones that reach the opposite conclusion: that it leads to chaos and friction in the workplace. Nancy Adler shed some light on this paradox in her book “International Dimensions of Organizational Behavior”: she found that the creativity of a team does not depend on the presence or absence of diversity, but rather on how well diversity is managed. When managed well, diversity becomes an asset for the team. When ignored, it causes process problems that diminish performance. But that is a leadership issue, a different story altogether.

As you mentioned, one of the specific subjects I have thought and presented about in the past is the link between art and testing (“Artful Testing”). As an art aficionado, I got the idea after reading two books that proudly carry the term “art” in their title: Glenford Myers’ “The Art of Software Testing” (in which I was surprised to find that the word “art” does not appear even once, except on the title page) and “Artful Making” by Rob Austin and Lee Devin (which addresses software development and its resemblance to art). I could not help but wonder: what about “artful testing” – can the fine arts in any way support or complement our testing efforts? As a matter of fact, they can. Testers can benefit from studying art and learning to look at it, since this largely resembles what we do when we are testing: thoughtfully looking at software. The tools used by art critics can also be a valuable addition to the tester’s toolbox: demystification, deconstruction, binary opposites, the notions of connotation and denotation – they can all be applied to testing, enabling testers to become software critics. Testers can learn from artists too, more specifically from the way they look at the world. Different art movements look at the same things in very different ways. What if testers emulated those different ways of looking at the world and transposed them to software?

My main heuristic throughout this all is to follow my energy – I go wherever my curiosity and interests lead me. And this means that I am currently balancing two main themes in my head for the coming year: skepticism and visual thinking.

a1qa: Skepticism and visual thinking – what testing angles can be found there?

Zeger Van Hese: I study subjects in order to draw testing lessons from them, and in the case of skepticism it has been a fascinating journey so far: philosophy, religion, science and pseudo-science, critical thinking… oh, the places I’ve seen! Sometimes I feel like a kid in a toy store. I’ve always felt that testers should strive to be professional skeptics (as James Bach puts it, skepticism is not about rejecting belief, but about rejecting certainty), keeping a wary eye and a skeptical mind. It’s natural that our clients want to become more confident in the system under test, but we should make clear to them that at no point can we promise absolute certainty. Since we cannot really prove that the software *works*, perhaps we should focus on the ways in which it fails or might fail. We should be the ones still doubting when everyone else sees no problems: “What am I missing here? What am I not seeing?”. I’d much rather work in the questioning and information business than in the confirmation business.

The visual thinking angle on testing is still very much a work in progress. It sparked my interest after I became aware of being a visual/spatial learner. I’ve always been a doodler. When my attention wanders, I start drawing: mostly doodles, random scribbles, sometimes little sketches – you should see my notes after yet another 2-hour meeting. Sure, it is a great way for my brain to stay engaged, to avoid dozing off. But I think this kind of behaviour reveals something more fundamental: drawing is my preferred way of thinking – it is “thinking on paper”. I need to visualize things in order to understand or think deeply about them, and that’s what doodling is: deep thinking in disguise.

Now I realize that I can put doodling or drawing to good professional use. There are a couple of ways in which testing can benefit from visualization. Firstly, it is a simple and powerful tool for innovating and solving sticky problems. When drawing, you also tap into all four learning modalities at the same time (visual, auditory, kinesthetic, and tactile). I’d say that anything that helps me learn and helps me think better is a good addition to my tester toolbox.

Visualization also plays an important role in testing with regard to modelling, which lies at the heart of testing. We use models all the time: requirements documents, flow charts, state transition diagrams – these are all tangible models of some aspect of the software. During testing, we also construct mental models of the software, and we use them to guide our testing. Mental models are by definition volatile and intangible, so materializing them – capturing and visualizing them in drawings – can make them a good basis for discussion, investigation and understanding.
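A materialized model need not even be a drawing. As a minimal sketch of the idea, a state transition diagram can be captured as a plain data structure that testers can question and probe. The login flow, its states and its events below are purely illustrative assumptions, not taken from any real system:

```python
# Hypothetical state transition model of a simple login flow.
# Keys are (current state, event); values are the resulting state.
transitions = {
    ("logged_out", "login_ok"): "logged_in",
    ("logged_out", "login_fail"): "lockout_check",
    ("lockout_check", "attempts_left"): "logged_out",
    ("lockout_check", "no_attempts_left"): "locked",
    ("logged_in", "logout"): "logged_out",
}

def walk(start, events):
    """Replay a sequence of events and return the states visited.

    An event the model does not allow raises KeyError -- exactly the
    kind of gap in the model a tester would want to investigate."""
    state, visited = start, [start]
    for event in events:
        state = transitions[(state, event)]
        visited.append(state)
    return visited

def test_ideas():
    """Enumerate every modelled (state, event) pair as a one-step test idea."""
    return sorted(f"in '{s}', trigger '{e}'" for (s, e) in transitions)
```

Once the model is tangible, it invites exactly the discussion described above: walking it exposes which paths it covers, and enumerating its pairs makes visible which combinations it is silent about.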

Over the last year, I have also been trying my hand at sketchnoting as a way of capturing talks, ideas and presentations. Yet another way of visualizing spoken or written content.

Read the second part of the interview here
