The debate about when and how to apply software tools to our jobs continues to rage on. In some cases, the conversation can be very black-and-white, with some voices vehemently objecting to tools, believing them to be a crutch that stops us from thinking on our own. While that may be true of some people and some tools, my feeling is that it is not true most of the time. We invent tools (ah, there we go, thinking and creating) in order to make a task easier or to accomplish something that can't be done without a tool. In fact, designing and using tools is often cited as a sign of higher intelligence, so I'm not even sure how tool use came to be regarded in some disciplines as a lessening of intelligence.
Do you recall the old argument about students bringing calculators to class? It was the end of math as we knew it! How will students ever learn mathematical concepts if they're allowed to bring calculators to class?! Of course, while people still debate it, it's mostly been put to bed – trust me, I know, having spent a fortune on various types of calculators over the years to arm my children through high school and college. At some point, our education system realized that there was a way to incorporate the tool into learning that makes the experience richer.
Various ways people learn
As humans, we tend to make assumptions about others that are only true for ourselves. One thing educators know but managers need to internalize as well is that not everybody learns the same way. Many educators boil learning styles down to three main types:
- Auditory: people who learn best by listening to an explanation
- Visual: people who learn best by digesting written content or visual elements like charts and graphs
- Kinesthetic: people who learn best by doing
My learning style is a blend of auditory and kinesthetic: I like to have something explained verbally, then be given a chance to practice it myself in a hands-on setting. My older son learns the same way as I do, so I thought I understood everything about teaching kids because it was easy to teach him to read and count. Then came my second child, an impossible code to crack when it came to learning. Obviously smart, he just didn’t learn the same way I did, which meant I was at a loss when it came to teaching him. He is very much an autodidact who can pick up just about anything but is most comfortable learning in his own way at his own pace, by himself. Autodidactic people are often misunderstood as arrogant in the professional world. When someone expresses a preference to learn online or by “playing around” with a tool, it’s easy for colleagues to hear this as a rejection of their own experience and teaching, which couldn’t be farther from the truth.
My point here is that people need to learn in ways that work best for them, and they need to believe that they can develop a skill using whatever techniques work for them (unless proven otherwise). We have a tendency within our professions to claim that we know the best way for employees to learn a new skill; this is especially true in the software testing world because expectations are so varied for each company, even from one project to another.
Tools and their place
I’m the first to admit that we software people are a quirky bunch. Like any population, we have a variety of personalities and opinions and, like any population, we have our disagreements. The value of tools in our industry and the best time and place to use them is one such disagreement. There’s an underlying irony here, of course, in that software people build the tools for software people to use, so we’re a bit like chefs who argue about whether people should eat out.
Part of the problem with the tools debate is that it quickly devolves into polarizing arguments that divide the community rather than unify it. Unfortunately, there really is a school of thought that suggests all software testing can be automated, but it is held by a small minority of people. The fear that testers will be replaced by tools fuels a lot of irrational debate that, in my opinion, slows our evolution as an industry.
What’s curious is that developers use tools all the time – they happily install their IDEs and then use plug-ins to those IDEs for all kinds of things like automated builds, code reviews, and static analysis. This is purely anecdotal, but, in my experience working with and managing developers, they are usually pretty happy to try new tools and pass them around like moonshine around a campfire. In fact, the more tools they find that the group likes, the smarter they look. When they automate a tedious or repetitive part of their job, their colleagues applaud and do the same.
Not so with testers. Advocates of testing tools invariably fail to persuade their colleagues to try new tools. There seems to be a very different viewpoint in the testing community, one that worries about having a tester's intelligence and experience replaced by an automation tool, even though this is primarily a myth.
Tools have their place and often, bringing a tool into the mix frees up a tester to do more than if they are forced to do all that data set-up and repetitive work manually. It’s the old calculator argument all over again. Just because you use a tool to automate distributed load testing doesn’t mean you don’t understand distributed load testing. In fact, you can’t effectively use the tool if you don’t have that understanding. And using something that is designed for this very purpose is often a much faster alternative than scripting and configuring on your own.
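To make the distinction concrete, here is roughly what "scripting and configuring on your own" looks like: a minimal, hypothetical load-test sketch in Python, where `fake_request` is a stand-in for a real HTTP call. A purpose-built tool adds everything this sketch leaves out – distributed workers, ramp-up schedules, reporting – which is exactly why it's usually the faster path.

```python
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

def fake_request() -> float:
    """Hypothetical stand-in for a real HTTP call; returns latency in seconds."""
    start = time.perf_counter()
    time.sleep(0.01)  # simulate ~10 ms of server work
    return time.perf_counter() - start

def run_load_test(workers: int, requests: int) -> dict:
    """Fire `requests` calls across `workers` threads and summarize latency."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        latencies = list(pool.map(lambda _: fake_request(), range(requests)))
    latencies.sort()
    return {
        "count": len(latencies),
        "mean_s": statistics.mean(latencies),
        "p95_s": latencies[int(0.95 * len(latencies)) - 1],
    }

result = run_load_test(workers=5, requests=50)
```

Even this toy version forces you to think about concurrency, latency distributions, and percentiles – the very concepts the tool's critics worry you'll never learn.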
How tools can trigger understanding
Wait, what does this have to do with learning? Well, the impetus for this article was the claim that you cannot learn a concept by reading online and experimenting with a tool. The claim was further predicated on the idea that using a tool is a lesser means of accomplishing the chosen task than finding a way to perform the task without one. And I beg to differ, especially when it comes to certain types of software testing, like load and performance testing.
Does learning a tool guarantee understanding of the concept? Does it replace experience? No, on both counts. However, we can't assume that because someone started by learning the tool and reading online material, they have no understanding of the discipline. You can certainly read about performance testing, download a free tool or two (or three), experiment on your own and gain enough understanding to perform some basic load testing. As with any discipline, the more you apply your learning, the better you become. Hopefully, in this connected world, you share your learning online with others so they too can pick up the baton and run with it.
When I was new to the concept of mind-mapping, the first thing I did after hearing about it was download a free tool so I could play. As I explored and experimented with the tool, the concepts behind it became more tangible in my mind and I found that I could do some off-roading to explore various ways of using it that weren't so obvious at first. For me, the tool itself was the key to learning what mind mapping was and figuring out ways to use it in my work. Oddly enough, I now find myself sitting in brainstorm sessions around a whiteboard with colleagues and realizing that the whiteboard scribbles are mind maps… And the circle is joined, as my Swedish friend says.
So, what’s my point? My point is that sometimes people can learn from the tool. My autodidactic son picked up a guitar (tool) and logged into YouTube (tool) one day – years later and without a single lesson, he can play classical guitar like a pro. I would suggest that this example is seen many times over in professional organizations, especially within the software industry where learning and trying new tools is pretty much a daily activity.