Over the past couple of years, I have had the pleasure of helping develop education initiatives and material for a program called SummerQAmp. SummerQAmp is designed to help students between the ages of 16 and 24 learn about software testing as a career, train to become eligible for internships, and then intern with companies that would hire them as software testers. The program itself is successful and is expanding into more markets each year. Still, a fundamental question remains: how can we be sure that we're setting these interns up for success? Put more simply, how can we, as a community, help guide the next generation of software testers?
It was at the conclusion of the Software Testing Professionals Conference (STP-CON), held in San Diego in April, that I had the chance to pose these questions (and others) to a number of the participants, speakers, and organizing committee. James Pulley, noted performance tester and co-host of the PerfBytes podcast, sat down with me, and we considered a number of ideas and aspects that are currently missing from software testing "curriculum," as it were. The fact that very few schools offer software testing as a viable degree program, or even as a certificate, is telling. Let's assume, for the sake of argument, that there really isn't going to be a Bachelor's degree in Software Testing (we can save the question of whether there should be one for another time). How, then, can we ensure that the needed skills are taught, and that we match the right people to the right companies?
James and I felt it might be helpful to make a comparison to another profession, cosmetology, or "the art of cutting and styling hair" (ironic considering the two people having this conversation are 100% bald). The point is that cosmetology students all go through similar steps. Before they ever set foot on the floor of a styling shop, they are versed in basic topics, one of which is chemistry. Why chemistry?
Because many of the preparations used in styling hair rely on chemical reactions, and understanding those reactions can spell the difference between success and failure of a procedure (not to mention the health and safety of the stylists themselves, their clients, and their co-workers). Software testers also have a fundamental base of knowledge that should be just as concrete as chemistry is for cosmetology students. Before learning anything else, every aspiring tester should become well versed in the Scientific Method. That one area of emphasis could have a huge impact on the future success of any software tester.
What is the Scientific Method?
The Scientific Method has been with us a long time; variations of it have existed since the dawn of civilization. The Sumerian, Egyptian, Babylonian, Chinese, Indus, and Greek civilizations all used some form of it. Each society emphasized different aspects and elements based on its own culture, but it was the ancient Greeks who (through their writing and the conquests of Alexander the Great) had the greatest effect in codifying it and spreading it around the world.
Science Buddies does a good job in making this a succinct and quick-to-understand topic. They state the basic steps as a way to ask and answer scientific questions by making observations and doing experiments. We do this by:
- Asking a specific question
- Performing background research on the question we would like to ask
- Constructing a hypothesis based on the question and our research
- Testing out our hypothesis by performing experiments
- Analyzing the data we accumulate from our tests and drawing a conclusion
- Communicating (and being ready to defend) our results
Additionally, it is important that we construct what are considered "fair tests." This means limiting our tests to changing just one variable (when possible) while keeping all other aspects of the test the same.
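The "fair test" idea can be sketched in a few lines of code. This is a hypothetical illustration, not a real SharePoint API; `can_view_file` and the role names are stand-ins I've made up for the example:

```python
# A minimal sketch of a "fair test": vary exactly one variable
# (the user's role) while holding everything else constant.
# can_view_file and the role names are hypothetical stand-ins.

def can_view_file(role):
    """Stand-in permission check: assume only the file's owner
    and an administrator have been granted view access."""
    return role in {"owner", "admin"}

# One variable changes per run: the role. The file, its location,
# and the server configuration are held fixed across all runs.
expected = {"owner": True, "admin": True, "guest": False}

for role, should_see in expected.items():
    assert can_view_file(role) == should_see, f"unexpected result for {role}"
```

Because only the role changes between runs, any difference in outcome can be attributed to that one variable.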
To those of us who are career software testers, this makes a lot of sense. It's what we do every day, whether we are conscious of it or not. As I thought back through the various companies I have worked with, I realized that many testers, especially new ones, do not understand this. It's only later (sometimes years later) that software testers see the connection between what they do each day and the scientific method. Imagine those coming into the field for the first time clearly understanding how relevant the scientific method is to the work they do.
Let's take each of them in order:
Asking a specific question
Each question we ask needs to be able to answer one of the basic directives (How, What, When, Who, Which, Why, or Where).
Example: How long will it take to verify that a file exists on a SharePoint server? Who will be able to see those files once I upload them? What server will hold the files? Why am I trying to ensure that only I or an administrator can see the files I upload? Each of these questions is, in and of itself, an individual test we can perform. By asking the questions, we can determine what matters and at what level we should be testing. Do we want to consider these at the atomic, or unit, level, or do we want to look at these as integration-level tests? Additionally, can these tests be measured? If it's a performance test, can we determine how long a transaction takes?
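Making a question measurable can be as simple as putting a timer around one operation. In this sketch, `check_file_exists` is a hypothetical stand-in for a real call to the server:

```python
# A quick sketch of making a "how long" question measurable.
# check_file_exists is a hypothetical stand-in for a real
# round trip to a SharePoint server.
import time

def check_file_exists():
    time.sleep(0.01)  # simulate a server round trip
    return True

start = time.perf_counter()
found = check_file_exists()
elapsed = time.perf_counter() - start
print(f"file found: {found}, lookup took {elapsed:.3f}s")
```

With a number in hand, the question "how long will it take?" has a concrete, comparable answer instead of an impression.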
Performing background research on the question we would like to ask
Example: What does it take to configure SharePoint security? If I want to ensure that only I or the administrator can see whether a file resides in a particular location, what do I need to know about SharePoint configuration and setup to make sure it is set correctly?
Constructing a hypothesis based on the question and our research
Example: Since I have gone to the SharePoint server and set the viewable permissions to just myself and the SharePoint Administrator, my hypothesis is that the file will be visible when I log in as myself or as the administrator, but not when I log in as a guest user or any other user without those permissions.
Testing out our hypothesis by performing experiments
Example: Log in as the administrator; log out and log in as the designated user with permissions set; then log out and log back in as a guest user or a user that does not have (or should not have) permission to view the files.
Analyzing the data we accumulate from our tests and drawing a conclusion
Example: Did we see the files in the data store on the SharePoint server when logged in as a user with permissions granted? Were they hidden from us when we did not have the proper permissions?
Communicate (and be ready to defend) our results
Example: In a story card, Kanban card, or tracker attachment, describe what we did, what we saw, and why we feel our experiment confirms that only those users configured to see files uploaded to a particular space can see them, while all other users cannot.
Additionally, consider other experiments you can perform where you change the value of the configuration (one variable at a time is best) so that you can also confirm negative cases (can the administrator grant permissions to the blocked user? Can the original user be blocked?).
Granted, this is a fairly simplistic example, but the idea is to show how each area of our testing maps very clearly to the scientific method.
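The walkthrough above can be condensed into a sketch where each function maps to one step of the method. The permission model and every name here are assumptions made for illustration, not a real SharePoint client:

```python
# Hypothetical sketch mapping the scientific method walkthrough
# to code. The function names and permission model are made up
# for illustration.

PERMITTED = {"owner", "admin"}  # hypothesis: only these roles see the file

def file_visible_to(role):
    # Stand-in: a real experiment would log in as each user
    # and query the server.
    return role in PERMITTED

def run_experiment():
    """Experiment: check visibility as each user type, recording what we saw."""
    return {role: file_visible_to(role)
            for role in ("owner", "admin", "guest", "other")}

def analyze(results):
    """Analysis: the hypothesis holds only if permitted users see the
    file and all other users do not."""
    return all(seen == (role in PERMITTED) for role, seen in results.items())

results = run_experiment()
conclusion = analyze(results)
# Communicate: attach the raw results and the conclusion to the story card,
# ready to be defended.
```

The point of the sketch is the separation of steps: the question and hypothesis live in `PERMITTED`, the experiment in `run_experiment`, and the analysis in `analyze`, which mirrors how the method partitions the work.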
Another benefit of having new testers learn about the scientific method is that they also learn the value of "abstraction". As shown above, the steps taken for each aspect can be separated from the actual product they are testing. If they are not able to abstract the fundamental processes, then they will have a much more difficult time working on different aspects of a software project (functional testing, integration testing, performance testing, etc.).
They will also find themselves challenged when they start testing in different domains. Will the skills they learn testing video games translate to biotech or financial software? The specifics, very likely not. The fundamental processes contained in the scientific method? Absolutely!