Systematic vs. exploratory testing -- AKA -- Spock vs. Anakin Skywalker

A number of years back, I had the privilege of attending a training course based on “The Methodology of Systematic Testing”. The class was taught by the man who literally wrote the book on systematic testing, Ret. Col. Rick D. Craig. I was so impressed with the course that I read the book over a weekend. I also immediately started to implement the processes and strategy of systematic testing at the bank where I was working at the time.

 

There was only one major stumbling block in my way: I was not managing any testing groups at the time. Some people would look at that as a major roadblock; I chose to look at it as an opportunity to exercise the tried-and-true squeaky-wheel mentality. If my memory serves me correctly, my manager said I was being obnoxiously righteous and that his emotional stability was borderline thanks to my repetitive persistence. After looking all those words up in the dictionary, I decided he was on board with my ideas.

 

Live long and prosper—thinking like Spock

 

After the first year, we were well on our way to software testing nirvana. During this time the project management group decided that each line of business could develop or adopt its own software development lifecycle and project management methodologies. I quickly found out that while systematic testing is a great way to identify potentially high-risk software and applications and mitigate that risk in the SDLC process, it wasn’t the most forgiving when it came to integrating with multiple project methodologies and processes.

 

I came to the conclusion that, boiled down, systematic testing simply means that a team spends so much time on prep work that nothing can surprise them during the actual test execution. The whole process is based on what I call Spock-isms, because when describing systematic testing one must use words like methodical, logical, and disciplined throughout the testing process. A geek like me can’t help but think of Dr. McCoy’s green-blooded, pointy-eared nemesis from the Star Trek series. Oh, and the Vulcan death grip: how handy would that be during an argument?

 

If I were going into outer space, I would hope the people building and testing my mode of transportation had used some form of the systematic testing methodology to ensure my safe return to Mother Earth. I would want them to have logically thought of every possible cause of my demise, and then done everything in their power to prevent it.

 

Spock’s secret weakness

 

But just as when working with Spock, there is some inflexibility that can’t be ignored or skirted. This inflexibility prevents testers from cutting steps and simply going with their instincts and experience with the application. Instincts can’t be calculated, and relying on them could lead to chaos and expose the organization and team to unnecessary risk.

 

May the force be with you—feeling like Anakin

 

After years of working with testers, I have come to the conclusion that they are not only a very destructive lot, they also tend to be relentless when it comes to exposing issues; these testers are like a dog with a bone. Unfortunately this scenario typically plays out during the execution phase of testing, which in most cases is way too late.

 

That brings me to my next epiphany: test early and often. If you can test early and often throughout the application development lifecycle, not only will more defects be found, it will also be easier and cheaper to fix them, because defects caught earlier in the process cost far less to resolve.
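To make the “test early and often” idea concrete, here is a minimal sketch of a unit test that runs at commit time, long before integration or system testing begins. The function and test are hypothetical examples of my own, not anything from a real project:

```python
def apply_discount(price, percent):
    """Return the price after applying a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

def test_apply_discount():
    # Catching an off-by-a-factor bug here costs minutes;
    # catching it in production costs far more.
    assert apply_discount(100.0, 25) == 75.0
    assert apply_discount(19.99, 0) == 19.99

if __name__ == "__main__":
    test_apply_discount()
    print("early tests passed")
```

The point is not the arithmetic; it’s that this check runs every time the code changes, so the defect surfaces within minutes of being introduced rather than weeks later in a formal test execution phase.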

 

If you’re constantly executing tests, you can’t spend your time properly documenting test cases or completing risk analysis, unless, of course, you can move objects with your mind.

While at first I was very resistant to the concept of exploratory testing, I did see definite added value in more testing and less prep work. Exploratory testing doesn’t bog testers down by having them constantly document possible scenarios and use cases, of which only a few are ever converted into the regression test set. But without structure comes chaos, and one thing most testers don’t like is confusion. Confusion leads to the dark side of the force.

 

Using the tools of the trade—the use of the lightsaber

 

Exploratory testing, in a nutshell, takes advantage of a manual tester’s experience with a specific application and allows them some autonomy in deciding how and when to test new functionality. It also gives them some independence when it comes to regression testing, also known as shotgun testing: you cover a lot more area, but the coverage is scattered. This method leaves bigger gaps in your testing of an application, and sometimes those gaps are too big a risk to the overall performance of the application.

 

Once you find an issue or defect, can you repeat those same exact undocumented steps time and again? A lot of times a tester will find an issue during exploratory testing, and to correctly document it he or she must duplicate that same issue two or three times before handing it over for resolution, which can be frustrating and monotonous if the issue is intermittent.

 

Before I get into too much hot water with advocates of the exploratory testing methodology: the theory is that a person should document test steps as they go. The reality is that documenting is very time-consuming, and documenting your steps is only practical if a defect is found or it has been decided that the test case is a candidate for regression testing.
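One lightweight compromise is to keep cheap session notes as you explore, and only export them when a defect is found or the session is promoted to a regression candidate. The sketch below is purely illustrative; the class and its methods are my own invention, not part of any testing tool:

```python
from datetime import datetime

class ExploratorySession:
    """Minimal session notebook for exploratory testing."""

    def __init__(self, charter):
        self.charter = charter   # the mission for this session
        self.steps = []          # timestamped actions, recorded as we go

    def step(self, description):
        """Record one action taken during the session."""
        self.steps.append(f"{datetime.now():%H:%M:%S}  {description}")

    def export(self):
        """Produce a reproducible step list for a defect report."""
        return "\n".join([f"Charter: {self.charter}", *self.steps])

session = ExploratorySession("Probe the transfer screen for rounding errors")
session.step("Entered transfer amount of 0.015")
session.step("Observed balance decreased by 0.02 instead of 0.01")
print(session.export())
```

Recording a one-line note per action costs seconds, but it means that when an intermittent issue does appear, you have a step list to replay instead of trying to reconstruct the session from memory.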

 

Depending on a tester’s experience level with the application and the amount of training they have in the methods of exploratory testing, the approach can really pay off. But as with Anakin, too much emphasis on one person’s ability to foretell an application’s behavior in production places a lot of responsibility on a single person’s shoulders and can lead a project team into a false sense of security. This scenario plays out quite a bit when using lean development methodologies: for example, you pay a tester $60K a year on a five-million-dollar development project that could generate tens of millions for the company.

 

What role does Agile play?

 

When it comes to exploratory testing on an Agile project, some of my peers and mentors believe that the changes to the application are so small and incremental that the risk of allowing a defect into production is somewhat watered down by the knowledge that you’re basically in a continuous testing scenario. The fact is that anything you missed last Sprint can be addressed in the next iteration or Sprint. (Just because he killed a couple of Sand People doesn’t mean he won’t kill Padawans, right?)

 

While the above statement does hold water, I can’t help but harp on the idea that a severe issue could go undiscovered or unrealized for several iterations or Sprints, until a catalyst is introduced to the environment. To reiterate: testers typically have more experience with the application than the average business user, giving them more in-depth functional knowledge of it. The problem is that testers sometimes lack the practical or production experience that can only be gained over time working hand-in-hand with the business users. (Lack of experience, human nature, and plain old boredom can lead to the dark side of application testing, and of course that’s where the darkest defects linger.)

 

Picking the coolest ship—Millennium Falcon or the USS Enterprise?

 

To answer the original question: when faced with a Spock (systematic approach to testing) versus Anakin (exploratory approach to testing) situation, without the benefit of tools and setting aside the constraints sometimes imposed by different development strategies, I would always choose the methodical, logical, systematic type of testing. But I am not done with my train of thought here. With the development of tools like Sprinter and the introduction of lean development strategies, exploratory testing seems very appealing at times.

 

There is no one de facto statement I can use that clearly defines one strategy over another. For example, I believe that if systematic testing is structured correctly, it can work seamlessly in an Agile environment. And if you have a methodically driven team of testers, the use of exploratory testing on a critical or large development effort (taking into account the group’s experience level) could speed up your testing endeavor and still lower your overall risk to the current initiative. But in most cases there are never just two criteria on which you can base your decision.

 

When your team is selecting a testing strategy, your group must weigh a myriad of information to make the best choice. But choose wisely, young Padawan, and remember: Vulcans never bluff.

 

  

If you find yourself in this dilemma, please feel free to reach out and comment on this blog with your criteria sets. Please include your preferred testing methodology, and we can work with you to mitigate some of those issues so you can make a more informed decision.

 

Note to the reader: systematic and exploratory testing are not the only strategies/methodologies currently being used in the software testing industry, but they are two of the most common.


 

Thanks

@wh4tsup_Doc


Comments
MichaelDocDeady | ‎01-14-2013 09:50 AM

Frank  commented on Linkedin QTP group

 

Frank Pergolizzi • My job is developing and maintaining automation frameworks used by groups that perform both exploratory and systematic testing, and I find there are design challenges to accommodate both with the same code, particularly in the areas of error handling/reporting, output parameters vs. function return codes, and message logging.

What someone implementing a relatively fixed system test script may regard as an obvious error condition they expect the framework to handle and/or report may be something an exploratory tester couldn’t care less about and would prefer be ignored by the framework, or bubbled up for the script writer to handle or ignore at their choosing.

I’ve often told people the most difficult part of test automation is defining what’s meant by “pass” or “fail” for every context and every level. I still routinely run into people (mostly management) who argue that the purpose of running automated tests is for them to pass, just faster than the same tests would pass using manual testers.

That’s like saying the purpose of a tire pressure gauge is for it to report that your tires are at the correct pressure.
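[Editor’s note: the design tension Frank describes can be sketched in a few lines. Everything below is a hypothetical illustration, not code from any real framework: the same lookup can fail hard for a systematic script or hand the condition back to an exploratory tester to interpret.]

```python
class ElementNotFound(Exception):
    """Raised when a required page element is missing."""
    pass

def find_element(page, name, strict=True):
    """Look up an element; what 'missing' means depends on context.

    strict=True  -> raise, so a scripted system test fails immediately.
    strict=False -> return None, so an exploratory script decides for
                    itself whether absence is a failure or just a fact.
    """
    element = page.get(name)
    if element is None and strict:
        raise ElementNotFound(f"'{name}' not present on page")
    return element

page = {"submit_button": "<button>"}

# Exploratory use: absence is just information, not a failure.
assert find_element(page, "beta_banner", strict=False) is None

# Systematic use: absence is an unambiguous FAIL.
try:
    find_element(page, "beta_banner")
except ElementNotFound:
    print("FAIL: required element missing")
```

The framework doesn’t decide what “pass” means; it exposes the condition and lets the calling script, systematic or exploratory, make that call, which is exactly the context-dependence Frank is pointing at.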

MichaelDocDeady | ‎01-14-2013 09:51 AM

I couldn’t agree with you more. In the English language there are few words more black and white than pass or fail; they rank right up there with yes or no, on or off, or, in the ultimate language of mathematics, binary 1 or 0. That we spend so much time attempting to redefine these two simple words blows my mind. Take for example negative testing, or something as simple as general (ambiguous) requirements. Add on top of that the simple logic engine of automation testing, and then ask an Automator to build in some type of business intelligence (oxymoron). Recently I worked with a group that was tasked with the simple idea of creating an auto defect generator across all lines of business within QTP and ALM; that single statement will probably consume six months of an experienced Automator’s life. In fact, you’ve given me a great idea for a multi-part blog series on defining the words pass or fail, if it’s all right with you.

I found your comments very interesting, and I’m sure more people would like to read them in their original context. I was wondering if I could ask a favor: would you post them on the HP community, or could I reprint your comment (giving you credit) along with my response on the HP community?

Thanks again
@wh4tsup_doc

About the Author
Michael Deady is a Pr. Consultant & Solution Architect for HP Professional Service and HP's ALM Evangelist for IT Experts Community. He spec...

