Jim Hazen - Testing Is Irrelevant, Shipping is Futile!

November 30, 2014 - Three years and a lot of things going on

Well, sorry folks, it's been three years since my last post here.  I've been busy with life and work as usual.  I've done a few more conferences and other talks about automation.  Just trying to pay the bills.


For those of you who care, I've posted all my latest presentations over at my LinkedIn home page (https://www.linkedin.com/pub/jim-hazen/0/b48/760).  Check them out; I hope they help you in some way.


Hopefully it won't be another 3 years before I post.  I've got some ideas on posts and presentations so I'll see what I come up with in 2015.


Jim


Comments ( 0 ) :: Permanent Link

October 13, 2011 - And so it goes

Over the last week or so the computer industry has lost two giants/innovators.  First being Steve Jobs.  Nuff said on that one.  Second is Dennis Ritchie, co-creator of the C programming language and UNIX.  I think the following comment by a reader of an article about Dennis Ritchie is an appropriate RIP for him:

#include <stdio.h>
#include <stdlib.h>

int main(int argc, char** argv)
{
    ...
    fprintf(stdout, "Goodbye World!\n");
    exit(0);
}
 
For those of you who cut your professional teeth in programming with C, you know this is a play on the famous "Hello World!". 
 
I learned C my last year of college in 1986, and it helped me get my first job in programming/testing at a commercial software company.  I've used the language off and on over the years for various things (like LoadRunner web scripting).  I have an original copy of "The C Programming Language" by Kernighan & Ritchie buried somewhere in all my books. 
 
As a tribute to both Steve Jobs and Dennis Ritchie, I say thank you for your contributions, and for providing me with a career and livelihood for the last 23 years.  I tip my glass to you. 
 
RIP.
Comments ( 0 ) :: Permanent Link

March 19, 2011 - My life on the D-List (speakers and known consultants)

At times I feel like Kathy Griffin, living my professional life on the D-list as a speaker and "industry" consultant.  Most of it is my own self-imposed position on the fringe.  I am outspoken and opinionated, but try to only be so when it is really needed. 

I've seen too many "experts" with their blog sites spouting and ranting on about the same things as other people.  There is a lot of rehash of topics and ideas in our industry right now, but every once in a while there is something new.  It just seems that Testing has kind of stagnated again in its progress as a profession; at least that is my opinion.  The same crop of "experts" has been at it for the last 10 years or so, and some of their messages are getting old, too.

So what does this have to do with my life on the D-list?  Not much except a feeling I've had lately about wanting to get more involved in the industry, and I'm doing it to a small degree.  I'm speaking this next week at STPCon 2011 in Nashville, so if you are going please stop by and say hello.  Maybe even listen to my presentation, and be good enough to give me feedback so I can make it better.

I'm thinking about doing some more writing and publishing, but that takes time.  And right now I'm looking for my next gig; I still have bills to pay.  Trust me, I have ideas for a couple of articles, one of which may be a bit of a hand grenade thrown into the automation debate. 

So... I'll see you around.  And who knows by this time next year I might have moved up, to the C-list that is.

Comments ( 1 ) :: Permanent Link

February 25, 2010 - Testers are like Fighter Pilots

If you think about it, our "mission" is to go out and find the enemy (defects) and shoot them down (or at least be the air tactical guys who guide the fighter pilots to them).  We do this by various means and record our victories with symbols (defect reports) on our plane's fuselage.

Now I don't know about you, but my plane is starting to look like a pile of flying Post-it notes.  How about you, how many "kills" do you have?  How many times over are you an Ace?

Comments ( 0 ) :: Permanent Link

January 29, 2010 - Traditional vs. Agile methods

Because I am a smartaleck... In tribute to George Carlin (think Football vs. Baseball)

Traditional vs. Agile

----------------------------

Stern Manly voice: In Traditional methods we drive to production.

Not so manly voice: In Agile methods we iterate until it's good enough. 

 

Stern Manly voice: In Traditional methods we push a system out the door.

Not so manly voice: In Agile methods we deliver a demonstrable piece.

 

Stern Manly voice: In traditional we have firm and well-defined requirements in documents in a repository.

Not so manly voice: In Agile we have loosely written user stories on post cards on a wall.

 

Stern Manly voice: In traditional the Project Manager is the quarterback of the group and guides the team to the release date.

Not so manly voice: In Agile we are all equal on the team and run together through the sprint.

 

Stern Manly voice: In traditional we have specifically trained people who wear a specific hat.

Not so manly voice: In Agile we have cross-trained people who can wear multiple different colored hats.

 

Stern Manly voice: In traditional we meet on a predefined weekly schedule to discuss the progress of the work.

Not so manly voice: In Agile we meet every day for 15 minutes to talk about what's blocking us.

 

Stern Manly voice: In traditional the product is the most important deliverable in the project regardless of individual contribution.

Not so manly voice: In Agile the people are the most important thing because we want them to feel good about their product.

 

Comments ( 0 ) :: Permanent Link

January 4, 2010 - New idea for a new year

A friend of mine just told me about an interesting philosophical axiom.  It is a translation from Russian (he is a Russian expat, now a U.S. citizen) that he relayed to me, and I think it fitting for a new year.

"It doesn't matter which hand a monkey holds a grenade with." 

Think about it a little.

Comments ( 0 ) :: Permanent Link

October 7, 2009 - Automation is like an Ogre is like an Onion. It's got layers!

A quick post here about some things I've read lately about test automation.  Specifically the test execution methods and the different levels in software that it can be implemented against.

 

With Agile methods there is a heavy emphasis on TDD (Test Driven Development), using Unit Testing and the harnesses that support it.  This is good, and about time for development to do some testing and automation of their own. 

 

Next we get into using these harnesses to create Integration-level tests and incorporating them into the build process as part of Continuous Integration (CI).  Again, a good idea whose time has long been due.

 

Now we get to the higher-level tools, harnesses and methods for doing the API and "below the surface" tests (as some have called getting under the 'skin').  This adds another layer of testing we can do on the system and helps to improve the efficiency and effectiveness of the testing via automation.  I'm all for that.

 

Finally we get to the GUI layer (where most of the test automation work is still done today) and all the fun we have with that.
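To make the layers concrete, here is a minimal sketch in Python (the functions and the discount example are hypothetical, purely for illustration, not from any real project) of what tests at the unit layer and the "below the surface" API layer can look like in a harness like unittest:

```python
import unittest

def discount(price, percent):
    """Unit under test: apply a percentage discount to a price."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (100 - percent) / 100.0, 2)

def quote_api(items, percent):
    """API seam: total a cart of (name, price) items with a discount applied."""
    return sum(discount(price, percent) for _name, price in items)

class UnitLayerTest(unittest.TestCase):
    # Lowest layer: exercise a single function in isolation.
    def test_discount(self):
        self.assertEqual(discount(100.0, 25), 75.0)
        with self.assertRaises(ValueError):
            discount(100.0, 150)

class ApiLayerTest(unittest.TestCase):
    # One layer up: drive the API without touching any GUI.
    def test_quote(self):
        cart = [("widget", 10.0), ("gadget", 30.0)]
        self.assertEqual(quote_api(cart, 50), 20.0)

if __name__ == "__main__":
    unittest.main(exit=False)
```

The same two test classes can run standalone or be wired into a CI build, which is exactly the layering point above: the deeper the layer, the cheaper and faster the automated check.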

 

So as you can see, there are different layers of the software that we can peel away to see and test via automation.  Hopefully some of those layers don't have too strong a smell and make your eyes water up, just like when you peel an onion.  And as part of that, you cannot judge the software by its outer layer alone; you will need to do some peeling further down to make sure things really work the way they are supposed to.  This is the complexity issue we deal with in software in general.  Nowadays there is a lot of complexity in our systems.

 

At times this complexity makes us look at our systems as being like a big green Ogre, it just gets ugly.  But if we spend the time and effort to go beyond the surface we find it isn't all that bad.

 

Really... really.

 

I'll follow up another time with more details and findings about the various layers. Right now it is a bit late and I need my beauty sleep.  ;-)

Comments ( 0 ) :: Permanent Link

August 23, 2009 - I'm a Tester, and I'm Okay!

Oh what the hell...

 

I'm a Tester (to the tune of I'm a LumberJack)

I'm a Tester, and I'm okay.
I sleep all night. I test all day.

Project Managers: He's a Tester, and he's okay.
                He sleeps all night and he tests all day.

I crank out tests. I eat my lunch.
I go to the laboratory.
On Wednesdays I go schlepping
And have beers with developers.

Project Managers: He cranks out tests. He eats his lunch.
               He goes to the laboratory.
               On Wednesdays he goes schlepping
               And has beers with the developers.

Chorus : I'm (He's) a Tester, and I'm (he's) okay.
             I (He) sleep(s) all night and I (he) test(s) all day.

I crank out tests. I ship and jump.
I like to impress the users.
I put on salesmen's clothing
And hang around in bars.

Project Managers: He cranks out tests. He ships and jumps.
                He likes to impress the users.
                He puts on salesmen's clothing
                And hangs around in bars?!

Chorus : I'm (He's) a Tester, and I'm (he's) okay.
             I (He) sleep(s) all night and I (he) test(s) all day.

I crank out tests. I wear canvas high tops,
baggies, and say 'chaa'.
I wish I'd been a developer,
Just like my dear Mama (or Papa in later versions)

Project Managers: He cranks out tests. He wears canvas high tops,
                baggies, and says 'chaa'?!

Chorus : I'm (He's) a Tester, and I'm (he's) okay.
             I (He) sleep(s) all night and I (he) test(s) all day.

             Yes, I'm (He's) a Tester, and I'm (he's) ok-a-y.
             I (He) sleep(s) all night and I (he) test(s) all day.

Comments ( 1 ) :: Permanent Link

August 16, 2009 - The meaning of a Tester's life, or how in the hell did I get here

I've been at this thing we call Software Testing for over 22 years now, and at times the more things change the more they stay the same.  In the grand scheme of things you have to ask yourself at times, "what the hell am I doing and how did I get here?"  In my years of experience I have seen some things just not change when it comes to the "profession" of software testing.

First and foremost, we are still the "red-headed bastard step-child" to Development.  Short and sweet: we are still to this day fighting to gain credibility and respect for our work, when in fact we are the other side of the coin from Development.  Then again, for those of you who remember the 45 record, who ever played the B-side song?  Analogy aside, it is this second-class-citizen relationship that prevails to this day.

Second, the attitude of executive management that "anyone can be a tester" and that "testing is not a technical discipline like development".  Software/IT seems to be the only industry that has this mentality.  Go to other industries like Computer Hardware, Electronics (other than computers), Automotive, Aerospace, General Engineering and Manufacturing, and you will see the importance and support of testing by the executives.  They know, and have learned, that in order to have a solid and sellable product it needs to be reliable and usable (tested).  In these industries the Test Engineers are some of the best people around.  They get support (funding) from management and are seen as adding value to the product, not as just a "cost center".  Must be nice, eh?

Third, increased commoditization of the work, meaning Outsourcing, and specifically Offshoring.  I know this is a can of worms, but you get what you pay for.  Twenty years back companies only outsourced what they could not do themselves.  Work was predominantly done in-house, because having it done by an outside group was not cost effective.  As this has started to shift, there have been increases in the outsourcing/offshoring of the testing work.  Recently it has been done to a fault, and predominantly based on labor costs alone.  Some companies are finding out this is a mistake, and that they need a better balance in the equation.  In my opinion, letting the bean counters make these types of decisions for technology is asking for a world of hurt.  And I've seen a lot of pain recently.

Fourth, lack of education and educational resources for University/College students on the topic of Software Testing and Quality Assurance.  When I first started out in Testing I came over from the Development ranks (I was a good developer, just better at testing).  My background in software in college was in programming, and testing was part of my classes only at the level of 'debugging' my code.  There weren't any classes specifically about Software Testing and Quality Assurance.  What I learned early on was either through reading, OJT, or week-long seminar classes (and even these were few and far between).  I got my "testing" education through the "School of Hard Knocks".  But recently more formal education has been popping up, either through colleges or professional agencies, and it has helped to improve the profession.

Fifth, the majority of people (inside and outside of the ranks) are still calling software testing "Quality Assurance".  If all you are doing is testing the software, then that is what you are: a "Tester".  Software Quality Assurance is so much more than testing alone.  Testing is a component piece, and a key one at that, in the overall scheme of Quality Assurance.  Testing is more akin to Quality Control.  It's true, don't fight it.  Other disciplines within Quality Assurance are: Audit, Inspection & Review, Software Configuration Management, Metrics, Process Improvement, Risk Management, Project Management (this really should fall under QA), and Standards & Procedures, to name a few.  And we, testers, are the worst at this misnaming/misrepresentation of our work.  Why?  Because 'QA' seems to have a better connotation than 'Test', and thus we would have a higher standing in the company.  And also because a lot of people outside of the testing ranks think that by just doing testing you are going to ensure 'quality' is going into the product.  Guess they need to better understand the saying "you cannot test quality into a product", and so do we.

Now how does all this relate to my subject?  Because I have been through all of this during my career, I know that when I come onto a project there is quite a bit of education and enlightening I have to do with other people so that later on I can actually do my job effectively. 

As a Tester my job is to create and execute tests so as to gather information on the reliability and stability of the product.  Is this thing ready for prime time?  It is also to make sure that due diligence is done by other groups.  To make sure they do their part of the 'quality' process.  My job is to make sure the right people get this information so they can make an informed decision on whether or not to release the product for general use. 

Because I take pride in my work and give a damn.  That is how I got here, and why I stay.

Comments ( 2 ) :: Permanent Link

August 16, 2009 - Sensory Overload - How many books on Testing do you need?

Twenty-plus years ago, when I started working in this profession, there were maybe only a couple of books on Software Testing.  Even 12 years ago there were only a few more in the mix, and up until about 2000 there were only about 20+ books on the market. 

Nowadays though it seems everyone and their brother has written a book of some type on Software Testing.  This is good in the fact that we have different subjects within testing now being covered.  Topics range from basic knowledge and techniques to test automation (functional, unit and performance) to test management to specific tools or applications (like SAP), and other points in between. 

Performing an Amazon search I came up with a list of about 250+ books, of which (after trudging through each listing page) about 200 or a little more (I didn't do an exact count, give me a break) were books specifically on software testing or associated subjects (Unit, Debugging, etc.).  Looking through the list I noticed the majority of books were published in the last five to six years.  Wow!

Now the bad news (I'm a Tester for crap sake; I'm not doing my job if I don't bring this up): the majority of the books cover the area of general processes and techniques for Software Testing.  Now how many times can you rehash the same subject trying to build a better mousetrap?!  This leads me into my topic: how many books do we need?  This is worrisome because now we have swung to the other end of the pendulum and run the risk of confusion.  I mean, which books are really going to be useful? 

Also, as part of the plethora of books you have new definitions of old terms and methods.  This in itself leads to confusion and much heated debate, and typically for no good reason.  Look at the definition and use of the term Bug / Defect.  I'm not going to go into it here, but you can see my response, and a smart-aleck one at that thank you very much, on other forums.  Basically it was my attempt at humor via George Carlin (a genius at making fun of terms).

This whole thing is just plain stupid when we debate vocabulary and definitions of terms.  No wonder other groups look at us with disdain.  But back on topic...

I can only think of and recommend a few books that I would say are original and relevant to our profession:

1. "The Art of Software Testing" by Glenford Myers.  Still a classic, and contains all that is necessary to learn the theory of Software Testing.
2. "Testing Computer Software" by Kaner et al.  This book takes the theory and puts it all to practical use.
3. "Managing the Testing Process" by Rex Black.  Again, straightforward and has practical application.
4. "Software Metrics: Establishing a Company-Wide Program" by Robert Grady & Deborah Caswell.  An older book, but it has a lot of great information on testing metrics and their implementation.  Still relevant to this day.
5. "Automated Software Testing" by Elfriede Dustin et al.  A great book on automation of software testing and a process to implement it.  The follow-up book "Implementing Automated Software Testing" is a great read also, and has a lot of practical information for any project.
6. "Software Testing In The Real World" by Edward Kit.  A great look at implementation and alignment of testing within a company.  Shows different models of how testing can be done and what role(s) the Test staff may play. 

Now I'm sure some of you will argue my list, but these are books that were first to have new ideas and make points that are still relevant today.  I have read all of these books and they remain in my library.  There are some other books that some may argue are "classics", but to me they are rehashes of the ones listed. 

If anything, these are good books to start off with, and the others can be read and utilized as best determined.  So again, how many books do you need, or can you read, on this subject of Software Testing?  And which ones are really going to help you do your job?  Our work is about practical application, not theory. 

Comments ( 1 ) :: Permanent Link

May 14, 2009 - But who is going to Test the Testers

Recently UTest (www.utest.com) relaunched their public-facing site.  As part of that they are touting their ability to help companies do testing via a "crowdsource" approach.  But it seems they didn't do a good job of testing their own site.

After moving around on the site looking at things, I found a 404 error on a link to a page.  And this was on a page, and on a link, that will see a moderate to high level of traffic.  You would think that they would have tested their own stuff better, given that they are a 'testing' company. 

After passing this tidbit on to Joe Strazzere for his "They probably should have tested better" hall of shame, I got a response email with a list of other broken links that Joe found using the Xenu Link Checker.  Again, you'd think they would have run a link-check test against their site before they launched.  Nope.
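For what it's worth, the heart of such a link-check pass is simple.  Here is a minimal, hypothetical sketch in Python (the names and the example page are mine, not Xenu's) that collects every href on a page and reports the ones whose status isn't 200.  The fetch step is injected as a function so the logic can be shown without hitting the network; a real run would fetch each URL, e.g. with urllib.request, and read the response status:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkCollector(HTMLParser):
    """Collect absolute URLs from every <a href> on a page."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base_url, value))

def find_broken_links(base_url, html, fetch_status):
    """Return the links on the page whose fetch_status(url) is not 200."""
    parser = LinkCollector(base_url)
    parser.feed(html)
    return [url for url in parser.links if fetch_status(url) != 200]

# Canned example: one good link, one that 404s.
page = '<a href="/about">About</a> <a href="/missing">Gone</a>'
statuses = {"http://example.com/about": 200, "http://example.com/missing": 404}
broken = find_broken_links("http://example.com/", page, statuses.get)
print(broken)  # the link that 404s
```

Twenty-odd lines run against the site before launch would have caught exactly the kind of 404 described above.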

 

So who is going to 'test' the Testers?  ;-)

 

Comments ( 2 ) :: Permanent Link

March 16, 2009 - Ethics - what does that mean in today's world

With all the recent happenings in the U.S. economy and business world (incompetence, illegalities, and greed) you have to wonder what has happened to the common ideal of corporate and personal ethics.

I'm not going to go on a long rant here, but just recall a conversation I had with my father in December 1990 about ethics and dealing with people that seem to not have any.  It related to a recent event in my work life that led me to leave my company.  Bottom line is that he told me to keep my head up and continue on.  To hold to the things he and my mother had instilled into me about doing the 'right thing'. 

He later sent me an article from the newspaper that talked about the 'problem' of ethical behavior in today's society.  That was in 1990, and looking at today's news it makes me wonder: have we improved at all, or are we in a continuous state of entropy?  I'm going to look for the article and will try to post some of it as best as I can here in the next week or so.

 

Till then, be good people.

Comments ( 1 ) :: Permanent Link

January 2, 2009 - New Year, new things and Yakking up hairballs

After 2 years of inactivity I guess I should do something to revive this blog.  Being busy with work, life and all, I have been negligent in my rantings... er, writings.

I found a couple of other blogs recently that I think others will like.  They are qahatesyou (http://qahatesyou.com/wordpress/) and PracticalQA (http://www.practicalqa.com/).  The latter has a great post and title for the post.

So for 2009 I suggest we all need to make sure we are not "yakking up hairballs".

Be good people!

 

Jim

Comments ( 1 ) :: Permanent Link

December 20, 2006 - Life of a Traveling Consultant

For the last few years I have been working for consulting-type companies.  At times it has been fun, and at others I wonder why I did this to myself.  Right now I'm out of my home town on a project and regretting the travel involved.  I do go home every weekend, but with the weather this time of year (December) in Colorado it can be a bit of a gamble.

Today there is a major snow storm in Colorado and I am wondering if I am going to be able to fly home tomorrow.  I will have to play the phone game with the airline and live on the "Standby" list for flights if my flight gets cancelled.  Oh joy!

So for anyone thinking about going into consulting/contracting you may have to travel for your work.  Just keep in mind that you will get "stuck" somewhere sometime, part of the personal price you pay for those "Consulting Dollars".

Comments ( 1 ) :: Permanent Link

September 23, 2006 - Ethics in business, where did they go?

Posted in Business Matters

Recently I got burned by a job candidate, or more accurately by the unethical company that represented them.  In this case it was a company that had 'doctored' up the person's resume so that they would appear to be a highly qualified test professional.  The Skills/Knowledge section on the resume's front page was loaded with the 'correct' buzzwords for the tools and technologies, which caused it to land in my lap. 

I work for a consulting firm and we do work across the U.S., and we try to 'contract' local people if needed when we cannot get an FTE consultant to the client.  We are a small shop, so this does happen often.  Our internal recruiter contacted the person and set up a technical interview with me over the phone (we were not in the same state).  When I spoke with this 'person' they were right on the money with knowledge and speaking ability, so they looked like a good candidate.  I was ready to move forward, but wanted to check out a couple more candidates to cover my bases.

The next resume I got from our internal recruiter looked almost the same as the previous person I had just spoken to.  I did a side-by-side comparison, and the first page had a knowledge section that was 95% the same between the two; there were only a few tweaks.  I went back and talked with my recruiter to find out what was up.  They replied that it looked like the two candidates were probably from the same Contract Registry (doing corp-to-corp contracts) and that they had probably used a ringer for my phone interview.

I got fooled by a 'ringer' while phone-interviewing a potential job candidate.  So I called the 'person' I had talked with about an hour later.  Guess what, not the same person (voice, speaking skills or knowledge)!  Their resume and the other one immediately went into the trash can!

There are some places out there that just doctor up the resume so they can get high on the hit list and then have a 'ringer' get the job for some dumb schmo!  I had heard about this bait-and-switch crap, but never thought I would get caught by it.  I still want to find the company that did this and expose their fracking asses for this B.S.!!!!

Caveat emptor (buyer beware)

Comments ( 2 ) :: Permanent Link

July 29, 2006 - Change is the only constant in life

This last week (while I was out on vacation, no less) Mercury Interactive was bought out by Hewlett-Packard (HP), which is causing a lot of us in the QA/Testing profession to wonder what is going to happen next.  As Scott Barber said in his blog this week, we have seen some major players (Rational & Segue) in the Test Tool marketplace get acquired in the last couple of years, and a couple of new entries (Microsoft (or a re-entry, anyone remember MS-Test?), Akimbi) that have caused quite a stir.  This is change, and whether we like it or not it does happen.  Sometimes for the better, and other times not.  But this is what life is all about: change, and how we cope with it and learn to live with it.

This type of "Sea Change" has happened before, even before the Dot.com boom of the late 90's.  There have been other tools and companies that came before and either merged, emerged, or died.  Early on in the PC software world, when DOS was dominant, there were only a few test tools around (AutoTester, QES QA/Architect, etc.), and those companies all pretty much died off when the GUI market started to boom with Windows 3.0 and OS/2.  Companies like Mercury (XRunner & WinRunner), SQA (Robot), Segue (QA/Partner), Softbridge (Automated Test Facility, or ATF), Compuware (QA/Run) and Microsoft (MS-Test) all took off and pushed us to the next generation of tools and automation.

Right before the Dot.com craze started, some of the companies were bought out (SQA by Rational, Softbridge by Teradyne), sold off their products (Microsoft with MS-Test to Rational) or emerged as the main players (Mercury, Segue, Compuware, and Rational with its acquisition of SQA).  This scared some of us then too.  When OS/2 started to go down in flames, I switched quickly to Windows and QA/Partner after working with ATF for 4 years.  I learned other tools and survived.  I adapted to the change, and the last few years I have been focused on the Mercury toolset.  Now I may need to adapt again.  This is a change I will need to roll with.  Only time will tell what other changes are in store for us.

 

Comments ( 1 ) :: Permanent Link

July 18, 2006 - In the middle of it all

I just got my latest copy of Better Software magazine and found an article about the "worth" of testing (the article is "Proving Our Worth: Quantifying the Value of Testing" by Lee Copeland).  In it he discusses the "value" we as testers bring to the Software Development equation, and states that our main goal is to provide "information" to all the other parties associated with a project.  I agree totally with this statement.

As testers our main goal is to "find" things out about the system under test and to disseminate the information to the concerned parties on the project.  We are in essence the CSI of the software world; we use deductive reasoning and experiments (tests) to try to determine what is going on and what caused it to happen.  Then we tell the proper authorities about our findings.  As Lee stated in his article, the context and quality of our information is what lends value to the project, and that is one of the factors that proves our worth as a whole.

Which leads me to the premise of the title of this entry; we (testers) are in the middle of it all.  We are the central focal point, or "hub", on the software project wheel.  A lot of other groups revolve around us and rely on testing to "interconnect" them.  Information flows in and through the Testing/QA group during the project.  Now this may sound a bit self-absorbed and egotistical, but it is the truth.  And it is one of the reasons testing is one of the more politically hot areas on a project.  Testing/QA lives in the limelight and under the microscope, whether we like it or not.  Testing/QA is at the center of the process wheel, or eye of the storm as some would say, and we are responsible for holding together the rest of the groups via the information we provide.  If we are not good at doing that then the rest of the wheel is unstable and will break apart.  And even at times where testing is not responsible for something in the process we still get the blame and heat for it.  Price we pay for being in the middle of it all.

So, coming back to the article: is it really worth all the headaches and pain associated with the work we do on the project and the information we try to bring to the table?  Yeah, because we do a "worthy" job and provide "value" to the project.  And in some respects it is fun to be in the middle of it all.

Comments ( 2 ) :: Permanent Link

June 21, 2006 - Nature vs. Nurture - Is a good tester born or taught?

I previously talked about a person's 'innate curiosity' and 'inquisitive nature', and how this trait lends itself to being in the QA/Testing field.  In response to that post I got a comment along the lines of "is this capability of a tester innate or learned?"  To which my response is going to be a non-committal 'well... yes and no'. 

Yes, it can be learned to a degree.  There are different lines of thought process that can be taught that relate to the tasks involved in testing work.  We experience this all the way through school in classes on Mathematics (proof theorems), hard sciences (Chemistry, Physics), Psychology, and Philosophy (Logic and Critical Thinking).  We learn how to 'pick apart' a system (be it biological or non-biological) and prove whether it works or not, why so, and how.  That is why I have met a lot of test professionals with non-CS degrees like Biology, Chemistry, Physics, Philosophy, and Psychology.  They were taught to be thinkers and researchers.  The skills were taught in these types of disciplines, thus it was a nurture type of situation.

Now... no, it can't all be learned.  There has to be some innate mindset present.  Personality characteristics are what help drive a person toward the disciplines that teach the additional skills and help refine them.  Speaking for myself, I was always a curious child; I got into things (and a lot of trouble) and always enjoyed pulling things apart to see how they work (my father's favorite lawnmower was a victim of this curiosity; got it apart alright, just couldn't get it back together correctly again ;-) ).  Also, I have a degree in Zoology and got into computers out of interest (and economic need).  I was always more interested in the analytical/mathematical side of the discipline (genetics, population biology, epidemiology) and didn't like the classification stuff (what is the difference between a Bufo woodhousii and a Rana pipiens; they're both frogs, and who cares).  So there was a 'natural' ability present in me.  When I first got into the software industry I was a programmer, doing a lot of maintenance work.  I debugged/tested and fixed other people's code, and I naturally moved into the formal testing field quickly when an opportunity arose (I liked it more, and at the time saw it as a career move that would be a better fit for me).  Now I still like programming; I do a lot of automation work (functional and load/performance).  But I do it in combination with my overall work focus.  It is in my nature.

Now I know people will argue this, and I expect it.  Does it make me a better tester?  In my opinion it does, because this line of work can be stressful and tedious, and it takes a certain personality and set of characteristics to do it effectively.  I have met other 'testers' who didn't last long because they only learned the craft and didn't have a built-in drive for it.  That has been my experience so far.

So is a good tester born or taught?  They are born (nature) and they are taught (nurture), and the combination is the key factor.  As with anyone, we all have natural talents, and it is how those talents are developed (through education) that defines our capabilities.  As the saying goes, "brains only get you so far, the rest is hard work!"

Comments ( 2 ) :: Permanent Link

June 6, 2006 - Starting out in Testing, Part 2 - 'Inquisitive Nature'

In part 1 I talked about things such as self-study, finding a mentor, and taking a realistic approach to your work/job.  In part two I want to get more into the mind / thought processes employed when getting into testing.  This will cover the 'inquisitive nature' a tester needs in order to perform the work.

An 'inquisitive nature' is one where a person looks at something and examines it to gain a better understanding of what it is and how it works.  Curiosity is another term used for this, and it can be used interchangeably here if you wish.  I'll stick with inquisitive for now.

People by nature are inquisitive; children are the best at this.  What is that?  Why does it do that?  How does it do that?  When will it do it?  Where will it do it?  These are basic cognitive questions that a child will ask.  These are the questions we as test professionals need to constantly ask, and answer.

What is that piece of functionality?  What does it do, and how?  Where does it perform its function?  Where does it come from?  Why does it do this bit of functionality at this time (a question of when)?  When does it not do its functionality, and why?  Who uses this functionality?  These are some of the basic Who, What, Where, When, Why and How questions for the system under test (SUT).  These questions may or may not be easy to get answers for.  It is our job to formulate them and get them answered.  Because of that we also need to ask and answer questions about the methods/processes we will use to get the answers about the SUT.

These questions will include: What information about the SUT do I need to learn in order to test it?  Where is that information?  Who might know about the SUT?  When will I need to get the information or perform the test itself?  How will I perform the test?  What am I trying to prove by doing the test?  Why should I test that component, or not?  And so on.  This is just a sample of the questions to be asked and answered.  Testing is about asking questions and getting answers to them.

But this all needs to be done with some sort of organization and plan.  And next time I will discuss that aspect of starting out in testing.  So come back later and let your 'inquisitive nature' get some answers to those questions.

 

Comments ( 1 ) :: Permanent Link

June 5, 2006 - Burn it down

Posted in Managing Testing

One thing I have been interested in over the years is the question of determining when testing is done: when enough testing has been completed and the product is ready to be shipped.  In the past I have used completion metrics such as Defect Resolution Rate (how fast are you fixing things, and when are you fixing more than you're finding) and Test Execution Status to aid in this task.  Test Execution Status can be the amount of coverage of the requirements (test cases covering requirements) and their pass/fail rate, along with the execution rate (tests planned versus executed) of the tests.  This shows the traditional growth curves when plotted.  It gives you an idea of how much you have done, which helps to determine where you are in the cycle.

An interesting thing I have read about in relation to the Agile methods (Scrum in particular) is the 'Burn Down Rate', which is used to show how much is left to do for 'stories' (requirements/use cases) over a specified time period.  It is a downward-sloping curve that can be used to say you have so much left until the cycle is complete.  Considering this is more in line with what management asks us (how much testing is left to do, and when will it be done), it may be a more effective way of showing how much testing remains.  Instead of stories you plug in the test cases for those stories, and you plot the trend of how fast you are 'burning down' the list of things to do.  This way you could say to management: we have X percent of test cases left to execute, and we look to have them done by Y date.  This is what they want to hear.

I have not done this on a project yet, but I am looking forward to trying it out.  If you have implemented it, please respond with your experiences.  If not, then maybe this will inspire you to change your perception of test execution status reporting and monitoring, and to see if you can 'burn down' some of your testing.

Comments ( 1 ) :: Permanent Link