
This guide will take you through the ins and outs of software testing.

If you plan to make a career in software testing, this is a MUST READ!

What is Software Testing?


Software testing is the process of evaluating a computer system or program to determine whether it meets the specified requirements and produces the desired results. In the process, you identify bugs in the software product or project. Software testing is indispensable for delivering a quality product without bugs or issues.

Skills required to become a Software Tester


The following skills are indispensable to becoming a good software tester. Compare your skill set against the checklist below to determine whether software testing really is for you.

- Analytical skills: A good software tester should have sharp analytical skills. Analytical skills help you break a complex software system into smaller units, gain a better understanding of it, and create the corresponding test cases. Not sure whether you have good analytical skills? Refer to this link: if you can solve at least ONE problem, you have good analytical skills.
- Technical skills: A good software tester must have strong technical skills. This includes a high level of proficiency in tools like MS Office and OpenOffice, testing tools like QTP and LoadRunner, and of course a deep understanding of the application under test. These skills can be acquired through relevant training and practice. Some programming skill is an added advantage, but it is NOT a must.
- Communication skills: A good software tester must have good verbal and written communication skills. Testing artifacts (test cases, test plans, test strategies, bug reports, etc.) created by the software tester should be easy to read and comprehend. Dealing with developers (in case of bugs or other issues) will require a degree of discretion and diplomacy.
- Time management and organization: Testing can at times be a demanding job, especially during a code release. A software tester must manage workload efficiently, maintain high productivity, and exhibit good time-management and organizational skills.
- A GREAT attitude: an attitude to "test to break", attention to detail, and a willingness to learn and suggest process improvements. In the software industry, technologies evolve at an overwhelming speed, and a good software tester should keep upgrading his or her technical skills as they change. Your attitude should also reflect a degree of independence: take ownership of the tasks allocated to you and complete them without much direct supervision.
- Passion: To excel in any profession or job, you must have a great degree of passion for it, and a software tester is no exception. But how do you determine whether you have a passion for software testing if you have never tested before? Simple: TRY it out, and if software testing does not excite you, switch to something else that holds your interest.

Academic Background:
The ideal academic background for a software tester is computer science. A B.Tech/B.E., MCA, BCA, or B.Sc. (Computers) degree will land you a job easily. If you do not hold any of these degrees, then you should complete a software testing certification such as ISTQB or CSTE, which will help you learn the software development/test life cycle and other testing methodologies.

Remuneration
Compensation for a software tester varies from company to company. The average salary range for a software tester in the US is $45,993 - $74,935, and in India it is Rs 247,315 - Rs 449,111. A software tester is also typically given health insurance, bonuses, gratuity, and other perks.

Typical Workday:
On a typical workday, you will be busy understanding requirement documents, creating and executing test cases, reporting and re-testing bugs, attending review meetings, and taking part in other team-building activities.

Career Progression:
Your career progression as a software tester (QA Analyst) in a typical CMMI Level 5 company will look like the following, though it will vary from company to company: QA Analyst (fresher) => Sr. QA Analyst (2-3 years' experience) => QA Team Coordinator (5-6 years' experience) => Test Manager (8-11 years' experience) => Senior Test Manager (14+ years' experience).

Alternate Career Tracks as a Software Tester


Once you have got your hands dirty in manual testing, you can pursue the following specializations:

Automation Testing: As an automation test engineer, you will be responsible for automating manual test case execution that would otherwise be time consuming. Tools used: IBM Rational Robot, Silk Performer, and QTP.

Performance Testing: As a performance test engineer, you will be responsible for checking application responsiveness (time taken to load, maximum load the application can handle), etc. Tools used: WebLOAD, LoadRunner.

Business Analyst: A major advantage testers have over developers is end-to-end business knowledge. An obvious career progression for testers is to become a business analyst. As a business analyst, you will be responsible for analyzing and assessing your company's business model and workflows, and especially how they integrate with technology. Based on your observations, you will suggest and drive process improvements.

Common Myths
- Software testing as a career pays less.
- Developers are more respected as compared to testers.

Contrary to popular belief, software testers (better known as QA professionals) are paid and treated on par with software developers in all "aspiring" companies. A career in software testing should never be considered "second rate".

Software testing is boring: Software testing could actually "test" your nerves, since you need to make sense of business requirements and draft test cases based on your understanding. Software testing is not boring; what is boring is doing the same set of tasks repeatedly. The key is to try new things. For that matter, have you ever spoken to a software developer with more than 3 years' experience? He will tell you how boring his job has become of late.

Okay, I am interested. Where do I begin?


For a complete newbie, here is our suggested approach to learning software testing.

Start by learning the basic principles of software testing. Once done, apply for freelancing jobs. This will help you gain practical knowledge and will reinforce the testing concepts you have learned.

Next, proceed to QTP (an automation tool), then LoadRunner (a performance testing tool), and finally Quality Center (a test management tool). All the while you are learning, we suggest you keep applying for freelancing jobs (apart from other benefits, you will make some moolah too!). Once you are through with all the tools, you may consider taking a certification; we recommend ISTQB, but this is optional. After this, when you apply for permanent jobs in big corporations, you will have many skills to offer as well as some practical freelancing experience, which will definitely increase your chances of being selected.

Learning links:

- Software Testing - link
- QTP - link
- LoadRunner - link
- Quality Center - link
- Freelancing Jobs - link
- Permanent Jobs - any major job portal like monster.com or naukri.com

Hope to see you at a QA conference some day! :-)

Are you new to Software Quality Assurance? Let this be a reference to get you started learning all about SQA.

QA Practices and Procedures


Testing Methodology

_____________________________

The following is an overview of the quality practices of the Software Quality Assurance team:

- The iterative approach to software development presents a significant challenge for SQA. The iterative, rapid deployment process is characterized by a lack of strict adherence to a traditional waterfall development methodology (marketing first specs the feature set, then engineering refines the marketing requests into more detailed specifications and a schedule, then engineering starts building to specification and SQA starts building tests, then a formal testing cycle, and finally product release). Here is a variant of that development process:
- As progress is made toward a release, the first-priority features are done to a significant level of completion before much progress is made on the second-priority features. A similar approach is taken for the "hopefullys" and third-priority features. The first-priority feature list is all that has to be completed before a product is feature complete, even though time has been built into the schedule to complete the second priority as well.
- Other than the initial OK from the executive team that they want a particular product built, there is not a rigorous set of phases that each feature must pass.
- Developers (designers, coders, testers, writers, managers) are expected to interact aggressively and exchange ideas and status.

- By not going heavily into complete specifications, the impact of a better idea along the way need not invalidate a great deal of work.
- One prototype is worth a pound of specification. However, this does not mean that large-scale changes should not be specified in writing. Oftentimes, the effort to do a paper-based design is significantly cheaper than investing in a working prototype. The right balance is sought here.

Complementing the strategy of iterative software development, the SQA testing assessment is accomplished through personal interaction between SQA engineers and Development engineers. Lead SQA engineers meet with the development team to assess the scope of the project, whether new features for an existing product or the development of a new product. Feature, function, GUI, and cross-tool interaction are defined to the level of known attributes. When development documentation is provided, the understanding of the SQA engineer is greatly enhanced. The lead SQA engineer then meets with the test team to scope the level and complexity of testing required. An estimate of test cases and testing time is arrived at and published, based upon the previous discussions.

- Working with the development team, the SQA team takes the builds, from the first functioning integration, and works with the features as they mature, to determine their interaction and the level of testing required to validate the functionality throughout the product.
- The SQA engineers, working with existing test plans and development notations on new functionality, as well as their notes on how new features function, develop significant guidelines for actual test cases and strategies to be employed in the testing. The SQA engineers actively seek the input of the development engineers in the definition and review of these tests.
- Testing is composed of intertwined layers of manual ad hoc and structured testing, supplemented by automated regression testing which is enhanced as the product matures.

Test Plan Components:
- Test requirements based on new features or functions.
- Specific testing based on the features defined as Development Priority 1. There must be a plan in place for these features and they must be scheduled for testing. A product release date will be slipped in order to complete adequate testing of the Priority 1 features.
- Specific testing based on new features or functions defined as Development Priority 2. There must be a plan in place for these features and they must be scheduled for testing. If testing of the Priority 1 features impacts adequate testing of these, they may be dropped from the product.
- Specific testing based on new features or functions defined as Development Priority 3. Software Quality Assurance will not schedule or plan for these features. However, Priority 3 features completed prior to Functional Freeze will be added to the SQA Priority 2 list for testing, and an appropriate risk assessment will be made with respect to their inclusion in the released product.

SQA has its own set of Priority 1, Priority 2, and Priority 3 items, which include not only the Development activities but also the testing required as due diligence for product verification prior to shipment.
- Priority 1 features include the testing of new features and functions, but also a defined set of base installations, program and data integrity checks, regression testing, documentation (printed, HTML and on-line Help) review, and final "confidence" checks (high-level manual or automated tests exercising the most frequently used features of the product) on all media to be released to the public. Products distributed over the Web also have their Web download and installation verified.
- Priority 2 items include a greater spectrum of installation combinations, boundary checking, advanced test creation, and more in-depth "creative" ad hoc testing.
- Priority 3 items usually reflect attempts to bring greater organization to the SQA effort in documentation of test scripts, creation of Flashboards for metric tracking, or expanded load testing.

Testing

One of the test methods the SQA team practices is "Black Box" testing. The SQA engineers, like the customers whom they attempt to emulate, are isolated from the source code and must rely upon their understanding of the application and its features and functions. SQA engineers work with Development engineers toward development of code which lends itself to the exercise of automated test tools, thus providing for stable, repeatable, reliable testing of base features and functions. The deductive, reasoning, creative skills of the SQA engineers are thus freed from the more repetitive tasks to focus on development of more user-centric testing, which expands the scope of coverage.

Manual Testing

- GUI - SQA team members, upon receipt of the Development builds, walk through the GUI and either update the existing hard copy of the product Roadmaps or create new hard copy. This is then passed on to the Tools engineer to automate for new builds and regression testing. Defects are entered into the bug tracking database for investigation and resolution. Questions about GUI content are communicated to the Development team for clarification and resolution. The team works to arrive at a GUI appearance and function which is "customer oriented" and appropriate for the platform: Web, UNIX, Windows, Macintosh. Automated GUI regression tests are run against the product at the Alpha and Beta "Hand off to QA" (HQA) to validate that the GUI remains consistent throughout the development process. During the Alpha and Beta periods, selected customers validate the customer orientation of the GUI.
- Features & Functions - SQA test engineers, relying on the team definition, exercise the product features and functions accordingly. Defects in feature/function capability are entered into the defect tracking system and are communicated to the team. Features are expected to perform as expected, and their functionality should be oriented toward ease of use and clarity of objective. Tests are planned around new features, and regression tests are exercised to validate that existing features and functions are enabled and performing in a manner consistent with prior releases. SQA, using the exploratory testing method, tests manually and then plans more exhaustive testing and automation. Regression tests are exercised which consist of using developed test cases against the product to validate field input, boundary conditions, and so on. Automated tests developed for prior releases are also used for regression testing.
- Installation - The product is installed on each of the supported operating systems in either the default, flat-file configuration or with one of the supported databases. Every operating system and database supported by the product is tested, though not in all possible combinations. SQA is committed to executing, during the development life cycle, the combinations most frequently used by the customers. Clean and upgrade installations are the minimum requirements.
- Documentation - All documentation, which is reviewed by Development prior to Alpha, is reviewed by the SQA team prior to Beta. On-line help and context-sensitive Help are considered documentation, as are manuals, HTML documentation, and Release Notes.
SQA not only verifies technical accuracy, clarity and completeness; they also provide editorial input on consistency, style and typographical errors.

Automated Testing

- GUI - Automated GUI tests are run against the product at the Alpha and Beta "Hand off to QA" (HQA) to validate that the GUI has remained consistent within the product throughout the development process. The automated Roadmaps walk through the client tool windows and functions, validating that each is there and that it functions.
- Data Driven - Data-driven scripts developed using the automation tools and auto driver scripts are exercised for both UNIX and Windows platforms to provide repeatable, verifiable actions and results for core functions of the product. Currently these cover a subset of all functionality. They are used to validate new builds prior to extensive manual testing, thus assuring both Development and SQA of the robustness of the code.
- Future - Utilization of automated tools will increase as our QA product groups become more proficient at the creation of automated tests. Complete functionality testing is a goal, which will be implemented feature by feature.

Reporting

The SQA team lead produces a formal test plan which is submitted to the project team for validation and information. Input is solicited for the understanding of requirements, so that the plan can accurately reflect the testing which will be required. Because of the variances between Development teams' styles of disseminating feature and function scope and definition, the amount of information available is not always consistent. The SQA lead attempts to establish a good working relationship with the Development team, so that questions can be asked and a comprehensive, identical understanding of the product will exist. The lead also pairs a QA engineer with a development counterpart to facilitate day-to-day interaction.

SQA engineers produce status sheets for distribution to the members of the product teams. The data on the status sheets has included: features, functions, time estimated to test, time consumed, and amount of testing yet to be done; the last has proved to be too subjective. Some test engineers include the staff assigned to specific testing and the percentage complete. The latter is difficult to estimate due to the iterative process, as feature and function specifics change often and rapidly during the development cycle as they are refined. The evolving report which appears to give the most information is one which lists the features, the build in which they were tested or re-tested, and the pass or fail status of the test.

SQA provides monitoring of the defects which appear in the product through the use of QA-designed Flashboards (graphical representations of the aggregate numbers of defects) and reports. The defects found in the product are recorded in a Bug Tracking database, where the information is made available to the development group. Information stored in the database then provides statistical and trend analysis for defect find rates and product areas. This information is compared to that presented by the Development team and the Product team. Customer support is kept abreast of these defects and influences the priority assigned to a defect by the team.

Team leads have established directories for their products in which test plans and weekly status reports are posted. These are updated weekly by the team lead, reviewed by the manager, and linked or posted to the QA home page on the intranet. QA managers work with their teams to assess better forms and methods of information dissemination. These are reviewed with the larger engineering and project teams so that the teams feel they understand the scope of work to be done by SQA and the status of a project currently being tested.

For teams managing software quality, it is crucial to manage the workflow around the defect reporting process so that everyone understands how a defect moves from recognition to resolution. Below are some tips for defining the workflow for software defects.

Managing Workflow for Software Defects

___________________

Define the Workflow Statuses - When tracking software defects, it is important to define the workflow. Workflow is normally tracked via the "status". Let's create a simple workflow for a development team, where the tester finds a defect and follows it through resolution, quality assurance, and closure. Below are some possible statuses (workflow) for this process:
- Active
- Resolved
- QAed
- Committee
- Closed

Flowchart the Workflow - Flowcharting the workflow allows team members to understand the process in full. We created the flowchart using Microsoft Word.

Advanced Workflow - In our example above, we used a simple workflow. However, if your team uses software to manage defects, you should be able to implement a more robust workflow. For example, the software should allow you to define "state transitions". This identifies how a defect can move from one status to another. In our example above, you may want to set up these transitions:
- Active - can only transition to Resolved or Committee
- Committee - can only transition to Active or Closed
- Resolved - can only transition to Active or QAed
- QAed - can only transition to Active or Closed
- Closed - no transitions allowed

Likewise, the software should also allow you to define which fields (or items) you wish to make required in different states. In the example above, if the defect is changed to Resolved, we may want to require that the programmer enter the resolution information (a resolution code and a description of how they resolved it). Robust defect tracking software will allow you to define the field attributes for each state transition. Software Planner (http://www.SoftwarePlanner.com) does this nicely; you can see how this is handled in Software Planner by viewing this movie: http://www.pragmaticsw.com/GuidedTours/Default.asp?FileName=Workflow

Defect Severity - Another important aspect of defect tracking is to objectively define your defect severities. If this is subjective, team members will struggle to classify the severity. Below are severities that are objective:
- 1-Crash - set when the defect causes the software to crash
- 2-Major Bug - set when there is a major defect with NO workaround
- 3-Workaround - set when there is a defect but it has a workaround
- 4-Trivial - not a major bug, trivial (e.g., a misspelling)

Defect Priority - Similar to severity, the priority for resolving the defect should be objective, not subjective. Below are priorities that are objective:
- 1-Fix ASAP - highest level of priority, must be fixed as soon as possible
- 2-Fix Soon - fix once the priority 1 items are completed
- 3-Fix If Time - fix if time allows; otherwise, fix in a future release

User Acceptance Test Release Template - Upon entering User Acceptance Testing, it is wise to create a document that describes how your QA process went. Here is a User Acceptance Test Release Report template: http://www.PragmaticSW.com/Pragmatic/Templates/UATRelease.rtf
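Returning to the state transitions defined above, here is a minimal sketch (plain Python, using the example status names from this workflow; not tied to any particular defect tracking tool) of how a team could validate a proposed status change:

```python
# Sketch only: each status maps to the set of statuses it may transition to,
# mirroring the example workflow described above.
ALLOWED_TRANSITIONS = {
    "Active":    {"Resolved", "Committee"},
    "Committee": {"Active", "Closed"},
    "Resolved":  {"Active", "QAed"},
    "QAed":      {"Active", "Closed"},
    "Closed":    set(),  # no transitions allowed out of Closed
}

def can_transition(current: str, proposed: str) -> bool:
    """Return True if a defect may move from `current` to `proposed`."""
    return proposed in ALLOWED_TRANSITIONS.get(current, set())

# Example: a tester reopens a resolved defect, but cannot reopen a closed one.
assert can_transition("Resolved", "Active")
assert not can_transition("Closed", "Active")
```

A real defect tracker would also enforce the required fields per transition, for example demanding a resolution code when a defect moves to Resolved.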

Flowchart:

"The full business, from initial thinking to final use, is called the product's life cycle." Cem Kaner
Testing Computer Software

Software Life Cycle

____________________________________

Project - A user-defined software test effort. Projects contain the specific test plans, test procedures, test cases, defect information, test schedule information, and performance data used to test software applications and track results.

Pre-Alpha - Pre-Alpha is the test period during which the product is made available for internal testing by QA, Information Development, and other internal users. Shipping Pre-Alpha drops to external customers during this time is explicitly for the purpose of getting feedback about the implementation or usability of one or more features.

Alpha - Alpha is the test period during which the product is complete and usable in a test environment but not necessarily bug-free. It is the final chance to get verification from customers that the tradeoffs made in the final development stage are coherent.

Entry to Alpha:
- All features complete/testable (no urgent bugs or QA blockers, including automation)
- High bugs on primary platforms fixed/verified
- 50% of medium bugs on primary platforms fixed/verified
- All features tested on primary platforms
- Purify run on post-FF drop to obtain baseline
- Performance measured/compared to previous release (user functions)
- 80% of automation complete (not including UI tests)
- Media verified by QA
- Doc review started
- Usability testing and feedback (ongoing)
- Alpha sites ready for install
- Final product feature set determined

Beta - Beta is the test period during which the product should be of "FCS quality" (it is complete and usable in a production environment). The purpose of the Beta ship and test period is to test the company's ability to deliver and support the product (and not to test the product itself). Beta also serves as a chance to get a final "vote of confidence" from a few customers to help validate our own belief that the product is now ready for volume shipment to all customers.

Entry to Beta:
- At least 50% positive response from Alpha sites
- All customer U/H/M bugs addressed via patches/drops in Alpha (except OAR bugs)
- 100% run to plan
- Secondary platform/compatibility testing complete: all U/H/M bugs fixed/verified
- Bug fixes regression/confidence tested
- Bug fix rate exceeds find rate consistently for two weeks
- Preliminary release notes available
- Beta sites ready for install
- Second doc review complete

GM (Golden Master) - GM is the test period during which the product should require minimal work, since everything was done prior to Beta. The only planned work should be to revise part numbers and version numbers, prepare documentation for final printing, and sanity test the final bits.

Entry to Golden Master:
- Beta sites declare the product is ready to ship
- All customer U/H bugs addressed via patches/drops in Beta
- All negative responses from sites tracked and evaluated
- Support declares the product is supportable/ready to ship
- QA-qualified U/H bugs re-verified in final drop
- All patches delivered to Beta sites included in final drop
- Final drop selectively regression tested with no new U/H bugs
- Bug find rate is lower than fix rate and steadily decreasing
- Docs signed off

FCS (First Customer Ship) - FCS is the period which signifies entry into the final phase of a project. At this point, the product is considered wholly complete and ready for purchase and usage by the customers.

Entry to FCS:
- Product baked for two weeks with no new urgent bugs (multiple attempts may be required)
- Product team declares the product is ready to ship
- OHG (Hand-off to QA) approval to ship

Test Phase - A pre-determined period of QA evaluation of each software build.

Builds - In many software projects, programming and testing are treated as separate phases. Code units are written and unit tested before any form of integration or system testing. Although this approach may be acceptable on small projects, there are many advantages to overlapping development and testing activities. Builds, which are a fundamental approach to testing, allow development and testing activities to overlap.
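Since each phase entry above is essentially a checklist of gate criteria, a team could track the gate as simple data. The sketch below is illustrative only; it uses a few criteria paraphrased from the Alpha entry list, with made-up pass/fail values:

```python
# Illustrative only: a handful of the Alpha entry criteria, tracked as booleans.
alpha_entry_criteria = {
    "All features complete/testable": True,
    "High bugs on primary platforms fixed/verified": True,
    "50% of medium bugs on primary platforms fixed/verified": False,
    "80% of automation complete (not including UI tests)": False,
    "Media verified by QA": True,
}

# Any unmet criterion blocks entry to the phase.
blockers = [name for name, met in alpha_entry_criteria.items() if not met]
if blockers:
    print("Alpha entry blocked by:", "; ".join(blockers))
else:
    print("All Alpha entry criteria met.")
```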

Why are there Bugs? Bugs exist because humans aren't perfect.

Why are there Bugs?


by Mark Glaser

___________________________________

Since humans design and program hardware and software, mistakes are inevitable. That's what computer and software vendors tell us, and it's partly true. What they don't say is that software is buggier than it has to be. Why? Because time is money, especially in the software industry. This is how bugs are born: a software or hardware company sees a business opportunity and starts building a product to take advantage of it. Long before development is finished, the company announces that the product is on the way. Because the public is (the company hopes) now anxiously awaiting this product, the marketing department fights to get the goods out the door before that deadline, all the while pressuring the software engineers to add more and more features. Shareholders and venture capitalists clamor for quick delivery because that's when the company will see the biggest surge in sales. Meanwhile, the quality assurance division has to battle for sufficient bug-testing time.

"The simple fact is that you get the most revenues at the release of software," says Bruce Brown, the founder of BugNet, a newsletter that has chronicled software bugs and fixes since 1994. "The faster you bring it out, the more money you make. You can always fix it later, when people howl. It's a fine line when to release something, and the industry accepts defects."

It may seem that there are more bugs these days than ever before, but longtime bug watchers like Brown say this is mostly an illusion caused by increased media coverage. Not only has the number of bugs not changed, but manufacturers are fixing them more quickly. But while the industry as a whole may not be buggier, one important new category is, arguably, more flawed than other genres: Internet software. The popularity of the Internet is pushing companies to produce software faster than ever before, and the inevitable result is buggier products.

"Those are crazy release schedules," says Brian Bershad, an associate professor of computer science at the University of Washington. His Kimera project helped catch several security bugs in Java. "The whole industry is bonkers. Web standards need to be developed and thoughtfully laid out, but look at all the versions of Java and HTML. It's not that the people aren't smart; it's just that they don't have time to think."

But software and hardware companies persist in arguing that we should put up with bugs. Why? Because the cost of stamping out all bugs would be too high for the consumer. "Software is just getting so incredibly complicated," says Bershad. "It's too expensive to have no bugs in consumer software."

Employers hire people they like; they don't hire people they don't like. So first and foremost, be likable and create a good first impression at your interview. Likable people get hired; experience alone does not get you the job.

QA Career - Is It Good?

___________________________________

Question: I am planning to take QA testing training and get a job. How is the market? Is it easy to get a job in the QA market? What I feel is that it's not that easy to get a job as a fresher in testing. If you have 1 or 2 years' experience then it's okay; otherwise it's a bit difficult.
Posted by Tamil

Answer: I have been in QA for the last 1.5 years, and I have found the career very enjoyable and rewarding. It's an emerging field and definitely has good career prospects.
Posted by Uday

Answer: This would be the case for a fresher in any field, so don't hesitate; jump into the field. It has great potential ahead. At some time, somewhere, you will be a fresher, and after that you will be experienced.
Posted by Sanjay

Answer: The very first question I would like to ask you is: why do you want to come into this field? If you are coming into this field only because you see many openings in this particular area nowadays, then I think your decision is wrong, because I am sure that in six months you will see a lot of openings in the market for .NET, and then you will think of shifting to .NET. But if you are really interested in testing or QA, then you should try for a job. Learn some basic concepts of testing rather than going for any tool directly. There are a lot of books available in the market, and you will also find a lot of material on the net. If you really have the desire, then you will definitely get there, no matter what your previous background and experience.

Salary Negotiation

_____________________________________

Question: I graduated with a degree in Computer Engineering this past spring. My SQA experience is very limited, but I've proven to be good at it so far, and have even received a monetary bonus for my work. I'm currently a consultant and interviewing for a permanent position with the company holding the contract. My job description and responsibilities would not change if hired. Considering my limited experience, what is a good salary range to try to negotiate at my interview? I'd appreciate any feedback as I have no job experience to compare to.
Posted by Julie

Answer: Why are you considering leaving? Are you happy where you are now? Is your only concern the cash? Most major salary increases occur when you leave your current company for a new position elsewhere. A friend of mine gave me this rule of thumb: don't leave your current job for less than a 10% increase.

Whenever salary comes up in an interview, I simply tell the interviewers the truth: they will give me a fair salary, and I don't offer any information about what I expect them to pay me. In practice, every company has a salary range in consideration for every position that they want to fill. If they think that your skills match the position but your salary expectations are out of line, you won't get called back, but you don't know where that line is. The company will have to make you an offer letter that indicates how much they are willing to pay you. If it's not enough, go back to them and tell them that they are not being fair. They won't have burned any bridges by telling other candidates that the position had been filled. They may negotiate with you--I have had many HR representatives tell me how they came to that conclusion through salary surveys of the local area (it's usually a discussion-ending statement).

As I see it, the key is, as in chess or most martial arts: don't make the first move. Feel free to give them your current salary and tell them that you will not leave for anything less than 10% more, or whatever your bottom line is. Every company knows that if you're in motion, you're willing to continue in that motion (i.e., look for work elsewhere), and other companies may make plays for you once you land on their doorstep. They will tend to be fair, and if they're not, then you'll know from the offer letter.

As a side note, one former manager was courted by a large company in town. She refused to leave. When another company convinced her to leave, they made several attempts to court her with larger and larger salaries. She finally agreed to go for more than double their initial offer. Their initial salary offer to her was more than 50% larger than her salary at the company in which we both worked, so it was a lot of money--but it was only for the money, and she stayed for one year, as she agreed to do.


How to Face HR Interviews?

_____________________________

Question: How do I face HR interviews? They ask why you left your old company, and what your weaknesses and strengths are.

Answer: The simple answer, without any hesitation, should be: tell them that there was no more testing work and you were in the company just to develop test cases, execute them, and report bugs (something like this, as per your nature of work). Usually, HR interviews are just to see how spontaneous you are, but they don't go into much detail. So, without changing facial expressions, just tell them something spontaneous like the above and make it up. This doesn't mean anything to you or to them; they just ask to see how spontaneous you are. Let them write your answer down on a piece of paper; it is not an issue. No need to think about it.
Posted by Rupa

Answer: Then the HR person, who meets regularly with the HR person from that company (or just called them before your interview), will either terminate your interview (because if you can lie to her about your past employment, you can lie about other things too) or continue, and you will not get a call back. TELL THE TRUTH AND YOU WON'T HAVE PROBLEMS! Why is that such a difficult concept for people to understand? You have to tell the truth when you file defect reports, so it should be a natural extension to do so when interviewing.

> Usually, HR interviews are just to see how spontaneous you are.

No, they're not. They're designed to determine what skills and knowledge you have, and to see whether the credentials that you report in your resume (or C.V.) are congruent with you when you are met in person.

> But they don't go into the detail much. So without changing facial expressions, just tell them spontaneously something like above one and makeup.

No. Lying is no way to get a job. Ever. Anywhere. Under no circumstances should you "make up" anything in any interview. There have been many jobs that I have held where, months after I was in place, I found out that my previous HR manager and the HR manager at my current job had regular meetings or were part of an informal network and spoke regularly. There is an understanding that when certain companies are listed, you call those HR people. I have also received e-mails from HR people--out of the blue--asking if I would come and interview for positions because my name was mentioned by a past HR person. DO NOT LIE. It's not only a good, ethical practice, it's common sense.
Posted by Walter Grlitz

Answer: You are right, if the position is a permanent position at the end client. But that is not the case, I guess, if it is a contracting position. Do you think so? Small consulting companies that are interested in making money at any cost will simply ignore a truthful answer. Those who think about their name and reputation may feel that placing such people will cause them problems in their business relations. In this case, I don't think the truth will work. Do you think it is easy to tell who is who between those two categories? I doubt the placement prospects of those poor people who say, in the worst-case scenario, "Because I screwed up something, I lost my job", which is the truth. Sometimes losing a job may also happen even though the person is very careful and follows all the company procedures, but gets involved in a crisis without knowing that something is happening and finally gets the blame. This may be due to miscommunication within the team or the non-performance of superiors. Or there may be a new manager who wants to establish his own team (the process being to find something to blame on the team members one by one, remove them from their jobs one by one, and recruit people he knows from his old company, in a slow process). In these scenarios, people cannot simply speak against their superiors, even after they have left the company and lost the job. What's your opinion on these situations, since you mentioned that you have done some HR interviews?
Posted by Rupa

Answer: I am saddened that anyone thinks that lying is the right way to get any job, contract or permanent. I am mortified that someone would offer this advice publicly. It does not matter if your employer is unscrupulous; you need to maintain a level of integrity and honesty. Life (and, to a lesser extent, work, as it is only a subset of life) is not a game to be won at all costs; it's a responsibility to be fair with every other person on the planet. I don't know how large the software development and testing market is where you live, but every time your lies are uncovered, you burn a bridge. And as I said, there are connections in the HR and testing world.

There was one tester who worked at a company I was working at. He didn't lie about his skills--he was a good tester--but he didn't work very hard. Two companies later, his resume came across my desk. I responded directly to him and asked him if he wanted to let his name stand, knowing that I knew his work habits. He never responded. My point is that you are not alone in this world, and your actions in one place will affect future job opportunities elsewhere, in both positive and negative ways. If you do very well, co-workers will remember you, and you will be offered jobs in the companies in which they find themselves, because they will mention your name. This has happened to me often. If you don't do well, you will have to interview for positions.

There are ways of telling the truth without coming out looking bad. Everyone understands that there are differences of opinion. If there are a lot of those on your resume, chances are that you will have one again, so don't use that "excuse" too often. As I said, the truth is always easier to defend than a lie.

This is a quick reference guide to the Quality Assurance process. It explains at a high level the key documents QA must receive or prepare during the course of a project cycle to adequately assess the readiness of a product for release. Although the level of detail in each document may vary by team and project, each is mandatory.

Quick Reference Guide to the Quality Assurance Process


I. PRAD

_______

The Product Requirement Analysis Document is the document prepared/reviewed by marketing, sales, and technical product managers. This document defines the requirements for the product, the "What". It is used by the developer to build his/her functional specification and used by QA as a reference for the first draft of the Test Strategy.

II. Functional Specification

The functional specification is the "How" of the product. The functional specification identifies how new features will be implemented. This document includes items such as what database tables a particular search will query. This document is critical to QA because it is used to build the Test Plan. QA is often involved in reviewing the functional specification for clarity and helping to define the business rules.

III. Test Strategy

The Test Strategy is the first document QA should prepare for any project. This is a living document that should be maintained/updated throughout the project. The first draft should be completed upon approval of the PRAD and sent to the developer and technical product manager for review. The Test Strategy is a high-level document that details the approach QA will follow in testing the given product. This document can vary based on the project, but all strategies should include the following criteria:

- Project Overview - What is the project?
- Project Scope - What are the core components of the product to be tested?
- Testing - This section defines the test methodology to be used, the types of testing to be executed (GUI, Functional, etc.), how testing will be prioritized, testing that will and will not be done, and the associated risks. This section should also outline the system configurations that will be tested and the tester assignments for the project.
- Completion Criteria - These are the objective criteria upon which the team will decide the product is ready for release.
- Schedule - This should define the schedule for the project and include completion dates for the PRAD, Functional Spec, Test Strategy, etc. The schedule section should include build delivery dates, release dates, and the dates for the Readiness Review, QA Process Review, and Release Board Meetings.
- Materials Consulted - Identify the documents used to prepare the test strategy.
- Test Setup - This section should identify all hardware/software and personnel prerequisites for testing. This section should also identify any areas that will not be tested (such as 3rd party application compatibility).

IV. Test Matrix (Test Plan)

The Test Matrix is the Excel template that identifies the test types (GUI, Functional, etc.), the test suites within each type, and the test categories to be tested. This matrix also prioritizes test categories and provides reporting on test coverage:
- Test Summary report
- Test Suite Risk Coverage report

Upon completion of the functional specification and test strategy, QA begins building the master test matrix. This is a living document and can change over the course of the project as testers create new test categories or remove non-relevant areas. Ideally, a master matrix need only be adjusted to include new feature areas or enhancements from release to release on a given product line.

V. Test Cases

As testers build the Master Matrix, they also build their individual test cases. These are the specific functions testers must verify within each test category to qualify the feature. A test case is identified by ID number and prioritized. Each test case has the following criteria:

- Purpose - Reason for the test case.
- Steps - A logical sequence of steps the tester must follow to execute the test case.
- Expected Results - The expected result of the test case.
- Actual Result - What actually happened when the test case was executed.
- Status - Identifies whether the test case was passed, failed, blocked or skipped:
  - Pass - Actual result matched expected result.
  - Failed - Bug discovered that represents a failure of the feature.
  - Blocked - Tester could not execute the test case because of a bug.
  - Skipped - Test case was not executed this round.
- Bug ID - If the test case was failed, identify the bug number of the resulting bug.
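As a rough illustration of the test case criteria above, the sketch below models a single test case record in Python. The field names and example values are assumptions for illustration, not a prescribed schema; in practice this record would live in the test management tool rather than in code.

```python
from dataclasses import dataclass
from typing import List, Optional

# Assumed field names, mirroring the test case criteria described above.
@dataclass
class TestCase:
    case_id: str                      # test case ID number
    priority: int                     # 1 = highest priority
    purpose: str                      # reason for the test case
    steps: List[str]                  # logical sequence of steps to execute
    expected_result: str
    actual_result: str = ""
    status: str = "Skipped"           # Pass, Failed, Blocked, or Skipped
    bug_id: Optional[str] = None      # filled in only when the case fails

# Hypothetical example of a tester recording a result.
tc = TestCase(
    case_id="TC-101",
    priority=1,
    purpose="Verify login with valid credentials",
    steps=["Open login page", "Enter valid user/password", "Click Login"],
    expected_result="User lands on the dashboard",
)
tc.actual_result = "User lands on the dashboard"
tc.status = "Pass"
```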
VI. Test Results by Build

Once QA begins testing, it is incumbent upon them to provide results on a consistent basis to developers and the technical product manager. This is done in two ways: a completed Test Matrix for each build and a Results Summary document. For each test cycle, testers should fill in a copy of the project's Master Matrix. This will create the associated Test Coverage reports automatically (Test Coverage by Type and Test Coverage by Risk/Priority). This should be posted in a place where the necessary individuals can access the information. Since the full Matrix is large and not easily read, it is also recommended that you create a short Results Summary that highlights key information. A Results Summary should include the following:

- Build Number
- Database Version Number
- Install Paths (if applicable)
- Testers
- Scheduled Build Delivery Date
- Actual Build Delivery Date
- Test Start Date
- Scope - What type of testing was planned for this build? For example, was it a partial build? A full-regression build? Scope should identify areas tested and areas not tested.
- Issues - This section should identify any problems that hampered testing, represent a trend toward a specific problem area, or are causing the project to slip. For example, in this section you would note if the build was delivered late, why, and what its impact was on testing.
- Statistics - In this section, you can note things such as the number of bugs found during the cycle, the number of bugs closed during the cycle, etc.

VII. Release Package

The Release Package is the final document QA prepares. This is the compilation of all previous documents and a release recommendation. Each release package will vary by team and project, but they should all include the following information:

- Project Overview - This is a synopsis of the project, its scope, any problems encountered during the testing cycle, and QA's recommendation to release or not release. The overview should be a "response" to the test strategy and note areas where the strategy was successful, areas where the strategy had to be revised, etc. The project overview is also the place for QA to call out any suggestions for process improvements in the next project cycle. Think of the Test Strategy and the Project Overview as "project bookends".
- Project PRAD - This is the Product Requirements Analysis Document, which defines what functionality was approved for inclusion in the project. If there was no PRAD for the project, it should be clearly noted in the Project Overview. The consequences of an absent PRAD should also be noted.
- Functional Specification - The document that defines how functionality will be implemented. If there was no functional specification, it should be clearly noted in the Project Overview. The consequences of an absent Functional Specification should also be noted.
- Test Strategy - The document outlining QA's process for testing the application.
- Results Summaries - The results summaries identify the results of each round of testing. These should be accompanied in the Release Package by the corresponding reports for Test Coverage by Test Type and Test Coverage by Risk Type/Priority from the corresponding completed Test Matrix for each build. In addition, it is recommended that you include the full Test Matrix results from the test cycle designated as Full Regression.
- Known Issues Document - This document is primarily for Technical Support. It identifies workarounds, issues development is aware of but has chosen not to correct, and potential problem areas for clients.
- Installation Instructions - If your product must be installed at the client site, it is recommended to include the Installation Guide and any related documentation as part of the release package.
- Open Defects - The list of defects remaining in the defect tracking system with a status of Open. Technical Support has access to the system, so a report noting the defect ID, the problem area, and the title should be sufficient.
- Deferred Defects - The list of defects remaining in the defect tracking system with a status of Deferred. Deferred means the technical product manager has decided not to address the issue with the current release.
- Pending Defects - The list of defects remaining in the defect tracking system with a status of Pending. Pending refers to any defect waiting on a decision from a technical product manager before a developer addresses the problem.
- Fixed Defects - The list of defects waiting for verification by QA.
- Closed Defects - The list of defects verified as fixed by QA during the project cycle.
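To make these defect lists concrete, here is a small sketch (hypothetical data and field names, not tied to any particular tracking tool) that groups exported defect records by status into the sections a release package would list:

```python
from collections import defaultdict

# Hypothetical defect records exported from a tracking system.
defects = [
    {"id": "D-101", "area": "Install", "title": "Upgrade fails on custom path", "status": "Open"},
    {"id": "D-102", "area": "GUI",     "title": "Tooltip typo",                 "status": "Deferred"},
    {"id": "D-103", "area": "Search",  "title": "Boundary query error",         "status": "Fixed"},
]

# Group the records by status, then print each release-package section.
sections = defaultdict(list)
for d in defects:
    sections[d["status"]].append(d)

for status in ("Open", "Deferred", "Pending", "Fixed", "Closed"):
    print(f"{status} defects ({len(sections[status])}):")
    for d in sections[status]:
        print(f"  {d['id']}  {d['area']}: {d['title']}")
```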
The Release Package is compiled in anticipation of the Readiness Review meeting. It is reviewed by the QA Process Manager during the QA Process Review Meeting and is provided to the Release Board and Technical Support.

- Readiness Review Meeting: The Readiness Review meeting is a team meeting between the technical product manager, project developers, and QA. This is the meeting in which the team assesses the readiness of the product for release. This meeting should occur prior to the delivery of the Gold Candidate build. The exact timing will vary by team and project, but the discussion must be held far enough in advance of the scheduled release date that there is sufficient time to warn executive management of a potential delay in the release. The technical product manager or lead QA may schedule this meeting.
- QA Process Review Meeting: The QA Process Review Meeting is a meeting between the QA Process Manager and the QA staff on the given project. The intent of this meeting is to review how well or poorly the process was followed during the project cycle. This is the opportunity for QA to discuss any problems encountered during the cycle that impacted their ability to test effectively. This is also the opportunity to review the process as a whole and discuss areas for improvement. After this meeting, the QA Process Manager will give a recommendation as to whether enough of the process was followed to ensure a quality product and thus allow a release. This meeting should take place after the Readiness Review meeting. It should be scheduled by the lead QA on the project.
- Release Board Meeting: This meeting is for the technical product manager and senior executives to discuss the status of the product and the team's release recommendations. If the results of the Readiness meeting and QA Process Review meeting are positive, this meeting may be waived. The technical product manager is responsible for scheduling this meeting. This meeting is the final check before a product is released.

Due to rapid product development cycles, it is rare that QA receives completed PRADs and Functional Specifications before they begin working on the Test Strategy, Test Matrix, and Test Cases. This work is usually done in parallel. Testers may begin working on the Test Strategy based on partial PRADs or confirmation from the technical product manager as to what is expected to be in the next release. This is usually enough to draft a high-level strategy outlining immediate resource needs, potential problem areas, and a tentative schedule. The Test Strategy is then updated once the PRAD is approved, and again when the functional specifications are complete enough to provide management with a committed schedule. All drafts of the test strategy should be provided to the technical product manager, and it is QA's responsibility to ensure that the information provided in the document (such as potential resource problems) is clearly understood.

If the anticipated release does not represent a new product line, testers can begin the Master Test Matrix and test cases at the same time the project's PRAD is being finalized. Testers can build and/or refine test cases for the new functionality as the functional specification is defined. Testers often contribute to and are expected to be involved in reviewing the functional specification. The results summary document should be prepared at the end of each test cycle and distributed to developers and the technical product manager. It is designed to inform interested parties of the status of testing and the possible impact to the overall project cycle. The release package is prepared during the last test cycle for the readiness review meeting.

Test Strategy Template

QA Test Strategy: [Product and Version]
[Document version history in format MM-DD-YYYY]

1.0 PROJECT OVERVIEW
[Brief description of project]

1.2 PROJECT SCOPE
[More detailed description of project detailing functionality to be included]

2.0 MATERIALS CONSULTED
[Identify all documentation used to build the test strategy]

3.0 TESTING - CRITICAL FOCUS AREAS
[Areas identified by developers as potential problems above and beyond specific feature enhancements or new functionality already given priority 1 status by QA]
- INSTALLATION: [Installation paths to be qualified by QA. Not all products require installation testing. However, those that do often have myriad installation paths. Due to time and resource constraints, QA must prioritize. Decisions on which installation paths to test should be made in cooperation with the technical product manager. Paths not slated for testing should also be identified here.]
- GUI: [Define what, if any, specific GUI testing will be done]
- FUNCTIONAL: [Define the functionality to be tested and how it will be prioritized]
- INTEGRATION: [Define the potential points of integration with other MediaMap products and how they will be prioritized and tested]
- SECURITY: [Define how security issues will be tested and prioritized]
- PERFORMANCE: [Define what, if any, performance testing will be done and its priority]
- FAILURE RECOVERY: [Define what, if any, failure recovery testing will be done and its priority]

3.1 TECHNIQUE
[Technique used for testing: automation vs. manual]

3.2 METHODOLOGY
[Define how testers will go about testing the product. This is where you outline your core strategy. Include in this section anything from tester assignments to tables showing the operating systems and browsers the team will qualify. It is also important to identify any testing limitations and risks]

4.0 TEST SET-UP

4.1 TEST PRE-REQUISITES
[Any non-software or hardware related item QA needs to test the product. For example, this section should identify contact and test account information for 3rd party vendors]

4.2 HARDWARE
QA has the following machines available for testing:
- Workstations:
- Servers: [Include processor, chip, memory, and disk space]
- Other: [Identify any other hardware needed, such as modems, etc.]

4.3 SOFTWARE
[Identify all the software applications QA will qualify with the product and those QA will not qualify. For example, this is where you would list the browsers to be qualified. It is also important to identify what will not be qualified (for example, not testing with Windows 2000)]

4.4 PERSONNEL
[Identify which testers are assigned to the project and who will test what. It is also important to identify who is responsible for the creation of the test strategy, test plan, test cases, release package, documentation review, etc.]

5.0 COMPLETION CRITERIA
[Identify how you will measure whether the product is ready for release. For example, what is the acceptable level of defects in terms of severity, priority, and volume?]

6.0 SCHEDULE

6.1 Project Schedule
- PRD Review completed by [MM-DD-YYYY] - [STATUS]
- Functional Specification completed [MM-DD-YYYY] - [STATUS]
- Release Date approved by [MM-DD-YYYY] - [STATUS]
- Test Strategy completed by [MM-DD-YYYY] - [STATUS]
- Core Test Plan (functional) completed by [MM-DD-YYYY] - [STATUS]
- Readiness Meeting - [STATUS]
- QA Process Review Meeting - [STATUS]
- Release Board Meeting - [STATUS]
- Release on [MM-DD-YYYY] - [STATUS]

6.2 Build Schedule
- Receive first build on [MM-DD-YYYY] - [STATUS]
- Receive second build on [MM-DD-YYYY] - [STATUS]
- Receive third build on [MM-DD-YYYY] - [STATUS]
- Receive fourth build on [MM-DD-YYYY] - [STATUS]
- Receive Code Freeze Build on [MM-DD-YYYY] - [STATUS]
- Receive Full Regression Build on [MM-DD-YYYY] - [STATUS]
- Receive Gold Candidate Build on [MM-DD-YYYY] - [STATUS]
- Final Release on [MM-DD-YYYY] - [STATUS]

7.0 QA Test Matrix and Test Cases:

Manage all phases of your software development with Software Planner

Software Planner is a project collaboration tool that allows you to manage all phases of your software development. In the initial stages of a project, it allows you to post functional specifications and other project-related documents (meeting minutes, client proposals, etc.). As the project progresses, it allows you to post baseline documents (such as detailed designs and project plans) and lets your project managers and developers track project deliverables; developers can update the percentage complete for all items assigned to them. Once testing begins, it allows your testers to create test cases and track software defects. Developers are automatically alerted by email as defects are assigned to them, and team members are alerted as new documents are uploaded or re-uploaded (project plan updates, for example); each person can control the email alerts they wish to receive. Use the discussion forums to communicate issues with clients and project team members, and keep your appointments and to-do list online and updated at all times.

Summary: Automated test tools are powerful aids to improving the return on the testing investment when used wisely. Some tests inherently require an automated approach to be effective, but others must be manual. In addition, automated testing projects that fail are expensive and politically dangerous. How can we recognize whether to automate a test or run it manually, and how much money should we spend on a test?

When Test Automation Makes Sense

Let's start with the tests that ideally are automated. These include:

- Regression and confirmation. Rerunning a test against a new release to ensure that behavior remains unbroken, or to confirm that a bug fix did indeed fix the underlying problem, is a perfect fit for automated testing. The business case for test automation outlined in Software Test Automation by Mark Fewster and Dorothy Graham is built around this kind of testing.
- Monkey (or random). Tests that fire large amounts or long sequences of data, transactions, or other inputs at a system in a random search for errors are easily and profitably automated.
- Load, volume, and capacity. Sometimes, systems must support tremendous loads. On one project, we had to test how the system would respond to 50,000 simultaneous users, which ruled out manual testing! Two Linux systems running custom load-generating programs filled the bill (a crude sketch of this kind of load generation appears just after this list).
- Performance and reliability. With the rise of Web-based systems, more and more automated testing is aimed at looking for slow or flaky behavior on Web systems.
- Structural, especially API-based unit, component, and integration. Most structural testing involves harnesses of some sort, which brings you most of the way into automation. Again, the article I wrote with Greg Kubaczkowski, "Mission Made Possible" (STQE magazine, July/Aug. 2002), provides an example.
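To make the load item concrete, here is a minimal, hypothetical sketch of a load driver, not the custom programs the author describes. The target URL, user count, and request count are invented placeholders, and a serious load or capacity test would normally use a dedicated tool rather than a few dozen threads.

# Hypothetical sketch: a tiny load generator. TARGET_URL, USER_COUNT, and the
# /health endpoint are placeholder assumptions for illustration only.
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

TARGET_URL = "http://localhost:8080/health"  # placeholder endpoint
USER_COUNT = 50          # number of concurrent simulated users
REQUESTS_PER_USER = 20   # requests each simulated user sends

def simulated_user(user_id: int) -> list[float]:
    """Send a burst of requests and return the observed latencies (seconds)."""
    latencies = []
    for _ in range(REQUESTS_PER_USER):
        start = time.perf_counter()
        try:
            with urllib.request.urlopen(TARGET_URL, timeout=5) as response:
                response.read()
        except OSError:
            continue  # this crude sketch only records successful requests
        latencies.append(time.perf_counter() - start)
    return latencies

if __name__ == "__main__":
    with ThreadPoolExecutor(max_workers=USER_COUNT) as pool:
        results = list(pool.map(simulated_user, range(USER_COUNT)))
    all_latencies = [latency for user in results for latency in user]
    if all_latencies:
        print(f"successful requests: {len(all_latencies)}")
        print(f"average latency: {sum(all_latencies) / len(all_latencies):.3f}s")
    else:
        print("no successful requests - is the target running?")

Even a crude driver like this makes the point: once the harness exists, scaling the simulated load up or down is a matter of changing two numbers, not adding testers.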

Other tests that are well-suited for automation exist, such as the static testing of complexity and code standards compliance that I mentioned in the previous article. In general, automated tests have higher upfront costs (tools, test development, environments, and so forth) and lower costs to repeat the test.
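To illustrate that low cost of repetition, here is a minimal sketch of an automated regression and confirmation check written with Python's standard unittest module. The discount() function and its expected behaviour are invented stand-ins for real application logic; the structure, not the arithmetic, is the point.

# Minimal sketch of an automated regression/confirmation check using the
# standard-library unittest module. The function under test (discount) is a
# stand-in invented for this example.
import unittest

def discount(price: float, percent: float) -> float:
    """Toy function standing in for real application logic."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class DiscountRegressionTest(unittest.TestCase):
    def test_known_good_values(self):
        # Behaviour that must stay unbroken from release to release.
        self.assertEqual(discount(100.0, 10), 90.0)
        self.assertEqual(discount(80.0, 0), 80.0)

    def test_bug_fix_stays_fixed(self):
        # Confirmation check: pins the fix for a previously reported defect,
        # e.g. invalid percentages once being silently accepted.
        with self.assertRaises(ValueError):
            discount(100.0, 150)

if __name__ == "__main__":
    unittest.main()

Once a check like this is in version control, rerunning it against every new build costs essentially nothing, which is the kind of repeatability the regression argument above relies on.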

When to Focus on Manual Testing

High per-test or maintenance costs are one indicator that a test should be done manually. Another is the need for human judgment to assess the correctness of the result or extensive, ongoing human intervention to keep the test running. For these reasons, the following tests are a good fit for manual testing:

- Installation, setup, operations, and maintenance. In many cases, these tests involve loading CD-ROMs and tapes, changing hardware, and other ongoing hand-holding by the tester.
- Configuration and compatibility. Like operations and maintenance testing, these tests require reconfiguring systems and networks, installing software and hardware, and so forth, all requiring human intervention.
- Error handling and recovery. Again, the need to force errors (by powering off a server, for example) means that people must stay engaged during test execution.
- Localization. Only a human tester with appropriate skills can decide whether a translation makes no sense, is culturally offensive, or is otherwise inappropriate. (Currency, date, and time testing can be automated, but the need to rerun these tests for regression is limited.)
- Usability. As with localization, human judgment is needed to check for problems with the facility, simplicity, and elegance of the user interface and workflows.
- Documentation and help. Like usability and localization, checking documentation requires human judgment.

Wildcards

In some cases, tests can be done manually, be automated, or both.

- Functional. Functionality testing can often be automated, and automated functional testing is often part of an effort to create a regression test suite or smoke test. However, it makes sense to get the testing process under control manually before trying to automate functional testing. In addition, you'll want to keep some of the testing manual.
- Use cases (user scenarios). By stringing together functional tests into workflows, you can create realistic user scenarios, whether manual or automated. The trick here is to avoid automation if many workflows involve human intervention.
- User interface. Basic testing of the user interface can be automated, but beware of frequent or extensive changes to the user interface that can incur high maintenance costs for your automated suite.
- Date and time handling. If the test system can reset the computer's clocks automatically, then you can automate these tests (a short sketch of an alternative, injecting the date instead of resetting the clock, follows this list).
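The sketch below avoids touching the real system clock at all: the code under test accepts an injected "today", so a test can pin the clock to whatever boundary it wants. The is_expired() function and the dates are invented for illustration; if the code under test reads the clock directly, resetting or faking the system time, as the article suggests, is the alternative.

# Minimal sketch: automating a date-handling check by injecting the "current"
# date instead of resetting the computer's clock. is_expired() and the sample
# dates are invented for this example.
import unittest
from datetime import date

def is_expired(expiry: date, today: date | None = None) -> bool:
    """Return True if the expiry date has passed, relative to 'today'."""
    today = today or date.today()
    return today > expiry

class ExpiryDateTest(unittest.TestCase):
    def test_not_expired_on_the_expiry_day(self):
        self.assertFalse(is_expired(date(2024, 12, 31), today=date(2024, 12, 31)))

    def test_expired_the_day_after(self):
        self.assertTrue(is_expired(date(2024, 12, 31), today=date(2025, 1, 1)))

    def test_year_rollover(self):
        # The kind of boundary (year change) that manual testers rarely revisit.
        self.assertFalse(is_expired(date(2025, 1, 1), today=date(2024, 12, 31)))

if __name__ == "__main__":
    unittest.main()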

Higher per-test costs and needs for human skills, judgment, and interaction push towards manual testing. A need to repeat tests many times or reduce the cycle time for test execution pushes towards automated testing.

Reasons to Be Careful with Automation

Automated testing is a huge investment, one of the biggest that organizations make in testing. Tool licenses can easily hit six or seven figures. Neophytes can't use most of these tools (regardless of what any glossy test tool brochure says), so training, consulting, and expert contractors can cost more than the tools themselves. Then there's maintenance of the test scripts, which generally is more difficult and time consuming than maintaining manual test cases.

Dr. Cem Kaner on Software Testing as a Career


I take this opportunity to share with all my blog readers a few of the great discussions happening at the "softwaretesting" Yahoo group. For the purpose of focus, I am not sharing the entire thread, and the beauty of the reply is such that you can read it without knowing or referring to the original post that started the discussion. There were actually two replies by Dr. Kaner; I am taking the liberty of rearranging a few paragraphs from both replies in order to give a specific flow to the whole thing. The purpose of this post is to share these words of wisdom and experience with all those who would like to pursue a career in software testing.

[Dr Kaner: Quote]

Let me start by distinguishing between a CAREER and a JOB. A CAREER involves a long-term, intentional focus on a field or type of work. A JOB is a temporary assignment with a particular employer. My career is focused on improving the satisfaction and safety of software users and developers. My current job is as a professor. I have also held jobs as a tester, test manager, programmer, human factors analyst, software development manager, technical publications manager, development director, organization development consultant, salesperson, software development consultant, and attorney focused on the law of software quality. Each of these has addressed different aspects of what has been, to me, the same career.

People define their own careers. Many people define their career in terms of traditional categories (programmer, tester, lawyer, teacher), but the choice belongs to the person, not the category. When you make a choice ("I am an X" or "My career is X"), that choice is both inclusive (Xness is in your path) and exclusive (if Yness is not part of Xness, and Xness is not part of Yness, then "I am X" means also "I am not Y").
When someone defines their career as "tester," I think that definition is too narrow. I see software development as a bundle of coordinated tasks, including programming, design, testing, usability evaluation, modeling, documentation, development of associated training, project management, etc. Very few people would do all of these as part of the same job. Fewer would do them all on the same project or in the same week. But working at one company as a tester and at another company later as a programmer is not inconsistent with calling myself a software developer at either/both companies.

I don't generally encourage my students to pursue software testing AS A CAREER. They can make that decision later, after they have more experience. I prefer to encourage them to try SOFTWARE DEVELOPMENT as a career -- to me, development includes testing -- and to take a job doing serious, skilled testing as PART of that career. Most of the best testers I know have significant experience outside of testing and apply that experience to what they do as testers or test managers. I think that testing is a fine choice for a first job--for some people--but that doesn't make it a first career. It becomes a first career only for the person who says, "This, testing, is my career." I don't recommend that people narrow their career that much, early in their career. Let them explore the field more, in their next few jobs, before they lock themselves into something.

I think that some people are good at both programming and testing, some people are good at both writing and testing, some people are good at design and testing, and very few people are good at every software development task. So I think it is inappropriate to say that someone shouldn't be considered a software developer because they are good at some aspects of development but not others.

Most (all?) of the hidebound process-pushers that I know in the field have never done serious professional work outside of testing. From their narrow perspective, they think they know more about how to manage a development project than the people who retain their testing services. Instead of trying out their ideas as project managers (where they will be accountable if they fail), these process advocates undermine the projects they work on by trying to control things they don't understand with rigid policies and procedures, standards and superstitions, whose costs and impacts are beyond their imagination. We have too many of these people in our field. We need more people who have a broader view of the tremendous value that testing can offer--within its limited role--and are glad to embrace a service-provider role that provides that value.

I think some fresh engineers should start their career with a job in programming, others
with testing, others writing, others with human factors assessment, others with configuration management, others with data analysis. I think that choice should depend on what motivates the particular person. What makes testing worth spending time on--as a job and maybe as a career? We are professional investigators. Rather than building things, we find ways to answer difficult questions about the quality of the products or services we test. Our job--if we choose to do it well--requires us to constantly learn new things, about the product, its market, its implementation, its risks, its usability, etc. To learn these, we are constantly developing new skills and new cognitive structures in a diversity of fields. It also requires us to communicate well to a diverse group of people. We ALSO get to build things (test tools), but very often, we build to our own designs, which can be more satisfying than building an application that does something we'll never personally do (or want to do). Learning to do good software testing requires learning to do critical thinking well, and to back it up with empirical research. Not everyone will like to do testing. Not every engineer or programmer will have the skills or the interest to do professional-level testing. But for those of us who enjoy critical thinking, experimentation, and keeping the human relevance of what we do always in mind, there is nothing else like it in software development (except, for some people on some projects, requirements analysis backed with rapid prototyping and prototype-based research).

Software Testing - A Creative Career


By Anand Ramdeo on May 4, 2011

This question was asked by Dina in the SQATester Yahoo group, and various reasons were discussed for why software testing is a creative career. This was the original question:

"I'm a senior computer engineering student and I'm really considering testing as a career. I did an internship last summer and, as fun as it was, I was disappointed. My question is this: I felt the problem is that the software produced is very much alike, and so it turns the testing process into a routine. If you have this, you do that, end of story. I'm not sure if that was related to the whole career or just to the project I was working on. The main reason I went into testing (or am looking into it, to be exact) is the creativity, not to mention that (I was told) I fit the description. I'd appreciate some guidance :)"

Greg Ventura: Testing is as creative as your company/boss allows it to be. Maybe look for a company that is just getting into serious testing? That way they might not have much established, and you can provide some of the direction for them. I would not 100% trust someone who tells you that you fit the description of a tester. They have some pre-conceived idea of what a tester does and they think you possess those qualities. Do you trust this person? I have seen people tell programmers or computer engineers that they should be testers; it was their way of saying, "You are not good enough to be a programmer, so go test." My personal opinion is that testing can be a very creative process.

Costinn: For me testing is creative, because I am working in a railway environment (interlocking solutions, mainly black box testing) and I have the chance to learn the rail philosophy of different countries in Europe. Writing the test cases is a great challenge for me, especially when you have to understand the requirements and, at the same time, the safety conditions for each country.

Geek: Let's start with your question: what do you mean by being creative? If you mean that you can apply your brain and solve problems that add value for someone, testing will certainly fit the bill. We question the product to validate the claims made by developers and give our opinion on whether the product is good enough to solve some problems for its end user. That is very creative. Since you are still in your final year and have some time before you must decide what you want to do, learn as much about testing as you can and take an informed decision. In this forum you will hardly find someone saying that testing is not creative; we do testing for our living and enjoy it, which is why we participate in forums like this. Hope it helps.

Andrew: I was recently asked an intriguing question at my blog. The question was simple: do you have to be a programmer to write automated tests? I struggled a bit to answer, because to me the answer is not as simple as yes or no, but I have a feeling that well-considered thought about that question is at the heart of what happens to the future of automated testing. Many career testers are feeling the pressure of picking up 'automated testing' skills in today's job market because every IT company wants them.
On the one hand, it's not fair to put someone into the position of architecting, building, or even maintaining an automated test system without any education in, or understanding of, the sound programming skills and practices that are necessary. On the other hand, why would anyone with good programming skills choose to enter the thankless, dead-end, glass-domed, low-paying and under-respected testing career path? Let's be honest: they wouldn't, unless they recognized it as an entry point into a much more lucrative development career path. The latter is exactly the reason multi-billion dollar corporations have recognized that there's a huge market in creating easy-to-use automated test tools that require special skills to operate but no programming to be up and running and fairly successful in a short period of time. I think Mercury and IBM haven't done us any favors, though. They're marketing these 'record-and-playback' tools (in a way that's reminiscent of Sun Microsystems' early Java marketing) as entry-level tools and creating a misperception in the marketplace that deep technical knowledge and understanding of software systems is not necessary for automated testing. It sounds as if I'm leaning strongly towards 'yes, you pretty much do have to be a programmer to write automated tests,' doesn't it? This is where my brain starts to spin in circles as I contemplate the paradox we're in. If you think about it, automated testing is right now about where we were in 1995 with web application development (and yes, I am actually old enough to speak from experience about that). I believe automated testing has absolutely tremendous potential and we haven't even cracked the egg yet. There are so many frontiers to explore in this arena, such as artificial intelligence, service-oriented testing, intelligent dimensional data analysis; the list goes on. So how do we convince the bright, talented, innovative thinkers in IT that automation can be a destination too, not just a path to another, better job? I don't know any quick answers, as the problem is a systemic one and touches on the mindset of the whole industry. My two big suggestions are to:

1. Raise the bar in how we define the skill set needed to do this work, and recognize that an automated test developer is actually a programmer; and
2. Open up research and development projects that make automated testing a destination instead of a half-way house.

Rajesh: Indeed, raise the bar. If someone starts thinking verification and validation is the only process in testing, it could be boring for anyone, and if someone asks you to just test using the test cases given by a client, it is indeed boring. Everyone needs to come out of the shell. Question the requirement/specification, and think of yourself as both the user and the engineer who will implement it. Question the design, because the developer will think only about his module; come out of the box and think about the impact. Question the code; I mean, read it. You need to know how object-oriented programs work, how operating systems work, what memory is, how to handle multi-threaded applications, and what an RDBMS is; ultimately any programming language builds on all of this (if, for, and while are common; objects are common), and that knowledge helps you prevent bugs rather than fix them later on. Design your test cases with customer focus and engineering knowledge. As for automated testers, I use the junior engineers for record and playback; the seniors I use for designing, coming up with scenarios, and figuring out what needs to be automated. Security, performance, reliability, capability, usability, i18n, l10n, Section 508 -- and there are more. This is just a start; as you grow you need to raise the bar and learn new things. As long as you learn every day, everything is creative. A thought: "I am a tester not because I cannot write code as well as a developer, but because I can help them create better software." Think about the situation where you worry only about the program you write: who will think of the problems it can cause when it works together with other programs?

Walter: I think it's time that I comment, especially in light of the comments made by one group member, whom I've warned--off-list--for making personal attacks on group members. Is testing creative? There are a few issues to consider.

* What does creative mean to you? Does it mean you have a lot of opportunities to express yourself? Does it mean that you will do many diverse things? Does it mean that you don't do the same thing over and over?
* What are you comparing it to? The construction industry? A job as a lab technician? One in the legal profession? Life as a chef? A job in a traditionally "creative" field?

Depending on those criteria, you could come up with a different answer. It also depends on where you are working, how your company approaches testing, and the "school" of testing that is used. Most beginning testing jobs are in companies where you simply perform the same steps over and over; that can be very uncreative. However, as you advance, you have to learn to write those steps, and that can be creative. Also, in these companies, you may be required to find novel solutions to produce or reproduce certain scenarios. And if you find yourself working in a company that encourages exploratory testing, either as your primary testing activity or in support of "scripted" testing, then you will have a very creative job: you have to think about the system under test and how to expose bugs. I'll address this one other way: your job is only as creative as YOU make it. One taxi driver could say that his job is not creative because all he does is drive passengers around all day. Another could say that it is creative because she tries to find the best routes. Another could say it's creative because he enjoys meeting the people he drives. It's the same with testing. It is only as creative as you make it.

Bernard Homes: As you are a computer engineering student, I will assume that you are comparing the creativity of testers to the creativity of developers. As in many professions, opportunities for creativity exist, but they are not all available at all levels. Comparing the tester's goals to the developer's, you can say that the developer looks for _one_ solution that fits the specifications, while the tester has to (should) find _all_ the defects that were introduced in the development process (from specification to code). As such, the challenge is much more interesting, just as the challenge for the "hunter" is greater than for the "harvester". You will enjoy creativity in coming up with different attacks (methods for finding defects) to show the presence (or absence) of defects. However, it is most likely that you will first have to execute tests created by others (less creativity here). Then you will have enough understanding of the different types of possible defects to conceive new tests that others will execute (more creativity).

In the end you may come up with completely new paradigms of testing, and that is the greatest creativity possible (IMHO). What you will always find are different views held by people who are considered to belong to "schools" of thought, and you will have to understand that there is no single method (suggested by any school) that can be used in all the different contexts you may encounter. And here is the greatest creativity of all: being able to design the best set of tests to find the largest number of the most important defects in the systems that you will test. I say systems deliberately, because you will have to learn about many different environments, hardware, software (and software languages), and the contexts in which these systems are used. The creativity is not based on the company, nor on the team where you work; it is based only on how you look at your work. If you limit yourself to executing the same activity again and again (without learning new things), it will not be creative. If you limit the scope of your thoughts to one school of thinking, the same thing will happen, and you will not find creativity. On the other hand, if you always challenge your own knowledge and try to expand it, then you will have the opportunity to be both creative and recognized by your peers. With over 26 years of experience, 15 of them in development, I can tell you that I never had as much fun in development as I have now in testing.

Starting a Career in Software Testing


Formal education is important, but if that's not an option and you have an opportunity to enter a testing department, take it.

One of the most frequently asked questions in the USENET newsgroup comp.software.testing is something like "Where is a good starting point for becoming a software QA specialist? I have an opportunity but not much experience. Any suggestions are welcome." My emphatic suggestion: TAKE THE OPPORTUNITY. (Much of the time, by the way, "software QA" is a misnomer: a company often uses this term as a name for its system testing group. Quality should go into every portion of the project at every step of the way; that makes everyone--developers, tech support people, and testers--quality assurance staff. In this document, I assume that you have an opportunity in, or a desire to learn, software testing.)

Attend the Earn-While-You-Learn University. It is unlikely that you will get any better long-term, day-to-day teacher than experience. All employers these days recognize that there's going to be a lot of learning on the job in any case; today's high-end technology is tomorrow's Tamagotchi, so we're all dealing with new stuff all the time. If an employer is willing to pay you while you learn, that's just fine; if the employer is willing to train you and provide you with mentorship, so much the better. More about education in a moment.

Read. Start with Testing Computer Software, by Kaner, Falk, and Nguyen. When you've grasped it, look at The Craft of Software Testing by Brian Marick, and some of the other books listed in the comp.software.testing FAQ list for more technical and formal testing techniques. Drop in to comp.software.testing and monitor the traffic. Stroll around the Web, starting from the FAQ and hopping links where you find them.

Learn. Learn everything you can about the operating environment under which you'll be testing. Learn how to use batch files, scripts, or macros to automate repetitive tasks (a small example of this kind of script appears below). Strongly consider learning low-level languages such as C or C++ to understand how programming in general works. Later, apply this knowledge to automated test tools, but learn how to tell the good uses of test automation from the bad ones. Learn how to use the product you're testing. Become adept at it, and learn all you can about the discipline in which it works. On the other hand, try to retain an open mind; part of the skill of being a black-box tester is approaching the program with as few preconceptions as you can, the better to stay perceptive and observant, and thus find defects. Learn how to write cogent defect reports so that the defects you find can be found and fixed. Learn to write better in general--you can find some basic suggestions here--since part of your job as a tester will involve lobbying for resources, or arguing diplomatically but assertively to have particular defects addressed.
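As a small, hypothetical example of the kind of repetitive chore worth scripting, the sketch below bundles a day's test logs into a dated zip archive. The directory names and file pattern are placeholders for illustration, not anything from the original article.

# Hypothetical sketch: automating a repetitive tester chore - archiving the
# day's test logs into a dated zip file. The directory names are placeholders.
import zipfile
from datetime import date
from pathlib import Path

LOG_DIR = Path("test_logs")        # placeholder: where the test run writes logs
ARCHIVE_DIR = Path("log_archive")  # placeholder: where archives are kept

def archive_todays_logs() -> Path:
    """Bundle every *.log file into log_archive/logs-YYYY-MM-DD.zip."""
    ARCHIVE_DIR.mkdir(exist_ok=True)
    archive_path = ARCHIVE_DIR / f"logs-{date.today().isoformat()}.zip"
    with zipfile.ZipFile(archive_path, "w", zipfile.ZIP_DEFLATED) as archive:
        for log_file in sorted(LOG_DIR.glob("*.log")):
            archive.write(log_file, arcname=log_file.name)
    return archive_path

if __name__ == "__main__":
    print(f"wrote {archive_todays_logs()}")

Scheduled from cron or Task Scheduler, a few lines like these can replace a daily copy-and-zip routine.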

If you have an opportunity to obtain formal training at a university or technical school--especially if your employer supplies or subsidizes the fee--take it. Not many colleges or universities so far offer accredited courses in formal testing techniques; in the meantime, courses in Computer Science are helpful. You can find a list of academic institutions that do offer quality- and testing-related courses at the Software Research Institute's Quality HotList Educational & Academic Institutions page. I can personally recommend (however immodestly) my own course offered through Advanced Information Technologies. I also recommend Cem Kaner's course on Black Box Software Testing; you might be able to find out where and when it is being offered by checking at Cem Kaner's Web site. Be on the lookout for conferences on testing and QA, and try to attend them; negotiate with your employer to pick up the cost. At such conferences you may quite possibly learn techniques that will save weeks of work on your company's projects.

Ask lots of questions. Don't ever be shy about asking questions. People are not mind readers; they can't be expected to offer information or advice without you asking for it. Recognize within yourself the power to understand anything if you're given a reasonable explanation. Don't worry about looking stupid or feeling silly because you don't understand something. If you don't understand something, it's the explainer's problem; keep asking for different models or analogies until you get it. The real mentors will hang in there with you and help you along; cultivate relationships with such people. Try to ask your question and get your answer in public forums; you might feel uncomfortable at first, but there are likely others even more shy than you are. Someone who helps along the process of learning is a leader, so ask away; other people will appreciate your bravery and will be grateful for hearing the answer. Ask plenty of questions in comp.software.testing, and don't ask for private replies; let others benefit from the answer.

Make friends. Try to hang out with other testers, programmers, and tech support people; exchange ideas, stories, techniques, and theories. Some kinds of developers are particularly eager to teach you how stuff works, and will natter on for hours if they sense you're interested. From the tech support people, find out about the wacko things that customers have done to extend or break the product; then add those things to your test suites. Don't just meet your colleagues on formal terms; some days you learn more in the pub than you do in the lab. Invite your co-workers to lunch, and don't limit the invitation to your peers--move up the ladder. Good managers will recognize that they can learn things from you, just as you can learn things from them.

Teach other people what you know. Explaining something to others is a great way to realize new perspectives and a more complete understanding of the topic. Write white papers, articles, or technical notes so that you and others can find information more easily, or so your customers or your company's sales people can better understand the technical nature of the product. Share experience and information with the new recruits in your department (there will be someone along shortly, no doubt). If you write something that can be useful or helpful to others, don't be modest or coy about it; post it on the Web or in the newsgroup, write an article, or deliver presentations at conferences.

Keep at it. Testing (or, if you must, QA!) is a career with real job security. As long as there are people who make stuff, they're going to need people who can figure out where and how it's broken, and that's where testers come in. Keep working, reading, learning, and teaching, and the world will be your oyster; you can stay in testing for life or transport the skills you learn to other fields. If you're being offered an opportunity, grab it and grow from there.

If you (or your company, or your manager, or your employee) need counselling or instruction in testing, I can help with engaging and informative courses on quality assurance and software testing in plain English that save your company lots of time and money; contact me for details.
