
In our project we receive the requirements in the form of use case and business rules documents.

Once we receive the requirements, we identify the testable requirements (the requirements that must be tested in the current release) in an Excel sheet and conduct reviews on them. After the client review, the client signs off the requirements phase, and we export the requirements into the Requirements tab of QC.

Next, we prepare test cases for the testable requirements in an Excel sheet and conduct reviews. After the client review, we consolidate all the test cases into a single Excel sheet and export it to the Test Plan tab of QC.

In the Test Plan tab, we map the requirements to the test cases (i.e., each test case is mapped to the requirement it was designed from) and generate a Traceability Matrix Report (TMR) to verify whether all the requirements are covered by test cases (a sketch of such a coverage check appears after this description). Once the TMR shows 100% coverage, the client signs off the test case phase.

We then move to test data preparation (test data being the data/values we use while testing the application). After the test data preparation is complete, we move to execution. We execute the test cases in the Test Lab tab of QC; however, the test cases in the Test Plan tab are in alphabetical order, so we pull them from the Test Plan tab into the Test Lab tab in a sequential, executable order. This is what we call the Test Lab setup.

Now we wait for the application. Once the development team has built the application and deployed it into the testing environment, we start executing the test cases in the Test Lab tab. During execution we compare the actual result in the application with the expected result. If the actual result matches the expected result, we mark the status of the step as PASS. If the actual result does not match the expected result, we mark the status of the step as FAIL; we call that failed step a defect, and we report it to the development team directly from the failed step of the test case in the Test Lab tab of QC (also sketched below).

Once a defect is accepted and fixed by the developer, we perform retesting on the fixed defect in the testing environment to check whether the fix works. If the fixed defect works in the testing environment, we do not close the defect; instead, we change its status to UAT migrated-in. Once we have executed all the test cases in the testing environment and all the defects have been UAT migrated-in, the application is deployed into the UAT environment.

Once the application is deployed into the UAT environment, we test the application by executing all the test cases in the UAT environment and also retest the migrated defects there. If a fixed defect works in the UAT environment as well, we close the defect.

Whenever we have closed all the defects in the UAT environment, we perform regression testing to verify whether the fixed defects have influenced other parts of the application. Once regression testing is complete, we move to test closure: the project manager signs off the testing phase, and the technical team deploys the application into production.
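For illustration, the coverage check behind the Traceability Matrix Report can be sketched as a small script. This is a minimal sketch, assuming the requirement-to-test-case mapping has been exported from QC as two CSV files; the file names and the columns `requirement_id` and `testcase_id` are assumptions for the example, not QC's own export format.

```python
import csv

def traceability_report(requirements_csv: str, mapping_csv: str) -> None:
    """Report which requirements lack test-case coverage.

    Assumes two hypothetical CSV exports:
      requirements_csv -- one 'requirement_id' per row (Requirements tab)
      mapping_csv      -- 'requirement_id,testcase_id' pairs (Test Plan tab)
    """
    with open(requirements_csv, newline="") as f:
        requirements = {row["requirement_id"] for row in csv.DictReader(f)}
    with open(mapping_csv, newline="") as f:
        covered = {row["requirement_id"] for row in csv.DictReader(f)}

    uncovered = requirements - covered
    pct = 100.0 * len(requirements & covered) / len(requirements) if requirements else 0.0
    print(f"Coverage: {pct:.1f}% ({len(requirements & covered)}/{len(requirements)} requirements)")
    for req in sorted(uncovered):
        print(f"NOT COVERED: {req}")

traceability_report("requirements.csv", "mapping.csv")
```

Only when the report prints 100% coverage and no NOT COVERED lines would the test case phase be ready for sign-off.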
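The pass/fail decision during execution is simply a comparison of the expected result with the actual result for each step, with a failed step raised as a defect. A minimal sketch of that comparison, assuming a hypothetical step record rather than QC's actual data model:

```python
from dataclasses import dataclass

@dataclass
class TestStep:
    description: str
    expected: str
    actual: str = ""
    status: str = "NO RUN"

def execute_step(step: TestStep, actual: str) -> TestStep:
    """Mark the step PASS when the actual result matches the expected
    result, otherwise FAIL. A FAIL is the point at which a defect is
    reported to the development team from the failed step."""
    step.actual = actual
    step.status = "PASS" if actual == step.expected else "FAIL"
    return step
```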
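Finally, the defect flow described above (retest in the testing environment, migrate to UAT rather than close, close only after the fix also passes in UAT) amounts to a small state machine. The state names below mirror this write-up; they are illustrative, not QC's built-in defect statuses.

```python
from enum import Enum

class DefectStatus(Enum):
    NEW = "New"                        # raised from a failed step in the Test Lab
    FIXED = "Fixed"                    # accepted and fixed by the developer
    UAT_MIGRATED = "UAT Migrated-In"   # retest passed in the testing environment
    CLOSED = "Closed"                  # retest passed again in the UAT environment

# Allowed transitions, mirroring the process described above.
TRANSITIONS = {
    DefectStatus.NEW: {DefectStatus.FIXED},
    DefectStatus.FIXED: {DefectStatus.UAT_MIGRATED, DefectStatus.NEW},  # failed retest reopens
    DefectStatus.UAT_MIGRATED: {DefectStatus.CLOSED},
}

def advance(current: DefectStatus, new: DefectStatus) -> DefectStatus:
    """Move a defect to a new status, rejecting illegal jumps
    (e.g., closing a defect straight from the testing environment)."""
    if new not in TRANSITIONS.get(current, set()):
        raise ValueError(f"Illegal transition: {current.value} -> {new.value}")
    return new
```

The key design point the states capture is that a passing retest in the testing environment never closes a defect directly; closure happens only from the UAT-migrated state.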
