Moogilu International
2455 N Naglee Road, Suite 227, Tracy, CA 95391
Channagiri Jagadish
Ph: 408 884 0325 / 650 245 1885
Jagadish@Moogilu.com
Title: Moogilu QA Case Study for Machine Vision
Date: January 26, 2013
1 Executive Summary
The customer was shipping products into the marketplace without adequate QA. This caused considerable problems in the field: many of the expensive Machine Vision machines were either returned, or production was delayed considerably. The primary goal of the QA engagement was to ship a quality product and to manage updates without breaking down the production line, thereby enhancing the customer's ROI and reputation in the marketplace. The Machine Vision QA engagement included:
- In-depth understanding of the Machine Vision system
- Comprehensive manual test plan
- Automation
- Smoke testing
- Regression testing
- Production testing
- Reporting
- Integration with JIRA
Moogilu QA Case Study for Machine Vision

1.1 Objectives
- Identify the product modules.
- Understand the test approach.
- Understand the test artifacts.
- Understand the automation test process.
2 Scope of Testing
The products tested are described below.

2.1 Product Overview

2.1.1 Production Quality Advisor
Production Quality Advisor is an application used to view inspection data stored in a database.
2.1.2 Console
Console is an application used to control inspection and to view the inspection data created by the inspection system.

2.1.3 Recipe Manager
Recipe Manager is the application used to define the parameters for inspecting a product. The parameters used to inspect a particular product are contained in a configuration file known as a recipe.

2.1.4 Classifier Manager
Classifier Manager is an application used to create and manage classifiers. A classifier specifies the parameters used to determine the type of defects as they are detected.
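For illustration, a recipe of the kind described above can be thought of as a small configuration file that names the product, its inspection parameters, and the classifier to apply. The field names in this sketch are illustrative stand-ins, not Recipe Manager's actual schema.

```python
import json

# Hypothetical inspection recipe: the field names below are illustrative
# stand-ins, not the actual schema used by Recipe Manager.
recipe_text = """
{
  "product": "widget-A",
  "camera_exposure_ms": 12,
  "defect_thresholds": {"scratch": 0.40, "dent": 0.60},
  "classifier": "surface-defects-v2"
}
"""

recipe = json.loads(recipe_text)
print(recipe["product"], sorted(recipe["defect_thresholds"]))
```

Storing such parameters per product lets the same inspection system handle many products by swapping recipes rather than changing code.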
2.2 Test Coverage
Identify and describe the amount and type of testing that is required.

Test Type | Covered
3 Test Deliverables
No | Deliverable Name | Deliverable Description
1 | Test Case Documents | Excel sheet with test scenarios
2 | Test Results Documents | Excel sheets with test scenarios along with the results
3 | Automation Scripts Execution Guide | List of instructions to execute the automation scripts
Module Name | Total Planned | Pass | Fail | Total Executed | Pass % | Fail % | On Hold % | Executed %
 | 20 | 20 | 0 | 20 | 100.00 | 0.00 | 0.00 | 100.00
 | 67 | 50 | 10 | 60 | 74.63 | 14.93 | 10.45 | 89.55
 | 8 | 6 | 1 | 7 | 75.00 | 12.50 | 12.50 | 87.50
 | 34 | 30 | 4 | 34 | 88.24 | 11.76 | 0.00 | 100.00
 | 84 | 75 | 3 | 78 | 89.29 | 3.57 | 7.14 | 92.86
Total | 213 | 181 | 18 | 199 | | | |
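The pass, fail, on-hold, and executed percentages in the table above all follow from the raw counts, taken relative to each module's planned total. A minimal sketch of the calculation, using the counts from the table:

```python
# Raw counts from the results table: (planned, passed, failed, executed).
MODULES = [
    (20, 20, 0, 20),
    (67, 50, 10, 60),
    (8, 6, 1, 7),
    (34, 30, 4, 34),
    (84, 75, 3, 78),
]

def rates(planned, passed, failed, executed):
    """Return (pass %, fail %, on-hold %, executed %) relative to planned."""
    on_hold = planned - executed          # planned but never executed
    pct = lambda n: round(100.0 * n / planned, 2)
    return pct(passed), pct(failed), pct(on_hold), pct(executed)

for counts in MODULES:
    print(rates(*counts))

totals = [sum(col) for col in zip(*MODULES)]
print(totals)  # → [213, 181, 18, 199]
```

The per-column sums reproduce the Total row (213 planned, 181 passed, 18 failed, 199 executed).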
4 Testing Approach
All the tests were designed and executed by the Moogilu QA team.
4.1 Functional Testing Approach (Manual)
For each application, the system QA team wrote test cases with relevant test steps, covering all functional tests; there was 100% test coverage. All test cases were executed, issues were reported in JIRA, and all test cases were updated with results.
4.2 Regression Testing Approach (Manual)
When a bug-fix release was deployed to the test server, the system QA team verified all the bug fixes. If a fixed bug was still present in the system, the bug was reopened. New issues were filed as bugs as they were detected in the system.

4.3 New Feature Testing Approach (Manual)
When a new feature release was deployed to the test server, the system QA team tested all the new features and verified their functionality. New issues were filed as bugs.

4.4 Regression Testing Approach (Automation)
When a bug-fix or new feature release was deployed to the test server, the system QA team executed the automated regression test suite to verify the existing functionality.
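One common way to keep a regression suite selectable per release, as in the approach above, is to tag tests and run only the tagged subset. The registry-and-tag scheme below is a simplified sketch, not the engagement's actual framework.

```python
# Simplified sketch: tag tests, then run only the regression subset.
REGISTRY = []

def tagged(*tags):
    """Register a test function under one or more tags."""
    def wrap(fn):
        REGISTRY.append((fn, set(tags)))
        return fn
    return wrap

@tagged("smoke", "regression")
def test_login():
    assert True  # placeholder check

@tagged("regression")
def test_recipe_load():
    assert True  # placeholder check

def run(tag):
    """Execute every registered test carrying `tag`; return (ran, failed)."""
    ran = failed = 0
    for fn, tags in REGISTRY:
        if tag in tags:
            ran += 1
            try:
                fn()
            except AssertionError:
                failed += 1
    return ran, failed

print(run("regression"))  # → (2, 0): both tests carry the regression tag
```

Real test runners (pytest markers, NUnit categories, and similar) provide this selection mechanism out of the box.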
5 Automation Testing
5.1 Automation Test Requirement
Today, many IT organizations struggle to achieve quality objectives while facing tight delivery schedules and constrained budgets. In these organizations, testing remains a primarily manual, resource-intensive effort despite increasing workloads, aggressive deadlines, and the escalating cost of skilled test engineers. Moogilu helped the customer by architecting and executing an automation test bed. The advantages of automation include:
- Accelerated testing cycles and on-time product releases
- More extensive testing and increased test coverage
- Efficient use of test resources
- Improved test accuracy and test management
- Enhanced productivity of testing efforts
5.1.1 Moogilu Approach to Automation

Select Appropriate Tools
We are not obliged to use only one tool. We select the right tools from a stack of tools for each project when conducting UI tests, performance tests, web service tests, and data validation tests across web, desktop, and mobile applications on .NET, Java, and PHP platforms.

Knowledge
Over the years, Moogilu has engaged in multiple test automation projects and has an in-depth understanding of test automation processes, tools, and techniques. We have highly specialized skills in Selenium, Coded UI, SoapUI, and JMeter.

Set Realistic Expectations
We set client expectations at the start and ensure that they are met.

Use a Highly Maintainable Framework
We use the Page Object design pattern to minimize the effort of modifications. UI Mapping is used to store all the locators of the test suite in one place. Further, application credentials and test data are parameterized for easy maintenance of the test suites.

Use Reusable Components
We have created an automation framework to work with UI elements such as data grids, paging, and search functions. We also use external components to read/write Excel files, databases, and XML files. Customized APIs are used to communicate with test management tools.

Moogilu's extensive automation knowledge helped build a framework within a month for this engagement.

5.2 Tools Used for Automation
Test Tool | Test Category | Features
Selenium | Functional | Supports many languages (Java, C#, Python, Perl); open source
Coded UI | Functional | Automatically generates more advanced code than Selenium or Telerik; supports C#
TestLink | Test Management | Maintains test cases; maintains test execution reports; open source
SoapUI | Web Service | Built-in browser; supports both UI and web services; functional and load testing
JMeter | Load |
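The Page Object pattern and single-place UI Mapping described in section 5.1.1 can be sketched roughly as follows. The page class, locators, and stub driver here are illustrative stand-ins, not the engagement's actual framework; with Selenium, the driver role would be played by a WebDriver instance.

```python
# UI Mapping: all locators for the suite live in one place, so a UI
# change means editing a single dictionary rather than many tests.
LOCATORS = {
    "login.username": ("id", "username"),
    "login.password": ("id", "password"),
    "login.submit":   ("css", "button[type=submit]"),
}

class LoginPage:
    """Page Object: tests talk to this class, never to raw locators."""

    def __init__(self, driver):
        self.driver = driver

    def login(self, user, password):
        self.driver.type(LOCATORS["login.username"], user)
        self.driver.type(LOCATORS["login.password"], password)
        self.driver.click(LOCATORS["login.submit"])

class FakeDriver:
    """Stub driver so the sketch runs without a browser."""

    def __init__(self):
        self.actions = []

    def type(self, locator, text):
        self.actions.append(("type", locator, text))

    def click(self, locator):
        self.actions.append(("click", locator))

driver = FakeDriver()
LoginPage(driver).login("qa_user", "secret")
print(len(driver.actions))  # → 3
```

Because tests call `LoginPage.login()` rather than touching locators directly, a redesigned login screen requires changes only in `LOCATORS` and the page class.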
5.3 Automation Results Sheets
These automation results are generated automatically by the TestLink test management tool.
Test Case | Build | Tester | Time | Status | Description | Bugs
CGN-1: Test Case Title | Build 1.0 | admin | 26/01/2013 10:53:18 | Passed | |
6 Results
The engagement is ongoing; within the first six months the results included:
- Tested all products, some with 100% test coverage
- Automated 30% of the test cases (automation is ongoing)
- Smoke testing on every staging build
- Release and production testing
- No field issues reported after product shipment to customers
- Transfer of knowledge to the company
- The customer has continued the engagement