
DATA WAREHOUSING, ETL, INFORMATICA RESUME


PROFESSIONAL SUMMARY:
- Extensively worked on data extraction, transformation, and loading from various sources such as Oracle, SQL Server, and flat files.
- Responsible for all activities related to the development, implementation, administration, and support of ETL processes for large-scale data warehouses using Informatica Power Center.
- Strong experience in data warehousing and ETL using Informatica Power Center 8.6.
- Experience in data modeling using Erwin, including star schema and snowflake modeling, fact and dimension tables, and physical and logical modeling.
- Strong skills in data analysis, data requirement analysis, and data mapping for ETL processes.
- Knowledge of the Kimball and Inmon methodologies.
- Hands-on experience in tuning mappings and in identifying and resolving performance bottlenecks at various levels: sources, targets, mappings, and sessions.
- Extensive experience in ETL design, development, and maintenance using Oracle SQL, PL/SQL, SQL*Loader, and Informatica Power Center 5.x/6.x/7.x/8.x.
- Experience in testing Business Intelligence applications developed in QlikView.
- Well versed in developing complex SQL queries, unions, and multiple-table joins, with experience using views.
- Experience in database programming in PL/SQL (stored procedures, triggers, and packages).
- Well versed in UNIX shell scripting.
- Experienced in creating effective test data and developing thorough unit test cases to ensure successful execution of data loads; used pager notifications for alerts after successful completion.
- Excellent communication, documentation, and presentation skills using tools such as Visio and PowerPoint.

TECHNICAL SKILLS:
Data Warehousing Tools: Informatica Power Center 8.6/8.1, Data Stage
Databases: Oracle 10g/9i/8i/8.0/7.x, MS SQL Server 2005/2000/7.0/6.0, MS Access, MySQL, Sybase
Programming/GUI: SQL, PL/SQL, SQL*Plus, Java, HTML, C, UNIX shell scripting
BI Tools: QlikView 8.x
Tools/Utilities: TOAD, Benthic Golden, PL/SQL Developer
Operating Systems: Windows XP/NT/2003, UNIX
Configuration Management Tools: Surround SCM, Visual SourceSafe

EDUCATION:
Master of Science in Computer Science

PROFESSIONAL EXPERIENCE:

Confidential, sanofi-aventis, NJ    Oct 11 - till date
Informatica Developer
The USMM implementation project is the upgrade of the current sanofi-aventis 1.x series MCO medical reps Quest application to the latest 4.x .NET series of applications. In this project the database was upgraded and an enterprise data warehouse was implemented for the MCO reps. Distributed data comes from heterogeneous sources such as SQL Server, Oracle, and flat files from the clients.

Responsibilities:
- Analyzed the business requirements and functional specifications.
- Extracted data from Oracle databases and spreadsheets, staged it in a single location, and applied business logic to load it into the central Oracle database.
- Used Informatica Power Center 8.6 for extraction, transformation, and loading (ETL) of data into the data warehouse.
- Extensively used transformations such as Router, Aggregator, Normalizer, Joiner, Expression, Lookup, Update Strategy, Sequence Generator, and Stored Procedure.
- Developed complex mappings in Informatica to load data from various sources.
- Implemented performance tuning on targets, sources, mappings, and sessions to provide maximum efficiency and performance.
- Parameterized the mappings to increase their reusability.
- Used the Informatica Power Center Workflow Manager to create sessions, workflows, and batches that run with the logic embedded in the mappings.
- Created procedures to truncate data in the target before the session run.
- Extensively used the TOAD utility for executing SQL scripts and worked on SQL to enhance the performance of the conversion mappings.
- Used PL/SQL procedures from Informatica mappings to truncate data in the target tables at run time.


- Extensively used the Informatica Debugger to diagnose problems in mappings; also involved in troubleshooting existing ETL bugs.
- Compiled a list of inconsistencies in the client-side data load so the issues could be reviewed and corrected on their side.
- Created ETL exception reports and validation reports after the data was loaded into the warehouse database.
- Wrote documentation describing program development, logic, coding, testing, changes, and corrections.
- Created test cases for the mappings developed, then produced the integration testing document.
- Followed Informatica recommendations, methodologies, and best practices.
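The truncate-before-load step mentioned above is often a short script that pipes TRUNCATE statements into sqlplus before the session runs. A minimal sketch, assuming hypothetical staging table names and with sqlplus stubbed by cat so the example runs standalone:

```shell
#!/bin/sh
# Sketch of a pre-load truncate step. sqlplus is stubbed with cat so
# this runs without Oracle; the connect string and the staging table
# names are hypothetical.
SQLPLUS="cat"   # in a real script: sqlplus -s "$DB_USER/$DB_PASS@$DB_SID"

sql_out=$($SQLPLUS <<'EOF'
TRUNCATE TABLE stg_sales;
TRUNCATE TABLE stg_customers;
EXIT;
EOF
)
echo "$sql_out"
```

In a real session, this would run as a pre-session command task so the target staging tables are empty before the load begins.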

Environment: Informatica Power Center 8.6.1, Oracle 10g/9i, MS SQL Server, TOAD, HP Quality Center, Windows XP, MS Office Suite

Confidential, sanofi-aventis, NJ    Aug 07 - Sep 10
Informatica Developer
The Sales Force Automation (SFA) system is a CRM solution that provides sales forces with a robust set of customer relationship management capabilities, promoting team selling, multi-channel customer management, information sharing, field reporting, and analytics, all within an easy-to-use, life-science-tailored mobile application. The purpose of this project is to maintain a data warehouse that enables the home office to make corporate decisions. A decision support system was built to compare and analyze the company's products against competitor products, along with sales information at the territory, district, region, and area levels.

Responsibilities:
- Created mappings and sessions to implement technical enhancements to the data warehouse by extracting data from sources such as Oracle and delimited flat files.
- Developed ETL processes using Informatica 8.6.
- Applied slowly changing dimensions (Type 1 and Type 2) effectively to handle the delta loads.
- Prepared various mappings to load the data into different stages, such as landing, staging, and target tables.
- Used various transformations such as Source Qualifier, Expression, Aggregator, Joiner, Filter, Lookup, and Update Strategy while designing and optimizing mappings.
- Developed workflows using the Task Developer, Worklet Designer, and Workflow Designer in the Workflow Manager and monitored the results using the Workflow Monitor.
- Created various tasks such as Session, Command, Timer, and Event Wait.
- Modified several existing mappings based on user requirements and maintained existing mappings, sessions, and workflows.
- Tuned the performance of mappings by following Informatica best practices and applied several methods to decrease the run time of workflows.
- Prepared SQL queries to validate the data in both the source and target databases.
- Worked on TOAD and Oracle SQL Developer to develop queries and create procedures and packages in Oracle.
- Worked extensively with PL/SQL to develop several scripts handling different scenarios.
- Created test cases for the mappings developed, then produced the integration testing document.
- Prepared the error handling document to maintain the error handling process.
- Automated the Informatica jobs using UNIX shell scripting.
- Worked closely with the reporting team to ensure that correct data is presented in the reports.
- Interacted with the offshore team on a daily basis regarding development activities.
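Automating Informatica jobs from UNIX, as described above, typically means wrapping the pmcmd command-line client in a shell script. The sketch below shows the shape of such a wrapper; the service, domain, folder, and workflow names are hypothetical, and pmcmd itself is stubbed with echo so the example runs without an Informatica installation:

```shell
#!/bin/sh
# Sketch of a UNIX wrapper for launching an Informatica workflow.
# pmcmd is stubbed with echo so this runs standalone; the service,
# domain, folder, and workflow names are made up for illustration.
PMCMD="echo pmcmd"   # in a real script: path to the real pmcmd binary

FOLDER=SALES_DW
WORKFLOW=wf_daily_load

$PMCMD startworkflow -sv Int_Svc -d Dom_Dev \
  -f "$FOLDER" -wait "$WORKFLOW"
status=$?

if [ "$status" -eq 0 ]; then
  echo "OK: $WORKFLOW completed"
else
  echo "ERROR: $WORKFLOW returned status $status" >&2
fi
```

A wrapper like this is what a cron entry or enterprise scheduler would invoke, with the exit status used to decide whether downstream jobs may run.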

Environment: Informatica Power Center 8.1, Oracle 9i, MS SQL Server, PL/SQL Developer, Bourne shell, Windows XP, TOAD, MS Office, delimited flat files

Confidential, Chicago, IL    Dec 05 - Jul 07
Data Warehouse Developer
The American Medical Association (AMA) plays a key information management role by collecting, maintaining, and disseminating primary-source physician data, which supports the development and implementation of AMA policy and a variety of data-driven products and services. This repository of physician information is created, maintained, and customized for the DEA.

Responsibilities:
- As a member of the warehouse design team, assisted in creating fact and dimension tables based on specifications provided by managers.
- Loaded operational data from Oracle, SQL Server, flat files, and Excel worksheets into various data marts, such as PMS and DEA.
- Designed and created complex source-to-target mappings using various transformations, including but not limited to Aggregator, Lookup, Joiner, Source Qualifier, Expression, Sequence Generator, and Router.
- Implemented effective-date-range mapping (slowly changing dimension Type 2) methodology for accessing the full history of accounts and transaction information.
- Designed complex mappings involving constraint-based loading and target load order.
- Used the Debugger to identify bugs in existing mappings by analyzing data flow and evaluating transformations, and created mapplets that provide reusability across mappings.
- Involved in enhancement and maintenance activities for the data warehouse, including performance tuning and rewriting stored procedures for code enhancements.
- Designed workflows with many sessions, using Decision, Assignment, Event Wait, and Event Raise tasks; used the Informatica scheduler to schedule jobs.

Environment: Informatica Power Center 6.2, Oracle, Business Objects 6.x, Windows 2000, SQL Server 2000, Microsoft Excel, SQL*Plus

Confidential, AXA, NY    Sep 04 - Nov 05
ETL Consultant
A single electronic solution gives employees of the AXA Pacific and AXA Assurance surety companies access to a centralized system. It also provides an intranet interface to all AXA surety bond users across Canada.

Responsibilities:
- Designed and developed the data transformations for source system data extraction; data staging, movement, and aggregation; information and analytics delivery; and data quality handling, system testing, and performance tuning.
- Created Informatica mappings to build business rules for loading data, using transformations such as Source Qualifier, Aggregator, Expression, Joiner, connected and unconnected Lookups, Filter, Sequence Generator, External Procedure, Router, and Update Strategy.
- Stored reformatted data from relational, flat file, and XML sources using Informatica (ETL).
- Worked on dimensional modeling to design and develop star schemas using Erwin 4.0, identifying fact and dimension tables.
- Created various batch scripts for scheduling data cleansing scripts and the loading process.
- Extensively worked with mapping variables, mapping parameters, and session parameters.
- Created post-session and pre-session shell scripts and mail notifications.
- Created data breakpoints and error breakpoints for debugging the mappings using the Debugger Wizard.
- Created several stored procedures to update tables and insert into audit tables as part of the process.
- Extensively used joins, triggers, stored procedures, and functions when interacting with the backend database through PL/SQL.
- Wrote UNIX shell scripts for getting data from various source systems into the data warehouse.
- Performed performance tuning by optimizing sources, targets, mappings, and sessions.
- Tested mappings and sessions using various test cases from the test plans.
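The post-session mail notifications mentioned above can be as simple as appending a status line to a log and mailing it out. A minimal sketch under stated assumptions: the session name, log path, and recipient are illustrative, and the mailx call is commented out so the script runs anywhere:

```shell
#!/bin/sh
# Sketch of a post-session notification script, of the kind attached
# to a session's "on success" command. All names are illustrative.
SESSION_NAME=${1:-s_m_load_accounts}
NOTIFY_LOG=/tmp/etl_notify.log

MSG="Session $SESSION_NAME completed at $(date '+%Y-%m-%d %H:%M')"
echo "$MSG" >> "$NOTIFY_LOG"

# In a real post-session script the log line would also be mailed:
# mailx -s "$SESSION_NAME succeeded" dw-team@example.com < "$NOTIFY_LOG"
echo "$MSG"
```

A pre-session variant of the same script would typically log the start time instead, so the pair brackets each session run in the log.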

Environment: Informatica 6.2, Oracle 8i, TOAD, Windows NT, UNIX

Confidential, India    Nov 02 - Aug 04
QA Consultant

HDFC Bank offers various products, such as Accounts & Deposits, Loans, Insurance, and Premium Banking. This project was mainly to test the Accounts & Deposits module functionality. Accounts & Deposits covers different account types, such as Savings, Salary, Current, Demat, Deposits, Rural, and Safe Deposit Locker.

Responsibilities:
- Analyzed the business requirements and functional specifications.
- Studied the requirements specification and use case documents.
- Created the test plan, test strategy, and test approach.
- Created test scripts and a traceability matrix, and mapped the requirements to the test cases.
- Performed peer reviews of test scripts and prepared the QA measurement forms.
- Attended QA audit meetings and test artifact reviews.
- Participated in integration testing and unit testing along with the development team.
- Conducted system, GUI, smoke, and regression testing; identified application errors and interacted with developers to resolve technical issues.
- Extensively used SQL scripts and queries for data verification at the backend.
- Executed SQL queries and stored procedures and performed data validation as part of backend testing.
- Used SQL to test various reports and ETL job loads in development, testing, and production.
- Performed negative testing with invalid data, verifying and validating the error messages generated.
- Responsible for generating weekly progress reports and updates for the project lead, including test scenario status, concerns, and outstanding functionality.
- Involved in daily and weekly status meetings.
- Created the test summary report, traceability matrix, and test script index.
- Monitored the testing project in Quality Center, ensuring defects were entered, tested, and closed.
- Analyzed, documented, and maintained test results and test logs.
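Backend data verification of the kind listed above often reduces to comparing row counts between a source and a target. A minimal shell sketch: the extract files are fabricated inline for illustration, whereas in practice each count would come from a COUNT(*) query run through sqlplus against the two databases:

```shell
#!/bin/sh
# Sketch of a source-vs-target row-count reconciliation check.
# The two extract files are created here for illustration only; a
# real check would pull each count from the databases via sqlplus.
printf 'row1\nrow2\nrow3\n' > /tmp/src_extract.txt
printf 'row1\nrow2\nrow3\n' > /tmp/tgt_extract.txt

# grep -c '' counts lines and prints a bare number
src_count=$(grep -c '' /tmp/src_extract.txt)
tgt_count=$(grep -c '' /tmp/tgt_extract.txt)

if [ "$src_count" -eq "$tgt_count" ]; then
  echo "PASS: $src_count rows in both source and target"
else
  echo "FAIL: source=$src_count target=$tgt_count"
fi
```

Checks like this catch dropped or duplicated rows quickly; column-level validation would follow with more specific queries.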

Environment: Java, HTML, DHTML, Oracle 8i, Data Stage, Clarify, Benthic Golden
