Experience Summary
25+ years of software architecture and development experience including:
Technical Experience
Tools: Informatica PowerCenter (versions 1 – 8), Talend, Power Designer, Brio Query and Enterprise Server, ERwin, Microsoft Office Toolset, Visio, Oracle Designer
Languages: SQL, PL/SQL, T-SQL, UNIX shell scripts, COBOL, Perl, PHP, JavaScript, C, C++
David Hubbard 303-349-9064 dh@dh11.com
Experience
Datasource Consulting / Transfirst (Jan 2009 – Jun 2009)
As an ETL consultant, designed and created 200+ mappings/sessions/workflows to move data from source systems to a reporting data warehouse for a complex Cognos Business Intelligence project in the credit card processing industry. Used Informatica in a Microsoft Windows Server 2008 / SQL Server 2008 environment. Created standards and best practices for Talend ETL components and jobs.
As an Informatica consultant, streamlined and tuned accounting-related ETL to be more efficient using Teradata, Oracle and SQL Server databases. Processing time for the accounting close period was reduced from 80 hours to 48 hours. Assisted others in maximizing the use of Informatica in ETL processes.
As a consultant, modified mappings, data models and an ASP.NET application to use new data sources. Backend databases changed from Oracle to SQL Server. The ASP.NET application used a service-oriented architecture to retrieve data.
As an Application Architect reviewed database design, data flow, ETL and application performance in
order to recommend hardware size needed for implementation of a Basel II risk management application.
Data warehouse / Basel application was implemented using Oracle 10g RAC, Informatica 7.1.3, SAS and
various reporting tools.
• Created hardware sizing test plan for benchmarking application timings in Sun Microsystem’s
hardware lab.
• Collaborated with diverse development groups to create benchmark timings on internal hardware.
• Recommended database infrastructure changes to improve performance.
• Recommended data modeling and ETL changes to improve maintainability, data quality, best practices and performance.
As a Data Warehouse Architect and Informatica/ETL consultant, architected and developed ETL software modifications for a data warehouse and associated data marts in Informatica Version 7.
As an Informatica/ETL consultant, designed, modeled, and developed ETL software for a data mart in Informatica. Data was sourced from Oracle databases, utilized by Brio reports, and fed into the Elevon mainframe financial application.
• Created upgrade plan to migrate all internal applications populating data marts and data warehouse to new
version.
• Implemented upgrade plan including testing of all production mappings, sessions, workflows.
• Modified mappings, sessions, workflows as needed to get them to work as designed in upgraded
environment.
Informatica internals development: created an advanced external procedure module to allow Informatica mappings on Sun platforms to execute Perl scripts. The functionality was needed for a project for the FBI. This module is now available as part of versions 6 and 7 of Informatica.
As a Data Warehouse consultant, developed ETL software for a new data warehouse and associated data marts in Informatica Version 6.2.
• Converted ETL from Informatica PowerCenter/PowerMart Version 5.1 to Version 6. Changed existing
mappings to take advantage of new features that improve performance.
• Trained development team in Version 6 upgrade process and new features.
• Modified company practices and procedures (best practices) to accommodate differences in Version 6.
• Led conversion of other projects to Version 6.
On an ongoing data warehouse project providing financial data to Brio reports, responsibilities included:
• Data modeling using Erwin
• Technical mentoring of the ETL team
• Architecture and design of data warehouse ETL processes.
• Conversion and migration of ETL functionality from Sybase to DB2 UDB EEE databases.
• Re-engineering of ETL to take advantage of new database design. The new database has a different data
model and the ETL required significant re-work. It is also significantly larger (250 GB vs. 50 GB).
As part of a data conversion and performance improvement project, implemented the following improvements to the ETL:
• Created new batches that utilized parallelism better.
• Analyzed and tested scheduling strategies to fine tune our Informatica batches.
• Replaced lookups in mappings with join conditions.
• Broke up a complicated, slow mapping into multiple mappings that ran much faster and could be run in
parallel.
• Analyzed usage of cached vs. uncached lookups and made changes as needed.
• Simplified mappings by removing unneeded objects.
• Removed ‘default’ setting for ports in expression transformations.
• Minimized the usage of IIF statements and other logical processing in mappings where practical.
• Used monitoring tools in UNIX and the database as part of the tuning process (top, sar, DBArtisan,
sp_monitor, sp_sysmon).
• Reviewed hardware architecture and recommended changes.
• Cleaned up mappings so they no longer created lengthy log files, by turning off verbose logging and
eliminating warning messages.
• Used Perl and shell scripts to pre-process data in UNIX.
• Implemented a partitioning strategy similar to Informatica’s partitioning in version 5 to increase
parallelism.
• Implemented documentation standards and practices to make mappings easier to maintain.
• Implemented design improvements that provided increased scalability of the target data mart. These
included the increased parallelism mentioned above to allow sessions to be spread across all available
processors. This scalability proved crucial in allowing enterprise wide usage of the data mart.
• Converted the ETL code-base from PowerCenter 1.7.2 to PowerCenter 5.1.
• Created a method using the existing version control process to work with Informatica PowerCenter 5.1 and
XML. This process allowed complete change management and version control of Informatica mappings. In
addition, the new process streamlined and automated the promotion of Informatica code from development
to production.
• Participated in DB2 EEE physical and logical database design and implementation in addition to systems
architecture design.
• Assisted the front-end Brio development efforts when necessary.
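The XML-based promotion step described above could be sketched in shell along the following lines; the file layout, the TIMESTAMP attribute, and the idea of normalizing volatile export metadata before comparing are illustrative assumptions, not the actual process used:

```shell
#!/bin/sh
# Hypothetical sketch: promote an exported Informatica mapping XML into the
# version-controlled copy only when it has really changed. Volatile metadata
# (here a TIMESTAMP attribute, as an assumption) is stripped first so that
# diffs reflect genuine mapping changes.

promote_mapping() {
    src=$1            # freshly exported mapping XML
    repo=$2           # copy held under version control
    # Normalize: drop volatile attributes so comparisons are meaningful.
    sed 's/ TIMESTAMP="[^"]*"//g' "$src" > "${src}.norm"
    if [ -f "$repo" ] && cmp -s "${src}.norm" "$repo"; then
        echo "unchanged: $(basename "$src")"
    else
        cp "${src}.norm" "$repo"
        echo "promoted: $(basename "$src")"
    fi
}
```

In a real promotion pipeline the `cp` would be replaced by a check-in to the version control system, which is what makes automated development-to-production promotion auditable.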
Provided technical mentoring to junior team members and was responsible for their growth and progression as employees.
• Designed and implemented Informatica ETL mappings for near real time data warehouses for
Gateway Computers using Informatica's PowerCenter 1.6. Target database was Oracle 8i on Sun
E10000 hardware running Sun UNIX. Three warehouse systems were involved storing 50, 1200, and
130 gigabytes of data each.
• Wrote PL/SQL, Perl, shell, SQL, and Informatica scripts as needed. Acted as a mentor to
junior developers, training them on Informatica and guiding the project's use of best practices for
Informatica and ETL. Maintained the data model using Erwin.
• Created ETL to run at highest possible speed to be able to have one of the warehouses
refreshed every half hour. Created cron based scheduling system of all mappings, batches and ETL
scripts in perl and shell.
• Created Informatica mappings that sourced data from some unusual sources: MS Excel
spreadsheets, MS Access databases, ODBC and flat file reports as well as mainframe and relational
databases (22 source systems in all – 59 source tables).
• Created an archival process to ensure that storage needs were minimized and that data
could easily be reloaded from archives if necessary.
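The cron-based scheduling described above can be sketched as a small shell wrapper that cron invokes for each ETL step; the step names, log location, and the pmcmd invocation shown in the comment are hypothetical, not the actual scripts used:

```shell
#!/bin/sh
# Hypothetical sketch of a cron-driven ETL scheduler: each crontab entry
# calls a wrapper that runs one step, logs start/end, and reports failure
# via its exit status so downstream steps can be skipped.

LOGDIR="${LOGDIR:-/var/tmp}"

run_etl_step() {
    name=$1; shift
    log="$LOGDIR/etl_${name}.log"
    echo "START $name $(date '+%Y-%m-%d %H:%M:%S')" >> "$log"
    if "$@" >> "$log" 2>&1; then
        echo "OK $name" >> "$log"
    else
        echo "FAIL $name" >> "$log"
        return 1
    fi
}

# Illustrative crontab entry for a half-hourly refresh (paths and the
# pmcmd arguments are assumptions):
# 0,30 * * * * /opt/etl/bin/run_etl_step.sh refresh pmcmd start ...
```

Chaining the wrapper's exit status is what lets a plain crontab behave like a dependency-aware batch scheduler.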
• Designed and maintained Informatica PowerMart mappings for extraction, transformation and load
(ETL) between Oracle, Informix and Sybase IQ databases.
• Administered Informatica repositories and Informatica server running on UNIX.
• Performed performance and tuning functions on Informatica sessions, batches and other related
scripts.
• Created scripts to detect, report and repair errors occurring during nightly ETL processing.
• Upgraded Informatica PowerMart software from version 4.5 to 4.6.2. Trained new
developers on Informatica basics.
• DBA for Sybase ASE, Sybase IQ and Informix databases.
• Data Modeling and Database Design using Sybase's Power Designer - Warehouse Architect
software. Used Warehouse Architect to maintain our Data Dictionary.
• Upgraded Brio software from 5.5 to 6.
• Assisted developers writing SQL.
• Wrote and maintained Perl scripts. Upgraded Perl from 2.3 to 2.6. Used Perl's DBI interface to
access databases directly.
• Reduced time to refresh data mart from over 170 hours to 48 hours through performance
improvements in mappings.
• Hardware: Sun E10000 running Solaris 2.6, IBM J50s running AIX.
• Databases: Informix, Oracle, Sybase.
• Size: 30 Gigabyte warehouse.
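A minimal sketch of the kind of nightly error-detection pass described above; the log directory layout and the "ERROR" marker convention are assumptions for illustration, not the actual scripts:

```shell
#!/bin/sh
# Hypothetical sketch: scan the night's session logs for error markers and
# print a per-file summary plus a total, suitable for mailing as a report.

scan_logs() {
    dir=$1
    errs=0
    for f in "$dir"/*.log; do
        [ -f "$f" ] || continue
        n=$(grep -c 'ERROR' "$f")
        if [ "$n" -gt 0 ]; then
            echo "$(basename "$f"): $n error(s)"
            errs=$((errs + n))
        fi
    done
    echo "total: $errs"
}
```

A repair step would then act on the summary (for example, re-running the failed session or reloading rejected rows) rather than on the raw logs.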
• Wrote extraction, transformation and load (ETL) scripts and stored procedures in SQL to replicate data
between Dun & Bradstreet financial application and PowerCerv manufacturing software.
Tasks Performed
Companies
Education
• Business - University of Phoenix - 1984
Professional Organizations
• Informatica Users Group
• Sybase User Group
• International Oracle Users Group -Americas
• Rocky Mountain Oracle Users Group