Nobody can agree, so I'm not actually going to define a DW. Don't feel cheated, though. By the end of this talk, you'll:
Understand key concepts that underlie all warehouse implementations (talk the talk) Understand the various components out of which DW architects construct real-world data warehouses Understand what a data warehouse project looks like
Optimize classroom, computer lab usage Refine admissions ratings systems Forecast future demand for courses, majors Tie private spreadsheet data into central repositories Correlate admissions and IR data with outcomes such as:
Admissions data (student vitals; SAT, etc.) Special events: A-student suddenly gets a C in his/her major Slower trends: Student's GPA falls for > 2 semesters/terms
To think and communicate usefully about data warehouses you'll need to understand a set of common terms and concepts:
OLTP ODS OLAP, ROLAP, MOLAP ETL Star schema Conformed dimension Data mart Cube Metadata
Evidence shows that IT will only build a successful warehouse if you are intimately involved!
OLTP
OLTP = online transaction processing The process of moving data around to handle day-to-day affairs
Scheduling classes Registering students Tracking benefits Recording payments, etc.
Transactional Systems
Transactional systems are optimized primarily for the here and now
Can support many simultaneous users Can support heavy read/write access Allow for constant change Are big, ugly, and often don't give people the data they want
Transactional systems don't record all previous data states Lots of data gets thrown away or archived, e.g.:
As a result a lot of data ends up in shadow databases Some ends up locked away in private spreadsheets
Admissions data Enrollment data Asset tracking data (How many computers did we support each year, from 1996 to 2006, and where do we expect to be in 2010?)
Each green box is a database table Arrows are joins or foreign keys This is simple for an OLTP back end
Recruitment Plus back-end database Used by many admissions offices Note again:
Green boxes are tables Lines are foreign key relationships Purple boxes are views
Considerable expertise is required to report off this database! Imagine what it's like for even more complex systems
Colleague SCT Banner (over 4,000 tables)
Often we require OLTP data as a snapshot, in a spreadsheet or report Reports require querying back-end OLTP support databases But OLTP databases are often very complex, and typically
Contain many, often obscure, tables Utilize cryptic, unintuitive field/column names Dont store all necessary historical data
Requires special expertise May require modifications to production OLTP systems Becomes harder and harder for staff to keep up!
Workarounds
The data is easier to understand The data is optimized for reporting Easily pluggable into reporting tools
ODS
ODS = operational data store ODSs were an early workaround to the reporting problem To create an ODS you
Build a separate/simplified version of an OLTP system Periodically copy data into it from the live OLTP system Hook it to operational reporting tools
An ODS can be an integration point or real-time reporting database for an operational system It's not enough for full enterprise-level, cross-database analytical processing
OLAP
OLAP = online analytical processing OLAP is the process of creating and summarizing historical, multidimensional data
To help users understand the data better Provide a basis for informed decisions Allow users to manipulate and explore data themselves, easily and intuitively
More than just reporting Reporting is just one (static) product of OLAP
Hold snapshots of data in OLTP systems Provide history/time depth to our analyses
Are optimized for read (not write) access Updated via periodic batch (e.g., nightly) ETL processes
ETL Processes
Extract data from various sources Transform and clean the data from those sources Load the data into databases used for analysis and reporting
By hand in SQL, UniBASIC, etc. Using more general programming languages In semi-automated fashion using specialized ETL tools like Cognos Decision Stream
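The three ETL steps can be sketched in a few lines of Python. This is only an illustration: the table and column names are hypothetical, and in-memory SQLite databases stand in for the production OLTP source and the reporting target.

```python
import sqlite3

# Hypothetical OLTP source with cryptic names (stand-in for a real system).
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE STU_REC (STU_ID TEXT, STU_GPA TEXT, TRM_CD TEXT)")
src.executemany("INSERT INTO STU_REC VALUES (?, ?, ?)",
                [("S001", "3.70", "FA05"), ("S002", " 2.9 ", "FA05")])

# Reporting target with intelligible column names.
tgt = sqlite3.connect(":memory:")
tgt.execute("""CREATE TABLE student_term_gpa
               (student_id TEXT, term TEXT, gpa REAL)""")

# Extract: pull raw rows from the cryptic source table.
rows = src.execute("SELECT STU_ID, STU_GPA, TRM_CD FROM STU_REC").fetchall()

# Transform: clean values and rename/retype fields.
clean = [(sid, trm, float(gpa.strip())) for sid, gpa, trm in rows]

# Load: write into the reporting table.
tgt.executemany("INSERT INTO student_term_gpa VALUES (?, ?, ?)", clean)
tgt.commit()

print(tgt.execute("SELECT COUNT(*) FROM student_term_gpa").fetchone()[0])  # 2
```

Real ETL tools add scheduling, logging, and error handling around exactly this extract/transform/load skeleton.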
What sort of a database do the ETL processes dump data into? Typically, into very simple table structures These table structures are:
Denormalized Minimally branched/hierarchized Structured into star schemas
Facts:
Answer (text) Answer (raw) Count (1)
Oops
Not a star: snowflaked!
Oops, answers should have been placed in their own dimension (creating a factless fact table). I'll demo a better version of this star later!
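A minimal sketch of the corrected design: answers get their own dimension, and the fact table is factless (just foreign keys, one row per response). All names here are invented for illustration, with SQLite standing in for the mart database.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
-- Dimension tables hold the descriptive attributes (names hypothetical).
CREATE TABLE dim_student (student_key INTEGER PRIMARY KEY, class_year TEXT);
CREATE TABLE dim_answer  (answer_key  INTEGER PRIMARY KEY, answer_text TEXT);
-- Factless fact table: only foreign keys; each row is one survey response.
CREATE TABLE fact_response (student_key INTEGER, answer_key INTEGER);
""")
db.executemany("INSERT INTO dim_student VALUES (?, ?)",
               [(1, "2009"), (2, "2010")])
db.executemany("INSERT INTO dim_answer VALUES (?, ?)",
               [(10, "Agree"), (11, "Disagree")])
db.executemany("INSERT INTO fact_response VALUES (?, ?)",
               [(1, 10), (2, 10), (2, 11)])

# A typical query: count responses per answer, joining exactly one hop
# out to each dimension -- no snowflaking.
for text, n in db.execute("""
    SELECT a.answer_text, COUNT(*) FROM fact_response f
    JOIN dim_answer a ON f.answer_key = a.answer_key
    GROUP BY a.answer_text ORDER BY a.answer_text"""):
    print(text, n)
```

The point of the star shape is that every analytical query looks like this one: a fact table in the middle, single joins out to flat dimensions.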
Data Marts
One definition: One or more star schemas that present data on a single or related set of business processes
Data marts should not be built in isolation They need to be connected via dimensional tables that are
The same or subsets of each other Hierarchized the same way internally
So, e.g., if I construct data marts for
GPA trends, student major trends, enrollments Freshman survey data, senior survey data, etc.
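To see why shared (conformed) dimensions matter, here is a hypothetical sketch: two marts' fact tables key students the same way through one student dimension, so cross-mart analysis is a simple pair of joins. All names are invented, with SQLite as a stand-in.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
-- One conformed student dimension shared by two marts (names hypothetical).
CREATE TABLE dim_student (student_key INTEGER PRIMARY KEY, major TEXT);
CREATE TABLE fact_gpa        (student_key INTEGER, term TEXT, gpa REAL);
CREATE TABLE fact_enrollment (student_key INTEGER, term TEXT, credits INTEGER);
""")
db.executemany("INSERT INTO dim_student VALUES (?, ?)",
               [(1, "Biology"), (2, "History")])
db.execute("INSERT INTO fact_gpa VALUES (1, 'FA05', 3.4)")
db.execute("INSERT INTO fact_enrollment VALUES (1, 'FA05', 12)")

# Because both marts use the same student keys, a cross-mart question
# ("GPA and credit load, by major") is one query.
row = db.execute("""
    SELECT d.major, g.gpa, e.credits
    FROM dim_student d
    JOIN fact_gpa g        ON g.student_key = d.student_key
    JOIN fact_enrollment e ON e.student_key = d.student_key
                          AND e.term = g.term""").fetchone()
print(row)  # ('Biology', 3.4, 12)
```

If each mart had invented its own student keys, this join would be impossible without another round of cleanup.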
[Diagram: three example star schemas: an input star (voltage and current facts, by phase), an output star (voltage and current facts, by phase), and a sensor star (temperature and humidity facts, by sensor)]
CIRP Freshman survey data Corrected from a previous slide Note the CirpAnswer dimension Note student dimension (ties in with other marts)
ROLAP, MOLAP
Data Cubes
The term data cube means different things to different people Various definitions:
1. A star schema
2. Any DB view used for reporting
3. A three-dimensional array in a MDB
4. Any multidimensional MDB array (really a hypercube)
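Whatever definition you prefer, the core cube operation is the same: roll up a measure along the dimensions you care about while collapsing the rest. A tiny stdlib-only Python sketch, using invented enrollment data:

```python
from collections import defaultdict

# Hypothetical fact rows: (term, major, enrolled_count).
facts = [
    ("FA05", "Biology", 40), ("FA05", "History", 25),
    ("SP06", "Biology", 35), ("SP06", "History", 30),
]

def rollup(rows, key_index):
    """Sum the measure (last element) over one dimension,
    collapsing all the others."""
    totals = defaultdict(int)
    for row in rows:
        totals[row[key_index]] += row[2]
    return dict(totals)

print(rollup(facts, 0))  # totals by term:  {'FA05': 65, 'SP06': 65}
print(rollup(facts, 1))  # totals by major: {'Biology': 75, 'History': 55}
```

MOLAP engines precompute and store these aggregates across every dimension combination; ROLAP engines generate the equivalent GROUP BY queries on demand.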
Metadata
Metadata = data about data In a data warehousing context it can mean many things
Information on data in source OLTP systems Information on ETL jobs and what they do to the data Information on data in marts/star schemas Documentation in OLAP tools on the data they manipulate
Many institutions make metadata available via data malls or warehouse portals, e.g.:
University of New Mexico UC Davis Rensselaer Polytechnic Institute University of Illinois
OK, now we're experts in terms like OLTP, OLAP, star schema, metadata, etc. Let's use some of these terms to describe how a DW works:
Provides ample metadata (data about the data) Utilizes easy-to-understand column/field names Feeds multidimensional databases (MDBs) Is updated via periodic (mainly nightly) ETL jobs Presents data in a simplified, denormalized form Utilizes star-like fact/dimension table schemas Encompasses multiple, smaller data marts Supports OLAP tools (Access/Excel, Safari, Cognos BI) Derives data from (multiple) back-end OLTP systems Houses historical data, and can grow very big
Like an ODS (such as SCT's)
A canned warehouse supplied by iStrategy
Cognos ReportNet
Like Oracle or SQL Server
Another def.: The union of all the enterprise's data marts Aside: The Kimball model is not without some critics:
E.g., Bill Inmon
Phil Goldstein (Educause Center for Applied Research fellow) identifies the major deployment levels:
Level 1: Transactional systems only
Level 2a: ODS or single data mart; no ETL
Level 2b: ODS or single data mart with ETL tools
Level 3a: Warehouse or multiple marts; no ETL; OLAP
Level 3b: Warehouse or multiple marts; ETL; OLAP
Level 3c: Enterprise-wide warehouse or multiple marts; ETL tools; OLAP tools
Goldstein's study was just released in late 2005 It's very good; based on real survey data Which level is Colorado College at?
In many organizations IT people want to huddle and work out a warehousing plan, but in fact
The purpose of a DW is decision support The primary audience of a DW is therefore College decision makers It is therefore College decision makers who must determine
Decision makers can't make these determinations without an understanding of data warehouses It is therefore imperative that key decision makers first be educated about data warehouses
Once this occurs, it is possible to
Elicit requirements (a critical step that's often skipped) Determine priorities/scope Formulate a budget Create a plan and timeline, with real milestones and deliverables!
Sure, according to Phil Goldstein (Educause Center for Applied Research) He's conducted extensive surveys on academic analytics (= business intelligence for higher ed) His four recommendations for improving analytics:
1. Key decision-makers must lead the way 2. Technologists must collaborate
Must collect requirements Must form strong partnerships with functional sponsors
Carleton violated this rule with Cognos BI As we discovered, without an ETL/warehouse infrastructure, success with OLAP is elusive
Goldsteins findings mirror closely the advice of industry heavyweights Ralph Kimball, Laura Reeves, Margie Ross, Warren Thornthwaite, etc.
Sure, it can be huge Don't hold on too tightly to the big-sounding word, warehouse Luminaries like Ralph Kimball have shown that a data warehouse can be built incrementally
Can start with just a few data marts Targeted consulting help will ensure proper, extensible architecture and tool selection
You may be surprised to learn which DW step takes the most time Try guessing:
Hardware Physical database setup Database design ETL OLAP setup
[Chart: percentage of project time spent on each step; ETL dominates]
According to Kimball & Caserta, ETL will eat up 70% of the time. Other analysts give estimates ranging from 50% to 80%.
Step                          Duration   Duration
Secure, configure network     21 days    1 day
Deploy physical target DB     14 days    4 days
Learn/deploy ETL tool         7 days     28 days
Choose/set up modeling tool   3 days     21 days
Design initial data mart      1 day      7 days
Design ETL processes          21 days    28 days
Hook up OLAP tools            14 days    7 days
Publicize, train, train       20 days    21 days
Conclusion
For normal people to explore institutional data, data in transactional systems needs to be
Renormalized as star schemas Moved to a system optimized for analysis Merged into a unified whole in a data warehouse Yes, IT people must build the infrastructure But IT people aren't the main customers
But transactional systems are complex They dont talk to each other well; each is a silo They require specially trained people to report off of
Admissions officers trying to make good admission decisions Student counselors trying to find/help students at risk Development officers raising funds that support the College Alumni affairs people trying to manage volunteers Faculty deans trying to right-size departments IT people managing software/hardware assets, etc.