Introduction
Project REAL is a cooperative effort between Microsoft and a number of technology
partners in the business intelligence (BI) industry to build on actual customer scenarios to
discover best practices for creating BI applications based on SQL Server 2005. The term
REAL in Project REAL is an acronym for Reference implementation, End-to-end, At
scale, and Lots of users. For the latest information about the project, visit
http://www.Microsoft.com/SQL/solutions/BI/ProjectREAL.mspx.
The entirety of Project REAL includes a large-scale implementation using 2 TB of source
data (as delivered from Barnes & Noble Booksellers). The system has been implemented
in many ways to evaluate design and implementation tradeoffs. The material in this kit
represents our current implementation. It contains a tiny subset of the data so you can see
how various parts of the system work with actual data. Use it to learn and to get ideas for
your own implementation. While we believe it represents a very good design and
generally follows best practices, it should not be regarded as the solution for every BI
situation.
The partners that contributed to Project REAL include Apollo Data Technologies, EMC,
Emulex, Intellinet, Panorama, ProClarity, Scalability Experts, and Unisys.
Copyright
The information contained in this document represents the current view of Microsoft Corporation on the issues
discussed as of the date of publication. Because Microsoft must respond to changing market conditions, it should not
be interpreted to be a commitment on the part of Microsoft, and Microsoft cannot guarantee the accuracy of any
information presented after the date of publication.
This White Paper is for informational purposes only. MICROSOFT MAKES NO WARRANTIES, EXPRESS, IMPLIED OR
STATUTORY, AS TO THE INFORMATION IN THIS DOCUMENT.
Complying with all applicable copyright laws is the responsibility of the user. Without limiting the rights under
copyright, no part of this document may be reproduced, stored in or introduced into a retrieval system, or transmitted
in any form or by any means (electronic, mechanical, photocopying, recording, or otherwise), or for any purpose,
without the express written permission of Microsoft Corporation.
Microsoft may have patents, patent applications, trademarks, copyrights, or other intellectual property rights covering
subject matter in this document. Except as expressly provided in any written license agreement from Microsoft, the
furnishing of this document does not give you any license to these patents, trademarks, copyrights, or other
intellectual property.
© 2006 Microsoft Corporation. All rights reserved.
Microsoft, Visual Studio, Windows, and Windows Server are either registered trademarks or trademarks of Microsoft
Corporation in the United States and/or other countries.
The names of actual companies and products mentioned herein may be the trademarks of their respective owners.
Table of Contents
Introduction
  Copyright
Table of Contents
License Agreement
Overview of Project REAL
  Cooperative effort
What this kit contains
Step-by-step installation instructions
  1. Prerequisites
  2. Installing the relational databases
  3. Creating and processing the Analysis Services cube
  4. Setting up the ETL packages in Integration Services
  5. Setting up the reporting environment in Reporting Services
  6. Setting up the Data Mining Models
  7. Setting up the Client Tools
Exploring the Relational Data Warehouse
Exploring the ETL packages
Exploring the AS cube
Exploring the RS reports
  Exploring the Management reports
  Viewing the Interactive reports
Exploring the Data using Analytical Tools
Exploring the Data Mining Models
Sample OLAP Tools and Scripts
  The OLAP\AMOShell folder
  The OLAP\REALbuild folder
  The OLAP\Scripts folder
REAL Data Lifecycle Samples
  Initial Loading of Relational Partitions
  Managing the Partition Lifecycle
Known Issues
  1. SSIS: Package designer issues a warning message when opening a package
  2. SSIS: Package aborts with an access violation
  3. SSIS pipeline hang
  4. SSIS/AS: Package aborts with no error message while attempting to process an Analysis Services partition (x64 machines only)
License Agreement
The material in this kit is subject to two license agreements. The data is for internal use
in your company only. The sample code and tools have a less restrictive license. Review
the agreements if you are unsure of how the materials may be used. If you use the
software or data, you agree to the licenses. If you do not agree to the licenses, do not
use the software or data.
Cooperative effort
Project REAL is a cooperative effort between Microsoft and a set of partner companies
known for their expertise in their respective fields. Each partner committed resources to
the project and agreed to perform technical work that focused on developing general best
practices. The partners are listed below along with their areas of focus:
Apollo Data Technologies designed the data mining models and implemented them both
in the full Project REAL environment and in this smaller sample kit.
Barnes & Noble provided the business scenario for Project REAL and the source data set.
They did this knowing that the purpose of the project was not to create the precise system
that they would deploy, but to create best practices and instructional information for a
wide audience.
EMC provided substantial storage resources for the project, implemented the data
integrity features of the system, and provided a backup system for the project.
Emulex provided host bus adapters (HBAs) for connecting the servers to the storage
subsystems.
Intellinet designed and implemented the overall ETL system at Barnes & Noble and
then modified it to meet the needs of a model implementation for Project REAL.
Panorama developed various client access strategies for the system to model the
connectivity and security needs of intranet, wide-area network, and Internet users.
ProClarity developed and documented guidelines for migrating Analysis Services 2000
implementations to Analysis Services 2005.
Scalability Experts designed and implemented the data lifecycle management
functionality, including the partitioning of relational tables and Analysis Services cubes,
and implemented the management of partitions in the ETL processing.
Unisys contributed expertise from their Business Intelligence Center of Excellence and
substantial hardware resources including 32-bit and 64-bit servers. Unisys also designed
and implemented the monitoring system used for ongoing operations.
c. [Optional, but highly recommended] Do not make the data warehouse
read-only, or ETL operations will fail. However, we recommend creating a
database snapshot which can be used to roll the database back to the starting
point at some later time. You may have to adjust the file path for your local
system.
USE master
GO
-- Set the file path below appropriately for your system
CREATE DATABASE [REAL_Warehouse_Sample_V6_SNAPSHOT] ON
( NAME = N'REAL_Warehouse_Sample_V6', FILENAME =
'C:\Microsoft Project REAL\DB\REAL_Warehouse_Sample_V6.ss')
AS SNAPSHOT OF [REAL_Warehouse_Sample_V6]
At any future time, you can return the sample warehouse to the starting point:
USE master
GO
ALTER DATABASE REAL_Warehouse_Sample_V6
SET SINGLE_USER WITH ROLLBACK IMMEDIATE
GO
RESTORE DATABASE REAL_Warehouse_Sample_V6
FROM DATABASE_SNAPSHOT = 'REAL_Warehouse_Sample_V6_Snapshot'
GO
ALTER DATABASE REAL_Warehouse_Sample_V6 SET MULTI_USER WITH
ROLLBACK IMMEDIATE
d. Unzip the ETL files. If the Project REAL files were installed at C:\Microsoft
Project REAL, then the zip file will be in C:\Microsoft Project REAL\ETL.
The contents may be extracted to the same directory.
e. Create two system environment variables called REAL_Root_Dir and
REAL_Configuration with the values given below. Click
Start -> Control Panel -> System, go to the Advanced tab, click the
Environment Variables button, and then click New in the System variables box.
If the Project REAL files were installed at C:\Microsoft Project REAL, then
the variable values will be:
Variable Name:  REAL_Root_Dir
Variable Value: C:\Microsoft Project REAL\ETL

Variable Name:  REAL_Configuration
Variable Value: %REAL_Root_Dir%\REAL_Config.dtsconfig
k. There are several ways to see the progress as the packages are executing.
i. For packages started in the BI Development Studio, you can open the
top-level package. The task shown in yellow is currently executing. If
it is an Execute Package task, open that package and so on until you
find the current execution location. If the task is a data flow task, open
the data flow to see the current state.
ii. Go to the SQL Server Management Studio and in the warehouse
database (REAL_Warehouse_Sample_V6) run the query:
select * from audit.uf_Progress()
This shows which packages called which other packages, when they
started and finished, whether they succeeded, and (where appropriate)
how many rows were handled.
iii. Directly open the audit log table: audit.ExecutionLog in the warehouse
database.
iv. To see which packages are running at any given time, use SQL
Server Management Studio. Connect to the Integration
Services service for your machine, and refresh the Running
Packages folder.
l. [Optional] To allow routine viewing of the state of the package execution, a
Reporting Services report can be created to show the package execution
history. An example of this is provided with the kit as an RS project. If the
Project REAL files were installed at C:\Microsoft Project REAL, then the
project files will be in C:\Microsoft Project REAL\ETL\Audit Reporting.
Open the solution file Audit Reporting.sln, open the report
PackageProgress.rdl and preview the report. Optionally, you may deploy
the solution to your server.
A description of what to look for in these reports is given in the section Exploring the
RS reports below.
If the Project REAL files were installed at C:\Microsoft Project REAL, then the
Management report project files will be in C:\Microsoft Project REAL\RS\REAL
Management Reports and the Interactive report project will be in C:\Microsoft
Project REAL\RS\REAL Interactive Reports. The following steps should be performed
for each project.
a. Each reporting project directory contains a Project file with a data source
referencing the Analysis Services database, a number of reports in .rdl files,
and one .gif image file which contains the Project REAL logo. Double-click
the "Real Warehouse Reports.sln" file. This starts the BI Development
Studio (BIDS) with the project.
b. [Optional] If you are using the PT version of the Analysis Services database,
or if you have renamed your AS database or are running with an instance
name other than the default, you can edit the REAL Warehouse data source
appropriately.
c. Then you should be able to open any report and view it in the preview pane.
d. If you will deploy reports to the default location of
http://localhost/ReportServer, right-click on REAL Warehouse Reports and
choose Deploy. If you want to change the location, first right-click on REAL
Warehouse Reports, choose Properties, and set the TargetServerURL. Then
deploy.
e. Now you can view the reports from the web browser, such as by going to
http://yourserver/reports or http://localhost/reports.
Return to the Cube Structure tab. The various measures in each measure group can be
seen by expanding the measure groups in the Measures pane on the left-hand side. In
Store Sales, you will see that all the measures have an AggregateFunction of Sum (click
each one and look at the Properties pane on the right-hand side). In contrast, the measures
in the Store Inventory measure group all have an AggregateFunction of LastNonEmpty.
This is because these are semi-additive measures; they cannot properly be summed over
time. This handling of semi-additive measures is one of the great new features of
Analysis Services 2005. Handling such measures at the scale of Project REAL would not
have been possible with earlier releases.
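To make the distinction concrete, here is a minimal relational sketch of the LastNonEmpty
idea. All table and column names are hypothetical, and Analysis Services actually evaluates
this per cell rather than with one global date; the sketch only shows why inventory balances
are taken from the last populated date instead of being summed over time.

    -- Sketch only: period-end on-hand quantity per store, taken from the last
    -- date in the period that has inventory rows, rather than a sum over time.
    -- All object names are hypothetical.
    DECLARE @Period_End_Date_ID int;
    SET @Period_End_Date_ID = 20041231;  -- hypothetical date surrogate key

    SELECT f.SK_Store_ID,
           SUM(f.On_Hand_Qty) AS On_Hand_Qty
    FROM   dbo.Tbl_Fact_Store_Inventory AS f
    WHERE  f.SK_Date_ID = (SELECT MAX(i.SK_Date_ID)
                           FROM   dbo.Tbl_Fact_Store_Inventory AS i
                           WHERE  i.SK_Date_ID <= @Period_End_Date_ID)
    GROUP BY f.SK_Store_ID;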
Now look at the Store dimension by opening Store.dim, a relatively simple dimension.
The Attributes pane lists all the attributes of this dimension (from the dimension table)
that have been included in the design. Some will be used to create multilevel hierarchies
of attributes, and others will be used to provide attribute-based analysis (more on attribute
analysis in the section Browsing the Data below). This dimension has a multilevel
hierarchy: a Store is in a City, which is in a District, which is in a Region, which is in a
Division. This is a traditional OLAP way of looking at things. It is expressed by creating
a Geography hierarchy, but it is also important that the relationships be expressed
between the attributes themselves. Expand the City attribute, and you will see a
relationship to District. This hierarchy (Store, City, District, Region, Division) is a
strong hierarchy, or natural hierarchy. It is important that it be expressed in attribute
relationships as well as in the definition of the hierarchy, for a number of reasons
including aggregation design, security roles, and so on. At the same time, two attributes
that have an indirect relationship (through a third attribute) should not also have a direct
relationship: Store does not have a relationship expressed to District, Region, or Division.
(This is all explained in more depth in the Project REAL: Analysis Services Technical
Drilldown paper mentioned above.)
Now open the Time dimension. You will see that it has two multilevel hierarchies: one
for Calendar time and one for Fiscal time. Select the Calendar Year attribute and look at
its properties. It has NameColumn set to Calendar_Year_Desc and KeyColumns set to
Calendar_Year_ID, both from the dimension table vTbl_Dim_Date. This works fine
because all the year IDs are unique. But take a look at the Calendar Qtr attribute: it has
NameColumn set to Calendar_Qtr_Desc and KeyColumns set to a collection consisting
of Calendar_Qtr_ID and Calendar_Year_ID. This is because Calendar_Qtr_ID is not
unique across the entire dimension; the ID repeats from year to year. To make the key
unique, it is combined with Calendar_Year_ID. The same technique is used for the
Calendar Week attribute. It is essential that keys at any level be unique across the entire
dimension. Day keys are already unique in the source table, so a single-part key is
sufficient.
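You can verify this from the relational side with a quick check along these lines (a sketch,
assuming the key columns shown in the designer map directly to columns of vTbl_Dim_Date):

    -- Quarter IDs that occur in more than one year. Any rows returned mean
    -- that Calendar_Qtr_ID alone cannot serve as a unique attribute key.
    SELECT Calendar_Qtr_ID,
           COUNT(DISTINCT Calendar_Year_ID) AS Year_Count
    FROM   vTbl_Dim_Date
    GROUP BY Calendar_Qtr_ID
    HAVING COUNT(DISTINCT Calendar_Year_ID) > 1;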
If you open the Item dimension, you will see the largest and most complex dimension in
Project REAL. It has nine multilevel hierarchies specified, and over 40 attribute
hierarchies, allowing a wide variety of analyses based on these hierarchies. However,
these are not all the attributes that could have been selected; there are 160 attributes
listed in the DSV! More analysis possibilities are available by choosing different or
additional attributes for the dimension. One point to note is that the DSV is not just a
pass-through view of the tables. Observe the attribute Years Since Published. It is based
on a calculation performed in the DSV which is not in the relational source. The
dimension table contains the date a book was published, but what is more useful for
analysis is to know how long it has been since the publish date. A calculation in the
DSV provides this:
ISNULL(DATEDIFF(yyyy,Publish_Year,GETDATE()),0)
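The named calculation behaves as if the column existed in the view. As a rough sketch of
the equivalent query (the Item view name is an assumption; Publish_Year comes from the
description above):

    -- What the DSV named calculation effectively adds to each Item row: the
    -- number of years since the publish date, defaulting to 0 when it is NULL.
    SELECT *,
           ISNULL(DATEDIFF(yyyy, Publish_Year, GETDATE()), 0) AS Years_Since_Published
    FROM   vTbl_Dim_Item;  -- hypothetical name for the Item dimension view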
Notice several things. First, the calendar hierarchy is filtered so that only members at
the third level (which corresponds to months) will be displayed. The default MDX
statement shows *all* members of the dimension when you select the parameter
check box; i.e., the original statement was:
[Time].[Calendar].ALLMEMBERS ON ROWS
Third, you can see how the query is built up: the dimension is constrained first by
districts, then by item categories, and finally by months. See also the nested subselects.
This is new AS2K5 syntax put in directly for RS so that it could perform this kind of
constrained parameterization, just like a relational database. This MDX is generated
automatically from the ordering of the parameters. Click the Query Parameters box and
you will see which parameters are used for this dataset.
To see what report parameters are defined, go to the menu item Report \ Report
Parameters and you will see what parameters are defined and how they map to a specified
dataset.
2) Examine the "Top Stockage Analysis By Subject By Dept" report. Select the Stockage
dataset and switch out of design mode. You will notice that the MDX statement is:
SELECT { [Measures].[Sales Qty], [Measures].[On Hand Qty] } ON COLUMNS,
NON EMPTY { TOPCOUNT ( [Item].[Subject].[Subject].ALLMEMBERS,
                        STRTOVALUE(@TopNSubject),
                        ( [Measures].[On Hand Qty] ) ) } . . . ON ROWS
FROM . . .
The TopNSubject parameter was added by hand to do a TopCount of the subjects based
on On Hand Qty. See how the StrToValue function is used to translate the parameter
(which is always a string) to the numeric value for the TopCount.
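If the MDX is unfamiliar, the relational equivalent of this TOPCOUNT is an ordinary TOP
query. A sketch, with hypothetical table and column names:

    -- Top N subjects ranked by on-hand quantity, mirroring the TOPCOUNT above.
    -- Table and column names are hypothetical.
    DECLARE @TopNSubject int;
    SET @TopNSubject = 10;

    SELECT TOP (@TopNSubject)
           Subject,
           SUM(On_Hand_Qty) AS On_Hand_Qty
    FROM   vFact_Inventory_By_Subject
    GROUP BY Subject
    ORDER BY SUM(On_Hand_Qty) DESC;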
Notice that the parameter had to be defined both in the query parameters and in the
report parameters. Any parameter (e.g., TopNSubject) used in an MDX statement must
be defined as a query parameter to be recognized by Analysis Services as a parameter
for the query.
3) Examine the "Sales to Model By Region By Category By Strategy" report. Select the
Sales dataset and switch out of design mode. You will notice that the MDX statement is:
SELECT { [Measures].[Model Qty], [Measures].[Sales Qty],
[Measures].[cSales to Model Ratio] } ON COLUMNS,
LastPeriods (STRTOVALUE(@RollingNWeeks),
(STRTOMEMBER(@TimeFiscal))) . . . ON ROWS
FROM . . .
Again we are using a custom parameter (@RollingNWeeks) to limit the fiscal weeks
displayed. In the original MDX, the parameter for the rows was just the TimeFiscal
parameter; we added the LastPeriods function and the @RollingNWeeks parameter.
Notice that there is a query parameter which maps the @RollingNWeeks parameter to the
RollingNWeeks report parameter. The actual translation from the captions shown on the
report to the substituted values is done by a table contained in the report. Go to the menu
item Report \ Report Parameters and select the RollingNWeeks parameter. You will see
the table hardcoded right there in the Report Parameters dialog box.
While we are on this report, go to the Data tab and the ReplenStrategy dataset. You will
notice that its MDX is pretty complex.
WITH
  MEMBER [Measures].[ParameterCaption] AS
    'iif([Replen Strategy].[Replen Strategy].CURRENTMEMBER.LEVEL.ORDINAL = 5,
         "          +-- " + [Replen Strategy].[Replen Strategy].CURRENTMEMBER.MEMBER_CAPTION + " (5)",
     iif([Replen Strategy].[Replen Strategy].CURRENTMEMBER.LEVEL.ORDINAL = 4,
         "        +-- " + [Replen Strategy].[Replen Strategy].CURRENTMEMBER.MEMBER_CAPTION + " (4)",
     iif([Replen Strategy].[Replen Strategy].CURRENTMEMBER.LEVEL.ORDINAL = 3,
         "      +-- " + [Replen Strategy].[Replen Strategy].CURRENTMEMBER.MEMBER_CAPTION + " (3)",
     iif([Replen Strategy].[Replen Strategy].CURRENTMEMBER.LEVEL.ORDINAL = 2,
         "    +-- " + [Replen Strategy].[Replen Strategy].CURRENTMEMBER.MEMBER_CAPTION + " (2)",
     iif([Replen Strategy].[Replen Strategy].CURRENTMEMBER.LEVEL.ORDINAL = 1,
         "  +- " + [Replen Strategy].[Replen Strategy].CURRENTMEMBER.MEMBER_CAPTION + " (1)",
         [Replen Strategy].[Replen Strategy].CURRENTMEMBER.MEMBER_CAPTION)))))'
  MEMBER [Measures].[ParameterValue] AS
    '[Replen Strategy].[Replen Strategy].CURRENTMEMBER.UNIQUENAME'
  MEMBER [Measures].[ParameterLevel] AS
    '[Replen Strategy].[Replen Strategy].CURRENTMEMBER.LEVEL.ORDINAL'
SELECT
  { [Measures].[ParameterCaption], [Measures].[ParameterValue],
    [Measures].[ParameterLevel] } ON COLUMNS,
  [Replen Strategy].[Replen Strategy].ALLMEMBERS ON ROWS
FROM . . .
However, what is happening is quite simple. In the previous MDX statements, the dataset
was filtered so that only particular levels are displayed. Here we are showing the entire
dimension (which is the default), but we modify the caption so that it includes some
number of spaces along with "+--" before the member caption, followed by the level
number. The number of spaces depends on the level ordinal (1 through 5). The net
effect is that the dimension is indented properly.
All
  +- Backlist (1)
    +-- Modeled (2)
      +-- Academic (3)
      +-- Core (3)
      +-- Display 4 (3)
      +-- Non Core (3)
      +-- NOS (3)
      +-- Regional (3)
      +-- Select (3)
  +- Frontlist (1)
    +-- Buyer Managed (2)
      +-- Buyer Managed (3)
    +-- No Action (2)
      +-- No Action (3)
    +-- No Replenishment (2)
      +-- No Replenishment (3)
    +-- Store Managed (2)
      +-- Store Managed (3)
    +-- Undefined (2)
      +-- Undefined (3)
  +- Unknown (1)
    +-- Unknown (2)
      +-- Unknowned (3)
You can see how a pseudo-hierarchy is being created. Unfortunately, you need to do
something like this because RS does not have an AS hierarchy control.
When the pointer is over one of the row labels, a tooltip pops up describing what
clicking the label will do. It also displays the caption of the particular label the
pointer is hovering over.
Click on the row label of your choice. (See Figure 2.) Notice how the column
heading for the row labels changed from Product to Subject.
You can keep clicking down to the next level by selecting any of the row labels.
The same drill down works with the graph bars. If any particular bar seems
interesting to you, perhaps because it is too high or too low, you can drill down
on it just by clicking.
The browser Back button always takes you back to where you came from.
After going back, you can click on another row label or graph bar of your choice.
\90\SDK\Assemblies\Microsoft.AnalysisServices.dll
to the folder:
<windows folder>\Microsoft.NET\Framework\v2.0.50727
Note: This is not required to run the packages; it is only needed to edit them. Without it,
the script editor in the package designer cannot use IntelliSense.
To get started using AMOShell, just copy the AMOShell.dtsx file to a working folder and
rename it to whatever you require. Then right-click the renamed .dtsx file and select
Edit. Make your changes and save the file. You can run it using the BI Development
Studio (which includes an SSIS debugger), or you can run it interactively by double-
clicking the .dtsx file and selecting Execute. You can also schedule the package using
SQL Agent.
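For scheduling, a minimal T-SQL sketch of such a job follows. The job name and package
path are hypothetical, and the SSIS subsystem shown assumes a default SQL Server 2005
Agent installation.

    -- Create a SQL Agent job with one SSIS step that runs the renamed package.
    -- The job name and file path below are hypothetical.
    USE msdb;
    GO
    EXEC dbo.sp_add_job
         @job_name = N'Run my AMOShell package';

    EXEC dbo.sp_add_jobstep
         @job_name  = N'Run my AMOShell package',
         @step_name = N'Execute package',
         @subsystem = N'SSIS',
         @command   = N'/FILE "C:\Working\MyAMOShell.dtsx"';

    EXEC dbo.sp_add_jobserver
         @job_name = N'Run my AMOShell package';  -- target the local server
    GO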
"Agg design" is Cust (Custom, by-hand aggregation design), Std (Standard using
30%), or UBO (using the Usage-Based Optimization design wizard)
SubjPart. We used a different partitioning design because there was too much Store
Inventory data per week, so we had to further partition it by Subject.
This naming convention is used throughout Project REAL for instances and database
names. Within a database, all of the objects have the same names; e.g., the cube is
always named "REAL Warehouse".
To actually perform the partition processing, Process_phases uses the ascmd utility,
which is distributed in the \OLAP\REALbuild\ascmd folder. Ascmd is a command-line
utility for executing MDX queries, XMLA scripts, and DMX (data mining) statements.
The file Process_phases.bat separates the work into nine phases, with increasing
complexity and amounts of data to be processed. The nine phases are:
1. Item Vendor – the 5 partitions for the many-to-many measure group (~35 million
rows in the measure group)
2. DC Inventory – the distribution center inventory data, which is fairly small
3. Store Sales 2002
4. Store Sales 2003
5. Store Sales 2004
6. Store Inventory 2004 Q1
7. Store Inventory 2004 Q2
8. Store Inventory 2004 Q3
9. Store Inventory 2004 Q4
The XMLA scripts for these phases are kept in the OLAP\REALbuild\scripts folder. They
have scripting variables to control: 1) the database name (ascmddbname); 2) the
processing type (script_process_type, which should be ProcessUpdate or ProcessFull);
and 3) the number of items processed in parallel (script_parallel, typically 4, 6, 8, or 12).
Also in the \scripts folder we placed the XMLA scripts used to create four of the base
databases (MT vs. PT and TimePart vs. SubjPart, all using the Cust aggregation design),
and a series of backup and restore scripts which we used to conduct backup and restore
tests.
Each run writes a series of records into the \OLAP\REALbuild\outputs\comment.txt
file, rather like an audit trail. We used this file as a running journal of our processing
activity. Once Process_phases completed, we used Notepad to add our own comments
on what the run was like, whether there were unusual circumstances during the run, and
so on. Also in the \outputs folder are the ascmd output files for each phase. The ascmd
utility can also record the trace events raised during execution; these are stored in the
\OLAP\REALbuild\traces folder.
Once you have the partitions built, you will want to query them to see if they were built
properly. Again, we use the ascmd utility. The Validate.bat file recursively looks at the
\OLAP\REALbuild\queries folder and executes each .mdx file in the folder. The results
are stored in the \OLAP\REALbuild\query-validate folder.
Lastly, this folder contains urlencode.exe, a sample utility which returns the HTML
encoding for a string. We found this extremely useful when developing MDX statements
to be placed into the \queries folder, because ascmd requires that the input XML be
HTML-encoded. We developed the .mdx files using SQL Management Studio and then
ran selected sections of the file through urlencode.exe to establish what special encodings
were needed, such as replacing double quotes with &quot;, before finally placing the
<Statement> and </Statement> elements around the encoded MDX query.
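The substitutions themselves are simple. As a rough illustration (this is not how
urlencode.exe is implemented), the core replacements can be expressed in T-SQL:

    -- Illustrative only: escape the characters that would break an inline
    -- <Statement> body. The ampersand must be replaced first.
    DECLARE @mdx nvarchar(max);
    SET @mdx = N'SELECT { [Measures].[Sales Qty] } ON COLUMNS FROM [REAL Warehouse]';
    SELECT REPLACE(
               REPLACE(
                   REPLACE(
                       REPLACE(@mdx, N'&', N'&amp;'),
                       N'<', N'&lt;'),
                   N'>', N'&gt;'),
               N'"', N'&quot;') AS Encoded_MDX;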
ClearCache.xmla – we used this to clear the data cache between our performance
runs to ensure that the system was starting from a cold cache.
NOTE: Because Project REAL uses considerably more data than is included in the
sample dataset, the following files are for your information only. You cannot run them
against the sample dataset.
The three subfolders contained in the \scripts folder are:
Build Partitions – We found that one of the issues when working with 1,400+ partitions
was that SQL Management Studio was unable to report on their state (i.e., processed or
unprocessed); there were simply too many partitions for the SSMS report to run on the
AS partitions. To work around this problem, we wrote an SSIS package which uses AMO
to loop through all of the partitions, measure groups, cubes, or databases and lists each
partition and its state.
The work in this package is at the logical fact table level. The partitioned table is
created, and a For Loop container is used to loop through each week and load the
associated table from the source database into the new partitioned table. The final step
is to create indexes, as sketched below.
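As a rough illustration of one iteration of that loop (a sketch only; the table, database,
and variable names are hypothetical):

    -- Load one week of fact rows from the source database into the new
    -- partitioned table. All names are hypothetical.
    DECLARE @WeekStart int, @WeekEnd int;
    SELECT @WeekStart = 20040104, @WeekEnd = 20040110;  -- date surrogate keys

    INSERT INTO dbo.Tbl_Fact_Store_Sales_Partitioned WITH (TABLOCK)
    SELECT *
    FROM   SourceDB.dbo.Tbl_Fact_Store_Sales
    WHERE  SK_Date_ID BETWEEN @WeekStart AND @WeekEnd;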
The stored procedures that are used in this project can be recreated in a SQL Server 2005
database by running the script in \DB\Data Lifecycle\SPs\Lifecycle SPs.sql. Note that
some of the functions that are created in this script are prior versions of the functions
released with this kit in the REAL_Warehouse_Sample_V6 database. For that reason, it
is best to run the script in a separate database.
Two stored procedures of note are up_CreatePartitionFunction and
up_CreatePartitionScheme, which generate the DDL statements to create the
partitioning function and partitioning scheme, respectively. This code is useful for
generating partition functions and partition schemes for partitioned tables that have
numerous partitions, as we had at Barnes & Noble. Generating these statements saves
time in typing and minimizes typos.
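To give a sense of the shape of the generated DDL, here is a hand-written sketch with
hypothetical names and boundary values (not the procedures' actual output):

    -- A weekly partition function on an integer date key, plus a scheme that
    -- maps every partition to one filegroup. Names and boundaries are hypothetical.
    CREATE PARTITION FUNCTION pf_Week (int)
    AS RANGE RIGHT FOR VALUES (20040104, 20040111, 20040118, 20040125);

    CREATE PARTITION SCHEME ps_Week
    AS PARTITION pf_Week ALL TO ([PRIMARY]);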
Known Issues
The following are the known issues with running the Project REAL Reference
Implementation using SQL Server 2005.