DESIGN OF EXPERIMENTS
SAS Institute Inc. SAS Campus Drive Cary, NC 27513
JMP Design of Experiments, Version 4
Copyright © 2000 by SAS Institute Inc., Cary, NC, USA
ISBN: 1-58025-631-7

All rights reserved. Printed in the United States of America. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, or otherwise, without prior written permission of the publisher, SAS Institute Inc. Information in this document is subject to change without notice. The software described in this document is furnished under a license agreement and may be used or copied only in accordance with the terms of the agreement. It is against the law to copy the software on any medium except as specifically allowed in the license agreement. First printing, January 2000.

JMP, SAS, and all other SAS Institute Inc. product or service names are registered trademarks or trademarks of SAS Institute Inc. in the USA and other countries. ® indicates USA registration. Other brand and product names are registered trademarks or trademarks of their respective companies. ImageMan is a registered trademark or trademark of Data Techniques, Inc. All rights reserved. Microsoft Text-to-Speech Engine is a registered trademark or trademark of Microsoft Corporation. All rights reserved. Installer VISE, Updater VISE, and MindExpander are trademarks of MindVision Inc. All rights reserved worldwide. InstallShield is a registered trademark of InstallShield Software Corporation. All rights reserved. Mercutio MDEF is a registered trademark or trademark of Digital Alchemy, Ramon M. Felciano. All rights reserved.
Chapter 6 Full Factorial Designs ............................................................ 85
The Factorial Dialog ........................................................................ 87
The Five-Factor Reactor Example ............................................................. 88
Chapter 7 Taguchi Designs ................................................................... 97
The Taguchi Design Approach ................................................................. 99
Taguchi Design Example ...................................................................... 99
Analyze the Byrne-Taguchi Data ............................................................. 103
Chapter 8 Mixture Designs .................................................................. 105
The Mixture Design Dialog .................................................................. 107
Mixture Designs ............................................................................ 108
Extreme Vertices Design for Constrained Factors ............................................ 113
Adding Linear Constraints to Mixture Designs ............................................... 114
Ternary and Tetrary Plots .................................................................. 115
Fitting Mixture Designs .................................................................... 116
Chemical Mixture Example ................................................................... 118
Plotting a Mixture Response Surface ........................................................ 119
Chapter 9 Augmented Designs ................................................................ 121
The Augment Design Interface ............................................................... 123
The Reactor Example Revisited .............................................................. 126
Chapter 10 Prospective Power and Sample Size ............................................... 135
Prospective Power Analysis ................................................................. 137
Launch the Sample Size and Power Facility .................................................. 137
References ................................................................................. 145
Index ...................................................................................... 149
Origin

JMP was developed by SAS Institute Inc., Cary, N.C. JMP is not a part of the SAS System and is not as portable as SAS. A SAS add-on product called SAS/INSIGHT is related to JMP in some ways but has different conventions and capabilities. Portions of JMP were adapted from routines in the SAS System, particularly for linear algebra and probability calculations. Version 1 of JMP went into production in October 1989.

Credits

JMP was conceived and started by John Sall. Design and development were done by John Sall, Katherine Ng, Michael Hecht, Richard Potter, Brian Corcoran, Annie Dudley, Bradley Jones, Xan Gregg, Eric Wasserman, Charles Soper, and Kevin Hardman. Ann Lehman coordinated product development, production, quality assurance, and documentation. In the SAS Institute Technical Support division, Ryan Gilmore, Maureen Hayes, Craig Devault, Toby Trott, and Peter Ruzza provided technical support and conducted test site administration. Statistical technical support is provided by Duane Hayes, Kathleen Kiernan, and Annette Sanders. Nicole Jones and Jianfeng Ding provide ongoing quality assurance. Additional testing and technical support are done by Kyoko Takenaka and Noriki Inoue from SAS Japan. Sales and marketing are headed by Colleen Jenkins and include Dianne Nobles, William Gjertsen, Chris Brown, Carolyn Durst, Mendy Clayton, Bob Hickey, David Sipple, Barbara Droschak, Lisa Rohloff, Bob McCall, Chuck Boiler, Nick Zagone, and Bonnie Rigo. Additional support is provided by Kathy Jablonski and Jean Davis. The JMP manuals were written by Ann Lehman, John Sall, Bradley Jones, and Erin Vang, with contributions from Annie Dudley and Brian Corcoran. Editing was done by Lee Bumgarner, Brad Kellam, and Lee Creighton; design and production by Creative Solutions. Lee Creighton implemented the online help system and online documentation with contributions from Timothy Christensen.
Special thanks to Jim Goodnight for supporting a product outside the usual traditions, and to Dave DeLong for valuable ideas and advice on statistical and computational matters. Thanks also to Robert N. Rodriguez, Ying So, Duane Hayes, Mark Bailey, Donna Woodward, and Mike Stockstill for statistical editorial support and statistical QC advice. Thanks to Georges Guirguis, Warren Sarle, Randall Tobias, Gordon Johnston, Ying So, Wolfgang Hartmann, Russell Wolfinger, and Warren Kuhfeld for statistical R&D support.

Acknowledgments

We owe special gratitude to the people who encouraged us to start JMP, to the alpha and beta testers of JMP, and to the reviewers of the documentation. In particular we thank Michael Benson, Howard Yetter, Al Best, Stan Young, Robert Muenchen, Lenore Herzenberg, Larry Sue, Ramon Leon, Tom Lange, Homer Hegedus, Skip Weed, Michael Emptage, Pat Spagan, John Frei, Paul Wenz, Mike Bowen, Lori Gates, Georgia Morgan, David Coleman, Linda Blazek, Michael Friendly, Joe Hockman, Frank Shen, J.H. Goodman, David Ikle, Lou Valente, Robert Mee, Barry Hembree, Dan Obermiller, Lynn Vanatta, and Kris Ghosh. Also, we thank Dick DeVeaux, Gray McQuarrie, Robert Stein, George Fraction, Al Fulmer, Cary Tuckfield, Ron Thisted, Donna Fulenwider, Nancy McDermott, Veronica Czitrom, Tom Johnson, Avigdor Cahaner, and Andy Mauromoustakos.
We also thank the following individuals for expert advice in their statistical specialties: R. Hocking and P. Spector for advice on effective hypotheses; Jason Hsu for advice on multiple comparisons methods (not all of which we were able to incorporate in JMP); Ralph O'Brien for advice on homogeneity of variance tests; Ralph O'Brien and S. Paul Wright for advice on statistical power; Keith Muller for advice on multivariate methods; Harry Martz, Wayne Nelson, Ramon Leon, Dave Trindade, and Paul Tobias for advice on reliability plots; Lijian Yang and J. S. Marron for bivariate smoothing design; George Milliken and Yurii Bulavski for development of mixed models; Clay Thompson for advice on contour plotting algorithms. For sample data, thanks to Patrice Strahle for Pareto examples, the Texas air control board for the pollution data, and David Coleman for the pollen (eureka) data.
Past Support
Many people were important in the evolution of JMP. Special thanks to Jeffrey Perkinson, Mary Cole, Kristin Nauta, Aaron Walker, Ike Walker, Eric Gjertsen, Dave Tilley, Curt Yeo, Patricia Moell, Patrice Cherry, Mike Pezzoni, Mary Ann Hansen, Ruth Lee, Russell Gardner, and Patsy Poole. SAS Institute quality assurance was done by Jeanne Martin, Fouad Younan, Jeff Schrilla, Jack Berry, Kari Richardson, Jim Borek, Kay Bydalek, and Frank Lassiter. Additional testing for Versions 3 and 4 was done by Li Yang, Brenda Sun, Katrina Hauser, and Andrea Ritter. Thanks to Walt Martin for PostScript support in documentation production. Also thanks to Jenny Kendall, Elizabeth Shaw, John Hansen, Eddie Routten, David Schlotzhauer, John Boling, and James Mulherin. Thanks to Steve Shack, Greg Weier, and Maura Stokes for testing Version 1. Additional editorial support was given by Marsha Russo, Dea Zullo, and Dee Stribling. Thanks for support from Morgan Wise, Frederick Dalleska, Stuart Janis, Charles Shipp, Harold Gugel, Jim Winters, Matthew Lay, Tim Rey, Rubin Gabriel, Brian Ruff, William Lisowski, David Morganstein, Tom Esposito, Susan West, Chris Fehily, Dan Chilko, Jim Shook, Bud Martin, Hal Queen, Ken Bodner, Rick Blahunka, Dana C. Aultman, and William Fehlner.
1 JMP DOE
Chapter 1 Contents
DOE Choices .............................................................................. 3
Custom Design ............................................................................ 4
Screening Design ......................................................................... 4
Response Surface Design .................................................................. 4
Full Factorial Design .................................................................... 5
Taguchi Arrays ........................................................................... 5
Mixture Design ........................................................................... 5
Augment Design ........................................................................... 5
Sample Size and Power .................................................................... 6
A Simple DOE Example ..................................................................... 6
The DOE Dialog ........................................................................... 7
Entering Responses ....................................................................... 8
Entering Factors ......................................................................... 9
Select a Design Type .................................................................... 10
Modify a Design ......................................................................... 10
The JMP DOE Data Table .................................................................. 11
DOE Utility Commands .................................................................... 12
DOE Choices
The DOE platform in JMP is an environment for describing the factors, responses, and other specifications, creating a designed experiment, and saving it in a JMP table. When you select the DOE tab on the JMP Starter window, you see the list of design command buttons shown on the tab page, as in Figure 1.1. Alternatively, you can choose commands from the DOE main menu shown to the right.

Figure 1.1 The DOE JMP Starter Tab
Note that the DOE tab in the JMP Starter window tells what each command does. The specific design types are described briefly in the next sections, and covered in detail by the following chapters in this book.
Custom Design
Custom designs give the most flexibility of all design choices. The Custom designer gives you the following options:
- continuous factors
- categorical factors with arbitrary numbers of levels
- mixture ingredients
- covariates (factors that already have fixed, unchangeable values; the design is built around them)
- blocking with arbitrary numbers of runs per block
- interaction terms and polynomial terms for continuous factors
- inequality constraints on the factors
- choice of the number of experimental runs, which can be any number greater than or equal to the number of terms in the model.
After you specify your requirements, the Custom designer generates a D-optimal design for them.
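D-optimality means maximizing the determinant of the information matrix X'X. A minimal sketch of the idea (an illustration only, not JMP's actual algorithm, which handles far larger problems): for a main-effects model in two 2-level factors with four runs, exhaustive search over a candidate set shows the 2-by-2 full factorial is the D-optimal choice.

```python
from itertools import combinations_with_replacement

# Candidate points: the four corners of the coded factor space.
candidates = [(-1, -1), (-1, 1), (1, -1), (1, 1)]

def det3(m):
    # Determinant of a 3x3 matrix by cofactor expansion.
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def d_criterion(design):
    # |X'X| for the main-effects model y = b0 + b1*x1 + b2*x2.
    rows = [(1, x1, x2) for x1, x2 in design]
    xtx = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    return det3(xtx)

# Exhaustive search over all 4-run designs drawn from the candidates.
best = max(combinations_with_replacement(candidates, 4), key=d_criterion)

print(sorted(best))       # all four corners: the 2x2 full factorial
print(d_criterion(best))  # |X'X| = 64
```

Any design that repeats a corner makes some columns of X correlated, which shrinks the determinant; only the full factorial reaches the maximum of 64 here.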
Screening Design
As the name suggests, screening experiments separate the wheat from the chaff. The wheat is the group of factors having a significant influence on the response. The chaff is the rest of the factors. Typically screening experiments involve many factors. The Screening designer supplies a list of popular screening designs for 2 or more factors. Screening factors can be continuous or categorical with two or three levels. The list of screening designs also includes designs that group the experimental runs into blocks of equal sizes where the size is a power of two.
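The fractional-factorial idea behind many such catalog designs can be sketched in a few lines (an illustration of the principle, not JMP's design tables): build a half fraction of a 2^3 design by generating the third factor's column as the product of the first two.

```python
from itertools import product

# Half fraction of a 2^3 design: choose A and B freely, set C = A*B.
# (Generator C = AB, the classic resolution III screening fraction.)
runs = [(a, b, a * b) for a, b in product((-1, 1), repeat=2)]
for run in runs:
    print(run)

# Main-effect columns are mutually orthogonal: every pair of distinct
# columns has a zero dot product, so main effects estimate independently.
cols = list(zip(*runs))
for i in range(3):
    for j in range(i + 1, 3):
        assert sum(x * y for x, y in zip(cols[i], cols[j])) == 0
```

The price of halving the run count is aliasing: here each main effect is confounded with a two-factor interaction, which is why screening designs are used to sort factors rather than to fit final models.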
Response Surface Design

The number of runs a response surface design requires grows quickly with the number of factors. The Response Surface designer in JMP lists well-known RSM designs for two to eight continuous factors. Some of these designs also allow blocking.
Taguchi Arrays
The goal of the Taguchi Method is to find control factor settings that generate acceptable responses despite natural environmental and process variability. In each experiment, Taguchi's design approach employs two designs, called the inner and outer arrays; the Taguchi experiment is the cross product of these two arrays. The control factors, used to tweak the process, form the inner array. The noise factors, associated with process or environmental variability, form the outer array. Taguchi's signal-to-noise ratios are functions of the observed responses over an outer array. The Taguchi designer in JMP supports all these features of the Taguchi method. The inner and outer array design lists use the traditional Taguchi orthogonal arrays, such as L4, L8, L16, and so forth.
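The textbook signal-to-noise formulas are simple to state in code. A sketch (the formulas are Taguchi's standard definitions; the response values are made-up sample data, not from this manual):

```python
import math

def sn_larger_is_better(y):
    # SN = -10 log10( mean(1 / y_i^2) )
    return -10 * math.log10(sum(1 / v**2 for v in y) / len(y))

def sn_smaller_is_better(y):
    # SN = -10 log10( mean(y_i^2) )
    return -10 * math.log10(sum(v**2 for v in y) / len(y))

def sn_nominal_is_best(y):
    # SN = 10 log10( ybar^2 / s^2 ), with s^2 the sample variance
    n = len(y)
    ybar = sum(y) / n
    s2 = sum((v - ybar) ** 2 for v in y) / (n - 1)
    return 10 * math.log10(ybar**2 / s2)

# Hypothetical responses for one inner-array run across its outer array:
print(round(sn_larger_is_better([10, 12, 11]), 2))       # 20.76
print(round(sn_nominal_is_best([0.16, 0.17, 0.18]), 2))  # 24.61
```

Each inner-array run gets one SN value computed over its outer-array responses; higher SN means the control settings are more robust to the noise factors.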
Mixture Design
The Mixture designer lets you define a set of factors that are ingredients in a mixture. You choose among several classical mixture design approaches, such as simplex, extreme vertices, and lattice. For the extreme vertices approach you can supply a set of linear inequality constraints limiting the geometry of the mixture factor space.
Augment Design
The Augment designer gives the following four choices for adding new runs to an existing design:
- add center points
- replicate the design a specified number of times
- create a foldover design
- add runs to the design using a model, which can have more terms than the original model.
The last choice (adding runs to a design) is particularly powerful. You can use this choice to achieve the objectives of response surface methodology by changing a linear model to a full quadratic model and adding the necessary number of runs. For example, suppose you start with a two-factor, two-level, four-run design. If you add quadratic terms to the model and five new points, JMP generates the 3 by 3 full factorial as the optimal augmented design.
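The 3-by-3 result in this example is easy to verify by hand. A sketch (the five added points are assumed here to be the center and the four edge midpoints, which is what a 3-by-3 full factorial implies; JMP chooses them by optimality, not by this enumeration):

```python
from itertools import product

# Original design: two factors, two levels, four runs (the corners).
original = {(-1, -1), (-1, 1), (1, -1), (1, 1)}

# Five added points: assumed to be the center plus the four edge midpoints.
added = {(0, 0), (0, -1), (0, 1), (-1, 0), (1, 0)}

augmented = original | added
grid = set(product((-1, 0, 1), repeat=2))
print(augmented == grid)  # True: the augmented design is the 3x3 factorial

# The full quadratic model in two factors has six terms, so nine runs
# leave three degrees of freedom for estimating error.
terms = ["1", "x1", "x2", "x1*x2", "x1^2", "x2^2"]
print(len(augmented) - len(terms))  # 3
```

The point of the check is that a two-level design cannot estimate quadratic terms at all; adding the middle level in each factor is exactly what makes the full quadratic model estimable.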
A Simple DOE Example

Each DOE command leads you through the same basic steps:
- enter factors and responses
- choose a design
- modify the design
- generate a JMP table that contains the design runs.
Suppose an engineer wants to investigate a process that uses an electron beam welding machine to join two parts. The engineer fits the two parts into a welding fixture that holds them snugly together. A voltage applied to a beam generator creates a stream of electrons that heats the two parts, causing them to fuse. The ideal depth of the fused region is 0.17 inches. The engineer wants to study the welding process to determine the best settings for the beam generator to produce the desired depth in the fused region.
For this study, the engineer wants to explore the following three inputs, which are the factors for the study:
- Operator, one of two technicians who operate the welding machine
- Rotation Speed, the speed at which the part rotates under the beam
- Beam Current, a current that affects the intensity of the beam.

After each processing run, the engineer cuts the part in half. This reveals an area where the two parts have fused. The length of this fused area is the depth of penetration of the weld, and this depth of penetration is the response for the study. The goals of the study are to
- find which factors affect the depth of the weld
- quantify those effects
- find specific factor settings that predict a weld depth of 0.17 inches.
The next sections show how to define this study in JMP with the DOE dialog.
The Responses panel has a single default response. You can enter as many responses as you want, and designate response goals as Maximize, Minimize, or Match Target. A response may also have no defined goal. The DOE platform accepts only numeric responses. The Factors panel requires that you enter one or more factors. The appearance of the Factors panel depends on the DOE command you select. For the 2-level design panel shown in Figure 1.2, enter the number of Continuous, 2-Level, or 3-Level factors you want and click Add. Factor panels for other types of design are shown in more detail in the following chapters that describe the specific design types.
The results when you click Continue depend on the type of design. There are examples of each design type shown in the chapters that follow. For simplicity, this example uses the Screening designer. Note that the Responses and Factors panels have disclosure buttons so that you can close them. This lets you simplify the dialog when you are ready to Continue.
Figure 1.2 The DOE Design Experiment Dialog For a Screening Design
Responses Panel Enter response and edit response names. Define response goal: Target, Min, Max, or None.
Factors Panel Enter Factors and click Add. Edit Factors names.
Entering Responses
By default, the Responses panel in the DOE dialog appears with one response (named Y) that has Maximize as its goal. There are several things you can do in this panel:
- Add an additional response with a specific goal type, using selections from the Add Response popup menu.
- Add N additional responses with the N Responses button. The default goal is Maximize.
- Specify goals appropriate for each goal type.
To continue with the welding example, open the Responses panel if it is not already showing. Note that there is a single default response called Y. Change the default response as follows:
1) Double-click to highlight the response name and change it to Depth (In.).
2) The default goal is Maximize, but this process has a target value of 0.17 inches, with a lower bound of 0.12 and an upper bound of 0.22. Click the Goal text edit area and choose Match Target from the popup menu, as shown here.
3) Click the Lower Bound and Upper Bound areas and enter 0.12 and 0.22 as the minimum and maximum acceptable values.
Entering Factors
Next, enter factors into the Factors panel, which shows beneath the Responses panel. Design factors have different roles that depend on the design type, and the Factors panel reflects the roles appropriate for the design you choose. The screening design accepts either continuous or categorical factors. This example has one categorical factor (Operator) and two continuous factors (Speed and Current):
1) Enter 1 in the 2-Level Categorical text box and click Add.
2) Enter 2 in the Continuous text box and click Add.
These three factors appear with default names (X1, X2, and X3) and the default values shown here. The factor names and values are editable fields; double-click these fields to enter new names and values. For this example, use Mary and John as values for the categorical factor, called Operator. Name the continuous factors Speed and Current. High and low values for Speed are 3 and 5 rpm; values for Current are 150 and 165 amps. After you enter the response and the factors, and edit their values (optional), click Continue.
Modify a Design
Special features for screening designs include the ability to list the Aliasing of Effects, Change Generating Rules for aliasing, and view the Coded Design. A standard feature for all designs lets you specify the Run Order with selections from the run order popup menu. These features are used in examples and discussed in detail in the following chapters. When the design details are complete, click Make Table to create a JMP table that contains the specified design.
Note: All dialogs have a Backup button that returns you to the previous stage of the design generation, where you can change the design type selection.
The table panels show table properties automatically created by the DOE platform:
- The name of the table is the design type that generated it.
- A table variable called Design also shows the design type. You can edit this table variable to further document the table, or you can create new table variables.
- A script to generate the analysis model is saved with the table. The icon labeled Model is a Table Property that runs a script to generate a Model Specification dialog with the analysis specification for the design type you picked. In this example, the Model Specification dialog shows a single response, Depth (In.); three main effects, Operator, Speed, and Current; and all two-factor interactions.
Figure 1.4 The Model Specification dialog Generated by the DOE Dialog
Save Responses

The Save Responses command creates a JMP data table that contains the response information from a completed DOE dialog. The table has a row for each response, with a column called Response Name that identifies them. Four additional columns identify response goals to the DOE facility: Lower Limit, Upper Limit, Response Goal, and an Importance weight. This example shows a DOE dialog for four responses with a variety of response goals, and the JMP table that contains the response information.
Load Responses
If the responses and response goals are in a JMP table as described previously, you can use that table to complete the DOE dialog for an experiment. When the responses table you want is open and is the current table, the Load Responses command copies the response names and goals into the DOE dialog. If there is no response table open, Load Responses displays the Open File dialog for you to open the table you want to use.
Save Factors
If an experiment has many factors, it can take time to enter the names and values for each factor. After you finish, you can use the Save Factors command to save your work, so you only have to do this job once. The Save Factors command creates a JMP data table that contains the information in a completed factor list. The table has a column for each factor and a row for each factor level. As an example, suppose you entered the information shown in the dialog to the right. Save Factors produces the data table shown below. The columns of this table have a Column
Property called Design Role that identifies them as DOE factors to the DOE facility and tells what kind of factors they are (continuous, categorical, blocking, and so on). You can also create a factors table by keying data into an empty table, but then you have to assign each column its factor type: use the New Property menu in the Column Info dialog and select Design Role, then choose the appropriate design role from the popup menu on the design role column property tab page.
Load Factors
If the factors and levels for an experiment are in a JMP table as described previously, you can use that table to complete the DOE dialog for an experiment. If the factors table you want is open and is the current table, the Load Factors command copies the factor names, values, and factor types into the DOE dialog. If there is no factor table open, Load Factors displays the Open File dialog for you to open the factors table you want to use.
Save Constraints
Entering constraints on continuous factors is another example of work you want to do only once. In the next example, there are three variables, X1, X2, and X3, with three linear constraints. The Save Constraints command creates a JMP table that contains the information you enter into a constraints panel like the one shown here. There is a column for each constraint, with a column property called Constraint State that identifies them as constraints (< or >) to the DOE facility. There is a row for each variable and an additional row that has the inequality condition for each constraint.
Load Constraints
If the constraints for an experiment are in a JMP table as described previously, you can use that table to complete the DOE dialog. When the constraints table you want is open and is the current table, the Load Constraints command copies the constraint information into the DOE dialog. If there is no constraints table open, Load Constraints displays the Open File dialog for you to open the table you want to use.

Set Random Seed
The Custom designer begins the design process with a random number. After a design is complete, the Set Random Seed command displays a dialog that shows the generating seed for that design. From this dialog you can rerun the design with the same seed, or continue with a new random number.
Simulate Responses
When you check Simulate Responses, that item shows as checked for the current design only. It adds simulated response values to the JMP design data table for custom and augmented designs.
The DOE platform in JMP has the following two approaches for building an experimental design:
- You can let JMP build a design for your specific problem that is consistent with your resource budget.
- You can choose a predefined design from one of the design catalogs, which are grouped by problem type.
The Custom designer supports the first of these approaches. You can use it for routine factor screening, response optimization, and mixture problems. Also, the custom designer can find designs for special conditions not covered in the lists of predefined designs. This chapter introduces you to the Custom designer. It shows how to use the Custom Design interface to build a design using this easy step-by-step approach:
Key engineering steps: process knowledge and engineering judgment are important.
- Describe: Identify factors and responses.
- Design: Compute the design for maximum information from the runs.
- Collect: Use the design to set factors; measure responses for each run.
- Fit: Compute the best fit of the mathematical model to the data from the test runs.
- Predict: Use the model to find the best factor settings for on-target responses and minimum variability.
Chapter 3, "Custom Design: Beyond the Textbook," uses a case study approach to introduce the advanced capabilities of the Custom Design personality.
Chapter 2 Contents
Getting Started .......................................................................................................................... 19 Define Factors in the Factors Panel ................................................................................... 19 Describe the Model in the Model Panel ............................................................................ 20 The Design Generation Panel............................................................................................ 20 The Design Panel and Output Options .............................................................................. 21 Make Table........................................................................................................................ 22 Modify a Design Interactively .................................................................................................. 23 Introducing the Prediction Variance Profiler ........................................................................... 24 A Quadratic Model ............................................................................................................ 24 A Cubic Model .................................................................................................................. 26 Routine Screening Using Custom Designs ............................................................................... 28 Main Effects Only ............................................................................................................. 28 All Two-Factor Interactions Involving Only One Factor.................................................. 30 All Two-Factor Interactions .............................................................................................. 31 How the Custom Designer Works ............................................................................................ 32
Getting Started
The purpose of this chapter is to guide you through the interface of the Custom Design personality. You interact with this facility to describe your experimental situation, and JMP creates a design that fits your requirements. The Custom Design interface has these key steps:
2 Custom Design
1) Enter and name one or more responses, if needed. The DOE dialog always begins with a single response, called Y, and the Response panel is closed by default.
2) Use the Factors panel to name and describe the types of factors you have.
3) Enter factor constraints, if there are any.
4) Choose a model.
5) Modify the sample size alternatives.
6) Choose the run order.
7) Optionally, add center points and replicates.
You can use the custom design dialog to enter main effects, then add interactions, and specify center points and replicates.
With the Custom designer, you choose your own number of runs. Balancing the cost of each run against the information gained from the extra runs you add is a judgment call that you control. The Design Generation panel has the following radio buttons:
- Minimum is the number of terms in the design model. The resulting design is saturated (no degrees of freedom for error). This is the riskiest choice; use it only when the cost of extra runs is prohibitive.
- Default is a custom design suggestion for the number of runs, based on heuristics for creating balanced designs with a minimum of additional runs above the saturated minimum.
- Compromise is a second suggestion that is more conservative than the Default. Its value is generally between Default and Grid.
- Grid, in most cases, shows the number of points in a full-factorial design. Exceptions are mixture and blocking designs. Generally, Grid is unnecessarily large and is included as an option for reference and comparison.
- User Specified highlights the Number of Runs text box, where you key in a number of runs that is at least the minimum.
When the Design Generation panel is the way you want it, click Make Design to see the factor design layout (the Design panel) appended to the Model panel in the DOE dialog.
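Because Minimum equals the number of terms in the design model, you can anticipate it before opening the dialog. A sketch (an illustration, not a JMP feature) that counts the terms of a full quadratic model in k factors: an intercept, k main effects, C(k, 2) two-factor interactions, and k quadratic terms.

```python
from math import comb

def quadratic_model_terms(k):
    # intercept + k main effects + C(k,2) interactions + k quadratic terms
    return 1 + k + comb(k, 2) + k

for k in range(1, 6):
    print(k, quadratic_model_terms(k))
# k=1 gives 3, k=2 gives 6, k=3 gives 10, k=4 gives 15, k=5 gives 21
```

For two factors this is 6, which matches the saturated minimum for the full quadratic model; any runs beyond that buy degrees of freedom for error.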
There are edit boxes to request that additional runs at the center points be added, and to request rows that replicate the design (including any additional center points). Note: You can double-click any title bar to change its text. It can be helpful to give your design dialog a meaningful name in the title bar, which is labeled Custom Design by default.
Make Table
When the Design panel shows the layout you want, click Make Table. This creates the JMP data table whose rows are the runs you defined. Make Table also updates the runs in the Design panel to match the JMP data table. The table to the right is the initial two-factor design shown above, which has four additional center points, and is replicated once as specified above.
A Quadratic Model
You can follow the steps in Figure 2.3 to create a simple quadratic model with a single continuous factor.
1) Add one continuous factor and click Continue.
2) Select 2nd from the Powers popup menu in the Model panel to create a quadratic term.
3) Use the default number of runs, 6, and click Make Design.
Figure 2.3 Use One Continuous Factor and Create a Quadratic Model
When the design appears, open the Prediction Variance Profile (as shown next). For continuous factors, the initial setting is at the mid-range of the factor values. For categorical factors, the initial setting is the first level. If the design model is quadratic, then the prediction variance function is quartic. The three design points are -1, 0, and 1. The prediction variance profile shows that the variance is a maximum at each of these points on the interval -1 to 1.
The Y axis is the relative variance of prediction of the expected value of the response. The prediction variance is relative to the error variance; when the relative prediction variance is 1, the absolute variance equals the error variance of the regression model. What you are deciding when you choose a sample size is how much variance in the expected response you are willing to tolerate. As the number of runs increases, the prediction variance curve decreases. To compare profile plots, click Backup and choose Minimum in the Design Generation panel, which gives a sample size of 3. This produces a curve with the same shape as the previous plot, but the maxima are at 1 instead of 0.5. Figure 2.4 compares plots for sample size 6 and sample size 3 for this quadratic model example. You can see the prediction variance increase as the sample size decreases.
Figure 2.4 Comparison of Prediction Variance Profiles. These profiles are for middle variance and lowest variance, for sample size 6 (top charts) and sample size 3 (bottom charts).
Note: You can CONTROL-click (COMMAND-click on the Mac) on the factor to set a factor level precisely.
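The relative prediction variance the profiler displays is the quadratic form x'(X'X)^-1 x for a model row x. As a cross-check, the following numpy sketch (not JMP or JSL; the function and variable names are my own) reproduces the maxima of 1 for the 3-run design and 0.5 for the 6-run design:

```python
import numpy as np

def rel_pred_var(points, x):
    # Relative prediction variance x'(X'X)^-1 x for a one-factor quadratic model.
    X = np.vander(np.asarray(points, float), 3, increasing=True)  # rows [1, x, x^2]
    C = np.linalg.inv(X.T @ X)
    v = np.array([1.0, x, x * x])
    return v @ C @ v

design3 = [-1, 0, 1]        # Minimum (saturated) design, n = 3
design6 = [-1, 0, 1] * 2    # Default design, n = 6: each point run twice
grid = np.linspace(-1, 1, 201)
print(max(rel_pred_var(design3, x) for x in grid))  # 1.0, at -1, 0, and 1
print(max(rel_pred_var(design6, x) for x in grid))  # 0.5
```

Doubling the design simply doubles X'X, which halves the variance everywhere, which is why the two profiles have the same shape.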
For a final look at the Prediction Variance Profile for the quadratic model, click Backup, enter a sample size of 4 in the Design Generation panel, and click Make Design. The sample size of 4 adds a second point at -1 (Figure 2.5), so the variance of prediction at -1 is lower (half the value) than at the other design points. The symmetry of the plot is related to the balance of the factor settings: when the design points are balanced, the plot is symmetric, like those in Figure 2.4; when the design is unbalanced, the prediction plot is not symmetric, as shown below. Figure 2.5 Sample Size of Four for the One-Factor Quadratic Model
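The "half the value" claim can be checked numerically. Assuming the fourth run replicates the -1 endpoint (a hedged reading of the Figure 2.5 design; by symmetry the same numbers hold if +1 is replicated instead), a numpy sketch, not JMP code, gives:

```python
import numpy as np

def rel_var(points, x):
    # Relative prediction variance x'(X'X)^-1 x for a one-factor quadratic model.
    X = np.vander(np.asarray(points, float), 3, increasing=True)  # rows [1, x, x^2]
    v = np.array([1.0, x, x * x])
    return v @ np.linalg.inv(X.T @ X) @ v

design4 = [-1, -1, 0, 1]      # fourth run replicates the -1 endpoint
print(rel_var(design4, -1))   # 0.5 at the replicated point
print(rel_var(design4, 1))    # 1.0 at the single-run endpoint
```

The unequal variances at the two endpoints are exactly the asymmetry the profile plot shows.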
A Cubic Model
The runs in the quadratic model are equally spaced. This is not true for the single-factor cubic model shown in this section. To create a one-factor cubic model, follow the same steps as shown previously in Figure 2.3; in addition, add a cubic term to the model with the Powers popup menu. Use the Default number of runs in the Design Generation panel and click Make Design. Then open the Prediction Variance Profile Plot to see the Prediction Variance Profile and its associated design, shown in Figure 2.6. The cubic model has a variance profile that is a 6th-degree polynomial. Note that the points are not equally spaced in X. Counter-intuitively, this design has a better prediction variance profile than the equally spaced design with the same number of runs.
You can reproduce the plots in Figure 2.6 with JSL. The following script shows graphically that the design with unequally spaced points has a better prediction variance than the equally spaced design. Open the file called Cubic Model.jsl, found in the Scripts folder in the Sample Data, and select Submit Script from the Edit menu. When the plot appears, move the free values from the equally spaced points to the optimal points to see that the maximum variance on the interval decreases by more than 10%.
// DOE for fitting a cubic model.
n = 4; // number of design points
// Start with equally spaced interior points; the endpoints are fixed at -1 and 1.
u = [-0.333 0.333];
x = {-1, u[1], u[2], 1};
y = j(2, 1, .2); // vertical positions for the draggable markers
// cubicx returns the model row [1, x1, x1^2, x1^3].
cubicx = function({x1},
	rr = j(4, 1, 1);
	for(i = 1, i <= 3, i++, rr[i + 1] = x1 ^ i);
	rr;
);
NewWindow("DOE - Variance Function of a Cubic Polynomial",
	Graph(FrameSize(500, 300), XScale(-1.0, 1.0), YScale(0, 1.2), Double Buffer,
		M = j(n, 1, 1);
		for(i = 1, i <= 3, i++, M = M || (x ^ i)); // build the model matrix
		V = M` * M;
		C = inverse(V);
		// Plot the square root of the relative prediction variance.
		yFunction(xi = cubicx(x); sqrt(xi` * C * xi), x);
		detV = det(V);
		text({-0.3, 1.1}, "Determinant = ", char(detV, 6, 99));
		DragMarker(u, y);
		for(i = 1, i <= 2, i++, text({u[i], .25}, char(u[i], 6, 99)));
	)
);
show(n, detV, u);
// Drag the middle points to -0.445 and 0.445 for a D-optimal design.
Figure 2.6 Comparison of Prediction Variance Profiles For Cubic Design with Unequally Spaced Points and Augmented to Have Equally Spaced Points
Note to DOE experts: The result is a resolution 3 screening design. All main effects are estimable but are confounded with two-factor interactions. Click Make Design to see the Factor Design table in Figure 2.7. Figure 2.7 A Main Effects Only Screening Design
The Prediction Variance Profile in Figure 2.8 shows a variance of 0.125 (1/8) at the center of the design, which is the setting shown when you first open the Prediction Variance Profile. If you did all of your runs at this one point, you would have the same prediction variance, but then you could not make predictions for any other combination of factor settings. The prediction variance profile for each factor is a parabola centered at the midrange of that factor. The maximum prediction variance occurs at the design points and is equal to p/n, where p is the number of parameters and n is the number of runs.
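The p/n rule is easy to verify numerically. The numpy sketch below is not JMP code, and for brevity it uses a small 2^3 main-effects design (n = 8, p = 4) rather than the chapter's 5-factor screen; for any orthogonal two-level design the relative variance is 1/n at the center and p/n at a vertex:

```python
import numpy as np
from itertools import product

# Model matrix for a 2^3 full factorial with main effects only: n = 8, p = 4.
runs = np.array(list(product([-1, 1], repeat=3)), dtype=float)
X = np.column_stack([np.ones(len(runs)), runs])
C = np.linalg.inv(X.T @ X)  # = I/8 because the columns are orthogonal

def rel_var(x1, x2, x3):
    v = np.array([1.0, x1, x2, x3])
    return v @ C @ v

print(rel_var(0, 0, 0))  # 1/n = 0.125 at the center, matching Figure 2.8
print(rel_var(1, 1, 1))  # p/n = 4/8 = 0.5 at a design point
```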
Figure 2.9 Two-factor Interactions that Involve Only One of the Factors
3 Customized II
You can add factors with any role in any experiment. Categorical factors can have as many levels as you need. You can specify any number of runs per block. Any design can have continuous or categorical covariate factors, that is, factors whose values are fixed in advance of the experiment. You can have non-mixture factors in a mixture experiment. You can disallow certain regions of the factor space by defining linear inequality constraints.
"
Once you generate a design, you can use the Prediction Variance Profiler as a diagnostic tool to assess the quality of the design. You can use this tool to compare many candidate designs and choose the one that best meets your needs. This chapter presents several examples with aspects that are common in industry but which make them beyond the scope of any design catalog. It introduces various features of the Custom designer in the context of solving real-world problems.
Chapter 3 Contents
Custom Situations ..................................................................................................................... 35 Flexible Block Sizes ................................................................................................................. 36 Response Surface Model with Categorical Factors .................................................................. 38 Fixed Covariate Factors............................................................................................................ 43 Mixtures with Nonmixture Factors........................................................................................... 45 Factor Constraints ..................................................................................................................... 48
Custom Situations
When your design situation does not fit a standard design, the Custom designer gives you the flexibility to tailor a design to specific circumstances. Here are some examples.
The listed designs in the Screening designer allow only 2-level or 3-level factors. Moreover, the designs that allow blocking limit the block sizes to powers of two. Suppose you are able to do a total of 12 runs and want to complete one block per day. With a block size of two, the experiment takes six days. If you could do three runs a day, it would take only four days instead of six.

The Response Surface designer allows only continuous factors. Suppose you wanted to model the behavior of three kinds of epoxy under varying temperatures and pressures in a lamination process. Repeating a complete response surface design for each type of epoxy requires more runs than a single response surface design arranged over the epoxy levels.

Preformulated designs rely on the assumption that the experimenter controls all the factors. It is common to have quantitative measurements (a covariate) on the experimental units before the experiment begins. If these measures affect the experimental response, the covariate should be a design factor. A preformulated design that allows only a few discrete values is too restrictive.

The Mixture designer requires all factors to be mixture components. It seems natural to vary the process settings along with the percentages of the mixture ingredients. After all, the optimal formulation could change depending on the operating environment.

Screening and RSM designs assume it is possible to vary all the factors independently over their experimental ranges. The experimenter might know in advance that running a process at certain specified settings has an undesirable result. Leaving these runs out of an available listed design type destroys the mathematical properties of the design.
The Custom designer can supply a reasonable design for all these examples. Instead of a list of tables, the Custom designer creates a design table from scratch according to your specifications. Instead of forcing you to modify your problem to conform to the restrictions of a tabled design, it tailors a design to fit your needs. This chapter consists of five examples addressing these custom situations.
If you add the two-factor interactions of X1 through X3 to the design, as shown by the Model panel and Design Generation panel in Figure 3.2, the default number of runs changes to 12. The blocking factor then has four levels. The table in the example results from the Randomize within Blocks option in the Run Order popup menu on the Display and Modify Design panel.
Figure 3.2 Model Design Table For Blocking Factor With Four Levels
The initial Prediction Variance Profile for this design (Figure 3.3) shows that at the center of the design, the block-to-block variance is a constant. This results from the fact that each block has three runs. Figure 3.3 Constant Block-to-Block Variance at Design Center
If you drag the vertical reference lines in the plots of X1 through X3 to their high value of 1, you see the top plot in Figure 3.4. The bottom plot results from dragging the vertical reference line for X4 to block 4. At this vertex the prediction variance is not constant over the blocks. This is due to an unavoidable lack of balance resulting from the fact that there are three runs in each block, but only two values for each continuous variable.
The main question here is whether the size of the prediction variance over the possible factor settings is acceptably small. If not, adding more runs (up to 15 or 18) will lower the prediction variance traces.
First, define two continuous factors (X1 and X2). Click Continue and then click the RSM button in the Model panel. You should see the panels as they are shown here.
Now, use the Add Factor popup above the Factors panel to create a 3-level categorical factor (X3). As soon as you add the categorical factor, the model updates to show its main effect in the Model panel. Ignoring the categorical factor, it seems natural to use a 3-by-3 factorial design to fit an RSM model for two continuous factors, which gives the design illustrated to the right. The traditional approach would be to repeat this design three times (once for each level of the categorical variable), giving a sample size of 27. This is overkill. In fact, it is not strictly necessary to add any runs to accommodate the categorical factor. When you click Continue for this example, the Design Generation panel shows the default number of runs to be 12, but the Minimum option is 8. Note: The minimum number of runs needed for this example is eight because the RSM model for two continuous factors has six parameters (constant, two linear terms, interaction, and two quadratic terms). The main effect of the 3-level categorical factor adds two more parameters, giving a total of eight parameters.
The rest of this example compares the results of 8 runs, 9 runs and the 9-run design with 3 center points added. To see these designs:
1) Make a design with the Minimum number of runs (8).
2) Make a second design by typing 9 in the Design Generation panel's Number of Runs text box.
3) For the third design, add three center points to the previous 9-run design and make the design again.
Figure 3.5 shows these three designs, after making JMP tables for them, side by side. Figure 3.5 8 Runs (Left), 9 Runs (Middle), 9 Runs with 3 Center Points Added (Right)
Figure 3.6 gives a geometric view of the designs generated by this example. These plots were generated for the runs in each JMP table with the Overlay command in the Graph menu, using the block factor as the Group By variable.
Figure 3.6 Geometric View of RSM Designs: 8 Runs, 9 Runs, and 9 Runs with 3 Center Points
The Prediction Variance Profilers for each of these designs are shown in Figures 3.7-3.9. Figure 3.7 shows the variance traces for the minimum design. Note that at the center of the design the prediction variance is larger than the error variance. If the error variance is small relative to the size of the effects of interest, this should not concern you. If the process variability is sizeable, then adding runs will help reduce the noise in the parameter estimates.
The prediction variance trace in Figure 3.8 shows that adding just one more run to the minimum (saturated) design reduces the prediction variance at the center of the design by nearly 40%. If extra runs are not prohibitively expensive, this is a desirable choice.
Figure 3.9 shows the prediction trace after adding three center points to the 9-Run design. The additional center points give the prediction trace a bowl shape which is desirable if you are confident that you have already bracketed the optimum response. There is a further 40% drop in the prediction variance at the center of the design, but this is at the cost of three extra runs instead of one.
Any of the designs described in this section could be acceptable, depending on your research objectives and budget. The Prediction Variance Profile is a tool for assessing the trade-off between improved prediction and extra cost.
Add 2 continuous variables to the model, as shown in previous examples. Click Continue and add the interaction to the model. Then select Covariate from the Add Factors popup menu as shown here.
The Covariate selection displays a list of the variables in the current data table. Note: If you have more than one data table open, be sure the table that contains the covariate you want is the active (current) data table. The covariate, weight, shows in the Factors panel with its minimum and maximum as levels, and is a term in the model. Figure 3.10 shows the Factors panel and the resulting JMP data table.
You can see that weight is nearly independent of the X1 and X2 factors by running the model with the two-factor interaction as in the Model Specification dialog in Figure 3.11. The leverage plots are nearly horizontal, and the analysis of variance table (not shown) shows that the model sum of squares is near zero compared to the residuals. Figure 3.11 Analysis To Check That weight is Independent of X1 and X2
You can save the prediction equation from this analysis and use it to generate a set of predicted weight values over a grid of X1 and X2 values, and append them to the column of observed weight values in the experimental design JMP table. Then use the Spinning Plot platform to plot X1, X2, and weight. This illustrates that the X1 and X2 levels are well balanced over the weight values.
Figure 3.12 Three-dimensional Spinning Plot of Two Design Factors, Observed Covariate Values and Predicted Covariate Grid
The response is the electromagnetic damping of an acrylonitrile powder. The three mixture ingredients are copper sulphate, sodium thiosulphate, and glyoxal. The nonmixture environmental factor of interest is the wavelength of light.
Though wavelength is a continuous variable, the researchers were interested in predictions at only three discrete wavelengths, so they treat it as a categorical factor with three levels. The Responses panel in Figure 3.13 shows Damping as the response. The authors do not mention how much damping is desirable, so the response goal is None. The Factors panel shows the three mixture ingredients and the categorical factor, Wavelength. The mixture ingredients have range constraints that arise from the mechanism of the chemical reaction. To load these factors, choose Load Factors from the popup menu on the Factors panel title bar. When the open file dialog appears, open the file Donev Mixture factors.JMP in the DOE folder in the Sample Data.
The model in Figure 3.14 is a response surface model in the mixture ingredients, along with the additive effect of the wavelength. There are several reasonable choices for sample size. The Grid option in the Design Generation panel (Figure 3.14) corresponds to repeating a six-run mixture design in the mixture ingredients once for each level of the categorical factor. The resulting data table is on the right. Figure 3.14 Mixture Experiment Design Generation Panel and Data Table
Atkinson and Donev provide the response values shown in Figure 3.14. They also discuss the design where the number of runs is limited to 10. In this case it is not possible to run a complete mixture response surface design for every wavelength. Typing "10" in the Number of Runs edit box in the Design Generation panel (Figure 3.15) sets the run choice to User Specified. The Design table to the right in Figure 3.15 shows the factor settings for 10 runs. Figure 3.15 Ten-Run Mixture Response Surface Design.
Note that there are unequal numbers of runs for each wavelength. Because of this lack of balance, it is a good idea to look at the prediction variance plot (Figure 3.16). The prediction variance is almost constant across the three wavelengths, which is a good indication that the lack of balance is not a problem. Figure 3.16 Prediction Variance Plot for Ten-Run Design
The values of the first three ingredients sum to one because they are mixture ingredients. If you vary one of the values, the others adjust to keep the sum constant. Figure 3.17 shows the result of increasing the copper sulphate percentage from 0.38462 to 0.61476. The other two ingredients both drop, keeping their ratio constant; the ratio of Na2S2O3 to glyoxal is 5:3 in both plots. Figure 3.17 Increasing the Copper Sulphate Percentage
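The renormalization rule can be sketched in a few lines. This is a hypothetical Python illustration of the arithmetic, not JMP's implementation; the function name is mine, and the starting proportions are the ones quoted above:

```python
# When one mixture ingredient moves, rescale the others proportionally
# so the mixture still sums to 1 and their mutual ratios are unchanged.
def adjust_mixture(mix, name, new_value):
    rest = {k: v for k, v in mix.items() if k != name}
    scale = (1.0 - new_value) / sum(rest.values())
    out = {k: v * scale for k, v in rest.items()}
    out[name] = new_value
    return out

mix = {"CuSO4": 0.38462, "Na2S2O3": 0.38462, "Glyoxal": 0.23076}
new = adjust_mixture(mix, "CuSO4", 0.61476)
print(sum(new.values()))                 # still 1.0
print(new["Na2S2O3"] / new["Glyoxal"])   # same ratio as before (5:3 here)
```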
Factor Constraints
Sometimes it is impossible to vary all the factors independently over their experimental ranges. The experimenter might know in advance that running a process at certain specified settings has an undesirable result. Leaving these runs out of an available listed design type destroys the mathematical properties of the design, which is unacceptable. The solution is to support factor constraints as an integral part of the design requirements. For this example, define two factors. Suppose that it is impossible or dangerous to perform an experimental run where both factors are at either extreme. That is, none of the corners of the factor region are acceptable points. Figure 3.18 shows a set of four constraints that cut off the corner points. The figure on the right in Figure 3.18 shows the geometric view of the constraints. The allowable region is inside the diamond defined by the four constraints. If you want to avoid entering these constraints yourself, choose Load Constraints from the Design Experiments title bar. Open the sample data file Diamond Constraints.jmp in the DOE folder.
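Conceptually, linear inequality constraints simply screen points out of the factor region. In this hypothetical Python sketch (the limit of 1.5 is an assumed value; the actual bounds for this example live in Diamond Constraints.jmp), four diamond constraints remove the corners of the square while keeping the center and the edge midpoints:

```python
from itertools import product

# Hypothetical constraints cutting the corners off the [-1, 1]^2 square:
# the allowable region is the diamond where |X1 + X2| and |X1 - X2| <= 1.5.
constraints = [
    lambda x1, x2: x1 + x2 <= 1.5,
    lambda x1, x2: x1 + x2 >= -1.5,
    lambda x1, x2: x1 - x2 <= 1.5,
    lambda x1, x2: x1 - x2 >= -1.5,
]

def feasible(x1, x2):
    return all(c(x1, x2) for c in constraints)

grid = [-1, -0.5, 0, 0.5, 1]
candidates = [(a, b) for a, b in product(grid, grid) if feasible(a, b)]
print((1, 1) in candidates)  # False: the corner points are cut off
print((0, 0) in candidates)  # True: the center remains allowable
```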
Next, click the RSM button in the Model panel to include the two-factor interaction term and both quadratic effects in the model. This is a second order empirical approximation to the true functional relationship between the factors and the response. Suppose the complexity of this relationship required third order terms for an adequate approximation. Figure 3.19 shows how to create a higher order cross product term. First select one or more factors from the Factors panel and one or more terms from the Model panel. Then click the Cross button to add the cross product terms. Figure 3.19 Creating a Cross-Product Term
Similarly, you can add the X1*X2*X2 cross product term. To complete the full third order model, select both factors and choose 3rd from the Powers popup menu in the Model panel. There are 10 terms in the design model. A 4 by 4 grid design would be 16 runs. Choosing an intermediate value of 12 runs yields a design similar to the one in Figure 3.20. The geometric view shows many design points at or near the constraint boundary.
Figure 3.21 shows the prediction variance as a function of the factor settings at the center of the design and at the upper right constraint boundary. The variance of prediction at the center of the design is 0.602301, nearly the same as it is at the boundary, 0.739579.
Figure 3.21 Prediction Variance at the Center of the Design and at a Boundary.
In many situations it is preferable to have lower prediction variance at the center of the design. You can accomplish this by adding center points to the design. Figure 3.22 shows the result of adding two center points after generating the 12-run design shown in Figure 3.20. Snee (1985) calls this "exercising the boss option." It is practical to add center points to a design even though the resulting set of runs loses the mathematical optimality of the previous design. It is more important to solve problems than to run "optimal" designs. Figure 3.22 Add Two Center Points to Make a 14-Point Design
When you compare the variance profile shown to the right to the one at the top in Figure 3.21 you see that adding two center points has reduced the variance at the center of the design by more than a factor of two, an impressive improvement.
4 Screening
restricting the factors to two (or three) levels
performing only a fraction of the full factorial design
Applying these ideas to the case described above, you can restrict the factors to two levels, which yields 2 x 2 x 2 = 8 runs. Further, by doing half of these eight combinations you can still assess the separate effects of the three factors. So the screening approach reduces the 24-run experiment to 4 runs. Of course, there is a price for this reduction. This chapter discusses the screening approach in detail, showing both pros and cons.
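The reduction can be sketched directly: build the 2 x 2 x 2 factorial, then keep the half of it defined by the relation C = A x B. This Python sketch (not JMP code) shows that the four retained runs still leave the three main-effect columns mutually orthogonal:

```python
from itertools import product

# Full 2^3 factorial: 8 runs over factors A, B, C at levels -1 and +1.
full = list(product([-1, 1], repeat=3))

# Half fraction from the defining relation I = ABC:
# keep only the runs where the three columns multiply to +1.
half = [run for run in full if run[0] * run[1] * run[2] == 1]
print(len(half))  # 4 runs instead of 8

# The main-effect columns remain pairwise orthogonal in the half fraction.
for i in range(3):
    for j in range(i + 1, 3):
        print(sum(r[i] * r[j] for r in half))  # 0 for every pair
```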
Chapter 4 Contents
Screening Design Types ........................................................................................................... 55 Two-Level Full Factorial................................................................................................... 55 Two-Level Fractional Factorial ......................................................................................... 55 Plackett-Burman Designs .................................................................................................. 56 Mixed-Level Designs ........................................................................................................ 57 Cotter Designs ................................................................................................................... 57 A Screening Example ............................................................................................................... 58 Two-Level Design Selection and Description................................................................... 59 Design Output Options ...................................................................................................... 60 The Coded Design and Factor Generators......................................................................... 61 Aliasing of Effects ............................................................................................................. 63 Output Options for the JMP Design Table ........................................................................ 63 The Design Data Table...................................................................................................... 64 Loading and Saving Responses and Factors (Optional) ........................................................... 66 A Simple Effect Screening Analysis ........................................................................................ 67
Main Effects Report Options ............................................................................................. 67 The Actual-by-Predicted Plot ............................................................................................ 68 The Scaled Estimates Report ............................................................................................. 68
These designs are orthogonal, which means that the estimates of the effects are uncorrelated. If you remove an effect in the analysis, the values of the other estimates remain the same. Their p-values change slightly, because the estimate of the error variance and the degrees of freedom are different. Full factorial designs allow the estimation of interactions of all orders up to the number of factors. Most empirical modeling involves first- or second-order approximations to the true functional relationship between the factors and the responses.
The big trade-off in screening designs is between the number of runs and what is often referred to as the resolution of the design. If price is no object, you can run several replicates of all possible combinations of m factor levels. This provides a good estimate of everything, including interaction effects up to the mth degree. But because running experiments costs time and money, you typically run only a fraction of all possible levels. This causes some of the higher-order effects in a model to become nonestimable. An effect is nonestimable when it is confounded with another effect. In fact, fractional factorials are designed by planning which interaction effects are confounded with the other interaction effects. In practice, few experimenters worry about interactions higher than two-way interactions. These higher-order interactions are assumed to be zero. Experiments can therefore be classified by resolution number into three groups:

resolution = 3: Main effects are not confounded with other main effects. They are confounded with one or more two-way interactions, which must be assumed to be zero for the main effects to be meaningful.

resolution = 4: Main effects are not confounded with other main effects or with two-factor interactions. However, two-factor interactions can be confounded with other two-factor interactions.

resolution >= 5: There is no confounding between main effects, between two-factor interactions, or between main effects and two-factor interactions.

All the fractional factorial designs are minimum aberration designs. A minimum aberration design is one in which there is a minimum number of confoundings for a given resolution.
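The resolution 3 situation is easy to demonstrate concretely. In the I = ABC half fraction of a 2^3 design, the main-effect column for A and the interaction column for BC are literally the same vector, so the two effects cannot be separated (a Python sketch, not JMP output):

```python
from itertools import product

# Half fraction of a 2^3 design from the defining relation I = ABC.
half = [r for r in product([-1, 1], repeat=3) if r[0] * r[1] * r[2] == 1]

A = [r[0] for r in half]            # main-effect column for A
BC = [r[1] * r[2] for r in half]    # two-factor interaction column B*C

# Because ABC = +1 on every run, A = BC identically: A is confounded
# with BC and is interpretable only if BC is assumed to be zero.
print(A == BC)  # True
```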
Plackett-Burman Designs
Plackett-Burman designs are an alternative to fractional factorials for screening. One useful characteristic is that the sample size is a multiple of 4 rather than a power of two. There are no two-level fractional factorial designs with sample sizes between 16 and 32 runs. However, there are 20-run, 24-run, and 28-run Plackett-Burman designs. The main effects are orthogonal, and two-factor interactions are only partially confounded with main effects. This is different from a resolution 3 fractional factorial, where two-factor interactions are indistinguishable from main effects. In cases of effect sparsity, a stepwise regression approach can allow for removing some insignificant main effects while adding highly significant and only somewhat correlated two-factor interactions.
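A Plackett-Burman design can be built from a tabulated generator row by cyclic shifts plus one final all-low run. The 12-run generator used below is the commonly published one; treat this numpy construction as an illustration of the idea rather than JMP's algorithm:

```python
import numpy as np

# Commonly tabulated generator row for the 12-run Plackett-Burman design
# (up to 11 two-level factors).
gen = np.array([+1, +1, -1, +1, +1, +1, -1, -1, -1, +1, -1])

# Rows 1-11 are cyclic shifts of the generator; row 12 is all low.
rows = [np.roll(gen, i) for i in range(11)]
rows.append(-np.ones(11, dtype=int))
pb12 = np.array(rows)

print(pb12.shape)                     # (12, 11)
print(np.all(pb12.sum(axis=0) == 0))  # each column balanced: 6 high, 6 low
print(np.all(pb12.T @ pb12 == 12 * np.eye(11)))  # main effects orthogonal
```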
Mixed-Level Designs
If you have qualitative factors with three values, then none of the classical designs discussed previously are appropriate. For pure three-level factorials, JMP offers fractional factorials. For mixed two-level and three-level designs, JMP offers complete factorials and specialized orthogonal-array designs, listed in Table 4.1.

Table 4.1 Types of Mixed-Level Designs
Design            Two-Level Factors    Three-Level Factors
L18 John                  1                     7
L18 Chakravarty           3                     6
L18 Hunter                8                     6
L36                      11                    12
If you have no more factors of each type than one of the designs listed in Table 4.1, you can use that design by selecting an appropriate subset of columns from the original design. Some of these designs are not balanced, even though they are all orthogonal.
Cotter Designs
Cotter designs are used when you have very few resources and many factors, and you believe there may be interactions. Suppose you believe in effect sparsity, that is, that very few effects are truly nonzero. You believe in this so strongly that you are willing to bet that if you add up a number of effects, the sum will show an effect if it contains an active effect. The danger is that several active effects with mixed signs will cancel and still sum to near zero, giving a false negative. Cotter designs are easy to set up. For k factors, there are 2k + 2 runs. The design is similar to the vary-one-factor-at-a-time approach many books call inefficient and naive. A Cotter design begins with a run having all factors at their high level. Then follow k runs, each with one factor in turn at its low level and the others high. The next run sets all factors at their low level, and the design sequences through k more runs with one factor high and the rest low. This completes the Cotter design, subject to randomizing the runs. When you use JMP to generate a Cotter design, JMP also includes a set of extra columns to use as regressors. These are of the form factorOdd and factorEven, where factor is a factor name. They are constructed by adding up all the odd- and even-order interaction terms for each factor. For example, if you have three factors, A, B, and C:

AOdd = A + ABC
BOdd = B + ABC
COdd = C + ABC
AEven = AB + AC
BEven = AB + BC
CEven = AC + BC
Because these columns in a Cotter design form an orthogonal transformation, testing the parameters on these combinations is equivalent to testing the combinations on the original effects. In the three-factor example above, AOdd estimates the sum of the odd-order terms involving A, AEven estimates the sum of the even-order terms involving A, and so forth. Because Cotter designs have a false-negative risk, many statisticians recommend against them.
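The construction is simple enough to sketch. This hypothetical Python function (not JMP code; the names are mine) builds the 2k + 2 Cotter runs for k = 3 and forms the AOdd and AEven regressors described above:

```python
import numpy as np

def cotter_design(k):
    # 2k + 2 runs: all-high, k runs with one factor low and the rest high,
    # all-low, then k runs with one factor high and the rest low.
    # Randomize the run order before using the design.
    runs = [np.ones(k)]
    for i in range(k):
        r = np.ones(k); r[i] = -1; runs.append(r)
    runs.append(-np.ones(k))
    for i in range(k):
        r = -np.ones(k); r[i] = 1; runs.append(r)
    return np.array(runs, dtype=int)

D = cotter_design(3)
print(len(D))  # 2*3 + 2 = 8 runs

# Odd/even regressor columns for factor A, as defined in the text.
A, B, C = D[:, 0], D[:, 1], D[:, 2]
AOdd = A + A * B * C    # sum of odd-order terms involving A
AEven = A * B + A * C   # sum of even-order terms involving A
print(AOdd.tolist())
```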
A Screening Example
Experiments for screening the effects of many factors usually consider only two levels of each factor. This allows the examination of many factors with a minimum number of runs. Often screening designs are a prelude to further experiments. It is wise to spend only about a quarter of your resource budget on an initial screening experiment. You can then use the results to guide further study. The following example, adapted from Meyer et al. (1996), demonstrates how to use the JMP Screening designer. In this study, a chemical engineer investigates the effects of five factors on the percent reaction of a chemical process. The factors are:
- feed rate, the amount of raw material added to the reaction chamber in liters per minute
- percentage of catalyst
- stir rate, the RPMs of a propeller in the chamber
- reaction temperature in degrees Celsius
- concentration of reactant.
To begin, choose Screening Design from the DOE tab on the JMP Starter or from the DOE main menu.
Change the default names (X1-X5) to Feed Rate, Catalyst, Stir Rate, Temperature, and Concentration. Enter the high and low values as shown in Figure 4.1.
Note that the Responses outline level is closed. Click the disclosure diamond to open it. You see one default response called Y. Double-click on the name and change it to Percent Reacted. In this experiment the goal is to maximize the response, which is the default goal. To see the popup list of other goal choices shown to the right, click on the word Maximize. Change the minimum acceptable reaction percentage to 90 as shown in Figure 4.2. When you complete these changes, click Continue (see Figure 4.1).
Now, JMP lists the designs for the number of factors you specified, as shown to the left in Figure 4.3. Select the first item in the list, which is an 8-run fractional factorial design. Click Continue again to see the Design Output Options panel on the right in Figure 4.3. Figure 4.3 Two-level Screening Design (left) and Design Output Options (right)
Controls the choice of different fractional factorial designs for a given number of factors.
Aliasing of Effects
Coded Design
Shows the pattern of high and low values for the factors in each run.
Run Order Choice
Controls sorting or randomization through the Run Order Choice popup menu.
Center Points
Add center points by entering the number you want in the edit box. The default is zero.
Replicates
Add the desired number of replicates in the edit box. One replicate doubles the number of runs.
Make Table
Creates a JMP table of the design with columns for the factors and responses.
Backup
Removes the Design Output Options Panel and re-displays the list of designs.
You can change the check marks in the Change Generating Rules panel to change the coded design. For example, if you enter check marks as in Figure 4.5 and click Apply, the Coded Design changes as shown. The first three columns of the coded design remain a full factorial for the first three factors (Feed Rate, Catalyst, and Stir Rate). Temperature is now the product of Feed Rate and Catalyst, so the fourth column of the coded design is the element-by-element product of the first two columns. Concentration is a function of Feed Rate and Stir Rate. Note: Be sure to click Apply to switch to the new generating rules. Figure 4.5 Modified Coded Designs and Generating Rules
Aliasing of Effects
A full factorial with 5 factors requires 2^5 = 32 runs. Eight runs can only accommodate a full factorial with three 2-level factors. As described above, it is necessary to construct the two additional factors in terms of the first three factors. The price of reducing the number of runs from 32 to 8 is effect aliasing (confounding). Confounding is the direct result of the assignment of new factor values to products of the coded design columns. For example, the values for Temperature are the product of the values for Feed Rate and Catalyst. This means you can't distinguish the effect of Temperature from the synergistic (interactive) effect of Feed Rate and Catalyst. The Aliasing of Effects panel shows which effects are confounded with which other effects. It shows effects and confounding up to two-factor interactions. In the example shown in Figure 4.6, all the main effects are confounded with two-factor interactions. This is characteristic of resolution 3 designs.
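The confounding pattern can be verified numerically. This Python sketch (not JMP code) builds an 8-run design from the generators D = AB and E = AC, which are illustrative choices rather than necessarily the rules JMP picks, and confirms that main-effect columns coincide with two-factor interaction columns:

```python
from itertools import product

# Full factorial in the first three factors A, B, C (8 runs), then
# generate D and E from the hypothetical rules D = A*B, E = A*C.
runs = []
for a, b, c in product((-1, 1), repeat=3):
    runs.append({'A': a, 'B': b, 'C': c, 'D': a * b, 'E': a * c})

def column(effect):
    """Column of +/-1 values for a main effect ('D') or a
    two-factor interaction ('AB')."""
    if len(effect) == 1:
        return [r[effect] for r in runs]
    f, g = effect
    return [r[f] * r[g] for r in runs]

# D = AB means the main effect D is confounded with the AB interaction:
assert column('D') == column('AB')
# Multiplying D = AB through by B gives A = BD, so A is aliased with BD:
assert column('A') == column('BD')
print("aliases confirmed")
```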
The Run Order Choice popup menu (shown next) determines the order of runs as they will appear in the JMP data table.
The Replicates option lets you repeat the complete set of experimental runs a specified number of times.
The Design of Experiments facility in JMP automatically generates a JMP data table with a JSL script that creates a Model Specification dialog with the appropriate model for the analysis of the specified design. If you double-click on the Table Property name, Model, the dialog shown here appears with the JSL script generated by the DOE facility. The model generated by this example contains all the main effects and two estimable interaction terms, as shown in Figure 4.8. The two-factor interactions in the model actually represent a group of aliased interactions. Any predictions made using this model implicitly assume that these interactions are active rather than the others in the group.
Figure 4.8 Model Specification Dialog Generated by the Design Table with Interaction Term Added
The data table shown here contains a column for each factor and a row for each factor level. You use the Save Factors command to name the table and save it. To load the factor names and level values into the DOE dialog:
- open the data table that contains the factor names and levels
- select the design type you want from the DOE menu
- choose Load Factors from the Design dialog menu.
Use the same steps to save and reload information about Responses. See Chapter 1, Design of Experiments (DOE) for a description of all the platform commands.
Profiler shows how a predicted response changes as you change any factor.
Interaction Plots gives multiple profile plots across one factor under different settings of another factor.
Contour Profiler shows how predicted values change with respect to changing factors two at a time.
Cube Plots show predicted values in the corners of the factor space.
Box-Cox Transformation finds a power transformation of the response that fits the data best.
- Center points, for which all the factor values are at the zero (or midrange) value.
- Axial (or star) points, for which all but one factor are set at zero (midrange) and one factor is set at its outer (axial) value.
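This structure is easy to enumerate. The following Python snippet is a sketch (not JMP code; the function name is made up) that assembles the factorial, axial, and center runs of a central composite design in coded units:

```python
from itertools import product

def central_composite(k, alpha, n_center=1):
    """Runs of a central composite design in coded units: a full 2^k
    factorial, 2k axial points at +/-alpha, and n_center center points.
    A simplification; JMP also supports fractional factorial cores."""
    factorial = [list(p) for p in product((-1.0, 1.0), repeat=k)]
    axial = []
    for i in range(k):
        for a in (-alpha, alpha):
            pt = [0.0] * k       # all factors at midrange...
            pt[i] = a            # ...except one at its axial value
            axial.append(pt)
    center = [[0.0] * k for _ in range(n_center)]
    return factorial + axial + center

runs = central_composite(3, alpha=1.682, n_center=2)
print(len(runs))   # 8 factorial + 6 axial + 2 center = 16 runs
```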
The Box-Behnken design, shown to the left, is an alternative to central composite designs.
5 Surface
One distinguishing feature of the Box-Behnken design is that there are only three levels per factor.
Another important difference between the two design types is that the Box-Behnken design has no points at the vertices of the cube defined by the ranges of the factors. This is sometimes useful when it is desirable to avoid these points due to engineering considerations. The price of this characteristic is the higher uncertainty of prediction near the vertices compared to the Central Composite design.
Chapter 5 Contents
Response Surface Designs........................................................................................................ 71 The Response Surface Design Dialog ............................................................................... 71 The Design Table .............................................................................................................. 72 Axial Scaling Options ....................................................................................................... 73 A Central Composite Design ............................................................................................. 74 Fitting the Model ............................................................................................................... 75 A Box-Behnken Design: The Tennis Ball Example ................................................................. 76 Geometry of a Box-Behnken Design ................................................................................ 78 Analysis of Response Surface Models .............................................................................. 78
Uniform precision means that the number of center points is chosen so that the prediction variance at the center is approximately the same as at the design vertices. For orthogonal designs, the number of center points is chosen so that the second order parameter estimates are minimally correlated with the other parameter estimates.
Figure 5.1 Design Dialogs to Specify Factors and Choose Design Type
To complete the dialog, enter the number of factors (up to eight) and click Continue. In the table shown to the right in Figure 5.1, the 15-run Box-Behnken design is selected. Click Continue to use this design.
The left panel in Figure 5.2 shows the next step of the dialog. To reproduce the right panel of Figure 5.2, specify 1 replicate with 2 center points per replicate, and change the run order popup choice to Randomize. When you finish specifying the output options you want, click Make Table. Figure 5.2 Design Dialog to Modify Order of Runs and Simulate Responses
Figure 5.3 The JMP Design Facility Automatically Generates a JMP Data Table
The axial scaling options control how far out the axial points are:
Rotatable
makes the variance of prediction depend only on the scaled distance from the center of the design.
Orthogonal
In both previous cases the axial points are more extreme than the −1 or 1 representing the range of the factor. If this factor range cannot be practically achieved, then you can choose either of the following options:
On Face
is the default. These designs leave the axial points at the end of the -1 and 1 ranges.
User Defined
uses the value entered by the user, which can be any value greater than zero.
Inscribe
rescales the whole design so that the axial points are at the low and high ends of the range (the axials are −1 and 1 and the factorials are shrunken in from that).
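For a CCD with a full 2^k factorial core, the rotatable axial distance works out to alpha = (2^k)^(1/4), and Inscribe simply divides every coordinate by alpha. The Python sketch below illustrates these two scalings plus On Face; the helper names are hypothetical, the Orthogonal case is omitted, and this is not JMP's implementation:

```python
def rotatable_alpha(k):
    """Axial distance that makes a CCD with a full 2^k factorial core
    rotatable: alpha = (number of factorial runs) ** (1/4)."""
    return (2 ** k) ** 0.25

def scale_point(point, option, alpha):
    """Apply an axial-scaling choice to one coded design point (a sketch).

    'on_face'  clamps axial coordinates to the -1/+1 faces (alpha = 1);
    'inscribe' shrinks the whole design by 1/alpha so the axial points
    land exactly on -1 and +1 and the factorials move inward.
    """
    if option == 'on_face':
        return [max(-1.0, min(1.0, x)) for x in point]
    if option == 'inscribe':
        return [x / alpha for x in point]
    return point  # 'rotatable' / 'user_defined': leave coordinates as-is

a = rotatable_alpha(3)
print(round(a, 3))   # 1.682 for a three-factor full-factorial core
```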
The column called Pattern identifies the coding of the factors. The Pattern column shows all the factor codings with + for high, − for low, a and A for low and high axial
values, and 0 for midrange. If the Pattern variable is a label column, then when you click on a point in a plot of the factors, the pattern value shows the factor coding of the point. Note: The resulting data table has a Table Variable called Design that contains the design type. This variable appears as a note at the top of the Tables panel to the left of the data grid. In this example, Design says CCD-Orthogonal Blocks. The table also contains a model script stored as a Table Property, displayed as a menu icon labeled Model.
After the response and factor data load, the Response Surface Design Choice dialog lists the designs shown in Figure 5.8.
The Box-Behnken design selected for three effects generates the design table of 15 runs shown in Figure 5.9. The data are in the Bounce Data.jmp sample data table. The Table Variable (Model) runs a script to launch the Model Specification dialog. After the experiment is conducted, the responses are entered into the JMP table.
Figure 5.9
The prediction model is highly significant with no evidence of lack of fit. All main effect terms are significant as well as the two interaction effects involving Sulfur. Figure 5.11 JMP Statistical Reports for a Response Surface Analysis of Bounce Data
See Chapter 9, Standard Least Squares: Introduction in the JMP Statistics and Graphics Guide for more information about interpretation of the tables in Figure 5.11. The Response Surface report also has the tables shown in Figure 5.12:
The Solution table lists the critical values of the surface factors and tells the kind of solution (maximum, minimum, or saddlepoint). The Canonical Curvature table shows eigenvalues and eigenvectors of the effects.
Note that the solution for the Bounce example is a saddlepoint. The Solution table also warns that the critical values given by the solution are outside the range of data values. See Chapter 11, Standard Least Squares: Exploring the Prediction Equation in the JMP Statistics and Graphics Guide for details about the response surface analysis tables in Figure 5.12.
The eigenvector values show that the dominant negative curvature (yielding a maximum) is mostly in the Sulfur direction. The dominant positive curvature (yielding a minimum) is mostly in the Silica direction. This is confirmed by the prediction profiler in Figure 5.13.

The Prediction Profiler

The Prediction Profiler gives you a closer look at the response surface to find the best settings that produce the response target. It is a way of changing one variable at a time and looking at the effects on the predicted response. Open the Prediction Profiler with the Profiler command from the Factor Profiling popup menu on the Response title bar. The Profiler displays prediction traces for each X variable. A prediction trace is the predicted response as one variable is changed while the others are held constant at the current values (Jones 1991). The first profile in Figure 5.13 shows initial settings for the factors Silica, Silane, and Sulfur, which result in a value for Stretch of 396, below the specified target of 450. However, you can adjust the prediction traces of the factors to find a Stretch value that is closer to the target. The next step is to choose Desirability Functions from the popup menu on the Profiler title bar. This command appends a new row of plots to the bottom of the plot matrix, which graph
desirability on a scale from 0 to 1. The row has a plot for each factor, showing its desirability trace, as illustrated by the second profiler in Figure 5.13. The Desirability Functions command also adds a column that has an adjustable desirability function for each Y variable. The overall desirability measure appears to the left of the row of desirability traces. The response goal for Stretch is a target value of 450, as illustrated by the desirability function in Figure 5.13. If needed, you can drag the middle handle on the desirability function vertically to change the target value. The range of acceptable values is determined by the positions of the upper and lower handles. See Chapter 11, Standard Least Squares: Exploring the Prediction Equation in the JMP Statistics and Graphics Guide for further discussion of the Prediction Profiler. Note in this example that the desirability function is set to 450, the target value. The current predicted value of Stretch, 396, is based on the default factor settings. It is represented by the horizontal dotted line that appears slightly below the desirability function target value. Figure 5.13 Prediction Profiler for a Response Surface Analysis
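A target-goal desirability can be sketched as a simple function of the response. The Python snippet below is a piecewise-linear stand-in (JMP's actual desirability functions are smooth curves), with made-up handle positions for the Stretch response:

```python
def target_desirability(y, low, target, high):
    """Piecewise-linear sketch of a 'match target' desirability:
    0 at or beyond the low/high handles, rising to 1 exactly at the
    target. An illustration only, not JMP's smooth desirability."""
    if y <= low or y >= high:
        return 0.0
    if y <= target:
        return (y - low) / (target - low)
    return (high - y) / (high - target)

# Hypothetical handle positions for Stretch with target 450:
print(target_desirability(450, 350, 450, 550))   # 1.0 at the target
print(target_desirability(396, 350, 450, 550))   # 0.46 at the default prediction
```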
You can adjust the factor traces by hand to change the predicted value of Stretch. Another convenient way to find good factor settings is to select Maximize Desirability from the Prediction Profiler popup menu. This command adjusts the profile traces to produce the response value closest to the specified target (the target given by the desirability function). Figure 5.14 shows the result of the most desirable settings. Changing the settings of Silica from 1.2 to 0.94512, Silane from 50 to 50.0038, and Sulfur from 2.3 to 2.11515 raised the predicted response from 396 to the target value of 450. Figure 5.14 Prediction Profiler for a Response Surface Analysis
A Response Surface Plot

Another way to look at the response surface is to use the Contour Profiler. The Contour Profiler command in the Factor Profiling menu brings up the interactive contour profiling facility shown in Figure 5.15. It is useful for optimizing response surfaces graphically, especially when there are multiple responses. This example shows the profile for Silica and Sulfur at a fixed value of Silane. Options on the Contour Profiler title bar can be used to set the grid density, request a surface plot (mesh plot), and add contours at specified intervals, as shown in the contour plot in Figure 5.15. The sliders for each factor set values for Current X and Current Y. The surface plots (mesh plots) at the bottom of the report illustrate the effect on the response surface when you set Silane to its minimum (40) and then to its maximum (60). This change in the surface shape clearly shows that there is interaction between Silane and the other factors.
(Surface plots shown at Silane=40 and Silane=60)
Figure 5.16 shows the Contour profile when the Current X values have the most desirable settings, as shown at the bottom in Figure 5.14.
The Prediction Profiler and the Contour Profiler are discussed in more detail in Chapter 11 of the Statistics and Graphics Guide, Standard Least Squares: Exploring the Prediction Equation.
6 Factorial
Chapter 6 Contents
The Factorial Dialog ................................................................................................................. 87 The Five-Factor Reactor Example............................................................................................ 88
When you finish adding factors, click Continue to see a panel of output options (shown to the right). When you click Make Table, the table shown in Figure 6.2 appears. Note that the values in the Pattern column describe the run each row represents. For continuous variables, plus or minus signs represent high and low levels. Level numbers represent the values of categorical variables.
(Figure annotations: minus sign for low level of a continuous factor; plus sign for high level of a continuous factor; level number for a categorical variable)
Use the Load Responses command from the popup menu on the Full Factorial Design title bar and open the Reactor Response.jmp file to get the response specifications. Likewise, use the Load Factors command and open the Reactor Factors.jmp file to get the Factors panel.
A full factorial design includes runs for all combinations of high and low values of the five factors, giving 32 runs. Click Continue to see the Output Options panel shown to the right. When you click Make Table, the JMP table in Figure 6.4 is constructed with a run for every combination of high and low values for the five variables, and an empty Y column for entering response values when the experiment is complete. The table has 32 rows, which cover all combinations of five factors with two levels each. The Reactor 32 Runs.jmp sample data file has these experimental runs and the results from the Box, Hunter, and Hunter study. Figure 6.4 shows the runs and the response data.
Begin the analysis with a quick look at the data before fitting the factorial model. The plot on the right shows a distribution of the response, Percent Reacted, using the Normal Quantile plot option of the Distribution command on the Analyze menu. Start the formal analysis with a stepwise regression. The data table has a script stored with it that automatically defines an analysis of the model that includes main effects and all two-factor interactions, and brings up the Stepwise control panel. To do this, choose Run Script from the Fit Model popup menu on the title bar of the Reactor 32 Runs.jmp table. The Stepwise Regression Control Panel appears with a preliminary Current Estimates report. Set the probability to enter a factor into the model to 0.05 (the default is 0.25), and the probability to remove a factor to 0.1. A useful way to use Stepwise is to check all the main effects in the Current Estimates table, and then use Mixed as the Direction for the stepwise process, which can both enter and remove factors from the model.
Change from default settings: Prob to Enter Factor is .05 Prob to Leave factor is .10 Mixed direction instead of Forward or Backward
To do this, click the check boxes for the main effects of the factors as shown in Figure 6.5, and click Go on the Stepwise control panel.
The Mixed stepwise procedure removes insignificant main effects and adds important interactions. The end result is shown in Figure 6.6. Note that the Feed Rate and Stir Rate factors are no longer in the model. Figure 6.6 Model After Mixed Stepwise Regression
Click the Make Model button to generate a new model dialog. The Model Specification dialog automatically has the effects identified by the stepwise model (Figure 6.7).
Click Run Model to see the analysis for a candidate prediction model. The figure to the right shows the whole-model leverage plot. The predicted model covers a range of predictions from 40% to 95% Reacted. The size of the random noise as measured by the RMSE is only 3.3311%, which is more than an order of magnitude smaller than the range of predictions. This is strong evidence that the model has good predictive capability. Figure 6.8 shows a table of model coefficients and their standard errors. All effects selected by the stepwise process are highly significant.
The Prediction Profiler also gives you a way to compare the factors and find optimal settings. Open the Prediction Profiler with the Profiler command on the Factor Profiling submenu on the Response title bar. The Prediction Profiler is discussed in more detail in Chapter 5, Response Surface Models in this book, and Chapter 11, Standard Least Squares: Exploring the Prediction Equation of the JMP Statistics and Graphics Guide. The top profile in Figure 6.9 shows the initial settings. An easy way to find optimal settings is to choose Desirability Functions from the popup menu on the profiler title bar, and then select Maximize Desirability, as shown here. These selections give the bottom profile in Figure 6.9. The plot of Desirability versus Percent Reacted shows that the goal is to maximize Percent Reacted. The reaction is economically infeasible unless the Percent Reacted is above 90%, so the Desirability for values less than 90% is 0. Desirability increases linearly as the Percent Reacted increases. The maximum Desirability is 0.9445 when Catalyst and Temperature are at their highest settings and Concentration is at its lowest setting. Percent Reacted increases from 65.5 at the center of the factor ranges to 95.2875 at the most desirable setting.
7 Taguchi
Chapter 7 Contents
The Taguchi Design Approach ................................................................................................. 99 Taguchi Design Example ......................................................................................................... 99 Analyze the Byrne-Taguchi Data ........................................................................................... 103
Table 7.1 Taguchi's Signal-to-Noise Ratios

Goal                          S/N Ratio
nominal is best               S/N = 10 log10(Ybar^2 / s^2)
larger-is-better (maximize)   S/N_LTB = -10 log10[(1/n) sum(1/Yi^2)]
smaller-is-better (minimize)  S/N_STB = -10 log10[(1/n) sum(Yi^2)]
Table 7.2 Definition of Adhesiveness Experiment Effects

Factor Name  Type     Levels  Comment
Interfer     control  3       tubing and connector interference
Wall         control  3       the wall thickness of the connector
IDepth       control  3       insertion depth of the tubing into the connector
Adhesive     control  3       percent adhesive
Time         noise    2       the conditioning time
Temp         noise    2       temperature
Humidity     noise    2       the relative humidity
The factors for the example are in the JMP file called Byrne Taguchi Factors.jmp, found in the DOE Sample Data folder. To start this example:
1) Open the factors table.
2) Choose Taguchi from the DOE main menu or toolbar, or click the Taguchi button on the DOE tab page of the JMP Starter.
3) Select Load Factors in the platform popup menu as shown here.
The Factors panel then shows the four three-level control (signal) factors and three noise factors listed in Figure 7.1.
Figure 7.1 Response and Signal and Noise Factors for the Byrne-Taguchi Example
When you click Continue, the list of available inner and outer array designs appears. This example uses the designs highlighted in the design choice panel shown to the right. L9-Taguchi gives the L9 orthogonal array for the inner design. The outer design has three two-level factors. A full factorial in eight runs is generated. However, it is only used as a guide to identify a new set of eight columns in the final JMP data table, one for each combination of levels in the outer design. Click Make Table to create the design table shown in Figure 7.2. The pull-off adhesive force measures are collected and entered into the new columns, shown in the bottom table of Figure 7.3. As a notational convenience, the Y column names are Y appended with the levels (+ or −) of the noise factors for that run. For example, Y− − − is the column of measurements taken with the three noise factors set at their low levels. Figure 7.2 Taguchi Design Before Data Entry
The column called SN Ratio Y is the performance statistic computed with the formula shown below. In this case, it is the larger-the-better (LTB) formula, which is −10 times the common logarithm of the average squared reciprocal:

SN Ratio Y = −10 log10 Mean(1/y− − −², 1/y− − +², 1/y− + −², 1/y− + +², 1/y+ − −², 1/y+ − +², 1/y+ + −², 1/y+ + +²)

This expression is large when all of the individual Y values are large.
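The computation is easy to reproduce. The Python sketch below applies the larger-the-better formula to one row of eight outer-array measurements; the function name and the data values are illustrative, not taken from JMP:

```python
import math

def sn_larger_is_better(ys):
    """Larger-the-better signal-to-noise ratio:
    -10 * log10 of the mean of the squared reciprocals.
    The ratio grows as the individual measurements grow."""
    mean_recip_sq = sum(1.0 / y ** 2 for y in ys) / len(ys)
    return -10.0 * math.log10(mean_recip_sq)

# Hypothetical pull-off force measurements for one inner-array run:
row = [15.6, 9.5, 16.9, 19.9, 19.6, 19.6, 20.0, 19.1]
print(round(sn_larger_is_better(row), 4))
```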
The data are now ready to analyze. The Table Property called Model in the Tables panel runs a JSL script that launches the Fit Model platform shown to the right. The default model includes the main effects of the four Signal factors. The two responses are the mean and S/N Ratio over the outer array. The goal of the analysis is to find factor settings that maximize both the mean and the S/N Ratio. The prediction profiler is a quick way to find settings that give the highest signal-to-noise ratio for this experiment. The default prediction profile has all the factors set to low levels, as shown in the top of Figure 7.4. The profile traces indicate that different settings of the first three factors would increase SN Ratio Y. The Prediction Profiler has a popup menu with options to help find the best settings for a given Desirability Function. The Desirability Functions option adds the row of traces and column of function settings to the profiler, as shown at the bottom in Figure 7.4. The default desirability functions are set to larger-is-better, which is what you want in this experiment. See Chapter 11, Standard Least Squares: Perspectives on the Estimates, in The JMP Statistics and Graphics Guide for more details about the Prediction Profiler. After the Desirability Functions option is in effect, you can choose Maximize Desirability, which automatically sets the prediction traces to give the best results according to the desirability functions. In this example, the settings for Interfer and Wall changed from L1 to L2, the IDepth setting changed from L1 to L3, and there was no change in Adhesive. These new settings increased the signal-to-noise ratio from 24.0253 to 29.9075.
8 Mixture
Because the proportions sum to one, mixture designs have an interesting geometry. The feasible region for a mixture takes the form of a simplex. For example, consider three factors in a 3-D graph. The plane where the three factors sum to one is a triangle-shaped slice, as illustrated in the diagram to the left. You can rotate the plane to see the triangle face-on and see the points in the form of a ternary plot.
The extreme vertices design is the most flexible, since it handles constraints on the values of the factors.
Chapter 8 Contents
The Mixture Design Dialog .................................................................................................... 107 Mixture Designs ..................................................................................................................... 108 Simplex Centroid Design ................................................................................................ 108 Simplex Lattice Design ................................................................................................... 110 Extreme Vertices ............................................................................................................. 112 Extreme Vertices Design for Constrained Factors ................................................................. 113 Adding Linear Constraints to Mixture Designs...................................................................... 114 Details on Extreme Vertices Method for Linear Constraints .......................................... 115 Ternary and Tetrary Plots ....................................................................................................... 115 Fitting Mixture Designs.......................................................................................................... 116 Whole Model Test and Anova Report ............................................................................. 117 Response Surface Reports ............................................................................................... 117 Chemical Mixture Example.................................................................................................... 118 Plotting a Mixture Response Surface ..................................................................................... 119
Simplex Centroid
You specify the degree up to which the factor combinations are to be made.
Simplex Lattice
You specify how many levels you want on each edge of the grid.
Extreme Vertices
You specify linear constraints or restrict the upper and lower bounds to be within the 0 to 1 range.
ABCD Design
This approach by Snee (1975) generates a screening design for mixtures. Figure 8.1 Mixture Design Selection Dialog
For Simplex Centroid, enter K; for Simplex Lattice, enter Levels; for Extreme Vertices, enter Degree.
The design table appears when you click a design type button. The following sections show examples of each mixture design type.
Mixture Designs
If the process of interest is determined by a mixture of components, the relative proportions of the ingredients, rather than their absolute amounts, need to be studied. In mixture designs, all the factors sum to 1.
A simplex centroid design of a given degree consists of mixture runs with:
- all one factor
- all combinations of two factors at equal levels
- all combinations of three factors at equal levels
- and so on, up to k factors at a time combined at k equal levels.
A center point run with equal amounts of all the ingredients is always included. The table of runs for a design of degree 1 with three factors (left in Figure 8.2) shows runs for each single ingredient followed by the center point. The table of runs to the right is for three factors of degree 2. The first three runs are for each single ingredient, the second set shows each combination of two ingredients in equal parts, and the last run is the center point. Figure 8.2 Three-Factor Simplex Centroid Designs of Degrees 1 and 2
Degree 1:
Run  X1     X2     X3
1    1      0      0
2    0      1      0
3    0      0      1
4    0.333  0.333  0.333

Degree 2:
Run  X1     X2     X3
1    1      0      0
2    0      1      0
3    0      0      1
4    0.5    0.5    0
5    0.5    0      0.5
6    0      0.5    0.5
7    0.333  0.333  0.333
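The run pattern generalizes directly. This Python sketch (not JMP code; the function name is made up) enumerates the runs of a simplex centroid design for any number of factors and degree:

```python
from itertools import combinations

def simplex_centroid(k, degree):
    """Runs of a simplex centroid design: for each subset of up to
    `degree` of the k factors, one run with those factors in equal
    parts, plus the overall centroid (all k factors equal)."""
    runs = []
    for size in range(1, degree + 1):
        for subset in combinations(range(k), size):
            run = [0.0] * k
            for i in subset:
                run[i] = 1.0 / size
            runs.append(run)
    runs.append([1.0 / k] * k)   # center point, always included
    return runs

print(len(simplex_centroid(3, 2)))   # 3 pure + 3 pairs + centroid = 7 runs
print(len(simplex_centroid(5, 4)))   # 5 + 10 + 10 + 5 + 1 = 31 runs
```

The counts match the three-factor tables above (4 and 7 runs) and the 31-run five-factor example later in the chapter.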
To generate the set of runs in Figure 8.2, choose the Mixture Design command from the DOE menu and enter three continuous factors. You should see the designs in Figure 8.3. Figure 8.3 Create Simplex Centroid Designs of Degrees 1 and 2
As another example, enter 5 for the number of factors and click Continue. When the Mixture Design dialog appears, the default value of K is 4, which is fine for this example. Click Simplex Centroid. When the design appears, click Make Table to see the 31-run JMP data table shown in Figure 8.4. Note that the first five runs have only one factor. The next ten runs have all the combinations of two factors. Then, there are ten runs for three-factor combinations, five runs for four-factor combinations, and (as always) the last run with all factors.
Figure 8.4 Data Table of Runs for Five-Factor Simplex Centroid Design
Figure 8.5 Three-Factor Simplex Lattice Designs for Factor Levels 3, 4, and 5
Figure 8.6 JMP Design Table for Simplex Lattice, Order (Degree) 3
Extreme Vertices
The extreme vertices design incorporates limits on factors into the design and picks the vertices and their averages formed by these limits as the design points. The additional limits are usually in the form of range constraints, upper bounds, and lower bounds on the factor values. The following example design table is for five factors with the constraints shown here, where the ranges are smaller than the default 0 to 1 range. Click Continue and enter 4 as the Degree. Figure 8.7 shows a partial listing of the JMP design table. Figure 8.7 JMP Design Table for Extreme Vertices with Range Constraints
Details on Extreme Vertices Method for Range Constraints If the only constraints are range constraints, the extreme vertices design is constructed using the XVERT method developed by Snee and Marquardt (1974) and Snee (1975). After the vertices are found, a simplex centroid method generates combinations of vertices up to a specified order. The XVERT method first creates a full 2^(nf-1) design using the given low and high values of the nf - 1 factors with the smallest ranges. Then, it computes the value of the one factor left out
based on the restriction that the factor values must sum to 1. It keeps the point if it is in that factor's range. If not, it increments or decrements it to bring it within range, and decrements or increments each of the other factors in turn by the same amount, keeping the points that still satisfy the initial restrictions. This algorithm creates the vertices of the feasible region in the simplex defined by the factor constraints. However, Snee (1975) has shown that it can also be useful to have the centroids of the edges and faces of the feasible region. A generalized n-dimensional face of the feasible region is defined by nf - n of the boundaries, and the centroid of a face is defined to be the average of the vertices lying on it. The algorithm generates all possible combinations of the boundary conditions and then averages over the vertices generated in the first step.
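The vertex step of this method can be sketched in Python. This is a simplified illustration of the idea just described, not JMP's actual XVERT implementation: build a two-level design on all but the widest-range factor, solve for the left-out factor, and adjust out-of-range points against each other factor in turn.

```python
from itertools import product

def xvert_vertices(lows, highs):
    """Sketch of the XVERT vertex step for range-constrained mixtures.
    lows/highs are the per-factor bounds; factor values must sum to 1."""
    nf = len(lows)
    ranges = [h - l for l, h in zip(lows, highs)]
    left_out = max(range(nf), key=lambda i: ranges[i])  # widest range is left out
    others = [i for i in range(nf) if i != left_out]
    vertices = []
    for levels in product(*[(lows[i], highs[i]) for i in others]):
        point = [0.0] * nf
        for i, v in zip(others, levels):
            point[i] = v
        rest = 1.0 - sum(levels)  # value forced on the left-out factor
        if lows[left_out] <= rest <= highs[left_out]:
            point[left_out] = rest
            vertices.append(point)
        else:
            # clamp the left-out factor into range and push the excess
            # onto each other factor in turn, keeping feasible points
            clamped = min(max(rest, lows[left_out]), highs[left_out])
            delta = rest - clamped
            for j in others:
                adj = point[:]
                adj[left_out] = clamped
                adj[j] += delta
                if lows[j] <= adj[j] <= highs[j]:
                    vertices.append(adj)
    return vertices

vertices = xvert_vertices([0.1, 0.1, 0.1], [0.7, 0.7, 0.7])
print(len(vertices))  # 6 vertices of the hexagonal feasible region
```

The bounds 0.1 to 0.7 on three factors are an assumed example; every returned vertex sums to 1 and respects the ranges.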
You first enter the upper and lower limits in the factors panel as shown here. Click Continue to see the Mixture Design dialog. The Extreme Vertices selection on the Mixture Design dialog has an additional button to add linear constraints. Click the Linear Constraints button for each constraint you have. In this example you need three constraint dialogs. Figure 8.8 shows constraints panels completed for each of the constraints given previously. After the constraints are entered, click Extreme Vertices, then Make Table to see the JMP table in Figure 8.8.
Figure 8.8 Constraints and Table of Runs for Snee (1979) Mixture Model Example
If there are only range constraints, check Add Linear Constraints to see the results of the CONSIM method, rather than the results from the XVERT method normally used by JMP.
Ternary Plots
The Piepel (1979) example is best understood by the ternary plot shown in Figure 8.9. Each constraint is a line. The area that satisfies all constraints is the shaded feasible area. There are six active constraints, six vertices, and six centroid points shown on the plot, as well as two inactive (redundant) constraints.
[Figure 8.9 ternary plot: vertices X1, X2, and X3; constraint boundary lines labeled .1 for X1, .7 for X2, .7 for X3, and .4 for .7*X1 + X3]
A mixture problem in three components can be represented in two dimensions because the third component is a linear function of the other two. A ternary plot shows how close a given component is to 1 by how close the point is to that component's vertex of the triangle. The plot in Figure 8.10 illustrates a ternary plot.
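The two-dimensional representation can be made concrete. The sketch below (illustrative, not JMP's plotting code) maps a three-component mixture onto triangle coordinates, with an assumed unit-side triangle whose vertices sit at (0, 0), (1, 0), and (0.5, sqrt(3)/2).

```python
import math

def ternary_xy(x1, x2, x3):
    """Map a 3-component mixture (summing to 1) onto 2-D simplex
    coordinates: x1 at (0,0), x2 at (1,0), x3 at (0.5, sqrt(3)/2)."""
    x = x2 + 0.5 * x3
    y = (math.sqrt(3) / 2.0) * x3
    return x, y

# a pure blend lands on its vertex; the centroid lands mid-triangle
print(ternary_xy(1, 0, 0))
print(ternary_xy(1 / 3, 1 / 3, 1 / 3))
```

The closer a point is to a vertex, the closer that component is to 1, which is exactly the reading rule described above.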
to suppress the intercept
to include all the linear main-effect terms
to exclude all the square terms (like X1*X1)
to include all the cross terms (like X1*X2)
This model is called the Scheffé polynomial (Scheffé 1958). This is the model JMP DOE creates and stores with the data table as a Table Property. This Table Property, called Model, runs the script to launch the Model Specification dialog, which is automatically filled with the saved model. In this model, the parameters are easy to interpret (Cornell 1990). The coefficients on the linear terms are the fitted response at the extreme points where the mixture is all one factor. The coefficients on the cross terms indicate the curvature across each edge of the factor space.
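The structure of the Scheffé quadratic polynomial can be sketched directly (an illustrative Python sketch with made-up coefficients, not JMP's fitting code): the model has no intercept or square terms, only linear blends and cross terms.

```python
def scheffe_terms(x):
    """Columns of the Scheffe quadratic mixture model: the linear
    terms plus every pairwise cross term (no intercept, no squares)."""
    n = len(x)
    linear = list(x)
    cross = [x[i] * x[j] for i in range(n) for j in range(i + 1, n)]
    return linear + cross

def predict(beta, x):
    """Evaluate the Scheffe polynomial with coefficient vector beta."""
    return sum(b * t for b, t in zip(beta, scheffe_terms(x)))

# hypothetical coefficients: linear (x1, x2, x3) then cross (x1x2, x1x3, x2x3)
beta = [7, 3, 5, 2, 4, 6]
print(predict(beta, (1, 0, 0)))      # a pure blend returns its linear coefficient
print(predict(beta, (0.5, 0.5, 0)))  # edge midpoint: average plus curvature term
```

At a pure blend the cross terms vanish, so the prediction is the linear coefficient itself; at an edge midpoint the prediction exceeds the average of the two pure blends by a quarter of the cross coefficient, which is the curvature interpretation given above.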
Select Mixture Design from the DOE menu or the JMP Starter DOE tab page. In the Factors panel, request 3 factors. Name them p1, p2, and p3, and enter the high and low constraints as shown here. Click Continue, then specify a degree of three in the Mixture Design Type dialog for an Extreme Vertices design. When you click Make Design, then Generate Table, JMP generates a table with the first 9 runs as shown here to the right.
For this problem, the experimenter added an extra 5 design runs by duplicating the vertex points and the center point, shown highlighted in the table, giving a total of 14 rows in the design table. After the experiment is complete, the results of the experiment (thickness) are entered in the Y column. Use the Plasticizer.jmp sample data to see the experimental results (Y values). To run the mixture model, either use the Table Property called Model, which runs a script that creates the completed Model Specification dialog, or choose Fit Model from the Analyze menu, select p1, p2, and p3 as mixture response surface effects, and Y as the Y variable. Then click Run Model, and when the model has run, choose Save Prediction Formula from the Save commands in the platform popup menu. The predicted values show as a new column in the data table. To see the prediction formula, open the formula for that column:
0 - 50.1465*p1 - 282.1982*p2 - 911.6484*p3 + p2*p1*317.363 + p3*p1*1464.3298 + p3*p2*1846.2177
Note: These results correct the coefficients reported in Cornell (1990).
When you fit the response surface model, the Response Surface Solution report shows that a maximum predicted value of 19.570299 occurs at the point (0.63505, 0.15568, 0.20927). You can visualize the results of a mixture design with the Profiler in the Fit Model platform and with a ternary plot, as described in the next section.
Add Centerpoints
Adding centerpoints is useful to check for curvature and to reduce the prediction error in the center of the factor region.
Replication
Replication provides a direct check on the assumption that the error variance is constant. It also reduces the variability of the regression coefficients in the presence of large process or measurement variability.
Foldover Design
A foldover design removes the confounding between two-factor interactions and main effects. This is especially useful as a follow-up to saturated or near-saturated fractional factorial or Plackett-Burman designs.
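The mechanics of a foldover can be sketched in a few lines (an illustrative sketch using ±1 factor coding, not JMP's internal code): reverse every sign in each run and append the reflected runs as a second block.

```python
def fold_over(design):
    """Append the sign-reversed copy of each +/-1 coded run.
    Block 1 is the original fraction, block 2 the foldover runs."""
    original = [(run[:], 1) for run in design]
    folded = [([-x for x in run], 2) for run in design]
    return original + folded

# assumed example: a 4-run half fraction of a 2^3 design
half = [[-1, -1, 1], [1, -1, -1], [-1, 1, -1], [1, 1, 1]]
full = fold_over(half)
print(len(full))  # 8 runs: the two blocks together form the full 2^3 factorial
```

Folding the half fraction recovers every run of the full factorial, which is why main effects come free of two-factor-interaction aliasing in the combined design.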
D-optimal Augmentation
D-optimal augmentation is a powerful tool for sequential design. Using this feature you can add terms to the original model and find optimal new test runs with respect to this expanded model. You can also group the two sets of experimental runs into separate blocks, which optimally blocks the second set with respect to the first. This chapter provides an overview of the interface of the Augment designer. It also presents a case study of design augmentation using the reactor example from Chapter 4, Screening Designs.
Chapter 9 Contents
The Augment Design Interface............................................................................................... 123 Replicate Design ............................................................................................................. 124 Add Centerpoints ............................................................................................................. 125 Fold Over ......................................................................................................................... 125 The Reactor Example Re-visited ............................................................................................ 126 Interface for D-Optimal Augmentation ........................................................................... 126 Analyze the Augmented Design...................................................................................... 130
9 Augment
After the file opens, the dialogs in Figure 9.2 prompt you to identify the factors and responses you want to use for the augmented design. Figure 9.2 Choose Columns for Factors and Responses
Select the columns that are model factors and click OK. Then select the column or columns that are responses. When you click OK again, the dialog below appears with the list of factors and factor values that were saved with the design data table. Buttons on the dialog give four choices for augmenting a design:
Replicate Design
The Replicate button displays the dialog shown here, where you enter the number of times to perform each run. Entering 2 specifies that each run appears twice in the resulting design, which is the same as one replicate. Figure 9.3 shows the Reactor data with one replicate.
Add Centerpoints
When you click Add Centerpoints, a dialog appears for you to enter the number of centerpoints you want. The table shown to the right is the design table for the reactor data with two center points appended to the end of the table.
Fold Over
When you select Foldover and click Make Data Table, the resulting JMP table has an extra column called Block, as shown in Figure 9.4. The first set of runs is block 1 and the new (foldover) runs are block 2. Note: Adding centerpoints or replicating the design also generates an additional Block column in the JMP table. Figure 9.4 Listing of a Foldover Design for the Reactor Data
To continue with the reactor analysis, choose 2nd from the Interactions popup menu as shown on the left in Figure 9.6, which adds all the two-factor interactions to the model. The minimum number of runs given the specified model is 16, as shown in the Design Generation text edit box. You can increase this number by clicking in the box and typing a new number. Figure 9.6 Augmented Model
When you click Make Design, the DOE facility computes D-optimally augmented factor settings, as shown in Figure 9.7.
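The criterion behind this computation can be sketched numerically. D-optimality ranks candidate run sets by the determinant of X'X, where X is the model matrix; the larger the determinant, the more precisely the model coefficients are estimated. The pure-Python sketch below is illustrative only — JMP's actual search uses a coordinate-exchange algorithm (Meyer and Nachtsheim 1995).

```python
def det(m):
    """Determinant by Gaussian elimination with partial pivoting
    (fine for the small matrices used here)."""
    m = [row[:] for row in m]
    n, d = len(m), 1.0
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(m[r][i]))
        if abs(m[p][i]) < 1e-12:
            return 0.0  # singular: some effect is not estimable
        if p != i:
            m[i], m[p] = m[p], m[i]
            d = -d
        d *= m[i][i]
        for r in range(i + 1, n):
            f = m[r][i] / m[i][i]
            for c in range(i, n):
                m[r][c] -= f * m[i][c]
    return d

def d_criterion(X):
    """|X'X| for model matrix X -- larger is better for D-optimality."""
    cols = len(X[0])
    xtx = [[sum(row[i] * row[j] for row in X) for j in range(cols)]
           for i in range(cols)]
    return det(xtx)

# assumed example: intercept plus two factors for a 2^2 full factorial
X = [[1, -1, -1], [1, 1, -1], [1, -1, 1], [1, 1, 1]]
print(d_criterion(X))  # 64.0 for this orthogonal design
```

Comparing this value across candidate augmentations is the essence of choosing D-optimal new runs.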
Note: The resulting design is a function of an initial random number seed. To reproduce the exact factor settings table in Figure 9.7 (or the most recent design you generated), choose Set Random Seed from the popup menu on the Augment Design title bar. A dialog shows the most recently used random number. Click OK to use that number again, or Cancel to generate a design based on a new random number. The dialog to the right shows the random number (1859832026) used to generate the runs in Figure 9.7.
Figure 9.8 shows the data table of the corresponding runs from the reactor example in Chapter 6, "Full Factorial Designs." The Reactor Augment Data.jmp sample data file contains these runs. The example analysis in the next section uses this data table. Figure 9.8 Completed Augmented Experiment
Click Go to start the stepwise regression process, which continues until all terms that meet the Prob to Enter and Prob to Leave criteria in the Stepwise Regression Control panel are entered into the model. Figure 9.10 shows the result of this example analysis. Note that Feed Rate and Stir Rate are out of the model, while the Temperature*Catalyst and Temperature*Concentration interactions have entered the model.
After Stepwise is finished, click Make Model on the Stepwise control panel to generate this reduced model, as shown in Figure 9.11. You can now fit the reduced model to do additional diagnostic work, make predictions, and find the optimal factor settings. Figure 9.11 New Prediction Model Dialog
The ANOVA and Lack of Fit Tests in Figure 9.12 indicate a highly significant regression model with no evidence of Lack of Fit.
Figure 9.12 Prediction Model Analysis of Variance and Lack of Fit Tests
The Scaled Estimates table in Figure 9.13 shows that Catalyst has the largest main effect. However, the significant two-factor interactions are of the same order of magnitude as the main effects. This is the reason that the initial screening experiment, shown in Chapter 4, Screening Designs, had ambiguous results. Figure 9.13 Prediction Model Estimates Plot
It is desirable to maximize the percent reaction. The prediction profile plot in Figure 9.14 shows that the maximum occurs at the high levels of Catalyst and Temperature and the low level of Concentration. When you drag the prediction traces for each factor to their maximum settings, the estimate of Percent Reacted increases from 65.375 to 95.6635.
To summarize, compare the analysis of 16 runs with the analyses of reactor data from previous chapters:
In Chapter 4, Screening Designs, the analysis of a screening design with only 8 runs produced a model with the five main effects and two interaction effects, with confounding. None of the factor effects were significant, although the Catalyst effect was large enough to encourage collecting data for further runs.
In Chapter 6, Full Factorial Designs, a full factorial of the five two-level reactor factors (32 runs) was first subjected to a stepwise regression. This approach identified three main effects (Catalyst, Temperature, and Concentration) and two interactions (Temperature*Catalyst and Concentration*Temperature) as significant effects.
In this chapter, a D-optimal augmentation of the 8 screening runs produced 8 additional runs, and a stepwise analysis returned the same results as the analysis of 32 runs.
The bottom line is that only half as many runs yielded the same information. Thus, using an iterative approach to DOE can save time and money.
Testing that two samples have the same mean
Testing that there are differences in the means among k samples
The Power and Sample Size facility assumes that there are equal numbers of units in each group. You can also apply this facility to more general experimental designs, if they are balanced, and a number-of-parameters adjustment is specified.
Chapter 10 Contents
Prospective Power Analysis ................................................................................................... 137 Launch the Sample Size and Power facility ........................................................................... 137 Single-Sample Mean ....................................................................................................... 139 Two-Sample Means ......................................................................................................... 141 k-Sample Means .............................................................................................................. 142
Alpha is the significance level that prevents declaring a zero effect significant more than alpha portion of the time.
Error Standard Deviation is the unexplained random variation around the means.
Sample Size is how many experimental units (runs, or samples) are involved in the experiment.
Power is the probability of declaring a significant result.
Effect Size is how different the means are from each other or from the hypothesized value.
The Sample Size and Power facility in JMP helps estimate, in advance, the sample size needed, the power expected, or the effect size detectable in experimental situations with a single mean comparison, a two-sample comparison, or a comparison of k sample means. The Sample Size, Power command is on the DOE main menu (or toolbar) and on the DOE tab page of the JMP Starter. When you launch this facility, the dialog shown here appears with a button selection for three experimental situations. Each of these selections displays its own dialog that prompts for estimated parameter values and the desired computation.
10 Power
The Two Sample Means choice in the initial power dialog always requires values for Alpha and the error standard deviation (Error Std Dev), as shown here, and one or two of the other three values: Difference to detect, Sample Size, and Power. The power facility then calculates the missing item. If there are two unspecified fields, the power facility constructs a plot that shows the relationship between those two values:
power as a function of sample size, given a specific effect size
power as a function of effect size, given a sample size
effect size as a function of sample size, for a given power.
The Sample Size dialog asks for values depending on the initial choice of design:
Alpha
is the significance level, usually .05. This implies willingness to accept (if the true difference between groups is zero) that 5% (alpha) of the time a significant difference will be incorrectly declared.
Error Std Deviation
is the true residual error. Even though the true error is not known, the power calculations are an exercise in probability that calculates what might happen if the true values were as specified.
Extra Params
is only for multi-factor designs; leave this field zero in simple cases. In a multi-factor balanced design, in addition to fitting the means described in the situation, there are other factors whose extra parameters can be specified here. For example, in a three-factor two-level design with all three two-factor interactions, the number of extra parameters is five: two parameters for the extra main effects and three parameters for the interactions. In practice, the values entered here aren't very important unless the experiment is in a range where there are very few degrees of freedom for error.
Difference to Detect
is the smallest detectable difference (how small a difference you want to be able to declare statistically significant). For single sample problems this is the difference between the hypothesized value and the true value.
Sample Size
is the total number of observations (runs, experimental units, or samples). Sample size is not the number per group, but the total over all groups. Computed sample size numbers can have fractional values, which you need to adjust to real units. This is usually done by increasing the estimated sample size to the smallest number evenly divisible by the number of groups.
Power
is the probability of getting a statistic that will be declared statistically significant. Bigger power is better, but the cost is higher in sample size. Power is equal to alpha when the specified effect size is zero. You should go for powers of at least .90 or .95 if you can afford it. If an experiment requires considerable effort, plan so that the experimental design has the power to detect a sizable effect, when there is one.
Single-Sample Mean
Suppose there is a single sample and the goal is to detect a difference of 2, where the error standard deviation is 0.9, as shown in the left-hand dialog in Figure 10.1. To calculate the power when the sample size is 10, leave Power missing in the dialog and click Continue. The dialog on the right in Figure 10.1 shows the power is calculated to be 0.99998, rounding to 1.
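The calculation behind this example can be approximated in a few lines of Python. The sketch below uses a two-sided z-test normal approximation, not JMP's exact noncentral computation, so its answer differs slightly from JMP's for small samples; here both round to 1.

```python
import math

def normal_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def normal_quantile(p):
    """Inverse of the standard normal CDF by simple bisection."""
    lo, hi = -10.0, 10.0
    for _ in range(100):
        mid = (lo + hi) / 2.0
        if normal_cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

def one_sample_power(diff, sigma, n, alpha=0.05):
    """Normal-approximation power of a two-sided one-sample test
    of a mean difference `diff` with error std dev `sigma`."""
    z_a = normal_quantile(1.0 - alpha / 2.0)
    ncp = abs(diff) * math.sqrt(n) / sigma
    return normal_cdf(ncp - z_a) + normal_cdf(-ncp - z_a)

print(round(one_sample_power(2.0, 0.9, 10), 5))  # 1.0, matching the example
```

Note also that with a zero difference the formula returns alpha itself, which is the property stated in the Power definition above.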
To see a plot of the relationship of power and sample size, leave both Sample Size and Power missing and click Continue. Double click on the horizontal axis to get any desired scale. The right-hand graph in Figure 10.2 shows a range of sample sizes for which the power varies from about 0.2 to .95. Change the range of the curve by changing the range of the horizontal axis. For example, the plot on the right in Figure 10.2 has the horizontal axis scaled from 1 to 8, which gives a more typical looking power curve. Figure 10.2 A One-Sample Example
When only Sample Size is specified (Figure 10.3) and Difference to Detect and Power are left blank, a plot of power by difference appears. Figure 10.3 Plot of Power by Difference to Detect for a Given Sample Size
Two-Sample Means
The dialogs work similarly for two samples; the Difference to Detect is the difference between two means. Suppose the error standard deviation is 0.9 (as before), the desired detectable difference is 1, and the total sample size is 16. Leave Power blank and click Continue to see the power calculation, 0.5433, as shown in the dialog on the left in Figure 10.4. This is considerably lower than in the single-sample example because each mean is estimated with only half the sample size, and the comparison is between two random samples instead of one. To increase the power requires a larger sample. To find out how large, click Backup on the Power Calculation dialog, leave Sample Size and Power both blank, and examine the plot shown on the right in Figure 10.4. The crosshair tool estimates that a sample size of about 35 is needed to obtain a power of 0.9.
Figure 10.4 Plot of Power by Difference to Detect for a Given Sample Size
k-Sample Means
The k-sample situation can examine up to 10 means. The next example considers a situation where 4 means are expected to be about 10, 11, 12, and 13, and the Error Std Dev is 0.9. When a sample size of 16 is entered, the power calculation is 0.95. As before, if both Sample Size and Power are left blank, the power facility produces the power curve shown on the right in Figure 10.5, which confirms that a sample size of 16 looks acceptable. Notice that the effect size for the difference in means is 2.236, calculated as the square root of the sum of squared deviations from the grand mean. In this case it is the square root of (-1.5)^2 + (-0.5)^2 + (0.5)^2 + (1.5)^2, which is the square root of 5.
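The effect-size arithmetic in this example is easy to verify (an illustrative sketch, not JMP's code):

```python
import math

def k_sample_effect_size(means):
    """Effect size for k groups: square root of the sum of squared
    deviations of the group means from the grand mean."""
    grand = sum(means) / len(means)
    return math.sqrt(sum((m - grand) ** 2 for m in means))

print(round(k_sample_effect_size([10, 11, 12, 13]), 3))  # 2.236
```

The grand mean of 10, 11, 12, and 13 is 11.5, so the squared deviations sum to 5 and the effect size is the square root of 5.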
Figure 10.5 Prospective Power for k-Means and Plot of Power by Sample Size
References
Atkinson, A.C. and Donev, A.N. (1992), Optimum Experimental Designs, Oxford: Clarendon Press, p. 148.
Bose, R.C. (1947), "Mathematical Theory of the Symmetrical Factorial Design," Sankhya: The Indian Journal of Statistics, Vol. 8, Part 2, pp. 107-166.
Box, G.E.P. and Meyer, R.D. (1986), "An Analysis of Unreplicated Fractional Factorials," Technometrics 28, 11-18.
Box, G.E.P. and Draper, N.R. (1987), Empirical Model-Building and Response Surfaces, New York: John Wiley and Sons.
Box, G.E.P. (1988), "Signal-to-Noise Ratio, Performance Criteria, and Transformations," Technometrics 30, 1-40.
Box, G.E.P., Hunter, W.G., and Hunter, J.S. (1978), Statistics for Experimenters, New York: John Wiley and Sons, Inc.
Byrne, D.M. and Taguchi, G. (1986), ASQC 40th Anniversary Quality Control Congress Transactions, Milwaukee, WI: American Society of Quality Control, 168-177.
Chen, J., Sun, D.X., and Wu, C.F.J. (1993), "A Catalogue of Two-Level and Three-Level Fractional Factorial Designs with Small Runs," International Statistical Review, 61, 1, pp. 131-145, International Statistical Institute.
Cochran, W.G. and Cox, G.M. (1957), Experimental Designs, Second Edition, New York: John Wiley and Sons.
Cornell, J.A. (1990), Experiments with Mixtures, Second Edition, New York: John Wiley & Sons.
Daniel, C. (1959), "Use of Half-Normal Plots in Interpreting Factorial Two-Level Experiments," Technometrics, 1, 311-314.
Daniel, C. and Wood, F. (1980), Fitting Equations to Data, Revised Edition, New York: John Wiley and Sons, Inc.
Derringer, D. and Suich, R. (1980), "Simultaneous Optimization of Several Response Variables," Journal of Quality Technology, Oct 1980, 12:4, 214-219.
Haaland, P.D. (1989), Experimental Design in Biotechnology, New York: Marcel Dekker, Inc.
Hahn, G.J., Meeker, W.Q., and Feder, P.I. (1976), "The Evaluation and Comparison of Experimental Designs for Fitting Regression Relationships," Journal of Quality Technology, Vol. 8, No. 3, pp. 140-157.
John, P.W.M. (1972), Statistical Design and Analysis of Experiments, New York: Macmillan Publishing Company, Inc.
Johnson, M.E. and Nachtsheim, C.J. (1983), "Some Guidelines for Constructing Exact D-Optimal Designs on Convex Design Spaces," Technometrics 25, 271-277.
Jones, Bradley (1991), "An Interactive Graph for Exploring Multidimensional Response Surfaces," 1991 Joint Statistical Meetings, Atlanta, Georgia.
Khuri, A.I. and Cornell, J.A. (1987), Response Surfaces: Design and Analysis, New York: Marcel Dekker.
Lenth, R.V. (1989), "Quick and Easy Analysis of Unreplicated Fractional Factorials," Technometrics, 31, 469-473.
Mahalanobis, P.C. (1947), Sankhya: The Indian Journal of Statistics, Vol. 8, Part 2, April.
Myers, R.H. (1976), Response Surface Methodology, Boston: Allyn and Bacon.
Myers, R.H. (1988), Response Surface Methodology, Virginia Polytechnic and State University.
Meyer, R.K. and Nachtsheim, C.J. (1995), "The Coordinate Exchange Algorithm for Constructing Exact Optimal Designs," Technometrics, Vol. 37, pp. 60-69.
Meyer, R.D., Steinberg, D.M., and Box, G. (1996), "Follow-up Designs to Resolve Confounding in Multifactor Experiments," Technometrics, Vol. 38, No. 4, p. 307.
Mitchell, T.J. (1974), "An Algorithm for the Construction of D-Optimal Experimental Designs," Technometrics, 16:2, pp. 203-210.
Piepel, G.F. (1988), "Programs for Generating Extreme Vertices and Centroids of Linearly Constrained Experimental Regions," Journal of Quality Technology 20:2, 125-139.
Plackett, R.L. and Burman, J.P. (1947), "The Design of Optimum Multifactorial Experiments," Biometrika, 33, 305-325.
Scheffé, H. (1958), "Experiments with Mixtures," JRSS B 20, 344-360.
Snee, R.D. and Marquardt, D.W. (1974), "Extreme Vertices Designs for Linear Mixture Models," Technometrics, 16, 391-408.
Snee, R.D. and Marquardt, D.W. (1975), "Extreme Vertices Designs for Linear Mixture Models," Technometrics 16, 399-408.
Snee, R.D. (1975), "Experimental Designs for Quadratic Models in Constrained Mixture Spaces," Technometrics, 17:2, 149-159.
Snee, R.D. (1979), "Experimental Designs for Mixture Systems with Multicomponent Constraints," Commun. Statistics, A8(4), 303-326.
Snee, Ronald D. (1985), "Computer-Aided Design of Experiments - Some Practical Experiences," Journal of Quality Technology, Vol. 17, No. 4, October 1985, p. 231.
St. John, R.C. and Draper, N.R. (1975), "D-Optimality for Regression Designs: A Review," Technometrics, 17, pp. 15-23.
Taguchi, G. (1976), An Introduction to Quality Control, Nagoya, Japan: Central Japan Quality Control Association.