Launching the Evaluate Design Platform
To launch the Evaluate Design platform, select DOE > Evaluate Design. The launch window is shown in Figure 14.1.
Figure 14.1 Evaluate Design Launch Window
The launch window has the following features:
Y, Response
Enter the response columns.
X, Factor
Enter the effect columns.
The Evaluate Design Report
This section describes the Evaluate Design report. For complete details, see Creating a Custom Design in Building Custom Designs.
Factors
Lists the factors in the design and indicates whether each factor is continuous, nominal, or ordinal.
Model
Lists the terms in the model that you want to fit using the results of the experiment. The buttons let you add or remove terms from the model.
Alias Terms
Lists the alias terms used in computing the alias matrix and the correlations between model terms and alias terms. The buttons let you add or remove terms from the list of alias terms. By default, second-order interactions are included as alias terms.
Design
Lists the runs of the design, factor levels, and response values.
Design Evaluation
Gives diagnostics for assessing the design. For complete details of each section, see Understanding Design Evaluation in Building Custom Designs.
Prediction Variance Profile is a profiler of the relative variance of prediction as a function of each factor at fixed values of the other factors. See Figure 14.2. To see how the prediction variance changes, drag the vertical dotted lines of the factors. To find the maximum prediction variance in the design space, select Maximize Desirability on the red triangle menu.
Figure 14.2 Prediction Variance Profile
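The relative variance of prediction plotted in the profiler is the quadratic form x'(X'X)⁻¹x, where X is the model matrix for the design and x is the model-expanded vector of factor settings. The following sketch illustrates this computation in numpy on a hypothetical effect-coded 2x2 factorial with an interaction; it is a minimal illustration, not JMP's implementation.

```python
import numpy as np

# Hypothetical effect-coded 2x2 factorial; model: intercept, A, B, A*B.
X = np.array([[1, -1, -1,  1],
              [1, -1,  1, -1],
              [1,  1, -1, -1],
              [1,  1,  1,  1]], dtype=float)
XtX_inv = np.linalg.inv(X.T @ X)

def relative_prediction_variance(a, b):
    """Relative variance of prediction, x'(X'X)^-1 x, at factor settings (a, b)."""
    x = np.array([1.0, a, b, a * b])
    return float(x @ XtX_inv @ x)

print(relative_prediction_variance(0.0, 0.0))  # 0.25 at the center of the design space
print(relative_prediction_variance(1.0, 1.0))  # 1.00 at a corner
```

Dragging a factor's vertical line in the profiler corresponds to changing an argument of this function, and Maximize Desirability searches for the settings where the value is largest.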
Fraction of Design Space Plot shows how much of the model prediction variance lies above (or below) a given value. See Figure 14.3. This is most useful when there are multiple factors. It summarizes the prediction variance, showing the fractional design space for all the factors taken together.
Figure 14.3 Fraction of Design Space Plot
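The FDS curve can be approximated by Monte Carlo sampling: draw many points uniformly from the factor space, compute the relative prediction variance at each, sort the values, and plot them against the empirical fraction. The sketch below uses the same hypothetical 2x2 design as above.

```python
import numpy as np

X = np.array([[1, -1, -1,  1],
              [1, -1,  1, -1],
              [1,  1, -1, -1],
              [1,  1,  1,  1]], dtype=float)     # hypothetical 2x2 design, as above
XtX_inv = np.linalg.inv(X.T @ X)

rng = np.random.default_rng(1)
ab = rng.uniform(-1, 1, size=(10_000, 2))        # uniform sample of the factor space
xs = np.column_stack([np.ones(len(ab)), ab, ab[:, 0] * ab[:, 1]])
variances = np.sort(np.einsum("ij,jk,ik->i", xs, XtX_inv, xs))

# Plotting variances against fraction reproduces the shape of the FDS curve.
fraction = np.arange(1, variances.size + 1) / variances.size

# Prediction variance not exceeded over 90% of the design space:
print(variances[int(0.9 * variances.size) - 1])
```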
Prediction Variance Surface is a plot of the prediction variance surface as a function of the design factors. See Figure 14.4. Show or hide the controls by selecting Control Panel on the red triangle menu.
Figure 14.4 Prediction Variance Surface
Power Analysis shows the power associated with each model effect. See Figure 14.5. You can change the assumed Significance Level, Signal to Noise Ratio, and Error Degrees of Freedom. For complete details, see Power Analysis Report in Building Custom Designs.
Figure 14.5 Power Analysis
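Behind this report is the standard power calculation for a regression coefficient: under the alternative hypothesis, the F statistic for a coefficient follows a noncentral F distribution with noncentrality λ = δ²/(X'X)⁻¹ⱼⱼ, where δ is the signal-to-noise ratio (the coefficient divided by σ). The following sketch uses a hypothetical 2³ factorial with a main-effects model; it is a generic illustration of the calculation, not JMP's exact computation.

```python
import numpy as np
from itertools import product
from scipy.stats import f, ncf

# Hypothetical effect-coded 2^3 factorial; main-effects model.
levels = np.array(list(product([-1, 1], repeat=3)), dtype=float)
X = np.column_stack([np.ones(len(levels)), levels])

def coef_power(j, snr=1.0, alpha=0.05, dfe=4):
    """Power of the F test for coefficient j at the given signal-to-noise ratio."""
    cjj = np.linalg.inv(X.T @ X)[j, j]
    lam = snr**2 / cjj                        # noncentrality parameter
    fcrit = f.ppf(1 - alpha, 1, dfe)          # critical value under the null
    return float(ncf.sf(fcrit, 1, dfe, lam))

print(coef_power(j=1, snr=1.0, alpha=0.05, dfe=4))
```

Changing the Significance Level, Signal to Noise Ratio, or Error Degrees of Freedom in the report corresponds to changing alpha, snr, or dfe here.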
Variance Inflation Factors shows the variance inflation factors relative to the orthogonal effect coding. See Figure 14.6. These can differ from those produced by Fit Model. If all the values are 1, then the design is orthogonal.
Figure 14.6 Variance Inflation Factors
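As a point of reference, the classical variance inflation factor for a term is 1/(1 − R²ⱼ), where R²ⱼ comes from regressing that term's column on all the other non-intercept columns. The sketch below computes this classical version; JMP's values, which are taken relative to orthogonal effect coding, can differ, as noted above.

```python
import numpy as np

def vifs(X):
    """Classical VIFs, 1 / (1 - R^2_j), for each non-intercept column of X."""
    out = []
    for j in range(1, X.shape[1]):                     # column 0 is the intercept
        y, others = X[:, j], np.delete(X, j, axis=1)
        beta, *_ = np.linalg.lstsq(others, y, rcond=None)
        resid = y - others @ beta
        r2 = 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
        out.append(1 / (1 - r2))
    return np.array(out)

X = np.array([[1, -1, -1,  1],
              [1, -1,  1, -1],
              [1,  1, -1, -1],
              [1,  1,  1,  1]], dtype=float)           # hypothetical orthogonal design
print(vifs(X))                                          # all 1 for an orthogonal design
```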
Alias Matrix shows the alias matrix for the model terms and the alias terms. This matrix gives the confounding pattern for the design. See Figure 14.7.
Figure 14.7 Alias Matrix
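The alias matrix is A = (X'X)⁻¹X'X₂, where X holds the model terms and X₂ the alias terms; row i gives the bias that the alias terms contribute to the estimate of model term i. The sketch below uses a hypothetical 2³⁻¹ fraction with defining relation C = A*B, not the design discussed in this chapter.

```python
import numpy as np
from itertools import product

# Hypothetical 2^(3-1) fraction: generate C as A*B.
runs = np.array([(a, b, a * b) for a, b in product([-1, 1], repeat=2)], dtype=float)
X  = np.column_stack([np.ones(4), runs])               # model: intercept, A, B, C
X2 = np.column_stack([runs[:, 0] * runs[:, 1],         # alias terms: A*B, A*C, B*C
                      runs[:, 0] * runs[:, 2],
                      runs[:, 1] * runs[:, 2]])

alias = np.linalg.inv(X.T @ X) @ X.T @ X2
print(np.round(alias, 3))   # C row picks up A*B, B row picks up A*C, A row picks up B*C
```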
Color Map on Correlations shows the absolute value of the correlation between each model term and alias term. See Figure 14.8.
Figure 14.8 Color Map on Correlations
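The color map is the matrix of absolute pairwise correlations among the model and alias columns (the intercept is omitted because its correlation with other columns is undefined). A self-contained sketch on the same hypothetical fraction as above:

```python
import numpy as np
from itertools import product

runs = np.array([(a, b, a * b) for a, b in product([-1, 1], repeat=2)], dtype=float)
terms = {"A": runs[:, 0], "B": runs[:, 1], "C": runs[:, 2],
         "A*B": runs[:, 0] * runs[:, 1],
         "A*C": runs[:, 0] * runs[:, 2],
         "B*C": runs[:, 1] * runs[:, 2]}

M = np.column_stack(list(terms.values()))
cmap = np.abs(np.corrcoef(M, rowvar=False))   # 0 maps to blue, 1 to red
print(list(terms))
print(np.round(cmap, 2))
```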
Design Diagnostics shows D, G, and A efficiency values, the average variance of prediction, and the design creation time. See Figure 14.9.
Figure 14.9 Design Diagnostics
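Common textbook forms of two of these measures, scaled so that an orthogonal two-level design scores 100, are sketched below; JMP's exact formulas may differ in detail. G efficiency additionally requires the maximum prediction variance over the design space, which can be approximated with the Monte Carlo sample from the FDS sketch above.

```python
import numpy as np

def d_efficiency(X):
    """100 * |X'X|^(1/p) / n: overall precision of the coefficient estimates."""
    n, p = X.shape
    return 100 * np.linalg.det(X.T @ X) ** (1 / p) / n

def a_efficiency(X):
    """100 * p / (n * trace((X'X)^-1)): based on the average coefficient variance."""
    n, p = X.shape
    return 100 * p / (n * np.trace(np.linalg.inv(X.T @ X)))

X = np.array([[1, -1, -1,  1],
              [1, -1,  1, -1],
              [1,  1, -1, -1],
              [1,  1,  1,  1]], dtype=float)   # hypothetical orthogonal 2x2 design
print(d_efficiency(X), a_efficiency(X))        # 100.0 100.0
```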
Examples
This section contains examples of using the Evaluate Design platform to assess designs. The first example assesses the impact of lost runs, and the second example assesses the impact of changing the model.
Assessing the Impact of Lost Runs
An experiment was conducted to explore the effect of three factors (Silica, Sulfur, and Silane) on tennis ball bounciness (Stretch). After the experiment, the researcher learned that the two runs where Silica=0.7 and Silane=50 were not processed correctly. These runs could not be included in the analysis of the data.
Use the Evaluate Design platform to assess the impact of not including those two runs.
1. Open the Bounce Data.jmp sample data table found in the Design Experiment folder.
2. Select DOE > Evaluate Design.
3. Assign Stretch to the Y, Response role.
4. Assign Silica, Sulfur, and Silane to the X, Factor role.
5. Click OK.
6. Return to the data table and delete the rows for Silica=0.7 and Silane=50. These are rows 1 and 2.
7. Repeat steps 2-5 to evaluate the reduced experiment.
8. Compare the Fraction of Design Space Plot for both reports. See Figure 14.10.
Figure 14.10 Fraction of Design Space
For small fraction values, the prediction variances are similar. For larger values, the prediction variance for the reduced design is substantially higher.
9. One of the available diagnostics is the maximum prediction variance of the designs. In both reports, select Maximize Desirability from the Prediction Variance Profile red triangle menu.
Figure 14.11 shows the maximum prediction variance for both the complete and reduced designs. For both designs, one of the design points where the maximum prediction variance occurs is Silica=0.7, Sulfur=1.8, and Silane=40. The maximum prediction variance is 1.396 for the complete design, and 3.02 for the reduced design.
Figure 14.11 Prediction Variance Profilers
Note that for this example, the profiles are symmetric, except for the Silica profile in the reduced design. A symmetric profile means that there are other design points at which the prediction variance is maximized. For example, in the report for the complete design, change the Silane value to 60 and notice that the variance is still 1.396.
10. Correlations between the model terms and the alias terms can also help you evaluate a design. In both reports, open the Color Map On Correlations section to show a color map of the correlations.
Figure 14.12 shows the color maps for both designs. The absolute values of the correlations range from 0 (blue) to 1 (red). Place your cursor over a cell to see the value of the correlation. The color map for the reduced design has more cells with correlations that are farther from 0. For example, the correlation between Sulfur and Silica*Sulfur is <.0001 for the complete design, and 0.577 for the reduced design.
Figure 14.12 Correlations between Model and Alias Terms
11. The report also gives several efficiency measures. In both reports, open the Design Diagnostics section (Figure 14.13).
The complete design has higher efficiency values, and a lower average prediction variance.
Figure 14.13 Design Diagnostics
Given all these results, the lost runs appear to have a negative impact on the design.
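The effect seen in this example can be mimicked outside JMP. The sketch below uses a hypothetical 2³ factorial with a main-effects model (it does not reproduce the actual Bounce Data design) and shows how deleting two runs inflates the worst-case prediction variance:

```python
import numpy as np
from itertools import product

def max_point_variance(X, points):
    """Largest relative prediction variance over the given model-expanded points."""
    XtX_inv = np.linalg.inv(X.T @ X)
    return max(float(x @ XtX_inv @ x) for x in points)

levels = np.array(list(product([-1, 1], repeat=3)), dtype=float)
X_full = np.column_stack([np.ones(8), levels])    # 8 runs, main-effects model
X_lost = X_full[2:]                               # the first two runs are "lost"

corners = X_full                                  # evaluate at the factorial corners
print(max_point_variance(X_full, corners))        # 0.5 for the complete design
print(max_point_variance(X_lost, corners))        # noticeably larger after the loss
```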
Assessing the Impact of Changing the Model
An experiment was conducted to investigate the effects of three factors (Brand, Time, and Power) on the number of kernels that pop in a bag of microwave popcorn. For this example, assume that each bag contains the same number of kernels. The goal is to find the combination of factor settings that maximizes the number of popped kernels. An initial model was created that includes several interactions and quadratic terms. From past studies, there is evidence to suggest that there is no quadratic effect for Time or Power, and no interactions with Brand. Use the Evaluate Design platform to assess the impact of removing these terms from the model.
1. Open the Popcorn DOE Results.jmp sample data table in the Design Experiment folder.
2. Select DOE > Evaluate Design.
3. Assign Brand, Time, and Power to the X, Factor role.
4. Assign Number Popped to the Y, Response role.
5. Click OK.
Note that the Model section contains main effects, quadratic terms, and interactions with Brand. The results are an assessment of the design relative to fitting the specified model.
6. Repeat steps 2-5.
7. In the Model section, remove the following terms: Power*Power, Time*Time, Brand*Time, and Brand*Power.
The results are an assessment of the design relative to fitting the reduced model.
The Fraction of Design Space Plot shows that for almost the entire design space, the reduced model has a lower prediction variance than the initial model. See Figure 14.14.
Figure 14.14 Fraction of Design Space
Open the Power Analysis section in both reports. For the reduced model report, enter 11 for the Error Degrees of Freedom. Because the reduced model has 4 fewer effects, the error degrees of freedom increase from 7 to 11. Figure 14.15 shows the results. For the reduced model, the design produces higher power for every effect. The difference is especially large for the Time*Power effect.
Figure 14.15 Power Analysis
Open the Design Diagnostics section in both reports. See Figure 14.16. For the reduced model, the design produces higher efficiency values, except for G efficiency. The A efficiency (which is related to the variance of the regression coefficients) is substantially larger for the reduced model. The average prediction variance for the reduced model is less than half of that for the initial model.
Figure 14.16 Design Diagnostics
The assumption is that the quadratic terms for Time and Power and the interaction terms with Brand are negligible. If this is correct, then using this design to fit the reduced model results in predictions with smaller variance and more powerful tests for the effects.
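The power gain from a smaller model has two sources: the removed terms no longer compete for information in the design, and the error degrees of freedom increase. The second effect alone can be sketched with the noncentral-F calculation shown earlier; the coefficient variance and signal-to-noise ratio below are hypothetical, chosen only to isolate the degrees-of-freedom effect.

```python
from scipy.stats import f, ncf

def coef_power(snr, dfe, cjj, alpha=0.05):
    """Noncentral-F power for one coefficient; cjj is its (X'X)^-1 diagonal entry."""
    lam = snr**2 / cjj
    return float(ncf.sf(f.ppf(1 - alpha, 1, dfe), 1, dfe, lam))

# Hold the coefficient variance fixed and free up 4 error df (7 -> 11),
# as happens when the four popcorn terms are dropped from the model.
print(coef_power(snr=1.0, dfe=7,  cjj=1 / 16))
print(coef_power(snr=1.0, dfe=11, cjj=1 / 16))   # higher power with more error df
```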
 