Multiple Regression Results - Advanced Tab
Multiple Regression - Computational Approach
Select the Advanced tab of the Multiple Regression Results dialog to access more detailed results of the specified analyses. Use the options on the Residuals/assumptions/prediction tab to perform residual analysis and to generate predictions for the dependent variable.
The Regression Summary for Dependent Variable spreadsheet displays the standardized (Beta) and non-standardized (B) regression coefficients (weights), their standard errors, and statistical significance. The Beta coefficients are the coefficients you would have obtained had you first standardized all of your variables to a mean of 0 and a standard deviation of 1. Thus, the magnitude of the Beta coefficients allows you to compare the relative contribution of each independent variable to the prediction of the dependent variable. The summary statistics for the regression analysis (e.g., R, R-square) are also displayed in the headers of this spreadsheet.
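The relationship between the B and Beta coefficients can be illustrated with a minimal NumPy sketch (the data and variable names here are synthetic, for illustration only; this is not STATISTICA's internal computation):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 2.0 * x1 + 0.5 * x2 + rng.normal(size=n)

# Non-standardized coefficients B: ordinary least squares with an intercept.
X = np.column_stack([np.ones(n), x1, x2])
B = np.linalg.lstsq(X, y, rcond=None)[0]  # [intercept, B1, B2]

# Beta coefficients: refit after z-scoring every variable to mean 0, sd 1.
z = lambda v: (v - v.mean()) / v.std(ddof=1)
Zx = np.column_stack([z(x1), z(x2)])
beta = np.linalg.lstsq(Zx, z(y), rcond=None)[0]
# Equivalently, Beta_j = B_j * sd(x_j) / sd(y) for each predictor j.
```

The last comment shows why Beta coefficients are comparable across predictors: each B is rescaled by the ratio of its predictor's standard deviation to that of the dependent variable.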
Note: variance inflation factor. The diagonal elements of the inverse correlation matrix (i.e., -1 times the diagonal elements of the sweep matrix displayed via this option) for variables that are in the equation are also sometimes called variance inflation factors (VIF; e.g., see Neter, Wasserman, & Kutner, 1985). This terminology reflects the fact that the variances of the standardized regression coefficients can be computed as the product of the residual variance (for the correlation-transformed model) and the respective diagonal elements of the inverse correlation matrix. If the predictor variables are uncorrelated, the diagonal elements of the inverse correlation matrix are equal to 1.0; thus, for correlated predictors, these elements represent an "inflation factor" for the variance of the regression coefficients, due to the redundancy of the predictors.
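The note above can be verified directly: the VIFs are the diagonal of the inverse of the predictor correlation matrix. A short NumPy sketch with synthetic predictors (illustrative names and data, not STATISTICA output):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + 0.6 * rng.normal(size=n)   # strongly correlated with x1
x3 = rng.normal(size=n)                    # roughly uncorrelated with both

X = np.column_stack([x1, x2, x3])
R = np.corrcoef(X, rowvar=False)           # predictor correlation matrix
vif = np.diag(np.linalg.inv(R))            # VIFs = diagonal of R^-1

# Each VIF also equals 1 / tolerance = 1 / (1 - R²_j), where R²_j is the
# squared multiple correlation of predictor j with the other predictors.
```

Here the VIFs for x1 and x2 come out well above 1 (reflecting their redundancy), while the VIF for the uncorrelated x3 stays near 1, as the note describes.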
- The Beta in (the standardized regression coefficient for the respective variable if it were to enter the regression equation as an independent variable);
- The partial correlation (between the respective variable and the dependent variable, after controlling for all other independent variables in the equation);
- The semi-partial (part) correlation (the correlation of the unadjusted dependent variable with the respective variable, after controlling for all independent variables in the equation; matrices of partial and semi-partial (part) correlations can be computed in the General Linear Model (GLM) and General Regression Models (GRM) modules);
- The tolerance for the respective variable (defined as 1 minus the squared multiple correlation between the respective variable and all independent variables in the regression equation);
- The R-square (between the current variable and all other variables in the regression equation);
- The t-value associated with these statistics for the respective variable; and
- The statistical significance of the t-value.
These statistics are displayed first for the variables not currently in the regression equation, and then for the variables in the regression equation (if any).
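The tolerance, partial correlation, and semi-partial correlation defined in the list above can be sketched with NumPy via residualization (synthetic data and hypothetical variable names, for illustration only):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 300
x1, x2 = rng.normal(size=(2, n))
y = x1 + 0.5 * x2 + rng.normal(size=n)

def resid(v, *others):
    """Residual of v after regressing it (with intercept) on the listed variables."""
    X = np.column_stack([np.ones(len(v)), *others])
    return v - X @ np.linalg.lstsq(X, v, rcond=None)[0]

# Tolerance of x1: 1 minus the squared multiple correlation of x1
# with the other predictors, i.e. SS_resid / SS_total of that regression.
e = resid(x1, x2)
tol_x1 = (e ** 2).sum() / ((x1 - x1.mean()) ** 2).sum()

# Partial correlation of x1 with y, controlling x2 in both variables.
partial = np.corrcoef(resid(y, x2), resid(x1, x2))[0, 1]

# Semi-partial (part) correlation: y left unadjusted, only x1 residualized.
semipartial = np.corrcoef(y, resid(x1, x2))[0, 1]
```

Note that the semi-partial correlation can never exceed the partial correlation in absolute value, since only one of the two variables is adjusted for the other predictors.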
For example, if in the first analysis, variables 1 through 5 are selected as independent variables, and in the subsequent analysis with the same dependent variable you specify variables 1 through 3 as the independent variables, then choosing this option will display a spreadsheet with R-square, the R-square decrease, F to remove, and the number of variables removed in the single step (two in this example). If Forward stepwise or Backward stepwise regression is specified, the spreadsheet will contain the R-square increment (or decrease) at each step. Note that you must specify variables using the Model Definition dialog; this option is not available if you only select variables on the Multiple Linear Regression Startup Panel.
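The R-square decrease and F to remove reported for such a comparison follow the standard nested-model F test. A minimal NumPy sketch of that computation, using the example above with synthetic data (this illustrates the standard formula, not STATISTICA's internal code):

```python
import numpy as np

def r_squared(y, *predictors):
    """R-square of an OLS fit of y on the given predictors (with intercept)."""
    X = np.column_stack([np.ones(len(y)), *predictors])
    fitted = X @ np.linalg.lstsq(X, y, rcond=None)[0]
    return 1 - ((y - fitted) ** 2).sum() / ((y - y.mean()) ** 2).sum()

rng = np.random.default_rng(3)
n = 120
x1, x2, x3 = rng.normal(size=(3, n))
y = x1 + x2 + rng.normal(size=n)

r2_full = r_squared(y, x1, x2, x3)   # first analysis: all three predictors
r2_reduced = r_squared(y, x1)        # subsequent analysis: x2 and x3 removed

q, p_full = 2, 3                     # q variables removed; p_full in full model
f_to_remove = ((r2_full - r2_reduced) / q) / ((1 - r2_full) / (n - p_full - 1))
```

Under the null hypothesis that the removed variables contribute nothing, this statistic follows an F distribution with q and n - p_full - 1 degrees of freedom.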