Factor Analysis - Notes and Technical Information

Eigenvalues
 At the "heart" of factor analysis is the eigenvalue problem, which this program solves via the Householder method; see, for example, Golub and Van Loan (1983), Jacobs (1977), or Ralston and Wilf (1967, Vol. II). Eigenvalues are calculated via least squares procedures. The sum of the eigenvalues equals the sum of the diagonal elements of the matrix being analyzed (the correlation or covariance matrix), that is:

∑λj = trace(S) = ∑σjj

where

λj is the j'th eigenvalue,
S is the variance/covariance matrix or the correlation matrix, and
σjj are the diagonal elements of the variance/covariance matrix or the correlation matrix.
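The trace identity above can be checked numerically. The following is a minimal sketch in Python with NumPy (not part of STATISTICA), using a small hypothetical correlation matrix:

```python
import numpy as np

# Hypothetical 3x3 correlation matrix (diagonal elements are all 1).
R = np.array([
    [1.0, 0.6, 0.3],
    [0.6, 1.0, 0.5],
    [0.3, 0.5, 1.0],
])

# Eigenvalues of a symmetric matrix (eigvalsh is the symmetric solver).
eigenvalues = np.linalg.eigvalsh(R)

# The sum of the eigenvalues equals the trace (sum of diagonal elements).
print(np.isclose(eigenvalues.sum(), np.trace(R)))  # True
```

For a correlation matrix the trace is simply the number of variables, so the eigenvalues of a p-variable correlation matrix always sum to p.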
Matrix Ill-Conditioning and the Modified Correlation Matrix
 The Introductory Overviews discuss the issue of matrix ill-conditioning. If during the factoring process you receive the message that the correlation matrix cannot be inverted, the correlation matrix will be modified (to allow inversion). Specifically, a small constant is added to the diagonal elements of the correlation matrix until the determinant of that matrix is greater than 1.E-50. All subsequent computations are performed on the modified (slightly lowered) correlations; to restore the original exact correlation matrix, click Cancel to return to the Factor Analysis Startup Panel and then click OK to re-read the data file.
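The diagonal modification described above can be sketched as follows. This is an illustration only; the exact constant STATISTICA adds per step is not documented here, so the `ridge` increment below is a hypothetical choice:

```python
import numpy as np

def condition_matrix(R, ridge=1e-6, det_floor=1e-50):
    """Add a small constant to the diagonal until the determinant
    exceeds det_floor (a sketch of the modification described above)."""
    R = R.copy()
    while np.linalg.det(R) <= det_floor:
        R[np.diag_indices_from(R)] += ridge
    return R

# A singular correlation matrix: the second variable duplicates the first,
# so the determinant is exactly zero and the matrix cannot be inverted.
R = np.array([
    [1.0, 1.0],
    [1.0, 1.0],
])
R_mod = condition_matrix(R)
print(np.linalg.det(R_mod) > 1e-50)  # True
```

Because the diagonal is inflated while the off-diagonal entries stay fixed, the correlations implied by the modified matrix are slightly lower than the originals.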
Analyzing Covariance or Moment Matrices
STATISTICA also includes two additional modules for factor-analyzing covariance and moment matrices: The Principal Components and Classification Analysis (PCA) module will extract principal components from covariance matrices, and also allows the user to map new observations or variables into the computed factor space. The SEPATH Analysis (structural equations modeling) module will accept correlation, covariance, and moment matrices for the analysis of structured means. Also note that the STATISTICA Visual Basic library of matrix functions contains all necessary computational routines to extract principal components from any type of matrix.

Factor Scores

The matrix of the factor scores F is calculated as follows:

F = B Z

where

B is the matrix of the factor score coefficients, and
Z is the matrix of the standardized values (z scores) of the original raw data.

The matrix of the factor score coefficients B is computed via the so-called regression method. For details, see Harman (1976), p. 368.

For the Principal components extraction method, B is calculated as follows:

B = L D⁻¹ (for unrotated factor loadings)
B = L (L'L)⁻¹ (for rotated factor loadings)

where L is the matrix of the factor loadings, and D is a diagonal matrix that contains the eigenvalues for the extracted factors on the main diagonal.
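For the unrotated principal components case, the loading, coefficient, and score computations can be sketched as follows. This is an illustrative NumPy reconstruction, not STATISTICA's code; the data are randomly generated, and cases are placed in rows (so the score equation is applied as Z B in this orientation):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))                 # hypothetical raw data, cases in rows
Z = (X - X.mean(axis=0)) / X.std(axis=0)      # standardized values (z scores)

R = np.corrcoef(Z, rowvar=False)              # correlation matrix
eigvals, eigvecs = np.linalg.eigh(R)
order = np.argsort(eigvals)[::-1]             # sort eigenvalues descending
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

m = 2                                          # number of factors retained
L = eigvecs[:, :m] * np.sqrt(eigvals[:m])      # unrotated factor loadings
D_inv = np.diag(1.0 / eigvals[:m])             # D holds the retained eigenvalues

B = L @ D_inv                                  # factor score coefficients, B = L D^-1
F = Z @ B                                      # factor scores, one row per case

# Principal component scores are mutually uncorrelated.
print(np.allclose(np.corrcoef(F, rowvar=False), np.eye(m)))  # True
```

Note that B = L D⁻¹ simplifies to V D^(-1/2), where V holds the retained eigenvectors, since L = V D^(1/2).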

For all other extraction methods, B is calculated as follows:

B = R⁻¹ L

where R is the correlation matrix of the original data.
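A minimal sketch of this computation, using hypothetical values for R and L (the loadings here are made up for illustration, as if produced by some non-principal-components extraction). Solving the linear system R B = L is numerically preferable to forming R⁻¹ explicitly:

```python
import numpy as np

# Hypothetical correlation matrix of the original data.
R = np.array([
    [1.0, 0.6, 0.3],
    [0.6, 1.0, 0.5],
    [0.3, 0.5, 1.0],
])

# Hypothetical factor loadings for two extracted factors.
L = np.array([
    [0.7,  0.2],
    [0.8,  0.1],
    [0.6, -0.3],
])

# B = R^-1 L, computed by solving R B = L rather than inverting R.
B = np.linalg.solve(R, L)
print(np.allclose(R @ B, L))  # True
```

This is why the correlation matrix must be invertible (or be modified as described under Matrix Ill-Conditioning above) before factor score coefficients can be computed.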