Title: | Choose the Number of Principal Components via Reconstruction Error |
---|---|
Description: | One way to choose the number of principal components is via the reconstruction error. This package is designed mainly for this purpose. Graphical representation is also supported, plus some other principal component analysis related functions. References include: Jolliffe I.T. (2002). Principal Component Analysis. <doi:10.1007/b98835> and Mardia K.V., Kent J.T. and Bibby J.M. (1979). Multivariate Analysis. ISBN: 978-0124712522. London: Academic Press. |
Authors: | Michail Tsagris [aut, cre] |
Maintainer: | Michail Tsagris <[email protected]> |
License: | GPL (>= 2) |
Version: | 1.0 |
Built: | 2024-11-17 06:27:49 UTC |
Source: | CRAN |
Package: | choosepc |
Type: | Package |
Version: | 1.0 |
Date: | 2023-10-22 |
License: | GPL-2 |
Michail Tsagris <[email protected]>.
Jolliffe I.T. (2002). Principal Component Analysis. <doi:10.1007/b98835>.
Choose the number of principal components via reconstruction error.
pc.choose(x, graph = TRUE)
x | A numerical matrix with more rows than columns. |
graph | Should the plot of the PRESS values appear? The default value is TRUE. |
SVD stands for Singular Value Decomposition of a rectangular matrix, that is, of any matrix, not only a square one, in contrast to the Spectral Decomposition with eigenvalues and eigenvectors produced by principal component analysis (PCA). Suppose we have an $n \times p$ matrix $X$. Then using SVD we can write the matrix as

$$X = UDV^{T},$$

where $U$ is an orthonormal matrix containing the eigenvectors of $XX^{T}$, $V$ is an orthonormal matrix containing the eigenvectors of $X^{T}X$, and $D$ is a $p \times p$ diagonal matrix containing the $r$ non-zero singular values $d_{1}, \ldots, d_{r}$ (the square roots of the eigenvalues) of $XX^{T}$ (or $X^{T}X$), the remaining $p - r$ elements of the diagonal being zero. We remind the reader that the maximum rank of an $n \times p$ matrix is equal to $\min\{n, p\}$. Using the SVD decomposition above, each column of $X$ can be written as

$$x_{j} = \sum_{k=1}^{r} u_{k} d_{k} v_{jk}.$$

This means that we can reconstruct the matrix using fewer columns (if $n > p$) than it has:

$$\tilde{x}_{j}^{m} = \sum_{k=1}^{m} u_{k} d_{k} v_{jk}, \quad \text{where } m < r.$$

The reconstructed matrix will of course have some discrepancy, and it is the level of this discrepancy we are interested in. If we center the matrix $X$, that is, subtract the column means from every column, and perform the SVD again, we will see that the orthonormal matrix $V$ contains the eigenvectors of the covariance matrix of the original, un-centred matrix $X$.
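As a quick illustration of the decomposition and the partial reconstruction above, the following R snippet uses the base svd() function; the iris data are used purely as an example.

```r
## Verify X = U D V^T and reconstruct with m < r components.
x <- as.matrix(iris[, 1:4])
s <- svd(x)                                    # U in s$u, d_1,...,d_r in s$d, V in s$v
xr <- s$u %*% diag(s$d) %*% t(s$v)             # full reconstruction
all.equal(x, xr, check.attributes = FALSE)     # TRUE up to rounding error

m <- 2                                         # keep only the first m components
xm <- s$u[, 1:m] %*% diag(s$d[1:m]) %*% t(s$v[, 1:m])
```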
Coming back to the $n \times p$ matrix of observations and variables, the question was how many principal components to retain. We will give an answer to this using SVD to reconstruct the matrix. We describe the steps of this algorithm below.

1. Center the matrix by subtracting from each variable its mean, $Y = X - \bar{X}$.

2. Perform SVD on the centred matrix $Y$.

3. Choose a number $m$ from $1$ to $r$ (the rank of the matrix) and reconstruct the matrix. Let us denote by $\tilde{Y}^{m}$ the reconstructed matrix.

4. Calculate the sum of squared differences between the reconstructed and the original values:

$$\mathrm{PRESS}(m) = \sum_{i=1}^{n} \sum_{j=1}^{p} \left( \tilde{y}_{ij}^{m} - y_{ij} \right)^{2}, \quad m = 1, \ldots, r.$$

5. Plot $\mathrm{PRESS}(m)$ for all values of $m$ and choose graphically the number of principal components.
The graphical way of choosing the number of principal components is not the best, and there are alternative ways of making a decision (see for example Jolliffe (2002)).
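A minimal R sketch of the steps above follows, assuming plain base R and the hypothetical helper name press_svd(); it illustrates the reconstruction-error computation and is not the internal code of pc.choose().

```r
## Compute PRESS(m) for m = 1,...,r via SVD reconstruction (steps 1-4).
press_svd <- function(x) {
  y <- scale(x, center = TRUE, scale = FALSE)   # step 1: center each variable
  s <- svd(y)                                   # step 2: SVD of the centred matrix
  r <- sum(s$d > 1e-10)                         # numerical rank
  press <- numeric(r)
  for (m in 1:r) {                              # step 3: reconstruct with m components
    ym <- s$u[, 1:m, drop = FALSE] %*% (s$d[1:m] * t(s$v[, 1:m, drop = FALSE]))
    press[m] <- sum((ym - y)^2)                 # step 4: sum of squared differences
  }
  press
}

x <- as.matrix(iris[, 1:4])
plot(press_svd(x), type = "b",
     xlab = "Number of components", ylab = "PRESS")   # step 5: choose graphically
```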
A list including:
values | The eigenvalues of the covariance matrix. |
cumprop | The cumulative proportion of the eigenvalues of the covariance matrix. |
per | The differences in the cumulative proportion of the eigenvalues of the covariance matrix. |
press | The reconstruction error for each number of principal components. |
runtime | The runtime of the algorithm. |
Michail Tsagris.
R implementation and documentation: Michail Tsagris <[email protected]>.
Jolliffe I.T. (2002). Principal Component Analysis. <doi:10.1007/b98835>.
x <- as.matrix(iris[, 1:4])
a <- pc.choose(x, graph = FALSE)
Confidence interval for the percentage of variance retained by the first $k$ components.
eigci(x, k, alpha = 0.05, B = 1000, graph = TRUE)
x | A numerical matrix with more rows than columns. |
k | The number of principal components to use. |
alpha | The significance level. Based on this, a $(1-\alpha)$ confidence interval will be computed. |
B | The number of bootstrap samples to generate. If B=1, no bootstrap is performed. |
graph | Should the plot of the bootstrap replicates appear? The default value is TRUE. |
The algorithm is taken from Mardia, Kent and Bibby (1979, pg. 233–234). The percentage retained by the first $k$ principal components, denoted by $\hat{\psi}$, is equal to

$$\hat{\psi} = \frac{\sum_{i=1}^{k} \hat{\lambda}_{i}}{\sum_{i=1}^{p} \hat{\lambda}_{i}}.$$

The $\hat{\psi}$ is asymptotically normal with mean $\psi$ and variance

$$\tau^{2} = \frac{2 \, \mathrm{tr}\left(\Sigma^{2}\right)}{(n-1)\left(\mathrm{tr}\,\Sigma\right)^{2}} \left( \psi^{2} - 2\alpha\psi + \alpha \right),$$

where $\alpha = \frac{\sum_{i=1}^{k} \lambda_{i}^{2}}{\sum_{i=1}^{p} \lambda_{i}^{2}}$ and $\mathrm{tr}\left(\Sigma^{2}\right) = \sum_{i=1}^{p} \lambda_{i}^{2}$.
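To make the formula concrete, here is a hedged sketch in base R of the asymptotic $(1-\alpha)$ confidence interval; psi_ci() is a hypothetical helper name, not part of the package.

```r
## Normal-theory CI for the percentage of variance retained by the first k PCs,
## using the Mardia, Kent and Bibby variance formula given above.
psi_ci <- function(x, k, alpha = 0.05) {
  n <- nrow(x)
  lam <- eigen(cov(x), symmetric = TRUE, only.values = TRUE)$values
  psi <- sum(lam[1:k]) / sum(lam)          # estimated percentage retained
  a <- sum(lam[1:k]^2) / sum(lam^2)        # the quantity alpha in the formula
  tau2 <- 2 * sum(lam^2) * (psi^2 - 2 * a * psi + a) / ((n - 1) * sum(lam)^2)
  psi + c(-1, 1) * qnorm(1 - alpha / 2) * sqrt(tau2)
}

x <- as.matrix(iris[, 1:4])
psi_ci(x, k = 2)   # CI for the variance retained by the first 2 components
```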
The bootstrap version provides an estimate of the bias, defined as the difference between the mean of the bootstrap values and the observed $\hat{\psi}$, and confidence intervals calculated via the percentile method and via the standard (or normal) method (Efron and Tibshirani, 1993). The function offers the option to perform the bootstrap (B > 1).
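The bootstrap version can be sketched as follows, again as an illustration assuming simple row resampling rather than the package's exact internals; psi_boot() is a hypothetical name.

```r
## Bootstrap bias, percentile CI and standard (normal) CI for psi-hat.
psi_hat <- function(x, k) {
  lam <- eigen(cov(x), symmetric = TRUE, only.values = TRUE)$values
  sum(lam[1:k]) / sum(lam)
}

psi_boot <- function(x, k, B = 1000, alpha = 0.05) {
  obs <- psi_hat(x, k)
  bt <- replicate(B, psi_hat(x[sample(nrow(x), replace = TRUE), ], k))
  list(bias = mean(bt) - obs,                            # bootstrap bias estimate
       percentile = quantile(bt, c(alpha / 2, 1 - alpha / 2)),
       standard = obs + c(-1, 1) * qnorm(1 - alpha / 2) * sd(bt))
}

x <- as.matrix(iris[, 1:4])
psi_boot(x, k = 2, B = 500)
```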
A list including:
res | If B=1 (no bootstrap), a vector with the estimated percentage of variance due to the first $k$ components and the relevant confidence interval. |
ci | This appears if B>1 (bootstrap). The standard (normal) bootstrap and the empirical (percentile) bootstrap confidence intervals. |
Further, if B>1 and "graph" was set equal to TRUE, a histogram with the bootstrap values, the observed value and its bootstrap estimate appears.
Michail Tsagris.
R implementation and documentation: Michail Tsagris <[email protected]>.
Mardia K.V., Kent J.T. and Bibby J.M. (1979). Multivariate Analysis. ISBN: 978-0124712522. London: Academic Press.
Efron B. and Tibshirani R. J. (1993). An introduction to the bootstrap. Chapman & Hall/CRC.
x <- as.matrix(iris[, 1:4])
eigci(x, k = 2, B = 1)