Title: | Sparse Principal Component Regression |
---|---|
Description: | Sparse principal component regression is computed. The regularization parameters are selected by cross-validation. |
Authors: | Shuichi Kawano |
Maintainer: | Shuichi Kawano <[email protected]> |
License: | GPL (>= 2) |
Version: | 2.1.1 |
Built: | 2024-10-31 22:25:57 UTC |
Source: | CRAN |
This function performs cross-validation for spcr. cv.spcr
enables us to determine the two regularization parameters
lambda.B and lambda.gamma objectively.
cv.spcr(x, y, k, w=0.1, xi=0.01, nfolds=5, adaptive=FALSE, center=TRUE, scale=FALSE, lambda.B.length=10, lambda.gamma.length=10, lambda.B=NULL, lambda.gamma=NULL)
x |
A data matrix. |
y |
A response vector. |
k |
The number of principal components. |
w |
Weight parameter with 0 <= w <= 1. The default is 0.1. |
xi |
The elastic net mixing parameter with 0 <= xi <= 1. The default is 0.01. |
nfolds |
The number of folds. The default is 5. |
adaptive |
If TRUE, the adaptive SPCR (aSPCR) is used. The default is FALSE. |
center |
If TRUE, the data matrix is centered before fitting. The default is TRUE. |
scale |
If TRUE, the data matrix is scaled before fitting. The default is FALSE. |
lambda.B.length |
The number of candidates for the parameter lambda.B. The default is 10. |
lambda.gamma.length |
The number of candidates for the parameter lambda.gamma. The default is 10. |
lambda.B |
Optional user-supplied candidates for the parameter lambda.B. |
lambda.gamma |
Optional user-supplied candidates for the parameter lambda.gamma. |
lambda.gamma.seq |
The values of lambda.gamma used in the cross-validation. |
lambda.B.seq |
The values of lambda.B used in the cross-validation. |
CV.mat |
Matrix of the mean cross-validated errors: rows correspond to the sequence of lambda.gamma, columns to the sequence of lambda.B. |
lambda.gamma.cv |
The value of lambda.gamma that minimizes the mean cross-validated error. |
lambda.B.cv |
The value of lambda.B that minimizes the mean cross-validated error. |
cvm |
The minimum of the mean cross-validated error. |
Shuichi Kawano
[email protected]
Kawano, S., Fujisawa, H., Takada, T. and Shiroishi, T. (2015). Sparse principal component regression with adaptive loading. Computational Statistics & Data Analysis, 89, 192–203.
spcr
#data
n <- 50
np <- 5
set.seed(1)
nu0 <- c(-1, 1)
x <- matrix( rnorm(np*n), n, np )
e <- rnorm(n)
y <- nu0[1]*x[ ,1] + nu0[2]*x[ ,2] + e

#fit
cv.spcr.fit <- cv.spcr(x=x, y=y, k=2)
cv.spcr.fit

#fit (adaptive SPCR)
cv.adaspcr.fit <- cv.spcr(x=x, y=y, k=2, adaptive=TRUE)
cv.adaspcr.fit
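The selected values can then be passed back to spcr to refit on the full data. A minimal sketch (assuming the spcr package is installed; the component names follow the Value list above):

```r
library(spcr)

# data, as in the example above
n <- 50; np <- 5
set.seed(1)
nu0 <- c(-1, 1)
x <- matrix(rnorm(np*n), n, np)
y <- nu0[1]*x[, 1] + nu0[2]*x[, 2] + rnorm(n)

# choose lambda.B and lambda.gamma by cross-validation,
# then refit on the full data with the selected values
cv.fit <- cv.spcr(x=x, y=y, k=2)
fit <- spcr(x=x, y=y, k=2,
            lambda.B=cv.fit$lambda.B.cv,
            lambda.gamma=cv.fit$lambda.gamma.cv)
```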
This function performs cross-validation for SPCR-glm. cv.spcrglm
enables us to determine the two regularization parameters
lambda.B and lambda.gamma objectively.
cv.spcrglm(x, y, k, family=c("binomial","poisson","multinomial"), w=0.1, xi=0.01, nfolds=5, adaptive=FALSE, q=1, center=TRUE, scale=FALSE, lambda.B.length=10, lambda.gamma.length=10, lambda.B=NULL, lambda.gamma=NULL)
x |
A data matrix. |
y |
A response vector. |
k |
The number of principal components. |
family |
Response type: "binomial", "poisson", or "multinomial". |
w |
Weight parameter with 0 <= w <= 1. The default is 0.1. |
xi |
The elastic net mixing parameter with 0 <= xi <= 1. The default is 0.01. |
nfolds |
The number of folds. The default is 5. |
adaptive |
If TRUE, the adaptive SPCR-glm (aSPCR-glm) is used. The default is FALSE. |
q |
The tuning parameter that controls weights in aSPCR-glm. The default is 1. |
center |
If TRUE, the data matrix is centered before fitting. The default is TRUE. |
scale |
If TRUE, the data matrix is scaled before fitting. The default is FALSE. |
lambda.B.length |
The number of candidates for the parameter lambda.B. The default is 10. |
lambda.gamma.length |
The number of candidates for the parameter lambda.gamma. The default is 10. |
lambda.B |
Optional user-supplied candidates for the parameter lambda.B. |
lambda.gamma |
Optional user-supplied candidates for the parameter lambda.gamma. |
lambda.gamma.seq |
The values of lambda.gamma used in the cross-validation. |
lambda.B.seq |
The values of lambda.B used in the cross-validation. |
CV.mat |
Matrix of the mean cross-validated errors: rows correspond to the sequence of lambda.gamma, columns to the sequence of lambda.B. |
lambda.gamma.cv |
The value of lambda.gamma that minimizes the mean cross-validated error. |
lambda.B.cv |
The value of lambda.B that minimizes the mean cross-validated error. |
cvm |
The minimum of the mean cross-validated error. |
Shuichi Kawano
[email protected]
Kawano, S., Fujisawa, H., Takada, T. and Shiroishi, T. (2018). Sparse principal component regression for generalized linear models. Computational Statistics & Data Analysis, 124, 180–196.
spcrglm
# binomial
n <- 100
np <- 3
nu0 <- c(-1, 1)
set.seed(4)
x <- matrix( rnorm(np*n), n, np )
y <- rbinom(n, 1, 1-1/(1+exp( nu0[1]*x[ ,1] + nu0[2]*x[ ,2] )))
cv.spcrglm.fit <- cv.spcrglm(x=x, y=y, k=1, family="binomial")
cv.spcrglm.fit

# Poisson
set.seed(5)
y <- rpois(n, 1)
cv.spcrglm.fit <- cv.spcrglm(x=x, y=y, k=1, family="poisson")
cv.spcrglm.fit

# multinomial
set.seed(4)
y <- sample(1:4, n, replace=TRUE)
cv.spcrglm.fit <- cv.spcrglm(x=x, y=y, k=1, family="multinomial")
cv.spcrglm.fit
This function computes a principal component regression model via sparse regularization.
spcr(x, y, k, lambda.B, lambda.gamma, w=0.1, xi=0.01, adaptive=FALSE, center=TRUE, scale=FALSE)
x |
A data matrix. |
y |
A response vector. |
k |
The number of principal components. |
lambda.B |
The regularization parameter for the parameter B. |
lambda.gamma |
The regularization parameter for the coefficient vector gamma. |
w |
Weight parameter with 0 <= w <= 1. The default is 0.1. |
xi |
The elastic net mixing parameter with 0 <= xi <= 1. The default is 0.01. |
adaptive |
If TRUE, the adaptive SPCR (aSPCR) is used. The default is FALSE. |
center |
If TRUE, the data matrix is centered before fitting. The default is TRUE. |
scale |
If TRUE, the data matrix is scaled before fitting. The default is FALSE. |
loadings.B |
The estimated loading matrix B. |
gamma |
The estimated coefficient vector gamma. |
gamma0 |
The estimated intercept. |
loadings.A |
The estimated loading matrix A. |
Shuichi Kawano
[email protected]
Kawano, S., Fujisawa, H., Takada, T. and Shiroishi, T. (2015). Sparse principal component regression with adaptive loading. Computational Statistics & Data Analysis, 89, 192–203.
cv.spcr
#data
n <- 100
np <- 5
set.seed(4)
nu0 <- c(-1, 1)
x <- matrix( rnorm(np*n), n, np )
e <- rnorm(n)
y <- nu0[1]*x[ ,1] + nu0[2]*x[ ,2] + e

#fit
spcr.fit <- spcr(x=x, y=y, k=2, lambda.B=6, lambda.gamma=2)
spcr.fit

#fit (adaptive SPCR)
adaspcr.fit <- spcr(x=x, y=y, k=2, lambda.B=6, lambda.gamma=2, adaptive=TRUE)
adaspcr.fit
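No predict method is documented for spcr objects, but fitted values can be reconstructed by hand from the returned components. A hedged sketch, under the assumption that the fitted model is y ≈ gamma0 + t(gamma) %*% t(B) %*% x applied to the centered data (the default center=TRUE):

```r
library(spcr)

# refit the example model
n <- 100; np <- 5
set.seed(4)
nu0 <- c(-1, 1)
x <- matrix(rnorm(np*n), n, np)
y <- nu0[1]*x[, 1] + nu0[2]*x[, 2] + rnorm(n)
spcr.fit <- spcr(x=x, y=y, k=2, lambda.B=6, lambda.gamma=2)

# hand-rolled in-sample fitted values (assumed model structure):
# center x as spcr did internally, project onto the sparse loadings,
# then apply the regression coefficients and the intercept
xc <- scale(x, center=TRUE, scale=FALSE)
y.hat <- spcr.fit$gamma0 +
  as.vector(xc %*% spcr.fit$loadings.B %*% spcr.fit$gamma)
```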
This function computes a principal component regression for generalized linear models via sparse regularization.
spcrglm(x, y, k, family=c("binomial","poisson","multinomial"), lambda.B, lambda.gamma, w=0.1, xi=0.01, adaptive=FALSE, q=1, center=TRUE, scale=FALSE)
x |
A data matrix. |
y |
A response vector. |
k |
The number of principal components. |
family |
Response type: "binomial", "poisson", or "multinomial". |
lambda.B |
The regularization parameter for the parameter B. |
lambda.gamma |
The regularization parameter for the coefficient vector gamma. |
w |
Weight parameter with 0 <= w <= 1. The default is 0.1. |
xi |
The elastic net mixing parameter with 0 <= xi <= 1. The default is 0.01. |
adaptive |
If TRUE, the adaptive SPCR-glm (aSPCR-glm) is used. The default is FALSE. |
q |
The tuning parameter that controls weights in aSPCR-glm. The default is 1. |
center |
If TRUE, the data matrix is centered before fitting. The default is TRUE. |
scale |
If TRUE, the data matrix is scaled before fitting. The default is FALSE. |
loadings.B |
The estimated loading matrix B. |
gamma |
The estimated coefficient vector gamma. |
gamma0 |
The estimated intercept. |
loadings.A |
The estimated loading matrix A. |
Shuichi Kawano
[email protected]
Kawano, S., Fujisawa, H., Takada, T. and Shiroishi, T. (2018). Sparse principal component regression for generalized linear models. Computational Statistics & Data Analysis, 124, 180–196.
cv.spcrglm
# binomial
n <- 100
np <- 5
nu0 <- c(-1, 1)
set.seed(4)
x <- matrix( rnorm(np*n), n, np )
y <- rbinom(n, 1, 1-1/(1+exp( nu0[1]*x[ ,1] + nu0[2]*x[ ,2] )))
spcrglm.fit <- spcrglm(x=x, y=y, k=2, family="binomial", lambda.B=2, lambda.gamma=1)
spcrglm.fit

# Poisson
set.seed(4)
y <- rpois(n, exp( nu0[1]*x[ ,1] + nu0[2]*x[ ,2] ))
spcrglm.fit <- spcrglm(x=x, y=y, k=2, family="poisson", lambda.B=2, lambda.gamma=1)
spcrglm.fit

# multinomial
set.seed(4)
y <- sample(1:4, n, replace=TRUE)
spcrglm.fit <- spcrglm(x=x, y=y, k=2, family="multinomial", lambda.B=2, lambda.gamma=2)
spcrglm.fit