Title: Information Criteria for Generalized Linear Regression
Description: Calculate various information criteria from the literature for "lm" and "glm" objects.
Authors: Fatih Saglam [aut, cre], Emre Dunder [aut]
Maintainer: Fatih Saglam <[email protected]>
License: MIT + file LICENSE
Version: 0.1.0
Built: 2024-12-10 06:48:52 UTC
Source: CRAN
Calculates Akaike Information Criterion (AIC) and its variants for "lm" and "glm" objects.
AIC(model)
AIC4(model)
model: a "lm" or "glm" object
AIC (Akaike, 1973) is calculated as AIC = -2 log L + 2k and AIC4 (Bozdogan, 1994) as AIC4 = -2 log L + 4k, where log L is the maximized log-likelihood and k is the number of estimated parameters.
Returns the AIC or AIC4 value of the model.
Akaike, H. (1973). Maximum likelihood identification of Gaussian autoregressive moving average models. Biometrika, 60(2), 255-265.
Bozdogan, H. (1994). Mixture-model cluster analysis using model selection criteria and a new informational measure of complexity. In Proceedings of the First US/Japan Conference on the Frontiers of Statistical Modeling: An Informational Approach (pp. 69-113). Dordrecht: Springer.
x1 <- rnorm(100, 3, 2)
x2 <- rnorm(100, 5, 3)
x3 <- rnorm(100, 67, 5)
err <- rnorm(100, 0, 4)

## round so we can use it for Poisson regression
y <- round(3 + 2*x1 - 5*x2 + 8*x3 + err)

m1 <- lm(y ~ x1 + x2 + x3)
m2 <- glm(y ~ x1 + x2 + x3, family = "gaussian")
m3 <- glm(y ~ x1 + x2 + x3, family = "poisson")

AIC(m1)
AIC(m2)
AIC(m3)
AIC4(m1)
AIC4(m2)
AIC4(m3)
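As a rough cross-check of the formulas above, AIC and AIC4 can also be computed by hand from the log-likelihood of m1 (a minimal sketch; k is taken from logLik(), which counts the dispersion parameter for Gaussian models and may differ from the package's internal choice):

ll <- logLik(m1)
k <- attr(ll, "df")             # number of estimated parameters
-2 * as.numeric(ll) + 2 * k     # AIC by hand
-2 * as.numeric(ll) + 4 * k     # AIC4 by hand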
Calculates Bayesian Information Criterion (BIC) and its variants (BICadj, BICQ) for "lm" and "glm" objects.
BIC(model)
BICadj(model)
BICQ(model, q = 0.25)
model: a "lm" or "glm" object
q: adjustment parameter for BICQ
BIC (Schwarz, 1978) is calculated as BIC = -2 log L + k log(n), adjusted BIC (Dziak et al., 2020) as BICadj = -2 log L + k log((n + 2)/24), and BICQ (Xu, 2010) as BICQ = -2 log L + k log(n) - 2k log(q/(1 - q)), where log L is the maximized log-likelihood, k is the number of estimated parameters, n is the sample size and q is the adjustment parameter.
Returns the BIC, BICadj or BICQ value of the model.
Dziak, J. J., Coffman, D. L., Lanza, S. T., Li, R., & Jermiin, L. S. (2020). Sensitivity and specificity of information criteria. Briefings in bioinformatics, 21(2), 553-565.
Xu, C. (2010). Model Selection with Information Criteria.
Schwarz, G. (1978). Estimating the dimension of a model. The Annals of Statistics, 6(2), 461-464. <doi:10.1214/aos/1176344136>
x1 <- rnorm(100, 3, 2)
x2 <- rnorm(100, 5, 3)
x3 <- rnorm(100, 67, 5)
err <- rnorm(100, 0, 4)

## round so we can use it for Poisson regression
y <- round(3 + 2*x1 - 5*x2 + 8*x3 + err)

m1 <- lm(y ~ x1 + x2 + x3)
m2 <- glm(y ~ x1 + x2 + x3, family = "gaussian")
m3 <- glm(y ~ x1 + x2 + x3, family = "poisson")

BIC(m1)
BIC(m2)
BIC(m3)
BICadj(m1)
BICadj(m2)
BICadj(m3)
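A hand computation of BIC and the adjusted BIC for m1, following the formulas above (a sketch; the adjusted-BIC penalty log((n + 2)/24) is the assumption here):

ll <- logLik(m1)
k <- attr(ll, "df")
n <- attr(ll, "nobs")
-2 * as.numeric(ll) + k * log(n)             # BIC by hand
-2 * as.numeric(ll) + k * log((n + 2) / 24)  # adjusted BIC by hand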
Calculates Consistent Akaike's Information Criterion (CAIC) and Consistent Akaike's Information Criterion with Fisher Information (CAICF) for "lm" and "glm" objects.
CAIC(model)
CAICF(model)
model: a "lm" or "glm" object
CAIC (Bozdogan, 1987) is calculated as CAIC = -2 log L + k (log(n) + 1) and CAICF (Bozdogan, 1987) as CAICF = -2 log L + k (log(n) + 2) + log|F|, where log L is the maximized log-likelihood, k is the number of estimated parameters, n is the sample size and F is the Fisher information matrix.
Returns the CAIC or CAICF value of the model.
Bozdogan, H. (1987). Model selection and Akaike's information criterion (AIC): The general theory and its analytical extensions. Psychometrika, 52(3), 345-370.
x1 <- rnorm(100, 3, 2)
x2 <- rnorm(100, 5, 3)
x3 <- rnorm(100, 67, 5)
err <- rnorm(100, 0, 4)

## round so we can use it for Poisson regression
y <- round(3 + 2*x1 - 5*x2 + 8*x3 + err)

m1 <- lm(y ~ x1 + x2 + x3)
m2 <- glm(y ~ x1 + x2 + x3, family = "gaussian")
m3 <- glm(y ~ x1 + x2 + x3, family = "poisson")

CAIC(m1)
CAIC(m2)
CAIC(m3)
CAICF(m1)
CAICF(m2)
CAICF(m3)
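CAIC for m1 can be checked by hand from the formula above (a sketch; k and n are taken from logLik()):

ll <- logLik(m1)
k <- attr(ll, "df")
n <- attr(ll, "nobs")
-2 * as.numeric(ll) + k * (log(n) + 1)   # CAIC by hand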
Calculates Fisher Information Criterion (FIC) for "lm" and "glm" objects.
FIC(model)
model: a "lm" or "glm" object
FIC (Wei, 1992) penalizes the fit of the model with the log-determinant of the Fisher information matrix of the regression coefficients; see Wei (1992) for the exact form.
Returns the FIC value of the model.
Wei, C. Z. (1992). On predictive least squares principles. The Annals of Statistics, 20(1), 1-42.
x1 <- rnorm(100, 3, 2)
x2 <- rnorm(100, 5, 3)
x3 <- rnorm(100, 67, 5)
err <- rnorm(100, 0, 4)

## round so we can use it for Poisson regression
y <- round(3 + 2*x1 - 5*x2 + 8*x3 + err)

m1 <- lm(y ~ x1 + x2 + x3)
m2 <- glm(y ~ x1 + x2 + x3, family = "gaussian")
m3 <- glm(y ~ x1 + x2 + x3, family = "poisson")

FIC(m1)
FIC(m2)
FIC(m3)
Calculates Generalized Cross-Validation (GCV) for "lm" and "glm" objects.
GCV(model)
model: a "lm" or "glm" object
GCV (Koc and Bozdogan, 2015) is calculated as GCV = RSS / (n (1 - k/n)^2), where RSS is the residual sum of squares, n is the sample size and k is the number of estimated parameters.
Returns the GCV value of the model.
Koc, E. K., & Bozdogan, H. (2015). Model selection in multivariate adaptive regression splines (MARS) using information complexity as the fitness function. Machine Learning, 101(1), 35-58.
x1 <- rnorm(100, 3, 2)
x2 <- rnorm(100, 5, 3)
x3 <- rnorm(100, 67, 5)
err <- rnorm(100, 0, 4)

## round so we can use it for Poisson regression
y <- round(3 + 2*x1 - 5*x2 + 8*x3 + err)

m1 <- lm(y ~ x1 + x2 + x3)
m2 <- glm(y ~ x1 + x2 + x3, family = "gaussian")
m3 <- glm(y ~ x1 + x2 + x3, family = "poisson")

GCV(m1)
GCV(m2)
GCV(m3)
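The same quantity can be sketched by hand for m1 (assuming k counts the regression coefficients only; the package may count parameters differently):

rss <- sum(residuals(m1)^2)
n <- length(residuals(m1))
k <- length(coef(m1))
rss / (n * (1 - k / n)^2)   # GCV by hand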
Calculates Haughton Bayesian information criterion (HBIC) for "lm" and "glm" objects.
HBIC(model)
model: a "lm" or "glm" object
HBIC (Bollen et al., 2014) is calculated as HBIC = -2 log L + k log(n / (2 pi)), where log L is the maximized log-likelihood, k is the number of estimated parameters and n is the sample size.
Returns the HBIC value of the model.
Bollen, K. A., Harden, J. J., Ray, S., & Zavisca, J. (2014). BIC and alternative Bayesian information criteria in the selection of structural equation models. Structural equation modeling: a multidisciplinary journal, 21(1), 1-19.
x1 <- rnorm(100, 3, 2)
x2 <- rnorm(100, 5, 3)
x3 <- rnorm(100, 67, 5)
err <- rnorm(100, 0, 4)

## round so we can use it for Poisson regression
y <- round(3 + 2*x1 - 5*x2 + 8*x3 + err)

m1 <- lm(y ~ x1 + x2 + x3)
m2 <- glm(y ~ x1 + x2 + x3, family = "gaussian")
m3 <- glm(y ~ x1 + x2 + x3, family = "poisson")

HBIC(m1)
HBIC(m2)
HBIC(m3)
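A hand computation for m1, mirroring the formula above (a sketch; the n/(2 pi) scaling is the assumption here):

ll <- logLik(m1)
k <- attr(ll, "df")
n <- attr(ll, "nobs")
-2 * as.numeric(ll) + k * log(n / (2 * pi))   # HBIC by hand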
Calculates Hannan-Quinn Information Criterion (HQIC) for "lm" and "glm" objects.
HQIC(model)
model: a "lm" or "glm" object
HQIC (Hannan and Quinn, 1979) is calculated as HQIC = -2 log L + 2k log(log(n)), where log L is the maximized log-likelihood, k is the number of estimated parameters and n is the sample size.
Returns the HQIC value of the model.
Hannan, E. J., & Quinn, B. G. (1979). The determination of the order of an autoregression. Journal of the Royal Statistical Society: Series B (Methodological), 41(2), 190-195.
x1 <- rnorm(100, 3, 2)
x2 <- rnorm(100, 5, 3)
x3 <- rnorm(100, 67, 5)
err <- rnorm(100, 0, 4)

## round so we can use it for Poisson regression
y <- round(3 + 2*x1 - 5*x2 + 8*x3 + err)

m1 <- lm(y ~ x1 + x2 + x3)
m2 <- glm(y ~ x1 + x2 + x3, family = "gaussian")
m3 <- glm(y ~ x1 + x2 + x3, family = "poisson")

HQIC(m1)
HQIC(m2)
HQIC(m3)
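And by hand for m1 (a sketch of the formula above):

ll <- logLik(m1)
k <- attr(ll, "df")
n <- attr(ll, "nobs")
-2 * as.numeric(ll) + 2 * k * log(log(n))   # HQIC by hand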
Calculates Information Matrix-Based Information Criterion (IBIC) for "lm" and "glm" objects.
IBIC(model)
model: a "lm" or "glm" object
IBIC (Bollen et al., 2012) is calculated as IBIC = -2 log L + k log(n / (2 pi)) + log|F|, where log L is the maximized log-likelihood, k is the number of estimated parameters, n is the sample size and F is the Fisher information matrix. While calculating the Fisher information matrix F, the joint parameters of the model (the regression coefficients together with the dispersion parameter, where applicable) are used.
Returns the IBIC value of the model.
Bollen, K. A., Ray, S., Zavisca, J., & Harden, J. J. (2012). A comparison of Bayes factor approximation methods including two new methods. Sociological Methods & Research, 41(2), 294-324.
x1 <- rnorm(100, 3, 2)
x2 <- rnorm(100, 5, 3)
x3 <- rnorm(100, 67, 5)
err <- rnorm(100, 0, 4)

## round so we can use it for Poisson regression
y <- round(3 + 2*x1 - 5*x2 + 8*x3 + err)

m1 <- lm(y ~ x1 + x2 + x3)
m2 <- glm(y ~ x1 + x2 + x3, family = "gaussian")
m3 <- glm(y ~ x1 + x2 + x3, family = "poisson")

IBIC(m1)
IBIC(m2)
IBIC(m3)
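The log|F| term can be approximated from the estimated covariance of the coefficients of m1, since the observed Fisher information of the coefficients is roughly the inverse of vcov() (a sketch; the package works with the joint-parameter information matrix, so its values will differ):

FI <- solve(vcov(m1))                       # approximate Fisher information of the coefficients
determinant(FI, logarithm = TRUE)$modulus   # log-determinant term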
Calculates Various Information Criteria for "lm" and "glm" objects.
IC(
  model,
  criteria = c("AIC", "BIC", "CAIC", "KIC", "HQIC", "FIC", "ICOMP_IFIM_C1",
               "ICOMP_PEU_C1", "ICOMP_PEU_LN_C1", "CICOMP_C1"),
  ...
)
model: a "lm" or "glm" object, or a list of such objects
criteria: a vector of criteria names; the respective index numbers can be used instead (the examples use 1:32 to request all criteria). Possible names are the criteria documented in this package, e.g. "AIC", "AIC4", "BIC", "BICadj", "BICQ", "CAIC", "CAICF", "FIC", "GCV", "HBIC", "HQIC", "IBIC", "JIC", "KIC", "KICC", "SPBIC" and the ICOMP/CICOMP variants such as "ICOMP_IFIM_C1" and "CICOMP_C1"
...: additional parameters. Currently none.
The model argument can be a single "lm" or "glm" object or a list of such objects. If it is a list, the function returns a matrix of the selected information criteria for all models.
Returns the selected information criteria of the model(s).
x1 <- rnorm(100, 3, 2)
x2 <- rnorm(100, 5, 3)
x3 <- rnorm(100, 67, 5)
err <- rnorm(100, 0, 4)

## round so we can use it for Poisson regression
y <- round(3 + 2*x1 - 5*x2 + 8*x3 + err)

m1 <- lm(y ~ x1 + x2 + x3)
m2 <- glm(y ~ x1 + x2 + x3, family = "gaussian")
m3 <- glm(y ~ x1 + x2 + x3, family = "poisson")

IC(model = m1, criteria = 1:32)
IC(model = list(lm = m1, glm = m2, glm_pois = m3), criteria = 1:32)
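Treating the list result as a matrix of criteria (one row per model, as described above), it can be used directly for model comparison; a small sketch assuming that layout and a "BIC" column name:

res <- IC(model = list(lm = m1, glm = m2, glm_pois = m3), criteria = c("AIC", "BIC"))
res                      # criteria side by side for each model
which.min(res[, "BIC"])  # model preferred by BIC under this assumed layout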
These functions calculate Informational Complexity (ICOMP) variants for "lm" and "glm" objects.
ICOMP(model, type = "IFIM", C = "C1")

ICOMP_IFIM_CF(model)
ICOMP_IFIM_C1(model)
ICOMP_IFIM_C1F(model)
ICOMP_IFIM_C1R(model)

ICOMP_PEU_CF(model)
ICOMP_PEU_C1(model)
ICOMP_PEU_C1F(model)
ICOMP_PEU_C1R(model)

ICOMP_PEU_LN_CF(model)
ICOMP_PEU_LN_C1(model)
ICOMP_PEU_LN_C1F(model)
ICOMP_PEU_LN_C1R(model)

CICOMP_CF(model)
CICOMP_C1(model)
CICOMP_C1F(model)
CICOMP_C1R(model)
model: a "lm" or "glm" object
type: type of ICOMP. Available types are "IFIM", "PEU", "PEU_LN" and "CICOMP". Default is "IFIM".
C: type of complexity. Available types are "CF", "C1", "C1F" and "C1R". Default is "C1".
ICOMP(IFIM) (Bozdogan, 2003) is calculated as -2 log L + 2 C(F^-1). ICOMP(IFIM-peu) (Koc and Bozdogan, 2015) and ICOMP(IFIM-peuln) (Bozdogan, 2010) add a further penalty on the number of parameters, the latter also scaling the complexity term by log(n), and CICOMP (Pamukcu et al., 2015) combines the CAIC penalty k (log(n) + 1) with the complexity term 2 C(F^-1). Here log L is the maximized log-likelihood, F^-1 is the inverse Fisher information matrix and C is a complexity measure.
Four complexity measures are available: C_F, C_1 and C_1F (Bozdogan, 2010) and C_1R (Bozdogan, 2000). C_1 is the maximal information-theoretic complexity, C_1(S) = (s/2) log(lambda_a / lambda_g), where lambda_a and lambda_g are the arithmetic and geometric means of the eigenvalues of S and s is the dimension of S. C_F and C_1F are Frobenius-norm based variants of this measure, and C_1R applies the complexity to the correlation matrix obtained from S (see the references for their exact forms).
While calculating the Fisher information matrix F, the joint parameters of the model (the regression coefficients beta together with the dispersion parameter) are used; where only the regression coefficients enter, the usual variance-covariance matrix of beta is used.
Returns the informational complexity (ICOMP) value of the model.
Bozdogan, H. (2003). Intelligent statistical data mining with information complexity and genetic algorithms. In Statistical Data Mining and Knowledge Discovery (pp. 47-88). Chapman and Hall/CRC.
Koc, E. K., & Bozdogan, H. (2015). Model selection in multivariate adaptive regression splines (MARS) using information complexity as the fitness function. Machine Learning, 101(1), 35-58.
Bozdogan, H. (2010). A new class of information complexity (ICOMP) criteria with an application to customer profiling and segmentation. İstanbul Üniversitesi İşletme Fakültesi Dergisi, 39(2), 370-398.
Pamukçu, E., Bozdogan, H., & Çalık, S. (2015). A novel hybrid dimension reduction technique for undersized high dimensional gene expression data sets using information complexity criterion for cancer classification. Computational and mathematical methods in medicine, 2015.
Bozdogan, H. (2000). Akaike's information criterion and recent developments in information complexity. Journal of mathematical psychology, 44(1), 62-91.
x1 <- rnorm(100, 3, 2)
x2 <- rnorm(100, 5, 3)
x3 <- rnorm(100, 67, 5)
err <- rnorm(100, 0, 4)

## round so we can use it for Poisson regression
y <- round(3 + 2*x1 - 5*x2 + 8*x3 + err)

m1 <- lm(y ~ x1 + x2 + x3)
m2 <- glm(y ~ x1 + x2 + x3, family = "gaussian")
m3 <- glm(y ~ x1 + x2 + x3, family = "poisson")

ICOMP_IFIM_CF(m1)
ICOMP_IFIM_CF(m2)
ICOMP_IFIM_CF(m3)
CICOMP_C1(m1)
CICOMP_C1(m2)
CICOMP_C1(m3)
ICOMP(m1, type = "PEU", C = "C1R")
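For intuition, the C_1 complexity can be sketched directly from the eigenvalues of a covariance matrix; a minimal example applying it to the coefficient covariance of m1 (the package works with the joint-parameter information matrix, so its reported values will differ):

C1 <- function(S) {
  lambda <- eigen(S, symmetric = TRUE, only.values = TRUE)$values
  s <- length(lambda)
  (s / 2) * log(mean(lambda) / exp(mean(log(lambda))))  # arithmetic vs. geometric mean of eigenvalues
}
-2 * as.numeric(logLik(m1)) + 2 * C1(vcov(m1))   # an ICOMP(IFIM)-style score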
Calculates the Joint Information Criterion (JIC) for "lm" and "glm" objects.
JIC(model)
model: a "lm" or "glm" object
JIC (Rahman and King, 1999) is an improved model selection criterion that combines the residual fit of the model with a sample-size dependent penalty; see Rahman and King (1999) for the exact form.
Returns the JIC value of the model.
Rahman, M. S., & King, M. L. (1999). Improved model selection criterion. Communications in Statistics-Simulation and Computation, 28(1), 51-71.
x1 <- rnorm(100, 3, 2)
x2 <- rnorm(100, 5, 3)
x3 <- rnorm(100, 67, 5)
err <- rnorm(100, 0, 4)

## round so we can use it for Poisson regression
y <- round(3 + 2*x1 - 5*x2 + 8*x3 + err)

m1 <- lm(y ~ x1 + x2 + x3)
m2 <- glm(y ~ x1 + x2 + x3, family = "gaussian")
m3 <- glm(y ~ x1 + x2 + x3, family = "poisson")

JIC(m1)
JIC(m2)
JIC(m3)
Calculates Kullback–Leibler Information Criterion (KIC) and its corrected form (KICC) for "lm" and "glm" objects.
KIC(model)
KICC(model)
model: a "lm" or "glm" object
KIC (Seghouane, 2006) penalizes the maximized log-likelihood at a rate of 3 per estimated parameter (compared with 2 for AIC), and KICC (Seghouane, 2006) is its bias-corrected version for small samples; see Seghouane (2006) for the exact penalty terms.
Returns the KIC or KICC value of the model.
Seghouane, A. K. (2006). A note on overfitting properties of KIC and KICC. Signal Processing, 86(10), 3055-3060.
x1 <- rnorm(100, 3, 2)
x2 <- rnorm(100, 5, 3)
x3 <- rnorm(100, 67, 5)
err <- rnorm(100, 0, 4)

## round so we can use it for Poisson regression
y <- round(3 + 2*x1 - 5*x2 + 8*x3 + err)

m1 <- lm(y ~ x1 + x2 + x3)
m2 <- glm(y ~ x1 + x2 + x3, family = "gaussian")
m3 <- glm(y ~ x1 + x2 + x3, family = "poisson")

KIC(m1)
KIC(m2)
KIC(m3)
KICC(m1)
KICC(m2)
KICC(m3)
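Because KIC charges roughly 3 units per extra parameter rather than AIC's 2, it tends to favor more parsimonious models; a quick sketch comparing a reduced model with the full model m1 from the example above (smaller values indicate the preferred model):

m_small <- lm(y ~ x1 + x2)
KIC(m_small)
KIC(m1)
KICC(m_small)
KICC(m1)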
Calculates Scaled Unit Information Prior Bayesian Information Criterion (SPBIC) for "lm" and "glm" objects.
SPBIC(model)
model: a "lm" or "glm" object
SPBIC (Bollen et al., 2012) replaces the usual BIC penalty with one derived from a scaled unit information prior, based on the quadratic form beta' Sigma^-1 beta, where beta and Sigma are the vector and the covariance matrix of the regression coefficients; see Bollen et al. (2012) for the exact form.
Returns the SPBIC value of the model.
Bollen, K. A., Ray, S., Zavisca, J., & Harden, J. J. (2012). A comparison of Bayes factor approximation methods including two new methods. Sociological Methods & Research, 41(2), 294-324.
x1 <- rnorm(100, 3, 2)
x2 <- rnorm(100, 5, 3)
x3 <- rnorm(100, 67, 5)
err <- rnorm(100, 0, 4)

## round so we can use it for Poisson regression
y <- round(3 + 2*x1 - 5*x2 + 8*x3 + err)

m1 <- lm(y ~ x1 + x2 + x3)
m2 <- glm(y ~ x1 + x2 + x3, family = "gaussian")
m3 <- glm(y ~ x1 + x2 + x3, family = "poisson")

SPBIC(m1)
SPBIC(m2)
SPBIC(m3)
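The quadratic form that drives the SPBIC penalty can be inspected directly for m1 (a sketch; how it enters the criterion is given in Bollen et al., 2012):

b <- coef(m1)
S <- vcov(m1)
drop(t(b) %*% solve(S) %*% b)   # beta' Sigma^-1 beta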