Title: | Time Series Analysis and Control Package |
---|---|
Description: | Functions for statistical analysis, prediction and control of time series based mainly on Akaike and Nakagawa (1988) <ISBN 978-90-277-2786-2>. |
Authors: | The Institute of Statistical Mathematics |
Maintainer: | Masami Saga <[email protected]> |
License: | GPL (>= 2) |
Version: | 1.3.8-4 |
Built: | 2024-11-10 06:28:20 UTC |
Source: | CRAN |
R functions for statistical analysis and control of time series.
This package provides functions for statistical analysis, prediction and control of time series. The original TIMSAC (TIMe Series Analysis and Control), or TIMSAC-72, was published in Akaike and Nakagawa (1972). After that, TIMSAC-74, TIMSAC-78 and TIMSAC-84 were published as the TIMSAC series in the Computer Science Monographs.
For an overview of the models and the information criteria for model selection, see ../doc/timsac-guide_e.pdf or ../doc/timsac-guide_j.pdf (in Japanese).
H.Akaike, E.Arahata and T.Ozaki (1975) Computer Science Monograph, No.5, Timsac74, A Time Series Analysis and Control Program Package (1). The Institute of Statistical Mathematics.
H.Akaike, E.Arahata and T.Ozaki (1975) Computer Science Monograph, No.6, Timsac74, A Time Series Analysis and Control Program Package (2). The Institute of Statistical Mathematics.
H.Akaike, G.Kitagawa, E.Arahata and F.Tada (1979) Computer Science Monograph, No.11, Timsac78. The Institute of Statistical Mathematics.
H.Akaike, T.Ozaki, M.Ishiguro, Y.Ogata, G.Kitagawa, Y-H.Tamura, E.Arahata, K.Katsura and Y.Tamura (1985) Computer Science Monograph, No.22, Timsac84 Part 1. The Institute of Statistical Mathematics.
H.Akaike and T.Nakagawa (1988) Statistical Analysis and Control of Dynamic Systems. Kluwer Academic publishers.
Air pollution data for testing perars.
data(Airpollution)
A time series of 372 observations.
H.Akaike, G.Kitagawa, E.Arahata and F.Tada (1979) Computer Science Monograph, No.11, Timsac78. The Institute of Statistical Mathematics.
Multivariate non-stationary data for testing blomar.
data(Amerikamaru)
A 2-dimensional array with 896 observations on 2 variables.
[, 1] | rudder |
[, 2] | yawing |
H.Akaike, G.Kitagawa, E.Arahata and F.Tada (1979) Computer Science Monograph, No.11, Timsac78. The Institute of Statistical Mathematics.
Fit an ARMA model with specified order by using DAVIDON's algorithm.
armafit(y, model.order)
y |
a univariate time series. |
model.order |
a numerical vector of the form c(ar, ma) which gives the order to be fitted successively. |
The maximum likelihood estimates of the coefficients of a scalar ARMA model

y(t) - a(1)y(t-1) - ... - a(p)y(t-p) = u(t) - b(1)u(t-1) - ... - b(q)u(t-q)

of a time series y(t) are obtained by using DAVIDON's algorithm. Pure autoregression is not allowed.
arcoef |
maximum likelihood estimates of AR coefficients. |
macoef |
maximum likelihood estimates of MA coefficients. |
arstd |
standard deviation (AR). |
mastd |
standard deviation (MA). |
v |
innovation variance. |
aic |
AIC. |
grad |
final gradient. |
H.Akaike, E.Arahata and T.Ozaki (1975) Computer Science Monograph, No.5, Timsac74, A Time Series Analysis and Control Program Package (1). The Institute of Statistical Mathematics.
# "arima.sim" is a function in "stats".
# Note that the sign of the MA coefficient is opposite from that in "timsac".
y <- arima.sim(list(order = c(2,0,1), ar = c(0.64,-0.8), ma = -0.5), n = 1000)
z <- armafit(y, model.order = c(2,1))
z$arcoef
z$macoef
Compute power spectrum estimates for two trigonometric windows of Blackman-Tukey type by the Goertzel method.
auspec(y, lag = NULL, window = "Akaike", log = FALSE, plot = TRUE)
y |
a univariate time series. |
lag |
maximum lag. Default is |
window |
character string giving the definition of smoothing window. Allowed strings are "Akaike" (default) or "Hanning". |
log |
logical. If |
plot |
logical. If |
Hanning Window : | a1(0)=0.5, | a1(1)=a1(-1)=0.25, | a1(2)=a1(-2)=0 |
Akaike Window : | a2(0)=0.625, | a2(1)=a2(-1)=0.25, | a2(2)=a2(-2)=-0.0625 |
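The window weights above act as a simple five-term smoother on a raw spectrum estimate. The following is a minimal sketch, not the auspec implementation: the helper smooth.spec and the use of stats::spec.pgram for the raw estimate are illustrative assumptions only.

# Apply the Akaike window weights a2(-2), ..., a2(2) to a raw spectrum estimate.
akaike.window <- c(-0.0625, 0.25, 0.625, 0.25, -0.0625)
smooth.spec <- function(pw, w = akaike.window) {
  n <- length(pw)
  pw.ext <- c(pw[3:2], pw, pw[(n-1):(n-2)])   # extend both ends by reflection
  sapply(seq_len(n), function(i) sum(w * pw.ext[i:(i + 4)]))
}
# Usage with a crude raw estimate from the periodogram of a simulated AR(2) series
y <- arima.sim(list(order = c(2,0,0), ar = c(0.64,-0.8)), n = 200)
pw <- spec.pgram(y, plot = FALSE)$spec
plot(smooth.spec(pw), type = "l")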
spec |
spectrum smoothing by ' |
stat |
test statistics. |
H.Akaike and T.Nakagawa (1988) Statistical Analysis and Control of Dynamic Systems. Kluwer Academic publishers.
y <- arima.sim(list(order = c(2,0,0), ar = c(0.64,-0.8)), n = 200)
auspec(y, log = TRUE)
Estimate autocovariances and autocorrelations.
autcor(y, lag = NULL, plot = TRUE, lag_axis = TRUE)
y |
a univariate time series. |
lag |
maximum lag. Default is |
plot |
logical. If |
lag_axis |
logical. If |
acov |
autocovariances. |
acor |
autocorrelations (normalized covariances). |
mean |
mean of |
H.Akaike and T.Nakagawa (1988) Statistical Analysis and Control of Dynamic Systems. Kluwer Academic publishers.
# Example 1 for the normal distribution
y <- rnorm(200)
autcor(y, lag_axis = FALSE)

# Example 2 for the ARIMA model
y <- arima.sim(list(order = c(2,0,0), ar = c(0.64,-0.8)), n = 200)
autcor(y, lag = 20)
Provide an automatic ARMA model fitting procedure. Models with various orders are fitted and the best choice is determined with the aid of the statistic AIC.
autoarmafit(y, max.order = NULL)
y |
a univariate time series. |
max.order |
upper limit of AR order and MA order. Default is |
The maximum likelihood estimates of the coefficients of a scalar ARMA model

y(t) - a(1)y(t-1) - ... - a(p)y(t-p) = u(t) - b(1)u(t-1) - ... - b(q)u(t-q)

of a time series y(t) are obtained by using DAVIDON's variance algorithm, where p is the AR order, q is the MA order and u(t) is a zero mean white noise. Pure autoregression is not allowed.
best.model |
the best choice of ARMA coefficients. |
model |
a list with components |
H.Akaike, E.Arahata and T.Ozaki (1975) Computer Science Monograph, No.5, Timsac74, A Time Series Analysis and Control Program Package (1). The Institute of Statistical Mathematics.
# "arima.sim" is a function in "stats".
# Note that the sign of the MA coefficient is opposite from that in "timsac".
y <- arima.sim(list(order = c(2,0,1), ar = c(0.64,-0.8), ma = -0.5), n = 1000)
autoarmafit(y)
Decompose a nonstationary time series into several possible components.
baysea(y, period = 12, span = 4, shift = 1, forecast = 0, trend.order = 2,
       seasonal.order = 1, year = 0, month = 1, out = 0, rigid = 1,
       zersum = 1, delta = 7, alpha = 0.01, beta = 0.01, gamma = 0.1,
       spec = TRUE, plot = TRUE, separate.graphics = FALSE)
y |
a univariate time series. |
period |
number of seasonals within a period. |
span |
number of periods to be processed at one time. |
shift |
number of periods to be shifted to define the new span of data. |
forecast |
length of forecast at the end of data. |
trend.order |
order of differencing of trend. |
seasonal.order |
order of differencing of seasonal. |
year |
trading-day adjustment option. |
month |
number of the month in which the series starts. If |
out |
outlier correction option. |
rigid |
controls the rigidity of the seasonal component; a larger value of rigid gives a more rigid (less variable) seasonal pattern. |
zersum |
controls the sum of the seasonals within a period. |
delta |
controls the leap year effect. |
alpha |
controls prior variance of initial trend. |
beta |
controls prior variance of initial seasonal. |
gamma |
controls prior variance of initial sum of seasonal. |
spec |
logical. If |
plot |
logical. If |
separate.graphics |
logical. If |
This function realizes a decomposition of the time series y into the form

y(t) = T(t) + S(t) + I(t) + TDC(t) + OCF(t),

where T(t) is the trend component, S(t) is the seasonal component, I(t) is the irregular component, TDC(t) is the trading day factor and OCF(t) is the outlier correction factor. For the purpose of comparison of models the criterion

ABIC = -2 log(maximum likelihood of the model)

is used. A smaller value of ABIC represents a better fit.
outlier |
outlier correction factor. |
trend |
trend. |
season |
seasonal. |
tday |
trading day component if |
irregular |
= |
adjust |
= |
smoothed |
= |
aveABIC |
averaged ABIC. |
irregular.spec |
a list with components |
adjusted.spec |
a list with components |
differenced.trend |
a list with components |
differenced.season |
a list with components |
H.Akaike, T.Ozaki, M.Ishiguro, Y.Ogata, G.Kitagawa, Y-H.Tamura, E.Arahata, K.Katsura and Y.Tamura (1985) Computer Science Monograph, No.22, Timsac84 Part 1. The Institute of Statistical Mathematics.
data(LaborData)
baysea(LaborData, forecast = 12)
Compute bi-spectrum using the direct Fourier transform of sample third order moments.
bispec(y, lag = NULL, window = "Akaike", log = FALSE, plot = TRUE)
y |
a univariate time series. |
lag |
maximum lag. Default is |
window |
character string giving the definition of smoothing window. Allowed strings are "Akaike" (default) or "Hanning". |
log |
logical. If |
plot |
logical. If |
Hanning Window : | a1(0)=0.5, | a1(1)=a1(-1)=0.25, | a1(2)=a1(-2)=0 |
Akaike Window : | a2(0)=0.625, | a2(1)=a2(-1)=0.25, | a2(2)=a2(-2)=-0.0625 |
pspec |
power spectrum smoothed by ' |
sig |
significance. |
cohe |
coherence. |
breal |
real part of bispectrum. |
bimag |
imaginary part of bispectrum. |
exval |
approximate expected value of coherence under Gaussian assumption. |
H.Akaike, E.Arahata and T.Ozaki (1975) Computer Science Monograph, No.6, Timsac74, A Time Series Analysis and Control Program Package (2). The Institute of Statistical Mathematics.
data(bispecData)
bispec(bispecData, lag = 30)
Univariate data for testing bispec and thirmo.
data(bispecData)
A time series of 1500 observations.
H.Akaike, E.Arahata and T.Ozaki (1976) Computer Science Monograph, No.6, Timsac74, A Time Series Analysis and Control Program Package (2). The Institute of Statistical Mathematics.
Locally fit autoregressive models to non-stationary time series by a Bayesian procedure.
blocar(y, max.order = NULL, span, plot = TRUE)
y |
a univariate time series. |
max.order |
upper limit of the order of AR model. Default is |
span |
length of basic local span. |
plot |
logical. If |
The basic AR model of the scalar time series y(t) (t = 1, ..., n) is given by

y(t) = a(1)y(t-1) + a(2)y(t-2) + ... + a(p)y(t-p) + u(t),

where p is the order of the model and u(t) is Gaussian white noise with mean 0 and variance v. At each stage of modeling of the locally AR model, a two-step Bayesian procedure is applied:

1. | Averaging of the models with different orders fitted to the newly obtained data. |
2. | Averaging of the models fitted to the present and preceding spans. |

AIC of the model fitted to the new span is defined by

AIC = ns log(v) + 2k,

where ns is the length of the new data, v is the innovation variance and k is the equivalent number of parameters, defined as the sum of squares of the Bayesian weights. AIC of the model fitted to the preceding spans is defined by

AIC(j) = ns log(v(j)) + 2 (j = 1, ..., number of preceding spans),

where v(j) is the prediction error variance of the model fitted to the span j periods earlier.
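As a rough illustration of the Bayesian averaging idea, AIC values of AR models of different orders can be turned into normalized weights. This is a minimal sketch assuming weights proportional to exp(-AIC/2); it need not match blocar's exact weighting, and stats::arima stands in for the internal fitting.

# Fit AR models of orders 1..8 and convert their AICs to normalized weights.
y <- arima.sim(list(order = c(2,0,0), ar = c(0.64,-0.8)), n = 300)
orders <- 1:8
aics <- sapply(orders, function(m) arima(y, order = c(m, 0, 0), include.mean = FALSE)$aic)
w <- exp(-0.5 * (aics - min(aics)))
w <- w / sum(w)                     # Bayesian weights over the candidate orders
round(rbind(order = orders, weight = w), 3)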
var |
variance. |
aic |
AIC. |
bweight |
Bayesian weight. |
pacoef |
partial autocorrelation. |
arcoef |
coefficients (averaged by the Bayesian weights). |
v |
innovation variance. |
init |
initial point of the data fitted to the current model. |
end |
end point of the data fitted to the current model. |
pspec |
power spectrum. |
G.Kitagawa and H.Akaike (1978) A Procedure for The Modeling of Non-Stationary Time Series. Ann. Inst. Statist. Math., 30, B, 351–363.
H.Akaike (1978) A Bayesian Extension of the Minimum AIC Procedure of Autoregressive Model Fitting. Research Memorandum No.126. The Institute of Statistical Mathematics.
H.Akaike, G.Kitagawa, E.Arahata and F.Tada (1979) Computer Science Monograph, No.11, Timsac78. The Institute of Statistical Mathematics.
data(locarData)
z <- blocar(locarData, max.order = 10, span = 300)
z$arcoef
Locally fit multivariate autoregressive models to non-stationary time series by a Bayesian procedure.
blomar(y, max.order = NULL, span)
y |
A multivariate time series. |
max.order |
upper limit of the order of AR model, less than or equal to |
span |
length of basic local span. Let |
The basic AR model is given by

y(t) = A(1)y(t-1) + A(2)y(t-2) + ... + A(p)y(t-p) + u(t),

where p is the order of the AR model and u(t) is Gaussian white noise with innovation variance v.
mean |
mean. |
var |
variance. |
bweight |
Bayesian weight. |
aic |
AIC with respect to the present data. |
arcoef |
AR coefficients. |
v |
innovation variance. |
eaic |
equivalent AIC of Bayesian model. |
init |
start point of the data fitted to the current model. |
end |
end point of the data fitted to the current model. |
G.Kitagawa and H.Akaike (1978) A Procedure for the Modeling of Non-stationary Time Series. Ann. Inst. Statist. Math., 30, B, 351–363.
H.Akaike (1978) A Bayesian Extension of the Minimum AIC Procedure of Autoregressive Model Fitting. Research Memorandum No.126. The Institute of Statistical Mathematics.
H.Akaike, G.Kitagawa, E.Arahata and F.Tada (1979) Computer Science Monograph, No.11, Timsac78. The Institute of Statistical Mathematics.
data(Amerikamaru)
blomar(Amerikamaru, max.order = 10, span = 300)
The BLSALLFOOD data (Bureau of Labor Statistics; all employees in food industries, January 1967 - December 1979).
data(Blsallfood)
A time series of 156 observations.
H.Akaike, T.Ozaki, M.Ishiguro, Y.Ogata, G.Kitagawa, Y-H.Tamura, E.Arahata, K.Katsura and Y.Tamura (1985) Computer Science Monograph, No.22, Timsac84 Part 1. The Institute of Statistical Mathematics.
Produce Bayesian estimates of time series models such as pure AR models, AR models with non-linear terms, AR models with polynomial type mean value functions, etc. The goodness of fit of a model is checked by the analysis of several steps ahead prediction errors.
bsubst(y, mtype, lag = NULL, nreg, reg = NULL, term.lag = NULL, cstep = 5,
       plot = TRUE)
y |
a univariate time series. |
mtype |
model type. Allowed values are |
lag |
maximum time lag. Default is |
nreg |
number of regressors. |
reg |
specification of regressor ( |
term.lag |
maximum time lag specifying the regressors ( |
cstep |
prediction errors checking (up to |
plot |
logical. If |
The AR model (mtype = 2) is given by

y(t) = a(1)y(t-1) + ... + a(p)y(t-p) + u(t).

The non-linear model (mtype = 2, 3) is given by

y(t) = a(1)z(t,1) + a(2)z(t,2) + ... + a(p)z(t,p) + u(t),

where z(t,i) denotes the i-th regressor, p is the AR order and u(t) is Gaussian white noise with mean 0 and variance v.
ymean |
mean of |
yvar |
variance of |
v |
innovation variance. |
aic |
AIC(m), (m=0, ... |
aicmin |
minimum AIC. |
daic |
AIC(m)- |
order.maice |
order of minimum AIC. |
v.maice |
innovation variance attained at |
arcoef.maice |
AR coefficients attained at |
v.bay |
residual variance of Bayesian model. |
aic.bay |
AIC of Bayesian model. |
np.bay |
equivalent number of parameters. |
arcoef.bay |
AR coefficients of Bayesian model. |
ind.c |
index of |
parcor2 |
square of partial correlations (normalized by multiplying N). |
damp |
binomial type damper. |
bweight |
final Bayesian weights of partial correlations. |
parcor.bay |
partial correlations of the Bayesian model. |
eicmin |
minimum EIC. |
esum |
whole subset regression models. |
npmean |
mean of the number of parameters. |
npmean.nreg |
= |
perr |
prediction error. |
mean |
mean. |
var |
variance. |
skew |
skewness. |
peak |
peakedness. |
peautcor |
autocorrelation function of 1-step ahead prediction error. |
pspec |
power spectrum ( |
H.Akaike, G.Kitagawa, E.Arahata and F.Tada (1979) Computer Science Monograph, No.11, Timsac78. The Institute of Statistical Mathematics.
data(Canadianlynx)
Regressor <- matrix(
  c(1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 1, 2, 1, 3, 1, 2, 3,
    0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 2, 2, 3, 1, 2, 3,
    0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 2, 3),
  nrow = 3, ncol = 19, byrow = TRUE)
z <- bsubst(Canadianlynx, mtype = 2, lag = 12, nreg = 19, Regressor)
z$arcoef.bay
A time series of Canadian lynx data for testing unimar, unibar, bsubst and exsar.
data(Canadianlynx)
A time series of 114 observations.
H.Akaike, G.Kitagawa, E.Arahata and F.Tada (1979) Computer Science Monograph, No.11, Timsac78. The Institute of Statistical Mathematics.
Fit an ARMA model to stationary scalar time series through the analysis of canonical correlations between the future and past sets of observations.
canarm(y, lag = NULL, max.order = NULL, plot = TRUE)
y |
a univariate time series. |
lag |
maximum lag. Default is |
max.order |
upper limit of AR order and MA order, must be less than or equal to |
plot |
logical. If |
The ARMA model of a stationary scalar time series y(t) (t = 1, ..., n) is given by

y(t) - a(1)y(t-1) - ... - a(p)y(t-p) = u(t) - b(1)u(t-1) - ... - b(q)u(t-q),

where p is the AR order and q is the MA order.
arinit |
AR coefficients of initial AR model fitting by the minimum AIC procedure. |
v |
innovation vector. |
aic |
AIC. |
aicmin |
minimum AIC. |
order.maice |
order of minimum AIC. |
parcor |
partial autocorrelation. |
nc |
total number of cases. |
future |
number of present and future variables. |
past |
number of present and past variables. |
cweight |
future set canonical weight. |
canocoef |
canonical R. |
canocoef2 |
R-squared. |
chisquar |
chi-square. |
ndf |
N.D.F. |
dic |
DIC. |
dicmin |
minimum DIC. |
order.dicmin |
order of minimum DIC. |
arcoef |
AR coefficients |
macoef |
MA coefficients |
H.Akaike, E.Arahata and T.Ozaki (1975) Computer Science Monograph, No.5, Timsac74, A Time Series Analysis and Control Program Package (1). The Institute of Statistical Mathematics.
# "arima.sim" is a function in "stats".
# Note that the sign of the MA coefficient is opposite from that in "timsac".
y <- arima.sim(list(order = c(2,0,1), ar = c(0.64,-0.8), ma = c(-0.5)), n = 1000)
z <- canarm(y, max.order = 30)
z$arcoef
z$macoef
Analyze canonical correlation of a d-dimensional multivariate time series.
canoca(y)
y |
a multivariate time series. |
First, an AR model is fitted by the minimum AIC procedure. The results are used to ortho-normalize the present and past variables. The present and future variables are tested successively to decide on the dependence of their predictors. When the last DIC (= chi-square - 2.0*N.D.F.) is negative, the predictor of the variable is judged to be linearly dependent on the antecedents.
aic |
AIC. |
aicmin |
minimum AIC. |
order.maice |
MAICE AR model order. |
v |
innovation variance. |
arcoef |
autoregressive coefficients. |
nc |
number of cases. |
future |
number of variable in the future set. |
past |
number of variables in the past set. |
cweight |
future set canonical weight. |
canocoef |
canonical R. |
canocoef2 |
R-squared. |
chisquar |
chi-square. |
ndf |
N.D.F. |
dic |
DIC. |
dicmin |
minimum DIC. |
order.dicmin |
order of minimum DIC. |
matF |
the transition matrix |
vectH |
structural characteristic vector |
matG |
the estimate of the input matrix |
vectF |
matrix |
H.Akaike, E.Arahata and T.Ozaki (1975) Computer Science Monograph, No.5, Timsac74, A Time Series Analysis and Control Program Package (1). The Institute of Statistical Mathematics.
ar <- array(0, dim = c(3,3,2))
ar[, , 1] <- matrix(c(0.4, 0, 0.3, 0.2, -0.1, -0.5, 0.3, 0.1, 0),
                    nrow = 3, ncol = 3, byrow = TRUE)
ar[, , 2] <- matrix(c(0, -0.3, 0.5, 0.7, -0.4, 1, 0, -0.5, 0.3),
                    nrow = 3, ncol = 3, byrow = TRUE)
x <- matrix(rnorm(1000*3), nrow = 1000, ncol = 3)
y <- mfilter(x, ar, "recursive")
z <- canoca(y)
z$arcoef
Produce the Fourier transform of a power gain function in the form of an autocovariance sequence.
covgen(lag, f, gain, plot = TRUE)
lag |
desired maximum lag of covariance. |
f |
frequency |
gain |
power gain of the filter at the frequency |
plot |
logical. If |
acov |
autocovariance. |
acor |
autocovariance normalized. |
H.Akaike, E.Arahata and T.Ozaki (1975) Computer Science Monograph, No.5, Timsac74, A Time Series Analysis and Control Program Package (1). The Institute of Statistical Mathematics.
spec <- raspec(h = 100, var = 1, arcoef = c(0.64,-0.8), plot = FALSE)
covgen(lag = 100, f = 0:100/200, gain = spec)
Decompose a nonstationary time series into several possible components by square-root filter.
decomp(y, trend.order = 2, ar.order = 2, seasonal.order = 1, period = 1,
       log = FALSE, trade = FALSE, diff = 1, miss = 0, omax = 99999.9,
       plot = TRUE, ...)
y |
a univariate time series with or without the tsp attribute. |
trend.order |
trend order (1, 2 or 3). |
ar.order |
AR order (less than 11, try 2 first). |
seasonal.order |
seasonal order (0, 1 or 2). |
period |
number of seasons in one period. If the tsp attribute of |
log |
logical; if |
trade |
logical; if |
diff |
numerical differencing (1 sided or 2 sided). |
miss |
missing value flag. |
omax |
maximum or minimum data value (if |
plot |
logical. If |
... |
graphical arguments passed to |
The Basic Model

y(t) = T(t) + AR(t) + S(t) + TD(t) + W(t),

where T(t) is the trend component, AR(t) is the AR process, S(t) is the seasonal component, TD(t) is the trading day factor and W(t) is the observational noise.

Component Models

Trend component (trend.order m1)
m1 = 1 : T(t) = T(t-1) + V1(t)
m1 = 2 : T(t) = 2T(t-1) - T(t-2) + V1(t)
m1 = 3 : T(t) = 3T(t-1) - 3T(t-2) + T(t-3) + V1(t)

AR component (ar.order m2)
AR(t) = a(1)AR(t-1) + ... + a(m2)AR(t-m2) + V2(t)

Seasonal component (seasonal.order k, frequency f)
k = 1 : S(t) = -S(t-1) - S(t-2) - ... - S(t-f+1) + V3(t)
k = 2 : the same seasonal difference operator applied twice

Trading day effect
TD(t) = b(1) TRADE(t,1) + ... + b(7) TRADE(t,7),

where TRADE(t,i) is the number of the i-th day of the week in the t-th data point and b(1) + ... + b(7) = 0.
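For intuition, the second-order trend model above (m1 = 2) can be simulated directly. This is a minimal sketch with an arbitrary system noise standard deviation, not part of decomp itself.

# Simulate T(t) = 2*T(t-1) - T(t-2) + V1(t) with small Gaussian system noise.
set.seed(1)
n  <- 200
v1 <- rnorm(n, sd = 0.05)
trend <- numeric(n)
for (t in 3:n) trend[t] <- 2 * trend[t - 1] - trend[t - 2] + v1[t]
plot(trend, type = "l")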
An object of class "decomp", which is a list with the following components:
trend |
trend component. |
seasonal |
seasonal component. |
ar |
AR process. |
trad |
trading day factor. |
noise |
observational noise. |
aic |
AIC. |
lkhd |
likelihood. |
sigma2 |
sigma^2. |
tau1 |
system noise variances |
tau2 |
system noise variances |
tau3 |
system noise variances |
arcoef |
vector of AR coefficients. |
tdf |
trading day factor. |
conv.y |
Missing values are replaced by NA after the specified logarithmic transformation. |
G.Kitagawa (1981) A Nonstationary Time Series Model and Its Fitting by a Recursive Filter. Journal of Time Series Analysis, Vol.2, 103-116.
W.Gersch and G.Kitagawa (1983) The Prediction of Time Series with Trends and Seasonalities. Journal of Business and Economic Statistics, Vol.1, 253-264.
G.Kitagawa (1984) A Smoothness Priors-State Space Modeling of Time Series with Trend and Seasonality. Journal of the American Statistical Association, Vol.79, No.386, 378-389.
data(Blsallfood)
y <- ts(Blsallfood, start = c(1967,1), frequency = 12)
z <- decomp(y, trade = TRUE)
z$aic
z$lkhd
z$sigma2
z$tau1
z$tau2
z$tau3
Produce exact maximum likelihood estimates of the parameters of a scalar AR model.
exsar(y, max.order = NULL, plot = FALSE)
y |
a univariate time series. |
max.order |
upper limit of AR order. Default is |
plot |
logical. If |
The AR model is given by

y(t) = a(1)y(t-1) + ... + a(p)y(t-p) + u(t),

where p is the AR order and u(t) is a zero mean white noise.
mean |
mean. |
var |
variance. |
v |
innovation variance. |
aic |
AIC. |
aicmin |
minimum AIC. |
daic |
AIC- |
order.maice |
order of minimum AIC. |
v.maice |
MAICE innovation variance. |
arcoef.maice |
MAICE AR coefficients. |
v.mle |
maximum likelihood estimates of innovation variance. |
arcoef.mle |
maximum likelihood estimates of AR coefficients. |
H.Akaike, G.Kitagawa, E.Arahata and F.Tada (1979) Computer Science Monograph, No.11, Timsac78. The Institute of Statistical Mathematics.
data(Canadianlynx)
z <- exsar(Canadianlynx, max.order = 14)
z$arcoef.maice
z$arcoef.mle
Compute auto and/or cross covariances and correlations via FFT.
fftcor(y, lag = NULL, isw = 4, plot = TRUE, lag_axis = TRUE)
y |
data of channel X and Y (data of channel Y is given for |
lag |
maximum lag. Default is |
isw |
numerical flag giving the type of computation. |
plot |
logical. If |
lag_axis |
logical. If |
acov |
auto-covariance. |
ccov12 |
cross-covariance. |
ccov21 |
cross-covariance. |
acor |
auto-correlation. |
ccor12 |
cross-correlation. |
ccor21 |
cross-correlation. |
mean |
mean. |
H.Akaike and T.Nakagawa (1988) Statistical Analysis and Control of Dynamic Systems. Kluwer Academic publishers.
# Example 1
x <- rnorm(200)
y <- rnorm(200)
xy <- array(c(x,y), dim = c(200,2))
fftcor(xy, lag_axis = FALSE)

# Example 2
xorg <- rnorm(1003)
x <- matrix(0, nrow = 1000, ncol = 2)
x[, 1] <- xorg[1:1000]
x[, 2] <- xorg[4:1003] + 0.5*rnorm(1000)
fftcor(x, lag = 20)
Perform FPE (Final Prediction Error) computation for a one-dimensional AR model.
fpeaut(y, max.order = NULL)
y |
a univariate time series. |
max.order |
upper limit of model order. Default is |
The AR model is given by

y(t) = a(1)y(t-1) + ... + a(p)y(t-p) + u(t),

where p is the AR order and u(t) is a zero mean white noise.
ordermin |
order of minimum FPE. |
best.ar |
AR coefficients with minimum FPE. |
sigma2m |
= |
fpemin |
minimum FPE. |
rfpemin |
minimum RFPE. |
ofpe |
OFPE. |
arcoef |
AR coefficients. |
sigma2 |
|
fpe |
FPE (Final Prediction Error). |
rfpe |
RFPE. |
parcor |
partial correlation. |
chi2 |
chi-squared. |
H.Akaike and T.Nakagawa (1988) Statistical Analysis and Control of Dynamic Systems. Kluwer Academic publishers.
y <- arima.sim(list(order = c(2,0,0), ar = c(0.64,-0.8)), n = 200)
fpeaut(y, max.order = 20)
Perform AR model fitting for control.
fpec(y, max.order = NULL, control = NULL, manip = NULL)
y |
a multivariate time series. |
max.order |
upper limit of model order. Default is |
control |
controlled variables. Default is |
manip |
manipulated variables. Default number of manipulated variables is |
cov |
covariance matrix rearrangement. |
fpec |
FPEC (AR model fitting for control). |
rfpec |
RFPEC. |
aic |
AIC. |
ordermin |
order of minimum FPEC. |
fpecmin |
minimum FPEC. |
rfpecmin |
minimum RFPEC. |
aicmin |
minimum AIC. |
perr |
prediction error covariance matrix. |
arcoef |
a set of coefficient matrices. |
H.Akaike and T.Nakagawa (1988) Statistical Analysis and Control of Dynamic Systems. Kluwer Academic publishers.
ar <- array(0, dim = c(3,3,2))
ar[, , 1] <- matrix(c(0.4, 0, 0.3, 0.2, -0.1, -0.5, 0.3, 0.1, 0),
                    nrow = 3, ncol = 3, byrow = TRUE)
ar[, , 2] <- matrix(c(0, -0.3, 0.5, 0.7, -0.4, 1, 0, -0.5, 0.3),
                    nrow = 3, ncol = 3, byrow = TRUE)
x <- matrix(rnorm(200*3), nrow = 200, ncol = 3)
y <- mfilter(x, ar, "recursive")
fpec(y, max.order = 10)
Labor force U.S. unemployed 16 years or over (1972-1978) data.
data(LaborData)
A time series of 72 observations.
H.Akaike, T.Ozaki, M.Ishiguro, Y.Ogata, G.Kitagawa, Y-H.Tamura, E.Arahata, K.Katsura and Y.Tamura (1985) Computer Science Monograph, No.22, Timsac84 Part 1. The Institute of Statistical Mathematics.
Non-stationary data for testing mlocar and blocar.
data(locarData)
A time series of 1000 observations.
H.Akaike, G.Kitagawa, E.Arahata and F.Tada (1979) Computer Science Monograph, No.11, Timsac78. The Institute of Statistical Mathematics.
Compute maximum likelihood estimates of Markovian model.
markov(y)
y |
a multivariate time series. |
This function is usually used with simcon.
id |
|
ir |
|
ij |
|
ik |
|
grad |
gradient vector. |
matFi |
initial estimate of the transition matrix |
matF |
transition matrix |
matG |
input matrix |
davvar |
DAVIDON variance. |
arcoef |
AR coefficient matrices. |
impulse |
impulse response matrices. |
macoef |
MA coefficient matrices. |
v |
innovation variance. |
aic |
AIC. |
H.Akaike, E.Arahata and T.Ozaki (1975) Computer Science Monograph, No.5, Timsac74, A Time Series Analysis and Control Program Package (1). The Institute of Statistical Mathematics.
x <- matrix(rnorm(1000*2), nrow = 1000, ncol = 2)
ma <- array(0, dim = c(2,2,2))
ma[, , 1] <- matrix(c(-1.0, 0.0, 0.0, -1.0), nrow = 2, ncol = 2, byrow = TRUE)
ma[, , 2] <- matrix(c(-0.2, 0.0, -0.1, -0.3), nrow = 2, ncol = 2, byrow = TRUE)
y <- mfilter(x, ma, "convolution")
ar <- array(0, dim = c(2,2,3))
ar[, , 1] <- matrix(c(-1.0, 0.0, 0.0, -1.0), nrow = 2, ncol = 2, byrow = TRUE)
ar[, , 2] <- matrix(c(-0.5, -0.2, -0.2, -0.5), nrow = 2, ncol = 2, byrow = TRUE)
ar[, , 3] <- matrix(c(-0.3, -0.05, -0.1, -0.30), nrow = 2, ncol = 2, byrow = TRUE)
z <- mfilter(y, ar, "recursive")
markov(z)
Applies linear filtering to a multivariate time series.
mfilter(x, filter, method = c("convolution","recursive"), init)
x |
a multivariate ( |
filter |
an array of filter coefficients. |
method |
either "convolution" or "recursive" (and can be abbreviated). If "convolution" a moving average is used: if "recursive" an autoregression is used. For convolution filters, the filter coefficients are for past value only. |
init |
specifies the initial values of the time series just prior to the start value, in reverse time order. The default is a set of zeros. |
This is a multivariate version of the "filter" function.
Missing values are allowed in 'x' but not in 'filter' (where they would lead to missing values everywhere in the output).
Note that there is an implied coefficient 1 at lag 0 in the recursive filter, which gives

y[i, ] = x[i, ] + f[, , 1] %*% y[i-1, ] + ... + f[, , p] %*% y[i-p, ].

No check is made to see if the recursive filter is invertible: the output may diverge if it is not. The convolution filter is

y[i, ] = f[, , 1] %*% x[i, ] + ... + f[, , p] %*% x[i-p+1, ].

mfilter returns a time series object.
'convolve(, type="filter")' uses the FFT for computations and so may be faster for long filters on univariate time series, but it does not return a time series (and so the time alignment is unclear), nor does it handle missing values. 'filter' is faster for a filter of length 100 on a series of length 1000, for example.
# AR model simulation
ar <- array(0, dim = c(3,3,2))
ar[, , 1] <- matrix(c(0.4, 0, 0.3, 0.2, -0.1, -0.5, 0.3, 0.1, 0),
                    nrow = 3, ncol = 3, byrow = TRUE)
ar[, , 2] <- matrix(c(0, -0.3, 0.5, 0.7, -0.4, 1, 0, -0.5, 0.3),
                    nrow = 3, ncol = 3, byrow = TRUE)
x <- matrix(rnorm(100*3), nrow = 100, ncol = 3)
y <- mfilter(x, ar, "recursive")

# Back to white noise
ma <- array(0, dim = c(3,3,3))
ma[, , 1] <- diag(3)
ma[, , 2] <- -ar[, , 1]
ma[, , 3] <- -ar[, , 2]
z <- mfilter(y, ma, "convolution")
mulcor(z)

# AR-MA model simulation
x <- matrix(rnorm(1000*2), nrow = 1000, ncol = 2)
ma <- array(0, dim = c(2,2,2))
ma[, , 1] <- matrix(c(-1.0, 0.0, 0.0, -1.0), nrow = 2, ncol = 2, byrow = TRUE)
ma[, , 2] <- matrix(c(-0.2, 0.0, -0.1, -0.3), nrow = 2, ncol = 2, byrow = TRUE)
y <- mfilter(x, ma, "convolution")
ar <- array(0, dim = c(2,2,3))
ar[, , 1] <- matrix(c(-1.0, 0.0, 0.0, -1.0), nrow = 2, ncol = 2, byrow = TRUE)
ar[, , 2] <- matrix(c(-0.5, -0.2, -0.2, -0.5), nrow = 2, ncol = 2, byrow = TRUE)
ar[, , 3] <- matrix(c(-0.3, -0.05, -0.1, -0.30), nrow = 2, ncol = 2, byrow = TRUE)
z <- mfilter(y, ar, "recursive")
Locally fit autoregressive models to non-stationary time series by the minimum AIC procedure.
mlocar(y, max.order = NULL, span, const = 0, plot = TRUE)
y |
a univariate time series. |
max.order |
upper limit of the order of AR model. Default is |
span |
length of the basic local span. |
const |
integer. |
plot |
logical. If |
The data of length n are divided into k locally stationary spans, where n(i) (i = 1, ..., k) denotes the number of basic spans, each of length span, which constitute the i-th locally stationary span. At each local span, the process is represented by a stationary autoregressive model.
mean |
mean. |
var |
variance. |
ns |
the number of local spans. |
order |
order of the current model. |
arcoef |
AR coefficients of current model. |
v |
innovation variance of the current model. |
init |
initial point of the data fitted to the current model. |
end |
end point of the data fitted to the current model. |
pspec |
power spectrum. |
npre |
data length of the preceding stationary block. |
nnew |
data length of the new block. |
order.mov |
order of the moving model. |
v.mov |
innovation variance of the moving model. |
aic.mov |
AIC of the moving model. |
order.const |
order of the constant model. |
v.const |
innovation variance of the constant model. |
aic.const |
AIC of the constant model. |
G.Kitagawa and H.Akaike (1978) A Procedure for The Modeling of Non-Stationary Time Series. Ann. Inst. Statist. Math., 30, B, 351–363.
H.Akaike, G.Kitagawa, E.Arahata and F.Tada (1979) Computer Science Monograph, No.11, Timsac78. The Institute of Statistical Mathematics.
data(locarData)
z <- mlocar(locarData, max.order = 10, span = 300, const = 0)
z$arcoef
Locally fit multivariate autoregressive models to non-stationary time series by the minimum AIC procedure using the Householder transformation.
mlomar(y, max.order = NULL, span, const = 0)
y |
a multivariate time series. |
max.order |
upper limit of the order of AR model, less than or equal to |
span |
length of basic local span. Let |
const |
integer. ' |
The data of length n are divided into k locally stationary spans, where n(i) (i = 1, ..., k) denotes the number of basic spans, each of length span, which constitute the i-th locally stationary span. At each local span, the process is represented by a stationary autoregressive model.
mean |
mean. |
var |
variance. |
ns |
the number of local spans. |
order |
order of the current model. |
aic |
AIC of the current model. |
arcoef |
AR coefficient matrices of the current model. |
v |
innovation variance of the current model. |
init |
initial point of the data fitted to the current model. |
end |
end point of the data fitted to the current model. |
npre |
data length of the preceding stationary block. |
nnew |
data length of the new block. |
order.mov |
order of the moving model. |
aic.mov |
AIC of the moving model. |
order.const |
order of the constant model. |
aic.const |
AIC of the constant model. |
G.Kitagawa and H.Akaike (1978) A Procedure for The Modeling of Non-Stationary Time Series. Ann. Inst. Statist. Math., 30, B, 351–363.
H.Akaike, G.Kitagawa, E.Arahata and F.Tada (1979) Computer Science Monograph, No.11, Timsac78. The Institute of Statistical Mathematics.
data(Amerikamaru)
mlomar(Amerikamaru, max.order = 10, span = 300, const = 0)
Determine multivariate autoregressive models by a Bayesian procedure. The basic least squares estimates of the parameters are obtained by the Householder transformation.
mulbar(y, max.order = NULL, plot = FALSE)
y |
a multivariate time series. |
max.order |
upper limit of the order of AR model, less than or equal to |
plot |
logical. If |
The statistic AIC is defined by

AIC = n log(det(v)) + 2k,

where n is the number of data, v is the estimate of the innovation variance matrix, det is the determinant and k is the number of free parameters.

The Bayesian weight of the m-th order model is defined by

W(m) = const C(m) / (m + 1),

where const is the normalizing constant and C(m) = exp(-0.5 AIC(m)). The Bayesian estimates of the partial autoregression coefficient matrices of the forward and backward models are obtained by

G(m) = G(m) D(m),    H(m) = H(m) D(m),

where the original G(m) and H(m) are the (conditional) maximum likelihood estimates of the highest order coefficient matrices of the forward and backward AR models of order m and D(m) is defined by

D(m) = W(m) + W(m+1) + ... + W(max.order).

The equivalent number of parameters for the Bayesian model is defined by

ek = ( D(1)^2 + D(2)^2 + ... + D(max.order)^2 ) id + id(id + 1)/2,

where id denotes the dimension of the process.
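A minimal numerical sketch of the AIC formula above, not of mulbar internals: stats::ar.ols stands in for the Householder fit, and the parameter count here includes only the AR coefficients, which is an assumption.

# AIC = n * log(det(v)) + 2 * k for a fitted multivariate AR model.
y <- matrix(rnorm(500 * 2), ncol = 2)
fit <- ar.ols(y, aic = FALSE, order.max = 3, intercept = FALSE)
n <- nrow(y)
k <- fit$order * ncol(y)^2          # free AR parameters only (illustrative choice)
aic <- n * log(det(fit$var.pred)) + 2 * k
aic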
mean |
mean. |
var |
variance. |
v |
innovation variance. |
aic |
AIC. |
aicmin |
minimum AIC. |
daic |
AIC- |
order.maice |
order of minimum AIC. |
v.maice |
MAICE innovation variance. |
bweight |
Bayesian weights. |
integra.bweight |
integrated Bayesian Weights. |
arcoef.for |
AR coefficients (forward model). |
arcoef.back |
AR coefficients (backward model). |
pacoef.for |
partial autoregression coefficients (forward model). |
pacoef.back |
partial autoregression coefficients (backward model). |
v.bay |
innovation variance of the Bayesian model. |
aic.bay |
equivalent AIC of the Bayesian (forward) model. |
H.Akaike (1978) A Bayesian Extension of The Minimum AIC Procedure of Autoregressive Model Fitting. Research Memo. NO.126, The Institute of Statistical Mathematics.
G.Kitagawa and H.Akaike (1978) A Procedure for The Modeling of Non-stationary Time Series. Ann. Inst. Statist. Math., 30, B, 351–363.
H.Akaike, G.Kitagawa, E.Arahata and F.Tada (1979) Computer Science Monograph, No.11, Timsac78. The Institute of Statistical Mathematics.
data(Powerplant)
z <- mulbar(Powerplant, max.order = 10)
z$pacoef.for
z$pacoef.back
Estimate multiple correlation.
mulcor(y, lag = NULL, plot = TRUE, lag_axis = TRUE)
y |
a multivariate time series. |
lag |
maximum lag. Default is |
plot |
logical. If TRUE (default), correlations |
lag_axis |
logical. If |
cov |
covariances. |
cor |
correlations (normalized covariances). |
mean |
mean. |
H.Akaike and T.Nakagawa (1988) Statistical Analysis and Control of Dynamic Systems. Kluwer Academic publishers.
# Example 1
y <- rnorm(1000)
dim(y) <- c(500,2)
mulcor(y, lag_axis = FALSE)

# Example 2
xorg <- rnorm(1003)
x <- matrix(0, nrow = 1000, ncol = 2)
x[, 1] <- xorg[1:1000]
x[, 2] <- xorg[4:1003] + 0.5*rnorm(1000)
mulcor(x, lag = 20)
Compute multiple frequency response function, gain, phase, multiple coherency, partial coherency and relative error statistics.
mulfrf(y, lag = NULL, iovar = NULL)
y |
a multivariate time series. |
lag |
maximum lag. Default is |
iovar |
input variables |
cospec |
spectrum (complex). |
freqr |
frequency response function : real part. |
freqi |
frequency response function : imaginary part. |
gain |
gain. |
phase |
phase. |
pcoh |
partial coherency. |
errstat |
relative error statistics. |
mcoh |
multiple coherency. |
H.Akaike and T.Nakagawa (1988) Statistical Analysis and Control of Dynamic Systems. Kluwer Academic publishers.
ar <- array(0, dim = c(3,3,2))
ar[, , 1] <- matrix(c(0.4, 0, 0.3, 0.2, -0.1, -0.5, 0.3, 0.1, 0),
                    nrow = 3, ncol = 3, byrow = TRUE)
ar[, , 2] <- matrix(c(0, -0.3, 0.5, 0.7, -0.4, 1, 0, -0.5, 0.3),
                    nrow = 3, ncol = 3, byrow = TRUE)
x <- matrix(rnorm(200*3), nrow = 200, ncol = 3)
y <- mfilter(x, ar, "recursive")
mulfrf(y, lag = 20)
Fit a multivariate autoregressive model by the minimum AIC procedure. Only the possibilities of zero coefficients at the beginning and end of the model are considered. The least squares estimates of the parameters are obtained by the Householder transformation.
mulmar(y, max.order = NULL, plot = FALSE)
y |
a multivariate time series. |
max.order |
upper limit of the order of AR model, less than or equal to |
plot |
logical. If |
The multivariate autoregressive model is defined by

y(t) = A(1)y(t-1) + A(2)y(t-2) + ... + A(p)y(t-p) + u(t),

where p is the order of the model and u(t) is Gaussian white noise with mean 0 and variance matrix matv. AIC is defined by

AIC = n log(det(v)) + 2k,

where n is the number of data, v is the estimate of the innovation variance matrix, det is the determinant and k is the number of free parameters.
mean |
mean. |
var |
variance. |
v |
innovation variance. |
aic |
AIC. |
aicmin |
minimum AIC. |
daic |
AIC-aicmin. |
order.maice |
order of minimum AIC. |
v.maice |
MAICE innovation variance. |
np |
number of parameters. |
jnd |
specification of |
subregcoef |
subset regression coefficients. |
rvar |
residual variance. |
aicf |
final estimate of AIC ( |
respns |
instantaneous response. |
regcoef |
regression coefficients matrix. |
matv |
innovation variance matrix. |
morder |
order of the MAICE model. |
arcoef |
AR coefficients. |
aicsum |
the sum of aicf. |
G.Kitagawa and H.Akaike (1978) A Procedure for The Modeling of Non-stationary Time Series. Ann. Inst. Statist. Math., 30, B, 351–363.
H.Akaike, G.Kitagawa, E.Arahata and F.Tada (1979) Computer Science Monograph, No.11, Timsac78. The Institute of Statistical Mathematics.
# Example 1
data(Powerplant)
z <- mulmar(Powerplant, max.order = 10)
z$arcoef

# Example 2
ar <- array(0, dim = c(3,3,2))
ar[, , 1] <- matrix(c(0.4, 0, 0.3, 0.2, -0.1, -0.5, 0.3, 0.1, 0),
                    nrow = 3, ncol = 3, byrow = TRUE)
ar[, , 2] <- matrix(c(0, -0.3, 0.5, 0.7, -0.4, 1, 0, -0.5, 0.3),
                    nrow = 3, ncol = 3, byrow = TRUE)
x <- matrix(rnorm(200*3), nrow = 200, ncol = 3)
y <- mfilter(x, ar, "recursive")
z <- mulmar(y, max.order = 10)
z$arcoef
Compute relative power contributions in differential and integrated form, assuming the orthogonality between noise sources.
mulnos(y, max.order = NULL, control = NULL, manip = NULL, h)
y |
a multivariate time series. |
max.order |
upper limit of model order. Default is |
control |
controlled variables. Default is |
manip |
manipulated variables. Default number of manipulated variables is |
h |
specify frequencies |
nperr |
a normalized prediction error covariance matrix. |
diffr |
differential relative power contribution. |
integr |
integrated relative power contribution. |
H.Akaike and T.Nakagawa (1988) Statistical Analysis and Control of Dynamic Systems. Kluwer Academic publishers.
ar <- array(0, dim = c(3,3,2))
ar[, , 1] <- matrix(c(0.4, 0, 0.3, 0.2, -0.1, -0.5, 0.3, 0.1, 0),
                    nrow = 3, ncol = 3, byrow = TRUE)
ar[, , 2] <- matrix(c(0, -0.3, 0.5, 0.7, -0.4, 1, 0, -0.5, 0.3),
                    nrow = 3, ncol = 3, byrow = TRUE)
x <- matrix(rnorm(200*3), nrow = 200, ncol = 3)
y <- mfilter(x, ar, "recursive")
mulnos(y, max.order = 10, h = 20)
Compute rational spectrum for d-dimensional ARMA process.
mulrsp(h, d, cov, ar = NULL, ma = NULL, log = FALSE, plot = TRUE, ...)
h |
specify frequencies |
d |
dimension of the observation vector. |
cov |
covariance matrix. |
ar |
coefficient matrix of autoregressive model. |
ma |
coefficient matrix of moving average model. |
log |
logical. If |
plot |
logical. If |
... |
graphical arguments passed to |
ARMA process:

y(t) - A(1)y(t-1) - ... - A(p)y(t-p) = u(t) - B(1)u(t-1) - ... - B(q)u(t-q),

where u(t) is a white noise with zero mean vector and covariance matrix cov.
rspec |
rational spectrum. An object of class |
scoh |
simple coherence. |
H.Akaike and T.Nakagawa (1988) Statistical Analysis and Control of Dynamic Systems. Kluwer Academic publishers.
# Example 1 for the normal distribution
xorg <- rnorm(1003)
x <- matrix(0, nrow = 1000, ncol = 2)
x[, 1] <- xorg[1:1000]
x[, 2] <- xorg[4:1003] + 0.5*rnorm(1000)
aaa <- ar(x)
mulrsp(h = 20, d = 2, cov = aaa$var.pred, ar = aaa$ar)

# Example 2 for the AR model
ar <- array(0, dim = c(3,3,2))
ar[, , 1] <- matrix(c(0.4, 0, 0.3, 0.2, -0.1, -0.5, 0.3, 0.1, 0),
                    nrow = 3, ncol = 3, byrow = TRUE)
ar[, , 2] <- matrix(c(0, -0.3, 0.5, 0.7, -0.4, 1, 0, -0.5, 0.3),
                    nrow = 3, ncol = 3, byrow = TRUE)
x <- matrix(rnorm(200*3), nrow = 200, ncol = 3)
y <- mfilter(x, ar, "recursive")
z <- fpec(y, max.order = 10)
mulrsp(h = 20, d = 3, cov = z$perr, ar = z$arcoef)
Compute multiple spectrum estimates using Akaike window or Hanning window.
mulspe(y, lag = NULL, window = "Akaike", plot = TRUE, ...)
y |
a multivariate time series with |
lag |
maximum lag. Default is |
window |
character string giving the definition of smoothing window. Allowed strings are "Akaike" (default) or "Hanning". |
plot |
logical. If TRUE (default) spectrums are plotted as |
... |
graphical arguments passed to |
Hanning Window : | a1(0)=0.5, | a1(1)=a1(-1)=0.25, | a1(2)=a1(-2)=0 |
Akaike Window : | a2(0)=0.625, | a2(1)=a2(-1)=0.25, | a2(2)=a2(-2)=-0.0625 |
spec |
spectrum smoothing by ' |
specmx |
spectrum matrix. An object of class |
stat |
test statistics. |
coh |
simple coherence by ' |
H.Akaike and T.Nakagawa (1988) Statistical Analysis and Control of Dynamic Systems. Kluwer Academic publishers.
sgnl <- rnorm(1003)
x <- matrix(0, nrow = 1000, ncol = 2)
x[, 1] <- sgnl[4:1003]
# x[i,2] = 0.9*x[i-3,1] + 0.2*N(0,1)
x[, 2] <- 0.9*sgnl[1:1000] + 0.2*rnorm(1000)
mulspe(x, lag = 100, window = "Hanning")
Locally fit autoregressive models to non-stationary time series by the AIC criterion.
nonst(y, span, max.order = NULL, plot = TRUE)
y |
a univariate time series. |
span |
length of the basic local span. |
max.order |
highest order of AR model. Default is |
plot |
logical. If |
The basic AR model is given by

y(t) = a(1)y(t-1) + a(2)y(t-2) + ... + a(p)y(t-p) + u(t),

where p is the order of the AR model and u(t) is a white noise with innovation variance v. AIC is defined by

AIC = n log(sd) + 2k,

where n is the length of the data, sd is the estimate of the innovation variance and k is the number of parameters.
ns |
the number of local spans. |
arcoef |
AR coefficients. |
v |
innovation variance. |
aic |
AIC. |
daic21 |
= AIC2 - AIC1. |
daic |
= |
init |
start point of the data fitted to the current model. |
end |
end point of the data fitted to the current model. |
pspec |
power spectrum. |
H.Akaike, E.Arahata and T.Ozaki (1976) Computer Science Monograph, No.6, Timsac74, A Time Series Analysis and Control Program Package (2). The Institute of Statistical Mathematics.
# Non-stationary Test Data
data(nonstData)
nonst(nonstData, span = 700, max.order = 49)
Non-stationary data for testing nonst.
data(nonstData)
A time series of 2100 observations.
H.Akaike, E.Arahata and T.Ozaki (1976) Computer Science Monograph, No.6, Timsac74, A Time Series Analysis and Control Program Package (2). The Institute of Statistical Mathematics.
Compute optimal controller gain matrix for a quadratic criterion defined by two positive definite matrices Q and R.
optdes(y, max.order = NULL, ns, q, r)
y |
a multivariate time series. |
max.order |
upper limit of model order. Default is |
ns |
number of D.P. stages. |
q |
positive definite |
r |
positive definite |
perr |
prediction error covariance matrix. |
trans |
first |
gamma |
gamma matrix. |
gain |
gain matrix. |
H.Akaike and T.Nakagawa (1988) Statistical Analysis and Control of Dynamic Systems. Kluwer Academic publishers.
# Multivariate Example Data
ar <- array(0, dim = c(3,3,2))
ar[, , 1] <- matrix(c(0.4, 0, 0.3, 0.2, -0.1, -0.5, 0.3, 0.1, 0),
                    nrow = 3, ncol = 3, byrow = TRUE)
ar[, , 2] <- matrix(c(0, -0.3, 0.5, 0.7, -0.4, 1, 0, -0.5, 0.3),
                    nrow = 3, ncol = 3, byrow = TRUE)
x <- matrix(rnorm(200*3), nrow = 200, ncol = 3)
y <- mfilter(x, ar, "recursive")
q.mat <- matrix(c(0.16, 0, 0, 0.09), nrow = 2, ncol = 2)
r.mat <- as.matrix(0.001)
optdes(y, ns = 20, q = q.mat, r = r.mat)
Perform optimal control simulation and evaluate the means and variances of the controlled and manipulated variables X and Y.
optsim(y, max.order = NULL, ns, q, r, noise = NULL, len, plot = TRUE)
y |
a multivariate time series. |
max.order |
upper limit of model order. Default is |
ns |
number of steps of simulation. |
q |
positive definite matrix |
r |
positive definite matrix |
noise |
noise. If not provided, Gaussian vector white noise with the length |
len |
length of white noise record. |
plot |
logical. If |
trans |
first |
gamma |
gamma matrix. |
gain |
gain matrix. |
convar |
controlled variables |
manvar |
manipulated variables |
xmean |
mean of |
ymean |
mean of |
xvar |
variance of |
yvar |
variance of |
x2sum |
sum of |
y2sum |
sum of |
x2mean |
mean of |
y2mean |
mean of |
H.Akaike and T.Nakagawa (1988) Statistical Analysis and Control of Dynamic Systems. Kluwer Academic publishers.
# Multivariate Example Data
ar <- array(0, dim = c(3,3,2))
ar[, , 1] <- matrix(c(0.4, 0, 0.3, 0.2, -0.1, -0.5, 0.3, 0.1, 0),
                    nrow = 3, ncol = 3, byrow = TRUE)
ar[, , 2] <- matrix(c(0, -0.3, 0.5, 0.7, -0.4, 1, 0, -0.5, 0.3),
                    nrow = 3, ncol = 3, byrow = TRUE)
x <- matrix(rnorm(200*3), nrow = 200, ncol = 3)
y <- mfilter(x, ar, "recursive")
q.mat <- matrix(c(0.16, 0, 0, 0.09), nrow = 2, ncol = 2)
r.mat <- as.matrix(0.001)
optsim(y, max.order = 10, ns = 20, q = q.mat, r = r.mat, len = 20)
Fit periodic autoregressive models by the method of least squares realized through the Householder transformation.
perars(y, ni, lag = NULL, ksw = 0)
y |
a univariate time series. |
ni |
number of instants in one period. |
lag |
maximum lag of periods. Default is |
ksw |
integer. ' |
The periodic autoregressive model for y(i,j) (i = 1, ..., N; j = 1, ..., ni), where N is the number of periods and ni is the number of instants in one period, expresses the value at each instant j as a regression on a constant term, on the preceding instants of the same period and on the instants of the lag preceding periods, with Gaussian white noise u(i,j). When ksw is set to 0, the constant term is excluded.
The statistic AIC is defined by

AIC = n log(det(v)) + 2k,

where n is the length of the data, v is the estimate of the innovation variance matrix and k is the number of parameters. The outputs are the estimates of the regression coefficients and the innovation variance of the periodic AR model for each instant.
mean |
mean. |
var |
variance. |
subset |
specification of i-th regressor ( |
regcoef |
regression coefficients. |
rvar |
residual variances. |
np |
number of parameters. |
aic |
AIC. |
v |
innovation variance matrix. |
arcoef |
AR coefficient matrices. |
const |
constant vector. |
morder |
order of the MAICE model. |
M.Pagano (1978) On Periodic and Multiple Autoregressions. Ann. Statist., 6, 1310–1317.
H.Akaike, G.Kitagawa, E.Arahata and F.Tada (1979) Computer Science Monograph, No.11, Timsac78. The Institute of Statistical Mathematics.
data(Airpollution) perars(Airpollution, ni = 6, lag = 2, ksw = 1)
Plot trend component, seasonal component, AR component, noise and trading day factor returned by decomp.
## S3 method for class 'decomp' plot(x, ...)
x |
an object of class "decomp". |
... |
further graphical parameters may also be supplied as arguments. |
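For orientation, a minimal, hypothetical sketch of how this method is typically invoked follows; the Blsallfood dataset and the use of decomp with its default settings are assumptions and may need adjusting.

# Hypothetical sketch: assumes decomp() runs with default settings on the
# Blsallfood series shipped with this package.
data(Blsallfood)
z <- decomp(Blsallfood)   # object of class "decomp"
plot(z)                   # trend, seasonal, AR, noise and trading day factor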
Plot the spectrum returned by mulspe and mulrsp. Panels on and below the diagonal show the real parts, and panels above the diagonal show the imaginary parts.
## S3 method for class 'specmx' plot(x, plot.scale = TRUE, ...)
x |
an object of class "specmx". |
plot.scale |
logical. If TRUE, a common y-axis scale is used for all panels. |
... |
further graphical parameters may also be supplied as arguments. |
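As a hedged illustration, a spectrum matrix can be computed from the Powerplant data and passed to this method; the lag argument to mulspe is an assumption here.

# Hypothetical sketch: assumes mulspe() accepts a lag argument and returns
# an object of class "specmx".
data(Powerplant)
spec <- mulspe(Powerplant, lag = 20)
plot(spec, plot.scale = TRUE)   # real parts on/below the diagonal, imaginary above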
Power plant data for testing mulbar and mulmar.
data(Powerplant)
A 2-dimensional array with 500 observations on 3 variables.
[, 1] | command |
[, 2] | temperature |
[, 3] | fuel |
H.Akaike, G.Kitagawa, E.Arahata and F.Tada (1979) Computer Science Monograph, No.11, Timsac78. The Institute of Statistical Mathematics.
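As a quick, package-independent check of the layout described above, the data can be loaded and inspected with base R; the plotting choices below are only a suggestion.

data(Powerplant)
dim(Powerplant)    # 500 x 3: command, temperature, fuel
matplot(Powerplant, type = "l", lty = 1, xlab = "time", ylab = "value")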
Operate on a real record of a vector process and compute predicted values.
prdctr(y, r, s, h, arcoef, macoef = NULL, impulse = NULL, v, plot = TRUE)
y |
a univariate time series or a multivariate time series. |
r |
one step ahead prediction starting position |
s |
long range forecast starting position |
h |
maximum span of long range forecast |
arcoef |
AR coefficient matrices. |
macoef |
MA coefficient matrices. |
impulse |
impulse response matrices. |
v |
innovation variance. |
plot |
logical. If TRUE (default), the predicted values are plotted. |
One step ahead prediction starts at time $r$ and ends at time $s$. Prediction is continued without new observations until time $s + h$. The basic model is the autoregressive moving average model of $y(t)$, which is given by

$$ y(t) - A_1 y(t-1) - \cdots - A_p y(t-p) = u(t) - B_1 u(t-1) - \cdots - B_q u(t-q), $$

where $p$ is the AR order and $q$ is the MA order.
predct |
predicted values. |
ys |
|
pstd |
predicted values + standard deviation. |
p2std |
predicted values + 2 standard deviations. |
p3std |
predicted values + 3 standard deviations. |
mstd |
predicted values - standard deviation. |
m2std |
predicted values - 2 standard deviations. |
m3std |
predicted values - 3 standard deviations. |
H.Akaike, E.Arahata and T.Ozaki (1975) Computer Science Monograph, No.6, Timsac74, A Time Series Analysis and Control Program Package (2). The Institute of Statistical Mathematics.
# "arima.sim" is a function in "stats". # Note that the sign of MA coefficient is opposite from that in "timsac". y <- arima.sim(list(order=c(2,0,1), ar=c(0.64,-0.8), ma=c(-0.5)), n = 1000) y1 <- y[1:900] z <- autoarmafit(y1) ar <- z$model[[1]]$arcoef ma <- z$model[[1]]$macoef var <- z$model[[1]]$v y2 <- y[901:990] prdctr(y2, r = 50, s = 90, h = 10, arcoef = ar, macoef = ma, v = var)
# "arima.sim" is a function in "stats". # Note that the sign of MA coefficient is opposite from that in "timsac". y <- arima.sim(list(order=c(2,0,1), ar=c(0.64,-0.8), ma=c(-0.5)), n = 1000) y1 <- y[1:900] z <- autoarmafit(y1) ar <- z$model[[1]]$arcoef ma <- z$model[[1]]$macoef var <- z$model[[1]]$v y2 <- y[901:990] prdctr(y2, r = 50, s = 90, h = 10, arcoef = ar, macoef = ma, v = var)
Compute power spectrum of ARMA process.
raspec(h, var, arcoef = NULL, macoef = NULL, log = FALSE, plot = TRUE)
h |
specify frequencies $i/2h$ $(i = 0, 1, \ldots, h)$. |
var |
variance. |
arcoef |
AR coefficients. |
macoef |
MA coefficients. |
log |
logical. If TRUE, the spectrum is plotted on a logarithmic scale. Default is FALSE. |
plot |
logical. If TRUE (default), the spectrum is plotted. |
The ARMA process is given by

$$ y(t) - a_1 y(t-1) - \cdots - a_p y(t-p) = u(t) - b_1 u(t-1) - \cdots - b_q u(t-q), $$

where $p$ is the AR order, $q$ is the MA order and $u(t)$ is a white noise with zero mean and variance equal to var.

raspec gives the rational spectrum of this process.
H.Akaike and T.Nakagawa (1988) Statistical Analysis and Control of Dynamic Systems. Kluwer Academic publishers.
# Example 1 for the AR model
raspec(h = 100, var = 1, arcoef = c(0.64,-0.8))

# Example 2 for the MA model
raspec(h = 20, var = 1, macoef = c(0.64,-0.8))
Compute 1-input, 1-output frequency response function, gain, phase, coherency and relative error statistics.
sglfre(y, lag = NULL, invar, outvar)
y |
a multivariate time series. |
lag |
maximum lag. Default |
invar |
column number of the input variable within y. |
outvar |
column number of the output variable within y. |
inspec |
power spectrum (input). |
outspec |
power spectrum (output). |
cspec |
co-spectrum. |
qspec |
quad-spectrum. |
gain |
gain. |
coh |
coherency. |
freqr |
frequency response function : real part. |
freqi |
frequency response function : imaginary part. |
errstat |
relative error statistics. |
phase |
phase. |
H.Akaike and T.Nakagawa (1988) Statistical Analysis and Control of Dynamic Systems. Kluwer Academic publishers.
ar <- array(0, dim = c(3,3,2))
ar[, , 1] <- matrix(c(0.4,  0,    0.3,
                      0.2, -0.1, -0.5,
                      0.3,  0.1,  0), nrow = 3, ncol = 3, byrow = TRUE)
ar[, , 2] <- matrix(c(0,   -0.3,  0.5,
                      0.7, -0.4,  1,
                      0,   -0.5,  0.3), nrow = 3, ncol = 3, byrow = TRUE)
x <- matrix(rnorm(200*3), nrow = 200, ncol = 3)
y <- mfilter(x, ar, "recursive")
sglfre(y, lag = 20, invar = 1, outvar = 2)
Produce optimal controller gain and simulate the controlled process.
simcon(span, len, r, arcoef, impulse, v, weight)
span |
span of control performance evaluation. |
len |
length of experimental observation. |
r |
dimension of the control input, less than or equal to the dimension of y. |
arcoef |
matrices of autoregressive coefficients. |
impulse |
impulse response matrices. |
v |
covariance matrix of innovation. |
weight |
weighting matrix of performance. |
The basic state space model is obtained from the autoregressive moving average model of a vector process $y(t)$:

$$ y(t) - A_1 y(t-1) - \cdots - A_p y(t-p) = u(t) - B_1 u(t-1) - \cdots - B_q u(t-q), $$

where $A_i$ ($i = 1, \ldots, p$) are the autoregressive coefficients of the ARMA representation of $y(t)$.
gain |
controller gain. |
ave |
average value of the i-th component of y. |
var |
variance. |
std |
standard deviation. |
bc |
sub matrices |
bd |
sub matrices |
H.Akaike, E.Arahata and T.Ozaki (1975) Computer Science Monograph, No.6, Timsac74, A Time Series Analysis and Control Program Package (2). The Institute of Statistical Mathematics.
x <- matrix(rnorm(1000*2), nrow = 1000, ncol = 2)
ma <- array(0, dim = c(2,2,2))
ma[, , 1] <- matrix(c(-1.0,  0.0,
                       0.0, -1.0), nrow = 2, ncol = 2, byrow = TRUE)
ma[, , 2] <- matrix(c(-0.2,  0.0,
                      -0.1, -0.3), nrow = 2, ncol = 2, byrow = TRUE)
y <- mfilter(x, ma, "convolution")
ar <- array(0, dim = c(2,2,3))
ar[, , 1] <- matrix(c(-1.0,  0.0,
                       0.0, -1.0), nrow = 2, ncol = 2, byrow = TRUE)
ar[, , 2] <- matrix(c(-0.5, -0.2,
                      -0.2, -0.5), nrow = 2, ncol = 2, byrow = TRUE)
ar[, , 3] <- matrix(c(-0.3, -0.05,
                      -0.1, -0.3), nrow = 2, ncol = 2, byrow = TRUE)
y <- mfilter(y, ar, "recursive")
z <- markov(y)
weight <- matrix(c(0.0002, 0.0,
                   0.0,    2.9), nrow = 2, ncol = 2, byrow = TRUE)
simcon(span = 50, len = 700, r = 1, z$arcoef, z$impulse, z$v, weight)
Compute the third order moments.
thirmo(y, lag = NULL, plot = TRUE)
y |
a univariate time series. |
lag |
maximum lag. Default is |
plot |
logical. If TRUE (default), the results are plotted. |
mean |
mean. |
acov |
autocovariance. |
acor |
normalized covariance. |
tmomnt |
third order moments. |
H.Akaike, E.Arahata and T.Ozaki (1975) Computer Science Monograph, No.6, Timsac74, A Time Series Analysis and Control Program Package (2). The Institute of Statistical Mathematics.
data(bispecData)
z <- thirmo(bispecData, lag = 30)
z$tmomnt
This program fits an autoregressive model by a Bayesian procedure. The least squares estimates of the parameters are obtained by the Householder transformation.
unibar(y, ar.order = NULL, plot = TRUE)
y |
a univariate time series. |
ar.order |
order of the AR model. Default is |
plot |
logical. If TRUE (default), the results are plotted. |
The AR model is given by

$$ y(t) = a_1 y(t-1) + \cdots + a_p y(t-p) + u(t), $$

where $p$ is the AR order and $u(t)$ is Gaussian white noise with mean $0$ and variance $\sigma^2$. The basic statistic AIC is defined by

$$ \mathrm{AIC} = n \log \hat{\sigma}^2 + 2p, $$

where $n$ is the length of data, $\hat{\sigma}^2$ is the estimate of the innovation variance, and $p$ is the order of the model.

The Bayesian weight of the $m$-th order model is defined by

$$ W(m) = C\,\frac{\exp\{-\mathrm{AIC}(m)/2\}}{m + 1}, $$

where $C$ is the normalizing constant and $\mathrm{AIC}(m)$ is the AIC of the AR model of order $m$. The equivalent number of free parameters for the Bayesian model is defined by

$$ \mathrm{np} = \sum_{m=1}^{M} D(m)^2 + 1, $$

where $D(m)$ is defined by $D(m) = W(m) + W(m+1) + \cdots + W(M)$. The order $p$ in the definition of AIC is replaced by $\mathrm{np}$ to define an equivalent AIC for the Bayesian model.
mean |
mean. |
var |
variance. |
v |
innovation variance. |
aic |
AIC. |
aicmin |
minimum AIC. |
daic |
AIC - aicmin (differences from the minimum AIC). |
order.maice |
order of minimum AIC. |
v.maice |
innovation variance attained at m = order.maice. |
pacoef |
partial autocorrelation coefficients (least squares estimate). |
bweight |
Bayesian Weight. |
integra.bweight |
integrated Bayesian weights. |
v.bay |
innovation variance of Bayesian model. |
aic.bay |
AIC of Bayesian model. |
np |
equivalent number of parameters. |
pacoef.bay |
partial autocorrelation coefficients of Bayesian model. |
arcoef |
AR coefficients of Bayesian model. |
pspec |
power spectrum. |
H.Akaike (1978) A Bayesian Extension of The Minimum AIC Procedure of Autoregressive model Fitting. Research memo. No.126. The Institute of Statistical Mathematics.
G.Kitagawa and H.Akaike (1978) A Procedure for The Modeling of Non-Stationary Time Series. Ann. Inst. Statist. Math., 30, B, 351–363.
H.Akaike, G.Kitagawa, E.Arahata and F.Tada (1979) Computer Science Monograph, No.11, Timsac78. The Institute of Statistical Mathematics.
data(Canadianlynx)
z <- unibar(Canadianlynx, ar.order = 20)
z$arcoef
This is the basic program for the fitting of autoregressive models of successively higher orders by the method of least squares realized through the Householder transformation.
unimar(y, max.order = NULL, plot = FALSE)
y |
a univariate time series. |
max.order |
upper limit of AR order. Default is |
plot |
logical. If TRUE, the results are plotted. Default is FALSE. |
The AR model is given by

$$ y(t) = a_1 y(t-1) + \cdots + a_p y(t-p) + u(t), $$

where $p$ is the AR order and $u(t)$ is Gaussian white noise with mean $0$ and variance $\sigma^2$. AIC is defined by

$$ \mathrm{AIC} = n \log \hat{\sigma}^2 + 2k, $$

where $n$ is the length of data, $\hat{\sigma}^2$ is the estimate of the innovation variance and $k$ is the number of parameters.
mean |
mean. |
var |
variance. |
v |
innovation variance. |
aic |
AIC. |
aicmin |
minimum AIC. |
daic |
AIC - aicmin (differences from the minimum AIC). |
order.maice |
order of minimum AIC. |
v.maice |
innovation variance attained at the order of minimum AIC. |
arcoef |
AR coefficients. |
G.Kitagawa and H.Akaike (1978) A Procedure for The Modeling of Non-Stationary Time Series. Ann. Inst. Statist. Math., 30, B, 351–363.
H.Akaike, G.Kitagawa, E.Arahata and F.Tada (1979) Computer Science Monograph, No.11, Timsac78. The Institute of Statistical Mathematics.
data(Canadianlynx)
z <- unimar(Canadianlynx, max.order = 20)
z$arcoef
Generate approximately Gaussian vector white noise.
wnoise(len, perr, plot = TRUE)
len |
length of white noise record. |
perr |
prediction error. |
plot |
logical. If TRUE (default), the generated white noise is plotted. |
wnoise gives approximately Gaussian white noise records.
H.Akaike and T.Nakagawa (1988) Statistical Analysis and Control of Dynamic Systems. Kluwer Academic publishers.
# Example 1
wnoise(len = 100, perr = 1)

# Example 2
v <- matrix(c(1, 0, 0,
              0, 2, 0,
              0, 0, 3), nrow = 3, ncol = 3, byrow = TRUE)
wnoise(len = 20, perr = v)
Produce exact maximum likelihood estimates of the parameters of a scalar ARMA model.
xsarma(y, arcoefi, macoefi)
y |
a univariate time series. |
arcoefi |
initial estimates of AR coefficients. |
macoefi |
initial estimates of MA coefficients. |
The ARMA model is given by

$$ y(t) - a_1 y(t-1) - \cdots - a_p y(t-p) = u(t) - b_1 u(t-1) - \cdots - b_q u(t-q), $$

where $p$ is the AR order, $q$ is the MA order and $u(t)$ is a zero mean white noise.
gradi |
initial gradient. |
lkhoodi |
initial (-2)log likelihood. |
arcoef |
final estimates of AR coefficients. |
macoef |
final estimates of MA coefficients. |
grad |
final gradient. |
alph.ar |
final ALPH (AR part) at subroutine ARCHCK. |
alph.ma |
final ALPH (MA part) at subroutine ARCHCK. |
lkhood |
final (-2)log likelihood. |
wnoise.var |
white noise variance. |
H.Akaike (1978) Covariance matrix computation of the state variable of a stationary Gaussian process. Research Memo. No.139. The Institute of Statistical Mathematics.
H.Akaike, G.Kitagawa, E.Arahata and F.Tada (1979) Computer Science Monograph, No.11, Timsac78. The Institute of Statistical Mathematics.
# "arima.sim" is a function in "stats". # Note that the sign of MA coefficient is opposite from that in "timsac". arcoef <- c(1.45, -0.9) macoef <- c(-0.5) y <- arima.sim(list(order=c(2,0,1), ar=arcoef, ma=macoef), n = 100) arcoefi <- c(1.5, -0.8) macoefi <- c(0.0) z <- xsarma(y, arcoefi, macoefi) z$arcoef z$macoef
# "arima.sim" is a function in "stats". # Note that the sign of MA coefficient is opposite from that in "timsac". arcoef <- c(1.45, -0.9) macoef <- c(-0.5) y <- arima.sim(list(order=c(2,0,1), ar=arcoef, ma=macoef), n = 100) arcoefi <- c(1.5, -0.8) macoefi <- c(0.0) z <- xsarma(y, arcoefi, macoefi) z$arcoef z$macoef