Title: | Estimation and Prediction for Remote Effects Spatial Process Models |
---|---|
Description: | Implementation of the remote effects spatial process (RESP) model for teleconnection. The RESP model is a geostatistical model that allows a spatially-referenced variable (like average precipitation) to be influenced by covariates defined on a remote domain (like sea surface temperatures). The RESP model is introduced in Hewitt et al. (2018) <doi:10.1002/env.2523>. Sample code for working with the RESP model is available at <https://jmhewitt.github.io/research/resp_example>. This material is based upon work supported by the National Science Foundation under grant number AGS 1419558. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation. |
Authors: | Joshua Hewitt |
Maintainer: | Joshua Hewitt <[email protected]> |
License: | GPL-3 |
Version: | 1.0.3 |
Built: | 2024-11-08 06:46:39 UTC |
Source: | CRAN |
This function extends the abind function from the abind package.
abind3(...)
... |
Any number of matrices of equal dimension to stack together into a 3d matrix |
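A short sketch (not from the package documentation) illustrating the intended use, assuming abind3 stacks its equal-dimension matrix arguments along a new third dimension:
# stack two 3 x 2 matrices into a 3 x 2 x 2 array (assumed output shape)
A = matrix(1:6, nrow = 3)
B = matrix(7:12, nrow = 3)
X = abind3(A, B)
dim(X)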
Reshape array of data matrices into long format
arrayToLong(X, coords, yrs)
X |
3 dimensional array of matrices to extract to long format |
coords |
Spatial coordinates associated with the data (longitude in first column) |
yrs |
Vector with labels for the years |
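A hedged sketch using the coprecip example data shipped with the package; it assumes the long format contains one row per location and year:
data("coprecip")
attach(coprecip)

# reshape the [ns x p x nt] local covariate array into long format
X.long = arrayToLong(X, coords = coords.s, yrs = tLabs)
head(X.long)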
Canonical correlation analysis (CCA) is sometimes referred to as a double-barreled principal component analysis. Loosely, it fits a linear regression model to the scores of principal component decompositions of the predictors X and responses Y. Oftentimes, only the largest principal components are used to make predictions.
cca.predict(X, Y, X.new, k.x, k.y)
X |
Matrix of predictors, with variables in rows and observations (e.g., timepoints) in columns |
Y |
Matrix of responses, with variables in rows and observations in columns |
X.new |
Matrix of new predictor values at which to predict the response |
k.x |
An integer giving the number of principal components of X to use in the prediction |
k.y |
An integer giving the number of principal components of Y to use in the prediction |
CCA has been used to predict a teleconnected response (like precipitation) using the remote field generating the teleconnection (like ocean temperatures). In this application, principal components are often referred to as empirical orthogonal functions (EOFs).
Y.new Predicted values of the response corresponding to the new predictors in X.new
Cook, E.R., Briffa, K.R., and Jones, P.D., 1994, Spatial regression methods in dendroclimatology: A review and comparison of two techniques: International Journal of Climatology, v. 14, p. 379-402.
Glahn, H.R., 1968, Canonical Correlation and Its Relationship to Discriminant Analysis and Multiple Regression: Journal of the Atmospheric Sciences, v. 25, p. 23-31.
data("coprecip") attach(coprecip) # compute CCA predictions of Y (CO precipitation) given Z (Pacific ocean SSTs) # using 2 principal components (aka. EOFs) preds = cca.predict(X = Z, Y = Y, X.new = Z, k.x = 2, k.y = 2) # compute R^2 1 - var(as.numeric(preds-Y)) / var(as.numeric(Y))
data("coprecip") attach(coprecip) # compute CCA predictions of Y (CO precipitation) given Z (Pacific ocean SSTs) # using 2 principal components (aka. EOFs) preds = cca.predict(X = Z, Y = Y, X.new = Z, k.x = 2, k.y = 2) # compute R^2 1 - var(as.numeric(preds-Y)) / var(as.numeric(Y))
Compute point estimates for parameters from posterior samples
## S3 method for class 'stFit'
coef(object, burn = 1, fun = mean, ...)
object |
stFit object containing posterior samples for model |
burn |
number of posterior samples to reject before computing estimates |
fun |
function for computing point estimates |
... |
S3 generic/method consistency |
data("coprecip.fit") coef(coprecip.fit, burn = 50)
data("coprecip.fit") coef(coprecip.fit, burn = 50)
Compute point estimates for parameters from posterior samples
## S3 method for class 'stPredict'
coef(object, stFit, stData, burn = 1, type = "eof-alpha_knots", ...)
object |
stPredict object containing posterior estimates of alphas |
stFit |
stFit object containing posterior samples for model |
stData |
stData object containing spatial information for dataset |
burn |
number of posterior samples to reject before computing estimates |
type |
One of the following options to specify what point estimates to return
|
... |
S3 generic/method consistency |
data("coprecip") data("coprecip.fit") data("coprecip.predict") coef(coprecip.predict, stFit = coprecip.fit, stData = coprecip, burn = 50)
data("coprecip") data("coprecip.fit") data("coprecip.predict") coef(coprecip.predict, stFit = coprecip.fit, stData = coprecip, burn = 50)
A dataset containing sample spatially-aggregated climate data from the ERA-Interim and PRISM datasets. The response comes from PRISM: average monthly precipitation in a DJF winter. The covariates come from ERA-Interim: Colorado surface temperatures and Pacific Ocean sea surface temperatures. All data have been converted to standardized anomalies.
coprecip
A stData object with 3 years of observations
year labels for data columns
centers of grid cells for Colorado data
centers of grid cells for Pacific Ocean data
Array of design matrices for Colorado covariates
Matrix of precipitation observations
Matrix of Pacific Ocean data
Label for local covariate data, used by plotting functions
Label for response data, used by plotting functions
Label for remote covariate data, used by plotting functions
https://rda.ucar.edu/datasets/ds627.0/
data("coprecip") str(coprecip)
data("coprecip") str(coprecip)
An example stFit object containing output from a short run of the MCMC sampler that fits the RESP model to data.
coprecip.fit
An stFit object, which is a list of several objects
MCMC samples of model parameters
description of priors used to fit model
TRUE or FALSE to specify whether the spatial distances used to estimate spatial covariance parameters were in units of miles (TRUE) or kilometers (FALSE)
TRUE if remote covariates were not estimated
TRUE if local covariates were not estimated
(deprecated) TRUE if local covariates were estimated as a spatially-varying field
coordinates of remote knot locations
data("coprecip.fit") str(coprecip.fit)
data("coprecip.fit") str(coprecip.fit)
An example stPredict object containing predictions from a short run of the MCMC composition sampler. The output also contains teleconnection estimates.
coprecip.predict
An stPredict object, which is a list of several objects
A list containing summaries of posterior predictions
Posterior samples for predictions
centers of grid cells for Colorado data
TRUE if remote covariates were not estimated
(deprecated) TRUE if local covariates were estimated as a spatially-varying field
year labels for prediction timepoints
Label for response data, used by plotting functions
vector of probabilities for using posterior samples to return categorical predictions from the posterior prediction samples
Breakpoints used to discretize posterior predictive distribution at each coordinate in coords.s during composition sampling.
Summaries of posterior estimates of teleconnection effects
Summaries of posterior estimates of teleconnection effects after spatial basis function transformation
data("coprecip.predict") str(coprecip.predict)
data("coprecip.predict") str(coprecip.predict)
Evaluate kron(A,B) * C without storing kron(A,B)
dgemkmm(A, B, C)
A |
(m x n) matrix |
B |
(p x q) matrix |
C |
(nq x r) matrix |
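A small sanity check (a sketch, not from the package documentation): the result should match the direct computation that forms kron(A, B) explicitly.
set.seed(1)
A = matrix(rnorm(2 * 3), nrow = 2)       # (m x n) = (2 x 3)
B = matrix(rnorm(4 * 3), nrow = 4)       # (p x q) = (4 x 3)
C = matrix(rnorm(3 * 3 * 5), nrow = 9)   # (nq x r) = (9 x 5)

# compare the memory-efficient product to the explicit Kronecker product
max(abs(dgemkmm(A, B, C) - kronecker(A, B) %*% C))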
Uses the stats::prcomp function to implement EOF decompositions of data
eof(X, center = F, scale = F)
X |
[variable x observation] matrix of data for which to compute EOFs |
center |
TRUE/FALSE to center columns of X in call to prcomp |
scale |
TRUE/FALSE to scale columns of X in call to prcomp |
A list containing EOF patterns as columns, and their scores
data("coprecip") attach(coprecip) # compute ocean surface temperature eofs eofs = eof(Z) # view first EOF, which corresponds to the El-Nino pattern coords.r.mod = coords.r coords.r.mod[,1][coords.r.mod[,1]>0] = coords.r.mod[,1][coords.r.mod[,1]>0] - 360 fields::quilt.plot(coords.r.mod, eofs$patterns[,1]) # alternatively, the plot.stData function can directly compute and plot EOFs plot(coprecip, type='eof', pattern=1)
data("coprecip") attach(coprecip) # compute ocean surface temperature eofs eofs = eof(Z) # view first EOF, which corresponds to the El-Nino pattern coords.r.mod = coords.r coords.r.mod[,1][coords.r.mod[,1]>0] = coords.r.mod[,1][coords.r.mod[,1]>0] - 360 fields::quilt.plot(coords.r.mod, eofs$patterns[,1]) # alternatively, the plot.stData function can directly compute and plot EOFs plot(coprecip, type='eof', pattern=1)
Wrapper for a function to dump errors from C++
errDump(x, fname = file.path(tempdir(), "error_samplerState.RData"))
x |
Data to save |
fname |
Path/name to save data to |
This method is intended for use as the main helper function for extractStData.
extractRegion(
  sgdf,
  extent,
  type = "response",
  aggfact = NULL,
  mask = NULL,
  aspect = F,
  aspect.categories = NULL,
  slope = F
)
sgdf |
SpatialGridDataFrame containing data to extract |
extent |
raster::extent object featuring region to extract, or a SpatialPolygonsXXX object used for extracting areal data |
type |
whether to return the raw data, anomalies (data minus temporal average at each location), standardized anomalies (anomalies divided by temporal standard deviation at each location), or spatially standardized data (data minus overall spatial average divided by spatial std. dev.; each year gets its own spatial standardization ) |
aggfact |
if provided, will spatially average the data |
mask |
if an sgdf is provided, the data will be masked before extraction, aggregation, and anomaly computation |
aspect |
TRUE to return the aspect of the surface at each location instead of the value of the surface itself |
aspect.categories |
if aspect==TRUE, this specifies the number of discrete categories to divide aspect numbers (0-360) into. NULL if the original scale (0-360) should be kept. By design, the aspect categories will be centered on north in the first category. |
slope |
TRUE to return the slope of the surface at each location instead of the value of the surface itself |
a modified SpatialGridDataFrame, sgdf, with the climatology for each location accessible via attr(sgdf@data@values, 'scaled:center') if anomalies were computed
Basic extraction of SpatialGridDataFrame data for teleconnection analysis
extractStData(
  X,
  Y,
  Z,
  t = NULL,
  D.s,
  D.r,
  mask.s = NULL,
  mask.r = NULL,
  aggfact.s = NULL,
  aggfact.r = NULL,
  intercept = T,
  type.s = "response",
  type.r = "response",
  type.s.y = "response",
  X.lab = NULL,
  Y.lab = NULL,
  Z.lab = NULL,
  aspect = F,
  aspect.categories = 4,
  slope = F,
  colnames.X = NULL,
  formula = NULL
)
X |
SpatialGridDataFrame with local covariates. If X is a list, each SpatialGridDataFrame will be included as one covariate. |
Y |
SpatialGridDataFrame with response data |
Z |
SpatialGridDataFrame with remote covariates. If Z is a list, this function assumes each element of the list contains observations for the same covariate, but from different spatial regions. If Z is a list, D.r and mask.r must also be lists so that this function can know which regions to extract from each SpatialGridDataFrame |
t |
Timepoint from which to extract data from X, Y, and Z. If NULL, then all timepoints will be used. |
D.s |
c(xmin, xmax, ymin, ymax) region from which to extract data from X and Y, or a SpatialPolygonsXXX object containing boundaries of regions to extract areal data from. |
D.r |
c(xmin, xmax, ymin, ymax) region from which to extract data from Z |
mask.s |
SpatialGridDataFrame to be used as a mask when extracting data from X and Y. Locations in mask.s with NA values will be ignored when extracting data from X and Y. |
mask.r |
SpatialGridDataFrame to be used as a mask when extracting data from Z. Locations in mask.s with NA values will be ignored when extracting data from Z. |
aggfact.s |
If provided, will spatially average Y and X data |
aggfact.r |
If provided, will spatially average Z data |
intercept |
If TRUE, an intercept will be added to the design matrix |
type.s |
'response' 'anomaly' or 'std.anomaly' or a vector of these options depending on whether data extracted from X should be the observed data, anomalies, or standardized anomalies (where the climatology is computed from the observations as the pointwise temporal average) |
type.r |
'response' 'anomaly' or 'std.anomaly' or a vector of these options depending on whether data extracted from Z should be the observed data, anomalies, or standardized anomalies (where the climatology is computed from the observations as the pointwise temporal average) |
type.s.y |
'response' 'anomaly' or 'std.anomaly' depending on whether data extracted from Y should be the observed data, anomalies, or standardized anomalies (where the climatology is computed from the observations as the pointwise temporal average) |
X.lab |
name for X data (optional) |
Y.lab |
name for Y data (optional) |
Z.lab |
name for Z data (optional) |
aspect |
TRUE or vector of logicals (one for each X object) to return the aspect of the surface at each location instead of the value of the surface itself |
aspect.categories |
if aspect==TRUE, this specifies the number of discrete categories to divide aspect numbers (0-360) into. NULL if the original scale (0-360) should be kept. By design, the aspect categories will be centered on north in the first category. |
slope |
TRUE or vector of logicals (one for each X object) to return the slope of the surface at each location instead of the value of the surface itself |
colnames.X |
names of columns of X |
formula |
formula object to specify how to create the design matrix |
# the extractRegion and extractStData methods create data matrices from
# SpatialGridDataFrame objects

library(sp)

data("coprecip")
attach(coprecip)

#
# build SpatialGridDataFrame objects containing some of the coprecip data
#

gt = GridTopology(cellcentre.offset = apply(coords.s, 2, min),
                  cellsize = c(.5, .5),
                  cells.dim = c(20, 12))

# Note: This is an example only; this grid will not match coprecip$coords.r
gt.Z = GridTopology(cellcentre.offset = apply(coords.r, 2, min),
                    cellsize = c(1.4, 1.4),
                    cells.dim = c(101, 52))

Xd = data.frame(`1981` = X[,2,1], `1982` = X[,2,2])
colnames(Xd) = gsub('X','', colnames(Xd))
sgdf.x = SpatialGridDataFrame(gt, Xd)

Yd = data.frame(`1981` = Y[,1], `1982` = Y[,2])
colnames(Yd) = gsub('X','', colnames(Yd))
sgdf.y = SpatialGridDataFrame(gt, Yd)

Zd = data.frame(`1981` = Z[,1], `1982` = Z[,2])
colnames(Zd) = gsub('X','', colnames(Zd))
sgdf.z = SpatialGridDataFrame(gt.Z, Zd)

# only extract a region of the coordinates
coprecip2 = extractStData(sgdf.x, sgdf.y, sgdf.z,
                          D.s = c(-105, -103, 37, 41),
                          D.r = c(-160, -100, -15, 0))
Solves kron(A, B) x = y for x, where A and B are lower triangular matrices and kron denotes the Kronecker product.
forwardsolve.kron(A, B, y)
A |
a lower triangular matrix (the first Kronecker factor) |
B |
a lower triangular matrix (the second Kronecker factor) |
y |
a matrix or vector with nrow(A) * nrow(B) rows containing the right-hand side(s) of the system |
x Solution of the system kron(A, B) x = y
set.seed(2018)

coord.s = matrix(runif(100), ncol=2)
coord.r = matrix(runif(50), ncol=2)

d.s = as.matrix(dist(coord.s))
d.r = as.matrix(dist(coord.r))

S1 = exp(-d.s)
S2 = exp(-d.r)

A = t(chol(S1))
B = t(chol(S2))

s = 15
x = matrix(runif(nrow(S1)*nrow(S2)*s), ncol=s)
y = kronecker(A,B) %*% x

x.solved = forwardsolve.kron(A, B, y)

max(abs(x - x.solved))
Compute Highest posterior density intervals from posterior samples
## S3 method for class 'stFit'
HPDinterval(stFit, burn = 1, prob = 0.95)
stFit |
stFit object containing posterior samples for model |
burn |
number of posterior samples to reject before computing estimates |
prob |
The target probability content of the intervals |
data("coprecip.fit") HPDinterval.stFit(coprecip.fit, burn = 50)
data("coprecip.fit") HPDinterval.stFit(coprecip.fit, burn = 50)
Samples a random matrix from the inverse-Wishart distribution with scale matrix Psi and n degrees of freedom.
invWSamp(Psi, n)
Psi |
a symmetric positive definite scale matrix |
n |
degrees of freedom parameter |
A = matrix(c(1,.5,.5,1), ncol=2)
W = invWSamp(A, 3)
Draws a sample from a mean-zero multivariate normal distribution whose covariance matrix is the Kronecker product kron(A, B).
kronSamp(A, B)
A |
a covariance matrix; the first factor of the Kronecker product |
B |
a covariance matrix; the second factor of the Kronecker product |
A = matrix(c(1,.5,.5,1), ncol=2)
B = diag(2)
x = kronSamp(A, B)
Formatting for latitude scales in ggplot spatial maps
lat_trans()
Formatting for longitude scales in ggplot spatial maps
lon_trans()
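A hedged sketch of how these transformations might be used; it assumes lat_trans() and lon_trans() return transformation objects compatible with ggplot2's continuous scales:
library(ggplot2)

data("coprecip")
df = data.frame(lon = coprecip$coords.s[,1], lat = coprecip$coords.s[,2])

# format axis labels as degrees longitude/latitude (assumed behavior)
ggplot(df, aes(x = lon, y = lat)) +
  geom_point() +
  scale_x_continuous(trans = lon_trans()) +
  scale_y_continuous(trans = lat_trans())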
This function evaluates the Matern covariance function for the elements of a vector.
maternArray(d, scale = 1, range = 1, smoothness = 0.5, nugget = 0)
d |
A numeric vector of distances at which the Matern correlation function should be evaluated. |
scale |
Scales correlations to covariances. |
range |
Matern range parameter. Controls the decay of pointwise correlations as a function of distance. |
smoothness |
Matern smoothness parameter. Controls the number of process derivatives. |
nugget |
Spatial covariance nugget. |
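A minimal sketch evaluating the covariance at a few distances; with smoothness = 0.5 the Matern reduces to the exponential covariance scale * exp(-d / range):
d = seq(0, 5, length.out = 6)

# exponential special case (smoothness = 0.5)
maternArray(d, scale = 1, range = 2, smoothness = 0.5, nugget = 0)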
This function evaluates the Matern covariance function for the elements of a (potentially non-square) spatial distance matrix
maternCov(d, scale = 1, range = 1, smoothness = 0.5, nugget = 0)
d |
A numeric vector or matrix of distances at which the Matern correlation function should be evaluated. |
scale |
Scales correlations to covariances. |
range |
Matern range parameter. Controls the decay of pointwise correlations as a function of distance. |
smoothness |
Matern smoothness parameter. Controls the number of process derivatives. |
nugget |
Spatial covariance nugget. |
data("coprecip") attach(coprecip) # compute spatial covariance matrix for an exponential covariance function # using Colorado coordinates Sigma = maternCov(fields::rdist.earth(coords.s), scale = 1, range = 250, smoothness = .5, nugget = 0)
data("coprecip") attach(coprecip) # compute spatial covariance matrix for an exponential covariance function # using Colorado coordinates Sigma = maternCov(fields::rdist.earth(coords.s), scale = 1, range = 250, smoothness = .5, nugget = 0)
The effective range for an isotropic spatial correlation function is commonly defined to be the distance beyond which the correlation becomes small, typically below .05. Given range and smoothness parameters for a Matern covariance function, this function numerically searches for this distance. Note that the scale is not important for this calculation.
maternEffectiveRange(cor = 0.05, range = 1, smoothness = 0.5)
cor |
Effective correlation to check for |
range |
Matern range parameter. Controls the decay of pointwise correlations as a function of distance. |
smoothness |
Matern smoothness parameter. Controls the number of process derivatives. |
# effective range for exponential covariance function with range = 1,
# which is theoretically known to equal -ln(.05)
maternEffectiveRange(cor = .05, range = 1, smoothness = .5)
Combine results from composition sampler
mergeComposition(xfull, yfull)
xfull |
Raw output from one run of the Rcpp/Armadillo composition sampler |
yfull |
Raw output from another run of the Rcpp/Armadillo composition sampler |
This function combines the sample covariance information from two samples (of the same phenomenon) to return the sample covariance matrix of the union of the two samples.
mergeCovmat(
  A.cov.xy,
  B.cov.xy,
  A.mean.x,
  A.mean.y,
  B.mean.x,
  B.mean.y,
  A.n,
  B.n
)
A.cov.xy |
sample covariance matrix from the first sample, 'A' |
B.cov.xy |
sample covariance matrix from the second sample, 'B' |
A.mean.x |
sample mean from the first sample, 'A' |
A.mean.y |
sample mean from the first sample, 'A' |
B.mean.x |
sample mean from the second sample, 'B' |
B.mean.y |
sample mean from the second sample, 'B' |
A.n |
sample size from the first sample, 'A' |
B.n |
sample size from the second sample, 'B' |
This function assumes the sample statistics are normalized by n (the maximum likelihood estimator) rather than n-1 (the unbiased estimator).
Pebay, P., 2008, Formulas for Robust, One-Pass Parallel Computation of Covariances and Arbitrary-Order Statistical Moments: Sandia Report.
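A hedged sketch illustrating the pairwise-update idea: merging the divide-by-n cross-covariances of two batches should reproduce the cross-covariance of the pooled sample.
set.seed(1)
X = matrix(rnorm(60), ncol = 2)
Y = matrix(rnorm(60), ncol = 2)

# divide-by-n (MLE) cross-covariance of two multivariate samples
covn = function(a, b) crossprod(scale(a, scale = FALSE), scale(b, scale = FALSE)) / nrow(a)

A = 1:20   # first batch
B = 21:30  # second batch

merged = mergeCovmat(covn(X[A,], Y[A,]), covn(X[B,], Y[B,]),
                     colMeans(X[A,]), colMeans(Y[A,]),
                     colMeans(X[B,]), colMeans(Y[B,]),
                     A.n = length(A), B.n = length(B))

max(abs(merged - covn(X, Y)))  # should be near zero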
This function combines the sample mean information from two samples (of the same phenomenon) to return the sample mean of the union of the two samples.
mergeMean(x.mean, y.mean, x.n, y.n)
x.mean |
sample mean from the first sample, 'x' |
y.mean |
sample mean from the second sample, 'y' |
x.n |
sample size from the first sample, 'x' |
y.n |
sample size from the second sample, 'y' |
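A minimal sketch: merging the means of two batches should reproduce the mean of the pooled sample.
set.seed(1)
x = rnorm(25)
y = rnorm(40)

# merged batch mean vs. pooled-sample mean; the two should agree
c(mergeMean(mean(x), mean(y), length(x), length(y)), mean(c(x, y)))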
This function combines the sample variance information from two samples (of the same phenomenon) to return the sample variance of the union of the two samples.
mergeVar(x.var, y.var, x.mean, y.mean, x.n, y.n)
x.var |
sample variance from the first sample, 'x' |
y.var |
sample variance from the second sample, 'y' |
x.mean |
sample mean from the first sample, 'x' |
y.mean |
sample mean from the second sample, 'y' |
x.n |
sample size from the first sample, 'x' |
y.n |
sample size from the second sample, 'y' |
This function assumes the sample statistics are normalized by n (the maximum likelihood estimator) rather than n-1 (the unbiased estimator).
Chan, T.F., Golub, G.H., and LeVeque, R.J., 1979, Updating formulae and a pairwise algorithm for computing sample variances: Technical Report, Stanford University.
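A minimal sketch using divide-by-n variances, per the note above: merging the two batch variances should reproduce the pooled-sample variance.
set.seed(1)
x = rnorm(25)
y = rnorm(40)

# divide-by-n (MLE) variance
varn = function(v) mean((v - mean(v))^2)

merged = mergeVar(varn(x), varn(y), mean(x), mean(y), length(x), length(y))
c(merged, varn(c(x, y)))  # should agree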
This function provides basic plotting for telefit package data.
## S3 method for class 'stData'
plot(
  x,
  type = "response",
  t = NULL,
  p = NULL,
  map = "world",
  region = ".",
  coord.s = NULL,
  coord.r = NULL,
  zlim = NULL,
  fill.lab = NULL,
  lab.teleconnection = expression(alpha),
  fill.lab.width = 20,
  category.breaks = NULL,
  coords.knots = NULL,
  signif.telecon = F,
  dots = NULL,
  pattern = 1,
  lwd = 1.75,
  cutoff = 0.9,
  signif.level = 0.05,
  alpha = 0.2,
  zmid = 0,
  contour = c(F, F),
  ...
)
x |
Object of class stData to plot. |
type |
One of the following options to specify what type of plot to build
|
t |
timepoint to plot. Will automatically plot the first timepoint if t=NULL. |
p |
column index of local covariate to plot if type='covariate'. Will automatically assume the local covariate data includes an intercept and will plot the second column if p=NULL. |
map |
name of map provided by the maps package. These include county, france, italy, nz, state, usa, world, world2. By default, all stData plots will include us state outlines. |
region |
name of subregions to include. Defaults to . which includes all subregions. See documentation for map for more details. |
coord.s |
if plot type is 'teleconnection', specifies the longitude and latitude of local coordinate for which to plot teleconnection effects. if NULL, the middle local coordinate will be plotted. |
coord.r |
if plot type is 'teleconnection_local', specifies the longitude and latitude of remote coordinate for which to plot associated teleconnection effects. if NULL, the middle remote coordinate will be plotted. |
zlim |
c(min, max) vector that specifies the colorscale limits |
fill.lab |
Optional label to override the default fill scale labels |
lab.teleconnection |
label used for fill scale in teleconnection plot |
fill.lab.width |
line width for fill scale label |
category.breaks |
[ncoords x ncats] list of breakpoints used for binning responses into categories |
coords.knots |
if plot type is 'remote', specifies the longitude and latitude of knot locations to overlay on the 'remote' plot |
signif.telecon |
if TRUE, will highlight significant grid cells if the plotting data contain a signif column |
dots |
additional named arguments with defaults to pass to additional functions |
pattern |
if type=='eof' this specifies which (remote) EOF pattern to plot or if type=='eof_scores' this (vector) specifies which (remote) EOF pattern scores to plot |
lwd |
line width for when plotting with signif.telecon==T |
cutoff |
Used to denote where this proportion of variance is achieved in the eof_scree plots |
signif.level |
significance level for eof_cor significance highlighting |
alpha |
the level of fading that should be applied to insignificant grid boxes when plotting significant effects |
zmid |
number that specifies the midpoint of the colorscale |
contour |
c(TRUE, TRUE) to plot local and remote responses as contours vs. observations |
... |
additional arguments to pass to functions |
a ggplot object with the specified map
data("coprecip") p = plot(coprecip)
data("coprecip") p = plot(coprecip)
This function provides basic plotting for telefit package data.
## S3 method for class 'stFit'
plot(
  x,
  type = "density",
  stData = NULL,
  coord.s = NULL,
  coord.knot = NULL,
  text.size = NULL,
  axis.text.size = NULL,
  title.text.size = NULL,
  burn = 1,
  signif.telecon = F,
  p = 1,
  local.covariate = NULL,
  lwd = NULL,
  facet.signif = 3,
  stat.smooth.bw = NULL,
  stat.smooth.degree = NULL,
  dots = NULL,
  ...
)
x |
Object of class stFit to plot. |
type |
One of the following options to specify what type of plot to build
|
stData |
Object of class stData to provide coordinate and related information for plotting estimated teleconnection effects |
coord.s |
if plot type is 'teleconnection', specifies the longitude and latitude of local coordinate for which to plot estimated teleconnection effects. if NULL, the middle local coordinate will be plotted. |
coord.knot |
if plot type is 'teleconnection_knot_influence' or 'teleconnection_knot_local', specifies the longitude and latitude of knot coordinate for which to plot influence of remote coefficient on remote covariates, or the teleconnection coefficients associated with coord.knot |
text.size |
number specifying the size of text labels |
axis.text.size |
number specifying the size of axis text labels |
title.text.size |
number specifying the size of title |
burn |
number of observations to exclude from graph |
signif.telecon |
if TRUE, will highlight significant teleconnection effects when type=='teleconnection' |
p |
If stFit was fit with spatially varying coefficients, p specifies the index of the spatially varying coefficient to plot |
local.covariate |
data.frame with variables, 'lon.Y', 'lat.Y', 'x' that will be plotted against teleconnection effects if type=='teleconnection_knot_transect' |
lwd |
specifies linewidth for plots that include reference lines |
facet.signif |
number of significant figures to round facet latitudes and longitudes for if type=='teleconnection_knot_transect' |
stat.smooth.bw |
if type=='teleconnection_knot_transect' this specifies the bandwidth of the non-parametric smooth of the estimates |
stat.smooth.degree |
if type=='teleconnection_knot_transect' this specifies the degree of the non-parametric smooth of the estimates |
dots |
additional named arguments with defaults to pass to additional functions |
... |
additional arguments to pass to functions |
a ggplot object with the specified map
data("coprecip.fit") plot(coprecip.fit, burn = 50, type = 'trace')
data("coprecip.fit") plot(coprecip.fit, burn = 50, type = 'trace')
This function provides basic plotting for telefit package data.
## S3 method for class 'stPredict'
plot(
  x,
  type = "prediction",
  t = NULL,
  stFit = NULL,
  stData = NULL,
  err.comparison = NULL,
  err.var = NULL,
  err.lab = err.var,
  pattern = 1,
  dots = NULL,
  burn = 1,
  signif.telecon = F,
  ...
)
x |
Object of class stPredict to plot. |
type |
One of the following options to specify what type of plot to build
|
t |
timepoint to plot. Will automatically plot the first timepoint if t=NULL. |
stFit |
Object of class stFit to provide related information and structures for plotting estimated teleconnection effects |
stData |
Object of class stData to provide coordinate and related information for plotting estimated teleconnection effects |
err.comparison |
data.frame with Year column and a column for a variable that will be used to plot annual errors against |
err.var |
name of variable in err.comparison for plotting against |
err.lab |
label for name of variable in err.comparison for plotting against |
pattern |
if type=='eof_alpha_knots', this specifies which EOF the remote coefficients should be mapped onto and then plotted over the local domain |
dots |
additional named arguments with defaults to pass to additional functions |
burn |
number of observations to exclude from graph |
signif.telecon |
TRUE to highlight significant teleconnection effects |
... |
additional arguments to be passed to lower-level plotting functions |
a ggplot object with the specified map
data("coprecip.predict") p = plot(coprecip.predict, t=1981)
data("coprecip.predict") p = plot(coprecip.predict, t=1981)
This function provides basic plotting for analyses returned from cor.tel
## S3 method for class 'teleCor'
plot(
  x,
  signif = F,
  coord.s = NULL,
  map = "world",
  region = ".",
  zlim = NULL,
  dots = NULL,
  ...
)
x |
object of class teleCor, containing pointwise correlations |
signif |
if TRUE, then teleCor must have a column labeled 'signif' that indicates which correlations are significant. These correlations will be printed in bold, and the rest will be printed more lightly |
coord.s |
specifies the longitude and latitude of local coordinate for which to plot pointwise correlations (if type=='remote'). if NULL, the middle local coordinate will be plotted. |
map |
name of map provided by the maps package. These include county, france, italy, nz, state, usa, world, world2. By default, all stData plots will include us state outlines. |
region |
name of subregions to include. Defaults to . which includes all subregions. See documentation for map for more details. |
zlim |
c(min, max) vector that specifies the colorscale limits |
dots |
additional named arguments with defaults to pass to additional functions |
... |
additional arguments to be passed to lower-level plotting functions |
a ggplot object with the specified map
data("coprecip") cors = teleCor(coprecip) p = plot(cors, coords.s = c(-105, 39.73))
data("coprecip") cors = teleCor(coprecip) p = plot(cors, coords.s = c(-105, 39.73))
Draw random matrices from the matrix normal distribution. Note that an observation W from this distribution, when vectorized, follows a multivariate normal distribution with mean vec(M) and a covariance matrix built from the Kronecker product of the row and column covariance matrices U and V.
rmatnorm(n, U, V, M = matrix(0, nrow = nrow(U), ncol = nrow(V)))
n |
Number of random matrices to simulate |
U |
Covariance matrix defining dependence between rows |
V |
Covariance matrix defining dependence between columns |
M |
average value of each entry in the sampled matrices |
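A hedged sketch drawing a few matrix-normal samples with an exponential spatial covariance across rows and a small 2 x 2 covariance across columns:
set.seed(2018)
coords = matrix(runif(20), ncol = 2)

U = exp(-as.matrix(dist(coords)))        # 10 x 10 row covariance
V = matrix(c(1, .5, .5, 1), ncol = 2)    # 2 x 2 column covariance

W = rmatnorm(n = 3, U = U, V = V)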
Draws a random matrix from the Wishart distribution with scale matrix V and n degrees of freedom.
rwishart(V, n)
V |
symmetric positive definite p x p scale matrix |
n |
degrees of freedom (greater than p-1) |
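A minimal sketch, mirroring the invWSamp example above: draw a Wishart matrix with a 2 x 2 scale matrix and 3 degrees of freedom.
V = matrix(c(1, .5, .5, 1), ncol = 2)
W = rwishart(V, 3)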
Provides basic measures for evaluating the fit, including the Brier skill score against the climatology, MSPE, PPL, overall correlation, and coverage probabilities for confidence intervals.
stEval(forecast, Y, clim)
forecast |
stPredict object containing predictions for Y |
Y |
observed values of the response |
clim |
the climatology for each location in Y |
data("coprecip") data("coprecip.predict") clim = rowMeans(coprecip$Y) coprecip.predict = stEval(coprecip.predict, coprecip$Y, clim)
data("coprecip") data("coprecip.predict") clim = rowMeans(coprecip$Y) coprecip.predict = stEval(coprecip.predict, coprecip$Y, clim)
Fit the remote effects spatial process (RESP) model
stFit(
  stData = NULL,
  priors,
  maxIt,
  X = stData$X,
  Y = stData$Y,
  Z = stData$Z,
  coords.s = stData$coords.s,
  coords.r = stData$coords.r,
  rw.initsd = NULL,
  returnll = T,
  miles = T,
  C = 1,
  alpha = 0.44,
  localOnly = F,
  varying = F,
  remoteOnly = F,
  coords.knots
)
stData |
Object with class 'stData' containing data needed to fit this model. The data need only be manually entered if not using a stData object. |
priors |
A list containing parameters for the prior distributions. The list needs to contain the following values
|
maxIt |
number of iterations to run the MCMC chain for |
X |
[ns, p, nt] array of design matrices with local covariates |
Y |
[ns, nt] matrix with response data |
Z |
[nr, nt] matrix with remote covariates |
coords.s |
matrix with coordinates where responses were observed (lon, lat) |
coords.r |
matrix with coordinates where remote covariates were observed (lon, lat) |
rw.initsd |
A list containing initial standard deviation parameters for the MCMC parameters requiring random walk updates
|
returnll |
TRUE to compute the model log-likelihood at each iteration |
miles |
TRUE if covariance matrix distances should be in miles, FALSE for kilometers |
C |
scaling factor used in adapting random walk proposal variances. |
alpha |
target acceptance rate for random walk proposals. |
localOnly |
TRUE to fit the model without the teleconnection effects (typically for evaluating impact of teleconnection effects) |
varying |
(deprecated) TRUE to fit the model with spatially varying local coefficients |
remoteOnly |
TRUE to fit the model without local effects. This will fit a local intercept, but will not incorporate local covariates. |
coords.knots |
matrix with coordinates where remote teleconnections will be based (lon, lat) |
library(dplyr)
library(foreach)
library(itertools)

set.seed(2018)

data("coprecip")
data("coprecip.fit")
attach(coprecip)

coprecip.fit = stFit(stData = coprecip, priors = coprecip.fit$priors,
                     maxIt = 10, coords.knots = coprecip.fit$coords.knots)
Compute log likelihood for model
stLL(
  stData,
  stFit,
  beta,
  sigmasq_y,
  sigmasq_r,
  sigmasq_eps,
  rho_y,
  rho_r,
  X = stData$X,
  Y = stData$Y,
  Z = stData$Z,
  coords.s = stData$coords.s,
  coords.r = stData$coords.r,
  coords.knots = stFit$coords.knots,
  miles = TRUE,
  sigmasq_r_eps
)
stData |
Object with class 'stData' containing data needed to fit this model. The data need only be manually entered if not using a stData object. |
stFit |
Object with class 'stFit' containing posterior parameter samples needed to composition sample the teleconnection effects and generate posterior predictions. The data needed from stFit need only be manually entered if not using a stData object. |
beta |
values of the beta parameter (local covariate coefficients) at which to evaluate the likelihood |
sigmasq_y |
values of the sigmasq_y parameter (variance of the local spatial process) |
sigmasq_r |
values of the sigmasq_r parameter (variance of the remote coefficients) |
sigmasq_eps |
values of the sigmasq_eps parameter (nugget variance for the response) |
rho_y |
values of the rho_y parameter (spatial range for the local covariance) |
rho_r |
values of the rho_r parameter (spatial range for the remote covariance) |
X |
[ns, p, nt] array of design matrices with local covariates |
Y |
[ns, nt] matrix with response data |
Z |
[nr, nt] matrix with remote covariates |
coords.s |
matrix with coordinates where responses were observed (lon, lat) |
coords.r |
matrix with coordinates where remote covariates were observed (lon, lat) |
coords.knots |
matrix with coordinates of knots for remote covariates (lon, lat) |
miles |
TRUE if distances should be computed in miles (kilometers otherwise) |
sigmasq_r_eps |
values of the sigmasq_r_eps parameter (nugget variance for the remote coefficients) |
library(dplyr)
library(foreach)
library(itertools)

set.seed(2018)

data("coprecip")
data("coprecip.fit")
attach(coprecip)

ests = coef(coprecip.fit, burn = 50)

ll = stLL(stData = coprecip, stFit = coprecip.fit,
          beta = matrix(ests$beta, ncol = 2),
          sigmasq_y = ests$sigmasq_y,
          sigmasq_r = ests$sigmasq_r,
          sigmasq_eps = ests$sigmasq_eps,
          rho_y = ests$rho_y,
          rho_r = ests$rho_r,
          sigmasq_r_eps = 0)
Predict response at new timepoints by drawing samples of the response from the posterior predictive distribution. Since this requires sampling teleconnection effects, this method can return estimates of the teleconnection effects as a by-product.
stPredict(
  stFit,
  stData,
  stDataNew,
  burn = 1,
  prob = 0.95,
  ncores = 1,
  conf = 0.95,
  tLabs = stDataNew$tLabs,
  X = stData$X,
  Y = stData$Y,
  Z = stData$Z,
  Xnew = stDataNew$X,
  Znew = stDataNew$Z,
  coords.s = stData$coords.s,
  coords.r = stData$coords.r,
  returnAlphas = T,
  cat.probs = c(1/3, 2/3),
  returnFullAlphas = F
)
stFit |
Object with class 'stFit' containing posterior parameter samples needed to composition sample the teleconnection effects and generate posterior predictions. The data needed from stFit need only be manually entered if not using a stData object. |
stData |
Object with class 'stData' containing data needed to fit this model. The data need only be manually entered if not using a stData object. |
stDataNew |
object of class stData that includes information needed for making forecasts. If response data is included, this function will automatically run stEval using the empirical climatology as the reference forecast |
burn |
number of posterior samples to burn before drawing composition samples |
prob |
confidence level for approximate confidence intervals of teleconnection effects (only needed if returnAlphas==TRUE) |
ncores |
Since the teleconnection effects and posterior predictions can be sampled in parallel, this parameter lets users specify the number of cores to use to draw teleconnection and prediction samples |
conf |
Parameter specifying the HPD level to compute for posterior predictive samples |
tLabs |
Forecast timepoint labels |
X |
[ns, p, nt] array of design matrices with local covariates |
Y |
[ns, nt] matrix with response data |
Z |
[nr, nt] matrix with remote covariates |
Xnew |
[ns, p, nt0] array of design matrices with local covariates at forecast timepoints |
Znew |
[nr, nt0] matrix with remote covariates at forecast timepoints |
coords.s |
matrix with coordinates where responses were observed (lon, lat) |
coords.r |
matrix with coordinates where remote covariates were observed (lon, lat) |
returnAlphas |
TRUE to return the teleconnection effects sampled at knot locations. Note that only basic summary information about the teleconnection effects will be returned. |
cat.probs |
vector of probabilities for also returning categorical predictions from the posterior prediction samples; NULL otherwise |
returnFullAlphas |
TRUE to return the teleconnection effects. Note that only basic summary information about the teleconnection effects will be returned. |
set.seed(2018)

data("coprecip")
data("coprecip.fit")

coprecip.predict = stPredict(stFit = coprecip.fit, stData = coprecip,
                             stDataNew = coprecip, burn = 90,
                             returnFullAlphas = FALSE)
This function simulates spatio-temporal data. The intention is that data Y and latent parameters alpha will be generated using provided covariates X and Z; spatial domains coords.s, coords.r, and coords.knots; and model parameters.
stSimulate(dat.train, dat.test, coords.knots, params, miles = T)
dat.train |
stData object with training data to simulate new Y values for |
dat.test |
stData object with test data to simulate new Y values for |
coords.knots |
matrix with coordinates of knots for remote covariates (lon, lat) |
params |
A list containing model parameters for use in simulation
|
miles |
TRUE to compute distances for evaluating covariance functions in miles. This is important since the interpretations of the cov.r and cov.s parameters depend on the units with which distance is measured. |
set.seed(2018)

data("coprecip")
data("coprecip.fit")

coprecip.predict = stPredict(stFit = coprecip.fit, stData = coprecip,
                             stDataNew = coprecip, burn = 90,
                             returnFullAlphas = FALSE)
VIFs will be computed at the posterior mean of all covariance parameters.
stVIF(stData, stFit, burn)
stData |
Object with class 'stData' containing data needed to fit this model. |
stFit |
Object with class 'stFit' containing posterior parameter samples needed to composition sample the teleconnection effects and generate posterior predictions. |
burn |
number of posterior samples to burn before drawing composition samples |
data("coprecip") data("coprecip.fit") stVIF(stData = coprecip, stFit = coprecip.fit, burn = 50)
data("coprecip") data("coprecip.fit") stVIF(stData = coprecip, stFit = coprecip.fit, burn = 50)
This function computes approximate normal intervals, etc. for fitted alphas.
summariseAlpha(alpha, prob = 0.95, coords.s, coords.r)
alpha |
structure containing posterior inference for remote coefficients |
prob |
confidence level for confidence intervals and significance |
coords.s |
matrix with coordinates where responses were observed (lon, lat) |
coords.r |
matrix with coordinates where remote covariates were observed (lon, lat) |
## Not run: 
data("coprecip")
data("coprecip.fit")
attach(coprecip)

# sample posterior predictive distributions AND estimate teleconnection effects
coprecip.predict = stPredict(stFit = coprecip.fit, stData = coprecip,
                             stDataNew = coprecip, burn = 90,
                             returnFullAlphas = TRUE)

alpha.90 = summariseAlpha(alpha = coprecip.predict$alpha, prob = .9,
                          coords.s = coords.s, coords.r = coords.r)

## End(Not run)
This function computes approximate normal intervals, etc. for fitted eof-mapped alphas.
summariseEOFAlpha(eof_alpha, prob = 0.95, coords.s)
eof_alpha |
structure containing posterior inference for transformed remote coefficients |
prob |
confidence level for confidence intervals and significance |
coords.s |
matrix with coordinates where responses were observed (lon, lat) |
data("coprecip.predict") attach(coprecip.predict) alpha.eof.90 = summariseEOFAlpha(eof_alpha = eof_alpha_knots, prob = .9, coords.s = coords.s)
data("coprecip.predict") attach(coprecip.predict) alpha.eof.90 = summariseEOFAlpha(eof_alpha = eof_alpha_knots, prob = .9, coords.s = coords.s)
This function prints basic summary info for telefit stPredict objects
## S3 method for class 'stPredict'
summary(object, t = NULL, digits = NULL, ...)
object |
Object of class stPredict to summarise |
t |
timepoint to plot. Will automatically plot all timepoints and overall summary if NULL. |
digits |
Number of digits to pass to signif, if not NULL. |
... |
S3 generic/method consistency |
data("coprecip.predict") summary(coprecip.predict)
data("coprecip.predict") summary(coprecip.predict)
Fit a spatially varying coefficient model
svcFit(
  y,
  X,
  z,
  coords,
  miles = T,
  priors,
  nSamples,
  thin = 1,
  rw.initsd = 0.1,
  inits = list(),
  C = 1,
  alpha = 0.44
)
y |
Vector containing responses for each timepoint; the vector is blocked by timepoint. |
X |
Matrix containing local covariates for each timepoint; each row holds the covariates for one location and timepoint, and the matrix is blocked by timepoint. |
z |
Matrix containing remote covariates; each column holds the remote covariates for one timepoint. |
coords |
n x 2 matrix containing lon-lat coordinates for locations. |
miles |
T/F for whether to compute great circle distances in miles (T) or km (F) |
priors |
A list containing parameters for the prior distributions. The list needs to contain the following values
|
nSamples |
number of MCMC iterations to run |
thin |
MCMC thinning; defaults to no thinning (thin=1) |
rw.initsd |
Initial proposal standard deviation for RW samplers |
inits |
optional list containing starting parameters for MCMC sampler |
C |
scaling factor used in adapting random walk proposal variances. |
alpha |
target acceptance rate for random walk proposals. |
library(fields)
library(mvtnorm)

set.seed(2018)

# set key parameters
dims = list(N=100, nt=3, k=2, p=2)
params = list(sigmasq=.2, rho=.3, eps=.5, nu=.5)

# generate parameters and data
coords = matrix( runif(2 * dims$N), ncol = 2 )
X = matrix( rnorm(dims$p * dims$N * dims$nt), ncol = dims$p )
beta = c(-1, .5)
z = matrix( rnorm(dims$k * dims$nt), ncol = dims$nt)
H = maternCov(rdist.earth(coords), scale = params$sigmasq, range = params$rho,
              smoothness = params$nu, nugget = params$sigmasq * params$eps)
Hinv = solve(H)
Tm = matrix(c(.5,.2, .2, .5), ncol=2)/2
theta = kronSamp(Hinv, Tm)

# generate response
xb = X %*% beta
zt = as.numeric(apply(z, 2, function(d) { kronecker(diag(dims$N), t(d)) %*% theta }))
w = kronSamp(diag(dims$nt), H)
y = xb + zt + w

# fit model
it = 100
priors = list(
  T = list(Psi = .1*diag(dims$k), nu = dims$k),
  beta = list(Linv = diag(dims$p) * 1e-2),
  sigmasq = list(a=2, b=1),
  rho = list(L=0, U=1),
  cov = list(nu=.5)
)
fit = svcFit(y=y, X=X, z=z, coords=coords, priors=priors, nSamples=it)

#
# predict at new timepoints
#

# generate parameters and data
nt0 = 3
Xn = matrix( rnorm(dims$p * dims$N * nt0), ncol = dims$p )
zn = matrix( rnorm(dims$k * nt0), ncol = nt0)

# generate response
xbn = Xn %*% beta
ztn = as.numeric(apply(zn, 2, function(d) { kronecker(diag(dims$N), t(d)) %*% theta }))
wn = kronSamp(diag(nt0), H)
yn = xbn + ztn + wn

# predict responses
pred = svcPredict(fit, Xn, zn, burn = 50)
Make predictions using a fitted varying coefficient model
svcPredict(
  fit,
  Xn = NULL,
  Zn = NULL,
  stData = NULL,
  stDataNew = NULL,
  burn = 0,
  cat.probs = c(1/3, 2/3),
  conf = 0.95
)
fit |
svcFit object containing posterior samples |
Xn |
[nr*nt, p] matrix of local covariates at new timepoint |
Zn |
[nr, nt] matrix of remote covariates at new timepoints |
stData |
Object with class 'stData' containing data needed to fit this model. The data is used to compute empirical quantiles for making categorical predictions. |
stDataNew |
object of class stData that includes information needed for making forecasts. |
burn |
number of posterior samples to burn from fit |
cat.probs |
vector of probabilities for also returning categorical predictions from the posterior prediction samples; NULL otherwise |
conf |
Parameter specifying the HPD level to compute for posterior predictive samples |
library(fields)
library(mvtnorm)

set.seed(2018)

# set key parameters
dims = list(N=100, nt=3, k=2, p=2)
params = list(sigmasq=.2, rho=.3, eps=.5, nu=.5)

# generate parameters and data
coords = matrix( runif(2 * dims$N), ncol = 2 )
X = matrix( rnorm(dims$p * dims$N * dims$nt), ncol = dims$p )
beta = c(-1, .5)
z = matrix( rnorm(dims$k * dims$nt), ncol = dims$nt)
H = maternCov(rdist.earth(coords), scale = params$sigmasq, range = params$rho,
              smoothness = params$nu, nugget = params$sigmasq * params$eps)
Hinv = solve(H)
Tm = matrix(c(.5,.2, .2, .5), ncol=2)/2
theta = kronSamp(Hinv, Tm)

# generate response
xb = X %*% beta
zt = as.numeric(apply(z, 2, function(d) { kronecker(diag(dims$N), t(d)) %*% theta }))
w = kronSamp(diag(dims$nt), H)
y = xb + zt + w

# fit model
it = 100
priors = list(
  T = list(Psi = .1*diag(dims$k), nu = dims$k),
  beta = list(Linv = diag(dims$p) * 1e-2),
  sigmasq = list(a=2, b=1),
  rho = list(L=0, U=1),
  cov = list(nu=.5)
)
fit = svcFit(y=y, X=X, z=z, coords=coords, priors=priors, nSamples=it)

#
# predict at new timepoints
#

# generate parameters and data
nt0 = 3
Xn = matrix( rnorm(dims$p * dims$N * nt0), ncol = dims$p )
zn = matrix( rnorm(dims$k * nt0), ncol = nt0)

# generate response
xbn = Xn %*% beta
ztn = as.numeric(apply(zn, 2, function(d) { kronecker(diag(dims$N), t(d)) %*% theta }))
wn = kronSamp(diag(nt0), H)
yn = xbn + ztn + wn

# predict responses
pred = svcPredict(fit, Xn, zn, burn = 50)
Computes empirical correlations between rows of Y and Z, for use as exploratory analysis of teleconnection patterns between locations indexed by coords.s and coords.r. Optionally, an stData object containing Y and Z can be passed instead.
teleCor(
  stData = NULL,
  Y = stData$Y,
  Z = stData$Z,
  coords.s = stData$coords.s,
  coords.r = stData$coords.r
)
stData |
stData object containing data to analyze |
Y |
[ny x nt] matrix of local observations, with locations (coords.s) in rows and timepoints in columns |
Z |
[nz x nt] matrix of remote observations, with locations (coords.r) in rows and timepoints in columns |
coords.s |
coordinates of locations in Y |
coords.r |
coordinates of locations in Z |
list with a matrix 'cor' containing correlations. The columns index remote coordinates, while the rows index the local coordinates. The returned list also includes the coordinates.
data("coprecip") cors = teleCor(coprecip)
data("coprecip") cors = teleCor(coprecip)
The package telefit provides functions for fitting the remote effects spatial process (RESP) model.