Title: | MCMC Simulation and Convergence Evaluation using Entropy and Kullback-Leibler Divergence Estimation |
---|---|
Description: | Tools for Markov Chain Monte Carlo (MCMC) simulation and performance analysis. Simulate MCMC algorithms including adaptive MCMC, evaluate their convergence rate, and compare candidate MCMC algorithms for the same target density, based on entropy and Kullback-Leibler divergence criteria. MCMC algorithms can be simulated using provided functions, or imported from external code. This package is based upon work starting with Chauveau, D. and Vandekerkhove, P. (2013) <doi:10.1051/ps/2012004> and subsequent articles. |
Authors: | Didier Chauveau [aut, cre], Houssam Alrachid [ctb] |
Maintainer: | Didier Chauveau <[email protected]> |
License: | GPL (>= 3) |
Version: | 1.0.4 |
Built: | 2024-11-26 06:40:00 UTC |
Source: | CRAN |
Contains functions to analyse (adaptive) Markov Chain Monte Carlo (MCMC) algorithms, evaluate their convergence rate, and compare candidate MCMC algorithms for the same target density, based on entropy and Kullback-Leibler divergence criteria. MCMC algorithms can be simulated using provided functions, or imported from external code.
The diagnostics are based on consistent estimates of the entropy and of the Kullback distance K(p^t, f) between the density p^t of the algorithm at iteration t and the target density f, computed from iid (parallel) chains.
Package: | EntropyMCMC |
Type: | Package |
Version: | 1.0.4 |
Date: | 2019-03-08 |
License: | GPL (>= 3) |
LazyLoad: | yes |
Statistical background:
This package allows for simulation of standard or adaptive MCMC samplers for a user-defined target density, and provides statistical tools to evaluate convergence of MCMC's and compare performance of algorithms for the same target density (typically against benchmark samplers).
The criteria are graphical and based on plots, against iterations (time) t, of the Kullback divergence K(p^t, f) between the density p^t of the MCMC algorithm at time t and the target density f, for t = 1 up to the number of iterations n that have been simulated. This requires estimation of the entropy of p^t, H(p^t) = E_{p^t}[log p^t], and of the external entropy E_{p^t}[log f]. Consistent estimates are computed based on N iid (parallel) chains, since the N positions of the chains at iteration t form an iid N-sample from the density p^t.
Computational considerations:
The simulation of iid chains can be performed within this package, which provides a mechanism for defining (A)MCMC algorithms and building the iid chains required for convergence evaluation. Each MCMC algorithm is defined by a list with five elements. Users can define their own MCMC algorithms, starting from the standard MCMC algorithms that are already defined (a short sketch after the list below shows how to inspect such a definition):
RWHM
: a standard Random-Walk Hastings-Metropolis (HM) algorithm.
HMIS_norm
: an Independence Sampler HM with gaussian proposal.
AMHaario
: the Haario (2001) Adaptive Hastings-Metropolis algorithm, provided as an example of a standard adaptive MCMC (AMCMC).
IID_norm
: a “fake” MCMC that is just a gaussian IID sampler, used mostly for testing purposes. Simulating iid chains for n iterations with this algorithm simply returns iid gaussian d-dimensional vectors.
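For orientation, the structure of one of these predefined algorithms can be inspected directly; a minimal sketch (the five element names are the ones documented for user-defined algorithms in the sections below):

library(EntropyMCMC)
# the five documented elements of an (A)MCMC definition:
# name, chain, step, q_pdf, q_proposal
names(RWHM)
RWHM$name   # the label used in legends and plot methods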
Functions are provided that run the simulations and the convergence evaluation automatically, taking one of these algorithm definitions as their first argument. Two strategies are available:
Simulation and Kullback estimation separately:
A “cube” of nmc chains run for n iterations in a space of dimension d is first simulated and stored using MCMCcopies or its multicore or cluster versions; then the entropy and Kullback divergence are estimated from that object using EntropyMCMC or its multicore version.
Simulation and Kullback estimation simultaneously:
For each iteration t, the next step of all the nmc chains is generated; then the entropy and Kullback divergence at time t are estimated, and the past of the parallel chains is discarded so that the memory requirement stays small; only entropy-related estimates are stored and returned. Functions for this strategy are EntropyParallel and its multicore and cluster versions. See the Examples section of plot_Kblist for an illustration of these two methods; a condensed sketch is also given just below.
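For orientation, a condensed sketch of these two strategies on the bivariate gaussian toy target used throughout the package examples (a variation on the documented examples, not additional package API):

## Strategy 1: simulate and store the cube of chains, then estimate
n <- 150; nmc <- 50; d <- 2
q_param <- list(mean = rep(0, d), v = 0.1 * diag(d))
Ptheta0 <- DrawInit(nmc, d, initpdf = "rnorm", mean = 2, sd = 1)
s1 <- MCMCcopies(RWHM, n, nmc, Ptheta0, target_norm, target_norm_param, q_param, verb = FALSE)
e1 <- EntropyMCMC(s1)
## Strategy 2: simulate and estimate at each iteration, discarding the past
e2 <- EntropyParallel(RWHM, n, nmc, Ptheta0, target_norm, target_norm_param, q_param, verb = FALSE)
plot_Kblist(list(e1, e2))   # the two criteria should essentially coincide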
Doing the simulations outside this package
A third, hybrid strategy is also available: the simulation of iid chains can be done using external code (in R, C, or any other language) and imported into the EntropyMCMC package (defining an object of the appropriate class "plMCMC" and structure, see MCMCcopies). Then the Kullback divergence criterion can be computed using EntropyMCMC or its multicore version, and convergence/comparison diagnostics can be displayed using the associated plot method. A sketch of such an imported object is given below.
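A minimal sketch of wrapping externally simulated chains into such an object; the field names below mirror the value returned by MCMCcopies (Ptheta, prob.accept, algo, target, f_param, q_param), and the (n, d, nmc) array layout is taken from that documentation — check MCMCcopies for the authoritative structure:

## external simulations: an array of n iterations x dimension d x nmc parallel chains
n <- 200; d <- 2; nmc <- 50
Ptheta <- array(rnorm(n * d * nmc), dim = c(n, d, nmc))  # placeholder for imported chains
s.ext <- list(Ptheta = Ptheta,
              prob.accept = NA,          # placeholder: unknown for external code
              algo = "external",
              target = target_norm, f_param = target_norm_param,
              q_param = list())          # placeholder: no proposal information
class(s.ext) <- "plMCMC"
e.ext <- EntropyMCMC(s.ext)   # Kullback/entropy criterion from the imported cube
plot(e.ext)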
About High Performance Computing
The required simulations can be done on a single core, or using HPC (multicore computers, snow or clusters using the parallel or Rmpi packages). Note that the parallel package using socket clusters is not available on Windows machines.
Didier Chauveau, Institut Denis Poisson, University of Orleans, CNRS, Orleans France. https://www.idpoisson.fr/chauveau/
Maintainer: Didier Chauveau [email protected]
Contributor: Houssam Alrachid
Chauveau, D. and Vandekerkhove, P. (2013), Smoothness of Metropolis-Hastings algorithm and application to entropy estimation. ESAIM: Probability and Statistics, 17, 419–431. DOI: http://dx.doi.org/10.1051/ps/2012004
Chauveau D. and Vandekerkhove, P. (2014), Simulation Based Nearest Neighbor Entropy Estimation for (Adaptive) MCMC Evaluation, In JSM Proceedings, Statistical Computing Section. Alexandria, VA: American Statistical Association. 2816–2827.
Chauveau D. and Vandekerkhove, P. (2014), The Nearest Neighbor entropy estimate: an adequate tool for adaptive MCMC evaluation. Preprint HAL http://hal.archives-ouvertes.fr/hal-01068081.
Internal function for the package EntropyMCMC; computes the acceptance ratio required in the definition of any Hastings-Metropolis algorithm.
accept_ratio(x, y, target, q_pdf, f_param, q_param, symmetric = FALSE)
x |
The current position. |
y |
The next (proposal) position. |
target |
The target density for which the MCMC algorithm is defined;
may be given only up to a multiplicative constant for most MCMC.
Target must be a function such as the multidimensional gaussian
|
q_pdf |
The density of the proposal. |
f_param |
A list holding all the necessary target parameters, consistent with the target definition. |
q_param |
A list holding all the necessary parameters
for the proposal density of the MCMC algorithm |
symmetric |
If |
The accept_ratio is used to decide whether to accept or reject a candidate y. The acceptance ratio indicates how probable the new proposed candidate y is with respect to the current position x, according to the target distribution.
accept_ratio returns a real value alpha, the computed value of the acceptance ratio for the current proposal.
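For illustration, a hedged call using the bivariate gaussian target target_norm, its default parameters target_norm_param, and the gaussian proposal density gaussian_pdf shipped with the package (the q_param format is the one used throughout the package examples):

d <- 2
f_param <- target_norm_param                          # target parameters
q_param <- list(mean = rep(0, d), v = 0.1 * diag(d))  # proposal parameters
x <- c(0, 0)       # current position
y <- c(0.3, -0.1)  # proposed position
accept_ratio(x, y, target_norm, gaussian_pdf, f_param, q_param)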
Didier Chauveau, Houssam Alrachid.
Utility function for the package EntropyMCMC.
CollectChains(s)
s |
An object of class |
Concatenates all simulated copies together into one matrix of dimension (n*nmc, d).
Returns a matrix of dimension (n*nmc, d).
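A short sketch, assuming s1 is a "plMCMC" object such as the one produced in the MCMCcopies examples:

## s1 <- MCMCcopies(RWHM, n, nmc, Ptheta0, target_norm, target_norm_param, q_param)
X <- CollectChains(s1)
dim(X)   # (n*nmc, d): all parallel chains stacked into a single matrix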
Didier Chauveau.
Utility function for the package EntropyMCMC, for generating random
starting positions for the parallel Markov chains, used by, e.g.,
MCMCcopies
or EntropyParallel
.
DrawInit(nmc, d, initpdf="rnorm", ...)
nmc |
Number of parallel chains = initial points. |
d |
Space dimension. |
initpdf |
Random generator. Generators currently implemented are: "rnorm" as the Normal distribution and "runif" as the uniform distribution. |
... |
Parameters passed to initpdf |
DrawInit returns a matrix of dimension (nmc, d), where each row is a d-dimensional point.
It is better for mixing properties to use diffuse initial distributions, such as the one proposed here. However, Dirac initial points can also be used, precisely to evaluate the ability of an MCMC algorithm to escape from a poor initial position (e.g., in the tails of the target density).
Didier Chauveau.
MCMCcopies
and
MCMCcopies.mc
for iid MCMC simulations,
EntropyParallel
and EntropyParallel.cl
for simultaneous simulation and entropy estimation.
Ptheta0 <- DrawInit(10, 5, initpdf="rnorm", mean=0, sd=5)
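As noted above, a Dirac (point-mass) initial distribution can also be used; since the initial positions are simply an (nmc x d) matrix, such a start can be built directly (illustrative only, not a dedicated package function):

nmc <- 10; d <- 5
Ptheta0.dirac <- matrix(2, nrow = nmc, ncol = d)   # all nmc chains start at the point (2, ..., 2)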
These functions return estimates of the entropy H(p^t) = E_{p^t}[log p^t] of the density p^t of an MCMC algorithm at time t, and of the Kullback divergence K(p^t, f) between p^t and the target density f, for t = 1 up to the number of iterations that have been simulated.
The MCMC simulations must be computed beforehand, or externally, and passed as a "plMCMC" object in the first argument (see details).
The target may be known only up to a multiplicative constant (see details).
EntropyMCMC.mc is a parallel computing version that uses the parallel package to split the task between the available (virtual) cores on the computer. This version, which uses a socket cluster, is not available on Windows computers.
EntropyMCMC(plmc1, method = "A.Nearest.Neighbor", k = 1, trim = 0.02, eps = 0,
            all.f = TRUE, verb = FALSE, EntVect = FALSE,
            uselogtarget = FALSE, logtarget = NULL)

EntropyMCMC.mc(plmc1, method = "A.Nearest.Neighbor", k = 1, trim = 0.02, eps = 0,
               all.f = TRUE, verb = FALSE, EntVect = FALSE, nbcores = detectCores(),
               uselogtarget = FALSE, logtarget = NULL)
plmc1 |
an objects of class |
method |
The method for estimating the entropy |
k |
The k-nearest neighbor index, the default is |
trim |
Parameter controlling the percentage of smallest data from one subsample
that is removed, only for |
eps |
A parameter controlling precision in the |
all.f |
If |
verb |
Verbose mode |
EntVect |
If |
nbcores |
Number of required (virtual) cores, defaults to all as returned
by |
uselogtarget |
Set to |
logtarget |
The function defining |
Methods based on Nearest Neighbors (NN) should be preferred since these require less tuning parameters.
Some options, such as uselogtarget, are in a testing phase and are not implemented in all the available methods (see Arguments).
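For illustration only: assuming logtarget takes the same (x, param) arguments as the target itself (an assumption, not a documented signature), a log-density for the bivariate gaussian target could be passed as follows; logtarget_norm is a hypothetical user-defined function, not part of the package, and s1 is the "plMCMC" object from the Examples below:

## hypothetical log-density with the same (x, param) signature as target_norm
logtarget_norm <- function(x, param) {
  d <- length(param$mean); xc <- x - param$mean
  -0.5 * (d * log(2 * pi) + as.numeric(determinant(param$v)$modulus) +
          sum(xc * solve(param$v, xc)))
}
e1.log <- EntropyMCMC(s1, uselogtarget = TRUE, logtarget = logtarget_norm)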
An object of class KbMCMC
(for Kullback MCMC), containing:
Kullback |
A vector of estimated divergences |
Entp |
A vector of estimated entropies |
nmc |
The number of iid copies of each single chain. |
dim |
The state space dimension of the MCMC algorithm. |
algo |
The name of the MCMC algorithm that has been used to simulate
the copies of chains, see |
target |
The target density for which the MCMC algorithm is defined;
usually given only up to a multiplicative constant for MCMC in Bayesian models.
target must be a function such as the multidimensional gaussian
|
method |
The |
f_param |
A list holding all the necessary target parameters, consistent with the target definition. |
q_param |
A list holding all the necessary parameters for the proposal density of the MCMC algorithm that have been used. |
The method "Resubst"
is implemented for testing, without theoretical guarantee of convergence.
Didier Chauveau, Houssam Alrachid.
Chauveau, D. and Vandekerkhove, P. (2013), Smoothness of Metropolis-Hastings algorithm and application to entropy estimation. ESAIM: Probability and Statistics, 17, 419–431. DOI: http://dx.doi.org/10.1051/ps/2012004
Chauveau D. and Vandekerkhove, P. (2014), Simulation Based Nearest Neighbor Entropy Estimation for (Adaptive) MCMC Evaluation, In JSM Proceedings, Statistical Computing Section. Alexandria, VA: American Statistical Association. 2816–2827.
Chauveau D. and Vandekerkhove, P. (2014), The Nearest Neighbor entropy estimate: an adequate tool for adaptive MCMC evaluation. Preprint HAL http://hal.archives-ouvertes.fr/hal-01068081.
MCMCcopies
and
MCMCcopies.mc
for iid MCMC simulations (single core and multicore),
EntropyParallel
and EntropyParallel.cl
for simultaneous simulation and entropy estimation (single core and multicore).
## Toy example using the bivariate gaussian target
## with default parameters value, see target_norm_param
n = 150; nmc = 50; d = 2   # bivariate example
varq = 0.1                 # variance of the proposal (chosen too small)
q_param = list(mean = rep(0, d), v = varq * diag(d))
## initial distribution, located in (2,2), "far" from target center (0,0)
Ptheta0 <- DrawInit(nmc, d, initpdf = "rnorm", mean = 2, sd = 1)
# simulation of the nmc iid chains, singlecore
s1 <- MCMCcopies(RWHM, n, nmc, Ptheta0, target_norm, target_norm_param,
                 q_param, verb = FALSE)
summary(s1)             # method for "plMCMC" object
e1 <- EntropyMCMC(s1)   # computes Entropy and Kullback divergence estimates
par(mfrow = c(1, 2))
plot(e1)                     # default plot.KbMCMC method, convergence after about 80 iterations
plot(e1, Kullback = FALSE)   # plot Entropy estimates over time
abline(normEntropy(target_norm_param), 0, col = 8, lty = 2)   # true E_f[log(f)]
This function simulates “parallel chains” (iid copies) of an MCMC algorithm, i.e. for each “time” iteration t the next step of all the nmc chains is generated; then the entropy H(p^t) of the density p^t of the algorithm at iteration t, and the Kullback divergence K(p^t, f) between p^t and the target density f, are estimated, based on these nmc steps iid from p^t.
By default keep.all = FALSE, i.e. the past of the parallel chains is discarded so that the memory requirement stays small, and only entropy-related estimates are returned. If keep.all = TRUE, the entire set of chain trajectories is saved in an array of dimensions (n, d, nmc), such as the one returned by MCMCcopies or MCMCcopies.cl.
A version of this function implementing several HPC (parallel) computing strategies is available (see details).
EntropyParallel(mcmc_algo, n = 100, nmc = 10, Ptheta0, target, f_param, q_param,
                method = "A.Nearest.Neighbor", k = 1, trim = 0.02,
                keep.all = FALSE, verb = TRUE, EntVect = FALSE)

EntropyParallel.cl(mcmc_algo, n = 100, nmc = 10, Ptheta0, target, f_param, q_param,
                   method = "A.Nearest.Neighbor", k = 1, eps = 0, trim = 0.02,
                   verb = TRUE, EntVect = FALSE, cltype = "PAR_SOCK", nbnodes = 4,
                   par.logf = FALSE, uselogtarget = FALSE, logtarget = NULL)
mcmc_algo |
a list defining an MCMC algorithm in terms of the
functions it uses, such as |
n |
The number of (time) iterations of each single chain to run. |
nmc |
The number of iid copies of each single chain. |
Ptheta0 |
A |
target |
The target density for which the MCMC algorithm is defined;
may be given only up to a multiplicative constant for most MCMC.
target must be a function such as the multidimensional gaussian
|
f_param |
A list holding all the necessary target parameters, including the data in an actual Bayesian model, and consistent with the target definition. |
q_param |
A list holding all the necessary parameters
for the proposal density of the MCMC algorithm |
method |
The method for estimating the entropy
|
k |
The k-nearest neighbor index, the default is |
eps |
Error bound: default of 0.0 implies exact nearest neighbour search, see the RANN package. |
trim |
not used in this implementation, only for |
keep.all |
If |
verb |
Verbose mode for summarizing output during the simulation. |
EntVect |
If |
cltype |
Character string specifying the type of cluster;
currently implemented
types are: "PAR_SOCK" for socket cluster with |
nbnodes |
The number of nodes or virtual cores requested to run the |
par.logf |
if |
uselogtarget |
Set to |
logtarget |
The function defining |
About parallel computing:
For the HPC (parallel) version, the computation of the next step of the nmc chains is done by the cluster nodes: EntropyParallel.cl is a generic cluster version implementing several types of clusters, for running on a single multicore computer or on a true cluster using MPI communications. It is under development and may not work on all platforms/OS. For instance, the parallel socket cluster version does not work on Windows machines (see the parallel package documentation). It is currently tested under Linux, Mac OS X, and a cluster using OpenMPI and Sun Grid Engine.
Note that the parallel computing for this approach is less efficient than the two-step procedure consisting of (i) parallel simulation of the iid chains using MCMCcopies.cl to generate the “cube” of simulated values, and then (ii) entropy and Kullback estimation using EntropyMCMC.mc. This is because each node computes only one iteration at a time for the nmc chains here, whereas it computes all the iterations at once for the nmc chains when the entire cube is saved first. This is a trade-off between memory and speed. Note also that the Rmpi option is less efficient than the default option using parallel if you are running on a single computer. MPI communications are required only for running on a true cluster/grid.
About passing your MCMC algorithm:
The list mcmc_algo
must contain the named elements:
name
, the name of the MCMC, such as "RWHM"
chain
, the function for simulation of n steps of a single chain
step
, the function for simulation of 1 step of that algorithm
q_pdf
, the density of the proposal
q_proposal
, the function that simulates a proposal
For examples, see the algorithms currently implemented (a minimal sketch of a user-defined variant is given after this list):
RWHM
, the Random-Walk Hastings-Metropolis with gaussian proposal;
HMIS_norm
, an Independence Sampler HM with gaussian proposal;
IID_norm
, a gaussian iid sampler which is merely
a "fake" MCMC for testing purposes.
Currently, only non-adaptive Markov chains, or adaptive chains for which the past can be summarized by some sufficient statistics, are eligible for this computation that forgets the past of the nmc chains. Adaptive chains such as AMHaario, the Adaptive-Metropolis (AM) from Haario (2001), are not yet tested with this function.
An object of class "KbMCMC"
, containing
Kullback |
A vector of estimated |
Entp |
A vector of estimated |
nmc |
The number of iid copies of each single chain. |
dim |
The state space dimension of the MCMC algorithm. |
algo |
The name of the MCMC algorithm that has been used to simulate
the copies of chains, see |
target |
The target density for which the MCMC algorithm is defined;
may be given only up to a multiplicative constant for most MCMC.
target must be a function such as the multidimensional gaussian
|
method |
The |
f_param |
A list holding all the necessary target parameters, consistent with the target definition. |
q_param |
A list holding all the necessary parameters for the proposal density of the MCMC algorithm that have been used. |
prob.accept |
Estimated acceptance rate (meaningful for accept/reject-type algorithms). |
Ptheta |
The |
Didier Chauveau, Houssam Alrachid.
Chauveau, D. and Vandekerkhove, P. (2013), Smoothness of Metropolis-Hastings algorithm and application to entropy estimation. ESAIM: Probability and Statistics, 17, 419–431. DOI: http://dx.doi.org/10.1051/ps/2012004
Chauveau D. and Vandekerkhove, P. (2014), Simulation Based Nearest Neighbor Entropy Estimation for (Adaptive) MCMC Evaluation, In JSM Proceedings, Statistical Computing Section. Alexandria, VA: American Statistical Association. 2816–2827.
Chauveau D. and Vandekerkhove, P. (2014), The Nearest Neighbor entropy estimate: an adequate tool for adaptive MCMC evaluation. Preprint HAL http://hal.archives-ouvertes.fr/hal-01068081.
MCMCcopies
, MCMCcopies.mc
and
MCMCcopies.cl
for just simulating the iid chains, and
EntropyMCMC
or EntropyMCMC.mc
for computing entropy and Kullback estimates from an already simulated
set of iid chains (internally or from external code).
## Toy example using the bivariate gaussian target
## same as for MCMCcopies
n = 150; nmc = 50; d = 2   # bivariate example
varq = 0.1                 # variance of the proposal (chosen too small)
q_param = list(mean = rep(0, d), v = varq * diag(d))
## initial distribution, located in (2,2), "far" from target center (0,0)
Ptheta0 <- DrawInit(nmc, d, initpdf = "rnorm", mean = 2, sd = 1)
# simulations and entropy + Kullback using the singlecore version
e1 <- EntropyParallel(RWHM, n, nmc, Ptheta0, target_norm, target_norm_param,
                      q_param, verb = FALSE)
par(mfrow = c(1, 2))
plot(e1)                     # default plot.KbMCMC method, convergence after about 80 iterations
plot(e1, Kullback = FALSE)   # plot Entropy estimates over time
abline(normEntropy(target_norm_param), 0, col = 8, lty = 2)   # true E_f[log(f)]

# Another example using the cluster version (not available on Windows)
varq = 0.05   # variance of the proposal, even smaller
q_param = list(mean = rep(0, d), v = varq * diag(d))
n = 300       # requires more iterations to show convergence
e1 <- EntropyParallel.cl(RWHM, n, nmc, Ptheta0, target_norm, target_norm_param,
                         q_param, cltype = "PAR_SOCK", verb = FALSE, nbnodes = 2)
plot(e1)      # convergence after about 150 iterations
Functions for proposal density evaluation and random generation in MCMC algorithms, in the case where these are Gaussian.
gaussian_pdf(y, x, param)
gaussian_proposal(x, param)
y |
Candidate for next move, a vector of dimension |
x |
Current position of a chain, a vector of dimension |
param |
The proposal parameters, that must contains the
|
The Gaussian proposal density used in, e.g., the random walk Hastings-Metropolis algorithm RWHM is the multivariate Gaussian density q(y|x) evaluated at the candidate point y, given the current position x. Similarly, the Gaussian proposal (next move) is a random draw y from that density when the chain is at position x.
The value of the density, or the random draw, both in dimension d.
These functions call multivariate Gaussian density and random generation functions imported from the mixtools package (chosen for efficiency) and wrap them in the format required by the EntropyMCMC package.
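A brief hedged sketch; the param list follows the q_param format used throughout the package examples (mean and covariance v), and the random-walk interpretation (density and draws centered at the current position x) is an assumption consistent with the RWHM description above:

d <- 2
param <- list(mean = rep(0, d), v = 0.1 * diag(d))   # proposal parameters, as in q_param
x <- c(1, 1)       # current position of the chain
y <- c(1.2, 0.8)   # candidate for the next move
gaussian_pdf(y, x, param)     # proposal density evaluated at y, given x
gaussian_proposal(x, param)   # a random draw for the next move, given x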
Didier Chauveau.
Simulates nmc
iid copies of an MCMC algorithm mcmc_algo
for n
(time) iterations and returns an object of class plMCMC
(for parallel MCMC) holding an array of the trajectories and running information.
MCMCcopies(mcmc_algo, n = 100, nmc = 10, Ptheta0, target, f_param, q_param, verb = TRUE)
mcmc_algo |
a list defining an MCMC algorithm in terms of the
functions it uses, such as |
n |
The number of (time) iterations of each single chain to run. |
nmc |
The number of iid copies of each single chain. |
Ptheta0 |
A (nmc x d) matrix, with the ith row giving the d-dimensional initial theta value for the ith chain. |
target |
The target density for which the MCMC algorithm is defined;
may be given only up to a multiplicative constant for most MCMC.
Target must be a function such as the multidimensional gaussian
|
f_param |
A list holding all the necessary target parameters, consistent with the target definition. |
q_param |
A list holding all the necessary parameters
for the proposal density of the MCMC algorithm |
verb |
Verbose mode for summarizing output during the simulation. |
MCMCcopies
sequentially simulates nmc
iid copies of the
MCMC algorithm passed in the list mcmc_algo
,
for n
(time) iterations, and returns an object of class plMCMC
holding an array of the trajectories and running information.
The list mcmc_algo
must contain the named elements:
name
, the name of the MCMC, such as "RWHM"
chain
, the function for simulation of n steps of a single chain
step
, the function for simulation of 1 step of that algorithm
q_pdf
, the density of the proposal
q_proposal
, the function that simulates a proposal
For examples, see the algorithms currently implemented:
RWHM
, the Random-Walk Hastings-Metropolis with gaussian proposal;
HMIS_norm
, an Independence Sampler HM with gaussian proposal;
AMHaario
, the Adaptive-Metropolis (AM) from Haario (2001);
IID_norm
, a gaussian iid sampler which is merely
a "fake" MCMC for testing purposes.
MCMCcopies
returns a list of class plMCMC
with items:
Ptheta |
The |
prob.accept |
The estimated acceptance rate over all simulations. |
algo |
The MCMC algorithm name i.e. |
target |
The target density. |
f_param |
The list holding all the target parameters. |
q_param |
The list holding all the proposal density parameters. |
Didier Chauveau.
Chauveau, D. and Vandekerkhove, P. (2013), Smoothness of Metropolis-Hastings algorithm and application to entropy estimation. ESAIM: Probability and Statistics, 17, 419–431. DOI: http://dx.doi.org/10.1051/ps/2012004
Chauveau D. and Vandekerkhove, P. (2014), Simulation Based Nearest Neighbor Entropy Estimation for (Adaptive) MCMC Evaluation, In JSM Proceedings, Statistical Computing Section. Alexandria, VA: American Statistical Association. 2816–2827.
The multicore and cluster versions
MCMCcopies.mc
and
MCMCcopies.cl
,
and functions doing simulation and entropy and Kullback estimation simultaneously:
EntropyParallel
and EntropyParallel.cl
## Toy example using the bivariate gaussian target
## with default parameters value, see target_norm_param
n = 150; nmc = 20; d = 2   # bivariate example
varq = 0.1                 # variance of the proposal (chosen too small)
q_param = list(mean = rep(0, d), v = varq * diag(d))
## initial distribution, located in (2,2), "far" from target center (0,0)
Ptheta0 <- DrawInit(nmc, d, initpdf = "rnorm", mean = 2, sd = 1)
# simulation
s1 <- MCMCcopies(RWHM, n, nmc, Ptheta0, target_norm, target_norm_param,
                 q_param, verb = FALSE)
summary(s1)   # method for "plMCMC" object
par(mfrow = c(1, 2))
plot(s1)      # just a path of the iid chains, method for "plMCMC" object
hist(s1$Ptheta[,1,], col = 8)   # marginal 1
This function simulates “parallel chains” (iid copies) of an MCMC algorithm for n (time) iterations, i.e. for each chain the whole trajectory of n steps is generated. It returns an object of class plMCMC (for parallel MCMC) holding an array of the trajectories and running information. This function is similar to MCMCcopies and MCMCcopies.mc, except that it uses HPC in a more generic way, implementing several types of HPC for running on a single multicore computer or on a true cluster using MPI communications.
MCMCcopies.cl(mcmc_algo, n = 100, nmc = 10, Ptheta0, target, f_param, q_param,
              cltype = "PAR_SOCK", nbnodes = 4)
mcmc_algo |
a list defining an MCMC algorithm in terms of the
functions it uses, such as |
n |
The number of (time) iterations of each single chain to run. |
nmc |
The number of iid copies of each single chain. |
Ptheta0 |
A |
target |
The target density for which the MCMC algorithm is defined;
may be given only up to a multiplicative constant for most MCMC.
target must be a function such as the multidimensional gaussian
|
f_param |
A list holding all the necessary target parameters, consistent with the target definition. |
q_param |
A list holding all the necessary parameters
for the proposal density of the MCMC algorithm |
cltype |
Character string specifying the type of cluster;
currently implemented
types are: "PAR_SOCK" for socket cluster with |
nbnodes |
The number of nodes or virtual cores requested to run the |
MCMCcopies.cl
simulates in parallel
nmc
iid copies of the MCMC algorithm passed in the list mcmc_algo
,
for n
(time) iterations, and returns an object of class plMCMC
holding an array of the trajectories and running information.
About parallel computing:
The Rmpi option is less efficient than the default option using parallel if you are running on a single computer. MPI communications are required only for running on a true cluster/grid.
This generic cluster version, implementing several types of clusters for running on a single multicore computer or on a true cluster using MPI communications, may not work on all platforms/OS. For instance, the parallel socket cluster version does not work on Windows machines (see the parallel package documentation).
About passing your MCMC algorithm:
The list mcmc_algo
must contain the named elements:
name
, the name of the MCMC, such as "RWHM"
chain
, the function for simulation of n steps of a single chain
step
, the function for simulation of 1 step of that algorithm
q_pdf
, the density of the proposal
q_proposal
, the function that simulates a proposal
For examples, see the algorithms currently implemented:
RWHM
, the Random-Walk Hastings-Metropolis with gaussian proposal;
HMIS_norm
, an Independence Sampler HM with gaussian proposal;
AMHaario
, the Adaptive-Metropolis (AM) from Haario (2001);
IID_norm
, a gaussian iid sampler which is merely
a "fake" MCMC for testing purposes.
MCMCcopies.cl
returns a list of class plMCMC
with items:
Ptheta |
The |
prob.accept |
The estimated acceptance rate over all simulations. |
algo |
The MCMC algorithm name i.e. |
target |
The target density. |
f_param |
The list holding all the target parameters. |
q_param |
The list holding all the proposal density parameters. |
Houssam Alrachid and Didier Chauveau.
Chauveau, D. and Vandekerkhove, P. (2013), Smoothness of Metropolis-Hastings algorithm and application to entropy estimation. ESAIM: Probability and Statistics, 17, 419–431. DOI: http://dx.doi.org/10.1051/ps/2012004
Chauveau D. and Vandekerkhove, P. (2014), Simulation Based Nearest Neighbor Entropy Estimation for (Adaptive) MCMC Evaluation, In JSM Proceedings, Statistical Computing Section. Alexandria, VA: American Statistical Association. 2816–2827.
Chauveau D. and Vandekerkhove, P. (2014), The Nearest Neighbor entropy estimate: an adequate tool for adaptive MCMC evaluation. Preprint HAL http://hal.archives-ouvertes.fr/hal-01068081.
A simpler multicore version MCMCcopies.mc
,
a single core version MCMCcopies
,
and functions doing simulation and entropy and Kullback estimation simultaneously:
EntropyParallel
and EntropyParallel.cl
## Toy example using the bivariate gaussian target
n = 150; nmc = 20; d = 2   # bivariate example
varq = 0.1                 # variance of the proposal (chosen too small)
q_param = list(mean = rep(0, d), v = varq * diag(d))
## initial distribution, located in (2,2), "far" from target center (0,0)
Ptheta0 <- DrawInit(nmc, d, initpdf = "rnorm", mean = 2, sd = 1)
# simulations (may be compared with the singlecore version using system.time)
s1 <- MCMCcopies.cl(RWHM, n, nmc, Ptheta0, target_norm, target_norm_param,
                    q_param, nbnodes = 2)
summary(s1)   # method for "plMCMC" object
## see MCMCcopies example for plots
Simulates nmc
iid copies of an MCMC algorithm mcmc_algo
for n
(time) iterations and returns an object of class plMCMC
(for parallel MCMC) holding
an array of the trajectories and running information.
This function is similar to MCMCcopies
except that it uses the
parallel package (available in the main distribution, but not for Windows machines)
to split the task between the available
virtual cores on the computer.
MCMCcopies.mc(mcmc_algo, n = 100, nmc = 10, Ptheta0, target, f_param, q_param, verb = TRUE, nbcores=detectCores())
mcmc_algo |
a list defining an MCMC algorithm in terms of the
functions it uses, such as |
n |
The number of (time) iterations of each single chain to run. |
nmc |
The number of iid copies of each single chain. |
Ptheta0 |
A |
target |
The target density for which the MCMC algorithm is defined;
may be given only up to a multiplicative constant for most MCMC.
target must be a function such as the multidimensional gaussian
|
f_param |
A list holding all the necessary target parameters, consistent with the target definition. |
q_param |
A list holding all the necessary parameters
for the proposal density of the MCMC algorithm |
verb |
Verbose mode for summarizing output during the simulation. |
nbcores |
Number of required (virtual) cores, defaults to all as returned
by |
MCMCcopies.mc
, like MCMCcopies
, simulates
nmc
iid copies of the
MCMC algorithm passed in the list mcmc_algo
,
for n
(time) iterations, and returns an object of class plMCMC
holding an array of the trajectories and running information.
The list mcmc_algo
must contain the named elements:
name
, the name of the MCMC, such as "RWHM"
chain
, the function for simulation of n steps of a single chain
step
, the function for simulation of 1 step of that algorithm
q_pdf
, the density of the proposal
q_proposal
, the function that simulates a proposal
For examples, see the algorithms currently implemented:
RWHM
, the Random-Walk Hastings-Metropolis with gaussian proposal;
HMIS_norm
, an Independence Sampler HM with gaussian proposal;
AMHaario
, the Adaptive-Metropolis (AM) from Haario (2001);
IID_norm
, a gaussian iid sampler which is merely
a "fake" MCMC for testing purposes.
MCMCcopies.mc
returns a list of class plMCMC
with items:
Ptheta |
The |
prob.accept |
The estimated acceptance rate over all simulations. |
algo |
The MCMC algorithm name i.e. |
target |
The target density. |
f_param |
The list holding all the target parameters. |
q_param |
The list holding all the proposal density parameters. |
Didier Chauveau.
Chauveau, D. and Vandekerkhove, P. (2013), Smoothness of Metropolis-Hastings algorithm and application to entropy estimation. ESAIM: Probability and Statistics, 17, 419–431. DOI: http://dx.doi.org/10.1051/ps/2012004
Chauveau D. and Vandekerkhove, P. (2014), Simulation Based Nearest Neighbor Entropy Estimation for (Adaptive) MCMC Evaluation, In JSM Proceedings, Statistical Computing Section. Alexandria, VA: American Statistical Association. 2816–2827.
Chauveau D. and Vandekerkhove, P. (2014), The Nearest Neighbor entropy estimate: an adequate tool for adaptive MCMC evaluation. Preprint HAL http://hal.archives-ouvertes.fr/hal-01068081.
A more general cluster version MCMCcopies.cl
,
a single core version MCMCcopies
,
and functions doing simulation and entropy and Kullback estimation simultaneously:
EntropyParallel
and EntropyParallel.cl
## Toy example using the bivariate gaussian target
## not working on Windows since socket cluster not implemented
n = 150; nmc = 20; d = 2   # bivariate example
varq = 0.1                 # variance of the proposal (chosen too small)
q_param = list(mean = rep(0, d), v = varq * diag(d))
## initial distribution, located in (2,2), "far" from target center (0,0)
Ptheta0 <- DrawInit(nmc, d, initpdf = "rnorm", mean = 2, sd = 1)
# simulations (may be compared with the singlecore version using system.time)
s1 <- MCMCcopies.mc(RWHM, n, nmc, Ptheta0, target_norm, target_norm_param,
                    q_param, nbcores = 2)
summary(s1)   # method for "plMCMC" object
## see MCMCcopies example for plots
This function computes the entropy E_f[log(f)] of the multivariate gaussian density f, with parameters given in a list, as is the case for MCMC target density parameters. This function is used
mostly for benchmarking entropy estimation performed by the package
(using, e.g., the iid algorithm
IID_norm
).
normEntropy(target_param)
target_param |
A list of two elements: the mean |
The entropy of the Gaussian with these parameters.
Didier Chauveau.
d = 2                                   # model dimension
mu = rep(0, d); v = diag(d)             # mean and variance
target_param = list(mean = mu, v = v)   # parameters
normEntropy(target_param)               # the entropy
This function draws, on the same plot, several sequences of estimates of the Kullback distance K(p^t, f), i.e. the convergence criterion vs. time (iteration t), one for each MCMC algorithm for which the convergence criterion has been computed.
plot_Kblist(Kb, which = 1, lim = NULL, ylim = NULL)
Kb |
A list of objects of class |
which |
Controls the level of details in the legend added to the plot (see details) |
lim |
for zooming over |
ylim |
limits on the |
The purpose of this plot is to compare MCMC algorithms (typically based on different simulation strategies or kernels) for convergence or efficiency in estimating the same target density f. For the jth algorithm, the user has to generate the convergence criterion, i.e. the sequence K(p^t_j, f) for t = 1 up to the number of iterations that has been chosen, where p^t_j is the estimated pdf of the jth algorithm at time t.
For the legend, which=1 displays the MCMC names together with some technical information depending on the algorithm definitions (e.g., the proposal variance for the RWHM algorithm) and the method used for entropy estimation. The legend for which=2 is shorter, displaying only the MCMC names together with the number of parallel chains used for each, typically to compare the effect of that number for a single MCMC algorithm.
The graphic to plot.
Didier Chauveau.
Chauveau, D. and Vandekerkhove, P. (2013), Smoothness of Metropolis-Hastings algorithm and application to entropy estimation. ESAIM: Probability and Statistics, 17, 419–431. DOI: http://dx.doi.org/10.1051/ps/2012004
Chauveau D. and Vandekerkhove, P. (2014), Simulation Based Nearest Neighbor Entropy Estimation for (Adaptive) MCMC Evaluation, In JSM Proceedings, Statistical Computing Section. Alexandria, VA: American Statistical Association. 2816–2827.
Chauveau D. and Vandekerkhove, P. (2014), The Nearest Neighbor entropy estimate: an adequate tool for adaptive MCMC evaluation. Preprint HAL http://hal.archives-ouvertes.fr/hal-01068081.
## Toy example using the bivariate centered gaussian target
## with default parameters value, see target_norm_param
d = 2               # state space dimension
n = 300; nmc = 100  # number of iterations and iid Markov chains
## initial distribution, located in (2,2), "far" from target center (0,0)
Ptheta0 <- DrawInit(nmc, d, initpdf = "rnorm", mean = 2, sd = 1)

## MCMC 1: Random-Walk Hastings-Metropolis
varq = 0.05         # variance of the proposal (chosen too small)
q_param = list(mean = rep(0, d), v = varq * diag(d))
## using Method 1: simulation with storage, and *then* entropy estimation
# simulation of the nmc iid chains, single core here
s1 <- MCMCcopies(RWHM, n, nmc, Ptheta0, target_norm, target_norm_param, q_param)
summary(s1)             # method for "plMCMC" object
e1 <- EntropyMCMC(s1)   # computes Entropy and Kullback divergence

## MCMC 2: Independence Sampler with large enough gaussian proposal
varq = 1; q_param <- list(mean = rep(0, d), v = varq * diag(d))
## using Method 2: simulation & estimation for each t, forgetting the past
## HPC with 2 cores here (parallel socket cluster, not available on Windows machines)
e2 <- EntropyParallel.cl(HMIS_norm, n, nmc, Ptheta0, target_norm, target_norm_param,
                         q_param, cltype = "PAR_SOCK", nbnodes = 2)

## Compare these two MCMC algorithms
plot_Kblist(list(e1, e2))   # MCMC 2 (HMIS, red plot) converges faster.
This S3 method for plot displays, by default, sequences of estimates of the Kullback distance K(p^t, f) between the (estimated) pdf p^t of the MCMC algorithm at time t and the target density f, for t = 1 up to the number of iterations that have been provided/computed. It can also plot the first term in the Kullback distance, i.e. the entropy E_{p^t}[log p^t]. Its argument is an object of class KbMCMC, such as the one returned by, e.g., EntropyMCMC.
## S3 method for class 'KbMCMC'
plot(x, Kullback = TRUE, lim = NULL, ylim = NULL, new.plot = TRUE, title = NULL, ...)
x |
An object of class |
Kullback |
|
lim |
for zooming over |
ylim |
|
new.plot |
set to |
title |
The title; if |
... |
Further parameters passed to |
The graphic to plot.
Didier Chauveau.
Chauveau, D. and Vandekerkhove, P. (2013), Smoothness of Metropolis-Hastings algorithm and application to entropy estimation. ESAIM: Probability and Statistics, 17, 419–431. DOI: http://dx.doi.org/10.1051/ps/2012004
Chauveau D. and Vandekerkhove, P. (2014), Simulation Based Nearest Neighbor Entropy Estimation for (Adaptive) MCMC Evaluation, In JSM Proceedings, Statistical Computing Section. Alexandria, VA: American Statistical Association. 2816–2827.
Chauveau D. and Vandekerkhove, P. (2014), The Nearest Neighbor entropy estimate: an adequate tool for adaptive MCMC evaluation. Preprint HAL http://hal.archives-ouvertes.fr/hal-01068081.
## See the EntropyMCMC Examples.
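In addition, a brief sketch reusing the e1 object from the EntropyMCMC examples (only documented arguments are used):

## e1 <- EntropyMCMC(s1), an object of class "KbMCMC"
plot(e1)                                    # Kullback divergence criterion vs. iterations
plot(e1, Kullback = FALSE,                  # entropy estimates instead of the Kullback criterion
     title = "Entropy estimates over time")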
This function plots 2d-projections of the paths of i.i.d. copies of Markov chains
output by an MCMC algorithm and stored in an object of class plMCMC
(for parallel MCMC)
such as the one returned by, e.g., MCMCcopies
or the multicore version
MCMCcopies.mc
.
## S3 method for class 'plMCMC'
plot(x, xax = 1, yax = 2, title = NULL, cname = NULL, ...)
x |
An object of class |
xax |
Coordinate for the horizontal axis. |
yax |
Coordinate for the vertical axis. |
title |
The title; if |
cname |
Coordinate base name; "var" is the default, so that coordinates are named "var1", "var2", and so on. |
... |
Further parameters except |
This function is currently limited to a 2D projection path of all the i.i.d. chains for
the two selected coordinates.
The copies of the Markov chain must be stored in the 3-dimensional array x$Ptheta.
The graphic to plot.
Didier Chauveau.
Chauveau, D. and Vandekerkhove, P. (2013), Smoothness of Metropolis-Hastings algorithm and application to entropy estimation. ESAIM: Probability and Statistics, 17, 419–431. DOI: http://dx.doi.org/10.1051/ps/2012004
Chauveau D. and Vandekerkhove, P. (2014), Simulation Based Nearest Neighbor Entropy Estimation for (Adaptive) MCMC Evaluation, In JSM Proceedings, Statistical Computing Section. Alexandria, VA: American Statistical Association. 2816–2827.
Chauveau D. and Vandekerkhove, P. (2014), The Nearest Neighbor entropy estimate: an adequate tool for adaptive MCMC evaluation. Preprint HAL http://hal.archives-ouvertes.fr/hal-01068081.
MCMCcopies
, MCMCcopies.mc
,
MCMCcopies.cl
## See the MCMCcopies example
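A minimal sketch, assuming s1 is the "plMCMC" object produced in the MCMCcopies example:

## s1 <- MCMCcopies(RWHM, n, nmc, Ptheta0, target_norm, target_norm_param, q_param)
plot(s1)                                      # 2D projection of all chains, coordinates 1 and 2
plot(s1, xax = 1, yax = 2, cname = "theta")   # same, with coordinates labelled "theta1", "theta2"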
Utility function for the package EntropyMCMC, to visualize a 2-dimensional target of an MCMC algorithm, mostly for testing purposes.
This uses the function persp
from package graphics.
plottarget3d(zft, l, r, ms, theta, phi, ...)
zft |
a function, typically a 2-dimensional target of a MCMC. |
l , r , ms
|
mesh boundaries and size. |
theta , phi
|
angles defining the viewing direction. theta gives the azimuthal direction and phi the colatitude. |
... |
additional graphical parameters. |
Returns a 3D plot on a mesh of size (l, r, ms)
.
Didier Chauveau.
These functions are used to define the elements $chain
of the MCMC algorithms
that are (and must be) implemented as lists in EntropyMCMC.
These functions are usually only called by higher-level functions, see details below.
RWHM_chain(theta0, it = 100, target, f_param, q_param,
           q_pdf = gaussian_pdf, q_proposal = gaussian_proposal)
HMIS_norm_chain(theta0, it = 100, target, f_param, q_param,
                q_pdf = q_pdf_ISnorm, q_proposal = q_proposal_ISnorm)
AMHaario_chain(theta0, it = 100, target, f_param, q_param,
               q_pdf = gaussian_pdf, q_proposal = gaussian_proposal)
IID_chain(theta0 = NULL, it = 100, target, f_param,
          q_param = NULL, q_pdf = NULL, q_proposal = NULL)
it |
the number of iterations to simulate |
theta0 |
the initial position of the chain, a |
target |
the user-defined target density |
f_param |
the parameters (hyperparameters, data) of the user-defined target density |
q_param |
the parameters of the proposal density, whose structure depends on the algorithm and the proposal density chosen by the user. Defaults are
for |
q_pdf |
the proposal density |
q_proposal |
the function simulating the proposal for the next move |
Each MCMC algorithm is defined as a list with five elements, see
the object RWHM
for an example. The element $chain
must provide
the name of the function performing simulation of a single chain and returning that chain,
with arguments that must follow the definition above.
Users can define their own MCMC algorithms,
starting from the algorithms provided (see also the section below).
These functions are thus usually called by higher-level functions like
MCMCcopies
, EntropyParallel
, or their multicore versions,
for simulating copies of MCMC chains in an automatic manner.
RWHM_chain
is used in RWHM
, a standard Random-Walk Hastings-Metropolis algorithm.
HMIS_norm_chain
is used in HMIS_norm
,
an Independence Sampler HM with gaussian proposal
AMHaario_chain
is used in AMHaario
, the
Haario Adaptive Hastings-Metropolis algorithm (Haario 2001), and is provided as an example
of a benchmark AMCMC.
IID_chain
is used in IID_norm
, a “fake” MCMC that is just a gaussian IID sampler.
A list with elements:
theta |
the simulated chain in an array of |
paccept |
the empirical acceptance rate |
finalcov |
the last covariance matrix |
algo |
the name of the algorithm (for plot methods) |
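These chain functions can also be called directly, e.g. to check a user-defined kernel. A hedged sketch, assuming theta0 is a d-dimensional vector of initial values, and using the bivariate gaussian target and the q_param format from the package examples:

d <- 2
q_param <- list(mean = rep(0, d), v = 0.1 * diag(d))
out <- RWHM_chain(theta0 = rep(0, d), it = 200, target = target_norm,
                  f_param = target_norm_param, q_param = q_param)
str(out$theta)   # the simulated chain
out$paccept      # empirical acceptance rate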
Didier Chauveau.
H. Haario, E. Saksman, and J. Tamminen (2001), An adaptive Metropolis algorithm. Bernoulli 7, 223–242.
The algorithms already implemented are listed in EntropyMCMC-package
.
The higher level functions that use these functions for simulation:
MCMCcopies
, EntropyParallel
and their multicore versions.
Summarizing a plMCMC object holding iid copies of MCMC chains
This S3 method for summary summarizes the content of an object of class plMCMC (for parallel MCMC), as returned by, e.g., MCMCcopies, containing the trajectories of iid copies of chains from an MCMC algorithm, together with its associated kernel, target, and proposal densities.
## S3 method for class 'plMCMC'
summary(object, stats = FALSE, ...)
object |
An object of class |
stats |
print additional summary statistics for the variables over all chains. |
... |
additional arguments passed to other methods |
Returns the dimensions associated with the object, the overall acceptance rate, and descriptive statistics over the variable coordinates if stats = TRUE.
Didier Chauveau.
Chauveau, D. and Vandekerkhove, P. (2013), Smoothness of Metropolis-Hastings algorithm and application to entropy estimation. ESAIM: Probability and Statistics, 17, 419–431. DOI: http://dx.doi.org/10.1051/ps/2012004
Chauveau D. and Vandekerkhove, P. (2014), Simulation Based Nearest Neighbor Entropy Estimation for (Adaptive) MCMC Evaluation, In JSM Proceedings, Statistical Computing Section. Alexandria, VA: American Statistical Association. 2816–2827.
Chauveau D. and Vandekerkhove, P. (2014), The Nearest Neighbor entropy estimate: an adequate tool for adaptive MCMC evaluation. Preprint HAL http://hal.archives-ouvertes.fr/hal-01068081.
## See Example for MCMCcopies
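For instance, a sketch with s1 the "plMCMC" object from the MCMCcopies example:

## s1 <- MCMCcopies(RWHM, n, nmc, Ptheta0, target_norm, target_norm_param, q_param)
summary(s1)                 # dimensions and overall acceptance rate
summary(s1, stats = TRUE)   # additionally, per-coordinate summary statistics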