Function to generate a Markov chain for both continuous and discontinuous posterior distributions.
Description
The function generates a single Markov chain for sampling from both continuous and discontinuous posterior distributions, using one of several algorithms. Classic Hamiltonian Monte Carlo (Duane et al. 1987), NUTS (Hoffman et al. 2014) and XHMC (Betancourt 2016) are embedded into the framework described in Nishimura et al. (2020), which makes it possible to handle such posteriors. Furthermore, for each method, samples from the trajectories can be recycled using the approach proposed by Nishimura and Dunson (2020). This is used to improve the estimation of the mass matrix during the warm-up phase without significant additional computational cost. This function should not be used directly, but only through the user interface provided by xdnuts.
Usage
main_function( theta0, nlp, args, k, N, K, tau, L, thin, chain_id, verbose, control )
Arguments
theta0: a numeric vector containing the starting point of the chain, one element per model parameter.
nlp: a function object that takes three arguments and evaluates the negative log posterior of the model and its gradient (a minimal sketch is given below this list).
args: the necessary arguments to evaluate the negative log posterior and its gradient.
k: the number of parameters that induce a discontinuity in the posterior distribution.
N: an integer giving the number of post warm-up samples to draw.
K: an integer giving the number of samples recycled from each trajectory, during the warm-up phase or beyond.
tau: the threshold for the exhaustion termination criterion described in Betancourt (2016).
L: the desired trajectory length of the classic Hamiltonian Monte Carlo algorithm.
thin: an integer giving the number of samples to discard in order to produce a final iteration of the chain.
chain_id: the identification number of the chain.
verbose: a Boolean value that controls whether to print all the information regarding the sampling process.
control: an object of the control class provided by the package, containing further tuning parameters; see xdnuts.
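To make the expected shape of the nlp argument concrete, below is a minimal sketch of a negative log posterior for a toy model with independent standard normal components. The argument names (par, args, eval_nlp) and the convention that the third argument switches between returning the value and the gradient are illustrative assumptions for this example, not documented specifics of the package.

# Illustrative sketch only: argument names and the meaning of the third
# argument are assumptions for this example, not the package's documented API.
nlp <- function(par, args, eval_nlp = TRUE) {
  if (eval_nlp) {
    0.5 * sum(par^2)   # negative log posterior, up to an additive constant
  } else {
    par                # gradient of the negative log posterior
  }
}

# 'args' collects whatever the model needs (data, hyperparameters, ...);
# the toy model above needs nothing, so an empty list is enough here.
args <- list()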
Value
a named list containing the following elements (a short sketch of how this output might be inspected is given after the list):
- values: a matrix containing the sample from the target distribution (if convergence has been reached), with one row per retained draw and one column per model parameter.
- energy: a vector containing the Markov chain of the energy level sets, one value per iteration.
- step_size: a vector containing the sampled step size used for each iteration.
- step_length: a vector containing the length of each trajectory of the chain.
- alpha: a vector of length k + 1 containing the estimates of the Metropolis acceptance probabilities. The first element of the vector is the estimated global acceptance probability. The remaining k elements are the estimated rates of reflection for each parameter that travels coordinate-wise through some discontinuity.
- warm_up: a matrix containing the sample of the chain coming from the warm-up phase. If keep_warm_up = FALSE inside the control argument, nothing is returned.
- div_trans: a matrix containing the locations where a divergence was encountered during the integration of Hamilton's equations. Hopefully this matrix has few rows, and even better if it has none.
- M_cont: the mass matrix of the continuous components estimated during the warm-up phase. Based on the M_type value of the control argument, this could be an empty object, a vector or a matrix.
- M_disc: the mass matrix of the discontinuous components estimated during the warm-up phase. Based on the M_type value of the control argument, this could be an empty object or a vector.
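As a rough illustration of how the returned list might be inspected, the sketch below calls main_function with the toy negative log posterior from the Arguments section and then looks at a few elements of the output. All numeric settings are arbitrary, and ctrl stands for a control object built through the package's interface (not shown here); in practice, as noted above, the sampler is meant to be reached through xdnuts rather than by calling main_function directly.

# Illustrative sketch only: arbitrary settings, and 'ctrl' is a placeholder
# for a control object created through the xdnuts interface.
fit <- main_function(theta0 = rep(0, 2), nlp = nlp, args = args,
                     k = 0, N = 1000, K = 1, tau = 0.01, L = 10,
                     thin = 1, chain_id = 1, verbose = FALSE,
                     control = ctrl)

plot(fit$energy, type = "l", ylab = "energy")  # trace of the energy level sets
fit$alpha[1]                                   # estimated global acceptance probability
nrow(fit$div_trans)                            # number of divergent transitions (ideally zero)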
References
Betancourt M (2016).
“Identifying the optimal integration time in Hamiltonian Monte Carlo.”
arXiv preprint arXiv:1601.00225.
Duane S, Kennedy AD, Pendleton BJ, Roweth D (1987).
“Hybrid Monte Carlo.”
Physics Letters B, 195(2), 216–222.
Hoffman MD, Gelman A (2014).
“The No-U-Turn sampler: adaptively setting path lengths in Hamiltonian Monte Carlo.”
Journal of Machine Learning Research, 15(1), 1593–1623.
Nishimura A, Dunson D (2020).
“Recycling Intermediate Steps to Improve Hamiltonian Monte Carlo.”
Bayesian Analysis, 15(4).
doi:10.1214/19-BA1171.
Nishimura A, Dunson DB, Lu J (2020).
“Discontinuous Hamiltonian Monte Carlo for discrete parameters and discontinuous likelihoods.”
Biometrika, 107(2), 365–380.