Package: BayesFluxR 0.1.3

Enrico Wegner

BayesFluxR: Implementation of Bayesian Neural Networks

Implementation of 'BayesFlux.jl' for R. It extends the well-known 'Flux.jl' machine learning library to Bayesian Neural Networks. The goal is not to provide the fastest production-ready library, but rather to make it easier for more people to use and do research on Bayesian Neural Networks.

Authors: Enrico Wegner [aut, cre]

BayesFluxR_0.1.3.tar.gz
BayesFluxR_0.1.3.tar.gz (r-4.5-noble), BayesFluxR_0.1.3.tar.gz (r-4.4-noble)
BayesFluxR_0.1.3.tgz (r-4.4-emscripten), BayesFluxR_0.1.3.tgz (r-4.3-emscripten)
BayesFluxR.pdf | BayesFluxR.html
BayesFluxR/json (API)
NEWS

# Install 'BayesFluxR' in R:
install.packages('BayesFluxR', repos = c('https://cran.r-universe.dev', 'https://cloud.r-project.org'))
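
After installing the R package, a working Julia environment is also required, since BayesFluxR drives 'BayesFlux.jl' through 'JuliaCall'. A minimal first-run sketch is below; `installJulia` and `seed` follow the package's documented `BayesFluxR_setup()` interface, but treat the exact argument names as assumptions and check `?BayesFluxR_setup`:

```r
library(BayesFluxR)

# Set up the Julia environment. On first use this can install Julia
# and the BayesFlux.jl dependencies, which may take several minutes.
BayesFluxR_setup(installJulia = TRUE, seed = 123)
```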

Peer review:

This package does not link to any GitHub/GitLab/R-Forge repository. No issue tracker or development information is available.

Score: 1.70 | Scripts: 4 | Downloads: 247 | Exports: 41 | Dependencies: 8

Last updated 1 year ago, from commit a0de5620ed. Checks: OK: 2. Indexed: yes.

Target          | Result | Date
Doc / Vignettes | OK     | Dec 05 2024
R-4.5-linux     | OK     | Dec 05 2024

Exports: .set_seed, bayes_by_backprop, BayesFluxR_setup, BNN, BNN.totparams, Chain, Dense, find_mode, Gamma, initialise.allsame, InverseGamma, likelihood.feedforward_normal, likelihood.feedforward_tdist, likelihood.seqtoone_normal, likelihood.seqtoone_tdist, LSTM, madapter.DiagCov, madapter.FixedMassMatrix, madapter.FullCov, madapter.RMSProp, mcmc, Normal, opt.ADAM, opt.Descent, opt.RMSProp, posterior_predictive, prior_predictive, prior.gaussian, prior.mixturescale, RNN, sadapter.Const, sadapter.DualAverage, sampler.AdaptiveMH, sampler.GGMC, sampler.HMC, sampler.SGLD, sampler.SGNHTS, tensor_embed_mat, to_bayesplot, Truncated, vi.get_samples

Dependencies: evaluate, highr, JuliaCall, knitr, Rcpp, rjson, xfun, yaml

Readme and manuals

Help Manual

install_pkg: Installs Julia packages if needed.
julia_project_status: Obtain the status of the current Julia project.
set_seed: Set a seed in both Julia and R.
using: Loads Julia packages.
bayes_by_backprop: Use Bayes-by-Backprop to find a variational approximation to a BNN.
BayesFluxR_setup: Set up the Julia environment needed for BayesFlux.
BNN: Create a Bayesian Neural Network.
BNN.totparams: Obtain the total number of parameters of a BNN.
Chain: Chain various layers together to form a network.
Dense: Create a Dense layer with `in_size` inputs and `out_size` outputs using the `act` activation function.
find_mode: Find the MAP estimate of a BNN using SGD.
Gamma: Create a Gamma prior.
get_random_symbol: Creates a random string that is used as a variable name in Julia.
initialise.allsame: Initialises all parameters of the network, all hyperparameters of the prior, and all additional parameters of the likelihood by drawing random values from `dist`.
InverseGamma: Create an Inverse-Gamma prior.
likelihood.feedforward_normal: Use a Normal likelihood for a feedforward network.
likelihood.feedforward_tdist: Use a t-distribution likelihood for a feedforward network.
likelihood.seqtoone_normal: Use a Normal likelihood for a seq-to-one recurrent network.
likelihood.seqtoone_tdist: Use a t-distribution likelihood for a seq-to-one recurrent network.
LSTM: Create an LSTM layer with input size `in_size` and hidden-state size `out_size`.
madapter.DiagCov: Use the diagonal of the sample covariance matrix as the inverse mass matrix.
madapter.FixedMassMatrix: Use a fixed mass matrix.
madapter.FullCov: Use the full sample covariance matrix as the inverse mass matrix.
madapter.RMSProp: Use RMSProp to adapt the inverse mass matrix.
mcmc: Sample from a BNN using MCMC.
Normal: Create a Normal prior.
opt.ADAM: ADAM optimiser.
opt.Descent: Standard gradient descent.
opt.RMSProp: RMSProp optimiser.
posterior_predictive: Draw from the posterior predictive distribution.
prior_predictive: Sample from the prior predictive of a Bayesian Neural Network.
prior.gaussian: Use an isotropic Gaussian prior.
prior.mixturescale: Use a scale mixture of Gaussians prior.
RNN: Create an RNN layer with input size `in_size`, hidden-state size `out_size`, and activation function `act`.
sadapter.Const: Use a constant stepsize in MCMC.
sadapter.DualAverage: Use dual averaging, as in Stan, to tune the stepsize.
sampler.AdaptiveMH: Adaptive Metropolis-Hastings sampler.
sampler.GGMC: Gradient-Guided Monte Carlo sampler.
sampler.HMC: Standard Hamiltonian Monte Carlo (Hybrid Monte Carlo) sampler.
sampler.SGLD: Stochastic Gradient Langevin Dynamics, as proposed in Welling, M., & Teh, Y. W., "Bayesian Learning via Stochastic Gradient Langevin Dynamics".
sampler.SGNHTS: Stochastic Gradient Nosé-Hoover Thermostat sampler.
summary.BNN: Print a summary of a BNN.
tensor_embed_mat: Embed a matrix of time series into a tensor.
to_bayesplot: Convert a draws array to conform with `bayesplot`.
Truncated: Truncates a distribution.
vi.get_samples: Draw samples from a variational family.
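
Putting the exported functions together, the typical workflow is: define a network with `Chain`, attach a likelihood and a prior, initialise, sample with `mcmc`, then predict with `posterior_predictive`. A minimal end-to-end sketch is below; it requires a working Julia setup via `BayesFluxR_setup()`, and while the call order follows the function list above, the exact argument names and defaults are assumptions to be checked against the package manual:

```r
library(BayesFluxR)
BayesFluxR_setup(seed = 123)

# Toy regression data: 5 features, 100 observations
# (BayesFluxR follows the Flux.jl convention of features in rows)
x <- matrix(rnorm(5 * 100), nrow = 5)
y <- rnorm(100)

# A small feedforward network and its Bayesian components
net   <- Chain(Dense(5, 10, "relu"), Dense(10, 1))
like  <- likelihood.feedforward_normal(net, Gamma(2.0, 0.5))
prior <- prior.gaussian(net, 0.5)
init  <- initialise.allsame(Normal(0, 0.5), like, prior)

bnn <- BNN(x, y, like, prior, init)
BNN.totparams(bnn)  # total number of parameters being sampled

# Draw posterior samples with SGLD, then posterior predictions
ch   <- mcmc(bnn, 10, 1000, sampler.SGLD())
yhat <- posterior_predictive(bnn, ch$samples)
```

Alternatives slot in at the obvious places: swap `sampler.SGLD()` for `sampler.HMC`, `sampler.GGMC`, etc., or replace the MCMC step entirely with `bayes_by_backprop` for a variational approximation.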