The package vMF simulates from the von Mises-Fisher distribution (ℳ). Unlike the package movMF (Hornik and Grün 2014), which simulates and estimates mixtures of ℳ, vMF performs fast sampling because its source code is written in C++. vMF also computes the density and the normalization constant of ℳ.
The von Mises-Fisher distribution models coordinates on a hypersphere of dimension p ≥ 2. Roughly speaking, it is the analogue of the normal distribution on the hypersphere. Like the normal distribution, ℳ is characterized by two parameters: the location (or mean direction) parameter μ, around which the draws are concentrated, and the intensity (or concentration) parameter η, which measures how strongly the draws concentrate around μ. The higher η, the more the draws are concentrated around μ. By analogy with the normal distribution, μ plays the role of the mean and 1/η the role of the standard deviation.
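To make the role of η concrete, the short sketch below draws samples around the same direction μ with two intensities and compares the sample means. It assumes, as in the calls used later in this vignette, that the second argument of rvMF() is the product ημ and that the draws are returned as rows.
library(vMF)
set.seed(123)
mu <- c(0, 0, 1)
# With eta = 1 the draws are weakly concentrated, so the sample mean is far from mu
colMeans(rvMF(1000, 1 * mu))
# With eta = 100 the draws are tightly concentrated around mu
colMeans(rvMF(1000, 100 * mu))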
There are several definitions of the density function of ℳ. In this package, the density is normalized with respect to the uniform distribution on the hypersphere, without loss of generality. This is also the case in Mardia and Jupp (2009) and Hornik and Grün (2013).
Let z ∼ ℳ(η, μ). The density of z is given by
$$f_p(z|\eta, \mu) = C_p(\eta)e^{\eta z^{\prime}\mu},$$ where $\displaystyle C_p(x) = \left(\frac{x}{2}\right)^{\frac{p}{2}-1}\frac{1}{\Gamma\left(\frac{p}{2}\right)I_{\frac{p}{2}-1}(x)}$ is the normalization constant and $I_{\nu}(\cdot)$ is the Bessel function of the first kind, defined by: $$\displaystyle I_{\alpha}(x) = \sum_{m=0}^{\infty}\frac{\left(\frac{x}{2}\right)^{2m+\alpha}}{m!\,\Gamma(m+\alpha + 1)}.$$ The normalization with respect to the uniform distribution implies $C_p(0) = 1$.
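As a quick numerical check of these formulas, the normalization constant and the density can be reproduced in base R with besselI(); the helpers Cp() and fp() below are only an illustrative sketch and are not part of the vMF API.
# Normalization constant C_p(x); C_p(0) = 1 under the uniform normalization
Cp <- function(x, p) {
  if (x == 0) return(1)
  (x / 2)^(p / 2 - 1) / (gamma(p / 2) * besselI(x, p / 2 - 1))
}
# Density f_p(z | eta, mu) = C_p(eta) exp(eta * z'mu) for unit vectors z and mu
fp <- function(z, eta, mu) Cp(eta, length(z)) * exp(eta * sum(z * mu))
Cp(0, 3)                       # equals 1 by construction
fp(c(0, 0, 1), 2, c(0, 0, 1))  # density at the mean direction for p = 3, eta = 2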
The following algorithm provides a rejection sampling scheme for drawing a sample from ℳ with mean direction parameter μ = (0, ..., 0, 1) and concentration (intensity) parameter η ≥ 0 (see Section 2.1 in Hornik and Grün 2014).
1. Calculate $$b = \dfrac{p - 1}{2\eta + \sqrt{4\eta^2 + (p - 1)^2}}.$$ Let x₀ = (1 − b)/(1 + b) and c = ηx₀ + (p − 1)log(1 − x₀²).
2. Generate Z ∼ Beta((p − 1)/2, (p − 1)/2) and U ∼ Uniform(0, 1), and calculate $$W = \dfrac{1-(1+b)Z}{1-(1-b)Z}.$$
3. If ηW + (p − 1)log(1 − x₀W) − c < log(U), go to step 2.
4. Generate a uniform (p − 1)-dimensional unit vector V and return $$\mathbf{X} =\left(\sqrt{1- W^2}\mathbf{V}^{\prime},W\right)^{\prime}.$$
The uniform (p − 1)-dimensional unit vector V can be generated by simulating p − 1 independent standard normal random variables and normalizing them so that ∥V∥₂ = 1. To sample from ℳ with an arbitrary mean direction parameter μ, X is multiplied from the left by a matrix whose first p − 1 columns form an orthonormal basis of the subspace orthogonal to μ and whose last column is equal to μ.
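For concreteness, here is a plain R transcription of the steps above. It is only an illustrative sketch (the package's sampler is implemented in C++), and the names rvmf_canonical and rvmf_one are hypothetical.
# Rejection sampler for mu = (0, ..., 0, 1) and intensity eta >= 0 (steps 1 to 4 above)
rvmf_canonical <- function(eta, p) {
  # Step 1: constants of the envelope
  b  <- (p - 1) / (2 * eta + sqrt(4 * eta^2 + (p - 1)^2))
  x0 <- (1 - b) / (1 + b)
  c0 <- eta * x0 + (p - 1) * log(1 - x0^2)
  repeat {
    # Step 2: candidate W
    Z <- rbeta(1, (p - 1) / 2, (p - 1) / 2)
    U <- runif(1)
    W <- (1 - (1 + b) * Z) / (1 - (1 - b) * Z)
    # Step 3: accept W, otherwise go back to step 2
    if (eta * W + (p - 1) * log(1 - x0 * W) - c0 >= log(U)) break
  }
  # Step 4: uniform unit vector V for the component orthogonal to the mean direction
  V <- rnorm(p - 1)
  V <- V / sqrt(sum(V^2))
  c(sqrt(1 - W^2) * V, W)
}
# Rotation to an arbitrary unit mean direction mu, as described above:
# the last column of A equals mu and the others span its orthogonal complement
rvmf_one <- function(eta, mu) {
  p <- length(mu)
  Q <- qr.Q(qr(as.matrix(mu)), complete = TRUE)  # first column is +/- mu
  if (sum(Q[, 1] * mu) < 0) Q <- -Q              # fix the sign so that Q[, 1] = mu
  A <- Q[, c(2:p, 1)]
  drop(A %*% rvmf_canonical(eta, p))
}
rvmf_one(10, c(0, 0, 1))  # one draw concentrated around (0, 0, 1)
In practice one would simply call rvMF(); this sketch is only meant to make the rejection steps explicit.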
In this section, I compare the running times of vMF and movMF.
library(vMF)
library(movMF)
library(rbenchmark)
# Benchmark drawing n observations with each package
fcompare <- function(n) {
  benchmark("vMF" = rvMF(n, c(1, 0, 0)), "movMF" = rmovMF(n, c(1, 0, 0)))
}
fcompare(1)
#> test replications elapsed relative user.self sys.self user.child sys.child
#> 2 movMF 100 0.018 18 0.018 0 0 0
#> 1 vMF 100 0.001 1 0.000 0 0 0
fcompare(10)
#> test replications elapsed relative user.self sys.self user.child sys.child
#> 2 movMF 100 0.018 18 0.019 0 0 0
#> 1 vMF 100 0.001 1 0.001 0 0 0
fcompare(100)
#> test replications elapsed relative user.self sys.self user.child sys.child
#> 2 movMF 100 0.018 4.5 0.018 0 0 0
#> 1 vMF 100 0.004 1.0 0.004 0 0 0
vMF outperforms movMF. The gain is largest when only a few draws are needed; as the sample size grows, the two packages require approximately the same running time. The figure below summarizes this comparison for sample sizes from 1 to 200.
library(ggplot2)
# `out` is assumed to hold the relative running time (movMF elapsed over vMF elapsed)
# for sample sizes n = 1, ..., 200
out <- sapply(1:200, function(n) with(fcompare(n), elapsed[test == "movMF"] / elapsed[test == "vMF"]))
ggplot(data = data.frame(n = 1:200, time = out), aes(x = n, y = time)) +
  geom_point(col = "blue") + geom_hline(yintercept = 1, col = 2)
Many papers use simulations from the von Mises-Fisher distribution within a Markov Chain Monte Carlo (MCMC) process, where a single draw is performed at each iteration of the MCMC. This is for example the case in Boucher and Houndetoungan (2022), Breza et al. (2020), and McCormick and Zheng (2015). In such a context, using vMF takes much less time than movMF. As an example, I consider the process (zt)t ∈ ℕ which follows a von Mises-Fisher random walk: the first variable, z0, is randomly set on a 4-dimensional hypersphere, and zt ∼ ℳ(1, zt − 1) for all t > 0. Simulating this process has about the same complexity as using von Mises-Fisher draws within an MCMC.
set.seed(123)
P <- 4
# Starting point drawn uniformly on the P-dimensional hypersphere (eta = 0)
initial <- rmovMF(1, rep(0, P))
# Function based on vMF to simulate the process
SamplevMF <- function(n) {
  output <- matrix(0, n + 1, P)
  output[1, ] <- initial
  for (i in 1:n) {
    output[i + 1, ] <- rvMF(1, output[i, ])
  }
  return(output)
}
# Function based on movMF to simulate the process
SamplemovMF <- function(n) {
  output <- matrix(0, n + 1, P)
  output[1, ] <- initial
  for (i in 1:n) {
    output[i + 1, ] <- rmovMF(1, output[i, ])
  }
  return(output)
}
benchmark("vMF" = SamplevMF(1000), "movMF" = SamplemovMF(1000))
#> test replications elapsed relative user.self sys.self user.child sys.child
#> 2 movMF 100 35.605 51.902 35.260 0.032 0 0
#> 1 vMF 100 0.686 1.000 0.643 0.044 0 0
The comparison of the running times confirms that vMF is far less time-consuming than movMF in this context.