| Title: | Difference measures for multivariate Gaussian probability density functions |
|---|---|
| Description: | A collection of difference measures for multivariate Gaussian probability density functions, such as the Euclidean distance of the means, the Mahalanobis distance, the Kullback-Leibler divergence, the J-Coefficient, the Minkowski L2-distance, the Chi-square divergence and the Hellinger coefficient. |
| Authors: | Henning Rust <[email protected]> |
| Maintainer: | Henning Rust <[email protected]> |
| License: | GPL (>= 2) |
| Version: | 1.1 |
| Built: | 2024-12-12 06:50:57 UTC |
| Source: | CRAN |
Various difference measures for multivariate Gaussian pdfs are implemented: the Euclidean distance of the means, the Mahalanobis distance, the Kullback-Leibler divergence, the J-Coefficient, the Minkowski L2-distance, the Chi-square divergence, and the Hellinger coefficient, which is a similarity measure rather than a distance.
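To make the two simplest measures concrete, here is a minimal base-R sketch of what they compute for two example Gaussians. It does not call the package and says nothing about normdiff's internals; in particular, whether the package reports the squared or unsquared Mahalanobis distance is left to the reference cited below.

```r
## Hand computation of two of the measures (illustration only, no package code).
mu1 <- c(0, 0, 0); mu2 <- c(1, 1, 1)
sig <- diag(3)                       # shared covariance matrix

## Euclidean distance of the means
euclid <- sqrt(sum((mu1 - mu2)^2))   # sqrt(3) ~ 1.732

## Squared Mahalanobis distance: (mu1 - mu2)' sig^{-1} (mu1 - mu2)
maha2 <- as.numeric(t(mu1 - mu2) %*% solve(sig) %*% (mu1 - mu2))   # 3
```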
normdiff(mu1, sigma1 = NULL, mu2, sigma2 = sigma1, inv = FALSE, s = 0.5, method = c("Mahalanobis", "KL", "J", "Chisq", "Hellinger", "L2", "Euclidean"))
| Argument | Description |
|---|---|
| mu1 | mean value of pdf 1, a vector |
| sigma1 | covariance matrix of pdf 1 |
| mu2 | mean value of pdf 2, a vector |
| sigma2 | covariance matrix of pdf 2; defaults to sigma1 |
| method | difference measure to be used, see below |
| inv | if TRUE, 1 - Hellinger is reported; default: FALSE (see the sketch after this table) |
| s | exponent for the Hellinger coefficient; default: 0.5 |
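The inv and s arguments are documented only in connection with the Hellinger coefficient. A short usage sketch under that assumption, relying only on the signature and defaults shown above:

```r
library(gaussDiff)
mu1 <- c(0, 0); sig1 <- diag(2)
mu2 <- c(1, 1); sig2 <- 0.5 * diag(2)

## Hellinger coefficient with the default exponent s = 0.5 (a similarity measure)
normdiff(mu1 = mu1, sigma1 = sig1, mu2 = mu2, sigma2 = sig2, method = "Hellinger")

## The same measure reported as 1 - Hellinger, i.e. as a dissimilarity
normdiff(mu1 = mu1, sigma1 = sig1, mu2 = mu2, sigma2 = sig2,
         method = "Hellinger", inv = TRUE)

## A different exponent for the Hellinger coefficient
normdiff(mu1 = mu1, sigma1 = sig1, mu2 = mu2, sigma2 = sig2,
         method = "Hellinger", s = 0.25)
```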
Equations for the measures can be found in H.-H. Bock, Analysis of Symbolic Data, chapter "Dissimilarity Measures for Probability Distributions".
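For orientation, the textbook closed forms of two of these quantities for d-dimensional Gaussians p_i = N(mu_i, Sigma_i) are sketched below. The package's exact conventions (log base, factors of 1/2, possible scaling of the J-Coefficient) are those of the reference above, not necessarily the forms shown here.

```latex
% Textbook forms only; conventions may differ from the package's implementation.
\[
  D_{\mathrm{KL}}(p_1 \,\|\, p_2)
  = \frac{1}{2}\left[
      \operatorname{tr}\!\bigl(\Sigma_2^{-1}\Sigma_1\bigr)
      + (\mu_2-\mu_1)^{\top}\Sigma_2^{-1}(\mu_2-\mu_1)
      - d
      + \ln\frac{\det\Sigma_2}{\det\Sigma_1}
    \right],
\]
\[
  J(p_1,p_2) = D_{\mathrm{KL}}(p_1 \,\|\, p_2) + D_{\mathrm{KL}}(p_2 \,\|\, p_1)
  \quad \text{(the commonly used symmetrised form).}
\]
```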
A scalar object of class normdiff reporting the distance (or, in the case of the Hellinger coefficient, the similarity).
Henning Rust, [email protected]
H.-H. Bock and E. Diday (eds.): Analysis of Symbolic Data, Springer, 2000; chapter "Dissimilarity Measures for Probability Distributions".
library(gaussDiff)

mu1 <- c(0,0,0)
sig1 <- diag(c(1,1,1))
mu2 <- c(1,1,1)
sig2 <- diag(c(0.5,0.5,0.5))

## Euclidean distance of the means
normdiff(mu1=mu1,mu2=mu2,method="Euclidean")

## Mahalanobis distance
normdiff(mu1=mu1,sigma1=sig1,mu2=mu2,method="Mahalanobis")

## Kullback-Leibler divergence
normdiff(mu1=mu1,sigma1=sig1,mu2=mu2,sigma2=sig2,method="KL")

## J-Coefficient
normdiff(mu1=mu1,sigma1=sig1,mu2=mu2,sigma2=sig2,method="J")

## Chi-square divergence
normdiff(mu1=mu1,sigma1=sig1,mu2=mu2,sigma2=sig2,method="Chisq")

## Minkowski L2 distance
normdiff(mu1=mu1,sigma1=sig1,mu2=mu2,sigma2=sig2,method="L2")

## Hellinger coefficient
normdiff(mu1=mu1,sigma1=sig1,mu2=mu2,sigma2=sig2,method="Hellinger")
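As an additional, hypothetical cross-check (not taken from the package documentation), the KL value returned by normdiff can be compared with the textbook closed form given above; if the package uses the natural-log convention without extra scaling, the two numbers should agree.

```r
library(gaussDiff)

mu1 <- c(0, 0, 0); sig1 <- diag(3)
mu2 <- c(1, 1, 1); sig2 <- diag(c(0.5, 0.5, 0.5))

## Closed form: 0.5 * [ tr(S2^-1 S1) + (m2-m1)' S2^-1 (m2-m1) - d + ln(det S2 / det S1) ]
d <- length(mu1)
kl_manual <- 0.5 * (sum(diag(solve(sig2) %*% sig1)) +
                    as.numeric(t(mu2 - mu1) %*% solve(sig2) %*% (mu2 - mu1)) -
                    d + log(det(sig2) / det(sig1)))

kl_manual
normdiff(mu1 = mu1, sigma1 = sig1, mu2 = mu2, sigma2 = sig2, method = "KL")
```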