Abstract:
Divergence measures are widely used in pattern recognition, signal processing and statistical inference. In this paper, we introduce a new one-parameter family of divergence measures, called bounded Bhattacharyya distance (BBD) measures, for quantifying the dissimilarity between probability distributions. These measures are bounded, symmetric and positive semidefinite. Unlike the Kullback-Leibler divergence, BBD measures do not require the probability density functions to be absolutely continuous with respect to each other. In the asymptotic limit, the BBD measures approach the squared Hellinger distance. A generalized BBD measure for multiple distributions is also introduced. We prove an extension of a theorem of Bradt and Karlin for BBD, relating Bayes error probability and divergence ranking. We show that BBD belongs to the class of generalized Csiszár f-divergences and derive some of its properties, such as its curvature and its relation to Fisher information. For distributions with vector-valued parameters, the curvature matrix can be used to obtain the Rao geodesic distance. We also derive inequalities between BBD and well-known measures such as the Hellinger and Jensen-Shannon divergences. Bounds on the Bayes error probability are established in terms of the BBD measure.
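The abstract refers to several standard divergence quantities. The sketch below (not taken from the paper; the function names, the NumPy implementation and the choice of Hellinger normalization are illustrative assumptions) computes the Bhattacharyya coefficient, squared Hellinger distance, Kullback-Leibler divergence and Jensen-Shannon divergence for discrete distributions, and illustrates the absolute-continuity issue mentioned above. The BBD family itself is defined in the body of the paper and is not reproduced here.

```python
# Minimal sketch of the standard divergence quantities referenced in the abstract,
# for discrete (finite-support) distributions. Conventions vary in the literature;
# here H^2 = 1 - rho, one common normalization. This is NOT the BBD measure.
import numpy as np

def bhattacharyya_coefficient(p, q):
    """rho(p, q) = sum_i sqrt(p_i * q_i); equals 1 iff p == q (for normalized p, q)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum(np.sqrt(p * q)))

def squared_hellinger(p, q):
    """H^2(p, q) = 1 - rho(p, q), bounded in [0, 1]."""
    return 1.0 - bhattacharyya_coefficient(p, q)

def kl_divergence(p, q):
    """KL(p || q); infinite when p is not absolutely continuous w.r.t. q."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    support = p > 0
    if np.any(q[support] == 0):
        return float("inf")
    return float(np.sum(p[support] * np.log(p[support] / q[support])))

def jensen_shannon(p, q):
    """JS(p, q) = 0.5*KL(p||m) + 0.5*KL(q||m) with m = (p+q)/2; always finite."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

if __name__ == "__main__":
    p = [0.6, 0.4, 0.0]
    q = [0.0, 0.5, 0.5]
    print("Bhattacharyya coefficient:", bhattacharyya_coefficient(p, q))
    print("Squared Hellinger:        ", squared_hellinger(p, q))
    print("KL(p || q):               ", kl_divergence(p, q))   # infinite: p, q not mutually absolutely continuous
    print("Jensen-Shannon:           ", jensen_shannon(p, q))  # finite and bounded
```

As the example shows, the Kullback-Leibler divergence blows up when one distribution places mass where the other has none, whereas quantities built from the Bhattacharyya coefficient (like the squared Hellinger distance) remain bounded; this is the property the abstract highlights for the BBD family.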