[G-Stats Seminar] Frank Nielsen

Frank Nielsen

Sony CSL

24/11/2022 – 2 p.m.

Title: “Revisiting Chernoff Information with Likelihood Ratio Exponential Families”

Abstract: The Chernoff information between two probability measures is a statistical divergence measuring their deviation, defined as their maximally skewed Bhattacharyya distance. Although the Chernoff information was originally introduced for bounding the Bayes error in statistical hypothesis testing, the divergence has since found many other uses, owing to its empirical robustness, in applications ranging from information fusion to quantum information. From the viewpoint of information theory, the Chernoff information can also be interpreted as a minmax symmetrization of the Kullback–Leibler divergence. In this talk, we first revisit the Chernoff information between two densities of a measurable Lebesgue space by considering the exponential families induced by their geometric mixtures: the so-called likelihood ratio exponential families. Second, we show how to
(i) compute exactly the Chernoff information between any two univariate Gaussian distributions, or obtain a closed-form formula using symbolic computing,
(ii) report a closed-form formula for the Chernoff information between centered Gaussians with scaled covariance matrices, and
(iii) use a fast numerical scheme to approximate the Chernoff information between any two multivariate Gaussian distributions.
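For readers unfamiliar with the divergence, the definitions recalled in the abstract can be written out explicitly. Below is a short LaTeX rendering of the standard definitions of the skewed Bhattacharyya distance and the Chernoff information; the last identity is our reading of the minmax interpretation mentioned in the abstract, stated here as an assumption rather than a quote from the talk.

```latex
% Skewed Bhattacharyya distance between densities p and q with skew \alpha:
B_\alpha(p:q) = -\log \int p(x)^{\alpha}\, q(x)^{1-\alpha}\, \mathrm{d}\mu(x),
  \qquad \alpha \in (0,1).

% Chernoff information: the maximally skewed Bhattacharyya distance.
C(p:q) = \max_{\alpha \in (0,1)} B_\alpha(p:q).

% Minmax symmetrization of the Kullback--Leibler divergence
% (our reading of the information-theoretic interpretation above):
C(p:q) = \min_{r} \max\left\{ \mathrm{KL}(r:p),\; \mathrm{KL}(r:q) \right\}.
```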
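As an illustration of item (iii), here is a minimal numerical sketch, not the speaker's implementation: for multivariate Gaussians the skewed Bhattacharyya distance admits a well-known closed form, so the Chernoff information can be approximated by a one-dimensional maximization over the skew parameter. The function names are illustrative, and SciPy's bounded scalar optimizer stands in for whatever scheme the talk presents.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def skewed_bhattacharyya(mu1, S1, mu2, S2, alpha):
    """Closed-form alpha-skewed Bhattacharyya distance
    B_alpha(p:q) = -log int p^alpha q^(1-alpha) dmu
    between p = N(mu1, S1) and q = N(mu2, S2)."""
    Sa = (1.0 - alpha) * S1 + alpha * S2  # interpolated covariance
    dmu = mu2 - mu1
    quad = 0.5 * alpha * (1.0 - alpha) * dmu @ np.linalg.solve(Sa, dmu)
    _, logdet_a = np.linalg.slogdet(Sa)
    _, logdet_1 = np.linalg.slogdet(S1)
    _, logdet_2 = np.linalg.slogdet(S2)
    return quad + 0.5 * (logdet_a - (1.0 - alpha) * logdet_1 - alpha * logdet_2)

def chernoff_information(mu1, S1, mu2, S2):
    """Approximate C(p:q) = max_alpha B_alpha(p:q) by a bounded 1-D
    optimization; B_alpha is concave in alpha for exponential families,
    so a single scalar maximization suffices."""
    res = minimize_scalar(
        lambda a: -skewed_bhattacharyya(mu1, S1, mu2, S2, a),
        bounds=(1e-9, 1.0 - 1e-9), method="bounded",
    )
    return -res.fun, res.x  # (Chernoff information, optimal skew alpha*)

# Example: two bivariate Gaussians.
mu1, S1 = np.zeros(2), np.eye(2)
mu2, S2 = np.array([1.0, 0.0]), np.array([[2.0, 0.3], [0.3, 1.0]])
value, alpha_star = chernoff_information(mu1, S1, mu2, S2)
print(f"Chernoff information ~ {value:.6f} at alpha* ~ {alpha_star:.6f}")
```

The one-dimensional search exploits the concavity of the skewed Bhattacharyya distance in the skew parameter, so the bounded Brent method converges quickly; the optimal skew alpha* returned alongside the value locates the Chernoff point on the geometric-mixture path between the two Gaussians.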
