Institute of Information Theory and Automation


Research Report

Evaluation of Kullback-Leibler Divergence

Homolová Jitka, Kárný Miroslav

Publisher: ÚTIA AV ČR, v.v.i. (Praha 2015)

Series: Research Report 2349

Grant: GA13-13502S, GA ČR

Keywords: Kullback-Leibler divergence, cross-entropy, Bayesian decision making, Bayesian learning and approximation

Fulltext: http://library.utia.cas.cz/separaty/2015/AS/homolova-0444191.pdf

Abstract (eng): The Kullback-Leibler divergence is a leading measure of the similarity or dissimilarity of probability distributions. This technical report collects its analytical and numerical expressions for a broad range of distributions.
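To illustrate the kind of expression the report collects, the following sketch compares the well-known closed-form Kullback-Leibler divergence between two univariate Gaussians with a direct numerical evaluation of the defining integral. The function names and the trapezoidal quadrature are illustrative choices, not taken from the report.

```python
import math

def kl_gaussians(mu1, s1, mu2, s2):
    # Closed-form KL divergence D( N(mu1, s1^2) || N(mu2, s2^2) ):
    # ln(s2/s1) + (s1^2 + (mu1 - mu2)^2) / (2 s2^2) - 1/2
    return math.log(s2 / s1) + (s1**2 + (mu1 - mu2)**2) / (2 * s2**2) - 0.5

def kl_numeric(mu1, s1, mu2, s2, lo=-40.0, hi=40.0, n=200000):
    # Numerical check: integrate p(x) * log(p(x)/q(x)) by the trapezoidal rule.
    def pdf(x, mu, s):
        return math.exp(-0.5 * ((x - mu) / s) ** 2) / (s * math.sqrt(2 * math.pi))
    h = (hi - lo) / n
    total = 0.0
    for i in range(n + 1):
        x = lo + i * h
        p = pdf(x, mu1, s1)
        # Skip points where p underflows; their contribution to the integral is negligible.
        if p > 1e-300:
            val = p * math.log(p / pdf(x, mu2, s2))
        else:
            val = 0.0
        weight = 0.5 if i in (0, n) else 1.0
        total += weight * val * h
    return total

analytic = kl_gaussians(0.0, 1.0, 1.0, 2.0)
numeric = kl_numeric(0.0, 1.0, 1.0, 2.0)
print(analytic, numeric)  # the two values should agree closely
```

The agreement between the analytic formula and the quadrature is the typical sanity check for each closed-form expression of this kind.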

