Institute of Information Theory and Automation

Bibliography

Conference Paper (Czech conference)

Optimality conditions for maximizers of the information divergence from an exponential family

Matúš František

: WUPES '06: Proceedings of the 7th Workshop on Uncertainty Processing, p. 96-110, Eds: Vejnarová J., Kroupa T.

: WUPES 2006, (Mikulov, CZ, 16.09.2006-20.09.2006)

: CEZ:AV0Z10750506

: IAA100750603, GA AV ČR

: Kullback-Leibler divergence, relative entropy, exponential family, information projection, cumulant generating function, log-Laplace transform

(eng): The information divergence of a probability measure P from an exponential family E over a finite set is defined as the infimum of the divergences of P from Q, subject to Q in E. All directional derivatives of the divergence from E are found explicitly. To this end, the behaviour of the conjugate of a log-Laplace transform on the boundary of its domain is analysed. First-order conditions for P to be a maximizer of the divergence from E are presented, including new ones for the case when P is not projectable to E.
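As a purely illustrative sketch of the definition above (not the paper's own method), the divergence D(P||E) = inf_{Q in E} D(P||Q) can be computed numerically on a small finite set. The example below assumes a hypothetical one-dimensional exponential family on {0, 1, 2} with sufficient statistic f(x) = x, and uses the standard fact that the information projection matches the mean of f, which can be found by bisection since the mean is strictly increasing in the natural parameter:

```python
import math

def kl(p, q):
    # Kullback-Leibler divergence D(p || q) over a finite set
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def exp_family_member(theta, xs):
    # Q_theta(x) proportional to exp(theta * x); the normalizer is the
    # Laplace transform of the counting measure on xs
    w = [math.exp(theta * x) for x in xs]
    z = sum(w)
    return [wi / z for wi in w]

def divergence_from_family(p, xs, lo=-50.0, hi=50.0, iters=200):
    # Find theta with E_{Q_theta}[x] = E_P[x] by bisection; the minimizing
    # Q is the information projection of P onto the family (assumes the
    # mean of P lies strictly between min(xs) and max(xs))
    target = sum(pi * x for pi, x in zip(p, xs))
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        q = exp_family_member(mid, xs)
        if sum(qi * x for qi, x in zip(q, xs)) < target:
            lo = mid
        else:
            hi = mid
    q = exp_family_member(0.5 * (lo + hi), xs)
    return kl(p, q), q

# Example: a measure P on {0, 1, 2} that does not belong to the family
p = [0.6, 0.1, 0.3]
d, q = divergence_from_family(p, [0, 1, 2])
```

Here `d` is strictly positive because P is bimodal while every member of this log-linear family is monotone in x; when P itself lies in the family, the bisection recovers it and `d` vanishes.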

(cze): The information divergence of a probability measure P from an exponential family E is defined as the infimum of the divergences of P from Q in E. The directional derivatives of this divergence are computed using new results on the conjugate of the log-Laplace transform. New first-order necessary conditions for P to be a maximizer of this divergence are formulated.

: 12

: BD

2019-01-07 08:39