Publication details

Pattern Recognition by Probabilistic Neural Networks - Mixtures of Product Components versus Mixtures of Dependence Trees

Conference Paper (international conference)

Grim Jiří, Pudil P.

serial: NCTA2014 - International Conference on Neural Computation Theory and Applications, p. 65-75

action: 6th International Conference on Neural Computation Theory and Applications, (Rome, IT, 22.10.2014-24.10.2014)

project(s): GA14-02652S, GA ČR, GAP403/12/1557, GA ČR

keywords: Probabilistic Neural Networks, Product Mixtures, Mixtures of Dependence Trees, EM Algorithm


abstract (eng):

We compare two probabilistic approaches to neural networks - the first based on mixtures of product components and the second using mixtures of dependence-tree distributions. Product mixture models can be efficiently estimated from data by means of the EM algorithm and have some practically important properties. However, in some cases the simplicity of product components may prove too restrictive, and a natural idea is to use a more complex mixture of dependence-tree distributions. The concept of a dependence tree allows us to explicitly describe the statistical relationships between pairs of variables at the level of individual components, so the approximation power of the resulting mixture may increase substantially. Nonetheless, when applied to the classification of numerals, we have found that both models perform comparably and that the contribution of the dependence-tree structures decreases in the course of the EM iterations. Thus the optimal estimate of the dependence-tree mixture tends to converge to a simple product mixture model.
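To illustrate the first of the two approaches, the following is a minimal sketch of EM estimation for a mixture of product components over binary data (Bernoulli product components). It is an illustrative assumption, not the authors' implementation; the function name `em_product_mixture` and all parameter choices are hypothetical.

```python
import numpy as np

def em_product_mixture(X, n_components, n_iter=50, seed=0):
    """Sketch of EM for a mixture of Bernoulli product components.

    Each component is a product of independent per-variable Bernoulli
    distributions, so the component log-likelihood is a simple sum
    over variables (this is the 'product component' simplicity the
    abstract refers to).
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.full(n_components, 1.0 / n_components)            # mixture weights
    theta = rng.uniform(0.25, 0.75, size=(n_components, d))  # per-variable probabilities

    for _ in range(n_iter):
        # E-step: posterior component responsibilities from the
        # product-form log-likelihoods (computed in log space for stability).
        log_p = (X @ np.log(theta).T
                 + (1.0 - X) @ np.log(1.0 - theta).T
                 + np.log(w))
        log_p -= log_p.max(axis=1, keepdims=True)
        r = np.exp(log_p)
        r /= r.sum(axis=1, keepdims=True)

        # M-step: closed-form re-estimation of weights and parameters.
        nk = r.sum(axis=0)
        w = nk / n
        theta = np.clip((r.T @ X) / nk[:, None], 1e-6, 1.0 - 1e-6)

    return w, theta
```

A dependence-tree mixture would replace each product component with a tree-structured distribution (pairwise terms chosen by a maximum-weight spanning tree over mutual information, as in the Chow-Liu construction), at the cost of a more involved M-step.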