Publication details

Boosting in probabilistic neural networks

Conference Paper (international conference)

Grim Jiří, Pudil Pavel, Somol Petr


serial: Proceedings of the 16th International Conference on Pattern Recognition, pp. 136-139, Eds: Kasturi R., Laurendeau D., Suen C.

publisher: IEEE Computer Society, (Los Alamitos 2002)

action: International Conference on Pattern Recognition /16./, (Québec City, CA, 11.08.2002-15.08.2002)

research: CEZ:AV0Z1075907

project(s): GA402/01/0981, GA ČR, KSK1019101, GA AV ČR

keywords: neural networks, finite mixtures, boosting


abstract (eng):

It has been verified in practical experiments that classification performance can be improved by increasing the weights of misclassified training samples. We prove that, in the case of maximum-likelihood estimation, the weighting of discrete data vectors is asymptotically equivalent to multiplying the estimated distributions by a positive function. Consequently, under certain conditions Bayesian decision-making can be made asymptotically invariant with respect to arbitrary weighting of the data.
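The abstract's key equivalence can be illustrated with a small numerical sketch (not taken from the paper; the symbol counts and weight values below are hypothetical). For a discrete distribution, when the weight of a sample depends only on its value x, the weighted maximum-likelihood estimate equals the unweighted estimate multiplied by a positive function of x:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical discrete data over K = 4 symbols.
K = 4
data = rng.integers(0, K, size=1000)

# A positive weight w(x) assigned per symbol; each sample inherits
# the weight of its value (as in boosting-style reweighting).
w = rng.uniform(0.5, 2.0, size=K)

counts = np.bincount(data, minlength=K).astype(float)
p_hat = counts / counts.sum()                 # unweighted ML estimate

wsum = np.bincount(data, weights=w[data], minlength=K)
p_hat_w = wsum / wsum.sum()                   # weighted ML estimate

# The weighted estimate is the unweighted one multiplied by the
# positive function w(x) / Z, with Z a normalizing constant.
Z = np.sum(p_hat * w)
print(np.allclose(p_hat_w, p_hat * w / Z))    # the two estimates coincide
```

This mirrors the abstract's statement for the finite-sample discrete case: reweighting the data is equivalent to multiplying the estimated distribution by a positive function, here w(x)/Z.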

Cosati: 09K

RIV: BB