Conference Paper (international conference)
In: Symbolic and Quantitative Approaches to Reasoning with Uncertainty. ECSQARU 2017, pp. 125-134. Eds: Antonucci A., Cholvy L., Papini O.
Conference: ECSQARU: European Conference on Symbolic and Quantitative Approaches to Reasoning and Uncertainty (Lugano, CH, 2017-07-10)
Grant: GA16-12010S, GA ČR
Keywords: computerized adaptive testing, probabilistic graphical models, gradient methods
DOI: 10.1007/978-3-319-61581-3_12
Fulltext: http://library.utia.cas.cz/separaty/2017/MTR/plajner-0476602.pdf
Abstract (eng): Artificial intelligence models appear in many modern computer-science applications, and the question of how to learn their parameters effectively, even from small data samples, remains very active. Restricting the conditional probabilities of a probabilistic model by monotonicity conditions can be useful in certain situations; moreover, in some cases the modeled reality requires these conditions to hold. In this article we focus on monotonicity conditions in Bayesian network models. We present an algorithm, based on gradient descent optimization, for learning model parameters that satisfy monotonicity conditions. We test the proposed method on two data sets: one synthetic, the other formed by real data collected for computerized adaptive testing. We compare the obtained results with the isotonic regression EM method of Masegosa et al., which also learns BN model parameters satisfying monotonicity, and with the standard unrestricted EM algorithm for BN learning. The experimental results clearly justify the monotonicity restrictions: as a consequence of the monotonicity requirements, the resulting models fit the data better.
: JD
: 20205