Publication
Inference and Learning with Model Uncertainty in Probabilistic Logic Programs
Book Contribution - Book Chapter / Conference Contribution
Abstract: An issue that has so far received only limited attention in probabilistic logic programming (PLP) is the modeling of so-called epistemic uncertainty, the uncertainty about the model itself. Accurately quantifying this model uncertainty is paramount to robust inference, learning, and ultimately decision making. We introduce BetaProbLog, a PLP language that can model epistemic uncertainty. BetaProbLog has sound semantics, an effective inference algorithm that combines Monte Carlo techniques with knowledge compilation, and a parameter learning algorithm. We empirically outperform state-of-the-art methods on probabilistic inference tasks in second-order Bayesian networks, digit classification, and discriminative learning in the presence of epistemic uncertainty.
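The general idea described in the abstract can be illustrated with a small, hypothetical sketch: treat each probabilistic fact's probability as Beta-distributed rather than fixed, draw Monte Carlo samples of concrete parameterizations, evaluate the query exactly for each sample, and summarize the resulting distribution over the query probability. The Python sketch below shows only this Monte Carlo idea; the toy model, all names, and the direct per-sample evaluation are illustrative assumptions, not BetaProbLog's implementation, which additionally relies on knowledge compilation.

# Hypothetical sketch of Monte Carlo inference under epistemic (Beta) uncertainty.
# Not the authors' implementation; purely illustrative.

import random

# Toy model: query q succeeds if a or (b and c).
# Each fact's probability is uncertain, described by a Beta(alpha, beta) prior.
facts = {
    "a": (2.0, 8.0),    # low probability, fairly uncertain
    "b": (5.0, 5.0),    # around 0.5
    "c": (20.0, 2.0),   # high probability, fairly certain
}

def query_prob(p):
    """Exact probability of q = a or (b and c) for one concrete parameterization p."""
    return 1.0 - (1.0 - p["a"]) * (1.0 - p["b"] * p["c"])

def monte_carlo(n_samples=10_000, seed=0):
    """Sample parameterizations from the Beta priors and summarize P(q)."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n_samples):
        p = {f: rng.betavariate(a, b) for f, (a, b) in facts.items()}
        samples.append(query_prob(p))
    mean = sum(samples) / len(samples)
    var = sum((s - mean) ** 2 for s in samples) / (len(samples) - 1)
    return mean, var

if __name__ == "__main__":
    mean, var = monte_carlo()
    print(f"P(q): mean={mean:.3f}, variance={var:.5f}")

The reported mean plays the role of a point estimate of the query probability, while the variance quantifies how much the epistemic uncertainty over the model parameters propagates to the query.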
Book: Proceedings of the Thirty-Sixth AAAI Conference on Artificial Intelligence
Pages: 10060 - 10069
ISBN: 1-57735-876-7
Publication year: 2022
BOF-keylabel: yes
IOF-keylabel: yes
Authors from: Higher Education
Accessibility: Open
Review status: Peer-reviewed