Signal Processing and Speech Communication Laboratory

On Optimal Feature Orderings in Bayesian Network Classifiers

Master Thesis
Announcement date
01 Nov 2021
Christian Oswald
Bayesian Network Classifiers (BNCs) provide a versatile and transparent approach to supervised machine learning. Multiple strategies exist to learn such a network’s structure from data, providing additional insight into the dataset’s structure while retaining a small model size. However, learning the network’s structure becomes more difficult under discriminative rather than generative training objectives. Roth and Pernkopf proposed a gradient-based training method for tree-augmented naive Bayes (TAN) structures that is agnostic to the specified loss function and learns the network’s parameters and structure simultaneously. This thesis extends their approach by answering one of their open questions regarding the ordering of input features, which had a profound impact on their results. With the order-free approach presented in this thesis, performance is at least as good as with the random feature orderings used by Roth and Pernkopf. Furthermore, the versatility of Bayesian networks is leveraged by adding support for missing features, implemented through the computation of exact marginal probabilities.
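To illustrate the last point, here is a minimal sketch (not the thesis implementation, and simplified to a naive Bayes model rather than a TAN structure) of how a missing feature can be handled by exact marginalization. For naive Bayes with discrete features, summing a missing feature over all its values gives a factor of 1, so the corresponding conditional-probability term simply drops out of the posterior. The function name and data layout below are illustrative assumptions.

```python
import numpy as np

def log_posterior(x, log_prior, log_cpts):
    """Class posterior of a naive Bayes classifier with missing features.

    x         : list of discrete feature values; None marks a missing feature.
    log_prior : array of shape (C,), log P(c).
    log_cpts  : list where log_cpts[i] has shape (C, K_i), log P(x_i = k | c).
    Returns log P(c | observed features).
    """
    logp = log_prior.copy()
    for i, xi in enumerate(x):
        if xi is None:
            # Exact marginalization: sum_k P(x_i = k | c) = 1,
            # so the factor for a missing feature vanishes.
            continue
        logp = logp + log_cpts[i][:, xi]
    # Normalize in log space to obtain a proper posterior.
    logp -= np.logaddexp.reduce(logp)
    return logp
```

For example, with two classes and two binary features, calling `log_posterior([None, 1], ...)` yields the same posterior as explicitly summing the joint probability over both values of the first feature. In a TAN structure the same idea applies, but marginalizing a feature also affects the factors of its children in the tree.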