Signal Processing and Speech Communication Laboratory

Structure Learning of Bayesian Networks Using Latent Variables

Master Thesis
Announcement date
17 Apr 2012
Short Description

Recently, we developed search-and-score structure learning heuristics for Bayesian network classifiers using a discriminative score [1]. In particular, we introduced the maximum margin score for discriminatively optimizing the structure of Bayesian network classifiers. Greedy hill-climbing and simulated annealing are used as search heuristics to determine the classifier structures.
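To give a flavor of the search-and-score idea, here is a minimal greedy hill-climbing sketch. It is not the method of [1]: the score below is a toy placeholder (in practice the maximum margin score, or any other discriminative score, would be plugged in), and variable names such as `hill_climb` and `toy_score` are hypothetical.

```python
def is_acyclic(parents):
    # Kahn's algorithm on the DAG given by per-node parent sets.
    indeg = {v: len(ps) for v, ps in parents.items()}
    children = {v: [] for v in parents}
    for v, ps in parents.items():
        for p in ps:
            children[p].append(v)
    queue = [v for v, d in indeg.items() if d == 0]
    seen = 0
    while queue:
        u = queue.pop()
        seen += 1
        for c in children[u]:
            indeg[c] -= 1
            if indeg[c] == 0:
                queue.append(c)
    return seen == len(parents)

def hill_climb(num_vars, score, max_parents=2):
    """Greedy hill-climbing: repeatedly apply the single edge
    addition, deletion, or reversal that most improves `score`."""
    parents = {v: set() for v in range(num_vars)}
    best = score(parents)
    improved = True
    while improved:
        improved = False
        best_move = None
        for u in range(num_vars):
            for v in range(num_vars):
                if u == v:
                    continue
                cand = {w: set(ps) for w, ps in parents.items()}
                if u in cand[v]:
                    # edge u->v exists: try deletion and reversal
                    cand[v].discard(u)
                    rev = {w: set(ps) for w, ps in cand.items()}
                    rev[u].add(v)
                    moves = [cand, rev]
                else:
                    # edge absent: try addition
                    cand[v].add(u)
                    moves = [cand]
                for m in moves:
                    if (all(len(ps) <= max_parents for ps in m.values())
                            and is_acyclic(m)):
                        s = score(m)
                        if s > best:
                            best, best_move = s, m
        if best_move is not None:
            parents, improved = best_move, True
    return parents, best

# Toy demonstration: a hypothetical score that rewards matching a
# known target structure (a discriminative score such as the maximum
# margin score of [1] would replace this in practice).
target = {0: set(), 1: {0}, 2: {0, 1}}
toy_score = lambda g: -sum(len(g[v] ^ target[v]) for v in g)
learned, value = hill_climb(3, toy_score)
print(learned, value)
```

Simulated annealing differs only in the move-acceptance rule: instead of taking the best improving move, a random move is accepted with a temperature-dependent probability even when it worsens the score.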

Unfortunately, the number of parameters in the network grows exponentially with the number of conditioning parents of the variables. The aim of this project is to augment the structure learning heuristics with latent variables to keep the size of the parameter space low while maintaining the performance of the network. The developed heuristics should be empirically compared to our methods in [1] using benchmark data sets.
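The following back-of-envelope computation, assuming binary observed variables, illustrates why latent variables can keep the parameter space small: a full joint over n binary variables needs 2^n - 1 free parameters, whereas a model in which a single c-state latent parent renders the observed variables conditionally independent needs only a number of parameters linear in n. The function names and the choice c = 4 are illustrative, not taken from the project description.

```python
def full_joint_params(n, k=2):
    # Free parameters of the full joint over n k-ary variables.
    return k**n - 1

def latent_model_params(n, k=2, c=4):
    # Structure H -> X_1, ..., X_n with a c-state latent parent H:
    # (c - 1) parameters for P(H), plus one CPT per observed variable
    # with c rows of (k - 1) free entries each.
    return (c - 1) + n * c * (k - 1)

for n in (5, 10, 20):
    print(n, full_joint_params(n), latent_model_params(n))
```

The same arithmetic applies per node: a variable with m binary conditioning parents has a conditional probability table with 2^m rows, so limiting the effective number of parents via a latent intermediary directly bounds the table size.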

Your Profile/Requirements

  • The candidate should be interested in machine learning, applied mathematics/statistics, Matlab programming, and algorithms. Interested candidates are encouraged to ask for further information. Additionally, supervision of students' own project ideas in one of the above-mentioned fields is possible.


Franz Pernkopf (0316/873 4436)


[1] F. Pernkopf and J. Bilmes, “Efficient Heuristics for Discriminative Structure Learning of Bayesian Network Classifiers”, Journal of Machine Learning Research, Vol. 11, pp. 2323-2360, 2010.