Signal Processing and Speech Communication Laboratory

Exact Maximum Margin Structure Learning

Status
Open
Type
Master Thesis
Announcement date
04 Mar 2013
Mentors
  • Robert Peharz
Short Description

In general, structure learning in Bayesian networks is NP-hard. Nevertheless, for reasonably small or well-behaved problems, optimal solutions can be found. Recently, we proposed a mixed integer linear programming (MILP) framework to optimize Bayesian network structures in a discriminative way, i.e., to yield good classifiers. The score used is the so-called class margin, which aims for good separation between the conditional class distributions. The MILP framework guarantees that, given enough computing resources, an optimal solution is eventually found. Moreover, it provides an any-time solution that can be used as an approximation.
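To make the class-margin score concrete, the following is a minimal illustrative sketch (not the thesis code, and the function name is hypothetical): the margin of a sample is the log-odds between its true class and the strongest competing class under the current network structure, so a positive margin means a correct classification.

```python
def class_margins(log_joint, labels):
    """Per-sample class margins (illustrative sketch).

    log_joint[n][c] is the log joint probability log P(c, x_n) under a
    given Bayesian network structure; labels[n] is the true class of
    sample n.  The margin of sample n is log P(c_n, x_n) minus the best
    competing log P(c, x_n) over all classes c != c_n; a positive value
    means sample n is classified correctly, and maximizing the smallest
    margin drives the discriminative structure search.
    """
    margins = []
    for n, c_true in enumerate(labels):
        best_rival = max(lp for c, lp in enumerate(log_joint[n]) if c != c_true)
        margins.append(log_joint[n][c_true] - best_rival)
    return margins

# Toy example: two samples, three classes, both with true class 0.
log_joint = [[-1.0, -3.0, -2.5],   # margin  1.5 (correct, well separated)
             [-2.0, -1.5, -4.0]]   # margin -0.5 (misclassified as class 1)
print(class_margins(log_joint, [0, 0]))  # [1.5, -0.5]
```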

Tasks

In this master/diploma thesis we would like to explore various techniques and heuristics from combinatorial optimization in order to speed up network optimization, address larger problems, and improve approximation results. Among others, we would like to investigate dynamic constraint generation and dual decomposition techniques.
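As a rough illustration of the dynamic constraint generation idea (a hedged sketch, not the actual MILP model): rather than adding all exponentially many acyclicity constraints up front, one repeatedly solves a relaxed problem, searches the resulting edge selection for a directed cycle, and adds a cut only for cycles actually found. The separation step can be sketched with a depth-first search; the function names here are hypothetical.

```python
def find_cycle(edges, n):
    """Return the node sequence of a directed cycle in the graph on
    nodes 0..n-1 with the given edge list, or None if it is acyclic."""
    adj = {u: [] for u in range(n)}
    for u, v in edges:
        adj[u].append(v)
    state = [0] * n          # 0 = unvisited, 1 = on DFS stack, 2 = done
    stack = []

    def dfs(u):
        state[u] = 1
        stack.append(u)
        for v in adj[u]:
            if state[v] == 1:                 # back edge: cycle found
                return stack[stack.index(v):] + [v]
            if state[v] == 0:
                cyc = dfs(v)
                if cyc:
                    return cyc
        stack.pop()
        state[u] = 2
        return None

    for u in range(n):
        if state[u] == 0:
            cyc = dfs(u)
            if cyc:
                return cyc
    return None

def violated_cycle_constraint(edges, n):
    """Separation oracle sketch: if the current edge selection contains a
    cycle, return its edges; a solver would then add the lazy cut
    'sum of these edge indicators <= (cycle length - 1)' and re-solve."""
    cyc = find_cycle(edges, n)
    if cyc is None:
        return None                           # solution is already a DAG
    return list(zip(cyc, cyc[1:]))

# Toy example: edges 0->1->2->0 form a cycle, so one cut is generated.
print(violated_cycle_constraint([(0, 1), (1, 2), (2, 0), (1, 3)], 4))
# [(0, 1), (1, 2), (2, 0)]
```

In a real cutting-plane loop, this oracle would be called on each incumbent integer solution until no violated cycle constraint remains.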

Requirements

The candidate must have good programming skills (Matlab/C++) and should be interested in convex and combinatorial optimization.

References

[1]: Robert Peharz and Franz Pernkopf, “Exact Maximum Margin Structure Learning of Bayesian Networks”, International Conference on Machine Learning, Edinburgh, UK, 2012.