Weighted Maximum Margin Bayesian Network Classifiers
- Master Project
- Announcement date: 08 Mar 2012
- Sebastian Tschiatschek
In machine learning, good classifiers can often be obtained by combining a set of weak classifiers. AdaBoost is a meta-algorithm for this purpose: it trains a sequence of classifiers, reweighting the training data after each round, and combines them into a weighted majority vote. We plan to adopt such a scheme for the novel Maximum Margin Bayesian Networks (MMBNs) and evaluate the resulting performance.
The goal of this project is to adapt the AdaBoost scheme so that it is applicable to MMBNs, and to extend an existing MATLAB implementation for training MMBNs accordingly.
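To illustrate the boosting scheme referred to above, the following is a minimal AdaBoost sketch using decision stumps as weak learners (in the project itself, MMBNs would take the place of the stumps). It is written in Python for illustration rather than the project's MATLAB; all function names are our own and nothing here is part of the existing implementation.

```python
import numpy as np

def train_stump(X, y, w):
    """Find the decision stump (feature, threshold, polarity) with the
    lowest weighted error on labels y in {-1, +1}."""
    best, best_err = None, np.inf
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for pol in (+1, -1):
                pred = np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
                err = np.sum(w[pred != y])
                if err < best_err:
                    best_err, best = err, (j, thr, pol)
    return best, best_err

def stump_predict(stump, X):
    j, thr, pol = stump
    return np.where(pol * (X[:, j] - thr) >= 0, 1, -1)

def adaboost(X, y, rounds=10):
    """AdaBoost: reweight examples after each round and combine the
    weak classifiers into a weighted majority vote."""
    n = len(y)
    w = np.full(n, 1.0 / n)                    # uniform initial weights
    ensemble = []
    for _ in range(rounds):
        stump, err = train_stump(X, y, w)
        err = max(err, 1e-10)                  # guard against division by zero
        alpha = 0.5 * np.log((1 - err) / err)  # weight of this weak classifier
        w *= np.exp(-alpha * y * stump_predict(stump, X))  # upweight mistakes
        w /= w.sum()
        ensemble.append((alpha, stump))
    return ensemble

def predict(ensemble, X):
    """Weighted majority vote of all weak classifiers."""
    score = sum(a * stump_predict(s, X) for a, s in ensemble)
    return np.where(score >= 0, 1, -1)
```

In the adapted method, `train_stump` would be replaced by training an MMBN on the reweighted data, while the reweighting and voting machinery stays the same.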
- Literature review of AdaBoost and Maximum Margin Bayesian Networks
- Adaptation of Maximum Margin Bayesian Networks to incorporate a boosting-like scheme
- Evaluation of the adapted method on various databases
- Short report on the work done (at most 10 pages)
This project is suited for Master's students in Telematics, Audio Engineering, Electrical Engineering, Computer Science, and Software Development.
- Experience with MATLAB
- Interest in machine learning
- Sebastian Tschiatschek (firstname.lastname@example.org or 0316/873 4385)
Yoav Freund and Robert E. Schapire. A decision-theoretic generalization of on-line learning and an application to boosting. Journal of Computer and System Sciences, 55(1):119-139, 1997.