Effect of Quantization in Bayesian Network Classifiers

Project Type: Student Project
Student: Peter Reinprecht

Overview

Recently, we performed classification experiments with variable bitwidths for the parameters of Bayesian network classifiers [1]. Knowledge of the minimal sufficient bitwidth allows us to optimize our classifiers for custom-precision hardware such as FPGAs or ASICs, where arithmetic operations of arbitrary bitwidth can be implemented.
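To illustrate the kind of experiment involved, the sketch below quantizes the log-parameters of a toy naive Bayes classifier (a simple special case of a Bayesian network classifier) to fixed point at several bitwidths and measures how often the quantized model agrees with the full-precision one. The fixed-point scheme, the reserved integer bits, and the toy data are all illustrative assumptions, not the method of [1]; Python is used here for readability, although the project itself mentions Matlab.

```python
import numpy as np

def quantize(log_probs, bits, frac_bits):
    """Uniformly quantize values to signed fixed point with `bits` total
    bits and `frac_bits` fractional bits (hypothetical quantization scheme)."""
    step = 2.0 ** -frac_bits
    lo = -(2 ** (bits - 1)) * step          # most negative representable value
    hi = (2 ** (bits - 1) - 1) * step       # most positive representable value
    return np.clip(np.round(log_probs / step) * step, lo, hi)

# Toy naive Bayes model: 2 classes, 3 binary features (randomly generated)
rng = np.random.default_rng(0)
log_prior = np.log(np.array([0.6, 0.4]))
# log P(feature = 1 | class), shape (classes, features)
log_lik1 = np.log(rng.uniform(0.05, 0.95, size=(2, 3)))
log_lik0 = np.log1p(-np.exp(log_lik1))      # log P(feature = 0 | class)

X = rng.integers(0, 2, size=(200, 3))       # 200 random binary feature vectors

def predict(lp, l1, l0, X):
    # Sum of log prior and log likelihoods; argmax over classes
    scores = lp + X @ l1.T + (1 - X) @ l0.T
    return scores.argmax(axis=1)

full = predict(log_prior, log_lik1, log_lik0, X)
for bits in (4, 6, 8):
    # Reserve 3 integer bits for the magnitude of the log-probabilities
    q = lambda a: quantize(a, bits, bits - 3)
    agree = np.mean(predict(q(log_prior), q(log_lik1), q(log_lik0), X) == full)
    print(f"{bits}-bit parameters: agreement with full precision = {agree:.2f}")
```

On this toy example one would expect agreement to approach 1.0 as the bitwidth grows; the project asks where that transition happens for realistic Bayesian network classifiers and datasets.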

The task of this project is to investigate parameter quantization effects in Bayesian network classifiers more thoroughly.

Profile of prospective student

The candidate should be interested in machine learning, applied mathematics and statistics, Matlab programming, and algorithms. Interested candidates are encouraged to ask for further information.

References

[1]  F. Pernkopf, M. Wohlmayr, and M. Mücke, "Maximum Margin Structure Learning of Bayesian Network Classifiers", IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), pp. 2076--2079, 2011.