Binary Neural Networks

Seminar Type: - None -
Project Status: Open

Deep representation learning is one of the main factors behind the recent performance boost in many
image, signal and speech processing problems. This is particularly true when large amounts of data
and almost unlimited computing resources are available, as demonstrated in competitions such as
ImageNet. In real-world scenarios, however, the computing infrastructure is often restricted and
the computational requirements cannot be fulfilled.

At recent machine learning venues, binarized neural networks (BNNs) have been introduced [1,2]. Essentially, they have binary weights and activations, and during training these binary values are used for computing the gradients. BNNs have several advantages: they dramatically reduce the memory footprint and improve the power efficiency while achieving almost state-of-the-art performance.
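The core mechanism can be sketched in a few lines. The following is a minimal NumPy illustration (an assumption for clarity, not the project's existing Theano code): deterministic sign binarization for the forward pass, combined with a straight-through estimator that cancels the gradient outside [-1, 1], as described in [2].

```python
import numpy as np

def binarize(w):
    """Forward pass: map real-valued weights/activations to {-1, +1}."""
    return np.where(w >= 0, 1.0, -1.0)

def ste_grad(w, upstream_grad):
    """Backward pass (straight-through estimator): pass the incoming
    gradient through unchanged, but zero it where |w| > 1."""
    return upstream_grad * (np.abs(w) <= 1.0)

w = np.array([-1.7, -0.3, 0.0, 0.4, 2.1])
print(binarize(w))              # -> [-1. -1.  1.  1.  1.]
print(ste_grad(w, np.ones(5)))  # -> [ 0.  1.  1.  1.  0.]
```

During training, a real-valued copy of the weights is kept and updated with these gradients, while the binarized weights are used in the forward and backward computations.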

In this project we aim to apply and explore BNNs for image and speech classification problems. Furthermore, we are interested in exploring alternative training approaches for them. The topic can also be extended to the implementation of BNNs in hardware. In particular, deep neural networks are of interest.

 We offer:

  • existing code of BNNs
  • benchmark data

 Your Tasks:

  • simulate BNNs in Python on the GPU using Theano
  • analyze the implemented systems in terms of accuracy and performance
  • analyze the shortcomings of current BNNs - suggest alternative training methods
  • [contribute to scientific work in form of a paper]

Your Outcome:

  • learn to implement and simulate Neural Networks on a GPU
  • learn how to solve classification problems with BNNs
  • get a good background in applied machine learning

Your Profile:

  • motivation and reliability are a prerequisite
  • good knowledge in machine learning and neural networks is an advantage
  • knowledge in python programming

Additional Information:

This thesis project is planned for a duration of 6 months starting immediately. 

Contact:

Franz Pernkopf (pernkopf@tugraz.at)

Matthias Zoehrer (matthias.zoehrer@tugraz.at or +43 (316) 873 - 4385)

References

[1] M. Kim and P. Smaragdis, Bitwise Neural Networks, ICML Workshop on Resource-Efficient Machine Learning, 2015.

[2] M. Courbariaux, I. Hubara, D. Soudry, R. El-Yaniv, Y. Bengio, Binarized Neural Networks: Training Neural Networks with Weights and Activations Constrained to +1 and -1. NIPS, 2016.