Signal Processing and Speech Communication Laboratory

Automatic Differentiation for Large Scale Optimization

Status
Open
Type
Master Project
Announcement date
14 Mar 2011
Mentors
  • Sebastian Tschiatschek
Research Areas

Short Description

In machine learning one often has to minimize an error function subject to certain constraints. In many cases these functions are twice differentiable, and first- and second-order derivatives are needed to apply efficient optimization methods. However, deriving these derivatives and implementing functions to compute them is error-prone and often time-consuming.
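
To make the role of these derivatives concrete, consider the unconstrained case: a Newton step for minimizing a twice differentiable f(x) uses both the gradient and the Hessian,

    x_{k+1} = x_k - [\nabla^2 f(x_k)]^{-1} \nabla f(x_k),

and methods for the constrained case (e.g. SQP or interior-point methods) additionally require derivatives of the constraint functions. Deriving and implementing these quantities by hand for every new model is the step this project aims to automate.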

The goal of this project is to set up a toolchain that, given the error function and the constraints of an optimization problem, automatically performs the differentiation and generates code (for MATLAB or C) using a computer algebra system, e.g. Maple or MATLAB itself.
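
As a rough illustration of such a toolchain (a minimal sketch, assuming MATLAB's Symbolic Math Toolbox is available; Maple offers analogous facilities), the snippet below derives the gradient and Hessian of a small test function symbolically and generates standalone MATLAB functions from them. The file names obj_f, grad_f and hess_f are placeholders chosen for this example.

    % minimal sketch: symbolic differentiation + code generation
    % (assumes the Symbolic Math Toolbox)
    syms x1 x2
    f = (x1 - 1)^2 + 100*(x2 - x1^2)^2;   % Rosenbrock function as a toy error function

    % first- and second-order derivatives via symbolic differentiation
    g = jacobian(f, [x1, x2]).';          % gradient (column vector)
    H = jacobian(g, [x1, x2]);            % Hessian

    % generate plain MATLAB functions for use in an optimizer
    matlabFunction(f, 'File', 'obj_f',  'Vars', {[x1; x2]});
    matlabFunction(g, 'File', 'grad_f', 'Vars', {[x1; x2]});
    matlabFunction(H, 'File', 'hess_f', 'Vars', {[x1; x2]});

The generated files could then be handed to a Newton-type solver. Note that this sketch realizes symbolic differentiation with code generation, which is only one of several possible realizations of the AD system to be selected in this project.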

Your Tasks

  • Literature review of automatic differentiation (AD) systems and code generation using Maple and/or Matlab
  • Suggestion of an AD system and its implementation
  • Application to some real-world optimization problems
  • Short documentation of the AD system
  • Short report on the completed work (at most 10 pages)

Your Profile/Requirements

This project is suited for Master students in Telematics, Audio Engineering, Electrical Engineering, Computer Science and Software Development.

  • Some experience with Maple or the willingness to learn it
  • Experience with MATLAB
  • Interest in optimization techniques and basic mathematics

Contact

  • Sebastian Tschiatschek (tschiatschek@tugraz.at or 0316/873 4385)

References

[1] Christian H. Bischof, Paul D. Hovland, and Boyana Norris. Implementation of automatic differentiation tools. SIGPLAN Not., January 2002.
[2] Andreas Griewank and Andrea Walther. Evaluating Derivatives: Principles and Techniques of Algorithmic Differentiation. Number 105 in Other Titles in Applied Mathematics. SIAM, Philadelphia, PA, 2nd edition, 2008.