Forward-mode Automatic Differentiation in Julia

Revels, J., Lubin, M. and Papamarkou, T. (2016) Forward-mode Automatic Differentiation in Julia. In: AD2016, the 7th International Conference on Algorithmic Differentiation, Oxford, United Kingdom, 12-15 Sep 2016.


Abstract

We present ForwardDiff, a Julia package for forward-mode automatic differentiation (AD) featuring performance competitive with low-level languages like C++. Unlike recently developed AD tools in other popular high-level languages such as Python and MATLAB, ForwardDiff takes advantage of just-in-time (JIT) compilation to transparently recompile AD-unaware user code, enabling efficient support for higher-order differentiation and differentiation using custom number types (including complex numbers). For gradient and Jacobian calculations, ForwardDiff provides a variant of vector-forward mode that avoids expensive heap allocation and makes better use of memory bandwidth than traditional vector mode. In our numerical experiments, we demonstrate that for nontrivially large dimensions, ForwardDiff's gradient computations can be faster than a reverse-mode implementation from the Python-based autograd package. We also illustrate how ForwardDiff is used effectively within JuMP, a modeling language for optimization. According to our usage statistics, 41 unique repositories on GitHub depend on ForwardDiff, with users from diverse fields such as astronomy, optimization, finite element analysis, and statistics.
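
For illustration, the following is a minimal sketch of the usage pattern the abstract describes, relying on ForwardDiff's public gradient and Hessian functions. The Rosenbrock test function is our own illustrative choice, not an example taken from the paper; note that the function itself is plain, AD-unaware Julia code, which ForwardDiff differentiates by recompiling it for its dual-number type via the JIT compiler.

    using ForwardDiff

    # An "AD-unaware" function: ordinary Julia code with no
    # ForwardDiff-specific annotations. (Illustrative choice,
    # not from the paper.)
    rosenbrock(x) = sum(100 * (x[i+1] - x[i]^2)^2 + (1 - x[i])^2
                        for i in 1:length(x)-1)

    x = [1.0, 2.0, 3.0]

    # Gradient via the package's vector-forward mode.
    g = ForwardDiff.gradient(rosenbrock, x)

    # Higher-order differentiation (here, the Hessian) works by
    # nesting dual numbers, as the abstract notes.
    H = ForwardDiff.hessian(rosenbrock, x)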

Item Type: Conference or Workshop Item
Status: Published
Refereed: Yes
Glasgow Author(s) Enlighten ID: Papamarkou, Dr Theodore
Authors: Revels, J., Lubin, M., and Papamarkou, T.
College/School: College of Science and Engineering > School of Mathematics and Statistics > Statistics
