
Fast, transparent first- and second-order automatic differentiation

ad Package Documentation
========================

.. image:: https://travis-ci.org/tisimst/ad.png?branch=master

Overview
--------

The ad package allows you to easily and transparently perform first- and second-order automatic differentiation. Advanced mathematical functions (trigonometric, logarithmic, hyperbolic, etc.) can also be evaluated directly using the admath sub-module.

All base numeric types are supported (int, float, complex, etc.). The package is designed so that the underlying numeric types interact with each other as they normally would in any calculation. In this sense, the package acts as a "wrapper" that keeps track of derivatives while preserving the original behavior of the numeric calculations.
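A minimal sketch of this transparent behavior, using the adnumber constructor and the d()/d2() derivative accessors from the package's API (the specific values here are illustrative):

.. code:: python

    from ad import adnumber
    from ad.admath import sin

    x = adnumber(2.0)   # wrap an ordinary float
    y = x**2 + sin(x)   # ordinary arithmetic; derivatives are tracked

    print(y.d(x))       # dy/dx   = 2*x + cos(x), evaluated at x = 2
    print(y.d2(x))      # d2y/dx2 = 2 - sin(x),   evaluated at x = 2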

From the Wikipedia entry on `Automatic differentiation`_ (AD):

"AD exploits the fact that every computer program, no matter how 
complicated, executes a sequence of elementary arithmetic operations 
(addition, subtraction, multiplication, division, etc.) and elementary 
functions (exp, log, sin, cos, etc.). By applying the chain rule 
repeatedly to these operations, derivatives of arbitrary order can be 
computed automatically, and accurate to working precision."
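As a short illustration of the quote, the chain rule applied to f(x) = sin(x**2) gives f'(x) = 2*x*cos(x**2). A small sketch (again using adnumber) that checks this numerically:

.. code:: python

    from math import cos
    from ad import adnumber
    from ad.admath import sin

    x = adnumber(1.5)
    f = sin(x**2)                  # a composition of elementary operations

    print(f.d(x))                  # derivative via automatic differentiation
    print(2 * 1.5 * cos(1.5**2))   # analytic chain-rule result; the two agree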

See the `package documentation`_ for details and examples.

Main Features
-------------

  • Transparent calculations with derivatives: little or no modification of existing code is needed, even when using the NumPy_ module.

  • Almost all mathematical operations are supported, including functions from the standard math_ module (sin, cos, exp, erf, etc.) and cmath_ module (phase, polar, etc.) with additional convenience trigonometric, hyperbolic, and logarithmic functions (csc, acoth, ln, etc.). Comparison operators follow the same rules as the underlying numeric types.

  • Real and complex arithmetic are handled seamlessly. Treat objects as you normally would with the math_ and cmath_ functions, simply substituting their admath counterparts.

  • Automatic gradient and Hessian function generation for optimization studies using scipy.optimize_ routines via gh(your_func_here) (see the first sketch after this list).

  • Compatible linear algebra routines in the ad.linalg submodule, similar to those found in NumPy's linalg submodule, that do not depend on LAPACK (see the second sketch after this list). Currently available:

    a. Decompositions

    1. chol: Cholesky Decomposition
    2. lu: LU Decomposition
    3. qr: QR Decomposition

    b. Solving equations and inverting matrices

    1. solve: General solver for linear systems of equations
    2. lstsq: Least-squares solver for linear systems of equations
    3. inv: Solve for the (multiplicative) inverse of a matrix
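A minimal sketch of gh() feeding derivatives to a scipy.optimize_ routine (the Rosenbrock test function here is illustrative, not part of the package):

.. code:: python

    from ad import gh
    from scipy.optimize import minimize

    def rosenbrock(x):
        # Classic optimization test function; its minimum is at (1, 1).
        return (1 - x[0])**2 + 100*(x[1] - x[0]**2)**2

    grad, hess = gh(rosenbrock)   # auto-generated gradient and Hessian

    result = minimize(rosenbrock, x0=[-1.0, 1.0], method='Newton-CG',
                      jac=grad, hess=hess)
    print(result.x)               # converges to approximately [1.0, 1.0]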
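And a hedged sketch of the ad.linalg routines, assuming solve(A, b) mirrors the signature of its NumPy counterpart:

.. code:: python

    import numpy as np
    from ad import adnumber
    from ad.linalg import solve

    a = adnumber(3.0)                     # parameter to differentiate against
    A = np.array([[a, 1.0], [1.0, 2.0]])  # coefficient matrix holding an AD number
    b = np.array([1.0, 0.0])

    x = solve(A, b)     # solution entries carry derivative information
    print(x[0].d(a))    # sensitivity of x[0] to the parameter a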

Installation
------------

You have several convenient options for installing the ad package (administrative privileges may be required):

  1. Download the package files below, unzip to any directory, and run python setup.py install from the command-line.

  2. Copy the unzipped ad-XYZ directory to any location where Python can find it and rename it to ad.

  3. If setuptools is installed, run easy_install --upgrade ad from the command-line.

  4. If pip is installed, run pip install --upgrade ad from the command-line.

  5. Download the bleeding-edge version from GitHub_.

Contact
-------

Please send feature requests, bug reports, or feedback to `Abraham Lee`_.

Acknowledgements
----------------

The author expresses his thanks to:

  • `Eric O. LEBIGOT (EOL)`_, author of the uncertainties_ package, for providing code insight and inspiration.
  • Stephen Marks, professor at Pomona College, for useful feedback concerning the interface with optimization routines in scipy.optimize_.
  • Wendell Smith, for updating the testing functionality and numerous other useful function updates.
  • Jonathan Terhorst, for catching a bug that made derivatives of logarithmic functions (base != e) give the wrong answers.
  • GitHub user fhgd, for catching a miscalculation in admath.atan2.

.. _NumPy: http://numpy.scipy.org/
.. _math: http://docs.python.org/library/math.html
.. _cmath: http://docs.python.org/library/cmath.html
.. _Automatic differentiation: http://en.wikipedia.org/wiki/Automatic_differentiation
.. _Eric O. LEBIGOT (EOL): http://www.linkedin.com/pub/eric-lebigot/22/293/277
.. _uncertainties: http://pypi.python.org/pypi/uncertainties
.. _scipy.optimize: http://docs.scipy.org/doc/scipy/reference/optimize.html
.. _Abraham Lee: mailto:[email protected]
.. _package documentation: http://pythonhosted.org/ad
.. _GitHub: https://github.com/tisimst/ad