
Define gradients for all Lite SPs

axch opened this issue on Feb 2, 2016 · 0 comments

The claim that Venture "has Hamiltonian Monte Carlo" (HMC) is belied by the paucity of gradient annotations in the stochastic procedure library. The most common ones are there, but if we want to assert this capability, we really should get it together and finish annotating.

Impediments:

  • Deriving gradient expressions by hand is Tedious And Error-Prone, even for the conceptually simpler gradientOfLogDensity{,OfCounts} methods (a hand-derived example follows this list).
  • The gradientOfSimulate methods are conceptually confusing: they ask for the derivative of a random output with respect to a parameter.
    • The deconfusion is to view simulate as a deterministic function that accepts the state of the PRNG as an extra input, whose partial derivative we do not care about (see the sketch after this list).
    • However, the current interface to gradientOfSimulate doesn't expose that view correctly, and is a horrible kludge. Fixing this is Issue #387.
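
To make the tedium concrete, here is roughly what a hand-derived gradientOfLogDensity looks like for a normal output PSP. This is a minimal sketch: the `(dvalue, dparams)` return convention is modeled loosely on Lite's, and the class shape is an assumption, not a copy of the real NormalOutputPSP.

```python
import math

# Hypothetical hand-annotated normal PSP; the interface here
# (gradientOfLogDensity(x, params) -> (dx, [dmu, dsigma])) is an
# assumption modeled on Lite's conventions, not the actual code.
class SketchNormalOutputPSP(object):
    def logDensity(self, x, params):
        (mu, sigma) = params
        return (-0.5 * math.log(2 * math.pi) - math.log(sigma)
                - 0.5 * ((x - mu) / sigma) ** 2)

    def gradientOfLogDensity(self, x, params):
        (mu, sigma) = params
        z = (x - mu) / sigma
        dx = -z / sigma                # d logp / d x
        dmu = z / sigma                # d logp / d mu
        dsigma = (z * z - 1) / sigma   # d logp / d sigma
        return (dx, [dmu, dsigma])
```

Even this easy case invites sign and chain-rule slips; gamma, beta, and the counts variants are worse.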
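
The PRNG-as-input view is easiest to see for a distribution with a reparameterization. A minimal sketch, assuming a normal SP; the function names are illustrative, not the actual gradientOfSimulate interface (which is exactly what #387 is about fixing):

```python
import random

# simulate viewed as a deterministic function of (parameters, randomness):
# for a normal, x = mu + sigma * z, where z ~ N(0, 1) stands in for "the
# state of the PRNG".  Holding z fixed, x is differentiable in mu and sigma.
def simulate_normal(mu, sigma, rng):
    z = rng.gauss(0, 1)   # the randomness, treated as an extra input
    return mu + sigma * z, z

def gradient_of_simulate_normal(mu, sigma, z):
    # d x / d mu = 1; d x / d sigma = z.  The partial with respect to z
    # is the one we do not care about.
    return [1.0, z]

x, z = simulate_normal(0.0, 2.0, random.Random(42))
print(gradient_of_simulate_normal(0.0, 2.0, z))
```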

Possible approaches:

  • Keep doing what we have been doing, and write gradient annotations by hand. If you want to know why this is a bad idea, check out the gradientOfSimulate method of GammaOutputPSP.
  • Define a default gradient method that computes a numeric approximation to the gradient (a finite-difference sketch follows this list).
    • This turns gradient annotation into a performance improvement activity rather than a capability completion activity.
    • It does not actually defeat the purpose of automatic differentiation, because the asymptotic lossage is confined to the number of inputs of any given SP, not all of regen.
      • That could still be large for vectorized SPs, however
    • The gradientOfSimulate half is definitely blocked on #387, but the gradientOfLogDensity methods are needed much more anyway.
  • Use some kind of symbolic differentiation package to find gradient expressions, either entirely out of band or as part of the build (a sympy sketch follows this list).
  • Use a Python automatic differentiation system, either at the SP level or end-to-end.
    • Candidate: https://github.com/HIPS/autograd , which claims to have derivatives of numpy procedures (a usage sketch follows this list).
      • I bet they don't have the samplers!
      • They claim it's easy to add more annotations, though
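
For the numeric-default approach, a central-difference fallback might look like the following. The name numericGradientOfLogDensity and the calling convention are assumptions for illustration; the point is that the cost is two logDensity evaluations per input, which is why the lossage is confined to the arity of the SP.

```python
def numericGradientOfLogDensity(logDensity, value, params, h=1e-6):
    """Central-difference fallback for SPs lacking a hand-written
    gradientOfLogDensity.  Sketch only: assumes a scalar value and a
    flat list of scalar params (vectorized SPs would need more care)."""
    def perturbed(xs, i, delta):
        ys = list(xs)
        ys[i] += delta
        return ys
    dvalue = (logDensity(value + h, params)
              - logDensity(value - h, params)) / (2 * h)
    dparams = [(logDensity(value, perturbed(params, i, h))
                - logDensity(value, perturbed(params, i, -h))) / (2 * h)
               for i in range(len(params))]
    return (dvalue, dparams)
```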
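
For the symbolic route, something like sympy could derive the expressions out of band, with the results pasted (or code-generated) into the SP annotations. A sketch for the normal log density; sympy here is one candidate package, not a settled choice:

```python
import sympy

x, mu = sympy.symbols('x mu', real=True)
sigma = sympy.Symbol('sigma', positive=True)
logp = (-sympy.log(sigma) - sympy.log(2 * sympy.pi) / 2
        - (x - mu) ** 2 / (2 * sigma ** 2))
# Print d logp / d x, d logp / d mu, d logp / d sigma for pasting
# into gradientOfLogDensity.
for var in (x, mu, sigma):
    print(var, sympy.simplify(sympy.diff(logp, var)))
```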
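
At the SP level, the autograd usage would be roughly the following. autograd.grad and the autograd.numpy wrapper do exist as advertised; whether the samplers (i.e., gradientOfSimulate) are covered is exactly the doubt raised above.

```python
import autograd.numpy as np
from autograd import grad

def normal_log_density(params, x):
    mu, sigma = params
    return (-np.log(sigma) - 0.5 * np.log(2 * np.pi)
            - 0.5 * ((x - mu) / sigma) ** 2)

# grad differentiates with respect to the given argument position.
d_params = grad(normal_log_density, 0)  # d logp / d (mu, sigma)
d_value = grad(normal_log_density, 1)   # d logp / d x

print(d_params((0.0, 1.0), 0.5))  # approx (0.5, -0.75)
print(d_value((0.0, 1.0), 0.5))   # approx -0.5
```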
