ad
Automatic Differentiation
It would be nice to invert our current control mechanism so that instead of having `grad :: (Traversable f, Num a) => (forall s. Mode s => f (AD s...`
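For context, the current control mechanism looks like this from the caller's side: the user hands `grad` a rank-2 polymorphic function and the library chooses how to drive it. A minimal usage sketch (example mine, written against the current `Numeric.AD` API rather than the older `Mode s => AD s a` signature quoted above):

```hs
import Numeric.AD (grad)

-- Gradient of f(x, y) = x*x + 3*y at (1, 2); analytically [2, 3].
main :: IO ()
main = print (grad (\[x, y] -> x * x + 3 * y) [1, 2 :: Double])
```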
One long-standing idea that has been discussed is generating more specialized/optimized code when the expression to be evaluated is known. An example would be generating specialized gradient code...
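To make the idea concrete, here is a hand-written illustration (mine, not from the discussion) of the gap such specialization would close: the generic path rebuilds the computation through the tape machinery on every call, while specialized output could be straight-line arithmetic.

```hs
import Numeric.AD (grad)

-- Generic path: the gradient of f(x, y) = x*y + sin x goes through
-- the tape machinery on every call.
genericGrad :: [Double] -> [Double]
genericGrad = grad (\[x, y] -> x * y + sin x)

-- What emitted specialized code might look like for the same, statically
-- known expression: direct arithmetic, no tape, no boxing.
specializedGrad :: [Double] -> [Double]
specializedGrad [x, y] = [y + cos x, x]
specializedGrad _      = error "expects exactly two inputs"

main :: IO ()
main = do
  print (genericGrad     [1, 2])  -- [2.5403..., 1.0]
  print (specializedGrad [1, 2])  -- same values, computed directly
```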
Compiling `ad` with GHC 9.8.1 reveals a number of `-Wx-partial` warnings caused by uses of the partial `head` and `tail` functions:

```
[12 of 55] Compiling Numeric.AD.Internal.Tower ( src/Numeric/AD/Internal/Tower.hs, /home/ryanglscott/Documents/Hacking/Haskell/ci-maintenance/checkout/ekmett/ad/dist-newstyle/build/x86_64-linux/ghc-9.8.0.20230727/ad-4.5.4/build/Numeric/AD/Internal/Tower.o,...
```
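The usual remedy for these warnings is mechanical. A sketch of the kind of rewrite involved (illustrative, not the library's actual patch):

```hs
-- A representative fix: replacing a partial selector with an explicit
-- pattern match silences -Wx-partial under GHC 9.8 and later.
dropOne :: [a] -> [a]
dropOne xs = tail xs          -- warns: tail is partial

dropOne' :: [a] -> [a]
dropOne' (_ : rest) = rest    -- the case head/tail assumed
dropOne' []         = []      -- the empty case is now an explicit decision
```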
3. I wonder if it is worth adding a "fast-math" version of `ReverseDouble` that achieves better performance at the expense of not being correct with respect to special IEEE floating-point...
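For a sense of the trade-off, one small example (mine) of the special-value behavior such a fast-math mode would likely give up:

```hs
-- Illustration: the rewrite 0 * y ==> 0, which a fast-math tape might
-- apply whenever a partial derivative is known to be zero, is not valid
-- under IEEE 754 once infinities or NaNs appear.
main :: IO ()
main = do
  let inf = 1 / 0 :: Double
  print (0 * inf)      -- NaN: what IEEE 754 mandates
  print (0 :: Double)  -- 0.0: what the fast-math rewrite would produce
```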
`Tape` appears in many public types, but cannot be referenced in client code without importing internal modules. This doesn't seem intentional to me, but the fact that it has, presumably,...
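The workaround today is to reach into the internals. A sketch, with module and export locations as I understand them in recent `ad` releases, so treat the internal import as an assumption:

```hs
import Data.Reflection (Reifies)
import Numeric.AD (grad)
-- The import the issue is about: Tape only lives in an internal module.
import Numeric.AD.Internal.Reverse (Reverse, Tape)

-- Without Tape in scope, client code cannot write this standalone signature.
rosenbrock :: Reifies s Tape => [Reverse s Double] -> Reverse s Double
rosenbrock [x, y] = (1 - x) ^ 2 + 100 * (y - x * x) ^ 2
rosenbrock _      = error "expects exactly two inputs"

main :: IO ()
main = print (grad rosenbrock [0, 0 :: Double])  -- [-2.0, 0.0]
```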
The library calculates incorrect results for at least some of the extra functions `log1p`, `expm1`, `log1pexp`, and `log1mexp` of the `Floating` class. For example:

```hs
>>> import Numeric.AD
>>> log1pexp...
```
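One way to probe for this class of bug is to compare a mode's derivative against the closed form; analytically, d/dx log(1 + exp x) is the logistic function exp x / (1 + exp x). A hedged check of my own, not taken from the issue:

```hs
import Numeric (log1pexp)
import Numeric.AD (diff)

-- Reference value: the analytic derivative of log1pexp.
logistic :: Double -> Double
logistic x = exp x / (1 + exp x)

main :: IO ()
main = mapM_ check [-10, -1, 0, 1, 10]
  where
    check x = putStrLn $
      show x ++ ": diff log1pexp = " ++ show (diff log1pexp x)
             ++ ", expected "        ++ show (logistic x)
```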
While investigating #108, I came across some more discrepancies between differentiation modes.

1. ```hs
   >>> import Numeric.AD.Mode.Forward as F
   >>> import Numeric.AD.Mode.Reverse as R
   >>> F.diff cos 0
   >>> R.diff...
   ```
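Since the session above is truncated, here is a self-contained harness (mine) for reproducing this kind of side-by-side comparison of the two modes:

```hs
import qualified Numeric.AD.Mode.Forward as F
import qualified Numeric.AD.Mode.Reverse as R

-- Print both modes next to each other; any mismatch, down to a
-- 0.0 versus -0.0 sign difference, is visible in the output.
main :: IO ()
main = do
  print (F.diff cos 0 :: Double)  -- analytically -sin 0, i.e. zero
  print (R.diff cos 0 :: Double)
  print (F.diff sin 0 :: Double)  -- analytically cos 0, i.e. one
  print (R.diff sin 0 :: Double)
```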