clad -- automatic differentiation for C/C++
Differentiating the following function with TBR (to-be-recorded) analysis enabled results in an error:

```cpp
double f(double val) {
  double res = 0;
  if (val)
    res++;
  return res;
}
```
Since global (CUDA `__global__`) kernels cannot be called like normal device functions from other device functions, the following code can be compiled only for the host, guarded like so:

```cpp
#ifndef __CUDA_ARCH__
auto kernel_g...
```
Adjoints of class-type arguments are not updated during function call differentiation. Reproducible example:

```cpp
class A {
public:
  double data = 0;
};

double add(A a, double u) {
  return...
```
Template custom derivatives are not selected when they should be, because type conversions are lost in `ImplicitCastExpr` nodes. Reproducible example:

```cpp
#include "clad/Differentiator/Differentiator.h"

double sum(float a, float b)...
```
Previously, if a user wanted to provide a custom pushforward for a function that uses functors, it was impossible to reuse the generated pushforwards for those functors' call operators....
Having taken an example of AD from the Clad slides, I see that the use of default arguments does not work.

```cpp
#include

double f(double x, int N = 5) {
  double...
```
```cpp
double example(double x, double *y) {
  y[0] = 3;
  return x * y[0];
}

int main() {
  double x = 10.0;
  double y[3] = {7.0, 4.6, 6.3};
  auto ex...
```
When differentiating a product, the right-hand side is not always differentiated; sometimes it is simply cloned:

```cpp
if (!ShouldRecompute(R)) {
  ...
} else {
  RResult = StmtDiff(Clone(R));
}
```

That causes errors such as...
Fixes #985.
```cpp
#include "clad/Differentiator/Differentiator.h"

double twox(double x) { return 2 * x; }

int main() {
  auto l_h = clad::differentiate(twox, "x");
  l_h.dump();
}
```

Gives the output:

```cpp
The code is:...
```