
clad -- automatic differentiation for C/C++

301 clad issues

Currently, if a pointer argument is passed to a function and the pointee value is changed, Clad doesn't support restoring it in the reverse sweep. The first step to address...

Our CI already has a benchmark comparison action that compares the benchmarks against the submitted pull request: https://github.com/vgvassilev/clad/blob/ec76b704ced2bb7471b25b51a7c24273374700d9/.github/workflows/ci.yml#L755-L781 We should turn that into a script `benchmark_compare.py` that can take two revisions...

good first issue

If I compile this example here:

```c++
// Compile with clang++ -std=c++11 -fplugin=/usr/lib/clad.so cladConfusingWarnings.cxx -o cladConfusingWarnings
#include "clad/Differentiator/Differentiator.h"
#include <cmath>
double func(double x) { return x * std::tanh(1.0); }
int main()...
```

Reproducible example:

```cpp
#include "clad/Differentiator/Differentiator.h"
#include <iostream>
#define show(x) std::cout
```

We have recently added pointer support in reverse mode (#686), but the tests are currently failing when trying to run with the `enable-tbr` flag. The test file is `test/Gradient/Pointers.C`.

Hi all, I want to use the clad implementation on the GPU. The problem, however, is that when I use the #include and build the .cu file I get the...

**Minimal example**: Compile the following code with Clang++ with the clad plugin:

```c++
#include "clad/Differentiator/Differentiator.h"
void g(char c) {}
double f(double x) { g('a'); return x; }
int main() { auto...
```

This is the code I differentiated:

```
double g(double* i) {
    i[0]++;
    return 1;
}

double func(double* i, double j) {
    j = g(i);
    return 0;
}
```

And this...

It would be nice if passing null pointers did not crash the gradient generation, because then one could apply AD to functions with optional pointer arguments. My setup: Arch...

**Minimal example**:

```c++
#include <random>
#include "clad/Differentiator/Differentiator.h"
double f(double x) {
    std::mt19937 gen64;
    std::uniform_real_distribution<double> distribution(0.0, 1.0);
    double rand = distribution(gen64);
    return x + rand;
}
int main() {
    auto f_dx = clad::differentiate(f, "x");
    f_dx.execute(3.14);
}
```
...