
clad -- automatic differentiation for C/C++

Results: 301 clad issues, sorted by recently updated

Differentiating functions containing std::initializer_list emits the "Not supported" warning and throws an error. ```c int fn(int x) { int res = 0; auto &&range = {1, 2, 3}; for (auto i...

The code clad currently generates for a loop's body in the reverse pass of reverse-mode AD is not very clear. Consider this example: ``` double fn(double u, double v) { double...

Hi, can I use std::array to hold data that is accessed by functions that are differentiated? ```c++ double f(double x, double y) { double a[2]; a[0] = 5; a[1] =...

Clad internally generates `pushforward` functions for differentiating function calls in forward mode AD. However, `clad::differentiate`, the user-interface for forward mode AD, generates a more restricted form of `pushforward` function which...

good first issue

@parth-07 Referring to #734: according to my understanding, when a reference variable is initialized with a pointer dereference, the `clad::gradient` function creates a new adjoint variable for the given reference variable...

Clad generates incorrect gradient code when a reference variable is initialized with a pointer dereference. Reproducible example: ```cpp #include "clad/Differentiator/Differentiator.h" #include #define show(x) std::cout

good first issue

Since the clang versions supported by clad require older versions of CUDA, which in turn are not supported by newer OS versions, being able to develop through a PR...

There appears to be an issue with the clad build on macOS 13 using LLVM 11, as seen in https://github.com/vgvassilev/clad/actions/runs/8209849513/job/22456183203 The error which occurs is ``` In file included...

To know all possible L-values of the LHS of an assignment operation in the reverse mode, we call `utils::GetInnermostReturnExpr`. We use it to determine which variables we should store...

Clad builds a gradient with a specific signature. It also allows that gradient function signature to be forward-declared. However, if the two mismatch, there are subtle crashes in CodeGen. We should...

good first issue