autodiff
Stack overflow
The following minimal example produces a stack overflow:
// C++ includes
#include <iostream>

// autodiff includes
#include <autodiff/reverse/var.hpp>
#include <autodiff/reverse/var/eigen.hpp>
using namespace autodiff;

// The scalar function for which the gradient is needed
var f(const ArrayXvar& x)
{
    return sqrt((x * x).sum()); // sqrt(sum([xi * xi for i = 1:n]))
}

int main()
{
    using Eigen::VectorXd;

    VectorXvar x(2048);
    x = Eigen::VectorXd::Random(x.size());

    var y = f(x); // the output variable y

    VectorXd dydx = gradient(y, x); // evaluate the gradient vector dy/dx

    std::cout << "y = " << y << std::endl; // print the evaluated output y
    std::cout << "dy/dx = \n" << dydx << std::endl; // print the evaluated gradient vector dy/dx

    return 0;
}
(This example is a modified version of https://github.com/autodiff/autodiff/blob/main/examples/reverse/example-reverse-gradient-derivatives-using-eigen.cpp )
Remark: depending on the machine running the executable, this stack overflow may only arise at a dimension slightly higher than 2048.
The use of var generates an expression graph at runtime, which is then traversed in reverse order to compute the derivatives. Unfortunately, it was never implemented with use cases in mind where many variables are involved or the operations are very deep.
autodiff was created to address the needs I had for automatic differentiation in my research projects. It turned out that autodiff::real (and also dual) was the appropriate option for those, and these types have received more development attention and support.
Perhaps real and dual are suitable for you as well. It really depends on the application.
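For reference, here is a sketch of the same computation with autodiff::real in forward mode, following the documented forward-mode gradient API (gradient(f, wrt(x), at(x), u)); the deterministic fill used for x below is just an assumption for illustration, so adapt the initialization to your needs:

```cpp
// C++ includes
#include <iostream>

// autodiff forward-mode includes
#include <autodiff/forward/real.hpp>
#include <autodiff/forward/real/eigen.hpp>
using namespace autodiff;

// The same scalar function, now over real numbers
real f(const ArrayXreal& x)
{
    return sqrt((x * x).sum());
}

int main()
{
    using Eigen::VectorXd;

    ArrayXreal x(2048);
    for (int i = 0; i < x.size(); ++i)
        x[i] = 1.0 + i / 2048.0; // real accepts plain double values

    real u; // the output value u = f(x)
    VectorXd dudx = gradient(f, wrt(x), at(x), u); // no graph is recorded

    std::cout << "u = " << u << std::endl;
    std::cout << "du/dx(0) = " << dudx(0) << std::endl;
    return 0;
}
```

Since forward mode differentiates while evaluating, memory use stays flat and no deep graph traversal occurs, at the cost of one sweep per input variable.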
From: guestieng. Sent: Tuesday, 23 January 2024 15:24. Subject: [autodiff/autodiff] Stack overflow (Issue #314)
At the moment I'm interested in the reverse mode. The real and dual types are designed for the forward mode only, aren't they?
Yes, dual and real are based on a forward-mode algorithm.