Allan Leal

176 comments by Allan Leal

This is indeed a lot. I have not been able to improve the reverse-mode algorithm; focus has been given mainly to forward mode, which uses template meta-programming techniques to...

If your code is self-contained and does not depend on complicated third-party code, then I could have a look and see how things can be improved for your specific case.

I haven't been able to run your code yet, but the cost function does indeed seem complicated. A suggestion: make your objective function a class method, so that the multiple `VectorXvar` objects...
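A hedged sketch of the pattern suggested above: wrap the objective in a class so its work vectors are allocated once and reused across evaluations. This is plain C++ with `std::vector<double>` standing in for `VectorXvar`; the class and member names are invented for illustration, not part of the `autodiff` API.

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Hypothetical objective wrapped in a class: the work buffers are members,
// allocated once in the constructor, so repeated evaluations do not
// reallocate. With autodiff, these members would be VectorXvar instead.
class Objective {
public:
    explicit Objective(std::size_t n) : residuals_(n), weights_(n, 1.0) {}

    // operator() reuses the member buffers; no per-call allocation.
    double operator()(const std::vector<double>& x) {
        double sum = 0.0;
        for (std::size_t i = 0; i < x.size(); ++i) {
            residuals_[i] = x[i] * x[i];          // stand-in for the real model
            sum += weights_[i] * residuals_[i];
        }
        return sum;
    }

private:
    std::vector<double> residuals_;  // allocated once, reused every call
    std::vector<double> weights_;
};
```

The point of the design is that an optimizer calling the objective thousands of times pays the allocation cost once, instead of once per call.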

That would be great @supersega !

@supersega , if you open a new issue, we can discuss there what needs to be done.

@davidleitao95 Please read this explanation I wrote in another issue: https://github.com/autodiff/autodiff/issues/19#issuecomment-506390049 Your function evaluation seems very expensive, and if you have many variables, using forward automatic differentiation will require a...
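To make the cost argument concrete, here is a minimal, self-contained forward-mode sketch (a toy dual number, far simpler than autodiff's `dual` type): computing a gradient of f : R^n -> R in forward mode requires seeding and re-evaluating the function once per input variable, so an expensive evaluation with many variables multiplies quickly.

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Toy forward-mode dual number carrying a single derivative slot.
struct Dual {
    double val;  // function value
    double der;  // derivative w.r.t. the currently seeded input
};

Dual operator*(Dual a, Dual b) { return {a.val * b.val, a.der * b.val + a.val * b.der}; }
Dual operator+(Dual a, Dual b) { return {a.val + b.val, a.der + b.der}; }

// Example function f(x) = x0*x1 + x1*x2, evaluated on duals.
Dual f(const std::vector<Dual>& x) { return x[0] * x[1] + x[1] * x[2]; }

// Forward-mode gradient: seed each input in turn -- n full passes over f.
std::vector<double> gradient(std::vector<double> x0) {
    std::vector<double> g(x0.size());
    for (std::size_t i = 0; i < x0.size(); ++i) {
        std::vector<Dual> x(x0.size());
        for (std::size_t j = 0; j < x0.size(); ++j)
            x[j] = {x0[j], i == j ? 1.0 : 0.0};  // seed the i-th direction
        g[i] = f(x).der;                          // one evaluation per input
    }
    return g;
}
```

Reverse mode inverts this trade-off: one forward pass plus one backward sweep yields the whole gradient, which is why it is preferred for many inputs and a scalar output.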

This is related to issue #109; right now, this optimization will need to be done on your side (by using some sort of memoization). The issue with saving the...

Hi @zoharl3, I misunderstood you (forget about the memoization). What you are requesting is that the expression tree constructed during a gradient computation be reused for future gradient computations...
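As a hedged sketch of what "reusing the expression tree" could look like: record the computation once as a tape of operations, then re-run the forward and backward passes over the same tape with new input values, instead of rebuilding the tree on every gradient call. All names here are invented for illustration; this is not autodiff's implementation or API.

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// One recorded operation: '+' or '*' applied to two earlier value slots.
struct Node { char op; std::size_t a, b; };

struct Tape {
    std::size_t ninputs;
    std::vector<Node> nodes;       // recorded once, reused for every gradient
    std::vector<double> val, adj;  // work buffers, reused across calls

    // Forward pass: evaluate every recorded node for the given inputs.
    double eval(const std::vector<double>& x) {
        val.assign(ninputs + nodes.size(), 0.0);
        for (std::size_t i = 0; i < ninputs; ++i) val[i] = x[i];
        for (std::size_t i = 0; i < nodes.size(); ++i) {
            const Node& n = nodes[i];
            val[ninputs + i] = (n.op == '+') ? val[n.a] + val[n.b]
                                             : val[n.a] * val[n.b];
        }
        return val.back();
    }

    // Reverse sweep over the same tape: accumulate adjoints for all inputs.
    std::vector<double> gradient(const std::vector<double>& x) {
        eval(x);
        adj.assign(val.size(), 0.0);
        adj.back() = 1.0;  // seed d(output)/d(output)
        for (std::size_t i = nodes.size(); i-- > 0; ) {
            const Node& n = nodes[i];
            const double a = adj[ninputs + i];
            if (n.op == '+') { adj[n.a] += a;               adj[n.b] += a; }
            else             { adj[n.a] += a * val[n.b];    adj[n.b] += a * val[n.a]; }
        }
        return {adj.begin(), adj.begin() + ninputs};
    }
};
```

For example, recording f(x0, x1) = (x0 + x1) * x1 once as `Tape t{2, {{'+', 0, 1}, {'*', 2, 1}}, {}, {}}` lets `t.gradient(...)` be called repeatedly with different inputs without re-recording the tape, which is the reuse being requested above.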

You would need to study the header file `autodiff/reverse/reverse.hpp`. Inspect the `derivatives` method:

~~~cpp
/// Return the derivatives of a dependent variable y with respect to given independent variables.
template auto...
~~~

The reverse-mode automatic differentiation in `autodiff` does require further tuning and optimization. @supersega was working on this a while ago and may provide some details here. The forward mode...