Marius Millea

Results 159 comments of Marius Millea

No, I get that. All I'm saying is: it's unorthodox in terms of the usage of ReshapedArray, sure, but in terms of the underlying native code, for Arrays, it's compiled...

Sorry, I'm confused: if you're ok with ReshapedArray there, what's the issue with the current PR?

I had benchmarked some full gradient calls, and there I saw essentially no difference (presumably extracting the gradient is not a dominant part of the computation except for trivial functions)....

My summary from what I remember is what I said [above](https://github.com/JuliaDiff/ForwardDiff.jl/pull/619#issuecomment-1404248199):

> at the moment I guess it boils down to a small speed regression in extract_gradient_chunk which should negligibly...

On the off chance you don't already know about it: https://github.com/JuliaDiff/AbstractDifferentiation.jl is a really good way to write code that is AD-backend agnostic.
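For anyone landing here, a minimal sketch of what backend-agnostic code looks like (the test function and backend choice are illustrative, not from this thread; this assumes AbstractDifferentiation's `ForwardDiffBackend`, which is available once ForwardDiff is loaded):

```julia
# Sketch: the same gradient call works against any AD backend.
# Swapping in a different backend (e.g. a Zygote-based one) requires
# no change to f or to the call site.
import AbstractDifferentiation as AD
using ForwardDiff

f(x) = sum(abs2, x)  # example function, purely illustrative

backend = AD.ForwardDiffBackend()
(g,) = AD.gradient(backend, f, [1.0, 2.0, 3.0])  # returns a tuple of gradients
@assert g == [2.0, 4.0, 6.0]
```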

I have the same feature request. The code in the readme is basically perfect; it would just be nice if all that boilerplate could be hidden inside a parallel progress...

Actually, I hacked together the following, which allows `next!` and `update!` from any worker. Is this approach something that you'd be interested in having as a PR?

```julia
struct DistributedProgress...
```
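The general shape of the idea can be sketched as follows (a minimal illustration using a `RemoteChannel`, not the actual `DistributedProgress` code, which is truncated above):

```julia
# Sketch: workers send ticks over a RemoteChannel; only the main
# process owns the ProgressMeter.Progress object and advances it.
using Distributed, ProgressMeter

function progress_channel(n::Int; kwargs...)
    ch = RemoteChannel(() -> Channel{Bool}(n))
    @async begin
        p = Progress(n; kwargs...)
        for _ in 1:n
            take!(ch)
            next!(p)  # safe: the bar is only ever touched here
        end
    end
    return ch
end

# usage: each worker calls put!(ch, true) after completing a unit of work
ch = progress_channel(3)
for _ in 1:3
    put!(ch, true)
end
```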

One hack is just to do

```julia
using OptimKit

# Redefine the method inside the OptimKit module with your modified body:
@eval OptimKit function bisect(iter::HagerZhangLineSearchIterator, a::LineSearchPoint, b::LineSearchPoint)
    # modified body here...
end
```

Another is that you could `pkg> dev --local OptimKit` and...

The first of remote-sync, remote-edit, or remote-ftp to implement this feature would definitely win this user over!

Your error logs from specific jobs look to me like it's the same issue as mentioned here: https://www.cosmologyathome.org/forum_thread.php?id=7456. I unfortunately don't have a solution, as it's not entirely clear to...