Alex Wiltschko
For `args = ...`, are you referring to something in the autograd source, or in the user's source?
Hey buuuuddyyy
This dead?
Is this still an open issue? What's the best workaround in the meantime?
FYI, if just writing down all the gradients of many tensor-valued functions is the blocker, this has been done (at least twice) in the autograd family of libraries. In autograd:...
On the issue you linked, I think you're conflating the partial derivatives you need to write with the method you will use to perform automatic differentiation of output w.r.t. input....
Yes, the function signature. For some function like, e.g., `sum(x, y)`: `gradSum[1] = function(incomingGradient, answerOfSum, x, y) ... end` and `gradSum[2] = function(incomingGradient, answerOfSum, x, y) ... end` to calculate the partial...
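To make the shape of this concrete, here is a minimal, self-contained sketch of the scheme the signature above describes, using plain Lua numbers so it runs anywhere. The names (`gradSum`, `incomingGradient`, `answerOfSum`) follow the comment; this is illustrative, not claimed to be the actual torch-autograd registration API.

```lua
-- The forward function whose partials we are defining.
local function sum(x, y)
  return x + y
end

-- One partial-derivative function per argument of sum(x, y).
local gradSum = {}

-- d(sum)/dx = 1, so the incoming gradient passes straight through.
gradSum[1] = function(incomingGradient, answerOfSum, x, y)
  return incomingGradient
end

-- d(sum)/dy = 1 as well.
gradSum[2] = function(incomingGradient, answerOfSum, x, y)
  return incomingGradient
end

-- Backward pass for a single call: scatter the output gradient to each input.
local x, y = 2, 3
local answer = sum(x, y)
local gradOut = 1  -- seed gradient, d(answer)/d(answer)
local gradX = gradSum[1](gradOut, answer, x, y)  -- 1
local gradY = gradSum[2](gradOut, answer, x, y)  -- 1
print(gradX, gradY)
```

Note the design: each partial gets the incoming gradient, the forward answer, and all original arguments, so it can reuse the forward result instead of recomputing it. Which AD machinery then chains these partials together is a separate question, as noted above.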
Hey, wanted to follow up and see if you've made any progress? Interested in learning about and discussing some of the ideas behind the package.
It ran overnight. I'm still letting it run; it's still exploring, so this number could possibly get better.
Running 10 g2.2xlarges in parallel. The job suggestions come in via a pull model over a REST API, so it is trivial to parallelize (no extra code required at all)....
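Because the workers pull jobs rather than having them pushed, scaling out is just starting more copies of the same loop on more machines. A hedged sketch of what one such worker can look like in Lua, assuming luasocket and lua-cjson are installed; the endpoint URLs, JSON fields, and `runExperiment` are hypothetical stand-ins, not the actual service API.

```lua
local http = require("socket.http")  -- luasocket
local cjson = require("cjson")       -- lua-cjson

local SUGGEST_URL = "http://example.com/api/next-suggestion"  -- hypothetical
local REPORT_URL  = "http://example.com/api/report-result"    -- hypothetical

-- Placeholder for the actual training/evaluation run on this machine.
local function runExperiment(params)
  return math.random()  -- stand-in for a real validation score
end

while true do
  -- Pull: ask the suggestion server what to try next.
  local body, code = http.request(SUGGEST_URL)
  if code == 200 then
    local job = cjson.decode(body)
    local score = runExperiment(job.params)
    -- Push: report the result so the server can refine future suggestions.
    http.request(REPORT_URL, cjson.encode({ id = job.id, score = score }))
  else
    os.execute("sleep 30")  -- no job available; back off and poll again
  end
end
```

Since each worker only ever talks to the REST endpoint, adding an eleventh machine requires no coordination code at all, which is the point being made above.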