
Optimize jacobian computation of AutodiffCostFunction

Open luisenp opened this issue 2 years ago • 0 comments

Torch autograd's jacobian, used by LearnableCostFunction, computes cross-batch gradients, which is undesirable. I haven't seen an out-of-the-box solution, so we might need to do some manual backward() calls and proper use of vmap() to compute the jacobian on our own.
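As a rough illustration of the problem and the vmap-based fix (using the `torch.func` API, which landed in PyTorch well after this issue was filed, and a made-up per-sample error function `err`), a sketch might look like:

```python
import torch
from torch.func import jacrev, vmap

# Hypothetical per-sample error function: maps one sample of shape (2,)
# to an error of shape (2,).
def err(x):
    return torch.stack([x[0] ** 2 + x[1], x[0] * x[1]])

batch = torch.randn(4, 2)

# Naive approach: differentiate the batched function with
# torch.autograd.functional.jacobian. The result has shape (4, 2, 4, 2),
# i.e. it also materializes the all-zero cross-batch blocks.
full_jac = torch.autograd.functional.jacobian(
    lambda x: torch.stack([x[:, 0] ** 2 + x[:, 1], x[:, 0] * x[:, 1]], dim=-1),
    batch,
)

# vmap over jacrev computes only the per-sample (2, 2) Jacobians,
# giving a (4, 2, 2) result with no wasted cross-batch work.
per_sample_jac = vmap(jacrev(err))(batch)
```

The per-sample Jacobians should match the diagonal blocks `full_jac[i, :, i, :]` of the naive result, while the off-diagonal blocks are all zeros.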

luisenp · Dec 06 '21