
Second derivative: calling autograd.grad twice causes an error?

Open Mehdishishehbor opened this issue 3 years ago • 0 comments

Hi, thanks for your inspiring research. I am trying to compute the second derivative of the output with respect to the input after passing the input through tcnn.NetworkWithInputEncoding with hash grid encoding.
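
For context, the model is created along these lines. This is a minimal sketch; the exact config values here are assumptions adapted from the repo's README, not my precise setup:

import torch
import tinycudann as tcnn

# Hash grid encoding followed by a fused MLP; the config dicts below are
# assumed values taken from the tiny-cuda-nn README example.
model = tcnn.NetworkWithInputEncoding(
    n_input_dims=3, n_output_dims=1,
    encoding_config={"otype": "HashGrid", "n_levels": 16,
                     "n_features_per_level": 2, "log2_hashmap_size": 19,
                     "base_resolution": 16, "per_level_scale": 2.0},
    network_config={"otype": "FullyFusedMLP", "activation": "ReLU",
                    "output_activation": "None", "n_neurons": 64,
                    "n_hidden_layers": 2},
)

x = torch.rand(1024, 3, device="cuda", requires_grad=True)  # inputs need grad
loss = model(x).sum()  # scalar loss defined from the network output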

I use the code below to get the second derivative; the loss is defined from the output of tcnn.NetworkWithInputEncoding.

from torch import autograd

first_derivative = autograd.grad(loss, x, create_graph=True)[0]
# We now have dloss/dx (same shape as x)
second_derivative = autograd.grad(first_derivative.sum(), x)[0]
# This computes d/dx(dloss/dx) = d2loss/dx2; the .sum() reduces the
# non-scalar first derivative to a scalar so grad can be called on it

However, I get the following error:

RuntimeError: DifferentiableObject::backward_backward_input_impl: not implemented error
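
For comparison, the same double-grad pattern runs without error on a plain PyTorch module, which makes me think the failure is specific to tcnn's custom autograd function rather than to my call pattern. A minimal sketch, with a toy MLP standing in for the tcnn network:

import torch
from torch import autograd

# Toy stand-in for the tcnn network; Tanh is smooth, so the second
# derivative is non-trivial (ReLU's would be zero almost everywhere).
mlp = torch.nn.Sequential(
    torch.nn.Linear(3, 64), torch.nn.Tanh(), torch.nn.Linear(64, 1),
)

x = torch.rand(8, 3, requires_grad=True)
loss = mlp(x).sum()

first = autograd.grad(loss, x, create_graph=True)[0]  # dloss/dx, shape (8, 3)
second = autograd.grad(first.sum(), x)[0]             # d2loss/dx2 via double backward
print(second.shape)  # torch.Size([8, 3])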

Could you please help me understand this error and how I can fix it?

I appreciate your time and efforts! Thanks, Mehdi

Mehdishishehbor · Sep 22 '22 19:09