
DeepSVDD: loss.backward() is commented out

modanesh opened this issue • 1 comment

In DeepSVDD, why is the call to loss.backward() commented out?

At this line: https://github.com/yzhao062/pyod/blob/2a80ac89816248925ad8f659b7185e764e927265/pyod/models/deep_svdd.py#L355

Thanks!

modanesh — Oct 30 '24

When I uncomment it, I get this error:

```
RuntimeError: Trying to backward through the graph a second time (or directly access saved tensors after they have already been freed). Saved intermediate values of the graph are freed when you call .backward() or autograd.grad(). Specify retain_graph=True if you need to backward through the graph a second time or if you need to access saved tensors after calling backward.
```
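This error generally means the same computation graph is being backpropagated through twice: the first `backward()` frees the graph's saved intermediate tensors, so a second `backward()` on that same loss tensor fails. A minimal sketch of the failure and the two standard fixes, independent of pyod's DeepSVDD code (variable names here are illustrative, not from the library):

```python
import torch

# Build a tiny graph: loss depends on x through x ** 2, whose saved
# intermediates are needed by backward().
x = torch.ones(3, requires_grad=True)
loss = (x ** 2).sum()
loss.backward()  # first backward: the graph's buffers are freed here

try:
    loss.backward()  # second backward on the already-freed graph
except RuntimeError as e:
    # This reproduces the "backward through the graph a second time" error.
    print(type(e).__name__, "raised as expected")

# Fix 1: explicitly keep the graph alive if you truly need two backwards.
x.grad = None
loss2 = (x ** 2).sum()
loss2.backward(retain_graph=True)
loss2.backward()  # allowed now

# Fix 2 (the usual training-loop pattern): recompute the loss every
# iteration so each backward() operates on a fresh graph.
for _ in range(2):
    x.grad = None
    fresh_loss = (x ** 2).sum()
    fresh_loss.backward()
```

If the commented-out line in deep_svdd.py sits inside a loop that reuses a loss tensor computed outside the loop, uncommenting it would trigger exactly this error; recomputing the loss inside the loop (Fix 2) is the typical remedy.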

modanesh — Oct 31 '24