cockpit
No explicit loss and individual loss
Description
What should I do if my model returns a tuple of the loss and the individual losses, and I have no explicit loss function class?
At the moment, if I just extend the model, I get:
cockpit/quantities/alpha.py in <dictcomp>(.0)
324 self.save_to_cache(global_step, f"params_{point}", params_dict, block_fn)
325
--> 326 grad_dict = {id(p): p.grad.data.clone().detach() for p in params}
327 self.save_to_cache(global_step, f"grad_{point}", grad_dict, block_fn)
328
AttributeError: 'NoneType' object has no attribute 'data'
Thanks for any suggestions!
Hi,
we will gladly look into this!
To me, it is a bit unclear what you mean by having "no explicit loss function class". Could you perhaps provide a small working example that reproduces the behavior?
@f-dangel since this seems mostly related to BackPACK, I co-assigned you.
Thank you @fsschneider for your kind offer to help. Yes, I do have code that I can show you, e.g.:
The loss is the output of the model in the trainer: https://github.com/zalandoresearch/pytorch-ts/blob/master/pts/trainer.py#L67-L72
where the loss itself is the negative log_prob of some distribution built from the outputs of the network, e.g.: https://github.com/zalandoresearch/pytorch-ts/blob/master/pts/model/deepar/deepar_network.py#L268
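The setup in the linked code can be reduced to a pure-Python sketch (an illustrative stand-in, not the actual pytorch-ts implementation): the forward pass itself returns both the aggregate loss and the per-sample losses, so there is no separate loss module to hand to extend.

```python
# Illustrative stand-in for a network whose forward pass returns the loss
# directly: per-sample negative log-likelihoods plus their mean.
def forward(log_probs):
    individual = [-lp for lp in log_probs]         # per-sample NLL
    aggregate = sum(individual) / len(individual)  # mean over the batch
    return aggregate, individual

agg, per_sample = forward([-0.5, -1.5])
print(agg, per_sample)  # 1.0 [0.5, 1.5]
```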
So, as you can see, my model outputs the loss (aggregate and individual), and in my trainer I can call extend(net),
but I am not sure how to extend the loss... Or am I confused, and cockpit only works with the specific losses listed in the BackPACK documentation?
Thanks!
Hi,
I think we need more information to reproduce your issue. The error message you obtain does not seem to be related to your loss being non-standard, but rather to a parameter gradient in your net that is None.
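A quick way to confirm this diagnosis is to list the parameters whose .grad is still None before Cockpit accesses them. A minimal pure-Python sketch of that check (the Param class and names are illustrative stand-ins for torch.nn.Parameter, not Cockpit's API):

```python
# Hypothetical sketch: find parameters whose .grad is None, i.e. the ones
# that would trigger "'NoneType' object has no attribute 'data'" when
# Cockpit clones p.grad.data. Stand-in objects, not real torch tensors.
class Param:
    def __init__(self, name, grad):
        self.name = name
        self.grad = grad  # None if no gradient reached this parameter

params = [Param("weight", object()), Param("frozen_bias", None)]

# Parameters that never received a gradient, e.g. frozen or unused ones.
missing = [p.name for p in params if p.grad is None]
print(missing)  # ['frozen_bias']
```

In a real model, any names printed this way point to parameters that are frozen or unused in the forward pass, which is what produces the traceback above.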
Maybe you can come up with a self-contained small snippet that exactly reproduces the issue? You could start from the basic example and swap the building blocks.
I am closing this issue as it is stale. Please feel free to open it again if you have further questions.