Jens Egholm Pedersen
If we're keen on stopping gradients, I agree this could/should be handled at the module level. And right now we're actually [forcing the voltage tensor to retain the gradient graph](https://github.com/norse/norse/blob/master/norse/torch/module/lif.py#L92)...
I see, that makes a lot of sense, actually. And if it's only related to the previous spikes, wouldn't it be possible to even create a wrapper module that "just"...
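A minimal sketch of what such a wrapper could look like, assuming the state is a namedtuple with a spike field `z` (as in Norse's `LIFState`); the `DetachSpikes` name and the exact forward signature here are hypothetical, not Norse's API:

```python
import torch

class DetachSpikes(torch.nn.Module):
    """Hypothetical wrapper that stops gradients flowing back through
    the previous spikes, while leaving the wrapped cell untouched."""

    def __init__(self, cell: torch.nn.Module):
        super().__init__()
        self.cell = cell

    def forward(self, input_tensor, state=None):
        if state is not None:
            # Detach only the spike tensor; voltage and current stay
            # attached so their gradients still propagate through time
            state = state._replace(z=state.z.detach())
        return self.cell(input_tensor, state)
```

Used as e.g. `DetachSpikes(LIFCell())`, this would cut the gradient path through the previous spikes at every step without modifying the cell itself.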
Hmm I just checked after the renaming to main, and it's still failing. It could have something to do with the nix files. I'll investigate some more.
Great issue! I took the liberty to assign you ;-)
This would be a great next step to build on the work done by @ChauhanT and @Huizerd. Somehow plugging the existing STDP code into arbitrary modules would make it much...
@ChauhanT you mean a hierarchy like so? `torch.nn.Module` ...
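To make the idea concrete, here is a rough sketch (not Norse's actual API) of how a trace-based STDP rule could be wrapped around an ordinary `torch.nn.Linear`, with the weight updates applied outside of autograd; the `STDPLinear` name and the threshold nonlinearity are assumptions for illustration:

```python
import torch

class STDPLinear(torch.nn.Module):
    """Hypothetical STDP wrapper: a Linear layer whose weights are
    updated by a pair-based trace rule instead of backpropagation."""

    def __init__(self, in_features, out_features, lr=1e-3, tau=20.0):
        super().__init__()
        self.linear = torch.nn.Linear(in_features, out_features, bias=False)
        self.lr = lr
        self.decay = torch.exp(torch.tensor(-1.0 / tau)).item()
        # Exponentially decaying pre- and post-synaptic spike traces
        self.register_buffer("pre_trace", torch.zeros(in_features))
        self.register_buffer("post_trace", torch.zeros(out_features))

    def forward(self, pre_spikes):
        # Crude threshold standing in for a proper spiking neuron
        post_spikes = (self.linear(pre_spikes) > 0.5).float()
        with torch.no_grad():
            self.pre_trace.mul_(self.decay).add_(pre_spikes.mean(0))
            self.post_trace.mul_(self.decay).add_(post_spikes.mean(0))
            # Potentiate when a pre trace is present at a post spike,
            # depress when a post trace is present at a pre spike
            dw = self.lr * (
                torch.outer(post_spikes.mean(0), self.pre_trace)
                - torch.outer(self.post_trace, pre_spikes.mean(0))
            )
            self.linear.weight.add_(dw)
        return post_spikes

# Example usage: weights are adjusted as a side effect of the forward pass
layer = STDPLinear(10, 5)
spikes = (torch.rand(32, 10) < 0.3).float()
out = layer(spikes)
```

Generalising this to arbitrary modules would mostly mean factoring the trace bookkeeping and the weight update out of the wrapper.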
[SpykeTorch](https://github.com/miladmozafari/SpykeTorch) could be one source for inspiration, although that looks a little verbose: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6640212/
> Yeah, although ideally I think these should be fixed in another pull request? The broken templates are not related to the parameter learning doc.

Completely correct. Let me open...
That should do it. I force-pushed it to get the history straight. The commit should be attributed to you, but I can't seem to trick GitHub into understanding that you...
@PMMon it seems like there was a bug in the PyTorch ONNX conversion that couldn't properly deal with nested structures. However, it looks to be fixed in nightly: https://github.com/pytorch/pytorch/pull/53311 Would...
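For reference, a tiny repro sketch of the kind of nested output that has given the exporter trouble; the toy module below is hypothetical, but mirrors the `(spikes, state)` pattern Norse cells return:

```python
import torch

class ToyCell(torch.nn.Module):
    """Toy cell returning a nested (output, (voltage, current)) tuple,
    similar in shape to Norse's (spikes, state) convention."""

    def forward(self, x, v, i):
        v = v + x - i
        z = (v > 1.0).float()
        return z, (v * (1 - z), i * 0.9 + z)

args = (torch.randn(1, 4), torch.zeros(1, 4), torch.zeros(1, 4))
# Tracing flattens the nested output tuple; older PyTorch releases
# are where an export like this used to fail.
torch.onnx.export(ToyCell(), args, "toy_cell.onnx", opset_version=11)
```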